DATA ENGINEER - JOHANNESBURG IT
Johannesburg - Gauteng
DATA ENGINEER - PARKTOWN JHB
Market-related salary
Role Purpose
As a Data Engineer, you will work in the data engineering team to build and maintain data pipelines that ingest data into the warehouse and support integration into other systems. You will apply your knowledge of good data engineering practices, standards, and data technologies to deliver target-state design and implementation.
You will be part of a multi-disciplinary technology team and work closely with our customers (business, software vendors and partners). Within the team, you will be responsible for a suite of data processes and will participate in all aspects from design through to testing and implementation. You will be surrounded by data professionals who strive for excellence and data best practices to realise business value.
The role is challenging, and you must be adept at problem solving and be able to respond to changing priorities and rapidly evolving requirements that may have a direct impact on services to users.
This role would suit a professional who is keen to grow their career in a busy team that values cognitive diversity and diversity of lived experience.
Key Focus Areas
- Design and support delivery of engineering solutions in the data warehouse, balancing enhancement of ETL processes and modelling with support for short-term data and reporting requirements from across the business.
- Work collaboratively with cross-functional teams, including the Infrastructure DevOps team, solution architects, subject matter experts, data modellers, finance, underwriting, operations, etc., to ensure that the data platform ecosystem optimally supports business needs.
- Work collaboratively with vendors and partners to ensure that data engineering delivery and practices meet all standards.
Key Performance Areas
- Maintain, support, and monitor existing production SSIS packages, SQL queries, stored procedures, and CI/CD pipelines to ensure that all data loads into the data warehouse meet data quality standards and business SLA requirements.
- Build, maintain, support, and monitor Synapse data engineering pipelines on the data platform as required.
- Participate in and contribute to data architecture design, data modelling, and the gathering and analysis of data requirements; understand, document, communicate, and build appropriate solutions.
- Participate in the design, build, delivery, and documentation of data-related projects using various environment-specific data analytics technologies.
- Promote data engineering best practices with CI/CD pipelines and automation.
- Collaborate and work closely with team members and contribute significantly to building a high performing, collaborative, transparent, and result-driven data engineering team.
- Support the Data Engineering Practice Team Manager with fit-for-purpose data engineering solutions, quality engineering artifacts, and high-standard documentation.
- Follow Data Ethics standards to protect personal information and do the right thing to meet the expectations of our customers, partners, and communities.
Competencies, Knowledge And Skills
- Be available and engaged, flexible and resourceful with a can-do attitude.
- Passion for continuous learning with curiosity for technology and analytics
- Excellent communicator, collaborator and people person with an ability to build rapport quickly.
- Ability to effectively manage challenging situations without loss of focus when under pressure.
- Be comfortable with ambiguity and willing to get outside of comfort zone while delivering tangible results.
- Role model values of honesty, integrity and respect for cognitive diversity and psychological safety in the workplace
- Open to different perspectives and able to influence people from diverse backgrounds, viewpoints, functions, and organisational hierarchy with competing priorities.
Minimum Requirements
- 8+ years' demonstrable experience in the design, build, and support of data engineering pipelines across data warehousing, data ingestion, cleansing, manipulation, modelling, and reporting.
- Experience in ETL using Microsoft technologies.
- Strong experience in writing MS SQL Server queries, stored procedures, and SSIS packages. Experience with SSRS would be an advantage.
- Experience manipulating semi-structured data (XML, JSON).
- Strong knowledge and extensive experience in working in an Agile framework with CI/CD using modern DevOps / Data Ops integrated processes with YAML pipelines.
- Bachelor's degree in computer science, data science, or a related technical field is a must. A post-graduate qualification is highly regarded.
- Knowledge of Azure Synapse data engineering pipelines, PySpark notebooks, data platform lakehouse architecture, and Azure SQL ODS storage is desirable.
Should you not hear from us within 14 days of your application, please consider it unsuccessful.