25 Etl Developer jobs in South Africa

ETL Developer

R900,000 - R1,200,000 per year · Circana · South Africa

Posted today


Job Description

Data Engineer (UK)

Let's be unstoppable together

At Circana, we are fueled by our passion for continuous learning and growth. We seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We're a global company dedicated to fostering inclusivity and belonging. We value and celebrate the unique experiences, cultures, and viewpoints that each individual brings. By embracing a wide range of backgrounds, skills, expertise, and beyond, we create a stronger, more innovative environment for our employees, clients, and communities. With us, you can always bring your full self to work. Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Circana is proud to be Certified by Great Place To Work. This prestigious award is based entirely on what current employees say about their experience working at Circana.

Learn more at

Job Summary:

What is the role & team:

Join our Global Professional Services team as a Data Engineer, where you will be instrumental in bridging the gap between client needs, business objectives, and cutting-edge technology solutions, particularly supporting our new Private Cloud clients.

This role is for more than just a service provider; we are seeking a consultant-minded go-getter who is client-obsessed, a liquid thinker who thrives at pace, and is bold and brave in challenging the status quo. We are a team that values being part of a successful business and building partnerships based on trust, deep knowledge, respect, and unwavering support.

We are passionate about technology, data and AI and how it can be leveraged to solve our clients' most pressing business challenges. You will be a key player in understanding our clients' world, becoming their trusted partner, and proactively identifying opportunities for innovation and efficiency. Curiosity, accountability, and a positive, questioning mindset are at the core of our team's DNA – we don't just aim to meet expectations, we strive to exceed them.

You will work collaboratively with global Circana teams, acting as the vital link to ensure seamless, unified support and strategic technological partnership for our clients.

In this role, we are seeking a skilled and motivated Data Engineer to join a growing Global Team based in the UK. You will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for Data Engineering and a desire to make a significant impact, we encourage you to apply.

Key Responsibilities:

  • ETL/ELT Pipeline Development:
    • Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow.
  • Implement batch and real-time data processing solutions using Apache Spark.
  • Ensure data quality, governance, and security throughout the data lifecycle.
  • Cloud Data Engineering:
    • Manage and optimize cloud infrastructure (Azure) for data processing workloads, with a focus on cost-effectiveness.
  • Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments.
  • Big Data & Analytics:
    • Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark.
  • Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads.
  • Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
  • Workflow Orchestration (Airflow):
    • Design and maintain DAGs (Directed Acyclic Graphs) in Apache Airflow to automate complex data workflows.
  • Monitor, troubleshoot, and optimize job execution and dependencies within Airflow.
  • Team Leadership & Collaboration:
    • Provide technical guidance and mentorship to a team of data engineers in India.
  • Foster a collaborative environment and promote best practices for coding standards, version control, and documentation.
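The orchestration responsibilities above revolve around Airflow DAGs, which are just dependency graphs of tasks. As a minimal, library-free sketch of the idea (the task names and graph here are hypothetical, not an actual Circana pipeline), the scheduling order a tool like Airflow derives can be computed with a topological sort:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical daily ETL task graph: each key maps to the set of tasks it
# depends on, mirroring how an Airflow DAG encodes
# "extract >> transform >> checks >> load" relationships.
dag = {
    "extract_sales": set(),
    "extract_stores": set(),
    "transform_join": {"extract_sales", "extract_stores"},
    "quality_checks": {"transform_join"},
    "load_warehouse": {"quality_checks"},
}

def run_order(graph):
    """Return one valid execution order, i.e. what a scheduler
    derives before dispatching tasks to workers."""
    return list(TopologicalSorter(graph).static_order())

print(run_order(dag))
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency-resolution core is exactly this ordering guarantee: no task runs before everything it depends on has finished.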

Required Skills & Experience:

  • 5+ years of proven experience in data engineering, with hands-on expertise in Azure Data Services, PySpark, Apache Spark, and Apache Airflow.
  • Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code.
  • Deep understanding of Spark internals, including RDDs, DataFrames, DAG execution, partitioning, and performance optimization techniques.
  • Experience with designing and managing Airflow DAGs, scheduling, and dependency management.
  • Knowledge of CI/CD pipelines, containerization technologies (Docker, Kubernetes), and DevOps principles applied to data workflows.
  • Excellent problem-solving skills and a proven ability to optimize large-scale data processing tasks.
  • Prior experience in leading teams and working in Agile/Scrum development environments.
  • This is a client-facing role, so strong communication and collaboration skills are vital.

  • A track record of working effectively with global remote teams.

Bonus Points:

  • Experience with data modeling and data warehousing concepts.
  • Familiarity with data visualization tools and techniques.
  • Knowledge of machine learning algorithms and frameworks.

What are we looking for?

We are seeking an individual who embodies the following characteristics:

  • An authentic optimist with a genuine passion for solving problems and driving progress.
  • Intensely curious – you love to understand the 'why' and 'how' behind things, especially when it comes to technology and client challenges.
  • Highly accountable for your actions, your deliverables, and the success of your clients.
  • Comfortable and adept at building strong, professional relationships with both external clients and internal technical and business teams.
  • Deeply committed to giving your best for your team and ensuring our technology solutions exceed client expectations.
  • Confident in making informed decisions, weighing options and considering the technical and business implications.
  • Highly organised, with meticulous attention to detail in managing documentation, communication, and time.
  • Passionate about technology, Data, AI, and their application in solving real-world business problems.
  • Previous experience translating client requirements into technical specifications.
  • Previous experience building relationships with clients.

This advertiser has chosen not to accept applicants from your region.

Lead ETL Developer

R200,000 - R250,000 per year · Circana · South Africa

Posted today


Job Description

Let's be unstoppable together

At Circana, we are fueled by our passion for continuous learning and growth. We seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We're a global company dedicated to fostering inclusivity and belonging. We value and celebrate the unique experiences, cultures, and viewpoints that each individual brings. By embracing a wide range of backgrounds, skills, expertise, and beyond, we create a stronger, more innovative environment for our employees, clients, and communities. With us, you can always bring your full self to work. Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Circana is proud to be Certified by Great Place To Work. This prestigious award is based entirely on what current employees say about their experience working at Circana.

Learn more at

Lead Data Engineer Job Description (UK)

What is the role & team:

Join our Global Professional Services team as a Lead Data Engineer, where you will be instrumental in bridging the gap between client needs, business objectives, and cutting-edge technology solutions, particularly supporting our new Private Cloud clients.

This role is for more than just a service provider; we are seeking a consultant-minded go-getter who is client-obsessed, a liquid thinker who thrives at pace, and is bold and brave in challenging the status quo. We are a team that values being part of a successful business and building partnerships based on trust, deep knowledge, respect, and unwavering support.

We are passionate about technology, data and AI and how it can be leveraged to solve our clients' most pressing business challenges. You will be a key player in understanding our clients' world, becoming their trusted partner, and proactively identifying opportunities for innovation and efficiency. Curiosity, accountability, and a positive, questioning mindset are at the core of our team's DNA – we don't just aim to meet expectations, we strive to exceed them.

You will work collaboratively with global Circana teams, acting as the vital link to ensure seamless, unified support and strategic technological partnership for our clients.

In this role, we are seeking a skilled and motivated Data Engineer to join a growing Global Team based in the UK. You will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for Data Engineering and a desire to make a significant impact, we encourage you to apply.

Key Responsibilities:

Data Engineering & Data Pipeline Development

  • Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow.
  • Implement real-time and batch data processing using Spark.
  • Enforce best practices for data quality, governance, and security throughout the data lifecycle.
  • Ensure data availability, reliability, and performance through monitoring and automation.
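The quality-enforcement bullet above usually takes the form of a row-level gate before loading. A minimal sketch of that pattern (the rule set and field names are hypothetical, and this stands in for tooling such as Great Expectations or in-pipeline assertion steps):

```python
# Hypothetical quality gate: rules are (name, predicate) pairs; failing rows
# are quarantined with the names of the rules they broke, instead of being
# loaded into the warehouse.
RULES = [
    ("amount_non_negative", lambda r: r["amount"] >= 0),
    ("store_id_present", lambda r: r.get("store_id") is not None),
]

def quality_gate(rows, rules=RULES):
    """Split rows into (clean rows, quarantined (row, failures) pairs)."""
    good, quarantined = [], []
    for row in rows:
        failures = [name for name, check in rules if not check(row)]
        if failures:
            quarantined.append((row, failures))
        else:
            good.append(row)
    return good, quarantined

rows = [
    {"store_id": 1, "amount": 10.0},
    {"store_id": None, "amount": -5.0},  # fails both rules
]
good, bad = quality_gate(rows)
print(len(good), len(bad))  # 1 1
```

Recording *which* rule failed, not just that something failed, is what makes the quarantine actionable for the monitoring and automation the role describes.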

Cloud Data Engineering:

  • Manage cloud infrastructure and cost optimization for data processing workloads
  • Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments.

Big Data & Analytics:

  • Build and optimize large-scale data processing pipelines using Apache Spark and PySpark
  • Implement data partitioning, caching, and performance tuning for Spark-based workloads.

  • Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
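Partitioning is the lever behind most of the Spark tuning mentioned above. The mechanism can be sketched without Spark itself: rows are routed to partitions by a hash of a key column, which is the idea underlying hash-based shuffles and repartitioning (the column name, rows, and partition count here are illustrative):

```python
from collections import defaultdict

def hash_partition(rows, key, num_partitions):
    """Route each row dict to a partition by hashing its key column,
    the same scheme Spark uses for hash-based repartitioning."""
    parts = defaultdict(list)
    for row in rows:
        parts[hash(row[key]) % num_partitions].append(row)
    return parts

# Hypothetical sales rows keyed by store_id; skew in the key column
# translates directly into skewed partition sizes, a common tuning target.
rows = [{"store_id": s, "amount": a} for s, a in [(1, 10), (2, 5), (1, 7), (3, 2)]]
parts = hash_partition(rows, "store_id", 4)
assert sum(len(p) for p in parts.values()) == len(rows)  # no rows lost
```

Because every row with the same key lands in the same partition, per-key aggregations need no further shuffling; the flip side is that a few hot keys can overload one partition, which is why skew handling appears alongside caching in Spark performance work.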

Workflow Orchestration (Airflow)

  • Design and maintain DAGs (Directed Acyclic Graphs) in Airflow to automate complex data workflows
  • Monitor, troubleshoot, and optimize job execution and dependencies

Team Leadership & Collaboration

  • Lead a team of data engineers, providing technical guidance and mentorship
  • Foster a collaborative environment and promote best practices for coding standards, version control, and documentation.

Required Skills & Experience:

  • 8+ years of experience in data engineering, with expertise in Azure, PySpark, Spark, and Airflow.
  • Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code.
  • Deep understanding of Spark internals (RDDs, DataFrames, DAG execution, partitioning, etc.).
  • Experience with Airflow DAGs, scheduling, and dependency management.
  • Knowledge of Git, Docker, Kubernetes, and Terraform, applying DevOps best practices to CI/CD workflows.
  • Excellent problem-solving skills and the ability to optimize large-scale data processing.
  • Experience in leading teams and working in Agile/Scrum environments.
  • This is a client-facing role; strong communication and collaboration skills are vital.
  • A proven track record of working effectively with global remote teams.

Bonus Points:

  • Experience with data modeling and data warehousing concepts
  • Familiarity with data visualization tools and techniques
  • Knowledge of machine learning algorithms and frameworks

What are we looking for?

We are seeking an individual who embodies the following characteristics:

  • An authentic optimist with a genuine passion for solving problems and driving progress.
  • Intensely curious – you love to understand the 'why' and 'how' behind things, especially when it comes to technology and client challenges.
  • Highly accountable for your actions, your deliverables, and the success of your clients.
  • Comfortable and adept at building strong, professional relationships with both external clients and internal technical and business teams.
  • Deeply committed to giving your best for your team and ensuring our technology solutions exceed client expectations.
  • Confident in making informed decisions, weighing options and considering the technical and business implications.
  • Highly organised, with meticulous attention to detail in managing documentation, communication, and time.
  • Passionate about technology, Data, AI, and their application in solving real-world business problems.
  • Previous experience translating client requirements into technical specifications.
  • Previous experience building relationships with clients.
This advertiser has chosen not to accept applicants from your region.

ETL Data Senior Developer

R200,000 - R250,000 per year · Circana

Posted today


Job Description

At Circana, we are fueled by our passion for continuous learning and growth. We seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We're a global company dedicated to fostering inclusivity and belonging. We value and celebrate the unique experiences, cultures, and viewpoints that each individual brings. By embracing a wide range of backgrounds, skills, expertise, and beyond, we create a stronger, more innovative environment for our employees, clients, and communities. With us, you can always bring your full self to work. Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Circana is proud to be Certified by Great Place To Work. This prestigious award is based entirely on what current employees say about their experience working at Circana.

Learn more at

Job Summary:

What is the role & team:
Join our Global Professional Services team as a Data Engineer, where you will be instrumental in bridging the gap between client needs, business objectives, and cutting-edge technology solutions, particularly supporting our new Private Cloud clients.

This role is for more than just a service provider; we are seeking a consultant-minded go-getter who is client-obsessed, a liquid thinker who thrives at pace, and is bold and brave in challenging the status quo. We are a team that values being part of a successful business and building partnerships based on trust, deep knowledge, respect, and unwavering support.

We are passionate about technology, data and AI and how it can be leveraged to solve our clients' most pressing business challenges. You will be a key player in understanding our clients' world, becoming their trusted partner, and proactively identifying opportunities for innovation and efficiency. Curiosity, accountability, and a positive, questioning mindset are at the core of our team's DNA – we don't just aim to meet expectations, we strive to exceed them.

You will work collaboratively with global Circana teams, acting as the vital link to ensure seamless, unified support and strategic technological partnership for our clients.

In this role, we are seeking a skilled and motivated Data Engineer to join a growing Global Team based in the UK. You will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for Data Engineering and a desire to make a significant impact, we encourage you to apply.

Key Responsibilities:

  • ETL/ELT Pipeline Development:
    • Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow.
    • Implement batch and real-time data processing solutions using Apache Spark.
    • Ensure data quality, governance, and security throughout the data lifecycle.
  • Cloud Data Engineering:
    • Manage and optimize cloud infrastructure (Azure) for data processing workloads, with a focus on cost-effectiveness.
    • Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments.
  • Big Data & Analytics:
    • Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark.
    • Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads.
    • Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
  • Workflow Orchestration (Airflow):
    • Design and maintain DAGs (Directed Acyclic Graphs) in Apache Airflow to automate complex data workflows.
    • Monitor, troubleshoot, and optimize job execution and dependencies within Airflow.
  • Team Leadership & Collaboration:
    • Provide technical guidance and mentorship to a team of data engineers in India.
    • Foster a collaborative environment and promote best practices for coding standards, version control, and documentation.

Required Skills & Experience:

  • 5+ years of proven experience in data engineering, with hands-on expertise in Azure Data Services, PySpark, Apache Spark, and Apache Airflow.
  • Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code.
  • Deep understanding of Spark internals, including RDDs, DataFrames, DAG execution, partitioning, and performance optimization techniques.
  • Experience with designing and managing Airflow DAGs, scheduling, and dependency management.
  • Knowledge of CI/CD pipelines, containerization technologies (Docker, Kubernetes), and DevOps principles applied to data workflows.
  • Excellent problem-solving skills and a proven ability to optimize large-scale data processing tasks.
  • Prior experience in leading teams and working in Agile/Scrum development environments.
  • This is a client-facing role, so strong communication and collaboration skills are vital.
  • A track record of working effectively with global remote teams.

Bonus Points:

  • Experience with data modelling and data warehousing concepts.
  • Familiarity with data visualization tools and techniques.
  • Knowledge of machine learning algorithms and frameworks.

What are we looking for?
We are seeking an individual who embodies the following characteristics:

  • An authentic optimist with a genuine passion for solving problems and driving progress.
  • Intensely curious – you love to understand the 'why' and 'how' behind things, especially when it comes to technology and client challenges.
  • Highly accountable for your actions, your deliverables, and the success of your clients.
  • Comfortable and adept at building strong, professional relationships with both external clients and internal technical and business teams.
  • Deeply committed to giving your best for your team and ensuring our technology solutions exceed client expectations.
  • Confident in making informed decisions, weighing options and considering the technical and business implications.
  • Highly organised, with meticulous attention to detail in managing documentation, communication, and time.
  • Passionate about technology, Data, AI, and their application in solving real-world business problems.
  • Previous experience translating client requirements into technical specifications.
  • Previous experience building relationships with clients.

This advertiser has chosen not to accept applicants from your region.

Data Warehouse Developer

Bellville, Western Cape · Momentum

Posted today


Job Description

Metropolitan is one of the oldest financial services brands in South Africa. With a 125-year legacy of serving the communities in which it operates, Metropolitan represents true empowerment in serving Africa's people through affordable financial solutions that create financial growth and security. Metropolitan operates in South Africa, but the brand is also present in 7 African countries, including Namibia, Botswana, Kenya, Ghana and Lesotho. Metropolitan provides financial wellness solutions that meet the needs of low-income clients, including funeral insurance, health, savings, hospital cash-back cover, retirement solutions and life insurance.

Disclaimer As an applicant, please verify the legitimacy of this job advert on our company career page.

Role Purpose
Are you passionate about data architecture and analytics? We are seeking a Data Warehouse Developer to design and develop dimensional data models, data marts, and enterprise data models using MS SQL. You will play a key role in shaping our data infrastructure, ensuring efficient data storage, retrieval, and processing across diverse analytical datasets using a variety of tools.

Applicants might be invited to perform a technical skills assessment in person at the Parc du Cap offices in Bellville as part of the interview process.

Requirements

  • Matric
  • Degree in Information Technology, Data Science, Computer Science, or a relevant equivalent qualification
  • 5+ years of experience in Business Intelligence development, specifically with data warehouse environments (essential)
  • Proven experience in sourcing from multiple database technologies and building data marts in MS-SQL (essential)

Knowledge:

  • Understanding of SQL and Database technologies (MS SQL Server, DB2, Postgres, Mongo).
  • Experience with dimensional modelling and Kimball methodology.
  • Strong proficiency with BI tools such as SSRS, Microsoft Power BI, or similar platforms.
  • Expertise in SQL, ETL development, data pipelines, and data integration.
  • Experience with cloud-based data platforms (AWS, Azure, or Google Cloud) is a plus.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication skills, with the ability to work with both technical and business stakeholders.
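The dimensional modelling and Kimball methodology listed above boil down to a star schema: one fact table of measurements surrounded by dimension tables of descriptive attributes. A toy sketch using SQLite (table and column names are illustrative, not Metropolitan's actual schema):

```python
import sqlite3

# Illustrative Kimball-style star schema: a fact table with surrogate foreign
# keys into dimension tables, queried by joining facts to dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount REAL
    );
""")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 'Jan')")
con.execute("INSERT INTO dim_product VALUES (1, 'Funeral cover', 'Insurance')")
con.executemany("INSERT INTO fact_sales VALUES (20240101, 1, ?)", [(100.0,), (250.0,)])

# Typical data-mart query: aggregate facts by dimension attributes.
total = con.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
""").fetchone()
print(total)  # ('Jan', 'Insurance', 350.0)
```

Keeping measures in the fact table and descriptive context in conformed dimensions is what lets the same dimensions (date, product) be reused across every mart in the warehouse.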

Duties & Responsibilities
INTERNAL PROCESS

  • Design, develop, and maintain efficient dimensional data models and data marts.
  • Architect enterprise data models to support analytical and business intelligence needs.
  • Implement and maintain ETL processes for seamless data integration.
  • Collaborate with cross-functional teams to gather data requirements and ensure high-quality data solutions.
  • Optimize query performance, ensuring high availability and responsiveness of the data warehouse platform.
  • Ensure data integrity, security, and consistency across all systems and solutions.
  • Provide mentorship and guidance to junior developers, ensuring best practices are followed.
  • Continuously improve data warehouse architecture to support evolving business needs and emerging technologies.

CLIENT

  • Provide authoritative expertise and advice to clients and stakeholders.
  • Build and maintain relationships with clients, internal and external stakeholders.
  • Deliver on service level agreements made with internal and external stakeholders and clients.
  • Make recommendations to improve client service within area of responsibility.
  • Participate in and contribute to a culture which builds rewarding relationships, facilitates feedback and provides exceptional client service.

PEOPLE

  • Develop and maintain productive and collaborative working relationships with peers and stakeholders.
  • Positively influence and participate in change initiatives within the team and across the business.
  • Continuously develop own expertise in terms of professional, industry and legislation knowledge.
  • Contribute to continuous innovation through the development, sharing and implementation of new ideas.
  • Take ownership for driving career development through available channels.

FINANCE

  • Identify opportunities to enhance cost effectiveness and increase operational efficiency.
  • Provide input into the risk identification processes and communicate recommendations in the appropriate forum.

Competencies

  • Examining Information
  • Interpreting Data
  • Developing Expertise
  • Providing Insights
  • Articulating Information
  • Meeting Timescales
  • Attention to detail
  • Producing Output
This advertiser has chosen not to accept applicants from your region.

Data Warehouse Developer

Bellville, Western Cape · MetLife

Posted today


Job Description

Introduction

Metropolitan is one of the oldest financial services brands in South Africa. With a 125-year legacy of serving the communities in which it operates, Metropolitan represents true empowerment in serving Africa's people through affordable financial solutions that create financial growth and security. Metropolitan operates in South Africa, but the brand is also present in 7 African countries, including Namibia, Botswana, Kenya, Ghana and Lesotho. Metropolitan provides financial wellness solutions that meet the needs of low-income clients, including funeral insurance, health, savings, hospital cash-back cover, retirement solutions and life insurance.

Disclaimer

As an applicant, please verify the legitimacy of this job advert on our company career page.

Role Purpose

Are you passionate about data architecture and analytics? We are seeking a Data Warehouse Developer to design and develop dimensional data models, data marts, and enterprise data models using MS SQL. You will play a key role in shaping our data infrastructure, ensuring efficient data storage, retrieval, and processing across diverse analytical datasets using a variety of tools.

Applicants might be invited to perform a technical skills assessment in person at the Parc du Cap offices in Bellville as part of the interview process.

Requirements

  • Matric

  • Degree in Information Technology, Data Science, Computer Science, or a relevant equivalent qualification

  • 5+ years of experience in Business Intelligence development, specifically with data warehouse environments (essential)

  • Proven experience in sourcing from multiple database technologies and building data marts in MS-SQL (essential)

Knowledge:

  • Understanding of SQL and Database technologies (MS SQL Server, DB2, Postgres, Mongo).

  • Experience with dimensional modelling and Kimball methodology.

  • Strong proficiency with BI tools such as SSRS, Microsoft Power BI, or similar platforms.

  • Expertise in SQL, ETL development, data pipelines, and data integration.

  • Experience with cloud-based data platforms (AWS, Azure, or Google Cloud) is a plus.

  • Excellent problem-solving and troubleshooting skills.

  • Strong communication skills, with the ability to work with both technical and business stakeholders.

Duties & Responsibilities

INTERNAL PROCESS

  • Design, develop, and maintain efficient dimensional data models and data marts.

  • Architect enterprise data models to support analytical and business intelligence needs.

  • Implement and maintain ETL processes for seamless data integration.

  • Collaborate with cross-functional teams to gather data requirements and ensure high-quality data solutions.

  • Optimize query performance, ensuring high availability and responsiveness of the data warehouse platform.

  • Ensure data integrity, security, and consistency across all systems and solutions.

  • Provide mentorship and guidance to junior developers, ensuring best practices are followed.

  • Continuously improve data warehouse architecture to support evolving business needs and emerging technologies.

CLIENT

  • Provide authoritative expertise and advice to clients and stakeholders.

  • Build and maintain relationships with clients, internal and external stakeholders.

  • Deliver on service level agreements made with internal and external stakeholders and clients.

  • Make recommendations to improve client service within area of responsibility.

  • Participate in and contribute to a culture which builds rewarding relationships, facilitates feedback and provides exceptional client service.

PEOPLE

  • Develop and maintain productive and collaborative working relationships with peers and stakeholders.

  • Positively influence and participate in change initiatives within the team and across the business.

  • Continuously develop own expertise in terms of professional, industry and legislation knowledge.

  • Contribute to continuous innovation through the development, sharing and implementation of new ideas.

  • Take ownership for driving career development through available channels.

FINANCE

  • Identify opportunities to enhance cost effectiveness and increase operational efficiency.

  • Provide input into the risk identification processes and communicate recommendations in the appropriate forum.

Competencies

  • Examining Information

  • Interpreting Data

  • Developing Expertise

  • Providing Insights

  • Articulating Information

  • Meeting Timescales

  • Attention to detail

  • Producing Output

This advertiser has chosen not to accept applicants from your region.

Data Warehouse Developer

R900,000 - R1,200,000 per year · Mr Price Group

Posted today


Job Description

Our new Data Warehouse Developer will be responsible for designing, developing, and maintaining the enterprise data warehouse to meet the group's needs.

You would be part of the Mr Price Advance team, which looks after data analytics, reporting, data science and RPA across the group.

Why should you join this team?

You'll be involved in projects involving BI and Analytics across multiple data disciplines: SQL and Big Data.

We're also in the process of migrating the current data warehouses and reporting from on-premise solutions to the cloud, which means the opportunities for technical growth are endless.

What are we looking for?

  • Relevant IT Degree or Diploma
  • 3+ years' experience in:
  • Data warehouse design and implementation, infrastructure components and analytical tools.
  • Data warehouse modelling
  • Structured Query Language (SQL)
  • Experience in Snowflake data warehousing
  • Troubleshooting

A day in your life?

  • Building large-scale databases.
  • Data modelling (Data Vault 2.0 and Star Schema), data management, and data transformation.
  • Perform ongoing monitoring, automation and refinement of implemented solutions.
  • Provide analysis and issue resolution on business-reported concerns.
  • Participate in design to influence delivery of various business intelligence solutions.
  • Contribute to the designs and plans for the integration of all data warehouse technical components.
  • Contribute to the design, automation, acquisition, transformation and data loading processes in a high-volume, high-availability data warehouse environment.
  • Contribute to the design of the data warehouse architecture.
  • Provide production support to solve immediate problems and keep databases in production.
  • Analyse the information requirements of customers and support teams, and contribute to the design of the best technical solutions.
  • Technical implementation of the data warehouse.
  • Participate in testing of the data design, tool design and data extracts.
  • Optimisation, support and maintenance of the data warehouse environment.
  • Participate in testing and validation of the data model throughout the various stages of the development process.
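Data Vault 2.0, mentioned in the duties above, differs from a star schema by splitting entities into hubs (business keys), links (relationships) and satellites (attributes tracked over time), so history is only ever appended. A toy hub-plus-satellite sketch using SQLite (the table names, keys, and values are illustrative, not Mr Price's model):

```python
import sqlite3

# Toy Data Vault 2.0 shapes: the hub holds the business key, the satellite
# holds descriptive attributes with load timestamps, so changes accumulate
# as new rows rather than overwriting old ones.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE hub_product (product_hk TEXT PRIMARY KEY, product_code TEXT, load_ts TEXT);
    CREATE TABLE sat_product (
        product_hk TEXT REFERENCES hub_product(product_hk),
        load_ts TEXT,
        price REAL,
        PRIMARY KEY (product_hk, load_ts)
    );
""")
con.execute("INSERT INTO hub_product VALUES ('hk1', 'SKU-001', '2024-01-01')")
# Two satellite rows for the same key: the price history is kept intact.
con.executemany("INSERT INTO sat_product VALUES ('hk1', ?, ?)",
                [("2024-01-01", 199.0), ("2024-02-01", 179.0)])

# "Current state" view: the latest satellite row per hub key.
current = con.execute("""
    SELECT h.product_code, s.price
    FROM hub_product h
    JOIN sat_product s ON s.product_hk = h.product_hk
    WHERE s.load_ts = (SELECT MAX(load_ts) FROM sat_product
                       WHERE product_hk = h.product_hk)
""").fetchone()
print(current)  # ('SKU-001', 179.0)
```

Star-schema marts are then typically projected out of the vault for reporting, which is why the role lists both modelling styles side by side.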

Mr Price Group Limited is an equal opportunity employer and is committed to Employment Equity.

This advertiser has chosen not to accept applicants from your region.

Data Warehouse Engineer

R400000 - R1200000 Y Nando's South Africa

Posted today


Job Description

The core role of a Data Warehouse Engineer is to design, build, and maintain the infrastructure necessary for data storage, processing, and analysis. This involves working with various data technologies, programming languages, and data modeling techniques to ensure data is accurate, reliable, and easily accessible to other teams within the Nando's SA region. Additionally, the application of the Ralph Kimball methodology is an integral part of this responsibility.
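The Kimball methodology mentioned above is dimensional modelling: fact tables, conformed dimensions, and slowly changing dimensions (SCDs). As a hedged sketch only (pure Python with invented record shapes, not Nando's actual pipeline), the function below applies the common Type 2 pattern: when a tracked attribute changes, the current row is expired and a new version inserted, so history is preserved.

```python
from datetime import date

def scd2_apply(dimension, natural_key, new_attrs, today):
    """Type 2 slowly changing dimension: keep history by versioning rows.

    dimension: list of dicts with keys
        'nk' (natural key), 'attrs', 'valid_from', 'valid_to', 'current'.
    """
    current = next(
        (r for r in dimension if r["nk"] == natural_key and r["current"]), None
    )
    if current and current["attrs"] == new_attrs:
        return dimension  # no change: nothing to do
    if current:
        # Expire the old version instead of overwriting it.
        current["valid_to"] = today
        current["current"] = False
    dimension.append(
        {"nk": natural_key, "attrs": dict(new_attrs),
         "valid_from": today, "valid_to": None, "current": True}
    )
    return dimension

dim_customer = []
scd2_apply(dim_customer, "C001", {"city": "Durban"}, date(2024, 1, 1))
scd2_apply(dim_customer, "C001", {"city": "Cape Town"}, date(2024, 6, 1))
print(len(dim_customer))           # 2 versions retained
print(dim_customer[1]["current"])  # True
print(dim_customer[0]["valid_to"]) # 2024-06-01
```

In a warehouse, the same logic would typically run as a merge against the dimension table rather than in application code.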

Minimum Requirements:

Bachelor's Degree or equivalent NQF level 6+ qualification (Computer Science or Information Systems or related)

Data Warehouse or Business Intelligence Certification (preferable but not essential)

4 – 6 years' experience as a Data Warehouse Engineer or BI Developer

Solid understanding of the Ralph Kimball methodology (ideal; please list any other methodologies you have experience of)

Experience with designing and developing data solutions using MS SQL Server

Experience with OLAP/cube development in Tabular

Experience with SQL Server Integration Services (SSIS)

Experience designing and developing data integration solutions.

Experience in documenting business processes for data solutions

Experience with BI and Data warehouse security

Experience with various BI tools such as Power BI

Data Warehouse Development and Management

Lead the design, development, and optimisation of all data applications, including extracting, transforming, and integrating data from multiple sources, with final approval from the manager.

Work with all external vendors in the provision of database integration, build, and maintenance services.

Lead the development and maintenance of the database architecture for specific projects, as required.

Work with DBA to maintain and configure SQL database servers and services with final approval from the manager.

Regularly review database architectures and designs to ensure compliance with business objectives as well as requirements for data integrity, quality, performance, management, scalability, reliability, and security and report back to the manager.

Transform data into actionable insights using the appropriate BI tools and systems with oversight from the manager.

Adhere to core data governance disciplines when performing BI activities, including data quality management, information lifecycle management, and information security and privacy.

Data Integration

Engage with relevant stakeholders to identify business intelligence requirements and document these according to prescribed standards and methods.

Assist in conducting a gap and impact analysis on identified requirements to formulate clear data structures and ensure appropriate delivery to stakeholders.

Translate business and technical requirements into efficient and sustainable data solutions.

Design and implement effective cross-functional data solutions and processes.

Validate and test data solution outputs for accuracy, user experience, and performance, ensuring that all errors and inefficiencies are corrected.

Collaborate with team members, end users, and other relevant stakeholders to provide input into strategies for integrating disparate systems.

Provide users with post-implementation support, including managing incidents and resolving problems using analysis, troubleshooting, and validation.

Stakeholder Management

Consult with users to determine service delivery requirements and improvement opportunities.

Establish and maintain effective working relationships with all internal and external stakeholders.

Business Support and Enablement

Contribute towards the planning, maintenance, and improvement of databases, including incorporating new releases and upgrades to ensure optimal performance.

Assist the manager in evaluating the effectiveness of existing internal processes and applications and propose solutions and opportunities for automation and audit controls.

Identify inefficiencies in system processes, recommending and implementing changes.

Gauge the effectiveness and efficiency of existing systems, developing, and executing strategies for improving or further leveraging these systems in consultation with the manager.

Research, evaluate, and recommend new tools and applications for use in assigned responsibilities.

User Support and Training

Provide 2nd level support for end users.

Escalate urgent and unresolved tickets to the relevant parties.

Participate in creating and improving the procedures for desktop support.

Prepare or assist in the preparation of data training content in the quest to make Nando's a data-driven organisation.

Compliance and Governance

Comply with all electronic and physical security procedures and standards.

Follow standard service desk procedures, including the logging of issues.

Adhere to Nando's technology standards, policies, and procedures.

Records Management and Reporting

Maintain all IT records and tracking for their area of responsibility and provide managers and users with regular updates as well as any relevant status and progress information.

Maintain a record of all inquiries from the initial call to incident resolution and provide the necessary information and documentation for issues that require escalation.


Data Warehouse Analyst

Centurion, Gauteng R500000 - R1200000 Y Momentum

Posted today


Job Description

Through our client-facing brands Metropolitan and Momentum, with Multiply (our wellness and rewards programme), and our other specialist brands, including Guardrisk and Eris Property Group, the group enables businesses and people from all walks of life to achieve their financial goals and life aspirations.

We help people grow their savings, protect what matters to them and invest for the future. We help companies and organisations care for and reward their employees and members. Through our own network of advisers or via independent brokers and utilising new platforms Momentum Metropolitan provides practical financial solutions for people, communities and businesses. Visit us at

Disclaimer: As an applicant, please verify the legitimacy of this job advert on our company career page.

Role Purpose
The Data Warehouse Analyst plays a crucial role in translating business needs into actionable insights. This role involves gathering, analyzing, and interpreting data to provide recommendations that drive strategic decision-making. The Data Warehouse Analyst possesses a strong understanding of data warehousing, ETL processes, data visualization, and reporting techniques. They are comfortable working with various data sources and BI tools and can effectively communicate complex information to both technical and non-technical audiences.

Requirements

  • Bachelor's degree in a quantitative field (e.g., Computer Science, Statistics, Mathematics, Business Analytics) or equivalent experience.
  • 5-8 years of experience in Data Warehouse analysis.
  • 5 years' experience in data modelling in a Data Warehouse.
  • Strong understanding of data warehousing concepts, ETL processes, and data modeling techniques.
  • Proficiency in SQL and experience working with relational databases.
  • Experience with at least one BI visualization tool (e.g., Tableau, Power BI, Looker).
  • Strong analytical and problem-solving skills.
  • Excellent communication and presentation skills, with the ability to explain complex technical concepts to non-technical audiences.
  • Ability to work independently and as part of a team.
  • Experience with cloud-based data warehousing solutions (e.g., Snowflake, BigQuery, Redshift) is a plus.
  • Experience with scripting languages (e.g., Python, R) is a plus.
  • Knowledge of statistical methods and techniques is a plus.

Duties & Responsibilities

  • Requirements Gathering: Collaborate with stakeholders to understand business needs and translate them into specific data and reporting requirements.
  • Data Collection & Preparation: Identify, collect, and clean data from various sources (e.g., databases, CRM systems, web analytics) ensuring data accuracy and integrity.
  • Data Modeling & Warehousing: Contribute to the design and maintenance of data models and data warehouses, optimizing for performance and scalability.
  • ETL Processes: Develop and maintain ETL (Extract, Transform, Load) processes to move data from source systems to the data warehouse.
  • Data Analysis & Interpretation: Conduct in-depth data analysis to identify trends, patterns, and insights that can inform business decisions.
  • Report & Dashboard Development: Design, develop, and maintain interactive dashboards and reports using BI tools (e.g., Tableau, Power BI, Looker) to visualize data and communicate findings effectively.
  • Performance Monitoring: Monitor the performance of BI solutions and identify areas for improvement.
  • Documentation: Create and maintain documentation for data sources, ETL processes, reports, and dashboards.
  • Collaboration & Communication: Effectively communicate findings and recommendations to stakeholders at all levels, including technical and non-technical audiences.
  • Continuous Learning: Stay up-to-date with the latest BI trends, tools, and techniques.
  • Mentorship: Mentor junior analysts and provide guidance on best practices.

Competencies

  • Data Analysis
  • Data Warehousing
  • ETL
  • SQL
  • Data Visualization
  • Reporting
  • Business Acumen
  • Communication (Written & Verbal)
  • Problem-Solving
  • Critical Thinking
  • Collaboration

Data Warehouse Engineer

2001 Johannesburg, Gauteng Parvana

Posted 104 days ago


Job Description

Permanent
About our client: Our client offers financial service solutions helping their clients achieve their dreams. With an emphasis on culture fit, they boast a dedicated team of over 600 employees, many with over a decade of tenure. They have built their culture on a feeling of togetherness, trust and respect and are always looking to support employees' continuous learning. Using Agile, they provide diverse services with a focus on research, innovation and improvement.

What you will be doing:

  • Design, develop, and modify database structures, relationships, data flows, and data interfaces within the data warehouse.
  • Analyse, create, and modify data structures to adapt to business needs and add enhanced functionality.
  • Develop and modify ETL processes to load data from various sources into the data warehouse.
  • Use programming languages, software development methods, and best practices to develop new data warehouse structures and reports and modify existing ones.
  • Unit test and debug code.
  • Participate in requirements gathering, analysis, technical design, testing, documentation, and project planning.
  • Consult with clients to gather information about needs, objectives, and requirements.
  • Identify and propose technical solutions to client requests and system problems.
  • Follow department standards and create written documentation and diagrams as required.
  • Apply a thorough understanding of data warehouse architecture, data structures, and fundamental design principles.
  • Work independently with general supervision, which may include providing guidance and support to entry-level professionals or support staff.
  • Contribute to the development of data warehouses for clients in the Insurance, Lending, and Employee Benefits sectors.
  • Take responsibility for software development, production support, and providing expertise in data warehouse and reporting applications.

What our client is looking for:

  • A relevant tertiary degree would be ideal (Computer Science, IT, Data Science, Information Systems, etc.)
  • Strong interest in analytical/dimensional modelling and data analytics tools. Insurance knowledge is a plus.
  • In-depth knowledge of business intelligence tools, including data warehousing and ETL (Extraction, Transformation, and Load). Yellowfin knowledge is beneficial.
  • Knowledge and understanding of the software development life cycle.
  • Knowledge and experience in relational databases, SQL, PostgreSQL.
  • Experience in database design and modelling for data warehouse and business intelligence applications, including relational database structures and normal forms.
  • Analytical and troubleshooting skills for complex technical issues and tasks.
  • Ability to present and explain complex technical topics, problems, and alternative solutions to others.
  • Experience estimating solution development and delivering solutions within those estimations.
  • Knowledge of insurance systems.
  • Experience with or knowledge of most of the following software, languages, and tools, or similar products, would be preferred: SQL Server DBMS, T-SQL (ANSI), PostgreSQL, Python.

Job ID: J

For a more comprehensive list of opportunities that we have on offer, do visit our website -

Senior Data Warehouse Engineer

R450000 - R900000 Y DeARX

Posted today


Job Description

About the Job: Senior Data Warehouse Engineer

Duration: 12-month contract with the possibility to convert to perm

Location: Pretoria

Are you the go-to expert when it comes to stabilising, optimising, and governing busy data warehouse environments?

Do you thrive under pressure in high-volume, highly regulated industries and know how to keep things running smoothly while driving improvements?

We're looking for a Senior Data Warehouse Engineer to take charge of a mission-critical environment on a 12-month contract, based full-time onsite in Pretoria. This is a role where your expertise will directly impact stability, efficiency, and compliance, and where you'll be seen as the Subject Matter Expert.

Role Responsibilities

  • Architect & Optimise: Design, build, and maintain scalable, stable warehouse environments that support business growth.
  • ETL Leadership: Develop and optimise SSIS pipelines with robust controls for accuracy, reliability, and automation.
  • Full Data Lifecycle: Manage ingestion, transformation, storage, archival, and retrieval processes end-to-end.
  • Governance & Compliance: Establish strong data governance practices, ensuring segregation, cataloguing, and audit readiness.
  • Continuous Improvement: Simplify, centralise, and streamline operations to ensure stability and scalability.
  • Audit Champion: Partner with auditors and consultants to ensure smooth reviews and compliance checks.

Role Requirements

  • Proven experience in complex, high-volume warehouse environments.
  • Background in banking or financial services (preferred), with exposure to regulatory and audit-heavy environments.
  • Strong expertise in data governance, frameworks, and control processes.
  • Technical strength in:
  • MS SQL (on-prem, heterogeneous environment: Mongo, Postgres, OpenQuery)
  • ETL using SSIS, automated control environments, and SQLM Control integrated with SSMS
  • Demonstrated ability to stabilise and scale existing data ecosystems.
  • Comfortable engaging with auditors, consultants, and both technical and business stakeholders.
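The "automated control environments" mentioned in the requirements usually come down to reconciliation checks recorded per run: source row counts versus loaded counts, with mismatches surfaced rather than swallowed. A tool-agnostic sketch of that control pattern (plain Python; the SSIS/SQLM specifics above are not reproduced here, and the batch name and data are invented):

```python
from datetime import datetime, timezone

def run_with_controls(batch_name, extract, load):
    """Wrap a load step with row-count reconciliation, recorded as audit evidence."""
    rows = extract()
    loaded = load(rows)
    control = {
        "batch": batch_name,
        "ran_at": datetime.now(timezone.utc).isoformat(),
        "source_rows": len(rows),
        "loaded_rows": loaded,
        "status": "OK" if loaded == len(rows) else "MISMATCH",
    }
    if control["status"] != "OK":
        # In a real environment this would alert and block downstream steps.
        raise RuntimeError(f"Row-count mismatch in {batch_name}: {control}")
    return control

target = []

def load_into_target(rows):
    target.extend(rows)
    return len(target)

control = run_with_controls(
    "daily_balances",
    extract=lambda: [("acc1", 10.0), ("acc2", 20.0)],
    load=load_into_target,
)
print(control["status"], control["loaded_rows"])  # OK 2
```

Persisting these control records per batch is what gives auditors the evidence trail the role description asks for.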

Why This Role?

  • Impact: Your work will directly stabilise and improve a warehouse that underpins critical operations.
  • Challenge: An opportunity to tackle high-volume, high-stakes data environments.
  • Recognition: Be the trusted SME in a role where your skills are highly visible and valued.
  • Contract with Purpose: A 12-month engagement that could open doors to further opportunities.

If you're a seasoned Data Warehouse Engineer ready to bring stability, governance, and efficiency to a busy environment, we'd love to hear from you.

Apply now and put your expertise where it really matters.

 
