21 Data Pipelines jobs in South Africa

Lead Developer: Data Architecture

R600000 - R1200000 per year · Work Mosadi Solutions

Posted today

Job Description

Lead Developer

3 Months Renewable Contract

Available Immediately

Banking

Role Overview

We are seeking a highly skilled Lead Developer with strong expertise in data architecture, data engineering, data modelling, and SQL to join our team in the banking sector. The successful candidate will play a critical role in designing, building, and optimizing scalable data solutions while providing technical leadership and mentorship to a junior team of developers and analysts. This role requires both hands-on technical expertise and the ability to guide and upskill a growing team.

Key Responsibilities

  • Lead the design, development, and implementation of robust data architectures to support banking applications and analytics.
  • Develop, optimize, and maintain ETL/ELT pipelines, data warehouses, and data lakes.
  • Drive the design and enforcement of data modelling standards to ensure accuracy, consistency, and scalability across systems.
  • Write efficient and optimized SQL queries, stored procedures, and database scripts for complex data processing (see the sketch after this list).
  • Collaborate with stakeholders across business and IT to translate requirements into scalable technical solutions.
  • Ensure data solutions comply with regulatory, governance, and security requirements in the banking sector.
  • Provide hands-on technical leadership, code reviews, and best practices to elevate the technical quality of deliverables.
  • Mentor, coach, and upskill a junior team of developers and analysts, fostering a culture of knowledge sharing and continuous improvement.
  • Stay current with industry trends, emerging technologies, and best practices in data engineering and architecture.
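
To make the SQL-tuning expectation concrete, here is a minimal, hedged sketch using Python's standard-library sqlite3. The transactions table, its columns, and the index name are hypothetical stand-ins for a banking RDBMS, but the EXPLAIN-driven tuning workflow carries over.

    import sqlite3

    # In-memory stand-in for a banking RDBMS; the schema is hypothetical.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE transactions ("
        " id INTEGER PRIMARY KEY,"
        " account_id INTEGER NOT NULL,"
        " amount REAL NOT NULL,"
        " booked_at TEXT NOT NULL)"
    )
    conn.executemany(
        "INSERT INTO transactions (account_id, amount, booked_at) VALUES (?, ?, ?)",
        [(i % 100, i * 1.5, "2024-01-01") for i in range(10000)],
    )

    query = "SELECT SUM(amount) FROM transactions WHERE account_id = ?"

    # Before indexing, the planner reports a full table SCAN.
    for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
        print(row)

    # A covering index (filter column plus aggregated column) turns it into a SEARCH.
    conn.execute("CREATE INDEX idx_tx_account ON transactions (account_id, amount)")
    for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
        print(row)

Reading the plan before and after an index change is the kind of evidence a tuning review in this role would be expected to produce.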

Key Skills & Competencies

  • Strong leadership and mentorship abilities, with proven experience developing junior talent.
  • Excellent communication and stakeholder management skills, with the ability to explain complex technical concepts to non-technical audiences.
  • Strong problem-solving skills, analytical thinking, and attention to detail.
  • Ability to work under pressure in a regulated, fast-paced banking environment.

Technical Requirements

  • Proven expertise in SQL development (query optimization, stored procedures, performance tuning).
  • Strong knowledge of data architecture and data modelling principles (relational, dimensional, and NoSQL).
  • Experience with data engineering frameworks and technologies (e.g., Python, Spark, Kafka, Airflow, or similar); see the orchestration sketch after this list.
  • Experience designing and managing data warehouses and/or data lakes (e.g., Snowflake, Redshift, BigQuery, or equivalent).
  • Proficiency in ETL/ELT design and implementation.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
  • Strong understanding of data governance, data quality, and security practices within financial services.
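
Since the frameworks bullet names Airflow alongside Python and Spark, the sketch below shows a minimal daily ETL DAG in Airflow 2.x. The DAG id, task ids, and callables are invented for illustration; "schedule" is the Airflow 2.4+ spelling (older releases use "schedule_interval").

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull new rows from a source system.
        print("extracting")

    def load():
        # Placeholder: write transformed rows to the warehouse.
        print("loading")

    with DAG(
        dag_id="daily_banking_etl",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task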

Qualifications & Experience

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or related field (Master's advantageous).
  • 8+ years of experience in data engineering, data architecture, and SQL development.
  • 3+ years in a leadership or senior developer role, with mentorship experience.
  • Prior experience in the banking or financial services sector is strongly preferred.

Data Engineering and Architecture Lead

R1200000 - R2400000 per year · X, bigly labs

Posted today

Job Description

At X, bigly labs, we're Dis-Chem's high-performance innovation hub, where bold ideas meet data, design, and radical customer focus. Our mission is simple: power the future of healthcare by lowering costs, improving outcomes, and unlocking new possibilities. We're driven by one big question: how do we use data + technology today to create healthier lives tomorrow? Here, we don't just imagine the future, we build it. From cutting-edge digital solutions to smarter, patient-focused experiences, we're reimagining health tech to make breakthroughs possible.

Welcome to X, bigly labs. This is healthcare, reimagined.

From retail and pharmacy to clinics, insurance and beyond, we're applying machine learning and smart systems to reimagine healthcare pathways and create personalised experiences at scale. Here, your work doesn't sit in a notebook or on a whiteboard. It becomes real. It shapes decisions. It improves lives. And we do it all the Bigly way, questioning, challenging, and building audaciously, together.

We are seeking a visionary Data Engineering and Architecture Lead to lead the design, development, and optimisation of our enterprise data ecosystem. This role is responsible for building scalable, secure, and high-performance data platforms that power analytics, AI, and operational excellence across Dis-Chem and its affiliated businesses. You will be the architect of our data assets, ensuring that our data infrastructure is not only robust and compliant, but also agile enough to support innovation in healthcare, retail, insurance, and beyond.

WHAT WE'RE LOOKING FOR
Minimum

  • Bachelor's degree (or the equivalent) in Computer Science, Data Engineering, or related field
  • 7+ years of experience in data engineering or architecture, with 3+ years in a leadership role
  • Deep expertise in SQL, Python and/or Spark, cloud platforms (Azure or AWS), and cloud-native tools (e.g. Databricks)
  • Proven experience designing, implementing, and scaling enterprise data platforms using the Medallion Architecture (see the sketch after the Advantageous list)
  • Strong understanding of data governance, security, and compliance frameworks
  • Excellent leadership, communication, and stakeholder engagement skills

Advantageous

  • Experience in healthcare, retail, or insurance data ecosystems
  • Familiarity with data mesh, lakehouse architecture and real-time data processing
  • Certifications in cloud architecture or data engineering
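
For context on the Medallion Architecture requirement above, here is a rough PySpark sketch under stated assumptions: a Delta-enabled Spark session already exists as "spark" (as on Databricks), and all paths and column names are hypothetical.

    from pyspark.sql import functions as F

    # Bronze: land the raw feed as-is, preserving source fidelity.
    bronze = spark.read.json("/mnt/raw/pharmacy_sales/")  # hypothetical path
    bronze.write.format("delta").mode("append").save("/mnt/bronze/pharmacy_sales")

    # Silver: enforce types, drop malformed rows, deduplicate.
    silver = (
        spark.read.format("delta").load("/mnt/bronze/pharmacy_sales")
        .withColumn("sale_ts", F.to_timestamp("sale_ts"))
        .dropna(subset=["store_id", "sku", "sale_ts"])
        .dropDuplicates(["store_id", "sku", "sale_ts"])
    )
    silver.write.format("delta").mode("overwrite").save("/mnt/silver/pharmacy_sales")

    # Gold: business-level aggregate for analytics and AI features.
    gold = silver.groupBy("store_id").agg(F.sum("amount").alias("total_revenue"))
    gold.write.format("delta").mode("overwrite").save("/mnt/gold/store_revenue")

The layering gives each table one job: bronze preserves the raw feed, silver enforces quality, and gold serves the business.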

WHAT YOU WILL BE DOING

  • Enterprise Data Architecture
  • Data Engineering Leadership
  • Platform Enablement & Innovation
  • Stakeholder Engagement & Governance
  • Financial & Vendor Management

WHO YOU ARE

  • Structured thinking and systems problem-solving
  • Commercial fluency and ability to articulate value levers
  • Strategic clarity balanced with practical execution
  • Able to co-create solutions with technical teams
  • Influences without authority and facilitates decision forums
  • Drives initiatives independently with high standards of quality

Our values aren't just ideals, they're the through-lines in how we think, build, and make decisions that impact real lives. From bold experimentation in digital solutions to platforms built on integrity, we're shaping a culture designed for progress that lasts. It's a culture that designs for the future, asks better questions, and answers them with care, urgency, and systems that scale.

Think you've got the energy, the curiosity, and the guts? Stay close, bigly things are ahead.


Data Engineering and Architecture Lead

Johannesburg, Gauteng · E-Merge IT Recruitment

Posted 10 days ago

Job Description

Permanent

We’re building the next-generation data backbone for a dynamic organisation.

We create scalable, secure, cloud-first platforms that power analytics, AI, and business intelligence across the enterprise.

We are currently searching for a Data Engineering and Architecture Lead with deep AWS Cloud expertise and a passion for solving complex data challenges.


Requirements:

  • Bachelor’s degree in computer science, information systems, or related field
  • 10+ years’ experience in data engineering or architecture
  • 5+ years leadership experience
  • 5+ years architectural experience
  • Proven experience with AWS services (S3, Glue, Redshift, Lambda, EMR, Athena)
  • Expertise in SQL, Python, and ETL/ELT development
  • Knowledge of data modelling (Kimball, Data Vault) and data governance
  • Leadership experience managing technical teams

Responsibilities:

  • Lead design and implementation of enterprise data architecture and integration strategies
  • Build and manage scalable, high-performance data pipelines
  • Ensure data availability and quality to support analytics, BI, and AI initiatives
  • Collaborate with business and technology teams to translate requirements into solutions
  • Define best practices, enforce standards, and mentor a team of data engineers
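
To ground the AWS services listed in the requirements (Glue, Athena, S3), here is a hedged boto3 sketch of driving one step of such a pipeline from Python. The job name, database, bucket, and region are placeholders, not details of this role.

    import boto3

    glue = boto3.client("glue", region_name="af-south-1")
    athena = boto3.client("athena", region_name="af-south-1")

    # Kick off a (hypothetical) Glue ETL job that curates raw S3 data.
    run = glue.start_job_run(JobName="curate_sales")
    print("Glue run id:", run["JobRunId"])

    # Query the curated table with Athena; results land in an S3 bucket.
    query = athena.start_query_execution(
        QueryString="SELECT store_id, SUM(amount) FROM sales GROUP BY store_id",
        QueryExecutionContext={"Database": "analytics"},  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print("Athena execution id:", query["QueryExecutionId"])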

Reference number for this position is GZ60852. This is a permanent position based in Melrose Arch, offering a cost-to-company salary of R1.8m per annum, negotiable on experience and ability. Contact Garth to discuss this and other opportunities.

Are you ready for a change of scenery? E-Merge IT Recruitment is a specialist niche recruitment agency. We offer our candidates options so that we can successfully place the right developers with the right companies in the right roles. Check out the E-Merge website for more great positions.

Do you have a friend who is a developer or technology specialist? We pay cash for successful referrals!


Data Integration

DLK Group

Posted today

Job Description

The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):

  • Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
  • Collaborating with stakeholders to understand and define data integration requirements.
  • Overseeing data governance and ensuring data integrity throughout integration processes.
  • Mentoring and providing technical guidance and support.
  • Troubleshooting and optimizing integration workflows for performance and reliability.

Requirements

Minimum Qualification:

A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.

Minimum Experience:

  • Minimum of 5 years' experience working with SQL, ETL, and APIs; 3 years in data integration and an agile-scrum environment within enterprise organisations of >1000 users. A minimal sketch of this combination follows.
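
A minimal sketch of the SQL + ETL + API combination above: extract records from a REST endpoint, transform them, and load them into a relational table. The endpoint URL and schema are invented, and sqlite3 stands in for the enterprise target database.

    import sqlite3

    import requests

    # Extract: fetch source records from a hypothetical upstream API.
    resp = requests.get("https://api.example.com/v1/customers", timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # Transform: keep only the fields the target schema needs.
    rows = [(r["id"], r["name"].strip(), r["email"].lower()) for r in records]

    # Load: upsert into the integration target.
    conn = sqlite3.connect("integration.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers"
        " (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customers (id, name, email) VALUES (?, ?, ?)", rows
    )
    conn.commit()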

Data Integration Analyst

R900000 - R1200000 per year · iLaunch

Posted today

Job Description

Design and implement advanced integration pipelines and ETL processes

Manage complex integrations to ensure seamless data flow

Collaborate with stakeholders to define and understand data integration requirements

For senior roles:

Oversee data governance and data integrity

Mentor in technical integration and troubleshoot to optimise integration performance

Degree in Information and Communication Technology (ICT-related: Computer Science, Information Systems)

3 years' experience in ETL, SQL, and APIs; integration and analysis

Experience working in a large enterprise with at least 1,000 users and a multi-project environment

2-3 years' experience in web-based application environments and Microsoft Office Professional

Experience defining and implementing product/integration requirements in a Sprint, Scrum/Agile environment

Use of integration tools such as Azure Data Factory, Informatica, or Talend

Between 3 and 5 years' experience overall


Senior Data Integration Engineer

R2000000 - R2500000 per year · Indsafri

Posted today

Job Description

Job Title: Senior Data Integration Engineer (Salesforce, Databricks & MuleSoft)

Location: Johannesburg (Hybrid)

Employment Type: Contract

Contract Tenure: 6 to 12 months

Job Summary

We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.

You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.

Required Skills & Experience

  • 6+ years of professional experience in a data engineering, integration development, or data architecture role.
  • Proven hands-on experience with MuleSoft: demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
  • Strong proficiency in Databricks: hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration.
  • Demonstrable experience with Salesforce Data Cloud: in-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
  • Expert SQL & Python skills: ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
  • Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
  • Experience working with major cloud platforms (AWS, Azure, or GCP).

Preferred Qualifications

  • Certifications: Salesforce Data Cloud Consultant, MuleSoft Certified Developer / Architect, Databricks Certified Data Engineer Professional.
  • Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
  • Knowledge of CI/CD and DevOps practices in a data context.
  • Familiarity with streaming data technologies (e.g., Kafka).

Senior Data Integration Engineer

R1800000 - R2500000 per year · Indsafri

Posted today

Job Description

Job Summary

We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.

You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.

Key Responsibilities

  • Architect Integration Solutions: Lead the design and architecture of data integration patterns and end-to-end data flows between source systems, MuleSoft, Databricks, and Salesforce Data Cloud.
  • Develop MuleSoft APIs: Design, develop, and deploy reusable, API-led integration solutions using MuleSoft's Anypoint Platform to ingest data into the ecosystem and to syndicate data to downstream systems.
  • Build Advanced Data Pipelines in Databricks: Implement complex data transformation, cleansing, and enrichment pipelines using PySpark and SQL within the Databricks Lakehouse Platform. Prepare and model data for ingestion into Salesforce Data Cloud and for advanced analytics use cases (see the sketch after this list).
  • Master Salesforce Data Cloud: Configure and manage Salesforce Data Cloud, including setting up data streams, performing data mapping and harmonization, defining identity resolution rules, and creating insightful calculated metrics.
  • Enable Data Activation: Collaborate with marketing, sales, and service teams to build and activate complex audience segments from Salesforce Data Cloud for use in personalization and campaign execution.
  • Ensure Governance and Performance: Implement data quality checks, error handling, and performance monitoring across all platforms. Ensure solutions adhere to data governance policies, security standards, and privacy regulations.
  • Mentorship and Best Practices: Act as a senior technical resource for the team, establishing best practices for integration and data engineering. Provide guidance and mentorship to junior team members.
  • Stakeholder Collaboration: Work closely with business analysts, data scientists, and platform owners to gather requirements and deliver solutions that provide tangible business value.
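
As a hedged illustration of the cleansing-and-harmonisation step flagged in the Databricks bullet above, the sketch assumes a Databricks session named "spark" and invented Delta paths. It normalises the match keys that Salesforce Data Cloud identity resolution would later join on; the actual Ingestion API call is omitted, since its wire format is platform-specific.

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    raw = spark.read.format("delta").load("/mnt/silver/crm_contacts")  # hypothetical

    latest_per_email = Window.partitionBy("email").orderBy(F.col("updated_at").desc())

    harmonised = (
        raw
        # Normalise match keys so identity resolution can join on them.
        .withColumn("email", F.lower(F.trim("email")))
        .withColumn("phone", F.regexp_replace("phone", r"[^0-9+]", ""))
        # Deduplicate: keep the most recently updated record per email.
        .withColumn("rn", F.row_number().over(latest_per_email))
        .filter(F.col("rn") == 1)
        .drop("rn")
    )

    harmonised.write.format("delta").mode("overwrite").save("/mnt/gold/contacts_unified")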

Required Skills & Experience

  • 6+ years of professional experience in a data engineering, integration development, or data architecture role.
  • Proven hands-on experience with MuleSoft: demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
  • Strong proficiency in Databricks: hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration.
  • Demonstrable experience with Salesforce Data Cloud: in-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
  • Expert SQL & Python skills: ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
  • Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
  • Experience working with major cloud platforms (AWS, Azure, or GCP).

Preferred Qualifications

Certifications:

  • Salesforce Data Cloud Consultant
  • MuleSoft Certified Developer / Architect
  • Databricks Certified Data Engineer Professional
  • Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
  • Knowledge of CI/CD and DevOps practices in a data context.
  • Familiarity with streaming data technologies (e.g., Kafka).

Senior Data Integration/Analyst

7100 Cape Town, Western Cape · Sabenza IT & Recruitment

Posted 14 days ago

Job Description

  • Design and implement advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Architect scalable integration solutions across multiple systems and platforms.
  • Ensure optimal data flow and system interoperability through efficient API and service-based integrations.
  • Oversee data governance practices and ensure adherence to organisational standards.
  • Maintain high levels of data quality, consistency, and integrity across all integrated environments.
  • Establish and enforce best practices for integration security, compliance, and performance monitoring.
  • Engage with stakeholders to understand business and technical requirements for data integration.
  • Translate integration needs into detailed design specifications and actionable implementation plans.
  • Partner with development, data, and business teams to deliver end-to-end data integration solutions.
  • Diagnose and resolve integration and data flow issues in production and testing environments.
  • Optimise integration workflows for scalability, reliability, and performance efficiency.
  • Continuously review and enhance existing integration processes to support evolving business needs.
  • Provide technical guidance and mentorship to junior and mid-level data integration team members.
  • Promote knowledge sharing, best practices, and continuous improvement within the team.
  • Contribute to technical strategy, design reviews, and architectural decisions.

Requirements

  • Degree in Information and Communication Technology (ICT) or related field (Computer Science, Software Engineering, or Information Systems).
  • Minimum of 5 years' experience working with SQL, ETL tools, and APIs.
  • At least 3 years' experience in data integration within an agile-scrum environment and enterprise organisations (>1000 users).
  • Advanced proficiency in SQL, ETL tools (e.g., SSIS, Talend, Informatica, Azure Data Factory), and data integration frameworks.
  • Strong experience with API management, data warehousing, and cloud-based integration platforms.
  • Deep understanding of data governance, metadata management, and data quality principles.

Data Integration / Analyst (Senior-Level)

7400 Cape Town, Western Cape · DLK Group

Posted 8 days ago

Job Description

The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):

  • Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
  • Collaborating with stakeholders to understand and define data integration requirements.
  • Overseeing data governance and ensuring data integrity throughout integration processes.
  • Mentoring and providing technical guidance and support.
  • Troubleshooting and optimizing integration workflows for performance and reliability.

Requirements

Minimum Qualification: A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.

Minimum Experience: Minimum of 5 years' experience working with SQL, ETL, and APIs; 3 years in data integration and an agile-scrum environment within enterprise organisations of >1000 users.

Mid-Level Data Integration/Analyst

7100 Cape Town, Western Cape · Sabenza IT & Recruitment

Posted 14 days ago

Job Description

  • Design, develop, and maintain efficient data integration pipelines and workflows.
  • Implement ETL (Extract, Transform, Load) processes to move and prepare data for analysis and reporting.
  • Identify, analyse, and resolve data flow or integration issues in a timely manner.
  • Implement system fixes and process improvements to ensure data accuracy and consistency.
  • Support data mapping, transformation, and reconciliation activities between source and target systems.
  • Ensure data integrity during migration, transformation, and integration processes.
  • Work closely with data engineers, business analysts, and software teams to ensure integration solutions align with project and business requirements.
  • Participate in agile-scrum ceremonies and contribute to sprint planning, reviews, and retrospectives.
  • Maintain comprehensive documentation for integration processes, configurations, and data flow designs.
  • Adhere to data governance, security, and compliance standards in all integration activities.

Requirements

  • Degree in Information and Communication Technology (ICT) or related field (Computer Science, Software Engineering, or Information Systems).
  • Minimum of 3 years' experience working with SQL, ETL processes, and APIs.
  • At least 2 years' experience in data analysis or integration within an agile-scrum environment.
  • Proficiency in SQL and ETL tools (e.g., SSIS, Talend, Informatica, Azure Data Factory).
  • Experience with API integration and data exchange between systems.
  • Strong analytical and problem-solving skills.
  • Understanding of data quality, governance, and validation principles.