54 Data Integration jobs in South Africa

Data Integration

DLK Group

Posted today

Job Description

The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):

  • Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
  • Collaborating with stakeholders to understand and define data integration requirements.
  • Overseeing data governance and ensuring data integrity throughout integration processes.
  • Mentoring team members and providing technical guidance and support.
  • Troubleshooting and optimizing integration workflows for performance and reliability.
Requirements

Minimum Qualification:

A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.

Minimum Experience:

  • Minimum of 5 years' experience working with SQL, ETL, and APIs, including 3 years in data integration within an agile-scrum environment at enterprise organisations of more than 1,000 users.

Data Integration Analyst

R900000 - R1200000 Y iLaunch

Posted today

Job Description

Design and implement advanced integration pipelines and ETL processes

Manage complex integrations across multiple systems to ensure seamless data flow

Collaborate with stakeholders to define and understand data integration requirements

For senior roles:

Oversee data governance and data integrity

Mentor others on technical integration and troubleshoot to optimise integration performance

Degree in Information and Communication Technology (ICT-related, Computer Science, Information Systems)

3 years' experience in ETL, SQL, APIs, integration, and analysis

Experience working in a large enterprise with a headcount of at least 1,000 users and a multi-project environment

2 - 3 years' experience in web-based application environments and Microsoft Office Professional

Experience defining and implementing product/integration requirements in a Sprint, Scrum/Agile environment

Use of integration tools such as Azure Data Factory, Informatica, or Talend

Between 3 and 5 years' experience


Senior Data Integration Engineer

R2000000 - R2500000 Y Indsafri

Posted today

Job Description

Job Title: Senior Data Integration Engineer (Salesforce, Databricks & MuleSoft)

Location: Johannesburg (Hybrid)

Employment Type: Contract

Contract Tenure: 6 to 12 months

Job Summary

We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.

You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.

Required Skills & Experience

  • 6+ years of professional experience in a data engineering, integration development, or data architecture role.
  • Proven hands-on experience with MuleSoft: Demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
  • Strong proficiency in Databricks: Hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration.
  • Demonstrable experience with Salesforce Data Cloud: In-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
  • Expert SQL & Python Skills: Ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
  • Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
  • Experience working with major cloud platforms (AWS, Azure, or GCP).

Preferred Qualifications

  • Certifications:
  • Salesforce Data Cloud Consultant
  • MuleSoft Certified Developer / Architect
  • Databricks Certified Data Engineer Professional
  • Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
  • Knowledge of CI/CD and DevOps practices in a data context.
  • Familiarity with streaming data technologies (e.g., Kafka).

Senior Data Integration Engineer

R1800000 - R2500000 Y Indsafri

Posted today

Job Description

Job Summary

We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.

You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.

Key Responsibilities

  • Architect Integration Solutions:
    Lead the design and architecture of data integration patterns and end-to-end data flows between source systems, MuleSoft, Databricks, and Salesforce Data Cloud.
  • Develop MuleSoft APIs:
    Design, develop, and deploy reusable, API-led integration solutions using MuleSoft's Anypoint Platform to ingest data into the ecosystem and to syndicate data to downstream systems.
  • Build Advanced Data Pipelines in Databricks:
    Implement complex data transformation, cleansing, and enrichment pipelines using PySpark and SQL within the Databricks Lakehouse Platform. Prepare and model data for ingestion into Salesforce Data Cloud and for advanced analytics use cases (an illustrative sketch follows this list).
  • Master Salesforce Data Cloud:
    Configure and manage Salesforce Data Cloud, including setting up data streams, performing data mapping and harmonization, defining identity resolution rules, and creating insightful calculated metrics.
  • Enable Data Activation:
    Collaborate with marketing, sales, and service teams to build and activate complex audience segments from Salesforce Data Cloud for use in personalization and campaign execution.
  • Ensure Governance and Performance:
    Implement data quality checks, error handling, and performance monitoring across all platforms. Ensure solutions adhere to data governance policies, security standards, and privacy regulations.
  • Mentorship and Best Practices:
    Act as a senior technical resource for the team, establishing best practices for integration and data engineering. Provide guidance and mentorship to junior team members.
  • Stakeholder Collaboration:
    Work closely with business analysts, data scientists, and platform owners to gather requirements and deliver solutions that provide tangible business value.
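
By way of illustration only, here is a minimal PySpark sketch of the kind of Databricks transformation step described above, producing a curated Delta table that a Data Cloud connector or the Ingestion API could consume downstream. The table and column names (raw.customer_events, curated.customer_profile, customer_id, email, event_timestamp) are hypothetical, not this employer's actual data model.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw customer events from a (hypothetical) Delta table.
raw = spark.read.table("raw.customer_events")

# Cleanse and standardise: normalise emails, drop records without an identifier,
# and derive simple per-customer engagement metrics.
curated = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .filter(F.col("customer_id").isNotNull())
       .groupBy("customer_id", "email")
       .agg(
           F.count("*").alias("event_count"),
           F.max("event_timestamp").alias("last_seen"),
       )
)

# Write a curated Delta table for downstream ingestion into Salesforce Data Cloud.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.customer_profile")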

Required Skills & Experience

  • 6+ years of professional experience in a data engineering, integration development, or data architecture role.
  • Proven hands-on experience with MuleSoft: Demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
  • Strong proficiency in Databricks: Hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration.
  • Demonstrable experience with Salesforce Data Cloud: In-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
  • Expert SQL & Python Skills: Ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
  • Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
  • Experience working with major cloud platforms (AWS, Azure, or GCP).

Preferred Qualifications

Certifications:

  • Salesforce Data Cloud Consultant
  • MuleSoft Certified Developer / Architect
  • Databricks Certified Data Engineer Professional
  • Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
  • Knowledge of CI/CD and DevOps practices in a data context.
  • Familiarity with streaming data technologies (e.g., Kafka).

Senior Data Integration/ Analyst

7100 Cape Town, Western Cape Sabenza IT & Recruitment

Posted 14 days ago

Job Description

  • Design and implement advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Architect scalable integration solutions across multiple systems and platforms.
  • Ensure optimal data flow and system interoperability through efficient API and service-based integrations.
  • Oversee data governance practices and ensure adherence to organisational standards.
  • Maintain high levels of data quality, consistency, and integrity across all integrated environments.
  • Establish and enforce best practices for integration security, compliance, and performance monitoring.
  • Engage with stakeholders to understand business and technical requirements for data integration.
  • Translate integration needs into detailed design specifications and actionable implementation plans.
  • Partner with development, data, and business teams to deliver end-to-end data integration solutions.
  • Diagnose and resolve integration and data flow issues in production and testing environments.
  • Optimise integration workflows for scalability, reliability, and performance efficiency.
  • Continuously review and enhance existing integration processes to support evolving business needs.
  • Provide technical guidance and mentorship to junior and mid-level data integration team members.
  • Promote knowledge sharing, best practices, and continuous improvement within the team.
  • Contribute to technical strategy, design reviews, and architectural decisions.

Requirements

  • Degree in Information and Communication Technology (ICT) or related field (Computer Science, Software Engineering, or Information Systems).
  • Minimum of 5 years' experience working with SQL, ETL tools, and APIs.
  • At least 3 years' experience in data integration within an agile-scrum environment and enterprise organisations (>1000 users).
  • Advanced proficiency in SQL, ETL tools (e.g., SSIS, Talend, Informatica, Azure Data Factory), and data integration frameworks.
  • Strong experience with API management, data warehousing, and cloud-based integration platforms.
  • Deep understanding of data governance, metadata management, and data quality principles.

Data Integration / Analyst (Senior-Level)

7400 Cape Town, Western Cape DLK Group

Posted 8 days ago

Job Description

The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):

  • Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
  • Collaborating with stakeholders to understand and define data integration requirements.
  • Overseeing data governance and ensuring data integrity throughout integration processes.
  • Mentoring team members and providing technical guidance and support.
  • Troubleshooting and optimizing integration workflows for performance and reliability.

Requirements

Minimum Qualification: A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.

Minimum Experience: Minimum of 5 years' experience working with SQL, ETL, and APIs, including 3 years in data integration within an agile-scrum environment at enterprise organisations of more than 1,000 users.

Mid-Level Data Integration/Analyst

7100 Cape Town, Western Cape Sabenza IT & Recruitment

Posted 14 days ago

Job Description

  • Design, develop, and maintain efficient data integration pipelines and workflows.
  • Implement ETL (Extract, Transform, Load) processes to move and prepare data for analysis and reporting.
  • Identify, analyse, and resolve data flow or integration issues in a timely manner.
  • Implement system fixes and process improvements to ensure data accuracy and consistency.
  • Support data mapping, transformation, and reconciliation activities between source and target systems.
  • Ensure data integrity during migration, transformation, and integration processes.
  • Work closely with data engineers, business analysts, and software teams to ensure integration solutions align with project and business requirements.
  • Participate in agile-scrum ceremonies and contribute to sprint planning, reviews, and retrospectives.
  • Maintain comprehensive documentation for integration processes, configurations, and data flow designs.
  • Adhere to data governance, security, and compliance standards in all integration activities.

Requirements

  • Degree in Information and Communication Technology (ICT) or related field (Computer Science, Software Engineering, or Information Systems).
  • Minimum of 3 years' experience working with SQL, ETL processes, and APIs.
  • At least 2 years' experience in data analysis or integration within an agile-scrum environment.
  • Proficiency in SQL and ETL tools (e.g., SSIS, Talend, Informatica, Azure Data Factory).
  • Experience with API integration and data exchange between systems.
  • Strong analytical and problem-solving skills.
  • Understanding of data quality, governance, and validation principles.

Integration & Data Solutions Specialist

TreasuryOne

Posted today

Job Description

Reporting to the Senior IT Manager, the Integration & Data Solutions Specialist will design, implement, and maintain system integrations and data flows across the business. This role combines hands-on technical expertise with data-driven problem solving: building secure, automated connections between platforms (e.g. Flowgear, APIs, internal systems) while also supporting data analysis and reporting through Power BI.

Systems Integration

  • Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
  • Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency; a small illustrative script follows this list.
  • Build and support secure APIs and data pipelines between internal and client systems.
  • Troubleshoot integration issues and liaise with vendors when required.
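
As a small illustration of the scripting and API work listed above (a sketch only, with placeholder URLs, field names, and an assumed bearer-token scheme rather than any actual TreasuryOne system), the following Python script pulls records from one REST endpoint and pushes a trimmed payload to another:

import os

import requests

SOURCE_URL = "https://example.internal/api/transactions"   # placeholder endpoint
TARGET_URL = "https://example.internal/api/ledger/import"  # placeholder endpoint
API_TOKEN = os.environ.get("API_TOKEN", "")                # keep credentials out of code


def sync_transactions() -> int:
    headers = {"Authorization": f"Bearer {API_TOKEN}"}

    # Extract: fetch source records.
    response = requests.get(SOURCE_URL, headers=headers, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Transform: keep only the fields the target system expects.
    payload = [
        {"id": r["id"], "amount": r["amount"], "currency": r.get("currency", "ZAR")}
        for r in records
    ]

    # Load: push the cleaned payload to the target endpoint.
    response = requests.post(TARGET_URL, json=payload, headers=headers, timeout=30)
    response.raise_for_status()
    return len(payload)


if __name__ == "__main__":
    print(f"Synced {sync_transactions()} transactions")

In a middleware-first setup the same flow would typically live in Flowgear rather than a standalone script; the sketch simply shows the extract-transform-load shape such automation takes.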

Data & Analytics

  • Support data transformation and reporting initiatives across departments.
  • Develop dashboards and reports in Power BI for internal teams and clients.
  • Assist with data modelling and ensuring data quality within reporting solutions.
  • Collaborate with finance and operations teams to deliver actionable insights.

Innovation & Continuous Improvements

  • Research and evaluate new technologies for integration and analytics.
  • Drive automation and process optimisation across IT operations.
  • Contribute to the IT strategy by recommending best practices for data flow and reporting.

Requirements

Education

  • Bachelor's Degree in IT, Computer Science, or related field.

Experience

  • 3-5 years' experience in system integration, scripting, or related roles.
  • Proficiency in scripting languages (PowerShell, Python, SQL).
  • Experience with Power BI or other BI tools.
  • Knowledge of APIs, JSON, and data structures.
  • Strong analytical mindset with problem-solving ability.
  • Good communication skills to work across business units.

Work Environment

  • Hybrid of integration engineering and data analytics.
  • Hands-on technical role with opportunities to contribute to business intelligence initiatives.
  • Collaboration with IT, Finance, and Operations teams.

Join TreasuryONE for a rewarding career path


Specialist: Data Engineering

Centurion, Gauteng R900000 - R1200000 Y Clyrofor SA

Posted today

Job Description

We are seeking a skilled and motivated Specialist: Data Engineering to join our dynamic Financial Services team.

The ideal candidate will play a key role in implementing the company's Data Strategy by driving data awareness, engagement, and monetization while ensuring operational excellence across platforms.

This role involves building and optimizing data pipelines, managing data architectures on both on-premises and cloud environments, ensuring data quality and compliance, and supporting Payments & Ecommerce teams with reliable data solutions.

The position is ideal for a technically strong data engineer with a solid understanding of data frameworks, who is eager to grow in a fast-paced and innovation-driven environment.

Key Responsibilities

Strategy Development & Implementation

  • Implement the Data Strategy to drive customer awareness, engagement, experience, and monetization.
  • Provide input into data monetization aspects of the Business Plan and cascade to OpCos.
  • Drive product revenue growth, operational excellence, and customer satisfaction.

Operational Delivery – Data Platforms

  • Support in developing frameworks and technical architectures for Financial Services innovation.
  • Assist the Payments & Ecommerce team in achieving annual goals.
  • Implement and manage data architectures across multiple platforms (web, mobile, social).
  • Ensure continuous alignment of data platforms with Financial Services strategy and evolving business requirements.
  • Oversee delivery of data platform services and ensure integration efficiency and quality.

Data Engineering

  • Design, build, and maintain ETL/ELT pipelines from multiple data sources.
  • Set up and manage centralized data storage systems and warehouses (e.g., Fabric, PrestoDB, Oracle).
  • Utilize cloud technologies such as Microsoft Azure for scalable deployments.
  • Implement workflow management tools (Apache Airflow, Prefect, or Dagster); a minimal scheduling sketch follows this list.
  • Ensure data accuracy, automation, and observability in data flows.
  • Optimize pipelines and analytics for performance, scalability, and reliability.
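
As a minimal illustration of the workflow-management point above (a sketch only, assuming Airflow 2.x and its TaskFlow API; the DAG name and sample records are invented, not this employer's pipelines), a small daily extract-transform-load DAG might look like this:

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["etl-sketch"])
def payments_etl_sketch():
    """Extract -> transform -> load, wired as three dependent tasks."""

    @task
    def extract() -> list[dict]:
        # Placeholder for a real source (API, database table, file drop).
        return [{"txn_id": 1, "amount": "100.50"}, {"txn_id": 2, "amount": "75.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Basic cleansing: cast amounts to floats and drop malformed rows.
        cleaned = []
        for row in rows:
            try:
                cleaned.append({"txn_id": row["txn_id"], "amount": float(row["amount"])})
            except (KeyError, ValueError):
                continue
        return cleaned

    @task
    def load(rows: list[dict]) -> None:
        # In practice this would write to the warehouse (e.g. Fabric, PrestoDB, Oracle).
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


payments_etl_sketch()

Prefect or Dagster would express the same dependency graph with their own decorators; the choice of orchestrator matters less than keeping extract, transform, and load observable as separate, monitorable steps.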

Governance & Reporting

  • Participate in strategic and operational meetings, providing technical guidance.
  • Support enterprise-wide transformation initiatives related to data management.
  • Ensure adequate risk mitigation and compliance with regional data regulations.
  • Prepare regular progress and performance reports for management.

Qualifications and Experience

  • Bachelor's degree in Computer Science, Big Data, Database Administration, or a related field.
  • Minimum 3 years' experience in Advanced Data Engineering.
  • Proven experience in Big Data on-premises and Cloud Data Pipeline (M. Fabric) deployment.
  • Proficiency in SQL, Python, R, and Power BI.
  • Experience with Oracle, Teradata, or Snowflake, and cloud platforms such as AWS, Azure, or GCP.
  • Strong stakeholder management and communication skills.
  • Experience in Telecommunications or Financial Services is an advantage.
  • Willingness and flexibility to travel within Africa.

Skills and Competencies

  • Data pipeline and API development expertise.
  • Knowledge of Machine Learning Operations pipeline deployment.
  • Strong analytical and problem-solving abilities.
  • Agile and digital-first mindset.
  • High attention to detail and commitment to quality.
  • Excellent relationship-building and presentation skills.

Behavioural Qualities

Analytical and Detail-Oriented | Business-Focused | Self-Driven | Results-Oriented | Collaborative | Emotionally Intelligent


Manager, Data Engineering

Roodepoort, Gauteng R1500000 - R2500000 Y Standard Bank

Posted today

Job Description

Job Overview

Business Segment: Insurance & Asset Management

Location: ZA, GP, Roodepoort, 4 Ellis Street

Job Type: Full-time

Job Ref ID: A-0001

Date Posted: 10/6/2025

Job Description

To develop and maintain complete data architectures across several application platforms and provide capability across those platforms. To design, build, operationalise, secure, and monitor data pipelines and data stores in line with the applicable architectures, solution designs, standards, policies, and governance requirements, making data accessible for evaluation and optimisation by downstream use cases. To execute data engineering duties according to standards, frameworks, and roadmaps.

Qualifications

Type of Qualification: First Degree

Field of Study: Business Commerce, Information Studies, or Information Technology

Experience Required

Software Engineering / Technology

  • 5-7 years: Experience in building databases, warehouses, reporting and data integration solutions. Experience building and optimising big data pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • 8-10 years: Deep understanding of data pipelining and performance optimisation, data principles, and how data fits in an organisation, including customers, products and transactional information. Knowledge of integration patterns, styles, protocols and systems theory.
  • 8-10 years: Experience in database programming languages including SQL, PL/SQL, Spark and/or appropriate data tooling. Experience with data pipeline and workflow management tools.

Additional Information

Behavioural Competencies:

Adopting Practical Approaches

Articulating Information

Checking Things

Developing Expertise

Documenting Facts

Embracing Change

Examining Information

Interpreting Data

Managing Tasks

Producing Output

Taking Action

Team Working

Technical Competencies:

Big Data Frameworks and Tools

Data Engineering

Data Integrity

Data Quality

IT Knowledge

Stakeholder Management (IT)

Please note: All our recruitment processes comply with the applicable local laws and regulations. We will never ask for money or any form of payment as part of our recruitment process. If you experience this, please contact our Fraud line on or

 
