54 Data Integration jobs in South Africa
Data Integration
Posted today
Job Description
The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):
- Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
- Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
- Collaborating with stakeholders to understand and define data integration requirements.
- Overseeing data governance and ensuring data integrity throughout integration processes.
- Mentoring team members and providing technical guidance and support.
- Troubleshooting and optimizing integration workflows for performance and reliability.
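To make the first responsibility concrete, a minimal ETL step of the kind this role designs might look like the following Python sketch. The field names and cleaning rules are hypothetical, and a production pipeline would write to a real warehouse rather than an in-memory list:

```python
import json

def extract(raw_records):
    """Parse raw JSON strings into dicts, skipping malformed rows."""
    rows = []
    for raw in raw_records:
        try:
            rows.append(json.loads(raw))
        except json.JSONDecodeError:
            continue  # malformed source rows are dropped, not fatal
    return rows

def transform(rows):
    """Normalise hypothetical 'email' and 'amount' fields."""
    return [
        {
            "email": row.get("email", "").strip().lower(),
            "amount": round(float(row.get("amount", 0)), 2),
        }
        for row in rows
    ]

def load(rows, target):
    """Append transformed rows to an in-memory target (stand-in for a warehouse)."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = ['{"email": " Ann@Example.com ", "amount": "10.5"}', 'not-json']
load(transform(extract(raw)), warehouse)
```

Real pipelines add logging, dead-letter handling for rejected rows, and idempotent loads, but the extract/transform/load separation shown here is the common skeleton.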
Minimum Qualification:
A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.
Minimum Experience:
- Minimum of 5 years' experience working with SQL, ETL, and APIs, including 3 years in data integration within an Agile/Scrum environment at enterprise organisations of more than 1,000 users.
Data Integration Analyst
Posted today
Job Description
Design and implement advanced integration pipelines and ETL processes
Manage complex integrations to ensure seamless data flow
Collaborate with stakeholders to define and understand data integration requirements
For senior roles:
Oversee data governance and data integrity
Mentor in technical integration and troubleshoot to optimise integration performance
Degree in an ICT-related field (Computer Science, Information Systems)
3 years' experience in ETL, SQL, APIs, integration, and analysis
Experience working in a large enterprise with a headcount of at least 1,000 users and a multi-project environment
2-3 years' experience in web-based application environments and Microsoft Office Professional
Experience defining and implementing product/integration requirements in a Sprint-based Scrum/Agile environment
Use of integration tools such as Azure Data Factory, Informatica, or Talend
Between 3 and 5 years' experience
Senior Data Integration Engineer
Posted today
Job Description
Job Title: Senior Data Integration Engineer (Salesforce, Databricks & MuleSoft)
Location: Johannesburg (Hybrid)
Employment Type: Contract
Contract Tenure: 6 to 12 months
Job Summary
We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.
You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.
Required Skills & Experience
- 6+ years of professional experience in a data engineering, integration development, or data architecture role.
- Proven hands-on experience with MuleSoft: demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
- Strong proficiency in Databricks: hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration.
- Demonstrable experience with Salesforce Data Cloud: in-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
- Expert SQL & Python skills: ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
- Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
- Experience working with major cloud platforms (AWS, Azure, or GCP).
Preferred Qualifications
- Certifications:
- Salesforce Data Cloud Consultant
- MuleSoft Certified Developer / Architect
- Databricks Certified Data Engineer Professional
- Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
- Knowledge of CI/CD and DevOps practices in a data context.
- Familiarity with streaming data technologies (e.g., Kafka).
Senior Data Integration Engineer
Posted today
Job Description
Job Summary
We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.
You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.
Key Responsibilities
- Architect Integration Solutions: lead the design and architecture of data integration patterns and end-to-end data flows between source systems, MuleSoft, Databricks, and Salesforce Data Cloud.
- Develop MuleSoft APIs: design, develop, and deploy reusable, API-led integration solutions using MuleSoft's Anypoint Platform to ingest data into the ecosystem and to syndicate data to downstream systems.
- Build Advanced Data Pipelines in Databricks: implement complex data transformation, cleansing, and enrichment pipelines using PySpark and SQL within the Databricks Lakehouse Platform; prepare and model data for ingestion into Salesforce Data Cloud and for advanced analytics use cases.
- Master Salesforce Data Cloud: configure and manage Salesforce Data Cloud, including setting up data streams, performing data mapping and harmonization, defining identity resolution rules, and creating insightful calculated metrics.
- Enable Data Activation: collaborate with marketing, sales, and service teams to build and activate complex audience segments from Salesforce Data Cloud for use in personalization and campaign execution.
- Ensure Governance and Performance: implement data quality checks, error handling, and performance monitoring across all platforms; ensure solutions adhere to data governance policies, security standards, and privacy regulations.
- Mentorship and Best Practices: act as a senior technical resource for the team, establishing best practices for integration and data engineering; provide guidance and mentorship to junior team members.
- Stakeholder Collaboration: work closely with business analysts, data scientists, and platform owners to gather requirements and deliver solutions that provide tangible business value.
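The identity-resolution and harmonization work described above is configured declaratively in Salesforce Data Cloud, but the underlying idea can be sketched in plain Python. This is a simplified, hypothetical stand-in (exact-match on a normalised email key; real match rules support fuzzy and multi-key matching):

```python
from collections import defaultdict

def resolve_identities(records, key="email"):
    """Group customer records that share a normalised match key and
    merge each group into one unified profile (last non-empty value wins).
    A crude illustration of what identity resolution rules do."""
    buckets = defaultdict(list)
    for rec in records:
        match_key = rec.get(key, "").strip().lower()
        if match_key:
            buckets[match_key].append(rec)
    profiles = []
    for recs in buckets.values():
        profile = {}
        for rec in recs:
            profile.update({k: v for k, v in rec.items() if v})
        profiles.append(profile)
    return profiles

# Two source records describing the same person, differing only in key casing.
crm = [{"email": "Ann@x.com", "name": "Ann"}, {"email": "ann@x.com ", "phone": "123"}]
unified = resolve_identities(crm)
```

The design choice to merge with "last non-empty value wins" is one of several reconciliation policies; platforms typically let you pick source priority or most-recent-update instead.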
Required Skills & Experience
- 6+ years of professional experience
in a data engineering, integration development, or data architecture role. - Proven hands-on experience with MuleSoft:
Demonstrable expertise in designing, building, and managing APIs using the Any point Platform (API-led connectivity, Data Weave, connectors). - Strong proficiency in Databricks:
Hands-on experience developing data pipelines using
PySpark
, SQL, Delta Lake, and job orchestration. - Demonstrable experience with Salesforce Data Cloud:
In-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities. - Expert SQL & Python Skills:
Ability to write complex, efficient SQL queries and Python code for data manipulation and automation. - Solid understanding of data modeling principles
and experience designing and working with ETL/ELT processes. - Experience working with major cloud platforms (
AWS, Azure, or GCP
).
Preferred Qualifications
Certifications:
- Salesforce Data Cloud Consultant
- MuleSoft Certified Developer / Architect
- Databricks Certified Data Engineer Professional
- Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
- Knowledge of CI/CD and DevOps practices in a data context.
- Familiarity with streaming data technologies (e.g., Kafka).
Senior Data Integration/ Analyst
Posted 14 days ago
Job Description
Data Integration / Analyst (Senior-Level)
Posted 8 days ago
Job Description
Mid-Level Data Integration/Analyst
Posted 14 days ago
Job Description
Integration & Data Solutions Specialist
Posted today
Job Description
Reporting to the Senior IT Manager, the Integration & Data Solutions Specialist will design, implement, and maintain system integrations and data flows across the business. This role combines hands-on technical expertise with data-driven problem solving: building secure, automated connections between platforms (e.g. Flowgear, APIs, internal systems) while also supporting data analysis and reporting through Power BI.
Systems Integration
- Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
- Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency.
- Build and support secure APIs and data pipelines between internal and client systems.
- Troubleshoot integration issues and liaise with vendors when required.
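A common building block behind the API and troubleshooting duties above is validating inbound payloads before they enter a middleware flow, so failures are caught at the boundary rather than deep in an integration. A minimal Python sketch, with an entirely hypothetical field contract:

```python
import json

# Hypothetical integration contract: fields every inbound message must carry.
REQUIRED_FIELDS = {"id", "timestamp", "amount"}

def validate_payload(raw):
    """Check an incoming JSON payload against the contract.
    Returns (ok, detail): on success detail is the parsed payload,
    on failure it is a human-readable reason for routing to an error queue."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return False, f"invalid JSON: {exc}"
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    return True, payload

ok, detail = validate_payload('{"id": 1, "timestamp": "2024-01-01", "amount": 9.99}')
```

In a middleware platform such as Flowgear, this kind of check typically lives in a validation node at the start of a workflow; the (ok, detail) tuple pattern makes it easy to branch valid and invalid messages to different routes.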
Data & Analytics
- Support data transformation and reporting initiatives across departments.
- Develop dashboards and reports in Power BI for internal teams and clients.
- Assist with data modelling and ensuring data quality within reporting solutions.
- Collaborate with finance and operations teams to deliver actionable insights.
Innovation & Continuous Improvement
- Research and evaluate new technologies for integration and analytics.
- Drive automation and process optimisation across IT operations.
- Contribute to the IT strategy by recommending best practices for data flow and reporting.
Requirements
Education
- Bachelor's Degree in IT, Computer Science, or related field.
Experience
- 3-5 years' experience in system integration, scripting, or related roles.
- Proficiency in scripting languages (PowerShell, Python, SQL).
- Experience with Power BI or other BI tools.
- Knowledge of APIs, JSON, and data structures.
- Strong analytical mindset with problem-solving ability.
- Good communication skills to work across business units.
Work Environment
- Hybrid of integration engineering and data analytics.
- Hands-on technical role with opportunities to contribute to business intelligence initiatives.
- Collaboration with IT, Finance, and Operations teams.
Join TreasuryONE for a rewarding career path
Specialist: Data Engineering
Posted today
Job Description
We are seeking a skilled and motivated Specialist: Data Engineering to join our dynamic Financial Services team.
The ideal candidate will play a key role in implementing the company's Data Strategy by driving data awareness, engagement, and monetization while ensuring operational excellence across platforms.
This role involves building and optimizing data pipelines, managing data architectures on both on-premises and cloud environments, ensuring data quality and compliance, and supporting Payments & Ecommerce teams with reliable data solutions.
The position is ideal for a technically strong data engineer with a solid understanding of data frameworks, who is eager to grow in a fast-paced and innovation-driven environment.
Key Responsibilities
Strategy Development & Implementation
- Implement the Data Strategy to drive customer awareness, engagement, experience, and monetization.
- Provide input into data monetization aspects of the Business Plan and cascade to OpCos.
- Drive product revenue growth, operational excellence, and customer satisfaction.
Operational Delivery – Data Platforms
- Support in developing frameworks and technical architectures for Financial Services innovation.
- Assist the Payments & Ecommerce team in achieving annual goals.
- Implement and manage data architectures across multiple platforms (web, mobile, social).
- Ensure continuous alignment of data platforms with Financial Services strategy and evolving business requirements.
- Oversee delivery of data platform services and ensure integration efficiency and quality.
Data Engineering
- Design, build, and maintain ETL/ELT pipelines from multiple data sources.
- Set up and manage centralized data storage systems and warehouses (e.g., Fabric, PrestoDB, Oracle).
- Utilize cloud technologies such as Microsoft Azure for scalable deployments.
- Implement workflow management tools (Apache Airflow, Prefect, or Dagster).
- Ensure data accuracy, automation, and observability in data flows.
- Optimize pipelines and analytics for performance, scalability, and reliability.
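The essence of the workflow-management tools named in this listing (Airflow, Prefect, Dagster) is running tasks in dependency order. A toy Python sketch of that idea, with hypothetical task names, illustrates why a transform never runs before its extract:

```python
def run_pipeline(tasks, dependencies):
    """Run tasks in dependency order: a toy stand-in for a DAG scheduler.
    tasks: name -> callable; dependencies: name -> list of upstream names.
    Assumes the dependency graph is acyclic (real schedulers verify this)."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in dependencies.get(name, []):
            run(upstream)  # upstream tasks must complete first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "load": lambda: log.append("load"),
    "extract": lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
order = run_pipeline(tasks, deps)
```

Production schedulers add what this sketch omits: retries, backfills, cycle detection, parallel execution of independent branches, and the observability the listing calls for.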
Governance & Reporting
- Participate in strategic and operational meetings, providing technical guidance.
- Support enterprise-wide transformation initiatives related to data management.
- Ensure adequate risk mitigation and compliance with regional data regulations.
- Prepare regular progress and performance reports for management.
Qualifications and Experience
- Bachelor's degree in Computer Science, Big Data, Database Administration, or a related field.
- Minimum 3 years' experience in advanced data engineering.
- Proven experience in big data on-premises and cloud data pipeline (Microsoft Fabric) deployment.
- Proficiency in SQL, Python, R, and Power BI.
- Experience with Oracle, Teradata, or Snowflake, and cloud platforms such as AWS, Azure, or GCP.
- Strong stakeholder management and communication skills.
- Experience in Telecommunications or Financial Services is an advantage.
- Willingness and flexibility to travel within Africa.
Skills and Competencies
- Data pipeline and API development expertise.
- Knowledge of Machine Learning Operations pipeline deployment.
- Strong analytical and problem-solving abilities.
- Agile and digital-first mindset.
- High attention to detail and commitment to quality.
- Excellent relationship-building and presentation skills.
Behavioural Qualities
Analytical and Detail-Oriented | Business-Focused | Self-Driven | Results-Oriented | Collaborative | Emotionally Intelligent
Manager, Data Engineering
Posted today
Job Description
Job Overview
Business Segment: Insurance & Asset Management
Location: ZA, GP, Roodepoort, 4 Ellis Street
Job Type: Full-time
Job Ref ID: A-0001
Date Posted: 10/6/2025
Job Description
To develop and maintain complete data architecture across several application platforms and provide capability across those platforms. To design, build, operationalise, secure, and monitor data pipelines and data stores according to applicable architecture, solution designs, standards, policies, and governance requirements, making data accessible for evaluation and optimisation for downstream use-case consumption. To execute data engineering duties according to standards, frameworks, and roadmaps.
Qualifications
Type of Qualification: First Degree
Field of Study: Business Commerce, Information Studies, or Information Technology
Experience Required
Software Engineering
Technology
5-7 years
Experience in building databases, warehouses, reporting and data integration solutions. Experience building and optimising big data data-pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
8-10 years
Deep understanding of data pipelining and performance optimisation, data principles, how data fits in an organisation, including customers, products and transactional information. Knowledge of integration patterns, styles, protocols and systems theory
8-10 years
Experience in database programming languages including SQL, PL/SQL, and Spark, and/or appropriate data tooling. Experience with data pipeline and workflow management tools.
Additional Information
Behavioural Competencies:
Adopting Practical Approaches
Articulating Information
Checking Things
Developing Expertise
Documenting Facts
Embracing Change
Examining Information
Interpreting Data
Managing Tasks
Producing Output
Taking Action
Team Working
Technical Competencies:
Big Data Frameworks and Tools
Data Engineering
Data Integrity
Data Quality
IT Knowledge
Stakeholder Management (IT)
Please note: All our recruitment processes comply with the applicable local laws and regulations. We will never ask for money or any form of payment as part of our recruitment process. If you experience this, please contact our Fraud line.