227 Data Integration Tools jobs in South Africa
Data Solutions Architect
Posted today
Job Description
Contract
Experience: 8 to 30 years
Salary: Negotiable
Job Published: 08 October 2025
Job Reference No.:
We're seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.
You'll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.
High-Level Project Overview
The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.
A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.
Your work will involve:
- Designing and developing scalable data pipelines, storage, and processing components in Azure.
- Supporting the integration of RAG systems with AI models and vector databases.
- Enabling robust data flow between AI, search, and warehousing layers.
- Contributing to architectural decisions on performance, governance, and scalability.
Tech Stack
- Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
- LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
- Storage: Azure Blob Storage Gen 2 (for documents and source data)
- Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
- App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
- Embedding Model: Azure OpenAI text-embedding-3-large
- Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
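As a rough illustration of how the retrieval step of such a RAG flow could be wired together with the stack above, here is a minimal Python sketch using the Azure AI Search and Azure OpenAI SDKs. The endpoints, keys, index name, and field names (content, content_vector) are illustrative assumptions, not details from the posting; it also assumes the embedding deployment is named after the model.

```python
# Minimal sketch of the retrieval step in a RAG flow: embed the question,
# then run a hybrid (keyword + vector) query against Azure AI Search.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai>.openai.azure.com",  # assumption
    api_key="<key>",
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint="https://<your-search>.search.windows.net",  # assumption
    index_name="documents-index",                         # hypothetical index
    credential=AzureKeyCredential("<key>"),
)

def retrieve(question: str, k: int = 5) -> list[str]:
    # Embed the question with the same model used at indexing time.
    emb = openai_client.embeddings.create(
        model="text-embedding-3-large",  # assumes deployment named after model
        input=question,
    ).data[0].embedding
    # Hybrid search: keyword match plus vector similarity on the embedding field.
    results = search_client.search(
        search_text=question,
        vector_queries=[VectorizedQuery(vector=emb, k_nearest_neighbors=k,
                                        fields="content_vector")],  # hypothetical field
        top=k,
    )
    return [doc["content"] for doc in results]  # hypothetical field
```

The retrieved passages would then be passed to the LLM for summarisation; that generation step is omitted here.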
Key Responsibilities
- Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
- Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
- Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
- Implement data models to support text retrieval, embedding, and summarization processes.
- Ensure compliance with data governance and security best practices.
- Mentor and support junior team members as the data capability scales.
Required Skills & Experience
- 7+ years' experience as a Data Engineer or Data Architect in enterprise environments.
- Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
- Advanced SQL and Python development experience.
- Proven experience with cloud data migration and modern data warehousing.
- Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
- Understanding of data orchestration, governance, and security principles.
- Experience in insurance or financial services preferred.
Why Join
This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You'll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.
To comply with the POPI Act, we require your permission to keep your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent.
If you have not received any feedback after 2 weeks, please consider your application unsuccessful.
Data Engineering, Data Architecture, Enterprise Architecture, Microsoft Azure, SQL, Python, Data Warehousing
Industries: Insurance, Financial Services
Integration & Data Solutions Specialist
Posted today
Job Description
Reporting to the Senior IT Manager, the Integration & Data Solutions Specialist will design, implement, and maintain system integrations and data flows across the business. This role combines hands-on technical expertise with data-driven problem solving, building secure, automated connections between platforms (e.g. Flowgear, APIs, internal systems) while also supporting data analysis and reporting through Power BI.
Systems Integration
- Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
- Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency.
- Build and support secure APIs and data pipelines between internal and client systems (a minimal sketch follows this list).
- Troubleshoot integration issues and liaise with vendors when required.
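As a loose illustration of the kind of scripted integration described above (outside of Flowgear itself), here is a minimal Python sketch that pulls records from one REST API and pushes them to another. Both endpoints, the authentication scheme, and the field mapping are hypothetical.

```python
# Minimal point-to-point data sync sketch; in practice middleware such as
# Flowgear would own orchestration, retries, and credential management.
import requests

SOURCE_URL = "https://source.example.com/api/orders"  # hypothetical endpoint
TARGET_URL = "https://target.example.com/api/orders"  # hypothetical endpoint

def sync_orders(token: str) -> int:
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.get(SOURCE_URL, headers=headers, timeout=30)
    resp.raise_for_status()
    pushed = 0
    for order in resp.json():
        # Reshape the source record into the target system's schema.
        payload = {"ref": order["id"], "total": order["amount"]}  # illustrative mapping
        requests.post(TARGET_URL, json=payload, headers=headers,
                      timeout=30).raise_for_status()
        pushed += 1
    return pushed
```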
Data & Analytics
- Support data transformation and reporting initiatives across departments.
- Develop dashboards and reports in Power BI for internal teams and clients.
- Assist with data modelling and ensuring data quality within reporting solutions.
- Collaborate with finance and operations teams to deliver actionable insights.
Innovation & Continuous Improvement
- Research and evaluate new technologies for integration and analytics.
- Drive automation and process optimisation across IT operations.
- Contribute to the IT strategy by recommending best practices for data flow and reporting.
Requirements
Education
- Bachelor's Degree in IT, Computer Science, or related field.
Experience
- 3-5 years' experience in system integration, scripting, or related roles.
- Proficiency in scripting languages (PowerShell, Python, SQL).
- Experience with Power BI or other BI tools.
- Knowledge of APIs, JSON, and data structures.
- Strong analytical mindset with problem-solving ability.
- Good communication skills to work across business units.
Work Environment
- Hybrid of integration engineering and data analytics.
- Hands-on technical role with opportunities to contribute to business intelligence initiatives.
- Collaboration with IT, Finance, and Operations teams.
Join TreasuryONE for a rewarding career path
Network Data Solutions Delivery Engineer
Posted today
Job Description
P.I. Works is the first company in the world to automate the management of a commercial 5G network.
Our automated network management products and services empower mobile operators to accelerate network transformation and to drive network quality and efficiency on the path to 5G. These solutions have been deployed by more than 84 mobile operators in 58 countries around the world.
Our success is built by our people, who strive for excellence and deliver great value to our customers.
At the core of our success is a culture built on integrity, trust, and respect. We are looking for a Network Data Solutions Delivery Engineer to lead mobile network product delivery projects, with a focus on next-generation optimization techniques using Network Automation Solutions, while maintaining full engagement with our customers as a single point of contact.
Job Description:
- Work closely with customer mobile network teams for network performance monitoring and reporting activities
- Provide onsite support for customer and feedback to product, integration and support teams
- Create custom dashboards and automated reports on key business metrics for various use cases
- Participate in technical meetings, monitor products performance, track customer demands
- Prepare and issue technical guidelines and results of technical studies
- Prepare and issue engineering level and executive-level reports
Required Qualifications:
- Diploma from an engineering department relevant to mobile telecommunications, or in computer science
- Basic knowledge of telecom-domain KPIs and parameters (Radio, Core, Transmission)
- Good level of SQL and ETL platform knowledge, able to write complex SQL queries
- Experience with tools in performance management (Optima/Helix, PrOptima, PRS, Eniq, etc.) is a plus
- Very good command of English
- Excellent customer relationship skills, with strong verbal and written communication
- Experience with OEM vendor EMS/OSS systems and Managed Objects, parameters and measurements is a plus
- Reporting / BI Tools experience (PowerBI, Tableau, Qlikview, etc.) is a plus
Data Science Manager: Telemetry Solutions
Posted today
Job Description
Who are we?
We made MiWay to change the way insurance works: to make it work your way, protecting what matters to you and making life a whole lot easier.
Why MiWay? We know what it's like to be in an early morning accident on the way to work, to watch helplessly as water floods your business premises, to have your most precious possessions stolen, to lose a truckload of costly stock, or to be stranded a long way from home.
So we created products that put you first, that help you get back on your feet, that make a difference when you're down and dealing with a loss. That's insurance the way it should be. Insurance your way. That's MiWay.
You may want to be part of that. Insurance the way it should be. Built around you. It's about you, not about us.
Why Join Us?
This is a rare opportunity to lead at the frontier of actuarial innovation - where data science, machine learning, and software engineering create real-world impact. As Data Science Manager: Telemetry Solutions, you'll help shape the future of insurance.
- Lead actuarial and data science initiatives across telematics, claims, and operational intelligence.
- Work with rich, real-time datasets from connected vehicles and IoT platforms to unlock insights and build predictive systems.
- Collaborate with cross-functional teams - including product, development, and claims - to integrate analytics into key decision-making processes.
- Apply and grow your technical skills across AI, machine learning, and production-grade systems in a dynamic environment.
- Join a forward-looking team that values innovation, ownership, and continuous learning.
If you're technically strong, strategically curious, and passionate about using data science to tackle bold, modern challenges - this is your opportunity to lead the change.
What will you do?
The Data Science Manager: Telemetry Solutions is a future-focused leadership role that bridges actuarial science, data science, machine learning, and engineering. You will lead a multidisciplinary team that leverages telematics and alternative data to deliver intelligent solutions across usage-based insurance (UBI), claims innovation, product development, and dynamic risk assessment.
This role offers a unique opportunity to operate at the intersection of data, strategy, and technology - driving meaningful transformation in how insurance products are developed, risks are understood, and operations are optimised.
Key Responsibilities
- Architect and deploy scalable, real-time systems for ingesting, processing, and analysing high-frequency vehicle sensor and telemetry data.
- Develop and maintain robust APIs and automation tools in Python and C# to support internal platforms and customer-facing applications.
- Oversee the creation and deployment of AI models - including computer vision and large language models (LLMs) - to enhance claims automation and document intelligence.
- Champion innovation within the actuarial domain by blending traditional methodologies with modern MLOps, software engineering, and streaming analytics.
- Manage end-to-end machine learning pipelines using tools like MLflow, ensuring reproducibility, governance, and performance monitoring (a minimal sketch follows this list).
- Collaborate closely with Product, IT, Data Engineering, and Claims teams to embed intelligent analytics into core business processes and customer experiences.
- Mentor and grow a high-performing team of data scientists and software engineers.
- Provide technical insight and data-driven narratives to support strategic decision-making through dashboards, models, and actuarial reporting.
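As a hedged illustration of the MLflow-managed pipeline work mentioned above, here is a minimal Python sketch that trains an XGBoost model on placeholder telemetry features and logs parameters, metrics, and the model artifact. The experiment name, features, and target are invented for the example.

```python
# Minimal MLflow tracking sketch: reproducible params, a monitored metric,
# and a versioned model artifact.
import mlflow
import mlflow.xgboost
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Placeholder telemetry features and binary risk target.
X, y = np.random.rand(1000, 12), np.random.randint(0, 2, 1000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

mlflow.set_experiment("telemetry-risk")  # hypothetical experiment name
with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 4}
    model = XGBClassifier(**params).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    mlflow.log_params(params)                 # reproducibility
    mlflow.log_metric("auc", auc)             # performance monitoring
    mlflow.xgboost.log_model(model, "model")  # governed model artifact
```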
Qualifications
- Degree in Actuarial Science, Data Science, Statistics, Computer Science, or a related field (Honours or Master's preferred).
Experience
- Minimum of 5 years' experience in data science or data engineering roles, preferably in a telematics-driven insurance environment.
- Proficient in Python and C#, with a solid foundation in object-oriented programming and API development.
- Advanced SQL skills, including data processing, database management, and query optimisation (Essential).
- Practical experience using Git for version control and collaboration (Essential).
- Hands-on experience with machine learning libraries and frameworks such as XGBoost, spaCy, Hugging Face, TensorFlow, or PyTorch.
- Proven ability to deploy and monitor models using MLflow and related MLOps tools.
- Experience with large-scale datasets, ideally from IoT or telematics sources.
- Working knowledge of Docker for containerisation (Intermediate to Advanced – Advantageous).
- Exposure to messaging systems such as RabbitMQ (Advantageous).
- Strong understanding of software engineering principles, including object-oriented design and clean code practices.
- Solid business acumen with the ability to translate technical work into strategic outcomes.
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- Demonstrated leadership experience and a passion for mentoring cross-functional teams.
- Strong analytical and problem-solving skills, with the ability to extract insights from high-volume, high-dimensional data.
Knowledge and Skills
- Actuarial problem solving
- Issues management
- Business knowledge
- Business analysis
Personal Attributes
- Self-development: contributing independently
- Interpersonal savvy: contributing independently
- Nimble learning: contributing independently
- Tech savvy: contributing independently
Core Competencies
- Cultivates innovation: contributing independently
- Customer focus: contributing independently
- Drives results: contributing independently
- Collaborates: contributing independently
- Being resilient: contributing independently
Build a successful career with us
We're all about building strong, lasting relationships with our employees. We know that you have hopes for your future – your career, your personal development and of achieving great things. We pride ourselves on helping our employees to realise their worth. Through its five business clusters – Sanlam Fintech, Sanlam Life and Savings, Sanlam Investment Group, Sanlam Allianz, Santam, as well as MiWay and the Group Office – the group provides many opportunities for growth and development.
Turnaround time
The shortlisting process will only start once the application due date has been reached. The time taken to complete this process will depend on how far you progress and the availability of managers.
Deadline to apply: 19 September 2025.
Our commitment to transformation
At MiWay we believe in cultivating a positive and dynamic working environment that gives you freedom and opportunity to succeed. MiWay is committed to transformation and embracing diversity. This is what drives us to achieve a multicultural workplace with employment equity as a key goal to create an inclusive workforce, reflective of the demographics of our society.
Data Engineer
Posted today
Job Description
We are looking for an experienced intermediate Data Engineer. The candidate must have strong SQL and SSIS capabilities and solid production support experience, including being on call on a 7-day rotation cycle. AWS cloud skills would be a major benefit.
Skills & Experience Required:
Technical Stack:
- Strong proficiency in SSIS (SQL Server Integration Services) – candidates should demonstrate extensive experience in developing, optimising, and maintaining SSIS packages in a production environment.
- Proven ability to perform 24/7 on-call support and handle production support issues effectively and independently.
- Ability to balance operational responsibilities with new development – we're looking for someone who is not only strong in maintaining and supporting existing systems but also has the capability to drive and implement new projects independently.
- Self-starter with the confidence to take ownership of deliverables, proactively identify issues, and provide solutions without needing constant direction.
- Strong SQL and data modelling skills (dimensional and normalized).
- Proficient in Python or Scala.
- Experience with Spark (PySpark preferred).
- Experience with cloud platforms, ideally AWS (e.g., S3, Glue, Athena, EMR).
- Knowledge of data warehouse and data lake architectures.
- Exposure to CI/CD pipelines and containerization (e.g., Docker, GitLab CI).
Production Support Requirements:
This role includes participation in a rotational production support schedule. The successful candidate must be willing and able to:
- Be on call every third week as part of a structured support roster.
- Respond to after-hours callouts, including late-night or early-morning alerts.
- Support and troubleshoot issues in the nightly batch process to ensure successful completion.
- Work collaboratively with operations and infrastructure teams to resolve time-sensitive issues under pressure.
- Maintain logs, escalate critical incidents, and ensure accurate handovers.
This support responsibility is critical to ensure the availability and continuity of data services required by business users and systems across the enterprise.
Data Engineer
Posted today
Job Description
Exact location: Rosebank, Firestation building, walking distance to the Gautrain station
Responsibilities will include, but not be limited to the following:
- Develop and optimize data pipelines for collecting, processing, and storing large volumes of production and operational data.
- Play a pivotal role in documenting business processes, procedures, and policies for transparency and consistency.
- Design and implement scalable data architecture solutions tailored to production/operation environments.
- Collaborate with clients, project managers, engineers, and operators to understand data requirements.
- Ensure data integrity, security, and compliance with project requirements and outcomes.
- Monitor and troubleshoot data flow issues and optimize data processing workflows.
- Support the integration of IoT devices, sensors, and industrial systems with data platforms (see the sketch after this list).
- Generate reports and dashboards to visualize real-time and historical data insights.
- Stay updated with the latest industrial data engineering tools and technologies.
- Travel to operational sites when required to ensure hardware and data capturing is functional.
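As a rough sketch of the IoT integration work mentioned above, the snippet below subscribes to an MQTT feed of sensor readings using paho-mqtt (1.x-style callbacks; 2.x requires a CallbackAPIVersion argument). The broker address and topic pattern are invented for the example.

```python
# Minimal industrial-telemetry ingestion sketch over MQTT.
import json
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style shown here

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # In a real pipeline this would land in a historian, queue, or data lake
    # rather than being printed.
    print(f"{msg.topic}: {reading}")

client = mqtt.Client()
client.on_message = on_message
client.connect("plant-broker.example.com", 1883)  # hypothetical broker
client.subscribe("sensors/+/telemetry")           # hypothetical topic pattern
client.loop_forever()
```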
Qualifications and experience:
- Bachelor's or master's degree in Computer Science, Data Engineering, Industrial Engineering, or related field.
- Proven experience in data engineering, especially in industrial or equipment operations settings.
- Strong proficiency in programming languages.
- Experience with cloud platforms and data tools (AVEVA, Power BI).
- An understanding of the Microsoft Power Platform, its functionalities, and its integration potential will be an advantage.
- Knowledge of IoT protocols and industrial automation systems.
- Familiarity with data modelling, ETL processes, and database management.
- Excellent problem-solving and communication skills.
- Attention to detail and a commitment to accuracy.
Data Engineer
Posted today
Job Description
Ready to architect the future of data on Google Cloud Platform?
Join Lumina Africa (PTY) LTD and lead innovative data solutions using cutting-edge GCP technologies. We're seeking a creative Data Engineer who thinks beyond traditional approaches and brings fresh perspectives to cloud-native data architectures.
What Makes This Role Exceptional:
- GCP Innovation Hub: Work exclusively with Google Cloud's latest data and AI services
- Global Tech Group: Part of Lumina Tech Group with operations across Dubai, London, and South Africa
- Cloud-First Culture: Build scalable, serverless data solutions from day one
- Rapid Growth Environment: Shape our expanding South African data practice
Core GCP Technologies You'll Master:
- BigQuery: Design and optimize large-scale data warehouses
- Cloud Dataflow: Build real-time and batch processing pipelines (a streaming sketch follows this list)
- Pub/Sub: Implement event-driven data architectures
- Cloud Composer (Airflow): Orchestrate complex data workflows
- Dataproc: Manage Spark and Hadoop workloads
- Cloud Storage: Architect data lake solutions
- Vertex AI: Integrate ML pipelines with data engineering workflows
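To make the Dataflow and Pub/Sub items concrete, here is a minimal Apache Beam sketch in Python of a Pub/Sub-to-BigQuery streaming pipeline (Beam is the programming model behind Cloud Dataflow). The project, subscription, and table names are placeholders, not details from the posting.

```python
# Minimal streaming pipeline sketch: Pub/Sub -> parse JSON -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # use DataflowRunner in production

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")  # hypothetical
        | "Parse" >> beam.Map(json.loads)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```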
What You'll Architect:
- Design cloud-native data pipelines using GCP services
- Build real-time streaming solutions with Pub/Sub and Dataflow
- Optimize BigQuery performance for petabyte-scale analytics
- Implement Infrastructure as Code using Terraform and Cloud Deployment Manager
- Create innovative data solutions that challenge conventional approaches
- Mentor teams on GCP best practices and modern data patterns
Essential GCP Expertise:
- 3+ years hands-on experience with Google Cloud Platform
- BigQuery mastery: complex SQL, partitioning, clustering, optimization
- Cloud Dataflow: Apache Beam, streaming and batch processing
- Python/Java: strong programming skills for data pipeline development
- Terraform/Cloud Deployment Manager: Infrastructure as Code
- Pub/Sub & Cloud Functions: event-driven architectures
- GCP certifications preferred (Professional Data Engineer, Cloud Architect)
Bonus Skills:
- Experience with dbt for data transformation
- Kubernetes and Cloud Run for containerized workloads
- Looker or Data Studio for visualization
- Apache Spark on Dataproc
- Cloud Security and IAM best practices
Why Choose Lumina Africa:
- GCP-Focused Career Path: Specialize in Google Cloud's data ecosystem
- Competitive Package: Market-leading salary + GCP certification support
- Hybrid Working: Modern offices with flexible arrangements
- Innovation Budget: Resources for experimenting with new GCP services
- International Projects: Collaborate across UAE, UK, and African markets
- Fast-Track Growth: Lead data initiatives in our expanding practice
Ready to Build the Future on GCP?
If you're passionate about Google Cloud Platform and ready to architect innovative data solutions that drive South Africa's digital transformation, we want to hear from you.
Data Engineer
Posted today
Job Description
We are seeking an experienced Data Engineer with strong expertise in Google Cloud Platform to join a fast-growing, innovative organisation. This role offers the chance to design, build, and optimise scalable data pipelines and architectures that support impactful decision-making across the business.
If you are analytically sharp, self-motivated, and enjoy working in dynamic environments, this could be the perfect opportunity. A passion for African business, curiosity, and a sense of humour will help you thrive in our energetic and forward-thinking culture.
Key Responsibilities
- Design and develop scalable data pipelines and architectures using Google Cloud Platform technologies (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
- Build and manage ETL processes to transform diverse data sources into structured, reliable formats (see the sketch after this list).
- Collaborate with data scientists and analysts to deliver solutions that enable insights and smarter decisions.
- Maintain documentation for pipelines, data models, and architecture to ensure clarity and consistency.
- Troubleshoot and resolve data issues while safeguarding quality and integrity.
- Optimise data workflows for performance, scalability, and cost efficiency.
- Automate data-related processes to streamline operations.
- Stay ahead of industry trends and adopt best practices in Google Cloud Platform and data engineering.
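As a small, hedged example of the ETL work referenced above, the sketch below runs an ELT-style aggregation inside BigQuery using the official Python client. The project, datasets, table names, and query are illustrative only.

```python
# Minimal BigQuery ELT step: transform raw events into a daily aggregate table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

job = client.query(
    """
    CREATE OR REPLACE TABLE analytics.daily_orders AS
    SELECT DATE(created_at) AS order_date,
           COUNT(*)         AS orders,
           SUM(amount)      AS revenue
    FROM raw.orders
    GROUP BY order_date
    """
)
job.result()  # block until the transformation job finishes
print(f"Processed {job.total_bytes_processed} bytes")
```

Pushing the transformation into BigQuery itself (rather than pulling data out) is what keeps this pattern scalable and cost-efficient.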
Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 5+ years of experience as a Data Engineer or in a similar role.
- Strong programming skills in BigQuery, Python, SQL, and Google Cloud Platform.
- Proven experience with ETL development and data modeling.
- Familiarity with data lakehouse concepts and techniques.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Hands-on experience with Google Cloud Platform technologies (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
- Experience in financial services would be an advantage.
Apply
Danielle Paxton
Senior Specialist Recruitment Consultant
Data Engineer
Posted today
Job Description
Do you like building data systems and pipelines? Do you enjoy interpreting trends and patterns? Are you able to recognize the deeper meaning of data?
Join Elixirr Digital as a Data Engineer and help us analyze and organize raw data to provide valuable business insights to our clients and stakeholders.
As a Data Engineer, you will be responsible for ensuring the availability and quality of data so that it becomes usable by target data users. You will work on a set of operations aimed at creating processes and mechanisms for the flow and access of data in accordance with the project scope and deadlines.
Discover the opportunity to join our Data & Analytics department and work closely with a group of like-minded individuals using cutting-edge technologies.
What will you be doing as a Data Engineer at Elixirr Digital?
- Working closely with Data Architects on AWS, Azure, or IBM architecture designs.
- Maintaining and building data ecosystems by working on the implementation of data ingestions, often in collaboration with other data engineers, analysts, DevOps, and data scientists.
- Ensuring the security of cloud infrastructure and processes by implementing best practices.
- Applying modern principles and methodologies to advance business initiatives and capabilities.
- Identifying and consulting on ways to improve data processing, reliability, efficiency, and quality, as well as solution cost and performance.
- Preparing test cases and strategies for unit, system, and integration testing (a minimal sketch follows this list).
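As a small illustration of the unit-testing point above, here is a hedged pytest-style sketch that validates a hypothetical ingestion-cleaning step. The function, column names, and data are invented for the example.

```python
# Minimal data-quality unit test for a hypothetical cleaning step;
# run with pytest.
import pandas as pd

def clean_events(df: pd.DataFrame) -> pd.DataFrame:
    # Normalise column names, then drop duplicates and rows missing a user id.
    df = df.rename(columns=str.lower)
    return df.drop_duplicates().dropna(subset=["user_id"])

def test_clean_events_removes_bad_rows():
    raw = pd.DataFrame({"USER_ID": [1.0, 1.0, None], "VALUE": [10, 10, 5]})
    out = clean_events(raw)
    assert list(out.columns) == ["user_id", "value"]
    assert len(out) == 1  # duplicate and null-id rows removed
```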
Competencies and skillset we expect you to have to successfully perform your job:
- Proficient in Python with extensive experience in data processing and analysis.
- Strong SQL expertise, adept at writing efficient queries and optimizing database performance.
- Previous working experience with the Azure/AWS data stack.
- Experienced in software development lifecycle methodologies, with a focus on Agile practices.
We Could Be a Perfect Fit If You Are:
- Passionate about technology. You anticipate, recognize, and resolve technical problems using a variety of specialized tools for application development and support.
- Independent. You are a self-motivated and ambitious individual, capable of managing multiple responsibilities effectively.
- Problem-solver. You think creatively and find solutions to complex challenges.
- Creative and outside-the-box thinker. You look beyond blog posts and whitepapers, competitions, and even state-of-the-art benchmarks to solve real-world problems.
- Communicator. Strong verbal and written communication skills are essential to ensure effective collaboration and timely delivery of results within the team.
- Proficient in English. We work across continents in a global environment, so fluent English, both written and spoken, is a must.
Why is Elixirr Digital the right next step for you?
From working with cutting-edge technologies to solving complex challenges for global clients, we make sure your work matters. And while you're building great things, we're here to support you.
Compensation & Equity:
- Performance bonus
- Employee Stock Options Grant
- Employee Share Purchase Plan (ESPP)
- Competitive compensation
Health & Wellbeing:
- Health benefits plan
- Flexible working hours
- Pension plan
Projects & Tools:
- Modern equipment
- Big clients and interesting projects
- Cutting-edge technologies
Learning & Growth:
- Growth and development opportunities
- Internal LMS & knowledge hubs
We don't just offer a job - we create space for you to grow, thrive, and be recognized.
Intrigued? Apply now
Data Engineer
Posted today
Job Description
Exciting Opportunity: Data Engineer (Kafka & Flink)
PBT Group is seeking a skilled Data Engineer with a drive for real-time data streaming and cutting-edge architecture. If you thrive in building and optimising scalable data processing systems, this role is for you.
What you'll do:
- Design, develop, and maintain high-performance data pipelines and streaming solutions
- Integrate and optimise data flows across diverse systems (a Kafka sketch follows this list)
- Collaborate with architects and data teams to deliver robust, scalable, and secure data solutions
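As a rough illustration of the event-driven pipelines this role involves, here is a minimal kafka-python consume-transform-produce sketch. The broker address, topic names, and enrichment step are assumptions; stateful, windowed processing would typically live in Flink rather than in a loop like this.

```python
# Minimal event-driven pipeline sketch: consume, enrich, re-publish.
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-events",                        # hypothetical source topic
    bootstrap_servers="localhost:9092",  # assumption
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="enrichment-service",
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    event["processed"] = True                 # illustrative enrichment step
    producer.send("enriched-events", event)   # hypothetical sink topic
```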
What you'll bring:
- Proven experience with Apache Flink (real-time stream processing) and Apache Kafka (event-driven pipelines)
- Strong development skills in Python, Java, or Scala
- Hands-on experience in Spring Boot and cloud or containerised environments (AWS, Kubernetes, Docker)
- Solid grasp of data modelling, integration, and data quality principles
Why join us?
Be part of an innovative team driving real-time data transformation across industries, using modern technologies and agile delivery methods.
If you're ready to take your data engineering career to the next level, we'd love to hear from you.
In line with the POPI Act, please provide consent for us to keep your details on our database for future opportunities.
If you do not receive feedback within two weeks, kindly consider your application unsuccessful.