505 Data Engineer jobs in South Africa
Data Engineers
Posted 27 days ago
Job Description
Can you build a model that tells a Kudu from an Eland from 200 feet in the air?
Or detect if a truck driver’s eyes are off the road — in real time?
Do you thrive on messy data — parsing it, cleaning it, structuring it — and turning it into insights that actually matter?
Do you care about elegant solutions?
From auto-deploying infrastructure as code to orchestrating Fivetran pipelines, it’s about building systems that just work — reliably, at scale.
Are you excited by the latest tech — but only when it solves real business problems?
Then you’re probably fluent in things like:
• Python, SQL, and vector embeddings
• OCR tools like Tesseract or AWS Textract
• NLP frameworks like spaCy or Hugging Face
• Transformers, fine-tuning, and custom classification models
• Docker, FastAPI, Airflow, and maybe even a bit of MLOps
• And yes, wrestling ugly PDFs into structured, machine-readable datasets
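As a hedged sketch of that last bullet (turning messy, OCR-style text into structured, machine-readable records), here is a minimal Python example. The invoice layout and field names are invented for illustration; in practice the raw text would come from a tool like Tesseract or AWS Textract rather than a string literal.

```python
# Illustrative only: once OCR has produced raw text, a parsing pass pulls
# line items into structured records and skips junk lines. The invoice
# format below is invented, not from any real system.
import re

RAW = """INVOICE 2024-117
Widget A ..... 3 x R49.99
Widget B..2 x R120.00"""

# Named groups capture item, quantity, and price; dot runs are layout noise.
LINE = re.compile(r"^(?P<item>.+?)\s*\.+\s*(?P<qty>\d+)\s*x\s*R(?P<price>[\d.]+)$")

def parse_lines(text):
    """Turn messy OCR-style text into structured dicts, skipping junk lines."""
    rows = []
    for line in text.splitlines():
        m = LINE.match(line.strip())
        if m:
            rows.append({"item": m["item"], "qty": int(m["qty"]),
                         "price": float(m["price"])})
    return rows

records = parse_lines(RAW)
```

The header line fails the pattern and is dropped, which is usually the point: most of the work is deciding what *not* to keep.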
We’re building applied AI systems that don’t just live in notebooks — they power decisions, detect risk, automate tasks, and unlock entirely new capabilities.
If that sounds like your vibe, drop me a message.
We should talk.
- Seniority level Entry level
- Employment type Full-time
- Job function Information Technology
Data Engineers (Denodo)
Posted today
Job Description
InfyStrat is on the lookout for skilled and driven Data Engineers with expertise in Denodo to join our innovative data team. As a Data Engineer, you will be responsible for designing, building, and maintaining data integration solutions that leverage Denodo’s data virtualization platform. Your role will be pivotal in transforming complex data into actionable insights, thereby empowering our stakeholders to make data-informed decisions. We are seeking candidates who are not only technically proficient but also enthusiastic about working with diverse datasets and developing efficient data pipelines. At InfyStrat, we value creativity, collaboration, and continuous learning. You will be part of a vibrant team that thrives on tackling challenges and driving the future of our data capabilities. If you are passionate about data engineering and are well-versed in Denodo, we invite you to apply and help us shape the data landscape of InfyStrat.
Responsibilities:
- Design and implement data integration solutions using Denodo to ensure seamless access to diverse data sources.
- Develop and maintain data models and metadata repositories.
- Optimize data virtualization processes for improved performance and scalability.
- Collaborate with data analysts, business stakeholders, and IT teams to gather requirements and deliver solutions.
- Monitor and troubleshoot data pipeline issues to ensure data quality and integrity.
- Stay updated with the latest trends and technologies in data engineering and virtualization.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or a similar role, with a strong focus on Denodo.
- Proficiency in SQL and experience with data modeling techniques.
- Familiarity with ETL processes and data warehousing concepts.
- Experience working with cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to work collaboratively in a team environment.
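Denodo itself is queried through SQL/VQL, but the core idea behind the role, data virtualization, can be sketched in plain Python: a "virtual view" federates live sources at query time instead of copying data into a warehouse. This is an illustration of the concept only, not Denodo's API; every class and field name below is invented.

```python
# Conceptual sketch of data virtualization (the idea behind platforms like
# Denodo). Nothing here is a Denodo API; names are invented for illustration.

class Source:
    """Adapts one underlying system (DB, API, file) to a common row format."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # stand-in for a real connector

    def fetch(self):
        return list(self.rows)

class VirtualView:
    """Answers queries by federating sources live; no physical copy is made."""
    def __init__(self, *sources):
        self.sources = sources

    def query(self, predicate=lambda r: True):
        # Pull from every source and apply the filter once, centrally.
        return [row for s in self.sources for row in s.fetch() if predicate(row)]

crm = Source("crm", [{"customer": "Acme", "region": "GP"}])
erp = Source("erp", [{"customer": "Umoja", "region": "WC"}])
customers = VirtualView(crm, erp)
gauteng = customers.query(lambda r: r["region"] == "GP")
```

The design choice the sketch highlights is the one the ad cares about: consumers see a single logical view while the data stays where it lives.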
Intermediate Data Engineers
Posted 5 days ago
Job Description
With September upon us, now's the moment to move beyond practice throws and play in the big leagues. Life is too short to sit on the sidelines; join a team where every pass counts and your contribution makes the difference.
At Communicate Recruitment, we're the coaches connecting Data Engineers with the right opportunities. Whether you're throwing efficient data pipelines, catching and cleaning messy datasets, or passing insights for the next play, we'll put you in the perfect position to advance.
Skills & Experience:
Minimum 3-5 years' experience in a related field.
Qualification:
A relevant degree or qualification gets you on the roster.
Solid intermediate experience ensures you can handle complex throws, maintain coordination, and contribute to the team's wins.
Don't let this disc fly past; apply now and make September 2025 the month you catch your next Data Engineering career milestone! #ITCareers #FrisbeeMindset #TeamworkWins
Contact DYLAN MAWONA on
Data Architects & Data Engineers (AWS)
Posted today
Job Description
Responsibilities:
Data Architects:
- Design scalable, secure, and high-performance cloud-based data architectures on AWS.
- Define data governance, security policies, and best practices for cloud-based systems.
- Collaborate with stakeholders to understand business needs and translate them into data solutions.
- Evaluate and recommend tools, frameworks, and technologies for cloud data environments.
Data Engineers:
- Build and maintain robust data pipelines and ETL/ELT workflows in AWS.
- Integrate diverse data sources and ensure data quality and integrity.
- Optimize performance of data systems for analytics and reporting.
- Work closely with Data Architects, Analysts, and Developers to support data-driven initiatives.
Requirements:
- 3 years of experience in Data Engineering or Data Architecture roles.
- Strong, hands-on expertise with AWS (e.g., S3, Redshift, Glue, Lambda, EMR, Athena).
- Solid understanding of cloud-based data warehousing, data lakes, and data modeling.
- Proficiency in SQL and scripting languages such as Python.
- Experience with CI/CD and data pipeline automation tools is an advantage.
- Strong problem-solving and communication skills.
- Ability to work independently in a remote-first environment.
Contact Hire Resolve for your next career-changing move.
Our client is offering a highly competitive salary for this role based on experience.
Apply for this role today: contact Sonique Beetge at or on LinkedIn. You can also visit the Hire Resolve website: hireresolve.us or email us your CV:
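The ETL/ELT workflow this listing centres on can be sketched as three small functions with a data-quality gate, the pattern the "ensure data quality and integrity" requirement implies. This is a hedged, in-memory illustration: the source and warehouse are plain Python lists standing in for systems like S3 and Redshift, and none of the names below are real AWS APIs.

```python
# Illustrative ETL sketch: extract -> transform (with a quality gate) -> load.
# In-memory stand-ins replace real sources/sinks; all names are invented.

def extract(source_rows):
    """Pull raw records from a source system."""
    return list(source_rows)

def transform(rows):
    """Normalize field names and types; drop rows that fail basic checks."""
    out = []
    for r in rows:
        if not r.get("amount"):          # quality gate: reject incomplete rows
            continue
        out.append({"customer": r["cust"].strip().title(),
                    "amount": float(r["amount"])})
    return out

def load(rows, warehouse):
    """Append cleaned rows to the target table (a list standing in for a DB)."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = [{"cust": " acme ", "amount": "120.50"}, {"cust": "umoja", "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
```

In a real AWS pipeline the same three stages would typically be Glue or Lambda jobs orchestrated by a scheduler, with the quality gate emitting rejected rows to a dead-letter location rather than silently dropping them.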
Data Architects & Data Engineers (AWS) – Remote
Posted today
Job Description
Our client is offering a highly competitive salary for this role based on experience.
Apply for this role today: contact Gaby Turner at or on LinkedIn. You can also visit the Hire Resolve website: hireresolve.us or email us your CV:
Research Assistant (Administrative tax data | Big Data)
Posted 1 day ago
Job Description
UNU-WIDER is seeking exceptional candidates for the position of Research Assistant, based in Pretoria, South Africa, to support the SA-TIED programme. This role involves managing and enhancing tax datasets, assisting researchers, and ensuring high standards of data confidentiality.
UNU offers three types of contracts: fixed-term staff positions (General Service, National Officer and Professional), Personnel Service Agreement positions (PSA), and consultant positions (CTC). For more information, see the Contract Types page.
Data Warehousing Specialist | Centurion
Posted 21 days ago
Job Description
Join a leading Financial Services firm in their quest for excellence! Are you a skilled Data Warehousing Specialist seeking an exciting opportunity to make a significant impact? Our client, a prominent player in the financial industry, is actively seeking a talented professional like yourself to join their dynamic team. You will need to establish and lead a world-class data analytics/warehouse capability for the company to enable future needs for advanced analytics and AI.
Maintain and support:
- Existing MIS databases.
- Existing reports and dashboards.
- Existing data warehouses.
Develop, test, deploy, maintain and support new databases, reporting, data warehouse and business intelligence applications, from high-level business requirements and designs through the Software Development Life Cycle.
Remain informed about developments and trends in the data enablement field to assist the business to keep its data analytics and management capability up-to-date, and able to meet the future needs of the business in a constantly maturing and increasingly complex short-term insurance industry.
Outputs:
Internal Process:
- Collaborate with Project Managers and Business Leaders to deliver quality, effective management information, data warehouse and business intelligence applications, in line with the agreed development process and business needs.
- Collaborate with stakeholders to gather requirements, conduct analysis and prioritise requests.
- Conduct research and evaluate potential technical solutions to identified business problems.
- Translate business requirements into workable solutions and document solutions into technical specifications, partnering with Business and/or System Analysts when required.
- Design and code new database and analytics functionality using code that is readable, maintainable and reusable.
- Conduct Unit Testing of own code and resolve all issues/queries timeously.
- Contribute to user acceptance testing (UAT) to ensure that functionality is working correctly.
- Deliver solutions into the applicable production environment once testing has been completed.
- Provide stakeholders with regular feedback on the technical design and timelines for the solution, ensuring that business needs are met.
- Maintain existing databases and applications according to change requests approved by business as and when needed.
- Diagnose root causes of issues through problem-solving and recommend potential solutions.
- Monitor performance of solutions and make recommendations to improve the performance and functionality of the solutions, where appropriate.
- Log issues found in existing systems as internal change controls and ensure successful resolution of issues.
Responsibilities:
Develop, implement and document Business Intelligence Solutions:
- Contribute to the overall data warehouse architecture and database designs.
- Maintain and oversee the administration and maintenance of the data warehouse.
- Develop and maintain Business Intelligence and reporting technologies and processes.
- Translate stakeholder requirements into technical specifications for Business Intelligence (BI) reports and applications.
- Design and develop reports and dashboards based on Business Requirements Document (BRD) and customer specifications.
- Develop feasible technical specifications and process flows for data provision activities in support of the development of business intelligence solutions.
- Ensure the continued maintenance and enhancement to existing business intelligence solutions.
- Extract, transform and load (ETL) data within user specifications using the relevant tools.
- Verify and quality-assure the data provided.
- Provide support to business intelligence users on data-related issues.
Future development and planning:
- Conduct research and undergo training where appropriate, in order to remain abreast of data enablement trends and understand their application in the short-term insurance industry.
- Assist management and colleagues to make the right decisions in terms of planning future data enablement infrastructure, architecture and applications in the company short-term insurance business, in alignment with the company’s standards and the South African financial services regulatory framework.
Self-management and Teamwork:
- Provide authoritative expertise and advice to colleagues.
- Develop and maintain productive and collaborative working relationships with peers and team members.
- Deliver on Service Level Agreements made with colleagues.
- Continuously develop own expertise in terms of industry and subject matter development and application thereof in an area of specialisation.
- Participate and contribute to a culture of work-centric thinking, productivity, service delivery and quality management.
- Contribute to continuous innovation through the development, sharing and implementation of new ideas and involvement of peers.
- Take ownership for driving career development.
Finance:
- Manage financial and other company resources under your control with due respect.
Competencies:
- Business Acumen.
- Client / Stakeholder Commitment.
- Drive for Results.
- Leads Change and Innovation.
- Motivating and Inspiring Team.
- Collaboration.
- Impact and Influence.
- Self-Awareness and Insight.
- Diversity and Inclusiveness.
- Growing Talent.
Skills:
- Communication – articulating information and challenging ideas.
- Analysing and interpreting data.
- Problem-solving.
- Planning and organising – time and task management.
Experience and Qualifications:
- Relevant IT and data analytics qualifications e.g. B.Tech or B.Sc. (Informatics) – Essential.
- Dimensional modelling and/or relevant Microsoft certification – Advantageous.
- On-the-job training/qualifications:
- Microsoft SQL Server.
- Oracle.
- Power BI.
- Advanced MS Excel.
- Starquest.
Some experience in predictive analytic platforms will be advantageous. These include:
- Python.
- Scala.
- Spark.
- AWS SageMaker.
Some experience in Azure / AWS platform services will be advantageous. These include:
- Azure SQL elastic instance.
- Data Factory.
- Power BI.
- AWS RDS.
Experience:
- Methodologies.
- The candidate must have ability to elicit data requirements from stakeholders.
- The candidate must have clear documentation skills.
Principles:
- The candidate must be familiar with design patterns in the data development industry.
- The candidate must have a solid understanding of Metadata constructs.
- The candidate must have clear understanding of EDW.
- Knowledge of Domain driven design would be an advantage.
- The candidate must be familiar with the concept of Data Marts.
- The candidate must be familiar with abstraction techniques.
Modelling:
- The candidate must have proven data modelling techniques (3 years).
- The candidate must have knowledge and experience in Ralph Kimball data warehouse modelling (3 years).
- Knowledge of Inmon data warehouse modelling techniques would be an advantage.
- The candidate must have data normalization skills, especially 2nd Normal Form (2NF).
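The Ralph Kimball requirement above refers to dimensional (star-schema) modelling: a central fact table of numeric measurements keyed to descriptive dimension tables. As a minimal sketch of the idea, with invented, insurance-flavoured table and column names (not from this employer's warehouse):

```python
# Sketch of a Kimball-style star schema: facts hold foreign keys into
# dimensions plus numeric measures; queries slice facts by dimension
# attributes. All names are invented for illustration.

dim_product = {1: {"name": "Comprehensive Cover", "line": "Motor"}}
dim_date = {20240301: {"year": 2024, "month": 3}}

fact_premiums = [
    {"product_key": 1, "date_key": 20240301, "premium": 950.0},
    {"product_key": 1, "date_key": 20240301, "premium": 1200.0},
]

def monthly_premium_by_line(facts, products, dates, year, month):
    """Aggregate a measure by slicing on dimension attributes (a star join)."""
    total = {}
    for f in facts:
        d = dates[f["date_key"]]
        if (d["year"], d["month"]) != (year, month):
            continue
        line = products[f["product_key"]]["line"]
        total[line] = total.get(line, 0.0) + f["premium"]
    return total
```

The same query in the warehouse itself would be a fact-to-dimension join grouped by the dimension attribute, which is exactly what tools like SSAS and Power BI generate over such a schema.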
Data Transportation:
- The candidate must have solid experience of ETL systems.
- The candidate must have solid experience of sourcing, staging and loading.
- The candidate must be familiar with parallel loading principles.
- The candidate must be familiar with source to target mapping.
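Source-to-target mapping, the last familiarity listed, is often expressed as a declarative spec saying which source field feeds which warehouse column and how it is transformed, then applied row by row. A hedged sketch with invented field names:

```python
# Sketch of a source-to-target mapping: a declarative spec applied per row.
# Source fields, target columns, and transforms are invented for illustration.

MAPPING = {
    # target column : (source field, transform)
    "customer_name": ("CUST_NM", str.strip),
    "policy_start":  ("START_DT", lambda v: v.replace("/", "-")),
    "premium_zar":   ("PREM_AMT", float),
}

def apply_mapping(source_row, mapping=MAPPING):
    """Build one target row by applying each mapped transform."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in mapping.items()}

row = apply_mapping({"CUST_NM": " Thabo M ", "START_DT": "2024/03/01",
                     "PREM_AMT": "950.00"})
```

Keeping the mapping as data rather than code is the design point: the same spec can drive the ETL job, the documentation, and the lineage metadata the "Metadata constructs" requirement alludes to.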
Development Software:
- The candidate must have advanced knowledge of T-SQL (4 years), including the following concepts:
- Dynamic T-SQL.
- Multi-threading.
- Performance optimisation and tuning.
- The candidate must have practical experience of SQL Server Database Engine (4 years).
- The candidate must have practical experience of MS SSIS ETL software (4 years).
- The candidate must have practical experience of MS SSAS OLAP software (4 years).
- The candidate must have practical experience of MS Visual Studio Data Tools (4 years).
- Knowledge of Database Administration would be an advantage.
- Expert knowledge in configuration of database hardware resources.
Repository type:
- The candidate must be able to source data from different repositories.
- The candidate must be fully acquainted with Microsoft SQL Server repository.
- Knowledge of Data Lake would be an advantage.
- Knowledge of Oracle would be an advantage.
- Knowledge of Hadoop is not essential but will also be an advantage.