884 Data Engineers jobs in Gauteng

Data Engineers

Centurion, Gauteng Chisl Group

Posted 6 days ago


Job Description



Can you build a model that tells a Kudu from an Eland from 200 feet in the air?

Or detect if a truck driver’s eyes are off the road — in real time?

Do you thrive on messy data — parsing it, cleaning it, structuring it — and turning it into insights that actually matter?

Do you care about elegant solutions?

From auto-deploying infrastructure as code to orchestrating Fivetran pipelines, it’s about building systems that just work — reliably, at scale.

Are you excited by the latest tech — but only when it solves real business problems?

Then you’re probably fluent in things like:

• Python, SQL, and vector embeddings

• OCR tools like Tesseract or AWS Textract

• NLP frameworks like spaCy or Hugging Face

• Transformers, fine-tuning, and custom classification models

• Docker, FastAPI, Airflow, and maybe even a bit of MLOps

• And yes, wrestling ugly PDFs into structured, machine-readable datasets (a short, illustrative sketch follows this list)
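
To ground that last bullet, here is a minimal, hedged sketch of turning a scanned PDF into a structured dataset. It assumes the pdf2image and pytesseract packages (plus the Tesseract binary) are available; the file paths and the one-row-per-OCR-line structure are purely illustrative, not the team's actual pipeline.

```python
import csv

from pdf2image import convert_from_path  # renders PDF pages to PIL images
import pytesseract                       # Python wrapper around the Tesseract OCR engine

# Hypothetical input path: a messy, scanned PDF.
pages = convert_from_path("ugly_scan.pdf", dpi=300)

rows = []
for page_number, image in enumerate(pages, start=1):
    text = pytesseract.image_to_string(image)
    for line in text.splitlines():
        line = line.strip()
        if line:  # keep only non-empty OCR lines
            rows.append({"page": page_number, "text": line})

# Persist the result as a structured, machine-readable dataset.
with open("extracted_lines.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["page", "text"])
    writer.writeheader()
    writer.writerows(rows)
```

In practice the interesting work starts after this step: parsing those raw lines into fields, validating them, and loading them somewhere queryable.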

We’re building applied AI systems that don’t just live in notebooks — they power decisions, detect risk, automate tasks, and unlock entirely new capabilities.

If that sounds like your vibe, drop me a message.

We should talk.

Seniority level
  • Entry level

Employment type
  • Full-time

Job function
  • Information Technology


Data Engineers (Denodo)

Johannesburg, Gauteng InfyStrat Software Services

Posted 6 days ago


Job Description

Overview

InfyStrat is looking for skilled and driven Data Engineers with expertise in Denodo to join our data team. As a Data Engineer, you will design, build, and maintain data integration solutions on Denodo's data virtualization platform. Your role will transform complex data into actionable insights, empowering stakeholders to make data-driven decisions. We value creativity, collaboration, and continuous learning, and you will be part of a vibrant team that thrives on tackling challenges and driving the future of our data capabilities. If you are passionate about data engineering and well-versed in Denodo, we invite you to apply and help shape the data landscape of InfyStrat.
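
Denodo publishes its integrated virtual views over standard interfaces such as JDBC and ODBC, so downstream consumers query them much like any SQL database. As a rough, hedged illustration only, the sketch below reads from a virtual view via pyodbc; the DSN, credentials, and view name (iv_customer_orders) are hypothetical, not part of this role's actual environment.

```python
import pyodbc  # assumes an ODBC driver/DSN has been configured for the Denodo server

# Hypothetical DSN and credentials.
conn = pyodbc.connect("DSN=denodo_vdp;UID=report_user;PWD=change_me")

cursor = conn.cursor()
cursor.execute(
    """
    SELECT customer_id, region, SUM(order_total) AS total_spend
    FROM iv_customer_orders          -- an integrated view spanning several sources
    GROUP BY customer_id, region
    """
)

for row in cursor.fetchall():
    print(row.customer_id, row.region, row.total_spend)

conn.close()
```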



Responsibilities

  • Design and implement data integration solutions using Denodo to ensure seamless access to diverse data sources

  • Develop and maintain data models and metadata repositories

  • Optimize data virtualization processes for performance and scalability

  • Collaborate with data analysts, business stakeholders, and IT teams to gather requirements and deliver solutions

  • Monitor and troubleshoot data pipelines to ensure data quality and integrity

  • Stay updated with the latest trends and technologies in data engineering and virtualization



Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field

  • 3+ years of experience in data engineering or a similar role with a strong focus on Denodo

  • Proficiency in SQL and data modeling techniques

  • Familiarity with ETL processes and data warehousing concepts

  • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus

  • Strong problem-solving skills and the ability to work independently

  • Excellent communication skills and the ability to work collaboratively in a team



Seniority level
  • Mid-Senior level


Employment type
  • Full-time


Job function
  • Information Technology


Industries
  • IT Services and IT Consulting


Location: Johannesburg, Gauteng, South Africa


Intermediate Data Engineers

Johannesburg, Gauteng Communicate Recruitment

Posted 12 days ago


Job Description


With September upon us, now's the moment to move beyond practice throws and play in the big leagues. Life is too short to sit on the sidelines; join a team where every pass counts and your contribution makes the difference.

At Communicate Recruitment, we're the coaches connecting Data Engineers with the right opportunities. Whether you're throwing efficient data pipelines, catching and cleaning messy datasets, or passing insights for the next play, we'll put you in the perfect position to advance.

Skills & Experience:
Minimum 3-5 years' experience in a related field.

Qualification:
A relevant degree or qualification gets you on the roster.
Solid intermediate experience ensures you can handle complex throws, maintain coordination, and contribute to the team's wins.

Don't let this disc fly past; apply now and make September 2025 the month you catch your next Data Engineering career milestone! 🏆 #ITCareers #FrisbeeMindset #TeamworkWins


Contact DYLAN MAWONA on

Data Architects & Data Engineers (AWS)

Johannesburg, Gauteng Hire Resolve

Posted today


Job Description

A leading technology solutions provider is urgently seeking experienced Data Architects and Data Engineers based in Johannesburg, Gauteng to support upcoming projects focused on AWS-based data infrastructure.

Responsibilities

Data Architects
  • Design scalable, secure, and high-performance cloud-based data architectures on AWS
  • Define data governance, security policies, and best practices for cloud-based systems
  • Collaborate with stakeholders to understand business needs and translate them into data solutions
  • Evaluate and recommend tools, frameworks, and technologies for cloud data environments

Data Engineers
  • Build and maintain robust data pipelines and ETL/ELT workflows in AWS
  • Integrate diverse data sources and ensure data quality and integrity
  • Optimize performance of data systems for analytics and reporting
  • Work closely with Data Architects, Analysts, and Developers to support data-driven initiatives

Requirements
  • 3 years of experience in Data Engineering or Data Architecture roles
  • Strong, hands-on expertise with AWS (e.g., S3, Redshift, Glue, Lambda, EMR, Athena)
  • Solid understanding of cloud-based data warehousing, data lakes, and data modeling
  • Proficiency in SQL and scripting languages such as Python
  • Experience with CI/CD and data pipeline automation tools is an advantage
  • Strong problem-solving and communication skills
  • Ability to work independently in a remote-first environment

Contact Hire Resolve for your next career-changing move. Our client is offering a highly competitive salary for this role based on experience. Apply for this role today: contact Sonique Beetge at or on LinkedIn. You can also visit the Hire Resolve website: hireresolve.us or email us your CV:
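
As a rough illustration of the pipeline-automation side of these roles, the sketch below starts an AWS Glue job from Python with boto3 and polls its state. The job name, arguments, and region are hypothetical, and a real deployment would more likely hand this off to Step Functions or an orchestrator rather than a polling loop.

```python
import time

import boto3

glue = boto3.client("glue", region_name="af-south-1")  # region chosen for illustration

# Kick off a hypothetical Glue ETL job that loads raw S3 data into Redshift.
run = glue.start_job_run(
    JobName="load_orders_to_redshift",
    Arguments={"--ingest_date": "2025-09-01"},
)

# Poll until the run reaches a terminal state.
state = "RUNNING"
while state not in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
    time.sleep(30)
    state = glue.get_job_run(
        JobName="load_orders_to_redshift",
        RunId=run["JobRunId"],
    )["JobRun"]["JobRunState"]

if state != "SUCCEEDED":
    raise RuntimeError(f"Glue job ended in state {state}")
```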


Data Architects & Data Engineers (AWS) – Remote

Gauteng, Gauteng Hire Resolve

Posted today


Job Description

A leading technology solutions provider is urgently seeking experienced Data Architects and Data Engineers based in Johannesburg, Gauteng to support upcoming projects focused on AWS-based data infrastructure.

Responsibilities

Data Architects
  • Design scalable, secure, and high-performance cloud-based data architectures on AWS
  • Define data governance, security policies, and best practices for cloud-based systems
  • Collaborate with stakeholders to understand business needs and translate them into data solutions
  • Evaluate and recommend tools, frameworks, and technologies for cloud data environments

Data Engineers
  • Build and maintain robust data pipelines and ETL/ELT workflows in AWS
  • Integrate diverse data sources and ensure data quality and integrity
  • Optimize performance of data systems for analytics and reporting
  • Work closely with Data Architects, Analysts, and Developers to support data-driven initiatives

Requirements
  • 3 years of experience in Data Engineering or Data Architecture roles
  • Strong, hands-on expertise with AWS (e.g., S3, Redshift, Glue, Lambda, EMR, Athena)
  • Solid understanding of cloud-based data warehousing, data lakes, and data modeling
  • Proficiency in SQL and scripting languages such as Python
  • Experience with CI/CD and data pipeline automation tools is an advantage
  • Strong problem-solving and communication skills
  • Ability to work independently in a remote-first environment

Contact Hire Resolve for your next career-changing move. Our client is offering a highly competitive salary for this role based on experience. Apply for this role today: contact Gaby Turner at or on LinkedIn. You can also visit the Hire Resolve website: hireresolve.us or email us your CV:


Big Data Data Engineer

Johannesburg, Gauteng PBT Group

Posted 5 days ago


Job Description

Overview

Big Data Data Engineer job vacancy in Johannesburg.

We are seeking a skilled Data Engineer to design and develop scalable data pipelines that ingest raw, unstructured JSON data from source systems and transform it into clean, structured datasets within our Hadoop-based data platform.

The ideal candidate will play a critical role in enabling data availability, quality, and usability by engineering the movement of data from the Raw Layer to the Published and Functional Layers.

Key Responsibilities:

  • Design, build, and maintain robust data pipelines to ingest raw JSON data from source systems into the Hadoop Distributed File System (HDFS).
  • Transform and enrich unstructured data into structured formats (e.g., Parquet, ORC) for the Published Layer using tools like PySpark, Hive, or Spark SQL (a sketch follows this list).
  • Develop workflows to further process and organize data into Functional Layers optimized for business reporting and analytics.
  • Implement data validation, cleansing, schema enforcement, and deduplication as part of the transformation process.
  • Collaborate with Data Analysts, BI Developers, and Business Users to understand data requirements and ensure datasets are production-ready.
  • Optimize ETL/ELT processes for performance and reliability in a large-scale distributed environment.
  • Maintain metadata, lineage, and documentation for transparency and governance.
  • Monitor pipeline performance and implement error handling and alerting mechanisms.
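
A minimal sketch of the Raw-to-Published step described above, assuming PySpark is available on the cluster; the HDFS paths, column names, and partition key are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw_to_published").getOrCreate()

# Ingest raw, semi-structured JSON landed in the Raw Layer (hypothetical path).
raw = spark.read.json("hdfs:///data/raw/events/2025/09/*.json")

# Cleanse and structure: cast types, enforce a key, deduplicate.
published = (
    raw.select(
        "event_id",
        "customer_id",
        F.col("amount").cast("double").alias("amount"),
        F.to_date("event_ts").alias("event_date"),
    )
    .filter(F.col("event_id").isNotNull())
    .dropDuplicates(["event_id"])
)

# Write columnar Parquet to the Published Layer, partitioned for query pruning.
(
    published.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///data/published/events/")
)
```
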
Technical Skills & Experience
  • 3+ years of experience in data engineering or ETL development within a big data environment.
  • Strong experience with Hadoop ecosystem tools: HDFS, Hive, Spark, YARN, and Sqoop.
  • Proficiency in PySpark, Spark SQL, and HQL (Hive Query Language).
  • Experience working with unstructured JSON data and transforming it into structured formats.
  • Solid understanding of data lake architectures: Raw, Published, and Functional layers.
  • Familiarity with workflow orchestration tools like Airflow, Oozie, or NiFi.
  • Experience with schema design, data modeling, and partitioning strategies.
  • Comfortable with version control tools (e.g., Git) and CI/CD processes.
Nice to Have
  • Experience with data cataloging and governance tools (e.g., Apache Atlas, Alation).
  • Exposure to cloud-based Hadoop platforms like AWS EMR, Azure HDInsight, or GCP Dataproc.
  • Experience with containerization (e.g., Docker) and/or Kubernetes for pipeline deployment.
  • Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).
Qualifications
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
  • Relevant certifications (e.g., Cloudera, Databricks, AWS Big Data) are a plus.


Big Data Data Engineer

Johannesburg, Gauteng PBT Group

Posted 6 days ago


Job Description

We are seeking a skilled Data Engineer to design and develop scalable data pipelines that ingest raw, unstructured JSON data from source systems and transform it into clean, structured datasets within our Hadoop-based data platform. The ideal candidate will play a critical role in enabling data availability, quality, and usability by engineering the movement of data from the Raw Layer to the Published and Functional Layers.

Key Responsibilities:

  • Design, build, and maintain robust data pipelines to ingest raw JSON data from source systems into the Hadoop Distributed File System (HDFS).
  • Transform and enrich unstructured data into structured formats (e.g., Parquet, ORC) for the Published Layer using tools like PySpark, Hive, or Spark SQL.
  • Develop workflows to further process and organize data into Functional Layers optimized for business reporting and analytics.
  • Implement data validation, cleansing, schema enforcement, and deduplication as part of the transformation process.
  • Collaborate with Data Analysts, BI Developers, and Business Users to understand data requirements and ensure datasets are production-ready.
  • Optimize ETL/ELT processes for performance and reliability in a large-scale distributed environment.
  • Maintain metadata, lineage, and documentation for transparency and governance.
  • Monitor pipeline performance and implement error handling and alerting mechanisms.

Technical Skills & Experience:

  • 3+ years of experience in data engineering or ETL development within a big data environment.
  • Strong experience with Hadoop ecosystem tools: HDFS, Hive, Spark, YARN, and Sqoop.
  • Proficiency in PySpark, Spark SQL, and HQL (Hive Query Language).
  • Experience working with unstructured JSON data and transforming it into structured formats.
  • Solid understanding of data lake architectures: Raw, Published, and Functional layers.
  • Familiarity with workflow orchestration tools like Airflow, Oozie, or NiFi.
  • Experience with schema design, data modeling, and partitioning strategies.
  • Comfortable with version control tools (e.g., Git) and CI/CD processes.

Nice to Have:

  • Experience with data cataloging and governance tools (e.g., Apache Atlas, Alation).
  • Exposure to cloud-based Hadoop platforms like AWS EMR, Azure HDInsight, or GCP Dataproc.
  • Experience with containerization (e.g., Docker) and/or Kubernetes for pipeline deployment.
  • Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).

Qualifications:

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
  • Relevant certifications (e.g., Cloudera, Databricks, AWS Big Data) are a plus.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.


 
