7,788 Data Professionals jobs in South Africa
Data Engineer/ Data Analytics Engineer
Posted today
Job Description
We believe data isn’t just numbers – it’s the story of your business waiting to be told. We build clever, lean, data-driven systems for humans, not just machines.
We’re looking for a Data Analytics Engineer who understands that data is only useful when it’s accessible and understandable.
Requirements:
- Experience in the telecoms industry
- Infrastructure-as-code experience (Terraform and Pulumi)
- Experience working in a scale-up/dynamic consulting environment
- Exposure to accounting concepts
- 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, notebooks, and SQL. Proven experience working on data integration projects (ETL, data pipelines, APIs)
- Proven ability to build and support data integration pipelines (ETL, dataflows, APIs) at scale
- Proven ability to build and deploy data pipelines to cloud production environments using automated CI/CD pipelines
- Strong experience with modern analytics platforms (Databricks, Delta Lake, cloud storage); experience with relational databases and data modelling is a plus
- Strong experience in working with cloud-based environment (Azure, AWS)
- Track record of translating business logic and requirements into production-grade, testable code. Experience working with data analytics tools and platforms (e.g., Power BI, SuperSet, Tableau, Looker, Grafana)
- Solid grasp of data quality, data validation, and monitoring concepts
- Strong communication skills: able to present technical logic and results to both technical and non-technical audiences
- Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams
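As a rough illustration of the validation and pipeline work the requirements above describe, here is a minimal, hypothetical extract-validate-load step in plain Python. The table, field names, and rules are invented for the example (a telecoms-flavoured usage record), and SQLite stands in for the real warehouse:

```python
import sqlite3

def validate(record):
    """Return True if a usage record passes basic data-quality checks.

    Hypothetical rules: a subscriber number must be present and the
    usage figure must be a non-negative integer.
    """
    return (
        record.get("msisdn") is not None
        and isinstance(record.get("bytes_used"), int)
        and record["bytes_used"] >= 0
    )

def load(records, conn):
    """Validate records and load only the clean ones into staging.

    Returns (loaded_count, rejected_count) so a monitoring step can
    alert when the rejection rate spikes.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS usage_staging (msisdn TEXT, bytes_used INTEGER)"
    )
    clean = [r for r in records if validate(r)]
    conn.executemany(
        "INSERT INTO usage_staging (msisdn, bytes_used) VALUES (:msisdn, :bytes_used)",
        clean,
    )
    conn.commit()
    return len(clean), len(records) - len(clean)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    sample = [
        {"msisdn": "27820000001", "bytes_used": 1024},
        {"msisdn": None, "bytes_used": 512},          # fails validation
        {"msisdn": "27820000002", "bytes_used": -1},  # fails validation
    ]
    loaded, rejected = load(sample, conn)
    print(loaded, rejected)  # 1 2
```

In a production setting the same validate/load split would typically run as a PySpark job with the rejection counts wired into monitoring, per the quality and monitoring bullet above.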
The reference number for this position is GZ60747. It is a contract REMOTE position offering a cost-to-company salary of R1.4m per annum, negotiable on experience and ability. Contact Garth to discuss this and other opportunities.
Are you ready for a change of scenery? E-Merge IT Recruitment is a specialist niche recruitment agency. We offer our candidates options so that we can place the right developers with the right companies in the right roles. Check out the E-Merge website for more great positions.
Do you have a friend who is a developer or technology specialist? We pay cash for successful referrals!
Data Engineer
Posted today
Job Description
Overview
Transporeon is a SaaS company founded in 2000 in Ulm, Germany. The company provides logistics solutions across several areas, including buying and selling logistics services, organizing shipment execution, organizing dock, yard, truck, and driver schedules, and invoice auditing for logistics services. It has grown significantly over the years, reaching €150m in revenue before being acquired by Trimble for US$2 billion in 2022. Transporeon has one of the largest networks of shippers and carriers in Europe, with approximately 1,400 employees.
What you will do
- Design and implement scalable data solutions on AWS, including data lakes, warehouses, and streaming systems. (We might migrate our data lake to Azure, so Azure knowledge is beneficial.)
- Develop, optimize, and maintain data pipelines using AWS services.
- Implement robust ETL/ELT processes and event-driven data ingestion.
- Establish and enforce data governance policies, ensuring data quality, security, and compliance.
- Optimize cloud resources for performance, availability, and cost-efficiency.
- Partner with cross-functional teams to gather requirements and deliver comprehensive cloud-based solutions.
- Identify opportunities to enhance systems, processes, and technologies while troubleshooting complex technical challenges.
Tech stack:
- AWS: Glue, Lambda, Step Functions, Batch, ECS, QuickSight, Machine Learning, SageMaker, etc.
- DevOps: CloudFormation, Terraform, Git, CodeBuild
- Database: Redshift, PostgreSQL, DynamoDB, Athena
- Language: Bash, Python, SQL
- AI: Cursor
What we expect:
- Experience with AWS platforms, including data services; basic knowledge of Azure is preferred.
- Experience in data engineering, strong competence in ETL processes, data warehousing, and big data technologies.
- Advanced skills in scripting, Python, SQL, and infrastructure automation tools.
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Experience with data visualization tools (e.g., QuickSight) is a plus.
- Openness to work daily with AI tools like Cursor.
- Ability to work independently and responsibly; if you do not live near our offices, the self-discipline and interpersonal skills to work remotely.
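The event-driven ingestion mentioned in the responsibilities can be sketched as a Lambda-style handler that extracts (bucket, key) pairs from an S3 "ObjectCreated" notification so a downstream job (for example a Glue trigger) can pick up the new files. The event shape follows the documented S3 notification format; the downstream wiring is omitted and the bucket/key names are invented:

```python
import json

def handler(event, context=None):
    """Collect (bucket, key) pairs from an S3 notification event.

    Records missing a bucket name or object key are skipped rather
    than raising, so one malformed record does not fail the batch.
    """
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append((bucket, key))
    return {"statusCode": 200, "body": json.dumps({"ingested": objects})}

if __name__ == "__main__":
    # Invented sample event in the S3 notification shape.
    sample_event = {
        "Records": [
            {"s3": {"bucket": {"name": "raw-data"},
                    "object": {"key": "2024/01/usage.parquet"}}}
        ]
    }
    print(handler(sample_event))
```

In practice such a handler would enqueue the keys or start a Glue job rather than just return them; the sketch only shows the event-parsing step.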
Remote role in the following countries: Estonia, Latvia, Lithuania, Poland, Slovakia, Hungary, Romania, Portugal, Spain, Italy, Croatia. We are not offering freelancing contracts, only local employment contracts.
Our Inclusiveness Commitment
We believe in celebrating our differences. Diversity, Equity, and Inclusion guide our current success and our ongoing efforts to improve. We actively seek to add members to our community who represent our customers and the places we live and work. We have programs to ensure our people are seen, heard, and welcomed, and that they belong, no matter who they are or where they come from.