871 AWS jobs in South Africa
AWS Engineer
Posted 16 days ago
Job Description
Our Client in the IT industry is looking for an AWS Data Engineer. If you meet the below requirements, kindly send us your CV.
Duties & Responsibilities
ESSENTIAL SKILLS REQUIREMENTS:
Above-average experience/understanding of (in order of importance):
- Terraform
- Python 3.x
- SQL (Oracle/PostgreSQL)
- PySpark
- Boto3 (see the short sketch after this list)
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- BMW Cloud Data Hub (CDH)
- BMW CDEC Blueprint
Experience working with enterprise collaboration tools such as Confluence, JIRA, etc.
Experience developing technical documentation and artefacts.
Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
Experience working with Data Quality Tools such as Great Expectations.
Knowledge of the Agile Working Model.
Any additional responsibilities assigned in the Agile Working Model (AWM) Charter.
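To gauge the Boto3 line above, here is a minimal sketch of the kind of scripting such a role typically involves; the bucket and prefix names are hypothetical placeholders, not details from this advert.

    import boto3

    # List Parquet objects under a hypothetical S3 prefix using Boto3.
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    # "my-data-lake" and "raw/vehicles/" are placeholder names.
    for page in paginator.paginate(Bucket="my-data-lake", Prefix="raw/vehicles/"):
        for obj in page.get("Contents", []):
            if obj["Key"].endswith(".parquet"):
                print(obj["Key"], obj["Size"])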
ADVANTAGEOUS SKILLS REQUIREMENTS:
- Demonstrated expertise in data modelling with Oracle SQL.
- Exceptional analytical skills for analysing large, complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with the ability to work independently and multi-task.
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
- Familiar with data stores such as AWS S3, AWS RDS, or DynamoDB.
- Experience and solid understanding of various software design patterns.
- Experience preparing specifications from which programs will be written, designed, coded, tested, and debugged.
- Strong organizational skills.
- Experience developing and working with REST APIs is a bonus.
- Basic experience in Networking and troubleshooting network issues.
Basic experience/understanding of AWS Components (in order of importance):
- Glue (a minimal job skeleton follows this list)
- CloudWatch
- SNS
- Athena
- S3
- Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild / CodePipeline
- CloudFormation
- Business Intelligence (BI) Experience
- Technical data modelling and schema design (“not drag and drop”)
- Kafka
- EMR
- Redshift
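As a flavour of the Glue item above, here is a minimal AWS Glue job skeleton in PySpark; the catalog database, table, and S3 path are hypothetical placeholders, not details from this advert.

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Standard Glue boilerplate: the Glue runtime passes JOB_NAME as an argument.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Hypothetical names: read from the Glue Data Catalog, write Parquet to S3.
    dyf = glue_context.create_dynamic_frame.from_catalog(
        database="raw_db", table_name="vehicles"
    )
    glue_context.write_dynamic_frame.from_options(
        frame=dyf,
        connection_type="s3",
        connection_options={"path": "s3://curated-bucket/vehicles/"},
        format="parquet",
    )
    job.commit()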
Market Related - Monthly
AWS Cloud Engineer
Posted 2 days ago
Job Description
Position: Mid-Level AWS Cloud Engineer
Company: Top Tech & Fintech Player
Package: R720K+ | Location: Hybrid (JHB North)
Ready to advance your cloud career? A leading JSE-listed Fintech company focused on tech-driven investment platforms is seeking a Mid-Level AWS Cloud Engineer to join their innovative team. This company is transforming the financial landscape by making investing easy and accessible.
You will be part of a high-performing, entrepreneurial team with a forward-thinking mindset and a shared mission to empower users through cutting-edge technology. If you are passionate about AWS, Terraform, and building scalable cloud environments, this opportunity is for you.
Requirements:
- 5+ years of experience in the IT industry, with at least 1 year of hands-on AWS exposure
- Solid Linux system administration skills
- Strong understanding of Terraform and infrastructure-as-code principles
- Knowledge of AWS networking components (VPC, EC2, etc.)
- Experience with monitoring tools like CloudWatch or Elastic
- Familiarity with cloud security best practices
- Programming experience (at least 2 years in any language)
- AWS certifications (Solutions Architect, Cloud Practitioner) are a plus
Qualifications:
- An IT-related degree or a National Diploma in IT or a related field
- Cloud certifications or similar qualifications
This hybrid role is based in Johannesburg North. If you're eager to build impactful cloud solutions within a team that values innovation, impact, and inclusion, let's connect.
Application: Please apply directly or through our website.
For more roles, visit our website or follow us on LinkedIn and Instagram (@60dsixtydegrees).
Key Skills: AWS, Terraform, Linux, Cloud Security, Networking, Monitoring
Employment Type: Full-Time
Experience: 5+ years
Vacancy: 1
Salary: R60,000 - R80,000 per month
AWS Data Engineer
Posted 2 days ago
Job Description
Overview
We are looking for an experienced AWS Data Engineer to join our team.
Responsibilities
- Data management and API integration.
- Experience handling data management requirements, integration platforms, and APIs.
- Design and implement data pipelines.
- Develop and maintain data architectures and ETL processes.
- Ensure data accuracy, consistency, and quality across systems.
- Collaborate with data scientists, analysts, and business stakeholders.
- Migrate data systems to cloud platforms (e.g. AWS, Snowflake); a minimal Snowflake connection sketch follows this list.
- Support financial reconciliations and regulatory reporting.
- Automate repetitive data tasks and optimize performance.
- Monitor and troubleshoot data workflows and infrastructure.
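For the cloud-migration point above, a minimal sketch of connecting to Snowflake from Python; the account, credentials, warehouse, and table names are placeholders, not details from this advert.

    import snowflake.connector  # pip install snowflake-connector-python

    # Connection parameters below are hypothetical.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",          # in practice, pull this from a secrets manager
        warehouse="ETL_WH",
        database="FINANCE",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Example reconciliation-style check: row counts per day for a hypothetical table.
        cur.execute(
            "SELECT txn_date, COUNT(*) FROM transactions GROUP BY txn_date ORDER BY txn_date"
        )
        for txn_date, n in cur.fetchall():
            print(txn_date, n)
    finally:
        conn.close()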
Requirements
- Matric and a Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in data engineering or similar roles.
- Experience in financial services or insurance is advantageous.
- Senior-level Snowflake and Matillion experience is a must.
- SQL (advanced), Python, Java.
- AWS and Snowflake (Azure or GCP is a plus).
- SQL (MySQL, PostgreSQL) and NoSQL databases.
- Experience with dimensional modeling and tools like dbt.
Key Skills: AWS, Snowflake, SQL, Apache Hive, S3, Hadoop, Redshift, Spark, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type: Full-Time
Experience: 3-5 years
Vacancy: 1
AWS Cloud Engineer
Posted 5 days ago
Job Description
Overview
NEW WORK IN: Mid-Level AWS Cloud Engineer – Top Tech & Fintech Player – R720K+ | Hybrid (JHB North)
Ready to level up your cloud career? A JSE-listed Fintech is on the hunt for a Mid-Level AWS Cloud Engineer to join their tech-savvy tribe. With a core focus on tech-driven investment platforms, this business is reshaping the financial landscape by making investing easy and accessible!
You'll be part of a high-performing team with an entrepreneurial spirit, a forward-thinking mindset, and a shared mission to empower users through cutting-edge tech. If you're passionate about AWS, Terraform, and building scalable cloud environments, this one's for you.
Responsibilities
- Join a high-performing, entrepreneurial team to design and build scalable cloud environments on AWS, with a focus on tech-driven investment platforms.
- Leverage Terraform and infrastructure-as-code to provision and manage cloud resources.
- Collaborate to empower users through cutting-edge technology and secure, reliable cloud solutions.
Requirements:
- 5+ years' experience in the IT industry, with 1+ years' hands-on AWS exposure
- Solid Linux system administration skills
- Strong grasp of Terraform and infrastructure-as-code principles
- Understanding of AWS networking components (VPC, EC2, etc.)
- Exposure to monitoring tools like CloudWatch or Elastic
- Familiarity with cloud security best practices
- Programming experience (2+ years in any language) will serve you well
- AWS certifications (Solutions Architect, Cloud Practitioner) for the win!
Qualifications:
- An IT-related degree or a National Diploma in IT or a related field
- Cloud certifications or similar
This is a hybrid role based in Johannesburg North. If you're ready to build cloud solutions that matter, while working with a team that values innovation, impact, and inclusion, then let's chat.
AWS DevOps Engineer
Posted 10 days ago
Job Description
If you’re the kind of engineer who enjoys solving real infrastructure problems – quietly making the complex run smoothly in the background – this might be a great fit for you.
This is a Cape Town-based company doing big work in the payment infrastructure space. Their platform supports some of the world’s biggest brands as they expand into South Africa and beyond. Think high transaction volumes, security at scale, and uptime that can’t flinch. It’s smart, high-stakes engineering—but with a strong focus on doing things properly.
About the role:
They’re looking for an AWS DevOps Engineer who knows how to build for reliability, visibility, and security. You’ll be the one designing scalable AWS infrastructure, building slick CI/CD pipelines, and keeping the monitoring sharp enough to spot trouble before it starts. You’ll also help developers ship faster, without cutting corners.
You’ll report into the VP of Engineering and join a small, highly capable team that likes asking “Why?” and enjoys solving the harder stuff.
Hybrid | Gardens, Cape Town | Permanent.
Salary: Up to R75 000 per month.
Key things you’ll work on:
- Building out infrastructure using Terraform (lots of it)
- Creating and maintaining CI/CD pipelines (GitHub Actions ideally)
- Designing secure cloud environments (AWS preferred)
- Keeping systems observable and monitored (Grafana, Prometheus, etc.)
- Managing incidents and working on continuous improvements
- Supporting dev teams to move faster, safer
- Writing clean documentation and owning your space
Who they’re looking for:
Someone who’s curious, capable, and comfortable in their craft. You’re not afraid of complexity—you enjoy making it cleaner. You’re probably the kind of person who reads logs like bedtime stories and gets a quiet thrill from clean deployments.
You should have:
- 3+ years as a DevOps Engineer or similar
- Strong Terraform and Infrastructure-as-Code experience
- Solid AWS experience and cloud-first mindset
- Docker and Kubernetes (Helm is a plus)
- Secure networking and systems thinking
- Bash or Python scripting skills
- Experience with observability and alerting tools (see the sketch after this list)
- Incident response experience and good communication chops
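Purely as a flavour of that observability point, a minimal Prometheus instrumentation sketch in Python; the metric names, port, and payment handler are hypothetical. A Prometheus server would scrape the /metrics endpoint this exposes, and Grafana would chart the results.

    import random
    import time

    # pip install prometheus-client
    from prometheus_client import Counter, Histogram, start_http_server

    # Hypothetical metric names for a payments-style service.
    PAYMENTS = Counter("payments_processed_total", "Payments processed", ["status"])
    LATENCY = Histogram("payment_latency_seconds", "End-to-end payment latency")

    def handle_payment() -> None:
        with LATENCY.time():                       # record how long the handler takes
            time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
        PAYMENTS.labels(status="ok").inc()

    if __name__ == "__main__":
        start_http_server(8000)  # exposes /metrics on port 8000 for Prometheus to scrape
        while True:
            handle_payment()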
Bonus points if you’ve tackled:
- Zero-downtime deployments
- Scaling infra with growing traffic volumes
- Developer tooling to boost speed without risking quality
- Balancing security, speed, and compliance in fintech/payments
What they offer:
- Hybrid work setup (Cape Town-based)
- High autonomy and trust
- Sharp teammates, clean processes, and strong engineering standards
- An environment where ambition is backed with action
AWS Data Engineer
Posted 12 days ago
Job Description
Overview
Be part of our team of Data Specialists and embark on a career of the future!
Ready to take your data engineering career to new heights? PBT Group is looking for a Senior AWS Data Engineer to design, build, and lead cutting-edge data solutions in a dynamic, agile environment.
What You’ll Do
- Architect modern data analytics frameworks.
- Translate complex requirements into scalable, secure, high-performance pipelines.
- Build & optimize batch/real-time data solutions using AWS & Big Data tools.
- Lead engineering efforts across multiple agile projects.
What You Bring:
- 5+ yrs in Data/Software Engineering with team leadership experience (3–5 yrs).
- 2+ yrs in Big Data & AWS (EMR, EC2, S3).
- ETL expert – especially Talend, cloud migration, & data pipeline support.
- Strong in Python, PySpark, SQL, and data modeling (a short PySpark sketch follows this list).
- Agile mindset (Scrum, Kanban).
- Familiar with Hadoop ecosystem, production support, and DevOps for BI.
- Experience with Spark, streaming data tools, & scalable system design.
- Talend & AWS hands-on (1+ yr).
- Bachelor’s in Computer Science/Engineering or equivalent experience.
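To give a flavour of the PySpark requirement, a minimal batch ETL sketch; the S3 paths, column names, and dataset are hypothetical, not from this advert.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

    # Hypothetical S3 locations and schema; the advert names no specific datasets.
    raw = spark.read.option("header", True).csv("s3://raw-bucket/events/")

    cleaned = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
           .filter(F.col("event_ts").isNotNull())
    )

    # Partitioned Parquet output is a common target for Athena/EMR-style querying.
    cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://curated-bucket/events/"
    )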
Be part of a team that thrives on innovation, collaboration, and cloud-first data transformation.
Apply now and help shape the future of data!
In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
Skills: Financial Services, Information Technology (IT), Insurance
AWS Data Engineer
Posted 12 days ago
Job Description
What You’ll Do
- Architect modern data analytics frameworks.
- Translate complex requirements into scalable, secure, high-performance pipelines.
- Build & optimize batch / real-time data solutions using AWS & Big Data tools.
- Lead engineering efforts across multiple agile projects.
What You Bring:
- 5+ yrs in Data / Software Engineering with team leadership experience (3–5 yrs).
- 2+ yrs in Big Data & AWS (EMR, EC2, S3).
- ETL expert – especially Talend, cloud migration, & data pipeline support.
- Strong in Python, PySpark, SQL, and data modeling.
- Agile mindset (Scrum, Kanban).
- Familiar with Hadoop ecosystem, production support, and DevOps for BI.
- Experience with Spark, streaming data tools, & scalable system design.
- BI data modeling background (3+ yrs).
- Bachelor’s in Computer Science / Engineering or equivalent experience.
- AWS Certified (Associate+ level preferred).
Be part of a team that thrives on innovation, collaboration, and cloud-first data transformation.
Apply now and help shape the future of data!
In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
If you have not heard from us in two weeks, please note that you were unsuccessful for the role. However, we will keep your resume on file and reach out if any other suitable opportunity arises in the future.
AWS Data Engineer
Posted 15 days ago
Job Description
Overview
Responsible for creating and managing the technological part of the data infrastructure at every step of the data flow. From configuring data sources to integrating analytical tools, all of these systems are architected, built, and managed by a general-role data engineer.
Data Architecture and Management
- Design and maintain scalable data architectures using AWS services, for example (but not limited to) AWS S3, AWS Glue and AWS Athena.
- Implement data partitioning and cataloging strategies to enhance data organization and accessibility.
- Work with schema evolution and versioning to ensure data consistency.
- Develop and manage metadata repositories and data dictionaries.
- Assist with defining, setting up, and maintaining data access roles and privileges.
- Design, develop and optimize scalable ETL pipelines using batch and real-time processing frameworks (using AWS Glue and PySpark).
- Implement data extraction, transformation and loading processes from various structured and unstructured sources.
- Optimize ETL jobs for performance, cost efficiency and scalability.
- Develop and integrate APIs to ingest and export data between various source and target systems, ensuring seamless ETL workflows.
- Enable scalable deployment of ML models by integrating data pipelines with ML workflows.
- Automate data workflows and ensure they are fault tolerant and optimized.
- Implement logging, monitoring and alerting for data pipelines (a minimal alarm sketch follows this list).
- Optimize ETL job performance by tuning configurations and analyzing resource usage.
- Optimize data storage solutions for performance, cost and scalability.
- Ensure AWS resources are optimised for scalable data ingestion and outputs.
- Deploy machine learning models into production using cloud-based services like AWS SageMaker.
- Ensure API security, authentication and access control best practices.
- Implement data encryption, access control and compliance with GDPR, HIPAA, SOC 2, etc.
- Establish data governance policies, including access control and security best practices.
- Work closely with data scientists, analysts and business teams to understand data needs.
- Collaborate with backend teams to integrate data pipelines into CI/CD.
- Provide developmental leadership to the team through coaching, code reviews and mentorship.
- Ensure technological alignment with the B2C division strategy, supporting the overarching hearX strategy and vision.
- Identify and encourage areas for growth and improvement within the team.
- Document data processes, transformations and architectural decisions.
- Maintain high standards of software quality within the team by adhering to good processes, practices and habits, including compliance with the QMS, and data and system security requirements.
- Ensure compliance with the established processes and standards for the development lifecycle, including but not limited to data archival.
- Drive compliance with the hearX Quality Management System in line with the Quality Objectives, Quality Manual, and all processes related to the design, development and implementation of software related to medical devices.
- Comply with ISO, CE, FDA (and other) standards and requirements as applicable to assigned products.
- Safeguard confidential information and data.
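For the monitoring-and-alerting point above, a minimal sketch using Boto3 and CloudWatch; the alarm name, namespace, custom metric, and SNS topic ARN are all hypothetical placeholders.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm if a hypothetical ETL job publishes a failed-record count above zero.
    cloudwatch.put_metric_alarm(
        AlarmName="etl-failed-records",    # hypothetical alarm name
        Namespace="CustomETL",             # hypothetical custom namespace
        MetricName="FailedRecords",        # hypothetical custom metric
        Statistic="Sum",
        Period=300,                        # evaluate in 5-minute windows
        EvaluationPeriods=1,
        Threshold=0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:af-south-1:123456789012:etl-alerts"],  # hypothetical SNS topic
    )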
Qualifications
- Bachelor’s degree in Computer Science or Engineering (or similar)
- Honours degree in Computer Science or Engineering (or similar)
- AWS Certified Solutions Architect or AWS Certified Data Analyst
Experience
- 5+ years' working experience
Required nature of experience:
- Experience with AWS services used for data warehousing, computing and transformations, e.g. AWS Glue (crawlers, jobs, triggers, and catalog), AWS S3, AWS Lambda, AWS Step Functions, AWS Athena and AWS CloudWatch
- Experience with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB)
- Experience with SQL for querying and transformation of data
- Strong skills in Python (especially PySpark for AWS Glue)
- Strong knowledge of data modeling, schema design and database optimization
- Proficiency with AWS and infrastructure as code
- Knowledge of SQL, Python, and AWS serverless microservices.
- Deploying and managing ML models in production (a minimal deployment sketch follows this list)
- Version control (Git), unit testing and agile methodologies
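As a flavour of the ML-deployment point, a minimal sketch using the SageMaker Python SDK; the model artifact, IAM role, entry-point script, and instance type are hypothetical placeholders.

    # pip install sagemaker
    from sagemaker.sklearn.model import SKLearnModel

    # All names below are placeholders, not details from this advert.
    model = SKLearnModel(
        model_data="s3://my-bucket/models/model.tar.gz",  # hypothetical artifact
        role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # hypothetical role
        entry_point="inference.py",  # hypothetical inference script
        framework_version="1.2-1",
    )

    # Provision a real-time HTTPS endpoint behind which the model is served.
    predictor = model.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.large",
    )

    # Callers can then invoke predictor.predict(...) against the endpoint.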
This job description is not a definitive or exhaustive list of responsibilities and is subject to change depending on changing business requirements. Employees will be consulted on any changes. If you do not hear from us within 30 days, please consider your application unsuccessful.
AWS Data Engineer
Posted 16 days ago
Job Description
Ready to take your data engineering career to new heights? PBT Group is looking for a Senior AWS Data Engineer to design, build, and lead cutting-edge data solutions in a dynamic, agile environment.
What You’ll Do:
Architect modern data analytics frameworks.
Translate complex requirements into scalable, secure, high-performance pipelines.
Build & optimize batch/real-time data solutions using AWS & Big Data tools.
Lead engineering efforts across multiple agile projects.
What You Bring:
5+ yrs in Data/Software Engineering with team leadership experience (3–5 yrs).
2+ yrs in Big Data & AWS (EMR, EC2, S3).
ETL expert – especially Talend, cloud migration, & data pipeline support.
Strong in Python, PySpark, SQL, and data modeling.
Agile mindset (Scrum, Kanban).
Familiar with Hadoop ecosystem, production support, and DevOps for BI.
Nice to Have:
Experience with Spark, streaming data tools, & scalable system design.
BI data modeling background (3+ yrs).
Qualifications:
Bachelor’s in Computer Science/Engineering or equivalent experience.
AWS Certified (Associate+ level preferred).
Be part of a team that thrives on innovation, collaboration, and cloud-first data transformation.
Apply now and help shape the future of data!
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
"If you have not heard from us in two weeks, please note that you were unsuccessful for the role. However, we will keep your resume on file and reach out if any other suitable opportunity arises in the future".