130 Data Engineer AWS jobs in Johannesburg
Data Engineer (AWS)
Posted 4 days ago
Job Description
Falcorp Technologies is seeking a mature and meticulous Data Analyst who will be responsible for retrieving and gathering data, organizing it, and using it to reach meaningful conclusions. We’re looking for a Data Analyst who has strong SQL experience; telco experience will be advantageous.
Duties & Responsibilities
- Extract, transform, and reformat data
- Conduct regular and ad hoc data projects
- Implement and maintain solutions according to data sourcing and data transformation using SQL
- Communicate with clients and end-users to determine purpose and end solution requirements
- Data extraction and data manipulation to fulfill client requirements, including ad hoc requests, documentation, and effective communication with external clients and internal stakeholders
- Investigation, design, and implementation of improvements to existing processes
- R and Python
- SQL
- SSMS / SSIS
- Visual Studio
- VS Code
- Cloud (AWS / Azure)
- Industry qualification: National Diploma or Degree
- 3+ years' experience in Data Analysis using SQL and Python / R
Data Engineer – AWS
Posted 18 days ago
Job Description
Deloitte is a leading global provider of audit and assurance, consulting, financial advisory, risk advisory, tax, and related services. Our global network of member firms and related entities in more than 150 countries and territories (collectively, the “Deloitte organization”) serves four out of five Fortune Global 500 companies. Learn how Deloitte’s approximately 457,000 people make an impact that matters at .
Innovation, transformation, and leadership occur in many ways. At Deloitte, our ability to help solve clients’ most complex issues is distinct. We deliver strategy and implementation, from a business and technology perspective, to help lead in the markets where our clients compete.
Are you a game changer? Do you believe in adding advantage at every level in everything you do? You may be one of us.
Deloitte Consulting is growing, with a focus on developing our already powerful teams across our portfolio of offerings. We are looking for smart, accountable, innovative professionals with technical expertise, deep industry experience, and insights. Our unique combination of six areas of expertise, our well-developed industry structure, and our integrated signature solutions create an offering never seen before on the African continent.
Be at the forefront of the revolution.
AI-enabled technologies are shaking business foundations. Some find this daunting. We see opportunity—for clients, societies, and people.
Deloitte’s AI & Data Specialists partner with clients to leverage AI and achieve new levels of organizational excellence. We turn data into insights, into action—on an industrial scale.
Join us as we enable clients to grasp the future and reach new heights. Learn from the best in the field to create solutions blending data science, data engineering, and process engineering with our industry-leading expertise.
Job Description
Support the Technical Lead in establishing new patterns, standards, processes, and procedures for the client’s solution and data community.
Specialize in data integration and data warehousing concepts to extract data from disparate sources, transform it according to business requirements, and load the required tables for downstream consumption.
Design and build solutions, communicate effectively with technical and business teams, and convey solutions aligned with business needs.
Delivery Leadership:
- Define high-level solution design options based on client requirements
- Create design standards and patterns that are reusable in client solutions
- Prototype potential solutions rapidly for design trade-off discussions
- Mentor and train junior team members
- Conduct code reviews of team members’ work
- Break down tasks accurately and provide estimations for solutions
- Learn new technologies quickly
- Apply structured problem-solving approaches
- Develop data models and designs within client architecture and standards
- Understand complex business environments and requirements to design solutions based on best practices
- Document designs and solutions for client understanding
- Complete deliverables for architectural approval at the client
- Understand DataOps approach to solution architecture
- Solid experience in data and SQL is required
Technical Skills:
Demonstrated experience in database development is essential. Experience in other areas is a bonus.
Database:
- SAP HANA
- SQL Server
Database Development:
- Views, functions, stored procedures, query optimization, indexing, OLAP / MDX
Cloud:
- AWS
ETL Tools:
- SSIS, IBM DataStage, SAP Data Services, Informatica or similar
Programming Languages:
- Java, Python, UNIX & Shell Commands (Python / shell / Perl) is a plus
Modelling:
- Data Vault (preferred), Kimball (preferred), 3rd Normal Form / OLAP / MDX, Streaming (NiFi / Kafka)
Methodologies:
- Agile, PMBOK, DataOps / DevOps
Data Acquisition:
- One-off, CDC, Streaming
Minimum: Bachelor’s Degree in Data Science, Engineering or related field
Preferred: Postgraduate Degree in Data Science, Engineering or related field, and data-related cloud certifications
Experience:
3-5 years of client-facing experience in data roles
Additional Information
Behavioral Skills:
- Excellent communication skills, both written and verbal
- Ability to develop and grow technical teams
- Objective-oriented with a strong client delivery focus
- Builds strong, trusting relationships with clients
- Focus on quality and risk management
- Strong problem-solving skills
- Ability to understand and navigate complex environments and systems
- Keen to understand how things work
At Deloitte, we foster an inclusive environment where everyone can be themselves and thrive. We are committed to fairness, respect, and diversity across the African continent.
Note: The list of tasks and responsibilities is not exhaustive. Deloitte may assign additional duties as operationally required.
Data Engineer: AWS / Azure
Posted 6 days ago
Job Description
Job Overview
- You will work within a team solving complex problems to deliver real business value across a wide range of industries such as mining, telcos, retail and financial services.
- You will help develop best practices and drive improvements in data engineering across the business.
Responsibilities, Activities and Key Deliverables
- Analyse and organise raw data
- Design data engineering solutions to meet business requirements
- Build scalable data pipelines that clean, transform and aggregate data from different sources using appropriate tools and technologies
- Collaborate with data scientists to prepare data sets for analytical modeling
- Identify ways to enhance data quality and reliability
- Manage technical delivery of projects
- Mentor junior data engineers.
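As an illustration of the clean, transform, and aggregate flow described in the responsibilities above, here is a minimal plain-Python sketch. The field names and sources are invented for the example; a production pipeline would use Glue, Spark, or similar tooling rather than in-memory lists:

```python
# Minimal sketch of a batch pipeline stage: clean, transform, and
# aggregate records from two hypothetical sources. Field names are
# illustrative assumptions, not taken from any specific client system.
from collections import defaultdict

def clean(records):
    """Drop rows with missing keys and normalise text fields."""
    return [
        {**r, "region": r["region"].strip().lower()}
        for r in records
        if r.get("region") and r.get("amount") is not None
    ]

def aggregate(records):
    """Sum amounts per region."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += float(r["amount"])
    return dict(totals)

source_a = [{"region": " Gauteng ", "amount": 100}, {"region": None, "amount": 5}]
source_b = [{"region": "gauteng", "amount": 50}, {"region": "Western Cape", "amount": 20}]

combined = clean(source_a) + clean(source_b)
print(aggregate(combined))  # {'gauteng': 150.0, 'western cape': 20.0}
```

The same clean-then-aggregate shape carries over directly to a PySpark job, with `clean` and `aggregate` becoming DataFrame transformations.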
Cloud & AWS - Data Engineer - Consultant
Posted 4 days ago
Job Description
- Full-time
At Deloitte, our Purpose is to make an impact that matters for our clients, our people, and society. This is the lens through which our global strategy is set. It unites Deloitte professionals across geographies, businesses, and skills. It makes us better at what we do and how we do it. It enables us to deliver on our promises to stakeholders, while creating the lasting impact we seek.
Harnessing the talent of 450,000+ people located across more than 150 countries and territories, our size and scale puts us in a unique position to help change the world for the better—by bringing together the services we provide, the societal investments we make, and the collaborations we advance through our ecosystems.
Deloitte offers career opportunities across Audit & Assurance (A&A), Tax & Legal (T&L), and our Consulting services business, which is made up of Strategy, Risk & Transactions Advisory (SR&T) and Technology & Transformation (T&T).
Job Description
We are looking for a Data Engineering Consultant to make an impact in our team and with our clients. This role supports the Engagement team in delivering services to clients on engagements and projects.
Main Purpose of the Job:
Support the Technical Lead in establishing new patterns, standards, processes, and procedures for the client’s solution and data community. Specialize in data integration and data warehousing concepts: extract data from disparate sources, transform it according to business requirements, and load the required tables for downstream consumption. Help design and build solutions, communicating with both technical and business teams at a client and conveying solutions that meet business requirements.
- Able to define a structured approach to problem solving
- Completion of data models and designs within client’s architecture and standards
- Build robust data pipelines and ETL’s using integration tools and services
- Understanding complex business environments and requirements, and designing solutions based on leading practices
- Ability to document design and implement solutions for client product owners
- Completion of deliverables for gaining architectural approval at client
- Understanding of DataOps approach to solution architecture.
- Solid experience in data and SQL is required
Must have experience in one or more services and technologies listed below:
- Database Development: Experience with views, functions, stored procedures, query optimisation, building indexes, OLAP / MDX
- ETL: AWS Glue, Athena, SSIS, IBM DataStage / SAP Data Services, AWS DMS, AppFlow
- Programming: SQL (T-SQL / HQL, etc.), Python, Spark; UNIX & shell commands (Python / shell / Perl) a plus
- Modelling: Data Vault (preferred), Kimball (preferred), 3rd Normal Form / OLAP / MDX
- Data Acquisition: pipeline creation, automation, and data delivery; once-off, CDC, streaming
- Excellent communication skills, both written and verbal
- Ability to develop & grow technical teams
- Objective oriented with strong client delivery focus
- Client focused by building strong trusting relationships with clients
- Focus on quality and risk
- Sound problem solving ability
- Ability to understand and comprehend complex environments and systems.
- Inquisitive by nature and keen to figure out how things work
- Continuous learning mindset
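Of the data-acquisition modes listed above (once-off, CDC, streaming), CDC is the least obvious; it can be sketched in plain Python as a diff of two keyed table snapshots. The table shapes and keys below are assumptions for illustration only:

```python
# Illustrative sketch of change data capture (CDC): diff two keyed
# snapshots of a table into insert/update/delete events. Real CDC
# tooling (e.g. AWS DMS) reads the database log instead of snapshots.
def cdc_diff(old, new):
    """Emit (action, key, row) events between two keyed snapshots."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events

before = {1: {"name": "Thabo"}, 2: {"name": "Aisha"}}
after = {1: {"name": "Thabo M."}, 3: {"name": "Lindiwe"}}
print(cdc_diff(before, after))
```

Log-based CDC avoids the full-snapshot comparison shown here, which is why it scales to large tables.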
Minimum Qualifications
Bachelor’s Degree in Data Science, Engineering, BSc Computer Science, or related Degree
Desired Qualifications
Postgraduate Honours Degree in Data Science, Engineering, or a related field
Data- and technology-related cloud certifications
1 - 3 years working experience in data engineering, with client-facing exposure
Experience with AWS is preferred
Additional Information
At Deloitte, we want everyone to feel they can be themselves and to thrive at work—in every country, in everything we do, every day. We aim to create a workplace where everyone is treated fairly and with respect, including reasonable accommodation for persons with disabilities. We seek to create and leverage our diverse workforce to build an inclusive environment across the African continent.
Note: The list of tasks / duties and responsibilities contained in this document is not necessarily exhaustive. Deloitte may ask the employee to carry out additional duties or responsibilities, which may fall reasonably within the ambit of the role profile, depending on operational requirements.
Be careful of Recruitment Scams: Fraudsters or employment scammers often pose as legitimate recruiters, employers, recruitment consultants or job placement firms, advertising false job opportunities through email, text messages and WhatsApp messages. They aim to cheat jobseekers out of money or to steal personal information.
To help you look out for potential recruitment scams, here are some Red Flags:
- Upfront Payment Requests: Deloitte will never ask for any upfront payment for background checks, job training, or supplies.
- Requests for Personal Information: Be wary if you are asked for sensitive personal information, especially early in the recruitment process and without a clear need for it. Fraudulent links or contractual documents may require the provision of sensitive personal data or copy documents (e.g., government issued numbers or identity documents, passports or passport numbers, bank account statements or numbers, parent’s data) that may be used for identity fraud. Do not provide or send any of these documents or data. Please note we will never ask for photographs at any stage of the recruitment process.
- Unprofessional Communication: Scammers may communicate in an unprofessional manner. Their messages may be filled with poor grammar and spelling errors. The look and feel may not be consistent with the Deloitte corporate brand.
If you're unsure, make direct contact with Deloitte using our official contact details. Be careful not to use any contact details provided in the suspicious job advertisement or email.
AWS Data Engineer (Market related)
Posted 4 days ago
Job Description
The AWS Data Engineer will be responsible for building and maintaining Big Data Pipelines using Data Platforms and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.
Duties & Responsibilities
- Exceptional analytical skills for analyzing large and complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with ability to work independently and multi-task.
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
- Familiar with data stores such as AWS S3, and AWS RDS or DynamoDB.
- Experience and solid understanding of various software design patterns.
- Experience preparing specifications from which programs will be written, designed, coded, tested, and debugged.
- Strong organizational skills. Experience developing and working with REST APIs is a bonus.
- Basic experience in Networking and troubleshooting network issues.
- Degree
- Certifications such as AWS Certified Cloud Practitioner, AWS Certified SysOps Associate, AWS Certified Developer Associate, AWS Certified Architect Associate, AWS Certified Architect Professional, Hashicorp Certified Terraform Associate
- Automotive industry experience
- Terraform
- Python 3.x
- PySpark
- Boto3
- ETL
- Docker
- PowerShell / Bash
- Must have 3–5 years' working experience
- 3–5 years' experience working with enterprise collaboration tools such as Confluence and JIRA
- 3–5 years' experience developing technical documentation and artifacts
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV
- Experience working with Data Quality Tools such as Great Expectations
- Knowledge of the Agile Working Model
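The kind of declarative data-quality checks that a tool such as Great Expectations automates can be sketched in plain Python. The column names and rows below are illustrative assumptions, not any real data set:

```python
# Plain-Python sketch of declarative data-quality expectations, in the
# spirit of tools like Great Expectations. Columns ("vin", "mileage")
# are invented for the example.
def expect_not_null(rows, column):
    """Every row must have a non-null value in the column."""
    return all(r.get(column) is not None for r in rows)

def expect_between(rows, column, lo, hi):
    """Every value in the column must fall within [lo, hi]."""
    return all(lo <= r[column] <= hi for r in rows)

rows = [{"vin": "ABC123", "mileage": 42000}, {"vin": "XYZ789", "mileage": 500}]
results = {
    "vin_not_null": expect_not_null(rows, "vin"),
    "mileage_in_range": expect_between(rows, "mileage", 0, 1_000_000),
}
print(results)  # both checks pass on this sample
```

In a real pipeline, failed expectations would typically fail the run or quarantine the offending rows rather than just print a result.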
Market related
AWS Data Engineer – Senior Consultant
Posted 5 days ago
Job Description
At Deloitte, our Purpose is to make an impact that matters for our clients, our people, and society. This is the lens through which our global strategy is set. It unites Deloitte professionals across geographies, businesses, and skills. It makes us better at what we do and how we do it. It enables us to deliver on our promises to stakeholders, while creating the lasting impact we seek.
Harnessing the talent of 450,000+ people located across more than 150 countries and territories, our size and scale put us in a unique position to help change the world for the better—by bringing together the services we provide, the societal investments we make, and the collaborations we advance through our ecosystems.
Deloitte offers career opportunities across Audit & Assurance (A&A), Tax & Legal (T&L), and our Consulting services business, which includes Strategy, Risk & Transactions Advisory (SR&T), and Technology & Transformation (T&T).
Job Description
We are seeking an AWS Data Engineer to join our AI and Data practice. The ideal candidate is passionate about data and technology solutions, with strong problem-solving and analytical skills, and is tech-savvy with a solid understanding of software development. You should be driven to keep learning and to stay current with market evolution and industry trends.
You will work throughout the entire engagement cycle, specializing in modern data solutions such as data ingestion/data pipeline frameworks, data warehouse & data lake architectures, cognitive computing, and cloud services.
Technical Requirements for the role:
- Support AI & Data teams and implement end-to-end modern data platforms supporting analytics and AI use cases.
- Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to identify and define data structures, formats, pipelines, metadata, and workload orchestration capabilities.
- Address data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations.
- Estimate effort and mentor junior colleagues.
- Participate in technical meetings with clients and advise on technical options based on leading practices.
- Work as a data engineer on AWS and other technologies.
- Apply deep technology knowledge to drive continuous improvement.
- Maintain good communication skills, both written and verbal.
- Build interpersonal relationships and foster collaboration.
- Focus on client delivery and quality.
- Be an adaptable problem-solver and analytical thinker.
Bachelor’s Degree (or higher) in fields like Computer Science, Information Management, Big Data & Analytics, or related areas is preferred.
Having one or more AWS certifications is preferred, but experience with cloud platform solutions is mandatory, such as:
- AWS Solutions Architect – Associate/Professional
Experience required:
At least 3 years in implementing innovative data solutions leveraging Big Data frameworks, supporting on-premise or AWS cloud environments for analytics and AI use cases.
At least 3 years in extracting, transforming, and loading data from various sources including structured, unstructured, and semi-structured data using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads.
Experience with data warehousing or data lakes.
Ability to communicate complex technical concepts clearly to non-technical stakeholders and work within Agile development environments.
Must demonstrate experience in services and technologies like:
- Database development: Views, functions, stored procedures, query optimization, indexes, OLAP / MDX
- Cloud platforms: AWS (preferred), Azure, GCP, Snowflake
- ETL tools: AWS Glue, Athena, SSIS, IBM DataStage / SAP Data Services, AWS DMS, AppFlow
- Data acquisition: pipeline creation, automation, data delivery, CDC, streaming
- Designing structured problem-solving approaches, data models, and architecture standards.
At Deloitte, we foster an inclusive environment where everyone can be themselves and thrive. We are committed to fairness and respect, including reasonable accommodations for persons with disabilities. We leverage our diverse workforce to build a more inclusive environment across Africa.
Note: The list of tasks and responsibilities is not exhaustive. Deloitte may assign additional duties as required by operational needs.
Beware of Recruitment Scams: Fraudulent schemes may pose as legitimate recruiters. Be cautious of requests for upfront payments, personal information, or unprofessional communication. Contact Deloitte directly through official channels for verification.
Intermediate/Senior AWS Data Engineer
Posted 25 days ago
Job Description
A technology and business consulting firm, founded on a combination of technology, data, financial, and actuarial science principles, is looking for a highly motivated Intermediate or Senior AWS Data Engineer with strong expertise in building scalable data pipelines on AWS. You will work with major financial institutions, designing and implementing modern cloud-based data solutions. This is a hands-on role requiring a solid foundation in Python or C#, AWS Glue (PySpark), and cloud-based ETL systems.
Responsibilities:
- Design, build, and optimize robust data pipelines and architectures on AWS.
- Lead the implementation of scalable and secure data solutions.
- Ingest data into AWS S3 and transform/load into RDS/Redshift.
- Build AWS Glue jobs using PySpark or Glue Spark.
- Use AWS Lambda (Python/C#) for event-driven data transformation.
- Collaborate on migration and deployment automation (Dev to Prod).
- Support data governance, lineage, and best practices in security.
- Deliver data insights through well-structured models and pipelines.
- Work with batch, real-time (Kafka), and streaming architectures.
- Interact with stakeholders and communicate technical concepts clearly.
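The event-driven Lambda pattern in the responsibilities above can be sketched as a plain-Python handler. The event shape below is a simplified assumption modelled on an S3 put notification, not the full AWS event schema, and the bucket and prefix names are invented:

```python
# Hedged sketch of an event-driven transform in the style of an AWS
# Lambda handler reacting to an S3 put event. The event structure is a
# simplified assumption; a real S3 notification carries more fields.
import json

def handler(event, context=None):
    """Extract bucket/key pairs and route them by file extension."""
    outputs = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        # Hypothetical routing rule: Parquet files go to a curated zone.
        target = "parquet-zone" if key.endswith(".parquet") else "raw-zone"
        outputs.append({"source": f"{bucket}/{key}", "target": target})
    return {"statusCode": 200, "body": json.dumps(outputs)}

event = {"Records": [{"s3": {"bucket": {"name": "landing"},
                             "object": {"key": "trades/2024/01.parquet"}}}]}
print(handler(event))
```

Keeping the handler a pure function of its event, as here, is what makes this style of transformation easy to unit-test before deployment.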
Qualifications and experience:
- Proficiency in SQL and data modelling principles.
- Deep experience with AWS services: Glue, Lambda, S3, RDS, Redshift, DynamoDB, Kinesis, SQS/SNS, IAM.
- CI/CD, DevOps and scripting (PowerShell, Bash, Python, etc.).
- Familiarity with RDBMS systems: PostgreSQL, MySQL, SQL Server.
- Agile/Scrum methodology and full SDLC experience.
- Kafka and real-time data ingestion experience (advantageous).
- Strong Python or C# programming, including OOP and libraries for data engineering.
The reference number for this position is NG60600. It is a permanent, hybrid position in Johannesburg offering a salary of R600,000 up to R900,000 per annum, negotiable based on experience. E-mail Nokuthula on nokuthulag@ e-Merge.co.za or call her for a chat on to discuss this and other opportunities.
Are you ready for a change of scenery? E-Merge IT recruitment is a niche recruitment agency. We offer our candidates options so that we can successfully place the right people with the right companies, in the right roles. Check out the E-Merge IT website for more great positions.
Do you have a friend who is a developer or technology specialist? We pay cash for successful referrals!
Data Engineer – Senior Consultant – AWS
Posted 18 days ago
Job Description
Deloitte is a leading global provider of audit and assurance, consulting, financial advisory, risk advisory, tax and related services. Our global network of member firms and related entities in more than 150 countries and territories (collectively, the “Deloitte organisation”) serves four out of five Fortune Global 500 companies. Learn how Deloitte’s approximately 457,000 people make an impact that matters at .
Innovation, transformation and leadership occur in many ways. At Deloitte, our ability to help solve clients’ most complex issues is distinct. We deliver strategy and implementation, from a business and technology view, to help lead in the markets where our clients compete.
Are you a game changer? Do you believe in adding advantage at every level in everything you do? You may be one of us.
Deloitte Consulting is growing, with a focus on developing our already powerful teams across our portfolio of offerings. We are looking for smart, accountable, innovative professionals with technical expertise, deep industry experience, and insights. The combination of our six areas of expertise, our well-developed industry structure, and our integrated signature solutions is a unique offering never seen before on the African continent.
Be at the forefront of the revolution.
AI-enabled technologies are shaking business foundations. Some find this daunting. We see opportunity—for clients, societies, and people.
Deloitte’s AI & Data Specialists partner with clients to leverage AI and reach new levels of organisational excellence. We turn data into insights, into action—at an industrial scale.
Join us as we enable clients to grasp the future and reach new heights. Learn from the best in the field to create solutions blending data science, data engineering, and process engineering with our industry-leading expertise.
Job Description
Working with and supporting the Technical Lead in establishing new patterns, standards, processes, and procedures for the client’s solution and data community.
Specialize in data integration and data warehousing concepts to extract data from disparate sources, transform it according to business requirements, and load the required tables for downstream consumption.
Help design and build solutions, communicating with both technical and business teams at a client and conveying solutions that meet business requirements.
Delivery Leadership:
- Define high-level solution design options based on client requirements
- Creation of design standards and patterns reusable in a client’s solution
- Experience in rapid prototyping of potential solutions for design trade-off discussions
- Mentoring and training of Junior members of the team
- Completing code reviews of team members
- Accurate breakdown and estimations of tasks for solution
- Ability to pick up and learn new technology quickly
- Able to define a structured approach to problem-solving
- Completion of data models and designs within client’s architecture and standards
- Understanding complex business environments and requirements and designing a solution based on leading practices
- Ability to document design and solutions for understanding by client product owners
- Completion of deliverables for gaining architectural approval at client
- Understanding of DataOps approach to solution architecture.
- Solid experience in data and SQL is required
Technical:
Demonstrated experience in database and database development is essential. Experience in other areas is a bonus.
Database:
- SAP HANA
- SQL Server
Database Development:
- Experience with Views, functions, stored procedures, optimisation of queries, building indexes, OLAP / MDX
Cloud:
- AWS
ETL:
- SSIS
- IBM DataStage
- SAP Data Services
- Informatica or similar
Programming:
- Java
- Python
- UNIX & Shell Commands (Python / shell / Perl) is a plus
Modelling:
- Data Vault (preferred)
- Kimball (preferred)
- 3rd Normal Form / OLAP / MDX
- Streaming (NiFi / Kafka)
Methodologies:
- Agile
- PMBOK
- DataOps / DevOps
Data Acquisition:
- Once-off, CDC, Streaming
Minimum: Bachelor’s Degree in Data Science, Engineering or related Degree
Preferred: Postgraduate Degree in Data Science, Engineering, or a related field
Data-related cloud certifications
Experience:
3–5 years' working experience, including client-facing experience
Additional Information
Behavioural:
- Excellent communication skills, both written and verbal
- Ability to develop & grow technical teams
- Objective oriented with strong client delivery focus
- Client focused by building strong trusting relationships with clients
- Focus on quality and risk
- Sound problem-solving ability
- Ability to understand and comprehend complex environments and systems.
- Inquisitive by nature and keen to figure out how things work
At Deloitte, we want everyone to feel they can be themselves and to thrive at work—in every country, in everything we do, every day. We aim to create a workplace where everyone is treated fairly and with respect, including reasonable accommodation for persons with disabilities. We seek to create and leverage our diverse workforce to build an inclusive environment across the African continent.
AWS Data Migration Engineer
Posted 8 days ago
Job Description
SUMMARY:
We are passionate about helping enterprises accelerate their digital transformation. As a leading cloud consulting and managed services provider, we specialize in delivering scalable, secure, and high-performance cloud solutions. We're looking for an experienced AWS Data Migration Engineer to join our growing Cloud Services team and play a critical role in large-scale cloud data migration initiatives.
POSITION INFO:
Job Description
As an AWS Data Migration Engineer , you will be responsible for designing, executing, and managing the migration of data from on-premises or other cloud platforms to Amazon Web Services (AWS). You will work closely with cloud architects, database administrators, and DevOps teams to ensure seamless data transfers that meet security, compliance, and performance requirements.
Required Qualifications:
- Bachelor's degree / Diploma in Computer Science, Information Technology, or related field.
- 4+ years of experience in data migration or cloud engineering.
- 3+ years of hands-on experience with AWS services, particularly DMS, EC2, S3, Glue, Lambda, and Redshift.
- Strong proficiency in SQL, Python, or shell scripting.
- Experience migrating large datasets (terabytes or more) securely and efficiently.
- Understanding of data warehousing, ETL / ELT processes, and database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL).
- Knowledge of networking, IAM policies, and encryption best practices in AWS.
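One safeguard implied by the requirements above, validating a large transfer with checksums, can be sketched with the Python standard library. The paths and payload are illustrative; a real migration would lean on DMS validation and S3 integrity features rather than hand-rolled hashing:

```python
# Sketch of post-copy validation for a data migration: hash the source
# and the copy in chunks and compare digests. Chunked reading keeps
# memory flat even for terabyte-scale files.
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 one chunk at a time."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Illustrative "migration": copy a temp file and verify the digests match.
src = tempfile.NamedTemporaryFile(delete=False)
src.write(b"example payload" * 1000)
src.close()
dst = src.name + ".copy"
shutil.copyfile(src.name, dst)
assert sha256_of(src.name) == sha256_of(dst)  # digests match after copy
print("migration validated")
os.remove(src.name)
os.remove(dst)
```

The same digest-comparison step slots in after any transfer mechanism, whether `aws s3 cp`, DMS, or a bespoke pipeline.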
Preferred Qualifications:
Please Apply Now!