304 NoSQL Database jobs in South Africa
Data Engineer
Posted today
Job Description
We are looking for an experienced intermediate Data Engineer. The candidate must have strong SQL and SSIS capabilities and solid production support experience, including being on call on a 7-day rotation cycle. AWS cloud skills would be a significant advantage.
Skills & Experience Required:
Technical Stack:
- Strong proficiency in SSIS (SQL Server Integration Services) – candidates should demonstrate extensive experience in developing, optimising, and maintaining SSIS packages in a production environment.
- Proven ability to perform 24/7 on-call support and handle production support issues effectively and independently.
- Ability to balance operational responsibilities with new development – we're looking for someone who is not only strong in maintaining and supporting existing systems but also has the capability to drive and implement new projects independently.
- Self-starter with the confidence to take ownership of deliverables, proactively identify issues, and provide solutions without needing constant direction.
- Strong SQL and data modelling skills (dimensional and normalized)
- Proficient in Python or Scala
- Strong SSIS knowledge
- Experience with Spark (PySpark preferred) – a short PySpark sketch follows this list
- Experience with cloud platforms, ideally AWS (e.g., S3, Glue, Athena, EMR)
- Knowledge of data warehouse and data lake architectures
- Exposure to CI/CD pipelines and containerization (e.g., Docker, GitLab CI)
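To make the Spark and AWS points above concrete, here is a minimal PySpark sketch that reads raw CSV files from S3 and writes them back as partitioned Parquet. The bucket, prefixes, and column names are assumptions for illustration only; a real job would run on EMR or Glue with proper credentials and error handling.

```python
# Minimal PySpark batch job: CSV in S3 -> partitioned Parquet (illustrative names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-data-lake/raw/orders/")          # hypothetical source prefix
)

cleaned = (
    raw.dropDuplicates(["order_id"])                     # basic de-duplication
       .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
       .filter(F.col("order_amount") > 0)                # drop obviously invalid rows
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-data-lake/curated/orders/")  # hypothetical target prefix
)

spark.stop()
```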
Production Support Requirements:
This role includes participation in a rotational production support schedule. The successful candidate must be willing and able to:
- Be on call every third week as part of a structured support roster.
- Respond to after-hours callouts, including late-night or early-morning alerts.
- Support and troubleshoot issues in the nightly batch process to ensure successful completion.
- Work collaboratively with operations and infrastructure teams to resolve time-sensitive issues under pressure.
- Maintain logs, escalate critical incidents, and ensure accurate handovers.
This support responsibility is critical to ensuring the availability and continuity of data services required by business users and systems across the enterprise.
Data Engineer
Posted today
Job Description
Exact location – Rosebank, Firestation, walking distance to Gautrain Station
Responsibilities will include, but not be limited to, the following:
- Develop and optimize data pipelines for collecting, processing, and storing large volumes of production and operational data.
- Play a pivotal role in documenting business processes, procedures, and policies for transparency and consistency.
- Design and implement scalable data architecture solutions tailored to production/operation environments.
- Collaborate with clients, project managers, engineers, and operators to understand data requirements.
- Ensure data integrity, security, and compliance with project requirements and outcomes.
- Monitor and troubleshoot data flow issues and optimize data processing workflows.
- Support the integration of IoT devices, sensors, and industrial systems with data platforms.
- Generate reports and dashboards to visualize real-time and historical data insights.
- Stay updated with the latest industrial data engineering tools and technologies.
- Travel to operational sites when required to ensure hardware and data capture are functional.
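As a loose sketch of the pipeline work described in the list above (not part of the advert), the snippet below rolls raw equipment sensor readings up to hourly averages per machine using pandas; the file name and columns are assumptions.

```python
# Hypothetical example: aggregate raw sensor readings to hourly averages per machine.
import pandas as pd

# Assumed input: one row per reading with a timestamp, machine id, and measured value.
readings = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])

hourly = (
    readings
    .groupby(["machine_id", pd.Grouper(key="timestamp", freq="1h")])["temperature_c"]
    .mean()
    .reset_index()
)

hourly.to_parquet("hourly_temperature.parquet", index=False)
```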
Qualifications and experience:
- Bachelor's or master's degree in Computer Science, Data Engineering, Industrial Engineering, or related field.
- Proven experience in data engineering, especially in industrial or equipment operations settings.
- Strong proficiency in programming languages.
- Experience with cloud platforms and data tools (AVEVA, Power BI).
- An understanding of the Power Platform, its functionalities, and its integration potential will be an advantage.
- Knowledge of IoT protocols and industrial automation systems.
- Familiarity with data modelling, ETL processes, and database management.
- Excellent problem-solving and communication skills.
- Attention to detail and a commitment to accuracy.
Data Engineer
Posted today
Job Description
Ready to architect the future of data on Google Cloud Platform?
Join Lumina Africa (PTY) LTD and lead innovative data solutions using cutting-edge GCP technologies. We're seeking a creative Data Engineer who thinks beyond traditional approaches and brings fresh perspectives to cloud-native data architectures.
What Makes This Role Exceptional:
- GCP Innovation Hub – Work exclusively with Google Cloud's latest data and AI services
- Global Tech Group – Part of Lumina Tech Group with operations across Dubai, London, and South Africa
- Cloud-First Culture – Build scalable, serverless data solutions from day one
- Rapid Growth Environment – Shape our expanding South African data practice
Core GCP Technologies You'll Master:
- BigQuery – Design and optimize large-scale data warehouses
- Cloud Dataflow – Build real-time and batch processing pipelines
- Pub/Sub – Implement event-driven data architectures
- Cloud Composer (Airflow) – Orchestrate complex data workflows
- Dataproc – Manage Spark and Hadoop workloads
- Cloud Storage – Architect data lake solutions
- Vertex AI – Integrate ML pipelines with data engineering workflows
What You'll Architect:
- Design cloud-native data pipelines using GCP services
- Build real-time streaming solutions with Pub/Sub and Dataflow
- Optimize BigQuery performance for petabyte-scale analytics
- Implement Infrastructure as Code using Terraform and Cloud Deployment Manager
- Create innovative data solutions that challenge conventional approaches
- Mentor teams on GCP best practices and modern data patterns
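As a rough illustration of the streaming work described above, the sketch below shows a heavily simplified Apache Beam (Python SDK) pipeline that reads JSON events from a Pub/Sub subscription, applies fixed windows, and writes to BigQuery. The project, subscription, table, and schema are placeholders; a production Dataflow job would add dead-lettering, schema management, and proper pipeline options.

```python
# Illustrative Beam streaming pipeline: Pub/Sub -> fixed windows -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/clickstream-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Window" >> beam.WindowInto(FixedWindows(60))  # 60-second fixed windows
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.clickstream_events",
            schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```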
Essential GCP Expertise:
- 3+ years hands-on experience with Google Cloud Platform
- BigQuery mastery – complex SQL, partitioning, clustering, optimization (see the sketch after this list)
- Cloud Dataflow – Apache Beam, streaming and batch processing
- Python/Java – Strong programming skills for data pipeline development
- Terraform/Cloud Deployment Manager – Infrastructure as Code
- Pub/Sub & Cloud Functions – Event-driven architectures
- GCP Certifications preferred (Professional Data Engineer, Cloud Architect)
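As a small, hypothetical illustration of the partitioning and clustering point above, the snippet below uses the google-cloud-bigquery client to create a day-partitioned, clustered table; the project, dataset, table, and column names are invented.

```python
# Create a date-partitioned, clustered BigQuery table (illustrative names).
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

table_id = "example-project.analytics.page_views"
schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("page", "STRING"),
]

table = bigquery.Table(table_id, schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",                              # partition on the event timestamp
)
table.clustering_fields = ["user_id", "page"]      # cluster within each partition

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```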
Bonus Skills:
- Experience with dbt for data transformation
- Kubernetes and Cloud Run for containerized workloads
- Looker or Data Studio for visualization
- Apache Spark on Dataproc
- Cloud Security and IAM best practices
Why Choose Lumina Africa:
- GCP-Focused Career Path – Specialize in Google Cloud's data ecosystem
- Competitive Package – Market-leading salary + GCP certification support
- Hybrid Working – Modern offices with flexible arrangements
- Innovation Budget – Resources for experimenting with new GCP services
- International Projects – Collaborate across UAE, UK, and African markets
- Fast-Track Growth – Lead data initiatives in our expanding practice
Ready to Build the Future on GCP?
If you're passionate about Google Cloud Platform and ready to architect innovative data solutions that drive South Africa's digital transformation, we want to hear from you.
Data Engineer
Posted today
Job Description
We are seeking an experienced Data Engineer with strong expertise in Google Cloud Platform to join a fast-growing, innovative organisation. This role offers the chance to design, build, and optimise scalable data pipelines and architectures that support impactful decision-making across the business.
If you are analytically sharp, self-motivated, and enjoy working in dynamic environments, this could be the perfect opportunity. A passion for African business, curiosity, and a sense of humour will help you thrive in our energetic and forward-thinking culture.
Key Responsibilities
- Design and develop scalable data pipelines and architectures using Google Cloud Platform technologies (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
- Build and manage ETL processes to transform diverse data sources into structured, reliable formats (a brief loading sketch follows this list).
- Collaborate with data scientists and analysts to deliver solutions that enable insights and smarter decisions.
- Maintain documentation for pipelines, data models, and architecture to ensure clarity and consistency.
- Troubleshoot and resolve data issues while safeguarding quality and integrity.
- Optimise data workflows for performance, scalability, and cost efficiency.
- Automate data-related processes to streamline operations.
- Stay ahead of industry trends and adopt best practices in Google Cloud Platform and data engineering.
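As a rough sketch of the ETL responsibility above (an assumption, not part of the advert), this snippet extracts a CSV, applies a light transformation with pandas, and loads the result into a BigQuery table using the google-cloud-bigquery client; the file, table, and columns are placeholders.

```python
# Tiny extract-transform-load sketch: CSV -> pandas -> BigQuery (placeholder names).
import pandas as pd
from google.cloud import bigquery

# Extract: read a raw export (hypothetical file).
orders = pd.read_csv("raw_orders.csv", parse_dates=["order_ts"])

# Transform: normalise column names and drop incomplete rows.
orders.columns = [c.strip().lower() for c in orders.columns]
orders = orders.dropna(subset=["order_id", "order_ts"])

# Load: append into a BigQuery table (requires pyarrow and GCP credentials).
client = bigquery.Client()
job = client.load_table_from_dataframe(
    orders,
    "example-project.sales.orders",
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
)
job.result()  # wait for the load job to finish
```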
Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 5+ years of experience as a Data Engineer or in a similar role.
- Strong programming skills in BigQuery, Python, SQL, and Google Cloud Platform.
- Proven experience with ETL development and data modeling.
- Familiarity with data lakehouse concepts and techniques.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Hands-on experience with Google Cloud Platform technologies (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
- Experience in financial services would be an advantage.
Apply
Danielle Paxton
Senior Specialist Recruitment Consultant
Data Engineer
Posted today
Job Description
Do you like building data systems and pipelines? Do you enjoy interpreting trends and patterns? Are you able to recognize the deeper meaning of data?
Join Elixirr Digital as a Data Engineer and help us analyze and organize raw data to provide valuable business insights to our clients and stakeholders.
As a Data Engineer, you will be responsible for ensuring the availability and quality of data so that it becomes usable by its target users. You will work on a set of operations aimed at creating processes and mechanisms for the flow of and access to data, in accordance with the project scope and deadlines.
Discover the opportunity to join our Data & Analytics department and work closely with a group of like-minded individuals using cutting-edge technologies.
What will you be doing as a Data Engineer at Elixirr Digital?
- Working closely with Data Architects on AWS, Azure, or IBM architecture designs.
- Maintaining and building data ecosystems by working on the implementation of data ingestions, often in collaboration with other data engineers, analysts, DevOps, and data scientists.
- Ensuring the security of cloud infrastructure and processes by implementing best practices.
- Applying modern principles and methodologies to advance business initiatives and capabilities.
- Identifying and consulting on ways to improve data processing, reliability, efficiency, and quality, as well as solution cost and performance.
- Preparing test cases and strategies for unit testing, system, and integration testing.
Competencies and skillset we expect you to have to successfully perform your job:
- Proficient in Python with extensive experience in data processing and analysis.
- Strong SQL expertise, adept at writing efficient queries and optimizing database performance (a brief example follows this list).
- Previous working experience with the Azure/AWS data stack.
- Experienced in software development lifecycle methodologies, with a focus on Agile practices.
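As a small illustration of the Python and SQL skills listed above (the connection string, table, and columns are invented), the snippet below runs a parameterised query through SQLAlchemy and pulls the result into a pandas DataFrame.

```python
# Parameterised SQL query into a pandas DataFrame (illustrative schema).
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection string; in practice this comes from configuration/secrets.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")

query = text(
    """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM payments
    WHERE paid_at >= :since
    GROUP BY customer_id
    ORDER BY total_spend DESC
    """
)

with engine.connect() as conn:
    top_spenders = pd.read_sql(query, conn, params={"since": "2024-01-01"})

print(top_spenders.head())
```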
We Could Be a Perfect Fit If You Are:
- Passionate about technology. You anticipate, recognize, and resolve technical problems using a variety of specialized tools for application development and support.
- Independent. You are a self-motivated and ambitious individual, capable of managing multiple responsibilities effectively.
- Problem-solver. You think creatively and find solutions to complex challenges.
- Creative and outside-the-box thinker. You look beyond blog posts and whitepapers, competitions, and even state-of-the-art benchmarks to solve real-world problems.
- Communicator. Strong verbal and written communication skills are essential to ensure effective collaboration and timely delivery of results within the team.
- Proficient in English. We work across continents in a global environment, so fluent English, both written and spoken, is a must.
Why is Elixirr Digital the right next step for you?
From working with cutting-edge technologies to solving complex challenges for global clients, we make sure your work matters. And while you're building great things, we're here to support you.
Compensation & Equity:
- Performance bonus
- Employee Stock Options Grant
- Employee Share Purchase Plan (ESPP)
- Competitive compensation
Health & Wellbeing:
- Health benefits plan
- Flexible working hours
- Pension plan
Projects & Tools:
- Modern equipment
- Big clients and interesting projects
- Cutting-edge technologies
Learning & Growth:
- Growth and development opportunities
- Internal LMS & knowledge hubs
We don't just offer a job - we create space for you to grow, thrive, and be recognized.
Intrigued? Apply now
Data Engineer
Posted today
Job Description
Exciting Opportunity: Data Engineer (Kafka & Flink)
PBT Group is seeking a skilled Data Engineer with a drive for real-time data streaming and cutting-edge architecture. If you thrive in building and optimising scalable data processing systems, this role is for you.
What you'll do:
- Design, develop, and maintain high-performance data pipelines and streaming solutions
- Integrate and optimise data flows across diverse systems
- Collaborate with architects and data teams to deliver robust, scalable, and secure data solutions
What you'll bring:
- Proven experience with Apache Flink (real-time stream processing) and Apache Kafka (event-driven pipelines) – a minimal Kafka sketch follows this list
- Strong development skills in Python, Java, or Scala
- Hands-on experience in Spring Boot, cloud or containerised environments (AWS, Kubernetes, Docker)
- Solid grasp of data modelling, integration, and data quality principles
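As a minimal, hypothetical sketch of the event-driven side of this stack (using the kafka-python client rather than Flink itself; the broker, topics, and fields are invented), the loop below consumes JSON events, filters them, and republishes the enriched records to a downstream topic. A Flink job would express the same consume-transform-produce logic as a declarative streaming dataflow.

```python
# Consume JSON events, filter/transform them, and produce to a downstream topic.
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "payments.raw",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="payments-enricher",
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    if event.get("amount", 0) <= 0:   # drop obviously invalid events
        continue
    event["processed"] = True         # trivial enrichment step
    producer.send("payments.enriched", value=event)
```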
Why join us?
Be part of an innovative team driving real-time data transformation across industries, using modern technologies and agile delivery methods.
If you're ready to take your data engineering career to the next level, we'd love to hear from you
In line with the POPI Act, please provide consent for us to keep your details on our database for future opportunities.
If you do not receive feedback within two weeks, kindly consider your application unsuccessful.
Data Engineer
Posted today
Job Description
Brief description
The main purpose of this position is to build, maintain and optimise business intelligence (BI) data pipelines that feed from various data systems across the South African Reserve Bank (SARB) and enable the support of Data as a Service (DaaS) to the SARB.
Detailed description
The successful candidate will be responsible for the following key performance areas:
- Implement data service standards and frameworks across the SARB to ensure optimised solutions and adherence to best practice in data operations (DataOps), development and operations (DevOps) as well as machine learning and operations (MLOps).
- Take responsibility for BI data pipelines and flows for domain specific analytic implementations across the SARB.
- Ensure understanding of client's data requirements in order to drive continuous development of data services and address evolving business needs.
- Design and build data pipelines that are robust, modular, scalable, deployable, reproducible and versioned for analytics and reporting purposes.
- Continually monitor and optimise domain specific data pipelines to ensure data availability and optimal long-term performance of data pipelines.
- Implement new data engineering features.
- Implement data sharing technology services for the SARB, in alignment with the BI and Business Solutions and Technology Department (BSTD) Strategy.
- Diagnose, manage and enhance the performance of BI data marts and warehouses across the SARB by applying data engineering techniques such as distributed computing and data optimisation.
- Resolve data issues across BI data marts, data warehouses and data lakes.
- Implement initiatives to ensure compliance and adherence to security and application standards with respect to all BI data services.
- Identify and manage the mitigation of risks relating to domain-specific BI data services.
- Proactively engage and problem-solve with cross functional stakeholders ‒ from technical data teams to managers ‒ to address their data needs in order to build impactful analytics solutions.
- Provide reporting and recommendations on data service performance, improvements and data availability for domain-specific solutions to management.
- Keep abreast of industry best practices and technologies and lead implementation thereof to optimise effective and efficient data pipelines and services.
- Impart knowledge of the technical environment to other data engineers, systems development, database administrator, infrastructure and enterprise architecture and enterprise information management teams.
Data Engineer
Posted today
Job Description
The role of the Data Engineer encompasses many activities, including (but not limited to):
- Data Modeling: Designing logical and physical data models to support the data requirements of applications and analytics; ensuring data models meet business requirements and are optimized for performance and scalability.
- Data Integration: Designing and implementing data integration processes, including ETL (extract, transform, load) and ELT (extract, load, transform) workflows; ensuring seamless integration of data from various sources, both internal and external.
- Data Security and Compliance: Ensuring data architectures comply with relevant data privacy and security regulations; implementing data security measures, including encryption, access controls, and monitoring.
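To ground the data modelling activity above with a tiny, hypothetical example, the snippet below defines a star-schema fragment (one dimension and one fact table) using SQLAlchemy Core; the table and column names are illustrative only.

```python
# Star-schema fragment: a customer dimension and a sales fact table (illustrative).
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table, create_engine,
)

metadata = MetaData()

dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_key", Integer, primary_key=True),  # surrogate key
    Column("customer_id", String(50), nullable=False),  # natural/business key
    Column("segment", String(30)),
)

fact_sales = Table(
    "fact_sales", metadata,
    Column("sale_id", Integer, primary_key=True),
    Column("customer_key", Integer, ForeignKey("dim_customer.customer_key")),
    Column("sale_date", Date, nullable=False),
    Column("amount", Numeric(12, 2), nullable=False),
)

# Create the tables in a local SQLite database just to demonstrate the model.
engine = create_engine("sqlite:///warehouse_demo.db")
metadata.create_all(engine)
```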
Minimum Qualification:
- NQF 6 or higher tertiary qualification in an Information and Communication Technology (ICT) field, incorporating (but not limited to) Information Systems; cloud certification.
Minimum Experience:
- Minimum of 6 years' experience in a Data Engineer role.
Data Engineer
Posted today
Job Description
Internship Opportunity – Data Engineer
We are looking for enthusiastic and motivated Data Engineering interns to join our dynamic team at NeoStats Analytics.
What we're looking for:
- 0-2 years of experience and a strong academic background in Computer Science, IT, or related fields.
- Basic knowledge of SQL, Python (Pandas, NumPy), and data manipulation (a short example follows this list).
- Understanding of ETL concepts, data warehousing, or data pipelines (a plus).
- Exposure to cloud platforms like AWS, GCP, or Azure (bonus).
- Passion for data engineering, problem-solving, and learning new technologies.
- Good internet connectivity and willingness to collaborate in a fast-paced team environment.
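As a short example of the Python and data-manipulation basics mentioned above (the file and columns are invented), this snippet cleans a CSV with pandas and derives a simple NumPy-based feature.

```python
# Basic data manipulation with pandas and NumPy (placeholder file and columns).
import numpy as np
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["sale_date"])

# Clean: drop duplicate rows and fill missing quantities with zero.
sales = sales.drop_duplicates(subset=["sale_id"])
sales["quantity"] = sales["quantity"].fillna(0)

# Derive: log-scaled revenue and a per-region summary.
sales["log_revenue"] = np.log1p(sales["price"] * sales["quantity"])
summary = sales.groupby("region")["log_revenue"].agg(["mean", "count"])

print(summary)
```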
Send your details directly to , and let's connect.
This is a fantastic opportunity to gain hands-on experience in data pipelines, cloud systems, and analytics solutions while working with a global team transforming businesses into data-driven organizations.
Data Engineer
Posted today
Job Description
We are looking for Data Engineers who can join our team to architect and build data platforms to power insight-led business solutions. We are currently working on AWS and Microsoft Azure cloud platforms, using SQL, Python, Java, .NET and other technologies.
Are you inspired to engineer enterprise-grade, scalable data platforms that solve challenging business problems? Are you inspired to transform the way people and businesses work, and the experience they deliver to their customers? Do you want to continuously learn and grow, by working with the most talented people in South Africa? Then you will love it here
Note:
We give full consideration to every applicant's fit to this role, so if you decide to apply and you do not hear from BSG within a maximum of a four-week period, please consider your application unsuccessful at this time.
Main Purpose of the Role:
Data Engineers work in multi-skilled teams to architect, design and build data platforms and data products. These provide insights for better decision-making and smarter business processes.
Minimum Qualifications:
- Honours or Master's degree in Computer Science, Information Systems, Engineering, Physics, Mathematics, Statistics or related field
- 4 years' experience in enterprise software development, including proficiency in SQL and Python / Java / .NET
- Awareness of Data Engineering paradigms such as data warehousing, data mesh or data vault
Experience:
- 4-8 years' experience working in teams to build enterprise-grade software solutions
- Experience with big data tools, such as found in the Hadoop ecosystem, is advantageous
- Experience with AWS or MS Azure cloud data services is preferable
Job Objectives:
- Be an ambassador for BSG's insight-led business solutions
- Attract talent to BSG's Data and Analytics capability, and coach emerging Data Engineers
- Support relationship sales through expertise, analysis of our clients' problems / opportunities and use cases, and high-level design of solution options
- Work with others to define the business problem and identify which data-driven insights and data sources will help to solve that problem
- Work with Data Scientists to identify relevant data from internal / external sources, join / transform the data and explore it for insight
- Work together with Data Scientists to build data pipelines and architect, train, validate and test advanced analytics / machine learning models, using enterprise-grade software engineering practices
- Communicate to business and technical stakeholders how and why the insights and / or models work
- Deploy models into production, on cloud (or sometimes on-premise) AI / ML / data platforms
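As a loose illustration of the objective above of training and validating models with enterprise-grade engineering practices (the dataset and features are hypothetical, and BSG's actual stack may differ), this snippet fits and evaluates a simple scikit-learn model inside a reproducible pipeline.

```python
# Train and validate a simple model in a reproducible scikit-learn pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical customer dataset with a binary "churned" label.
data = pd.read_csv("customers.csv")
X = data[["tenure_months", "monthly_spend", "support_tickets"]]
y = data["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

print(f"Hold-out accuracy: {model.score(X_test, y_test):.3f}")
```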
Skills and Attributes:
- Relate to people and build productive working relationships
- Understand client business problems and identify data-driven solutions
- A clear understanding of the data engineering and data science lifecycle and its constituent parts: data exploration, data preparation, data wrangling and feature engineering, tools, analytical methods, model evaluation, deployment, and performance monitoring
- Estimate the effort, skills and dependencies to deliver data-driven solutions
- Select appropriate technologies and tools, and learn new ones
- Architect and build data pipelines, including data aggregation and transformation
- Deploy pipelines in production
- Communicate clearly to business and technical stakeholders