6 Senior Data Engineer Remote jobs in Western Cape
Senior Data Engineer (Remote)
Posted 6 days ago
Job Description
About us
LifeCheq is a fintech company changing how South Africans manage their personal
finances. Our platform combines smart tech, deep financial expertise, and a unique
approach to financial advice. We're growing rapidly, backed by major investors including
Futuregrowth, African Rainbow Capital, and Naspers Foundry.
We’re building a deeply data-driven platform and are looking for technically sharp engineers
to help push its foundations to the next level.
What you'll do
You’ll take ownership of our analytics data layer and work closely with our Platform and ML
squads. Your job is to build a fast, clean, and reliable system that integrates backend data
from an event-sourced architecture, external services, and JSON-based event logs—then
serves performant, well-modelled views to analysts, ML pipelines, dashboards, and even the
product frontend.
Early work will include:
- Designing a real-time ingestion pipeline for event-sourced data streams and JSON logs on S3, and implementing it for low-latency performance (a rough sketch of this pattern follows the list).
- Defining a clear semantic model and restructuring existing views into a maintainable, well-layered architecture.
- Optimising Delta Lake performance to deliver near–real-time data (<5 minutes) on a reasonable compute budget.
- Formalising schemas for core entities and building structured views that can be safely used across analytics, reporting, and application layers.
- Cleaning up pipeline logic to reduce latency, improve readability, and lower cognitive overhead.
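As a rough illustration of the ingestion and Delta Lake items above, here is a minimal sketch using Spark Structured Streaming to land JSON event logs from S3 in a Delta table on a one-minute trigger. The bucket paths, event schema, and trigger interval are hypothetical placeholders rather than LifeCheq's actual setup, and hitting the <5-minute target in practice would also involve Delta maintenance (compaction, OPTIMIZE) not shown here.

```python
# Minimal sketch: stream JSON event logs from S3 into a Delta table.
# All paths and the schema are hypothetical placeholders; Delta Lake
# support is assumed to be available (e.g. on Databricks).
from pyspark.sql import SparkSession
from pyspark.sql.types import StructField, StructType, StringType, TimestampType

spark = SparkSession.builder.appName("event-ingestion-sketch").getOrCreate()

# An explicit schema keeps the stream fast and surfaces malformed events early.
event_schema = StructType([
    StructField("event_id", StringType(), nullable=False),
    StructField("event_type", StringType(), nullable=False),
    StructField("occurred_at", TimestampType(), nullable=False),
    StructField("payload", StringType(), nullable=True),  # raw JSON body
])

events = (
    spark.readStream
    .schema(event_schema)
    .json("s3://example-bucket/event-logs/")  # hypothetical source prefix
)

(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .trigger(processingTime="1 minute")  # keeps end-to-end latency well under 5 minutes
    .start("s3://example-bucket/delta/events/")  # hypothetical target table path
)
```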
This work is central to unlocking productivity across the business. Success looks like an
analytics foundation that’s predictable, fast, and easy to extend—powering everything from
dashboards to ML models. As the platform scales, there will be opportunities to extend this
foundation—designing for higher data volumes, streaming complexity, and broader cross-
team data needs.
Who we're looking for
You’re a strong engineer who prioritises fundamentals and performance. You deeply understand
database internals, query optimisation, and distributed systems. You're not tied to specific
tech stacks - your technical depth and ability to reason from first principles matter more
than specific experience with particular tools. This isn’t a role for someone looking
to grow into seniority - it’s for someone already operating at a high level of technical autonomy.
You have:
- Excellent SQL skills and deep understanding of databases (indexing, query optimisation, internals).
- Strong coding skills in Python, Clojure or Scala; your code is clean, efficient, and production-ready.
- Experience with distributed data processing technologies like Spark.
- A genuine interest in understanding the business and carefully modelling data for clarity and performance.
- A rigorous, analytical mindset—you identify bottlenecks proactively and design solutions thoughtfully.
This role suits someone with strong end-to-end design skills. You think clearly about
architecture, weigh trade-offs carefully, and implement pragmatic, well-reasoned solutions.
You’ll have real autonomy in how you approach problems—including the freedom to reshape
parts of the stack—while being expected to explain and motivate your decisions with clarity.
You enjoy working closely with both engineers and business stakeholders, and see
collaboration as essential to building systems that are both technically sound and practically
useful.
Bonus points
- Interest in functional programming and experience with Clojure, Scala, or related ecosystems.
- Track record of building clean systems in high-ownership environments like startups or small technical teams.
- Experience working with Databricks, Delta Lake, and AWS.
- Familiarity with Terraform or similar infrastructure-as-code tools.
Working with us
LifeCheq values technical depth, autonomy, and clear thinking. You’ll be part of a small,
capable team that enjoys solving hard problems together and takes pride in doing things
properly. It’s a collaborative, engineering-driven environment where well-reasoned decisions
carry weight and where your ideas will be taken seriously. There’s plenty of room to shape
systems and standards—so long as they’re driven by sound reasoning and a clear-eyed
view of trade-offs.
This is a fully remote role, and we ask for availability during our core hours (10:00 - 16:00
GMT+2).
BI Data Engineer (Remote)
Posted 11 days ago
Job Description
A leading Travel Tech company in Cape Town is seeking a BI Data Engineer to optimize its data infrastructure and turn raw data into actionable insights that drive strategic decisions. You will manage the full data pipeline—from ingestion and transformation to reporting and visualization—while leveraging your skills in predictive analytics and data modeling to enhance business outcomes. Working closely with Product squads and Account Managers, you’ll bridge gaps in processes and systems to strengthen the data ecosystem. A relevant degree and 5+ years of experience in BI, data engineering, or a similar role are required.
DUTIES:
- Manage, monitor, and optimize BI / Data infrastructure costs through GCP (a brief sketch of one approach follows this list).
- Lead and improve the end-to-end data pipeline process, from data ingestion and transformation to reporting systems, automating as much as possible.
- Introduce AI tools and methodologies across the data pipeline to improve efficiencies, data monitoring, data quality and overall business value delivered.
- Collaborate with peers and stakeholders to understand and translate business requirements into robust data solutions.
- Design, develop, and maintain data models, including machine learning models, ensuring data accuracy, reliability, availability, and evolution.
- Use reporting tools such as Power BI to create and automate comprehensive data visualizations and reports.
- Manage Google Analytics and Google Tag Manager accounts to drive decisions for web-based platforms.
- Communicate actionable insights with proactive recommendations using statistical techniques.
- Collaborate with stakeholders to identify new opportunities for data insights and develop strategies to meet business goals and drive more value to their clients.
- Create and maintain reporting templates and dashboards for internal and external audiences.
- Independently identify, troubleshoot, and resolve data-related issues, ensuring a seamless data flow and maintaining high data quality and accuracy standards.
- Take ownership of data-related decisions and provide guidance on future data-specific hiring needs as well as tools.
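As a loose illustration of the cost-monitoring duty at the top of this list, one common pattern is to query the GCP billing export from BigQuery. The project, dataset, and table names below are hypothetical placeholders; the standard billing-export columns (cost, service.description, usage_start_time) are assumed.

```python
# Sketch: summarise the last 30 days of GCP spend per service from the
# billing export. Project/dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
    SELECT
      service.description AS service,
      ROUND(SUM(cost), 2) AS total_cost
    FROM `example-project.billing.gcp_billing_export_v1_XXXXXX`
    WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY service
    ORDER BY total_cost DESC
"""

# Print a simple per-service cost summary; in practice this would feed a
# Power BI report or an alerting job rather than stdout.
for row in client.query(query).result():
    print(f"{row.service}: {row.total_cost}")
```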
Cloud Data Engineer (Remote)
Posted 11 days ago
Job Description
Join our team of innovative Developers, QAs, and DevOps specialists using frameworks and approaches like Scrum and Kanban, along with the newest technology in cloud development to provide innovative solutions to customers’ needs, and to build our own products for the global market.
Do you want to be part of a dynamic team constantly challenging the status quo and finding "smarter" ways of doing things? If you answered yes, keep reading.
Who Are We?
1Nebula is a next-generation SaaS Technology Business focused on providing businesses with cloud & technology expense management services and tools to accelerate their cloud journey.
We hire amazing people from across South Africa, making our diverse group of team members, called N48Xers, a force to be reckoned with.
Job Opportunity at 1Nebula
The purpose of this position is to ensure that the organization’s data is properly managed, stored, and used to support the organization’s business goals and objectives.
What You Will Do (Your key responsibilities):
- Design and develop cloud-based data strategies, architectures, and solutions that enable the organization’s business objectives.
- Lead cloud-based data initiatives to ensure scalability, availability, performance, and security of data.
- Collaborate with stakeholders to develop cloud-based data solutions that meet their requirements.
- Develop and review cloud-based data solutions to ensure compliance with enterprise standards and best practices.
- Monitor and manage cloud-based data solutions to ensure their optimal performance.
- Work with other IT teams to coordinate the integration of cloud-based data solutions with other existing systems.
- Research and recommend new technologies and approaches to enhance the organization’s cloud data architecture.
- Train and mentor junior cloud data architects.
- Assist in the preparation of project plans, cost estimates, and timelines for cloud-based data initiatives.
- Monitor industry trends and best practices for cloud-based data solutions.
- Architect and maintain advanced data documentation and metadata.
- Lead the development of sophisticated machine learning and AI models.
Although we believe in the potential of others more than anything else, there are some minimum requirements we look for, so consider these before applying:
- A Bachelor’s Degree (minimum 3 years) in Computer Science, Business, or a similar field.
- Minimum of 3 years of relevant experience in modern ETL pipeline and Data Warehouse development.
- Experience using Azure Data Factory / Azure Synapse for ETL development.
- Knowledge of programming languages.
- Data engineering certification.
What we offer:
- 20 leave days a year plus a “mulligan day” each quarter after meeting all your deliverables. That’s 4 extra leave days annually!
- High-spec laptops and equipment for remote work.
- A one-off at-home office allowance.
- Flexible working hours from Monday to Friday.
- Access to our Well-being program and Employee support services.
- Paid Microsoft courses and certifications to grow your skills.
- Training allowance every 2 years for non-work hobbies.
- Participation in our Culture and Social Responsibility initiatives.
- Free Gap cover and a relaxed/hybrid work environment.
Data Engineer (Remote)
Salary: Market related
Posted 10 days ago
Job Description
A leading FinTech company seeks to fill the role of a Data Engineer with a knack for data analysis to join its team. It is a remote role that includes data manipulation, modelling, and being responsible for the understanding of inter-dependencies between various data sources and business processes involving data flow. You will also be involved in designing and building data management solutions using the software and other tools.
DUTIES:
- The configuration and implementation of the company’s software at various clients.
- Understand the business requirements of clients, the focus being the research and investment process of these Asset Managers.
- Construct end-to-end data service solutions.
- Liaise and interface with clients in a support role, providing 2nd Tier support and enhancement services.
- Understand and manage the client’s data requirements, the data being specific to the financial markets.
- Contribute towards a team that develops, constructs, tests, and maintains architectures (such as databases and large-scale processing systems).
- Ensure data architecture will support the requirements of the client’s business.
- Employ a variety of languages and tools (e.g., scripting languages) to marry systems together.
- Recommend ways to improve data reliability, efficiency, and quality.
- Automate work by using process flow tools.
- Provide feedback to the Development team regarding new functionality and issue logging.
- Creation of user interfaces allowing users to upload their own data.
REQUIREMENTS:
Qualifications –
- Tertiary degree in BSc Computer Science, B.IT or Informatics related Degrees, Mathematics, Applied Mathematics, Actuarial Science or an Engineering Degree.
Experience/Skills -
- Understanding and working experience in data integration and transformation.
- Information and Technology services industry experience.
- Data Analysis, modelling and surfacing.
- Data Cleaning / Integrity Checking.
- SQL, SSIS, database scripting (stored procedures, user-defined functions, queries, triggers).
- Present information using data visualization techniques (such as QlikView, Power BI and Tableau).
- Experience of creating reports using Excel or equivalent.
- Iterative Testing including debugging and refactoring.
- Constructing data queries by combining multiple data sources.
Advantageous –
- Some experience in a programming language (such as Python).
- Experience of consuming APIs.
- Any sort of ETL or Data Warehousing knowledge.
- Statistical languages (such as R and MATLAB).
- Any experience within Asset Management and Financial Services.
ATTRIBUTES:
- Client oriented.
- Good at problem solving (core to the role).
- Enjoys supporting technical implementations.
- Able to build strong networks and relationships, internally and with clients.
- Can effectively multi-task and manage conflicting priorities, coupled with good attention to detail.
- Aptitude for providing timely and accurate responses to client inquiries and managing client expectations.
- A passion to learn and extend knowledge outside of the work sphere.
- The ability to self-manage and self-motivate.
- Capable of identifying, embracing and initiating change in an agile and fast-paced environment.
While we would really like to respond to every application, should you not be contacted for this position within 10 working days please consider your application unsuccessful.
Data Engineer (Remote Position) - Cape Town, Western Cape
Posted 10 days ago
Job Description
- Tertiary degree in BSc Computer Science, B.IT or Informatics related degrees, Mathematics, Applied Mathematics, Actuarial Science or an Engineering degree
- Understanding and working experience in data integration and transformation
- Data analysis, modelling and surfacing
- Data cleaning / Integrity checking
- Experience of creating reports using Excel or equivalent
- SQL, SSIS, database scripting (stored procedures, user-defined functions, queries, triggers)
- Iterative testing including debugging and refactoring
- Constructing data queries by combining multiple data sources
- Present information using data visualization techniques (such as QlikView, Power BI and Tableau)
- Experience of consuming APIs (advantageous)
- Some experience in a programming language (advantageous)
- Any sort of ETL or Data Warehousing knowledge (advantageous)
- Statistical languages (such as R and MATLAB) (advantageous)
- Information and technology services
- Asset management and financial services (distinct advantage)
- A passion to learn and extend knowledge outside of the work sphere
- Good at problem solving (core to the role)
- The ability to self-manage and self-motivate
- The ability to communicate clearly with clients and the team
- Be a Team player
- Be adaptable
- Be able to efficiently and effectively plan and structure tasks
- Thrive in an agile environment
This is an exciting opportunity for a highly analytical person with a knack for data analysis. The role involves data manipulation and modelling, and responsibility for understanding the inter-dependencies between various data sources and the business processes involving data flow. The Data Engineer will also be involved in designing and building data management solutions, with elements of integrating with data science tools that allow business users to visualise their data.
Duties & Responsibilities:
- The configuration and implementation of the company’s software at its various clients.
- Understand the business requirements of clients, the focus being the research and investment process of these Asset Managers.
- Construct end to end data service solutions.
- Liaise and Interface with clients in a support role, providing 2nd Tier support and enhancement services.
- Understand and manage the client’s data requirements, the data being specific to the financial markets.
- Contribute towards a team that develops, constructs, tests and maintains architectures (such as databases and large-scale processing systems).
- Ensure data architecture will support the requirements of the client’s business.
- Employ a variety of languages and tools (e.g. scripting languages) to marry systems together.
- Recommend ways to improve data reliability, efficiency and quality.
- Employ sophisticated analytics and statistical methods to prepare data for use in prescriptive modelling.
- Automate work by using process flow tools.
- Provide feedback to the Development team regarding new functionality and issue logging.
- Creation of user interfaces allowing users to upload their own data.
Data Scientist / ML Engineer Cape Town, SA / Remote
Posted 11 days ago
Job Description
Location: Cape Town, SA / Remote
Employment Type: Full-time
Salary: Competitive + Share Options
Reporting To: Lead Data Scientist
Clairo AI is a London-based AI agent platform focused on fully private AI. Our platform enables businesses to deploy AI agents that connect to their internal data sources while ensuring complete privacy—without relying on external APIs.
We operate on our own bare-metal infrastructure and have developed a Kubernetes platform to deploy our AI models, microservices, and enterprise applications in a scalable manner. Our AI stack is designed to solve complex business challenges across data search, automation, and AI-powered decision-making.
Following a successful funding round, we are expanding our engineering and data science teams, hiring four to five new team members.
At Clairo AI, we embed Responsible AI into everything we do:
- Privacy: We prioritise data security and client confidentiality.
- Sustainability: Our Icelandic data centre enables us to run AI infrastructure with net-zero emissions.
- Trust: We ensure AI reliability through trust metrics, citations, and better prompting.
- Education: We actively educate businesses on how to use AI effectively and responsibly.
Join us in redefining AI for business—private, secure, and high-performance.
We are looking for a Data Scientist / ML Engineer to enhance the agentic capabilities of Clairo AI. This role involves building out Clairo’s AI agent framework.
You will work under the Lead Data Scientist, who is spearheading our AI vision. Your primary responsibility will be building, experimenting, and testing AI agent components to determine their effectiveness in various applications. These components will be deployed within Clairo’s Kubernetes stack for both customer solutions and Clairo’s SaaS platform.
If you are passionate about building AI-powered agentic systems, experimenting with cutting-edge agent SDKs, and deploying scalable AI models, this role will provide deep technical challenges and real-world impact.
- Develop and optimise AI agentic components within Clairo’s AI platform.
- Generate hypotheses, experiment, and test new AI techniques to evaluate their effectiveness in real-world applications.
- Contribute to and maintain the Agent code framework. Be a gatekeeper for code quality and design philosophy.
- Deploy AI models and components into Clairo’s Kubernetes stack, ensuring scalability and reliability.
- Work closely with the data science and engineering teams to drive innovation in Clairo’s AI agent framework.
- Contribute to MLOps best practices, ensuring efficient AI model lifecycle management.
- Strong grasp of general software engineering skills: version control, code orchestration, design patterns, Linux terminals, virtualisation technologies (Docker & K8s).
- Familiarity with Natural Language Processing (NLP) concepts: tokens, vectorisation, embeddings, semantic vs. syntactic meanings.
- Familiarity with LangChain or other agent SDKs/frameworks.
- Understanding of AI agent architectures and how to build modular AI systems.
- Strong problem-solving skills and ability to work in an experimental, research-driven environment.
- Strong grasp of traditional machine learning model paradigms and concepts (Supervised vs. Unsupervised, Tree models, optimisation, etc.).
- Understanding of retrieval-augmented generation (RAG), vector databases, and knowledge retrieval systems (a toy retrieval sketch follows this list).
- Experience with Hugging Face and fine-tuning LLMs.
- Experience with Crew AI (big plus).
- Ability to deploy AI models into Kubernetes-based infrastructure.
- Experience with Knowledge Graphs.
- Background in multi-agent AI architectures.
- Familiarity with private AI infrastructure and security best practices.
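To make the RAG and vector-database items above concrete, here is a deliberately minimal sketch of the retrieval step: rank documents by cosine similarity between embeddings and pass the best matches to an LLM as context. The embeddings are random stand-ins; a real system would use an embedding model and a vector database, and nothing here reflects Clairo's actual stack.

```python
# Toy RAG retrieval step: rank documents by cosine similarity to a query
# embedding. Embeddings are random stand-ins for a real embedding model.
import numpy as np

rng = np.random.default_rng(seed=0)

documents = [
    "Quarterly revenue grew 12% year on year.",
    "The onboarding flow has three steps.",
    "Support tickets are triaged within 4 hours.",
]

# Stand-in embeddings: one random unit vector per document.
doc_vecs = rng.normal(size=(len(documents), 8))
doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)

def retrieve(query_vec: np.ndarray, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings best match the query."""
    query_vec = query_vec / np.linalg.norm(query_vec)
    scores = doc_vecs @ query_vec          # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]     # indices of the k best matches
    return [documents[i] for i in top]

# In a real pipeline the query vector comes from the same embedding model;
# the retrieved snippets would be inserted into the LLM prompt as context.
print("Context for the LLM:", retrieve(rng.normal(size=8)))
```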
- Competitive salary with share options.
- Opportunity to relocate to the UK – After one year, Clairo AI will discuss visa sponsorship for candidates who wish to move to the UK.
- Unlimited leave – Take time off whenever you need it.
- Hybrid and flexible working – Work when and where you want, as long as the work gets done.
- Client project opportunities – Work directly with UK businesses, designing AI agents to solve complex challenges.
- Networking with top UK entrepreneurs – Clairo AI is closely connected to leading business figures, providing strong career-building opportunities.
- Cape Town team culture fund – Sponsored beer money for social gatherings with the Cape Town-based team.
- Future office space in Innovation City, Cape Town – A growing tech hub set to become Africa’s equivalent of Silicon Valley.