52 Data Solutions jobs in South Africa

Integration & Data Solutions Specialist

TreasuryOne

Posted today

Job Description

Reporting to the Senior IT Manager, the Integration & Data Solutions Specialist will design, implement, and maintain system integrations and data flows across the business. This role combines hands-on technical expertise with data-driven problem solving: building secure, automated connections between platforms (e.g. Flowgear, APIs, internal systems) while also supporting data analysis and reporting through Power BI.

Systems Integration

  • Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
  • Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency (a minimal sketch follows this list).
  • Build and support secure APIs and data pipelines between internal and client systems.
  • Troubleshoot integration issues and liaise with vendors when required.
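
Flowgear flows are configured in its own visual designer, but the generic scripting half of the role is easy to illustrate. Below is a minimal sketch of such an automation script in Python: it pages through a hypothetical REST API and stages the records for a downstream load. The endpoint, pagination flag, and API-key variable are invented for illustration; this is not TreasuryOne code.

```python
"""Minimal integration sketch: pull JSON from a (hypothetical) source API
and stage it as newline-delimited JSON for a downstream load step."""
import json
import os

import requests  # third-party: pip install requests

API_URL = "https://api.example.com/v1/transactions"  # hypothetical endpoint
API_KEY = os.environ.get("SOURCE_API_KEY", "")       # never hard-code secrets


def fetch_page(page: int) -> dict:
    """Fetch one page of results, failing loudly on HTTP errors."""
    resp = requests.get(
        API_URL,
        params={"page": page},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def main() -> None:
    page, rows = 1, []
    while True:
        payload = fetch_page(page)
        rows.extend(payload.get("items", []))
        if not payload.get("has_more"):  # assumed pagination flag
            break
        page += 1
    # Stage as NDJSON so the next step (e.g. a bulk database load) can stream it.
    with open("transactions.ndjson", "w", encoding="utf-8") as fh:
        for row in rows:
            fh.write(json.dumps(row) + "\n")
    print(f"Staged {len(rows)} rows")


if __name__ == "__main__":
    main()
```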

Data & Analytics

  • Support data transformation and reporting initiatives across departments.
  • Develop dashboards and reports in Power BI for internal teams and clients.
  • Assist with data modelling and ensuring data quality within reporting solutions.
  • Collaborate with finance and operations teams to deliver actionable insights.

Innovation & Continuous Improvement

  • Research and evaluate new technologies for integration and analytics.
  • Drive automation and process optimisation across IT operations.
  • Contribute to the IT strategy by recommending best practices for data flow and reporting.

Requirements

Education

  • Bachelor's Degree in IT, Computer Science, or related field.

Experience

  • 3-5 years' experience in system integration, scripting, or related roles.
  • Proficiency in scripting languages (PowerShell, Python, SQL).
  • Experience with Power BI or other BI tools.
  • Knowledge of APIs, JSON, and data structures.
  • Strong analytical mindset with problem-solving ability.
  • Good communication skills to work across business units.

Work Environment

  • Hybrid of integration engineering and data analytics.
  • Hands-on technical role with opportunities to contribute to business intelligence initiatives.
  • Collaboration with IT, Finance, and Operations teams.

Join TreasuryONE for a rewarding career path

Data Solutions Architect

R1200000 - R2400000 · PBT Group

Posted today

Job Description

Employment Type: Contract

Experience: 8 to 30 years

Salary: Negotiable

Job Published: 08 October 2025

Job Reference No.

Job Description

We're seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You'll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.

A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.

Tech Stack

  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities; see the retrieval sketch after this list)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
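
To make the stack above concrete, here is a rough sketch of how the retrieval step of such a RAG system might look in Python: embed a question with Azure OpenAI, then run a hybrid vector query against Azure AI Search. All endpoints, key variables, index and field names are placeholders, and the calls follow the public openai and azure-search-documents packages; treat it as an illustrative sketch, not the project's code.

```python
"""Sketch of a RAG retrieval step: embed a question, then vector-search the
index that Azure Data Factory keeps in sync with Blob Storage."""
import os

from openai import AzureOpenAI                    # pip install openai
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient   # pip install azure-search-documents
from azure.search.documents.models import VectorizedQuery

# All endpoints, keys, index and field names below are placeholders.
aoai = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)
search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="documents",                        # assumed index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)


def retrieve(question: str, k: int = 5) -> list[str]:
    """Embed the question and return the k nearest document chunks."""
    emb = aoai.embeddings.create(
        model="text-embedding-3-large",            # deployment name assumed
        input=question,
    ).data[0].embedding
    results = search.search(
        search_text=question,                      # hybrid: keyword + vector
        vector_queries=[VectorizedQuery(
            vector=emb, k_nearest_neighbors=k,
            fields="contentVector")],              # assumed vector field
        top=k,
    )
    return [doc["content"] for doc in results]     # assumed content field


if __name__ == "__main__":
    for chunk in retrieve("What does the claims report say about fraud?"):
        print(chunk[:120], "...")
```

An answer-generation step (passing the retrieved chunks to a chat model orchestrated via Azure AI Foundry) would sit on top of this retrieval layer.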

Key Responsibilities

  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.

Required Skills & Experience

  • 7+ years' experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.

Why Join

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You'll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

  • In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.

  • If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.

Skills

Data Engineering, Data Architecture, Enterprise Architecture, Microsoft Azure, SQL, Python, Data Warehousing

Industries

Insurance, Financial Services

Network Data Solutions Delivery Engineer

R900000 - R1200000 · P.I. Works Inc

Posted today

Job Description

P.I. Works is the first company in the world to automate the management of a commercial 5G network.

Our automated network management products and services empower mobile operators to accelerate network transformation and to drive network quality and efficiency on the path to 5G. These solutions have been deployed across more than 84 mobile operators in 58 countries around the world.

Our success is built by our people, who strive for excellence and deliver great value to our customers.

At the core of our success is a culture built on integrity, trust, and respect. We are looking for a Network Data Solutions Delivery Engineer to lead mobile network product delivery projects, with a focus on next-generation optimization techniques using Network Automation Solutions, acting as a single point of contact in full engagement with our customer.

Job Description:

  • Work closely with customer mobile network teams for network performance monitoring and reporting activities
  • Provide onsite support for customer and feedback to product, integration and support teams
  • Create custom dashboards and automated reports on key business metrics for various use cases
  • Participate in technical meetings, monitor product performance, track customer demands
  • Prepare and issue technical guidelines and results of technical studies
  • Prepare and issue engineering-level and executive-level reports

Required Qualifications:

  • Diploma from an engineering discipline relevant to mobile telecommunications or computer science
  • Basic knowledge of telecom domain KPIs and parameters (radio, core, transmission)
  • Good level of SQL and ETL platform knowledge, able to write complex SQL queries (a toy example follows this list)
  • Experience with tools in performance management (Optima/Helix, PrOptima, PRS, Eniq, etc.) is a plus
  • Very good command of English
  • Excellent customer relationship skills, with strong verbal and written communication
  • Experience with OEM vendor EMS/OSS systems and Managed Objects, parameters and measurements is a plus
  • Reporting / BI Tools experience (PowerBI, Tableau, Qlikview, etc.) is a plus
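
As a toy illustration of the kind of KPI query the SQL bullet above refers to (the schema, counters, and figures are invented, and nothing here is P.I. Works code), the following self-contained snippet computes an hourly drop-call rate per cell with an hour-over-hour comparison:

```python
"""Toy KPI query: hourly drop-call rate per cell, with hour-over-hour delta.
Schema and data are invented; runs on stdlib sqlite3 (SQLite 3.25+ for
window functions, bundled with modern Python builds)."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cell_counters (
    cell_id TEXT, hour TEXT, call_attempts INT, dropped_calls INT);
INSERT INTO cell_counters VALUES
  ('C1', '2025-01-01 09:00', 1200, 12),
  ('C1', '2025-01-01 10:00', 1500, 30),
  ('C2', '2025-01-01 09:00',  800,  4),
  ('C2', '2025-01-01 10:00',  950,  5);
""")

query = """
WITH hourly AS (
  SELECT cell_id, hour,
         100.0 * dropped_calls / call_attempts AS drop_rate_pct
  FROM cell_counters
)
SELECT cell_id, hour, ROUND(drop_rate_pct, 2) AS drop_rate_pct,
       ROUND(drop_rate_pct
             - LAG(drop_rate_pct) OVER (PARTITION BY cell_id ORDER BY hour),
             2) AS delta_vs_prev_hour
FROM hourly
ORDER BY cell_id, hour;
"""
for row in conn.execute(query):
    print(row)  # first hour per cell has no previous hour, so delta is None
```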

Lead Developer: Data Architecture

R600000 - R1200000 · Work Mosadi Solutions

Posted today

Job Description

Lead Developer

3 Months Renewable Contract

Available Immediately

Banking

Role Overview

We are seeking a highly skilled Lead Developer with strong expertise in data architecture, data engineering, data modelling, and SQL to join our team in the banking sector. The successful candidate will play a critical role in designing, building, and optimizing scalable data solutions while providing technical leadership and mentorship to a junior team of developers and analysts. This role requires both hands-on technical expertise and the ability to guide and upskill a growing team.

Key Responsibilities

  • Lead the design, development, and implementation of robust data architectures to support banking applications and analytics.
  • Develop, optimize, and maintain ETL/ELT pipelines, data warehouses, and data lakes.
  • Drive the design and enforcement of data modelling standards to ensure accuracy, consistency, and scalability across systems.
  • Write efficient and optimized SQL queries, stored procedures, and database scripts for complex data processing.
  • Collaborate with stakeholders across business and IT to translate requirements into scalable technical solutions.
  • Ensure data solutions are compliant with regulatory, governance, and security requirements in the banking sector.
  • Provide hands-on technical leadership, code reviews, and best practices to elevate the technical quality of deliverables.
  • Mentor, coach, and upskill a junior team of developers and analysts, fostering a culture of knowledge sharing and continuous improvement.
  • Stay current with industry trends, emerging technologies, and best practices in data engineering and architecture.

Key Skills & Competencies

  • Strong leadership and mentorship abilities, with proven experience developing junior talent.
  • Excellent communication and stakeholder management skills, with the ability to explain complex technical concepts to non-technical audiences.
  • Strong problem-solving skills, analytical thinking, and attention to detail.
  • Ability to work under pressure in a regulated and fast-paced banking environment.

Technical Requirements

  • Proven expertise in SQL development (query optimization, stored procedures, performance tuning).
  • Strong knowledge of data architecture and data modelling principles (relational, dimensional, and NoSQL).
  • Experience with data engineering frameworks and technologies (e.g., Python, Spark, Kafka, Airflow, or similar).
  • Experience designing and managing data warehouses and/or data lakes (e.g., Snowflake, Redshift, BigQuery, or equivalent).
  • Proficiency in ETL/ELT design and implementation.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
  • Strong understanding of data governance, data quality, and security practices within financial services.

Qualifications & Experience

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or related field (Master's advantageous).
  • 8+ years of experience in data engineering, data architecture, and SQL development.
  • 3+ years in a leadership or senior developer role, with mentorship experience.
  • Prior experience in the banking or financial services sector is strongly preferred.

Data Integration

DLK Group

Posted today

Job Description

The role of the Senior Data Integration Analyst encompasses many activities, including (but not limited to):

  • Designing and implementing advanced data integration pipelines and ETL (Extract, Transform, Load) processes.
  • Managing complex integrations across multiple systems and platforms to ensure seamless data flow.
  • Collaborating with stakeholders to understand and define data integration requirements.
  • Overseeing data governance and ensuring data integrity throughout integration processes.
  • Mentoring and providing technical guidance and support.
  • Troubleshooting and optimizing integration workflows for performance and reliability.

Requirements

Minimum Qualification:

A degree in an Information and Communication Technology (ICT) field, including (but not limited to) Computer Science, Software Engineering, or Information Systems.

Minimum Experience:

  • Minimum of 5 years' experience working with SQL, ETL, and APIs; 3 years in data integration and an agile-scrum environment within enterprise organisations of more than 1,000 users.
Data Science Manager: Telemetry Solutions

Midrand, Gauteng · R104000 - R130878 · MiWay Insurance Limited

Posted today

Job Description

Who are we?
We made MiWay to change the way insurance works, to make it work your way, protecting what matters to you and making life a whole lot easier.

Why MiWay?
We know what it's like to be in an early morning accident on the way to work, to watch helplessly as water floods your business premises, or to have your most precious possessions stolen, to lose a truckload of costly stock, or to be stranded a long way from home.

So, we created products that put you first, that help you get back on your feet, that make a difference when you're down and dealing with a loss. That's insurance the way it should be. Insurance your way.

That's MiWay. You may want to be part of that. Insurance the way it should be. Built around you. It's about you - not about us.

Why Join Us?
This is a rare opportunity to lead at the frontier of actuarial innovation - where data science, machine learning, and software engineering create real-world impact. As Data Science Manager: Telemetry Solutions, you'll help shape the future of insurance.

  • Lead actuarial and data science initiatives across telematics, claims, and operational intelligence.
  • Work with rich, real-time datasets from connected vehicles and IoT platforms to unlock insights and build predictive systems.
  • Collaborate with cross-functional teams - including product, development, and claims - to integrate analytics into key decision-making processes.
  • Apply and grow your technical skills across AI, machine learning, and production-grade systems in a dynamic environment.
  • Join a forward-looking team that values innovation, ownership, and continuous learning.

If you're technically strong, strategically curious, and passionate about using data science to tackle bold, modern challenges - this is your opportunity to lead the change.

What will you do?
The Data Science Manager: Telemetry Solutions is a future-focused leadership role that bridges actuarial science, data science, machine learning, and engineering. You will lead a multidisciplinary team that leverages telematics and alternative data to deliver intelligent solutions across usage-based insurance (UBI), claims innovation, product development, and dynamic risk assessment.

This role offers a unique opportunity to operate at the intersection of data, strategy, and technology - driving meaningful transformation in how insurance products are developed, risks are understood, and operations are optimised.

Key Responsibilities

  • Architect and deploy scalable, real-time systems for ingesting, processing, and analysing high-frequency vehicle sensor and telemetry data.
  • Develop and maintain robust APIs and automation tools in Python and C# to support internal platforms and customer-facing applications.
  • Oversee the creation and deployment of AI models - including computer vision and large language models (LLMs) - to enhance claims automation and document intelligence.
  • Champion innovation within the actuarial domain by blending traditional methodologies with modern MLOps, software engineering, and streaming analytics.
  • Manage end-to-end machine learning pipelines using tools like MLflow, ensuring reproducibility, governance, and performance monitoring.
  • Collaborate closely with Product, IT, Data Engineering, and Claims teams to embed intelligent analytics into core business processes and customer experiences.
  • Mentor and grow a high-performing team of data scientists and software engineers.
  • Provide technical insight and data-driven narratives to support strategic decision-making through dashboards, models, and actuarial reporting.

Qualifications

  • Degree in Actuarial Science, Data Science, Statistics, Computer Science, or a related field (Honours or Master's preferred).

Experience

  • Minimum of 5 years' experience in data science or data engineering roles, preferably in a telematics-driven insurance environment.
  • Proficient in Python and C#, with a solid foundation in object-oriented programming and API development.
  • Advanced SQL skills, including data processing, database management, and query optimisation (Essential).
  • Practical experience using Git for version control and collaboration (Essential).
  • Hands-on experience with machine learning libraries and frameworks such as XGBoost, spaCy, Hugging Face, TensorFlow, or PyTorch.
  • Proven ability to deploy and monitor models using MLflow and related MLOps tools (a generic sketch follows this list).
  • Experience with large-scale datasets, ideally from IoT or telematics sources.
  • Working knowledge of Docker for containerisation (Intermediate to Advanced – Advantageous).
  • Exposure to messaging systems such as RabbitMQ (Advantageous).
  • Strong understanding of software engineering principles, including object-oriented design and clean code practices.
  • Solid business acumen with the ability to translate technical work into strategic outcomes.
  • Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • Demonstrated leadership experience and a passion for mentoring cross-functional teams.
  • Strong analytical and problem-solving skills, with the ability to extract insights from high-volume, high-dimensional data.
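
For the MLflow bullet above, here is a small generic sketch of the track-and-register workflow: a synthetic dataset and a scikit-learn model stand in for a real telemetry model, and the experiment and metric names are invented. It illustrates the pattern, not MiWay's pipeline.

```python
"""Minimal MLflow tracking sketch: train, log params/metrics, log the model.
Dataset, model, and names are placeholders; pip install mlflow scikit-learn."""
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("telemetry-risk-demo")  # hypothetical experiment name
with mlflow.start_run():
    params = {"n_estimators": 200, "learning_rate": 0.05}
    model = GradientBoostingClassifier(**params).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    mlflow.log_params(params)                  # reproducibility
    mlflow.log_metric("auc", auc)              # performance monitoring
    mlflow.sklearn.log_model(model, artifact_path="model")  # governed artifact
    print(f"AUC={auc:.3f}")
```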

Knowledge And Skills

  • Actuarial Problem Solving
  • Issues management
  • Business knowledge
  • Business analysis

Personal Attributes

  • Self-development - Contributing independently
  • Interpersonal savvy - Contributing independently
  • Nimble learning - Contributing independently
  • Tech savvy - Contributing independently

Core Competencies

  • Cultivates innovation - Contributing independently
  • Customer focus - Contributing independently
  • Drives results - Contributing independently
  • Collaborates - Contributing independently
  • Being resilient - Contributing independently

Build a successful career with us
We're all about building strong, lasting relationships with our employees. We know that you have hopes for your future – your career, your personal development, and achieving great things. We pride ourselves on helping our employees to realise their worth. Through its five business clusters – Sanlam Fintech, Sanlam Life and Savings, Sanlam Investment Group, Sanlam Allianz, Santam, as well as MiWay and the Group Office – the group provides many opportunities for growth and development.

Turnaround time
The shortlisting process will only start once the application due date has been reached. The time taken to complete this process will depend on how far you progress and the availability of managers.

Deadline to apply: 19 September 2025.
Our commitment to transformation
At MiWay we believe in cultivating a positive and dynamic working environment that gives you freedom and opportunity to succeed. MiWay is committed to transformation and embracing diversity. This is what drives us to achieve a multicultural workplace with employment equity as a key goal to create an inclusive workforce, reflective of the demographics of our society.

Data Integration Analyst

R900000 - R1200000 · iLaunch

Posted today

Job Description

  • Design and implement advanced integration pipelines and ETL processes.
  • Manage complex integrations to ensure seamless data flow.
  • Collaborate with stakeholders to define and understand data integration requirements.

For senior roles:

  • Oversee data governance and data integrity.
  • Mentor in technical integration and troubleshoot to optimise integration performance.

Requirements

  • Degree in an ICT-related field (Computer Science, Information Systems).
  • 3 years' experience in ETL, SQL, and APIs, integration and analysis.
  • Experience working in a large enterprise with a headcount of at least 1,000 users and a multi-project environment.
  • 2-3 years' experience in web-based application environments and Microsoft Office Professional.
  • Experience defining and implementing product/integration requirements in a Sprint, Scrum/Agile environment.
  • Use of integration tools such as Azure Data Factory, Informatica, or Talend.

Experience: Between 3 - 5 Years

Data Engineering and Architecture Lead

R1200000 - R2400000 · X, bigly labs

Posted today

Job Description

At X, bigly labs, we're Dis-Chem's high-performance innovation hub, where bold ideas meet data, design, and radical customer focus. Our mission is simple: power the future of healthcare by lowering costs, improving outcomes, and unlocking new possibilities. We're driven by one big question: How do we use data + technology today to create healthier lives tomorrow? Here, we don't just imagine the future, we build it. From cutting-edge digital solutions to smarter, patient-focused experiences, we're reimagining health tech to make breakthroughs possible.

Welcome to X, bigly labs. This is healthcare, reimagined.

From retail and pharmacy to clinics, insurance and beyond, we're applying machine learning and smart systems to reimagine healthcare pathways and create personalised experiences at scale. Here, your work doesn't sit in a notebook or on a whiteboard. It becomes real. It shapes decisions. It improves lives. And we do it all the Bigly way, questioning, challenging, and building audaciously, together.

We are seeking a visionary Data Engineering and Architecture Lead to lead the design, development, and optimisation of our enterprise data ecosystem. This role is responsible for building scalable, secure, and high-performance data platforms that power analytics, AI, and operational excellence across Dis-Chem and its affiliated businesses. You will be the architect of our data assets, ensuring that our data infrastructure is not only robust and compliant, but also agile enough to support innovation in healthcare, retail, insurance, and beyond.

WHAT WE'RE LOOKING FOR?
Minimum

  • Bachelor's degree (or the equivalent) in Computer Science, Data Engineering, or related field
  • 7+ years of experience in data engineering or architecture, with 3+ years in a leadership role
  • Deep expertise in SQL, Python and/or Spark, cloud platforms (Azure or AWS), and cloud-native tools (e.g. Databricks)
  • Proven experience designing, implementing, and scaling enterprise data platforms using the Medallion Architecture
  • Strong understanding of data governance, security, and compliance frameworks
  • Excellent leadership, communication, and stakeholder engagement skills

Advantageous

  • Experience in healthcare, retail, or insurance data ecosystems
  • Familiarity with data mesh, lakehouse architecture and real-time data processing
  • Certifications in cloud architecture or data engineering

WHAT YOU WILL BE DOING?

  • Enterprise Data Architecture
  • Data Engineering Leadership
  • Platform Enablement & Innovation
  • Stakeholder Engagement & Governance
  • Financial & Vendor Management

WHO YOU ARE?

  • Structured thinking and systems problem-solving
  • Commercial fluency and ability to articulate value levers
  • Strategic clarity balanced with practical execution
  • Able to co-create solutions with technical teams
  • Influences without authority and facilitates decision forums
  • Drives initiatives independently with high standards of quality

Our values aren't just ideals, they're the through-lines in how we think, build, and make decisions that impact real lives. From bold experimentation in digital solutions to platforms built on integrity, we're shaping a culture designed for progress that lasts. It's a culture that designs for the future, asks better questions, and answers them with care, urgency, and systems that scale.

Think you've got the energy, the curiosity, and the guts? Stay close, bigly things are ahead.

Data Engineering and Architecture Lead

Johannesburg, Gauteng E-Merge IT Recruitment

Posted 10 days ago

Job Description

permanent

We’re building the next-generation data backbone for a dynamic organisation.

We create scalable, secure, cloud-first platforms that power analytics, AI, and business intelligence across the enterprise.

Currently searching for a Data Engineering and Architecture Lead with deep AWS Cloud expertise and a passion for solving complex data challenges.


Requirements:

  • Bachelor’s degree in computer science, information systems, or related field
  • 10+ years’ experience in data engineering or architecture
  • 5+ years leadership experience
  • 5+ years architectural experience
  • Proven experience with AWS services (S3, Glue, Redshift, Lambda, EMR, Athena; a minimal Athena sketch follows this list)
  • Expertise in SQL, Python, and ETL/ELT development
  • Knowledge of data modelling (Kimball, Data Vault) and data governance
  • Leadership experience managing technical teams
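
For a flavour of the AWS toolchain named above, here is a minimal sketch of submitting an Athena query over S3-staged data with boto3 and polling for completion. The region, database, table, and results bucket are all invented placeholders, and configured AWS credentials are assumed.

```python
"""Sketch: run an Athena query over data staged in S3 and poll for the result.
Names are invented; pip install boto3, with AWS credentials configured."""
import time

import boto3

athena = boto3.client("athena", region_name="af-south-1")  # assumed region


def run_query(sql: str, database: str, output: str) -> list[dict]:
    """Submit a query, wait for completion, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)  # simple polling; production code would back off
    if state != "SUCCEEDED":
        raise RuntimeError(f"Query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]


rows = run_query(
    "SELECT event_date, COUNT(*) FROM events GROUP BY event_date",  # invented table
    database="analytics",                    # invented database
    output="s3://example-athena-results/",   # invented results bucket
)
print(rows[:3])
```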

Responsibilities:

  • Lead design and implementation of enterprise data architecture and integration strategies
  • Build and manage scalable, high-performance data pipelines
  • Ensure data availability and quality to support analytics, BI, and AI initiatives
  • Collaborate with business and technology teams to translate requirements into solutions
  • Define best practices, enforce standards, and mentor a team of data engineers

The reference number for this position is GZ60852. It is a permanent position based in Melrose Arch, offering a cost-to-company salary of R1.8m per annum, negotiable on experience and ability. Contact Garth on or call him on to discuss this and other opportunities.

Are you ready for a change of scenery? E-Merge IT Recruitment is a specialist niche recruitment agency. We offer our candidates options so that we can successfully place the right developers with the right companies in the right roles. Check out the E-Merge website for more great positions.

Do you have a friend who is a developer or technology specialist? We pay cash for successful referrals!

Senior Data Integration Engineer

R2000000 - R2500000 · Indsafri

Posted today

Job Description

Job Description:

Job Title: Senior Data Integration Engineer (Salesforce, Databricks & MuleSoft)

Location: Johannesburg (Hybrid)

Employment Type: Contract

Contract Tenure: 6 to 12 months

Job Summary

We are seeking a highly experienced and strategic Senior Data Integration Engineer to architect, build, and manage the data pipelines that power our customer intelligence ecosystem. In this critical role, you will be the subject matter expert responsible for designing and implementing robust integrations between our core platforms: Salesforce Data Cloud, Databricks, and MuleSoft.

You will be responsible for creating a seamless flow of data, enabling advanced analytics, and empowering our business to activate real-time customer insights. The ideal candidate is a hands-on expert who can translate complex business requirements into scalable, secure, and high-performance technical solutions.

Required Skills & Experience

  • 6+ years of professional experience in a data engineering, integration development, or data architecture role.
  • Proven hands-on experience with MuleSoft: demonstrable expertise in designing, building, and managing APIs using the Anypoint Platform (API-led connectivity, DataWeave, connectors).
  • Strong proficiency in Databricks: hands-on experience developing data pipelines using PySpark, SQL, Delta Lake, and job orchestration (a schematic sketch follows this list).
  • Demonstrable experience with Salesforce Data Cloud: in-depth knowledge of its data model, ingestion methods (Connectors, Ingestion API), identity resolution, segmentation, and activation capabilities.
  • Expert SQL & Python skills: ability to write complex, efficient SQL queries and Python code for data manipulation and automation.
  • Solid understanding of data modeling principles and experience designing and working with ETL/ELT processes.
  • Experience working with major cloud platforms (AWS, Azure, or GCP).
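
To ground the Databricks bullet above, here is a schematic bronze-to-silver pipeline step in PySpark with Delta Lake. The paths, table names, and columns are invented, and a Delta-enabled runtime (such as Databricks) is assumed; it sketches the shape of the work, not the actual integration.

```python
"""Schematic Databricks pipeline step: land raw Salesforce exports in a Delta
bronze table, then publish a deduplicated silver table. All names invented;
assumes a Delta-enabled (e.g. Databricks) Spark runtime."""
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Bronze: keep the raw payload as-is, adding lineage metadata only.
raw = spark.read.json("/mnt/landing/salesforce/contacts/")  # hypothetical mount
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").saveAsTable("bronze.sf_contacts"))

# Silver: keep only the latest record per Salesforce Id.
latest_first = Window.partitionBy("Id").orderBy(F.col("_ingested_at").desc())
(spark.table("bronze.sf_contacts")
    .withColumn("_rn", F.row_number().over(latest_first))
    .where("_rn = 1")
    .select("Id", "Email", "LastModifiedDate")  # invented column subset
    .write.format("delta").mode("overwrite").saveAsTable("silver.sf_contacts"))
```

In the role as described, MuleSoft would typically feed the landing zone and Salesforce Data Cloud would consume the published tables; the sketch covers only the Databricks leg.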

Preferred Qualifications

  • Certifications: Salesforce Data Cloud Consultant, MuleSoft Certified Developer / Architect, Databricks Certified Data Engineer Professional
  • Experience with other Salesforce clouds (e.g., Marketing Cloud, Sales Cloud).
  • Knowledge of CI/CD and DevOps practices in a data context.
  • Familiarity with streaming data technologies (e.g., Kafka).