325 Data Storage Specialist jobs in South Africa

Data Solutions Architect

Cape Town, Western Cape PBT Group

Posted 7 days ago

Job Description

Data Solutions Architect job vacancy in Cape Town.

We’re seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution.

The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You’ll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview:

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.

A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.

Tech Stack:

  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
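
For illustration, here is a minimal sketch of how the ingestion side of the stack above might fit together: chunks of source documents are embedded with the Azure OpenAI text-embedding-3-large model and uploaded to an Azure AI Search index for vector / hybrid retrieval. The endpoint environment variables, index name, and field names are assumptions for the example, not details taken from this specification.

```python
# Minimal sketch: embed document chunks with Azure OpenAI and push them into an
# Azure AI Search index for vector / hybrid retrieval.
# Endpoints, keys, the index name, and the index schema are assumptions.
import os

from openai import AzureOpenAI
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="documents",                      # hypothetical index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

def index_chunks(chunks: list[dict]) -> None:
    """Embed text chunks and upload them to the search index."""
    texts = [chunk["content"] for chunk in chunks]
    embeddings = openai_client.embeddings.create(
        model="text-embedding-3-large",          # Azure OpenAI deployment name (assumed)
        input=texts,
    )
    docs = [
        {
            "id": chunk["id"],
            "content": chunk["content"],
            "content_vector": item.embedding,    # vector field defined on the index (assumed)
        }
        for chunk, item in zip(chunks, embeddings.data)
    ]
    search_client.upload_documents(documents=docs)

index_chunks([{"id": "1", "content": "Policy wording for product X ..."}])
```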

Key Responsibilities:

  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.

Required Skills & Experience:

  • 7+ years’ experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.

Why Join:

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You’ll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

Data Solutions Architect

R1200000 - R2400000 Y PBT Group

Posted today

Job Description

Employment Type: Contract

Experience: 8 to 30 years

Salary: Negotiable

Job Published: 08 October 2025

Job Reference No.

Job Description

We're seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You'll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.

A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.

Tech Stack

  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage

Key Responsibilities

  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.
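
One of the responsibilities above is designing and optimizing ETL/ELT workflows in Azure Data Factory. As a rough illustration of how such pipelines are commonly driven programmatically, the sketch below triggers a Data Factory pipeline run and polls its status with the Azure SDK for Python; the subscription, resource group, factory, pipeline, and parameter names are placeholders, not details from this specification.

```python
# Minimal sketch of kicking off and monitoring an Azure Data Factory pipeline run.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"        # hypothetical
RESOURCE_GROUP = "rg-data-platform"          # hypothetical
FACTORY_NAME = "adf-data-platform"           # hypothetical
PIPELINE_NAME = "ingest_documents"           # hypothetical

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger the pipeline, optionally passing runtime parameters.
run = adf.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"source_container": "raw-documents"},
)

# Poll until the run leaves the queued / in-progress states.
while True:
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline {PIPELINE_NAME} finished with status: {status}")
```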

Required Skills & Experience

  • 7+ years' experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.

Why Join

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You'll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

  • In order to comply with the POPI Act, we require your permission to retain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent.

  • If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.

Skills

Data Engineering, Data Architecture, Enterprise Architecture, Microsoft Azure, SQL, Python, Data Warehousing

Industries

Insurance, Financial Services

Data Solutions Architect

Cape Town, Western Cape PBT Group

Posted today

Job Description

Permanent

Data Solutions Architect job vacancy in Cape Town.

We’re seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You’ll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview:

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture. A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.

Tech Stack:

  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage

Key Responsibilities:

  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.

Required Skills & Experience:

  • 7+ years’ experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.

Why Join:

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You’ll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

AI & Data Solutions Architect

Gauteng, Gauteng InfyStrat

Posted 1 day ago

Job Description

Overview

Role: AI & Data Solutions Architect

Experience: 8-10 Years

Must-have skills
  • Strong understanding of AI/ML concepts
  • Proficiency in Python (or other relevant languages)
  • Experience with prompt engineering
  • Knowledge of LLMs and their applications
  • Familiarity with MLOps principles
  • Understanding of Nvidia H200 GPUs, including how to design, operate and manage them
  • Basic subject-matter expertise in telecom and prior work experience in the area
  • Deep understanding of Azure/AWS or GCP cloud platform
  • Expertise in relevant cloud services and tools
  • Knowledge of Cloud (AWS/Azure or GCP) data storage and processing services
Nice-to-have skills
  • Ruby on Rails
  • Python
  • Kubernetes, Docker
  • Java
  • Nvidia H200 GPU handling
  • Azure DevOps
  • Azure Cosmos DB
  • Docker & Kubernetes containerization
Key responsibilities
  • Design and implement GenAI solutions
  • Develop and optimize prompts
  • Build and deploy AI models
  • Ensure ethical considerations and responsible AI practices
  • Stay updated on the latest GenAI advancements
  • Integrate with various LLM services and other cloud AI and non-AI services
  • Utilize machine learning tools (such as Azure Machine Learning or similar) for model training and deployment
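
As a rough illustration of the prompt-engineering and LLM-integration work described above, the sketch below calls an Azure OpenAI chat deployment with a constrained system prompt and a templated user prompt. The endpoint, key, deployment name, and the telecom-flavoured prompt are illustrative assumptions only, not requirements of this role.

```python
# Minimal sketch of a prompt-engineering pattern against an Azure OpenAI chat deployment.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

SYSTEM_PROMPT = (
    "You are a telecom operations assistant. Answer only from the provided "
    "context and say 'not available' when the context does not cover the question."
)

def summarise_ticket(ticket_text: str) -> str:
    """Summarise a support ticket with a constrained, low-temperature prompt."""
    response = client.chat.completions.create(
        model="gpt-4o",                      # Azure deployment name (assumed)
        temperature=0.2,                     # keep outputs consistent and factual
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Summarise this ticket in three bullet points:\n{ticket_text}"},
        ],
    )
    return response.choices[0].message.content

print(summarise_ticket("Customer reports intermittent packet loss on site 42 ..."))
```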

AI & Data Solutions Architect

Johannesburg, Gauteng InfyStrat Software Services

Posted 4 days ago

Job Description

Job Description

Role: AI & Data Solutions Architect

Experience: 8-10 Years

Must Have Skills
  • Strong understanding of AI/ML concepts
  • Proficiency in Python (or other relevant languages).
  • Experience with prompt engineering.
  • Knowledge of LLMs and their applications.
  • Familiarity with MLOps principles.
  • Understanding of Nvidia H200 GPUs, including how to design, operate and manage them.
  • Basic subject-matter expertise in telecom and prior work experience in the area.
  • Deep understanding of Azure / AWS or GCP cloud platform.
  • Expertise in relevant cloud services and tools.
  • Knowledge of Cloud (AWS/Azure or GCP) data storage and processing services
Good To Have Skills
  • Ruby on Rails
  • Python
  • Kubernetes, Docker
  • Java
  • Nvidia H200 GPU handling
  • Azure DevOps
  • Azure Cosmos DB
  • Docker & Kubernetes containerization
Requirements

Key responsibilities:

  • Design and implement GenAI solutions.
  • Develop and optimize prompts.
  • Build and deploy AI models.
  • Ensure ethical considerations and responsible AI practices.
  • Stay updated on the latest GenAI advancements.
  • Integrate with various LLM services and other cloud AI and non-AI services.
  • Utilize machine learning tools (such as Azure Machine Learning or similar) for model training and deployment.

Integration & Data Solutions Specialist

Pillango Placements

Posted 11 days ago

Job Description

Key Duties & Responsibilities

Systems Integration
  • Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
  • Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency.
  • Build and support secure APIs and data pipelines between internal and client systems.
  • Troubleshoot integration issues and liaise with vendors when required.
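
As a rough illustration of the scripting and API-integration work described under Systems Integration, the sketch below pulls JSON records from one REST API and pushes a reshaped payload to another using Python and requests. Both endpoints, the authentication scheme, and the field mapping are hypothetical, not systems named in this role.

```python
# Minimal sketch of a point-to-point integration script: pull records from a
# source REST API, reshape the JSON, and push it to a target system.
import os

import requests

SOURCE_URL = "https://source.example.com/api/orders"      # hypothetical
TARGET_URL = "https://target.example.com/api/invoices"    # hypothetical

def sync_orders() -> int:
    """Copy orders from the source system into the target system."""
    headers = {"Authorization": f"Bearer {os.environ['SOURCE_API_TOKEN']}"}
    resp = requests.get(SOURCE_URL, headers=headers, timeout=30)
    resp.raise_for_status()

    pushed = 0
    for order in resp.json():
        invoice = {                       # map source fields onto the target schema (assumed)
            "reference": order["order_id"],
            "amount": order["total_incl_vat"],
            "customer": order["customer_code"],
        }
        post = requests.post(
            TARGET_URL,
            json=invoice,
            headers={"Authorization": f"Bearer {os.environ['TARGET_API_TOKEN']}"},
            timeout=30,
        )
        post.raise_for_status()
        pushed += 1
    return pushed

if __name__ == "__main__":
    print(f"Pushed {sync_orders()} invoices")
```
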
Data & Analytics
  • Support data transformation and reporting initiatives across departments.
  • Develop dashboards and reports in Power BI for internal teams and clients.
  • Assist with data modelling and ensuring data quality within reporting solutions.
  • Collaborate with finance and operations teams to deliver actionable insights.
Innovation & Continuous Improvement
  • Research and evaluate new technologies for integration and analytics.
  • Drive automation and process optimisation across IT operations.
  • Contribute to the IT strategy by recommending best practices for data flow and reporting.

Experience & Qualifications
  • Bachelor's Degree in IT, Computer Science, or related field (preferred).
  • 3–5 years' experience in system integration, scripting, or related roles.
  • Hands-on experience with Flowgear or other integration platforms (e.g., Dell Boomi, MuleSoft).
  • Proficiency in scripting languages (PowerShell, Python, SQL).
  • Experience with Power BI or other BI tools (advantageous).
  • Knowledge of APIs, JSON, and data structures.
  • Strong analytical mindset with problem-solving ability.
  • Good communication skills to work across business units.
Work Environment
  • Combination of integration engineering and data analytics.
  • Hands-on technical role with opportunities to contribute to business intelligence initiatives.
  • Collaboration with IT, Finance, and Operations teams.
SALARY

The final offer will be based on the skills and requirements outlined above. The offer will be competitive, aligned with your unique profile and value proposition.

Integration & Data Solutions Specialist

New

Posted today

Job Description

Key Duties & Responsibilities

Systems Integration
  • Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
  • Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency.
  • Build and support secure APIs and data pipelines between internal and client systems.
  • Troubleshoot integration issues and liaise with vendors when required.
Data & Analytics
  • Support data transformation and reporting initiatives across departments.
  • Develop dashboards and reports in Power BI for internal teams and clients.
  • Assist with data modelling and ensuring data quality within reporting solutions.
  • Collaborate with finance and operations teams to deliver actionable insights.
Innovation & Continuous Improvement
  • Research and evaluate new technologies for integration and analytics.
  • Drive automation and process optimisation across IT operations.
  • Contribute to the IT strategy by recommending best practices for data flow and reporting.

Experience & Qualifications
  • Bachelor's Degree in IT, Computer Science, or related field (preferred).
  • 3–5 years' experience in system integration, scripting, or related roles.
  • Hands-on experience with Flowgear or other integration platforms (e.g., Dell Boomi, MuleSoft).
  • Proficiency in scripting languages (PowerShell, Python, SQL).
  • Experience with Power BI or other BI tools (advantageous).
  • Knowledge of APIs, JSON, and data structures.
  • Strong analytical mindset with problem-solving ability.
  • Good communication skills to work across business units.
Work Environment
  • Combination of integration engineering and data analytics.
  • Hands-on technical role with opportunities to contribute to business intelligence initiatives.
  • Collaboration with IT, Finance, and Operations teams.
SALARY

The final offer will be based on the skills and requirements outlined above. The offer will be competitive, aligned with your unique profile and value proposition.

Azure Data Solutions Architect

R120000 - R180000 Y Deloitte

Posted today

Job Description

Company Description

Deloitte is the largest private professional services network in the world. Every day, approximately 457,000 professionals in more than 150 countries demonstrate their commitment to a single vision: to be the standard of excellence, while working towards one purpose – to make an impact that matters.

About the division

Innovation, transformation and leadership occur in many ways. At Deloitte, our ability to help solve clients' most complex issues is distinct. We deliver strategy and implementation, from a business and technology view, to help lead in the markets where our clients compete.

Are you a game changer? Do you believe in adding advantage at every level in everything you do? You may be one of us.

Deloitte Consulting is growing, with a focus on developing our already powerful teams across our portfolio of offerings. We are looking for smart, accountable, innovative professionals with technical expertise and deep industry experience and insights. The combination of our 6 areas of expertise, our well-developed industry structure and our integrated signature solutions is a unique offering never seen before on the African continent.

Job Description

As a Cloud Data Solutions Architect, you will be responsible for designing and building the Azure solution architecture that enables modern data platforms. This includes all aspects of the solution required to source, integrate, store, model, analyse and report on data.

  • Responsible for designing solutions, aligned with customer requirements for projects initiated by supported lines of business in coordination with other technical departments
  • Creating a well-informed cloud strategy and managing the adaptation process
  • Evaluating cloud applications, hardware, and software
  • Developing and organising cloud systems
  • Responsible for design and implementation of data modelling and data architecture, e.g., Data Factory, Kimball, Inmon, Data Mesh, Data Vault 2.0, Lakehouse/Medallion Architecture.
  • Responsible for designing ETL/ELT, Data Ingestion, Data Validation and Data Quality in the data pipeline.
  • Responsible for designing and preparing the data dictionary and data catalogue related to the data model.
  • 3 years' experience building cloud solutions.
  • Apply technical knowledge to architect and design solutions that meet business and IT needs.
  • A good understanding of, and hands-on experience with, market-leading ETL/BI tools.
  • Proficiency in SQL and Python.
  • Good understanding of Azure DevOps (ADO), JIRA, Confluence and release management (Dev to Prod).
  • Analyse business and proposal data and technical requirements to develop a wide range of data-related solutions in cloud and traditional technologies.
  • Assist the project team with requirements completion and submission of responses to RFPs.
  • Prepares design specifications, analyses and recommendations based on customer requirements.
  • Participate in the design, development, planning, modification and/or improvement of existing data and analytical systems.
  • Research system designs, concepts, feasibility and enhancement solutions utilizing advanced technical theory and knowledge.
  • Partners with project team to complete technical design documents
  • Coordinates and contributes to compilation and writing of proposal documents.
  • Prepares design proposals to reflect cost, schedule and technical approaches.
  • Cloud Experience: Azure/AWS/GCP/Snowflake/Databricks
  • Methodology Experience: Agile, DMBOK, DataOps, MLOps
  • Consulting Experience Preferred
  • End-to-end technology architecture design
  • Understanding of Information Architecture domain (e.g., DAMA / DMBOK)
  • Engage stakeholders in related domains to deliver a sound technology base for BI solutions
  • Maintain awareness of new developments in AI and data technology, and apply them to enterprise solutions as relevant.
  • Knowledge of Information security
  • Define data management processes
  • Define Systems management processes
  • Define Business continuity processes
  • Data integration (real-time, streaming, virtualization)
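
Several of the responsibilities above concern data validation and data quality within the pipeline. The sketch below shows one simple way such checks are often expressed in Python with pandas (completeness, uniqueness, and validity rules); the column names, thresholds, and sample data are illustrative assumptions, not part of this role description.

```python
# Minimal sketch of a data-quality validation step run before loading an extract
# into the warehouse: completeness, uniqueness, and validity checks with pandas.
import pandas as pd

def validate_customers(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality issues found in the customer extract."""
    issues = []

    # Completeness: key business columns must not contain nulls.
    for col in ("customer_id", "policy_number"):
        null_count = int(df[col].isna().sum())
        if null_count:
            issues.append(f"{col}: {null_count} null values")

    # Uniqueness: the natural key must not be duplicated.
    dupes = int(df["customer_id"].duplicated().sum())
    if dupes:
        issues.append(f"customer_id: {dupes} duplicate rows")

    # Validity: premiums should be non-negative.
    if (df["monthly_premium"] < 0).any():
        issues.append("monthly_premium: negative values present")

    return issues

sample = pd.DataFrame(
    {
        "customer_id": [1, 2, 2],
        "policy_number": ["A1", None, "A3"],
        "monthly_premium": [450.0, 300.0, -10.0],
    }
)
print(validate_customers(sample))
```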

Qualifications

Minimum Qualification/s:

  • Preferred: BCom Informatics
  • BSc Computer Science / BSc Honours Computer Science
  • BSc Maths Statistics
  • BEng (all disciplines)

Desired Qualification/s:

  • TOGAF
  • Cloud-related architecture / associate certifications
  • Data Engineering, Data Science, Data Analytics and Analytics technology-specific certifications

Additional Information

Minimum Experience:

  • 10 years' relevant (Data Management, Data Engineering, Data Science, Data Analytics and Analytics technology) working experience.
  • BI / Analytics / Data solution architect.

Desired Experience:

  • 10 years' relevant working experience in a client-facing role; five of these in a management role.
  • Business development / market making experience preferred.

Or

  • 10 years analytics delivery experience with subject matter expert knowledge in Data Management / Data Engineering / Data Science / Data Analytics / Analytics technology.
  • Expert in their chosen domain having architected and delivered numerous complex analytics and data solutions.

At Deloitte, we want everyone to feel they can be themselves and to thrive at work—in every country, in everything we do, every day. We aim to create a workplace where everyone is treated fairly and with respect, including reasonable accommodation for persons with disabilities. We seek to create and leverage our diverse workforce to build an inclusive environment across the African continent.

Note: The list of tasks / duties and responsibilities contained in this document is not necessarily exhaustive. Deloitte may ask the employee to carry out additional duties or responsibilities, which may fall reasonably within the ambit of the role profile, depending on operational requirements.

Be careful of Recruitment Scams: Fraudsters or employment scammers often pose as legitimate recruiters, employers, recruitment consultants or job placement firms, advertising false job opportunities through email, text messages and WhatsApp messages. They aim to cheat jobseekers out of money or to steal personal information.

To help you look out for potential recruitment scams, here are some Red Flags:

  • Upfront Payment Requests: Deloitte will never ask for any upfront payment for background checks, job training, or supplies.
  • Requests for Personal Information: Be wary if you are asked for sensitive personal information, especially early in the recruitment process and without a clear need for it. Fraudulent links or contractual documents may require the provision of sensitive personal data or copy documents (e.g., government issued numbers or identity documents, passports or passport numbers, bank account statements or numbers, parent's data) that may be used for identity fraud. Do not provide or send any of these documents or data. Please note we will never ask for photographs at any stage of the recruitment process.
  • Unprofessional Communication: Scammers may communicate in an unprofessional manner. Their messages may be filled with poor grammar and spelling errors. The look and feel may not be consistent with the Deloitte corporate brand.

If you're unsure, make direct contact with Deloitte using our official contact details. Be careful not to use any contact details provided in the suspicious job advertisement or email.

Data Solutions Architect / Senior Data Engineer

Cape Town, Western Cape PBT Group

Posted 7 days ago

Job Description

Overview

We’re seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You’ll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.

A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.
Tech Stack
  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
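
To make the retrieval flow in the stack above concrete, here is a minimal sketch of the query side of a RAG system on Azure: the user's question is embedded, a hybrid (keyword plus vector) query is run against Azure AI Search, and the retrieved chunks ground an Azure OpenAI answer. The index name, field names, and deployment names are assumptions for illustration, not details from this specification.

```python
# Minimal sketch of RAG retrieval and grounded generation on Azure.
import os

from openai import AzureOpenAI
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="documents",                                   # hypothetical index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

def answer(question: str) -> str:
    """Hybrid-search the index and answer from the retrieved context only."""
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-large", input=[question]      # deployment name (assumed)
    ).data[0].embedding

    results = search_client.search(
        search_text=question,                                  # keyword side of the hybrid query
        vector_queries=[VectorizedQuery(vector=embedding, k_nearest_neighbors=5, fields="content_vector")],
        top=5,
    )
    context = "\n\n".join(doc["content"] for doc in results)

    completion = openai_client.chat.completions.create(
        model="gpt-4o",                                        # chat deployment name (assumed)
        messages=[
            {"role": "system", "content": "Answer using only the supplied context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content

print(answer("What does the latest quarterly report say about claims volumes?"))
```
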
Key Responsibilities
  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.
Required Skills & Experience
  • 7+ years’ experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.
Why Join

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You’ll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

In order to comply with the POPI Act, we require your permission to retain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent.

If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.

Data Solutions Architect / Senior Data Engineer

Cape Town, Western Cape PBT Group

Posted 11 days ago

Job Description

We’re seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.

You’ll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.

High-Level Project Overview

The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.


A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.

Your work will involve:

  • Designing and developing scalable data pipelines, storage, and processing components in Azure.
  • Supporting the integration of RAG systems with AI models and vector databases.
  • Enabling robust data flow between AI, search, and warehousing layers.
  • Contributing to architectural decisions on performance, governance, and scalability.

Tech Stack

  • Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
  • LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
  • Storage: Azure Blob Storage Gen 2 (for documents and source data)
  • Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
  • App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
  • Embedding Model: Azure OpenAI text-embedding-3-large
  • Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage

Key Responsibilities

  • Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
  • Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
  • Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
  • Implement data models to support text retrieval, embedding, and summarization processes.
  • Ensure compliance with data governance and security best practices.
  • Mentor and support junior team members as the data capability scales.

Required Skills & Experience

  • 7+ years’ experience as a Data Engineer or Data Architect in enterprise environments.
  • Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
  • Advanced SQL and Python development experience.
  • Proven experience with cloud data migration and modern data warehousing.
  • Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
  • Understanding of data orchestration, governance, and security principles.
  • Experience in insurance or financial services preferred.

Why Join

This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You’ll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.

* In order to comply with the POPI Act, we require your permission to retain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
