325 Data Storage Specialist jobs in South Africa
Data Solutions Architect
Posted 7 days ago
Job Description
Data Solutions Architect job vacancy in Cape Town.
We’re seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution.
The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.
You’ll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.
High-Level Project Overview:
The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.
A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.
Your work will involve:
- Designing and developing scalable data pipelines, storage, and processing components in Azure.
- Supporting the integration of RAG systems with AI models and vector databases.
- Enabling robust data flow between AI, search, and warehousing layers.
- Contributing to architectural decisions on performance, governance, and scalability.
Tech Stack:
- Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
- LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
- Storage: Azure Blob Storage Gen 2 (for documents and source data)
- Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
- App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
- Embedding Model: Azure OpenAI text-embedding-3-large
- Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
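To make the retrieval flow this stack implies concrete, here is a minimal, hedged Python sketch of the query-time step: embed a question with Azure OpenAI, then run a hybrid (keyword + vector) query against Azure AI Search. It assumes the openai (>= 1.0) and azure-search-documents (>= 11.4) packages; the endpoints, deployment name, index name, and field names such as contentVector are invented placeholders, not details from this posting.

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

# Hypothetical endpoints and names -- substitute your own resources.
openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="documents-index",  # hypothetical index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_API_KEY"]),
)

def retrieve(question: str, top: int = 5) -> list[str]:
    """Embed the question, then run a hybrid (keyword + vector) search."""
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-large",  # assumes a deployment with this name
        input=question,
    ).data[0].embedding

    results = search_client.search(
        search_text=question,  # keyword half of the hybrid query
        vector_queries=[
            VectorizedQuery(
                vector=embedding,
                k_nearest_neighbors=top,
                fields="contentVector",  # hypothetical vector field
            )
        ],
        top=top,
    )
    return [doc["content"] for doc in results]  # assumes a "content" field
```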
Key Responsibilities:
- Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
- Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
- Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
- Implement data models to support text retrieval, embedding, and summarization processes (see the index sketch after this list).
- Ensure compliance with data governance and security best practices.
- Mentor and support junior team members as the data capability scales.
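As one hedged illustration of the data-model responsibility referenced above, the sketch below defines an Azure AI Search index with a 3,072-dimension vector field, the output size of text-embedding-3-large. It assumes azure-search-documents >= 11.4; the index name, field names, and endpoint are invented for the example.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import (
    HnswAlgorithmConfiguration,
    SearchField,
    SearchFieldDataType,
    SearchIndex,
    SearchableField,
    SimpleField,
    VectorSearch,
    VectorSearchProfile,
)

# Hypothetical service details -- placeholders only.
index_client = SearchIndexClient(
    endpoint="https://<your-search-service>.search.windows.net",
    credential=AzureKeyCredential("<admin-key>"),
)

index = SearchIndex(
    name="documents-index",  # hypothetical name
    fields=[
        SimpleField(name="id", type=SearchFieldDataType.String, key=True),
        SearchableField(name="content", type=SearchFieldDataType.String),
        SimpleField(name="sourceUri", type=SearchFieldDataType.String, filterable=True),
        SearchField(
            name="contentVector",
            type=SearchFieldDataType.Collection(SearchFieldDataType.Single),
            searchable=True,
            vector_search_dimensions=3072,  # text-embedding-3-large output size
            vector_search_profile_name="default-profile",
        ),
    ],
    vector_search=VectorSearch(
        algorithms=[HnswAlgorithmConfiguration(name="default-hnsw")],
        profiles=[
            VectorSearchProfile(
                name="default-profile",
                algorithm_configuration_name="default-hnsw",
            )
        ],
    ),
)

index_client.create_or_update_index(index)
```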
Required Skills & Experience:
- 7+ years’ experience as a Data Engineer or Data Architect in enterprise environments.
- Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
- Advanced SQL and Python development experience.
- Proven experience with cloud data migration and modern data warehousing.
- Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
- Understanding of data orchestration, governance, and security principles.
- Experience in insurance or financial services preferred.
Why Join:
This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You’ll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.
Data Solutions Architect
Posted today
Contract
Experience: 8 to 30 years
Salary: Negotiable
Job Published: 08 October 2025
Job Reference No.:
Job Description
We're seeking a Data Solutions Architect / Senior Data Engineer to join a growing Data and AI team working on an innovative cloud-based data warehousing and AI solution. The team is developing a client-facing platform that integrates data warehousing with a RAG (Retrieval-Augmented Generation) system — transforming unstructured and structured data into organized, summarized, and insightful information for business use.
You'll play a leading role in building out the production-ready environment, ensuring compliance, scalability, and performance, while contributing to the development of advanced AI-driven insights and automation capabilities.
High-Level Project Overview
The platform focuses on the aggregation, synthesis, and summarization of unstructured data through a secure, scalable Azure-based architecture.
A proof of concept has already been built (a chatbot web app hosted on Azure), and the next phase involves expanding this into a fully integrated production solution.
Your work will involve:
- Designing and developing scalable data pipelines, storage, and processing components in Azure.
- Supporting the integration of RAG systems with AI models and vector databases.
- Enabling robust data flow between AI, search, and warehousing layers.
- Contributing to architectural decisions on performance, governance, and scalability.
Tech Stack
- Framework / Orchestration: Azure AI Foundry (for AI workflow orchestration and management)
- LLM Provider: Azure OpenAI Service (designed to be model-agnostic for future extensibility)
- Storage: Azure Blob Storage Gen 2 (for documents and source data)
- Vector Store / Search: Azure AI Search (vector + hybrid search capabilities)
- App Hosting: Azure App Service (chatbot web app interface integrated with RAG)
- Embedding Model: Azure OpenAI text-embedding-3-large
- Data Warehousing: Azure Data Factory for data extraction, transformation, and integration between AI Foundry, AI Search, and Blob Storage
Key Responsibilities
- Architect and implement end-to-end data pipelines and data warehousing solutions in Azure.
- Design and optimize ETL/ELT workflows using Azure Data Factory or equivalent.
- Collaborate with AI developers and cloud engineers to connect data pipelines to AI/RAG systems.
- Implement data models to support text retrieval, embedding, and summarization processes.
- Ensure compliance with data governance and security best practices.
- Mentor and support junior team members as the data capability scales.
Required Skills & Experience
- 7+ years' experience as a Data Engineer or Data Architect in enterprise environments.
- Strong proficiency in Azure Cloud (Data Factory, Blob Storage, Synapse, AI Foundry, OpenAI).
- Advanced SQL and Python development experience.
- Proven experience with cloud data migration and modern data warehousing.
- Knowledge of vector databases, AI model integration, or RAG frameworks highly advantageous.
- Understanding of data orchestration, governance, and security principles.
- Experience in insurance or financial services preferred.
Why Join
This is a greenfield opportunity to help build a Data & AI capability from the ground up. The team currently consists of four engineers and is expected to grow rapidly in 2026. You'll be working on cutting-edge Azure and AI technologies, shaping an intelligent platform that makes sense of large, messy datasets and transforms them into business-ready insights.
In order to comply with the POPI Act, we require your permission to maintain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent.
If you have not received any feedback after 2 weeks, please consider your application unsuccessful.
Skills: Data Engineering, Data Architecture, Enterprise Architecture, Microsoft Azure, SQL, Python, Data Warehousing
Industries: Insurance, Financial Services
AI & Data Solutions Architect
Posted 1 day ago
Job Description
Overview
Role: AI & Data Solutions Architect
Experience: 8-10 Years
Must-Have Skills
- Strong understanding of AI/ML concepts
- Proficiency in Python (or other relevant languages)
- Experience with prompt engineering
- Knowledge of LLMs and their applications
- Familiarity with MLOps principles
- Ability to design, operate, and manage Nvidia H200 GPU infrastructure
- Basic subject-matter expertise in telecom and prior work experience in the area
- Deep understanding of Azure/AWS or GCP cloud platform
- Expertise in relevant cloud services and tools
- Knowledge of Cloud (AWS/Azure or GCP) data storage and processing services
- Ruby on Rails
- Python
- Kubernetes, Docker
- Java
- Nvidia H200 GPU handling
- Azure DevOps
- Azure Cosmos DB
- Docker & Kubernetes containerization
Key Responsibilities
- Design and implement GenAI solutions
- Develop and optimize prompts
- Build and deploy AI models
- Ensure ethical considerations and responsible AI practices
- Stay updated on the latest GenAI advancements
- Integrate with various LLM services and other cloud AI and non-AI services
- Utilize machine learning tools such as Azure Machine Learning or similar for model training and deployment (a minimal sketch follows this list)
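As a hedged illustration of the prompt-development and LLM-integration items above, here is a minimal Azure OpenAI chat call that grounds the model in supplied context. The deployment name, the telecom framing, and the template wording are assumptions for the example, not details from this posting.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

SYSTEM_PROMPT = (
    "You are a telecom operations assistant. Answer only from the "
    "provided context. If the context is insufficient, say so."
)

def answer(question: str, context: str) -> str:
    """One grounded completion: retrieved context is injected into the user turn."""
    response = client.chat.completions.create(
        model="gpt-4o",  # deployment name is an assumption
        temperature=0.2,  # low temperature for factual answers
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```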
Integration & Data Solutions Specialist
Posted 11 days ago
Job Description
Systems Integration
- Develop, maintain, and optimise integrations using Flowgear and other middleware platforms.
- Write and maintain scripts (PowerShell, Python, or similar) to automate processes and improve efficiency.
- Build and support secure APIs and data pipelines between internal and client systems (a script sketch follows this list).
- Troubleshoot integration issues and liaise with vendors when required.
- Support data transformation and reporting initiatives across departments.
- Develop dashboards and reports in Power BI for internal teams and clients.
- Assist with data modelling and ensuring data quality within reporting solutions.
- Collaborate with finance and operations teams to deliver actionable insights.
- Research and evaluate new technologies for integration and analytics.
- Drive automation and process optimisation across IT operations.
- Contribute to the IT strategy by recommending best practices for data flow and reporting.
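The scripting and API bullets above all describe moving data between systems. Below is a minimal, hedged Python sketch of that pattern: pull records from one REST API, reshape them, and post them to another. The URLs, field names, and token variable are invented placeholders; in practice a platform such as Flowgear would handle much of this declaratively, with scripts filling the gaps.

```python
import os

import requests

# Hypothetical endpoints and credentials -- placeholders only.
SOURCE_URL = "https://source.example.com/api/orders"
TARGET_URL = "https://target.example.com/api/orders"
HEADERS = {"Authorization": f"Bearer {os.environ['API_TOKEN']}"}

def sync_orders() -> int:
    """Pull orders from the source API, reshape them, push to the target."""
    resp = requests.get(SOURCE_URL, headers=HEADERS, timeout=30)
    resp.raise_for_status()

    pushed = 0
    for order in resp.json():
        payload = {
            # Map source fields onto the target schema (names assumed).
            "externalId": order["id"],
            "amount": order["total"],
            "placedAt": order["created_at"],
        }
        out = requests.post(TARGET_URL, json=payload, headers=HEADERS, timeout=30)
        out.raise_for_status()
        pushed += 1
    return pushed

if __name__ == "__main__":
    print(f"Synced {sync_orders()} orders")
```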
Experience & Qualifications
- Bachelor's Degree in IT, Computer Science, or related field (preferred).
- 3–5 years' experience in system integration, scripting, or related roles.
- Hands-on experience with Flowgear or other integration platforms (e.g., Dell Boomi, MuleSoft).
- Proficiency in scripting languages (PowerShell, Python, SQL).
- Experience with Power BI or other BI tools (advantageous).
- Knowledge of APIs, JSON, and data structures.
- Strong analytical mindset with problem-solving ability.
- Good communication skills to work across business units.
- Combination of integration engineering and data analytics.
- Hands-on technical role with opportunities to contribute to business intelligence initiatives.
- Collaboration with IT, Finance, and Operations teams.
The final offer will be based on the skills and requirements outlined above. The offer will be competitive, aligned with your unique profile and value proposition.
Azure Data Solutions Architect
Posted today
Job Description
Company Description
Deloitte is the largest private professional services network in the world. Every day, approximately 457,000 professionals in more than 150 countries demonstrate their commitment to a single vision: to be the standard of excellence, while working towards one purpose – to make an impact that matters.
About the division
Innovation, transformation and leadership occur in many ways. At Deloitte, our ability to help solve clients' most complex issues is distinct. We deliver strategy and implementation, from a business and technology view, to help lead in the markets where our clients compete.
Are you a game changer? Do you believe in adding advantage at every level in everything you do? You may be one of us.
Deloitte Consulting is growing, with a focus on developing our already powerful teams across our portfolio of offerings. We are looking for smart, accountable, innovative professionals with technical expertise and deep industry insights. The combination of our six areas of expertise, our well-developed industry structure and our integrated signature solutions is a unique offering never before seen on the African continent.
Job Description
As a cloud data solutions architect, you will be responsible for designing and building the Azure solution architecture that enables modern data platforms. This includes all aspects of the solution required to source, integrate, store, model, analyse and report on data.
- Responsible for designing solutions, aligned with customer requirements for projects initiated by supported lines of business in coordination with other technical departments
- Creating a well-informed cloud strategy and managing the adoption process
- Evaluating cloud applications, hardware, and software
- Developing and organising cloud systems
- Responsible for design and implementation of data modelling and data architecture, e.g., Data Factory, Kimball, Inmon, Data Mesh, Data Vault 2.0, Lakehouse/Medallion Architecture (a bronze-to-silver sketch follows this list).
- Responsible for designing ETL/ELT, Data Ingestion, Data Validation and Data Quality in the data pipeline.
- Responsible for designing and preparing the data dictionary and data catalogue related to the data model.
- 3 years' experience building cloud solutions.
- Apply technical knowledge to architect and design solutions that meet business and IT needs.
- A good understanding of, and hands-on experience with, market-leading ETL/BI tools.
- Proficiency in SQL and Python.
- Good understanding of Azure DevOps (ADO), JIRA, Confluence and release management (Dev to Prod).
- Analyse business and proposal data and technical requirements to develop a wide range of data-related solutions in cloud and traditional technologies.
- Assist the project team with requirements completion and submission of responses to RFPs.
- Prepare design specifications, analyses and recommendations based on customer requirements.
- Participate in the design, development, planning, modification and/or improvement of existing data and analytical systems.
- Research system designs, concepts, feasibility and enhancement solutions utilising advanced technical theory and knowledge.
- Partner with the project team to complete technical design documents.
- Coordinate and contribute to the compilation and writing of proposal documents.
- Prepare design proposals that reflect cost, schedule and technical approaches.
- Cloud Experience: Azure/AWS/GCP/Snowflake/Databricks
- Methodology Experience: Agile, DMBOK, DataOps, MLOps
- Consulting Experience Preferred
- End-to-end technology architecture design
- Understanding of Information Architecture domain (e.g., DAMA / DMBOK)
- Engage stakeholders in related domains to deliver a sound technology base for BI solutions
- Maintain awareness of new developments in AI and data technology, and apply them to enterprise solutions as relevant.
- Knowledge of Information security
- Define data management processes
- Define Systems management processes
- Define Business continuity processes
- Data integration (real-time, streaming, virtualization)
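As a hedged illustration of the Lakehouse/Medallion modelling named in the list above, here is a minimal PySpark sketch of a bronze-to-silver refinement step on Delta tables: deduplicate raw records, enforce types, and add an audit column. The paths, column names, and Databricks-style environment (where the delta format is available by default) are assumptions, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Hypothetical lake paths -- bronze holds raw ingests, silver holds
# cleaned, conformed records (Medallion convention).
BRONZE_PATH = "/mnt/lake/bronze/policies"
SILVER_PATH = "/mnt/lake/silver/policies"

bronze = spark.read.format("delta").load(BRONZE_PATH)

silver = (
    bronze
    .dropDuplicates(["policy_id"])                     # assumed key column
    .filter(F.col("policy_id").isNotNull())
    .withColumn("ingested_at", F.current_timestamp())  # audit column
    .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
)

silver.write.format("delta").mode("overwrite").save(SILVER_PATH)
```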
Qualifications
Minimum Qualification/s:
- Preferred: BCom Informatics
- BSc Computer Science / BSc Honours Computer Science
- BSc Maths Statistics
- BEng (all disciplines)
Desired Qualification/s:
- TOGAF
- Cloud-related architecture / associate certifications
- Data Engineering, Data Science, Data Analytics and analytics-technology-specific certifications
Additional Information
Minimum Experience:
- 10 years' relevant working experience (Data Management, Data Engineering, Data Science, Data Analytics and analytics technology).
- BI / Analytics / Data solution architect.
Desired Experience:
- 10 years' relevant working experience in a client-facing role; five of these in a management role.
- Business development / market making experience preferred.
Or
- 10 years' analytics delivery experience with subject-matter-expert knowledge in Data Management / Data Engineering / Data Science / Data Analytics / analytics technology.
- Expert in their chosen domain, having architected and delivered numerous complex analytics and data solutions.
At Deloitte, we want everyone to feel they can be themselves and to thrive at work—in every country, in everything we do, every day. We aim to create a workplace where everyone is treated fairly and with respect, including reasonable accommodation for persons with disabilities. We seek to create and leverage our diverse workforce to build an inclusive environment across the African continent.
Note: The list of tasks / duties and responsibilities contained in this document is not necessarily exhaustive. Deloitte may ask the employee to carry out additional duties or responsibilities, which may fall reasonably within the ambit of the role profile, depending on operational requirements.
Be careful of Recruitment Scams: Fraudsters or employment scammers often pose as legitimate recruiters, employers, recruitment consultants or job placement firms, advertising false job opportunities through email, text messages and WhatsApp messages. They aim to cheat jobseekers out of money or to steal personal information.
To help you look out for potential recruitment scams, here are some Red Flags:
- Upfront Payment Requests: Deloitte will never ask for any upfront payment for background checks, job training, or supplies.
- Requests for Personal Information: Be wary if you are asked for sensitive personal information, especially early in the recruitment process and without a clear need for it. Fraudulent links or contractual documents may require the provision of sensitive personal data or copy documents (e.g., government issued numbers or identity documents, passports or passport numbers, bank account statements or numbers, parent's data) that may be used for identity fraud. Do not provide or send any of these documents or data. Please note we will never ask for photographs at any stage of the recruitment process.
- Unprofessional Communication: Scammers may communicate in an unprofessional manner. Their messages may be filled with poor grammar and spelling errors. The look and feel may not be consistent with the Deloitte corporate brand.
If you're unsure, make direct contact with Deloitte using our official contact details. Be careful not to use any contact details provided in the suspicious job advertisement or email.