18 Cloud Data Engineer jobs in South Africa

Cloud Data Engineer

Mindworx Consulting and Academy

Posted today


Job Description

Role Overview

We are seeking a passionate and experienced Data Engineer who thrives on transforming raw data into powerful insights. You'll work primarily within the Google Cloud Platform (GCP) ecosystem, leveraging tools like BigQuery, Dataproc, Kubernetes, and AI Hub to design, build, and maintain scalable data solutions.

You should "dream in Python and speak SQL," with a strong desire to turn complex, messy data into meaningful stories and visual insights that empower strategic business decisions.

Key Responsibilities

  • Design, develop, and optimize large-scale data pipelines (ETL/ELT) using modern data tools and frameworks; see the sketch after this list.
  • Build and maintain cloud-based data infrastructure within Google Cloud Platform.
  • Work with structured and unstructured data, ensuring quality, consistency, and governance.
  • Develop and deploy RESTful APIs for scalable data access.
  • Collaborate with data scientists, analysts, and business stakeholders to translate business needs into technical solutions.
  • Implement and maintain source control and CI/CD processes using Git.
  • Continuously monitor, optimize, and troubleshoot data workflows for performance and reliability.
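
To make the pipeline bullet above concrete, here is a minimal, hypothetical sketch of a batch load from Cloud Storage into BigQuery using the google-cloud-bigquery client; the bucket, project, dataset, and table names are illustrative placeholders, not details from the posting.

    from google.cloud import bigquery  # assumes google-cloud-bigquery is installed and credentials are configured

    def load_daily_extract(uri: str, table_id: str) -> None:
        """Load a CSV extract from Cloud Storage into a BigQuery table (illustrative sketch)."""
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,  # skip the header row
            autodetect=True,      # let BigQuery infer the schema
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # replace existing contents
        )
        load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
        load_job.result()  # block until the load job finishes, raising on error
        table = client.get_table(table_id)
        print(f"Loaded {table.num_rows} rows into {table_id}")

    if __name__ == "__main__":
        # Placeholder URI and table ID; substitute real bucket, project, and dataset names.
        load_daily_extract("gs://example-bucket/exports/orders.csv", "example-project.analytics.orders")

A production pipeline would typically add schema management, partitioning, and orchestration, but the shape of the work is the same.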

Required Qualifications

  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or a related field (Honours minimum; Master's preferred).
  • 5+ years of professional experience in software or data engineering.
  • Strong proficiency in Python (data manipulation, API development, automation).
  • Advanced SQL skills and familiarity with NoSQL technologies.
  • Proven experience in large-scale ETL design and implementation.
  • Strong understanding of RESTful services and API architecture.
  • Hands-on experience with cloud environments (GCP preferred, but AWS/Azure acceptable).
  • Experience with source control systems (Git).

Beneficial (Nice-to-Have) Skills

Exposure or experience with the following will be advantageous:

  • Airflow / Cloud Composer (scheduling and orchestration)
  • Kubernetes & Docker (containerization)
  • BigQuery and data warehousing concepts
  • Elasticsearch
  • Data governance frameworks and practices
  • Apache Beam / Cloud Dataflow
  • Apache Spark / Dataproc

What We Offer

  • A data-driven culture focused on innovation and collaboration.
  • Opportunities to work with the latest cloud and AI technologies.
  • A supportive environment for growth, learning, and experimentation.
  • Competitive compensation and flexible work arrangements.

Cloud Data Engineer

R900000 - R1200000 | PBT Group

Posted today


Job Description

We at PBT Group are looking for a Cloud Data Engineer with hands-on experience in AWS or GCP cloud technologies to join our consulting team. You'll be contributing to the development of scalable data platforms in a fast-moving, regulated environment.

Key Responsibilities:

  • Design, implement, and optimize cloud-native data pipelines using AWS (Glue, EMR, Redshift) or GCP (Dataflow, BigQuery, Pub/Sub); see the sketch after this list
  • Build and maintain ETL/ELT workflows for structured and unstructured data
  • Develop scalable APIs and microservices to support data access and integration
  • Enhance system performance and observability using tools like CloudWatch, Stackdriver, and OpenTelemetry
  • Ensure data security and governance through reviews, automated testing, and compliance practices
  • Support CI/CD pipelines and Infrastructure as Code using GitHub Actions, Terraform, CloudFormation, or Deployment Manager
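
As a rough illustration of the first bullet (cloud-native pipelines on GCP), here is a minimal Apache Beam sketch of the kind that could be submitted to Dataflow; the paths and field names are hypothetical and this is not PBT Group's implementation.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # Runs locally on the DirectRunner by default; pass --runner=DataflowRunner
        # plus project, region, and temp_location options to execute on Dataflow.
        with beam.Pipeline(options=PipelineOptions()) as p:
            (
                p
                | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/raw/events.jsonl")  # placeholder path
                | "ParseJson" >> beam.Map(json.loads)
                | "KeepCompleted" >> beam.Filter(lambda e: e.get("status") == "completed")
                | "ToCsvRow" >> beam.Map(lambda e: f'{e["id"]},{e["amount"]}')
                | "WriteCurated" >> beam.io.WriteToText("gs://example-bucket/curated/completed")  # placeholder path
            )

    if __name__ == "__main__":
        run()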

Qualifications and Experience:

  • 4+ years of professional experience in cloud data engineering or backend development
  • Solid experience deploying and operating data workloads on AWS or GCP
  • Strong understanding of data modeling, SQL/NoSQL databases, and performance tuning
  • Familiarity with REST/gRPC APIs, microservices, and event-driven architecture
  • Experience with Python or Java for data transformation and orchestration
  • Experience with Docker and Kubernetes for containerized deployments
  • Bonus: Experience with Apache Spark, Airflow, or dbt

Cloud Data Engineer

R900000 - R1200000 | AncerlConsult

Posted today


Job Description

Location: Johannesburg (Hybrid) or Remote SA | Contract: 6–12 months, extendable

About the role

Build reliable Azure data pipelines and models that power BI analytics and reporting.

What you'll do

  • Design and operate ingestion (batch and streaming), transformation, and orchestration (ADF/Databricks/Functions); see the sketch after this list.
  • Model data for analytics; optimize lakehouse/warehouse performance.
  • Implement lineage, quality checks, and cost control.
  • Collaborate with BI to serve governed datasets.
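
As a sketch of the ingestion and transformation work in the first bullet, a small PySpark job of the kind that might run in Databricks (typically triggered by ADF) could look like this; paths and table names are placeholders, and Delta Lake support is assumed.

    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists; getOrCreate() also works for local testing.
    spark = SparkSession.builder.getOrCreate()

    # Ingest a raw CSV drop (placeholder path), fix types, and stamp the load date.
    raw = (
        spark.read.option("header", True).csv("/mnt/raw/sales/2025-09/*.csv")
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("load_date", F.current_date())
    )

    # Append to a governed Delta table that downstream BI datasets can read.
    raw.write.format("delta").mode("append").saveAsTable("analytics.sales_daily")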

Must-haves

  • Certifications: AZ-104 and DP-203.
  • 3+ years data engineering on Azure (Spark/SQL/Delta).

Nice-to-haves

  • PL-300; DP-600 (Fabric).

Senior Cloud Data Engineer

R70000 - R120000 | ExecutivePlacements - The JOB Portal

Posted today


Job Description

Senior Cloud Data Engineer

Recruiter: Mindworx Consulting

Job Ref: JNB /Sipho

Date posted: Monday, September 15, 2025

Location: Cape Town, South Africa

SUMMARY:
What will you do?

  • You will be working primarily within the Google Cloud environment using a variety of the tools that Google offers, from BigQuery and Dataproc to Kubernetes and AI Hub. You should ideally dream in Python and speak SQL. You should not be afraid to dive into dirty data and help the team make sense of it. We are in the game of taking data and turning it into amazing stories and pretty pictures that help the decision makers drive the business forward.

POSITION INFO:
What should you have?

  • Relevant software engineering degree at least at Honours level (Master's preferred)
  • 5+ years of development experience working with Python
  • Data skills (Traditional SQL and No-SQL)
  • Large-scale ETL
  • High-scale RESTful Services
  • Cloud experience (Google Cloud Platform preferred)
  • Experience with source control (Git)

Beneficial Skills

You will be exposed to these in our environment, so it would be great if you had prior experience, but it's not a problem if you don't.

  • Scheduling and Orchestration (Airflow/Composer); see the sketch after this list
  • Containerisation (Kubernetes, Docker)
  • BigQuery
  • Elasticsearch
  • Data Warehousing concepts
  • Data governance concepts
  • Apache Beam (Cloud Dataflow)
  • Apache Spark (Dataproc)
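
For the scheduling and orchestration item above, a minimal Airflow DAG (Airflow 2.x syntax assumed; the DAG name and callable are hypothetical) might be sketched as:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        # Placeholder for the real extract/load logic (e.g. a BigQuery load job).
        print("extracting and loading the daily batch")

    with DAG(
        dag_id="daily_sales_pipeline",   # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",               # Airflow 2.4+; earlier versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )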

Intermediate Cloud Data Platform Support Engineer

R900000 - R1200000 | ExecutivePlacements - The JOB Portal

Posted today


Job Description

Intermediate Cloud Data Platform Support Engineer

Recruiter: WatersEdge Solutions

Job Ref:

Date posted: Friday, September 12, 2025

Location: Cape Town, South Africa

SUMMARY:

POSITION INFO:

Location: Remote (South Africa)

Employment Type: Full-Time

Industry: Cloud Infrastructure | Data Engineering | Azure

WatersEdge Solutions is seeking a technically astute Intermediate Cloud Data Platform Support Engineer to support and optimise cloud-based data platforms for a high-growth, data-driven business. If you're passionate about performance tuning, incident resolution, and platform reliability, this is a chance to work in a cutting-edge environment that values proactive problem solving and automation.

About The Role
Reporting to the Data Engineering function and working alongside Full Stack Support Engineers, you'll be responsible for keeping mission-critical systems running smoothly across Microsoft Azure environments. Your day-to-day will include responding to incidents, implementing automation scripts, and driving improvements in monitoring, observability, and system resilience.

Key Responsibilities

  • Provide technical support for Azure-hosted data platforms including warehouses, pipelines, and distributed computing services
  • Resolve production incidents related to system performance, stability, and data reliability
  • Perform root cause analysis and contribute to long-term system improvements
  • Monitor system health and automate recurring tasks with Python, Bash, or PowerShell (see the sketch after this list)
  • Develop and maintain real-time monitoring dashboards and alerts
  • Collaborate with engineers and client-facing teams on technical issue resolution
  • Implement runbooks, documentation, and contribute to continuous improvement initiatives
  • Stay current with emerging tools in Azure, Kubernetes, Databricks, and cloud-native ecosystems
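
As a small illustration of the monitoring and automation work described above, a stdlib-only Python health check might look like the sketch below; the endpoint, retry count, and alerting hook are hypothetical.

    import logging
    import time
    import urllib.request

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

    HEALTH_URL = "https://example.internal/pipeline/health"  # placeholder endpoint
    MAX_ATTEMPTS = 3

    def check_pipeline_health() -> bool:
        """Poll a health endpoint with retries and log the outcome (illustrative only)."""
        for attempt in range(1, MAX_ATTEMPTS + 1):
            try:
                with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
                    if resp.status == 200:
                        logging.info("pipeline healthy (attempt %d)", attempt)
                        return True
                    logging.warning("unexpected status %d (attempt %d)", resp.status, attempt)
            except OSError as exc:
                logging.warning("health check failed (attempt %d): %s", attempt, exc)
            time.sleep(2 ** attempt)  # simple exponential backoff between attempts
        logging.error("pipeline unhealthy after %d attempts; raise an alert here", MAX_ATTEMPTS)
        return False

    if __name__ == "__main__":
        check_pipeline_health()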

What You'll Bring

  • 3+ years' experience in cloud support, cloud operations, or data engineering
  • Strong hands-on experience with Microsoft Azure and cloud-native data platforms
  • Solid scripting skills in Python, Bash, or PowerShell
  • Deep understanding of SQL, T-SQL, Spark, and incident management best practices
  • Familiarity with observability tools and monitoring frameworks
  • Ability to support production systems with high standards for uptime and data quality
  • Excellent communication skills across technical and non-technical stakeholders
  • Degree in Computer Science, Information Systems, or Engineering

Nice to Have

  • Azure certifications in cloud engineering or architecture
  • Experience with Databricks, Microsoft Fabric, or Spark-based analytics platforms
  • Knowledge of Kubernetes, Docker, or multi-cloud environments (AWS/GCP)
  • Exposure to infrastructure-as-code (Terraform) and CI/CD pipelines

What's On Offer

  • Remote-first flexibility with a collaborative culture
  • Competitive salary and performance incentives
  • Continuous learning opportunities and home office reimbursements
  • Wellness initiatives, social events, and transparent team communication
  • A high-impact role supporting scalable, data-driven technologies

Company Culture
At WatersEdge Solutions, we match top engineering talent with purpose-driven innovation. You'll join a team that values autonomy, continuous improvement, and high performance—where every contribution matters and systems are built to scale securely and efficiently.

If you have not been contacted within 10 working days, please consider your application unsuccessful.


Cloud Computing Robot

Cape Town, Western Cape | Communicate Recruitment

Posted 7 days ago


Job Description


  • Cloud Platform Integration: Certified proficiency in AWS, Azure, or Google Cloud
  • Containerization & Orchestration: Kubernetes, Docker, and Terraform-based deployment
  • AI/ML Operationalization: Experience deploying and managing machine learning models at scale
  • Security Protocols: Knowledge of Zero-Trust frameworks, encryption, and compliance standards (GDPR, HIPAA)
  • Fault Tolerance: Automated backup, disaster recovery, and high-availability design

Qualification: A tertiary qualification is preferred.


Contact JADE PERUMAL on

Specialist: Data Engineering

Centurion, Gauteng | R900000 - R1200000 | Clyrofor SA

Posted today


Job Description

We are seeking a skilled and motivated Specialist: Data Engineering to join our dynamic Financial Services team.

The ideal candidate will play a key role in implementing the company's Data Strategy by driving data awareness, engagement, and monetization while ensuring operational excellence across platforms.

This role involves building and optimizing data pipelines, managing data architectures on both on-premises and cloud environments, ensuring data quality and compliance, and supporting Payments & Ecommerce teams with reliable data solutions.

The position is ideal for a technically strong data engineer with a solid understanding of data frameworks, who is eager to grow in a fast-paced and innovation-driven environment.

Key Responsibilities

Strategy Development & Implementation

  • Implement the Data Strategy to drive customer awareness, engagement, experience, and monetization.
  • Provide input into data monetization aspects of the Business Plan and cascade to OpCos.
  • Drive product revenue growth, operational excellence, and customer satisfaction.

Operational Delivery – Data Platforms

  • Support in developing frameworks and technical architectures for Financial Services innovation.
  • Assist the Payments & Ecommerce team in achieving annual goals.
  • Implement and manage data architectures across multiple platforms (web, mobile, social).
  • Ensure continuous alignment of data platforms with Financial Services strategy and evolving business requirements.
  • Oversee delivery of data platform services and ensure integration efficiency and quality.

Data Engineering

  • Design, build, and maintain ETL/ELT pipelines from multiple data sources.
  • Set up and manage centralized data storage systems and warehouses (e.g., Fabric, PrestoDB, Oracle).
  • Utilize cloud technologies such as Microsoft Azure for scalable deployments.
  • Implement workflow management tools (Apache Airflow, Prefect, or Dagster); see the sketch after this list.
  • Ensure data accuracy, automation, and observability in data flows.
  • Optimize pipelines and analytics for performance, scalability, and reliability.
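
For the workflow-management bullet above, a minimal Prefect flow (Prefect 2.x assumed; the task bodies and flow name are placeholders) could be sketched as:

    from prefect import flow, task

    @task(retries=2, retry_delay_seconds=60)
    def extract() -> list[dict]:
        # Placeholder: pull records from a source system.
        return [{"id": 1, "amount": 100.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder: apply cleansing and business rules.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to the warehouse (e.g. Fabric, PrestoDB, Oracle).
        print(f"loaded {len(rows)} rows")

    @flow(name="payments-daily-pipeline")  # hypothetical flow name
    def payments_pipeline():
        load(transform(extract()))

    if __name__ == "__main__":
        payments_pipeline()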

Governance & Reporting

  • Participate in strategic and operational meetings, providing technical guidance.
  • Support enterprise-wide transformation initiatives related to data management.
  • Ensure adequate risk mitigation and compliance with regional data regulations.
  • Prepare regular progress and performance reports for management.

Qualifications and Experience

  • Bachelor's degree in Computer Science, Big Data, Database Administration, or a related field.
  • Minimum 3 years' experience in advanced data engineering.
  • Proven experience in on-premises big data and cloud data pipeline (Microsoft Fabric) deployment.
  • Proficiency in SQL, Python, R, and Power BI.
  • Experience with Oracle, Teradata, or Snowflake, and cloud platforms such as AWS, Azure, or GCP.
  • Strong stakeholder management and communication skills.
  • Experience in Telecommunications or Financial Services is an advantage.
  • Willingness and flexibility to travel within Africa.

Skills and Competencies

  • Data pipeline and API development expertise.
  • Knowledge of Machine Learning Operations pipeline deployment.
  • Strong analytical and problem-solving abilities.
  • Agile and digital-first mindset.
  • High attention to detail and commitment to quality.
  • Excellent relationship-building and presentation skills.

Behavioural Qualities

Analytical and Detail-Oriented | Business-Focused | Self-Driven | Results-Oriented | Collaborative | Emotionally Intelligent


Manager, Data Engineering

Roodepoort, Gauteng | R1500000 - R2500000 | Standard Bank

Posted today


Job Description

Job Overview

Business Segment: Insurance & Asset Management

Location: ZA, GP, Roodepoort, 4 Ellis Street

Job Type: Full-time

Job Ref ID: A-0001

Date Posted: 10/6/2025

Job Description

To develop and maintain complete data architecture across several application platforms and provide data capability across those platforms. To design, build, operationalise, secure and monitor data pipelines and data stores in line with the applicable architecture, solution designs, standards, policies and governance requirements, making data accessible for evaluation and optimisation for downstream use-case consumption. To execute data engineering duties according to standards, frameworks, and roadmaps.

Qualifications

Type of Qualification: First Degree

Field of Study: Business Commerce, Information Studies, or Information Technology

Experience Required: Software Engineering / Technology

  • 5-7 years: Experience in building databases, warehouses, reporting and data integration solutions. Experience building and optimising big data pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • 8-10 years: Deep understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product and transactional information. Knowledge of integration patterns, styles, protocols and systems theory.
  • 8-10 years: Experience in database programming languages including SQL, PL/SQL, Spark and/or appropriate data tooling. Experience with data pipeline and workflow management tools.

Additional Information

Behavioural Competencies:

Adopting Practical Approaches

Articulating Information

Checking Things

Developing Expertise

Documenting Facts

Embracing Change

Examining Information

Interpreting Data

Managing Tasks

Producing Output

Taking Action

Team Working

Technical Competencies:

Big Data Frameworks and Tools

Data Engineering

Data Integrity

Data Quality

IT Knowledge

Stakeholder Management (IT)

Please note: All our recruitment processes comply with the applicable local laws and regulations. We will never ask for money or any form of payment as part of our recruitment process. If you experience this, please contact our Fraud line.


Data Engineering Technical Lead

R1200000 - R2400000 | PwC South Africa

Posted today


Job Description

Position Overview:

The Technical Lead is responsible for the development and delivery of cloud technical data solutions that support enterprise data management and data usage requirements. The Lead ensures that all technical data services and solutions are designed and implemented according to best practices and support the firm's regulatory and business-enablement requirements. The cloud data solutions should enable robust use of Artificial Intelligence and Machine Learning. The Technical Lead coordinates delivery of technical solutions across Data Engineering and Data Infrastructure services and engages with stakeholders to ensure that deliverables meet the business strategy and requirements within the agreed timelines.

Our Africa Tech and Digital team is a dynamic community of self-starters and innovators who embrace change and the constant evolution of technology. We are committed to fostering an environment where curiosity thrives and creativity is celebrated. By leveraging AI-assisted tools and respecting diverse perspectives, we drive collaborative success while enhancing our collective knowledge and skills. We believe in the power of technology to transform and improve our processes, always seeking to challenge the status quo. Together, we work with urgency and empathy, united in our mission to create impactful solutions that benefit our firm and our clients.

Key Responsibilities:

Technical Data Strategy:

  • Implement technology that aligns with the business strategy and fulfills business and network requirements for data services.
  • Engage and collaborate with internal and LoS technical teams to ensure alignment, awareness, and adoption of technical data solutions and services in support of business requirements.
  • Forecast technology trends and proactively identify technology pivots and roadmap changes. Evaluate new approaches, technologies, and services to ensure that data solutions and services meet changing business demand and enable innovative solutions that can be implemented with agility, urgency, and agency.

Data Modelling:

  • Implement data solutions that are in line with data modelling best practices.
  • Provide technical data services that meet the ongoing requirements of the business in a cost-effective and efficient manner.
  • Implement data solutions and services that promote ethical and responsible use of data within the firm.
  • Engage with local and global SMEs and teams to ensure that technical data solutions and services adhere to standards, assist the firm in meeting regulatory requirements, and align with the network roadmap.

Technical Standards:

  • Develop technical standards for each pillar of the Software Development Lifecycle (SDLC) in collaboration with other SMEs across AfricaTech.
  • Collaborate with, and assist, internal and LoS teams in understanding and implementing the firm's technical standards requirements, and advise on best practices.

Solutions Design:

  • Collaborate in product design sessions to provide input on technical data requirements and approaches that support the data requirements of the desired business product outcomes.
  • Assist with defining technical delivery plans that will meet the requirements for data-driven solutions.
  • Evaluate the delivery of data solutions to ensure that they support strategic delivery, meet business requirements, and adhere to technical standards.

Operational Management:

  • Act as a mentor to technical staff within the team.
  • Assist with the development of technical training curriculums.
  • Provide technical support and input into data communities of interest and practice.
  • Assist with internal and LoS data training programmes.

Continuous Improvement:

  • Identify opportunities to enhance data quality processes, tools, and techniques, leveraging innovative approaches to improve overall data governance and quality strategies.
  • Contribute to ongoing assessments of the effectiveness of data quality practices, proposing adjustments based on industry trends and organizational needs.
  • Drive the optimisation and automation of redundant and obsolete processes.
  • Master and champion the use of AI tools in the delivery of Data Governance and Quality processes.

Qualifications and Skills:

Education:

  • Any Bachelor's degree in Computer Science or equivalent relevant work experience.
  • Certifications in Microsoft Databricks.
  • Experience: Minimum of years of experience in cloud data engineering or related roles within a complex organizational environment.

Technical Skills:

  • Experience in designing and managing data warehouses.
  • Experience in ensuring data security and compliance with regulations, including encryption and privacy laws.
  • Data quality analysis skills.
  • Strong analytical and problem-solving abilities to interpret complex data and translate business requirements into technical specifications.
  • Proficiency in data mining, predictive modelling, and working with large datasets to uncover insights.
  • Designing and implementing enterprise data management frameworks.
  • Knowledge of data governance and data quality principles.
  • Large-scale technical data project implementation experience.
  • Ability to formulate and drive a technical data strategy and budget.
  • Ability to mentor and develop technical data teams.
  • Ability to facilitate broader staff training.
  • Ability to explain technical concepts at all levels.
  • Stakeholder engagement and strong communication skills at all levels.
  • Data modelling and identification and linking of Golden Sources
  • Metadata Management
  • Technical skills including:
  • Azure cloud data services architecture (Synapse, Purview).
  • Understanding of API development (REST, SOAP, APIM).
  • DevOps experience (task management, code deployment, pipeline management).
  • Terraform experience.
  • SQL experience.
  • Python experience.
  • Power BI experience.
  • Alteryx experience.
  • Databricks experience (ideal).
  • MS Fabric experience (ideal).
  • Data modeling experience.
  • Use of AI and ML within Databricks and Synapse

Soft Skills:

  • Excellent communication and interpersonal skills, with the ability to engage effectively with diverse stakeholders.
  • Analytical and detail-oriented mindset, with a proactive approach to problem solving and decision making.

At PwC, our purpose is to build trust in society and solve important problems. As we navigate an increasingly complex world, we are dedicated to ensuring that the systems on which communities and economies depend can adapt and thrive. Each role within our organization contributes to this mission, reinforcing our commitment to high ethical standards and the importance of trust.

Our five core values guide our actions and define who we are. They emphasize building trust through professionalism, ethical behavior, and a commitment to quality in all our interactions, whether with clients, colleagues, or the broader community. We respect privacy and confidentiality, and we strive for transparency in our operations, demonstrating care and integrity in our relationships.

As part of our human-led, tech-powered approach, we empower our people through technology and foster an environment where speaking up is encouraged, and diverse perspectives are celebrated. By embodying these principles in our daily work, we can collectively drive impactful outcomes that resonate with our clients and society as a whole. Together, let's embrace our purpose and values to create meaningful change.

The Data Engineering Technical Lead plays a critical role in establishing data governance frameworks that build trust in our data, open revenue streams for data monetization, and uphold the quality and integrity of the organization's data assets. Your expertise in data governance and management practices, and your commitment to fostering a culture of accountability, will significantly enhance the effectiveness of data governance initiatives within PwC Africa. You need to be passionate about data governance, and PwC's purpose needs to resonate with you.


Data Engineering Practice Lead

R1200000 - R2400000 | NTT DATA, Inc.

Posted today


Job Description

Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

The Director, Data Engineering is a leadership role accountable for operationally managing a Data Engineering team.

The Data Engineering Practice Lead will lead the design and development of the enterprise data platform and pipelines.

This role is further responsible for overseeing SAP data migrations, ensuring seamless integration and transformation of legacy systems into modern cloud-based architectures.

The person will drive innovation, scalability, and governance in data engineering practices while managing a high-performing team.

Key Responsibilities:

  • Manages a team of Managers and Data Engineers to ensure achievement of team objectives/KPIs and, through continuous coaching and mentoring, helps the team understand and apply data to solve business problems.
  • Develop the new Data Platform - Execute the data engineering strategy per the selected platform.
  • SAP Data Migration - Lead end-to-end SAP data migration projects, including planning, mapping, transformation, and validation.
  • Engineering Team Leadership - Build and manage a team of data engineers and technical leads. Foster a culture of innovation, collaboration, and continuous improvement.
  • Data Pipeline Development - Oversee development of robust ETL/ELT pipelines using Azure Data Factory, Databricks, Snowflake Streams/Tasks, and SAP tools. Ensure pipelines are optimized for performance, reliability, and cost-efficiency.
  • Data Management - partner with data management practice lead and governance lead to ensure adherence to data quality, lineage, and security standards. Implement monitoring and alerting for data pipeline health and SLA compliance.
  • Stakeholder Engagement - Collaborate with business units and the internal analytics teams and IT to align data engineering efforts with strategic goals. Translate business requirements into scalable technical solutions.
  • Ensure industry-standard best practices are applied to development activities, and continuously ensure the team remains abreast of the latest technology and tools in the area of data engineering.
  • Work across multiple areas and data sets to ensure solutions are successfully executed within agreed-upon time frames.

Knowledge and Attributes:

  • Excellent ability to thrive in a dynamic, fast-paced environment.
  • Strong quantitative and qualitative analysis skills.
  • Strong desire to acquire more knowledge to keep up to speed with the ever-evolving field of data science.
  • Strong curiosity to sift through data to find answers and more insights.
  • Excellent understanding of the information technology industry within a matrixed organization and the typical business problems such organizations face.
  • Excellent ability to translate technical findings clearly and fluently to non-technical team business stakeholders to enable informed decision-making.
  • Excellent ability to create a storyline around the data to make it easy to interpret and understand.
  • Self-driven and able to work independently yet acts as a team player.
  • Excellent ability to apply data science principles through a business lens.
  • Strong desire to create strategies and solutions that challenge and expand the thinking of peers and senior leadership teams.
  • Excellent ability to think strategically about how to use data to drive competitive advantages.
  • Excellent management and leadership skills.

Academic Qualifications and Certifications:

  • Bachelor's degree or equivalent in Data Science, Business Analytics, Mathematics, Economics, Engineering, Computer Science or a related field.
  • Relevant programming certification preferred.
  • Agile certification preferred.

Required experience:

  • Significant demonstrable experience with one or more programming languages, statistical packages and database languages.
  • Significant demonstrable experience with data warehouse and data lake technical architectures, infrastructure components, ETL/ ELT, and reporting/analytic tools.
  • Significant demonstrable experience applying statistical methods to solve business problems.
  • Significant demonstrated experience with distributed data platform tools such as MS Azure, SAP, Snowflake, Spark, MySQL, etc.
  • Significant demonstrated experience in working in micro-services architecture working with APIs development.
  • Significant demonstrable experience of full-stack software development with prolific coding abilities.
  • Significant demonstrated experience with Agile Development Methodologies and Test-Driven Development.
  • Solid line-management experience leading and managing Data Engineering teams

Workplace type:
Hybrid Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
