777 Data Analytics jobs in South Africa

Data Analytics

Midrand, Gauteng · R144,000 - R240,000 per year · Deloitte

Posted today

Job Description

Company Description

At Deloitte, our Purpose is to make an impact that matters for our clients, our people, and society. This is the lens through which our global strategy is set. It unites Deloitte professionals across geographies, businesses, and skills. It makes us better at what we do and how we do it. It enables us to deliver on our promises to stakeholders, while creating the lasting impact we seek.

Harnessing the talent of 450,000+ people located across more than 150 countries and territories, our size and scale puts us in a unique position to help change the world for the better—by bringing together the services we provide, the societal investments we make, and the collaborations we advance through our ecosystems.

Deloitte offers career opportunities across Audit & Assurance (A&A), Tax & Legal (T&L) and our Consulting services business, which is made up of Strategy, Risk & Transactions Advisory (SR&T) and Technology & Transformation (T&T).

Are you ready to apply your knowledge and background to exciting new challenges? From learning to leadership, this is your chance to take your career to the next level.

About the Division

Our Audit & Assurance services go beyond merely meeting statutory requirements. We help our clients perform better and achieve their business objectives. We listen to their needs, think about the business implications, and tailor our approach accordingly.

Job Description

As an Associate Director, your main purpose is to support the Business Area Leader by driving and implementing strategy, revenue generation, and business growth, specifically in the area of Data Management.

Technical Competencies

  • Expertise in data management with excellent industry and business knowledge
  • Strong business acumen
  • Demonstrated leadership skills
  • Sales and negotiation skills
  • Ability to manage and execute projects
  • Demonstrated execution of complex projects to profitable outcomes
  • Skilled in drafting and presenting client proposals

Across multiple areas including:

  • Data Governance
  • Master Data Management
  • Metadata Management
  • Data Privacy
  • Data Architecture
  • Data Modelling
  • Agile programme delivery

Behavioural Competencies

  • Exceptional communication skills, both written and verbal
  • Able to deliver multiple engagements on time and within budget
  • Proven ability to make decisions and the right judgement calls in complex projects and situations
  • Creates a culture of trust, ownership and accountability across teams and projects
  • Provides on-the-job coaching for managers and professional staff and takes accountability for multiple large engagements
  • Manages large or multiple engagement deadlines holistically, identifying risks and escalating where necessary
  • Drives continuous improvement
  • Custodian of the business, shaping offerings that we need to proactively take to the market

Qualifications

Minimum Qualifications

  • BSc Computer Science, BSc Honours Computer Science, BSc Maths/Statistics, BEng (all disciplines), or BCom (Informatics preferred); or, for direct entry, proven experience and any relevant qualifications

Desired Qualifications

  • Certifications in analytics practices or technologies, advanced degrees in data management/data analytics, or advanced business degrees such as an MBA or MBL are advantageous

Minimum Experience

  • 12 years working experience
  • Analytics experience leading delivery and selling solutions across the spectrum (data management, data engineering, machine learning, etc.) to address business requirements, with a proven track record of successful projects in the data management space
  • Experience running a team, group, or market offering, including P&L responsibility
  • Experience leading executive-level meetings, discussions, and best-practice development
  • Experience leading sales activities (RFPs, orals, etc.)
  • Board and executive-level experience

Desired Experience

  • 12 years' relevant experience in a client-facing role, 6 of these in a senior management/leadership role; business development / market-making experience

Additional Information

At Deloitte, we want everyone to feel they can be themselves and to thrive at work—in every country, in everything we do, every day. We aim to create a workplace where everyone is treated fairly and with respect, including reasonable accommodation for persons with disabilities. We seek to create and leverage our diverse workforce to build an inclusive environment across the African continent.

Note: The list of tasks / duties and responsibilities contained in this document is not necessarily exhaustive. Deloitte may ask the employee to carry out additional duties or responsibilities, which may fall reasonably within the ambit of the role profile, depending on operational requirements.

Be careful of Recruitment Scams: Fraudsters or employment scammers often pose as legitimate recruiters, employers, recruitment consultants or job placement firms, advertising false job opportunities through email, text messages and WhatsApp messages. They aim to cheat jobseekers out of money or to steal personal information.

To help you look out for potential recruitment scams, here are some Red Flags:

  • Upfront Payment Requests: Deloitte will never ask for any upfront payment for background checks, job training, or supplies.
  • Requests for Personal Information: Be wary if you are asked for sensitive personal information, especially early in the recruitment process and without a clear need for it. Fraudulent links or contractual documents may require the provision of sensitive personal data or copy documents (e.g., government-issued numbers or identity documents, passports or passport numbers, bank account statements or numbers, parents' data) that may be used for identity fraud. Do not provide or send any of these documents or data. Please note we will never ask for photographs at any stage of the recruitment process.
  • Unprofessional Communication: Scammers may communicate in an unprofessional manner. Their messages may be filled with poor grammar and spelling errors. The look and feel may not be consistent with the Deloitte corporate brand.

If you're unsure, make direct contact with Deloitte using our official contact details. Be careful not to use any contact details provided in the suspicious job advertisement or email.

This advertiser has chosen not to accept applicants from your region.

Data Analytics

R90,000 - R120,000 per year · Scatec ASA

Posted today

Job Description

Want to join a frontrunner in renewable energy that is actively seeking early entry into new markets globally? Since its establishment in 2007, Scatec has acquired extensive knowledge and experience in developing, building and operating solar, wind and hydro power plants and storage solutions. Driven by our company values and competent global workforce, we aim to deliver competitive and sustainable renewable energy globally; protect our environment and improve quality of life through innovative integration of technology; and create shareholder value. We are present on four continents and are headquartered in Oslo, Norway.

Main purpose of position

Currently we are looking for a Data Analytics & AI Lead in Cape Town, South Africa to be part of our global team working together towards our vision: Improving our future. As our Data Analytics & AI Lead you will lead the design and expansion of the Data & Analytics Platform through hands-on work in Azure Databricks, work closely with global teams and Digital & IT to align data needs and integrate solutions, drive adoption of data-driven workflows in partnership with business stakeholders, and develop AI use cases that support strategic decisions.

Main Responsibilities

  • Lead the development and implementation of data analytics and AI strategies to drive business growth and innovation.
  • Oversee the design and execution of data-driven projects, ensuring alignment with organizational goals and objectives.
  • Guide the platform's evolution by adopting Modern Data Architecture practices, including Infrastructure as Code (IaC) and Azure DevOps for agile development and operations (DataOps).
  • Collaborate with cross-functional teams to integrate data analytics and AI solutions into existing systems and processes, ensuring seamless adoption and scalability.
  • Stay abreast of industry trends and advancements in AI and data analytics, incorporating relevant insights into strategic planning and execution.
  • Provide guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
  • Work closely with the Head of Enterprise Architecture to ensure alignment and integration of the Data & Analytics Platform within the broader enterprise architecture landscape.
  • Ensure compliance with data privacy and security regulations, maintaining the integrity and confidentiality of sensitive information.

Qualifications And Competencies

  • 8+ years of relevant experience in data architecture, data management and data modelling
  • Extensive experience in data architecture, particularly within a modern data environment utilizing Azure Databricks, dbt, Terraform, and Azure DevOps.
  • Significant hands-on experience in data engineering, including but not limited to the development and optimization of data pipelines, data modelling, and data lakehouse.
  • Proven track record in leading the development and expansion of large-scale data platforms, preferably within a Modern Data Architecture deployment.
  • Strong familiarity with cloud-based data solutions, particularly within the Azure ecosystem.
  • Experience leading and managing cross-functional teams, including business analysts, data analysts, and data engineers.
  • Interest in data analytics trends and advancements, e.g. AI, Industry 4.0
  • Expertise in SQL and Python for advanced data manipulation, modelling, analysis, and processing within a Data Lakehouse context (a brief sketch follows this list)
  • Familiarity with modern data transformation tools such as dbt
  • Strong experience with Azure cloud data services, particularly those relevant to Data Lakehouse architecture such as Azure Data Lake Storage, Azure Databricks, and related technologies.
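
As a loose illustration of the lakehouse skills listed above, here is a minimal PySpark sketch: landing records in a Delta table and deriving a small curated model from it. It assumes a Databricks-style environment where Delta Lake is available, and the table and column names (plant_id, output_mwh) are invented for the example, not Scatec's actual schema.

    from pyspark.sql import SparkSession, functions as F

    # Assumes a Databricks-style runtime with Delta Lake support available.
    spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

    # Illustrative plant telemetry records; the schema is an assumption.
    readings = spark.createDataFrame(
        [("plant-01", "2024-06-01", 412.5), ("plant-02", "2024-06-01", 389.0)],
        ["plant_id", "reading_date", "output_mwh"],
    )

    # Land the raw data in a Delta table, the storage layer of a lakehouse.
    readings.write.format("delta").mode("append").saveAsTable("raw_plant_readings")

    # Derive a simple curated model: total daily output per plant.
    daily = (
        spark.table("raw_plant_readings")
        .groupBy("plant_id", "reading_date")
        .agg(F.sum("output_mwh").alias("total_output_mwh"))
    )
    daily.write.format("delta").mode("overwrite").saveAsTable("daily_plant_output")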

Personal characteristics
It is part of every employee's terms of reference to contribute to Scatec's vision, Improving our Future, and to adhere to our company values, which are:

  • Predictable: demonstrates clear communication and listening skills, and shares information in an open and honest way
  • Driving results: demonstrates determination and pro-activeness, and can prioritise and work independently
  • Changemaker: demonstrates entrepreneurship, can challenge, is a fast learner, takes initiative, and adjusts
  • Working together: demonstrates teamwork, shares responsibilities, can compromise, and has a can-do attitude

For this particular role, we also expect:

  • Demonstrated ability to implement and manage data solutions that align with the principles of Modern Data architecture, specifically within the framework of a Data Lakehouse model.
  • Strong data modelling capability: conceptual modelling, dimensional modelling, and knowledge graphs
  • Fluent (both written and spoken) in English

We offer
Scatec is an exciting, innovative and ambitious company operating in a growing industry. We offer a challenging and interesting position where you will be part of a flexible, diverse and truly international working environment consisting of highly competent and committed colleagues with a positive drive to make a difference.

Scatec is an equal opportunity employer and values diversity. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, disability status, protected veteran status, or any other basis protected by appropriate law. All hiring decisions are made based on merit, competence and business need.

Applications will be processed on a continuous basis.

This advertiser has chosen not to accept applicants from your region.

Data Analytics Engineer

Gauteng, Gauteng StructureIt Ltd

Posted 16 days ago

Job Description

Work from home

Main focus of the role

We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation for a telecoms analytics platform we are taking to market. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.

Your primary focus will be to gather, clarify, and deeply understand business and reporting requirements—often expressed in SQL, Excel, or ad hoc analysis. You will work closely with business stakeholders, analysts, and the data architecture team to translate these requirements into clear, actionable technical specifications.
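
To illustrate what that translation step can look like, here is a hypothetical sketch: a reporting requirement handed over as SQL, re-expressed as a PySpark pipeline step. The table, column names, and paths are invented for the example and are not from any actual client environment.

    from pyspark.sql import SparkSession, functions as F

    # The requirement might arrive from a stakeholder as SQL, e.g.:
    #   SELECT region, SUM(revenue) AS total_revenue FROM billing GROUP BY region;
    spark = SparkSession.builder.appName("requirement-translation").getOrCreate()
    billing = spark.read.parquet("/data/billing/")  # hypothetical source path

    # The same logic expressed as a DataFrame transformation in a pipeline step.
    revenue_by_region = (
        billing.groupBy("region")
        .agg(F.sum("revenue").alias("total_revenue"))
    )
    revenue_by_region.write.mode("overwrite").parquet("/data/reports/revenue_by_region/")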

Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented—typically in PySpark/SparkSQL-based data pipelines—by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business.

This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.

What you’ll do

  • Customer Engagement & Requirements Gathering
    • Engage directly with customers to understand their business needs and data environments.
    • Elicit, document, and validate functional and non-functional requirements.
    • Conduct workshops and interviews with client stakeholders to capture use cases and success criteria.
  • Data Analysis & Integration Design
    • Analyze complex customer data sets, schemas, and data flows to assess integration needs.
    • Collaborate with data engineers and developers to design effective data ingestion, transformation, and mapping processes.
    • Validate data quality, completeness, and alignment with business requirements.
  • Technical Collaboration & Delivery
    • Support the development of scalable data integration and transformation pipelines by assisting with coding, testing, and implementing solutions—primarily using PySpark, SparkSQL, and Python.
    • Translate business and analytical requirements into clear, actionable technical specifications to guide the engineering team’s implementation.
    • Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
    • Help automate and maintain data validation and quality checks to ensure the reliability and accuracy of analytics outputs (see the sketch after this list).
    • Collaborate closely with the FPA team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate that outputs support business decision-making.
    • Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
    • Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
  • Product & Analytics Alignment
    • Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
    • Provide feedback on product improvements based on client needs and data insights.
    • Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.
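
To make the validation and quality-check work above concrete, here is a minimal sketch of an automated check of the kind described, assuming a PySpark environment; the dataset path and column names (account_id, usage_mb, usage_date) are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("quality-checks").getOrCreate()

    # Load a source extract; the path and schema are illustrative assumptions.
    usage = spark.read.parquet("/data/telecom/usage/")

    # 1. Key completeness: no null account identifiers.
    null_keys = usage.filter(F.col("account_id").isNull()).count()

    # 2. Range check: usage volumes should never be negative.
    negative_usage = usage.filter(F.col("usage_mb") < 0).count()

    # 3. Uniqueness: at most one row per account per day.
    duplicates = (
        usage.groupBy("account_id", "usage_date").count()
        .filter(F.col("count") > 1)
        .count()
    )

    # Fail the run loudly if any check is violated, so bad data never
    # reaches downstream analytics outputs.
    for name, violations in [("null account_id", null_keys),
                             ("negative usage_mb", negative_usage),
                             ("duplicate account/day rows", duplicates)]:
        if violations:
            raise ValueError(f"Data quality check failed: {violations} rows with {name}")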

To fly in your role, you’ll need

  • 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, and SQL.
  • Proven experience delivering data integration projects (ETL, data pipelines, APIs) and building and supporting such pipelines at scale.
  • Ability to translate complex technical concepts into business language and vice versa.
  • Strong experience with relational databases and data modelling; experience with modern analytics platforms (Databricks, Delta Lake, cloud storage) is a plus.
  • Track record of translating business logic and requirements into production-grade, testable code.
  • Experience working with data analytics tools and platforms (e.g., Power BI, Superset, Tableau, Looker, Grafana).
  • Solid grasp of data quality, data validation, and monitoring concepts.
  • Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
  • Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.

Nice to have

  • Experience in the telecoms industry.
  • Familiarity with the software development lifecycle and Agile methodologies.
  • Infrastructure-as-code experience (Terraform and Pulumi).
  • Experience working in a scale-up/dynamic consulting environment.
  • Exposure to accounting concepts.

Benefits

  • Working
    • Fully remote working, or from the office with daily lunch
    • Flexible working hours
    • High-spec Dell laptop
    • Money towards a keyboard of your choice, that is yours to keep
  • Insurance - fully paid on top of your salary, not out of it
    • Medical Aid, including Gap Cover
    • Life Insurance, with Disability Insurance and Funeral cover
  • Learning
    • Learning Budget - Books or Courses - you choose how to use it
  • Culture
    • People-first culture that encourages work/life balance
    • Everyone has a voice, regardless of title
    • Psychological safety
  • Leave
    • 20 days annual leave
    • Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
  • Long-Term Loyalty Benefits
    • 2 years - monthly budget towards a cell phone contract, Uber travel vouchers, or petrol card
    • 3 years – can apply for a study bursary to enhance your current and future role with us
    • 5 years - 3 additional days of annual leave
    • 7 years – a local weekend away at our cost
    • 10 years - 3-month paid sabbatical

Interested?
Send your CV to

This advertiser has chosen not to accept applicants from your region.

Data Analytics Engineer

Eastern Cape, Eastern Cape StructureIt Ltd

Posted 16 days ago

Job Description

Work from home

Main focus of the role

We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation for a telecoms analytics platform we are taking to market. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.

Your primary focus will be to gather, clarify, and deeply understand business and reporting requirements—often expressed in SQL, Excel, or ad hoc analysis. You will work closely with business stakeholders, analysts, and the data architecture team to translate these requirements into clear, actionable technical specifications.

Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented—typically in PySpark/SparkSQL-based data pipelines—by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business.

This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.

What you’ll do

  • Customer Engagement & Requirements Gathering
    • Engage directly with customers to understand their business needs and data environments.
    • Elicit, document, and validate functional and non-functional requirements.
    • Conduct workshops and interviews with client stakeholders to capture use cases and success criteria.
  • Data Analysis & Integration Design
    • Analyze complex customer data sets, schemas, and data flows to assess integration needs.
    • Collaborate with data engineers and developers to design effective data ingestion, transformation, and mapping processes.
    • Validate data quality, completeness, and alignment with business requirements.
  • Technical Collaboration & Delivery
    • Support the development of scalable data integration and transformation pipelines by assisting with coding, testing, and implementing solutions—primarily using PySpark, SparkSQL, and Python.
    • Translate business and analytical requirements into clear, actionable technical specifications to guide the engineering team’s implementation.
    • Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
    • Help automate and maintain data validation and quality checks to ensure reliability and accuracy of analytics outputs.
    • Collaborate closely with the FPA team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate that outputs support business decision-making.
    • Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
    • Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
  • Product & Analytics Alignment
    • Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
    • Provide feedback on product improvements based on client needs and data insights.
    • Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.

To fly in your role, you’ll need

  • 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, and SQL.
  • Proven experience delivering data integration projects (ETL, data pipelines, APIs) and building and supporting such pipelines at scale.
  • Ability to translate complex technical concepts into business language and vice versa.
  • Strong experience with relational databases and data modelling; experience with modern analytics platforms (Databricks, Delta Lake, cloud storage) is a plus.
  • Track record of translating business logic and requirements into production-grade, testable code.
  • Experience working with data analytics tools and platforms (e.g., Power BI, Superset, Tableau, Looker, Grafana).
  • Solid grasp of data quality, data validation, and monitoring concepts.
  • Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
  • Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.

Nice to have

  • Experience in the telecoms industry.
  • Familiarity with the software development lifecycle and Agile methodologies.
  • Infrastructure-as-code experience (Terraform and Pulumi).
  • Experience working in a scale-up/dynamic consulting environment.
  • Exposure to accounting concepts.

Benefits

  • Working
    • Fully remote working, or from the office with daily lunch
    • Flexible working hours
    • High-spec Dell laptop
    • Money towards a keyboard of your choice, that is yours to keep
  • Insurance - fully paid on top of your salary, not out of it
    • Medical Aid, including Gap Cover
    • Life Insurance, with Disability Insurance and Funeral cover
  • Learning
    • Learning Budget - Books or Courses - you choose how to use it
  • Culture
    • People-first culture that encourages work/life balance
    • Everyone has a voice, regardless of title
    • Psychological safety
  • Leave
    • 20 days annual leave
    • Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
  • Long-Term Loyalty Benefits
    • 2 years - monthly budget towards a cell phone contract, Uber travel vouchers, or petrol card
    • 3 years – can apply for a study bursary to enhance your current and future role with us
    • 5 years - 3 additional days of annual leave
    • 7 years – a local weekend away at our cost
    • 10 years - 3-month paid sabbatical

Interested?
Send your CV to

This advertiser has chosen not to accept applicants from your region.

Data Analytics Engineer

North West, North West StructureIt Ltd

Posted 16 days ago

Job Description

Work from home

Main focus of the role

We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation for a telecoms analytics platform we are taking to market. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.

Your primary focus will be to gather, clarify, and deeply understand business and reporting requirements—often expressed in SQL, Excel, or ad hoc analysis. You will work closely with business stakeholders, analysts, and the data architecture team to translate these requirements into clear, actionable technical specifications.

Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented—typically in PySpark/SparkSQL-based data pipelines—by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business.

This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.

What you’ll do

  • Customer Engagement & Requirements Gathering
    • Engage directly with customers to understand their business needs and data environments.
    • Elicit, document, and validate functional and non-functional requirements.
    • Conduct workshops and interviews with client stakeholders to capture use cases and success criteria.
  • Data Analysis & Integration Design
    • Analyze complex customer data sets, schemas, and data flows to assess integration needs.
    • Collaborate with data engineers and developers to design effective data ingestion, transformation, and mapping processes.
    • Validate data quality, completeness, and alignment with business requirements.
  • Technical Collaboration & Delivery
    • Support the development of scalable data integration and transformation pipelines by assisting with coding, testing, and implementing solutions—primarily using PySpark, SparkSQL, and Python.
    • Translate business and analytical requirements into clear, actionable technical specifications to guide the engineering team’s implementation.
    • Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
    • Help automate and maintain data validation and quality checks to ensure reliability and accuracy of analytics outputs.
    • Collaborate closely with the FPA team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate that outputs support business decision-making.
    • Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
    • Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
  • Product & Analytics Alignment
    • Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
    • Provide feedback on product improvements based on client needs and data insights.
    • Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.

To fly in your role, you’ll need

  • 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, and SQL.
  • Proven experience delivering data integration projects (ETL, data pipelines, APIs) and building and supporting such pipelines at scale.
  • Ability to translate complex technical concepts into business language and vice versa.
  • Strong experience with relational databases and data modelling; experience with modern analytics platforms (Databricks, Delta Lake, cloud storage) is a plus.
  • Track record of translating business logic and requirements into production-grade, testable code.
  • Experience working with data analytics tools and platforms (e.g., Power BI, Superset, Tableau, Looker, Grafana).
  • Solid grasp of data quality, data validation, and monitoring concepts.
  • Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
  • Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.

Nice to have

  • Experience in the telecoms industry.
  • Familiarity with the software development lifecycle and Agile methodologies.
  • Infrastructure-as-code experience (Terraform and Pulumi).
  • Experience working in a scale-up/dynamic consulting environment.
  • Exposure to accounting concepts.

Benefits

  • Working
    • Fully remote working, or from the office with daily lunch
    • Flexible working hours
    • High-spec Dell laptop
    • Money towards a keyboard of your choice, that is yours to keep
  • Insurance - fully paid on top of your salary, not out of it
    • Medical Aid, including Gap Cover
    • Life Insurance, with Disability Insurance and Funeral cover
  • Learning
    • Learning Budget - Books or Courses - you choose how to use it
  • Culture
    • People-first culture that encourages work/life balance
    • Everyone has a voice, regardless of title
    • Psychological safety
  • Leave
    • 20 days annual leave
    • Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
  • Long-Term Loyalty Benefits
    • 2 years - monthly budget towards a cell phone contract, Uber travel vouchers, or petrol card
    • 3 years – can apply for a study bursary to enhance your current and future role with us
    • 5 years - 3 additional days of annual leave
    • 7 years – a local weekend away at our cost
    • 10 years - 3-month paid sabbatical

Interested?
Send your CV to

This advertiser has chosen not to accept applicants from your region.

Data Analytics Engineer

Gauteng, Gauteng StructureIt Ltd

Posted 16 days ago

Job Viewed

Tap Again To Close

Job Description

workfromhome

Main focus of the role

We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation for a telecoms analytics platform we are taking to market. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.

Your primary focus will be to gather, clarify, and deeply understand business and reporting requirements—often expressed in SQL, Excel, or ad hoc analysis. You will work closely with business stakeholders, analysts, and the data architecture team to translate these requirements into clear, actionable technical specifications.

Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented—typically in PySpark/SparkSql based data pipelines—by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business

This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.

What you’ll do

  • Customer Engagement & Requirements Gathering
    • Engage directly with customers to understand their business needs and data environments.
    • Elicit, document, and validate functional and non-functional requirements.
    • Conduct workshops and interviews with client stakeholders to capture use cases and success criteria.
  • Data Analysis & Integration Design
    • Analyze complex customer data sets, schemas, and data flows to assess integration needs.
    • Collaborate with data engineers and developers to design effective data ingestion, transformation, and mapping processes.
    • Validate data quality, completeness, and alignment with business requirements.
  • Technical Collaboration & Delivery
    • Support the development of scalable data integration and transformation pipelines by assisting with coding, testing, and implementing solutions—primarily using PySpark, SparkSQL, and Python.
    • Translate business and analytical requirements into clear, actionable technical specifications to guide the engineering team’s implementation.
    • Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
    • Help automate and maintain data validation and quality checks to ensure reliability and accuracy of analytics outputs.
    • Collaborate closely with the FPA team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate outputs support business decision-making.
    • Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
    • Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
  • Product & Analytics Alignment
    • Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
    • Provide feedback on product improvements based on client needs and data insights.
    • Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.

To fly in your role, you’ll need

  • 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, and SQL.Proven experience working on data integration projects (ETL, data pipelines, APIs).
  • Proven ability to build and support data integration pipelines (ETL, dataflows, APIs) at scale.Ability to translate complex technical concepts into business language and vice versa.
  • Strong experience with relational databases and data modelling; experience with modern analytics platforms (Databricks, Delta Lake, cloud storage) is a plus.Experience in Agile/Scrum environments.
  • Track record of translating business logic and requirements into production-grade, testable code.Experience working with data analytics tools and platforms (e.g., Power BI, SuperSet, Tableau, Looker, Grafana).
  • Solid grasp of data quality, data validation, and monitoring concepts.
  • Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
  • Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.

Nice to have

  • Experience in the telecoms industry.
  • Familiarity with the software development lifecycle and Agile methodologies
  • Infrastructure as code experience. (terraform and Pulumi)
  • Experience working in a scale up/dynamic consulting environment.
  • Exposure to accounting concepts.

Benefits

  • Working
    • Fully remote working / or from the office with daily lunch
    • Flexible working hours
    • High-spec Dell laptop
    • Money towards a keyboard of your choice, that is yours to keep
  • Insurance - fully paid on top of, not out of your salary
    • Medical Aid, including Gap Cover
    • Life Insurance, with Disability Insurance and Funeral cover
  • Learning
    • Learning Budget - Books or Courses - you choose how to use it
  • Culture
    • People-first culture that encourages work/life balance
    • Everyone has a voice, regardless of title
    • Psychological safety
  • Leave
    • 20 days annual leave
    • Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
  • Long-Term Loyalty Benefits
    • 2 years - monthly budget towards a cell phone contract, Uber travel vouchers, or a petrol card
    • 3 years - can apply for a study bursary to enhance your current and future role with us
    • 5 years - 3 additional days of annual leave
    • 7 years - a local weekend away at our cost
    • 10 years - a 3-month paid sabbatical

Interested?
Send your CV to


 
