162 AWS Engineer jobs in South Africa
AWS Engineer - Tshwane (Pretoria)
Posted 9 days ago
Job Description
Our client in the IT industry is looking for an AWS Data Engineer. If you meet the requirements below, kindly send us your CV.
Duties & Responsibilities
ESSENTIAL SKILLS REQUIREMENTS:
Above-average experience/understanding of the following (in order of importance):
- Terraform
- Python 3.x
- SQL - Oracle/PostgreSQL
- PySpark (see the sketch after this section)
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- BMW Cloud Data Hub (CDH)
- BMW CDEC Blueprint
Experience developing technical documentation and artefacts.
Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
Experience working with Data Quality Tools such as Great Expectations.
Knowledge of the Agile Working Model.
Any additional responsibilities assigned in the Agile Working Model (AWM) Charter.
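To give a concrete flavour of the stack above, here is a minimal PySpark/Boto3 sketch of a typical task: listing raw Parquet files in S3, reading them, and applying a simple data-quality gate. The bucket, prefix, and column names are illustrative placeholders, not details from the posting.

```python
import boto3
from pyspark.sql import SparkSession, functions as F

# Placeholder locations; real values would come from project configuration.
BUCKET = "example-data-lake"
PREFIX = "raw/vehicle_telemetry/"

spark = SparkSession.builder.appName("telemetry-etl").getOrCreate()

# Boto3 handles control-plane tasks such as checking what has landed.
s3 = boto3.client("s3")
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
print(f"{listing.get('KeyCount', 0)} objects under s3://{BUCKET}/{PREFIX}")

# Spark reads the Parquet data directly from S3 (the s3:// scheme works on
# EMR/Glue; plain Spark clusters typically use s3a:// with hadoop-aws).
df = spark.read.parquet(f"s3://{BUCKET}/{PREFIX}")

# A basic quality gate in the spirit of tools like Great Expectations:
# fail fast if a mandatory column contains nulls.
missing = df.filter(F.col("vehicle_id").isNull()).count()
if missing > 0:
    raise ValueError(f"{missing} rows are missing vehicle_id")

df.write.mode("overwrite").parquet(f"s3://{BUCKET}/curated/vehicle_telemetry/")
```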
ADVANTAGEOUS SKILLS REQUIREMENTS:
Demonstrated expertise in data modelling with Oracle SQL.
Exceptional analytical skills for analysing large, complex data sets.
Perform thorough testing and data validation to ensure the accuracy of data transformations.
Strong written and verbal communication skills, with precise documentation.
Self-driven team player with the ability to work independently and multi-task.
Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
Familiar with data stores such as AWS S3, and AWS RDS or DynamoDB.
Experience and solid understanding of various software design patterns.
Experience preparing specifications from which programs will be written, designed, coded, tested, and debugged.
Strong organizational skills.
Experience developing and working with REST APIs is a bonus.
Basic experience in Networking and troubleshooting network issues.
Basic experience/understanding of AWS Components (in order of importance):
- Glue
- CloudWatch
- SNS
- Athena
- S3
- Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild / CodePipeline
- CloudFormation
- Business Intelligence (BI) Experience
- Technical data modelling and schema design (“not drag and drop”)
- Kafka
- Redshift
Package & Remuneration
Market Related - Monthly
AWS DevOps Engineer
Posted 3 days ago
Job Description
If you’re the kind of engineer who enjoys solving real infrastructure problems – quietly making the complex run smoothly in the background – this might be a great fit for you.
This is a Cape Town-based company doing big work in the payment infrastructure space. Their platform supports some of the world’s biggest brands as they expand into South Africa and beyond. Think high transaction volumes, security at scale, and uptime that can’t flinch. It’s smart, high-stakes engineering, but with a strong focus on doing things properly.
About the role:
They’re looking for an AWS DevOps Engineer who knows how to build for reliability, visibility, and security. You’ll be the one designing scalable AWS infrastructure, building slick CI/CD pipelines, and keeping the monitoring sharp enough to spot trouble before it starts. You’ll also help developers ship faster, without cutting corners.
You’ll report to the VP of Engineering and join a small, highly capable team that likes asking “Why?” and enjoys solving the harder stuff.
Hybrid | Gardens, Cape Town | Permanent.
Salary: Up to R75 000 per month.
Key things you’ll work on:
- Building out infrastructure using Terraform (lots of it)
- Creating and maintaining CI/CD pipelines (GitHub Actions ideally)
- Designing secure cloud environments (AWS preferred)
- Keeping systems observable and monitored (Grafana, Prometheus, etc.; see the sketch after this list)
- Managing incidents and working on continuous improvements
- Supporting dev teams to move faster, safer
- Writing clean documentation and owning your space
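As a hedged illustration of the monitoring and alerting work above (the team's own stack centres on Grafana and Prometheus, so treat this purely as a sketch of the AWS-native equivalent), a CloudWatch alarm defined with boto3 might look like this. The instance ID and SNS topic ARN are placeholders.

```python
import boto3

# Placeholders: instance ID and SNS topic ARN are illustrative only.
cloudwatch = boto3.client("cloudwatch", region_name="af-south-1")

cloudwatch.put_metric_alarm(
    AlarmName="payments-api-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,              # evaluate 5-minute averages
    EvaluationPeriods=2,     # require two consecutive breaches before alarming
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:af-south-1:123456789012:ops-alerts"],
)
```

Defining alarms in code like this keeps alerting reviewable and reproducible, which is the same motivation behind the Terraform-first approach the role describes.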
Who they’re looking for:
Someone who’s curious, capable, and comfortable in their craft. You’re not afraid of complexity—you enjoy making it cleaner. You’re probably the kind of person who reads logs like bedtime stories and gets a quiet thrill from clean deployments.
You should have:
- 3+ years as a DevOps Engineer or similar
- Strong Terraform and Infrastructure-as-Code experience
- Solid AWS experience and cloud-first mindset
- Docker and Kubernetes (Helm is a plus)
- Secure networking and systems thinking
- Bash or Python scripting skills
- Experience with observability and alerting tools
- Incident response experience and good communication chops
Bonus points if you’ve tackled:
- Zero-downtime deployments
- Scaling infra with growing traffic volumes
- Developer tooling to boost speed without risking quality
- Balancing security, speed, and compliance in fintech/payments
What they offer:
- Hybrid work setup (Cape Town-based)
- High autonomy and trust
- Sharp teammates, clean processes, and strong engineering standards
- An environment where ambition is backed with action
AWS Data Engineer
Posted 3 days ago
Job Description
The role of a Data Engineer involves constructing and maintaining data pipelines and datamarts, emphasizing scalability, repeatability, and security.
Data Engineers play a pivotal role in facilitating the acquisition of data from diverse sources, ensuring its conformity to data quality standards, and enabling downstream users to access data promptly. This position is an integral part of an agile team.
Key Responsibilities:
- Architecting the data analytics framework.
- Translating complex functional and technical requirements into detailed architecture, design, and high-performance software.
- Leading the development of data and batch/real-time analytical solutions by leveraging transformative technologies.
- Engaging in multiple projects as a technical lead, overseeing user story analysis, design, software development, testing, and automation tool creation.
Duties (Primary Job Objectives):
- Development and Operations
- Database Development and Operations
- Establishment and Adherence to Policies, Standards, and Procedures
- Communication
- Business Continuity and Disaster Recovery Planning
- Research and Evaluation
- Coaching and Mentoring
Required Skills, Knowledge, and Experience:
- A minimum of 5 years of experience in Data Engineering or Software Engineering.
- Demonstrated leadership experience, managing teams of engineers for 3-5 years.
- A minimum of 2 years of experience in Big Data.
- At least 5 years of experience with Extract, Transform, and Load (ETL) processes.
- A minimum of 2 years of experience with AWS (Amazon Web Services).
- Demonstrated experience with agile or other rapid application development methodologies for at least 2 years (e.g., Agile, Kanban, Scrum).
- 5 years of proven expertise in object-oriented design, coding, testing patterns, and working with commercial or open-source software platforms and large-scale data infrastructures.
- Proficiency in creating data feeds from on-premises systems to the AWS Cloud (2 years; see the sketch after this list).
- Support experience for data feeds in production on a break-fix basis (2 years).
- A minimum of 4 years of experience in creating data marts using Talend or similar ETL development tools.
- Proficiency in data manipulation using Python and PySpark (2 years).
- Experience in processing data using the Hadoop paradigm, particularly with EMR, AWS’s distribution of Hadoop (2 years).
- DevOps experience in Big Data and Business Intelligence, including automated testing and deployment (2 years).
- Extensive knowledge of various programming or scripting languages.
- Expertise in data modeling and an understanding of different data structures and their suitability for specific use cases.
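As a minimal sketch of the on-premises-to-AWS data feed mentioned above, assuming a hypothetical landing bucket and a nightly CSV extract (all names are placeholders):

```python
from pathlib import Path

import boto3

# Hypothetical nightly feed: push an on-premises extract to S3, where
# downstream ETL (Glue/EMR) picks it up. Bucket and paths are placeholders.
s3 = boto3.client("s3")

extract = Path("/data/exports/orders_2024-01-31.csv")
key = f"landing/orders/{extract.name}"

# upload_file transparently switches to multipart uploads for large files.
s3.upload_file(str(extract), "example-ingest-bucket", key)
print(f"Uploaded {extract} to s3://example-ingest-bucket/{key}")
```

A production feed would add retries, checksums, and monitoring for the break-fix support duty the posting mentions.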
Additional Technical Skills Required:
- The ability to design highly scalable distributed systems using various open-source tools.
- Proficiency in both batch and streaming Big Data tools.
- Experience with Talend for at least 1 year.
- Familiarity with AWS services such as EMR, EC2, and S3 for at least 1 year.
- Proficiency in Python for at least 1 year.
- Familiarity with PySpark or Spark (desirable for at least 1 year).
- Experience in Business Intelligence data modeling for 3 years.
- Proficiency in SQL for 3 years.
Qualifications / Certifications:
- A Bachelor’s degree in computer science or computer engineering, or a minimum of 4 years of equivalent work experience.
- AWS Certification, at least at the associate level.
AWS Data Engineer
Posted 3 days ago
Job Description
PBT Group is currently offering an opportunity for an AWS Data Engineer with 2 to 5 years of relevant experience.
The role of a Data Engineer involves constructing and maintaining data pipelines and datamarts, emphasizing scalability, repeatability, and security. Data Engineers play a pivotal role in facilitating the acquisition of data from diverse sources, ensuring its conformity to data quality standards, and enabling downstream users to access data promptly. This position is an integral part of an agile team.
These professionals are entrusted with the responsibility of establishing the infrastructure required to derive insights from raw data, integrating data from various sources seamlessly. They empower solutions by efficiently managing substantial volumes of data, both in batch and real-time, utilizing cutting-edge technologies from the realms of big data and cloud computing. Additional responsibilities encompass the development of proof-of-concepts and the implementation of intricate big data solutions, with a primary focus on collecting, parsing, managing, analyzing, and visualizing extensive datasets. They are adept at employing technologies to resolve challenges associated with handling vast amounts of data in diverse formats, thereby delivering innovative solutions.
Data Engineering is a technically demanding role that necessitates a broad spectrum of expertise in software development and programming. These professionals possess knowledge of data analysis, understand end-user and business requirements, and have the ability to translate these needs into technical solutions. They exhibit a strong grasp of physical database design and the systems development lifecycle. Collaboration within a team environment is essential for success in this role.
Key Responsibilities:
- Architecting the data analytics framework.
- Translating complex functional and technical requirements into detailed architecture, design, and high-performance software.
- Leading the development of data and batch / real-time analytical solutions by leveraging transformative technologies.
- Engaging in multiple projects as a technical lead, overseeing user story analysis, design, software development, testing, and automation tool creation.
Duties (Primary Job Objectives):
- Development and Operations
- Database Development and Operations
- Establishment and Adherence to Policies, Standards, and Procedures
- Communication
- Business Continuity and Disaster Recovery Planning
- Research and Evaluation
- Coaching and Mentoring
Required Skills, Knowledge, and Experience:
- A minimum of 5 years of experience in Data Engineering or Software Engineering.
- Demonstrated leadership experience, managing teams of engineers for 3-5 years.
- A minimum of 2 years of experience in Big Data.
- At least 5 years of experience with Extract, Transform, and Load (ETL) processes.
- A minimum of 2 years of experience with AWS (Amazon Web Services).
- Demonstrated experience with agile or other rapid application development methodologies for at least 2 years (e.g., Agile, Kanban, Scrum).
- 5 years of proven expertise in object-oriented design, coding, testing patterns, and working with commercial or open-source software platforms and large-scale data infrastructures.
- Proficiency in creating data feeds from on-premises systems to the AWS Cloud (2 years).
- Support experience for data feeds in production on a break-fix basis (2 years).
- A minimum of 4 years of experience in creating data marts using Talend or similar ETL development tools.
- Proficiency in data manipulation using Python and PySpark (2 years).
- Experience in processing data using the Hadoop paradigm, particularly with EMR, AWS's distribution of Hadoop (2 years).
- DevOps experience in Big Data and Business Intelligence, including automated testing and deployment (2 years).
- Extensive knowledge of various programming or scripting languages.
- Expertise in data modeling and an understanding of different data structures and their suitability for specific use cases.
Additional Technical Skills Required:
- The ability to design highly scalable distributed systems using various open-source tools.
- Proficiency in both batch and streaming Big Data tools.
- Experience with Talend for at least 1 year.
- Familiarity with AWS services such as EMR, EC2, and S3 for at least 1 year.
- Proficiency in Python for at least 1 year.
- Familiarity with PySpark or Spark (desirable for at least 1 year).
- Experience in Business Intelligence data modeling for 3 years.
- Proficiency in SQL for 3 years.
Qualifications / Certifications:
- A Bachelor's degree in computer science or computer engineering, or a minimum of 4 years of equivalent work experience.
- AWS Certification, at least at the associate level.
AWS Data Engineer
Posted 15 days ago
Job Description
Job Purpose:
Responsible for creating and managing the technological side of the data infrastructure at every step of the data flow. From configuring data sources to integrating analytical tools, all of these systems are architected, built and managed by a general-role Data Engineer.
Minimum education (essential):
Bachelor’s degree in Computer Science or Engineering (or similar)
Minimum education (desirable):
- Honors degree in Computer Science or Engineering (or similar)
- AWS Certified Data Engineer; or
- AWS Certified Solutions Architect; or
- AWS Certified Data Analyst
Minimum applicable experience (years):
5+ years working experience
Required nature of experience:
- Data Engineering development
- Experience with AWS services used for data warehousing, computing and transformations, e.g. AWS Glue (crawlers, jobs, triggers and catalog), AWS S3, AWS Lambda, AWS Step Functions, AWS Athena and AWS CloudWatch
- Experience with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB)
- Experience with SQL for querying and transformation of data
Skills and Knowledge (essential):
- Strong skills in Python (especially PySpark for AWS Glue)
- Strong knowledge of data modelling, schema design and database optimization
- Proficiency with AWS and infrastructure as code
Skills and Knowledge (desirable):
- Knowledge of SQL, Python, and AWS serverless microservices
- Deploying and managing ML models in production
- Version control (Git), unit testing and agile methodologies
Data Architecture and Management (20%)
- Design and maintain scalable data architectures using AWS services for example, but not limited to, AWS S3, AWS Glue and AWS Athena.
- Implement data partitioning and cataloging strategies to enhance data organization and accessibility.
- Work with schema evolution and versioning to ensure data consistency.
- Develop and manage metadata repositories and data dictionaries.
- Assist and support with defining, setting up and maintaining data access roles and privileges.
- Design, develop and optimize scalable ETL pipelines using batch and real-time processing frameworks (AWS Glue and PySpark; see the sketch at the end of this section).
- Implement data extraction, transformation and loading processes from various structured and unstructured sources.
- Optimize ETL jobs for performance, cost efficiency and scalability.
- Develop and integrate APIs to ingest and export data between various source and target systems, ensuring seamless ETL workflows.
- Enable scalable deployment of ML models by integrating data pipelines with ML workflows.
- Automate data workflows and ensure they are fault tolerant and optimized.
- Implement logging, monitoring and alerting for data pipelines.
- Optimize ETL job performance by tuning configurations and analyzing resource usage.
- Optimize data storage solutions for performance, cost and scalability.
- Ensure AWS resources are optimised for scalable data ingestion and output.
- Deploy machine learning models into production using cloud-based services like AWS SageMaker.
- Ensure API security, authentication and access control best practices.
- Implement data encryption, access control and compliance with GDPR, HIPAA, SOC2 etc.
- Establish data governance policies, including access control and security best practices.
- Work closely with data scientists, analysts and business teams to understand data needs.
- Collaborate with backend teams to integrate data pipelines into CI/CD.
- Assist with developmental leadership of the team through coaching, code reviews and mentorship.
- Ensure technological alignment with B2C division strategy supporting overarching strategy and vision.
- Identify and encourage areas for growth and improvement within the team.
- Document data processes, transformations and architectural decisions.
- Maintain high standards of software quality within the team by adhering to good processes, practices and habits, including compliance with the QMS system, and data and system security requirements.
- Ensure compliance with the established processes and standards for the development lifecycle, including but not limited to data archival.
- Drive compliance with the Quality Management System in line with the Quality Objectives, Quality Manual, and all processes related to the design, development and implementation of software related to medical devices.
- Comply with ISO, CE, FDA (and other) standards and requirements as applicable to assigned products.
- Safeguard confidential information and data.
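To make the Glue-based ETL duties above concrete, here is a minimal AWS Glue job script in PySpark. The catalog database, table, field names and output path are hypothetical; a real job would be parameterised according to the team's standards.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog; database and table names are placeholders.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"
)

# A simple cleaning step: drop rows with no primary identifier.
cleaned = source.filter(lambda row: row["event_id"] is not None)

# Write partitioned Parquet back to S3 so Athena can prune by date.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/events/",
        "partitionKeys": ["event_date"],
    },
    format="parquet",
)
job.commit()
```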
AWS Data Engineer
Posted 15 days ago
Job Description
Overview
An exciting opportunity has become available for an experienced AWS Data Engineer. In this role, you'll be responsible for building and managing scalable data systems, from setting up data sources to integrating analytical tools, using AWS services.
Key Responsibilities:
- Data Architecture & Management: Design and maintain data systems using AWS services (e.g., AWS S3, AWS Glue, Athena). Organize data effectively and ensure easy access through data partitioning and cataloging strategies (see the sketch after this list).
- ETL Pipeline Development: Develop and optimize ETL (Extract, Transform, Load) pipelines using AWS Glue and PySpark. Focus on improving performance, scalability, and cost-efficiency in batch and real-time data processing.
- Automation & Monitoring: Automate workflows and ensure they run efficiently. Set up monitoring and alerts for data pipelines, and optimize AWS resources for scalability and performance.
- Security & Compliance: Follow security best practices, including API authentication, data encryption, and compliance with GDPR, HIPAA, and SOC2.
- Collaboration & Mentorship: Work closely with data scientists, analysts, and backend teams to integrate data pipelines. Provide mentorship to junior team members and encourage growth and collaboration.
- Quality Management: Ensure high-quality software development standards by adhering to best practices and complying with industry regulations like ISO, CE, and FDA.
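As a small, hedged sketch of the Athena side of the data-architecture responsibilities above: the database, table, and results bucket below are placeholders, and production code would add backoff and a timeout to the polling loop.

```python
import time

import boto3

athena = boto3.client("athena")

# Run an aggregate query against a partitioned, catalogued table.
response = athena.start_query_execution(
    QueryString="""
        SELECT event_date, COUNT(*) AS events
        FROM example_db.curated_events
        WHERE event_date >= DATE '2024-01-01'
        GROUP BY event_date
    """,
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

print(f"Query {query_id} finished with state {state}")
```

Partitioning the underlying table (here by event_date) is what lets Athena scan only the relevant data, which is the cost and performance point the responsibility list is driving at.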
Key Skills and Experience (5+ Years Required):
- AWS Services: Experience with AWS Glue, S3, Lambda, Athena, and CloudWatch.
- Data Engineering: Proven experience developing and optimizing ETL pipelines. Familiarity with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB).
- Programming: Strong skills in Python, particularly using PySpark with AWS Glue.
- Data Modeling & Optimization: Experience in data modeling, schema design, and database optimization.
- Machine Learning: Experience integrating data pipelines with machine learning workflows and deploying models with AWS SageMaker.
- Compliance & Security: Knowledge of data governance, API security, and compliance with industry standards and regulations.
Education & Certification Requirements:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- AWS certifications, such as AWS Certified Data Engineer, Solutions Architect, or Data Analyst, are highly desirable.
AWS Data Engineer
Posted 15 days ago
Job Description
PBT Group is currently offering an opportunity for a Senior AWS Data Engineer.
The role of a Data Engineer involves constructing and maintaining data pipelines and datamarts, emphasizing scalability, repeatability, and security. Data Engineers play a pivotal role in facilitating the acquisition of data from diverse sources, ensuring its conformity to data quality standards, and enabling downstream users to access data promptly. This position is an integral part of an agile team.
These professionals are entrusted with the responsibility of establishing the infrastructure required to derive insights from raw data, integrating data from various sources seamlessly. They empower solutions by efficiently managing substantial volumes of data, both in batch and real-time, utilizing cutting-edge technologies from the realms of big data and cloud computing. Additional responsibilities encompass the development of proof-of-concepts and the implementation of intricate big data solutions, with a primary focus on collecting, parsing, managing, analyzing, and visualizing extensive datasets. They are adept at employing technologies to resolve challenges associated with handling vast amounts of data in diverse formats, thereby delivering innovative solutions.
Data Engineering is a technically demanding role that necessitates a broad spectrum of expertise in software development and programming. These professionals possess knowledge of data analysis, understand end-user and business requirements, and have the ability to translate these needs into technical solutions. They exhibit a strong grasp of physical database design and the systems development lifecycle. Collaboration within a team environment is essential for success in this role.
Key Responsibilities:
- Architecting the data analytics framework.
- Translating complex functional and technical requirements into detailed architecture, design, and high-performance software.
- Leading the development of data and batch/real-time analytical solutions by leveraging transformative technologies.
- Engaging in multiple projects as a technical lead, overseeing user story analysis, design, software development, testing, and automation tool creation.
Duties (Primary Job Objectives):
- Development and Operations
- Database Development and Operations
- Establishment and Adherence to Policies, Standards, and Procedures
- Communication
- Business Continuity and Disaster Recovery Planning
- Research and Evaluation
- Coaching and Mentoring
Required Skills, Knowledge, and Experience:
- A minimum of 5 years of experience in Data Engineering or Software Engineering.
- Demonstrated leadership experience, managing teams of engineers for 3-5 years.
- A minimum of 2 years of experience in Big Data.
- At least 5 years of experience with Extract, Transform, and Load (ETL) processes.
- A minimum of 2 years of experience with AWS (Amazon Web Services).
- Demonstrated experience with agile or other rapid application development methodologies for at least 2 years (e.g., Agile, Kanban, Scrum).
- 5 years of proven expertise in object-oriented design, coding, testing patterns, and working with commercial or open-source software platforms and large-scale data infrastructures.
- Proficiency in creating data feeds from on-premises systems to the AWS Cloud (2 years).
- Support experience for data feeds in production on a break-fix basis (2 years).
- A minimum of 4 years of experience in creating data marts using Talend or similar ETL development tools.
- Proficiency in data manipulation using Python and PySpark (2 years).
- Experience in processing data using the Hadoop paradigm, particularly with EMR, AWS's distribution of Hadoop (2 years).
- DevOps experience in Big Data and Business Intelligence, including automated testing and deployment (2 years).
- Extensive knowledge of various programming or scripting languages.
- Expertise in data modeling and an understanding of different data structures and their suitability for specific use cases.
Additional Technical Skills Required:
- The ability to design highly scalable distributed systems using various open-source tools.
- Proficiency in both batch and streaming Big Data tools.
- Experience with Talend for at least 1 year.
- Familiarity with AWS services such as EMR, EC2, and S3 for at least 1 year.
- Proficiency in Python for at least 1 year.
- Familiarity with PySpark or Spark (desirable for at least 1 year).
- Experience in Business Intelligence data modeling for 3 years.
- Proficiency in SQL for 3 years.
Qualifications/Certifications:
- A Bachelor's degree in computer science or computer engineering, or a minimum of 4 years of equivalent work experience.
- AWS Certification, at least at the associate level.
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
AWS DevOps Engineer - Senior
Posted 3 days ago
Job Description
Experience remote done right. With over 20 years of remote experience, all 500+ staff are 100% remote, and we still grow vibrant relationships and provide exceptional opportunities for career growth while working with stellar clients on ambitious projects.
What we're working on:
Enterprise companies turn to us to help them launch innovative digital products that interact with hundreds of millions of customers, transactions and data points. The problems we solve every day are real and require creativity, grit and determination. We are building a culture that challenges norms while fostering experimentation and personal growth. To grasp the scale of the problems we face, you ideally have some exposure to Logistics, FinTech, Transportation, Insurance, Media or other complex multifactor industries.
What You’ll Be Doing:
- Collaborating with development teams to design, implement, and optimize highly available, scalable, and secure cloud solutions on AWS.
- Automating infrastructure provisioning and configuration management using tools like Ansible and Terraform.
- Developing and maintaining CI/CD pipelines using Jenkins to streamline deployment processes.
- Monitoring, troubleshooting, and optimizing AWS infrastructure and services to ensure maximum performance and reliability.
- Ensuring compliance with security best practices and helping implement robust access controls using IAM, KMS, and other AWS security services (see the sketch after this list).
- Creating and maintaining infrastructure as code (IaC) with CloudFormation or Terraform to ensure consistent and reproducible deployments.
- Driving continuous improvement by automating processes, implementing new tools, and refining workflows to enhance operational efficiency.
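As a hedged sketch of the KMS side of "robust access controls" (the key alias and secret value are placeholders, not Lumenalta specifics):

```python
import boto3

# The key alias and secret value are placeholders.
kms = boto3.client("kms")

# Encrypt a small secret under a customer-managed key.
ciphertext = kms.encrypt(
    KeyId="alias/example-app-key",
    Plaintext=b"database-password",
)["CiphertextBlob"]

# Decrypt does not need the key ID: KMS resolves it from the ciphertext,
# and the key's IAM/key policies govern who may call Decrypt at all.
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]
assert plaintext == b"database-password"
```

The access-control point is that the key policy, not the application, decides who can decrypt, which is what makes KMS useful for separating duties across teams.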
What We're Looking For:
- Strong background in Linux administration and expertise in managing large-scale systems.
- 6+ years of experience in designing, deploying, and maintaining AWS infrastructure with services like API Gateway, DynamoDB, Lambda, IAM, KMS, and CloudFormation.
- Hands-on experience in automation and configuration management using tools such as Ansible or Terraform.
- Proficiency in version control systems like Git (we use GitHub for collaboration).
- Experience in building and managing CI/CD pipelines using Jenkins or similar tools.
- Ability to collaborate effectively across functional teams and communicate complex technical concepts to both technical and non-technical stakeholders.
- AWS certifications (e.g., AWS Certified DevOps Engineer, AWS Certified Solutions Architect) are highly preferred.
Lumenalta is committed to hiring exceptional talent from a wide variety of diverse backgrounds. If you share our values and enthusiasm for digital transformation, we encourage you to apply.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Consulting, Information Technology, and Engineering
Industries: IT Services and IT Consulting, Banking, and Advertising Services
AWS Cloud Engineer (Market Related)
Posted 9 days ago
Job Description
Client Details:
Our client works with multiple organisations to find a better way to transact. They bring together committed people dedicated to delivering innovative enterprise solutions that help their customers contribute to economic growth. As a vibrant and innovative SaaS company, they deliver industry-leading expertise and technology to solve real problems every day.
Role Responsibilities:
- Deployment, automation, management and maintenance of AWS cloud-based production systems.
- Ensuring availability, performance, security and scalability of AWS production systems.
- Managing / administering Linux environments.
- Evaluating new technology alternatives and vendor products.
- System troubleshooting and problem resolution across the cloud infrastructure stack.
- Provision of critical system security by leveraging best practices and prolific cloud security solutions (see the sketch after this list).
- Providing recommendations for architecture and process improvements.
- Implementation of security protocols by evaluating business strategies and requirements.
- Maintenance / management of tools for automation of different operational processes.
- Definition, deployment and management of metrics, logging, monitoring and alerting.
- Tracking and understanding emerging practices and standards.
- Participating in educational opportunities.
- Reading professional publications and participating in professional organisations.
- Leading and growing a Cloud Engineering team.
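As one hedged example of the "security by best practices" responsibility, the snippet below uses boto3 to enforce default server-side encryption on an S3 bucket; the bucket name and key alias are placeholders.

```python
import boto3

# Bucket name and key alias are placeholders.
s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-production-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-key",
                },
                "BucketKeyEnabled": True,  # cuts KMS request costs
            }
        ]
    },
)
```

In practice a team like this would codify the same control in CloudFormation or Ansible so every bucket is encrypted by default rather than patched after the fact.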
Preferred Qualifications:
- Bachelor's Degree in Information Technology, Systems Engineering, IT Engineering.
- AWS certifications required (SysOps Administrator is preferred).
Relevant Skills / Experience:
- 5+ years' experience in Infrastructure, Software Development and DevOps.
- 3+ years' experience with Linux operating systems.
- 2+ years' experience with AWS Cloud Services.
- 1+ years' experience working as a team lead.
- Experience in the following would be beneficial:
- Java or a similar object-oriented programming language.
- Financial services or banking organisations.
- Engineering data pipelines such as Apache Kafka.
- ELK or similar on Public Cloud Platforms.
- Developing and supporting infrastructure and Cloud capabilities for container-orchestration architectures.
- Working with scripting and provisioning tools like Ansible, CloudFormation or equivalent.
- Configuration of VPN and firewall management.
- Authentication and authorization technologies / protocols (LDAP, Kerberos, AD, OAuth 2.0, OpenID Connect, SAML).
Job ID: J104213
PS: Even if you feel you don't have all the skills listed, or if this spec isn't what you are looking for, feel free to send your CV, as we probably have other opportunities that could interest you. For a more comprehensive and updated list of the opportunities we have on offer, do visit our website -
Desired Experience & Qualification
AWS, Cloud Engineer, DevOps, Linux, Java, Team Lead