Data Engineer III
Listed on 2025-12-09
IT/Tech
Data Engineer, Cloud Computing
AWS Data Engineer III
Hybrid role
Compensation: $68
ABOUT THE ROLE
Our client is seeking an experienced AWS Data Engineer III to design, develop, and maintain robust data warehousing solutions on AWS. In this role, you will leverage your deep expertise in AWS services (including S3, EMR, Glue, Lambda, Athena, and Redshift) to build and optimize scalable data pipelines and ensure high data quality. You will collaborate closely with lead developers, data architects, and solution architects to define technical requirements and deliver user-focused solutions that support analytical and business needs.
The ideal candidate will have a strong background in Python, Spark, PySpark, and infrastructure-as-code tools such as Terraform or CloudFormation, as well as a solid understanding of networking, security, and DevOps practices. Your work will be critical in delivering reliable, high-performance data solutions that empower business decision-making.
RESPONSIBILITIES
- Design, develop, and maintain scalable data pipelines using AWS services (S3, EMR, Glue, Lambda, Athena, etc.)
- Collaborate with lead developers, data architects, and solution architects to define technical requirements, outline technical scope, and lead delivery of technical solutions
- Identify the developers and skill sets required for the product and collaborate on key technical decisions
- Implement and optimize data warehousing solutions with a focus on performance, durability, and end-user experience
- Develop and refine ETL processes and workflows using tools such as Python, Spark, PySpark, and Pandas
- Utilize infrastructure as code (Terraform/CloudFormation) for provisioning and managing AWS resources
- Integrate messaging systems (e.g., Kafka, preferably Confluent Kafka) and event-driven architectures into data solutions
- Implement security best practices, including IAM roles, policies, and secrets management (Vault, AWS Secrets Manager)
- Monitor production systems and automate alerting using AWS monitoring tools (CloudWatch, CloudTrail, CloudWatch Events)
- Participate in DevOps practices, including CI/CD pipeline management (Bitbucket, Concourse)
- Perform hands‑on development, peer reviews, and stand up development instances with appropriate security controls and migration paths
- Identify data gaps and deliver automated solutions to enhance analytical capabilities and data enrichment
- Work with RDBMS platforms, write complex SQL queries, and integrate with REST APIs and API gateways
- Apply deep understanding of networking (DNS, TCP/IP, VPN) in cloud data engineering solutions
- Orchestrate workflows using AWS Step Functions or Airflow
- Manage and resolve issues in production data warehouse environments on AWS, ensuring high data quality and reliability
- Design data warehousing solutions with the end‑user in mind, ensuring ease of use without compromising on performance
REQUIREMENTS
- 5+ years of hands‑on AWS experience
- Proficiency with AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka/messaging, preferably Confluent Kafka
- Hands‑on experience with EMR and databases: Glue Catalog, Lake Formation, Redshift, DynamoDB, Aurora
- Expertise in AWS data warehousing tools such as Amazon Redshift and Amazon Athena
- Proven track record in designing and implementing data warehouse solutions using AWS
- Skilled in data modeling and executing ETL processes for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficient in real‑time and batch data processing
- Strong understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for production issues
- Proficiency in Python, Spark, PySpark, and Pandas
- Experience with Infrastructure as Code: Terraform/CloudFormation
- Experience with secrets management platforms like Vault and AWS Secrets Manager
- Experience with Event‑Driven Architecture
- Familiarity with DevOps pipelines (CI/CD): Bitbucket, Concourse
- Experience with RDBMS platforms and strong SQL skills
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and policies
- Experience using AWS monitoring services: CloudWatch, CloudTrail, CloudWatch Events
- Understanding of networking: DNS, TCP/IP, VPN
- Experience with AWS workflow orchestration tools like Airflow or Step Functions
- Ability to collaborate with cross‑functional teams and lead technical solution delivery
- Ability to manage and resolve issues in production data warehouse environments on AWS
- Ability to build new data pipelines, identify data gaps, and provide automated analytical solutions
- Ability to perform hands‑on development and peer reviews for specific components of the product's tech stack
- Experience standing up development instances and migration paths with required security, access, and roles
- Strong communication and collaboration skills