Data Engineer - Python, SQL, AWS
Location: Durham, Durham County, North Carolina, 27703, USA
Company: Compunnel, Inc.
Position type: Full Time
Listed on: 2025-12-01
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description
We are seeking an experienced Data Engineer with a strong background in Python, SQL, and AWS to join our team.
This role involves designing and developing scalable data pipelines, performing data analysis and modeling, and contributing to the creation of an enterprise-wide Data Lake on AWS.
The ideal candidate will be passionate about data, enjoy working in collaborative environments, and have a strong desire to innovate and learn.
Key Responsibilities:
- Design and implement scalable ETL/ELT pipelines using AWS Glue, Lambda, Step Functions, and other AWS services.
- Integrate structured and unstructured data from diverse sources into data lakes and warehouses (e.g., S3, Redshift, RDS, Athena).
- Build and maintain cloud infrastructure for data analytics platforms using Terraform, CloudFormation, or similar IaC tools.
- Collaborate with data engineers, data scientists, and analysts to deliver high-quality platforms for data loading, reporting, and machine learning.
- Optimize data models and queries for performance and scalability.
- Monitor data pipelines and troubleshoot issues to ensure reliability and data integrity.
- Implement CI/CD pipelines for data engineering workflows using GitLab, Bitbucket, Jenkins, or GitHub Actions.
- Ensure compliance with data governance and security best practices.
- Implement access controls and encryption for sensitive data.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Extensive experience with relational databases such as Oracle or Snowflake.
- Experience in data warehousing, data modeling, and creation of data marts.
- Hands‑on experience with AWS services including S3, Glue, Lambda, Redshift, RDS, Athena, and Step Functions.
- Experience with ETL technologies such as Informatica or SnapLogic.
- Proficiency in SQL and PySpark.
- Familiarity with orchestration tools like Apache Airflow or MWAA.
- Understanding of DevOps tools and practices (CDK, CI/CD, Git, Terraform).
- Experience with Agile methodologies (Kanban and Scrum).
- Experience with big data tools (Spark, Hive, Kafka).
- Knowledge of containerization (Docker, Kubernetes).
- Familiarity with data visualization tools (e.g., Power BI).
- AWS certifications (e.g., AWS Certified Data Analytics – Specialty).
- Experience with Business Intelligence and dashboard development.
- Exposure to DevOps, Continuous Integration, and Continuous Delivery tools (Maven, Jenkins, Stash, Ansible, Docker).
To View & Apply for jobs on this site that accept applications from your location or country, tap the button below to make a Search.
(If this job is in fact in your jurisdiction, then you may be using a Proxy or VPN to access this site, and to progress further, you should change your connectivity to another mobile device or PC).
(If this job is in fact in your jurisdiction, then you may be using a Proxy or VPN to access this site, and to progress further, you should change your connectivity to another mobile device or PC).
Search for further Jobs Here:
×