
Data Engineer - Python, SQL, AWS

Job in Durham, Durham County, North Carolina, 27703, USA
Listing for: Compunnel, Inc.
Full Time position
Listed on 2025-10-30
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: USD 95,000 to 120,000 per year
Job Description & How to Apply Below

We are seeking an experienced Data Engineer with a strong background in Python, SQL, and AWS to join our team.

This role involves designing and developing scalable data pipelines, performing data analysis and modeling, and contributing to the creation of an enterprise-wide Data Lake on AWS.

The ideal candidate will be passionate about data, enjoy working in collaborative environments, and have a strong desire to innovate and learn.

Key Responsibilities
  • Design and implement scalable ETL/ELT pipelines using AWS Glue, Lambda, Step Functions, and other AWS services.
  • Integrate structured and unstructured data from diverse sources into data lakes and warehouses (e.g., S3, Redshift, RDS, Athena).
  • Build and maintain cloud infrastructure for data analytics platforms using Terraform, CloudFormation, or similar IaC tools.
  • Collaborate with data engineers, data scientists, and analysts to deliver high-quality platforms for data loading, reporting, and machine learning.
  • Optimize data models and queries for performance and scalability.
  • Monitor data pipelines and troubleshoot issues to ensure reliability and data integrity.
  • Implement CI/CD pipelines for data engineering workflows using GitLab, Bitbucket, Jenkins, or GitHub Actions.
  • Ensure compliance with data governance and security best practices.
  • Implement access controls and encryption for sensitive data.
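As a rough illustration of the pipeline work described above (all record fields and function names here are hypothetical; in production this logic would typically run inside an AWS Glue job or a Lambda triggered by S3 events):

```python
import json


def transform_records(raw_records):
    """Normalize raw event records into a flat schema (illustrative only).

    Kept as a plain function, separate from the handler, so the transform
    logic can be unit-tested locally without any AWS dependencies.
    """
    cleaned = []
    for rec in raw_records:
        # Skip malformed rows rather than failing the whole batch,
        # preserving pipeline reliability and data integrity.
        if "user_id" not in rec or "amount" not in rec:
            continue
        cleaned.append({
            "user_id": str(rec["user_id"]),
            "amount_usd": round(float(rec["amount"]), 2),
            "source": rec.get("source", "unknown"),
        })
    return cleaned


def lambda_handler(event, context):
    # Hypothetical Lambda entry point: raw records arrive as JSON lines
    # in the event body; a real handler would read from and write to S3.
    raw = [json.loads(line) for line in event["body"].splitlines() if line.strip()]
    return {"records": transform_records(raw)}
```

The transform/handler split shown here is one common pattern for keeping serverless ETL code testable.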
Required Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Extensive experience with relational databases such as Oracle or Snowflake.
  • Experience in data warehousing, data modeling, and creation of data marts.
  • Hands‑on experience with AWS services including S3, Glue, Lambda, Redshift, RDS, Athena, and Step Functions.
  • Experience with ETL technologies such as Informatica or SnapLogic.
  • Proficiency in SQL and PySpark.
  • Familiarity with orchestration tools like Apache Airflow or MWAA.
  • Understanding of DevOps tools and practices (CDK, CI/CD, Git, Terraform).
  • Experience with Agile methodologies (Kanban and Scrum).
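To sketch the data-modeling side of the qualifications above, here is a minimal star-schema data mart (one fact table joined to one dimension table), shown with sqlite3 so it runs anywhere; all table and column names are illustrative, not from any actual system:

```python
import sqlite3

# Minimal star schema: a fact table keyed to a dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount_usd   REAL NOT NULL
    );
""")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 25.0)])

# Typical mart query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT c.customer_name, SUM(f.amount_usd)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.customer_name
    ORDER BY c.customer_name
""").fetchall()
```

The same fact/dimension structure scales up to Redshift or Snowflake; only the DDL dialect and distribution/clustering options change.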
Preferred Qualifications
  • Experience with big data tools (Spark, Hive, Kafka).
  • Knowledge of containerization (Docker, Kubernetes).
  • Familiarity with data visualization tools (e.g., Power BI).
  • AWS certifications (e.g., AWS Certified Data Analytics – Specialty).
  • Experience with Business Intelligence and dashboard development.
  • Exposure to DevOps, Continuous Integration, and Continuous Delivery tools (Maven, Jenkins, Stash, Ansible, Docker).