Data Engineer

Job in Aurora, Arapahoe County, Colorado, 80012, USA
Listing for: liberintechnologies
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description
Responsibilities
  • Design, develop, and optimize scalable data architectures in AWS environments.
  • Implement and manage data pipelines for real-time and batch data ingestion, processing, and transformation.
  • Model relational and non-relational databases, ensuring efficiency and scalability.
  • Manage and optimize databases in AWS using services such as Amazon RDS, Aurora, DynamoDB, S3, Glue, EMR, Lambda, Kinesis, and others.
  • Develop and optimize ETLs/ELTs using tools like AWS Glue, Apache Spark, dbt, and Airflow (see the PySpark sketch after this list).
  • Ensure data quality, security, and governance through best practices and tools like Lake Formation and AWS IAM.
  • Collaborate with Data Science, Analytics, and Development teams to optimize data access and processing.
  • Implement data quality testing and automated validations.
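For context only, the following minimal PySpark sketch illustrates the kind of batch pipeline and automated validation described above: it reads raw files from S3, applies a simple transformation, runs a basic data quality check, and writes curated Parquet output. The bucket names, paths, and column names are hypothetical placeholders, not details from this role.

```python
# Illustrative batch ETL in PySpark: ingest raw CSV from S3, transform,
# run a minimal data quality check, and write curated Parquet output.
# All bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Ingest raw data (hypothetical S3 location).
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/orders/2026-01-01/")
)

# Transform: cast types and derive a partition column.
orders = (
    raw.withColumn("order_total", F.col("order_total").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Minimal automated validation: fail fast if key fields are null.
null_keys = orders.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows missing order_id; aborting load")

# Load curated data, partitioned for downstream query performance.
(
    orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")
)
```

In practice, a job like this would typically run on AWS Glue or EMR and be orchestrated alongside the other tools named in the responsibilities above.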
Required Experience, Skills, and Qualifications

Education:
  • Bachelor’s degree in Systems Engineering, Computer Science, Mathematics, Statistics, or related fields.
  • AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect certification (preferred).
Experience:
  • 3 to 5 years of experience in data engineering in cloud-first environments.
  • Experience working with large-scale data (Big Data) and scalable architectures.
  • Experience integrating data from multiple sources in structured and unstructured environments.
Skills and Competencies:
  • Data Modeling:
    Experience in designing SQL and NoSQL databases optimized for performance and scalability.
  • Cloud & AWS Data Services:
    Advanced knowledge of Redshift, RDS, DynamoDB, Glue, Kinesis, Lambda, EMR, S3, and other AWS data tools.
  • Data Processing:
    Experience with Apache Spark, PySpark, Pandas, SQL, dbt, and Apache Airflow (see the Airflow sketch after this list).
  • Programming Languages:
    Strong proficiency in Python, SQL, Scala, or Java.
  • Automation & DevOps:
    Experience with Terraform, Docker, and CI/CD for deploying data infrastructures.
  • Optimization & Performance:
    Expertise in SQL query performance tuning, storage, and data processing optimization.
  • Data Security & Governance:
    Implementation of access policies, auditing, and regulatory compliance (GDPR, HIPAA).
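As a rough illustration of the Airflow experience listed above, the following minimal DAG sketch wires an extract, transform, and validate step into a daily schedule. The DAG id, task names, schedule, and placeholder callables are assumptions for illustration only, and the syntax assumes Airflow 2.4+.

```python
# Illustrative Airflow DAG sketching a daily extract -> transform -> validate
# flow. DAG id, task names, schedule, and callables are hypothetical
# placeholders, not a description of this employer's actual pipelines.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw files from the source system into S3")


def transform():
    print("run the Spark/Glue job that builds curated tables")


def validate():
    print("run automated data quality checks on the curated tables")


with DAG(
    dag_id="orders_daily_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",   # `schedule` parameter requires Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    # Downstream dependencies: extract -> transform -> validate.
    t_extract >> t_transform >> t_validate
```

In a real deployment the placeholder callables would trigger the Glue/Spark and dbt jobs named in the requirements, and the DAG itself would be shipped through the CI/CD tooling mentioned under Automation & DevOps.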