Senior AWS Cloud Data Engineer
Job in Vienna, Fairfax County, Virginia, 22184, USA
Listing for: Cynet Systems Inc
Full Time position, listed on 2026-02-18
Job specializations:
- IT/Tech: Data Engineer
- Engineering: Data Engineer
Job Description
- The Sr. AWS Cloud Data Engineer will design, develop, and support enterprise-scale data integration and data warehouse solutions.
- This role involves building Ab Initio data pipelines, integrating with AWS services, and transforming complex datasets into consumable data layers for business applications.
- The ideal candidate will bring strong expertise in ETL development, Hadoop ecosystems, AWS integrations, and data warehouse best practices.
Responsibilities:
- Collaborate with Business Analysts and Product teams to gather and analyze data requirements.
- Design and develop Ab Initio graphs and data pipelines to extract data from databases, flat files, and message queues.
- Transform and model data to create scalable and consumable data layers for downstream applications.
- Support and maintain data pipelines including bug fixes, performance tuning, and enhancements.
- Document technical designs, architecture diagrams, and operational runbooks.
- Ensure adherence to engineering best practices including high code quality and automated testing.
- Perform thorough unit testing and troubleshoot production issues.
- Support junior developers and work independently in a fast-paced environment.
Required Qualifications:
- Bachelor’s Degree in Computer Science, Information Technology, Engineering, or a related field.
- 10+ years of overall IT experience with a strong focus on Data Integration and Data Warehousing.
- Minimum 5 years of hands‑on ETL design and development experience using Ab Initio.
- 1 to 2 years of Data Integration experience on Hadoop platforms, preferably Cloudera.
- Experience integrating Ab Initio with AWS S3, Redshift, or other AWS database services.
- Working knowledge of Hadoop technologies including HDFS, Hive, and Impala.
- Strong understanding of SQL and ability to write high‑performing queries.
- Solid knowledge of OLTP and OLAP data models and data warehouse fundamentals.
- Experience with Unix or Linux shell scripting.
- Knowledge of Agile development methodologies.
- Experience with Ab Initio Change Data Capture in ETL projects.
- Basic Java development experience.
- Experience with reusable code components and modular design practices.
Key Skills:
- Ab Initio ETL development.
- AWS cloud services including S3 and Redshift.
- Hadoop ecosystem technologies.
- SQL optimization and performance tuning.
- Unix or Linux scripting.
- Strong analytical and problem‑solving abilities.
- Ability to work independently and mentor junior team members.
- Strong written and verbal communication skills.
- Commitment to quality, documentation, and operational excellence.
- Bachelor’s Degree required.
- Relevant cloud or data engineering certifications are preferred.
Position Requirements
10+ years of work experience