Cloud/Data Engineer
Fairfax, Fairfax County, Virginia, 22032, USA
Prometheus Federal Services (PFS)
Full Time position, listed on 2026-02-06
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Science Manager
Job Description
Overview
Prometheus Federal Services (PFS) is a trusted partner to federal health agencies, delivering mission-driven data, analytics, and technology solutions. We are seeking a skilled Cloud/Data Engineer to design, build, and modernize data platforms that support analytics, automation, and AI/ML initiatives across federal health programs. In this role, you will develop scalable cloud-based data pipelines, implement data quality and governance frameworks, and contribute to modernization efforts that drive insight, performance, and operational excellence for our federal clients.
Essential Duties and Responsibilities
- Design, develop, and deploy end-to-end pipelines for data acquisition, preparation, cleaning, and transformation
- Design, develop, and deliver data quality monitoring and observability tools to track data lineage and manage integrity across data pipelines
- Design, develop, and manage cloud solutions to deliver data, analytics, and AI/ML products, including migration and modernization efforts
- Manage cloud environments to support data and analytics development, including provisioning, configuration, and cost management
- Apply DevOps tools to deliver enhanced automation, workflow orchestration, and monitoring
Required Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field
- Five (5)+ years of experience with data engineering and data architecture development, including developing scalable ETL/ELT pipelines for reporting and analytics
- Three (3)+ years of experience working with cloud infrastructure (e.g., AWS, Azure, GCP), including environment provisioning, configuration, and management
- Three (3)+ years of experience using languages such as SQL, Python, and PySpark for data ingestion, cleaning, and transformation
- Experience working with DevOps/DevSecOps tools and frameworks, including Continuous Integration/Continuous Delivery (CI/CD), build automation, Infrastructure as Code (e.g., Terraform, CloudFormation), and containerization (e.g., Docker, Kubernetes)
- Experience working across a variety of data warehousing tools, including SQL and Oracle
- Experience with GitHub, GitLab, and GitHub Actions
- Excellent written and verbal communication skills with both technical and non-technical stakeholders
- Authorized to work in the U.S. indefinitely without sponsorship
- Ability to obtain a public trust
Preferred Qualifications
- Experience building data pipelines and architecture to manage ingest, integration, and data product development for large-scale unstructured datasets (PDFs, documents) across multiple source systems
- Experience working with data platforms and processing tools, including Databricks and Spark
- Experience migrating legacy and on-premises data and systems to cloud environments
- Experience developing data products through a medallion architecture
- Knowledge and experience applying data governance and data quality management tools, frameworks, and best practices
- AWS or Azure cloud certifications