
Slalom Flex (Project Based) - Federal GCP Data Engineer

Job in Peoria, Peoria County, Illinois, 61639, USA
Listing for: Slalom
Full Time position
Listed on 2026-02-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Science Manager
Salary/Wage Range or Industry Benchmark: 80,000 - 100,000 USD yearly
Job Description & How to Apply Below
Position: Slalom Flex (Project Based) - Federal GCP Data Engineer

(U.S. Citizenship Required)

About Us

Slalom is a purpose-led, global business and technology consulting company. From strategy to implementation, our approach is fiercely human. In six+ countries and 43+ markets, we deeply understand our customers—and their customers—to deliver practical, end-to-end solutions that drive meaningful impact. Backed by close partnerships with over 400 leading technology providers, our 10,000+ strong team helps people and organizations dream bigger, move faster, and build better tomorrows for all.

We’re honored to be consistently recognized as a great place to work, including being one of Fortune’s 100 Best Companies to Work For seven years running. Learn more at slalom.com.

About The Role

We are seeking a GCP Data Engineer with strong BigQuery experience to support a major federal engagement focused on disaster recovery, data modernization, and mission-critical analytics for FEMA. This role is a hands-on engineering position working within a secure Google Cloud Platform environment to design, build, and optimize scalable data pipelines and analytics capabilities that enable high-quality insights and operational excellence for our federal client.

This position requires U.S. citizenship and the ability to obtain and maintain a Public Trust clearance.

What You Will Do

Data Engineering & Cloud Development

  • Design, build, and maintain cloud-native ETL/ELT data pipelines using BigQuery, Dataform, Python, Cloud Composer (Airflow), Cloud Functions, and Cloud Storage (see the sketch after this list).
  • Develop BigQuery-centric data models, transformations, and analytics layers supporting downstream Looker dashboards and federal reporting needs.
  • Implement modern analytics engineering practices, including version-controlled SQLX (Dataform), modular transformations, data quality checks, and documentation.
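
As a rough illustration of the pipeline work above, here is a minimal Cloud Composer (Airflow) sketch that loads files from Cloud Storage into BigQuery. The bucket, project, dataset, and table names are hypothetical placeholders, not client specifics.

    # Minimal Airflow DAG: load daily CSV drops from GCS into a BigQuery staging table.
    # All resource names below (bucket, project, dataset, table) are hypothetical.
    import pendulum
    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="load_incidents_to_bq",
        schedule="@daily",
        start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
        catchup=False,
    ) as dag:
        load_incidents = GCSToBigQueryOperator(
            task_id="gcs_to_bq",
            bucket="example-landing-bucket",
            source_objects=["incidents/*.csv"],
            destination_project_dataset_table="example-project.staging.incidents",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",  # full refresh of the staging table
        )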

Client Leadership & Delivery

  • Collaborate with federal stakeholders to understand data ingestion, transformation, governance, and reporting requirements.
  • Communicate technical designs and delivery timelines to both technical and non-technical audiences.
  • Support modernization of legacy data environments into scalable GCP‑based architectures.
  • Ensure all solutions align with federal data governance, security, and performance standards.

Solution Optimization & Innovation

  • Optimize BigQuery workloads using partitioning, clustering, incremental processing, and cost-efficient modeling (see the sketch after this list).
  • Maintain robust CI/CD practices using GitLab or GitHub for version control, merge requests, and promotion pipelines.
  • Develop and maintain data lineage, metadata documentation, and enterprise data models.
  • Identify linkages across disparate datasets to build unified, interoperable data architectures.
  • Perform cleanup of existing datasets and transformation logic where needed.
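
For context on the optimization bullet above, the sketch below shows the kind of partitioned, clustered table definition BigQuery supports, submitted through the official Python client. The project, dataset, table, and column names are assumed for illustration.

    # Create a date-partitioned, clustered BigQuery table so common filters
    # (by day, state, and incident type) scan less data and cost less.
    # The project/dataset/table and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    ddl = """
    CREATE TABLE IF NOT EXISTS `example-project.analytics.incidents`
    PARTITION BY DATE(reported_at)
    CLUSTER BY state, incident_type
    AS
    SELECT * FROM `example-project.staging.incidents`
    """
    client.query(ddl).result()  # block until the DDL job finishes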

Collaboration & Team Leadership

  • Work closely with data architects, BI developers, cloud engineers, and data scientists.
  • Participate in SAFe Agile ceremonies including daily standups, retrospectives, and PI planning.
  • Track work in Jira and maintain documentation in Confluence.
  • Support testing, deployment, and quality assurance of data products.
  • Mentor junior data engineering team members and contribute to best‑practice frameworks.

Must-Have Qualifications
  • U.S. citizenship
  • Ability to obtain and maintain a federal Public Trust clearance
  • 3+ years of experience in cloud-based data engineering
  • Strong hands-on expertise with Google BigQuery
  • Proficiency in Python for pipeline development, automation, and cloud integration
  • Experience building data pipelines in GCP, including BigQuery, Dataform, Airflow/Cloud Composer, Cloud Functions, or similar
  • Strong SQL skills, including data modeling and data quality testing (see the sketch after this list)
  • Experience with Git-based version control and CI/CD concepts
  • Familiarity with data governance, metadata management, and compliance considerations
  • Strong communication and stakeholder engagement skills
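
As a small illustration of the SQL data quality testing mentioned above, this sketch runs an assertion-style check against a hypothetical staging table via the BigQuery Python client; the table and column names are assumptions.

    # Assertion-style data quality check: fail loudly if key columns are null.
    # Table and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    check = """
    SELECT COUNT(*) AS bad_rows
    FROM `example-project.staging.incidents`
    WHERE incident_id IS NULL OR reported_at IS NULL
    """
    bad_rows = next(iter(client.query(check).result())).bad_rows
    assert bad_rows == 0, f"data quality check failed: {bad_rows} rows with null keys"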

Nice-to-Have Skills
  • Experience supporting federal or regulated environments
  • Familiarity with Looker and downstream BI enablement
  • Understanding of ML workloads or data structures optimized for modeling
  • Experience with Agile/Scrum or SAFe
  • Knowledge of data quality…