
Innovation and Automation Specialist

Job in Chantilly, Fairfax County, Virginia, 22021, USA
Listing for: Bespoke Technologies, Inc.
Full Time position
Listed on 2025-12-23
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 100,000 - 130,000 USD yearly
Job Description

BT-162 - Innovation and Automation Specialist
Skill Level: Mid

Location: Chantilly/Herndon

Role Description:

As a Data Engineer Specialist on the Innovation and Automation team, you will serve as a subject matter expert, blending deep data engineering expertise with a passion for automation. You will not build individual data pipelines for business users; instead, you will build the factory that produces them. Your mission is to design, develop, and implement the reusable frameworks, automated patterns, and core tooling that our data engineering teams will use to build their own pipelines faster, more reliably, and more consistently.

This is a highly technical, hands-on role for a problem-solver who wants to act as a force multiplier for the entire data organization.
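
To make the "pipeline factory" idea above concrete, here is a minimal, hypothetical Python sketch of a config-driven pipeline template: a team supplies only a declarative spec, and the shared framework wraps it in consistent plumbing. Every name below is illustrative rather than taken from this posting, and a real implementation on Databricks or Snowflake would look different.

# Hypothetical sketch of a config-driven pipeline template ("the factory").
# All names are illustrative.
from dataclasses import dataclass
from typing import Any, Callable, Iterable

Row = dict[str, Any]

@dataclass
class PipelineSpec:
    name: str
    extract: Callable[[], Iterable[Row]]            # e.g. read from S3 or a JDBC source
    transform: Callable[[Iterable[Row]], Iterable[Row]]
    load: Callable[[Iterable[Row]], None]           # e.g. write to a Delta or Snowflake table

def build_pipeline(spec: PipelineSpec) -> Callable[[], None]:
    """The 'factory': wrap every team's spec in the same runtime shell."""
    def run() -> None:
        print(f"[{spec.name}] starting")
        spec.load(spec.transform(spec.extract()))
        print(f"[{spec.name}] finished")
    return run

# A business-facing team supplies only the spec, not the plumbing:
orders_pipeline = build_pipeline(PipelineSpec(
    name="orders_daily",
    extract=lambda: [{"order_id": 1, "amount": 42.0}],
    transform=lambda rows: [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows],
    load=lambda rows: print(list(rows)),
))

if __name__ == "__main__":
    orders_pipeline()

The point of the pattern is that logging, error handling, quality checks, and deployment live in one place, and every pipeline generated from the template inherits them.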

Responsibilities:

  • Act as a technical expert on the design and implementation of automated data engineering solutions.
  • Develop and maintain a library of standardized, reusable ETL/ELT pipeline templates using Python and SQL on platforms like Databricks or Snowflake.
  • Engineer and implement robust, automated data quality and testing frameworks (e.g., using tools like Great Expectations) that are embedded within the core pipeline templates; a sketch of this pattern follows this list.
  • Contribute to the development of Infrastructure-as-Code (IaC) modules (Terraform) for the automated provisioning of data infrastructure.
  • Enhance and optimize the CI/CD for Data (Data Ops) pipelines, ensuring seamless and reliable deployment of data workflows.
  • Serve as an escalation point for the most complex data engineering and automation challenges, providing expert-level troubleshooting and guidance to other engineers.
  • Mentor other data engineers on automation best practices, code standards, and the use of the frameworks you build.
  • Research and prototype cutting-edge data engineering and automation technologies to drive continuous improvement.
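
Following up on the embedded data-quality bullet above, this is a hedged sketch of how expectations might be declared next to a pipeline and enforced by the shared template. Plain Python callables stand in for a dedicated tool such as Great Expectations, whose API differs across versions; every name here is hypothetical.

# Hypothetical sketch: data-quality checks attached to the shared template,
# so every pipeline built from the factory runs them automatically.
from typing import Any, Callable, Iterable, Optional

Row = dict[str, Any]
Check = Callable[[list[Row]], Optional[str]]   # returns an error message, or None if the check passes

def not_null(column: str) -> Check:
    def check(rows: list[Row]) -> Optional[str]:
        bad = sum(1 for r in rows if r.get(column) is None)
        return f"{bad} null value(s) in '{column}'" if bad else None
    return check

def in_range(column: str, lo: float, hi: float) -> Check:
    def check(rows: list[Row]) -> Optional[str]:
        bad = sum(1 for r in rows if not lo <= r.get(column, lo) <= hi)
        return f"{bad} value(s) of '{column}' outside [{lo}, {hi}]" if bad else None
    return check

def validate(rows: Iterable[Row], checks: list[Check]) -> list[Row]:
    """Called by the shared template between transform and load."""
    materialized = list(rows)
    failures = [msg for c in checks if (msg := c(materialized)) is not None]
    if failures:
        raise ValueError("data quality checks failed: " + "; ".join(failures))
    return materialized

# Example: these expectations travel with the pipeline spec, not with each pipeline.
rows = [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 17.5}]
validate(rows, [not_null("order_id"), in_range("amount", 0, 10_000)])

In practice a library such as Great Expectations would supply the checks themselves; the transferable idea is attaching the expectations to the template rather than to each individual pipeline.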

Required Qualifications:

  • 5+ years of hands-on experience in data engineering.
  • Expert-level programming skills in Python and advanced SQL.
  • Proven, in-depth experience building and optimizing data pipelines in a cloud environment (AWS, Azure) on platforms like Databricks or Snowflake.
  • Strong, hands-on experience with Infrastructure-as-Code (IaC) using Terraform.
  • Demonstrable experience with CI/CD principles and tools (e.g., GitLab CI, Jenkins, GitHub Actions) applied to data workflows; see the test sketch after this list.
  • Deep understanding of modern data architecture, data modeling, and software engineering best practices.
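
To illustrate the CI/CD-for-data point referenced above, here is a small, hypothetical pytest-style test of a pipeline transform; a CI job in GitLab CI, Jenkins, or GitHub Actions would run tests like this on every merge request before a workflow is deployed. The transform and its expected values are invented for illustration.

# Hypothetical example of a data-workflow unit test run in CI.
def add_amount_cents(rows):
    """Toy transform: derive an integer cents column from a float amount."""
    return [{**r, "amount_cents": int(round(r["amount"] * 100))} for r in rows]

def test_rounds_amounts_to_whole_cents():
    out = add_amount_cents([{"order_id": 1, "amount": 19.99}])
    assert out[0]["amount_cents"] == 1999

def test_preserves_existing_columns():
    out = add_amount_cents([{"order_id": 7, "amount": 0.0}])
    assert out[0]["order_id"] == 7

Deployment of the workflow would then be gated on tests like these passing in the CI pipeline.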

Preferred Qualifications:

  • Experience in a DevOps or Site Reliability Engineering (SRE) role.
  • Direct experience developing and operationalizing a "pipeline factory" or similar framework.
  • Familiarity with data orchestration tools (e.g., Airflow) and containerization (Docker, Kubernetes).
  • Proven ability to diagnose and resolve complex performance, data quality, and system-level issues.