Technology and Operations - Data Engineer
Listed on 2026-01-05
IT/Tech
Location: Sterling - 45580 Terminal, Virginia
We are looking for a seasoned Data Engineer with a strong foundation in big data technologies and a growing proficiency in AI/ML systems. This individual will bring deep expertise in large-scale data processing frameworks (both open‑source and proprietary), OLAP/OLTP systems, and real‑time data streaming. The ideal candidate will also demonstrate a passion for enabling AI‑driven solutions through robust, scalable data infrastructure.
Key Responsibilities:
- Design, develop, and maintain highly scalable, fault-tolerant real-time, near real-time, and batch data pipelines.
- Implement data quality checks, validation, and cleaning processes to ensure high data accuracy and integrity.
- Continuously monitor and optimize data pipelines and databases for performance, resource utilization, and cost efficiency.
- Uphold high standards in code quality, testing, and documentation.
- Mentor junior data engineers and provide technical leadership within the team.
- Perform exploratory and quantitative analytics, data mining, and discovery to support AI/ML initiatives.
- Collaborate with data analysts and business stakeholders to make data accessible and actionable.
- Participate in 24x7 platform support rotations as needed.
Qualifications:
- Bachelor’s degree or higher in Computer Science or a related field.
- 7+ years of experience in data engineering roles.
- Proficiency in programming languages such as Scala, Python, or Java.
- Expertise in distributed data processing frameworks like Apache Spark or Flink.
- Experience with stream processing systems such as Kafka or Kinesis.
- Strong knowledge of cloud platforms (e.g., AWS) and cloud‑native data platforms like Databricks, Snowflake, or Redshift.
- Solid understanding of SQL, schema design, dimensional modeling, and ETL best practices.
- Experience with workflow orchestration tools such as Apache Airflow.
- Familiarity with CI/CD pipelines, preferably GitHub Actions.
- Strong technical documentation and issue‑tracking skills.
- Experience with analytics tools such as Looker or Tableau.
- Demonstrated ability to adopt and integrate new technologies.
- Experience in direct‑to‑consumer digital businesses is a plus.
- Familiarity with statistical modeling and ML libraries (e.g., Scikit-learn, XGBoost).
- Experience with AI/ML frameworks such as TensorFlow or PyTorch.
- Exposure to GenAI technologies, including LLMs and RAG pipelines.
Seniority level: Mid‑Senior level
Employment type: Contract
Job function: Product Management, Marketing, and Design
Industries: IT Services and IT Consulting, Engineering Services, and Information Services