Data Engineer SME
Listed on 2026-01-01
IT/Tech
Data Engineer, AI Engineer
Company Overview
Iron EagleX (IEX), a wholly owned subsidiary of General Dynamics Information Technology, delivers agile IT and Intelligence solutions. Combining small-team flexibility with global scale, IEX leverages emerging technologies to provide innovative, user-focused solutions that empower organizations and end users to operate smarter, faster, and more securely in dynamic environments.
Job Description:
We are seeking a Data Engineering SME to design, build, and operate data pipelines that ingest, store, and process high-volume, multi-source data, primarily for modern AI/ML processes. You will partner with software, analytics, and product teams to create model-ready datasets (features, embeddings, and prompts), implement scalable storage layers (data lakehouse and vector stores), and enable low-latency retrieval for query, inference, and RAG.
Responsibilities include orchestrating streaming and batch pipelines, optimizing compute for GPU/CPU workloads, enforcing data quality and governance, and instrumenting observability. This role is ideal for someone passionate about turning raw data into reliable, performant inputs for AI models and other analytics while right-sizing technologies and resources for scale and speed. This is an onsite position in Crystal City, VA.
- Design, develop, and implement scalable data pipelines and ETL processes using Apache Airflow, with a focus on data for AI applications.
- Develop messaging solutions utilizing Kafka to support real-time data streaming and event-driven architectures.
- Build and maintain high-performance data retrieval solutions using Elasticsearch/OpenSearch.
- Implement and optimize Python-based data processing solutions.
- Integrate batch and streaming data processing techniques to enhance data availability and accessibility.
- Ensure adherence to security and compliance requirements when working with classified data.
- Work closely with cross-functional teams to define data strategies and develop technical solutions aligned with mission objectives.
- Deploy and manage cloud-based infrastructure to support scalable and resilient data solutions.
- Optimize data storage, retrieval, and processing efficiency.
Qualifications:
- Experience with Apache Airflow for workflow orchestration.
- Strong programming skills in Python.
- Experience with Elasticsearch/OpenSearch for data indexing and search functionalities.
- Understanding of vector databases, embedding models, and vector search for AI applications.
- Expertise in event-driven architecture and microservices development.
- Hands-on experience with cloud services and S3-compatible object storage (e.g., MinIO), including data storage and compute resources.
- Strong understanding of data pipeline orchestration and workflow automation.
- Working knowledge of Linux environments and database optimization techniques.
- Strong understanding of version control with Git.
- Due to US Government Contract Requirements, only US Citizens are eligible for this role.
- Proficiency in Kafka for messaging and real-time data processing.
- Understanding of LLM prompt engineering and associated ETL applications.
- Knowledge of Apache Superset for data visualization and analytics.
- Familiarity with Kubernetes for container orchestration.
- Exposure to Apache Spark for large-scale data processing.
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent experience). Advanced degrees are a plus.
Security Clearance: An active TS/SCI security clearance is REQUIRED, and candidates must have or be willing to obtain a CI Poly. Candidates without this clearance will not be considered.
Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans #iexjobs