Overview

LMI is seeking a Senior Data Engineer to support Special Operations Command (SOCOM) on-site in Crystal City. The position involves providing big data ETL support on classified data and systems to directly support the client mission. This position requires an active Top Secret/SCI clearance, for which you must be a US citizen. LMI is a new breed of digital solutions provider dedicated to accelerating government impact with innovation and speed.
Investing in technology and prototypes ahead of need, LMI brings commercial-grade platforms and mission-ready AI to federal agencies at commercial speed. Leveraging our mission-ready technology and solutions, proven expertise in federal deployment, and strategic relationships, we enhance outcomes for the government, efficiently and effectively. With a focus on agility and collaboration, LMI serves the defense, space, healthcare, and civilian sectors, helping agencies navigate complexity and outpace change.
Headquartered in Tysons, Virginia, LMI is committed to delivering impactful results that strengthen missions and drive lasting value.

Responsibilities

Responsibilities will include:
* Design, develop, and implement scalable data pipelines and ETL processes using Apache Airflow.
* Implement and optimize Python-based data processing solutions.
* Utilize Trino for distributed SQL query processing.
* Develop messaging solutions utilizing Kafka to support real-time data streaming and event-driven architectures.
* Build and maintain high-performance data retrieval solutions using Elastic Search/Open Search.
* Integrate batch and streaming data processing techniques to enhance data availability and accessibility.
* Ensure adherence to security and compliance requirements when working with classified data.
* Work closely with cross-functional teams to define data strategies and develop technical solutions aligned with mission objectives.
* Deploy and manage cloud-based infrastructure to support scalable and resilient data solutions, including AWS S3.
* Optimize data storage, retrieval, and processing efficiency, focusing on big data.

Qualifications

Required Skills
* Bachelor's degree or higher in data engineering, computer science, data science, or a similar field
* At least 5 years of experience in data engineering or related work
* Expert in Python with significant on-the-job experience
* Advanced proficiency in SQL (especially Postgres and/or Trino) with significant on-the-job experience
* Significant experience with big data technology and ETL
* Experience with Airflow (or equivalent) as an ETL workflow technology
* Experience with Kubernetes as an operating environment and Helm for infrastructure management
* Experience with AWS or equivalent cloud-based services (e.g., S3 or S3-like services)
* Active TS/SCI clearance, for which you must be a US citizen

Desired Skills
* Experience with Postgres or Trino
* Experience with Apache Iceberg, Hive, HDFS, or Spark
* Previous DOD experience
* Active CI polygraph
Position Requirements

10+ years of work experience