Mission Data Fabric Engineer
Listed on 2026-02-16
Overview
Tyto Athene is searching for a mission-oriented Data Fabric Engineer to join our CTO shop. This role is multi-faceted: the successful candidate will provide product, design, and development leadership to scope, design, build, field, and evolve cutting-edge data mesh solutions that meet the practical needs of DoD and US federal customers operating across episodic and enterprise environments.
You will leverage your experience designing, building, and deploying mission data mesh solutions for Defense customers, particularly in accommodating the needs of DDIL (Denied, Disrupted, Intermittent, and Limited bandwidth) environments, multiple security level requirements, and the unique considerations of data-oriented Zero Trust implementation patterns. You will also leverage your deep knowledge of integrating and making sense of mission edge sensor data to help build open, flexible, secure, and integrable mission data mesh solutions.
Responsibilities:
- Design, develop, and deploy data products tailored for classified and controlled unclassified information (CUI) environments, ensuring strict adherence to DoD security mandates
- Implement open, secure and extensible data solutions that allow for simple data integration with current and emerging data streams at the mission edge
- Develop data products with multi-level security (MLS) capabilities
- Integrate Zero Trust principles into data product design and access patterns, ensuring continuous verification of users, devices, and data
- Design and implement data mesh components and data products optimized for DDIL environments, considering low bandwidth, high latency, and intermittent connectivity
- Develop resilient data synchronization and replication strategies that ensure data availability and consistency at the tactical edge, even when disconnected
- Utilize edge computing principles and technologies to enable localized data processing and analysis within DDIL scenarios, minimizing reliance on central connectivity
- Architect data solutions that can function effectively with disconnected operations and intelligently synchronize when connectivity is restored
- Engineer data pipelines that securely handle data ingress and egress across different security enclaves and network boundaries
- Develop data products that seamlessly integrate with existing and legacy DoD mission systems, applications, and data sources, often requiring expertise in diverse data formats and protocols
- Promote and implement Open Standards and interoperability frameworks to facilitate data exchange across joint and coalition environments
- Work with various DoD components and agencies to standardize data definitions and exchange formats, fostering a cohesive data ecosystem
- Contribute to the design and implementation of tactical edge data mesh nodes that enable rapid decision-making and intelligence gathering in operational environments
- Optimize data products for minimal resource consumption and efficient operation on constrained hardware at the tactical edge
- Establish and enforce DoD-specific data governance policies, including data ownership, data classification, data retention, and data disposal guidelines
- Automate compliance checks and policy enforcement using computational governance frameworks, ensuring continuous adherence to DoD regulations and directives
Required:
- 10+ years of experience building and delivering solutions for US federal government customers
- Bachelor's Degree in Engineering, Computer Science, or related field; equivalent, relevant experience will be considered.
- 5+ years of cybersecurity experience with responsibilities involving complex client requirements assessment, solutions design, and implementation within the technology services industry.
- 5+ years of experience designing and delivering distributed data solutions in DoD DDIL environments
- Experience with Codice Foundation's Distributed Data Framework (DDF), Apache Solr, SolrCloud, and other open distributed data management frameworks
- Experience building event-driven data architectures for streaming sensor data using NiFi, Kafka, MQTT, and Airflow
- Experience building highly available distributed/federated data fabrics consisting of many connected compute nodes
- 3+…