Mid-Level Data Engineer
Job in
Huntsville, Madison County, Alabama, 35803, USA
Listed on 2026-03-03
Listing for:
Astrion
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below
Mid-Level Data Engineer
Location:
Huntsville, Alabama
Job Status:
Full Time
Clearance:
Active TS/SCI
Astrion is seeking a Mid-Level Data Engineer to join our analytics team in Huntsville, Alabama.
In this role, you will build data analytics infrastructure and perform data engineering, data mining, exploratory analysis, predictive analysis, and statistical analysis. You will apply scientific techniques to transform petabyte-scale data into insightful data products that enable data-driven decisions. You will partner with Data Scientists to refactor manual workflows into highly automated MLOps-based workloads. You will work using Scrumban techniques and be embedded with end users.
For this role you must have an active Secret security clearance with ability to obtain TS/SCI with CI polygraph.
REQUIRED QUALIFICATIONS / SKILLS
Significant experience as a Data Engineer or in an advanced analytical role
Experience in data engineering and/or software development
Expertise with Python, GIT, YAML, Docker, and SQL
Knowledge of CI/CD, DevSecOps, and Agile methodologies
Experience developing back-end systems and services
Understanding of software design and system integration
PREFERRED QUALIFICATIONS / SKILLS
Experience as a Site Reliability Engineer (SRE)
Experience with AIOps and FinOps
Experience with petabyte-scale data sets
Experience with large-scale, multi-INT analytics
BS or MS in Computer Science, Statistics, Mathematics, Physics, or another quantitative field
RESPONSIBILITIES
Build and maintain data pipelines, ETL processes, and storage systems
Develop services and extend infrastructure to enable machine learning workflows
Integrate software components into functional data systems
Write clean, testable, maintainable code in Python and other languages
Implement CI/CD pipelines and DevSecOps best practices
Create technical documentation for software systems
Collaborate across teams to share knowledge and leading practices
The team will work with technologies including:
Open source, commercial, and Government software packages such as Kafka, Beam, NumPy, Kubeflow, NVIDIA Triton, PyTorch, TensorFlow, Weaviate, Neo4j, and Grafana
Cloud-native techniques and containerization with Docker
Infrastructure as Code with Terraform
GitOps patterns and CI/CD with tools like GitLab, Argo, and Harness
SAST/DAST security scanning with tools like SonarQube
Kubernetes and K3s orchestration with tools like Rancher and Konvoy