Junior Data Engineer – Enterprise Monitoring
Listed on 2026-01-02
IT/Tech
Data Engineer, Data Analyst
Summary
Junior Data Engineer – Enterprise Monitoring
Springfield, VA
Are you ready to enhance your skills and build your career in a rapidly evolving business climate? Are you looking for a career where professional development is embedded in your employer’s core culture? If so, Chenega Military, Intelligence & Operations Support (MIOS) could be the place for you! Join our team of professionals who support large‑scale government operations by leveraging cutting‑edge technology and take your career to the next level!
Chenega Agile Real‑Time Solutions (CARS) was created with the purpose of providing integrated enterprise IT support to Federal customers both CONUS and OCONUS. CARS employs Subject Matter Experts (SMEs) with decades of experience working in the Federal marketplace.
Chenega Agile Real‑Time Solutions (CARS) is seeking a Junior Data Engineer – Enterprise Monitoring to contribute to our team. This role is an opportunity to engage directly in data collection, processing, and infrastructure support. You will operate within an established framework, collaborating closely with experienced team members, and will be counted on to independently manage tasks and proactively seek solutions to ensure we maintain reliable, secure, and scalable data practices.
We are looking for someone with proven data analysis skills who demonstrates a strong desire and aptitude to learn new technologies, especially as tools and cloud services (AWS/Azure/GCP) evolve, and to become proficient in our infrastructure monitoring tools (Grafana/Prometheus).
- Build Dashboards: Design and maintain interactive dashboards, charts, and graphs using tools like Grafana, Tableau, or Power BI to communicate insights effectively.
- Translate Data to Metrics: Work with stakeholders to translate data analysis results into clear, operational metrics that support day‑to‑day decision‑making.
- Data Integrity: Perform basic data quality checks and troubleshooting to ensure the accuracy and reliability of reported metrics.
- Utilize Cloud Platforms: Learn to operate within our cloud environments (e.g., AWS, Azure, GCP), focusing initially on data access, storage, and retrieval.
- Contribute to Platforms: Understand and adhere to best practices for cost‑effective data storage and basic security protocols as you gain experience with services like AWS S3 or Azure Data Lake.
- Ensure all data solutions comply with DoD/Federal security standards, including encryption, access control, and audit logging.
- Support classified data environments with strict adherence to TS/SCI protocols and governance policies.
- Gather Requirements: Collaborate with managers, data scientists, and analysts to understand requirements and translate requests into actionable data deliverables (dashboards/reports).
- Present Findings: Clearly explain insights and data findings to both technical and non‑technical audiences.
- Automate Tasks: Develop basic scripts (SQL/Python) to automate repetitive data analysis or reporting tasks.
- Maintain Documentation: Keep documentation of data systems, workflows, and business rules current and organized.
- Adopt Engineering Practices: Learn to use version control (Git) for code management and adopt best practices for code collaboration within a data environment.
- Other duties as assigned.
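For a sense of the "Automate Tasks" duty above, a repetitive reporting task might be scripted roughly as follows. This is a minimal, hedged sketch using only the Python standard library; the database path, the `metrics` table, and its columns are hypothetical, not part of any actual Chenega system:

```python
# Minimal sketch of automating a recurring report with SQL + Python.
# The database file, table, and column names are hypothetical examples.
import csv
import sqlite3


def export_daily_metrics(db_path: str, out_csv: str) -> int:
    """Run a summary query and write the grouped rows to a CSV report.

    Returns the number of data rows written (excluding the header).
    """
    conn = sqlite3.connect(db_path)
    try:
        # Aggregate raw events into one row per day.
        rows = conn.execute(
            "SELECT day, COUNT(*) AS events "
            "FROM metrics GROUP BY day ORDER BY day"
        ).fetchall()
    finally:
        conn.close()

    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["day", "events"])  # header row
        writer.writerows(rows)
    return len(rows)
```

A script like this could then be scheduled (e.g., via cron or a workflow tool) so the report regenerates without manual effort.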
- Master’s Degree and 0+ years of relevant experience, or
- Bachelor’s Degree and 2+ years of relevant experience, or
- Associate’s Degree and 4+ years of relevant experience, or
- High school diploma or GED equivalent and 6+ years of relevant experience
- Experience to include:
- Proficiency in a dashboarding tool (e.g., Tableau, Power BI) and willingness to learn enterprise monitoring tools (Grafana, Prometheus).
- Solid programming skills in Python (R or JavaScript a plus).
- Proficiency in SQL and experience working with various database query languages.
- Basic understanding of data movement concepts with a strong desire to learn and assist in the design and maintenance of ETL/ELT pipelines.
- Knowledge of or a strong desire to learn cloud environments (AWS, Azure, GCP).
- Familiarity with data warehousing, data modeling, version control (Git), and workflow automation principles.