
Data Engineer

Job in Milford Mill, Baltimore County, Maryland, USA
Listing for: Index Analytics LLC
Full Time position
Listed on 2026-01-02
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below


Job Overview

The Data Engineer will assist in supporting and moving the existing ETL solution into a more modernized platform while providing direct guidance to other members of the team. This includes acting as a go-to resource when technical challenges arise and providing leadership and coordination as part of a project team of data engineers, Cloud Architects, analysts, and testers to implement a Snowflake environment to ingest a wide variety of data sources for data analytics.

The Data Engineer determines structural, interface, and business requirements for developing and installing solutions. This includes the design of relational databases, other types of databases, and associated interfaces used for data storage and processing. The Data Engineer develops warehouse and data mart implementation strategies, data acquisition, and archive recovery. In pursuit of system and service optimization, the Data Engineer may perform other duties, such as investigating new data products and technology, evaluating new data sources, and reviewing existing products and products under development for adherence to the organization’s quality standards and ease of integration.

Responsibilities
  • Design and onboard new ETL streams using AWS Glue and PySpark
  • Migrate existing ETL streams from Databricks and Linux/Python scripts to AWS Glue
  • Re-architect and incrementally improve the performance of ETL solutions without impacting the delivery of new features requested by the client
  • Assist with development efforts to modernize data ingestion patterns
  • Maintain existing ETL pipelines and debug pipeline failures
  • Complete development tasks such as building custom reports, developing complex queries, performing ETL and data warehousing, and disseminating data to stakeholders
  • Perform cloud engineering, testing, DevOps, application support, data migrations, data loads, and job scheduling
  • Work with the DevSecOps team to set up any supporting infrastructure
  • Remediate security vulnerabilities in code identified through static code analysis or environment scanning
  • Model data to support ingestion of a wide variety of CMS data sources and requirements from data analysts and other stakeholders
  • Use Python and/or Linux shell to perform file management and other scripting tasks
  • Optimize existing processes to improve performance
  • Work closely with product owners and DevOps to ensure compliance with SDLC processes
  • Collaborate with business analysts to gather requirements, develop, and document business rules, create test scenarios to ensure properly working code, and communicate technical concepts for transparency
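To give a flavor of the ingestion work described above, here is a minimal, illustrative sketch of one common ETL step: parsing semi-structured JSON records, flattening nested fields, and emitting tabular (CSV) output. This is a standard-library Python example only; the names (`flatten_record`, the sample "claim" fields) are hypothetical and do not reflect the team's actual AWS Glue or PySpark code.

```python
import csv
import io
import json

def flatten_record(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten_record(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

def json_lines_to_csv(json_lines, out_stream):
    """Tiny ETL step: parse JSON lines, flatten them, write CSV rows."""
    rows = [flatten_record(json.loads(line)) for line in json_lines]
    fieldnames = sorted({key for row in rows for key in row})
    writer = csv.DictWriter(out_stream, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return rows

# Example usage with two hypothetical semi-structured records
lines = [
    '{"id": 1, "claim": {"state": "MD", "amount": 120.5}}',
    '{"id": 2, "claim": {"state": "VA", "amount": 99.0}}',
]
buf = io.StringIO()
rows = json_lines_to_csv(lines, buf)
```

In a production Glue job the same flatten-then-write pattern would typically be expressed with PySpark DataFrames rather than hand-rolled loops, but the shape of the transformation is the same.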
Qualifications
  • US citizen, or authorized to work in the US and having lived in the US for 3 of the last 5 years.
  • 5 years of relevant experience.
  • Bachelor’s degree or equivalent, or 4 years of relevant experience in lieu of a degree.
  • Strong experience with Cloud platforms such as AWS.
  • Experience with AWS Glue a plus.
  • Very strong experience with Python or PySpark (2+ years).
  • Experience writing Python or PySpark code for ETL processes.
  • Experience with AWS Lambda a plus.
  • Experience working with semi-structured data formats (XML, JSON, Parquet).
  • Strong experience with SQL.
  • Deployment automation experience via CI/CD tools such as AWS CodePipeline is a plus.
  • Experience with version control, preferably with GitHub.
  • Experience working within an Agile development environment, including development and testing activities.
  • Working knowledge of database security, audit, and RBAC controls.
  • Knowledge of Snowflake cloud database platform.
  • Hands‑on experience with database performance tuning, clustering key analysis, sizing, and cost optimization.
  • SnowPro certification or AWS certifications are a plus.
  • Experience analyzing data and presenting information to stakeholders.
  • Ability to obtain Public Trust level clearances.

We're dedicated to ensuring a safe and transparent recruitment process for all candidates and have implemented robust measures to protect your personal information. All employment-related communications will originate from a secure portal () or a corporate email address (). If you…
