
Data/ETL Developer

Job in Baltimore, Maryland, 21276, USA
Listing for: TriTech Enterprise Systems, Inc.
Full Time position
Listed on 2025-12-19
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Data Analyst, Cloud Computing
Salary/Wage Range: $100,000 - $125,000 USD per year
Job Description

Overview

TriTech Enterprise Systems, Inc. (TriTech) is seeking a Data/ETL Developer to support a State of Maryland contract. This hybrid position is based in Baltimore, Maryland. The candidate will be responsible for designing, building, and maintaining the data pipelines and infrastructure that support data-driven decision-making and analytics.

Responsibilities
  • Design, develop and maintain data pipelines and ETL processes to collect, process, and store structured and unstructured data
  • Build data architecture and storage solutions, including data lakehouses, data lakes, data warehouses, and data marts, to support analytics and reporting
  • Develop data reliability, efficiency, and quality checks and processes
  • Prepare data for data modeling
  • Monitor and optimize data architecture and data processing systems
  • Collaborate with multiple teams to understand requirements and objectives
  • Administer testing and troubleshooting related to performance, reliability, and scalability
  • Create and update documentation
Additional Responsibilities

In addition to the responsibilities listed above, the individual will perform the following using data architecture and modeling techniques:

  • Design and implement robust, scalable data models to support PMM applications, analytics, and business intelligence initiatives
  • Optimize data warehousing solutions and manage data migrations in the AWS ecosystem, utilizing Amazon Redshift, RDS, and DocumentDB services
ETL Development
  • Develop and maintain scalable ETL pipelines using AWS Glue and other AWS services to enhance data collection, integration, and aggregation
  • Ensure data integrity and timeliness in the data pipeline, troubleshooting issues during data processing
Data Integration
  • Integrate data from various sources using AWS technologies, ensuring seamless data flow across systems
  • Collaborate with stakeholders to define data ingestion requirements and implement solutions to meet business needs
Performance Optimization
  • Monitor, tune, and manage database performance to ensure efficient data loads and queries
  • Implement best practices for data management within AWS to optimize storage and computing costs
Security and Compliance
  • Ensure all data practices comply with regulatory requirements and department policies
  • Implement and maintain security measures to protect data within AWS services
Team Collaboration and Leadership
  • Lead and mentor junior data engineers and team members on AWS best practices and technical challenges
  • Collaborate with UI/API teams, business analysts, and other stakeholders to support data-driven decision-making
Innovation and Continuous Improvement
  • Explore and adopt new technologies within the AWS cloud to enhance the capabilities of the data platform
  • Continuously improve existing systems by analyzing business needs and technology trends
Education
  • This position requires a bachelor’s or master’s degree from an accredited college or university with a major in computer science, statistics, mathematics, economics, or related field
  • Three (3) years of equivalent experience in a related field may be substituted for the bachelor’s degree
Experience
  • Minimum of three (3) years of experience as a data engineer
  • Specialized experience as a data engineer or similar role, with a strong understanding of data architecture and ETL processes
  • Minimum of five (5) years of ETL coding experience
  • Proficiency in Python and SQL for data processing and automation
  • Experience with distributed computing frameworks like Apache Spark
  • Experience with the AWS data environment (Glue, S3, DocumentDB, Redshift, RDS, Athena, etc.)
  • Experience with data warehouses/RDBMS (e.g., Redshift) and NoSQL data stores (DocumentDB, DynamoDB, OpenSearch, etc.)
  • Experience in building data lakes using AWS Lake Formation
  • Experience with workflow orchestration and scheduling tools (AWS Step Functions, AWS MWAA, etc.)
  • Strong understanding of relational databases (tables, views, indexes, table spaces)
  • Experience with source control tools such as GitHub and related CI/CD processes
  • Ability to analyze a company’s data needs, combined with strong problem-solving skills
  • Experience with the SDLC and Agile methodologies

TriTech is an Equal Opportunity Employer.
