Sr Data Engineer
Westbrook, Cumberland County, Maine, 04098, USA
Listed on 2025-12-02
Listing for: DeWinter Group
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Analyst, Data Security
Job Description
Sr Data Engineer
Duration: 5 months (likely to extend)
Start Date: ASAP
Location: Westbrook, ME (2x per week onsite)
W2/C2C: W2 Only
Additional Notes:
- This role requires an "on call" rotation of 1 week per month/5 weeks.
- This is the primary reason for the role; the product requires a 24/7 support presence.
- Strong background in AWS cloud services, with a focus on data storage and pub/sub platforms (S3, SNS, SQS, Lambda, etc.)
- Proven experience in building and maintaining operational data pipelines, particularly with modern technologies like dbt, Airflow, and Snowflake.
- Strong proficiency in Python and SQL.
- Familiarity with Infrastructure as Code solutions like Terraform
- Experience with Quality Engineering processes
We are seeking a highly motivated and experienced Senior Data Engineer to join our team and support the accelerated advancement of our global instrument data pipelines. This role is pivotal in developing changes that deliver instrument data to stakeholders.
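The ingestion pattern this role centers on (S3 events fanned out through a pub/sub topic via Lambda) can be sketched as follows. This is a minimal illustration, not code from the employer; the injectable `publish` parameter stands in for an SNS client's publish call so the routing logic can be exercised offline.

```python
import json

def parse_s3_event(event):
    """Extract (bucket, key) pairs from an S3 notification event payload."""
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
        if rec.get("eventSource") == "aws:s3"
    ]

def handler(event, context=None, publish=print):
    """Lambda-style entry point: fan each newly landed object out to a topic.

    `publish` is a hypothetical seam for testing; in a real deployment it
    would be something like an SNS client's publish call, with SQS queues
    subscribed downstream.
    """
    messages = []
    for bucket, key in parse_s3_event(event):
        msg = json.dumps({"bucket": bucket, "key": key})
        publish(msg)  # fan out to downstream consumers
        messages.append(msg)
    return messages
```

Separating event parsing from publishing keeps the handler unit-testable without AWS credentials, which matters for the quality-engineering collaboration the posting calls out.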
Key Responsibilities:
- Design and implement ingestion and storage solutions using AWS services such as S3, SNS, SQS, and Lambda.
- Develop and implement analytical solutions leveraging Snowflake, Airflow, and Apache Iceberg.
- Collaborate with cross-functional teams to understand and meet their data processing needs.
- Develop and maintain scalable and reliable data solutions that support operational and business requirements.
- Document data flows, architecture decisions, and metadata to ensure maintainability and knowledge sharing.
- Design and implement fault-tolerant systems, ensuring high availability and resilience in our data processing pipelines.
- Actively participate in testing and quality engineering (QE) processes, collaborating closely with the QE team to ensure the reliability and accuracy of data solutions.
Qualifications:
- Strong problem-solving skills and the ability to operate independently, sometimes with limited information.
- Strong communication skills, both verbal and written, including the ability to communicate complex issues to both technical and non-technical users in a professional, positive, and clear manner.
- Initiative and self-motivation with strong planning and organizational skills.
- Ability to prioritize and adapt to changing business needs.
- Familiarity with Infrastructure as Code solutions like Terraform is a plus
- Familiarity with a broad range of technologies, including:
- Cloud-native data processing and analytics
- SQL Databases, specifically for analytics
- Orchestration and ELT technologies like Airflow and dbt
- Scripting and programming with Python or similar languages
- Infrastructure as Code languages like Terraform
- Ability to translate complex business requirements into scalable and efficient data solutions.
- Strong multitasking skills and the ability to prioritize effectively in a fast-paced environment.
- Candidates should have a minimum of eight years of experience in a similar role, preferably within a technology-driven environment.
- Experience building data services and ETL/ELT pipelines in the cloud using Infrastructure as Code and large-scale data processing engines
- Strong experience with SQL and one or more programming languages (Python preferred)
Measures of Success:
- Meeting delivery timelines for project milestones.
- Effective collaboration with cross-functional teams.
- Ensuring high standards of data accuracy and accessibility in a fast-paced, dynamic environment.
- Reduction in data pipeline failures or downtime through resilient and fault-tolerant design.
- Demonstrated contribution to the stability and scalability of the platform through well-architected, maintainable code.
- Positive feedback from stakeholders (engineering, product, or customer-facing teams) on delivered solutions.
- Active contribution to Agile ceremonies and improvement of team velocity or estimation accuracy.
- Proactive identification and mitigation of data-related risks, including security or compliance issues.
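The fault-tolerance and pipeline-reliability points above often come down to patterns like retry with exponential backoff around transient failures (e.g., throttled SNS/SQS calls or flaky upstream sources). The sketch below is a generic illustration of that pattern under assumed defaults, not the team's actual approach.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.5):
    """Wrap fn so transient failures are retried with exponential backoff.

    attempts and base_delay are illustrative defaults; real pipelines would
    also restrict which exception types count as retryable.
    """
    def wrapper(*args, **kwargs):
        for i in range(attempts):
            try:
                return fn(*args, **kwargs)
            except Exception:
                if i == attempts - 1:
                    raise  # exhausted retries: surface the failure
                time.sleep(base_delay * (2 ** i))  # 0.5s, 1s, 2s, ...
    return wrapper
```

In an Airflow/dbt context the orchestrator usually supplies retries at the task level, so a helper like this would only cover fine-grained calls inside a task.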