
Scientific Data Infrastructure Engineer

Job in Westbrook, Cumberland County, Maine, 04098, USA
Listing for: IDEXX GmbH
Full Time position
Listed on 2026-01-10
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Security
Job Description
We're proud to be a global leader in pet healthcare innovation. Our diagnostic instruments, software, tests, and services help veterinarians around the world advance medical care, improve staff efficiency, and build more economically successful practices. At IDEXX, you'll be part of a team that's passionate about making a difference in the lives of pets, people, and our planet.
**This role is onsite in Westbrook, Maine.**
**What You Will Do**
**Infrastructure & Automation Leadership**
* Design and implement CI/CD pipelines using GitHub Actions, GitLab CI/CD, AWS CodePipeline, and Google Cloud Build to streamline deployment of mass spectrometry-based data processing systems and proteomic computing workloads
* Develop and maintain infrastructure-as-code solutions using Terraform for AWS and Google Cloud environments
* Build automated deployment systems for serverless functions using AWS Lambda and Google Cloud Run
* Orchestrate large-scale batch processing jobs using AWS Batch and Google Cloud Batch
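By way of illustration, a minimal Python sketch of the kind of AWS Batch job submission the last bullet describes; the queue name, job definition, and S3 path are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch: submit a mass spectrometry processing job to AWS Batch with boto3.
# The queue, job definition, and S3 URI below are hypothetical placeholders.
import boto3

batch = boto3.client("batch", region_name="us-east-1")

response = batch.submit_job(
    jobName="ms-run-001-processing",
    jobQueue="proteomics-processing-queue",   # hypothetical queue name
    jobDefinition="ms-pipeline-job:3",        # hypothetical job definition:revision
    parameters={"input_uri": "s3://example-bucket/raw/run-001.raw"},
)
print("Submitted AWS Batch job:", response["jobId"])
```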
**Database Architecture & Data Pipeline Development**
* Design and implement scalable database solutions for proteomic, metabolomic, and genomic data storage and retrieval
* Architect and optimize Snowflake data warehouses for large-scale multi-omic datasets
* Build ETL/ELT workflows for instrument data ingestion, including metadata capture and provenance tracking (sketched after this list)
* Manage both SQL and NoSQL database systems supporting research applications
* Implement data governance, backup, disaster recovery, and audit trail strategies
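As an illustration of the instrument-data ingestion and provenance tracking mentioned above, a minimal Python sketch that records run metadata in a Snowflake table using snowflake-connector-python; the account, credentials, table, and column names are hypothetical.

```python
# Illustrative sketch: record instrument-run metadata in Snowflake for provenance tracking.
# Account, warehouse, database, schema, table, and column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_service",
    password="***",            # in practice, retrieved from a secrets manager
    warehouse="OMICS_WH",
    database="RESEARCH",
    schema="INSTRUMENT_RUNS",
)
cur = conn.cursor()
try:
    cur.execute(
        "INSERT INTO run_metadata (run_id, instrument_id, acquired_at, source_file) "
        "VALUES (%s, %s, %s, %s)",
        ("run-001", "ms-qe-01", "2026-01-09 14:32:00", "s3://example-bucket/raw/run-001.raw"),
    )
finally:
    cur.close()
    conn.close()
```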
**Scientific Computing Operations**
* Create and manage computing infrastructure for mass spectrometry-based data processing
* Implement scalable solutions for high-throughput multi-omic data pipelines from analytical instruments
* Deploy and maintain data annotation platforms and curation systems
* Build monitoring and alerting systems that track pipeline health, processing backlogs, and system performance
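For example, the pipeline-health monitoring described in the last bullet could be as simple as publishing a backlog metric for a CloudWatch alarm to watch; a minimal Python sketch follows, with a hypothetical namespace, metric, and dimension.

```python
# Illustrative sketch: publish a pipeline-backlog metric to Amazon CloudWatch so that an
# alarm can alert when unprocessed instrument runs accumulate. All names are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="ScientificDataPipelines",
    MetricData=[
        {
            "MetricName": "UnprocessedInstrumentRuns",
            "Value": 42,  # e.g. number of runs currently awaiting processing
            "Unit": "Count",
            "Dimensions": [{"Name": "Pipeline", "Value": "proteomics-lcms"}],
        }
    ],
)
```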
**Cross-functional Collaboration**
* Partner with research scientists, bioinformaticians, and software engineers to understand computational requirements and translate scientific needs into technical solutions
* Provide technical leadership to implement modern DevOps practices across research workflows
* Develop documentation, playbooks, and training materials to enable self-service capabilities for research teams
* Mentor team members and drive adoption of DevOps best practices
**What You Need to Succeed**
* Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
* 7-10+ years of experience in DevOps, Database Architecture, or related fields
* Proven track record of leading complex infrastructure projects, preferably in research or data-intensive environments
* Strong experience with CI/CD tools (GitHub Actions, GitLab CI/CD, AWS CodePipeline, Google Cloud Build, Jenkins, ArgoCD)
* Proficiency in infrastructure-as-code (Terraform, CloudFormation)
* Advanced Python programming and scripting capabilities (Bash, PowerShell)
* Experience with container orchestration (Kubernetes, Docker)
* Cloud platform expertise (AWS, Google Cloud) with focus on serverless computing and batch processing systems
* Strong database administration and architecture skills, including:
  + Snowflake data warehouse design, optimization, and administration
  + SQL databases (PostgreSQL, MySQL, SQL Server)
  + NoSQL databases (MongoDB, DynamoDB, Cassandra)
  + Database performance tuning and ETL/ELT pipeline development
**Preferred Qualifications**
* Experience in life sciences, biotechnology, diagnostics, or other research-intensive industries
* Familiarity with scientific data workflows, laboratory informatics, or instrument data pipelines
* Knowledge of LCMS, mass spectrometry, or other analytical chemistry data formats and processing
* Understanding of bioinformatics file formats and scientific data standards
* Understanding of regulatory requirements for diagnostic software (ISO 13485, FDA 21 CFR Part 11)
* Experience with Atlassian suite administration (Jira, Confluence, Bitbucket)
* Familiarity with Active Directory and identity management systems
* Snowflake SnowPro certification
**What…**