Data Engineer
Listed on 2026-01-02
IT/Tech
Data Engineer, Data Analyst
Company: Highmark Health
Job Description: JOB SUMMARY
Due to Department of Defense (DOD) contract requirements, the incumbent who fills this position must be a US citizen and must also pass a background check.
- If you live within 50 miles of Pittsburgh, PA, you will be required to work on-site Tuesday, Wednesday, and Thursday.
- If you live more than 50 miles from Pittsburgh, PA, this will be a fully remote role. Remote employees should anticipate limited travel (3-4 times annually) to Pittsburgh, PA for company events.
The incumbent is an integral member of a technical team responsible for supporting the design, development, and maintenance of the organization's data infrastructure, ensuring the efficient and reliable movement, transformation, and storage of data across various systems and platforms. The incumbent will work closely with data professionals, analysts, and engineers to build and optimize data pipelines, ensuring data quality and integrity.
The ideal candidate is adaptable, solution-oriented, and capable of collaborating across multiple teams to support a broad range of data initiatives. The incumbent and team will be involved in every aspect of the data process, from idea generation and analysis through design, build, and support, using the latest technologies and design patterns.
- Design, develop, and maintain robust data processes and solutions to ensure the efficient movement and transformation of data across multiple systems.
- Develop and maintain data models, databases, and data warehouses to support business intelligence and analytics needs.
- Collaborate with stakeholders across IT, product, analytics, and business teams to gather requirements and provide data solutions that meet organizational needs.
- Monitor work against the production schedule, provide progress updates, and report any issues or technical difficulties to lead developers regularly.
- Implement and manage data governance practices, ensuring data quality, integrity, and compliance with relevant regulations; an illustrative quality-check sketch follows this list.
- Collaborate on the design and implementation of data security measures, including access controls, encryption, and data masking.
- Perform data analysis and provide insights to support decision‑making across various departments.
- Stay current with industry trends and emerging technologies in data engineering, recommending new tools and best practices as needed.
- Other duties as assigned or requested.
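As a minimal sketch of the kind of automated quality check implied by the data governance and data quality responsibilities above (assuming a pandas batch with hypothetical claim_id and amount columns and a hypothetical quarantine file):

```python
"""Minimal, illustrative data-quality check for a pipeline batch.

Sketch only: the columns (claim_id, amount), rules, and quarantine file
are hypothetical assumptions, not requirements of the role.
"""
import pandas as pd


def validate_batch(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple quality rules before loading the batch downstream."""
    # Rule 1: the primary key must be present and unique.
    if df["claim_id"].isna().any():
        raise ValueError("null claim_id values found")
    duplicate_count = int(df["claim_id"].duplicated().sum())
    if duplicate_count:
        raise ValueError(f"{duplicate_count} duplicate claim_id values found")

    # Rule 2: amounts must be non-negative; quarantine offending rows
    # instead of failing the whole batch.
    bad = df["amount"] < 0
    if bad.any():
        df.loc[bad].to_csv("quarantined_rows.csv", index=False)
        df = df.loc[~bad]

    # Rule 3: the batch must not be empty after filtering.
    if df.empty:
        raise ValueError("no rows left after quality filtering")
    return df


if __name__ == "__main__":
    batch = pd.DataFrame(
        {"claim_id": ["A1", "A2", "A3"], "amount": [120.0, -5.0, 310.5]}
    )
    clean = validate_batch(batch)
    print(f"{len(clean)} of {len(batch)} rows passed quality checks")
```

In a production pipeline, checks like these would normally run as a gated step in the orchestrator ahead of the load stage, with failures raising alerts rather than silently dropping data.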
- 3 years of experience with the design and analysis of algorithms, data structures, and design patterns used in building and deploying scalable, highly available systems.
- 3 years of experience in a data engineering, ETL development, or data management role.
- 3 years of experience in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, MongoDB).
- 3 years of experience in data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in Python for data manipulation, scripting, API integrations, and developing robust data engineering solutions.
- Strong SQL skills for complex data extraction, transformation, and loading (ETL/ELT) across various database systems.
- Demonstrated experience with version control systems (Git/GitHub/GitLab) for collaborative development, code management, and deployment best practices.
- Experience implementing and following SDLC best practices for data solutions, ensuring robust development, testing, and deployment.
- Hands‑on experience with Google BigQuery or comparable cloud‑native data warehouse solutions (e.g., Snowflake, Amazon Redshift, Azure Synapse Analytics) for designing, developing, and optimizing data architecture.
- Knowledge of data governance, data quality, and data lineage best practices.
- Demonstrated experience with dimensional modeling (Star Schema, Snowflake Schema) and other data modeling techniques, including Slowly Changing Dimensions (SCDs); an illustrative Type 2 SCD sketch follows this list.
- Experience with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources.
- Experience building and managing data pipelines using workflow…
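As a minimal sketch of the Type 2 Slowly Changing Dimension technique named in the requirements above (assuming a hypothetical dim_customer table with a tracked address attribute and effective-date columns):

```python
"""Minimal, illustrative Type 2 Slowly Changing Dimension update in pandas.

Sketch only: the dimension (dim_customer), tracked attribute (address),
and effective-date columns are hypothetical assumptions.
"""
from datetime import date

import pandas as pd

# Current state of the dimension table (hypothetical data).
dim_customer = pd.DataFrame(
    {
        "customer_id": [1, 2],
        "address": ["12 Oak St", "9 Elm Ave"],
        "valid_from": [date(2024, 1, 1), date(2024, 1, 1)],
        "valid_to": [None, None],
        "is_current": [True, True],
    }
)

# Incoming snapshot from the source system (customer 2 moved, 3 is new).
incoming = pd.DataFrame(
    {"customer_id": [2, 3], "address": ["44 Pine Rd", "7 Birch Ln"]}
)
today = date.today()

# 1. Find current rows whose tracked attribute changed.
current = dim_customer[dim_customer["is_current"]]
merged = current.merge(incoming, on="customer_id", suffixes=("_old", "_new"))
changed_ids = merged.loc[merged["address_old"] != merged["address_new"], "customer_id"]

# 2. Expire the old versions of the changed rows.
expire = dim_customer["customer_id"].isin(changed_ids) & dim_customer["is_current"]
dim_customer.loc[expire, "valid_to"] = today
dim_customer.loc[expire, "is_current"] = False

# 3. Insert new versions for changed rows and for brand-new customers.
known_ids = set(dim_customer["customer_id"])
new_rows = incoming[
    incoming["customer_id"].isin(changed_ids)
    | ~incoming["customer_id"].isin(known_ids)
].copy()
new_rows["valid_from"] = today
new_rows["valid_to"] = None
new_rows["is_current"] = True

dim_customer = pd.concat([dim_customer, new_rows], ignore_index=True)
print(dim_customer)
```

In a warehouse deployment this logic would more typically live in SQL (for example, a MERGE statement in BigQuery or Snowflake); the sketch only shows the Type 2 mechanics of expiring and reinserting changed rows.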