
Data Operations Engineer

Job in Myrtle Point, Coos County, Oregon, 97458, USA
Listing for: Virgin Pulse
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Data Engineer, Database Administrator, Data Analyst
Job Description & How to Apply Below
Location: Myrtle Point

Overview

Who We Are

Ready to create a healthier world? We are ready for you! Personify Health is on a mission to simplify and personalize the health experience to improve health and reduce costs for companies and their people. At Personify Health, we believe in offering total rewards, flexible opportunities, and a diverse, inclusive community where every voice matters. Together, we're shaping a healthier, more engaged future.

Responsibilities

Ready to Build Your Data Engineering Career in Healthcare Technology?

We're seeking an early-career data professional who can support development and maintenance of data systems used in healthcare and TPA claims processing. As our Data Engineer, you'll assist with building and troubleshooting data pipelines while ensuring data accuracy across on-prem and cloud environments. This role offers hands‑on learning opportunities working with ETL/ELT, SQL, and healthcare data workflows while spending approximately 50% of your time handling support tickets and operational data issues.

What makes this role different

  • Healthcare data focus: Work with healthcare and TPA claims processing data systems, gaining specialized industry expertise
  • Balanced responsibilities: Split time between pipeline development (50%) and operational support tickets (50%) for well‑rounded experience
  • Growth opportunity: Learn from senior engineers while developing technical skills in ETL/ELT, SQL, Python, and cloud data workflows
  • Cross‑functional collaboration: Partner with Data Analysts, Developers, and business users to support reporting and data operations

What You'll Actually Do

  • Support data pipelines: Assist in maintaining and troubleshooting ETL/ELT data pipelines used for healthcare and TPA claims processing across on‑prem and cloud environments.
  • Handle operational support: Manage support tickets (~50% of time), responding to user requests, researching data questions, and helping resolve operational data problems efficiently.
  • Work with core technologies: Use Python and SQL to support data extraction, transformation, validation, and loading while monitoring pipeline performance and resolving data issues.
  • Monitor and troubleshoot: Review logs, investigate failed jobs, and correct data discrepancies while supporting daily process monitoring including production processes and application performance.
  • Maintain data quality: Execute routine data quality checks, maintain documentation, and follow up on accuracy concerns to ensure reliable data across systems.
  • Support database operations: Work with data management tasks in systems such as PostgreSQL, Oracle, and cloud‑based databases while learning healthcare data formats.
  • Collaborate cross‑functionally: Partner with Data Analysts, Developers, and business users to understand data needs and support ongoing reporting and data operations.
  • Continue learning: Participate in team meetings, sprint activities, and knowledge‑sharing sessions while working with senior team members to develop data engineering skills.
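The Python and SQL work described above (extraction, validation, flagging data discrepancies) might look like the following minimal sketch. Table and column names here are hypothetical, and the standard‑library sqlite3 module stands in for the PostgreSQL/Oracle systems named in the posting:

```python
import sqlite3

# Hypothetical claims table; a real pipeline would connect to PostgreSQL or Oracle.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE claims (
        claim_id   TEXT PRIMARY KEY,
        member_id  TEXT,
        amount     REAL,
        status     TEXT
    )
""")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?, ?)",
    [
        ("C001", "M10", 125.50, "processed"),
        ("C002", "M11", None,   "pending"),    # missing amount: a quality issue
        ("C003", None,  80.00,  "processed"),  # missing member: a quality issue
    ],
)

def find_quality_issues(conn):
    """Return claim_ids failing basic completeness checks."""
    rows = conn.execute(
        "SELECT claim_id FROM claims "
        "WHERE amount IS NULL OR member_id IS NULL "
        "ORDER BY claim_id"
    ).fetchall()
    return [row[0] for row in rows]

issues = find_quality_issues(conn)
print(issues)  # flagged claims needing follow-up
```

Routine checks like this would typically run on a schedule, with flagged records written up in a support ticket rather than printed.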
Qualifications

What You Bring to Our Mission

The foundational experience:
  • 2-3 years of experience in data engineering, analytics engineering, or a related technical role
  • AWS Certification (or willingness to obtain within 6-12 months), such as AWS Cloud Practitioner or AWS Developer – Associate
  • Experience handling support tickets or operational data issues strongly preferred (~50% of role)
The technical expertise:
  • Proficiency in Python (required) and SQL, including writing queries, joins, basic transformations, and troubleshooting
  • Hands‑on experience with relational databases (PostgreSQL, Oracle, AWS RDS) and familiarity with basic data warehouse concepts
  • Understanding of ETL/ELT pipelines, data validation, and data quality monitoring
  • Basic knowledge of Linux command line for navigating servers and running scripts
  • Some exposure to cloud environments (AWS, Azure) preferred but not required
The collaboration tools:
  • Familiarity with JIRA and Git/Bitbucket for version control and task management
  • Effective written and verbal communication skills with ability to document findings and processes
The professional qualities:
  • Strong attention to detail and focus on data accuracy and…