
GenAI Machine Learning Ops Engineer

Job in Raleigh, Wake County, North Carolina, 27601, USA
Listing for: Genworth Financial, Inc.
Full Time position
Listed on 2025-12-27
Job specializations:
  • IT/Tech
    Cloud Computing, Data Engineer
Salary/Wage Range or Industry Benchmark: 125,000 – 150,000 USD yearly
Job Description

Location: Raleigh, North Carolina
Time type: Full time
Posted: 2 days ago
Job requisition: REQ-250492

At Enact, we understand that there’s no place like home. That’s why we bring our deep expertise, insightful offerings, and extra mile service to work every day to help lenders put more people in homes and keep them there.

We’re looking for a **GenAI Machine Learning Ops Engineer** in **Raleigh, NC** to join us in fulfilling our mission, while utilizing our values of excellence, improvement, and connection. In this role, you’ll design, build, and operate the end-to-end platform and pipelines that move models from notebooks to resilient, observable, cost-efficient production services on AWS. You’ll partner with Data Science, Cloud Engineering, and Enterprise Architecture to standardize patterns, automate delivery, and ensure models are versioned, governed, and monitored across their lifecycle—batch and real time.

The ideal candidate has a proven track record building and operating cloud-native ML platforms and pipelines—taking models from experimentation to reliable, observable, and cost-efficient production. They lead end-to-end delivery across training, packaging, CI/CD, deployment (batch and real-time), and monitoring, using modern tooling and Infrastructure as Code. They’re comfortable mentoring data scientists and engineers when needed on MLOps best practices—containerization, testing, model/version governance, and cloud-first design—while setting standards, templates, and golden paths that level up the team’s ability to ship ML services at scale.
**LOCATION**
Enact Headquarters, Raleigh, NC – Hybrid Schedule

**YOUR RESPONSIBILITIES**
* **Productionize ML:** Build repeatable paths from experimentation to deployment (batch, streaming, and low-latency endpoints), including feature engineering, training, evaluation, packaging, and release.
* **Own ML Platform:** Stand up and operate core platform components—model registry, feature store, experiment tracking, artifact stores, and standardized CI/CD for ML.
* **Pipeline Engineering:** Author robust data/ML pipelines (orchestrated with Step Functions / Airflow / Argo) that train, validate, and release models on schedules or events.
* **Observability & Quality:** Implement end-to-end monitoring, data validation, model/drift checks, and alerting against SLAs/SLOs.
* **Governance & Risk:** Enforce model/version lineage, reproducibility, approvals, rollback plans, auditability, and cost controls aligned to enterprise policies.
* **Partner & Mentor:** Collaborate with on-shore/off-shore teams; coach data scientists on packaging, testing, and performance; contribute to standards and reviews.
* **Dev Ex & Templates:** Provide golden paths, IaC modules, and templates that help DS/DE teams ship safely and quickly (containers, build specs, Git workflows).
* **Hands-on Delivery:** Prototype new patterns; troubleshoot production issues across data, model, and infrastructure layers.
**YOUR QUALIFICATIONS**
* Bachelor’s degree in computer science, information technology, cloud engineering, or a similar field
* 5+ years of experience with Python (pandas, PySpark, scikit-learn; familiarity with PyTorch/TensorFlow helpful), bash, and Make; strong containerization with Docker.
* 5+ years of experience with ML tooling: SageMaker (training, processing, pipelines, model registry, endpoints) or equivalents (Kubeflow, MLflow/Feast, Vertex AI, Databricks ML).
* 5+ years of experience with pipelines & orchestration: Step Functions, SageMaker Pipelines, event-driven designs with EventBridge/SQS/Kinesis.
* 3+ years of experience with AWS foundations: ECR/ECS, Lambda, API Gateway, S3, Glue/Athena/EMR, RDS/Aurora (PostgreSQL/MySQL), DynamoDB, CloudWatch, IAM, VPC, WAF.
* Snowflake foundations: warehouses, databases, schemas, stages, Snowflake SQL, RBAC, UDFs, Snowpark.
* 3+ years of hands-on experience with CI/CD: CodeBuild/CodePipeline or GitHub Actions/GitLab; blue/green, canary, and shadow deployments for models and services.
* Proven experience with feature pipelines for…
Position Requirements
5+ years of work experience