AI Engineer - Remote
Remote / Online - Candidates ideally in Minnetonka, Hennepin County, Minnesota, 55345, USA
Listed on 2026-02-14
Listing for: ChatGPT Jobs
Full Time, Remote/Work from Home position
Job specializations:
- IT/Tech: AI Engineer, Machine Learning/ML Engineer, Data Scientist, Data Engineer
Job Description
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities.
Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
AI Engineer - Remote
Location: UnitedHealth Group, Minnetonka, MN (On-site / Remote)
Salary: $116.70K - $140.20K/yr
Employment Type: Full-time
Primary Responsibilities:
- Design, develop, and deploy AI/ML and Generative AI models for predictive, prescriptive, and generative analytics across healthcare datasets
- Implement advanced architectures including LLMs (GPT, Gemini, LLaMA), Retrieval-Augmented Generation (RAG), and Agentic Frameworks
- Build and optimize end-to-end pipelines using Python (scikit-learn, Pandas, Flask, LangChain), PySpark, T-SQL, and SQL
- Develop and fine-tune multiple GenAI models for NLP, summarization, prompt engineering, and conversational AI
- Apply MLOps best practices: model versioning, drift analysis, quantization, MLflow, containerization with Docker, and CI/CD pipelines
- Work with cloud platforms: Azure (Databricks, ML Studio, Data Factory, Data Lake, Delta Tables), AWS, and GCP for scalable deployments
- Integrate data warehousing solutions like Snowflake and manage large-scale data pipelines
- Collaborate in an Agile environment, participate in sprint planning, and maintain code repositories using GitHub/Git
- Ensure compliance with security and governance standards for healthcare data
- Coach and mentor junior team members
- AI/ML Foundations
- Design and implement machine learning and deep learning models for classification and NLP tasks.
- Build and maintain end-to-end ML pipelines including data preprocessing, model training, evaluation, and deployment.
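As a rough, hypothetical illustration of the end-to-end ML pipeline work described above, the sketch below chains preprocessing, training, and evaluation for a text-classification task with scikit-learn; the input format, model choice, and split settings are assumptions for demonstration, not details taken from this posting.

```python
# Minimal sketch of an end-to-end text-classification pipeline (scikit-learn).
# Inputs: `texts` (list of strings) and `labels` (list of class labels) are
# illustrative placeholders, not a dataset referenced by this posting.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

def train_text_classifier(texts, labels):
    """Preprocess, train, and evaluate a simple NLP classification model."""
    X_train, X_test, y_train, y_test = train_test_split(
        texts, labels, test_size=0.2, random_state=42, stratify=labels
    )
    model = Pipeline([
        ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),  # preprocessing
        ("clf", LogisticRegression(max_iter=1000)),                        # training
    ])
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))            # evaluation
    return model
```

The same train/evaluate/deploy structure carries over to the PySpark and deep learning stacks named elsewhere in this posting.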
- Generative AI & LLM Engineering
- Develop and fine-tune LLM-based applications using LangChain, LangGraph, and other GenAI frameworks.
- Build multi-agent workflows and RAG (Retrieval-Augmented Generation) pipelines for enterprise use cases.
- Leverage AWS Bedrock and Google Vertex AI for scalable and production-grade GenAI deployments.
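To make the RAG and multi-agent responsibilities above more concrete, here is a minimal, framework-agnostic sketch: a TF-IDF retriever supplies context to a prompt, and `call_llm` is a hypothetical placeholder for whichever hosted model endpoint (e.g., via the Bedrock or Vertex AI SDKs) a given team uses; neither the function nor the prompt format comes from this posting.

```python
# Minimal retrieval-augmented generation (RAG) skeleton.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

class TfidfRetriever:
    """Index a document collection and return the top-k most similar passages."""
    def __init__(self, documents):
        self.documents = documents
        self.vectorizer = TfidfVectorizer()
        self.matrix = self.vectorizer.fit_transform(documents)

    def retrieve(self, query, k=3):
        scores = cosine_similarity(self.vectorizer.transform([query]), self.matrix)[0]
        top = scores.argsort()[::-1][:k]
        return [self.documents[i] for i in top]

def call_llm(prompt):
    # Hypothetical stand-in for a hosted-model call (e.g., Bedrock or Vertex AI SDK).
    raise NotImplementedError("wire this to the provider SDK in use")

def answer_with_rag(retriever, question):
    # Ground the model's answer in the retrieved passages.
    context = "\n\n".join(retriever.retrieve(question))
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```

In production the TF-IDF retriever would typically be replaced by a vector store and embedding model, with orchestration handled by a framework such as LangChain or LangGraph.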
- LLM Security & Responsible AI
- Implement guardrails to prevent prompt injection, reduce hallucinations, and ensure safe model outputs.
- Apply best practices for LLM security, including output moderation, access control, and auditability.
- Ensure compliance with Responsible AI principles: fairness, transparency, and explainability.
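As a simplified illustration of the guardrail and moderation duties listed above, the sketch below screens user input for common prompt-injection phrasings and redacts disallowed terms from model output before it is returned; the regex patterns and blocked-term list are assumptions chosen for demonstration, not a production policy.

```python
# Simplified input/output guardrails around an LLM call.
import re

# Illustrative patterns only; real deployments add model-based and policy-driven checks.
INJECTION_PATTERNS = [
    r"ignore (all|any|previous) instructions",
    r"reveal (your|the) system prompt",
    r"disregard (the )?above",
]
BLOCKED_OUTPUT_TERMS = ["ssn", "social security number"]  # example moderation list

def check_prompt(user_input: str) -> None:
    """Reject inputs that look like prompt-injection attempts."""
    lowered = user_input.lower()
    for pattern in INJECTION_PATTERNS:
        if re.search(pattern, lowered):
            raise ValueError("Potential prompt injection detected; request blocked.")

def moderate_output(model_output: str) -> str:
    """Redact disallowed terms before returning the model's answer."""
    redacted = model_output
    for term in BLOCKED_OUTPUT_TERMS:
        redacted = re.sub(term, "[REDACTED]", redacted, flags=re.IGNORECASE)
    return redacted
```

Logging blocked inputs and redactions alongside these checks also supports the auditability expectation noted above.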
- Cloud-Native AI Development
- Deploy and manage GenAI solutions on AWS and Google Cloud, utilizing services like Bedrock, SageMaker, and Vertex AI.
- Integrate LLMs with enterprise systems using REST APIs, SDKs, and orchestration tools.
- Collaboration & Mentorship
- Work closely with product managers, data scientists, and platform teams to translate business needs into GenAI solutions.
- Mentor junior engineers and contribute to internal knowledge-sharing initiatives.
Qualifications:
- 5 years of hands-on experience in AI/ML techniques such as Prompt Engineering, RAG (Retrieval-Augmented Generation), and Agentic AI
- Hands-on experience with Generative AI frameworks/architectures (LangChain, Hugging Face, OpenAI APIs)
- Solid expertise in Python, PySpark, T‑SQL, SQL, and big data technologies (Hadoop, Spark)
- Deep knowledge of statistics, data modeling, and simulation
- Proficiency in cloud technologies: Azure (Databricks, ML Studio), AWS Bedrock, Azure Foundry, Kafka, and cloud-native AI services
- Familiarity with CI/CD pipelines, GitHub Actions, and containerization tools
- Solid understanding of LLM security, prompt engineering, and responsible AI practices
- Proven problem-solving skills and the ability to handle ambiguity
- Experience with internal data management and big data handling
- Experience with LLMs (GPT, Gemini, LLaMA) and prompt‑based learning
- Knowledge of Kafka, TensorFlow, and advanced…