Senior AI Engineer - On-site or Hybrid
Listed on 2025-10-29
IT/Tech
AI Engineer, Machine Learning/ ML Engineer
Senior AI Engineer - On-site or Hybrid Opportunity
This role is with The Mutual Group.
Overview
As a Senior AI Engineer, you will be responsible for designing, building, and deploying solutions that leverage large language models (LLMs), Generative AI, and natural language processing (NLP) to enhance customer and agent experiences. This includes developing intelligent automation, AI-powered services, and supporting teams in the responsible and effective use of AI technologies. Success in this role requires creativity, a passion for learning, and the ability to clearly explain complex ideas.
You’ll join a team that values curiosity, open communication, and a strong commitment to responsible AI—working together to create smart, impactful tools that make a real difference.
- Multimodal Data Processing and Automation:
Ingest and preprocess diverse file types: PDFs, scanned images, emails (EML/MSG), audio, video, and structured/unstructured text from content management systems.
- Apply OCR (Optical Character Recognition) and speech-to-text models to extract meaningful data from documents and media.
- Use LangChain + LangGraph to orchestrate agentic workflows for parsing and reasoning across multimodal inputs.
- Build AI pipelines that classify, extract, and validate key entities (e.g., policy numbers, claim dates, insured parties) from documents.
- Integrate LLMs via Bedrock or Hugging Face to summarize, interpret, and flag anomalies in claims and underwriting documents.
- Implement retrieval-augmented generation (RAG) using Vector DBs to ground LLM responses in enterprise knowledge.
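As a rough illustration of the RAG bullet above, the sketch below grounds an LLM prompt in retrieved enterprise text. TF-IDF retrieval stands in for a production vector database, and the documents, the `retrieve` helper, and `build_prompt` are invented placeholders rather than part of the team's actual stack.

```python
# Minimal RAG sketch: retrieve the most relevant passages for a question,
# then assemble a grounded prompt for an LLM. TF-IDF stands in for a
# vector DB such as Pinecone or Weaviate; the documents are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Policy POL-1234 covers water damage up to $50,000 with a $1,000 deductible.",
    "Claim CLM-9876 was filed on 2025-03-02 by the insured party J. Doe.",
    "Underwriting guideline: properties older than 50 years require inspection.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def build_prompt(question: str) -> str:
    """Ground the LLM prompt in retrieved context to reduce hallucination."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What deductible applies to water damage on POL-1234?"))
```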
- AWS Cloud Engineering Activities:
- Model Development & Deployment:
Train and fine-tune models in SageMaker using custom datasets and embeddings. Deploy models as SageMaker endpoints or Lambda functions for real-time inference. Use Step Functions to orchestrate complex AI workflows across services.
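A minimal sketch of the real-time inference step described above, calling an already deployed SageMaker endpoint through boto3. The endpoint name and the JSON payload shape are hypothetical examples, not details from this role.

```python
# Sketch: real-time inference against a deployed SageMaker endpoint.
# "claims-classifier-prod" and the JSON payload shape are hypothetical.
import json
import boto3

runtime = boto3.client("sagemaker-runtime")

def classify_document(text: str) -> dict:
    """Send document text to the endpoint and return the parsed prediction."""
    response = runtime.invoke_endpoint(
        EndpointName="claims-classifier-prod",
        ContentType="application/json",
        Body=json.dumps({"inputs": text}),
    )
    return json.loads(response["Body"].read())

if __name__ == "__main__":
    print(classify_document("Water damage reported in basement on 2025-03-02."))
```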
- MLOps & DevOps Integration:
Build CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins to automate model training, testing, and deployment. Monitor model drift, performance, and compliance using SageMaker Model Monitor and custom logging. Apply Infrastructure as Code (IaC) with Terraform or CloudFormation for reproducible environments.
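To illustrate the custom drift monitoring mentioned above, the sketch below computes a population stability index (PSI) between training-time and recent production score distributions, as a complement to SageMaker Model Monitor. The sample data and the 0.2 alert threshold are illustrative assumptions.

```python
# Sketch: a custom drift check comparing the training-time score distribution
# with recent production scores, alongside SageMaker Model Monitor.
# The 0.2 alert threshold is a common rule of thumb, not a fixed standard.
import numpy as np

def population_stability_index(expected, actual, bins: int = 10) -> float:
    """PSI between two score samples; higher values indicate more drift."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Avoid log(0) and division by zero for empty bins.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
baseline = rng.beta(2, 5, 10_000)       # scores seen at training time (synthetic)
production = rng.beta(2.5, 5, 10_000)   # recent live scores (synthetic)
psi = population_stability_index(baseline, production)
print(f"PSI={psi:.3f}", "ALERT: drift" if psi > 0.2 else "OK")
```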
- Data Engineering & Pipelines:
Design scalable ETL pipelines to transform raw multimodal data into structured formats using AWS Glue, Lambda, and Step Functions. Store embeddings and metadata in vector databases or managed search services such as Pinecone, Weaviate, or Amazon Kendra. Ensure data lineage, versioning, and governance using tools like AWS Lake Formation or Apache Atlas.
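As one illustration of the ETL work described above, a Lambda step might normalize a raw claim record and write the structured result to S3. The bucket names, object keys, and record schema below are made up for the example; in practice this logic would run inside Glue jobs or a Step Functions workflow.

```python
# Sketch: a Lambda-style ETL step that normalizes a raw claim record and
# writes the structured result back to S3. Bucket, key, and schema are
# hypothetical placeholders.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    raw = json.loads(
        s3.get_object(Bucket="raw-claims", Key=event["key"])["Body"].read()
    )
    structured = {
        "policy_number": raw.get("policy", "").strip().upper(),
        "claim_date": raw.get("date_of_loss"),
        "insured_party": raw.get("claimant", {}).get("name"),
    }
    s3.put_object(
        Bucket="curated-claims",
        Key=f"structured/{event['key']}.json",
        Body=json.dumps(structured).encode("utf-8"),
    )
    return {"status": "ok", "key": event["key"]}
```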
- Ethical, Legal & Compliance:
Implement Responsible AI practices: bias detection, explainability, and audit trails. Ensure HIPAA and SOC 2 compliance in data handling and model outputs. Use Bedrock Guardrails or custom filters to prevent hallucinations and ensure safe LLM responses.
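To illustrate the "custom filters" mentioned above, the sketch below redacts obvious PII and refuses to return answers that lack supporting sources. The regex patterns and the grounding rule are invented examples, not the team's actual guardrail policy; production systems would layer this with Bedrock Guardrails.

```python
# Sketch: a custom post-processing filter applied to LLM output before it is
# shown to users. The PII patterns and the "must cite a source" rule are
# illustrative assumptions only.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def filter_response(text: str, sources: list[str]) -> str:
    """Redact obvious PII and block responses with no supporting sources."""
    if not sources:
        return "I can't answer that from the approved knowledge base."
    text = SSN.sub("[REDACTED-SSN]", text)
    text = EMAIL.sub("[REDACTED-EMAIL]", text)
    return text

print(filter_response("Contact jane.doe@example.com, SSN 123-45-6789.",
                      sources=["claims-handbook.pdf"]))
```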
- Staying Ahead of AI Trends:
Continuously evaluate new models from Hugging Face, OpenAI, and Anthropic for relevance to insurance workflows. Attend AWS AI/ML webinars, contribute to open-source LangChain agents, and experiment with agentic architectures. Prototype with new AWS services like Amazon Q, Titan models, or multimodal Bedrock endpoints.
- Qualifications
- Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field, or equivalent work experience.
- 8+ years in software, data, or AI engineering, with 3+ years working directly with AI/ML models.
- Experience building and deploying LLMs, transformers, or GenAI tools.
- Hands-on knowledge of Python and tools like TensorFlow, PyTorch, or scikit-learn.
- Worked with Hugging Face, OpenAI APIs, AWS Bedrock, LangChain, or other GenAI platforms.
- Built AI solutions in AWS, Azure, or GCP using SageMaker, Azure ML, or Vertex AI.
- Familiar with data engineering tools like Apache Spark, Kafka, and Airflow.
- Experience working with modern data stack technologies such as Snowflake, Redshift, Delta Lake, S3, Azure Data Lake, and BigQuery.
- Some experience with DevOps practices, such as CI/CD pipelines, Docker, and Kubernetes.
- Comf…