
Software Development Engineer; GCP, AI and Data Pipelines

Job in Germany, Pike County, Ohio, USA
Listing for: Hispanic Alliance for Career Enhancement
Full Time position
Listed on 2026-02-19
Job specializations:
  • Software Development
    AI Engineer, Data Engineer
Salary/Wage Range or Industry Benchmark: 106,605 - 236,900 USD per year
Job Description & How to Apply Below
Position: Staff Software Development Engineer (GCP, AI and Data Pipelines)
Location: Germany

We're building a world of health around every individual - shaping a more connected, convenient and compassionate health experience. At CVS Health®, you'll be surrounded by passionate colleagues who care deeply, innovate with purpose, hold ourselves accountable and prioritize safety and quality in everything we do. Join us and be part of something bigger - helping to simplify health care one person, one family and one community at a time.

Position Summary

This role leads the architecture and delivery of advanced data and AI solutions, specializing in GCP data pipelines, MCP server setup, and agentic AI. The engineer is responsible for technical vision, hands‑on implementation, and embedding AI into engineering best practices.

GCP Data Pipeline Engineering

Architect, design, and implement robust, scalable data pipelines on GCP using services such as BigQuery, Dataflow, Pub/Sub, and Vertex AI. Ensure data pipelines are optimized for performance, reliability, and security.
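
For illustration only, the sketch below shows the kind of streaming pipeline this work involves, written with the Apache Beam SDK (the programming model behind Dataflow): events in from Pub/Sub, parsed, appended to BigQuery. The project, topic, table, and schema names are placeholders, not details from this posting.

```python
# Illustrative streaming pipeline (Apache Beam, the SDK behind Dataflow): read events from
# Pub/Sub, parse JSON, append to BigQuery. Project, topic, table, and schema are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # Run locally with the DirectRunner, or on Dataflow with --runner=DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```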

MCP Server Setup & Integration

Lead the setup and configuration of MCP (Model Context Protocol) servers to standardize how AI systems access tools, data, and context. Ensure MCP servers provide a consistent protocol for AI agents, enabling modularity and ease of integration with new models and tools. Implement best practices for database interaction, image management, access controls, and automation integration with orchestration frameworks.
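
As a rough illustration of what an MCP server setup can look like, the sketch below uses the official MCP Python SDK (FastMCP) to expose one tool and one resource to AI agents. The server name, tool, and resource are hypothetical examples, not part of this role's actual stack; a real deployment would add access controls and audit logging.

```python
# Illustrative MCP server using the MCP Python SDK (FastMCP). Names and logic are placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("data-platform")

@mcp.tool()
def query_table(sql: str) -> str:
    """Run a read-only SQL query against the warehouse (placeholder logic)."""
    # A real tool would call BigQuery here, with policy checks and audit logging.
    return f"executed: {sql}"

@mcp.resource("schema://{table}")
def table_schema(table: str) -> str:
    """Expose table schemas as context that AI agents can retrieve (placeholder logic)."""
    return f"schema for {table}"

if __name__ == "__main__":
    mcp.run()  # serves tools and resources over the Model Context Protocol (stdio by default)
```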

Agentic AI Development

Integrate agentic AI and automation frameworks into data pipelines and engineering workflows. Enable autonomous, intelligent data processing and analytics by leveraging LLMs and agentic orchestration.
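
To make "agentic orchestration" concrete, here is a deliberately simplified sketch of an agent loop: an LLM proposes the next action, a registered tool executes it, and the result feeds back until the task is done. The call_llm stub and the TOOLS registry are placeholder assumptions, not a specific framework named in this posting.

```python
# Simplified agent loop; call_llm and TOOLS are placeholders for a real LLM and real tools.
import json

TOOLS = {
    "run_query": lambda args: f"rows for {args['sql']}",          # e.g. BigQuery behind an MCP tool
    "refresh_pipeline": lambda args: f"triggered {args['job']}",  # e.g. a Dataflow job
}

def call_llm(history):
    """Stand-in for a real LLM call (e.g. via Vertex AI) that returns the next action as JSON.
    Returns a canned plan here so the sketch runs end to end."""
    if any(m["role"] == "tool" for m in history):
        return json.dumps({"final": "query results summarized"})
    return json.dumps({"tool": "run_query", "args": {"sql": "SELECT 1"}})

def run_agent(task, max_steps=5):
    history = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        action = json.loads(call_llm(history))  # {"tool": ..., "args": ...} or {"final": ...}
        if "final" in action:
            return action["final"]
        result = TOOLS[action["tool"]](action["args"])
        history.append({"role": "tool", "content": result})
    return "step limit reached"

print(run_agent("summarize yesterday's load job"))
```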

Engineering Best Practices for Incorporating AI

AI‑Driven Code Quality & Automation: Leverage AI‑powered tools for code generation, code review, and automated testing to improve code quality and accelerate development cycles. Integrate AI agents into CI/CD pipelines for intelligent build, test, and deployment orchestration. Use MCP servers to standardize access to codebases, data, and documentation, enabling modular and reusable AI integrations.

Data Governance & Security: Implement robust data governance policies, including data lineage, access controls, and audit trails for all AI‑driven data pipelines. Ensure all AI models and data flows comply with enterprise security standards and privacy regulations. Use MCP server features for secure, auditable, and policy‑driven access to data and tools.

Responsible & Ethical AI: Apply organizational AI governance frameworks to ensure fairness, transparency, and explainability in all AI‑powered solutions. Conduct regular fairness assessments and bias detection for AI models, leveraging available toolkits and governance processes. Document AI decision logic and maintain traceability for all automated actions.

Operational Excellence & Monitoring: Implement telemetry and monitoring for all AI and data pipeline components, including model performance, data freshness, and system health. Use automated alerting and self‑healing mechanisms where possible, leveraging AI for anomaly detection and root cause analysis. Continuously evaluate and optimize AI models and pipelines for efficiency and reliability.
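
As one illustrative example of this kind of monitoring, the sketch below flags outliers in a pipeline telemetry series with a simple z-score check. The metric, threshold, and alert hook are assumptions for demonstration, not the team's actual tooling.

```python
# Illustrative anomaly check on a telemetry series using a z-score threshold (assumed values).
import statistics

def detect_anomalies(values, threshold=2.0):
    """Return indices of points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values) if stdev and abs(v - mean) / stdev > threshold]

def alert(indices, metric_name):
    """Placeholder alert hook; in practice this would route through Cloud Monitoring or similar."""
    for i in indices:
        print(f"ALERT: anomaly in {metric_name} at sample {i}")

latency_ms = [120, 118, 125, 119, 900, 121]  # e.g. per-window pipeline latency
alert(detect_anomalies(latency_ms), "pipeline_latency_ms")
```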

Knowledge Sharing: Actively participate in the AI engineering community of practice to share best practices, lessons learned, and reusable assets. Mentor team members on AI integration patterns, responsible AI use, and advanced GCP/MCP techniques. Maintain comprehensive documentation for all AI integrations, including architecture diagrams, data flows, and operational runbooks.

Technical Leadership & Collaboration

Mentor and coach engineers at all levels, foster a culture of innovation and continuous learning, and collaborate with cross‑functional stakeholders to align technical solutions with strategic business goals.

Required Skills & Qualifications
  • 7+ years of experience in software engineering, with significant experience in cloud‑native data pipeline development and AI/ML.
  • Advanced expertise in GCP services, infrastructure, and automation.
  • Hands‑on experience with MCP server setup, database automation, and agentic…