
EY - GDS Consulting - AI and DATA - Snowflake Architect - Senior Manager

Job in Indiana, Indiana County, Pennsylvania, 15705, USA
Listing for: Ernst & Young Advisory Services Sdn Bhd
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, AI Engineer, Data Science Manager, Cloud Computing
Salary/Wage Range or Industry Benchmark: 100,000 - 125,000 USD yearly
Job Description & How to Apply Below
Location: Indiana

EY - GDS Consulting - AI and DATA - Snowflake Architect - Senior Manager

Other locations:
Anywhere in Country

Date:
Feb 9, 2026

At EY, we’re all in to shape your future with confidence.

We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go.

Join EY and help to build a better working world.

EY- GDS - Data and Analytics – Senior Manager – Snowflake Solutions Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business functions and sectors such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity

We’re looking for a candidate with 14+ years of experience and strong architecture expertise to act as a Snowflake Solutions Architect and delivery lead.

Role Purpose:

  • Lead the design, development, and delivery of high-impact Proof of Concepts (PoCs) and Proof of Value (PoVs) focused on Snowflake, Snowflake Cortex, and GenAI-driven data solutions.
  • Act as a technical solution owner responsible for showcasing platform capabilities, validating architecture feasibility, and demonstrating business value through working prototypes.
  • Drive innovation by building reusable accelerators, demo frameworks, and reference implementations for Snowflake and industry-specific use cases.
  • Enable rapid experimentation and adoption of modern data and AI platforms through hands‑on engineering leadership.

Key Responsibilities

  • Lead end‑to‑end design and implementation of PoCs and PoVs using Snowflake and related ecosystem tools.
  • Build working prototypes demonstrating Snowflake Cortex, GenAI capabilities, advanced analytics, and modern data architectures.
  • Develop solution blueprints, architecture diagrams, technical documentation, and PoC implementation guides.
  • Author reusable PoV frameworks, demo assets, code templates, and automation scripts.
  • Implement Snowflake features such as Snowpark, Streams, Tasks, Dynamic Tables, and Native App capabilities as part of solution demonstrations.
  • Design and optimize data pipelines using Snowflake and Spark‑based processing frameworks.
  • Build AI‑powered PoCs including intelligent search, conversational analytics, semantic data exploration, and recommendation use cases.
  • Integrate Snowflake with cloud platforms, external data sources, APIs, and enterprise systems.
  • Implement real‑time and near real‑time ingestion patterns for streaming‑based PoC use cases.
  • Establish DevOps and automation practices for PoC environments, including CI/CD pipelines and deployment workflows.
  • Develop data‑driven applications and dashboards using Streamlit and UI frameworks where required.
  • Build backend services and APIs using FastAPI or REST‑based frameworks to support solution architectures.
  • Ensure PoC solutions follow performance optimization, cost efficiency, security best practices, and scalability guidelines.
  • Collaborate with internal engineering teams to transfer PoC learnings into production‑ready patterns.
  • Mentor solution engineers and architects on Snowflake best practices and emerging technologies.
  • Stay current with Snowflake roadmap, Cortex enhancements, and GenAI ecosystem advancements.
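To illustrate the Snowflake features named above (Streams, Tasks, and Dynamic Tables), a minimal change-data-capture sketch might look like the following. This is an illustrative example only, not part of the role description; all table, warehouse, and task names are hypothetical, and the statements assume an active Snowflake account and warehouse.

```sql
-- Hypothetical raw landing table and its curated counterpart
CREATE TABLE raw_orders (
  order_id   INT,
  amount     NUMBER(10, 2),
  updated_at TIMESTAMP_NTZ
);
CREATE TABLE curated_orders LIKE raw_orders;

-- Stream: captures inserts, updates, and deletes on the raw table
CREATE STREAM raw_orders_stream ON TABLE raw_orders;

-- Task: periodically merges captured changes into the curated table,
-- running only when the stream actually has new data
CREATE TASK merge_orders
  WAREHOUSE = demo_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  MERGE INTO curated_orders c
  USING raw_orders_stream s
    ON c.order_id = s.order_id
  WHEN MATCHED THEN
    UPDATE SET c.amount = s.amount, c.updated_at = s.updated_at
  WHEN NOT MATCHED THEN
    INSERT (order_id, amount, updated_at)
    VALUES (s.order_id, s.amount, s.updated_at);

ALTER TASK merge_orders RESUME;

-- Dynamic table: a declarative alternative that Snowflake keeps
-- refreshed within the stated target lag
CREATE DYNAMIC TABLE daily_order_totals
  TARGET_LAG = '10 minutes'
  WAREHOUSE  = demo_wh
AS
  SELECT DATE_TRUNC('day', updated_at) AS order_day,
         SUM(amount)                   AS total_amount
  FROM curated_orders
  GROUP BY 1;
```

The Stream-plus-Task pattern gives explicit control over merge logic, while the Dynamic Table delegates refresh scheduling to Snowflake; a PoC would typically demonstrate both and compare cost and latency trade-offs.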

Required Skills & Experience

Mandatory

  • Strong hands‑on experience with Snowflake (at least two production project implementations).
  • Expertise in Apache Spark / PySpark for large‑scale data processing and transformation.
  • Proven experience in building end‑to‑end PoCs and PoVs for data platform and analytics solutions.
  • Solid understanding of cloud data architectures and modern analytics platforms.
  • Strong coding and debugging skills across SQL and Python‑based workloads.
  • Excellent technical communication and documentation skills.
  • Ability to independently own and deliver complex solution prototypes.

Desired / Preferred

  • Experience with GenAI platforms, Snowflake Cortex, LLM integration, vector search, and AI‑powered analytics.
  • Exposure to streaming platforms and real‑time ingestion frameworks (Kafka, Event Hubs, Kinesis).
  • Hands‑on experience with DevOps and MLOps tools…
Position Requirements
10+ Years work experience