
Senior Solutions Architect

Job in Maryland Heights, St. Louis County, Missouri, 63043, USA
Listing for: World Wide Technology
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Warehousing
Salary/Wage Range or Industry Benchmark: USD 80,000 - 100,000 yearly
Job Description & How to Apply Below
Location: Maryland Heights

Overview

Why WWT?
At World Wide Technology, we work together to make a new world happen. Our important work benefits our clients and partners as much as it does our people and communities across the globe. WWT is dedicated to achieving its mission of creating a profitable growth company that is also a Great Place to Work for All. We achieve this through our world-class culture, generous benefits and by delivering cutting-edge technology solutions for our clients.

Founded in 1990, WWT is a global technology solutions provider leading the AI and Digital Revolution. WWT combines the power of strategy, execution and partnership to accelerate digital transformation outcomes for organizations around the globe. Through its Advanced Technology Center, a collaborative ecosystem of the world's most advanced hardware and software solutions, WWT helps clients and partners conceptualize, test and validate innovative technology solutions for the best business outcomes and then deploys them at scale through its global warehousing, distribution and integration capabilities.

With over 14,000 employees across WWT and Softchoice and more than 60 locations around the world, WWT's culture, built on a set of core values and established leadership philosophies, has been recognized 14 years in a row by Fortune and Great Place to Work® for its unique blend of determination, innovation and creating a great place to work for all.

Want to work with highly motivated individuals on high-performance teams? Join WWT today!

We’re seeking a hands-on Data Architect to lead the design, modernization, and governance of our analytics platform on Microsoft Fabric. You will define the target architecture across OneLake, Lakehouse/Data Warehouse, Direct Lake, Power BI, and Data Engineering experiences, while orchestrating migrations and integrations from Oracle, Snowflake, and other platforms. This role blends deep technical architecture with practical delivery, partnering with data engineers, BI developers, and business stakeholders to deliver trusted, performant, and governed data products.

Responsibilities

Architecture & Strategy

  • Define end-to-end Fabric data engineering architecture (OneLake, Lakehouse, Warehouse, Delta tables, medallion layers) aligned to business domains, enterprise architecture, platform architecture, and data product strategy.
  • Establish dimensional and semantic models for Power BI leveraging Direct Lake, Composite Models, and shared datasets.
  • Create standards for data modeling, partitioning, indexing, and performance optimization across Fabric pipelines, notebooks, and warehouses.
  • Develop reference architectures for batch, micro-batch, and streaming ingestion; choose the right pattern (Dataflows Gen2, Pipelines, Notebooks, Spark Structured Streaming).
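The medallion layering named above (bronze raw data, silver cleaned data, gold analytics-ready data) can be illustrated language-agnostically. The sketch below is a minimal pure-Python illustration of the pattern only; the record fields, cleaning rules, and aggregation are hypothetical, and in Fabric these layers would be Delta tables populated by pipelines or notebooks.

```python
# Minimal pure-Python sketch of the medallion (bronze/silver/gold) flow.
# Field names and rules are hypothetical; in Fabric each layer would be a
# Delta table written by a pipeline or notebook.

def to_silver(bronze_rows):
    """Clean and deduplicate raw bronze records (silver layer)."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row.get("order_id") is None:      # drop malformed records
            continue
        if row["order_id"] in seen:          # deduplicate on business key
            continue
        seen.add(row["order_id"])
        silver.append({**row, "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate silver records into a gold, analytics-ready summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": "US", "amount": "10.0"},
    {"order_id": 1, "region": "US", "amount": "10.0"},    # duplicate event
    {"order_id": 2, "region": "EU", "amount": "5.5"},
    {"order_id": None, "region": "EU", "amount": "9.9"},  # malformed
]
print(to_gold(to_silver(bronze)))  # → {'US': 10.0, 'EU': 5.5}
```

The same shape applies whether ingestion is batch, micro-batch, or streaming; only the mechanism feeding the bronze layer changes.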

Data Integration & Migration (Oracle & Snowflake)

  • Lead migration paths from Oracle (e.g., PL/SQL-based systems) and Snowflake to Fabric Lakehouse/Warehouse; define incremental loads, CDC, and cutover strategies.
  • Design robust ingestion using Snowpipe/Snowflake Tasks & Streams, Oracle CDC (e.g., GoldenGate), or landing via ADF/Fabric Pipelines to Delta Lake.
  • Rationalize Snowflake objects (schemas/tables/stages) and Oracle PL/SQL logic into Spark/SQL transformations, reusable notebook patterns, and Dataflows Gen2 where appropriate.
  • Implement secure, governed data sharing and zero-copy migration patterns, minimizing downtime and cost.
  • Build reliable, real-time data pipelines using Kafka, covering event streaming architecture, streaming ingestion with Fabric and Spark, Kafka Connect and schema management, and the design of low-latency processing with Kafka Streams or Spark.
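At its core, the incremental-load/CDC work above reduces to applying an ordered stream of change events to a target keyed by primary key. The sketch below shows that upsert/delete logic in pure Python; the event shape ({"op", "key", "data"}) is hypothetical, and a real pipeline would read Oracle GoldenGate or Snowflake Streams output and MERGE it into Delta tables.

```python
# Minimal sketch of applying CDC events to a dict-backed target table keyed
# by primary key. The event shape is hypothetical; real pipelines would MERGE
# change records from GoldenGate or Snowflake Streams into Delta.

def apply_cdc(target, events):
    """Apply ordered insert/update/delete events to the target."""
    for ev in events:
        if ev["op"] in ("insert", "update"):
            target[ev["key"]] = ev["data"]   # upsert semantics
        elif ev["op"] == "delete":
            target.pop(ev["key"], None)      # idempotent delete
    return target

table = {}
events = [
    {"op": "insert", "key": 1, "data": {"name": "alpha"}},
    {"op": "update", "key": 1, "data": {"name": "alpha-v2"}},
    {"op": "insert", "key": 2, "data": {"name": "beta"}},
    {"op": "delete", "key": 2, "data": None},
]
print(apply_cdc(table, events))  # → {1: {'name': 'alpha-v2'}}
```

Because deletes are idempotent and upserts are keyed, replaying the same event batch leaves the target unchanged, which simplifies retry and cutover strategies.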

Governance, Security, & Compliance

  • Operationalize the data catalog, lineage, classifications, and policies for Fabric and connected sources.
  • Define RBAC, workspace and item-level security, row-level and object-level security for BI and warehouse artifacts.
  • Establish data quality rules, observability (logging/metrics), SLAs, and error handling across pipelines and streaming jobs.
  • Partner with Information Security for encryption, key management, and compliance (e.g., HIPAA/PCI/SOX depending on industry).
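Data-quality rules and observability, as called for above, are often expressed as declarative checks that emit pass/fail metrics. The sketch below is a minimal illustration of that idea; the rule names and record fields are hypothetical, and in practice these checks would run inside pipeline activities and feed logging/monitoring.

```python
# Minimal sketch of declarative data-quality rules with pass/fail metrics.
# Rule names and fields are hypothetical; real checks would run in pipeline
# activities and emit results to logs/metrics for SLA monitoring.

def run_quality_checks(rows, rules):
    """Evaluate each rule against every row; return counts per rule."""
    metrics = {name: {"passed": 0, "failed": 0} for name in rules}
    for row in rows:
        for name, check in rules.items():
            metrics[name]["passed" if check(row) else "failed"] += 1
    return metrics

rules = {
    "id_not_null": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
rows = [{"id": 1, "amount": 10}, {"id": None, "amount": -5}]
print(run_quality_checks(rows, rules))
```

Keeping rules as data (a name-to-predicate mapping) makes it straightforward to version them, report per-rule error rates, and route failures to quarantine or alerting.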

Performance,…

Position Requirements
10+ years of work experience