
Data Solutions Architect (Local)

Job in Phoenix, Maricopa County, Arizona, 85003, USA
Listing for: TexcelVision Inc.
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description
Position: Data Solutions Architect (Local Candidates Required)

Data Solutions Architect Overview

The Data Solutions Architect is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as the maintenance and operations strategy for CHP. This role involves designing and implementing data systems that organize, store, and manage data within our cloud data platform.

The architect will perform continuous maintenance and operations work for CHP in the cloud environment. They will review and analyze CHP’s data infrastructure, plan future database solutions, and implement systems to support data management for CHP users.

Additionally, this role is accountable for ensuring data integrity, making sure the CHP team adheres to data governance standards to maintain accuracy, consistency, and reliability across all systems. The architect will identify data discrepancies and quality issues, and work to resolve them.

This position requires a strong blend of architectural leadership, technical depth, and the ability to collaborate with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions.

The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.

Key Responsibilities
  • Design scalable data lake and data architectures using Databricks and cloud-native services.
  • Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
  • Optimize data workloads and performance.
  • Define data governance frameworks for CHP.
  • Design and develop robust data pipelines.
  • Architect AI systems, including RAG workflows and prompt engineering.
  • Lead cloud migration initiatives from legacy systems to modern data platforms.
  • Provide architectural guidance, best practices, and technical leadership across teams.
  • Build documentation, reusable modules, and standardized patterns.
Required Skills and Experience
  • Strong expertise in cloud platforms, primarily Azure or AWS.
  • Hands-on experience with Databricks.
  • Deep proficiency in Python and SQL.
  • Expertise in building ETL/ELT pipelines and ADF workflows.
  • Experience architecting data lakes and implementing data governance frameworks.
  • Hands-on experience with CI/CD, DevOps, and Git-based development.
  • Ability to translate business requirements into technical architecture.
Technical Expertise

  • Programming: Python, SQL, R
  • Big Data: Hadoop, Spark, Kafka, Hive
  • Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS
  • Data Warehousing: Redshift, SQL Server
  • ETL/ELT Tools: SSIS

Required Educational Background
  • Bachelor’s degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.
  • 6+ years of experience in data engineering or .NET development.