
Staff Data Architect

Remote / Online - Candidates ideally in Scottsdale, Maricopa County, Arizona, 85261, USA
Listing for: Arizona State University
Full Time, Remote/Work from Home position
Listed on 2025-12-23
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply
Position: Staff Data Architect


Business and Data Analyst 3

Job Family:
Business and Data Analytics

Time Type:
Full time

Max Pay:
Depends on experience (USD, annual)

Apply before 11:59 PM Arizona time the day before the posted End Date.

Minimum Qualifications

Bachelor's degree and five (5) years of experience appropriate to the area of assignment/field; OR, any equivalent combination of experience and/or training from which comparable knowledge, skills and abilities have been achieved.

Job Profile Summary

Defines systems requirements, makes recommendations for technology selection, and performs moderately complex data analysis to ensure data management objectives within a work unit are met.

Job Description

Are you passionate about building resilient, scalable data platforms that support strategic decision-making? Do you have a track record of delivering modern, cloud‑native data architectures? EdPlus Action Lab, a dynamic unit of Arizona State University, is reimagining educational analytics and needs a highly motivated Staff Data Architect to join our team. This role will leverage cutting‑edge technologies such as AWS and GCP to design and optimize robust data pipelines, guide projects from conception to deployment, and shape how Action Lab operationalizes ETL processes and other data integrations.

Why EdPlus Action Lab?
  • Mission‑Driven Work:
    Your work supports increasing access to education for all learners.
  • Modern Stack:
    We’re not afraid of new tools: dbt, Airflow, Redshift, and more.
  • Team Culture:
    We prioritize collaboration across disciplines and functions.
  • Professional Growth:
    You’ll be a thought partner in an evolving data capability maturity roadmap.
Essential Duties
  • Lead the design, architecture, and implementation of highly complex and scalable data pipelines and ETL processes using dbt‑core and Apache Airflow.
  • Serve as a technical lead and subject matter expert in data modeling, performance optimization, and cloud infrastructure (AWS, GCP).
  • Contribute as a key developer in extending new platform capabilities while ensuring the stability and maintainability of existing codebases.
  • Exercise significant control over major data engineering projects, taking direct responsibility for their success and outcomes.
  • May lead working teams by providing technical direction, guidance, and mentorship to both senior and junior data engineers.
  • Design, define, and hone new processes, best practices, and technical standards for the data engineering team and other Action Lab members.
  • Troubleshoot and resolve the most challenging data‑related issues, developing innovative solutions and strategies.
  • Influence stakeholders across EdPlus and Action Lab, effectively communicating complex technical concepts and project strategies to both technical and non-technical audiences.
  • Introduce new technologies, processes, and knowledge-sharing initiatives to the whole Action Lab team.
  • Develop and implement advanced strategies for data quality, governance, and security across all platforms (AWS Redshift, S3, GCP BigQuery, GCS).
  • Evaluate and recommend new technologies and tools to improve the efficiency, scalability, and capabilities of the data infrastructure.
  • Serve as a mentorship figure for the working team, fostering a culture of technical excellence and continuous learning.
  • Leverage deep expertise in data visualization tools such as Tableau or Looker (nice to have) to guide the creation of impactful dashboards and reports.
  • Assume or coordinate other duties or projects as assigned or directed.

NOTE: This is not a fully remote position. Must be able to reliably commute to Scottsdale, AZ.

Desired Qualifications
  • Bachelor's degree or higher in a related field.
Deep Expertise and Advanced Technical Skills
  • Extensive experience (8+ years) designing, developing, and optimizing complex data pipelines and ETL processes in a production environment.
  • Demonstrated deep expertise in dbt‑core and Apache Airflow, including advanced features and best practices.
  • Exceptional proficiency in SQL and extensive experience with AWS Redshift, S3, GCP BigQuery, and GCS.
  • Strong command of multiple programming…