Data Architect
Listed on 2026-03-02
This role is for a Data Architect in San Francisco, CA, with a contract length of over 6 months and a pay rate of $ - $ per year. Key skills include AWS expertise, Python proficiency, and data modeling.
United States
$ USD
#Computer Science #Security #Data Processing #Data Governance #Terraform #Automation #Data Engineering #Data Architecture #Data Manipulation #ETL #Data Vault #Lambda #Cloud #Infrastructure as Code (IaC) #API #Data Modeling #Scala #Schema Design #AWS #AWS S3 #Redshift #Airflow #SQL #ML #Kafka #Vault #Python #Data Science
Job Title: Data Architect / Data Platform Architect
Location: San Francisco, CA 94105 (Required: Onsite 5 days a week)
W2 Contractor
Contract Length: More than 6 months
Work Style: On-site
Overview: We are looking for a seasoned Data Architect to own the technical vision for our data ecosystem. In this role, you will architect a best-in-class Data Platform on AWS and contribute hands‑on code using Python. This role requires a strategic thinker who can design high‑level data models while also being comfortable writing production‑grade code alongside the engineering team. This position is 100% onsite in San Francisco.
You will work directly with stakeholders across the business to translate complex requirements into scalable, reliable data solutions.
Responsibilities:
- Design and implement scalable data architectures and ETL pipelines on AWS (S3, Redshift, Glue, EMR, Kinesis, Airflow).
- Build and optimize the core Data Platform, focusing on data governance, security, and schema design.
- Write production-level Python code for data processing, API development, and automation.
- Define data standards and best practices for data modeling, ingestion, and lifecycle management.
- Work closely with data scientists and analysts to ensure data availability for analytics and machine learning initiatives.
- Troubleshoot performance bottlenecks and optimize query performance for scalability.
Requirements:
- 7+ years in data engineering or data architecture, with at least 2 years in a lead or architect role.
- Deep experience with the modern AWS data stack (Glue, Redshift, S3, Lambda, Step Functions, or EMR).
- Expert-level proficiency in Python for data manipulation and application development.
- Strong SQL skills required.
- Proven experience building reusable data platform components (not just one‑off pipelines).
- Strong knowledge of Kimball, Inmon, or Data Vault methodologies.
Nice to have:
- Experience with streaming technologies (Kafka, MSK, or Kinesis).
- Familiarity with Infrastructure as Code (Terraform or CloudFormation).
- Bachelor's degree in Computer Science, Engineering, or related field.
$ - $ per year