Snowflake Architect
Job in Dallas, Dallas County, Texas, 75201, USA
Listed on 2026-03-03
Listing for: TATA Consulting Services
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Warehousing, Big Data
Job Description
* Strong hands-on experience with Snowflake architecture and performance tuning
* Expertise in DBT (models, testing, macros, documentation, environments)
* Solid experience with ETL/ELT frameworks and data integration patterns
* Proficiency in Python for data engineering and automation
* Experience with Snowpark implementation
* Strong knowledge of cloud data services (AWS, Azure, or GCP)
* Advanced SQL and data modeling skills
Roles & Responsibilities
We are seeking an experienced Snowflake Data Architect to design, build, and optimize scalable cloud-based data platforms. The ideal candidate will have deep expertise in Snowflake, DBT, Snowpark, ETL/ELT pipelines, Python, and cloud data services (AWS, Azure, or GCP). This role will lead architecture decisions, ensure best practices, and enable analytics and data science teams with high-quality, reliable data solutions.
________________________________________
Key Responsibilities:
Architecture & Design
* Design and implement end-to-end Snowflake-based data architectures for analytics, reporting, and advanced data use cases
* Define data modeling strategies (dimensional, data vault, and analytical models) optimized for Snowflake
* Establish standards for data ingestion, transformation, storage, and consumption
Snowflake Platform Management
* Architect and manage Snowflake features including warehouses, databases, schemas, cloning, Time Travel, Secure Data Sharing, Data Clean Rooms, and resource monitoring
* Optimize performance and cost using warehouse sizing, clustering, caching, and query optimization
* Implement security best practices including RBAC, masking policies, row access policies, and data governance
Data Transformation & ETL/ELT
* Lead ELT pipeline development using DBT (models, macros, tests, documentation, and deployments)
* Design and implement ETL/ELT pipelines using cloud-native Snowpark and third-party tools
* Implement real-time streaming and batch data processing
* Ensure data quality, lineage, and observability across pipelines
Cloud & Big Data Integration
* Architect solutions leveraging cloud data services (AWS, Azure, or GCP) such as object storage, messaging, and orchestration services
* Integrate Apache Spark (Databricks or equivalent) for large-scale data processing and advanced transformations
* Support hybrid and multi-cloud data architectures
Development & Automation
* Develop data processing and automation solutions using Python
* Build reusable frameworks for ingestion, transformation, validation, and monitoring
* Implement CI/CD pipelines for data workloads and DBT/Snowpark deployments
Leadership & Collaboration
* Partner with business stakeholders, analytics, and data science teams to translate requirements into scalable solutions
* Mentor data engineers and analysts on Snowflake, DBT, Snowpark, and data engineering best practices
* Provide architectural guidance, documentation, and design reviews
Salary Range: $120,000-$130,000 a year