Senior Data Architect
Job in Dallas, Dallas County, Texas, 75215, USA
Listing for: Giggso
Full Time position, listed on 2026-02-16
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Warehousing
Job Description
Overview
Founded in 2018, Giggso is a responsible AI platform for enterprise operations with built-in security and automation.
Role Description
We are seeking a visionary and highly technical Senior Data Architect, based in Dallas, Texas, to design and build our data ecosystem and support customer needs. This role is pivotal in bridging the gap between complex business requirements and scalable, high-performance technical solutions.
The ideal candidate will be a master of modern cloud data stacks—specifically Snowflake and Databricks—with a unique ability to handle complex geospatial datasets while maintaining rigorous regulatory compliance standards.
Key Responsibilities
- Architecture Strategy: Define the long-term architectural roadmap for our data ecosystem, ensuring it supports scalable growth and advanced analytics.
- Platform Integration: Design and implement seamless integrations between Snowflake (for warehousing) and Databricks (for lakehouse/AI workloads), leveraging the best of both platforms.
- Geospatial Implementation: Architect geospatial data pipelines with storage, querying, and visualization capabilities, and oversee the development of robust ETL/ELT pipelines using the Apache Spark, Delta Lake, and Hadoop ecosystems.
- Governance & Compliance: Act as the primary architect for data privacy and security, ensuring all designs comply with GDPR, CCPA, and other relevant industry-specific regulations.
- Data Modeling: Create and maintain enterprise-level conceptual, logical, and physical data models tailored for both structured and unstructured data.
Required Expertise
- Cloud Data Platforms: Advanced proficiency in Snowflake (Snowpark, Streams, Tasks) and Databricks (Unity Catalog, Delta Live Tables).
- Big Data Stack: Deep experience with Apache Spark, Hadoop, and distributed computing principles.
- Geospatial Tools: Experience with PostGIS, Esri, or Snowflake/Databricks native spatial functions and H3 indexing.
- Languages: Expert-level SQL and Python (specifically PySpark).
- Data Governance: Hands-on experience with data cataloging and lineage tools.
Position Requirements
10+ years of work experience