
Senior Data Architect

Job in Dallas, Dallas County, Texas, 75215, USA
Listing for: Giggso
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Warehousing
Salary/Wage Range or Industry Benchmark: 80,000 - 100,000 USD yearly
Job Description & How to Apply Below

Overview

Founded in 2018, Giggso is a responsible-AI platform for enterprise operations, with built-in security and automation.

Role Description

We are seeking a visionary and highly technical Senior Data Architect to design and build data solutions that meet customer needs in Dallas, Texas. This role is pivotal in bridging the gap between complex business requirements and scalable, high-performance technical solutions.

The ideal candidate will be a master of modern cloud data stacks—specifically Snowflake and Databricks—with a unique ability to handle complex geospatial datasets while maintaining rigorous regulatory compliance standards.

Key Responsibilities
  • Architecture Strategy: Define the long-term architectural roadmap for our data ecosystem, ensuring it supports scalable growth and advanced analytics.
  • Platform Integration: Design and implement seamless integrations between Snowflake (for warehousing) and Databricks (for lakehouse/AI workloads), leveraging the best of both platforms.
  • Geospatial Implementation: Architect pipelines and storage for geospatial data, including querying and visualization capabilities. Oversee the development of robust ETL/ELT pipelines using Apache Spark, Delta Lake, and Hadoop ecosystems.
  • Governance & Compliance: Act as the primary architect for data privacy and security, ensuring all designs comply with GDPR, CCPA, or other relevant industry-specific regulations.
  • Data Modeling: Create and maintain enterprise-level conceptual, logical, and physical data models tailored for both structured and unstructured data.
Technical Requirements
  • Cloud Data Platforms: Advanced proficiency in Snowflake (Snowpark, Streams, Tasks) and Databricks (Unity Catalog, Delta Live Tables).
  • Big Data Stack: Deep experience with Apache Spark, Hadoop, and distributed computing principles.
  • Geospatial Tools: Experience with PostGIS, Esri, or Snowflake/Databricks native spatial functions and H3 indexing.
  • Languages: Expert-level SQL and Python (specifically PySpark).
  • Data Governance: Hands-on experience with data cataloging, lineage tools, …
Position Requirements
10+ Years work experience