Senior Data Engineer
Snowflake, Navajo County, Arizona, 85937, USA
Listed on 2025-12-01
IT/Tech
Data Engineer, Data Warehousing, Data Security, Data Analyst
Location: Snowflake
Jaxel is seeking a Senior Data Engineer to join our cutting-edge team.
What You’ll Do
Design and build scalable, efficient ETL/ELT data pipelines using modern tools (e.g., dbt, Airflow, Fivetran, custom Python jobs).
Develop and maintain data models, schemas, and transformations in Snowflake and other cloud data warehouses (e.g., BigQuery, Redshift).
Integrate data from various sources (internal systems, APIs, 3rd party data providers).
Work with stakeholders to understand data needs and deliver reliable, clean, and well-documented datasets.
Build and maintain data quality and validation checks to ensure high trust in our data (see the illustrative sketch after this list).
Optimize query performance and storage costs in Snowflake and other platforms.
Collaborate with analytics, product, and engineering teams to support data-driven features and reporting needs.
Implement and maintain infrastructure-as-code, CI/CD workflows, and version control for data pipelines.
Ensure data security, access control, and compliance with relevant policies (e.g., GDPR, HIPAA if applicable).
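For illustration only, a minimal sketch of the kind of data quality check described above, written in Python against Snowflake. The account, credentials, warehouse, table, and column names are hypothetical placeholders; in practice such checks would typically run as dbt tests or an Airflow task.

```python
# Minimal sketch of a data quality check of the kind described above.
# All connection details, table names, and column names are hypothetical.
import snowflake.connector


def check_no_null_keys(conn, table: str, key_column: str) -> None:
    """Raise an error if the key column of the given table contains NULLs."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
        null_count = cur.fetchone()[0]
        if null_count:
            raise ValueError(f"{table}.{key_column} has {null_count} NULL rows")
    finally:
        cur.close()


if __name__ == "__main__":
    # In a real pipeline, credentials would come from a secrets manager,
    # not from literals in the script.
    conn = snowflake.connector.connect(
        account="my_account",       # hypothetical
        user="etl_service_user",    # hypothetical
        password="***",             # placeholder
        warehouse="TRANSFORM_WH",   # hypothetical
        database="ANALYTICS",       # hypothetical
        schema="MARTS",             # hypothetical
    )
    try:
        check_no_null_keys(conn, "dim_customers", "customer_id")
    finally:
        conn.close()
```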
Requirements
3+ years of experience as a Data Engineer or in a similar role.
Strong SQL skills and deep understanding of data warehousing principles.
Hands-on experience with Snowflake and at least one other data warehouse (BigQuery, Redshift, etc.).
Experience with modern data pipeline tools such as Airflow, dbt, Fivetran, Dagster, or custom Python/Scala jobs.
Proficiency in Python (or another scripting language) for data manipulation and orchestration.
Experience building and maintaining production-grade ETL/ELT pipelines.
Familiarity with cloud platforms like AWS, GCP, or Azure (e.g., S3, Lambda, Cloud Functions).
Strong attention to data quality, testing, and documentation.
Additional requirements (optional)
Experience with real-time / streaming data (Kafka, Spark Streaming, etc.).
Exposure to data governance, lineage, and metadata tools (e.g., Amundsen, DataHub).
Understanding of data privacy, compliance, and security best practices.
Familiarity with infrastructure-as-code (e.g., Terraform) and CI/CD pipelines (GitHub Actions, GitLab CI, etc.).
Experience collaborating in agile teams and using tools like Jira, Confluence, etc.
What We Offer
Remote work opportunity.