
Senior Data Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Universal Access & Systems Solutions
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Big Data
Job Description & How to Apply Below
Position: Senior Data Engineer
Location: Snowflake, Arizona

The Senior Data Engineer plays a pivotal role in the design, development, and optimization of robust data pipelines and infrastructure that power AI-driven and data-intensive applications.

Responsibilities:
Data Pipeline Development
  • Design, develop, and maintain scalable, efficient, and secure ETL (Extract, Transform, Load) pipelines.
  • Build real-time and batch data processing frameworks for AI, analytics, and business intelligence.
  • Optimize data storage, retrieval, and performance across structured and unstructured data sources.
Data Governance & Security
  • Implement data governance frameworks, ensuring data integrity, consistency, and compliance with security standards.
  • Monitor and enforce best practices for data privacy, encryption, and regulatory compliance (GDPR, CCPA, etc.).
Collaboration & Integration
  • Work closely with Data Scientists and Machine Learning Engineers to support AI model training, inference, and deployment.
  • Partner with DevOps teams to automate data workflows and ensure smooth integration with cloud platforms.
  • Act as a liaison between business stakeholders and technical teams, translating business needs into data solutions.
Performance Optimization & Monitoring
  • Continuously optimize data infrastructure for performance, reliability, and scalability.
  • Implement monitoring solutions to detect and resolve data pipeline issues proactively.
Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • 3+ years of experience in data engineering, including ETL pipeline development.
  • Expertise in SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.).
  • Proficiency in big data technologies such as Apache Spark, Kafka, and Hadoop.
  • Hands-on experience with cloud platforms (AWS, Azure, GCP) and data warehousing (BigQuery, Snowflake, Redshift).
  • Strong knowledge of data orchestration tools (Airflow, Prefect, or similar).
  • Experience with data security best practices and compliance requirements.
  • Excellent problem-solving skills and the ability to work in a fast-paced environment.
Position Requirements
10+ years of work experience