
Data Engineer

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Ilyon Dynamics Ltd
Full Time position
Listed on 2026-02-07
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description

At Ilyon, we build mobile games enjoyed by millions of players around the world. Our decisions are driven by data — from feature development and live ops to user acquisition and monetization. We’re looking for a skilled Data Engineer to help us take our analytics and data infrastructure to the next level.

As a Data Engineer, you’ll design and maintain the systems that power insights across the company. You’ll work closely with data analysts, UA managers, and product teams to ensure clean, fast, and scalable access to our most important asset: data.

Responsibilities

  • Design, build, and maintain scalable, robust ETL/ELT pipelines.
  • Ingest data from various sources (APIs, databases, flat files, cloud buckets).
  • Automate workflows for batch and/or streaming pipelines (e.g., using Airflow, GCP services).
  • Design and organize data for analytics teams in cloud warehouses (BigQuery, Snowflake).
  • Implement best practices for partitioning, clustering, and materialized views.
  • Manage and optimize data infrastructure (cloud resources, storage, compute).
  • Ensure scalability, security, and compliance in data platforms.
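As a flavor of the pipeline logic the role involves, here is a minimal extract-transform-load sketch in plain Python. All names (the event fields, the in-memory "warehouse") are hypothetical, chosen for illustration only; a real pipeline would read from an API or bucket and write to BigQuery or Snowflake.

```python
# Minimal ETL sketch. The source list and dict "warehouse" are hypothetical
# stand-ins for a real source (API, bucket) and a cloud warehouse table.

def extract(raw_events):
    """Pull raw event dicts from a source (here: an in-memory list)."""
    return list(raw_events)

def transform(events):
    """Drop invalid events and normalize field names and types."""
    cleaned = []
    for e in events:
        if e.get("user_id") and e.get("revenue") is not None:
            cleaned.append({
                "user_id": str(e["user_id"]),
                "revenue_usd": round(float(e["revenue"]), 2),
            })
    return cleaned

def load(rows, warehouse):
    """Append rows to a target table; returns the number of rows loaded."""
    warehouse.setdefault("events_clean", []).extend(rows)
    return len(rows)

raw = [{"user_id": 1, "revenue": "0.991"}, {"user_id": None, "revenue": "2.5"}]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 1 row loaded; the event with a missing user_id is filtered out
```

In production this shape is typically split across Airflow tasks (one per stage) rather than run as a single script.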

Data Quality & Governance

  • Monitor data integrity, consistency, and accuracy.
  • Implement validation, monitoring, and alerting for pipeline health and data accuracy.
  • Maintain documentation and data catalogs.
  • Troubleshoot failures or performance bottlenecks.
Collaboration

  • Work closely with data analysts, managers, and developers.
  • Translate business requirements into technical solutions.
  • Support self-service analytics and create reusable datasets.
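To make the data-quality monitoring concrete, here is a small validation sketch in plain Python. The key column and the null-rate threshold are hypothetical examples; real pipelines would run checks like these in the warehouse or via a dedicated framework.

```python
# Minimal data-quality check sketch: flag a batch with too many nulls or
# duplicate keys before it reaches analysts. Threshold values are illustrative.

def check_batch(rows, key="user_id", max_null_rate=0.01):
    """Return a small report on null rate and key uniqueness for one batch."""
    total = len(rows)
    nulls = sum(1 for r in rows if r.get(key) is None)
    keys = [r[key] for r in rows if r.get(key) is not None]
    report = {
        "null_rate": nulls / total if total else 0.0,
        "has_duplicates": len(keys) != len(set(keys)),
    }
    report["passed"] = (report["null_rate"] <= max_null_rate
                        and not report["has_duplicates"])
    return report

batch = [{"user_id": "a"}, {"user_id": "b"}, {"user_id": "a"}]
print(check_batch(batch))  # fails: "a" appears twice
```

Wiring a check like this into alerting (e.g., failing the pipeline run when `passed` is false) is what "monitoring and alerting for pipeline health" looks like in practice.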

Requirements

  • 2+ years of experience as a Data Engineer or similar role.
  • Strong SQL and Python skills for data manipulation and pipeline logic.
  • Experience with Airflow for orchestration and Docker/Kubernetes for deployment.
  • Hands-on experience with cloud data platforms (GCP, AWS) and warehouses like BigQuery or Snowflake.
  • Knowledge of data modeling, optimization, and performance tuning.
  • Familiarity with DAX and BI tools like Power BI or Looker.
  • Experience with Kafka or Pub/Sub for real-time data ingestion (an advantage).
  • Knowledge of Docker, Kubernetes, and cloud-native tools in GCP (an advantage).
  • Experience with Firebase Analytics and Unity Analytics data structures (an advantage).

Languages: SQL, Python, DAX
