
Senior Data Engineer

Job in Burbank, Los Angeles County, California, 91520, USA
Listing for: Data Freelance Hub
Contract position
Listed on 2026-02-07
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: USD 80.00 - 85.00 per hour
Job Description & How to Apply Below

⭐ - Featured Role | Apply direct with Data Freelance Hub

This is a 10-month contract role for a Senior Data Engineer, paying $80-$85/hr, hybrid in Burbank, CA. It requires 7+ years of data engineering experience, hands-on work with AWS services, strong SQL and Python skills, and experience with modern data platforms.

Location: Burbank, CA 91505, United States (Hybrid)
Employment Type: W2 Contractor
Pay Currency: USD

#Databricks #Lambda (AWS Lambda) #ML (Machine Learning) #SQL (Structured Query Language) #BI (Business Intelligence) #Redshift #Scripting #Data Catalog #Data Governance #Deployment #Data Engineering #Data Quality #PySpark #AI (Artificial Intelligence) #Airflow #Python #AWS (Amazon Web Services) #Data Pipeline #Batch #RDS (Amazon Relational Database Service) #Snowflake #Anomaly Detection #Monitoring #Data Science #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Metadata #Scala #Spark (Apache Spark) #DynamoDB #Informatica #Data Transformations

Job Responsibilities
  • Design & Build Scalable Data Pipelines
  • Lead development of batch and streaming pipelines using AWS-native tools (Glue, Lambda, Step Functions, Kinesis) and modern orchestration frameworks.
  • Implement best practices for monitoring, resilience, and cost optimization in high-scale pipelines.
  • Collaborate with architects to translate canonical and semantic data models into physical implementations.
  • Build pipelines that deliver clean, well-structured data to analysts, BI tools, and ML pipelines.
  • Work with data scientists to enable feature engineering and deployment of ML models into production environments.
  • Embed validation, lineage, and anomaly detection into pipelines (a minimal illustrative sketch follows this list).
  • Contribute to the enterprise data catalog and enforce schema alignment across pods.
  • Partner with governance teams to implement role-based access, tagging, and metadata standards.
  • Guide junior data engineers, sharing best practices in pipeline design and coding standards.
  • Participate in pod ceremonies (backlog refinement, sprint reviews) and program-level architecture syncs.
  • Promote reusable services and reduce fragmentation by advocating platform-first approaches.
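
A minimal, illustrative PySpark sketch of the kind of batch transformation with an embedded data-quality check described above. This is not part of the posting; the bucket paths, column names, and the 5% rejection threshold are hypothetical placeholders.

```python
# Illustrative batch cleansing job with a simple embedded data-quality gate.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch_clean").getOrCreate()

# Read one raw batch drop (hypothetical S3 path).
raw = spark.read.parquet("s3://example-raw-bucket/orders/date=2026-02-07/")

# Basic cleansing: drop duplicates and rows missing the business key,
# and normalize the amount column's type.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_total", F.col("order_total").cast("decimal(12,2)"))
)

# Simple validation embedded in the pipeline: fail the run if too many
# rows were rejected by the cleansing step (5% threshold is hypothetical).
raw_count = raw.count()
clean_count = clean.count()
if raw_count > 0 and (raw_count - clean_count) / raw_count > 0.05:
    raise ValueError("Data-quality check failed: more than 5% of rows rejected")

# Publish curated output for downstream BI/ML consumers (hypothetical path).
clean.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/date=2026-02-07/")
```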
Must Have Skills / Requirements
  • Experience in data engineering, with hands-on expertise in AWS services (Glue, Kinesis, Lambda, RDS, DynamoDB, S3) and orchestration tools (Airflow, Step Functions). 7+ years of experience.
  • Strong skills in SQL, Python, PySpark, and scripting for data transformations. 7+ years of experience.
  • Experience working with modern data platforms (Snowflake, Databricks, Redshift, Informatica). 7+ years of experience.
Soft Skills
  • Strong collaboration and mentoring skills; ability to influence across pods and domains.
  • Knowledge of data governance practices, including lineage, validation, and cataloging.
Technology Requirements
  • Strong skills in SQL, Python, PySpark, and scripting for data transformations.
  • Hands-on expertise in AWS services (Glue, Kinesis, Lambda, RDS, DynamoDB, S3) and orchestration tools (Airflow, Step Functions).
  • Experience working with modern data platforms (Snowflake, Databricks, Redshift, Informatica).
  • Proven ability to optimize pipelines for both batch and streaming use cases (a minimal orchestration sketch follows this list).
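
A minimal, illustrative Airflow sketch of the kind of batch orchestration these requirements describe. This is not part of the posting; the DAG name, schedule, and task bodies are hypothetical placeholders.

```python
# Illustrative daily batch DAG: extract -> transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the raw batch from a source system into staging.
    print("extracting raw batch for", context["ds"])


def transform(**context):
    # Placeholder: trigger the PySpark/Glue transformation job.
    print("transforming batch for", context["ds"])


def load(**context):
    # Placeholder: publish curated tables to the warehouse.
    print("loading curated data for", context["ds"])


with DAG(
    dag_id="daily_batch_pipeline",   # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",               # Airflow 2.4+ "schedule" keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract runs before transform, transform before load.
    extract_task >> transform_task >> load_task
```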

Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.

Job Type: Contract
Pay: $80.00 - $85.00 per hour
Expected hours: 40 per week
Application Question(s):
  • How many years of hands-on experience with AWS services do you have?
  • How many years of experience do you have with SQL, Python, PySpark, and scripting for data transformations?
  • How many years of experience do you have with modern data platforms (Snowflake, Databricks, Redshift, Informatica)?

Work Location:

Hybrid remote in Burbank, CA 91505

Position Requirements
10+ years of work experience