Data Engineer; USC & GC | NO H1, OPT, CPT
Job in California City, Kern County, California, 93504, USA
Listed on 2026-02-27
Listing for: Data Freelance Hub
Full-Time position
Job specializations:
- IT/Tech: Data Engineer, Big Data
Job Description
Data Engineer (USC & GC Only | NO H1, OPT, CPT…)
⭐ Featured Role | Apply direct with Data Freelance Hub
This contract position lasts over 6 months and pays $40.00–$50.00 per hour. U.S. Citizenship or Green Card is required.
Responsibilities:
- Design, build, and maintain scalable ETL pipelines, data workflows, and data warehousing solutions for analytics and reporting.
- Develop and optimize data modeling solutions, databases, and large-scale datasets using platforms such as Google BigQuery, Hive, and other cloud-native technologies.
- Write and optimize complex SQL queries and implement data transformations with Python, Spark, and Scala.
- Build and maintain batch and streaming pipelines using Spark, PySpark, Scala, and messaging systems such as Kafka.
- Orchestrate workflows and automate data processes using Apache Airflow.
- Integrate data from multiple sources, including third‑party systems via REST APIs, ensuring data quality, consistency, governance, and compliance.
- Collaborate with product, engineering, and business teams to translate requirements into scalable technical solutions.
- Test, deploy, monitor, and support data solutions across cloud environments (AWS, Azure, or GCP).
- Document processes, contribute to continuous improvement initiatives, and provide technical guidance to users and cross‑functional teams.
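As an illustration only (not part of the posting), the extract-transform-load pattern these duties center on can be sketched in a few lines of Python, here using SQLite as a stand-in warehouse. All table and column names are hypothetical.

```python
import sqlite3

def run_etl(raw_rows):
    """Tiny ETL sketch: extract raw event rows, clean them, and load
    them into a reporting table. The `events` table and its columns
    are hypothetical examples, not taken from the posting."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE events (user_id INTEGER, amount REAL, region TEXT)"
    )
    # Transform: drop malformed rows, coerce types, normalize region casing.
    cleaned = [
        (r["user_id"], float(r["amount"]), r["region"].strip().upper())
        for r in raw_rows
        if r.get("user_id") is not None and r.get("amount") is not None
    ]
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", cleaned)
    # A reporting-style aggregate of the kind the role describes.
    total_by_region = dict(
        conn.execute(
            "SELECT region, SUM(amount) FROM events GROUP BY region"
        ).fetchall()
    )
    conn.close()
    return total_by_region
```

In a production pipeline of the kind described above, the same extract/transform/load stages would typically run as separate Airflow tasks against a warehouse such as BigQuery rather than in one in-memory function.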
Requirements:
- 5+ years of experience in data engineering, analytics, or related fields.
- Advanced proficiency in SQL and strong programming skills in Python and/or Scala.
- Hands‑on experience with ETL processes, data modeling, and data warehousing concepts.
- Experience working with cloud data platforms such as Google BigQuery, AWS, or Azure.
- Strong experience with Spark (PySpark/Scala) and big data technologies such as Hive and Kafka.
- Experience building and managing workflows using Apache Airflow.
- Experience integrating systems using REST APIs.
- Knowledge of data governance, data quality, and compliance best practices.
- Bachelor’s degree in Computer Science, Engineering, or equivalent experience.
Job Types: Full‑time, Contract, Permanent
Pay: $40.00 - $50.00 per hour
Work Location: In person (on-site)