Data Engineer (Spark Streaming & Google Cloud Platform) - W2
Job in Bentonville, Benton County, Arkansas, 72712, USA
Listed on 2026-02-17
Listing for: Jobs via Dice
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Big Data, Cloud Computing, Database Administrator
Job Description & How to Apply Below
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Info Dinamica Inc, is seeking the following. Apply via Dice today!
Role: Data Engineer (Python, Scala, SQL)
Location: Bentonville, AR (Onsite from Day 1)
Job Type: W2 Contract
Note: Only W2 - no C2C or third-party candidates.
Mandatory Areas:
- 8+ years of experience with Python, SQL, and ideally Scala/Java
- Big Data: expertise in Apache Spark (Spark SQL, DataFrames, Streaming)
- 4+ years with Google Cloud Platform
We are seeking a Data Engineer with Spark and streaming skills who builds real-time, scalable data pipelines using tools such as Spark, Kafka, and Google Cloud Platform services to ingest, transform, and deliver data for analytics and ML.
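As a rough illustration of the kind of pipeline this role describes, here is a minimal PySpark Structured Streaming sketch that ingests events from Kafka, parses them, and delivers them to cloud storage. The broker, topic, bucket, and schema names are placeholders assumed for illustration, not details from this posting.

# A minimal sketch, assuming the spark-sql-kafka connector is on the classpath
# and that the broker, topic, bucket, and schema below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("streaming-ingest-sketch").getOrCreate()

# Assumed schema of the incoming JSON events.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("amount", DoubleType()),
])

# Ingest: read a real-time stream from a Kafka topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Transform: parse the JSON payload and drop malformed records.
parsed = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .filter(F.col("event_id").isNotNull())
)

# Deliver: append micro-batches as Parquet on GCS; a BigQuery sink via the
# spark-bigquery connector would be the usual swap-in on Google Cloud.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "gs://example-bucket/events/")             # placeholder bucket
    .option("checkpointLocation", "gs://example-bucket/chk/")  # placeholder path
    .outputMode("append")
    .start()
)

query.awaitTermination()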
Responsibilities:
- Design, develop, and maintain ETL/ELT data pipelines for batch and real-time data ingestion, transformation, and loading using Spark (PySpark/Scala) and streaming technologies (Kafka, Flink).
- Build and optimize scalable data architectures, including data lakes, data warehouses (BigQuery), and streaming platforms.
- Performance Tuning: optimize Spark jobs, SQL queries, and data processing workflows for speed, efficiency, and cost-effectiveness.
- Data Quality: implement data quality checks, monitoring, and alerting systems to ensure data accuracy and consistency (a minimal sketch follows this list).
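As a small illustration of the data-quality responsibility above, the following PySpark sketch validates completeness and uniqueness of a key column before downstream loading. The dataset path, column name, and thresholds are hypothetical assumptions, not requirements from the posting.

# A minimal sketch of a batch data-quality gate; the path, key column, and the
# 1% null threshold are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/orders/")  # placeholder path

total = orders.count()
null_keys = orders.filter(F.col("order_id").isNull()).count()
dup_keys = total - orders.dropDuplicates(["order_id"]).count()

# Fail the pipeline (or trigger an alert) when completeness or uniqueness
# checks are violated, before data reaches downstream consumers.
if total == 0 or null_keys / total > 0.01 or dup_keys > 0:
    raise ValueError(
        f"Data quality check failed: rows={total}, null_keys={null_keys}, dup_keys={dup_keys}"
    )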
Qualifications:
- Programming: strong proficiency in Python and SQL, and ideally Scala/Java.
- Big Data: expertise in Apache Spark (Spark SQL, DataFrames, Streaming).
- Streaming: experience with message queues such as Apache Kafka or Pub/Sub.
- Cloud: familiarity with Google Cloud Platform and Azure data services.
- Databases: knowledge of data warehousing (Snowflake, Redshift) and NoSQL databases.
- Tools: experience with Airflow, Databricks, Docker, and Kubernetes is a plus.