
Data Engineer with Spark & Scala on Our W2

Job in Bentonville, Benton County, Arkansas, 72712, USA
Listing for: Jobs via Dice
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing, Database Administrator
Salary/Wage Range or Industry Benchmark: 80,000 - 100,000 USD Yearly
Job Description & How to Apply Below
Position: Data Engineer with Spark & Scala on Our W2

Role:
Data Engineer with Spark & Scala on Our W2

Location:

Bentonville, AR

Onsite Requirement: Yes

Number of days onsite: 5 days

Mandatory Areas
  • Must-Have Skills – Data Engineer with Scala
  • Skill 1 – Scala, Spark, Python, SQL, Big Data, Hadoop
  • Google Cloud Platform data tools – BigQuery, Dataproc, Vertex AI, Pub/Sub, Cloud Functions
  • Skill 2 – PySpark, Python, Spark SQL, and data modeling

Only W2 candidates are considered; no Glider assessment is needed for this role.

We are seeking a Data Engineer with Spark, Scala, and streaming skills who builds real-time, scalable data pipelines using tools like Spark, Kafka, and cloud services (Google Cloud Platform) to ingest, transform, and deliver data for analytics and ML.

Responsibilities
  • Design, develop, and maintain ETL/ELT data pipelines for batch and real-time data ingestion, transformation, and loading using Spark (PySpark/Scala) and streaming technologies (Kafka, Flink).
  • Build and optimize scalable data architectures, including data lakes, data warehouses (BigQuery), and streaming platforms.
  • Performance Tuning:
    Optimize Spark jobs, SQL queries, and data processing workflows for speed, efficiency, and cost-effectiveness.
  • Data Quality:
    Implement data quality checks, monitoring, and alerting systems to ensure data accuracy and consistency.
Required

Skills & Qualifications
  • Programming:
    Strong proficiency in Python, SQL, and potentially Scala/Java.
  • Big Data:
    Expertise in Apache Spark (Spark SQL, DataFrames, Streaming).
  • Streaming:
    Experience with message queues such as Apache Kafka or Pub/Sub.
  • Cloud:
    Familiarity with Google Cloud Platform and Azure data services.
  • Databases:
    Knowledge of data warehousing (Snowflake, Redshift) and NoSQL databases.
  • Tools:
    Experience with Airflow, Databricks, Docker, Kubernetes is a plus.
  • Experience Level: Total IT Experience – Minimum 8 Years.
  • Google Cloud Platform – 4+ years of recent GCP experience.