
Senior Data Engineer

Job in Cary, Wake County, North Carolina, 27518, USA
Listing for: JPS Tech Solutions LLC
Full Time position
Listed on 2026-01-07
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
  • Engineering
    Data Engineer, Data Science Manager
Job Description & How to Apply Below

Senior Data Engineer – Cary, NC

Experience:

12+ Years

We are looking for an experienced Senior Data Engineer to lead the design and development of modern data platforms and scalable data pipelines. The ideal candidate has strong hands‑on expertise in cloud data engineering, big data technologies, ETL / ELT architecture, and data modeling, along with the ability to mentor teams and work closely with business stakeholders to deliver high‑quality data solutions.

Key Responsibilities
  • Architect, design, and implement large‑scale data pipelines and data integration workflows across structured and unstructured datasets.
  • Build, optimize, and maintain robust ETL / ELT processes for ingestion, transformation, and delivery of data across enterprise systems.
  • Develop and manage data lakes, data warehouses, and analytics platforms using cloud technologies.
  • Work closely with data scientists, analysts, and business teams to understand data requirements and deliver reliable data for reporting and analytics.
  • Define best practices for data quality, lineage, governance, security, and performance optimization.
  • Lead and mentor junior engineers, participate in design / code reviews, and ensure engineering excellence.
  • Collaborate with cross‑functional teams in an Agile environment for solution design and implementation.
  • Own production deployment, monitoring, debugging, performance tuning, and incident management for data pipelines.
Required Skills & Experience
  • 12+ years of experience in Data Engineering, Data Architecture, or similar roles.
  • Strong programming skills in Python / Java / Scala.
  • Expert in SQL and performance tuning for large datasets.
  • Hands‑on experience with Big Data ecosystems:
    Hadoop, Spark, Kafka, Hive, HBase, etc.
  • Strong experience with Cloud platforms (AWS / Azure / GCP) and services:
    • AWS: S3, Glue, EMR, Redshift, Lambda, Kinesis
    • Azure: Data Factory, Synapse, Databricks, ADLS
    • GCP: BigQuery, Dataflow, Pub/Sub
  • Experience with Data Warehouse / Data Lake / Lakehouse design and modeling (Kimball, OLAP, OLTP, Star/Snowflake schemas).
  • Proficiency in CI / CD, Git, Docker, Kubernetes, Airflow or similar orchestration tools.
  • Knowledge of data governance, security, metadata management, and data quality frameworks.
Nice to Have
  • Experience with Databricks / Snowflake / Delta Lake.
  • Exposure to ML pipelines, MLOps, and real‑time streaming data processing.
  • Experience in designing scalable solutions for enterprise‑level environments.
Education

Bachelor's or Master's degree in Computer Science, Engineering, or related field.
