Data Engineering Lead, Raleigh, NC

Job in Raleigh, Wake County, North Carolina, 27601, USA
Listing for: Disability Solutions
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer
  • Engineering
    Data Engineer
Job Description

LexisNexis is seeking an experienced Data Engineering Lead to manage and develop a high-performing engineering team that delivers secure, scalable, and business-critical data solutions across our global platforms. This role requires a strong people manager with hands-on technical expertise and the ability to lead teams through complex data engineering challenges in a fast-paced, enterprise environment.

You will manage a team that includes Consulting Data Engineers, Senior Data Engineers, Data Engineers III and II, and entry-level engineers, guiding their growth and ensuring strong execution, delivery quality, and engineering excellence. This is a leadership role for someone who can balance team management, delivery ownership, and architectural oversight while maintaining a strong technical foundation.

What You’ll Do
People Leadership & Team Management
  • Lead, mentor, and manage a multi-level data engineering team (Consulting, Senior, DE III/II/I) distributed across multiple global locations.
  • Drive career development, skill growth, coaching, and performance reviews for all team members.
  • Build an inclusive, collaborative, and high-accountability team culture aligned with LexisNexis values.
  • Participate in hiring, onboarding, and talent planning to strengthen the engineering organization.
Delivery & Execution Ownership
  • Own execution and delivery of all data engineering roadmap items for your domain.
  • Manage sprint planning, prioritization, estimation, and work allocation across multiple projects.
  • Track delivery KPIs: pipeline availability, data quality, SLA adherence, velocity, and stability.
  • Anticipate risks, resolve blockers, and ensure consistent, predictable delivery.
Technical Leadership & Architectural Oversight
  • Provide architectural guidance on building secure, scalable cloud data pipelines and platforms.
  • Ensure all solutions meet enterprise standards for governance, observability, and compliance.
  • Review and approve solution designs, architectural documents, and critical code paths.
  • Guide the team in implementing best practices in CI/CD, testing, modularity, resiliency, and documentation.
Cross-Functional Collaboration
  • Partner with Product, Architecture, Platform Engineering, Data Governance, and business teams.
  • Translate business requirements into actionable engineering tasks and technical designs.
  • Influence upstream and downstream teams to ensure data consistency, quality, and availability.
  • Represent the Data Engineering function in planning meetings, architecture reviews, and operational forums.
Operational Excellence
  • Oversee production data pipelines, ensuring reliability, cost efficiency, and optimal performance.
  • Implement best practices for monitoring, logging, alerting, on-call rotations, incident management, and RCA.
  • Drive automation across deployment, testing, orchestration, and environment provisioning.
  • Continuously reduce technical debt and enhance platform scalability and resilience.
Technical Skills & Experience
Required Technical Skills
  • Python - strong hands-on experience writing production-grade code for ETL/ELT and automation.
  • SQL - expert-level ability to write, optimize, and troubleshoot complex SQL queries at scale.
  • AWS - strong experience with cloud data services such as S3, Redshift, Lambda, Glue, EMR, Step Functions, and IAM.
  • Redshift - hands-on experience modeling data, optimizing queries, and managing Redshift clusters.
  • DevOps - knowledge of CI/CD pipelines, GitOps, automation, monitoring, environment management, and infrastructure-as-code.
  • Orchestration - experience with Airflow, Step Functions, or equivalent workflow orchestration tools.
Preferred / Nice-to-Have Skills
  • Databricks - experience processing large datasets using Spark and Delta Lake.
  • Matomo - exposure to tracking/analytics data ingestion and event pipelines.
  • FullStory - experience handling behavioral analytics, session replay data, or similar tools.
  • Pendo - experience with product analytics datasets and event telemetry.
  • EMR - experience running distributed data processing workloads using Hadoop/Spark on AWS EMR.
  • Data Quality & Governance - familiarity with…