
Lead Data Engineer

Job in Pleasant Grove, Utah County, Utah, 84062, USA
Listing for: Kenect
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description

About Us

Kenect is on a mission to revolutionize customer communication and engagement for businesses across North America. Founded with a deep understanding of the challenges businesses face in connecting with their customers, Kenect helps companies streamline communication, enhance customer satisfaction, and drive growth through its innovative messaging and reputation platform. Trusted by thousands of businesses, our passionate team is committed to building technology that fosters closer connections and helps businesses thrive in a digital-first world.

About This Role

As the Lead Data Engineer, you will be the architectural and technical leader for Kenect’s core data ecosystem, including designing and implementing a robust Customer Data Platform (CDP). This platform will unify batch and real-time data from sources such as dealer management systems (DMS), CRMs, social networks, and web traffic to create comprehensive, ML-ready 360-degree customer profiles. You will work closely with a Product Manager to align data initiatives with the product roadmap, ensuring the creation of high-value data products that are prioritized to deliver customer-facing value.

In collaboration with our product, engineering, and AI Platform teams, you will engineer the foundational data infrastructure that enables advanced digital initiatives like real-time personalization, predictive modeling, and targeted marketing campaigns.

In this role, you will help define the architecture, tools, and strategic execution for data pipelines, the data lake, our cloud data warehouse, and integrations with marketing automation tools. Your work will directly support Kenect’s mission to transform the customer experience at tens of thousands of dealerships.

What you will be doing
  • Designing and implementing highly scalable batch and real-time data pipelines using modern tools such as GCP Dataflow (Apache Beam), dbt, and managed Apache Spark (Dataproc).
  • Architecting and managing a secure, performance-optimized data lake and cloud data warehouse solution in Google Cloud using BigQuery, Cloud Storage (GCS), and open formats like Apache Iceberg.
  • Building and optimizing high-throughput streaming data pipelines with GCP Pub/Sub and Dataflow to support real-time data ingestion and processing (see the sketch after this list).
  • Designing and engineering robust feature pipelines to deliver high-quality, low-latency, ML-ready datasets to the AI Platform team for model training and serving.
  • Developing integrations with Customer Data Platforms (e.g., Twilio Segment, RudderStack) and marketing automation systems to ensure a closed-loop data strategy.
  • Collaborating with engineers, analysts, and Data Scientists on the AI Platform team to build data solutions that support advanced use cases like real-time personalization, predictive scoring, and segmentation.
  • Defining strategies for multi-tenant SaaS data solutions to ensure scalability, performance, security, and cost-efficiency.
  • Leading technical initiatives to adopt new technologies and best practices for high-volume data engineering.
  • Ensuring data quality, governance, and compliance with industry standards and regulations.
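
To make the streaming bullet above concrete, here is a minimal, illustrative sketch (not part of the role's requirements or Kenect's actual codebase) of an Apache Beam pipeline in Python that reads customer events from a Pub/Sub subscription and appends them to a BigQuery table. The project, subscription, table, and field names are all hypothetical.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        # Decode a raw Pub/Sub payload into a flat row for BigQuery.
        # Field names are hypothetical placeholders.
        event = json.loads(message.decode("utf-8"))
        return {
            "customer_id": event.get("customer_id"),
            "event_type": event.get("event_type"),
            "occurred_at": event.get("occurred_at"),
        }


    def run() -> None:
        # streaming=True puts Beam in unbounded (real-time) mode.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    # Hypothetical subscription path.
                    subscription="projects/example-project/subscriptions/customer-events"
                )
                | "ParseJson" >> beam.Map(parse_event)
                | "AppendToWarehouse" >> beam.io.WriteToBigQuery(
                    "example-project:cdp.customer_events",  # hypothetical table
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

The same code runs on Dataflow by supplying the --runner=DataflowRunner pipeline option; downstream modeling in the warehouse would then typically be handled with dbt.
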
Skills & Qualifications
  • 8+ years of professional experience in data engineering, with a focus on high-volume batch and streaming data architectures.
  • Expertise with core GCP data products (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Composer, Dataproc), plus experience integrating with Vertex AI services.
  • Proficiency in Python and modern data processing tools like GCP Dataflow (Apache Beam), dbt, and Apache Spark.
  • Hands-on experience with feature engineering, feature store patterns, and designing data pipelines for low-latency ML serving.
  • Strong understanding of CDPs and tools like Twilio Segment or RudderStack.
  • Hands-on experience with streaming tools like GCP Pub/Sub, Kafka, or AWS SNS is highly desirable.
  • Experience with multi-tenant SaaS architecture is a strong plus.
  • Excellent communication skills and a collaborative approach to solving complex problems, particularly when engaging with platform teams.

What Kenect Offers!

• Health, Dental, Vision, Life & Disability Insurance
• Your birthday is a paid day off
• Onsite gym
• Breakroom full of snacks and drinks
• Convenient location next to freeway entrance/exit

We believe in hiring self-motivated team members who can run alongside us without needing to be “managed” along the way. Yes, we have managers and 1:1s. Yes, we believe in giving open two-way feedback. We also believe in having team members who can run without the daily guidance that some companies prefer.

Kenect is an equal opportunity employer. We are an organization made up of people from all kinds of backgrounds, and we believe this mix is precisely what makes us strong. All employment decisions at Kenect are based on business needs, job requirements, and individual qualifications without regard to race, color, religion or belief, family or parental status, or any other status protected under federal, state, or local law.
