
Data Platform – Data Engineer

Job in Oklahoma City, Oklahoma County, Oklahoma, 73116, USA
Listing for: Canoo
Full Time position
Listed on 2025-12-01
Job specializations:
  • IT/Tech
    Data Engineer
Job Description & How to Apply Below
Position: Canoo Data Platform – Data Engineer

About Canoo

Canoo’s mission is to bring EVs to Everyone and build a world-class team to deploy this sustainable mobility revolution.

We have developed breakthrough electric vehicles that are reinventing the automotive landscape with pioneering technologies, award-winning designs, and a unique business model that spans all owners in the full lifecycle of the vehicle. Canoo is starting production and is distinguished by its pioneering and experienced team of technologists, engineers, and designers. With offices around the country, the company is scaling quickly and seeking candidates who love to challenge themselves, are motivated by purpose, and possess a strong desire to get things done.

Job Title

Canoo Data Platform – Data Engineer

The “Canoo Way”

Canoo’s success is the direct result of our disciplined application of our core operating principles and drills, which rest on three pillars:
Think 80/20 (“Important versus less important”), Act 30/30 (“Reduce waste and increase output”), and Live 90/10 (“We have each other’s back”). We hire based on “MET” – Mindset, Equipment, and willingness to Train – and seek individuals who take accountability and deliver results by being Humble, Hungry to succeed, and Hunting for opportunities to win. We train our team to engage with each other by modulating between their intellect (iQ) and emotional intelligence (eQ), applying Facts, Finesse, and Force when they communicate.

The principles and drills of the Canoo Way have been fundamental to our success and to our ability to grow, continuously improve, and innovate; they are at the core of our day-to-day operations.

Job Purpose

As a Data Engineer, you will be responsible for developing and maintaining highly scalable data pipelines that enable data transformation and load between internal systems, IoT devices (electric vehicles), external backend systems, and frontend user interfaces. You will design and implement data streams ensuring data quality, data integrity, security, and high performance. Additionally, you will collaborate with cross-functional teams to continually integrate all company systems.

Responsibilities (80s of the Position)

  • Work with stakeholders to gather data and reporting requirements, to build dashboards and data flows.
  • Create infrastructure-as-code, deployment pipelines, developer tools, and other automations.
  • Understand product requirements, engage with team members and customers to define solutions, and estimate the scope of work required.
  • Deliver solutions that can keep up with a rapidly evolving product in a timely fashion.
Required Experience

  • Google Cloud Platform (GCP), GCS, BigQuery
  • Expertise with one or more back-end languages such as Python, Go, TypeScript, JavaScript, etc.
  • SQL expertise – dbt experience a plus
  • Experience with cloud services like GCP, AWS or Azure.
  • Kafka
  • Dashboarding and Reporting – Superset, Looker
  • Git – Bitbucket/GitLab
  • Kubernetes – mid-level experience

Preferred Experience

  • Python
  • Python dependency management and custom packages
  • Expertise with Google Cloud Platform (GCP)
  • Data Warehousing – partitioning, segmentation
  • Internet of Things (IoT) and MQTT
  • Docker
  • Terraform – experience a plus
  • CI/CD tooling – Jenkins/GitLab CI
  • Understanding of automotive and embedded software systems
Travel Requirements

  • Onsite presence in the office is required; this is not a remote or hybrid role.
  • Travel may be required on an occasional basis for events such as team meetings or working with manufacturers or subject-matter experts on particular tasks.