
Senior Data Integration Engineer

Job in Sunnyvale, Santa Clara County, California, 94087, USA
Listing for: Crusoe
Full Time position
Listed on 2025-11-22
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Warehousing, Cloud Computing
Salary/Wage Range or Industry Benchmark: USD 150,000 - 200,000 per year
Job Description & How to Apply Below

Crusoe’s mission is to accelerate the abundance of energy and intelligence. We’re crafting the engine that powers a world where people can create ambitiously with AI — without sacrificing scale, speed, or sustainability.

Be a part of the AI revolution with sustainable technology. Here, you'll drive meaningful innovation, make a tangible impact, and join a team that's setting the pace for responsible, transformative cloud infrastructure.

Crusoe Cloud is seeking a Data Integration Engineer to help build the foundation of our next-generation data platform. In this role, you’ll design and maintain scalable data pipelines and integrations between key business systems, enabling reliable data flow and analytics across the organization. You’ll play a key role in supporting Crusoe’s data-driven decision-making and help connect systems across construction, engineering, and enterprise platforms.

A successful candidate will have a strong background in data structures; prior exposure to and experience with construction-related datasets (DCIS, PMIS, CMMS, ERP, etc.) is desired.

What You’ll Be Working On:
  • Data Pipeline Development: Design, implement, and maintain scalable data pipelines (ETL/ELT) using primary tools like Fivetran, Workato, and DBT to move data between critical business systems, including PMIS, ERP, HCM, and cloud environments like GCS/GCP.

  • Initial Project Focus: Lead the development of data integrations for our datacenter construction business, linking systems such as DCIS, PMIS, BIM, ERP, Cost Management, and Procurement.

  • Data Lake Management: Build and manage data ingestion processes (ETL) to consolidate structured and unstructured data into a centralized data lake built on GCS (see the sketch after this list).

  • Analytics Enablement: Ensure data quality and availability to support business analytics and reporting as well as complex forecasting and modeling initiatives.

  • Reporting Tool Integration: Build the necessary data integrations to allow visualization and reporting using tools like Sigma and DBT.

  • Meet regularly with various business units to gather data requirements and to propose and implement data pipeline enhancements and modernization.

  • Prepare functional specifications (business requirements) and test data as needed for new integrations.

  • Work with the Operations Team to create and maintain a roadmap of data integration projects.

  • Maintain accurate documentation of code, designs, and integrations, including project tickets, knowledge bases, configuration documents, and as-built diagrams.

  • Stay accountable for project work by adhering to Agile sprint principles and meeting KPI objectives.

  • Provide timely communications and acknowledgments.

  • Travel up to 20% to support other offices.
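
To illustrate the Data Lake Management responsibility above, here is a minimal, hypothetical sketch of landing an extracted file in a GCS-backed raw zone using the google-cloud-storage Python client. The bucket name, object layout, and source-system names are assumptions for illustration only; in practice, much of this movement would be configured in Fivetran, Workato, and DBT rather than hand-written.

```python
from datetime import date

from google.cloud import storage  # pip install google-cloud-storage


def land_file_in_datalake(local_path: str, bucket_name: str, source_system: str) -> str:
    """Upload one extracted file into a date-partitioned 'raw' zone of a GCS data lake."""
    client = storage.Client()  # uses Application Default Credentials
    bucket = client.bucket(bucket_name)

    # Partition raw landings by source system and load date so downstream
    # transformation models (e.g., DBT) can select only the partitions they need.
    file_name = local_path.split("/")[-1]
    blob_name = f"raw/{source_system}/load_date={date.today():%Y-%m-%d}/{file_name}"
    blob = bucket.blob(blob_name)
    blob.upload_from_filename(local_path)
    return f"gs://{bucket_name}/{blob_name}"


if __name__ == "__main__":
    # Hypothetical bucket and file names, for illustration only.
    uri = land_file_in_datalake("exports/pmis_cost_items.csv", "example-datalake-raw", "pmis")
    print(f"Landed at {uri}")
```

The date-partitioned layout shown here is one common convention for a raw landing zone; the actual object layout would follow whatever standards the team adopts.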

What You Bring to the Team:
  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or Information Technology, or 5+ years of equivalent working experience as a Data Integration Engineer or in a similar role (e.g., Data Engineer, ETL Developer).

  • 3+ years of experience designing and implementing highly reliable, high-volume ETL/ELT pipelines.

  • Expertise in cloud-based data warehousing and data lake solutions, specifically using Google Cloud Storage (GCS) and Google Cloud Platform (GCP) services.

  • Strong proficiency with data integration/ETL platforms like Fivetran and Workato; ideally has achieved the Workato Integration Developer Certificate.

  • Proven experience with DBT (Data Build Tool) for data transformation and modeling in a cloud data warehouse environment.

  • Experience with BI tools, preferably Sigma, for data visualization and reporting.

  • Strong knowledge of SQL, data modeling (Kimball, Inmon), schema design, and database management.

  • Demonstrates strong knowledge of EAI/SOA best practices, solution designs, and methodology & standards related to data movement.

  • Can demonstrate prior experience with Role Based Access Controls, Data Management, Environmental Controls, and audit logs.

  • Experience with Atlassian JIRA, JSM, and Confluence is a plus.

  • Be a master of planning out implementations and new data integrations.

  • Good written, oral, and interpersonal communication skills.

  • Self-starter yet knows when to ask for help, and works…

Position Requirements
10+ Years work experience