
Senior Data Engineer - Data Infrastructure and Architecture

Job in Wakefield, Middlesex County, Massachusetts, 01880, USA
Listing for: C-4 Analytics
Full Time position
Listed on 2025-12-03
Job specializations:
  • IT/Tech
    Data Engineer

Overview

Senior Data Engineer - Data Infrastructure and Architecture: C-4 Analytics

C-4 Analytics is a fast-growing, private, full-service digital marketing company that excels at helping automotive dealerships increase sales, increase market share, and lower cost per acquisition. C-4 Analytics is committed to developing innovative solutions for every dealer in every market and to providing the highest levels of accountability and customer service. We are currently hiring for a Senior Data Engineer as we look to expand our team and support our growing roster of local and national clients.

If you are unable to complete this application due to a disability, contact this employer to ask for accommodation or an alternative application process.

Who We’re Looking For

Senior Data Engineer

Data Alchemy:
Forging Insights from Information

C-4 Analytics is looking for an experienced Senior Data Engineer with expertise in data infrastructure and architecture to help shape the future of our data-driven digital marketing platforms. As a part of our growing Product, Engineering, and AI team, you’ll play a critical role in identifying and bringing together our diverse data sources and orchestrating intelligent systems. We need you to lead with an AI-forward mindset—designing and managing pipelines, platforms, and orchestration technologies that transform information into actionable insights.

We’re not just processing data—we’re transforming it into organizational intelligence. As our Data Engineering Virtuoso, you’ll build enterprise-grade AI pipelines, turning unstructured data into decision-making gold by creating intelligent data platforms at scale.

Qualifications & Responsibilities

Illuminating the Future of Marketing Intelligence

The ideal candidate will have a strong background in ETL/ELT pipelines, data warehouse connectivity, data cleaning, normalization, database architecture, database optimization, and orchestration processes. This role will be responsible for orchestrating and delivering enterprise-grade AI pipelines: connecting disparate data sources to Snowflake (our data warehouse), ensuring the cleanliness and normalization of data, and implementing database architecture best practices.
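
The cleaning-and-normalization work described above might look something like the following sketch, which uses Pandas (one of the tools named in this posting). The column names, field rules, and `normalize_leads` function are hypothetical, invented purely for illustration; they do not describe C-4 Analytics' actual pipelines.

```python
import pandas as pd

def normalize_leads(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and normalize a raw lead-source extract before a warehouse load.

    Illustrative only: column names and rules are hypothetical.
    """
    df = raw.copy()
    # Normalize column names to snake_case for a consistent warehouse schema.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Standardize string fields, then drop exact duplicates.
    df["dealer_name"] = df["dealer_name"].str.strip().str.title()
    df = df.drop_duplicates()
    # Coerce dates; invalid values become NaT instead of failing the load.
    df["lead_date"] = pd.to_datetime(df["lead_date"], errors="coerce")
    return df

raw = pd.DataFrame({
    "Dealer Name": ["  acme motors ", "acme motors", "Beta Autos"],
    "Lead Date": ["2025-01-05", "2025-01-05", "not-a-date"],
})
clean = normalize_leads(raw)
```

After normalization the two messy variants of the same dealer collapse into one row, and the unparseable date surfaces as a null rather than a pipeline failure—the kind of defensive cleaning that keeps a Snowflake load reliable.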

Your Canvas
  • Prototype the Impossible
    Design, develop, and maintain proofs of concept using cutting-edge technologies, then refine them into production-ready solutions.
  • Empower Through Innovation
    Craft intuitive tools that elevate data scientists and analysts to their highest potential. Collaborate with cross-functional teams to ensure that data storage and organization align with business needs and objectives.
  • Seamless Scaling & Performance Optimization
    Implement database architecture best practices, including database sharding, replication strategies, indexing, and optimization techniques to enhance data performance.
  • Compose Data Symphonies
    Orchestrate enterprise-grade AI pipelines for complex data flows that bring harmony to disparate sources through batch and streaming pipelines. Evaluate and optimize data storage and retrieval systems based on relationships, data access patterns, cost-effectiveness, and performance requirements.
  • Blueprint Before Building
    Design elegant solutions and document your vision so others can follow your path. Provide leadership and guidance on information architecture decisions, ensuring that data is stored, organized, and accessed in the most efficient and effective manner.
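
To make the indexing-and-optimization responsibility concrete, here is a minimal sketch of how an index changes a query plan. It uses SQLite from the Python standard library as a stand-in for a production database; the `leads` table and `idx_leads_dealer` index are hypothetical names chosen for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE leads (id INTEGER PRIMARY KEY, dealer TEXT, lead_date TEXT)"
)
conn.executemany(
    "INSERT INTO leads (dealer, lead_date) VALUES (?, ?)",
    [(f"dealer_{i % 50}", "2025-01-05") for i in range(1000)],
)

# Without an index, filtering by dealer forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM leads WHERE dealer = 'dealer_7'"
).fetchone()

# With an index, the planner can seek directly to the matching rows.
conn.execute("CREATE INDEX idx_leads_dealer ON leads (dealer)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM leads WHERE dealer = 'dealer_7'"
).fetchone()
```

The same principle (trading write overhead and storage for faster reads on known access patterns) carries over to Snowflake clustering keys and relational-database index design.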

Your Toolkit

  • The Languages You Speak:
    Python, SQL, the dialect of data
  • Libraries | Tools:
    Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena
  • Your Trusted Companions:
    Docker, Snowflake, MongoDB, relational databases (e.g., MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes
  • Your AWS Kingdom:
    Lambda, Redshift, EC2, ELB, IAM, RDS, Route 53, S3—the building blocks of cloud mastery
  • Your Philosophy:
    Continuous integration/deployment (CI/CD) automation, rigorous code reviews, documentation as communication
Preferred Qualifications
  • Familiarity with data manipulation and experience with Python libraries such as Flask, FastAPI, Pandas, PySpark, and PyTorch, to name a few…
  • Proficiency in statistics and/or machine learning…
Position Requirements
10+ years of work experience