
Senior Data Engineer

Job in Vancouver, BC, Canada
Listing for: Two Circles
Contract position
Listed on 2026-02-02
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Security, Data Analyst
Job Description & How to Apply Below

About Two Circles

We are Two Circles. We grow audiences and revenues by helping our clients know fans best. We analyze fan behavior across spending, event attendance, channel engagement, and content consumption to empower leading sports and entertainment organizations. Our trusted platforms and services are used by over 900 clients globally – from the English Premier League, UEFA, NFL, and Red Bull, to Amazon, Nike, and VISA.

With nearly 1,000 team members across 14 global offices, our impact spans the world of sports and entertainment.

About the KORE Intelligence Platform

The KORE Intelligence Platform is a best-in-class fan, partnership, and revenue intelligence solution. It aggregates data from an integration catalog of over 150 providers such as Ticketmaster, Eloqua, Shopify, and Salesforce Marketing Cloud, enabling powerful segmentation and actionable insights for rights-holders. Our platform underpins client strategies to drive audience growth, personalization, and commercial success.

Role Overview

We are looking for a Data Engineer with 3+ years' experience working on enterprise-scale data architectures, preferably within a cloud-native environment (AWS), to join us on a 6-month contract. You will bring deep expertise in designing and operating secure, highly available, and cost-optimized pipelines, with hands-on experience using tools like Spark, Kinesis, Airflow, and Terraform.

You'll collaborate closely with architects, DevOps engineers, analysts, and stakeholders to shape the future of our SaaS-based fan intelligence platform and ensure it meets the needs of high-volume, real-time data use cases.

Key Responsibilities

Lead the design and implementation of scalable, secure, and fault-tolerant data pipelines supporting batch and streaming workloads.

Champion enterprise data governance, monitoring, and alerting practices, ensuring compliance with data protection regulations (e.g., GDPR, CCPA).

Collaborate with cross-functional teams (Product, DevOps, BI, Client Services) to drive end-to-end delivery of data features.

Take ownership of pipeline performance, observability, cost efficiency, and CI/CD workflows.

Provide technical guidance, mentor junior team members, and uphold high standards of code quality and engineering practices.

Contribute to the reliability and scalability of the platform through strong infrastructure-as-code, monitoring, and automated testing frameworks.

Experience Required

3+ years of professional experience as a Data Engineer in cloud environments, working with high-throughput, enterprise-scale data pipelines.

Strong experience with AWS services: S3, Lambda, Kinesis, DynamoDB, Redshift, EC2, EMR, IAM.

Demonstrated success managing secure data architectures with sensitive or regulated data (e.g., PII).

Proven expertise in big data architectures, including Data Lakes and Lakehouses, leveraging AWS services like S3, Athena, Glue, Redshift, and distributed computing with Apache Spark.
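
As a hedged illustration of this kind of lake-to-lakehouse work (the bucket paths, column names, and transformation below are invented for the sketch, not taken from the posting):

    # Minimal PySpark sketch: curate raw S3 data into partitioned Parquet.
    # All names here are hypothetical examples.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("fan-events-batch").getOrCreate()

    # Read raw JSON events landed in an S3 data lake.
    events = spark.read.json("s3://example-raw-zone/ticketing/")

    # Light transformation: derive a date column for partitioning.
    curated = events.withColumn("event_date", F.to_date("event_ts"))

    # Write partitioned Parquet for downstream Athena/Redshift queries.
    (curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-zone/ticketing/"))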

Practical experience with real-time data streaming using AWS Kinesis (Firehose and Data Streams) for scalable ingestion and processing.
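
A minimal boto3 sketch of Kinesis Data Streams ingestion; the stream name, region, and payload shape are assumptions for illustration only:

    import json
    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    record = {"fan_id": "12345", "action": "ticket_purchase", "amount": 59.99}

    # PartitionKey controls shard routing; keying by fan_id keeps a single
    # fan's events ordered within one shard.
    kinesis.put_record(
        StreamName="example-fan-events",
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["fan_id"],
    )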

Experience integrating with APIs, third-party data platforms, and real-time ingestion services.

Bonus points for experience and/or strong interest in AI-assisted programming tools and workflows.

Technical Skills

Strong programming experience in Python and at least one of Scala or C#.

Proven experience with Apache Airflow (or equivalent orchestration tools) for job scheduling and workflow management.
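
For example, a minimal Airflow 2.x DAG of the shape implied here (task names, schedule, and callables are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        print("pull from source API")


    def load():
        print("write to warehouse")


    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2026, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # load runs only after extract succeeds
        extract_task >> load_task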

Proficient with containerization technologies like Docker and orchestration with ECS or Kubernetes.

Experience with infrastructure as code (e.g., Terraform, CloudFormation).

Experience in setting up observability stacks and using tools such as New Relic, CloudWatch, Prometheus, or Datadog.

Expertise in data testing and CI/CD, including unit testing (e.g., PyTest), data validation, and schema enforcement.
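
A small PyTest sketch of this style of schema enforcement and data validation, using an invented pandas frame and contract:

    import pandas as pd
    import pytest

    # Hypothetical schema contract: column names mapped to expected dtypes.
    EXPECTED_COLUMNS = {"fan_id": "int64", "amount": "float64"}


    @pytest.fixture
    def transactions():
        return pd.DataFrame({"fan_id": [1, 2], "amount": [59.99, 120.00]})


    def test_schema_matches_contract(transactions):
        # Column set and dtypes must match the agreed schema exactly.
        assert set(transactions.columns) == set(EXPECTED_COLUMNS)
        for col, dtype in EXPECTED_COLUMNS.items():
            assert str(transactions[col].dtype) == dtype


    def test_no_negative_amounts(transactions):
        assert (transactions["amount"] >= 0).all()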

Understanding of streaming vs batch-oriented architectures, and when to use each approach.

Soft Skills

Self-directed and highly proactive, with strong ownership of delivery and quality.

Excellent written and verbal communication skills; able to translate technical requirements into business impact.

Strong collaboration skills and a mindset for mentorship and knowledge-sharing.

Agile mindset, comfortable working in Scrum or Kanban environments.

Mindset & Values

Outcome-oriented, with a passion for delivering high-quality, maintainable, and secure data products.

Thrives in a fast-paced, evolving product landscape.

Values transparency, open communication, and a collaborative team culture.
