
Data Analytics Engineer

Job in Wokingham, Berkshire, RG40, England, UK
Listing for: Profitero
Full Time position
Listed on 2025-12-20
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below

About Profitero

Profitero is a leading global SaaS commerce platform that uses predictive intelligence to help brands anticipate, activate and automate their next best action to fuel profitable growth. Our technology monitors 80+ million products daily, across 1400+ retailers and 70+ countries, helping brands optimise search placement, product content, pricing, stock availability, reviews and more. News outlets, including Good Morning America, The Wall Street Journal and Ad Age frequently cite and trust Profitero as a source of data for their stories.

Now’s an exciting time to join our fast-growth business.

Profitero joined Publicis Groupe (a $13 billion global marketing services and technology company) as a standalone commerce division, infusing our business with significant product development resources and investment while giving our employees an incredible launchpad for their careers. Profitero’s tech and data combined with Publicis’ tech, data and activation services positions us to be a true end-to-end partner for helping brands maximise eCommerce market share and profits.

Come be a part of our fast-paced, entrepreneurial culture and next stage of growth.

Location

Winnersh Triangle, Reading, or White City, London (hybrid)

Overview

We are inviting a Data Analytics Engineer to join our Analytics team, which is building a next-generation eCommerce intelligence service for retailers and manufacturers. We welcome new team members who are not afraid of non-trivial tasks, can offer unconventional technical solutions, and take initiative.

About the role:

We are developing a new portfolio of analytical services that let our customers analyse eCommerce data. The work involves collecting, processing, and presenting large volumes of data to customers, and is built with Snowflake, dbt, Sigma, and Python.

Profitero provides customers with data on how products perform online from various angles: prices, availability, placement, ratings and reviews, product content, etc. Customers can use dashboards, our web application, or an API connection to Snowflake to get information about product performance online.
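As a rough illustration of the Snowflake API access mentioned above, here is a minimal sketch using the snowflake-connector-python package. The connection parameters, the warehouse/database/schema names, and the product_performance table with its columns are hypothetical placeholders for illustration, not Profitero's actual schema.

    import os

    import snowflake.connector

    # Connect to Snowflake; credentials are read from environment variables.
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse name
        database="ECOMMERCE",      # hypothetical database name
        schema="REPORTING",        # hypothetical schema name
    )

    try:
        cur = conn.cursor()
        # Hypothetical query: last 30 days of per-product availability and ratings.
        cur.execute(
            """
            SELECT product_id, snapshot_date, in_stock, avg_rating
            FROM product_performance
            WHERE snapshot_date >= DATEADD('day', -30, CURRENT_DATE())
            ORDER BY snapshot_date
            """
        )
        for row in cur.fetchall():
            print(row)
    finally:
        conn.close()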

Migrating existing services to Snowflake requires rewriting certain data pipelines to adapt our client dashboards to the new technology stack. The data engineer in this team will be responsible for implementing those pipelines in collaboration with the BI and Architecture teams.

Responsibilities:
  • Design and build scalable and resilient Data & Analytics solutions
  • Automate data workflows and optimize data processing for performance and cost
  • Design and develop new data pipelines, and improve existing ones by applying data engineering best practices
  • Design and develop efficient, scalable data models that enable fast and accurate reporting while minimizing cost and query complexity
  • Monitor and optimize data warehouse costs, leveraging Snowflake's cost management tools to ensure efficient data processing and storage usage (a small cost-monitoring sketch follows this list)
  • Engage in proof of concepts and experiments
  • Coordinate with the Infrastructure team to secure necessary permissions and exchange technical insights that enhance the Analytics data environment.
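In support of the cost-monitoring responsibility above, here is a minimal sketch of querying Snowflake's ACCOUNT_USAGE metering view from Python. The seven-day window is an illustrative choice, and the connection is assumed to be created as in the earlier sketch, under a role with access to the SNOWFLAKE.ACCOUNT_USAGE share.

    # Aggregate credits consumed per warehouse over the last 7 days using
    # Snowflake's ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view.
    COST_QUERY = """
        SELECT warehouse_name,
               ROUND(SUM(credits_used), 2) AS credits_last_7_days
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        ORDER BY credits_last_7_days DESC
    """

    def report_warehouse_costs(conn) -> None:
        # `conn` is an open snowflake.connector connection (see earlier sketch).
        cur = conn.cursor()
        cur.execute(COST_QUERY)
        for warehouse_name, credits in cur.fetchall():
            print(f"{warehouse_name}: {credits} credits")
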
Who you are:
  • Bachelor's degree in Computer Science, Data Engineering, Data Science, Software Engineering or similar. A master’s degree is a plus.
  • 1+ years of related work experience
Technical skills:
  • Strong knowledge of Python and SQL, with at least one year of practical experience in data automation.
  • Hands-on experience with at least one cloud data warehouse platform: Snowflake, BigQuery, or Databricks. Knowledge of Snowflake dynamic tables, tasks, and procedures is a plus.
  • Hands-on experience with GitHub Actions or GitLab CI/CD.
  • Solid understanding of designing and optimising data pipelines and data models in a cloud environment. Experience in setting up ETL processes using dbt (or similar tools such as Dataform or Dagster); a minimal dbt invocation sketch follows this list.
  • Experience with BI tools like Power BI, Tableau, Sigma, Looker, or similar is a plus.
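Related to the dbt experience asked for above, here is a minimal sketch of invoking dbt programmatically from Python via the dbtRunner interface (available in dbt-core 1.5+). The "staging+" selector and the CI tie-in are illustrative assumptions, not a description of Profitero's pipelines.

    from dbt.cli.main import dbtRunner, dbtRunnerResult

    # Programmatic equivalent of `dbt build --select staging+`, run from the
    # dbt project directory (requires dbt-core >= 1.5).
    runner = dbtRunner()
    result: dbtRunnerResult = runner.invoke(["build", "--select", "staging+"])

    if not result.success:
        # Exit non-zero so a CI runner (e.g. GitHub Actions or GitLab CI/CD)
        # marks the pipeline run as failed.
        raise SystemExit(1)

The same invocation could equally be triggered from a scheduled workflow; the choice of orchestrator is outside this sketch.
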
Soft Skills:
  • Strong problem‑solving skills with the ability to troubleshoot and resolve complex data issues.
  • Excellent communication…