
Data Engineer - IT; Sales & Marketing; Austin, MN, Eden Prairie, MN

Job in Austin, Mower County, Minnesota, 55912, USA
Listing for: Hormel Foods
Full Time position
Listed on 2025-12-12
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Warehousing
Salary/Wage Range or Industry Benchmark: $60,000 – $80,000 USD per year
Job Description & How to Apply Below
Position: Staff Data Engineer - IT (Sales & Marketing) Hormel Foods (Austin, MN, Eden Prairie, MN)

Staff Data Engineer – IT (Sales & Marketing) – Hormel Foods

Hormel Foods does not offer sponsorship of job applicants for employment-based visas for this position at this time.

Hormel Foods Corporation

ABOUT HORMEL FOODS – Inspired People. Inspired Food.™

Hormel Foods Corporation, based in Austin, Minn., is a global branded food company with over $12 billion in annual revenue across more than 80 countries worldwide. Its brands include SKIPPY®, SPAM®, Hormel® Natural Choice®, Applegate®, Justin’s®, Wholly®, Hormel® Black Label®, Columbus® and more than 30 other beloved brands. The company is a member of the S&P 500 Index and the S&P 500 Dividend Aristocrats. It was named to the “Global 2000 World’s Best Employers” list by Forbes magazine for three straight years, is one of Fortune magazine’s most admired companies, has appeared on Corporate Responsibility Magazine’s “The 100 Best Corporate Citizens” list for the 12th year in a row, and has received numerous other awards and accolades for its corporate responsibility and community service efforts.

The company lives by its purpose statement — Inspired People. Inspired Food.™ — to bring some of the world’s most trusted and iconic brands to tables across the globe. For more information, visit the company’s website.

Summary

We are looking for a Staff Data Engineer within the Sales and Marketing domain as part of our Data and Analytics team. This is an exciting opportunity to help grow and modernize analytics at Hormel Foods! The ideal candidate will have strong communication skills and the ability to collaborate across multiple levels of the organization. You will manage multiple initiatives that require creative problem‑solving.

You will use tools such as Google Cloud Platform, SQL, Python, Incorta, Oracle Business Intelligence, and Informatica ETL to engineer data pipelines and data models to enhance enterprise reporting and analytics. Additionally, you will engineer reports, dashboards and visualizations using enterprise business intelligence tools (Oracle, Power BI and Tableau).

Specific Competencies
  • Data Structures and Models - Develops the overall database/data warehouse structure based on functional and technical requirements. Develops data collection frameworks for mainly structured and sometimes unstructured data.
  • Data Pipelines and ELT - Applies data extraction, loading, and transformation techniques to connect medium to large data sets from a variety of sources.
  • Data Performance - With minimal guidance, troubleshoots and fixes data performance issues that arise when querying and combining medium to large volumes of data. Tests for scenarios affecting performance during initial development.
  • Visualizations and Dashboards - Designs and develops reports and dashboards that meet business needs. Leverages visualizations where possible to speed the identification of insights.
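The data-structures competency above centers on dimensional (star-schema) modeling: facts joined to dimensions and queried through a semantic layer. A minimal, self-contained sketch using Python's built-in sqlite3 — table names, columns, and sample rows are all illustrative, not Hormel's actual schema:

```python
import sqlite3

# Minimal star-schema sketch: one dimension, one fact table, and a
# report-style query. All names and values here are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    brand       TEXT,
    category    TEXT
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date   TEXT,
    units       INTEGER,
    revenue     REAL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "BrandA", "Spreads"), (2, "BrandB", "Meats")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, "2025-01-01", 10, 35.0),
                 (1, "2025-01-02", 5, 17.5),
                 (2, "2025-01-01", 8, 40.0)])

# Aggregate revenue by category -- the kind of metric calculation a
# semantic layer would expose to reporting tools.
cur.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""")
print(cur.fetchall())  # [('Meats', 40.0), ('Spreads', 52.5)]
```

The same fact/dimension join pattern carries over to BigQuery or Oracle SQL; only the dialect and scale change.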
Responsibilities
  • Collaborate with Sales & Marketing team members, data scientists, BI analysts, and other stakeholders to understand data needs and deliver solutions.
  • Develop the overall database/data warehouse structure based on functional and technical requirements.
  • Engineer physical and logical data models for dimensions and facts within the staging, warehouse, and semantic layers of enterprise data warehouses and platforms.
  • Performance tune SQL, Python, Incorta, or Informatica ETL pipelines, as well as Google BigQuery and Dataproc jobs, to move data from various source systems and file types into dimensional data models.
  • Utilize SQL within Google BigQuery, Informatica ETLs, Incorta pipelines, or Oracle SQL views to achieve proper metric calculations or derive dimension attributes.
  • Engineer schedule and orchestration for batch and mini‑batch data loads into enterprise data warehouses and platforms.
  • Provide issue resolution and maintenance for various business unit solutions existing in enterprise data warehouses and platforms.
  • Use tools such as SQL, Oracle Business Intelligence, Power BI, Tableau, Google Cloud Platform, Python, Incorta, and Informatica ETL to engineer data pipelines and models to enhance enterprise reporting and analytics.
  • Engineer dashboards within enterprise…
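The scheduling and orchestration responsibility above boils down to running dependent batch loads in the right order: stage extracts first, then dimensions, then facts, then downstream refreshes. A minimal sketch of that dependency ordering using Python's standard-library graphlib — the task names are hypothetical, not actual pipeline names:

```python
from graphlib import TopologicalSorter

# Hypothetical batch-load dependency graph: each task maps to the set
# of tasks it depends on. A real orchestrator (e.g. a scheduler over
# BigQuery/Informatica jobs) would execute these in this order.
deps = {
    "stage_pos_extract": set(),
    "load_dim_product": {"stage_pos_extract"},
    "load_dim_customer": {"stage_pos_extract"},
    "load_fact_sales": {"load_dim_product", "load_dim_customer"},
    "refresh_dashboards": {"load_fact_sales"},
}

# static_order() yields a valid topological order of the tasks.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Dimensions can run in either order relative to each other; the invariant is only that every task runs after all of its dependencies, which is exactly what a batch or mini-batch schedule must guarantee.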