
Senior Data Modeler

Job in Birmingham, Jefferson County, Alabama, 35275, USA
Listing for: Kemper
Full Time position
Listed on 2026-02-21
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: 60,000 – 80,000 USD yearly
Job Description & How to Apply Below

Location(s):
Birmingham, Alabama;
Chicago, Illinois;
Dallas, Texas;
Jacksonville, Florida

Overview

Kemper is one of the nation’s leading specialized insurers. Our success is a direct reflection of the talented and diverse people who make a positive difference in the lives of our customers every day. We believe a high-performing culture, valuable opportunities for personal development and professional challenge, and a healthy work‑life balance can be highly motivating and productive. Kemper’s products and services are making a real difference to our customers, who have unique and evolving needs.

By joining our team, you are helping to provide an experience to our stakeholders that delivers on our promises.

Position Summary

We are seeking a highly skilled Senior Data Modeler to join our Data Engineering organization within a large, enterprise‑scale P&C and Life insurance company. In this role, you will lead the design and evolution of conceptual, logical, and physical data models that support underwriting, claims, policy administration, billing, actuarial, finance, and customer engagement domains. You will collaborate closely with data engineers, data architects, business SMEs, and delivery teams to ensure data structures are robust, scalable, governed, and analytics‑ready.

This role is a key contributor to our enterprise data modernization journey—including cloud data platforms, domain‑oriented data ecosystems, and advanced analytics/AI initiatives.

Position Responsibilities

Data Modeling & Architecture

  • Lead the design of enterprise-level conceptual, logical, and physical data models across P&C insurance domains.
  • Translate complex business requirements into normalized, dimensional, and data vault models aligned with architectural standards.
  • Design models supporting operational data stores (ODS), data lakes, data warehouses, and real‑time/streaming data pipelines.
  • Ensure data models follow industry best practices, data governance standards, and regulatory requirements (e.g., state/federal insurance regulations).

Collaboration & Delivery

  • Work closely with Data Engineers to implement models in cloud‑native environments (Snowflake/AWS).
  • Partner with Business Analysts, Product Owners, Actuaries, Underwriters, and Claims SMEs to gather requirements and validate model designs.
  • Support Enterprise Architecture and Data Governance in defining data standards, naming conventions, lineage, and metadata.
  • Review and optimize existing data models for scalability, maintainability, and performance.

Data Governance & Quality

  • Ensure models align with enterprise data definitions, master data management (MDM), and data quality rules.
  • Collaborate with governance teams to maintain business glossaries, taxonomies, and data catalogs.
  • Promote consistent data usage, stewardship, and compliance across the organization.

Documentation & Communication

  • Produce clear, standards‑based model documentation and metadata.
  • Present data model concepts to both technical and non‑technical stakeholders.
  • Mentor junior modelers and data engineers on modeling strategies and best practices.

Position Qualifications

  • Bachelor’s degree in Computer Science, Information Systems, Data Management, or related field.
  • 7–10+ years of experience in data modeling within large, complex environments.
  • Proven expertise in:
    • Relational modeling (3NF)
    • Dimensional modeling (Kimball, star/snowflake)
    • Data Vault 2.0 design
  • Experience working within P&C insurance data domains (e.g., policy, claims, billing, exposures, reinsurance, agent/producer).
  • Strong proficiency with data modeling tools such as ERwin, ER/Studio, or Azure Data Modeling tools.
  • Solid understanding of SQL, data warehousing patterns, cloud data platforms (Snowflake, Databricks, etc.), and the AWS ecosystem.
  • Experience supporting data engineering pipelines and distributed data environments.
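To illustrate the dimensional modeling style named in the qualifications above, here is a minimal star-schema sketch. The table and column names (`dim_policy`, `fact_claim`, `paid_amount`) are hypothetical, invented for this example rather than taken from the posting; it uses Python's built-in sqlite3 module so it can run anywhere:

```python
import sqlite3

# In-memory database; all table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per policy, holding descriptive attributes.
cur.execute("""
    CREATE TABLE dim_policy (
        policy_key INTEGER PRIMARY KEY,
        line_of_business TEXT NOT NULL,
        state TEXT NOT NULL
    )
""")

# Fact table: one row per claim, keyed to the policy dimension.
cur.execute("""
    CREATE TABLE fact_claim (
        claim_key INTEGER PRIMARY KEY,
        policy_key INTEGER NOT NULL REFERENCES dim_policy(policy_key),
        paid_amount REAL NOT NULL
    )
""")

cur.executemany(
    "INSERT INTO dim_policy VALUES (?, ?, ?)",
    [(1, "Auto", "AL"), (2, "Auto", "TX"), (3, "Home", "AL")],
)
cur.executemany(
    "INSERT INTO fact_claim VALUES (?, ?, ?)",
    [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0), (13, 3, 120.0)],
)

# Dimensional rollup: total paid claims by line of business.
rows = cur.execute("""
    SELECT d.line_of_business, SUM(f.paid_amount)
    FROM fact_claim f
    JOIN dim_policy d ON d.policy_key = f.policy_key
    GROUP BY d.line_of_business
    ORDER BY d.line_of_business
""").fetchall()

print(rows)  # [('Auto', 1650.0), ('Home', 120.0)]
```

The same structure scales to the star and snowflake warehouse patterns the role calls for: facts stay narrow and numeric, while descriptive context lives in conformed dimensions joined by surrogate keys.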

Preferred Qualifications

  • Familiarity with API data contracts, event‑driven modeling, and streaming platforms (Kafka/Kinesis).
  • Understanding of actuarial, risk modeling, regulatory reporting, and financial analytics.
  • Working knowledge of MDM and golden‑record strategies.
  • Cloud certifications (Snowflake, AWS, Azure, or GCP) a plus.
  • Sponsorship is not…

Position Requirements

10+ years of work experience