
Senior Data Architect

Job in Charlotte, Mecklenburg County, North Carolina, 28245, USA
Listing for: CRC Group
Full Time position
Listed on 2025-12-03
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Job Description & How to Apply Below
Location: Charlotte, NC
Time type: Full time
Posted: Today
Job requisition: R
The position is described below. If you want to apply, click the Apply button at the top or bottom of this page. You'll be required to create an account or sign in to an existing one.

If you have a disability and need assistance with the application, you can request a reasonable accommodation by sending an email to Accessibility (accommodation requests only; other inquiries won't receive a response).

Regular or Temporary: Regular
Language Fluency: English (Required)
Work Shift: 1st Shift (United States of America)
Please review the following job description:

We are seeking an experienced Data Architect with deep expertise in Databricks and a strong understanding of the insurance industry. The ideal candidate will design and implement scalable data architectures that support advanced analytics, reporting, and AI/BI/ML initiatives across broking, binding, underwriting, claims, actuarial, and policy management functions.
Key Responsibilities:

Data Architecture & Strategy
  • Design and implement enterprise data architecture leveraging Databricks Lakehouse, Delta Lake, and Azure cloud-native services.
  • Define data integration, modeling, and governance frameworks tailored for insurance data domains such as agency, carrier, policy, claims, invoicing & billing, and reinsurance.
  • Create scalable and secure data pipelines that handle structured and unstructured data from internal and external sources.
  • Author and maintain a technology roadmap with mappings to business capabilities.
  • Experiment with and establish an adoption blueprint for emerging capabilities within Databricks and in data technology generally.
  • Secure data with appropriate NIST cybersecurity controls.
Data Engineering & Analytics Enablement
  • Lead the design of ETL/ELT pipelines using PySpark, SQL, Delta Live Tables, Lakeflow Connect, Auto Loader, Lakeflow Declarative Pipelines, dbt, and Databricks Workflows.
  • Partner with actuarial, underwriting, and BI teams to design semantic layers and analytics-ready datasets.
  • Optimize performance and cost of Databricks clusters and workflows.
  • Enable machine learning and predictive analytics by providing clean, governed, and feature-rich datasets.
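The pipeline responsibilities above typically follow a bronze/silver/gold (medallion) layering: raw ingestion, cleansing, then analytics-ready aggregation. A minimal sketch of that pattern, assuming hypothetical claim fields (`claim_id`, `amount`, `status`) — a production pipeline would express the same steps in PySpark over Delta tables rather than plain Python:

```python
# Medallion-pattern sketch: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Field names (claim_id, amount, status) are hypothetical, for illustration only.

def to_silver(bronze_rows):
    """Clean raw claim records: drop rows missing a key, normalize types."""
    silver = []
    for row in bronze_rows:
        if not row.get("claim_id"):
            continue  # reject keyless records (would land in a quarantine table)
        silver.append({
            "claim_id": str(row["claim_id"]),
            "amount": float(row.get("amount", 0) or 0),
            "status": (row.get("status") or "unknown").strip().lower(),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned claims into an analytics-ready summary by status."""
    summary = {}
    for row in silver_rows:
        summary[row["status"]] = summary.get(row["status"], 0.0) + row["amount"]
    return summary

bronze = [
    {"claim_id": "C-1", "amount": "1200.50", "status": " Open "},
    {"claim_id": None, "amount": "99"},       # dropped: no key
    {"claim_id": "C-2", "amount": 300, "status": "open"},
    {"claim_id": "C-3", "status": "Closed"},  # missing amount -> 0.0
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'open': 1500.5, 'closed': 0.0}
```

In Databricks, each layer would be a Delta table and the transforms would run as Delta Live Tables / Lakeflow Declarative Pipelines or Workflows tasks; the layering logic itself is what the sketch shows.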
Governance & Quality
  • Review solutions developed by divisional teams and vendors to ensure they use modern technology, are built to scale, are resilient, and are cost-effective to operate.
  • Implement data quality, lineage, and metadata management frameworks using tools like Unity Catalog, Collibra, or Alation.
  • Establish and enforce data security and compliance policies aligned with insurance regulations (e.g., NAIC, privacy, HIPAA, GDPR, CFIUS).
  • Ensure consistency of master data across operational and analytical systems.
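The master-data consistency responsibility above amounts to reconciling records between an operational system and the analytical store. A minimal sketch, assuming hypothetical policy records keyed by `policy_id` with illustrative `carrier` and `premium` fields:

```python
# Master-data reconciliation sketch: compare policy records between an
# operational system and an analytical store. All field names are hypothetical.

def reconcile(operational, analytical, fields=("carrier", "premium")):
    """Return keys missing from either side, plus per-field value mismatches."""
    ops = {r["policy_id"]: r for r in operational}
    ana = {r["policy_id"]: r for r in analytical}
    mismatches = []
    for pid in ops.keys() & ana.keys():
        for f in fields:
            if ops[pid].get(f) != ana[pid].get(f):
                mismatches.append((pid, f, ops[pid].get(f), ana[pid].get(f)))
    return {
        "missing_in_analytical": sorted(ops.keys() - ana.keys()),
        "missing_in_operational": sorted(ana.keys() - ops.keys()),
        "mismatches": sorted(mismatches),
    }

ops_rows = [
    {"policy_id": "P-1", "carrier": "Acme", "premium": 1000},
    {"policy_id": "P-2", "carrier": "Zen", "premium": 250},
]
ana_rows = [
    {"policy_id": "P-1", "carrier": "Acme", "premium": 1100},  # drifted premium
    {"policy_id": "P-3", "carrier": "Orb", "premium": 75},
]
report = reconcile(ops_rows, ana_rows)
print(report)
```

At scale this comparison would run as a scheduled job over governed tables (e.g., via Unity Catalog), with the mismatch report feeding the data-quality framework rather than being printed.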
Collaboration & Leadership
  • Collaborate with business leaders, data engineers, divisional data teams, data visualization experts, and the platform head to align architecture with organizational goals.
  • Mentor teams on best practices for data modeling, Databricks optimization, and cloud data architecture.
  • Evaluate new technologies to continuously improve data strategy and capabilities.
Required Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 7+ years of experience in data architecture or data engineering, with at least 4 years in Databricks.
  • Strong experience with insurance data models, including policy, claims, premium, fees, agency, underwriting, accounting, and insured domains.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and modern data lakehouse architecture.
  • Proficiency in SQL, PySpark, Delta Lake, and Databricks SQL.
  • Experience integrating with BI tools (Power BI, Tableau, Looker) and data governance tools.
  • Excellent communication and stakeholder management skills.
Preferred Skills:

  • Experience with streaming data frameworks (Kafka, Delta Live Tables).
  • Familiarity with AI/ML pipelines in Databricks.
  • Certification(s): Databricks Certified Data Engineer…
Position Requirements
10+ years work experience