Senior Data Engineer
Job in Charlotte, Mecklenburg County, North Carolina, 28245, USA
Listed on 2026-02-03
Listing for: AssetMark
Full Time position
Job specializations:
* IT/Tech: Data Engineer, Data Science Manager, Cloud Computing
Job Description & How to Apply Below
Job Description:

AssetMark is a leading strategic provider of innovative investment and consulting solutions serving independent financial advisors. We provide investment, relationship, and practice management solutions that advisors use in helping clients achieve wealth, independence, and purpose.
The Job/What You'll Do:

The Senior Data Engineer / Technical Lead is a pivotal, hands-on leadership role responsible for the end-to-end design, governance, and operational excellence of AssetMark's data platform. This role is a strategic blend of deep technical architecture and team enablement, serving as the bridge between business needs and production-grade data systems. The focus is on driving highly scalable solutions and pioneering the integration of AI/ML models into our data ecosystem.

We can consider candidates for this position who are able to accommodate a hybrid work schedule and are close to our Charlotte, NC office.
Responsibilities:

**I. Data Architecture & Strategic Design**

* **Platform Leadership:** Define, champion, and drive the technical vision for our modern data architecture on Azure and Snowflake. This includes making key decisions on Lakehouse patterns, data modeling methodologies (Dimensional, Data Vault), and the strategic use of services like Azure Synapse and Azure Data Factory.
* **End-to-End Design:** Lead the architectural design and implementation of highly scalable and resilient ELT/ETL pipelines, ensuring optimal performance for mission-critical financial workloads.
* **Build vs. Buy:** Provide expert technical guidance and contribute to the evaluation and selection of new data tools and frameworks (e.g., orchestration, observability, vector databases).
* **Cost Optimization:** Drive FinOps practices within the data platform, focusing on optimizing Snowflake compute usage, storage costs on Azure, and overall cost-per-query efficiency.
**II. Engineering Excellence & Team Leadership**

* **Hands-on Coding & Delivery:** Serve as a hands-on technical leader by writing, optimizing, and reviewing complex code primarily in Python and SQL. Directly contribute to the most challenging parts of data pipeline development.
* **Standards & Governance:** Define, document, and enforce engineering best practices, architectural design patterns, and coding standards across the data team.
* **Code Review / PR Process Ownership:** Oversee the code review process, providing constructive, high-quality technical feedback to ensure that all committed code is scalable, secure, maintainable, and aligns with the defined vision.
* **Mentorship:** Actively mentor and coach junior and mid-level data engineers on technical depth, debugging complex distributed systems, and modern data stack methodologies.
* **CI/CD & DevOps:** Lead the integration of data solutions into CI/CD pipelines (e.g., Azure DevOps, GitHub Actions), ensuring robust testing, deployment automation, and operational readiness.
**III. Data Governance & Reliability (DataOps)**

* **Data Quality & Observability:** Own the strategy and implementation of Data Observability solutions (like Monte Carlo) to proactively monitor the health, freshness, volume, and lineage of all production datasets.
* **Data Lineage & Cataloging:** Ensure comprehensive data lineage is captured and maintained to support transparency, auditing, and impact analysis across the platform.
* **Security & Compliance:** Collaborate closely with security and compliance teams to design and implement rigorous data governance policies, including PII masking, data tokenization, and Role-Based Access Control (RBAC) specific to financial data.
* **SLA Management:** Define, monitor, and enforce data Service Level Agreements (SLAs) and Service Level Objectives (SLOs) for critical data assets, and lead blameless post-mortems following any data incident.
**IV. AI/ML Enablement & Innovation**

* **AI Data Strategy:** Partner with Data Science and Product teams to architect the necessary data flows and infrastructure to support AI/ML model training, inference, and MLOps.
* **GenAI Integration:** Provide…
Position Requirements:
10+ years work experience