ETL Team Lead
Job in Greenville, Greenville County, South Carolina, 29610, USA
Listing for: Canal Insurance Company
Full Time position
Listed on 2026-01-11
Job specializations:
- IT/Tech: Data Engineer
Job Description & How to Apply Below
Join to apply for the ETL Team Lead role at Canal Insurance Company.
Canal Insurance Company specializes in insurance for commercial trucking and specialty transportation operations. Founded in 1939 and located in Greenville, South Carolina, Canal recognizes that its success depends on the hard work and dedication of its employees. The company cultivates a culture that enables the recruitment and retention of the best talent, balancing happiness and productivity.
Culture
- Located in beautiful downtown Greenville, SC
- Career growth & advancement opportunities
- Comprehensive benefits package
- Employee referral program
- Casual dress code
- Innovation-focused & customer-centric
- 80+ years of industry expertise
- Committed to giving back to our community
- Unquestioned integrity and commitment
Benefits
- Basic & voluntary life insurance plans
- Medical, dental, & vision coverage
- Short-term & long-term disability
- 401(k) plan with company match up to 6%
- Flexible spending accounts
- Employee assistance programs
- Generous PTO plan
Production Support, Operations & Reliability
The ETL Team Lead owns end-to-end operational support for Canal’s existing data stack.
- Monitor daily ETL loads across SQL jobs, DHIC (Guidewire Data Hub and Info Center), and legacy SSIS packages.
- Collaborate with the AMS team to troubleshoot pipeline failures, performance issues, schema mismatches, permissions issues, and cloud resource failures.
- Perform root-cause analysis and implement permanent fixes.
- Ensure SLA adherence and on-time delivery of critical reporting datasets for scheduled ETL jobs.
- Provide direction for both AMS and ETL developers for legacy and current ETL maintenance.
- Refactor or retire outdated or redundant ETL processes.
- Maintain and improve existing pipelines that use the following technologies:
- Microsoft SQL Server database programming
- T‑SQL scripting
- SQL Server Integration Services
- Microsoft PowerShell
- Guidewire Data Hub and Info Center
- Oracle database programming
- Oracle PL/SQL scripting
- SAP BODS (SAP BusinessObjects Data Services)
- PostgreSQL scripting
Operational Excellence
- Assist with the creation and enhancement of operational runbooks, SOPs, monitoring dashboards, and incident response workflows.
- Partner with other IT operational segments, business SMEs, and AMS to minimize downtime and meet business SLAs.
- Improve existing processes and implement new proactive solutions for daily processing.
Business Continuity
- Ensure development support coverage for critical data pipelines (rotation-based).
- Support month‑end and quarter‑end financial reporting cycles.
- Coordinate production releases and validate deployments.
- Become the steady‑state technical owner of the entire data operations layer during the Canal modernization journey.
Technical Leadership & Collaboration
- Serve as technical lead guiding onshore/offshore developers.
- Review code, enforce best practices, and mentor junior engineers.
- Partner with Scrum Masters, Project Managers, Enterprise Architecture, QA Automation, Change Management, and AMS support teams.
Data Ingestion, ETL/ELT Development & Optimization
- Develop reusable ingestion patterns for Guidewire Data Hub and Info Center, HubSpot, telematics, and other business data sources.
- Modernize existing ETL workloads using Delta Lake, Medallion Architecture, and Fabric Lakehouse.
- Build scalable data ingestion pipelines using emerging technologies such as Azure Data Factory, Microsoft Fabric, Databricks, and Synapse Pipelines.
- Integrate internal and external data into the platform.
Real‑Time, Streaming & Event‑Driven Engineering
- Design and implement real‑time data pipelines using Event Hub, Fabric Real‑Time Analytics, Databricks Structured Streaming, and KQL‑based event processing.
- Enable real‑time operational insights and automation, including telematics alerting, FNOL automation, and fraud/VNOS/VNOP detection.
Modern Azure Data Stack Leadership
- Lead the strategy, design, and engineering of Canal’s modern Azure data ecosystem using next‑generation tools and Medallion Architecture.
- Implement Medallion Architecture (Bronze/Silver/Gold) across Fabric Lakehouse, Warehouse, Eventhouse, and KQL Database.
- Leverage Delta tables with schema enforcement, ACID compliance, and versioning.
Data Modeling, Curation & Governance
- Develop curated, analytics‑ready datasets to support Power BI, operational reporting, and advanced analytics use cases.
- Assist the Canal architect with the implementation of data governance tools.
- Establish robust data quality, validation, alerting, and observability frameworks.
AI/ML Data Enablement (Optional)
- Prepare ML‑ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights.
Seniority level: Mid‑Senior level
Employment type: Full‑time
Job function: Engineering and Information Technology