
Business Information Developer Consultant Senior

Job in Mason, Warren County, Ohio, 45040, USA
Listing for: Elevance Health
Part Time position
Listed on 2026-02-08
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Database Administrator
Job Description

Overview

Business Information Developer Consultant Senior

Location: This role requires associates to be in-office 1-2 days per week, fostering collaboration and connectivity while providing flexibility to support productivity and work-life balance. This approach combines structured office engagement with the autonomy of virtual work, promoting a dynamic and adaptable workplace. Alternate locations may be considered for candidates who reside within commuting distance of an office.

Please note that per our policy on hybrid/virtual work, candidates not within a reasonable commuting distance from the posting location(s) will not be considered for employment, unless an accommodation is granted as required by law.

The Business Information Developer Consultant Senior will be viewed as an expert while supporting Carelon Payment Integrity in the development and execution of data mining analyses, and will lead the onshore/offshore development team of engineers and DevOps staff to produce scalable product solutions.

Carelon Payment Integrity is a proud member of the Elevance Health family of companies. Carelon Insights, formerly Payment Integrity, is determined to recover, eliminate and prevent unnecessary medical-expense spending.

How You Will Make An Impact
  • Undertakes complex assignments requiring additional specialized technical knowledge.
  • Develops very complex and varied strategic report applications from a Data Warehouse.
  • Establishes and communicates a common goal and roadmap for the team.
  • Establishes and maintains advanced knowledge of data warehouse database design, data definitions, system capabilities, and data integrity issues.
  • Acts as a source of direction, training, and guidance for less experienced staff.
  • Monitors project schedules and costs.
  • Develops and supports very complex Data Warehouse-related applications for business areas requiring design and implementation of database tables.
  • Coordinates with offshore team and cross-functional teams to ensure that applications are properly configured, tested, and deployed.
  • Communicates effectively with multiple levels of the organization, including interpreting models and data stories, delivering presentations, and training users on the applications developed.
  • Focuses on reasoning about programming and how it can be applied to design solutions.
Minimum Requirements

Requires a BS/BA degree and a minimum of 6 years' experience, or any combination of education and experience that would provide an equivalent background.

Preferred Skills, Capabilities, And Experiences
  • 10+ years of IT experience as a Data Engineer, with expertise in designing data-intensive applications using Snowflake, cloud data engineering, data warehouse/data mart, and data quality solutions.
  • 7 years' experience in Spark with Scala and Python, 10 years in SQL, 3 years in Snowflake, and 3 years in the AWS/GCP ecosystem.
  • Experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python, Scala, and Java.
  • Familiarity with data warehouse technologies such as Edward, ODW, and GBD Facets.
  • Experience designing, building, and managing ETL data pipelines that extract, transform, and load data from sources such as CSV, Parquet, and Avro files, as well as RDBMS and data warehouse tables, generating flat files using Python, Scala, and SQL queries.
  • Experience using Spark 2.3.0 with Scala for Extract, Transform, Load (ETL) processing (see the Spark ETL sketch after this list).
  • Experience building and migrating large-scale data platforms using Snowflake (a connector-based loading sketch also follows this list).
  • Domain knowledge of US Healthcare/medical claims.
  • Hands-on experience with EC2, EMR, S3, Glue, Lambda, Athena, IAM, SQS, and other AWS services, plus Dataproc, Cloud Functions, and Cloud Composer in the GCP ecosystem; has used AWS Glue for data transformation, validation, and cleansing.
  • Experience importing and exporting data between HDFS and RDBMS using Sqoop.
  • Experience analyzing data using HiveQL and designing custom UDFs in Hive (a Spark/Scala analogue is sketched after this list).
  • Hands-on experience in Linux Shell Scripting.
  • Demonstrated experience writing complex SQL queries and analyzing databases for performance tuning.
  • Expert in building, deploying and maintaining…
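
As a rough, non-authoritative illustration of the Spark-with-Scala ETL work described above, the sketch below reads a raw CSV extract, applies basic cleansing, and writes partitioned Parquet. The S3 paths, column names, and cleansing rules are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

// Minimal Spark/Scala ETL sketch: CSV in, cleansed Parquet out.
object ClaimsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ClaimsEtl").getOrCreate()

    // Extract: read a raw claims extract (hypothetical path and layout).
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/claims/raw/")

    // Transform: drop null keys, normalize types, deduplicate.
    val cleaned = raw
      .filter(col("claim_id").isNotNull)
      .withColumn("service_date", to_date(col("service_date"), "yyyy-MM-dd"))
      .withColumn("paid_amount", col("paid_amount").cast("decimal(12,2)"))
      .dropDuplicates("claim_id")

    // Load: write partitioned Parquet for downstream reporting.
    cleaned.write
      .mode(SaveMode.Overwrite)
      .partitionBy("service_date")
      .parquet("s3://example-bucket/claims/curated/")

    spark.stop()
  }
}
```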
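For the Snowflake experience mentioned above, one common pattern is writing a Spark DataFrame through the spark-snowflake connector. This is a minimal sketch assuming the net.snowflake:spark-snowflake artifact is on the classpath; the account URL, database, warehouse, table, and credential names are all placeholders.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("SnowflakeLoad").getOrCreate()

// Connector options; every value here is a placeholder, not a real endpoint.
val sfOptions = Map(
  "sfURL"       -> "example_account.snowflakecomputing.com",
  "sfUser"      -> sys.env("SNOWFLAKE_USER"),
  "sfPassword"  -> sys.env("SNOWFLAKE_PASSWORD"),
  "sfDatabase"  -> "ANALYTICS",
  "sfSchema"    -> "CURATED",
  "sfWarehouse" -> "ETL_WH"
)

val curated = spark.read.parquet("s3://example-bucket/claims/curated/")

// Write the DataFrame to a Snowflake table via the spark-snowflake connector.
curated.write
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .option("dbtable", "CLAIMS_CURATED")
  .mode(SaveMode.Overwrite)
  .save()
```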
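Hive UDFs are typically implemented in Java against Hive's UDF API; as an adjacent sketch in this posting's Spark/Scala stack, the snippet below registers a Scala function as a SQL-callable UDF. The function name, table name, and normalization logic are illustrative assumptions.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("UdfSketch")
  .enableHiveSupport() // lets SQL read Hive-managed tables, if present
  .getOrCreate()

// Register a Scala function as a SQL UDF (analogous in spirit to a Hive UDF).
spark.udf.register("normalize_code", (code: String) =>
  Option(code).map(_.trim.toUpperCase).orNull)

// Once registered, the UDF is callable from SQL, much like a Hive UDF.
spark.sql(
  """SELECT normalize_code(procedure_code) AS procedure_code, COUNT(*) AS n
    |FROM claims
    |GROUP BY normalize_code(procedure_code)""".stripMargin
).show()
```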
Position Requirements
10+ Years work experience