
Senior Data Engineer

Job in Westlake, Cuyahoga County, Ohio, 44145, USA
Listing for: Fidelity Investments Inc.
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Data Warehousing
Salary/Wage Range or Industry Benchmark: 60,000 to 80,000 USD per year

Job Description:

Position Description:

Develops and implements data management and reporting services tools, platforms, and enhancements. Participates in designing and crafting modern platforms in the Cloud to support company products for global markets. Creates, develops, and maintains various metrics, reports, dashboards, and Business Intelligence (BI) solutions. Builds scalable patterns for data consumption from Cloud-based data lakes by leveraging Cloud technologies and DevOps concepts, including Continuous Integration/Continuous Delivery (CI/CD) pipelines.

Develops and maintains comprehensive reporting solutions to provide actionable insights and support data-driven decision-making for the team and stakeholders.

Primary Responsibilities:
  • Identifies opportunities for new development within a scalable public Cloud environment.
  • Collaborates and partners with product owners and development teams to translate business requirements into actionable tasks, ensuring cross-functional teams are informed throughout the project lifecycle.
  • Works closely with business partners and other system partners, serving as a developer for new tools and implementation projects.
  • Analyzes and manipulates datasets in line with business requirements to ensure data integrity, accessibility, and security throughout the entire data lifecycle.
  • Applies data engineering, data warehousing, and analytics technologies to data application development, data integration, and data pipeline design patterns on a distributed platform.
  • Collaborates with development teams to integrate automated testing into the sprint cycle.
  • Performs continuous testing and validation of new features and functionalities.
  • Analyzes complex requirements, collaborates with developers to design efficient solutions, and creates prototypes to validate proposed solutions and mitigate technical risks.
Education and Experience:

Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing.

Or, alternatively, Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and one (1) year of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing.

Skills and Knowledge:

Candidate must also possess:

  • Demonstrated Expertise (“DE”) implementing ETL and Data Integration (DataStage with UNIX); designing and optimizing ETL workflows, using IBM DataStage, Mainframe files, databases (DB2 and Snowflake), and flat files; automating and integrating tasks, using UNIX shell scripting with DataStage jobs; implementing effective data transformation and performance tuning, using DataStage and UNIX; and orchestrating multistage data pipelines to ensure data accuracy and automated data profiling, using DataStage and UNIX (a wrapper sketch of this pattern follows this list).
  • DE designing and implementing data workflows, using Apache Airflow (migrating TWS and Control-M to improve scaling and automation); coordinating the execution of DataStage and Snowflake jobs, using UNIX scripts; and automating Snowflake features (Snowpipe and stored procedures) and integrating them into workflows with task parallelization and dependency tracking, using Directed Acyclic Graphs (DAGs) to ensure audit logging (see the DAG sketch below).
  • DE implementing Cloud Data Warehousing to store, process, and analyze data, using Snowflake; designing complex SQL and PL/SQL, and optimizing complex data transformations, analytics, and aggregations, using Common Table Expressions (CTEs), pivoting, and window functions to handle data processing; developing data ingestion pipelines, using Snowflake with Airflow and ETL scripts; and implementing Change Data Capture (CDC) strategies for incremental loads and optimizing bulk data ingestion on Cloud… (see the CDC sketch below).
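
To make the first bullet concrete, here is a minimal sketch of driving a DataStage job from a UNIX-side Python wrapper via the `dsjob` command-line interface. The install path, project name, and job name are hypothetical, and the exact `dsjob` flags available can vary by DataStage version:

```python
"""Sketch only: running an IBM DataStage job from a Python/UNIX wrapper.
Paths, project, and job names below are assumptions, not from the posting."""
import subprocess
import sys

DSJOB = "/opt/IBM/InformationServer/Server/DSEngine/bin/dsjob"  # assumed install path
PROJECT = "FIN_DW"        # hypothetical DataStage project
JOB = "LoadClaimsDaily"   # hypothetical DataStage job


def run_datastage_job(project: str, job: str) -> int:
    """Run a DataStage job and block until it finishes.

    With -jobstatus, dsjob waits for completion and its exit code reflects
    the job's final status, so a non-zero return signals a failed run.
    """
    result = subprocess.run(
        [DSJOB, "-run", "-mode", "NORMAL", "-jobstatus", project, job],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    return result.returncode


if __name__ == "__main__":
    sys.exit(run_datastage_job(PROJECT, JOB))
```

A wrapper like this is what typically gets scheduled or chained in shell scripts, which is the automation-and-integration pattern the bullet describes.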
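For the second bullet, a minimal Airflow DAG showing dependency tracking between a DataStage step and a Snowflake load. Script paths, the SnowSQL connection name, and the DAG id are hypothetical; the `schedule` parameter assumes Airflow 2.4+ (older versions use `schedule_interval`):

```python
"""Sketch only: an Airflow DAG coordinating a DataStage job and a Snowflake
load. All names (scripts, connection, tables) are assumptions."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="finance_dw_daily",       # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Kick off the DataStage extract via a UNIX wrapper script, as the
    # posting describes (script path is an assumption).
    run_datastage = BashOperator(
        task_id="run_datastage_extract",
        bash_command="/opt/etl/bin/run_dsjob.sh FIN_DW LoadClaimsDaily",
    )

    # Trigger the Snowflake load; shown here via the SnowSQL CLI for
    # simplicity. A Snowflake provider operator or a stored procedure
    # call could serve the same role.
    load_snowflake = BashOperator(
        task_id="load_snowflake",
        bash_command="snowsql -c etl -f /opt/etl/sql/merge_claims.sql",
    )

    # The DAG edge is the dependency tracking: the Snowflake load starts
    # only after the DataStage extract succeeds.
    run_datastage >> load_snowflake
```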
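And for the third bullet, one plausible shape of a CDC-style incremental load into Snowflake: a CTE with a window function deduplicates late-arriving change records, and a MERGE applies them to the target. Table and column names, the watermark scheme, and the connection parameters are all hypothetical:

```python
"""Sketch only: an incremental (CDC-style) Snowflake load using MERGE with
a CTE and ROW_NUMBER() dedup. Tables, columns, and credentials are assumptions."""
import snowflake.connector  # pip install snowflake-connector-python

MERGE_SQL = """
MERGE INTO claims AS tgt
USING (
    -- CTE + ROW_NUMBER() keeps only the latest change per claim key.
    WITH ranked AS (
        SELECT
            claim_id,
            amount,
            change_ts,
            ROW_NUMBER() OVER (
                PARTITION BY claim_id ORDER BY change_ts DESC
            ) AS rn
        FROM claims_changes              -- hypothetical CDC/staging table
        WHERE change_ts > %(watermark)s  -- incremental window
    )
    SELECT claim_id, amount, change_ts FROM ranked WHERE rn = 1
) AS src
ON tgt.claim_id = src.claim_id
WHEN MATCHED THEN UPDATE SET
    tgt.amount = src.amount,
    tgt.change_ts = src.change_ts
WHEN NOT MATCHED THEN INSERT (claim_id, amount, change_ts)
    VALUES (src.claim_id, src.amount, src.change_ts);
"""


def incremental_load(watermark: str) -> None:
    # Connection parameters are placeholders; in practice credentials come
    # from a secrets manager or an Airflow connection.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",
        warehouse="ETL_WH", database="FIN_DW", schema="PUBLIC",
    )
    try:
        conn.cursor().execute(MERGE_SQL, {"watermark": watermark})
    finally:
        conn.close()
```

Tracking the watermark (e.g., the max `change_ts` already applied) is what makes each run load only the increment rather than a full refresh.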
Position Requirements
10+ years of work experience