Data Engineer
FastTek Global – Okemos, Ingham County, Michigan, 48864, USA
Full Time – Listed on 2025-12-24
Job specializations:
- IT/Tech: Data Engineer
- Engineering: Data Engineer
Okemos, Michigan – Data Engineer #1042483
Job Description
We are looking for a Data Engineer to join our Data Engineering Team. The ideal candidate will have at least 3 years of experience and excellent analytical reasoning and critical thinking skills. The candidate will join a team that builds data pipelines using change data capture (CDC) mechanisms to move data from on-premises sources to cloud-based destinations, then transforms the data to make it available for customers to consume.
The Data Engineering Team also does general extraction, transformation, and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
Responsibilities
- Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
- Participates in creating data pipelines and ETL workflows to ensure that design and enterprise programming standards and guidelines are followed.
- Assists with updating the enterprise standards when gaps are identified.
- Follows technology best practices and standards and escalates any issues as deemed appropriate.
- Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
- Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures, and functions, and ETL workflows that allow data to flow from on‑premises data sources to cloud-based data platforms (e.g., Snowflake) and application platforms (e.g., Salesforce), where data may be consumed by end customers.
- Follows standard change control and configuration management practices.
- Participates in 24‑hour on‑call rotation in support of the platform.
Skills & Qualifications
- Database platforms: Snowflake, Oracle, and SQL Server
- Operating systems: Red Hat Enterprise Linux and Windows Server
- Languages and tools: PL/SQL, Python, T-SQL, StreamSets, Snowflake Cloud Data Platform, and Informatica PowerCenter, IICS, or IDMC
- Experience creating and maintaining ETL processes that use Salesforce as a destination.
- Drive and desire to automate repeatable processes.
- Excellent interpersonal skills and communication, as well as the willingness to collaborate with teams across the organization.
- Experience creating and maintaining solutions within Snowflake that involve internal file stages, procedures and functions, tasks, and dynamic tables.
- Experience creating and working with near‑real‑time data pipelines between relational sources and destinations.
- Experience working with StreamSets Data Collector or similar data streaming/pipelining tools (Fivetran, Striim, Airbyte, etc.).
Benefits
- Medical and Dental (FastTek pays the majority of the medical program)
- Vision
- Personal Time Off (PTO) Program
- Long Term Disability (100% paid)
- Life Insurance (100% paid)
- 401(k) with immediate vesting and 3% (of salary) dollar‑for‑dollar match