Data Engineer
Location: Houston, Harris County, Texas, 77246, USA
Company: FloWorks International LLC
Position: Full Time
Listed on: 2026-01-26
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Cloud Computing
Job Description
Overview
FloWorks is a leading, privately held specialty industrial supplier of pipe, valves, fittings, and related products, as well as a provider of technical solutions to the energy and industrial sectors. Headquartered in Houston, Texas, FloWorks is dedicated to delivering exceptional products, expertise, and service to its customers.
Job Information
As the Data Engineer, you are responsible for building, managing, and optimizing cloud-based data solutions that support business processes and analytics across the organization. This role requires expertise in ETL development, data modeling, cloud technologies, and business intelligence to ensure data integrity, insightful analysis, and effective reporting.
Responsibilities

Data Analytics
- Build and manage cloud ETL processes for data extraction, transformation, and loading from multiple sources into data lakes and data warehouses, primarily within Microsoft Fabric.
- Apply business rules, audit, and stage data to ensure data integrity and compliance.
- Develop fact and dimension tables to support Power BI report development and other business intelligence needs.
- Create visualizations and reports to meet business requirements and support decision-making.
- Provide business analysis, problem-solving, and creativity to identify KPIs and metrics that drive business goals.
- Ensure timely and accurate performance on assigned projects, maintaining compliance with project budgets and deadlines.
- Proactively engage in projects, recognize and resolve problems, and implement solutions independently.
- Collaborate with cross-functional teams to gather complete datasets and communicate findings company-wide.
- Train stakeholders on best practices for data reporting and self-service analytics.
- Integrate data across systems using REST and SOAP APIs, including authentication (OAuth), API keys, pagination, rate limits, retries, and error handling (see the first sketch after this list).
- Design and manage scalable data ingestion pipelines that pull data from SaaS platforms such as Salesforce, NetSuite, Workday, or ERP systems.
- Build and maintain internal data APIs to support curated datasets for analysts and applications.
- Translate API payloads (JSON, XML) into internal data models for loading into data lakes, warehouses, or MDM environments.
- Support Master Data Management (MDM) processes by integrating and synchronizing core business entities across platforms, ensuring consistent, governed, and high-quality master data throughout the ecosystem.
- Implement middleware workflows using platforms such as Azure Logic Apps, Azure Data Factory, MuleSoft, Boomi, or Informatica Cloud.
- Develop event-driven integrations using messaging systems such as Azure Service Bus, Kafka, or RabbitMQ (see the second sketch after this list).
- Build end-to-end orchestration workflows that call APIs, transform data, and deliver it to destination systems.
- Apply integration patterns such as ETL/ELT, Change Data Capture (CDC), event streaming, batch vs. real-time ingestion, and data synchronization.
- Ensure secure API access (OAuth flows, token refresh logic) and apply governance practices for PII, auditability, and cross-platform data flows.
- Support cloud-aligned practices including automated deployment of API connectors or middleware workflows using CI/CD.
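To make the API-integration bullet concrete, here is a minimal Python sketch of the pull pattern described above: an OAuth client-credentials token fetch, cursor-based pagination, and retry/backoff handling for rate limits and transient errors. The endpoint URLs and the payload fields (items, next_page_url) are hypothetical placeholders, not a real vendor API.

```python
import time
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"   # hypothetical OAuth endpoint
API_URL = "https://api.example.com/v1/orders"        # hypothetical SaaS resource

def get_token(client_id: str, client_secret: str) -> str:
    """Client-credentials flow; returns a bearer token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_all(token: str, max_retries: int = 5):
    """Walk a paginated endpoint, honoring rate limits and retrying transient errors."""
    url, params = API_URL, {"page_size": 200}
    while url:
        for attempt in range(max_retries):
            resp = requests.get(url, params=params,
                                headers={"Authorization": f"Bearer {token}"},
                                timeout=30)
            if resp.status_code == 429:                  # rate limited: honor Retry-After
                time.sleep(int(resp.headers.get("Retry-After", 2 ** attempt)))
                continue
            if resp.status_code >= 500:                  # transient server error: back off
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()                      # fail fast on 4xx client errors
            break
        else:
            raise RuntimeError(f"Giving up on {url} after {max_retries} attempts")
        body = resp.json()
        yield from body["items"]                         # hypothetical payload shape
        url, params = body.get("next_page_url"), None    # hypothetical cursor field
```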
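And a minimal sketch of the event-driven pattern, using the azure-servicebus SDK to consume queue messages, translate the JSON body, and acknowledge only after a successful load. The connection string, queue name, and load_to_staging loader are hypothetical stand-ins.

```python
import json
from azure.servicebus import ServiceBusClient

CONN_STR = "<service-bus-connection-string>"   # placeholder credential
QUEUE = "erp-events"                           # hypothetical queue name

def load_to_staging(event: dict) -> None:
    """Stub loader: in practice, write the event to a lakehouse staging table."""
    print(event)

def run_consumer() -> None:
    """Consume change events, settle each message based on the load outcome."""
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_receiver(queue_name=QUEUE) as receiver:
            for msg in receiver:
                try:
                    event = json.loads(str(msg))       # message body as JSON
                    load_to_staging(event)
                    receiver.complete_message(msg)     # ack on success
                except Exception:
                    receiver.abandon_message(msg)      # return to queue for redelivery
```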
Requirements
- Bachelor's degree in technology, mathematics, statistics, accounting, finance, or a related quantitative discipline.
- Over 5 years of experience in data analytics, with strong Microsoft Fabric skills.
- In-depth, hands-on experience with system integrations using Boomi is required for this position.
- Expert in SQL (a must-have skill).
- Highly experienced in cloud technologies, with a strong preference for Microsoft Fabric, DBT, and Azure. Experience with Databricks, Snowflake, and AWS may also be considered.
- Proficient in Python programming and data modeling using the Kimball Method (star schema); see the sketch after this list.
- Skilled in analytical and visualization tools, with a strong preference for Power BI. Experience with Tableau may also be considered.
- Experience with, and a passion for, training data science and machine learning models.
- Familiarity with Git and source control concepts.
- Experience with Databricks, Airflow, Python, AI, AWS, and data integrations with ERPs or Salesforce is a plus.
- Ability to work with Azure DevOps and cross-functional teams to define analytics use cases and translate them into technical solutions.
- Strong intellectual and analytical curiosity, adaptability, and independence.
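As a small illustration of the Kimball-style modeling referenced above, the sketch below derives a customer dimension with surrogate keys and a sales fact table from a single staged extract. The table and column names are made up for the example.

```python
import pandas as pd

# Raw extract (stand-in for a staged source table).
raw = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer": ["Acme", "Beta Corp", "Acme"],
    "region":   ["TX", "OK", "TX"],
    "amount":   [250.0, 125.5, 980.0],
})

# Dimension: one row per distinct customer, with a generated surrogate key.
dim_customer = (raw[["customer", "region"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus foreign keys into the dimension (star schema).
fact_sales = (raw.merge(dim_customer, on=["customer", "region"])
                 [["order_id", "customer_key", "amount"]])

print(dim_customer)
print(fact_sales)
```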
Physical Requirements
- Frequently required to stand
- Frequently required to walk
- Continually required to sit
- Continually required to utilize hand and finger dexterity
- Occasionally balance, bend, stoop, kneel, or crawl
- Continually required to talk or hear
- Continually required to utilize visual acuity to read technical information and/or use a keyboard
- Occasionally required to lift/push/carry items up to 25 pounds
- Occasionally work near moving mechanical parts
- Occasional exposure to outside weather conditions
- Occasional loud noise (examples: shop tool noises, …