
Data Warehouse Engineer

Job in Tulsa, Tulsa County, Oklahoma, 74145, USA
Listing for: Tulsa Community Foundation
Full Time position
Listed on 2026-02-16
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD yearly
Job Description & How to Apply Below

Overview

Tulsa For You and Me is a collection of talent and economic development programs building a vibrant, inclusive city and a booming economic landscape for Tulsa. We forge strategic connections between talented professionals and innovative companies in future-focused industries. Operated under the stewardship of the George Kaiser Family Foundation® (GKFF®), affiliated programs include Tulsa Remote, LLC, Tulsa Innovation Labs, LLC, inTulsa Initiative, LLC, Build In Tulsa, LLC, Atlas School, LLC, the Campus Tulsa program, and the Tulsa Service Year program, among others.

Our North Star goal is both ambitious and clear: to attract and build 20,000 jobs in high-growth industry clusters by 2033, with at least one-third of opportunities accessible to Tulsans from historically underrepresented communities. This is part of a broader vision for a more equitable, innovative, and vibrant Tulsa, where economic health supports equal opportunity. The Integrated Strategies group is composed of teams focused on Technology, Research & Analytics, Marketing, and Operations.

Join our high-impact team, where your work will leave a lasting mark on the Tulsa community and its future.

Primary Purpose And Functions

In this position, you will play a crucial role in managing and optimizing our data processes within our emerging Data Warehouse/Data Lake environments. A strong foundation in Python programming and ETL systems and tools, along with a solid understanding of relational databases, is required. You will work to ensure the efficiency and reliability of our data pipelines, contributing directly to project success. You will also maintain the back-end hardware and platforms that enable these applications and systems to run harmoniously within the Research and Data Analytics department.

Azure experience is a strong plus.

Requirements

Essential Functions and Responsibilities
  • Monitor performance and troubleshoot issues within ETL projects, proposing and implementing effective solutions to optimize system performance.
  • Collaborate with data engineers, analysts, and other stakeholders to integrate new data sources and ensure seamless data flow and accessibility across the organization.
  • Assist in the development and testing of new features, models, or data processes within our data warehouses, leveraging feedback to drive continuous improvement.
  • Document data processes, models, and workflows within our data warehouses for internal knowledge sharing and to support ongoing learning and development.
  • Stay updated on the latest data warehouse management features and best practices, applying this knowledge to innovate and improve project outcomes.
  • Implement and maintain security protocols and data governance practices to ensure compliance with organizational policies and industry regulations.
  • Evaluate and recommend new tools, technologies, and methodologies to enhance data processing and storage capabilities.
  • Collaborate with data engineers, analysts, and other stakeholders on data flows, scripts, and models within Dataiku and other ETL systems, optimizing them for performance and reliability.
  • Collaborate with data engineers, analysts, and other stakeholders to ensure integrity and quality of data throughout the entire data lifecycle, from ingestion to analysis, by implementing data validation and cleansing processes.
  • Other duties as assigned.
Knowledge, Experience, and Skill Requirements

Education & Experience
  • 4+ years of experience managing data warehouses and ETL systems.
Preferred
  • 4-5 years of experience working with Azure Data Warehouses.
  • 2-3 years of experience with version control systems, such as Git.
  • 2-3 years of experience with other data science and machine learning tools and frameworks (e.g., scikit-learn, TensorFlow, PyTorch, Java, Apex).
Skills & Abilities
  • Strong abilities in creating and optimizing data warehouses in Azure, with knowledge of AWS and GCP.
  • Conceptual understanding of relational databases and data querying.
  • Strong proficiency in writing and modifying database-centric APIs.
  • Familiarity with Dataiku and other ETL systems or a willingness to learn and master the platform.
  • Excellent problem-solving skills and attention to detail.
  • Strong proficiency in cloud…