Data Warehouse Engineer
Listed on 2026-02-16
IT/Tech
Data Engineer, Data Analyst
Overview
Tulsa For You and Me is a collection of talent and economic development programs building a vibrant, inclusive city and a booming economic landscape for Tulsa. We forge strategic connections between talented professionals and innovative companies in future-focused industries. Operated under the stewardship of the George Kaiser Family Foundation® (GKFF®), affiliated programs include Tulsa Remote, LLC, Tulsa Innovation Labs, LLC, in Tulsa Initiative, LLC, Build In Tulsa, LLC, Atlas School, LLC, the Campus Tulsa program, and the Tulsa Service Year program, among others.
Our North Star goal is both ambitious and clear: to attract and build 20,000 jobs in high-growth industry clusters by the year 2033. A cornerstone of this mission is our commitment to ensuring that at least one-third of these opportunities are accessible to Tulsans from historically underrepresented communities. This is an extension of our broader vision for a more equitable, innovative, and vibrant Tulsa, where strong economic health provides the basis for equal opportunity.
The Integrated Strategies group is composed of several expert teams dedicated to fostering cross-program collaboration, driving operational excellence, and equipping Tulsa For You programs with the tools, insights, and best practices to deliver tangible results. The Integrated Strategies group includes teams focused on Technology, Research & Analytics, Marketing, and Operations. Join our high-impact team, where your work will leave a lasting mark on the Tulsa community and its future.
PRIMARY PURPOSE AND FUNCTIONS
In this position, you will play a crucial role in managing and optimizing our data processes within our emerging Data Warehouse/Data Lake environments. The role requires a strong foundation in Python programming, ETL systems and tools, and a solid understanding of relational databases. You will work to ensure the efficiency and reliability of our data pipelines, contributing directly to the success of our projects.
You will also maintain the back-end hardware and platforms that enable these applications and systems to run smoothly, working in close partnership with the Research and Data Analytics department. Azure experience is a strong plus.
- Monitor performance and troubleshoot issues within ETL projects, proposing and implementing effective solutions to optimize system performance.
- Collaborate with data engineers, analysts, and other stakeholders to integrate new data sources and ensure seamless data flow and accessibility across the organization.
- Assist in the development and testing of new features, models, or data processes within our data warehouses, leveraging feedback to drive continuous improvement.
- Document data processes, models, and workflows within our data warehouses for internal knowledge sharing and to support ongoing learning and development.
- Stay updated on the latest data warehouse management features and best practices, applying this knowledge to innovate and improve project outcomes.
- Implement and maintain security protocols and data governance practices to ensure compliance with organizational policies and industry regulations.
- Evaluate and recommend new tools, technologies, and methodologies to enhance data processing and storage capabilities.
- Collaborate with data engineers, analysts, and other stakeholders to build and maintain data flows, scripts, and models within Dataiku and other ETL systems, optimizing for performance and reliability.
- Collaborate with data engineers, analysts, and other stakeholders to ensure integrity and quality of data throughout the entire data lifecycle, from ingestion to analysis, by implementing data validation and cleansing processes.
- Other duties as assigned.
Education & Experience:
- 4+ years of experience managing data warehouses and ETL systems.
Preferred:
- 4-5 years of experience working with Azure data warehouses.
- 2-3 years of experience with version control systems such as Git.
- 2-3 years of experience with other data science and machine learning tools, frameworks, and languages (e.g., scikit-learn, TensorFlow, PyTorch, Java, Apex).
Skills & Abilities:
- Strong abilities in creating and optimizing data…