
Architect and Implement a Scalable Data Warehouse Solution

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: Featmate
Full Time position
Listed on 2025-12-11
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Job Description & How to Apply Below
Position: Architect and Implement a Scalable Data Warehouse Solution
Location: Snowflake

Sep 28, 2025 - Senior

$3,000.00 Fixed

Business Overview:

We are a fintech company with a rapidly growing volume of transactional data. Our current database is struggling to handle complex analytical queries, and our data is siloed across different systems.

The Challenge:

Our current data infrastructure is not designed for advanced analytics. We cannot perform business intelligence queries efficiently, and integrating data from different sources is a manual, time-consuming process. This prevents us from making data-driven decisions.

The inability to query and analyze our data effectively is a major bottleneck for our business growth. We are missing critical insights into customer behavior and market trends, which puts us at a competitive disadvantage.

Proposed Method:

We need a senior data architect to design and implement a scalable, cloud-based data warehouse. The project involves:

  • Data Modeling: Designing a new schema optimized for analytical queries (a brief illustrative sketch follows this list).
  • ETL/ELT Pipeline: Building automated pipelines to ingest data from various sources (e.g., operational databases, APIs, logs).
  • Data Warehouse Implementation: Setting up the data warehouse using a technology such as Snowflake, BigQuery, or Redshift.
  • Data Governance: Establishing clear data quality and security standards.
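
To make the scope above concrete, here is a minimal, hypothetical sketch of a star-schema model together with an extract-and-load step. Every table and column name (dim_customer, fact_transaction, amount_usd, and so on) is an illustrative assumption rather than part of this project, and Python's built-in sqlite3 module stands in for the actual warehouse so the example runs without credentials; a production pipeline would target Snowflake, BigQuery, or Redshift through their own connectors and an orchestration tool.

```python
# Minimal sketch of a star schema plus an ELT-style load step.
# All table and column names are illustrative assumptions; sqlite3 is a
# stand-in for the target warehouse (Snowflake, BigQuery, or Redshift).
import sqlite3
from datetime import date

# Data modeling: one fact table keyed to two dimension tables (star schema).
DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key INTEGER PRIMARY KEY,       -- surrogate key
    customer_id  TEXT NOT NULL UNIQUE,      -- natural key from the source system
    segment      TEXT
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key  INTEGER PRIMARY KEY,          -- e.g. 20250928
    full_date TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS fact_transaction (
    transaction_id TEXT PRIMARY KEY,
    customer_key   INTEGER REFERENCES dim_customer(customer_key),
    date_key       INTEGER REFERENCES dim_date(date_key),
    amount_usd     REAL NOT NULL
);
"""

def extract() -> list[dict]:
    """Stand-in for pulling rows from an operational database, API, or log."""
    return [
        {"transaction_id": "t-001", "customer_id": "c-42", "segment": "retail",
         "occurred_on": date(2025, 9, 28), "amount_usd": 120.50},
    ]

def load(conn: sqlite3.Connection, rows: list[dict]) -> None:
    """Resolve dimension keys, then insert facts; idempotent on re-run."""
    for row in rows:
        date_key = int(row["occurred_on"].strftime("%Y%m%d"))
        conn.execute("INSERT OR IGNORE INTO dim_date VALUES (?, ?)",
                     (date_key, row["occurred_on"].isoformat()))
        conn.execute("INSERT OR IGNORE INTO dim_customer (customer_id, segment) "
                     "VALUES (?, ?)", (row["customer_id"], row["segment"]))
        customer_key = conn.execute(
            "SELECT customer_key FROM dim_customer WHERE customer_id = ?",
            (row["customer_id"],)).fetchone()[0]
        conn.execute("INSERT OR REPLACE INTO fact_transaction VALUES (?, ?, ?, ?)",
                     (row["transaction_id"], customer_key, date_key,
                      row["amount_usd"]))
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    load(conn, extract())
    # The kind of analytical query the star schema is optimized for.
    print(conn.execute(
        "SELECT c.segment, SUM(f.amount_usd) "
        "FROM fact_transaction f JOIN dim_customer c USING (customer_key) "
        "GROUP BY c.segment").fetchall())
```

The detail worth noting in this sketch is the surrogate key on each dimension and the INSERT OR REPLACE on the fact table, which keeps reloads idempotent, a property the automated ingestion pipelines described above would need.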

Required Experience:

At least four years of experience in data engineering or data architecture. The freelancer must have a proven track record of designing and implementing production-ready data warehouse solutions.

Required Expertise:

  • Expertise in data warehouse technologies (Snowflake, BigQuery, Redshift).
  • Mastery of ETL/ELT tools and processes.
  • Strong knowledge of SQL and data modeling techniques.
  • Experience with cloud platforms (AWS, GCP, or Azure).

Sample Work Required:

Please provide documentation or a case study of a data warehouse project you have previously worked on, including details on the architecture, technologies used, and business impact.

Freelancer Proposal:

The freelancer should submit a comprehensive proposal detailing the proposed data warehouse architecture, the ETL/ELT pipeline design, and the overall project plan and timeline. The proposal must also include a risk assessment.

Notice:
You must be logged in as a freelancer to send a proposal.
