Snowflake Data Engineer
Listed on 2026-02-16
Software Development
Data Engineer
Base pay range: $55.00/hr - $60.00/hr
Status: 6-month contract-to-hire
Pay:
Up to $60.00 per hour ($120,000 - $130,000 annually upon conversion to FTE)
Location:
Remote (must be willing to work 7:45am – 4:30pm Mountain Time). Two other engineers on the team live in Denver, and it would be ideal for the third to be there as well, since they occasionally meet for lunch; this is not required, though.
Requirements:
- US citizen with strong communication skills
- Minimum 2 years' experience working within Snowflake, or an equivalent combination of education and experience sufficient to perform the essential functions of the job
- Minimum 5 years' related experience as a Data Engineer or in a similar role
- Technical expertise with data pipelines, API management, data models, and data warehouses
- Working knowledge of programming languages (e.g., Java and Python)
- Hands‑on experience with SQL database design
- Bachelor’s degree in computer science, IT, or related field
- Demonstrated analytical skills
- Demonstrated skill in interacting and collaborating with others
- Skill in oral and written communication sufficient to discuss a variety of job-related topics and to effectively communicate complex topics to a variety of audiences
- Skill in utilizing a systematic approach to problem solving
- Skill in researching information to gain knowledge to apply to business challenges
- Skill in advising and guiding individuals to achieve results
The Data Engineer builds and optimizes the association’s data and data pipeline architecture, supporting multiple teams, systems, and projects. Staying up-to-date with data engineering tools and technologies, including cloud-based data services, is essential for this position.
Data Storage:
- Designs, optimizes, and maintains databases for efficient data storage and retrieval.
- Manages data warehouses or data lakes to ensure accessibility and reliability of data.
- Develops and maintains data models and schemas that support analytics and reporting.
- Manages our Snowflake instance to provide business and regulatory reporting on our portfolio and ancillary services, from initial contact through post-loan closure.
- Builds and maintains data pipelines to move, transform, and load data from various sources to a centralized repository.
- Optimizes data infrastructure and pipelines for speed, scalability, and cost-effectiveness.
- Designs, publishes, documents, monitors, secures, and analyzes Application Programming Interfaces (APIs).
- Creates ETL (Extract, Transform, Load) processes to clean, transform, and prepare data for analysis.
- Ensures data completeness, integrity, and security through validation, monitoring, and governance practices.
- Normalizes data to eliminate duplication and ensure a single source of truth.
- Works closely with stakeholders to understand data needs and provide access to relevant data.
- Creates documentation and provides support to help others understand and use the data infrastructure effectively.
- Appropriately protects the confidentiality, security, and integrity of the Association, employees, borrowers, and other stakeholders.
Seniority level:
Mid‑Senior level
Employment type:
Contract
Job function:
Engineering and Information Technology
Industries:
Banking, Financial Services, and Farming
Benefits:
- Medical insurance
- Vision insurance
- 401(k)
- Tuition assistance