Senior Bigdata Consultant - Onsite, OH
Location: Columbus, Franklin County, Ohio, 43224, USA
Listing for: Ocean Blue Solutions Inc
Full Time position, listed on 2025-10-08
Job specializations:
- IT/Tech: Data Engineer, Database Administrator, Data Warehousing
Job Description
Senior Bigdata Consultant - Onsite, OH at Ocean Blue Solutions Inc
Overview
Client: State of Ohio. This is an in-person role located at 50 W. Town Street, Columbus, Ohio 43215. Work hours: Monday–Friday, 8:00 AM to 5:00 PM EST. Submission due date: 08/15/2025.
Responsibilities
- Participate in team activities, design discussions, stand-up meetings, and planning reviews.
- Provide Snowflake database technical support to develop reliable, efficient, and scalable solutions for various projects on Snowflake.
- Ingest existing data, frameworks, and programs from the ODM EDW IOP Big Data environment to the ODM EDW Snowflake environment using best practices.
- Design and develop Snowpark features in Python; understand requirements and iterate.
- Interface with the open-source community and contribute to Snowflake’s open-source libraries including Snowpark Python and the Snowflake Python Connector.
- Create, monitor, and maintain role-based access controls, virtual warehouses, tasks, Snowpipes, and streams on Snowflake databases to support different use cases (a minimal sketch of this kind of DDL appears after this list).
- Performance tune Snowflake queries and procedures; document Snowflake best practices.
- Explore new Snowflake capabilities, perform proofs of concept, and implement them based on business requirements.
- Create and maintain Snowflake technical documentation, ensuring compliance with data governance and security policies.
- Implement Snowflake user/query log analysis, history capture, and user email alert configuration.
- Enable data governance in Snowflake, including row/column-level data security using secure views and dynamic data masking (see the masking-policy sketch after this list).
- Perform data analysis, profiling, quality checks, and data ingestion in various layers using big data/Hadoop/Hive/Impala, PySpark, and UNIX shell scripts (a basic PySpark profiling sketch follows this list).
- Follow organizational coding standards; create mappings, sessions, and workflows per mapping specifications.
- Perform gap and impact analysis of ETL/IOP jobs for new requirements and enhancements.
- Create mock data, perform unit testing, and capture result sets for lower-environment jobs.
- Update production support run books and control schedules per production releases.
- Create and update design documents; provide detailed workflow descriptions after every production release.
- Continuously monitor production data loads; fix issues, update trackers, and identify performance issues.
- Perform performance tuning of long-running ETL/ELT jobs with partitioning and other best practices.
- Perform QA checks, reconcile data after loads, and coordinate with vendors on data fixes.
- Participate in ETL/ELT code reviews and design reusable frameworks.
- Create change requests, work plans, test results, and BCAB checklists for production deployments, and perform post-deployment validation.
- Collaborate with Snowflake, Hadoop, ETL, and SAS administrators for deployments and health checks.
- Develop reusable frameworks for audit, balance, and control; capture reconciliation results; and provide a single point of reference for workflows.
- Develop Snowpark and PySpark programs to ingest historical and incremental data (see the Snowpark ingestion sketch after this list).
- Create Sqoop scripts to ingest historical data from the EDW Oracle database to Hadoop IOP; create Hive tables and Impala views for dimension tables.
- Participate in meetings to upgrade functional and technical expertise.
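The Snowflake administration duties above (role-based access controls, virtual warehouses, streams, Snowpipes, and tasks) typically come down to a handful of DDL statements. Below is a minimal sketch issued through snowflake-connector-python; the connection parameters and object names (reporting_wh, analyst_role, claims_stream, claims_pipe, and so on) are placeholders, not details from this posting.

```python
# Minimal sketch: creating a warehouse, role grant, stream, pipe, and task on
# Snowflake via snowflake-connector-python. All names and credentials are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    role="SYSADMIN",
    database="EDW",
    schema="STAGING",
)

statements = [
    # Virtual warehouse sized for routine loads, suspending when idle.
    """CREATE WAREHOUSE IF NOT EXISTS reporting_wh
       WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE""",
    # Role-based access control: a role that can use the warehouse.
    "CREATE ROLE IF NOT EXISTS analyst_role",
    "GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_role",
    # Stream to capture change records on a staging table.
    "CREATE STREAM IF NOT EXISTS claims_stream ON TABLE EDW.STAGING.CLAIMS",
    # Snowpipe that loads files landing in a (hypothetical) external stage.
    """CREATE PIPE IF NOT EXISTS claims_pipe AUTO_INGEST = TRUE
       AS COPY INTO EDW.STAGING.CLAIMS FROM @claims_stage""",
    # Task that periodically moves captured changes into the curated layer.
    """CREATE TASK IF NOT EXISTS load_claims_task
       WAREHOUSE = reporting_wh SCHEDULE = '15 MINUTE'
       AS INSERT INTO EDW.CURATED.CLAIMS SELECT * FROM claims_stream""",
]

cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
cur.close()
conn.close()
```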
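For the row/column-level security item, a common pattern is a masking policy on sensitive columns plus a secure view that filters rows by the caller's role. The sketch below assumes hypothetical objects (an SSN column, a PII_ADMIN role, a role-to-county mapping table); none of these are taken from the ODM environment.

```python
# Sketch: dynamic data masking and a secure view for row-level filtering.
# Table, column, and role names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="your_user",
                                    password="your_password", role="SECURITYADMIN")
cur = conn.cursor()

# Column-level security: mask SSNs for everyone except a privileged role.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS EDW.CURATED.SSN_MASK AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE 'XXX-XX-XXXX' END
""")
cur.execute("ALTER TABLE EDW.CURATED.MEMBERS MODIFY COLUMN SSN "
            "SET MASKING POLICY EDW.CURATED.SSN_MASK")

# Row-level security: a secure view exposing only counties mapped to the caller's role.
cur.execute("""
    CREATE OR REPLACE SECURE VIEW EDW.CURATED.MEMBERS_V AS
    SELECT member_id, county_code, ssn
    FROM EDW.CURATED.MEMBERS
    WHERE county_code IN (
        SELECT county_code FROM EDW.CURATED.ROLE_COUNTY_MAP
        WHERE role_name = CURRENT_ROLE()
    )
""")
cur.close()
conn.close()
```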
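The Snowpark/PySpark ingestion duty usually splits into a one-time historical load and a repeatable incremental merge. A minimal Snowpark Python sketch follows, assuming a hypothetical stage, business key, and table names.

```python
# Sketch: historical load from a stage plus incremental MERGE with Snowpark Python.
# Stage, table, and column names are illustrative only.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import when_matched, when_not_matched
from snowflake.snowpark.types import StructType, StructField, StringType, DecimalType

session = Session.builder.configs({
    "account": "your_account", "user": "your_user", "password": "your_password",
    "role": "SYSADMIN", "warehouse": "LOAD_WH", "database": "EDW", "schema": "STAGING",
}).create()

# Historical load: read CSV files already uploaded to an internal stage.
schema = StructType([
    StructField("CLAIM_ID", StringType()),
    StructField("AMOUNT", DecimalType(12, 2)),
])
hist_df = session.read.schema(schema).csv("@claims_stage/history/")
hist_df.write.mode("overwrite").save_as_table("EDW.CURATED.CLAIMS")

# Incremental load: merge the latest staging rows into the curated table by key.
target = session.table("EDW.CURATED.CLAIMS")
source = session.table("EDW.STAGING.CLAIMS_DELTA")
target.merge(
    source,
    target["CLAIM_ID"] == source["CLAIM_ID"],
    [
        when_matched().update({"AMOUNT": source["AMOUNT"]}),
        when_not_matched().insert({"CLAIM_ID": source["CLAIM_ID"],
                                   "AMOUNT": source["AMOUNT"]}),
    ],
)
session.close()
```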
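On the Hadoop side, the profiling and quality-check duty typically reduces to a few PySpark aggregations over the Hive tables. A rough sketch, with the database and key column names invented for illustration:

```python
# Sketch: basic profiling and quality checks on a Hive table with PySpark.
# Database, table, and key column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("claims_profiling")
         .enableHiveSupport()
         .getOrCreate())

df = spark.table("edw_stg.claims")

# Row count for load reconciliation against the source extract.
print("row count:", df.count())

# Null counts per column, a quick completeness profile.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show(truncate=False)

# Duplicate-key check on the assumed business key.
dupes = df.groupBy("claim_id").count().filter(F.col("count") > 1)
print("duplicate keys:", dupes.count())

spark.stop()
```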
Required Skills and Experience
- Proficiency in data warehousing, data migration, and Snowflake.
- Experience in implementing, executing, and maintaining data integration technology solutions.
- Minimum 4–6 years of hands-on experience with cloud databases.
- Minimum 2–3 years of hands-on data migration experience from Big Data to Snowflake.
- Minimum 2–3 years of hands-on Snowflake experience including Snowpipe and Snowpark.
- Strong experience with SnowSQL, PL/SQL, and writing Snowflake procedures using SQL, Python, or Java (a minimal Python procedure sketch follows this list).
- Experience optimizing Snowflake performance and real-time monitoring.
- Strong database architecture, analytical, and problem-solving abilities.
- Experience with AWS services.
- Snowflake Certification is highly desirable.
- Snowpark with Python is preferred for building data pipelines.
- 8+ years of experience with Big Data, Hadoop, and data warehousing/integration projects.
- Extensive ETL/ELT development experience with Cloudera technologies (8–9…
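As a rough illustration of the "procedures using SQL/Python/Java" requirement, here is a minimal Snowpark Python stored-procedure handler with its registration shown as commented placeholders; the procedure name, table, and retention rule are assumptions, not part of this posting.

```python
# Sketch: a Snowflake stored procedure written in Python (Snowpark).
# The handler, table name, and retention rule are illustrative.
from snowflake.snowpark import Session


def purge_old_rows(session: Session, table_name: str, days: int) -> str:
    # Delete rows older than the retention window and report the outcome.
    session.sql(
        f"DELETE FROM {table_name} "
        f"WHERE load_ts < DATEADD(day, -{days}, CURRENT_TIMESTAMP())"
    ).collect()
    return f"purge complete for {table_name}"


# Registration and invocation from a client session (placeholders throughout):
# session = Session.builder.configs({...}).create()
# session.sproc.register(purge_old_rows, name="purge_old_rows",
#                        packages=["snowflake-snowpark-python"], replace=True)
# session.call("purge_old_rows", "EDW.CURATED.CLAIMS", 90)
```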
Position Requirements
10+ years of work experience