
Snowflake Dremio Data Engineer (4-8Y) - PwC AC

Job in Bengaluru, 560001, Bangalore, Karnataka, India
Listing for: PwC Acceleration Center India
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Database Administrator, Cloud Computing
Job Description & How to Apply Below
Position: Snowflake Dremio Data Engineer (4-8Y) - PwC AC
Location: Bengaluru

We have an opportunity for a Snowflake + Dremio + Databricks Data Engineer at PwC AC.
Position: Snowflake + Dremio + Databricks
Experience Required: 4-8 Years
Notice Period: Immediate to 60 Days
Locations: Bangalore, Hyderabad, Kolkata, Chennai, Pune, Gurgaon
Work Mode: Hybrid
Must-Have Skills:
Azure Data Services
Databricks
Snowflake data warehousing, including SQL, SnowSQL, and Snowpipe
Dremio
Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe), ETL data pipelines, and big data modelling techniques using Python / Java (a brief Snowpipe sketch follows this list)
Proven experience architecting and implementing large-scale data solutions on Snowflake, including data ingestion, transformation, and optimization.
Proficiency in Azure Databricks, including Spark architecture and optimization.
Experience migrating data from relational databases (RDBMS) to the Snowflake cloud data warehouse, and optimizing Snowflake features such as data sharing, events, and lakehouse patterns (a migration sketch follows this list)
Hands-on expertise with Dremio for data lake query acceleration, data virtualization, and managing diverse data formats (e.g., JSON, XML, CSV); handling large and complex sets of XML, JSON, and CSV data from various sources and databases
Rich experience working with Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services (a Databricks/ADLS sketch follows this list)
Experience loading from disparate data sets and translating complex functional and technical requirements into detailed designs
Knowledge of data security, access controls, and governance within cloud-native data platforms like Snowflake and Dremio.
Exposure to AWS, Azure, or GCP cloud data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
Should have a good understanding of Data Quality processes, methods and project lifecycle.
Experience validating ETL loads and writing SQL queries (a reconciliation sketch follows this list)
Strong knowledge of DWH/ODS, ETL concepts, and data modeling principles
Should have a clear understanding of the DW lifecycle and have contributed to preparing technical design documents and test plans
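
As a rough illustration of the Snowflake / Snowpipe requirement above, the following Python sketch (using the snowflake-connector-python package) defines a pipe that copies JSON files from a stage into a table. The account, stage, and table names are placeholders, not details of this role; AUTO_INGEST also assumes an external stage with event notifications configured.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account locator
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

ddl = """
CREATE OR REPLACE PIPE raw_events_pipe
  AUTO_INGEST = TRUE  -- requires an external stage with event notifications
AS
COPY INTO raw_events
FROM @landing_stage
FILE_FORMAT = (TYPE = 'JSON')
"""

cur = conn.cursor()
cur.execute(ddl)
# SYSTEM$PIPE_STATUS returns a JSON description of the pipe's current state
cur.execute("SELECT SYSTEM$PIPE_STATUS('raw_events_pipe')")
print(cur.fetchone()[0])
cur.close()
conn.close()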
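
For the RDBMS-to-Snowflake migration requirement, a minimal table-level sketch might look like the following, assuming SQLAlchemy for the source database and write_pandas for the Snowflake load; connection strings and table names are placeholders.

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas
from sqlalchemy import create_engine

# Source connection string and table name are placeholders.
source = create_engine("postgresql+psycopg2://user:***@source-host/appdb")
df = pd.read_sql("SELECT * FROM public.orders", source)

sf = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

# write_pandas bulk-loads the DataFrame through an internal stage and COPY INTO;
# auto_create_table is available in recent connector versions.
success, n_chunks, n_rows, _ = write_pandas(
    sf, df, table_name="ORDERS", auto_create_table=True
)
print(f"loaded={success} rows={n_rows}")
sf.close()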
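
For the Azure Databricks / ADLS requirement, a minimal PySpark sketch reading CSV files landed in ADLS Gen2 and writing a curated Delta table could look like this; the storage account, container, and column names are placeholders, and cluster authentication to ADLS is assumed to be configured.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Placeholder ADLS Gen2 path for raw files
raw_path = "abfss://landing@mystorageacct.dfs.core.windows.net/orders/"

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Aggregate to a daily total; column names are illustrative only
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Partitioning by date keeps downstream scans narrow; Delta is the default
# table format on Databricks.
(daily.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@mystorageacct.dfs.core.windows.net/daily_orders/"))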
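
For the ETL validation requirement, a simple reconciliation-style check comparing row counts between the source table and the loaded Snowflake table might look like this; connections and table names are again placeholders.

import snowflake.connector
from sqlalchemy import create_engine, text

# Count rows in the source system
source = create_engine("postgresql+psycopg2://user:***@source-host/appdb")
with source.connect() as conn:
    src_count = conn.execute(text("SELECT COUNT(*) FROM public.orders")).scalar()

# Count rows in the loaded Snowflake table
sf = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = sf.cursor()
cur.execute("SELECT COUNT(*) FROM ORDERS")
tgt_count = cur.fetchone()[0]
cur.close()
sf.close()

assert src_count == tgt_count, f"row-count mismatch: {src_count} vs {tgt_count}"
print(f"row counts match: {src_count}")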

Interested candidates kindly share profiles on  
MB