
Data Modeller / Architect

Job in 500016, Prakāshamnagar, Telangana, India
Listing for: Confidential
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Database Administrator, Data Analyst, Data Warehousing
Job Description
Position: Data Modeller / Architect
Location: Prakāshamnagar

The Snowflake Data Architect / Snowflake Solution Architect is responsible for leading the design and implementation of scalable, secure, and high-performance data platforms using the Snowflake Data Cloud. The ideal candidate will have strong expertise in enterprise data architecture, dimensional and relational data modelling, performance tuning, and data governance. This role drives architectural decisions that support analytics, reporting, and real-time business applications.

The role involves designing, building, and maintaining robust data systems and infrastructure for the Pacific Data Platform, including developing data pipelines, optimizing database performance, and ensuring data integrity and security. The architect collaborates with other data engineers, application engineers, and analysts to support data-driven decision-making, and leverages advanced technologies to enhance data processing capabilities.

Key responsibilities include understanding the data model, designing databases and ETL processes, and implementing best practices for data management. Strong programming skills, experience with SQL and NoSQL databases, and proficiency in cloud platforms are essential. The candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

The right candidate will be excited by the prospect of data process automation and optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.

Job Title: Snowflake Data Architect / Snowflake Solution Architect

Job Location: Hyderabad

Start Date: As soon as possible

Key Responsibilities

Design and implement scalable data architectures to support data storage, processing, and analytics.
Design and implement data schemas within Snowflake to effectively support analytics and reporting needs.
Establish and enforce data access roles and policies.
Develop strategies to make data AI-ready, including data cleansing, transformation, and enrichment processes.
Provide guidance and support for analytical development and modelling to enhance data visualization and reporting capabilities.
Conduct performance tuning and optimization of data models to improve query efficiency and response times.
Develop, maintain, and optimize ETL (Extract, Transform, Load) processes for the Pacific Data Analytics Platform to ensure efficient data integration from various internal and external sources.
Manage and optimize database and data warehouse systems such as Snowflake, ensuring high availability and performance.
Analyze and tune database performance, identifying bottlenecks and implementing improvements to enhance query performance.
Ensure data integrity, consistency, and accuracy through rigorous data quality checks and validations.
Work closely with data engineers, application engineers, analysts, and other stakeholders to understand data needs and provide appropriate solutions.
Leverage cloud technologies (mainly AWS) for data storage, processing, and analytics, ensuring cost-effectiveness and scalability.
Document data processes, architectures, and workflows while establishing best practices for data management and engineering.
Set up monitoring solutions to track data pipelines and database performance, ensuring timely maintenance and fault resolution.
Quickly analyze existing SQL code and improve it to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
Implement data security measures and ensure compliance with relevant regulations regarding data protection and privacy.
Provide guidance and mentorship to junior data engineers, fostering a culture of learning and continuous improvement.
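As a rough illustration of the data cleansing and quality-check responsibilities listed above, the sketch below (plain Python, hypothetical field names, no Snowflake dependency) validates, normalizes, and de-duplicates a batch of raw records before they would be loaded into a warehouse:

```python
from datetime import datetime

def clean_batch(records):
    """Validate, normalize, and de-duplicate a batch of raw records.

    Hypothetical schema: each record needs a non-empty 'id', an ISO-format
    'event_date', and a numeric 'amount'. Later duplicates of the same 'id'
    are dropped, keeping the first occurrence. Returns (clean, rejects).
    """
    seen = set()
    clean, rejects = [], []
    for rec in records:
        rec_id = str(rec.get("id") or "").strip()
        try:
            event_date = datetime.fromisoformat(rec["event_date"])
            amount = float(rec["amount"])
        except (KeyError, TypeError, ValueError):
            rejects.append(rec)  # malformed date or amount, or missing field
            continue
        if not rec_id or rec_id in seen:
            rejects.append(rec)  # empty or duplicate key
            continue
        seen.add(rec_id)
        clean.append({"id": rec_id, "event_date": event_date, "amount": amount})
    return clean, rejects
```

In a real pipeline, the rejects would typically be written to a quarantine table for investigation rather than silently dropped.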

Key Qualifications

Experience:
15+ years of experience as a Snowflake Solution Architect is preferred.

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, with at least 10 years of software development experience.
Expert knowledge of databases such as Oracle, PostgreSQL, and SQL Server (preferably cloud hosted), with strong programming experience in SQL.
Competence in data preparation and/or ETL tools such as SnapLogic, Azure Data Factory, AWS Glue, or SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
Programming experience in Python and shell scripting (bash/zsh, grep/sed/awk, etc.).
Deep knowledge of databases, stored procedures, and optimization of large data sets.
In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.

Experience building the infrastructure required for data ingestion and analytics.
Solid understanding of normalization and denormalization of data, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE),…
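The partitioning technique mentioned among the ingestion qualifications can be sketched in a few lines. This is a minimal, hedged example (plain Python, hypothetical record ids) of stable hash partitioning, which keeps the same key in the same partition across ingestion runs:

```python
import hashlib

def partition_key(record_id: str, num_partitions: int = 8) -> int:
    """Map a record id to a stable partition index.

    SHA-256 (rather than Python's built-in hash()) is used because it is
    deterministic across processes and runs, which ingestion partitioning
    relies on to route the same key to the same partition every time.
    """
    digest = hashlib.sha256(record_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

def partition_batch(ids, num_partitions: int = 8):
    """Group record ids into num_partitions buckets for parallel loading."""
    buckets = {i: [] for i in range(num_partitions)}
    for rec_id in ids:
        buckets[partition_key(rec_id, num_partitions)].append(rec_id)
    return buckets
```

In a warehouse such as Snowflake, the analogous decision is choosing clustering keys so that related rows land in the same micro-partitions; the hashing idea above applies to routing data during ingestion.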