
Data Engineer

Job in 500001, Hyderabad, Telangana, India
Listing for: RandomTrees
Full Time position
Listed on 2026-02-22
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
  • Engineering
    Data Engineer
Job Description & How to Apply Below
Data Engineering Specialist

Experience: 7+ years
Location: Chennai/Hyderabad
Work Mode: Hybrid (3 days in office per week)
Overview of the requirement:
RandomTrees is looking for a skilled Data Engineering Specialist to design and implement data solutions. The ideal candidate will have experience with Snowflake, SQL, DBT, Python/PySpark or other data modelling tools, and Azure/AWS/GCP, along with a strong foundation in cloud platforms. You will be responsible for developing scalable, efficient data architectures that enable personalized customer experiences and advanced analytics.

Roles and Responsibilities:
Implement, develop, and maintain large-scale data warehousing solutions in Snowflake to handle data processing and analytics needs.
Optimize workflows using DBT or other data modelling tools to streamline data transformation and modelling processes.
Apply strong SQL/PL/SQL expertise to query, transform, and analyse large datasets.
Use Python hands-on for data engineering tasks and scripting.
Work with cloud data platforms for large-scale data processing.
Apply data profiling, validation, and cleansing techniques.
Support both real-time and batch data integration, ensuring data is accessible for actionable insights and decision-making.
Apply a strong understanding of data modelling, ETL/ELT processes, and modern data architecture frameworks.
Collaborate with cross-functional teams to identify and prioritize project requirements.
Optimize database performance and ensure data quality.
Troubleshoot and resolve technical issues related to data processing and analysis.
Participate in code reviews and contribute to improving overall code quality.

Job Requirements:

Strong understanding of data modelling and ETL concepts.

Experience with Snowflake and any data modelling tool is highly desirable.
Experience optimizing workflows using DBT to streamline data transformation and modelling processes.
Hands-on experience with Python for data engineering tasks and scripting.
Strong expertise in SQL with hands-on experience in querying, transforming, and analysing large datasets.
Expertise with cloud data platforms (Azure preferred) and Big Data technologies for large-scale data processing.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment.
Strong communication and interpersonal skills.
Familiarity with agile development methodologies.