Data Engineer
Job in 500016, Prakāshamnagar, Telangana, India
Listed on 2026-02-04
Listing for: Confidential
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Science Manager, Big Data, Data Warehousing
Job Description & How to Apply Below
Job Title: Data Engineer
Location: Hyderabad, Telangana
Employment Type: Full-time | On-site / Hybrid
About the Role
We are looking for a skilled and detail-oriented Data Engineer to design, build, and maintain scalable data pipelines and architectures that power analytics, reporting, and machine learning across the organization. You will play a key role in ensuring reliable, high-quality data is accessible to teams for data-driven decision-making.
Key Responsibilities
Design, develop, and maintain ETL/ELT pipelines for structured and unstructured data.
Build and optimize data warehouses, data lakes, and data marts.
Develop data integration solutions across multiple data sources and systems.
Ensure data quality, governance, security, and compliance throughout the data lifecycle.
Collaborate with Data Analysts, Data Scientists, and software engineers to deliver trusted datasets.
Monitor, troubleshoot, and optimize data pipelines for performance and scalability.
Automate data processing workflows using Python, SQL, and cloud-native tools.
Document data architecture, workflows, and best practices.
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, IT, or a related field.
2–5 years of experience in data engineering or a related role.
Strong proficiency in SQL and experience with relational databases (PostgreSQL, MySQL, SQL Server).
Hands-on experience with big data technologies such as Spark, Kafka, or Hadoop.
Proficiency in Python or Java for data pipeline development.
Experience with cloud platforms such as AWS, Azure, or GCP (S3, Redshift, BigQuery, Databricks, etc.).
Understanding of ETL/ELT concepts, data modeling, and data warehousing.
Familiarity with Git and CI/CD practices.
Preferred / Good-to-Have
Experience with workflow orchestration tools (Airflow, Prefect, Luigi).
Knowledge of NoSQL databases (MongoDB, Cassandra, DynamoDB).
Exposure to containerization and DevOps tools (Docker, Kubernetes).
Understanding of data governance, metadata management, and security best practices.
Why Join Us
Work on high-impact data platforms and analytics systems.
Collaborate with cross-functional teams in a fast-paced environment.
Opportunity to work with modern data tools and cloud technologies.
Competitive compensation with strong career growth opportunities.