Location: Chennai
Experience: 8-10 years

Job Description:
- Architect and implement enterprise-grade data migration solutions in Java and Python, enabling seamless data transfers from on-premises systems to GCP (Cloud Storage, BigQuery, Pub/Sub) using Apache Airflow and Google Cloud Composer.
- Build secure, scalable, and optimized data architectures leveraging GCP services such as Cloud Storage, Pub/Sub, Dataproc, Dataflow, and BigQuery.
- Design and implement automated frameworks for data delivery, monitoring, and troubleshooting.
- Develop data observability frameworks to ensure quality, lineage, and reliability across pipelines.
- Proactively monitor system performance, identify bottlenecks, and optimize pipelines for efficiency, scalability, and cost.
- Troubleshoot and resolve complex technical issues in distributed systems and cloud environments.
- Drive best practices in documentation of tools, architecture, processes, and solutions.
- Mentor junior engineers, conduct design/code reviews, and influence engineering standards.
- Collaborate with cross-functional teams to enable AI/ML and GenAI-driven use cases on LUMI.
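The monitoring and observability duties above can be sketched with a couple of simple pipeline checks, such as row-count reconciliation between source and target and a load-freshness test. This is an illustrative, stdlib-only sketch; the function names, thresholds, and sample figures are hypothetical and not tied to any specific platform or framework:

```python
from datetime import datetime, timedelta, timezone

def counts_match(source_rows: int, target_rows: int, tolerance: float = 0.0) -> bool:
    """True when the target row count is within `tolerance` (a fraction,
    e.g. 0.001 for 0.1%) of the source row count."""
    if source_rows == 0:
        return target_rows == 0
    return abs(source_rows - target_rows) / source_rows <= tolerance

def is_fresh(last_loaded_at: datetime, max_age: timedelta) -> bool:
    """True when the most recent load finished within `max_age`."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_age

# Example: reconcile a source extract against the loaded target table.
checks = {
    "row_counts": counts_match(1_000_000, 999_850, tolerance=0.001),
    "freshness": is_fresh(datetime.now(timezone.utc) - timedelta(hours=2),
                          max_age=timedelta(hours=24)),
}
failed = [name for name, ok in checks.items() if not ok]
print("all checks passed" if not failed else f"failed: {failed}")
```

In practice, checks like these would run as post-load tasks in an Airflow/Cloud Composer DAG and alert on failure rather than print.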
Minimum Qualifications:
- 8+ years of experience in data engineering, software engineering, or platform development.
- Strong programming expertise in Java, Python, and Shell scripting.
- Advanced knowledge of SQL, data modeling, and performance optimization.
- Deep expertise in Google Cloud Platform services: Cloud Storage, BigQuery, Pub/Sub, Dataproc, Dataflow.
- Strong background in RDBMS (Oracle, Postgres, MySQL) and exposure to NoSQL DBs (Cassandra, MongoDB, or similar).
- Proven track record in CI/CD pipelines, Git workflows, and Agile development.
- Demonstrated experience in building and scaling production-grade data pipelines.
- Strong problem-solving and troubleshooting skills in distributed and cloud-native systems.
Preferred Qualifications:
- Hands-on experience with DevOps best practices, automation, and infrastructure as code.
- Exposure to platform engineering (networking, security, IAM, firewalls).
- Experience designing and implementing data observability frameworks (monitoring, lineage, anomaly detection).
- Hands-on or exposure to GenAI integrations (LLMs, RAG, AI-driven data engineering workflows).
- Proven ability to mentor, influence, and lead engineering discussions.