
Senior Data Engineer – PySpark, Cloud & Kafka – 5+ YoE

Job in Bengaluru, 560001, Bangalore, Karnataka, India
Listing for: UST
Full Time position
Listed on 2026-02-20
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data, Data Science Manager
Job Description & How to Apply Below
Position: Senior Data Engineer – PySpark, Cloud & Kafka - 5+ YoE - Immediate Joiner - Any UST Location
Location: Bengaluru

Candidates ready to join immediately can share their details via email for quick processing.

Current CTC | Expected CTC | Notice Period | Location Preference

Act fast for immediate attention! ⏳

Key Responsibilities

- Design, develop, and maintain scalable data pipelines using PySpark
- Build and manage batch and real-time data processing systems
- Develop and integrate Kafka-based streaming solutions
- Optimize Spark jobs for performance, cost, and scalability
- Work with cloud-native services to deploy and manage data solutions
- Ensure data quality, reliability, and security across platforms
- Collaborate with data scientists, analysts, and application teams
- Participate in code reviews, design discussions, and production support

Must-Have Skills

- Strong hands-on experience with PySpark / Apache Spark
- Solid understanding of distributed data processing concepts
- Experience with Apache Kafka (producers, consumers, topics, partitions)
- Hands-on experience with any one cloud platform:
  - AWS (S3, EMR, Glue, EC2, IAM), or
  - Azure (ADLS, Synapse, Databricks), or
  - GCP (GCS, Dataproc, BigQuery)
- Proficiency in Python
- Strong experience with SQL and data modeling
- Experience working with large-scale datasets
- Familiarity with Linux/Unix environments
- Understanding of ETL/ELT frameworks
- Experience with CI/CD pipelines for data applications

Good-to-Have Skills

- Experience with Spark Structured Streaming
- Knowledge of Kafka Connect and Kafka Streams
- Exposure to Databricks
- Experience with NoSQL databases (Cassandra, MongoDB, HBase)
- Familiarity with workflow orchestration tools (Airflow, Oozie)
- Knowledge of containerization (Docker, Kubernetes)
- Experience with data lake architectures
- Understanding of security, governance, and compliance in cloud environments
- Exposure to Scala or Java is a plus
- Prior experience in Agile/Scrum environments
Position Requirements
10+ years of work experience