
Cloud Specialist - Databricks + Python

Job in Bengaluru (Bangalore), 560001, Karnataka, India
Listing for: Confidential
Full Time position
Listed on 2026-02-04
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Data Science Manager, Data Warehousing
Job Description & How to Apply Below
Location: Bengaluru

Job Description:

We are seeking a skilled Backend Data Engineer to join our dynamic team. The ideal candidate will be responsible for designing and implementing data pipelines, ensuring data integrity, and developing backend services that support data processing and analytics. You will work closely with data scientists, analysts, and other engineers to manage and process large datasets, utilizing Databricks and various Python frameworks.

Your role will involve optimizing data workflows and ensuring that our data infrastructure is robust, scalable, and efficient.

Key Responsibilities:

- Design, develop, and maintain data pipelines using Databricks.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data lakes and warehouses.
- Write clean, efficient, and well-documented Python code for data processing.
- Optimize data models and queries for performance and scalability.
- Monitor and troubleshoot data pipeline performance issues.
- Ensure data quality and integrity throughout all stages of data processing.
- Stay updated with emerging technologies and best practices in data engineering.
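To illustrate the extract-transform-load pattern named above: in this role the pipeline would run on Databricks with Apache Spark, but the same shape can be sketched with the Python standard library alone. The table name, columns, and sample rows below are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical sample data standing in for an upstream source extract.
raw = io.StringIO(
    "order_id,amount,currency\n"
    "1,100.0,USD\n"
    "2,,USD\n"  # missing amount: dropped during the transform step
    "3,250.5,EUR\n"
)

# Extract: read rows from the CSV source.
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records and cast amounts to numeric types.
clean = [
    (int(r["order_id"]), float(r["amount"]), r["currency"])
    for r in rows
    if r["amount"]
]

# Load: write the cleaned rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 350.5)
```

On Databricks the extract and load steps would typically target cloud object storage and Delta tables via the Spark DataFrame API, but the extract/transform/load boundaries are the same.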

Skills Required:

- Proficiency in Python programming and data manipulation libraries such as pandas and NumPy.

- Experience with Databricks and Apache Spark for big data processing.
- Strong understanding of data warehousing concepts and ETL processes.
- Familiarity with SQL for querying relational databases.
- Knowledge of data modeling techniques and practices.

- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and ability to work collaboratively in a team environment.
Tools Required:

- Databricks for data engineering tasks.
- Python for backend development and data processing.
- Apache Spark for handling large-scale data processing.
- SQL databases such as PostgreSQL, MySQL, or similar.
- Cloud services (AWS, Azure, or Google Cloud) for data storage and processing.
- Git for version control and collaboration.