
GCP Data Engineer

Job in Gurgaon, Haryana, India
Listing for: Impetus
Full Time position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Job Description & How to Apply Below
About the Organization
- Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth.

Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises. We are headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and have over 3,000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Locations
- Bengaluru & Gurgaon

Job Summary:

We are seeking experienced Data Engineering professionals with 4–6 years of hands-on expertise in Big Data technologies, with a focus on building scalable data solutions using Google Cloud Platform (GCP).

Key Skills & Experience:

- Proven expertise in PySpark (DataFrame and Spark SQL), Hadoop, and Hive
- Strong programming skills in Python and Bash
- Solid understanding of SQL and data warehousing concepts
- Demonstrated analytical and problem-solving abilities, particularly in data analysis and troubleshooting
- Innovative thinker with a passion for building efficient, scalable data solutions
- Excellent verbal and written communication skills, with the ability to work collaboratively across teams
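As an illustrative, minimal sketch of the SQL and data-warehousing concepts listed above (using Python's built-in sqlite3 in place of a warehouse engine; the table and column names are hypothetical):

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# A typical analytical aggregation: total sales per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('north', 150.0), ('south', 75.0)]
conn.close()
```

The same GROUP BY aggregation pattern carries over directly to Spark SQL or BigQuery; only the engine and scale change.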

Preferred/Good to Have:

- Experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and IAM
- Familiarity with Airflow or other orchestration tools
- Exposure to cloud migration projects (on-prem to GCP or cloud-to-cloud)

Roles & Responsibilities:

- Design and develop robust and scalable ETL pipelines on GCP to meet business needs
- Ensure code quality and performance by adhering to development best practices and standards
- Perform integration testing and troubleshoot pipeline issues across environments
- Estimate efforts for development, testing, and deployment tasks
- Participate in code reviews and provide feedback to maintain high development standards
- Where applicable, design cost-effective data pipelines leveraging GCP-native services
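The pipeline responsibilities above can be sketched, in outline, as a plain-Python extract/transform/load flow (the data and function names are hypothetical stand-ins; a production pipeline would use GCP-native services such as Dataflow or Dataproc):

```python
def extract():
    # Stand-in for reading raw records from a source (e.g. Cloud Storage, Pub/Sub).
    return [{"user": "a", "clicks": "3"}, {"user": "b", "clicks": "7"}]

def transform(records):
    # Clean and type-cast raw records; drop anything malformed.
    out = []
    for r in records:
        try:
            out.append({"user": r["user"], "clicks": int(r["clicks"])})
        except (KeyError, ValueError):
            continue
    return out

def load(records, sink):
    # Stand-in for writing to a warehouse table (e.g. BigQuery).
    sink.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # → [{'user': 'a', 'clicks': 3}, {'user': 'b', 'clicks': 7}]
```

Keeping extract, transform, and load as separate steps is what makes integration testing and troubleshooting across environments tractable: each stage can be exercised and validated in isolation.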

For a Quick Response
- Interested candidates can share their resume, along with their Notice Period, Current CTC, and Expected CTC, at