
GCP Data Engineer

Job in Richardson, Dallas County, Texas, 75080, USA
Listing for: Compunnel, Inc.
Full Time position
Listed on 2025-11-28
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description

We are seeking an experienced GCP Data Engineer with a strong background in building and optimizing big data pipelines on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience with GCP services, big data frameworks, and Python-based ETL development, and will contribute to building robust data solutions that scale to meet complex analytics requirements.

Job Responsibilities:

  • Design, develop, and maintain scalable data pipelines and workflows using GCP services such as Dataflow, Cloud Composer, and Cloud Functions
  • Implement ETL/ELT solutions using PySpark, Spark SQL, and Python to process and transform large datasets (see the sketch after this list)
  • Develop monitoring and alerting mechanisms for data quality and pipeline failures
  • Write advanced SQL queries to support reporting and analytics use cases
  • Work with GCP services such as Compute Engine, Dataproc, Kubernetes Engine, BigQuery, Pub/Sub, and Cloud Storage
  • Participate in CI/CD pipeline automation using tools such as Jenkins, GitHub, and GitHub Actions
  • Design and implement data models (conceptual, logical, and physical) for analytics and reporting
  • Conduct architecture and code reviews to ensure solutions align with GCP best practices and performance standards
  • Provide architectural recommendations for scalable and secure cloud data solutions
  • Collaborate with cross-functional teams across data engineering, data science, and product
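
As a rough illustration of the pipeline work described above, the sketch below shows one possible PySpark batch ETL step: it reads raw order events from Cloud Storage, aggregates them, and writes the result to BigQuery through the spark-bigquery connector available on Dataproc. The bucket, dataset, table, and column names are illustrative placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch (assumed names throughout): aggregate raw order events from
# Cloud Storage into daily per-customer totals and load them into BigQuery.
spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("gs://example-bucket/raw/orders/")  # hypothetical input path
)

daily_totals = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "customer_id")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("order_count"),
       )
)

# Write via the spark-bigquery connector (bundled on Dataproc clusters);
# the dataset, table, and temporary bucket below are placeholders.
(
    daily_totals.write
    .format("bigquery")
    .option("table", "example_dataset.daily_order_totals")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```
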
Required Skills:
  • Strong hands-on experience with core GCP services:
    Compute Engine, Dataproc, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, and Dataflow
  • Proficiency in PySpark, Python, Spark SQL, DataFrames, and pytest (a minimal test sketch follows this list)
  • Experience with Cloud Composer (Airflow on GCP) for orchestrating data workflows
  • Proven expertise in writing complex and optimized SQL queries
  • Experience designing and implementing CI/CD pipelines using GitHub, GitHub Actions, and Jenkins
  • Deep understanding of data architecture, data modeling, and pipeline optimization on GCP
  • Strong troubleshooting and debugging skills
  • Excellent verbal and written communication skills
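
Because the posting pairs pytest with PySpark and DataFrames, the following is one minimal, hypothetical example of how a DataFrame transformation could be unit-tested locally. The function, column names, and threshold are invented for illustration only.

```python
import pytest
from pyspark.sql import SparkSession, functions as F


def flag_large_orders(df, threshold=100.0):
    # Hypothetical transformation under test: marks orders at or above the threshold.
    return df.withColumn("is_large", F.col("amount") >= threshold)


@pytest.fixture(scope="session")
def spark():
    # A local SparkSession is enough for unit-testing pure transformations.
    session = (
        SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    )
    yield session
    session.stop()


def test_flag_large_orders(spark):
    df = spark.createDataFrame(
        [("o1", 50.0), ("o2", 150.0)],
        ["order_id", "amount"],
    )
    result = {row["order_id"]: row["is_large"] for row in flag_large_orders(df).collect()}
    assert result == {"o1": False, "o2": True}
```
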
Preferred Skills:
  • Experience with data governance and metadata management tools
  • Familiarity with machine learning pipelines on GCP
  • GCP Professional Data Engineer certification is a plus
Certifications:

GCP Professional Data Engineer (preferred)

Education:

Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
