Aarorn Technologies Inc provided pay range
This range is provided by Aarorn Technologies Inc. Your actual pay will be based on your skills and experience; talk with your recruiter to learn more.
Base pay range: CA $40.00/hr - CA $45.00/hr
Job Title: GCP Data Engineer
Location: Toronto, ON (onsite 3 days a week)
Employment Type: Contract
We are looking for a skilled GCP Data Engineer with strong experience in building and optimizing data pipelines on Google Cloud Platform. The ideal candidate will have hands‑on expertise in ETL workflow design using Spark Serverless and Spark on GKE, combined with proficiency in Python programming for data processing and automation. This role requires deep knowledge of GCP services, data governance, and monitoring practices.
Key Responsibilities:
- Design, develop, and maintain scalable ETL workflows using Spark Serverless, Spark on GKE, and Python.
- Implement data ingestion, transformation, and loading processes for structured and unstructured data.
- Utilize GCP services such as BigQuery, Cloud SQL, Cloud Storage, and Pub/Sub for efficient data processing.
- Optimize performance and cost for large‑scale data workflows.
- Implement data governance, data quality checks, and data lineage tracking using GCP tools.
- Ensure compliance with organizational and regulatory standards.
- Set up Cloud Logging, Cloud Monitoring, and alerting for data pipelines and infrastructure.
- Troubleshoot and resolve issues proactively to maintain high availability.
- Work closely with data scientists, analysts, and architects to deliver reliable solutions.
- Automate workflows and processes using Python and GCP‑native tools.
Required Skills & Qualifications:
- Strong experience with Google Cloud Platform and its data services.
- Hands‑on expertise in Spark Serverless, Spark on GKE, and Python programming.
- Proficiency in BigQuery, Cloud SQL, and GCP-native ETL tools.
- Knowledge of data governance, data quality frameworks, and lineage tools.
- Experience with Cloud Logging, Cloud Monitoring, and observability best practices.
- GCP Professional Data Engineer certification.
- Experience with containerization and orchestration (Docker, Kubernetes) and CI/CD pipelines.
- Familiarity with data security and compliance standards such as GDPR and HIPAA.