Software Developer, Data Engineer, Cloud Computing
Listed on 2026-03-03
IT/Tech
Data Engineer, Cloud Computing, Data Analyst, Big Data
Your Opportunity
In this role you will be contracted to build and maintain scalable data pipelines on Google Cloud Platform (GCP) to serve our fraud data mart customers. You will work closely with cross-functional teams to ensure data integrity, reliability, and scalability. Your expertise in Google BigQuery, Google Cloud Storage, Dataflow, Cloud Composer, Python, and SQL will be crucial in developing effective data solutions that support our fraud data analytics and reporting efforts.
What you're good at
Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub.
Write high-performance, production-grade Python and SQL, optimizing queries to support data extraction, transformation, and loading (ETL) processes.
Implement complex data models in BigQuery, utilizing partitioning, clustering, and materialized views for optimal performance.
Collaborate with cross-functional teams, including business customers and subject matter experts, to understand data requirements and deliver effective solutions.
Implement best practices for data quality, data governance, and data security.
Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
Contribute to data architecture decisions to provide recommendations for improving the data pipeline.
Stay up to date with emerging trends and technologies in cloud-based data engineering and cyber security.
Exceptional communication skills, including the ability to gather relevant data and information, actively listen, dialogue freely, and verbalize ideas effectively.
Ability to work in an Agile work environment to deliver incremental value to customers by managing and prioritizing tasks.
Proactively lead investigation and resolution efforts when data issues are identified, taking ownership to resolve them in a timely manner.
Ability to implement and document processes and procedures for producing metrics.
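The pipeline work described above follows a classic extract-transform-load shape. As a minimal, stdlib-only sketch of those three stages (a production pipeline here would use Dataflow/Apache Beam and load into BigQuery; the record fields and the suspicious-amount rule below are illustrative assumptions, not part of this posting):

```python
import json
from typing import Iterable

def extract(lines: Iterable[str]) -> Iterable[dict]:
    """Extract: parse newline-delimited JSON records, as a Pub/Sub or
    Cloud Storage source might deliver them (stand-in for a Beam source)."""
    for line in lines:
        if line.strip():
            yield json.loads(line)

def transform(records: Iterable[dict]) -> Iterable[dict]:
    """Transform: normalize the amount to integer cents and apply a
    hypothetical fraud-flagging rule (threshold chosen for illustration)."""
    for r in records:
        amount = float(r["amount"])
        yield {
            "account_id": r["account_id"],
            "amount_cents": round(amount * 100),
            "is_suspicious": amount > 10_000,
        }

def load(records: Iterable[dict]) -> list[dict]:
    """Load: collect rows into a list; a real pipeline would stream them
    into a partitioned BigQuery table instead."""
    return list(records)

raw = [
    '{"account_id": "a1", "amount": "12500.00"}',
    '{"account_id": "a2", "amount": "42.50"}',
]
rows = load(transform(extract(raw)))
```

Keeping each stage a plain generator makes the business logic unit-testable in isolation before it is ported into Beam `DoFn`s or SQL.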
location: Telecommute
job type: Contract
salary: $53.25 - $58.25 per hour
work hours: 8am to 5pm
education: Bachelor's
Must Have
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
8+ years of hands-on data management experience gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a consumable form for visualization and data analysis.
Robust expertise in Google BigQuery, Google Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, and related Google Cloud Platform (GCP) services.
Proficiency in Python and SQL for data processing and automation.
Experience with ETL processes and data pipeline design.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
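The SQL-proficiency and BigQuery-modeling requirements above hinge on writing queries that let BigQuery prune partitions. A small sketch of that idea, assuming a hypothetical `fraud_mart.transactions` table date-partitioned on an `event_date` column (both names are illustrative, not from this posting):

```python
from datetime import date

# Hypothetical table, assumed partitioned by event_date (illustration only).
TABLE = "fraud_mart.transactions"

def daily_fraud_query(day: date) -> str:
    """Build a query whose WHERE clause filters directly on the
    partitioning column, so BigQuery can prune to a single partition
    instead of scanning the whole table."""
    return (
        f"SELECT account_id, SUM(amount_cents) AS total_cents "
        f"FROM `{TABLE}` "
        f"WHERE event_date = DATE '{day.isoformat()}' "
        f"AND is_suspicious "
        f"GROUP BY account_id"
    )

sql = daily_fraud_query(date(2026, 3, 1))
```

Wrapping the query in a small builder like this also makes the partition filter easy to assert on in tests before the query is scheduled, e.g. from a Cloud Composer (Airflow) task.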
Nice to Have
Deep expertise in real-time processing using…