In this role, you will build data pipelines, architect scalable solutions, and ensure seamless integration across diverse platforms. You will empower teams with reliable, high-quality data, driving excellence in analytics and shaping outcomes that impact industries like healthcare and financial services.
What You’ll Do
• Design and build data pipelines using GCP services and BigQuery.
• Implement and manage workflow orchestration with Airflow.
• Develop and optimize data solutions using Java or Python.
• Collaborate with data scientists, analysts, and business teams to deliver actionable
insights.
• Ensure data quality, governance, and compliance across projects.
• Optimize performance of large‐scale data processing systems.
• Support cloud migration and modernization initiatives.
• Partner with stakeholders to align data solutions with business goals.
What You Bring
• 8+ years of experience in data engineering.
• Strong expertise in GCP and BigQuery.
• Hands-on experience with Airflow for workflow orchestration.
• Proficiency in Java or Python for data processing and automation.
• Solid understanding of data modeling, ETL, and database optimization.
• Experience working in Agile delivery environments.
• Strong problem-solving, communication, and stakeholder management skills.
• Healthcare domain experience is preferred.