Middle Data Engineer (GCP, BigQuery)
Listed on 2026-01-01
IT/Tech
Data Engineer
Location: Town of Poland
Why Join Exadel
We’re an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks.
From AI platforms to digital transformation, we partner with enterprise leaders to build what’s next.
What powers it all? Our people: ambitious, collaborative, and constantly evolving.
What You’ll Do
Pipeline Engineering
- Build and maintain data pipelines using Apache Beam and Dataflow under the guidance of senior engineers (a minimal pipeline sketch follows this list)
- Develop ingestion patterns across batch or near real-time workflows
- Write Python and SQL for transformations, validations, and automation tasks
- Create BigQuery tables with sound partitioning and clustering choices
- Use dbt or Dataform to manage transformations and testing
- Contribute to data model implementation following established standards
- Document logic and assumptions clearly for partner teams
- Support production workloads by monitoring pipelines, analyzing issues, and applying fixes
- Contribute to performance tuning efforts across BigQuery and Dataflow
- Participate in the implementation of CI/CD practices for data workflows
- Work with analysts, scientists, and engineers to understand requirements
- Participate in code reviews and apply feedback to improve your craft
- Learn modern GCP approaches through close coordination with senior engineers and architects
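For context only, and not part of the original posting: a minimal sketch of the kind of batch ingestion pipeline the duties above describe, assuming a hypothetical Cloud Storage bucket, BigQuery dataset, and CSV layout. It reads text files, parses each row, and appends the results to an existing BigQuery table.

```python
# Illustrative only: a batch pipeline in the spirit of the Apache Beam / Dataflow
# duties above. The bucket, project, dataset, table, and CSV schema are
# hypothetical placeholders, not real systems named in the posting.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Turn one CSV line into a row dict matching the target table schema."""
    event_id, user_id, event_type, event_ts = line.split(",")
    return {
        "event_id": event_id,
        "user_id": user_id,
        "event_type": event_type,
        "event_ts": event_ts,
    }


def run() -> None:
    # Runner, region, and temp location come from the command line,
    # e.g. --runner=DataflowRunner --temp_location=gs://example-bucket/tmp
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1
            )
            | "ParseRows" >> beam.Map(parse_row)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```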
Requirements
- Experience with Dataflow, Apache Beam, BigQuery, Cloud Storage, or similar cloud-native tools (a table-creation sketch follows this list)
- Solid proficiency in Python for data tasks and automation
- Strong SQL skills and a clear understanding of analytic query patterns
- Experience with dbt or Dataform for transformations and testing
- Understanding of common data modeling concepts used in analytics environments
- Familiarity with CI/CD practices
- Comfortable working with logging, metrics, and monitoring tools
- Interest in data quality practices and validation frameworks
- Strong debugging instincts and patience with iterative problem solving
- Clear communication with teammates and partner groups
- Desire to grow your technical depth through real project experience
- Steady focus on reliability, clarity, and maintainability
- Experience with Pub/Sub or other event streaming tools
- Exposure to Dataproc or Spark from legacy environments
- Familiarity with Vertex AI or ML‑related workflows
- Understanding of orchestration tools such as Composer or Airflow
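Also not part of the posting: a small, hypothetical example of the table-layout work implied by the BigQuery items above (partitioning and clustering are also called out in the duties), using the google-cloud-bigquery Python client. All project, dataset, table, and column names are assumptions for illustration.

```python
# Illustrative only: create a date-partitioned, clustered BigQuery table.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)

# Partition by day on the event timestamp so queries that filter on a date
# range only scan the matching partitions.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)

# Cluster within each partition so filters on user_id / event_type read
# fewer storage blocks.
table.clustering_fields = ["user_id", "event_type"]

table = client.create_table(table, exists_ok=True)
print(f"Created or found {table.full_table_id}")
```

In a setup like the one described, a dbt or Dataform model would typically own the transformations that populate such a table; the client call here only fixes the physical layout.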
English level: Intermediate+
Legal & Hiring Information
- Exadel is proud to be an Equal Opportunity Employer, committed to inclusion across minority status, gender identity, sexual orientation, disability, age, and more
- Reasonable accommodations are available to enable individuals with disabilities to perform essential functions
- Please note: this job description is not exhaustive; duties and responsibilities may evolve based on business needs
Benefits
- International projects
- In‑office, hybrid, or remote flexibility
- Medical healthcare
- Recognition program
- Ongoing learning & reimbursement
- Well‑being program
- Team events & local benefits
- Sports compensation
- Referral bonuses
- Top‑tier equipment provision
We lead with trust, respect, and purpose. We believe in open dialogue, creative freedom, and mentorship that helps you grow, lead, and make a real difference. Ours is a culture where ideas are challenged, voices are heard, and your impact matters.
Seniority level: Entry level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting, Information Services, and Software Development