
Data Engineer

Job in 26240, Çankaya, Eskişehir, Türkiye
Listing for: OPLOG
Full Time position
Listed on 2025-12-29
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Data Analyst, Big Data
Salary/Wage Range or Industry Benchmark: 150,000 - 300,000 TRY per year
Job Description & How to Apply Below
Location: Çankaya

Who We Are

OPLOG is the tech engine behind seamless e‑commerce fulfillment for top brands across Türkiye, Europe, and the US. By fusing in‑house software with robotics and automation, we erase the line between the physical and digital worlds—delivering post‑purchase experiences that turn customers into fans and giving our clients an unfair competitive edge.

We’re building a world-class robotic fulfillment ecosystem, and we’re looking for curious minds to shape the future with us.

About the Role

We are looking for a highly experienced, talented, and self-starting Senior Data Engineer who will be responsible for designing and implementing scalable and robust data platforms and solutions using cutting-edge technologies.

What You’ll Do
  • Design, maintain and optimize batch and streaming data pipelines using Databricks, Apache Spark, and Fivetran.
  • Build and implement data models, products, and platforms with high quality using Databricks ecosystem and dbt for data transformation.
  • Develop MLOps pipelines and AI‑driven solutions to enhance our fulfillment operations and predictive analytics.
  • Work with product management and product teams to build data-driven products; extract, interpret, and present insights using Qlik.
  • Contribute to the development of the analytical data warehouse and related big data ecosystem on AWS platform.
  • Implement real-time data processing and streaming architectures using Databricks Structured Streaming (a rough sketch follows this list).
  • Build and maintain dbt models for data transformation and analytics engineering.
  • Provide support to data analysts and data scientists for their data engineering and ML infrastructure requirements.
  • Own the end-to-end software development lifecycle, with a focus on MLOps best practices.
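
As a rough illustration of the streaming responsibility above (purely a sketch; the bucket paths, schema fields, and table name below are hypothetical placeholders, not details from this posting), a minimal Databricks Structured Streaming job in PySpark could look roughly like this:

    # Minimal Structured Streaming sketch; all paths, schema fields, and
    # table names are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("order-events-stream").getOrCreate()

    # Read raw JSON order events as they land in S3 (hypothetical location).
    raw = (
        spark.readStream
        .format("json")
        .schema("order_id STRING, warehouse STRING, status STRING, event_ts TIMESTAMP")
        .load("s3://example-bucket/raw/order-events/")
    )

    # Keep completed orders and derive a processing date for partitioning.
    completed = (
        raw.filter(F.col("status") == "COMPLETED")
           .withColumn("processing_date", F.to_date("event_ts"))
    )

    # Append to a Delta table; the checkpoint gives the stream fault
    # tolerance across restarts.
    query = (
        completed.writeStream
        .format("delta")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/order-events/")
        .outputMode("append")
        .toTable("analytics.completed_order_events")
    )

    query.awaitTermination()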
Who You Are
  • Bachelor's or Master's degree in a related engineering discipline such as Computer Engineering, Software Engineering, or Data Science.
  • 6+ years of professional experience in data engineering with a proven track record of delivering complex data solutions in a production environment.
  • High proficiency in Python for data engineering and AI/ML development, and in SQL.
  • Experience with dbt (Data Build Tool) for data transformation and analytics engineering.
  • Deep experience in AWS cloud services (S3, EMR, Glue, Lambda, EC2).
  • Extensive hands‑on experience with Databricks platform and Apache Spark for large‑scale data processing.
  • Experience with Fivetran or similar ELT tools for data integration and pipeline automation.
  • Proven experience building MLOps pipelines and deploying machine learning models in production using Databricks MLflow (a minimal sketch follows this list).
  • Experience with data processing frameworks and tools such as RDBMS, NoSQL, and high-scale databases.
  • Proven experience building data pipelines using Databricks, Fivetran, and related modern data stack tools.
  • Experience in real‑time and streaming architectures using Databricks Structured Streaming and related technologies.
  • Strong knowledge of Data Warehouse concepts and modern data lake architectures on AWS.
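
As a minimal sketch of the MLflow requirement above (the experiment path, registered model name, and synthetic training data are made-up placeholders, and any real pipeline here will differ), logging and registering a model with MLflow could look roughly like this:

    # Minimal MLflow sketch: train, log, and register a model.
    # Experiment path, metric, and registered model name are illustrative only.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    mlflow.set_experiment("/Shared/delivery-time-prediction")  # hypothetical path

    X, y = make_regression(n_samples=1_000, n_features=8, noise=0.2, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        model = RandomForestRegressor(n_estimators=200, random_state=42)
        model.fit(X_train, y_train)

        mae = mean_absolute_error(y_test, model.predict(X_test))
        mlflow.log_param("n_estimators", 200)
        mlflow.log_metric("mae", mae)

        # Log the model and register it so it can later be promoted through
        # the model registry as part of an MLOps pipeline.
        mlflow.sklearn.log_model(
            model,
            artifact_path="model",
            registered_model_name="delivery_time_predictor",  # hypothetical name
        )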
Nice-to-haves
  • Familiarity with Qlik Sense/Qlik View for business intelligence and data visualization.
  • Advanced experience with dbt for complex data transformations and data modeling.
  • Experience with Databricks Delta Lake for data lake management and ACID transactions.
  • Experience with Databricks MLflow for machine learning lifecycle management.
  • Knowledge of Apache Airflow or Databricks Workflows for orchestration.
  • Experience with AWS data services and infrastructure.
  • Experience with data governance and data quality frameworks within the Databricks ecosystem.
  • Knowledge of containerization technologies (Docker, Kubernetes) for ML deployment.
  • Experience with Databricks Unity Catalog for data governance and security.
What We Offer
  • AI‑assisted coding & LLM licenses – build smarter, faster
  • Paid vacation in your first year – no waiting period
  • Birthday day off – celebrate you
  • Flexible hours & open kitchen – fuel creativity on your schedule
  • Private health insurance – from day one
  • Shuttle service or monthly gas card – your commute, covered
  • Meal card – lunches on us
  • Learning budget – for courses, books, and conferences
  • Massage Fridays – at our Cyberpark office
  • Unlimited fun at the office – anytime
  • Mac or PC – your choice of tools
  • English courses – grow your communication skills globally

Ready to join us and shape the robotic future of fulfillment?

Apply now.