DevOps/Data Engineer; Snowflake, DBT, Airflow
Listed on 2025-12-19
IT/Tech
Data Engineer, Cloud Computing
About the Role
We are seeking a hands‑on DevOps/Data Engineer with strong experience across Snowflake, DBT, and Airflow, combined with solid DevOps skills for automation, CI/CD, and cloud‑based infrastructure. This role supports both data engineering workloads and DevOps initiatives, so flexibility and versatility are key.
The ideal candidate understands data pipelines end‑to‑end, from ingestion and transformation through deployment and observability, and can clearly articulate their individual contributions on past projects.
Key Responsibilities
✔ Design, build, and maintain scalable data pipelines using Snowflake, DBT, and Airflow
✔ Develop CI/CD pipelines and automate deployments for data workloads and cloud infrastructure
✔ Contribute to Dimensional Modeling and ELT/ETL transformations
✔ Deploy and manage infrastructure using IaC tools (Terraform is preferred)
✔ Collaborate with data, analytics, and platform teams to support business use cases
✔ Document architecture and ensure monitoring, logging, and alerting are in place
✔ Work across DevOps and Data Engineering functions depending on workload demands
Required Skills & Experience
Must have:
🔹 2–6 years of hands‑on experience in DevOps and/or data engineering
🔹 Project examples demonstrating what you personally built or contributed to
🔹 Strong experience with:
- Snowflake
- DBT
- Airflow
- CI/CD pipelines (GitLab, GitHub Actions, Jenkins, etc.)
- Python
- SQL
Nice to have:
➕ Terraform (IaC)
➕ Docker / Kubernetes
➕ Experience with performance monitoring and observability