Engineer, Cloud Computing
Cincinnati, Hamilton County, Ohio, 45208, USA
Listed on 2026-02-07
Listing for: Tata Consultancy Services
Full Time position
Job specializations:
- IT/Tech: Cloud Computing, Data Engineer
Job Description
Overview
Role: Databricks Engineer
Minimum of 6-8 years of relevant experience in the following areas.
Responsibilities:
- Develop and maintain CI/CD pipelines for Azure Databricks deployments (Azure DevOps/YAML and related tools).
- Automate deployment and configuration of Databricks clusters, jobs, libraries, notebooks, and environment promotions.
- Implement and manage the Databricks environment for performance, cost efficiency, and scalability; optimize cluster sizing and autoscaling.
- Collaborate with Data Engineers/Scientists/Software Engineers to design, deploy, and scale data pipelines and models on Databricks.
- Monitor and troubleshoot clusters, pipelines, jobs, and associated workflows; integrate Azure Monitor/Log Analytics for visibility and metrics.
- Implement Infrastructure as Code (IaC) using Terraform, ARM templates, or Bicep to manage Azure resources and Databricks artifacts.
- Design and maintain backup, recovery, and DR strategies for Databricks environments.
- Support security best practices: RBAC/ABAC, managed identities, Key Vault secrets, compliance controls, and Unity Catalog governance.
- Produce clear documentation, templates, and runbooks; enable smooth KT to BAU teams.
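To illustrate the CI/CD responsibility above, here is a minimal, hypothetical Azure DevOps YAML pipeline that deploys a folder of notebooks to a Databricks workspace. The variable group name, source folder, and target workspace path are placeholders for illustration, not details from the listing.

```yaml
# Hypothetical pipeline: deploy notebooks to an Azure Databricks workspace.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

variables:
  - group: databricks-dev  # assumed variable group holding host/token secrets

steps:
  - script: pip install databricks-cli
    displayName: Install Databricks CLI

  # Import the repo's notebooks folder into the workspace, overwriting old versions.
  - script: databricks workspace import_dir notebooks /Shared/etl --overwrite
    displayName: Deploy notebooks
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```

A production pipeline of the kind described would typically add one stage per environment (Dev/UAT/Prod) with approval gates between promotions.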
Requirements:
- Proven experience as a DevOps/Platform Engineer in cloud environments, with a strong focus on Azure.
- Hands-on experience automating Databricks: clusters, libraries, jobs, notebooks, and environment promotions via pipelines.
- Proficiency in Unity Catalog and Databricks data governance.
- Familiarity with Apache Spark (PySpark, Spark SQL).
- Strong IaC skills: Terraform, ARM, or Bicep.
- Scripting (Python/PowerShell) and Git (branching strategies, conflict resolution).
- Observability with Azure Monitor, Log Analytics; pipeline orchestration with Azure Data Factory.
- Security best practices for cloud (RBAC, managed identities, Key Vault).
- Workspace & environment engineering: Standardize Dev/UAT/Prod workspaces (networking/Private Link, VNets, secure egress), service principals, secret scopes, and Key Vault integrations.
- Unity Catalog & governance: Configure catalogs/schemas, RBAC, lineage, and data access patterns aligned to guardrails.
- CI/CD for Databricks: Implement YAML-based Azure DevOps pipelines to automate notebook/job deployments, dependencies, environment promotions, and approvals/compliance checks.
- IaC for Databricks & Azure: Author reusable Bicep/Terraform modules for workspaces, clusters/pools, UC objects, and supporting Azure resources.
- Observability & reliability: Establish monitoring/alerting for jobs, clusters, SLAs, autoscaling, and cost controls, plus automation for disaster-recovery scenarios.
- Documentation & handover: Patterns, pipeline templates, IaC modules, and operational runbooks for BAU, plus KT during the first two releases.
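As a sketch of the "IaC for Databricks & Azure" item, the following hypothetical Terraform fragment uses the Databricks provider to manage an autoscaling cluster; the workspace variable, cluster name, node type, and Spark version are placeholder assumptions, not requirements from the listing.

```terraform
# Hypothetical Terraform sketch: a reusable Databricks cluster definition.
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

provider "databricks" {
  host = var.workspace_url  # assumed input variable per environment
}

# Autoscaling cluster sized for cost efficiency, with auto-termination
# so idle compute does not accrue charges.
resource "databricks_cluster" "etl" {
  cluster_name            = "etl-shared"
  spark_version           = "14.3.x-scala2.12"
  node_type_id            = "Standard_DS3_v2"
  autotermination_minutes = 20

  autoscale {
    min_workers = 1
    max_workers = 4
  }
}
```

In practice such a resource would sit inside a reusable module parameterized per environment, matching the Dev/UAT/Prod standardization described above.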
- Bachelor of Computer Science or equivalent.
- Deep expertise in Azure Databricks, Azure Data Lake Storage, Azure Resource Manager (ARM), Microsoft Entra, Azure SQL Database.
- Excellent problem-solving and collaboration across cross-functional teams.
Salary Range: $120,000-$140,000 per year
Benefits:
- Discretionary Annual Incentive
- Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans
- Family Support: Maternal & Parental Leaves
- Insurance Options: Auto & Home Insurance, Identity Theft Protection
- Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement
- Time Off: Vacation, Time Off, Sick Leave & Holidays
- Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing