Data Engineer - Jazwares
Job in Fort Lauderdale, Broward County, Florida, 33336, USA
Listed on 2026-02-13
Listing for: 3Core Systems Inc.
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Science Manager
Job Description
Plantation, United States | Posted on 02/10/2026
As a Data Engineer, your duties will be focused on three core areas:
1. Data Platform Infrastructure & DevOps
- Administer, optimize, and scale our Databricks Lakehouse environment, ensuring high performance, cost efficiency, and operational excellence.
- Develop, maintain, and enhance our data platform infrastructure and security configurations using Terraform, including provisioning Databricks workspaces, SQL Endpoints, Unity Catalog objects, and network components.
- Manage and enforce Unity Catalog for data governance, access control, and metadata management (see the access-grant sketch after this list).
- Implement and manage CI/CD pipelines for data pipelines, dbt projects, and infrastructure deployments using GitHub Actions.
- Automate operational tasks, monitoring, and alerting for the data platform.
- Implement and enforce DevSecOps principles, working closely with security teams to ensure compliance and manage/rotate credentials securely.
- Design and implement data ingestion patterns into Databricks using Delta Lake, optimizing for large-scale data processing and storage (see the ingestion sketch after this list).
- Develop, optimize, and troubleshoot complex Spark jobs (PySpark/Scala) for data processing and transformation within Databricks.
- Manage and extend data ingestion pipelines using Airbyte (or similar modern tools like Fivetran, Stitch), including configuring connectors, monitoring syncs, and ensuring data quality and reliability from diverse source systems (e.g., ERP, CRM, marketing, supply chain).
- Orchestrate and automate data pipelines and dbt models using Databricks Workflows, potentially integrating with other orchestration tools (see the dbt sketch after this list).
- Collaborate with Analytics Engineers to translate business requirements into efficient and scalable data models using dbt (Data Build Tool).
- Implement dbt best practices for modularity, testing, documentation, and version control, ensuring seamless integration with Databricks.
- Partner effectively with Analytics Engineers, Data Scientists, and business stakeholders to deliver high‑quality data solutions.
- Provide technical guidance and mentorship to junior team members, and champion data engineering best practices, code quality, and documentation standards.
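As an illustration of the Unity Catalog governance duty above, here is a minimal sketch of applying access grants from a Databricks notebook or job. The catalog, schema, table, and group names are hypothetical placeholders and are not part of this posting.

```python
# Minimal sketch: applying Unity Catalog grants from PySpark.
# Assumes a Databricks cluster attached to a Unity Catalog-enabled workspace;
# the object and group names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Allow the analytics group to discover the catalog and schema.
spark.sql("GRANT USE CATALOG ON CATALOG sales TO `analytics-engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA sales.core TO `analytics-engineers`")

# Grant read-only access on a curated table.
spark.sql("GRANT SELECT ON TABLE sales.core.orders TO `analytics-engineers`")

# Review what is currently granted on the table.
spark.sql("SHOW GRANTS ON TABLE sales.core.orders").show(truncate=False)
```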
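The Delta Lake ingestion duty could look something like the following PySpark sketch, which lands a batch of raw JSON files and upserts them into a Delta table with MERGE. The path, table name, and `order_id` key column are hypothetical and only illustrate the pattern.

```python
# Minimal sketch: batch ingestion of raw JSON into a Delta table via MERGE upsert.
# Assumes a Databricks cluster (or delta-spark configured locally); the landing
# path, target table, and key column are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

RAW_PATH = "/mnt/raw/erp/orders/"      # hypothetical landing zone
TARGET_TABLE = "sales.core.orders"     # hypothetical Unity Catalog table

# Read the latest batch of raw files.
updates = spark.read.json(RAW_PATH)

if not spark.catalog.tableExists(TARGET_TABLE):
    # First load: create the Delta table directly from the batch.
    updates.write.format("delta").saveAsTable(TARGET_TABLE)
else:
    # Incremental load: upsert on the business key so reprocessed
    # files do not create duplicates.
    target = DeltaTable.forName(spark, TARGET_TABLE)
    (
        target.alias("t")
        .merge(updates.alias("u"), "t.order_id = u.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
```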
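For the dbt orchestration duty, one hedged sketch (assuming dbt-core 1.5+ and its programmatic dbtRunner entry point, with a Databricks profile already configured) of triggering a dbt build from a Python task, for example as one step of a Databricks Workflows job. The project directory and model selector are hypothetical.

```python
# Minimal sketch: invoking dbt programmatically from a Python task,
# e.g. as one step of a Databricks Workflows job.
# Assumes dbt-core >= 1.5; the project directory and selector are hypothetical.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Build (run + test) only the staging models for the ERP source.
result: dbtRunnerResult = runner.invoke(
    ["build", "--project-dir", "/Workspace/Repos/data/dbt_project",
     "--select", "staging.erp"]
)

# Fail the task (and therefore the workflow run) if any model or test failed.
if not result.success:
    raise RuntimeError(f"dbt build failed: {result.exception}")
```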
Requirements:
- Education: Bachelor's degree in Computer Science, Data Engineering, or a related technical field required.
- Experience: 5+ years of progressive experience as a Data Engineer, with a strong focus on cloud-based data platforms.
- Deep Databricks Expertise: Extensive experience with Spark (PySpark/Scala), Delta Lake, Unity Catalog, Databricks SQL, and platform administration.
- Data Modeling: Proven experience with dbt for data modeling, transformation, and testing.
- Infrastructure as Code (IaC): Strong proficiency with Terraform for defining, provisioning, and managing cloud infrastructure and Databricks resources as code.
- DevOps & CI/CD: Expertise in Git and GitHub Actions for version control and implementing robust CI/CD pipelines.
- Programming: Proficiency in SQL and at least one programming language (Python strongly preferred, Scala is a plus).
- Data Architecture: Solid understanding of data warehousing, data lake, and Lakehouse architectures.