Senior Data Engineer IRC278994

Job in Poland
Listing for: Hitachi Vantara Corporation
Full Time position
Listed on 2025-12-18
Job specializations:
  • Software Development
    Data Engineer
Job Description
Position: Senior Data Engineer IRC278994
Location: Poland

Description

GlobalLogic is searching for a motivated, results‑driven, and innovative software engineer to join our project team at a dynamic startup specializing in pet insurance. Our client is a leading global holding company that is developing an advanced pet insurance claims clearing solution designed to speed up and simplify veterinary invoice reimbursement for pet owners.

You will be working on a cutting‑edge system built from scratch, leveraging Azure cloud services and adopting a low‑code paradigm. The project adheres to industry best practices in quality assurance and project management, aiming to deliver exceptional results.

We are looking for an engineer who thrives in collaborative, supportive environments and is passionate about making a meaningful impact on people’s lives. If you are enthusiastic about building innovative solutions and contributing to a cause that matters, this role could be an excellent fit for you.

Only for candidates from Poland and Ukraine.

Requirements
  • Strong hands‑on experience with Azure Databricks (Delta Live Tables (DLT) pipelines, Lakeflow Connect, Unity Catalog, Time Travel, Delta Sharing) for large‑scale data processing and analytics
  • Proficiency in data engineering with Apache Spark, using PySpark, Scala, or Java for data ingestion, transformation, and processing
  • Proven expertise in the Azure data ecosystem:
    Databricks, ADLS Gen2, Azure SQL, Azure Blob Storage, Azure Key Vault, Azure Service Bus/Event Hub, Azure Functions, Azure Data Factory, and Azure Cosmos DB
  • Solid understanding of Lakehouse architecture, Modern Data Warehousing, and Delta Lake concepts
  • Experience designing and maintaining config‑driven ETL/ELT pipelines with support for Change Data Capture (CDC) and event/stream‑based processing (a CDC‑style upsert is sketched after this list)
  • Proficiency with RDBMS (MS SQL, MySQL, PostgreSQL) and NoSQL databases
  • Strong understanding of data modeling, schema design, and database performance optimization
  • Practical experience working with various file formats, including JSON, Parquet, and ORC
  • Familiarity with machine learning and AI integration within the data platform context
  • Hands‑on experience building and maintaining CI/CD pipelines (Azure DevOps, GitLab) and automating data workflow deployments
  • Solid understanding of data governance, lineage, and cloud security (Unity Catalog, encryption, access control)
  • Strong analytical and problem‑solving skills with attention to detail
  • Excellent teamwork and communication skills
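
For orientation, here is a minimal PySpark sketch of the kind of work the requirements above describe: ingesting a batch of raw JSON claim events and applying a CDC‑style upsert into a Delta table with MERGE. It is a sketch only, assuming a Databricks runtime where pyspark and delta‑spark are preinstalled; all paths, table names, and column names are hypothetical placeholders, not part of the actual project.

    # Minimal sketch only; paths, table names, and columns are hypothetical.
    # Assumes a Databricks cluster where pyspark and delta-spark are available.
    from pyspark.sql import SparkSession, functions as F
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    # Ingest a batch of raw claim events (JSON) from a landing zone, e.g. ADLS Gen2.
    updates = (
        spark.read.json("/mnt/landing/claims/")
        .select(
            F.col("claimId").alias("claim_id"),
            F.col("status"),
            F.col("amount").cast("decimal(10,2)").alias("amount"),
        )
        .withColumn("ingested_at", F.current_timestamp())
    )

    # CDC-style upsert: update rows for existing claims, insert new ones.
    # The Delta MERGE runs atomically, and Time Travel keeps prior versions queryable.
    target = DeltaTable.forName(spark, "silver.claims")
    (
        target.alias("t")
        .merge(updates.alias("s"), "t.claim_id = s.claim_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )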
Job responsibilities
  • Design, implement, and optimize scalable and reliable data pipelines using Databricks, Spark, and Azure data services
  • Develop and maintain config‑driven ETL/ELT solutions for both batch and streaming data (see the sketch after this list)
  • Ensure data governance, lineage, and compliance using Unity Catalog and Azure Key Vault
  • Work with Delta tables, Delta Lake, and Lakehouse architecture to ensure efficient, reliable, and performant data processing
  • Collaborate with developers, analysts, and data scientists to deliver trusted datasets for reporting, analytics, and machine learning use cases
  • Integrate data pipelines with event‑based and microservice architectures leveraging Service Bus, Event Hub, and Functions
  • Design and maintain data models and schemas optimized for analytical and operational workloads
  • Identify and resolve performance bottlenecks, ensuring cost efficiency and maintainability of data workflows
  • Participate in architecture discussions, backlog refinement, estimation, and sprint planning
  • Contribute to defining and maintaining best practices, coding standards, and quality guidelines for data engineering
  • Perform code reviews, provide technical mentorship, and foster knowledge sharing within the team
  • Continuously evaluate and enhance data engineering tools, frameworks, and processes in the Azure environment
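
As a sketch of the config‑driven, streaming side of the role: a small loop that drives Databricks Auto Loader (the cloudFiles source, available only on Databricks runtimes) from a per‑source configuration, writing each stream incrementally to its own Delta table with an independent checkpoint. The config entries, paths, and table names below are hypothetical; a real pipeline would load them from a config file or table rather than hard‑code them.

    # Illustrative sketch of a config-driven streaming ingest on Databricks.
    # The sources list, paths, and target tables are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # One entry per source; in practice this lives in external configuration.
    sources = [
        {"path": "/mnt/landing/claims/",   "format": "json",    "target": "bronze.claims"},
        {"path": "/mnt/landing/invoices/", "format": "parquet", "target": "bronze.invoices"},
    ]

    for src in sources:
        # Auto Loader (cloudFiles) incrementally discovers new files in the path
        # and infers/tracks the schema at the given schemaLocation.
        stream = (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", src["format"])
            .option("cloudFiles.schemaLocation", "/mnt/checkpoints/" + src["target"] + "/schema")
            .load(src["path"])
        )

        # Each source gets its own checkpoint so streams restart independently.
        # availableNow processes the backlog and stops; drop it for a continuous stream.
        query = (
            stream.writeStream.format("delta")
            .option("checkpointLocation", "/mnt/checkpoints/" + src["target"] + "/state")
            .trigger(availableNow=True)
            .toTable(src["target"])
        )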
What we offer

Culture of caring.

At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate…

Position Requirements
10+ years of work experience