
Senior Specialist, Data Engineer

Job in Melbourne, Brevard County, Florida, 32935, USA
Listing for: Harris Geospatial Solutions
Full Time position
Listed on 2026-02-08
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Job Description & How to Apply Below

Overview

L3Harris is dedicated to recruiting and developing high-performing talent who are passionate about what they do. Our employees are unified in a shared dedication to our customers’ mission and quest for professional growth.

L3Harris provides an inclusive, engaging environment designed to empower employees and promote work-life success. Fundamental to our culture is an unwavering focus on values, dedication to our communities, and commitment to excellence in everything we do.

L3Harris is the Trusted Disruptor in defense tech. With customers’ mission-critical needs always in mind, our employees deliver end-to-end technology solutions connecting the space, air, land, sea and cyber domains in the interest of national security.

Job Title: Senior Specialist, Data Engineer

Job Code: 33790

Job Location: Melbourne, FL; Remote

Job Schedule: 9/80: Employees work 9 out of every 14 days (totaling 80 hours worked) and have every other Friday off.

Job Description

The L3Harris Enterprise Data and AI team is seeking a Data Engineer with experience managing enterprise-level data life cycle processes. This role includes overseeing data ETL/ELT pipelines, ensuring adherence to data standards, maintaining data frameworks, conducting data cleansing, orchestrating data pipelines, and ensuring data consolidation. The selected individual will play a pivotal role in maintaining ontologies, building scalable data solutions, and developing dashboards to provide actionable insights for the enterprise within Palantir Foundry.

This position will support the company’s modern data platform, Unified Data Layer, focusing on data pipeline development and maintenance, data platform design, documentation, and user training. The goal is to ensure seamless access to data for all levels of the organization, empowering decision-makers with clean, reliable data.

Essential Functions:

  • Design, build, and maintain robust data pipelines to ensure reliable data flow across the enterprise.
  • Maintain data pipeline schedules, orchestrate workflows, and monitor the overall health of data pipelines to ensure continuous data availability.
  • Create, update, and optimize data connections, datasets, and transformations to align with business needs.
  • Troubleshoot and resolve data sync issues, ensuring consistent and correct data flow from source systems.
  • Collaborate with cross-functional teams to uphold data quality standards and ensure accurate data is available for use.
  • Utilize Palantir Foundry to establish data connections to source applications, extract and load data, and design complex logical data models that meet functional and technical specifications.
  • Develop and manage data cleansing, consolidation, and integration mechanisms to support big data analytics at scale.
  • Build visualizations using Palantir Foundry tools and assist business users with testing, troubleshooting, and documentation creation, including data maintenance guides.
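To illustrate the kind of cleansing-and-consolidation work the functions above describe, here is a minimal sketch in pandas (one of the tools the posting names). The table names, columns, and cleansing rules are hypothetical examples, not part of the role's actual data model; in practice this logic would live inside a Foundry pipeline transform.

```python
# Hypothetical cleanse-and-consolidate step: normalize join keys, drop bad or
# duplicate records, then join two raw extracts into one consolidated dataset.
import pandas as pd


def cleanse_and_consolidate(orders: pd.DataFrame, customers: pd.DataFrame) -> pd.DataFrame:
    """Standardize, deduplicate, and join two raw source extracts."""
    # Normalize the join key: strip whitespace and unify casing.
    orders = orders.assign(customer_id=orders["customer_id"].str.strip().str.upper())
    customers = customers.assign(customer_id=customers["customer_id"].str.strip().str.upper())

    # Cleanse: drop rows missing the join key, remove duplicate order records.
    orders = orders.dropna(subset=["customer_id"]).drop_duplicates(subset=["order_id"])

    # Consolidate: attach customer attributes to the order facts.
    return orders.merge(customers, on="customer_id", how="left")


if __name__ == "__main__":
    raw_orders = pd.DataFrame(
        {"order_id": [1, 1, 2], "customer_id": [" c1 ", " c1 ", "C2"], "amount": [10.0, 10.0, 25.0]}
    )
    raw_customers = pd.DataFrame({"customer_id": ["C1", "C2"], "region": ["US", "EU"]})
    print(cleanse_and_consolidate(raw_orders, raw_customers))
```

The same shape of logic (normalize, deduplicate, join) carries over to PySpark or Foundry Pipeline Builder transforms at enterprise scale.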

Qualifications:

  • Bachelor’s Degree and a minimum of 6 years of prior Palantir experience, or a Graduate Degree and a minimum of 4 years of prior Palantir experience. In lieu of a degree, a minimum of 10 years of prior Palantir experience.
  • 4+ years of experience with Data Pipeline development or ETL tools such as Palantir Foundry, Azure Data Factory, SSIS, or Python.
  • 4+ years of experience in Data Integration.
  • 4+ years of experience designing and developing Data Pipelines in Palantir Foundry Pipeline Builder or Code Repositories, using PySpark and Spark SQL, and deploying data build/sync schedules in Palantir.

Preferred Additional Skills:

  • Understanding of BI (Business Intelligence) & DW (Data Warehouse) development methodologies.
  • Hands-on experience with the Snowflake cloud data platform.
  • Experience with Python, Pandas, Databricks, JavaScript, TypeScript, or other scripting languages.
  • Experience with ETL tools such as Palantir Foundry, ADF (Azure Data Factory), SSIS, Informatica, or Talend is preferred.
  • Working knowledge of connecting to and extracting data from various ERP applications such as Oracle EBS, SAP ECC/S4, Deltek Costpoint, REST APIs, and more.
  • Experience with AI tools such as OpenAI, Palantir AIP, Snowflake Cortex or…
Position Requirements

10+ years of work experience