Palantir Foundry Developer (Full Stack Engineer Focus)
Listed on 2026-02-16
IT/Tech
Data Engineer, Data Analyst, Data Security, Cloud Computing
Job Description
ABOUT ARCOP: ARCOP is the supply chain cooperative for Arby’s. We support one of the largest quick-service restaurant networks in the country by managing purchasing, distribution, and the technology that powers critical operations. Our team is driving a digital transformation built around Palantir Foundry, creating tools that replace spreadsheets with real-time systems and smarter workflows.
POSITION SUMMARY:
We are seeking a technically strong Palantir Foundry Developer with a core focus on data engineering and platform development to design, build, and scale Foundry-based solutions that support critical supply chain workflows, analytics, and enterprise integrations. This role is hands-on and execution-oriented, requiring deep expertise in Python, PySpark, and the Palantir Foundry ecosystem.
The ideal candidate brings strong engineering fundamentals, experience developing production-grade data pipelines and applications, and the ability to collaborate effectively with product, analytics, and operations teams. Full-stack development experience is preferred, with TypeScript knowledge considered a significant plus.
This role is designed with a clear growth trajectory for high performers, offering the opportunity to evolve into a Staff-level technical leader, providing architectural direction, mentoring junior engineers, and helping shape platform standards and best practices as the team scales.
KEY RESPONSIBILITIES:
- Data Engineering and Integration
- Develop and manage complex data pipelines using PySpark and Foundry’s Pipeline Builder
- Manage versioned code in code repositories with branching strategies and reviews for reproducibility
- Align data modeling and transformation logic with evolving business needs
- Model datasets and objects to align with the Foundry Ontology, ensuring consistent semantics and maintaining lineage for traceability and audits
- Own ingestion frameworks via API, SFTP, and AWS Transfer Family; standardize adapters, schemas, and validation for partner feeds
- Define data quality rules (nulls, referential checks, business constraints), configure Data Health monitors, and build alerting/triage workflows to detect schedule failures and changelog job issues
- Lead incident handling for failed builds/health checks; use Data Lineage and platform logs to isolate faults, coordinate fixes, and document postmortems
- Translate business logic from purchasing, logistics, compliance, and finance into reproducible transforms and durable datasets that power Contour/Slate analyses and operational tools
- Implement fine-grained permissions and group strategies; follow ARCOP’s pipeline approval process and maintain audit trails for production changes and access escalations
- Cloud Infrastructure Management
- Operate and maintain AWS infrastructure components including S3, EC2, VPC, Lambda, IAM, Secrets Manager, and CloudWatch
- Serve as the technical owner for inbound and outbound data workflows across ARCOP and external partners
- Palantir Development
- Design and build applications in Palantir Foundry using tools such as Workshop, Quiver, Slate, Contour, and Ontology
- Maintain and enhance operational tools that support analytics, forecasting, and supply chain workflows
- Manage user access, application performance, data lineage, and platform health
- Cross-Functional Collaboration
- Partner with ARCOP teams in purchasing, logistics, compliance, and finance to develop practical, scalable solutions
- Support troubleshooting and ongoing improvement efforts to ensure system accuracy and usability
REQUIREMENTS:
- Minimum 3 years of professional experience if you have Palantir platform expertise; otherwise, 5+ years in data engineering or software development
- Proficiency in Python and PySpark; working knowledge of TypeScript is a plus
- Experience with AWS services including data storage, compute, security, and automation
- Strong background in data architecture, integrations, and system design