Core Platform Engineer
Listed on 2026-01-29
IT/Tech
Data Engineer, Cloud Computing, Data Security, Cybersecurity
Overview
Outstanding contract opportunity! A well-known Financial Services Company is looking for a Core Platform Engineer in Charlotte, NC or Iselin, NJ.
Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package! Our client has been around for over 150 years and continues to innovate in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Contract Duration: 24 Months
Primary Role:
Build and maintain secure, scalable infrastructure and services.
- Support a highly available and scalable infrastructure comprising object storage, OpenShift, Spark, Iceberg, YuniKorn, and Trino
- Monitor for configuration drift and enforce infrastructure policies.
- Configure and monitor Big Data ecosystem components, integrating with various BI and observability tools
- Build automated regression and performance test suites to verify the health of all platform components
- Monitor system health and enforce runtime policies
- Implement and manage security protocols, including OAuth authentication, TLS encryption, and role-based access control (RBAC).
- Conduct regular maintenance, including cluster scaling, and perform regular security audits.
Programming & Scripting
- Languages: Python, Bash, Shell, SQL, Java (basic), Scala (for big data, good to have)
- Automation & Scripting: Python scripting for automation, Linux shell scripting
Operating Systems & Containers
- System programming, performance tuning, networking
- OCP, Kubernetes (K8s), Helm, Terraform, container orchestration and deployment
Big Data & Data Engineering
- Frameworks: Nexus One, Apache Spark, Hadoop, Hive, Trino, Iceberg
- ETL Tools: Apache Airflow, NiFi (good to have)
- Data Pipelines: Batch and streaming (Kafka, Flink)
- Object Storage: S3, NetApp StorageGRID
- Data Formats: Parquet, Avro, ORC, JSON, CSV
- Frameworks or LLM modeling
- Model Ops: MLflow, Kubeflow, SageMaker
- Data Science: Feature engineering, model deployment, inference pipelines
- Access Models: RBAC (Role-Based Access Control), ABAC (Attribute-Based Access Control)
- Data Protection: Encryption at rest and in transit, TLS/SSL, KMS (Key Management Services)
- Compliance: GDPR, HIPAA (if applicable), IAM policies
- Design Principles: Microservices, event-driven architecture, serverless
- Scalability: Load balancing, caching (Redis, Memcached), horizontal scaling
- High Availability: Failover strategies, disaster recovery, monitoring (Prometheus, Grafana)