Cloud Developer
Listed on 2026-03-03
IT/Tech
Data Engineer, Cloud Computing
Description
Position:
Cloud Developer
Melbourne, FL (Patrick SFB)
Make an Impact with Leidos
At Leidos, we deliver innovative solutions through the dedication of our diverse and talented teams. United by a shared commitment to our customers' success, we empower our people, support our communities, and operate sustainably. Our Mission, Vision, and Values guide every aspect of our work, ensuring we do the right thing for our customers, our people, and the world around us.
Thrive in an Impactful Environment
The Leidos Defense Sector offers a broad portfolio of systems, solutions, and services across land, sea, air, space, and cyberspace. We support critical defense missions with capabilities in enterprise and mission IT, large‑scale intelligence systems, command and control, geospatial and data analytics, cybersecurity, logistics, training, and intelligence operations. Our teams tackle the world's toughest security challenges for customers with "can't fail" missions.
Job Overview
Leidos Defense Sector is seeking a Cloud Developer to join our engineering team to design, build, and maintain scalable cloud‑native solutions. The ideal candidate combines strong serverless development skills (Lambda, API Gateway, Step Functions), hands‑on data engineering experience (ETL/ELT, streaming, data lakes), and expertise with XML transformation tools. This role partners with product and data teams to deliver reliable, performant systems that mobilize trusted data for customer services.
Primary Responsibilities
- Design, develop, and deploy cloud‑native applications and microservices on AWS using serverless and container‑based architectures.
- Implement, maintain, and optimize XML transformation workflows using tools such as XSLT, Apache Daffodil (DFDL), Saxon, or Altova MapForce.
- Build and maintain data ingestion, transformation, and storage pipelines (batch and streaming) to support analytics and ML workloads.
- Develop APIs and backend services using serverless frameworks (Lambda, API Gateway) and event‑driven architectures (SNS, SQS, EventBridge, Kinesis).
- Collaborate with data engineers to model, optimize, and operationalize data workflows across S3, Glue, Redshift, Snowflake, and related platforms.
- Automate infrastructure provisioning and deployments using IaC tools (CloudFormation, Terraform, CDK) and CI/CD pipelines.
- Monitor, troubleshoot, and optimize system performance, cost, and reliability; implement observability using CloudWatch, X‑Ray, and Prometheus/Grafana.
- Apply security best practices across the stack (IAM, KMS, VPC, encryption, least privilege) and ensure compliance with DoD security policies.
- Produce and maintain documentation and runbooks; participate in code reviews and post‑incident reviews.
- Support on‑call rotation and occasional after‑hours work as needed.
Basic Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related discipline with 4+ years of relevant experience. Additional experience, training, or certifications may substitute for a degree.
- U.S. citizenship required; must currently possess an active DoD Top Secret clearance with eligibility for Top Secret/SCI.
- IAT Level II certification (e.g., CompTIA Security+) or ability to earn within 6 months of hire.
- 2+ years of hands‑on AWS experience (Lambda, API Gateway, S3, IAM, RDS/DynamoDB, VPC, CloudWatch).
- Strong programming skills in at least one serverless‑friendly language (Python, Node.js/TypeScript, Java, or Go).
- Experience building data pipelines and working with data stores (S3, Redshift, DynamoDB, RDS).
- Familiarity with IaC tools (Terraform, CloudFormation, CDK) and CI/CD pipelines.
- Hands‑on experience with code repositories and platforms such as Git, GitLab, and GitHub.
- Knowledge of RESTful API design, event‑driven architectures, and asynchronous processing.
- Experience with observability and troubleshooting in production environments.
Preferred Qualifications
- Experience with streaming technologies (Kinesis, Kafka, Glue Streaming).
- Hands‑on experience with Apache NiFi, including custom processors, flow versioning, and integration with Kafka, HDFS/S3, and downstream ETL systems.
- Familiarity with serverless orchestration (Step Functions) and workflow automation.
- AWS…