Full Stack Data Engineer - Senior
Remote, USA (must reside in CA, AZ, WA, OR, or NV)
Listed on 2026-01-08
IT/Tech
Cloud Computing, Data Engineer
OVERVIEW
In this role you will be deeply involved in the design, development, and deployment of secure, high-quality software solutions. Your role will focus on integrating security and automation throughout the software development lifecycle (SDLC), with an emphasis on writing clean, maintainable code and building infrastructure that supports CI/CD pipelines, automated testing, and cloud-native delivery.
You’ll implement and enforce DevSecOps best practices tailored for Azure, contribute to infrastructure as code, and work closely with developers, testers, and cloud engineers to ensure code is secure, scalable, and production-ready from day one.
This role requires a hands‑on engineer who thrives in a collaborative environment and is passionate about code quality, automation, and secure cloud development.
We are looking for leaders who are energized by creative and critical thinking, building and sustaining high-performing teams, getting results the right way, and fostering continuous learning.
- Duration: 6+ month contract; not C2C eligible
- Location: Remote, but must reside in California, Arizona, Washington, Oregon, or Nevada; working hours will be PST. Preference for California.
- Rate: $60/hr - $80/hr DOE
- Must be able to work in the United States without sponsorship
RESPONSIBILITIES
- Build data pipelines: Create, maintain, and optimize workloads from development to production for specific use cases, with a focus on cloud-native solutions and modern frameworks
- Develop the most efficient and cost-effective implementation, leveraging reusable features where possible
- Drive operational excellence, including but not limited to Incident Management, process automation leveraging AI, and ensuring smooth deployments for your technology products/platform features
- Monitor and manage software configuration changes to anticipate and address data reliability and customer satisfaction issues, leveraging cloud monitoring tools and practices
- Coordinate sustaining support for multiple application platforms or business processes, ensuring seamless integration and operation in a cloud environment
- Apply significant knowledge of IT and healthcare industry trends
- Execute the analysis and remediation of root causes, including deficiencies in technology, process or resource capabilities
- Work in an agile/DevSecOps pod model alongside solution leads, data modelers, analysts, business partners, and other developers in the delivery of data
- Support monitoring and tuning of application code to ensure optimal availability, performance, and utilization of resources
- Provide technical expertise working with Analysts and Business Users to implement complex and varied functional specifications into technical designs
REQUIREMENTS
- Requires a bachelor’s degree in Computer Science, Information Technology, Management Information Systems, or a related field (or equivalent experience), with a minimum of 3 years of relevant experience in enterprise application support and cloud-based solution delivery
- Experience with a cloud platform, preferably Azure (or AWS or GCP), and its related technical stack, including ADLS, Synapse, Azure Data Factory, etc.
- Experience with Snowflake and/or Databricks
- Solid experience with JavaScript, along with CSS responsive design practices
- Strong technical understanding of data modeling (Data Vault 2.0), data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques
- Hands-on experience with data management technologies such as Informatica PowerCenter/IICS, Collibra, Reltio Master Data Management, dbt Cloud, dbt Core, Denodo, and GoldenGate or Striim replication
- Working knowledge of testing tools and systems, as well as scheduling software (Tidal, Control-M)
- Basic experience working with data governance and data security, specifically partnering with information stewards and privacy and security officers to move data pipelines into production with appropriate data quality, governance, and security standards and certification
- Proficiency in Unix command-line operations, including directory navigation, file manipulation, shell scripting, and Python, along with utilities such as awk and sed
- Hands‑…