Data Infrastructure Software Engineer, Central Software
Job in Waltham, Middlesex County, Massachusetts, 02254, USA
Listing for: Boston Dynamics
Full Time position, listed on 2025-12-01
Job specializations:
- Software Development: Data Engineer, Software Engineer
Job Description
Data Infrastructure Software Engineer, Central Software
Boston Dynamics is seeking a data infrastructure software engineer to join the Central Software (CSW) team. The role focuses on developing and maintaining robust cloud-based data pipelines and other big data solutions for use across the company, including integration with robots. The solutions you develop will help expand the reach and capabilities of our advanced mobile robots.
Boston Dynamics is at the forefront of mobile robotics, tackling challenging problems in the field and expanding automation solutions for industrial applications and warehouse logistics.
Responsibilities
- Design, develop, and maintain scalable and robust data pipelines using Apache Airflow and other big data technologies (a minimal illustrative sketch follows at the end of this listing).
- Optimize existing data systems for performance, reliability, and cost-effectiveness.
- Collaborate with machine learning engineers and other software engineers to respond to data needs and solve problems with data.
- Troubleshoot and resolve issues related to data availability, performance, and accuracy.
- Monitor data quality and integrity, implementing processes for data validation and error handling.
- Participate in code reviews, contributing to a high standard of code quality and best practices.
- Research and evaluate new technologies and tools to improve our data platform.
- Contribute to the overall architecture and strategy for data infrastructure.
- Participate in our agile development process, coordinating work with others, identifying challenges, and communicating progress regularly.
- Mentor and upskill peers and other contributors across the organization.
Requirements
- 5+ years of professional experience delivering data infrastructure solutions to end users.
- Proven ability to design, develop, and optimize efficient ETL/ELT pipelines for large-scale data ingestion and transformation (e.g., Apache Airflow).
- In-depth knowledge and hands-on experience with big data technologies such as Apache Spark, Hadoop, Kafka, Flink, or similar distributed systems.
- Expertise in relational databases (e.g., PostgreSQL, MySQL).
- Experience with major cloud providers like AWS, Google Cloud Platform (GCP), or Microsoft Azure, including services related to data storage, processing, and analytics.
- Proficiency in Python.
- Familiarity with Git version control and comfort working in a Linux development environment.
- Bachelor’s degree in Engineering, Computer Science, or another technical field.
- Experience with C++ or Rust.
- Familiarity with containerization (Docker, Kubernetes).
- Mid-Senior level
- Full-time
- Engineering and Information Technology
- Automation Machinery Manufacturing
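The Airflow pipeline, data-validation, and error-handling work described in the responsibilities above can be pictured with a short sketch. This is a minimal, hypothetical example rather than code from Boston Dynamics: the DAG name (ingest_robot_telemetry), task names, and telemetry fields are placeholders, and it assumes Apache Airflow 2.4+ with the TaskFlow API.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False)
def ingest_robot_telemetry():
    """Hypothetical ETL pipeline: extract raw records, validate them, load the clean rows."""

    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw robot telemetry from an upstream source (e.g., object storage).
        return [
            {"robot_id": "spot-01", "battery_pct": 87},
            {"robot_id": "spot-02", "battery_pct": None},
        ]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Simple data-quality gate: drop rows with missing fields (one basic form of error handling).
        return [r for r in records if r["battery_pct"] is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing validated rows to a warehouse table.
        print(f"loading {len(records)} validated records")

    # Chain the tasks so Airflow builds the extract -> validate -> load dependency graph.
    load(validate(extract()))


# Instantiating the decorated function registers the DAG with Airflow's scheduler.
ingest_robot_telemetry()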