
Machine Learning Engineer

Job in Coos Bay, Coos County, Oregon, 97458, USA
Listing for: Twilio
Full Time position
Listed on 2026-02-21
Job specializations:
  • IT/Tech
    Data Engineer, Machine Learning/ ML Engineer, AI Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: $125,000 – $150,000 USD per year
Job Description & How to Apply Below

Who we are

At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work and our strong culture of connection and global inclusion mean that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day.

As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

We use Artificial Intelligence (AI) technologies to maintain an efficient, fair and transparent hiring process. Our hiring process is never completely automated, and uses AI in conjunction with our recruiting professionals.

See yourself at Twilio

Join the team as Twilio’s next Machine Learning Engineer.

About the job

Join Twilio’s rapidly growing AI & Data Platform team as a Machine Learning Engineer. You will design, build, and operate the cloud-native data and ML infrastructure that powers every customer interaction, enabling Twilio’s product teams and customers to move from raw events to real-time intelligence. This is a hands-on, builder-focused role that offers clear technical ownership, mentoring, and growth inside a company defining the future of communications with AI.

Responsibilities
  • Architect, implement, and maintain scalable data pipelines and feature stores for batch and real-time workloads.
  • Build reproducible ML training, evaluation, and inference workflows using modern orchestration and MLOps tooling.
  • Integrate event streams from Twilio products (e.g., Messaging, Voice, Segment) into unified, analytics-ready datasets.
  • Monitor, test, and improve data quality, model performance, latency, and cost.
  • Partner with product, data science, and security teams to ship resilient, compliant services.
  • Automate deployment with CI/CD, infrastructure-as-code, and container orchestration best practices.
  • Produce clear documentation, dashboards, and runbooks; share knowledge through code reviews and brown-bag sessions.
  • Embrace Twilio’s “We are Builders” values by taking ownership of problems and driving them to completion.
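
To make the pipeline and feature-store responsibilities above concrete, here is a minimal, framework-free Python sketch that aggregates raw interaction events into per-customer features. It is purely illustrative: the event schema, field names, and channels are hypothetical, and a production pipeline would run under an orchestrator such as Airflow or Dagster and write to a real feature store.

```python
from collections import defaultdict
from datetime import datetime

def build_features(events):
    """Aggregate raw interaction events into per-customer features.

    Each event is a dict: {"customer_id": str, "channel": str, "ts": ISO-8601 str}.
    Returns {customer_id: {"event_count": int, "channels": set, "last_seen": datetime}}.
    """
    features = defaultdict(
        lambda: {"event_count": 0, "channels": set(), "last_seen": None}
    )
    for e in events:
        f = features[e["customer_id"]]
        f["event_count"] += 1
        f["channels"].add(e["channel"])
        ts = datetime.fromisoformat(e["ts"])
        # Track the most recent interaction per customer.
        if f["last_seen"] is None or ts > f["last_seen"]:
            f["last_seen"] = ts
    return dict(features)

# Hypothetical sample events for illustration only.
events = [
    {"customer_id": "c1", "channel": "sms", "ts": "2026-01-01T10:00:00"},
    {"customer_id": "c1", "channel": "voice", "ts": "2026-01-02T09:30:00"},
    {"customer_id": "c2", "channel": "sms", "ts": "2026-01-01T12:00:00"},
]
feats = build_features(events)
```

The same shape of logic, run as a scheduled batch job with idempotent writes, is what "analytics-ready datasets" typically means in practice.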
Qualifications

Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!


Required:

  • B.S. in Computer Science, Data Engineering, Electrical Engineering, Mathematics, or related field—or equivalent practical experience.
  • 3–5 years building and operating data or ML systems in production.
  • Proficient in Python and SQL; comfortable with software engineering fundamentals (testing, version control, code reviews).
  • Hands-on experience with ETL/ELT orchestration tools (e.g., Airflow, Dagster) and cloud data warehouses (Snowflake, BigQuery, or Redshift).
  • Familiarity with ML lifecycle tooling such as MLflow, SageMaker, Vertex AI, or similar.
  • Working knowledge of Docker and Kubernetes and at least one major cloud platform (AWS, GCP, or Azure).
  • Understanding of data modeling, distributed computing concepts, and streaming frameworks (Spark, Flink, or Kafka Streams).
  • Strong analytical thinking, communication skills, and a demonstrated sense of ownership, curiosity, and continuous learning.
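
As a concrete picture of the streaming-framework requirement, here is a toy tumbling-window aggregation in plain Python — the kind of computation Spark, Flink, or Kafka Streams performs at scale over unbounded streams. The event tuples and window size are made up for illustration.

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window_start, key), mimicking a streaming
    engine's tumbling window; `events` are (epoch_seconds, key) tuples."""
    counts = Counter()
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return counts

# Hypothetical stream: two SMS events in the first minute,
# one SMS and one voice event in the second.
stream = [(0, "sms"), (10, "sms"), (65, "sms"), (70, "voice")]
counts = tumbling_window_counts(stream, window_seconds=60)
```

Real streaming engines add what this sketch omits: event-time watermarks, late-data handling, and fault-tolerant state.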

Desired:

  • Experience with Twilio Segment, Kafka/Kinesis, or other high-throughput event buses.
  • Exposure to infrastructure-as-code (Terraform, Pulumi) and GitHub-based CI/CD pipelines.
  • Practical knowledge of generative AI workflows, foundation-model fine-tuning, or vector databases.
  • Contributions to open-source data/ML projects or published technical presentations/blogs.
  • Domain experience in communications, marketing automation, or customer engagement analytics.
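
To sketch what the vector-database bullet refers to, here is a brute-force cosine-similarity lookup in plain Python. Production systems use approximate-nearest-neighbor indexes rather than a linear scan, and the document vectors below are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest(query, corpus):
    """Return the key of the corpus vector most similar to `query`."""
    return max(corpus, key=lambda k: cosine_similarity(query, corpus[k]))

# Hypothetical 3-dimensional embeddings keyed by document id.
corpus = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.0, 1.0, 0.0],
    "doc_c": [0.7, 0.7, 0.0],
}
best = nearest([0.9, 0.1, 0.0], corpus)
```

A vector database packages this retrieval step behind an index so it stays fast at millions of embeddings.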
Location

This role is remote, but candidates based in CA, CT, NJ, NY, PA, or WA are not eligible for hire.

Travel

We prioritize connection and opportunities to build relationships…
