
Senior Data Engineer

Remote / Online - Candidates ideally in Idaho, USA
Listing for: Avaya Corporation
Remote/Work from Home position
Listed on 2025-12-06
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Salary/Wage Range or Industry Benchmark: USD 128,200 - 157,000 per year
Job Description

About Avaya

Location: Virtual, US

Avaya is an enterprise software leader that helps the world’s largest organizations and government agencies forge unbreakable connections.

The Avaya Infinity™ platform unifies fragmented customer experiences, connecting the channels, insights, technologies, and workflows that together create enduring customer and employee relationships.

We believe success is built through strong connections – with each other, with our work, and with our mission. At Avaya, you'll find a community that values your contributions and supports your growth every step of the way.

You’ll build and scale the real-time and batch data platform that powers a large enterprise contact center solution. Our products demand ultra-low-latency decisioning for live interactions and cost-efficient big-data analytics for historical insights. We’re primarily on Azure today and expanding to GCP and AWS. Data is the backbone for our AI features and product intelligence.

Primary charter: contact center analytics and operational intelligence, delivered as an AI-enabled enterprise analytics platform. Our vision is a flexible, AI-enabled data platform that unifies contact center KPIs, customer and business outcomes, and AI quality/performance metrics, and applies AI pervasively to deliver advanced features that help users leverage rich contact center data alongside business data and AI performance monitoring to drive decisions end to end.

Team & Tech
  • Cloud: Azure (primary), expanding to GCP/AWS
  • Platform: Databricks, Spark (batch + streaming), Airflow, Apache Superset, Kafka
  • Data & governance: Delta Lake, Unity Catalog, Delta Sharing
  • Infra & delivery: Terraform, Docker/Kubernetes, CI/CD (GitHub Actions/Azure DevOps)
  • Interfaces: REST/gRPC; schemas with Avro/Protobuf
  • Processing alternatives: Apache Flink/Apache Beam where appropriate; custom processors/services in Go for specialized low-latency needs
  • App stack: React + TypeScript (frontend); Go (preferred) and Java (backend)
  • Focus: Real-time streaming, lakehouse analytics, reliability, and cost efficiency (see the sketch after this list)
  • Experimentation & metrics: MLflow for experiment tracking and AI quality/performance metrics
  • Tooling integration: MCP (Model Context Protocol) to expose/consume data tools for agents
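To ground the Focus bullet, here is a minimal sketch of the kind of pipeline this stack implies: a Spark Structured Streaming job reading contact-center events from Kafka into a Delta table. The broker, topic, schema, checkpoint path, and table name are all hypothetical placeholders, not details from this posting.

```python
# A minimal sketch, assuming a Databricks/Spark 3.1+ runtime with the Kafka
# and Delta connectors available; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("interactions-stream").getOrCreate()

# Hypothetical schema for a contact-center interaction event.
event_schema = StructType([
    StructField("interaction_id", StringType()),
    StructField("channel", StringType()),
    StructField("agent_id", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "interactions")               # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as bytes; parse the JSON payload into columns.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

# Checkpointing gives the Delta sink exactly-once semantics across restarts.
(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/interactions")  # placeholder path
    .outputMode("append")
    .toTable("lakehouse.curated.interactions")          # placeholder table
    .awaitTermination()
)
```

Batch ETL/ELT on the same tables follows the same read/transform/write shape, with `spark.read` in place of `readStream`.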
What you’ll do
  • Design, build, and operate low‑latency streaming pipelines (Kafka, Spark Structured Streaming) and robust batch ETL/ELT on Databricks Lakehouse.
  • Establish reliable orchestration and dependency management (Airflow), with strong SLAs and on‑call readiness for business‑critical data flows (see the DAG sketch after this list).
  • Model, optimize, and document curated datasets and interfaces that serve analytics, product features, and AI workloads.
  • Implement data quality checks, observability, and backfills; drive root‑cause analysis and incident prevention (a minimal quality gate is sketched below).
  • Partner with application teams (Go/Java), analytics, and ML/AI to ship data products into production.
  • Build and maintain datasets and services that power RAG pipelines and agentic AI workflows (tool‑use/function calling).
  • When Spark/Databricks isn’t optimal, design and operate custom processors/services in Go to meet strict latency or specialized transformation requirements.
  • Instrument prompt/response and token usage telemetry to support LLMOps evaluation and cost optimization; provide datasets for labeling and golden sets (see the MLflow sketch below).
  • Improve performance and cost (storage/compute), review code, and raise engineering standards.
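For the orchestration bullet above, here is one minimal shape an Airflow DAG with retries and an SLA window can take; the DAG id, schedule, and task callables are hypothetical, and this assumes Airflow 2.x.

```python
# A minimal sketch, assuming Airflow 2.x (the `schedule` argument; older
# versions call it `schedule_interval`). All ids and callables are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    ...  # pull raw events from the source system


def load() -> None:
    ...  # publish curated Delta tables


with DAG(
    dag_id="contact_center_daily",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # daily at 02:00
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        # A missed SLA window is recorded and can page the on-call.
        "sla": timedelta(hours=2),
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```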
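For the quality-checks bullet, a sketch of a simple post-load quality gate; the table name and threshold are placeholders, and teams often reach for Delta Live Tables expectations or a framework like Great Expectations for the same job.

```python
# A minimal sketch of a post-load quality gate; table name and threshold
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

df = spark.table("lakehouse.curated.interactions")  # placeholder table

total = df.count()
null_ids = df.filter(col("interaction_id").isNull()).count()

# Fail the run loudly so orchestration can alert and trigger a backfill.
if total == 0 or null_ids / total > 0.001:
    raise ValueError(f"quality gate failed: {null_ids}/{total} null interaction ids")
```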
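And for the telemetry bullet: since the stack already includes MLflow, one option is to record per-call token usage as run metrics. The helper, model name, and counts here are invented for illustration.

```python
# A minimal sketch, assuming MLflow is available; the helper, model name,
# and token counts are invented placeholders.
import mlflow


def log_llm_call(model: str, prompt_tokens: int, completion_tokens: int) -> None:
    """Record one LLM call's token usage as MLflow metrics for cost analysis."""
    with mlflow.start_run(run_name="llm-telemetry"):
        mlflow.log_param("model", model)
        mlflow.log_metric("prompt_tokens", prompt_tokens)
        mlflow.log_metric("completion_tokens", completion_tokens)
        mlflow.log_metric("total_tokens", prompt_tokens + completion_tokens)


log_llm_call("example-model", prompt_tokens=812, completion_tokens=164)
```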
Security & Compliance
  • Design data solutions aligned to enterprise security, privacy, and compliance requirements (e.g., SOC 2, ISO 27001, GDPR/CCPA as applicable), partnering with Security/Legal.
  • Implement RBAC/ABAC and least‑privilege access; manage service principals, secrets, and key rotation; enforce encryption in transit and at rest.
  • Govern sensitive data: classification, PII handling, masking/tokenization, retention/archival, lineage, and audit logging across pipelines and storage (a masking sketch follows this list).
  • Build observability for data security and quality; support incident response, access reviews, and audit readiness.
  • Embed controls in CI/CD (policy checks, dependency vulnerability scanning) and ensure infra‑as‑code adheres to guardrails.
  • Partner with security engineering on…
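As one concrete shape for the governance bullet above, a sketch of column-level masking before a curated table is published; the table, columns, and salt are placeholders (a real salt would come from a secret store), and this complements rather than replaces Unity Catalog access controls.

```python
# A minimal sketch of PII masking on publish; all names are placeholders and
# the salt would come from a secret store, not a literal.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, concat, lit, sha2

spark = SparkSession.builder.getOrCreate()

raw = spark.table("lakehouse.raw.customers")  # placeholder table
salt = "REPLACE_WITH_SECRET"                  # placeholder; fetch from a vault

# Salted hashing keeps columns joinable across tables without exposing raw PII.
masked = (
    raw.withColumn("email_hash", sha2(concat(col("email"), lit(salt)), 256))
       .withColumn("phone_hash", sha2(concat(col("phone"), lit(salt)), 256))
       .drop("email", "phone")
)

masked.write.mode("overwrite").saveAsTable("lakehouse.curated.customers")
```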
Position Requirements
10+ years of work experience