Data Engineer - Enterprise Products
Listed on 2026-02-18
IT/Tech
Data Engineer, Data Security
Location: New York, NY (Hybrid)
Department: Digital Experience
Reports To: Technology Lead, Enterprise Products
About Lincoln Center
Lincoln Center for the Performing Arts is a global home for music, dance, theater, film, and culture. We steward a 16-acre campus in New York City, collaborate with world-renowned Resident Arts Organizations, and bring the arts to millions each year.
We are building a modern digital ecosystem that expands access to culture, strengthens audience relationships, and supports long-term sustainability. A unified, trusted data platform is foundational to this work, enabling Lincoln Center to operate as a connected campus while respecting the autonomy, privacy, and unique missions of our partner organizations.
Role Overview
Lincoln Center is seeking a Data Engineer to help design, build, and operate an enterprise data platform that powers analytics, reporting, personalization, and future AI initiatives across the campus.
Sitting within the Enterprise Products area, this role focuses on shared platforms and systems that serve the entire organization. You will be responsible for building secure, scalable, and cost-efficient data pipelines and datasets on AWS that unify information from ticketing, CRM, fundraising, marketing, finance, and digital engagement systems.
You’ll collaborate closely with product, business development, marketing, finance, and external partners to ensure the data platform is reliable, governed, observable, and trusted across the institution.
What You’ll Do
- Design, build, and operate scalable ETL/ELT pipelines on AWS supporting batch and near-real-time data use cases.
- Ingest and process data using AWS-native services such as S3, Glue, Lambda, Step Functions, and CloudWatch, alongside modern data tooling.
- Integrate data from enterprise systems including ticketing, CRM, fundraising, finance, marketing platforms, and web/app analytics.
- Develop dimensional and semantic data models in the warehouse that serve as trusted sources of truth across departments.
- Optimize data workflows for performance, reliability, and cost, including partitioning strategies, orchestration schedules, and compute usage.
- Implement data governance standards using AWS IAM and warehouse controls, including role-based access, masking, tagging, and metadata management.
- Partner with teams across Marketing, Business Development, Finance, and Programming to deliver high-impact data products such as cohorts, funnels, donor analytics, and event performance insights.
- Improve data quality and reliability through automated testing, monitoring, alerting, and clear data ownership.
- Reduce technical debt by automating manual processes, deprecating legacy pipelines, and standardizing data access patterns.
- Contribute to enterprise-wide data standards, documentation, and cloud best practices as the platform evolves.
Who You Are
- A pragmatic, cloud-oriented data engineer who values reliability, clarity, and long-term maintainability.
- Comfortable designing and operating data systems in AWS production environments.
- Fluent in SQL and Python, with strong instincts for data modeling, performance tuning, and cost awareness.
- Deeply invested in data quality, security, and governance.
- Comfortable collaborating across teams and explaining technical concepts to non-technical partners.
- Motivated by building foundational platforms that enable teams to move faster and make better decisions.
- Energized by applying modern data practices in a multi-stakeholder, mission-driven environment.
What You’ll Bring
- Minimum 5 years of experience in data engineering or closely related roles.
- Strong SQL skills (CTEs, window functions, performance tuning) and solid Python experience for data processing and testing.
- Hands-on experience with AWS data services, including S3, Glue, Lambda, IAM, and CloudWatch.
- Experience working with modern data warehouses and transformation tools such as Snowflake and dbt.
- Experience designing dimensional and semantic models, incremental pipelines, and slowly changing dimensions.
- Working knowledge of data governance concepts including RBAC/ABAC, masking, tagging, and PII handling (GDPR/CCPA).
- Familiarity with workflow…