Senior Technical Reporting Manager, ETL Reporting Engineer - AWS Data Platform/Tableau
Location: Henderson, Clark County, Nevada, 89077, USA
Listing for: DataSync
Position type: Full Time
Listed on: 2026-01-01
Job specializations:
- IT/Tech: Data Engineer, Database Administrator
Job Description & How to Apply Below
Senior Technical Reporting Manager, ETL Reporting Engineer-AWS Data Platform/Tableau
Design and maintain enterprise-scale data pipelines using AWS cloud services, handling schema evolution in data feeds and delivering analytics-ready datasets to BI platforms. This role requires hands-on expertise with the full AWS data stack and proven ability to build enterprise-grade data solutions that scale.
Responsibilities:
- Build and orchestrate ETL/ELT workflows using Apache Airflow for complex data pipeline management
- Develop serverless data processing with AWS Lambda and EventBridge for real-time transformations
- Create scalable ETL jobs using AWS Glue with automated schema discovery and catalog management
- Execute database migrations and continuous replication using AWS DMS
- Design and optimize Amazon Redshift data warehouses and Amazon Athena federated queries
- Implement streaming data pipelines with Apache Kafka for real-time ingestion
- Manage schema changes in data feeds with automated detection and pipeline adaptation
- Create data feeds for Tableau and Business Objects reporting platforms
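To illustrate the schema-change responsibility above, here is a minimal pure-Python sketch of automated drift detection and record adaptation. All names (`detect_schema_drift`, `adapt_record`) are hypothetical, not part of any actual DataSync pipeline; a production system would typically consult the Glue Data Catalog for the expected schema instead of a hard-coded list.

```python
def detect_schema_drift(expected_cols, record):
    """Compare an incoming record's fields against the expected schema.
    Returns (added, missing) column-name sets. Illustrative only."""
    incoming = set(record)
    expected = set(expected_cols)
    return incoming - expected, expected - incoming


def adapt_record(expected_cols, record, fill=None):
    """Project a record onto the expected schema: drop unknown columns
    and fill missing ones with a default so downstream loads don't break."""
    return {col: record.get(col, fill) for col in expected_cols}


# Example: the feed gained a column ("region") and lost one ("phone").
expected = ["id", "name", "phone"]
row = {"id": 7, "name": "Ada", "region": "NV"}

added, missing = detect_schema_drift(expected, row)
# added == {"region"}, missing == {"phone"}

adapted = adapt_record(expected, row)
# adapted == {"id": 7, "name": "Ada", "phone": None}
```

In practice the `added`/`missing` sets would drive pipeline adaptation, e.g. issuing an `ALTER TABLE` on the Redshift target or updating the Glue catalog, rather than silently dropping columns.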
Requirements:
- Airflow: DAG development, custom operators, workflow orchestration, production deployment
- Lambda: serverless functions, event triggers, performance optimization
- EventBridge: event-driven architecture, rule configuration, cross-service integration
- Glue: ETL job development, crawlers, Data Catalog, schema management
- DMS: database migrations, continuous replication, heterogeneous database integration
- Redshift: cluster management, query optimization, workload management
- Athena: serverless analytics, partitioning strategies, federated queries
- Tableau (Expert Level): develop and maintain data models, data cubes, queries, data visualizations, and reports
Education And Experience:
- 5+ years AWS data platform development
- 3+ years production Airflow experience with complex workflow orchestration
- Proven experience managing high-volume data feeds (TB+ daily) with schema evolution
- Database migration expertise using DMS for enterprise-scale projects
- BI integration experience with Tableau and Business Objects platforms
Key Competencies:
- Design fault-tolerant data pipelines with automated error handling and recovery
- Handle schema changes in real-time and batch data feeds without pipeline disruption
- Optimize performance across streaming and batch processing architectures
- Implement data quality validation and monitoring frameworks
- Coordinate cross-platform data synchronization and lineage tracking
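The fault-tolerance competency above can be sketched, under simplifying assumptions, as a retry wrapper with a dead-letter fallback. The function name and signature are hypothetical; a real deployment would lean on Airflow task retries and an SQS dead-letter queue rather than an in-process list.

```python
import time


def run_with_retry(task, payload, attempts=3, dead_letter=None, delay=0.0):
    """Run a pipeline step with automated retry. After the final failure,
    divert the payload to a dead-letter list for later replay (a simplified
    stand-in for an SQS dead-letter queue). Illustrative sketch only."""
    for attempt in range(1, attempts + 1):
        try:
            return task(payload)
        except Exception:
            if attempt == attempts:
                if dead_letter is not None:
                    dead_letter.append(payload)
                return None
            time.sleep(delay)  # back off before the next attempt


# Example: a flaky transform that fails once, then succeeds.
calls = {"n": 0}


def flaky_transform(x):
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return x * 2


dlq = []
result = run_with_retry(flaky_transform, 21, dead_letter=dlq)
# result == 42; dlq stays empty because the retry recovered
```

Payloads that land in the dead-letter store keep the pipeline running while preserving failed records for inspection and replay, which is the "recovery" half of the competency.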
Preferred Qualifications:
- AWS Data Analytics Specialty or Solutions Architect Professional certification
- Experience with Infrastructure as Code (Terraform, CloudFormation)
- Knowledge of DataOps practices and CI/CD for data pipelines
- Containerization experience (Docker, ECS, EKS) for data workloads
We are an equal opportunities employer and welcome applications from all qualified candidates.
Position Requirements:
10+ years work experience