Amazon Database Administrator
Listed on 2026-02-10
IT/Tech
Data Engineer, Cloud Computing, Data Analyst, Data Warehousing
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Amazon Database Administrator with 8+ years of experience, specializing in Amazon Redshift and Google BigQuery. The contract is of unspecified length, pays $60/hour, and requires onsite work in Chicago three days a week.
United States • $ USD
#Scala #Storage #Redshift #Schema Design #Disaster Recovery #Cloud #Database Monitoring #Observability #Airflow #Talend #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Automation #BigQuery #Clustering #Data Security #Amazon Redshift #Debugging #Data Modeling #Data Warehouse #Compliance #Data Engineering #Database Administration #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Monitoring #Deployment #Security #dbt (data build tool) #DBA (Database Administrator)
Responsibilities:
- Manage, monitor, and optimize Amazon Redshift and Google BigQuery environments.
- Implement robust backup, recovery, and disaster recovery strategies.
- Oversee user management, access controls, and security compliance.
- Perform capacity planning, cost monitoring, and performance tuning to ensure efficient query execution and storage usage.
- Partner with data engineering and analytics teams to design and implement scalable data warehouse architectures.
- Evaluate and recommend partitioning, clustering, and distribution strategies for optimal performance.
- Contribute to data modeling and best practices for schema design to support analytics workloads.
- Build scripts and automation for database monitoring, deployments, and maintenance tasks.
- Optimize ETL/ELT pipelines with a focus on query performance and cost efficiency.
- Implement observability tools for query execution and workload management.
Requirements:
- 8+ years of experience as a Database Administrator, Data Engineer, or similar role.
- Proven hands-on expertise with Amazon Redshift and Google BigQuery (administration, performance tuning, workload management).
- Strong knowledge of SQL, query optimization, and data modeling techniques.
- Experience with ETL/ELT pipelines and integration with tools like dbt, Airflow, or Talend.
- Solid understanding of cloud infrastructure (AWS, GCP) and data security best practices.
- Strong problem‑solving and debugging skills in complex distributed systems.
Pay:
Up to $60.00 per hour.
Work Location:
Hybrid remote in Chicago, IL 60617.