Senior Data Engineer
Listed on 2026-01-06
IT/Tech
Data Engineer, Data Science Manager
Location: Charleston - 997 Morrison Drive, Suite 402
Business
Our Growth, Your Opportunity
At Maymont Homes, our success starts with people: our residents and our team. We are transforming the single‑family rental experience through innovation, quality, and genuine care. With more than 20,000 homes across 47+ markets, 25+ build‑to‑rent communities, and continued expansion on the horizon, we are more than a leader in the industry; we are a company that puts people and communities at the heart of everything we do.
As part of Brookfield, Maymont Homes is growing quickly and making a lasting impact. We are also proud to be Certified™ by Great Place to Work®, a recognition based entirely on feedback from our employees. This honor reflects the culture of trust, collaboration, and belonging that makes Maymont a place where people thrive.
Join a purpose‑driven team where your work creates opportunity, sparks innovation, and helps families across the country feel truly at home.
Senior Data Engineer
Reports to: Manager, Data Engineering
FLSA Status: Exempt
Responsibilities
The Senior Data Engineer is responsible for designing, building, and optimizing scalable data systems and pipelines that enable robust analytics and data‑driven decision‑making. This role requires strong technical expertise in modern data engineering tools, cloud environments, and programming languages. The ideal candidate demonstrates excellence in developing efficient data solutions, ensuring data quality, and collaborating across cross‑functional teams to deliver business value.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Minimum of three years of experience in data engineering.
- Strong problem‑solving skills and attention to detail.
- Excellent communication and collaboration skills in a cross‑functional environment.
- Technical expertise in data architecture, ETL processes, and big data systems.
- Proficiency in Python and SQL; familiarity with Spark is a plus.
- Hands‑on experience with modern data warehouses such as Redshift or Snowflake.
- Proficiency with cloud data services (AWS), particularly AWS S3 and Redshift.
- Expertise with ETL orchestration tools (e.g., Airflow, Glue).
- Familiarity with Git and version control best practices.
- Experience working with Salesforce data models, APIs, and integration tools to support data ingestion, synchronization, and analytics.
- Certification in Data Engineering or Cloud Architecture (AWS, Azure, Snowflake, or GCP).
- Experience with OpenSearch, document databases, and other non‑relational systems.
- Familiarity with CI/CD pipelines and infrastructure‑as‑code tools such as Docker and Terraform.
- Experience using Jira and Scrum boards to manage sprints, track progress, and collaborate effectively within agile development teams.
- Experience with data governance, quality frameworks, and metadata management.
- Exposure to data visualization tools (e.g., Power BI, Tableau) for understanding downstream data use.
Key Responsibilities
- Design, develop, and maintain data pipelines that extract, transform, and load data between Salesforce, APIs, and cloud data platforms (e.g., Snowflake, Redshift, Databricks).
- Develop and maintain efficient, scalable, and well‑documented code using modern engineering practices.
- Partner with data analysts, scientists, and architects to improve data accessibility and reliability across the organization.
- Implement and maintain ETL frameworks and automation to ensure timely and accurate data delivery.
- Research emerging data technologies to enhance scalability, performance, and automation of data systems.
- Pipeline performance: deliver reliable, scalable, and optimized data pipelines that meet SLAs.
- Data quality: ensure accuracy, completeness, and timeliness of data across systems.
- Collaboration: work closely with cross‑functional teams, including data scientists, software engineers, and business stakeholders, to support analytical and operational initiatives.
- Innovation: contribute to the evaluation and adoption of new tools, technologies, and processes for improving data infrastructure.
- Documentation &…