
Senior Data Engineer

Job in Charleston, Charleston County, South Carolina, 29408, USA
Listing for: Maymont Homes
Full Time position
Listed on 2026-02-21
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range: USD 60,000 - 80,000 per year
Job Description & How to Apply Below

Location

Charleston - 997 Morrison Drive, Suite 402

Business

Our Growth, Your Opportunity

At Maymont Homes, our success starts with people: our residents and our team. We are transforming the single‑family rental experience through innovation, quality, and genuine care. With more than 20,000 homes across 47+ markets, 25+ build‑to‑rent communities, and continued expansion on the horizon, we are more than a leader in the industry—we are a company that puts people and communities at the heart of everything we do.

As part of Brookfield, Maymont Homes is growing quickly and making a lasting impact. We are also proud to be Certified™ by Great Place to Work®, a recognition based entirely on feedback from our employees. This honor reflects the culture of trust, collaboration, and belonging that makes Maymont a place where people thrive. Join a purpose‑driven team where your work creates opportunity, sparks innovation, and helps families across the country feel truly at home.

Job Description

Job Title: Senior Data Engineer

Reports to: Manager, Data Engineering

FLSA Status: Exempt

Primary Responsibilities: The Senior Data Engineer is responsible for designing, building, and optimizing scalable data systems and pipelines that enable robust analytics and data‑driven decision‑making. This role requires strong technical expertise in modern data engineering tools, cloud environments, and programming languages. The ideal candidate demonstrates excellence in developing efficient data solutions, ensuring data quality, and collaborating across cross‑functional teams to deliver business value.

Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering.
  • Strong problem‑solving skills and attention to detail.
  • Excellent communication and collaboration skills in a cross‑functional environment.
Essential Skills
  • Technical Expertise: Strong understanding of data architecture, ETL processes, and big data systems.
  • Programming: Proficiency in Python and SQL; familiarity with Spark is a plus.
  • Data Warehousing: Hands‑on experience with modern data warehouses such as Redshift or Snowflake.
  • Cloud Platforms: Proficiency with AWS data services, particularly S3 and Redshift.
  • ETL Tools: Experience with orchestration tools such as Airflow or Glue.
  • Version Control: Familiarity with Git and best practices.
  • Salesforce Integration: Experience with Salesforce data models, APIs, and integration tools.
Preferred Qualifications
  • Certification in Data Engineering or Cloud Architecture (AWS, Azure, Snowflake, or GCP).
  • Experience with OpenSearch, document databases, and other non‑relational systems.
  • DevOps Practices: Familiarity with CI/CD pipelines, containerization, and infrastructure‑as‑code tools such as Docker and Terraform.
  • Agile Practices: Experience using Jira and Scrum boards.
  • Experience with data governance, quality frameworks, and metadata management.
  • Exposure to data visualization tools such as Power BI or Tableau.
Essential Job Functions
  • Design, develop, and maintain data pipelines that extract, transform, and load data between Salesforce, APIs, and cloud platforms (Snowflake, Redshift, Databricks).
  • Develop and maintain efficient, scalable, and well‑documented code using modern engineering practices.
  • Partner with data analysts, scientists, and architects to improve data accessibility and reliability.
  • Implement and maintain ETL frameworks and automation to ensure timely and accurate data delivery.
  • Research emerging data technologies to enhance scalability, performance, and automation.
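As a rough illustration of the first job function above, here is a minimal extract‑transform‑load sketch in Python. The record fields are hypothetical (loosely modeled on Salesforce‑style API names), and an in‑memory SQLite table stands in for a warehouse target such as Redshift or Snowflake; a production pipeline would instead use an orchestrator like Airflow and a warehouse connector.

```python
import sqlite3

# Extract: hypothetical records as they might arrive from a Salesforce or REST API.
raw_records = [
    {"Id": "a01", "MonthlyRent__c": "1850.00", "City__c": "Charleston"},
    {"Id": "a02", "MonthlyRent__c": "1625.50", "City__c": "Columbia"},
]

def transform(record):
    # Normalize field names and cast rent to a float for analytics.
    return (record["Id"], record["City__c"], float(record["MonthlyRent__c"]))

# Load: an in-memory SQLite table standing in for the warehouse target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE homes (id TEXT PRIMARY KEY, city TEXT, monthly_rent REAL)")
conn.executemany("INSERT INTO homes VALUES (?, ?, ?)",
                 [transform(r) for r in raw_records])

# Downstream analytics can then query the loaded table with plain SQL.
avg_rent = conn.execute("SELECT AVG(monthly_rent) FROM homes").fetchone()[0]
```

The extract, transform, and load stages are kept as separate steps so each can be tested, monitored, and scaled independently.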
Key Metrics & Responsibilities
  • Pipeline Performance: Deliver reliable, scalable, and optimized data pipelines that meet SLAs.
  • Data Quality: Ensure accuracy, completeness, and timeliness of data across systems.
  • Collaboration: Work closely with data scientists, software engineers, and business stakeholders.
  • Innovation: Contribute to the evaluation and adoption of new tools, technologies, and processes.
  • Documentation & Standards: Maintain technical documentation and adhere to best practices.
  • Data Solutions Development: Build and maintain enterprise data products.
  • Infrastructure Management: Automate and…