
Senior Data Engineer

Job in Charleston, Charleston County, South Carolina, 29408, USA
Listing for: Brookfield Properties
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD yearly
Job Description & How to Apply Below

**Location:** Charleston - 997 Morrison Drive, Suite 402

**Business:**

**Our Growth, Your Opportunity**

At Maymont Homes, our success starts with people: our residents and our team. We are transforming the single-family rental experience through innovation, quality, and genuine care. With more than 20,000 homes across 47+ markets, 25+ build-to-rent communities, and continued expansion on the horizon, we are more than a leader in the industry; we are a company that puts people and communities at the heart of everything we do.

As part of Brookfield, Maymont Homes is growing quickly and making a lasting impact. We are also proud to be Certified by Great Place to Work, a recognition based entirely on feedback from our employees. This honor reflects the culture of trust, collaboration, and belonging that makes Maymont a place where people thrive. Join a purpose-driven team where your work creates opportunity, sparks innovation, and helps families across the country feel truly at home.

**Job Description**

**Job Title:** Senior Data Engineer

**Reports to:** Manager, Data Engineering

**FLSA Status:** Exempt

**Primary Responsibilities:**

The Senior Data Engineer is responsible for designing, building, and optimizing scalable data systems and pipelines that enable robust analytics and data-driven decision-making. This role requires strong technical expertise in modern data engineering tools, cloud environments, and programming languages. The ideal candidate demonstrates excellence in developing efficient data solutions, ensuring data quality, and collaborating across cross-functional teams to deliver business value.

**Skills & Competencies:**

**Qualifications:**

* Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
* Minimum of 3 years of experience in data engineering.
* Strong problem-solving skills and attention to detail.
* Excellent communication and collaboration skills in a cross-functional environment.

**Essential Skills:**

* Technical Expertise: Strong understanding of data architecture, ETL processes, and big data systems.
* Programming: Strong proficiency in Python and SQL; familiarity with Spark is a plus.
* Data Warehousing: Hands-on experience with modern data warehouses such as Redshift or Snowflake.
* Cloud Platforms: Proficiency with cloud data services (AWS), particularly AWS S3 and Redshift.
* Data Engineering Tools: Expertise with ETL orchestration tools (e.g., Airflow, Glue).
* Version Control: Familiarity with Git and version control best practices.
* Salesforce Integration: Experience working with Salesforce data models, APIs, and integration tools to support data ingestion, synchronization, and analytics (a brief illustration follows this list).
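
As a rough sketch of how the Salesforce Integration and Cloud Platforms skills above fit together, the example below pulls Account records through the simple_salesforce client, stages them in S3 with boto3, and runs a Redshift COPY via psycopg2. It is illustrative only: the bucket, schema, table, IAM role, and credentials are hypothetical placeholders, not details taken from this posting.

```python
# Illustrative only: Salesforce -> S3 -> Redshift load.
# All bucket, table, role, and credential values are hypothetical placeholders.
import csv
import io

import boto3                      # AWS SDK for Python
import psycopg2                   # PostgreSQL/Redshift driver
from simple_salesforce import Salesforce


def extract_accounts(sf: Salesforce) -> list:
    """Pull a small set of Account fields via the Salesforce SOQL API."""
    result = sf.query_all("SELECT Id, Name, CreatedDate FROM Account")
    return result["records"]


def stage_to_s3(records: list, bucket: str, key: str) -> None:
    """Write records as CSV and upload to S3 so Redshift can COPY them."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["Id", "Name", "CreatedDate"],
        extrasaction="ignore",    # drop Salesforce metadata keys like 'attributes'
    )
    writer.writeheader()
    writer.writerows(records)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue().encode("utf-8"))


def load_to_redshift(conn, bucket: str, key: str, iam_role: str) -> None:
    """COPY the staged file into a target table (assumed to already exist)."""
    with conn.cursor() as cur:
        cur.execute(
            f"""
            COPY analytics.sf_account
            FROM 's3://{bucket}/{key}'
            IAM_ROLE '{iam_role}'
            CSV IGNOREHEADER 1;
            """
        )
    conn.commit()


if __name__ == "__main__":
    sf = Salesforce(username="user@example.com", password="...", security_token="...")
    records = extract_accounts(sf)
    stage_to_s3(records, bucket="example-data-lake", key="salesforce/account.csv")
    conn = psycopg2.connect(host="example.redshift.amazonaws.com", port=5439,
                            dbname="analytics", user="etl_user", password="...")
    load_to_redshift(conn, "example-data-lake", "salesforce/account.csv",
                     iam_role="arn:aws:iam::123456789012:role/example-redshift-copy")
```

Staging to S3 and issuing a COPY is generally preferred over row-by-row inserts because Redshift loads staged files in parallel.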

**Preferred Qualifications:**

* Certification in Data Engineering or Cloud Architecture (AWS, Azure, Snowflake, or GCP).
* Experience with OpenSearch, document databases, and other non-relational systems.
* DevOps Practices: Familiarity with CI/CD pipelines, containerization, and infrastructure-as-code tools such as Docker and Terraform.
* Agile Practices: Experience using Jira and Scrum boards to manage sprints, track progress, and collaborate effectively within agile development teams.
* Experience with data governance, quality frameworks, and metadata management.
* Exposure to data visualization tools (e.g., Power BI, Tableau) for understanding downstream data use.

**Essential Job Functions:**

**Typical Day Activities:**

* Design, develop, and maintain data pipelines that extract, transform, and load data between Salesforce, APIs, and cloud data platforms (e.g., Snowflake, Redshift, Databricks); see the orchestration sketch below.
* Develop and maintain efficient, scalable, and well-documented code using modern engineering practices.
* Partner with data analysts, scientists, and architects to improve data accessibility and reliability across the organization.
* Implement and maintain ETL frameworks and automation to ensure timely and accurate data delivery.
* Research emerging data technologies to enhance scalability, performance, and automation of data systems.

**Key Metrics & Responsibilities:**

* Pipeline Performance: Deliver reliable, scalable, and optimized data pipelines that meet SLAs.
* Data Quality: Ensure…
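
For context on the pipeline and ETL-framework duties listed above, here is a minimal Airflow DAG sketch showing how daily extract, transform, and load steps might be ordered. The DAG id, schedule, and task bodies are hypothetical placeholders; Airflow is named in this posting, but this is not a description of the team's actual pipelines.

```python
# Illustrative Airflow DAG: ordering and scheduling of a daily ETL run.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from Salesforce / partner APIs.
    print("extracting source data")


def transform(**context):
    # Placeholder: clean and conform records to the warehouse schema.
    print("transforming records")


def load(**context):
    # Placeholder: COPY conformed files into Redshift/Snowflake.
    print("loading warehouse tables")


with DAG(
    dag_id="example_salesforce_to_warehouse",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",   # use schedule_interval on older Airflow 2.x releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Enforce extract -> transform -> load ordering.
    extract_task >> transform_task >> load_task
```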