Data Engineer II
Listed on 2026-02-13
IT/Tech
Data Engineer
Lennar is one of the nation's leading homebuilders, dedicated to making an impact and creating an extraordinary experience for its Homeowners, Communities, and Associates by building quality homes, providing exceptional customer service, giving back to the communities in which it works and lives, and fostering a culture of opportunity and growth for its Associates throughout their careers. Lennar has been recognized as a Fortune 500® company and consistently ranked among the top homebuilders in the United States.
Join a Company that Empowers you to Build your Future
As a Data Engineer, you are responsible for analyzing large amounts of business data, solving real-world problems, and developing metrics and business cases that enable Business Insights. This is done by leveraging data from platforms such as Jira, Portal, and Salesforce. You will work with a team of Product Managers, Software Engineers, and Business Intelligence Engineers to automate and scale the analysis and to make the data more actionable for managing the business. You will own many large datasets and implement new data pipelines that feed into or from critical data systems.
A career with purpose.
A career built on making dreams come true.
A career built on building zero-defect homes, cost management, and adherence to schedules.
Your Responsibilities on the Team
Design, implement, and support an analytical data infrastructure, applying working knowledge of Modern Data Warehouse concepts.
Design, build, and maintain efficient and scalable data pipelines and ETL processes to process large volumes of structured and unstructured data (a brief pipeline sketch follows this list).
Optimize data storage and retrieval methods to ensure performance, scalability, and cost-efficiency.
Manage AWS resources including EC2, S3, Glue, Lambda, APIs, IAM, CloudWatch, etc.
Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency.
Collaborate with Data Scientists and Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis.
Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Maintain internal reporting platforms/tools including troubleshooting and development. Interact with internal users to establish and clarify requirements in order to develop report specifications.
Work with Engineering partners to help shape and implement the development of BI infrastructure including Data Warehousing, reporting and analytics platforms.
Contribute to the development of the BI tools, skills, culture, and impact.
Write advanced SQL queries and Python code to develop solutions.
Apply working knowledge of Snowflake.
Collaborate across teams to align AI initiatives with organizational goals, drawing on an understanding of AI concepts.
Apply knowledge of continuous integration/continuous delivery (CI/CD) pipelines and work on deployments when necessary.
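To give a flavor of the pipeline work described above, here is a minimal sketch of a daily extract-and-load job, assuming Apache Airflow 2.4+; the DAG name, task names, and data sources are hypothetical placeholders, not Lennar systems.

# Hypothetical sketch of a daily extract-and-load pipeline (assumes Airflow 2.4+).
# All names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_staging(**context):
    # Placeholder: pull raw records (e.g., a Jira or Salesforce export on S3)
    # into a staging area.
    print("extracting raw data to staging")


def load_to_warehouse(**context):
    # Placeholder: apply transforms and load the curated table into the warehouse.
    print("loading curated data to the warehouse")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_staging)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load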
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in data engineering or a related role, with demonstrated success in delivering data solutions.
- Proficient in SQL, with the ability to write complex queries, perform query optimization, and conduct performance tuning (see the query sketch after this list).
- Experience with NoSQL databases, such as MongoDB, Cassandra, or DynamoDB, and an understanding of their appropriate use cases.
- Strong programming skills in Python, Java, or Scala, with experience in data processing frameworks (e.g., Apache Spark, Hadoop).
- Experience with cloud platforms (AWS, Azure, GCP) and data services, such as AWS Redshift, Azure Synapse, or Google BigQuery.
- Knowledge of big data technologies, including Hadoop, Spark, Kafka, and HBase, with experience in distributed data processing.
- Familiarity with data orchestration tools, such as Apache Airflow for scheduling and managing data workflows.
- Experience with data versioning and testing tools, such as DVC (Data Version Control) and dbt (data build tool).
- Understanding of data security practices,…
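As an illustration of the SQL proficiency noted above, here is a small sketch that runs a window-function query through the Snowflake Python connector; the connection parameters and the orders table and its columns are hypothetical placeholders.

# Hypothetical sketch: latest order per customer via a window function,
# executed with the Snowflake Python connector. All names are placeholders.
import snowflake.connector

LATEST_ORDER_PER_CUSTOMER = """
    SELECT customer_id, order_id, order_total
    FROM (
        SELECT customer_id,
               order_id,
               order_total,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY order_date DESC
               ) AS rn
        FROM orders
    ) ranked
    WHERE rn = 1
"""


def fetch_latest_orders():
    # Credentials would normally come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="my_user",            # placeholder
        password="my_password",    # placeholder
        warehouse="ANALYTICS_WH",  # placeholder
        database="ANALYTICS",      # placeholder
        schema="PUBLIC",           # placeholder
    )
    cur = conn.cursor()
    try:
        cur.execute(LATEST_ORDER_PER_CUSTOMER)
        return cur.fetchall()
    finally:
        cur.close()
        conn.close()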