
Sr Lead Software Engineer - Big Data PySpark, Java and AWS

Job in Wilmington, New Castle County, Delaware, 19894, USA
Listing for: JPMorganChase
Full Time position
Listed on 2026-01-10
Job specializations:
  • Software Development
    Software Engineer, Data Engineer
Job Description & How to Apply Below

Sr Lead Software Engineer - Big Data PySpark, Java and AWS

Be an integral part of an agile team that constantly pushes the envelope to enhance, build, and deliver top‑notch technology products. As a Senior Lead Software Engineer at JPMorgan Chase within the Consumer and Community Banking Cards Technology Team, you will create secure, stable, and scalable solutions that drive significant business impact across a diverse array of technologies and applications.

Our Consumer & Community Banking division serves Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small‑business loans, and payment processing. Praised for leading the U.S. in credit‑card sales and deposit growth, we host the most‑used digital solutions while ranking first in customer satisfaction.

As a Lead Data Engineer, you will play a key role on an agile team focused on enhancing, building, and delivering secure, stable, and scalable solutions for data collection, storage, access, and analytics. You will leverage your deep technical expertise and problem‑solving skills to drive significant business impact, addressing a wide range of challenges across multiple data pipelines, architectures, and data consumers.

Job Responsibilities
  • Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors.
  • Develops secure and high‑quality production code, and reviews and debugs code written by others.
  • Drives decisions that influence the product design, application functionality, and technical operations and processes.
  • Serves as a function‑wide subject matter expert in one or more areas of focus.
  • Actively contributes to the engineering community as an advocate of firm‑wide frameworks, tools, and practices of the Software Development Life Cycle.
  • Influences peers and project decision‑makers to consider the use and application of leading‑edge technologies.
  • Adds to the team culture of diversity, opportunity, inclusion, and respect.
  • Designs and develops end‑to‑end data pipelines using Spark SQL, Java, and AWS services (a minimal illustrative sketch follows this list).
  • Utilizes programming languages such as Java and Python, works with NoSQL databases and SQL, and leverages container orchestration services including Kubernetes, along with a variety of AWS tools and services.
  • Defines and implements database backup, recovery, and archiving strategies.
  • Generates advanced data models for one or more teams using firm‑wide tools, linear algebra, statistics, and geometrical algorithms, and approves data analysis tools and processes to ensure consistency and quality.
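
For context only, here is a minimal sketch of the kind of end‑to‑end batch pipeline described above: a PySpark job that reads raw JSON events from S3, aggregates them with Spark SQL, and writes partitioned Parquet back to S3. The bucket names, paths, column names, and application name are hypothetical placeholders, not actual JPMorgan Chase systems.

```python
# Minimal sketch of an end-to-end batch pipeline with Spark SQL on AWS.
# All bucket names, paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("card-transactions-daily")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON events landed in S3 (path is illustrative only).
raw = spark.read.json("s3://example-raw-bucket/transactions/2026-01-10/")
raw.createOrReplaceTempView("transactions")

# Transform with Spark SQL: aggregate spend per account per day.
daily_spend = spark.sql("""
    SELECT account_id,
           CAST(event_ts AS DATE) AS txn_date,
           SUM(amount)            AS total_spend
    FROM transactions
    GROUP BY account_id, CAST(event_ts AS DATE)
""")

# Write curated output as partitioned Parquet for downstream consumers,
# again to an illustrative bucket.
(
    daily_spend.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-curated-bucket/daily_spend/")
)

spark.stop()
```

In practice a job like this would typically run on EMR or Glue ETL and register its output in the Glue Catalog so it can be queried through Athena.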
Required Qualifications, Capabilities, and Skills
  • Formal training or certification in software engineering concepts and at least 3 years of applied experience.
  • Hands‑on practical experience delivering system design, application development, testing, and operational stability.
  • Advanced proficiency in one or more programming languages, particularly Java.
  • Hands‑on practical experience developing Spark‑based frameworks for end‑to‑end ETL, ELT, and reporting solutions using key components like Spark SQL and Spark Streaming (see the streaming sketch after this list).
  • Experience with AWS cloud technologies, including S3.
  • Experience with relational and NoSQL databases.
  • Proficiency in AWS data services:
    Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Kinesis (or MSK), Airflow (or Lambda + Step Functions + EventBridge).
  • Expertise in data de/serialization formats:
    Parquet, Iceberg, Avro, JSON‑LD.
  • Good understanding of AWS data security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
  • Advanced knowledge of software applications and technical processes with considerable in‑depth knowledge in at least one technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).
  • Ability to tackle design and functionality problems independently with little to no oversight.
  • Practical cloud‑native experience.
  • Experience in Computer Science, Computer Engineering, Mathematics, or a related technical field.
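
To illustrate the streaming side of this stack (Spark Streaming with Kinesis or MSK, writing Parquet to S3), here is a minimal Spark Structured Streaming sketch that consumes a Kafka/MSK topic and appends parsed events to S3. The broker address, topic name, event schema, and paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Minimal Spark Structured Streaming sketch: MSK/Kafka -> S3 Parquet.
# Broker, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("card-events-stream").getOrCreate()

# Assumed event schema, for illustration only.
event_schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Read the event stream from an MSK (Kafka) topic and parse the JSON payload.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "b-1.example-msk:9092")  # placeholder broker
    .option("subscribe", "card-events")                          # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Continuously append parsed events to S3 as Parquet, with checkpointing
# so the stream can recover after a restart.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-curated-bucket/card_events/")
    .option("checkpointLocation", "s3://example-curated-bucket/_checkpoints/card_events/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

A Kinesis-based variant would swap the source connector; the parse-transform-sink pattern stays the same.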
Preferred Qualifications, Capabilities, and Skills
  • Knowledge of Snowflake.
  • Experience building data lakes, data platforms, and…