CCB Risk Program Associate
Job in Los Angeles, Los Angeles County, California, 90079, USA
Listed on 2026-02-16
Listing for: J.P. Morgan
Full Time position
Job specializations:
- Software Development
- Data Scientist, Machine Learning / ML Engineer
Job Description & How to Apply Below
The Portfolio Risk Modeling team within the CCB Risk Modeling group is responsible for end-to-end development of a best-in-class forecasting model suite for Chase credit card portfolios, supporting stress testing, loss reserve, and business planning exercises.
Job Responsibilities
- Work on a large, cleanly structured codebase designed for large-scale distributed simulation and forecasting.
- Collaborate with business partners in loss forecasting, finance, and technology; effectively communicate model results, analytical findings, and insights to them and to the senior leadership team to support business and/or technical decisions.
- Perform machine learning tasks such as feature engineering, feature selection, and developing and training machine learning algorithms, using cutting-edge technology to extract predictive models and patterns from data spanning billions of transactions.
- Collaborate with business teams to identify opportunities, gather business needs, and provide guidance on leveraging machine learning solutions.
- Interact with a broader audience across the firm to share knowledge, disseminate findings, and provide domain expertise.
Qualifications
- Master’s degree in Computer Science, Mathematics, Statistics, Econometrics, Physics, Engineering, or a related quantitative discipline is required.
- Proven proficiency in programming languages for large-scale data analysis, such as Python or Scala.
- A strong interest in how models work, why particular models do or do not work on particular problems, and the practical aspects of how new models are designed.
- PhD in a quantitative field with publications in top journals, preferably in machine learning, is a plus.
- Experience with model design in a big data environment, making use of distributed/parallel processing via the Hadoop ecosystem, particularly Spark and Hive.
Position Requirements
10+ years of work experience