Carolina Biagini - Senior Quantitative Analyst - Kevin D. Oden & Associates

Amul Bhatia - Partner - Kevin D. Oden & Associates

Introduction

In last week's post, we focused on building a strong foundation for model risk management. This week we turn to applying it, showing how to build a sound program through proper model development, validation, and monitoring.

Model Development

At many institutions, model risk management starts after a model is developed or acquired from a vendor. This mindset poses real risks to the Credit Union (CU) and should be remedied by incorporating model development into the model risk management process. In other words, treating risk management as an afterthought is an error. It is important for every CU, regardless of its size, to understand the risks it is accepting when it proposes the development or purchase of a model. Therefore, model development must be a well-designed process that involves, at minimum, the following stakeholders: model risk management, the model owner, and the ultimate model users.

Risk management plays a critical role in the model development and model review processes. The risk management team is the body responsible for identifying the risk practices that apply to model management. Risk management will also outline the documentation necessary to ensure the model meets not just regulatory requirements but internal standards as well. If a third party (vendor model) is under consideration, third-party risk management will need to be engaged as another crucial player, one that will conduct some of the critical up-front negotiations with the vendor(s).

The development process itself

Existing models can be absorbed into the development process; here, however, we will focus on new models and changes to existing models. The first step should be identifying the business need for that particular model. The business leader should initially reach out to the model development team, which should then reach out to the risk management committee so that all components of model development, or discussions around existing model modifications, are properly and safely addressed. An effective model development process has, at minimum, the following components:

  1. Purpose and Objective Assessment: This involves the potential model owner or business leader, MRM, and the model developer assessing why the model is needed and its objectives. Note that this should happen before the developer begins working on the potential model or the vendors demonstrate their model. This can simply be a meeting where the model risk management team learns about the proposed model, begins to formulate a risk assessment of the proposed model, and can formulate a validation strategy inclusive of resources. This also commences the model inventory process, with a model ID assigned and model purpose/use field populated.
  2. Requirements or Expectations: Outlining the model requirements in sufficient detail to benchmark subsequent development or vendor capabilities is essential to ensuring the developed or purchased product is ultimately fit for purpose. Portfolio or business coverage, access or control requirements, and, as much as possible, performance standards should be considered. This will also help develop the validation strategy.
  3. Documentation Standards: Developed by MRM and potentially tailored to individual model types, consistent and strong documentation ensures business continuity as developers or users change firms or roles or as new users utilize internally built or vendor models.
  4. Development Guidelines: In addition to the components outlined above, it is good practice to have development standards. Essential to this will be testing. As SR-11-7 notes: “An integral part of model development is testing, in which the various components of a model and its overall functioning are evaluated to determine whether the model is performing as intended.” Testing is where #2 (Requirements and Expectations) is assessed, which is why it is important to outline those requirements initially. Testing should check the model’s accuracy, robustness, and stability, and assess the impact of its assumptions. Stress testing the model to understand its limitations is also a critical aspect of development testing. These development guidelines can be included in MRM’s documentation standards or published separately. A simple testing sketch follows this list.
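
As a minimal sketch of what such development testing might look like, the Python snippet below checks a model's holdout accuracy and its stability under a stressed input assumption. The function names, the shocked column, and the pass/fail thresholds are hypothetical placeholders rather than prescribed standards; each CU would substitute the requirements it defined in step 2.

```python
import numpy as np

def accuracy(model, X, y):
    """Share of holdout cases the model classifies correctly."""
    return float(np.mean(model.predict(X) == y))

def stress_stability(model, X, shock=0.10):
    """Share of predictions that are unchanged when a key input
    (here: column 0, e.g., reported income) is shocked downward."""
    X_stressed = X.copy()
    X_stressed[:, 0] *= (1.0 - shock)
    return float(np.mean(model.predict(X) == model.predict(X_stressed)))

def development_tests(model, X_holdout, y_holdout,
                      min_accuracy=0.85, min_stability=0.90):
    """Evaluate the model against the (illustrative) requirements
    from step 2 and return the evidence for the documentation."""
    results = {
        "accuracy": accuracy(model, X_holdout, y_holdout),
        "stability_under_stress": stress_stability(model, X_holdout),
    }
    results["meets_requirements"] = (
        results["accuracy"] >= min_accuracy
        and results["stability_under_stress"] >= min_stability
    )
    return results
```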

Importance of business continuity

Well-developed documentation is critical from both a business continuity and risk management perspective. An organization could spend years building out models; however, if the internal teams produce models with little or no documentation, business continuity is in jeopardy. The CU could use those models successfully for several years, but at some point the individual or group that built them moves on, and the firm has no way to address problems should they arise because there is no documentation. Imagine a scenario where no CU employees know the code that implements the models; the institution must then hire an individual or company to reverse engineer the model, costing thousands of dollars and countless hours. That error in judgment also exposes the CU to a barrage of regulatory and financial risks.

Using an outside vendor for model development

Before reaching out to outside vendors for model development or for a third-party model, the CU should first ask itself if it has the skill set internally to build the model. If the answer is yes, does that staff have the time to produce a model? If the CU does not have the staff, can it still develop the model in-house? To do so, it will need to hire personnel with this expertise and bring them on board. All of this may be worth the cost, because that model is going to demand support and maintenance over the years.

If the CU does not currently have the internal skill or staff to build a model and adding staff to build a model is cost prohibitive, the conclusion may be to use an outside vendor. But the question remains: What is the cost/benefit analysis of developing a model internally versus outsourcing the process?

Securing the services of an outside vendor, although cost effective, may introduce issues with sustainability and quality. How will the vendor meet all your standards? Increasingly, CUs overlook whether portions of a vendor’s model, or the model itself, may be proprietary. If that is the case, it is imperative that the CU understand that using a model with proprietary data or information does not relieve the CU of its responsibilities. If the model is ineffective and errors occur, you can sue the vendor, but your members and the regulators are focused on the CU, not the vendor. It is crucial that you still diligently manage the risk in the vendor relationship. That means ensuring the CU understands even the proprietary components of the model well enough that its documentation and risk management remain effective. CUs own the risk of the models they bring in-house; the vendors do not. Every model owner and user should acknowledge the risk they are owning with the model.

This is certainly not to suggest that using a vendor’s services is inappropriate. What we are emphasizing is that when selecting a vendor, it needs to be clear from the start that the vendor will be required to provide documentation in line with the CU’s internal requirements and needs, and that the vendor will be working with the CU’s risk management team. Also ensure that the vendor can make required changes to the model going forward and, conversely, that if the vendor changes the model’s code or other components, the CU is given notice and the reasons those changes are being made. Ultimately, the CU should evaluate whether those changes make sense for the institution and its needs.

AI and ML models

AI/ML models should be developed in a well-controlled environment designed to encourage integrity, traceability, and reproducibility of results. This implies controls embedded throughout the model’s life cycle, from data sourcing and pre-processing, through model design and construction, to implementation, performance assessment, and ongoing monitoring. Developmental decisions and model choices should be properly documented and justified, including special AI/ML considerations such as the calibration of hyper-parameters; the trade-offs between predictive accuracy and interpretability of model output, between algorithmic simplicity and computational intensity, and between model bias and model fit; and potential information security and privacy risks. As in the case of traditional models, developers should perform appropriate tests and document the results.
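
To make the accuracy-versus-interpretability trade-off concrete, the sketch below (assuming scikit-learn and synthetic placeholder data standing in for the CU's own dataset) compares a transparent logistic regression against a more complex gradient-boosting model and records the out-of-sample comparison so the choice can be documented and justified. It is illustrative only, not a prescribed development procedure.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the CU's modeling dataset.
X, y = make_classification(n_samples=5000, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

candidates = {
    "logistic_regression (interpretable)": LogisticRegression(max_iter=1000),
    "gradient_boosting (more complex)": GradientBoostingClassifier(random_state=0),
}

# Record out-of-sample AUC for each candidate so the accuracy vs.
# interpretability decision can be documented alongside the model choice.
comparison = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    comparison[name] = round(auc, 3)

print(comparison)  # this evidence would feed the development documentation
```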

Model Oversight

Model Validation

Ensuring all models are validated and fit for use is of increasing concern to CU members, investors, and regulators (and appropriately so). Furthermore, the standards around what is considered a strong validation are expanding as the use of models grows and the risks involved become increasingly clear. This is also leading to a more consistent approach to model validation and clearer benchmarks for what constitutes a sound one.

One concern confronting small institutions trying to manage their risks appropriately is the frequency of validations. There is no short answer or predetermined schedule, but high-risk models should be validated and revalidated much more frequently, with annual revalidation often considered best practice. Regulators have made it clear that the scope, depth, and rigor of a validation should be commensurate with the scale and complexity of the model in the context of the individual firm. This applies to the frequency of revalidation as well.

As a simple example of how this frequency could differ by firm, we consider the same model alternately employed at a $250 billion asset institution to make credit decisions for the entire portfolio and at a $10 billion asset institution where it complements expert judgment in risk managing 50% of a portfolio. The rigor of validation activities should be different, downsized and rescoped for the smaller institution given the risk profile of the portfolio and the use of the model. In particular, the range of tests performed and the severity assigned to issues would differ for the larger institution.
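
As a purely illustrative way of encoding "commensurate with risk" in practice, a CU might tie revalidation cadence to each model's risk rating. The tiers and time frames in the sketch below are hypothetical; each institution sets its own schedule in policy, based on its risk appetite and regulatory expectations.

```python
# Illustrative only: tiers and cadences are set by each CU's own policy.
REVALIDATION_SCHEDULE = {
    "high":   {"full_revalidation_months": 12, "monitoring": "monthly"},
    "medium": {"full_revalidation_months": 24, "monitoring": "quarterly"},
    "low":    {"full_revalidation_months": 36, "monitoring": "semiannual"},
}

def revalidation_due(risk_rating: str, months_since_last: int) -> bool:
    """True if the model is due (or overdue) for full revalidation."""
    due = REVALIDATION_SCHEDULE[risk_rating]["full_revalidation_months"]
    return months_since_last >= due

print(revalidation_due("high", 14))  # True: overdue under this example policy
```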

The challenge for smaller institutions as the model inventory grows is attracting and retaining the resources to effectively validate models of various types. The expertise requirements vary greatly depending on the model use. For example, the skills needed to validate a retail credit model differ vastly from those required to validate a BSA/AML model or a derivative pricing model. As a result, many smaller institutions need to engage third parties to properly validate some or many of their models. To manage this third-party arrangement well, the CU must ensure the third party has the requisite skill set to validate the model effectively. The CU should also baseline expectations in a statement of work (SOW) and expect regular check-ins as the validation progresses.

 An effective validation framework should include three core elements:

  • Evaluation of conceptual soundness, including developmental evidence.
  • Ongoing monitoring, including process verification and benchmarking.
  • Outcomes analysis, including back-testing.
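
As one hedged illustration of the third element, outcomes analysis, the back-testing sketch below compares average predicted probabilities of default with observed default rates by score band. The band edges and tolerance are hypothetical placeholders; the validator would use the limits agreed in the monitoring plan.

```python
import numpy as np

def backtest_by_band(pred_pd, defaulted,
                     band_edges=(0.0, 0.02, 0.05, 0.10, 1.0),
                     tolerance=0.02):
    """Compare average predicted PD with the observed default rate in
    each score band; flag bands that miss by more than `tolerance`."""
    pred_pd = np.asarray(pred_pd, dtype=float)
    defaulted = np.asarray(defaulted, dtype=float)
    report = []
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        in_band = (pred_pd >= lo) & (pred_pd < hi)
        if not in_band.any():
            continue
        predicted = float(pred_pd[in_band].mean())
        observed = float(defaulted[in_band].mean())
        report.append({
            "band": f"[{lo:.2f}, {hi:.2f})",
            "n": int(in_band.sum()),
            "predicted_pd": round(predicted, 4),
            "observed_rate": round(observed, 4),
            "within_tolerance": abs(predicted - observed) <= tolerance,
        })
    return report
```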

Some of the process elements that have become “best practice” for model validations in the industry include:

  • Pre-model development (revalidation) meeting
    • Used to understand intended model purpose and requirements.
    • Assess findings and observations from prior validations and their status.
    • Discuss model changes, if any, since last validation.
  • Documentation review with developers and business leads
    • Answer documentation questions.
    • Make owners aware of any deficiencies that may slow the validation process.
  • Evaluation of conceptual soundness, including developmental evidence
  • Implementation review
  • Regular check-ins with developer and business
  • Findings, issues, and recommendations
  • Validation report
  • Ongoing monitoring, including process verification and benchmarking. 

The CU should not and cannot rely on the vendor to perform an independent validation and should ensure that any vendor is willing to comply with the CU’s model validation requirements, including sound documentation, performance monitoring, and independent review.

Annual Reviews

While it is crucial to ensure the model continues to perform as expected, continuously validating a model in production can be cumbersome and an unnecessary use of resources. An annual model assessment/review is a critical component of the ongoing validation process. These annual reviews involve a comprehensive evaluation of the model's performance, including checking for any changes in the underlying data, assumptions, or economic conditions that could impact its effectiveness.

By conducting annual reviews, CUs can identify and address potential model weaknesses or biases, thus reducing the risk of financial loss or regulatory penalties. Additionally, these reviews help ensure that the models continue to align with the institution's risk appetite and strategic objectives. Regularly updating and approving models through annual reviews also fosters greater confidence among stakeholders, including regulators and members, in the CU's risk management practices.

Model Monitoring

All models should be continually monitored for performance. A monitoring plan outlining expectations should be reviewed as part of model validation and approved by the independent model validation team or model risk manager. The metrics and reporting should initially be based on back-tests performed by the model developer and potentially repeated by the validator and updated over time. Importantly, the degree of monitoring in terms of frequency, resource allocation, etc., should be commensurate with the risk of the model and therefore driven by the risk rating. 

The topic of model monitoring may conjure images of fancy systems and armies of teams watching flashing buttons on rows of monitors, but that is not necessary. Good model monitoring occurs over the life of each model, and it can be as simple as ascertaining the model’s performance in regular meetings with senior executives and model owners.

Effective model monitoring includes outlining performance expectations in terms of quantitative limits – for example, a system of red, amber, and green thresholds is common. These should be reviewed on a regular basis (e.g., quarterly or semiannually), depending on the risk level of the model. Best practice is to establish escalation protocols to senior-level committees and ultimately the board. However, board-level escalations should be reserved for the highest-risk models and for degradation of performance to severe levels.

Setting up a monitoring framework for a model need not be complex to start. For example, take a credit decision model, developed internally or by a vendor, where back-testing has demonstrated 90% accuracy in differentiating good from bad credits (along with precision and other metrics). If the model returned 75-80% accuracy, it could be considered in the red or amber level, depending on the CU’s acceptable risk level. The CU would then review the actual credit decisions made over the past six months to a year, determine the accuracy of the model, and assess whether the model’s performance is red, amber, or green.
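
A minimal sketch of that red/amber/green check, assuming the illustrative 90% back-tested accuracy and the thresholds described above, could be as simple as the following; the floors are placeholders that each CU would set in its approved monitoring plan.

```python
def rag_status(observed_accuracy, green_floor=0.85, amber_floor=0.75):
    """Map observed model accuracy to a red/amber/green status.
    The floors are illustrative; each CU sets its own limits in the
    monitoring plan approved during validation."""
    if observed_accuracy >= green_floor:
        return "green"
    if observed_accuracy >= amber_floor:
        return "amber"
    return "red"

def monitor_credit_model(decisions):
    """`decisions` is a list of (predicted_good, actually_good) pairs
    collected over the review period (e.g., the past six months)."""
    hits = sum(1 for predicted, actual in decisions if predicted == actual)
    observed_accuracy = hits / len(decisions)
    return observed_accuracy, rag_status(observed_accuracy)

# Example: 90% back-tested accuracy, but 78% observed over the period -> amber.
print(rag_status(0.78))  # "amber"
```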

It is key that when the CU establishes model monitoring, there are appropriate actions associated with each level of performance. Those steps should be in writing and well documented in the model policy and procedure. However, as with any policy, there could be exceptions. The following figure describes the components of a strong model monitoring program.

Figure 1: Model monitoring program


Source: KDOA

AI and ML models

Validators should assess the rationale for the use of an AI/ML model as opposed to traditional techniques and whether the model specification is informed by domain expertise and aligns with the business purpose. When assessing the conceptual soundness of AI/ML models, EY (2020) suggests focusing on the following dimensions[1]:

  • Data integrity: AI/ML models rely on large volumes of heterogeneous and high-dimensional data. Assessing data quality and appropriateness requires reviewing the entire data lifecycle, from sourcing and pre-processing to training, testing and deployment. Additionally, validators should consider internal policies and regulation related to data privacy, protection, and ownership.
  • Feature engineering: Validators should review the process through which input variables are constructed from the raw data, including statistical analysis and business intuition applied to select features.
  • Sampling bias and fairness[2]: Validators should evaluate the model's impact on stakeholders and, where necessary, other control functions (such as compliance) may participate in the assessment.
  • Hyper-parameter calibration: Validators should evaluate how different parameter settings impact the results of the model and the computational feasibility in production.
  • Explainability: Understanding how a model produces outputs from its input variables, and being able to qualitatively interpret those outputs, can become challenging for complex AI/ML models (such as neural networks and ensemble techniques), as the way outputs respond to inputs may be unclear and lack transparency. In other cases, it is difficult to trace how inputs are transformed into outputs (traceability). A simple, model-agnostic illustration follows this list.
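
For the explainability dimension, one common model-agnostic starting point is permutation importance. The sketch below assumes scikit-learn with synthetic placeholder data and a placeholder random-forest model standing in for the CU's AI/ML model; it is illustrative rather than a prescribed method.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Placeholder data and model standing in for the CU's AI/ML model.
X, y = make_classification(n_samples=2000, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = RandomForestClassifier(random_state=1).fit(X_train, y_train)

# Permutation importance: how much does held-out performance drop when a
# single input is shuffled? Large drops point to the inputs that drive
# the model's output, which supports qualitative interpretation.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=1)
for i in result.importances_mean.argsort()[::-1]:
    print(f"feature_{i}: {result.importances_mean[i]:.3f}")
```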

AI/ML model owners and developers should create a comprehensive ongoing monitoring plan to confirm that the model is operating as intended over time. This plan should cover model performance, stability and alignment with business purpose. Performance results should be evaluated in terms of sampling bias and fairness. Model users should play an important role when evaluating model performance, as they are able to observe performance over time and to effectively challenge model results.

Setting an appropriate monitoring frequency for the performance indicators can become challenging for models that are retrained frequently. Similarly, defining what constitutes a model change, and how to assess it, can be difficult given the dynamic nature of AI/ML models and the need to manage feature changes through frequent retraining.
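
One simple and widely used indicator for this kind of input or score drift is the population stability index (PSI), which compares the distribution of an input (or of the model score) in production against a benchmark period such as the training window. The implementation below and the rule-of-thumb thresholds in its comment are illustrative, not regulatory values.

```python
import numpy as np

def population_stability_index(expected, actual, n_bins=10):
    """PSI between a benchmark sample (e.g., training data) and a
    recent production sample for one input or score."""
    # Bin edges taken from the benchmark distribution.
    edges = np.quantile(expected, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Avoid division by zero / log(0) in sparse bins.
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)
    return float(np.sum((actual_pct - expected_pct)
                        * np.log(actual_pct / expected_pct)))

# Common rule of thumb (illustrative): < 0.10 stable, 0.10-0.25 watch, > 0.25 shifted.
rng = np.random.default_rng(0)
benchmark = rng.normal(0.0, 1.0, 10_000)
recent = rng.normal(0.3, 1.1, 2_000)
print(round(population_stability_index(benchmark, recent), 3))
```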

Conclusion

The importance of model development documentation cannot be stressed enough. As noted earlier, well-developed documentation is critical from both a business continuity and risk management perspective. The standards for documenting vendor models should be the same as for models developed internally.

It should be stressed that model validation is critical to sound risk management whether the model is developed internally or by a vendor. A sound validation process involves a degree of independence from the model developer and model user. To provide effective challenge, validators need the requisite knowledge, skills, and expertise.

The risk of model performance not meeting internal and reported expectations is large and growing as member, investor, and regulatory expectations rise. For this reason, developing a model monitoring program inclusive of expectations, roles, and responsibilities is critically important for every institution, large and small.

 

[1] Agarwala, G., et al. (EY 2020). Supervisory expectations and sound model risk management practices for artificial intelligence and machine learning.

[2] Sampling bias occurs when AI/ML models incorrectly and systematically under- or over-represent specific groups or classes from a population in a non-random manner. Fairness includes concerns for equality and equity by addressing issues such as harmful bias and discrimination that can lead to disparate treatment.

If you have any questions or would like to get in touch, feel free to reach out – we're here to help!

Website

LinkedIn
