Flashield's Blog

Just For My Daily Diary


Year: 2024

05.course-advanced-uses-of-shap-values【Advanced Uses of SHAP Values】

Recap We started by learning about permutation importance and partial dependence plots for an overview of what the model has learned. We then learned about SHAP values to break down the components of individual predictions. Now we’ll expand on SHAP values, seeing how aggregating many SHAP values can give more […]
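As a rough illustration of what aggregating SHAP values looks like in code, here is a minimal sketch using the shap library on a synthetic dataset (the model, feature names, and data are illustrative, not the post's actual example): a summary plot collects the per-row SHAP values into a single view.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

# Small synthetic dataset standing in for the course data (illustrative only).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 3)), columns=["feat_a", "feat_b", "feat_c"])
y = 2 * X["feat_a"] - X["feat_b"] + rng.normal(scale=0.1, size=200)

model = RandomForestRegressor(random_state=0).fit(X, y)

# One SHAP value per feature per row; the summary plot aggregates every row
# into a single view of overall importance and direction of effect.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)
```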

04.exercise-shap-values【Exercise: SHAP Values】

This notebook is an exercise in the Machine Learning Explainability course. You can reference the tutorial at this link. Set Up At this point, you have enough tools to put together compelling solutions to real-world problems. You will need to pick the right techniques for each part of the following data science scenario. Along […]

04.course-shap-values【SHAP Values】

Introduction You’ve seen (and used) techniques to extract general insights from a machine learning model. But what if you want to break down how the model works for an individual prediction? SHAP Values (an acronym from SHapley Additive exPlanations) break down a prediction to show the impact of each feature. Where could you […]
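A minimal sketch of that idea, using the shap library on a synthetic regression dataset (the feature names, data, and model here are illustrative, not the post's actual example): the SHAP values for one row show how each feature pushed that prediction away from the baseline.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

# Synthetic data: three features, with feat_a contributing most to the target.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(300, 3)), columns=["feat_a", "feat_b", "feat_c"])
y = 3 * X["feat_a"] + X["feat_b"] ** 2 + rng.normal(scale=0.1, size=300)

model = RandomForestRegressor(random_state=0).fit(X, y)

# SHAP values for a single row: how much each feature pushed this prediction
# above or below the explainer's expected (baseline) value.
explainer = shap.TreeExplainer(model)
row = X.iloc[[0]]
shap_values = explainer.shap_values(row)

print("baseline (expected value):", explainer.expected_value)
print("per-feature contributions:", dict(zip(X.columns, shap_values[0])))
print("model prediction:", model.predict(row)[0])
```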

03.exercise-partial-plots【Exercise: Partial Dependence Plots】

This notebook is an exercise in the Machine Learning Explainability course. You can reference the tutorial at this link. Set Up Today you will create partial dependence plots and practice building insights with data from the Taxi Fare Prediction competition. We have again provided code to do the basic loading, review […]

03.course-partial-plots【Partial Dependence Plots】

Partial Dependence Plots While feature importance shows what variables most affect predictions, partial dependence plots show how a feature affects predictions. This is useful to answer questions like: Controlling for all other house features, what impact do longitude and latitude have on home prices? To restate this, how would similarly sized houses […]
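As a hedged sketch of how such a plot can be produced, scikit-learn's PartialDependenceDisplay draws partial dependence curves for chosen features; the data below is synthetic and only stands in for house-price-style features, and the post itself may use a different library.

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

# Synthetic stand-in for house-price-style data (illustrative names and values only).
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "longitude": rng.uniform(-122.5, -121.5, 500),
    "latitude": rng.uniform(37.0, 38.0, 500),
    "sqft": rng.uniform(500, 4000, 500),
})
y = 300 * X["sqft"] + 50_000 * X["latitude"] + rng.normal(scale=10_000, size=500)

model = RandomForestRegressor(random_state=0).fit(X, y)

# Partial dependence: how the predicted price changes as one feature varies,
# averaging out the effect of the other features.
PartialDependenceDisplay.from_estimator(model, X, features=["latitude", "longitude"])
plt.show()
```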

02.exercise-permutation-importance【Exercise: Permutation Importance】

This notebook is an exercise in the Machine Learning Explainability course. You can reference the tutorial at this link. Intro You will think about and calculate permutation importance with a sample of data from the Taxi Fare Prediction competition. We won’t focus on data exploration or model building for now. You […]

02.course-permutation-importance【Permutation Importance】

Introduction One of the most basic questions we might ask of a model is: What features have the biggest impact on predictions? This concept is called feature importance. There are multiple ways to measure feature importance. Some approaches answer subtly different versions of the question above. Other approaches have documented shortcomings. […]
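One widely used measure is permutation importance, the subject of this post. A minimal sketch with scikit-learn's permutation_importance on synthetic data (the feature names and dataset are illustrative, not the post's actual example):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic data where feat_a drives the target most strongly (illustrative only).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(400, 3)), columns=["feat_a", "feat_b", "feat_c"])
y = 4 * X["feat_a"] + X["feat_b"] + rng.normal(scale=0.1, size=400)

X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Shuffle one column of the validation data at a time and measure how much the
# model's score drops: a larger drop means the model relied on that feature more.
result = permutation_importance(model, X_valid, y_valid, n_repeats=10, random_state=0)
for name, drop in zip(X.columns, result.importances_mean):
    print(f"{name}: {drop:.3f}")
```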

01.course-use-cases-for-model-insights【Use Cases for Model Insights】

What Types of Insights Are Possible Many people say machine learning models are "black boxes", in the sense that they can make good predictions but you can’t understand the logic behind those predictions. This statement is true in the sense that most data scientists don’t know how to extract insights from models yet. […]

05.exercise-model-cards【Exercise: Model Cards】

This notebook is an exercise in the AI Ethics course. You can reference the tutorial at this link. In the tutorial, you learned how to use model cards. In this exercise, you’ll sharpen your understanding of model cards by engaging with them in a couple of scenarios. Introduction Run the next code cell […]
