A Python library for decision tree visualization and model interpretation.
InterpretDL: Interpretation of Deep Learning Models, a model interpretability algorithm library built on PaddlePaddle (飞桨).
A multi-functional library for full-stack Deep Learning. Simplifies Model Building, API development, and Model Deployment.
Overview of different model interpretability libraries.
A set of tools for leveraging pre-trained embeddings, active learning, and model explainability for efficient document classification.
FastAI Model Interpretation with LIME
What Has Been Enhanced in my Knowledge-Enhanced Language Model?
Overview of machine learning interpretation techniques and their implementations
Model Interpretability via Hierarchical Feature Perturbation
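The repository's hierarchical perturbation method is not reproduced here, but the underlying idea, perturbing inputs and measuring how predictions change, can be sketched with scikit-learn's flat permutation importance. The model, dataset, and feature names below are illustrative only:

```python
# A simple perturbation-based importance check using scikit-learn's
# permutation_importance -- a flat (non-hierarchical) relative of the
# perturbation idea above; model and data are placeholders.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much the held-out score drops.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
top5 = sorted(zip(X.columns, result.importances_mean),
              key=lambda pair: pair[1], reverse=True)[:5]
for name, score in top5:
    print(f"{name}: {score:.3f}")
```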
Implementation of a scoring model (OpenClassrooms | Data Scientist | Project 7).
This repository contains all of the tasks I completed as part of the BCG Open-Access Data Science & Advanced Analytics Virtual Experience Program. 📊📈📉👨💻
Integrating multimodal data through heterogeneous ensembles
This repository contains all of the assignments I completed for the Standard Bank Data Science Virtual Experience Program. 📉👨💻📊📈
Using LIME and SHAP for model interpretability of Machine Learning Black-box models.
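A minimal sketch of how LIME and SHAP are typically applied to a tabular black-box model. A regression example is used here to keep the SHAP output shape simple across shap versions; the dataset and random forest are placeholders, not taken from the repository itself:

```python
# Minimal LIME + SHAP sketch on a tabular regression model; dataset and
# model are illustrative only.
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

data = load_diabetes()
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(data.data, data.target)

# LIME: fit a local linear surrogate around a single prediction.
lime_explainer = LimeTabularExplainer(data.data,
                                      feature_names=list(data.feature_names),
                                      mode="regression")
explanation = lime_explainer.explain_instance(data.data[0], model.predict, num_features=5)
print(explanation.as_list())

# SHAP: exact per-feature contributions for tree ensembles, summarized globally.
shap_values = shap.TreeExplainer(model).shap_values(data.data)
shap.summary_plot(shap_values, data.data, feature_names=data.feature_names, show=False)
```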
Visualize a Decision Tree using dtreeviz
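A hedged example of that workflow, assuming the dtreeviz 2.x API (dtreeviz.model) and an installed graphviz; the iris classifier is illustrative only:

```python
# Render a fitted decision tree with dtreeviz (2.x API assumed).
import dtreeviz
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(iris.data, iris.target)

viz_model = dtreeviz.model(clf,
                           X_train=iris.data, y_train=iris.target,
                           feature_names=iris.feature_names,
                           target_name="species",
                           class_names=list(iris.target_names))
viz_model.view().save("iris_tree.svg")  # or .show() in a notebook
```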
Advise one of Cognizant’s clients on a supply chain issue by applying knowledge of machine learning models.
Prediction of students' dropout using classification models. Data visualisation, feature selection, dimensionality reduction, model selection and interpretation, and parameter tuning.
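A generic sketch of that kind of workflow (feature selection, dimensionality reduction, model selection, and parameter tuning) using a scikit-learn pipeline with grid search; the dataset and parameter grid are illustrative, not the project's own:

```python
# Feature selection + PCA + classifier, tuned jointly with GridSearchCV.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif)),          # feature selection
    ("reduce", PCA()),                           # dimensionality reduction
    ("clf", LogisticRegression(max_iter=1000)),  # final classifier
])

grid = GridSearchCV(pipe,
                    param_grid={"select__k": [10, 20, 30],
                                "reduce__n_components": [5, 10],
                                "clf__C": [0.1, 1.0, 10.0]},
                    cv=5)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```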
This project includes an XGBoost regression model that predicts the purchase probability of a customer based on their online shopping behavior. In addition, a recommendation model combining collaborative filtering (CF) and content-based filtering (CBF) was built using customer purchase transaction data.
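A generic XGBoost regression sketch along the lines described above; the behavioural features below are hypothetical, not the project's actual data:

```python
# Fit an XGBoost regressor on synthetic behavioural-style features and
# predict a continuous purchase score.
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
# Hypothetical features: e.g. page views, session length, past purchases.
X = rng.random((1000, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.05, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```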
Exploratory data analysis, data modelling, model building and interpretation, machine learning production, quality assurance