Dictionary of Terms

Decision Tree - CART (Classification and Regression Tree)

6 years 1 month ago - 6 years 1 month ago #741 by Dorina Grossu
https://en.wikipedia.org/wiki/Decision_tree_learning

Metrics

Algorithms for constructing decision trees usually work top-down, choosing at each step the variable that best splits the set of items.[15] Different algorithms use different metrics for measuring "best". These generally measure the homogeneity of the target variable within the subsets. Some examples are given below. These metrics are applied to each candidate subset, and the resulting values are combined (e.g., averaged) to provide a measure of the quality of the split.

Gini impurity (not to be confused with the Gini coefficient)

Used by the CART (classification and regression tree) algorithm for classification trees, Gini impurity is a measure of how often a randomly chosen element from the set would be incorrectly labeled if it were labeled randomly according to the distribution of labels in the subset. The Gini impurity can be computed by summing, over all classes, the probability p_i of an item with label i being chosen times the probability 1 - p_i of mislabeling that item; this simplifies to G = 1 - sum_i(p_i^2).
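As a minimal sketch of the idea above (the function names are my own, not taken from the CART literature), Gini impurity and the weighted-average impurity of a candidate split can be computed in Python like this:

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity of a collection of class labels:
    G = 1 - sum(p_i^2), where p_i is the fraction of items with label i."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_quality(left, right):
    """Combine the impurities of the two candidate subsets by a
    size-weighted average; lower values indicate a better split."""
    n = len(left) + len(right)
    return (len(left) / n) * gini_impurity(left) + \
           (len(right) / n) * gini_impurity(right)

# A pure subset has impurity 0; a 50/50 two-class subset has impurity 0.5.
print(gini_impurity(["a", "a", "a"]))       # 0.0
print(gini_impurity(["a", "a", "b", "b"]))  # 0.5
# A split that perfectly separates the classes scores 0.
print(split_quality(["a", "a"], ["b", "b"]))  # 0.0
```

A tree-growing algorithm would evaluate `split_quality` for every candidate split of every variable and keep the split with the lowest value.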
Last edit: 6 years 1 month ago by Dorina Grossu.

