Class: Rumale::Ensemble::GradientBoostingClassifier
- Inherits: Object
- Includes: Base::BaseEstimator, Base::Classifier
- Defined in: lib/rumale/ensemble/gradient_boosting_classifier.rb
Overview
GradientBoostingClassifier is a class that implements gradient tree boosting for classification. The class uses the negative binomial log-likelihood as its loss function. For multiclass classification problems, it uses the one-vs-rest strategy.
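The negative binomial log-likelihood loss mentioned above can be sketched in plain Ruby. This is an illustration of the mathematics (Friedman's two-class deviance), not Rumale's internal code; the function names are hypothetical:

```ruby
# For labels y in {-1, +1} and a real-valued score f, the negative binomial
# log-likelihood is L(y, f) = log(1 + exp(-2 * y * f)). Each boosting round
# fits a tree to its negative gradient, the pseudo-residual
# 2 * y / (1 + exp(2 * y * f)).

def binomial_deviance(y, f)
  Math.log(1.0 + Math.exp(-2.0 * y * f))
end

def pseudo_residual(y, f)
  2.0 * y / (1.0 + Math.exp(2.0 * y * f))
end

# At f = 0 (no information) the loss is log(2) and the residual has
# magnitude 1; a confident, correct score drives both toward 0.
```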
Reference
- J. H. Friedman, “Greedy Function Approximation: A Gradient Boosting Machine,” Annals of Statistics, 29 (5), pp. 1189–1232, 2001.
- J. H. Friedman, “Stochastic Gradient Boosting,” Computational Statistics and Data Analysis, 38 (4), pp. 367–378, 2002.
- T. Chen and C. Guestrin, “XGBoost: A Scalable Tree Boosting System,” Proc. KDD’16, pp. 785–794, 2016.
Instance Attribute Summary
- #classes ⇒ Numo::Int32 (readonly): Return the class labels.
- #estimators ⇒ Array<GradientTreeRegressor> (readonly): Return the set of estimators.
- #feature_importances ⇒ Numo::DFloat (readonly): Return the importance for each feature.
- #rng ⇒ Random (readonly): Return the random generator for random selection of feature index.
Attributes included from Base::BaseEstimator
Instance Method Summary
- #apply(x) ⇒ Numo::Int32: Return the index of the leaf that each sample reached.
- #decision_function(x) ⇒ Numo::DFloat: Calculate confidence scores for samples.
- #fit(x, y) ⇒ GradientBoostingClassifier: Fit the model with the given training data.
- #initialize(n_estimators: 100, learning_rate: 0.1, reg_lambda: 0.0, subsample: 1.0, max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil, n_jobs: nil, random_seed: nil) ⇒ GradientBoostingClassifier (constructor): Create a new classifier with gradient tree boosting.
- #marshal_dump ⇒ Hash: Dump marshal data.
- #marshal_load(obj) ⇒ nil: Load marshal data.
- #predict(x) ⇒ Numo::Int32: Predict class labels for samples.
- #predict_proba(x) ⇒ Numo::DFloat: Predict probabilities for samples.
Methods included from Base::Classifier
Constructor Details
#initialize(n_estimators: 100, learning_rate: 0.1, reg_lambda: 0.0, subsample: 1.0, max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1, max_features: nil, n_jobs: nil, random_seed: nil) ⇒ GradientBoostingClassifier
Create a new classifier with gradient tree boosting.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 65

def initialize(n_estimators: 100, learning_rate: 0.1, reg_lambda: 0.0, subsample: 1.0,
               max_depth: nil, max_leaf_nodes: nil, min_samples_leaf: 1,
               max_features: nil, n_jobs: nil, random_seed: nil)
  check_params_type_or_nil(Integer, max_depth: max_depth, max_leaf_nodes: max_leaf_nodes,
                           max_features: max_features, n_jobs: n_jobs, random_seed: random_seed)
  check_params_integer(n_estimators: n_estimators, min_samples_leaf: min_samples_leaf)
  check_params_float(learning_rate: learning_rate, reg_lambda: reg_lambda, subsample: subsample)
  check_params_positive(n_estimators: n_estimators, learning_rate: learning_rate,
                        reg_lambda: reg_lambda, subsample: subsample,
                        max_depth: max_depth, max_leaf_nodes: max_leaf_nodes,
                        min_samples_leaf: min_samples_leaf, max_features: max_features)
  @params = {}
  @params[:n_estimators] = n_estimators
  @params[:learning_rate] = learning_rate
  @params[:reg_lambda] = reg_lambda
  @params[:subsample] = subsample
  @params[:max_depth] = max_depth
  @params[:max_leaf_nodes] = max_leaf_nodes
  @params[:min_samples_leaf] = min_samples_leaf
  @params[:max_features] = max_features
  @params[:n_jobs] = n_jobs
  @params[:random_seed] = random_seed
  @params[:random_seed] ||= srand
  @estimators = nil
  @classes = nil
  @base_predictions = nil
  @feature_importances = nil
  @rng = Random.new(@params[:random_seed])
end
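A minimal sketch of the seed-handling idiom in the constructor (the `params` hash here is a stand-in, not the actual `@params` of the class): when `random_seed` is nil, `Kernel#srand` returns the previous process seed, an Integer, which is then stored so a run can be reproduced later.

```ruby
# When no seed is given, fall back to srand, which reseeds the global RNG
# and returns the seed that was in effect before the call.
params = { random_seed: nil }
params[:random_seed] ||= srand
rng = Random.new(params[:random_seed])  # a reproducible generator
```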
Instance Attribute Details
#classes ⇒ Numo::Int32 (readonly)
Return the class labels.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 36

def classes
  @classes
end
#estimators ⇒ Array<GradientTreeRegressor> (readonly)
Return the set of estimators.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 32

def estimators
  @estimators
end
#feature_importances ⇒ Numo::DFloat (readonly)
Return the importance for each feature. The feature importances are calculated based on the number of times the feature is used for splitting.

# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 41

def feature_importances
  @feature_importances
end
#rng ⇒ Random (readonly)
Return the random generator for random selection of feature index.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 45

def rng
  @rng
end
Instance Method Details
#apply(x) ⇒ Numo::Int32
Return the index of the leaf that each sample reached.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 176

def apply(x)
  check_sample_array(x)
  n_classes = @classes.size
  leaf_ids = if n_classes > 2
               Array.new(n_classes) { |n| @estimators[n].map { |tree| tree.apply(x) } }
             else
               @estimators.map { |tree| tree.apply(x) }
             end
  Numo::Int32[*leaf_ids].transpose
end
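A minimal sketch of how apply assembles leaf indices in the binary case, using hypothetical stub "trees" (lambdas) in place of fitted GradientTreeRegressor objects: each tree reports one leaf index per sample, and transposing yields one row of leaf indices per sample.

```ruby
# Two stub trees: one always routes samples to leaf 0, the other to leaf 3.
trees = [->(samples) { samples.map { 0 } },
         ->(samples) { samples.map { 3 } }]
samples = [:a, :b]

leaf_ids = trees.map { |tree| tree.call(samples) }  # per tree: [[0, 0], [3, 3]]
per_sample = leaf_ids.transpose                     # per sample: [[0, 3], [0, 3]]
```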
#decision_function(x) ⇒ Numo::DFloat
Calculate confidence scores for samples.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 133

def decision_function(x)
  check_sample_array(x)
  n_classes = @classes.size
  if n_classes > 2
    multiclass_scores(x)
  else
    @estimators.map { |tree| tree.predict(x) }.reduce(&:+) + @base_predictions
  end
end
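A minimal sketch of the binary scoring rule, with hypothetical stub trees as lambdas and made-up numbers: the confidence score is the base prediction plus the sum of the trees' outputs, and in the binary case its sign decides the predicted class.

```ruby
base_prediction = 0.2
trees = [->(x) { 0.5 }, ->(x) { -0.1 }]  # stand-ins for fitted regression trees

# Additive ensemble score for one (symbolic) sample.
score = trees.map { |tree| tree.call(:sample) }.reduce(:+) + base_prediction
# score comes out to 0.6 here; a positive score means the positive class.
```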
#fit(x, y) ⇒ GradientBoostingClassifier
Fit the model with the given training data.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 99

def fit(x, y)
  check_sample_array(x)
  check_label_array(y)
  check_sample_label_size(x, y)
  # initialize some variables.
  n_features = x.shape[1]
  @params[:max_features] = n_features if @params[:max_features].nil?
  @params[:max_features] = [[1, @params[:max_features]].max, n_features].min
  @classes = Numo::Int32[*y.to_a.uniq.sort]
  n_classes = @classes.size
  # train estimator.
  if n_classes > 2
    @base_predictions = multiclass_base_predictions(y)
    @estimators = multiclass_estimators(x, y)
  else
    negative_label = y.to_a.uniq.min
    bin_y = Numo::DFloat.cast(y.ne(negative_label)) * 2 - 1
    y_mean = bin_y.mean
    @base_predictions = 0.5 * Numo::NMath.log((1.0 + y_mean) / (1.0 - y_mean))
    @estimators = partial_fit(x, bin_y, @base_predictions)
  end
  # calculate feature importances.
  @feature_importances = if n_classes > 2
                           multiclass_feature_importances
                         else
                           @estimators.map(&:feature_importances).reduce(&:+)
                         end
  self
end
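A sketch of the binary base score computed in fit, using a hypothetical label array in place of Numo: labels are mapped to {-1, +1} and the initial constant prediction is the prior log-odds 0.5 * log((1 + mean(y)) / (1 - mean(y))).

```ruby
labels = [0, 1, 1, 1]            # hypothetical binary labels
negative_label = labels.min

# Map to {-1.0, +1.0} as fit does with y.ne(negative_label) * 2 - 1.
bin_y = labels.map { |v| v == negative_label ? -1.0 : 1.0 }
y_mean = bin_y.sum / bin_y.size  # 0.5 for this label set

# Prior log-odds: the score before any tree is added.
base = 0.5 * Math.log((1.0 + y_mean) / (1.0 - y_mean))
```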
#marshal_dump ⇒ Hash
Dump marshal data.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 189

def marshal_dump
  { params: @params,
    estimators: @estimators,
    classes: @classes,
    base_predictions: @base_predictions,
    feature_importances: @feature_importances,
    rng: @rng }
end
#marshal_load(obj) ⇒ nil
Load marshal data.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 200

def marshal_load(obj)
  @params = obj[:params]
  @estimators = obj[:estimators]
  @classes = obj[:classes]
  @base_predictions = obj[:base_predictions]
  @feature_importances = obj[:feature_importances]
  @rng = obj[:rng]
  nil
end
#predict(x) ⇒ Numo::Int32
Predict class labels for samples.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 147

def predict(x)
  check_sample_array(x)
  n_samples = x.shape[0]
  probs = predict_proba(x)
  Numo::Int32.asarray(Array.new(n_samples) { |n| @classes[probs[n, true].max_index] })
end
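A sketch of the label selection in predict, with hypothetical per-sample probabilities as plain arrays: each sample is assigned the class whose probability column is largest.

```ruby
classes = [0, 1]
probs = [[0.8, 0.2], [0.3, 0.7]]  # one row of class probabilities per sample

# Row-wise argmax, as predict does with max_index on each probability row.
labels = probs.map { |row| classes[row.index(row.max)] }
```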
#predict_proba(x) ⇒ Numo::DFloat
Predict probabilities for samples.
# File 'lib/rumale/ensemble/gradient_boosting_classifier.rb', line 158

def predict_proba(x)
  check_sample_array(x)
  proba = 1.0 / (Numo::NMath.exp(-decision_function(x)) + 1.0)
  return (proba.transpose / proba.sum(axis: 1)).transpose if @classes.size > 2
  n_samples, = x.shape
  probs = Numo::DFloat.zeros(n_samples, 2)
  probs[true, 1] = proba
  probs[true, 0] = 1.0 - proba
  probs
end
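A sketch of the binary probability conversion used above, for a single made-up score: the confidence score passes through a logistic sigmoid, and the two class columns become [1 - p, p].

```ruby
score = 0.0                                   # hypothetical confidence score
proba_pos = 1.0 / (Math.exp(-score) + 1.0)    # logistic sigmoid, 0.5 at score 0

# Column order matches predict_proba: [negative class, positive class].
probs = [1.0 - proba_pos, proba_pos]
```

In the multiclass branch, the per-class sigmoids are instead normalized to sum to one across the row, which is what the transpose/sum/transpose dance in the source accomplishes.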