Class: CrossValidation::ConfusionMatrix
- Inherits: Object
  - Object
  - CrossValidation::ConfusionMatrix
- Defined in:
- lib/cross_validation/confusion_matrix.rb
Overview
Provides a confusion matrix (contingency table) for classification results.
See the following book for more details:
Speech and Language Processing: An introduction to natural language processing, computational linguistics, and speech recognition. Daniel Jurafsky & James H. Martin.
Instance Method Summary
- #accuracy ⇒ Float
  Computes the accuracy of the classifier, defined as (tp + tn)/n.
- #error ⇒ Float
  Returns the classifier’s error, defined as 1 - accuracy.
- #f1 ⇒ Float
  Returns an F-score that weights precision and recall equally.
- #fscore(beta) ⇒ Float
  Returns the F-measure of the classifier’s precision and recall.
- #initialize(keys_proc) ⇒ ConfusionMatrix (constructor)
  Initializes the confusion matrix with a Proc (or block).
- #precision ⇒ Float
  Computes the precision of the classifier, defined as tp/(tp + fp).
- #recall ⇒ Float
  Computes the recall of the classifier, defined as tp/(tp + fn).
- #store(actual, truth) ⇒ self
  Saves the result of a classification.
Constructor Details
#initialize(keys_proc) ⇒ ConfusionMatrix
Initialize the confusion matrix with a Proc (or block). The Proc must return one of the symbols :tp (true positive), :tn (true negative), :fp (false positive), or :fn (false negative) for a given classification and its expected value.
See the unit test for an example Proc.
    # File 'lib/cross_validation/confusion_matrix.rb', line 22
    def initialize(keys_proc)
      @keys_for = keys_proc
      @values = {:tp => 0, :tn => 0, :fp => 0, :fn => 0}
    end
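As an illustration, here is a hypothetical keys Proc for a binary spam/ham classifier. The :spam and :ham labels are made up for this sketch; the argument order (truth, actual) matches how #store invokes the Proc.

```ruby
# Hypothetical keys Proc for a binary spam/ham classifier.
# It receives (truth, actual) and maps the pair onto one of the
# four confusion-matrix cells: :tp, :tn, :fp, or :fn.
spam_keys = lambda do |truth, actual|
  if truth == :spam
    actual == :spam ? :tp : :fn   # caught spam vs. missed spam
  else
    actual == :spam ? :fp : :tn   # false alarm vs. correctly passed ham
  end
end
```

A Proc like this would then be handed to CrossValidation::ConfusionMatrix.new(spam_keys).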
Instance Method Details
#accuracy ⇒ Float
Computes the accuracy of the classifier, defined as (tp + tn)/n.

    # File 'lib/cross_validation/confusion_matrix.rb', line 51
    def accuracy
      (@values.fetch(:tp) + @values.fetch(:tn)) / total()
    end
#error ⇒ Float
Returns the classifier’s error, defined as 1 - accuracy.

    # File 'lib/cross_validation/confusion_matrix.rb', line 88
    def error
      1.0 - accuracy()
    end
#f1 ⇒ Float
Returns an F-score that favors precision and recall equally.
    # File 'lib/cross_validation/confusion_matrix.rb', line 81
    def f1
      fscore(1)
    end
#fscore(beta) ⇒ Float
Returns the F-measure of the classifier’s precision and recall.
    # File 'lib/cross_validation/confusion_matrix.rb', line 73
    def fscore(beta)
      b2 = Float(beta**2)
      ((b2 + 1) * precision * recall) / (b2 * precision + recall)
    end
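The formula can be checked by hand. The precision and recall values below are made-up numbers, not output from the class:

```ruby
# Worked F-measure example with assumed (made-up) precision and recall.
precision = 0.75
recall    = 0.6
beta      = 1
b2 = Float(beta**2)
fscore = ((b2 + 1) * precision * recall) / (b2 * precision + recall)
# With beta = 1 this reduces to the harmonic mean of precision and
# recall: 2 * 0.75 * 0.6 / (0.75 + 0.6) == 2/3.
```

Values of beta greater than 1 weight recall more heavily; values below 1 favor precision.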
#precision ⇒ Float
Computes the precision of the classifier, defined as tp/(tp + fp).

    # File 'lib/cross_validation/confusion_matrix.rb', line 58
    def precision
      @values.fetch(:tp) / Float(@values.fetch(:tp) + @values.fetch(:fp))
    end
#recall ⇒ Float
Computes the recall of the classifier, defined as tp/(tp + fn).

    # File 'lib/cross_validation/confusion_matrix.rb', line 65
    def recall
      @values.fetch(:tp) / Float(@values.fetch(:tp) + @values.fetch(:fn))
    end
#store(actual, truth) ⇒ self
Saves the result of a classification.

    # File 'lib/cross_validation/confusion_matrix.rb', line 36
    def store(actual, truth)
      key = @keys_for.call(truth, actual)

      if @values.key?(key)
        @values[key] += 1
      else
        fail IndexError, "#{key} not found in confusion matrix"
      end

      self
    end