Class: Google::Cloud::AutoML::V1beta1::TextExtractionEvaluationMetrics
- Inherits: Object
- Object
- Google::Cloud::AutoML::V1beta1::TextExtractionEvaluationMetrics
- Defined in:
- lib/google/cloud/automl/v1beta1/doc/google/cloud/automl/v1beta1/text_extraction.rb
Overview
Model evaluation metrics for text extraction problems.
Defined Under Namespace
Classes: ConfidenceMetricsEntry
Instance Attribute Summary collapse
-
#au_prc ⇒ Float
Output only.
-
#confidence_metrics_entries ⇒ Array<Google::Cloud::AutoML::V1beta1::TextExtractionEvaluationMetrics::ConfidenceMetricsEntry>
Output only.
Instance Attribute Details
#au_prc ⇒ Float
Returns Output only. The area under the precision-recall curve metric.
# File 'lib/google/cloud/automl/v1beta1/doc/google/cloud/automl/v1beta1/text_extraction.rb', line 39

class TextExtractionEvaluationMetrics
  # Metrics for a single confidence threshold.
  # @!attribute [rw] confidence_threshold
  #   @return [Float]
  #     Output only. The confidence threshold value used to compute the metrics.
  #     Only annotations with score of at least this threshold are considered to
  #     be ones the model would return.
  # @!attribute [rw] recall
  #   @return [Float]
  #     Output only. Recall under the given confidence threshold.
  # @!attribute [rw] precision
  #   @return [Float]
  #     Output only. Precision under the given confidence threshold.
  # @!attribute [rw] f1_score
  #   @return [Float]
  #     Output only. The harmonic mean of recall and precision.
  class ConfidenceMetricsEntry; end
end
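The `au_prc` value is reported by the service, but the underlying idea can be illustrated locally: sort the per-threshold (recall, precision) points and integrate with the trapezoidal rule. This is a minimal sketch only — the `Entry` struct, the sample values, and the `area_under_pr_curve` helper are illustrative stand-ins, not part of the AutoML API, and the service's own computation may differ.

```ruby
# Illustrative only: a plain Struct standing in for ConfidenceMetricsEntry.
Entry = Struct.new(:confidence_threshold, :recall, :precision)

# Approximate the area under the precision-recall curve with the
# trapezoidal rule, after sorting the points by recall.
def area_under_pr_curve(entries)
  points = entries.map { |e| [e.recall, e.precision] }.sort_by(&:first)
  points.each_cons(2).sum do |(r0, p0), (r1, p1)|
    (r1 - r0) * (p0 + p1) / 2.0
  end
end

# Hypothetical metrics at three confidence thresholds.
entries = [
  Entry.new(0.9, 0.2, 1.0),
  Entry.new(0.5, 0.6, 0.8),
  Entry.new(0.1, 1.0, 0.5)
]
area_under_pr_curve(entries)  # ≈ 0.62
```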
#confidence_metrics_entries ⇒ Array<Google::Cloud::AutoML::V1beta1::TextExtractionEvaluationMetrics::ConfidenceMetricsEntry>
Returns Output only. Metrics computed at a series of confidence thresholds. The precision-recall curve can be derived from them.
# File 'lib/google/cloud/automl/v1beta1/doc/google/cloud/automl/v1beta1/text_extraction.rb', line 39

class TextExtractionEvaluationMetrics
  # Metrics for a single confidence threshold.
  # @!attribute [rw] confidence_threshold
  #   @return [Float]
  #     Output only. The confidence threshold value used to compute the metrics.
  #     Only annotations with score of at least this threshold are considered to
  #     be ones the model would return.
  # @!attribute [rw] recall
  #   @return [Float]
  #     Output only. Recall under the given confidence threshold.
  # @!attribute [rw] precision
  #   @return [Float]
  #     Output only. Precision under the given confidence threshold.
  # @!attribute [rw] f1_score
  #   @return [Float]
  #     Output only. The harmonic mean of recall and precision.
  class ConfidenceMetricsEntry; end
end
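The documentation above defines `f1_score` as the harmonic mean of recall and precision. As a quick sanity check, that relationship can be reproduced locally; the `f1_score` helper below is an illustrative reimplementation, not a method of this class.

```ruby
# Illustrative only: the harmonic mean of precision and recall,
# matching the documented definition of the f1_score attribute.
def f1_score(precision, recall)
  return 0.0 if (precision + recall).zero?
  2.0 * precision * recall / (precision + recall)
end

f1_score(0.8, 0.6)  # ≈ 0.6857
f1_score(0.0, 0.0)  # → 0.0 (degenerate case)
```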