Class: Aws::Bedrock::Types::AutomatedEvaluationConfig
- Inherits: Struct
  - Object
  - Struct
  - Aws::Bedrock::Types::AutomatedEvaluationConfig
- Includes: Structure
- Defined in: lib/aws-sdk-bedrock/types.rb
Overview
The configuration details of an automated evaluation job. The `EvaluationDatasetMetricConfig` object is used to specify the prompt datasets, task type, and metric names.
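For context, below is a minimal sketch of how this type might be supplied when starting an automated evaluation job through `Aws::Bedrock::Client#create_evaluation_job`. The nested field names (`task_type`, `dataset`, `metric_names`, `bedrock_evaluator_models`) and all concrete values (job name, ARNs, S3 URIs, model identifiers, metric names) are illustrative assumptions, not verified values; consult the client documentation for the authoritative request shape.

  require 'aws-sdk-bedrock'

  client = Aws::Bedrock::Client.new(region: 'us-east-1')

  # Sketch only: the nested keys mirror the attribute names documented on this
  # page; the concrete values are placeholders and must be replaced with your
  # own resources.
  client.create_evaluation_job(
    job_name: 'my-automated-eval',                        # hypothetical name
    role_arn: 'arn:aws:iam::111122223333:role/EvalRole',  # placeholder ARN
    evaluation_config: {
      automated: {                                 # Types::AutomatedEvaluationConfig
        dataset_metric_configs: [                  # Array<Types::EvaluationDatasetMetricConfig>
          {
            task_type: 'Summarization',
            dataset: {
              name: 'my-prompt-dataset',
              dataset_location: { s3_uri: 's3://amzn-s3-demo-bucket/prompts.jsonl' }
            },
            metric_names: ['Builtin.Accuracy']
          }
        ],
        evaluator_model_config: {                  # Types::EvaluatorModelConfig
          bedrock_evaluator_models: [
            { model_identifier: 'anthropic.claude-3-haiku-20240307-v1:0' }
          ]
        }
      }
    },
    inference_config: {
      models: [{ bedrock_model: { model_identifier: 'amazon.titan-text-lite-v1' } }]
    },
    output_data_config: { s3_uri: 's3://amzn-s3-demo-bucket/eval-results/' }
  )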
Constant Summary collapse
- SENSITIVE = []
Instance Attribute Summary collapse
-
#dataset_metric_configs ⇒ Array<Types::EvaluationDatasetMetricConfig>
Configuration details of the prompt datasets and metrics you want to use for your evaluation job.
-
#evaluator_model_config ⇒ Types::EvaluatorModelConfig
Contains the evaluator model configuration details.
Instance Attribute Details
#dataset_metric_configs ⇒ Array<Types::EvaluationDatasetMetricConfig>
Configuration details of the prompt datasets and metrics you want to use for your evaluation job.
# File 'lib/aws-sdk-bedrock/types.rb', line 44

class AutomatedEvaluationConfig < Struct.new(
  :dataset_metric_configs,
  :evaluator_model_config)
  SENSITIVE = []
  include Aws::Structure
end
#evaluator_model_config ⇒ Types::EvaluatorModelConfig
Contains the evaluator model configuration details. `EvaluatorModelConfig` is required for evaluation jobs that use a knowledge base or for model evaluation jobs that use a model as the judge. This model computes all evaluation-related metrics.
# File 'lib/aws-sdk-bedrock/types.rb', line 44

class AutomatedEvaluationConfig < Struct.new(
  :dataset_metric_configs,
  :evaluator_model_config)
  SENSITIVE = []
  include Aws::Structure
end
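When an existing job is retrieved, these attributes can be read back from the returned struct. A minimal sketch, assuming the job was created with an automated configuration and that `get_evaluation_job` exposes it under an `automated` member of the evaluation config (an assumption based on the general shape of the Bedrock evaluation API):

  # 'my-automated-eval' is a hypothetical job identifier from the sketch above.
  resp = client.get_evaluation_job(job_identifier: 'my-automated-eval')

  automated = resp.evaluation_config.automated   # Types::AutomatedEvaluationConfig
  automated.dataset_metric_configs.each do |cfg|
    puts "#{cfg.task_type}: #{cfg.metric_names.join(', ')}"
  end
  puts automated.evaluator_model_config.inspect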