Class: Aws::SageMaker::Types::TextGenerationJobConfig
Inherits: Struct
  - Object
  - Struct
  - Aws::SageMaker::Types::TextGenerationJobConfig
Includes: Aws::Structure
Defined in: lib/aws-sdk-sagemaker/types.rb
Overview
The collection of settings used by an AutoML job V2 for the text generation problem type.
<note markdown="1"> The text generation models that support fine-tuning in Autopilot are currently accessible exclusively in Regions supported by Canvas. Refer to the Canvas documentation for the [full list of its supported Regions].
</note>
Constant Summary

SENSITIVE = []
Instance Attribute Summary

- #base_model_name ⇒ String
  The name of the base model to fine-tune.
- #completion_criteria ⇒ Types::AutoMLJobCompletionCriteria
  How long a fine-tuning job is allowed to run.
- #model_access_config ⇒ Types::ModelAccessConfig
  The access configuration file to control access to the ML model.
- #text_generation_hyper_parameters ⇒ Hash<String,String>
  The hyperparameters used to configure and optimize the learning process of the base model.
Instance Attribute Details
#base_model_name ⇒ String
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see [Text generation models supporting fine-tuning in Autopilot]. If no `BaseModelName` is provided, the default model used is Falcon7BInstruct.
# File 'lib/aws-sdk-sagemaker/types.rb', line 42489

class TextGenerationJobConfig < Struct.new(
  :completion_criteria,
  :base_model_name,
  :text_generation_hyper_parameters,
  :model_access_config)
  SENSITIVE = []
  include Aws::Structure
end
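The flattened source above is an ordinary Ruby `Struct` subclass. As a standalone sketch (mirroring only the struct shown here, not the full SDK class, and using `keyword_init` so members can be named the way the SDK's input hashes name them):

```ruby
# Standalone mirror of the Struct definition shown above; the real SDK
# class also includes Aws::Structure for serialization support.
TextGenerationJobConfig = Struct.new(
  :completion_criteria,
  :base_model_name,
  :text_generation_hyper_parameters,
  :model_access_config,
  keyword_init: true
)

# The service defaults BaseModelName to Falcon7BInstruct when omitted;
# here it is passed explicitly for illustration.
config = TextGenerationJobConfig.new(base_model_name: "Falcon7BInstruct")
puts config.base_model_name # => "Falcon7BInstruct"
```

Unset members (here, `completion_criteria` and the others) simply read back as `nil`.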
#completion_criteria ⇒ Types::AutoMLJobCompletionCriteria
How long a fine-tuning job is allowed to run. For `TextGenerationJobConfig` problem types, the `MaxRuntimePerTrainingJobInSeconds` attribute of `AutoMLJobCompletionCriteria` defaults to 72h (259200s).
# File 'lib/aws-sdk-sagemaker/types.rb', line 42489

class TextGenerationJobConfig < Struct.new(
  :completion_criteria,
  :base_model_name,
  :text_generation_hyper_parameters,
  :model_access_config)
  SENSITIVE = []
  include Aws::Structure
end
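The 72-hour default is easy to sanity-check with one line of arithmetic; the hash below is a hedged sketch of the snake_case input shape for `completion_criteria` (plain-hash form, one of the input styles the SDK accepts):

```ruby
# 72 hours expressed in seconds matches the documented 259200 default.
default_max_runtime = 72 * 60 * 60 # => 259200

# Sketch of the completion-criteria input hash (snake_case keys, per the
# SDK's plain-hash input convention).
completion_criteria = {
  max_runtime_per_training_job_in_seconds: default_max_runtime
}
puts completion_criteria[:max_runtime_per_training_job_in_seconds]
```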
#model_access_config ⇒ Types::ModelAccessConfig
The access configuration file to control access to the ML model. You can explicitly accept the model end-user license agreement (EULA) within the `ModelAccessConfig`.
- If you are a JumpStart user, see the [End-user license agreements] section for more details on accepting the EULA.

- If you are an AutoML user, see the *Optional Parameters* section of *Create an AutoML job to fine-tune text generation models using the API* for details on [How to set the EULA acceptance when fine-tuning a model using the AutoML API].
[1]: docs.aws.amazon.com/sagemaker/latest/dg/jumpstart-foundation-models-choose.html#jumpstart-foundation-models-choose-eula [2]: docs.aws.amazon.com/sagemaker/latest/dg/autopilot-create-experiment-finetune-llms.html#autopilot-llms-finetuning-api-optional-params
# File 'lib/aws-sdk-sagemaker/types.rb', line 42489

class TextGenerationJobConfig < Struct.new(
  :completion_criteria,
  :base_model_name,
  :text_generation_hyper_parameters,
  :model_access_config)
  SENSITIVE = []
  include Aws::Structure
end
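In plain-hash form, accepting the EULA comes down to a single boolean. The sketch below assumes the SDK's snake_case input convention (`accept_eula`):

```ruby
# Sketch: the model_access_config input hash with the EULA explicitly
# accepted. Models that require a EULA will not run without this.
model_access_config = { accept_eula: true }
puts model_access_config[:accept_eula]
```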
#text_generation_hyper_parameters ⇒ Hash<String,String>
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see [Optimize the learning process of your text generation models with hyperparameters].
- `"epochCount"`: The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".

- `"batchSize"`: The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".

- `"learningRate"`: The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".

- `"learningRateWarmupSteps"`: The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

`{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }`
[1]: docs.aws.amazon.com/sagemaker/latest/dg/autopilot-llms-finetuning-set-hyperparameters.html
# File 'lib/aws-sdk-sagemaker/types.rb', line 42489

class TextGenerationJobConfig < Struct.new(
  :completion_criteria,
  :base_model_name,
  :text_generation_hyper_parameters,
  :model_access_config)
  SENSITIVE = []
  include Aws::Structure
end
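Because all four hyperparameters are string-encoded numbers with documented ranges, a small client-side check can catch typos before a job is submitted. `validate_hyperparameters` below is a hypothetical helper for illustration, not part of the SDK:

```ruby
# Documented ranges for the four supported hyperparameters.
HYPERPARAMETER_RANGES = {
  "epochCount"              => (1..10),
  "batchSize"               => (1..64),
  "learningRate"            => (0.0..1.0),
  "learningRateWarmupSteps" => (0..250)
}.freeze

# Hypothetical helper: raises ArgumentError when a key is unknown or a
# string value parses outside its documented range; returns the params
# hash unchanged on success.
def validate_hyperparameters(params)
  params.each do |key, value|
    range = HYPERPARAMETER_RANGES.fetch(key) do
      raise ArgumentError, "unsupported hyperparameter: #{key}"
    end
    numeric = key == "learningRate" ? Float(value) : Integer(value)
    unless range.cover?(numeric)
      raise ArgumentError, "#{key}=#{value} is outside #{range}"
    end
  end
  params
end

# The example hash from the documentation above passes validation.
validate_hyperparameters(
  "epochCount" => "5", "learningRate" => "0.5",
  "batchSize" => "32", "learningRateWarmupSteps" => "10"
)
```

Note that `Integer("5.0")` raises, which matches the documented requirement that the three integer-valued hyperparameters be strings containing integers.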