Class: Google::Cloud::DiscoveryEngine::V1::BigQuerySource
- Inherits: Object
- Extended by: Protobuf::MessageExts::ClassMethods
- Includes: Protobuf::MessageExts
- Defined in: proto_docs/google/cloud/discoveryengine/v1/import_config.rb
Overview
BigQuery source to import data from.
Instance Attribute Summary collapse
- #data_schema ⇒ ::String
  The schema to use when parsing the data from the source.
- #dataset_id ⇒ ::String
  Required. The BigQuery data set to copy the data from.
- #gcs_staging_dir ⇒ ::String
  Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters.
- #partition_date ⇒ ::Google::Type::Date
  BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
- #project_id ⇒ ::String
  The project ID or the project number that contains the BigQuery source.
- #table_id ⇒ ::String
  Required. The BigQuery table to copy the data from.
Instance Attribute Details
#data_schema ⇒ ::String
Returns The schema to use when parsing the data from the source.
Supported values for user event imports:
user_event
(default): One UserEvent per row.
Supported values for document imports:
document
(default): One Document format per row. Each document must have a valid Document.id and one of Document.json_data or Document.struct_data.
custom
: One custom data per row in arbitrary format that conforms to the defined Schema of the data store. This can only be used by the GENERIC Data Store vertical.
# File 'proto_docs/google/cloud/discoveryengine/v1/import_config.rb', line 108
class BigQuerySource
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
end
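The supported data_schema values can be sketched as follows (plain Ruby hashes, which gapic-generated clients accept in place of protobuf messages; the field values are illustrative, not defaults pulled from a real project):

```ruby
# One source hash per supported schema; data_schema is the only key shown.
document_source = {
  data_schema: "document"    # default for document imports: one Document per row
}
user_event_source = {
  data_schema: "user_event"  # default for user event imports: one UserEvent per row
}
custom_source = {
  data_schema: "custom"      # arbitrary rows conforming to the data store Schema;
                             # GENERIC Data Store vertical only
}
```

Omitting data_schema falls back to the default for the import type (document or user_event).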
#dataset_id ⇒ ::String
Returns Required. The BigQuery data set to copy the data from, with a length limit of 1,024 characters.
#gcs_staging_dir ⇒ ::String
Returns Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export to go to a specific Cloud Storage directory.
#partition_date ⇒ ::Google::Type::Date
Returns BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
#project_id ⇒ ::String
Returns The project ID or the project number that contains the BigQuery source. Has a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
#table_id ⇒ ::String
Returns Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters.
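Putting the attributes together, a fully populated BigQuerySource might look like the following hash, as it could be passed for the bigquery_source field of an import request. This is a minimal sketch: the project, dataset, table, bucket, and date values are hypothetical, and partition_date follows the google.type.Date shape (year/month/day).

```ruby
# Hypothetical names throughout; limits taken from the attribute docs above.
bigquery_source = {
  project_id:      "my-project",               # optional; inherits from the parent request if omitted
  dataset_id:      "my_dataset",               # required, length limit 1,024 characters
  table_id:        "my_table",                 # required, length limit 1,024 characters
  data_schema:     "document",                 # "document" (default), "user_event", or "custom"
  gcs_staging_dir: "gs://my-bucket/staging",   # optional, length limit 2,000 characters
  partition_date:  { year: 2024, month: 5, day: 1 } # _PARTITIONDATE of a time-partitioned table
}
```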