Class: Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client
- Inherits: Object
- Includes: Paths
- Defined in: lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb
Overview
Client for the BigQueryRead service.
BigQuery Read API.
The Read API can be used to read data from BigQuery.
Defined Under Namespace
Classes: Configuration
Class Method Summary collapse
-
.configure {|config| ... } ⇒ Client::Configuration
Configure the BigQueryRead Client class.
Instance Method Summary collapse
-
#configure {|config| ... } ⇒ Client::Configuration
Configure the BigQueryRead Client instance.
-
#create_read_session(request, options = nil) {|response, operation| ... } ⇒ ::Google::Cloud::Bigquery::Storage::V1::ReadSession
Creates a new read session.
-
#initialize {|config| ... } ⇒ Client
constructor
Create a new BigQueryRead client object.
-
#read_rows(request, options = nil) {|response, operation| ... } ⇒ ::Enumerable<::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse>
Reads rows from the stream in the format prescribed by the ReadSession.
-
#split_read_stream(request, options = nil) {|response, operation| ... } ⇒ ::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse
Splits a given ReadStream into two ReadStream objects.
-
#universe_domain ⇒ String
The effective universe domain.
Methods included from Paths
#project_path, #read_session_path, #read_stream_path, #table_path
Constructor Details
#initialize {|config| ... } ⇒ Client
Create a new BigQueryRead client object.
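As a minimal sketch, a client can be created with the default configuration, or with a block that adjusts configuration values before the connection is established (the timeout shown is illustrative):

require "google/cloud/bigquery/storage/v1"

# Create a client using the default configuration
client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new

# Create a client using a custom configuration
client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new do |config|
  config.timeout = 10.0
end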
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 142

def initialize
  # These require statements are intentionally placed here to initialize
  # the gRPC module only when it's required.
  # See https://github.com/googleapis/toolkit/issues/446
  require "gapic/grpc"
  require "google/cloud/bigquery/storage/v1/storage_services_pb"

  # Create the configuration object
  @config = Configuration.new Client.configure

  # Yield the configuration if needed
  yield @config if block_given?

  # Create credentials
  credentials = @config.credentials
  # Use self-signed JWT if the endpoint is unchanged from default,
  # but only if the default endpoint does not have a region prefix.
  enable_self_signed_jwt = @config.endpoint.nil? ||
                           (@config.endpoint == Configuration::DEFAULT_ENDPOINT &&
                           !@config.endpoint.split(".").first.include?("-"))
  credentials ||= Credentials.default scope: @config.scope,
                                      enable_self_signed_jwt: enable_self_signed_jwt
  if credentials.is_a?(::String) || credentials.is_a?(::Hash)
    credentials = Credentials.new credentials, scope: @config.scope
  end
  @quota_project_id = @config.quota_project
  @quota_project_id ||= credentials.quota_project_id if credentials.respond_to? :quota_project_id

  @big_query_read_stub = ::Gapic::ServiceStub.new(
    ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Stub,
    credentials: credentials,
    endpoint: @config.endpoint,
    endpoint_template: DEFAULT_ENDPOINT_TEMPLATE,
    universe_domain: @config.universe_domain,
    channel_args: @config.channel_args,
    interceptors: @config.interceptors,
    channel_pool_config: @config.channel_pool
  )
end
Class Method Details
.configure {|config| ... } ⇒ Client::Configuration
Configure the BigQueryRead Client class.
See Configuration for a description of the configuration fields.
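As a sketch, class-level defaults can be changed before any clients are created; they then apply to every client created afterwards (the timeout value is an illustrative assumption):

require "google/cloud/bigquery/storage/v1"

# Modify the default configuration for all future BigQueryRead clients
::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.configure do |config|
  config.timeout = 10.0
end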
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 65

def self.configure
  @configure ||= begin
    namespace = ["Google", "Cloud", "Bigquery", "Storage", "V1"]
    parent_config = while namespace.any?
                      parent_name = namespace.join "::"
                      parent_const = const_get parent_name
                      break parent_const.configure if parent_const.respond_to? :configure
                      namespace.pop
                    end
    default_config = Client::Configuration.new parent_config

    default_config.rpcs.create_read_session.timeout = 600.0
    default_config.rpcs.create_read_session.retry_policy = {
      initial_delay: 0.1, max_delay: 60.0, multiplier: 1.3, retry_codes: [4, 14]
    }

    default_config.rpcs.read_rows.timeout = 86_400.0
    default_config.rpcs.read_rows.retry_policy = {
      initial_delay: 0.1, max_delay: 60.0, multiplier: 1.3, retry_codes: [14]
    }

    default_config.rpcs.split_read_stream.timeout = 600.0
    default_config.rpcs.split_read_stream.retry_policy = {
      initial_delay: 0.1, max_delay: 60.0, multiplier: 1.3, retry_codes: [4, 14]
    }

    default_config
  end
  yield @configure if block_given?
  @configure
end
Instance Method Details
#configure {|config| ... } ⇒ Client::Configuration
Configure the BigQueryRead Client instance.
The configuration is set to the derived mode, meaning that values can be changed, but structural changes (adding new fields, etc.) are not allowed. Structural changes should be made on Client.configure instead.
See Configuration for a description of the configuration fields.
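For instance, a single client's configuration can be adjusted or read back after construction without affecting other clients (the timeout value is an illustrative assumption):

# Adjust the configuration of one existing client
client.configure do |config|
  config.timeout = 10.0
end

# Read back the effective per-client configuration
puts client.configure.timeout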
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 112

def configure
  yield @config if block_given?
  @config
end
#create_read_session(request, options = nil) ⇒ ::Google::Cloud::Bigquery::Storage::V1::ReadSession
#create_read_session(parent: nil, read_session: nil, max_stream_count: nil, preferred_min_stream_count: nil) ⇒ ::Google::Cloud::Bigquery::Storage::V1::ReadSession
Creates a new read session. A read session divides the contents of a BigQuery table into one or more streams, which can then be used to read data from the table. The read session also specifies properties of the data to be read, such as a list of columns or a push-down filter describing the rows to be returned.
A particular row can be read by at most one stream. When the caller has reached the end of each stream in the session, then all the data in the table has been read.
Data is assigned to each stream such that roughly the same number of rows can be read from each stream. Because the server-side unit for assigning data is collections of rows, the API does not guarantee that each stream will return the same number of rows. Additionally, the limits are enforced based on the number of pre-filtered rows, so some filters can lead to lopsided assignments.
Read sessions automatically expire 6 hours after they are created and do not require manual clean-up by the caller.
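A minimal sketch of a call, assuming the caller can read the referenced table; the project, dataset, and table names are placeholders:

require "google/cloud/bigquery/storage/v1"

client = ::Google::Cloud::Bigquery::Storage::V1::BigQueryRead::Client.new

# The table path follows projects/{project}/datasets/{dataset}/tables/{table};
# "my-project", "my_dataset", and "my_table" are placeholders.
read_session = client.create_read_session(
  parent: "projects/my-project",
  read_session: {
    table:       "projects/my-project/datasets/my_dataset/tables/my_table",
    data_format: :AVRO
  },
  max_stream_count: 1
)

read_session.streams.each { |stream| puts stream.name }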
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 269

def create_read_session request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?

  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::CreateReadSessionRequest

  # Converts hash and nil to an options object
  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

  # Customize the options with defaults
  metadata = @config.rpcs.create_read_session.metadata.to_h

  # Set x-goog-api-client, x-goog-user-project and x-goog-api-version headers
  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
    lib_name: @config.lib_name, lib_version: @config.lib_version,
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-api-version"] = API_VERSION unless API_VERSION.empty?
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id

  header_params = {}
  if request.read_session&.table
    header_params["read_session.table"] = request.read_session.table
  end

  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header

  options.apply_defaults timeout:      @config.rpcs.create_read_session.timeout,
                         metadata:     metadata,
                         retry_policy: @config.rpcs.create_read_session.retry_policy

  options.apply_defaults timeout:      @config.timeout,
                         metadata:     @config.metadata,
                         retry_policy: @config.retry_policy

  @big_query_read_stub.call_rpc :create_read_session, request, options: options do |response, operation|
    yield response, operation if block_given?
    return response
  end
rescue ::GRPC::BadStatus => e
  raise ::Google::Cloud::Error.from_error(e)
end
#read_rows(request, options = nil) ⇒ ::Enumerable<::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse>
#read_rows(read_stream: nil, offset: nil) ⇒ ::Enumerable<::Google::Cloud::Bigquery::Storage::V1::ReadRowsResponse>
Reads rows from the stream in the format prescribed by the ReadSession. Each response contains one or more table rows, up to a maximum of 100 MiB per response; read requests which attempt to read individual rows larger than 100 MiB will fail.
Each request also returns a set of stream statistics reflecting the current state of the stream.
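A sketch of consuming the stream, assuming read_session was obtained from create_read_session as in the example above; decoding of the serialized Avro or Arrow payload is omitted:

stream_name = read_session.streams.first.name

# read_rows returns a server-streaming enumerable of ReadRowsResponse messages
client.read_rows(read_stream: stream_name, offset: 0).each do |response|
  # Each response carries a block of serialized rows plus stream statistics
  puts "row count: #{response.row_count}"
end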
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 368

def read_rows request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?

  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::ReadRowsRequest

  # Converts hash and nil to an options object
  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

  # Customize the options with defaults
  metadata = @config.rpcs.read_rows.metadata.to_h

  # Set x-goog-api-client, x-goog-user-project and x-goog-api-version headers
  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
    lib_name: @config.lib_name, lib_version: @config.lib_version,
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-api-version"] = API_VERSION unless API_VERSION.empty?
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id

  header_params = {}
  if request.read_stream
    header_params["read_stream"] = request.read_stream
  end

  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header

  options.apply_defaults timeout:      @config.rpcs.read_rows.timeout,
                         metadata:     metadata,
                         retry_policy: @config.rpcs.read_rows.retry_policy

  options.apply_defaults timeout:      @config.timeout,
                         metadata:     @config.metadata,
                         retry_policy: @config.retry_policy

  @big_query_read_stub.call_rpc :read_rows, request, options: options do |response, operation|
    yield response, operation if block_given?
    return response
  end
rescue ::GRPC::BadStatus => e
  raise ::Google::Cloud::Error.from_error(e)
end
#split_read_stream(request, options = nil) ⇒ ::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse
#split_read_stream(name: nil, fraction: nil) ⇒ ::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamResponse
Splits a given ReadStream into two ReadStream objects. These ReadStream objects are referred to as the primary and the residual streams of the split. The original ReadStream can still be read from in the same manner as before. Both of the returned ReadStream objects can also be read from, and the rows returned by both child streams will be the same as the rows read from the original stream.
Moreover, the two child streams will be allocated back-to-back in the original ReadStream. Concretely, it is guaranteed that for streams original, primary, and residual, that original[0-j] = primary[0-j] and original[j-n] = residual[0-m] once the streams have been read to completion.
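A brief sketch of splitting a stream roughly in half, assuming the client and read_session from the earlier examples; the fraction value is illustrative:

# Split an existing stream; the primary stream covers roughly the first half
# of the remaining rows and the residual stream covers the rest.
response = client.split_read_stream(
  name:     read_session.streams.first.name,
  fraction: 0.5
)

puts "primary:  #{response.primary_stream.name}"
puts "residual: #{response.remainder_stream.name}"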
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 473

def split_read_stream request, options = nil
  raise ::ArgumentError, "request must be provided" if request.nil?

  request = ::Gapic::Protobuf.coerce request, to: ::Google::Cloud::Bigquery::Storage::V1::SplitReadStreamRequest

  # Converts hash and nil to an options object
  options = ::Gapic::CallOptions.new(**options.to_h) if options.respond_to? :to_h

  # Customize the options with defaults
  metadata = @config.rpcs.split_read_stream.metadata.to_h

  # Set x-goog-api-client, x-goog-user-project and x-goog-api-version headers
  metadata[:"x-goog-api-client"] ||= ::Gapic::Headers.x_goog_api_client \
    lib_name: @config.lib_name, lib_version: @config.lib_version,
    gapic_version: ::Google::Cloud::Bigquery::Storage::V1::VERSION
  metadata[:"x-goog-api-version"] = API_VERSION unless API_VERSION.empty?
  metadata[:"x-goog-user-project"] = @quota_project_id if @quota_project_id

  header_params = {}
  if request.name
    header_params["name"] = request.name
  end

  request_params_header = header_params.map { |k, v| "#{k}=#{v}" }.join("&")
  metadata[:"x-goog-request-params"] ||= request_params_header

  options.apply_defaults timeout:      @config.rpcs.split_read_stream.timeout,
                         metadata:     metadata,
                         retry_policy: @config.rpcs.split_read_stream.retry_policy

  options.apply_defaults timeout:      @config.timeout,
                         metadata:     @config.metadata,
                         retry_policy: @config.retry_policy

  @big_query_read_stub.call_rpc :split_read_stream, request, options: options do |response, operation|
    yield response, operation if block_given?
    return response
  end
rescue ::GRPC::BadStatus => e
  raise ::Google::Cloud::Error.from_error(e)
end
#universe_domain ⇒ String
The effective universe domain.
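For example (the value shown assumes the default universe domain):

client.universe_domain # => "googleapis.com"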
# File 'lib/google/cloud/bigquery/storage/v1/big_query_read/client.rb', line 122

def universe_domain
  @big_query_read_stub.universe_domain
end