Class: Google::Cloud::Bigquery::QueryJob

Inherits:
  Job < Object
Defined in:
lib/google/cloud/bigquery/query_job.rb

Overview

QueryJob

A Job subclass representing a query operation that may be performed on a Table. A QueryJob instance is created when you call Project#query_job or Dataset#query_job.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

job = bigquery.query_job "SELECT COUNT(word) as count FROM " \
                         "`bigquery-public-data.samples.shakespeare`"

job.wait_until_done!

if job.failed?
  puts job.error
else
  puts job.data.first
end

With multiple statements and child jobs:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

multi_statement_sql = <<~SQL
  -- Declare a variable to hold names as an array.
  DECLARE top_names ARRAY<STRING>;
  -- Build an array of the top 100 names from the year 2017.
  SET top_names = (
    SELECT ARRAY_AGG(name ORDER BY number DESC LIMIT 100)
    FROM `bigquery-public-data.usa_names.usa_1910_current`
    WHERE year = 2017
  );
  -- Which names appear as words in Shakespeare's plays?
  SELECT
    name AS shakespeare_name
  FROM UNNEST(top_names) AS name
  WHERE name IN (
    SELECT word
    FROM `bigquery-public-data.samples.shakespeare`
  );
SQL

job = bigquery.query_job multi_statement_sql

job.wait_until_done!

child_jobs = bigquery.jobs parent_job: job

child_jobs.each do |child_job|
  script_statistics = child_job.script_statistics
  puts script_statistics.evaluation_kind
  script_statistics.stack_frames.each do |stack_frame|
    puts stack_frame.text
  end
end


Direct Known Subclasses

Updater

Defined Under Namespace

Classes: Stage, Step, Updater

Attributes

Instance Method Summary

Methods inherited from Job

#cancel, #configuration, #created_at, #delete, #done?, #ended_at, #error, #errors, #failed?, #job_id, #labels, #location, #num_child_jobs, #parent_job_id, #pending?, #project_id, #reload!, #rerun!, #reservation_usage, #running?, #script_statistics, #session_id, #started_at, #state, #statistics, #status, #transaction_id, #user_email

Instance Method Details

#batch? ⇒ Boolean

Checks if the priority for the query is BATCH.

Returns:

  • (Boolean)

    true when the priority is BATCH, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 96

def batch?
  @gapi.configuration.query.priority == "BATCH"
end

#bytes_processed ⇒ Integer?

The number of bytes processed by the query.

Returns:

  • (Integer, nil)

    Total bytes processed for the job.



# File 'lib/google/cloud/bigquery/query_job.rb', line 213

def bytes_processed
  Integer @gapi.statistics.query.total_bytes_processed
rescue StandardError
  nil
end
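The `Integer(...)` call with a `rescue` fallback, used above, is a general idiom for statistics the API may report as strings or omit entirely. A minimal standalone sketch (the `safe_int` helper name is ours, not part of the library):

```ruby
# Coerce a value that may arrive as a String, Integer, or nil.
# Integer() raises for nil or non-numeric input; rescuing yields nil.
def safe_int value
  Integer value
rescue StandardError
  nil
end

safe_int "123456" # => 123456
safe_int nil      # => nil
safe_int "n/a"    # => nil
```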

#cache? ⇒ Boolean

Checks if the query job looks for an existing result in the query cache. For more information, see Query Caching.

Returns:

  • (Boolean)

    true when the query cache will be used, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 133

def cache?
  val = @gapi.configuration.query.use_query_cache
  return false if val.nil?
  val
end

#cache_hit? ⇒ Boolean

Checks if the query results are from the query cache.

Returns:

  • (Boolean)

    true when the job statistics indicate a cache hit, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 203

def cache_hit?
  return false unless @gapi.statistics.query
  @gapi.statistics.query.cache_hit
end

#clustering? ⇒ Boolean

Returns:

  • (Boolean)

    true when the table will be clustered, or false otherwise.




# File 'lib/google/cloud/bigquery/query_job.rb', line 660

def clustering?
  !@gapi.configuration.query.clustering.nil?
end

#clustering_fields ⇒ Array<String>?

One or more fields on which the destination table should be clustered. When used with time-based partitioning, data in the table is first partitioned and subsequently clustered. The order of the returned fields determines the sort order of the data.

BigQuery supports clustering for both partitioned and non-partitioned tables.

See Google::Cloud::Bigquery::QueryJob::Updater#clustering_fields=, Table#clustering_fields and Table#clustering_fields=.

Returns:

  • (Array<String>, nil)

    The clustering fields, or nil if the destination table will not be clustered.




# File 'lib/google/cloud/bigquery/query_job.rb', line 686

def clustering_fields
  @gapi.configuration.query.clustering.fields if clustering?
end

#data(token: nil, max: nil, start: nil) ⇒ Google::Cloud::Bigquery::Data Also known as: query_results

Retrieves the query results for the job.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

sql = "SELECT word FROM `bigquery-public-data.samples.shakespeare`"
job = bigquery.query_job sql

job.wait_until_done!
data = job.data

# Iterate over the first page of results
data.each do |row|
  puts row[:word]
end
# Retrieve the next page of results
data = data.next if data.next?

Parameters:

  • token (String) (defaults to: nil)

    Page token, returned by a previous call, identifying the result set.

  • max (Integer) (defaults to: nil)

    Maximum number of results to return.

  • start (Integer) (defaults to: nil)

    Zero-based index of the starting row to read.

Returns:



# File 'lib/google/cloud/bigquery/query_job.rb', line 748

def data token: nil, max: nil, start: nil
  return nil unless done?
  return Data.from_gapi_json({ rows: [] }, nil, @gapi, service) if dryrun?
  if ddl? || dml? || !ensure_schema!
    data_hash = { totalRows: nil, rows: [] }
    return Data.from_gapi_json data_hash, nil, @gapi, service
  end

  data_hash = service.list_tabledata destination_table_dataset_id,
                                     destination_table_table_id,
                                     token: token,
                                     max: max,
                                     start: start
  Data.from_gapi_json data_hash, destination_table_gapi, @gapi, service
end
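The `next?`/`next` paging shown in the example above generalizes to draining every page of results. A sketch, assuming any pager that responds to `#each`, `#next?`, and `#next` as `job.data` does (`FakePage` below is a stand-in for illustration, not a library class):

```ruby
# A minimal pager standing in for Google::Cloud::Bigquery::Data.
FakePage = Struct.new(:rows, :next_page) do
  def each(&block)
    rows.each(&block)
  end

  def next?
    !next_page.nil?
  end

  def next
    next_page
  end
end

# Collect rows from every page by following next? / next.
def all_rows page
  rows = []
  loop do
    page.each { |row| rows << row }
    break unless page.next?
    page = page.next
  end
  rows
end

last  = FakePage.new([{ word: "brave" }], nil)
first = FakePage.new([{ word: "hamlet" }], last)
all_rows first # => [{ word: "hamlet" }, { word: "brave" }]
```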

#ddl? ⇒ Boolean

Whether the query is a DDL statement.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
query_job = bigquery.query_job "CREATE TABLE my_table (x INT64)"

query_job.statement_type #=> "CREATE_TABLE"
query_job.ddl? #=> true

Returns:

  • (Boolean)




# File 'lib/google/cloud/bigquery/query_job.rb', line 299

def ddl?
  [
    "ALTER_TABLE",
    "CREATE_MODEL",
    "CREATE_TABLE",
    "CREATE_TABLE_AS_SELECT",
    "CREATE_VIEW",
    "DROP_MODEL",
    "DROP_TABLE",
    "DROP_VIEW"
  ].include? statement_type
end
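Together with #dml? later on this page, the statement-type lists can be used to classify a job. A hypothetical sketch (the constants and `statement_kind` helper are ours, mirroring the lists documented here):

```ruby
# Statement types documented for #ddl? and #dml? on this page.
DDL_TYPES = [
  "ALTER_TABLE", "CREATE_MODEL", "CREATE_TABLE", "CREATE_TABLE_AS_SELECT",
  "CREATE_VIEW", "DROP_MODEL", "DROP_TABLE", "DROP_VIEW"
].freeze
DML_TYPES = ["INSERT", "UPDATE", "MERGE", "DELETE"].freeze

# Classify a statement_type string into :ddl, :dml, or :query.
def statement_kind statement_type
  return :ddl if DDL_TYPES.include? statement_type
  return :dml if DML_TYPES.include? statement_type
  :query
end

statement_kind "CREATE_TABLE" # => :ddl
statement_kind "MERGE"        # => :dml
statement_kind "SELECT"       # => :query
```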

#ddl_operation_performed ⇒ String?

The DDL operation performed, possibly dependent on the pre-existence of the DDL target. (See #ddl_target_table.) Possible values (new values might be added in the future):

  • "CREATE": The query created the DDL target.
  • "SKIP": No-op. Example cases: the query is CREATE TABLE IF NOT EXISTS while the table already exists, or the query is DROP TABLE IF EXISTS while the table does not exist.
  • "REPLACE": The query replaced the DDL target. Example case: the query is CREATE OR REPLACE TABLE, and the table already exists.
  • "DROP": The query deleted the DDL target.

Returns:

  • (String, nil)

    The DDL operation performed.



# File 'lib/google/cloud/bigquery/query_job.rb', line 355

def ddl_operation_performed
  return nil unless @gapi.statistics.query
  @gapi.statistics.query.ddl_operation_performed
end

#ddl_target_routine ⇒ Google::Cloud::Bigquery::Routine?

The DDL target routine, in reference state. (See Routine#reference?.) Present only for CREATE/DROP FUNCTION/PROCEDURE queries. (See #statement_type.)

Returns:



# File 'lib/google/cloud/bigquery/query_job.rb', line 368

def ddl_target_routine
  return nil unless @gapi.statistics.query
  ensure_service!
  routine = @gapi.statistics.query.ddl_target_routine
  return nil unless routine
  Google::Cloud::Bigquery::Routine.new_reference_from_gapi routine, service
end

#ddl_target_table ⇒ Google::Cloud::Bigquery::Table?

The DDL target table, in reference state. (See Table#reference?.) Present only for CREATE/DROP TABLE/VIEW queries. (See #statement_type.)

Returns:



# File 'lib/google/cloud/bigquery/query_job.rb', line 384

def ddl_target_table
  return nil unless @gapi.statistics.query
  ensure_service!
  table = @gapi.statistics.query.ddl_target_table
  return nil unless table
  Google::Cloud::Bigquery::Table.new_reference_from_gapi table, service
end

#deleted_row_count ⇒ Integer?

The number of deleted rows. Present only for DML statements DELETE, MERGE and TRUNCATE. (See #statement_type.)

Returns:

  • (Integer, nil)

    The number of deleted rows, or nil if not applicable.



# File 'lib/google/cloud/bigquery/query_job.rb', line 411

def deleted_row_count
  @gapi.statistics.query&.dml_stats&.deleted_row_count
end

#destination(view: nil) ⇒ Table

The table in which the query results are stored.

Parameters:

  • view (String) (defaults to: nil)

    Specifies the view that determines which table information is returned. By default, basic table information and storage statistics (STORAGE_STATS) are returned. Accepted values include :unspecified, :basic, :storage, and :full. For more information, see BigQuery Classes. The default value is the :unspecified view type.

Returns:

  • (Table)

    A table instance.



# File 'lib/google/cloud/bigquery/query_job.rb', line 448

def destination view: nil
  table = @gapi.configuration.query.destination_table
  return nil unless table
  retrieve_table table.project_id,
                 table.dataset_id,
                 table.table_id,
                 metadata_view: view
end

#dml? ⇒ Boolean

Whether the query is a DML statement.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
query_job = bigquery.query_job "UPDATE my_table " \
                               "SET x = x + 1 " \
                               "WHERE x IS NOT NULL"

query_job.statement_type #=> "UPDATE"
query_job.dml? #=> true

Returns:

  • (Boolean)




# File 'lib/google/cloud/bigquery/query_job.rb', line 331

def dml?
  [
    "INSERT",
    "UPDATE",
    "MERGE",
    "DELETE"
  ].include? statement_type
end

#dryrun? ⇒ Boolean Also known as: dryrun, dry_run, dry_run?

If set, don't actually run this job. A valid query will return a mostly empty response with some processing statistics, while an invalid query will return the same error it would if it wasn't a dry run.

Returns:

  • (Boolean)

    true when the dry run flag is set for the query job, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 148

def dryrun?
  @gapi.configuration.dry_run
end

#encryption ⇒ Google::Cloud::Bigquery::EncryptionConfiguration

The encryption configuration of the destination table.

Returns:

  • (Google::Cloud::Bigquery::EncryptionConfiguration)

    Custom encryption configuration (e.g., Cloud KMS keys).



# File 'lib/google/cloud/bigquery/query_job.rb', line 501

def encryption
  EncryptionConfiguration.from_gapi @gapi.configuration.query.destination_encryption_configuration
end

#flatten? ⇒ Boolean

Checks if the query job flattens nested and repeated fields in the query results. The default is true. If the value is false, #large_results? should return true.

Returns:

  • (Boolean)

    true when the job flattens results, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 163

def flatten?
  val = @gapi.configuration.query.flatten_results
  return true if val.nil?
  val
end

#inserted_row_count ⇒ Integer?

The number of inserted rows. Present only for DML statements INSERT and MERGE. (See #statement_type.)

Returns:

  • (Integer, nil)

    The number of inserted rows, or nil if not applicable.



# File 'lib/google/cloud/bigquery/query_job.rb', line 422

def inserted_row_count
  @gapi.statistics.query&.dml_stats&.inserted_row_count
end

#interactive? ⇒ Boolean

Checks if the priority for the query is INTERACTIVE.

Returns:

  • (Boolean)

    true when the priority is INTERACTIVE, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 106

def interactive?
  val = @gapi.configuration.query.priority
  return true if val.nil?
  val == "INTERACTIVE"
end

#large_results? ⇒ Boolean

Checks if the query job allows arbitrarily large results at a slight cost to performance.

Returns:

  • (Boolean)

    true when large results are allowed, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 119

def large_results?
  val = @gapi.configuration.query.allow_large_results
  return false if val.nil?
  val
end

#legacy_sql? ⇒ Boolean

Checks if the query job is using legacy SQL.

Returns:

  • (Boolean)

    true when legacy SQL is used, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 462

def legacy_sql?
  val = @gapi.configuration.query.use_legacy_sql
  return true if val.nil?
  val
end

#maximum_billing_tier ⇒ Integer?

Limits the billing tier for this job. Queries that have resource usage beyond this tier will raise an error (without incurring a charge). If unspecified, this will be set to your project default. For more information, see High-Compute queries.

Returns:

  • (Integer, nil)

    The tier number, or nil for the project default.



# File 'lib/google/cloud/bigquery/query_job.rb', line 179

def maximum_billing_tier
  @gapi.configuration.query.maximum_billing_tier
end

#maximum_bytes_billed ⇒ Integer?

Limits the bytes billed for this job. Queries that will have bytes billed beyond this limit will raise an error (without incurring a charge). If nil, this will be set to your project default.

Returns:

  • (Integer, nil)

    The number of bytes, or nil for the project default.



# File 'lib/google/cloud/bigquery/query_job.rb', line 191

def maximum_bytes_billed
  Integer @gapi.configuration.query.maximum_bytes_billed
rescue StandardError
  nil
end

#num_dml_affected_rows ⇒ Integer?

The number of rows affected by a DML statement. Present only for DML statements INSERT, UPDATE or DELETE. (See #statement_type.)

Returns:

  • (Integer, nil)

    The number of rows affected by a DML statement, or nil if the query is not a DML statement.



# File 'lib/google/cloud/bigquery/query_job.rb', line 399

def num_dml_affected_rows
  return nil unless @gapi.statistics.query
  @gapi.statistics.query.num_dml_affected_rows
end

#query_plan ⇒ Array<Google::Cloud::Bigquery::QueryJob::Stage>?

Describes the execution plan for the query.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

sql = "SELECT word FROM `bigquery-public-data.samples.shakespeare`"
job = bigquery.query_job sql

job.wait_until_done!

stages = job.query_plan
stages.each do |stage|
  puts stage.name
  stage.steps.each do |step|
    puts step.kind
    puts step.substeps.inspect
  end
end

Returns:



# File 'lib/google/cloud/bigquery/query_job.rb', line 244

def query_plan
  return nil unless @gapi&.statistics&.query&.query_plan
  Array(@gapi.statistics.query.query_plan).map { |stage| Stage.from_gapi stage }
end

#range_partitioning? ⇒ Boolean

Checks if the destination table will be range partitioned. See Creating and using integer range partitioned tables.

Returns:

  • (Boolean)

    true when the table is range partitioned, or false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 513

def range_partitioning?
  !@gapi.configuration.query.range_partitioning.nil?
end

#range_partitioning_end ⇒ Integer?

The end of range partitioning, exclusive. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The end of range partitioning, exclusive, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/query_job.rb', line 564

def range_partitioning_end
  @gapi.configuration.query.range_partitioning.range.end if range_partitioning?
end

#range_partitioning_field ⇒ String?

The field on which the destination table will be range partitioned, if any. The field must be a top-level NULLABLE/REQUIRED field. The only supported type is INTEGER/INT64. See Creating and using integer range partitioned tables.

Returns:

  • (String, nil)

    The partition field, if a field was configured, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/query_job.rb', line 527

def range_partitioning_field
  @gapi.configuration.query.range_partitioning.field if range_partitioning?
end

#range_partitioning_interval ⇒ Integer?

The width of each interval. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The width of each interval, for data in range partitions, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/query_job.rb', line 552

def range_partitioning_interval
  @gapi.configuration.query.range_partitioning.range.interval if range_partitioning?
end

#range_partitioning_start ⇒ Integer?

The start of range partitioning, inclusive. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The start of range partitioning, inclusive, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/query_job.rb', line 539

def range_partitioning_start
  @gapi.configuration.query.range_partitioning.range.start if range_partitioning?
end

#standard_sql? ⇒ Boolean

Checks if the query job is using standard SQL.

Returns:

  • (Boolean)

    true when standard SQL is used, false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 473

def standard_sql?
  !legacy_sql?
end

#statement_type ⇒ String?

The type of query statement, if valid, such as "SELECT" or one of the DDL and DML statement types listed under #ddl? and #dml?. New values might be added in the future.

Returns:

  • (String, nil)

    The type of query statement.



# File 'lib/google/cloud/bigquery/query_job.rb', line 277

def statement_type
  return nil unless @gapi.statistics.query
  @gapi.statistics.query.statement_type
end

#time_partitioning? ⇒ Boolean

Checks if the destination table will be time-partitioned. See Partitioned Tables.

Returns:

  • (Boolean)

    true when the table will be time-partitioned, or false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 577

def time_partitioning?
  !@gapi.configuration.query.time_partitioning.nil?
end

#time_partitioning_expiration ⇒ Integer?

The expiration for the destination table partitions, if any, in seconds. See Partitioned Tables.

Returns:

  • (Integer, nil)

    The expiration time, in seconds, for data in partitions, or nil if not present.



# File 'lib/google/cloud/bigquery/query_job.rb', line 622

def time_partitioning_expiration
  tp = @gapi.configuration.query.time_partitioning
  tp.expiration_ms / 1_000 if tp && !tp.expiration_ms.nil?
end
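As the code above shows, the API reports the expiration in milliseconds (`expiration_ms`) while this method returns seconds. A quick sketch of the conversion (the `expiration_seconds` helper is ours, for illustration):

```ruby
# Convert a millisecond expiration, as the API reports it, to the
# seconds this method returns. Integer division truncates any
# sub-second remainder.
def expiration_seconds expiration_ms
  return nil if expiration_ms.nil?
  expiration_ms / 1_000
end

expiration_seconds 86_400_000 # => 86_400 (one day)
expiration_seconds nil        # => nil
```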

#time_partitioning_field ⇒ String?

The field on which the destination table will be partitioned, if any. If not set, the destination table will be partitioned by pseudo column _PARTITIONTIME; if set, the table will be partitioned by this field. See Partitioned Tables.

Returns:

  • (String, nil)

    The partition field, if a field was configured. nil if not partitioned or not set (partitioned by pseudo column '_PARTITIONTIME').



# File 'lib/google/cloud/bigquery/query_job.rb', line 607

def time_partitioning_field
  return nil unless time_partitioning?
  @gapi.configuration.query.time_partitioning.field
end

#time_partitioning_require_filter? ⇒ Boolean

If set to true, queries over the destination table will require a partition filter that can be used for partition elimination to be specified. See Partitioned Tables.

Returns:

  • (Boolean)

    true when a partition filter will be required, or false otherwise.



# File 'lib/google/cloud/bigquery/query_job.rb', line 638

def time_partitioning_require_filter?
  tp = @gapi.configuration.query.time_partitioning
  return false if tp.nil? || tp.require_partition_filter.nil?
  tp.require_partition_filter
end

#time_partitioning_type ⇒ String?

The period for which the destination table will be partitioned, if any. See Partitioned Tables.

Returns:

  • (String, nil)

    The partition type. The supported types are DAY, HOUR, MONTH, and YEAR, which will generate one partition per day, hour, month, and year, respectively; or nil if not present.



# File 'lib/google/cloud/bigquery/query_job.rb', line 591

def time_partitioning_type
  @gapi.configuration.query.time_partitioning.type if time_partitioning?
end

#udfs ⇒ Array<String>

The user-defined function resources used in the query. May be either a code resource to load from a Google Cloud Storage URI (gs://bucket/path), or an inline resource that contains code for a user-defined function (UDF). Providing an inline code resource is equivalent to providing a URI for a file containing the same code. See User-Defined Functions.

Returns:

  • (Array<String>)

    An array containing Google Cloud Storage URIs and/or inline source code.



# File 'lib/google/cloud/bigquery/query_job.rb', line 488

def udfs
  udfs_gapi = @gapi.configuration.query.user_defined_function_resources
  return nil unless udfs_gapi
  Array(udfs_gapi).map { |udf| udf.inline_code || udf.resource_uri }
end
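Because the returned array mixes Cloud Storage URIs and inline code, callers may want to separate the two. A sketch assuming the `gs://` prefix distinguishes them (`partition_udfs` is our name, not a library method):

```ruby
# Split a #udfs-style array into [storage_uris, inline_bodies].
def partition_udfs udfs
  Array(udfs).partition { |udf| udf.start_with? "gs://" }
end

uris, inline = partition_udfs [
  "gs://my-bucket/my-udf.js",
  "CREATE TEMP FUNCTION plus_one(x INT64) AS (x + 1);"
]
uris.length   # => 1
inline.length # => 1
```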

#updated_row_count ⇒ Integer?

The number of updated rows. Present only for DML statements UPDATE and MERGE. (See #statement_type.)

Returns:

  • (Integer, nil)

    The number of updated rows, or nil if not applicable.



# File 'lib/google/cloud/bigquery/query_job.rb', line 433

def updated_row_count
  @gapi.statistics.query&.dml_stats&.updated_row_count
end

#wait_until_done! ⇒ Object

Refreshes the job until the job is DONE. The delay between refreshes will incrementally increase.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new

sql = "SELECT word FROM `bigquery-public-data.samples.shakespeare`"
job = bigquery.query_job sql

job.wait_until_done!
job.done? #=> true


# File 'lib/google/cloud/bigquery/query_job.rb', line 705

def wait_until_done!
  return if done?

  ensure_service!
  loop do
    query_results_gapi = service.job_query_results job_id, location: location, max: 0
    if query_results_gapi.job_complete
      @destination_schema_gapi = query_results_gapi.schema
      break
    end
  end
  reload!
end
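The description mentions an incrementally increasing delay between refreshes. A generic polling sketch in that spirit (the `poll_until` helper, its delay schedule, and the block protocol are our own illustration, not the library's internals):

```ruby
# Poll a block until it returns truthy, sleeping a little longer
# after each failed attempt, capped at max_delay seconds. Returns
# true if the block succeeded within the retry budget, else false.
def poll_until retries: 10, delay: 0.5, max_delay: 5.0
  retries.times do
    return true if yield
    sleep delay
    delay = [delay * 1.5, max_delay].min
  end
  false
end

attempts = 0
done = poll_until delay: 0.01 do
  attempts += 1
  attempts >= 3 # pretend the job completes on the third poll
end
done # => true
```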