Class: Google::Cloud::Bigquery::LoadJob

Inherits:
Job
  • Object
Defined in:
lib/google/cloud/bigquery/load_job.rb

Overview

LoadJob

A Job subclass representing a load operation that may be performed on a Table. A LoadJob instance is created when you call Table#load_job.

Examples:

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset = bigquery.dataset "my_dataset"

gs_url = "gs://my-bucket/file-name.csv"
load_job = dataset.load_job "my_new_table", gs_url do |schema|
  schema.string "first_name", mode: :required
  schema.record "cities_lived", mode: :repeated do |nested_schema|
    nested_schema.string "place", mode: :required
    nested_schema.integer "number_of_years", mode: :required
  end
end

load_job.wait_until_done!
load_job.done? #=> true


Direct Known Subclasses

Updater

Defined Under Namespace

Classes: Updater

Instance Method Summary

Methods inherited from Job

#cancel, #configuration, #created_at, #done?, #ended_at, #error, #errors, #failed?, #job_id, #labels, #location, #num_child_jobs, #parent_job_id, #pending?, #project_id, #reload!, #rerun!, #running?, #script_statistics, #started_at, #state, #statistics, #status, #user_email, #wait_until_done!

Instance Method Details

#allow_jagged_rows? ⇒ Boolean

Checks if the load operation accepts rows that are missing trailing optional columns. The missing values are treated as nulls. If false, records with missing trailing columns are treated as bad records, and if there are too many bad records, an error is returned. The default value is false. Only applicable to CSV, ignored for other formats.

Returns:

  • (Boolean)

    true when jagged rows are allowed, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 235

def allow_jagged_rows?
  val = @gapi.configuration.load.allow_jagged_rows
  val = false if val.nil?
  val
end
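
Example (illustrative sketch; the jagged_rows: option name on Dataset#load_job and the bucket path are assumptions, not taken from this page):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

# jagged_rows: true asks BigQuery to accept CSV rows that are missing
# trailing optional columns (assumed option name).
load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv",
                            jagged_rows: true

load_job.allow_jagged_rows? #=> true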

#autodetect? ⇒ Boolean

Checks if BigQuery should automatically infer the options and schema for CSV and JSON sources. The default is false.

Returns:

  • (Boolean)

    true when autodetect is enabled, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 184

def autodetect?
  val = @gapi.configuration.load.autodetect
  val = false if val.nil?
  val
end

#backup? ⇒ Boolean

Checks if the source data is a Google Cloud Datastore backup.

Returns:

  • (Boolean)

    true when the source format is DATASTORE_BACKUP, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 220

def backup?
  val = @gapi.configuration.load.source_format
  val == "DATASTORE_BACKUP"
end

#clustering? ⇒ Boolean?

Checks if the destination table will be clustered.

Returns:

  • (Boolean, nil)

    true when the table will be clustered, or false otherwise.




# File 'lib/google/cloud/bigquery/load_job.rb', line 501

def clustering?
  !@gapi.configuration.load.clustering.nil?
end

#clustering_fields ⇒ Array<String>?

One or more fields on which the destination table should be clustered. Must be specified together with time-based partitioning; the data in the table will first be partitioned and subsequently clustered. The order of the returned fields determines the sort order of the data.

See Google::Cloud::Bigquery::LoadJob::Updater#clustering_fields=.

Returns:

  • (Array<String>, nil)

    The clustering fields, or nil if the destination table will not be clustered.




# File 'lib/google/cloud/bigquery/load_job.rb', line 525

def clustering_fields
  @gapi.configuration.load.clustering.fields if clustering?
end
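
Example (illustrative sketch; the time_partitioning_type= setter on the Updater is an assumption, while #clustering_fields= is referenced above):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv" do |job|
  job.time_partitioning_type = "DAY"                  # assumed Updater setter
  job.clustering_fields = ["last_name", "first_name"]
end

load_job.clustering?       #=> true
load_job.clustering_fields #=> ["last_name", "first_name"]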

#csv? ⇒ Boolean

Checks if the format of the source data is CSV. The default is true.

Returns:

  • (Boolean)

    true when the source format is CSV, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 208

def csv?
  val = @gapi.configuration.load.source_format
  return true if val.nil?
  val == "CSV"
end
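
Example (illustrative sketch; the format: option name and its "json" value on Dataset#load_job are assumptions):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.json",
                            format: "json" # assumed option name/value

load_job.json?   #=> true
load_job.csv?    #=> false
load_job.backup? #=> false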

#delimiter ⇒ String

The delimiter used between fields in the source data. The default is a comma (,).

Returns:

  • (String)

    A string containing the character, such as ",".



# File 'lib/google/cloud/bigquery/load_job.rb', line 79

def delimiter
  @gapi.configuration.load.field_delimiter || ","
end

#destination ⇒ Table

The table into which the operation loads data. This is the table on which Table#load_job was invoked.

Returns:

  • (Table)

    A table instance.



# File 'lib/google/cloud/bigquery/load_job.rb', line 67

def destination
  table = @gapi.configuration.load.destination_table
  return nil unless table
  retrieve_table table.project_id, table.dataset_id, table.table_id
end
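
Example (illustrative sketch using only calls documented on this page and in Job; table and bucket names are placeholders):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv"
load_job.wait_until_done!

table = load_job.destination
table.table_id #=> "my_table"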

#encryption ⇒ Google::Cloud::BigQuery::EncryptionConfiguration

The encryption configuration of the destination table.

Returns:

  • (Google::Cloud::BigQuery::EncryptionConfiguration)

    Custom encryption configuration (e.g., Cloud KMS keys).



# File 'lib/google/cloud/bigquery/load_job.rb', line 332

def encryption
  EncryptionConfiguration.from_gapi(
    @gapi.configuration.load.destination_encryption_configuration
  )
end
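
Example (illustrative sketch; Project#encryption and the Updater#encryption= setter are assumptions, and the KMS key name is a placeholder):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

key_name = "projects/my-project/locations/us/keyRings/my-kr/cryptoKeys/my-key"
encrypt_config = bigquery.encryption kms_key: key_name # assumed helper

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv" do |job|
  job.encryption = encrypt_config # assumed Updater setter
end

load_job.encryption.kms_key #=> key_name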

#ignore_unknown_values? ⇒ Boolean

Checks if the load operation allows extra values that are not represented in the table schema. If true, the extra values are ignored. If false, records with extra columns are treated as bad records, and if there are too many bad records, an invalid error is returned. The default is false.

Returns:

  • (Boolean)

    true when unknown values are ignored, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 251

def ignore_unknown_values?
  val = @gapi.configuration.load.ignore_unknown_values
  val = false if val.nil?
  val
end

#input_file_bytes ⇒ Integer

The number of bytes of source data in the load job.

Returns:

  • (Integer)

    The number of bytes.



# File 'lib/google/cloud/bigquery/load_job.rb', line 307

def input_file_bytes
  Integer @gapi.statistics.load.input_file_bytes
rescue StandardError
  nil
end

#input_files ⇒ Integer

The number of source data files in the load job.

Returns:

  • (Integer)

    The number of source files.



# File 'lib/google/cloud/bigquery/load_job.rb', line 296

def input_files
  Integer @gapi.statistics.load.input_files
rescue StandardError
  nil
end

#iso8859_1? ⇒ Boolean

Checks if the character encoding of the data is ISO-8859-1.

Returns:

  • (Boolean)

    true when the character encoding is ISO-8859-1, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 114

def iso8859_1?
  val = @gapi.configuration.load.encoding
  val == "ISO-8859-1"
end

#json? ⇒ Boolean

Checks if the format of the source data is newline-delimited JSON. The default is false.

Returns:

  • (Boolean)

    true when the source format is NEWLINE_DELIMITED_JSON, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 197

def json?
  val = @gapi.configuration.load.source_format
  val == "NEWLINE_DELIMITED_JSON"
end

#max_bad_records ⇒ Integer

The maximum number of bad records that the load operation can ignore. If the number of bad records exceeds this value, an error is returned. The default value is 0, which requires that all records be valid.

Returns:

  • (Integer)

    The maximum number of bad records.



# File 'lib/google/cloud/bigquery/load_job.rb', line 141

def max_bad_records
  val = @gapi.configuration.load.max_bad_records
  val = 0 if val.nil?
  val
end
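
Example (illustrative sketch; the max_bad_records: and ignore_unknown: option names on Dataset#load_job are assumptions):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

# Tolerate up to 10 bad rows and skip columns not present in the schema.
load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv",
                            max_bad_records: 10, ignore_unknown: true

load_job.max_bad_records        #=> 10
load_job.ignore_unknown_values? #=> true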

#null_marker ⇒ String

Specifies a string that represents a null value in a CSV file. For example, if you specify \N, BigQuery interprets \N as a null value when loading a CSV file. The default value is the empty string. If you set this property to a custom value, BigQuery throws an error if an empty string is present for all data types except for STRING and BYTE. For STRING and BYTE columns, BigQuery interprets the empty string as an empty value.

Returns:

  • (String)

    A string representing the null value in a CSV file.



# File 'lib/google/cloud/bigquery/load_job.rb', line 158

def null_marker
  val = @gapi.configuration.load.null_marker
  val = "" if val.nil?
  val
end

#output_bytes ⇒ Integer

The number of bytes that have been loaded into the table. While an import job is in the running state, this value may change.

Returns:

  • (Integer)

    The number of bytes that have been loaded.



# File 'lib/google/cloud/bigquery/load_job.rb', line 344

def output_bytes
  Integer @gapi.statistics.load.output_bytes
rescue StandardError
  nil
end

#output_rows ⇒ Integer

The number of rows that have been loaded into the table. While an import job is in the running state, this value may change.

Returns:

  • (Integer)

    The number of rows that have been loaded.



# File 'lib/google/cloud/bigquery/load_job.rb', line 319

def output_rows
  Integer @gapi.statistics.load.output_rows
rescue StandardError
  nil
end
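
Example (illustrative sketch using only calls documented on this page and in Job; the statistics values shown are made up):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv"
load_job.wait_until_done!

# Statistics are reported by the service once the job has run.
load_job.input_files      #=> 1
load_job.input_file_bytes #=> 2048
load_job.output_rows      #=> 100
load_job.output_bytes     #=> 4096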

#quote ⇒ String

The value that is used to quote data sections in a CSV file. The default value is a double-quote ("). If your data does not contain quoted sections, the value should be an empty string. If your data contains quoted newline characters, #quoted_newlines? should return true.

Returns:

  • (String)

    A string containing the character, such as "\"".



# File 'lib/google/cloud/bigquery/load_job.rb', line 128

def quote
  val = @gapi.configuration.load.quote
  val = "\"" if val.nil?
  val
end

#quoted_newlines? ⇒ Boolean

Checks if quoted data sections may contain newline characters in a CSV file. The default is false.

Returns:

  • (Boolean)

    true when quoted newlines are allowed, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 171

def quoted_newlines?
  val = @gapi.configuration.load.allow_quoted_newlines
  val = false if val.nil?
  val
end
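
Example (illustrative sketch; the delimiter:, skip_leading:, quoted_newlines:, and null_marker: option names on Dataset#load_job are assumptions):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.tsv",
                            delimiter: "\t", skip_leading: 1,
                            quoted_newlines: true, null_marker: "\\N"

load_job.delimiter         #=> "\t"
load_job.skip_leading_rows #=> 1
load_job.quoted_newlines?  #=> true
load_job.null_marker       #=> "\\N"
load_job.quote             #=> "\""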

#range_partitioning? ⇒ Boolean

Checks if the destination table will be range partitioned. See Creating and using integer range partitioned tables.

Returns:

  • (Boolean)

    true when the table is range partitioned, or false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 358

def range_partitioning?
  !@gapi.configuration.load.range_partitioning.nil?
end

#range_partitioning_end ⇒ Integer?

The end of range partitioning, exclusive. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The end of range partitioning, exclusive, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/load_job.rb', line 410

def range_partitioning_end
  @gapi.configuration.load.range_partitioning.range.end if range_partitioning?
end

#range_partitioning_field ⇒ String?

The field on which the destination table will be range partitioned, if any. The field must be a top-level NULLABLE/REQUIRED field. The only supported type is INTEGER/INT64. See Creating and using integer range partitioned tables.

Returns:

  • (String, nil)

    The partition field, if a field was configured, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/load_job.rb', line 372

def range_partitioning_field
  @gapi.configuration.load.range_partitioning.field if range_partitioning?
end

#range_partitioning_interval ⇒ Integer?

The width of each interval. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The width of each interval, for data in range partitions, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/load_job.rb', line 397

def range_partitioning_interval
  return nil unless range_partitioning?
  @gapi.configuration.load.range_partitioning.range.interval
end

#range_partitioning_start ⇒ Integer?

The start of range partitioning, inclusive. See Creating and using integer range partitioned tables.

Returns:

  • (Integer, nil)

    The start of range partitioning, inclusive, or nil if not range partitioned.



# File 'lib/google/cloud/bigquery/load_job.rb', line 384

def range_partitioning_start
  @gapi.configuration.load.range_partitioning.range.start if range_partitioning?
end
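
Example (illustrative sketch; the range_partitioning_*= setters and the schema block on the Updater are assumptions, not confirmed by this page):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv" do |job|
  job.schema do |schema|                           # assumed Updater method
    schema.integer "customer_id", mode: :required
  end
  job.range_partitioning_field    = "customer_id"  # assumed Updater setters
  job.range_partitioning_start    = 0
  job.range_partitioning_interval = 100
  job.range_partitioning_end      = 1000
end

load_job.range_partitioning?         #=> true
load_job.range_partitioning_field    #=> "customer_id"
load_job.range_partitioning_start    #=> 0
load_job.range_partitioning_interval #=> 100
load_job.range_partitioning_end      #=> 1000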

#schema ⇒ Schema?

The schema for the destination table. The schema can be omitted if the destination table already exists, or if you're loading data from Google Cloud Datastore.

The returned object is frozen and changes are not allowed. Use Table#schema to update the schema.

Returns:

  • (Schema, nil)

    A schema object, or nil.



# File 'lib/google/cloud/bigquery/load_job.rb', line 267

def schema
  Schema.from_gapi(@gapi.configuration.load.schema).freeze
end
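
Example (illustrative sketch following the Overview example; the values read back use only methods documented on this page):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv" do |schema|
  schema.string  "first_name", mode: :required
  schema.integer "age"
end

load_job.schema.fields.map(&:name) #=> ["first_name", "age"]
load_job.schema.frozen?            #=> true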

#schema_update_options ⇒ Array<String>

Allows the schema of the destination table to be updated as a side effect of the load job if a schema is autodetected or supplied in the job configuration. Schema update options are supported in two cases: when write disposition is WRITE_APPEND; when write disposition is WRITE_TRUNCATE and the destination table is a partition of a table, specified by partition decorators. For normal tables, WRITE_TRUNCATE will always overwrite the schema. One or more of the following values are specified:

  • ALLOW_FIELD_ADDITION: allow adding a nullable field to the schema.
  • ALLOW_FIELD_RELAXATION: allow relaxing a required field in the original schema to nullable.

Returns:

  • (Array<String>)

    An array of strings.



# File 'lib/google/cloud/bigquery/load_job.rb', line 287

def schema_update_options
  Array @gapi.configuration.load.schema_update_options
end

#skip_leading_rows ⇒ Integer

The number of rows at the top of a CSV file that BigQuery will skip when loading the data. The default value is 0. This property is useful if you have header rows in the file that should be skipped.

Returns:

  • (Integer)

    The number of header rows at the top of a CSV file to skip.



# File 'lib/google/cloud/bigquery/load_job.rb', line 91

def skip_leading_rows
  @gapi.configuration.load.skip_leading_rows || 0
end

#sources ⇒ Object

The URI or URIs representing the Google Cloud Storage files from which the operation loads data.



# File 'lib/google/cloud/bigquery/load_job.rb', line 57

def sources
  Array @gapi.configuration.load.source_uris
end
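
Example (illustrative sketch; passing an array of Cloud Storage URIs to Dataset#load_job is assumed to be supported, and the bucket paths are placeholders):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

urls = ["gs://my-bucket/part-0.csv", "gs://my-bucket/part-1.csv"]
load_job = dataset.load_job "my_table", urls

load_job.sources #=> ["gs://my-bucket/part-0.csv", "gs://my-bucket/part-1.csv"]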

#time_partitioning? ⇒ Boolean?

Checks if the destination table will be time partitioned. See Partitioned Tables.

Returns:

  • (Boolean, nil)

    true when the table will be time-partitioned, or false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 423

def time_partitioning?
  !@gapi.configuration.load.time_partitioning.nil?
end

#time_partitioning_expiration ⇒ Integer?

The expiration for the destination table time partitions, if any, in seconds. See Partitioned Tables.

Returns:

  • (Integer, nil)

    The expiration time, in seconds, for data in time partitions, or nil if not present.



# File 'lib/google/cloud/bigquery/load_job.rb', line 466

def time_partitioning_expiration
  return nil unless time_partitioning?
  return nil if @gapi.configuration.load.time_partitioning.expiration_ms.nil?

  @gapi.configuration.load.time_partitioning.expiration_ms / 1_000
end

#time_partitioning_field ⇒ String?

The field on which the destination table will be time partitioned, if any. If not set, the destination table will be time partitioned by pseudo column _PARTITIONTIME; if set, the table will be time partitioned by this field. See Partitioned Tables.

Returns:

  • (String, nil)

    The time partition field, if a field was configured, or nil if the table is not time partitioned or no field was set (time partitioned by the pseudo column '_PARTITIONTIME').



# File 'lib/google/cloud/bigquery/load_job.rb', line 452

def time_partitioning_field
  @gapi.configuration.load.time_partitioning.field if time_partitioning?
end

#time_partitioning_require_filter? ⇒ Boolean

If set to true, queries over the destination table will be required to specify a time partition filter that can be used for partition elimination. See Partitioned Tables.

Returns:

  • (Boolean)

    true when a time partition filter will be required, or false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 484

def time_partitioning_require_filter?
  tp = @gapi.configuration.load.time_partitioning
  return false if tp.nil? || tp.require_partition_filter.nil?
  tp.require_partition_filter
end

#time_partitioning_type ⇒ String?

The period for which the destination table will be time partitioned, if any. See Partitioned Tables.

Returns:

  • (String, nil)

    The time partition type. Currently the only supported value is "DAY", or nil if not present.



# File 'lib/google/cloud/bigquery/load_job.rb', line 436

def time_partitioning_type
  @gapi.configuration.load.time_partitioning.type if time_partitioning?
end
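
Example (illustrative sketch; the time_partitioning_*= setters and the schema block on the Updater are assumptions, not confirmed by this page):

require "google/cloud/bigquery"

bigquery = Google::Cloud::Bigquery.new
dataset  = bigquery.dataset "my_dataset"

load_job = dataset.load_job "my_table", "gs://my-bucket/data.csv" do |job|
  job.time_partitioning_type       = "DAY"   # assumed Updater setters
  job.time_partitioning_field      = "dob"
  job.time_partitioning_expiration = 86_400  # one day, in seconds
  job.schema do |schema|                     # assumed Updater method
    schema.timestamp "dob", mode: :required
  end
end

load_job.time_partitioning?           #=> true
load_job.time_partitioning_type       #=> "DAY"
load_job.time_partitioning_field      #=> "dob"
load_job.time_partitioning_expiration #=> 86400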

#utf8? ⇒ Boolean

Checks if the character encoding of the data is UTF-8. This is the default.

Returns:

  • (Boolean)

    true when the character encoding is UTF-8, false otherwise.



# File 'lib/google/cloud/bigquery/load_job.rb', line 102

def utf8?
  val = @gapi.configuration.load.encoding
  return true if val.nil?
  val == "UTF-8"
end