Class: Google::Apis::BigquerydatatransferV1::TransferConfig

Inherits: Object
Includes:
Core::Hashable, Core::JsonObjectSupport
Defined in:
generated/google/apis/bigquerydatatransfer_v1/classes.rb,
generated/google/apis/bigquerydatatransfer_v1/representations.rb

Overview

Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account.
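As a hedged illustration (not part of the generated API surface), the sketch below builds a TransferConfig in memory with the google-api-client gem and serializes it with the mixed-in #to_json. Every attribute value is a placeholder, and the data source ID, schedule string, and params key are assumptions about a particular data source rather than requirements of this class.

require 'google/apis/bigquerydatatransfer_v1'

# Build a transfer configuration in memory. Every value below is a placeholder;
# the data source ID, schedule, and params keys depend on the chosen data source.
config = Google::Apis::BigquerydatatransferV1::TransferConfig.new(
  destination_dataset_id: 'my_dataset',
  display_name: 'Nightly import',
  data_source_id: 'scheduled_query',
  schedule: 'first sunday of quarter 00:00',
  params: { 'query' => 'SELECT 1' }
)

# Core::JsonObjectSupport#to_json and Core::Hashable#to_h are mixed in,
# so the object can be serialized or inspected directly.
puts config.to_json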

Instance Attribute Summary

Instance Method Summary

Methods included from Core::JsonObjectSupport

#to_json

Methods included from Core::Hashable

process_value, #to_h

Constructor Details

#initialize(**args) ⇒ TransferConfig

Returns a new instance of TransferConfig



# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 663

def initialize(**args)
   update!(**args)
end

Instance Attribute Details

#data_refresh_window_days ⇒ Fixnum

The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value. Corresponds to the JSON property dataRefreshWindowDays

Returns:

  • (Fixnum)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 580

def data_refresh_window_days
  @data_refresh_window_days
end
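As a small sketch, the refresh window is just an integer number of days on the config object; the value 10 below mirrors the [today-10, today-1] example above and is applied to the config built in the Overview sketch.

# Re-ingest the previous 10 days on every run; 0 falls back to the
# data source's default window.
config.data_refresh_window_days = 10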

#data_source_id ⇒ String

Data source id. Cannot be changed once data transfer is created. Corresponds to the JSON property dataSourceId

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 585

def data_source_id
  @data_source_id
end

#dataset_region ⇒ String

Output only. Region in which BigQuery dataset is located. Corresponds to the JSON property datasetRegion

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 590

def dataset_region
  @dataset_region
end

#destination_dataset_id ⇒ String

The BigQuery target dataset id. Corresponds to the JSON property destinationDatasetId

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 595

def destination_dataset_id
  @destination_dataset_id
end

#disabled ⇒ Boolean Also known as: disabled?

Is this config disabled. When set to true, no runs are scheduled for a given transfer. Corresponds to the JSON property disabled

Returns:

  • (Boolean)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 601

def disabled
  @disabled
end

#display_name ⇒ String

User specified display name for the data transfer. Corresponds to the JSON property displayName

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 607

def display_name
  @display_name
end

#name ⇒ String

The resource name of the transfer config. Transfer config names have the form projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config. Corresponds to the JSON property name

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 617

def name
  @name
end
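The documented form can be illustrated with a placeholder value; the project and config IDs below are hypothetical, and per the note above the field is ignored when creating a config anyway.

# Placeholder only: the server assigns the real name; config_id is usually a UUID.
config.name = 'projects/example-project/transferConfigs/12345678-aaaa-bbbb-cccc-1234567890ab'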

#next_run_time ⇒ String

Output only. Next time when data transfer will run. Corresponds to the JSON property nextRunTime

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 622

def next_run_time
  @next_run_time
end

#params ⇒ Hash<String,Object>

Data transfer specific parameters. Corresponds to the JSON property params

Returns:

  • (Hash<String,Object>)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 627

def params
  @params
end
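The params hash is keyed by data-source-specific parameter names; the keys below are illustrative for a scheduled-query style source and are not guaranteed by this class, which only requires a Hash<String,Object>.

# Keys and values are defined by the chosen data source; these are placeholders.
config.params = {
  'query'             => 'SELECT 1',     # hypothetical parameter name
  'write_disposition' => 'WRITE_APPEND'  # hypothetical parameter name
}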

#schedule ⇒ String

Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: the granularity should be at least 8 hours, or less frequent. Corresponds to the JSON property schedule

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 644

def schedule
  @schedule
end
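Any of the documented example strings above can be assigned directly; remember that the effective granularity must be at least 8 hours.

# Uses one of the example formats from the description above.
config.schedule = 'first sunday of quarter 00:00'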

#state ⇒ String

Output only. State of the most recently updated transfer run. Corresponds to the JSON property state

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 649

def state
  @state
end

#update_time ⇒ String

Output only. Data transfer modification time. Ignored by server on input. Corresponds to the JSON property updateTime

Returns:

  • (String)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 654

def update_time
  @update_time
end

#user_id ⇒ Fixnum

Output only. Unique ID of the user on whose behalf transfer is done. Applicable only to data sources that do not support service accounts. When set to 0, the data source service account credentials are used. Corresponds to the JSON property userId

Returns:

  • (Fixnum)


# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 661

def user_id
  @user_id
end

Instance Method Details

#update!(**args) ⇒ Object

Update properties of this object



# File 'generated/google/apis/bigquerydatatransfer_v1/classes.rb', line 668

def update!(**args)
  @data_refresh_window_days = args[:data_refresh_window_days] if args.key?(:data_refresh_window_days)
  @data_source_id = args[:data_source_id] if args.key?(:data_source_id)
  @dataset_region = args[:dataset_region] if args.key?(:dataset_region)
  @destination_dataset_id = args[:destination_dataset_id] if args.key?(:destination_dataset_id)
  @disabled = args[:disabled] if args.key?(:disabled)
  @display_name = args[:display_name] if args.key?(:display_name)
  @name = args[:name] if args.key?(:name)
  @next_run_time = args[:next_run_time] if args.key?(:next_run_time)
  @params = args[:params] if args.key?(:params)
  @schedule = args[:schedule] if args.key?(:schedule)
  @state = args[:state] if args.key?(:state)
  @update_time = args[:update_time] if args.key?(:update_time)
  @user_id = args[:user_id] if args.key?(:user_id)
end
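Since #initialize simply delegates to #update!, the same keyword arguments can be applied to an existing instance. The sketch below, reusing the config object from the Overview example, toggles the disabled flag and changes the display name; only the supplied keys are touched because each assignment is guarded by args.key?.

# Only the supplied attributes change; everything else keeps its value.
config.update!(disabled: true, display_name: 'Nightly import (paused)')
config.disabled?  # => true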