Class: Google::Apis::DiscoveryengineV1alpha::GoogleCloudDiscoveryengineV1alphaBigQuerySource

Inherits:
Object
Includes:
Core::Hashable, Core::JsonObjectSupport
Defined in:
lib/google/apis/discoveryengine_v1alpha/classes.rb,
lib/google/apis/discoveryengine_v1alpha/representations.rb

Overview

The BigQuery source to import data from.

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(**args) ⇒ GoogleCloudDiscoveryengineV1alphaBigQuerySource

Returns a new instance of GoogleCloudDiscoveryengineV1alphaBigQuerySource.



# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 123

def initialize(**args)
   update!(**args)
end
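
As a minimal, illustrative sketch (the project, dataset, table, and bucket names below are placeholders, not values taken from this library), the class is constructed with keyword arguments that mirror the attributes documented below:

require 'google/apis/discoveryengine_v1alpha'

# Placeholder identifiers; substitute your own project, dataset, and table.
bq_source = Google::Apis::DiscoveryengineV1alpha::GoogleCloudDiscoveryengineV1alphaBigQuerySource.new(
  project_id:      'my-project',             # optional; inherits the parent request's project if omitted
  dataset_id:      'my_dataset',             # required
  table_id:        'my_table',               # required
  data_schema:     'document',               # one JSON Document per line
  gcs_staging_dir: 'gs://my-bucket/staging'  # optional intermediate export location
)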

Instance Attribute Details

#data_schema ⇒ String

The schema to use when parsing the data from the source. Supported values for imports: * user_event (default): One JSON UserEvent per line. * document (default): One JSON Document per line. Each document must have a valid Document.id. Corresponds to the JSON property dataSchema

Returns:

  • (String)


# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 83

def data_schema
  @data_schema
end
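
As a small, hedged illustration (bq_source is the hypothetical instance from the constructor example above), the schema is a plain string attribute, so switching an import between user events and documents is a single assignment:

bq_source.data_schema = 'user_event'  # one JSON UserEvent per line
bq_source.data_schema = 'document'    # one JSON Document per line; each needs a valid Document.id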

#dataset_id ⇒ String

Required. The BigQuery data set to copy the data from, with a length limit of 1,024 characters. Corresponds to the JSON property datasetId

Returns:

  • (String)


# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 89

def dataset_id
  @dataset_id
end

#gcs_staging_dir ⇒ String

Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export to go to a specific Cloud Storage directory. Corresponds to the JSON property gcsStagingDir

Returns:

  • (String)


# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 96

def gcs_staging_dir
  @gcs_staging_dir
end

#partition_date ⇒ Google::Apis::DiscoveryengineV1alpha::GoogleTypeDate

Represents a whole or partial calendar date, such as a birthday. The time of day and time zone are either specified elsewhere or are insignificant. The date is relative to the Gregorian Calendar. This can represent one of the following: * A full date, with non-zero year, month, and day values. * A month and day, with a zero year (for example, an anniversary). * A year on its own, with a zero month and a zero day. * A year and month, with a zero day (for example, a credit card expiration date). Related types: * google.type.TimeOfDay * google.type.DateTime * google.protobuf.Timestamp Corresponds to the JSON property partitionDate

Returns:

  • (Google::Apis::DiscoveryengineV1alpha::GoogleTypeDate)

# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 108

def partition_date
  @partition_date
end
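
For a partitioned export, the date can be supplied as a GoogleTypeDate. A brief sketch, assuming that class accepts year, month, and day keyword arguments (the values are illustrative; bq_source is the hypothetical instance from the constructor example above):

# Illustrative date; GoogleTypeDate carries year, month, and day as integers.
partition = Google::Apis::DiscoveryengineV1alpha::GoogleTypeDate.new(year: 2024, month: 1, day: 15)
bq_source.partition_date = partition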

#project_id ⇒ String

The project ID (which can be the project number or the project ID) that the BigQuery source is in, with a length limit of 128 characters. If not specified, it inherits the project ID from the parent request. Corresponds to the JSON property projectId

Returns:

  • (String)


# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 115

def project_id
  @project_id
end

#table_id ⇒ String

Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters. Corresponds to the JSON property tableId

Returns:

  • (String)


# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 121

def table_id
  @table_id
end

Instance Method Details

#update!(**args) ⇒ Object

Updates the properties of this object from the given keyword arguments.



# File 'lib/google/apis/discoveryengine_v1alpha/classes.rb', line 128

def update!(**args)
  @data_schema = args[:data_schema] if args.key?(:data_schema)
  @dataset_id = args[:dataset_id] if args.key?(:dataset_id)
  @gcs_staging_dir = args[:gcs_staging_dir] if args.key?(:gcs_staging_dir)
  @partition_date = args[:partition_date] if args.key?(:partition_date)
  @project_id = args[:project_id] if args.key?(:project_id)
  @table_id = args[:table_id] if args.key?(:table_id)
end
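
Because update! only assigns keys that are present in args, it can be used to change a subset of fields after construction while leaving the rest untouched; a minimal sketch with illustrative values (bq_source is the hypothetical instance from the constructor example above):

# Only the named fields change; every other attribute keeps its current value.
bq_source.update!(table_id: 'my_other_table', data_schema: 'user_event')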