Class: Google::Apis::DataprocV1::Batch
- Inherits: Object
  - Object
  - Google::Apis::DataprocV1::Batch
- Includes:
- Core::Hashable, Core::JsonObjectSupport
- Defined in:
- lib/google/apis/dataproc_v1/classes.rb,
lib/google/apis/dataproc_v1/representations.rb
Overview
A representation of a batch workload in the service.
Instance Attribute Summary collapse
-
#create_time ⇒ String
Output only.
-
#creator ⇒ String
Output only.
-
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
-
#labels ⇒ Hash<String,String>
Optional.
-
#name ⇒ String
Output only.
-
#operation ⇒ String
Output only.
-
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
-
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
-
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
-
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch workload.
-
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
-
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/) queries as a batch workload.
-
#state ⇒ String
Output only.
-
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only.
-
#state_message ⇒ String
Output only.
-
#state_time ⇒ String
Output only.
-
#uuid ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ Batch
constructor
A new instance of Batch.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ Batch
Returns a new instance of Batch.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 347

def initialize(**args)
  update!(**args)
end
Instance Attribute Details
#create_time ⇒ String
Output only. The time when the batch was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 255

def create_time
  @create_time
end
#creator ⇒ String
Output only. The email address of the user who created the batch.
Corresponds to the JSON property creator
# File 'lib/google/apis/dataproc_v1/classes.rb', line 260

def creator
  @creator
end
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
Corresponds to the JSON property environmentConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 265

def environment_config
  @environment_config
end
#labels ⇒ Hash<String,String>
Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). No more than 32 labels can be associated with a batch.
Corresponds to the JSON property labels
# File 'lib/google/apis/dataproc_v1/classes.rb', line 274

def labels
  @labels
end
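The label constraints above can be checked client-side before submitting a batch. The following is a minimal sketch, not part of the gem: the helper name and the RFC 1035-style regex are assumptions based on the constraints described in this attribute's documentation.

```ruby
# Hypothetical helper (not provided by google-apis-dataproc_v1): checks a
# labels hash against the documented constraints -- at most 32 labels, keys
# of 1 to 63 RFC 1035-style characters (lowercase letter first, then
# lowercase letters, digits, or hyphens, ending in a letter or digit), and
# values that are either empty or satisfy the same pattern.
RFC1035_LABEL = /\A[a-z]([-a-z0-9]{0,61}[a-z0-9])?\z/

def valid_batch_labels?(labels)
  return false if labels.size > 32
  labels.all? do |key, value|
    key.match?(RFC1035_LABEL) &&
      (value.empty? || value.match?(RFC1035_LABEL))
  end
end
```

For example, `valid_batch_labels?('env' => 'prod', 'team' => '')` passes, while a key containing an uppercase letter or underscore does not.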
#name ⇒ String
Output only. The resource name of the batch.
Corresponds to the JSON property name
# File 'lib/google/apis/dataproc_v1/classes.rb', line 279

def name
  @name
end
#operation ⇒ String
Output only. The resource name of the operation associated with this batch.
Corresponds to the JSON property operation
# File 'lib/google/apis/dataproc_v1/classes.rb', line 284

def operation
  @operation
end
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
Corresponds to the JSON property pysparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 290

def pyspark_batch
  @pyspark_batch
end
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
Corresponds to the JSON property runtimeConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 295

def runtime_config
  @runtime_config
end
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
Corresponds to the JSON property runtimeInfo
# File 'lib/google/apis/dataproc_v1/classes.rb', line 300

def runtime_info
  @runtime_info
end
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch
workload.
Corresponds to the JSON property sparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 306

def spark_batch
  @spark_batch
end
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
Corresponds to the JSON property sparkRBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 312

def spark_r_batch
  @spark_r_batch
end
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/)
queries as a batch workload.
Corresponds to the JSON property sparkSqlBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 318

def spark_sql_batch
  @spark_sql_batch
end
#state ⇒ String
Output only. The state of the batch.
Corresponds to the JSON property state
# File 'lib/google/apis/dataproc_v1/classes.rb', line 323

def state
  @state
end
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only. Historical state information for the batch.
Corresponds to the JSON property stateHistory
# File 'lib/google/apis/dataproc_v1/classes.rb', line 328

def state_history
  @state_history
end
#state_message ⇒ String
Output only. Batch state details, such as a failure description if the state
is FAILED.
Corresponds to the JSON property stateMessage
# File 'lib/google/apis/dataproc_v1/classes.rb', line 334

def state_message
  @state_message
end
#state_time ⇒ String
Output only. The time when the batch entered its current state.
Corresponds to the JSON property stateTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 339

def state_time
  @state_time
end
#uuid ⇒ String
Output only. A batch UUID (Unique Universal Identifier). The service generates
this value when it creates the batch.
Corresponds to the JSON property uuid
# File 'lib/google/apis/dataproc_v1/classes.rb', line 345

def uuid
  @uuid
end
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 352

def update!(**args)
  @create_time = args[:create_time] if args.key?(:create_time)
  @creator = args[:creator] if args.key?(:creator)
  @environment_config = args[:environment_config] if args.key?(:environment_config)
  @labels = args[:labels] if args.key?(:labels)
  @name = args[:name] if args.key?(:name)
  @operation = args[:operation] if args.key?(:operation)
  @pyspark_batch = args[:pyspark_batch] if args.key?(:pyspark_batch)
  @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  @runtime_info = args[:runtime_info] if args.key?(:runtime_info)
  @spark_batch = args[:spark_batch] if args.key?(:spark_batch)
  @spark_r_batch = args[:spark_r_batch] if args.key?(:spark_r_batch)
  @spark_sql_batch = args[:spark_sql_batch] if args.key?(:spark_sql_batch)
  @state = args[:state] if args.key?(:state)
  @state_history = args[:state_history] if args.key?(:state_history)
  @state_message = args[:state_message] if args.key?(:state_message)
  @state_time = args[:state_time] if args.key?(:state_time)
  @uuid = args[:uuid] if args.key?(:uuid)
end
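The initialize/update! pattern above can be illustrated with a minimal stand-alone class. This is a sketch, not the gem's actual Core::Hashable mixin, and the MiniBatch class and its attributes are hypothetical: keyword arguments are splatted into a hash, and only keys that were actually passed overwrite an attribute, so updating one field leaves the others untouched.

```ruby
# Minimal stand-in (not google-apis-dataproc_v1) showing the same pattern as
# Batch#initialize and Batch#update!: args.key?(:x) guards mean that an
# attribute is only overwritten when the caller explicitly passed it.
class MiniBatch
  attr_reader :name, :creator

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    @name = args[:name] if args.key?(:name)
    @creator = args[:creator] if args.key?(:creator)
  end
end

batch = MiniBatch.new(name: 'projects/p/locations/l/batches/b')
batch.update!(creator: 'user@example.com')  # name is left untouched
```

Because of the `args.key?` guards, `update!(creator: nil)` sets the attribute to nil, while simply omitting the keyword leaves the previous value in place.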