Class: Google::Apis::DataprocV1::Batch

- Inherits: Object
- Includes: Core::Hashable, Core::JsonObjectSupport
- Defined in:
  lib/google/apis/dataproc_v1/classes.rb,
  lib/google/apis/dataproc_v1/representations.rb
Overview
A representation of a batch workload in the service.
Instance Attribute Summary collapse
-
#create_time ⇒ String
Output only.
-
#creator ⇒ String
Output only.
-
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
-
#labels ⇒ Hash<String,String>
Optional.
-
#name ⇒ String
Output only.
-
#operation ⇒ String
Output only.
-
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
-
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
-
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
-
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch workload.
-
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
-
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/) queries as a batch workload.
-
#state ⇒ String
Output only.
-
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only.
-
#state_message ⇒ String
Output only.
-
#state_time ⇒ String
Output only.
-
#uuid ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ Batch
constructor
A new instance of Batch.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ Batch
Returns a new instance of Batch.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 401

def initialize(**args)
  update!(**args)
end
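The constructor simply forwards its keyword arguments to #update!, so any writable attribute can be set at construction time. The stand-in class below (a sketch, not the real gem class) reproduces that pattern with two representative attributes:

```ruby
# Standalone sketch of the Batch construction pattern: #initialize
# forwards all keyword arguments to #update!. BatchSketch is a
# hypothetical stand-in, not Google::Apis::DataprocV1::Batch itself.
class BatchSketch
  attr_accessor :labels, :runtime_config

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    # Each attribute is assigned only when its key is present in args.
    @labels = args[:labels] if args.key?(:labels)
    @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  end
end

batch = BatchSketch.new(labels: { 'env' => 'dev' })
batch.labels          # => {"env"=>"dev"}
batch.runtime_config  # => nil
```

The same keyword-argument style applies to the generated class: attributes not passed to the constructor are simply left unset.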
Instance Attribute Details
#create_time ⇒ String
Output only. The time when the batch was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 309

def create_time
  @create_time
end
#creator ⇒ String
Output only. The email address of the user who created the batch.
Corresponds to the JSON property creator
# File 'lib/google/apis/dataproc_v1/classes.rb', line 314

def creator
  @creator
end
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
Corresponds to the JSON property environmentConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 319

def environment_config
  @environment_config
end
#labels ⇒ Hash<String,String>
Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). No more than 32 labels can be associated with a batch.
Corresponds to the JSON property labels
# File 'lib/google/apis/dataproc_v1/classes.rb', line 328

def labels
  @labels
end
#name ⇒ String
Output only. The resource name of the batch.
Corresponds to the JSON property name
# File 'lib/google/apis/dataproc_v1/classes.rb', line 333

def name
  @name
end
#operation ⇒ String
Output only. The resource name of the operation associated with this batch.
Corresponds to the JSON property operation
# File 'lib/google/apis/dataproc_v1/classes.rb', line 338

def operation
  @operation
end
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
Corresponds to the JSON property pysparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 344

def pyspark_batch
  @pyspark_batch
end
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
Corresponds to the JSON property runtimeConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 349

def runtime_config
  @runtime_config
end
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
Corresponds to the JSON property runtimeInfo
# File 'lib/google/apis/dataproc_v1/classes.rb', line 354

def runtime_info
  @runtime_info
end
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch
workload.
Corresponds to the JSON property sparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 360

def spark_batch
  @spark_batch
end
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
Corresponds to the JSON property sparkRBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 366

def spark_r_batch
  @spark_r_batch
end
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/)
queries as a batch workload.
Corresponds to the JSON property sparkSqlBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 372

def spark_sql_batch
  @spark_sql_batch
end
#state ⇒ String
Output only. The state of the batch.
Corresponds to the JSON property state
# File 'lib/google/apis/dataproc_v1/classes.rb', line 377

def state
  @state
end
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only. Historical state information for the batch.
Corresponds to the JSON property stateHistory
# File 'lib/google/apis/dataproc_v1/classes.rb', line 382

def state_history
  @state_history
end
#state_message ⇒ String
Output only. Batch state details, such as a failure description if the state
is FAILED.
Corresponds to the JSON property stateMessage
# File 'lib/google/apis/dataproc_v1/classes.rb', line 388

def state_message
  @state_message
end
#state_time ⇒ String
Output only. The time when the batch entered a current state.
Corresponds to the JSON property stateTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 393

def state_time
  @state_time
end
#uuid ⇒ String
Output only. A batch UUID (Unique Universal Identifier). The service generates
this value when it creates the batch.
Corresponds to the JSON property uuid
# File 'lib/google/apis/dataproc_v1/classes.rb', line 399

def uuid
  @uuid
end
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 406

def update!(**args)
  @create_time = args[:create_time] if args.key?(:create_time)
  @creator = args[:creator] if args.key?(:creator)
  @environment_config = args[:environment_config] if args.key?(:environment_config)
  @labels = args[:labels] if args.key?(:labels)
  @name = args[:name] if args.key?(:name)
  @operation = args[:operation] if args.key?(:operation)
  @pyspark_batch = args[:pyspark_batch] if args.key?(:pyspark_batch)
  @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  @runtime_info = args[:runtime_info] if args.key?(:runtime_info)
  @spark_batch = args[:spark_batch] if args.key?(:spark_batch)
  @spark_r_batch = args[:spark_r_batch] if args.key?(:spark_r_batch)
  @spark_sql_batch = args[:spark_sql_batch] if args.key?(:spark_sql_batch)
  @state = args[:state] if args.key?(:state)
  @state_history = args[:state_history] if args.key?(:state_history)
  @state_message = args[:state_message] if args.key?(:state_message)
  @state_time = args[:state_time] if args.key?(:state_time)
  @uuid = args[:uuid] if args.key?(:uuid)
end
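Note that each assignment in #update! is guarded by args.key?, so only attributes whose keys are actually present in args are overwritten; a partial update leaves the rest untouched (and nil remains a settable value, distinct from an absent key). The stand-in class below (a sketch, not the real gem class) demonstrates that semantics:

```ruby
# Standalone sketch of #update!'s key-presence semantics: an attribute
# changes only when its key appears in args. PartialUpdateSketch is a
# hypothetical stand-in, not Google::Apis::DataprocV1::Batch itself.
class PartialUpdateSketch
  attr_reader :state, :state_message

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    @state = args[:state] if args.key?(:state)
    @state_message = args[:state_message] if args.key?(:state_message)
  end
end

b = PartialUpdateSketch.new(state: 'PENDING', state_message: 'queued')
b.update!(state: 'RUNNING')  # :state_message is absent, so it is preserved
b.state          # => "RUNNING"
b.state_message  # => "queued"
```

This is why the generated classes use args.key? rather than a truthiness check: passing state_message: nil explicitly clears the attribute, while omitting the key leaves it alone.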