Class: Google::Apis::DataprocV1::Batch
- Inherits: Object
- Includes: Core::Hashable, Core::JsonObjectSupport
- Defined in:
  lib/google/apis/dataproc_v1/classes.rb,
  lib/google/apis/dataproc_v1/representations.rb
Overview
A representation of a batch workload in the service.
Instance Attribute Summary collapse
-
#create_time ⇒ String
Output only.
-
#creator ⇒ String
Output only.
-
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
-
#labels ⇒ Hash<String,String>
Optional.
-
#name ⇒ String
Output only.
-
#operation ⇒ String
Output only.
-
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
-
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
-
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
-
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch workload.
-
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
-
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/) queries as a batch workload.
-
#state ⇒ String
Output only.
-
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only.
-
#state_message ⇒ String
Output only.
-
#state_time ⇒ String
Output only.
-
#uuid ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ Batch
constructor
A new instance of Batch.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ Batch
Returns a new instance of Batch.
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1025
def initialize(**args)
  update!(**args)
end
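The constructor simply forwards its keyword arguments to update!, so a Batch can be populated at creation time. A minimal standalone sketch of this pattern (plain Ruby mirroring a few documented attribute names; it does not require the google-apis-dataproc_v1 gem):

```ruby
# Sketch of the construction pattern used by Batch: initialize delegates
# to update!, which assigns only the keys actually supplied. BatchSketch
# is an illustrative stand-in, not the gem's real class.
class BatchSketch
  attr_accessor :name, :labels, :state

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    @name = args[:name] if args.key?(:name)
    @labels = args[:labels] if args.key?(:labels)
    @state = args[:state] if args.key?(:state)
  end
end

batch = BatchSketch.new(
  name: 'projects/my-project/locations/us-central1/batches/my-batch',
  labels: { 'env' => 'dev' }
)
```

Attributes omitted from the keyword arguments (here, state) are simply left nil rather than being assigned explicitly.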
Instance Attribute Details
#create_time ⇒ String
Output only. The time when the batch was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 933
def create_time
  @create_time
end
#creator ⇒ String
Output only. The email address of the user who created the batch.
Corresponds to the JSON property creator
# File 'lib/google/apis/dataproc_v1/classes.rb', line 938
def creator
  @creator
end
#environment_config ⇒ Google::Apis::DataprocV1::EnvironmentConfig
Environment configuration for a workload.
Corresponds to the JSON property environmentConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 943
def environment_config
  @environment_config
end
#labels ⇒ Hash<String,String>
Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). No more than 32 labels can be associated with a batch.
Corresponds to the JSON property labels
# File 'lib/google/apis/dataproc_v1/classes.rb', line 952
def labels
  @labels
end
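The label constraints above can be checked client-side before submitting a batch. A hypothetical helper (not part of the gem) that encodes the documented rules, using an RFC 1035 label pattern (starts with a lowercase letter, contains only lowercase letters, digits, and hyphens, ends with a letter or digit, at most 63 characters):

```ruby
# Hypothetical validator for the documented Batch label constraints:
# keys must be 1-63 characters and RFC 1035 compliant; values may be
# empty, otherwise the same rules apply; at most 32 labels per batch.
RFC1035_LABEL = /\A[a-z]([-a-z0-9]{0,61}[a-z0-9])?\z/

def valid_batch_labels?(labels)
  return false if labels.size > 32

  labels.all? do |key, value|
    key.match?(RFC1035_LABEL) && (value.empty? || value.match?(RFC1035_LABEL))
  end
end

valid_batch_labels?('team' => 'data-eng', 'env' => '')  # valid: empty value allowed
valid_batch_labels?('9bad' => 'x')                      # invalid: key starts with a digit
```

The service performs its own validation on create; a check like this only catches obvious mistakes earlier.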
#name ⇒ String
Output only. The resource name of the batch.
Corresponds to the JSON property name
# File 'lib/google/apis/dataproc_v1/classes.rb', line 957
def name
  @name
end
#operation ⇒ String
Output only. The resource name of the operation associated with this batch.
Corresponds to the JSON property operation
# File 'lib/google/apis/dataproc_v1/classes.rb', line 962
def operation
  @operation
end
#pyspark_batch ⇒ Google::Apis::DataprocV1::PySparkBatch
A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.
Corresponds to the JSON property pysparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 968
def pyspark_batch
  @pyspark_batch
end
#runtime_config ⇒ Google::Apis::DataprocV1::RuntimeConfig
Runtime configuration for a workload.
Corresponds to the JSON property runtimeConfig
# File 'lib/google/apis/dataproc_v1/classes.rb', line 973
def runtime_config
  @runtime_config
end
#runtime_info ⇒ Google::Apis::DataprocV1::RuntimeInfo
Runtime information about workload execution.
Corresponds to the JSON property runtimeInfo
# File 'lib/google/apis/dataproc_v1/classes.rb', line 978
def runtime_info
  @runtime_info
end
#spark_batch ⇒ Google::Apis::DataprocV1::SparkBatch
A configuration for running an Apache Spark (https://spark.apache.org/) batch
workload.
Corresponds to the JSON property sparkBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 984
def spark_batch
  @spark_batch
end
#spark_r_batch ⇒ Google::Apis::DataprocV1::SparkRBatch
A configuration for running an Apache SparkR (https://spark.apache.org/docs/latest/sparkr.html) batch workload.
Corresponds to the JSON property sparkRBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 990
def spark_r_batch
  @spark_r_batch
end
#spark_sql_batch ⇒ Google::Apis::DataprocV1::SparkSqlBatch
A configuration for running Apache Spark SQL (https://spark.apache.org/sql/)
queries as a batch workload.
Corresponds to the JSON property sparkSqlBatch
# File 'lib/google/apis/dataproc_v1/classes.rb', line 996
def spark_sql_batch
  @spark_sql_batch
end
#state ⇒ String
Output only. The state of the batch.
Corresponds to the JSON property state
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1001
def state
  @state
end
#state_history ⇒ Array<Google::Apis::DataprocV1::StateHistory>
Output only. Historical state information for the batch.
Corresponds to the JSON property stateHistory
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1006
def state_history
  @state_history
end
#state_message ⇒ String
Output only. Batch state details, such as a failure description if the state
is FAILED.
Corresponds to the JSON property stateMessage
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1012
def state_message
  @state_message
end
#state_time ⇒ String
Output only. The time when the batch entered a current state.
Corresponds to the JSON property stateTime
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1017
def state_time
  @state_time
end
#uuid ⇒ String
Output only. A batch UUID (Universally Unique Identifier). The service generates
this value when it creates the batch.
Corresponds to the JSON property uuid
1023 1024 1025 |
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1023 def uuid @uuid end |
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object
# File 'lib/google/apis/dataproc_v1/classes.rb', line 1030
def update!(**args)
  @create_time = args[:create_time] if args.key?(:create_time)
  @creator = args[:creator] if args.key?(:creator)
  @environment_config = args[:environment_config] if args.key?(:environment_config)
  @labels = args[:labels] if args.key?(:labels)
  @name = args[:name] if args.key?(:name)
  @operation = args[:operation] if args.key?(:operation)
  @pyspark_batch = args[:pyspark_batch] if args.key?(:pyspark_batch)
  @runtime_config = args[:runtime_config] if args.key?(:runtime_config)
  @runtime_info = args[:runtime_info] if args.key?(:runtime_info)
  @spark_batch = args[:spark_batch] if args.key?(:spark_batch)
  @spark_r_batch = args[:spark_r_batch] if args.key?(:spark_r_batch)
  @spark_sql_batch = args[:spark_sql_batch] if args.key?(:spark_sql_batch)
  @state = args[:state] if args.key?(:state)
  @state_history = args[:state_history] if args.key?(:state_history)
  @state_message = args[:state_message] if args.key?(:state_message)
  @state_time = args[:state_time] if args.key?(:state_time)
  @uuid = args[:uuid] if args.key?(:uuid)
end
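Each assignment in update! is guarded by args.key? rather than a nil check, so passing an explicit nil clears a property while omitting the key leaves it untouched. A standalone sketch of that distinction (illustrative class, not the gem's real one):

```ruby
# Sketch of update!'s sparse-update semantics: only keys present in
# args are assigned, so nil is a meaningful value ("clear this") and
# absence means "leave as-is".
class SparseUpdate
  attr_accessor :state, :state_message

  def update!(**args)
    @state = args[:state] if args.key?(:state)
    @state_message = args[:state_message] if args.key?(:state_message)
  end
end

obj = SparseUpdate.new
obj.update!(state: 'RUNNING', state_message: 'starting up')
obj.update!(state: 'FAILED')        # state_message is left untouched
obj.update!(state_message: nil)     # explicit nil clears state_message
```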