Class: Google::Apis::MlV1::GoogleCloudMlV1Version

Inherits:
Object
Includes:
Core::Hashable, Core::JsonObjectSupport
Defined in:
generated/google/apis/ml_v1/classes.rb,
generated/google/apis/ml_v1/representations.rb

Overview

Represents a version of the model. Each version is a trained model deployed in the cloud, ready to handle prediction requests. A model can have multiple versions. You can get information about all of the versions of a given model by calling projects.models.versions.list.
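
For example, the versions of a model can be enumerated through the generated service client. The following is a minimal sketch, assuming the CloudMachineLearningEngineService class, its list_project_model_versions method, and placeholder project and model names:

require 'google/apis/ml_v1'
require 'googleauth'

ml = Google::Apis::MlV1::CloudMachineLearningEngineService.new
ml.authorization = Google::Auth.get_application_default(
  ['https://www.googleapis.com/auth/cloud-platform']
)

# 'my-project' and 'my-model' are placeholders.
parent = 'projects/my-project/models/my-model'
response = ml.list_project_model_versions(parent)
(response.versions || []).each do |version|
  puts "#{version.name} (state: #{version.state}, default: #{version.is_default?})"
end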

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(**args) ⇒ GoogleCloudMlV1Version

Returns a new instance of GoogleCloudMlV1Version.



# File 'generated/google/apis/ml_v1/classes.rb', line 3219

def initialize(**args)
  update!(**args)
end
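
Because the class is hash-like, its attributes can be supplied as keyword arguments matching the attribute names below. A minimal sketch with placeholder values:

version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1',
  description: 'First deployed version',        # placeholder description
  deployment_uri: 'gs://example-bucket/model/', # placeholder bucket
  runtime_version: '2.1',
  python_version: '3.7',
  framework: 'TENSORFLOW',
  machine_type: 'n1-standard-4'
)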

Instance Attribute Details

#accelerator_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1AcceleratorConfig

Represents a hardware accelerator request config. Note that the AcceleratorConfig can be used in both Jobs and Versions. Learn more about accelerators for training and accelerators for online prediction. Corresponds to the JSON property acceleratorConfig



# File 'generated/google/apis/ml_v1/classes.rb', line 3000

def accelerator_config
  @accelerator_config
end

#auto_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1AutoScaling

Options for automatically scaling a model. Corresponds to the JSON property autoScaling



# File 'generated/google/apis/ml_v1/classes.rb', line 3005

def auto_scaling
  @auto_scaling
end

#container ⇒ Google::Apis::MlV1::GoogleCloudMlV1ContainerSpec

Specification of a custom container for serving predictions. This message is a subset of the Kubernetes Container v1 core specification. Corresponds to the JSON property container



# File 'generated/google/apis/ml_v1/classes.rb', line 3012

def container
  @container
end

#create_time ⇒ String

Output only. The time the version was created. Corresponds to the JSON property createTime

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3017

def create_time
  @create_time
end

#deployment_uri ⇒ String

The Cloud Storage URI of a directory containing trained model artifacts to be used to create the model version. See the guide to deploying models for more information. The total number of files under this directory must not exceed 1000. During projects.models.versions.create, AI Platform Prediction copies all files from the specified directory to a location managed by the service. From then on, AI Platform Prediction uses these copies of the model artifacts to serve predictions, not the original files in Cloud Storage, so this location is useful only as a historical record. If you specify container, then this field is optional. Otherwise, it is required. Learn how to use this field with a custom container. Corresponds to the JSON property deploymentUri

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3033

def deployment_uri
  @deployment_uri
end
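
A version with a deployment_uri is typically passed to the create call on its parent model. A minimal sketch, reusing the ml service object from the overview example and assuming the generated create_project_model_version method:

# 'my-project', 'my-model', and the bucket path are placeholders.
parent = 'projects/my-project/models/my-model'
version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1',
  deployment_uri: 'gs://example-bucket/model/',
  runtime_version: '2.1',
  python_version: '3.7'
)
# Returns a long-running operation; wait for it to finish before sending predictions.
operation = ml.create_project_model_version(parent, version)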

#description ⇒ String

Optional. The description specified for the version when it was created. Corresponds to the JSON property description

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3038

def description
  @description
end

#error_message ⇒ String

Output only. The details of a failure or a cancellation. Corresponds to the JSON property errorMessage

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3043

def error_message
  @error_message
end

#etag ⇒ String

etag is used for optimistic concurrency control as a way to help prevent simultaneous updates of a model from overwriting each other. It is strongly suggested that systems make use of the etag in the read-modify-write cycle to perform model updates in order to avoid race conditions: An etag is returned in the response to GetVersion, and systems are expected to put that etag in the request to UpdateVersion to ensure that their change will be applied to the model as intended. Corresponds to the JSON property etag NOTE: Values are automatically base64 encoded/decoded in the client library.

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3055

def etag
  @etag
end
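
As a sketch of that read-modify-write cycle, assuming the generated get_project_model_version and patch_project_model_version methods and a placeholder version name:

name = 'projects/my-project/models/my-model/versions/v1'  # placeholder
current = ml.get_project_model_version(name)

update = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  description: 'Updated description',
  etag: current.etag  # echo the etag so a conflicting concurrent update is rejected
)
ml.patch_project_model_version(name, update, update_mask: 'description')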

#explanation_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1ExplanationConfig

Message holding configuration options for explaining model predictions. There are three feature attribution methods supported for TensorFlow models: integrated gradients, sampled Shapley, and XRAI. Learn more about feature attributions. Corresponds to the JSON property explanationConfig



# File 'generated/google/apis/ml_v1/classes.rb', line 3063

def explanation_config
  @explanation_config
end

#framework ⇒ String

Optional. The machine learning framework AI Platform uses to train this version of the model. Valid values are TENSORFLOW, SCIKIT_LEARN, XGBOOST. If you do not specify a framework, AI Platform will analyze files in the deployment_uri to determine a framework. If you choose SCIKIT_LEARN or XGBOOST, you must also set the runtime version of the model to 1.4 or greater. Do not specify a framework if you're deploying a custom prediction routine or if you're using a custom container. Corresponds to the JSON property framework

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3075

def framework
  @framework
end

#is_default ⇒ Boolean Also known as: is_default?

Output only. If true, this version will be used to handle prediction requests that do not specify a version. You can change the default version by calling projects.models.versions.setDefault. Corresponds to the JSON property isDefault

Returns:

  • (Boolean)


# File 'generated/google/apis/ml_v1/classes.rb', line 3082

def is_default
  @is_default
end

#labels ⇒ Hash<String,String>

Optional. One or more labels that you can add to organize your model versions. Each label is a key-value pair, where both the key and the value are arbitrary strings that you supply. For more information, see the documentation on using labels. Corresponds to the JSON property labels

Returns:

  • (Hash<String,String>)


# File 'generated/google/apis/ml_v1/classes.rb', line 3091

def labels
  @labels
end

#last_use_time ⇒ String

Output only. The time the version was last used for prediction. Corresponds to the JSON property lastUseTime

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3096

def last_use_time
  @last_use_time
end

#machine_type ⇒ String

Optional. The type of machine on which to serve the model. Currently only applies to online prediction service. If this field is not specified, it defaults to mls1-c1-m2. Online prediction supports the following machine types: * mls1-c1-m2 * mls1-c4-m2 * n1-standard-2 * n1-standard-4 * n1-standard-8 * n1-standard-16 * n1-standard-32 * n1-highmem-2 * n1-highmem-4 * n1-highmem-8 * n1-highmem-16 * n1-highmem-32 * n1-highcpu-2 * n1-highcpu-4 * n1-highcpu-8 * n1-highcpu-16 * n1-highcpu-32. mls1-c4-m2 is in beta. All other machine types are generally available. Learn more about the differences between machine types. Corresponds to the JSON property machineType

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3110

def machine_type
  @machine_type
end

#manual_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1ManualScaling

Options for manually scaling a model. Corresponds to the JSON property manualScaling



# File 'generated/google/apis/ml_v1/classes.rb', line 3115

def manual_scaling
  @manual_scaling
end
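
A version scales with either auto_scaling or manual_scaling, not both. A minimal sketch, assuming the min_nodes field on GoogleCloudMlV1AutoScaling and the nodes field on GoogleCloudMlV1ManualScaling:

# Autoscaled, but keep at least one node warm at all times.
autoscaled = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1-auto',
  auto_scaling: Google::Apis::MlV1::GoogleCloudMlV1AutoScaling.new(min_nodes: 1)
)

# Pinned to a fixed number of serving nodes instead.
manual = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1-manual',
  manual_scaling: Google::Apis::MlV1::GoogleCloudMlV1ManualScaling.new(nodes: 2)
)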

#name ⇒ String

Required. The name specified for the version when it was created. The version name must be unique within the model it is created in. Corresponds to the JSON property name

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3121

def name
  @name
end

#package_uris ⇒ Array<String>

Optional. Cloud Storage paths (gs://…) of packages for custom prediction routines or scikit-learn pipelines with custom code. For a custom prediction routine, one of these packages must contain your Predictor class (see predictionClass). Additionally, include any dependencies that your Predictor or scikit-learn pipeline uses and that are not already included in your selected runtime version. If you specify this field, you must also set runtimeVersion to 1.4 or greater. Corresponds to the JSON property packageUris

Returns:

  • (Array<String>)


# File 'generated/google/apis/ml_v1/classes.rb', line 3135

def package_uris
  @package_uris
end

#prediction_class ⇒ String

Optional. The fully qualified name (module_name.class_name) of a class that implements the Predictor interface described in this reference field. The module containing this class should be included in a package provided to the packageUris field. Specify this field if and only if you are deploying a custom prediction routine (beta). If you specify this field, you must set runtimeVersion to 1.4 or greater and you must set machineType to a legacy (MLS1) machine type. The following code sample provides the Predictor interface:

class Predictor(object):
  """Interface for constructing custom predictors."""

  def predict(self, instances, **kwargs):
    """Performs custom prediction.

    Instances are the decoded values from the request. They have already
    been deserialized from JSON.

    Args:
        instances: A list of prediction input instances.
        **kwargs: A dictionary of keyword args provided as additional fields
            on the predict request body.

    Returns:
        A list of outputs containing the prediction results. This list must
        be JSON serializable.
    """
    raise NotImplementedError()

  @classmethod
  def from_path(cls, model_dir):
    """Creates an instance of Predictor using the given path.

    Loading of the predictor should be done in this method.

    Args:
        model_dir: The local directory that contains the exported model file
            along with any additional files uploaded when creating the
            version resource.

    Returns:
        An instance implementing this Predictor class.
    """
    raise NotImplementedError()

Learn more about the Predictor interface and custom prediction routines. Corresponds to the JSON property predictionClass

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3163

def prediction_class
  @prediction_class
end
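
Tying prediction_class to package_uris, a version for a custom prediction routine might be configured as follows (the bucket, package, and class names are placeholders):

version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1-custom',
  deployment_uri: 'gs://example-bucket/model/',
  package_uris: ['gs://example-bucket/packages/my_predictor-0.1.tar.gz'],
  prediction_class: 'predictor.MyPredictor',  # module_name.class_name
  runtime_version: '1.15',
  python_version: '3.7',
  machine_type: 'mls1-c1-m2'  # custom prediction routines require a legacy (MLS1) machine type
)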

#python_version ⇒ String

Required. The version of Python used in prediction. The following Python versions are available: * Python '3.7' is available when runtime_version is set to '1.15' or later. * Python '3.5' is available when runtime_version is set to a version from '1.4' to '1.14'. * Python '2.7' is available when runtime_version is set to '1.15' or earlier. Read more about the Python versions available for each runtime version. Corresponds to the JSON property pythonVersion

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3174

def python_version
  @python_version
end

#request_logging_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1RequestLoggingConfig

Configuration for logging request-response pairs to a BigQuery table. Online prediction requests to a model version and the responses to these requests are converted to raw strings and saved to the specified BigQuery table. Logging is constrained by BigQuery quotas and limits. If your project exceeds BigQuery quotas or limits, AI Platform Prediction does not log request-response pairs, but it continues to serve predictions. If you are using continuous evaluation, you do not need to specify this configuration manually. Setting up continuous evaluation automatically enables logging of request-response pairs. Corresponds to the JSON property requestLoggingConfig



# File 'generated/google/apis/ml_v1/classes.rb', line 3187

def request_logging_config
  @request_logging_config
end
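
A minimal sketch of enabling request-response logging, assuming the bigquery_table_name and sampling_percentage fields on GoogleCloudMlV1RequestLoggingConfig (the table name is a placeholder):

version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1-logged',
  request_logging_config: Google::Apis::MlV1::GoogleCloudMlV1RequestLoggingConfig.new(
    bigquery_table_name: 'my_project.ml_logs.prediction_requests',
    sampling_percentage: 0.1  # log roughly 10% of requests
  )
)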

#routes ⇒ Google::Apis::MlV1::GoogleCloudMlV1RouteMap

Specifies HTTP paths served by a custom container. AI Platform Prediction sends requests to these paths on the container; the custom container must run an HTTP server that responds to these requests with appropriate responses. Read Custom container requirements for details on how to create your container image to meet these requirements. Corresponds to the JSON property routes



# File 'generated/google/apis/ml_v1/classes.rb', line 3197

def routes
  @routes
end

#runtime_version ⇒ String

Required. The AI Platform runtime version to use for this deployment. For more information, see the runtime version list and how to manage runtime versions. Corresponds to the JSON property runtimeVersion

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3204

def runtime_version
  @runtime_version
end

#service_account ⇒ String

Optional. Specifies the service account for resource access control. If you specify this field, then you must also specify either the containerSpec or the predictionClass field. Learn more about using a custom service account. Corresponds to the JSON property serviceAccount

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3212

def service_account
  @service_account
end

#state ⇒ String

Output only. The state of a version. Corresponds to the JSON property state

Returns:

  • (String)


# File 'generated/google/apis/ml_v1/classes.rb', line 3217

def state
  @state
end

Instance Method Details

#update!(**args) ⇒ Object

Update properties of this object



# File 'generated/google/apis/ml_v1/classes.rb', line 3224

def update!(**args)
  @accelerator_config = args[:accelerator_config] if args.key?(:accelerator_config)
  @auto_scaling = args[:auto_scaling] if args.key?(:auto_scaling)
  @container = args[:container] if args.key?(:container)
  @create_time = args[:create_time] if args.key?(:create_time)
  @deployment_uri = args[:deployment_uri] if args.key?(:deployment_uri)
  @description = args[:description] if args.key?(:description)
  @error_message = args[:error_message] if args.key?(:error_message)
  @etag = args[:etag] if args.key?(:etag)
  @explanation_config = args[:explanation_config] if args.key?(:explanation_config)
  @framework = args[:framework] if args.key?(:framework)
  @is_default = args[:is_default] if args.key?(:is_default)
  @labels = args[:labels] if args.key?(:labels)
  @last_use_time = args[:last_use_time] if args.key?(:last_use_time)
  @machine_type = args[:machine_type] if args.key?(:machine_type)
  @manual_scaling = args[:manual_scaling] if args.key?(:manual_scaling)
  @name = args[:name] if args.key?(:name)
  @package_uris = args[:package_uris] if args.key?(:package_uris)
  @prediction_class = args[:prediction_class] if args.key?(:prediction_class)
  @python_version = args[:python_version] if args.key?(:python_version)
  @request_logging_config = args[:request_logging_config] if args.key?(:request_logging_config)
  @routes = args[:routes] if args.key?(:routes)
  @runtime_version = args[:runtime_version] if args.key?(:runtime_version)
  @service_account = args[:service_account] if args.key?(:service_account)
  @state = args[:state] if args.key?(:state)
end
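
Because update! only copies the keys that are present in the call, it can be used to change a few attributes of an existing instance without touching the rest:

version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(name: 'v1')
version.update!(description: 'Re-described', labels: { 'stage' => 'test' })
# Attributes not named in the call (here, name) are left unchanged.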