Class: Google::Apis::MlV1::GoogleCloudMlV1Version
- Inherits: Object
- Includes: Core::Hashable, Core::JsonObjectSupport
- Defined in:
  lib/google/apis/ml_v1/classes.rb,
  lib/google/apis/ml_v1/representations.rb
Overview
Represents a version of the model. Each version is a trained model deployed in the cloud, ready to handle prediction requests. A model can have multiple versions. You can get information about all of the versions of a given model by calling projects.models.versions.list.
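For orientation, here is a minimal sketch of fetching these version resources through the generated Ruby client. The CloudMachineLearningEngineService class is part of this gem, but the list_project_model_versions method name is inferred from the client's generated naming conventions and should be checked against your installed version; the project and model names are placeholders.

require 'googleauth'
require 'google/apis/ml_v1'

# Sketch only: assumes application default credentials are available.
ml = Google::Apis::MlV1::CloudMachineLearningEngineService.new
ml.authorization = Google::Auth.get_application_default(
  ['https://www.googleapis.com/auth/cloud-platform']
)

# projects.models.versions.list returns GoogleCloudMlV1Version resources.
response = ml.list_project_model_versions('projects/my-project/models/my_model')
response.versions&.each do |version|
  puts "#{version.name} (state: #{version.state}, default: #{version.is_default?})"
end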
Instance Attribute Summary collapse
-
#accelerator_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1AcceleratorConfig
Represents a hardware accelerator request config.
-
#auto_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1AutoScaling
Options for automatically scaling a model.
-
#container ⇒ Google::Apis::MlV1::GoogleCloudMlV1ContainerSpec
Specification of a custom container for serving predictions.
-
#create_time ⇒ String
Output only.
-
#deployment_uri ⇒ String
The Cloud Storage URI of a directory containing trained model artifacts to be used to create the model version.
-
#description ⇒ String
Optional.
-
#error_message ⇒ String
Output only.
-
#etag ⇒ String
etag is used for optimistic concurrency control as a way to help prevent simultaneous updates of a model from overwriting each other.
-
#explanation_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1ExplanationConfig
Message holding configuration options for explaining model predictions.
-
#framework ⇒ String
Optional.
-
#is_default ⇒ Boolean
(also: #is_default?)
Output only.
-
#labels ⇒ Hash<String,String>
Optional.
-
#last_migration_model_id ⇒ String
Output only.
-
#last_migration_time ⇒ String
Output only.
-
#last_use_time ⇒ String
Output only.
-
#machine_type ⇒ String
Optional.
-
#manual_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1ManualScaling
Options for manually scaling a model.
-
#name ⇒ String
Required.
-
#package_uris ⇒ Array<String>
Optional.
-
#prediction_class ⇒ String
Optional.
-
#python_version ⇒ String
Required.
-
#request_logging_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1RequestLoggingConfig
Configuration for logging request-response pairs to a BigQuery table.
-
#routes ⇒ Google::Apis::MlV1::GoogleCloudMlV1RouteMap
Specifies HTTP paths served by a custom container.
-
#runtime_version ⇒ String
Required.
-
#service_account ⇒ String
Optional.
-
#state ⇒ String
Output only.
Instance Method Summary collapse
-
#initialize(**args) ⇒ GoogleCloudMlV1Version
constructor
A new instance of GoogleCloudMlV1Version.
-
#update!(**args) ⇒ Object
Update properties of this object.
Constructor Details
#initialize(**args) ⇒ GoogleCloudMlV1Version
Returns a new instance of GoogleCloudMlV1Version.
# File 'lib/google/apis/ml_v1/classes.rb', line 3274

def initialize(**args)
  update!(**args)
end
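As an illustrative sketch, a version resource can be built by passing attribute keyword arguments to the constructor. Every name and value below is a placeholder, and the runtime/Python pairing simply follows the constraints documented under #python_version.

# Sketch: construct (but not yet deploy) a version resource.
version = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  name: 'v1',
  deployment_uri: 'gs://my-bucket/model-artifacts/',
  runtime_version: '1.15',
  python_version: '3.7',
  machine_type: 'n1-standard-2',
  labels: { 'stage' => 'test' }
)

Passing this object to the service's versions create call (for example create_project_model_version, if that is the generated name in your client version) would start the deployment described in the overview.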
Instance Attribute Details
#accelerator_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1AcceleratorConfig
Represents a hardware accelerator request config. Note that the
AcceleratorConfig can be used in both Jobs and Versions. Learn more about
accelerators for training and accelerators for
online prediction.
Corresponds to the JSON property acceleratorConfig
# File 'lib/google/apis/ml_v1/classes.rb', line 3042

def accelerator_config
  @accelerator_config
end
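Continuing the construction sketch under Constructor Details, a hedged example of requesting accelerators; the count and type attributes match the generated GoogleCloudMlV1AcceleratorConfig class, and the accelerator type string is a placeholder.

# Sketch: request one GPU per serving node ('version' comes from the
# constructor sketch above).
version.accelerator_config = Google::Apis::MlV1::GoogleCloudMlV1AcceleratorConfig.new(
  count: 1,
  type: 'NVIDIA_TESLA_T4'
)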
#auto_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1AutoScaling
Options for automatically scaling a model.
Corresponds to the JSON property autoScaling
# File 'lib/google/apis/ml_v1/classes.rb', line 3047

def auto_scaling
  @auto_scaling
end
#container ⇒ Google::Apis::MlV1::GoogleCloudMlV1ContainerSpec
Specification of a custom container for serving predictions. This message is a
subset of the Kubernetes Container v1 core specification.
Corresponds to the JSON property container
# File 'lib/google/apis/ml_v1/classes.rb', line 3054

def container
  @container
end
#create_time ⇒ String
Output only. The time the version was created.
Corresponds to the JSON property createTime
# File 'lib/google/apis/ml_v1/classes.rb', line 3059

def create_time
  @create_time
end
#deployment_uri ⇒ String
The Cloud Storage URI of a directory containing trained model artifacts to be
used to create the model version. See the guide to deploying models for more
information. The total number of files under this directory must not exceed
1000. During projects.models.versions.create, AI Platform Prediction copies
all files from the specified directory to a location managed by the service.
From then on, AI Platform Prediction uses these copies of the model artifacts
to serve predictions, not the original files in Cloud Storage, so this
location is useful only as a historical record. If you specify container,
then this field is optional. Otherwise, it is required. Learn how to use this
field with a custom container.
Corresponds to the JSON property deploymentUri
# File 'lib/google/apis/ml_v1/classes.rb', line 3075

def deployment_uri
  @deployment_uri
end
#description ⇒ String
Optional. The description specified for the version when it was created.
Corresponds to the JSON property description
# File 'lib/google/apis/ml_v1/classes.rb', line 3080

def description
  @description
end
#error_message ⇒ String
Output only. The details of a failure or a cancellation.
Corresponds to the JSON property errorMessage
# File 'lib/google/apis/ml_v1/classes.rb', line 3085

def error_message
  @error_message
end
#etag ⇒ String
etag is used for optimistic concurrency control as a way to help prevent
simultaneous updates of a model from overwriting each other. It is strongly
suggested that systems make use of the etag in the read-modify-write cycle to
perform model updates in order to avoid race conditions: An etag is returned
in the response to GetVersion, and systems are expected to put that etag in
the request to UpdateVersion to ensure that their change will be applied to
the model as intended.
Corresponds to the JSON property etag
NOTE: Values are automatically base64 encoded/decoded in the client library.
# File 'lib/google/apis/ml_v1/classes.rb', line 3097

def etag
  @etag
end
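A hedged sketch of the read-modify-write cycle recommended above. The get/patch method names follow the generated client's conventions but are assumptions to verify against your client version; 'ml' is an authorized CloudMachineLearningEngineService as in the overview sketch, and the resource name is a placeholder.

# Read-modify-write sketch: carry the current etag into the update so a
# concurrent change makes the patch fail instead of being silently overwritten.
name = 'projects/my-project/models/my_model/versions/v1'
current = ml.get_project_model_version(name)

update = Google::Apis::MlV1::GoogleCloudMlV1Version.new(
  description: 'updated description',
  etag: current.etag
)
ml.patch_project_model_version(name, update, update_mask: 'description')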
#explanation_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1ExplanationConfig
Message holding configuration options for explaining model predictions. There
are three feature attribution methods supported for TensorFlow models:
integrated gradients, sampled Shapley, and XRAI. Learn more about feature
attributions.
Corresponds to the JSON property explanationConfig
# File 'lib/google/apis/ml_v1/classes.rb', line 3105

def explanation_config
  @explanation_config
end
#framework ⇒ String
Optional. The machine learning framework AI Platform uses to train this
version of the model. Valid values are TENSORFLOW, SCIKIT_LEARN, and XGBOOST.
If you do not specify a framework, AI Platform will analyze files in the
deployment_uri to determine a framework. If you choose SCIKIT_LEARN or
XGBOOST, you must also set the runtime version of the model to 1.4 or greater.
Do not specify a framework if you're deploying a custom prediction routine or
if you're using a custom container.
Corresponds to the JSON property framework
# File 'lib/google/apis/ml_v1/classes.rb', line 3117

def framework
  @framework
end
#is_default ⇒ Boolean Also known as: is_default?
Output only. If true, this version will be used to handle prediction requests
that do not specify a version. You can change the default version by calling
projects.models.versions.setDefault.
Corresponds to the JSON property isDefault
# File 'lib/google/apis/ml_v1/classes.rb', line 3124

def is_default
  @is_default
end
#labels ⇒ Hash<String,String>
Optional. One or more labels that you can add to organize your model versions.
Each label is a key-value pair, where both the key and the value are
arbitrary strings that you supply. For more information, see the documentation
on using labels. Note that this field is not updatable for mls1* models.
Corresponds to the JSON property labels
# File 'lib/google/apis/ml_v1/classes.rb', line 3133

def labels
  @labels
end
#last_migration_model_id ⇒ String
Output only. The AI Platform (Unified) Model ID for the last model migration.
Corresponds to the JSON property lastMigrationModelId
# File 'lib/google/apis/ml_v1/classes.rb', line 3141

def last_migration_model_id
  @last_migration_model_id
end
#last_migration_time ⇒ String
Output only. The last time this version was successfully migrated to AI
Platform (Unified).
Corresponds to the JSON property lastMigrationTime
# File 'lib/google/apis/ml_v1/classes.rb', line 3148

def last_migration_time
  @last_migration_time
end
#last_use_time ⇒ String
Output only. The time the version was last used for prediction.
Corresponds to the JSON property lastUseTime
# File 'lib/google/apis/ml_v1/classes.rb', line 3153

def last_use_time
  @last_use_time
end
#machine_type ⇒ String
Optional. The type of machine on which to serve the model. Currently only
applies to online prediction service. To learn about valid values for this
field, read Choosing a machine type for online prediction. If this field is
not specified and you are using a regional endpoint, then the machine type
defaults to n1-standard-2. If this field is not specified and you are using
the global endpoint (ml.googleapis.com), then the machine type defaults to
mls1-c1-m2.
Corresponds to the JSON property machineType
# File 'lib/google/apis/ml_v1/classes.rb', line 3165

def machine_type
  @machine_type
end
#manual_scaling ⇒ Google::Apis::MlV1::GoogleCloudMlV1ManualScaling
Options for manually scaling a model.
Corresponds to the JSON property manualScaling
# File 'lib/google/apis/ml_v1/classes.rb', line 3170

def manual_scaling
  @manual_scaling
end
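To make the two scaling options concrete, a hedged sketch: a version uses either auto_scaling or manual_scaling, not both. The min_nodes and nodes attributes reflect the generated GoogleCloudMlV1AutoScaling and GoogleCloudMlV1ManualScaling classes, and the node counts are placeholders.

# Either let the service scale serving nodes automatically...
version.auto_scaling = Google::Apis::MlV1::GoogleCloudMlV1AutoScaling.new(min_nodes: 1)

# ...or pin a fixed number of nodes instead (not both on the same version).
version.manual_scaling = Google::Apis::MlV1::GoogleCloudMlV1ManualScaling.new(nodes: 2)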
#name ⇒ String
Required. The name specified for the version when it was created. The version
name must be unique within the model it is created in.
Corresponds to the JSON property name
# File 'lib/google/apis/ml_v1/classes.rb', line 3176

def name
  @name
end
#package_uris ⇒ Array<String>
Optional. Cloud Storage paths (gs://…) of packages for custom prediction
routines or scikit-learn pipelines with custom code. For a custom prediction
routine, one of these packages must contain your Predictor class (see
predictionClass). Additionally, include any dependencies that your Predictor
or scikit-learn pipeline uses that are not already included in your selected
runtime version. If you specify this field, you must also set runtimeVersion
to 1.4 or greater.
Corresponds to the JSON property packageUris
# File 'lib/google/apis/ml_v1/classes.rb', line 3190

def package_uris
  @package_uris
end
#prediction_class ⇒ String
Optional. The fully qualified name (module_name.class_name) of a class that
implements the Predictor interface described in this reference field. The
module containing this class should be included in a package provided to the
packageUris field. Specify this field if and only if you are deploying a
custom prediction routine (beta). If you specify this field, you must set
runtimeVersion to 1.4 or greater and you must set machineType to a legacy
(MLS1) machine type. The following code sample provides the Predictor
interface:

class Predictor(object):
  """Interface for constructing custom predictors."""

  def predict(self, instances, **kwargs):
    """Performs custom prediction.

    Instances are the decoded values from the request. They have already been
    deserialized from JSON.

    Args:
      instances: A list of prediction input instances.
      **kwargs: A dictionary of keyword args provided as additional fields on
        the predict request body.

    Returns:
      A list of outputs containing the prediction results. This list must be
      JSON serializable.
    """
    raise NotImplementedError()

  @classmethod
  def from_path(cls, model_dir):
    """Creates an instance of Predictor using the given path. Loading of the
    predictor should be done in this method.

    Args:
      model_dir: The local directory that contains the exported model file
        along with any additional files uploaded when creating the version
        resource.

    Returns:
      An instance implementing this Predictor class.
    """
    raise NotImplementedError()

Learn more about the Predictor interface and custom prediction routines.
Corresponds to the JSON property predictionClass
# File 'lib/google/apis/ml_v1/classes.rb', line 3218

def prediction_class
  @prediction_class
end
#python_version ⇒ String
Required. The version of Python used in prediction. The following Python
versions are available:
* Python '3.7' is available when runtime_version is set to '1.15' or later.
* Python '3.5' is available when runtime_version is set to a version from '1.4' to '1.14'.
* Python '2.7' is available when runtime_version is set to '1.15' or earlier.
Read more about the Python versions available for each runtime version.
Corresponds to the JSON property pythonVersion
# File 'lib/google/apis/ml_v1/classes.rb', line 3229

def python_version
  @python_version
end
#request_logging_config ⇒ Google::Apis::MlV1::GoogleCloudMlV1RequestLoggingConfig
Configuration for logging request-response pairs to a BigQuery table. Online
prediction requests to a model version and the responses to these requests are
converted to raw strings and saved to the specified BigQuery table. Logging is
constrained by BigQuery quotas and limits. If your project
exceeds BigQuery quotas or limits, AI Platform Prediction does not log request-
response pairs, but it continues to serve predictions. If you are using
continuous evaluation, you do not
need to specify this configuration manually. Setting up continuous evaluation
automatically enables logging of request-response pairs.
Corresponds to the JSON property requestLoggingConfig
# File 'lib/google/apis/ml_v1/classes.rb', line 3242

def request_logging_config
  @request_logging_config
end
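As an illustrative sketch, enabling request-response logging on a version; the bigquery_table_name and sampling_percentage attributes reflect the generated GoogleCloudMlV1RequestLoggingConfig class, and the table name is a placeholder.

# Sketch: log roughly 10% of online prediction traffic to BigQuery.
version.request_logging_config = Google::Apis::MlV1::GoogleCloudMlV1RequestLoggingConfig.new(
  bigquery_table_name: 'my-project.prediction_logs.my_model_v1',
  sampling_percentage: 0.1
)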
#routes ⇒ Google::Apis::MlV1::GoogleCloudMlV1RouteMap
Specifies HTTP paths served by a custom container. AI Platform Prediction
sends requests to these paths on the container; the custom container must run
an HTTP server that responds to these requests with appropriate responses.
Read Custom container requirements for details on how to create your container image to
meet these requirements.
Corresponds to the JSON property routes
# File 'lib/google/apis/ml_v1/classes.rb', line 3252

def routes
  @routes
end
#runtime_version ⇒ String
Required. The AI Platform runtime version to use for this deployment. For more
information, see the runtime version list and how to manage runtime versions.
Corresponds to the JSON property runtimeVersion
# File 'lib/google/apis/ml_v1/classes.rb', line 3259

def runtime_version
  @runtime_version
end
#service_account ⇒ String
Optional. Specifies the service account for resource access control. If you
specify this field, then you must also specify either the containerSpec or
the predictionClass field. Learn more about using a custom service account.
Corresponds to the JSON property serviceAccount
# File 'lib/google/apis/ml_v1/classes.rb', line 3267

def service_account
  @service_account
end
#state ⇒ String
Output only. The state of a version.
Corresponds to the JSON property state
# File 'lib/google/apis/ml_v1/classes.rb', line 3272

def state
  @state
end
Instance Method Details
#update!(**args) ⇒ Object
Update properties of this object
# File 'lib/google/apis/ml_v1/classes.rb', line 3279

def update!(**args)
  @accelerator_config = args[:accelerator_config] if args.key?(:accelerator_config)
  @auto_scaling = args[:auto_scaling] if args.key?(:auto_scaling)
  @container = args[:container] if args.key?(:container)
  @create_time = args[:create_time] if args.key?(:create_time)
  @deployment_uri = args[:deployment_uri] if args.key?(:deployment_uri)
  @description = args[:description] if args.key?(:description)
  @error_message = args[:error_message] if args.key?(:error_message)
  @etag = args[:etag] if args.key?(:etag)
  @explanation_config = args[:explanation_config] if args.key?(:explanation_config)
  @framework = args[:framework] if args.key?(:framework)
  @is_default = args[:is_default] if args.key?(:is_default)
  @labels = args[:labels] if args.key?(:labels)
  @last_migration_model_id = args[:last_migration_model_id] if args.key?(:last_migration_model_id)
  @last_migration_time = args[:last_migration_time] if args.key?(:last_migration_time)
  @last_use_time = args[:last_use_time] if args.key?(:last_use_time)
  @machine_type = args[:machine_type] if args.key?(:machine_type)
  @manual_scaling = args[:manual_scaling] if args.key?(:manual_scaling)
  @name = args[:name] if args.key?(:name)
  @package_uris = args[:package_uris] if args.key?(:package_uris)
  @prediction_class = args[:prediction_class] if args.key?(:prediction_class)
  @python_version = args[:python_version] if args.key?(:python_version)
  @request_logging_config = args[:request_logging_config] if args.key?(:request_logging_config)
  @routes = args[:routes] if args.key?(:routes)
  @runtime_version = args[:runtime_version] if args.key?(:runtime_version)
  @service_account = args[:service_account] if args.key?(:service_account)
  @state = args[:state] if args.key?(:state)
end
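For completeness, a small usage sketch of update!; the values are placeholders.

# update! only assigns the attributes present in args, so omitted fields keep
# their current values.
version.update!(description: 'challenger model', labels: { 'stage' => 'canary' })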