Class GoogleCloudAiplatformV1SafetySetting
A safety setting that affects the safety-blocking behavior. A SafetySetting consists of a harm category and a threshold for that category.
Implements
IDirectResponseSchema
Namespace: Google.Apis.Aiplatform.v1.Data
Assembly: Google.Apis.Aiplatform.v1.dll
Syntax
public class GoogleCloudAiplatformV1SafetySetting : IDirectResponseSchema
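This is a plain data (schema) class. A minimal sketch of populating it is shown below; the string values and the surrounding request usage mentioned in the comments are assumptions based on the underlying Vertex AI REST API, not details stated on this page.

```csharp
using System.Collections.Generic;
using Google.Apis.Aiplatform.v1.Data;

// Build a safety setting: a harm category paired with a blocking threshold.
// The string values are assumed examples mirroring the REST API enum names;
// they are not listed on this page.
var safetySetting = new GoogleCloudAiplatformV1SafetySetting
{
    Category = "HARM_CATEGORY_HATE_SPEECH",
    Threshold = "BLOCK_MEDIUM_AND_ABOVE",
    Method = "PROBABILITY"  // optional; omit to fall back to the default probability-based behavior
};

// Settings are normally supplied as a list on a request body
// (for example, a generate-content request's safety settings; assumed usage).
var safetySettings = new List<GoogleCloudAiplatformV1SafetySetting> { safetySetting };
```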
Properties
Category
Required. The harm category to be blocked.
Declaration
[JsonProperty("category")]
public virtual string Category { get; set; }
Property Value
| Type | Description |
|---|---|
| string | |
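The description cell is left blank in the generated reference; in practice the string holds a harm-category name from the underlying REST API. A minimal sketch, where the specific value is an assumption for illustration:

```csharp
using Google.Apis.Aiplatform.v1.Data;

// Category takes a harm-category name as a plain string; the value below is an
// assumed example, not taken from this page.
var setting = new GoogleCloudAiplatformV1SafetySetting
{
    Category = "HARM_CATEGORY_HARASSMENT"
};
```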
ETag
The ETag of the item.
Declaration
public virtual string ETag { get; set; }
Property Value
| Type | Description |
|---|---|
| string | |
Method
Optional. The method for blocking content. If not specified, the probability score is used by default.
Declaration
[JsonProperty("method")]
public virtual string Method { get; set; }
Property Value
| Type | Description |
|---|---|
| string | |
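A minimal sketch of setting the method explicitly; the string values are assumptions mirroring the REST API and are not stated on this page.

```csharp
using Google.Apis.Aiplatform.v1.Data;

// Method selects how content is scored for blocking; omitting it falls back to
// the probability score, as noted above. The values below are assumed examples.
var setting = new GoogleCloudAiplatformV1SafetySetting
{
    Category = "HARM_CATEGORY_HATE_SPEECH",  // assumed example value
    Method = "SEVERITY"                      // assumed example value
};
```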
Threshold
Required. The threshold for blocking content. If the harm probability exceeds this threshold, the content will be blocked.
Declaration
[JsonProperty("threshold")]
public virtual string Threshold { get; set; }
Property Value
| Type | Description |
|---|---|
| string | |
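A minimal sketch of a fully specified threshold; the threshold strings (for example BLOCK_ONLY_HIGH or BLOCK_MEDIUM_AND_ABOVE) are assumptions based on the underlying REST API, not values listed on this page.

```csharp
using Google.Apis.Aiplatform.v1.Data;

// Threshold decides at what harm level content is blocked; the strings below are
// assumed example values mirroring the REST API's threshold names.
var setting = new GoogleCloudAiplatformV1SafetySetting
{
    Category = "HARM_CATEGORY_DANGEROUS_CONTENT",  // assumed example value
    Threshold = "BLOCK_ONLY_HIGH"                  // assumed example value
};
```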