Class GoogleCloudAiplatformV1SchemaModelevaluationMetricsClassificationEvaluationMetricsConfidenceMetrics
Inheritance
object → GoogleCloudAiplatformV1SchemaModelevaluationMetricsClassificationEvaluationMetricsConfidenceMetrics
Implements
IDirectResponseSchema
Namespace: Google.Apis.Aiplatform.v1.Data
Assembly: Google.Apis.Aiplatform.v1.dll
Syntax
public class GoogleCloudAiplatformV1SchemaModelevaluationMetricsClassificationEvaluationMetricsConfidenceMetrics : IDirectResponseSchema
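Because the class carries Newtonsoft.Json JsonProperty attributes, an instance can be populated directly from a JSON metrics payload. A minimal sketch, assuming the Google.Apis.Aiplatform.v1 and Newtonsoft.Json packages are referenced; the JSON fragment below is hypothetical:

```csharp
using System;
using Google.Apis.Aiplatform.v1.Data;
using Newtonsoft.Json;

// Hypothetical JSON shaped like a single confidenceMetrics entry.
const string json = @"{
  ""confidenceThreshold"": 0.5,
  ""precision"": 0.91,
  ""recall"": 0.87,
  ""f1Score"": 0.889
}";

var metrics = JsonConvert.DeserializeObject<
    GoogleCloudAiplatformV1SchemaModelevaluationMetricsClassificationEvaluationMetricsConfidenceMetrics>(json);

Console.WriteLine(
    $"P={metrics.Precision} R={metrics.Recall} F1={metrics.F1Score} @ threshold {metrics.ConfidenceThreshold}");
```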
Properties
ConfidenceThreshold
Metrics are computed under the assumption that the Model never returns predictions with a score lower than this value.
Declaration
[JsonProperty("confidenceThreshold")]
public virtual float? ConfidenceThreshold { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
ConfusionMatrix
Confusion matrix of the evaluation for this confidence_threshold.
Declaration
[JsonProperty("confusionMatrix")]
public virtual GoogleCloudAiplatformV1SchemaModelevaluationMetricsConfusionMatrix ConfusionMatrix { get; set; }
Property Value
| Type | Description |
|---|---|
| GoogleCloudAiplatformV1SchemaModelevaluationMetricsConfusionMatrix |
ETag
The ETag of the item.
Declaration
public virtual string ETag { get; set; }
Property Value
| Type | Description |
|---|---|
| string |
F1Score
The harmonic mean of recall and precision. For summary metrics, it computes the micro-averaged F1 score.
Declaration
[JsonProperty("f1Score")]
public virtual float? F1Score { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
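F1Score is described above as the harmonic mean of recall and precision; a minimal sketch of that relationship with illustrative values (not output from a real evaluation):

```csharp
using System;

// F1 = 2PR / (P + R); the values below are illustrative only.
float precision = 0.91f;
float recall = 0.87f;
float f1 = precision + recall > 0
    ? 2 * precision * recall / (precision + recall)
    : 0f;
Console.WriteLine(f1); // ~0.8896
```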
F1ScoreAt1
The harmonic mean of recallAt1 and precisionAt1.
Declaration
[JsonProperty("f1ScoreAt1")]
public virtual float? F1ScoreAt1 { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
F1ScoreMacro
Macro-averaged F1 Score.
Declaration
[JsonProperty("f1ScoreMacro")]
public virtual float? F1ScoreMacro { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
F1ScoreMicro
Micro-averaged F1 Score.
Declaration
[JsonProperty("f1ScoreMicro")]
public virtual float? F1ScoreMicro { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
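The difference between the macro- and micro-averaged scores above can be sketched as follows: macro-averaging computes F1 per class and takes the unweighted mean, while micro-averaging pools the true positive, false positive, and false negative counts across classes before computing a single F1. The per-class counts below are hypothetical:

```csharp
using System;
using System.Linq;

// Hypothetical per-class counts: (truePositives, falsePositives, falseNegatives).
var perClass = new (long Tp, long Fp, long Fn)[]
{
    (50, 10, 5),
    (8, 2, 12),
    (30, 6, 4),
};

static float F1(long tp, long fp, long fn)
{
    float p = tp + fp > 0 ? (float)tp / (tp + fp) : 0f;
    float r = tp + fn > 0 ? (float)tp / (tp + fn) : 0f;
    return p + r > 0 ? 2 * p * r / (p + r) : 0f;
}

// Macro: average the per-class F1 scores.
float macro = perClass.Select(c => F1(c.Tp, c.Fp, c.Fn)).Average();

// Micro: pool the counts first, then compute one F1 over the pooled counts.
float micro = F1(perClass.Sum(c => c.Tp), perClass.Sum(c => c.Fp), perClass.Sum(c => c.Fn));

Console.WriteLine($"macro={macro:F3} micro={micro:F3}");
```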
FalseNegativeCount
The number of ground truth labels that are not matched by a Model-created label.
Declaration
[JsonProperty("falseNegativeCount")]
public virtual long? FalseNegativeCount { get; set; }
Property Value
| Type | Description |
|---|---|
| long? |
FalsePositiveCount
The number of Model-created labels that do not match a ground truth label.
Declaration
[JsonProperty("falsePositiveCount")]
public virtual long? FalsePositiveCount { get; set; }
Property Value
| Type | Description |
|---|---|
| long? |
FalsePositiveRate
False Positive Rate for the given confidence threshold.
Declaration
[JsonProperty("falsePositiveRate")]
public virtual float? FalsePositiveRate { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
FalsePositiveRateAt1
The False Positive Rate when, for each DataItem, only the label with the highest prediction score that is not below the confidence threshold is considered.
Declaration
[JsonProperty("falsePositiveRateAt1")]
public virtual float? FalsePositiveRateAt1 { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
MaxPredictions
Metrics are computed under the assumption that the Model always returns at most this many predictions (ordered by their score in descending order), but all of them still need to meet the confidenceThreshold.
Declaration
[JsonProperty("maxPredictions")]
public virtual int? MaxPredictions { get; set; }
Property Value
| Type | Description |
|---|---|
| int? |
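The combined effect of ConfidenceThreshold and MaxPredictions can be illustrated as follows: per DataItem, predictions are ordered by score, at most MaxPredictions of them are kept, and any that fall below the threshold are dropped. This is a sketch of the documented assumption, not code from the library:

```csharp
using System;
using System.Linq;

// Hypothetical predictions for one DataItem as (label, score) pairs.
var predictions = new (string Label, float Score)[]
{
    ("cat", 0.92f), ("dog", 0.55f), ("fox", 0.31f), ("owl", 0.12f),
};

float confidenceThreshold = 0.5f;
int maxPredictions = 2;

// Keep at most maxPredictions labels, ordered by score descending,
// and only those whose score is not below the confidence threshold.
var considered = predictions
    .OrderByDescending(p => p.Score)
    .Take(maxPredictions)
    .Where(p => p.Score >= confidenceThreshold)
    .ToList();

Console.WriteLine(string.Join(", ", considered.Select(p => $"{p.Label}:{p.Score}")));
// cat:0.92, dog:0.55
```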
Precision
Precision for the given confidence threshold.
Declaration
[JsonProperty("precision")]
public virtual float? Precision { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
PrecisionAt1
The precision when, for each DataItem, only the label with the highest prediction score that is not below the confidence threshold is considered.
Declaration
[JsonProperty("precisionAt1")]
public virtual float? PrecisionAt1 { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
Recall
Recall (True Positive Rate) for the given confidence threshold.
Declaration
[JsonProperty("recall")]
public virtual float? Recall { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
RecallAt1
The Recall (True Positive Rate) when, for each DataItem, only the label with the highest prediction score that is not below the confidence threshold is considered.
Declaration
[JsonProperty("recallAt1")]
public virtual float? RecallAt1 { get; set; }
Property Value
| Type | Description |
|---|---|
| float? |
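One way to read the @1 metrics above, sketched under a single-label assumption: each DataItem has exactly one ground-truth label, and only its top-scoring prediction is considered when that score is not below the threshold. The items and numbers are hypothetical; the service computes the authoritative values:

```csharp
using System;
using System.Linq;

// Hypothetical items: ground-truth label plus the model's top prediction (label, score).
var items = new (string Truth, string TopLabel, float TopScore)[]
{
    ("cat", "cat", 0.95f),
    ("dog", "cat", 0.62f),
    ("dog", "dog", 0.40f), // below the threshold, so its top prediction is ignored
    ("fox", "fox", 0.81f),
};

float threshold = 0.5f;

// Only items whose top prediction clears the threshold contribute a prediction.
var predicted = items.Where(i => i.TopScore >= threshold).ToList();
int correct = predicted.Count(i => i.TopLabel == i.Truth);

float precisionAt1 = predicted.Count > 0 ? (float)correct / predicted.Count : 0f;
float recallAt1 = items.Length > 0 ? (float)correct / items.Length : 0f;

Console.WriteLine($"precision@1={precisionAt1:F2} recall@1={recallAt1:F2}");
// precision@1=0.67 recall@1=0.50
```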
TrueNegativeCount
The number of labels that the Model did not create, but which, had they been created, would not have matched a ground truth label.
Declaration
[JsonProperty("trueNegativeCount")]
public virtual long? TrueNegativeCount { get; set; }
Property Value
| Type | Description |
|---|---|
| long? |
TruePositiveCount
The number of Model-created labels that match a ground truth label.
Declaration
[JsonProperty("truePositiveCount")]
public virtual long? TruePositiveCount { get; set; }
Property Value
| Type | Description |
|---|---|
| long? |
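For orientation, the count properties above relate to the rate and score properties in the standard way: Precision = TP / (TP + FP), Recall = TP / (TP + FN), FalsePositiveRate = FP / (FP + TN). A sketch with hypothetical counts (the service reports the authoritative values):

```csharp
using System;

// Hypothetical counts of the kind reported by the *Count properties.
long tp = 120, fp = 15, fn = 30, tn = 835;

float precision = (float)tp / (tp + fp);         // cf. Precision
float recall = (float)tp / (tp + fn);            // cf. Recall (true positive rate)
float falsePositiveRate = (float)fp / (fp + tn); // cf. FalsePositiveRate
float f1 = 2 * precision * recall / (precision + recall); // cf. F1Score

Console.WriteLine($"P={precision:F3} R={recall:F3} FPR={falsePositiveRate:F4} F1={f1:F3}");
```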