@Generated(value="OracleSDKGenerator", comments="API Version: 20221001") public final class NamedEntityRecognitionModelMetrics extends com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel
Model-level named entity recognition metrics.
Note: Objects should always be created or deserialized using the NamedEntityRecognitionModelMetrics.Builder. This model distinguishes fields that are null because they are unset from fields that are explicitly set to null. This is done in the setter methods of the NamedEntityRecognitionModelMetrics.Builder, which maintain a set of all explicitly set fields called NamedEntityRecognitionModelMetrics.Builder.__explicitlySet__. The hashCode() and equals(Object) methods are implemented to take the explicitly set fields into account. The constructor, on the other hand, does not take the explicitly set fields into account (since the constructor cannot distinguish explicit null from unset null).
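A minimal usage sketch of the builder behavior described above. The import path (com.oracle.bmc.ailanguage.model) and the metric values are assumptions for illustration; the setter names follow the usual OCI SDK convention of matching the model's field names.

    import com.oracle.bmc.ailanguage.model.NamedEntityRecognitionModelMetrics;

    public class NerMetricsBuilderExample {
        public static void main(String[] args) {
            // Fields set here (including an explicit null) are recorded in
            // Builder.__explicitlySet__; fields never touched remain unset.
            NamedEntityRecognitionModelMetrics metrics =
                    NamedEntityRecognitionModelMetrics.builder()
                            .microF1(0.91f)          // hypothetical values
                            .microPrecision(0.93f)
                            .microRecall(0.89f)
                            .macroF1(null)           // explicitly set to null
                            .build();

            // toBuilder() copies the current instance into a new builder,
            // so individual fields can be overridden before rebuilding.
            NamedEntityRecognitionModelMetrics updated =
                    metrics.toBuilder()
                            .weightedF1(0.90f)
                            .build();

            System.out.println(updated.getMicroF1());   // 0.91
            System.out.println(updated);
        }
    }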
Modifier and Type | Class and Description
---|---
static class | NamedEntityRecognitionModelMetrics.Builder
Fields inherited from class com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel: EXPLICITLY_SET_FILTER_NAME, EXPLICITLY_SET_PROPERTY_NAME
Constructor and Description
---
NamedEntityRecognitionModelMetrics(Float microF1, Float microPrecision, Float microRecall, Float macroF1, Float macroPrecision, Float macroRecall, Float weightedF1, Float weightedPrecision, Float weightedRecall): Deprecated.
Modifier and Type | Method and Description
---|---
static NamedEntityRecognitionModelMetrics.Builder | builder(): Create a new builder.
boolean | equals(Object o)
Float | getMacroF1(): F1-score is a measure of a model's accuracy on a dataset.
Float | getMacroPrecision(): Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
Float | getMacroRecall(): Measures the model's ability to predict actual positive classes.
Float | getMicroF1(): F1-score is a measure of a model's accuracy on a dataset.
Float | getMicroPrecision(): Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
Float | getMicroRecall(): Measures the model's ability to predict actual positive classes.
Float | getWeightedF1(): F1-score is a measure of a model's accuracy on a dataset.
Float | getWeightedPrecision(): Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
Float | getWeightedRecall(): Measures the model's ability to predict actual positive classes.
int | hashCode()
NamedEntityRecognitionModelMetrics.Builder | toBuilder()
String | toString()
String | toString(boolean includeByteArrayContents): Return a string representation of the object.
Methods inherited from class com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel: markPropertyAsExplicitlySet, wasPropertyExplicitlySet
@Deprecated @ConstructorProperties(value={"microF1","microPrecision","microRecall","macroF1","macroPrecision","macroRecall","weightedF1","weightedPrecision","weightedRecall"}) public NamedEntityRecognitionModelMetrics(Float microF1, Float microPrecision, Float microRecall, Float macroF1, Float macroPrecision, Float macroRecall, Float weightedF1, Float weightedPrecision, Float weightedRecall)
public static NamedEntityRecognitionModelMetrics.Builder builder()
Create a new builder.
public NamedEntityRecognitionModelMetrics.Builder toBuilder()
public Float getMicroF1()
F1-score is a measure of a model’s accuracy on a dataset.
public Float getMicroPrecision()
Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
public Float getMicroRecall()
Measures the model’s ability to predict actual positive classes. It is the ratio between the predicted true positives and what was actually tagged. The recall metric reveals how many of the actually tagged classes the model predicted correctly.
public Float getMacroF1()
F1-score is a measure of a model’s accuracy on a dataset.
public Float getMacroPrecision()
Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
public Float getMacroRecall()
Measures the model’s ability to predict actual positive classes. It is the ratio between the predicted true positives and what was actually tagged. The recall metric reveals how many of the actually tagged classes the model predicted correctly.
public Float getWeightedF1()
F1-score is a measure of a model’s accuracy on a dataset.
public Float getWeightedPrecision()
Precision refers to the number of true positives divided by the total number of positive predictions (i.e., the number of true positives plus the number of false positives).
public Float getWeightedRecall()
Measures the model’s ability to predict actual positive classes. It is the ratio between the predicted true positives and what was actually tagged. The recall metric reveals how many of the actually tagged classes the model predicted correctly.
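As an illustration of the formulas described by the getters above, the following sketch computes precision, recall, and F1 from hypothetical true positive, false positive, and false negative counts for a single entity type. The counts are made up; the micro, macro, and weighted values returned by this class are typically such numbers aggregated across entity types by the service.

    public final class NerMetricsFormulaExample {
        public static void main(String[] args) {
            // Hypothetical counts for one entity type.
            int truePositives = 80;
            int falsePositives = 20;
            int falseNegatives = 10;

            // Precision: true positives divided by all positive predictions (TP + FP).
            float precision = truePositives / (float) (truePositives + falsePositives);

            // Recall: predicted true positives over everything actually tagged (TP + FN).
            float recall = truePositives / (float) (truePositives + falseNegatives);

            // F1-score: harmonic mean of precision and recall, a single accuracy measure.
            float f1 = 2 * precision * recall / (precision + recall);

            System.out.printf("precision=%.3f recall=%.3f f1=%.3f%n", precision, recall, f1);
        }
    }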
public String toString()
Overrides: toString in class com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel
public String toString(boolean includeByteArrayContents)
Return a string representation of the object.
Parameters: includeByteArrayContents - true to include the full contents of byte arrays
public boolean equals(Object o)
Overrides: equals in class com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel
public int hashCode()
Overrides: hashCode in class com.oracle.bmc.http.client.internal.ExplicitlySetBmcModel