EvalMetrics.Encodings.OneMinusOneType
OneMinusOne{T<:Number} <: TwoClassEncoding{T}

Two-class label encoding in which one(T) represents the positive class and -one(T) the negative class.

EvalMetrics.Encodings.OneTwoType
OneTwo{T<:Number} <: TwoClassEncoding{T}

Two-class label encoding in which one(T) represents the positive class and 2*one(T) the negative class.

EvalMetrics.Encodings.OneZeroType
OneZero{T<:Number} <: TwoClassEncoding{T}

Two-class label encoding in which one(T) represents the positive class and zero(T) the negative class.
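As a quick illustration (the literal vectors below are ours, not from the package, assuming T = Int), the same four labels, positive, negative, negative, positive, written in each encoding:

```julia
# The same four labels (pos, neg, neg, pos) in each two-class encoding, T = Int:
onezero     = [1, 0, 0, 1]    # OneZero:     positive = one(T), negative = zero(T)
oneminusone = [1, -1, -1, 1]  # OneMinusOne: positive = one(T), negative = -one(T)
onetwo      = [1, 2, 2, 1]    # OneTwo:      positive = one(T), negative = 2*one(T)
```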

EvalMetrics.ConfusionMatrixMethod
ConfusionMatrix(targets::AbstractVector, scores::RealVector, thres::RealVector)
ConfusionMatrix(enc::TwoClassEncoding, targets::AbstractVector, scores::RealVector, thres_in::RealVector)

Computes the binary classification confusion matrix for each threshold in thres.

EvalMetrics.ConfusionMatrixMethod
ConfusionMatrix(targets::AbstractVector, scores::RealVector, thres::Real)
ConfusionMatrix(enc::TwoClassEncoding, targets::AbstractVector, scores::RealVector, thres::Real)

Computes the binary confusion matrix for the predictions scores .>= thres against the true labels targets.
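A minimal by-hand sketch of the four cells being counted, in plain Julia with made-up data (no package calls), assuming a OneZero encoding:

```julia
targets = [1, 0, 1, 1, 0]            # true labels, OneZero encoding
scores  = [0.9, 0.6, 0.4, 0.8, 0.2]  # classifier scores
thres   = 0.5
predicts = scores .>= thres          # predictions implied by the threshold

tp = sum((targets .== 1) .&   predicts)  # true positives:  2
fp = sum((targets .== 0) .&   predicts)  # false positives: 1
fn = sum((targets .== 1) .& .!predicts)  # false negatives: 1
tn = sum((targets .== 0) .& .!predicts)  # true negatives:  1
```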

EvalMetrics.ConfusionMatrixMethod
ConfusionMatrix(targets::AbstractVector, predicts::AbstractVector)
ConfusionMatrix(enc::TwoClassEncoding, targets::AbstractVector, predicts::AbstractVector)

Computes the binary confusion matrix for the given predictions predicts against the true labels targets.

Base.precisionMethod
precision(args; kwargs...)

Returns precision tp/(tp + fp). Aliases: positive_predictive_value.

EvalMetrics.f1_scoreMethod
f1_score(args; kwargs...)

Returns the F1 score 2*precision*recall/(precision + recall).

EvalMetrics.find_threshold_binsMethod
find_threshold_bins(x::Real, thres::RealVector)

find_threshold_bins returns the bin index of x with respect to the thresholds thres (of length n):

x < thres[1] -> 1
thres[i] <= x < thres[i+1] -> i+1
x >= thres[n] -> n+1
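For thres sorted in ascending order, the documented rule is equivalent to searchsortedlast plus one; a hedged one-line re-implementation (bin is a hypothetical helper, not part of the package's API):

```julia
# Bin index of x in sorted thresholds, per the documented rule.
bin(x, thres) = searchsortedlast(thres, x) + 1

thres = [0.2, 0.5, 0.8]
bin(0.1, thres)  # 1  (x < thres[1])
bin(0.5, thres)  # 3  (thres[2] <= x < thres[3])
bin(0.9, thres)  # 4  (x >= thres[end])
```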

EvalMetrics.fβ_scoreMethod
fβ_score(args; kwargs...)

Returns fβ score (1 + β^2)*precision*recall/(β^2*precision + recall).
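The role of β can be seen by plugging numbers into the formula; fβ below is a hypothetical helper that evaluates the same expression directly from precision and recall, not the package function:

```julia
# fβ from precision p and recall r; β > 1 weights recall more, β < 1 precision.
fβ(p, r, β) = (1 + β^2) * p * r / (β^2 * p + r)

p, r = 0.8, 0.5
fβ(p, r, 1.0)  # ≈ 0.615  (plain F1)
fβ(p, r, 2.0)  # ≈ 0.541  (recall-weighted)
fβ(p, r, 0.5)  # ≈ 0.714  (precision-weighted)
```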

EvalMetrics.roccurveMethod
roccurve(args; kwargs...)

Returns false positive rates and true positive rates.

EvalMetrics.threat_scoreMethod
threat_score(args; kwargs...)

Returns threat score tp/(tp + fn + fp). Aliases: critical_success_index.

EvalMetrics.threshold_at_fnrMethod
threshold_at_fnr(targets, scores, fnr)

Returns a decision threshold at a given false negative rate fnr ∈ [0, 1].

EvalMetrics.threshold_at_fprMethod
threshold_at_fpr(targets, scores, fpr)

Returns a decision threshold at a given false positive rate fpr ∈ [0, 1].
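The idea can be sketched as a quantile of the negative-class scores, plain Julia with made-up data (the exact tie- and edge-case handling in EvalMetrics may differ):

```julia
using Statistics

targets = [0, 0, 0, 0, 1, 1]              # OneZero encoding assumed
scores  = [0.1, 0.2, 0.3, 0.4, 0.8, 0.9]

# Threshold such that roughly fpr of the negative-class scores lie at or above it.
neg = scores[targets .== 0]
fpr = 0.25
t = quantile(neg, 1 - fpr)   # here 0.325; scores .>= t flags 1 of 4 negatives
```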

EvalMetrics.threshold_at_kMethod
threshold_at_k(scores, k; rev)

Returns a decision threshold at the k most anomalous samples if rev == true, and at the k least anomalous samples otherwise.
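A sketch of the idea for rev == true: the threshold tracks the k-th largest score, so the k most anomalous samples lie at or above it (whether the package's returned threshold is inclusive of the k-th score is an assumption here):

```julia
scores = [0.1, 0.9, 0.4, 0.7, 0.3]
k = 2
# k-th largest score; with scores .>= t this keeps the k most anomalous samples.
t = partialsort(scores, k; rev = true)  # 0.7
count(scores .>= t)                     # 2
```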

EvalMetrics.threshold_at_tnrMethod
threshold_at_tnr(targets, scores, tnr)

Returns a decision threshold at a given true negative rate tnr ∈ [0, 1].

EvalMetrics.threshold_at_tprMethod
threshold_at_tpr(targets, scores, tpr)

Returns a decision threshold at a given true positive rate tpr ∈ [0, 1].

EvalMetrics.thresholdsFunction
thresholds(scores; ...)
thresholds(scores, n; reduced, zerorecall)

Returns n decision thresholds corresponding to n evenly spaced quantiles of the given vector of scores. If reduced == true, the number of returned thresholds is min(length(scores) + 1, n). If zerorecall == true, the largest threshold is maximum(scores)*(1 + eps()); otherwise it is maximum(scores).
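The underlying idea can be sketched with Statistics.quantile; this is not the package implementation, and it ignores the reduced and zerorecall adjustments described above:

```julia
using Statistics

scores = [0.1, 0.4, 0.35, 0.8, 0.65]
n = 3
# n evenly spaced quantiles of the scores as candidate decision thresholds.
ts = quantile(scores, range(0, 1; length = n))  # [minimum, median, maximum]
```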