LIBSVM.svmmodel (Method)

Convert an SVM model to the libsvm struct for prediction.

LIBSVM.svmpredict (Method)
svmpredict(model::SVM{T}, X::AbstractMatrix{U}) where {T,U<:Real}

Predict values using the model based on data X. The shape of X needs to be (nfeatures, nsamples) (for a precomputed kernel, see below). The method returns a tuple (predictions, decisionvalues).
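
A minimal usage sketch with toy random data and default parameters (the names Xtrain, ytrain and Xtest are placeholders; svmtrain is documented below):

using LIBSVM

# Toy data: columns are samples, rows are features.
Xtrain = rand(4, 100)            # 4 features, 100 training samples
ytrain = rand(["a", "b"], 100)   # class label for each training sample
Xtest  = rand(4, 20)             # 4 features, 20 testing samples

model = svmtrain(Xtrain, ytrain)                        # default SVC with RBF kernel
predictions, decision_values = svmpredict(model, Xtest) # one prediction per column of Xtest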

Precomputed kernel

In the case of a precomputed kernel, the input matrix X should be of shape (l, n), where l is the number of training instances and n is the number of testing instances. Column i of X should contain the values [K(t_i, x_1), K(t_i, x_2), ..., K(t_i, x_l)], where t_i is the i-th testing instance and x_j is the j-th training instance. For a linear kernel, M' * T produces the correct matrix, where the columns of M contain the training instances and the columns of T the testing instances.
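
For example, a sketch with a precomputed linear kernel on toy random data (M, T and y are placeholders; the model is trained with kernel = Kernel.Precomputed as described under svmtrain):

using LIBSVM

M = rand(4, 100)        # training instances in columns
T = rand(4, 20)         # testing instances in columns
y = rand([-1, 1], 100)  # training labels

# Train on the (l, l) Gram matrix of the training data.
model = svmtrain(M' * M, y; kernel = Kernel.Precomputed)

# Predict on the (l, n) matrix of kernel values between training and testing instances.
predictions, decision_values = svmpredict(model, M' * T)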

LIBSVM.svmtrain (Method)
svmtrain(
    X::AbstractMatrix{U}, y::AbstractVector{T} = [];
    svmtype::Type = SVC,
    kernel = Kernel.RadialBasis,
    degree::Integer = 3,
    gamma::Float64 = 1.0/size(X, 1),
    coef0::Float64 = 0.0,
    cost::Float64=1.0,
    nu::Float64 = 0.5,
    epsilon::Float64 = 0.1,
    tolerance::Float64 = 0.001,
    shrinking::Bool = true,
    probability::Bool = false,
    weights::Union{Dict{T,Float64},Cvoid} = nothing,
    cachesize::Float64 = 200.0,
    verbose::Bool = false,
    nt::Integer = 0
) where {T,U<:Real}

Train a Support Vector Machine with LIBSVM, using the response vector y and training data X. The shape of X needs to be (nfeatures, nsamples), or (nsamples, nsamples) in the case of a precomputed kernel (see below). For a one-class SVM, use only X.

Arguments

  • svmtype::Type = LIBSVM.SVC: Type of SVM to train: SVC (C-SVM classification), NuSVC, OneClassSVM, EpsilonSVR or NuSVR. Defaults to OneClassSVM if y is not used.
  • kernel = Kernel.RadialBasis: Model kernel: Kernel.Linear, Kernel.Polynomial, Kernel.RadialBasis, Kernel.Sigmoid, Kernel.Precomputed, or a Base.Callable.
  • degree::Integer = 3: Kernel degree; used for the polynomial kernel
  • gamma::Float64 = 1.0/size(X, 1): γ parameter for the kernels
  • coef0::Float64 = 0.0: parameter for the sigmoid and polynomial kernels
  • cost::Float64 = 1.0: cost parameter C of C-SVC, epsilon-SVR, and nu-SVR
  • nu::Float64 = 0.5: parameter nu of nu-SVC, one-class SVM, and nu-SVR
  • epsilon::Float64 = 0.1: epsilon in loss function of epsilon-SVR
  • tolerance::Float64 = 0.001: tolerance of termination criterion
  • shrinking::Bool = true: whether to use the shrinking heuristics
  • probability::Bool = false: whether to train an SVC or SVR model for probability estimates
  • weights::Union{Dict{T, Float64}, Cvoid} = nothing: dictionary of class weights
  • cachesize::Float64 = 200.0: cache memory size in MB
  • verbose::Bool = false: print training output from LIBSVM if true
  • nt::Integer = 0: number of OpenMP cores to use; if 0, it is set to OMP_NUM_THREADS, and if negative it is set to the maximum number of threads

Consult the LIBSVM documentation for advice on the choice of parameters and model tuning.
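
For instance, a sketch of training an epsilon-SVR model with a few keyword arguments spelled out (toy random data; the parameter values are illustrative, not tuned recommendations):

using LIBSVM

X = rand(4, 200)                               # 4 features, 200 samples
y = vec(sum(X; dims = 1)) .+ 0.1 .* randn(200) # noisy regression target

model = svmtrain(X, y;
                 svmtype = EpsilonSVR,
                 kernel = Kernel.RadialBasis,
                 gamma = 0.25,
                 cost = 10.0,
                 epsilon = 0.1)
predictions, _ = svmpredict(model, X)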

Precomputed kernel

In the case of a precomputed kernel, the input matrix X should be a (symmetric) matrix of shape (n, n), where column i contains the values [K(x_i, x_1), K(x_i, x_2), ..., K(x_i, x_n)]. For example, if the matrix M contains the instances in its columns, then M' * M produces the correct input matrix X for the linear kernel.
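
A short training-side sketch with toy random data (M and y are placeholders; the corresponding M' * T construction for prediction is shown under svmpredict):

using LIBSVM

M = rand(4, 100)        # training instances in columns
y = rand([-1, 1], 100)  # training labels

K = M' * M              # (n, n) Gram matrix for the linear kernel
model = svmtrain(K, y; kernel = Kernel.Precomputed)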