`EvoLinear.Splines.EvoSplineRegressor`

— Method `EvoSplineRegressor(; kwargs...)`

A model type for constructing an EvoSplineRegressor, based on EvoLinear.jl, and implementing both an internal API and the MLJ model interface.

**Keyword arguments**

- `loss=:mse`: loss function to be minimised. Can be one of: `:mse`, `:logistic`, `:poisson`, `:gamma`, `:tweedie`.
- `nrounds=10`: maximum number of training rounds.
- `eta=1`: learning rate. Typically in the range `[1e-2, 1]`.
- `L1=0`: regularization penalty that shrinks a weight update to 0 if the update is smaller than `L1`; no penalty is applied if the update exceeds `L1`. Results in sparse feature selection. Typically in the `[0, 1]` range on normalized features.
- `L2=0`: regularization penalty applied to the square of the weight update value. Restricts large parameter values. Typically in the `[0, 1]` range on normalized features.
- `rng=123`: random seed. Not used at the moment.
- `updater=:all`: training method. Only `:all` is supported at the moment. Gradients for each feature are computed simultaneously, then the bias is updated based on all feature updates.
- `device=:cpu`: only `:cpu` is supported at the moment.
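The `L1` behaviour described above amounts to hard-thresholding each weight update. A minimal sketch under that reading (illustrative plain Julia; the function name is hypothetical, not an EvoLinear internal):

```julia
# Hypothetical sketch of the L1 behaviour described above (not package internals):
# a weight update smaller than L1 is shrunk to 0, giving sparse feature selection.
l1_threshold(delta, L1) = abs(delta) < L1 ? zero(delta) : delta

l1_threshold(0.05, 0.1)  # small update dropped -> 0.0
l1_threshold(0.50, 0.1)  # large update kept unchanged -> 0.5
```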

**Internal API**

Do `config = EvoSplineRegressor()` to construct a hyper-parameter struct with default hyper-parameters. Provide keyword arguments as listed above to override defaults, for example:

`EvoSplineRegressor(loss=:logistic, L1=1e-3, L2=1e-2, nrounds=100)`

**Training model**

A model is built using `fit`:

```
config = EvoSplineRegressor()
m = fit(config; x, y, w)
```

**Inference**

The fitted result is a `SplineModel`, which acts as a prediction function when passed a features matrix as argument.

`preds = m(x)`
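End to end, the internal API can be exercised on synthetic data as below (a sketch; assumes EvoLinear is installed and follows the keyword names of the `fit` call above):

```julia
using EvoLinear

x = randn(100, 5)                      # features matrix, [nobs, num_features]
y = x * randn(5) .+ 0.1 .* randn(100)  # synthetic continuous target

config = EvoSplineRegressor(nrounds=100)
m = fit(config; x, y)                  # `w` (weights) is optional
preds = m(x)                           # the fitted model is callable on a features matrix
```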

**MLJ Interface**

From MLJ, the type can be imported using:

`EvoSplineRegressor = @load EvoSplineRegressor pkg=EvoLinear`

Do `model = EvoSplineRegressor()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `EvoSplineRegressor(loss=...)`.

**Training model**

In MLJ or MLJBase, bind an instance `model` to data with `mach = machine(model, X, y)` where:

- `X`: any table of input features (eg, a `DataFrame`) whose columns each have one of the following element scitypes: `Continuous`, `Count`, or `<:OrderedFactor`; check column scitypes with `schema(X)`.
- `y`: the target, which can be any `AbstractVector` whose element scitype is `<:Continuous`; check the scitype with `scitype(y)`.

Train the machine using `fit!(mach, rows=...)`.

**Operations**

- `predict(mach, Xnew)`: return predictions of the target given features `Xnew` having the same scitype as `X` above. Predictions are deterministic.

**Fitted parameters**

The fields of `fitted_params(mach)` are:

- `:fitresult`: the `SplineModel` object returned by the EvoSplineRegressor fitting algorithm.

**Report**

The fields of `report(mach)` are:

- `:coef`: vector of coefficients (βs) associated with each of the features.
- `:bias`: value of the bias.
- `:names`: names of each of the features.

`MLJModelInterface.fit`

— Method `fit(config::EvoSplineRegressor; x_train, y_train, x_eval = nothing, y_eval = nothing)`

Train a splined linear model.

`EvoLinear.Metrics.gamma_deviance`

— Method

```
gamma_deviance(p, y)
gamma_deviance(p, y, w)
```

Gamma deviance evaluation metric. `𝐷 = 2 * (log(μ/y) + y/μ - 1)`

**Arguments**

- `p`: predicted value. Assumes that `p` is on the projected basis (ie, in the `[0, Inf)` range).
- `y`: observed target variable.
- `w`: vector of weights.

`EvoLinear.Metrics.logloss`

— Method

```
logloss(p, y)
logloss(p, y, w)
```

Logloss evaluation metric. `-(y * log(p) + (1 - y) * log(1 - p))`

**Arguments**

- `p`: predicted value. Assumes that `p` is on the projected basis (ie, in the `[0, 1]` range).
- `y`: observed target variable.
- `w`: vector of weights.

`EvoLinear.Metrics.mae`

— Method

```
mae(p, y)
mae(p, y, w)
```

Mean absolute error evaluation metric.

**Arguments**

- `p`: predicted value.
- `y`: observed target variable.
- `w`: vector of weights.

`EvoLinear.Metrics.mse`

— Method

```
mse(p, y)
mse(p, y, w)
```

Mean squared error evaluation metric.

**Arguments**

- `p`: predicted value.
- `y`: observed target variable.
- `w`: vector of weights.

`EvoLinear.Metrics.poisson_deviance`

— Method

```
poisson_deviance(p, y)
poisson_deviance(p, y, w)
```

Poisson deviance evaluation metric. `𝐷 = 2 * (y * log(y/p) + p - y)`

**Arguments**

- `p`: predicted value. Assumes that `p` is on the projected basis (ie, in the `[0, Inf)` range).
- `y`: observed target variable.
- `w`: vector of weights.

`EvoLinear.Metrics.tweedie_deviance`

— Method

```
tweedie_deviance(p, y)
tweedie_deviance(p, y, w)
```

Tweedie deviance evaluation metric, with a fixed rho (ρ) of 1.5. `𝐷 = 2 * (y^(2-ρ) / ((1-ρ) * (2-ρ)) - y * μ^(1-ρ) / (1-ρ) + μ^(2-ρ) / (2-ρ))`

**Arguments**

- `p`: predicted value. Assumes that `p` is on the projected basis (ie, in the `[0, Inf)` range).
- `y`: observed target variable.
- `w`: vector of weights.
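As a cross-check, the per-observation forms of the deviance formulas above can be written out directly (illustrative plain Julia, not the package's vectorised, weighted implementations; `mu` stands for the predicted value `p`):

```julia
# Per-observation deviance sketches, matching the formulas quoted above.
gamma_dev(mu, y)   = 2 * (log(mu / y) + y / mu - 1)
poisson_dev(mu, y) = 2 * (y * log(y / mu) + mu - y)
tweedie_dev(mu, y; rho=1.5) =
    2 * (y^(2 - rho) / ((1 - rho) * (2 - rho)) -
         y * mu^(1 - rho) / (1 - rho) +
         mu^(2 - rho) / (2 - rho))

gamma_dev(2.0, 2.0)    # perfect prediction -> 0.0
poisson_dev(3.0, 3.0)  # perfect prediction -> 0.0
```

A deviance of zero whenever `mu == y` is a quick sanity check for all three.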

`EvoLinear.Linear.EvoLinearRegressor`

— Method `EvoLinearRegressor(; kwargs...)`

A model type for constructing an EvoLinearRegressor, based on EvoLinear.jl, and implementing both an internal API and the MLJ model interface.

**Keyword arguments**

- `loss=:mse`: loss function to be minimised. Can be one of: `:mse`, `:logistic`, `:poisson`, `:gamma`, `:tweedie`.
- `nrounds=10`: maximum number of training rounds.
- `eta=1`: learning rate. Typically in the range `[1e-2, 1]`.
- `L1=0`: regularization penalty that shrinks a weight update to 0 if the update is smaller than `L1`; no penalty is applied if the update exceeds `L1`. Results in sparse feature selection. Typically in the `[0, 1]` range on normalized features.
- `L2=0`: regularization penalty applied to the square of the weight update value. Restricts large parameter values. Typically in the `[0, 1]` range on normalized features.
- `rng=123`: random seed. Not used at the moment.
- `updater=:all`: training method. Only `:all` is supported at the moment. Gradients for each feature are computed simultaneously, then the bias is updated based on all feature updates.
- `device=:cpu`: only `:cpu` is supported at the moment.

**Internal API**

Do `config = EvoLinearRegressor()` to construct a hyper-parameter struct with default hyper-parameters. Provide keyword arguments as listed above to override defaults, for example:

`EvoLinearRegressor(loss=:logistic, L1=1e-3, L2=1e-2, nrounds=100)`

**Training model**

A model is built using `fit`:

```
config = EvoLinearRegressor()
m = fit(config; x, y, w)
```

**Inference**

The fitted result is an `EvoLinearModel`, which acts as a prediction function when passed a features matrix as argument.

`preds = m(x)`

**MLJ Interface**

From MLJ, the type can be imported using:

`EvoLinearRegressor = @load EvoLinearRegressor pkg=EvoLinear`

Do `model = EvoLinearRegressor()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `EvoLinearRegressor(loss=...)`.

**Training model**

In MLJ or MLJBase, bind an instance `model` to data with `mach = machine(model, X, y)` where:

- `X`: any table of input features (eg, a `DataFrame`) whose columns each have one of the following element scitypes: `Continuous`, `Count`, or `<:OrderedFactor`; check column scitypes with `schema(X)`.
- `y`: the target, which can be any `AbstractVector` whose element scitype is `<:Continuous`; check the scitype with `scitype(y)`.

Train the machine using `fit!(mach, rows=...)`.

**Operations**

- `predict(mach, Xnew)`: return predictions of the target given features `Xnew` having the same scitype as `X` above. Predictions are deterministic.
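The MLJ workflow above can be combined into one short session (a sketch; assumes MLJ and EvoLinear are installed):

```julia
using MLJ

EvoLinearRegressor = @load EvoLinearRegressor pkg=EvoLinear

X = (x1 = randn(100), x2 = randn(100))      # a column table of Continuous features
y = 2 .* X.x1 .- X.x2 .+ 0.1 .* randn(100)  # Continuous target

model = EvoLinearRegressor(nrounds=100, eta=0.5)
mach = machine(model, X, y)
fit!(mach)
preds = predict(mach, X)  # deterministic predictions
report(mach).coef         # fitted coefficients, one per feature
```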

**Fitted parameters**

The fields of `fitted_params(mach)` are:

- `:fitresult`: the `EvoLinearModel` object returned by the EvoLinear.jl fitting algorithm.

**Report**

The fields of `report(mach)` are:

- `:coef`: vector of coefficients (βs) associated with each of the features.
- `:bias`: value of the bias.
- `:names`: names of each of the features.

`EvoLinear.Linear.predict_linear`

— Method `predict_linear(m, x)`

Returns the predictions on the linear basis from model `m` using the features matrix `x`.

**Arguments**

- `m::EvoLinearModel`: model generating the predictions.
- `x`: features matrix of size `[nobs, num_features]` for which predictions are generated.

`EvoLinear.Linear.predict_proj`

— Method `predict_proj(m, x)`

Returns the predictions on the projected basis from model `m` using the features matrix `x`.

- `MSE`: `pred_proj = pred_linear`
- `Logistic`: `pred_proj = sigmoid(pred_linear)`
- `Poisson`: `pred_proj = exp(pred_linear)`
- `Gamma`: `pred_proj = exp(pred_linear)`
- `Tweedie`: `pred_proj = exp(pred_linear)`

**Arguments**

- `m::EvoLinearModel`: model generating the predictions.
- `x`: features matrix of size `[nobs, num_features]` for which predictions are generated.
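The projections listed above are standard inverse-link functions; a plain-Julia sketch for reference (illustrative; `sigmoid` is written out since it is not in Base, and `project` is a hypothetical helper, not a package function):

```julia
# Inverse-link projections corresponding to the list above (illustrative).
sigmoid(z) = 1 / (1 + exp(-z))

project(z, loss) =
    loss === :mse      ? z :
    loss === :logistic ? sigmoid(z) :
    exp(z)             # :poisson, :gamma and :tweedie all project with exp

project(0.0, :logistic)  # -> 0.5
project(0.0, :poisson)   # -> 1.0
```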

`EvoLinear.Linear.update_∇!`

— Method `update_∇!(L, ∇¹, ∇², x, y, p, w)`

Update gradients w.r.t. each feature. Each feature gradient update is dispatched according to the loss type (`mse`, `logistic`, ...).
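For the `mse` loss, for example, the simultaneous per-feature gradients reduce to weighted least-squares expressions. A hedged sketch (the function name and out-of-place convention are assumptions, not EvoLinear's exact in-place internals):

```julia
# Illustrative per-feature MSE gradients (not EvoLinear's in-place implementation).
# x: [nobs, num_features]; y, p, w: length-nobs vectors.
function mse_gradients(x, y, p, w)
    r  = p .- y                # residuals
    ∇¹ = x' * (2 .* w .* r)    # first-order gradient, one entry per feature
    ∇² = (x' .^ 2) * (2 .* w)  # second-order term, one entry per feature
    return ∇¹, ∇²
end
```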

`MLJModelInterface.fit`

— Method

```
fit(config::EvoLinearRegressor;
x, y, w=nothing,
x_eval=nothing, y_eval=nothing, w_eval=nothing,
metric=:none,
print_every_n=1)
```

Provided a `config`, `EvoLinear.fit` takes `x` and `y` as features and target inputs, plus optionally `w` as weights, and trains a boosted linear model.

**Arguments**

- `config::EvoLinearRegressor`: hyper-parameter configuration, as constructed above.

**Keyword arguments**

- `x::AbstractMatrix`: features matrix. Dimensions are `[nobs, num_features]`.
- `y::AbstractVector`: vector of observed targets.
- `w=nothing`: vector of weights. Can be either a `Vector` or `nothing`. If `nothing`, a vector of 1s is assumed.
- `metric=:none`: evaluation metric to be tracked through each iteration. Can be one of: `:mse`, `:logistic`, `:poisson_deviance`, `:gamma_deviance`, `:tweedie_deviance`.
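A training call that tracks an evaluation metric on a held-out set might look like (a sketch; assumes EvoLinear is installed):

```julia
using EvoLinear

β = randn(5)
x = randn(1_000, 5);    y = x * β .+ 0.1 .* randn(1_000)
x_eval = randn(200, 5); y_eval = x_eval * β .+ 0.1 .* randn(200)

config = EvoLinearRegressor(nrounds=100, eta=0.1)
m = fit(config;
    x, y,
    x_eval, y_eval,
    metric=:mse,
    print_every_n=10)  # report the eval metric every 10 rounds
```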