EasyFit.Cubic - Method
(fit::EasyFit.Cubic)(x::Real)

Calling the fitted estimator on a new sample returns the predicted value at that point. To compute predictions for multiple new data points, use broadcasting.

Examples

julia> x = sort(rand(10)); y = x.^3 .+ rand(10);

julia> f = fitcubic(x,y)

 ------------------- Cubic Fit ----------------- 

 Equation: y = ax^3 + bx^2 + cx + d 

 With: a = 3.498571133673037
       b = -5.75292789995513
       c = 2.626129810011887
       d = 0.6361773562878126

 Pearson correlation coefficient, R = 0.7405690253097572
 Average square residue = 0.01215483592609077

 Predicted Y: ypred = [0.6416314330095221, 0.6417874373639705...
 residues = [-0.13182717628179608, -0.01592993507117535...

 ----------------------------------------------- 


julia> f.(rand(10))
10-element Vector{Float64}:
 0.8761239348448231
 0.9115358893542463
 0.9121562305431836
 0.8919530945018805
 ⋮
 0.81693749334824
 0.9622975666245418
 0.9753695182250022
EasyFit.Linear - Method
(fit::Linear)(x::Real)

Calling the fitted estimator on a new sample returns the predicted value at that point. To compute predictions for multiple new data points, use broadcasting.

Examples

julia> x = sort(rand(10)) ; y = sort(rand(10));

julia> f = fitlinear(x,y)

 ------------------- Linear Fit ------------- 

 Equation: y = ax + b 

 With: a = 0.7492056732121763
       b = 0.19051493263850805

 Pearson correlation coefficient, R = 0.9880617647915537
 Average square residue = 0.0011903365407044974

 Predicted Y: ypred = [0.19131831893286483, 0.2588265305624418...
 residues = [-0.015828020422002875, -0.0503384398812427...

 -------------------------------------------- 


julia> f.(rand(10))
10-element Vector{Float64}:
 0.318112972601176
 0.5541324942065607
 0.2684448170646049
 0.2448341076856998
 0.19914167794590798
 0.7043365726554222
 0.47993606210138606
 0.6726328561177188
 0.3094063592157996
 0.40113908380656205
EasyFit.MultipleExponential - Method
(fit::MultipleExponential)(x::Real)

Calling the fitted estimator on a new sample returns the predicted value at that point. To compute predictions for multiple new data points, use broadcasting.

Examples

julia> x = sort(rand(10)); y = rand()*exp.(sort(rand(10)));

julia> f = fitexp(x,y,l=lower(b=[0.,0.]),n=2)

 -------- Multiple-exponential fit ------------- 

 Equation: y = sum(a[i] exp(-x/b[i]) for i in 1:2) + c 

 With: a = [70.52352486967449, -71.7586710966385]
       b = [0.4660083805047125, 0.4780084734705192]
       c = 1.8482858404652986

 Pearson correlation coefficient, R = 0.9933941403034768
 Average square residue = 0.0007163482924343316

 Predicted Y: ypred = [0.5864990290966858, 0.5582571702322241...
 residues = [0.003833387360068774, -0.026186725644245512...

 ----------------------------------------------- 


julia> f.(rand(10))
10-element Vector{Float64}:
 0.7484563114305345
 0.9716494335236276
 0.5700590449540968
 1.2075688838563046
 1.2280682982441256
 0.5361263670282497
 0.6277801581255005
 0.7234067850308292
 0.5698309060534725
 0.5441089815268014
EasyFit.Quadratic - Method
(fit::Quadratic)(x::Real)

Calling the fitted estimator on a new sample returns the predicted value at that point. To compute predictions for multiple new data points, use broadcasting.

Examples

julia> x = sort(rand(10)); y = x.^2 .+ rand(10);

julia> f = fitquad(x,y)

 ------------------- Quadratic Fit ------------- 

 Equation: y = ax^2 + bx + c 

 With: a = 3.0661527272135043
       b = -1.2832262361743607
       c = 0.3650565332863989

 Pearson correlation coefficient, R = 0.8641384358642901
 Average square residue = 0.06365720683818799

 Predicted Y: ypred = [0.2860912283436672, 0.2607175680542409...
 residues = [0.11956927540814413, -0.26398094542690925...

 ----------------------------------------------- 


julia> f.(rand(10))
10-element Vector{Float64}:
 1.7769193832294496
 0.2489307162532048
 0.33127545070267345
 0.4422093927705098
 0.2974933484569105
 0.6299254836558978
 0.24575582331233475
 0.9185494116516484
 1.4615291776107249
 1.600204246377446
EasyFit.SingleExponential - Method
(fit::SingleExponential)(x::Real)

Calling the fitted estimator on a new sample returns the predicted value at that point. To compute predictions for multiple new data points, use broadcasting.

Examples

julia> x = sort(rand(10)); y = rand()*exp.(sort(rand(10)));

julia> f = fitexp(x,y,l=lower(b=0.),u=upper(b=10.),c=5.)

 ------------ Single Exponential fit ----------- 

 Equation: y = a exp(-x/b) + c

 With: a = -4.663077750813696
       b = 3.503891711388752
       c = 5.0

 Pearson correlation coefficient, R = 0.8451556303228667
 Average square residue = 0.018962968678623245

 Predicted Y: ypred = [0.5349351918795522, 0.8336327629821874...
 residues = [-0.1770144709939867, 0.026331423737706694...

 ----------------------------------------------- 


julia> f.(rand(10))
10-element Vector{Float64}:
 1.3311695732688578
 0.3472845242931859
 1.4590115873141651
 0.3763535792927408
 0.5756106398622487
 1.423007971271891
 0.8893556848381099
 0.7803804518815749
 0.9734992788718548
 0.5963345544654599
EasyFit.fitcubic - Method
fitcubic(x,y)

Obtains the cubic polynomial fit: $y = ax^3 + bx^2 + cx + d$

Optional lower and upper bounds for a, b, and c can be provided using, for example:

fitcubic(x,y, l=lower(b=0.), u=upper(a=5.))

and d can be set to a constant with, for example:

fitcubic(x,y,d=5.)
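
For instance, a bound and a fixed d can be combined in a single call. The following is a minimal sketch on synthetic data, assuming the l, u, and d options above can be passed together:

x = sort(rand(20)); y = x.^3 .+ 0.1 .* rand(20)   # synthetic data
fit = fitcubic(x, y, l=lower(a=0.), d=0.)         # non-negative a, intercept fixed at 0
fit.(0:0.1:1)                                     # evaluate the fit on a grid via broadcasting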

Examples

julia>  x = sort(rand(10)); y = x.^3 .+ rand(10);

julia> fit = fitcubic(x,y)

 ------------------- Cubic Fit ----------------- 

 Equation: y = ax^3 + bx^2 + cx + d 

 With: a = 12.637633791600711
       b = -19.648194970330454
       c = 10.018385827387148
       d = -0.8740912356800155

 Pearson correlation coefficient, R = 0.7831345513024988
 Average square residue = 0.0781543071776559

 Predicted Y: ypred = [0.24999805379642903, 0.3001612840610868...
 residues = [0.2238223147726266, 0.12656861200050698...

 ----------------------------------------------- 
EasyFit.fitdensity - Method
fitdensity(x; step, norm)

Obtains the density function from sampled data.

Use step=(Float64) to control the bin step. Use norm=(0 or 1) to choose whether the output is the number of data points, or the probability of finding a data point, within x ± step/2.

By default, norm=1 (probability) and the step is (xmax-xmin)/100.
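
For example, a wider bin and a raw count output can be requested as sketched below, assuming step and norm are passed as keyword arguments as in the signature above:

x = randn(1000)
d = fitdensity(x; step=0.5, norm=0)   # number of data points within x ± 0.25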

Examples

julia> x = randn(1000)

julia> d = fitdensity(x)

 ------------------- Density -------------

  `d` contains the number of data points within x ± 0.028325979340904375

 -----------------------------------------
EasyFit.fitexponential - Method
fitexponential(x,y; n::Int=1)
fitexp(x,y; n::Int=1)

Obtains single- or multi-exponential fits: $y = a*exp(-x/b) + c$ or $y = sum(a[i]*exp(-x/b[i]) for i in 1:N) + c$

Lower and upper bounds can optionally be set, and the intercept c can be set to a constant:

For single exponentials all parameters are scalars:

fitexp(x,y,l=lower(b=0.),u=upper(b=10.),c=5.)

For multiple exponentials, the a and b bounds must be vectors of length N.

fitexp(x,y,n=2,l=lower(a=[0.,0.]),u=upper(b=[-100.,-5.]))

Examples

julia> x = sort(rand(10)); y = rand()*exp.(sort(rand(10)));

julia> fit = fitexp(x,y,l=lower(b=[0.,0.]),n=2)

 -------- Multiple-exponential fit -------------

 Equation: y = sum(a[i] exp(-x/b[i]) for i in 1:2) + c

 With: a = [6.60693727987886e-13, 0.6249999999993409]
       b = [0.02688289803014393, 0.5000000000002596]
       c = 0.37499999999999856

 Pearson correlation coefficient, R = 1.0
 Average square residue = 1.1639900380979497e-29

 Predicted Y: ypred = [1.0000000000000002, 0.4595845520228801...
 residues = [2.220446049250313e-16, -2.831068712794149e-15...

 -----------------------------------------------
EasyFit.fitlinear - Method
fitlinear(x,y)

Obtains the linear fit: $y = a*x + b$

Optional lower and upper bounds for a, and a constant b, can be provided using, for example:

fitlinear(x,y, l=lower(a=0.), b=3)
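
A minimal sketch on synthetic data, using the call form above and broadcasting to evaluate the result:

x = sort(rand(20)); y = 2 .* x .+ 3 .+ 0.1 .* rand(20)   # synthetic data
fit = fitlinear(x, y, l=lower(a=0.), b=3)                # non-negative slope, intercept fixed at 3
fit.(x)                                                  # predictions at the original points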

Examples

julia> x = sort(rand(10)) ; y = sort(rand(10));

julia> fit = fitlinear(x,y)

------------------- Linear Fit ------------- 

Equation: y = ax + b 

With: a = 1.0448783208110997
      b = 0.18817627115683894

Pearson correlation coefficient, R = 0.8818586822210751
Average absolute residue = 0.14274752107157443

Predicted Y: ypred = [0.1987357699444139, 0.32264343301109627...
residues = [0.1613987313816987, 0.22309410865095275...

-------------------------------------------- 
EasyFit.fitndgr - Method
fitndgr(x,y,n)

Obtains the polynomial fit of degree n: y = p[n+1]*x^n + p[n]*x^(n-1) + ... + p[2]*x + p[1]

Optional lower and upper bounds for p[i] can be provided using two arrays of length n+1, for example:

fitndgr(x,y,4; l=fill(-1.0, 5), u=[5.0, 7.0, 8.0, 7.0, 5.0])
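
A minimal usage sketch on synthetic data, assuming the signature and bound arrays described above:

x = sort(rand(20)); y = x.^4 .- x .+ 0.1 .* rand(20)       # synthetic data
fit = fitndgr(x, y, 4; l=fill(-10.0, 5), u=fill(10.0, 5))  # degree-4 fit, bounds on p[1]...p[5]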
EasyFit.fitquadratic - Method
fitquad(x,y)
fitquadratic(x,y)

Obtains the quadratic fit: $y = a*x^2 + b*x + c$

Optional lower and upper bounds for a and b can be provided using, for example:

fitquad(x,y, lower(b=0.), upper(a=5.,b=7.) )

and the intercept c can be fixed with

fitquad(x,y, c=3.)
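
A minimal sketch on synthetic data, using the two call forms shown above:

x = sort(rand(20)); y = x.^2 .+ 0.1 .* rand(20)   # synthetic data
fitquad(x, y, lower(a=0.))   # bound only the leading coefficient
fitquad(x, y, c=0.)          # fix the intercept instead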

Examples

julia>  x = sort(rand(10)); y = x.^2 .+ rand(10);

julia> fit = fitquad(x,y)

 ------------------- Quadratic Fit ------------- 

 Equation: y = ax^2 + bx + c 

 With: a = 1.9829681649993036
       b = -1.24215737650827
       c = 0.9410816080128867

 Pearson correlation coefficient, R = 0.8452759310204063
 Average square residue = 0.039620067833833005

 Predicted Y: ypred = [0.778952191090992, 0.7759243614999851...
 residues = [0.0550252612868799, -0.15207394277809727...

 ----------------------------------------------- 
EasyFit.movingaverage - Method
movingaverage(x,n)
movavg(x,n)

Computes the moving average of x[i] over the window i ± (n-1)/2. If n is even, it is replaced by n + 1.
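
For example, because an even n is promoted to n + 1, the two calls below are expected to use the same window size (a sketch; output omitted):

x = rand(100);
movavg(x, 4)   # even n: promoted internally, averages over i ± 2
movavg(x, 5)   # same window as the call above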

Example

julia> x = rand(10);

julia> movingaverage(x,3)

 ------------------- Moving Average ----------

 Number of points averaged: 3 (± 1 points)

 Pearson correlation coefficient, R = 0.3532754137104625

 Averaged X: x = [0.5807828672543551, 0.40496733381946143...
 residues = [-0.22791917753944557, 0.4037347109743393...

 --------------------------------------------
EasyFit.@FitMethods - Macro
FitMethods(f :: Func)

Generates, for each fit function, the methods that allow calls in which only upper or lower bounds are defined, in any order.
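
For example, the generated methods accept lower and/or upper bound structs positionally, in either order. This is a sketch based on the description above and on the positional form shown in the fitquad entry:

x = sort(rand(20)); y = sort(rand(20));
fitlinear(x, y, lower(a=0.))                # only a lower bound
fitlinear(x, y, upper(a=2.), lower(a=0.))   # both bounds, given in either order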