SVMs
In SPL, the svm() function provides five algorithms, C-SVC, nu-SVC, one-class SVM, epsilon-SVR and nu-SVR, which can be used to solve binary classification problems and regression problems. For details, see the function reference documentation: svm() - Functions | esProc Function Reference (raqsoft.com)
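The numeric svm_type and kernel codes used in the examples below appear to follow the LIBSVM conventions (svm_type=3 selects epsilon-SVR in the regression example and svm_type=0 selects C-SVC in the classification example). As a quick reference, a small Python sketch of those code-to-algorithm mappings:

```python
# Numeric codes for svm() parameters, following LIBSVM conventions.
# The SPL examples below use svm_type=3 (regression) and svm_type=0
# (classification).
SVM_TYPES = {
    0: "C-SVC",         # C-support vector classification
    1: "nu-SVC",        # nu-support vector classification
    2: "one-class SVM",
    3: "epsilon-SVR",   # epsilon-support vector regression
    4: "nu-SVR",        # nu-support vector regression
}
KERNELS = {
    0: "linear",
    1: "polynomial",
    2: "RBF",
    3: "sigmoid",
}
print(SVM_TYPES[3], "with", KERNELS[0], "kernel")  # the regression example's settings
```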
For example, a regression problem.
Continue using the sample data from the previous section to build a model and make predictions with svm().
| | A |
|---|---|
| 1 | [[1.1,1.1],[1.4,1.5],[1.7,1.8],[1.7,1.7],[1.8,1.9],[1.8,1.8],[1.9,1.8],[2.0,2.1],[2.3,2.4],[2.4,2.5]] |
| 2 | [16.3,16.8,19.2,18,19.5,20.9,21.1,20.9,20.3,22] |
| 3 | [[2.4,2.4]] |
| 4 | >svm_type=3,kernel=0,degree=3,cache_size=100,eps=0.001,C=1,gamma=0.25,coef=0,nu=0.5,p=0.1,nr_weight=1,shrinking=1,probability=0 |
| 5 | =[svm_type,kernel,degree,cache_size,eps,C,gamma,coef,nu,p,nr_weight,shrinking,probability] |
| 6 | =svm(A1,A2,A5) |
| 7 | =svm(A6,A3) |
| 8 | =svm(A1,A2,A5,A3) |
A1 Training set x
A2 Training set y
A3 Data to be predicted
A4 Set the svm parameters; svm_type=3 selects the epsilon-SVR regression algorithm
A5 Collect the parameter values from A4 into a sequence to pass to svm()
A6 Train on the training data with the given parameters and return the model R. The members of R are, in order: the coefficients of the support vectors in the decision function, the labels of each class, the number of classes, the total number of support vectors, the number of support vectors per class, the constants in the decision function, the support vectors, and the training parameters
A7 Use the model R from A6 to predict on the prediction data and return the prediction result
A8 Model and predict in one step, returning the prediction results directly; equivalent to A6 followed by A7
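Cells A4 and A5 show svm()'s calling convention: the thirteen parameters are assigned by name and then collected into a sequence in a fixed order. A Python sketch of that packing step only (the names and order mirror cell A4; this does not emulate svm() itself):

```python
# Pack the svm() parameters into an ordered list, mirroring cells A4/A5.
# Names, values, and order are taken from the SPL example; only the
# positional packing is illustrated, not the actual training call.
params = {
    "svm_type": 3, "kernel": 0, "degree": 3, "cache_size": 100,
    "eps": 0.001, "C": 1, "gamma": 0.25, "coef": 0, "nu": 0.5,
    "p": 0.1, "nr_weight": 1, "shrinking": 1, "probability": 0,
}
ORDER = ["svm_type", "kernel", "degree", "cache_size", "eps", "C",
         "gamma", "coef", "nu", "p", "nr_weight", "shrinking", "probability"]
param_seq = [params[name] for name in ORDER]  # what cell A5 builds
print(param_seq)
```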
Another example, a classification problem.
Model with the Titanic data.
Since the data contains missing values and character variables that cannot be modeled directly, it was preprocessed before use. This example uses the preprocessed data directly for the modeling demonstration.
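The preprocessing itself is not shown in the article. A minimal Python sketch of the kind of cleaning described, filling missing numeric values and numerically encoding a character variable (the column names, values, and fill strategy here are illustrative assumptions, not taken from the actual file):

```python
# Hypothetical preprocessing of Titanic-like rows: fill missing ages with
# the column mean and encode the character variable "sex" numerically.
# Column layout and values are made up for illustration.
rows = [
    {"survived": 0, "sex": "male",   "age": 22.0},
    {"survived": 1, "sex": "female", "age": None},   # missing value
    {"survived": 1, "sex": "female", "age": 26.0},
]
ages = [r["age"] for r in rows if r["age"] is not None]
mean_age = sum(ages) / len(ages)
for r in rows:
    if r["age"] is None:
        r["age"] = mean_age                            # impute missing value
    r["sex"] = {"male": 0, "female": 1}[r["sex"]]      # encode character variable
print(rows)
```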
| | A |
|---|---|
| 1 | =file("D://titanic_svm.csv").import@tc() |
| 2 | =A1.array().to(2:) |
| 3 | =A2.(~.to(2:)).to(800) |
| 4 | =A2.(~(1)).to(800) |
| 5 | =A2.(~.to(2:)).to(801:) |
| 6 | >svm_type=0,kernel=2,degree=3,cache_size=100,eps=0.001,C=1,gamma=0.25,coef=0,nu=0.5,p=0.1,nr_weight=1,shrinking=1,probability=0 |
| 7 | =[svm_type,kernel,degree,cache_size,eps,C,gamma,coef,nu,p,nr_weight,shrinking,probability] |
| 8 | =svm(A3,A4,A7) |
| 9 | =svm(A8,A5) |
| 10 | =svm(A3,A4,A7,A5) |
A1 Import the Titanic data as a table sequence
A2 Convert the table sequence to vector form and drop the title row
A3 Take the independent variables of the first 800 samples as training set X
A4 Take the target variable of the first 800 samples as training set Y
A5 Take the independent variables of the samples after the first 800 as the prediction set
A6 Set the svm parameters; svm_type=0 selects the C-SVC classification algorithm
A7 Collect the parameter values from A6 into a sequence to pass to svm()
A8 Train on the training set with the parameters in A7 and return the model R
A9 Use the model R from A8 to predict on the prediction data and return the prediction result
A10 Model and predict in one step, returning the prediction results directly; equivalent to A8 followed by A9
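Cells A2 through A5 drop the title row, split off the first column as the target, and cut the rows at position 800. A Python sketch of the same slicing on synthetic rows (the 800-row split point and column layout follow the SPL cells; the data itself is made up):

```python
# Mirror the slicing in cells A2-A5: drop the title row, take column 1 as
# the target y and the remaining columns as X, and split at row 800.
# The rows here are synthetic; only the slicing logic follows the article.
header = ["survived", "f1", "f2"]
data = [[i % 2, float(i), float(2 * i)] for i in range(1000)]  # fake samples
rows = [header] + data

body = rows[1:]                          # A2: drop the title row
train_x = [r[1:] for r in body[:800]]    # A3: first 800 rows, columns 2+
train_y = [r[0] for r in body[:800]]     # A4: first 800 rows, column 1
pred_x  = [r[1:] for r in body[800:]]    # A5: remaining rows, columns 2+
print(len(train_x), len(train_y), len(pred_x))
```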
SPL Official Website 👉 https://www.scudata.com
SPL Feedback and Help 👉 https://www.reddit.com/r/esProcSPL
SPL Learning Material 👉 https://c.scudata.com
SPL Source Code and Package 👉 https://github.com/SPLWare/esProc
Discord 👉 https://discord.gg/cFTcUNs7
Youtube 👉 https://www.youtube.com/@esProc_SPL