Week 7
Support Vector Regression
A detailed explanation of kernels can be found in the following references:
http://www.svms.org/regression/SmSc98.pdf
https://alex.smola.org/papers/2004/SmoSch04.pdf
https://alex.smola.org/papers/2003/SmoSch03b.pdf
http://web.mit.edu/6.034/wwwbob/svm-notes-long-08.pdf

from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)  # test_size = fraction of data held out for testing, e.g. 0.2
from sklearn.svm import SVR
svr = SVR(kernel='linear')  # kernel: linear / poly / rbf / sigmoid
svr.fit(X_train, y_train)
svr.predict(X_test)
svr.predict(X_train)

If you wish to use 'rbf' or 'sigmoid' in SVR, it is highly recommended that you standardize the dataset first.
Once you standardize/normalize the data, remember to de-standardize/de-normalize the predicted outputs before calculating the regression metrics (error criteria), so that the errors are computed on the original scale.
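A minimal sketch of this workflow, assuming a StandardScaler for both the inputs and the target and mean squared error as the metric (the toy data, split size, and kernel here are only illustrative):

import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# toy data (placeholder for your own dataset)
X = np.sort(5 * np.random.rand(100, 1), axis=0)
y = np.cos(X).ravel()

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# fit the scalers on the training split only
x_scaler = StandardScaler().fit(X_train)
y_scaler = StandardScaler().fit(y_train.reshape(-1, 1))

svr = SVR(kernel='rbf')
svr.fit(x_scaler.transform(X_train), y_scaler.transform(y_train.reshape(-1, 1)).ravel())

# predict in the standardized space, then de-standardize before computing errors
y_pred_std = svr.predict(x_scaler.transform(X_test))
y_pred = y_scaler.inverse_transform(y_pred_std.reshape(-1, 1)).ravel()

print(mean_squared_error(y_test, y_pred))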
Example (comparing SVR kernels):
import numpy as np
from sklearn.svm import SVR
import matplotlib.pyplot as plt
X = np.sort(5 * np.random.rand(100, 1), axis=0)
y = np.cos(X).ravel()  # flatten to a 1-D target, as expected by SVR
svr_rbf = SVR(kernel='rbf')
svr_lin = SVR(kernel='linear')
svr_poly = SVR(kernel='poly', degree=3)
svr_rbf.fit(X, y)
svr_lin.fit(X, y)
svr_poly.fit(X, y)
y_rbf = svr_rbf.predict(X)
y_lin = svr_lin.predict(X)
y_poly = svr_poly.predict(X)
plt.scatter(X, y, c='k', label='data')
plt.plot(X, y_rbf, c='g', label='RBF model')
plt.plot(X, y_lin, c='r', label='Linear model')
plt.plot(X, y_poly,c='b', label='Polynomial model')
plt.legend()
plt.show()

Multilayer Perceptron (MLP)
Normalize data first
activation: identity/logistic/tanh/relu
solver: lbfgs/sgd/adam
tol: When the loss is not improving by at least tol for two consecutive iterations, convergence is considered to be reached and training stops.
Example
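A minimal sketch, assuming a noisy sine dataset and scikit-learn's MLPRegressor with the 'tanh' activation and 'adam' solver (the dataset and parameter values are illustrative choices, not part of the original notes):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# noisy sine data (assumed dataset, for illustration only)
X = np.sort(5 * np.random.rand(100, 1), axis=0)
y = np.sin(X).ravel() + 0.1 * np.random.randn(100)

# normalize the inputs first, as noted above
X_std = StandardScaler().fit_transform(X)

mlp = MLPRegressor(activation='tanh', solver='adam', max_iter=2000, tol=1e-4)
mlp.fit(X_std, y)
y_pred = mlp.predict(X_std)

plt.scatter(X, y, c='k', label='data')
plt.plot(X, y_pred, c='g', label='MLP (tanh)')
plt.legend()
plt.show()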
KNN
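A minimal sketch of KNN regression, assuming scikit-learn's KNeighborsRegressor with n_neighbors=5 on the same kind of noisy sine data (the dataset and choice of k are illustrative assumptions):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import KNeighborsRegressor

# noisy sine data (assumed dataset, for illustration only)
X = np.sort(5 * np.random.rand(100, 1), axis=0)
y = np.sin(X).ravel() + 0.1 * np.random.randn(100)

knn = KNeighborsRegressor(n_neighbors=5)  # predicts by averaging the 5 nearest training targets
knn.fit(X, y)
y_pred = knn.predict(X)

plt.scatter(X, y, c='k', label='data')
plt.plot(X, y_pred, c='b', label='KNN (k=5)')
plt.legend()
plt.show()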
Exercise
SVR
Create a data set of 200 random inputs between 0 and 5, with the output equal to sin(x) of the input. 'Sort' the inputs in ascending order and add some noise to the output, then do the following:
• SVR with 'rbf' kernel
• SVR with 'linear' kernel
• SVR with 'poly' kernel
• Plot the results for each of them together with the dataset itself.
MLP
For a synthetic dataset containing 100 samples with noisy output, create an MLP regression for TWO activation functions, i.e. 'tanh' and 'relu', and plot each regression separately to compare the results. Hint: use make_regression (from sklearn.datasets) to create the data, as sketched below.
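As a hint for the data generation step only, a minimal sketch assuming sklearn.datasets.make_regression (the parameter values are illustrative):

from sklearn.datasets import make_regression

# 100 samples, 1 feature, noisy output (noise level is an assumption for illustration)
X, y = make_regression(n_samples=100, n_features=1, noise=10.0)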
KNN