Machine Learning in Action series: full table of contents
All code in this article was run in PyCharm.
The companion code resources for this article have been uploaded.
SVM Classification in Practice 1: Simple SVM classification
SVM Classification in Practice 2: Linear SVM
SVM Classification in Practice 3: Nonlinear SVM
4. Nonlinear SVM
4.1 Creating nonlinear data
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons

# Generate a nonlinearly separable "two moons" dataset
X, y = make_moons(n_samples=100, noise=0.15, random_state=42)

def plot_dataset(X, y, axes):
    # Class 0 as blue squares, class 1 as green triangles
    plt.plot(X[:, 0][y==0], X[:, 1][y==0], "bs")
    plt.plot(X[:, 0][y==1], X[:, 1][y==1], "g^")
    plt.axis(axes)
    plt.grid(True, which='both')
    plt.xlabel(r"$x_1$", fontsize=20)
    plt.ylabel(r"$x_2$", fontsize=20, rotation=0)

plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])
plt.show()
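For orientation, here is a small snippet (not part of the original code) that inspects what make_moons returns: two interleaving half-circles with, for n_samples=100, a balanced 50/50 class split.

import numpy as np

print(X.shape)            # (100, 2): 100 samples with two features
print(np.bincount(y))     # [50 50]: the two classes are balanced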
4.2 Classification and prediction
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import LinearSVC

# Expand the inputs with degree-3 polynomial features, standardize, then fit a linear SVM
polynomial_svm_clf = Pipeline([
    ("poly_features", PolynomialFeatures(degree=3)),
    ("scaler", StandardScaler()),
    ("svm_clf", LinearSVC(C=10, loss="hinge"))
])
polynomial_svm_clf.fit(X, y)
- PolynomialFeatures is used as a preprocessing step; it adds polynomial combinations of the inputs and therefore raises the dimensionality of the data (see the sketch after this list).
- polynomial_svm_clf.fit(X, y) trains the whole pipeline on the current data, passing in X and y.
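To make the first point concrete, here is an illustrative snippet (not from the original article) that applies the same PolynomialFeatures(degree=3) step on its own. With the default include_bias=True, the two input features become ten polynomial terms; get_feature_names_out assumes scikit-learn 1.0 or later.

from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=3)      # same setting as in the pipeline above
X_poly = poly.fit_transform(X)

print(X.shape)                           # (100, 2): the original two features
print(X_poly.shape)                      # (100, 10): bias, x1, x2, x1^2, x1*x2, ..., x2^3
print(poly.get_feature_names_out())      # names of the generated polynomial terms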
import numpy as np

def plot_predictions(clf, axes):
    # Evaluate the classifier on a 100x100 grid and shade the predicted regions
    x0s = np.linspace(axes[0], axes[1], 100)
    x1s = np.linspace(axes[2], axes[3], 100)
    x0, x1 = np.meshgrid(x0s, x1s)
    X_grid = np.c_[x0.ravel(), x1.ravel()]
    y_pred = clf.predict(X_grid).reshape(x0.shape)
    plt.contourf(x0, x1, y_pred, cmap=plt.cm.brg, alpha=0.2)

plot_predictions(polynomial_svm_clf, [-1.5, 2.5, -1, 1.5])
plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])
plt.show()
5. Kernel functions
5.1 Polynomial kernel
from sklearn.svm import SVC

# Polynomial kernel, low degree: d=3, coef0=1
poly_kernel_svm_clf = Pipeline([
    ("scaler", StandardScaler()),
    ("svm_clf", SVC(kernel="poly", degree=3, coef0=1, C=5))
])
poly_kernel_svm_clf.fit(X, y)

# Polynomial kernel, high degree: d=10, coef0=100
poly100_kernel_svm_clf = Pipeline([
    ("scaler", StandardScaler()),
    ("svm_clf", SVC(kernel="poly", degree=10, coef0=100, C=5))
])
poly100_kernel_svm_clf.fit(X, y)
plt.figure(figsize=(11, 4))
plt.subplot(121)
plot_predictions(poly_kernel_svm_clf, [-1.5, 2.5, -1, 1.5])
plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])
plt.title(r"$d=3, r=1, C=5$", fontsize=18)
plt.subplot(122)
plot_predictions(poly100_kernel_svm_clf, [-1.5, 2.5, -1, 1.5])
plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])
plt.title(r"$d=10, r=100, C=5$", fontsize=18)
plt.show()
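As a quick follow-up (not part of the original code), Pipeline.score reports training-set accuracy, which gives a rough numeric complement to the two plots; on its own it cannot detect overfitting, for which a held-out set or cross-validation would be needed.

print(poly_kernel_svm_clf.score(X, y))      # training accuracy of the d=3, r=1 model
print(poly100_kernel_svm_clf.score(X, y))   # training accuracy of the d=10, r=100 model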
5.2 Gaussian (RBF) kernel
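Below is a minimal sketch of a Gaussian (RBF) kernel SVM on the same moons data, reusing the plotting helpers defined above; the gamma=5 and C=1 values are illustrative assumptions, not settings from the original article.

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative hyperparameters (assumed, not from the article)
rbf_kernel_svm_clf = Pipeline([
    ("scaler", StandardScaler()),
    ("svm_clf", SVC(kernel="rbf", gamma=5, C=1))
])
rbf_kernel_svm_clf.fit(X, y)

plot_predictions(rbf_kernel_svm_clf, [-1.5, 2.5, -1, 1.5])
plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])
plt.show()

A larger gamma makes each training point's influence more local, so the decision boundary bends more tightly around individual samples; a smaller gamma gives a smoother boundary.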
This concludes Machine Learning in Action, tutorial 8: SVM Classification in Practice 3: Nonlinear SVM (iris dataset / soft margin / linear SVM / nonlinear SVM / the scikit-learn framework), a hands-on project with a code walkthrough.