5.9 Program Example: Nonlinear Classification - Machine Learning Notes - Stanford, Andrew Ng
Program Example: Nonlinear Classification
Next, we use a Gaussian (RBF) kernel to handle a nonlinearly separable problem. Since this dataset is larger, we train with the better-performing full version of the SMO algorithm:
```python
# coding: utf8
# svm/test_non_linear.py
import smo
import numpy as np
from scipy.io import loadmat
import matplotlib.pyplot as plt

data = loadmat('data/ex6data2.mat')

X = np.mat(data['X'])
y = np.mat(data['y'], dtype=float)
y[y == 0] = -1

m, n = X.shape
C = 1.0
tol = 1e-3
maxIter = 5
kernel = smo.rbfKernel(0.1)

trainSimple, train, predict = smo.getSmo(X, y, C, tol, maxIter, kernel=kernel)
alphas, w, b, supportVectorsIndex, supportVectors, iterCount = train()
print(supportVectorsIndex)
print(len(supportVectorsIndex))
print('iterCount: %d' % iterCount)

predictions = predict(X, alphas, b, supportVectorsIndex, supportVectors)
# A prediction is wrong when its sign disagrees with the label,
# i.e. when prediction * label < 0
errorCount = (np.multiply(predictions, y).A < 0).sum()
print(errorCount)
print('error rate: %.2f' % (float(errorCount) / m))

# Plot the data points
x1Min = X[:, 0].min()
x1Max = X[:, 0].max()
x2Min = X[:, 1].min()
x2Max = X[:, 1].max()
plt.title('C=%.1f' % C)
plt.xlabel('X1')
plt.ylabel('X2')
plt.xlim(x1Min, x1Max)
plt.ylim(x2Min, x2Max)

for i in range(m):
    x = X[i].A[0]
    if y[i] == 1:
        # Positive samples as stars; support vectors highlighted in red
        color = 'black'
        if i in supportVectorsIndex:
            color = 'red'
        plt.scatter(x[0], x[1], marker='*', color=color, s=50)
    else:
        # Negative samples as circles; support vectors highlighted in red
        color = 'green'
        if i in supportVectorsIndex:
            color = 'red'
        plt.scatter(x[0], x[1], marker='o', color=color, s=50)

# Plot the decision boundary
xx1, xx2 = np.meshgrid(
    np.linspace(x1Min, x1Max, 100),
    np.linspace(x2Min, x2Max, 100)
)
predictX = np.mat(np.c_[xx1.ravel(), xx2.ravel()])
predictions = predict(predictX, alphas, b, supportVectorsIndex, supportVectors)
predictions = predictions.reshape(xx1.shape)
plt.contour(xx1, xx2, predictions, [0.5], linewidths=5)
plt.show()
```

The test results are as follows:
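The script above relies on a hand-rolled `smo` module that is not shown here. For reference, a Gaussian kernel factory like the `smo.rbfKernel(0.1)` it calls might look as follows; this is a minimal sketch, and the bandwidth parameter name `delta` and the `(A, x)` calling convention are assumptions about that module:

```python
import numpy as np

def rbfKernel(delta):
    """Return a Gaussian (RBF) kernel function with bandwidth delta.

    The returned function computes, for each row A_i of A:
        K_i = exp(-||A_i - x||^2 / (2 * delta^2))
    """
    gamma = 1.0 / (2 * delta ** 2)

    def kernel(A, x):
        # A: (m, n) array of samples; x: (n,) or (1, n) sample
        sqDist = np.sum(np.square(np.asarray(A) - np.asarray(x)), axis=1)
        return np.exp(-gamma * sqDist)

    return kernel
```

A small bandwidth such as 0.1 makes the kernel very local: the similarity is 1 for identical points and falls off to nearly 0 for points even a unit apart, which is what lets the SVM fit the tightly curved boundary in this dataset.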