Probabilistic Programming with PyMC3: A Neural Network Example (Minibatch Training)
PyMC3 supports minibatch training; see:
https://twiecki.io/blog/2016/06/01/bayesian-deep-learning/
However, running a posterior predictive check (PPC) on the test set hits a batch-size mismatch:
https://github.com/pymc-devs/pymc3/issues/2190
As a workaround, I split the test set into chunks of the training batch size and predict each chunk separately. Accuracy drops noticeably, so I am not yet sure this approach is valid.
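The chunk-and-reassemble scheme just described can be sketched in plain NumPy. Here `predict_batch` is a hypothetical stand-in for the per-batch PPC prediction, not PyMC3 API:

```python
import numpy as np

def predict_batch(x_batch):
    # Hypothetical stand-in for a per-batch PPC prediction;
    # here it just thresholds the first feature.
    return x_batch[:, 0] > 0.0

batch_size = 50
X_test = np.random.randn(500, 2)

# Split the test set into fixed-size chunks, predict each chunk,
# then concatenate the per-chunk predictions back in order.
preds = []
for i in range(0, len(X_test), batch_size):
    preds.extend(predict_batch(X_test[i:i + batch_size]))
preds = np.array(preds)
print(preds.shape)  # (500,)
```

Because the chunks are taken in order and concatenated in order, the reassembled predictions line up element-wise with `Y_test`, which is what the accuracy computation below relies on.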
```python
%matplotlib inline
import theano
import pymc3 as pm
import sklearn
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from warnings import filterwarnings
filterwarnings('ignore')
sns.set_style('white')
from sklearn.preprocessing import scale
# sklearn.cross_validation was removed; train_test_split now lives in model_selection
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_moons

X, Y = make_moons(noise=0.2, random_state=0, n_samples=1000)
X = scale(X)
X = X.astype(float)
Y = Y.astype(float)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=.5)

fig, ax = plt.subplots(figsize=(12, 8))
ax.scatter(X[Y==0, 0], X[Y==0, 1], label='Class 0')
ax.scatter(X[Y==1, 0], X[Y==1, 1], color='r', label='Class 1')
sns.despine(); ax.legend()
ax.set(xlabel='X', ylabel='Y', title='Toy binary classification data set');

def construct_nn(ann_input, ann_output):
    n_hidden = 5
    # Initialize random weights between each layer
    init_1 = np.random.randn(X.shape[1], n_hidden).astype(float)
    init_2 = np.random.randn(n_hidden, n_hidden).astype(float)
    init_out = np.random.randn(n_hidden).astype(float)
    with pm.Model() as neural_network:
        # Weights from input to hidden layer
        weights_in_1 = pm.Normal('w_in_1', 0, sd=1,
                                 shape=(X.shape[1], n_hidden), testval=init_1)
        # Weights from 1st to 2nd layer
        weights_1_2 = pm.Normal('w_1_2', 0, sd=1,
                                shape=(n_hidden, n_hidden), testval=init_2)
        # Weights from hidden layer to output
        weights_2_out = pm.Normal('w_2_out', 0, sd=1,
                                  shape=(n_hidden,), testval=init_out)
        # Build neural network using tanh activation function
        act_1 = pm.math.tanh(pm.math.dot(ann_input, weights_in_1))
        act_2 = pm.math.tanh(pm.math.dot(act_1, weights_1_2))
        act_out = pm.math.sigmoid(pm.math.dot(act_2, weights_2_out))
        # Binary classification -> Bernoulli likelihood
        out = pm.Bernoulli('out', act_out,
                           observed=ann_output,
                           total_size=Y_train.shape[0]  # IMPORTANT for minibatches
                           )
    return neural_network
```

Training uses minibatches of size 50. I experimented with this repeatedly and also consulted https://docs.pymc.io/api/data.html#pymc3.data.Minibatch,
but could not find how to feed multiple batches into the model for sampling. Compared with sampling on the full training set at once, convergence is somewhat worse, but training is much faster.
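As I read the docs, `pm.Minibatch` does not iterate over fixed partitions of the data; each time the graph is evaluated it draws a fresh random subset of rows. A rough NumPy analogue of that behaviour (the function name and sampling-with-replacement detail are my own simplification, not PyMC3 internals):

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = np.arange(500 * 2, dtype=float).reshape(500, 2)

def draw_minibatch(data, batch_size, rng):
    # Each call returns a fresh random subset of rows, mimicking
    # how pm.Minibatch resamples on every graph evaluation.
    idx = rng.integers(0, len(data), size=batch_size)
    return data[idx]

batch = draw_minibatch(X_train, 50, rng)
print(batch.shape)  # (50, 2)
```

This is why there is nothing to "pass multiple batches" into: the resampling happens implicitly on every ADVI update, and `total_size` in the model rescales the likelihood to account for it.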
```python
# Minibatch training
minibatch_x = pm.Minibatch(X_train, batch_size=50)
minibatch_y = pm.Minibatch(Y_train, batch_size=50)

neural_network_minibatch = construct_nn(minibatch_x, minibatch_y)
with neural_network_minibatch:
    inference = pm.ADVI()
    approx = pm.fit(50000, method=inference)

plt.plot(-inference.hist)
plt.ylabel('ELBO')
plt.xlabel('iteration');
```

Now run the PPC to predict on the test set.
```python
# Replace the arrays our NN references with the test data, one batch at a time
trace = approx.sample(draws=5000)  # draw a trace from the fitted approximation

preds = []
for i in range(10):
    minibatch_x.set_value(X_test[i*50:i*50+50])
    minibatch_y.set_value(Y_test[i*50:i*50+50])
    with neural_network_minibatch:
        ppc = pm.sample_ppc(trace, samples=500, progressbar=False)
    pred = ppc['out'].mean(axis=0) > 0.5
    preds.extend(list(pred))

print('Accuracy = {}%'.format((Y_test == np.array(preds)).mean() * 100))
```

This method is a bit crude, but I could not find a better way in the official PyMC3 docs. Take it as a reference only.
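The `ppc['out'].mean(axis=0) > 0.5` step is just a majority vote over the posterior predictive draws. With a small synthetic stand-in for `ppc['out']`:

```python
import numpy as np

# Synthetic stand-in for ppc['out']: 3 posterior predictive
# draws of 0/1 labels for a batch of 4 test points.
ppc_out = np.array([
    [1, 0, 1, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 0],
])  # shape (samples, batch)

# Fraction of draws that predicted class 1, per test point...
p_hat = ppc_out.mean(axis=0)
# ...then threshold at 0.5: the majority vote.
pred = p_hat > 0.5
print(pred)  # [ True False  True False]
```

Each column is one test point; a point is labelled class 1 only if more than half of the sampled networks predicted 1 for it.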
https://docs.pymc.io/notebooks/variational_api_quickstart.html
also contains a Minibatch example, but no PPC example.