IndexError: list index out of range
I got this error. The traceback shows:

Run id: P0W5X0
Log directory: /tmp/tflearn_logs/
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/local/Cellar/python@2/2.7.15/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/local/Cellar/python@2/2.7.15/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/Users/xxx/anaconda/xxx/lib/python2.7/site-packages/tflearn/data_flow.py", line 201, in fill_batch_ids_queue
    ids = self.next_batch_ids()
  File "/Users/xxx/anaconda/xxx/lib/python2.7/site-packages/tflearn/data_flow.py", line 215, in next_batch_ids
    batch_start, batch_end = self.batches[self.batch_index]
IndexError: list index out of range
# coding: utf-8
import numpy as np  # missing in the original code; np is used below
import tensorflow as tf
import tflearn
from tflearn.layers.core import input_data, dropout, fully_connected
from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.normalization import local_response_normalization
from tflearn.layers.estimator import regression

tf.reset_default_graph()
net = input_data(shape=[None, 20000, 4, 42])
net = conv_2d(net, 4, 16, activation='relu')
net = max_pool_2d(net, 1)
net = tflearn.activations.relu(net)
net = dropout(net, 0.5)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net, optimizer='adam', learning_rate=0.5,
                         loss='categorical_crossentropy')

model = tflearn.DNN(net)
model.fit(np.array(trainDataSet).reshape(1, 20000, 4, 42), np.array(trainLabel),
          n_epoch=400, batch_size=32, validation_set=0.1, show_metric=True)

pred = np.array(model.predict(np.array(testDataSet).reshape(1, 20000, 4, 42)).argmax(axis=1))
label = np.array(testLabel).argmax(axis=0)
accuracy = np.mean(pred == label, axis=0)
print(accuracy)
This is the code I wrote. Why does the error occur, and how should I fix the code?
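For reference, the traceback points at a lookup into a list of (start, end) batch ranges inside tflearn's data_flow.py. One plausible cause, given that reshape(1, 20000, 4, 42) turns the whole dataset into a single sample and validation_set=0.1 then splits that sample away, is that the training set ends up empty, so the list of batch ranges is empty and any index raises. The sketch below is a hypothetical reconstruction of that failure mode, not tflearn's actual code; num_train_samples and batch_size are assumed values:

```python
# Hypothetical sketch of how an empty training set produces
# "IndexError: list index out of range" in a batch-range lookup.
num_train_samples = 0   # assumption: nothing left after the validation split
batch_size = 32

# Build (start, end) ranges the way a batch scheduler typically would.
batches = [(s, min(s + batch_size, num_train_samples))
           for s in range(0, num_train_samples, batch_size)]

print(batches)  # [] -- no samples means no batches

batch_index = 0
try:
    batch_start, batch_end = batches[batch_index]
except IndexError as e:
    print(e)    # list index out of range
```

If this diagnosis applies, reshaping the training data so that the first axis is the number of samples (rather than 1), or lowering batch_size, would avoid the empty batch list.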
2018/11/22 11:47