TensorFlow Dataset API causes graph size to explode


I'm hitting a problem when training with a large data set.

I'm using the Dataset API like so:

self._dataset = tf.contrib.data.Dataset.from_tensor_slices((self._images_list, self._labels_list))
self._dataset = self._dataset.map(self.load_image)
self._dataset = self._dataset.batch(batch_size)
self._dataset = self._dataset.shuffle(buffer_size=shuffle_buffer_size)
self._dataset = self._dataset.repeat()
self._iterator = self._dataset.make_one_shot_iterator()

If I train with a small amount of data, everything works well. If I use all of my data, TensorFlow crashes with this error: ValueError: GraphDef cannot be larger than 2GB.

It seems that TensorFlow tries to load all of the data into the graph instead of loading only the data it needs... but I'm not sure.
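
One way to check this hypothesis (just a quick diagnostic sketch, not part of my training code) is to measure the serialized GraphDef after building the input pipeline; the 2GB figure in the error is the protobuf size limit:

import tensorflow as tf

# ... build the dataset / iterator as above ...

# ByteSize() is the serialized size of the GraphDef protobuf;
# the hard limit the error refers to is 2 GB.
graph_def = tf.get_default_graph().as_graph_def()
print('GraphDef size: %.1f MB' % (graph_def.ByteSize() / 1e6))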

Any advice would be great!

Update: I found a solution/workaround.

According to this post: Tensorflow Dataset API doubles graph protobuff filesize

I replaced make_one_shot_iterator() with make_initializable_iterator(), and of course called the iterator's initializer after creating the session:

init = tf.global_variables_initializer()
sess.run(init)
sess.run(train_data._iterator.initializer)
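
For reference, the pattern the tf.data guide describes for large in-memory arrays is to feed them through placeholders when running the initializer, so the arrays never get embedded in the GraphDef as constants. A minimal sketch, assuming the images and labels are NumPy arrays (the dummy arrays and names here are illustrative, not from my code):

import numpy as np
import tensorflow as tf

# Dummy in-memory data standing in for the real arrays (illustrative only).
images_array = np.random.rand(1000, 64, 64, 3).astype(np.float32)
labels_array = np.random.randint(0, 10, size=1000).astype(np.int64)

# Placeholders keep the actual arrays out of the GraphDef.
images_ph = tf.placeholder(images_array.dtype, images_array.shape)
labels_ph = tf.placeholder(labels_array.dtype, labels_array.shape)

dataset = tf.contrib.data.Dataset.from_tensor_slices((images_ph, labels_ph))
dataset = dataset.batch(32).repeat()
iterator = dataset.make_initializable_iterator()
next_batch = iterator.get_next()

with tf.Session() as sess:
    # The arrays are fed at initialization time instead of being
    # baked into the graph as constants.
    sess.run(iterator.initializer,
             feed_dict={images_ph: images_array, labels_ph: labels_array})
    batch_images, batch_labels = sess.run(next_batch)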

But I'm leaving the question open, because to me this seems like a workaround, not a solution...

