Keras - Variational Autoencoder NaN loss

I'm trying to use the implementation of Variational Autoencoder that I found among the Keras examples (https://github.com/keras-team/keras/blob/master/examples/variational_autoencoder.py).



I just refactored the code in order to use it more easily from a Jupyter notebook (my code: https://github.com/matbell/Autoencoders/blob/master/models/vae.py).



However, when I try to fit the model on my data I get the following output:


Autoencoders/models/vae.py:69: UserWarning: Output "dense_5" missing from loss dictionary. We assume this was done on purpose, and we will not be expecting any data to be passed to "dense_5" during training.
self.vae.compile(optimizer='rmsprop')

Train on 15474 samples, validate on 3869 samples
Epoch 1/50
15474/15474 [==============================] - 1s 76us/step - loss: nan - val_loss: nan
Epoch 2/50
15474/15474 [==============================] - 1s 65us/step - loss: nan - val_loss: nan
Epoch 3/50
15474/15474 [==============================] - 1s 69us/step - loss: nan - val_loss: nan
Epoch 4/50
15474/15474 [==============================] - 1s 62us/step - loss: nan - val_loss: nan



and the loss remains NaN for all the training epochs.



I'm not an expert in Deep Learning and Neural Networks, so maybe I'm missing something obvious...



This is the input data, where data and labels are two pandas.DataFrame objects:


In: data.shape
Out: (19343, 87)

In: labels.shape
Out: (19343, 1)



And this is how I use the Vae class (from my code) in a Jupyter notebook:
from sklearn.model_selection import train_test_split

INPUT_SIZE = len(data.columns)
X_train, X_test, y_train, y_test = train_test_split(data, labels, test_size=0.2)

vae = Vae(INPUT_SIZE, intermediate_dim=32)
vae.fit(X_train, X_test)



Thanks for any help!





Can you add the changes you made to the original code, so that we can trace the error more easily?
– Shashi Tunga
Apr 3 at 18:07





@ShashiTunga I posted the links to both the original code and my "modification". As you can see, I didn't make any changes to the code; I just reformatted it as a Python class with three main methods: init(), fit(), and encode().
– Mattia Campana
Apr 3 at 18:19





Which version of Keras did you use?
– kvorobiev
Apr 7 at 16:54




3 Answers



Your input consists of NaN values, which is why you are seeing NaN in the output. You can count the NaNs in your numpy array as follows:


np.count_nonzero(np.isnan(data))



If the NaNs are not relevant, just remove them from your training data, or map them to a specific constant (like 0 or -1); that should take care of the problem.
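For example, a minimal sketch of both options, assuming data is the pandas.DataFrame from the question:

# assuming `data` is the pandas.DataFrame from the question
print(data.isna().sum())   # count NaNs per column to see where they come from

clean = data.dropna()      # option 1: drop every row that contains a NaN
filled = data.fillna(0)    # option 2: replace NaNs with a constant (e.g. 0)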



I had a similar problem because some of my inputs contained NaNs. If even a few examples contain a NaN, all the weights and losses will become NaN too. Check the content of the data again!



You might want to initialize your log_var dense layer to zeros. I was having the same problem myself (with slightly different code, but effectively doing the same thing), and it turned out that, however small the variance weights were initialized, they would explode within just a few rounds of SGD.



The random correlations between epsilon ~ N(0,1) and the reconstruction error are enough to gently move the weights away from zero.



Edit: also, the exponential wrapping the variance really contributes to exploding gradients. Setting the initial weights to zero gives an initial variance of one, because of the exponential. Initializing them to a low negative value, while giving an initial close-to-zero variance, makes the gradient enormous on the very first runs. Zero gives me the best results.
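A minimal sketch of that initialization, assuming the variable names from the linked Keras example (original_dim, intermediate_dim, latent_dim); the only change suggested here is the kernel_initializer='zeros' on the log-variance layer:

from keras.layers import Input, Dense, Lambda
from keras import backend as K

x = Input(shape=(original_dim,))
h = Dense(intermediate_dim, activation='relu')(x)

z_mean = Dense(latent_dim)(h)
# zero-initialized weights: z_log_var starts at 0, so exp(z_log_var) starts at 1
z_log_var = Dense(latent_dim, kernel_initializer='zeros')(h)

def sampling(args):
    z_mean, z_log_var = args
    epsilon = K.random_normal(shape=(K.shape(z_mean)[0], latent_dim))
    return z_mean + K.exp(0.5 * z_log_var) * epsilon

z = Lambda(sampling)([z_mean, z_log_var])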





