Different results for the same configuration


#1

Why am I getting different results for the same configuration? Is there something random in the Adam optimizer?
By the way, data shuffling is off.


#2

Hi, you are getting these results with everything else kept the same, right?
Just re-running the training for the same number of epochs?


#3

Yes Rajat, everything is the same.


#4

OK, then this is mainly because of random weight initialization. Run-to-run variation like this is common when training a NN whose weights are initialized randomly, and you can see that all the graphs still show roughly the same pattern.
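
If you want runs to be reproducible, you can fix the random seeds before building the model. Here is a minimal sketch, assuming a PyTorch setup (the thread does not name the framework); TensorFlow/Keras has equivalent calls such as `tf.random.set_seed`:

```python
import random
import numpy as np
import torch

def set_seed(seed: int = 42) -> None:
    """Seed every RNG that can affect weight initialization and training."""
    random.seed(seed)                  # Python's built-in RNG
    np.random.seed(seed)               # NumPy (data pipelines, init helpers)
    torch.manual_seed(seed)            # CPU and CUDA weight initialization
    torch.cuda.manual_seed_all(seed)   # all GPUs, if any
    # Optional: trade some speed for deterministic cuDNN kernels
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(42)
# ... build the model and start training only after seeding ...
```

Note that even with fixed seeds, some GPU operations are non-deterministic by default, so small differences between runs can remain.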