Different results for same configuration

Why am I getting different results for the same configuration? Is there something random in the Adam optimizer?
Btw, data shuffling is off.

Hi, you are getting these results with everything else kept the same, right?
Just re-running the training for the same number of epochs?

Yes Rajat, everything is the same.

OK, then this is mainly due to random weight initialization. This is common when training a neural network whose weights are initialized randomly: each run starts from different weights, so the training curves differ slightly even though, as you can see, all the graphs show almost the same pattern.
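If you want runs to be exactly reproducible, you can fix the random seeds before building the model. Below is a minimal sketch assuming TensorFlow/Keras (the thread doesn't say which framework is in use); the model architecture and the seed value are placeholders, so adapt them to your own code.

```python
# Minimal reproducibility sketch (assumes TensorFlow/Keras; adapt to your setup).
import random
import numpy as np
import tensorflow as tf

SEED = 42  # any fixed integer works; the value itself is arbitrary

random.seed(SEED)         # Python's built-in RNG
np.random.seed(SEED)      # NumPy RNG (used by some initializers and data pipelines)
tf.random.set_seed(SEED)  # TensorFlow RNG (controls layer weight initialization)

# Build and compile the model *after* seeding so the initial weights
# are identical on every run (hypothetical architecture for illustration).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```

Note that even with fixed seeds, some GPU operations can be nondeterministic, so small run-to-run differences may remain.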