Is it because the Adam optimizer changes the learning rate by itself? I get an error saying 'Attempting to use uninitialized value Adam_1/lr'. I guess there is no point in using ReduceLROnPlateau, as Adam will automatically change the learning rate anyway. I have updated the code. Update: Code:
from keras.optimizers import Adam
from keras.callbacks import ReduceLROnPlateau

model.compile(optimizer='adam', loss='mse')

callback_reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1,
                                       min_lr=1e-4, patience=0, verbose=1)

model.fit(xTrain, yTrain, epochs=100, batch_size=10,
          validation_data=(xTest, yTest), verbose=2,
          callbacks=[callback_reduce_lr])
Error: Attempting to use uninitialized value Adam_1/lr
I read somewhere that initializing Adam doesn't work while working with ReduceLROnPlateau. I have tried to initialize the weights too, but I got the same error.
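For context on whether the callback is redundant with Adam: Adam adapts per-parameter step sizes, while ReduceLROnPlateau scales the global base learning rate whenever val_loss stops improving, so the two are complementary. Below is a minimal plain-Python sketch of that plateau logic (my own illustration of the idea, not the Keras source, and it doesn't touch the uninitialized-variable error):

```python
def reduce_lr_on_plateau(val_losses, lr, factor=0.1, min_lr=1e-4, patience=0):
    """Return the learning rate used at each epoch under plateau reduction.

    Sketch of the idea behind ReduceLROnPlateau: whenever the monitored
    loss fails to improve for more than `patience` epochs, multiply the
    base learning rate by `factor`, never going below `min_lr`.
    """
    best = float('inf')   # best val_loss seen so far
    wait = 0              # epochs since last improvement
    schedule = []
    for loss in val_losses:
        schedule.append(lr)          # lr in effect this epoch
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait > patience:      # plateau detected
                lr = max(lr * factor, min_lr)
                wait = 0
    return schedule

# Example: with patience=0, the single non-improving epoch (0.95 after 0.9)
# triggers a reduction from 0.01 toward 0.001 for the following epochs.
print(reduce_lr_on_plateau([1.0, 0.9, 0.95, 0.8], 0.01))
```

With patience=0, as in the snippet above, every epoch without improvement cuts the rate by `factor`, which can shrink it very aggressively; a larger patience is usually gentler.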