I'm trying to code Elastic-Net regularization: the loss is MSE plus an L1 and an L2 penalty on the weights.
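In plain NumPy, the loss I'm after looks roughly like this (the `lam1`/`lam2` coefficient names are placeholders, not values from my real code):

```python
import numpy as np

# Elastic-Net loss: mean squared error plus L1 and L2 weight penalties.
def elastic_net_loss(y_true, y_pred, weights, lam1=0.01, lam2=0.01):
    mse = np.mean((y_pred - y_true) ** 2)                 # data-fit term
    l1 = lam1 * sum(np.abs(w).sum() for w in weights)     # lasso penalty
    l2 = lam2 * sum(np.square(w).sum() for w in weights)  # ridge penalty
    return mse + l1 + l2
```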
I want to use this loss function in Keras. My model is:
from keras.layers import Input, BatchNormalization, Flatten, Dense
from keras.models import Model

def nn_weather_model():
    ip_weather = Input(shape=(30, 38, 5))
    x_weather = BatchNormalization(name='weather1')(ip_weather)
    x_weather = Flatten()(x_weather)
    Dense100_1 = Dense(100, activation='relu', name='weather2')(x_weather)
    Dense100_2 = Dense(100, activation='relu', name='weather3')(Dense100_1)
    Dense18 = Dense(18, activation='linear', name='weather5')(Dense100_2)
    model_weather = Model(inputs=[ip_weather], outputs=[Dense18])
    model = model_weather
    ip = ip_weather
    op = Dense18
    return model, ip, op
My loss function is:
from keras import backend as K

def cost_function():
    # closure, so Keras can be given the result of cost_function() at compile time
    def loss(y_true, y_pred):
        return K.mean(K.square(y_pred - y_true)) + L1 + L2
    return loss
It's MSE + L1 + L2, where L1 and L2 are computed from the layer weights:
weight1=model.layers[3].get_weights()[0]
weight2=model.layers[4].get_weights()[0]
weight3=model.layers[5].get_weights()[0]
L1 = Calculate_L1(weight1,weight2,weight3)
L2 = Calculate_L2(weight1,weight2,weight3)
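`Calculate_L1` and `Calculate_L2` are my own helpers; roughly, they do the following (a NumPy sketch with a placeholder `lam` coefficient, simplified from my real code):

```python
import numpy as np

def Calculate_L1(*weights, lam=0.01):
    # sum of absolute values over every weight matrix (lasso term)
    return lam * sum(np.abs(w).sum() for w in weights)

def Calculate_L2(*weights, lam=0.01):
    # sum of squares over every weight matrix (ridge term)
    return lam * sum(np.square(w).sum() for w in weights)
```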
I use the Calculate_L1 function to sum the absolute values of the weights of dense1, dense2, and dense3, and Calculate_L2 does the same with the squared weights.
When I train, I compile with RB_model.compile(loss=cost_function(), optimizer='RMSprop'),
but the L1 and L2 variables are not updated every batch: get_weights() returns plain NumPy arrays, so the penalty is baked into the loss once, before training starts. So I tried to recompute them in a callback on batch_begin:
from keras.callbacks import Callback

class update_L1L2weight(Callback):
    def __init__(self):
        super(update_L1L2weight, self).__init__()

    def on_batch_begin(self, batch, logs=None):
        weight1 = model.layers[3].get_weights()[0]
        weight2 = model.layers[4].get_weights()[0]
        weight3 = model.layers[5].get_weights()[0]
        L1 = Calculate_L1(weight1, weight2, weight3)
        L2 = Calculate_L2(weight1, weight2, weight3)
How can I have the callback compute L1 and L2 on each batch_begin, and pass the updated L1 and L2 values into the loss function?
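One direction I'm considering (an untested sketch; the `Calculate_*` bodies below are simplified stand-ins for my real helpers, and I'm using `tensorflow.keras` imports here): make L1 and L2 backend variables so the compiled loss graph reads their current value, then overwrite them from the callback with `K.set_value`.

```python
import numpy as np
from tensorflow.keras import backend as K
from tensorflow.keras.callbacks import Callback

# Simplified stand-ins for my Calculate_L1 / Calculate_L2 helpers
def Calculate_L1(*weights, lam=0.01):
    return lam * sum(np.abs(w).sum() for w in weights)

def Calculate_L2(*weights, lam=0.01):
    return lam * sum(np.square(w).sum() for w in weights)

# Backend variables: the loss tensor reads whatever value they currently hold
L1_var = K.variable(0.0)
L2_var = K.variable(0.0)

def cost_function(y_true, y_pred):
    return K.mean(K.square(y_pred - y_true)) + L1_var + L2_var

class UpdateL1L2(Callback):
    def on_batch_begin(self, batch, logs=None):
        w1 = self.model.layers[3].get_weights()[0]
        w2 = self.model.layers[4].get_weights()[0]
        w3 = self.model.layers[5].get_weights()[0]
        # push freshly computed penalties into the graph variables
        K.set_value(L1_var, Calculate_L1(w1, w2, w3))
        K.set_value(L2_var, Calculate_L2(w1, w2, w3))
```

With this, compiling with `loss=cost_function` and passing `UpdateL1L2()` in `callbacks=[...]` should refresh the penalties every batch. I also know Keras ships `keras.regularizers.l1_l2`, which applies an Elastic-Net penalty per layer without any callback, so maybe that is the simpler route.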