Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 3.3409558876152446e-52
I want to turn this message off, but I don't know how to do it.
If anyone knows, please tell me.
Code
import torch.optim as optim
from apex import amp

mainQN = QNetwork(state.shape, action_size).to('cuda:0')
optimizer = optim.Adam(mainQN.parameters(), lr=learning_rate)
mainQN, optimizer = amp.initialize(mainQN, optimizer, opt_level="O1")  # ------------

mainQN.train()
optimizer.zero_grad()
output = mainQN.forward(inputs, "net_q")
if self.IQN == True:
    loss = self.loss_IQN(target, output, weights)  # assign the result so `loss` is defined below
else:
    loss = criterion(output, targets)
    loss = loss * weights
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()  # changed here
optimizer.step()

Result

Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 8.552847072295026e-50
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 4.276423536147513e-50
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 2.1382117680737565e-50
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 1.0691058840368783e-50
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 5.345529420184391e-51
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 2.6727647100921956e-51
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 1.3363823550460978e-51
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 6.681911775230489e-52
Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to 3.3409558876152446e-52
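If the goal is only to silence these messages: they are printed by amp's internal loss scaler, and in the Apex versions I have seen, apex.amp.initialize accepts a verbosity argument, where verbosity=0 suppresses amp's informational output, including these loss-scaler prints. A minimal sketch, reusing the mainQN and optimizer from the snippet above:

from apex import amp

# verbosity=0 suppresses amp's informational prints, including the
# "Gradient overflow. Skipping step..." loss-scaler messages.
mainQN, optimizer = amp.initialize(mainQN, optimizer, opt_level="O1", verbosity=0)

That said, a loss scale that keeps halving all the way down to ~3e-52 means the scaler is seeing non-finite gradients on essentially every step, so the message may be flagging a real numerical problem rather than noise.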