On the Tsinghua news classification dataset (THUCNews), TextCNN works well, but TextLSTM/RNN fail to learn: the loss and accuracy barely change during training.
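For context, here is a minimal sketch of the kind of LSTM classifier involved. The layer sizes, class name, and num_classes=10 are assumptions rather than the repo's actual code (the initial loss of roughly 2.30 in the log below is consistent with ln(10) for ten roughly balanced classes):

```python
import torch
import torch.nn as nn

class TextLSTM(nn.Module):
    """Minimal LSTM text classifier (illustrative sketch, not the repo's exact model)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=10):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                 # x: (batch, seq_length) token ids
        emb = self.embedding(x)           # (batch, seq_length, embed_dim)
        _, (h_n, _) = self.lstm(emb)      # h_n: (1, batch, hidden_dim), last hidden state
        return self.fc(h_n.squeeze(0))    # (batch, num_classes) logits
```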
The fix: change seq_length from 50 to 32, padding sequences shorter than 32 with 0 and truncating longer ones.
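A minimal sketch of that pad/truncate step, assuming the inputs are lists of token ids and 0 is the padding id:

```python
def pad_or_truncate(token_ids, seq_length=32, pad_id=0):
    """Force every sequence to exactly seq_length: cut long ones, pad short ones with pad_id."""
    if len(token_ids) >= seq_length:
        return token_ids[:seq_length]
    return token_ids + [pad_id] * (seq_length - len(token_ids))

# Example: a 3-token sequence is padded to length 32; a 50-token one is cut to 32.
print(pad_or_truncate([5, 9, 17]))        # [5, 9, 17, 0, 0, ...]  (length 32)
print(pad_or_truncate(list(range(50))))   # [0, 1, ..., 31]        (length 32)
```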
After retraining, the problem was resolved; the training log below shows the loss dropping and both train and dev accuracy rising:
epoch:0 item:50 loss:2.3013758659362793 train_acc:0.109375 dev_acc:0.1474609375
epoch:0 item:100 loss:2.2781167030334473 train_acc:0.1953125 dev_acc:0.17578125
epoch:0 item:150 loss:2.205510377883911 train_acc:0.21875 dev_acc:0.1767578125
epoch:0 item:200 loss:2.183446168899536 train_acc:0.2421875 dev_acc:0.294921875
epoch:0 item:250 loss:2.144759178161621 train_acc:0.2265625 dev_acc:0.2470703125
epoch:0 item:300 loss:2.143526792526245 train_acc:0.265625 dev_acc:0.28515625
epoch:0 item:350 loss:2.078019857406616 train_acc:0.34375 dev_acc:0.279296875
epoch:0 item:400 loss:2.096219301223755 train_acc:0.296875 dev_acc:0.3203125
epoch:0 item:450 loss:2.0016613006591797 train_acc:0.3984375 dev_acc:0.3974609375
epoch:0 item:500 loss:1.9698306322097778 train_acc:0.390625 dev_acc:0.4150390625
epoch:0 item:550 loss:1.9621188640594482 train_acc:0.4453125 dev_acc:0.47265625
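For reference, a hedged sketch of a training loop that would emit log lines in the format above. The model, data loaders, and hyperparameters here are placeholders, not the repo's actual code:

```python
import torch
import torch.nn.functional as F

def evaluate(model, loader, device):
    """Accuracy over a dataloader (illustrative helper)."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.size(0)
    model.train()
    return correct / total

def train(model, train_loader, dev_loader, device, epochs=3, lr=1e-3):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.to(device).train()
    for epoch in range(epochs):
        for item, (x, y) in enumerate(train_loader, start=1):
            x, y = x.to(device), y.to(device)
            logits = model(x)
            loss = F.cross_entropy(logits, y)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            if item % 50 == 0:  # periodic log, matching the format above
                train_acc = (logits.argmax(dim=1) == y).float().mean().item()
                dev_acc = evaluate(model, dev_loader, device)
                print(f"epoch:{epoch} item:{item} loss:{loss.item()} "
                      f"train_acc:{train_acc} dev_acc:{dev_acc}")
```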
https://github.com/haibincoder/NlpSummary/tree/master/torchcode/classification
Original title: Notes on a model debugging session: TextCNN works well, but TextLSTM/RNN fail to learn (loss and acc unchanged)
Original post: https://www.cnblogs.com/bincoding/p/14392155.html