All metrics are macro-averaged (%).

| Methods | Top-1 Accuracy | Top-1 F1-score | Top-5 Accuracy | Top-5 F1-score | Top-10 Accuracy | Top-10 F1-score |
|---|---|---|---|---|---|---|
| TextCNN | 65.38 | 60.60 | 92.91 | 91.49 | 97.27 | 96.73 |
| LSTM | 64.18 | 59.68 | 91.90 | 90.08 | 96.61 | 95.87 |
| LEAM | 63.44 | 55.79 | 92.10 | 90.20 | 96.90 | 96.30 |
| Transformer | 65.11 | 59.97 | 92.74 | 91.24 | 97.12 | 96.60 |
| BERT-base | 65.95 | 61.36 | **93.11** | **91.59** | 97.28 | 96.78 |
| BERT-wwm | 66.12 | 61.56 | 93.06 | 91.47 | **97.34** | 96.82 |
| CHMBERT | **66.28** | **61.95** | 93.08 | 91.58 | 97.27 | **96.83** |

- The best performance in each column is boldfaced.