Macro-averaged accuracy and F1-score (%) at Top-1, Top-2, and Top-3:

| Methods | Top-1 Accuracy | Top-1 F1-score | Top-2 Accuracy | Top-2 F1-score | Top-3 Accuracy | Top-3 F1-score |
|---|---|---|---|---|---|---|
| TextCNN | 85.87 | 72.32 | 94.74 | 84.52 | 97.16 | 89.77 |
| LSTM | 84.72 | 70.98 | 94.05 | 83.46 | 96.55 | 89.00 |
| LEAM | 84.64 | 68.35 | 94.02 | 83.04 | 96.73 | 88.88 |
| Transformer | 84.98 | 69.59 | 94.37 | 83.87 | 96.95 | 89.51 |
| BERT-base | 86.47 | 73.36 | 95.04 | 85.29 | 97.32 | 89.62 |
| BERT-wwm | 86.52 | 73.47 | 95.03 | 84.47 | 97.24 | 89.73 |
| CHMBERT | **86.66** | **74.06** | **95.18** | **85.30** | **97.44** | **90.67** |
- The best performance in each column is boldfaced.
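
For reference, metrics of this kind could be computed along the following lines. This is a minimal sketch, not the paper's published protocol: the helper `top_k_macro_metrics` and its credit-assignment rule for Top-k F1 (a sample's prediction is credited as the true label whenever the true label appears among the k highest-scoring classes, otherwise as the top-1 prediction) are illustrative assumptions, and scikit-learn's standard top-k accuracy is used as a stand-in for the macro-averaged accuracy reported above.

```python
import numpy as np
from sklearn.metrics import f1_score, top_k_accuracy_score


def top_k_macro_metrics(y_true, y_score, k, labels):
    """Return (top-k accuracy, top-k macro F1) in percent.

    y_true: shape (n,) true class labels.
    y_score: shape (n, n_classes) per-class scores or probabilities.
    """
    y_true = np.asarray(y_true)
    labels = np.asarray(labels)

    # Standard top-k accuracy: true label among the k best-scoring classes.
    acc = top_k_accuracy_score(y_true, y_score, k=k, labels=labels)

    # Assumed credit rule for top-k F1: if the true label is in the
    # top-k candidate set, credit it; otherwise fall back to top-1.
    top1 = labels[np.argmax(y_score, axis=1)]
    topk = labels[np.argsort(y_score, axis=1)[:, -k:]]
    hit = np.any(topk == y_true[:, None], axis=1)
    y_pred = np.where(hit, y_true, top1)
    macro_f1 = f1_score(y_true, y_pred, labels=labels, average="macro")

    return 100.0 * acc, 100.0 * macro_f1


# Toy usage with 3 classes and 4 samples (hypothetical scores).
y_true = [0, 1, 2, 1]
y_score = np.array([[0.6, 0.3, 0.1],
                    [0.2, 0.5, 0.3],
                    [0.1, 0.2, 0.7],
                    [0.5, 0.3, 0.2]])
for k in (1, 2, 3):
    acc, f1 = top_k_macro_metrics(y_true, y_score, k=k, labels=[0, 1, 2])
    print(f"Top-{k}: accuracy={acc:.2f}, macro F1={f1:.2f}")
```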