Relation Classification via Keyword-Attentive Sentence Mechanism and Synthetic Stimulation Loss

By: Luoqin Li; Jiabing Wang; Jichang Li; Qianli Ma; Jia Wei

Previous studies have shown that attention mechanisms and shortest dependency paths have a positive effect on relation classification. In this paper, a keyword-attentive sentence mechanism is proposed to effectively combine the two. Furthermore, to handle the imbalanced classification problem, this paper proposes a new loss function, called the synthetic stimulation loss, which uses a modulating factor to make the model focus on hard-to-classify samples. The two proposed methods are integrated into a bidirectional gated recurrent unit (BiGRU). Because a single model has limited noise immunity, this paper also applies mutual learning, forcing peer networks to teach each other; we therefore call the final model SSL-KAS-MuBiGRU. Experiments on the SemEval-2010 Task 8 data set and the TAC40 data set demonstrate that the keyword-attentive sentence mechanism and the synthetic stimulation loss are useful for relation classification, and that our model achieves state-of-the-art results.
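The abstract does not give the exact form of the synthetic stimulation loss or of the mutual-learning objective, so the sketch below only illustrates the two ideas it describes: a modulating factor (here a generic focal-style factor (1 - p_t)^gamma) that down-weights easy samples so training concentrates on hard-to-classify ones, and a symmetric KL term through which two peer networks teach each other. The function names, the gamma value, and the PyTorch framing are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: stand-ins for the "modulating factor" and
# "networks teach each other" ideas described in the abstract, not the
# paper's actual synthetic stimulation loss or mutual-learning objective.
import torch
import torch.nn.functional as F


def modulated_classification_loss(logits, targets, gamma=2.0):
    """Cross-entropy scaled by a modulating factor (1 - p_t)^gamma so that
    well-classified (easy) samples contribute less and hard samples dominate."""
    ce = F.cross_entropy(logits, targets, reduction="none")  # -log p_t per sample
    p_t = torch.exp(-ce)                                      # probability of the true class
    return ((1.0 - p_t) ** gamma * ce).mean()


def mutual_learning_loss(logits_a, logits_b):
    """Symmetric KL divergence that lets two peer networks mimic each other's
    soft predictions, in the spirit of deep mutual learning."""
    log_p_a = F.log_softmax(logits_a, dim=-1)
    log_p_b = F.log_softmax(logits_b, dim=-1)
    kl_ab = F.kl_div(log_p_b, log_p_a.exp(), reduction="batchmean")  # KL(p_a || p_b)
    kl_ba = F.kl_div(log_p_a, log_p_b.exp(), reduction="batchmean")  # KL(p_b || p_a)
    return kl_ab + kl_ba


if __name__ == "__main__":
    # Toy batch: 4 sentences, 19 relation classes (as in SemEval-2010 Task 8).
    logits_a = torch.randn(4, 19)   # outputs of peer network A (e.g., a BiGRU)
    logits_b = torch.randn(4, 19)   # outputs of peer network B
    targets = torch.randint(0, 19, (4,))
    total = (modulated_classification_loss(logits_a, targets)
             + modulated_classification_loss(logits_b, targets)
             + mutual_learning_loss(logits_a, logits_b))
    print(total.item())
```

In a mutual-learning setup of this kind, each peer network would typically minimize its own supervised (modulated) loss plus a KL term toward its partner's predictions, so the two models regularize each other during training.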
