Previous studies have shown that attention mechanisms and shortest dependency paths benefit relation classification. This paper proposes a keyword-attentive sentence mechanism that effectively combines the two techniques. Furthermore, to handle the imbalanced classification problem, it introduces a new loss function, the synthetic stimulation loss, which uses a modulating factor so that the model focuses on hard-to-classify samples. Both methods are integrated into a bidirectional gated recurrent unit (BiGRU). Because a single model has limited noise immunity, we apply mutual learning and force the networks to teach each other; we therefore call the final model SSL-KAS-MuBiGRU. Experiments on the SemEval-2010 Task 8 and TAC40 data sets demonstrate that the keyword-attentive sentence mechanism and the synthetic stimulation loss are useful for relation classification, and that our model achieves state-of-the-art results.
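The abstract does not give the exact form of the synthetic stimulation loss, so the sketch below (PyTorch) only illustrates the general idea it describes: a modulating factor that shrinks the loss on easy examples so training gradients concentrate on hard-to-classify, often minority-class samples. The focal-loss-style formula, the function name modulating_factor_loss, and the exponent gamma are illustrative assumptions, not the paper's definition.

```python
import torch
import torch.nn.functional as F

def modulating_factor_loss(logits, targets, gamma=2.0):
    """A generic modulating-factor loss (focal-loss style), shown only to
    illustrate how such a factor emphasizes hard samples. This is an assumed
    form, not the paper's synthetic stimulation loss."""
    # Per-sample cross-entropy; ce = -log p_t, where p_t is the probability
    # the model assigns to the correct class.
    ce = F.cross_entropy(logits, targets, reduction="none")
    p_true = torch.exp(-ce)
    # (1 - p_t)^gamma is small for well-classified (easy) samples, so their
    # contribution is down-weighted and hard samples dominate the loss.
    return ((1.0 - p_true) ** gamma * ce).mean()

# Usage sketch: logits from a BiGRU-based relation classifier.
logits = torch.randn(8, 19, requires_grad=True)   # batch of 8, 19 relation labels
targets = torch.randint(0, 19, (8,))              # gold relation labels
loss = modulating_factor_loss(logits, targets)
loss.backward()
```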