Transformer-Based Named Entity Recognition for Semantic Information Extraction
DOI:
https://doi.org/10.71366/ijwos03032662201
Keywords:
Named Entity Recognition (NER), Natural Language Processing, Transformer Models, Deep Learning, Semantic Information Extraction, Text Mining, Sequence Labelling, Machine Learning
Abstract
Named Entity Recognition (NER) is a core task in Natural Language Processing: identifying and categorizing key information in text, such as person names, organization names, locations, dates, and monetary values. The rapid growth of digital text, including news articles, social media posts, and documents, has increased the need for efficient information extraction techniques such as NER. Traditional rule-based and statistical approaches to NER struggle to capture contextual relationships in text. This paper examines the application of transformer-based deep learning techniques to improving the performance of NER models. Through attention mechanisms, these models capture contextual relationships in text effectively, making them well suited to semantic information extraction tasks such as NER. The proposed approach applies a deep learning NER model to process text and to recognize and categorize the relevant entities it contains.
The paper evaluates these deep learning techniques as a means of improving NER for information extraction. The results show that deep learning is effective at improving the performance of NER models and underline its importance for information extraction more broadly.
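NER is framed above as a sequence-labelling task: the model assigns one label per token, and a post-processing step groups the labels into entity spans. As a minimal illustrative sketch (the tokens, tags, and function name below are assumptions, not taken from the paper), decoding the common BIO tagging scheme into (entity type, text) spans looks like this:

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_type, text) spans.

    BIO tags mark the Beginning of an entity (B-*), the Inside of
    an entity (I-*), or tokens Outside any entity (O). This is a
    typical post-processing step for sequence-labelling NER models.
    """
    entities, current_type, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new entity starts; close any span still open.
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # Continuation of the current entity.
            current_tokens.append(token)
        else:
            # "O" tag (or inconsistent I- tag) ends any open span.
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

# Hypothetical tagged sentence, as a transformer NER model might emit.
tokens = ["Apple", "hired", "Tim", "Cook", "in", "Cupertino"]
tags   = ["B-ORG", "O", "B-PER", "I-PER", "O", "B-LOC"]
print(decode_bio(tokens, tags))
# → [('ORG', 'Apple'), ('PER', 'Tim Cook'), ('LOC', 'Cupertino')]
```

In a transformer-based pipeline, the per-token tags would come from a fine-tuned token-classification model rather than being hand-written as here; the decoding step is the same either way.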
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.


