
Naver’s research papers accepted at a top global conference in natural language processing.

Declaring that it is “demonstrating world-class search technology,” Naver (CEO Choi Soo-yeon) announced on the 18th that regular papers related to its search technology have been accepted at the world’s most prestigious natural language processing (NLP) conference, proving its global-level research capabilities.

EMNLP (Empirical Methods in Natural Language Processing), now in its 28th year, is recognized alongside NAACL (Annual Conference of the North American Chapter of the Association for Computational Linguistics) and ACL (Association for Computational Linguistics) as one of the top global AI conferences in the field of natural language processing. It covers a variety of research topics involving language data-based NLP approaches such as AI translation, chatbots, and machine reading comprehension.

EMNLP 2024 will be held from November 12 to 16 in Florida, USA, where Naver plans to present four accepted papers, including those related to search technology.

Notably, Naver has already applied the results of this research to its live search services to improve search quality and usability, creating value both as deployed services and as research.

First, a paper on the algorithm applied to Naver’s generative AI search service ‘CUE:’ was accepted. The study presents a modular approach that uses an SLM (Small Language Model) to detect harmful queries and return appropriate answers. Naver has been applying these results to CUE: since last November to strengthen AI safety: queries involving illegal or harmful information such as crime-related content, copyright and privacy infringement, personal information leakage, and profanity are screened so that the generative AI search environment remains safe rather than responding indiscriminately. Naver also plans to use this technology to improve relevance judgments between queries and search results, expand the exposure of high-quality content, and strengthen overall search quality by prominently displaying answers from highly reliable sources.

Additionally, for the ‘Knowledge Snippet’ service, which summarizes key information related to a search query at the top of integrated search results, Naver proposed technology that lets AI effectively process snippets in complex forms such as lists and tables, not just plain text. Naver plans to apply this technology to the Knowledge Snippet service in the first half of next year, improving its ability to answer long-tail queries (long and complex search terms) accurately and helping users find the information they want more quickly.
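One common way to let a language model handle list- and table-shaped snippets is to linearize the structure into text before it reaches the model. The sketch below shows that general idea under assumed field names; the article does not describe Naver’s actual method, so this is only an illustration of the technique.

```python
def linearize_table(headers: list[str], rows: list[list[str]]) -> str:
    """Flatten a table snippet into 'header: value' text, one line per row."""
    lines = []
    for row in rows:
        pairs = [f"{h}: {v}" for h, v in zip(headers, row)]
        lines.append("; ".join(pairs))
    return "\n".join(lines)


def linearize_list(items: list[str]) -> str:
    """Flatten a list snippet into one bulleted line per item."""
    return "\n".join(f"- {item}" for item in items)
```

After linearization, a table row like (“EMNLP”, “2024”) becomes the text “Conference: EMNLP; Year: 2024”, which a text-only model can summarize alongside ordinary prose snippets.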

Furthermore, a paper on transferring the document-ranking ability of LLMs (Large Language Models) to sLLMs (small Large Language Models) for search services was also accepted. The technique was designed to deliver LLM-level quality in search services that must return results in real time, without sacrificing speed. Naver applied the model introduced in the paper to its integrated search service in June, so that documents better suited to the context of long-tail queries are returned as results. After the rollout, Naver measured improvements including a 4.3% increase in document click-through rate (CTR) and a 3% increase in dwell time.
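Transferring ranking ability from a large model to a small one is typically done by distillation: the teacher LLM scores candidate documents, and the student is trained to reproduce the teacher’s ranking distribution. The article does not state which objective Naver’s paper uses, so the listwise cross-entropy loss below is an assumed, commonly used formulation, shown in plain Python.

```python
import math


def softmax(scores: list[float]) -> list[float]:
    """Convert raw ranking scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(teacher_scores: list[float],
                      student_scores: list[float]) -> float:
    """Listwise cross-entropy between teacher and student ranking
    distributions over the same candidate documents. This is a standard
    distillation objective, assumed here for illustration."""
    p = softmax(teacher_scores)  # target distribution from the LLM teacher
    q = softmax(student_scores)  # distribution from the sLLM student
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

The loss is smallest when the student’s score distribution matches the teacher’s, so minimizing it pushes the small model toward the large model’s document ordering while keeping the small model’s inference cost.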

Beyond EMNLP, Naver’s search technology papers have also been accepted at other prestigious venues, including NAACL (1 paper), the top AI conference CVPR (2 papers), the journal Information Sciences (1 paper), LREC-COLING (1 paper), and SIGIR/LLM4eval (1 paper). In addition, seven papers were accepted at HCLT, the top domestic conference on Korean language and information processing, with two selected as outstanding papers, further demonstrating the depth of Naver’s search technology.

Kim Kwang-hyun, leader of Naver’s Search/Data Platform division, stated, “Through this research, Naver’s search technology, which has led the domestic search market, has been recognized on the global stage,” and added, “We will continue to offer competitive search services optimized for users by improving search accuracy and experimenting with generative AI.”
