
Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective


Abstract:

Large language models (LLMs) have significantly transformed the landscape of artificial intelligence, demonstrating exceptional capabilities in natural language understanding and generation. Recently, the integration of LLMs with neuro-symbolic architectures has gained traction as a way to enhance contextual awareness and planning capabilities. However, this integration faces computational challenges that hinder scalability and efficiency, especially in edge computing environments. This paper provides an in-depth analysis of these challenges and explores state-of-the-art solutions, focusing on memory-centric computing principles at both the algorithmic and hardware levels. Our exploration centers on the key computational elements of the Transformer, the architectural foundation of modern LLMs, and of vector-symbolic architecture, the leading neuro-symbolic model for edge applications. Additionally, we propose potential research directions for further investigation. By examining these aspects, this paper aims to bridge critical gaps on the path toward effective artificial general intelligence at the edge.
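As a concrete point of reference for the two computational kernels the abstract names, the following minimal Python sketch (not from the paper; the dimensions, the bipolar VSA flavor, and all function names are illustrative assumptions) contrasts scaled dot-product attention, the core operation of the Transformer, with the bind and bundle operations of a vector-symbolic architecture:

    # Illustrative sketch only: the paper itself does not prescribe this code.
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                    # (n_q, n_k) similarities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)     # numerically stable softmax
        return weights @ V                                 # weighted sum of values

    def vsa_bind(a, b):
        # Binding via element-wise multiplication (bipolar/MAP-style VSA);
        # bipolar vectors are their own inverses under this operation.
        return a * b

    def vsa_bundle(*vectors):
        # Bundling via element-wise majority vote (ties map to 0).
        return np.sign(np.sum(vectors, axis=0))

    rng = np.random.default_rng(0)
    d = 10_000                                             # high-dimensional bipolar vectors
    role, filler = rng.choice([-1, 1], d), rng.choice([-1, 1], d)
    record = vsa_bind(role, filler)                        # associate role with filler
    # Unbinding with the self-inverse role recovers the filler exactly here.
    assert np.array_equal(vsa_bind(record, role), filler)

Both kernels are dominated by large matrix products or element-wise operations over high-dimensional vectors, i.e., workloads whose cost lies in data movement rather than arithmetic, which is what makes them natural targets for the memory-centric computing principles the paper examines.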
Date of Conference: 29 September 2024 - 04 October 2024
Date Added to IEEE Xplore: 06 November 2024
Conference Location: Raleigh, NC, USA

