Researchers from the University of Technology Sydney and the Sydney Quantum Academy are exploring the intersection of Large Language Models (LLMs) and Quantum Machine Learning (QML). The study focuses on implementing the Transformer architecture, the foundation of models such as ChatGPT, within a quantum computing paradigm. The team has designed quantum circuits that implement adapted versions of the Transformer's core components and the generative pre-training phase. The research aims to bridge the gap between QML and state-of-the-art language models, potentially leading to advances in both AI and quantum computing.
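As a rough intuition for what "quantum circuits implementing model components" means in QML generally, the sketch below simulates a tiny parameterized (variational) two-qubit circuit with plain NumPy. Everything here is illustrative: the gate choice, parameters, and circuit layout are generic QML conventions, not the circuits designed in this study.

```python
import numpy as np

def ry(theta):
    # Single-qubit Y-rotation gate: the trainable "weight" of the circuit.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT entangles the two qubits (control = qubit 0, target = qubit 1).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def variational_circuit(thetas):
    """Toy 2-qubit parameterized circuit: RY on each qubit, then CNOT.

    Circuits of this shape are the generic building block of variational
    QML models; a quantum Transformer would compose far richer circuits.
    """
    state = np.zeros(4)
    state[0] = 1.0                                # start in |00>
    u = np.kron(ry(thetas[0]), ry(thetas[1]))     # independent rotations
    return CNOT @ (u @ state)

# Measurement probabilities over |00>, |01>, |10>, |11>.
probs = np.abs(variational_circuit([0.3, 1.1])) ** 2
```

In a QML training loop, the rotation angles play the role that weights play in a classical network: they are adjusted by an optimizer so that measurement statistics of the circuit approximate the desired function.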