AI Q&A: Memory Networks vs Transformer's Self-Attention

⚡️Hudson Ⓜ️endes
9 min read · May 2, 2023

This document records a conversation with ChatGPT comparing Memory Networks (Weston et al., 2015) and the Transformer's Self-Attention (Vaswani et al., 2017).

Prompt: blog header banner, minimalist, artistic, futuristic, high-resolution, clear lighting, picturing personified wise-looking AI teaching machine learning, in a room with modern zen architecture. by sachin teng and sergey kolesov and ruan jia and heng z. graffiti art, scifi, fantasy, hyper detailed. octane render. concept art. trending on artstation

Q1: How does GPT-4 use the attention mechanism to extract information from previous messages and apply them to its text-completion…
