Fascinating. We hear that the leaps in AI have been made possible by orders-of-magnitude increases in compute and data availability, and of course that’s substantially true—but exactly how true? It’s a nice exercise in perspective to see how much or how little modern machine learning methods would be capable of if you brought them by time machine to the ’70s and optimized them for that environment.
Thanks for reposting! I'm the author of ATTN-11. Happy to answer any questions about the fixed-point arithmetic, the PDP-11 hardware, or the training process.
Incredible work! Fitting a transformer into 32 KB of RAM is crazy.
For anyone reading about this project who isn't familiar with the PDP-11, it may not be obvious just how hard it is to work within these memory limits. Here is a visual guide to the PDP-11 architecture: https://vectree.io/c/pdp-11-hardware-architecture
With a concave trackpoint, respect.
Thanks for this amazing project!