It has become clear that we need something like the inverse of Hofstadter's Law to describe the rate of AI progress.

February 24th, 2023: Meta announces its LLaMA language model and releases it to academic researchers

March 3rd: The LLaMA model weights leak publicly on 4chan

March 10th: Georgi Gerganov releases llama.cpp, a heavily optimized C/C++ port of LLaMA that runs locally on consumer Apple hardware

March 12th: Artem Andreenko successfully runs LLaMA on a Raspberry Pi

So, without further ado:

Sydney’s Law: AI progress is faster than you expect, even when you take into account Sydney’s Law.