The Brain Metaphor Trap
Source: DEV Community
The human brain runs on roughly 20 watts. That is less power than the light bulb illuminating your desk, yet it orchestrates consciousness, creativity, memory, and the ability to read these very words. Within that modest thermal envelope, approximately 100 billion neurons fire in orchestrated cascades, connected by an estimated 100 trillion synapses, each consuming roughly 10 femtojoules per synaptic event. To put that in perspective: the energy powering a single thought could not warm a thimble of water by a measurable fraction of a degree.

Meanwhile, the graphics processing units training today's large language models consume megawatts and require industrial cooling systems. Training a single frontier AI model can cost millions in electricity alone. The disparity is so stark, so seemingly absurd, that it has launched an entire field of engineering dedicated to a single question: can we build computers that think like brains? The answer, it turns out, is far more complicated than the
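The figures above fit together on the back of an envelope. The snippet below (an illustrative sketch, using only the approximate numbers quoted in the text) asks: if the brain's entire 20-watt budget went to synaptic events at 10 femtojoules each, how many events per second would that buy, and what average rate per synapse does that imply?

```python
# Back-of-envelope check of the brain's energy budget, using the
# approximate figures quoted above (order-of-magnitude only).

brain_power_w = 20.0          # total power, in watts (joules per second)
num_synapses = 100e12         # ~100 trillion synapses
energy_per_event_j = 10e-15   # ~10 femtojoules per synaptic event

# If all 20 W went to synaptic events, how many could fire per second?
events_per_second = brain_power_w / energy_per_event_j

# Average event rate per synapse under that (generous) assumption.
rate_per_synapse_hz = events_per_second / num_synapses

print(f"{events_per_second:.1e} synaptic events per second")  # ~2.0e+15
print(f"~{rate_per_synapse_hz:.0f} Hz average per synapse")   # ~20 Hz
```

The implied ceiling, on the order of 10^15 synaptic events per second at an average of tens of hertz per synapse, is broadly consistent with observed neural firing rates, which is part of why these energy figures are taken seriously.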