If you insert electric probes into an insect *before* adulthood, its tissues can organically grow around the probe and unlock a high-bandwidth insect-machine interface.
Then you can read data from the insect's brain and *control* its flight by stimulation. This is from 2009, but I believe the modern hardware and AI stack can dramatically improve the "insect cyborg":
- Microchips and onboard computing are now way more powerful.
- Diffusion models can reconstruct images even from human brain activity, let alone an insect's. We will be able to decode the insect's vision and plan accordingly.
- Control methods are more sample-efficient and robust. With enough insects, we can gather the data to train a highly nontrivial flight controller. RLIF = reinforcement learning from insect feedback?
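To make the RLIF idea concrete, here is a minimal sketch of how such a flight controller could be trained. Everything in it is an illustrative assumption, not from the paper: the heading-error state space, the three stimulation actions (left pulse / no pulse / right pulse), and the toy dynamics where a pulse nudges heading by one bin. It uses plain tabular Q-learning as a stand-in for whatever sample-efficient method one would actually use.

```python
import numpy as np

# Hypothetical toy model of stimulation-based steering (NOT from the paper):
# state  = discretized heading error, bins 0..4, center bin 2 is on-course
# action = 0 (left pulse), 1 (no pulse), 2 (right pulse), shifting heading -1/0/+1
# reward = negative distance from the centered heading
N_STATES, N_ACTIONS = 5, 3

def step(state, action):
    """Apply one stimulation pulse and return (next_state, reward)."""
    next_state = int(np.clip(state + (action - 1), 0, N_STATES - 1))
    reward = -abs(next_state - N_STATES // 2)
    return next_state, reward

def train(episodes=500, horizon=10, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Epsilon-greedy tabular Q-learning over short stimulation episodes."""
    rng = np.random.default_rng(seed)
    q = np.zeros((N_STATES, N_ACTIONS))
    for _ in range(episodes):
        s = int(rng.integers(N_STATES))
        for _ in range(horizon):
            a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(q[s].argmax())
            s2, r = step(s, a)
            q[s, a] += alpha * (r + gamma * q[s2].max() - q[s, a])
            s = s2
    return q

q = train()
policy = q.argmax(axis=1)
print(policy)  # learned action per heading bin
```

In this toy setup the learned policy steers back toward the center bin: from the leftmost heading bin it picks the right-stimulation pulse, from the rightmost bin the left pulse, and on-course it stimulates nothing. A real controller would face noisy neural readouts and continuous dynamics, which is exactly where the more sample-efficient modern methods come in.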
Paper: Insect–Machine Interface Based Neurocybernetics.
Link: ibionics.ece.ncsu.edu/assets/Publica
Neuralink is of course a MUCH more scaled-up version of this: unlocking the brain-machine interface for humans. I don't think the human level is within reach yet, but we will solve the insect brain first.
Again, massive opportunities come along with massive risks and…