In the second of a three-post series revealing their spatial computing roadmap, Facebook researchers demonstrate in-progress gestural and neural interfaces. One demo shows expressive wrist and finger control mediated by a watch-like wearable reminiscent of the Myo armband; supporting concept videos telegraph aspirations for neural keyboards and that old AR chestnut: a user manipulating GUI elements in 3D space in front of them. However far off the tech may be, it’s immediately satisfying to watch one of the researchers gush about the shiny future of neural interfaces, where “you and the machine are in agreement about which neurons mean left and which ones mean right,” without any mention of data harvesting or his employer’s long history of malfeasance.
