      Neural implant lets paralyzed person type by imagining writing

      John Timmer · news.movim.eu / ArsTechnica · Wednesday, 12 May, 2021 - 17:03

    An artist's schematic of the system. (credit: Nature)

    Elon Musk's Neuralink has been making waves on the technology side of neural implants, but it hasn't yet shown how we might actually use implants. For now, demonstrating the promise of implants remains in the hands of the academic community.

    This week, the academic community provided a rather impressive example of the promise of neural implants. Using an implant, a paralyzed individual managed to type out roughly 90 characters per minute simply by imagining that he was writing those characters out by hand.

    Dreaming is doing

    Previous attempts at providing typing capabilities to paralyzed people via implants have involved giving subjects a virtual keyboard and letting them maneuver a cursor with their mind. The process is effective but slow, and it demands the user's full attention: the subject has to track the cursor's progress and decide when to perform the equivalent of a key press. It also requires users to spend time learning how to control the system.
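To make the contrast concrete: the handwriting approach replaces cursor steering with a decoder that classifies each imagined pen stroke as a character. The sketch below is purely illustrative and is not the paper's method (the actual system uses a recurrent neural network over multielectrode recordings); here a toy nearest-centroid classifier stands in for the decoder, and the "neural feature" vectors, the three-letter alphabet, and the calibration templates are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
alphabet = list("abc")        # hypothetical; the real system covers 31 characters
n_features = 16               # hypothetical feature-vector size

# Hypothetical per-character neural "templates", as if learned during a
# calibration session where the user imagines writing each character.
templates = {ch: rng.normal(size=n_features) for ch in alphabet}

def decode(windows):
    """Map each window of neural features to its nearest character template."""
    decoded = []
    for w in windows:
        dists = {ch: np.linalg.norm(w - t) for ch, t in templates.items()}
        decoded.append(min(dists, key=dists.get))
    return "".join(decoded)

# Simulate the user imagining writing "cab": each attempt yields a noisy
# copy of that character's template.
intended = "cab"
windows = [templates[ch] + 0.1 * rng.normal(size=n_features)
           for ch in intended]
print(decode(windows))  # at this noise level the decoder recovers "cab"
```

The point of the sketch is the design difference, not the classifier: decoding an imagined character directly removes the track-the-cursor feedback loop that made earlier virtual-keyboard systems slow.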



      Valve’s Gabe Newell imagines “editing” personalities with future headsets

      Sam Machkovech · news.movim.eu / ArsTechnica · Monday, 25 January, 2021 - 14:33 · 1 minute

    An artist's interpretation of how future Dota 2 tournament trophies might look if Valve chief Gabe Newell pushes any further into brain-computer interface (BCI) research. (credit: Getty Images / David Jackmanson / Sam Machkovech)

    For years, the open secret at Valve (makers of game series like Half-Life and Portal) has been the company's interest in a new threshold of game experiences. We've seen this most prominently with SteamVR as a virtual reality platform, but the game studio has also openly teased its work on "brain-computer interfaces" (BCI)—meaning, ways to read brainwave activity to either control video games or modify those experiences.

    Most of what we've seen from Valve's skunkworks divisions thus far, particularly at a lengthy GDC 2019 presentation, has revolved around reading your brain's state (i.e., capturing nervous-system energy in your wrists before it reaches your fingers, to reduce button-tap latency in twitchy shooters like Valve's Counter-Strike). In a Monday interview with New Zealand's 1 News, Valve co-founder Gabe Newell finally began teasing a more intriguing level of BCI interaction: one that changes the state of your brain.

    "Our ability to create experiences in people's brains, that aren't mediated through their meat peripherals [e.g., fingers, eyes], will be better than is [currently] possible," Newell asserts as part of his latest 12-minute video interview. Later, he claims that "the real world will seem flat, colorless, and blurry compared to the experiences that you'll be able to create in people's brains."



      A new way to plug a human brain into a computer: Via veins

      WIRED · news.movim.eu / ArsTechnica · Saturday, 31 October, 2020 - 12:10 · 1 minute

    A concept illustration of a human brain wired into motherboards and chips: neural tech and brain-computer interfaces.

    The hard part of connecting a gooey, thinking brain to a cold, one-ing and zero-ing computer is getting information through your thick skull—or mine, or anyone’s. The whole point of a skull, after all, is keeping a brain safely separate from [waves hands at everything].

    So if that brain isn't yours, the only way to tell what's going on inside it is inference. People make very educated guesses based on what that brain tells a body to do—like, if the body makes some noises that you can understand (that's speech) or moves around in a recognizable way. That's a problem for people trying to understand how the brain works, and an even bigger problem for people who because of injury or illness can't move or speak. Sophisticated imaging technologies like functional magnetic resonance imaging can give you some clues. But it'd be great to have something more direct. For decades, technologists have been trying to get brains to interface with computer keyboards or robot arms, to get meat to commune with silicon.

