Valve Founder Says Brain-Computer Interfaces Could One Day Replace Our ‘Meat Peripherals’

https://kotaku.com/valve-founder-says-brain-computer-interfaces-could-one-1846124830



In an interview with New Zealand’s 1 News, Valve co-founder and president Gabe Newell talks about engineering a future where brain-computer interfaces create better-than-reality visuals and can actively edit who we think we are. You know, terrifying science-fiction stuff, only real.

Why use your eyes and ears—which Newell sinisterly refers to as “meat peripherals”—to experience a game when you can have the visuals, sounds, and even feelings fed directly into your brain? That’s the idea behind a brain-computer interface, or BCI. Long-time proponents of body-interface technology like eye-tracking, Newell and Valve are currently working on an open-source BCI software project to give developers easy access to brain-reading tech. Using a headset like the ones developed by OpenBCI, developers can read signals from users’ bodies and minds, telling them whether players are sad, surprised, scared, or bored. Armed with such data, developers could then adjust the game to ramp up the excitement or evoke the desired emotion.
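To make that biofeedback loop a bit more concrete, here is a minimal, entirely hypothetical sketch in Kotlin. The AffectSensor interface and DifficultyDirector class are invented for illustration and don’t correspond to any real OpenBCI or Valve API; the point is only the shape of the loop, where the game samples an estimated arousal level from a headset each tick and nudges difficulty toward a target.

```kotlin
// Hypothetical biofeedback loop: none of these types come from a real SDK.
// A game samples an affect estimate from a headset, then nudges difficulty
// toward a target arousal level.

interface AffectSensor {
    /** Estimated arousal level in [0.0, 1.0] derived from headset signals. */
    fun readArousal(): Double
}

class DifficultyDirector(private val sensor: AffectSensor) {
    var spawnRateMultiplier = 1.0
        private set

    private val targetArousal = 0.6   // "excited but not panicking"
    private val gain = 0.5            // how aggressively to correct each tick

    /** Call once per game tick: bored players get more pressure, overwhelmed players get less. */
    fun tick() {
        val error = targetArousal - sensor.readArousal()
        spawnRateMultiplier = (spawnRateMultiplier + gain * error).coerceIn(0.25, 3.0)
    }
}
```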

Image: OpenBCI’s Galea headset concept (OpenBCI)

This isn’t the science fiction part. This is technology that already exists. Newell tells 1 News, “If you’re a software developer in 2022 who doesn’t have one of these (headsets) in your test lab, you’re making a silly mistake.” You’re also from a year in the future, but that’s not the scary part. The scary part is when Newell starts talking about where BCIs will lead next.

Speaking of visuals, Newell describes our eyes as “created by this low-cost bidder that didn’t care about failure rates and RMAs, and if it got broken there was no way to repair anything effectively.” BCIs beaming signals directly into the brain would be able to create visuals beyond what our flawed orbs could see. “The real world will seem flat, colourless, blurry compared to the experiences you’ll be able to create in people’s brains.”


This is the point in the Black Mirror episode where things start to go horribly wrong. Addiction, deception, brainwashing, that sort of stuff. Newell continues, “Where it gets weird is when who you are becomes editable through a BCI.” He says people will soon be able to edit the way they feel through an app. That’s just great.

Other nightmare scenarios mentioned in the interview include giving people tentacles and using a BCI to generate real physical pain. You can watch the video interview below, complete with a generous view of Newell’s fleshy walking platforms.

via Kotaku https://kotaku.com

January 25, 2021 at 09:48AM

Netflix delivers ‘studio-quality’ sound upgrade for Android viewers

https://www.engadget.com/netflix-improves-android-audio-221457366.html

Don’t be surprised if Netflix sounds nicer the next time you marathon a show on your Android phone. Netflix has upgraded its Android app to stream audio in xHE-AAC (Extended HE-AAC with MPEG-D DRC; yes, it’s a mouthful), promising “studio-quality” sound that’s also more consistent — that is, you should enjoy it in more places.

The new format offers a variable bitrate that can improve audio quality when your connection allows, and scale back when you’re on a flaky cellular link. Loudness management, meanwhile, prevents jarring volume changes (think of jumping from an action movie to a quiet drama) and compensates for noisy environments without the risk of clipping the loudest sounds. You could listen on your phone’s speakers without struggling to understand dialogue.

You’ll need at least Android 9 Pie to use xHE-AAC. While this isn’t quite as useful as it could be when you’re likely watching at home during the pandemic, it could be important if you’re determined to finish a show in bed. If nothing else, it could save you from reaching for your earbuds in noisier environments.
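If you’re curious whether a particular phone can actually take advantage of the new format, one way to probe it (a sketch under general Android media APIs, not anything from Netflix’s app) is to ask MediaCodecList for an AAC decoder that advertises the xHE-AAC profile. The profile constant only exists on Android 9 (API 28) and up, which lines up with the Pie requirement above.

```kotlin
import android.media.MediaCodecInfo
import android.media.MediaCodecList
import android.media.MediaFormat
import android.os.Build

// Rough capability check: does any on-device AAC decoder advertise the xHE-AAC profile?
fun supportsXheAac(): Boolean {
    // The AACObjectXHE profile (and xHE-AAC playback generally) requires Android 9 / API 28.
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.P) return false
    return MediaCodecList(MediaCodecList.ALL_CODECS).codecInfos
        .asSequence()
        .filter { !it.isEncoder && MediaFormat.MIMETYPE_AUDIO_AAC in it.supportedTypes }
        .any { info ->
            info.getCapabilitiesForType(MediaFormat.MIMETYPE_AUDIO_AAC)
                .profileLevels
                .any { it.profile == MediaCodecInfo.CodecProfileLevel.AACObjectXHE }
        }
}
```

Whether a given app actually streams xHE-AAC to a capable device is still up to the service, of course; this only tells you the decoder is there.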

via Engadget http://www.engadget.com

January 24, 2021 at 04:24PM