This Clever Robotic Finger Feels With Light

https://www.wired.com/story/this-clever-robotic-finger-feels-with-light

“Extracting information out of those 1,000 signals in an analytical way—it’s very, very hard to do,” says Ciocarlie, who developed the system. “I would venture to say that it’s impossible without modern machine learning.”

[GIF: Courtesy of Columbia University]

Machine learning comes into play when they calibrate the system. They can fix the finger to a table, point it upward, and use a separate robotic arm to prod it at precise spots with a specific amount of pressure. Because they know exactly where the robotic arm is jabbing the finger, they can see how the photodiodes detect light differently at each location. (If you take a look at the GIF above, you can see the system both localizing the touch and gauging its intensity as the red dot swells with more pressure.) And although each jab produces a large amount of data, machine learning lets the system crunch it all.
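The calibration loop described above can be sketched in miniature. Everything specific here is an assumption for illustration, not the team's actual method: the article only says they apply modern machine learning, so the 32-diode layout, the Gaussian light-response model, and the nearest-neighbor lookup below are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the finger's optics: 32 photodiodes, each with a
# fixed sensitivity center; a press at (x, y) with pressure p changes each
# diode's reading by p * exp(-dist^2 / scale). The real light transport is far
# messier -- which is exactly why the team resorts to learning.
N_DIODES = 32
centers = rng.uniform(0.0, 1.0, size=(N_DIODES, 2))

def read_diodes(x, y, p):
    d2 = np.sum((centers - [x, y]) ** 2, axis=1)
    return p * np.exp(-d2 / 0.05)

# Calibration pass: a second robot arm probes known (x, y, pressure) points.
probes = rng.uniform(0.1, 0.9, size=(2000, 3))
cal_signals = np.array([read_diodes(*pt) for pt in probes])

# Pressure scales all diodes together, so normalizing each signal vector
# leaves a pressure-independent "fingerprint" of the contact location.
cal_norm = cal_signals / cal_signals.sum(axis=1, keepdims=True)

def localize(signal):
    """Nearest-neighbor lookup against the calibration table."""
    norm = signal / signal.sum()
    i = np.argmin(np.sum((cal_norm - norm) ** 2, axis=1))
    x, y, p_cal = probes[i]
    # Total light change scales linearly with pressure at a fixed spot.
    p = p_cal * signal.sum() / cal_signals[i].sum()
    return x, y, p

x, y, p = localize(read_diodes(0.4, 0.6, 0.5))
```

A denser calibration grid or a learned regressor would sharpen the estimate; the point is only that known probe positions turn the tangle of optical signals into labeled training data.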

“So that’s the missing piece, the thing that’s really become available to the field really in the last maybe five years or so,” says Ciocarlie. “We now have the machine-learning methods that we can add on top of these many, many optical signals, so that we can decipher the information that’s in there.”

This mimics how humans learn to wield their own sense of touch. As children, we grab everything we can, banking memories of how objects feel. Even as adults, our brains continue to catalog the feel of things—for example, how much resistance to expect from a steering wheel when you’re turning left, or how hard to bang a hammer against a nail. “If we were to put you into the body of another person somehow, you would have to relearn all the motor skills,” says Columbia electrical engineer Ioannis Kymissis, who developed the system with Ciocarlie. “And that’s one of the nice things about the plasticity of the brain, right? You can have a stroke, you can knock out half of the brain and still relearn and then function.”

This new robotic finger, though, has its limits. While it can gauge the pressure it’s placing on an object, it misses a lot of other information that people sense through their own hands but often take for granted, like temperature and texture. But interestingly, the researchers think they could listen for the robotic finger’s slip, its motion as it slides over a surface.

“When you have slip, there’s a little bit of a singing—if you ever put your ear against the table and run your finger on the table,” says Kymissis. If you’re holding on to, say, a wet glass, the slip might happen on a small scale, then “spread” to your hand’s entire contact area as the glass slides out of your grasp. By listening to the characteristic noise of an object slipping out of a robot hand equipped with these new fingers, the machine could correct its grip before the slip spreads across the whole hand.
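A minimal sketch of slip detection along these lines: watch a vibration trace for the onset of that "singing" and flag it before the slip spreads. The sampling rate, synthetic noise model, and threshold below are all invented for illustration and do not come from the researchers.

```python
import numpy as np

rng = np.random.default_rng(1)
FS = 2000  # Hz, assumed vibration-sensor sampling rate

# Synthetic sensor trace: a quiet hold, then high-frequency "singing" as
# micro-slip begins at t = 0.5 s. (Illustrative noise model, not real data.)
t = np.arange(0, 1.0, 1.0 / FS)
signal = 0.01 * rng.standard_normal(t.size)
slip = t >= 0.5
signal[slip] += 0.2 * np.sin(2 * np.pi * 400 * t[slip]) * rng.uniform(0.5, 1.0, slip.sum())

def detect_slip(sig, fs, win=0.02, threshold=0.05):
    """Return the time the short-window RMS first exceeds threshold, else None."""
    n = int(win * fs)
    for start in range(0, sig.size - n, n):
        rms = np.sqrt(np.mean(sig[start:start + n] ** 2))
        if rms > threshold:
            return start / fs
    return None

onset = detect_slip(signal, FS)
```

Catching the onset within one 20 ms window is what would let a controller tighten its grip while the slip is still local, before the glass escapes the whole hand.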

via Wired Top Stories https://ift.tt/2uc60ci

February 26, 2020 at 06:03AM

VIDEO: LG V60 First Look!

https://www.droid-life.com/2020/02/26/video-lg-v60-first-look/

LG was kind enough to grant us early access to its new V60 ThinQ 5G smartphone recently, so if you’ve already read our official announcement post and the specs, then it’s time to dive right into this thing!

  • Related: LG V60 official spec sheet!

To recap, this phone features a 6.8-inch FHD+ P-OLED display (capped at a 60Hz refresh rate), a Snapdragon 865 processor, 8GB RAM, 128GB storage, microSD expansion, an IP68 rating, a fingerprint reader, a 5,000mAh battery, dual speakers, 5G connectivity, and, maybe most importantly, support for an updated LG Dual Screen accessory.

That Dual Screen adds another 6.8 inches of FHD+ real estate for browsing apps, multitasking, and consuming content. It’s a solid accessory, and it comes free with every V60 purchase. Good on ya, LG.

We’re still awaiting pricing and availability, so for now, watch this video.

VIDEO: LG V60 First Look! is a post from: Droid Life

via Droid Life: A Droid Community Blog https://ift.tt/2dLq79c

February 26, 2020 at 09:10AM