AI Is Revolutionizing Prosthetic Arm Control

https://www.discovermagazine.com/technology/ai-is-revolutionizing-prosthetic-arm-control


Prosthetic hands and arms have dramatically improved in recent years, thanks to advances that allow independently moving fingers, control over multiple joints, personalized 3D printing and so on.

Despite these breakthroughs, most users find prosthetic arms difficult to control. The most common control mechanism records the electrical activity in the arm muscles (a technique known as myoelectric sensing) and then uses this to actuate the prosthetic.

The problem is that users usually have to contract their muscles in specific combinations to generate hand or wrist motions. These patterns are often counterintuitive, time consuming and frustrating to learn.

At the heart of this problem is that neuroscientists do not yet know how to accurately decode the signals the brain sends through the nerves to control muscles, which makes those signals hard to interpret and act on.

What’s needed is a way to measure and decode nerve signals so that they can be used to intuitively control prosthetic arm, hand and finger movement.

Translating Intention

Now Diu Khue Luu and Anh Tuan Nguyen at the University of Minnesota, together with colleagues, have found a way to do this using an AI decoder that learns the user’s intention from the nerve signals it senses in the arm. “We present a neuroprosthetic system to demonstrate that principle by employing an artificial intelligence (AI) agent to translate the amputee’s movement intent,” they say.

One key difference between this and electromyographic systems is that the team record the nerve signals via an implanted electrode called a peripheral nerve interface. That requires minor surgery and comes with obvious risks of infection, tissue damage and so on.

The advantage is a much cleaner signal. “These interfaces aim to enable intuitive prosthesis control purely by thoughts and achieve a natural user experience,” say the researchers.

With this in place, the team record amputees’ nerve signals as they attempt specific hand movements. They have tested the approach with three subjects, each of whom received the microelectronic implant for up to 16 months.

To train the AI system, the user wears a data glove on the uninjured hand and then repeatedly practices a hand movement with both this hand and the phantom hand. During this process, the data glove records the intended movement while the electrodes record the nerve signals in the amputated arm.

In this way, the AI system learns to correlate the patterns of nerve signals with specific hand movements. In particular, it can decode several movements at the same time, such as pinching, which requires the thumb and forefinger to flex while the other fingers remain still. The AI then drives the prosthetic arm over a Bluetooth connection.
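The training setup described above, pairing nerve-signal recordings with data-glove measurements, can be sketched in miniature. The real system uses a deep-learning decoder on implanted-electrode data; everything below (synthetic data, feature sizes, the simple linear model) is a stand-in for illustration only.

```python
# Minimal sketch: pair synthetic "nerve signal" feature vectors with
# "data glove" joint angles, then fit a linear decoder by least squares.
# All dimensions and data here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_channels, n_joints = 500, 8, 5  # assumed sizes

# Hidden "true" mapping from nerve features to finger joint angles.
true_W = rng.normal(size=(n_channels, n_joints))

X = rng.normal(size=(n_samples, n_channels))                    # nerve features
Y = X @ true_W + 0.05 * rng.normal(size=(n_samples, n_joints))  # glove angles

# Fit the decoder: find W minimizing ||X W - Y||^2.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decode a new "intended movement" from nerve features alone;
# these predicted angles would then drive the prosthetic joints.
x_new = rng.normal(size=(1, n_channels))
predicted_angles = x_new @ W
```

The same pipeline shape (paired recordings, supervised fit, real-time decode) carries over to the neural-network decoder the researchers actually use.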

The result is a remarkable degree of dexterity. In tests, the subjects successfully achieved the intended action 99.2% of the time with a median reaction time of 0.81 seconds. Crucially, each movement is intuitive, with the AI system matching the intended motion. “The AI agent allows amputees to control prosthetic upper limbs with their thoughts by decoding true motor intent,” say Luu, Nguyen and co.

Tactile Sensation

The team say this system is part of their broader work on signals that flow in both directions, from the user to control the arm, and then back to the brain for tactile sensation. They point to earlier work in which users reported tactile sensation and proprioception (the body’s ability to sense motion, action and position in 3D space). “This and our previous works form the foundation to materialize a complete closed-loop human-machine bidirectional communication,” they say.

The approach also allows users to connect directly to a virtual environment, for example to control a computer game, without using a prosthetic hand. “The proposed nerve interface with an AI neural decoder allows people to manipulate remote objects using only their thoughts in an actual ‘telekinesis’ manner,” say the team.

Of course, the system isn’t perfect. The researchers point to various kinds of fine motor control that are still difficult to achieve, such as wrist extension, wrist supination and applied force estimation. Some of these movements will require additional implanted electrode sensors. Most users would also appreciate a shorter reaction time. And implanted electrodes always run the risk of being disturbed and requiring re-implantation.

For now, this approach is a useful step forward in the technology for amputees and an exciting avenue for future work. As Luu, Nguyen and co conclude: “These results promise a future generation of prosthetic hands that can provide a natural user experience just like real hands.”


Ref: Artificial Intelligence Enables Real-Time and Intuitive Control of Prostheses via Nerve Interface: arxiv.org/abs/2203.08648

via Discover Main Feed https://ift.tt/a1oDnBy

March 21, 2022 at 08:51AM

Ford is now using robots to operate 3D printers without human help

https://techcrunch.com/2022/03/16/ford-is-now-using-robots-to-operate-3d-printers-without-human-help/


Ford’s Advanced Manufacturing Center has developed an interface that allows machines from different suppliers to speak to each other in the same language and operate parts of the production line autonomously.

Automakers have been incorporating robotics into their manufacturing processes for decades to reduce costs and boost efficiency. But Ford’s patent-pending system solves a crucial bottleneck in the production line by using robots to operate the 3D printers through the night without human interaction.

The autonomous system marks the first time the Carbon 3D printers and the KUKA-built robots can talk to each other in the same language, opening up possibilities for other machines throughout the production process to collaborate.

So far, the venture has helped produce low-volume, custom car parts such as the brake line bracket for Mustang Shelby GT500 sports cars equipped with the Performance Package.

“This new process has the ability to change the way we use robotics in our manufacturing facilities,” Jason Ryska, Ford’s director of global manufacturing technology development, said in a statement.

Javier, the name given to the wheeled robot from supplier KUKA, can run the 3D printer continuously without human intervention, even once employees have gone home for the night. Ford says the robots are constantly learning from the printer data to help the automaker achieve more accuracy and reduce margins of error.

“At Ford’s Advanced Manufacturing Center, Javier is tasked with operating the 3D printers completely on his own,” Ford said in a statement. “He is always on time, very precise in his movements, and he works most of the day – taking only a short break to charge up.”

Usually, equipment from different suppliers is unable to interact because it uses separate communication interfaces. Ford’s system allows equipment from different suppliers to talk to each other, sending commands and feedback in real time.

Once the Carbon 3D printer tells Javier that the printed car component is ready, Javier retrieves it and puts it aside for a human operator to collect later.
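Ford’s actual interface is proprietary and patent-pending, but the idea of a shared “language” between vendors can be sketched as an adapter pattern: each machine’s native messages are translated into a common event format that any other machine can act on. Every class name, status code and command string below is invented for illustration.

```python
# Illustrative adapter pattern for machine-to-machine communication:
# vendor-specific messages are translated into a shared event format.
# Nothing here reflects Ford's, Carbon's or KUKA's real protocols.
from dataclasses import dataclass

@dataclass
class Event:
    """Common message format shared by all machines on the line."""
    source: str
    kind: str       # e.g. "part_ready", "status"
    payload: dict

class PrinterAdapter:
    """Wraps a hypothetical printer status code into common events."""
    def translate(self, native_status: str) -> Event:
        if native_status == "JOB_COMPLETE":  # invented vendor code
            return Event("printer", "part_ready", {"tray": 1})
        return Event("printer", "status", {"raw": native_status})

class RobotAdapter:
    """Turns common events into hypothetical robot commands."""
    def handle(self, event: Event) -> str:
        if event.kind == "part_ready":
            return f"MOVE_TO_TRAY {event.payload['tray']}; GRIP; PLACE_ASIDE"
        return "IDLE"

# The printer announces a finished part; the robot acts on it.
printer, robot = PrinterAdapter(), RobotAdapter()
command = robot.handle(printer.translate("JOB_COMPLETE"))
```

The design point is that adding a new machine to the line only requires writing one adapter, not a pairwise translator for every other vendor’s protocol.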

Ford has filed several patents for the technology underpinning its communication interfaces and the precise positioning of robots. Though the process is autonomous, human operators must upload the 3D designs to the printer and maintain the machinery.

Reporting by Jaclyn Trop for TechCrunch.


via Autoblog https://ift.tt/E3vCeNP

March 20, 2022 at 07:34AM

This electric vehicle splits itself down the middle to let passengers inside

https://www.autoblog.com/2022/03/20/iev-z-motors-electirc-vehicle-ev/


Tiny electric cars are a dime a dozen nowadays, so iEV Motors wanted to change things up by introducing a tiny, fully electric vehicle that can extend itself to let passengers inside. The iEV Z’s unique chassis can adjust to fit a second passenger in the back or increase cargo space. The iEV Z can be accessed with either a key card or an exclusive app from iEV, so there’s no need for a key fob. The app can also transform the vehicle, locate it if “lost,” show remaining battery life, and much more. According to iEV Motors, the iEV Z is six times smaller than a traditional vehicle, with a range of 62 miles on a single charge and a max speed of 28 mph. There is a “plus” version, the iEV Z+, with a 99-mile range and a 50-mph max speed. Both versions can be fully charged in 3 hours. You can pre-order the iEV Z from the iEV Motors website right now at a starting price of €5,850 (about $6,460).

For more content like this be sure to visit Your Future Car by Autoblog on Facebook or on YouTube. Subscribe for new videos every week.

via Autoblog https://ift.tt/E3vCeNP

March 20, 2022 at 09:14AM