A Single Math Model Explains Many Mysteries of Vision

https://www.wired.com/story/a-single-math-model-explains-many-mysteries-of-vision

This is the great mystery of human vision: Vivid pictures of the world appear before our mind’s eye, yet the brain’s visual system receives very little information from the world itself. Much of what we “see” we conjure in our heads.

“A lot of the things you think you see you’re actually making up,” said Lai-Sang Young, a mathematician at New York University. “You don’t actually see them.”

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research develop­ments and trends in mathematics and the physical and life sciences.

Yet the brain must be doing a pretty good job of inventing the visual world, since we don’t routinely bump into doors. Unfortunately, studying anatomy alone doesn’t reveal how the brain makes these images up any more than staring at a car engine would allow you to decipher the laws of thermodynamics.

New research suggests mathematics is the key. For the past few years, Young has been engaged in an unlikely collaboration with her NYU colleagues Robert Shapley, a neuroscientist, and Logan Chariker, a mathematician. They’re creating a single mathematical model that unites years of biological experiments and explains how the brain produces elaborate visual reproductions of the world based on scant visual information.

“The job of the theorist, as I see it, is we take these facts and put them together in a coherent picture,” Young said. “Experimentalists can’t tell you what makes something work.”

Young and her collaborators have been building their model by incorporating one basic element of vision at a time. They’ve explained how neurons in the visual cortex interact to detect the edges of objects and changes in contrast, and now they’re working on explaining how the brain perceives the direction in which objects are moving.

Their work is the first of its kind. Previous efforts to model human vision made wishful assumptions about the architecture of the visual cortex. Young, Shapley, and Chariker’s work accepts the demanding, unintuitive biology of the visual cortex as is—and tries to explain how the phenomenon of vision is still possible.

“I think their model is an improvement in that it’s really founded on the real brain anatomy. They want a model that’s biologically correct or plausible,” said Alessandra Angelucci, a neuroscientist at the University of Utah.

Layers and Layers

There are some things we know for sure about vision.

The eye acts as a lens. It receives light from the outside world and projects a scale replica of our visual field onto the retina, which sits in the back of the eye. The retina is connected to the visual cortex, the part of the brain in the back of the head.

However, there’s very little connectivity between the retina and the visual cortex. For a visual area roughly one-quarter the size of a full moon, there are only about 10 nerve cells connecting the retina to the visual cortex. These cells make up the LGN, or lateral geniculate nucleus, the only pathway through which visual information travels from the outside world into the brain.

Not only are LGN cells scarce—they can’t do much either. LGN cells send a pulse to the visual cortex when they detect a change from dark to light, or vice versa, in their tiny section of the visual field. And that’s all. The lighted world bombards the retina with data, but all the brain has to go on is the meager signaling of a tiny collection of LGN cells. To see the world based on so little information is like trying to reconstruct Moby-Dick from notes on a napkin.

“You may think of the brain as taking a photograph of what you see in your visual field,” Young said. “But the brain doesn’t take a picture, the retina does, and the information passed from the retina to the visual cortex is sparse.”

But then the visual cortex goes to work. While the cortex and the retina are connected by relatively few neurons, the cortex itself is dense with nerve cells. For every 10 LGN neurons that snake back from the retina, there are 4,000 neurons in just the initial “input layer” of the visual cortex—and many more in the rest of it. This discrepancy suggests that the brain heavily processes the little visual data it does receive.

“The visual cortex has a mind of its own,” Shapley said.

For researchers like Young, Shapley, and Chariker, the challenge is deciphering what goes on in that mind.

Visual Loops

The neural anatomy of vision is provocative. Like a slight person lifting a massive weight, it calls out for an explanation: How does it do so much with so little?

Young, Shapley, and Chariker are not the first to try to answer that question with a mathematical model. But all previous efforts assumed that more information travels between the retina and the cortex, an assumption that would make the visual cortex’s response to stimuli easier to explain.

“People hadn’t taken seriously what the biology was saying in a computational model,” Shapley said.

Mathematicians have a long, successful history of modeling changing phenomena, from the movement of billiard balls to the evolution of space-time. These are examples of “dynamical systems”—systems that evolve over time according to fixed rules. Interactions between neurons firing in the brain are also an example of a dynamical system—albeit one that’s especially subtle and hard to pin down in a definable list of rules.

LGN cells send the cortex a train of electrical impulses one-tenth of a volt in magnitude and one millisecond in duration, setting off a cascade of neuron interactions. The rules that govern these interactions are “infinitely more complicated” than the rules that govern interactions in more familiar physical systems, Young said.

Individual neurons receive signals from hundreds of other neurons simultaneously. Some of these signals encourage the neuron to fire. Others restrain it. As a neuron receives electrical pulses from these excitatory and inhibitory neurons, the voltage across its membrane fluctuates. It only fires when that voltage (its “membrane potential”) exceeds a certain threshold. It’s nearly impossible to predict when that will happen.

“If you watch a single neuron’s membrane potential, it’s fluctuating wildly up and down,” Young said. “There’s no way to tell exactly when it’s going to fire.”
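The threshold dynamics described above are the basis of the classic “integrate-and-fire” picture of a neuron. A minimal sketch of that general idea (an illustration only, far simpler than the researchers’ actual model) might look like this:

```python
import random

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: the membrane potential accumulates input,
    decays ("leaks") each time step, and the neuron fires only when the
    potential crosses a threshold, after which it resets."""
    potential = 0.0
    spike_times = []
    for t, current in enumerate(inputs):
        potential = leak * potential + current  # decay, then add new input
        if potential >= threshold:
            spike_times.append(t)  # the neuron fires...
            potential = 0.0        # ...and its potential resets
    return spike_times

# A noisy mix of excitatory (positive) and inhibitory (negative) pulses:
random.seed(0)
pulses = [random.uniform(-0.2, 0.4) for _ in range(100)]
print(simulate_lif(pulses))
```

Even in this toy version, feeding in a noisy mix of excitatory and inhibitory pulses yields irregular, hard-to-predict spike times, echoing Young’s point about the fluctuating membrane potential.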

The situation is even more complicated than that. Those hundreds of neurons connected to your single neuron? Each of those is receiving signals from hundreds of other neurons. The visual cortex is a swirling play of feedback loop upon feedback loop.

“The problem with this thing is there are a lot of moving parts. That’s what makes it difficult,” Shapley said.

Earlier models of the visual cortex ignored this feature. They assumed that information flows just one way: from the front of the eye to the retina and into the cortex until voilà, vision appears at the end, as neat as a widget coming off a conveyor belt. These “feed forward” models were easier to create, but they ignored the plain implications of the anatomy of the cortex—which suggested “feedback” loops had to be a big part of the story.

“Feedback loops are really hard to deal with because the information keeps coming back and changes you, it keeps coming back and affecting you,” Young said. “This is something that almost no model deals with, and it’s everywhere in the brain.”

In their initial 2016 paper, Young, Shapley, and Chariker began to take these feedback loops seriously. The loops introduced something like the butterfly effect: As small changes in the signal from the LGN ran through one feedback loop after another, a process known as “recurrent excitation,” they were amplified into large changes in the visual representation the model ultimately produced.
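A toy linear system can illustrate how a feedback loop amplifies small differences in its input (a sketch of the general principle only, not the 2016 model itself; the gain value is arbitrary):

```python
def recurrent_response(input_signal, gain=1.5, steps=10):
    """Toy recurrent excitation: a unit's activity is repeatedly fed back
    into itself with some gain, so the loop amplifies its own input."""
    activity = input_signal
    for _ in range(steps):
        activity = input_signal + gain * activity  # feedback term dominates
    return activity

# Two nearly identical weak inputs end up far apart after the loop:
weak_a, weak_b = 0.010, 0.011
print(recurrent_response(weak_a), recurrent_response(weak_b))
```

With a gain above 1, the input difference of 0.001 grows by more than two orders of magnitude after ten passes through the loop, which is the amplifying character the researchers built into their model.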

Young, Shapley, and Chariker demonstrated that their feedback-rich model was able to reproduce the orientation of edges in objects—from vertical to horizontal and everything in between—based on only slight changes in the weak LGN input coming into the model.

“[They showed] that you can generate all orientations in the visual world using just a few neurons connecting to other neurons,” Angelucci said.

Vision is much more than edge detection, though, and the 2016 paper was just a start. The next challenge was to incorporate additional elements of vision into their model without losing the one element they’d already figured out.

“If a model is doing something right, the same model should be able to do different things together,” Young said. “Your brain is still the same brain, yet you can do different things if I show you different circumstances.”

Swarms of Vision

In lab experiments, researchers present primates with simple visual stimuli — black-and-white patterns that vary in terms of contrast or the direction in which they enter the primates’ visual fields. Using electrodes hooked to the primates’ visual cortices, the researchers track the nerve pulses produced in response to the stimuli. A good model should replicate the same kinds of pulses when presented with the same stimuli.

“You know if you show [a primate] some picture, then this is how it reacts,” Young said. “From this information you try to reverse engineer what must be going on inside.”

In 2018, the three researchers published a second paper in which they demonstrated that the same model that can detect edges can also reproduce an overall pattern of pulse activity in the cortex known as the gamma rhythm. (It’s similar to what you see when swarms of fireflies flash in collective patterns.)

They have a third paper under review that explains how the visual cortex perceives changes in contrast. Their explanation involves a mechanism by which excitatory neurons reinforce each other’s activity, an effect like the gathering fervor in a dance party. It’s the type of ratcheting up that’s necessary if the visual cortex is going to create full images from sparse input data.

Currently Young, Shapley, and Chariker are working on adding directional sensitivity into their model—which would explain how the visual cortex reconstructs the direction in which objects are moving across your visual field. After that, they’ll start trying to explain how the visual cortex recognizes temporal patterns in visual stimuli. They hope to decipher, for example, why we can perceive the flashes in a blinking traffic light, but we don’t see the frame-by-frame action in a movie.

At that point, they’ll have a simple model for activity in just one of the six layers in the visual cortex—the layer where the brain roughs out the basic outlines of visual impression. Their work doesn’t address the remaining five layers, where more sophisticated visual processing goes on. It also doesn’t say anything about how the visual cortex distinguishes colors, which occurs through an entirely different and more difficult neural pathway.

“I think they still have a long way to go, though this is not to say they’re not doing a good job,” Angelucci said. “It’s complex and it takes time.”

While their model is far from uncovering the full mystery of vision, it is a step in the right direction—the first model to try and decipher vision in a biologically plausible way.

“People hand-waved about that point for a long time,” said Jonathan Victor, a neuroscientist at Cornell University. “Showing you can do it in a model that fits the biology is a real triumph.”




via Wired Top Stories https://ift.tt/2uc60ci

August 25, 2019 at 07:06AM

Does Hyundai’s rooftop solar panel change the fuel-economy equation?

https://www.popsci.com/hyundai-hybrid-car-solar-panel/

The hybrid Sonata with a solar panel on its roof. (Hyundai)

The new hybrid Hyundai Sonata isn’t available yet in the United States, but it offers something compelling enough to make headlines here—a solar panel on its roof. While the panel can’t produce nearly enough juice to give the car’s battery all it needs for regular travel, it does occupy what the company calls a "supporting role" for the vehicle.

Hyundai notes that the solar cells could provide a boost of about 808 miles annually. In other words, if you drove this car every day for a year, you’d be getting about 2.2 miles of daily travel from the sun, on average. Unlike with plug-in charging or previous attempts at cars with photovoltaics, the solar system shunts power to the Sonata’s battery even when you’re driving.
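The daily figure is simple arithmetic on Hyundai’s annual estimate:

```python
annual_solar_miles = 808  # Hyundai's claimed yearly boost from the roof panel
daily_miles = annual_solar_miles / 365
print(f"about {daily_miles:.1f} miles of travel per day from the sun")
```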

And in October of last year, Hyundai announced that it had been working on different types of solar cells for cars, one of which is designed to juice the battery of a car with an internal-combustion engine, "thereby improving fuel efficiency," the company argues.

Hyundai follows Toyota down the solar-panels-on-a-car road; the Japanese carmaker first offered a Prius with a solar roof around a decade ago, but in 2009, the MIT Technology Review called it "underwhelming." That’s because the solar energy didn’t power the battery that drove the vehicle’s propulsion system; it just "ran a fan to ventilate the car," the publication noted.

Things got better in 2017, when Toyota started offering the Prius PHV with an optional solar roof in Japan. That solar system charges the main battery only when the car is parked, and can add, on average, 1.8 miles to the car’s driving distance each day, with a max of around 3.8 miles. Solar power can’t charge that main battery—called the traction battery—while the car is being driven, however. During that time, the power goes into the 12-volt auxiliary battery, which gives juice to car systems like the radio.

The demonstrator vehicle with solar cells on its hood, roof, and rear. (Toyota)

But earlier this summer, Toyota announced something even better: a blue and white demonstration vehicle with solar cells on the hood, roof, and rear that (unlike the Prius PHV production car in Japan) could charge the main battery while the vehicle is in motion. Not only that, but the solar cells are more efficient (clocking in at 34 percent efficiency or more for the demonstration car, compared to 22.5 percent for the Prius PHV) and they produce nearly five times as much wattage. In short, this demonstrator vehicle offers a more capable solar system than the production car. "The trials aim to assess the effectiveness of improvements in cruising range and fuel efficiency of electrified vehicles equipped with high-efficiency solar batteries," Toyota said in its announcement.
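If a panel’s wattage is treated as the product of cell efficiency and cell area, Toyota’s figures imply that most of the demonstrator’s advantage comes from added surface area (the hood and rear) rather than from better cells alone. This is a back-of-the-envelope estimate under that assumption, not a figure Toyota has published:

```python
prius_phv_eff = 0.225  # Prius PHV production solar roof efficiency
demo_eff = 0.34        # demonstrator cell efficiency ("34 percent or more")
wattage_ratio = 5.0    # "nearly five times as much wattage"

eff_ratio = demo_eff / prius_phv_eff     # gain from better cells alone
area_ratio = wattage_ratio / eff_ratio   # remainder attributable to more area
print(f"~{eff_ratio:.1f}x from efficiency, ~{area_ratio:.1f}x from cell area")
```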

Carmakers and others have much more affordable and more powerful solar panels to work with now than they did a decade ago. "In the last decade, the prices of solar panels have dropped at least 60 percent," says Vikram Aggarwal, the CEO of EnergySage, a company that provides people with financial quotes for installing solar power. Panels are also more energy dense—capable of producing more wattage—and more efficient. In a nutshell: they pack more power and better efficiency while costing less, Aggarwal says.

All of that means that it now makes more sense for a carmaker to throw a solar panel on the back of a car, even if it’s a far cry from powering the whole vehicle. “You can cover every inch of the car’s exterior with solar cells—the total surface area is never going to be that much [of a power source],” Aggarwal observes, meaning that the panels are just going to be supplemental.

But Parth Vaishnav, an assistant research professor of engineering and public policy at Carnegie Mellon University, wonders whether cars themselves are truly the best place to take advantage of solar power. After all, he notes, solar energy produced at the utility scale is much cheaper per watt than solar energy from a residential installation. And putting a solar panel on a car could add cost and complexity for the carmaker, add weight, and make the car more difficult to disassemble at the end of its life. Ultimately, a person interested in driving a car powered by the sun would be better served by plugging an electric vehicle into a power source fed by a large-scale solar installation. "If you wanted to deploy solar power," he reflects, "is the roof of a car the best place to put it?"

via Popular Science – New Technology, Science News, The Future Now https://www.popsci.com

August 23, 2019 at 12:35PM

Netflix test brings human-curated ‘Collections’ to streaming

https://www.engadget.com/2019/08/23/netflix-collections/

Netflix leans on algorithms for virtually all of its show suggestions, but it’s trying something radical: curation from real, honest-to-goodness humans. The service is testing expert-crafted Collections that, much like music playlists, offer selections based around certain themes. You can check out a collection of light-hearted fare if you’re looking for relief from a stressful week, or go for prizewinning titles if you only want critically praised pieces.

The company is only testing the feature in its iOS app so far, and stressed that there’s no guarantee Collections will be widely available. They "may or may not become permanent features," a spokesperson told TechCrunch. Netflix’s disc-based service already has a similar Collections feature, although it clearly doesn’t have the instant gratification of streaming.

It seems like it may be just a matter of time before Collections are more widely available, though. As initial discoverer Jeff Higgins found, Collections are billed as an "easy way" to find shows you’d like. It’s all too common to be overwhelmed by choices on streaming services like Netflix. If this helps you start watching sooner instead of browsing endless automated suggestions, you may be more likely to come back instead of drifting toward other services.

Source: TechCrunch

via Engadget http://www.engadget.com

August 23, 2019 at 02:58PM

Hundreds of “banned” goods still for sale on Amazon, report finds

https://arstechnica.com/?p=1556937

A drone with an Amazon package floats in front of the Amazon logistics center in Leipzig, Germany, 28 October 2014. Amazon did not comment on whether drones will fuel this default one-day speed boost for paying Amazon Prime subscribers’ deliveries.

Amazon is by far the biggest US online retailer. In the past 20 years it has leapt past its origins as a website you could order books from to become, among other things, the everything store—one-stop shopping for all physical and digital goods from A to Z.

The company’s explosive growth is due in part to its sprawling third-party merchant marketplace. Many marketplace merchants are indeed above-board retailers, manufacturers, and resellers. But thousands more sell not only counterfeit items, but also mislabeled, unsafe, recalled, or even banned items that can put consumers—especially children—in serious danger.

The Wall Street Journal identified more than 4,100 such products for sale on Amazon.com during the course of a months-long investigation, and at least 2,000 are toys or medications that fail to include warnings about risks to children.

Among the Journal’s findings: 116 products falsely listed as FDA-approved, including toys, which the agency does not regulate; 80 listings for infant “sleeping wedges” the FDA says can cause suffocation and that Amazon had previously banned; 1,412 electronics listings falsely claiming to be UL-certified; 2,324 toys that failed to include federally mandated choking hazard warnings; and more.

The WSJ commissioned tests of 10 specific children’s products it bought on Amazon, many carrying the enigmatic “Amazon’s Choice” badge. Of those, four failed tests based on federal safety standards, including one that contained excessively high levels of lead.

Balloons also proved to be a sticking point: the WSJ found 4,500 balloon listings that did not include required choking hazard warnings, and the paper notified Amazon about them. Weeks later, WSJ found another 2,200 balloon listings that did not include required warnings. Including all the balloon listings, the WSJ identified 10,870 problematic listings to Amazon, of which 83% were eventually removed or altered.

One mislabeled product the WSJ included in its report proved deadly to its owner. A 23-year-old man in Missouri purchased a motorcycle helmet from Amazon that was listed at the time as certified to meet US Department of Transportation safety standards. Later that year, he was killed in a crash while riding. A federal investigation found that the helmet did not meet DOT standards, and it was recalled. The WSJ, however, found the product still for sale, with an active listing promising compliance, until it contacted Amazon to inquire about it.

Whack-a-Mole

The WSJ’s investigation found 157 products for sale that Amazon has already banned from sale on its site. The motorcycle helmet was one of more than 2,300 product listings altered or pulled after the WSJ drew them to Amazon’s attention. Yet, within two weeks, the WSJ found that at least 130 of these problem items reappeared, “some sold by the same vendors previously identified by the Journal under different listings.”

In short, the Journal writes, Amazon “has increasingly evolved like a flea market,” exercising little to no oversight over items sold by third-party merchants unless a specific complaint or media report draws an item to the company’s attention. The marketplace setup that causes Amazon to land in hot water seemingly annually for selling some kind of pro-rape, pro-slavery, or pro-Nazi apparel also leads to endemic listings for recalled or harmful goods.

About 60% of Amazon’s physical retail sales come from the third-party marketplace, the company has said. A recent quarterly report showed the marketplace generated $11.14 billion in sales for the ecommerce behemoth in just three months. But even consumers wary of shopping from third-party merchants can still easily find themselves purchasing from one. Many of the items in the WSJ’s investigation were fulfilled by Amazon: eligible for Prime shipping, from Amazon warehouses, and in Amazon boxes. Two different shoppers told the WSJ they had assumed harmful or mislabeled products they bought from the site were reviewed and approved in some way by Amazon, as they would be in a big-box store such as Target or Walmart, until contacted by the WSJ.

Counterfeit products, which are often less safe than their “real deal” counterparts, are also a persistent plague to Amazon and its customers. The WSJ did not test counterfeit goods, which show up in searches for every kind of product, from luxury goods to cheap USB cables.

When reached by Ars for comment, an Amazon representative directed us to a company blog post that says, in part, “We provide a number of ways for regulatory agencies, industry organizations, brands, customers, and our customer service teams to report safety issues. When we receive these reports, we move quickly to protect customers, remove unsafe products from our store, and investigate.”

Limited recourse

Swimming against the tide of dodgy listings and questionable goods is challenging for even the best-educated consumer, and prevailing in court if you do end up with a damaging product is, at best, a hit-or-miss exercise.

The family of the young man who died in the motorcycle crash sued Amazon, as well as the driver of the vehicle with which he had collided and the third-party merchant that sold the helmet. The merchant was ordered to pay $1.9 million in restitution; Amazon settled for $5,000 without admitting wrongdoing. A company attorney told the WSJ, “Basically, a third party was using Amazon as a bulletin board to advertise the product and sell.”

Your ability to sue Amazon if you are injured by a third party’s product largely depends on where you live. A federal appeals court in Philadelphia ruled in July that Amazon could be held liable under state law for a defective product that blinded a Pennsylvania woman.

The Third Circuit court, however, was the first to do so. The Fourth and Sixth Circuit Courts of Appeals ruled in May and June, respectively, that Amazon is merely a platform, not a “seller,” when it comes to state consumer protection law.

via Ars Technica https://arstechnica.com

August 23, 2019 at 03:10PM