Putting CO2 to use: 10 finalists named for Carbon XPrize

https://ift.tt/2GRb3dg

On Monday, the XPrize organization announced that it had selected 10 finalists for its NRG COSIA Carbon Competition. These finalists will be given space near a power plant and pipes that will deliver some of the plant’s carbon-dioxide-rich exhaust. It’s up to the competitors to turn that carbon dioxide into marketable products.

For the finalists, those products range from concrete to carbon nanotubes. To get a better overview of the technologies and the competition itself, we talked with Marcius Extavour, the XPrize’s senior director of energy and resources.

Capture, no storage

The world remains committed to fossil fuels, despite our increasing knowledge of the risks they pose. These risks have raised interest in the idea of carbon capture and storage. Rather than shut down our fossil-fuel-burning hardware and all the infrastructure that feeds it, we simply remove the carbon dioxide from the plant’s exhaust, either placing it in long-term storage or reacting it with rocks to lock it away indefinitely.

But capture takes energy, cutting into a plant’s output and adding cost. “Carbon pollution is free today, and this is an expensive technology,” Extavour said. “That’s a fundamental challenge.”

As a result, there have been only a few small-scale tests of carbon capture and storage, and the few plans to expand to full-scale facilities have ended up cancelled. Viewed in that light, it could be difficult to understand what the XPrize hopes to accomplish here.

The answer is quite simple: they’re not doing carbon capture and storage. They’re doing carbon capture and conversion—conversion into products that there’s a market for. The challengers all have processes that can use carbon dioxide as a feedstock. The Carbon Competition is their opportunity to see which of these can scale.

Deployment

For the 10 finalists, the announcement signaled the start of a couple of years of hard work. “We have about another two years of runway for the finalists to scale up by 10x with respect to what they’ve already done,” Extavour told Ars. “A year to build up and test and develop, and another nine months to a year to actually run on site and collect data.”

In this case, “on site” means one of two locations: a natural gas plant in Alberta, Canada or a coal-fired power plant in Wyoming.

Getting the plants’ operators on board was one of the challenges faced by the XPrize itself, Extavour said: “There was a bit of hesitancy at first—we don’t see that type of innovation in the energy industry.”

The finalists being announced.

Prize

The exhaust streams differ in terms of the additional gases present, which could impact any processes that involve catalysts. There’s also a big difference in CO2 concentrations, with the natural gas plant exhaust carrying about five percent CO2, and the coal plant 12 percent. For those teams where this mattered, the XPrize tried to locate them at the appropriate site. But for many of them, the location didn’t matter much. “They need to purify it up to 90 [percent] anyway,” Extavour said.

The projects will be judged on three sets of criteria. The first relates to the competition’s primary goal: what percentage of the carbon dioxide sent through the system ends up in some form of product. Related to that, each process should fix more carbon than is released in powering it, resulting in a net reduction in emissions. Another set of criteria focuses on energy and material efficiency. “How expensive are your catalysts? How much electricity does it cost? How much heat do you need?” Extavour asked. “The teams are competing to minimize the cost and use of materials and energy.”
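
Those first two criteria boil down to simple bookkeeping: the fraction of input CO2 fixed in product, and the carbon balance once process emissions are counted. A minimal sketch, with made-up numbers purely for illustration:

```python
# Hypothetical illustration of the prize's first criteria. All quantities
# below are invented for the example; they are not competition data.

def conversion_fraction(co2_in_tonnes, co2_in_product_tonnes):
    """Share of the input CO2 that ends up in some form of product."""
    return co2_in_product_tonnes / co2_in_tonnes

def net_co2_reduction(co2_in_product_tonnes, co2_emitted_by_process_tonnes):
    """Positive means the process locks away more CO2 than running it emits."""
    return co2_in_product_tonnes - co2_emitted_by_process_tonnes

captured = conversion_fraction(100.0, 80.0)  # 80% of the feed converted
net = net_co2_reduction(80.0, 30.0)          # powering the process emitted 30 t

print(f"conversion: {captured:.0%}, net reduction: {net:.0f} t CO2")
```

A process that converts most of its feed but emits more CO2 in powering itself than it fixes would score well on the first number and fail the second, which is why both are judged.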

Also in this category are any land use and resource issues, like water. Both of these, Extavour suggested, may be why there’s only one team that is focused on feeding the carbon dioxide to an organism that would incorporate it into useful molecules. While things like that can be done with photosynthetic algae, it requires a lot of space for growth ponds, as well as significant amounts of water.

The final set of criteria are economic. “It’s about transforming the carbon molecule into something useful,” Extavour told Ars. “Another way of describing useful is valuable or revenue generating.”

But there’s not one path to success on economic terms. One of the teams hopes to produce carbon nanotubes; although their market is small, they command a high price premium. At the other end, a couple of teams are focusing on concrete, where low prices are traded off against an enormous market.

Thermodynamics

Right now, the only use for captured carbon dioxide we’ve come up with is in oil extraction, where it can be pumped underground to force crude to the surface. The projects being pursued here all transform the carbon dioxide chemically. And that means almost all of them run into a big thermodynamic challenge, because carbon dioxide is an extremely stable molecule.

The one exception is carbonates, chemicals that typically involve a metal complexed with the negatively charged CO3 ion. These are energetically favorable compared to carbon dioxide and can find some uses in building materials. But most of the projects involve some sort of energy input to break the carbon-oxygen bonds. “It doesn’t matter where you get the energy from,” Extavour said, “as long as it’s low carbon if you’re trying to lower carbon emissions.”
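
For concreteness, the carbonate route can be written as a simple mineralization reaction; the calcium case is shown here, with an approximate standard-state enthalpy:

```latex
\mathrm{CaO} + \mathrm{CO_2} \;\rightarrow\; \mathrm{CaCO_3},
\qquad \Delta H^\circ \approx -178~\mathrm{kJ/mol}
```

The negative enthalpy is why carbonate products need no net energy input, while processes that instead break CO2’s carbon-oxygen bonds run uphill and must be driven.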

But he said the plunge in price of renewable power has shifted the economics, and further developments there could open up additional possibilities.

Marcius Extavour.

In some cases, the teams explicitly mention using solar power to provide the energy to break down carbon dioxide or to supply the hydrogen to react it with. In other cases, it’s part of a process that we’re already injecting energy into—like making carbon nanotubes, battery components, or plastics.

Ultimately, price and efficiency will be the critical determinants here. Otherwise, it will remain more economical to make our plastics from fossil fuels, using processes that demand less energy. But things like carbon taxes or emissions trading could also tilt things in favor of using carbon dioxide; Extavour spoke of situations where “Energetically, you’re losing, but if you’re focusing on minimizing carbon emissions, you might be winning.”

Why a competition?

If the challenge is a mix of thermodynamics and economics, why would a competition be necessary? For that, Extavour had a number of answers. One is that it could help overcome the (quite reasonable) conservatism of utilities. “I’m familiar with the mandate of ‘do not change anything, do not experiment with anything, do not let the lights flicker,'” he told Ars.

With large-scale demonstration projects, the XPrize could help demonstrate that the technology doesn’t interfere with the primary purpose of these power plants.

There’s also a catch-22 in operation here. These technologies need to be able to scale in order to make any dent in our carbon emissions. But, as Extavour notes, “the free market would never build a test center for the purpose of testing a new technology at industrial scale before the market was mature.”

The XPrize could provide a way out of this catch-22.

Beyond those practical concerns, Extavour sounded a bit like the Solar Impulse team in talking about their round-the-world trip in a solar-powered aircraft—terms like “moonshot” and “inspirational” peppered the conversation. Rather than having people listen to news about carbon capture plans that never get off the ground, “we’re trying to orient people’s minds to think ‘hey, this is possible,'” he told Ars. Two years from now, when the data is collected and analyzed, we’ll have a much better sense of what’s possible.

Tech

via Ars Technica https://arstechnica.com

April 11, 2018 at 10:40AM

Rocket Lab is about to win the small satellite launch space race

https://ift.tt/2JHqBy4

In January, Rocket Lab reached orbit for the first time with the second launch of its Electron vehicle.

Rocket Lab

Life is pretty good for Rocket Lab and its founder Peter Beck right now. With two test flights of its Electron rocket completed in the last 10.5 months, the company says it will move into commercial operations later this month. The 14-day launch window for the “It’s Business Time” mission, carrying two private payloads, opens on April 20.

In an interview, Beck said Rocket Lab hopes to fly eight missions in 2018 and reach a monthly launch cadence by the end of the year. The company’s initial test flight in May 2017 failed to reach orbit, but a second flight in January of this year was almost entirely successful. Rocket Lab will become the first of a number of small-satellite launch companies to begin serving customers.

Tech

via Ars Technica https://arstechnica.com

April 12, 2018 at 09:31AM

A crummy drop-down menu appeared to kill dozens of mothers in Texas

https://ift.tt/2HtfNCQ

In the US, the rate of women dying from pregnancy and childbirth is higher than in any other developed country—much, much higher. And we’re bucking the global trend of improving the situation. While the rest of the world largely saw its maternal mortality rates drop by more than a third between 2000 and 2015, the US was one of the few countries that seemed to experience increases in the rate of women dying from pregnancy-related causes.

The state of maternal health in the US is so grim that researchers can’t even get quality data on the deaths. In fact, the country has not published an official maternal mortality rate since 2007 due to the lack of accurate data from individual states. In 2016, a group of researchers didn’t mince words about the situation: “It is an international embarrassment that the United States, since 2007, has not been able to provide a national maternal mortality rate to international data repositories,” the researchers concluded in a study published in the journal Obstetrics & Gynecology.

Now, a new study in the same journal goes further to highlight just how bad the state of maternal health data is in the US. The study links a dramatic rise in maternal deaths in Texas to errors from a poorly designed drop-down menu in the state’s electronic death records system. While the discovery drags down the state’s stratospheric maternal mortality rate, the corrected numbers are still extremely high for a developed country. Moreover, having to make these types of corrections squanders precious resources, experts note.

In an accompanying editorial, maternal health experts wrote that the maternal mortality review committee responsible for the discovery—and others like it—are “spending too much time simply identifying cases and eliminating false positives.” Instead, those groups need to “get back to doing the job they were designed to do—investigating and preventing maternal deaths.”

Labored statistics

The new study on Texas’ data is a direct response to the 2016 study, which was led by Marian MacDorman of the Maryland Population Research Center at the University of Maryland. In it, MacDorman and colleagues tried to amass the ragtag data from individual states to come up with a country-wide estimate of maternal mortality rates.

They determined that between 2000 and 2014, the country’s maternal mortality rate (maternal deaths per 100,000 live births) went from roughly 18.8 to 23.8—a 26.6 percent increase. For comparison, that would put the US second to last out of 31 countries that reported maternal mortality rates to the Organization for Economic Cooperation and Development (OECD). We would be above only Mexico, while Italy reported a rate of 1.2, Spain reported 2.1, Japan reported 3.3, and the UK reported 6.7 in 2014.

Those numbers are a bit low compared with other estimates, which can differ depending on methodology and timeframes for death after pregnancy (from 42 days to 18 months). For instance, a study published in The Lancet (PDF) later in 2016 came up with higher estimates overall, but the US still fared terribly. In that study, the US maternal mortality rate for 2015 was 26.4, while Italy, Spain, Japan, and the UK had rates of 4.2, 5.6, 6.4, and 9.2, respectively. Similarly, the Institute of Health Metrics and Evaluation—part of the University of Washington—pegged the US’s 2015 rate at 29.4, while Italy, Spain, Japan, and the UK had rates of 3.9, 5.0, 5.9, and 6.8, respectively.

Part of the reason the US estimate calculated by MacDorman and her colleagues in 2016 may have been a bit different was that the researchers excluded data from California—which saw a unique decrease in maternal mortality rates—and Texas, which saw a “puzzling” and dramatic increase between 2010 and 2012. Texas’ rate went from an abysmal 18.6 in 2010 to a jaw-dropping 38.4 in 2012, a leap the authors struggled to understand.

“In the absence of war, natural disaster, or severe economic upheaval, the doubling of a mortality rate within a two-year period in a state with almost 400,000 annual births seems unlikely,” the authors concluded.

The new study explains what happened—user error.

Birthing defects

The new study was authored by the Texas Maternal Mortality and Morbidity Task Force, a type of review committee that several states have set up in recent years to try to understand the country’s high maternal death rates. These committees can help shape up state stats, but as an investigation by ProPublica pointed out last year, they have trouble making progress due to a lack of resources. A third of the states that have such committees have no budget for them, and they rely solely on volunteer efforts from health practitioners who focus on maternal health.

The Texas committee set out to try to understand the extremely high maternal death rate in just 2012—at which point the rate had doubled from 2010. There were 147 maternal deaths in Texas during 2012, based on medical codes on death records that indicated maternal death while pregnant or within 42 days postpartum. The researchers looked into those 147 cases, plus medical records of all women’s deaths during the year, to see if any maternal deaths were missed.

The committee tried to match up women’s death records with birth or fetal death records as well as individual medical records—which could show prenatal care or not—plus any information from an autopsy or death certifier that would indicate pregnancy-related death.

With this method, they found that only 47 of the 147 deaths were clearly linked to pregnancy. Of the other 100, 74 had zero evidence of pregnancy in their medical records, 15 had insufficient medical records to make a call, and 11 deaths occurred beyond the 42-day window after pregnancy to count for this study.

Looking through the death records of all women, the committee found nine other maternal deaths that were previously missed, bringing the 2012 total to 56 confirmed maternal deaths. That works out to a maternal mortality rate of 14.6 deaths per 100,000 live births. That’s still very high, but it’s significantly lower than the 38.4 reported before the corrections. If the researchers add in the 15 deaths that didn’t have enough medical data to make a call, the rate jumps to 18.6.
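
The arithmetic behind those rates is straightforward: deaths per 100,000 live births. A quick sketch reproducing the article’s Texas numbers, using a birth count inferred from the reported rates (the article says only “almost 400,000 annual births,” so treat it as an approximation):

```python
# Maternal mortality rate = deaths per 100,000 live births.
# The ~383,000 figure is back-calculated from the reported rates,
# not an official count.

def mortality_rate(deaths, live_births):
    return deaths / live_births * 100_000

LIVE_BIRTHS_TX_2012 = 383_000  # approximate

print(round(mortality_rate(147, LIVE_BIRTHS_TX_2012), 1))      # as originally recorded (~38.4)
print(round(mortality_rate(56, LIVE_BIRTHS_TX_2012), 1))       # confirmed maternal deaths (~14.6)
print(round(mortality_rate(56 + 15, LIVE_BIRTHS_TX_2012), 1))  # plus ambiguous cases (~18.6)
```

Small differences in the last digit come from rounding in the published rates, not from the method.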

Nursing numbers

The researchers note a possible explanation for why many of the 74 women had obstetric codes without any other evidence of pregnancy. Fifty-six of the 74 (76 percent) had a listing of “pregnant at the time of death” on their death record. This label is listed directly below the “not pregnant within the past year” option on a drop-down menu in Texas’ electronic death-registration system. Health professionals filling out the record may have simply mis-clicked. To add to that concern: between 2010 and 2012, use of electronic death records rose from 63 percent to 91 percent. The authors suggest that the system should have separate boxes for pregnancy status—as well as more training for personnel—to avoid the potential click-fail in the future.
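
One cheap software-side guard in the spirit of the authors’ suggestion is to flag any record that claims pregnancy but has no corroborating birth or fetal-death record, forcing human review before the death is coded as maternal. This is a hypothetical sketch; the field and status names are invented, not Texas’ actual schema:

```python
# Hypothetical cross-check for an electronic death-registration system.
# Field names and status values are illustrative only.

PREGNANCY_CHOICES = {
    "not_pregnant_past_year",
    "pregnant_at_death",
    "pregnant_within_42_days",
    "pregnant_within_year",
    "unknown",
}

def needs_review(death_record, matched_birth_record):
    """Flag deaths coded as pregnancy-related that lack a matching
    birth or fetal-death record, so a human verifies before coding."""
    status = death_record["pregnancy_status"]
    if status not in PREGNANCY_CHOICES:
        raise ValueError(f"unknown pregnancy status: {status}")
    claims_pregnancy = status.startswith("pregnant")
    return claims_pregnancy and matched_birth_record is None

print(needs_review({"pregnancy_status": "pregnant_at_death"}, None))       # flag for review
print(needs_review({"pregnancy_status": "not_pregnant_past_year"}, None))  # no flag
```

A check like this would have caught the 74 records with obstetric codes but no other evidence of pregnancy before they inflated the state’s rate.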

The accompanying editorial response—penned by MacDorman and colleagues—says that “the work of the Texas maternal mortality review committee is laudable, necessary, and entirely appropriate.” But they note that it’s just one year’s worth of data and doesn’t explain the higher rate overall or the recent trend upward.

The solution, they say, is that:

We need to improve data quality in the National Vital Statistics System by working at the national, state, and local levels to better train physicians, medical examiners, and coroners on the importance of the pregnancy checkbox and completing the cause-of-death section; enhance query systems and internal consistency checks in real-time; and review cause-of-death coding procedures in relation to maternal deaths.

All the experts agree that such information and systems are critical to preventing pregnant women and mothers from dying needless deaths from pregnancy-related issues and complications, such as hemorrhaging and infections. A recent investigation by Vox found that California reversed its trend—dropping its maternal mortality rate from 21.5 in 2003 to 15.1 in 2014—by doing simple things like providing a “toolkit” for hospitals on safe births and having “hemorrhage carts” at hand that are packed with everything a medical team needs to handle hemorrhaging. During childbirth, a woman can bleed to death in as little as five minutes, Vox notes.

And additional data can help address the massive racial disparities in maternal mortality rates. For instance, even after the Texas death rates were corrected, the maternal mortality rate for black women was still a stunning 27.8. And older mothers, too, face grave statistics. Those over 35 years of age had a mortality rate of 32.2 in Texas after corrections.

As MacDorman and colleagues concluded in their 2016 study: “There is a need to redouble efforts to prevent maternal deaths and improve maternity care for the 4 million US women giving birth each year.”

Obstetrics & Gynecology, 2018. DOI: 10.1097/AOG.0000000000002565  (About DOIs).

Tech

via Ars Technica https://arstechnica.com

April 12, 2018 at 06:34AM

A LEGO “Roomba” That Picks Up LEGO Pieces From the Floor [Video]

https://ift.tt/2qosG96

From The Brick Wall:

There is one part of my life where I welcome ROBOTS – cleaning the floor. Here is my story. For years my Mom says “Please clean the floor from the Lego parts.” Now, my answer is “Lego will do it!”

[The Brick Wall]

The post A LEGO “Roomba” That Picks Up LEGO Pieces From the Floor [Video] appeared first on Geeks are Sexy Technology News.

Tech

via [Geeks Are Sexy] Technology News https://ift.tt/23BIq6h

April 11, 2018 at 03:02PM

Bank of America will stop lending to makers of ‘military-style firearms’

https://ift.tt/2qoCwba

Bank of America plans to stop lending to manufacturers of “military-style firearms” used by civilians, an executive told Bloomberg.

“We want to continue in any way we can to reduce these mass shootings,” Anne Finucane, vice chairman of Bank of America (BAC), said in an interview. “It is our intention not to finance these military-style firearms for civilian use.”

Finucane said the bank does “have a few manufacturers of military-style firearms.”

“We’re in discussion with them. We have let them know that it’s not our intent to underwrite or finance military-style firearms on a go-forward basis,” she said.

Finucane was referring to the AR-15-style semiautomatic rifles produced by gun companies like Remington Outdoor Company. Remington made the Bushmaster used in a 2012 mass shooting that killed 26 children and educators at an elementary school in Newtown, Connecticut.

Bank of America is listed as a creditor in Remington’s Chapter 11 filing. A spokesman for Bank of America would not confirm or deny reports to CNNMoney that the bank also lends to Sturm Ruger (RGR) and Vista Outdoor (VSTO), which also make AR-15-style rifles. Remington, Ruger and Vista did not return messages from CNNMoney.

Everytown for Gun Safety, a gun control group, applauded the move.

“When the second-largest bank in the US takes concrete steps to prevent gun violence, it sends a clear message to the entire industry: It’s time for every financial institution to do their part,” said Everytown president John Feinblatt, in a statement.

But the gun industry group, the National Shooting Sports Foundation, objected to Bank of America’s characterization of the guns as “military” since they’re products for civilians.

“We as an industry would welcome the opportunity to sit down with Bank of America executives to explain our industry’s perspective and to discuss what really would work to keep firearms out of the hands of those who should not have them,” said NSSF spokesman Michael Bazinet, in a statement.

Related: Why the AR-15 is the mass shooter’s go-to weapon

In February, Bank of America said it plans to “engage the limited number of clients we have that manufacture assault weapons for non-military use to understand what they can contribute to this shared responsibility.”

This is a rising trend on Wall Street, which has been losing its taste for gun manufacturers since a mass shooting in February, when an AR-15-style Smith & Wesson rifle made by American Outdoor Brands was used to kill 17 students and educators at a high school in Parkland, Florida.

Last month, Citigroup (C) said it will bar companies that it does business with from selling guns to people under the age of 21 and require customers to undergo background checks for all firearm purchases.

Money manager BlackRock (BLK), the largest shareholder in both Sturm Ruger and American Outdoor Brands, said last month that it will start offering clients the option to invest in funds that exclude firearms and retailers that sell them.

State Street (STT), a top shareholder in Sturm Ruger, American Outdoor Brands and Vista Outdoor, has said that it’s contacted the companies to see how they “will support the safe and responsible use of their products.” Blackstone, another investment firm, reached out to a dozen hedge fund managers asking for information about their stakes in gun manufacturers and distributors.

Related: Remington, one of the oldest gun makers in America, files Chapter 11

Bank of America’s Finucane stopped short of saying the bank was going to stop lending to gun retailers any time soon, saying that “gets into civil liberties and Second Amendment.”

Dick’s Sporting Goods (DKS) Chief Executive Officer Edward Stack said in February, days after the Parkland shooting, that his stores would stop selling “assault-type rifles” and high capacity magazines, and won’t sell any gun to anyone under 21.

Walmart (WMT), which stopped selling military-style semiautomatic rifles in 2015, also recently raised its gun buying age to 21, along with L.L. Bean and the grocery chain Kroger (KR), which sells guns through its Fred Meyer stores.

CNN’s Danielle Wiener-Bronner contributed to this report.

News

via Business and financial news – CNNMoney.com https://ift.tt/UU2JWz

April 11, 2018 at 12:41PM

FTC makes clear ‘warranty void if removed’ stickers are illegal

https://ift.tt/2qnlp9z

Those stickers on gadgets that say you’ll void your warranty if they’re removed? You’ve probably come to expect them whenever you purchase a new device. The FTC has just made clear, however, that those warranty notices are illegal, firing off warning letters to six companies that market and sell automobiles, mobile devices, and video game consoles in the US. It didn’t name the automakers and tech corporations, but since the list includes companies that make video game consoles, Sony and Microsoft could be two of them.

Under the 1975 Magnuson-Moss Warranty Act, which the commission cited in its letter, companies can’t put repair restrictions on their products unless they provide the parts or services for free or receive a waiver from the FTC. Thomas B. Pahl, Acting Director of the FTC’s Bureau of Consumer Protection, said in a statement:

"Provisions that tie warranty coverage to the use of particular products or services harm both consumers who pay more for them as well as the small businesses who offer competing products and services."

Since warranty stickers are a common sight on popular consumer electronics, like say, the PS4 and various phones, it was pretty unclear whether the law covers products much cheaper than cars. As Motherboard noted, though, the letters made it crystal clear that it also covers electronic devices, so long as they cost more than $15.

The FTC asked the six companies to review their warranty notices and make sure that they don’t "state or imply that warranty coverage is conditioned on the use of specific parts or services." It will then review the companies’ websites after 30 days, warning the letters’ recipients that "failure to correct any potential violations may result in law enforcement action."

Via: Motherboard

Source: FTC

Tech

via Engadget http://www.engadget.com

April 11, 2018 at 02:06AM

A Long-Awaited IoT Crisis Is Here, and Many Devices Aren’t Ready

https://ift.tt/2IB0K9m

You know by now that Internet of Things devices like your router are often vulnerable to attack, the industry-wide lack of investment in security leaving the door open to a host of abuses. Worse still, known weaknesses and flaws can hang around for years after their initial discovery. Even decades. And on Monday, the content and web services firm Akamai published new findings that it has observed attackers actively exploiting a flaw in devices like routers and video game consoles that was originally exposed in 2006.

Over the last decade, reports have increasingly detailed the flaws and vulnerabilities that can plague insecure implementations of a set of networking protocols called Universal Plug and Play. But where these possibilities were largely academic before, Akamai found evidence that attackers are actively exploiting these weaknesses not to attack the devices themselves, but as a jumping off point for all sorts of malicious behavior, which could include DDoS attacks, malware distribution, spamming/phishing/account takeovers, click fraud, and credit card theft.

To pull that off, hackers are using UPnP weaknesses in commercial routers and other devices to reroute their traffic over and over again until it’s nearly impossible to trace. This creates elaborate “proxy” chains that cover an attacker’s tracks, and create what Akamai calls “multi-purpose proxy botnets.”

“We started talking about how many of these vulnerable devices are out there and what can they be leveraged for, because most people seem to have forgotten about this vulnerability,” says Chad Seaman, a senior engineer on the security intelligence response team at Akamai. “As part of that we had to write some basic tools to find what was vulnerable. And some of these machines did have very abnormal [activity] on them. It was not something that we honestly expected to find and when we did it was kind of like ‘uh oh.’ So this theorized problem is actually being abused by somebody.”

Down With UPnP

UPnP helps devices on a network find and essentially introduce themselves to each other, so that a server, say, can discover and vet the printers on a network. You can find it both on internal, institutional networks and on the larger internet, handling things like IP address routing and data flow coordination. UPnP works with and incorporates other network protocols to negotiate and automatically configure these network communications, and it can be used when applications want to send each other large quantities of data to facilitate a sort of unrestricted firehose—think video streaming, or a gaming console talking to its web server.

‘This theorized problem is actually being abused by somebody.’

Chad Seaman, Akamai

When IoT devices expose too many of these mechanisms to the open internet without requiring authentication—or when credential checks are easily guessable or can be brute forced—attackers can then scan for devices that have implemented a few of these protocols badly all in one device, and then exploit this series of manufacturer missteps to launch an attack.

That’s also how the Akamai researchers found the malicious UPnP proxy schemes. Akamai says it found 4.8 million devices on the open internet that would improperly return a certain query related to UPnP. Of those, about 765,000 also had a secondary implementation issue that created a bigger network communication vulnerability. And then on more than 65,000 of those, Akamai saw evidence that attackers had exploited the other weaknesses to inject one or more malicious commands into the router mechanism that controls traffic flow. Those final 65,000 devices were grouped together in various ways and ultimately pointed to 17,599 unique IP addresses for attackers to bounce traffic around to mask their movements.
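
The query Akamai probed for is SSDP discovery, the UDP side of UPnP: a device that answers an M-SEARCH request from the open internet is already misconfigured, because SSDP is meant for the local network only. This sketch just builds the standard discovery datagram; actually sending it to hosts you don’t own amounts to probing, so the network step is deliberately left out:

```python
# Build a standard SSDP M-SEARCH request, per the UPnP Device Architecture.
# Constructing the datagram is harmless; spraying it at the open internet
# is how vulnerable devices like those Akamai counted get enumerated.

SSDP_ADDR = "239.255.255.250"  # multicast group reserved for SSDP
SSDP_PORT = 1900

def build_msearch(search_target="upnp:rootdevice", mx=2):
    """Assemble the M-SEARCH discovery request as raw bytes."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",             # seconds a device may wait before replying
        f"ST: {search_target}",  # which device/service type should answer
        "", "",                  # blank line terminates the request
    ]
    return "\r\n".join(lines).encode("ascii")

print(build_msearch().decode("ascii").splitlines()[0])  # M-SEARCH * HTTP/1.1
```

On a home LAN this request would be sent over UDP to the multicast address above; any device whose responses are reachable from the public internet instead is exactly the kind of exposure Akamai measured.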

Uptick in Attacks

Though rarely observed until recently, UPnP attacks have been around for a while. Last month, for example, Symantec published evidence that an espionage group it tracks known as Inception Framework uses UPnP proxying to compromise routers and obscure its cloud communications. But observers note that the strategy is probably not more common because the schemes are difficult to set up.

“In particular it’s annoying to build these attacks against hundreds of personal routers, and testing these attacks is hard too,” says Dave Aitel, who runs the penetration testing firm Immunity. “I’ve not seen it in the wild. That said, a working version would get you significant access.” He notes, though, that data leaks stemming from implementation mistakes, like the ones Akamai detected, make it easier for attackers to craft their attacks. For the manufacturers who developed vulnerable devices? “It falls under the ‘WTF were they thinking’ category,” Aitel notes.

Notably, the Akamai researchers saw evidence that UPnP proxying isn’t just being used for malicious activity. It also seems to be part of efforts to skirt censorship schemes in countries like China to gain unfettered web access. Even when a user is behind the Great Firewall, they can use a proxy network built on exposed devices to query web servers that would normally be blocked. Akamai’s Seaman notes that the group approached publishing its research carefully, since plugging these holes will limit people’s ability to exploit them for access to information. Ultimately, though, they concluded that the risks must be addressed, especially given how long the vulnerabilities have been known for.

‘It falls under the “WTF were they thinking” category.’

Dave Aitel, Immunity

Users won’t realize if their devices are being exploited for UPnP proxy attacks, and there is little they can do to defend themselves if they have a vulnerable device besides getting a new one. Some devices will allow users to disable UPnP, but that can lead to functionality issues. Though more and more devices have improved their UPnP implementations over the years to avoid these exposures, Akamai found 73 brands and almost 400 IoT models that are vulnerable in some way. The United States Computer Emergency Readiness Team, which tracks and warns about vulnerabilities, wrote in a note to impacted brands that, “CERT/CC has been notified by Akamai that a large number of devices remain vulnerable to malicious NAT injections. …This vulnerable behavior is a known problem.”

The whole point of proxying is to cover your tracks, so a lot is still unknown about how attackers use UPnP proxying and for what. But Akamai’s goal is to raise awareness about the problem to ultimately reduce the number of vulnerable devices that exist. “It was one of those things where it was like, this would be bad and it could be used for these attacks, but no one ever actually found it being used for that,” Akamai’s Seaman says. Now that it has been, hopefully manufacturers will finally do something about it.

Internet of Threats

Tech

via Wired Top Stories https://ift.tt/2uc60ci

April 9, 2018 at 01:12PM