Astronomers Watched a Black Hole Gobble a Star


(Credit: NASA/Swift/Aurore Simonnet, Sonoma State University)

We can’t — yet — directly see black holes, which makes finding one of these elusive beasts hard, especially since the great majority of them are dormant. But researchers at the University of Maryland, NASA Goddard, and the University of Michigan recently caught one of these sleeping giants waking up to slurp on a big snack: a passing star.

A Star for Dinner

Called Swift J1644+57, the black hole lies about 3.8 billion light-years away at the center of a relatively quiet galaxy. The supermassive black hole was initially spotted in 2011, when a passing star woke the hungry giant. The black hole, itself invisible, shredded the star’s material into an accretion disk as it feasted, giving researchers a window into its activity. The new study, published today in Nature, outlines a phenomenon discerned from the event for the first time: so-called X-ray reverberation, in which X-ray light is seen bouncing around the disk of material waiting to be eaten.

“The basic idea is that there are primary flashes of X-ray emission, and we see that directly but it also gets reflected off the walls of the accretion disk,” Erin Kara, lead author of the paper, says.

By analyzing these reverberations in the accretion disk during one of these “tidal disruption events,” astronomers can better discern the geometry of the black hole.

Rare Glimpse

Such events should be common, but they’re hard to spot. Most black hole regions emit energy in the X-ray spectrum, and to see X-rays, you need a space-based observatory like NASA’s Swift or the European Space Agency’s XMM-Newton. Researchers in 2011 got a sort of “early warning” of the event and were able to monitor it for 200 days.

“We caught it fairly early on in its lifetime,” Kara says. “It was long enough that it could form an accretion disk around this normally dormant black hole.”

The time delay in the emission lines gives a rough idea of not only what material is present, but what shape the accretion disk has assumed. Additional data on size is inferred through redshift and blueshift present. From there, the researchers build a more complete picture of not only the black hole, but its snack.

“In X-rays, we can’t image the innermost region around the black hole directly, so we really need to use these other techniques to infer what it must look like around the black hole,” Kara says.
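The core of the technique is light-travel-time arithmetic: reflected X-rays arrive later than the direct flash because they travel an extra path, so the measured lag sets a size scale for the reflecting region. A minimal sketch of that arithmetic (the 100-second lag below is an invented illustration, not a value from the study):

```python
# Light-echo arithmetic behind X-ray reverberation mapping.
# The extra path length of the reflected flash is roughly c * lag.
C_M_PER_S = 299_792_458.0  # speed of light

def echo_size_km(lag_seconds: float) -> float:
    """Rough size scale (km) implied by a reverberation lag."""
    return C_M_PER_S * lag_seconds / 1000.0

# An illustrative 100-second lag implies a region ~30 million km across.
print(f"{echo_size_km(100):.2e} km")
```

Longer lags imply reflection from more distant parts of the disk, which is why the delays constrain the geometry of the region around the black hole.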

The Swift Gamma-Ray Burst Mission has been NASA’s main workhorse in this effort. But in its 12 years in orbit, it’s only seen three tidal disruption events.

“It’s a very rare thing in the hard X-rays, and theory predicts that we should see more of them, but we are still waiting since 2011,” Kara says.

Future Telescopes Offer Better Prospects

With a continuous eye on the sky watching the cosmos in X-rays, that could change. Kara says that future, more powerful telescopes may be able to find optical signatures of tidal disruption events too faint to capture otherwise. An all-sky monitor like the Large Synoptic Survey Telescope may catch a few such events, and a proposed observatory called the Lobster Transient X-Ray Detector may find a few more if it becomes part of the International Space Station.

“We don’t know where these tidal disruption events are going to happen, so you need to look everywhere,” she says.

from Discover Main Feed

Remains of the Day: Opera Takes Aim at Microsoft’s Battery Life Claims

Opera has offered a rebuttal to Microsoft’s claim that its Edge browser is your best bet for extending your laptop’s battery life. Shocker: Opera says its own battery-saving mode is the true battery savior. Hey, sure, I’ll go with anything that lets me watch ten hours of Netflix in one sitting.

from Lifehacker

Computer simulations point to the source of gravitational waves

Since the first gravitational waves were successfully detected last September by the earthbound Laser Interferometer Gravitational-Wave Observatory (LIGO), scientists have wondered what made them. Today, researchers from the University of Warsaw published a theory suggesting that they were likely created by the collision of two black holes, which had been stars that formed 12 billion years ago.

They arrived at that cause and date by modeling the universe from its birth. The researchers populated the computer simulation Synthetic Universe with stars and data and ran it until they encountered an event that could have emitted the actual gravitational waves. In this case, the culprit was likely a relatively recent merger of two black holes that formed soon after the dawn of the universe.

"We play God," lead study author Chris Belczynski, an astrophysicist at Warsaw University, told The Verge. "We have a model of the entire Universe in our computer. We populate the computer with stars from the beginning, from the Big Bang, and you let them go ahead, evolve, produce black holes, etc."

Since Synthetic Universe’s simulation also includes a mock LIGO, synced chronologically to when we detected the waves, the model is also predictive, the study argues. If correct, LIGO should record up to 60 detections when it starts "listening" for the waves again this fall. At peak sensitivity, it could register up to 1,000 detections annually.

Belczynski’s model specifies the size of black hole mergers that LIGO should be able to detect from gravitational waves: a combined mass between 20 and 80 times that of our sun. That large size indicates they likely formed just after the Big Bang, when stars had lower metal content and collapsed into proportionately larger black holes.

Belczynski’s model strongly suggests that the objects that collided to make these gravitational waves began as stars that formed 12 billion years ago, became black holes 5 million years later, and merged 10.3 billion years after that. LIGO detected those reverberations in space-time 1.2 billion years later. As more data on these gravitational waves comes in to LIGO and other detectors, Belczynski can further refine his Synthetic Universe model and theorize about the life cycles of stars.
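The timeline above is simple addition and can be sanity-checked. A trivial sketch, using only the values quoted in this article (in billions of years); the roughly half-billion-year remainder reflects rounding in the reported figures:

```python
# Sanity check of the quoted timeline (units: billions of years).
star_formation = 12.0    # stars formed 12 billion years ago
to_black_holes = 0.005   # collapsed into black holes 5 million years later
to_merger = 10.3         # merged 10.3 billion years after that
signal_travel = 1.2      # the waves then traveled 1.2 billion years to LIGO

remainder = star_formation - (to_black_holes + to_merger + signal_travel)
print(f"{remainder:.1f} billion years left over")  # ~0.5, within rounding
```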

Via: The Verge

Source: Nature

from Engadget

This $24 Smart Plug Adds Smartphone Control and Energy Monitoring To Any Outlet

Like the idea of a Belkin WeMo Switch, but not willing to spend $40-$50 to try one out? This TP-Link alternative has a nearly identical feature set for half the price. Just clip the $10 coupon on the page to knock it down to $24.

Just like a WeMo Insight switch, TP-Link’s Smart Plug will let you turn appliances on and off from your smartphone, track their energy usage, and set schedules to toggle them automatically. The only major feature it’s lacking is IFTTT support, but it will integrate with an Amazon Echo for voice control.

Note: We’ve posted several deals on a nearly identical TP-Link Smart Plug in the past, but this is easily the best price we’ve seen on the version that includes energy monitoring.…

Commerce Content is independent of Editorial and Advertising, and if you buy something through our posts, we may get a small share of the sale. Click here to learn more, and don’t forget to sign up for our email newsletter. We want your feedback.

from Lifehacker

Supercomputer Powered by Mobile Chips Suggests New Threat to Intel

The world’s biggest chip maker, Intel, is struggling because it missed out on the market for mobile chips. Now mobile chips are coming for a market Intel has long had mostly to itself: supercomputers.

Supercomputers are used in government, academia, and industry for research on topics as varied as nuclear weapons and potential new drugs. Intel chips power more than 90 percent of the 500 most powerful of these, as well as dominating the server and PC markets. But smartphones and tablets are almost all powered by chips built using designs licensed from U.K. company ARM, which has long prioritized energy efficiency (see “Intel Outside”).

Fujitsu said this week that it will use ARM-based processors to build a successor to the Japanese supercomputer called K. Fujitsu is building the machine for the RIKEN Advanced Institute for Computational Science, which plans to use it for biomedical, climate, and energy research. The computer is slated to be installed and switched on in 2020.

A replacement for the K supercomputer at RIKEN Advanced Institute for Computational Science in Kobe, Japan, will be built using chips similar to those used in smartphones.

Fujitsu announced that plan at the International Supercomputing Conference in Germany, where there was more bad news for Intel. A new list of the world’s most powerful supercomputers was revealed, and the new top machine is not based on Intel’s x86 technology.

The makers of the Chinese TaihuLight system, at the National Supercomputing Center in the Chinese city of Wuxi, used a custom-built processor based on an unspecified, Chinese-designed architecture (see “New Fastest Supercomputer Is Chinese Through and Through”).

The power of supercomputers can be measured by the number of operations they can perform per second, using a metric known as FLOPS. TaihuLight performs at 93 petaflops, or 93 quadrillion operations per second.
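For reference, the unit arithmetic behind that figure is a one-liner ("peta" is the SI prefix for 10**15):

```python
# "Peta" = 10**15, so one petaflop is a quadrillion floating-point ops/second.
PETA = 10 ** 15

def petaflops_to_ops_per_second(petaflops: float) -> float:
    """Convert a petaflops rating to raw operations per second."""
    return petaflops * PETA

print(petaflops_to_ops_per_second(93))  # TaihuLight: 9.3e16 ops per second
```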

TaihuLight has an enormous amount of computing power, but even that isn’t enough. And the prospects for making supercomputers faster have started to look murky in recent years. Using more powerful chips—usually from Intel—used to deliver predictable gains in high-performance computing. But other factors, like the speed at which data can be moved around inside the system, have become limiting. And the power bills racked up by top supercomputers have become a major headache. The race to build bigger machines has seemingly hit a wall.

Computers from supercomputers down to mobile devices used to get more and more powerful as chip makers crammed more and smaller transistors onto chips, a trend known as Moore’s Law. But transistors are no longer shrinking so fast, and the power consumption of chips is getting out of control. Supercomputer builders have started looking to alternative designs that could allow their machines to keep getting faster. One of those is ARM.

“It’s a disruptive time to be in high performance computing,” says James Cuff, assistant dean for research computing at Harvard University. “Those that design machines inside the power envelope with the right support of key algorithms and codes are going to be the players that ultimately win in this new game.”

ARM has been putting dollars into getting its chips into high-performance computers since 2011. It has struck partnerships with IBM and graphics chip maker Nvidia, and it recently created software partnerships to make sure popular research software will run on ARM-based processors.

ARM’s strategy is yet to be fully tested, since no supercomputer based on its chip designs has been built, points out Jack Dongarra, a professor of computer science at the University of Tennessee in Knoxville, one of the authors of the list of the 500 most powerful supercomputers. But in supercomputing’s new energy-conscious era, it could make sense. “I think ARM has great potential,” he says. “It hasn’t been demonstrated in a large-scale machine so far. But there is nothing in the design that would limit its use.”

from Technology Review Feed – Tech Review Top Stories

A Protein That Moves From Muscle To Brain May Tie Exercise To Memory

Researchers have long known that exercise is good for the brain. An enzyme produced by muscles might help explain why.

Monalyn Gracia/Corbis/VCG/Getty Images


Researchers have identified a substance in muscles that helps explain the connection between a fit body and a sharp mind.

When muscles work, they release a protein that appears to generate new cells and connections in a part of the brain that is critical to memory, a team reports Thursday in the journal Cell Metabolism.

The finding “provides another piece to the puzzle,” says Henriette van Praag, an author of the study and an investigator in brain science at the National Institute on Aging. Previous research, she says, had revealed factors in the brain itself that responded to exercise.

The discovery came after van Praag and a team of researchers decided to “cast a wide net” in searching for factors that could explain the well-known link between fitness and memory.

They began by looking for substances produced by muscle cells in response to exercise. That search turned up cathepsin B, a protein best known for its association with cell death and some diseases.

Experiments showed that blood levels of cathepsin B rose in mice that spent a lot of time on their exercise wheels. What’s more, as levels of the protein rose, the mice did better on a memory test in which they had to swim to a platform hidden just beneath the surface of a small pool.

The team also found evidence that, in mice, cathepsin B was causing the growth of new cells and connections in the hippocampus, an area of the brain that is central to memory.

But the researchers needed to know whether the substance worked the same way in other species. So they tested monkeys and found that exercise did, indeed, raise levels of cathepsin B circulating in the blood.

Next, they studied 43 people who hadn’t been getting much exercise.

“The people were university students that were couch potatoes — they didn’t exercise much,” says Dr. Emrah Duzel, a neurologist and team member from the German Center for Neurodegenerative Diseases.

Half the students remained sedentary. The other half began a regimen of tough treadmill workouts several times a week.

“Within four months we really made them fit,” Duzel says.

And, just like mice, the students who exercised saw their cathepsin B levels rise as their fitness improved. They also got better at a memory task: reproducing a geometric pattern they’d seen several minutes earlier.

But the clincher was the link between memory improvement and cathepsin levels, Duzel says.

“Those individuals that showed the largest gains in memory also were those that had the largest increase in cathepsin,” he says.

Of course, cathepsin is probably just one of several factors linking exercise and brain function, van Praag says.

“I don’t think we have fully explained how exercise improves memory,” she says, “but I think we’ve made a significant step forward.”

Also, cathepsin has a dark side. It’s produced by tumor cells and has been linked to the brain plaques associated with Alzheimer’s. So, trying to artificially raise levels might not be a good idea, van Praag says.

However van Praag says she’s trying to keep her own cathepsin levels up naturally by jogging — when she can.

“It takes a lot of time and effort to do all this research,” she says. “So sometimes the exercise regimen suffers a little bit.”

from NPR Topics: News

Juiced-Up Home Wi-Fi for $10 Extra a Month? It’s Coming.

Startup Plume is betting that consumers are ready to pay more for a powerful wireless network with a brain in the cloud.

Are you willing to pay more for better home Wi-Fi?

A startup called Plume is betting that the challenge of running a home wireless network good enough for ultra-high-definition television, video games, and the Internet of things means that consumers are ready to pay extra for high-quality managed Wi-Fi in their homes.

Plume, which launched Thursday, offers a different take on the Wi-Fi router. It sells highly designed, palm-sized routers that plug into outlets around your home, and controls those routers via a cloud-based brain that’s actively managing how your home network functions.

Plume sells its routers to consumers for about $49 each (or $39 each if you pre-order and buy at least six). You plug one into your modem, download an app to create your network, and then plug the rest into outlets around your home. These additional routers find the network using Bluetooth.

A Plume router plugged into an outlet.

Fahri Diner, CEO and cofounder of Plume, says the company is also working with Internet service providers—he hopes ISPs might offer his routers and charge consumers about $10 extra a month.

The idea of paying extra for home Wi-Fi might rub many consumers the wrong way, but three factors could change their minds. The first is that people increasingly want high-quality bandwidth throughout the home. A few years ago they may have been content with great Wi-Fi only in an office near the router, but now people park Wi-Fi-connected cars in their garages, mount video doorbells outside their houses, and use latency-sensitive streaming applications such as video calls and games all over the house, so connections have to stretch further. Applications like virtual reality will require Wi-Fi to work even harder.

The second factor is that it will be easy. Many consumers already pay their ISP for Wi-Fi because those providers have instituted modem rental fees that often include a router. A modem is the device that connects your home to the ISP’s service, while a router translates the modem’s signal into Wi-Fi. The two are often combined in one device. Customers are already accustomed to renting, and ISPs are in the business of enabling that, so Diner’s idea of selling routers to service providers could make sense.

A third factor is that not all homes can get good Wi-Fi even if they rent their modems and routers from an ISP. Long or tall homes might require additional gear. Interference can also cause issues in places where homes are close together, such as in a condo or apartment building.

So in a world where consumers need better Wi-Fi but can’t always get it, companies are looking to offer some kind of Wi-Fi as a service. Plume is one, but Eero, a Wi-Fi router maker that recently raised $50 million, is also eyeing paid services, according to Nick Weaver, Eero’s CEO.

Tim Chang, a managing director at Mayfield Fund, a venture capital firm, last week told a group of attendees at an investment conference that managed home Wi-Fi is the “next phase in delivering Wi-Fi.” Mayfield doesn’t have an investment in Plume or Eero.

But to build a subscription model for Wi-Fi, Plume had to change the way it is delivered. Most Wi-Fi routers have a radio and a small amount of brainpower in a processor to handle the packets flowing around the home. Plume’s routers have only a radio, with the computing done in the cloud. Handling the logic there, Diner says, lets the service adapt quickly to changes in the network.

The company uses machine learning to understand the daily activity of a person’s network and discern patterns that could help it operate more efficiently. For example, it might analyze the network traffic and realize that one of the routers should be plugged in closer to the television for a better experience. It can make that suggestion through the mobile app.

The upside of Plume’s design is that the routers will get smarter and faster as its cloud services improve, and homeowners can spread more radios around the home for better coverage at relatively low cost. The downside is that operating a service requiring ongoing investment in software and cloud servers means the company has to sell its hardware at a premium that will support the business for the long term.

Several companies are rethinking the traditional router, including Eero, Luma, and Securifi, but it’s unclear whether the mass market is convinced that its existing gear, whether from an ISP or from established vendors such as D-Link, Belkin, and Netgear, is worth tossing.

from Technology Review Feed – Tech Review Top Stories