The US military is testing stratospheric balloons that ride the wind so they never have to come down

https://www.technologyreview.com/s/612417/darpa-is-testing-stratospheric-balloons-that-ride-the-wind-so-they-never-have-to-come-down/


The idea of a balloon that floats high above Earth indefinitely is a tantalizing one. Solar power would allow such stratospheric balloons to operate like low-cost satellites at the edge of space, where they could provide communication in remote or disaster-hit areas, follow hurricanes, or monitor pollution at sea. One day, they could even take tourists on near-space trips to see the curvature of the planet.

It’s not a new idea. Indeed, stratospheric balloons have been flying science missions since the 1950s, and NASA still uses them today. And Project Loon, owned by Google’s parent company Alphabet, successfully deployed such balloons to provide mobile communications in the aftermath of Hurricane Maria in Puerto Rico.

There’s a major snag, though: current balloons shift with the wind and can only stay in one area for a few days at a time. At the height of the stratosphere, some 60,000 feet (18,300 meters) up, winds blow in different directions at different altitudes. In theory it should be possible to find a wind blowing in any desired direction simply by changing altitude. But while machine learning and better data are improving navigation, the progress is gradual.
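
To make that concrete, here’s a minimal sketch, in Python, of how altitude-hopping navigation could pick a flight level. The wind profile and the selection rule are invented for illustration; ALTA’s actual control logic isn’t public.

```python
def best_altitude(winds, desired_deg):
    """Pick the altitude whose wind direction is closest to the
    desired track. `winds` maps altitude in feet to a tuple of
    (speed in m/s, direction in degrees the wind blows toward).
    A toy illustration of altitude-hopping, not ALTA's logic."""
    def angular_error(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return min(winds, key=lambda alt: angular_error(winds[alt][1], desired_deg))

# Invented wind profile for illustration only.
profile = {60_000: (12, 90), 70_000: (8, 180), 80_000: (15, 270)}
print(best_altitude(profile, 260))  # -> 80000
```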

DARPA, the US military’s research arm, thinks it may have cracked the problem. It is currently testing a wind sensor that could allow devices in its Adaptable Lighter-Than-Air (ALTA) balloon program to spot wind speed and direction from a great distance and then make the necessary adjustments to stay in one spot. DARPA has been working on ALTA for some time, but its existence was only revealed in September.

“By flying higher we hope to take advantage of a larger range of winds,” says ALTA project manager Alex Walan. ALTA will operate even higher than Loon at 75,000 to 90,000 feet (22,900 to 27,400 meters or 14 to 17 miles), where the winds are less predictable. That shouldn’t be a problem if the balloon can see exactly where the favorable winds are.

The wind sensor, called Strat-OAWL (short for “stratospheric optical autocovariance wind lidar”), is a new version of one originally designed for NASA satellites. Made by Ball Aerospace, OAWL shines pulses of laser light into the air. A small fraction of the beam is reflected back and gathered by a telescope. The wavelength of the reflected light is changed slightly depending on how fast the air it bounced off is moving, a change known as Doppler shift. By analyzing this shift, OAWL can determine the speed and direction of the wind.

Unlike other wind sensors, OAWL looks in two directions at once, giving a better indication of wind speed and direction.

“It’s like looking out with two eyes open instead of one,” says Sara Tucker, a lidar systems engineer at Ball Aerospace.
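
The article doesn’t give OAWL’s operating wavelength or scan geometry, so the numbers in this sketch are purely illustrative, but the two-step arithmetic is straightforward: a Doppler shift yields a line-of-sight speed, and two look directions pin down the horizontal wind vector.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(wavelength_m, doppler_shift_m):
    """Line-of-sight wind speed from the Doppler wavelength shift.
    For backscatter lidar the round trip doubles the shift:
    delta_lambda / lambda = 2 * v_los / c."""
    return C * doppler_shift_m / (2.0 * wavelength_m)

def horizontal_wind(az1_deg, v_los1, az2_deg, v_los2):
    """Recover the horizontal wind (east and north components, m/s)
    from two line-of-sight speeds at different azimuths, assuming
    near-horizontal beams and negligible vertical wind."""
    a1, a2 = np.radians([az1_deg, az2_deg])
    # Each beam sees v_los = u*sin(az) + v*cos(az).
    A = np.array([[np.sin(a1), np.cos(a1)],
                  [np.sin(a2), np.cos(a2)]])
    return np.linalg.solve(A, [v_los1, v_los2])

# Illustrative only: at a 355 nm wavelength, a shift of ~2.4e-14 m
# corresponds to about 10 m/s along the beam.
lam = 355e-9
v1 = radial_velocity(lam, 2.37e-14)  # beam looking northeast
v2 = radial_velocity(lam, 1.18e-14)  # beam looking southeast
print(horizontal_wind(45, v1, 135, v2))  # -> roughly [10.6, 3.6]
```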

Previous versions of OAWL flown in aircraft have measured winds more than 14 kilometers (8.6 miles) away with an accuracy of better than a meter per second. The main challenge with Strat-OAWL has been shrinking it to fit the space, weight, and power requirements of the ALTA balloons.

Walan was not able to discuss military roles for ALTA technology, but a high-resolution sensor permanently positioned 15 miles above a war zone would be a useful asset. Military aircraft have ceilings of 60,000 to 65,000 feet, so they could intercept Loon-type balloons. Because it will fly higher, ALTA will be a much trickier target. The balloon could provide secure communications and navigation or act as a mother ship for drones.

The ALTA test flight program has already begun, with flights lasting up to three days, and will continue with steadily longer flights.

The technology might have applications beyond the military, too. Some companies, such as WorldView, are talking about “near-space tourism,” taking a passenger capsule to altitudes where the blackness of space and the curvature of the Earth can be seen. Reliable navigation of the sort provided by OAWL would make such trips a far safer prospect. It could also give commercial airliners a tool to spot and avoid clear air turbulence.

The ALTA balloon itself is made by Raven Aerostar, which also makes the Loon balloons. The firm’s general manager, Scott Wickersham, says this sort of technology gets us much closer to balloons that stay aloft indefinitely, and that will make all sorts of applications possible.

“I believe we will see a future in which the stratospheric balloons will be as common as commercial airliners are today,” he says.

via Technology Review Feed – Tech Review Top Stories https://ift.tt/1XdUwhl

November 14, 2018 at 12:22PM

Tesla Model 3 Owner Recharges Car by Having It Towed

https://jalopnik.com/tesla-model-3-owner-recharges-car-by-having-it-towed-1830422030


The Tesla Model 3, like most electric vehicles, uses regenerative braking to help trickle charge the car’s battery and extend the range. In theory, one could use the regenerative powers of the car’s motors to charge the battery by having the whole thing towed, and now a Model 3 owner has put that to the test.

While this is definitely NOT how it was designed to work, pulling an electric car with another car to charge it isn’t exactly against any rules. As Electrek picked up, Matt of the YouTube channel Tech Forum wanted to see exactly what would happen if he tried it, towing a Model 3 with a Ford C-Max hybrid.

This is Matt’s second tow test: the first came before the version 9 software release, which unlocked stronger regenerative braking on Teslas, and this time he wanted a longer leg to see more significant results.

The car started at 227 miles of range, creep mode was turned off, and around the 4:20 mark you can see the impact of towing on the car’s energy consumption graph.

As the towing begins, you see the energy consumption drop almost immediately to the bottom of the graph as the projected range rapidly climbs. After one mile of towing, the car had theoretically gained enough charge to drive itself a further 10 miles, just off the energy recovered while being towed.
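
As a sanity check on those numbers, a back-of-envelope calculation works out. The consumption figure and towing speed below are my assumptions, not values from the video:

```python
# Rough check of the video's result, using assumed (not measured) values.
WH_PER_MILE = 250      # ballpark Model 3 highway consumption
TOW_SPEED_MPH = 25.0   # plausible speed for a strap tow

range_gained_miles = 10.0
tow_miles = 1.0

energy_recovered_kwh = range_gained_miles * WH_PER_MILE / 1000  # 2.5 kWh
tow_hours = tow_miles / TOW_SPEED_MPH                           # 0.04 h
avg_regen_kw = energy_recovered_kwh / tow_hours                 # 62.5 kW

print(f"~{energy_recovered_kwh:.1f} kWh recovered at ~{avg_regen_kw:.1f} kW")
# ~60 kW is within the regenerative braking power a Model 3 can absorb,
# so one towed mile yielding ten driven miles is at least plausible.
```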

When I reached out to Tesla about this, they labeled this way of recharging the car creative but not very feasible, mostly due to the increased risk of towing a car with a strap like this at higher speeds for the long periods of time necessary for a decent charge. The company officially recommends towing Teslas by flatbed truck. Also, keep in mind the car needs to be “on” for the regen to charge appropriately. But it’s possible!

Think of the possibilities… Tesla Semis pulling giant cargo containers that are just giant batteries, or maybe even miniature power plants of some future efficient design, for all of us to pull up behind on the highway, link up with and enjoy a nice regen charge before our exit. Somebody pay me for this plan. Thanks.

via Gizmodo https://gizmodo.com

November 14, 2018 at 11:54AM

The Pixel’s Night Sight camera mode performs imaging miracles

https://www.engadget.com/2018/11/14/google-pixel-night-sight-launch-sample-photos-comparison/



When Google showed off its Night Sight feature at the Pixel 3 event last month, we were impressed but skeptical. Sample photos from the keynote looked drastically better when shot with the low-light mode, but since the feature wasn’t live, we couldn’t vouch for its effectiveness. Now, Google is finally releasing Night Sight to the masses — meaning you won’t have to resort to installing a ported camera app to test out this mode. After some time testing the software out, I have to say, I’m blown away.

The difference is much more noticeable in photos taken in pitch black, though in those situations the images came out noisy and blurry, even if the colors were accurate. Google said that Night Sight measures your natural hand shake, as well as how much motion is in the scene, before you press the shutter button. When there’s less movement, the software spends more time capturing light for less noise. Otherwise, it uses shorter exposures to minimize blur. In most cases, this appeared to work well, save for the example below, which was shot in my closet in complete darkness.
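
A minimal sketch of that trade-off might look like the following. The exposure limits and frame count are illustrative guesses; Google hasn’t published Night Sight’s exact parameters.

```python
def plan_burst(motion_score, max_frames=15,
               min_exp_s=1 / 15, max_exp_s=1 / 3):
    """Choose a per-frame exposure from measured motion, as described
    above. `motion_score` runs from 0.0 (steady) to 1.0 (lots of
    movement). All the limits here are guesses, not Google's values."""
    # Steadier scenes get longer frames that gather more light;
    # more motion means shorter frames to keep each one sharp.
    exposure = max_exp_s - motion_score * (max_exp_s - min_exp_s)
    return exposure, max_frames

exp, n = plan_burst(motion_score=0.2)
print(f"{n} frames at {exp * 1000:.0f} ms each")  # 15 frames at 280 ms each
```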

Pixel 3 Night Sight comparison

Left: Shot with Night Sight. Right: Shot in regular Camera mode.

Google recommended that we test Night Sight on a Pixel 3, which already took decent shots in low light. But I still saw a noticeable improvement in quality in some photos of a dark corner of my kitchen. Colors were more saturated, and details like the ridges on a jar’s cap were sharper.


Left: Shot in regular Camera mode. Right: Shot with Night Sight.

Those using older Pixels should be excited, too. Even when I tested Night Sight on my Pixel 2, the results were impressive. Shots of Thai food in a dimly lit restaurant looked brighter and clearer, with more accurate colors, when shot in the mode.


Left: Shot in regular Camera mode. Right: Shot with Night Sight.

Though I don’t like that you have to swipe all the way to the More section in the camera app to launch Night Sight manually, the system has so far been smart enough to suggest I switch modes when it detects a dark scene. When that happens, it displays a shortcut at the bottom of the viewfinder so Night Sight is easier to access.

The Pixel 3 was already one of the best smartphone cameras, despite not packing dual sensors. The company’s software prowess shows in things like HDR+ and Portrait mode, and now Night Sight clinches the photography trophy for Google. The new feature will arrive over the next few days on all Pixels via a Camera app update, so keep an eye out for it, and have fun!

via Engadget http://www.engadget.com

November 14, 2018 at 11:12AM

Google’s Incredible Night Sight Mode Was Worth the Wait

https://gizmodo.com/googles-incredible-night-sight-mode-was-worth-the-wait-1830432540


Photo: Sam Rutherford (Gizmodo)

As good as the Pixel 3 is, when Google released the phone last month, it sort of felt incomplete. Part of that feeling was due to Google’s software-first approach to smartphone design, which means there’s always code being tinkered with, but the other part was simply that Google still hadn’t delivered two major Pixel 3 features the company teased prior to launch.

But starting today, one of those missing additions, Night Sight, is finally rolling out to Pixel 3 owners, and after having the chance to test it out for a few days, I can safely say Google’s sophisticated camera feature was worth the wait.

Previously, when faced with certain low-light situations, the Pixel 3 sometimes struggled to best phones with larger-aperture lenses like the Galaxy Note 9, or to top sophisticated low-light techniques like Huawei’s impressive Night Mode, which really pushed the use of HDR photo processing when it debuted earlier this year on the P20 Pro.

However, Night Sight changes all that, and thanks to Google flexing its computational photography skills, the Pixel 3 is now capable of capturing some incredible low-light pics. And to prove it, I took all the best smartphone cameras out for a side-by-side nighttime shootout.

One quick note before we get to the pictures. Accessing Night Sight is as easy as scrolling over to the More tab in Google’s camera app and then tapping the icon for Night Sight. But when you actually shoot a picture, it’s not a simple click and you’re done. Similar to Huawei’s Night Mode, Night Sight shoots multiple images during a three to four-second window, and then combines those images together to create a final pic that looks better than each original frame on its own. That means to get the best results, you really need to try to hold your phone as steady as possible.
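
The heart of that merging step, averaging an aligned burst so random sensor noise cancels while the signal adds up, fits in a few lines of Python. This toy sketch skips the frame alignment and motion rejection a real pipeline needs:

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of already-aligned 8-bit frames. Averaging N
    frames cuts random noise by roughly sqrt(N), which is why a few
    seconds of short exposures can beat one long, blurry exposure."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)
```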

All images are unedited, though they have been resized slightly to 10 MP to match the Mate 20 Pro’s resolution.
Photo: Sam Rutherford (Gizmodo)

OK, now onto the photos. To set a baseline, I started by taking two shots with the Pixel 3, one with Night Sight off and one with Night Sight on. In some respects, I like the pic taken without Night Sight more, because its richer colors and darker exposure make for a moodier final image. But when it comes to details and sharpness, the photo taken with Night Sight is clearly superior. Text on signs is clearer, the overall image is much less grainy, and you can even make out each individual brick in the building no matter where you’re looking.

Photo: Sam Rutherford (Gizmodo)

Next, I moved on to a face-off between the Pixel 3 and the Galaxy Note 9: a battle of computer smarts versus a wider f/1.5 lens aided by a touch of AI tuning. This time, at a local bar, both photos look pretty good zoomed out, but you can still see things like the overly yellow color cast often produced by Samsung cameras in low light.

Photo: Sam Rutherford (Gizmodo)

But when you start pixel peeping at 100 percent, you can really appreciate little details from the general sharpness on every sign and sticker to the Pixel 3’s ability to clearly capture the wood grain on the framed chalkboard and the texture on various beer taps.

Photo: Sam Rutherford (Gizmodo)

For the Pixel 3’s third challenge, I shot a nighttime cityscape with it and Huawei’s new Mate 20 Pro. And once again, while the Pixel 3’s pic does show some signs of over-sharpening, like the hard pink outline on the lights at the top of the building in the middle, Night Sight still bested Huawei’s Night Mode, which, up till now, was the best way to capture challenging low-light scenes.

While the Mate 20 Pro’s image does offer more zoom, the sacrifice in quality while using Night Mode isn’t worth it.
Photo: Sam Rutherford (Gizmodo)

Also, while I was up there, I decided to see if Night Sight could work in combination with Google’s Super Res Zoom feature, and it seems it does. But the bigger shock is that for a camera with a supposed 3x zoom, the Mate 20 Pro’s Night Mode pic was a blurry mess compared to the Pixel 3’s image. Now, part of this may have been due to having to hand-hold both phones, which is far from ideal, but both pics were shot one after another under the exact same circumstances, and I’m still struggling to work out why things turned out so poorly for the Mate 20 Pro.

Photo: Sam Rutherford (Gizmodo)

Finally, in what may have been the most difficult challenge, I took a shot using the Pixel 3 and an iPhone XS of a pizza lit almost solely by a single candle. And the difference couldn’t be more apparent. The colors in the Pixel 3’s pic are brighter, it features better contrast, and despite a fair amount of noise in the dark background, everything else is just so much clearer. Suffice it to say, if you take a lot of pics in dimly lit restaurants, the Pixel 3 and Night Sight are probably your ticket to way better food photography.

Night Sight is exactly what the Pixel 3 needed to really round out its photo toolkit. But in a strange way, the biggest winners of all this may be Pixel 1 and Pixel 2 owners, who will also be getting Night Sight over the next few days. Night Sight is available as an update to the Pixel 3’s camera app via the Play Store starting today.

via Gizmodo https://gizmodo.com

November 14, 2018 at 11:06AM

Sorcery: Guy Produces Black Fire That Casts A Shadow

https://geekologie.com/2018/11/sorcery-guy-produces-black-fire-that-cas.php


This is a video of YouTuber The Action Lab producing a black fire that casts a shadow by exposing an alcohol flame to salt water, all of it illuminated by a low-pressure sodium lamp that only emits light in a very tiny wavelength range of monochromatic yellow. *breathing heavy* It works because the sodium atoms released from the salt water absorb that same yellow wavelength from the lamp, making the fire appear black. Of course, using the scientific method, we also can’t rule out the possibility of black magic. Which is of course 100% what this is, since it is a black flame and all. *stamps ‘CLOSED’ on case file* I’ve seen Hocus Pocus before.

Keep going for the video, actual black flame around 4:00.

Thanks to Damien, who agrees this would have been much more valuable information three years ago when I was an evil wizard for Halloween.

via Geekologie – Gadgets, Gizmos, and Awesome https://geekologie.com/

November 14, 2018 at 10:57AM

Microsoft resumes Windows 10 update after fixing data loss bug

https://www.engadget.com/2018/11/14/microsoft-resumes-windows-10-update-rollout-after-data-loss/



At last, Microsoft has resumed delivering its Windows 10 October update after pulling it over a data loss bug. The company is confident it has fixed the flaw and has seen “no further evidence” of data loss. With that said, it’s being particularly cautious this time around. It’s “slowing” the deployment to watch device data, and will only offer the update to your device when it sees no sign of a problem, such as an incompatible app.

The company also used the re-release as an opportunity to defend its software testing methods. It has introduced new uses of “data and feedback” to improve software quality, and it relies on extensive automated testing, external labs, partner vendors and “self-hosting” (where development teams run their own software builds) as part of the testing process. It also pointed to evidence that quality is improving: customer support chats and calls have been declining for much of Windows 10’s lifetime, Microsoft said.

The problem, of course, is that this process still let a data loss bug slip through. Many of these testing methods are also familiar on some level; self-hosting, for instance, is usually called “dogfooding” and is a common industry practice. These kinds of serious update bugs tend to be rare, but it’s not clear whether any testing changes are in place to reduce the chances of such a significant flaw popping up in the future.

via Engadget http://www.engadget.com

November 14, 2018 at 10:54AM

This Chemical Is So Hot It Destroys Nerve Fibers—in a Good Way

https://www.wired.com/story/resiniferatoxin


In Morocco there grows a cactus-like plant that’s so hot, I have to insist that the next few sentences aren’t hyperbole. On the Scoville Scale of hotness, its active ingredient, resiniferatoxin, clocks in at 16 billion units. That’s 10,000 times hotter than the Carolina Reaper, the world’s hottest pepper; 45,000 times hotter than the hottest of habaneros; and 4.5 million times hotter than a piddling little jalapeño. Euphorbia resinifera, aka the resin spurge, is not to be eaten. Just to be safe, you probably shouldn’t even look at it.
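
Those multipliers hold up against commonly cited Scoville ratings. A quick sanity check, with approximate pepper values of my own rather than WIRED’s:

```python
# Approximate Scoville ratings; individual peppers vary widely.
RTX_SHU = 16e9
peppers = {"Carolina Reaper": 1.6e6,
           "hottest habanero": 3.5e5,
           "jalapeno": 3.5e3}
for name, shu in peppers.items():
    print(f"RTX is {RTX_SHU / shu:,.0f}x hotter than a {name}")
# -> 10,000x, ~45,714x, and ~4,571,429x: close to the quoted figures
```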

But while that toxicity will lay up any mammal dumb enough to chew on the resin spurge, resiniferatoxin has also emerged as a promising painkiller. Inject RTX, as it’s known, into an aching joint, and it’ll actually destroy the nerve endings that signal pain. Which means medicine could soon get a new tool to help free us from the grasp of opioids.

The human body is loaded with different kinds of sensory neurons. Some flavors respond to light touch, others signal joint position, yet others respond only to stimuli like tissue injury and burns. RTX isn’t going to destroy the endings of all these neurons willy-nilly. Instead, it binds to a major molecule in specifically pain-sensing nerve endings, called TRPV1 (pronounced TRIP-vee one).

This TRPV1 receptor normally responds to temperature. But it also responds to a family of molecules called pungents, which includes capsaicin, the active ingredient in hot pepper. “So when you put hot pepper on your tongue and it feels like it’s burning, it’s not because your tongue is on fire,” says Tony Yaksh, an anesthesiologist and pharmacologist at UC San Diego who’s studied RTX. “It’s simply activating the same sensory axons that would have been activated if your tongue had been on fire.”

RTX is a capsaicin analog, only it’s between 500 and 1,000 times more potent. When RTX binds to TRPV1, it props open the nerve cell’s ion channel, letting a whole lot of calcium in. That’s toxic, leading to the inactivation of the pain-sensing nerve endings.

This leaves other varieties of sensory neurons unaffected, because RTX is highly specific to TRPV1. “So you gain selectivity because it only acts on TRPV1, which is only on a certain class of fibers, which only transmit pain,” says Yaksh. “Therefore you can selectively knock out pain without knocking out, say, light touch or your ability to walk.”

So if you wanted to treat knee pain, you could directly inject RTX into the knee tissue. You’d anesthetize the patient first, of course, since the resulting pain would be intense. But after a few hours, that pain wears off, and you end up with a knee that’s desensitized to pain.

Researchers have already done this with dogs. “It is profoundly effective there, and lasts much, much longer than I might have expected, maybe a median of 5 months before the owners of the dogs asked for reinjection,” says Michael Iadarola, who’s studying RTX at the National Institutes of Health. “The animals went from basically limping to running around.” One dog even went 18 months before its owners noticed the pain had returned.

That’s a very targeted application, but what about more widespread pain? Cancer patients, for instance, can live in agony through their end-of-life care. Here, too, RTX might work as a powerful painkiller. In fact, the NIH is in the midst of trials with bone cancer patients.

“We use the same technique for administering this as we would a spinal anesthetic,” says NIH anesthesiologist Andrew Mannes. “The whole idea is you’re not injecting into the spinal column itself, you’re injecting it into the fluid that surrounds the spinal column.” Injecting straight into the cord would damage it. Patients are anesthetized for all of this, and treated with short-term painkillers when they wake. “That seems to get them over the worst of it, and then over the next few hours it subsides until the point where they don’t feel the pain any longer.”

Here RTX is working on the same principle as if you injected it straight into a specific area of concern like a knee. But because it’s injected more centrally, it delivers widespread pain relief. “For many of the cancer patients, we need to have the drug remove pain from a lot of different regions,” says Iadarola. “So we give it into a compartment where the nerves to the lower half of the body are gathered together.”

Now, the thing with pain is, it evolved for a reason. It’s an indispensable tool for you to feel if you’re doing something to your body that you shouldn’t be, like holding a scalding cup of coffee. Of course we want to alleviate pain, but might that be problematic if it’s too effective?

For people with knee pain, not really—the injection is targeting a specific area, so the rest of your body can still feel pain. And for end-of-life care, a central injection can bring long-awaited relief. “We’re doing that on cancer pain patients who have tried all other treatments and they’ve not been successful,” says NIH neurosurgeon John Heiss. “The FDA has only allowed us to have the indication for cancer patients with limited life expectancy, because the concern is that if you lose pain and temperature sensation you could have deleterious effects.”

RTX’s promise lies in its specificity. Think of it like a sniper rifle for pain, whereas opioids are more like hand grenades. Opioids target receptors all over the body, not a specific kind of sensory neuron. “That’s why when you give it to somebody, you get problems with constipation, sedation, they can have respiratory depression,” says Mannes.

That and you have to take opioids constantly, but not so with RTX. “You give it once and it should last for an extended period of time because it is destroying the fibers,” says Mannes. “But the other thing with this to remember is there’s no reinforcement. There’s no high associated with it, there’s no addiction potential whatsoever.”

If RTX does become widely available, by the way, it’s not going to be for you to treat a sore post-marathon knee—this is a serious drug for serious conditions. But by more directly addressing the root of pain, a plant with a hell of a kick could help us cut back on opioids and other grenade-like painkillers.


via Wired Top Stories https://ift.tt/2uc60ci

November 14, 2018 at 06:06AM