Water on Asteroid Bennu Could Mean ‘Pay Dirt’ for Space Miners

https://www.space.com/asteroid-bennu-water-space-mining-osiris-rex.html

The near-Earth asteroid Bennu could be Exhibit A for space miners making their case to skeptical investors.

New observations by NASA’s OSIRIS-REx spacecraft suggest that the 1,650-foot-wide (500 meters) Bennu harbors lots of accessible water, a key resource that prospective asteroid miners aim to target. 

Water can be split into its constituent hydrogen and oxygen, the chief components of rocket fuel. This stuff can then be sold at off-Earth “gas stations,” where spacecraft could fill their tanks up on the go, mining advocates have stressed.
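For reference, the chemistry involved is ordinary electrolysis, run in reverse when the hydrogen and oxygen are later burned as propellant. The figures below are standard textbook values, not mission data:

```latex
% Splitting water into hydrogen and oxygen (standard electrolysis)
2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2},
\qquad \Delta H^{\circ} \approx +286\ \mathrm{kJ}\ \text{per mole of } \mathrm{H_2O}
```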

Related: OSIRIS-REx: NASA’s Asteroid-Sampling Mission in Pictures

“For an asteroid miner, Bennu is pay dirt,” OSIRIS-REx principal investigator Dante Lauretta told Space.com. “That is exactly the kind of target that we want to go to and process [for] a propellant depot that people have been envisioning for the first profitable asteroid mine.”

Studying Bennu up close

The $800 million OSIRIS-REx mission launched in September 2016 and slipped into orbit around Bennu on Dec. 31 of last year. This latter event was an epic achievement: Bennu is the smallest object ever to be circled by a spacecraft.

OSIRIS-REx is making valuable observations from Bennu orbit, but much of the mission’s science data will be gathered here on Earth. If all goes according to plan, the probe will snag a sample of Bennu material in July 2020, then deliver that stuff to Earth in a special return capsule arriving in September 2023. 

The main goal of the mission is to learn more about the solar system’s early days and to better understand the role that dark, carbon-rich asteroids such as Bennu may have played in life’s emergence on Earth. That role is suspected to be significant; scientists think asteroids may have delivered much of our planet’s water, as well as lots of complex organic molecules — the building blocks of life as we know it.

But OSIRIS-REx has several subsidiary objectives, as indicated by its full name: “Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer.” The “security” bit refers to information that could help humanity better deal with potentially hazardous space rocks, a broad class that counts Bennu as a member. And “resource identification” is a nod to the nascent asteroid-mining industry, which needs to know which rocks to go after.

Related: Images: Potentially Dangerous Asteroids

This image of the asteroid Bennu shows a view from NASA’s OSIRIS-REx spacecraft (left) and a spectrum analysis of its composition. 

(Image: © NASA/Goddard/University of Arizona/Arizona State University)

In December, the OSIRIS-REx team announced the detection of hydrated clay minerals on Bennu’s surface. The find indicated that water was likely abundant in the interior of Bennu’s parent asteroid long ago, the scientists said at the time. (Team members think Bennu is a rubble pile consisting of pieces of that shattered asteroid, which may have been about 62 miles, or 100 kilometers, wide. Bennu may harbor chunks of the impactor as well.)

The new results, which Lauretta and his colleagues announced today (March 19), confirm and extend that recent discovery: OSIRIS-REx has now spotted the apparent signature of the iron-oxide mineral magnetite on Bennu’s surface.

Magnetite is “typically indicative of very intense hydrothermal activity,” Lauretta said. 

A candidate sample site on asteroid Bennu for NASA’s OSIRIS-REx spacecraft.

(Image: © NASA/Goddard/University of Arizona)

He and the mission team haven’t yet nailed down the origin of this activity. But the leading theory holds that Bennu’s parent asteroid formed far from the newborn sun, incorporating significant amounts of water ice and organics along with rocky and metallic material. Some radioactive elements, such as aluminum-26, got swept up as well, and the heat those elements threw off likely melted much of that native ice.

“Water probably did circulate through the interior of the asteroid, like a hydrothermal system on Earth, and altered the originally anhydrous rocky material, forming these clays,” Lauretta said. The flow also “probably altered the metals to produce iron oxides, like the magnetite.”

This likely happened very early on, he added — within the first 10 million years or so of the solar system’s existence.

To be clear: The water we’re talking about on present-day Bennu isn’t stand-alone and pure; it’s locked up in those clays, in the form of hydroxyl groups (one oxygen atom and one hydrogen atom bonded together). But it is likely accessible: Hydroxyl can be baked out of clays, generating water vapor, asteroid-mining advocates say. 
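Chemically, that baking step is dehydroxylation: heating drives neighboring hydroxyl groups in the clay lattice to combine, releasing water vapor and leaving an oxygen behind in the mineral. A generic sketch of the reaction (standard mineralogy, not mission-specific):

```latex
% Dehydroxylation of a hydrated clay mineral on heating
2\,\mathrm{OH^{-}} \;\xrightarrow{\;\Delta\;}\; \mathrm{H_2O}\!\uparrow \;+\; \mathrm{O^{2-}}
```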

Picture of Bennu taking shape

The magnetite find is just one of many discoveries Lauretta and his colleagues announced today in seven papers, which were published in the journals Nature, Nature Astronomy, Nature Geoscience and Nature Communications. The team also discussed the results during a news conference today at the 50th Lunar and Planetary Science Conference (LPSC) in The Woodlands, Texas.

For example, OSIRIS-REx’s observations suggest that Bennu’s rotation rate is speeding up, likely because of the reradiation of solar energy as heat — something known as the Yarkovsky-O’Keefe-Radzievskii-Paddack (YORP) effect. It currently takes Bennu about 4.3 hours to complete one rotation; if this spin speedup continues apace, that rotational period will be cut in half in 1.5 million years, Lauretta said.
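A quick back-of-the-envelope check of those figures (my own arithmetic, not the team’s) gives a feel for how gentle the YORP effect is:

```python
# Implied average spin-up rate from the reported figures (illustrative only).
period_s = 4.3 * 3600        # current rotation period: ~4.3 hours, in seconds
halving_time_yr = 1.5e6      # years until the period is cut in half

delta_period_s = period_s / 2                     # total shrinkage in the period
rate_per_century = delta_period_s / halving_time_yr * 100

print(f"Period shrinks by ~{rate_per_century:.1f} s per century")
# -> ~0.5 seconds per century, imperceptible on human timescales
```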

The team also determined Bennu’s bulk density to be about 74.3 lbs. per cubic foot (1,190 kilograms per cubic meter) and the asteroid’s interior to be about 50 percent open space. Both of these numbers indicate that Bennu is a rubble pile rather than a solid block of rock, the scientists said.
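Those two numbers hang together; here’s a sketch of the arithmetic, assuming a typical grain density for carbonaceous-chondrite rock (the grain density is my assumption, not a mission measurement):

```python
# Porosity implied by Bennu's bulk density (illustrative sketch).
bulk_density = 1190.0    # kg/m^3, as reported by the OSIRIS-REx team
grain_density = 2400.0   # kg/m^3, assumed for CM carbonaceous-chondrite rock

porosity = 1.0 - bulk_density / grain_density
print(f"Implied open space: {porosity:.0%}")   # -> about 50%
```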

And then there’s the asteroid’s surface. By counting craters, the team has estimated that Bennu formed between 100 million and 1 billion years ago, likely after a mammoth collision in the main asteroid belt between Mars and Jupiter. (Bennu’s move to a near-Earth orbit occurred quite recently; such paths tend to be stable for just 10 million years or so, mission team members said, because of gravitational encounters with Earth and other rocky planets.)

Related: The Asteroid Belt Explained: Space Rocks by the Millions (Infographic)

The abundance of visible craters may force a rethink about how asteroids such as Bennu and Ryugu, which Japan’s Hayabusa2 spacecraft is currently studying up close, got their striking diamond shapes. The leading explanation posits that this shape results from a fast spin, which causes loose asteroid material to migrate to equatorial regions. But such migration would be expected to bury many craters, so perhaps something else is going on.

“We’re actively re-evaluating that model,” Lauretta said.

The team also found that, on average, Bennu reflects just 4.4 percent of the sunlight hitting it, making the asteroid one of the darkest objects in the solar system. But Bennu is far from uniform; along with very dark patches, it sports much brighter regions, some of which have reflectivities of 15 percent to 20 percent.

Sample-gathering may be tougher than thought

Bennu’s surface diversity also manifests as rocky ruggedness, which has surprised the mission team. Radar imagery from big ground-based dishes such as the Arecibo Observatory in Puerto Rico had revealed just one boulder between 33 and 66 feet (10 to 20 meters) wide. Those data, and the suspected spin-induced equatorial migration of material, gave the OSIRIS-REx researchers reason to think Bennu is pretty smooth, at least at low latitudes.

“Everything was self-consistent and suggested a lot of centimeter-scale particles, probably concentrated in the equator,” Lauretta said. “And I was really envisioning kind of a beach that went all the way around the asteroid in equatorial regions.”

But reality is very different from this prediction. OSIRIS-REx has spotted more than 200 boulders in the 33-to-66-foot size range so far, and the biggest boulder-free patches the probe has found to date measure between 16.5 feet and 66 feet (5 to 20 m) wide, one of the new papers reported.

That’s an issue, because the mission design calls for OSIRIS-REx to grab a sample from a boulder-free patch at least 165 feet (50 m) in diameter. 

“We have to upgrade the autonomous guidance system on the spacecraft, so it can be a lot smarter and guide us into that smaller region,” Lauretta said.

The team will also have to collect a lot more high-resolution imagery of the landing site than previously anticipated, he added.

Lauretta said he’s confident the team will make everything work, though sample collection may have to be pushed back a bit as a result. But the team has some leeway; the sampling operation can be performed as late as October 2020 with no significant effect on the mission timeline, Lauretta added. 

The OSIRIS-REx team takes heart from the success of Hayabusa2, which grabbed samples from rugged Ryugu last month. And Lauretta and some colleagues will travel to Japan in April to get information and advice from Hayabusa2 team members, especially about how Ryugu’s surface behaved during the sampling sortie, he said.

“That’s still our biggest uncertainty — what is the nature of this material in the microgravity environment?” Lauretta said. “What forces are holding it together, and how does it respond when a spacecraft punches it in and then fires thrusters to back away from it?” 

The Hayabusa2 team, by the way, unveiled a raft of new results today, as well, in three papers in the journal Science and at LPSC. The Japanese mission has determined that the 3,000-foot-wide (900 m) Ryugu is likely relatively dry, though it’s carbon-rich and diamond-shaped like Bennu.

Ryugu may be drier because it has resided in the inner solar system longer than Bennu or has swung closer to the sun on its various orbits, Lauretta said. 

Related: Our Solar System: A Photo Tour of the Planets

“You haven’t seen anything yet”

The new OSIRIS-REx results come from observations made during the probe’s approach to Bennu last summer and fall, and its early days orbiting the space rock. There’s a lot more to come, as the science team analyzes more-detailed data and imagery.

For example, OSIRIS-REx hasn’t confirmed the presence of organics on Bennu’s surface, but the probe hasn’t really had a chance to look yet. The observations that may do the trick will be performed seven or eight weeks from now, if all goes according to plan, Lauretta said.

“The science is really starting to ramp up,” he said. “You haven’t seen anything yet.”

And we may have to wait a few years for the most exciting results. Hayabusa2’s sample is scheduled to land on Earth in December 2020, and OSIRIS-REx’s won’t touch down until nearly three years later.

“That’s really exciting,” Lauretta said. “You can learn a lot by bringing a sample back from an asteroid, but we’re going to learn exponentially more by bringing samples back from these two asteroids, which initially looked very similar to each other, may still be related to each other but have had different histories.”

Mike Wall’s book about the search for alien life, “Out There” (Grand Central Publishing, 2018; illustrated by Karl Tate), is out now. Follow him on Twitter @michaeldwall. Follow us on Twitter @Spacedotcom or Facebook

via Space.com https://ift.tt/2CqOJ61

March 19, 2019 at 12:58PM

Google Announces Stadia: A Game Streaming Service

https://www.anandtech.com/show/14105/google-announces-stadia-a-game-streaming-service

Today at GDC, Google announced its new video game streaming service, called Stadia. The announcement builds on news from earlier this year that AMD was powering Project Stream (as the effort was then called) with Radeon Pro GPUs, and that Google is a primary partner for AMD’s next-generation CPUs and GPUs.

Stadia is being advertised as the central community for gamers, creators, and developers. The idea is that people can play a wide array of games regardless of the hardware at hand. Back in October, Google debuted the technology, showcasing a top-end AAA gaming title running at 60 FPS. Google wants a single place where gamers and YouTube creators can get together – no current gaming platform, according to Google, does this.

Ultimately, Google wants to stream straight to the Chrome browser. Google worked with leading publishers and developers to help build the system infrastructure. Google is one of the few companies with enough content delivery networks around the world to ensure that frame rates stay high with super-low latency.

Users will be able to watch a video about a game, then instantly hit ‘Play Now’ and start playing in under five seconds, without any download or lag. The idea is that a single code base can be enjoyed on any screen. At launch, desktop, laptop, TV, tablets, and phones will be supported. With Stadia, the datacenter is the platform: no hardware acceleration is required on the device. The experience can be transferred between devices, such as from a Chromebook to a smartphone.

One of the highlights of Google’s demonstration of Stadia was the platform working on Google-enabled TVs.

The platform lets users play with any USB-connected controller, or with a mouse and keyboard. Google will also be releasing its own Stadia Controller, available in three colors: white, black, and light blue. The controller connects via Wi-Fi straight into the cloud, and also knows which device the game is being played on (it’s unclear how this works).

The controller has two new buttons. The first allows saving and sharing gameplay out to YouTube. The second invokes Google Assistant, using the microphone integrated into the controller. This also allows game developers to integrate Google Assistant into their games.

Stadia uses the same datacenter infrastructure already in place at Google. There are 7,500+ edge nodes, which put compute resources closer to players for lower latency. Custom-designed, purpose-built hardware powers the experience, and interconnected racks provide sufficient compute and memory for the most demanding games. The technology has been in development inside Google for years.

At launch, streams will be supported at up to 4K and 60 fps, with HDR and surround sound; support for up to 8K at 120 fps is planned for the future, and the platform has been built to scale accordingly. While playing, the stream is duplicated in 4K for direct upload – you get rendering-quality video rather than whatever you could capture locally.
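To see why that CDN footprint and custom hardware matter, consider the raw bitrate of such a stream. A rough, illustrative calculation follows; the bit depth and delivery bitrate are my assumptions, not figures Google has given:

```python
# Back-of-the-envelope bitrate for a 4K60 HDR stream (illustrative only).
width, height = 3840, 2160    # 4K resolution
fps = 60
bits_per_pixel = 30           # assumed: 10-bit HDR x 3 color channels

raw_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed: ~{raw_bps / 1e9:.1f} Gbit/s")    # -> ~14.9 Gbit/s

delivered_bps = 30e6          # assumed: ~30 Mbit/s compressed delivery
print(f"Codec must achieve ~{raw_bps / delivered_bps:.0f}:1 compression")
```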

The platform is instance-based, so Google can scale when needed. Game developers no longer have to worry about building to a specific hardware performance target – the datacenter can scale as required.

Each instance is powered by a custom AMD GPU with 10 TF of compute, paired with a custom CPU with AVX2 support; combined, they create a single instance per player. The platform runs Linux and uses the Vulkan API, with full Unreal and Unity support, plus Havok engine support. Tool companies are on board as well.

(When Google says custom CPU and custom GPU – this could be early hardware of AMD’s upcoming generations of technology, put into a custom core configuration / TDP. We’re likely looking at a Zen 2 based CPU, based on AVX2 support listed, and a Radeon Instinct based GPU with tweaked settings specifically for Google.)

One of the first games supported will be Doom Eternal from id Software, which will support 4K with HDR at 60 fps.


via AnandTech https://ift.tt/phao0v

March 19, 2019 at 12:19PM

Sikorsky’s Self-Flying Helicopter Hints at the Flying Future

https://www.wired.com/story/sikorsky-sara-helicopter-autonomous-flying-car-air-taxi-tech

As helicopter flights go, this one was especially boring. We took off, hovered for a bit, and maneuvered around the airport. We flew to a spot about 10 miles away, did some turns and gentle banks, then came back and landed. I’ve been on more exciting Ferris wheels, with views more inspiring than those of rural Connecticut. Still, the flight was impressive for at least one reason: The pilot controlling the 12,000-pound Sikorsky S-76 had never before operated a helicopter. That would be me.

Fortunately, I’m not responsible for keeping anybody alive. The blue and white commercial chopper did all the work, from takeoff to touchdown. It navigated and executed those turns and banks, all the while scanning its surroundings for trees, power lines, birds, and other aircraft. I merely played conductor, occasionally tapping the tablet strapped to my right knee to direct it here or there.

This, of course, is no ordinary helicopter. It’s the testbed for Sikorsky’s Matrix Technology, a suite of systems to boost helicopter safety by reducing pilot workload to the point where the crew can focus on what they need to do rather than how to do it. The mission, not the mechanics. Its maker calls it SARA, for Sikorsky Autonomy Research Aircraft. And ultimately, it could do a lot more than make flying a helicopter safer and easier.


While the future of autonomous urban air taxis has drawn new players and bold plans into aviation, Sikorsky has been quietly developing its own solution, putting itself at the head of the helicopter autonomy field. In the next year or so, it will include Matrix features in the Black Hawks it builds for the US Army. Applications like oil rig transport and search-and-rescue missions will follow. And yes, this tech could someday enable those flying cars we keep hearing about. The company, which Lockheed Martin acquired in 2015, recently announced it’s formally entering the urban mobility race, using the Matrix system, electric propulsion tech, and its data systems. Already, Sikorsky says that the Matrix tech is essentially ready to start piloting flying cars—even if those flying cars don’t exactly exist just yet.

My flight lent that claim some credence. The tablet on my knee showed a moving map (with real images, Google Satellite view-style) and a few command options. You can load in a preset mission or just point to a spot on the map and enter your speed and altitude preferences. The computer—tucked into a corner behind the pilot’s seat and surrounded by way more test and evaluation hardware than it actually takes to fly the chopper—then calculates the best route. Tap Execute and the helicopter takes off and goes to work. Along the way, inertial guidance systems and GPS keep it on track, while external sensors, including lidar and cameras, watch for obstacles and potential landing sites should something go wrong.


To take over, the pilot simply starts working the controls and the autonomous system gives way. When he lets go again, the computer retakes control. It’s a two-way backup: The helicopter is always ready to take over from the pilot, just as the pilot is always able to take over from the computer.

Along with the tablet, the SARA comes with two “inceptors,” one on either side of my seat. These hand controls let the pilot interrupt the flight plan to change the helicopter’s direction or position, or just fly around. The left control manages the throttle and yaw. The right is a joystick that allows for horizontal control: pitching forward, backward, left, or right. Even these, though, are far simpler than a helicopter’s four standard controls: the cyclic, throttle, collective, and pedals. When the pilot uses the inceptors, the computer finds the right combination of controls to deliver what they want. The result is a control system that’s as intuitive as any videogame.

I flew us to different positions above the tarmac at Sikorsky’s headquarters, swinging around easily and using a combination of visuals and the map to position myself. If I kept my pace below five knots, the computer stopped and hovered when I released the controls. If I went faster, it maintained my heading and speed. The helicopter did everything smoothly and predictably, even when flying via the inceptors, regardless of how unsteady my hand may have been.
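That release behavior amounts to a simple arbitration rule between pilot and autopilot. Here is a hypothetical sketch of the logic as described; the names, threshold handling, and structure are my illustration, not Sikorsky’s software:

```python
# Hypothetical sketch of SARA's behavior when the pilot releases the inceptors.
HOVER_THRESHOLD_KNOTS = 5.0

def on_controls_released(speed_knots: float, heading_deg: float) -> dict:
    """Decide what the autonomy does the moment the pilot lets go."""
    if speed_knots < HOVER_THRESHOLD_KNOTS:
        # Moving slowly: stop and hold the current position.
        return {"mode": "hover", "hold_position": True}
    # Moving faster: keep flying at the current heading and speed.
    return {"mode": "cruise", "heading_deg": heading_deg,
            "speed_knots": speed_knots}

print(on_controls_released(3.0, 270.0))    # -> hover in place
print(on_controls_released(40.0, 270.0))   # -> maintain heading and speed
```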

Mostly, though, I used the tablet to command the whirlybird. The current interface is aimed at pro pilots, with the commensurate degree of detail and data. The design is evolving, says Igor Cherepinsky, director of Sikorsky’s autonomy program. Eventually, it will take on a form that non-pilots can readily use. But it was simple enough for me, and its broad capability made the entire flight thoroughly uneventful, right down to the easy, bounce-free touchdown. That sort of effortless-seeming flight is, of course, precisely what you’d want from a robot or noob pilot, especially if and when future air taxis start shuttling us around the skies.



via Wired Top Stories https://ift.tt/2uc60ci

March 19, 2019 at 08:06AM

SpaceX’s Falcon Heavy Megarocket to Fly 1st Commercial Mission in April: Report

https://www.space.com/spacex-falcon-heavy-arabsat6a-april-2019.html

SpaceX plans to launch the first commercial mission of its Falcon Heavy megarocket early next month, according to media reports.

The company is targeting April 7 for the launch of the Arabsat 6A communications satellite from historic Pad 39A at NASA’s Kennedy Space Center in Florida, CNBC reported Friday (March 15), citing anonymous sources. SpaceX has not officially announced a launch target for the Falcon Heavy mission. 

The 13,200-lb. (6,000 kilograms) Arabsat 6A was built by Lockheed Martin and will be operated by the Saudi Arabian company Arabsat.

Related: SpaceX’s Epic Falcon Heavy Road Trip with Starman in Photos

The reusable Falcon Heavy, the most powerful rocket flying today, has one spaceflight under its belt — a test mission that lifted off on Feb. 6, 2018. The rocket launched SpaceX founder and CEO Elon Musk’s red Tesla Roadster — driven by a spacesuit-clad dummy dubbed Starman — into orbit around the sun.

The Falcon Heavy is based on SpaceX’s two-stage Falcon 9 rocket, which has executed dozens of launches and first-stage landings over the past few years. The Heavy lashes together three Falcon 9 first stages; the Heavy’s second stage, and the payload to be launched, sit atop the central first stage “core.”

Two of the Falcon Heavy’s three first-stage boosters aced their landings during the rocket’s February 2018 test flight. The central core came close, narrowly missing its target “drone ship” in the Atlantic Ocean off the Florida coast.

Mike Wall’s book about the search for alien life, “Out There” (Grand Central Publishing, 2018; illustrated by Karl Tate), is out now. Follow him on Twitter @michaeldwall. Follow us on Twitter @Spacedotcom or Facebook

Have a news tip, correction or comment? Let us know at community@space.com.

via Space.com https://ift.tt/2CqOJ61

March 18, 2019 at 07:04PM

A meteor exploded over the Bering Sea with the energy of 10 atomic bombs

https://www.popsci.com/bering-sea-meteor-explosion-10-atomic-bombs?dom=rss-default&src=syn


And we never saw it coming.

Scientists recently observed a meteor exploding over the Bering Sea with the energy of 10 atomic bombs. It’s officially the second largest fireball of its kind to occur…

via Popular Science – New Technology, Science News, The Future Now https://ift.tt/2k2uJQn

March 18, 2019 at 05:22PM

Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99

https://www.anandtech.com/show/14101/nvidia-announces-jetson-nano

Today at GTC 2019, Nvidia launched a new member of the Jetson family: the Jetson Nano. The Jetson family of products represents Nvidia’s focus on robotics, AI and autonomous machine applications. A few months back we had the pleasure of taking a high-level look at the Jetson AGX as well as the Xavier chip that powers it.

The biggest concern with the AGX dev kit was its pricing: with a retail cost of $2,500 ($1,299 as part of Nvidia’s developer program), it’s massively out of range for most hobbyist users, such as our readers.

The new Jetson Nano addresses the cost issue in quite a dramatic way. Here Nvidia promises to deliver a level of functionality similar to that of its more expensive Jetson products at a much lower price point, and of course at a lower performance point.

The Jetson Nano is a full-blown single-board computer in the form of a module. The module uses an SO-DIMM form factor and connector, similar to Nvidia’s past modules. The goal is to keep the footprint as compact as possible, as the module is envisioned for a wide variety of applications in which customers will design their own carrier boards to best fit their needs.

At the heart of the Nano module we find Nvidia’s “Erista” chip, the same Tegra X1 that powered the Nvidia Shield as well as the Nintendo Switch. The variant used in the Nano is a cut-down version, though: the four Cortex-A57 cores only clock up to 1.43 GHz, and the GPU has only half the cores (128 versus 256 in the full X1) active. The module comes with 4 GB of LPDDR4 and 16 GB of eMMC storage. The standalone Jetson Nano module for use in COTS production will be available to interested parties for $129 per unit in quantities of 1,000.
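For a rough sense of the GPU compute on offer, peak throughput follows from the core count and clock. A sketch of the arithmetic, assuming the commonly cited 921.6 MHz Maxwell clock and double-rate FP16 (the clock is my assumption; Nvidia didn’t state it here):

```python
# Rough peak-FLOPS estimate for the Nano's cut-down Tegra X1 GPU (illustrative).
cuda_cores = 128            # half of the full X1's 256 cores
gpu_clock_hz = 921.6e6      # assumed GPU clock
flops_per_fma = 2           # one fused multiply-add counts as 2 ops
fp16_rate = 2               # Tegra X1's Maxwell runs FP16 at twice FP32 rate

peak_fp16 = cuda_cores * gpu_clock_hz * flops_per_fma * fp16_rate
print(f"Peak FP16: ~{peak_fp16 / 1e9:.0f} GFLOPS")   # -> ~472 GFLOPS
```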

Naturally, because you can’t do much with the module itself, Nvidia also offers the Jetson Nano in the form of a complete computer: the Jetson Nano Developer Kit. Among the kit’s advantages are vastly better hardware capabilities compared to competing solutions, such as the performance of the SoC, as well as better connectivity: four full-size USB ports (3x 2.0 + 1x 3.0), HDMI, DisplayPort and a Gigabit Ethernet port, along with the usual SDIO, I2C, SPI, GPIO and UART connectors you’re used to on such boards. One even finds an M.2 connector for additional WiFi as well as a MIPI-CSI interface for cameras.


Jetson AGX Dev Kit vs Jetson Nano Dev Kit


Jetbot with Jetson Nano Dev Kit vs Jetson Nano Dev Kit

The Jetson Nano Developer Kit can be had for only $99. One way Nvidia reaches this price is through the omission of on-board storage; the kit is driven purely by microSD card. Availability starts today.

We have the Jetson Nano in-house and will be seeing what fun things Nvidia has cooked up for us soon!

via AnandTech https://ift.tt/phao0v

March 18, 2019 at 06:06PM