Google Announces Stadia: A Game Streaming Service

https://www.anandtech.com/show/14105/google-announces-stadia-a-game-streaming-service

Today at GDC, Google announced its new video game streaming service, called Stadia. This builds on news from earlier this year that AMD was powering Project Stream (as the service was then called) with Radeon Pro GPUs, and that Google is a primary partner for AMD’s next-generation CPUs and GPUs.

Stadia is being advertised as the central community for gamers, creators, and developers. The idea is that people can play a wide array of games regardless of the hardware at hand. Back in October, Google debuted the technology, showcasing a top-end AAA gaming title running at 60 FPS. Google wants a single place where gamers and YouTube creators can get together – no current gaming platform, according to Google, does this.

Ultimately Google wants to stream straight to the Chrome browser. Google worked with leading publishers and developers to help build the system infrastructure. Google is one of the few companies with enough content delivery networks around the world to ensure that frame rates stay high with very low latency.

Users will be able to watch a video about a game, then instantly hit ‘Play Now’ and start playing in under five seconds, without any download or lag. The idea is that a single code base can be enjoyed on any screen. At launch, desktops, laptops, TVs, tablets, and phones will be supported. With Stadia, the datacenter is the platform: no hardware acceleration is required on the device. The experience can be transferred between devices, such as from a Chromebook to a smartphone.

One of the highlights of Google’s demonstration of Stadia was the platform working on Google-enabled TVs.

The platform allows users to use any USB-connected controller, or a mouse and keyboard. Google will also be releasing its own Stadia Controller, available in three colors – white, black, and light blue. The controller connects via Wi-Fi straight to the cloud, and also detects which device is being used (it’s unclear exactly how this works).

The controller has two new buttons. The first allows saving and sharing the experience out to YouTube. The second is Google Assistant, using the integrated microphone in the controller. This allows game developers to integrate Google Assistant into their games.

Stadia uses the same datacenter infrastructure already in place at Google. More than 7,500 edge nodes allow compute resources to sit closer to players for lower latency. Custom-designed, purpose-built hardware powers the experience. Interconnected racks have sufficient compute and memory for the most demanding games. The technology has been in development inside Google for years.

At launch, resolutions up to 4K at 60 fps with HDR and surround sound will be supported. Support for up to 8K streaming at 120 fps is planned, and the platform has been built to scale to support it. While playing, the stream is duplicated in 4K for direct upload – you get render-quality video rather than a local capture.

The platform is instance based, so Google can scale when needed. Game developers no longer have to worry about building to a specific hardware performance – the datacenter can scale as required.

The hardware is a custom AMD GPU with 10 TFLOPS of compute, paired with a custom CPU with AVX2 support; combined they create a single instance per player. The platform runs Linux and uses Vulkan, with full Unreal and Unity support, as well as Havok engine support. Tool companies are on board.

(When Google says custom CPU and custom GPU – this could be early hardware of AMD’s upcoming generations of technology, put into a custom core configuration / TDP. We’re likely looking at a Zen 2 based CPU, based on AVX2 support listed, and a Radeon Instinct based GPU with tweaked settings specifically for Google.)
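That 10 TFLOPS figure can be sanity-checked with the standard peak-throughput formula: shader count x 2 ops per clock (fused multiply-add) x clock speed. The compute-unit count and clock in the sketch below are purely illustrative assumptions, not announced Stadia specs:

```python
# Peak FP32 throughput = shaders x 2 ops/clock (FMA) x clock speed.
# The 56 CUs and 1.4 GHz are illustrative guesses, not announced Stadia specs.
def peak_tflops(compute_units, shaders_per_cu=64, clock_ghz=1.4):
    shaders = compute_units * shaders_per_cu
    return shaders * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

print(round(peak_tflops(56), 1))  # ~10.0 TFLOPS with these assumed figures
```

With 64 shaders per CU (standard for GCN-era AMD GPUs), a 56-CU part at 1.4 GHz lands almost exactly on the quoted number, which is why a tweaked Radeon Instinct-class GPU is a plausible fit.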

One of the first games supported will be Doom Eternal from id Software, which will support 4K with HDR at 60 fps.

This is breaking news; please refresh in case there are updates.

via AnandTech https://ift.tt/phao0v

March 19, 2019 at 12:19PM

Sikorsky’s Self-Flying Helicopter Hints at the Flying Future

https://www.wired.com/story/sikorsky-sara-helicopter-autonomous-flying-car-air-taxi-tech

As helicopter flights go, this one was especially boring. We took off, hovered for a bit, and maneuvered around the airport. We flew to a spot about 10 miles away, did some turns and gentle banks, then came back and landed. I’ve been on more exciting Ferris wheels, with views more inspiring than those of rural Connecticut. Still, the flight was impressive for at least one reason: The pilot controlling the 12,000-pound Sikorsky S-76 had never before operated a helicopter. That would be me.

Fortunately, I’m not responsible for keeping anybody alive. The blue and white commercial chopper did all the work, from takeoff to touchdown. It navigated and executed those turns and banks, all the while scanning its surroundings for trees, power lines, birds, and other aircraft. I merely played conductor, occasionally tapping the tablet strapped to my right knee to direct it here or there.

This, of course, is no ordinary helicopter. It’s the testbed for Sikorsky’s Matrix Technology, a suite of systems to boost helicopter safety by reducing pilot workload to the point where the crew can focus on what they need to do rather than how to do it. The mission, not the mechanics. Its maker calls it SARA, for Sikorsky Autonomy Research Aircraft. And ultimately, it could do a lot more than make flying a helicopter safer and easier.


While the future of autonomous urban air taxis has drawn new players and bold plans into aviation, Sikorsky has been quietly developing its own solution, putting itself at the head of the helicopter autonomy field. In the next year or so, it will include Matrix features in the Black Hawks it builds for the US Army. Applications like oil rig transport and search-and-rescue missions will follow. And yes, this tech could someday enable those flying cars we keep hearing about. The company, which Lockheed Martin acquired in 2015, recently announced it’s formally entering the urban mobility race, using the Matrix system, electric propulsion tech, and its data systems. Already, Sikorsky says that the Matrix tech is essentially ready to start piloting flying cars—even if those flying cars don’t exactly exist just yet.

My flight lent that claim some credence. The tablet on my knee showed a moving map (with real images, Google Satellite view-style) and a few command options. You can load in a preset mission or just point to a spot on the map and enter your speed and altitude preferences. The computer—tucked into a corner behind the pilot’s seat and surrounded by way more test and evaluation hardware than it actually takes to fly the chopper—then calculates the best route. Tap Execute and the helicopter takes off and goes to work. Along the way, inertial guidance systems and GPS keep it on track, while external sensors, including lidar and cameras, watch for obstacles and potential landing sites should something go wrong.


To take over, the pilot simply starts working the controls and the autonomous system gives way. When he lets go again, the computer retakes control. It’s a two-way backup: The helicopter is always ready to take over from the pilot, just as the pilot is always able to take over from the computer.

Along with the tablet, the SARA comes with two “inceptors” on either side of my seat. These hand controls let the pilot interrupt the flight plan to change the helicopter’s direction or position, or just fly around. The left control manages the throttle and yaw. The right is a joystick that allows for horizontal control, pitching forward, backward, left, or right. Even these, though, are far simpler than a helicopter’s four standard controls: the cyclic, throttle, collective, and pedals. When the pilot uses the inceptors, the computer finds the right combination of controls to deliver what they want. The result is a control system that’s as intuitive as any videogame.

I flew us to different positions above the tarmac at Sikorsky’s headquarters, swinging around easily and using a combination of visuals and the map to position myself. If I keep my pace below five knots, the computer stops and hovers when I release the controls. If I go faster, it maintains my heading and speed. The helicopter did everything smoothly and predictably even when flying via the inceptors, regardless of how unsteady my hand may have been.
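The release behavior described above (stop and hover below five knots, otherwise hold heading and speed) amounts to a simple mode switch. A minimal sketch, where the five-knot threshold comes from the article but the function and mode names are hypothetical:

```python
# Per the article: below this speed, releasing the controls means "stop and hover".
HOVER_THRESHOLD_KNOTS = 5.0

def on_controls_released(speed_knots):
    """Pick the autonomy behavior when the pilot lets go of the inceptors."""
    if speed_knots < HOVER_THRESHOLD_KNOTS:
        return "stop_and_hover"
    return "maintain_heading_and_speed"

print(on_controls_released(3.0))   # stop_and_hover
print(on_controls_released(12.0))  # maintain_heading_and_speed
```

The real system would blend this with obstacle data and flight-envelope limits, but the pilot-facing contract is this simple: let go, and the aircraft does something safe and predictable.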

Mostly, though, I used the tablet to command the whirlybird. The current interface is aimed at pro pilots, with the commensurate degree of detail and data. The design is evolving, says Igor Cherepinsky, director of Sikorsky’s autonomy program. Eventually, it will take on a form that non-pilots can readily use. But it was simple enough for me, and its broad capability made the entire flight thoroughly uneventful, right down to the easy, bounce-free touchdown. That sort of effortless-seeming flight is, of course, precisely what you’d want from a robot or noob pilot, especially if and when future air taxis start shuttling us around the skies.



via Wired Top Stories https://ift.tt/2uc60ci

March 19, 2019 at 08:06AM

SpaceX’s Falcon Heavy Megarocket to Fly 1st Commercial Mission in April: Report

https://www.space.com/spacex-falcon-heavy-arabsat6a-april-2019.html

SpaceX plans to launch the first commercial mission of its Falcon Heavy megarocket early next month, according to media reports.

The company is targeting April 7 for the launch of the Arabsat 6A communications satellite from historic Pad 39A at NASA’s Kennedy Space Center in Florida, CNBC reported Friday (March 15), citing anonymous sources. SpaceX has not officially announced a launch target for the Falcon Heavy mission. 

The 13,200-lb. (6,000 kilograms) Arabsat 6A was built by Lockheed Martin and will be operated by the Saudi Arabian company Arabsat.

Related: SpaceX’s Epic Falcon Heavy Road Trip with Starman in Photos

The reusable Falcon Heavy, the most powerful rocket flying today, has one spaceflight under its belt — a test mission that lifted off on Feb. 6, 2018. The rocket launched SpaceX founder and CEO Elon Musk’s red Tesla Roadster — driven by a spacesuit-clad dummy dubbed Starman — into orbit around the sun.

The Falcon Heavy is based on SpaceX’s two-stage Falcon 9 rocket, which has executed dozens of launches and first-stage landings over the past few years. The Heavy lashes together three Falcon 9 first stages; the Heavy’s second stage, and the payload to be launched, sit atop the central first stage “core.”

Two of the Falcon Heavy’s three first-stage boosters aced their landings during the rocket’s February 2018 test flight. The central core came close, narrowly missing its target “drone ship” in the Atlantic Ocean off the Florida coast.

Mike Wall’s book about the search for alien life, “Out There” (Grand Central Publishing, 2018; illustrated by Karl Tate), is out now. Follow him on Twitter @michaeldwall. Follow us on Twitter @Spacedotcom or Facebook

Have a news tip, correction or comment? Let us know at community@space.com.

via Space.com https://ift.tt/2CqOJ61

March 18, 2019 at 07:04PM

A meteor exploded over the Bering Sea with the energy of 10 atomic bombs

https://www.popsci.com/bering-sea-meteor-explosion-10-atomic-bombs?dom=rss-default&src=syn

[Image: a meteor flaming next to Earth]

And we never saw it coming.

Scientists recently observed a meteor exploding over the Bering Sea with the energy of 10 atomic bombs. It’s officially the second largest fireball of its kind to occur…

via Popular Science – New Technology, Science News, The Future Now https://ift.tt/2k2uJQn

March 18, 2019 at 05:22PM

Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99

https://www.anandtech.com/show/14101/nvidia-announces-jetson-nano

Today at GTC 2019, Nvidia launched a new member of the Jetson family: the Jetson Nano. The Jetson family of products represents Nvidia’s new focus on robotics, AI, and autonomous machine applications. A few months back we had the pleasure of taking a high-level look at the Jetson AGX as well as the Xavier chip that powers it.

The biggest concern with the AGX dev kit was its pricing – with a retail cost of $2,500 ($1,299 as part of Nvidia’s developer programme), it’s massively out of range for most hobbyist users, such as our readers.

The new Jetson Nano addresses the cost issue in dramatic fashion. Here Nvidia promises to deliver a similar level of functionality to its more expensive Jetson products at a much lower price point, and of course at a lower performance point.

The Jetson Nano is a full-blown single-board computer in the form of a module. It uses a SO-DIMM form factor and connector, similar to the company’s past modules. The goal of the form factor is to be as compact as possible, as the Nano is envisioned for a wide variety of applications in which customers will design their own carrier boards to best fit their needs.

At the heart of the Nano module we find Nvidia’s “Erista” chip, the same Tegra X1 that powered the Nvidia Shield as well as the Nintendo Switch. The variant used in the Nano is a cut-down version, though: the four Cortex-A57 cores only clock up to 1.43 GHz, and the GPU only has half the cores (128 versus 256 in the full X1) active. The module comes with 4GB of LPDDR4 and a 16GB eMMC module. The standalone Jetson Nano module for use in COTS production will be available to interested parties for $129/unit in quantities of 1,000.
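To put the halved GPU in perspective, peak throughput is CUDA cores x 2 ops per clock (fused multiply-add) x clock speed. The 128-core count comes from the article; the 921.6 MHz clock and the double-rate FP16 mode of the X1’s Maxwell GPU are assumptions on our part, not stated specs:

```python
# Peak GPU throughput = CUDA cores x 2 ops/clock (FMA) x clock speed.
# 128 cores is from the article; the 921.6 MHz clock and double-rate FP16
# (a Tegra X1 Maxwell feature) are assumptions, not stated specs.
def peak_gflops(cuda_cores, clock_mhz, fp16=False):
    gflops = cuda_cores * 2 * clock_mhz / 1000
    return gflops * 2 if fp16 else gflops

print(round(peak_gflops(128, 921.6)))             # ~236 GFLOPS FP32
print(round(peak_gflops(128, 921.6, fp16=True)))  # ~472 GFLOPS FP16
```

Under these assumptions the Nano lands in the hundreds-of-GFLOPS range, a fraction of the AGX but a large step up from typical hobbyist boards at this price.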

Naturally, because you can’t do much with the module by itself, Nvidia also offers the Jetson Nano in the form of a complete computer: the Jetson Nano Developer Kit. Among the advantages of the kit are vastly better hardware capabilities compared to competing solutions, such as the performance of the SoC and simply better connectivity: four full-size USB ports (3x 2.0 + 1x 3.0), HDMI, DisplayPort, and a Gigabit Ethernet port, along with the usual SDIO, I2C, SPI, GPIO, and UART connectors you’re used to on such boards. One even finds an M.2 connector for additional WiFi, as well as a MIPI-CSI interface for cameras.


Jetson AGX Dev Kit vs Jetson Nano Dev Kit


Jetbot with Jetson Nano Dev Kit vs Jetson Nano Dev Kit

The Jetson Nano Developer Kit can be had for only $99. One way Nvidia reaches this price is through the omission of on-board storage; the kit is driven purely by a microSD card. Availability starts today.

We have the Jetson Nano in-house and will be seeing what fun things Nvidia cooked up for us soon!

via AnandTech https://ift.tt/phao0v

March 18, 2019 at 06:06PM

Google Expected to Make Big Gaming Announcement this Week

https://www.legitreviews.com/google-expected-to-make-big-gaming-announcement-this-week_210994

Posted by Shane McGlaun | Mon, Mar 18, 2019 – 9:05 AM

This week is a big one for Google and gamers around the world. Google is expected to announce its entry into the gaming market in a big way this week. While no real details are known, there are expectations for what Google will show off.

The game service is expected to be cloud-based, and it is tipped to have gone by the name Project Yeti. Speculation suggests that the service will have a hardware component. That speculation stems from a Google hardware SVP tweeting a link to Google’s GDC presentation, Inc. reports.

The thought is that there is no reason for a hardware guy to promote the presentation if hardware isn’t a part of it. The service is said to have a separate controller. Any game system must have a controller, so that is no surprise.

If Google launches what is essentially the Netflix of video games, as some have speculated, this would be a significant change for the industry. It remains to be seen what Google offers and how gamers receive it.

via Legit Reviews Hardware Articles https://ift.tt/2Y6Fy3O

March 18, 2019 at 09:10AM

IKEA makes furniture more accessible with 3D printing

https://www.engadget.com/2019/03/17/ikea-makes-furniture-more-accessible-with-3d-printing/

If you live with disabilities, shopping for furniture can be difficult. Many common furniture items aren’t designed with accessibility in mind, and those that are can be rare or nonexistent. IKEA Israel has a technological solution: 3D-printed pieces that make existing furniture easier to use. The store has collaborated with Milbat on ThisAbles, a project that provides 3D-printed add-ons for furniture that can be tough to use with certain conditions. Among the 13 initial items are easier-to-grab handles, bumpers to protect cabinets, and lifts to raise couches.

You’ll have to visit an Israeli IKEA store to see the items in person, and you can only buy ready-made items through Milbat. However, you don’t even have to buy anything if you have a 3D printer and some filament — the designs are available to make for free. You can even ask for customizations in case the designs don’t fit your third-party furniture. This is less about profit and more about encouraging furniture makers to consider accessibility as an important feature, whether it’s built into a given design or available as an extra.

Via: Washington Post

Source: ThisAbles

via Engadget http://www.engadget.com

March 17, 2019 at 07:03AM