Tim Sweeney wants Unreal to power the cross-platform revolution

It’s 2018 and developers are finally taking mobile games seriously — or it’s the other way around, depending on whom you ask.

"I think what we are seeing is now these AAA games from traditional PC and console developers going mobile, and they are among the most popular mobile games that exist," Epic Games co-founder Tim Sweeney says.

Epic CTO Kim Libreri jumps in and adds, "I think it’s almost the other way, I think it’s that mobile developers are taking games more seriously."

Either way, the mobile game market has shifted drastically over the past few years, and today major developers are building massive experiences for tiny screens, often putting fully fledged PC and console titles directly on handheld devices. Think Fortnite, Ark: Survival Evolved, PlayerUnknown’s Battlegrounds and Rocket League. All of these games, and countless others, run on Unreal, Epic’s engine (and Fortnite is Epic’s in-house baby, of course).

Running on Unreal means these games can play across all platforms with relative ease — the same code is powering the PlayStation 4, Xbox One, Nintendo Switch, PC and mobile editions of each game. It’s the same title across all platforms.

That means there’s no reason, say, Xbox One and PlayStation 4 players can’t link up and jump into games together. Well — there’s no technical reason. Sony has long been a holdout in this space, refusing to allow cross-console play. Both Microsoft and Nintendo are open to the idea, while the PC and mobile markets have been primed for years.

"Fortnite runs on six platforms, so there are 36 combinations of platforms and 35 of them can all play together," Sweeney says. "We’re one link away from having it all connected. But we’re talking with everybody and I feel that it’s now becoming inevitable, as these trends of people playing across platforms. Eventually you won’t be able to tell kids in school, ‘Sorry, you can’t play with those particular people who are your friends because they’re on a different platform.’ That’s just not gonna hold water anymore."

It’s not going to make sense from a business perspective, either, Sweeney argues.

"We’re one link away from having it all connected."

"At the core of these businesses is profit and loss calculations but really, what gaming is about ultimately is people," he says. "Can you imagine how dysfunctional Facebook would be if people who were using Facebook on iOS weren’t allowed to communicate with people using Facebook on Android? But that’s the state of console gaming right now."

Epic is supporting the cross-platform trend with Unreal Engine. The latest version will make it easier for developers to bring their console or PC games to mobile devices, using Fortnite as a successful case study. Another improvement heading to Unreal 4.20, which lands for developers this summer, is a new live record and replay feature. It lets players cinematically view and edit their gameplay after a match is done — not only helping serious players study their strategies, but also empowering YouTubers and Twitch streamers to create movie-like highlight reels.

Coming soon

Looking to the future, Epic is working on fine-tuning high-end graphics capabilities and motion-capture animation processes — the kinds of tools major AAA developers might use. Partnering with NVIDIA and Microsoft on new ray-tracing technology, Epic showed off a GDC demo set in the Star Wars universe, featuring the technique running in real time. The quality was stunning, but this kind of tech isn’t quite ready for everyday consumers.

As Liberi explains it, "It’s running on a quite powerful piece of hardware right now because experimental technology runs on a –"

"It’s one PC with four GPUs," Sweeney chimes in.

"Four GPUs, yeah. Nvidia DGX-1 with four GPUs."

That’s certainly not what most folks have at home, but the tech should catch up to accessible gaming hardware in the near future.

In other news of a visually striking nature, Epic also developed a real-time motion-capture animation system in partnership with 3Lateral. Using the company’s Meta Human Framework volumetric capture, reconstruction and compression technology, Epic was able to digitize a performance by actor Andy Serkis in a shockingly lifelike manner — in real-time and without any manual animation. The technology also allowed Epic to seamlessly transfer Serkis’ performance (a Macbeth monologue) onto the face of a 3D alien creature.

Partnering with 3Lateral, Cubic Motion, Tencent and Vicon, Epic also showed off Siren, a digital character rendered in real-time based on a live actress’ performance.

"[3Lateral] is the company that actually builds the digital faces that then we work out how to make them look realistic in the engine," Liberi says. "What they’re able to do is what they call four-dimensional capturing, which is like a scan but it’s a scan that moves over time. Because of that, they’re able to refine the facial animation systems for the digital human to get all the nuances of every wrinkle, how every piece of flesh moves."

via Engadget http://www.engadget.com

March 21, 2018 at 12:24PM

‘Monty Python’ is silly-walking onto Netflix

A big chunk of Monty Python’s catalog is coming to Netflix UK in April and the US later in the year. The slate includes the iconic films Monty Python and the Holy Grail and Monty Python’s Life of Brian, along with the TV series Monty Python’s Flying Circus and several live specials. They’ll all hit the service at once, so UK viewers can start binging on April 15th (don’t forget a wafer-thin mint afterwards). US users will have to wait, and "not all titles will be available at the same time in all territories," Python’s website says.

It’s the first time much of the catalog has come to Netflix, though Monty Python’s The Meaning of Life is missing from the list. The troupe’s members were Eric Idle, John Cleese, Michael Palin, Terry Jones, Terry Gilliam and the late Graham Chapman. The complete list of available Netflix titles is below.

  • Monty Python and the Holy Grail
  • Monty Python’s Life of Brian
  • Monty Python’s Flying Circus
  • Monty Python’s Fliegender Zirkus: season 1
  • Monty Python’s Personal Best: season 1
  • Monty Python Conquers America
  • Monty Python’s Almost the Truth
  • The Meaning of Monty Python
  • Monty Python’s Best Bits (mostly): season 1
  • Monty Python Live (Mostly): One Down, Five to Go
  • Monty Python: The Meaning of Live

Source: Monty Python

via Engadget http://www.engadget.com

March 22, 2018 at 08:39AM

Unreal Engine + $60,000 GPU = Amazing, real-time raytraced Star Wars [Updated]

SAN FRANCISCO—In the computer graphics community this week, companies from Nvidia to Microsoft have been stressing just how important real-time raytracing will be to making games look more movie-like in the near future. Epic Games used a striking demo at a Game Developers Conference keynote presentation this morning to show just how much better raytracing can make real-time, interactive graphics look with top-of-the-line hardware right now.

The Star Wars “Reflections” demo, made with the cooperation of Nvidia and ILMxLAB, showed two extremely realistic-looking and talkative Stormtroopers clamming up in an elevator when the shiny Captain Phasma pops in. Running on what Epic still refers to as “experimental code” (planned to be introduced to Unreal Engine for production later this year), the raytracing in the demo allows background elements like the guns and the opening elevator doors to reflect accurately off Phasma’s mirror-like armor in real time. These are the kinds of effects that, as Epic CTO Kim Libreri highlights, they’ve “never been able to do before [with rasterized graphics].”

With raytracing, developers can change the shape of the light in a scene on the fly, in turn changing the character and the diffuse “softness” of the shadows in the scene. The way the raytraced shadows allow for gentle gradations of light as characters and objects block parts of the scene is one of the most apparent improvements over the harsher, “pre-baked” light in rasterized graphics. The raytracing technology also gives the scene a realistic, cinematic depth-of-field effect automatically, with no need for fancy shaders or tricks implemented manually by developers.
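
To make the soft-shadow point concrete, here is a minimal Python sketch of the idea (an illustration of the general technique, not Epic’s or Nvidia’s code; every name and number in it is made up): a raytracer estimates how much of an area light a surface point can "see" by casting shadow rays toward many sample positions on the light. A point light gives a binary lit/unlit answer and a hard edge; a larger light yields fractional visibility, i.e. a soft penumbra.

```python
# Minimal sketch: why area lights plus shadow rays give soft shadows.
# Hypothetical illustration only; not Epic's or Nvidia's implementation.
import numpy as np

def sphere_blocks(origin, target, center, radius):
    """True if the segment origin->target is blocked by a sphere occluder."""
    d = target - origin
    dist = np.linalg.norm(d)
    d = d / dist
    oc = origin - center
    b = np.dot(d, oc)
    disc = b * b - (np.dot(oc, oc) - radius * radius)
    if disc < 0.0:
        return False                    # ray line misses the sphere entirely
    t = -b - np.sqrt(disc)
    return 1e-6 < t < dist              # hit lies between point and light

def visibility(point, light_center, light_size, occ_center, occ_radius, n=256):
    """Fraction of an n-sample square area light visible from `point`."""
    rng = np.random.default_rng(0)
    offsets = (rng.random((n, 2)) - 0.5) * light_size   # jitter across light
    unblocked = 0
    for ox, oz in offsets:
        sample = light_center + np.array([ox, 0.0, oz])
        if not sphere_blocks(point, sample, occ_center, occ_radius):
            unblocked += 1
    return unblocked / n   # 1.0 = fully lit, 0.0 = umbra, between = penumbra

point = np.array([0.0, 0.0, 0.0])       # shaded surface point
light = np.array([0.0, 10.0, 0.0])      # area light centered overhead
occluder = np.array([0.0, 5.0, 0.6])    # sphere partly in the way, radius 1
for size in (0.5, 2.0, 6.0):            # growing the light softens the shadow
    print(size, visibility(point, light, size, occluder, 1.0))
```

Growing the light from 0.5 to 6.0 units moves the same surface point from deep shadow into a wide penumbra, which is the "diffuse softness" effect described above; changing the light's shape on the fly just means drawing the samples from a different region.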

Getting a “cinematic” 24fps with real-time raytracing still requires some serious hardware: the demo is currently running on Nvidia’s ultra-high-end, four-GPU DGX Station, which lists for $60,000. [Update: After publication, Epic reached out to clarify that the demo was running on a DGX Station, and not a DGX-1 as originally stated during the interview.] Even with that, some elements of the scene, like the walls, need to be rasterized rather than made fully reflective, Libreri told Ars. And the Volta technology that is key to powering this kind of performance isn’t even available in consumer-grade GPUs below the $3,000+ Titan V, so don’t expect this kind of scene to run on your home gaming rig in the near term.

But Libreri adds that the tech requirements have been lessened considerably by a de-noising process that dynamically raises or lowers the number of simulated light rays needed for each part of the scene, based on its importance to the scene as a whole. Nvidia’s GameWorks de-noising code can get things down to as low as two rays per pixel (from a maximum of six), drastically lowering the GPU overhead needed to do these complex lighting calculations in real time.
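
The article doesn’t spell out how the GameWorks denoiser allocates rays, so the following is only a hypothetical Python sketch of importance-driven ray budgeting under the two-to-six-rays-per-pixel numbers quoted above: smooth pixels get the two-ray floor and lean on the denoiser, while high-variance pixels (edges, sharp reflections) get up to the six-ray maximum.

```python
# Hypothetical sketch of importance-driven ray budgeting; the real
# GameWorks de-noising pipeline is not described in this article.
import numpy as np

def rays_per_pixel(importance, lo=2, hi=6):
    """Map per-pixel importance in [0, 1] to a ray count in [lo, hi]."""
    imp = np.clip(importance, 0.0, 1.0)
    return (lo + np.round(imp * (hi - lo))).astype(int)

# Toy stand-in for "importance": gradient magnitude of a random frame,
# so visually busy regions score high and flat regions score low.
frame = np.random.default_rng(1).random((4, 4))
gy, gx = np.gradient(frame)
importance = np.hypot(gx, gy)
importance = importance / importance.max()

budget = rays_per_pixel(importance)
print(budget)                              # each entry between 2 and 6
print("mean rays/pixel:", budget.mean())   # well under the 6-ray maximum
```

The mean ray count is what drives the savings: if most of the frame sits near the two-ray floor, the per-frame ray budget drops by roughly two-thirds versus shooting six rays everywhere.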

“We’ll be in a hybrid place for a few years where part is rasterized, part is raytraced,” Libreri said. “The film business was based on rasterization for two decades [and] finally started making the move to raytracing a decade ago.”

He added, “It’s just a matter of time” before raytracing is the standard in games, too.

In the meantime, though, we can enjoy this glimpse of just how realistic PC games could look in the future.

via Ars Technica https://arstechnica.com

March 21, 2018 at 12:24PM

Report: Google is buying innovative camera startup Lytro for $40 million

A report from TechCrunch claims that Google is going to buy the camera company Lytro for “around 40 million dollars.” Lytro is best known for creating an innovative “light field camera,” but the company has lately pivoted to professional camera technology for filmmaking and capturing VR video.

You might remember the first Lytro camera, which came in a crazy “tube” form factor with a lens at one end and a 1.5-inch touchscreen on the other. The tube was full of lenses and a special “Light Field Sensor” that would capture images as light-field data rather than a grid of pixels. The benefit was that you could just take a picture without worrying about the focus, and you could later selectively focus the image however you wanted. The downside was that you needed a much denser CMOS sensor to capture a high-megapixel image. In 2012, when the camera came out, Lytro could compute all this light-field data down to only a 1MP image.
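
For a sense of how the "focus later" trick works, here is a minimal Python sketch of shift-and-add refocusing, a standard light-field technique (not Lytro’s proprietary pipeline; the toy data and every parameter are invented for illustration). Each lens position sees a near object at a slightly shifted image position (parallax), and shifting the sub-aperture views in proportion to their offset from the aperture center before averaging moves the virtual focal plane.

```python
# Minimal shift-and-add refocusing sketch over a synthetic 4D light field.
# Illustrative only; not Lytro's actual pipeline.
import numpy as np

def refocus(lightfield, alpha):
    """Refocus a (U, V, H, W) light field: one image per (u, v) lens position.

    alpha sets the virtual focal plane; each view is shifted in proportion
    to its offset from the aperture center, then all views are averaged.
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            out += np.roll(lightfield[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Toy 5x5-view light field: a bright square whose apparent position drifts
# 2 px per view step, as a near object would (parallax).
U = V = 5
lf = np.zeros((U, V, 32, 32))
for u in range(U):
    for v in range(V):
        r = 12 + 2 * (u - 2)
        c = 12 + 2 * (v - 2)
        lf[u, v, r:r + 4, c:c + 4] = 1.0

sharp = refocus(lf, alpha=-2.0)    # shift exactly cancels the parallax
blurry = refocus(lf, alpha=0.0)    # plain average smears the square
print(sharp.max(), blurry.max())   # 1.0 vs. a much lower peak
```

The sketch also makes the resolution trade-off visible: the sensor records U × V × H × W samples, but every refocused photo is only H × W pixels, which is why the 2012 camera’s dense light-field capture boiled down to roughly a 1MP image.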

via Ars Technica https://arstechnica.com

March 21, 2018 at 12:56PM

Best bad idea ever? Why Putin’s nuclear-powered missile is possible… and awful

In a March 1, 2018 speech before Russia’s Federal Assembly, Russian President Vladimir Putin discussed new strategic weapons being developed to counter United States ballistic missile defenses. Two of these weapons are allegedly nuclear-powered: a previously revealed intercontinental-range nuclear torpedo and a cruise missile. As Putin described them:

Russia’s advanced arms are based on the cutting-edge, unique achievements of our scientists, designers, and engineers. One of them is a small-scale, heavy-duty nuclear energy unit that can be installed in a missile like our latest X-101 air-launched missile or the American Tomahawk missile—a similar type but with a range dozens of times longer, dozens—basically an unlimited range. It is a low-flying stealth missile carrying a nuclear warhead, with almost an unlimited range, unpredictable trajectory and ability to bypass interception boundaries. It is invincible against all existing and prospective missile defense and counter-air defense systems.

Defense and nuclear disarmament experts did a double take. “I’m still kind of in shock,” Edward Geist, a Rand Corporation researcher specializing in Russia, told NPR. “My guess is they’re not bluffing, that they’ve flight-tested this thing. But that’s incredible.”

via Ars Technica https://arstechnica.com

March 22, 2018 at 06:04AM

Non-Lethal Weapon: DOD seeks to use lasers to create shouting will-o’-the-wisp

The Department of Defense’s Joint Non-Lethal Weapons Development Program (JNLWD) is closing in on a directed energy weapon that can literally tell people to go away—creating sound waves with laser pulses that can annoy, frighten, or otherwise send the message to people approaching a military unit that getting closer is not a good idea.

The Non-Lethal Laser-Induced Plasma Effect (NL-LIPE) system manipulates air molecules with a stream of femtosecond-long laser bursts, creating a ball of plasma that is oscillated to produce sound waves. A first laser creates the plasma ball, and a second then oscillates it to create the sound. As Defense One’s Patrick Tucker reports, the current implementation can only manage an unintelligible mumble—though it can create a wide variety of very distinct sounds, as demonstrated in the video below.

A video of the Laser-Induced Plasma Effect in action.

David Law, JNLWD’s Technology Division chief, believes that, within the next three years, the system will be able to create intelligible speech from a glowing ball of plasma hovering in the air at a distance. “We’re this close to getting it to speak to us,” Law told Tucker. “I need three or four more kilohertz.”

While it can’t talk clearly yet, NL-LIPE can create the equivalent of a stun grenade (or “flashbang”), and it could be combined with other non-lethal laser applications. NL-LIPE could also be used to scorch or burn clothing, as shown in this DOD video.

More demonstrations of NL-LIPE in action.

NL-LIPE is just one of a variety of non-lethal systems the military is researching to provide soldiers with “area denial” tools beyond the deadly kind. Other non-lethal weapons in development include the Active Denial System, a long-range millimeter-wave directed-energy weapon that “creates a heating sensation,” as JNLWD puts it, to keep crowds at a distance without causing injury. And the Marine Corps is acquiring a “hail and warning” device called the Ocular Interrupter System—a green laser system combined with a range-finder that can be aimed at someone up to 500 meters away.

via Ars Technica https://arstechnica.com

March 22, 2018 at 06:19AM

SpaceX launch last year punched huge, temporary hole in the ionosphere

The Formosat-5 mission launches in August 2017 from Vandenberg Air Force Base in California. (Image: SpaceX)

Contrary to popular belief, most of the time when a rocket launches, it does not go straight up into outer space. Rather, shortly after launch, most rockets will begin to pitch over into the downrange direction, limiting gravity drag and stress on the vehicle. Often, by 80 or 100km, a rocket is traveling nearly parallel to the Earth’s surface before releasing its payload into orbit.
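
As a back-of-the-envelope illustration of that gravity drag (a toy Python calculation, not a real trajectory model; the burn time and acceleration below are invented round numbers), the delta-v lost to gravity over a burn is roughly the integral of g times the sine of the flight-path angle, so a vertical ascent pays the full penalty while a pitched-over one pays far less:

```python
# Toy gravity-drag comparison; assumed round numbers, not a real Falcon 9 model.
import math

G = 9.81        # m/s^2, treated as constant near the surface
BURN = 160.0    # s, assumed burn duration
ACCEL = 25.0    # m/s^2, assumed constant thrust acceleration

def gravity_loss(flight_path_deg):
    """Delta-v lost to gravity at a constant flight-path angle, in m/s."""
    return G * math.sin(math.radians(flight_path_deg)) * BURN

total = ACCEL * BURN  # delta-v the engines deliver over the burn
for angle in (90, 60, 30):  # 90 deg = straight up, Formosat-5 style
    print(f"{angle:2d} deg: ~{gravity_loss(angle):4.0f} m/s "
          f"of {total:.0f} m/s eaten by gravity")
```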

However, in August of last year, a SpaceX Falcon 9 launched from California without making such a pitch-over maneuver. Rather, the Formosat-5 mission launched vertically and stayed that way for most of its ascent into space. The rocket could do this because the Taiwanese payload was light for the Falcon 9, weighing only 475kg and bound for an orbit 720km above the Earth’s surface.

As a result of this launch profile, the rocket maintained a nearly vertical trajectory all the way through much of the Earth’s ionosphere, which ranges from about 60km above the planet to 1,000km up. In doing so, the Falcon 9 booster and its second stage created unique, circular shockwaves. The rocket launch also punched a temporary, 900-km-wide hole into the plasma of the ionosphere.

Circular shock waves

Scientists used to think the Sun’s radiation dominated the Earth’s extremely tenuous atmosphere in the ionosphere, but in the last decade they have begun to understand that weather at the planet’s surface can also change conditions far above. Researchers are increasingly interested in the effects of rockets, too. This is because disturbances in the ionized and neutral particles of the ionosphere have consequences for satellites, such as causing errors in Global Positioning System navigation.

via Ars Technica https://arstechnica.com

March 22, 2018 at 09:47AM