Apple Analyst Predicts In-House ARM Processor To Be Used In Next 12-18 Months

https://www.legitreviews.com/apple-analyst-predicts-in-house-arm-processor-to-be-used-in-next-12-18-months_217663

Noted Apple analyst Ming-Chi Kuo has issued a new investor note predicting that Apple will release its first Mac computer with an ARM processor of its own design in the first half of 2021. Kuo expects Apple to ship a new product using an in-house designed processor within the next 12 to 18 months. Apple has traditionally used Intel processors in its Macs.

There have long been rumblings that Apple intends to switch from Intel parts to custom-designed ARM processors for Mac computers. This is the first indication of a timeframe for that change.

The first of these ARM processors is expected to be built on a 5nm process, and Apple has reportedly increased its funding significantly for the research, development, and production of those parts. Kuo also says 5nm chips will be used in the iPhone and iPad later this year, as well as in the Mac due in 2021.

The analyst also notes that a new mini-LED iPad is expected to launch in late 2020 or early 2021. The move to in-house designed processors will give Apple more flexibility with hardware updates and a better ability to fine-tune hardware and software together for the user experience. Developers will face the challenge of ensuring their software supports the ARM-powered Mac computers, reports 9to5Mac.

via Legit Reviews Hardware Articles https://ift.tt/2Y6Fy3O

February 25, 2020 at 09:51AM

See how ‘The Mandalorian’ used Unreal Engine for its real-time digital sets

https://www.engadget.com/2020/02/21/mandalorian-ilm-stagecraft-real-time-digital-sets/

It’s not surprising that a VFX-heavy show like The Mandalorian uses digital sets, even though it also relies heavily on practical, in-camera effects. What’s more unexpected, however, is that the actors were able to see and perform within those sets rather than against a sterile green screen. To pull that off, Jon Favreau and his team worked with Epic Games to develop a new technique that projects digital sets onto LED displays, rendered with the Fortnite creator’s Unreal game engine.

In a new VFX sizzle reel, ILM, Favreau and other members of the production team explain how the cutting-edge technique works. They built a 20-foot-high, 270-degree semicircular LED video wall surrounding a 75-foot-diameter circular stage. ILM’s digital 3D sets, built ahead of production (rather than in post-production, as is the norm on VFX-heavy shows), were projected interactively onto the walls. They could serve as stand-alone backgrounds or as extensions of practical set pieces placed on the stage.

The digital sets weren’t merely pre-rendered imagery, but game-style 3D scenes rendered on the fly by powerful NVIDIA GPUs. They were lit and rendered from the physical camera’s point of view so the parallax stayed correct, which kept the sets from looking like the old-school rear-projection backgrounds often used for traveling vehicle shots. At the same time, the actors were lit with practical LED stage lights positioned to match the lights and the sun in the digital sets.
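For readers curious about what "rendered from the physical camera's point of view" involves, here's a rough sketch of the standard off-axis (generalized perspective) projection used when a fixed screen must show correct parallax for a moving viewer. This is not ILM's or Epic's code; the panel corners and camera position below are made up for illustration.

```python
# Minimal sketch of an off-axis ("generalized perspective") projection:
# given the corners of a fixed screen panel and the tracked camera position,
# build the asymmetric frustum that makes the on-screen image read correctly
# from that position. Illustrative only; not StageCraft or Unreal Engine code.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Projection matrix for a screen defined by three of its corners.

    pa, pb, pc: lower-left, lower-right, upper-left corners (world space)
    pe: eye/camera position (world space)
    The companion view matrix (rotate into the screen's basis, translate
    by -pe) is applied separately before this projection.
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                   # screen normal, toward the eye

    va, vb, vc = pa - pe, pb - pe, pc - pe     # eye -> corner vectors
    d = -np.dot(va, vn)                        # eye-to-screen distance

    # Frustum extents on the near plane; asymmetric when the eye is off-center
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix
    return np.array([
        [2 * near / (r - l), 0,                  (r + l) / (r - l),            0],
        [0,                  2 * near / (t - b), (t + b) / (t - b),            0],
        [0,                  0,                  -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0,                  0,                  -1,                           0],
    ])

# Hypothetical 4m x 2m wall panel with the camera about 3m away, off-center
pa = np.array([-2.0, 0.0, 0.0])   # lower-left corner
pb = np.array([ 2.0, 0.0, 0.0])   # lower-right corner
pc = np.array([-2.0, 2.0, 0.0])   # upper-left corner
pe = np.array([ 0.5, 1.2, 3.0])   # tracked camera position
print(off_axis_projection(pa, pb, pc, pe, near=0.1, far=100.0))
```

Because the frustum is recomputed from the camera's actual position every frame, the imagery on the wall shifts the way a real distant background would, rather than sitting flat behind the actors.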

The perspective trick is on display at around the 3:40 mark of the video, where you can clearly see the background shift to match the camera movement. Since the camera is moved by a dolly grip (rather than a computerized motion control rig), it appears that the digital set is driven by a motion tracker mounted on the camera.
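The per-frame loop implied by that setup is simple in outline: sample the tracker, drive the virtual camera from the physical camera's pose, and re-render the wall. Here's an equally rough, purely illustrative sketch; the tracker read is simulated as a slow dolly move and the names are hypothetical, not any real tracking API.

```python
# Minimal sketch of a tracker-driven render loop (illustrative only):
# each frame, poll the camera tracker and rebuild the virtual camera's
# view transform from the physical camera's pose before rendering.
import numpy as np

def read_tracker_pose(t):
    """Stand-in for a real camera tracker: position + yaw of a lateral dolly move."""
    return np.array([0.5 * np.sin(t), 1.2, 3.0]), 0.05 * np.sin(t)

def view_matrix(position, yaw):
    """World-to-camera transform for a camera at `position` rotated by `yaw`."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])          # camera-to-world yaw rotation
    V = np.eye(4)
    V[:3, :3] = R.T                        # inverse rotation
    V[:3, 3] = -R.T @ position             # inverse translation
    return V

for frame in range(5):
    pos, yaw = read_tracker_pose(frame / 24.0)   # one tracker sample per frame
    V = view_matrix(pos, yaw)
    # engine.render(scene, view=V, projection=...)  # real-time render call goes here
    print(f"frame {frame}: camera at {pos.round(3)}")
```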

There are huge advantages to this technique, the team said. Sets can be changed on the fly (within an hour) to better match the director’s and cinematographer’s vision, for one thing. It also makes performing easier, since actors can see their environment instead of having to pretend it’s there, as they must with green screens. Plus, it no doubt saves on post-production costs: according to ILM, the technique was used in fully 50 percent of The Mandalorian’s shots.

Best of all, the technique was seamless and invisible in the final show. ILM has built a whole new platform around it, called StageCraft, that uses Unreal Engine’s real-time interactivity, and it will make the platform available to other filmmakers and showrunners. "We have been able to see through a few technical innovations and a few firsts that I think are going to have a lot of impact on the way that television and movies are made going forward," said Favreau.

Source: ILM (YouTube)

via Engadget http://www.engadget.com

February 21, 2020 at 05:21AM