Beijing is letting its first driverless cars take to the roads
http://ift.tt/2pwMNSS
Tech giant Baidu has been granted the first license for testing autonomous vehicles in China’s capital.
The news: Beijing officials gave Baidu license plates for its self-driving cars today. That’s one way of saying that the company will soon take its autonomous vehicles out for a drive on the city’s roads.
Challenging conditions: With pedestrians, bicycles, scooters, and cars jostling for space, Beijing’s roads will be incredibly complex for robotic vehicles to navigate. That’s especially true compared to the kinds of suburban roads that many American autonomous cars have been tackling.
China vs US: Companies such as Baidu are in a tight race against American counterparts to apply the technology in the real world. For now, the Chinese government has voted for full speed ahead with self-driving cars.
via Technology Review Feed – Tech Review Top Stories http://ift.tt/1XdUwhl
New Zealand just took care of a $3.6 million mouse infestation
http://ift.tt/2pvvO2A
Compared to New Zealand, we are all terrible at getting rid of mice. They just totally eradicated the 200,000 or so rodents scurrying around the subantarctic Antipodes island.
Unlike most of the mouse traps around the world, this was not about helping humans. In fact, humans won’t benefit from this mission at all. There are no people on Antipodes island and there never will be. It’s a remote nature reserve and World Heritage site, because it’s home to an enormous number of plants and animals found nowhere else in the world. Four land birds, 21 uncommon plant species, and a couple dozen insects are unique to the island.
You’ll notice that mice are not listed. Mice are not native to Antipodes island, just as they’re not native to any other part of New Zealand. So when they arrived by ship (alongside whalers and sealers and explorers) they invaded. Mice are excellent invaders. They’re small, they breed fast, they eat a broad diet—they’ll infiltrate anywhere, anytime. In the last few decades, though, New Zealand’s Department of Conservation realized that mice and other mammals were decimating native plants and animals, especially on islands like Antipodes. They were eating everything, including baby chicks of rare birds that hadn’t evolved any protection from rodents. They’d already wiped out two insect taxa, and had forced two bird species from the main island to smaller ones offshore.
They had some experience in this area, because New Zealand is an island nation with no native land mammals whatsoever. Sheep, mice, cats, stoats, hedgehogs, deer—they all arrived with humans in the last few hundred years. That means the Department of Conservation (DOC) has had to figure out effective ways to get rid of mammalian predators before. They had previously eradicated mice from some smaller islands near to the mainland, but the Antipodes effort—dubbed the Million Dollar Mouse project—is one of the largest ever attempts to rid an area of mice. Antipodes island is about 2,100 hectares, which is roughly 1.5 times the size of Los Angeles International Airport (that sounds small, but just go on Google Maps and look how huge it is).
To eradicate mice on the entire island, though, you first have to get a whole team of people there. That means a three-day boat ride from the southern tip of New Zealand, down into the subantarctic zone, bringing all the supplies they’d need with them (you can read about The Chorizo Affair, in which they accidentally got 382 packages of chorizo instead of 35, on their brilliant blog). There’s no harbor, so the 13-person crew had to climb up sea cliffs to get onto the island, and they slept in small huts with no electricity. They stayed for several months in winter 2016, spreading poison bait over the entire island in two separate treatments—144,400 pounds in all.
In between, they got down to some other conservation work. They tagged birds and collected insects. They surveyed animal populations. They ate loads of chorizo, sometimes on pizza.
The whole mission didn’t take long to set in motion, but then they had to return to the mainland and wait. Part of the problem with eradicating mice is that they’re so darned hard to find. You can’t kill them off and then immediately look for evidence that they’re all dead. You’ll probably never find two lone mice on a 2,100-hectare island, and two mice can do a lot of damage once they start getting it on. So you have to leave, wait, and see whether the population rebounds in a couple years. If any mice survived in 2016, there’d be plenty of offspring by 2018.
A team went back to the island at the beginning of the year to check, and spent weeks trying to find any sign at all of the little critters. They checked the chewing papers left behind for evidence of mouse bites. They took two adorable dogs out to sniff for the buggers. And they found nothing. No mice.
This is, to put it bluntly, huge. Yeah, it’s only 1.5 times the size of a large airport. And yes, it’s just one island, and a remote one at that. It’s only a tiny fraction of the world’s fragile ecosystems, and an even tinier percent of the world’s at-risk species, that they’ve saved. And it took $3.6 million to do it.
But the point is: they did it. It can be done. This is, as the BBC wrote, considered “one of the most sophisticated pest eradication projects ever carried out in the world.” It’s basically proof that in at least some cases, we can save ecosystems from invasive species. Other island nations might be able to follow in New Zealand’s footsteps, and though larger land areas might be impossible to entirely rid of pests, the techniques they used here could inform how other countries approach pest management.
The Kiwis are basically the world’s leading experts on invasive species eradication—their methods are the gold standard now. They’ve had more successful projects than almost any other country in the world. And they’re going to need a lot more. The New Zealand government has pledged to rid the whole country of invasive species by 2050, which for a country with no native mammals will be quite a challenge. But this is exactly how you have to do it: one island at a time.
via Popular Science – New Technology, Science News, The Future Now http://ift.tt/2k2uJQn
MIT Developed a Way For Cars To See Through Fog When Human Drivers Can’t
http://ift.tt/2GUlN7u
GIF: MIT (https://www.youtube.com/watch?v=CkR1UowJF0w)
Bad weather can make driving extremely dangerous, no matter who’s behind the wheel. But when it comes to dense fog, which makes it all but impossible for human drivers to see more than a few feet ahead of a vehicle, MIT researchers say they have developed a new imaging system that could eventually let autonomous cars see right through the obstruction.
Most autonomous navigation systems use cameras and sensors that rely on images and video feeds generated by visible light. Humans work the same way, which is why fog and mist have been equally problematic for cars with or without a driver in the front seat.
To solve this problem, MIT researchers Guy Satat, Ramesh Raskar, and Matthew Tancik created a new laser-based imaging system that can accurately calculate the distance to objects, even through a thick fog. The system, which is officially being presented in a paper at the International Conference on Computational Photography in Pittsburgh this May, uses short bursts of laser light that are fired away from a camera, and timed for how long it takes them to bounce back.
When the weather’s nice and the path is clear for the laser lightwaves to travel, this time-of-flight approach is a very accurate way to measure the distance to an object. But fog, which is made up of countless tiny water droplets hanging in the air, scatters the light in all directions. The disrupted laser bursts eventually arrive back at the camera at different times, throwing off the distance calculations that are dependent on accurate timing info.
To solve this problem, the MIT researchers say they have developed a new processing algorithm. They discovered that no matter how thick the fog might be, the arrival times of the scattered laser light always adhered to a very specific statistical distribution. The camera counts the photons returning to its sensor every trillionth of a second; when those counts are graphed, the system can apply mathematical filters that strip out the fog’s predictable pattern, leaving the data spikes that reveal actual objects hidden in the fog.
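To make that concrete, here is a rough sketch in Python of what such a per-pixel calculation could look like. It is illustrative rather than the paper's actual code: the gamma-shaped background model, the estimate_depth function, and the picosecond bin width are assumptions made for the example. Only the final conversion (distance = speed of light × round-trip time / 2) is the standard time-of-flight formula.

```python
import numpy as np
from scipy.stats import gamma

C = 3.0e8          # speed of light in m/s
BIN_WIDTH = 1e-12  # the camera bins photon counts every picosecond (illustrative)

def estimate_depth(photon_counts):
    """Rough, illustrative sketch: recover the distance to an object from one
    pixel's histogram of photon arrival times, contaminated by fog backscatter.

    photon_counts: 1-D array of photons detected in each time bin.
    A gamma-shaped background is assumed here purely for illustration; it is
    fitted to the histogram and subtracted so the spike from light bouncing
    off the hidden object stands out.
    """
    t = (np.arange(len(photon_counts)) + 0.5) * BIN_WIDTH  # bin centers (seconds)

    # Fit the assumed background shape to the raw arrival-time histogram.
    samples = np.repeat(t, photon_counts.astype(int))
    shape, loc, scale = gamma.fit(samples, floc=0)
    background = gamma.pdf(t, shape, loc=loc, scale=scale)
    background *= photon_counts.sum() * BIN_WIDTH  # scale pdf to expected counts

    # Whatever rises above the fitted background is treated as the real signal.
    residual = photon_counts - background
    peak_bin = int(np.argmax(residual))

    # The measured time covers a round trip, so distance = c * t / 2.
    return C * t[peak_bin] / 2.0
```

The key idea is the same as in the article: model the fog's predictable arrival-time pattern, subtract it, and treat whatever spike remains as the real reflection.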
In an MIT laboratory, the imaging system was tested in a small chamber measuring about a meter long, and it was able to clearly see objects 21 centimeters further away than human eyes could discern. When scaled up to real-world dimensions and conditions, where fog never gets as thick as what the researchers had artificially created, the system would be able to see objects far enough ahead for a vehicle to have plenty of time to safely react and avoid them.
Tim Sweeney wants Unreal to power the cross-platform revolution
http://ift.tt/2puLvXG
It’s 2018 and developers are finally taking mobile games seriously — or it’s the other way around, depending on whom you ask.
"I think what we are seeing is now these AAA games from traditional PC and console developers going mobile, and they are among the most popular mobile games that exist," Epic Games co-founder Tim Sweeney says.
Epic CTO Kim Libreri jumps in and adds, "I think it’s almost the other way, I think it’s that mobile developers are taking games more seriously."
Either way, the mobile game market has shifted drastically over the past few years, and today major developers are building massive experiences for tiny screens, often putting fully fledged PC and console titles directly on handheld devices. Think Fortnite, Ark: Survival Evolved, PlayerUnknown’s Battlegrounds and Rocket League. All of these games, and countless others, run on Unreal, Epic’s engine (and Fortnite is Epic’s in-house baby, of course).
Running on Unreal means these games can play across all platforms with relative ease — the same code is powering the PlayStation 4, Xbox One, Nintendo Switch, PC and mobile editions of each game. It’s the same title across all platforms.
That means there’s no reason, say, Xbox One and PlayStation 4 players can’t link up and jump into games together. Well — there’s no technical reason. Sony has long been a holdout in this space, refusing to allow cross-console play. Both Microsoft and Nintendo are open to the idea, while the PC and mobile markets have been primed for years.
"Fortnite runs on six platforms, so there are 36 combinations of platforms and 35 of them can all play together," Sweeney says. "We’re one link away from having it all connected. But we’re talking with everybody and I feel that it’s now becoming inevitable, as these trends of people playing across platforms. Eventually you won’t be able to tell kids in school, ‘Sorry, you can’t play with those particular people who are your friends because they’re on a different platform.’ That’s just not gonna hold water anymore."
It’s not going to make sense from a business perspective, either, Sweeney argues.
"We’re one link away from having it all connected."
"At the core of these businesses is profit and loss calculations but really, what gaming is about ultimately is people," he says. "Can you imagine how dysfunctional Facebook would be if people who were using Facebook on iOS weren’t allowed to communicate with people using Facebook on Android? But that’s the state of console gaming right now."
Epic is supporting the cross-platform trend with Unreal Engine. The latest version will make it easier for developers to bring their console or PC games to mobile devices, using Fortnite as a successful case study. Another improvement heading to Unreal 4.20, which lands for developers this summer, is a new live record and replay feature. This allows players to cinematically view and edit their gameplay after the match is done — not only allowing serious players to study their strategies, but also empowering YouTubers and Twitch streamers to create movie-like highlight reels.
Coming soon
Looking to the future, Epic is working on fine-tuning high-end graphics capabilities and motion-capture animation processes — these are things that major, AAA developers might use. Partnering with Nvidia and Microsoft on new ray-tracing technology, Epic showed off a demo at GDC set in the Star Wars universe, featuring the technique running in real time. The quality was stunning, but this kind of tech isn’t quite ready for everyday consumers.
As Libreri explains it, "It’s running on a quite powerful piece of hardware right now because experimental technology runs on a –"
"It’s one PC with four GPUs," Sweeney chimes in.
"Four GPUs, yeah. Nvidia DGX-1 with four GPUs."
That’s certainly not what most folks have at home, but the tech should catch up to accessible gaming hardware in the near future.
In other news of a visually striking nature, Epic also developed a real-time motion-capture animation system in partnership with 3Lateral. Using the company’s Meta Human Framework volumetric capture, reconstruction and compression technology, Epic was able to digitize a performance by actor Andy Serkis in a shockingly lifelike manner — in real-time and without any manual animation. The technology also allowed Epic to seamlessly transfer Serkis’ performance (a Macbeth monologue) onto the face of a 3D alien creature.
Partnering with 3Lateral, Cubic Motion, Tencent and Vicon, Epic also showed off Siren, a digital character rendered in real-time based on a live actress’ performance.
"[3Lateral] is the company that actually builds the digital faces that then we work out how to make them look realistic in the engine," Liberi says. "What they’re able to do is what they call four-dimensional capturing, which is like a scan but it’s a scan that moves over time. Because of that, they’re able to refine the facial animation systems for the digital human to get all the nuances of every wrinkle, how every piece of flesh moves."
A big chunk of Monty Python’s catalog is coming to Netflix UK in April and the US later in the year. The slate includes the iconic films Monty Python & the Holy Grail and Monty Python’s Life of Brian, along with TV series Monty Python’s Flying Circus and several live specials. They’ll all hit the service at once, so UK viewers can start binging on April 15th (don’t forget a thin mint afterwards). US users will have to wait, and "not all titles will be available at the same time in all territories," Python’s website says.
It’s the first time much of the catalog has come to Netflix, though Monty Python’s The Meaning of Life is missing from the list. Members include Eric Idle, John Cleese, Michael Palin, Terry Jones, Terry Gilliam and Graham Chapman (deceased). A complete list of the available Netflix titles is shown below.
Image gallery captions: everything in the scene is computer generated and running in real time. Note the depth of field, the open elevator doors reflected on Captain Phasma’s chrome armor, and the diffuse light on the Stormtroopers’ shoulders, one of the main intrinsic benefits of raytracing.
SAN FRANCISCO—In the computer graphics community this week, companies from Nvidia to Microsoft have been stressing just how important real-time raytracing will be to making games look more movie-like in the near future. Epic Games used a striking demo at a Game Developers Conference keynote presentation this morning to show just how much better raytracing can make real-time, interactive graphics look with top-of-the-line hardware right now.
The Star Wars “Reflections” demo, made with the cooperation of Nvidia and ILMxLAB, showed two extremely realistic-looking and talkative Stormtroopers clamming up in an elevator when the shiny Captain Phasma pops in. Running on what Epic still refers to as “experimental code” (planned to be introduced to Unreal Engine for production later this year), the raytracing in the demo allows background elements like the guns and the opening elevator doors to reflect accurately off Phasma’s mirror-like armor in real time. These are the kinds of effects that Epic CTO Kim Libreri says they’ve “never been able to do before [with rasterized graphics].”
With raytracing, developers can change the shape of the light in a scene on the fly, in turn changing the character and the diffuse “softness” of the shadows in the scene. The way the raytraced shadows allow for gentle gradations of light as characters and objects block parts of the scene is one of the most apparent improvements over the harsher, “pre-baked” light in rasterized graphics. The raytracing technology also gives the scene a realistic, cinematic depth-of-field effect automatically, with no need for fancy shaders or tricks implemented manually by developers.
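For readers curious what “soft” raytraced shadows mean in practice, here is a minimal, self-contained sketch. This is not Epic's implementation; the soft_shadow_factor function, the occluded callback, and the rectangular light are invented for illustration of the standard trick raytracing enables: fire several shadow rays at random points on an area light and shade by the fraction that get through.

```python
import random

def soft_shadow_factor(point, light_corner, light_u, light_v, occluded, samples=16):
    """Illustrative only: estimate how shadowed a surface point is by firing
    several shadow rays toward random spots on a rectangular area light.

    occluded(origin, target) stands in for the renderer's ray-intersection
    test and returns True if something blocks the segment. light_corner,
    light_u and light_v are 3-D vectors describing the light's rectangle.
    """
    visible = 0
    for _ in range(samples):
        su, sv = random.random(), random.random()  # random spot on the light
        target = [light_corner[i] + su * light_u[i] + sv * light_v[i]
                  for i in range(3)]
        if not occluded(point, target):
            visible += 1
    # 0.0 = fully in shadow, 1.0 = fully lit; the in-between values are the
    # gentle penumbra that a single hard shadow ray can't produce.
    return visible / samples
```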
Getting a “cinematic” 24fps with real-time raytracing still requires some serious hardware: it’s currently running on Nvidia’s ultra-high-end, four-GPU DGX Station, which lists for $60,000. [Update: After publication, Epic reached out to clarify that the demo was running on a DGX Station, and not a DGX-1 as originally stated during the interview.] Even with that, some elements of the scene, like the walls, need to be rasterized rather than made fully reflective, Libreri told Ars. And the Volta technology that is key to powering this kind of performance isn’t even available in consumer-grade GPUs below the $3,000+ Titan V, so don’t expect this kind of scene to run on your home gaming rig in the near term.
But Libreri adds that the tech requirements have been lessened considerably by a de-noising process that dynamically raises or lowers the number of simulated light rays that are necessary for various parts of the scene, based on their importance to the scene in total. Nvidia’s Gameworks de-noising code can get things down to as low as two rays per pixel (from a maximum of six), severely lowering the GPU overhead necessary for doing these complex lighting calculations in real time.
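As a toy illustration of that budgeting idea (the heuristic and the importance score are made up here; only the two-to-six rays-per-pixel range comes from the article), an adaptive sampler might look something like this:

```python
def rays_for_pixel(importance, min_rays=2, max_rays=6):
    """Toy heuristic: spend more rays where the image matters most (say, a
    noisy reflection) and fewer where a denoiser can fill in the gaps.
    importance is assumed to be a 0-to-1 score from some image-space metric.
    """
    importance = max(0.0, min(1.0, importance))
    return round(min_rays + importance * (max_rays - min_rays))
```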
“We’ll be in a hybrid place for a few years where part is rasterized, part is raytraced,” Libreri said. “The film business was based on rasterization for two decades [and] finally started making the move to raytracing a decade ago.”
He added, “It’s just a matter of time” before raytracing is the standard in games, too.
In the meantime, though, we can enjoy this glimpse of just how realistic PC games could look in the future.
Report: Google is buying innovative camera startup Lytro for $40 million
http://ift.tt/2FU2IF1
Image gallery captions: the first-gen Lytro camera, released in 2012; the Lytro Illum, a second-gen light-field camera with a more traditional form factor; the Lytro Immerge 2.0, one of the crazy camera setups that came with the pivot to VR, capturing 3D VR video; and the Lytro Cinema, a super-sized light-field camera for filmmaking. (Images: Lytro)
A report from TechCrunch claims that Google is going to buy the camera company Lytro for “around 40 million dollars.” Lytro is best known for creating an innovative “Light field camera,” but the company has lately pivoted to professional camera technology for filmmaking and capturing VR video.
You might remember the first Lytro camera, which came in a crazy “tube” form factor with a lens at one end and a 1.5-inch touchscreen on the other. The tube was full of lenses and a special “Light Field Sensor” that would capture images as light-field data rather than a grid of pixels. The benefit was that you could just take a picture without worrying about the focus, and you could later selectively focus the image however you wanted. The downside was resolution: you needed a much denser CMOS sensor to end up with a high-megapixel image, and in 2012, when the camera came out, Lytro could only compute all this light-field data down to a 1MP image.
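For the curious, the classic way to refocus a light-field capture after the fact is a "shift-and-add" over the sub-aperture views, sketched below in Python. This is a textbook illustration, not Lytro's proprietary pipeline; the refocus function, the subaperture_images input, and the alpha focus parameter are assumptions made for the example.

```python
import numpy as np

def refocus(subaperture_images, alpha):
    """Illustrative shift-and-add refocusing of a light-field capture.

    subaperture_images: dict mapping (u, v) lens offsets to 2-D grayscale
    arrays, i.e. the same scene seen from slightly different positions
    across the main lens. alpha selects the virtual focal plane: each view
    is shifted in proportion to its offset and the shifted views are
    averaged, so objects at the chosen depth line up sharply while
    everything else blurs.
    """
    stack = []
    for (u, v), img in subaperture_images.items():
        dy, dx = int(round(alpha * v)), int(round(alpha * u))
        stack.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    return np.mean(stack, axis=0)
```

Sliding alpha moves the virtual focal plane, which is the "shoot first, focus later" trick the original camera sold itself on.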
Lytro followed up the first-gen tube camera with the “Lytro Illum” in 2014. This camera used more of a traditional form factor and increased the resolution. Neither of these cameras sold very well, and the company eventually moved away from consumer cameras and started making more professional cameras for VR and cinema.
TechCrunch says Google’s plans for Lytro are “not clear yet.” The report suggests that the purchase would help Google’s VR efforts, pointing to Google’s own work with light-field cameras and a recently released immersive image viewer for Steam VR called “Light Fields.” In addition to having competing light-field camera designs and engineers, Lytro also has 59 patents related to light-field imaging that Google might want. There’s already a strong connection between Lytro and Google: Rick Osterloh, Google’s SVP of hardware, has a seat on Lytro’s board of directors.
The report is a bit cloudy on the exact details of the transaction. “One source described the deal as an ‘asset sale,’ with Lytro going for no more than $40 million,” the report says. “Another source said the price was even lower: $25 million and that it was shopped around—to Facebook, according to one source and possibly to Apple, according to another. A separate person told us that not all employees are coming over with the company’s technology: some have already received severance and parted ways with the company, and others have simply left.”
All of those possibilities sound like a rough outcome for Lytro. In 2017, the company was valued at $360 million, and it has raised more than $200 million in total funding from various investors. Without a hit product or service over its 12-year history, it sounds like the money dried up, and Google is grabbing the company at a bargain.