DJI Turned Its Smartphone Stabilizer Into a Steady Telescoping Selfie Stick

https://gizmodo.com/dji-turned-its-smartphone-stabilizer-into-a-steady-tele-1847635344


Today DJI revealed the fifth iteration of its affordable smartphone stabilizers that help turn the device in your pocket into a surprisingly capable filmmaking tool. Although the DJI OM 5 now includes a telescoping arm that makes capturing group shot selfies and low-angle shots easier, the improvements come with a significant cost to battery life.

Like the DJI OM 4 that made its debut just over a year ago, the new OM 5 features the same easy-to-use magnetic clamp system but with increased adjustability so that larger phones, including those wearing a bulky protective case, can be easily attached and removed from the gimbal. The OM 5 also features a smaller and lighter design, even with the addition of a built-in “extension rod” that can position a smartphone an additional 215 millimeters (about eight and a half inches) farther away from the user. DJI has essentially turned the gimbal into a stabilized selfie stick for capturing larger group shots, but the extra length will also make it easier to capture more creative shots from higher or lower angles.

The extension rod doesn’t automatically extend and retract, so you can’t use it as a short-range camera dolly. You probably wouldn’t want to anyway: while last year’s OM 4 boasted 15 hours of battery life, the new OM 5 tops out at around six hours and 20 minutes. That’s still pretty decent (you’d probably fill up your phone’s storage capturing video before the OM 5 needed a recharge), but it’s a significant drop given the OM 5 doesn’t look much smaller than the OM 4. The internal space needed for that telescoping arm apparently left a lot less room for the battery.

The added reach of the telescoping gimbal promises to make it easier to capture larger selfie shots with a group of people, as well as more creative angles when using the smartphone’s rear cameras.
Image: DJI

Other upgrades on the DJI OM 5 include stronger motors in the gimbal to hold larger and heavier smartphones steady, improved image recognition and tracking for automatically locking onto specific subjects in the frame, and a new ShotGuides feature in the app which will scan a scene, automatically recommend a series of shots to capture, and then automatically edit them together into a video.


The OM 5 is available now on DJI’s website in two color options, Athens Gray and Sunset White, for $139. Optional upgrades include a $39 carrying case (the device does come with a fabric storage pouch), as well as a $79 alternate magnetic smartphone clamp that adds battery-powered LED lights to improve low-light selfie shots.

via Gizmodo https://gizmodo.com

September 8, 2021 at 10:04AM

A Single Laser Fired Through a Keyhole Can Expose Everything Inside a Room

https://gizmodo.com/a-single-laser-fired-through-a-keyhole-can-expose-every-1847638281


Being able to see inside a closed room was a skill once reserved for superheroes. But researchers at the Stanford Computational Imaging Lab have expanded on a technique called non-line-of-sight imaging so that just a single point of laser light entering a room can be used to see what physical objects might be inside.

Non-line-of-sight (NLOS, for short) imaging is by no means a new idea. It’s a clever technique that’s been refined in research labs over the years to create cameras that can, remarkably, see around corners and generate images of objects that otherwise aren’t in the camera’s field of view, or are blocked by a series of obstacles. Previously, the technique has leveraged flat surfaces like floors or walls that are in the line of sight of both the camera and the obstructed object. A series of light pulses originating from the camera, usually from lasers, bounce off these surfaces and then off the hidden object before eventually making their way back to the camera’s sensors. Algorithms then use the information about how long it took these reflections to return to generate an image of what the camera can’t see. The results aren’t high resolution, but they’re usually detailed enough to easily determine what the object in question is.
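The core of that time-of-flight bookkeeping is simple geometry. As a toy sketch (not the Stanford team’s actual reconstruction algorithm, which solves for a full image), here is the arithmetic for a single photon path, assuming we already know how far the camera sits from the relay wall:

```python
# Toy illustration of the time-of-flight math underlying NLOS imaging.
# A pulse travels camera -> relay wall -> hidden object -> relay wall -> camera;
# knowing the camera-to-wall leg lets us recover the wall-to-object distance.
C = 299_792_458.0  # speed of light, m/s

def hidden_object_distance(round_trip_s: float, camera_to_wall_m: float) -> float:
    """Distance from the relay wall to the hidden object, in meters."""
    total_path = round_trip_s * C                    # full photon path length
    # Subtract the two camera-to-wall legs, then halve the remaining
    # wall -> object -> wall segment.
    return (total_path - 2 * camera_to_wall_m) / 2

# Example: a 20-nanosecond round trip with the camera 2 m from the wall
# puts the hidden object roughly 1 m behind the relay wall.
d = hidden_object_distance(20e-9, 2.0)
```

Real NLOS systems repeat this measurement across millions of photon arrivals and many laser positions, then invert the whole set of path lengths into an image.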

It’s an incredibly clever technique, and one day it could be a very useful technology for devices like autonomous cars, which could spot hazards hidden around corners long before they’re visible to passengers in a vehicle, improving safety and obstacle avoidance. But the current NLOS techniques have a big limitation: They’re dependent on a large reflective surface where light reflections coming off a hidden object can be measured. Trying to image what’s inside a closed room from the outside is all but impossible—or at least it was until now.

The keyhole imaging technique, developed by researchers at Stanford University’s Computational Imaging Lab, is so named because all that’s needed to see what’s inside a closed room is a tiny hole (such as a keyhole or a peephole) large enough to shine a laser beam through, creating a single dot of light on a wall inside. As with previous experiments, the laser light bounces off a wall, an object in the room, and then off the wall again, with countless photons eventually being reflected back through the hole and to the camera, which uses a single-photon avalanche photodetector to measure the timing of their return.

When an object hidden in the room is static, the new keyhole imaging technique simply can’t calculate what it’s seeing. But the researchers have found that a moving object, paired with pulses of light from a laser, generates enough usable data over a long exposure for an algorithm to create an image of what it’s seeing. The quality of the results is even worse than with previous NLOS techniques, but it still provides enough detail to make an educated guess on the size and shape of the hidden object. A wooden mannequin ends up looking like a ghostly angel, but when paired with a properly trained image recognition AI, determining that a human (or human-shaped object) was in the room seems very feasible.


The research could one day provide a way for police or the military to assess the risks of entering a room before actually breaking down the door and storming their way inside, using nothing but a small crack in the wall or a gap around a window or doorway. It could also give autonomous navigation systems a new way to spot hidden hazards long before they become a threat, in situations where previous NLOS techniques weren’t practical given the environment.

via Gizmodo https://gizmodo.com

September 8, 2021 at 04:15PM

This Backpack Turns Into an Airbag for Your Head

https://gizmodo.com/this-backpack-turns-into-an-airbag-for-your-head-1847643676


Over the years we’ve seen countless contraptions designed to help protect cyclists in the event of a crash, including a neck collar that inflates around the head, and pants that blow up like a balloon. EVOC Sports’ new Commute Air Pro 18 appears to be a simpler idea, hiding an emergency airbag inside a motion-sensing cycling backpack.

The pack isn’t a replacement for a helmet—you’ll still want to use basic common sense and do everything you can to protect your head when you’re out riding and sharing the road with cars and trucks. But it promises to provide additional protection for other body parts that are easily injured when a cyclist takes a tumble, including the shoulders, neck, chest, and collarbone.

The Commute Air Pro 18 looks and feels just like a standard backpack when worn, but when the magnetic buckle on the chest strap is closed, motion sensors are activated that monitor the position and movement of the pack around 1,100 times every second. If sudden extreme motions are detected, indicating a crash or collision has occurred that was severe enough for the rider to fall, an 18-liter airbag deploys from the top of the pack, inflating in less than 0.2 seconds and wrapping around their upper torso. EVOC Sports claims the airbag “reduces the impact forces and braking acceleration (HIC: head injury criterion) on the cyclist by up to 80%.” After the airbag has been deployed, and assuming it didn’t suffer any damage, it can be repacked and its igniter cartridge can be replaced, so thankfully the backpack and the safety system are reusable.
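EVOC hasn’t published its detection algorithm, but the basic idea of sampling motion sensors at high frequency and firing only on sustained extremes can be sketched as follows. Every number here (the g-force threshold, the window length) is invented for illustration:

```python
# Illustrative sketch of threshold-based crash detection, loosely modeled on
# a pack that samples its motion sensors ~1,100 times per second. The
# threshold and window values below are hypothetical, not EVOC's.
SAMPLE_RATE_HZ = 1100
CRASH_THRESHOLD_G = 8.0   # hypothetical acceleration magnitude signaling a fall
WINDOW_SAMPLES = 22       # ~20 ms of consecutive samples above threshold

def should_deploy(accel_g: list) -> bool:
    """Return True only if acceleration stays above threshold for a full window,
    so a brief jolt (a pothole, a dropped pack) doesn't fire the airbag."""
    run = 0
    for g in accel_g:
        run = run + 1 if g > CRASH_THRESHOLD_G else 0
        if run >= WINDOW_SAMPLES:
            return True
    return False

pothole = [9.0] * 10 + [1.0] * 50   # short spike: no deployment
crash = [1.0] * 5 + [9.5] * 30      # sustained spike: deploy
```

The requirement for a sustained window rather than a single spike is what separates a real fall from ordinary road vibration.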

The Commute Air Pro 18 is a functional backpack, too, with the compressed airbag taking up very little space inside. There’s room for a laptop, accessible from the side of the pack, plus various interior pockets for keeping other electronics and knick-knacks organized, including a smartphone and glasses. An adjustable hip belt keeps the backpack from sliding around while you ride and takes some of its weight off the wearer’s shoulders.


When available in the spring of next year, however, the Commute Air Pro 18 won’t come cheap. It will be priced at €900, which is a little over $1,000. That’s more than the bikes many people actually commute to work on, but you probably won’t bemoan the price the day it saves you from serious injuries in a crash.

via Gizmodo https://gizmodo.com

September 9, 2021 at 12:27PM

Google’s Plan to Use 120% Less Water Doesn’t Quite Add Up

https://gizmodo.com/google-s-plan-to-use-120-less-water-doesn-t-quite-add-1847647300


Google pledges to use millions of gallons of wastewater to cool its data centers rather than drinking water.
Image: Google

Just as oil companies laud their plans to address climate change, tech companies also tend to sound off about their respective efforts to be more environmentally sound. Sometimes they don’t wholly add up either.

Google is the latest tech giant to announce its commitment to the Earth, this time through water conservation. The company says it plans to replenish 120% of the water it consumes by 2030. Microsoft and Apple have made similar pledges.

Google explained in an official blog post that it has been working for years on a plan to cut down on water, which it uses primarily to cool its data centers. It started with its servers in Douglas County, Georgia, where the company previously came under scrutiny for using regular drinking water. (It now uses reclaimed wastewater.) And in the San Francisco Bay Area, part of a state enveloped in the West’s unprecedented megadrought and where farmers have been facing water restrictions, Google is working with ecologists and architects to help “improve the resiliency” of the water landscape surrounding other server farms. In the post, here’s what the company had to say about its efforts:

“Our water stewardship journey will involve continuously enhancing our water use and consumption. At our data centers, we’ll identify opportunities to use freshwater alternatives where possible — whether that’s seawater or reclaimed wastewater. When it comes to our office campuses, we’re looking to use more on-site water sources — such as collected stormwater and treated wastewater — to meet our non-potable water needs like landscape irrigation, cooling and toilet flushing.”

Water cooling is an efficient way to keep a mass of data servers at an operational temperature. You can imagine how many gallons of water Google uses to power up every Gmail, YouTube, Google Maps, and Search query. (If you can’t imagine, the answer is billions.) These are, of course, part of the same suite of products you’d use to help yourself navigate a water-based climate emergency. Just the other day, I was looking up how many gallons of water my family and I would have to ration if we were to face mandatory water restrictions, something afflicting some communities and farmers that rely on Lake Mead. (“Water restrictions” has become an increasingly popular search in the U.S. this year, so it appears I’m not alone.)

Using wastewater for the “evaporative cooling” effect that Google employs to cool the air around its servers is more sustainable than using potable water. So, too, are Google’s efforts to use seawater and other forms not fit for human consumption. But the promise to replenish 120% of the water seems dubious, considering there is no specific plan to use less water. For example, it takes about 1.25 million gallons of water a day to cool the data centers located in Mesa, Arizona, and as internet access spreads, so too will the need for more water-hungry data centers.


That is more gallons of water than some town reservoirs have in their emergency supply. And in parts of the country where water is scarce, it’s a big tradeoff to give it to a data center when there’s a struggling community sharing the supply nearby. Google has said it will replenish water in the “regions” it operates, but what counts as replenishment and how big a region is are key factors in determining what communities and what ecosystems will end up benefitting.

While Google’s environmental efforts are certainly a step in the right direction, it’s hard to see them as a long-term solution. After all, most of Google’s business operates from the cloud, facilitated by servers churning away on an increasingly water-stressed planet. In some ways, Google’s pledge is no different from the oil industry buying carbon offsets to diminish its environmental damage. On a planet where 1.8 billion people live in countries facing extreme water stress, the risk of disaster increases.

Google has seen pushback from communities about its data centers’ water use, though it’s still managed to operate and even expand in those places. TBD on if the 120% pledge will move the needle. A corporate pledge, though, can only go so far. To ultimately solve our water scarcity problem, we need policies that conserve water in the first place.

via Gizmodo https://gizmodo.com

September 9, 2021 at 05:57PM

Tesla Patented Laser Wipers to Blast Bird Poop Off of a Vehicle’s Windshield

https://gizmodo.com/tesla-patented-laser-wipers-to-blast-bird-poop-off-of-a-1847650967


You’d assume it would be all hands on deck at Tesla with researchers, engineers, and software developers all working together to rectify the car maker’s problematic Autopilot feature, but instead, the company is devoting resources to other futuristic features: like laser windshield wipers.

We’ll be the first to admit that the current method of cleaning a vehicle’s windshield while driving is far from perfect. Assuming the wiper blades themselves aren’t covered in grime and you didn’t forget to refill the wiper fluid, they tend to smear dirt and other debris across the windshield instead of effectively wiping it away. Windshield wipers might do a passable job at preventing raindrops from occluding a driver’s view of the road ahead, but they’re a technology that’s definitely due for an upgrade, and if Tesla has anything to say about it, one day they might be replaced by lasers.

As vehicles become smarter and more reliant on technologies like cameras, sensors, and even energy-generating photovoltaic panels whose performance is hindered by a build-up of dirt, there’s even more demand for a way to keep various surfaces of a vehicle clean that doesn’t involve a daily trip through a car wash.

Originally filed with the United States Patent and Trademark Office on May 10, 2019, but finally approved earlier this week, Tesla’s patent for a “Pulsed Laser Cleaning of Debris Accumulated on Glass Articles in Vehicles and Photovoltaic Assemblies” sounds as foreboding as it does innovative. The patent describes a system where “debris detection circuitry” can determine if dirt has built up on glass surfaces that need to be kept as clean as possible for optimal functionality, like cameras monitoring the road, or even the windshield that a driver is looking through. Based on how much dirt is detected and where it’s located, the laser’s intensity and focus are calibrated to direct the beam specifically at the problem area (imagine a patch of bird poop on your windshield). Pulsed quickly, the beam has enough intensity to burn the dirt away without actually penetrating the glass surface and causing harm to occupants inside, or to sensitive electronics.


On one hand, the approach eliminates complicated mechanical machinery that’s prone to failure as well as an additional electric motor putting a heavy strain on an EV’s battery. It also removes the wiper blades moving back and forth across the windshield which can often be a distraction for a driver. On the other hand, a laser powerful enough to burn dirt off a window is definitely something to be concerned about. You’re not even supposed to look directly at the low-power laser diodes used in laser pointers, but the beam’s intensity would have to be increased dramatically for this application.

The patent does describe an approach where the laser’s beam is quickly pulsed so that it “limits penetration of the laser beam to a depth that is less than a thickness of the glass article.” That might provide some peace of mind for the vehicle’s occupants, but what about others around the vehicle, including other drivers on the road, who might be exposed to unwanted reflections of the beam? In an ideal world that would never happen, but in an ideal world self-driven Teslas also wouldn’t crash into police cruisers. The self-cleaning laser system could be limited to use only when the vehicle is safely stopped, or even inside a garage, but then it would be useless for the random times that something splatters across your windshield while you’re on the road. As futuristic as the idea sounds, we’re quite happy to stick with our old-school solution for the time being.

via Gizmodo https://gizmodo.com

September 10, 2021 at 09:45AM

Facebook’s first smart glasses are the Ray-Ban Stories

https://www.engadget.com/facebook-ray-ban-stories-smart-glasses-ar-160006477.html?src=rss

Facebook’s first foray into the world of smart glasses is here. Confusingly dubbed Ray-Ban Stories, they start at $299 and bring together much of the technology we’ve already seen in smart eyewear. They’ll let you take first-person photos and videos on the go, like Snap’s Spectacles. And, similar to Bose and Amazon’s speaker-equipped glasses, you’ll be able to listen to media, as well as take calls.

But what’s most impressive is that Facebook and Ray-Ban owner Luxottica have crammed the hardware components into frames that look and feel almost exactly like a pair of typical designer glasses. The only difference is the pair of cameras mounted at the corners.

[Be sure to check out our deeper hands-on with the Ray-Ban Stories!] 

The Ray-Ban Stories in the iconic Wayfarer style — those chunky ’50s-era frames that still look fashionable today — weigh just five grams more than the standard version. And that’s including its dual 5-megapixel cameras, Snapdragon processor, touchpad, speakers, three-microphone array and other hardware. I’ll be honest, I was a bit shocked when I learned how much they weighed. We’re used to smart glasses being thick and heavy, even when they’re coming from major brands like Bose. The Ray-Ban Stories look, well, normal.

I suppose that shouldn’t be too surprising, though, as both Facebook and Ray-Ban ultimately want to normalize smart frames to the point where they’re as common as wireless earbuds. That also helps the companies avoid the mistake Google made with Glass: Those things looked so alien and Borg-like that they were almost instantly reviled. 

Image: Ray-Ban and Facebook

Privacy remains a concern with all smart glasses, though. The Ray-Ban Stories have a bright LED that lights up when you’re taking photos and video, but I could see many people taking issue with the subtle camera placement. We’re all used to people capturing everything with their smartphones these days, but doing so still requires more effort than tapping your glasses or issuing a voice command to an all-seeing social network. 

If Facebook can successfully deliver the first smart glasses that don’t make the wearer feel like a joke, and which the general public doesn’t want to throw in a fire, it could gain a serious foothold in the augmented reality market. And, well, we know how much Mark Zuckerberg wants to transform it into a "metaverse company." 

In addition to the Wayfarer style, Ray-Ban Stories will be available in the brand’s Round and Meteor frames, five different colors, and your typical array of lenses: Clear, sun, prescription, transition and polarized. I’m surprised Ray-Ban isn’t offering polarized sunglass lenses by default, though, since they can reduce glare far better than lenses that are just tinted dark. As for battery life, Facebook claims the Ray-Ban Stories will last for around a day of use (around three hours of audio streaming), while the bundled charging case adds another three days of use.

As ambitious as they may seem, Ray-Ban Stories are also yet another example of how Facebook seemingly can’t help but imitate Snapchat, which has been dabbling in smart glasses since 2016. Even their name hearkens back to the social story format that Snap kicked off and was later copied by Facebook, Instagram and pretty much every other social media outfit. But at this point, I don’t think Facebook cares if everyone calls them copycats if it ultimately leads to more engagement.

Image: Devindra Hardawar/Engadget (sample shots)

After testing out the Ray-Ban Stories for a few days, I found them far more compelling than any other smart glasses available today. They don’t look as goofy as the Snap Spectacles, and they’re far more comfortable to wear than Bose and Amazon’s Frames. I could only use the Stories in limited situations though, since I need prescription lenses to actually see well.

Still, I was surprised by how smooth video footage looked; it reminded me of YouTube professionals like J. Kenji Lopez-Alt who use head-mounted GoPros. It was also nice to have both hands free to capture fleeting moments of play with my daughter. I was less impressed with the Stories’ photo quality, but I suppose it could be useful if you wanted to take a pic without pulling out your phone. You can import your photos and videos from the smart glasses into Facebook View, a new app that lets you quickly edit your media and share it to practically every social media site (yes, even Snapchat!).


While I didn’t expect much when it comes to audio playback, the Stories surprised me with sound that was good enough for listening to light tunes or podcasts. I could see them being particularly useful while jogging or biking outdoors, where you want to maintain situational awareness. During the day, I’m never too far from my wireless earbuds, but being able to get a bit of audio from my glasses in a pinch could be genuinely useful.

To control the Ray-Ban Stories, you can either invoke the Facebook assistant by saying "Hey Facebook" or by tapping the button on the right arm, or swiping on the side touchpad. Personally, I never want to be caught in public talking to Facebook, so I mostly relied on touch controls. But the voice controls worked just fine during the few occasions when nobody could hear my shame.


While they’re not exactly perfect, the Ray-Ban Stories are the first smart glasses I’d recommend to someone looking for a pair. But the Facebook of it all is still concerning. While the company says the glasses will only collect basic information to be functional — things like the battery level, your Facebook login and Wi-Fi details — who knows how that’ll change as its future smart glasses become more fully featured. Perhaps that’s why there’s no Facebook branding on the Ray-Ban Stories case and frames: It’s probably better if people forget these are also Facebook-powered products.

You’ll be able to buy Ray-Ban Stories today in 20 different styles in the US, Australia, Canada, Ireland, Italy and the UK. Even though the Ray-Ban Stories may seem to have limited availability right now, Facebook and Luxottica have a multi-year partnership that will result in even more products. It’s likely that true AR glasses, which can display information on your lenses, aren’t far off. And you can be sure of that, since Snapchat has already shown off its own AR Spectacles.

via Engadget http://www.engadget.com

September 9, 2021 at 11:09AM