Volcon Grunt is an electric motorcycle you can ride underwater

https://www.autoblog.com/2020/10/24/volcon-grunt-electric-motorcycle-fat-tires/


The number of companies preparing to fight for a slice of the burgeoning electric off-roader segment grows on a regular basis, but Texas-based Volcon thinks it has what it takes to stand out. It unveiled a battery-powered motorcycle named Grunt with about 100 miles of range and the ability to operate underwater.

Although it wasn’t designed as a submarine, the Grunt is powered by a drivetrain that’s IP67-rated, which means it can spend up to 30 minutes under roughly 40 inches of water without sustaining damage. It’s not going to help divers explore wrecked ships on the ocean floor, but it can theoretically ford rivers with ease. Once it’s out of the water, its fat tires and 12 inches of ground clearance allow it to tackle a wide variety of terrain.
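The IP67 figure follows the two-digit scheme defined by IEC 60529: the first digit rates protection against solids, the second against liquids. A small illustrative decoder (the category labels are paraphrased from the standard, not Volcon's wording):

```python
# Illustrative decoder for IP (Ingress Protection) ratings per IEC 60529.
# The first digit covers solids, the second liquids; IP67 means dust-tight
# plus temporary immersion (up to roughly 1 m for 30 minutes), which is
# where the "roughly 40 inches of water" figure above comes from.

SOLIDS = {
    5: "dust-protected",
    6: "dust-tight",
}
LIQUIDS = {
    5: "water jets",
    6: "powerful water jets",
    7: "temporary immersion, up to 1 m for 30 minutes",
    8: "continuous immersion beyond 1 m",
}

def decode_ip(rating: str):
    """Split an 'IPxy' rating into its solid- and liquid-ingress meanings."""
    solids, liquids = int(rating[2]), int(rating[3])
    return SOLIDS.get(solids, "unlisted"), LIQUIDS.get(liquids, "unlisted")

print(decode_ip("IP67"))
# → ('dust-tight', 'temporary immersion, up to 1 m for 30 minutes')
```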

Volcon quotes a six-second sprint from zero to 60 mph, which is also the Grunt’s top speed, thanks in part to an electric motor that develops 50 horsepower and 75 pound-feet of torque. All we know about the battery pack is that it holds enough electricity for up to 100 miles of riding, and it can be swapped in a matter of minutes with simple tools. Recharging it takes two hours when the Grunt is plugged into a regular household outlet.

Product development is currently underway; the motorcycle shown in the gallery above is a pre-production prototype. Deliveries are scheduled to begin in spring 2021, and pricing starts at $5,995. Volcon is based in Round Rock, Texas, as of October 2020, but it’s actively looking for a permanent home on the outskirts of Austin. It plans to manufacture motorcycles in the new facility, and it will later expand its range with a pair of side-by-sides named Stag and Beast. The former seats two, while the latter comes with four seats.


via Autoblog https://ift.tt/1afPJWx

October 24, 2020 at 10:20AM

Descript lets you edit videos by tweaking text scripts

https://www.engadget.com/descript-video-editing-audio-screen-recordings-181143665.html

Video editing is often a time-consuming process, but Descript is trying to take the sting out of it a bit with its latest suite of tools. Descript Video transcribes your footage and turns it into a text document. Changes that you make there are reflected in your video edit.

Cutting a flubbed line is as simple as deleting the transcribed text. You can even dub over any misspeaks by changing the words in the text editor — Descript’s AI-based tech can add audio in your own voice. You’ll be able to add titles, transitions, image and video overlays, keyframe animations and more using a multitrack video editor. There’s also the option to export projects to other editing suites.
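The core idea of text-based editing can be sketched in a few lines. This is a hypothetical illustration, not Descript's actual implementation: if each transcript word carries its start and end timestamps, deleting words yields the list of video segments to keep.

```python
# Hypothetical sketch of text-driven video editing: each transcript word
# carries timestamps, so deleting words in the text yields the list of
# (start, end) video segments that survive the edit.

def keep_segments(words, deleted_indices):
    """Return (start, end) times of the segments that survive the edit."""
    segments = []
    for i, (text, start, end) in enumerate(words):
        if i in deleted_indices:
            continue
        # Merge with the previous segment when the audio is contiguous.
        if segments and abs(segments[-1][1] - start) < 1e-9:
            segments[-1] = (segments[-1][0], end)
        else:
            segments.append((start, end))
    return segments

transcript = [("Hello", 0.0, 0.4), ("um", 0.4, 0.7), ("world", 0.7, 1.1)]
print(keep_segments(transcript, {1}))  # the flubbed "um" is cut
# → [(0.0, 0.4), (0.7, 1.1)]
```

A real editor would then render only those segments, crossfading audio at the cut points; Descript's overdub feature goes further by synthesizing replacement audio.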

You can also use Descript to capture screen recordings. You can record video of your screen and webcam feed, and edit them in the same way. Descript’s screen recordings are free and you’ll have limited access to the editing tools without a paid plan, which starts at $12/month.

Descript, whose CEO is Groupon co-founder Andrew Mason, has been best known until now for trying to make audio editing as easy as working in a collaborative Google Doc. Bringing that same level of simplicity to video editing is no mean feat, particularly since you can work on a project with several other people at the same time by sharing links with them.

Video editing has long been on Descript’s road map, according to Mason. "Today’s creators have a healthy disregard for the traditional boundaries of content mediums — YouTubers make podcasts, podcasters are distributing on YouTube — and they need a tool that’s equally capable in audio and video," he wrote. "Breakthroughs in AI enable the creation of this new class of narrative media creation tools — one that doesn’t force a trade-off between power and ease of use. We’ve been building this foundation for over five years, and it’s finally here."

via Engadget http://www.engadget.com

October 22, 2020 at 01:24PM

Photoshop’s Neural Filters can alter people’s expressions in convincing—and nightmarish—ways

https://www.popsci.com/story/technology/photoshop-neural-filters-face/

The Happiness filter quickly becomes the nightmare filter when you max it out. (Stan Horaczek)

Artificial intelligence sometimes comes in very handy when using Photoshop. For years now, users have been able to employ the software’s Content-Aware tech to quickly and easily remove objects from images and expand photos beyond their natural borders.

The latest AI innovation to make its way into the venerable image editing software, however, pushes the automatic adjustments to an entirely new level. The Neural Filters work on Adobe’s Sensei platform, which encompasses its machine learning technology. They let you automatically tweak pictures of a human face in some impressive—and sometimes borderline disturbing—ways.

This is the original source image used for the filters. (Stan Horaczek)

I chose the above picture of my own face to try out some of the filters because subjecting other people to AI-powered weirdness felt rude. For a while now, Photoshop has been able to recognize individual parts of a person’s face in a photo and make independent adjustments to specific features via the Liquify tool. If you wanted to give someone the appearance of a small smile, you could already do that with a simple slider. You could open or close their eyes a bit, too.

The new filters, however, take a much more holistic approach to the edits. To try them yourself, make sure you’re running the latest Photoshop 2021 update.

Under the Filter menu, select “Neural Filters” to open a new dialog box with two options visible: Skin Smoothing and Style Transfer. These are the basic AI-powered filters that do pretty much what you’d expect after reading their names.

Overly smoothed skin has become a trademark of selfies on the internet. (Stan Horaczek)

The Skin Smoothing effect looks like something you’d find in Snapchat or another automatic “portrait” retouching app meant to hide skin imperfections. Cranked to its maximum level, it makes you look totally ridiculous—like an infant standing in front of a floodlight. In the middle of the range, however, you get a more familiar smooth glow. It doesn’t look natural and it nuked almost all of my freckles, but it doesn’t look completely bananas unless you crank it.

Style Transfer works like the Prisma app that was popular a few years ago, only it can take the style of a piece of art and vaguely translate it to your picture.

Click the test tube icon under the Neural Filters tab, however, and that’s where things get weird.

This is the Anger effect. (Stan Horaczek)

The Smart Portrait features let you make sweeping adjustments to a human face as a whole. The top slider is called “Happiness,” and pushing it to its maximum produced this horrible result that will almost certainly haunt you well after you’ve closed this article. It added teeth! Why?!

Move the Happiness slider all the way to -50, however, and you get something more believable. If you weren’t familiar with the original, you might not notice it had been edited at all.

The Surprise and Anger sliders look similarly ridiculous when slammed to their maximum values, but there are some believable looks in between. Because Adobe is processing the image in the cloud, however, you don’t get a live look at what you’re doing when you make an adjustment. You move the slider a little, then wait for the spinning wheel to disappear, and then you get a preview of your image. You can adjust from there.

The aging filter thought my hat was my hair. Still, it’s not bad. (Stan Horaczek)

This process offloads an absurd amount of work from your machine to the cloud, but it still managed to spin up the fans in my powerful MacBook Pro.

Beyond the emotion filters, you can also tweak Facial Age, Gaze, and even head direction if you have the right photo. My beard really seems to throw off the tech whenever it tries to turn me in some direction.

The Light Direction filter is impressive. I took this photo as part of a light test for professional headshots, so it’s using three fancy studio lights placed in specific spots to get the desired effect. Moving the slider does a surprisingly convincing job of making it look like the main light source is coming from an entirely different place. Again, my beard screwed it up, but this is the kind of subtle effect that’s more convincing than aging my own face by 50 years. It’s also similar to what Apple and Google have been doing with their AI-driven lighting tweaks you can find in their photo editing and camera apps.

It still messed up on the beard, but the lighting looks like it’s coming from a different direction. (Stan Horaczek)

Some critics have already called this tech problematic because it enables people to edit and manipulate images in a convincing way. It is a big deal that this now lives inside the world’s most popular photo editing software, but these effects were already readily available with a few quick searches.

Right now, the results are fun, if not entirely refined. But Adobe’s tech will get better as it learns what works and what doesn’t from a growing sample size. For now, it still thinks my hat is a fancy hairdo and that it will age right along with me.

via Popular Science – New Technology, Science News, The Future Now https://www.popsci.com

October 22, 2020 at 07:05AM

Amazon Isn’t Even Hiding Its Intentions Anymore

https://gizmodo.com/amazon-isnt-even-hiding-its-intentions-anymore-1845442072


Photo: Ina Fassbender (Getty Images)

After spending years promising Congress that the data it collected from third-party sellers wasn’t used to beef up its private-label products, today Amazon decided to roll out a product meant to do exactly that. The Amazon Shopper Panel, as it’s called, promises to pay Amazon customers who offer intel to the ecommerce giant about where they shop when they’re not shopping on Amazon dot com.

Here’s how the Shopper Panel works: After getting an IRL or e-receipt from any business that isn’t owned by Amazon (so Whole Foods or Amazon 4-star locations are not eligible), panelists can either submit a picture of that receipt through the app or, in the case of digital copies, forward the emailed details to a panel-specific email address. According to the Panel website, folks who upload “at least” 10 receipts per month can take $10 in Amazon credit or have $10 donated to their charity of choice. Along with that baseline payout, the app will also dole out additional earnings to panelists who answer the occasional survey about certain brands or products within the app.

Not every receipt counts toward this program. Per Amazon, receipts from grocery stores, drug stores, restaurants, and movie theaters—along with just about any other “retailer” or “entertainment outlet”—are fair game. Receipts for casino spending, gun purchases, transit fares, tuition, or apartment rent aren’t.
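Taken together, the two rules above boil down to a simple filter-and-threshold check. A minimal sketch, with the caveat that the category names here are illustrative placeholders, not Amazon's actual taxonomy:

```python
# Hypothetical sketch of the Shopper Panel reward rule as described:
# receipts from excluded categories don't count, and "at least" 10
# eligible receipts in a month earns the $10 base payout (taken as
# Amazon credit or as a charity donation). Category names are
# illustrative, not Amazon's actual taxonomy.

EXCLUDED = {"casino", "gun store", "transit", "tuition", "rent"}

def monthly_reward(receipt_categories):
    """Return the base payout in dollars for a month of uploads."""
    eligible = [c for c in receipt_categories if c not in EXCLUDED]
    return 10 if len(eligible) >= 10 else 0

month = ["grocery"] * 8 + ["restaurant", "movie theater", "casino"]
print(monthly_reward(month))  # 10 eligible receipts → $10
```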

While the program is invite-only for now, any curious Amazon customer based in the U.S. can download the Panel app from the iOS App Store or the Google Play Store if they want to put their name on the waitlist.

Panels like these have literally been around since before Jeff Bezos was born, and there are dozens of apps that offer similar payouts to anyone willing to lend their opinion on everything from sugary cereals to certain sports cars. But Amazon’s own app is a bit…more unsettling, especially for folks who have followed the company’s business model.


Screenshot: Shoshana Wodinsky (Gizmodo)

Under the Amazon Panel site’s “Privacy” tab, the company notes that any receipts that you share will go toward “[helping] brands offer better products and [making] ads more relevant on Amazon.” The company also notes any data gleaned from these receipts or surveys might also be used to “improve the product selection on Amazon.com and affiliate stores such as Whole Foods Market,” and to “improve the content offered through Amazon services such as Prime Video.”

That’s why this rollout is a particularly gutsy move for Amazon to make right now. Recent months have seen the company come under an increasing barrage of regulatory fire from authorities both in the U.S. and in Europe over a scandal that largely revolved around tracking consumers’ purchase data—not unlike the data pulled from the average receipt—on its platform. This past spring, an investigation from the Wall Street Journal revealed that Amazon had spent years surveilling the sales racked up by the platform’s third-party sellers specifically to create its own competing products under the Amazon private label. This story came out barely a year after Amazon’s associate general counsel, Nate Sutton, told Congress that the company didn’t use “individual seller data” to do just that.

The Panel Program is a slightly less slimy take on the same idea. Because Amazon doesn’t have access to the same purchase-by-purchase data from the myriad brick-and-mortar “sellers” that Amazon competes with offline, paying people for their receipts from these sellers ensures Amazon will have a steady stream of data from its IRL competitors. Whenever we feel comfortable walking back into movie theaters, our AMC receipts can be used to fuel new exclusives on Amazon Prime. Our receipts from our occasional grocery runs can be used to tell Whole Foods which products it should ramp up or abandon.

In a message to advertisers about the official rollout of the Panel Program, the company noted that “customers routinely use Amazon to discover and learn about products before purchasing them elsewhere. In fact, Amazon only represents 4% of U.S. retail sales,” which is technically true—and a point that Bezos has mentioned in the past when grilled by Congress on antitrust issues.

But it’s a point that ignores how more and more of our purchases are happening online overall, and how more and more of those purchases are happening through Amazon. One 2020 report on the major e-commerce companies found that close to 39% of all online sales happened under Amazon’s watch. Its biggest competitor was Walmart, which controlled a little more than 5%. And now that Amazon is dipping into in-person purchase data, its competitors’ share may dwindle even further.

via Gizmodo https://gizmodo.com

October 21, 2020 at 07:18PM

HOW TO: Fold the ‘Tube’ Paper Airplane [Video]

https://www.geeksaresexy.net/2020/10/22/how-to-fold-the-tube-paper-airplane-video/

John Collins, also known as ‘The Paper Airplane Guy,’ teaches us how to fold and fly our very own “Tube” paper airplane. The Tube doesn’t look like much, but it can certainly fly! You throw the Tube very much like you throw a football, letting the ring spin off of your fingertips. That rotation helps create lift, and lets the airplane do its thing.

[Wired]


via [Geeks Are Sexy] Technology News https://ift.tt/23BIq6h

October 22, 2020 at 11:00AM

Adobe tries using AI to fix blurry video footage

https://www.engadget.com/adobe-sharp-shot-213047501.html

Between things like camera shake and poor lighting, the videos you take with your phone, DSLR or mirrorless camera can end up blurry for any number of reasons. Worse yet, it’s difficult to sharpen a photo or video after the fact, and more often than not, the results don’t look great. However, with the help of AI, Adobe thinks it may have the solution for blurry photos and videos.

During the Sneaks portion of its Max 2020 conference, the company showed off an experimental feature called Sharp Shots that’s powered by its Sensei AI. It uses machine learning to deblur each frame of a video. The results can be a bit hard to appreciate in a compressed YouTube clip, but they mostly speak for themselves. You can see in the videos that Adobe shared that there’s a significant difference in image clarity. It’s most noticeable in the final example, with the facial features of the two friends in the clip much easier to make out in the AI-processed video.

One thing to keep in mind is that Sharp Shots may never make its way into an Adobe product. It’s also worth pointing out that we’ve seen other companies promise a lot with AI and then not deliver. In one of its most famous I/O demos, Google showed off a Photos feature it was working on that would allow you to remove objects from images. The company never ended up shipping that feature.

That said, AI is enabling some of the most compelling new features coming to Adobe’s products. In Photoshop, for example, AI will soon allow you to add more dramatic skies to your photos. That’s something that won’t change even if Adobe doesn’t end up shipping Sharp Shots.   

via Engadget http://www.engadget.com

October 21, 2020 at 04:36PM