Gemini AI Will Now Help You With Your Physics Homework

As if using ChatGPT for college essays wasn’t enough, students are now getting a Gemini tool to help with their math and physics homework, too. Google is using its Circle to Search gesture, which was well received, to introduce the new feature.


With this update, you’ll be able to circle the part of a problem you’re stuck on and then use a long-press shortcut to get a step-by-step solution. Google sounds proud of the fact that it’s not just giving you the answer, but also showing all the working behind it. To use the feature, all you have to do is opt in to help with word problems from the Search Labs menu.

According to Google’s blog, it’s using its LearnLM tech to make this happen, which is apparently its “new family of models fine-tuned for learning.” The new feature is Android-only for now and is available on 100 million devices today; Google says that number will double by the end of the year. The Alphabet company also adds that it will soon extend the feature to reading and analyzing graphs, symbolic formulas, and diagrams as well.


If you ask me, I don’t have a good feeling about this feature. I’m well aware of how much I sound like a Boomer, but I’m worried that students will stop using their brains if advancements like these keep coming. I was in college when ChatGPT came out and watched students frequently turn to it to plagiarize. Thankfully, we now have tools to detect ChatGPT-generated content, and I’m guessing something similar will eventually arrive for Gemini content, too.

We will be covering all of Google’s big announcements during I/O this week. However, it seems one thing that won’t be announced at this year’s I/O is any new hardware.

via Gizmodo

May 14, 2024 at 02:09PM

Google’s Project Gameface hands-free ‘mouse’ launches on Android

At last year’s Google I/O developer conference, the company introduced Project Gameface, a hands-free gaming "mouse" that allows users to control a computer’s cursor with movements of their head and facial gestures. This year, Google has announced that it has open-sourced more code for Project Gameface, allowing developers to build Android applications that can use the technology. 

The tool relies on the phone’s front camera to track facial expressions and head movements, which are used to control a virtual cursor. A user could smile to "select" items onscreen, for instance, or raise their left eyebrow to go back to the home screen on an Android phone. In addition, users can set a threshold, or gesture size, for each expression, controlling how pronounced an expression must be to trigger a specific mouse action. 
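Google hasn’t published this exact logic, but the per-gesture threshold idea can be sketched in a few lines of Python. Everything here (the gesture names, the 0.0–1.0 scores, and the action names) is illustrative, not Gameface’s actual API — just a hedged picture of how per-frame gesture scores might map to cursor actions:

```python
# Illustrative sketch only, not Project Gameface's real code.
# A face-tracking model emits a score in [0.0, 1.0] for each gesture per
# frame; the user tunes a threshold per gesture so only sufficiently
# pronounced expressions trigger an action.

DEFAULT_THRESHOLDS = {
    "smile": 0.6,            # smile must be fairly big to "select"
    "raise_left_brow": 0.4,  # brow raise can be subtler
}

ACTIONS = {
    "smile": "select",
    "raise_left_brow": "go_home",
}

def triggered_actions(scores, thresholds=DEFAULT_THRESHOLDS):
    """Return the actions whose gesture score meets the user's threshold."""
    return [
        ACTIONS[gesture]
        for gesture, score in scores.items()
        if gesture in thresholds and score >= thresholds[gesture]
    ]

# A faint smile below the threshold triggers nothing...
print(triggered_actions({"smile": 0.3, "raise_left_brow": 0.1}))  # []
# ...while pronounced expressions fire their mapped actions.
print(triggered_actions({"smile": 0.8, "raise_left_brow": 0.5}))
```

Raising a gesture’s threshold is what the article describes as setting its "gesture size": the same expression model, but a stricter bar before the cursor reacts.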

The company developed Project Gameface with gaming streamer Lance Carr, who has muscular dystrophy, a condition that weakens the muscles. Carr had used a head-tracking mouse to game until a fire destroyed his home, along with his expensive equipment. The early version of Project Gameface focused on gaming and used a webcam to detect facial expressions, though Google knew from the start that the technology had plenty of other potential uses. 

For the tool’s Android launch, Google teamed up with an Indian organization called Incluzza that supports people with disabilities. The partnership gave the company the chance to learn how Project Gameface can help people with disabilities further their studies, communicate with friends and family more easily and find jobs online. Google has released the project’s open source code on GitHub and is hoping that more developers decide to "leverage it to build new experiences."



via Engadget

May 15, 2024 at 07:33AM