AI-Powered Coding Will Add $1.5 Trillion to Global GDP, Say Researchers

https://www.discovermagazine.com/technology/ai-powered-coding-will-add-usd1-5-trillion-to-global-gdp-say-researchers

Back in October 2021, developers on the GitHub software development platform were given access to an exotic AI tool called Copilot. Created in association with OpenAI, the company behind ChatGPT, Copilot uses the same generative AI technology to produce computer code on request, rather than text.

And it is pretty good at it. GitHub reckons that Copilot successfully autocompletes coding suggestions about 50 percent of the time, which should significantly increase the productivity of the millions of developers who now use it.

And that raises an interesting question: just how influential is Copilot set to become?

Trillion Dollar Question

Now we get an answer thanks to the work of Thomas Dohmke at GitHub, Marco Iansiti at Harvard Business School and Greg Richards at Keystone.AI, who have measured how developers are using Copilot. Their conclusion is that AI-powered coding is producing a sea change in the software industry that is set to turbocharge global GDP by over $1.5 trillion by 2030. “This symbiotic relationship has the potential to shape the construction of the world’s software for future generations,” they say.

GitHub is a software development platform that lets developers collaborate easily at a global scale. Launched in 2008, bought by Microsoft in 2018 and now with 100 million developers on its books, GitHub has rapidly become the go-to repository for software development, particularly for open-source projects.

Copilot is powered by the same Generative Pre-Trained Transformer technology behind ChatGPT. But instead of generating text, it produces code. As such, it has the potential to significantly influence software production.

So Dohmke and co decided to find out by how much. They analyzed how almost a million GitHub developers use Copilot and the code it produces. “On average,” they say, “users accept nearly 30 percent of code suggestions and report increased productivity from these acceptances.”

The pattern of use is interesting too. Dohmke and co say that the productivity impact increases over time and that the benefits are greatest for less experienced developers. That is consistent with other work suggesting that less experienced workers benefit most from AI assistants because they have the most to learn.

In addition, Dohmke and co say that most of the innovation with AI-powered coding happens on open-source projects and is usually led by individuals rather than corporations. They say their findings “suggest that the open-source ecosystem, particularly in the United States, is driving generative AI software innovation.”

AI Innovation

So how much is this increased productivity worth? Dohmke and co point out that software development significantly contributes to global GDP already but that its contribution is limited by the global shortage of developers. They say the increased productivity from Copilot could fill some of this gap and that this increase in productivity would add $1.5 trillion to global GDP by 2030.

Dohmke and co are adamant that this is a conservative estimate, pointing out that the adoption of AI-powered coding techniques is accelerating and that what we are seeing now is just the beginning of a massive change. “As more developers adopt these tools and become fluent in the skill set of prompting with generative AI, it is clear that this new way of software development has created an inextricable link between humankind and artificial intelligence that could well define how the world’s software is built for generations to come,” they conclude.

Hang on to your hats!


Ref: Sea Change in Software Development: Economic and Productivity Analysis of the AI-Powered Developer Lifecycle: arxiv.org/abs/2306.15033

via Discover Main Feed https://ift.tt/zkXgMLd

June 29, 2023 at 10:29AM

Researchers Created a Simple App That Turns Your Smartphone Screen Into an Accurate Thermometer

https://gizmodo.com/thermometer-app-1850565745

Device makers have struggled to incorporate temperature sensors into smartphones and smartwatches to turn them into medically accurate body thermometers, but researchers at the University of Washington claim they’ve come up with a way to turn an off-the-shelf smartphone into exactly that, using nothing but a new app. They’re calling it FeverPhone.


Although smart wearables like the Apple Watch Series 8 and Apple Watch Ultra can measure a user’s body temperature through newly added sensors, it’s a feature that Apple insists is not yet accurate enough to be used for medical diagnosis or treatment. Instead, both of those devices use temperature measurements to provide users with a better understanding of their sleep patterns throughout the night. Unlike heart rate readings, the Apple Watch still isn’t a digital thermometer that can make accurate temperature readings on demand.

As many of us discovered during the wilder days of the Covid-19 pandemic, non-contact digital thermometers aren’t terribly expensive, but they can quickly sell out when demand for them skyrockets. As a readily available alternative, researchers at the University of Washington turned to smartphones. One key difference: Their solution does not need any added attachments or hardware upgrades. Smartphones already rely on components called thermistors to measure the temperature of the device’s internals, including the battery, in order to activate safety precautions to ensure they don’t overheat. It’s why your iPhone will sometimes show a warning that it needs to cool down before you can safely use it again.

Thermistors, which are also used in medical-grade thermometers, can’t directly measure a user’s body temperature while inside a smartphone, but they can be used to track the amount of heat energy that has been transferred between a user and the mobile device they’re making contact with. To simulate a fevered test subject, the researchers used a sous-vide machine to heat a plastic bag full of water, and pressed the touchscreens of several different smartphones against it, including devices in protective cases, and those using screen protectors. The built-in thermistor was used to measure how quickly the device warmed up during this interaction, and that data was used to train a machine learning model powering the FeverPhone app that can estimate what a user’s body temperature is.
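The FeverPhone study itself isn’t open source, so the sketch below is purely illustrative: it mimics the idea described above by simulating thermistor “warm-up curves” (the phone heats faster against a hotter forehead) and fitting a simple regression that maps the observed warm-up back to body temperature. The toy physics, feature choices, and model are all assumptions, not the researchers’ actual method.

```python
# Hypothetical sketch of the FeverPhone idea: simulate how a phone's internal
# thermistor warms up against skin, then learn to invert that signal.
import numpy as np

rng = np.random.default_rng(0)

def simulate_warmup(body_temp_c, seconds=90, ambient_c=26.0, k=0.01):
    """Newton's-law-of-cooling toy model of the phone's internal temperature
    while the touchscreen is held against skin at body_temp_c."""
    t = np.arange(seconds)
    phone_temp = body_temp_c + (ambient_c - body_temp_c) * np.exp(-k * t)
    return phone_temp + rng.normal(0, 0.02, size=seconds)  # sensor noise

# Build a toy training set over plausible core temperatures (36-40 C).
body_temps = rng.uniform(36.0, 40.0, size=200)
features = []
for bt in body_temps:
    curve = simulate_warmup(bt)
    # Two simple features: starting temperature and total rise over 90 s.
    features.append([curve[0], curve[-1] - curve[0]])
X = np.column_stack([np.ones(len(features)), np.array(features)])

# Ordinary least squares: body_temp ~ intercept + start_temp + rise.
coef, *_ = np.linalg.lstsq(X, body_temps, rcond=None)

# Estimate temperature for an unseen "fevered" subject at 38.5 C.
curve = simulate_warmup(38.5)
x = np.array([1.0, curve[0], curve[-1] - curve[0]])
estimate = x @ coef
print(f"estimated core temp: {estimate:.2f} C")
```

The real system reportedly uses a trained machine-learning model over data from actual devices (cases, screen protectors, and all); this linear fit just shows why the warm-up rate carries enough signal to recover the contact temperature.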

Using the FeverPhone app sounds easy enough, though it requires users to hold their device by its corners and press its touchscreen against their forehead for around 90 seconds. This was deemed the ideal amount of time for enough body heat to transfer to the device, and since the forehead press is detected by the touchscreen, the device and the app know when a measurement is deliberately being made.

During a clinical trial at the University of Washington School of Medicine’s Emergency Department, the app was tested by 37 participants, 16 of whom had a mild fever, and the results were compared against readings from an oral thermometer. FeverPhone predicted a user’s core body temperature with “an average error of about 0.41 degrees Fahrenheit (0.23 degrees Celsius),” which is on par with the accuracy of home-use thermometers, including non-contact options.

The researchers are currently working to improve the app’s accuracy by expanding the number of smartphone models used to train its machine learning model; initially, just three different devices were used. They’re also optimistic it could be trained to work with smartwatches, which would actually suit the approach better: their smaller size lets them heat up faster, allowing measurements much shorter than 90 seconds. FeverPhone may never be approved as a medical-grade thermometer, but it sounds like it will be accurate enough to give users a better idea of when they might actually be sick and should take appropriate measures to protect themselves and others.

via Gizmodo https://gizmodo.com

June 22, 2023 at 01:22PM

We Pump So Much Groundwater We’ve Shifted The World’s Tilt and Contributed to Sea Level Rise

https://gizmodo.com/study-groundwater-pumping-has-shifted-earth-axis-1850581910

We’ve been sucking the Earth dry, and it’s starting to change how our planet works. A study published this month in the journal Geophysical Research Letters explains that we’ve extracted so much damn water out of the ground that it has changed the planet’s tilt and contributed to sea level rise.


Groundwater is a pretty important source of water throughout the world, especially in the U.S. It’s used to provide drinking water and serves as a backup source during drought. But there is such a thing as taking too much out of the ground. “Earth’s pole has drifted toward 64.16°E at a speed of 4.36 cm/yr during 1993–2010 due to groundwater depletion and resulting sea level rise,” the researchers wrote in the study. That’s a drift of about 1.7 inches toward the east per year, or more than 28 inches (70 centimeters) in less than two decades.

But why does this happen? The planet’s rotational pole, the point around which the Earth rotates, moves through a process called polar motion, which describes the axis the world spins on relative to the Earth’s crust. The distribution of water across the planet affects how the world spins, the researchers explained in the study. So when groundwater is pumped out of the Earth’s crust and moved elsewhere, it adds to the water in the ocean and shifts how the planet’s mass is distributed. This makes a difference because there is a lot of water underground: over a thousand times more than in all the world’s rivers and lakes combined, according to the U.S. Geological Survey.

“Earth’s rotational pole actually changes a lot,” Ki-Weon Seo, a geophysicist at Seoul National University and study coauthor, said in a statement. “Our study shows that among climate-related causes, the redistribution of groundwater actually has the largest impact on the drift of the rotational pole.”

But water extraction hasn’t just changed how the world tilts. Using groundwater data and climate models, the researchers concluded that we’ve pumped enough groundwater to contribute 6.24 millimeters (0.24 inches) of sea level rise from 1993 to 2010. For context, sea level has risen at a rate of 3.4 millimeters (0.13 inches) a year since 1993, according to NASA. These numbers may seem small, but sea level rise, already fueled by climate change, has major implications for the world. Recent research shows how sea level rise is washing away the breeding grounds of endangered turtle species. Major coastal cities could be swallowed up by sea level rise, potentially displacing hundreds of millions of people within a few decades.
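The per-year figures reported in the study are easy to sanity-check. A few lines of arithmetic, using only the rates and the 1993–2010 window quoted above:

```python
# Quick sanity check of the study's headline numbers over 1993-2010.
years = 2010 - 1993  # 17 years

# Pole drift: 4.36 cm/yr toward 64.16 degrees E.
drift_cm = 4.36 * years
print(f"total pole drift: {drift_cm:.1f} cm ({drift_cm / 2.54:.1f} in)")
# ~74 cm, consistent with "more than 28 inches (70 centimeters)".

# Sea level: groundwater pumping contributed 6.24 mm over the period,
# against NASA's overall rise of about 3.4 mm per year since 1993.
total_rise_mm = 3.4 * years
share = 6.24 / total_rise_mm
print(f"groundwater share of total rise: {share:.1%}")
# Roughly a tenth of the total rise over those years.
```

So over those 17 years, groundwater pumping accounts for about 11 percent of the observed sea level rise, which is why the authors call it one of the largest climate-related contributors to polar drift.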

Communities throughout the U.S., especially in the South, are struggling with sea level rise. Louisiana has a huge erosion problem. Data from the U.S. Geological Survey shows that the state lost an estimated 2,000 square miles of land between 1932 and 2016. That’s larger than the state of Rhode Island. Sea level rise has long been a growing concern for Florida’s coast, and flooding and king tides are projected to happen more often in the Sunshine State. The increased flooding has also disrupted the real estate market, and people in flood zones could see their property values plunge.

Some local governments in the U.S. have recently addressed the over-extraction of groundwater. Arizona officials recently paused some housing development expansion over groundwater supply. Construction can only continue on new builds in parts of Phoenix if developers can prove there is a steady source of water for those future households. However, developers cannot rely on groundwater as that source to obtain a certificate allowing them to continue building, AZFamily reported. Officials made the decision because groundwater is a finite resource, and if communities continue to overdraw it, replenishing the supply could take thousands of years.

Want more climate and environment stories? Check out Earther’s guides to decarbonizing your home, divesting from fossil fuels, packing a disaster go bag, and overcoming climate dread. And don’t miss our coverage of the latest IPCC climate report, the future of carbon dioxide removal, and the un-greenwashed facts on bioplastics and plastic recycling.

via Gizmodo https://gizmodo.com

June 27, 2023 at 01:35PM

Computer Enhance: Scientists Reconstruct 3D Images From Eye Reflections

https://gizmodo.com/eye-reflections-3d-reconstructions-experiment-1850586090

There’s a saying that the eyes are the window to a person’s soul, but according to a team of researchers from the University of Maryland, the eyes might instead be a mirror, providing enough data to reconstruct exactly what a person in a short video clip was looking at as an interactive 3D model.


A lot of factors made CSI and its spin-offs a popular franchise for CBS, but what drew many viewers back, week after week, was the over-the-top technology the crime scene investigators had at their disposal. A grainy, low-resolution frame of video captured in the middle of the night by a security camera could be repeatedly enhanced until it revealed the individual fibers on a suspect’s shirt. Most of the tools used to solve crimes on the show still exist only in the writers’ imagination, but researchers are making impressive advances in the amount of data that can be extracted from just a few frames of video footage.

The University of Maryland researchers leveraged previous work on neural radiance field (NeRF) technology, in which complex scenes or objects can be fully recreated in 3D from just a partial set of 2D images captured at various angles, along with the fact that the shape of the eye’s cornea is more or less the same for all healthy adults. With those two ingredients, they were able to generate 3D recreations of simple scenes based on imagery extracted from eye reflections. Don’t expect the results to be used to solve any crimes, though.
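At the heart of any NeRF method is volume rendering: the scene is modeled as a field that assigns a density and a color to every 3D point, and an image pixel is formed by compositing samples along the camera ray for that pixel. Here is a minimal numpy sketch of that discrete compositing step; the densities and colors are made-up stand-ins for what a trained network would predict, not anything from the Maryland study.

```python
# Minimal sketch of NeRF-style discrete volume rendering along one ray.
import numpy as np

def render_ray(densities, colors, deltas):
    """Composite samples along one camera ray.
    densities: (N,) non-negative density (sigma) at each sample
    colors:    (N, 3) RGB predicted at each sample
    deltas:    (N,) distance between consecutive samples
    """
    alpha = 1.0 - np.exp(-densities * deltas)  # opacity of each segment
    # Transmittance T_i: probability the ray reaches sample i unblocked.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha]))[:-1]
    weights = trans * alpha                    # contribution of each sample
    return weights @ colors, weights

# A ray that passes through empty space, then hits a solid red surface.
densities = np.array([0.0, 0.0, 50.0, 50.0])
colors = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0]])
deltas = np.full(4, 0.1)

rgb, weights = render_ray(densities, colors, deltas)
print(rgb)  # essentially pure red: the opaque surface dominates the ray
```

Training a NeRF means adjusting the field so that rays rendered this way reproduce the input photos; the eye-reflection work's extra difficulty is that all its "input photos" come from nearly the same viewpoint, warped by the cornea.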

This approach, as detailed in a recently published study, comes with some unique challenges. For starters, recreating 3D models from 2D images usually starts with high-quality source material, like video captured by a modern smartphone’s digital camera. Here, the reflections are extracted from a tiny, low-res portion of each frame, and they’re layered atop the complex textures of the eye’s iris, which also varies in color from person to person, requiring extensive post-processing to clean up the imagery.

Further complicating this approach is the fact that the series of 2D images being used to create the reconstruction are all originating from the same location, with only slight variations introduced as the subject’s eye looks around. If you’ve ever tried to generate a 3D model of an object or a room using a smartphone’s camera and an app, you know you have to move around and record it from all sides and angles in order to get optimal results. But that’s not an option here.

As a result, the 3D models generated using this new technique are very low resolution and low on details. You can still identify objects like a plush dog, or a bright pink Kirby toy, as the researchers did, but these results were also achieved using optimal conditions: very basic scenes, very deliberate lighting, and high-res source imagery.

When the researchers applied their approach to footage they weren’t responsible for capturing, but which was still shot under ideal lighting conditions, such as a clip of Miley Cyrus from her Wrecking Ball music video sourced from YouTube, it was impossible to discern what the resulting 3D model depicts. The vague blob is probably a hole in a white shroud that the camera lens was shooting through to achieve the look of this shot, but not even Grissom’s CSI team could confirm that using this 3D model. As fascinating as this research may be, it will be a while before it becomes a tool with any practical applications.

via Gizmodo https://gizmodo.com

June 28, 2023 at 01:51PM