As generative AI content threatens to encroach further and further upon the entertainment industry, animation, and Japanese animation in particular, has become something of a major battleground, as companies on both the production and distribution sides weigh the worth (and potential backlash) of using the technology. But over the weekend, a surprisingly grim new frontier opened up in that battle: the arrival of AI-generated anime dubs.
Over the course of the holiday break in the United States, Prime Video rolled out the early stages of a new beta program that uses generative AI to voice English and Latin American Spanish dubs of several anime series in the streamer’s catalogue, including the likes of Mappa’s 2018 adaptation of Banana Fish (which has, somewhat controversially, never received an English-language dub before this) and the 2017 Madhouse No Game No Life movie, No Game No Life Zero. Amazon did not officially announce the rollout; it took aggrieved anime fans kicking up a storm on social media to bring it to people’s attention.
And for good reason: the dubs sound (perhaps to the surprise of no one outside of the AI accelerationist sphere) absolutely awful.
io9 has reached out to Prime Video for comment on the rollout of its AI dub beta and will update if we hear back from the streamer.
Even before you get to the translated script itself, these dubs fall well below any acceptable standard. The intonation, the pacing, the emotion (or rather, the distinct lack thereof): there has long been a brand of anime diehard who, for myriad reasons, perceives dubbed anime as lesser than the original Japanese work and sees dubbing as carrying a legacy of poor quality, in spite of the leaps and bounds in dubbing quality made over the years as anime has become more and more mainstream. And yet these AI dubs are somehow even worse than the absolute lowest of those perceptions made manifest.
Beta labelling or otherwise, it’s almost shocking that Amazon would consider these acceptable to go live, regardless of how much or how little fanfare the company made about the initiative. It’s further shocking that, in some cases, the AI dubbing is being applied to projects that have either famously waited years for dubs, like Banana Fish, or, in some wild instances, already received dubs performed by actual human beings, as is the case with No Game No Life Zero, which was dubbed by Sentai Filmworks. In those cases, the AI dub isn’t filling a void but effectively erasing the past for the sake of trying to shoehorn a misguided vision of the future into reality.
With any luck, Amazon will see the PR nightmare created by this “beta” and pull back from attempting more; it’s the kind of push and pull every studio now has to weigh as it tries to march forward with public-facing generative AI content. But between Crunchyroll’s desire to experiment more and more with AI-translated subtitles and initiatives like this, it’s clear that some of the most beleaguered professionals in the business of exporting anime face being cast aside, quality be damned. And regardless of how you feel about dub and translation quality in the here and now, non-Japanese anime audiences are only going to suffer if platforms keep trying to force this upon them in a race to the bottom.
Telecom regulators in India have reportedly asked smartphone manufacturers to preload onto all new devices a state-owned cybersecurity app that cannot be deleted, and to push the app to existing devices via a software update. Reuters reports that, according to a non-public government order sent to manufacturers, Apple, Samsung, Xiaomi and others were given 90 days to comply.
The app in question is called Sanchar Saathi (meaning Communication Companion), and is primarily aimed at fraud prevention with tools that allow users to report and lock lost or stolen devices. According to Reuters, the app has a reported 5 million downloads since its release and has helped block 3.7 million stolen or lost phones in India. An additional 30 million reportedly fraudulent connections have been terminated using the app.
"If I lose my phone, immediately the app is on my phone which I can then register and make sure my phone is not used by any fraudulent individual. It’s a step to protect the consumer," Telecom Minister Shri Jyotiraditya M. Scindia said in an interview with CNBC. The Minister said the installation order should be issued in the "next couple of days."
How smartphone manufacturers will respond remains to be seen. Apple, for its part, doesn’t have the strongest history of standing up to governments that oversee large markets for the company. Just a few weeks ago Apple removed two of the largest LGBTQ+ dating apps from the Chinese App Store at the government’s request. In 2019 the iPhone maker removed a Hong Kong protest app following pressure from Chinese authorities. The company has also become increasingly entangled with India as it looks to move US-bound iPhone production to the country.
Flock, the automatic license plate reader and AI-powered camera company, uses overseas workers hired through Upwork to train its machine learning algorithms, with training material telling workers how to review and categorize footage, including images of people and vehicles in the United States, according to material reviewed by 404 Media that was accidentally exposed by the company.
Researchers at the University of Cincinnati are developing a drone with flapping wings that can locate and hover around a moving light like a moth to a flame. Moths have the incredible ability to hover in place or even fly backward. They automatically make fine adjustments to compensate for wind or obstacles so they can remain stationary or follow a moving object. Likewise, this mothlike drone makes fine adjustments to maintain a desired attitude and distance from a light, even when the light moves. The drone simultaneously measures the performance of whatever function it is programmed to optimize, like finding a light source, and corrects its course in a constant feedback loop that allows for remarkably consistent and stable flight. This research is interesting not only for what it might mean for new autonomous unmanned aerial vehicles but also for how these tiny insects manage their miraculous aerobatics with brains the size of a grain of pollen.
Scientists have detected tiny lightning bolts on Mars for the first time — they were found discharging around NASA’s Perseverance rover and coming from dust-storm fronts and whirling dust devils.
Finding the electrical discharges has solved a major Martian mystery: the origin of oxidants such as hydrogen peroxide, which was first detected on Mars in 2003. These oxidants can react with organic molecules, potentially destroying biosignatures, while other chemical reactions triggered by the lightning can generate new organic molecules.
"This is exciting," Baptiste Chide of the Institut de Recherche en Astrophysique et Planétologie in Toulouse told Space.com. "It opens a new field of investigation for Mars."
Chide led a team of Mars rover scientists to find evidence for the electrical discharges hidden in data from the most unexpected of instruments: Perseverance’s microphone.
Chide’s team discovered 55 electrical events across 29 hours’ worth of microphone recordings, spread out across two Martian years. The recordings each have a distinct audio signature. Initially there is a burst of static, called the overshoot, that lasts less than 40 microseconds. The overshoot is followed by an exponential drop in signal lasting perhaps 8 milliseconds, depending on how far away the microphone is from the discharge. Both the overshoot and subsequent drop are not real acoustic noises: They are the result of interference in the microphone’s electronics from the magnetic field generated by the discharge. The next part of the audio recordings is a real sound. This manifests as a second loud peak in the signal followed by smaller peaks, and these are caused by a modest shockwave produced by the flash of the lightning.
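For readers who want a more concrete picture of that signature, here is a minimal sketch of how one might flag candidate discharge events in raw microphone samples: a sharp spike, a roughly exponential decay over a few milliseconds, then a second, genuinely acoustic peak. The sample rate, thresholds, and window lengths below are assumptions chosen for illustration, not the rover team’s actual detection pipeline.

```python
import numpy as np

# Illustrative sketch only. It flags events matching the signature described above:
# a sharp spike (the electromagnetic "overshoot"), an exponential-style decay over
# roughly 8 milliseconds, then a later peak from the real acoustic shockwave.
# The sample rate and thresholds are assumptions, not the Perseverance pipeline.

SAMPLE_RATE = 100_000  # samples per second (assumed)

def candidate_spikes(signal, threshold_sigma=6.0):
    """Return sample indices where the signal jumps well above the noise floor."""
    noise_level = np.median(np.abs(signal)) + 1e-12
    spike_indices = np.flatnonzero(np.abs(signal) > threshold_sigma * noise_level)
    events = []
    for idx in spike_indices:
        # Collapse runs of nearby samples into a single candidate event.
        if not events or idx - events[-1] > int(0.01 * SAMPLE_RATE):
            events.append(int(idx))
    return events

def looks_like_discharge(signal, start):
    """Check for a fast decay followed by a later, genuinely acoustic peak."""
    decay_end = start + int(0.008 * SAMPLE_RATE)     # ~8 ms decay window
    acoustic_end = start + int(0.05 * SAMPLE_RATE)   # listen a little longer
    decay = np.abs(signal[start:decay_end])
    acoustic = np.abs(signal[decay_end:acoustic_end])
    if decay.size < 2 or acoustic.size == 0:
        return False
    has_decay = decay[-1] < 0.1 * decay[0]           # signal dies away quickly
    has_second_peak = acoustic.max() > 3 * np.median(np.abs(signal))
    return bool(has_decay and has_second_peak)
```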
These electrical discharges are not forked lightning bolts lancing down from the sky like we have on Earth; Mars lacks the atmospheric water needed for thunderstorms. Instead, for the microphone to hear the electrical discharges, they have to be much closer to the rover.
On Earth, lightning is caused mostly by friction between icy particles in the clouds. On Mars, it is friction between dust particles that prompts the discharges. We see something similar on Earth in volcanic plumes.
However, the conditions on Earth and Mars are very different, evident in their respective "breakdown threshold." This describes the point when clouds of particles that have become electrically charged are able to discharge.
"The breakdown threshold is higher on Earth than on Mars, and is primarily to do with pressure and also the composition of the atmosphere," Daniel Mitchard of Cardiff University told Space.com. Mitchard is a physicist who studies lightning, though he is not a member of the rover team and did not participate in this study.
An image of Perseverance on Mars. (Image credit: Courtesy NASA/JPL-Caltech, Attribution, via Wikimedia Commons)
Earth’s predominantly nitrogen–oxygen atmosphere and Mars’ mostly carbon-dioxide atmosphere are both electrically insulating, meaning a lot of charge has to build up to overcome the insulating effect and discharge. Because the surface pressure on Earth is one atmosphere, there is a lot of insulating atmosphere that lightning has to pass through, so the breakdown threshold is quite high: three megavolts per meter. On Mars, where the surface pressure is just 0.006 atmospheres, there is less insulating atmosphere for an electrical discharge to overcome, so the breakdown threshold is much lower, around 15 kilovolts per meter.
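As a rough sanity check (and only a sketch: the linear pressure scaling assumed below is an approximation that ignores the different gas compositions Mitchard mentions), scaling Earth’s threshold down by the ratio of surface pressures lands in the same ballpark as the Mars figure quoted above:

```python
# Back-of-envelope estimate, assuming breakdown field scales roughly with pressure.
earth_threshold_v_per_m = 3e6   # ~3 megavolts per meter at 1 atmosphere
mars_pressure_atm = 0.006       # Mars surface pressure in atmospheres

mars_estimate = earth_threshold_v_per_m * mars_pressure_atm
print(f"Estimated Mars breakdown threshold: {mars_estimate / 1e3:.0f} kV/m")  # ~18 kV/m
```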
“So this means that we would generally expect lightning on Mars to be weaker than on Earth,” said Mitchard, who likens Mars’ electrical discharges to the static shock you might get from rubbing a balloon or walking on insulated flooring.
Of the 55 discharge events detected by Perseverance’s microphone, 54 of them occurred during the top 30% of strongest winds recorded during the 29 hours of recordings. This strongly connects the discharges to localized winds that are able to loft dust into the air, as is commonly found at a dust-storm front. Sixteen of the events also coincided with dust devils passing very close to the rover — the most distant electrical discharge measured is estimated to have been just 6.2 feet (1.9 meters) from Perseverance. Some of the discharges were caused by dust grains in the air, while a handful were actually the rover becoming electrically charged to several kilovolts following collisions with airborne dust particles and then discharging into the ground.
However, the rover and its instruments are well protected from electrical mishap. Nevertheless, Chide and the rover team speculate that the Soviet Mars 3 mission, which landed on Mars in the middle of a dust storm in 1971 and was only active for 20 seconds before going kaput, could have been damaged by electrical discharges.
The microphone readings can also guide the design of future Mars missions to ensure they are fully protected. “Now that we have quantitative data on the energy [of the discharges], we will be able to adjust the specification we put on the design of electronic boards and potentially have new constraints on the space suits needed for astronauts,” said Chide.
So far, only the microphone has picked up evidence for the discharges. Could Perseverance’s cameras potentially capture the flashes of these lightning bolts?
“Imaging the discharges would be hard,” said Chide. That is partly because many of them take place during the day, when dust devils are most active, and those that would otherwise be bright enough might be obscured by dust. The flashes would also be very brief, lasting just microseconds, and most would be only millimeters in length – the largest bolts are the discharges from the rover itself, which extend several tens of centimeters to reach the Red Planet’s surface. Capturing such short, fast electrical discharges would require a high-speed, high-resolution camera, which we don’t currently have on Mars.
“Hopefully, more advanced cameras will eventually make their way there,” said Mitchard. That now seems more likely, should planetary scientists wish to study the lightning in more detail in the future.
A dust storm on Mars as seen from overhead. (Image credit: NASA/JPL-Caltech/UArizona)
Even then, it would not be a simple matter. "We wouldn’t really know where to point the camera," said Chide. "We’d have to be very lucky!"
Of more immediate interest is lightning’s connection to oxidants such as hydrogen peroxide. Because such oxidants can react with and chemically alter organic compounds, the presence of lightning is of interest to astrobiologists seeking biosignatures on the Red Planet. In theory, areas with more dust devil and dust storm activity should experience more electrical discharges and therefore accumulate higher concentrations of oxidants. For example, dust devil activity in Gusev crater, where the Spirit Mars Exploration Rover landed in 2004, is twenty times higher than in Jezero crater, where Perseverance is, while there is barely any dust devil activity on Elysium Planitia. Does this match the distribution of oxidants on Mars, and could scientists improve their chances of finding biosignatures by sending life-seeking missions to areas of Mars that do not experience as many dust devils and dust storms?
"This is a good question," said Chide. "The quantification of the amount of oxidants produced by this new phenomenon will be the next step, requiring lab experiments and models."
Lightning has already been discovered in the clouds of the gas giants Jupiter and Saturn, but this is the first time that electrical discharges have been detected on a rocky planet other than Earth. It raises the possibility that similar phenomena could take place on Venus via dust or on Saturn’s moon Titan via icy grains.
Meanwhile, the Martian discharges could assist dust storms, since the electrical static reduces the threshold velocity needed for winds to lift dust particles off the surface in the first place. That creates a positive feedback loop: dust is lofted off the surface, becomes further electrified, and in turn helps more dust become airborne, and so on. As such, the electrification of the dust could play an important role in Mars’ global dust cycle and hence in what passes for its climate.
With thousands of smaller, regional dust storms every Martian year, that means thousands of kilometers of electrified dust-storm front that could be crackling with tiny lightning bolts. The shocking story of the electrified Mars may not be over yet.
The research was published on Nov. 26 in the journal Nature.
The Chinese robotics company AgiBot has set a new world record for the longest continuous journey walked by a humanoid robot. AgiBot’s A2 walked 106.286 kilometers (66.04 miles), according to Guinness World Records, making the trek from Nov. 10-13.
The robot journeyed from Jinji Lake in China’s Jiangsu province to Shanghai’s Bund waterfront district, according to China’s Global Times news outlet. The robot never powered off and reportedly continued to operate while batteries were swapped out, according to UPI.
A video posted to YouTube shows a highly edited version of the walk that doesn’t give much insight into how it was presumably monitored by human handlers. But even if it did have some humans playing babysitter, the journey included just about everything you’d expect when traveling by foot in an urban environment, including different types of ground, limited visibility at night, and slopes, according to the Global Times.
The robot obeyed traffic signals, but it’s unclear what level of autonomy may have been at work. The company told the Global Times that “the robot was equipped with dual GPS modules along with its built-in lidar and infrared depth cameras, giving it the sensing capability needed for accurate navigation through changing light conditions and complex urban environments.”
That suggests it was fully autonomous, and Guinness World Records used the word “autonomous,” though Gizmodo couldn’t independently confirm that claim.
“Walking from Suzhou to Shanghai is difficult for many people to do in one go, yet the robot completed it,” Wang Chuang, partner and senior vice president at AgiBot, told the Global Times.
The amount of autonomy a robot is operating under is a big question when it comes to companies rolling out their demonstrations. Elon Musk’s Optimus robot has been ridiculed at various points because the billionaire has tried to imply his Tesla robot is more autonomous than it actually is.
For example, Musk posted a video in January 2024 that appeared to show Optimus folding a shirt. That’s historically been a difficult task for robots to accomplish autonomously. And, as it turns out, Optimus was actually being teleoperated by someone who was just off-screen. Well, not too far off-screen. The teleoperator’s hand was peeking into the frame, which is how people figured it out.
Tesla’s Optimus robot folding laundry in Jan. 2024 with an annotation of a red arrow added by Gizmodo showing the human hand. Gif: Tesla / Gizmodo
Musk did something similar in October 2024 when he showed off Optimus robots supposedly pouring beer during his big Cybercab event in Los Angeles. They were teleoperated as well.
It’s entirely possible that AgiBot’s A2 walked the entire route autonomously. The tech really is getting that good, even if long-lasting batteries are still a big hurdle. But obviously, people need to remain skeptical when it comes to spectacular claims in the robot race.
We’ve been promised robotic servants for over a century now. And the people who have historically sold that idea are often unafraid to use deception to hype up their latest achievements. Remember Miss Honeywell of 1968? Or Musk’s own unveiling of Optimus? They were nothing more than humans in a robot costume.
Last week, the Massachusetts Institute of Technology (MIT) published a study claiming that AI is already capable of replacing 11.7% of existing U.S. labor. It’s certainly the kind of eye-popping claim guaranteed to draw attention to the researchers’ work at a time of shaky faith in AI, when stockholders might want some reassurance that their AI investments are going to pan out.
The report on this research is called “The Iceberg Index: Measuring Skills-centered Exposure in the AI Economy,” but it also has its own dedicated page, called “Project Iceberg,” that lives on the MIT website. Compared to the research paper, the project page has a lot more emoji. Where the paper comes across sort of like a warning about AI tech, the project page, which is headlined “Can AI Work with You?”, feels more like an ad for AI, in part thanks to text like this:
“AI is transforming work. We have spent years making AIs smart—they can read, write, compose songs, shop for us. But what happens when they interact? When millions of smart AIs work together, intelligence emerges not from individual agents but from the protocols that coordinate them. Project Iceberg explores this new frontier: how AI agents coordinate with each other and humans at scale.”
The titular “Iceberg Index” comes from an AI simulation that uses what the paper calls “Large Population Models,” which apparently ran on processors housed at the federally funded Oak Ridge National Laboratory, a lab affiliated with the Department of Energy.
Legislators and CEOs seem to be the target audience, and they’re meant to use Project Iceberg to “identify exposure hotspots, prioritize training and infrastructure investments, and test interventions before committing billions to implementation.”
The Large Population Model—should we start shortening this to LPM?—claims to be able to digitally track the behavior of 151 million human workers “as autonomous agents” with 32,000 trackable “skills,” along with other factors like geography.
The overall finding, the researchers claim, is that current AI adoption accounts for 2.2% of “labor market wage value,” but that 11.7% of labor is exposed—ostensibly replaceable based on the model’s understanding of what a human can currently do that an AI software widget can also do.
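To make the arithmetic a little more concrete, here is a minimal sketch of how a wage-weighted exposure share like this could be computed in principle. The worker records, the skill list, and the overlap rule below are invented for illustration only; they are not the Iceberg Index’s actual methodology.

```python
from dataclasses import dataclass

# Purely illustrative sketch of a wage-weighted "exposure" share. The sample data
# and the simple skill-overlap rule are assumptions for demonstration only.

@dataclass
class Worker:
    wage: float        # annual wage in dollars
    skills: set[str]   # skills this worker's job requires

# Skills current AI tools are judged capable of performing (assumed list).
AI_CAPABLE_SKILLS = {"data entry", "summarization", "basic translation"}

def is_exposed(worker: Worker, coverage_needed: float = 0.5) -> bool:
    """A job counts as 'exposed' if AI covers at least half of its required skills."""
    if not worker.skills:
        return False
    covered = len(worker.skills & AI_CAPABLE_SKILLS)
    return covered / len(worker.skills) >= coverage_needed

def exposure_share(workers: list[Worker]) -> float:
    """Fraction of total wages paid for jobs that meet the exposure rule."""
    total_wages = sum(w.wage for w in workers)
    exposed_wages = sum(w.wage for w in workers if is_exposed(w))
    return exposed_wages / total_wages if total_wages else 0.0

workers = [
    Worker(45_000, {"data entry", "summarization"}),
    Worker(80_000, {"welding", "equipment inspection"}),
    Worker(60_000, {"basic translation", "customer support", "scheduling"}),
]
print(f"Exposure share: {exposure_share(workers):.1%}")  # ~24% of wages in this toy example
```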
It should be noted that humans in actual jobs constantly work outside their job descriptions, handle exceptional and non-routine situations, and are—for now—uniquely capable of handling many of the social aspects of a given job. It’s not clear how the model accounts for this, although it does note that its findings are correlational not causal, and says “external factors—state investment, infrastructure, regulation—mediate how capability translates to impact.”
However, the paper says, “Policymakers cannot wait for causal evidence of disruption before preparing responses.” In other words, AI is too urgent to get hung up on the limitations of the study, according to the study.