Boost AMD’s Ryzen AI Max performance up to 60% with this memory trick

https://www.pcworld.com/article/2625637/boost-amds-ryzen-ai-max-performance-up-to-60-with-this-memory-trick.html

If you’ve purchased a laptop or tablet with an AMD Ryzen chip inside, there’s a performance tweak you absolutely need to know about. Apply it to AMD’s AI/gaming powerhouse Ryzen AI Max chip, and you can score enormous performance gains in just seconds!

Savvy gamers know instinctively that you can boost your game’s frame rate by lowering the resolution or the visual quality, or by adjusting the Windows power-performance slider. But the Ryzen AI Max is a new kind of chip: a killer mobile processor that can run modern games at elevated frame rates and serve as an AI powerhouse.

What’s the secret? A simple adjustment of the Ryzen AI Max’s unified frame buffer, or available graphics memory. It’s an easy fix, but in my tests it made an enormous difference: up to a 60 percent performance boost in some cases.

When I was comparing Intel’s “Arrow Lake” mobile processor to AMD’s Ryzen AI 300 processor, I was unable to perform certain AI tests because of their memory and VRAM requirements. After I published the story, AMD reached out to suggest dialing up the VRAM via the laptop BIOS to enable the test to run. (As it turned out, this didn’t make a difference in that specific test.)

When Asus ROG provided me with an Asus ROG Flow Z13 gaming tablet for testing AMD’s Ryzen AI Max+ chip, the topic surfaced again. For whatever reason, Asus had configured its tablet with just 4GB of available video memory. The company recommended we test with 8GB instead. It seemed like an opportune time to see what effect adjusting the Ryzen AI Max’s graphics memory would have.

Why does video memory matter?

VRAM stands for video RAM. If you use a discrete GPU, the amount of VRAM is predetermined by your graphics-card manufacturer, and the VRAM chips are soldered right onto the board. Some people use “VRAM” as shorthand when talking about integrated graphics, too. That’s not entirely accurate; integrated graphics carve their memory out of the PC’s main system RAM, and the “VRAM” in this case is known as the UMA (unified memory architecture) frame buffer.

VRAM (and the UMA frame buffer) holds the textures a PC game uses, allowing the GPU to access and render them quickly. If the textures exceed the available VRAM, your PC must pull them in from somewhere else (RAM or the SSD), slowing down gameplay.

In the case of AI, the VRAM acts a lot like standard PC RAM, storing an AI model’s weights so that it can run. In many cases, insufficient VRAM means that a particular LLM simply won’t run at all. In both cases, gaming and AI, the amount of memory available to the GPU matters.
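
To see why this matters, a little back-of-the-envelope arithmetic helps. The sketch below is my own illustration, not part of any benchmark suite: it estimates how much memory an LLM’s weights alone occupy at common precisions, ignoring the extra room a real runtime needs for its KV cache and activations.

# Rough lower-bound estimate of the graphics memory an LLM's weights occupy.
# Real runtimes also need space for the KV cache and activations.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_footprint_gb(params_billions: float, precision: str) -> float:
    """Approximate size of the model weights alone, in gigabytes."""
    return params_billions * 1e9 * BYTES_PER_PARAM[precision] / 1024**3

for params in (7, 13, 70):
    for precision in ("fp16", "int8", "int4"):
        print(f"{params}B @ {precision}: ~{weight_footprint_gb(params, precision):.1f} GB")

# A 7B-parameter model at fp16 needs roughly 13GB for its weights alone, which
# won't fit in a 4GB or 8GB frame buffer but fits comfortably in a 16GB one.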

If you own a discrete graphics card from any manufacturer, the amount of VRAM can’t be adjusted. On Intel Core systems, the available UMA frame buffer is also typically fixed at half of the available system memory. In other words, if you have an Intel Arrow Lake laptop with 16GB of total memory, up to 8GB is accessible to the integrated GPU. And yes, adding more system memory increases the size of the UMA frame buffer. However, you still can’t adjust the UMA frame buffer’s size to your own preferences.

[Image: Mark Hachman / Foundry]

AMD, on the other hand, typically lets you adjust the size of the UMA frame buffer on its Ryzen processors (and not just the AI Max!), usually through the firmware. In my case, I got into the UEFI/BIOS just by rebooting the tablet and tapping the F2 key as it restarted.

On the ROG tablet, I found the adjustment in the advanced settings. Options ranged from an “Auto” setting up to 24GB, with several increments in between. As I noted above, Asus had accidentally sent out its tablets with the 4GB setting enabled. All I did was select a new size for the frame buffer and save, and the boot cycle continued.

[Image: Mark Hachman / Foundry]

Warning: If you own a handheld PC like a Steam Deck, adjusting the UMA frame buffer to improve game performance isn’t that unusual. Making adjustments to the BIOS/UEFI does carry some risk, however. Allocating too much memory to the GPU might prevent an application from running at all, or cause instability. You’re probably safe leaving 8GB reserved for main system memory.

If you don’t want to fiddle around in your laptop’s BIOS, you might find that your laptop or tablet ships with the AMD Adrenalin utility, which may let you adjust the graphics memory from within the application itself. I found the setting within the Performance > Tuning tab. If you’re confused about how much memory you’re allocating to graphics and how much is left for your PC, the utility helps make that clear.
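
If you want to sanity-check what Windows reports after a change, a few lines of script will do it. This is a rough, Windows-only sketch that shells out to PowerShell and reads the stock Win32_VideoController WMI class; its AdapterRAM field is a legacy 32-bit value that can under-report large allocations, so treat Task Manager’s GPU tab or Adrenalin itself as the authoritative view.

# Windows-only sanity check of the GPU memory the OS currently reports.
# AdapterRAM is a legacy 32-bit WMI field and may under-report large buffers.
import subprocess

def reported_gpu_memory() -> str:
    cmd = [
        "powershell", "-NoProfile", "-Command",
        "Get-CimInstance Win32_VideoController | "
        "Select-Object -Property Name, AdapterRAM | Format-List",
    ]
    return subprocess.run(cmd, capture_output=True, text=True).stdout

if __name__ == "__main__":
    print(reported_gpu_memory())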

[Image: Mark Hachman / Foundry]

Tested: Does adjusting the UMA frame buffer make a difference?

I’m currently in the process of reviewing the Asus ROG Flow Z13 and its AMD Ryzen AI Max+ chip, but I paused to see what improvements, if any, adjusting the size of the UMA frame buffer could deliver. In general, the biggest gains showed up in games and, especially, in AI.

I didn’t see enormous gains in synthetic graphics benchmarks: In 3DMark’s Steel Nomad test, tweaking the UMA frame buffer boosted scores from 1,643 to 1,825, or about 11 percent. Ditto for PCMark 10, which measures productivity: The needle barely moved.
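
(For reference, the percentages quoted throughout this piece are simple relative uplifts; here’s the arithmetic for those Steel Nomad scores.)

before, after = 1643, 1825
print(f"Steel Nomad uplift: {(after - before) / before:.1%}")  # about 11.1%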

To see how adjusting the UMA frame buffer applied to games, I turned to Cyberpunk 2077 and ran four benchmark scenarios, each at UMA frame-buffer sizes of 4GB, 8GB, and 16GB. The idea was to see what the chip could do on its own, just rasterizing the game, as well as with all of the GPU’s AI enhancements turned on to maximize frame rate:

  • 1080p resolution, Ultra settings, with all enhancements turned off
  • 1080p resolution, Ultra settings, with all enhancements (AMD FSR 3 scaling and frame generation) turned on
  • 1080p resolution, Ray Tracing Overdrive settings, with all enhancements turned off
  • 1080p resolution, Ray Tracing Overdrive settings, with all enhancements turned on

[Chart: Mark Hachman / Foundry]

You can immediately see that the largest improvements in overall frame rate come simply from turning on AMD’s FidelityFX Super Resolution 3 (FSR 3). But adjusting the frame buffer gives you about a 10 percent boost in the Ray Tracing Overdrive mode. More importantly, it bumped up the frame rate at the 1080p Ultra settings, with FSR 3 on, by 17 percent. That’s all from an easy adjustment that costs you nothing.

It’s worth pointing out that the gains from adjusting the UMA frame buffer don’t always scale. They do under the Ray Tracing Overdrive setting, but the same pattern doesn’t play out elsewhere. I’ve found Cyberpunk’s tests to be quite reproducible, though, so I’m inclined to believe it’s not just statistical variance.

I saw the same thing when testing AI, too. I used UL Procyon’s AI Text Generation benchmark, which loads four separate LLMs, or AI chatbots, and asks them a series of questions. UL generates its own scores, based on speed and token output. Here, there was an amazing jump in overall performance in places: 64 percent in Llama 2, just by dialing up the frame buffer! But it was also interesting that the performance increase was “capped” at 16GB. Selecting a 24GB frame buffer actually lowered performance slightly.

[Chart: Mark Hachman / Foundry]

(It might be worth noting that Procyon didn’t think it could run on the Ryzen AI Max, because the system didn’t report enough VRAM. It ran just fine, of course.)

I also checked my work using MLCommons’ own client benchmark, which uses a 7-billion-parameter LLM to generate scores for tokens per second and an initial “time to first token.” I didn’t see as much improvement: just about 10 percent going from the 4GB to the 8GB buffer, and then virtually nothing when I tested with a 16GB buffer instead.
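
Both of those metrics are easy to reproduce with whatever local LLM runtime you have on hand, since they fall straight out of a streaming generation loop. The sketch below is generic: stream_tokens is a hypothetical stand-in for your runtime’s streaming call (llama.cpp, Ollama, LM Studio, and so on), not a real API.

# Measure "time to first token" and "tokens per second" for any token stream.
# `stream_tokens` is a placeholder for a real runtime's streaming generator.
import time
from typing import Iterable

def benchmark_stream(stream_tokens: Iterable[str]) -> dict:
    start = time.perf_counter()
    first_token_at = None
    count = 0
    for _ in stream_tokens:
        if first_token_at is None:
            first_token_at = time.perf_counter()
        count += 1
    end = time.perf_counter()
    return {
        "time_to_first_token_s": (first_token_at or end) - start,
        "tokens_per_second": count / (end - start) if end > start else 0.0,
    }

# Example with a fake generator standing in for a real model:
print(benchmark_stream(f"token{i}" for i in range(256)))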

Integrated graphics matters again

Until now, adjusting the UMA frame buffer was largely irrelevant: Integrated GPUs were simply not powerful enough for that tweak to matter. Things are different now.

Again, if you’re a handheld PC user, you may have seen others adjusting their UMA frame buffer to eke out a bit more performance. But as my colleague Adam Patrick Murray commented, the Asus ROG Flow Z13 tablet, with its Ryzen AI Max+ processor inside, is an odd hybrid of a handheld and a tablet. The chip itself is destined for more than that, of course, given that companies like Framework are putting it in small form-factor PCs.

But lessons learned from the handheld PC space apply well to this new class of AI powerhouse chips. Adjusting the UMA frame buffer on an AMD Ryzen AI Max+ chip is a free, easy tweak that can boost performance dramatically.

via PCWorld https://www.pcworld.com

March 7, 2025 at 07:07AM

Prepare For Discord To Get Way Worse

https://kotaku.com/discord-reddit-ipo-gaming-server-gta-6-enshitification-1851768033

Discord has become a central artery for gaming. It’s the default place for companies to build communities around their games and for the people who play them to hang out with one another, coordinate gaming sessions, or just shoot the shit about whatever’s going on in life. It’s one of the least shitty social media platforms out there right now, but it might not stay that way for long.

The New York Times reports that Discord’s founders are meeting with investment bankers in preparation for a possible IPO as early as this year. While plans can always change, what that means right now is that the meeting place for 200 million monthly users, over 90 percent of whom use it for gaming, is exploring going from a private company to a public one ruled by the founding principle of the stock market: number must go up.

“We understand there is a lot of interest around Discord’s future plans, but we do not comment on rumors or speculation,” a spokesperson for the company told The New York Times. “Our focus remains on delivering the best possible experience for our users and building a strong, sustainable business.”

Discord was released in 2015 by founders Jason Citron and Stanislav Vishnevskiy. After shipping an iPad MOBA called Fates Forever in 2014 that never took off, they pivoted to making a communications platform for online gamers. Online games were exploding at the time, and the tools for communicating in them were pretty bad. Discord filled the void when it came to coordinating raids in Final Fantasy XIV or ganks in Dota 2.

It eventually became a one-stop shop for gaming conversations online, offering an alternative to boring, stilted workplace productivity tools like Skype and Slack, as well as archaic gaming forums and fire-hose social media platforms like Reddit and Twitter. Discord is now fully integrated on Xbox Series X and PlayStation 5, and serves as a better option than Nintendo Switch’s terrible voice chat app and the frustrating hiccups of cross-platform party chat. Powering this takeover is the fact that 1) Discord works and 2) the majority of its features are entirely free.

Going public could change all of that. We’ve seen the playbook before. A tech company woos hundreds of millions of users with a great product that’s practically free, and then, once they’re locked into the platform through history, network effects, or just old habit, squeezes as much financial value as it can out of the thing by making it worse in almost every way possible. If this were A Christmas Carol and Discord were Ebenezer Scrooge, then Facebook would be the Jacob Marley shaking its chains in the chat app’s face.

This is what the author Cory Doctorow has dubbed enshittification. Online products and services get worse over time as the companies behind them are forced to extract more and more value, no matter the long-term costs. Being private doesn’t make a company immune from this, but going public certainly has the potential to be like throwing TNT into a dumpster fire. This shift was already evident in Discord (see its embrace of online advertising just last year) but seems only likely to worsen with a potential IPO payout looming overhead. Look no further than Reddit’s road to going public, paved with generative-AI deals and API price hikes.

The increasing enshittification of Discord is even more worrying. It’s now the de facto town square for many of the biggest games around, from Marvel Rivals to Helldivers 2. Rockstar Games just launched its own Discord channel in preparation for Grand Theft Auto VI, likely to be the biggest game launch ever when it arrives later this year. The price of exiting the platform if features get worse or locked behind a paywall is only going up. And any communities that do leave will have to leave their pasts behind, sealed off from the rest of the internet like the victim in “The Cask of Amontillado.” Discord’s separation from the wider web’s crumbling infrastructure has long been one of its greatest virtues, making it feel like a relative safe haven from SEO spam and algorithmic churn. One day, it could feel more like a tomb.

High on a pandemic-fueled influx of new users, Discord once flirted with growing beyond its gaming roots. There was talk of it supplanting Slack and of Microsoft buying it for a potential $6.5 billion. Then last year it promised it was getting back to basics and doubling down on its original core mission. “We believe Discord can play a uniquely important role in the future of gaming,” Citron wrote at the time. “We’re going to focus on making it easier and more fun for people to talk and hang out before, during, and after playing—and we’ll help developers of those games bring their creativity to life.”

We’ll see if that remains the case.

via Kotaku https://kotaku.com

March 6, 2025 at 10:00AM

AI Could Translate 5,000-Year-Old Language, Saving Time and Historical Insights

https://www.discovermagazine.com/the-sciences/ai-could-translate-5-000-year-old-language-saving-time-and-historical

Tens of thousands of cuneiform tablets are sitting around, just waiting to be translated. It’s not an easy job; the ancient writing system is based on wedge-shaped pictograms and includes more than 1,000 unique characters that vary by era, geography, and individual writer.

But decoding the pictograms could be a culturally and historically significant task. Cuneiform arose about 5,000 years ago in Mesopotamia, in what is now Iraq. It is one of four known pristine scripts, meaning writing systems developed with no known influence from any other. Some translated cuneiform tablets have revealed contents as banal as shipping inventories. Others have been more profound, like the “Epic of Gilgamesh,” the first known written work of literature.

Those translations, done by the relatively few people who know the language, required a lot of labor, and perhaps some guesswork. Decoding such complexity would be the perfect job for artificial intelligence, thought some Cornell University researchers. Working with colleagues at Tel Aviv University, they created a system to do just that, which they describe in a paper to be presented at a conference in April 2025.

AI Deciphers Ancient Tablets

The research team developed a system that overcomes the many obstacles that character variation presents to translation.

“When you go back to the ancient world, there’s a huge variability in the character forms,” Hadar Averbuch-Elor, a Cornell computer science professor who led the research, said in a press release. “Even with the same character, the appearance changes across time, and so it’s a very challenging problem to be able to automatically decipher what the character actually means.”

The computer system reads photographs of clay cuneiform tablets, then computationally overlays each character’s image atop a prototype with similar features whose meaning is already known. Because the system automatically aligns the two images until they digitally click into place, the researchers named it ProtoSnap.
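
The press release doesn’t spell out the researchers’ actual pipeline, but the general idea of “snapping” a known prototype onto a photographed character can be illustrated with off-the-shelf image registration. The sketch below uses OpenCV’s ECC alignment purely as an analogy; it is not the ProtoSnap code, and the file names are placeholders.

# Loose illustration of aligning a prototype glyph onto a photographed one.
# This is generic image registration, not the researchers' ProtoSnap method.
import cv2
import numpy as np

prototype = cv2.imread("prototype_glyph.png", cv2.IMREAD_GRAYSCALE)
photo = cv2.imread("tablet_character.png", cv2.IMREAD_GRAYSCALE)

# Estimate an affine warp that best overlays the prototype onto the photo.
warp = np.eye(2, 3, dtype=np.float32)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 500, 1e-6)
score, warp = cv2.findTransformECC(photo, prototype, warp,
                                   cv2.MOTION_AFFINE, criteria)

# Apply the warp so the prototype "clicks into place" over the photograph.
aligned = cv2.warpAffine(prototype, warp, (photo.shape[1], photo.shape[0]),
                         flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
cv2.imwrite("aligned_prototype.png", aligned)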


What We Can Learn From Ancient Texts

In the paper, the researchers demonstrated that the snapped characters can be used to train the system to recognize similarities between other characters later in the process, what they call downstream. With that training, ProtoSnap performed much better at recognizing cuneiform characters, even rare or highly variable ones, than previous AI efforts.

This advance could help automate the tablet-reading process, saving an enormous amount of time. It could also help scholars better compare writings from different times, cities, and authors. Most importantly, it would dramatically hasten translation, ultimately giving the world access to an abundance of ancient writing.

“The base of our research is the aim to increase the ancient sources available to us by tenfold,” Yoram Cohen, a co-author and archaeology professor at TAU, said in the press release. “This will allow us, for the first time, the manipulation of big data, leading to new measurable insights about ancient societies – their religion, economy, social and legal life.”

Although many translated tablets will likely just show, say, a receipt for a livestock purchase, others could contain fascinating historical accounts — or even another epic poem.


Before joining Discover Magazine, Paul Smaglik spent over 20 years as a science journalist, specializing in U.S. life science policy and global scientific career issues. He began his career in newspapers, but switched to scientific magazines. His work has appeared in publications including Science News, Science, Nature, and Scientific American.

via Discover Main Feed https://ift.tt/JfaxYzd

March 5, 2025 at 03:54PM

When will Toyota build an EV with its simulated manual transmission?

https://www.autoblog.com/news/when-will-toyota-build-an-ev-with-its-simulated-manual-transmission

Toyota has long been known for its innovative approach to automotive technology and its commitment to providing drivers with engaging and reliable vehicles. Yet, when it comes to electric vehicles, the automaker has been more cautious than some of its competitors, focusing heavily on hybrids and hydrogen fuel cells. Although it is famous for its line of trucks and SUVs, the brand has also worked hard to bring more engaging gas-powered vehicles into its lineup, including the GR86, GR Supra, and GR Corolla, all of which have manual transmissions.

[Image: Lexus UX 300e (Toyota)]

In 2023, Toyota developed a simulated manual transmission for electric vehicles that provides a driving experience similar to that of a gas-powered automobile. They even put the revolutionary system in a Lexus UX 300e, and it did exactly what it was intended to do. This kind of system would be a first for production EVs, offering a rare blend of electric efficiency and old-school driving involvement. But when will Toyota actually build an EV with this unique and potentially game-changing feature?

A manual “shifting” transmission for a single-speed electric setup?

Toyota’s simulated manual transmission has a clutch pedal, a six-speed manual shifter, synthetic engine sounds, and even the ability to stall the virtual engine. If you want convenience and comfort, you can also drive it in regular, boring EV mode. The system isn’t just a novelty; it’s an effort to preserve the engaging driving experience that many enthusiasts associate with stick-shift cars.

[Image: Toyota EV shifter (Toyota)]

Traditional EVs are typically single-speed, using the instant torque of electric motors to deliver seamless acceleration without the need for gear changes. While this makes them efficient and smooth, it also removes the tactile, interactive element of shifting gears — something that many driving purists miss.

[Image gallery: Lexus UX 300e]

Toyota’s simulated manual works via software, which allows the car to shake if the driver fails to depress the accelerator enough, shifts into the wrong gear, or mishandles the clutch pedal. The six-speed gear shift sits in an old-school H-gate with microswitches at each gear position, but there is no physical connection to any shift rods, synchros, and so on. The clutch pedal uses a return spring for feel. Neither the gearshift lever nor the clutch is mechanically connected to anything. It’s a brilliant, innovative system designed to make the driving experience as authentic as possible.
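
Toyota hasn’t published how its software models any of this, but a toy example shows the kind of logic such a system implies: derive a virtual engine speed from road speed and the selected gear ratio, then “stall” if the clutch comes up while that value is too low. Everything below, numbers included, is an invented illustration rather than Toyota’s implementation.

# Toy model of a software-simulated manual: virtual RPM follows road speed
# through the chosen ratio, and releasing the clutch at too low an RPM stalls.
# All values are made up for illustration.
GEAR_RATIOS = {1: 13.0, 2: 8.5, 3: 6.0, 4: 4.5, 5: 3.6, 6: 3.0}  # overall ratios
IDLE_RPM, STALL_RPM = 800, 600
WHEEL_CIRCUMFERENCE_M = 2.0

def virtual_rpm(speed_kmh: float, gear: int, clutch_in: bool) -> float:
    """Fake engine speed fed to the sound and haptics layer."""
    if clutch_in:
        return IDLE_RPM  # clutch in: the "engine" just idles
    wheel_rpm = (speed_kmh * 1000 / 60) / WHEEL_CIRCUMFERENCE_M
    return wheel_rpm * GEAR_RATIOS[gear]

def stalls(speed_kmh: float, gear: int, clutch_in: bool) -> bool:
    """Stall if the clutch is out and virtual RPM falls below the threshold."""
    return not clutch_in and virtual_rpm(speed_kmh, gear, clutch_in) < STALL_RPM

print(stalls(speed_kmh=5, gear=3, clutch_in=False))        # True: dumped the clutch
print(virtual_rpm(speed_kmh=100, gear=6, clutch_in=False)) # ~2500 "rpm" at cruise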

What is Toyota’s plan with the revolutionary transmission?

Toyota’s electric vehicle strategy has been deliberate and measured, as evidenced by its single EV offering, the bZ4X. The company has invested heavily in hybrid technology and has been a strong proponent of hydrogen fuel-cell vehicles like the Mirai. That said, Toyota has been criticized for lagging behind in the all-electric space, especially as automakers like Tesla, Hyundai, and Ford push forward with ambitious EV lineups.

[Image: Toyota FT-Se (Toyota)]

In December 2021, Toyota announced plans to release 30 new EV models by 2030, with the aim of selling 3.5 million electric vehicles annually by the end of the decade. The company has since scaled back those plans, and rightly so, given the slowing demand for electric vehicles; Toyota now expects to produce around 1 million EVs by 2026, down from its previous 1.5 million target. As for when the simulated manual transmission will debut, though, Toyota has been tight-lipped.

There was the possibility of integrating this new transmission into future electric sports cars, like a potential EV successor to the Toyota GR86 or Supra, but now it appears that both models’ successors will probably have gas engines with mild hybrid assist. 

[Image gallery: Toyota FT-Se]

Reports from Toyota insiders suggest that the company could launch a prototype featuring the simulated manual transmission by 2026, and the most likely candidate is the sporty, all-wheel-drive FT-Se EV concept introduced in 2023. This timeline aligns with Toyota’s broader EV goals and its push to bring more diverse electric options to market. A driver-centric EV like that would also fit nicely with the company’s increasing focus on sporty vehicles.

As Toyota continues to expand its EV lineup, the prospect of a driver-focused electric sports car with a simulated stick shift is exciting. While we may have to wait a few more years to see it in action, the mere possibility suggests that Toyota is serious about keeping driving fun, even in an electric future.

Final thoughts

A simulated manual could differentiate Toyota’s EVs in an increasingly crowded market. As electric cars become more common, features that enhance driving experience — rather than just efficiency and range — will likely become key selling points.

Toyota will need to determine whether a simulated manual transmission in an EV is a niche offering or something that could attract a wider audience by providing a more connected driving experience. Then there’s the question of how well the final product will be executed. To draw enthusiasts, the system will have to nail its mechanical “feel” and convincingly mimic a real manual gearbox. If it comes across as artificial or gimmicky, it could struggle to win people over.

via Autoblog https://ift.tt/LO1aifm

March 5, 2025 at 04:03PM

This AI Bookmark Might Actually Help You Finish Reading Books

https://gizmodo.com/this-ai-bookmark-might-actually-help-you-finish-reading-books-2000569596

The problem: You don’t read physical books through to completion. The solution? According to the two developers behind it, a bookmark that helps you pick up where you left off. The AI-powered Mark does not yet exist, but it’s suddenly available for pre-order with one objective: to help you remember what you just read.

Mark is a bookmark you place into a book when you’re finished reading to mark your spot. Once you close the book, the gadget does the heavy lifting. It sends a summary of the pages you just read to your smartphone, then concocts a generalized summary based on information already associated with that title. The idea is that when you come back to the book later, your brain will catch up with the breadcrumbs left behind.

For $130, Mark hopes to address those who feel like their attention is constantly “fragmented” and the books they’re reading remain “underutilized.” Specifically, this product is for “Americans who prefer physical books to e-readers and tablets,” or at least that’s what it claims in its manifesto.

“Just like Strava keeps you motivated in fitness, Mark keeps you inspired in reading,” says the Mark Twitter/X account. This refers to the social media element of the bookmark, which lets your friends know you’ve made a dent in your reading once you’ve shut the book. Mark will measure your reading pace and summarize key themes as you progress. A “Mark Wrapped” feature even keeps track of what you read, similar to services like Goodreads.

As I’ve repeatedly mentioned, I’m a millennial, which means I’m of the generation that got through school essays with the help of CliffsNotes. Eventually, those were replaced by Wikipedia summaries. I don’t see Mark offering groundbreaking technology, especially since I’m not sure how it does what it purports to do. While I appreciate the idea of being caught up on what I was reading before I abandoned the story, this whole practice takes the onus off the reader to keep track of what they’re reading.

I get the premise of being an inconsistent reader. I switched to audiobooks because they were easier to pick back up and catch up with the plot, even if weeks passed before I could return to them. But $130 to pick up reading where you last left off is a grotesque amount for something that doesn’t perform other functions. It’s much cheaper to buy a packet of sticky notes and stay engaged with your reading instead of relying on a computer to do the heavy lifting.

You can sign up for the waitlist if you’re morbidly curious about Mark. I did because I wanted to know what the heck was going on here.

via Gizmodo https://gizmodo.com/

February 28, 2025 at 08:27AM

Citigroup Briefly Makes Customer the Richest Person in History With Mistaken $81 Trillion Transfer

https://gizmodo.com/citigroup-briefly-makes-customer-the-richest-person-in-history-with-mistaken-81-trillion-transfer-2000570240

Last year, an employee at Citigroup accidentally initiated a fund transfer to a customer’s account that would have made them the wealthiest person in the history of human existence. The incident, which took place last April, credited a client’s account with a whopping $81 trillion instead of the intended amount (a mere $280), the Financial Times first reported.

Citigroup itself only has a market capitalization of about $150 billion, and the entire U.S. GDP is only worth about $27 trillion. The GDP of the European Union is some $17 trillion. The GDP of China is close to $18 trillion. So, to be clear, the transfer amount would have been more money than the annual economic output of the U.S., the EU, and China combined. It’s not clear where the bank planned to get the money, and, unfortunately, the customer in question did not get to keep the funds (not that they ever existed).

It’s also unclear whether the person who initiated the transfer got to keep their job. In communications with the Federal Reserve and the Office of the Comptroller of the Currency, Citi referred to the incident as a “near miss” which, you know, is probably an understatement. No funds ever left the bank, the FT reports.

Actually, “near misses” seem to happen quite a lot and are a formal category of screwup in the banking industry. The category applies to incidents that don’t have to be reported to regulators, according to FT reporting:

A total of 10 near misses — incidents when a bank processes the wrong amount but is ultimately able to recover the funds — of $1bn or greater occurred at Citi last year, according to an internal report seen by the FT. The figure was down slightly from 13 the previous year. Citi declined to comment on this broader set of events. Near misses do not need to be reported to regulators, meaning there is no comprehensive public data on how often these incidents occur across the sector. Several former regulators and bank risk managers said near misses of greater than $1bn were unusual across the US bank industry.

Ultimately, automated systems at the bank were responsible for halting the impossibly massive transfer, while two human employees initially missed the gargantuan entry. A third employee finally caught wind that something was amiss approximately 90 minutes after the transfer was initiated, the FT writes. “Despite the fact that a payment of this size could not actually have been executed, our detective controls promptly identified the inputting error between two Citi ledger accounts and we reversed the entry,” the company told the New York Times.

The Times notes that Citi has made some massive fuckups before. Some two years ago, an accounting error for a trade inspired a huge selloff of stocks in Europe that ultimately obliterated some $322 billion in value. For having caused such significant economic chaos, Citigroup was fined $79 million.

Gizmodo reached out to Citigroup for comment and will update this story if it responds.

via Gizmodo https://gizmodo.com/

February 28, 2025 at 03:27PM

Aerospace company Firefly released fantastic POV footage of Blue Ghost landing on the Moon

https://www.engadget.com/science/space/aerospace-company-firefly-released-fantastic-pov-footage-of-blue-ghost-landing-on-the-moon-195821368.html?src=rss

We already knew that the aerospace company Firefly successfully maneuvered its Blue Ghost lander onto the surface of the Moon, but now we have some gorgeous video proof. The lander captured footage throughout the touchdown, complete with a cinematic finale. Check it out below.

The POV footage shows the lander descending toward the Moon and the subsequent landing. It ends with a striking view of Blue Ghost emerging from a cloud of dust as its shadow stretches across the lunar surface. It’s pretty darn cool, with surprisingly crisp HD visuals.

The touchdown happened Sunday at 3:30 AM ET, and Blue Ghost made its home in a region known as Mare Crisium. This isn’t the first commercial lander to make its way to the Moon, but it was the first one to land properly. The mission was a joint effort between Firefly and NASA’s Commercial Lunar Payload Services (CLPS) program, an initiative that hopes to pave the way for an increased commercial presence on good ’ole Luna.

Since landing, Blue Ghost has begun its surface operations. These include deploying payloads, sampling local regolith and capturing a bevy of images. The stationary lander will spend around two weeks on the lunar surface as it conducts various tests. It’s packed with ten NASA instruments designed to probe the ground and to test subsurface drilling methods.

via Engadget http://www.engadget.com

March 5, 2025 at 02:03PM