Rocket Lab’s Mars probes reach launch site ahead of 1st flight on Blue Origin New Glenn rocket (photos)

https://www.space.com/rocket-lab-mars-probes-arrive-launch-site-new-glenn

Two Mars-bound smallsats that will fly on the highly anticipated debut of Blue Origin’s New Glenn rocket have arrived at their launch site in Florida.

The satellite duo, known as ESCAPADE ("Escape and Plasma Acceleration and Dynamics Explorers"), are set to launch atop New Glenn no earlier than Oct. 13 from Cape Canaveral Space Force Station in Florida. They’ll arrive in Mars orbit in September 2025, on a NASA mission to study how incoming charged particles from the sun interact with and alter the planet’s magnetic environment. 

The two coordinated robotic explorers could paint a more detailed picture of how Mars’ interaction with the solar wind influences the leaking of the planet’s thin atmosphere, and how its climate evolved over time to lose what scientists think was once a plentiful reserve of liquid water on the surface.

The twin ESCAPADE spacecraft appear side by side at a Rocket Lab facility in Long Beach, California, before being shipped to Florida in August 2024. (Image credit: Rocket Lab)

On Aug. 18, the twin ESCAPADE spacecraft arrived at the Astrotech Space Operations Facility in Titusville, Florida, to prepare for launch. At the facility, which is owned by Lockheed Martin, teams will check the satellites in a dedicated cleanroom, testing their electrical circuits and checking their tanks for leaks, before carrying out final assembly.

The spacecraft will be fueled next month, NASA said in a recent statement.

Related: New Glenn: Blue Origin’s big new reusable rocket

"The successful delivery of the spacecraft marks a significant milestone and the culmination of over three years of dedicated teamwork from individuals across the project," ESCAPADE Principal Investigator Rob Lillis, of the University of California, Berkeley, said in the NASA statement. "Now, we’re thrilled to embark on this first step of our journey to Mars!"


The two small satellites were built by California-based Rocket Lab and UC Berkeley, which is leading the mission and has dubbed the satellites "Blue" and "Gold" after the school’s traditional colors. Each probe weighs under 198 pounds (90 kilograms) and carries three science instruments. The total cost for the mission is less than $80 million, according to the mission website.

The value of the launch contract that NASA signed with Blue Origin is $20 million, SpaceNews reported.

ESCAPADE is among a wave of low-cost, high-risk NASA missions to other planets, which otherwise typically demand over a decade of development and exceed $1 billion in costs. "Sending two spacecraft to Mars for the total cost of $80 million is just unheard of, but current NASA leadership is taking the risk," Lillis said in a previous press release. "Instead of spending $800 million for a 95% chance of success, can we spend $80 million for an 80% chance? This is what NASA is trying to find out with these missions, and we are lucky to be one of the guinea pigs."
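Lillis's cost-versus-risk framing can be made concrete with a quick expected-value sketch (the dollar amounts and success probabilities come straight from his quote; the cost-per-success comparison is a back-of-envelope illustration, not NASA's own analysis):

```python
# Back-of-envelope: expected cost to get one successful mission under
# the two risk postures Lillis describes in his quote.
flagship_cost, flagship_p = 800e6, 0.95   # $800M mission, 95% success odds
smallsat_cost, smallsat_p = 80e6, 0.80    # $80M mission, 80% success odds

# Expected spend per success = mission cost / probability of success
flagship_per_success = flagship_cost / flagship_p   # ≈ $842M
smallsat_per_success = smallsat_cost / smallsat_p   # = $100M

print(f"Flagship: ${flagship_per_success / 1e6:.0f}M per expected success")
print(f"Smallsat: ${smallsat_per_success / 1e6:.0f}M per expected success")
```

By this crude measure, the cheap-and-risky posture comes out well ahead: roughly $100 million per expected success versus about $842 million, which is essentially the bet NASA is testing with missions like ESCAPADE.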

The satellites were initially scheduled to piggyback on the same SpaceX Falcon Heavy rocket that would loft the Psyche asteroid mission in August 2022. They were removed, however, after delays to the Psyche mission resulted in a new trajectory that didn’t support dropping ESCAPADE off at Mars. (Psyche ended up launching successfully in October 2023.) The satellite duo then became the primary payload onboard Blue Origin’s first orbital rocket, New Glenn, which is a two-stage heavy-lift rocket named after NASA astronaut John Glenn, who in 1962 became the first American to circle Earth.

New Glenn is 322 feet (98 meters) tall, roughly the height of a 30-story building, and is capable of launching roughly 45 metric tons into low Earth orbit. Its maiden flight will come after more than a decade of development by Blue Origin, the company founded by Amazon's Jeff Bezos, and more than three years later than the company had hoped.

Once the ESCAPADE satellites reach Mars in September 2025, "the mission team will need several months to configure their orbits for science observations," NASA said in the recent news release. Their orbits will be adjusted over several months such that by early 2026 they’ll follow each other in a "string of pearls" formation, which will allow them to gather data on Mars’ rapidly changing response to the solar wind.

They will later break into different orbits such that they can observe both the solar wind and Mars’ upper atmosphere in real time, according to the space agency. The science mission is designed to last 11 months in Mars orbit, until March 2027. 

Meanwhile, the New Glenn first stage is expected to return to Earth shortly after liftoff, demonstrating its reusability. Blue Origin has said that the first stage will operate like a commercial airliner but with cleaner fuel, leading to less waste and lower launch costs.

In recent weeks, portions of other New Glenn rockets scheduled for future flights have suffered damage, Bloomberg reported on Aug. 21. In one incident, the top of Blue Origin’s second New Glenn rocket crumpled "like a crushed Coke can," partly due to an error by factory workers who had moved the section into a chilled storage hangar but didn’t monitor the hardware afterward. In the second incident, a part of the third New Glenn rocket failed during stress testing and exploded inside a building.

No injuries were reported in either incident, according to Bloomberg, which first broke the news about the recent failures. The issues haven’t affected New Glenn’s planned debut launch in October, a Blue Origin spokesperson told GeekWire’s Alan Boyle.

via Space https://www.space.com

August 31, 2024 at 12:18PM

Chefs are using fungus to transform food garbage into fancy, fully edible dishes

https://www.popsci.com/science/food-waste-fungus/

Science is uncovering more novel ways to transform those smelly, spent banana peels decaying in your garbage into something people might actually want to eat. Around a third of the world’s overall food supply is wasted or lost every year, which adds up to around $1 trillion worth of food annually, according to the UN’s World Food Program. All of that waste may be responsible for as much as three billion tons of greenhouse gases emitted into the atmosphere. But new research into Neurospora intermedia, a fungus at the heart of a classic fermented Indonesian dish, may offer a partial solution to this growing agricultural dilemma.

The study, published last week in Nature Microbiology, reveals Neurospora may possess a unique ability to break down indigestible plant material found in common food waste and turn that detritus into edible and surprisingly tasty new dishes. Now, multiple Michelin-star chefs are using those findings to reimagine normally discarded food scraps as totally new fine-dining dishes. The ultimate goal: use the fungus’ transformative properties to simultaneously reduce food waste and create tasty treats.

“Our food system is very inefficient. A third or so of all food that’s produced in the U.S. alone is wasted, and it isn’t just eggshells in your trash,” former chef and research lead author Vayu Hill-Maini said in a statement. “It’s on an industrial scale. What happens to all the grain that was involved in the brewing process, all the oats that didn’t make it into the oat milk, the soybeans that didn’t make it into the soy milk? It’s thrown out.”

A fungus with transformative properties 

Hill-Maini began his research with an exploration of the Indonesian dish oncom, a fermented product traditionally derived from the soy leftovers of tofu-making. Hill-Maini performed a metagenomic survey of oncom samples to determine what exactly was responsible for turning the soy waste into an edible product. The results revealed that Neurospora, in the form of a mold, was dominant. A closer look at the fungus’ underlying genetic structure revealed something even more interesting: multiple enzymes capable of breaking down indigestible plant material like pectin and cellulose and transforming it into material digestible by humans. The entire transformation takes around 36 hours.

Further analyses showed this process works not just for soy byproducts, but for around 30 other types of food waste, from almond shells and banana peels to stale rice bread. In each case, the fermentation process rapidly reimagined these waste products into seemingly new foods with unique, often unexpected flavor profiles. Even more surprising, analyses of the Neurospora from the oncom sample and Neurospora found in the wild revealed striking fundamental differences. Indonesian cooks, the study suggests, may have unintentionally created a distinct, domesticated strain of the fungus.

“The fungus readily eats those things and in doing so makes this food and also more of itself, which increases the protein content,” Hill-Maini said. “So you actually have a transformation in the nutritional value. You see a change in the flavor profile. Some of the off-flavors that are associated with soybeans disappear.”

Michelin star chefs use Neurospora to create entirely new dishes 

Armed with that chemical knowledge, Hill-Maini decided to partner with prominent chefs in the US and Europe to explore whether the fungus could be adapted for a more Western palate. To start, Hill-Maini partnered with Rasmus Munk, head chef and co-owner of the Copenhagen-based two-Michelin-star restaurant Alchemist, to conduct a baseline taste test. They presented the traditional red oncom dish to 60 testers who had never tasted it before. The tasters consistently rated it a six out of nine for tastiness. Overall, they said the oncom had an earthy, nutty, and, perhaps unsurprisingly, mushroomy taste, while still being relatively subtle.

“Its flavor is not polarizing and intense like blue cheese,” Hill-Maini said. “It’s a milder, savory kind of umami earthiness. Different substrates impart their own flavors, however, including fruity notes when grown on rice hulls or apple pomace.”

Munk and Andrew Luzmore, chef in charge of special projects at Blue Hill, a New York-based Michelin-star restaurant, set out to create entirely new dishes based around Neurospora’s transformative properties. At Blue Hill, Luzmore experimented by placing the fungal mold on top of several-day-old bread. Once the 36-hour fermentation process was complete, Luzmore fried up the concoction. The result looked and tasted remarkably like a toasted cheese sandwich, even though no cheese or dairy was involved.

“It’s incredibly delicious,” Luzmore said in a statement. “It looks and tastes like you grated cheddar onto bread and toasted it. It’s a very clear window into what can be done with this.”

Munk, by contrast, opted to create a fungus-inspired dessert. For his dish, the chef incorporated the fungus into a colorless, bland rice custard. After a 60-hour fermentation, the once-bland custard came out sweet, with a surprisingly fruity, pineapple-like flavor profile. Munk is adding the oddball sweet to his restaurant’s menu, served alongside a gelled plum wine and lime syrup.

“We experienced that the process changed the aromas and flavors in quite a dramatic way—adding sweet, fruity aromas,” Munk said in a statement. “I found it mind-blowing to suddenly discover flavors like banana and pickled fruit without adding anything besides the fungi itself.” 

Hill-Maini, the researcher, says these culinary experiments were important to ensure the study’s findings “don’t just stay in the lab.” That said, his ambitions are much larger than the rarefied confines of fine dining. In theory, Hill-Maini believes this naturally occurring fermentation process could be applied on an industrial scale to simultaneously reduce common food waste and introduce new, nutritionally rich protein substitutes into the marketplace. The fact that they seem to taste decent is a bonus. More fundamentally, Hill-Maini says, quite ambitiously, that these revelations, rooted in centuries of traditional Indonesian cooking, could help reimagine what we actually consider food “waste.”

“My long term vision is to look at these large waste things that come out of the food system and look at them as an opportunity, as an ingredient in and of itself,” Hill-Maini said. “Why does it have to be called waste?”

The post Chefs are using fungus to transform food garbage into fancy, fully edible dishes appeared first on Popular Science.

via Popular Science – New Technology, Science News, The Future Now https://www.popsci.com

September 2, 2024 at 09:00AM

Judge Rules $400 Million Algorithmic System Illegally Denied Thousands of People’s Medicaid Benefits

https://gizmodo.com/judge-rules-400-million-algorithmic-system-illegally-denied-thousands-of-peoples-medicaid-benefits-2000492529

Thousands of Tennesseans were illegally denied Medicaid and other benefits due to programming and data errors in an algorithmic system the state uses to determine eligibility for low-income residents and people with disabilities, a U.S. District Court judge ruled this week.

The TennCare Connect system—built by Deloitte and other contractors for more than $400 million—is supposed to analyze income and health information to automatically determine eligibility for benefits program applicants. But in practice, the system often doesn’t load the appropriate data, assigns beneficiaries to the wrong households, and makes incorrect eligibility determinations, according to the decision from Middle District of Tennessee Judge Waverly Crenshaw Jr.

“When an enrollee is entitled to state-administered Medicaid, it should not require luck, perseverance, and zealous lawyering for him or her to receive that healthcare coverage,” Crenshaw wrote in his opinion.

The decision was a result of a class action lawsuit filed in 2020 on behalf of 35 adults and children who were denied benefits.

“This is a tremendous win for the plaintiffs and all TennCare members who have lost their vital health coverage due to TennCare’s unlawful policies and practices,” said Michele Johnson, executive director of the Tennessee Justice Center, which was one of several organizations that represented the plaintiffs. “We are proud to have stood with the courageous families that brought the case in order to protect the health coverage of many thousands of their neighbors across the state.”

The TennCare Connect system, which launched in 2019, was the result of a years-long effort by the state to modernize its Medicaid system and adhere to new eligibility criteria and streamlined enrollment requirements mandated by the Affordable Care Act. Under the new rules, states were supposed to provide a single application process that would collect residents’ information and determine which of the many complex health and disability benefits programs they were eligible for. Crenshaw found that TennCare Connect did not consider whether applicants were eligible for all available programs before it terminated their coverage.

Deloitte was a major beneficiary of the nationwide modernization effort, winning contracts to build automated eligibility systems in more than 20 states, including Tennessee and Texas. Advocacy groups have asked the Federal Trade Commission to investigate Deloitte’s practices in Texas, where they say thousands of residents are similarly being inappropriately denied life-saving benefits by the company’s faulty systems.

via Gizmodo https://gizmodo.com/

August 29, 2024 at 10:43AM

Critics Slam Amazon’s ‘Water Positive’ Pledge as Data Centers Strain Local Resources

https://gizmodo.com/critics-slam-amazons-water-positive-pledge-as-data-centers-strain-local-resources-2000493389

This story was originally published by Grist. Sign up for Grist’s weekly newsletter here.

Earlier this year, the e-commerce corporation Amazon secured approval to open two new data centers in Santiago, Chile. Data facilities like these guzzle massive amounts of electricity and water to power cloud computing services and online programs. The $400 million venture is the company’s first in Latin America, and it lands in one of the most water-stressed countries in the world, where residents have protested against the industry’s expansion.

This week, the tech giant made a separate but related announcement. It plans to invest in water conservation along the Maipo River, which is the primary source of water for the Santiago region. Amazon will partner with a water technology startup to help farmers along the river install drip irrigation systems on 165 acres of farmland. The plan is poised to conserve enough water to supply around 300 homes per year, and it’s part of Amazon’s campaign to make its cloud computing operations “water positive” by 2030, meaning the company’s web services division will conserve or replenish more water than it uses up.

The reasoning behind this water initiative is clear: Data centers require large amounts of water to cool their servers, and Amazon plans to spend $100 billion to build more of them over the next decade as part of a big bet on its Amazon Web Services cloud-computing platform. Other tech companies such as Microsoft and Meta, which are also investing in data centers to sustain the artificial-intelligence boom, have made similar water pledges amid a growing controversy about the sector’s thirst for water and power.

Amazon claims that its data centers are already among the most water-efficient in the industry, and it plans to roll out more conservation projects to mitigate its thirst. However, just like corporate pledges to reach “net-zero” emissions, these water pledges are more complex than they seem at first glance. While the company has indeed taken steps to cut water usage at its facilities, its calculations don’t account for the massive water needs of the power plants that keep the lights on at those very same facilities. Without a larger commitment to mitigating Amazon’s underlying stress on electricity grids, conservation efforts by the company and its fellow tech giants will only tackle part of the problem, according to experts who spoke to Grist.

The powerful servers in large data centers run hot as they process unprecedented amounts of information, and keeping them from overheating requires both water and electricity. Rather than try to keep these rooms cool with traditional air-conditioning units, many companies use water as a coolant, running it past the servers to chill them out. The centers also need huge amounts of electricity to run all their servers: They already account for around 3 percent of U.S. power demand, a number that could more than double by 2030. On top of that, the coal, gas, and nuclear power plants that produce that electricity themselves consume even larger quantities of water to stay cool.

Will Hewes, who leads water sustainability for Amazon Web Services, told Grist that the company uses water in its data centers in order to save on energy-intensive air conditioning units, thus reducing its reliance on fossil fuels.

“Using water for cooling in most places really reduces the amount of energy that we use, and so it helps us meet other sustainability goals,” he said. “We could always decide to not use water for cooling, but we want to, a lot, because of those energy and efficiency benefits.”

In order to save on energy costs, the company’s data centers have to evaporate millions of gallons of water per year. It’s hard to say for sure how much water the data center industry consumes, but the ballpark estimates are substantial. One 2021 study found that U.S. data centers consumed around 415,000 acre-feet of water in 2018, even before the artificial-intelligence boom. That’s enough to supply around a million average homes annually, or about as much as California’s Imperial Valley takes from the Colorado River each year to grow winter vegetables. Another study found that data centers operated by Microsoft, Google, and Meta withdrew twice as much water from rivers and aquifers as the entire country of Denmark.
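The study's ballpark figures can be sanity-checked with standard unit conversions (the per-household usage below is an assumed ballpark of roughly 300 gallons per day, a common U.S. estimate, not a number from the study):

```python
# Sanity check on the 2021 study's figure: 415,000 acre-feet of water
# consumed by U.S. data centers in 2018.
GALLONS_PER_ACRE_FOOT = 325_851          # standard U.S. conversion factor
annual_gallons = 415_000 * GALLONS_PER_ACRE_FOOT   # ≈ 1.35e11 gallons

# Assume an average U.S. household uses ~300 gallons per day.
gallons_per_home_per_year = 300 * 365    # 109,500 gallons
homes_supplied = annual_gallons / gallons_per_home_per_year

print(f"Roughly {homes_supplied / 1e6:.1f} million homes' worth of water")
```

Under that assumption the total works out to about 1.2 million homes' annual supply, consistent with the "around a million average homes" comparison above.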

It’s almost certain that this number has ballooned even higher in recent years as companies have built more centers to keep up with the artificial-intelligence boom, since AI programs such as ChatGPT require massive amounts of server real estate. Tech companies have built hundreds of new data centers in the last few years alone, and they are planning hundreds more. One recent estimate found that ChatGPT requires an average-sized bottle of water for every 10 to 50 chat responses it provides. The on-site water consumption at any one of these companies’ data centers could now rival that of a major beverage company such as PepsiCo.

Amazon doesn’t provide statistics on its absolute water consumption; Hewes told Grist the company is “focused on efficiency.” However, the tech giant’s water usage is likely lower than some of its competitors — in part because the company has built most of its data centers with so-called evaporative cooling systems, which require far less water than other cooling technologies and only turn on when temperatures get too high. The company pegs its water usage at around 10 percent of the industry average, and in temperate locations such as Sweden, it doesn’t use any water to cool down data centers except during peak summer temperatures.

Companies can reduce the environmental impact of their AI business by building them in temperate regions that have plenty of water, but they must balance those efficiency concerns with concerns about land and electricity costs, as well as the need to be close to major customers. Recent studies have found that data center water consumption in the U.S. is “skewed toward water stressed subbasins” in places like the Southwest, but Amazon has clustered much of its business farther east, especially in Virginia, which boasts cheap power and financial incentives for tech firms.

“A lot of the locations are driven by customer needs, but also by [prices for] real estate and power,” said Hewes. “Some big portions of our data center footprint are in places that aren’t super hot, that aren’t in super water stressed regions. Virginia, Ohio — they get hot in the summer, but then there are big chunks of the year where we don’t need to use water for cooling.” Even so, the company’s expansion in Virginia is already causing concerns over water availability.

To mitigate its impacts in such basins, the company also funds dozens of conservation and recharge projects like the one in Chile. It donates recycled water from its data centers to farmers, who use it to irrigate their crops, and it has helped restore the rivers that supply water-stressed cities such as Cape Town, South Africa; in northern Virginia, it has helped plant cover crops on farmland to reduce runoff pollution in local waterways. The company treats these projects the way other companies treat carbon offsets, counting each gallon recharged against a gallon it consumes at its data centers. Amazon said in its most recent sustainability report that it is 41 percent of the way to meeting its goal of being “water positive.” In other words, it has funded projects that recharge or conserve a little over 4 gallons of water for every 10 gallons of water it uses.

But despite all this, the company’s water stewardship goal doesn’t include the water consumed by the power plants that supply its data centers. This consumption can be as much as three to 10 times as large as the on-site water consumption at a data center, according to Shaolei Ren, a professor of engineering at the University of California, Riverside, who studies data center water usage. As an example, Ren pointed to an Amazon data center in Pennsylvania that relies on a nuclear power plant less than a mile away. That data center uses around 20 percent of the power plant’s capacity.

“They say they’re using very little water, but there’s a big water evaporation happening just nearby, and that’s for powering their data center,” he said.

Companies like Amazon can reduce this secondary water usage by relying on renewable energy sources, which don’t require anywhere near as much water as traditional power plants. Hewes says the company has been trying to “manage down” both water and energy needs through a separate goal of operating on 100 percent renewable energy, but Ren points out that the company’s data centers need round-the-clock power, which means intermittently available renewables like solar and wind farms can only go so far.

Amazon isn’t the only company dealing with this problem. CyrusOne, another major data center firm, revealed in its sustainability report earlier this year that it used more than eight times as much water to source power as it did on-site at its data centers.

“As long as we are reliant on grid electricity that includes thermoelectric sources to power our facilities, we are indirectly responsible for the consumption of large amounts of water in the production of that electricity,” the report said.

As for replenishment projects like the one in Chile, they too will only go part of the way toward reducing the impact of the data center explosion. Even if Amazon’s cloud operations are “water positive” on a global scale, with projects in many of the same basins where it owns data centers, that doesn’t mean it won’t still compromise water access in specific watersheds. The company’s data centers and their power plants may still withdraw more water than the company replenishes in a given area, and replenishment projects in other aquifers around the world won’t address the physical consequences of that specific overdraft.

“If they are able to capture some of the growing water and clean it and return to the community, that’s better than nothing, but I think it’s not really reducing the actual consumption,” Ren said. “It masks out a lot of real problems, because water is a really regional issue.”

Correction: This story has been corrected to clarify that Amazon’s “water positive” pledge applies only to its web services division.

This article originally appeared in Grist. Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org.

via Gizmodo https://gizmodo.com/

September 1, 2024 at 08:06AM

NASA’s Solar Sail Mission Is Finally Flying After Deployment Glitch

https://gizmodo.com/nasas-solar-sail-mission-is-finally-flying-after-deployment-glitch-2000493203

Despite a failed first attempt, NASA deployed its pioneering solar sail system, which will harness energy from the Sun to propel itself forward through space.

The Advanced Composite Solar Sail System is now fully deployed after NASA succeeded in extending the mission’s experimental booms on Thursday at 1:33 p.m. ET, the space agency announced. NASA teams will begin testing the new form of space travel, initiating different maneuvers to see how well the sail fares in orbit.

NASA’s solar sail mission launched in April to test new materials and deployable structures for a propulsion system that runs on photons from the Sun. A few months after its launch, the mission’s sail became stuck when an onboard power monitor detected higher-than-expected motor currents, pausing the unfurling process.

The mission teams were successful on their second attempt to deploy the solar sail, fully unfurling it to stretch across 860 square feet (80 square meters), or about as large as half a tennis court. The sail needs to be large enough to generate sufficient thrust, while also being at a high enough orbit to gain altitude and overcome atmospheric drag using the subtle force of sunlight on the sail. NASA’s solar sail orbits Earth at approximately twice the altitude of the International Space Station.

With the solar sail fully deployed, it may be visible to observers from Earth. Using four cameras on board the spacecraft, NASA captured panoramic views of the unfurling process, which will be available on September 4.

Over the next few weeks, NASA engineers will test the maneuvering capabilities of the spacecraft, raising and lowering its orbit using only the pressure of sunlight acting on the sail. The mission’s initial flight phase is designed to last for two months. “Raising and lowering the orbit of the Advanced Composite Solar Sail System spacecraft will provide valuable information that may help guide future concepts of operations and designs for solar sail-equipped science and exploration missions,” NASA wrote in its update.

NASA’s solar sail mission is meant to test new materials and deployable structures for the experimental propulsion systems, including new composite booms that are used to unfurl the sail. The composite booms are made from a polymer material; they’re lightweight while still being stiff and resistant to bending and warping when exposed to different temperatures. They work the same way as a sailboat’s boom, except they are designed to catch the propulsive power of sunlight rather than wind.

NASA is hoping this new form of low-cost space travel can grant it more access to different destinations across the solar system, although solar sails are limited by the durability of the materials and spacecraft electronic systems.

Solar sails harness the momentum of light from the Sun, using it to propel spacecraft forward. As photons bounce off the spacecraft’s sails, they impart tiny pushes of momentum that propel it farther from the star. We’ll be watching closely as this tiny cubesat, and this interesting new concept, takes its important first baby steps in space.
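That photon momentum transfer can be quantified with the textbook radiation-pressure formula for a perfectly reflective sail, F = 2IA/c (an idealized sketch using the mission's published sail area; NASA has not released this exact thrust figure):

```python
# Idealized thrust on a perfectly reflective solar sail at Earth's
# distance from the Sun. The factor of 2 arises because reflected
# photons transfer twice their incoming momentum.
I = 1361.0         # solar irradiance at 1 AU, W/m^2
A = 80.0           # deployed sail area, m^2 (ACS3's ~860 sq ft)
c = 299_792_458.0  # speed of light, m/s

thrust = 2 * I * A / c
print(f"Thrust: {thrust * 1000:.2f} mN")  # ≈ 0.73 mN
```

Less than a millinewton sounds negligible, but unlike a chemical engine the push is continuous and propellant-free, which is why the spacecraft needs a high enough orbit that this tiny force can outpace atmospheric drag.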


via Gizmodo https://gizmodo.com/

August 30, 2024 at 01:36PM