XB-1 took off from the runway at Mojave Air & Space Port near Barstow, California at about 11:21 AM EST. From there, Boom Supersonic’s Chief Test Pilot Tristan “Geppetto” Brandenburg ascended in the experimental plane to an altitude of 34,000 ft before turning left and beginning the supersonic test run. After successfully achieving Mach 1.1 at 11:32 AM EST, Brandenburg began XB-1’s deceleration and descent. At one point during the descent, however, XB-1 briefly broke the sound barrier once again.
“Alright, knock it off, knock it off,” someone in Boom Supersonic’s flight control room could be heard joking during the livestream.
XB-1 surpassed Mach 1 yet again a few minutes later before landing at 11:54 AM EST after a total flight time of 33.49 minutes. The airspace in which Boom Supersonic completed its test holds historic significance: known as the Bell X-1 Supersonic Corridor, the area is named after the first plane to break the sound barrier, in 1947.
Tuesday’s success comes less than a year after the demonstrator aircraft’s debut flight on March 22, 2024. The XB-1 conducted another 10 flights prior to today’s Mach 1 breakthrough. Its most recent flight took place on January 10, when Brandenburg topped out at Mach 0.95 at an altitude of 29,481 ft (575 knots true airspeed, or roughly 661 mph). Today’s success officially makes Boom Supersonic’s XB-1 the first civil aircraft to ever go supersonic over the continental US.
XB-1 was accompanied by two ‘chase planes’
At almost 63 feet long, the XB-1 is about one-third the size of Overture, Boom Supersonic’s proposed commercial jet. Overture is intended to seat 64 to 80 passengers and complete international trips at speeds as fast as Mach 1.7. That’s around twice the speed of today’s subsonic jets, but slightly slower than the Concorde.
The path to Overture’s commercial debut has faced multiple delays over the years. XB-1’s first flight was originally scheduled for 2021, but was pushed back repeatedly to address various engineering and design concerns. Such issues are common in the aircraft industry, which means Overture’s proposed 2029 release date will likely slip at least a couple of times before a working commercial supersonic plane takes to the skies.
“Historically, the human race has always wanted to go faster,” livestream co-host and former Chief Concorde Pilot Mike Bannister said shortly after XB-1’s pair of supersonic achievements.
A research team from DGIST’s Division of Energy & Environmental Technology, led by Principal Researcher Kim Jae-hyun, has developed a lithium metal battery using a “triple-layer solid polymer electrolyte” that offers greatly enhanced fire safety and an extended lifespan. This research holds promise for diverse applications, including electric vehicles and large-scale energy storage systems.
Spacecraft powered by electric propulsion could soon be better protected against their own exhaust, thanks to new supercomputer simulations.
Electric propulsion is a more efficient alternative to traditional chemical rockets, and it’s being increasingly used on space missions, starting off with prototypes on NASA’s Deep Space 1 and the European Space Agency’s SMART-1 in 1998 and 2003, respectively, and subsequently finding use on flagship science missions such as NASA’s Dawn and Psyche missions to the asteroid belt. There are even plans to use electric propulsion on NASA’s Lunar Gateway space station.
The idea behind electric propulsion is that an electric current ionizes (i.e. removes an electron from) atoms of a neutral gas, such as xenon or krypton, stored on board a spacecraft. The ionization process produces a cloud of ions and electrons. An electric field, sustained with the help of a principle called the Hall effect, then accelerates the ions and channels them, along with electrons, into a characteristically blue plume that emerges from the spacecraft at over 37,000 mph (60,000 kph). Hence an electric propulsion system is also referred to as an ion engine.
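To put a rough number on that exhaust speed, here is a back-of-the-envelope sketch: an ion of charge q accelerated through a potential V leaves at v = sqrt(2qV/m). The 200-volt accelerating potential below is an assumed, typical value chosen for illustration, not a figure for any specific thruster.

```python
import math

Q = 1.602e-19        # elementary charge (C)
M_XENON = 2.18e-25   # mass of a singly charged xenon ion (kg)
V_ACCEL = 200.0      # assumed accelerating potential (V); real thrusters vary

# Energy gained falling through the potential becomes kinetic energy:
# q*V = (1/2)*m*v^2  =>  v = sqrt(2*q*V/m)
v = math.sqrt(2 * Q * V_ACCEL / M_XENON)
print(f"exhaust velocity: {v / 1000:.1f} km/s (~{v * 2.23694:,.0f} mph)")
# ~17.1 km/s, or about 38,000 mph, consistent with the figure quoted above
```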
According to Sir Isaac Newton’s third law of motion, every action has an equal and opposite reaction. The plume of ions jetting out from the spacecraft therefore acts to provide thrust. It takes a while to build up momentum, however, because, despite moving at high velocity, the ion plume is pretty sparse. The impulse generated is not as immediately forceful as a chemical rocket’s, but ion engines require less fuel and therefore less mass, which reduces launch costs, and they don’t use up their fuel as quickly as chemical rockets do.
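A minimal sketch of that trade-off, applying Newton’s third law directly: thrust is just the propellant mass flow rate times the exhaust velocity. The flow rate and spacecraft mass below are invented, illustrative numbers, not specifications of any real mission.

```python
# Thrust from momentum exchange: F = mdot * v_e
mdot = 5e-6       # assumed propellant flow rate: 5 mg of xenon per second
v_e = 17_000.0    # exhaust velocity (m/s), from the estimate above

thrust = mdot * v_e
print(f"thrust: {thrust * 1000:.0f} mN")   # ~85 mN, the weight of a few coins

# Time to change a 1,000 kg spacecraft's velocity by 1 km/s at that thrust
# (ignoring the small propellant mass expended along the way):
m_craft, dv = 1000.0, 1000.0
t = m_craft * dv / thrust
print(f"time for 1 km/s of delta-v: {t / 86400:.0f} days")   # ~136 days
```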
An Advanced Electric Propulsion System undergoing tests at NASA’s Glenn Research Center. (Image credit: NASA/Jef Janis)
The energy for the electromagnetic fields is often provided by solar arrays, and hence the technology is sometimes referred to as solar electric propulsion. But for missions farther from the sun, where the sunlight is fainter, nuclear power in the form of radioisotope thermoelectric generators (RTGs) can also be used to drive the electric propulsion.
Though electric propulsion is now maturing and is being used in a variety of missions, it’s not a perfect technology. One problem in particular is that the ion plume can damage a spacecraft. Although the plume is pointed away from the probe, electrons in the plume can find themselves redirected, moving against the plume’s direction of travel and impacting the spacecraft, damaging solar arrays, communication antennas and any other exposed components. Suffice to say, this isn’t good for the probe.
"For missions that could last years, [electric propulsion] thrusters must operate smoothly and consistently over long periods of time," Chen Cui of the University of Virginia School of Engineering and Applied Science said in a statement.
Before solutions can be put in place to protect a spacecraft from these backscattered electrons, their behavior in an ion-engine plume must first be understood, which is where Cui and Joseph Wang of the University of Southern California come in. They’ve performed supercomputer simulations of an ion engine’s exhaust, modeling the thermodynamic behavior of the electrons and how they affect the overall characteristics of the plume.
"These particles may be small, but their movement and energy play an important role in determining the macroscopic dynamics of the plume emitted from the electric propulsion thruster," said Cui.
What Cui and Wang found was that the electrons in the plume behave differently depending upon their temperature and their velocity.
"The electrons are a lot like marbles packed into a tube," said Cui. "Inside the beam, the electrons are hot and move fast. Their temperature doesn’t change much if you go along the beam direction. However, if the ‘marbles’ roll out from the middle of the tube, they start to cool down. This cooling happens more in a certain direction, the direction perpendicular to the beam’s direction."
In other words, the electrons in the core of the beam that are moving fastest have a more or less constant temperature, but those on the outside cool off faster, slow down and move out of the beam, potentially being back-scattered and impacting the spacecraft.
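In plasma terms, the “temperature” in each direction is just the spread of electron velocities along that axis: k_B·T equals the electron mass times the velocity variance. A toy illustration of that diagnostic, using invented velocity spreads rather than the study’s actual data:

```python
import numpy as np

KB = 1.380649e-23   # Boltzmann constant (J/K)
ME = 9.109e-31      # electron mass (kg)

rng = np.random.default_rng(seed=0)
n = 100_000
# Invented spreads: electrons hotter along the beam than across it.
v_par = rng.normal(0.0, 1.0e6, n)    # velocities along the beam (m/s)
v_perp = rng.normal(0.0, 6.0e5, n)   # velocities across the beam (m/s)

# Kinetic temperature per direction: k_B * T = m * var(v)
t_par = ME * v_par.var() / KB
t_perp = ME * v_perp.var() / KB
print(f"T_parallel ~ {t_par:,.0f} K, T_perpendicular ~ {t_perp:,.0f} K")
# An anisotropy like this (T_parallel > T_perpendicular) mirrors the cooling
# pattern Cui describes: stronger cooling perpendicular to the beam.
```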
Now that scientists better understand the behavior of the electrons in the ion plume, they can incorporate this into designs for future electric propulsion engines, looking for ways to limit the back-scatter, or perhaps confine the electrons more to the core of the beam. Ultimately, this could help missions powered by electric propulsion to fly farther and for longer, pushed by the gentle blue breeze of their ion plumes.
OpenAI is reportedly preparing for the launch of Operator sometime this week. Operator is the name of its computer-use agent, which can complete tasks in a user’s web browser on their behalf. Other companies, including Google and Anthropic, have been developing similar “agents” in hopes they will be the next major leap toward AI fulfilling its promise of performing tasks currently done by humans.
According to The Information, which first reported on the impending launch, Operator will provide users with suggested prompts in categories like travel, dining, and events. Users could, for instance, ask Operator to find a good flight from New York to Maui that would not have them landing too late in the evening. Operator will not complete a transaction; the user will remain in the loop and finish the checkout process themselves.
It is easy to imagine ways Operator could be useful. Aging individuals who are not computer savvy could ask Operator to help them send an email, and watch it navigate to Gmail and open a compose window for them. Tech-savvy people do not need this type of help, but older generations often struggle to navigate the web, and completing even simple tasks can be a challenge. Bots could help in other areas as well, such as quality-assurance testing, where companies need to verify that their new websites or services work properly.
So-called “computer use agents” do come with potential risks. We have already seen a startup introduce a web-navigating bot to automate the process of posting marketing spam to Reddit. Bots that take control of the end-user client are able to bypass API limitations meant to block automation. AI startups will need to take some measures to combat abuse, or else websites will become even more flooded with spam than they are today.
Agents like Operator essentially work by taking screenshots of a user’s browser and sending the images back to OpenAI for analysis. Once its models determine the next step necessary to complete a task, a command is sent back to the browser to move and click the mouse on the appropriate target, or to type into an input box. The approach takes advantage of multimodal technology OpenAI and others have been developing, which can interpret multiple forms of input, in this case text and imagery.
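In outline, that is a simple observe-decide-act loop: capture the screen, ask a multimodal model for the next action, perform it, and repeat. The sketch below is a hypothetical skeleton of such a loop; every function name and the action format are invented for illustration and are not OpenAI’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Action:
    kind: str        # "click", "type", or "done"
    x: int = 0       # screen coordinates for click actions
    y: int = 0
    text: str = ""   # text payload for type actions

def screenshot_browser() -> bytes:
    """Capture the user's browser window as an image (placeholder)."""
    raise NotImplementedError  # would use a browser-automation library

def ask_model(task: str, image: bytes) -> Action:
    """Send the task plus a screenshot to a multimodal model and parse
    its reply into the next action (placeholder)."""
    raise NotImplementedError  # would call a vision-capable model API

def execute(action: Action) -> None:
    """Move/click the mouse or type into the focused element (placeholder)."""
    raise NotImplementedError  # would drive OS or browser input

def run_agent(task: str, max_steps: int = 50) -> None:
    # The screenshot -> model -> action cycle described above.
    for _ in range(max_steps):
        action = ask_model(task, screenshot_browser())
        if action.kind == "done":   # the model judges the task complete
            return
        execute(action)             # act, then observe the result next loop
```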
The entire promise of the recent crop of AI startups is that they will be able to create an artificial general intelligence (AGI) that can replace humans on most tasks they perform today and make everyone’s lives more efficient. As exponential gains in the performance of language models have slowed, these companies have been looking for new unlocks that will get them there, and computer-use agents are one. An artificial intelligence cannot truly replace humans until it can actually complete tasks for them; writing is just part of a task. Bots also need to be able to navigate spreadsheets, watch videos, and more.
After Anthropic released an initial preview of its computer-use bot, early testers complained it was half-baked at best, getting stuck in loops when it did not know what to do, or forgetting the task and starting to do something else entirely, like looking at pictures of nature on Google Images. It is also slow and expensive to operate.
Keeping humans in the loop will be essential with a bot that is granted such high-level control and access to critical data. Computer-use agents may prove akin to self-driving cars: Google was able to make a car drive down a straightaway on its own easily enough, but the edge-case scenarios have taken years to solve.
There is debate on how to measure AGI and when it will be “achieved,” but OpenAI has told its biggest backer Microsoft that it believes AGI will be reached once it has created an AI that can generate at least $100 billion in profit. That is a lofty goal considering OpenAI predicts it will generate $12 billion in revenue in 2025 while still losing billions.
At the same time, neither Microsoft nor Google has seen enterprise customers adopt AI tools as quickly as they hoped. Instead of charging an extra $20-30 per employee for AI add-ons, both companies are now folding AI into their standard bundles and hiking prices by a few dollars.
California-based Vast Space has big ambitions. The company is aiming to launch a commercial space station, the Haven-2, into low Earth orbit by 2028, which would allow astronauts to stay in space after the decommissioning of the International Space Station (ISS) in 2030. In doing so, it is attempting to muscle in on NASA’s plans to develop commercial low-orbit space stations with partner organizations—but most ambitious of all are Vast Space’s goals for what it will eventually put into space: a station that has its own artificial gravity.
“We know that in weightlessness we can live a year or so, and in conditions that are not easy. Perhaps, however, lunar or Martian gravity is enough to live comfortably for a lifetime. The only way to find out is to build stations with artificial gravity, which is our long-term goal,” says Max Haot, Vast’s CEO.
Science Newsletter
Your weekly roundup of the best stories on health care, the climate crisis, new scientific discoveries, and more. Delivered on Wednesdays.
Vast Space was founded in 2021 by 49-year-old programmer and businessman Jed McCaleb, the creator of the peer-to-peer networks eDonkey and Overnet, as well as the early and now defunct crypto exchange Mt. Gox. Vast Space announced in mid-December a partnership with SpaceX to launch two missions to the ISS, which will be milestones in the company’s plan to launch its first space station, Haven-1, later in 2025. The missions, still without official launch dates, will fall within NASA’s private astronaut missions program, through which the space agency wants to promote the development of a space economy in low Earth orbit.
Graphical representation of Haven-1 in orbit.
Photograph: Vast Space
For Vast, this is part of a long-term business strategy. “Building an outpost that artificially mimics gravity will take 10 to 20 years, as well as an amount of money that we don’t have now,” Haot admits. “However, to win the most important contract in the space station market, which is the replacement of ISS, with our founder’s resources, we will launch four people on a [SpaceX] Dragon in 2025. They will stay aboard Haven-1 for two weeks, then return safely, demonstrating to NASA our capability before any competitor.”
Space for One More?
What Vast Space is trying to do, by showing its capabilities, is get involved in NASA’s Commercial Low Earth Orbit Destinations (CLD) program, a project the space agency inaugurated in 2021 with $415 million in awards to support the development of private low-Earth orbit stations.
The money was initially allocated to three different projects: one from aerospace and defense company Northrop Grumman, which has since exited the program; a joint venture called Starlab; and Orbital Reef, from Jeff Bezos’ Blue Origin. Vast has no contract with the US space agency, but it aims to outstrip these competitors by showing NASA that it can put a space station into orbit ahead of them. The agency will choose which project’s station to back in the second half of 2026.
By doing this, Vast is borrowing from SpaceX’s playbook. Not only has Vast Space drawn some of its employees and the design of equipment and vehicles from Elon Musk’s company, it’s also trying to replicate its approach to market: to be ready before anyone else, by having technologies and processes already qualified and validated in orbit. “We are lagging behind,” Haot says. “What can we do to win? Our answer, in the second half of 2025, will be the launch of Haven-1.”
Haven-1 will have a habitable volume of 45 cubic meters, a docking port, a corridor with consumable resources for the crew’s personal living quarters, a laboratory, and a deployable communal table set up next to a domed window about a meter high. On board, roughly 425 kilometers above Earth’s surface, the station will use Starlink laser links to communicate with satellites in low Earth orbit, tech that was first tested during the Polaris Dawn mission in the autumn of 2024.
Imagine: A switch is flicked and, in a heartbeat, every process spewing deadly pollution into the heavens is replaced with something clean and sustainable. Sadly, even then, the Earth would still tip towards being uninhabitable thanks to all of the carbon we’ve already dumped up there. If we as a species are to survive then all of that junk needs to be pulled back to Earth, and fast. Proponents of Direct Air Capture believe it’s a vital weapon to accomplish that task; its critics say it’s so inefficient that we’d be better off trying anything else first.
(Image credit: Mission Zero)
Put simply, Direct Air Capture (DAC) is the practice of removing CO2 from the atmosphere by pulling air through a mechanical or chemical filter. Air is typically drawn through a DAC system via one or more fans, while filtering is done with a solid (known as a sorbent) or with a liquid (known as a solvent). Once captured, heat or electricity is applied to the filter material to remove the CO2, both to re-use the filter and get the CO2 ready to move on. It’s this last stage that’s often the most energy-intensive, and therefore costly, part of the process. Given the amount of air that will need to be cleaned (all of it) for this to work, DAC needs to be as energy efficient as possible.
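Thermodynamics also sets a hard floor here: separating a dilute gas from a mixture costs a minimum amount of work that grows as the gas gets scarcer, and real plants run far above that floor. A rough sketch of the calculation (the concentration figure and the efficiency guess are assumptions for illustration):

```python
import math

R = 8.314        # gas constant (J/mol/K)
T = 298.0        # ambient temperature (K)
X_CO2 = 420e-6   # assumed atmospheric CO2 fraction (~420 ppm)
M_CO2 = 44.01    # molar mass of CO2 (g/mol)

# Reversible separation work for a dilute gas: roughly R*T*ln(1/x) per mole.
w_per_mol = R * T * math.log(1 / X_CO2)        # ~19 kJ/mol
moles_per_tonne = 1e6 / M_CO2                  # ~22,700 mol per tonne
kwh_min = w_per_mol * moles_per_tonne / 3.6e6  # joules -> kWh
print(f"thermodynamic minimum: ~{kwh_min:.0f} kWh per tonne")  # ~120 kWh

# Real systems are far from reversible. At an assumed 5-10% of ideal
# efficiency, that becomes ~1,200-2,400 kWh per tonne, which is why the
# energy-hungry regeneration step dominates DAC's cost.
```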
The most cost-effective way to capture CO2 is by capping the smokestacks of a carbon-intensive process, like a factory or fossil fuel power plant, to prevent more of the gas from being released. But that does nothing to reduce the excess CO2 already in the atmosphere. That’s why some scientists and entrepreneurs are inclined to gamble on DAC plants in free air to scrub the heavens clean.
The NOAA explains that in 1960, humanity was pumping 11 billion tons of carbon dioxide into the air each year. Half a century later, that figure stands closer to 40 billion, which is why emissions-reduction work is so vital. But even if we did manage to reduce all of our new emissions to zero, we’d still have to address the 950 gigatons or so of CO2 lurking in the atmosphere already. At the time of writing, the CO2 concentration in the atmosphere as recorded by NOAA’s Global Monitoring Lab at Mauna Loa is 422.38 ppm. The scientific consensus is that any figure over 350 ppm spells catastrophe for humanity and the state of the planet more generally.
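For a sense of scale, a standard approximation is that 1 ppm of atmospheric CO2 corresponds to roughly 7.8 billion tons of the gas (about 2.13 gigatons of carbon per ppm, multiplied by 44/12). A quick worked example using the figures above:

```python
GT_CO2_PER_PPM = 7.8    # ~2.13 GtC per ppm * (44/12); standard approximation

current_ppm = 422.38    # the Mauna Loa reading cited above
target_ppm = 350.0      # the consensus threshold cited above

excess_gt = (current_ppm - target_ppm) * GT_CO2_PER_PPM
print(f"excess CO2 above 350 ppm: ~{excess_gt:.0f} Gt")   # ~565 Gt

# At the 7-9 Gt/year removal rate the Oxford research below calls for by
# 2050, drawing down that excess alone would take on the order of 60-80
# years.
annual_removal_gt = 8.0   # midpoint of the cited range
print(f"years at {annual_removal_gt} Gt/yr: ~{excess_gt / annual_removal_gt:.0f}")
```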
This June, the University of Oxford published research saying that if we want to limit warming to just 1.5 degrees Celsius (a level that would still be damaging), humanity will need to extract between seven and nine billion tons of carbon dioxide from the air each year by 2050. The COP28 declaration supports signatory nations throwing their weight behind carbon capture technologies. The Intergovernmental Panel on Climate Change (IPCC) says there is no viable pathway to averting climate change unless large volumes of CO2 are pulled from the air. This has been the status quo for a while: in 2017, a coalition of prominent scientists led by Professor Jim Hansen said it was imperative that humanity begin mass-removing atmospheric CO2.
What to do with all the CO2
Once DAC has sucked the unwanted carbon out of the air, it needs to be put somewhere. One option, the British Geological Survey explains, is to convert CO2 into its supercritical form, which behaves like a runny liquid, relatively easily and affordably. This liquid can then be stored underground after being injected into porous rocks, with old oil fields and coal seams appearing to be ideal places. The oil and gas industry actually uses this approach to boost production in existing fields, as the liquid CO2 fills up the space, pushing more oil toward the extraction site. But the International Energy Agency’s (IEA) briefing paper on Direct Air Capture suggests more than half of all the atmospheric CO2 recovered will need to be sequestered.
Obviously, getting more fossil fuels out of the ground to burn does not do very much for the climate, and ideally the governments of the world would just invest in effective carbon capture to prevent us from boiling to death. Fortunately for humanity’s fixation on market solutions, recycling some of the non-sequestered CO2 could become an industry unto itself.
CO2 can also be turned into synthetic fuels for use in traditional combustion engines. Air travel is the most obvious example, especially given that the size and weight of batteries make it nearly impossible to build an electric jumbo jet. Recovered CO2 can also be used as the base for common non-fuel products, including construction materials and chemical and agricultural products, not to mention putting the fizz in our drinks.
Holocene is one of many companies looking to turn CO2 extraction into a viable, long-term business by selling carbon removal credits to big businesses. Its approach is to pull air through water which has been infused with an amino acid that binds to CO2. The water and CO2 mix is then combined with guanidine, which turns the CO2 into a solid that can be easily filtered out, allowing the amino acid water to be reused. The solid is then heated to a low temperature, which separates the guanidine from gaseous CO2, ready for use or sequestration. Holocene believes a reusable solvent (and reusable chemical treatment) combined with the low-temperature heat makes its approach far more cost-effective than that of its rivals.
Mission Zero is also looking to develop a low-cost way of procuring large quantities of CO2 from the atmosphere. It draws air into its hardware and then applies a water-based solvent. But rather than treating this mix chemically, it uses electrodialysis and an ion-exchange process to purify the liquid and extract the CO2. From there, the liquid can be reused, and the CO2, again, can either be buried underground or turned into viable products. The company says its electrochemical process is similarly far more cost- and energy-efficient than those of many of the other companies operating in this space.
Given the commercial sensitivities involved, it’s not easy to get a real handle on how much it costs to extract CO2 from the atmosphere using DAC in open air. Depending on where you look, the figure can be as much as $600 per ton, but a more common figure is between $300 and $400. For years, the received wisdom has been that DAC needs to reach a cost of $100 per ton in order to become economically viable.
Earlier this year, a German climate-focused VC firm, Extantia Capital, went digging into the source of that $100 shibboleth and traced it back to early DAC firm Carbon Engineering, which in 2018 published a paper projecting that its long-term cost would fall to as little as $94 per ton. Suddenly, the phrase “less than $100 per ton” became the benchmark to which all other DAC companies were held. But, as Extantia’s Torben Schreiter wrote, that figure was pegged to 2016 dollar prices, so it hasn’t grown with inflation. In 2023, the World Economic Forum said the cost of Direct Air Capture had to fall “below $200 per ton” before it would be widely adopted.
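Since the projection was in 2016 dollars, the benchmark quietly tightens every year it goes unadjusted. A quick sketch of the correction, assuming roughly 30 percent cumulative US inflation between 2016 and 2024 (an approximation for illustration, not an official figure):

```python
COST_2016_USD = 94.0             # Carbon Engineering's projected cost per ton
CPI_FACTOR_2016_TO_2024 = 1.31   # assumed ~31% cumulative US inflation

cost_in_2024_dollars = COST_2016_USD * CPI_FACTOR_2016_TO_2024
print(f"$94/ton (2016) is ~${cost_in_2024_dollars:.0f}/ton in 2024 dollars")
# ~$123/ton: a fixed "$100 per ton" target is therefore materially stricter
# today than when the 2018 paper was published.
```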
It doesn’t matter whether your aims are environmental or industrial: the volume of CO2 that needs to be extracted from the atmosphere is significant, and for that to be viable, the cost of extraction needs to fall by a significant degree. A more mature metric would be for pricing to fall in line with, or below, the perpetually in-flux cost of carbon dioxide as a commodity.
(Image credit: Holocene)
“All these DAC approaches use a bunch of energy,” said Holocene CEO Keeton Ross, who argues it’s the cost of this energy that keeps the price of Direct Air Capture higher than it needs to be. He believes heat-based systems (like Holocene’s) will likely win out in the end, because heat can come from any number of affordable sources. These claims of being able to cut the costs of DAC were compelling enough that in September Google invested in Holocene and pledged to buy carbon credits from it in the future.
Dr. Nicholas Chadwick, CEO of Mission Zero, told Engadget his company is targeting around $350 per ton by 2026, though that figure is “dependent on a specific price of electricity.” That price, he believes, is “substantially better than what’s available in the commodity market,” making it a no-brainer for industries that rely on CO2 to start buying from Mission Zero.
Roadblocks
The obvious objection to Direct Air Capture is that while there’s a lot of carbon dioxide in the atmosphere, it’s still a relatively small proportion of the whole. I’ve heard the process described as panning for gold in the ocean, the argument being that the energy costs alone will make it unfeasible at the necessary scale. In 2022, the Institute for Energy Economics and Financial Analysis bluntly claimed the process “simply won’t work.” Part of the objection was that it can be (and is) used for enhanced oil recovery, but also that when DAC facilities are up and running, they’re often far less effective at capturing CO2 than initially promised.
In 2023, a piece published by the Bulletin of the Atomic Scientists expressed outrage that the US Department of Energy had invested $600 million in one such project. Its authors said the energy costs required to filter that much air to extract just 0.04 percent of its volume are far in excess of other, less expensive ways to reduce emissions, and that no improvement in the physics and chemistry is coming that would make Direct Air Capture dramatically more efficient. They said, bluntly, “It’s just dumb to build today something that we won’t need for 50 years, if ever.”
Chadwick said a lot of the criticisms around DAC center on its technical feasibility, which he says is the wrong point. “There are tons of industrial processes where the thermodynamics are terrible, look at ammonia,” he said, “it took years and years to get the yields to where they are right now.” What drove those otherwise inefficient processes was the “economic imperative for it in the marketplace,” he said. “When someone proves they can do [Direct Air Capture] for $200 a ton, all of these arguments go away.”
Both Chadwick and Ross spoke about the importance of scale in accelerating the still quite nascent industry. In 2023, Carbon Engineering, 1PointFive and Occidental broke ground on the Stratos plant in Texas, which, when completed, is expected to suck 500,000 tons of CO2 out of the air per year. Both men are optimistic that projects like these, currently under construction, will help engineers solve the remaining cost and efficiency questions. But there is a long, long way to go before we reach the billions of tons experts believe we’ll need to extract to have a hope of survival.
In the future, oysters might help in the global fight against antibiotic-resistant bacteria. A protein found in the blood, or hemolymph, of the Sydney rock oyster (Saccostrea glomerata) appears to kill bacteria, while also increasing the effectiveness of some antibiotics. The findings are detailed in a study published January 21 in the journal PLOS ONE and could inform new ways to fight bacterial infections.
Over the past half-century, bacteria have developed resistance to many conventional antibiotics used to treat illnesses like pneumonia and strep throat, thanks to the drugs’ overuse and misuse. A 2024 study predicts that antimicrobial resistance will cause 40 million deaths by 2050. Discovering and developing new antibiotics is a crucial public health and safety measure.
“Most organisms have natural defence mechanisms to protect themselves against infection,” Kirsten Benkendorff, a study co-author and interdisciplinary marine scientist at Southern Cross University in Australia, said in a statement. “Oysters are constantly filtering bacteria from the water, so they are a good place to look for potential antibiotics.”
CREDIT: Southern Cross University
A 2024 study by the same team identified a protein found in the hemolymph of the Sydney rock oyster that inhibits Streptococcus pneumoniae. This bacterium causes respiratory infections in humans, including tonsillitis and pneumonia, and is a leading cause of death in children under five and of hospitalization in older adults.
Bacteria like S. pneumoniae that cause infection often escape from antibiotics and the immune system by forming biofilms to protect themselves. These biofilms are communities of microorganisms that attach themselves to surfaces in a sticky, protective matrix. The study found that oyster blood can kill bacterial pathogens in the biofilms.
“The oyster hemolymph proteins were found to prevent biofilm formation and disrupt biofilms, so the bacteria remain available to antibiotic exposure at lower doses,” said Benkendorff. “The hemolymph contains a mixture of proteins with known antimicrobial properties. These may act to directly kill the bacteria, as well as preventing them from attaching to the cell surface.”
Crucially, the oyster hemolymph proteins were not toxic to human lung cells. This means it might be possible to optimize the proteins to create a safe and effective dose. While a new antibiotic developed from oyster blood is still quite some time away, the findings are a step toward new methods of treating serious infections.
“It provides great opportunities for collaboration between researchers, aquaculture and pharmaceutical industries,” said Benkendorff. “In the meantime, slurping oysters could help keep the respiratory bugs away. Oysters contain zinc which boosts the immune system and they have really good polyunsaturated fatty acids and vitamins that also help modulate immunity.”