Roundup weed killer found in all kids’ oat cereals tested

https://www.treehugger.com/green-food/roundup-weed-killer-found-all-kids-oat-based-cereals-tested.html


EWG tested 28 brands of conventional oat-based cereals; all of them contained glyphosate residue, most at levels above what the group’s scientists consider protective of children’s health.

Earlier this year, the environmental health watchdog Environmental Working Group (EWG) tested 45 conventional oat-based cereal products and found glyphosate in 43 of them. The group has since conducted a second round of testing, and the results are as grim as the first time around. In the second set of tests, glyphosate – the active ingredient in Monsanto’s Roundup weed killer – was found in all 28 samples.

Glyphosate is a systemic, broad-spectrum herbicide that kills most plants it is sprayed on; the exception is crops genetically modified to tolerate it, such as those grown from Monsanto’s own “Roundup Ready” seeds.

While the companies that make the products in question claim that there is no cause for alarm since the levels fall within regulatory limits set by the Environmental Protection Agency, the levels were nonetheless higher than what EWG scientists consider protective of children’s health with an acceptable margin of safety.

“Just because something is legal doesn’t mean it’s safe,” notes EWG. “Federal government standards for pesticides in food are often outdated, not based on the best and most current science. The EPA’s standards for pesticides and other chemicals are also heavily influenced by lobbying from industry.”

Glyphosate is the most commonly used herbicide across the globe. It is great at killing weeds … and that’s not all. It is classified by the International Agency for Research on Cancer as “probably carcinogenic” to people. It is also listed by the California Office of Environmental Health Hazard Assessment as a chemical known to the state to cause cancer.

The independent lab found that all 28 samples had levels above EWG’s health benchmark of 160 parts per billion (ppb). One product had a level of 2,837 ppb … nearly 18 times higher than EWG’s children’s health benchmark. EWG’s benchmark is based on the risks of lifetime exposure, because when a person consumes a small bit every day, it accumulates.

“How many bowls of cereal and oatmeal have American kids eaten that came with a dose of weed killer?” asks EWG President Ken Cook. “… if those companies would just switch to oats that aren’t sprayed with glyphosate, parents wouldn’t have to wonder if their kids’ breakfasts contained a chemical linked to cancer.”

The idea that chemicals linked to cancer are in our kids’ food is really just dreadful. The government standards are outdated and influenced by industry, and the bottom line is that we just shouldn’t be feeding our kids weed killer, no matter the level.

As Cook says, “Glyphosate and other cancer-causing chemicals simply don’t belong in children’s food, period.”

You can read more about the report, see which products were tested, and sign a petition at EWG.

Oh, and in the meantime, buy organic oat cereals.


via TreeHugger https://ift.tt/2v7tbJp

October 24, 2018 at 09:31AM

This Company Wants to Make the Internet Load Faster

https://www.wired.com/story/company-wants-to-make-internet-load-faster


The internet went down on February 28, 2017. Or at least that’s how it seemed to some users as sites and apps like Slack and Medium went offline or malfunctioned for four hours. What actually happened is that Amazon’s enormously popular S3 cloud storage service experienced an outage, affecting everything that depended on it.

It was a reminder of the risks when too much of the internet relies on a single service. Amazon gives customers the option of storing their data in different “availability regions” around the world, and within those regions it has multiple data centers in case something goes wrong. But last year’s outage knocked out S3 in the entire Northern Virginia region. Customers could of course use other regions, or other clouds, as backups, but that involves extra work, including possibly managing accounts with multiple cloud providers.

A San Francisco-based startup called Netlify wants to make it easier to avoid these sorts of outages by automatically distributing its customers’ content to multiple cloud computing providers. Users don’t need accounts with Amazon, Microsoft Azure, Rackspace, or any other cloud company—Netlify maintains relationships with those services. You just sign up for Netlify, and it handles the rest.

You can think of the company’s core service as a cross between traditional web hosting providers and content delivery networks, like Akamai, that cache content on servers around the world to speed up websites and apps. Netlify has already attracted some big tech names as customers, often to host websites related to open source projects. For example, Google uses Netlify for the website for its infrastructure management tool Kubernetes, and Facebook uses the service for its programming framework React. But Netlify founders Christian Bach and Mathias Biilmann don’t want to just be middlemen for cloud hosting. They want to fundamentally change how web applications are built, and put Netlify at the center.

Traditionally, web applications have run mostly on servers. The applications run their code in the cloud, or in a company’s own data center, assemble a web page based on the results, and send the result to your browser. But as browsers have grown more sophisticated, web developers have begun shifting computing workloads to the browser. Today, browser-based apps like Google Docs or Facebook feel like desktop applications. Netlify aims to make it easier to build, publish, and maintain these types of sites.

Back to the Static Future

Markus Seyfferth, the COO of Smashing Media, was converted to Netlify’s vision when he saw Biilmann speak at a conference in 2016. Smashing Media, which publishes the web design and development publication Smashing Magazine and organizes the Smashing Conference, was looking to change the way it managed its roughly 3,200-page website.

Since its inception in 2006, Smashing Magazine had been powered by WordPress, the content management system that runs about 32 percent of the web according to technology survey outfit W3Techs, along with e-commerce tools to handle sales of books and conference tickets and a third application for managing its job listing site. Using three different systems was unwieldy, and the company’s servers struggled to handle the site’s traffic, so Seyfferth was looking for a new approach.

When you write or edit a blog post in WordPress or similar applications, the software stores your content in a database. When someone visits your site, the server runs WordPress to pull the latest version from the database, along with any comments that have been posted, and assembles it into a page that it sends to the browser.

Building pages on the fly like this ensures that users always see the most recent version of a page, but it’s slower than serving prebuilt “static” pages that have been generated in advance. And when lots of people are trying to visit a site at the same time, servers can bog down trying to build pages on the fly for each visitor, which can lead to outages. That leads companies to buy more servers than they typically need; what’s more, servers can still be overloaded at times.

“When we had a new product on the shop, it needed only a couple hundred orders in one hour and the shop would go down,” Seyfferth says.

WordPress and similar applications try to make things faster and more efficient by “caching” content to reduce how often the software has to query the database, but it’s still not as fast as serving static content.
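To make the contrast concrete, here is a minimal sketch in Python. Everything in it (the posts table, the template, the function names) is a hypothetical illustration rather than how WordPress or Netlify actually work internally: the dynamic handler queries the database and assembles a page on every request (caching just reduces how often that query runs), while the static build renders each page once and leaves only ready-made files for the server or CDN to return.

```python
# Minimal sketch contrasting dynamic rendering with a static build step.
# The "posts" table and all function names are hypothetical, for illustration only.
import sqlite3
from pathlib import Path

TEMPLATE = "<html><body><h1>{title}</h1><article>{body}</article></body></html>"

def render_post(row):
    """Turn one database row into an HTML page."""
    return TEMPLATE.format(title=row["title"], body=row["body"])

# Dynamic approach (WordPress-style): the database is queried and the page is
# assembled on every request, so heavy traffic means heavy database load.
def handle_request_dynamic(db, slug):
    db.row_factory = sqlite3.Row
    row = db.execute("SELECT title, body FROM posts WHERE slug = ?", (slug,)).fetchone()
    return render_post(row)

# Static approach: every page is rendered once at build time; afterwards the
# web server (or a CDN) only has to hand back prebuilt files.
def build_static_site(db, out_dir="public"):
    db.row_factory = sqlite3.Row
    Path(out_dir).mkdir(exist_ok=True)
    for row in db.execute("SELECT slug, title, body FROM posts"):
        Path(out_dir, f"{row['slug']}.html").write_text(render_post(row))
```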

Static content is also more secure. Using WordPress or similar content managers exposes at least two “attack surfaces” for hackers: the server itself, and the content management software. By removing the content management layer, and simply serving static content, the overall “attack surface” shrinks, meaning hackers have fewer ways to exploit software.

The security and performance advantages of static websites have made them increasingly popular with software developers in recent years, first for personal blogs and now for the websites for popular open source projects.

In a way, these static sites are a throwback to the early days of the web, when practically all content was static. Web developers updated pages manually and uploaded pre-built pages to the web. But the rise of blogs and other interactive websites in the early 2000s popularized server-side applications that made it possible for non-technical users to add or edit content, without special software. The same software also allowed readers to add comments or contribute content directly to a site.

At Smashing Media, Seyfferth didn’t initially think static was an option. The company needed interactive features, to accept comments, process credit cards, and allow users to post job listings. So Netlify built several new features into its platform to make a primarily static approach more viable for Smashing Media.

The Glue in the Cloud

Biilmann, a native of Denmark, spotted the trend back to static sites while running a content management startup in San Francisco, and started a predecessor to Netlify called Bit Balloon in 2013. He invited Bach, his childhood best friend who was then working as an executive at a creative services agency in Denmark, to join him in 2015 and Netlify was born.

Initially, Netlify focused on hosting static sites. The company quickly attracted high-profile open source users, but Biilmann and Bach wanted Netlify to be more than just another web-hosting company; they sought to make static sites viable for interactive websites.

Open source programming frameworks have made it easier to build sophisticated applications in the browser. And there’s a growing ecosystem of services like Stripe for payments, Auth0 for user authentication, and AWS Lambda for running small chunks of custom code, that make it possible to outsource many interactive features to the cloud. But these types of services can be hard to use with static sites because some sort of server-side application is often needed to act as a middleman between the cloud and the browser.

Biilmann and Bach want Netlify to be that middleman, or as they put it, the “glue” between disparate cloud computing services. For example, they built an e-commerce feature for Smashing Media, now available to all Netlify customers, that integrates with Stripe. It also offers tools for managing code that runs on Lambda.
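To picture the “glue” role, here is a minimal sketch of the kind of serverless function an otherwise-static site might call for its one dynamic step. The handler follows the common AWS Lambda event/response shape, but the charge_customer helper, the environment-variable name, and the request fields are hypothetical placeholders, not Netlify’s or Stripe’s actual API.

```python
# Sketch of a serverless "glue" function that an otherwise-static site could
# call. The payment step is a hypothetical stand-in; a real deployment would
# use a provider SDK (e.g. Stripe) on whatever platform runs the function.
import json
import os

def charge_customer(token: str, amount_cents: int) -> dict:
    """Hypothetical helper: forward the charge to a payment provider's API
    using a secret key that lives only in the server-side environment."""
    secret_key = os.environ["PAYMENT_SECRET_KEY"]  # never shipped to the browser
    # ... call the payment provider with secret_key and token here ...
    return {"status": "succeeded", "amount": amount_cents}

def handler(event, context):
    """Lambda-style entry point: the browser POSTs a small JSON payload, this
    function performs the single dynamic step, and the site itself stays static."""
    body = json.loads(event.get("body") or "{}")
    result = charge_customer(body["token"], int(body["amount_cents"]))
    return {"statusCode": 200, "body": json.dumps(result)}
```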

Smashing Media switched to Netlify about a year ago, and Seyfferth says it’s been a success. It’s much cheaper and more stable than traditional web application hosting. “Now the site pretty much always stays up no matter how many users,” he says. “We’d never want to look back to what we were using before.”

There are still some downsides. WordPress makes it easy for non-technical users to add, edit, and manage content. Static site software tends to be less sophisticated and harder to use. Netlify is trying to address that with its own open source static content management interface called Netlify CMS. But it’s still rough.

Seyfferth says for many publications, it makes more sense to stick with WordPress for now because Netlify can still be challenging for non-technical users.

And while Netlify is a developer darling today, it’s possible that major cloud providers could replicate some of its features. Google already offers a service called Firebase Hosting with some similar functionality.

For now, though, Bach and Biilmann say they’re just focused on making their serverless vision practical for more companies. The more people who come around to this new approach, the more opportunities there are not just for Netlify, but for the entire new ecosystem.



via Wired Top Stories https://ift.tt/2uc60ci

October 24, 2018 at 06:03AM

Microplastics Have Been Found in People’s Poop–What Does It Mean?

https://www.scientificamerican.com/article/microplastics-have-been-found-in-people-rsquo-s-poop-what-does-it-mean/


Everywhere scientists have looked for them they have found tiny bits of degraded plastic—including, now, in human poop. New research provides evidence of something scientists have suspected since microplastics were first detected in seafood, salt and bottled water: People are eating plastic particles, and excreting at least some of them.


Although the study is a small one, geared toward showing microplastics can be detected in excrement and are actually found there, it tees up future work to look for broader patterns of human microplastic exposure and the potential associated health impacts.


Microplastics include fragments smaller than five millimeters (the diameter of a grain of rice) that result from the breakdown of larger debris, such as bottles, in the environment. They are also made up of fibers shed by synthetic fabrics, and plastic beads added to some cosmetics. They have turned up everywhere from the seafloor to farm soil to the air around us—as well as in the first few foods and beverages scientists looked at—making it almost certain people have been ingesting them. But until now there were no direct samples from humans showing this was happening.


Stools seemed “the most promising place to look in humans for the first time,” says study co-author Bettina Liebmann of Environment Agency Austria. Detecting microplastics in poop is tricky, though. She and Philipp Schwabl, of the Medical University of Vienna, spent weeks developing a method that would break down the organic matter present in feces without affecting any microplastics that might be there so the plastic could be isolated from samples.


Image: A stool sample prepared on a filter, ready for analysis to detect microplastics. Credit: Umweltbundesamt – Environment Agency Austria / S. Koeppel


The team collected samples from eight participants across Europe and Asia, who were instructed on how to minimize contamination from, for example, the fibers that are continuously floating in the air—the bane of many microplastics researchers. The scientists analyzed the stools for microplastics ranging in size from 50 micrometers (almost twice the diameter of a human skin cell) to five millimeters. “We were quite astonished that we found microplastics in every single sample,” Liebmann says. They also detected nine of the 10 common types of plastic polymers they looked for—notably polypropylene (used, for example, in bottle caps), polyethylene terephthalate (used in drink bottles) and polystyrene (used in food containers). Although they could not identify the exact source of each particle, the findings “confirm that we are surrounded by plastics in our everyday life,” she says.


The work, presented October 23 in Vienna at United European Gastroenterology Week, an annual meeting of specialists in digestive health, serves as a jumping-off point for further research. Liebmann and Schwabl hope to launch a larger study with more participants to look for any links between the amounts, types and sizes of plastic particles, along with where people live, what they eat and other lifestyle factors. They also hope to look for smaller sizes of plastic—which are the most likely to be able to penetrate the gut lining and enter the circulatory system and other organs, such as has been found to happen with other nanosize, man-made particles.


The new research suggests at least some microplastics (at the upper end of the size range) are being excreted by the body, which Liebmann calls “a good sign.” It remains unclear, though, how what is coming out compares with what might still be left in the body. “We’re just simply missing the frame of reference,” says Martin Wagner, an ecotoxicologist at Norwegian University of Science and Technology who was not involved in the new study.


Future work will also need to explore what, if any, negative health impacts microplastics might have on the body via physical damage to the gut or other organs, or due to the introduction of plastics’ chemical additives. Scientists do not yet know how microplastics might be different in this regard from any of the other indigestible particles to which humans are exposed. “We need to know, is it really toxic?” Wagner says. “We’re blind on that.”


But with the new work, “now we know how to tackle the problem, and we have the tools at hand” to start looking at microplastics in humans beyond the assumptions that have been made before, Liebmann says. “Now we have the proof that there is something that is worth looking at.”

via Scientific American https://ift.tt/n8vNiX

October 24, 2018 at 05:48AM

Behind the supersonic rise and fall of the Concorde, 15 years after its final flight

https://www.popsci.com/concorde-anniversary-future-of-supersonic-flight?dom=rss-default&src=syn


Here’s how the typical storyline of technology goes: something new is invented, then it becomes old, and then we replace it with a more advanced version. But in rare instances tech is so advanced that we’re not actually prepared to replace it by the time it ages out of fashion. Case in point: the Concorde. It was a plane ahead of its time—quite literally, as a flight from Paris or London to New York was so fast it’d actually land more than two hours before it took off: something that’s only possible today if you cross the International Date Line. The supersonic jet was supposed to usher in a new age of transportation, but just 27 years after its inaugural commercial flight the futuristic aircraft retired with no successor—15 years ago today, in fact—and supersonic passenger travel ceased to exist.

The reasons were manifold, but typically distilled into two major problems: the Concorde was not economical, and the sonic boom it produced was such a nuisance to people on the ground that it could only fly over water. The first and last generation of Concorde reached old age before anyone had managed to solve those problems, so nobody unveiled a shiny new model to replace it. But there’s hope on the horizon. In 2016, NASA announced a new program to develop a quieter supersonic aircraft and awarded a contract to Lockheed Martin, meaning that the general public may soon soar faster than the speed of sound once more.

A Supersonic Rise And Fall

After the Wright Brothers made the first flight in Kitty Hawk, North Carolina, on December 17, 1903, aviation developed at an incredible pace. Within two decades WWI was taking warfare to the skies and commercial airlines were ferrying customers around the world. On October 14th, 1947, aviation took another big leap forward; test pilot Chuck Yeager became the first human to break the sound barrier, achieving Mach 1 in the Bell X-1 rocket-powered aircraft, a collaborative project between the U.S. Air Force and National Advisory Committee for Aeronautics (NACA), the precursor to NASA. But the X-1 itself was designed primarily for research, not commercial passengers. Soon supersonic military jets were on the rise, but like the X-1, they were sprinters: they could only fly at Mach 1 for a few seconds, perhaps a few minutes at most, before they ran out of fuel. While this worked for small aircraft performing sharp maneuvers, large commercial airliners—which often travel in straight lines or gentle curves—would need to cruise at supersonic speed for a much longer period of time.

The progression did, however, inspire the commercial aviation industry to look into the creation of supersonic transports (SSTs), or civilian supersonic aircraft. While the X-1 proved we had the right tools to fly at supersonic speeds, a few major details needed ironing out, like the capability to cruise above Mach 1 for the duration of a relatively long flight, as well as the economic viability of such a project. Multiple countries, including the U.S., started research in the 1950s, but a slew of difficulties facing SSTs during development meant that just three nations would go on to build and fly such crafts: the United Kingdom, France, and the Soviet Union.

“The only European countries that had the interest, the technology, and the financing to design and build an SST were France and Great Britain,” says John Little, assistant curator at the Museum of Flight in Seattle. “They each wanted to develop an SST, but neither country could afford to do so on its own. So, somewhat reluctantly, France and Great Britain agreed to become partners and to develop an SST jointly.”

The U.S.S.R., on the other hand, was able to develop its Tupolev Tu-144 independently, though the airliner only made 55 passenger flights before the program was canceled due to its high failure rate. (There was, for instance, a high-profile crash at the 1973 Paris Air Show.) The Concorde was by far the superior aircraft, making daily flights for nearly three decades.

Flying (And Spending) High

In order to make SST possible, Concorde engineers from the U.K.’s British Aircraft Corporation, France’s Aérospatiale, and the other companies contracted to work on portions of the aircraft (like Rolls-Royce, which designed the engines) had to develop new technologies or refine old ones, from the fly-by-wire controls in the cockpit (electronic interfaces versus analog ones) to heat-resistant tires to the elegant delta wing. “In my opinion, Concorde’s most innovative technology was the ability to cruise at Mach 2, or twice the speed of sound,” says Little. “After a few minutes of supersonic flight, most military airplanes would run low on fuel. By contrast, Concorde could cruise at twice the speed of sound for over three hours.”

Airlines rushed to place Concorde orders years before the plane was even built—more than 70 aircraft were ordered by 16 companies. But as the Concorde’s development progressed, so did the project’s cost. “Cost overruns were tremendous, going from £70 million to £1.3 billion,” says Aero Consulting Experts CEO Ross “Rusty” Aimer, a former pilot (that’s about $91 million to $1.7 billion in 2018 USD). Then the Concorde ran into other unexpected problems—although its faster trips meant it used less fuel on a journey than standard aircraft, environmentalists protested the high rate of fuel consumption (approximately 6,700 gallons per hour, compared to the Boeing 747’s 3,600 gallons per hour), as well as the potential damage the Concorde’s pollutants might do to the ozone layer at its high cruising altitude of 60,000 feet. And what might have been the biggest blow to the airliner was the banning of flights over land by air transportation regulators due to the sonic boom, which followed the aircraft in a 16-mile-wide trail. Thus the Concorde was limited to routes over water, and given its flying range of approximately 4,500 miles, it could barely cross the Atlantic, much less the Pacific. “The original orders from airlines around the world started to drop like a bad run on banks,” Aimer notes. “British Airways and Air France were the only airlines forced to order a small number due to political pressure and national pride.”
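Those hourly figures also hint at why the per-passenger economics never worked. The back-of-the-envelope comparison below uses the fuel-burn numbers quoted in this article, but the cruise speeds and seat counts are rough assumptions added purely for illustration (roughly 1,350 mph and 100 seats for Concorde, 565 mph and 400 seats for a 747), so treat the result as a ballpark only.

```python
# Rough fuel-economy comparison. Hourly burn rates come from the article;
# cruise speeds and seat counts are ballpark assumptions for illustration.
def gallons_per_passenger_mile(gal_per_hour, mph, seats):
    return gal_per_hour / mph / seats

concorde = gallons_per_passenger_mile(6_700, 1_350, 100)   # Mach 2 cruise, ~100 seats
jumbo = gallons_per_passenger_mile(3_600, 565, 400)        # subsonic 747, ~400 seats

print(f"Concorde: {concorde:.3f} gal per passenger-mile")  # ~0.050
print(f"747:      {jumbo:.3f} gal per passenger-mile")     # ~0.016
print(f"Ratio:    {concorde / jumbo:.1f}x")                # roughly 3x more per seat
```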

Ultimately, only 20 Concordes were ever built, including six prototypes: just 14, seven each for British Airways and Air France, ever entered commercial service. Despite being plagued with problems over the course of its development, the aircraft was highly regarded as one of the most beautiful in the world—as well as one of the safest—and its exclusivity due to limited seats and sky-high ticket prices (in today’s dollars, a round-trip flight on the Concorde could cost upwards of $20,000, compared to the $6,000 to $10,000 you’d spend flying first-class on a subsonic Air France jet in 2018) created great demand. A massive fan base of aviation enthusiasts and high-profile passengers like celebrities and politicians grew quickly, and ticket sales soared.

Flying aboard a Concorde was a luxurious experience, akin to flying first class on one of today’s airliners. The notoriously cramped and noisy cabins never stopped guests from enjoying their journey: they sipped Champagne from glass flutes, dined on three-course meals served by hand rather than trolley, and indulged in after-dinner drinks. “We were trained to be efficient and elegant,” says Air France head purser Alain Verschuere, who served as a flight attendant aboard Concorde from 1999 till its retirement in 2003. “Air France was famous for this kind of class. The service was very nice, as were our uniforms and the decor aboard. We were very proud to fly on this aircraft. Even nowadays, 15 years later, passengers on my flights always say to me, ‘You were so lucky to fly on the Concorde!’”

But all good things must come to an end. Between the Concorde’s first passenger flight in 1976 and its last flight in 2003, the airliner was dealt some difficult hands—this aside from its economic and auditory woes during its development phase.

In With A Boom, Out With A Whimper

“When Concorde was conceived in the late 1950s and designed in the mid-1960s, oil was cheap, jet fuel cost just pennies per gallon, and nobody foresaw that price increasing. Then came the Oil Crisis of 1973-1974, which caused the price of oil, and everything that was derived from it, including jet fuel, to soar,” says Little. “An increase of even a penny per gallon could mean the difference between operating a flight at a profit or a loss, and no airliner was more susceptible to fluctuations in jet fuel pricing than Concorde, which burned about 2,000 pounds of fuel per passenger while flying across the Atlantic Ocean.”

“Even worse, as global business travel shifted toward Asia, Concorde became less competitive. Because it could not fly supersonically over land, it could not fly to Asia eastward from Paris or London, nor could it fly westward, as it did not have the range to cross the Pacific Ocean,” Little adds. “Thus, ironically, in the long-haul market, where supersonic flight makes the most sense economically, Concorde was a non-starter.”

Market issues aside, there was also the main problem that burdens any aircraft—time. The fleet aged, and maintenance was extremely costly. By the end of the 1990s, given fuel and maintenance costs, as well as limitations to the route, the aircraft’s fate was effectively sealed. Then there was the final blow: On July 25, 2000, Air France Flight 4590, a Concorde bound for New York from Paris crashed just minutes after takeoff, killing everyone on board and several people on the ground. “Just one year later, the fleet was authorized to fly again, since we developed new technology that resolved the weaknesses that contributed to the crash,” says Jacques Rocca, Director of the Heritage Department at Airbus, which maintains most of the assets of the now defunct Aérospatiale. “But because of the 9/11 terrorist attacks, which happened during that year, less people were requesting to fly Concorde when it returned to service.”

The Concorde’s return was brief—the plane phased out of service in 2003, with the final flight taking place on October 24, ending the limited run of one of the most legendary aircraft in aviation history. “Essentially, the Concorde was more about technological prowess than economical reality,” says Aimer.

A Second Wind

There was no SST to replace the aged Concorde, so airline passengers have been cruising at subsonic speeds ever since. But 15 years later the world is more connected than ever, and there’s incredible demand for faster aircraft. “As Asia becomes increasingly central to the world’s economy, business travelers need a way to get to Asia quickly from Europe and the Americas,” says Little. “The first aircraft-maker that develops a hypersonic airliner [one that travels at Mach 5 or higher] that can fly between New York and Beijing in, say, three hours, will sell a lot of airplanes.”

Today, a number of companies are researching ways to resolve the Concorde’s shortcomings for both commercial airliners and business jets. “From an engineering standpoint, the big challenges will be to reduce the fuel burn, reduce the emissions, and reduce or eliminate the sonic boom—all will be extremely expensive to solve,” says Little. “For reducing fuel burn and emissions, the best option is to develop engines that do not require the burning of petroleum-based fuel. That option, however, will be risky for any manufacturer to undertake, and there is no guarantee of success.”

NASA is currently working with Lockheed Martin on an experimental aircraft, the X-59 QueSST, that will reduce the supersonic boom to a quiet thump. “It’s not about the specific material used, the level of attention to screws, bolts or seams. The most important aspect of the design is its shape—the outer mold line and what’s touching the air,” explains Erica Tierney, Program Communications and Media Relations at Lockheed Martin Skunk Works. “The X-59’s long, pointed nose, the sharply swept wings, and shape of the canards ensure the individual pressure waves, produced at speeds faster than Mach 1, never converge to cause a traditional sonic boom.”

The plane is currently under development with a delivery date in 2021. Once Lockheed Martin hands the completed aircraft over to NASA, the agency will fly the X-59 over U.S. cities to study the effect of the sonic thump on the general population.

“NASA will recruit members of the community to participate in surveys each day during flight testing to understand how they respond to the sounds of quiet supersonic overflight,” says Peter Coen, Commercial Supersonic Technology project manager at NASA. “The data from flight tests will be given to U.S. and international regulators for their use in considering new rules that would allow commercial supersonic flight over land.”

Should the Federal Aviation Administration lift the ban on supersonic travel over U.S. land for quieter SSTs, aircraft manufacturers could use similar tech to develop new supersonic planes.

“The X-59 will be a breakthrough for the aircraft and transportation industries,” says Tierney. “It will make possible an entirely new global aerospace market, enabling passengers around the world to travel anywhere in half the time it takes today.”

But will the return of SSTs be in the broader commercial space for all types of passengers? Perhaps not—experts suggest supersonic travel might go in the direction of private planes, also known as business jets. “The business jet market is rapidly spinning up,” says Dr. James Ladesic, Professor of Aerospace Engineering and Associate Dean, Industry Relations and Outreach, College of Engineering at Embry-Riddle Aeronautical University. “SSTs are seen as having a niche that can work here, since business jets are generally smaller in passenger count and of significant price value in the market.”

Samme Chittum, author of Last Days of the Concorde: The Crash of Flight 4590 and the End of Supersonic Passenger Travel, agrees that the future is in business jets. “All current aerodynamic research indicates that the airframe of any quieter ‘boomless’ aircraft must have a high length-to-width ratio, much like an arrow—no wide-bodies need apply,” she says. “This narrow configuration is not amenable to carrying a lot of passengers to reduce passenger mile per gallon of fuel costs. If commercial supersonic aircraft do arrive, they will most likely be as small business planes for the super wealthy, not as bus transportation for the rest of us.”

NASA, however, hopes for a different outcome. “Our vision for the future supersonic flight is one in which the speed of travel benefits of aircraft like the Concorde are available broadly to the public,” says Coen. Should the X-59 be successful in its mission to reduce the sonic boom, we might all be flying faster than the speed of sound in no time.

via Popular Science – New Technology, Science News, The Future Now https://ift.tt/2k2uJQn

October 24, 2018 at 01:53PM

Super Typhoon Yutu Could Strike the Northern Mariana Islands as a Category 5 Beast

https://earther.gizmodo.com/typhoon-yutu-could-strike-guam-as-a-category-5-beast-1829948534


Typhoon Yutu’s eye is beginning to clear out.
GIF: CIMSS

It feels like 2018 is the year of rapid intensification. Storm after storm after storm has spun up from humble beginnings into a cyclonic monster around the world. The latest cyclone to join the ignominious club is Typhoon Yutu, which is in the midst of spinning up into a forecast Category 5 super typhoon by Wednesday. Guam and the Northern Mariana Islands lie squarely in the typhoon’s path.

Yutu spent all of Tuesday ramping up, and as of the latest Joint Typhoon Warning Center bulletin, it had estimated winds of 127 mph. That’s the equivalent of a very strong Category 3 storm, and with nothing but warm water in its path, Yutu is expected to continue amping up. The storm’s winds could be roaring around 155 mph as it approaches Guam and the Northern Mariana Islands on Thursday morning local time.

The National Weather Service Guam office has posted a typhoon warning calling for powerful surf and up to half a foot of rain. The hilly terrain of Guam will be but a speed bump for Yutu, which is forecast to keep climbing in intensity through the end of the week. Its winds could reach an astounding 172 mph, which would put it among the strongest storms on Earth this year. Thankfully, it will achieve that terrifying feat over open water with no threat to land.

Some weather experts have been watching the storm via satellite and wondering if the official forecast is a bit underdone and if Yutu could be more intense than current estimates. That’s because there’s a subjective element involved. Satellites don’t measure a storm’s wind speeds directly, but provide a snapshot of convection and the processes that indicate how healthy a storm is.

Meteorologists use that data to estimate wind speeds using something called the Dvorak technique. Without getting into a bunch of equations, suffice to say forecasters use what the satellite imagery reveals about convection, temperatures within the storm, and a few other factors to come up with a “T number,” which can be translated into an estimated wind speed.
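As a rough sketch of that last step, the snippet below maps a final T (Current Intensity) number to an estimated maximum wind. The table holds the commonly cited Atlantic-basin values in knots; agencies use basin-specific variants and additional constraints, so these numbers are approximate and for illustration only.

```python
# Illustrative Dvorak lookup: Current Intensity (CI) number -> estimated wind.
# Values are the commonly cited Atlantic-basin table (knots), approximate only.
CI_TO_KNOTS = {
    1.0: 25, 1.5: 25, 2.0: 30, 2.5: 35, 3.0: 45, 3.5: 55, 4.0: 65,
    4.5: 77, 5.0: 90, 5.5: 102, 6.0: 115, 6.5: 127, 7.0: 140,
    7.5: 155, 8.0: 170,
}

def estimated_wind_mph(ci_number: float) -> float:
    """Linearly interpolate between table entries and convert knots to mph."""
    points = sorted(CI_TO_KNOTS.items())
    lo = max(p for p in points if p[0] <= ci_number)
    hi = min(p for p in points if p[0] >= ci_number)
    if lo[0] == hi[0]:
        knots = lo[1]
    else:
        knots = lo[1] + (ci_number - lo[0]) / (hi[0] - lo[0]) * (hi[1] - lo[1])
    return knots * 1.15078  # 1 knot = 1.15078 mph

print(round(estimated_wind_mph(6.0)))  # ~132 mph
print(round(estimated_wind_mph(7.5)))  # ~178 mph
```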

The technique is super useful for remote parts of the Pacific that don’t have the benefit of estimates from aircraft like the National Oceanic and Atmospheric Administration’s Hurricane Hunters. And Chris Velden, a hurricane expert at the Cooperative Institute for Meteorological Satellite Studies (CIMSS), told Earther that the technique “gets the mean state right most of the time, but it can break down when you have anomalous conditions like a very rapidly intensifying storm or decaying storm.”

Yutu is one of those storms, and while Velden said he felt the current official forecast was pretty spot-on, that doesn’t mean the storm won’t continue to push the limits of what’s possible.

“This has potential to grow into a Category 5 real quickly,” he said. “People at the Joint Typhoon Warning Center are watching this. They know what they’re doing.”

via Gizmodo https://gizmodo.com

October 23, 2018 at 05:27PM

Apollo 11’s Journey To The Moon, Annotated

https://geekologie.com/2018/10/apollo-11s-journey-to-the-moon-annotated.php



This is a fascinating five minute video discussing the Apollo 11 mission to the moon, and what it took to be a success. SPOILER: A whole bunch of luck and magic. “It was math and science.” You know, there really is no difference if you look closely enough. See what I’m doing here? “Staring at your nipple through a magnifying glass?” I rest my case.

Keep going for the video.


via Geekologie – Gadgets, Gizmos, and Awesome https://geekologie.com/

October 24, 2018 at 11:16AM

NASA fixes Hubble gyroscope by turning it off and on again

https://www.engadget.com/2018/10/24/nasa-hubble-gyroscope-fix/




Hubble’s designers prepared for gyroscope failure by equipping the observatory with a backup. Unfortunately, when one of Hubble’s gyroscopes conked out in early October, the backup didn’t work as expected — it was rotating too fast and hence wouldn’t be able to hold the telescope in place when it needs to stay still and lock in on a target. NASA has since been able to reduce its rotation rates and resolve its issues by implementing an age-old fix for malfunctioning electronics: turning it off and on again.

The process sounds a thousand times more complex than simply pressing a switch, though. The Hubble team had to move the gyro around while switching it from high-rotation to low-rotation mode again and again in order to clear any blockage that might be preventing it from working properly. Here’s how NASA describes it:

“In an attempt to correct the erroneously high rates produced by the backup gyro, the Hubble operations team executed a running restart of the gyro on Oct. 16th. This procedure turned the gyro off for one second, and then restarted it before the wheel spun down. The intention was to clear any faults that may have occurred during startup on Oct. 6th, after the gyro had been off for more than 7.5 years. However, the resulting data showed no improvement in the gyro’s performance.

On Oct. 18th, the Hubble operations team commanded a series of spacecraft maneuvers, or turns, in opposite directions to attempt to clear any blockage that may have caused the float to be off-center and produce the exceedingly high rates. During each maneuver, the gyro was switched from high mode to low mode to dislodge any blockage that may have accumulated around the float.

Following the Oct. 18th maneuvers, the team noticed a significant reduction in the high rates, allowing rates to be measured in low mode for brief periods of time. On Oct. 19th, the operations team commanded Hubble to perform additional maneuvers and gyro mode switches, which appear to have cleared the issue. Gyro rates now look normal in both high and low mode.”

NASA will conduct a few more tests to ensure the backup can do its job during routine science operations. Thus far, they seem optimistic that Hubble will be back in business in the near future.

via Engadget http://www.engadget.com

October 24, 2018 at 04:12AM