Scientists Turned a Regular Fidget Spinner Into a Centrifuge That Separates Blood

https://gizmodo.com/scientists-turned-a-regular-fidget-spinner-into-a-centr-1831996597



Kids are already over fidget spinners after adults rapidly made them uncool, but a team of scientists in Taiwan has found a nifty way to repurpose these small toys. They turned them into inexpensive centrifuges that could allow health workers in impoverished areas to carry out certain blood tests with ease.

Centrifuges are used to separate out the main components of blood: plasma and blood cells. Plasma can then be tested to confirm health conditions like HIV, viral hepatitis, and malnutrition. This separation is usually done with relatively expensive, electrically powered devices that can spin fast and long enough to create the needed centrifugal force. But researchers at National Taiwan University wondered if there was a low-tech way to accomplish the same thing. Fidget spinners, funnily enough, weren’t the first toy they tested out.

“In the beginning, we tried to use the Beyblade burst to replace a centrifuge,” lead author Steve Chen told Gizmodo, referring to one of the toy lines based on the popular Japanese manga and anime Beyblade. “However, the rotation speed wasn’t sufficient to separate blood. Then we jumped to use fidget spinners to do the tests.”
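
For a rough sense of why rotation speed matters, the relative centrifugal force (RCF) a sample feels grows with the spin radius and the square of the rotation rate. The sketch below uses the standard RCF conversion with made-up numbers for illustration; neither the spin rates nor the radii come from the study.

```python
def relative_centrifugal_force(rpm: float, radius_cm: float) -> float:
    """Relative centrifugal force, in multiples of g, for a given spin rate and radius."""
    # Standard conversion: RCF = 1.118e-5 * radius(cm) * rpm^2
    return 1.118e-5 * radius_cm * rpm ** 2

# Illustrative values only (not measurements from the paper):
# tubes taped roughly 3 cm from the axis of a hand-flicked spinner
print(relative_centrifugal_force(rpm=1500, radius_cm=3))    # roughly 75 g
# a typical benchtop clinical centrifuge, for comparison
print(relative_centrifugal_force(rpm=3000, radius_cm=10))   # roughly 1,000 g
```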

The tests they came up with were relatively simple. First, they put small samples of blood into three slender tubes and taped each tube to a different arm of a store-bought spinner. Then they spun the spinner as you typically would, letting it stop on its own before flicking it again, and kept going until they could see a decent amount of the distinctive yellowish plasma.

On average, it took around four to seven minutes for plasma to separate out, with three to five finger flicks needed. Tests showed that, on average, about 30 percent of the total plasma in a sample had been separated out by the fidget spinning, and the extracted plasma was 99 percent pure.

To take the test a step further, they laced the blood with a protein belonging to HIV-1, the most common type of the virus. And when they screened the separated plasma with a paper-based detection test looking for that specific protein, they were able to confirm the presence of the virus. All in all, the fidget centrifuge had done its job well enough.

Their findings were published in December in Analytical Chemistry.

Chen said that cheaper, low-tech centrifuges would be a boon to doctors and hospitals in areas where medical resources are limited. You wouldn’t have to worry about keeping blood samples preserved for long periods of time before they can be transported to major testing centers. And the added speed in getting back results should make people more willing to go through with testing.

“My personal interest is to develop diagnostic systems for use in resource-limited areas, so I always tell my students to imagine that they will do all the tests in the desert and they can only carry a backpack with them,” said Chen.

As it turns out, this isn’t the first time scientists have tried to create a low-tech centrifuge. In 2017, bioengineers at Stanford University were able to create and successfully test out a paper centrifuge with materials totaling 20 cents. The design of the paperfuge was also based on an ancient spinning toy, the whirligig.

Given the recent fidget spinner fad, though, there should be plenty of discarded spinners to retrofit into centrifuges. Chen noted that his four-year-old son has three fidget spinners of his own.

Chen and his team are already trying to test out their spinner centrifuges in the African country of Malawi, conducting blood tests on site. They’re also testing custom-made, 3D-printed handheld devices that should work as even more efficient low-tech centrifuges. They hope to publish research on those devices later this year.

[Analytical Chemistry via the American Chemical Society]

via Gizmodo https://gizmodo.com

January 24, 2019 at 09:15AM

Cool: Guy Takes Stroll Atop World’s Deepest Lake With Crystal Clear Ice On Top

https://geekologie.com/2019/01/cool-guy-takes-stroll-atop-worlds-deepes.php


This is a six-minute video of a man walking around on Lake Baikal (the world’s deepest lake) in Siberia while it’s covered by a nearly 6-inch-thick sheet of crystal clear ice. In his own words while I swish my feet around in the shower and pretend I’m walking on water, but am really just trying to make sure all the pee goes down the drain:

“Walk on the incredibly transparent, crystal clear ice. It felt like I was standing on the water or walking on a very fragile glass. Despite this, I knew it’s safe to be there as thickness of ice was about 15cm

I could clearly see the bottom of the lake, stones, fish. All of this made me feel like I was looking into a fairyland, which never existed anywhere apart from dreams. Such a miracle. Iced Baikal is something really extraordinary and mesmerizing”

Obviously, with a max depth of 1,642 meters (5,387 feet), he’s not looking at the bottom of the deepest portion of the lake, but some of its much, much shallower waters towards the shore. And speaking of shallow– “You should never judge a book by its cover.” Exactly, especially since *removing book jacket* TA-DA! “It’s actually a nudie magazine.” Works great everywhere there isn’t somebody looking over your shoulder.

Keep going for the whole video.

Thanks to Jacoby, who agrees it would have been even more nuts to see a body under that ice.


via Geekologie – Gadgets, Gizmos, and Awesome https://geekologie.com/

January 23, 2019 at 04:30PM

Scientists Reconstruct an Object by Photographing Its Shadow

https://www.wired.com/story/scientists-reconstruct-an-object-by-photographing-its-shadow


Vivek Goyal isn’t a professional photographer, but he and his colleagues have developed an intriguing party trick: they can capture the image of an object completely out of sight.

They demonstrated the trick in a windowless room on the Boston University campus, where Goyal works as an electrical engineering professor. In the room, a flat-screen monitor displayed a series of crude drawings created by Goyal’s graduate student, Charles Saunders. Among them were several masterpieces: A mushroom that resembles Toad from Mario Kart, a Simpsons-yellow dude wearing a sideways red baseball cap, the red letters “BU” for school pride. These are the images that Goyal and his team wanted to capture while pointing the camera lens in a completely different direction.

In the darkened room, the flickering of the screen produced a dim blobby blur on the opposite wall. Using a camera mounted on a tripod, Saunders took 20 quick snaps of the blob for a total exposure time of 3 seconds, and fed it all into a computer program. A few minutes later—voilà. A blurred image of Toad, slightly askew, popped up on their screen.

“This is not magic,” Goyal tells me, in case anyone was confused.


It’s actually more like forensics, as he describes in a paper published Wednesday in Nature. The screen vomits light at the wall, and the camera records the scrambled aftermath. Making use of the fact that light travels predictably according to the laws of physics, the researchers designed algorithms to retrace the trajectory of the light rays and reproduce what was on the screen. Theoretically, you could photograph not just the screen, but any dimly lit object in the same room.
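
In broad strokes, this kind of reconstruction can be framed as a linear inverse problem: the brightness measured at each patch of wall is a weighted mix of the hidden screen’s pixels, y ≈ A·x, where the mixing matrix A is computed from the room’s geometry, and the screen image x is recovered by regularized least squares. The toy sketch below illustrates that idea in one dimension with an invented mixing matrix; it is not the authors’ model or code.

```python
import numpy as np

# Toy 1D sketch of a light-transport inverse problem
# (assumed geometry, not the Nature paper's implementation).
n = 64
rng = np.random.default_rng(0)

x_true = np.zeros(n)
x_true[20:30] = 1.0                      # a simple bright patch on the hidden "screen"

# Mixing matrix A: each wall patch sees a broad, distance-weighted blend
# of screen pixels (a crude stand-in for a physics-based transport model).
idx = np.arange(n)
A = 1.0 / (1.0 + 0.05 * (idx[:, None] - idx[None, :]) ** 2)
A /= A.sum(axis=1, keepdims=True)

y = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy measurement on the wall

# Tikhonov-regularized least squares: argmin_x ||A x - y||^2 + lam ||x||^2
lam = 1e-3
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
print(np.round(x_hat[15:35], 2))                 # recovered values near the bright patch
```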

Their work is part of a larger effort to design cameras that can peer around corners, also known as non-line-of-sight cameras. These devices could eventually help self-driving cars avoid collisions, firefighters rescue people from burning buildings, or governments spy on adversaries.

But past rigs required expensive hardware: pulsed lasers and extremely sensitive detectors, for example. In contrast, Goyal’s setup used an off-the-shelf 4-megapixel digital camera that costs around $1400, which his team estimates is at least 30 times cheaper than previous setups. “They’ve demonstrated that in some situations, you really don’t need expensive sensors,” says Gordon Wetzstein, an electrical engineer at Stanford University, who was not involved with the work but is developing a similar camera, using lasers.

With such simple hardware and algorithms that don’t require too much computing power, the technology might have been possible five, ten years ago, if people had just thought of it. “It feels more like a discovery than an invention,” says Goyal.


Curiously, the key to photographing the hidden object is to block part of it. Goyal’s team stuck a piece of black foamboard on a metal stand in front of the screen. The foamboard casts a shadow on the wall, and shadows give the computer algorithm more information to work with. Imagine taking a nighttime stroll in your neighborhood and seeing several shadows of yourself on the ground from the surrounding street lamps. If you wanted to, you could calculate the locations of all the lamps by looking at the orientation of your shadows. Without the shadows, it would be much harder to figure out where the lamps were. Reconstructing Toad is a more complicated version of mapping the street lamps, says Goyal.
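
One way to see why the occluder helps, at least in a toy model: blocking some screen-to-wall light paths stamps sharp structure onto the mixing matrix, which typically makes it far better conditioned and therefore easier to invert. The snippet below compares condition numbers for an invented smooth mixing matrix with and without a crude shadow mask; the geometry is made up purely for illustration.

```python
import numpy as np

# Toy comparison (assumed geometry): how an opaque occluder between the
# hidden scene and the wall changes the conditioning of the mixing matrix.
n = 64
idx = np.arange(n)

# Without an occluder: a smooth, distance-weighted mixing matrix.
A_open = 1.0 / (1.0 + 0.05 * (idx[:, None] - idx[None, :]) ** 2)

# With an occluder: zero out the light paths it blocks
# (a crude, hypothetical shadow pattern along the wall).
A_occluded = A_open.copy()
blocked = np.abs(idx[:, None] + idx[None, :] - n) < 8
A_occluded[blocked] = 0.0

print("condition number, no occluder:  ", np.linalg.cond(A_open))
print("condition number, with occluder:", np.linalg.cond(A_occluded))
```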

The work builds on a 2017 MIT study that used shadows to record people walking behind a corner wall. In that study, the researchers showed they could use the dim shadows to track how people were moving. But at the time, they could not capture illustrative images. Now, Goyal’s team has demonstrated much higher-resolution pictures, and in color for the first time. “It’s a really new field, and it’s advancing really quickly,” says Saunders.

Experts are excited about the multitude of potential applications. You could stick the cameras on robots to help them maneuver, says Goyal. They could make self-driving cars safer. Wetzstein mentions medical imaging. “You could use it to detect tumors, or to see around bones,” he says.

And also: better government surveillance. Both Goyal’s and Wetzstein’s work has been funded through a Darpa program called Reveal, which sponsors research on how to produce images using very little light. Future cameras based on Goyal’s or Wetzstein’s techniques could conceivably be installed on drones for spying.

But don’t jump the gun, says Wetzstein. They’re still figuring out how well this technology works. The pictures are far too pixelated to be useful for a lot of applications. And even though researchers can see around corners in a controlled lab setting, they’re about a decade away from building the first prototype devices, he says. The camera can see Toad from around a barrier, but it’s not good enough to make out your face—yet.



via Wired Top Stories http://bit.ly/2uc60ci

January 23, 2019 at 12:09PM

China’s Latest Cloned-Monkey Experiment Is an Ethical Mess

https://gizmodo.com/chinas-latest-cloned-monkey-experiment-is-an-ethical-me-1831987348


The cloned monkeys
Screenshot: Science China Press

Chinese researchers have cloned five gene-edited monkeys with a host of genetic disease symptoms, according to two scientific papers published today.

The researchers say they want to use the gene-edited macaques for biomedical research; basically, they hope that engineering sick primates will reduce the total number of macaques used in research around the world. But their experiment is a minefield of ethical quandaries—and makes you wonder whether the potential benefits to science are enough to warrant all of the harm to these monkeys.

The researchers began by using CRISPR/Cas9 to alter the DNA of a donor macaque. CRISPR/Cas9 is the often-discussed gene-editing tool, derived from a bacterial defense system, that pairs a guide sequence with a DNA-cutting enzyme to make targeted changes to DNA. Experts and the press have heralded it as an important advance because of how quickly and cheaply it can alter DNA, but recent research has demonstrated it may cause more unintended effects than previously thought.

In this experiment, the researchers turned off a gene called BMAL-1, which is partially responsible for the circadian rhythm. Monkeys with this gene turned off demonstrate increased anxiety and depression, reduced sleep time, and even “schizophrenia-like behaviors,” according to a Science China Press release. Circadian rhythm disruption has even been linked to diabetes and cancer.

The team then transferred nuclei from the donor monkey’s tissue cells into enucleated egg cells in order to create the clones. The result was five cloned monkeys, each exhibiting symptoms of the genetic disorder introduced into the donor monkey. The scientists described their results in a pair of papers published in the National Science Review.

But why? The researchers hoped to generate custom model organisms closer to humans in order to understand genetic diseases, including brain diseases, cancer, and other disorders, and to find potential cures for them, Hung-Chun Chang, senior author and an investigator at the Chinese Academy of Sciences Institute of Neuroscience, said in a statement. By creating genetically identical animals, perhaps they could pin down causes and cures more quickly.

This research combines a ton of ethical issues into one package, from those surrounding animal rights to cloning to gene editing. As bioethicist Carolyn Neuhaus from The Hastings Center summarized her reaction to the announcement: “Whoa, this is a doozy.”

“It’s very clear that these monkeys are seen as tools,” she explained. The team of scientists behind the new announcement touts the monkeys’ suffering—anxiety, depression, and “schizophrenia-like behaviors”—as a success. She noted that the researchers didn’t have a scientific hypothesis or treatment they were attempting to prove or disprove; instead, they were basically just trying to see what would happen if they edited a crucial gene. It’s kind of like deleting a cryptically named file from your computer’s system folder just to see what will happen.

Additionally, these monkeys might appear to be showing behaviors that look like analogues of human diseases, but the comparison between monkeys and humans only goes so far. It’s possible that turning off the BMAL-1 gene would have different effects in humans. And finally, the monkeys’ identical DNA might instead be a weakness. Humans display plenty of genetic differences, so a cure that works for identical macaques might not work when applied to the far messier human genome.

That’s not to say we shouldn’t genetically edit monkeys, said Neuhaus. But, she said, “if I were on an ethics review committee, I would be very hesitant to approve [this research] because of the incredible amount of harm to the animals. I would expect the scientists who are proposing this research to have very good responses to very hard questions about their methods and the expected benefits of their research.”

Gizmodo has reached out to Chang regarding these ethical concerns, but he did not respond by the time of publication. We will update the post when we hear back.

Things are moving fast in the realms of cloning and gene editing. Chinese scientists (including some of the same researchers behind this new work) announced the birth of the first-ever cloned monkeys almost a year ago to the day, and we noted at the time that this might be a potential use for the cloned monkeys. The CRISPR technique is already being used in people, and Chinese scientist He Jiankui recently claimed to have produced the first gene-edited human babies, sparking outrage from many other scientists. The Chinese government has said he broke the law.

Scientists are already discussing the ethics of gene-editing humans, and some have cautiously agreed that it’s “morally permissible” to genetically modify babies in some cases. New technology always brings up new ethical questions, but the principle that potential benefits should outweigh harms is not a new one. In this latest case, it’s not at all clear the research meets that basic standard.

via Gizmodo https://gizmodo.com

January 23, 2019 at 12:21PM

Amazon starts testing its ‘Scout’ delivery robot

https://www.engadget.com/2019/01/23/amazon-scout-delivery-robot/




Amazon is working on delivery robots, and it’s already bringing the self-driving machines to the streets. Starting today, six Amazon Scout devices are delivering packages in a neighborhood in Snohomish County, Washington, north of Amazon’s Seattle home base. While the robots can navigate by themselves, an Amazon employee will accompany them, at least for now.