The Most Comfortable Shoes Are Made of Merino, No Socks Required

Forget merino socks, it’s time for merino shoes.


Allbirds are the most comfortable shoes I’ve ever worn, no contest, but they’re also incredibly functional. Allbirds are lightweight, odor resistant, moisture wicking, machine washable, temperature regulating, and environmentally conscious, and come in under $100.

Merino is a wonder material that helps keep you cool when it's hot and warm when it's cold, while wicking water away from your skin to keep you dry. – Indefinitely Wild


Hopefully you’ve already replaced some (or all) of your basics like undershirts, underwear, and socks with fabrics like merino, modal, and tencel, but Allbirds go one step (get it) further. These shoes are comfortable enough and cool enough that socks are now optional.

I’ve been wearing mine constantly from the office to dog walking to light hiking since I got them.


Commerce Content is independent of Editorial and Advertising, and if you buy something through our posts, we may get a small share of the sale. Click here to learn more, and don’t forget to sign up for our email newsletter. We want your feedback.

from Lifehacker http://ift.tt/2cxGb1P
via IFTTT

Giant E Ink Screens Turn Trucks Into Dynamic Rolling Billboards

Despite the gloriously colorful screens used in devices like the new iPhone 7, monochromatic E Ink displays have remained a popular choice for devices like e-readers since they’re cheap, durable, and work fine in direct sunlight. It also means they’re the perfect technology for turning trucks into in-your-face rolling billboards.


from Gizmodo http://ift.tt/2cF6Lpa
via IFTTT

Stanford has developed a roadside breathalyzer for weed

Blood, breath and urine. These are the holy trinity of determining alcohol intoxication but are virtually useless when measuring the amount of THC in your system, thanks to the molecule's ability to remain present in bodily fluids for up to a month after consumption. However, a technological breakthrough from Stanford University could soon enable law enforcement to accurately determine how blunted you are as soon as they pull you over.

Rather than the three standard fluids, Stanford's "potalyzer" measures the amount of THC present in your saliva. It's reportedly sensitive enough to measure THC concentrations from 0 to 50 nanograms per milliliter of spit. The system, developed by Dr. Shan Wang and his team, uses magnetic biosensors to detect the THC molecules present in saliva. The technology actually grew out of Wang's earlier research into in vitro cancer diagnostics and magnetic information storage.

The test itself involves first mixing the saliva sample with antibodies that bind to the THC molecules and act as markers. The sample is then spread on a test strip that's been pre-coated in THC and loaded into a handheld measuring device. The more THC that's present in the sample, the fewer antibodies will be free to bind with the THC on the test strip. By measuring the amount of unbound THC on the test strip, the system can accurately estimate how much THC was present in the initial sample. This estimate is then confirmed by applying magnetic nanoparticles that are precisely engineered to bind only with the THC-antibodies and measuring the electrical differential. The meter's results are then displayed on a Bluetooth-connected mobile device.
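The key idea in a competitive assay like this is the inverse relationship: more THC in the sample occupies more antibodies, leaving fewer free to bind the pre-coated strip, so a weaker strip signal implies a stronger sample. Here's a toy sketch of that relationship; every constant and function name is invented for illustration, since the device's real calibration curve and sensor physics aren't public:

```python
# Toy model of a competitive immunoassay readout.
# All numbers are invented for illustration only.

TOTAL_ANTIBODY = 50.0  # arbitrary units of marker antibody added to the sample

def strip_signal(sample_thc_ng_ml):
    """THC in the sample occupies antibodies, so fewer remain free
    to bind the THC-coated strip; signal tracks free antibodies."""
    bound_in_sample = min(sample_thc_ng_ml, TOTAL_ANTIBODY)
    free_antibody = TOTAL_ANTIBODY - bound_in_sample
    return free_antibody  # proportional to the measured strip signal

def estimate_thc(signal):
    """Invert the toy calibration: a weak signal implies high THC."""
    return TOTAL_ANTIBODY - signal

for true_conc in (0, 10, 25, 50):
    print(true_conc, estimate_thc(strip_signal(true_conc)))
```

A real assay's calibration curve is nonlinear, but the same inversion step, from measured signal back to estimated concentration, is what the handheld reader performs.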

Interestingly, this technology could easily be applied to almost any small molecule, including morphine, heroin, meth or any number of illicit or prescription substances. Of course, even if the potalyzer works as advertised, we're still going to have to wait for existing laws to catch up to the technology. In Colorado, for example, where recreational cannabis use is perfectly legal, there is no strict limit on the amount of THC in your system that determines DWI culpability: the entire process is left up to the officer's discretion. While devices like this can help curb discretionary abuses by law enforcement, more research into how tolerance affects a person's ability to handle different amounts of THC will be necessary to prevent unfair, arbitrary legal limits.

from Engadget http://ift.tt/2cqA6Sp
via IFTTT

I drove around Pittsburgh in a self-driving Uber

"Did you do that, or did the car do that?" I first asked that of my self-driving Uber’s "safety driver" when the car pulled out of the lane it was in to go around a pedestrian on the side of the road. I then asked it another half-dozen times throughout the 30 minutes I spent as a passenger in one of Uber’s autonomous cars that are hitting the streets of Pittsburgh today. Nearly every time, the answer came back: "the car did that."

Indeed, my time as a passenger in the self-driving Uber as it drove around downtown Pittsburgh was blessedly uneventful — and in that relative safety and peace, I got an up-close look at what the challenges will be in making autonomous vehicles a widespread reality. I even got behind the wheel to "not drive" the car for myself.

For starters, it's important to note that I never once felt like the car made any unsafe maneuvers. It obeyed the speed limit, left plenty of space between it and the car in front of it, took turns slowly and smoothly and generally behaved like an excellent citizen in vehicular society. It decides on speed first by determining a safe following distance from the vehicle ahead, then by driving at the speed limit when it can. Uber's engineers can also flag specific roads where the car may exceed the speed limit to move safely with the flow of traffic.
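That two-step policy, hold a safe following distance first, then drive the limit (or an engineer-approved override) when the road is clear, amounts to taking the minimum of two ceilings. A rough sketch, where all names and numbers are assumptions rather than anything from Uber's actual software:

```python
def target_speed(speed_limit_mph, gap_limited_speed_mph, flow_override_mph=None):
    """Pick a target speed: never exceed what the following-distance
    constraint allows, and otherwise drive the posted limit -- or an
    engineer-flagged override on roads where matching traffic flow
    is the safer choice. Illustrative only."""
    ceiling = flow_override_mph if flow_override_mph is not None else speed_limit_mph
    return min(gap_limited_speed_mph, ceiling)

print(target_speed(25, 40))      # open road: drive the limit -> 25
print(target_speed(25, 18))      # slow car ahead: hold the gap -> 18
print(target_speed(55, 70, 62))  # flagged road: flow override -> 62
```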

All told, it was a pretty boring ride — aside from the fact that a freaking computer was driving me around downtown Pittsburgh. That's all the more impressive when you consider how much harder it is to make a self-driving car operate around a city than on the highway (as Tesla's Autopilot feature does).

A freaking computer drove me around downtown Pittsburgh.

The times when my safety driver had to take control were less about the car doing something unsafe and more about it being confused by what its many sensors and cameras were recording. For example, the car didn't deal very well with a double-parked truck. It read the truck as a vehicle stopped in the road, but it didn't have the context to know that it wasn't going to move any time soon, so we just sat behind it until the safety driver pulled around it.

The car also had a tough time dealing with a four-way intersection — while an autonomous car will obey the letter of the law, humans don't. So, with safety as a top priority, the car sat at the stop sign, waiting for crossing cars to come to a complete stop before it would enter the intersection. But most people out there don't come to a complete stop at a stop sign, so we just sat and waited while multiple cars crossed in front of us, their drivers glancing curiously at the strange bed of sensors on top of our vehicle.

Uber's cars will likely learn these intricacies sooner than later, and I got to see examples of that learning on display in my drive. Apparently, when you're stopped at a red light in Pittsburgh, it's customary to let the first car across from you take a left turn if they need to before continuing straight through the intersection (it's called the "Pittsburgh left," appropriately). The autonomous cars are thus programmed to take a little pause before continuing through an intersection when the car across from them has its left blinker on. That's not about driving "right" or "wrong" — it's about knowing local rules of the road and respecting them. Every area these cars go into will have its own quirky rules like this that they'll need to learn.

The few hiccups we encountered didn't really detract from the experience; the overall ride was as smooth as any I've had with a human driver behind the wheel. The autonomous system is finely tuned to provide a smooth and safe ride, and it never accelerated or decelerated in a way that made me feel uncomfortable. If you've taken a cab around any major city, you've probably experienced some car sickness from a driver with a heavy foot on both the brake and gas, but there was none of that here.

I’m someone with a rather sensitive stomach, but I felt fine for the rather lengthy ride I went on. In fact, the ride almost felt too smooth, too in control. Like a computer was driving — which, of course, it was. That’s not a bad thing, but you can definitely tell the difference between a human behind the wheel and the autonomous system.

While sitting in the back of the Uber, I could look at an iPad mounted to show the riders some details on the car. You can see how far you’ve driven autonomously, the current speed and a graphic showing the movements of the steering wheel and when the brakes are applied. But most interesting was a view of what the car’s radar system is seeing at any given moment. You can see cars, buildings, pedestrians and anything else in range of the car. It’ll satisfy the curiosity of people interested in how the car works as well as provide some transparency and possible security to people skeptical about the system.

After cruising around Pittsburgh for a bit, I was offered my own chance to get behind the wheel. At first, I thought I was just getting a look at what the driver sees while they're behind the wheel, but nope — I was getting a chance to sit up front while the car drove me around. The most interesting thing about that experience was the strange awareness I needed to maintain while the car did its thing. I was tempted to look around and take in the sights of the city, because I felt totally comfortable letting the car drive.

Getting behind the wheel wasn’t any more nerve-wracking than riding in the back, because I was in complete control of the car.

Of course, the system is not even close to ready to have a driver totally check out, so I kept my hands touching the wheel and a foot ready to tap the gas or brake so I could take over. Fortunately, it’s dead simple to take control of the car: moving the steering wheel or applying any pressure to the brake or gas will deactivate the autonomous driving system. I took over the car a few times, mostly just to see how it worked, and it was dead-simple to both drive as normal and then hit a button near the shifter to put the car back into autonomous mode.
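The takeover rule as described, any steering input or pedal pressure disengages autonomy, and a button re-engages it, reduces to a simple check over the driver's inputs. This is a hypothetical sketch with invented thresholds, not Uber's implementation:

```python
def should_disengage(steering_delta_deg, brake_pressure, throttle_pressure,
                     steering_threshold_deg=1.0):
    """Any driver input beyond a small dead zone hands control back.
    The threshold value is invented for illustration; a tiny dead
    zone keeps hands resting lightly on the wheel from tripping it."""
    return (abs(steering_delta_deg) > steering_threshold_deg
            or brake_pressure > 0
            or throttle_pressure > 0)

print(should_disengage(0.2, 0, 0))  # hands resting lightly: stays engaged -> False
print(should_disengage(0, 0.5, 0))  # brake tap: driver takes over -> True
print(should_disengage(5.0, 0, 0))  # deliberate steering input -> True
```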

Full manual control is easy to activate by design, and it makes sense that Uber would want the press to see firsthand how simple it is to snap the car back into your control. That said, I could definitely see a situation in which a "safety driver" couldn't help but tune out a bit during a long shift behind the wheel. It's also not the easiest thing to keep your foot hovering over a pedal and your hands lightly gripping the wheel without accidentally engaging them.

The fundamentals appear to be in place for Uber, here in Pittsburgh at least. But there’s a long way to go before its cars can navigate all of the city, let alone other cities. A number of Uber engineers and spokespeople I talked to made it clear the focus was to build out Pittsburgh first, both in terms of increasing the area that autonomous cars could travel as well as fixing little oddities like its performance at four-way stop signs. Other cities will likely come in the future, depending on how the pilot goes, but right now all thoughts are focused on Pittsburgh.

One of the big challenges for Uber will be learning more about how the cars deal with inclement weather. That’s one of the reasons they’re testing in Pittsburgh — between the complexity of the old city’s layout (small streets, lots of one-way roads, lots of congestion) and the fact that it sees all kinds of weather, there will be a lot to learn from testing here. Uber engineers feel that if they can master Pittsburgh, they can make the system work pretty much anywhere. (I’m thinking both Boston and Manhattan will make for a serious challenge.)

Good luck mapping out Boston and Manhattan, Uber.

But it’s not clear exactly how Uber will deal with bad weather. The team said they’ve tested in rain and had good success thus far, but I wasn’t able to get a straight answer when I asked about how it’ll recognize and account for snow. It seems that it’ll be up to the safety driver to decide when to engage the autonomous features, and I have a feeling that in the winter these cars will be operated in traditional fashion to be on the safe side.

As much as the pilot is meant to gauge Uber's technical prowess, it'll also be a judge of how the public reacts to self-driving cars. In some ways, it's like Google's very public beta of Glass — except that no one was going to die if Glass went horribly wrong. Consumers will understandably be a bit nervous the first time they get into one of these vehicles. But with a human being behind the wheel and the cars operating at relatively low speeds around the city, the potential for true disaster seems pretty low.

The 1.3 million people who Uber says die every year in car accidents are a big part of why it's doing this in the first place. The company says that 94 percent of those accidents are caused by some variety of human error, and it believes that self-driving cars can see and process more than humans can, making them safer. There's a lot to be done before that's a reality, and Uber is definitely starting small. But right now, it has a lead on just about every other company working on self-driving cars.

from Engadget http://ift.tt/2cqF0zw
via IFTTT

Proterra’s electric bus can travel 350 miles before recharging

A startup called Proterra has been working on electric buses for years, and its latest model has a pretty impressive range. Its Catalyst E2 Series buses can drive up to 350 miles on a single charge, quite a bit farther than Tesla's top-tier Model S, which already boasts a 300-plus-mile range. The vehicle also outlasts its predecessor, which could only go 258 miles. As Wired notes, electric buses might even be a better fit for electrification than cars, since they don't need a huge network of charging stations. They drive a set route, so cities can simply install chargers where the buses are bound to pass, and the E2 might not even need to recharge until the end of the day. Further, not everyone can afford an electric vehicle, but most people can afford to ride a bus.

The Catalyst E2 Series buses are powered by two gargantuan, mattress-sized batteries that can store up to 660 kWh. A lightweight frame and a regenerative braking system also help the bus achieve that impressive range. The only thing that might hold cities and companies back from purchasing the E2 is the price: $799,000, more than twice the cost of a typical diesel bus. Proterra is probably hoping that government subsidies, coupled with savings on fuel and maintenance, will convince them to buy the vehicle. If you're in Los Angeles, you might be able to ride one of the first E2 buses, scheduled to hit the road in 2017.
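The stated pack size and range imply the E2's energy use per mile, and a back-of-the-envelope comparison hints at the operating savings Proterra is counting on. The battery, range and price figures below come from the article; the electricity and diesel numbers are rough assumptions for illustration, not Proterra's data:

```python
# Figures stated in the article.
battery_kwh = 660.0   # pack capacity
range_miles = 350.0   # single-charge range

kwh_per_mile = battery_kwh / range_miles
print(round(kwh_per_mile, 2))  # ~1.89 kWh per mile

# Assumed energy prices and diesel efficiency -- not from the article.
electricity_per_kwh = 0.10   # $/kWh, assumed commercial rate
diesel_per_gallon = 3.00     # $/gallon, assumed
diesel_mpg = 4.0             # assumed typical for a transit bus

electric_cost_per_mile = kwh_per_mile * electricity_per_kwh
diesel_cost_per_mile = diesel_per_gallon / diesel_mpg
print(round(electric_cost_per_mile, 2))  # ~$0.19 per mile
print(round(diesel_cost_per_mile, 2))    # ~$0.75 per mile
```

Under these assumed prices the per-mile fuel cost gap is large, which is exactly the kind of long-run saving that could offset the roughly doubled purchase price.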

Source: Wired, Proterra

from Engadget http://ift.tt/2cYG7s7
via IFTTT

How to master computer science, minus the student loans

These days, you don't need to spend a fortune on college tuition to gain valuable professional skills. The Complete Computer Science Bundle offers a full programming education without saddling you with the next ten years of student debt. Right now, Engadget readers can get this 8-course bundle for nearly 90 percent off the retail price, at just $39.

You’ll learn essential programming languages like C, C++, and Java through nearly 80 hours of hands-on instruction, and get familiar with database management and other core concepts of computer science. You’ll also learn about Raspberry Pi and how to build products for the Internet of Things.

Here’s what’s included in your bundle:

  • From 0 to 1: C Programming – Drill Deep
  • Byte Size Chunks: Java Object-Oriented Programming & Design
  • From 0 to 1: Data Structures & Algorithms in Java
  • From 0 to 1: SQL And Databases – Heavy Lifting
  • From 0 to 1: Learn Python Programming – Easy as Pie
  • Learn By Example: C++ Programming – 75 Solved Problems
  • From 0 to 1: Raspberry Pi and the Internet of Things
  • Case Studies: Facebook, Twitter, LinkedIn, Apple

If you’re interested in learning all about the latest applications of computer science, don’t miss out on the Complete Computer Science Bundle, now down to just $39.

Bonus: We’re giving away an Oculus Rift and HTC Vive VR Headset—enter for your chance to win the one of your choice.

Engadget is teaming up with StackCommerce to bring you deals on the latest gadgets, tech toys, apps, and tutorials. This post does not constitute editorial endorsement, and we earn a portion of all sales. If you have any questions about the products you see here or previous purchases, please contact StackCommerce support here.

from Engadget http://ift.tt/2cZOZOc
via IFTTT