Here’s The Main Problem With Tesla’s Supercharger Network

https://jalopnik.com/heres-the-main-problem-with-teslas-supercharger-network-1840110802

For all the crap we give Tesla, they rightfully deserve to be lauded for the Supercharger network of fast-charging stations. You want to drive across the country in your electric car? Easy peasy, thanks to the ability to quick-charge to roughly an 80% tank of electron juice in about 20 minutes. Unless you and some of the hundreds of thousands of other Tesla owners want to use the exact same Supercharger station at the exact same time.

I’ve written whole rants on the topic, but I really do love the Supercharger system. While companies like GM twiddle their thumbs and pen useless op-eds for CNN wondering who, WHO, could possibly build the infrastructure their cars require for them, Tesla just went out and did it. The freedom those literally thousands of Superchargers enable is why, for all of Elon’s inanity, if I were in the market for an expensive electric car at the moment, I’d probably get a Tesla over anything else.

But “literally thousands” of Superchargers isn’t enough. It’s not enough for the millions of cars Tesla itself wants to build, and other automakers have barely started playing catch-up, let alone come close to matching the number of Superchargers that already exist.

Basically, it’s going to take some brave corporations millions, if not billions, of dollars to start building quick-charging stations on the scale of today’s gas stations.

Until then, we’ll get scenes like this in high-Tesla-ownership areas, shot by a reader named Steve at the San Luis Obispo, California, Supercharger station on Thanksgiving Day:

Build a big restaurant and a Kwik-E-Mart and an arcade and whatever the hell it is people use to kill 20 minutes, and you’ll make a killing yourself.

Oh hell, I’ll do it. Anyone got a couple billion?

H/t to Steve!

via Gizmodo https://gizmodo.com

November 30, 2019 at 11:45AM

Say What Now?: US Army’s Plan For The Human Cyborg Soldiers Of 2050

https://geekologie.com/2019/12/say-what-now-us-armys-plan-for-the-human.php

According to a recent report released by the U.S. Army’s Combat Capabilities Development Command (DEVCOM), these are some of the cyborg capabilities the Army soldiers of 2050 might have. Will they become reality? LOL, Earth surviving to 2050 — good one, Army.

The ‘thought experiment’ involved dozens of scientists, military personnel, ethicists and other experts discussing future technologies, what impact cyborgs would have on society and how they would change warfare.

The study broke the future design of a cyborg soldier down into the main areas of enhancement likely to be possible by 2050. It examined changes to the eyes, ears, brain and muscular system through four ‘case studies’ looking at the different technologies that could be developed and what impact they would have on society and warfare.

The study predicted that human-machine enhancements would become widely available before 2050 and would likely be led by medical use rather than the military.

I thought one of the more interesting ethical points made in the article was whether soldiers who had been augmented would need to be "throttled back" (read: restored to factory settings) for the return to civilian life, or whether they’d just use their abilities to bankrupt casinos and everything else I would do if I were enhanced in any way besides naturally, in the penis.
Thanks to Thaylor H, who agrees at what point do you just start using 100% androids?

via Geekologie – Gadgets, Gizmos, and Awesome https://geekologie.com/

December 2, 2019 at 11:00AM

Intel is losing against AMD

https://www.engadget.com/2019/11/28/amd-beating-intel-threadripper-ryzen/

When AMD launched its third-generation Zen 2 Ryzen processors earlier this year, Intel had to be sweating. Its rival had developed an all-new architecture with improvements to clock speed, core count and instructions per clock and promised performance that matched — and even beat — Intel’s CPUs.

Then along came AMD’s mainstream 12-core and 16-core Ryzen 9 3900X and 3950X CPUs, which doubled the thread count of competing i9-9900 series chips. That not only threatened Intel’s gaming market but also muscled in on its workstation territory. To compete, Intel was forced to launch the Cascade Lake i9-10980XE at $999, half the price of the previous 9980XE model. Unfortunately for Intel, the 3950X still keeps pace with the i9-10980XE in most tests, despite costing just $750.

Reviews are now in for AMD’s latest 32-core 3970X and 24-core 3960X Threadripper processors, and it’s more bad news for Intel. Those high-end desktop (HEDT) chips perform better for both video and 3D rendering than Intel’s latest i9-10980XE CPU. Furthermore, they’ve even made many of Intel’s high-end workstation Xeon CPUs obsolete. And the kicker is that AMD has yet to release the 64-core Threadripper 3990X.

Intel still has a lead in gaming, but only just, and given AMD’s progress and recent Zen 3 announcements, is any market safe? Let’s explore the state of this rivalry and how Intel might counterpunch.

A rapid rise

Ever since AMD launched the Zen architecture just over three years ago, it has progressed at a relentless pace. At the time, Intel’s $1,100 8-core i7-6900K seemed like it had all the cores you’d ever need. However, AMD moved the goalposts, unveiling the 8-core Ryzen 7 1800X at a much lower $499 price point. That was followed a little later by the $999 Threadripper 1950X with no fewer than 16 cores.

Intel had already unveiled the 18-core i9-7980XE, but it cost double the price at $1,999. Yes, it had a significant speed edge over the Threadripper 1950X, but AMD was starting to close the gap for multi-threaded workstation CPUs — a key and very profitable market for Intel.

AMD quickly changed the game again with its Zen+ (second-generation) Ryzen and Threadripper CPUs. When it first shipped, the 32-core, $1,799 Threadripper 2990WX was nearly on par with Intel’s best workstation chips, including the $3,000 28-core Xeon W-3175X.

Several months later, Intel launched the $2,000 i9-9980XE. It was still a better option for clock-speed-sensitive tasks like Adobe Creative Suite and gaming than AMD’s Threadripper or Ryzen 7 chips, thanks to its stronger per-core performance and better memory architecture. However, the 2990WX could hold its own for multithreaded rendering, thanks to the sheer number of cores, and took the lead in tile-based rendering scenarios like Blender.

AMD 16-Core Ryzen 3950X processor

When AMD announced the 16-core Ryzen 9 3950X for just $750, it had good reason to be confident. Some pretty huge architectural improvements and increased clock speeds made performance significantly stronger than the Core i9-9980XE, at little more than a third of the price. To counter, Intel launched the 18-core Core i9-10980XE and cut the price in half, down to $1,000 (something it likely wouldn’t have done without the competition), but it still wasn’t enough.

AMD has replied again with the 24- and 32-core Threadripper 3960X and 3970X CPUs, priced at $1,399 and $1,999, respectively. According to reviews, those chips are competitive with the i9-10980XE for clock-speed-sensitive tasks, but they’re now handily winning when it comes to multi-threaded workstation rendering performance, albeit at a significantly higher price point.

In a short time, AMD has progressed from eight to 32 cores and moved from a 14-nanometer process down to 7 nanometers, while making big architectural improvements to cut the gap in IPC, or instructions per clock. It has also refined its manufacturing technique for "chiplets," which are a very efficient way to build multi-core CPUs. Meanwhile, Intel is still stuck on a refined 14-nanometer++ process for its key gaming and workstation parts and has scrambled to stay competitive in the multi-core segment.

AMD takes over workstations

AMD threadripper workstation HEDT

AMD’s new Threadripper chips pose a huge problem for Intel. Yes, they’re more expensive than the consumer-oriented Core i9-10980XE, but Intel now has nothing to counter them in the HEDT market. The closest chip it has is the 28-core Xeon W-3275M, but it costs nearly quadruple the price, at $7,453. (Last year’s $3,000 28-core W-3175X is far behind AMD’s 3970X chip in performance.)
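
To put those prices in perspective, here's a rough back-of-the-envelope comparison of list price per core, using only the prices and core counts quoted above (an illustrative sketch, not a benchmark; real value obviously depends on workload, street pricing and per-core performance):

# Rough list-price-per-core comparison using the figures cited in this article.
# Illustrative arithmetic only; it says nothing about actual per-core performance.
chips = {
    "AMD Threadripper 3960X": {"price": 1399, "cores": 24},
    "AMD Threadripper 3970X": {"price": 1999, "cores": 32},
    "Intel Core i9-10980XE":  {"price": 1000, "cores": 18},
    "Intel Xeon W-3275M":     {"price": 7453, "cores": 28},
}

for name, spec in sorted(chips.items(), key=lambda kv: kv[1]["price"] / kv[1]["cores"]):
    per_core = spec["price"] / spec["cores"]
    print(f"{name:<24} ${spec['price']:>5}  {spec['cores']:>2} cores  ${per_core:6.2f}/core")

Run that and the i9-10980XE actually comes out cheapest per core (about $56), with the Threadrippers close behind (roughly $58 to $62), while the Xeon W-3275M lands around $266, more than four times the 3970X. That gap, combined with the rendering results below, is why the new Threadrippers are such a problem for Intel's workstation lineup.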

Another huge plus for AMD is the new TRX40 platform for its latest HEDT chips. While it doesn’t let you drop the new Threadripper chips into old Zen 1 motherboards, the new architecture has some big advantages for creators. Mainly, it introduces PCIe 4.0 support, delivering much higher SSD speeds for 4K and 8K video editors. The new motherboards also offer more PCIe slots, support for up to 256GB of ECC RAM and more USB 3.0 ports.

In terms of performance, the Threadripper 3960X and 3970X chips trounce the Core i9-10980XE in every single workstation benchmark, even in single-threaded mode. So while the chips are more expensive, the rendering performance per dollar is considerably higher.

That equation will likely shift even more in AMD’s favor when the Threadripper 3990X comes out in 2020, as that chip should cost even less per core. That will help 3D animation and post-production facilities reduce costs as they’ll be able to get more rendering performance out of a single workstation or render node.

Intel’s still winning in gaming and mainstream computing (for now)

Intel Core i9-9900KS special edition processor

Intel’s primary gaming CPUs, like the i9-9900K and i9-9900KS, outperform AMD’s Ryzen 9 3950X by a comfortable margin, and they cost less to boot. However, the consumer-focused Ryzen 5 3600 and Ryzen 7 3700X cost less than the 9900K and are considered better picks than Intel’s i7 and i5 offerings.

Virtually every desirable high-end laptop out there packs Intel Core i9 or i7 chips, along with NVIDIA’s GTX or RTX graphics cards. Why is that? Intel still has the edge in overall clock speeds, though AMD has narrowed the gap over the last generation. At the same time, AMD seems determined to make its chips appeal to both workstation and gaming users, offering gaming-oriented chips like the 3700X with higher thread counts than competing Intel products. (That said, gaming performance depends much more on the GPU than the CPU, so slight differences in CPU performance aren’t that important.)

And when it comes to laptops, Intel simply dominates, with a huge portfolio of eighth-, ninth- and 10th-gen CPUs, all of which have built-in GPUs. It’s a lot of work for a manufacturer to switch from Intel to AMD: you can’t simply swap out a CPU, as the chipset and other motherboard architecture need to change with it. Advanced I/O like Thunderbolt has also been a key differentiator for Intel, although USB4 changes that. As it can’t really compete, AMD has seemingly avoided the high-performance segment. Its laptop chips simply have poor performance per watt next to Intel’s, and few AMD CPUs have integrated graphics. That could change, though, with its Zen 3 7-nanometer+ architecture. It doesn’t seem like AMD would have much trouble crashing this segment, too, if it wanted to.

Intel also owns the mainstream laptop market, particularly in the 15- to 25-watt CPU segment, where you’d find machines like the 13-inch MacBook Pro. Again, it’s hard for AMD to compete with the sheer number of parts Intel has across its eighth-, ninth- and 10th-gen mobile platforms. Cracking the laptop market seems more a question of will than a lack of technological prowess on AMD’s part. For now, it might feel it’s better off focusing on the more profitable high-end segment.

What’s to come

When Intel finally releases full-fledged 10-nanometer gaming chips, it will no doubt change the equation against AMD. From what we’ve seen so far, they’ll deliver modest performance gains of around 10 percent while sipping less power.

That should be enough to keep it ahead in the mainstream segment it already dominates — and possibly in gaming (depending on what AMD does). What about for workstations? At this point, it seems unlikely that Intel will be able to best AMD in the all-important performance-per-dollar category, unless it really pulls off something incredible.

AMD, meanwhile, says that its Zen 3 design is ready and will first appear on its Epyc server CPUs. It has promised an "entirely new" architecture based on a 7-nanometer+ manufacturing process. That could deliver performance gains in the neighborhood of 15 percent, with correspondingly lower power draw. At the same time, AMD will continue increasing core counts, memory bandwidth and I/O connectivity.

What’s also interesting is that AMD has promised to continue its current "tick-tock" design cycle. That means it will shrink the manufacturing process on the "tick" cycle, while changing the architecture on the "tock." Intel abandoned that strategy some years ago, dancing instead to the tune of a "three step."

Wrap-up

AMD Ryzen Threadripper 3990X 64-core CPU

None of this is any good for Intel. Even if 10-nanometer Ice Lake delivers the promised performance gains when it arrives next year, the chip giant won’t have long to enjoy it before AMD’s Zen 3 4000-series chips arrive.

With AMD now ahead in HEDT workstation performance, it could seriously bite into its rival in this high-profit segment. Intel still enjoys a lead in gaming, but that’s also at risk if AMD does a tick-tock while its rival is three-stepping.

As someone who does both gaming and video editing, I’d be more interested in AMD than Intel if I were in the market for a processor. To me, AMD feels like the future of high-end desktop computing, with more interesting and advanced technology across the board. Meanwhile, Intel is struggling just to get its existing products out, let alone create new ones. It really needs to do something to shake things up, or it could soon find itself in an unfamiliar position: second place.

via Engadget http://www.engadget.com

November 28, 2019 at 08:48AM

Researchers develop E. coli strain that ‘eats’ carbon dioxide

https://www.engadget.com/2019/11/29/researchers-develop-e-coli-strain-that-eats-carbon-dioxide/

While you’re stuffing turkey leftovers in your belly, the last thing you want to think about is E. coli. But spare a thought for the bacterium; it’s not always here to harm you (and it needs to eat, too). According to a new paper published in Cell, scientists have developed a strain of E. coli that feeds on carbon dioxide. As Nature explains, the bacteria usually prefer sugars (glucose), but the lab-created strain could be used to create biofuels with a lower emissions footprint than conventional production methods.

E. coli, for all its bad press, has already been used to do many useful things. Several years ago, researchers managed to store encrypted data in the microorganisms, and there are even E. coli-based "computers."

This isn’t the first time we’ve seen carbon-guzzling strains, either, but previous efforts have only consumed CO2 as a small part of their "diet" compared to this latest generation. If you were hoping the new bacteria could be used to suck CO2 out of the air and help save the planet, sadly that’s not viable right now, not least because the modified bacterium currently emits more CO2 than it consumes. But the team behind the research does claim the strain could be used to develop "food," and hopes that switching to electricity as an energy source might reduce those emissions.

As appetizing as E. coli food sounds, we’ll have to wait a long time to find out what the dinner plate of the future looks like. The researchers say the work is mostly a proof of concept at this time, so our dreams (nightmares?) of an E. coli-based holiday dinner are still some way off. Something else to be thankful for?

Via: Nature

Source: Cell

via Engadget http://www.engadget.com

November 28, 2019 at 11:42PM