Japan Display Develops 1.6-Inch Micro LED Display Module: 265 PPI & 3,000 Nits

https://www.anandtech.com/show/15174/japan-display-develops-16inch-micro-led-display-module-265-ppi-3000-nits

Japan Display Inc. (JDI) announced this week that it has completed development of its first Micro LED module. The prototype module, a potential building block for next-generation displays and TVs, offers a 265 PPI pixel density. JDI will be demonstrating the module at Finetech Japan 2019 later this week.

JDI’s Micro LED display prototype is a square module with a 1.6-inch diagonal, and offers a resolution of 300×300 pixels along with a maximum brightness of 3,000 nits. The prototype uses gallium nitride LED chips developed by glo (a Micro LED pioneer) and JDI’s LTPS (low-temperature polysilicon) backplane.

JDI’s Micro LED Display Module Prototype
Screen size: 1.6 inches
Resolution: 300×300×RGB
Pixel density: 265 PPI
Luminance: 3,000 cd/m²
Viewing angle: >178°
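
As a quick sanity check on those figures (my arithmetic, not from the article): pixel density is the diagonal pixel count divided by the diagonal length in inches, and 300×300 pixels across a 1.6-inch diagonal works out to roughly 265 PPI, matching JDI’s spec.

```python
# Sanity-check JDI's quoted pixel density (illustrative arithmetic only).
import math

h_px, v_px = 300, 300        # horizontal and vertical resolution
diagonal_in = 1.6            # panel diagonal in inches

diagonal_px = math.hypot(h_px, v_px)   # ~424.3 pixels along the diagonal
ppi = diagonal_px / diagonal_in        # pixels per inch

print(f"{ppi:.1f} PPI")      # 265.2 PPI
```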

As part of the development process for Micro LED displays, the hope is that display makers can assemble full-sized displays out of individual modules such as JDI’s. A modular approach not only offers more control over yields (rather than having to scrap an entire full-sized panel over a single flaw), but it would also allow manufacturers to easily support multiple resolutions and aspect ratios simply by changing the number of modules. That said, producing commercial panels in volume is still years off, and for now Japan Display has made significant progress simply by completing its prototype module.

Micro LED technology is a promising candidate for higher-end displays and televisions that will be available three to four years down the road. The technology has virtually all the quality advantages that OLED offers (over LCD), including individually-controlled LEDs, high contrast, fast response times, and wide viewing angles. But equally important, it does not come with the major disadvantages that OLEDs are known for, such as off-axis color shifting and aging-related burn-in. There are many small companies working on Micro LED technology, but so far only a handful of actual display/TV manufacturers have showcased their Micro LED prototypes, and only two of them have started to commercialize this technology on a very small scale.

Source: Japan Display

via AnandTech https://ift.tt/phao0v

December 2, 2019 at 03:09PM

AWS Designing a 32-Core Arm Neoverse N1 CPU for Cloud Servers

https://www.anandtech.com/show/15181/aws-designs-32core-arm-cpu-for-cloud-servers

Amazon Web Services’ (AWS) CPU design unit is working on a new multi-core processor for AWS servers. The new CPU is said to use Arm’s new Neoverse N1 architecture and would feature a considerably higher core count compared to AWS’s first-generation Graviton processor, which should result in a significant performance increase.

The yet-to-be-named AWS CPU will be based on Arm’s Neoverse N1 microarchitecture and will integrate as many as 32 cores, according to Reuters, which cites two sources with knowledge of the matter. The chip will also be able to connect to various special-purpose accelerators using a ‘fabric’ interface to greatly speed up certain workloads.

At a high level, the Neoverse N1 (aka Ares) largely resembles Arm’s consumer-oriented Cortex-A76 microarchitecture: a 4-wide fetch/decode machine with a pipeline depth of only 11 stages, which can shorten to 9 when needed. Meanwhile, the Neoverse N1 is designed to run at relatively high frequencies to provide maximum single-thread performance, and it features a different cache architecture (coherent, with a 1 MB L2 option, although caches are technically not part of the microarchitecture per se) along with some other enhancements. Overall, with the Neoverse N1, Arm is looking at clocks of up to 3.1 GHz and a ~100 W TDP per SoC.

Readers who are interested in finding out more about Arm’s Neoverse N1 platform can read our coverage from earlier this year, but the key point in the context of the newly released information is that AWS continues to believe in custom Arm-based processors for servers and would be among the first adopters of the Neoverse N1. As noted above, the microarchitecture and the platform were optimized for cloud server workloads from the ground up, so with further customization from Amazon, the 32-core processor promises to offer rather serious performance in the applications it was designed for. Will these CPUs challenge AMD’s Rome or Intel’s Cascade Lake? Probably not, but the importance of custom chips lies in their ability to offer the right total cost of ownership and sufficient performance, not to win all the benchmarks.

Source: Reuters

via AnandTech https://ift.tt/phao0v

December 2, 2019 at 12:09PM

How to Get Solar Power on a Rainy Day? Beam It From Space

https://www.wired.com/story/how-to-get-solar-power-on-a-rainy-day-beam-it-from-space

Earlier this year, a small group of spectators gathered at the David Taylor Model Basin, the Navy’s cavernous indoor wave pool in Maryland, to watch something they couldn’t see. At each end of the facility there was a 13-foot pole with a small cube perched on top. A powerful infrared laser beam shot out of one of the cubes, striking an array of photovoltaic cells inside the opposite cube. To the naked eye, however, it looked like a whole lot of nothing. The only evidence that anything was happening came from a small coffee maker nearby, which was churning out “laser lattes” using only the power generated by the system.

The laser setup managed to transmit 400 watts of power—enough for several small household appliances—through hundreds of meters of air without moving any mass. The Naval Research Lab, which ran the project, hopes to use the system to send power to drones during flight. But electronics engineer Paul Jaffe has his sights set on an even more ambitious problem: beaming solar power to Earth from space. For decades the idea had been reserved for The Future, but a series of technological breakthroughs and a massive new government research program suggest that faraway day may have finally arrived.

Since the idea for space solar power first cropped up in Isaac Asimov’s science fiction in the early 1940s, scientists and engineers have floated dozens of proposals to bring the concept to life, including inflatable solar arrays and robotic self-assembly. But the basic idea is always the same: A giant satellite in orbit harvests energy from the sun and converts it to microwaves or lasers for transmission to Earth, where it is converted into electricity. The sun never sets in space, so a space solar power system could supply renewable power to anywhere on the planet, day or night, rain or shine.

Like fusion energy, space-based solar power seemed doomed to become a technology that was always 30 years away. Technical problems kept cropping up, cost estimates remained stratospheric, and as solar cells became cheaper and more efficient, the case for space-based solar seemed to be shrinking.

That didn’t stop government research agencies from trying. In 1975, after partnering with the Department of Energy on a series of space solar power feasibility studies, NASA beamed 30 kilowatts of power over a mile using a giant microwave dish. Beamed energy is a crucial aspect of space solar power, but this test remains the most powerful demonstration of the technology to date. “The fact that it’s been almost 45 years since NASA’s demonstration and it remains the high water mark speaks for itself,” Jaffe says. “Space solar wasn’t a national imperative, and so a lot of this technology didn’t meaningfully progress.”

John Mankins, a former physicist at NASA and director of Solar Space Technologies, witnessed firsthand how government bureaucracy killed space solar power development. In the late 1990s, Mankins authored a report for NASA that concluded it was time to take space solar power seriously again, and he led a project to do design studies on a satellite system. Despite some promising results, the agency ended up abandoning it anyway.

In 2005, Mankins left NASA to work as a consultant, but he couldn’t shake the idea of space solar power. He did some modest space solar power experiments himself and even got a grant from NASA’s Innovative Advanced Concepts program in 2011. The result was SPS-ALPHA, which Mankins called “the first practical solar power satellite.” The idea, says Mankins, was “to build a large solar-powered satellite out of thousands of small pieces.” His modular design brought the cost of hardware down significantly, at least in principle.

via Wired Top Stories https://ift.tt/2uc60ci

December 2, 2019 at 06:09AM

All new cellphone users in China must now have their face scanned

https://www.technologyreview.com/f/614781/all-new-cellphone-users-in-china-must-now-have-their-face-scanned/

The news: Customers in China who buy SIM cards or register new mobile phone services will have to have their faces scanned under a new law that came into effect yesterday. China’s government says the new rule, which was passed into law back in September, will “protect the legitimate rights and interests of citizens in cyberspace.”

A controversial step: It can be seen as part of an ongoing push by China’s government to make sure that people use services on the internet under their real names, thus helping to reduce fraud and boost cyber security. On the other hand, it also looks like part of a drive to make sure every member of the population can be surveilled.

How do Chinese people feel about it?: It’s hard to say for sure, given how strictly the press and social media are regulated, but there are hints of growing unease over the use of facial recognition technology within the country. From the outside, there has been a lot of concern over the role the technology will play in the country’s controversial social credit system, and how it’s been used to suppress Uighur Muslims in the western region of Xinjiang.

A potential knock-on effect: The use of facial recognition in China might seem irrelevant to people in other countries, but Chinese groups are helping to shape United Nations standards for the technology, the Financial Times reported yesterday.

via Technology Review Feed – Tech Review Top Stories https://ift.tt/1XdUwhl

December 2, 2019 at 05:56AM

Rocket Lab to Take Big Step Toward Reusability with Launch Friday

https://www.space.com/rocket-lab-reusable-technology-tenth-mission.html

Rocket Lab will have a lot to be thankful for this Friday (Nov. 29), if all goes according to plan.

The California-based company is scheduled to launch its 10th mission at 2:56 a.m. EST (0756 GMT) on Friday morning, a day after the U.S. Thanksgiving holiday. You can watch the action live here at Space.com, courtesy of Rocket Lab, or directly via the company.

via Space.com https://ift.tt/2CqOJ61

November 27, 2019 at 11:01AM

Bluetooth Is Good Now

https://gizmodo.com/bluetooth-is-good-now-1838635803

It was nearly a decade ago that a pretty blue gadget arrived in my mailbox and fundamentally changed my understanding of wireless technology. The gadget was a Jawbone Jambox—arguably the first popular Bluetooth speaker—that let me play music from my phone from across my backyard. Like most Bluetooth contraptions of that era, the Jambox was barely usable at times, but the idea of wireless living was exciting! Now, ten years later, Bluetooth is actually good.

In 2009, the year before the Jambox hit the market, Bluetooth standards saw major updates that would begin to transform the ecosystem, such as higher data transfer speeds and Bluetooth Low Energy. My Jambox didn’t support those standards when I bought it, and surely as a result, it was a nightmare to use. Rocking Bluetooth 2.1, the Jambox struggled to stay connected to any of my devices, so much so that I ended up plugging my phone directly into the speaker most of the time. Even then, the damn thing seemed to die after an hour of cruddy music playback.

Fast forward a decade, and it seems like every gadget I use daily is powered by some sort of Bluetooth. My laptop, my desktop, my mouse, my keyboard, my headphones, my smartwatch, and yes, my non-Jambox speaker are all stuffed with Bluetooth technology that lets them talk to each other and exchange data at lightning speeds. The real miracle, nowadays, is that they’re all incredibly dependable. They connect quickly and stay connected. Their batteries last a long time. It’s the opposite of how crappy Bluetooth was ten years ago.

The extent to which I don’t even think twice about using Bluetooth these days is evidence that the technology is not only good now, it’s incredible. And it’s about to get better. How we got here is essential to understanding what’s coming next.

Bluetooth is still a fairly new technology. Believe it or not, both cellular and wi-fi have been around since the 1970s, and today, they’re undeniably the most prevalent wireless technologies. The first consumer device with Bluetooth—a hands-free headset made by Ericsson—didn’t hit the market until 1999. As the new kid on the wireless block, Bluetooth was laughed at.

In its early days, Bluetooth’s bandwidth was crap, which is part of the reason why it was primarily used for telephone headsets. That changed dramatically with the introduction of Bluetooth 3.0 in 2009, but the real game-changer was a new technology called Bluetooth Low Energy, which arrived with the 4.0 standard in 2011. This essentially split the standard into two segments: Bluetooth Classic and Bluetooth Low Energy. That was a big deal.

“Bluetooth Low Energy ultimately made it possible for anything and everything to be a connected device,” Chuck Sabin of the Bluetooth Special Interest Group (SIG) told me recently. “It kicked off what you would consider the modern day internet of things.”

Part of the reason why this happened is that, at that time, Bluetooth was built into pretty much every phone on the market, and Bluetooth LE made it easier for developers to build apps that took advantage of Bluetooth’s capabilities. Whereas Bluetooth Classic specialized in higher bandwidth connections and required cumbersome pairing operations, the newer Low Energy technology dealt in smaller bursts of data, which was ideal for devices like smart watches, beacons, and smart home gadgets. As the name implies, Bluetooth LE also used less energy, so those devices could be powered by on-board batteries. The new Bluetooth standard also allowed the pairing process to happen in the background, which encouraged even more developers to come up with new tricks for the tech.

These upgrades blew the lid off Bluetooth adoption in many ways. Since its low-cost chips could be powered by coin cell batteries for months or even years, Bluetooth LE started showing up in all kinds of new gadgets, from blood pressure monitors to little tile-shaped tracking devices that you can attach to your keychain. Bluetooth LE’s streamlined pairing process also made it a handy alternative to near-field communication (NFC) devices. One key advantage, however, is that Bluetooth LE can get devices dozens of feet apart to talk to each other, while NFC only works over a few inches. Bluetooth LE can also support dual connections with Bluetooth Classic, which has led to new features and better battery life all around. (Bluetooth LE beacons can also spot Bluetooth devices as they’re passing by, which is terrifying to privacy advocates but a gold mine for retail.)
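
To make the low-overhead nature of Bluetooth LE a little more concrete, here is a minimal discovery sketch (my illustration, not from the article). It assumes the third-party Python library bleak is installed; the scanner simply listens for the short advertising packets that nearby BLE devices broadcast, with no pairing required.

```python
# Minimal Bluetooth LE discovery sketch (illustrative; assumes the `bleak` library).
# BLE devices periodically broadcast small advertising packets; a scanner only
# has to listen for them -- no pairing or sustained connection is needed.
import asyncio

from bleak import BleakScanner


async def main():
    devices = await BleakScanner.discover(timeout=5.0)  # listen for 5 seconds
    for device in devices:
        print(device.address, device.name)


asyncio.run(main())
```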

It was about five years ago that I started reviewing Bluetooth headphones. The challenge, at the time, was figuring out if any of them were dependable for daily use. Even the most expensive sets were prone to cutting out or simply seizing up in the presence of interference.

This was also an era when plenty of people had their doubts about audio quality over Bluetooth. Those doubts were somewhat warranted when it came to early Bluetooth standards. The original 1.0 specification could handle a maximum throughput of 721 Kbps, which would work fine for phone calls but probably wouldn’t do your FLAC files much justice. As new Bluetooth versions came out, however, bandwidth increased, and codecs like aptX and LDAC offered improved compression for audio files. To be honest, even five years ago, it was hard to tell the difference in audio quality when listening through Bluetooth versus wired headphones.
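
For a rough sense of scale (my arithmetic, not from the article): uncompressed CD-quality stereo audio runs at about 1,411 Kbps, roughly double Bluetooth 1.0’s 721 Kbps ceiling, which is why early Bluetooth was really only fit for voice.

```python
# Rough comparison of CD-quality audio bitrate vs. Bluetooth 1.0's ceiling
# (illustrative arithmetic only).
sample_rate_hz = 44_100      # CD sample rate
bits_per_sample = 16
channels = 2                 # stereo

cd_kbps = sample_rate_hz * bits_per_sample * channels / 1_000   # 1411.2 Kbps
bt_1_0_kbps = 721                                                # Bluetooth 1.0 max throughput

print(f"CD audio: {cd_kbps:.0f} Kbps, Bluetooth 1.0: {bt_1_0_kbps} Kbps")
print(f"Shortfall: ~{cd_kbps / bt_1_0_kbps:.1f}x")               # ~2.0x
```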

In my experience, performance also varied widely by brand, as some companies were simply better equipped to navigate the standards and integrate the technology into their products. Jabra, for instance, was an early standout. This pioneer of hands-free headsets was one of the first companies to produce Bluetooth products in the early aughts, and its popular Jabra Move was one of the first sets of wireless headphones I tested that offered almost flawless connectivity.

This was back in 2014, when most Bluetooth headphones were still a nightmare to use. (Jabra would later blow my mind with its Elite 65t truly wireless earbuds, which I still use on a daily basis.) Within a couple years, everyone from Bowers & Wilkins to Sony was making Bluetooth headphones that could stay connected, play high quality audio, and in some cases, last longer than eight hours on a single charge.

Then, in 2016, Apple came along with its W1 chip. This system-on-a-chip (SoC) made its first appearance in the original AirPods as well as a few Beats models. The W1 helped maintain a remarkably good Class 1 Bluetooth connection between the earbuds and any Apple device. (Class 1 is the strongest, most power-intensive Bluetooth class available.) This sort of proprietary tweaking meant that you could pair the AirPods to an iPhone without digging through a settings menu, and then the magical little wireless earbuds would connect to the device immediately and unflinchingly. Bluetooth, for some, was suddenly easier than plugging in a wire.

Around the same time as AirPods started upending the market in early 2017, devices with Bluetooth 5 support started pouring in. While this standard didn’t create entirely new categories of products the way Bluetooth LE did, the performance improvements stand to change what existing Bluetooth devices can do. The longer range, higher speeds, and better location services capabilities helped Bluetooth 5 grow out of consumer devices and into massive new systems.

“That middle of the decade delivery of Bluetooth 5 kicked off the revolution and evolution of Bluetooth into industrial, commercial, smart buildings, smart home, smart city type applications for the technology,” Sabin recalled. He added that sensor networks can make use of new mesh networking capabilities in Bluetooth that can enable thousands of devices to communicate with each other.

The improvements seen in Bluetooth 5 are also transforming how the technology works in consumer devices. Because it’s more efficient, Bluetooth 5 is boosting battery life in many devices. Master & Dynamic, for instance, added Bluetooth 5 to a new generation of its popular MW07 truly wireless earbuds, and battery life nearly tripled from 3.5 hours in the old version to 10 hours in the new one.

Combined with efficiency, the improved bandwidth of Bluetooth 5 is making new features possible, like always-on microphones for voice assistants in earbuds and headphones.

You’ve probably noticed that I’m talking a lot about Bluetooth headphones and speakers. That’s partly because I spend a lot of time testing these things out, and I can speak to how dramatically Bluetooth updates have improved the way they work. But it’s also important to realize that Bluetooth got its start in audio and will continue to be audio-focused in years to come. The folks at Bluetooth SIG told me that we can look forward to new codecs for high quality audio, more hearing aid capabilities, and some broadcasting features as the next decade approaches.

Otherwise, it will be interesting to see Bluetooth battle it out with technologies like wi-fi in our future of connected everything. We’ve only scratched the surface in terms of what Bluetooth 5 can accomplish in commercial and industrial spaces, but it’s important to note that wi-fi remains the ruler of wireless because wi-fi is everywhere. While a growing number of devices use Bluetooth to talk to each other, wi-fi gives gadgets the ability to connect to other devices as well as to connect directly to the internet. So even as Bluetooth-powered lightbulbs become popular devices, wi-fi-powered bulbs may ultimately be more capable.

“Wi-fi will be a very strong contender for just about any connectivity scenario in the home, because the wi-fi network is already there in almost all homes,” Kevin Robinson of the Wi-Fi Alliance told me. “When you talk about the Internet of Things, one of the key words there is ‘Internet.’”

Robinson is not wrong. Bluetooth technology also faces a number of limitations that make wi-fi more attractive to developers. Speed is the big one. While data transfer speeds doubled from Bluetooth 4 to Bluetooth 5, the latest standard can still only handle 48 Mbps. The upcoming Wi-Fi 6 standard tops out at 9.6 Gbps. Range and battery draw are two more issues that have always plagued Bluetooth, and while they’ve improved over time, wi-fi is still better in some circumstances.
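
Putting the article’s two peak figures side by side (my arithmetic): Wi-Fi 6’s quoted ceiling is roughly 200 times higher than Bluetooth 5’s.

```python
# Ratio of the quoted peak throughputs (illustrative arithmetic only).
bluetooth5_mbps = 48
wifi6_mbps = 9.6 * 1_000     # 9.6 Gbps expressed in Mbps

print(f"Wi-Fi 6 peak is ~{wifi6_mbps / bluetooth5_mbps:.0f}x Bluetooth 5's")   # ~200x
```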

But ultimately, as these technologies converge, the people win. Bluetooth is good for things that wi-fi is not and vice versa. The simple fact that wireless connectivity is becoming the standard is an exceptional leap forward. In the coming years, we’ll also see more innovation in the 5G space, so it’s not entirely absurd to believe that we won’t be worried about wireless in a decade because wireless will be everywhere. The only question then is how to make the most of it.

via Gizmodo https://gizmodo.com

November 30, 2019 at 10:03AM

Here’s The Main Problem With Tesla’s Supercharger Network

https://jalopnik.com/heres-the-main-problem-with-teslas-supercharger-network-1840110802

For all the crap we give Tesla, they rightfully deserve to be lauded for the Supercharger network of fast-charging stations. You want to drive across the country in your electric car? Easy peasy, thanks to the ability to quick charge up to about an 80% tank of electron juice in about 20 minutes. Unless you and some of the hundreds of thousands of other Tesla owners want to use the exact same Supercharger station at the exact same time.

I’ve written whole rants on the topic, but I really do love the Supercharger system. While companies like GM twiddle their thumbs and pen useless op-eds for CNN wondering who, WHO, could possibly build the infrastructure their cars require for them, Tesla just went out and did it. The freedom that literally thousands of Superchargers enable is why, for all of Elon’s inanity, if I were in the market for an expensive electric car at the moment, I’d probably get a Tesla over anything else.

But “literally thousands” of Superchargers isn’t enough. It isn’t enough for the millions of cars Tesla itself wants to build, and other automakers are still playing catch-up, nowhere near matching the number of Superchargers that already exist.

Basically, it’s going to take some brave corporations millions, if not billions, of dollars to go ahead and start building quick-charging stations on the scale that we have gas stations now.

Until then, we’ll get scenes like this in high-Tesla-ownership areas, shot by a reader named Steve at the San Luis Obispo, California, Supercharger station on Thanksgiving Day:

Build a big restaurant and a Quick-E Mart and an arcade and whatever the hell it is people use to kill 20 minutes, and you’ll make a killing yourself.

Oh hell, I’ll do it. Anyone got a couple billion?

H/t to Steve!

via Gizmodo https://gizmodo.com

November 30, 2019 at 11:45AM