Someone Uploaded All 285 Issues Of Nintendo Power To Archive.org

https://kotaku.com/nintendo-power-archive-org-magazine-1849785150


Few gaming magazines are as beloved as Nintendo Power. In the NES era, it offered many young Nintendo fans their first glimpse of the upcoming games that fired their imaginations, and poring over the detailed maps and tantalizing bits of info was a ritual almost as enjoyable as playing the games themselves. Discontinued in 2012 (though revived as a podcast five years later), the official magazine was an essential source of reviews, previews, and strategies. Now, thanks to community projects and audacious archivists, every single issue of the legendary magazine is yours to view.

Uploaded to Archive.org today by Gumball, all 285 issues of Nintendo Power are now unofficially available in .cbr format. The whole shebang weighs in at just over 40 gigabytes, with the vast majority of the collection coming courtesy of Retromags, a community-run project dedicated to archiving classic video game magazines; the couple of remaining issues were sourced by Gumball via Reddit. Scanned in full color, the collection is a wonderful way to browse through gaming and media history.

Gumball is no stranger to gathering video game print materials. “I’ve been collecting manuals and stuff for systems I grew up with,” they state in a Reddit comment. “It is a big piece of a lot of kids’ childhoods and gaming history, so I think it’s important that they are available for everyone to read.”

The Reddit post is quickly gaining attention and appreciation from gamers who have either been looking to complete their own collections or hoping to track down the couple of missing issues that weren’t in the Retromags collection. “I just wanted to get every issue in one place,” Gumball says in another Reddit reply. “The ones that I could not find were issues 208 and 285. Retromags did not have them [but] a dude over in the r/DHexchange happened to have both of these [and] allowed me to complete the set.”

Unfortunately, Nintendo’s history with these sorts of efforts isn’t exactly comforting. But as physical media, especially printed manuals and magazines like Nintendo Power, become harder to find, having access to archives like this is an essential way to preserve this history.

via Kotaku https://kotaku.com

November 15, 2022 at 11:20AM

Consumer Reports finds hybrid cars are more reliable than gas-only models

https://www.engadget.com/consumer-reports-hybrid-reliability-ev-phev-170003341.html

Hybrid cars aren’t just valuable for their fuel efficiency, apparently. Consumer Reports has published annual reliability survey data indicating that hybrids are generally more reliable than their gas-only equivalents. Hybrid cars were the most reliable among vehicle types, with their SUV siblings ranking third. Certain models were standouts, including the Ford Maverick pickup, Lexus NX luxury SUV and Toyota Corolla sedan — they all had above-average reliability on top of major fuel savings.

That trustworthiness doesn’t always extend to other electrified cars. The publication found that plug-in hybrids aren’t as reliable. Toyota’s Prius Prime and RAV4 Prime are less reliable than their conventional hybrid versions, and the Chrysler Pacifica hybrid was one of the most unreliable vehicles in the survey. EVs continue to struggle, too. While there are some exceptions, such as the "outstanding" reliability of the Kia EV6, the category is still plagued with glitches — and not just Tesla’s build quality issues. Ford’s Mustang Mach-E dipped below average due to its electronics flaws. Only four out of 11 models with enough survey data had average or better reliability.

A straightforward hybrid isn’t always the best choice, either. Consumer Reports warns that BMW, Mercedes, Ram and others offer "mild" hybrids that don’t offer much in the way of fuel savings, and are sometimes focused more on adding power. These vehicles weren’t included in the hybrid reliability rankings.

The greater reliability of hybrids isn’t a total surprise. While they offer improved fuel economy, they’re ultimately based on familiar model lines using well-established combustion engine technology. EVs are more likely to be brand new models built around young electric motor systems that don’t yet have years of refinement behind them.

Automakers will have to improve their safety tech if they want to stay in Consumer Reports’ good graces, whatever powertrain they’re using. As of November, the outlet will penalize models that don’t include pedestrian-aware automatic emergency braking as a standard feature. CR will also stop handing out bonus points to vehicles that only have blind spot warnings (they’ll need rear cross traffic warnings as well) and forward collision alerts. This should theoretically push automakers to strengthen their default safety packages and potentially save lives.

via Engadget http://www.engadget.com

November 15, 2022 at 11:04AM

MIT solved a century-old differential equation to break ‘liquid’ AI’s computational bottleneck

https://www.engadget.com/mit-century-old-differential-equation-liquid-ai-computational-bottleneck-160035555.html

Last year, MIT developed an AI/ML algorithm capable of learning and adapting to new information while on the job, not just during its initial training phase. These “liquid” neural networks (in the Bruce Lee sense) literally play 4D chess — their models require time-series data to operate — which makes them ideal for time-sensitive tasks like pacemaker monitoring, weather forecasting, investment forecasting, and autonomous vehicle navigation. The problem is that data throughput has become a bottleneck, and scaling these systems has become prohibitively expensive, computationally speaking.

On Tuesday, MIT researchers announced that they have devised a solution to that restriction, not by widening the data pipeline but by solving a differential equation that has stumped mathematicians since 1907. Specifically, the team solved “the differential equation behind the interaction of two neurons through synapses… to unlock a new type of fast and efficient artificial intelligence algorithms.”

“The new machine learning models we call ‘CfC’s’ [closed-form Continuous-time] replace the differential equation defining the computation of the neuron with a closed form approximation, preserving the beautiful properties of liquid networks without the need for numerical integration,” MIT professor and CSAIL Director Daniela Rus said in a Tuesday press statement. “CfC models are causal, compact, explainable, and efficient to train and predict. They open the way to trustworthy machine learning for safety-critical applications.”

So, for those of us without a doctorate in Really Hard Math: differential equations are formulas that describe the state of a system at various discrete points or steps throughout a process. For example, if you have a robot arm moving from point A to point B, you can use a differential equation to work out where it is between the two points at any given step. However, solving these equations for every step quickly gets computationally expensive, too. MIT’s “closed form” solution sidesteps that issue by functionally modeling the entire description of a system in a single computational step. As the MIT team explains:

Imagine if you have an end-to-end neural network that receives driving input from a camera mounted on a car. The network is trained to generate outputs, like the car’s steering angle. In 2020, the team solved this by using liquid neural networks with 19 nodes, so 19 neurons plus a small perception module could drive a car. A differential equation describes each node of that system. With the closed-form solution, if you replace it inside this network, it would give you the exact behavior, as it’s a good approximation of the actual dynamics of the system. They can thus solve the problem with an even lower number of neurons, which means it would be faster and less computationally expensive.
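To make that “single computational step” idea concrete, here is a minimal, purely illustrative Python sketch. It is not MIT’s actual CfC formulation, and the decay rate, resting value, and step count are made-up toy numbers; it simply compares stepping a simple leaky-neuron ODE forward with a numerical solver against evaluating its known closed-form solution directly.

    import math

    # Toy leaky-neuron dynamics: dx/dt = -a * (x - b)
    # (purely illustrative; not the equation MIT's CfC models actually solve)
    a, b = 2.0, 0.5        # made-up decay rate and resting value
    x0, t_end = 1.0, 3.0   # initial state and time horizon

    # 1) Numerical integration: many small Euler steps; cost grows with the step count.
    def euler_solve(x0, t_end, steps):
        dt = t_end / steps
        x = x0
        for _ in range(steps):
            x += dt * (-a * (x - b))
        return x

    # 2) Closed-form solution of the same ODE: the whole trajectory collapses into
    #    one expression, so the state at any time t costs a single evaluation.
    def closed_form(x0, t):
        return b + (x0 - b) * math.exp(-a * t)

    print(euler_solve(x0, t_end, steps=10_000))  # ~0.5012 after 10,000 updates
    print(closed_form(x0, t_end))                # ~0.5012 in one update

In the same spirit, the CfC models described above swap the per-step numerical solver inside each neuron for a closed-form approximation of its dynamics, which is where the claimed gains in speed and efficiency come from.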

By solving this equation at the neuron level, the team is hopeful that they’ll be able to construct models of the human brain measuring in the millions of neural connections, something not possible today. The team also notes that this CfC model might be able to take the visual training it learned in one environment and apply it to a wholly new situation without additional work, what’s known as out-of-distribution generalization. That’s not something current-gen models can really do, and it would prove to be a significant step towards the generalized AI systems of tomorrow.

via Engadget http://www.engadget.com

November 15, 2022 at 10:03AM