Blue Origin announces plans for a commercial space station

https://www.engadget.com/orbital-reef-200848765.html?src=rss

Blue Origin has plans more ambitious than space tourism alone. Today, the spaceflight company owned by Jeff Bezos announced that it is also working on its own space station. Called Orbital Reef, it is pitched as an industrial and commercial hub, and is meant to start operating in the second half of this decade.

It will be developed, owned and operated in partnership with Sierra Space, a subsidiary of the Sierra Nevada Corporation. Sierra Space is perhaps best known for Dream Chaser, a spacecraft set to begin carrying cargo to the International Space Station in 2022. Orbital Reef is also backed by Boeing, Redwire Space, Genesis Engineering Solutions and Arizona State University. The partners hope to use Boeing’s Starliner and Sierra Space’s aforementioned Dream Chaser to ferry both cargo and passengers to Orbital Reef.

Think of Orbital Reef as essentially a “business park,” but in space. In a press release, Blue Origin said that the destination “will offer research, industrial, international, and commercial customers the cost competitive end-to-end services they need including space transportation and logistics, space habitation, equipment accommodation, and operations including onboard crew.” Anyone who wants to “establish their own address in orbit” can do so.

Concept images of Orbital Reef. (Image: Blue Origin)

Blue Origin said that Orbital Reef would house up to 10 people, nearly the capacity of the International Space Station. It will feature “human-centered space architecture” with “world-class services and amenities.” There will be multiple ports for visiting spacecraft and modules. Orbital Reef will apparently feature an open architecture that lets any customer or nation use it. As the market for such facilities grows, Blue Origin promises to scale Orbital Reef’s amenities and utilities to match.

“Seasoned space agencies, high-tech consortia, sovereign nations without space programs, media and travel companies, funded entrepreneurs and sponsored inventors, and future-minded investors all have a place on Orbital Reef,” said the company in a press release.

“For over sixty years, NASA and other space agencies have developed orbital space flight and space habitation, setting us up for commercial business to take off in this decade,” said Brent Sherwood, Senior Vice President of Advanced Development Programs for Blue Origin. “We will expand access, lower the cost, and provide all the services and amenities needed to normalize space flight. A vibrant business ecosystem will grow in low Earth orbit, generating new discoveries, new products, new entertainments, and global awareness.”

Blue Origin’s only operational project so far is a suborbital tourism program that sends passengers to the edge of space (and back) aboard New Shepard. It has flown eight people to date, including Bezos and Star Trek’s William Shatner. Other projects, such as the New Glenn rocket (which the company hopes to use to launch some of Orbital Reef’s modules) and the Blue Moon lunar lander, are still in development.


Volvo’s self-driving loader prototype is based on a Lego model

https://www.engadget.com/volvo-lx03-autonomous-wheel-loader-lego-model-201519677.html?src=rss

Volvo is eager to bring self-driving technology to construction crews, but it’s taking a decidedly unusual route to get there. The automaker has unveiled an autonomous wheel loader prototype, the LX03, that’s based on a Lego model — 42081 Lego Technic Concept Wheel Loader Zeux, if you’re looking for it. The machine can haul 5 tons and can make its own decisions in a wide variety of situations, including team-ups with human workers. 

The LX03 is also uniquely modular: Volvo can make "just one or two changes" to produce a larger or smaller loader to meet a customer’s demands. It’s electric, unsurprisingly, and can run for up to eight hours depending on the job, enough to cover a typical workday.

The prototype isn’t indicative of a production model. It does represent the "next stage" in Volvo’s efforts to both explore AI and decarbonize construction, however. And there’s little denying the appeal of building a real, fully functional vehicle from a plastic building system, particularly when it could usher in a future that keeps humans away from dangerous and monotonous tasks.


Xbox One Now Streams Xbox Series X|S Games (Only Available For Testers, For Now)

https://www.gamespot.com/articles/xbox-one-now-streams-xbox-series-x-s-games-only-available-for-testers-for-now/1100-6497440/


Microsoft has rolled out a new Xbox One update that introduces support for cloud gaming, allowing users to stream Xbox Series X|S-quality games on the aging console. This feature is currently only available for testers, however.

The October 25 update for Alpha Skip-Ahead users added cloud gaming support on Xbox One, a console originally released in 2013 and far less powerful than the Series X|S. The feature was already available to testers on Xbox Series X|S through the Xbox Insider program, which is free and open to anyone.

Xbox Cloud Gaming allows users to stream a selection of Game Pass titles to their console. While the quality isn’t always perfect, and it can fluctuate depending on the strength and consistency of your connection, it’s a nice option to try out a game without needing to download it. Microsoft says downloading a game to your hard drive will offer the best experience, however.
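
For context on why the quality rises and falls with your connection: streaming services generally adapt video bitrate to measured bandwidth. Below is a minimal sketch of that general technique; the tiers and thresholds are invented for illustration, since Microsoft hasn’t published Xbox Cloud Gaming’s internals.

```python
# Minimal sketch of adaptive bitrate selection, the general technique
# streaming services use. The tiers and thresholds are illustrative,
# not Xbox Cloud Gaming's actual values.

QUALITY_TIERS = [
    # (label, required bandwidth in Mbps)
    ("1080p/60", 20.0),
    ("720p/60", 10.0),
    ("720p/30", 5.0),
]

def pick_tier(measured_mbps: float, headroom: float = 0.8) -> str:
    """Choose the highest tier the measured connection can sustain.

    `headroom` reserves a fraction of bandwidth so transient dips
    don't immediately stall the stream.
    """
    usable = measured_mbps * headroom
    for label, required in QUALITY_TIERS:
        if usable >= required:
            return label
    return QUALITY_TIERS[-1][0]  # fall back to the lowest tier

print(pick_tier(25.0))  # -> 1080p/60
print(pick_tier(8.0))   # -> 720p/30
```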

This means Xbox One owners can stream not only Series X|S versions of Xbox games, but also new-generation exclusives they wouldn’t otherwise have access to on that console.

The same October 25 update made it easier to stream to Twitch. These updates are available now for testers before Microsoft rolls them out for everyone at a later date.

As mentioned, you can join the Xbox Insider program and enroll your console in one of the testing rings for a chance to try out updates before they are released for everyone.

As for Xbox Cloud Gaming, it remains in beta, but Microsoft has high hopes for it. The company has a bold ambition to reach 3 billion gamers worldwide, and thanks to Xbox Cloud Gaming, everyone with a smartphone is a potential customer, in addition to PC and console players.



Facebook wants machines to see the world through our eyes

https://www.technologyreview.com/2021/10/14/1037043/facebook-machine-learning-ai-vision-see-world-human-eyes/

We take it for granted that machines can recognize what they see in photos and videos. That ability rests on large data sets like ImageNet, a hand-curated collection of millions of photos used to train most of the best image-recognition models of the last decade. 

But the images in these data sets portray a world of curated objects—a picture gallery that doesn’t capture the mess of everyday life as humans experience it. Getting machines to see things as we do will take a wholly new approach. And Facebook’s AI lab wants to take the lead.

It is kick-starting a project, called Ego4D, to build AIs that can understand scenes and activities viewed from a first-person perspective—how things look to the people involved, rather than to an onlooker. Think motion-blurred GoPro footage taken in the thick of the action, instead of well-framed scenes taken by someone on the sidelines. Facebook wants Ego4D to do for first-person video what ImageNet did for photos.  

For the last two years, Facebook AI Research (FAIR) has worked with 13 universities around the world to assemble the largest ever data set of first-person video—specifically to train deep-learning image-recognition models. AIs trained on the data set will be better at controlling robots that interact with people, or interpreting images from smart glasses. “Machines will be able to help us in our daily lives only if they really understand the world through our eyes,” says Kristen Grauman at FAIR, who leads the project.
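
To make “train deep-learning image-recognition models” concrete, here is a minimal PyTorch-style sketch of a supervised loop over labeled video clips. The data layout and the tiny 3D convolutional network are assumptions for illustration; this is not Ego4D’s actual loader, file format, or any FAIR baseline.

```python
# Sketch of a supervised training loop over first-person video clips.
# The dataset layout (clip tensors plus activity labels) is assumed for
# illustration; it is not Ego4D's published format.
import torch
from torch import nn
from torch.utils.data import DataLoader, Dataset

class ClipDataset(Dataset):
    def __init__(self, clips, labels):
        self.clips, self.labels = clips, labels  # (N, C, T, H, W), (N,)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        return self.clips[i], self.labels[i]

# Toy data: 32 random 8-frame RGB clips, 5 activity classes.
data = ClipDataset(torch.randn(32, 3, 8, 32, 32), torch.randint(0, 5, (32,)))
loader = DataLoader(data, batch_size=8, shuffle=True)

# A small 3D CNN as a stand-in for a real video backbone.
model = nn.Sequential(
    nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(16, 5),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for clips, labels in loader:
    opt.zero_grad()
    loss = loss_fn(model(clips), labels)
    loss.backward()
    opt.step()
```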

Such tech could support people who need assistance around the home, or guide people in tasks they are learning to complete. “The video in this data set is much closer to how humans observe the world,” says Michael Ryoo, a computer vision researcher at Google Brain and Stony Brook University in New York, who is not involved in Ego4D.

But the potential misuses are clear and worrying. The research is funded by Facebook, a social media giant that has recently been accused in the US Senate of putting profits over people’s well-being—as corroborated by MIT Technology Review’s own investigations.

The business model of Facebook, and other Big Tech companies, is to wring as much data as possible from people’s online behavior and sell it to advertisers. The AI outlined in the project could extend that reach to people’s everyday offline behavior, revealing what objects are around your home, what activities you enjoyed, who you spent time with, and even where your gaze lingered—an unprecedented degree of personal information.

“There’s work on privacy that needs to be done as you take this out of the world of exploratory research and into something that’s a product,” says Grauman. “That work could even be inspired by this project.”


The biggest previous data set of first-person video consists of 100 hours of footage of people in the kitchen. The Ego4D data set consists of 3,025 hours of video recorded by 855 people in 73 different locations across nine countries (US, UK, India, Japan, Italy, Singapore, Saudi Arabia, Colombia, and Rwanda).

The participants had different ages and backgrounds; some were recruited for their visually interesting occupations, such as bakers, mechanics, carpenters, and landscapers.

Previous data sets typically consisted of semi-scripted video clips only a few seconds long. For Ego4D, participants wore head-mounted cameras for up to 10 hours at a time and captured first-person video of unscripted daily activities, including walking along a street, reading, doing laundry, shopping, playing with pets, playing board games, and interacting with other people. Some of the footage also includes audio, data about where the participants’ gaze was focused, and multiple perspectives on the same scene. It’s the first data set of its kind, says Ryoo.
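
To picture what a single sample in such a corpus might bundle together, here is a hypothetical record mirroring the modalities described above (video, audio, gaze, narration, extra viewpoints). Every field name is invented for illustration; this is not Ego4D’s published schema.

```python
# Hypothetical record structure for one annotated first-person clip.
# Field names are illustrative, not Ego4D's actual schema.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class EgoClip:
    video_path: str                 # head-mounted camera footage
    audio_path: str | None          # present for some recordings
    gaze_track: list[tuple[float, float, float]]  # (t_sec, x, y) fixations
    narrations: list[tuple[float, str]]           # (t_sec, sentence)
    scene: str                      # e.g. "kitchen", "street"
    country: str
    extra_views: list[str] = field(default_factory=list)  # other cameras

clip = EgoClip(
    video_path="clips/0001.mp4",
    audio_path="clips/0001.wav",
    gaze_track=[(0.0, 0.42, 0.58), (0.5, 0.40, 0.61)],
    narrations=[(1.2, "The person chops an onion on the cutting board.")],
    scene="kitchen",
    country="India",
)
```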

FAIR has also launched a set of challenges that it hopes will focus other researchers’ efforts on developing this kind of AI. The team anticipates algorithms built into smart glasses, like Facebook’s recently announced Ray-Bans, that record and log the wearers’ day-to-day lives. It means that augmented- or virtual-reality “metaverse” apps could, in theory, answer questions like “Where are my car keys?” or “What did I eat and who did I sit next to on my first flight to France?” Augmented-reality assistants could understand what you’re trying to do and offer instructions or useful social cues.
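
In its simplest form, a query like “Where are my car keys?” amounts to a search over the wearer’s timestamped annotations. The toy sketch below illustrates that retrieval idea with made-up log entries; a production system would use learned video and text embeddings rather than substring matching.

```python
# Toy episodic-memory lookup: scan timestamped narration sentences for
# the queried object and return the most recent sighting. A real system
# would use learned video/text embeddings, not substring matching.
log = [
    ("08:14", "places car keys on the kitchen counter"),
    ("08:31", "picks up a coffee mug from the table"),
    ("09:02", "puts the car keys in a jacket pocket"),
]

def last_seen(obj: str) -> str:
    hits = [(t, s) for t, s in log if obj in s]
    if not hits:
        return f"No record of '{obj}'."
    t, s = hits[-1]  # entries are in time order; take the latest mention
    return f"Last seen at {t}: {s}"

print(last_seen("car keys"))  # -> Last seen at 09:02: puts the car keys ...
```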

It’s sci-fi stuff, but closer than you think, says Grauman. Large data sets accelerate the research. “ImageNet drove some big advances in a short time,” she says. “We can expect the same for Ego4D, but for first-person views of the world instead of internet images.”

Once the footage was collected, crowdsourced workers in Rwanda spent a total of 250,000 hours watching the thousands of video clips and writing millions of sentences that describe the scenes and activities filmed. These annotations will be used to train AIs to understand what they are watching.

Where this tech ends up and how quickly it develops remain to be seen. FAIR is planning a competition based on its challenges in June 2022. It is also important to note that FAIR, the research lab, is not the same as Facebook, the media megalodon. In fact, insiders say that Facebook has ignored technical fixes that FAIR has come up with for its toxic algorithms. But Facebook is paying for the research, and it is disingenuous to pretend the company is not very interested in its application.

Sam Gregory at Witness, a human rights organization that specializes in video technology, says this technology could be useful for bystanders documenting protests or police abuse. But he thinks those benefits are outweighed by concerns around commercial applications. He notes that it is possible to identify individuals from how they hold a video camera. Gaze data would be even more revealing: “It’s a very strong indicator of interest,” he says. “How will gaze data be stored? Who will it be accessible to? How might it be processed and used?” 

“Facebook’s reputation and core business model ring a lot of alarm bells,” says Rory Mir at the Electronic Frontier Foundation. “At this point many are aware of Facebook’s poor track record on privacy, and their use of surveillance to influence users—both to keep users hooked and to sell that influence to their paying customers, the advertisers.” When it comes to augmented and virtual reality, Facebook is seeking a competitive advantage, he says: “Expanding the amount and types of data it collects is essential.”

When asked about its plans, Facebook was unsurprisingly tight-lipped: “Ego4D is purely research to promote advances in the broader scientific community,” says a spokesperson. “We don’t have anything to share today about product applications or commercial use.”


Driver-assistance technology can fail in rain, leading to crashes in AAA study

https://www.autoblog.com/2021/10/15/driver-assistance-systems-rain-crashes-aaa-study/


Human drivers don’t necessarily see the road ahead as well when it rains, and it turns out that driver-assistance technology doesn’t either. The systems that help your car brake automatically and stay within its lane are significantly impaired by rain, according to a study AAA released Thursday.

American Automobile Association researchers found that automatic emergency braking, in several instances during testing conducted in simulated moderate-to-heavy rainfall, failed to detect stopped vehicles ahead, resulting in crashes. Lane-keeping technology also fared badly.

AAA cautioned drivers not to rely on these systems in the rain, and to stay vigilant even in ideal conditions.

“Vehicle safety systems rely on sensors and cameras to see road markings, other cars, pedestrians and roadway obstacles. So naturally, they are more vulnerable to environmental factors like rain,” said Greg Brannon, AAA’s director of automotive engineering and industry relations. “The reality is people aren’t always driving around in perfect, sunny weather so we must expand testing and take into consideration things people actually contend with in their day-to-day driving.”

Advanced driver-assistance systems, or ADAS, are common in newer vehicles. They do not perform autonomous driving, but they can automate some limited driving tasks such as adaptive cruise control and staying centered in one’s lane. Auto emergency braking has been shown to significantly reduce rear-end crashes in tests by insurance groups.

For its tests, AAA employed a 2020 Buick Enclave Avenir, a 2020 Hyundai Santa Fe, a 2020 Toyota RAV4 and a 2020 Volkswagen Tiguan.

No test car crashed into a stopped vehicle under dry, ideal conditions. But once researchers turned on the simulated rainfall, 17% of test runs at 25 mph (40 km/h) resulted in crashes. At 35 mph, the crash rate rose to 33%.

You can extrapolate from there as to the dangers at highway speeds.

Researchers simulated rainfall in the vehicles’ field of vision using a device with a spray nozzle that obscured the sensors in the windshield, which let them keep the roadway dry. AAA noted that wet roads in real driving conditions could result in even higher crash rates.

As for lane-keeping technology, vehicles crossed lane markers 37% of the time during ideal conditions in the AAA test — and that rate jumped to 69% once rain was added.
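
Putting the reported figures side by side makes the degradation plain. The snippet below simply restates AAA’s published numbers and computes the change:

```python
# The failure rates AAA reported, dry conditions vs. simulated rain.
rates = {
    "AEB crashes, 25 mph": (0.00, 0.17),
    "AEB crashes, 35 mph": (0.00, 0.33),
    "Lane departures":     (0.37, 0.69),
}
for test, (dry, rain) in rates.items():
    print(f"{test}: {dry:.0%} dry -> {rain:.0%} in rain "
          f"(+{(rain - dry) * 100:.0f} points)")
```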

This is not the first AAA study to note shortcomings in driver-assistance systems. On the bright side, this study noted that merely having a dirty or bug-spattered windshield had little effect on the systems’ sensors.

AAA emphasizes that while these systems have potential, they are no match for an attentive driver. For driving in rain, AAA offers these tips:

  • Keep windshield clean and ensure that wipers are not streaking the windshield.
  • Slow down and avoid hard braking and sharp turning. If possible, follow in the tracks of other vehicles.
  • Increase following distance to 5 to 6 seconds behind the vehicle ahead (see the quick conversion after this list).
  • Do not use cruise control in order to stay alert and to respond quickly if the car’s tires lose traction with the road.
  • If the car begins to hydroplane, ease off the accelerator to gradually decrease speed until the tires regain traction, and continue to look and steer where you want to go. Don’t jam on the brakes—this can cause further traction loss.
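
For a sense of scale, the 5-to-6-second gap in the tips above works out to a surprising amount of pavement; the conversion below is plain unit arithmetic, nothing vehicle-specific.

```python
# Rough conversion: what a 5-to-6-second following gap means in feet.
def following_distance_ft(speed_mph: float, gap_s: float) -> float:
    return speed_mph * 5280 / 3600 * gap_s  # mph -> ft/s, times the gap

for mph in (25, 35, 65):
    lo, hi = following_distance_ft(mph, 5), following_distance_ft(mph, 6)
    print(f"{mph} mph: {lo:.0f}-{hi:.0f} ft")
```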

Reuters was used in this report.


Lego Master Builds Incredible Transforming Super Nintendo Robots

https://kotaku.com/lego-master-builds-incredible-transforming-super-ninten-1847862139


Arise, Super Famitron. (Gif: Baron von Brunk / Kotaku)

Building a Lego model of a Super Nintendo console, with sliding buttons and a spring-loaded eject button, is pretty impressive. Building a Lego SNES, two corded controllers, and a game cartridge, each of which is capable of being transformed into a robot with no reassembly? That takes a Lego Master.

We’ve been covering the work of one such Lego Master for over a decade now. Julius von Brunk, also known as Baron von Brunk, has been turning game consoles into Transformers using Lego bricks for years. We featured his Nintendo 64 robot in 2013. A year later he came back with a colorful transforming Game Boy Advance. Now witness his greatest creation yet.

At first glance, it’s nothing more than a very well-built Super Nintendo model, complete with Polybius game cartridge and a pair of controllers, one for Super NES, and one for Super Famicom.

A closer look shows off more detail. The purple buttons on the console actually slide in little tracks. The eject button doesn’t pop out the cartridge, but it can be pushed down just like the real thing.


The game controllers actually plug into the console using little recreated ports designed by von Brunk.

The cartridge, which fits into a slot in the console, can be pulled out and played with. Note the “Super Famitron” logo on the label.

It’s a very cool build. Then it gets much, much cooler. Each of the four components (the console, both controllers, and the cartridge) is built to shift quickly from interactive entertainment product into combat-ready robot. There’s no pulling off parts and reattaching them, or rebuilding Lego elements. Using a combination of normal Lego bits, some Technic pieces, and a couple of ape heads from the animalistic Lego Chima line, Baron von Brunk has created the ultimate Lego SNES Transformer team.

Meet Super Famitron and his minions, Polybius, Simian Kong, and Primal Kong. Super Famitron is a robot so big and bulky he has a little trouble standing. His minions, inspired by beloved Decepticon Soundwave and his cassettes, are adorable smaller bots, ready to do their boss’ bidding.

Due to the nature of the Super Nintendo, Super Famitron is a bit monochromatic, but the details von Brunk has worked into the robot form are outstanding. The shoulder-mounted cannons, the articulated fingers—I especially love the way his chest opens up to reveal his giant robot head.

Want to know how Baron von Brunk designed and built Super Famitron and friends? The good Baron has posted a lengthy video on his YouTube channel, detailing the challenges of creating this robot team and how he overcame each one, like avoiding having the eject button on the robot form’s crotch, which would have been hilariously cringe.

Want to see more of Baron von Brunk’s Lego creations, including more shots of Super Famitron? Check out his Flickr, which is still a thing apparently, or his Instagram, which is the newer, cooler version of Flickr. And if you’re very lucky, you might even stumble upon instructions on how to build your own transforming controllers and cartridges.
