Suspect arrested for cyber bank heists that amassed $1.2 billion

https://ift.tt/2DW4I9Z

Europol announced today that the suspected leader of an international bank heist scheme has been arrested. The arrest was the result of an investigation involving a number of cooperating law enforcement groups, including the Spanish National Police, Europol, the FBI and the Romanian, Belarusian and Taiwanese authorities. The suspect was arrested in Alicante, Spain.

Since the crime group began its cyberattacks in 2013, they’ve hit more than 100 financial institutions in 40 countries around the world. They’re said to have stolen over $1.2 billion. The crime group started with a malware campaign called Anunak, which later led to more sophisticated versions known as Carbanak and, later, Cobalt. The team would send phishing emails with malicious attachments to bank employees, and once the malware was downloaded, it gave the hackers control over the banks’ machines and access to servers that controlled ATMs.

They used three main methods to fraudulently obtain cash. In some cases, they would instruct ATMs to dispense cash at certain times and members of the crime group would wait nearby and grab the cash once it was released. They also took advantage of money transfer systems and in other instances, would inflate bank balances and have money mules withdraw that amount from ATMs. The stolen cash was ultimately laundered with cryptocurrencies.

"This global operation is a significant success for international police cooperation against a top level cybercriminal organisation," Steven Wilson, head of Europol’s European Cybercrime Centre, said in a statement. "The arrest of the key figure in this crime group illustrates that cybercriminals can no longer hide behind perceived international anonymity. This is another example where the close cooperation between law enforcement agencies on a worldwide scale and trusted private sector partners is having a major impact on top level cybercriminality."

Via: Reuters

Source: Europol

Tech

via Engadget http://www.engadget.com

March 26, 2018 at 10:09AM

Flat Earth advocate finally launches his homemade rocket

https://ift.tt/2pFafxn

For years, "Mad" Mike Hughes has not only insisted that the Earth is flat, but has maintained he could prove it by launching himself into space with his own rocket. He even claimed to have launched a homebrew rocket in 2014, though he had no evidence of it beyond his recovery from the landing. However, he finally did it — not that he’s about to change scientists’ minds. Hughes’ steam-powered vessel launched near Amboy, California, climbing to about 1,875 feet before coming down in the Mojave Desert. Despite the clear lack of safety features, paramedics determined that Hughes should be fine.

He had originally pegged the launch for November, but had to postpone it multiple times due to a mix of legal requirements (the Bureau of Land Management wasn’t fond of him firing a crewed rocket on public ground) and engineering troubles. He eventually launched from private land provided by Amboy’s owner, and turned a mobile home into a vertical ramp to make sure he stayed on private land.

Hughes hopes to fly much, much higher the next time around. His aim is to build a rocket that will launch from a balloon and take him to an altitude of 68 miles — roughly where space begins. If all goes according to plan, that would take place in August.

The irony, as you might guess, is that this launch wouldn’t even be possible on a flat Earth. A disc-shaped planet would have gravity that pulls straight down at only one point, the center; everywhere else, the pull would tilt increasingly toward the horizontal. Unless Hughes had perfect placement, his rocket would likely veer far sideways. And that’s assuming the atmosphere stayed put (it would likely float off into space) or that Earth maintained a steady distance from the Sun (Earth’s orbit keeps it from crashing into the star). Hughes depends on the very science that disproves his beliefs just to stay alive, let alone to climb high enough to discover that he’s wrong.
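The off-center gravity claim can be checked numerically. The sketch below is purely illustrative and not from the article: it approximates a uniform thin disc as a grid of equal point masses, sums their gravity at a point hovering just above the surface, and reports how far the net pull tilts from vertical. All parameters (R, n, h) are arbitrary demo values.

```python
import numpy as np

# Model a uniform thin disc ("flat Earth") as a grid of equal point masses.
R = 1.0                              # disc radius
n = 400                              # grid resolution
xs = np.linspace(-R, R, n)
X, Y = np.meshgrid(xs, xs)
mask = X**2 + Y**2 <= R**2
px, py = X[mask], Y[mask]            # positions of the mass elements

def pull_angle(x, h=0.05):
    """Tilt of the net gravitational pull from vertical, in degrees,
    at the point (x, 0, h) hovering just above the disc."""
    dx, dy, dz = px - x, py, -h
    r3 = (dx**2 + dy**2 + dz**2) ** 1.5
    g_horiz = np.sum(dx / r3)        # sideways component (toward the center)
    g_vert = np.sum(dz / r3)         # downward component (toward the disc)
    return np.degrees(np.arctan2(abs(g_horiz), abs(g_vert)))

angles = [pull_angle(r) for r in (0.0, 0.4, 0.8)]
for r, a in zip((0.0, 0.4, 0.8), angles):
    print(f"at r = {r:.1f} R, the pull tilts {a:.1f} degrees from vertical")
```

At the center the pull is straight down by symmetry; moving toward the rim it tilts increasingly inward, which is why a rocket launched from anywhere but the exact center of a flat disc would be dragged sideways.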

Via: Gizmodo

Source: AP News, Matt Hartman (YouTube)

Tech

via Engadget http://www.engadget.com

March 26, 2018 at 12:57AM

How To Mine Bitcoins [Humor]

https://ift.tt/2I50MpX

A computer hardware store manager has gotten into mining bitcoins and wants to show off his mining setup to his employee.

Disclaimer: No 1080 TIs were actually harmed in the making of this video

[Viva La Dirt League]

The post How To Mine Bitcoins [Humor] appeared first on Geeks are Sexy Technology News.

Tech

via [Geeks Are Sexy] Technology News https://ift.tt/23BIq6h

March 25, 2018 at 07:47AM

Autogyros

https://xkcd.com/1972/

I understand modern autogyros are much more stable, so I've probably angered the autogyro people by impugning their safety. Once they finish building the autogyros they've been working on in their garages for 10 years, they'll come after me.

Funny

via xkcd.com https://xkcd.com/

March 26, 2018 at 09:03AM

Lidar maker Velodyne is confused by fatal Uber crash

https://ift.tt/2IVNOw4

Lidar maker Velodyne has put out a statement concerning the fatal accident between an Uber autonomous vehicle and a pedestrian in Tempe, Arizona, last week. Uber’s self-driving Volvo XC90 uses a Velodyne lidar (light detection and ranging) unit, said to be an HDL-64E. That model has a 360-degree field of view and a 120-meter range, so one of the big questions has been why the lasers (or the 360-degree radar) didn’t pick up pedestrian Elaine Herzberg before the vehicle hit her. Velodyne president Marta Thoma Hall told Bloomberg, “We are as baffled as anyone else. Certainly, our lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our lidar doesn’t make the decision to put on the brakes or get out of her way.”

The company, which supplies lidar units to a number of tech firms testing autonomous cars, wants to make sure its equipment isn’t blamed for the crash. The accident took place around 10 p.m., and in fact, lidar works better at night than during the day because the lasers won’t suffer any interference from daylight reflections. There’s a large, anxious, questioning audience watching the rise of autonomous cars, so a fatal misstep could have huge consequences. The Tempe police chief has already made comments that reveal a lack of understanding about the systems that underpin self-driving vehicles.

Thoma Hall’s comments have been about clarifying a lidar array’s role in the driving task; namely, that even when the lasers detect an object, “it is up to the rest of the system to interpret and use the data to make decisions. We do not know how the Uber system of decision-making works.” If Uber’s software doesn’t process the data properly, then it doesn’t matter what the lasers register.

Her statements to Bloomberg and the BBC echo those of outside autonomous-driving researchers. One expert in the field of autonomy, Bryant Walker Smith, told Reuters, “Although this video isn’t the full picture, it strongly suggests a failure by Uber’s automated driving system….” In additional comments to Jalopnik, Smith said the Uber software probably “classified [Herzberg] as something other than a stationary object.” Another expert told Reuters the cameras and radar should have taken note of Herzberg, so, “Though no information is available, one would have to conclude based on this video alone, that there are problems in the Uber vehicle software that need to be rectified.”

The CEO of Waymo, the autonomous driving division of Alphabet (which also owns Google), told the Washington Post, “Our car would have been able to handle it,” and not hit Herzberg. Waymo and Uber have a history, though: when a Waymo engineer defected to Uber, Waymo said he took trade secrets with him and sued Uber. The two companies settled the court case with Uber agreeing to pay Waymo $245 million in Uber stock. Waymo once used Velodyne units but now uses its own in-house lidar array.

The National Transportation Safety Board and local Arizona authorities continue to investigate the accident. They will undoubtedly turn their attention to Uber’s Advanced Technologies Group in Pittsburgh, where more than 700 engineers write the autonomous software and test the company’s products. Velodyne’s Thoma Hall said she hasn’t been in touch with Uber, but her company will soon speak to investigators. The NTSB said a preliminary report on the accident should be ready in the next few weeks; a more detailed report will take a few months.

Cars

via Autoblog http://www.autoblog.com

March 26, 2018 at 08:14AM

Video suggests huge problems with Uber’s driverless car program

http://ift.tt/2GjcMah

There’s something very wrong with Uber’s driverless car program.

On Wednesday night, police released footage of Sunday night’s deadly crash in Tempe, Arizona, where an Uber self-driving car struck 49-year-old Elaine Herzberg. The details it reveals are damning for Uber.

“The idea that she ‘just stepped out’ or ‘came out in a flash’ into the car path is clearly false,” said Tara Goddard, an urban planning professor at Texas A&M University, after seeing the video. “It seems like the system should have responded.”

The video shows that Herzberg crossed several lanes of traffic before reaching the lane where the Uber car was driving. You can debate whether a human driver should have been able to stop in time. But what’s clear is that the vehicle’s lidar and radar sensors—which don’t depend on ambient light and had an unobstructed view—should have spotted her in time to stop.

On top of that, the video shows that Uber’s “safety driver” was looking down at her lap for nearly five seconds just before the crash. This suggests that Uber was not doing a good job of supervising its safety drivers to make sure they actually do their jobs. The combination of these failures—and Herzberg’s decision to jaywalk in the first place—led to her death.

But zooming out from the specifics of Herzberg’s crash, the more fundamental point is this: conventional car crashes killed 37,461 people in the United States in 2016, which works out to 1.18 deaths per 100 million miles driven. Uber announced that it had driven 2 million miles by December 2017 and is probably up to around 3 million miles today. If you do the math, that means that Uber’s cars have killed people at roughly 25 times the rate of a typical human-driven car in the United States.
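The back-of-the-envelope comparison above can be reproduced in a few lines, using the article's own figures. The 3-million-mile total is the article's rough estimate, so the resulting multiple is approximate:

```python
# Compare Uber's per-mile fatality rate against the US human-driver baseline.
human_rate = 1.18                   # US deaths per 100 million vehicle miles, 2016
uber_deaths = 1                     # the Tempe fatality
uber_miles = 3_000_000              # article's estimate of Uber's autonomous miles

uber_rate = uber_deaths / uber_miles * 100_000_000   # deaths per 100M miles
ratio = uber_rate / human_rate
print(f"Uber: {uber_rate:.1f} deaths per 100M miles, roughly {ratio:.0f}x the human rate")
```

With these inputs the multiple lands in the high twenties; the article's "roughly 25 times" is the same ballpark, and the figure shrinks if Uber's true mileage is higher than the 3-million estimate.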

Of course, it’s possible that Uber just got exceptionally unlucky. But it seems more likely that, even with the safety driver, Uber’s self-driving cars are way more dangerous than a car driven by the average human driver.

This shouldn’t surprise us. Uber executives know they’re behind Waymo in developing a self-driving car, and they’ve been pulling out all the stops to catch up. Uber inherited a culture of rule-breaking and corner-cutting from its founder and former CEO Travis Kalanick. That combination made a tragedy like this almost inevitable.

Uber probably wasn’t at fault, legally speaking, in recent crashes

An Uber self-driving car in San Francisco in 2017.

Justin Sullivan/Getty Images

Consider these recent crashes involving self-driving Uber cars:

In March 2017, an Uber self-driving car was struck on the left side as it went through an intersection in Tempe, Arizona. Uber was in the right-most lane on a six-lane road, approaching an intersection. The other two lanes in Uber’s direction were backed up with traffic. The other car was traveling in the opposite direction and making a left turn. The driver of that other vehicle said that cars stopped in the other lanes blocked her view, preventing her from seeing the Uber vehicle.

“Right as I got to the middle lane about to cross the third, I saw a car flying through the intersection, but I couldn’t brake fast enough to completely avoid the collision,” the driver of the non-Uber car said in the police report. Police cited the non-Uber driver for failing to yield the right of way. The Uber driver was not cited.

In February 2018, an Uber vehicle in Pittsburgh collided with another vehicle after the other car made a left turn in front of it. The Uber vehicle had its turn signal on, and the other driver thought this meant the Uber vehicle was going to turn at the intersection rather than go straight through. Uber says the car had its turn signal on because it was planning to change lanes.

Police did not determine who was at fault in the accident. But a Pennsylvania attorney told Ars that “generally speaking, before you take a left-hand turn, you’re required to ensure there’s not traffic coming from the other direction.”

In March 2018, we had this Sunday’s deadly crash in Tempe. Authorities have not reached any final conclusions about the case, but experts have told Ars there’s good reason to believe Herzberg may have been at fault, legally speaking. She was jaywalking in the middle of the night in a poorly lit area outside of a marked crosswalk.

“I think that preliminary results coming out is that the automation of the car was not at fault because the pedestrian stepped into the road,” said Mohamed Abdel-Aty, a civil engineer and traffic safety expert at the University of Central Florida.

So in all three of these incidents, there’s a strong argument that the other people involved—not the Uber car—were legally at fault for the crashes.

That doesn’t mean Uber’s cars are driving well

Jessica McLemore took this picture of the damage to her car shortly after a crash with an Uber vehicle in Pittsburgh in February 2018.

Jessica McLemore

“One of my big concerns about this incident is that people are going to conflate an on-the-spot binary assignment of fault with a broader evaluation of the performance of the automated driving system, the safety driver, and Uber’s testing program generally,” said Bryant Walker Smith, a law professor at the University of South Carolina.

“Human drivers recognize that they are going to deal with all kinds of behaviors that are not exactly lawful,” he added. “An obligation imposed under most if not all state vehicle codes are that drivers shall take due care to avoid a collision. You never get to say, well of course I hit them, they were in the road in my way.”

Indeed, it’s entirely possible to imagine a self-driving car system that always follows the letter of the law—and hence never does anything that would lead to a legal finding of fault—but is nevertheless far more dangerous than the average human driver. In fact, such a system might behave a lot like Uber’s cars do today.

For example, in that March 2017 collision in Tempe, the Uber driver reported that he was traveling 38 miles per hour at the time of the crash—just shy of the 40-mile-per-hour speed limit.

“As I entered the intersection, I saw the vehicle turning left,” he wrote. “There was no time to react as there was a blind spot created by the line of southbound traffic.”

The Uber car may have had a legal right to zip past two lanes of stopped cars at 38 miles per hour. But a prudent driver could have anticipated the possibility of a car in the blind spot—or, for that matter, a pedestrian trying to dart between the stopped cars in the next lane—and slowed down to 30 or even 20 miles per hour.

So, too, in Pittsburgh. There was no police report, so we don’t know how fast the Uber car was traveling or if it tried to stop. But a prudent driver approaching an intersection where an oncoming car has a left turn signal on will slow down a bit and be prepared to stop—just in case the other car decides to turn illegally.

As for this month’s crash in Tempe, there seems to be little doubt that there was a serious failure of the car’s technology. It may or may not have been feasible for the car to stop based on camera data. But lidar works just as well at night as it does in the daytime. Even if Uber’s software didn’t recognize Herzberg and her bicycle as a person, a car should always slow down if it sees an object that big moving into its lane.

Moreover, if the car really couldn’t have stopped in time to avoid killing Herzberg, that seems like a sign that the car was driving too quickly. It’s not like she jumped out from behind some bushes.

Tech

via Ars Technica https://arstechnica.com

March 22, 2018 at 03:39PM

Cambridge Analytica breach results in lawsuits filed by angry Facebook users

http://ift.tt/2GiY6Yo

Tech

via Ars Technica https://arstechnica.com

March 22, 2018 at 07:06PM