[Brazelton] Family Sues Apple for NOT Implementing Drive Detect Feature

Self-Driving Shuttle Debuts in Vegas, Crashes 2 Hours Later: Human blamed in self-driving shuttle bus crash

Even if human's fault, still kinda funny.
Yeah, a truck was exiting an alley and ran right into the shuttle.

Turns out the shuttles are built just a few miles from me, and the company uses the University of Michigan's autonomous village for testing. Navya, for those that are interested - a French company - launched this with a big show two days ago.

I think Uber's on the right track with automated vehicles - this area is ripe for automation, so even with this crash I expect we're only years away from seeing autonomous vehicles in daily use, probably cheaper to use in big cities than personal vehicles.
 
Why didn't the bus detect the truck and back out of its way? I mean, my sensors would have detected the moving truck and moved my vehicle out of the way.

I really can't believe that self driving cars are being allowed on the roads already.
From the article:
"The shuttle did what it was supposed to do, in that it's (sic) sensors registered the truck and the shuttle stopped to avoid the accident," the city said in a statement. "Unfortunately the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided."
So you want TCAS for the ground? Not only "don't hit others" but also "get out of the way if somebody's going to hit you?" I have no doubt some of that is already there, but there are limits. Most humans can't do that effectively IMO.
 
Why didn't the bus detect the truck and back out of its way? I mean, my sensors would have detected the moving truck and moved my vehicle out of the way.

I really can't believe that self driving cars are being allowed on the roads already.
The bus was proceeding forward on a road at under 30mph, and had the right of way. A truck was exiting an alley onto the road, and did not have the right of way. The bus had no obligation to reverse.

You might have done so as a courtesy to the truck driver, but the truck driver did not have the right of way, and was subsequently cited for causing an accident.

It may be that the program will be altered to provide more human allowances and courtesies, but I don't think that's the right way to approach AI driving because it introduces too much undefined behavior and variability, leading to confusion, inefficiencies, and accidents.

The programmers have probably chosen, for now, to only practice passive accident avoidance. Swerving and braking, for instance. Actively backing out of the way is a whole 'nother can of worms.

You will not see a presidential limo driven by an AI anytime soon, except perhaps as a publicity stunt. Those drivers can do amazing things with their vehicles when they detect a dangerous situation.
 
Like I told my teenaged students, you can be right and be dead right too. Big trucks in city limits rely heavily on the courtesy of cars that have the right of way.
 

GasBandit

Like I told my teenaged students, you can be right and be dead right too. Big trucks in city limits rely heavily on the courtesy of cars that have the right of way.
In all likelihood, they'll probably be made driverless PDQ, and we'll all be better for it. Except the truck drivers who are out of jobs, I suppose.
 
In all likelihood, they'll probably be made driverless PDQ, and we'll all be better for it. Except the truck drivers who are out of jobs, I suppose.
It'll be a long time before we see operator-less vehicles. Every current plan companies have for introducing driverless trucks includes someone who sits there observing the situation.

So they won't be out of a job, they'll just be paid a lot less to essentially press buttons and perform the few actions the AI can't do yet.
 
It may be that the program will be altered to provide more human allowances and courtesies, but I don't think that's the right way to approach AI driving because it introduces too much undefined behavior and variability, leading to confusion, inefficiencies, and accidents.
100% behind you on this. There are rules (right-of-way, etc) and the whole purpose of programming ANYTHING is to get it to follow rules. If you start programming exceptions, then you introduce additional complications which must be compensated for, and the whole thing just snowballs.

--Patrick
 
I know this idea was batted around here before... but self driving cars have to be programmed to kill.

https://gizmodo.com/your-self-driving-car-will-be-programmed-to-kill-you-de-1782499265
I don't buy into the clickbait at all. The car isn't "programmed to kill," and the reality is that with the modern safety systems in vehicles, you can survive a head-on impact at 70 MPH with a concrete barrier. You won't be happy, but you'll live. The car will be programmed to reduce casualties, but the reality is that it has far more information about the occupants on the interior than the exterior. As long as the cars follow the rules of the road, and the pedestrians follow the rules of the road, accidents will be vanishingly rare, save for those who choose to actively disobey the rules.

So I think the ethicists and philosophers are going to extremes, trying to apply the trolley problem to real life, when the reality is much more mundane.
 
So I think the ethicists and philosophers are going to extremes, trying to apply the trolley problem to real life, when the reality is much more mundane.
Have to justify those student loans for a worthless college degree somehow.

Oversimplifying technology that they have only the broadest understanding of (if any), then coming up with wildly complex problems and situations that it would be generous to call edge cases, all based on that faulty simplification, seems to be the new hotness.

See every "AI will take over the world if we don't take action now!" techno panic article.
 
Appeals verdict is in:
Apple FaceTime car crash lawsuit dismissed
The [family of the girl killed in the accident] alleged Apple was responsible for the accident as it had considered utilising technology to detect motion on its phones and disable certain functions when driving. Apple had patented this technology but it was not included on the iPhone 6. The lawsuit states Apple's iPhone 6 was "defective" and shouldn't have been shipped without the lock-out feature.
[...]
In May, a court had dismissed the case, leading to the appeal.
The appeals court [has now] agreed with the earlier decision, concluding Apple "did not owe the [girl's family] a duty of care."
[The court affirmed] it was not up to the tech giant to take responsibility for actions of individuals using its applications.
Good. Product liability lawsuits are already outta control. A "guilty" verdict would've meant the first steps towards a world where even our warning labels would have warning labels ("Please keep warning label away from nose and mouth, or serious bodily harm or death could result.").

--Patrick
 
(We don't have a patent troll thread, so this "let's sue Apple" thread seemed the next best place for it)

Apparently Apple has had it up to here with those people described as "patent trolls" filing lawsuits in the famously popular Eastern District of Texas, so it is just going ahead and shutting down all (both) of the stores located in that district and opening up one big one a day later just outside the EDT border in Dallas. According to the article, all of the employees from the two stores were given the choice of relocating, transitioning to work-from-home, or severance packages, so at least they aren't being thrown out into the streets.

Apple, lawyers, patent trolls, Eastern District of Texas...no matter what your stance on any of this, I think what blows my mind the most is the idea that patent trolls are enough of a problem now that companies are now considering amputation as a viable solution. Like 'em or not, there's already a history of Apple doing a "courageous" thing (deleting floppy/optical drives/headphone jacks, adding a screen notch, whatever) followed by other industries watching to gauge the reaction...and then frequently following the example. Does this mean that big retailers' answer to lawsuits is going to change from the usual "settle" to "settle and evac?" Because if so, we're going to have a flight situation much like the one @GasBandit describes about taxes, where "number of tort-happy lawyers" becomes as much of a barrier to choosing a location for a commercial presence as tax rates, real estate prices, etc., and we're going to have more of that "What happens when Wal-Mart leaves?" kind of thing.

--Patrick
 
I honestly think the "amputate" option will become more and more viable the more insane things become politically.
 
According to the complaint, the sound "tore apart" the boy's eardrums, damaged his cochlea, and caused permanent hearing loss in one ear.
Eardrums torn apart? By AirPods? This sounds more to me like the kid had his ears boxed while wearing a set of AirPods.

--Patrick
 
Apple, 2021
More than 90 policy groups from the US and around the world signed an open letter urging Apple to drop its plan to have Apple devices scan photos for child sexual abuse material (CSAM).
(PDF of the plan in question)
Apple abandoned the plan after receiving the criticism and determining it would be too invasive to individuals' privacy.

Apple, 2024
The suit, which represents a potential 2,680 victims, argues that Apple's failure to implement a previously announced child safety tool is what caused the abuse material to continue circulating.
Apple is now being sued for $1.2 BILLION for not implementing the invasive scanning tool that more than ninety policy groups agreed would've been a bad idea.

Talk about being damned if you do, damned if you don't.

--Patrick
 