this post was submitted on 02 May 2024
60 points (91.7% liked)

Technology


According to the news, self-driving trucks are about to hit the road with no driver on board.

But according to this book, that is not going to happen. The author says the real purpose is to get rid of the skilled drivers and replace them with underpaid button pushers.

Will they really do that? What's the situation going to be a few years from now?

all 46 comments
[–] MondayToFriday@lemmy.ca 32 points 6 months ago (4 children)

I see at least four big problems with having drivers who sit around to supervise the AI.

  • It's a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It's like a real-life version of Desert Bus, the worst video game ever.
  • Human skills will deteriorate with lack of practice. Drivers won't have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.
  • The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the "moral crumple zone" to absolve the AI of liability. That sounds like a terrible thing for society.
  • With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.
[–] IphtashuFitz@lemmy.world 9 points 6 months ago (1 children)
The AI will shut off before an impending accident just to transfer the blame onto the human.

I may be mistaken, but I thought a law was passed (or maybe it was just an NHTSA regulation?) that stipulated any self-driving system is at least partially to blame if it was in use within 30 seconds of an accident. I believe this was done after word got out that Tesla's FSD was supposedly doing exactly this.
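For what it's worth, the reported rule is easy to state in code. A toy sketch (the 30-second window is from the comment above; the function and its inputs are my own invention, not the actual regulation):

```python
from datetime import datetime, timedelta

# Sketch of the reported rule: the automation counts as involved if it was
# engaged at any point within 30 seconds of the crash. The window length is
# from the comment above; the function and event format are hypothetical.
WINDOW = timedelta(seconds=30)

def automation_involved(disengage_time: datetime, crash_time: datetime) -> bool:
    """True if the self-driving system disengaged within WINDOW of the crash."""
    return crash_time - disengage_time <= WINDOW

crash = datetime(2024, 5, 2, 14, 0, 0)
# Handing off 5 seconds before impact would not shift blame to the human:
assert automation_involved(crash - timedelta(seconds=5), crash)
# Disengaging two minutes earlier would fall outside the window:
assert not automation_involved(crash - timedelta(seconds=120), crash)
```

Under a rule shaped like this, the "shut off right before impact" trick stops working, which is presumably the point.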

[–] barsquid@lemmy.world 5 points 6 months ago (1 children)

Time limit should be higher but that sounds like a step in the right direction.

[–] laurelraven@lemmy.blahaj.zone 1 points 6 months ago

The time limit is probably adequate, since 30 seconds is actually quite a long time on the road in terms of response. Actions taken that far before an accident will not lead irrevocably to the accident

[–] FortuneMisteller@lemmy.world 2 points 6 months ago* (last edited 6 months ago)

You assume that either the self-driving software is in charge or the button pusher takes the wheel. You did not consider that the button pusher might have a foot on the brake, but instead of taking the wheel he might have to enter some commands.

Take the case where there is a roadblock ahead and the button pusher has to evaluate whether it is safe to move forward or not: he wouldn't take the wheel, he would tell the driving software where to go. In similar cases he would have to decide whether it is safe to pass around an obstacle or stop there. Even in the case of a burglar trying to get on board, he would have to call the police and then give some commands to the driving software.

The idea at the base of the question is that in the future the AI, or whatever you want to call it, might always be in charge of the specialized functions, like calculating the right trajectory and turning the wheel, while the human will be in charge of checking the surrounding environment and evaluating the situation. So the AI is never supposed to be deactivated; if it were, the truck would stop until the maintenance team arrives.

[–] Abnorc@lemm.ee 2 points 6 months ago

Maybe you put a revenant in the truck to keep things interesting.

[–] abhibeckert@lemmy.world -1 points 6 months ago* (last edited 6 months ago) (1 children)

It’s a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It’s like a real-life version of Desert Bus, the worst video game ever.

Agreed. I don't see any chance humans will be continuously supervising trucks except as some sort of quality assurance system. And there's no reason for the driver to be in the truck for that - let them watch via a video feed so you can have multiple people supervising and give them regular breaks/etc.

Human skills will deteriorate with lack of practice. Drivers won’t have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.

I don't see that happening at all. A passenger jet is a special case of nasty where if you slow down or stop, you die. With a truck, on the rare occasion you encounter something unexpected, just have the human go slow. Also, seriously, it's just not that difficult. Right pedal to go forward, left pedal to go backward, steering wheel to turn, and if you screw up, well, maybe you'll damage some panels.

The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the “moral crumple zone” to absolve the AI of liability. That sounds like a terrible thing for society.

So you're thinking a truck sees that it's about to run a red light, and transfers control to a human who wasn't paying attention? Yeah I don't see that happening. The truck will just slam on the brakes. And it will do it with a faster reaction time than any human driver.

With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.

Hard disagree. A snowstorm is a lot less problematic when there's no human in the truck who needs to get home somehow. An AI truck will just park until the road is safe. If that means two days stuck in the breakdown lane of a freeway, who cares.

[–] Dieinahole@kbin.social 10 points 6 months ago (1 children)

Driving a truck is far more difficult than that.

I'm continually boggled by the fact that any jackass can walk into a U-Haul and drive out with a 30-foot box truck, because those are wildly different to handle than a regular car.

Massively larger stopping distance (a margin almost no one leaves in their regular cars), massively wider turning radius, and heavy enough that if you make a mistake or lose control, there's a whole lot more destructive capability, which you clearly are not appreciating.

Going down a hill with a loaded box truck requires more than just pushing the left pedal. You engine brake as much as possible, and use what's called stab braking, to keep the pads and rotors cool enough that they don't fail.
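The physics behind that point is simple enough to sanity-check. A rough sketch (masses and speed are illustrative assumptions: a 1.5 t sedan versus a loaded semi at roughly the US 80,000 lb limit, about 36 t, both at around 60 mph):

```python
# Kinetic energy the brakes must dissipate: 0.5 * m * v^2.
# Masses and speed below are illustrative assumptions.
def kinetic_energy_mj(mass_kg: float, speed_ms: float) -> float:
    return 0.5 * mass_kg * speed_ms**2 / 1e6

car = kinetic_energy_mj(1_500, 27.0)     # ~0.55 MJ
truck = kinetic_energy_mj(36_000, 27.0)  # ~13.1 MJ

# Same speed, 24x the mass: 24x the heat dumped into the brakes,
# which is why engine braking and stab braking matter on a grade.
assert truck / car == 24.0
```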

All of this is multiplied when you go from an automatic-transmission straight box truck to an actual semi truck, which weighs another order of magnitude more, usually has a ten-speed manual transmission (and three pedals, not two), and adds the whole trailer aspect.
And despite the extra weight, heavy winds can still blow the things over.

Frankly your cavalier attitude about how easy it is to drive anything is exactly why the roads are so dangerous.

Because nothing I said really mentions how people driving cars interact with trucks or buses on the road. It's a constant stream of getting cut off and having to slam on the brakes because the dipshits don't even know where the edges of their own vehicle are, let alone where mine begins, or my wildly longer stopping distance, or my extremely limited maneuvering capabilities, especially at speed, or the simple fact that the larger vehicle will absolutely crush their whole car and everyone in it completely fucking flat.

Driving is absolutely a skill, and like any other, it will atrophy without use.

[–] Riven@lemmy.dbzer0.com 4 points 6 months ago (1 children)

Agreed, I haven't driven an 18 wheeler but I've driven the big fedex box trucks for a living and even those are harder to maneuver than a regular sedan. The hardest part is dealing with people who think you leaving all that extra space in front of you is just so anyone can slip in and drive there.

[–] Dieinahole@kbin.social 2 points 6 months ago (1 children)

The wild part is, the stepvans have way more visibility than a modern car. Like, you can actually see your bumper and what it might run into. Taking time to learn the ends of your vehicle is important, but when you can't see shit anyway, what's the point?

I understand, crumple zones and shit require bigger a-frames, but I'd rather be surrounded by more competent drivers than crumple zones.

I recall hearing about some guy who was pushing for graded licenses and roads, and if you didn't qualify as skilled for that road, you couldn't drive there. It wasn't a simple 'this is harder, this is easier', either. Tight-in low-speed city roads were a certain classification. Highways were another, twisties were another, and so on.

I fervently wish that'd taken hold, along with a vehicle classification to match. Mopeds and scooters in the city? Easy! Stepvan in the city? Hard! Modern 'pickup' in the city? Fuck no!

[–] Riven@lemmy.dbzer0.com 1 points 6 months ago (1 children)

That does sound like a good idea in concept. I'm sure it could work in other countries but there's fuck all chance it would ever work in the US unfortunately. Japan could totally do it.

[–] Dieinahole@kbin.social 2 points 6 months ago (1 children)

Oh the ship has sailed, but said dude was pushing this idea when roads and automobiles were a relatively new thing

[–] Dieinahole@kbin.social 2 points 6 months ago

Well, automobiles at least, lol. Road's kind of an old idea, eh

[–] MeatsOfRage@lemmy.world 6 points 6 months ago

My bet, fully automated with localized maintenance workers who can travel around and perform repairs to fix the trucks stuck in their areas.

[–] carleeno@reddthat.com 6 points 6 months ago (1 children)

As someone who worked there previously, I can confirm that both of your statements are correct. (This has already been publicly shared by Aurora)

There will be nobody in (most of) their trucks. There will be button pushers remotely to help it in confusing situations or failures.

They've already been operating the trucks near-fully autonomously with safety drivers behind the wheel and copilots in the right seat monitoring the system. They plan to remove both operators from the vehicle completely, eventually.

(Now for some of my own speculation.) Someone else mentioned mother goose; they may take a similar approach, except the follow trucks wouldn't need to keep up with the lead truck. The lead truck would only serve as an early warning for unexpected road conditions (new construction, for example): the safety driver handles them, and info is sent back quickly to the other trucks on how to handle them too, or to pull over and wait for help (the default action if a truck gets confused). It's impossible to require that a convoy stay together in close formation; too many scenarios can split up the trucks, even on the highway.

In a mechanical failure it would pull over and wait for a rescue team. The rescue team will probably include backup drivers in case it can't resume driving autonomously.

Also, always take timetables with a grain of salt regarding anything related to autonomous vehicles.

My guess is the situation a few years from now will be that an inconsequential percentage of the US trucking fleet will be autonomous, a smaller percentage will have no safety drivers, and the remote operators will still be 1:1 ratio, maybe 1:2 (one operator for 2 trucks), but not the desired 1:10. This tech advances very slowly.

[–] abhibeckert@lemmy.world 3 points 6 months ago* (last edited 6 months ago) (1 children)

This tech advances very slowly.

Historically, anything that reduces cost of transporting goods has advanced extremely quickly. The best comparison, I think, is the shipping container.

It took about ten years for shipping containers to go from an invention nobody had heard of to one being used in every major seaport in the world, and about another ten years for virtually all shipping to adopt the method.

The New York docks, for example, dramatically increased activity (as in, handled several times more cargo per day) while also reducing the workforce by two thirds. I think self-driving trucks will do the same thing: companies/cities/highways that adopt AI will grow rapidly, and any company/city/highway that doesn't support self-driving trucks will suddenly stop being used almost entirely.

Shipping containers were not a simple transition. New ships and new docks had to be built to take advantage of it. A lot of new trucks and trains were also built. Just 20 years to replace nearly all the infrastructure in one of the biggest and most important industries in the world.

[–] carleeno@reddthat.com 2 points 6 months ago

I don't disagree with you. There will be a rapid rate of adoption.

But how long before it's capable enough to be adopted? We (as in anybody) don't know. We just know that it's been many many years and they're still not there yet, and just because a few driverless vehicles are operating (in extremely ideal scenarios with lots of help) doesn't mean it's ready for the kind of hockey stick curve that the industry is looking forward to.

It will happen eventually, sure. My prediction was in regards to the OP's question of what will things look like in a few years. I don't think the tech will be ready for mass adoption in just a few years, neither does the author of the article linked.

[–] cynar@lemmy.world 4 points 6 months ago (1 children)

It will likely be a mix. E.g. you might have 10 trucks on a particular run. You put a driver in the lead truck, as a human-in-the-loop safety. The rest play duckling to the mother duck.

What it will do is lower the skill level needed, and lower the stress. A driver having a nap isn't a problem anymore. They just need to be able to get involved if the autopilot has issues and has to stop, or to fill out paperwork at the destination.

[–] sailingbythelee@lemmy.world 2 points 6 months ago (1 children)

The duck-duckling model would probably work okay on the highway, but not so well once you arrive in a town or city. You can't reliably get ten semis through a set of lights in traffic without getting split up. I guess they could have a depot outside of town where human drivers would meet the ducklings for the final leg of the journey.

[–] cynar@lemmy.world 2 points 6 months ago

I believe it's common to have separate long haul trucks and last leg trucks. If the depot is right next to the motorway/highway, then it provides an obvious place for a handover. It also means drivers can stay in 1 area, and so go home each night.

[–] foggy@lemmy.world 3 points 6 months ago

Driverless trucks will get Jesse James'd until they have armed guards.

Just a motivated criminal, a signal jammer, and a driverless truck enter an area with no signal. Just a happy criminal leaves.

[–] cmoney@lemmy.world 3 points 6 months ago (1 children)

If you've been injured by a self driving truck call our office right away as you may be entitled to compensation.

[–] dukethorion@lemmy.world 2 points 6 months ago (1 children)

Can't call anyone if you're a stain on the pavement.

[–] cmoney@lemmy.world 2 points 6 months ago

Well if you or someone you don't like was injured give us a call.

[–] jimmydoreisalefty@lemmy.world 3 points 6 months ago

IMO:

Bare-bones skeleton crews, similar to railroad workers. They will try to strike, but then the gov't will make it illegal to do so ASAP.

Staying hopeful though, keep learning and teaching, while being involved at your local community!

The future of our jobs is not a mystery. It is the result of a transformation that started a long time ago. It is obvious, clearly understandable, but well hidden behind a curtain of confusion. This book starts from the most asked question, "will AI take over our jobs?", in order to show how misleading it is. Misleading, too, are all the alarms raised over the power of AI; the real dangers could have even more deleterious consequences, leading to an era where the masses could be trapped in jobs that are alienating, mind-numbing and underpaid. Exposing the arguments in a manner understandable to the layman, The Age of the Button Pushers goes through the fields of computer science, economics and media communication. The whole picture is reconstructed taking into account the lessons of the past, with the changes brought by the industrial revolution; the present, with the consequences of automation; and the near future, with the risk of an economy dominated by monopolistic giants. Part of the book is dedicated to the fabricated stories that dominate the current narrative in the media, highlighting their flaws and inconsistencies and showing how, taken together, these stories paint a picture that makes absolutely no sense.

[–] 0x0@programming.dev 3 points 6 months ago* (last edited 6 months ago) (1 children)

Because we haven't learnt anything about the status quo of autonomous driving from Tesla's "Auto Pilot", huh?

Similar post earlier.

[–] FortuneMisteller@lemmy.world 2 points 6 months ago* (last edited 6 months ago) (1 children)

A serious self-driving vehicle must be able to see around itself with different sensors. But then it must have a lot of computing power on board to merge the different streams of data coming from those sensors, on top of the computing power required to make proper predictions of the trajectories of the dozens of other objects moving around the vehicle. I don't know about the latest models, but the Google cars of a few years ago had the boot occupied by big computers with several CUDA cards.

That's not something you can put in a commercial car sold to the public. What you get there is a car that relies on only one camera to look around, plus a sensor in the bumper that cuts the engine if triggered but does not create an additional stream of data. Maybe there is a second camera looking down at the lines on the road, but its data stream is not merged with the other; it is just used to adjust the driving commands. I don't even know if the little onboard computer they have is able to compute the trajectories of all the objects around the car. Few sensors and little processing power: that is not enough, it is not a self-driving car.
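To make the trajectory workload concrete, here is a minimal sketch of the prediction step described above, reduced to constant-velocity extrapolation over fused tracks. Real stacks fuse lidar/radar/camera and run far heavier models; the track format and numbers are invented for illustration:

```python
from dataclasses import dataclass

# A fused track: position and velocity of one object in the vehicle frame.
# Units: metres and metres/second.
@dataclass
class Track:
    x: float
    y: float
    vx: float
    vy: float

def predict(track: Track, dt: float) -> tuple[float, float]:
    """Position of one tracked object dt seconds in the future,
    assuming constant velocity (the simplest possible motion model)."""
    return (track.x + track.vx * dt, track.y + track.vy * dt)

tracks = [Track(10.0, 0.0, -2.0, 0.0), Track(0.0, 5.0, 1.0, -1.0)]
futures = [predict(t, 2.0) for t in tracks]  # where everything is in 2 s
assert futures == [(6.0, 0.0), (2.0, 3.0)]
```

Doing this for dozens of objects, many times per second, with a real motion model and uncertainty attached, is where the onboard compute budget goes.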

When Tesla sells a car with driving assistance, they tell the customer that it is not a self-driving car, but they fail to explain why: where the difference is, and how big the gap is. That's one of the reasons why we had so many accidents.

Similar post earlier.

It starts from the same news, but, taking its question from the book in the link, it asks something different.

[–] abhibeckert@lemmy.world 2 points 6 months ago* (last edited 6 months ago) (1 children)

the google cars few years ago had the boot occupied by big computers

But those were prototypes. These days you can get an NVIDIA H100 - several inches long, a few inches wide, one inch thick. It has 80GB of memory running at 3.5TB/s and 26 teraflops of compute (for comparison, Tesla autopilot runs on a 2 teraflop GPU).

The H100 is designed to be run in clusters, with eight GPUs on a single server, but I don't think you'd need that much compute. You'd have two or maybe three servers, with one GPU each, and they'd be doing the same workload (for redundancy).

They're not cheap... you couldn't afford to put one in a Tesla that only drives 1 or 2 hours a day. But a car/truck that drives 20 hours a day? Yeah that's affordable.
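The economics here is just amortization arithmetic. A back-of-envelope sketch (the unit price and lifetime are assumptions for illustration, not NVIDIA's figures):

```python
# Amortized hardware cost per driven hour. The $30k unit price and 5-year
# life are assumptions; the point is that utilisation, not sticker price,
# decides whether the GPU pencils out.
def cost_per_hour(unit_price: float, lifetime_years: float, hours_per_day: float) -> float:
    return unit_price / (lifetime_years * 365 * hours_per_day)

commuter = cost_per_hour(30_000, 5, 2)    # car driven 2 h/day
freight = cost_per_hour(30_000, 5, 20)    # truck driven 20 h/day

# The same GPU is 10x cheaper per driven hour in the truck:
assert round(commuter / freight) == 10
```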

[–] FortuneMisteller@lemmy.world 1 points 6 months ago

A real self-driving software stack must do a lot of things in parallel. Computer vision is just one of the many tasks it has to do. I don't think a single H100 will be enough. The fact that current self-driving vehicles didn't use that much processing power doesn't mean a lot; they are prototypes running in controlled environments or under strict supervision.

[–] Bishma@discuss.tchncs.de 3 points 6 months ago* (last edited 6 months ago) (1 children)

Different companies have different plans. Arizona has had auto-driving trucks on freeways off and on for a couple years now as part of test programs. Always with a driver in the cab though.

A few years ago I would have thought robo-convoys would be where things landed, because three or four companies were working toward that. That's where the front truck has an operator and all the other trucks follow that leader driverlessly.

Now I feel like I have no idea where any of it is going. Step 1 in driverless should have always been to adopt an industry-wide mesh network for all vehicles with level 3 (or higher) autonomy. If I'm on the road with (or inside of) an autonomous vehicle, I want it to be able to get help from every other nearby car if its sensors suddenly die or start feeding it bad data, especially after they've been on the road, poorly maintained by their owners, for a decade or more. If there are autonomous cars, there will eventually be autonomous jalopies that drive like a drunk toddler because they see phantom lidar echoes.
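As a sketch of what that mesh fallback could look like: a vehicle with failing sensors asks nearby peers where they see it, and fuses their confidence-weighted answers. The message format and fusion rule are invented for illustration; no such industry-wide standard exists today:

```python
from dataclasses import dataclass

# Hypothetical peer report: where a nearby vehicle sees the requester,
# with a self-assessed confidence in [0, 1].
@dataclass
class PeerObservation:
    peer_id: str
    x: float
    y: float
    confidence: float

def fused_position(observations: list[PeerObservation]) -> tuple[float, float]:
    """Confidence-weighted average of peer reports; a stand-in for real fusion."""
    total = sum(o.confidence for o in observations)
    x = sum(o.x * o.confidence for o in observations) / total
    y = sum(o.y * o.confidence for o in observations) / total
    return (x, y)

obs = [PeerObservation("car-a", 100.0, 0.0, 0.75),
       PeerObservation("car-b", 102.0, 0.0, 0.25)]
assert fused_position(obs) == (100.5, 0.0)
```

The hard part isn't the math, it's trusting the peers: a jalopy with bad sensors reporting with high confidence is exactly the failure mode the comment worries about.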

[–] 0x0@programming.dev 6 points 6 months ago (3 children)

robo-convoys

You mean like trains?

[–] TimeSquirrel@kbin.social 4 points 6 months ago* (last edited 6 months ago)

Can't get a train track to every single depot and loading dock in the country that receives shipments (which is like, practically every big box store and warehouse there is). There has to be a handover at some point.

Edit: also not a big fan of the train system in the US, since the vast majority of rail is privately owned. The operators have too much control. They'll charge towns extra to put automated crossing guards on their rail and then keep charging them for maintenance. The jurisdiction can't use its own third-party workers to maintain them; the railroads are legally only required to put up a sign. It's extortion if you ask me.

[–] Bishma@discuss.tchncs.de 2 points 6 months ago

Trains are only a good idea in tubes now. There was a memo about it a few years ago.

[–] treefrog@lemm.ee 1 points 6 months ago

Only less efficient.

[–] son_named_bort@lemmy.world 2 points 6 months ago

They'll keep someone in the truck for maintenance purposes. A self driving truck wouldn't be able to change a flat tire for example and it would be more efficient to have the human driver change it than wait for someone to come out and change the flat.

[–] pr06lefs@lemmy.ml 2 points 6 months ago

Just like Mercedes' "full self driving", this sounds like it's on limited routes where there's been extensive testing. I don't expect truck driving to go full auto on arbitrary roads in the next few years. The tech is not there yet.

[–] victorz@lemmy.world 2 points 6 months ago

I'm hoping they'll hit nobody on the road, with somebody on board.

I have a feeling funding for self-driving truck tech may stall a bit if marijuana rescheduling changes the fact that a single positive piss test can get your CDL revoked for good.