
Posted by mhb 4 days ago

Sub-$200 Lidar could reshuffle auto sensor economics (spectrum.ieee.org)
392 points | 529 comments
9999_points 14 hours ago|
I wonder if this could be adapted to the vtuber market. Saw a vtuber body tracker being marketed at $11k recently.
brador 18 hours ago||
Is this human-safe at these volumes? There was a time when you could get your feet sized by putting them into an X-ray box at the shoe store. They were removed from stores once the harm was known.
skandinaff 16 hours ago|
Well, the energy levels used in those devices should be minuscule, and the wavelengths used are well studied. The problem with X-rays was the lack of studies on health effects, and of regulations on those effects. I think, since that time, we've studied radiation (be it light, RF, or other parts of the spectrum) much more. There is indeed a possibility that we're overlooking some bio-electromagnetic interaction effects; for instance, there is now some evidence that LED lights might not be harmless - but again, it's not that they affect biological structures somehow, it's that the lack of spectral components has some effects. It is an interesting topic to research. But the lidar "should" be safe
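The physics behind this point can be made concrete with a back-of-the-envelope photon-energy comparison (my own illustration, not from the thread): near-IR lidar light is non-ionizing no matter how bright the beam, because each individual photon carries far too little energy to ionize biological molecules.

```python
# Photon energy E = h*c / wavelength, converted to electronvolts.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / wavelength_m / EV

lidar_905 = photon_energy_ev(905e-9)    # common automotive lidar band
lidar_1550 = photon_energy_ev(1550e-9)  # "eye-safer" lidar band
xray = photon_energy_ev(0.05e-9)        # typical medical X-ray wavelength

print(f"905 nm lidar photon:  {lidar_905:.2f} eV")   # ~1.37 eV
print(f"1550 nm lidar photon: {lidar_1550:.2f} eV")  # ~0.80 eV
print(f"0.05 nm X-ray photon: {xray:.0f} eV")        # ~25,000 eV

# Ionizing biological molecules takes roughly 10+ eV per photon, so
# lidar photons cannot ionize at any beam power; the remaining hazard
# is thermal (retinal heating), which Class 1 laser limits cap.
```

The thermal-hazard side (how much average power the eye can tolerate) is governed by the IEC 60825 Class 1 limits, which consumer lidars are certified against.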
brador 8 hours ago||
The main damage risk from LIDAR is to retinal rods and cones. You just know some jerk is going to overclock his system and we know some people just don't care about the harm they cause so long as they get a benefit. As a combo that means I'll be wearing protective eyewear outdoors the day this tech comes to the roads.
ingend88 8 hours ago||
Wouldn't it be easier to integrate it into home vacuums?
thegeek108 18 hours ago||
What is this author even doing with these numbers?
colechristensen 19 hours ago||
can I buy it on digikey yet?
bjrobz 20 hours ago||
I saw a Waymo in Seattle today. If Waymo can get Seattle right, that gives me a lot of confidence that their stack is very capable of handling difficult road conditions.

Note: I have not had the pleasure of riding in one yet, but from what my friend in SJ says, it’s very convenient and confidence-inspiring.

geminiboy 19 hours ago||
I have had the pleasure of riding one a few times in San Francisco.

The drive was delightful and felt really safe. It handled the SF terrain, traffic and mixed traffic like trams very well.

I wouldn't trust a self-driving Tesla (or any camera-only system) though!

rediguanayum 19 hours ago|||
I took the Waymo from San Jose airport to home on the peninsula. It took the 101 highway back for the most part, driving very conservatively at 55-65 mph and in the rightmost lane. It still has a few quirks, though. When there aren't any cars around it will speed up to 65 mph, but at on-ramps it will slow down to 55 and then speed up once past. It will get stuck behind slow drivers in the rightmost lane and patiently follow them a few car lengths behind. On the plus side, the lidar stack's field of view, as shown on the internal display, seems to see pretty far down the highway.
the_real_cher 19 hours ago||||
Tesla doesn't have lidar?
eptcyka 19 hours ago|||
No. They don't even have radar; cameras are all you need, as per Elon.
disillusioned 18 hours ago||
Even more fury-inducing, they disabled the ultrasonic parking sensors on cars that shipped with them, to move to a vision-only stack that is nowhere near as accurate or as good and which categorically cannot tell that the ground truth has changed in its blind spot. But hey, all _people_ need are two cameras, right?
aaronbrethorst 19 hours ago|||
hooboy, https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
the_real_cher 15 hours ago||
That's wild!
small_model 18 hours ago|||
Why wouldn't you trust a Tesla? Millions of people let their Teslas drive them all over the USA (not geofenced like Waymo) without touching the wheel, from parking spot to parking spot, every day. Have you tried it?
lccerina 18 hours ago|||
Maybe because of the multiple investigations Tesla currently faces over crashes, deaths, injuries, etc., all caused by "whoops, our cameras were fooled by some glare/fog and accelerated into a truck/pole"
small_model 18 hours ago||
Those are mainly Autopilot incidents, which people conflate with FSD, and a high percentage are human-caused accidents (Autopilot requires full attention and the driver is liable).
Gigachad 17 hours ago||
Why does Tesla ship a feature called "autopilot" which kills you if you use it instead of "FSD"?
simondotau 17 hours ago||
Autopilot is Tesla’s brand name for adaptive cruise control with lane centering. This is a common feature available on a wide range of vehicles from nearly every major manufacturer, though marketed under different names (e.g., ProPilot, BlueCruise).

Drivers can and do misuse adaptive cruise control systems, sometimes with fatal consequences. Memes aside, there is no strong evidence that fatal misuse occurs more frequently by owners of Tesla cars than with comparable systems from other brands.

This perception reflects the Baader–Meinhof phenomenon, more commonly known as the frequency illusion. Nobody is collecting statistics for other brands, so it’s assumed the phenomenon doesn’t occur.

A similar pattern occurred with media coverage of EV fires. Except in this case, good statistics exist which prove the opposite: ICE vehicles catch fire more often than EVs.

cheema33 17 hours ago||||
> Why wouldn't you trust a Tesla? Millions of people let their Teslas drive them all over the USA (not geofenced like Waymo)

I own a Tesla and paid about $10K for the full self-driving capability a few years ago. Yeah, I would not trust a Tesla to drive me from the airport to my house. There is a reason Tesla is still stuck at Level 2 autonomy certification and not 3, 4, or 5.

Rohansi 15 hours ago||
I would agree for most Teslas on the road. However, the very latest (HW4) cars are significantly better at FSD, to the point where I would nearly trust it now. Most of those older (pre-2023?) cars will not have their hardware upgraded, so they'll still have FSD that drives like an idiot!
notTooFarGone 18 hours ago||||
Because it is not real autonomous driving? Being liable for software that you can neither verify nor trust is THE dealbreaker. Once Tesla says "We are liable for all accidents with FSD" with higher-level autonomous driving, this game changes. But Waymo is just way more reliable.
tzs 12 hours ago|||
> millions of people let there Tesla drive them all over USA

There aren't a million Teslas with FSD active in the US. According to Tesla in their latest earnings report there are 1.1 million people worldwide with FSD.

TulliusCicero 5 hours ago||
Seattle probably isn't any harder than SF, other than the occasional weather event where the hills ice over and we get a bunch of funny (and scary) videos.
speedgoose 18 hours ago||
How could I buy one?
ck2 12 hours ago||
BTW what happens when there are hundreds of Lidar signals at one intersection?

There's no way a sensor can tell whether a return actually came from its own emitter, is there?

Guessing any signal should be treated as untrusted until verified, but I suspect coders won't be doing that unless it's easy
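For what it's worth, mutual interference between lidar units is a known problem, and one common mitigation is to tag each unit's outgoing pulses with its own pseudo-random code and correlate returns against it. A minimal sketch of that idea (my own illustration; the seeds, code length, and threshold are all made up):

```python
import random

def make_code(seed, length=64):
    """Per-unit pseudo-random +1/-1 pulse code."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(length)]

def correlate(received, code):
    """Normalized correlation of a received burst against our own code."""
    return sum(r * c for r, c in zip(received, code)) / len(code)

own = make_code(seed=1)
foreign = make_code(seed=2)  # another lidar unit at the intersection

# An echo of our own code correlates near 1.0; a foreign unit's
# transmission correlates near 0.0 and gets discarded as noise.
print(correlate(own, own))      # 1.0 -> accept as our echo
print(correlate(foreign, own))  # near 0 -> reject as interference
```

Real systems layer more on top of this (pulse-timing dither, per-shot random codes, wavelength channels), but the correlation-gate principle is the same.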

rurban 13 hours ago||
What? You've been able to get Chinese lidar sensors for 12 EUR for a long time already.
fragmede 18 hours ago|
It might, but comma.ai proves that lidar is a red herring, which is further supported by the fact that Waymo is able to drive vision-only if necessary.
TulliusCicero 5 hours ago||
No one really disputes that some level of autonomous driving is possible with only cameras, it's a matter of how safe and sure you wanna be.
KaiserPro 18 hours ago||
> comma.ai proves that lidar is red herring

I mean, it doesn't. If you actually look at it, comma.ai proves that level two doesn't require lidar. That's not the same as full-speed safe autonomy.

Whilst it is possible to drive vision-only (assuming the right array of cameras, i.e. not the way Tesla have done it), lidar gives you a low-latency source of depth that can correct vision mistakes. It's also much less energy-intensive for working out whether an object is dangerous and on a collision course.

To do that in vision, you need to work out what the object is (i.e. is it a shadow?), then you have to triangulate it. That requires continuous camera calibration, and isn't all that easy. If you have a depth "prior" - yes it's real, yes it's large, and yes it's going to collide - it's much simpler to use vision to work out what to do.
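The asymmetry described above can be sketched in a few lines (my own toy illustration; the numbers are invented): with a direct range measurement, time-to-collision is a single division, whereas the vision-only path first needs calibrated geometry just to recover depth.

```python
def ttc_from_lidar(range_m, range_rate_mps):
    """Time to collision from a direct depth measurement.
    range_rate < 0 means the object is closing on us."""
    if range_rate_mps >= 0:
        return float("inf")  # not closing: no collision course
    return range_m / -range_rate_mps

# Object 30 m ahead, closing at 10 m/s -> 3 seconds to react.
print(ttc_from_lidar(30.0, -10.0))  # 3.0

def depth_from_stereo(disparity_px, focal_px, baseline_m):
    """Classic pinhole stereo depth: z = f * B / d.
    Breaks down if calibration drifts, or if the 'object'
    is a flat shadow with no real disparity."""
    return focal_px * baseline_m / disparity_px

# Recovering that same 30 m from vision needs focal length,
# baseline, and a correct pixel match between cameras first.
print(depth_from_stereo(disparity_px=24.0, focal_px=1200.0,
                        baseline_m=0.6))  # 30.0
```

The point being made in the comment is that every term in the stereo formula is an estimate that can drift, while the lidar range is a direct measurement.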

fragmede 18 hours ago||
It's fair to point out that comma.ai is an SAE level two system; however, it's not geofenced at all, which is an SAE level 5 requirement. But really that brings up the fact that SAE's levels aren't the right ones, merely the ones they chose to define since they're the standards body. A better set of levels is the seven I go into more detail about on my blog.

As far as distinguishing shadows on the road, that's what radar is for. Shadows on the road as seen by the vision system don't show up on radar as something the vehicle will run into.
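The fusion rule implied here - a vision detection with no corresponding radar return is likely a shadow or road marking rather than a physical obstacle - can be sketched as a simple gating check (my own illustration; the coordinates and gate radius are invented):

```python
def is_physical_obstacle(vision_hit, radar_hits, gate_m=2.0):
    """vision_hit: estimated (x, y) of a vision detection, in metres.
    radar_hits: list of (x, y) radar returns.
    The vision detection is confirmed only if some radar return
    falls within gate_m of it; otherwise it's treated as a shadow."""
    vx, vy = vision_hit
    return any((vx - rx) ** 2 + (vy - ry) ** 2 <= gate_m ** 2
               for rx, ry in radar_hits)

radar = [(20.1, 0.3)]  # one solid radar return ahead of the car

print(is_physical_obstacle((20.0, 0.0), radar))  # True  -> real object
print(is_physical_obstacle((35.0, 1.0), radar))  # False -> likely shadow
```

Production trackers do this probabilistically (association gates over tracked objects, not single frames), but the shadow-rejection logic reduces to the same idea: radar doesn't see shadows.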

imtringued 16 hours ago||
Your autonomy scale is pretty arbitrary and encodes assumptions about the underlying technology and the environments the vehicle is supposed to operate in.

The SAE autonomy scale is about dividing responsibility between the driver and the assistance system. The lowest level represents full responsibility on the driver and the highest level represents full responsibility on the system.

If there is a geofenced transportation system like the Vegas loop and the cars can drive without a human driver, then that is a level 5 system. By the way, geofencing is not an "SAE level 5" requirement. Geofencing is a tool to make it easier to reach requirements by reducing the scope of what full autonomy represents.
