Posted by jjwiseman 21 hours ago
If you want a hardware upgrade, it's actually reasonably inexpensive to build something direct-drive (which is how the real sub-30kg ones work). That gives you a lot of advantages over the plastic-gear type, including much faster response and more accurate positioning.
Look for "gimbal motors", basically large, skinny pancake brushless motors. Combine those with a STorM32, SimpleBGC or ODrive controller and a magnetic encoder. A 3D printer will help.
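If you go the ODrive route, commanding an axis from Python is only a few lines. A minimal sketch, assuming ODrive firmware 0.5.x and its matching Python package, with the motor and magnetic encoder already configured and calibrated:

    # Sketch: command a gimbal axis position via the ODrive Python API (0.5.x-era constants).
    import odrive
    from odrive.enums import AXIS_STATE_CLOSED_LOOP_CONTROL, CONTROL_MODE_POSITION_CONTROL

    odrv = odrive.find_any()                      # connect to the drive over USB
    axis = odrv.axis0
    axis.controller.config.control_mode = CONTROL_MODE_POSITION_CONTROL
    axis.requested_state = AXIS_STATE_CLOSED_LOOP_CONTROL

    axis.controller.input_pos = 0.25              # target position in turns (0.25 turn = 90 degrees)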
You can also have a look at low-cost UAV suppliers if you want something fully integrated. A basic Gremsy, Viewpro or Siyi isn't that much more than your Amazon thing. They have various software bugs, but those can be worked around. The DJI units can sometimes be had used, and some of the protocols have already been reverse-engineered.
Granted, pronouncing the name is ambiguous: Wes-cam or We-scam. But they're well enough known in the industry at this point that it's not a problem for them.
For a project I needed a low-latency RTSP stream as well. When reading a video stream with OpenCV, the default video buffer is quite large, and once it fills up the video lags behind by a second or two, which makes any real-time interaction with it impossible.
I wasn't familiar with the setting you use to overcome this: setting cv2.CAP_PROP_BUFFERSIZE to 1 on the VideoCapture. I am not sure, but you might get even lower latency by turning to OpenCV's GStreamer support. For me the trick was:
gst = f"rtspsrc location={video_url} latency=0 buffer-mode=auto ! decodebin ! videoconvert ! appsink max-buffers=1 drop=true"
self.video = cv2.VideoCapture(gst, cv2.CAP_GSTREAMER)
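To make that snippet self-contained, here's roughly how the two approaches fit together: the GStreamer pipeline first, with the cv2.CAP_PROP_BUFFERSIZE trick as a fallback for OpenCV builds without GStreamer support. The URL and the display loop are placeholders, not code from the article:

    import cv2

    video_url = "rtsp://camera.local:554/stream"  # placeholder

    # Preferred: GStreamer pipeline with a 1-frame appsink so stale frames get dropped.
    gst = (f"rtspsrc location={video_url} latency=0 buffer-mode=auto "
           "! decodebin ! videoconvert ! appsink max-buffers=1 drop=true")
    cap = cv2.VideoCapture(gst, cv2.CAP_GSTREAMER)

    # Fallback: FFmpeg backend, asking OpenCV to keep only one frame in its buffer.
    if not cap.isOpened():
        cap = cv2.VideoCapture(video_url, cv2.CAP_FFMPEG)
        cap.set(cv2.CAP_PROP_BUFFERSIZE, 1)  # honored by some backends, ignored by others

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("stream", frame)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()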
When testing, I also found out that the codec and image settings of the camera matter. With an H.264 stream, the images came in batches of several frames, whereas MJPEG provided a more constant image stream with lower latency. Lastly, disabling 3D noise reduction also removed some delay (a quick way to measure the batching is sketched below).

I'm looking forward to doing more experiments, including integrating my own payloads (thermal and optical) with an off-the-shelf motion control system. I also have an automotive radar unit coming, which may provide some interesting options for cueing without using ADS-B in some situations (with relatively close targets).
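To see the H.264 batching mentioned above for yourself, logging the gap between consecutive cap.read() calls makes it obvious. Something like this (placeholder URL, purely illustrative):

    import time
    import cv2

    cap = cv2.VideoCapture("rtsp://camera.local:554/stream")  # placeholder
    prev = time.monotonic()
    while True:
        ok, _ = cap.read()
        if not ok:
            break
        now = time.monotonic()
        # With H.264 you tend to see clusters of near-zero gaps followed by a long gap;
        # MJPEG gaps are much more uniform.
        print(f"inter-frame gap: {(now - prev) * 1000:.1f} ms")
        prev = now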
---
"Lucas," Bobby said, his mouth half full of cold fried chicken, "how come it's taking us an hour and a half to get to New York? We aren't exactly crawling.
"Because," Lucas said, pausing for another sip of cold white wine, "that's how long it's taking us. Ahmed has all the factory options, including a first-rate countersurveillance system. On the road, rolling, Ahmed provides a remarkable degree of privacy, more than I'm ordinarily willing to pay for in New York. Ahmed, you get the feeling anybody's trying to get to us, listen in or anything?"
"No, sir," the voice said. "Eight minutes ago our identification panel was infra-scanned by a Tactical helicopter. The helicopter's number was MH-dash-3-dash-848, piloted by Corporal Roberto
"Okay, okay," Lucas said. "Fine. Never mind You see? Ahmed got more on those Tacs than they got on us."
> Their motors are designed for slow, dampened pans across a stage, not for tracking a jet moving at 300 knots. The mechanical and electronics latency is significant; if you simply tell the camera to “follow that plane,” by the time the motors react, the target has often moved out of the frame.
Is he able to move the motors faster than they are designed to be moved? Is this the __Control (PID + Feed-Forward Loop)__ fix?
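For what it's worth, the usual approach isn't spinning the motors past their rating, it's not waiting for the error to build up: a feed-forward term commands the angular rate you expect from the target's predicted motion, and the PID only cleans up the residual pointing error. A rough sketch of one control step (names and gains are made up, not the article's code):

    # Sketch of a PID + feed-forward pointing loop for one axis.
    # feedforward_rate_dps comes from a target-motion prediction (e.g. an ADS-B track),
    # so the mount is already slewing at roughly the right rate before any error appears.
    def control_step(error_deg, error_integral, prev_error_deg, feedforward_rate_dps, dt,
                     kp=2.0, ki=0.1, kd=0.05):
        error_integral += error_deg * dt
        derivative = (error_deg - prev_error_deg) / dt
        pid_rate_dps = kp * error_deg + ki * error_integral + kd * derivative
        command_rate_dps = pid_rate_dps + feedforward_rate_dps  # feed-forward does most of the work
        return command_rate_dps, error_integral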
In terms of angular speed, unless you have a helicopter flying very low, I doubt aircraft move significantly faster than a preacher or a teacher, which are the intended use cases according to the article. Angular rate is roughly target speed divided by range, so a distant jet can actually sweep across the frame more slowly than a person pacing a nearby stage.
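Putting rough, illustrative numbers on that:

    import math

    def angular_rate_dps(speed_mps, range_m):
        """Peak angular rate for a target crossing perpendicular at the given range."""
        return math.degrees(speed_mps / range_m)

    print(angular_rate_dps(154.0, 2000.0))  # jet at ~300 kt, 2 km away   -> ~4.4 deg/s
    print(angular_rate_dps(1.5, 10.0))      # person walking, 10 m away   -> ~8.6 deg/s
    print(angular_rate_dps(60.0, 300.0))    # low helicopter, 300 m away  -> ~11.5 deg/s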
And it can be combined with a source of ADS-B data so it knows what it's looking at, displaying the info on the OSD.
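If anyone wants to wire that up, the easiest source is probably a local dump1090 instance, which exposes decoded ADS-B as the SBS/BaseStation CSV feed on TCP port 30003. A rough sketch (host, port and field handling are my assumptions, not from the article):

    import socket

    def adsb_messages(host="localhost", port=30003):
        # Connect to dump1090's SBS/BaseStation feed and yield decoded fields.
        with socket.create_connection((host, port)) as sock:
            buf = b""
            while True:
                data = sock.recv(4096)
                if not data:        # feed closed
                    return
                buf += data
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    fields = line.decode(errors="ignore").strip().split(",")
                    if len(fields) > 16 and fields[0] == "MSG":
                        yield {
                            "icao": fields[4],
                            "callsign": fields[10].strip(),
                            "altitude_ft": fields[11],
                            "lat": fields[14],
                            "lon": fields[15],
                        }

    # Example: print anything with a position, e.g. to feed the OSD overlay.
    for msg in adsb_messages():
        if msg["lat"] and msg["lon"]:
            print(msg)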