extended hub failed reading devices in parallel

denji_00

I am working on Ubuntu 20.04.6 LTS. I connected multiple cameras to my PC and wrote Python code to read video frames from each camera in parallel. I can see a unique device ID for each camera, and I can take a picture when accessing the cameras one at a time. Note that the extension hub has more than one camera connected to it, and it is a high-bandwidth hub, so data transfer should be smooth. However, the parallel process I start can only read one of the many cameras connected to the hub.
I want to know what factors are preventing the other cameras from streaming video. When I check the status, the log simply says the camera can't capture an image. I checked the code and it is not a code problem; it seems to be how the underlying OS is handling this communication. Can you please help me, point me in the right direction, or share a solution if you have already faced and solved this problem?
 


Generally this is caused by webcams requesting all the available bandwidth on the USB host controller. With that in mind, if you have Wireshark and capinfos you can measure just how much bandwidth a single camera uses. It should be something like

4 megabits per second at 320x240
14 megabits per second at 640x480
32 megabits per second at 1280x720
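For context, those figures look like compressed (MJPEG-style) rates; uncompressed YUYV needs far more, which is what actually saturates a USB 2.0 port. A rough sketch of the arithmetic, assuming 2 bytes per pixel and 30 fps (the 280 Mbit/s "effective" budget is an assumption for isochronous payload, not a measured value):

```python
# Rough USB bandwidth arithmetic for uncompressed YUYV video
# (2 bytes per pixel). Figures are estimates, not measurements.

def mbits_per_sec(width, height, fps=30, bytes_per_pixel=2):
    """Raw video bitrate in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1_000_000

# Practical isochronous payload is well below the 480 Mbit/s
# USB 2.0 signalling rate; 280 is an assumed ballpark figure.
USB2_EFFECTIVE = 280

for w, h in [(320, 240), (640, 480), (1280, 720)]:
    rate = mbits_per_sec(w, h)
    print(f"{w}x{h}: {rate:.0f} Mbit/s raw "
          f"(~{USB2_EFFECTIVE / rate:.1f} cams per USB 2.0 port)")
```

Note that a single 1280x720 YUYV stream already exceeds the whole USB 2.0 budget on its own, which matches the "only one camera works" symptom.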

v4l-utils is one of the best tools for debugging USB camera issues - you can get it through the repos.

You can check what modes your cameras support by running the following command:

Code:
v4l2-ctl -d /dev/video0 --list-formats
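Going one step further, here is a sketch that walks every capture node and dumps the full format/resolution/interval table, which is what you need when juggling per-camera settings (device paths are examples; v4l2-ctl must be installed):

```shell
# Enumerate every V4L2 device node and list its supported formats.
# Nodes that are metadata-only or unreadable are skipped gracefully.
for dev in /dev/video*; do
    [ -e "$dev" ] || continue            # no devices present at all
    echo "=== $dev ==="
    v4l2-ctl -d "$dev" --list-formats-ext 2>/dev/null || echo "(not readable)"
done
```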
 
@denji_00 :-

I concur with m'colleague here. Bandwidth is the primary factor involved, though depending on what you're using to view the streams, there's also other stuff to consider.

You have to remember: hubs can provide you with many additional USB ports.....but a hub is plugged into just a single port, so whatever is plugged into the hub has to share that single port's bandwidth.

I have a whole bunch of webcams kicking around - I seem to collect the darned things! - and a few months ago, following inspiration from another Puppy Linux community member, I decided to write my own application for viewing/streaming/recording multiple webcams with the idea of building my own simple home CCTV system.

I settled on V4L2 & MPlayer as the backend. V4L2 has been supported by the kernel for ages, and MPlayer has got so many different options it's ridiculous.

~~~~~~~~~~~~~~~~

I wrote the app in Bash, and created a GUI for it using YAD. Initially, I was plugging the cams into a 7-port self-powered USB 2.0 hub. One played fine; a second one showed only a black screen.

I soon discovered that by dropping the resolution, it was possible to run 2 cams thru the USB 2.0 hub.....but a 3rd one was just too much. So, I then switched to using the HP desktop rig's USB 3.0 ports.....and by careful juggling of resolutions (these are all 1080p cams, so capable of pretty high resolutions), I was able to add 2 more cams.

For my purposes, 4 cams were plenty, so I left it at that. The only thing I was never able to figure out was how to persuade mPlayer to monitor any given cam's output at the same time as recording it; as stated above, mPlayer has like a million-and-one options. The project has been around for well over 20 years, and the user documentation by this point is like a small stack of Bibles, there's so much of it!

Webcam compression support factors in here, too; M-JPEG and H.264 streams make use of it, whereas raw YUYV does NOT.
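To put numbers on that, here is a small sketch; the 1:10 MJPEG ratio is a typical ballpark assumption, actual compression varies per camera and per scene:

```python
def yuyv_mbps(w, h, fps=30):
    # YUYV packs 2 bytes per pixel, uncompressed
    return w * h * 2 * fps * 8 / 1e6

def mjpeg_mbps(w, h, fps=30, ratio=10):
    # MJPEG compresses each frame independently; 1:10 is an
    # assumed, camera-dependent ratio, not a guaranteed figure
    return yuyv_mbps(w, h, fps) / ratio

for w, h in [(640, 480), (1920, 1080)]:
    print(f"{w}x{h}: YUYV {yuyv_mbps(w, h):.0f} Mbit/s, "
          f"MJPEG ~{mjpeg_mbps(w, h):.0f} Mbit/s")
```

This is why asking the cams for a compressed format (where they offer one) lets you fit several streams down a single hub uplink, while raw YUYV fills it with one or two.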

The whole area is something of a minefield. You're very welcome to read through the development thread for MultiCam over on the Puppy Linux forums, which can be found here:-

https://forum.puppylinux.com/viewtopic.php?t=7717

Might give you some pointers...


Mike. ;)
 
