> Qt included a component which would poll for networks every 30 seconds whenever a “network access manager” was instantiated, causing pretty much any Qt app using the network to degrade your wifi for ~5 out of every 30 seconds.
This intrigues me and makes me wonder:
1. Why should it need to do this?
2. Why should this degrade the wireless network performance? (I don't recall the details of wi-fi but shouldn't it be able to do this passively without disrupting anything?)
What I vaguely remember from the time (~10 years ago) is that this was a requirement for phones (e.g., when Qt was used on a Nokia phone): network requests had to go to the interface preferred by the user (so as not to use the cell network when wifi is available).
Anyway, the whole thing was implemented in a "bearer manager" plugin, and that thing was quite buggy. https://doc.qt.io/qt-5/bearer-management.html
I think that, apart from people running Qt on phones, this had virtually no use. And yet it was enabled by default on all platforms and caused lots of trouble. What we ended up doing for the application we were shipping was to make sure this plugin was not included in the final package.
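The "don't ship the plugin" workaround can be sketched like this. The layout below (`dist/plugins/bearer`, `qgenericbearer.dll`) is an assumed example of a typical Windows deployment tree, not taken from the thread; the point is simply that if the bearer plugin directory is absent from the deployed package, Qt never loads it and the polling never starts.

```python
# Sketch: simulate a deployed Qt app's plugin tree, then strip the
# bearer plugin before packaging. Paths and file names are assumptions
# for illustration (a real tree comes from windeployqt or similar).
import shutil
from pathlib import Path

dist = Path("dist/plugins")
(dist / "bearer").mkdir(parents=True, exist_ok=True)
(dist / "platforms").mkdir(parents=True, exist_ok=True)
(dist / "bearer" / "qgenericbearer.dll").touch()  # the plugin to exclude

# The workaround: remove the whole bearer plugin directory. Plain
# QNetworkAccessManager requests still work without it; only the
# bearer/session machinery (and its polling) is gone.
shutil.rmtree(dist / "bearer")

print(sorted(p.name for p in dist.iterdir()))
```

If I recall correctly, later Qt 5 releases also grew a `QT_BEARER_POLL_TIMEOUT` environment variable to tune or disable the polling without touching the package, but I'd verify that against the docs for the Qt version in use.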
1. If you want the machine to switch to stronger / more preferred networks when available, it needs to know about them somehow.
2. The radio in most (all?) WiFi chipsets can only tune to one channel at a time, so scanning all channels requires retuning to each one, which takes a little time. On top of that, if it needs to send/receive at the lowest bitrate (used for discovery/broadcasting), that takes up a ton of wall clock time, which is equivalent to a lot of bandwidth at the highest bitrate.
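Some back-of-the-envelope numbers for the two effects above. All figures here are illustrative assumptions (frame size, dwell time, channel count), not measurements from the thread:

```python
# Effect 2a: airtime cost of frames sent at the lowest bitrate.
BASE_RATE_MBPS = 1.0   # 802.11b base rate used for broadcast/probe frames
TOP_RATE_MBPS = 54.0   # 802.11g top data rate
frame_bytes = 300      # assumed rough size of a probe response

airtime_base_ms = frame_bytes * 8 / (BASE_RATE_MBPS * 1000)  # bits / (bits per ms)
airtime_top_ms = frame_bytes * 8 / (TOP_RATE_MBPS * 1000)
print(f"{airtime_base_ms:.2f} ms at base rate vs {airtime_top_ms:.3f} ms at top rate "
      f"({airtime_base_ms / airtime_top_ms:.0f}x the airtime)")

# Effect 2b: time spent off the home channel while retuning through a scan.
channels = 13          # 2.4 GHz channels (region-dependent)
dwell_ms = 100         # assumed per-channel dwell time
print(f"one full scan keeps the radio off-channel ~{channels * dwell_ms / 1000:.1f} s")
```

At these assumed numbers, every low-rate frame costs ~54x the airtime of the same bytes at the top rate, and a single scan parks the radio off its home channel for over a second, so repeating this every 30 seconds lines up with the "degraded for ~5 out of every 30 seconds" complaint in the quote.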