Reading the comments under the article made me wonder when I'll become the bad neighbour and start suffering deauth attacks without knowing it. I have two APs and 3 SSIDs: one on 2.4 GHz for my 15 IoT devices, a 5 GHz one on the same network, and a separate 5 GHz one for "serious" devices that I use only for work. However, I live in an apartment building and probably have both the most devices and the most SSIDs of anyone here. I've pondered before whether I'm causing inconvenience to someone and how I'd fix it if complaints ever arose.
If you want to be a good neighbor, disable support for "legacy" rates in your access points. The periodic beacon frames are transmitted at the slowest supported rate so that the least capable client devices can receive them. For 802.11b, that means 1 Mbps. At such a low rate, the beacon frames consume a non-trivial amount of bandwidth: each access point you see could be eating over 3% of the channel. Turning off 802.11b/g support (if you can) significantly reduces the cost of each access point's broadcasting. If you're really nice you could also decrease the beacon frequency, but that's not really worth it, since buggy clients sometimes don't cope well with it.
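Rough back-of-the-envelope on where that 3% comes from (the ~300-byte beacon size and the default 100 TU interval below are my assumptions, real beacons vary):

    # Rough beacon airtime per network; frame size and interval are assumed.
    BEACON_BYTES = 300          # MAC header + IEs + FCS, roughly typical
    BEACON_INTERVAL_S = 0.1024  # default 100 TU

    def beacon_airtime_fraction(rate_mbps, preamble_us):
        tx_time_s = preamble_us / 1e6 + (BEACON_BYTES * 8) / (rate_mbps * 1e6)
        return tx_time_s / BEACON_INTERVAL_S

    # 802.11b lowest rate: 1 Mb/s, plus the 192 us long preamble
    print(f"1 Mb/s beacons: {beacon_airtime_fraction(1, 192):.1%} per network")
    # Legacy rates disabled: lowest OFDM rate is 6 Mb/s, ~20 us preamble
    print(f"6 Mb/s beacons: {beacon_airtime_fraction(6, 20):.1%} per network")

That comes out around 2.5% per network at 1 Mb/s versus roughly 0.4% at 6 Mb/s; with a fatter beacon (lots of vendor IEs) or a router broadcasting several SSIDs per radio, the 1 Mb/s figure easily passes 3%.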
The channels around my house are so busy that, to make wifi reliable, I have three access points, each on its own non-overlapping 20 MHz channel in 2.4 GHz. Two of them also serve an 80 MHz channel at 5.8 GHz. They all use the same SSID, so my devices just pick whichever they think has the best signal strength. I also ran Cat6 to my TV and desktop so they don't need to go over wifi, and I disabled 802.11b/g on all but one of the access points to keep the beacon-frame overhead down.
This is why the Comcast SSIDs everywhere are so annoying: they make the spectrum tangibly worse just by existing. The same goes for wifi printers that broadcast their own AP. Something like 30 APs sending beacon frames is enough to completely trash 20 MHz of 2.4 GHz spectrum.
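To put a hedged number on "completely trash", using the same rough assumptions as the sketch up-thread (~300-byte beacons, default 100 TU interval): 30 co-channel networks burn most of the raw airtime on beacons alone, before anyone sends a byte of payload, and the contrast with legacy rates disabled is stark:

    # Aggregate beacon overhead for 30 co-channel networks (rough sketch;
    # each SSID beacons separately, so a router with 3 SSIDs counts as 3).
    PER_NETWORK_1MBPS = 0.025   # ~300-byte beacon at 1 Mb/s, 100 TU interval
    PER_NETWORK_6MBPS = 0.004   # same beacon at 6 Mb/s (legacy rates off)

    for label, frac in [("1 Mb/s beacons", PER_NETWORK_1MBPS),
                        ("6 Mb/s beacons", PER_NETWORK_6MBPS)]:
        print(f"30 networks, {label}: ~{30 * frac:.0%} of the channel gone")

That's roughly 75% of airtime at 1 Mb/s versus about 12% with legacy rates off, and that's before counting contention and collisions.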