This whole conversation is based on a flawed assumption: that the most common resolutions are multiples of 360. They're not. The most common resolution for video is, in fact, 720x480, and 480 is not a multiple of 360. The lowest resolution in common use on computer monitors today is 1360x768; 768 isn't a multiple of 360 either.
In fact, most of the common computer resolutions do not fit this rule.
The most common VGA resolutions are 640x480 and 320x200.
Super-VGA added 800x600 and 1024x768.
The next step up from that was 1280x1024 (not even a 4:3 ratio... it's actually 5:4).
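To make that aspect-ratio claim concrete: dividing width and height by their greatest common divisor gives the ratio in lowest terms. A throwaway sketch in Python, purely illustrative and not part of the original discussion:

    from math import gcd

    def aspect_ratio(width, height):
        # Reduce the resolution by its greatest common divisor
        # to get the simplest whole-number aspect ratio.
        d = gcd(width, height)
        return f"{width // d}:{height // d}"

    print(aspect_ratio(640, 480))    # 4:3
    print(aspect_ratio(1024, 768))   # 4:3
    print(aspect_ratio(1280, 1024))  # 5:4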
It wasn't until HDTV finally took off that widescreen resolutions became a thing, and the most common resolution on a laptop display was 1440x900. The recent proliferation of cheap TV displays lowered that to 1360x768. (I don't think you can even buy a 720p computer monitor. Everything is 1360x768, 1920x1080, or higher.)
So if you actually look at all the vertical resolutions out there in display panels (480, 720, 768, 900, 1024, 1080, 1440), there's no clear pattern. The same goes for video resolutions. Some people may choose to encode at progressive levels of quality because the numbers look good, but the accepted resolutions in standards like DVD and Blu-ray do not follow a numerical pattern either.
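If you want to check the multiple-of-360 claim yourself, a quick divisibility test over those vertical resolutions shows how few of them fit (again, just an illustrative Python snippet):

    # Which common vertical resolutions are actually multiples of 360?
    heights = [480, 720, 768, 900, 1024, 1080, 1440]
    for h in heights:
        verdict = "is" if h % 360 == 0 else "is not"
        print(f"{h} {verdict} a multiple of 360")
    # Only 720, 1080, and 1440 pass; 480, 768, 900, and 1024 do not.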
So as far as I can tell, this comes down to a case of apophenia: seeing a pattern where there isn't one.