This argument never made much sense to me, although I do subscribe to Fitts' Law. At the desktop monitor sizes common for the past 20+ years, the distance you have to travel, together with the visual disconnect between the application and the menu bar, negates the easier targetability. And at smaller screen sizes, you would generally maximize the application window anyway, which yields the same targetability.
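For context, the targetability argument rests on the Shannon formulation of Fitts' Law, which models pointing time in terms of distance D to the target and target width W along the axis of motion (a and b are empirical constants):

```latex
T = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

A target at the screen edge is effectively infinitely deep, since the pointer stops there, which drives W up and T down; the counterargument above is that on large monitors the increased D (plus the round trip back to the window) eats up that advantage.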
The actual historical rationale for the top menu bar was different, as explained by Bill Atkinson in this video: https://news.ycombinator.com/item?id=44338182. The problem was that, due to the small screen size, non-maximized windows often weren't wide enough to show all menus, and there often wasn't enough space vertically below the window's menu bar to show all menu items. That's why they moved the menus to the top of the screen, so that there was always enough space, despite the drawback, as Atkinson notes, of having to move the mouse all the way to the top. This drawback was significant enough that it made them implement mouse pointer acceleration to compensate.
So targetability wasn't the motivation at all, that is a retconned explanation. And the actual motivation doesn't apply anymore on today's large and high-resolution screens.
> With desktop monitor sizes since 20+ years ago, the distance you have to travel, together with the visual disconnect between application and the menu bar, negates the easier targetability.
Try it on a Mac; the way its mouse acceleration works makes it really, really easy to just flick either a mouse or a finger on a trackpad and get all the way across the screen.
I’m not saying it’s necessarily harder to reach a menu bar at the top of the screen, given suitable mouse acceleration. But you also have to move the mouse pointer back to whatever you are doing in the application window, and moving to the top menu bar is not so much (if at all) easier that it justifies the cognitive and visual separation. If that were the case, then as many application controls as possible should be moved to the edges of the screen.
I've been on Mac for 20 years and it's still annoying as hell.
Another side effect is the uselessness of the Help menu. What help am I looking at? The application owns the menu, so where's the OS help?
Oh right, it's just all mixed together. When I'm searching for information in some developer tool I'm using, I really enjoy all the arbitrary hits from the OS help about setting up printers, sending E-mail, whatever.