User Interaction - Keyboard, Mouse, Context, Action
((Sorry for the duplicate, but for some reason the feature request (Feature #832) was set to 'closed' when I submitted it?))
I would like all the User Interactions to be configurable and the configuration to be context based.
For each [context] (windowed/frameless/fullscreen/...), every user interaction should/could have its own configuration.
Hence a signal such as 'keyboard right arrow' can map to different actions depending on the context:
[navigate next image] in windowed mode
[pan image] in fullscreen mode
..and so on..
It should be possible to unbind everything, so no specific keyboard or mouse behavior is forced on the user.
It should be possible to assign several signals to the same action. Example: keyboard 'n', 'right arrow', or 'F4', or mouse scroll wheel 'down' should all be able to map to, for example, [navigate next image].
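To make the idea concrete, here is a minimal sketch of what such a context-based binding table could look like. All names (contexts, signal strings, action names) are purely illustrative, not taken from the application: each context maps signals to actions, several signals may share one action, and an empty context leaves everything unbound.

```python
from enum import Enum, auto

class Action(Enum):
    NAVIGATE_NEXT_IMAGE = auto()
    PAN_IMAGE = auto()
    SHOW_ON_SCREEN_BINDINGS = auto()

# Hypothetical binding table: context -> signal -> action.
# Signals are plain strings here for brevity.
BINDINGS = {
    "windowed": {
        "key:Right": Action.NAVIGATE_NEXT_IMAGE,
        "key:n": Action.NAVIGATE_NEXT_IMAGE,
        "key:F4": Action.NAVIGATE_NEXT_IMAGE,
        "wheel:down": Action.NAVIGATE_NEXT_IMAGE,
    },
    "fullscreen": {
        "key:Right": Action.PAN_IMAGE,  # same signal, different action
    },
    "frameless": {},  # nothing bound: no forced defaults
}

def resolve(context, signal):
    """Return the action bound to `signal` in `context`, or None if unbound."""
    return BINDINGS.get(context, {}).get(signal)
```

With a table like this, unbinding is just deleting an entry, and many-to-one mappings fall out of the dict structure for free.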
It should be possible to get [on screen information/help] about the bindings for the current context. For example, if I bind 'F2' to [show on screen bindings] and then press F2 in any context, the bindings for that context should be listed on screen.
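The on-screen help could be generated from the same binding data. A small sketch, again with purely illustrative names: group a context's signal-to-action pairs by action, so an action with several signals shows all of them on one line.

```python
from collections import defaultdict

def format_bindings(bindings):
    """Group a context's signal->action pairs by action, one display line per action."""
    by_action = defaultdict(list)
    for signal, action in bindings.items():
        by_action[action].append(signal)
    return [f"{action}: {', '.join(sorted(signals))}"
            for action, signals in sorted(by_action.items())]

# Hypothetical bindings for one context (e.g. windowed mode).
windowed = {
    "key:n": "navigate next image",
    "key:Right": "navigate next image",
    "key:F2": "show on screen bindings",
}
```

Because the overlay is derived from the live table rather than hard-coded, it always reflects the user's own configuration for the current context.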
Once implemented, I think you will receive fewer "I want mouse/keyboard to work like in <product X>" or "I want mouse/keyboard to work MY specific way" requests.
(All in all: happy users who can interact with the application EXACTLY as they want, without bothering you guys.)