Input: Prevent focus change on virtual keyboards #3984
Conversation
This fix addresses an edge case where clicking on a shell layer surface that is acting as a virtual keyboard or virtual pointer causes a focus change when that surface is on a different output from the currently focused window. This is expected behaviour for most cases, but shouldn't happen with a virtual keyboard: a virtual keyboard should forward key events to whatever is focused, regardless of which screen it's on. It's not unreasonable to imagine a virtual keyboard on a small touchscreen while the "content" is on a larger display, and the current focus change behaviour makes that impossible.

In the usual click (or touch) handling logic, if no toplevel window is found under the cursor to either focus or deliver the event to, the output gets focused. This makes sense because, if the output is empty (barring shell surfaces like Waybar), it pre-selects the output so the next opened window will appear there.

This patch adds a quick check for a shell layer surface under the pointer, checks whether that surface belongs to a client that has any `zwp_virtual_keyboard_v1` or `zwlr_virtual_pointer_v1` objects, and prevents the output focus change if it does.
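The described check can be sketched as a minimal model. Note that `ClientId`, `ProtocolObject`, and `is_virtual_input_client` are hypothetical stand-ins for illustration, not niri's or Smithay's actual types:

```rust
// Minimal model of the check described above -- hypothetical types, not
// niri's or Smithay's actual API. A client counts as "virtual input" if it
// has bound any zwp_virtual_keyboard_v1 or zwlr_virtual_pointer_v1 object.

#[derive(Clone, Copy, PartialEq, Eq)]
struct ClientId(u32);

/// One protocol object owned by a client, identified by its interface name.
struct ProtocolObject {
    client: ClientId,
    interface: &'static str,
}

/// Returns true if `client` owns any virtual keyboard/pointer object, i.e.
/// clicking its layer surface should NOT move output focus.
fn is_virtual_input_client(objects: &[ProtocolObject], client: ClientId) -> bool {
    objects.iter().any(|o| {
        o.client == client
            && matches!(
                o.interface,
                "zwp_virtual_keyboard_v1" | "zwlr_virtual_pointer_v1"
            )
    })
}

fn main() {
    let objects = [
        ProtocolObject { client: ClientId(1), interface: "wl_surface" },
        ProtocolObject { client: ClientId(1), interface: "zwp_virtual_keyboard_v1" },
        ProtocolObject { client: ClientId(2), interface: "wl_surface" },
    ];
    // Client 1 (the on-screen keyboard) suppresses the focus change...
    assert!(is_virtual_input_client(&objects, ClientId(1)));
    // ...while a plain shell-layer client (e.g. a bar) does not.
    assert!(!is_virtual_input_client(&objects, ClientId(2)));
    println!("ok");
}
```

This mirrors the "iterate the client's objects" approach the patch takes; the iteration cost is what the later discussion of `niri.virtual_keyboard_state` tries to avoid.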
|
Thanks. I agree this is a problem; however, could you please research how other compositors handle this? Checking whether virtual keyboard/pointer objects are bound is fairly hacky, and surely we'll hit some false positives? Also, @stefonarch, perhaps you have some insight? |
|
Thanks for writing back. With a quick test, Hyprland handles the pointer_button case properly, but I'm having a bit of trouble mapping the touchscreen, so I'll test that properly once I figure it out. Sway handles both pointer_button and touch_down properly. That's all I have installed at the moment, but I'm guessing that's probably enough? I imagine something like GNOME or Plasma would be irrelevant, since this is more of a workspace-per-monitor situation. I'll see if I can find the relevant input handling logic in those codebases. |
|
Do they activate the output on pointer and touch in general? How does it work there? |
|
In Hyprland, when you move your pointer from screen to screen, you can see the workspace become focused (thanks to the highlighting in Waybar), but that doesn't seem to affect the focused window at all (unless you land over a window with focus-follows-pointer enabled). Even clicking in a blank spot or on a shell layer keeps the currently focused window focused. I think the focused workspace only affects where the next window will land, and doesn't touch keyboard focus at all.

For Sway, if focus-follows-pointer is off, nothing changes when you move from screen to screen: no visual hints in Waybar, and my next opened window appears on the previously active screen. Clicking on a shell layer changes nothing. Clicking in a blank spot, though, is a bit weird: it activates the workspace and kind of semi-activates every (visible?) window on the workspace. Their borders all become highlighted, but you can see that they're not actually activated. With focus-follows-pointer, landing anywhere on another screen that isn't a window will just focus the most recently focused window on that output, like Niri. |
|
Finally had a look at Hyprland's code, and they maintain a "desktop" focus state separately from the seat's focus state. The "desktop" focus state holds the active monitor, window, and surface; the seat focus state contains the keyboard+pointer+touch focus (and drag-and-drop) that I've seen in Niri. Changing the desktop's window focus also changes the seat's focus; changing the desktop's monitor focus does pretty much nothing. In their pointer_motion and pointer_button handlers, they set the monitor focus first thing, then look for surfaces and focus accordingly. A shell layer will only receive focus if it allows keyboard interactivity. Maybe it's that simple? Don't change window focus when changing monitor focus. I'll mock up a draft patch to test locally. |
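The focus split described above can be sketched with stub types (hypothetical, not Hyprland's actual code): monitor focus and keyboard focus live in separate structures, and only window-focus changes propagate to the seat.

```rust
// Sketch of the "desktop focus vs seat focus" split -- hypothetical types
// for illustration, not Hyprland's actual implementation.

#[derive(Default)]
struct Seat {
    keyboard_focus: Option<u32>, // focused window id, if any
}

#[derive(Default)]
struct Desktop {
    focused_monitor: Option<u32>,
    focused_window: Option<u32>,
}

impl Desktop {
    /// Called first thing in pointer_motion / pointer_button: remembers where
    /// the next window should open, but does not touch keyboard focus.
    fn focus_monitor(&mut self, monitor: u32) {
        self.focused_monitor = Some(monitor);
    }

    /// Focusing a window is what actually moves the seat's keyboard focus.
    fn focus_window(&mut self, seat: &mut Seat, window: u32) {
        self.focused_window = Some(window);
        seat.keyboard_focus = Some(window);
    }
}

fn main() {
    let (mut desktop, mut seat) = (Desktop::default(), Seat::default());
    desktop.focus_window(&mut seat, 7); // window 7 gets keyboard focus
    desktop.focus_monitor(1);           // click empty space on monitor 1
    // Monitor focus moved, but the keyboard still targets window 7.
    assert_eq!(desktop.focused_monitor, Some(1));
    assert_eq!(seat.keyboard_focus, Some(7));
    println!("ok");
}
```

Under this model, clicking a virtual keyboard's layer surface on another output would only update monitor focus, leaving keyboard focus (and thus key forwarding) untouched.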
I'm not overly keen on iterating the whole collection of a client's objects, so I was initially hoping there'd be a quick way to use the global `niri.virtual_keyboard_state` object to query any active `ZwpVirtualKeyboardV1` instances, assuming the manager global kept a list, and to check whether one was "owned" by the target client, but it didn't look like this was possible. Maybe it could be implemented? I dunno, I'm new here. 😅 I'd be happy to try and draft this up if it sounds reasonable to those more experienced than me.

Please note that I'm not very familiar with Rust or its conventions, I'd never looked at Smithay before this, and this is my first ever PR, so I kinda have no idea what I'm doing. But I know this code works.

Fixes #3971
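The alternative floated above could look roughly like this. Everything here is hypothetical: Smithay's virtual keyboard state does not currently expose such a list, and `VirtualKeyboardState`, `new_instance`, and `client_has_keyboard` are invented names for illustration:

```rust
// Hypothetical sketch: if the manager's global state tracked which client
// created each virtual keyboard instance, the ownership check would be a
// single lookup instead of a walk over all of the client's objects.

#[derive(Clone, Copy, PartialEq, Eq)]
struct ClientId(u32);

/// Invented manager state that records the creator of each instance.
#[derive(Default)]
struct VirtualKeyboardState {
    instances: Vec<ClientId>,
}

impl VirtualKeyboardState {
    /// Record a new virtual keyboard bound by `client`.
    fn new_instance(&mut self, client: ClientId) {
        self.instances.push(client);
    }

    /// The query the PR description wishes existed.
    fn client_has_keyboard(&self, client: ClientId) -> bool {
        self.instances.contains(&client)
    }
}

fn main() {
    let mut state = VirtualKeyboardState::default();
    state.new_instance(ClientId(1)); // on-screen keyboard binds the global
    assert!(state.client_has_keyboard(ClientId(1)));
    assert!(!state.client_has_keyboard(ClientId(2)));
    println!("ok");
}
```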