Hello,
I don’t think this exists.
There are methods to simulate mouse and touch interactions on containers/widgets, but nothing to simulate keyboard interactions.
I realise this might be difficult behaviour to define, as keyboard actions can be “routed” to the focused widget, with focus being “drilled” down through multiple layers of containers, or handled in the global space by the keyboard operators.
My use case is an on-screen keyboard (OSK) that lets me send arrow keys and keystrokes into a text field widget.
The idea is that a user can select text within the widget, then press the delete key to remove the selection, start typing to replace it, or use the arrow keys to move the edit cursor. Essentially, the OSK should exactly replicate a physical keyboard.
I realise I would have to manually move focus back to the widget before sending keystrokes into it, as the user will have just shifted focus to a button on the OSK, but I believe there are methods for setting focus.
I am currently appending to or popping from the text widget’s value parameter, but I would like to provide a more native experience.
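For what it’s worth, in the absence of native key-event injection, the editing behaviour described above can be modelled outside the widget and then written back to its value. Below is a minimal, framework-agnostic Python sketch of that idea; every name in it (`EditBuffer`, `press`, the key strings) is hypothetical and would need to be mapped onto the real toolkit’s text widget and focus API:

```python
class EditBuffer:
    """Hypothetical model of a text field: text, caret position, and
    an optional (start, end) selection. An OSK button handler would
    call press() and then copy .text back into the widget's value."""

    def __init__(self, text: str = ""):
        self.text = text
        self.cursor = len(text)   # caret index, 0..len(text)
        self.selection = None     # (start, end) or None

    def _delete_selection(self) -> bool:
        """Remove the selected range, if any; return True if one was removed."""
        if self.selection is None:
            return False
        start, end = sorted(self.selection)
        self.text = self.text[:start] + self.text[end:]
        self.cursor = start
        self.selection = None
        return True

    def press(self, key: str) -> None:
        """Apply one OSK key press, mimicking physical-keyboard semantics."""
        if key == "Left":
            self.selection = None
            self.cursor = max(0, self.cursor - 1)
        elif key == "Right":
            self.selection = None
            self.cursor = min(len(self.text), self.cursor + 1)
        elif key in ("Delete", "Backspace"):
            # Either key first clears a selection; otherwise edit at the caret.
            if not self._delete_selection():
                if key == "Backspace" and self.cursor > 0:
                    self.text = (self.text[:self.cursor - 1]
                                 + self.text[self.cursor:])
                    self.cursor -= 1
                elif key == "Delete":
                    self.text = (self.text[:self.cursor]
                                 + self.text[self.cursor + 1:])
        else:
            # Printable character: typing replaces any selection.
            self._delete_selection()
            self.text = self.text[:self.cursor] + key + self.text[self.cursor:]
            self.cursor += len(key)


buf = EditBuffer("hello world")
buf.selection = (0, 5)
buf.press("H")        # typing replaces the selection -> "H world"
```

This keeps the widget’s value as the single output while still giving selection-replace and cursor-navigation semantics, rather than only append/pop at the end of the string.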