Responding to user input is a fundamental part of user-interface design. The best way to receive user input depends on the use case an application addresses and the form factor of the device it runs on.
Allowing users to physically touch a screen to interact with an application is a popular user-interface paradigm on portable devices such as smartphones and tablets. In desktop applications, detecting and reacting to clicks and presses at the mouse cursor position is an equally fundamental concept.
Touch-driven and mouse-driven user interfaces are supported by various input handler types and by visual object types such as Flickable and MouseArea.
See also the documentation about Qt Quick mouse events.
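As a minimal sketch of mouse and touch handling, a MouseArea can be placed inside a visual item to react to clicks or taps (the colors and sizes here are arbitrary illustration values):

```qml
import QtQuick 2.15

Rectangle {
    width: 200; height: 200
    color: "lightgray"

    // MouseArea covers the whole rectangle and reacts to a
    // mouse click on desktop or a tap on a touch screen.
    MouseArea {
        anchors.fill: parent
        onClicked: parent.color = "steelblue"
    }
}
```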
Supporting input from a keyboard is a vital component of the user interface of many applications.
Any visual item can receive keyboard input through the Keys attached type. Additionally, when multiple items are required to receive key events, the issue of keyboard focus arises, as these events must be delivered to the correct item. See the documentation about Qt Quick keyboard focus for more information on this topic.
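For example, a plain Item can handle key presses via the Keys attached type, provided it has keyboard focus (the key chosen and the log message are illustrative):

```qml
import QtQuick 2.15

Item {
    width: 200; height: 200
    focus: true  // without focus, the item receives no key events

    // Keys is an attached type available on any visual item.
    Keys.onPressed: (event) => {
        if (event.key === Qt.Key_Left) {
            console.log("left arrow pressed")
            event.accepted = true  // stop further propagation
        }
    }
}
```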
Qt Quick also provides visual text items which automatically receive keyboard events and key presses, and display the appropriate text. See the documentation about text input for in-depth information on the topic.
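A TextInput item is one such text item: once focused, it accepts key presses and displays the typed text without any extra handling code (the dimensions below are arbitrary):

```qml
import QtQuick 2.15

Rectangle {
    width: 200; height: 40
    border.color: "gray"

    // TextInput automatically handles key events and renders
    // the entered text while it has keyboard focus.
    TextInput {
        anchors.fill: parent
        anchors.margins: 4
        focus: true
    }
}
```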
Detecting device gestures with an accelerometer, or through camera-based gesture recognition, can allow users to interact with an application without requiring their full and undivided attention. It can also provide a more interactive and engaging experience.
Qt Quick itself does not offer first-class support for physical device motion gestures; however, the Qt Sensors module provides QML types with support for such gestures. See the Qt Sensors module documentation for more information on the topic.
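As a rough sketch of how the Qt Sensors module can feed motion data into QML, an Accelerometer type reports readings that an application could interpret as a gesture; the threshold below is an arbitrary assumption, not a recommended value:

```qml
import QtQuick 2.15
import QtSensors 5.15

Item {
    Accelerometer {
        active: true
        // reading.x/y/z are accelerations in m/s^2; a large
        // spike on one axis is used here as a crude shake heuristic.
        onReadingChanged: {
            if (Math.abs(reading.x) > 20)
                console.log("possible shake gesture")
        }
    }
}
```

Real applications would typically use the higher-level gesture facilities of Qt Sensors rather than hand-rolled thresholds; see that module's documentation for the supported types.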