Getting in touch with Qt Quick: Gestures and QML

We here in Qt development are pretty excited about the Qt 4.7.0 release. In particular, some of us have been focusing our enthusiasm on the Qt Quick framework.

Gestures are a natural fit for the easy UI development that the QML language enables. Currently, however, there isn't any gesture functionality exposed in the core declarative library. A custom class can both grab gestures and expose a QML interface, but this raises the bar for use significantly. It also leaves the current set of declarative elements unloved. This isn't a recipe for a happy ending. ;-(

Not Quite Zero

Observant readers may have noted the existence of a GestureArea QML element in the documentation. I won't waste time and screen real estate duplicating what is written there, but I would like to provide a brief sketch. First, please note the caveat: elements in the Qt.labs module are not guaranteed to remain compatible in future versions. That said, let's take a look at what this element provides:

A GestureArea handles one or more gestures within an area of the screen, much as a MouseArea handles mouse events. Each gesture type is handled by a corresponding signal. To illustrate, a Qt::TapGesture can be accepted by implementing the onTap handler:

import Qt.labs.gestures 0.1

GestureArea {
    onTap: console.log("tap received")
}

Each signal has one or more properties describing the gesture. Returning to the tap, the gesture is described through a point called position, which holds the point where the tap was registered.
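As a quick sketch (assuming, as in the 0.1 documentation above, that the handler exposes the gesture object through a gesture property), the tap position could be logged like this:

```qml
import Qt.labs.gestures 0.1

GestureArea {
    anchors.fill: parent
    // position is the point where the tap was registered
    onTap: console.log("tap at " + gesture.position.x + ", " + gesture.position.y)
}
```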

Taking Another Step

Starting from the labs module described above, we've been experimenting with taking the GestureArea forward toward production quality. The name has been kept, but the rest of the element has seen significant changes. Being developers, we do a lot of our thinking in code (and on whiteboards, but code is compact), so here's something to start the explanation from:

import Qt.labs.gestures 2.0

GestureArea {
    Tap {
        when: gesture.hotspot.x > gesture.hotspot.y
        onStarted: console.log("tap in upper right started")
        onFinished: console.log("tap in upper right completed")
    }
}

The first thing is, yes, the version number has jumped. Moving along, the syntax for hooking a gesture has changed: rather than connecting to a signal, you declare the gesture as a sub-element. All the default gesture names are recognized, and custom gestures can be as well. To do so, the recognizer needs to be registered with Qt via qmlRegisterUncreatableType() and qmlRegisterType(); see the GestureArea plugin.cpp for details.

Within a gesture sub-element, there's an optional property called when. This property specifies a set of conditions that dictate when the gesture should be accepted. If an incoming gesture doesn't pass the test, it isn't accepted and future updates are ignored. You can access the properties of the gesture through the gesture property, as well as anything else that happens to be in scope.

If the when property evaluates to true, the appropriate gesture state signal (onStarted, onUpdated, onFinished, onCanceled) is invoked.
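To illustrate the two points above together, here's a small sketch. The when condition mixes a gesture property with an ordinary property from the surrounding scope (acceptTaps is a made-up property for illustration, not part of the module):

```qml
import Qt 4.7
import Qt.labs.gestures 2.0

Rectangle {
    width: 200
    height: 200
    property bool acceptTaps: true

    GestureArea {
        anchors.fill: parent

        Tap {
            // mixes a gesture property with anything else in scope
            when: acceptTaps && gesture.hotspot.x < width / 2
            onStarted: console.log("tap on left half started")
            onFinished: console.log("tap on left half finished")
        }
    }
}
```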

Examples

Our development was guided by a few example interfaces that we thought should be easy to piece together. Along the way we hit a few walls, wrote a few patches, and had a great time.

Demonstrating Pan & Pinch

import Qt 4.7
import Qt.labs.gestures 2.0

Rectangle {
    id: rootWindow
    width: 320
    height: 320
    color: "white"
    property int inGesture: 0

    signal reset
    onReset: { color = "#ffffff"; gestureText.text = "Gesture: none"; inGesture = 0 }

    Text {
        id: gestureText
        anchors.centerIn: parent
        text: "Gesture: none"
    }

    GestureArea {
        anchors.fill: parent

        Pan {
            when: inGesture != 2
            onStarted: { rootWindow.color = "#fffca4"; inGesture = 1 }
            onUpdated: gestureText.text = "Pan: X offset = " + gesture.offset.x.toFixed(3)
            onFinished: rootWindow.reset()
        }

        Pinch {
            when: inGesture != 1
            onStarted: { rootWindow.color = "#a3e2ff"; inGesture = 2 }
            onUpdated: gestureText.text = "Pinch: scale = " + gesture.scaleFactor.toFixed(3)
            onFinished: rootWindow.reset()
        }
    }
}

Photo manipulation


This is where we ask something of you, dear readers. Grab the GestureArea module and start creating. Have a look at the examples for inspiration. The module is targeted at Qt 4.7.1 [Edit: as in should build using, not will be shipping with], and there are some experiments being carried out in this research repository.

And then tell us what you think. This release bears the same warning as the first implementation. We want to stabilize this functionality and get it into the declarative core, but we need your feedback.


Comments


2beers:

Hi. I tried gestures 1.0 a week ago, but I found out that the QML simulator does not support gestures. Will you provide gesture support in the simulator?

Jeremy.Katz:

@2beers: I'm not sure what the simulator you refer to is. qmlviewer?

To use this gesture module, or the version documented in 4.7.0 (0.1), the module needs to be in a place it can be found by the declarative runtime. http://doc.qt.nokia.com/4.7... lists the options for doing so.

ddenis:

I guess it is about Qt Simulator - http://doc.qt.nokia.com/qts...
That is an excellent idea, we should definitely consider adding gesture support there.

2beers:

@Jeremy.Katz Yes, I was talking about qmlviewer. I tried this example: http://doc.qt.nokia.com/4.7... but nothing happened (I didn't receive any errors), so I assumed gestures are not implemented in qmlviewer. I've installed the Qt SDK (Qt 4.7 and Qt Creator 2.01). What I mean is: can I simulate gestures using the mouse? I don't have a touch screen on my desktop monitor.

@ddenis I wasn't talking about that Qt Simulator, but you're right, that would be a good idea. :)

Philippe:

Under Windows 7, the Gestures implementation causes all widgets to become native, reducing performance and causing flickering when resizing a large/complex UI. Is there a plan to fix this issue someday? I can't use Gestures for this reason.

Jeremy.Katz:

@2beers: The default gesture recognizers are not in or specific to the viewer application. They're in src/gui/kernel/qstandardgestures.cpp, which is linked into libQtGui. These recognizers only handle touch events, so they won't work with mouse events unless you have modified them or install custom recognizers.

The GestureArea module in the repository above provides its own recognizers which do handle mouse events. These are in qdeclarativegesturerecognizers.cpp.

@Philippe: I'm not intimately familiar with the situation, but my understanding is that this is done to make it possible to receive native Windows gestures. I can't comment on whether the situation will change in the future, but filing a bug report is your best bet for increasing the visibility of any issue.

2beers:

@Jeremy.Katz Thanks for your reply. I will try to see what I can change, but I still think it would be a good idea to implement some sort of mouse-based gesture simulation from the start. It would help a lot of people.

xiamiaoren:

I hope this version has fewer bugs and lets us live more happily.
