[[Category:LanguageBindings::PySide]]
[[Category:snippets]]
[[Category:Developing_with_Qt::Qt Quick]]
[[Category:Developing_with_Qt::Qt Quick::QML]]
[[Category:Developing_with_Qt::Qt Quick::Tutorial]]
[[Category:Developing with Qt::QtMobility]]


= Using QtMobility sensors and QML from PySide =


This [[PySide]] tutorial shows how to use the QtMobility APIs to read the accelerometer from Python, scale and smooth the resulting data and expose it to a QML application in order to keep an image always upright. In the future, Qt Mobility 1.2 (still not released as of December 2010) will have [http://doc.qt.nokia.com/qt-mobility-snapshot/qml-plugins.html QML Plugins], but right now we have to write some glue code in Python (and one might want to do more with the accelerometer data than just using it in the UI layer, so this will still be relevant when Qt Mobility 1.2 is out).


== UnderMeSensi.py ==

=== Importing the required modules ===

This is basically the same as in the previous tutorials (PySide modules needed, the QtOpenGL module is optional) with the new addition of the QtMobility Sensors API. On your '''N900''', you have to install the '''python-qtmobility''' metapackage in order to get the right modules.


<code>
import sys

from PySide import QtCore, QtGui, QtDeclarative, QtOpenGL
from QtMobility import Sensors
</code>
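
Since QtOpenGL is only used for the (optional) OpenGL viewport, here is a minimal sketch of how one might degrade gracefully when it is not available (the try/except and the '''HAVE_OPENGL''' flag are an illustration, not part of the original tutorial):

<code>
# Sketch: treat QtOpenGL as optional. If PySide was built without it,
# we can fall back to the default raster viewport later on.
try:
    from PySide import QtOpenGL
    HAVE_OPENGL = True
except ImportError:
    HAVE_OPENGL = False
</code>

With this in place, the '''view.setViewport(...)''' call further below would only be made when '''HAVE_OPENGL''' is true.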


=== The listener / controller ===

Next, we need to define a QObject subclass that takes care of receiving events from the accelerometer, scaling and smoothing the value and finally exposing the calculated rotation value as a property so that we can access it from within our QML UI:


<code>
class Listener(QtCore.QObject):
    def __init__(self):
        QtCore.QObject.__init__(self)
        self._initial = True
        self._rotation = 0.

    def get_rotation(self):
        return self._rotation

    def set_rotation(self, rotation):
        if self._initial:
            self._rotation = rotation
            self._initial = False
        else:
            # Smooth the accelerometer input changes
            self._rotation *= .8
            self._rotation += .2*rotation

        self.on_rotation.emit()

    on_rotation = QtCore.Signal()
    rotation = QtCore.Property(float, get_rotation, set_rotation, notify=on_rotation)

    @QtCore.Slot()
    def on_reading_changed(self):
        accel = self.sender()
        # Scale the x axis reading to keep the image roughly steady
        self.rotation = accel.reading().x()*7
</code>
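
The smoothing in '''set_rotation''' is a simple exponential moving average: each new reading only contributes 20% to the stored value, which damps out sensor jitter. A quick sketch with made-up readings (purely illustrative, not part of the app) shows the effect:

<code>
# Feed hypothetical readings through the 80/20 smoothing above
listener = Listener()
listener.rotation = 10.0  # first reading is taken as-is  -> 10.0
listener.rotation = 10.0  # .8 * 10.0 + .2 * 10.0         -> 10.0
listener.rotation = 0.0   # .8 * 10.0 + .2 * 0.0          -> 8.0 (the jump is damped)
print(listener.rotation)  # prints 8.0
</code>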
 
=== Putting it all together ===
 
We create a new QAccelerometer from the Sensors module, which abstracts away the underlying system accelerometer and sends us easy-to-use events. We then create an instance of our listener class and connect the accelerometer's '''readingChanged''' signal (emitted every time the reading changes) to the listener's '''on_reading_changed''' slot. We also have to tell the accelerometer to start reading the sensor and sending out events.
 
We then only need to set up our QDeclarativeView as usual, and expose our listener object to the QML context, so that we can access it from the UI:
 
<code>
app = QtGui.QApplication(sys.argv)

# The accelerometer sensor and the listener that smooths its readings
accel = Sensors.QAccelerometer()
listener = Listener()
accel.readingChanged.connect(listener.on_reading_changed)
accel.start()

# Standard QML view setup: OpenGL viewport, root object scaled to the view,
# and our listener exposed to QML as the context property 'listener'
view = QtDeclarative.QDeclarativeView()
glw = QtOpenGL.QGLWidget()
view.setViewport(glw)
view.setResizeMode(QtDeclarative.QDeclarativeView.SizeRootObjectToView)
view.rootContext().setContextProperty('listener', listener)
view.setSource(__file__.replace('.py', '.qml'))
view.showFullScreen()

app.exec_()
</code>
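
Note that '''view.setSource(__file__.replace('.py', '.qml'))''' simply loads the QML file that sits next to the script and shares its base name, which is why the Python and QML files must keep matching names ('''UnderMeSensi.py''' and '''UnderMeSensi.qml''').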


== UnderMeSensi.qml ==

This one is really trivial: have an enclosing rectangle (which fills the whole screen) and then an image centered in it that shows the PySide logo, gets scaled a bit (so that it fits the screen nicely) and finally has its '''rotation''' property set to the rotation property of '''listener''' (this is the key part here - it will update the image's rotation every time the listener's rotation property changes).


<code>
import Qt 4.7

Rectangle {
    width: 800
    height: 480

    Image {
        source: "images/pysidelogo.png"
        fillMode: Image.PreserveAspectFit
        width: parent.width/2
        height: parent.height/2
        anchors.centerIn: parent
        rotation: listener.rotation
    }
}
</code>


== How the example app looks ==


Copy the files '''UnderMeSensi.py''' and '''UnderMeSensi.qml''' to your N900 and download [http://www.pyside.org/wp-content/themes/openbossa/images/logo.png logo.png] as '''images/pysidelogo.png''' (or use a custom image of your choosing and set the '''source:''' path/URL in the QML file accordingly). On an N900, it looks like this:


[[Image:http://farm6.static.flickr.com/5090/5231046791_e21f00d175.jpg|Example app running on an N900]]


The example app in action: [http://youtu.be/DpVpSZSOcGM Video of the example on YouTube]
