
QtMultimedia on iOS and tvOS

The Qt Multimedia module is supported on iOS, but there are some limitations in what is supported, and some additional steps are needed to make use of it. One of the unusual things about the iOS port is that everything must be compiled statically, because Apple's policy disallows shared object files from being bundled with applications. It is not a problem to statically compile the Qt modules (including Multimedia) and link them into your application, but most of QtMultimedia's functionality is provided by plugins. These plugins also need to be statically linked into your application, and to do that you must add a line to your qmake project file.


== Low Latency Audio and Low Level Audio Support ==


To access the low level audio APIs on iOS you need to statically link in the CoreAudio plugin:<br />''QTPLUGIN += qtaudio_coreaudio''<br />This gives you the ability to use the C++ low level audio APIs (e.g. QAudioDeviceInfo, QAudioOutput, QAudioInput).<br />You are also able to use QSoundEffect and the QML component SoundEffect, which allow you to play back uncompressed WAV files using the low latency audio system. This provides a convenient way to play short sounds very quickly (like a click sound when you press a button in your UI).
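For example, a minimal C++ sketch of playing a short click sound with QSoundEffect might look like the following. The resource path is just a placeholder, and the .pro file is assumed to contain ''QT += multimedia'' in addition to the QTPLUGIN line above (on Qt 5.2; from 5.3 the plugin is linked automatically).

<pre>
#include <QGuiApplication>
#include <QSoundEffect>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Play a short, uncompressed WAV file through the low latency audio path.
    // "qrc:/sounds/click.wav" is a placeholder; point it at your own resource.
    QSoundEffect click;
    click.setSource(QUrl(QStringLiteral("qrc:/sounds/click.wav")));
    click.setVolume(0.75);
    click.play();

    return app.exec();
}
</pre>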


== Audio Engine Support (3D Audio) ==


iOS should also support the QML-only QtAudioEngine API provided by QtMultimedia. This gives you the ability to simulate 3D environmental sound effects (using OpenAL) from within QML. If you are using this API in your iOS application you need to include the audioengine backend by adding the following line to your qmake project file:<br />''QTPLUGIN += qtmedia_audioengine''<br />From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
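As a rough sketch of what this looks like on the QML side (using the QtAudioEngine 1.0 types; the WAV file name is a placeholder for a sample shipped with your application):

<pre>
import QtQuick 2.0
import QtAudioEngine 1.0

Rectangle {
    width: 320; height: 240

    AudioEngine {
        id: audioengine

        // A sample loaded from a placeholder WAV file.
        AudioSample {
            name: "explosion"
            source: "explosion.wav"
        }

        // A Sound groups one or more play variations of a sample.
        Sound {
            name: "explosion"
            PlayVariation {
                sample: "explosion"
            }
        }
    }

    MouseArea {
        anchors.fill: parent
        onPressed: audioengine.sounds["explosion"].play()
    }
}
</pre>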


== Camera Capture ==


iOS has basic support for capturing images and videos via the built-in camera devices. To use these APIs you need to statically link in the AVFoundation camera backend, and you do this by adding the following line to your qmake project file:<br />''QTPLUGIN += qavfcamera''<br />From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
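As an illustration, a minimal C++ sketch for capturing a still image with QCamera and QCameraImageCapture could look roughly like this (error handling is omitted, the default camera device is used, and the image is saved to the platform's default location). The .pro file is assumed to contain ''QT += multimedia'' in addition to the QTPLUGIN line above on Qt 5.2.

<pre>
#include <QGuiApplication>
#include <QCamera>
#include <QCameraImageCapture>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QCamera camera;                        // default camera device
    QCameraImageCapture imageCapture(&camera);

    camera.setCaptureMode(QCamera::CaptureStillImage);

    // Take a picture once the camera has actually started.
    QObject::connect(&camera, &QCamera::statusChanged,
                     [&](QCamera::Status status) {
        if (status == QCamera::ActiveStatus)
            imageCapture.capture();        // saves to the default location
    });

    camera.start();
    return app.exec();
}
</pre>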
 
== Multimedia Playback (Videos, Audio) ==
 
iOS also supports the playback of both audio and video files or streams. This works by creating a QMediaPlayer (MediaPlayer in QML) object and passing it a media source to play. Right now video playback only works in C++ via QWidget, but audio playback works from both QML and C++. The limitation of not being able to play back video in QML is due to us not being able to render video into an OpenGL texture for QtQuick, and is something that will have to be resolved in the future (not in 5.2). To enable multimedia playback functionality you need to include the AVFoundation mediaplayer backend, and you do this by including the following line in your qmake project file:<br />''QTPLUGIN += qavfmediaplayer''<br />From Qt 5.3 onwards, however, you don't need to add the QTPLUGIN line at all, as this is done automatically.
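For example, a minimal C++ sketch of widget-based video playback with QMediaPlayer and QVideoWidget might look like this (the URL is a placeholder for any file or stream AVFoundation can decode). The .pro file is assumed to contain ''QT += widgets multimedia multimediawidgets'' in addition to the QTPLUGIN line above on Qt 5.2.

<pre>
#include <QApplication>
#include <QMediaPlayer>
#include <QVideoWidget>
#include <QUrl>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMediaPlayer player;
    QVideoWidget videoWidget;

    // Render the decoded video frames into a widget-based surface.
    player.setVideoOutput(&videoWidget);

    // Placeholder URL; local files and network streams both work.
    player.setMedia(QUrl(QStringLiteral("http://example.com/movie.mp4")));

    videoWidget.show();
    player.play();

    return app.exec();
}
</pre>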
 
It may seem cumbersome to have to manually link these backends into each of your Qt applications, but the alternative would be to always include all four backends, which would unnecessarily increase the size and memory footprint of your iOS applications. The same issue exists in other modules that derive their functionality from plugins (e.g. QtSensors), so keep this in mind when building applications for iOS (or other platforms where you are statically linking your application).
