On Symbian, video content can currently only be rendered to a rectangular region of the screen, with no transformations other than scaling and rotation in 90-degree steps. This means that the following use cases cannot be supported:
- 2.5D effects such as perspective shear
- Embedding video in 3D scenes such as carousel video previews
- Application of transition effects or animations, such as page-turn and free rotation
All of the above are offered by both HTML5 and QML; as such, Symbian currently offers only limited support for embedding video within UIs developed with those technologies.
The public C++ APIs used for video rendering in QtMultimediaKit (part of QtMobility) are:
- QMediaPlayer: allows the client to set the media source and to play, pause, stop, etc.
The QMediaPlayer client can choose to route video output to either of the following:
- QVideoWidget: video output which does not support any transforms other than scaling
- QGraphicsVideoItem: video output which can be transformed arbitrarily
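The routing choice above can be sketched in plain C++. This is an illustrative model, not Qt code: `VideoSink`, `WidgetSink`, `GraphicsItemSink` and `Player` are hypothetical stand-ins for the roles played by QVideoWidget, QGraphicsVideoItem and QMediaPlayer, which share the idea of a player pushing decoded frames to whichever output the client attaches.

```cpp
#include <string>
#include <vector>

// Hypothetical stand-ins for QVideoWidget / QGraphicsVideoItem: both
// accept decoded frames through a common sink interface, mirroring how
// QMediaPlayer routes output to either class.
struct VideoSink {
    virtual ~VideoSink() = default;
    virtual void present(const std::string& frame) = 0;
};

struct WidgetSink : VideoSink {          // supports scaling only
    std::vector<std::string> shown;
    void present(const std::string& frame) override { shown.push_back(frame); }
};

struct GraphicsItemSink : VideoSink {    // supports arbitrary transforms
    std::vector<std::string> shown;
    void present(const std::string& frame) override { shown.push_back(frame); }
};

class Player {
    VideoSink* output_ = nullptr;
public:
    // cf. QMediaPlayer::setVideoOutput: the client picks the sink
    void setVideoOutput(VideoSink* sink) { output_ = sink; }
    // the backend pushes each decoded frame to the attached sink
    void decodeFrame(const std::string& frame) {
        if (output_) output_->present(frame);
    }
};
```

The point of the interface is that the playback backend is indifferent to which output is attached; only the sink knows how the frame will ultimately be transformed and painted.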
The QMediaPlayer backend for Symbian is built on top of the MMF client API (CVideoPlayerUtility2). The only output target supported by this API is a native window.
QWidget has a 1-1 relationship with a native window handle; the implementation of QVideoWidget is therefore simple.
QGraphicsItems exist as part of a QGraphicsScene, which represents a 2D canvas. Items in this scene are rendered via a QGraphicsView, which is itself a QWidget. Whereas, within a widget-based UI, individual QWidgets can have a 1-1 mapping onto native windows, a Graphics View-based UI has only one native window, owned by the QGraphicsView. Individual QGraphicsItems must paint their content into this window.
Transformation matrices can be applied both to individual items, and to the view as a whole. Each time the view is updated, each item within the view receives an instruction to repaint; contained within this instruction is the overall transformation matrix which must be applied to the item’s content.
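The composition of item and view transforms can be illustrated with a minimal 2D affine type. This is a plain-C++ sketch using the same row-vector convention as Qt's QTransform (x' = m11·x + m21·y + dx); the `Affine` type and function names are hypothetical, not Qt API.

```cpp
// Minimal 2D affine transform, an illustrative analog of QTransform.
struct Affine {
    double m11 = 1, m12 = 0, m21 = 0, m22 = 1, dx = 0, dy = 0;
};

// Compose two transforms: the result applies `a` first, then `b`
// (matching QTransform's a * b ordering). This is how an item's own
// transform combines with the view transform into the overall matrix
// delivered with each repaint instruction.
Affine compose(const Affine& a, const Affine& b) {
    return {
        a.m11 * b.m11 + a.m12 * b.m21,
        a.m11 * b.m12 + a.m12 * b.m22,
        a.m21 * b.m11 + a.m22 * b.m21,
        a.m21 * b.m12 + a.m22 * b.m22,
        a.dx * b.m11 + a.dy * b.m21 + b.dx,
        a.dx * b.m12 + a.dy * b.m22 + b.dy,
    };
}

// Map a point through a transform (row-vector convention, y-down).
void map(const Affine& t, double x, double y, double& ox, double& oy) {
    ox = t.m11 * x + t.m21 * y + t.dx;
    oy = t.m12 * x + t.m22 * y + t.dy;
}
```

For example, composing an item rotated by 90 degrees (m11 = 0, m12 = 1, m21 = -1, m22 = 0) with a view scaled by 2 maps the point (1, 0) to (0, 2): the rotation is applied in item coordinates, then the scale in view coordinates.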
In order for this painting to take place, the QtMultimediaKit backend must expose a representation of the decoded video frames. This is done by the QVideoFrame class, whose API provides:
- Accessors for the frame size and pixel format.
- Functions for determining whether the frame can be mapped into system memory. If it is mappable, the API allows raw pixel buffer access.
- An accessor which provides a handle to the frame. The backend reports the type(s) of frame handle it can provide; types currently supported by the API are:
- GL texture ID
- XVideo shared memory handle
- CoreImage handle
Painting can only be done if one of the available handle types is compatible with the Qt graphics system currently being used. For the raster (software) graphics system, the video frame needs to be mappable into system memory; the OpenGL(ES) graphics system requires a GL texture ID, etc.
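This compatibility rule can be sketched in plain C++. The names below (`Frame`, `HandleType`, `canPaint`) are hypothetical analogs of QVideoFrame and its handle types, not the Qt API; the sketch only models the decision that a frame is usable for painting if one of its representations matches the active graphics system.

```cpp
#include <cstdint>
#include <vector>

// Illustrative analog of QVideoFrame: a frame either exposes raw pixels
// (mappable into system memory) or only an opaque handle such as a GL
// texture ID.
enum class HandleType { None, GLTextureId };
enum class GraphicsSystem { Raster, OpenGL };

struct Frame {
    std::vector<uint8_t> pixels;              // non-empty => mappable
    HandleType handleType = HandleType::None;
    uint32_t glTexture = 0;

    bool isMappable() const { return !pixels.empty(); }
};

// A painter can consume the frame only if one of its representations is
// compatible with the graphics system currently in use.
bool canPaint(const Frame& f, GraphicsSystem gs) {
    switch (gs) {
    case GraphicsSystem::Raster:
        return f.isMappable();                       // raster needs raw pixels
    case GraphicsSystem::OpenGL:
        return f.handleType == HandleType::GLTextureId; // GL needs a texture ID
    }
    return false;
}
```

A frame carrying only a GL texture handle is unusable under the raster graphics system, and a purely system-memory frame would need an upload step before the GL path could paint it, which is exactly why the backend must advertise which handle types it can provide.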
Therefore, in order to implement QGraphicsVideoItem on Symbian, we need to be able to provide video frame handles compatible with each of the graphics systems we want to support. Today on the N8, the default graphics system is OpenVG, with raster used as a fallback in low-memory situations. The medium-term plan is to move to an OpenGL ES backend. In the longer term, for rendering QML-based UIs, Graphics View will eventually be replaced by a Scene Graph; whereas Qt today has multiple graphics systems, the Scene Graph will be backed only by OpenGL ES.