GStreamer Plugin Writer's Guide (0.10.32.3) by Richard John Boulton, Erik Walthinsen, Steve Baker, Leif Johnson, Ronald S. Bultje, Stefan Kost, and Tim-Philipp Müller
This material may be distributed only subject to the terms and conditions set forth in the Open Publication License, v1.0 or later (the latest version is presently available at http://www.opencontent.org/openpub/).
Table of Contents
I. Introduction
  1. Preface
    1.1. What is GStreamer?
    1.2. Who Should Read This Guide?
    1.3. Preliminary Reading
    1.4. Structure of This Guide
  2. Foundations
    2.1. Elements and Plugins
    2.2. Pads
    2.3. Data, Buffers and Events
    2.4. Mimetypes and Properties
II. Building a Plugin
  3. Constructing the Boilerplate
    3.1. Getting the GStreamer Plugin Templates
    3.2. Using the Project Stamp
    3.3. Examining the Basic Code
    3.4. GstElementDetails
    3.5. GstStaticPadTemplate
    3.6. Constructor Functions
    3.7. The plugin_init function
  4. Specifying the pads
    4.1. The setcaps-function
  5. The chain function
  6. What are states?
    6.1. Managing filter state
  7. Adding Arguments
  8. Signals
  9. Building a Test Application
III. Advanced Filter Concepts
  10. Caps negotiation
    10.1. Caps negotiation use cases
    10.2. Fixed caps
    10.3. Downstream caps negotiation
    10.4. Upstream caps (re)negotiation
    10.5. Implementing a getcaps function
  11. Different scheduling modes
    11.1. The pad activation stage
    11.2. Pads driving the pipeline
    11.3. Providing random access
  12. Types and Properties
    12.1. Building a Simple Format for Testing
    12.2. Typefind Functions and Autoplugging
    12.3. List of Defined Types
  13. Request and Sometimes pads
    13.1. Sometimes pads
    13.2. Request pads
  14. Clocking
    14.1. Types of time
    14.2. Clocks
    14.3. Flow of data between elements and time
    14.4. Obligations of each element
  15. Supporting Dynamic Parameters
    15.1. Getting Started
    15.2. The Data Processing Loop
  16. Interfaces
    16.1. How to Implement Interfaces
    16.2. URI interface
    16.3. Mixer Interface
    16.4. Tuner Interface
    16.5. Color Balance Interface
    16.6. Property Probe Interface
    16.7. X Overlay Interface
    16.8. Navigation Interface
  17. Tagging (Metadata and Streaminfo)
    17.1. Overview
    17.2. Reading Tags from Streams
    17.3. Writing Tags to Streams
  18. Events: Seeking, Navigation and More
    18.1. Downstream events
    18.2. Upstream events
    18.3. All Events Together
IV. Creating special element types
  19. Pre-made base classes
    19.1. Writing a sink
    19.2. Writing a source
    19.3. Writing a transformation element
  20. Writing a Demuxer or Parser
  21. Writing a N-to-1 Element or Muxer
  22. Writing a Manager
V. Appendices
  23. Things to check when writing an element
    23.1. About states
    23.2. Debugging
    23.3. Querying, events and the like
    23.4. Testing your element
  24. Porting 0.8 plug-ins to 0.10
    24.1. List of changes
  25. GStreamer licensing
    25.1. How to license the code you write for GStreamer
List of Tables
2-1. Table of Example Types
12-1. Table of Audio Types
12-2. Table of Video Types
12-3. Table of Container Types
12-4. Table of Subtitle Types
12-5. Table of Other Types
I. Introduction
GStreamer is an extremely powerful and versatile framework for creating streaming media applications. Many of the virtues of the GStreamer framework come from its modularity: GStreamer can seamlessly incorporate new plugin modules. But because modularity and power often come at a cost of greater complexity (consider, for example, CORBA (http://www.omg.org/)), writing new plugins is not always easy. This guide is intended to help you understand the GStreamer framework (version 0.10.32.3) so you can develop new plugins to extend the existing functionality. The guide addresses most issues by following the development of an example plugin - an audio filter plugin - written in C. However, the later parts of the guide also present some issues involved in writing other types of plugins, and the end of the guide describes some of the Python bindings for GStreamer.
Chapter 1. Preface
1.1. What is GStreamer?
GStreamer is a framework for creating streaming media applications. The fundamental design comes from the video pipeline at Oregon Graduate Institute, as well as some ideas from DirectShow. GStreamer's development framework makes it possible to write any type of streaming multimedia application. The GStreamer framework is designed to make it easy to write applications that handle audio or video or both. It isn't restricted to audio and video, and can process any kind of data flow. The pipeline design is made to have little overhead above what the applied filters induce. This makes GStreamer a good framework for designing even high-end audio applications which put high demands on latency. One of the most obvious uses of GStreamer is using it to build a media player. GStreamer already includes components for building a media player that can support a very wide variety of formats, including MP3, Ogg/Vorbis, MPEG-1/2, AVI, Quicktime, mod, and more. GStreamer, however, is much more than just another media player. Its main advantages are that the pluggable components can be mixed and matched into arbitrary pipelines, so that it's possible to write a full-fledged video or audio editing application. The framework is based on plugins that provide the various codecs and other functionality. The plugins can be linked and arranged in a pipeline. This pipeline defines the flow of the data. Pipelines can also be edited with a GUI editor and saved as XML so that pipeline libraries can be made with a minimum of effort. The GStreamer core function is to provide a framework for plugins, data flow and media type handling/negotiation. It also provides an API to write applications using the various plugins.
Anyone who wants to add support for new ways of processing data in GStreamer. For example, a person in this group might want to create a new data format converter, a new visualization tool, or a new decoder or encoder. Anyone who wants to add support for new input and output devices. For example, people in this group might want to add the ability to write to a new video output system or read data from a digital camera or special microphone.
Anyone who wants to extend GStreamer in any way. You need to have an understanding of how the plugin system works before you can understand the constraints that the plugin system places on the rest of the code. Also, you might be surprised after reading this at how much can be done with plugins.
This guide is not relevant to you if you only want to use the existing functionality of GStreamer, or if you just want to use an application that uses GStreamer. If you are only interested in using existing plugins to write a new application - and there are quite a lot of plugins already - you might want to check the GStreamer Application Development Manual. If you are just trying to get help with a GStreamer application, then you should check with the user manual for that particular application.
Building a Plugin - Introduction to the structure of a plugin, using an example audio filter for illustration. This part covers all the basic steps you generally need to perform to build a plugin, such as registering the element with GStreamer and setting up the basics so it can receive data from and send data to neighbour elements. The discussion begins by giving examples of generating the basic structures and registering an element in Constructing the Boilerplate. Then, you will learn how to write the code to get a basic filter plugin working in Chapter 4, Chapter 5 and Chapter 6. After that, we will show some of the GObject concepts on how to make an element configurable for applications and how to do application-element interaction in Adding Arguments and Chapter 8. Next, you will learn to build a quick test application to test all that you've just learned in Chapter 9. We will just touch upon basics here. For full-blown application development, you should look at the GStreamer Application Development Manual.
Advanced Filter Concepts - Information on advanced features of GStreamer plugin development. After learning about the basic steps, you should be able to create a functional audio or video filter plugin with some nice features. However, GStreamer offers more for plugin writers. This part of the guide includes chapters on more advanced topics, such as scheduling, media type definitions in GStreamer, clocks, interfaces and tagging. Since these features are purpose-specific, you can read them in any order; most of them don't require knowledge from other sections. The first chapter, named Different scheduling modes, will explain some of the basics of element scheduling. It is not very in-depth, but is mostly a sort of introduction on why other things work as they do. Read this chapter if you're interested in GStreamer internals. Next, we will apply this knowledge and discuss another type of data transmission than what you learned in Chapter 5: Different scheduling modes. Loop-based elements will give you more control over input rate. This is useful when writing, for example, muxers or demuxers. Next, we will discuss media identification in GStreamer in Chapter 12. You will learn how to define new media types and get to know a list of standard media types defined in GStreamer. In the next chapter, you will learn the concept of request- and sometimes-pads, which are pads that are created dynamically, either because the application asked for it (request) or because the media stream requires it (sometimes). This will be in Chapter 13. The next chapter, Chapter 14, will explain the concept of clocks in GStreamer. You need this information when you want to know how elements should achieve audio/video synchronization. The next few chapters will discuss advanced ways of doing application-element interaction. Previously, we learned about the GObject ways of doing this in Adding Arguments and Chapter 8.
We will discuss dynamic parameters, which are a way of defining element behaviour over time in advance, in Chapter 15. Next, you will learn about interfaces in Chapter 16. Interfaces are very target-specific ways of application-element interaction, based on GObject's GInterface. Lastly, you will learn about how metadata is handled in GStreamer in Chapter 17. The last chapter, Chapter 18, will discuss the concept of events in GStreamer. Events are, on the one hand, another way of doing application-element interaction. It takes care of seeking, for example. On the other hand, it is also a way in which elements interact with each other, such as letting each other know about media stream discontinuities, forwarding tags inside a pipeline and so on.
Because the first two parts of the guide use an audio filter as an example, the concepts introduced apply to filter plugins. But many of the concepts apply equally to other plugin types, including sources, sinks, and autopluggers. This part of the guide presents the issues that arise when working on these more specialized plugin types. The chapter starts with a special focus on elements that can be written using a base class (Pre-made base classes), and later also goes into writing special types of elements in Writing a Demuxer or Parser, Writing a N-to-1 Element or Muxer and Writing a Manager.
Appendices - Further information for plugin developers. The appendices contain some information that stubbornly refuses to fit cleanly in other sections of the guide. Most of this section is not yet finished.
The remainder of this introductory part of the guide presents a short overview of the basic concepts involved in GStreamer plugin development. Topics covered include Elements and Plugins, Pads, Data, Buffers and Events and Types and Properties. If you are already familiar with this information, you can use this short overview to refresh your memory, or you can skip to Building a Plugin. As you can see, there is a lot to learn, so let's get started!
Creating compound and complex elements by extending from a GstBin. This will allow you to create plugins that have other plugins embedded in them. Adding new mime-types to the registry along with typedetect functions. This will allow your plugin to operate on a completely new media type.
Chapter 2. Foundations
This chapter of the guide introduces the basic concepts of GStreamer. Understanding these concepts will help you grok the issues involved in extending GStreamer. Many of these concepts are explained in greater detail in the GStreamer Application Development Manual; the basic concepts presented here serve mainly to refresh your memory.
2.2. Pads
Pads are used to negotiate links and data flow between elements in GStreamer. A pad can be viewed as a place or port on an element where links may be made with other elements, and through which data can flow to or from those elements. Pads have specific data handling capabilities: a pad can restrict the type of data that flows through it. Links are only allowed between two pads when the allowed data types of the two pads are compatible. An analogy may be helpful here. A pad is similar to a plug or jack on a physical device. Consider, for example, a home theater system consisting of an amplifier, a DVD player, and a (silent) video projector. Linking the DVD player to the amplifier is allowed because both devices have audio jacks, and linking the projector to the DVD player is allowed because both devices have compatible video jacks. Links between the projector and the amplifier may not be made because the projector and amplifier have different types of jacks. Pads in GStreamer serve the same purpose as the jacks in the home theater system. For the most part, all data in GStreamer flows one way through a link between elements. Data flows out of one element through one or more source pads, and elements accept incoming data through one or more sink pads. Source and sink elements have only source and sink pads, respectively. See the GStreamer Library Reference for the current implementation details of a GstPad (../../gstreamer/html/GstPad.html).
- An exact type indicating what type of data (control, content, ...) this Data is.
- A reference count indicating the number of elements currently holding a reference to the buffer. When the buffer reference count falls to zero, the buffer will be unlinked, and its memory will be freed in some sense (see below for more details).
There are two types of data defined: events (control) and buffers (content). Buffers may contain any sort of data that the two linked pads know how to handle. Normally, a buffer contains a chunk of some sort of audio or video data that flows from one element to another. Buffers also contain metadata describing the buffer's contents. Some of the important types of metadata are:
- An integer indicating the size of the buffer's data.
- A timestamp indicating the preferred display timestamp of the content in the buffer.
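The reference-counting behaviour described above can be sketched in plain C. The MyBuffer type and its functions below are invented for illustration only (they are not part of the GStreamer API), but they mirror the fields listed above: content data, size, a timestamp and a reference count that controls when the memory is released.

```c
#include <stdlib.h>

/* A miniature, hypothetical stand-in for a refcounted buffer. */
typedef struct {
  unsigned char *data;      /* the content */
  size_t         size;      /* size of the content, in bytes */
  long long      timestamp; /* preferred display timestamp */
  int            refcount;  /* number of holders of this buffer */
} MyBuffer;

static MyBuffer *
my_buffer_new (size_t size)
{
  MyBuffer *buf = malloc (sizeof (MyBuffer));
  buf->data = malloc (size);
  buf->size = size;
  buf->timestamp = -1;
  buf->refcount = 1;   /* the creator holds the first reference */
  return buf;
}

static MyBuffer *
my_buffer_ref (MyBuffer *buf)
{
  buf->refcount++;
  return buf;
}

/* When the last reference is dropped, the memory is freed. */
static void
my_buffer_unref (MyBuffer *buf)
{
  if (--buf->refcount == 0) {
    free (buf->data);
    free (buf);
  }
}
```

A real GstBuffer adds much more (caps, flags, sub-buffering), but the lifetime rule is the same: whoever holds a reference keeps the buffer alive, and the last unref frees it.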
Events contain information on the state of the stream flowing between the two linked pads. Events will only be sent if the element explicitly supports them, else the core will (try to) handle the events automatically. Events are used to indicate, for example, a clock discontinuity, the end of a media stream or that the cache should be flushed. Events may contain several of the following items:
- A subtype indicating the type of the contained event.
- The other contents of the event, which depend on the specific event type.
Events will be discussed extensively in Chapter 18. Until then, the only event that will be used is the EOS event, which is used to indicate the end-of-stream (usually end-of-file). See the GStreamer Library Reference for the current implementation details of a GstMiniObject (../../gstreamer/html/gstreamer-GstMiniObject.html), GstBuffer (../../gstreamer/html/gstreamer-GstBuffer.html) and GstEvent (../../gstreamer/html/gstreamer-GstEvent.html).
Many sink elements have accelerated methods for copying data to hardware, or have direct access to hardware. It is common for these elements to be able to create downstream-allocated buffers for their upstream peers. One such example is ximagesink. It creates buffers that contain XImages. Thus, when an upstream peer copies data into the buffer, it is copying directly into the XImage, enabling ximagesink to draw the image directly to the screen instead of having to copy data into an XImage first. Filter elements often have the opportunity to either work on a buffer in-place, or work while copying from a source buffer to a destination buffer. It is optimal to implement both algorithms, since the GStreamer framework can choose the fastest algorithm as appropriate. Naturally, this only makes sense for strict filters -- elements that have exactly the same format on source and sink pads.
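To illustrate the difference between the two approaches, here is a plain-C sketch of a "strict filter" operation (halving the volume of 16-bit samples) written both ways. The function names are hypothetical and no GStreamer API is involved; the point is that because input and output have exactly the same format, the same operation can run either in-place or from a source into a destination buffer.

```c
#include <stddef.h>
#include <stdint.h>

/* In-place variant: modifies the buffer it is given. */
static void
gain_half_ip (int16_t *samples, size_t n)
{
  size_t i;
  for (i = 0; i < n; i++)
    samples[i] /= 2;
}

/* Copying variant: reads src, writes dst; src is left untouched.
 * Useful when the input buffer is still referenced elsewhere and
 * must not be modified. */
static void
gain_half_copy (const int16_t *src, int16_t *dst, size_t n)
{
  size_t i;
  for (i = 0; i < n; i++)
    dst[i] = src[i] / 2;
}
```

A framework that knows both variants exist can pick the in-place one whenever it owns the only reference to the buffer, and fall back to the copying one otherwise.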
Table 2-1. Table of Example Types

Mime Type: audio/x-raw-int

Properties:

- endianness (integer, G_BIG_ENDIAN (4321) or G_LITTLE_ENDIAN (1234)): The order of bytes in a sample. The value G_LITTLE_ENDIAN (1234) means little-endian (byte-order is least significant byte first). The value G_BIG_ENDIAN (4321) means big-endian (byte order is most significant byte first).
- signed (boolean, TRUE or FALSE): Whether the values of the integer samples are signed or not. Signed samples use one bit to indicate the sign (negative or positive) of the value. Unsigned samples are always positive.
- width (integer, greater than 0): Number of bits allocated per sample.
- depth (integer, greater than 0): The number of bits used per sample. This must be less than or equal to the width: if the depth is less than the width, the low bits are assumed to be the ones used. For example, a width of 32 and a depth of 24 means that each sample is stored in a 32 bit word, but only the low 24 bits are actually used.
- channels (integer, greater than 0): The number of channels of audio data.

Mime Type: audio/mpeg

Description: Audio data compressed using the MPEG audio encoding scheme.

Properties:

- mpegversion (integer, 1, 2 or 4): The MPEG version used for encoding the data. The value 1 refers to MPEG-1, -2 and -2.5 layer 1, 2 or 3. The values 2 and 4 refer to the MPEG-AAC audio encoding schemes.
- framed (boolean, 0 or 1): A true value indicates that each buffer contains exactly one frame. A false value indicates that frames and buffers do not necessarily match up.
- layer (integer, 1, 2 or 3): The compression scheme layer used to compress the data (only if mpegversion=1).
- bitrate (integer, greater than 0): The bitrate, in bits per second. For VBR (variable bitrate) MPEG data, this is the average bitrate.

Mime Type: audio/x-vorbis

Properties: There are currently no specific properties defined for this type.
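As an illustration of how entries from this table appear in practice, a capability description for the audio/x-raw-int type might look like the following C string (the concrete values chosen here are only an example, not requirements):

```c
/* Illustrative caps string for audio/x-raw-int: 16-bit, signed,
 * little-endian (1234) samples with one or two channels.
 * Adjacent string literals concatenate into one caps description. */
"audio/x-raw-int, "
"endianness = (int) 1234, "
"signed = (boolean) true, "
"width = (int) 16, "
"depth = (int) 16, "
"channels = (int) [ 1, 2 ]"
```

Such a string can be handed to GST_STATIC_CAPS in a pad template, with each property constraining what data the pad accepts.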
Initialized empty Git repository in /some/path/gst-template/.git/ remote: Counting objects: 373, done. remote: Compressing objects: 100% (114/114), done. remote: Total 373 (delta 240), reused 373 (delta 240) Receiving objects: 100% (373/373), 75.16 KiB | 78 KiB/s, done. Resolving deltas: 100% (240/240), done.
This command will check out a series of files and directories into gst-template. The template you will be using is in the gst-template/gst-plugin/ directory. You should look over the files in that directory to get a general idea of the structure of a source tree for a plugin. If for some reason you can't access the git repository, you can also download a snapshot of the latest revision (http://cgit.freedesktop.org/gstreamer/gst-template/commit/) via the cgit web interface.
Chapter 3. Constructing the Boilerplate

The standard way of defining the boilerplate is simply to write some code, and fill in some structures. As mentioned in the previous section, the easiest way to do this is to copy a template and add functionality according to your needs. To help you do so, there is a tool in the ./gst-plugins/tools/ directory. This tool, make_element, is a command line utility that creates the boilerplate code for you. To use make_element, first open up a terminal window. Change to the gst-template/gst-plugin/src directory, and then run the make_element command. The arguments to make_element are: 1. the name of the plugin, and 2. the source file that the tool will use. By default, gstplugin is used. For example, the following commands create the MyFilter plugin based on the plugin template and put the output files in the gst-template/gst-plugin/src directory:
$ cd gst-template/gst-plugin/src
$ ../tools/make_element MyFilter
Note: Capitalization is important for the name of the plugin. Keep in mind that under some operating systems, capitalization is also important when specifying directory and file names in general.
Now one needs to adjust the Makefile.am to use the new filenames and run autogen.sh from the parent directory to bootstrap the build environment. After that, the project can be built and installed using the well-known make && sudo make install commands.
Note: Be aware that by default autogen.sh and configure would choose /usr/local as a default location. One would need to add /usr/local/lib/gstreamer-0.10 to GST_PLUGIN_PATH in order to make the new plugin show up in GStreamer.
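For example, assuming the default /usr/local prefix and that the new element is named myfilter (adjust both to your installation), the environment can be set up and the element checked like this:

```shell
# Make plugins installed under /usr/local visible to the 0.10 tools.
export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-0.10

# Verify that GStreamer can find and load the new element.
gst-inspect-0.10 myfilter
```

If gst-inspect-0.10 prints the element's pads and properties, the plugin was found; an "No such element or plugin" message usually means the path or element name is wrong.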
not crucial.) The code here can be found in examples/pwg/examplefilter/boiler/gstexamplefilter.h.
} GstMyFilter;

/* Standard definition defining a class for this element. */
typedef struct _GstMyFilterClass {
  GstElementClass parent_class;
} GstMyFilterClass;

/* Standard macros for defining types for this element. */
#define GST_TYPE_MY_FILTER (gst_my_filter_get_type())
#define GST_MY_FILTER(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_MY_FILTER,GstMyFilter))
#define GST_MY_FILTER_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_MY_FILTER,GstMyFilterClass))
#define GST_IS_MY_FILTER(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_MY_FILTER))
#define GST_IS_MY_FILTER_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MY_FILTER))

/* Standard function returning type information. */
GType gst_my_filter_get_type (void);
Using this header file, you can use the following macro to set up the GObject basics in your source file so that all functions will be called appropriately:
#include "filter.h" GST_BOILERPLATE (GstMyFilter, gst_my_filter, GstElement, GST_TYPE_ELEMENT);
3.4. GstElementDetails
The GstElementDetails structure provides a hierarchical type for element information. The entries are:
- A long, English name for the element.
- The type of the element; see the docs/design/draft-klass.txt document in the GStreamer core source tree for details and examples.
- A brief description of the purpose of the element.
- The name of the author of the element, optionally followed by a contact email address in angle brackets.
For example:
static const GstElementDetails my_filter_details = {
  "An example plugin",
  "Example/FirstExample",
  "Shows the basic structure of a plugin",
  "your name <[email protected]>"
};
The element details are registered with the plugin during the _base_init () function, which is part of the GObject system. The _base_init () function should be set for this GObject in the function where you register the type with GLib.
static void
gst_my_filter_base_init (gpointer klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
  static const GstElementDetails my_filter_details = {
[..]
  };

[..]
  gst_element_class_set_details (element_class, &my_filter_details);
}
3.5. GstStaticPadTemplate
A GstStaticPadTemplate is a description of a pad that the element will (or might) create and use. It contains:
- A short name for the pad.
- Pad direction.
- Existence property. This indicates whether the pad exists always (an "always" pad), only in some cases (a "sometimes" pad) or only if the application requested such a pad (a "request" pad).
- Supported types by this element (capabilities).
For example:
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
  "sink",
  GST_PAD_SINK,
  GST_PAD_ALWAYS,
  GST_STATIC_CAPS ("ANY")
);
Those pad templates are registered during the _base_init () function. Pads are created from these templates in the element's _init () function using gst_pad_new_from_template (). The template can be retrieved from the element class using gst_element_class_get_pad_template (). See below for more details on this. In order to create a new pad from this template using gst_pad_new_from_template (), you will need to declare the pad template as a global variable. More on this subject in Chapter 4.
static GstStaticPadTemplate sink_factory = [..],
                            src_factory = [..];

static void
gst_my_filter_base_init (gpointer klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
[..]
  gst_element_class_add_pad_template (element_class,
      gst_static_pad_template_get (&src_factory));
  gst_element_class_add_pad_template (element_class,
      gst_static_pad_template_get (&sink_factory));
}
The last argument in a template is its type or list of supported types. In this example, we use ANY, which means that this element will accept all input. In real-life situations, you would set a mimetype and optionally a set of properties to make sure that only supported input will come in. This representation should be a string that starts with a mimetype, then a set of comma-separated properties with their supported values. In case of an audio filter that supports raw integer 16-bit audio, mono or stereo at any samplerate, the correct template would look like this:
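A sketch of such a template follows; the property names match the GStreamer 0.10 conventions for the audio/x-raw-int type, and the rate range [ 1, MAX ] is one way to express "any samplerate":

```c
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
  "sink",
  GST_PAD_SINK,
  GST_PAD_ALWAYS,
  GST_STATIC_CAPS (
    "audio/x-raw-int, "
      "width = (int) 16, "
      "depth = (int) 16, "
      "endianness = (int) BYTE_ORDER, "
      "channels = (int) { 1, 2 }, "    /* mono or stereo: a list */
      "rate = (int) [ 1, MAX ]"        /* any samplerate: a range */
  )
);
```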
Values surrounded by curly brackets ({ and }) are lists, values surrounded by square brackets ([ and ]) are ranges. Multiple sets of types are supported too, and should be separated by a semicolon (;). Later, in the chapter on pads, we will see how to use types to know the exact format of a stream: Chapter 4.
Note that the information returned by the plugin_init() function will be cached in a central registry. For this reason, it is important that the same information is always returned by the function: for example, it must not make element factories available based on runtime conditions. If an element can only work in certain conditions (for example, if the soundcard is not being used by some other process) this must be reflected by the element being unable to enter the READY state if unavailable, rather than the plugin attempting to deny its own existence.
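As a minimal sketch, a plugin_init() for GStreamer 0.10 simply registers the element factory unconditionally; the element name "myfilter", rank, and plugin metadata below are illustrative placeholders:

```c
static gboolean
plugin_init (GstPlugin * plugin)
{
  /* Always register the same factory, regardless of runtime conditions;
   * availability checks belong in the element's state change handling. */
  return gst_element_register (plugin, "myfilter",
      GST_RANK_NONE, GST_TYPE_MY_FILTER);
}

GST_PLUGIN_DEFINE (
  GST_VERSION_MAJOR,
  GST_VERSION_MINOR,
  "myfilter",
  "An example filter plugin",
  plugin_init,
  VERSION,
  "LGPL",
  "GStreamer",
  "http://gstreamer.net/"
)
```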
static gboolean gst_my_filter_setcaps (GstPad *pad,
                                       GstCaps *caps);
static GstFlowReturn gst_my_filter_chain (GstPad *pad,
                                          GstBuffer *buf);
static void
gst_my_filter_init (GstMyFilter *filter, GstMyFilterClass *filter_klass)
{
  GstElementClass *klass = GST_ELEMENT_CLASS (filter_klass);

  /* pad through which data comes in to the element */
  filter->sinkpad = gst_pad_new_from_template (
      gst_element_class_get_pad_template (klass, "sink"), "sink");
  gst_pad_set_setcaps_function (filter->sinkpad, gst_my_filter_setcaps);
  gst_pad_set_chain_function (filter->sinkpad, gst_my_filter_chain);
  gst_element_add_pad (GST_ELEMENT (filter), filter->sinkpad);

  /* pad through which data goes out of the element */
  filter->srcpad = gst_pad_new_from_template (
      gst_element_class_get_pad_template (klass, "src"), "src");
  /* Capsnego succeeded, get the stream properties for internal
   * usage and return success. */
  gst_structure_get_int (structure, "rate", &filter->samplerate);
  gst_structure_get_int (structure, "channels", &filter->channels);

  g_print ("Caps negotiation succeeded with %d Hz @ %d channels\n",
      filter->samplerate, filter->channels);

  return TRUE;
}
Chapter 4. Specifying the pads

In here, we check the mimetype of the provided caps. Normally, you don't need to do that in your own plugin/element, because the core does that for you. We simply use it to show how to retrieve the mimetype from a provided set of caps. Types are stored in GstStructure (../../gstreamer/html/gstreamer-GstStructure.html) internally. A GstCaps (../../gstreamer/html/gstreamer-GstCaps.html) is nothing more than a small wrapper for 0 or more structures/types. From the structure, you can also retrieve properties, as is shown above with the function gst_structure_get_int (). If your _link () function does not need to perform any specific operation (i.e. it will only forward caps), you can set it to gst_pad_proxy_link (). This is a link forwarding function implementation provided by the core. It is useful for elements such as identity.
static GstFlowReturn
gst_my_filter_chain (GstPad *pad, GstBuffer *buf)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  if (!filter->silent)
    g_print ("Have data of size %u bytes!\n", GST_BUFFER_SIZE (buf));

  return gst_pad_push (filter->srcpad, buf);
}
Obviously, the above doesn't do much useful. Instead of printing that the data is in, you would normally process the data there. Remember, however, that buffers are not always writable. In more advanced elements (the ones that do event processing), you may want to additionally specify an event handling function, which will be called when stream-events are sent (such as end-of-stream, discontinuities, tags, etc.).
static void
gst_my_filter_init (GstMyFilter * filter)
{
[..]
  gst_pad_set_event_function (filter->sinkpad, gst_my_filter_event);
[..]
}
static gboolean
gst_my_filter_event (GstPad *pad, GstEvent *event)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_EOS:
      /* end-of-stream, we should close down all stream leftovers here */
      gst_my_filter_stop_processing (filter);
      break;
    default:
      break;
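As noted earlier, buffers are not always writable. A minimal sketch of a chain function that modifies data in place might therefore first make the buffer writable; gst_buffer_make_writable () in GStreamer 0.10 only copies the buffer if it is shared:

```c
static GstFlowReturn
gst_my_filter_chain (GstPad *pad, GstBuffer *buf)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  /* Get a buffer that is safe to modify; this copies only if the
   * incoming buffer is also referenced elsewhere. */
  buf = gst_buffer_make_writable (buf);

  /* ... modify GST_BUFFER_DATA (buf) in place here ... */

  return gst_pad_push (filter->srcpad, buf);
}
```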
In some cases, it might be useful for an element to have control over the input data rate, too. In that case, you probably want to write a so-called loop-based element. Source elements (with only source pads) can also be get-based elements. These concepts will be explained in the advanced section of this guide, and in the section that specifically discusses source pads.
which will from now on be referred to simply as NULL, READY, PAUSED and PLAYING.

GST_STATE_NULL is the default state of an element. In this state, it has not allocated any runtime resources, it has not loaded any runtime libraries and it can obviously not handle data.

GST_STATE_READY is the next state that an element can be in. In the READY state, an element has all default resources (runtime-libraries, runtime-memory) allocated. However, it has not yet allocated or defined anything that is stream-specific. When going from NULL to READY state (GST_STATE_CHANGE_NULL_TO_READY), an element should allocate any non-stream-specific resources and should load runtime-loadable libraries (if any). When going the other way around (from READY to NULL, GST_STATE_CHANGE_READY_TO_NULL), an element should unload these libraries and free all allocated resources. Examples of such resources are hardware devices. Note that files are generally streams, and these should thus be considered as stream-specific resources; therefore, they should not be allocated in this state.

GST_STATE_PAUSED is the state in which an element is ready to accept and handle data. For most elements this state is the same as PLAYING. The only exception to this rule are sink elements. Sink elements only accept one single buffer of data and then block. At this point the pipeline is prerolled and ready to render data immediately.

GST_STATE_PLAYING is the highest state that an element can be in. For most elements this state is exactly the same as PAUSED, they accept and process events and buffers with data. Only sink elements need to differentiate between PAUSED and PLAYING state. In PLAYING state, sink elements actually render incoming data, e.g. output audio to a sound card or render video pictures to an image sink.
Chapter 6. What are states?

If you use a base class, you will rarely have to handle state changes yourself. All you have to do is override the base class's start() and stop() virtual functions (might be called differently depending on the base class) and the base class will take care of everything for you. If, however, you do not derive from a ready-made base class, but from GstElement or some other class not built on top of a base class, you will most likely have to implement your own state change function to be notified of state changes. This is definitely necessary if your plugin is a decoder or an encoder, as there are no base classes for decoders or encoders yet. An element can be notified of state changes through a virtual function pointer. Inside this function, the element can initialize any sort of specific data needed by the element, and it can optionally fail to go from one state to another. Do not g_assert for unhandled state changes; this is taken care of by the GstElement base class.
static GstStateChangeReturn
gst_my_filter_change_state (GstElement *element, GstStateChange transition);

static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);

  element_class->change_state = gst_my_filter_change_state;
}
static GstStateChangeReturn
gst_my_filter_change_state (GstElement *element, GstStateChange transition)
{
  GstStateChangeReturn ret = GST_STATE_CHANGE_SUCCESS;
  GstMyFilter *filter = GST_MY_FILTER (element);

  switch (transition) {
    case GST_STATE_CHANGE_NULL_TO_READY:
      if (!gst_my_filter_allocate_memory (filter))
        return GST_STATE_CHANGE_FAILURE;
      break;
    default:
      break;
  }

  ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition);
  if (ret == GST_STATE_CHANGE_FAILURE)
    return ret;

  switch (transition) {
    case GST_STATE_CHANGE_READY_TO_NULL:
      gst_my_filter_free_memory (filter);
Note that upwards (NULL=>READY, READY=>PAUSED, PAUSED=>PLAYING) and downwards (PLAYING=>PAUSED, PAUSED=>READY, READY=>NULL) state changes are handled in two separate blocks, with the downwards state change handled only after we have chained up to the parent class's state change function. This is necessary in order to safely handle concurrent access by multiple threads. The reason for this is that in the case of downwards state changes you don't want to destroy allocated resources while your plugin's chain function (for example) is still accessing those resources in another thread. Whether your chain function might be running or not depends on the state of your plugin's pads, and the state of those pads is closely linked to the state of the element. Pad states are handled in the GstElement class's state change function, including proper locking; that's why it is essential to chain up before destroying allocated resources.
/* properties */
enum {
  ARG_0,
  ARG_SILENT
  /* FILL ME */
};

static void gst_my_filter_set_property (GObject *object,
    guint prop_id, const GValue *value, GParamSpec *pspec);
static void gst_my_filter_get_property (GObject *object,
    guint prop_id, GValue *value, GParamSpec *pspec);
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
  GObjectClass *object_class = G_OBJECT_CLASS (klass);

  /* define properties */
  g_object_class_install_property (object_class, ARG_SILENT,
      g_param_spec_boolean ("silent", "Silent",
          "Whether to be very verbose or not",
          FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));

  /* define virtual function pointers */
  object_class->set_property = gst_my_filter_set_property;
  object_class->get_property = gst_my_filter_get_property;
}

static void
gst_my_filter_set_property (GObject *object, guint prop_id,
    const GValue *value, GParamSpec *pspec)
{
  GstMyFilter *filter = GST_MY_FILTER (object);

  switch (prop_id) {
    case ARG_SILENT:
The above is a very simple example of how arguments are used. Graphical applications, for example GStreamer Editor, will use these properties and will display a user-controllable widget with which these properties can be changed. This means that, for the property to be as user-friendly as possible, you should be as exact as possible in the definition of the property. Not only in defining ranges in between which valid properties can be located (for integers, floats, etc.), but also in using very descriptive (better yet: internationalized) strings in the definition of the property, and if possible using enums and flags instead of integers. The GObject documentation describes these in a very complete way, but below, we'll give a short example of where this is useful. Note that using integers here would probably completely confuse the user, because they make no sense in this context. The example is stolen from videotestsrc.
typedef enum {
  GST_VIDEOTESTSRC_SMPTE,
  GST_VIDEOTESTSRC_SNOW,
  GST_VIDEOTESTSRC_BLACK
} GstVideotestsrcPattern;

[..]

#define GST_TYPE_VIDEOTESTSRC_PATTERN (gst_videotestsrc_pattern_get_type ())
static GType
gst_videotestsrc_pattern_get_type (void)
{
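Such a function typically registers the enum with GLib once, mapping each value to a description and a short nickname that applications can show to the user. A sketch (the exact description strings here are illustrative):

```c
static GType
gst_videotestsrc_pattern_get_type (void)
{
  static GType videotestsrc_pattern_type = 0;

  if (!videotestsrc_pattern_type) {
    /* one entry per enum value: value, description, nickname */
    static const GEnumValue pattern_types[] = {
      { GST_VIDEOTESTSRC_SMPTE, "SMPTE 100% color bars",    "smpte" },
      { GST_VIDEOTESTSRC_SNOW,  "Random (television snow)", "snow"  },
      { GST_VIDEOTESTSRC_BLACK, "100% Black",               "black" },
      { 0, NULL, NULL }
    };

    videotestsrc_pattern_type =
        g_enum_register_static ("GstVideotestsrcPattern", pattern_types);
  }

  return videotestsrc_pattern_type;
}
```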
Chapter 8. Signals
GObject signals can be used to notify applications of events specific to this object. Note, however, that the application needs to be aware of signals and their meaning, so if you're looking for a generic way for application-element interaction, signals are probably not what you're looking for. In many cases, however, signals can be very useful. See the GObject documentation (http://library.gnome.org/devel/gobject/stable/) for all internals about signals.
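As a sketch, a hypothetical element could define a signal in its class_init like this; the signal name "something-found" and its lack of arguments are made up purely for illustration:

```c
enum {
  SIGNAL_SOMETHING_FOUND,
  LAST_SIGNAL
};

static guint gst_my_filter_signals[LAST_SIGNAL] = { 0 };

static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
  /* hypothetical signal, emitted with no arguments and no return value */
  gst_my_filter_signals[SIGNAL_SOMETHING_FOUND] =
      g_signal_new ("something-found", G_TYPE_FROM_CLASS (klass),
          G_SIGNAL_RUN_LAST, 0, NULL, NULL,
          g_cclosure_marshal_VOID__VOID, G_TYPE_NONE, 0);
}

/* later, from within the element's processing code:
 *   g_signal_emit (filter,
 *       gst_my_filter_signals[SIGNAL_SOMETHING_FOUND], 0);
 */
```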
that GStreamer searches, then you will need to set the plugin path. Either set GST_PLUGIN_PATH to the directory containing your plugin, or use the command-line option --gst-plugin-path. If you based your plugin off of the gst-plugin template, then this will look something like gst-launch --gst-plugin-path=$HOME/gst-template/gst-plugin/src/.libs TESTPIPELINE

However, you will often need more testing features than gst-launch can provide, such as seeking, events, interactivity and more. Writing your own small testing program is the easiest way to accomplish this. This section explains, in a few words, how to do that. For a complete application development guide, see the Application Development Manual (../../manual/html/index.html).

At the start, you need to initialize the GStreamer core library by calling gst_init (). You can alternatively call gst_init_with_popt_tables (), which will return a pointer to popt tables. You can then use libpopt to handle the given argument table, and this will finish the GStreamer initialization.

You can create elements using gst_element_factory_make (), where the first argument is the element type that you want to create, and the second argument is a free-form name. The example at the end uses a simple filesource - decoder - soundcard output pipeline, but you can use specific debugging elements if that's necessary. For example, an identity element can be used in the middle of the pipeline to act as a data-to-application transmitter. This can be used to check the data for misbehaviours or correctness in your test application. Also, you can use a fakesink element at the end of the pipeline to dump your data to the stdout (in order to do this, set the dump property to TRUE). Lastly, you can use the efence element (indeed, an electric fence memory debugger wrapper element) to check for memory errors.

During linking, your test application can use fixation or filtered caps as a way to drive a specific type of data to or from your element.
This is a very simple and effective way of checking multiple types of input and output in your element. Running the pipeline happens through the gst_bin_iterate () function. Note that while running, you should connect to at least the error and eos signals on the pipeline and/or your plugin/element to check for correct handling of this. Also, you should add events into the pipeline and make sure your plugin handles these correctly (with respect to clocking, internal caching, etc.).

Never forget to clean up memory in your plugin or your test application. When going to the NULL state, your element should clean up allocated memory and caches. Also, it should close down any references held to possible support libraries. Your application should unref () the pipeline and make sure it doesn't crash.
#include <gst/gst.h>

static gboolean
  /* putting an audioconvert element here to convert the output of the
   * decoder into a format that my_filter can handle (we are assuming it
   * will handle any sample rate here though) */
  convert1 = gst_element_factory_make ("audioconvert", "audioconvert1");

  /* use "identity" here for a filter that does nothing */
  filter = gst_element_factory_make ("my_filter", "my_filter");

  /* there should always be audioconvert and audioresample elements before
   * the audio sink, since the capabilities of the audio sink usually vary
   * depending on the environment (output used, sound card, driver etc.) */
  convert2 = gst_element_factory_make ("audioconvert", "audioconvert2");
  resample = gst_element_factory_make ("audioresample", "audioresample");
  sink = gst_element_factory_make ("osssink", "audiosink");

  if (!sink || !decoder) {
    g_print ("Decoder or output could not be found - check your install\n");
    return -1;
  } else if (!convert1 || !convert2 || !resample) {
    g_print ("Could not create audioconvert or audioresample element, "
             "check your installation\n");
    return -1;
  } else if (!filter) {
    g_print ("Your self-written filter could not be found. Make sure it "
             "is installed correctly in $(libdir)/gstreamer-0.10/ or "
             "~/.gstreamer-0.10/plugins/ and that gst-inspect-0.10 lists it. "
             "If it doesn't, check with GST_DEBUG=*:2 gst-inspect-0.10 for "
             "the reason why it is not being loaded.");
    return -1;
  }

  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);

  gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, convert1, filter,
      convert2, resample, sink, NULL);

  /* link everything together */
  if (!gst_element_link_many (filesrc, decoder, convert1, filter, convert2,
      resample, sink, NULL)) {
    g_print ("Failed to link one or more elements!\n");
    return -1;
Chapter 10. Caps negotiation

Downstream elements are notified of a newly set caps only when data is actually passing their pad. This is because caps is attached to buffers during data flow. So when the vorbis decoder sets a caps on its source pad (to configure the output format), the converter will not yet be notified. Instead, the converter will only be notified when the decoder pushes a buffer over its source pad to the converter. Right before calling the chain-function in the converter, GStreamer will check whether the format that was previously negotiated still applies to this buffer. If not, it first calls the setcaps-function of the converter to configure it for the new format. Only after that will it call the chain function of the converter.
The fixed caps can then be set on the pad by calling gst_pad_set_caps ().
[..]
  caps = gst_caps_new_simple ("audio/x-raw-float",
      "width", G_TYPE_INT, 32,
      "endianness", G_TYPE_INT, G_BYTE_ORDER,
      "buffer-frames", G_TYPE_INT, <bytes-per-frame>,
      "rate", G_TYPE_INT, <samplerate>,
      "channels", G_TYPE_INT, <num-channels>,
      NULL);
  if (!gst_pad_set_caps (pad, caps)) {
    GST_ELEMENT_ERROR (element, CORE, NEGOTIATION, (NULL),
        ("Some debug information here"));
    return GST_FLOW_ERROR;
  }
[..]
Elements that could implement fixed caps (on their source pads) are, in general, all elements that are not renegotiable. Examples include:
- A typefinder, since the type found is part of the actual data stream and can thus not be re-negotiated.
- Pretty much all demuxers, since the contained elementary data streams are defined in the file headers, and thus not renegotiable.
- Some decoders, where the format is embedded in the data stream and not part of the peercaps, and where the decoder itself is not reconfigurable, too.
All other elements that need to be configured for the format should implement full caps negotiation, which will be explained in the next few sections.
static gboolean
gst_my_filter_setcaps (GstPad *pad, GstCaps *caps)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));
  GstStructure *s;

  /* forward-negotiate */
  if (!gst_pad_set_caps (filter->srcpad, caps))
    return FALSE;

  /* negotiation succeeded, so now configure ourselves */
  s = gst_caps_get_structure (caps, 0);
  gst_structure_get_int (s, "rate", &filter->samplerate);
  gst_structure_get_int (s, "channels", &filter->channels);

  return TRUE;
}
There may also be cases where the filter actually is able to change the format of the stream. In those cases, it will negotiate a new format. Obviously, the element should first attempt to configure pass-through, which means that it does not change the stream's format. However, if that fails, then it should call gst_pad_get_allowed_caps () on its sourcepad to get a list of supported formats on the outputs, and pick the first. The return value of that function is guaranteed to be a subset of the template caps. Let's look at the example of an element that can convert between samplerates, so where input and output samplerate don't have to be the same:
static gboolean
gst_my_filter_setcaps (GstPad *pad, GstCaps *caps)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  if (gst_pad_set_caps (filter->srcpad, caps)) {
    filter->passthrough = TRUE;
  } else {
    GstCaps *othercaps, *newcaps;
    GstStructure *s = gst_caps_get_structure (caps, 0), *others;

    /* no passthrough, setup internal conversion */
    gst_structure_get_int (s, "channels", &filter->channels);
    othercaps = gst_pad_get_allowed_caps (filter->srcpad);
    others = gst_caps_get_structure (othercaps, 0);
    gst_structure_set (others,
        "channels", G_TYPE_INT, filter->channels, NULL);

    /* now, the samplerate value can optionally have multiple values, so
     * we "fixate" it, which means that one fixed value is chosen */
    newcaps = gst_caps_copy_nth (othercaps, 0);
    gst_caps_unref (othercaps);
    gst_pad_fixate_caps (filter->srcpad, newcaps);
    if (!gst_pad_set_caps (filter->srcpad, newcaps))
      return FALSE;

    /* we are now set up, configure internally */
    filter->passthrough = FALSE;
    gst_structure_get_int (s, "rate", &filter->from_samplerate);
    others = gst_caps_get_structure (newcaps, 0);
    gst_structure_get_int (others, "rate", &filter->to_samplerate);
  }

  return TRUE;
}

static GstFlowReturn
gst_my_filter_chain (GstPad *pad,
                     GstBuffer *buf)
{
It is important to note that different elements have different responsibilities here:
- Elements should implement a padalloc-function in order to be able to change format on renegotiation. This is also true for filters and converters.
- Elements should allocate new buffers using gst_pad_alloc_buffer ().
- Elements that are renegotiable should implement a setcaps-function on their sourcepad as well.
Unfortunately, not all details here have been worked out yet, so this documentation is incomplete. FIXME.
static GstCaps *
gst_my_filter_getcaps (GstPad *pad)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));
  GstPad *otherpad = (pad == filter->srcpad) ? filter->sinkpad :
                                               filter->srcpad;
  GstCaps *othercaps = gst_pad_get_allowed_caps (otherpad), *caps;
  gint i;

  /* We support *any* samplerate, indifferent from the samplerate
   * supported by the linked elements on both sides. */
  for (i = 0; i < gst_caps_get_size (othercaps); i++) {
    GstStructure *structure = gst_caps_get_structure (othercaps, i);

    gst_structure_remove_field (structure, "rate");
  }
  caps = gst_caps_intersect (othercaps,
      gst_pad_get_pad_template_caps (pad));
  gst_caps_unref (othercaps);

  return caps;
}
Using all the knowledge you've acquired by reading this chapter, you should be able to write an element that does correct caps negotiation. If in doubt, look at other elements of the same type in our git repository to get an idea of how they do what you want to do.
Chapter 11. Different scheduling modes

If all pads of an element are assigned to do push-based scheduling, then this means that data will be pushed by upstream elements to this element using the sinkpad's _chain ()-function. Prerequisites for this scheduling mode are that a chain-function was set for each sinkpad using gst_pad_set_chain_function () and that all downstream elements operate in the same mode. Pads are assigned to do push-based scheduling in sink-to-source element order, and within an element first sourcepads and then sinkpads. Sink elements can operate in this mode if their sinkpad is activated for push-based scheduling. Source elements cannot be chain-based.

Alternatively, sinkpads can be the driving force behind a pipeline by operating in pull-based mode, while the sourcepads of the element still operate in push-based mode. In order to be the driving force, those pads start a GstTask when they are activated. This task is a thread, which will call a function specified by the element. When called, this function will have random data access (through gst_pad_get_range ()) over all sinkpads, and can push data over the sourcepads, which effectively means that this element controls data flow in the pipeline. Prerequisites for this mode are that all downstream elements can act in chain-based mode, and that all upstream elements allow random access (see below). Source elements can be told to act in this mode if their sourcepads are activated in push-based fashion. Sink elements can be told to act in this mode when their sinkpads are activated in pull-mode.
Lastly, all pads in an element can be assigned to act in pull-mode, too. However, contrary to the above, this does not mean that they start a task on their own. Rather, it means that they are pull slaves for the downstream element, and have to provide random data access to it from their _get_range ()-function. Requirements are that a _get_range ()-function was set on this pad using the function gst_pad_set_getrange_function (). Also, if the element has any sinkpads, all those pads (and thereby their peers) need to operate in random access mode, too. Note that the element is supposed to activate those pads itself! GStreamer will not do that for you.
In the next two sections, we will look more closely at pull-based scheduling (elements/pads driving the pipeline, and elements/pads providing random access), and some specific use cases will be given.
- Demuxers, parsers and certain kinds of decoders where data comes in unparsed (such as MPEG-audio or video streams), since those will prefer byte-exact (random) access from their input. If possible, however, such elements should be prepared to operate in chain-based mode, too.
- Certain kinds of audio outputs, which require control over their input data flow, such as the Jack sound server.
In order to start this task, you will need to create it in the activation function.
#include "filter.h"
#include <string.h>

static gboolean gst_my_filter_activate (GstPad * pad);
static gboolean gst_my_filter_activate_pull (GstPad * pad,
    gboolean active);
static void gst_my_filter_loop (GstMyFilter * filter);

GST_BOILERPLATE (GstMyFilter, gst_my_filter, GstElement, GST_TYPE_ELEMENT);
[..]
}
[..]

static gboolean
gst_my_filter_activate (GstPad * pad)
{
  if (gst_pad_check_pull_range (pad)) {
    return gst_pad_activate_pull (pad, TRUE);
  } else {
    return FALSE;
  }
}

static gboolean
gst_my_filter_activate_pull (GstPad *pad, gboolean active)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  if (active) {
    filter->offset = 0;
    return gst_pad_start_task (pad,
        (GstTaskFunction) gst_my_filter_loop, filter);
  } else {
    return gst_pad_stop_task (pad);
  }
}
Once started, your task has full control over input and output. The simplest case of a task function is one that reads input and pushes that over its source pad. It's not all that useful, but provides some more flexibility than the old chain-based case that we've been looking at so far.
#define BLOCKSIZE 2048

static void
gst_my_filter_loop (GstMyFilter * filter)
{
  GstFlowReturn ret;
  guint64 len;
  GstFormat fmt = GST_FORMAT_BYTES;
  GstBuffer *buf = NULL;

  if (!gst_pad_query_duration (filter->sinkpad, &fmt, &len)) {
    GST_DEBUG_OBJECT (filter, "failed to query duration, pausing");
- Data sources, such as a file source, that can provide data from any offset with reasonably low latency.
- Filters that would like to provide a pull-based-like scheduling mode over the whole pipeline. Note that elements assigned to do random access-based scheduling are themselves responsible for assigning this scheduling mode to their upstream peers! GStreamer will not do that for you.
- Parsers who can easily provide this by skipping a small part of their input and are thus essentially "forwarding" random access requests literally without any own processing involved. Examples include tag readers (e.g. ID3) or single output parsers, such as a WAVE parser.
The following example will show how a _get_range ()-function can be implemented in a source element:
#include "filter.h"

static GstFlowReturn gst_my_filter_get_range (GstPad * pad,
    guint64 offset, guint length, GstBuffer ** buf);
static void
gst_my_filter_init (GstMyFilter * filter)
{
  GstElementClass *klass = GST_ELEMENT_GET_CLASS (filter);

  filter->srcpad = gst_pad_new_from_template (
      gst_element_class_get_pad_template (klass, "src"), "src");
  gst_pad_set_getrange_function (filter->srcpad,
      gst_my_filter_get_range);
  gst_element_add_pad (GST_ELEMENT (filter), filter->srcpad);

[..]
}

static GstFlowReturn
gst_my_filter_get_range (GstPad * pad,
    guint64 offset, guint length, GstBuffer ** buf)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_OBJECT_PARENT (pad));

  [.. here, you would fill *buf ..]

  return GST_FLOW_OK;
}
In practice, many elements that could theoretically do random access may often be assigned to do push-based scheduling anyway, since there is no downstream element able to start its own task. Therefore, in practice, those elements should implement both a _get_range ()-function and a _chain ()-function (for filters and parsers), or a _get_range ()-function and be prepared to start their own task by providing _activate_* ()-functions (for source elements), so that GStreamer can decide on the optimal scheduling mode and have it just work fine in practice.
- Do not create a new type if you could use one which already exists.
- If creating a new type, discuss it first with the other GStreamer developers, on at least one of: IRC, mailing lists.
- Try to ensure that the name for a new format is as unlikely to conflict with anything else created already, and is not a more generalised name than it should be. For example: "audio/compressed" would be too generalised a name to represent audio data compressed with an mp3 codec. Instead "audio/mp3" might be an appropriate name, or "audio/compressed" could exist and have a property indicating the type of compression used.
- Ensure that, when you do create a new type, you specify it clearly, and get it added to the list of known types so that other developers can use the type correctly when writing their elements.
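To illustrate the last alternative, the generic "audio/compressed" type mentioned above could be qualified by a property. This is a sketch only; both the mimetype and the property name exist purely as the hypothetical example from the text:

```c
/* hypothetical: a generic compressed-audio type qualified by a
 * property that names the actual compression scheme */
GstCaps *caps = gst_caps_new_simple ("audio/compressed",
    "compression-type", G_TYPE_STRING, "mp3",
    NULL);
```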
library dependencies) to put it elsewhere. The reason for this centralization is to reduce the number of plugins that need to be loaded in order to detect a stream's type. Below is an example that will recognize AVI files, which start with a RIFF tag, then the size of the file and then an AVI tag:
static void
gst_my_typefind_function (GstTypeFind *tf,
                          gpointer     unused)
{
  guint8 *data = gst_type_find_peek (tf, 0, 12);

  if (data &&
      GST_READ_UINT32_LE (data) == GST_MAKE_FOURCC ('R','I','F','F') &&
      GST_READ_UINT32_LE (data + 8) == GST_MAKE_FOURCC ('A','V','I',' ')) {
    gst_type_find_suggest (tf, GST_TYPE_FIND_MAXIMUM,
                           gst_caps_new_simple ("video/x-msvideo", NULL));
  }
}

static gboolean
plugin_init (GstPlugin *plugin)
{
  static gchar *exts[] = { "avi", NULL };

  if (!gst_type_find_register (plugin, "", GST_RANK_PRIMARY,
                               gst_my_typefind_function, exts,
                               gst_caps_new_simple ("video/x-msvideo", NULL),
                               NULL, NULL))
    return FALSE;

  return TRUE;
}
Note that gst-plugins/gst/typefind/gsttypefindfunctions.c has some simplification macros to decrease the amount of code. Make good use of those if you want to submit typefinding patches with new typefind functions.
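The core of the detection above is plain byte comparison: GST_MAKE_FOURCC packs four characters into a 32-bit integer with the first character in the lowest byte, and the values read from the file are interpreted little-endian. The following standalone sketch (helper names invented for illustration; no GStreamer required) shows the same logic:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* same packing as GStreamer's GST_MAKE_FOURCC:
 * first character ends up in the lowest byte */
#define MAKE_FOURCC(a,b,c,d) \
  ((uint32_t)(a) | ((uint32_t)(b) << 8) | \
   ((uint32_t)(c) << 16) | ((uint32_t)(d) << 24))

/* read a 32-bit little-endian value from a byte buffer */
static uint32_t
read_u32_le (const uint8_t *p)
{
  return (uint32_t)p[0] | ((uint32_t)p[1] << 8) |
         ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24);
}

/* returns 1 if the first 12 bytes look like an AVI file:
 * "RIFF" <file size> "AVI " */
static int
looks_like_avi (const uint8_t *data, size_t len)
{
  if (len < 12)
    return 0;
  return read_u32_le (data) == MAKE_FOURCC ('R','I','F','F') &&
         read_u32_le (data + 8) == MAKE_FOURCC ('A','V','I',' ');
}
```

Compiled on its own, looks_like_avi () accepts exactly the buffers for which the typefind function above would suggest video/x-msvideo.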
Chapter 12. Types and Properties

Autoplugging has been discussed in great detail in the Application Development Manual.
Table of Audio Types
Table of Video Types
Table of Container Types
Table of Subtitle Types
Table of Other Types
Note that many of the properties are not required, but rather optional properties. This means that most of these properties can be extracted from the container header, but that - in case the container header does not provide these - they can also be extracted by parsing the stream header or the stream content. The policy is that your element should provide the data that it knows about by only parsing its own content, not another element's content. Example: the AVI header provides the samplerate of the contained audio stream in the header. MPEG system streams don't. This means that an AVI stream demuxer would provide samplerate as a property for MPEG audio streams, whereas an MPEG demuxer would not. A decoder needing this data would require a stream parser in between to extract this from the header or calculate it from the stream.

Table 12-1. Table of Audio Types

audio/* (all audio types)
  rate (integer, greater than 0): The sample rate of the data, in samples (per channel) per second.
  channels (integer, greater than 0): The number of channels of audio data.
audio/x-raw-int (unstructured and uncompressed raw fixed-point audio data)
  endianness (integer, G_BIG_ENDIAN (4321) or G_LITTLE_ENDIAN (1234)): The order of bytes in a sample. The value G_LITTLE_ENDIAN (1234) means little-endian (byte-order is least significant byte first). The value G_BIG_ENDIAN (4321) means big-endian (byte order is most significant byte first).
  signed (boolean, TRUE or FALSE): Whether the values of the integer samples are signed or not. Signed samples use one bit to indicate the sign (negative or positive) of the value. Unsigned samples are always positive.
  width (integer, greater than 0): Number of bits allocated per sample.
  depth (integer, greater than 0): The number of bits used per sample. This must be less than or equal to the width: if the depth is less than the width, the low bits are assumed to be the ones used. For example, a width of 32 and a depth of 24 means that each sample is stored in a 32 bit word, but only the low 24 bits are actually used.

audio/x-raw-float (unstructured and uncompressed raw floating-point audio data)
  endianness (integer, G_BIG_ENDIAN (4321) or G_LITTLE_ENDIAN (1234)): The order of bytes in a sample, as for audio/x-raw-int above.
  width (integer, greater than 0): The amount of bits used and allocated per sample.
All encoded audio types:

audio/x-ac3 (AC-3 or A52 audio streams)
  There are currently no specific properties defined or needed for this type.

audio/x-adpcm (ADPCM audio streams)
  layout (string, "quicktime", "dvi", "microsoft" or "4xm"): The layout defines the packing of the samples in the stream. In ADPCM, most formats store multiple samples per channel together. This number of samples differs per format, hence the different layouts. In the long term, we probably want this property to die and use something more descriptive, but this will do for now.
  block_align (integer, any): Chunk buffer size.

audio/x-cinepak (audio as provided in a Cinepak (Quicktime) stream)
  There are currently no specific properties defined or needed for this type.

audio/x-dv (audio as provided in a Digital Video stream)
  There are currently no specific properties defined or needed for this type.
audio/x-flac (Free Lossless Audio Codec (FLAC))
  There are currently no specific properties defined or needed for this type.

audio/x-gsm (data encoded by the GSM codec)
  There are currently no specific properties defined or needed for this type.

audio/x-alaw (A-Law audio)
  There are currently no specific properties defined or needed for this type.

audio/x-mulaw (Mu-Law audio)
  There are currently no specific properties defined or needed for this type.

audio/x-mace (MACE audio, used in Quicktime)
  maceversion (integer, 3 or 6): The version of the MACE audio codec used to encode the stream.

audio/mpeg (audio data compressed using the MPEG audio encoding scheme)
  mpegversion (integer, 1, 2 or 4): The MPEG version used for encoding the data. The value 1 refers to MPEG-1, -2 and -2.5 layer 1, 2 or 3. The values 2 and 4 refer to the MPEG-AAC audio encoding schemes.
  framed (boolean, 0 or 1): A true value indicates that each buffer contains exactly one frame. A false value indicates that frames and buffers do not necessarily match up.
  layer (integer, 1, 2 or 3): The compression scheme layer used to compress the data (only if mpegversion=1).
  bitrate (integer, greater than 0): The bitrate, in bits per second. For VBR (variable bitrate) MPEG data, this is the average bitrate.
audio/x-qdm2 (data encoded by the QDM version 2 codec)
  There are currently no specific properties defined or needed for this type.

audio/x-pn-realaudio (Realmedia audio data)
  raversion (integer, 1 or 2): The version of the Real Audio codec used to encode the stream. 1 stands for a 14k4 stream, 2 stands for a 28k8 stream.

audio/x-speex (data encoded by the Speex audio codec)
  There are currently no specific properties defined or needed for this type.

audio/x-vorbis (Vorbis audio data)
  There are currently no specific properties defined or needed for this type.

audio/x-wma (Windows Media Audio)
  wmaversion (integer, 1, 2 or 3): The version of the WMA codec used to encode the stream.

audio/x-paris (Ensoniq PARIS audio)
  There are currently no specific properties defined or needed for this type.

audio/x-svx (Amiga IFF / SVX8 / SV16 audio)
  There are currently no specific properties defined or needed for this type.

audio/x-nist (Sphere NIST audio)
  There are currently no specific properties defined or needed for this type.

audio/x-voc (Sound Blaster VOC audio)
  There are currently no specific properties defined or needed for this type.

audio/x-ircam (Berkeley/IRCAM/CARL audio)
  There are currently no specific properties defined or needed for this type.
Table 12-2. Table of Video Types

video/* (all video types)
  width (integer, greater than 0): The width of the video image.
  height (integer, greater than 0): The height of the video image.
  framerate (fraction, greater than or equal to 0): The (average) framerate, in frames per second. Note that this property does not guarantee in any way that it will actually come close to this value. If you need a fixed framerate, please use an element that provides that (such as videorate). 0 means a variable framerate.

All raw video types:
video/x-raw-yuv (raw YUV video)
  format (fourcc, YUY2, YVYU, UYVY, Y41P, IYU2, Y42B, YV12, I420, Y41B, YUV9, YVU9 or Y800): The layout of the video. See the FourCC definition site (http://www.fourcc.org/) for references and definitions. YUY2, YVYU and UYVY are 4:2:2 packed-pixel, Y41P is 4:1:1 packed-pixel and IYU2 is 4:4:4 packed-pixel. Y42B is 4:2:2 planar, YV12 and I420 are 4:2:0 planar, Y41B is 4:1:1 planar and YUV9 and YVU9 are 4:1:0 planar. Y800 contains Y-samples only (black/white).

video/x-raw-rgb (raw RGB video)
  bpp (integer, greater than 0): The number of bits allocated per pixel. This is usually 16, 24 or 32.
  depth (integer, greater than 0): The number of bits used per pixel by the R/G/B components. This is usually 15, 16 or 24.
  endianness (integer, G_BIG_ENDIAN (4321) or G_LITTLE_ENDIAN (1234)): The order of bytes in a sample. The value G_LITTLE_ENDIAN (1234) means little-endian (byte-order is least significant byte first). The value G_BIG_ENDIAN (4321) means big-endian (byte order is most significant byte first). For 24/32bpp, this should always be big endian because the byte order can be given in both.
  red_mask, green_mask and blue_mask (integer, any): The masks that cover all the bits used by each of the samples. The mask should be given in the endianness specified above. This means that for 24/32bpp, the masks might be opposite to host byte order (if you are working on little-endian computers).

All encoded video types:

video/x-3ivx (3ivx video)
  There are currently no specific properties defined or needed for this type.

video/x-divx (DivX video)
  divxversion (integer, 3, 4 or 5): Version of the DivX codec used to encode the stream.
video/x-dv (Digital Video)
  systemstream (boolean, FALSE): Indicates that this stream is not a system container stream.

video/x-ffv (FFMpeg video)
  ffvversion (integer): Version of the FFMpeg video codec used to encode the stream.

video/x-h263 (H-263 video)
  variant (string, "itu", "lead", "microsoft", "vdolive", "vivo" or "xirlink"): Vendor specific variant of the format. "itu" is the standard.
  h263version (string, "h263", "h263p" or "h263pp"): Enhanced versions of the h263 codec.

video/x-h264 (H-264 video)
  variant (string, "itu" or "videosoft"): Vendor specific variant of the format. "itu" is the standard.

video/x-huffyuv (Huffyuv video)
  There are currently no specific properties defined or needed for this type.

video/x-indeo (Indeo video)
  indeoversion (integer, 3): Version of the Indeo codec used to encode this stream.

video/x-intel-h263 (H-263 video)
  variant (string, "intel"): Vendor specific variant of the format.

video/x-jpeg (Motion-JPEG video)
  There are currently no specific properties defined or needed for this type. Note that video/x-jpeg only applies to Motion-JPEG pictures (YUY2 colourspace). RGB colourspace JPEG images are referred to as image/jpeg (JPEG image).

video/mpeg (MPEG video)
  mpegversion (integer, 1, 2 or 4): Version of the MPEG codec that this stream was encoded with. Note that we have different mimetypes for 3ivx, XviD, DivX and "standard" ISO MPEG-4. This is not a good thing and we're fully aware of this. However, we do not have a solution yet.
  systemstream (boolean, FALSE): Indicates that this stream is not a system container stream.

video/x-msmpeg (Microsoft MPEG-4 video deviations)
  msmpegversion (integer, 41, 42 or 43): Version of the MS-MPEG-4-like codec that was used to encode this stream. A value of 41 refers to MS MPEG 4.1, 42 to 4.2 and 43 to version 4.3.

video/x-msvideocodec (Microsoft Video 1, an oldish codec)
  msvideoversion (integer): Version of the codec used to encode the stream.

video/x-pn-realvideo (Realmedia video)
  rmversion (integer, 1, 2 or 3): Version of the Real Video codec that this stream was encoded with.

video/x-rle (RLE animation format)
  layout (string, "microsoft" or "quicktime"): The RLE format inside the Microsoft AVI container has a different byte layout than the RLE format inside Apple's Quicktime container; this property keeps track of the layout.
  depth (integer, 1 to 64): Bit depth of the used palette. This means that the palette that belongs to this format defines 2^depth colors.
  palette_data (GstBuffer): Buffer containing a color palette (in native-endian RGBA) used by this format. The buffer is of size 4*2^depth.

video/x-svq (Sorensen Video)
  svqversion (integer, 1 or 3): Version of the Sorensen codec that the stream was encoded with.

video/x-tarkin (Tarkin video)
  There are currently no specific properties defined or needed for this type.

video/x-theora (Theora video)
  There are currently no specific properties defined or needed for this type.

video/x-vp3 (VP-3 video)
  There are currently no specific properties defined or needed for this type. Note that we have different mimetypes for VP-3 and Theora, which is not necessarily a good idea. This could probably be improved.

video/x-wmv (Windows Media Video)
  wmvversion (integer, 1, 2 or 3): Version of the WMV codec that the stream was encoded with.

video/x-xvid (XviD video)
  There are currently no specific properties defined or needed for this type.

All image types:

image/gif (Graphics Interchange Format)
  There are currently no specific properties defined or needed for this type.
image/jpeg (Joint Picture Expert Group Image)
  There are currently no specific properties defined or needed for this type. Note that image/jpeg only applies to RGB-colourspace JPEG images; YUY2-colourspace JPEG pictures are referred to as video/x-jpeg ("Motion JPEG").

image/png (Portable Network Graphics Image)
  There are currently no specific properties defined or needed for this type.

image/tiff (Tagged Image File Format)
  There are currently no specific properties defined or needed for this type.

Table 12-3. Table of Container Types
video/x-ms-asf (Advanced Streaming Format (ASF))
  There are currently no specific properties defined or needed for this type.

video/x-msvideo (AVI)
  There are currently no specific properties defined or needed for this type.

video/x-dv (Digital Video)
  systemstream (boolean, TRUE): Indicates that this is a container system stream rather than an elementary video stream.

video/x-matroska (Matroska)
  There are currently no specific properties defined or needed for this type.

video/mpeg (Motion Pictures Expert Group System Stream)
  systemstream (boolean, TRUE): Indicates that this is a container system stream rather than an elementary video stream.

application/ogg (Ogg)
  There are currently no specific properties defined or needed for this type.

video/quicktime (Quicktime)
  There are currently no specific properties defined or needed for this type.

Table 12-4. Table of Subtitle Types

None defined yet.

Table 12-5. Table of Other Types

None defined yet.
The code to parse this file and create the dynamic "sometimes" pads looks like this:
typedef struct _GstMyFilter {
[..]
  gboolean firstrun;
  GList *srcpadlist;
} GstMyFilter;

static void
gst_my_filter_base_init (GstMyFilterClass *klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
  static GstStaticPadTemplate src_factory =
    GST_STATIC_PAD_TEMPLATE (
Note that we use a lot of checks everywhere to make sure that the content in the file is valid. This has two purposes: first, the file could be erroneous, in which case we prevent a crash. The second and most important reason is that - in extreme cases - the file could be used maliciously to cause undefined behaviour in the plugin, which might lead to security issues. Always assume that the file could be used to do bad things.
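As a standalone illustration of this principle (the helper names are invented for the example and are not part of any GStreamer API), a size field read from a file must be validated against the data actually available before it is trusted:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* read a 32-bit big-endian length field */
static uint32_t
read_u32_be (const uint8_t *p)
{
  return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
         ((uint32_t)p[2] << 8) | (uint32_t)p[3];
}

/* Extract a length-prefixed chunk, refusing anything that does not
 * fit in the buffer. Returns the payload pointer, or NULL for invalid
 * input: never trust a size read from the file itself. */
static const uint8_t *
get_chunk_payload (const uint8_t *buf, size_t buflen, size_t *payload_len)
{
  uint32_t len;

  if (buflen < 4)           /* not even a complete length field */
    return NULL;
  len = read_u32_be (buf);
  if (len > buflen - 4)     /* claimed size exceeds available data */
    return NULL;
  *payload_len = len;
  return buf + 4;
}
```

A hostile file claiming a multi-gigabyte chunk in a six-byte buffer is rejected before any read past the end of the buffer can happen.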
static GstPad * gst_my_filter_request_new_pad (GstElement     *element,
                                               GstPadTemplate *templ,
                                               const gchar    *name);

static void
gst_my_filter_base_init (GstMyFilterClass *klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
static GstPad *
gst_my_filter_request_new_pad (GstElement     *element,
                               GstPadTemplate *templ,
                               const gchar    *name)
{
  GstPad *pad;
  GstMyFilterInputContext *context;

  context = g_new0 (GstMyFilterInputContext, 1);
  pad = gst_pad_new_from_template (templ, name);
  gst_pad_set_element_private (pad, context);

  /* normally, you would set _link () and _getcaps () functions here */

  gst_element_add_pad (element, pad);

  return pad;
}
14.2. Clocks
GStreamer can use different clocks. Though the system time can be used as a clock, soundcards and other devices provide a better time source. For this reason some elements provide a clock. The method get_clock is implemented in elements that provide one. As clocks return an absolute measure of time, they are not usually used directly. Instead, a reference to a clock is stored in any element that needs it, and it is used internally by GStreamer to calculate the element time.
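As a sketch (the element type and its "clock" field are hypothetical), an element that provides a clock overrides the provide_clock virtual method of GstElementClass and hands out a reference to its clock:

```c
/* Hypothetical audio sink that derives a clock from the soundcard;
 * GstMyAudioSink and its fields are assumptions for this sketch. */
static GstClock *
gst_my_audio_sink_provide_clock (GstElement *element)
{
  GstMyAudioSink *sink = GST_MY_AUDIO_SINK (element);

  /* hand out a new reference; the caller owns the returned ref */
  return GST_CLOCK (gst_object_ref (sink->clock));
}

static void
gst_my_audio_sink_class_init (GstMyAudioSinkClass *klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);

  element_class->provide_clock = gst_my_audio_sink_provide_clock;
}
```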
Chapter 14. Clocking

If the stream is seeked, the next samples sent will have a timestamp that is not adjusted with the element time. Therefore, the source element must send a discontinuous event.
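In 0.10 that discontinuity is expressed as a newsegment event; a hedged sketch (the element type and field names are invented) of a source announcing the new position after a seek could look like this:

```c
/* Hypothetical helper announcing a discontinuity after a seek;
 * GstMySource and its srcpad field are assumptions for this sketch. */
static void
gst_my_source_send_discont (GstMySource *src, GstClockTime new_time)
{
  GstEvent *event;

  /* not an update, normal rate, TIME format, starting at the seek
   * target; stop of -1 means "unknown/end of stream" */
  event = gst_event_new_new_segment (FALSE, 1.0, GST_FORMAT_TIME,
      new_time, -1, new_time);
  gst_pad_push_event (src->srcpad, event);
}
```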
Notes
1. Sometimes it is a parser element that knows the time, for instance if a pipeline contains a filesrc element connected to an MPEG decoder element: the latter is the one that knows the time of each sample, because the knowledge of when to play each sample is embedded in the MPEG format. In this case this element will be regarded as the source element for this discussion.

2. With some schedulers, gst_element_wait() blocks the pipeline. For instance, if there is one audio sink element and one video sink element, while the audio element is waiting for a sample the video element cannot play another sample. This behaviour is under discussion, and might change in a future release.
Even though the gstcontroller library may be linked into the host application, you should make sure it is initialized in your plugin_init function:
static gboolean
plugin_init (GstPlugin *plugin)
{
  ...
  /* initialize library */
  gst_controller_init (NULL, NULL);
  ...
}
It makes no sense for every GObject parameter to be real-time controlled. Therefore, the next step is to mark controllable parameters. This is done by using the special flag GST_PARAM_CONTROLLABLE when setting up GObject params in the _class_init method.
g_object_class_install_property (gobject_class, PROP_FREQ,
    g_param_spec_double ("freq", "Frequency", "Frequency of test signal",
        0.0, 20000.0, 440.0,
        G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE | G_PARAM_STATIC_STRINGS));
This call makes all parameter-changes for the given timestamp active by adjusting the GObject properties of the element. It's up to the element to determine the synchronisation rate.
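A minimal sketch of such a call from a processing function, assuming the gstcontroller library's gst_object_sync_values () is the call in question (the "filter" and "buf" names are placeholders):

```c
/* Inside the element's processing function (e.g. its chain function):
 * sync the controlled properties to the timestamp of the buffer
 * that is about to be processed. */
gst_object_sync_values (G_OBJECT (filter), GST_BUFFER_TIMESTAMP (buf));
```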
Chapter 16. Interfaces

for a simple interface with no further dependencies. For a small explanation on GstImplementsInterface (../../gstreamer/html/GstImplementsInterface.html), see the next section about the mixer interface: Mixer Interface.
static void gst_my_filter_some_interface_init (GstSomeInterface *iface);

GType
gst_my_filter_get_type (void)
{
  static GType my_filter_type = 0;

  if (!my_filter_type) {
    static const GTypeInfo my_filter_info = {
      sizeof (GstMyFilterClass),
      (GBaseInitFunc) gst_my_filter_base_init,
      NULL,
      (GClassInitFunc) gst_my_filter_class_init,
      NULL,
      NULL,
      sizeof (GstMyFilter),
      0,
      (GInstanceInitFunc) gst_my_filter_init
    };
    static const GInterfaceInfo some_interface_info = {
      (GInterfaceInitFunc) gst_my_filter_some_interface_init,
      NULL,
      NULL
    };

    my_filter_type =
        g_type_register_static (GST_TYPE_ELEMENT,
            "GstMyFilter", &my_filter_info, 0);
    g_type_add_interface_static (my_filter_type,
        GST_TYPE_SOME_INTERFACE, &some_interface_info);
  }

  return my_filter_type;
}

static void
gst_my_filter_some_interface_init (GstSomeInterface *iface)
{
  /* here, you would set virtual function pointers in the interface */
}
static void gst_my_filter_implements_interface_init (GstImplementsInterfaceClass *iface);
static void gst_my_filter_mixer_interface_init (GstMixerClass *iface);

GType
gst_my_filter_get_type (void)
{
[..]
  static const GInterfaceInfo implements_interface_info = {
    (GInterfaceInitFunc) gst_my_filter_implements_interface_init,
    NULL,
Chapter 16. Interfaces

The mixer interface is very audio-centric. However, with the software flag set, the mixer can be used to mix any kind of stream in a N-to-1 element to join (not aggregate!) streams together into one output stream. Conceptually, that's called mixing too. You can always use the element factory's category to indicate the type of your element. In a software element that mixes random streams, you would not be required to implement the _get_volume () or _set_volume () functions. Rather, you would only implement _set_record () to enable or disable tracks in the output stream. To make sure that a mixer-implementing element is of a certain type, check the element factory's category.
static void gst_my_filter_implements_interface_init (GstImplementsInterfaceClass *iface);
static void gst_my_filter_tuner_interface_init (GstTunerClass *iface);

GType
gst_my_filter_get_type (void)
{
[..]
  static const GInterfaceInfo implements_interface_info = {
    (GInterfaceInitFunc) gst_my_filter_implements_interface_init,
    NULL,
    NULL
  };
static void
gst_my_filter_tuner_set_channel (GstTuner        *tuner,
                                 GstTunerChannel *channel)
{
  GstMyFilter *filter = GST_MY_FILTER (tuner);

  filter->active_input = g_list_index (filter->channels, channel);
  g_assert (filter->active_input >= 0);
}

static void
gst_my_filter_tuner_interface_init (GstTunerClass *iface)
{
  iface->list_channels = gst_my_filter_tuner_list_channels;
  iface->get_channel = gst_my_filter_tuner_get_channel;
  iface->set_channel = gst_my_filter_tuner_set_channel;
}
As said, the tuner interface is very analog video-centric. It features functions for selecting an input or output, and, on inputs, selection of a tuning frequency if the channel supports frequency-tuning on that input. Likewise, it allows signal-strength acquisition if the input supports that. Frequency tuning can be used for radio or cable-TV tuning. Signal strength is an indication of the signal and can be used for visual feedback to the user or for autodetection. In addition, it features norm selection, which is only useful for analog video elements.
Chapter 16. Interfaces

change during the life of an element. The contents of an enumeration list are static. Currently, property probing is being used for detection of devices (e.g. for OSS elements, Video4linux elements, etc.). It could - in theory - be used for any property, though.

Property probing stores the list of allowed (or recommended) values in a GValueArray and returns that to the user. NULL is a valid return value, too. The process of property probing is separated over two virtual functions: one for probing the property to create a GValueArray, and one to retrieve the current GValueArray. Those two are separated because probing might take a long time (several seconds). Also, this simplifies interface implementation in elements. For the application, there are functions that wrap those two. For more information on this, have a look at the API reference for the GstPropertyProbe interface.

Below is an example of property probing for the audio filter element; it will probe for allowed values for the silent property. Indeed, this value is a gboolean, so it doesn't make much sense. Then again, it's only an example.
#include <gst/propertyprobe/propertyprobe.h>

static void gst_my_filter_probe_interface_init (GstPropertyProbeInterface *iface);

GType
gst_my_filter_get_type (void)
{
[..]
  static const GInterfaceInfo probe_interface_info = {
    (GInterfaceInitFunc) gst_my_filter_probe_interface_init,
    NULL,
    NULL
  };
[..]
  g_type_add_interface_static (my_filter_type,
      GST_TYPE_PROPERTY_PROBE, &probe_interface_info);
[..]
}

static const GList *
gst_my_filter_probe_get_properties (GstPropertyProbe *probe)
{
  GObjectClass *klass = G_OBJECT_GET_CLASS (probe);
  static GList *props = NULL;

  if (!props) {
    GParamSpec *pspec;

    pspec = g_object_class_find_property (klass, "silent");
    props = g_list_append (props, pspec);
  }

  return props;
  return array;
}

static GValueArray *
gst_my_filter_probe_get_values (GstPropertyProbe *probe,
                                guint             prop_id,
                                const GParamSpec *pspec)
{
  GstMyFilter *filter = GST_MY_FILTER (probe);
  GValueArray *array = NULL;

  switch (prop_id) {
    case ARG_SILENT:
      array = gst_my_filter_get_silent_values (filter);
      break;
    default:
      G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec);
      break;
  }

  return array;
}

static void
gst_my_filter_probe_interface_init (GstPropertyProbeInterface *iface)
{
  iface->get_properties = gst_my_filter_probe_get_properties;
  iface->needs_probe = gst_my_filter_probe_needs_probe;
  iface->probe_property = gst_my_filter_probe_probe_property;
  iface->get_values = gst_my_filter_probe_get_values;
}
You don't need to support any functions for getting or setting values. All that is handled via the standard GObject _set_property () and _get_property () functions.
Chapter 16. Interfaces

created the window by itself. In that case the plugin is responsible for destroying that window when it's not needed any more, and it has to tell the application that a window has been created so that the application can use it. This is done using the have_xwindow_id signal that can be emitted from the plugin with the gst_x_overlay_got_xwindow_id method.

As you probably guessed already, active mode just means sending an X11 window to the plugin so that video output goes there. This is done using the gst_x_overlay_set_xwindow_id method.

It is possible to switch from one mode to another at any moment, so the plugin implementing this interface has to handle all cases. There are only two methods that plugin writers have to implement, and they most probably look like this:
static void
gst_my_filter_set_xwindow_id (GstXOverlay *overlay, XID xwindow_id)
{
  GstMyFilter *my_filter = GST_MY_FILTER (overlay);

  if (my_filter->window)
    gst_my_filter_destroy_window (my_filter->window);

  my_filter->window = xwindow_id;
}

static void
gst_my_filter_get_desired_size (GstXOverlay *overlay,
                                guint *width, guint *height)
{
  GstMyFilter *my_filter = GST_MY_FILTER (overlay);

  *width = my_filter->width;
  *height = my_filter->height;
}

static void
gst_my_filter_xoverlay_init (GstXOverlayClass *iface)
{
  iface->set_xwindow_id = gst_my_filter_set_xwindow_id;
  iface->get_desired_size = gst_my_filter_get_desired_size;
}
You will also need to use the interface methods to fire signals when needed, such as in the pad link function, where you will know the video geometry and maybe create the window.
static MyFilterWindow *
gst_my_filter_window_create (GstMyFilter *my_filter, gint width, gint height)
{
  MyFilterWindow *window = g_new (MyFilterWindow, 1);
  ...
  gst_x_overlay_got_xwindow_id (GST_X_OVERLAY (my_filter), window->win);
Chapter 17. Tagging (Metadata and Streaminfo)

Be sure to use functions like gst_value_transform () to make sure that your data is of the right type. After data reading, the application can be notified of the new taglist by calling gst_element_found_tags () or gst_element_found_tags_for_pad () (if the tags only refer to a specific sub-stream). These functions will post a tag message on the pipeline's GstBus for the application to pick up, but will also send tag events downstream, either over all source pads or over the pad specified.

We currently require the core to know the GType of tags before they are being used, so all tags must be registered first. You can add new tags to the list of known tags using gst_tag_register (). If you think the tag will be useful in more cases than just your own element, it might be a good idea to add it to gsttag.c instead. That's up to you to decide. If you want to do it in your own element, it's easiest to register the tag in one of your class init functions, preferably _class_init ().
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
[..]
  gst_tag_register ("my_tag_name", GST_TAG_FLAG_META,
                    G_TYPE_STRING,
                    _("my own tag"),
                    _("a tag that is specific to my own element"),
                    NULL);
[..]
}
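Once tags have been read, they can be announced as described above; a minimal sketch using the 0.10 API (the element, its srcpad, and the tag value are placeholders for illustration):

```c
/* Hypothetical snippet: announce a freshly parsed taglist;
 * "filter" and its srcpad field are assumptions for this sketch. */
GstTagList *taglist = gst_tag_list_new ();

gst_tag_list_add (taglist, GST_TAG_MERGE_APPEND,
    GST_TAG_TITLE, "some title", NULL);

/* posts a tag message on the bus and sends a tag event downstream */
gst_element_found_tags_for_pad (GST_ELEMENT (filter),
    filter->srcpad, taglist);
```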
Note that normally, elements would not read the full stream before processing tags. Rather, they would read from each sinkpad until they've received data (since tags usually come in before the first data buffer) and process that.
  filter = GST_MY_FILTER (gst_pad_get_parent (pad));
  ...
  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_NEWSEGMENT:
      /* maybe save and/or update the current segment (e.g. for output
       * clipping) or convert the event into one in a different format
       * (e.g. BYTES to TIME) or drop it and set a flag to send a newsegment
       * event in a different format later */
      ret = gst_pad_push_event (filter->src_pad, event);
      break;
    case GST_EVENT_EOS:
      /* end-of-stream, we should close down all stream leftovers here */
      gst_my_filter_stop_processing (filter);
      ret = gst_pad_push_event (filter->src_pad, event);
      break;
    case GST_EVENT_FLUSH_STOP:
      gst_my_filter_clear_temporary_buffers (filter);
If your element is chain-based, you will almost always have to implement a sink event function, since that is how you are notified about new segments and the end of the stream. If your element is exclusively loop-based, you may or may not want a sink event function (since the element is driving the pipeline, it will know the length of the stream in advance or be notified by the flow return value of gst_pad_pull_range ()). In some cases even loop-based elements may receive events from upstream though (for example audio decoders with an id3demux or apedemux element in front of them, or demuxers that are being fed input from sources that send additional information about the stream in custom events, as DVD sources do).
Chapter 18. Events: Seeking, Navigation and More

The processing you will do in that event handler does not really matter, but there are important rules you absolutely have to respect, because one broken element event handler breaks the whole pipeline's event handling. Here they are:

- Always forward events you won't handle upstream, using the default gst_pad_event_default () method.
- If you are generating some new event based on the one you received, don't forget to gst_event_unref () the event you received.
- Event handler functions are supposed to return TRUE or FALSE, indicating whether the event has been handled or not. Never simply return TRUE/FALSE in that handler unless you really know that you have handled that event.
- Remember that the event handler might be called from a different thread than the streaming thread, so make sure you use appropriate locking everywhere, and at the beginning of the function obtain a reference to your element via gst_pad_get_parent () (and release it again at the end of the function with gst_object_unref ()).
- End of Stream (EOS)
- Flush Start
- Flush Stop
- New Segment
- Seek Request
- Navigation
- Tag (metadata)
For more comprehensive information about events and how they should be used correctly in various circumstances please consult the GStreamer design documentation. This section only gives a general overview.
Upon receiving EOS, an element should flush out any remaining data (if there is any) and then forward the event further downstream. The gst_pad_event_default () function takes care of all this, so most elements do not need to support this event. Exceptions are elements that explicitly need to close a resource down on EOS, and N-to-1 elements. Note that the stream itself is not a resource that should be closed down on EOS! Applications might seek back to a point before EOS and continue playing again.

The EOS event has no properties, which makes it one of the simplest events in GStreamer. It is created using the gst_event_new_eos () function.

It is important to note that only elements driving the pipeline should ever send an EOS event. If your element is chain-based, it is not driving the pipeline. Chain-based elements should just return GST_FLOW_UNEXPECTED from their chain function at the end of the stream (or of the configured segment); the upstream element that is driving the pipeline will then take care of sending the EOS event (or alternatively post a SEGMENT_DONE message on the bus, depending on the mode of operation). If you are implementing your own source element, you also never need to manually send an EOS event; you should likewise just return GST_FLOW_UNEXPECTED from your create function (assuming your element derives from GstBaseSrc or GstPushSrc).
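As a sketch of this rule, a chain-based element's chain function might end the stream like this. The GstMyDec type, the segment_done field and the function names are hypothetical:

```c
/* Sketch: a 0.10 chain function that signals end-of-stream by
 * returning GST_FLOW_UNEXPECTED instead of pushing EOS itself. */
static GstFlowReturn
gst_my_dec_chain (GstPad * pad, GstBuffer * buf)
{
  GstMyDec *dec = GST_MY_DEC (gst_pad_get_parent (pad));
  GstFlowReturn ret;

  if (dec->segment_done) {
    /* end of the configured segment: do not send EOS ourselves,
     * just tell upstream (which drives the pipeline) that we are done */
    gst_buffer_unref (buf);
    ret = GST_FLOW_UNEXPECTED;
  } else {
    /* ... decode the data, then push it downstream ... */
    ret = gst_pad_push (dec->srcpad, buf);
  }

  gst_object_unref (dec);
  return ret;
}
```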
If your element keeps temporary caches of stream data, it should clear them when it receives a FLUSH-STOP event (and also whenever its chain function receives a buffer with the DISCONT flag set). The flush-stop event is created with gst_event_new_flush_stop (). Like the EOS event, it has no properties.
samples for audio, etc.]). Seeking can be done with respect to the end-of-file, start-of-file or current position, and usually happens in the upstream direction (downstream seeking is done by sending a NEWSEGMENT event with the appropriate offsets, for elements that support that, like filesink). Elements receiving seek events should, depending on the element type, either just forward the event upstream (filters, decoders), change the format in which the event is given and then forward it (demuxers), or handle the event themselves, for example by changing the file pointer in their internal stream resource (file sources, demuxers/decoders driving the pipeline in pull-mode). Seek events are built up using positions in specified formats (time, bytes, units); they are created using the function gst_event_new_seek (). Note that many plugins do not support seeking from the end of the stream or from the current position. An element that is not driving the pipeline and merely forwards a seek request should not assume that the seek succeeded or actually happened; it should operate based on the NEWSEGMENT events it receives. Elements parsing this event can do so using gst_event_parse_seek ().
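As a sketch, an application (or an element driving the pipeline) could construct a flushing time-based seek to the one-minute mark like this. The pipeline variable is assumed to exist:

```c
/* Sketch: build a flushing seek to 1 minute into the stream and
 * send it to the pipeline (or to a specific element). */
GstEvent *seek;

seek = gst_event_new_seek (1.0, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
    GST_SEEK_TYPE_SET, 60 * GST_SECOND,
    GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);

/* gst_element_send_event () takes ownership of the event */
if (!gst_element_send_event (pipeline, seek))
  g_warning ("seek failed or is not supported");
```

The first argument is the playback rate (1.0 for normal forward playback); GST_SEEK_TYPE_NONE for the stop position means "keep playing to the end".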
18.3.6. Navigation
Navigation events are sent upstream by video sinks to inform upstream elements of where the mouse pointer is, if and where mouse pointer clicks have happened, or if keys have been pressed or released. All this information is contained in the event structure, which can be obtained with gst_event_get_structure (). Check out the navigationtest element in gst-plugins-good for an idea of how to extract navigation information from this event.
Elements parsing this event can use the function gst_event_parse_tag () to acquire the taglist that the event contains.
It requires that the sink only has one sinkpad. Sink elements that need more than one sinkpad cannot use this base-class. The base-class owns the pad, and specifies caps negotiation, data handling, pad allocation and similar functions. If you need more than the ones provided as virtual functions, then you cannot use this base-class. By implementing the pad_allocate () function, it is possible for upstream elements to use special memory, such as memory on the X server side that only the sink can allocate, or even hardware memory mmap ()ed from the kernel. Note that in almost all cases, you will want to subclass the GstBuffer object, so that your own set of functions will be called when the buffer loses its last reference.
Sink elements can derive from GstBaseSink using the usual GObject type creation voodoo, or by using the convenience macro GST_BOILERPLATE ():
GST_BOILERPLATE_FULL (GstMySink, gst_my_sink, GstBaseSink, GST_TYPE_BASE_SINK); [..]
Derived implementations barely need to be aware of preroll, and do not need to know anything about the technical implementation requirements of preroll. The base-class does all the hard work. This means less code to write in the derived class, and shared code (and thus shared bugfixes).
There are also specialized base classes for audio and video; let's look at those a bit.
Automatic synchronization, without any code in the derived class. Also automatically provides a clock, so that other sinks (e.g. in the case of audio/video playback) are synchronized. Features can be added to all audiosinks by making a change in the base class, which makes maintenance easy. Derived classes require only three small functions, plus some GObject boilerplate code.
Chapter 19. Pre-made base classes

In addition to implementing the audio base-class virtual functions, derived classes can (should) also implement the GstBaseSink set_caps () and get_caps () virtual functions for negotiation.
Because of preroll (and the preroll () virtual function), it is possible to display a video frame as soon as the element goes into the GST_STATE_PAUSED state. By adding new features to GstVideoSink, it will be possible to add extensions to videosinks that affect all of them, but only need to be coded once, which is a huge maintenance benefit.
Fixes to GstBaseSrc apply to all derived classes automatically. Automatic pad activation handling, and task-wrapping in case we get assigned to start a task ourselves.
The GstBaseSrc may not be suitable for all cases, though; it has limitations:
There is one and only one sourcepad. Source elements requiring multiple sourcepads cannot use this base-class. Since the base-class owns the pad and derived classes can only control it as far as the virtual functions allow, you are limited to the functionality provided by the virtual functions. If you need more, you cannot use this base-class.
It is possible to use special memory, such as X server memory pointers or mmap ()ed memory areas, as data pointers in buffers returned from the create () virtual function. In almost all cases, you will want to subclass GstBuffer so that your own set of functions can be called when the buffer is destroyed.
Does synchronization and provides a clock. New features can be added to it and will apply to all derived classes automatically.
They can be the driving force of the pipeline, by running their own task. This works particularly well for elements that need random access, for example an AVI demuxer. They can also run in push-based mode, which means that an upstream element drives the pipeline. This works particularly well for streams that may come from the network, such as Ogg.
In addition, audio parsers with one output can, in theory, also be written in random access mode. Although simple playback will mostly work if your element only supports one mode, it may be necessary to implement multiple modes to work in combination with all sorts of applications, such as editing. Also, performance may improve if you implement multiple modes. See Different scheduling modes to see how an element can accept multiple scheduling modes.
- To add support for private events with custom event handling to another element.
- To add support for custom pad _query () or _convert () handling to another element.
- To add custom data handling before or after another element's data handler function (generally its _chain () function).
- To embed an element, or a series of elements, into something that looks and works like a simple element to the outside world.
Making a manager is about as simple as it gets. You can derive from a GstBin, and in most cases you can embed the required elements in the _init () function already, including the setup of ghostpads. If you need any custom data handlers, you can connect signals or embed a second element which you control.
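A minimal sketch of such a manager's _init () function might look as follows. The GstMyManager type and the "somedepayloader"/"somedecoder" element names are hypothetical placeholders:

```c
/* Sketch: a manager bin that embeds two elements in _init () and
 * exposes their outer pads via ghostpads. */
static void
gst_my_manager_init (GstMyManager * self, GstMyManagerClass * klass)
{
  GstElement *depay, *dec;
  GstPad *pad;

  depay = gst_element_factory_make ("somedepayloader", NULL);
  dec = gst_element_factory_make ("somedecoder", NULL);
  gst_bin_add_many (GST_BIN (self), depay, dec, NULL);
  gst_element_link (depay, dec);

  /* ghost the depayloader's sink pad onto the bin */
  pad = gst_element_get_static_pad (depay, "sink");
  gst_element_add_pad (GST_ELEMENT (self),
      gst_ghost_pad_new ("sink", pad));
  gst_object_unref (pad);

  /* ghost the decoder's source pad onto the bin */
  pad = gst_element_get_static_pad (dec, "src");
  gst_element_add_pad (GST_ELEMENT (self),
      gst_ghost_pad_new ("src", pad));
  gst_object_unref (pad);
}
```

To the outside world the bin now looks like a single element with one "sink" and one "src" pad.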
V. Appendices
This chapter contains things that don't belong anywhere else.
- Make sure the state of an element gets reset when going to NULL. Ideally, this should set all object properties to their original state. This function should also be called from _init.
- Make sure an element forgets everything about its contained stream when going from PAUSED to READY. In READY, all stream states are reset. An element that goes from PAUSED to READY and back to PAUSED should start reading the stream from the start again.
- People that use gst-launch for testing have the tendency to not care about cleaning up. This is wrong. An element should be tested using various applications, where testing not only means to make sure it doesn't crash, but also to test for memory leaks using tools such as valgrind. Elements have to be reusable in a pipeline after having been reset.
23.2. Debugging
Elements should never use their standard output for debugging (using functions such as printf () or g_print ()). Instead, elements should use the logging functions provided by GStreamer, named GST_DEBUG (), GST_LOG (), GST_INFO (), GST_WARNING () and GST_ERROR (). The various logging levels can be turned on and off at runtime and can thus be used for solving issues as they turn up. Instead of GST_LOG () (as an example), you can also use GST_LOG_OBJECT () to print the object that you're logging output for. Ideally, elements should use their own debugging category. Most elements use the following code to do that:
GST_DEBUG_CATEGORY_STATIC (myelement_debug);
#define GST_CAT_DEFAULT myelement_debug

[..]

static void
gst_myelement_class_init (GstMyelementClass * klass)
{
[..]
  GST_DEBUG_CATEGORY_INIT (myelement_debug, "myelement",
      0, "My own element");
}
At runtime, you can turn on debugging using the commandline option --gst-debug=myelement:5.
Elements should use GST_DEBUG_FUNCPTR when setting pad functions or overriding element class methods, for example:
gst_pad_set_event_func (myelement->srcpad, GST_DEBUG_FUNCPTR (my_element_src_event));
Elements that are aimed for inclusion into one of the GStreamer modules should ensure consistent naming of the element name, structures and function names. For example, if the element type is GstYellowFooDec, functions should be prefixed with gst_yellow_foo_dec_ and the element should be registered as yellowfoodec. Separate words should be separated in this scheme, so it should be GstFooDec and gst_foo_dec, and not GstFoodec and gst_foodec.
- All elements to which it applies (sources, sinks, demuxers) should implement query functions on their pads, so that applications and neighbour elements can request the current position, the stream length (if known) and so on.
- Elements should make sure they forward events they do not handle with gst_pad_event_default (pad, event) instead of just dropping them. Events should never be dropped unless specifically intended.
- Elements should make sure they forward queries they do not handle with gst_pad_query_default (pad, query) instead of just dropping them.
- Elements should use gst_pad_get_parent () in event and query functions, so that they hold a reference to the element while they are operating. Note that gst_pad_get_parent () increases the reference count of the element, so you must be very careful to call gst_object_unref (element) before returning from your query or event function, otherwise you will leak memory.
- gst-launch is not a good tool to show that your element is finished. Applications such as Rhythmbox and Totem (for GNOME) or AmaroK (for KDE) are. gst-launch will not test various things such as proper clean-up on reset, interrupt event handling, querying and so on.
- Parsers and demuxers should make sure to check their input. Input cannot be trusted. Prevent possible buffer overflows and the like. Feel free to error out on unrecoverable stream errors.
- Test your demuxer using stream corruption elements such as breakmydata (included in gst-plugins). It will randomly insert, delete and modify bytes in a stream, and is therefore a good test for robustness. If your element crashes when adding this element, your element needs fixing. If it errors out properly, it's good enough. Ideally, it'd just continue to work and forward data as much as possible.
- Demuxers should not assume that seeking works. Be prepared to work with unseekable input streams (e.g. network sources) as well.
- Sources and sinks should be prepared to be assigned another clock than the one they expose themselves. Always use the provided clock for synchronization, otherwise you'll get A/V sync issues.
- Discont events have been replaced by newsegment events. In 0.10, it is essential that you send a newsegment event downstream before you send your first buffer (in 0.8 the scheduler would invent discont events if you forgot them; in 0.10 this is no longer the case).
- In 0.10, buffers have caps attached to them. Elements should allocate new buffers with gst_pad_alloc_buffer (). See Caps negotiation for more details.
- Most functions returning an object or an object property have been changed to return its own reference rather than a constant reference of the one owned by the object itself. The reason for this change is primarily thread-safety. This effectively means that return values of functions such as gst_element_get_pad (), gst_pad_get_name (), gst_pad_get_parent (), gst_object_get_parent (), and many more like these, have to be freed or unreferenced after use. Check the API reference of each function to know for sure whether return values should be freed or not.
- In 0.8, scheduling could happen in any way. Source elements could be _get ()-based or _loop ()-based, and any other element could be _chain ()-based or _loop ()-based, with no limitations. Scheduling in 0.10 is simpler for the scheduler, and the element is expected to do some more work. Pads get assigned a scheduling mode, based on which they can either operate in random access mode, in pipeline driving mode or in push mode. All this is documented in detail in Different scheduling modes. As a result of this, the bytestream object no longer exists. Elements requiring byte-level access should now use random access on their sinkpads.
- Negotiation is asynchronous. This means that downstream negotiation is done as data comes in, and upstream negotiation is done whenever renegotiation is required. All details are described in Caps negotiation.
- As far as possible, elements should try to use existing base classes in 0.10. Sink and source elements, for example, could derive from GstBaseSrc and GstBaseSink. Audio sinks or sources could even derive from audio-specific base classes. All existing base classes have been discussed in Pre-made base classes and the next few chapters.
- In 0.10, event handling and buffers are separated once again. This means that in order to receive events, one no longer has to set the GST_FLAG_EVENT_AWARE flag, but can simply set an event handling function on the element's sinkpad(s), using the function gst_pad_set_event_function (). The _chain ()-function will only receive buffers.
- Although the core will wrap most threading-related locking for you (e.g. it takes the stream lock before calling your data handling functions), you are still responsible for locking around certain functions, e.g. object properties. Be sure to lock properly here, since applications will change those properties in a different thread than the thread which does the actual data passing! Fortunately, you can use the GST_OBJECT_LOCK () and GST_OBJECT_UNLOCK () helpers in most cases, which grab the default property lock of the element.
- GstValueFixedList and all *_fixed_list_* () functions were renamed to GstValueArray and *_array_* ().
- The semantics of GST_STATE_PAUSED and GST_STATE_PLAYING have changed for elements that are not sink elements. Non-sink elements need to be able to accept and process data already in the GST_STATE_PAUSED state now (i.e. when prerolling the pipeline). More details can be found in Chapter 6.
- If your plugin's state change function hasn't been superseded by virtual start () and stop () methods of one of the new base classes, then your plugin's state change functions may need to be changed in order to safely handle concurrent access by multiple threads. Your typical state change function will now first handle upwards state changes, then chain up to the state change function of the parent class (usually GstElementClass in these cases), and only then handle downwards state changes. See the vorbis decoder plugin in gst-plugins-base for an example. The reason for this is that in the case of downwards state changes you don't want to destroy allocated resources while your plugin's chain function (for example) is still accessing those resources in another thread. Whether your chain function might be running or not depends on the state of your plugin's pads, and the state of those pads is closely linked to the state of the element. Pad states are handled in the GstElement class's state change function, including proper locking; that's why it is essential to chain up before destroying allocated resources. As already mentioned above, you should really rewrite your plugin to derive from one of the new base classes though, so you don't have to worry about these things, as the base class will handle it for you. There are no base classes for decoders and encoders yet, so the above paragraphs about state changes definitely apply if your plugin is a decoder or an encoder.
- gst_pad_set_link_function (), which used to set a function that would be called when a format was negotiated between two GstPads, now sets a function that is called when two elements are linked together in an application. For all practical purposes, you most likely want to use the function gst_pad_set_setcaps_function () nowadays, which sets a function that is called when the format streaming over a pad changes (so it is similar to _set_link_function () in GStreamer-0.8).
If the element is derived from a GstBase class, then override the set_caps () virtual function.
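A minimal sketch of such a 0.10-style setcaps function follows; the GstMyFilter type, its rate field and the function names are hypothetical:

```c
/* Sketch: a setcaps function, the 0.10 replacement for the old
 * 0.8 pad link function. */
static gboolean
gst_my_filter_setcaps (GstPad * pad, GstCaps * caps)
{
  GstMyFilter *filter = GST_MY_FILTER (gst_pad_get_parent (pad));
  GstStructure *s = gst_caps_get_structure (caps, 0);
  gboolean ret;

  /* extract the fields we care about; fail negotiation if absent */
  ret = gst_structure_get_int (s, "rate", &filter->rate);

  gst_object_unref (filter);
  return ret;
}

/* installed from the instance init function: */
gst_pad_set_setcaps_function (filter->sinkpad,
    GST_DEBUG_FUNCPTR (gst_my_filter_setcaps));
```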
- gst_pad_use_explicit_caps () has been replaced by gst_pad_use_fixed_caps (). You can then set the fixed caps to use on a pad with gst_pad_set_caps ().