<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="de-AT">
	<id>https://becomwiki.live.md-websolutions.com/index.php?action=history&amp;feed=atom&amp;title=Argos%C2%AE2D_A100_Programmer%27s_Guide</id>
	<title>Argos®2D A100 Programmer&#039;s Guide - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://becomwiki.live.md-websolutions.com/index.php?action=history&amp;feed=atom&amp;title=Argos%C2%AE2D_A100_Programmer%27s_Guide"/>
	<link rel="alternate" type="text/html" href="https://becomwiki.live.md-websolutions.com/index.php?title=Argos%C2%AE2D_A100_Programmer%27s_Guide&amp;action=history"/>
	<updated>2026-05-13T12:45:17Z</updated>
	<subtitle>Revision history of this page in BECOM Systems Support</subtitle>
	<generator>MediaWiki 1.43.5</generator>
	<entry>
		<id>https://becomwiki.live.md-websolutions.com/index.php?title=Argos%C2%AE2D_A100_Programmer%27s_Guide&amp;diff=22&amp;oldid=prev</id>
		<title>Peter: Imported 1 revision</title>
		<link rel="alternate" type="text/html" href="https://becomwiki.live.md-websolutions.com/index.php?title=Argos%C2%AE2D_A100_Programmer%27s_Guide&amp;diff=22&amp;oldid=prev"/>
		<updated>2023-10-31T08:03:04Z</updated>

		<summary type="html">&lt;p&gt;Imported 1 revision&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;de-AT&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 10:03, 31 October 2023&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;de-AT&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Peter</name></author>
	</entry>
	<entry>
		<id>https://becomwiki.live.md-websolutions.com/index.php?title=Argos%C2%AE2D_A100_Programmer%27s_Guide&amp;diff=21&amp;oldid=prev</id>
		<title>en&gt;Peter: Imported 1 revision</title>
		<link rel="alternate" type="text/html" href="https://becomwiki.live.md-websolutions.com/index.php?title=Argos%C2%AE2D_A100_Programmer%27s_Guide&amp;diff=21&amp;oldid=prev"/>
		<updated>2023-08-22T19:35:45Z</updated>

		<summary type="html">&lt;p&gt;Imported 1 revision&lt;/p&gt;
&lt;p&gt;&lt;b&gt;Neue Seite&lt;/b&gt;&lt;/p&gt;&lt;div&gt;==Overview of ISMs (Image Sensor Modules)==&lt;br /&gt;
&lt;br /&gt;
===ISM-MT9M131===&lt;br /&gt;
*See the [[Linux Software User Manual (i.MX53 Modules)#Camera module ISM-MT9M131|Linux Software User Manual]] to start with this camera module.&lt;br /&gt;
*Image data format: YUV (UYVY), 16 bpp&lt;br /&gt;
*GStreamer capture plug-in special parameters:&lt;br /&gt;
 gst-launch mfw_v4lsrc&lt;br /&gt;
&lt;br /&gt;
===ISM-MT9M025 Monochrome===&lt;br /&gt;
*See the [[Linux Software User Manual (i.MX53 Modules)#Camera module ISM-MT9M024/5 |Linux Software User Manual]] to start with this camera module.&lt;br /&gt;
*Image data format: Grayscale, up to 10 bpp (default: 8bpp)&lt;br /&gt;
*GStreamer capture plug-in special parameters:&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=1 capture-mode=5&lt;br /&gt;
*The GStreamer capture plug-in can only handle 8bpp.&lt;br /&gt;
&lt;br /&gt;
===ISM-MT9M025 Color===&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#cc0000&amp;quot;&amp;gt;&amp;#039;&amp;#039;NOTE: Recommended for experienced customers only. The BSP does not contain software for white balancing, color correction, etc.&amp;#039;&amp;#039;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
*See the [[Linux Software User Manual (i.MX53 Modules)#Camera module ISM-MT9M024/5 |Linux Software User Manual]] to start with this camera module.&lt;br /&gt;
*Image data format: RGB, Bayer filter array (GRBG), up to 10 bpp (default: 8bpp)&lt;br /&gt;
*GStreamer capture plug-in special parameters:&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=2 capture-mode=5&lt;br /&gt;
*The GStreamer capture plug-in can only handle 8bpp.&lt;br /&gt;
&lt;br /&gt;
===ISM-MT9P031 Monochrome===&lt;br /&gt;
*See the [[Linux Software User Manual (i.MX53 Modules)#Camera module ISM-MT9P031  |Linux Software User Manual]] to start with this camera module.&lt;br /&gt;
*Image data format: Grayscale, up to 10 bpp (default: 8bpp)&lt;br /&gt;
*GStreamer capture plug-in special parameters:&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=1 capture-mode=5&lt;br /&gt;
*The GStreamer capture plug-in can only handle 8bpp.&lt;br /&gt;
&lt;br /&gt;
===ISM-MT9P031 Color===&lt;br /&gt;
&amp;lt;span style=&amp;quot;color:#cc0000&amp;quot;&amp;gt;&amp;#039;&amp;#039;NOTE: Recommended for experienced customers only. The BSP does not contain software for white balancing, color correction, etc.&amp;#039;&amp;#039;&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
*See the [[Linux Software User Manual (i.MX53 Modules)#Camera module ISM-MT9P031  |Linux Software User Manual]] to start with this camera module.&lt;br /&gt;
*Image data format: RGB, Bayer filter array (Normal mode: RGGB; Rotate mode: BGGR), up to 10 bpp (default: 8bpp)&lt;br /&gt;
*GStreamer capture plug-in special parameters (normal mode):&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=3 capture-mode=5&lt;br /&gt;
*GStreamer capture plug-in special parameters (rotate mode):&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=4 capture-mode=5&lt;br /&gt;
*The GStreamer capture plug-in can only handle 8bpp.&lt;br /&gt;
&lt;br /&gt;
==GStreamer Programming Guide==&lt;br /&gt;
[[Image:Gstreamer-logo-75.png|100px|bottom]]&lt;br /&gt;
GStreamer is a pipeline-based multimedia framework written in the C programming language with the type system based on GObject.&lt;br /&gt;
GStreamer allows a programmer to create a variety of media-handling components, including simple audio playback, audio and video playback, recording, streaming and editing. The pipeline design serves as a base to create many types of multimedia applications such as video editors, streaming media broadcasters and media players. (https://en.wikipedia.org/wiki/GStreamer)&lt;br /&gt;
&lt;br /&gt;
===Plug-in list===&lt;br /&gt;
The i.MX53 IPU and VPU are supported in GStreamer by means of plug-ins.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Name!!Vendor!!Package!!Description!!Important parameters&lt;br /&gt;
|-&lt;br /&gt;
|mfw_ipucsc||Freescale||gst-fsl-plugin||IPU Color Space Converter||&lt;br /&gt;
|-&lt;br /&gt;
|mfw_v4lsink||Freescale||gst-fsl-plugin||V4L Sink - An overlay framebuffer for video playback or live view for output on a display||&amp;#039;&amp;#039;disp-width&amp;#039;&amp;#039;...width of the image to be displayed&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;disp-height&amp;#039;&amp;#039;...height of the image to be displayed&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;axis-top&amp;#039;&amp;#039;...top co-ordinate of the origin of display&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;axis-left&amp;#039;&amp;#039;...left co-ordinate of the origin of display&lt;br /&gt;
|-&lt;br /&gt;
|mfw_v4lsrc||Freescale/Bluetechnix||gst-fsl-plugin||Video Source plug-in - to capture image data from ISM camera modules||&amp;#039;&amp;#039;capture-width&amp;#039;&amp;#039;...width of the image to be captured&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;capture-height&amp;#039;&amp;#039;...height of the image to be captured&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;fps-n&amp;#039;&amp;#039;...frame rate at which the input stream is to be captured&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;color-mode&amp;#039;&amp;#039;...Sets the color format. 0...YUV (ISM-MT9M131); 1...8-bit grayscale (Monochrome cameras); 2...Bayer GRBG (ISM-MT9M025); 3...Bayer RGGB (ISM-MT9P031); 4...Bayer BGGR (ISM-MT9P031 Rotate), 5...Bayer GBRG&lt;br /&gt;
|-&lt;br /&gt;
|mfw_vpuencoder||Freescale||gst-fsl-plugin||Hardware (VPU) Encoder||&amp;#039;&amp;#039;codec-type&amp;#039;&amp;#039;...codec type for encoding: 0...MPEG4; 1...H.263; 2...AVC (H.264); 7...MJPEG&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;framerate&amp;#039;&amp;#039;...framerate at which the input stream is to be encoded&lt;br /&gt;
|-&lt;br /&gt;
|ismconv||Bluetechnix||gst-plugins-bad||Bayer/Grayscale to RGB24/RGB565 converter for Bluetechnix ISMs (ARM NEON optimized)||&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To see all supported in/out formats and all options of a plug-in, type&lt;br /&gt;
 gst-inspect &amp;lt;plugin&amp;gt;&lt;br /&gt;
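&lt;br /&gt;
For illustration, the plug-ins above can be combined into a small capture pipeline for a Bayer camera module. Note that this is only a sketch; whether &amp;#039;&amp;#039;ismconv&amp;#039;&amp;#039; negotiates a suitable RGB output format with the downstream element should be verified with &amp;#039;&amp;#039;gst-inspect&amp;#039;&amp;#039; first:&lt;br /&gt;
 gst-launch mfw_v4lsrc color-mode=2 capture-mode=5 ! ismconv ! filesink location=/tmp/frames.rgb&lt;br /&gt;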
&lt;br /&gt;
===Pipeline example 1 (Linux Shell)===&lt;br /&gt;
&lt;br /&gt;
The GStreamer API is available in various programming languages. The quickest way is to use the Linux shell to construct the pipeline. For simple pipelines without much interaction, this is sufficient.&lt;br /&gt;
The following example is similar to the &amp;#039;&amp;#039;stream-mt9m131.sh&amp;#039;&amp;#039; streaming part in the Argos&amp;amp;reg;&amp;lt;sup&amp;gt;2D&amp;lt;/sup&amp;gt; A100 demo application.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;What it does&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
*Capture YUV camera stream&lt;br /&gt;
*Duplicate stream&lt;br /&gt;
*Encode to MJPEG format&lt;br /&gt;
*Decrease frame rate&lt;br /&gt;
*Write single frame to file&lt;br /&gt;
*Encode to H.264 format&lt;br /&gt;
*Write video stream to file&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Command&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
 gst-launch -e mfw_v4lsrc fps-n=25 capture-width=800 capture-height=600 \&lt;br /&gt;
     ! tee name=splitter \&lt;br /&gt;
         ! queue \&lt;br /&gt;
         ! mfw_vpuencoder codec-type=std_mjpg \&lt;br /&gt;
         ! image/jpeg,height=600,width=800,framerate=25/1 \&lt;br /&gt;
         ! videorate skip-to-first=true \&lt;br /&gt;
         ! image/jpeg,height=600,width=800,framerate=1/2 \&lt;br /&gt;
         ! multifilesink location=/tmp/ecam-singleshot.jpg \&lt;br /&gt;
     splitter. \&lt;br /&gt;
         ! queue \&lt;br /&gt;
         ! mfw_vpuencoder codec-type=std_avc bitrate=2000 framerate=25 \&lt;br /&gt;
         ! video/x-h264,height=600,width=800,framerate=25/1 \&lt;br /&gt;
         ! filesink location=/tmp/fifo-h264 &amp;amp;&lt;br /&gt;
&lt;br /&gt;
As you can see, the application that starts GStreamer is &amp;#039;&amp;#039;gst-launch&amp;#039;&amp;#039;. The &amp;#039;&amp;#039;-e&amp;#039;&amp;#039; flag ensures that when you press Ctrl+C to quit GStreamer, an EOS (End-of-Stream) event is sent down the pipeline and the pipeline is terminated properly (i.e., file contents are written completely, etc.).&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Pipeline&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
A pipeline within the shell always looks like this:&lt;br /&gt;
 &amp;lt;src element&amp;gt; ! &amp;lt;element 1&amp;gt; ! &amp;lt;element 2&amp;gt; ! ... ! &amp;lt;sink element&amp;gt;&lt;br /&gt;
i.e., elements are concatenated with &amp;#039;&amp;#039;&amp;#039;!&amp;#039;&amp;#039;&amp;#039;. Each element may take arguments of the form &amp;#039;&amp;#039;name=value&amp;#039;&amp;#039;. We will now walk through the example above.&lt;br /&gt;
&lt;br /&gt;
The basic outline of the above example is as follows:&lt;br /&gt;
 Capture       --&amp;gt; Duplicate +--&amp;gt; Queue   --&amp;gt; Encode           --&amp;gt; Throttling  --&amp;gt; Write single frames&lt;br /&gt;
 (mfw_v4lsrc)      (tee)     |    (queue)     (mfw_vpuencoder)     (videorate)          (multifilesink)&lt;br /&gt;
                             |&lt;br /&gt;
                             +--&amp;gt; Queue   --&amp;gt; Encode           --&amp;gt; Write stream&lt;br /&gt;
                                  (queue)     (mfw_vpuencoder)     (filesink)&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Capturing&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
 mfw_v4lsrc fps-n=25 capture-width=800 capture-height=600&lt;br /&gt;
The capture plug-in is configured for a capture size of 800x600 pixels and 25 frames per second.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Duplicating the stream&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Since we want to create two different encodings from the same stream, we have to duplicate the stream before each copy is fed into an instance of the video encoder. The &amp;#039;&amp;#039;&amp;#039;tee&amp;#039;&amp;#039;&amp;#039; plug-in has the following syntax:&lt;br /&gt;
 &amp;lt;nowiki&amp;gt;&amp;lt;&amp;lt;/nowiki&amp;gt;source element&amp;gt; ! tee name=&amp;lt;span style=&amp;quot;color:#cc0000&amp;quot;&amp;gt;&amp;#039;&amp;#039;&amp;#039;splitter&amp;#039;&amp;#039;&amp;#039;&amp;lt;/span&amp;gt; ! &amp;lt;here is the pipeline for stream 1&amp;gt; &amp;lt;span style=&amp;quot;color:#cc0000&amp;quot;&amp;gt;&amp;#039;&amp;#039;&amp;#039;splitter.&amp;#039;&amp;#039;&amp;#039;&amp;lt;/span&amp;gt; ! &amp;lt;here is the pipeline for stream 2&amp;gt;&lt;br /&gt;
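A minimal, self-contained illustration of this syntax (a sketch using the standard GStreamer elements &amp;#039;&amp;#039;videotestsrc&amp;#039;&amp;#039; and &amp;#039;&amp;#039;fakesink&amp;#039;&amp;#039;, so no camera is required):&lt;br /&gt;
 gst-launch videotestsrc num-buffers=100 ! tee name=splitter ! queue ! fakesink splitter. ! queue ! fakesink&lt;br /&gt;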
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Encoding&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
We use two encodings here: MJPEG and AVC/H.264. Bitrate is given in kbps, framerate in fps. The encoding is done by the i.MX53 Video Processing Unit (VPU) and therefore does not consume ARM CPU cycles.&lt;br /&gt;
 mfw_vpuencoder codec-type=std_mjpg&lt;br /&gt;
 mfw_vpuencoder codec-type=std_avc bitrate=2000 framerate=25&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Frame rate throttling and filter caps&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
In the example, the encoded MJPEG video stream is throttled to a frame rate of 0.5 fps, i.e. one frame every 2 seconds. In the Argos&amp;amp;reg;&amp;lt;sup&amp;gt;2D&amp;lt;/sup&amp;gt; A100 demo application, this is used for the still capture embedded in the web site.&lt;br /&gt;
&lt;br /&gt;
Throttling is easily done with the &amp;#039;&amp;#039;videorate&amp;#039;&amp;#039; plug-in. To let it know the input and output frame specification, we use GStreamer &amp;quot;filter caps&amp;quot;. Capabilities (short: caps) describe the type of data that is streamed between two pads. They have a MIME type, width, height, and so on.&lt;br /&gt;
&lt;br /&gt;
Our input caps are specified by the encoder output, and we only adjust &amp;#039;&amp;#039;framerate&amp;#039;&amp;#039;.&lt;br /&gt;
 ! image/jpeg,height=600,width=800,framerate=25/1 \     &amp;#039;&amp;#039;&amp;lt;--Plug-in&amp;#039;s sink caps&amp;#039;&amp;#039;&lt;br /&gt;
 ! videorate skip-to-first=true \                       &amp;#039;&amp;#039;&amp;lt;--Plug-in&amp;#039;&amp;#039;&lt;br /&gt;
 ! image/jpeg,height=600,width=800,framerate=1/2 \      &amp;#039;&amp;#039;&amp;lt;--Plug-in&amp;#039;s source caps&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;File sinks&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
As the name suggests, file sinks write content they receive to a file. They are typically at the end of a pipeline. The file name is given in the &amp;#039;&amp;#039;location&amp;#039;&amp;#039; parameter.&lt;br /&gt;
&lt;br /&gt;
In the example, two different file sinks are used. The first one, &amp;#039;&amp;#039;multifilesink&amp;#039;&amp;#039;, writes each frame into a separate file. If the given file name contains a decimal field, files are automatically numbered, e.g. &amp;#039;&amp;#039;frame%04d.jpg&amp;#039;&amp;#039; as file name will create the files &amp;#039;&amp;#039;frame0001.jpg&amp;#039;&amp;#039;, &amp;#039;&amp;#039;frame0002.jpg&amp;#039;&amp;#039;, and so on. In the example, no decimal field is used, which means the file is overwritten with each frame:&lt;br /&gt;
 multifilesink location=/tmp/ecam-singleshot.jpg&lt;br /&gt;
The second one, &amp;#039;&amp;#039;filesink&amp;#039;&amp;#039;, is the more common variant. It writes all data to one file:&lt;br /&gt;
 filesink location=/tmp/fifo-h264&lt;br /&gt;
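&lt;br /&gt;
If, in contrast to the example, you wanted to keep every frame, a numbered file name could be used with &amp;#039;&amp;#039;multifilesink&amp;#039;&amp;#039; (an illustrative fragment, not part of the demo application):&lt;br /&gt;
 ... ! multifilesink location=/tmp/frame%04d.jpg&lt;br /&gt;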
&lt;br /&gt;
===Pipeline example 2 (C code)===&lt;br /&gt;
The following example is written in C and is compiled with the default cross toolchain.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;What it does&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
*Creates a (recording) thread&lt;br /&gt;
*The thread constructs an MP4 recording pipeline&lt;br /&gt;
*Waits one minute&lt;br /&gt;
*Quits the GStreamer pipeline&lt;br /&gt;
*Waits for the thread to finish&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;C Code&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Includes&lt;br /&gt;
 #include &amp;lt;gst/gst.h&amp;gt;&lt;br /&gt;
 #include &amp;lt;glib.h&amp;gt;&lt;br /&gt;
 #include &amp;lt;pthread.h&amp;gt;&lt;br /&gt;
 #include &amp;lt;unistd.h&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The GStreamer pipeline element&lt;br /&gt;
 GstElement *g_pipeline;&lt;br /&gt;
Thread stuff&lt;br /&gt;
 pthread_t g_gstThreadId = -1;&lt;br /&gt;
&lt;br /&gt;
Message handler of the GStreamer pipeline&amp;#039;s bus&lt;br /&gt;
 static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)&lt;br /&gt;
 {&lt;br /&gt;
 	GMainLoop *loop = (GMainLoop *) data;&lt;br /&gt;
 	&lt;br /&gt;
 	switch(GST_MESSAGE_TYPE(msg)) {&lt;br /&gt;
&lt;br /&gt;
EOS signal quits GStreamer&lt;br /&gt;
 	case GST_MESSAGE_EOS:&lt;br /&gt;
 		g_main_loop_quit (loop);&lt;br /&gt;
 		break;&lt;br /&gt;
 		&lt;br /&gt;
 	case GST_MESSAGE_ERROR: {&lt;br /&gt;
 		gchar  *debug;&lt;br /&gt;
 		GError *error;&lt;br /&gt;
 		&lt;br /&gt;
 		gst_message_parse_error (msg, &amp;amp;error, &amp;amp;debug);&lt;br /&gt;
 		g_free (debug);&lt;br /&gt;
 		&lt;br /&gt;
 		fprintf (stderr, &amp;quot;ERROR: %s\n&amp;quot;, error-&amp;gt;message);&lt;br /&gt;
 		g_error_free (error);&lt;br /&gt;
 		&lt;br /&gt;
 		g_main_loop_quit (loop);&lt;br /&gt;
 		break;&lt;br /&gt;
 	}&lt;br /&gt;
 	default:&lt;br /&gt;
 		break;&lt;br /&gt;
 	}&lt;br /&gt;
 	&lt;br /&gt;
 	return TRUE;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Thread function&lt;br /&gt;
 static int demo_thread(void)&lt;br /&gt;
 {&lt;br /&gt;
 	GMainLoop *loop;&lt;br /&gt;
 	&lt;br /&gt;
 	GstElement *source, *encoder, *filter, *muxer, *sink;&lt;br /&gt;
 	GstCaps *filtercaps;&lt;br /&gt;
 	GstBus *bus;&lt;br /&gt;
 	&lt;br /&gt;
 	loop = g_main_loop_new (NULL, FALSE);&lt;br /&gt;
&lt;br /&gt;
Create gstreamer elements&lt;br /&gt;
 	g_pipeline = gst_pipeline_new (&amp;quot;demo-pipeline&amp;quot;);&lt;br /&gt;
 	source = gst_element_factory_make (&amp;quot;mfw_v4lsrc&amp;quot;, &amp;quot;camera-source&amp;quot;);&lt;br /&gt;
 	encoder = gst_element_factory_make (&amp;quot;mfw_vpuencoder&amp;quot;, &amp;quot;mpeg4-encoder&amp;quot;);&lt;br /&gt;
 	filter = gst_element_factory_make (&amp;quot;capsfilter&amp;quot;, &amp;quot;filter&amp;quot;);&lt;br /&gt;
 	muxer = gst_element_factory_make (&amp;quot;mp4mux&amp;quot;, &amp;quot;mp4-muxer&amp;quot;);&lt;br /&gt;
 	sink = gst_element_factory_make (&amp;quot;filesink&amp;quot;, &amp;quot;file-sink&amp;quot;);&lt;br /&gt;
 	&lt;br /&gt;
 	if (!g_pipeline || !source || !encoder || !filter || !muxer || !sink) {&lt;br /&gt;
 		fprintf(stderr, &amp;quot;ERROR: One GST element could not be created.\n&amp;quot;);&lt;br /&gt;
 		return -1;&lt;br /&gt;
 	}&lt;br /&gt;
&lt;br /&gt;
Set source properties: Width, height, frames per second&lt;br /&gt;
 	g_object_set (G_OBJECT(source),&lt;br /&gt;
 		      &amp;quot;fps-n&amp;quot;, 25, /* fps */&lt;br /&gt;
 		      &amp;quot;capture-width&amp;quot;, 800,&lt;br /&gt;
 		      &amp;quot;capture-height&amp;quot;, 600,&lt;br /&gt;
 		      NULL);&lt;br /&gt;
Set encoder properties: Codec type, Quantization parameter&lt;br /&gt;
 	g_object_set (G_OBJECT(encoder),&lt;br /&gt;
 		      &amp;quot;codec-type&amp;quot;, 0, /* codec type mpeg4 */&lt;br /&gt;
 		      &amp;quot;qp&amp;quot;, 5, /* quantization parameter --&amp;gt; quality! */&lt;br /&gt;
 		      NULL);&lt;br /&gt;
Set sink property: Output file path&lt;br /&gt;
 	g_object_set (G_OBJECT(sink),&lt;br /&gt;
 		      &amp;quot;location&amp;quot;, &amp;quot;/tmp/gst-demo-output.mp4&amp;quot;, /* output file path */&lt;br /&gt;
 		      NULL);&lt;br /&gt;
&lt;br /&gt;
Install the message handler&lt;br /&gt;
 	bus = gst_pipeline_get_bus (GST_PIPELINE (g_pipeline));&lt;br /&gt;
 	gst_bus_add_watch (bus, bus_call, loop);&lt;br /&gt;
 	gst_object_unref (bus);&lt;br /&gt;
&lt;br /&gt;
Add all elements into the pipeline&lt;br /&gt;
 	gst_bin_add_many (GST_BIN (g_pipeline),&lt;br /&gt;
 			  source, encoder, filter, muxer, sink, NULL);&lt;br /&gt;
&lt;br /&gt;
Link the elements together&lt;br /&gt;
 	gst_element_link_many (source, encoder, filter, muxer, sink, NULL);&lt;br /&gt;
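:&amp;#039;&amp;#039;Note: gst_element_link_many() returns FALSE if the elements could not be linked; robust code should check this (a sketch, not part of the original example):&amp;#039;&amp;#039;&lt;br /&gt;
 	if (!gst_element_link_many (source, encoder, filter, muxer, sink, NULL)) {&lt;br /&gt;
 		fprintf(stderr, &amp;quot;ERROR: GST elements could not be linked.\n&amp;quot;);&lt;br /&gt;
 		return -1;&lt;br /&gt;
 	}&lt;br /&gt;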
&lt;br /&gt;
Set filtercaps for the filter&lt;br /&gt;
 	filtercaps = gst_caps_new_simple (&amp;quot;video/mpeg&amp;quot;,&lt;br /&gt;
 					  &amp;quot;height&amp;quot;, G_TYPE_INT, 600,&lt;br /&gt;
 					  &amp;quot;width&amp;quot;, G_TYPE_INT, 800,&lt;br /&gt;
 					  &amp;quot;framerate&amp;quot;, GST_TYPE_FRACTION, 25, 1,&lt;br /&gt;
 					  &amp;quot;mpegversion&amp;quot;, G_TYPE_INT, 4,&lt;br /&gt;
 					  NULL);&lt;br /&gt;
 	g_object_set(G_OBJECT(filter), &amp;quot;caps&amp;quot;, filtercaps, NULL);&lt;br /&gt;
&lt;br /&gt;
Set the pipeline to &amp;quot;playing&amp;quot; state&lt;br /&gt;
 	gst_element_set_state (g_pipeline, GST_STATE_PLAYING);&lt;br /&gt;
&lt;br /&gt;
Iterate - the main loop will not return until the pipeline receives an EOS event&lt;br /&gt;
 	g_main_loop_run (loop);&lt;br /&gt;
&lt;br /&gt;
Out of the main loop, clean up nicely&lt;br /&gt;
 	gst_element_set_state (g_pipeline, GST_STATE_NULL);&lt;br /&gt;
 	gst_object_unref (GST_OBJECT (g_pipeline));&lt;br /&gt;
 &lt;br /&gt;
 	return 0;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
Application&amp;#039;s main function&lt;br /&gt;
 int main(int argc, char **argv)&lt;br /&gt;
 {&lt;br /&gt;
 	gst_init(NULL, NULL);&lt;br /&gt;
Start thread with GStreamer stuff&lt;br /&gt;
 	pthread_create(&amp;amp;g_gstThreadId,&lt;br /&gt;
 		       NULL,&lt;br /&gt;
 		       (void *)&amp;amp;demo_thread,&lt;br /&gt;
 		       (void *)NULL);&lt;br /&gt;
&lt;br /&gt;
Record for 60 seconds&lt;br /&gt;
 	sleep(60);&lt;br /&gt;
&lt;br /&gt;
Send EOS to pipeline&lt;br /&gt;
 	gst_element_send_event(g_pipeline, gst_event_new_eos());&lt;br /&gt;
&lt;br /&gt;
As soon as the EOS event has passed through the pipeline, the thread will quit. Here we wait for it&lt;br /&gt;
 	pthread_join(g_gstThreadId, NULL);&lt;br /&gt;
 	g_gstThreadId = -1;&lt;br /&gt;
 &lt;br /&gt;
 	return 0;&lt;br /&gt;
 }&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Building&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prepare environment for cross compilation&lt;br /&gt;
 export PATH=$PATH:/opt/freescale/usr/local/gcc-4.4.4-glibc-2.11.1-multilib-1.0/arm-fsl-linux-gnueabi/bin&lt;br /&gt;
 export ARCH=arm&lt;br /&gt;
 export CROSS_COMPILE=arm-none-linux-gnueabi-&lt;br /&gt;
Build command line&lt;br /&gt;
 arm-fsl-linux-gnueabi-gcc gstreamer.c \&lt;br /&gt;
...Compiler flags&lt;br /&gt;
 -Wall -Wstrict-prototypes -Wno-trigraphs -O2 -fno-strict-aliasing -fno-common \&lt;br /&gt;
...Compiler include paths&lt;br /&gt;
  -I &amp;lt;ltib&amp;gt;/rootfs/usr/include/gstreamer-0.10/ \&lt;br /&gt;
  -I &amp;lt;ltib&amp;gt;/rootfs/usr/include/glib-2.0/ \&lt;br /&gt;
  -I &amp;lt;ltib&amp;gt;/rootfs/usr/lib/glib-2.0/include/ \&lt;br /&gt;
  -I &amp;lt;ltib&amp;gt;/rootfs/usr/include/libxml2/ \&lt;br /&gt;
...Linker library paths and libraries&lt;br /&gt;
  -L &amp;lt;ltib&amp;gt;/rootfs/usr/lib/ \&lt;br /&gt;
  -lgstreamer-0.10 \&lt;br /&gt;
  -lgobject-2.0 \&lt;br /&gt;
  -lgthread-2.0 \&lt;br /&gt;
  -lgmodule-2.0 \&lt;br /&gt;
  -lglib-2.0 \&lt;br /&gt;
  -lxml2 \&lt;br /&gt;
  -lz&lt;br /&gt;
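&lt;br /&gt;
Alternatively, if &amp;#039;&amp;#039;pkg-config&amp;#039;&amp;#039; is pointed at the target root file system, the include paths and libraries can be generated automatically. The variables below are standard pkg-config mechanisms, but the exact setup depends on your installation; treat this as a sketch:&lt;br /&gt;
 export PKG_CONFIG_SYSROOT_DIR=&amp;lt;ltib&amp;gt;/rootfs&lt;br /&gt;
 export PKG_CONFIG_LIBDIR=&amp;lt;ltib&amp;gt;/rootfs/usr/lib/pkgconfig&lt;br /&gt;
 arm-fsl-linux-gnueabi-gcc gstreamer.c $(pkg-config --cflags --libs gstreamer-0.10)&lt;br /&gt;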
&lt;br /&gt;
==V4L (Video4Linux) Programming Guide==&lt;br /&gt;
To get familiar with the V4L interface, we use the unit tests provided by Freescale. The unit test package is shipped automatically with the LTIB Linux BSP and installed on the target by default.&lt;br /&gt;
&lt;br /&gt;
===Overview===&lt;br /&gt;
&lt;br /&gt;
First we want to unpack the source code. Go to your LTIB directory and execute the following command.&lt;br /&gt;
 ./ltib -m prep -p imx-test&lt;br /&gt;
Afterwards, go to &lt;br /&gt;
 &amp;lt;ltib&amp;gt;/rpm/BUILD/imx-test-11.05.01/test/mxc_v4l2_test/&lt;br /&gt;
where you will find the following C files which are of interest:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Source File !! Target Executable !! Description !! Supported sensors&lt;br /&gt;
|-&lt;br /&gt;
| mxc_v4l2_capture.c || /unit_tests/mxc_v4l2_capture.out || Captures a video stream from camera and writes it to a file. Various YUV and RGB color modes are supported. Color modes &amp;quot;GREY&amp;quot; and &amp;quot;Y16 &amp;quot; were added for raw image sensors to capture 8/10bpp pixel data. It is explained in more detail later on. || Raw and YUV sensors&lt;br /&gt;
|-&lt;br /&gt;
|mxc_v4l2_output.c || /unit_tests/mxc_v4l2_output.out || Displays a YUV video stream on the display (on the background or overlay frame buffer) || YUV sensors only&lt;br /&gt;
|-&lt;br /&gt;
|mxc_v4l2_overlay.c || /unit_tests/mxc_v4l2_overlay.out || Live view from camera to display (on the background or overlay frame buffer). The &amp;#039;&amp;#039;liveview&amp;#039;&amp;#039; application of the Argos&amp;amp;reg;&amp;lt;sup&amp;gt;2D&amp;lt;/sup&amp;gt; A100 demo for ISM-MT9M131 is based on it. || YUV sensors only&lt;br /&gt;
|-&lt;br /&gt;
|mxc_v4l2_still.c || /unit_tests/mxc_v4l2_still.out || Still YUV image capture via V4L read() function || YUV sensors only&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
NOTE: Execute any of these with the &amp;#039;&amp;#039;-help&amp;#039;&amp;#039; parameter to see the command usage.&lt;br /&gt;
&lt;br /&gt;
===mxc_v4l2_capture.c===&lt;br /&gt;
&lt;br /&gt;
We will illustrate parts of the code here for capturing camera video data. The following code is simplified for readability! As you will see, accessing the V4L device is all about I/O controls (IOCTLs):&lt;br /&gt;
&lt;br /&gt;
Open V4L device file&lt;br /&gt;
 char v4l_device[100] = &amp;quot;/dev/video0&amp;quot;;&lt;br /&gt;
 fd_v4l = open(v4l_device, O_RDWR, 0);&lt;br /&gt;
Get sensor frame size&lt;br /&gt;
 struct v4l2_frmsizeenum fsize;&lt;br /&gt;
 int g_capture_mode = 0;&lt;br /&gt;
 fsize.index = g_capture_mode;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_ENUM_FRAMESIZES, &amp;amp;fsize);&lt;br /&gt;
 printf(&amp;quot;sensor frame size is %dx%d\n&amp;quot;, fsize.discrete.width,&lt;br /&gt;
                                        fsize.discrete.height);&lt;br /&gt;
Set stream parameters&lt;br /&gt;
:&amp;#039;&amp;#039;g_camera_framerate is set with the -fr parameter&amp;#039;&amp;#039;&lt;br /&gt;
 struct v4l2_streamparm parm;&lt;br /&gt;
 parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 parm.parm.capture.timeperframe.numerator = 1;&lt;br /&gt;
 parm.parm.capture.timeperframe.denominator = g_camera_framerate;&lt;br /&gt;
 parm.parm.capture.capturemode = g_capture_mode;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_S_PARM, &amp;amp;parm);&lt;br /&gt;
Set input mode&lt;br /&gt;
:&amp;#039;&amp;#039;0...CSI--&amp;gt;PRP_ENC--&amp;gt;MEM; allows CSC and rotation in IPU hardware; only suitable for YUV sensors&amp;#039;&amp;#039;&lt;br /&gt;
:&amp;#039;&amp;#039;1...CSI--&amp;gt;MEM; capture directly to memory; required for raw sensors&amp;#039;&amp;#039;&lt;br /&gt;
 int g_input = 0;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_S_INPUT, &amp;amp;g_input);&lt;br /&gt;
Set cropping of the sensor input image&lt;br /&gt;
 struct v4l2_crop crop;&lt;br /&gt;
 crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 crop.c.width = g_in_width;&lt;br /&gt;
 crop.c.height = g_in_height;&lt;br /&gt;
 crop.c.top = g_top;&lt;br /&gt;
 crop.c.left = g_left;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_S_CROP, &amp;amp;crop);&lt;br /&gt;
Set data format&lt;br /&gt;
:&amp;#039;&amp;#039;For raw sensors, use either V4L2_PIX_FMT_GREY or V4L2_PIX_FMT_Y16 for g_cap_fmt.&amp;#039;&amp;#039;&lt;br /&gt;
 struct v4l2_format fmt;&lt;br /&gt;
 fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 fmt.fmt.pix.pixelformat = g_cap_fmt;&lt;br /&gt;
 fmt.fmt.pix.width = g_out_width;&lt;br /&gt;
 fmt.fmt.pix.height = g_out_height;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_S_FMT, &amp;amp;fmt);&lt;br /&gt;
Get data format&lt;br /&gt;
 fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_G_FMT, &amp;amp;fmt);&lt;br /&gt;
 &lt;br /&gt;
 printf(&amp;quot;\t Width = %d&amp;quot;, fmt.fmt.pix.width);&lt;br /&gt;
 printf(&amp;quot;\t Height = %d&amp;quot;, fmt.fmt.pix.height);&lt;br /&gt;
 printf(&amp;quot;\t Image size = %d\n&amp;quot;, fmt.fmt.pix.sizeimage);&lt;br /&gt;
 printf(&amp;quot;\t pixelformat = %d\n&amp;quot;, fmt.fmt.pix.pixelformat);&lt;br /&gt;
Set rotation&lt;br /&gt;
:&amp;#039;&amp;#039;Not available for raw sensors&amp;#039;&amp;#039;&lt;br /&gt;
 struct v4l2_control ctrl;&lt;br /&gt;
 ctrl.id = V4L2_CID_PRIVATE_BASE + 0;&lt;br /&gt;
 ctrl.value = g_rotate;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_S_CTRL, &amp;amp;ctrl);&lt;br /&gt;
:&amp;#039;&amp;#039;For possible values, here is an excerpt from ltib/rpm/BUILD/linux/include/linux/mxc_v4l2.h:&amp;#039;&amp;#039;&lt;br /&gt;
 #define V4L2_MXC_ROTATE_NONE			0&lt;br /&gt;
 #define V4L2_MXC_ROTATE_VERT_FLIP		1&lt;br /&gt;
 #define V4L2_MXC_ROTATE_HORIZ_FLIP		2&lt;br /&gt;
 #define V4L2_MXC_ROTATE_180			3&lt;br /&gt;
 #define V4L2_MXC_ROTATE_90_RIGHT		4&lt;br /&gt;
 #define V4L2_MXC_ROTATE_90_RIGHT_VFLIP		5&lt;br /&gt;
 #define V4L2_MXC_ROTATE_90_RIGHT_HFLIP		6&lt;br /&gt;
 #define V4L2_MXC_ROTATE_90_LEFT			7&lt;br /&gt;
Initiate memory mapping of buffers&lt;br /&gt;
 #define TEST_BUFFER_NUM 3&lt;br /&gt;
 &lt;br /&gt;
 struct v4l2_requestbuffers req;&lt;br /&gt;
 memset(&amp;amp;req, 0, sizeof (req));&lt;br /&gt;
 req.count = TEST_BUFFER_NUM;&lt;br /&gt;
 req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 req.memory = V4L2_MEMORY_MMAP;&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_REQBUFS, &amp;amp;req);&lt;br /&gt;
Query the buffer status for memory mapping&lt;br /&gt;
 struct v4l2_buffer buf;&lt;br /&gt;
 &lt;br /&gt;
 memset(&amp;amp;buf, 0, sizeof (buf));&lt;br /&gt;
 buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 buf.index = 0; // buffer number 0..x&lt;br /&gt;
 &lt;br /&gt;
 ioctl(fd_v4l, VIDIOC_QUERYBUF, &amp;amp;buf);&lt;br /&gt;
 &lt;br /&gt;
 buffers[i].length = buf.length;&lt;br /&gt;
 buffers[i].offset = (size_t) buf.m.offset;&lt;br /&gt;
Enqueue an empty buffer for capturing&lt;br /&gt;
 memset(&amp;amp;buf, 0, sizeof (buf));&lt;br /&gt;
 buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 buf.memory = V4L2_MEMORY_MMAP;&lt;br /&gt;
 buf.index = 0; // buffer number 0..x&lt;br /&gt;
 buf.m.offset = buffers[i].offset;&lt;br /&gt;
 &lt;br /&gt;
 ioctl (fd_v4l, VIDIOC_QBUF, &amp;amp;buf);&lt;br /&gt;
Start streaming (capturing)&lt;br /&gt;
 enum v4l2_buf_type type;&lt;br /&gt;
 type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 &lt;br /&gt;
 ioctl (fd_v4l, VIDIOC_STREAMON, &amp;amp;type);&lt;br /&gt;
Dequeue a filled buffer&lt;br /&gt;
 struct v4l2_buffer buf;&lt;br /&gt;
 memset(&amp;amp;buf, 0, sizeof (buf));&lt;br /&gt;
 buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 buf.memory = V4L2_MEMORY_MMAP;&lt;br /&gt;
 &lt;br /&gt;
 ioctl (fd_v4l, VIDIOC_DQBUF, &amp;amp;buf);&lt;br /&gt;
 &lt;br /&gt;
 fwrite(buffers[buf.index].start, fmt.fmt.pix.sizeimage, 1, fd_y_file);&lt;br /&gt;
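:&amp;#039;&amp;#039;Note: VIDIOC_DQBUF blocks until a buffer has been filled. If you do not want to block indefinitely, you can wait with select() before dequeuing (a sketch, not part of the unit test):&amp;#039;&amp;#039;&lt;br /&gt;
 fd_set fds;&lt;br /&gt;
 struct timeval tv = { .tv_sec = 2, .tv_usec = 0 }; /* timeout */&lt;br /&gt;
 &lt;br /&gt;
 FD_ZERO(&amp;amp;fds);&lt;br /&gt;
 FD_SET(fd_v4l, &amp;amp;fds);&lt;br /&gt;
 &lt;br /&gt;
 if (select(fd_v4l + 1, &amp;amp;fds, NULL, NULL, &amp;amp;tv) &amp;gt; 0)&lt;br /&gt;
 	ioctl(fd_v4l, VIDIOC_DQBUF, &amp;amp;buf);&lt;br /&gt;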
Stop streaming (capturing)&lt;br /&gt;
 enum v4l2_buf_type type;&lt;br /&gt;
 type = V4L2_BUF_TYPE_VIDEO_CAPTURE;&lt;br /&gt;
 &lt;br /&gt;
 ioctl (fd_v4l, VIDIOC_STREAMOFF, &amp;amp;type);&lt;br /&gt;
Close V4L device file&lt;br /&gt;
 close(fd_v4l);&lt;br /&gt;
&lt;br /&gt;
==Video Processing Unit (VPU)==&lt;br /&gt;
&lt;br /&gt;
A convenient way to use the VPU for encoding or decoding video streams is GStreamer (&amp;#039;&amp;#039;mfw_vpuencoder&amp;#039;&amp;#039;, &amp;#039;&amp;#039;mfw_vpudecoder&amp;#039;&amp;#039; plug-ins).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
If you don&amp;#039;t want to use GStreamer, you may start by looking at the VPU unit test application delivered by Freescale. It is contained in the &amp;#039;&amp;#039;imx-test&amp;#039;&amp;#039; package.&lt;br /&gt;
&lt;br /&gt;
Here is how to unpack it using LTIB:&lt;br /&gt;
:&amp;lt;pre&amp;gt;./ltib -m prep -p imx-test&amp;lt;/pre&amp;gt;&lt;br /&gt;
Thereafter, you will find the VPU example source code at&lt;br /&gt;
:&amp;lt;pre&amp;gt;ltib/rpm/BUILD/imx-test-11.05.01/test/mxc_vpu_test/&amp;lt;/pre&amp;gt;&lt;br /&gt;
On your target, the unit test is installed as&lt;br /&gt;
:&amp;lt;pre&amp;gt;/unit_tests/mxc_vpu_test.out&amp;lt;/pre&amp;gt;&lt;br /&gt;
To get the command usage, type&lt;br /&gt;
:&amp;lt;pre&amp;gt;/unit_tests/mxc_vpu_test.out -help&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Further reading==&lt;br /&gt;
&lt;br /&gt;
===Freescale documents===&lt;br /&gt;
&lt;br /&gt;
Download here: https://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=i.MX535&amp;amp;nodeId=018rH3ZrDR988D&amp;amp;fpsp=1&amp;amp;tab=Documentation_Tab&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
i.MX Linux Multimedia Framework - User&amp;#039;s Guide&lt;br /&gt;
:Download item: IMX53_1105_LINUX_MMDOCS &lt;br /&gt;
:Location: {{mp|Linux_Multimedia_Framework_Docs_2.0.1.tar.gz|Linux_Multimedia_Framework_Docs_MX53Ubuntu_2.0.1|docs|Linux_Multimedia_Framework_User_Guide.pdf}}&lt;br /&gt;
&lt;br /&gt;
i.MX5x VPU Application Programming Interface Linux - Reference Manual&lt;br /&gt;
:Download item: IMX53_1105_LINUXDOCS_BUNDLE&lt;br /&gt;
:Location: {{mp|L2.6.35_11.05.01_ER_docs.tar.gz|L2.6.35_11.05.01_ER_docs|doc|mx5|i.MX5x_Linux_VPU_API.pdf}}&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
GStreamer Hello world program&lt;br /&gt;
:https://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-helloworld.html&lt;br /&gt;
&lt;br /&gt;
GStreamer Application Development Manual&lt;br /&gt;
:https://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/index.html&lt;br /&gt;
&lt;br /&gt;
V4L2 API specification&lt;br /&gt;
&lt;br /&gt;
:https://v4l2spec.bytesex.org/&lt;br /&gt;
&lt;br /&gt;
===Source code===&lt;br /&gt;
&lt;br /&gt;
GStreamer development header files&lt;br /&gt;
:&amp;lt;pre&amp;gt;ltib/rootfs/usr/include/gstreamer-0.10/&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
V4L2 Linux Kernel header files&lt;br /&gt;
:V4L2 main header file (V4L2_* enums, color formats, etc.)&lt;br /&gt;
:&amp;lt;pre&amp;gt;ltib/rpm/BUILD/linux/include/linux/videodev2.h&amp;lt;/pre&amp;gt;&lt;br /&gt;
:i.MX-specific V4L2 stuff&lt;br /&gt;
:&amp;lt;pre&amp;gt;ltib/rpm/BUILD/linux/include/linux/mxc_v4l2.h&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Unpack Freescale GStreamer plug-in&lt;br /&gt;
:&amp;lt;pre&amp;gt;./ltib -m prep -p gst-fsl-plugin&amp;lt;/pre&amp;gt;&lt;br /&gt;
:Source code is in&lt;br /&gt;
:&amp;lt;pre&amp;gt;rpm/BUILD/gst-fsl-plugin-2.0.1/src/&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Unpack &amp;#039;&amp;#039;ismconv&amp;#039;&amp;#039; plug-in&lt;br /&gt;
:&amp;lt;pre&amp;gt;./ltib -m prep -p gst-plugins-bad&amp;lt;/pre&amp;gt;&lt;br /&gt;
:Source code is in&lt;br /&gt;
:&amp;lt;pre&amp;gt;rpm/BUILD/gst-plugins-bad-0.10.11/gst/bluetechnix/&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Argos2D-A100]]&lt;br /&gt;
[[Category:i.MX53]]&lt;/div&gt;</summary>
		<author><name>en&gt;Peter</name></author>
	</entry>
</feed>