Argos®2D A100 Programmer's Guide
Overview of ISMs (Image Sensor Modules)
ISM-MT9M131
- See the Linux Software User Manual to start with this camera module.
- Image data format: YUV (UYVY), 16 bpp
- GStreamer capture plug-in special parameters:
gst-launch mfw_v4lsrc
ISM-MT9M025 Monochrome
- See the Linux Software User Manual to start with this camera module.
- Image data format: Grayscale, up to 10 bpp (default: 8bpp)
- GStreamer capture plug-in special parameters:
gst-launch mfw_v4lsrc color-mode=1 capture-mode=5
- The GStreamer capture plug-in can only handle 8bpp.
ISM-MT9M025 Color
NOTE: Recommended for experienced customers only. The BSP does not contain software for white balancing, color correction, etc.
- See the Linux Software User Manual to start with this camera module.
- Image data format: RGB, Bayer filter array (GRBG), up to 10 bpp (default: 8bpp)
- GStreamer capture plug-in special parameters:
gst-launch mfw_v4lsrc color-mode=2 capture-mode=5
- The GStreamer capture plug-in can only handle 8bpp.
ISM-MT9P031 Monochrome
- See the Linux Software User Manual to start with this camera module.
- Image data format: Grayscale, up to 10 bpp (default: 8bpp)
- GStreamer capture plug-in special parameters:
gst-launch mfw_v4lsrc color-mode=1 capture-mode=5
- The GStreamer capture plug-in can only handle 8bpp.
ISM-MT9P031 Color
NOTE: Recommended for experienced customers only. The BSP does not contain software for white balancing, color correction, etc.
- See the Linux Software User Manual to start with this camera module.
- Image data format: RGB, Bayer filter array (Normal mode: RGGB; Rotate mode: BGGR), up to 10 bpp (default: 8bpp)
- GStreamer capture plug-in special parameters (normal mode):
gst-launch mfw_v4lsrc color-mode=3 capture-mode=5
- GStreamer capture plug-in special parameters (rotate mode):
gst-launch mfw_v4lsrc color-mode=4 capture-mode=5
- The GStreamer capture plug-in can only handle 8bpp.
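The lines above configure only the capture element. A complete pipeline also needs a sink and, for the raw (grayscale/Bayer) sensors, typically the ismconv converter described in the plug-in list below. Here is a minimal sketch for a monochrome ISM that writes the converted RGB frames to a file (the output path is just an example):
gst-launch mfw_v4lsrc color-mode=1 capture-mode=5 ! ismconv ! filesink location=/tmp/capture.rgb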
GStreamer Programming Guide
GStreamer is a pipeline-based multimedia framework written in the C programming language with the type system based on GObject. GStreamer allows a programmer to create a variety of media-handling components, including simple audio playback, audio and video playback, recording, streaming and editing. The pipeline design serves as a base to create many types of multimedia applications such as video editors, streaming media broadcasters and media players. (https://en.wikipedia.org/wiki/GStreamer)
Plug-in list
The i.MX53 IPU and VPU are supported in GStreamer by means of plug-ins.
Name | Vendor | Package | Description | Important parameters |
---|---|---|---|---|
mfw_ipucsc | Freescale | gst-fsl-plugin | IPU Color Space Converter | |
mfw_v4lsink | Freescale | gst-fsl-plugin | V4L Sink - an overlay framebuffer for video playback or live view for output on a display | disp-width...width of the image to be displayed; disp-height...height of the image to be displayed; axis-top...top coordinate of the display origin; axis-left...left coordinate of the display origin |
mfw_v4lsrc | Freescale/Bluetechnix | gst-fsl-plugin | Video source plug-in - captures image data from ISM camera modules | capture-width...width of the image to be captured; capture-height...height of the image to be captured; fps-n...frame rate at which the input stream is captured; color-mode...sets the color format: 0...YUV (ISM-MT9M131); 1...8-bit grayscale (monochrome cameras); 2...Bayer GRBG (ISM-MT9M025); 3...Bayer RGGB (ISM-MT9P031); 4...Bayer BGGR (ISM-MT9P031 rotate); 5...Bayer GBRG |
mfw_vpuencoder | Freescale | gst-fsl-plugin | Hardware (VPU) encoder | codec-type...codec type for encoding: 0...MPEG4; 1...H.263; 2...AVC (H.264); 7...MJPEG; framerate...frame rate at which the input stream is encoded |
ismconv | Bluetechnix | gst-plugins-bad | Bayer/grayscale to RGB24/RGB565 converter for Bluetechnix ISMs (ARM NEON optimized) | |
To see all supported in/out formats and all options of a plug-in, type
gst-inspect <plugin>
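For example, to list the capabilities and properties of the capture plug-in:
gst-inspect mfw_v4lsrc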
Pipeline example 1 (Linux Shell)
The GStreamer API is available in various programming languages. The quickest way is to construct the pipeline from the Linux shell; for simple pipelines without much interaction, this is sufficient. The following example is similar to the streaming part of stream-mt9m131.sh in the Argos®2D A100 demo application.
What it does
- Capture YUV camera stream
- Duplicate stream
- Encode to MJPEG format
- Decrease frame rate
- Write single frame to file
- Encode to H.264 format
- Write video stream to file
Command
gst-launch -e mfw_v4lsrc fps-n=25 capture-width=800 capture-height=600 \
    ! tee name=splitter \
    ! queue \
    ! mfw_vpuencoder codec-type=std_mjpg \
    ! image/jpeg,height=600,width=800,framerate=25/1 \
    ! videorate skip-to-first=true \
    ! image/jpeg,height=600,width=800,framerate=1/2 \
    ! multifilesink location=/tmp/ecam-singleshot.jpg \
    splitter. \
    ! queue \
    ! mfw_vpuencoder codec-type=std_avc bitrate=2000 framerate=25 \
    ! video/x-h264,height=600,width=800,framerate=25/1 \
    ! filesink location=/tmp/fifo-h264 &
As you can see, the application that starts GStreamer is gst-launch. The -e flag ensures that when you press Ctrl+C to quit GStreamer, an EOS (End-of-Stream) event is sent down the pipeline and it is terminated properly (i.e., file contents are flushed and written completely).
Pipeline
A pipeline within the shell always looks like this:
<src element> ! <element 1> ! <element 2> ! ... ! <sink element>
i.e. single elements are concatenated with !. Each element may have arguments of the form name=value.
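For instance, a minimal sketch of a complete pipeline, assuming a display is attached (videotestsrc is part of the standard GStreamer base plug-ins; mfw_v4lsink is the display sink from the plug-in list above):
gst-launch videotestsrc ! mfw_v4lsink
We will now run through the larger example above.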
The basic outline of the above example is as follows:
Capture      --> Duplicate --+--> Queue   --> Encode           --> Throttling  --> Write single frames
(mfw_v4lsrc)     (tee)       |    (queue)     (mfw_vpuencoder)     (videorate)     (multifilesink)
                             |
                             +--> Queue   --> Encode           --> Write stream
                                  (queue)     (mfw_vpuencoder)     (filesink)
Capturing
mfw_v4lsrc fps-n=25 capture-width=800 capture-height=600
The capture plug-in is configured for a capture size of 800x600 pixels and 25 frames per second.
Duplicating the stream
Since we want two different encodings of the same stream, we have to duplicate the stream before each copy is fed into its own instance of the video encoder. The tee plug-in has the following syntax:
<source element> ! tee name=splitter ! <here is the pipeline for stream 1> splitter. ! <here is the pipeline for stream 2>
Encoding
We have two encodings here: MJPEG and AVC/H.264. The bitrate parameter is given in kbit/s, the framerate in frames per second. The encoding is done in the i.MX53 Video Processing Unit and therefore does not consume ARM CPU cycles.
mfw_vpuencoder codec-type=std_mjpg
mfw_vpuencoder codec-type=std_avc bitrate=2000 framerate=25
Frame rate throttling and filter caps
In the example, the encoded MJPEG video stream is throttled to a frame rate of 0.5 fps, i.e. one frame every 2 seconds. In the Argos®2D A100 demo application, this is used for the still capture embedded in the web site.
Throttling is easily done with the videorate plug-in. To tell it the input and output frame specification, we use GStreamer "filter caps". Capabilities (short: caps) describe the type of data that is streamed between two pads. They have a MIME type, width, height, and so on.
Our input caps are specified by the encoder output, and we only adjust framerate.
! image/jpeg,height=600,width=800,framerate=25/1 \   <-- plug-in's sink caps
! videorate skip-to-first=true \                     <-- plug-in
! image/jpeg,height=600,width=800,framerate=1/2 \    <-- plug-in's source caps
File sinks
As the name suggests, file sinks write content they receive to a file. They are typically at the end of a pipeline. The file name is given in the location parameter.
In the example, two different file sinks are used. The first one, multifilesink, writes each frame into a separate file. If the given file name contains a printf-style index field, files are automatically numbered: e.g. frame%04d.jpg as file name will create the files frame0000.jpg, frame0001.jpg, and so on. In the example, no index field is used, which means the file is overwritten with each frame:
multifilesink location=/tmp/ecam-singleshot.jpg
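If numbered frames are wanted instead, a variant with an index field might look like this (the file name is just an example):
multifilesink location=/tmp/frame%04d.jpg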
The second one, filesink, is the more common variant. It writes all data to one file:
filesink location=/tmp/fifo-h264
Pipeline example 2 (C code)
The following example is written in C and is compiled with the default cross toolchain.
What it does
- Creates a (recording) thread
- Thread constructs MP4 recording pipeline
- Waits a minute
- Quits GStreamer pipeline
- Waits for thread finish
C Code
Includes
#include <gst/gst.h>
#include <glib.h>
#include <pthread.h>
#include <unistd.h>
The GStreamer pipeline element
GstElement *g_pipeline;
Thread stuff
pthread_t g_gstThreadId = -1;
Message handler of the GStreamer pipeline's bus
static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)
{
    GMainLoop *loop = (GMainLoop *) data;

    switch(GST_MESSAGE_TYPE(msg)) {
EOS signal quits GStreamer
    case GST_MESSAGE_EOS:
        g_main_loop_quit (loop);
        break;

    case GST_MESSAGE_ERROR: {
        gchar *debug;
        GError *error;

        gst_message_parse_error (msg, &error, &debug);
        g_free (debug);

        fprintf (stderr, "ERROR: %s\n", error->message);
        g_error_free (error);

        g_main_loop_quit (loop);
        break;
    }

    default:
        break;
    }

    return TRUE;
}
Thread function
static void *demo_thread(void *arg)
{
    GMainLoop *loop;
    GstElement *source, *encoder, *filter, *muxer, *sink;
    GstCaps *filtercaps;
    GstBus *bus;

    loop = g_main_loop_new (NULL, FALSE);
Create GStreamer elements
    g_pipeline = gst_pipeline_new ("demo-pipeline");
    source  = gst_element_factory_make ("mfw_v4lsrc", "camera-source");
    encoder = gst_element_factory_make ("mfw_vpuencoder", "mpeg4-encoder");
    filter  = gst_element_factory_make ("capsfilter", "filter");
    muxer   = gst_element_factory_make ("mp4mux", "mp4-muxer");
    sink    = gst_element_factory_make ("filesink", "file-sink");

    if (!g_pipeline || !source || !encoder || !filter || !muxer || !sink) {
        fprintf(stderr, "ERROR: One GST element could not be created.\n");
        return NULL;
    }
Set source properties: Width, height, frames per second
    g_object_set (G_OBJECT(source),
                  "fps-n", 25,            /* frames per second */
                  "capture-width", 800,
                  "capture-height", 600,
                  NULL);
Set encoder properties: Codec type, Quantization parameter
    g_object_set (G_OBJECT(encoder),
                  "codec-type", 0,        /* codec type MPEG4 */
                  "qp", 5,                /* quantization parameter --> quality! */
                  NULL);
Set sink property: Output file path
    g_object_set (G_OBJECT(sink),
                  "location", "/tmp/gst-demo-output.mp4",  /* output file path */
                  NULL);
Install the message handler
    bus = gst_pipeline_get_bus (GST_PIPELINE (g_pipeline));
    gst_bus_add_watch (bus, bus_call, loop);
    gst_object_unref (bus);
Add all elements into the pipeline
    gst_bin_add_many (GST_BIN (g_pipeline), source, encoder, filter, muxer, sink, NULL);
Link the elements together
    gst_element_link_many (source, encoder, filter, muxer, sink, NULL);
Set filtercaps for the filter
    filtercaps = gst_caps_new_simple ("video/mpeg",
                                      "height", G_TYPE_INT, 600,
                                      "width", G_TYPE_INT, 800,
                                      "framerate", GST_TYPE_FRACTION, 25, 1,
                                      "mpegversion", G_TYPE_INT, 4,
                                      NULL);
    g_object_set(G_OBJECT(filter), "caps", filtercaps, NULL);
    gst_caps_unref (filtercaps);    /* the filter now holds its own reference */
Set the pipeline to "playing" state
    gst_element_set_state (g_pipeline, GST_STATE_PLAYING);
Iterate - the loop does not return before the pipeline has received an EOS event
    g_main_loop_run (loop);
Out of the main loop, clean up nicely
    gst_element_set_state (g_pipeline, GST_STATE_NULL);
    gst_object_unref (GST_OBJECT (g_pipeline));

    return NULL;
}
Application's main function
int main(int argc, char **argv)
{
    gst_init(NULL, NULL);
Start thread with GStreamer stuff
    pthread_create(&g_gstThreadId, NULL, demo_thread, NULL);
Record for 60 seconds
    sleep(60);
Send EOS to pipeline
    gst_element_send_event(g_pipeline, gst_event_new_eos());
As soon as the EOS has travelled through the pipeline, the thread quits. Here we wait for it
    pthread_join(g_gstThreadId, NULL);
    g_gstThreadId = -1;

    return 0;
}
Building
Prepare environment for cross compilation
export PATH=$PATH:/opt/freescale/usr/local/gcc-4.4.4-glibc-2.11.1-multilib-1.0/arm-fsl-linux-gnueabi/bin
export ARCH=arm
export CROSS_COMPILE=arm-none-linux-gnueabi-
Build command line
arm-fsl-linux-gnueabi-gcc gstreamer.c \
...Compiler flags
-Wall -Wstrict-prototypes -Wno-trigraphs -O2 -fno-strict-aliasing -fno-common \
...Compiler include paths
-I <ltib>/rootfs/usr/include/gstreamer-0.10/ \
-I <ltib>/rootfs/usr/include/glib-2.0/ \
-I <ltib>/rootfs/usr/lib/glib-2.0/include/ \
-I <ltib>/rootfs/usr/include/libxml2/ \
...Linker library paths and libraries
-L <ltib>/rootfs/usr/lib/ \
-lgstreamer-0.10 \
-lgobject-2.0 \
-lgthread-2.0 \
-lgmodule-2.0 \
-lglib-2.0 \
-lxml2 \
-lz
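If the pkg-config files from the LTIB rootfs are usable on your host, the include and library flags can also be generated instead of being spelled out by hand. This is a sketch only; it assumes that .pc files exist in <ltib>/rootfs/usr/lib/pkgconfig and that the paths inside them are valid from the host's point of view:
export PKG_CONFIG_PATH=<ltib>/rootfs/usr/lib/pkgconfig
arm-fsl-linux-gnueabi-gcc gstreamer.c -o gstreamer-demo \
    $(pkg-config --cflags --libs gstreamer-0.10)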
V4L (Video4Linux) Programming Guide
To get familiar with the V4L interface, we use the unit tests provided by Freescale. The unit test package is shipped automatically with the LTIB Linux BSP and installed on the target by default.
Overview
First we want to unpack the source code. Go to your LTIB directory and execute the following command.
./ltib -m prep -p imx-test
Afterwards, go to
<ltib>/rpm/BUILD/imx-test-11.05.01/test/mxc_v4l2_test/
where you will find the following C files which are of interest:
Source File | Target Executable | Description | Supported sensors |
---|---|---|---|
mxc_v4l2_capture.c | /unit_tests/mxc_v4l2_capture.out | Captures a video stream from the camera and writes it to a file. Various YUV and RGB color modes are supported. The color modes "GREY" and "Y16 " were added for raw image sensors to capture 8/10 bpp pixel data; this is explained in more detail below. | Raw and YUV sensors |
mxc_v4l2_output.c | /unit_tests/mxc_v4l2_output.out | Displays a YUV video stream on the display (on the background or overlay frame buffer) | YUV sensors only |
mxc_v4l2_overlay.c | /unit_tests/mxc_v4l2_overlay.out | Live view from camera to display (on the background or overlay frame buffer). The liveview application of the Argos®2D A100 demo for ISM-MT9M131 is based on it. | YUV sensors only |
mxc_v4l2_still.c | /unit_tests/mxc_v4l2_still.out | Still YUV image capture via V4L read() function | YUV sensors only |
NOTE: Execute any of these with the -help parameter to see the command usage.
mxc_v4l2_capture.c
We will illustrate the parts of the code that capture camera video data. Note that the following code is simplified for readability! As you will see, accessing the V4L device is mostly a matter of I/O controls (ioctls):
Open V4L device file
char v4l_device[100] = "/dev/video0";

fd_v4l = open(v4l_device, O_RDWR, 0);
Get sensor frame size
struct v4l2_frmsizeenum fsize;
int g_capture_mode = 0;

fsize.index = g_capture_mode;
ioctl(fd_v4l, VIDIOC_ENUM_FRAMESIZES, &fsize);

printf("sensor frame size is %dx%d\n", fsize.discrete.width, fsize.discrete.height);
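Each capture mode of the sensor driver corresponds to one index of this enumeration (cf. the capture-mode parameter of the GStreamer plug-in). A minimal sketch that lists all available frame sizes, following the style of the unit test (same fd_v4l as above):

struct v4l2_frmsizeenum fsize;

memset(&fsize, 0, sizeof(fsize));
/* iterate over the indices until the driver reports no more entries */
while (ioctl(fd_v4l, VIDIOC_ENUM_FRAMESIZES, &fsize) == 0) {
    printf("capture mode %d: %dx%d\n", fsize.index,
           fsize.discrete.width, fsize.discrete.height);
    fsize.index++;
}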
Set stream parameters
- g_camera_framerate is set with the -fr parameter
struct v4l2_streamparm parm;

parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
parm.parm.capture.timeperframe.numerator = 1;
parm.parm.capture.timeperframe.denominator = g_camera_framerate;
parm.parm.capture.capturemode = g_capture_mode;
ioctl(fd_v4l, VIDIOC_S_PARM, &parm);
Set input mode
- 0...CSI-->PRP_ENC-->MEM; allows CSC and rotation in IPU hardware; only suitable for YUV sensors
- 1...CSI-->MEM; capture directly to memory; required for raw sensors
int g_input = 0;

ioctl(fd_v4l, VIDIOC_S_INPUT, &g_input);
Set cropping of the sensor input image
struct v4l2_crop crop;

crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
crop.c.width = g_in_width;
crop.c.height = g_in_height;
crop.c.top = g_top;
crop.c.left = g_left;
ioctl(fd_v4l, VIDIOC_S_CROP, &crop);
Set data format
- For raw sensors, use either V4L2_PIX_FMT_GREY or V4L2_PIX_FMT_Y16 for g_cap_fmt.
struct v4l2_format fmt;

fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
fmt.fmt.pix.pixelformat = g_cap_fmt;
fmt.fmt.pix.width = g_out_width;
fmt.fmt.pix.height = g_out_height;
ioctl(fd_v4l, VIDIOC_S_FMT, &fmt);
Get data format
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
ioctl(fd_v4l, VIDIOC_G_FMT, &fmt);

printf("\t Width = %d", fmt.fmt.pix.width);
printf("\t Height = %d", fmt.fmt.pix.height);
printf("\t Image size = %d\n", fmt.fmt.pix.sizeimage);
printf("\t pixelformat = %d\n", fmt.fmt.pix.pixelformat);
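Note that pixelformat is a fourcc code, so printing it as a decimal number is hard to read. A small sketch that prints it as four characters instead:

/* decode the fourcc byte by byte, least significant byte first */
printf("\t pixelformat = %c%c%c%c\n",
        fmt.fmt.pix.pixelformat        & 0xff,
       (fmt.fmt.pix.pixelformat >> 8)  & 0xff,
       (fmt.fmt.pix.pixelformat >> 16) & 0xff,
       (fmt.fmt.pix.pixelformat >> 24) & 0xff);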
Set rotation
- Not available for raw sensors
struct v4l2_control ctrl;

ctrl.id = V4L2_CID_PRIVATE_BASE + 0;
ctrl.value = g_rotate;
ioctl(fd_v4l, VIDIOC_S_CTRL, &ctrl);
- For possible values, here is an excerpt from ltib/rpm/BUILD/linux/include/linux/mxc_v4l2.h:
#define V4L2_MXC_ROTATE_NONE            0
#define V4L2_MXC_ROTATE_VERT_FLIP       1
#define V4L2_MXC_ROTATE_HORIZ_FLIP      2
#define V4L2_MXC_ROTATE_180             3
#define V4L2_MXC_ROTATE_90_RIGHT        4
#define V4L2_MXC_ROTATE_90_RIGHT_VFLIP  5
#define V4L2_MXC_ROTATE_90_RIGHT_HFLIP  6
#define V4L2_MXC_ROTATE_90_LEFT         7
Initiate memory mapping of buffers
#define TEST_BUFFER_NUM 3

struct v4l2_requestbuffers req;

memset(&req, 0, sizeof (req));
req.count = TEST_BUFFER_NUM;
req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
req.memory = V4L2_MEMORY_MMAP;
ioctl(fd_v4l, VIDIOC_REQBUFS, &req);
Query the buffer status for memory mapping
struct v4l2_buffer buf;

memset(&buf, 0, sizeof (buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.index = i;    /* buffer number 0..TEST_BUFFER_NUM-1 */
ioctl(fd_v4l, VIDIOC_QUERYBUF, &buf);

buffers[i].length = buf.length;
buffers[i].offset = (size_t) buf.m.offset;
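The simplified excerpt stores offset and length but omits the actual mapping of the buffer into user space. The missing step looks roughly like this (buffers[] is the bookkeeping array used throughout; <sys/mman.h> is required):

/* map the driver buffer into the application's address space */
buffers[i].start = mmap(NULL, buffers[i].length,
                        PROT_READ | PROT_WRITE, MAP_SHARED,
                        fd_v4l, buffers[i].offset);
if (buffers[i].start == MAP_FAILED)
    perror("mmap");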
Enqueue an empty buffer for capturing
memset(&buf, 0, sizeof (buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_MMAP;
buf.index = i;    /* buffer number 0..TEST_BUFFER_NUM-1 */
buf.m.offset = buffers[i].offset;
ioctl (fd_v4l, VIDIOC_QBUF, &buf);
Start streaming (capturing)
enum v4l2_buf_type type;

type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
ioctl (fd_v4l, VIDIOC_STREAMON, &type);
Dequeue a filled buffer
struct v4l2_buffer buf;

memset(&buf, 0, sizeof (buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_MMAP;
ioctl (fd_v4l, VIDIOC_DQBUF, &buf);

fwrite(buffers[buf.index].start, fmt.fmt.pix.sizeimage, 1, fd_y_file);
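In a real application, dequeuing and enqueuing alternate in a loop: each filled buffer is consumed and then handed back to the driver for reuse. A sketch (frame_count is a hypothetical frame limit):

int frame;

for (frame = 0; frame < frame_count; frame++) {
    memset(&buf, 0, sizeof (buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    ioctl (fd_v4l, VIDIOC_DQBUF, &buf);   /* wait for a filled buffer */

    fwrite(buffers[buf.index].start, fmt.fmt.pix.sizeimage, 1, fd_y_file);

    ioctl (fd_v4l, VIDIOC_QBUF, &buf);    /* hand the buffer back to the driver */
}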
Stop streaming (capturing)
enum v4l2_buf_type type;

type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
ioctl (fd_v4l, VIDIOC_STREAMOFF, &type);
Close V4L device file
close(fd_v4l);
Video Processing Unit (VPU)
A comfortable way to use the VPU for encoding or decoding video streams is to use GStreamer (mfw_vpuencoder, mfw_vpudecoder plug-ins).
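For the decoding direction, here is a sketch of a playback pipeline for the MP4 file recorded by the C example above. It assumes that qtdemux (from the standard GStreamer plug-ins) can negotiate with mfw_vpudecoder and that a display is attached:
gst-launch filesrc location=/tmp/gst-demo-output.mp4 ! qtdemux ! mfw_vpudecoder ! mfw_v4lsink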
If you don't want to use GStreamer, you may start by looking at the VPU unit test application delivered by Freescale. It is contained in the imx-test package.
Here is how to unpack it using LTIB:
./ltib -m prep -p imx-test
Thereafter, you will find the VPU example source code at
ltib/rpm/BUILD/imx-test-11.05.01/test/mxc_vpu_test/
On your target, the unit test is installed as
/unit_tests/mxc_vpu_test.out
To get the command usage, type
/unit_tests/mxc_vpu_test.out -help
Further reading
Freescale documents
Download here: https://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=i.MX535&nodeId=018rH3ZrDR988D&fpsp=1&tab=Documentation_Tab
i.MX Linux Multimedia Framework - User's Guide
- Download item: IMX53_1105_LINUX_MMDOCS
- Location: Linux_Multimedia_Framework_Docs_2.0.1.tar.gz, file Linux_Multimedia_Framework_Docs_MX53Ubuntu_2.0.1/docs/Linux_Multimedia_Framework_User_Guide.pdf
i.MX5x VPU Application Programming Interface Linux - Reference Manual
- Download item: IMX53_1105_LINUXDOCS_BUNDLE
- Location: L2.6.35_11.05.01_ER_docs.tar.gz, file L2.6.35_11.05.01_ER_docs/doc/mx5/i.MX5x_Linux_VPU_API.pdf
External links
- GStreamer Hello world program
- GStreamer Application Development Manual
- V4L2 API specification
Source code
GStreamer development header files
ltib/rootfs/usr/include/gstreamer-0.10/
V4L2 Linux Kernel header files
- V4L2 main header file (V4L2_* enums, color formats, etc.)
ltib/rpm/BUILD/linux/include/linux/videodev2.h
- i.MX-specific V4L2 stuff
ltib/rpm/BUILD/linux/include/linux/mxc_v4l2.h
Unpack Freescale GStreamer plug-in
./ltib -m prep -p gst-fsl-plugin
- Source code is in
rpm/BUILD/gst-fsl-plugin-2.0.1/src/
Unpack ismconv plug-in
./ltib -m prep -p gst-plugins-bad
- Source code is in
rpm/BUILD/gst-plugins-bad-0.10.11/gst/bluetechnix/