[maemo-developers] video processing on N810 using gstreamer

From: Bruno botte.pub at gmail.com
Date: Mon Aug 18 15:57:47 EEST 2008
Hello,

I'm sorry, I wanted to send the previous message to the list but only
sent it to Stefan.

So my VIDEO_ defines are:

#define VIDEO_SRC "v4l2src"
#define VIDEO_SINK "xvimagesink"
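
(On the device xvimagesink works; under Xephyr in the SDK it would have
to be

#define VIDEO_SINK "ximagesink"

instead, since Xephyr doesn't support the XVideo extension, as the
comment in the code below also says.)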

I removed the second caps part; I had pasted it from the original code I
was using as an example, but it has no business in my pipeline ;)


But the problem doesn't come from this; my pipeline is actually working.
I have a start/stop button, and when I press it, the video from the
camera appears on the screen.

Now I'd like to run some code continuously: calculate a result from the
current image buffer, then do the same with the next frame once it's done.
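
I was thinking of trying a buffer probe on the screen sink's pad. This
is an untested sketch against the GStreamer 0.10 API; process_frame()
is just a placeholder for the calculation I still have to write:

/* Called for every buffer that reaches the screen sink's sink pad.
 * process_frame() is a placeholder for the actual calculation. */
static gboolean frame_probe_cb(GstPad *pad, GstBuffer *buffer,
        gpointer user_data)
{
    /* GST_BUFFER_DATA() points at the raw RGB pixels of this frame */
    process_frame(GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer));
    return TRUE; /* TRUE lets the buffer continue on to the sink */
}

/* in initialize_pipeline(), after creating screen_sink: */
GstPad *pad = gst_element_get_static_pad(screen_sink, "sink");
gst_pad_add_buffer_probe(pad, G_CALLBACK(frame_probe_cb), NULL);
gst_object_unref(pad);

From the docs, returning FALSE from the probe should drop the buffer,
so I could skip frames while a calculation is still running. Would that
be the right approach?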

Bruno


2008/8/18, Stefan Kost <ensonic at hora-obscura.de>:
>
> Bruno schrieb:
>
> > Hi Stefan,
> >
> > - By video_sink I meant screen_sink, which contains the current
> > frame buffer from the camera.
> > - The video/x-raw-rgb caps seem to configure the input of the
> > sensor; I just took them as-is from the maemo camera example.
>
>
> Please reply to the list. I am not Mr. GStreamer support, right?
>
> Then please read my comments again. There is no element called
> "screen_sink", so please also show that #define VIDEO_SINK ....
> Second, the first video/x-raw-rgb is for the camera format, but you
> define another caps further below without using it in the code snippet.
>
> Stefan
>
> >
> >
> > Actually, I'm just trying to find a place where I can put my image
> > processing code so it can process the frames coming from the camera.
> > I'd like the program to process one frame from the camera, ignore
> > the others until it finishes the image processing on it, and then
> > process the next one:
> >
> >  * |Camera|    |CSP   |    |Screen|    |Screen|    |Calculate |
> >  * |src   | -> |Filter| -> |queue | -> |sink  | -> |expression| -> Display
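> >
> > Maybe the queue could even drop the extra frames for me while the
> > calculation is busy. If I read the queue element docs right, setting
> > it leaky should do that; an untested sketch:
> >
> >     /* keep only the newest frame and drop older ones, instead of
> >      * blocking the camera while the calculation runs */
> >     g_object_set(G_OBJECT(screen_queue),
> >             "leaky", 2,            /* 2 = leak downstream (oldest) */
> >             "max-size-buffers", 1,
> >             NULL);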
> >
> >
> > Here is my pipeline with some useless stuff removed:
> >
> > static gboolean initialize_pipeline(AppData *appdata,
> >         int *argc, char ***argv)
> > {
> >     GstElement *pipeline, *camera_src, *screen_sink;
> >     GstElement *screen_queue;
> >     GstElement *csp_filter;
> >     GstCaps *caps;
> >     GstBus *bus;
> >
> >
> >     /* Initialize Gstreamer */
> >     gst_init(argc, argv);
> >
> >     /* Create the pipeline and attach a callback to its
> >      * message bus */
> >     pipeline = gst_pipeline_new("test-camera");
> >
> >     bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
> >     gst_bus_add_watch(bus, (GstBusFunc)bus_callback, appdata);
> >     gst_object_unref(GST_OBJECT(bus));
> >
> >     /* Save pipeline to the AppData structure */
> >     appdata->pipeline = pipeline;
> >
> >     /* Create elements */
> >     /* Camera video stream comes from a Video4Linux driver */
> >     camera_src = gst_element_factory_make(VIDEO_SRC, "camera_src");
> >     /* Colorspace filter is needed to make sure that the sinks
> >      * understand the stream coming from the camera */
> >     csp_filter = gst_element_factory_make("ffmpegcolorspace",
> >             "csp_filter");
> >     /* The queue creates a new thread for the stream */
> >     screen_queue = gst_element_factory_make("queue", "screen_queue");
> >     /* Sink that shows the image on screen. Xephyr doesn't support XVideo
> >      * extension, so it needs to use ximagesink, but the device uses
> >      * xvimagesink */
> >     screen_sink = gst_element_factory_make(VIDEO_SINK, "screen_sink");
> >
> >
> >     /* Check that elements are correctly initialized */
> >     if(!(pipeline && camera_src && screen_sink && csp_filter &&
> > screen_queue))
> >     {
> >         g_critical("Couldn't create pipeline elements");
> >         return FALSE;
> >     }
> >
> >
> >     /* Add elements to the pipeline. This has to be done prior to
> >      * linking them */
> >     gst_bin_add_many(GST_BIN(pipeline), camera_src, csp_filter,
> >             screen_queue, screen_sink, NULL);
> >
> >     /* Specify what kind of video is wanted from the camera */
> >     caps = gst_caps_new_simple("video/x-raw-rgb",
> >             "width", G_TYPE_INT, 640,
> >             "height", G_TYPE_INT, 480,
> >             "framerate", GST_TYPE_FRACTION, 25, 1,
> >             NULL);
> >
> >
> >     /* Link the camera source and colorspace filter using capabilities
> >      * specified */
> >     if(!gst_element_link_filtered(camera_src, csp_filter, caps))
> >     {
> >         gst_caps_unref(caps); /* don't leak the caps on failure */
> >         return FALSE;
> >     }
> >     gst_caps_unref(caps);
> >
> >     /* Connect Colorspace Filter -> Screen Queue -> Screen Sink.
> >      * This finalizes the screen part of the pipeline */
> >     if(!gst_element_link_many(csp_filter, screen_queue, screen_sink,
> >             NULL))
> >     {
> >         return FALSE;
> >     }
> >
> >     /* gdkpixbuf requires 8 bits per sample which is 24 bits per
> >      * pixel */
> >     caps = gst_caps_new_simple("video/x-raw-rgb",
> >             "width", G_TYPE_INT, 640,
> >             "height", G_TYPE_INT, 480,
> >             "bpp", G_TYPE_INT, 24,
> >             "depth", G_TYPE_INT, 24,
> >             NULL);
> >
> >     /* As soon as the screen is exposed, the window ID will be
> >      * passed to the sink */
> >     g_signal_connect(appdata->screen, "expose-event",
> >             G_CALLBACK(expose_cb), screen_sink);
> >
> >     gst_element_set_state(pipeline, GST_STATE_PAUSED);
> >
> >     return TRUE;
> > }
> >
> >
> > I tried to get a printf("hello") on each video frame. I put the
> > printf in the pipeline initialisation, in main(), and in the expose
> > callback function, which is:
> >
> > /* Callback to be called when the screen-widget is exposed */
> > static gboolean expose_cb(GtkWidget * widget, GdkEventExpose * event,
> > gpointer data)
> > {
> >     /* Tell the xvimagesink/ximagesink the x-window-id of the screen
> >      * widget in which the video is shown. After this the video
> >      * is shown in the correct widget */
> >     gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(data),
> >                      GDK_WINDOW_XWINDOW(widget->window));
> >     return FALSE;
> > }
> >
> > And no success: the printf is shown only once, when I start the
> > pipeline. (I have a button to start and stop it.)
> >
> > Maybe I can make a function with the processing part in it, and tell
> > the pipeline to run it with a g_signal_connect call.
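> >
> > Perhaps an identity element between the queue and the sink, with its
> > "handoff" signal connected? An untested sketch (process_frame() is
> > again a placeholder for my calculation):
> >
> > /* emitted once for every buffer flowing through the element */
> > static void on_handoff(GstElement *identity, GstBuffer *buffer,
> >         gpointer user_data)
> > {
> >     process_frame(GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer));
> > }
> >
> >     /* in initialize_pipeline() */
> >     GstElement *tap = gst_element_factory_make("identity", "tap");
> >     g_object_set(tap, "signal-handoffs", TRUE, NULL);
> >     g_signal_connect(tap, "handoff", G_CALLBACK(on_handoff), NULL);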
> >
> > Thanks in advance,
> > Bruno
>
>