libgrip for gtk extension?

Asked by Bernard Victor Delvaux

Hello

I'm using OpenGL for a 3D application; the OpenGL display is supported by the SDL library, and there is a GTK GUI. I'm considering using GtkGLExt, which allows displaying OpenGL directly in GTK.

I'd like to add multitouch input to my app; with SDL, I have to use SDL 1.3, which is still in development. My question is: if I use GtkGLExt, which binds GTK to OpenGL directly, would it be possible to get multitouch input via libgrip in the OpenGL window?

I'm not sure this is the right place to ask this question, since it involves different libraries.
Thank you very much

Victor

Question information

Language: English
Status: Solved
For: libgrip
Assignee: Stephen M. Webb
Solved by: Stephen M. Webb
Revision history for this message
Duncan McGreggor (oubiwann) said :
#1

Thanks for your question, Victor -- you have good ones :-)

Stephen Webb is a little familiar with libgrip, and very familiar with OpenGL in general, so I'll let him answer you. Note that we're in the middle of a release, though, so things are busy right now. If you don't hear back from us within two weeks, please bump/ping :-)

Thanks!

Bernard Victor Delvaux (nadaeck) said :
#2

Thank you Duncan for all this.

As far as I understand, OpenGL only manages 3D rendering, not input events such as keyboard, mouse, or joystick events, and now touch gestures; so we have to use another library, such as SDL, the GLUT toolkit, or something else, to handle input events with OpenGL.

So the question is: do the gestures of libgrip inside GTK "extend" to the OpenGL widget provided by GtkGLExt? Maybe I should also ask the GtkGLExt people.

Thank you and I'll be in touch

Victor

Best Stephen M. Webb (bregma) said :
#3

If you're using GTK, you can use libgrip to connect multi-touch gestures to the GTK widgets, regardless of whether you're using GtkExt or SDL and OpenGL.

If you want to use multi-touch or gestures directly in your application, you can also use libutouch-geis directly. I'm not sure if it's a good idea to use both libgrip and libutouch-geis in the same application because I haven't tested it, but in theory it should work.
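For the GTK-widget route described above, registering for gestures with libgrip is fairly compact. A minimal sketch follows; the header path, the `GripGestureManager` calls, and the `GRIP_GESTURE_PINCH` constant reflect the libgrip 0.x API as I understand it, and the sketch is untested, so check it against the headers of your installed version:

```c
/* Sketch: subscribe a GTK toplevel window to two-finger pinch gestures
 * with libgrip. Build roughly like:
 *   gcc app.c $(pkg-config --cflags --libs libgrip-0.1 gtk+-2.0)
 */
#include <gtk/gtk.h>
#include <libgrip/gripgesturemanager.h>

/* Invoked at gesture start, update, and end. */
static void
gesture_cb (GtkWidget        *widget,
            GripTimeType      time_type,
            GripGestureEvent *event,
            gpointer          user_data)
{
  if (time_type == GRIP_TIME_UPDATE)
    g_print ("pinch update\n");
}

int
main (int argc, char **argv)
{
  gtk_init (&argc, &argv);

  GtkWidget *window = gtk_window_new (GTK_WINDOW_TOPLEVEL);

  /* The gesture manager is a process-wide singleton. */
  GripGestureManager *manager = grip_gesture_manager_get ();

  /* Watch for 2-finger pinch gestures on this toplevel window. */
  grip_gesture_manager_register_window (manager, window,
                                        GRIP_GESTURE_PINCH, 2,
                                        gesture_cb, NULL, NULL);

  gtk_widget_show_all (window);
  gtk_main ();
  return 0;
}
```

Since libgrip hooks into the GTK event stream, this only covers widgets living in the GTK window; a separate SDL window would need a different route, such as libutouch-geis.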

Bernard Victor Delvaux (nadaeck) said :
#4

Hello Stephen

Thank you for the answer. In my app there is a GTK GUI, and after a click on the "play" button an SDL/OpenGL window is opened, so I suppose libgrip won't work in that case, since the GTK and SDL windows are not directly linked in my app...

Maybe the following part should be placed in another location.
So I could use libgeis directly, since I only want multitouch input for the 3D display (to manipulate the displayed image, actually)? That would be great; but do I need only libgeis, or do I also need utouch-grail? Is there any tutorial or how-to for a "multitouch beginner" that explains step by step how to implement this in an application?

So this is what I understand: it is better to have gestures managed at a system-wide level (as in Unity in Natty, for example) and just implement gesture support in a specific app with libgeis or libgrip, so it integrates smoothly and there is no conflict in the management of gestures. In that regard, libgeis is enough if Unity is installed, but not enough if Unity is not installed (as in the Maverick classic desktop)...?

Am I wrong? Maybe I should open another post in the utouch or utouch-geis section, because the last answer solved my question.

Victor

Bernard Victor Delvaux (nadaeck) said :
#5

Thanks Stephen M. Webb, that solved my question.

Stephen M. Webb (bregma) said :
#6

Using libutouch-geis directly for multi-touch gestural input is appropriate for manipulating the display directly. Pure multi-touch is not yet available through the GEIS API (it's scheduled for the Oneiric Ocelot release), but gestures and the associated touch information are definitely available now.

Gesture recognition in Maverick and Natty is done at the system-wide level; Unity is not required. The libutouch-geis-dev package is the application interface to the uTouch stack (which consists of geis, grail, and other components), and the entire uTouch stack ships with Natty and is available even with the classic desktop.

A very simple example of libutouch-geis used in an SDL+OpenGL application can be found at <http://bazaar.launchpad.net/~bregma/ucube/trunk/view/head:/ucube/app.cpp>. API documentation is available at <http://people.canonical.com/~stephenwebb/geis-v2-api/> and in the libutouch-geis-doc package, and it includes documented example code.

Bernard Victor Delvaux (nadaeck) said :
#7

Since the topic is about utouch-geis, I've created a new post in the utouch-geis section:

https://answers.launchpad.net/utouch-geis/+question/150521