[sf-lug] how can we use leapmotion with linux?
Kai Chang
kai.salmon.chang at gmail.com
Wed Mar 27 15:48:01 PDT 2013
Leap Motion now supports Linux. The drivers were released today.
http://www.ubuntuvibes.com/2013/03/the-leap-motion-controller-now-supports.html
On Wed, Mar 27, 2013 at 12:05 PM, Kai Chang <kai.salmon.chang at gmail.com> wrote:
> Yes, there are also Windows drivers that I've used.
>
> That's all that's required. There's an SDK, but for JavaScript only
> the drivers are needed to develop: you just listen on a websocket for
> the messages coming from the drivers (there are examples in that
> repo).
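>
> For example, a minimal page that just logs fingertip positions might
> look like this (a sketch: I'm assuming the service listens on
> ws://localhost:6437 and that each message is a JSON frame with a
> "pointables" array, so double-check against the driver docs):
>
> // minimal sketch -- assumes the driver's websocket is at localhost:6437
> // and that each message is a JSON frame with a "pointables" array
> var ws = new WebSocket("ws://localhost:6437/");
> ws.onmessage = function (event) {
>   var frame = JSON.parse(event.data);
>   if (frame.pointables && frame.pointables.length) {
>     // tipPosition is [x, y, z], in millimeters relative to the device
>     console.log(frame.pointables[0].tipPosition);
>   }
> };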
>
> There is also a JavaScript library, but it's not well-documented.
>
> https://github.com/leapmotion/leapjs
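>
> With the library loaded in the page, the equivalent is roughly this
> (another sketch; Leap.loop is the entry point I've been using):
>
> // rough equivalent using leapjs -- Leap.loop polls frames for you
> Leap.loop(function (frame) {
>   frame.pointables.forEach(function (pointable) {
>     // each pointable (finger or tool) has a tipPosition of [x, y, z]
>     console.log(pointable.tipPosition);
>   });
> });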
>
> On Wed, Mar 27, 2013 at 11:43 AM, Michael Shiloh
> <michaelshiloh1010 at gmail.com> wrote:
>> I would be most interested in hearing about the Linux drivers when possible.
>>
>> You mentioned you used an OS X computer. Are the drivers also available on
>> Windows? Since I teach, I need to support all three operating systems.
>>
>> Besides the drivers, is anything else required?
>>
>> I will check out your code. Thanks very much for the link.
>>
>> Michael
>>
>>
>> On 03/27/2013 11:40 AM, Kai Chang wrote:
>>>
>>> I borrowed an OS X computer to develop.
>>>
>>> Unfortunately I can't share details of the developer program, but
>>> there are Linux drivers that will probably be released before May.
>>> I'll email the list when this happens.
>>>
>>> I can share the data I collected. These are the JSON files that I
>>> recorded off the device. They include all of the 3-D position and
>>> orientation information as well as the fingertip positions.
>>>
>>> https://github.com/syntagmatic/leap-play/tree/master/data/gestures
>>>
>>> All of my open-source JavaScript examples are in that directory as well.
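>>>
>>> If you want to poke at the recordings without the hardware, a small
>>> Node script along these lines should work (the filename is made up,
>>> and I'm assuming each file is a JSON array of raw frames, each with a
>>> "pointables" array -- check an actual file for the exact shape):
>>>
>>> // hypothetical replay script for the recorded gesture data
>>> var fs = require("fs");
>>> var frames = JSON.parse(fs.readFileSync("data/gestures/example.json", "utf8"));
>>> frames.forEach(function (frame) {
>>>   (frame.pointables || []).forEach(function (p) {
>>>     console.log(p.tipPosition);  // fingertip position as [x, y, z]
>>>   });
>>> });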
>>>
>>> The SDK supports C++, C#, Objective-C, Python, Java, and JavaScript,
>>> plus a Unity integration.
>>>
>>> The device uses two cameras and three infrared emitters, with some
>>> onboard processing. You can find YouTube videos of early dev kits that
>>> don't have the shiny casing.
>>>
>>> http://youtu.be/LY3Ya__6BHw?t=2m42s
>>>
>>> I'd also be happy to bring the device to a user group meetup when
>>> the Linux drivers are released.
>>>
>>> On Wed, Mar 27, 2013 at 11:23 AM, Michael Shiloh
>>> <michaelshiloh1010 at gmail.com> wrote:
>>>>
>>>> But very interesting nonetheless. I was hoping to use the Leap but was
>>>> disgusted with the process of becoming a developer. I still haven't
>>>> received confirmation that I was accepted.
>>>>
>>>> I (and I suspect the larger Linux community) would be very interested in
>>>> learning more.
>>>>
>>>> Did you develop under Linux?
>>>>
>>>> In what language(s) can you program?
>>>>
>>>> Is there a well-documented API? Is it a library or a client/server
>>>> arrangement? Can you share the API with us?
>>>>
>>>> What else can you tell us?
>>>>
>>>> Michael
>>>>
>>>>
>>>>
>>>> On 03/27/2013 10:49 AM, Kai Chang wrote:
>>>>>
>>>>>
>>>>> This is similar in concept to the Leap. It's not open source.
>>>>>
>>>>> https://www.leapmotion.com/
>>>>>
>>>>> I used a developer version of the device to record some hand gestures.
>>>>> These are the X/Y positions of the fingertips (in the plane of the
>>>>> screen).
>>>>>
>>>>> http://fleetinbeing.net/leap-play/recorder-gallery.html
>>>>>
>>>>> Here's a video of me using the device to rotate/scale a map
>>>>> projection. This one uses the normal vector of the palm and the Y
>>>>> position of the hand.
>>>>>
>>>>> http://www.youtube.com/watch?v=9PZs7VHhypk
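>>>>>
>>>>> The mapping itself is roughly this (a leapjs sketch; the constants
>>>>> are arbitrary and redrawProjection stands in for whatever redraws
>>>>> the map):
>>>>>
>>>>> // palmNormal drives rotation, palm height (Y) drives scale
>>>>> Leap.loop(function (frame) {
>>>>>   if (!frame.hands.length) return;
>>>>>   var hand = frame.hands[0];
>>>>>   var rotation = Math.atan2(hand.palmNormal[0], -hand.palmNormal[1]);
>>>>>   var scale = hand.palmPosition[1] / 200;  // Y position in mm
>>>>>   redrawProjection(rotation, scale);       // hypothetical redraw hook
>>>>> });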
>>>>>
>>>>> Anyway, this is probably not too helpful for your original problem
>>>>> of creating a device from scratch.
>>>>>
>>>>> Cheers,
>>>>> Kai
>>>>>
>>>>> On Wed, Mar 27, 2013 at 10:17 AM, Michael Shiloh
>>>>> <michaelshiloh1010 at gmail.com> wrote:
>>>>>>
>>>>>>
>>>>>> I'm having trouble figuring out the best search terms to find what
>>>>>> I'm looking for.
>>>>>>
>>>>>> Basically, I'm looking for crude Kinect-like behavior. I don't expect
>>>>>> perfect recognition or tracking, but surely something along these
>>>>>> lines is available.
>>>>>>
>>>>>> My plan is to mount 2 or 3 webcams in a small box. The user inserts
>>>>>> his/her hand into the box, and I'd like to track the tips of their
>>>>>> fingers. I would mount the webcams at right angles so as to track
>>>>>> position along the x and y (and z) axes. The inside of the box can be
>>>>>> painted any color to differentiate it from the hand, which I can
>>>>>> require be free of gloves or clothing. I can light the box, although
>>>>>> later it might be nicer to keep the box dark and use infrared
>>>>>> lighting.
>>>>>>
>>>>>> There are no other objects in the box, and only one hand at a time.
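>>>>>>
>>>>>> To make the per-camera step concrete, here's a rough sketch of the
>>>>>> kind of color-based segmentation I have in mind (browser JavaScript;
>>>>>> it assumes a <video id="cam"> element already showing the webcam
>>>>>> stream, and the brightness threshold is a placeholder for whatever
>>>>>> the box ends up painted):
>>>>>>
>>>>>> // crude sketch: anything brighter than the (dark-painted) box
>>>>>> // counts as hand; the topmost such pixel is taken as a fingertip
>>>>>> var video = document.getElementById("cam");
>>>>>> var canvas = document.createElement("canvas");
>>>>>>
>>>>>> function findTip() {
>>>>>>   canvas.width = video.videoWidth;
>>>>>>   canvas.height = video.videoHeight;
>>>>>>   var ctx = canvas.getContext("2d");
>>>>>>   ctx.drawImage(video, 0, 0);
>>>>>>   var px = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
>>>>>>   for (var y = 0; y < canvas.height; y++) {
>>>>>>     for (var x = 0; x < canvas.width; x++) {
>>>>>>       var i = (y * canvas.width + x) * 4;
>>>>>>       if (px[i] + px[i + 1] + px[i + 2] > 400) {  // placeholder threshold
>>>>>>         return { x: x, y: y };
>>>>>>       }
>>>>>>     }
>>>>>>   }
>>>>>>   return null;
>>>>>> }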
>>>>>>
>>>>>> Thoughts?
>>>>>>
>>>>>> _______________________________________________
>>>>>> sf-lug mailing list
>>>>>> sf-lug at linuxmafia.com
>>>>>> http://linuxmafia.com/mailman/listinfo/sf-lug
>>>>>> Information about SF-LUG is at http://www.sf-lug.org/