Anyone know of any (paid or open source) solutions for developing custom gesture recognition, specifically / especially for use with hand tracking?
I picked up an Oculus Quest 2 to start experimenting with VR and the hand tracking API only really gives you a couple of system gestures and access to hand position/orientation and finger pose data. I don't think there's even support for a default "grab" gesture (maybe that's generally handled through physics simulation?)
I've done a fair bit of googling, so I know there's plenty of research going on in this space, but in terms of implementation specifics, 90% of what I've found so far is about using TensorFlow in Python with still image captures from your webcam. That's interesting (and proves that a machine learning approach may work well for this), but not what I'm after :-)
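For what it's worth, before reaching for machine learning, a "grab" or "pinch" can often be derived directly from the finger pose data with simple distance thresholds. The sketch below is purely illustrative: the joint names, coordinates, and thresholds are my assumptions, not anything from the Oculus SDK.

```python
import numpy as np

# Hypothetical joint layout: one 3D position per fingertip plus the palm centre,
# as you might pull out of any hand-tracking API's per-frame pose data.
FINGERTIPS = ["thumb", "index", "middle", "ring", "pinky"]

def is_grab(fingertip_positions, palm_position, curl_threshold=0.06):
    """Rough 'grab' heuristic: all non-thumb fingertips are within
    curl_threshold metres of the palm centre, i.e. the hand is curled."""
    palm = np.asarray(palm_position)
    for name in FINGERTIPS[1:]:          # skip the thumb; it behaves differently
        tip = np.asarray(fingertip_positions[name])
        if np.linalg.norm(tip - palm) > curl_threshold:
            return False
    return True

def is_pinch(fingertip_positions, pinch_threshold=0.02):
    """Pinch = thumb tip and index tip closer than pinch_threshold metres."""
    thumb = np.asarray(fingertip_positions["thumb"])
    index = np.asarray(fingertip_positions["index"])
    return np.linalg.norm(thumb - index) < pinch_threshold

# Example frame (metres, made-up numbers): a loosely curled hand
frame = {
    "thumb":  (0.02, 0.01, 0.00),
    "index":  (0.03, 0.02, 0.01),
    "middle": (0.03, 0.01, 0.01),
    "ring":   (0.02, 0.01, 0.01),
    "pinky":  (0.02, 0.00, 0.01),
}
print(is_grab(frame, palm_position=(0.0, 0.0, 0.0)))  # True for this pose
```

Heuristics like this cover the basic grab/pinch cases; anything more nuanced (signs, custom poses) is where a learned classifier over the joint positions starts to pay off.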
I was playing RE4 VR and found myself thinking: why am I ordering Ashley around by clicking and selecting with my left thumbstick? Why can't I order her to Follow, Wait, Hide or Heal by just speaking the words?
Or in more general terms, why not use speech recognition or gestures in way more games? Because now, for the first time, every user has a microphone in their headset.
Recognition of simple one-word commands should be easy and "good enough" in accuracy not to impede the gameplay.
The computational load for speech recognition should be low enough for stand-alone use.
Think about it: answering yes or no questions by nodding or shaking your head (or speaking). Telling NPCs where to go (left or right) at selected points. Using hand gestures to point at things and having NPCs look or shoot at them. Shouting for a medic and getting help. Or having a generic Siri/Cortana/Amazon Echo-like virtual assistant for the Quest OS and utility apps? Let's call him... >!Mark!<. ;D
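Just to back up the "low computational load" point above: an offline recognizer constrained to a tiny command vocabulary is very cheap. Here is a minimal sketch using the Vosk library, where the model name, the WAV capture file, and the command list are my own assumptions rather than anything Quest-specific:

```python
import json
import wave

from vosk import Model, KaldiRecognizer  # offline speech recognition library

# Restrict the recognizer to a handful of command words; "[unk]" catches
# everything else. The model path is an assumption (any small Vosk model works).
COMMANDS = ["follow", "wait", "hide", "heal", "[unk]"]
model = Model("vosk-model-small-en-us-0.15")

wf = wave.open("mic_capture.wav", "rb")  # 16 kHz mono PCM capture (hypothetical file)
rec = KaldiRecognizer(model, wf.getframerate(), json.dumps(COMMANDS))

while True:
    data = wf.readframes(4000)
    if len(data) == 0:
        break
    if rec.AcceptWaveform(data):
        word = json.loads(rec.Result()).get("text", "")
        if word in ("follow", "wait", "hide", "heal"):
            print("command:", word)  # hand off to game logic here
```

Because the grammar is limited to four words, the search space is tiny, which is exactly why one-word commands are a much easier problem than open dictation.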
Almost all of our computer interaction occurs via mice, keyboards, and touch screens. An essential step towards making human-computer interaction more efficient would be to move to more contactless forms of communication like speech, facial expressions, and gestures, which are what we generally use when communicating with other humans. Specifically, past studies in hand gesture recognition have not been very successful in achieving high accuracy while maintaining low computational complexity. This poor recognition accuracy is primarily due to the rotation, translation, and scaling that often occur during the capture of hand gesture images, and to differences in hand types that vary from person to person.
To improve hand gesture recognition capability while keeping the computational burden low, a group of researchers from Sun Yat-sen University has proposed a hand-type-adaptive algorithm. The algorithm first classifies the input images by hand type into slim, normal, and broad categories, using three features: palm length, palm width, and finger length. The recognition task addressed in the work covers 9 hand gestures. After the hand-type classification step, 360 hand gesture images were captured (9 from each of the 40 volunteers who participated), forming the overall hand gesture library. Dedicated libraries for each hand type are then used for further classification. The research group selected the area-perimeter ratio (C) and effective area ratio (E) features, along with the seven Hu moments, which offer low complexity and are largely invariant to rotation, translation, and scaling. The feature vector of each hand gesture image can thus be denoted as {C, E, Hu1, …, Hu7}.
Paper: https://www.spiedigitallibrary.org/journals/journal-of-electronic-imaging/volume-30/issue-06/063026/Hand-gesture-recognition-algorithm-combining-hand-type-adaptive-algorithm-and/10.1117/1.JEI.30.6.063026.full?SSO=1
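As a rough illustration of that feature vector (this is not the authors' code; my definitions of C and E are guesses, so consult the paper for the exact formulas), computing rotation/translation/scale-tolerant features from a segmented hand mask with OpenCV might look like this:

```python
import cv2
import numpy as np

def gesture_features(binary_mask: np.ndarray) -> np.ndarray:
    """Build a {C, E, Hu1..Hu7} style feature vector from a binary hand mask.

    Assumptions: C = area / perimeter**2 (a dimensionless compactness measure)
    and E = contour area / area of its minimum enclosing circle are used as
    plausible stand-ins for the paper's definitions.
    """
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    hand = max(contours, key=cv2.contourArea)        # largest blob = hand

    area = cv2.contourArea(hand)
    perimeter = cv2.arcLength(hand, True)
    C = area / (perimeter ** 2)                      # area-perimeter ratio

    (_, _), radius = cv2.minEnclosingCircle(hand)
    E = area / (np.pi * radius ** 2)                 # effective area ratio

    hu = cv2.HuMoments(cv2.moments(hand)).flatten()  # 7 Hu invariant moments
    # Log-scale the Hu moments, as is conventional, to tame their dynamic range.
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    return np.concatenate(([C, E], hu))              # shape (9,)
```

A 9-dimensional vector like this is cheap to classify, which is where the low computational complexity claim comes from.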
I've been itching to use a Nintendo Nunchuk controller in a project ever since I found out they use I2C. I pulled out the cabling/internal supports and added a micro USB port, an RGB LED, and an ESP32.
You make gestures via the joystick (flick up, circle, etc.), which map to user-defined actions. Multiple action mappings are supported; currently there's one for admin actions (restart, enter deep sleep, fade LED with accelerometer, etc.) and one for BLE HID actions (pause/play, next/prev track, volume up/down, make pairable), but the possibilities are endless.
Deep sleep is entered after a minute of inactivity to conserve battery. The micro USB port housing is connected to a touch pad; tapping it wakes up the device.
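Not the actual firmware (that lives in the repo linked below), but the mapping idea is roughly this: each recognized joystick gesture is looked up in the currently active map, and an inactivity timer decides when to drop into deep sleep. The gesture and action names here are invented for illustration:

```python
import time

# Conceptual sketch of the gesture -> action dispatch described above.
ADMIN_MAP = {
    "flick_up":   lambda: print("restart"),
    "flick_down": lambda: print("enter deep sleep"),
    "circle":     lambda: print("fade LED with accelerometer"),
}
MEDIA_MAP = {
    "flick_up":    lambda: print("volume up"),
    "flick_down":  lambda: print("volume down"),
    "flick_right": lambda: print("next track"),
    "flick_left":  lambda: print("previous track"),
    "circle":      lambda: print("make pairable"),
}

INACTIVITY_TIMEOUT_S = 60          # one minute, as in the post
last_activity = time.monotonic()

def handle_gesture(gesture: str, mapping: dict) -> None:
    """Run the action bound to this gesture in the active mapping."""
    global last_activity
    last_activity = time.monotonic()
    action = mapping.get(gesture)
    if action:
        action()

def should_sleep() -> bool:
    """True once the inactivity timeout has elapsed; the device would then
    enter deep sleep and wait for the touch pad to wake it."""
    return time.monotonic() - last_activity > INACTIVITY_TIMEOUT_S

handle_gesture("flick_right", MEDIA_MAP)   # -> "next track"
```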
Source code, build instructions, and CAD files can be found on GitHub.
I call it onechuk. Hope you like it!
I'm trying to use sxmo on my wileyfox-crackling, but touch recognition seems to be completely broken: left and right edge swipes trigger brightness adjustment regardless of the height the swipe is started at, even though brightness adjustment should only happen when swiping along the top edge. Other things, like vertical swipes, multi-finger swipes, etc., do not work at all.
Here's the lisgd command line, grepped from ps aux:
lisgd -t 125 -T 60 -g 1 DRUL BR * sxmo_hotcorner.sh bottomright -g 1 DLUR BL * sxmo_hotcorner.sh bottomleft -g 1 ULDR TL * sxmo_hotcorner.sh topleft -g 1 DRUL TR * sxmo_hotcorner.sh topright -g 1 LR B L sxmo_gesturehandler.sh enter -g 1 RL B L sxmo_gesturehandler.sh back -g 1 LR L * sxmo_gesturehandler.sh prevdesktop -g 1 RL R * sxmo_gesturehandler.sh nextdesktop -g 1 DU L * P sxmo_gesturehandler.sh volup -g 1 UD L * P sxmo_gesturehandler.sh voldown -g 1 LR T * P sxmo_gesturehandler.sh brightnessup -g 1 RL T * P sxmo_gesturehandler.sh brightnessdown -g 1 DU B * sxmo_gesturehandler.sh showkeyboard -g 1 UD B * sxmo_gesturehandler.sh hidekeyboard -g 1 UD T * sxmo_gesturehandler.sh showmenu -g 1 DU T * sxmo_gesturehandler.sh hidemenu -g 2 UD T * sxmo_gesturehandler.sh showsysmenu -g 2 UD B * sxmo_gesturehandler.sh closewindow -g 3 UD B * sxmo_gesturehandler.sh killwindow -g 2 RL * * sxmo_gesturehandler.sh moveprevdesktop -g 2 LR * * sxmo_gesturehandler.sh movenextdesktop -g 1 DU R * P sxmo_gesturehandler.sh scrollup_short -g 1 UD R * P sxmo_gesturehandler.sh scrolldown_short -g 1 LR R S sxmo_gesturehandler.sh scrollright_short -g 1 RL L S sxmo_gesturehandler.sh scrollleft_short
Everything looks fine (except its ugly syntax, lol). Anyway, that's the default setup; I didn't change anything.
I know this UI is made with the pinephone in mind, but lisgd itself is not pinephone-only according to its documentation, so what's wrong with it? Is there a way to make it work properly? If not, then maybe there's a saner alternative?
Thanks.