
Detailed Record

Author: Lian-Chuan Huang (黃廉權)
Title: The HMM-based Sensing Correction Method for Leap Motion Finger Tracking (基於隱馬可夫模型感測校正方法用於Leap Motion手指追蹤)
Advisor: Tsung-Ying Sun (孫宗瀛)
Committee members: Sheng-Ta Hsieh (謝昇達), Tsung-Ying Sun (孫宗瀛), Chun-Ling Lin (林君玲)
Degree: Master's
Institution: National Dong Hwa University
Department: Department of Electrical Engineering
Student ID: 610423019
Publication year: 108 (ROC calendar, i.e. 2019)
Graduation academic year: 107
Language: Chinese
Pages: 50
Keywords: Hidden Markov model, Leap Motion (motion sensor), finger tracking
Leap Motion is a compact, high-precision, high-frame-rate hand sensor, one of the most accurate hand sensors currently on the market. However, like all vision- and optics-based sensors, it suffers from occlusion: when tracked objects overlap, detection fails or produces errors. For Leap Motion, overlapping fingers create a sensing blind zone.
Because Leap Motion finger tracking matches the doubly embedded stochastic structure of a Hidden Markov Model (HMM), this study proposes an HMM-based sensing correction method to address the blind-zone sensing problem. The blind zone is partitioned into several regions, each serving as a hidden state; the Viterbi algorithm then recovers the state sequence, identifying which state a finger occupies.
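The record contains no code; purely as an illustrative sketch (not the author's implementation), the Viterbi decoding step described above can be written as follows. The model parameters here are hypothetical placeholders, standing in for the transition and emission probabilities the thesis would estimate for its blind-zone regions:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for a discrete observation sequence.

    obs: sequence of observation indices
    pi:  initial state probabilities, shape (N,)
    A:   state transition matrix, shape (N, N)
    B:   emission matrix, shape (N, M)
    """
    N, T = len(pi), len(obs)
    delta = np.zeros((T, N))            # best path probability ending in state j at time t
    psi = np.zeros((T, N), dtype=int)   # backpointers for path recovery
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    # Backtrack from the most probable final state.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Hypothetical two-state model (two blind-zone regions, two observation symbols).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
states = viterbi([0, 0, 1], pi, A, B)  # → [0, 0, 1]
```

In practice long sequences would use log probabilities to avoid underflow; this direct form mirrors the textbook recursion.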
The test data comprise finger-opening and finger-closing movements; each state is tested and the results tallied into a confusion matrix. Experiments show that the proposed method effectively improves Leap Motion's detection performance in the blind zone, raising the recognition rate. Compared with raw Leap Motion output, applying the HMM method yields significant gains in accuracy, true positive rate, and false positive rate, with total accuracy rising from 20% to 70%. Improved blind-zone detection benefits applications that demand precise operation, such as anthropomorphic robotic arms.
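The evaluation above can be sketched numerically. The confusion-matrix values below are hypothetical (the thesis reports its own measured matrices); the code only shows how total accuracy and per-state true/false positive rates are derived from such a matrix:

```python
import numpy as np

# Hypothetical confusion matrix for three blind-zone states:
# rows = true state, columns = predicted state.
cm = np.array([[40,  5,  5],
               [ 4, 42,  4],
               [ 6,  4, 40]])

# Total accuracy: correctly classified samples over all samples.
total_accuracy = np.trace(cm) / cm.sum()

def rates(cm, s):
    """One-vs-rest true positive rate and false positive rate for state s."""
    tp = cm[s, s]                  # state s predicted as s
    fn = cm[s].sum() - tp          # state s predicted as something else
    fp = cm[:, s].sum() - tp       # other states predicted as s
    tn = cm.sum() - tp - fn - fp   # everything else
    return tp / (tp + fn), fp / (fp + tn)

tpr0, fpr0 = rates(cm, 0)  # → 0.8, 0.1
```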
Leap Motion is a fast, high-accuracy optical sensor for finger tracking. However, it fails when the fingers and palm overlap, producing sensing errors. This study proposes a method for correcting errors arising from Leap Motion's blind zone. The method corrects finger-direction data using a Hidden Markov Model (HMM): data captured from Leap Motion form a time series, and finger-movement behavior closely matches the doubly stochastic structure of an HMM. Each hidden state represents a region of the finger-movement space, and the Viterbi algorithm identifies the exact region a finger occupies. Test data consisting of finger-opening and finger-closing movements were collected for each state and tallied into a confusion matrix, which shows a significantly improved recognition rate: accuracy, true positive rate, and false positive rate all surpass raw Leap Motion observations. The results show that the HMM alleviates the blind-zone problem and increases finger-tracking accuracy without much delay. This is helpful for gesture-control scenarios that require high accuracy in the blind zone; one suitable application is anthropomorphic robotic hand control, where correctly captured finger movement enables reliable control.
Chapter 1: Introduction 1
Chapter 2: Research Methods and Background 7
Chapter 3: Finger Movement and Hand Observation Model 25
Chapter 4: Experiments 37
Chapter 5: Conclusion 47