Author: 高子祺
Author (English): Tzu-Chi Kao
Thesis Title: 應用於線上學習專注度評分之視線追蹤系統
Thesis Title (English): Gaze Tracking System for Evaluating Concentration on Online Courses
Advisor: 孫宗瀛
Advisor (English): Tsung-Ying Sun
Committee Members: 謝昇達、林君玲
Committee Members (English): Sheng-Ta Hsieh, Chun-Ling Lin
Degree: Master
Institution: National Dong Hwa University (國立東華大學)
Department: Department of Electrical Engineering
Student ID: 610523001
Publication Year (ROC calendar): 107 (2018)
Graduation Academic Year: 106
Language: Chinese
Pages: 46
Keywords: gaze detection, eye tracking, pupil contour, fuzzy system (視線偵測、眼部追蹤、瞳孔輪廓、模糊系統)
Keywords (English): gaze detecting, eye tracking, pupil contour, fuzzy system
  With rapid technological progress, the internet and personal computers have become ubiquitous, and online instruction such as massive open online courses (MOOCs) and webinars has become a new trend. However, as anyone who has taken an online course knows, staying focused on the material for long periods without being drawn away by social networking sites or games is very difficult. Identifying the learner's gaze region of interest during a course and helping learners understand their own learning state has therefore become a popular research topic.
  This thesis proposes a learner concentration scoring system based on gaze tracking. The first part of the thesis works from 3D images, using head pose to track the eyes and obtain a region of interest (ROI); after image preprocessing, the pupil position is detected and tracked by ellipse fitting. The second part develops a fuzzy inference system that infers the learner's gaze region of interest, compares the result against the learning window, and adds to or subtracts from the concentration score accordingly.
  Experimental results show that the proposed gaze tracking achieves reasonable accuracy and that effective concentration scoring can be performed on the tracking results, helping users understand their own learning state.
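The pupil-detection step described above (contour points followed by ellipse fitting) can be sketched as a plain least-squares conic fit. This is a simplified stand-in, not the thesis's exact method (which builds on a direct, ellipse-specific fit), and all names below are illustrative:

```python
import numpy as np

def fit_conic_center(x, y):
    """Least-squares conic fit to contour points; returns the center
    of the fitted conic (for a pupil contour, an ellipse).
    Conic: A x^2 + B xy + C y^2 + D x + E y + F = 0."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    D_mat = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    # The smallest right singular vector minimizes ||D a|| with ||a|| = 1.
    _, _, Vt = np.linalg.svd(D_mat, full_matrices=False)
    A, B, C, Dc, Ec, _ = Vt[-1]
    det = 4*A*C - B**2          # positive for an ellipse
    cx = (B*Ec - 2*C*Dc) / det  # solve grad(conic) = 0 for the center
    cy = (B*Dc - 2*A*Ec) / det
    return cx, cy

# Synthetic "pupil contour": ellipse centered at (3, 2) with axes 5 and 2.
t = np.linspace(0, 2*np.pi, 100)
cx, cy = fit_conic_center(3 + 5*np.cos(t), 2 + 2*np.sin(t))
```

On clean contour points the recovered center matches the true one; for noisy video frames a constrained, ellipse-specific fit is the more robust choice.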
 Recently, with advances in computer science and the internet, online courses such as MOOCs and webinars have become popular educational applications. However, online learners are easily distracted from a course by entertainment applications, so techniques for evaluating learners' concentration have become critical.
 This thesis proposes a gaze tracking system for evaluating concentration on online courses. The system is divided into two parts: a gaze tracking module and a concentration evaluation module. The gaze tracking module uses a fuzzy logic system together with the pupil positions to infer the learner's eye-gaze location; the concentration evaluation module then uses the gaze location to compute the learner's concentration score.
 Experimental results show that the proposed system estimates the gaze location effectively, and the resulting concentration score can help learners stay aware of their learning situation.
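As a concrete illustration of the two modules, here is a minimal rule-based sketch: triangular membership functions map a normalized horizontal pupil position to a screen region, and the score moves up or down depending on whether that region overlaps the learning window. The partition, region names, and step size are hypothetical, not the thesis's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def gaze_region(px):
    """Membership degree of each screen region for a normalized
    pupil x-position px in [0, 1] (hypothetical partition)."""
    return {
        "left":   tri(px, -0.5, 0.0, 0.5),
        "center": tri(px,  0.0, 0.5, 1.0),
        "right":  tri(px,  0.5, 1.0, 1.5),
    }

def update_score(score, px, window=("center",), step=1):
    """Add `step` when the most likely gaze region lies inside the
    learning window; subtract it otherwise."""
    degrees = gaze_region(px)
    region = max(degrees, key=degrees.get)
    return score + step if region in window else score - step
```

For example, a pupil position of 0.5 falls squarely in the hypothetical center region and raises the score, while 0.05 is classified as looking left and lowers it.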
Chapter 1: Introduction 1
1-1 Preface 1
1-2 Literature Review 2
1-3 Research Motivation 4
1-4 Research Methods 4
1-5 Thesis Organization 5
Chapter 2: Theoretical Background of Pupil Tracking and Gaze Detection 7
2-1 Image Preprocessing 7
2-1-1 RGB Images 7
2-1-2 Grayscale Images 8
2-1-3 Binarized Images 9
2-1-4 Histogram Equalization 10
2-2 Eye Tracking 12
2-2-1 Head Pose 13
2-2-2 Simple Moving Average 15
2-3 Pupil Tracking 16
2-3-1 Edge Detection 16
2-3-2 Contour Detection 19
2-3-3 Ellipse Fitting 19
2-4 Fuzzy Inference Systems 21
2-4-1 Rule-Based Fuzzy Systems 21
2-4-2 Membership Functions 22
2-4-3 Fuzzy Inference Systems 23
Chapter 3: Concentration Scoring System 25
3-1 Gaze Detection I: Head Pose 26
3-2 Gaze Detection II: Pupil Position 31
3-2-1 Eye Tracking 32
3-2-2 Pupil Position and Gaze Region 34
3-3 Concentration Scoring Criteria 35
Chapter 4: Experimental Results 37
4-1 Experiment Description 37
4-2 Experimental Results 38
Chapter 5: Conclusions and Future Work 42
5-1 Conclusions 42
5-2 Future Work 42
References 44
Author Biography 47
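Section 2-2-2 of the outline applies a simple moving average, presumably to smooth the tracked eye position across video frames; a minimal sketch, with an arbitrary window size:

```python
from collections import deque

class MovingAverage:
    """Fixed-window simple moving average, e.g. for smoothing a
    per-frame eye-center coordinate."""
    def __init__(self, window):
        self.buf = deque(maxlen=window)

    def update(self, value):
        """Push a new sample and return the mean of the last `window` samples."""
        self.buf.append(value)
        return sum(self.buf) / len(self.buf)

sma = MovingAverage(window=3)
smoothed = [sma.update(v) for v in [1.0, 2.0, 3.0, 4.0]]
```

Each call averages only the samples seen so far until the window fills, which avoids a startup lag at the cost of less smoothing on the first few frames.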
 
 
 
 