
Detailed Record

Author: Wei-Chih Huang (黃偉智)
Title: Nonlinear Dimensionality Reduction with Convolution Distance
Advisor: Jiann-Ming Wu (吳建銘)
Committee members: David Kuo (郭大衛); Dong-Hwa Lu (盧東華)
Degree: Master's
Institution: National Dong Hwa University
Department: Department of Applied Mathematics
Student ID: 610811106
Publication year: 2021 (ROC 110)
Graduating academic year: 2020–2021 (ROC 109)
Language: English
Pages: 24
Keywords: Visualization; Dimensionality reduction; Unsupervised learning; Topology preservation; Clustering; Convolutional neural network
Usage statistics:
  • Recommendations: 0
  • Views: 19
  • Downloads: 6
  • Bookmarks: 0
Abstract: This work explores convolution distances of extremely high-dimensional patterns and proposes a novel clustering approach based on convolution similarity, with application to nonlinear dimensionality reduction. The Euclidean distance contains one term involving an inner product. Convolving a 3-dimensional pattern through another pattern yields many convolutional inner products, each corresponding to an alignment induced by an arbitrary 3-dimensional shift. The largest of these convolutional inner products indicates the best pattern match and measures the maximal similarity. Based on this maximal-similarity criterion, a novel approach is proposed for clustering extremely high-dimensional patterns and for deriving their nearest-neighbor relations. After clustering, the local center of each cluster is kept as its representative, and secondary representatives are defined through the nearest-neighbor relations. On this basis, a support space is constructed via principal component analysis (PCA): the data are reduced to an intermediate k-dimensional space and the mixing distances between representatives are computed. These mixing distances constitute nonlinear constraints on the input images in the target space, each specifying an individual locally nonlinear embedding (LNE) problem, and a single LNE problem can be solved by the Levenberg-Marquardt algorithm.
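The convolution distance and maximal-similarity clustering described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the thesis's implementation: it uses 1-D patterns rather than the 3-D patterns described above, `np.correlate` to enumerate the inner products over all shifts, and a hypothetical greedy, thresholded assignment in place of the thesis's actual clustering criterion; the function names and the `threshold` parameter are assumptions introduced here.

```python
import numpy as np

def max_conv_similarity(a, b):
    # Inner products of `a` against every shifted alignment of `b`;
    # the largest one is the best-match similarity from the abstract.
    # (1-D stand-in for the 3-D shifts used in the thesis.)
    return np.correlate(a, b, mode="full").max()

def conv_distance(a, b):
    # Squared-Euclidean form ||a||^2 + ||b||^2 - 2<a, b>, with the inner
    # product replaced by the maximal convolutional inner product, so the
    # distance is invariant to the best aligning shift.
    return np.sum(a**2) + np.sum(b**2) - 2.0 * max_conv_similarity(a, b)

def cluster_by_similarity(patterns, threshold):
    # Hypothetical greedy clustering: join the most similar existing
    # center if its similarity clears `threshold`, otherwise found a
    # new cluster with this pattern as its center.
    centers, labels = [], []
    for p in patterns:
        sims = [max_conv_similarity(p, c) for c in centers]
        if sims and max(sims) >= threshold:
            labels.append(int(np.argmax(sims)))
        else:
            centers.append(p)
            labels.append(len(centers) - 1)
    return labels, centers
```

A shifted copy of a pattern then has convolution distance zero from the original, which is the alignment-invariance that the clustering exploits.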
Table of Contents:
1. Introduction
2. Methods
2.1 Neighboring relations of centroids in neural modules for NDR mapping
2.2 Convolution distance
2.3 Clustering based on convolution distances
2.4 Second-level clustering and posterior processes
2.5 K nearest neighbors and locally nonlinear embedding
3. Numerical Experiments
3.1 Experiment 1
3.2 Experiment 2
3.3 Experiment 3
3.4 Improvement method
4. Conclusions
References
1. Wu, S.-S.; Jong, S.-J.; Hu, K.; Wu, J.-M. Learning Neural Representations and Local Embedding for Nonlinear Dimensionality Reduction Mapping. Mathematics 2021, 9, 1017. https://doi.org/10.3390/math9091017
2. Roweis, S.; Saul, L. Nonlinear dimensionality reduction by locally linear embedding. Science 2000, 290, 2323–2326.
3. Hinton, G.; Salakhutdinov, R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507.
4. Sorzano, C.O.S.; Vargas, J.; Pascual-Montano, A.D. A Survey of Dimensionality Reduction Techniques. arXiv 2014, arXiv:1403.2877.
5. Afshar, M.; Usefi, H. High-dimensional feature selection for genomic datasets. Knowl. Based Syst. 2020, 206, 106370.
6. Belkin, M.; Niyogi, P. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Advances in Neural Information Processing Systems (NIPS 2001); Dietterich, T., Becker, S., Ghahramani, Z., Eds.; MIT Press: Cambridge, MA, USA, 2002.
7. Hu, R.; Ratner, K.; Ratner, E. ELM-SOM+: A continuous mapping for visualization. Neurocomputing 2019, 365, 147–156.
8. Tasoulis, S.; Pavlidis, N.G.; Roos, T. Nonlinear dimensionality reduction for clustering. Pattern Recognit. 2020, 107, 107508.
9. Becker, M.; Lippel, J.; Stuhlsatz, A.; Zielke, T. Robust dimensionality reduction for data visualization with deep neural networks. Graph. Models 2020, 108, 101060.
10. Ding, J.; Condon, A.; Shah, S.P. Interpretable dimensionality reduction of single cell transcriptome data with deep generative models. Nat. Commun. 2018, 9, 1–13.
11. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
12. Durbin, R.; Willshaw, D. An analogue approach to the travelling salesman problem using an elastic net method. Nature 1987, 326, 689–691.
13. Wu, J.-M. Multilayer Potts perceptrons with Levenberg–Marquardt learning. IEEE Trans. Neural Netw. 2008, 19, 2032–2043.
14. Wu, J.-M.; Hsu, P.-H. Annealed Kullback–Leibler divergence minimization for generalized TSP, spot identification and gene sorting. Neurocomputing 2007, 70, 2228–2240.
 
 
 
 