
Detailed Thesis Record

Author: 林唯德
Author (English): Wei-Te Lin
Thesis Title: 混合可加模型下之變數選取與分群
Thesis Title (English): Variable selection on the mixture of additive quantile regression models
Advisor: 吳韋瑩
Advisor (English): Wei-Ying Wu
Committee Members: 王文廷、曹振海、吳韋瑩
Committee Members (English): Wen-Ting Wang, Chen-Hai Tsao, Wei-Ying Wu
Degree: Master's
University: National Dong Hwa University
Department: Department of Applied Mathematics
Student ID: 610511101
Year of Publication (ROC calendar): 107 (2018)
Graduating Academic Year: 106
Language: English
Number of Pages: 40
Keywords (Chinese): B 樣條、混合性模型、可加性模型、EM 演算法、加權群體 Lasso
Keywords (English): B-spline, mixture model, additive model, EM algorithm, weighted group Lasso
Abstract (Chinese, translated): When the data come from a mixture of additive quantile regression models, directly applying the currently popular quantile regression analysis for model selection can produce unreasonable results. In our study, we therefore attempt to develop an algorithm that simultaneously clusters the data, selects variables, and identifies the component functions of the additive model. In this algorithm, we use B-spline functions to approximate the component functions of the additive model, and perform variable selection and function identification through quantile regression with an adaptively weighted group Lasso penalty. Finally, we use simulated data to discuss the performance of our algorithm.
Abstract (English): When observations come from a mixture of additive quantile regression models, unreasonable variable-selection results can occur if existing quantile regression approaches are applied directly. In this work, we attempt to develop an algorithm that clusters the data, selects relevant variables, and identifies the related structures simultaneously. In the proposed algorithm, B-spline functions are used to approximate the additive model, and quantile regression with a Lasso-type penalty is employed for variable selection and structure detection. The performance of the suggested algorithm is discussed through simulation studies.
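To make the abstract's description concrete, the following is a minimal mathematical sketch in notation of my own choosing (the symbols τ, π̂_{ik}, f_{k,j}, the spline coefficient groups γ_{k,j}, and the weights w_{k,j} are illustrative assumptions, not the thesis's notation). Each mixture component k models the conditional τ-th quantile of the response additively, every component function is approximated by a B-spline expansion, and the coefficients of component k are obtained by minimizing a weighted check-loss criterion with an adaptively weighted group Lasso penalty on each coefficient group, while an EM-type step updates the soft cluster memberships:

$$
\begin{aligned}
& Q_{\tau}\bigl(y \mid \mathbf{x},\ \text{component } k\bigr)
  = \beta_{k0} + \sum_{j=1}^{p} f_{k,j}(x_{j}),
\qquad
  f_{k,j}(x) \approx \sum_{l=1}^{L} \gamma_{k,j,l}\, B_{l}(x), \\
& \min_{\beta_{k0},\ \{\gamma_{k,j}\}}\;
  \sum_{i=1}^{n} \hat{\pi}_{ik}\,
  \rho_{\tau}\!\Bigl( y_{i} - \beta_{k0} - \sum_{j=1}^{p} \mathbf{B}(x_{ij})^{\top} \gamma_{k,j} \Bigr)
  + \lambda \sum_{j=1}^{p} w_{k,j}\, \lVert \gamma_{k,j} \rVert_{2},
\qquad
  \rho_{\tau}(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\end{aligned}
$$

where the B_l are B-spline basis functions, ρ_τ is the quantile check loss, the group norm ‖γ_{k,j}‖_2 can zero out an entire component function when the j-th covariate is irrelevant, π̂_{ik} denotes the soft cluster-membership weight of observation i in component k, and the adaptive weights w_{k,j} relax the penalty on groups with strong preliminary signals. This is only a sketch under the stated assumptions; the estimator, weight construction, and clustering step actually used are those of the MAQR algorithm in Chapter 3.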
Table of Contents

1 Introduction 1
2 Literature Reviews 3
2.1 Quantile and Quantile Regression 3
2.2 Penalty Function 6
2.3 Mixture of Linear Regression Model 7
2.4 Additive Model 9
2.5 B-spline Function 10
3 Main Problem 13
3.1 Mixture of Additive model 13
3.2 Detecting the Function Structure of Additive Model via Quantile Regression 14
3.3 Mixture Additive Model on Quantile Regression (MAQR) Algorithm 16
4 Simulation Study and Real Data 19
4.1 Simulation Study 19
4.2 Boston Housing data 34
5 Conclusion 37
6 References 39
References

Bhattacharya, C. G. A simple method of resolution of a distribution into Gaussian components. Biometrics, 1967, 115-135.

Böhning, Dankmar. Computer-assisted analysis of mixtures and applications: meta-analysis, disease mapping and others. CRC Press, 2000.

Breiman, Leo; Friedman, Jerome H. Estimating optimal transformations for multiple regression and correlation. Journal of the American Statistical Association, 1985, 80.391: 580-598.

Breiman, L. Better subset selection using the non-negative garotte. Technical Report, University of California, Berkeley, 1993.

Faria, Susana; Soromenho, Gilda. Fitting mixtures of linear regressions. Journal of Statistical Computation and Simulation, 2010, 80.2: 201-225.

Hastie, Trevor J.; Tibshirani, Robert J. Generalized additive models, volume 43 of Monographs on Statistics and Applied Probability. 1990.

Honda, Toshio; Ing, Ching-Kang; Wu, Wei-Ying. Adaptively weighted group Lasso for semiparametric quantile regression models. 2017.

Koenker, Roger; Bassett Jr., Gilbert. Regression quantiles. Econometrica: Journal of the Econometric Society, 1978, 33-50.

Lian, Heng. Semiparametric estimation of additive quantile regression models by two-fold penalty. Journal of Business & Economic Statistics, 2012, 30.3: 337-350.

Lindsay, Bruce G. Mixture models: theory, geometry and applications. In: NSF-CBMS regional conference series in probability and statistics. Institute of Mathematical Statistics and the American Statistical Association, 1995. p. i-163.

Quandt, Richard E. A new approach to estimating switching regressions. Journal of the American Statistical Association, 1972, 67.338: 306-310.

Quandt, Richard E.; Ramsey, James B. Estimating mixtures of normal distributions and switching regressions. Journal of the American Statistical Association, 1978, 73.364: 730-738.

Ramsey, James B. Mixtures of distributions and maximum likelihood estimation of parameters contained in finitely bounded compact spaces. 1975.

Schumaker, Larry. Spline functions: basic theory. Cambridge University Press, 2007.

Schumaker, Larry L. Spline functions: computational methods. SIAM, 2015.

Tibshirani, Robert. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society. Series B (Methodological), 1996, 267-288.

Yakowitz, S. Unsupervised learning and the identification of finite mixtures. IEEE Transactions on Information Theory, 1970, 16.3: 330-338.

Young, Tzay; Coraluppi, Giorgio. Stochastic estimation of a mixture of normal density functions using an information criterion. IEEE Transactions on Information Theory, 1970, 16.3: 258-263.

Yuan, Ming; Lin, Yi. Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2006, 68.1: 49-67.

Zou, Hui. The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 2006, 101.476: 1418-1429.
 
 
 
 