1. Anselin, L. (1988). Spatial econometrics: Methods and models (Vol. 4). Springer Science & Business Media.
2. Agterberg, F. (2004). Georges Matheron: Founder of spatial statistics. Earth Sciences History, 23(2), 205-334.
3. Bohling, G. (2005). Introduction to geostatistics and variogram analysis. Kansas Geological Survey, 1(10), 1-20.
4. Biau, G., Devroye, L., & Lugosi, G. (2008). Consistency of random forests and other averaging classifiers. Journal of Machine Learning Research, 9(9).
5. Bowerman, B. L., & O'Connell, R. T. (1979). Time series and forecasting. North Scituate, MA: Duxbury Press.
6. Freund, Y., Schapire, R., & Abe, N. (1999). A short introduction to boosting. Journal of Japanese Society for Artificial Intelligence, 14(5), 771-780.
7. Chen, T., & Guestrin, C. (2016, August). XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 785-794).
8. Cressie, N. (2015). Statistics for spatial data. John Wiley & Sons.
9. Calder, C. A. (2008). A dynamic process convolution approach to modeling ambient particulate matter concentrations. Environmetrics: The Official Journal of the International Environmetrics Society, 19(1), 39-48.
10. Chen, B. K., & Yang, C. Y. (2014). Differences in age-standardized mortality rates for avoidable deaths based on urbanization levels in Taiwan, 1971–2008. International Journal of Environmental Research and Public Health, 11(2), 1776-1793.
11. Chen, W., Li, Y., Reich, B. J., & Sun, Y. (2020). DeepKriging: Spatially dependent deep neural networks for spatial prediction. arXiv preprint arXiv:2007.11972.
12. Dimitrakopoulos, R., & Luo, X. (1994). Spatiotemporal modelling: Covariances and ordinary kriging systems. Dordrecht: Springer Netherlands.
13. Datta, A., Banerjee, S., Finley, A. O., Hamm, N. A., & Schaap, M. (2016). Nonseparable dynamic nearest neighbor Gaussian process models for large spatiotemporal data with an application to particulate matter analysis. The Annals of Applied Statistics, 10(3), 1286.
14. DeVore, R., Hanin, B., & Petrova, G. (2021). Neural network approximation. Acta Numerica, 30, 327-444.
15. Friedman, J. H. (2001). Greedy function approximation: A gradient boosting machine. Annals of Statistics, 29(5), 1189-1232.
16. Hong, B. Z., & Tsao, C. A. (2013). A comparison of random average regression methods.
17. Iranzad, R., Liu, X., Chaovalitwongse, W. A., Hippe, D., Wang, S., Han, J., ... & Bowen, S. (2022). Gradient boosted trees for spatial data and its application to medical imaging data. IISE Transactions on Healthcare Systems Engineering, 12(3), 165-179.
18. McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5, 115-133.
19. Montero, J. M., Fernández-Avilés, G., & Mateu, J. (2015). Spatial and spatiotemporal geostatistical modeling and kriging. John Wiley & Sons.
20. Nielsen, D. (2016). Tree boosting with XGBoost: Why does XGBoost "win" every machine learning competition? (Master's thesis, NTNU).
21. Gopal, S. (2016). Artificial neural networks in geospatial analysis. International Encyclopedia of Geography: People, the Earth, Environment and Technology, 1-7.
22. Grinsztajn, L., Oyallon, E., & Varoquaux, G. (2022). Why do tree-based models still outperform deep learning on typical tabular data? Advances in Neural Information Processing Systems, 35, 507-520.
23. Prokhorenkova, L., Gusev, G., Vorobev, A., Dorogush, A. V., & Gulin, A. (2018). CatBoost: Unbiased boosting with categorical features. Advances in Neural Information Processing Systems, 31.
24. Paciorek, C. J., Yanosky, J. D., Puett, R. C., Laden, F., & Suh, H. H. (2009). Practical large-scale spatio-temporal modeling of particulate matter concentrations. The Annals of Applied Statistics, 370-397.
25. Sigrist, F. (2022). Gaussian process boosting. The Journal of Machine Learning Research, 23(1), 10565-10610.
26. Shwartz-Ziv, R., & Armon, A. (2022). Tabular data: Deep learning is not all you need. Information Fusion, 81, 84-90.
27. Tsao, C. A. (2014). A statistical introduction to ensemble learning methods. Journal of the Chinese Statistical Association, 52(1), 115-132.