Authors: Wu, Jingjing; Xie, Xiaoran
Dates: 2014-08-06; 2014-11-17; 2014-08-06; 2014
Handle: http://hdl.handle.net/11023/1671
Abstract: This M.Sc. thesis focuses on improving the convergence rates of kernel density estimators. Firstly, a bias-reduced kernel density estimator is introduced and investigated. In order to reduce bias, we intuitively subtract an estimated bias term from the ordinary kernel density estimator. Theoretical properties such as bias, variance and mean squared error are investigated for this estimator, and comparisons with the ordinary kernel density estimator and the location-scale kernel density estimator are made. Compared with the ordinary density estimator, this estimator has reduced bias and mean squared error (MSE) of the orders O(h^3) and O(n^(-6/7)), respectively, and even further reduced orders O(h^4) and O(n^(-8/9)), respectively, when the kernel is chosen to be symmetric. Secondly, we propose a geometric extrapolation of the location-scale kernel estimator and a geometric extrapolation of the bias-reduced kernel estimator introduced above. Similarly, we investigate their theoretical properties and compare them with the geometric extrapolation of the ordinary kernel estimator. These results show that among the three geometrically extrapolated kernel estimators, the one based on the bias-reduced kernel estimator has the smallest bias and MSE. The geometric extrapolation of the bias-reduced kernel estimator can improve the convergence rates of the bias and MSE to O(h^6) and O(n^(-12/13)), respectively, for symmetric kernels. The geometric extrapolation of the location-scale kernel estimator can reduce the bias and MSE of the location-scale kernel estimator; however, its bias and MSE converge at a slower rate than those of the geometric extrapolation of the ordinary kernel estimator. In order to assess the finite-sample performance of the proposed estimators, Monte Carlo simulation studies based on small to moderately large samples are carried out. Finally, an analysis of the Old Faithful geyser data is presented to demonstrate the proposed methods. Both the simulation studies and the real data analysis support our theoretical findings.
Language: English
Rights: University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
Subjects: Statistics; Kernel Density Estimation; Geometric Extrapolation; Bias Reduction; Mean Squared Error; Convergence Rate
Title: Some Improvement on Convergence Rates of Kernel Density Estimator
Type: Master thesis
DOI: 10.11575/PRISM/27828
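
Illustrative sketch (not part of the record): the abstract does not give the exact forms of the thesis's estimators, so the following Python sketch only illustrates the general ideas it names, under explicit assumptions. It shows an ordinary Gaussian-kernel density estimator, a plug-in bias-reduced variant that subtracts an estimated leading bias term (h^2/2)*mu2(K)*f''(x), and a classical geometric (multiplicative) extrapolation of two bandwidths; the pilot bandwidth, bandwidth ratio c, sample data and grid are all illustrative assumptions, and none of this should be read as the thesis's actual constructions.

import numpy as np

def kde(x, data, h):
    """Ordinary kernel density estimate with a standard Gaussian kernel."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def kde_second_deriv(x, data, h):
    """Estimate f''(x) by differentiating the Gaussian kernel twice."""
    u = (x[:, None] - data[None, :]) / h
    k2 = (u**2 - 1.0) * np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return k2.sum(axis=1) / (len(data) * h**3)

def bias_reduced_kde(x, data, h, h_pilot=None):
    """Subtract the estimated leading bias term (h^2/2) * mu2(K) * f''(x).

    mu2(K) = 1 for the standard Gaussian kernel. The pilot bandwidth used to
    estimate f'' is an assumption, not the thesis's choice.
    """
    if h_pilot is None:
        h_pilot = 2.0 * h
    return kde(x, data, h) - 0.5 * h**2 * kde_second_deriv(x, data, h_pilot)

def geometric_extrapolation_kde(x, data, h, c=2.0):
    """Geometric (multiplicative) extrapolation of two ordinary KDEs.

    Exponents alpha = c^2/(c^2 - 1) and 1 - alpha cancel the O(h^2) term in
    the bias of log f-hat; this is the classical construction, not
    necessarily the exact combination analysed in the thesis.
    """
    alpha = c**2 / (c**2 - 1.0)
    f1 = np.clip(kde(x, data, h), 1e-12, None)
    f2 = np.clip(kde(x, data, c * h), 1e-12, None)
    return f1**alpha * f2**(1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=300)      # illustrative sample
    grid = np.linspace(-3.0, 3.0, 61)
    h = 0.4                          # illustrative bandwidth
    print(kde(grid, data, h)[:5])
    print(bias_reduced_kde(grid, data, h)[:5])
    print(geometric_extrapolation_kde(grid, data, h)[:5])

In this sketch the bias correction and the geometric combination each target the same O(h^2) leading bias term of the ordinary estimator, which is the mechanism by which the abstract's faster rates, such as O(h^4) bias for symmetric kernels, become attainable.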