Some Improvement on Convergence Rates of Kernel Density Estimator

dc.contributor.advisorWu, Jingjing
dc.contributor.authorXie, Xiaoran
dc.date.accessioned2014-08-06T17:52:11Z
dc.date.available2014-11-17T08:00:38Z
dc.date.issued2014-08-06
dc.date.submitted2014en
dc.description.abstractThis M.Sc. thesis focuses on improving the convergence rates of kernel density estimators. First, a bias-reduced kernel density estimator is introduced and investigated. To reduce bias, we intuitively subtract an estimated bias term from the ordinary kernel density estimator. Theoretical properties such as bias, variance and mean squared error (MSE) are investigated for this estimator, and comparisons with the ordinary kernel density estimator and the location-scale kernel density estimator are made. Compared with the ordinary density estimator, this estimator has reduced bias and MSE of the orders O(h^3) and O(n^(-6/7)) respectively, and even further reduced orders O(h^4) and O(n^(-8/9)) respectively when the kernel is chosen symmetric. Second, we propose a geometric extrapolation of the location-scale kernel estimator and a geometric extrapolation of the bias-reduced kernel estimator introduced above. Similarly, we investigate their theoretical properties and compare them with the geometric extrapolation of the ordinary kernel estimator. These results show that among the three geometric extrapolated kernel estimators, the one based on the bias-reduced kernel estimator has the smallest bias and MSE. The geometric extrapolation of the bias-reduced kernel estimator improves the convergence rates of bias and MSE to O(h^6) and O(n^(-12/13)) respectively for symmetric kernels. The geometric extrapolation of the location-scale kernel estimator reduces the bias and MSE of the location-scale kernel estimator; however, its bias and MSE converge at slower rates than those of the geometric extrapolation of the ordinary kernel estimator. To assess the finite-sample performance of the proposed estimators, Monte Carlo simulation studies based on small to moderately large samples are carried out. Finally, an analysis of the Old Faithful geyser data is presented to demonstrate the proposed methods. Both the simulation studies and the real data analysis consolidate our theoretical findings.en_US
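The abstract's two ingredients can be illustrated with a short sketch: an ordinary Gaussian-kernel density estimator, and the classical geometric extrapolation that combines estimates at bandwidths h and 2h as f_h^(4/3) * f_{2h}^(-1/3) to cancel the leading O(h^2) bias term. This is a minimal illustration of the general idea only, not the thesis's bias-reduced or location-scale estimators; the bandwidth and grid below are arbitrary choices for the example.

```python
import numpy as np

def kde(x, data, h):
    """Ordinary kernel density estimator with a Gaussian kernel.

    f_h(x) = (1/(n*h)) * sum_i K((x - X_i)/h), K the standard normal density.
    """
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def kde_geometric(x, data, h):
    """Geometric extrapolation of the ordinary estimator: combine the
    estimates at bandwidths h and 2h multiplicatively so that the leading
    bias terms cancel (illustrative exponents 4/3 and -1/3)."""
    f_h = kde(x, data, h)
    f_2h = kde(x, data, 2 * h)
    return f_h ** (4.0 / 3.0) * f_2h ** (-1.0 / 3.0)

# Example: estimate a standard normal density from a simulated sample.
rng = np.random.default_rng(0)
sample = rng.standard_normal(500)
grid = np.linspace(-3.0, 3.0, 61)
estimate = kde_geometric(grid, sample, h=0.4)
```

On a sample this size the extrapolated estimate tracks the true N(0, 1) density closely near the mode; the multiplicative (geometric) combination, unlike an additive one, also keeps the estimate nonnegative.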
dc.identifier.citationXie, X. (2014). Some Improvement on Convergence Rates of Kernel Density Estimator (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca. doi:10.11575/PRISM/27828en_US
dc.identifier.doihttp://dx.doi.org/10.11575/PRISM/27828
dc.identifier.urihttp://hdl.handle.net/11023/1671
dc.language.isoeng
dc.publisher.facultyGraduate Studies
dc.publisher.institutionUniversity of Calgaryen
dc.publisher.placeCalgaryen
dc.rightsUniversity of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.subjectStatistics
dc.subject.classificationKernel Density Estimationen_US
dc.subject.classificationGeometric Extrapolationen_US
dc.subject.classificationBias Reductionen_US
dc.subject.classificationMean Squared Erroren_US
dc.subject.classificationConvergence Rateen_US
dc.titleSome Improvement on Convergence Rates of Kernel Density Estimator
dc.typemaster thesis
thesis.degree.disciplineMathematics and Statistics
thesis.degree.grantorUniversity of Calgary
thesis.degree.nameMaster of Science (MSc)
ucalgary.item.requestcopytrue
Files
Original bundle
Name:
ucalgary_2014_xie_xiaoran.pdf
Size:
832.61 KB
Format:
Adobe Portable Document Format
License bundle
Name:
license.txt
Size:
2.65 KB
Format:
Item-specific license agreed to upon submission