ROBUST M KIBRIA-LUKMAN ESTIMATOR FOR LINEAR REGRESSION MODEL WITH OUTLIERS IN THE X-DIRECTION: SIMULATIONS AND APPLICATIONS

Authors

  • Adejumo Taiwo Joel, Department of Statistics, Ladoke Akintola University of Technology, Ogbomoso
  • Ayinde Kayode, Department of Mathematics and Statistics, Northwest Missouri State University, Maryville, Missouri
  • Okegbade Ayobami Ibukun, Department of Statistics, Ladoke Akintola University of Technology, Ogbomoso
  • Akomolafe A. A., Department of Statistics, Federal University of Technology, Akure
  • Oshuporu Opeoluwa Abosede, Department of Statistics, Ladoke Akintola University of Technology, Ogbomoso
  • Koleoso Sunday Olawale, Department of Statistics, Federal University of Technology, Akure

Abstract

The Ordinary Least Squares (OLS) estimator remains the Best Linear Unbiased Estimator (BLUE) when all of its underlying assumptions hold, but even a slight violation of these assumptions renders it inefficient and unstable. Common causes of such violations are multicollinearity and the presence of extreme values (outliers). Recently, Majid et al. (2022) proposed a robust Kibria-Lukman estimator based on the M estimator, but only for outlying cases in the y-direction. Since outliers in the x-direction may be unavoidable in a data set, it becomes imperative to examine the performance of the robust-M Kibria-Lukman (KL-M) estimator as an alternative to previously proposed robust estimators for handling these problems when there are outliers in the x-direction. Through a Monte Carlo experiment, theoretical results under various conditions and factors, and an application to real-life data, the new estimator outperformed the other estimators considered in this study in the presence of multicollinearity and extreme values in the x-direction. As the error variance (σ²), the level of multicollinearity (rho), and the percentage (px) and magnitude (mx) of outliers increase, the Mean Square Errors (MSEs) of the estimators considered increase; meanwhile, the MSEs of the estimators decrease as the sample size (n) increases. When rho > 0 and mx > 0, and as px and the sample size (n) increase, KL-M, alongside the ordinary Kibria-Lukman (KL) estimator, outperformed the other estimators when the two anomalies occurred simultaneously; KL-M performed particularly well when the sample size was n = 100. Conclusively, across the different biasing parameters of the estimators, KL-M performed better than the other estimators considered in the study, and real-life data were used to affirm this claim.
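For readers unfamiliar with the estimators being compared, the following is a minimal Python sketch, not the authors' exact simulation design. It assumes the standard Kibria-Lukman form β̂_KL = (X′X + kI)⁻¹(X′X − kI)β̂_OLS, with the KL-M variant obtained by substituting an M-estimate for β̂_OLS; the equicorrelated design, the Huber norm, and the Hoerl-Kennard-style biasing parameter k are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Simulate one dataset: correlated predictors with outliers in the x-direction.
n, p, rho = 100, 4, 0.95            # sample size, predictors, multicollinearity level
beta_true = np.ones(p)

# Equicorrelated design matrix with correlation coefficient rho (assumed structure)
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)

# Contaminate a fraction px of rows with high-leverage points of magnitude mx
px, mx = 0.10, 10.0
idx = rng.choice(n, size=int(px * n), replace=False)
X[idx] += mx

y = X @ beta_true + rng.standard_normal(n)

# Candidate estimators
beta_ols = sm.OLS(y, X).fit().params                            # ordinary least squares
beta_m = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params  # Huber M-estimator

# Biasing parameter k: a Hoerl-Kennard-style choice, assumed here for illustration;
# the study compares the estimators across different biasing parameters.
sigma2_hat = np.sum((y - X @ beta_m) ** 2) / (n - p)
k = sigma2_hat / np.max(beta_m ** 2)

# A = (X'X + kI)^{-1} (X'X - kI), the common KL shrinkage matrix
XtX = X.T @ X
A = np.linalg.solve(XtX + k * np.eye(p), XtX - k * np.eye(p))
beta_kl = A @ beta_ols   # Kibria-Lukman (KL) estimator
beta_klm = A @ beta_m    # robust-M Kibria-Lukman (KL-M) estimator

for name, b in [("OLS", beta_ols), ("M", beta_m), ("KL", beta_kl), ("KL-M", beta_klm)]:
    print(f"{name:5s} squared estimation error = {np.sum((b - beta_true) ** 2):.4f}")
```

Averaging the squared estimation error over many replications of this experiment, while varying σ², rho, px, mx, and n, reproduces the kind of MSE comparison described in the abstract.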

Published

2024-06-30

Issue

Section

ARTICLES