Shapley Value Regression
Introduction:
In the econometric literature, multicollinearity is defined as the incidence of a high degree of correlation among some or all regressor variables. Strong multicollinearity has deleterious effects on the confidence intervals of linear regression coefficients (β in the linear regression model y=Xβ+u). Although it affects neither the explanatory power (R²) of the regressors nor the unbiasedness of the estimated coefficients, it inflates their standard errors of estimate, rendering tests of hypotheses misleading or paradoxical: R² may be very high while the individual coefficients all have poor Student's t-values. Thus, strong multicollinearity may lead to a failure to reject a false null hypothesis that a regressor variable has no effect on the regressand variable (a type II error). Very frequently, it also affects the sign of the regression coefficients. However, it has been pointed out that a high degree of correlation (measured in terms of a large condition number; Belsley et al., 1980) among some or all regressor variables alone, unsupported by a large error variance in the regressand variable y, has little effect on the precision of the regression coefficients. A large condition number coupled with a large error variance in the regressand variable destabilizes the regression estimator; either of the two in isolation cannot cause much harm, although the condition number is relatively more potent in determining the stability of the estimated regression coefficients (Mishra, 2004-a).
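The effect described above is easy to demonstrate numerically. The sketch below (a minimal illustration, not from the original text; the simulated data and coefficient values are assumptions chosen for the example) builds two nearly collinear regressors, then reports the condition number of the regressor matrix and the OLS standard errors, which come out greatly inflated even though the fit itself is good:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two nearly collinear regressors: x2 is x1 plus a little noise.
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Condition number of the regressor matrix (Belsley et al., 1980,
# work with a scaled matrix; the raw condition number suffices here).
cond = np.linalg.cond(X)

# OLS fit and the estimated standard errors of the coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))

print(cond)  # large condition number: strong multicollinearity
print(se)    # slope standard errors are inflated, so t-values are poor
```

Despite the inflated standard errors, the fitted values and R² are essentially unaffected, which is exactly the paradox noted above.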
Shapley value regression:
This is an entirely different strategy for assessing the contribution of regressor variables to the regressand variable. It owes its origin to the theory of cooperative games (Shapley, 1953). The value of R² obtained by fitting a linear regression model y=Xβ+u is considered the value of a cooperative game played by X (whose members, xj ∈ X; j=1, ..., m, work in a coalition) against y (explaining it). The analyst does not have enough information to disentangle the contributions made by the individual members xj ∈ X; j=1, ..., m; only their joint contribution (R²) is known. The Shapley value decomposition imputes the most likely contribution of each individual xj ∈ X; j=1, ..., m, to R².
An algorithm to impute the contribution of individual variables to Shapley value:
Let there be m regressor variables in the model y=Xβ+u. Let X(p, r) be an r-membered subset of X in which the pth regressor appears, and X(q, r) be an r-membered subset of X in which the pth regressor does not appear. Further, let R²(p, r) be the R² obtained by regressing y on X(p, r) and R²(q, r) be the R² obtained by regressing y on X(q, r). Then the share of the regressor variable p (that is, xp ∈ X) is given by

S(p) = (1/m) Σ_{r=1,...,m} (1/k) Σ [ R²(p, r) − R²(q, r−1) ],   where X(q, r−1) = X(p, r) − {xp}.

Here k is the number of cases in which the evaluation in [.] was carried out at each r (that is, the number of r-membered subsets containing xp). The sum of all S(p) for p=1, ..., m is the R² of y=Xβ+u on all xj ∈ X, or the total value of the game: Σ_{p=1,...,m} S(p) = R².
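The algorithm above can be sketched directly by enumerating subsets. The code below (a minimal sketch; the simulated data, function names, and the standard combinatorial Shapley weights r!(m−r−1)!/m! are assumptions used for illustration, equivalent to the average-over-subset-sizes formulation in the text) computes each regressor's share of R² and checks that the shares sum to the full-model R²:

```python
import numpy as np
from itertools import combinations
from math import factorial

def r2(X, y, cols):
    """R² from OLS of y on the columns listed in `cols` (plus an intercept)."""
    n = len(y)
    Z = np.column_stack([np.ones(n)] + [X[:, j] for j in cols]) if cols else np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - resid @ resid / tss

def shapley_r2(X, y):
    """Shapley decomposition of R²: each regressor's share is its marginal
    contribution to R², averaged over all coalitions it can join."""
    m = X.shape[1]
    shares = np.zeros(m)
    for p in range(m):
        others = [j for j in range(m) if j != p]
        for r in range(m):  # size of the coalition xp joins
            w = factorial(r) * factorial(m - r - 1) / factorial(m)
            for S in combinations(others, r):
                shares[p] += w * (r2(X, y, list(S) + [p]) - r2(X, y, list(S)))
    return shares

# Illustrative data (assumed for the example): three regressors, known betas.
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, 2.0, 0.5]) + rng.normal(size=n)

S = shapley_r2(X, y)
print(S)          # imputed share of each regressor
print(S.sum())    # equals the R² of the full model (the value of the game)
```

Because every subset regression must be run, the enumeration costs 2^m fits, which is why exact Shapley value regression is only practical for a modest number of regressors.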