




1. Parallels with Simple Regression
- $\beta_0$ is still the intercept.
- $\beta_1$ to $\beta_k$ are all called slope parameters.
- $u$ is still the error term (or disturbance).
- We still need a zero conditional mean assumption, so now assume that $E(u|x_1, x_2, \dots, x_k) = 0$.
- We are still minimizing the sum of squared residuals, so there are $k+1$ first order conditions.

2. Interpreting Multiple Regression
The fitted line is $\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2 + \dots + \hat\beta_k x_k$, so $\Delta\hat y = \hat\beta_1 \Delta x_1 + \hat\beta_2 \Delta x_2 + \dots + \hat\beta_k \Delta x_k$. Holding $x_2, \dots, x_k$ fixed implies that $\Delta\hat y = \hat\beta_1 \Delta x_1$, that is, each $\hat\beta_j$ has a ceteris paribus interpretation.

3. A "Partialling Out" Interpretation
Consider the case where $k = 2$, i.e. $\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$. Then
$$\hat\beta_1 = \frac{\sum_{i=1}^{n} \hat r_{i1} y_i}{\sum_{i=1}^{n} \hat r_{i1}^2},$$
where the $\hat r_{i1}$ are the residuals from the estimated regression $\hat x_1 = \hat\gamma_0 + \hat\gamma_2 x_2$.

4. "Partialling Out" continued
- The previous equation implies that regressing $y$ on $x_1$ and $x_2$ gives the same effect of $x_1$ as regressing $y$ on the residuals from a regression of $x_1$ on $x_2$.
- This means only the part of $x_{i1}$ that is uncorrelated with $x_{i2}$ is being related to $y_i$, so we are estimating the effect of $x_1$ on $y$ after $x_2$ has been "partialled out".

5. Simple vs Multiple Regression Estimates
Compare the simple regression $\tilde y = \tilde\beta_0 + \tilde\beta_1 x_1$ with the multiple regression $\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$. Generally $\tilde\beta_1 \neq \hat\beta_1$, unless $\hat\beta_2 = 0$ (i.e., there is no partial effect of $x_2$) OR $x_1$ and $x_2$ are uncorrelated in the sample.
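The partialling-out result and the simple-versus-multiple comparison above are easy to check numerically. The following is a minimal sketch using simulated data and plain numpy least squares; the seed, data-generating values, and variable names are illustrative assumptions, not part of the original slides.

```python
import numpy as np

# Simulated data for illustration (all numeric values here are arbitrary assumptions)
rng = np.random.default_rng(0)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)          # x1 correlated with x2
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

def ols(X, y):
    """OLS coefficients via least squares; X should include a constant column."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

const = np.ones(n)

# Multiple regression of y on x1 and x2
b_multi = ols(np.column_stack([const, x1, x2]), y)

# Partialling out: residuals r1 from regressing x1 on x2, then beta1-hat = sum(r1*y)/sum(r1^2)
g = ols(np.column_stack([const, x2]), x1)
r1 = x1 - np.column_stack([const, x2]) @ g
b1_partial = (r1 @ y) / (r1 @ r1)

# Simple regression of y on x1 alone
b_simple = ols(np.column_stack([const, x1]), y)

print(b_multi[1], b1_partial)   # identical: the partialled-out slope equals beta1-hat
print(b_simple[1])              # differs, because x1 and x2 are correlated in the sample
```

With correlated regressors the simple-regression slope differs from the multiple-regression slope, exactly as slide 5 states; if `x1` and `x2` were generated independently, the two would coincide (up to sampling noise).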
6. Goodness-of-Fit
We can think of each observation as being made up of an explained part and an unexplained part, $y_i = \hat y_i + \hat u_i$. We then define the following:
- $\text{SST} = \sum (y_i - \bar y)^2$ is the total sum of squares
- $\text{SSE} = \sum (\hat y_i - \bar y)^2$ is the explained sum of squares
- $\text{SSR} = \sum \hat u_i^2$ is the residual sum of squares
Then SST = SSE + SSR.

7. Goodness-of-Fit (continued)
- How do we think about how well our sample regression line fits our sample data?
- We can compute the fraction of the total sum of squares (SST) that is explained by the model, and call this the R-squared of the regression:
- $R^2 = \text{SSE}/\text{SST} = 1 - \text{SSR}/\text{SST}$

8. Goodness-of-Fit (continued)
We can also think of $R^2$ as being equal to the squared correlation coefficient between the actual $y_i$ and the fitted values $\hat y_i$:
$$R^2 = \frac{\left[\sum (y_i - \bar y)(\hat y_i - \bar{\hat y})\right]^2}{\left[\sum (y_i - \bar y)^2\right]\left[\sum (\hat y_i - \bar{\hat y})^2\right]}.$$

9. More about R-squared
- $R^2$ can never decrease when another independent variable is added to a regression, and it usually will increase.
- Because $R^2$ usually increases with the number of independent variables, it is not a good way to compare models.
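The decomposition SST = SSE + SSR, the squared-correlation view of $R^2$, and the fact that $R^2$ cannot fall when a regressor is added can all be verified in a few lines. This is a small sketch under assumed, made-up data; the function name `fit_r2` is introduced here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 + rng.normal(size=n)      # x2 is irrelevant by construction

def fit_r2(X, y):
    """Return R^2 and fitted values from an OLS fit (X includes a constant column)."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    yhat = X @ b
    sst = np.sum((y - y.mean()) ** 2)
    sse = np.sum((yhat - y.mean()) ** 2)
    ssr = np.sum((y - yhat) ** 2)
    assert np.isclose(sst, sse + ssr)         # SST = SSE + SSR
    return sse / sst, yhat

const = np.ones(n)
r2_small, yhat = fit_r2(np.column_stack([const, x1]), y)
r2_big, _ = fit_r2(np.column_stack([const, x1, x2]), y)

# R^2 equals the squared correlation between actual y and fitted y-hat
print(np.isclose(r2_small, np.corrcoef(y, yhat)[0, 1] ** 2))
# Adding x2 (even though it is irrelevant) cannot lower R^2
print(r2_big >= r2_small)
```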
10. Assumptions for Unbiasedness
- The population model is linear in parameters: $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u$.
- We can use a random sample of size $n$, $\{(x_{i1}, x_{i2}, \dots, x_{ik}, y_i): i = 1, 2, \dots, n\}$, from the population model, so that the sample model is $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \dots + \beta_k x_{ik} + u_i$.
- $E(u|x_1, x_2, \dots, x_k) = 0$, implying that all of the explanatory variables are exogenous.
- None of the $x$'s is constant, and there are no exact linear relationships among them.

11. Too Many or Too Few Variables
- What happens if we include variables in our specification that don't belong? There is no effect on unbiasedness: OLS remains unbiased (see the sketch below).
- What if we exclude a variable from our specification that does belong? OLS will usually be biased.
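A quick Monte Carlo sketch of the first claim: including an irrelevant regressor leaves the estimate of $\beta_1$ centered on the true value. All parameter values, the seed, and the number of replications are assumptions made purely for illustration.

```python
import numpy as np

# Monte Carlo sketch: include an irrelevant x2 in the regression
rng = np.random.default_rng(2)
n, reps = 200, 2000
beta1 = 2.0                                   # true slope on x1; x2 truly has zero effect

estimates = []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.6 * x1 + rng.normal(size=n)        # correlated with x1, but not in the true model
    y = 1.0 + beta1 * x1 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    estimates.append(b[1])

# The average estimate is close to the true beta1: OLS stays unbiased,
# although the extra regressor inflates the sampling variance (see slide 22 below).
print(np.mean(estimates), beta1)
```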
12. Omitted Variable Bias
Suppose the true model is given as $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u$, but we estimate $\tilde y = \tilde\beta_0 + \tilde\beta_1 x_1$. Then
$$\tilde\beta_1 = \frac{\sum (x_{i1} - \bar x_1)\, y_i}{\sum (x_{i1} - \bar x_1)^2}.$$

13. Omitted Variable Bias (cont)
Recall the true model, so that $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + u_i$, so the numerator becomes
$$\sum (x_{i1} - \bar x_1)(\beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + u_i) = \beta_1 \sum (x_{i1} - \bar x_1)^2 + \beta_2 \sum (x_{i1} - \bar x_1)\, x_{i2} + \sum (x_{i1} - \bar x_1)\, u_i.$$

14. Omitted Variable Bias (cont)
$$\tilde\beta_1 = \beta_1 + \beta_2 \frac{\sum (x_{i1} - \bar x_1)\, x_{i2}}{\sum (x_{i1} - \bar x_1)^2} + \frac{\sum (x_{i1} - \bar x_1)\, u_i}{\sum (x_{i1} - \bar x_1)^2}.$$
Since $E(u_i) = 0$, taking expectations we have
$$E(\tilde\beta_1) = \beta_1 + \beta_2 \frac{\sum (x_{i1} - \bar x_1)\, x_{i2}}{\sum (x_{i1} - \bar x_1)^2}.$$

15. Omitted Variable Bias (cont)
Consider the regression of $x_2$ on $x_1$: $\tilde x_2 = \tilde\delta_0 + \tilde\delta_1 x_1$, where $\tilde\delta_1 = \dfrac{\sum (x_{i1} - \bar x_1)\, x_{i2}}{\sum (x_{i1} - \bar x_1)^2}$, so
$$E(\tilde\beta_1) = \beta_1 + \beta_2 \tilde\delta_1.$$

16. Summary of Direction of Bias

|                 | $\text{Corr}(x_1, x_2) > 0$ | $\text{Corr}(x_1, x_2) < 0$ |
|-----------------|-----------------------------|-----------------------------|
| $\beta_2 > 0$   | Positive bias               | Negative bias               |
| $\beta_2 < 0$   | Negative bias               | Positive bias               |
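The bias expression $E(\tilde\beta_1) = \beta_1 + \beta_2 \tilde\delta_1$ and the sign pattern in the table can be checked by simulation. This is a sketch under assumed, illustrative parameter values (here $\beta_2 < 0$ and $\text{Corr}(x_1, x_2) > 0$, so the table predicts negative bias).

```python
import numpy as np

# Monte Carlo sketch of omitted variable bias (all parameter values are illustrative)
rng = np.random.default_rng(3)
n, reps = 200, 2000
beta1, beta2 = 2.0, -3.0
delta1 = 0.5                                   # population slope from regressing x2 on x1

short_estimates = []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = delta1 * x1 + rng.normal(size=n)      # x2 positively correlated with x1
    y = 1.0 + beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    X_short = np.column_stack([np.ones(n), x1])            # x2 omitted
    b_tilde = np.linalg.lstsq(X_short, y, rcond=None)[0]
    short_estimates.append(b_tilde[1])

# E(beta1-tilde) = beta1 + beta2 * delta1 = 2.0 + (-3.0)(0.5) = 0.5: negative bias,
# matching the table row beta2 < 0 with Corr(x1, x2) > 0.
print(np.mean(short_estimates), beta1 + beta2 * delta1)
```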
17. Omitted Variable Bias Summary
- There are two cases where the bias is equal to zero: $\beta_2 = 0$, that is, $x_2$ doesn't really belong in the model; or $x_1$ and $x_2$ are uncorrelated in the sample.
- If the correlation between $x_2$ and $x_1$ and the correlation between $x_2$ and $y$ go in the same direction, the bias will be positive.
- If the correlation between $x_2$ and $x_1$ and the correlation between $x_2$ and $y$ go in opposite directions, the bias will be negative.

18. The More General Case
- Technically, we can only sign the bias for the more general case if all of the included $x$'s are uncorrelated.
- Typically, then, we work through the bias assuming the $x$'s are uncorrelated, as a useful guide even if this assumption is not strictly true.

19. Variance of the OLS Estimators
- Now we know that the sampling distribution of our estimator is centered around the true parameter.
- We want to think about how spread out this distribution is.
- It is much easier to think about this variance under an additional assumption, so assume $\text{Var}(u|x_1, x_2, \dots, x_k) = \sigma^2$ (homoskedasticity).
20. Variance of OLS (cont)
- Let $\mathbf{x}$ stand for $(x_1, x_2, \dots, x_k)$.
- Assuming that $\text{Var}(u|\mathbf{x}) = \sigma^2$ also implies that $\text{Var}(y|\mathbf{x}) = \sigma^2$.
- The four assumptions for unbiasedness, plus this homoskedasticity assumption, are known as the Gauss-Markov assumptions.

21. Variance of OLS (cont)
Given the Gauss-Markov assumptions,
$$\text{Var}(\hat\beta_j) = \frac{\sigma^2}{\text{SST}_j\,(1 - R_j^2)},$$
where $\text{SST}_j = \sum (x_{ij} - \bar x_j)^2$ and $R_j^2$ is the $R^2$ from regressing $x_j$ on all the other $x$'s.

22. Components of OLS Variances
- The error variance: a larger $\sigma^2$ implies a larger variance for the OLS estimators.
- The total sample variation: a larger $\text{SST}_j$ implies a smaller variance for the estimators.
- Linear relationships among the independent variables: a larger $R_j^2$ implies a larger variance for the estimators.
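The variance formula in slide 21 agrees exactly with the familiar matrix expression $\sigma^2 (X'X)^{-1}$. Below is a sketch confirming this for one assumed draw of regressors; the error variance and data-generating values are made up for illustration.

```python
import numpy as np

# Sketch: check Var(beta1-hat) = sigma^2 / (SST_1 * (1 - R_1^2))
rng = np.random.default_rng(4)
n = 300
sigma2 = 4.0                                       # illustrative error variance
x2 = rng.normal(size=n)
x1 = 0.7 * x2 + rng.normal(size=n)                 # some collinearity with x2
X = np.column_stack([np.ones(n), x1, x2])

# Conditional-on-X variance matrix of the OLS estimator: sigma^2 * (X'X)^{-1}
var_beta = sigma2 * np.linalg.inv(X.T @ X)
var_b1_direct = var_beta[1, 1]

# Same quantity via SST_1 and R_1^2 (regress x1 on the other regressors)
sst1 = np.sum((x1 - x1.mean()) ** 2)
g = np.linalg.lstsq(np.column_stack([np.ones(n), x2]), x1, rcond=None)[0]
resid = x1 - np.column_stack([np.ones(n), x2]) @ g
r2_1 = 1.0 - np.sum(resid ** 2) / sst1
var_b1_formula = sigma2 / (sst1 * (1.0 - r2_1))

print(np.isclose(var_b1_direct, var_b1_formula))   # True: the two expressions agree
```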
23. Misspecified Models
Consider again the misspecified model $\tilde y = \tilde\beta_0 + \tilde\beta_1 x_1$, so that $\text{Var}(\tilde\beta_1) = \sigma^2/\text{SST}_1$. Thus, $\text{Var}(\tilde\beta_1) < \text{Var}(\hat\beta_1)$ unless $x_1$ and $x_2$ are uncorrelated, in which case the two variances are the same.

24. Misspecified Models (cont)
- While the variance of the estimator is smaller for the misspecified model, unless $\beta_2 = 0$ the misspecified model is biased.
- As the sample size grows, the variance of each estimator shrinks to zero, making the variance difference less important.
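The variance comparison in slide 23 can be illustrated directly, using the fact that for a single additional regressor $R_1^2$ is just the squared sample correlation between $x_1$ and $x_2$. This sketch uses assumed values and simply contrasts the two conditional variances.

```python
import numpy as np

# Sketch: the misspecified (short) regression has the smaller conditional variance
rng = np.random.default_rng(5)
n = 300
sigma2 = 4.0
x2 = rng.normal(size=n)
x1 = 0.7 * x2 + rng.normal(size=n)

sst1 = np.sum((x1 - x1.mean()) ** 2)
r12 = np.corrcoef(x1, x2)[0, 1] ** 2               # R_1^2 from regressing x1 on x2 (with constant)

var_short = sigma2 / sst1                          # Var(beta1-tilde) in the model with x1 only
var_long = sigma2 / (sst1 * (1.0 - r12))           # Var(beta1-hat) in the model with x1 and x2

# var_short <= var_long, with equality only when x1 and x2 are uncorrelated in the sample;
# the price of the short regression is the omitted variable bias unless beta2 = 0.
print(var_short, var_long, var_short <= var_long)
```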