From Additive Models to Backfitting


Original source: http://zhfuzh.blog.163.com/blog/static/14553938720128995950696/


Some notes on the original text:

Here the smoothing function for each component is chosen by the user (it can be specified arbitrarily), but it must be centered so that its mean over the sample is zero; that is, the expectation of that component function is 0. This is the usual identifiability constraint for additive models.
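As a minimal illustration of this constraint, centering amounts to subtracting the sample mean from the fitted values; the sketch below uses made-up fitted values (the function and names are my own, not from the original text):

```python
import numpy as np

# Hypothetical fitted values of one additive component at 75 sample points
# (any smoother's output could stand in here).
g_hat = np.exp(-np.linspace(-2.5, 2.5, 75))

# Enforce the identifiability constraint: center so the sample mean is zero.
g_centered = g_hat - g_hat.mean()
print(abs(g_centered.mean()))  # numerically zero
```

Without this centering, each component function is only identified up to an additive constant.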

Example:
Figure 8.1: Estimated (solid lines) versus true additive component functions (circles at the input values). [Figure image SPMdemob1.ps not reproduced.]

Consider a regression problem with four input variables $X_1$ to $X_4$. When $n$ is small, it is difficult to obtain a precise nonparametric kernel estimate due to the curse of dimensionality. Let us take a sample of $n=75$ regression values. We use explanatory variables that are independent and uniformly distributed on $[-2.5,2.5]$ and responses generated from the additive model

$\displaystyle Y= \sum_{\alpha=1}^4 g_\alpha (X_\alpha) + \varepsilon,\quad\varepsilon \sim N(0,1).$

The component functions are chosen as

\begin{displaymath}\begin{array}{ll} g_1(X_1) = -\sin(2X_1), &\quad g_2(X_2) = X_2^2 - E(X_2^2),\\ g_3(X_3) = X_3, &\quad g_4(X_4) = \exp(-X_4)-E\{\exp(-X_4)\}.\end{array}\end{displaymath}
These are the user-specified smooth component functions.

In Figure 8.1 we have plotted the true functions (at the corresponding observations $X_{i\alpha}$) and the estimated curves. We used backfitting with univariate local linear smoothers and set the bandwidth to $h=1$ for each dimension (using the quartic kernel). We see that even for this small sample size the estimator gives rather precise results.
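The procedure described above can be sketched as follows. This is a minimal, self-contained Python sketch (not the code behind Figure 8.1): it simulates the model with $n=75$ and the component functions given above, then runs backfitting with univariate local linear smoothers using the quartic kernel and bandwidth $h=1$. All function and variable names here are my own.

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Univariate local linear smoother with the quartic (biweight) kernel."""
    d = x[None, :] - x0[:, None]                 # (m, n) centered design
    u = d / h
    w = np.where(np.abs(u) < 1, (15 / 16) * (1 - u ** 2) ** 2, 0.0)
    # Weighted least squares fit of a line at each evaluation point.
    s0, s1, s2 = w.sum(1), (w * d).sum(1), (w * d ** 2).sum(1)
    b0, b1 = (w * y).sum(1), (w * d * y).sum(1)
    return (s2 * b0 - s1 * b1) / (s0 * s2 - s1 ** 2)

def backfit(X, y, h=1.0, n_iter=25):
    """Backfitting for the additive model Y = c + sum_a g_a(X_a) + eps."""
    n, p = X.shape
    c = y.mean()
    g = np.zeros((n, p))
    for _ in range(n_iter):
        for a in range(p):
            resid = y - c - g.sum(1) + g[:, a]   # partial residuals for g_a
            g[:, a] = local_linear(X[:, a], resid, X[:, a], h)
            g[:, a] -= g[:, a].mean()            # zero-mean identifiability
    return c, g

# Simulate the example: n = 75, X_a independent and uniform on [-2.5, 2.5].
rng = np.random.default_rng(1)
n = 75
X = rng.uniform(-2.5, 2.5, size=(n, 4))
g_true = np.column_stack([
    -np.sin(2 * X[:, 0]),
    X[:, 1] ** 2 - np.mean(X[:, 1] ** 2),
    X[:, 2],
    np.exp(-X[:, 3]) - np.mean(np.exp(-X[:, 3])),
])
y = g_true.sum(1) + rng.normal(size=n)

c_hat, g_hat = backfit(X, y, h=1.0)
```

The centering step inside the loop is the same zero-mean constraint noted earlier; each pass cycles through the dimensions, smoothing the partial residuals against one covariate at a time until the fitted components stabilize.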

Next, we turn to a real data example demonstrating the use of additive regression estimators in practice and showing that even for a high-dimensional data set the backfitting estimator works reasonably well.

