
Talk:Partition of sums of squares


Sum of squares


This article should be merged with some others, but least squares is not the right one. Michael Hardy 23:51, 26 July 2006 (UTC)[reply]

I agree; Least squares is about so much more.
How about merging Sum of squares into Variance?
Reasons: The Sum of squares article is a lot like Variance, but makes a few good points which Variance could make better:

  1. Sum of squares gives a more patient explanation of where the term Deviation comes from.
  2. Sum of squares explains in more depth that the reason for dividing the sum of squares by n or n-1 is to keep the variance from growing linearly as more samples are gathered.
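
Concretely (a sketch in standard notation, not tied to either article's wording): the sum of squared deviations $\sum_{i=1}^{n} (y_i - \bar{y})^2$ keeps growing as observations are added, whereas the sample variance $s^2 = \frac{1}{n-1}\sum_{i=1}^{n} (y_i - \bar{y})^2$ scales it by the sample size and so remains an estimate of a fixed population quantity.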

JEBrown87544 17:39, 27 July 2006 (UTC)[reply]


No, not variance. Least squares is NOT about "so much more". A whole book can be written on "sum of squares", and was--a fairly long one, by Henry Scheffe. If it's to be merged into something, maybe it should be analysis of variance. I'll be back to add some material.... Michael Hardy 21:16, 27 July 2006 (UTC)[reply]


You don't believe that Sum of squares overlaps with the Least squares and regression analysis section of Least squares?
In my experience, when people use the term "sums of squares", they use it in reference to ANOVA or to fitting a regression model using the Least squares method; when talking about variation in general, they instead use terms like "random variable", "standard deviation", "variance", or "range".
Personally, I'd expect an article on sums of squares to discuss the following instead of variance:
  • Sums of squares due to regression
  • Sums of squares due to error
  • Total sum of squares
  • Sums of squares due to a given ANOVA factor
  • The relationship between the above and ANOVA and regression models
How about this as a counter-proposal?:
  • Merge the current content of Sum of squares into Variance
  • Rewrite Sum of squares to discuss the role of sums of squares in ANOVA and in the least squares method of fitting regression models
--DanielPenfield 21:34, 27 July 2006 (UTC)[reply]
I like your counter-proposal, I came here looking for information about ANOVA...Zath42 (talk) 21:26, 21 September 2009 (UTC)[reply]
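
For reference, the relationship mentioned in the list above is the usual partition (a sketch assuming a fitted model that includes a constant term, writing $\hat{y}_i$ for the fitted values and $\bar{y}$ for the sample mean): $\sum_{i=1}^{n} (y_i - \bar{y})^2 = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2 + \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$, i.e. total sum of squares = sum of squares due to regression + sum of squares due to error.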

I don't think this should get merged with least squares. The least squares article could reasonably neglect some topics that belong here: how to partition the sum of squares in complicated experimental designs. Michael Hardy 17:49, 3 October 2006 (UTC)[reply]

page needs a rewrite.


This page is so narrowly focused as to be an embarrassment. I've added the expert needed tag -- don't have time to rewrite myself (way too many other stat-related pages need rewriting). In the meantime, anyone interested in really learning about sums-of-squares should find a good book -- Scheffe mentioned above, books on experimental design or ANOVA, or to really understand the mathematics, books on the geometry of least squares and linear models.

--Zaqrfv (talk) 21:15, 13 August 2008 (UTC)[reply]

Root sum of squares


Is there a name for the root sum of squares? Like "geometric mean" except not that, obviously. Geometric sum or something? Is there a Wikipedia article on it? —Preceding unsigned comment added by 71.167.61.18 (talk) 19:16, 4 November 2009 (UTC)[reply]

Are you thinking of "root mean square"? --Doctorhook (talk) 21:13, 12 March 2010 (UTC)[reply]
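
For what it's worth, the root sum of squares $\sqrt{\sum_{i} x_i^2}$ is just the Euclidean norm of the values, whereas the root mean square divides by $n$ before taking the root: $\sqrt{\tfrac{1}{n}\sum_{i} x_i^2}$.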

sum of squares


I was not able to understand what the sum of squares is all about... —Preceding unsigned comment added by 121.54.117.246 (talk) 12:03, 17 February 2010 (UTC)[reply]

Sums of squares and cross-products matrix


This article could usefully be expanded to include a section on the matrix generalization of sums of squares to XTX, the sums of squares and cross-products matrix. Or should that be a separate article? --Qwfp (talk) 11:45, 20 February 2011 (UTC)[reply]
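
For reference, with $X$ the $n \times p$ data matrix, the sums of squares and cross-products matrix is $X^{\mathsf{T}}X$, whose $(j,k)$ entry is $\sum_{i=1}^{n} x_{ij} x_{ik}$: the diagonal holds the sums of squares and the off-diagonal entries hold the cross-products (computed from mean-centred columns when deviations from the means are wanted).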


Requested move

The following discussion is an archived discussion of the proposal. Please do not modify it. Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section.

Page moved to Sum of squares (statistics). Vegaswikian (talk) 03:50, 4 March 2011 (UTC)[reply]

Sum of squares → Decomposition of sums of squares — To include specific topic of article in line with discussion on Talk page, and so avoid expansion into other related topics already in other articles. This might be resisted on the grounds of the length of the proposed name, or otherwise. Melcombe (talk) 09:48, 25 February 2011 (UTC)[reply]

Survey

Feel free to state your position on the renaming proposal by beginning a new line in this section with *'''Support''' or *'''Oppose''', then sign your comment with ~~~~. Since polling is not a substitute for discussion, please explain your reasons, taking into account Wikipedia's policy on article titles.
  • Comment if moved, it definitely should be replaced by a disambiguation page. That hatnote is enormous. And there's least squares from the first section of this talk page, that is missing from the hatnote as well. Not to mention the method of integration... 65.93.15.125 (talk) 12:31, 25 February 2011 (UTC)[reply]
  • Partition of sums of squares or Partition of a sum of squares could also be considered. So could Sum of squares (statistics). And there is the concept of a sum of square in number theory. A "sum of squares" disambiguation page could link to several statistics articles and several number theory articles. Michael Hardy (talk) 16:11, 25 February 2011 (UTC)[reply]

Discussion

Any additional comments:

Given initial comments above, I have created Sum of squares (disambiguation), including all the topics from the head of this article and all the existing redirects that start with "sum of square". It would be good to check that these are redirecting to sensible topics. I have also included a line for sum of two squares. Melcombe (talk) 16:58, 25 February 2011 (UTC)[reply]

The above discussion is preserved as an archive of the proposal. Please do not modify it. Subsequent comments should be made in a new section on this talk page. No further edits should be made to this section.
Nice job on the disambiguation page. Lots of links! -- The page was really needed. Duoduoduo (talk) 17:39, 25 February 2011 (UTC)[reply]

Undefined variables


In the section "Partitioning..." with the calculations, the variables $\hat{y}_i$, $\beta_i$ and $\epsilon_i$ are undefined. I guess you can find them in the pages this page points to, but that is very inconvenient. Should there be something like: "Here $\beta_0, \beta_1, \dots, \beta_p$ are the estimated coefficients of the regression model, $\hat{y}_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip}$ is the predicted value, and $\epsilon_i = y_i - \hat{y}_i$ is the residual."? I am not completely familiar with the terminology of this field, so I leave it to someone who is to make this change.

Incomplete Proof of TSS=ESS+RSS


The following parts must also be proven. They are unclear to many including me.

1. The requirement that the model includes a constant or equivalently that the design matrix contains a column of 1s ensures that $\sum_{i=1}^{n} \hat{\epsilon}_i = 0$.
2. For all $j = 1, \dots, p$: $\sum_{i=1}^{n} \hat{\epsilon}_i x_{ij} = 0$. 212.174.38.3 (talk) 07:40, 21 November 2016 (UTC)[reply]

78.60.14.188 (talk) 10:07, 9 November 2013 (UTC)[reply]
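
A sketch of the usual argument, assuming (as the article's section appears to) that the coefficients are the ordinary least squares estimates: the OLS fit satisfies the normal equations $X^{\mathsf{T}}(y - X\hat{\beta}) = 0$, i.e. $X^{\mathsf{T}}\hat{\epsilon} = 0$. Read row by row, this says $\sum_{i=1}^{n} x_{ij}\hat{\epsilon}_i = 0$ for every column $j$ of the design matrix, which is point 2; since one of those columns is the column of 1s, it also gives $\sum_{i=1}^{n} \hat{\epsilon}_i = 0$, which is point 1. The normal equations themselves come from setting the partial derivative of $\sum_{i} (y_i - \beta_0 - \beta_1 x_{i1} - \cdots - \beta_p x_{ip})^2$ with respect to each $\beta_j$ to zero.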

the derivation is apparently "missing the number of groups p completely"


According to this edit. Graham87 14:15, 9 September 2018 (UTC)[reply]

No: p isn't the number of groups, it's the number of predictors, which is immaterial to the proof. The inclusion of p in the theorem is merely to be explicit about what is meant by "a linear regression model ... including a constant". It could simply link to Linear regression#Introduction for the definition, at the expense of making this article less self-contained. Qwfp (talk) 19:36, 9 September 2018 (UTC)[reply]
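
(For clarity, the model in question is, in the usual notation, $y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i$, so $p$ counts the predictors and $\beta_0$ is the constant term.)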

wide overlap fgnievinski (talk) 22:35, 23 August 2022 (UTC)[reply]

Agree with the sentiment, but oppose the merge on the grounds that the scope of the articles is sufficiently different to warrant separate discussion. Squared deviations from the mean focuses on applications in statistics, which are important, whereas Partition of sums of squares covers the topic more broadly, with examples from linear regression. Their readerships are different, and best served by having the pages kept separate. Klbrain (talk) 20:05, 9 March 2023 (UTC)[reply]
Closing, given the uncontested objection and no support. Klbrain (talk) 12:19, 15 April 2023 (UTC)[reply]

Proof


It is completely unclear to me why these sums $\sum_{i=1}^{n} \hat{\epsilon}_i x_{ij}$ should be 0.

Geometrically I can understand that $\hat{y}$ is the projection of $y$ on the space spanned by the x's, and hence the residual $\hat{\epsilon} = y - \hat{y}$ is perpendicular to each x. But analytically it is not at all clear.

Madyno (talk) 18:22, 7 October 2022 (UTC)[reply]
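
A minimal numerical illustration of the point above, offered only as a sketch (it assumes NumPy and synthetic data; the variable names are mine, not the article's): with a column of 1s in the design matrix, the least-squares residuals come out orthogonal to every column, which is exactly why those sums vanish and why the total sum of squares splits into the explained and residual parts.

```python
# Sketch: check numerically that OLS residuals are orthogonal to the regressors
# (including the constant column) and that TSS = ESS + RSS, up to rounding error.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # constant + p predictors
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)             # least-squares fit
y_hat = X @ beta_hat
e = y - y_hat                                                # residuals

TSS = np.sum((y - y.mean()) ** 2)
ESS = np.sum((y_hat - y.mean()) ** 2)
RSS = np.sum(e ** 2)

print(np.allclose(X.T @ e, 0.0))     # sums of e_i and of e_i * x_ij are all ~0
print(np.allclose(TSS, ESS + RSS))   # the partition holds numerically
```

The analytic counterpart is the normal-equations argument sketched under "Incomplete Proof of TSS=ESS+RSS" above.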