Tag: statistics

  • Advanced R data manipulation functions

    http://www.ats.ucla.edu/stat/r/library/advanced_function_r.htm *apply, sweep, and column/row functions.
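
    The linked page covers R's *apply family and sweep(). As a rough illustration of what those operations do (a NumPy analogue I've sketched, not the R code from the page):

```python
import numpy as np

m = np.arange(6, dtype=float).reshape(2, 3)   # a small 2x3 matrix

# apply(m, 1, sum) in R: summarize over rows (MARGIN = 1)
row_sums = m.sum(axis=1)

# apply(m, 2, mean) in R: summarize over columns (MARGIN = 2)
col_means = m.mean(axis=0)

# sweep(m, 2, colMeans(m)) in R: subtract each column's mean;
# NumPy broadcasting plays the role of sweep()
centered = m - col_means

print(row_sums, col_means, centered.mean(axis=0))
```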

  • Tikhonov regularization

    http://en.wikipedia.org/wiki/Tikhonov_regularization Called ridge regression in statistics. Regularization: Ridge Regression and the LASSO is another good reference. In R, use lm.ridge() from the MASS package. Here’s sample code that covers not just lm.ridge, but also pls (partial least squares), the lasso, and pcr (principal component regression).
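
    Underneath lm.ridge() the estimator is just least squares with an L2 penalty. A minimal NumPy sketch of the closed form (the toy data and the choice of lambda are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=50)

lam = 1.0  # regularization strength (lambda)

# Ridge estimate: beta = (X'X + lambda * I)^-1 X'y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# lambda = 0 recovers ordinary least squares
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# The penalty shrinks the coefficient vector toward zero
print(np.linalg.norm(beta_ridge), np.linalg.norm(beta_ols))
```

    Larger lambda shrinks the coefficients more; the ridge norm is never larger than the OLS norm.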

  • Automatic model selection

    The leaps package has a regsubsets() function that automatically finds the best model for each model size. Here’s an example from ?regsubsets. Let’s find the best model using adjusted R-squared, then build a model using those predictors. When doing model selection, instead of applying automatic methods recklessly, one should consider whether the model really makes sense based on the prior…
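
    The idea behind regsubsets() is an exhaustive search over predictor subsets, scored by a criterion such as adjusted R-squared. A small sketch of that search in NumPy (the synthetic data is my own; only columns 0 and 2 actually drive y):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, p = 60, 4
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.2 * rng.normal(size=n)

def adj_r2(X_sub, y):
    """Adjusted R^2 of an OLS fit with intercept."""
    n, k = X_sub.shape
    Xd = np.column_stack([np.ones(n), X_sub])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Exhaustive search like regsubsets: score every subset of every size,
# then keep the one with the highest adjusted R^2
best = max(
    (s for size in range(1, p + 1) for s in combinations(range(p), size)),
    key=lambda s: adj_r2(X[:, list(s)], y),
)
print(best)
```

    Adjusted R-squared penalizes model size only weakly, which is one reason the post warns against applying automatic selection recklessly.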

  • Cholesky decomposition

    http://en.m.wikipedia.org/wiki/Cholesky_decomposition In linear algebra, the Cholesky decomposition or Cholesky triangle is a decomposition of a Hermitian, positive-definite matrix into the product of a lower triangular matrix and its conjugate transpose. It was discovered by André-Louis Cholesky for real matrices. When it is applicable, the Cholesky decomposition is roughly twice as efficient as the LU decomposition for solving systems of linear equations. In a loose, metaphorical sense, this can be thought of as the matrix analogue…
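
    A quick NumPy demonstration of the decomposition and the solve-a-linear-system use case mentioned above (the matrix is constructed to be positive-definite):

```python
import numpy as np

# A symmetric positive-definite matrix, built as B'B + 4I to guarantee it
rng = np.random.default_rng(2)
B = rng.normal(size=(4, 4))
A = B.T @ B + 4 * np.eye(4)

# Cholesky: A = L L^T with L lower triangular
L = np.linalg.cholesky(A)

# Solving Ax = b via two triangular solves (the reason it beats LU here);
# np.linalg.solve is generic, but the systems are triangular
b = rng.normal(size=4)
y = np.linalg.solve(L, b)       # forward substitution: L y = b
x = np.linalg.solve(L.T, y)     # back substitution:    L^T x = y

print(np.allclose(L @ L.T, A), np.allclose(A @ x, b))
```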

  • Best Approximation Theorem

    When we want to solve Ax = b and there is no exact answer x, the least-squares approximation x̂ is the one for which A x̂ is the projection of b onto the column space of A. That’s because, if the column vectors of A are linearly independent, they form a basis of a vector space (the column space). If there is no exact answer, then b − A x̂ is orthogonal to that space, so A^T (b − A x̂) = 0. Thus, x̂ = (A^T A)^{-1} A^T b exists…
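
    A small worked instance of the theorem: an overdetermined 3x2 system with no exact solution, solved via the normal equations, with the residual checked to be orthogonal to the columns of A (the numbers are my own example):

```python
import numpy as np

# Overdetermined system Ax = b with no exact solution
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])

# Normal equations: x_hat = (A'A)^-1 A'b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the projection of b onto the column space of A,
# so the residual is orthogonal to every column of A
resid = b - A @ x_hat
print(x_hat, A.T @ resid)   # residual check gives ~ [0, 0]
```

    np.linalg.lstsq(A, b) computes the same x̂ more stably than forming A'A explicitly.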

  • Change of Basis

    http://en.wikipedia.org/wiki/Change_of_basis#Change_of_basis_for_vectors In linear algebra, change of basis refers to the conversion of vectors and linear transformations between matrix representations which have different bases.
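
    A tiny numeric example of the conversion (my own choice of basis): if the columns of P are the new basis vectors written in the standard basis, the new coordinates of v solve P c = v.

```python
import numpy as np

# Columns of P are the new basis vectors expressed in the standard basis
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 2.0])   # coordinates in the standard basis

# Coordinates of the same vector in the new basis: solve P c = v
c = np.linalg.solve(P, v)

# Going back: multiplying by P recovers the standard coordinates
print(c, P @ c)
```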

  • Transformation Matrix

    http://en.wikipedia.org/wiki/Transformation_matrix In linear algebra, linear transformations can be represented by matrices. The most common geometric transformations that keep the origin fixed are linear, including rotation, scaling, shearing, reflection, and orthogonal projection.
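
    Two of the transformations listed above, written as matrices and composed (the specific angle and scale factors are arbitrary choices for illustration):

```python
import numpy as np

theta = np.pi / 2   # 90-degree counter-clockwise rotation

# Rotation matrix about the origin
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Scaling matrix: 2x along x, 0.5x along y
S = np.diag([2.0, 0.5])

e1 = np.array([1.0, 0.0])
print(R @ e1)        # ~ [0, 1]: the x-axis rotates onto the y-axis
print(S @ e1)        # [2, 0]

# Composition: apply S first, then R, as the single matrix R @ S
print((R @ S) @ e1)  # ~ [0, 2]
```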

  • Robust Regression

    The classical linear model minimizes Σ r_i^2, where r_i is the i-th residual; this criterion is called LS (Least Squares) or the sum of squared residuals. But it is not robust against outliers. Thus we need robust regression methods. Least Absolute Deviation seeks to minimize Σ |r_i|. This method has breakdown point zero, meaning that even a small number of outliers can damage the goodness of…
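
    The intercept-only case makes the contrast between the two criteria concrete: minimizing Σ r_i^2 over a constant gives the mean, while minimizing Σ |r_i| gives the median. A sketch with one gross outlier (my own toy data; a full LAD regression fit needs linear programming, e.g. rq() in R's quantreg package):

```python
import numpy as np

# Intercept-only model: LS minimizes sum of squared residuals -> the mean;
# LAD minimizes sum of absolute residuals -> the median.
data = np.array([1.0, 1.1, 0.9, 1.0, 1.05])
with_outlier = np.append(data, 100.0)   # one gross outlier

ls_fit = with_outlier.mean()       # dragged far from 1 by the outlier
lad_fit = np.median(with_outlier)  # barely moves

print(ls_fit, lad_fit)
```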

  • Kendall’s Tau VS Spearman’s Rho

    Nice videos on Kendall’s tau and Spearman’s rho: Part 1, Part 2. Here’s another explanation: http://www.unesco.org/webworld/idams/advguide/Chapt4_2.htm. In most cases, these values are very similar, and when discrepancies occur, it is probably safer to interpret the lower value. More importantly, Kendall’s Tau and Spearman’s Rho imply different interpretations. Spearman’s Rho is considered as the regular Pearson’s…
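
    A quick SciPy comparison of the two coefficients on a small made-up ranking with a few swapped neighbors, showing the typical pattern of tau being smaller in magnitude than rho:

```python
import numpy as np
from scipy import stats

x = np.array([1, 2, 3, 4, 5, 6, 7, 8])
y = np.array([1, 3, 2, 4, 6, 5, 8, 7])   # mostly increasing, three adjacent swaps

tau, tau_p = stats.kendalltau(x, y)      # proportion of concordant minus discordant pairs
rho, rho_p = stats.spearmanr(x, y)       # Pearson correlation of the ranks

print(tau, rho)
```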

  • Kendall’s Tau

    This is a nice video from how2stats.com. Part 1. Part 2.