Questions Tagged With <span class="tag">standard</span>

<p><strong><a href="http://www.or-exchange.com/questions/6429/standard-mathematical-models-conditions-on-variables">Standard Mathematical Models, Conditions on Variables</a></strong></p>

<p>Dear experts,</p>
<p>I have had this problem for many years but never tried to find a solution; now I need one urgently.</p>
<p>Over the years, many friends and classmates have asked me whether they can put a condition on decision variables inside a summation in GAMS. To me this was obviously not possible, and my answer was always the same: there is a flaw in your model-building procedure, so try to reformulate; it is not standard to condition on a decision variable, whose value is only determined by the solver through the interaction of parameters, sets, scalars, and so on. In my view, it is circular for the range of a summation to depend on a decision variable.
Now, my question is: <strong>am I right or not?</strong></p>
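<p>To make the distinction concrete, here is a minimal GAMS sketch (set, parameter, and variable names are purely illustrative, not from any particular model): a dollar condition on a parameter is legal because its values are known before the solve, while the same condition on a variable would make the summation range depend on a value that only exists after the solve.</p>

```gams
Set i /i1*i5/;
Parameter a(i) /i1 3, i2 7, i3 2, i4 9, i5 5/;
Variables x(i), z;
Equation obj;

* Legal: the condition tests a parameter, known before the solve
obj .. z =e= sum(i$(a(i) > 4), x(i));

* Illegal: a dollar condition on the decision variable itself, e.g.
* obj .. z =e= sum(i$(x(i) > 0), x(i));
```

<p>The standard workaround is to reformulate with binary indicator variables and big-M (or indicator) constraints rather than conditioning the sum on the variable itself.</p>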
<p>I would really appreciate any suggestions and look forward to your answers.</p>
<p>Best regards,</p>

<p>Bob Pay, Mon, 10 Sep 2012</p>

<p><strong><a href="http://www.or-exchange.com/questions/6115/please-help-linear-regression-how-do-i-compute-the-t-values-for-the-beta-coefficients-a-design-matrix-with-less-than-full-rank">Please Help! Linear Regression: How do I compute the t-values for the beta coefficients when the design matrix has less than full rank?</a></strong></p>

<p>Is there a way to calculate the standard error estimates and t-values of my beta coefficients if my design matrix does not have full rank, so that X'*X is not invertible?</p>
<p>I am writing a script in Ruby to do least squares regression and analysis of variance. I can compute the beta coefficients for my least squares approximation with...</p>
<pre><code>b = ((X'*X)^(-1))*X'*y'
</code></pre>
<p>However, this assumes my matrix has full rank.</p>
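<p>As a runnable sketch of the full-rank case, using Ruby's standard <code>matrix</code> library (the data is made up for illustration, and here y is stored as a column vector, so no transpose is needed):</p>

```ruby
require 'matrix'

# Design matrix: an intercept column plus one feature (illustrative data).
x = Matrix[[1.0, 1.0],
           [1.0, 2.0],
           [1.0, 3.0]]
y = Matrix.column_vector([2.0, 3.0, 5.0])

# Normal equations: b = (X'X)^-1 X'y, valid only for full column rank.
b = (x.transpose * x).inverse * x.transpose * y
```

<p>If <code>x</code> had linearly dependent columns, <code>inverse</code> would raise an error, which is exactly the situation this question is about.</p>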
<p>I have data sets that occasionally contain linearly dependent columns, so X'*X will not invert. I want to keep those columns in my design matrix, and I discovered at <a href="http://planetmath.org/encyclopedia/Pseudoinverse.html">PlanetMath</a> that I can compute the betas using the "pseudo-inverse" from a singular value decomposition (SVD), X = U*S*V'...</p>
<pre><code>b = V*(S'*S)^(-1)*S'*U'*y'
</code></pre>
<p>T-values are then calculated using the diagonal of the matrix C = (X'*X)^(-1), the unscaled covariance matrix of the coefficient estimates:</p>
<pre><code>t[i] = b[i] / (s * Math.sqrt(C[i][i]))
</code></pre>
<p>where </p>
<pre><code>b[i] : beta for the ith feature
s : residual standard error of the regression
X : design matrix--the matrix with your independent variables
C = ((X'*X)^(-1))
</code></pre>
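<p>The t-value formula above as a runnable Ruby sketch; <code>b</code>, <code>s</code>, and <code>c</code> below are placeholder inputs standing in for the betas, the residual standard error, and C:</p>

```ruby
require 'matrix'

# t[i] = b[i] / (s * sqrt(C[i][i])), exactly as in the formula above.
def t_values(b, s, c)
  b.each_index.map { |i| b[i] / (s * Math.sqrt(c[i, i])) }
end

# Placeholder inputs purely to exercise the formula.
t = t_values([2.0, -1.0], 0.5, Matrix[[4.0, 0.0], [0.0, 1.0]])
```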
<p>Is there another way to calculate C using the S, U, or V from Singular Value Decomposition?</p>
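<p>One possibility, offered as a sketch rather than a definitive answer: X'*X is symmetric, so its eigendecomposition X'*X = V*D*V' has the same V as the SVD of X, with D = S'*S on the diagonal. A pseudo-inverse V*D+*V', where D+ inverts only the eigenvalues above some tolerance, can then stand in for C = (X'*X)^(-1) when X is rank-deficient. In Ruby's standard <code>matrix</code> library (the function name and tolerance are my own choices):</p>

```ruby
require 'matrix'

# Pseudo-inverse of X'X via eigendecomposition.  Because X'X is symmetric,
# its eigenvector matrix is orthogonal and its eigenvalues are the diagonal
# of S'S from the SVD of X.  Eigenvalues below tol are treated as exact
# zeros, which is what handles the rank-deficient case.
def covariance_pinv(x, tol = 1e-10)
  eig    = (x.transpose * x).eigen        # Matrix::EigenvalueDecomposition
  v      = eig.eigenvector_matrix
  d_pinv = Matrix.diagonal(*eig.eigenvalues.map { |l| l.abs > tol ? 1.0 / l : 0.0 })
  v * d_pinv * v.transpose
end

# Sanity check on a full-rank design matrix: the pseudo-inverse should
# agree with the ordinary inverse.
x = Matrix[[1.0, 2.0], [3.0, 4.0], [5.0, 7.0]]
c = covariance_pinv(x)
```

<p>For a rank-deficient X, keep in mind that coefficients of linearly dependent columns are not individually identifiable, so their t-values should be interpreted with care.</p>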
<p>P.S.: I also found this <a href="http://www.nd.edu/~rwilliam/stats1/x91.pdf">paper</a> to be quite relevant.</p>

<p>dpott197, Sat, 04 Aug 2012</p>