When the values of one variable are associated with or influenced by those of another variable, e.g., the age of husband and wife, the height of father and son, the supply and demand of a commodity, and so on, Karl Pearson's coefficient of correlation can be used as a measure of the linear relationship between them. But sometimes there is an interrelation among many variables, and the value of one variable may be influenced by many others; e.g., the yield of a crop per acre, say (X1), depends upon the quality of seed (X2), the fertility of the soil (X3), the fertilizer used (X4), irrigation facilities (X5), weather conditions (X6), and so on. Whenever we are interested in studying the joint effect of a group of variables upon a variable not included in that group, our study is that of multiple correlation and multiple regression.
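The joint effect described above is measured by the multiple correlation coefficient, which can be computed as the simple correlation between the dependent variable and its least-squares fit on the group of predictors. The sketch below is a minimal illustration in Python with NumPy; the data are simulated, and the variable names (x2 for seed quality, x3 for soil fertility, following the example above) are only illustrative.

```python
import numpy as np

def multiple_correlation(y, X):
    """Multiple correlation R of y on the columns of X: the simple
    correlation between y and its least-squares fitted values
    (an intercept column is added to X)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    fitted = X1 @ beta
    return np.corrcoef(y, fitted)[0, 1]

# Simulated data: yield (y) influenced jointly by two factors,
# loosely mirroring the crop-yield example in the text.
rng = np.random.default_rng(0)
x2 = rng.normal(size=50)   # illustrative: quality of seed
x3 = rng.normal(size=50)   # illustrative: fertility of soil
y = 2.0 * x2 + 1.5 * x3 + rng.normal(scale=0.5, size=50)

R = multiple_correlation(y, np.column_stack([x2, x3]))
print(round(R, 3))  # close to 1, since y is nearly linear in x2 and x3
```

By construction, the multiple correlation coefficient is nonnegative and never smaller than the simple correlation of y with any single predictor in the group.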
Suppose that in a trivariate or multivariate distribution we are interested in the relationship between two variables only. There are two alternatives, viz., (i) we consider only those pairs of observations in which the other variates take specified values, or (ii) we eliminate mathematically the effect of the other variates on the two variates of interest. The first method has the disadvantage that it limits the size of the data, and it is applicable only to data in which the other variates have the assigned values. In the second method it is not easy to eliminate the entire influence of the other variates, but their linear effect can be easily eliminated. The correlation and regression between two variates after eliminating the linear effect of the other variates on them are called partial correlation and partial regression.