Simple correlation is not an all-encompassing technique, especially under the circumstances described above. To obtain a correct picture of the relationship between two variables, we should first eliminate the influence of other variables.
For example, a study of the partial correlation between price and demand would involve studying the relationship between price and demand after excluding the effect of money supply, exports, etc.
What Correlation does not Provide
Generally, a large number of factors simultaneously influence all social and natural phenomena. Correlation and regression studies aim at studying the effects of these many factors on one another.
In simple correlation, we measure the strength of the linear relationship between two variables, without taking into consideration the fact that both these variables may be influenced by a third variable.
For example, when we study the correlation between price (dependent variable) and demand (independent variable), we completely ignore the effect of other factors like money supply, imports and exports, etc., which definitely have a bearing on the price.
The correlation co-efficient between two variables X1 and X2, studied partially after eliminating the influence of the third variable X3 from both of them, is the partial correlation co-efficient r12.3.
Simple correlation between two variables is called the zero order co-efficient since in simple correlation, no factor is held constant. The partial correlation studied between two variables by keeping the third variable constant is called a first order co-efficient, as one variable is kept constant. Similarly, we can define a second order co-efficient and so on. The partial correlation co-efficient varies between -1 and +1. Its calculation is based on the simple correlation co-efficient.
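As noted above, the first-order partial correlation co-efficient is computed from the simple (zero order) co-efficients. A minimal sketch of that calculation, using hypothetical values for the simple correlations among price, demand and money supply (the names and numbers are illustrative, not from the text):

```python
import math

def partial_corr(r12, r13, r23):
    """First-order partial correlation r12.3: the correlation between
    X1 and X2 after eliminating the linear influence of X3 from both.
    Computed from the three simple (zero order) co-efficients."""
    return (r12 - r13 * r23) / math.sqrt((1 - r13**2) * (1 - r23**2))

# Hypothetical simple correlations:
# r12 = corr(price, demand), r13 = corr(price, money supply),
# r23 = corr(demand, money supply)
r = partial_corr(0.8, 0.6, 0.5)
print(round(r, 4))  # → 0.7217
```

Higher-order co-efficients follow the same recursive pattern, each built from co-efficients of the next lower order.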
The partial correlation analysis assumes great significance in cases where the phenomena under consideration have multiple factors influencing them, especially in physical and experimental sciences, where it is possible to control the variables and the effect of each variable can be studied separately. This technique is of great use in various experimental designs where various interrelated phenomena are to be studied.
However, this technique suffers from some limitations some of which are stated below.
The calculation of the partial correlation co-efficient is based on the simple correlation co-efficient, which assumes a linear relationship. This assumption is generally not valid, especially in the social sciences, where linear relationships rarely exist in such phenomena.
As the order of the partial correlation co-efficient goes up, its reliability goes down.
Its calculation is somewhat cumbersome and often difficult for the mathematically uninitiated (though software packages have made life a lot easier).
Another technique used to overcome the drawbacks of simple correlation is multiple correlation analysis, which is based on multiple regression.
Here, we study the effects of all the independent variables simultaneously on a dependent variable. For example, the correlation co-efficient between the yield of paddy (X1) and the other variables, viz. type of seedlings (X2), manure (X3), rainfall (X4) and humidity (X5), is the multiple correlation co-efficient R1.2345. This co-efficient takes values between 0 and +1.
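In practice, R1.2345 can be obtained as the square root of the R-squared of a least-squares regression of X1 on X2 through X5. A sketch with synthetic (entirely hypothetical) paddy-yield data:

```python
import numpy as np

# Hypothetical data: yield of paddy (x1) driven by four independent
# variables X2..X5 (seedling type, manure, rainfall, humidity).
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 4))                              # X2..X5
x1 = X @ np.array([1.0, 0.5, -0.3, 0.2]) + rng.normal(scale=0.5, size=n)

# Least-squares fit with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, x1, rcond=None)
resid = x1 - A @ beta

# R1.2345 is the square root of R^2, so it lies between 0 and +1.
r_squared = 1 - resid.var() / x1.var()
R = np.sqrt(r_squared)
print(0.0 <= R <= 1.0)  # → True
```

Because R is a square root, it cannot be negative, which is why the multiple correlation co-efficient ranges from 0 to +1 rather than from -1 to +1.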
The limitations of multiple correlation are similar to those of partial correlation. If multiple and partial correlation are studied together, a very useful analysis of the relationship between the different variables is possible.