Sunday, April 26, 2009

Survey Analysis

It's been a good weekend. I had time to analyze the survey results for a colleague and to provide some statistical analysis of the results. I won't go into details yet because the survey will be published as a white paper for his business. The original data was taken from SurveyMonkey, and although I did not have access to each individual respondent, we were successful at finding some significant correlations in the data for him.

The survey was reasonably well constructed; however, it was not set up well for mathematical analysis. After some thought I was able to convert the responses to scaled variables and actually measure the differences between categorical responses. There was good opportunity for cross-tab analysis and even a two-group analysis, but we would have needed the data from each individual respondent to do that.

From a business standpoint it was a success, as the results will be beneficial to the business problem the white paper addresses.

Chalk this up to another successful case of analyzing a survey after the fact. With the popularity of online surveys, I think there will be more and more business opportunity for this type of service. It's better to get help with the front-end design and planning of a survey, but if you get stuck in the analysis, there are options.

Sunday, April 05, 2009

Understanding Margin of Error

Today, I was asked to participate in another LinkedIn Poll. The author of the poll is well intentioned and trying to gather information on a particular business issue relating to whether or not finance is a value proposition. During the course of the conversation, someone indicated that the demographics, and the differences in how demographic groups had answered the poll, were interesting.

The problem is that with only 50 respondents at the time, the margin of error for the survey was +/- 13.8%. When I looked at how one demographic group answered vs. another, in most cases the difference was within the confidence limits, considering the sample size and margin of error. So I personally did not see any interesting differences in how different groups answered the question. You have to be careful not to read too much into these online polls. At best I consider them entertaining, but in no way a tool for serious research.

Another thing I noticed is that LinkedIn Polls apparently have some built-in error due to LinkedIn's processing system. The total respondent results added up to 98% instead of 100% in the main category across the board. I need to understand this more, but it looks like, due to rounding, they may lose 2% accuracy off the top.
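As a sketch of how independent rounding can eat percentage points, here is a toy example (the counts are made up, not the poll's actual data). If a system truncates each category's share to a whole percent separately, the published shares need not sum to 100%:

```python
import math

def truncated_percentages(counts):
    """Convert raw counts to whole-number percentages by truncating
    each category's share independently -- one way a reporting system
    can appear to lose a point or two off the top."""
    total = sum(counts)
    return [math.floor(100 * c / total) for c in counts]

# Hypothetical three-way split: each share is 33.33...%,
# so the truncated shares sum to 99%, not 100%.
shares = truncated_percentages([1, 1, 1])
print(shares, sum(shares))
```

Whether LinkedIn truncates, rounds half-up, or does something else entirely is an open question; the point is simply that per-category rounding is enough to explain totals that fall short of 100%.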

This particular poll garnered 50+ respondents; however, most LinkedIn polls I am able to view have fewer than 15 respondents, and typically only 5. Sampling fifteen people for a typical yes/no type of poll gives a margin of error as high as +/- 25%. Why bother asking?
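The margin-of-error figures above come from the standard formula for a sample proportion at 95% confidence, using the worst case p = 0.5. A quick sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a sample proportion.

    n -- sample size
    p -- assumed proportion (0.5 is the conservative worst case)
    z -- critical value (1.96 for a 95% confidence level)
    """
    return z * math.sqrt(p * (1 - p) / n)

# 50 respondents -> roughly +/- 13.9%; 15 respondents -> roughly +/- 25.3%
for n in (50, 15):
    print(f"n={n}: +/- {100 * margin_of_error(n):.1f}%")
```

This is the simple random sampling formula; self-selected online polls violate its assumptions, so the true uncertainty is, if anything, larger than these numbers suggest.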

You can see why I have some distrust of using these popular social-networking polls to obtain true research information. Be very cautious if you intend to use this information for business decisions. LinkedIn polls can be entertaining and provide some basis for conversation about a topic or point, and from that standpoint they do have some value.