It seems to me that we are continuously bombarded with all kinds of “studies” that report what appears to be the result of painstaking scientific analysis of a subject. (A “study,” according to Webster’s, is “the results of paying careful attention to, and critical examination of, any subject, event, etc.”) One thing about “studies” is that they cover a wide variety of subjects – some of which are unusual, to say the least. Here are a few examples I found in “Business Insider.”
One study resulted in the following conclusion: “When you attach a weighted stick to a chicken’s butt, the chicken walks in the same manner that the dinosaurs are thought to have walked.” Yep, that’s right, researchers somehow got funds to conduct this “critical examination.”
Care to know how long it takes the average mammal (including humans) to empty its bladder? Thanks to a group of scientists who “studied” this question, the answer is: about 21 seconds, plus or minus 13 seconds. Their study, called “Duration of urination does not change with body size,” was published May 14, 2015, in the journal PNAS. Now that’s something we simply had to know, right?
Okay, now here’s a study that sounds a bit more interesting. Apparently there may be some surprising biological benefits of “intense kissing.” Research suggests that the saliva exchange can reduce a person’s allergic response. The experiment was set up as follows: “The subject(s) kissed freely during 30 min with their lover or spouse alone in a room with closed doors while listening to soft music.” Researchers published the results in the journal Physiology & Behavior, but I couldn’t find what the “allergic response” consisted of, or how it was identified. Oh, well, I’m sure the participants didn’t care one way or the other.
Behavioral science, that is, the study of the way people act and why they do the things they do, is a very fertile field for “studies.” They pop up like mushrooms in the spring – but which are truly mushrooms and which are poisonous toadstools? As one researcher into the reliability of such studies put it, “The most dangerous words in the English language today are ‘studies show.’” But how to distinguish between the valid investigations and those which simply propagate some preconceived notion is a bona fide question.
In the world of scientific, engineering, and medical research, replication – that is, the ability to reproduce the results of a study – is a primary means of validation. This approach might well be used for examining behavioral studies.
Mr. Andy Kessler, who writes on technology and related subjects for the Wall Street Journal, cites a report by the Center for Open Science that describes a four-year effort by 270 researchers who tried to reproduce 100 leading psychology experiments. They were able to reproduce the results in only 39. Oops!
Mr. Kessler also cites a recently published survey of 1,576 scientists that concluded: “More than 70% of researchers have tried and failed to reproduce another scientist’s experiments. And more than half have failed to reproduce their own experiments.” How about them apples.
One of the main problems with many of these “studies” is that they are sponsored by some individual or group with a political and/or economic agenda. To support that agenda, survey questions and participants are carefully chosen to produce the desired outcome. The recent fiasco in which the “scientific” polling organizations erroneously predicted the outcome of the presidential election is an excellent example. They asked the wrong questions of the wrong people.
A second major flaw is that correlation between events is not the same as causation. Just because the wind is blowing at the same time windmills are turning does not mean the windmills cause the wind to blow. Sound silly? Yet we find the same kind of reasoning being used to justify whatever conclusions the instigator(s) want. Our country has deployed our military in many locations around the world where insurgents are attacking and killing civilians. This correlation means our troop deployment is the reason for these massacres, right?
Well, I guess one way to evaluate the results of these “studies” is to find out who paid for them and what impact the results might have on the economic or political objectives of the sponsor. Might prove interesting. At least that’s how it seems to me.
Bill Taylor is a Greene County Daily columnist and area resident; he may be contacted at email@example.com.