With “big data” come big risks

[Cartoon: people considering crossing the valley of big data]

Prebabble: Sound research is backed by the scientific method; it’s measurable, repeatable, and reasonably consistent with theory-based hypotheses. Data analysis is a component of scientific research but is not scientific by itself. This article provides examples of how research or summary conclusions can be misunderstood through the fault of either the reviewer or the researcher, especially when big data are involved. It is not specific to psychological research, nor is it a comprehensive review of faulty analysis or big data.

When I was a grad student (and dinosaurs trod the earth), four terminals connected to a mainframe computer were the only computational resources available to about 20 psychology grad students. “Terminal time” (beyond the sentence that was graduate school) was as precious and competitively sought after as a shaded parking spot in the summer. (I do write from the “Sunshine State” of Florida.)

Even more coveted than time at one of the terminals, data from non-academic sources were incredibly desirable and much harder to come by. To gain access to good organizational data was the “holy grail” of industrial-organizational psychology dissertations. Whenever data were made available, one was not about to look this gift horse in the mouth without making every effort to find meaningful research within those data. Desperate but crafty grad students could wrench amazing research from rusty data.

But some data are rusted beyond repair.

One of my cell-, I mean class-, mates came into possession of a very large organizational database. Ordinarily that would have made them the envy of those of us without data, but not in this case. It was well known that this individual’s data, though big, were hollow: a whole lot of “zeroes.” To my surprise and concern, this individual seemed to be merrily “making a go of it” with their impotent data. Once convinced that they were absolutely going to follow through with a degree-eligible study (that no one “understood”), sarcasm got the best of me: “Gee, Jeff (identity disguised), you’ve been at it with those data for some time. Are any hypotheses beginning to shake out of your analyses?”

“Working over” data in hope of finding a reasonable hypothesis is a breach of proper research and clearly unethical whether one knows it or not. But it happens – more today than ever before.

“Big data” has become the Sirens’ song, luring unwitting (like my grad school colleague) or unscrupulous prospectors in search of something – anything – statistically significant. But that’s not the way science works. That’s not how knowledge is advanced. That’s just “rack-n-hack” pool where nobody “calls their shots.”

It isn’t prediction if it’s already happened.

The statistical significance (or probability) of any prediction in relation to a given (already known) outcome is always perfect (hence, a “foregone” conclusion). This is also the source of many a superstition. Suppose you win the lottery by betting on your boyfriend’s prison number. To credit your boyfriend’s prison number for your winnings would be a mistake (and not just because he may claim the booty). Neither his number nor your choice of it had any influence in determining the outcome – even though you did win. But if we didn’t care about “calling our shots,” we’d argue for the impossibly small odds of your winning ticket as determined by your clever means of choosing it.

This error of backward reasoning is also known by the Latin phrase post hoc, ergo propter hoc: “after this, therefore because of this.” It isn’t valid to infer a cause from its effect after the fact. The logic of the error may be obvious, but spotting it in practice isn’t.

Sophisticated statistical methods can confuse even well-intended researchers who must decide which end of the line to put an arrow on. In addition, the temptation to “rewind the analysis” by running a confirmatory statistical model (i.e., a “calling my shot” analysis) AFTER a convenient exploratory finding (i.e., “rack-n-hack” luck) can be irresistible when one’s career is at stake, as is frequently the case in the brutal academic world of “publish or perish.” But doing this is more than unprofessional; it’s cheating and blatantly unethical. (Don’t do this.)
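To see why “rack-n-hack” exploration can’t double as confirmation, consider a quick simulation (a sketch in Python; the sample sizes and cutoff are invented for illustration). Every “hypothesis” below compares two groups drawn from the same population, so any difference is pure noise; yet roughly 5% of them come out “statistically significant” anyway. That chance hit rate is exactly the luck a post-hoc “confirmation” would be cashing in on.

```python
import random
import statistics

def t_stat(a, b):
    """Two-sample t statistic for equal-size groups (pooled variance)."""
    n = len(a)
    var_pooled = (statistics.variance(a) + statistics.variance(b)) / 2
    return (statistics.mean(a) - statistics.mean(b)) / ((2 * var_pooled / n) ** 0.5)

random.seed(42)
trials = 1000
hits = 0
for _ in range(trials):
    # Both groups come from the SAME population: there is no real effect to find.
    g1 = [random.gauss(0, 1) for _ in range(30)]
    g2 = [random.gauss(0, 1) for _ in range(30)]
    if abs(t_stat(g1, g2)) > 2.0:  # roughly p < .05 at these sample sizes
        hits += 1

print(f"{hits} of {trials} null 'hypotheses' came out 'significant'")
```

Run this and you get a few dozen “discoveries” from data that contain nothing at all, which is why a hypothesis must be called before the data are racked, not after.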

Never before has the possibility of bad research making news been so great. Massive datasets are flung about like socks in a locker room. Sophisticated analyses that once required an actual understanding of the math in order to do the programming can now be done as easily as talking to a wish-granting hockey puck named “Alexa.” (“What statistical assumptions?”) Finally, publishing shoddy “research” results to millions of readers is as easy as snapping a picture of your cat.

All of the aforementioned faux pas (or worse) concern data “on the table.” The most insidious risk when drawing conclusions from statistical analyses – no matter how “big” the data are – is posed by the data that AREN’T on the table.

A study may legitimately find a statistically significant difference in children’s grades based on time spent watching TV versus playing outdoors. The study may conclude, “When it comes to academic performance, children who play outside significantly outperform those who watch TV.” While this is a true statement of association, the causality of the finding is uncertain.

To further complicate things, cognitive biases work their way into the hornet’s nest of correlation vs. causation. In an effort to lighten the burden on our overworked brains, correlation and causation tend to get thrown together in our “cognitive laundry bin.” Put bluntly, in our heads, correlation becomes causation.
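The TV-vs-outdoors example makes the trap concrete. In the sketch below (Python, with an invented “parental engagement” confounder and made-up numbers), outdoor play and grades never influence each other at all; both are driven by the hidden third variable, yet they end up strongly correlated.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
kids = 500
# Hypothetical confounder: engaged parents drive BOTH outdoor play and grades.
engagement = [random.gauss(0, 1) for _ in range(kids)]
outdoor_hours = [e + random.gauss(0, 1) for e in engagement]  # no effect on grades
grades = [e + random.gauss(0, 1) for e in engagement]         # no effect on play

r = pearson(outdoor_hours, grades)
print(f"correlation(outdoor play, grades) = {r:.2f}")
```

The correlation comes out strongly positive even though neither variable touches the other, which is all the caution needed before reading “outperform” as “because.”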

Although it’s easy to mentally “jump track” from correlation to causation, the opposite move, from causation to correlation, isn’t nearly so risky: where there is causation, association follows.

Cigarette makers were “Kool” (can I get in trouble for this?) with labeling that claimed an ‘association’ between smoking and a litany of health problems. They were not so “Kool” with terminology using the word “causes.”

Causal statements trigger a more substantial and lasting mental impression than statements of association. “A causes B” is declarative and signals “finality,” whereas “A is associated with B” is descriptive and signals “probability.” Depending on how a statement of association is positioned, it can very easily evoke an interpretation of causation.

Sometimes obfuscation is the author’s goal; other times it’s an accident or merely coincidental. Either way, the result is misleading (at best) when our eyes for big data are bigger than our stomachs for solid research.

Psychways is owned and produced by Talentlift, LLC.

This simple hack* will reduce stress and improve health

[Image: smiley face to reduce your stress]

Most {known} psychological research confirms what people already know. Yep. Most psychological research could receive the “No-duh” vs. the “Nobel” award. Beyond the obvious studies, others are obtuse: good luck with their titles, let alone the methods (which consume most of the article). But sometimes something else happens. Here, I share a study that is well done AND revealing; useful for everyday application. This research yields a simple exercise that, if done, WILL reduce stress and improve your health.

I’ve offered tips to manage mood and to reduce stress before: 3 (easy) office tips to enhance your influence, 3 Surprising Motivation Killers and a couple more. But I must confess that these “tips” are mostly the result of personal experience or general knowledge acquired from multiple sources.

This is different. Or as Dorothy so astutely mentions to Toto in The Wizard of Oz, “… we’re not in Kansas anymore.” (Scariest movie I’ve ever seen…)

Although most research reveals the obvious, what’s surprising is what we do (or don’t do) with this obvious information. Just to test me, I bet you can’t think of three things off the top of your head that would make you or someone else a better person.

You did, didn’t you? (smirk)

No kidding: Why haven’t you done them? If you have, why aren’t you still doing them?

You’re probably wondering, “Why is Chris shooting himself in the foot?” It kinda sounds like he’s “giving up” on his own profession: “psychological research is unsurprising and insignificant.”

Not quite.

One doesn’t fold with a straight flush, and I wouldn’t with a pair of aces (or would I?). I’ve come too far (and learned too much) to give in now.

Most of you will see through my thinly veiled attempt to entice and titillate as an effort to stir up your emotions. (Not sorry)

Beyond the sarcasm, pointing this out to you is making you even more emotional, even a bit demeaned. (Still, not sorry)

There’s an old saying in psychology, “All’s fair that changes behavior the way we want.” (Well, that’s what it should say.)

No. I’m no martyr. Not at all. I’m “the Fool.”

Here, I re-present one of many findings from I-O psychology that, if applied, would help so many. But it’s buried in an academic journal that few will notice. (I won’t mention it’s not even a journal of psychology, but that’s another story.)

Per Isaac Newton, “a body at rest remains at rest unless acted upon by a force.”

Transferring to psychology, humankind is a pretty big “body.” Consider this, “the force.”

What follows is solid I-O psychology research with implications that can really make a difference.

Now that I hope to have gained your attention, here’s the simple activity that will make you happier and healthier:

At the end of every day, write down three (3) good things that happened and why they did.

That’s it. Easy as Pi. (What does that mean, anyway?)

Really?

Yes, that’s it. Record and reflect on three good things that happened. Your spirits will lift and your blood pressure will drop. You can reduce stress. Measure it.
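If you want to make the “record” half of the exercise frictionless, here’s a minimal sketch (the function name, file name, and sample entries are my own invention; the exercise itself comes from the study cited below): append today’s three good things, each with its “why,” to a running journal file.

```python
from datetime import date
from pathlib import Path

def log_three_good_things(entries, journal="three_good_things.txt"):
    """Append today's three good things, and why each happened, to a journal file.

    `entries` is a list of (what_happened, why_it_happened) pairs.
    """
    if len(entries) != 3:
        raise ValueError("The exercise calls for exactly three good things.")
    lines = [f"## {date.today().isoformat()}"]
    for i, (what, why) in enumerate(entries, start=1):
        lines.append(f"{i}. {what} -- because {why}")
    with Path(journal).open("a", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n\n")

# Example entries (invented):
log_three_good_things([
    ("Finished the report draft", "I blocked out an hour before checking email"),
    ("Good talk at lunch", "I asked a colleague about her project"),
    ("Evening walk", "the rain finally stopped"),
])
```

A notebook and pen work just as well; the point is the nightly reflection on the “why,” not the tooling.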

Bono, Glomb, Shen, Kim, & Koch (2013). “Building Positive Resources: Effects of Positive Events and Positive Reflection on Work Stress and Health.” Academy of Management Journal, 56: 1601–1627.

Don’t get me started on why this isn’t published in a journal known for PSYCHOLOGY!

Just get on with it. Prove me wrong.

{Yes. I am cool because I used the word “hack” vs. “tactic.”}


How about a little science with that intuition?

Psychology and intuition

We’re all practicing psychologists — aren’t we? With our uncanny insight and intuition we’re able to ‘read’ another person in a mere 10 seconds. (This is, in fact, what research reveals about employment interviews). We know ourselves, and we know others. As a matter of fact, it’s intuitive — so simple we can do it with almost no thought. Therefore: We never make mistakes when assessing ourselves or others.

But everyone else does. Right? Consider the bias at work here (Hint: Fundamental attribution error).

Intuition is NOT the same as insight. Though insight can come from many sources (e.g., dreams, experiences, reading), in this case I’m referring to the insight that comes as a result of either deductive or inductive reasoning. And the best logic starts with facts.

Why I’m doing this

I’ve started this blog to share insights from psychological research and from my own experience applying psychology in the workplace, and to let you in on some of the most predictable truths you can use to understand and change yourself and others. This is about understanding intuition – its risks and benefits. But in order to understand intuition, like any phenomenon, we need to turn to a proven, reliable, and valid source. This is about science and fact.

While much of what we know or learn is informed almost completely by the methods of scientific research, this is not the case with psychology. Perhaps it’s because we only know the world by the function of psychology: everything we sense or experience must flow through a myriad of brain cells and nerves. Psychology is not only a topic for scientific study; it’s an inextricable part of any and all science. But the study of psychology is especially difficult, and even the best scientific research in psychology can’t reveal the level of understanding typically obtained by research in other fields (e.g., physics, chemistry, biology, and physiology).

I don’t intend this to be an academic journal. But I will cite proper scientific research or share personal experience lending reasonably valid support to my posts. My goal is to publish articles that explain complex phenomena in layman’s terms. Some will be more appealing for the way I present myself in writing (i.e., they may not be particularly revealing in terms of the knowledge they generate); others will be more educational to the average reader. And some will be more theoretical while others will be more practical. “There’s something for everyone” is what I would like a critic to say about the collection of these writings.

I hope you find them helpful.
