Lemmings aren’t actually suicidal

Weird Fact #30: The idea that lemmings kill themselves is the result of a movie fraud.

If you are like most folks, you’ve probably used the famous phrase about someone running off a cliff like a lemming. The small mammal, which looks a bit like a groundhog, has become a sort of symbol of mindless suicide for its tendency to hurl itself off ledges during wild mating rushes.

The only problem is that none of it is true. Despite their pop culture image, lemmings don’t actually commit suicide. Where did this myth come from? Its origins are uncertain, but the notion was popularized by a 1958 nature flick called “White Wilderness,” which featured dramatic scenes of a lemming migration in which crazed lemmings toppled over a cliff into the sea en masse. “They’ve become victims of an obsession,” intones the narrator.

The lemmings, however, were actually the victims of an unscrupulous film crew. Producers completely faked the scenes by tossing the poor animals to their deaths or driving them off the edge of the cliff.

Amazingly, even the ocean below was fake. It was really a river.

Even worse, the moon isn’t made of green cheese either

Weird Fact #29: Michael Jackson didn’t invent the moonwalk.

“Who invented the moonwalk?” seems so easy to answer it almost sounds like a trick question similar to “Who is buried in Grant’s Tomb?”

Yet, the reality is that this ridiculously self-evident query has a surprising answer. It wasn’t Michael Jackson.

The late King of Pop may have made any number of musical innovations over the years, but the Moonwalk wasn’t one of them. Known by some as the backslide, the move had been around for years, and some versions of it date back decades. Cab Calloway claimed to have done something similar in the 1930s. Moreover, some early moonwalks and similar slide steps are on video. In his autobiography, “Moonwalk,” Jackson himself admitted his signature dance move wasn’t really his.

“It was born as a breakdance step, a ‘popping’ type of thing that black kids had created dancing on the street corners in the ghetto…” he wrote of the step he premiered during a performance of “Billie Jean.” “So I said, ‘This is my chance to do it,’ and I did it. These three kids taught it to me. They gave me the basics–and I had been doing it a lot in private.”

After doing it in public, of course, he – and it – became a legend.



Of cooking spray and food labels – a startling tale of two realities

Weird Fact #28: “Fat-free” cooking sprays generally contain fat.

Thanks to the steady advance of technology, we are accustomed to experiencing the amazing culinary miracles of modern living on a daily basis. Pizza can be delivered to our door. Oreos now come in Double Stuf varieties. And everywhere outside New York City, you can still get soda in a bucket the size of a feed trough. It is incredible to think that just a few generations ago, our barbarous ancestors killed and died for spice routes just so they could obtain enough oregano to make their unrefrigerated meat taste slightly less like a hunk of spoiled flesh. Today, we start cussing if our neighborhood supermarket has fewer than five flavors of Cheez-It.

But for me, perhaps the biggest food miracle came in the form of an aerosol can filled with fat-free cooking spray. For those who haven’t bought groceries in the last half century or so, cooking spray is the magical substance that keeps my grilled cheese sandwich from becoming part of the pan but manages to do so without the nasty nutritional side effects of traditional cookware greases like butter or lard. You can now cook like grandma without acquiring her waistline and, for emotional validation, you can always turn to the trusty label which, on many cans, features the words “fat-free.” The “Nutrition Facts” section usually has a big reassuring zero next to “Total Fat.”

Hooray for the march of science. So, how do our ingenious alchemists in the food industry manufacture something which looks, acts and tastes so much like oil?


They use oil.

You know…the kind with fat in it.

Miracles just aren’t what they used to be. The miracle here turns out to be not so much in the chemistry of the cooking spray itself as in the regulations that govern food labeling, which mandate that as long as your product has less than half a gram of fat per serving, you can label it as though it doesn’t have any. A miracle indeed!

Moreover, servings turn out to be hilariously small. On many cans, they are as little as a quarter-second or a third-of-a-second spray. Your can of fat-free cooking spray can contain hundreds of servings, each with what the label likes to refer to as a “trivial” amount of fat.

That fat may be trivial to the manufacturer but unfortunately your body still notices it, especially if you are slathering on gobs of the stuff because you think it has no consequences. Personally, I’ve been known to lacquer the pan for three or four seconds and some probably do it longer.

How much fat is actually in your cooking spray? It’s hard to say, since the regulations allow the maker to round down to zero under the half-gram rule. But applied to a spray with a quarter-second serving, that means you could be getting nearly two grams of fat for every second of spray time. With a six- or eight-second coating from such a spray, you could theoretically have just emptied the fat equivalent of a cheeseburger or a slice of pepperoni pizza into your meal without having the slightest idea. Sprays with 1/3-second servings are better but are still far from fat-free.
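If you want to see the math, here’s a quick back-of-the-envelope sketch in Python. The numbers are assumptions built from the worst case the label allows (a quarter-second serving carrying just under half a gram of fat), not measurements from any particular can:

```python
# Back-of-the-envelope upper bound on the fat hidden behind a "fat-free" label.
# Assumptions (illustrative, not taken from any specific product):
#   - each serving is a 0.25-second spray
#   - each serving may contain just under 0.5 g of fat and still be labeled "0 g"

SERVING_SECONDS = 0.25       # quarter-second serving
MAX_FAT_PER_SERVING_G = 0.5  # labeling threshold: anything below this rounds to zero

def max_hidden_fat(spray_seconds: float) -> float:
    """Worst-case grams of fat for a given spray duration."""
    servings = spray_seconds / SERVING_SECONDS
    return servings * MAX_FAT_PER_SERVING_G

for seconds in (1, 3, 6, 8):
    print(f"{seconds}-second spray: up to ~{max_hidden_fat(seconds):.0f} g of fat")
```

Under those assumptions, a one-second lacquering comes out at up to about two grams, and a six- or eight-second one lands squarely in cheeseburger territory.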

Of course to be fair to the manufacturer it should be pointed out that cooking sprays do generally have less fat than pure oil. They are good products and probably are better for you than simply dumping butter or unadulterated corn oil into your pan. Cooking sprays are an effective way to cut down on fat.

However, that doesn’t mean they don’t contain significant amounts of the stuff or that you can load them into your favorite recipe without consequences. I’m afraid miracles aren’t always what they appear.

Well, at least there are still Cheez-Its.

Fictional phone numbers aren’t fictional

Weird Fact #27: 555- phone numbers are real.

Whether it’s a film noir detective flick, a Saturday Night Live parody advertisement or simply a girl who doesn’t want to see you again jotting down her number in a bar, the 555 phone prefix has long been a classic standby for writers of creative fiction.

But what most don’t know is that there really are genuine 555 numbers out there. Though few writers obey the rule, the only triple fives officially reserved for use in fiction are 555-0100 through 555-0199. The rest have actually been fair game for real people since 1994 when the North American Numbering Plan Administration decided to let the nation’s most well-known non-prefix out for general use. You are probably struck, as I was, by two questions.

1) We have a North American Numbering Plan Administration?

2) Why have I never heard of any 555 numbers in real life?

The answers are: yes, of course we do, and because it didn’t really work that well. The 555 numbers were supposed to serve as nationwide, area-code-free numbers similar to 1-800 lines. Businesses loved the idea, but phone carriers didn’t and said it would cost too much, so the concept mostly flopped.

No one really seems to know why 555 became the fiction writer’s best friend. Some think it was the catchiness of it while another author believes it has to do with the letters marking the 5 key which used to be important when people were still requesting exchanges by name. (After all, how many place names use the letters JKL?)

Not that some fake numbers haven’t become more popular than others. For some reason, 555-2368 seems to be a favorite, having been associated with everything from The Ghostbusters to TV detective Jim Rockford.

Of course, some fictional digits have become famous for having very non-fictional (and often very irate) owners on the other end. The number 867-5309 is still a working number in some area codes although, thanks to the early-’80s hit song “Jenny,” many unfortunate folks wish it weren’t. At the peak of the song’s popularity, owners of the number received literally thousands of prank phone calls, most from snickering adolescents asking for Jenny and congratulating themselves for being clever enough to devise a practical joke that only every human in the known universe could have come up with.

It all could have been avoided if only Tommy Tutone had stuck with 555.

Sudoku’s history isn’t what you think

Weird Fact #26: Sudoku isn’t Japanese.

The sudoku craze seems to be ebbing these days, but most of us can remember a few years ago when it was really hot. You couldn’t swing a cat without hitting someone who was furiously scratching and erasing little gridded columns of numbers like they were deciphering the Enigma code. These annoying numerical crosswords litter the back pages of newspapers to this day. Frankly, I always found the bloody things irritating, like those damned Magic Eye illusions that became wildly popular in the 1990s owing to the mysterious special property that allowed every human on Earth except me to see the hidden 3D picture. I still suspect Magic Eye was a giant hoax and everyone was just pretending to see the image so they could make me feel dumb.

But today’s weird fact is about sudoku nonetheless and it is kind of an interesting one.

Sudoku isn’t really Japanese. In fact, it wasn’t invented in Asia at all.

Actually, it’s American. The most famous Japanese number game on Earth was created by a fellow named Howard Garns in Indianapolis. Yep. And you don’t get much more American than a guy in Indiana named Howard. Sounds less romantic now, doesn’t it?

Anyway, Garns was an architect whose innovation in the competitive field of time-wasting first popped up uncredited and largely unnoticed in the late 1970s in a puzzle magazine. In fact, forerunners of modern sudoku date back some two centuries but Garns is the one who seems to have put it all together.

Apparently, it finally caught someone’s interest and took the Land of the Rising Sun by storm, eventually becoming popular enough to traverse the Pacific back to its homeland with a new and exotic Japanese moniker.

The initial name may indeed have been part of the problem. Garns’ original title for the puzzle was the forgettably generic “Number Place.” The architect may have been a genius at inventing math games, but he clearly had a lot to learn about marketing.

Sadly, Garns never saw his brainchild hit the big time. He died in 1989.

The flag isn’t backwards…but you are probably still using it wrong

Weird Fact #25: American flag patches on military uniforms appear to be “reversed.”

If you’ve ever had occasion to look at the shoulder patch of a member of the United States military, you may have noticed something amiss. It’s subtle but it might cause a double take nonetheless.

The flag seems to be backwards.

Many people have picked up on this over the years and assumed the reversed flag was some sort of error. However, the reality is that the folks who sew flag patches on do know what they are doing. The “backwards” flag is quite intentional and actually has a symbolic meaning.

Back in the olden days when flag bearers would carry the nation’s colors into battle, the flag would unfurl behind its carrier with the canton, or blue field, facing forward and the stripes flapping behind in the breeze as the army charged ahead. Though it looks backwards to an observer on a right shoulder patch, this is the way the flag is supposed to be seen, with the stars pointing to the wearer’s front. If Old Glory were directed the other way, it might seem more natural to the eye but it would symbolically suggest a less-inspiring “retreating” flag, one whose wearer is moving in the wrong direction – away from the battle.

Incidentally, despite our reverence for the flag, our nation’s symbol is rarely treated with the respect it deserves under the country’s flag code. (C’mon, admit it. Bet you didn’t even know we had a flag code.)

Anyway, we’ve got one and a startlingly small number of people actually follow it, choosing instead to abuse the national emblem in well-meaning ways that would make Betsy Ross give up embroidery altogether.

For starters, the flag is never to be worn as a piece of clothing so those American flag shorts you bought on a lark for Fourth of July aren’t just an insult to fashion. They are also very disrespectful to our nation’s symbol. Ditto for sports uniforms and party favors like flag-themed paper napkins. Using Old Glory to wipe mesquite sauce off your piehole during the family barbecue turns out to be a less than stellar way to honor the colors. Who knew?

The flag is also never supposed to be used in product promotion, which means that seemingly every beer commercial and President’s Day mattress sale ad is pretty much a rule book violation.

Surprisingly, fairly common activities are also off limits: draping flags off parade floats, for instance, as well as carrying the flag horizontally for display. You know those football games where at halftime the band comes out with an American flag the size of a ZIP code stretched out between them and marches around while energetically belting out John Philip Sousa tunes? Very patriotic, but also very wrong. (The flag part is wrong, I mean. Sousa is perfectly okay unless it’s 3 a.m. and your downstairs neighbors have work in the morning.)

But everyone seems to violate the rules anyway. Major political figures, even presidents, are sometimes caught autographing flags, another big no-no, as is altering the flag to superimpose various images.

Strangely, even stamps could be considered a violation in the strict sense because they are disposable.

Flags are often treated badly in protests as well and not just by flag burners, but also by those who fly them upside down as a sign of “distress,” something that is prohibited except in situations of extreme emergency.

Untold secrets of the condiment bar

Weird Fact #24: You have no idea how to properly use a fast food ketchup cup.

Step 1

If you are like most Americans you’ve probably spent your fair share of time in front of the condiment pumps at your local Burger ‘n Burp. And if you have, you are more than familiar with the standard container of the fast food world – that tiny soda cap-sized pleated paper thimble which holds about one-and-a-half squirts of tomato-based goodness for your fries. You know, the one that always falls over when you try to fill it and allows you to comfortably transport several individual molecules of ketchup to your seat. Seriously, I’ve seen people haul about four of these things back to their table. (Of course, by “four” I mean “ten” and by “people” I mean “me.”)

Step 2

But regardless of your condiment-hogging proclivities, you may have wondered from time to time what the deal is with the pleated sides. I noticed this but always suspected it was some integral part of the cup-making process.

In reality, it is a design feature and virtually no one I have ever seen eating fries knows about it.

Pinch and tug on the sides and the cup opens like a flower, leaving you with a convenient, flat, wide surface for all your dipping and eating pleasure.

Step 3

Yep. It took a little effort but the pleats do pull apart and unfold. Neat, eh? Now you know how to pull out a real conversation-stopper on dates (assuming of course the conversation hasn’t already stopped when you suavely arrive at “Hey there, beautiful, lemme show you how to dip those fries correctly…”).

Of course, as empowered as one might feel knowing the right method to unhinge your ketchup cup, there is something a wee bit disconcerting in also understanding that this hidden ability has existed in something many of us have used hundreds of times and no one seems to have known a thing about it. It’s not like there are instructions. It’d be like finding out that you’ve been struggling all these years to get Tic Tac mints out through that stupid little hole when in fact there is a simple way to dispense them into the lid by … oh, seriously? Aw… geez…

Modern life is just too complicated.





What’s the real story behind Pearl Jam’s disturbing ‘Jeremy’ video?

Weird Fact #23: Pearl Jam’s infamous music video “Jeremy” was not about a school massacre.

With the tragedy at Sandy Hook Elementary School, the public consciousness was yet again assaulted by the horrors of random violence when a deranged gunman took the lives of more than two dozen people, most of them children, as morning classes got underway.

But as with other mass shootings, especially those at a school, the awful scenes that may have popped unwillingly to mind probably came from popular culture rather than the real world. Particularly for those of the MTV generation, they may even have been associated with a name.


Released by Pearl Jam in 1991, the creepy and controversial music video of that name featured shots of a manically glaring, eerily wailing Eddie Vedder, belting out a song that appeared to narrate a troubled boy’s chilling descent into a nightmarish world of rage and insanity. Intercut with flashed words suggesting evil influence and dark scenes of the young man’s building anger and alienation, it ends with a ghastly shot of a shirtless Jeremy strolling into a classroom and casually tossing an apple to his teacher before leaving his classmates spattered with blood, their faces frozen in expressions of shock and horror. Vedder’s emotive and unnerving refrain of “Jeremy spoke in class today” still sends a cold shiver down the spine.

Artistically well-crafted, gut-wrenchingly offensive and viscerally disturbing, the video is still seen as controversial more than two decades after its creation. Released eight years before the killings at Columbine High School, it was thought in some ways to presage – perhaps even contribute to – a coming culture of random mass gun violence and its school setting made it particularly relevant to incidents like Columbine, Sandy Hook and Virginia Tech. In fact, the band’s hit was even cited in court as an influence in the defense of a Washington state high schooler who pleaded insanity after being accused of murdering a teacher and two classmates.

Yet, there is a surprising twist to the story of the song that became so emblematic of – and occasionally blamed for – mass killings in contemporary America.

The video had absolutely nothing to do with a massacre and it was never intended to convey a child murdering his classmates.

In reality, Jeremy was based on a real-life tragedy, a young man who did indeed come to school with a gun. But the actual Jeremy didn’t kill his fellow students. The Texas teen shot himself in front of them with a .357 Magnum. In the video version, the blood staining the students’ crisp white shirts isn’t meant to be their own but rather that of the titular character himself. The video is about a suicide, not a mass murder.

The reason for the mistaken impression?

It was a combination of censorship and poor editing. The original cut of the video showed Jeremy walking in, drawing a weapon and putting it into his mouth. That, understandably, was a bit too hot for MTV, which nixed the idea. The gun scene was cut, leaving only the shot of Jeremy entering and the closing tableau of blood-soaked students. The implication of the video changed entirely.

The director of the piece, which won several awards, called the resulting misinterpretation his “greatest frustration” and said he still received calls about it years later, particularly after Columbine occurred.

The bizarre world of the McGurk Effect: How your brain lies to you every day

Weird Fact #22: Your eyes can cause your ears to hear things that aren’t there.

Your brain is pretty cool but you really shouldn’t trust it.

There’s a great old line that most of us have used at one point or another in the heat of an argument: “Who am I going to believe? You or my own ears?”

After your mind is completely blown by today’s weird fact, you might never ask that question again or at least you won’t be sure of the answer.

First, a little background. Many years ago, a researcher named Harry McGurk was testing the effects of mothers’ voices on their babies because… well… because scientists lead boring lives. At one point, McGurk redubbed the voice of a mother saying one syllable over video of her saying a similar but different syllable. At first glance, the result would seem to be something like one of those hilariously retranslated kung fu movies. But oddly, McGurk heard something pretty similar to the sound he saw the woman say, NOT the sound he actually redubbed. It was so bizarre that his first thought was to recheck the tape. But it was playing correctly. His ears, however, were not hearing it correctly. Instead, he was hearing something closer to what his eyes told him she was saying.

What McGurk had accidentally stumbled upon remains among the most extraordinary audio/visual illusions I’ve ever experienced. He published it in a paper called “Hearing Lips and Seeing Voices.”

You can test this truly strange effect on yourself with any of the many McGurk effect demonstration videos available online.

Now, if you are tired of people “bahing” and “gahing” at you like you are a five-month-old, you might take a brief moment to think about the awesome implications of this perceptual effect and why it is so utterly freaky. It’s not just your conscious mind making the assumption that the syllable is what’s shown. You ACTUALLY hear the other syllable, or at least some middle “compromise” syllable that your brain has decided upon. By simply closing your eyes, you will hear something completely different. Open them and the sound will change back.

We tend to think of our senses as separate feeds of raw information pouring into our prefrontal cortex, but they are not. What you think you see and hear is actually filtered and combined into a single coherent whole before it ever reaches your consciousness, and it’s adjusted along the way for your listening and viewing pleasure. This is why, no matter how hard you try and no matter how much you know about the effect, you can’t hear the real syllable being said as long as your eyes see something else. Your brain simply won’t allow it because it doesn’t make sense. At an unconscious level, it alters the sound for you so what you are seeing syncs up with what you are hearing, and it does this even if you consciously know what the real sound is and what you should hear. Your eyes still override your ears and send you a false message.

Before you are too hard on your brain for blatantly getting caught in a lie like this, you should remember that it is the result of millions of years of evolution, most of which took place well before the invention of video tape that psychologists could redub to fool babies (or other psychologists). In fact, your brain is just doing what it is supposed to do. It knows that lips which make a “gah” or “dah” shape shouldn’t sound like “bah” so it assumes, not unreasonably, that your ears made an error, one which it helpfully fixes for you.

But the very fact it can correct this error and change the sound before you mentally “hear” it is simply incredible. Like Neo in “The Matrix,” we are left to question whether what we are seeing and hearing is actually reality and we don’t even need the pain of having that creepy prong shoved into the back of our cerebellum — nor the even worse agony of watching Keanu Reeves attempt to convey human emotion.

The fact is that your brain lies to you every day. The theater of the mind shows you not what your senses perceive but an overall picture it develops for you based on the best information available.

And if there are any parts that don’t seem to make sense?

Well, those wind up on the cutting room floor.

Bad medicine: Why pills shouldn’t be kept in the only spot designed for them

Weird Fact #21: You should never store medicine in your medicine cabinet.

It’s a part of the morning or evening ritual for millions of Americans. Somewhere between brushing our teeth and taking a shower, we click open the magical mirror to our own personal pharmacy, pop whatever miracle curatives modern medical authorities have decreed essential to our continued well-being and then go about our happy lives.

But there’s an odd irony lurking behind our smiling reflection that most of us never realize.

The medicine cabinet is among the absolute worst places imaginable to keep medicine.

It seems second nature to put pills in the bathroom. But it turns out that doctors don’t design homes and architects rarely have degrees in pharmacology. The fact is that an area that periodically undergoes rapid temperature change and fills with steam is about the last place you want to keep tablets and potions, which often react badly to extreme environmental conditions. Moisture can be particularly damaging to such items, potentially reducing their effectiveness and shortening their shelf life. Cool, dry places are far better for keeping drugs from breaking down. Some recommend the top shelf of a linen closet or, better yet, a lockbox of some kind. The most important thing is to keep them out of the reach of children.

Why exactly we decided to stage long-term storage of medically sensitive items in an area that spends part of its day with the same climate as an equatorial rainforest remains a mystery but it certainly wasn’t the ideal choice. Somewhere along the line we simply began mentally classifying pills in with Q-Tips and shaving cream as something we use in the bathroom. It’s an error reinforced by many grocery and department stores which inevitably seem to place shampoo, mouthwash and similar personal care products near the pharmacy counter. But in reality grandpa’s heart prescription isn’t actually a toiletry and shouldn’t be treated like one.

So now you know. Before building your next house, consult a pharmacist.

Why Jerry Springer is really our fault

Weird Fact #20: The Jerry Springer Show was never intended to be a trash TV vehicle.

We’ve come to think of Jerry Springer as a sort of cultural touchstone for daytime sleaze and the reputation is well-deserved. Springer, a one-time politician (he was mayor of Cincinnati), has indeed pushed the bounds of decency far beyond what most of society is – or really should be – comfortable with. As a forerunner to much of reality TV’s craziness, The Jerry Springer Show will, for better or for worse, go down in television history as a trailblazing enterprise, even if a steady diet of programs like “I married a horse,” “I’m happy I cut off my legs” and “Kung fu hillbilly” may leave us wondering if we as a civilization really want to go where that trail might lead. But there is one surprising fact about the undisputed king of trash television that most people forget.

The Jerry Springer Show’s initial format was as a serious talk program that explored actual social problems.

Yep. Really. Strange as it may seem, Springer’s enterprise was never originally conceived as garbage by its creators.

In fact, the first few seasons of the show, launched in the early 1990s, were often devoted to important topics like poverty and crime or to uplifting stories, and featured weighty guests like Oliver North and Jesse Jackson. Producers initially hoped to create a new Phil Donahue out of the dapper and intelligent Springer, who after politics became a television reporter and commentator known for his genuine, concise, personable editorial style. He seemed the perfect choice to explore social issues in a probing but sensitive way.

Yet, today, he makes a living supervising slap fights between cheating lovers and interviewing midget Klansmen while tossing “Jerry beads” to female audience members who expose themselves.

What on earth happened?


Springer’s early ratings were not particularly impressive. It turned out that serious topics didn’t elicit the kind of viewership backers were hoping for, while some of the goofier and zanier subjects seemed to cause spikes in the numbers. The answer was obvious, and the serious newsman began producing the sort of wince-inducing fare that would make him famous. The move worked, and Springer’s program gave up any sort of sobriety in favor of raking in megabucks by showcasing a parade of human detritus for a wildly hooting audience. The result is, as one commentator put it, “unbearable to watch but impossible to turn off.”

There is perhaps also a dark lesson here. We like to blame the Jerry Springers of the world for civilization’s ceaseless deterioration though in the end, they may only be surfing the flood tide of toxic sewage it leaves behind. Before we are too hard on the emcees presiding over society’s entropic collapse, we should remember that Jerry Springer wanted to do a somber, no-nonsense show. It is you and I who voted with our dollars and convinced him otherwise.

Springer himself certainly seems to know this.

“It’s just a silly little show that has a niche,” he once said. “It has absolutely no redeeming social value whatsoever.”

Looking before you leap: Three centuries out of four isn’t bad

Weird Fact #19: Leap years aren’t really quadrennial.


It will be a while before it matters, but you may want to remember this week’s weird fact when it becomes relevant in about 87 years.

February 29 doesn’t actually occur every four years.

This seems an odd revelation given that everyone alive today remembers an extra day in February popping up on a quadrennial basis as reliably as presidential candidates in Iowa. But in reality, that isn’t the rule.

The entire purpose of a leap year is to periodically realign our troublesome solar calendar, which gradually falls out of sync with the seasons because days are based on the spin of the earth on its axis while years are based on the planet’s orbit around the sun. While we think of days and years as connected, they, in fact, have nothing to do with one another. The earth doesn’t orbit the sun every 365 days but rather every 365.2422 days. The Julian calendar, introduced by Julius Caesar (or “Julie” as his friends called him), accounted for this problem by adding a leap day every four years.

But this created a new difficulty, namely, that .2422 isn’t a full quarter day and Caesar’s overcompensation left the calendar falling out of alignment the other way, albeit at a much slower rate.

The solution was the Gregorian calendar, instituted by Pope Gregory XIII (“Greg” to his drinking buddies), which leaves out three leap days every four hundred years. Each new century skips its leap day; however, centuries divisible by 400 are exempt from the skip. That’s why 2000 was a leap year while 1900 was not, and 2100 won’t be either.
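If it helps, the whole rule fits in a couple of lines of code. Here’s a minimal Python sketch (the function name and sample years are just mine, for illustration):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year, except century years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 was a leap year; 1900 was not, and 2100 won't be either.
for y in (1900, 2000, 2004, 2100):
    print(y, is_leap_year(y))

# Skipping three leap days every 400 years is what pulls the average
# Gregorian year (365.2425 days) so close to the actual 365.2422-day year.
julian_avg = 365 + 1 / 4
gregorian_avg = 365 + 1 / 4 - 3 / 400
print("Julian average year:   ", julian_avg)
print("Gregorian average year:", gregorian_avg)
```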

Not to add to your worries, but even good old Greg’s calendar will eventually create problems, as it doesn’t align perfectly either. Being a few seconds out of kilter with the earth’s actual trip around the sun, it will fall about one day out of sync over the next 3,300 years.

And you thought Daylight Saving Time was complicated.



Seeing red: The lies our wrists tell us

 Weird Fact #18: Deoxygenated blood isn’t blue.


Regular readers of this blog know that there are a lot of things we think we know that it turns out are flat wrong. Thanks to the viral power of the internet, such “facts” can spread with great ease and soon enough everyone knows something that doesn’t happen to be quite true. We expect that when it comes to urban legends, sometimes the world fibs to us.

But it is much more disturbing to learn one’s own eyes are no more truthful, which brings us to the obviously blue blood in your veins. If you are like most folks, you learned long ago that, after it has dropped its payload of oxygen, your hemoglobin turns blue, a fact you can easily confirm by looking at your own wrists, which have nice tubes of bright azure flowing back to your heart. The only reason we don’t bleed blue, it is said, is that venous blood turns instantly red upon contact with air. However, the real reason we don’t bleed blue is far simpler.

We don’t have blue blood.

Not in veins. Not in arteries. Not ever. Human blood isn’t blue.

It is true that deoxygenated blood is a slightly different shade than its arterial cousin, a darker, duller red, but it is still red nonetheless. The reason we see it as blue has to do with the way light wavelengths penetrate the skin and vein walls, which filter out other colors. The darker color of the venous blood and the fact that veins are often closer to the skin and more visible may play a role as well.

Sadly, this misperception is frequently reinforced in the one place you’d expect it not to be – biology textbooks. Educational materials often show veins as blue and arteries as red but this is done for purposes of contrast not because our vital fluid is actually Smurf-toned in its natural state.

As a side note, the term “blueblood” as slang for an individual of aristocratic breeding probably came about as a result of the European elite displaying their fair skin, through which veins appear bluer, to others as proof they were of “purer” or more Germanic descent.

In reality however, the upper crust has the same blood the rest of us do – even if they were probably just as confused over exactly what color it might be.

President Bill Blythe: The almost famous name you’ve never heard of

Weird Fact #17: You probably don’t know the last name Bill Clinton was given at birth.


Today, “Clinton” may be one of the most substantial brands in American politics but few remember that it wasn’t the moniker our 42nd president started life with. In fact, he was born William Jefferson Blythe III, named after his father, who was tragically killed in a car accident months before the future president was born.

Later, after his mother wed Roger Clinton, Sr., a teenage Bill decided to change his name to match that of his mother, stepfather and half-brother. Incidentally, the move came as a surprise to mom, who didn’t become aware of it until she received an unexpected call from the local courthouse informing her that her minor son had arrived to claim the surname he would later make into a household word, and that her permission was required. Clinton later recalled that his mother “probably thought I had slipped a gear.”

Still, Virginia Clinton gave her assent and ensured that the man who would inhabit the White House for eight years would not enter the history books as “President Bill Blythe,” nor would his wife – and possible future contender for the top office – be known as “Hillary Rodham Blythe.”


Source: “My Life” by Bill Clinton

The legend of the famous ‘unrottable’ fast food burger rides yet again

Weird Fact #16: McDonald’s hamburgers possess no special protection against mold.

We’ve all seen the experiment. Thanks to the internet it’s become perhaps the biggest meme since bored college students discovered the joys of mixing Mentos and Diet Coke in dorm rooms.

It’s the McDonald’s burger that won’t rot and it’s taken the Web by storm. The story is simple enough. An enterprising experimenter purchases a Mickey D’s hamburger, leaves it out for the elements to attack and the elements, in their infinite wisdom, stay the heck away from it. People have been wandering around toting nearly pristine burgers for years in some cases.

For contrast, many also display a homemade burger which quickly starts to look like a fungus playground. The conclusion that has spread far and wide virally is that burgers from the Golden Arches must be somehow imbued with a special artificial or generally unfoodlike property such that microbes won’t touch them and – bowing to the special knowledge of microorganisms – presumably neither should you.

But the truth may be quite different. While few would argue that regular visits to the king of fast food are good for your health, do McDonald’s patties really possess some alien quality that allows them to resist the decomposition process?

The answer appears to be no.

The secret of the mummified burgers is probably just a result of amateur science conducted under non-standardized testing conditions, say some. The real reason, they note, is that the burgers simply dry out in the open air. If sufficient moisture leaves the patty before mold spores and tiny critters can gain a foothold (assuming, of course, that microscopic creatures actually had feet, which – unless someone is under the influence of very powerful hallucinogens – they do not), you don’t get mold growth. You get beef jerky.

So, why do homemade burgers show signs of rot? It’s probably because they are frequently larger. Bigger burgers lose moisture less quickly than their thinner fast food cousins. That means mold can get a start before dehydration sets in.

In fairness, I haven’t done the experiment myself, but a columnist at Serious Eats tested this theory and found, lo and behold, that size and surface area were indeed the important determinants of mold growth for both Ronald’s burgers and those whipped up on the home cooktop. The columnist also did get the small burgers to mold – as long as he put them in a plastic bag that held in the moisture. The unexciting truth is that McDonald’s burgers are, just as they claim, 100% beef with no odd or scary components that render them impervious to the ravages of time.

Part of all this hoopla seems to have a psychological element, perhaps a general public willingness to believe stories that make major corporations seem evil or unwholesome. Another corporate behemoth, Coca-Cola, has dealt with similar problems over the years, often the victim of unfounded tales claiming that its famous beverage can be used to do everything from killing sperm to cleaning engines to dissolving nails overnight.

Perhaps we all like the notion that the burger created lovingly by Flo at your local diner is somehow less soulless than one mass produced by a teenage burger jockey at the fast food gulp ‘n go. There’s nothing innately wrong with taking such a view but it’s also important to remember that that doesn’t mean any rumor one hears is automatically true.

So, there you have it. Science triumphs again. Don’t believe it? Get your own Happy Meal, seal it in a plastic bag and see for yourself.

But you are probably better off with a nice salad.

Don’t shake it like a Polaroid picture

 Weird Fact #15: Shaking instant photos does not help them develop.


It was second nature to a generation of American amateur shutterbugs. Snap the photo, let the camera spit it out with that cool little mechanical whirring sound … and then violently flap the poor thing back and forth like a wet towel because God forbid we should waste valuable seconds waiting to see if Aunt Mabel finally had her eyes open in this one. Welcome to the world of instant photography.

Of course, younger readers are now chiming in with “Wait. What’s instant photography?” Well, back in the day, Polaroid had (“Wait. What’s Polaroid?”)….uh…they’re a company that made film which…(“Wait. What’s film?”).

Okay, you know what? Younger readers should just skip this column altogether.

Anyway, you are not supposed to waggle instant film back and forth to make the colors come out quicker. This particular myth developed (heh, developed, get it?) from the idea that instant pics dried quicker if you shook them. But later Polaroid film actually dried beneath a clear sheet, and fanning had no effect. In fact, it could harm the photo, a fact Polaroid felt compelled to warn people about after the success of 2003’s “Hey Ya!” by OutKast (“Wait. What’s OutKast?”), which encouraged people to “shake it like a Polaroid picture.” (As I recall, it also advised listeners to be on their baddest behavior and lend the singer some sugar because he was in fact their neighbor, but Polaroid issued no guidance on this point.)

Interestingly, even in the age of memory cards and digital photography, instant film is still made for Polaroid cameras by some diehard enthusiasts who picked up the idea when Polaroid, which, incidentally, is also still around, lost interest. With the goal of bringing back instant photography, the new company named its venture “The Impossible Project.”

Of course, what’s really impossible is to get people out of the habit of shaking the damned thing.




Existence is inherently unstable and will eventually come apart

Weird Fact #14: Recent discoveries leave researchers believing that the universe is teetering precariously on the edge of total annihilation.

I don’t mean to alarm anyone but the universe is apparently doomed and will likely be ripped violently into a vacuous void of nothingness ending all physicality and casting everyone and everything everywhere into a formless oblivion.

And you were worried about an asteroid. Bet you feel silly now, eh?

Anyway, that’s the comforting conclusion of scientists who, regrettably, cannot put a precise date on our eventual doom.

“To get the exact number, we need more funding,” joked one researcher.

Well, it’s good to have a sense of humor about your work.

But this isn’t parody. It’s the all-too-real conclusion of the good folks over at the Large Hadron Collider who have been on the hunt for the Higgs boson, dubbed the “God particle” by overenthused headline writers because it apparently gives matter its ability to have mass, which turns out to be important if you like a universe with things in it.

For the record, I was totally against giving scientists a Large Hadron Collider where they could search for something with a name like the “God Particle.” I felt strongly they should collide their hadrons at regular speeds like normal people and simply come up with a useful molecule or two.

Just throwing that out there.

In any event, if the researchers are right, we could, much like the intrepid travelers from Douglas Adams’ The Hitchhiker’s Guide to the Galaxy, be residents of a plane of existence based on a miscalculation. Perched unstably on the edge of the abyss, this may be a slightly defective universe, which, at 14 billion years old, is almost certainly out of warranty.

Regardless, researchers explain that any development of a tiny speck of true vacuum would careen at the speed of light across all of existence, decaying protons, wiping out known reality and significantly raising car insurance rates.

“The universe wants to be in a different state,” said one, “so eventually to realize that, a little bubble of what you might think of as an alternate universe will appear somewhere, and it will spread out and destroy us.”

Yep. You heard that right. The universe actually wants to destroy itself. Isn’t that just peachy?

But the team said there isn’t much reason for concern. Our suicidal universe should still take tens of billions of years to implode erasing reality as we know it so you shouldn’t start looting or calling old exes for a final hookup quite yet. There’s still plenty of time for the bucket list.

“You won’t actually see it, because it will come at you at the speed of light. So in that sense don’t worry,” said one researcher displaying an example of what a particle physicist apparently considers a comforting thought.

Incidentally, researchers believe that what would replace the universe would be “boring.”

Yes, that’s an actual quote.

So, like Arthur Dent, if you’ve ever thought there was something fundamentally wrong with the universe, scientists have your back.

At least until backs are wiped out in the collapse of existence.

Going down the drain: Water does what it will regardless of geography

 Weird Fact #13: Liquid does not swirl in a different direction based on which side of the equator you are on.

If you are like me, you probably believed this old chestnut for years. But contrary to popular belief, water does not spin down drains any differently in the Northern Hemisphere than in the Southern.

This particular myth is based on a real scientific phenomenon known as the Coriolis Effect but the dynamic only applies to large-scale vortexes such as hurricanes, which are unlikely to occur in residential drainage situations unless your home life is far more eventful than mine.

Technically, of course, the effect, which in this context is a byproduct of the Earth’s rotation, does influence all swirls, whether in your bathroom sink or spread across the South Pacific. But in practical terms, the effect is so tiny on ordinary vortexes that it makes virtually zero difference. The direction water moves is determined by factors like the shape of the basin, the friction involved and any previous momentum.
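For a rough sense of just how tiny, here’s a back-of-the-envelope sketch in Python. The flow speed, drain size and latitude are made-up but plausible numbers, purely for illustration:

```python
import math

# Illustrative, made-up numbers for a bathroom sink (not measurements)
OMEGA = 7.2921e-5    # Earth's rotation rate in radians per second
latitude_deg = 45.0
flow_speed = 0.1     # m/s, a plausible speed for water circling the drain
drain_radius = 0.05  # m

# Coriolis acceleration on the moving water versus the acceleration
# already present just from the water swirling around the drain
coriolis = 2 * OMEGA * flow_speed * math.sin(math.radians(latitude_deg))
swirl = flow_speed ** 2 / drain_radius  # centripetal acceleration of the swirl

print(f"Coriolis acceleration: {coriolis:.1e} m/s^2")
print(f"Swirl acceleration:    {swirl:.1e} m/s^2")
print(f"The swirl wins by a factor of roughly {swirl / coriolis:,.0f}")
```

With those made-up numbers, the everyday swirl beats the Coriolis nudge by a factor in the tens of thousands, which is why the basin, the faucet and the last stir of your hand get the final say.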




Smoking out the truth on cigarettes

Weird Fact #12: Most people who try cigarettes don’t become addicted.

At one point or another, we’ve all had that one friend who is a three-pack-a-day guy. He’s the poor fellow with the wheezing cough who spends his lunch hour freezing outside the building and reeks of smoke – a sort of walking anti-tobacco ad. Moreover, he generally hates his habit as much as you do. One smoker I knew said if the man who gave her her first cigarette wasn’t dead, she’d kill him.

It is now well-acknowledged that smokes are deadly and that nicotine, their key ingredient, is a highly addictive and very nasty substance. That’s why most parents wisely advise kids to avoid lighting up for the first time. After just a few puffs, you could be that three-pack-a-day guy.

However, surprising statistics and research suggest the odds are that you won’t.

This is a heretical truth that is rarely shared. While cigarettes are indeed dangerously addictive to untold numbers around the world who develop hardcore habits, somewhere between half and two-thirds of Americans who try smoking never become smokers at all. Unlike with many other drugs, most find the experience deeply unpleasant and never go near cigarettes again. The habit isn’t as easy to pick up as popular culture might indicate.

In a more unexpected twist, even among those who continue to smoke, there are some who are not really addicted in the traditional sense and can take or leave their cancer sticks much as they please. Known as “chippers,” some of them smoke only occasionally. Exact numbers aren’t known, but estimates run between 5 and 20 percent of the smoking population. Some exhibit varying degrees of dependence. Others simply don’t.

Chipping also seems to be on the increase. Estimates say that as many as 15 million Americans who smoke don’t do so every day. An eye-opening World Health Organization report found that in Central America, non-daily users could account for as many as two-thirds of smokers. It should also be noted, though, that impoverished smokers in the developing world may simply have less cash to burn on their habit. Perhaps it is one of those rare instances of poverty being healthy.

Scientists believe genetic factors related to nicotine tolerance, as well as social and economic influences, decide who becomes the three-pack-a-day guy, who smokes casually and who just coughs violently and makes that first butt their last.

Still, it’s a complex picture that cuts to the heart of what constitutes addiction as we generally understand the term. We tend to think in hard lines and black-and-white realities, but in fact, the true nature of a drug’s potential for dependence is full of shades of gray.

Of course, little of this subtlety is mentioned in the fire and brimstone of PSAs or parental lectures. Nicotine is frequently talked about as being so habit-forming that even a single cigarette is a virtual guarantee that the monkey is on one’s back for life. Before one is too critical of such alarmism, it should be noted that, while this is not accurate for the majority of those who try cigarettes, there is good reason for the exaggeration. Smoking kills an estimated 443,000 Americans annually, often through horrible illnesses like emphysema. Almost 50,000 of those deaths are from secondhand smoke. In fact, the CDC estimates that a staggering one in every five American deaths is cigarette-related. Figures for the whole planet are even more frightening. The World Health Organization has implicated smoking in six of the eight leading causes of death on Earth, estimating it has felled an astonishing 100 million people over the last century. That’s almost the equivalent of fighting WWII – twice. Nicotine is quite literally a poison, and a significant number of experimenters really do get hooked from the start. Addiction is quite real and its consequences are very, very tragic.

Yet, it is true that if you do choose to try a cigarette, your odds of not becoming a heavy smoker  – or ever touching a cigarette again for that matter – are actually probably better than even, a far cry from what you were told as a teenager.

Still, the socially beneficial lie may be healthier than the raw truth and may save a lot more lives. The odds in Russian roulette are good too, but it’s still best not to play a game in which avoiding death is the best possible outcome.


Source: “The Tipping Point” by Malcolm Gladwell

Why a flaming cross as a symbol of hate? Well, it was in the script…

Weird Fact #11: The KKK’s infamous burning cross was the result of a filmmaker’s mistake.

A Ku Klux Klan rally is the sort of event most of us don’t find on our social schedules, but you probably know the drill well enough, or at least you are aware of the most famous part of all, the fiery cross. They aren’t common anymore, but for generations, cross burnings were the most visible symbol of fear, oppression and racial hatred in the nation and certainly the one most closely associated with the KKK. They were meant to inspire terror and intimidation and unfortunately they often succeeded.

But did you ever wonder why the Klan chose a flaming cross as its calling card? After all, for an allegedly Christian organization, setting the symbol of one’s own religion on fire seems a strange practice.

In fact, the answer is simple.

It was a screw up.

The original Ku Klux Klan, a loose, short-lived band of violent, mayhem-creating yahoos roaming the countryside wearing masks, came into existence at the close of the Civil War as a sort of Reconstruction-era paramilitary terrorist group and didn’t have anything to do with burning crosses. However, just after the turn of the century, an author named Thomas Dixon romanticized the then-defunct outfit in a book that outlined various practices of the Klan, including the burning of crosses, something based on a ritual old Scottish clans used as a way of rallying troops for war.

It wasn’t true, however. For starters, the original KKK never burned any crosses.

But the error was compounded by another mistake. The Scots did burn crosses, but they were St. Andrew’s crosses, crosspieces shaped like Xs. Dixon apparently mistook the reference to “crosses” as meaning the Latin variety, which carries so much religious symbolism for a billion Christians around the world. That’s what made it into the book.

Dixon’s errors might have sunk into obscurity were it not for D.W. Griffith, a now-famous film director and noted Klan sympathizer who based his epic 1915 work “The Birth of a Nation” upon the book, including the bit about the cross burnings, which (A) featured the wrong crosses and (B) never actually happened.

Sure enough, life took to imitating art. Griffith’s movie (taught in film classes to this day for its historic contribution to Hollywood, if not for its less-than-admirable content) was a hit and played a central role in the refounding of the Klan, which loyally adopted much of what it saw on screen as “real” Klan behavior. Hence, Griffith’s classic film is not based on Klan history so much as the modern Klan is a result of what future members saw in the film.

Incidentally, the very name Ku Klux Klan is something of an inaccuracy as well. Ku Klux seems to be a bastardization of kuklos, Greek for circle. Klan is for… well… clan, rendered with a K either as a marketing ploy or because the rubes in the sheets couldn’t spell all that well.

Ironically, today, many in the KKK say the burning cross is a vibrant Christian statement. In reality, perhaps the most potent and terrifying symbol of the horrors perpetrated by American racism was, of all things, a movie mistake.