How Your Birthday Affects Your Life

Photo © Simon Battensby

What does it take to make it big in professional sports? Good DNA? Talent? Great coaches? Money? All good bets, but there’s one particular thing, completely out of your control, that affects your chances like no other: your birth date.

Alex Bellos – a maths writer with a taste for football who’s written a brilliant book about numbers in everyday life – has analyzed the birthdays of the footballers participating in the upcoming World Cup in Brazil. He found that February, the shortest month, has the highest number of birthdays: 79 out of 736. Moreover, birth dates are strongly skewed towards the first half of the year: the first five months are all above average, while five of the last six are below. Only one day in January and February has no World Cup birthdays, but there are eight such days in November and December.

Why? Because the eligibility cut-off date for youth sports programs is usually January 1st. At an early age, this creates a significant difference in physical development: a kid born on January 1st could be playing alongside one born up to 12 months later. So it pays to be relatively older – and therefore bigger than your peers – if you want to make the team.

This is called the relative age effect. I first encountered it while reading Malcolm Gladwell’s Outliers, where he discusses the work of a Canadian psychologist who found that most elite Canadian hockey players were born in the first half of the year.

That doesn’t mean you should plan your pregnancies to end in early January at all costs. If you’d rather see your offspring pursue an academic career instead of a footballing one, autumn is your best bet. Data shows that the likelihood of getting into Cambridge or Oxford – England’s most prestigious universities – is 30 percent higher for applicants born in October than in July. Why? Because the cut-off date for school year groups is September 1st. Overall, autumn-born students are 25 percent more likely to be accepted than summer-born ones. Those crucial few months of added development evidently affect one’s mental abilities as well, so when looking to get into a good school, September is the new January.

It turns out that your birth date, along with other things that you can’t choose like your birth place, your given name, and your ethnicity, can greatly affect the outcome of your life. But birthdays are somewhat paradoxical: how many people do you think must be in the same room to make it more likely than not that two of them share the same birthday? Try to come up with a number.

The answer is just 23. If there are 23 people in the room, it’s statistically more likely than not that two will share a birthday. Why does this sound surprising? Because of the way we relate to math problems such as this. Instinctively, you wonder how probable it is that someone in the room has the same birthday as you, reasoning on a 1 in 365 chance, but that is not the question being asked. With 23 people there are 253 possible pairs, and only 22 of them include you. The probability that none of those 253 pairs shares a birthday – given that there are only 365 possible birth dates – works out to about 49 percent, less than half. Think about this the next time you’re at a party.
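If you want to check the figure yourself, here’s a quick Python sketch of the standard calculation (mine, not from the article), assuming birthdays are spread uniformly over a 365-day year:

```python
def p_shared_birthday(n, days=365):
    """Probability that at least two of n people share a birthday,
    assuming birthdays are uniformly distributed over the year."""
    p_all_distinct = 1.0
    for i in range(n):
        # i-th person must avoid the i birthdays already taken
        p_all_distinct *= (days - i) / days
    return 1 - p_all_distinct

print(round(p_shared_birthday(22), 3))  # 0.476 - still less likely than not
print(round(p_shared_birthday(23), 3))  # 0.507 - now more likely than not
```

The crossover really does happen exactly at 23 people, which is why that number shows up in every statistics textbook.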


The Boy Who Thought He Caused 9/11


On the afternoon of Tuesday, September 11, 2001, a British boy who had been diagnosed with Obsessive Compulsive Disorder – a form of anxiety relieved only by performing certain repetitive actions – was walking home while engaging in one of his compulsive rituals. This one required him to step on specific white markings on the street, but that day he accidentally missed one. Shortly after, when news of the atrocities in the US reached him, he became convinced that he had caused the attacks because he did not step on that particular white mark. The boy thought he was to blame for 9/11.

Consumed with guilt, he behaved erratically for days – he also had Tourette’s – and had to be persuaded by his therapists that the time difference between the UK and the US meant he had missed his ritual after the attacks had already happened, so he couldn’t be responsible.

People who suffer from OCD often believe that missing their rituals might cause terrible harm to themselves or others, but this was the first time an OCD patient thought he had caused a terrorist attack. The fear can take many forms and can even be externalized onto various objects, as in the curious case of a patient who panicked whenever he saw a particular type of car in the street, an El Camino.

A malfunctioning brain can create very powerful delusions: Capgras Syndrome is a rare condition – only about one hundred cases have been documented – in which a person believes that their spouse, friend, parent or other family member has been replaced by an identical-looking impostor. A similar condition, the Fregoli delusion, named after an Italian quick-change artist, makes patients believe that different people are in fact a single person who keeps changing appearance to torment them.

They both originate from a problem with face recognition, often due to illness or brain damage. The brain can no longer recognize familiar faces, and while attempting to make sense of them, it produces what psychologists call a confabulation, an unintentional lie that rings true to no one but the person who creates it.

This sort of mental short-circuit can unfortunately apply to one’s own body as well. BIID, or Body Integrity Identity Disorder, is a neurological failing that leads patients to feel that one of their limbs no longer belongs to them: they want it gone, and strongly desire to become amputees. They start to behave like amputees, and will often make gruesome attempts to get rid of the alien body part.

The cause of this condition is unknown, but it is often associated with apotemnophilia, a form of sexual arousal based on one’s image as an amputee. The list of peculiar sexual preferences connected to mental illness is a long one, but I’ve always been particularly struck by Object Sexuality, in which people develop romantic attachments to things rather than people.

One American woman, Erika Eiffel, is an advocate for this very persuasion: she has been in love with the Berlin Wall for over twenty years. We’re not talking about architectural interest or a passion for aesthetics, but feelings of true love: she even slept with a miniature portion of the wall, cuddling it and treating it as her boyfriend. In 2004, she fell in love with the Eiffel Tower, which she famously “married” in a ceremony in 2007, hence her name (she was born Erika LaBrie). She received significant media attention and she is the subject of a book and a musical theatre production.

What I find fascinating about mental disorders is that they almost always abide by our need for a narrative. All brains, including healthy ones, prefer to receive information in the form of a story: it’s called narrative bias. We often use narrative to make sense of what happens to us, finding connections between events even when there are none. A damaged brain will not stop doing so, and will happily take the narrative to grotesque extremes. That is why, at times, the only way to “cure” a patient is to operate inside their own narrative. Philosopher Alain de Botton beautifully captures this in an almost certainly apocryphal tale:

“Medical history tells us of the case of a man living under the peculiar delusion that he was a fried egg. Quite how or when this idea had entered his head, no one knew, but he now refused to sit down anywhere for fear that he would “break himself” and “spill the yolk.” His doctors tried sedatives and other drugs to appease his fears, but nothing seemed to work. Finally, one of them made the effort to enter the mind of the deluded patient and suggested he should carry a piece of toast with him at all times, which he could place on any chair he wished to sit on, and hence protect himself from spillage. From then on, the deluded man was never seen without a piece of toast handy and was able to continue a more or less normal existence.”

(From Essays In Love, by Alain de Botton)

Why You’ll Hate the New Facebook Design


Why does everyone complain when Facebook gets a new design?

As soon as the changes appear, people start moaning. It’s happening right now, as the redesigned News Feed rolls out to all users after almost a year of fine-tuning.
But why does everyone get so grumpy?

Humans are change averse when it comes to graphical user interfaces, among many other things. Knowing your way around a website or software is a matter of habit. When it changes, you lose your points of reference and have to learn your way around again.
That leaves you dumbfounded until new habits form over the old ones, which can be a bit of an annoyance. In other words, changes to Facebook make you feel stupid for a little while, and you hate that.

Change is difficult. Moving to a new city, starting a new job or learning how to use an operating system are all processes that require you to think about every little thing you’re doing. It’s hard work, but it’s something we generally accept, if maybe ungraciously, as a part of life. When change completely eludes our control, though, we feel lost. How would you react if you got home tonight to a completely rearranged furniture layout in your house? Change can be good and exciting, depending on your personality, but it’s a form of loss, and we are all loss averse by nature. If I gave you a $50 bill and asked whether you wanted to gamble it on a coin toss for double or nothing, you’d probably do what most people do and keep your $50. We tend to value what we already have about twice as much as potential gains, so the gamble is not worth the risk.

Computers haven’t been around that long in absolute terms, so we haven’t yet developed a specific set of psychological tools for changes in that area, and we fall back on the general rule. Some folks feel they have a right to retain the interface design they like best (the one they have gotten used to) and claim it’s intolerable that they aren’t given the option. They even start petitions that go nowhere and are soon forgotten. But Facebook isn’t a product you own or software you’ve bought. It’s a free service that wants to make money off you, and its user interface is an integral part of its marketing effort, not a matter of anyone’s taste. Companies that pay Facebook to display their ads want to know how those ads will look and demand consistency. As the saying goes, «you are the product».

People have a right to complain all they want. It’s a way to cope with the anxiety brought on by change. But it’s an empty effort when it comes to Facebook. As soon as you readjust to the new layout, a new habit forms and all is well. So, when you feel like you want to kick up a fuss, just wait a week. By then, it’s likely you’ll no longer care.

Does Peter Higgs Really Deserve a Nobel Prize?


The Nobel Prize in Physics has just been awarded to Peter Higgs and Francois Englert for the theory behind the famous Higgs boson.

What’s wrong with that? Mr. Higgs has the boson named after him, and there’s another guy sharing the honor for good measure, right?

In fact, this decision exposes some major flaws in the rules for awarding Nobel Prizes and robs other people – not just a few, but thousands – of well-deserved recognition.

The story of the Higgs boson starts in 1964. That year, three papers theorizing the existence of the elusive particle, the last to be observed in what we call the Standard Model of physics, were published almost simultaneously. Francois Englert and Robert Brout released their work in August, Peter Higgs in October, and a team of three other researchers (Gerald Guralnik, Carl Hagen and Tom Kibble) followed suit in November. This is normal in science: discoveries tend to happen at a tipping point, and rarely are they the work of a lone genius (relativity being a notable exception).

So, a total of six people are currently credited with coming up with the theory behind the boson, even though Higgs got the honor of the name (interestingly, bosons themselves are named after a person, the Indian physicist Satyendra Bose, just as the other class of fundamental particles, fermions, is named after Enrico Fermi). The name took hold after another physicist, Ben Lee, referred to the particle as “Higgs-like” at Fermilab in 1972; then a couple of Nobel winners mentioned a “Higgs-Kibble mechanism” in their acceptance speech, and it stuck. Higgs himself has never been too happy about it and refers to his particle as the scalar boson. But names are important, and Higgs became the herald of this discovery even though, in reality, he wasn’t even the first to publish on it. Englert and Brout came first. Englert got a Nobel today, so why didn’t Brout? Because he’s dead.

The Nobel Prize has two main rules: it cannot be awarded posthumously, and it can go to a maximum of three people. The key word being people: outside of the Peace Prize, a Nobel can’t currently be given to a group or institution. That’s a problem, because science is no longer the work of pioneering individuals. Gone are the days when Marie Curie would sentence herself to death by handling radioactive materials in her basement. There’s no more Wilhelm Röntgen repeatedly exposing his wife to deadly X-rays in the name of discovery. Research is now a collaborative effort conducted in ISO-approved labs with safety in mind. Scientific papers can use more pages to list their contributors than to cover the subject matter itself. It is weirdly anachronistic not to acknowledge this huge shift in how science is done, and it leads to unfair decisions.

Back in 1964, there was no way to experimentally confirm the existence of the famed boson. To do that, you need a huge particle accelerator, like the Large Hadron Collider at CERN, in Geneva. It took 48 years to design and build the technology that finally allowed us to find the particle, as CERN announced on July 4, 2012. For over a year, research was conducted by two separate groups, ATLAS and CMS, named after the two particle detectors that analyze the results of experiments run in the accelerator. An editorial in Scientific American calling for the Nobel rules to change estimates that no fewer than 6,000 people contributed to these experiments. But since the prize can only be given to individuals, their effort goes unrecognized.

The Nobel committee faced a difficult decision. Six theoretical physicists were all credited with the same discovery; one had passed away, but that still left two too many. Then there were the two large teams led by two additional individuals: a grand total of seven people, plus thousands. The committee chose the path it deemed most logical: give the award to Higgs, the namesake, and to the living author of the paper that was published first. Higgs does deserve it, to answer the question in my provocative title, but the Nobel people chose to ignore the trio that completed the theoretical research and, most importantly, to ignore the two groups that actually found the particle. Even though the press release mentions them briefly (“…which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider”), that simply isn’t enough.

Science has evolved. Its most prestigious award should do the same.

The Age of the Selfish Meme


The times, they are a-changin’.

The image above (via Reddit) comes from an Australian store that has started charging $5 to customers who peruse the goods but don’t buy anything, assuming they are just looking around to buy elsewhere later (possibly online), a strategy known as showrooming.

That’s a difficult problem to face if you’re a brick-and-mortar store: Best Buy has famously solved it by price-matching any online deal on its merchandise. This store chose a very different approach, one that is sure to alienate many potential customers. We are going through a big transition in how we deal with technology. Online shopping is already a taken-for-granted habit for many, but other changes are subtler and take the stage less ceremoniously. Think of voicemail, which almost no one uses anymore. If you leave someone a voicemail and believe they will listen to it, you’re speaking a dead language. But nobody tells you: you just have to know. That’s perhaps why Nick Bilton of the New York Times caused a stir when he blogged about current trends in digital etiquette, saying that people who reply to an email or a text just to say “thank you” are being rude.

It’s a generational clash all right, with grownups blaming the kids for being impolite, but it’s much more than that. This isn’t the first time we’ve gone through such hurdles; only the tools are different. One of «the first crises of techno-etiquette», as the Times calls it, happened just after the telephone was invented: nobody knew what to say when they picked up a call. Ahoy! and What is wanted? were popular options before we eventually settled on Hello. That was a single problem tied to a single piece of technology; think of how many of these processes we are going through today. The difference is that there’s no concerted effort, because most of these problems haven’t been around long enough to create a clear distinction between right and wrong. So everyone gives it a shot and hopes for the best.

In evolutionary terms, when someone smarter than you is around, you’re in trouble. Even if you’re standing on a freshly killed gazelle, there’s always a sneaky scavenger lurking somewhere. People born today are delivered into a fully digital world, and this creates as large a generational fracture as we’ve ever seen. Some folks, the older generations, will fade away before this becomes more than a nuisance for them, but others who are still relatively young are at risk of suffering.

Technology is mature enough for some companies to have become dinosaurs. Think of the difference between Microsoft and Google: they were founded just 23 years apart, but it feels like a century. Yahoo, another flailing tech company, just made headlines around the world for buying Summly, a news-reading app that sums up news stories with an algorithm, for $30 million. The app was made by a 17-year-old who is being hired by Yahoo. There’s no media outlet in the world that hasn’t picked up the story, because a kid who makes millions with an app has just slightly less appeal than a litter of puppies: it’s irresistible. But Summly had been around for a year, very few people used it, and Yahoo has already killed it. Yet it became the global talk of the day and gave the company a fresh coat of paint – not a bad deal for $30 million, or 0.75 percent of Yahoo’s cash reserves. There’s no right or wrong in this: you can see it either as a brilliantly cynical PR move or as a genuine sign that there’s hope for humanity. Either way, I’m not sure this trick can be pulled off successfully for much longer.

The following image is floating around Facebook: a static picture that appears to be moving because it tricks your brain a little. It is shared with the encouragement to “type 1 in the comments to see the magic”. Of course, all the “magic” is already there, and absolutely nothing happens if you type “1” in the comments. You should know that. And yet the picture has racked up over 200,000 comments, with most people sheepishly complying and inflating the comment counter (an empty endeavor at that, but that’s how the Internet works). Imagine if someone in the street told you to shout «one» very loudly to “see the magic”. You’d think they were nuts. You certainly wouldn’t comply. But in technology, the weakest of nudges will make you do things without thinking.

Richard Dawkins, the evolutionary biologist who coined the term meme in his 1976 book The Selfish Gene, saw it coming: just as genes use biological beings to propagate themselves, we are now slaves to memes as well. Most people don’t take these things seriously: they are not afraid to act like digital idiots. They still see this realm as something separate from reality, where the effort required to do things is minimal (one click) and so is the social cost of errors and mishaps. There’s an obvious detachment: people still use nicknames even where they’re not supposed to. There are companies that block Facebook and other sites so that their employees don’t slack off at work. In a few years (whether Facebook is still around or not) that will be considered as outrageous as asking people to relinquish their phones before they sit at their desks.

For some, technology is not yet life; it’s something that still sits on top of it, separate. But it’s not. We are constantly going through sweeping social change, though it’s less apparent when you’re standing in the middle of it. And as with every social change, some people are at the forefront of it, some are puzzled by it, and some can’t even see it fly over their heads. It’s a very interesting time, but because transitions have uncertain boundaries it will probably be forgotten by history. Just like those times when we still hadn’t figured out what to say when answering the phone.

The Darkness Around Suicide

The tragic death of the nurse who was the victim of a prank call at Kate Middleton’s hospital a few days ago highlights how little is known, or understood, about the complex issue of suicide.

After the news broke that Jacintha Saldanha had killed herself, the world was enraged. Twitter immediately turned into a violent storm of accusations and hatred, and the accounts of the culprits were quickly deleted. Many people stated that the two Australian DJs had «blood on their hands» and needed to be charged with manslaughter. They were to blame for what happened.

I am deeply sorry for the death of this woman, a married mother of two. But the pranksters cannot be blamed. While shame is a powerful motivator for suicide, a single event is almost never enough. Very few media outlets bothered to ask a psychologist to comment on the events in their first round of reporting. Fox News, not a news source I normally admire, had the president of the American Psychological Association correctly suggest that the call might have been «a final straw» that led Saldanha to take her own life.

No one is directly to blame. You could say that the hospital director should have known better than to put a vulnerable person in charge of the phone while he had the world’s most famous patient under his care. And you can question the taste, the timeliness and the motives of the pranksters, but those observations should stay the same regardless of what happened as a result. Prince Charles himself had a laugh about it, before tragedy struck of course, by saying «How do you know I’m not a radio station?».
If you listen to the call, the intentions of the hoaxers are pretty clear: «We thought a hundred people before us would have tried it, we just thought it was such a silly idea, and the accents were terrible», they explain in a video interview, «Not for a second we thought we would actually get to speak to anyone at the hospital. We wanted to be hung up on». The joke was stupid to start with, but it spiraled out of control.

Suicides only make the news under certain conditions, chiefly when they follow a murder, or when they are linked to fame. Not necessarily a celebrity’s: in 2010, Chinese manufacturer Foxconn, which builds tech gadgets for Apple and other leading brands, made magazine covers after a slew of suicides. There were 14 registered self-inflicted deaths among Foxconn’s 930,000 employees that year. Sadly, those were actually pretty good numbers compared to China’s national rate of over 22 suicides per 100,000 people each year (Foxconn’s rate was about 1.5). That didn’t stop the company from responding to the public outcry by installing nets on its dorms to discourage workers from jumping out of windows. Bad publicity kills good business.

What this shows is not only that suicide is questionably treated by the media, but that we are extremely unfamiliar with the numbers surrounding it. China admittedly has one of the world’s highest suicide rates: a staggering 287,000 people kill themselves each year, out of a population of 1.3 billion. That’s about 786 people a day, a figure that’s hard to swallow. You’d think this has mostly to do with China. But it doesn’t.

The United States has one of the highest homicide rates among developed countries: in 2009, there were 16,500 deaths by murder. And how many deaths by suicide? Over 36,000. More than twice as many people kill themselves each year in the US as are killed by others. In that same year, 2009, suicide surpassed car crashes as the leading cause of death by injury. And more American soldiers now routinely die by taking their own lives than are killed in combat. In India, another country with elevated rates, suicide is the second leading cause of death among young people (ages 15-29) after transportation accidents, with around 187,000 fatalities in 2010 alone.

The puzzling numbers are not the end of the story. One commonly held belief is that suicide rates peak around the holidays, especially in winter. But that’s hardly the case: people take their own lives far more frequently in springtime. Why? There are many theories, but increased social interaction may be a leading factor: it’s precisely when you hope things might get better, like the weather outside, that you get most depressed if they don’t. It’s also easier to feel socially disconnected when everybody else is out having fun. The intricate inner workings of suicidal tendencies also vary along cultural and racial lines: African Americans are half as likely to kill themselves and six times more likely to be murdered. But some of those homicides may be suicides in disguise: psychologists say white and black people externalize their frustrations about life differently because of cultural heritage. There’s a type of murder called “victim-precipitated homicide”, which happens when someone engages in violent or reckless behavior that gets them killed. An estimated 30 percent of urban homicides may belong to this variant, which is not recognized as a form of suicide.
A similar thing happens when news breaks that a celebrity has committed suicide: not only do actual suicides spike out of emulation, but the number of fatal single-occupant car accidents spikes as well. Many of those are people killing themselves.

In the end, nearly all suicides go unnoticed and undiscussed, because this is a taboo topic across all cultures. But there’s another reason why so few are reported in newspapers: even ordinary people who commit suicide, subway jumpers for example, inspire emulators – especially individuals demographically similar to the person who died and living in the same geographical area. It’s called the Werther effect, after Goethe’s novel. The result is that media outlets in many countries self-regulate against reporting suicides to deter copycats.

Suicide is not a rare thing; it’s a common thing. It can hardly be caused by a single event, however devastating: it’s a cumulative problem often linked to mental illness. There’s not nearly enough awareness around it, compared to other similarly life-threatening issues. All the time and effort spent over the last few days chastising two hapless idiots who didn’t know what they were doing could have been better spent exploring the real problem. Awareness can save lives. Ruining the lives of two more people accomplishes nothing.

The Perils of Gift Giving

The gifting season is upon us.

Like several things that require you to interact with others, gift giving is an apparently simple, benign activity that can quickly open a can of worms. The main problem is that the giver and the recipient have different perspectives and often follow different incentives, which sometimes clash.

As a giver, for example, you tend to think that more is better: two presents are a more generous offering than one. But there’s a twist. Imagine you’re buying a $300 sweater for a friend. At the counter, you add a $25 gift card, so that he can get himself a necktie or some socks. Which is the better gift: the sweater alone, or the sweater and gift card combo? In this case, one beats two, because recipients tend to average out the value of all the gifts they receive from someone: the small gift lowers the perceived value of the big one, and your friend is better off with just the sweater.
You should resist the temptation to add candy, gift cards or novelty items to your main present, whatever its value.

We have a peculiar way of evaluating items in a group. Take this example from Nobel laureate Daniel Kahneman. Suppose there are two sets of dinnerware that you can buy:

Set A has 24 pieces.
Set B has 40 pieces. It has all the pieces from Set A plus an additional 16 pieces, but 9 of these are broken. 

Which set is worth more? Obviously Set B, since it has everything in Set A plus an additional 7 intact pieces. And indeed, in the study this refers to, participants who were shown both options valued Set B more ($32) than Set A ($30).
But in single evaluation, when each set was presented on its own, the results were reversed: Set A was valued far higher ($33) than the larger Set B ($23). Why? Because the average value per piece in Set B is much lower due to the broken dishes, which nobody wants to pay for. Averaging makes the set with fewer pieces worth more, and taking items out of the larger set improves its value. Economic theory dictates the exact opposite: «adding a positively valued item to a set can only increase its value».
But not so in behavioral economics.
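The averaging trap is easy to simulate. Here’s a minimal Python sketch (the per-dish dollar value is an assumption for illustration, not a figure from the study):

```python
def valuations(pieces):
    """Return (total value, average value per piece) for a set of dishes."""
    return sum(pieces), sum(pieces) / len(pieces)

DISH = 1.0                        # assumed value of one intact piece
set_a = [DISH] * 24               # Set A: 24 intact pieces
set_b = [DISH] * 31 + [0.0] * 9   # Set B: 31 intact pieces plus 9 broken ones

total_a, avg_a = valuations(set_a)
total_b, avg_b = valuations(set_b)

# Judged jointly, totals dominate: Set B objectively contains more value.
print(total_b > total_a)  # True
# Judged alone, the average dominates: the broken dishes drag Set B down.
print(avg_b < avg_a)      # True
```

Whichever of the two numbers your brain anchors on decides which set looks like the better deal.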

So yeah, less is more and all that. Surely, you might say, reducing the act of gift giving to a mere matter of value is simplistic, not to mention heartless. But there’s a little economist inside our heads who strongly disagrees: that is why some people only give kitschy, sarcastic gifts with no intrinsic value beyond the chuckle they produce once the wrapping is torn off. It’s an efficient way of dodging the problem of a sincere gift, and it works well with acquaintances or your sister, but it’s not an ideal option for everyone else in between.

In purely economic terms, the best gift is cash. I recommend it if you happen to have a friend who’s an economist, because then the gift doubles its function and also gets you a chuckle. A classic study called The Deadweight Loss of Christmas explains why: a survey found that «holiday gift-giving destroys between 10 percent and a third of the value of gifts». In other words, if you give your friend a $100 shirt, his perceived value of it – how much he’d be willing to pay for it – will be roughly 67 to 90 dollars. That’s the deadweight loss, which could be avoided, to the benefit of everyone involved, by just handing out cold, hard cash. The world would be a gloomier place, but it would save the roughly $10 billion lost each year in the very transaction of gift giving.
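For the back-of-the-envelope version of that arithmetic, here’s a tiny Python sketch using the study’s range of 10 percent to a third of value destroyed (the helper function is mine, not from the paper):

```python
def perceived_value_range(price, loss_low=0.10, loss_high=1 / 3):
    """Range of what a recipient would likely pay for a gift, given an
    estimated fraction of its value destroyed in the exchange."""
    return price * (1 - loss_high), price * (1 - loss_low)

low, high = perceived_value_range(100)
# A $100 gift is worth roughly $67 to $90 to its recipient;
# the gap between that and the sticker price is the deadweight loss.
print(f"${low:.0f} to ${high:.0f}")
```

Cash sidesteps the loss entirely, because its perceived value equals its face value.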

When buying stuff, you also have to deal with another problem: choice. The average supermarket carries about 40,000 items, twice as many as just a decade ago. You can only marvel at the number of different types of jeans in a Gap store now compared to just twenty years ago. More choice is good, right? Psychologist Barry Schwartz has a different opinion, as he explains in his book The Paradox of Choice.
In a memorable study conducted at a gourmet food store, jams were offered for tasting: on one day, 24 flavors were available; on another, just 6. Surprisingly, 30 percent of shoppers who tasted from the small selection made a purchase, compared to just 3 percent of those who tasted from the wider array. Apparently, more choice creates «decision paralysis» and lowers satisfaction, because it leaves room for greater regret: the chance of making the wrong decision grows as the number of options escalates.

(While not everyone agrees with this theory, companies seem to have picked up on it: when Procter & Gamble reduced the number of variations of its Head & Shoulders shampoo from 26 to 15, sales increased by 10 percent. And when the Golden Cat Corporation eliminated the 10 worst-selling types of its kitty litter, sales rose by 12 percent and profits more than doubled thanks to reduced distribution costs.)

If that weren’t enough, be aware that simply buying groceries can mean relinquishing your most personal secrets. Large retail stores are ravenous for information about your shopping habits, and they can use the data gathered from loyalty programs to predict your future needs. In his book The Power of Habit, Charles Duhigg recounts an incident at a Target store in Minnesota, where a man walked in protesting that his daughter, still in high school, was getting coupons in the mail for baby cribs and clothes: «Are you encouraging teenage pregnancy?», he complained. When he received an apology call from customer service a few days later, he had to apologize himself:
«I had a talk with my daughter. It turns out there’s been some activities in my house I haven’t been completely aware of. She’s due in August». If you think that’s too much, know that credit card companies can predict a divorce just by analyzing spending patterns.

But back to presents: there’s one final piece of advice I can give you. If you want to extract the maximum amount of happiness from your purchases, whether it’s something you buy for yourself or for someone else, buy experiences instead of things. The excitement of a brand-new object soon fades, while a new experience (a weekend somewhere, a yoga lesson, a concert) gives you a memory that can be revisited and stays with you forever.