

Steve Jobs, Richard Branson, Mark Zuckerberg. What do they have in common?

They all established hugely successful, innovative organisations: Apple, Virgin and Facebook. They also all dropped out of school or college. It seems that being the best student is not the only route to success and, in fact, when it comes to being innovative, being a good learner could even be counter-productive. This doesn’t mean we should encourage our kids to drop out of school, but to encourage creativity we do need to recognise the value of breaking away from conventional teaching and offer an environment in which unusual ideas are celebrated and fostered rather than suppressed. Some ideas will fail, but unless all ideas are welcome, that huge new breakthrough might never come.

Create the right environment. In his latest project, “Ideas Britain”, musician Dave Stewart addresses what he sees as the “tendency of big business to shut its doors to new thinking” by providing an online space for ideas and inventions to be shared and to attract financial support. However, innovation is not the preserve of mavericks and rebels. There’s much that can be done at an organisational level to ensure creativity is allowed to flourish. Give ideas the SUN they need to grow: Suspend criticism, Understand and Nurture.

Make time and space. By stepping away from the doctrines of learning, innovators do something important: they make space for thinking and creating. In his TED talk, Jacob Barnett, a young autistic mathematical prodigy who was predicted never to achieve anything through conventional education, highlights that some of our greatest scientific breakthroughs came when scientists suspended learning and started thinking. He argues that Isaac Newton developed his greatest ideas whilst Cambridge University was shut down by the Plague, and that Albert Einstein perhaps achieved more whilst being barred from a university teaching position in pre-Nazi Europe. However, like his role models, Jacob Barnett’s own ground-breaking theories did not appear out of the blue but from many hours of trial and error and exploration of new possibilities.

Work at it: To be creative, it probably helps if you were born an unconventional thinker. Dyer et al. (2011) propose that one third of creativity can be explained by our genetic composition. However, that leaves two-thirds of creativity as a learned skill, and so it is, arguably, something we can all practise and improve. Innovation requires hard work: exploring things from new perspectives, making unusual connections, experimenting, challenging assumptions, taking the risk that an idea might not actually work but exploring it and trying it anyway.

A useful model to help develop your innovative thinking is CASTING:
• C-onnect: Make unusual connections and associations. Force yourself to get off your mental tramlines and see what else is possible and where this takes you.
• A-sk: Challenge your assumptions and ask “What if?”. It might pay to throw out the rule book.
• S-ee: Observe what works or what doesn’t work. Look around you at what other people are doing and think about how you can adapt that to your situation.
• T-ry: Experiment with alternatives to find what works best. Avoid ruling out options without giving them a go.
• In–volve: Harness diverse people and perspectives. Cross fertilisation of ideas from different specialisms is invaluable.
• G–o Do It!: Be positive. Protect and nurture your ideas to help them grow and develop.

January is that time when all thoughts turn to New Year’s resolutions. As ever, the internet seems saturated with articles and blogs on the ‘best’ resolutions to make, the ones that will make us most happy and, of course, how to keep them, which includes the invention of a bracelet that gives you an electric shock if you don’t do whatever you have committed to doing. Yes, really.

So I had pretty much given up on the idea of writing about New Year’s resolutions this year, until I came across an article in the Harvard Business Review on mindfulness. I’ve been a fan of mindfulness since I discovered the Headspace app last year and, for me, it’s a pretty helpful thing to do. Whether it has made me more creative or compassionate, or helped me develop better relationships, as the research suggests, I’m not entirely sure. I do think, though, that it has helped me to be more focused and less stressed at times when I have a lot going on.

Of course, I don’t practise it nearly as much as I would like, so it was already hovering on the list of possible resolutions when the Harvard article caught my eye. The article highlights a piece of research from Central Michigan University which found that individuals who had completed 10 minutes of mindfulness training had significantly lower levels of implicit racial and age bias than those who had not.

This finding is consistent with previous research which demonstrated that mindfulness reduced our brain’s reliance on automatic associations. Research from INSEAD in 2013 showed that mindfulness reduced our susceptibility to sunk cost bias (our tendency to persist with lost causes because of what we have already invested in them e.g. watching a bad film until the end, because you’ve watched half of it already).

There are some caveats; this was a fairly small piece of research, so more work is needed to replicate it and to understand in more detail how meditation affects unconscious bias. It will be interesting to know how long the effect of meditation lasts, for example.

Nonetheless it makes sense to me that this could be a tangible way to help reduce unconscious biases. Any method that can reduce the automatic associations or automatic processes our brain uses as shortcuts is likely to be helpful in reducing bias. So I’m sold on mindfulness as my New Year’s resolution. It’s good for me and could be good for others too if it helps me reduce bias. Now I’m just going to need to make sure I stick to it. Where’s that bracelet?

As we look ahead into 2015 and wish for a time of greater success and happiness, let’s not waste time hoping for perfection. Let’s face it; things can always be better than they are and there is a world of opportunity to focus on what went wrong. However, to move forward, set new goals and persist in the face of adversity, we could all benefit from putting on those rose-coloured spectacles and looking at the world in a new and more optimistic light.

Some are lucky enough to be born in a world where, for them, the sun always appears to be shining. But optimism can also be seen as a learned skill which all of us can practise in order to unleash our happiness, creativity and self-belief.

Chris Hadfield, the retired astronaut, encourages us all to put our negativity aside and recognise the incredible successes of scientific achievement in the world. With this recognition we can then choose to take action on making it even better, starting with a resolution.

Research has shown the power of optimism. Optimists live happier, live longer and achieve more. A key principle is how optimists deal with adversity. For example, Seligman demonstrated that when competitive swimmers were told they were not swimming as fast as expected, the optimists carried on performing at their usual level whereas the pessimists got slower. Although there can be risks in being overly optimistic, on the whole there is a greater risk from maintaining an overly negative perspective.

Where pessimists go wrong is in seeing failure as all their own fault, exaggerating the significance of failures as reflecting everything else they do, and believing that the outcome cannot be changed. This leads to the pessimist losing heart and giving up, or wasting time ruminating on what can go wrong.

So how do we harness the power of optimism?

  • Give thanks: Write down and celebrate what was good every day.
  • Do something! Get busy in useful activities that absorb your interest.
  • Take pride: Recognise your strengths and how these can help you achieve good outcomes.
  • Let go: Accept that the outcome is not entirely under your control but carry on and try to do what you can.

Judges are impartial, objective arbiters of the law, making important decisions that affect people’s lives every day. At least, that’s the theory; in practice they are as fallible as anyone else, particularly when it comes to unconscious biases.

Two recent judgements in the UK shine a light on this issue. Andrew Mitchell recently lost his libel case against the Sun newspaper over the infamous ‘plebgate’ accusations. Without commenting on the merits of the case – I have no special knowledge in this area – it was interesting to hear the judge’s passing comments, as reported widely in the media. Consider the following quote from the BBC website: the judge said PC Rowland was “not the sort of man who would have had the wit, imagination or inclination to invent on the spur of the moment an account of what a senior politician had said to him in temper”.

This is worth restating - the judge did not believe that PC Rowland had the wit or imagination to make up his claim. How did the judge form this view? We are not dealing with a claim that PC Rowland is an expert in quantum mechanics. Surely, there is a huge irony in the judge concluding that Andrew Mitchell must have called PC Rowland a pleb because, in the judge’s view, PC Rowland is a pleb? On the assumption that PC Rowland was not subject to a battery of intelligence tests as part of the trial, it would seem that the judge’s less than complimentary view of his intellectual prowess was based on his assumptions about police officers, or perhaps the sort of police officer that he believes PC Rowland is.

Of course, ‘assumptions’ and ‘believes’ are the key words in the last sentence. They are an intrinsic part of unconscious bias – they are the gateway that links our implicit beliefs to our conscious decision making. In another recent example reported by The Times, an immigration judge has resigned after making comments about Deepa Patel, a 22-year-old victim of alleged harassment. When the prosecutor, Rachel Parker, said she was unsure whether Ms Patel could attend the court at such short notice, the judge replied: “It won’t be a problem. She won’t be working anywhere important where she can’t get the time off. She’ll only be working in a shop or an off-licence.” When Ms Parker asked him to clarify his comments, he allegedly replied: “With a name like Patel, and her ethnic background, she won’t be working anywhere important.”

In this rather breathtaking example there appears to be conscious bias at work, as well as unconscious bias and assumptions about Ms Patel based on nothing more than her name. In both cases we can see how the judges’ – and potentially our own – implicit assumptions combine with superficial information (he’s a policeman, her name is Patel) to form a ‘sensemaking narrative’ which, in turn, influences the actions and decisions that are made. As is often the case, unconscious biases are easier to see in others than they are in ourselves.

This month, Mercedes awarded a staggering £7 million in bonuses, shared amongst every single member of staff at their factory in Brackley, UK. It was a reward for the team’s first ever World Constructors’ Championship title (awarded to the most successful Formula One constructor of the season). The decision to celebrate success with all 700 people on the site, from chief engineer to cleaner, with a minimum £10k pay-out is surely the ultimate in team bonuses. Used to seeing Lewis Hamilton and Nico Rosberg taking all the glory on the podium, the recognition that the drivers could not have achieved this alone will surely secure a boost in company loyalty and engagement.

Team reward has both benefits and pitfalls. Intended to encourage co-operation versus competition, awarding a share of success with a whole team can minimise divisive comparisons of who contributed less and who contributed more. The benefit is that individuals will be more inclined to do what is needed for the business as a whole rather than what will attract the largest personal gain. It also ensures that the whole team feel valued and equally engaged, thereby building team morale and harmony. For that one day in October everyone at Mercedes put on the same team t-shirt, shared the same team breakfast and celebrated the strength that came from working together.

Some argue that team reward also carries risk. It is possible that it allows for free-riders who enjoy the spoils without making the same sacrifices; a situation that could breed resentment. On the other hand, social pressure and the desire to share the pride could comfortably outweigh this risk. And anyway, how do you compare the value of a safe 9-to-5 “plodder” with someone who offers occasional flashes of brilliance? There’s room for a spectrum of qualities and working styles within most teams.

It is likely, though, that individuals who thrive on competition might feel less motivated by team reward alone, feeling that they are denied their moment of personal glory. Within a Formula 1 team, competition between drivers is a good thing that spurs them on to greater achievements. A company that denies the power of individual reward could lose its star players to the competition. However, with an estimated salary of £20 million, Lewis Hamilton can be in no doubt of his relative worth compared to his team-mate Nico Rosberg, who takes home only £15 million! And rather than feeling depressed about it, I would bet that Rosberg will only try harder next year.

Coming back to the team itself, it is obvious that neither Hamilton nor Rosberg could have achieved this success alone. But how fantastic that Mercedes invited everyone to the party. Now that’s the type of company I would want to work for. I wonder if they are recruiting? I’m sure I could be the best floor sweeper that company has ever seen, but I don’t suppose any of their employees will be moving on so soon!

It turns out the rumour about Walt Disney being cryogenically frozen until a cure for lung cancer could be found was just that. A rumour. Its persistence comes from its appeal as an idea and the misplaced hope it generates.

It seems Disney is not the only American corporation investing in the misplaced hope that cryogenics offers. For those of you who thought that a company car, stock options and private medical insurance were the top tier in employee benefits, you may need to think again. It was recently revealed that Apple and Facebook are offering female employees up to $20,000 to freeze their eggs, so as to delay the onset (as in the onset of a disease) of children.

The argument in favour of these initiatives goes along the lines of: why should women interrupt their careers at a crucial stage? Wait until you have established yourself in your career before wrecking it by having kids. I can almost hear Sheryl Sandberg urging 30-somethings to ‘lean in’ to the stirrups to facilitate the egg removal. One quick clinical procedure later and motherhood is delayed until a career-convenient time in the future, by which point they might even have invented a procedure for freezing babies who keep them up the night before that crucial presentation.

This idea is terrible on so many levels I feel reluctant to list them, but I will give it a go.

We all have a right to a private life. Organisational involvement in conception rather flouts this right. Imagine the conversation with your Line Manager to access this employee ‘benefit’. Perhaps your manager will become involved and send you a meeting invite for your child’s birth?

This initiative thrives on misplaced hope. The fact is that the success rate of a live birth from a frozen egg is very low. Just because a great deal of money has been spent on egg freezing does not guarantee success. Women will be more likely to conceive naturally. If schemes like this gain traction, there will be women who don’t become mothers who could have become mothers. I’m not sure that ethical businesses should be facilitating this risk.

Egg removal sounds very clinical and straightforward. The reality is very different. Eggs are harvested after a woman has been on a drug regime for six weeks which, amongst other side effects, impacts her mood. This is followed by an uncomfortable procedure. More of the same follows at the embryo implantation phase, with no guarantees of success.

This all sends a very dangerous message to society about women and motherhood. Delaying or avoiding motherhood equals career commitment, whilst having a baby is for those who are not prepared to invest in their careers. A woman now has a choice about when to conceive, and if she chooses to prioritise motherhood over her career, why should I support her career?

The most troubling aspect of all this for me as a man is that, again, men are let off the hook. Gender equality will never be realised unless men invest in the care of their children to the same extent as women.

Imagine an average couple who start a family. The mother will tend to take maternity leave of 9 months and may return to work, but perhaps in a part-time capacity. During her child’s infancy, she might experience reduced development opportunities and miss out on a number of promotions. She may be seen as less reliable due to increased absences to look after her sick child or through needing to do the school run. Now compare this with the experience of the father. The father returns to work after 2 weeks of paternity leave. This is a positive experience for him as his co-workers and, importantly, his boss will now recognise his warmth as a caregiver as well as his abilities to lead. During his child’s infancy he experiences rapid career progression: since going home at a respectable time involves nappies, reading stories and screaming kids, he might as well stay in the calm work environment and take on that extra assignment. During this same period, several of his female colleagues have babies and he benefits from the increase in opportunities this creates.

Of course this imagined couple invokes several stereotypes that many will find offensive. The point though is that men benefit from women going on maternity leave, whether this happens now or 10 years later with some defrosted eggs. Unless more men are prepared to share the responsibility of parenthood, women will continue to be discriminated against.

Last week, Richard Branson made a bold announcement that the 170 staff he employs at the Virgin Group will no longer have to get permission to take holiday leave but will be free to take as much holiday as they choose as long as they are satisfied that colleagues and the business are not detrimentally affected.

He follows Netflix and other creative companies that have seen a boost in well-being, morale and performance as a result of giving staff complete freedom over their annual leave. Flexibility can work well for creative roles where innovation is encouraged by taking a less formalised approach to the working day and where it is not essential for staff to be physically present at any one time. For some roles, results are more important than the number of hours worked, and can actually be improved with more time to relax between creative bursts. There is also something compelling about the idea of trusting employees to be responsible adults and not to cheat the system. It could be argued that this takes away the pressure on management to police the system and therefore reduces the negative impact of hierarchical power on team morale. Perhaps it could also reduce stress on those trying to juggle non-work demands.

I suspect, however, that the “non-policy” approach to annual leave is more likely to increase the levels of uncertainty and sense of unfairness within teams. In our leaner, post-recession organisations it may be difficult to ever find a time when your desk is truly clear and when taking a holiday wouldn’t have any impact on the rest of your team. Further, without a clear policy that encourages people to take well-earned time off, as a workaholic nation we are more likely to feel pressured to work harder and longer than to be tempted into taking more time off. Competition to prove yourself amongst peers will make it harder to break away and this will, for many, increase the conflict between competing work and home roles. As a consequence there are likely to be negative effects on workplace diversity – the opposite of what was presumably intended by offering to tear up the holiday rule book.

Holiday policies are put in place to ensure fairness and protect people from the negative effects of peer pressure. They also take the pressure off managers and staff to make constant individual decisions based on their own discretion. Without written rules, what is and isn't fair becomes a matter of individual perception, vulnerable to challenge, and this lays businesses open to claims of unfair treatment that are very difficult to answer. If businesses want to foster individual freedom in order to maximise employee engagement they might do well to re-examine their policies on annual leave, for example giving people the chance to work during term time only, to buy additional holiday by sacrificing some salary, or to sell any unused holiday leave (over the statutory minimum) back to the company for additional pay. Whatever a company does, it needs to be fair, and this might as well be captured in a set of written rules rather than left to an unwritten set of cultural expectations that are vulnerable to bias.

Styles evolve. From the woollen three-piece suit of the early 20th century and the power dressing of the 1980s, through to today’s boom in business casual, company dress codes move with the times. In part this reflects a shift in culture from those that emphasise authority, hierarchy and convention to those that are more open and creative and which emphasise a greater work-life balance. However, for most employees this doesn’t mean you can wear whatever you like. Rules, both written and unwritten, still dictate what is regarded as suitable for work. Whilst most people are happy to leave their favourite party outfit in the wardrobe when attending an important meeting, when it comes to body art such as tattoos and piercings, or extreme hairstyles, it’s not so easy to leave your personal style at home. For many companies there remain good reasons to manage their brand through the appearance of employees. However, the challenge of how to manage dress codes is becoming increasingly difficult as companies move towards more diverse and inclusive employment strategies and fashion moves more and more towards individualised body art. Tattoos seem to be particularly emotive, and invite strong reactions both for and against, so where and when should a company legislate against them? How do you decide what is, and is not, appropriate in the workplace?

It is perfectly legal to bar someone from employment on the basis of their personal dress and appearance. The company must simply have a written dress code in place that it applies equally. The dress code must apply the same levels of expectation to both men and women, although it can stipulate different styles of clothing that fit with the prevailing conventions for that gender. Flexibility need only be extended where someone’s clothes or styling is associated with their religion or related to a disability. However, health and safety concerns may override personal freedoms. In practice, many companies must make a judgement of what is sensible for different groups of employees based on the level of contact and impact on customers, the expectations of other employees and the need to be clearly identifiable and in keeping with the company brand. Subcultures have developed with different conventions and norms: in Silicon Valley you would be regarded as a freak if you came to work in a suit, whereas a city broker would attract attention if they turned up in jeans.

Fitting a company dress code to societal norms matters if it means the company will be perceived as more professional or approachable. The difficulty is that norms have changed dramatically within a short space of time and companies are struggling to keep pace. Tattoos, once associated with sailors and thugs, are now commonplace. It is estimated that 1 in 5 Britons have a tattoo, rising to a third of UK 16-44 year olds. In the public eye it is not just musicians, actors and sports stars that seek body art but establishment figures too. As our tattooed youngsters become the establishment, no doubt company dress codes will reflect the prevailing attitudes again. Perhaps we will even see employees encouraged to put ink to skin. Last year, a real estate company in the US offered staff a 15% pay increase if they had a tattoo of the company logo. Even so, I suspect that there will always be pockets of professions where tattoos will never become the norm; where employees as well as their employers will want to differentiate themselves as being different and more conventionally serious and respectable. For these groups I imagine tattoos will generally stay well hidden.

Let’s assume for just one moment that mothers have complete freedom to choose whether to stay at home to care for their children, to pursue their careers, or to find a way of combining elements of both; their company is supportive and offers flexible options to cater for this. Everyone is happy: the company retains the loyalty and skills of the woman, and the woman is able to utilise her talents as she judges best.

Not so fast. A serious challenge for many women who would like to work is the cost of childcare, or the availability of able and willing grandparents. An option open to few will be to share or hand over childcare responsibility to the father during the working week. Here companies and society struggle to extend the same flexibility and this creates a serious barrier to the aim of encouraging inclusivity for women. Inclusivity for women cannot happen without commensurate flexibility for men.

One way to address this is to challenge unconscious bias towards men. More and more we are seeing men embracing the parenting role, with the number of stay-at-home dads doubling over the last twenty years (Office for National Statistics). Yet attitudes remain entrenched, with society struggling to embrace the idea of men being the primary care-giver rather than the primary bread-winner.

Break down the care-giver role into its constituent parts and our deep-seated bias looks shaky at best. Teacher, first aider, chef, artist, activity and sports coach, story-teller, play mate, counsellor, taxi driver, expedition leader and hug-giver; we accept men can be all these things. However, when we roll these things into one, we often assume that men will struggle. Recognising men’s talent for home-based roles is just as important as it has been to recognise women’s talent for work-based roles. In working families, one cannot be fully achieved without the other.

It seems graphology – or the study of handwriting – is back in the news. A recent article on the HR Magazine website suggests a resurgence in popularity of the technique with "30% of UK and US companies using handwriting analysis as part of the recruitment process". Apparently, it is "fast becoming a crucial element in the vetting and screening of incoming personnel".

Really!? I thought we had put this one to bed. Do we still need to ask whether graphology is worth the paper it’s printed on?

First, the short answer...No.

Now for the longer answer...

According to advocates of graphology, it is possible to infer various characteristics such as personality and predicted job performance from a person’s handwriting. As a technique, it has long been argued to provide an accurate and reliable mechanism for selection and assessment.

Unfortunately for the movement, decades of scientific research have thoroughly debunked this claim. Several empirical studies in the mid-1980s (e.g. Ben-Shakhar et al., 1986) found that graphology revealed nothing about subsequent job performance. Research going back to the 1960s (e.g. Reid, 1983) identified no correlation between the interpretations of graphologists and the personality characteristics of the subject. In 2000, Anderson and Cunningham-Snell reported the validity coefficients (i.e. the degree to which success in the tool predicts success in the job) for graphology as a selection tool to be zero. It simply doesn’t work.

Interestingly, however, its popularity persists despite this clear evidence. In France, for example, graphology remains very popular, with up to 52% of companies using it at some point during the selection process (Smith and Abrahamson, 1992). It is this popularity that graphologists use as proof of its efficacy. However, just because people like something doesn’t mean it has any true value.

According to the same research, the UK uses the technique in just 3% of cases, not the 30% claimed by the piece in HR Magazine. Indeed, it’s worth bearing in mind that this recent claim of renewed interest in graphology comes from Erik Rees, a leading graphology expert and former chairman of the British Institute of Graphologists, and comes with the caveat that companies may use the technique but “few would openly admit to doing so”. So, not only is the author of the claim a key source of PR for graphology, but he also makes a claim that is unfalsifiable.

In all, graphology is simply not worth the paper it is written on. And its use as a selection tool is deeply worrying. These are decisions that influence a person’s career, their life. These are decisions that cost organisations a fortune with the estimated cost of a poor selection decision reaching three to five times the annual salary of the role in question.

Remember, despite what the 1980s BT advert with Maureen Lipman would have us believe, just because it is an “ology” does not make it a science.
