Articles on this Page
- 06/09/15--11:23: _Mind = Blown! Reali...
- 06/09/15--11:47: _Google Wants You to...
- 07/01/15--14:48: _Design Thinking Stu...
- 04/27/15--18:49: How-to: Tune Your Apache Spark Jobs (Part 1) 04-28
- 05/02/15--21:02: How to Detox Your Lungs Naturally 05-03
- 05/10/15--20:27: It’s Not a ‘Stream’ of Consciousness 05-11
- 05/10/15--21:46: Are You Ready for Personalized Predictive Analytics? 05-11
- 05/31/15--17:55: When One Business Model Isn’t Enough 05-31
- 05/31/15--18:25: Multiple Models 05-31
- 06/08/15--07:03: A new grasp on robotic glove 06-08
- 06/08/15--07:23: Mindfulness Can Literally Change Your Brain 06-08
- 06/08/15--08:26: Want to Get Ahead? Work on Your Improv Skills 06-08
- 06/09/15--12:56: What Successful Project Managers Do 06-09
- 06/10/15--05:02: Staying in the Know 06-10
- 06/19/15--04:40: This blood test can tell you every virus you’ve ever had 06-19
- 06/25/15--09:34: An Interview with Dr. David Norton 06-25
- 06/25/15--14:08: 6 reasons why we’re underhyping the Internet of Things 06-25
- 06/25/15--14:42: The Four Phases of Design Thinking 06-25
- 06/26/15--05:32: Design Thinking 06-26
- 06/28/15--07:27: WHEN HEALTH CARE GETS A HEALTHY DOSE OF DATA 06-28
How-to: Tune Your Apache Spark Jobs (Part 1)
How Spark Executes Your Program
Picking the Right Operators
When Shuffles Don’t Happen
When More Shuffles are Better
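Only the section headings of this Spark post survive in this digest. The "Picking the Right Operators" advice is commonly illustrated by preferring reduceByKey over groupByKey, because reduceByKey combines values map-side within each partition before the shuffle, so far less data crosses the network. The pure-Python sketch below is a model of that idea, not Spark code: the partition data, function names, and combine logic are all illustrative assumptions.

```python
def group_by_key_shuffle(partitions):
    # groupByKey-style: every (key, value) record crosses the shuffle as-is.
    return [record for part in partitions for record in part]

def reduce_by_key_shuffle(partitions, combine):
    # reduceByKey-style: values are merged map-side within each partition
    # first, so only one record per key per partition crosses the shuffle.
    shuffled = []
    for part in partitions:
        local = {}
        for key, value in part:
            local[key] = combine(local[key], value) if key in local else value
        shuffled.extend(local.items())
    return shuffled

def final_merge(shuffled, combine):
    # Reduce-side merge of shuffled records into final per-key totals.
    totals = {}
    for key, value in shuffled:
        totals[key] = combine(totals[key], value) if key in totals else value
    return totals

# Hypothetical word-count-style data: two partitions of (word, 1) records.
partitions = [
    [("a", 1), ("b", 1), ("a", 1), ("a", 1)],
    [("a", 1), ("b", 1)],
]
add = lambda x, y: x + y

print(len(group_by_key_shuffle(partitions)))        # 6 records shuffled
print(len(reduce_by_key_shuffle(partitions, add)))  # only 4 records shuffled
print(final_merge(reduce_by_key_shuffle(partitions, add), add))  # {'a': 4, 'b': 2}
```

In real PySpark the same contrast is `rdd.groupByKey().mapValues(sum)` versus `rdd.reduceByKey(lambda x, y: x + y)`: both produce identical results, but the latter ships far fewer records through the shuffle.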
WikiHow is a community of volunteer contributors who write simple do-it-yourself "How to" articles. There are at present more than 190,000 articles. Shyam has been associated with wikiHow since April 2010. He has contributed 23 articles, 8 of which have been featured, with total page views exceeding 791,000. Shyam has also edited about 423 articles contributed by others.
How to Detox Your Lungs Naturally
Two Methods: Using Verified Methods and Using Unverified Methods
Keeping yourself detoxified is one of the best things you can do to stay healthy. Your lungs are among your most important organs, and human beings can survive for only a few minutes without air. It is therefore important to keep your lungs healthy so they can perform at their best throughout your life. Although you have little control over the air you breathe, you can take steps to detox your lungs using verified methods that are backed by science, as well as unverified methods rooted in naturopathic healing and folk medicine.
Method 1 of 2: Using Verified Methods
1. Cook with oregano to reduce inflammation and congestion. Oregano's primary benefits are due to its carvacrol and rosmarinic acid content. Both compounds are natural decongestants and histamine reducers that have direct, positive benefits on the respiratory tract and nasal passage airflow.
The volatile oils in oregano, thymol and carvacrol, have been shown to inhibit the growth of bacteria such as Staphylococcus aureus and Pseudomonas aeruginosa.
Oregano can be used in cooking in its dried or fresh forms.
A few drops of oregano oil in milk or juice can be taken once a day for as long as you want to receive health benefits.
2. Inhale lobelia to relax your lungs and break up congestion. Lobelia contains an alkaloid known as lobeline, which thins mucus and breaks up congestion.
Additionally, lobelia stimulates the adrenal glands to release epinephrine, relaxing the airways and allowing for easier breathing.
Also, because lobelia helps to relax smooth muscles, it is included in many cough and cold remedies.
Extracts of Lobelia inflata contain lobeline, which has shown positive effects in the treatment of multidrug-resistant tumor cells.
You may add 5-10 leaves of lobelia and vaporize them for inhalation. Inhale the vapors for 10 minutes each day, morning and evening.
3. Steam treat yourself with eucalyptus to take advantage of its expectorant properties. Eucalyptus is a common ingredient in cough lozenges and syrups and its effectiveness is due to an expectorant compound called cineole, which can ease a cough, fight congestion, and soothe irritated sinus passages.
As an added bonus, because eucalyptus contains antioxidants, it supports the immune system during a cold or other illness.
You may add a few drops of eucalyptus oil into hot water and do a steam inhalation for 15 minutes each day to cleanse the lungs.
4. Take mullein to clear mucus and cleanse the bronchial tubes. Both the flowers and the leaves of the mullein plant are used to make an herbal extract that helps strengthen the lungs.
Mullein is used by herbal practitioners to clear excess mucus from the lungs, cleanse the bronchial tubes, and reduce inflammation present in the respiratory tract.
You can make a tea from one teaspoon of the dried herb and one cup of boiled water.
Alternatively, you can take this herb in tincture form.
5. Use peppermint to soothe your respiratory muscles. Peppermint and peppermint oil contain menthol, a soothing ingredient known to relax the smooth muscles of the respiratory tract and promote free breathing.
Paired with the antihistamine effect of peppermint, menthol is a fantastic decongestant.
Many people use therapeutic chest balms and other inhalants that contain menthol to help break up congestion.
Additionally, peppermint is an antioxidant and fights harmful organisms.
You may chew on 3-5 peppermint leaves each day to enjoy its anti-histaminic benefits.
6. Drink an infusion of elecampane to reap its soothing and expectorant benefits. The root of the elecampane plant helps kill harmful bacteria, lessens coughs, and expels excess mucus.
Elecampane contains inulin, a phytochemical that coats and soothes the lining of the bronchial passages and acts as an expectorant in the body.
In the respiratory system, it gradually relieves any fever that might be present while battling infection and maximizing the excretion of toxins through perspiration.
If you have a tickling cough or bronchitis, elecampane may be able to help.
Because of its action on excess mucus and toxins in the respiratory tract, it is often helpful with emphysema, asthma, bronchial asthma, and tuberculosis.
You can use one teaspoon of herb per cup of water in an infusion, or one-half to one teaspoon of tincture, three times a day for about 3 months.
7. Take hot showers to clear your lungs. Taking a shower with hot water for twenty minutes can be really helpful in clearing out your lungs.
If you can sit in a sauna, the hot air will be even more effective in clearing your lungs.
It is very important to allow your body to get rid of toxins through sweating.
A sauna or hot water increases the secretion of sweat, and helps the lungs rid themselves of toxic substances.
8. Stop smoking to protect your lungs from toxins. Smoking tobacco is a great way to introduce a variety of toxins directly into your lungs.
Tobacco smoke, nicotine, and the variety of other unhealthy substances found in cigarettes wreak havoc on your respiratory tract.
In addition to lowering your lung capacity, smoking puts you at risk for cancer and other long-term health complications.
9. Stay away from common toxic products. Eliminate household toxins that are part of detergents, cleansers, bleaches, and chemically scented air fresheners that have strong fragrances and might harm the lungs.
Pesticides must go as well, and there are alternatives that aren't toxic for humans.
All toxic commercial pesticides emit caustic gases or vapors that irritate the lungs.
Simply get some nice indoor plants that add life to your dwelling while removing toxins.
Method 2 of 2: Using Unverified Methods
1. Drink sage tea to dispel lung disorders. Sage’s textured leaves give off a heady aroma, which arises from sage’s essential oils. These oils are the source of the many benefits of sage tea for lung problems and common respiratory ailments.
Sage tea is a traditional treatment for sore throats and coughs.
The rich aromatic properties arising from sage’s volatile oils of thujone, camphor, terpene and salvene can be put to use by inhaling sage tea’s vapors to dispel lung disorders and sinusitis.
Alternatively, brew a strong pot of sage tea and place it into a bowl or a vaporizer.
Inhale the vapors for about 5-10 minutes 2-3 times a day, or for as long as you wish, since it is healthy and perfectly safe.
2. Eat boiled plantain leaf to soothe irritated mucous membranes. With fruit that is similar in appearance to a banana, plantain leaf has been used for hundreds of years to ease coughs and soothe irritated mucous membranes.
Many of its active ingredients show antibacterial and antimicrobial properties, as well as being anti-inflammatory and antitoxic.
Plantain leaf has an added bonus in that it may help relieve a dry cough by stimulating mucus production in the lungs.
One may eat a boiled plantain fruit or sip on a decoction of 1-2 brewed plantain leaves.
You may continue this each day for about 2-3 months to take advantage of its healing benefits on the lungs.
3. Drink licorice root tea to clear out mucous in the lungs. Licorice is one of the most widely consumed herbs in the world to eliminate toxins from the lungs. Licorice is very soothing, and softens your mucous membranes in the throat, lungs, and stomach.
It reduces the irritation in the throat and has an expectorant action (loosening phlegm to be expelled).
It loosens the phlegm in the respiratory tract, so that the lungs can expel the mucus.
It also has antibacterial and antiviral effects which help fight off viral and bacterial strains in your body that can cause lung infections.
You can use one teaspoon of licorice root per cup of water in an infusion, or one teaspoon of tincture, 3 times a day.
4. Vaporize cannabis to open up your airways and sinuses. If it's legal in your area, use vaporized cannabis for about 5 minutes each day to open up your airways and sinuses.
Vaporizing cannabis mitigates the irritation to the oral cavity that comes from smoking.
Cannabis is perhaps one of the most effective anti-cancer plants in the world.
It also stimulates your body’s natural immune response and significantly reduces the ability of infections to spread.
Cannabis has even been shown to treat and reverse asthma.
5. Drink watercress (Nasturtium officinale) tea to eliminate toxins. Watercress has the ability to eliminate the toxins from tobacco and decrease the chance of these toxins resulting in lung cancer.
This ability is due to an active ingredient that acts on a series of enzymes, preventing the development of cancer cells.
Watercress is used to make a simple and delicious soup, which efficiently cleanses the lungs of toxins.
It is recommended that you consume this soup twice a month, especially if you are an active or a passive smoker.
1 kg of watercress (flowers and stems)
2 cups of dates
4 cups of water
Put all ingredients in a pot over a low flame. When it boils, reduce the heat and allow it to simmer for a minimum of four hours. If the foam starts forming at the surface of the soup, remove it with a spoon. Once the soup is ready, season it according to your taste.
Note: It is very important to use the correct ratio of ingredients and cook the soup for a minimum of four hours. After such a long cooking time the soup becomes tasty, nutritious, and effective in detoxifying the lungs.
6. Try ginger to prevent lung cancer. Ginger is a powerful tool for detoxification of the lungs and prevention of lung cancer.
You can use it in many ways including ginger root tea mixed with lemon, which facilitates breathing and promotes the elimination of toxins from the respiratory tract.
You can also create a warm bath with powdered ginger. The bath should last at least twenty minutes.
The ginger bath opens pores and stimulates sweating, which helps eliminate toxins.
The steam you inhale goes directly into the airways and eases the process of purifying the lungs.
With every meal, you may eat a tiny piece of ginger.
This will improve your digestion and will contribute to the process of cleansing the body.
7. Use castor oil packs to draw toxins out of the body. Castor oil packs are easy to make at home and are great for drawing toxins out of the body. Castor oil has long been appreciated as a general health tonic and is believed to stimulate circulation and waste elimination.
Castor oil packs can be placed on your chest, perhaps similar to a vapor rub, and can break up congestion and toxins.
While the packs are not expensive to make, it is essential to use only organic, cold-pressed castor oil.
By using cold-pressed oil, you can be reasonably certain it will contain the vital compounds such as phytonutrients, undecylenic acid, and especially ricinoleic acid that are beneficial to the body.
Carefully warm about 8 oz. of castor oil in a pot on the stove to a comfortable temperature and then soak 12″x6″ strips of cloth in the oil.
Being careful not to spill the oil, take the pot with you to where you plan to lie down. Use a small piece of plastic like a “glove” to handle the packs.
Lie down on a plastic sheet, and then lay 3-4 strips over your chest and sides covering the lung areas. Do this on the right and left sides.
Then, cover the packs with a larger section of plastic and lay your heating pad over the plastic covered castor oil packs. Keep it there for 1-2 hours.
Alternate the heating pad from right to left sides.
It is believed that this helps break up and draw out stored toxins and congestion from the lungs.
8. Take an osha root extract to increase circulation to the lungs. Osha roots contain camphor and other compounds that make it one of the best lung-supporting herbs. One of the main benefits of osha root is that it helps increase circulation to the lungs, which makes it easier to take deep breaths.
Also, when seasonal allergies inflame your sinuses, osha root can produce a similar effect to antihistamines and may help calm respiratory irritation.
An infusion prepared with the roots of osha can be taken orally to cure a number of medical conditions.
In addition, fresh liquid extract of the herb's roots can also be used internally.
The standard dose of the infusion prepared with osha root is one to two teaspoonfuls of cut and crushed, freshly obtained root, infused for approximately 25 minutes.
If you are taking the root in a liquid extract, ensure that its strength is at a ratio of 1:1:8.
Take 20 to 60 drops of this root liquid extract once to four times every day.
9. Drink a lungwort tea to relieve various lung conditions. Lungwort is a tree-growing lichen that actually resembles lung tissue in appearance, and hence is used for various lung conditions.
Lungwort clears tar from the upper respiratory tract, nose, throat, and upper bronchial tubes, while helping the body soothe the mucous membranes in these regions.
It also has an anti-inflammatory action and is good for bronchitis.
As an infusion, mix one to two teaspoons of dried herb per cup and drink one cup three times a day.
10. Maintain a healthy diet to detox your whole body. Like all other types of detoxification, lung cleansing necessitates dietary changes.
A healthy diet is important because it stimulates the natural cleansing mechanisms of the body and strengthens the immune system.
During cleansing, it is recommended that you consume more water, fruits and vegetables.
Drink a cup of lemon juice before breakfast; lemon helps the lungs renew themselves with its high vitamin C content and is easy to digest.
Drink a glass of grapefruit juice because it contains natural antioxidants and enhances the detoxing of your circulatory system.
Have a cup of carrot juice in the period between breakfast and lunch. Carrots contain vitamins A and C, which help to clean the respiratory system and boost immunity.
11. Consume a good amount of potassium. Potassium is one of the most detoxifying nutrients, especially when taken in liquid form.
To prepare a cup of juice rich in potassium, place some carrots, celery, spinach, parsley, and green algae in a blender.
12. Eat spicy foods to break down excess mucus. Chilies help break down excess mucus in the lungs and the body in general.
That is why when you eat spicy foods, you can immediately feel your nose beginning to run.
In the same way, spicy foods affect the excess mucus and tar in the lungs, helping your body eliminate them more easily.
13. Drink water to stay hydrated. Plain water is the best thing to drink while you are detoxing. Good hydration is key to good health and speeds up the process of detoxification.
Try to avoid sodas, coffee, and alcohol.
14. Do breathing exercises to facilitate clear lungs. Breathing exercises are one of the best ways to cleanse your lungs. There are many types of exercises for detoxifying your lungs.
Try this one:
Take a standing position. Be relaxed. Keep your arms at your sides and your feet slightly apart. Take a few deep breaths and exhale through the nose.
Now, breathe in through your nose and exhale slowly through your mouth deeply until you cannot exhale anymore.
But don't stop here, because there is still air left in your lungs. Some air always remains in the lungs and is not replaced with fresh air as we breathe.
Now, force your diaphragm to exhale all the air from your lungs with a wheezing sound.
Do this several times, exhaling through your mouth with a deep puff until you feel there is no more air in the lungs. At this point you will notice that you have pulled in your belly toward the spine.
Through your nose, slowly inhale fresh, clean air into your empty lungs.
Fill your lungs with fresh air, and then hold your breath for five seconds, counting them slowly.
Repeat the process to expel the remaining air out of the lungs. Repeat as many times as you like but at least 50 times each day.
Besides purifying the lungs, this exercise has another benefit: your stomach muscles will eventually become strong and taut.
Please read the original wikiHow article Here
It’s Not a ‘Stream’ of Consciousness
IN 1890, the American psychologist William James famously likened our conscious experience to the flow of a stream. “A ‘river’ or a ‘stream’ are the metaphors by which it is most naturally described,” he wrote. “In talking of it hereafter, let us call it the stream of thought, of consciousness, or of subjective life.”
While there is no disputing the aptness of this metaphor in capturing our subjective experience of the world, recent research has shown that the “stream” of consciousness is, in fact, an illusion. We actually perceive the world in rhythmic pulses rather than as a continuous flow.
Some of the first hints of this new understanding came as early as the 1920s, when physiologists discovered brain waves: rhythmic electrical currents measurable on the surface of the scalp by means of electroencephalography. Subsequent research cataloged a spectrum of such rhythms (alpha waves, delta waves and so on) that correlated with various mental states, such as calm alertness and deep sleep.
Researchers also found that the properties of these rhythms varied with perceptual or cognitive events. The phase and amplitude of your brain waves, for example, might change if you saw or heard something, or if you increased your concentration on something, or if you shifted your attention.
But those early discoveries themselves did not change scientific thinking about the stream-like nature of conscious perception. Instead, brain waves were largely viewed as a tool for indexing mental experience, much like the waves that a ship generates in the water can be used to index the ship’s size and motion (e.g., the bigger the waves, the bigger the ship).
Recently, however, scientists have flipped this thinking on its head. We are exploring the possibility that brain rhythms are not merely a reflection of mental activity but a cause of it, helping shape perception, movement, memory and even consciousness itself.
What this means is that the brain samples the world in rhythmic pulses, perhaps even discrete time chunks, much like the individual frames of a movie. From the brain’s perspective, experience is not continuous but quantized.
Another clue that led to this discovery was the so-called wagon-wheel illusion, in which the spokes on a wheel are sometimes perceived to reverse the direction of their rotation. This illusion is easy to induce with a strobe light if the rotation of the wheel is such that each strobe flash captures the spoke location slightly behind the location captured on the previous flash, leading to the perception of reverse motion. The illusion results from “sampling” the scene in discrete frames or time chunks.
The telling fact, for perceptual scientists, is that this illusion can also occur during normal observation of a rotating wheel, in full daylight. This suggests that the brain itself, even in the absence of a strobe light, is sampling the world in discrete chunks.
Scientists have uncovered still more clues. It turns out, for example, that our ability to detect a subtle event, like a slight change in a visual scene, oscillates over time, cycling between better and worse perceptual sensitivity several times a second. Research shows that these rhythms correlate with electrical rhythms of the brain.
Consider a study that I conducted with my colleagues, forthcoming in the journal Psychological Science. We presented listeners with a three-beat-per-second rhythm (a pulsing “whoosh” sound) for only a few seconds and then asked the listeners to try to detect a faint tone immediately afterward. The tone was presented at a range of delays between zero and 1.4 seconds after the rhythm ended. Not only did we find that the ability to detect the tone varied over time by up to 25 percent — that’s a lot — but it did so precisely in sync with the previously heard three-beat-per-second rhythm.
Why would the brain do this? One theory is that it’s the brain’s way of focusing attention. Picture a noisy cafe filled with voices, clanging dishes and background music. As you attend to one particular acoustic stream — say, your lunch mate’s voice — your brain synchronizes its rhythm to the rhythm of the voice and enhances the perceptibility of that stream, while suppressing other streams, which have their own, different rhythms. (More broadly, this kind of synchronization has been proposed as a mechanism for communication between neural networks within the brain.)
All of this points to the need for a new metaphor. We should talk of the “rhythm” of thought, of perception, of consciousness. Conceptualizing our mental experience this way is not only more accurate, but it also situates our mind within the broader context of the daily, monthly and yearly rhythms that dominate our lives.
Are You Ready for Personalized Predictive Analytics?
When One Business Model Isn’t Enough
How LAN’s Three Models Interrelate
Maximal use of physical assets.
Reduction of the break-even load factor (BELF).
Diversification of revenues and profits.
Reduced threat of entry by other airlines.
One-stop shop for cargo in Latin America.
The Challenge of Managing Multiple Models
Broader organizational skills.
Greater employee flexibility.
No two business models share all resources, of course. In Miami, for example, where LAN’s cargo operations are headquartered, the company has almost 500,000 square feet of dedicated warehouse space and other cargo facilities that its passenger competitors do not need. Furthermore, to serve Latin America comprehensively, regulatory constraints preventing non-national companies from operating within certain countries have impelled LAN to create a series of separate companies for its no-frills short-haul passenger service: LAN Peru, LAN Ecuador, LAN Colombia, and LAN Argentina. It has also set up additional operating structures through alliances in Mexico and several other countries.
Distinguishing Complements From Substitutes
Are Your Business Models Complements or Substitutes?
A new grasp on robotic glove
Soft, lightweight robotic glove assists with grasping objects independently.
Want to Get Ahead? Work on Your Improv Skills
Reality doesn’t exist until we measure it, quantum experiment confirms
Google Wants You to Control Your Gadgets with Finger Gestures, Conductive Clothing
New Google technology addresses the tiny screen problem by letting you control wearables with tiny gestures, or by touching your clothes.
Small gadgets such as smart watches can be frustrating to use because their tiny buttons and touch screens are tricky to operate. Google has two possible solutions for the fat finger problem: control your gadgets by subtly rubbing your finger and thumb together, or by swiping a grid of conductive yarn woven into your clothing.
The first of those two ideas works thanks to a tiny radar sensor that could be integrated into, say, a smart watch and can detect fine motions of your hands from a distance and even through clothing. Levi Strauss announced today that it is working with Google to integrate fabric touch panels into its clothing designs. The new projects were announced at Google’s annual developer conference in San Francisco Friday by Ivan Poupyrev, a technical program lead in Google’s Advanced Technology and Projects research group.
What Successful Project Managers Do
Traditional approaches to project management emphasize long-term planning and a focus on stability to manage risk. But today, managers leading complex projects often combine traditional and “agile” methods to give them more flexibility — and better results.
Coping with frequent unexpected events requires an organizational culture that allows the project manager to exercise a great amount of flexibility. Here are two examples of advanced organizations that took steps to modify their cultures accordingly.

In today’s dynamic and competitive world, a project manager’s key challenge is coping with frequent unexpected events. Despite meticulous planning and risk-management processes, a project manager may encounter, on a near-daily basis, such events as the failure of workers to show up at a site, the bankruptcy of a key vendor, a contradiction in the guidelines provided by two engineering consultants or changes in customers’ requirements.
Such events can be classified according to their level of predictability as follows: events that were anticipated but whose impacts were much stronger than expected; events that could not have been predicted; and events that could have been predicted but were not. All three types of events can become problems that need to be addressed by the project manager. The objective of this article is to describe how successful project managers cope with this challenge.
A group of 23 project managers who had come from all over NASA to participate in an advanced project management course declared mutiny. They left the class in the middle of the course, claiming that the course text, based on NASA’s standard procedures, was too restrictive for their projects and that they needed more flexibility. With the blessing of NASA’s top leadership, the class members then spent four months conducting interviews at companies outside of NASA. This led to a rewriting of numerous NASA procedures.
Among other things, NASA headquarters accepted the group’s recommendation to give NASA project managers the freedom to tailor NASA’s standard procedures to the unique needs of their projects. A similar movement to enhance project managers’ flexibility occurred at Procter & Gamble, where the number of procedures for capital projects was reduced from 18 technical standards and 32 standard operating procedures to four technical standards and four standard operating procedures.
Concurrent with these changes at NASA and P&G, a heated debate emerged within the wider project management profession regarding the need for flexibility, as opposed to the traditional approach, which emphasizes that project success depends on stability. According to the traditional approach, project success can be achieved by focusing on planning and on controlling and managing risks. Although the popularity of this approach has sharply increased across industries, research covering a wide variety of projects consistently reveals poor performance. A large percentage of projects run significantly over budget and behind schedule and deliver only a fraction of their original requirements.
The other side in this debate is best represented by a newer project management approach popular within the software industry. Called the agile method, it asserts that project success requires enormous flexibility throughout the project’s life. However, even proponents of the agile approach acknowledge that this approach is best suited to small projects and teams.
Our studies, employing experiential data collected from more than 150 successful project managers affiliated with more than 20 organizations, indicate that today’s successful project managers cope with unexpected events by a combination of the traditional and agile approaches, assuming four roles. (See “About the Research.”) Two of the roles are intention-driven and two are event-driven, with each role assumed on its own time schedule throughout the life of the project. The first role, developing collaboration, is performed early on during the project. The second role, integrating planning and review with learning, is performed periodically. The third role, preventing major disruptions, is performed occasionally. The fourth role, maintaining forward momentum, is performed continuously. (See “The Four Roles of the Project Manager.”)
About the Research
In recent years, many researchers have concluded that one reason for the widespread poor statistics about project results is the wide gap between research and practice.i The overall objective of our research was to develop a practice-based theory of project management.ii To this end, we used three complementary approaches to collect firsthand data on the practices of successful project managers. Believing that management is best learned by emulating exemplary role models, we focused our studies on a selective sample of the best practitioners in their respective organizations.
1. Develop Collaboration
2. Integrate Planning and Review With Learning
3. Prevent Major Disruptions
4. Maintain Forward Momentum
Implications for Senior Managers
Staying in the Know
In an era of information overload, getting the right information remains a challenge for time-pressed executives. Is it time to overhaul your personal knowledge infrastructure?
A common thread runs through many recent corporate setbacks and scandals. In crises ranging from BP’s Deepwater Horizon oil spill debacle to the Libor rate-fixing scandal in the City of London, the troubles simmered below the CEO’s radar. By the time the problems were revealed, most of the damage had arguably already been done. Despite indications that large companies are becoming increasingly complicated to manage,1 executives are still responsible for staying abreast of what’s going on in their organization. But how do you keep tabs on what your competitors and employees are doing? How do you spot the next big idea and make the best judgments? How do you distinguish usable information from distracting noise? And how do you maintain focus on what’s critical?
Many management experts have assumed that better information systems and more data would solve the problem. Some have pushed for faster and more powerful information technologies. Others have put their faith in better dashboards, big data and social networking. But is better technology or more tools really the most promising way forward? We think not. In this article, we maintain that the capacity of senior executives to remain appropriately and effectively knowledgeable in order to perform their jobs is based on a personal and organizational capability to continually “stay in the know” by assembling and maintaining what we call a “personal knowledge infrastructure.” And while information technologies may be part of this personal knowledge infrastructure, they are really just one of the components.
We are not the first researchers to make this claim. More than 40 years ago, organizational theorist Henry Mintzberg suggested that information was central to managerial work and that the most important managerial roles revolved around information (monitoring, disseminating and acting as a spokesperson). Mintzberg described managers as the nerve centers of organizations and said informational activities “tie all managerial work together.”2 Other researchers suggested that management itself could be considered a form of information gathering and that we are quickly moving from an information society to an attention economy, where competitive advantage comes not from acquiring more information but from knowing what to pay attention to.3 Later research confirmed that dealing with information is critical and found that managers’ communication abilities are directly related to their performance.4
While the importance of informational roles and activities is well established, we take the idea a step further, arguing that managers — and especially senior executives — are only as good at acquiring and interpreting critical information as their personal knowledge infrastructures are. Managers rely on specific learned modes to manage and allocate their attention.5 However, how we pay attention is not simply a matter of internal mental processes that we can do little about. Rather, attentiveness (in other words, the capacity to stay on top, and the ability to distinguish between what matters and what doesn’t) mostly stems from what managers do or don’t do, whom they talk to and when, and what tools and tricks of the trade they use. In short, attentiveness relies on and is facilitated by things we can observe — and things we can do something about.
Technologies and new tools are not and cannot be “silver bullet” solutions. At times, simpler things such as talking to customers or networking with board members may be more important, provided they are done methodically and with some purpose. Selecting when particular elements are appropriate depends on the circumstances. As a result, understanding and, when needed, overhauling one’s personal knowledge infrastructure should be routine. In this article, we explain how this can be done, drawing on insights obtained by shadowing individual CEOs as they went about their daily jobs.6
About the Research
To uncover how top executives deal with information and knowledge, we conducted an observation-based exploratory study using a rigorous ethnographic protocol successfully employed in the past. We followed seven chief executives through their working days for several weeks. We went where they went, watched what they did, listened to what they and others said and asked what was going on when we did not understand. We also discussed our findings with them and with invited colleagues as part of structured feedback sessions.
Our sample comprised seven CEOs of acute and mental health organizations that are part of the National Health Service in England. In England, health care is provided by public sector bodies called trusts. Our sample included organizations that run multiple hospitals, have an annual budget of more than 500 million pounds and have up to 10,000 employees. The CEOs have both legal and financial responsibility. The sample included both men and women (3:4). The CEOs had diverse professional backgrounds (NHS management, private sector, nursing and medical) and were at different points in their careers, both in terms of tenure in their present post and overall experience at the CEO level. The sample also included organizations with different performance levels according to indicators by which their performance was monitored by national regulators (for example, financially sound vs. struggling).
CEOs were observed for five or more weeks, apart from one subject, where observations lasted 3½ weeks. The researchers had good access to the CEOs and were able to document nearly all aspects of their work, with exceptions such as one-to-one supervisory meetings with junior colleagues, HR-related meetings concerning individuals and private meetings with patients. When CEOs worked from home, data was collected in interviews afterwards. We conducted semi-structured interviews with five of the CEOs and a number of informal interviews with the other two CEOs. In addition, we conducted two formal interviews with two different personal assistants, which were recorded and lasted approximately half an hour each. Additional data came from meeting papers, articles referenced by the CEOs and copies of publications consulted, and they were supplemented by externally available information such as annual trust reports and regulatory documents. Following the study, our results were shared with two groups of CEOs at dedicated sector events. Those CEOs helped us to refine our findings and elaborate on the notion of the personal knowledge infrastructure.
Our research is based on a two-year study of the day-to-day work of seven CEOs of some of the largest and most challenging hospital- and mental health-based organizations in England. (See “About the Research.”) We chose to study health-care executives because they sit at the crossroads between the private and public sectors and therefore are expected to meet multiple, often competing, demands. To say that the informational landscapes in these organizations are complex is an understatement. Yet the organizations are increasingly subject to pressures to become more transparent, even as they compete with each other. Therefore they seemed to be a good choice as settings for studying the challenges of using information and knowledge to stay on top and ahead of the curve. Throughout our research we sought to answer a simple question: How did the CEOs know what they needed to know in order to be effective at their jobs?
“Nothing but Talking”
One of the first things that struck us was that, in contrast to the popular image of CEOs as lonely, heroic decision makers, the individuals we studied did not seek information or utilize discrete pieces of evidence for the purpose of making decisions. Rather, they often sought something much more ordinary: to make themselves knowledgeable in order to be ready for any eventuality, so that they could understand what to do next. Indeed, one of their main preoccupations appeared to be staying on top of what was happening within and around their organizations. As one put it: “The worst thing for a CEO is to find yourself asking after the fact: How could this happen without me knowing?”
Notably, staying on top was not a separate activity in addition to what the CEOs already did, but rather something they mostly did without thinking and without noticing — or something that they achieved while doing something else. Calling a former colleague who was in the news for the wrong reasons (an accident, a bad report from inspectors, protests about the closure of a loss-making hospital) can produce multiple outcomes: reinforcing a relationship and demonstrating solidarity, but also finding out what is going on. Indeed, many of the CEOs had difficulty acknowledging that checking in with people was an integral part of their job — hence the often-heard comment, “I do not know what has happened to my workday … It seems I have done nothing but talking.”
The Personal Knowledge Infrastructure
The CEOs we studied didn’t leave the process of staying informed to chance. Rather, they relied on a habitual and recurrent set of practices, relationships and tools of the trade, which constituted a personal knowledge infrastructure that supported them in their daily tasks of understanding, foreseeing and managing. This tacit and rarely discussed infrastructure, which was very different from their IT system, helped them to know what needed to be done and to get a sense of the right way forward. What made some CEOs more effective than others was not merely the characteristics of the individual components of their personal knowledge infrastructure but also the quality of the whole and its fit with the specific needs of the job. This personal knowledge infrastructure comprised three main elements: routine practices, relationships, and tools and technologies.
First, every CEO had a set of routine practices she or he relied on — things such as checking the morning news, running periodic review meetings, dropping by immediate collaborators’ offices to ask what was often “just a quick question,” walking around and occasionally even going to the cafeteria “to check how things are going.” These practices were not just internal. The CEOs also met with board members and managers of other organizations, attended conferences and staff events and participated in ceremonial functions such as charity events. Some of the gatherings were framed as leisure opportunities (having a drink, playing golf), but they weren’t entirely social. CEOs returned from such events with information and news they shared with various associates. Similarly, sitting on boards of other organizations was often seen as a necessary evil that helped the CEOs get a broader overview of what was going on beyond their organization.
Second, the personal knowledge infrastructure contained a number of social relationships. Like prior researchers, we found that most CEOs’ work was conducted verbally and was accomplished with and through others. Our CEOs used their relationships both to gather information and to make sense of it.7 For example, every CEO we studied carefully cultivated strategic relationships within and outside the organization. These relationships were usually engineered to produce a combination of breadth and depth of intelligence. Their network of contacts constituted a form of social capital that had been accumulated over time. On many occasions, we observed CEOs interacting with long-term colleagues, previous members of their staff and people with whom they had done business. They used their contacts to gather weak signals (on the principle that today’s gossip can become tomorrow’s news), triangulate information and confirm or contradict their evolving insights. Some of the CEOs were extremely strategic and nurtured relationships with various stakeholders (for example, management consultants, politicians or local leaders), whom they saw regularly for dinner or a drink. Some CEOs also relied on small groups of peers, whom they met with on a regular basis. These groups of peers, who were often also “comrades in adversity” facing similar challenges, operated both as a support group and as a precious setting where sensitive information was exchanged on the basis of reciprocity.
However, not all of the contacts were treated the same way.8 The CEOs appeared to have an informal hierarchy: those who were more distant, who could be used as a source of signals and needed to be taken with a grain of salt; those who were trusted and tended to provide reliable intelligence (for example, board members or colleagues, as well as their assistants); and finally, those with whom the various streams of information could be discussed and processed — the inner circle. All of the CEOs in our sample relied heavily on such an inner circle, usually composed of selected executive team members with whom they had the most intense interactions. The CEOs used these individuals not only to obtain information, often informally (given the open-door policy that was in operation for most CEOs), but also as sounding boards to test emerging understandings and reconcile possibly competing insights. These interactions allowed the CEOs not only to connect the dots but, more importantly, to figure out which information qualified as a dot that had to be connected further.
Finally, the CEOs’ personal knowledge infrastructures included a variety of tools of the trade. These included traditional tools such as phone, email, reports and journal articles from industry magazines, as well as less traditional sources such as Twitter, blogs and other social media. Most CEOs utilized some form of electronic reporting system or audit-based dashboard that helped them track critical performance indicators, and most consulted these tools regularly, but the sophistication of the tools varied substantially. (See “Components of a Personal Knowledge Infrastructure.”)
Components of a Personal Knowledge Infrastructure
Although most CEOs had a small pile of “will read” books in their office, they rarely had time for books or magazines during the course of a working day. Personal preferences played an important role here, more so than in the two previous categories. For example, while some CEOs relied heavily on mobile phones for calls and texts, others used email almost exclusively. Contrary to our expectations, most of the CEOs dealt personally with a range of emails — this was how their work was done. Some CEOs made very little use of written documents and required short summaries. Others wanted to have complete documentation “just in case.” Some CEOs still found comfort in printed paper; few were happy to go paperless.
One critical aspect of personal knowledge infrastructures was the extent to which individual elements were designed to support each other. For example, CEOs who liked to run large formal meetings also invested significant time in social relationships, consulting collaborators on a one-to-one basis. The CEO of an organization that was struggling with issues of trust and hidden or misplaced information worked to triangulate soft data with hard knowledge. This often required him to follow up with people individually (for example, phoning staff members directly to corroborate information or requesting documentation from outside sources) while also working to set up formal structures that were currently lacking.
How the Personal Knowledge Infrastructure Evolves
Although all of the CEOs relied on a personal knowledge infrastructure, its composition varied greatly from one CEO to another. We saw differences across all seven CEOs both in which elements their personal knowledge infrastructures contained and in the amount of emphasis each element received. Two examples illustrate how different personal knowledge infrastructures corresponded to specific leadership styles in different situations.
CEO 1: Knowing the Details in a Struggling Organization
A newly appointed CEO was running a struggling hospital-based organization that was receiving increased regulatory attention for financial reasons. His personal knowledge infrastructure was designed to help him closely monitor his organization. He ran large, often long, weekly management meetings that provided an opportunity for all team members to examine operations and share and obtain a wealth of information. After the formal meetings, conversations continued in the executive offices. The CEO spent the bulk of his time in regular meetings with local managers of health-care organizations and funding agencies, picking up signals and providing insight into the work and progress of his organization. He also spent time working on wards and visiting and talking to staff. Moreover, he cultivated a wide network of colleagues whom he often consulted in rather informal ways and maintained external links to support his key strategic tasks. He had an open-door policy, both as a symbol of change and as a permanent invitation. He felt comfortable digging through reports and documents, and he set aside time on trains to go through what he called “the train pile of documents.” Though the CEO used the phone and the Internet, he liked to attend conferences and networking events to develop a broad view of the business environment.
CEO 2: Managing Via a Mix of the Formal and Informal
The second CEO, who had been in his position for more than five years, used a very different set of practices and tools. He worked with an established team to run an organization that prided itself on its ability to achieve operational excellence and strategic growth. Throughout the day, this CEO had a series of chats with executives, which often expanded into conversations between him and several people. Indeed, much of his working day was spent in what appeared as free-form interaction: sharing information informally, only sporadically framed by discussion about an immediate problem concerning a patient or a medical concern.
This CEO rarely attended local meetings, so the other executives were an information gateway to local strategic issues for him. However, the headquarters-based, executive-team orientation was reinforced by several other structures, relationships and tools, purposefully arranged by the CEO so he could remain in the know. First, there was an executive who the CEO felt had a very different approach from the others, which gave him another voice and view to consider. The CEO also had an IT performance system, which he consulted every morning and which allowed him to identify any serious performance issues in the organization without needing to rely on reports from executives.
The CEO supplemented such insights with visits to wards and other areas of the hospitals late in the evening and on weekends, which allowed him to gain informal insights from veteran staff. The internal systems were supported by national-level policy work and involvement via leadership positions in sector organizations and initiatives and networking. These allowed the CEO to both formally and informally stay in the know regarding strategic issues of potential relevance, and also to influence their direction to the benefit of his organization.
What Makes It Personal
As we have seen, different CEOs use different knowledge infrastructures that reflect both what they personally need and where their organizations are at a particular point in time. In each case, the context was particularly relevant. We saw different types of knowledge infrastructures (in other words, combinations of tools, practices and relationships) in relation to seven factors:
The CEO’s Experience
More experienced CEOs often had a more defined personal style that they carried with them when they changed jobs. Some had specific practices that they tried to reactivate in the new workplace and a network of contacts that constituted the social capital they had accrued over the years; we saw this in the case of a CEO facing an operational challenge in his new organization, when he called a former colleague for advice.
The CEO’s Tenure
The more time CEOs spent in the same organization, the more they learned, often the hard way, about which sources they could trust and how they could make these sources work, given their existing infrastructures and approaches to work.
Makeup of the Executive Team and Board
The composition of the top management team, how competent its members are perceived to be and how well they work as a team affected the makeup of the inner circle. CEOs often included in their inner conversational circles directors who were easy to talk to or were particularly good at collecting and relaying intelligence. However, many CEOs, like the second CEO discussed above, also saw value in having friendly “devil’s advocates” on staff who were able to present different views and act as meaningful counterweights.
Organizational Conditions and Pressures
Organizations with different financial, efficiency, quality and safety environments posed different issues for the CEOs. When the conditions changed, they had to retune their antennae accordingly. A particular challenge involved the tools and technologies available. Systems can be powerful but are costly and difficult to change. Most CEOs worked to modify and develop existing systems but often they didn’t have a lot of room to make immediate changes. They worked with what they had while instigating long-term interventions so that the system would suit them rather than the other way around.
The Organization’s Strategic Direction
Entering new markets or introducing new products or services required CEOs to adapt their personal knowledge infrastructure accordingly. For instance, a CEO facing a possible merger began adding M&A events to his calendar.
Economic, Competitive and Regulatory Environment
The macro environment determined whether a CEO’s personal knowledge infrastructure was appropriate. Changes in the environment forced CEOs to adapt their existing personal knowledge infrastructure and reactivate old relationships.
The Kind of Manager the CEO Wants to Be
Ultimately, the above factors were filtered through the prism of “what kind of manager I would like to be.” For instance, a CEO who valued transparency and closeness to his organization’s users established a strong presence on social media and utilized this channel to garner insights into the experience of patients and their families. This sometimes allowed him to identify problems (such as low quality of service in a particular location or low staff morale) before his managers reported them.
Taken together, the factors above suggest that effective personal knowledge infrastructures tend to be unique and personal and conform to the preferences of the manager. They need to be continually adapted, tweaked and refined in keeping with the shifting nature of the CEO’s job, the environment and new opportunities.
Although most of our CEOs were reasonably successful, everyone saw room for improvement. Indeed, a CEO’s effectiveness was a reflection of his or her situation and person-specific alignment. For example, one CEO had spent years building a sophisticated IT performance monitoring system. Another CEO didn’t see having such a system as a priority; in his view, being an effective manager entailed moving away from operational considerations and focusing more on strategic and systemwide issues.
The challenge of changing as the organization changes was highlighted by several CEOs: The personal knowledge infrastructure that serves you well during a period of crisis and turmoil may get in the way in calmer waters. The lesson is that there is no single best personal knowledge infrastructure. Through personal reflection, managers and CEOs need to learn how to ask themselves difficult questions regarding the quality and fit of the practices, tools and relationships that they rely on to become knowledgeable. They also need to develop structured ways of asking such questions consistently and over time — rather than waiting for something to go terribly wrong.
The quality and fit of the CEO’s personal knowledge infrastructure is critical because it determines how he or she sees the world and defines himself or herself as a manager and CEO. It is the prism through which managers understand what is going on, and it provides the horizon of information sources through which this understanding will be probed and evaluated. However, a poorly designed personal knowledge infrastructure can lock the manager inside an information bubble and create information biases and blind spots.9 Managers may only realize this when something happens that was not on their radar or when an incident exposes the misalignment between the current demands and needs of their job and their own role. By closely examining the work practices of our CEOs over time, we identified four potential traps:
1. Not Obtaining the Information You Need
Although conventional wisdom suggests that the main problem for today’s executives is too much information, the real problem is not enough relevant information. Due to insufficient monitoring, an inappropriate mix of monitoring practices, inadequate or insufficient social relationships, and information overload, managers can find themselves without the information they need.
Evaluating Your Personal Knowledge Infrastructure
2. Developing a Personal Knowledge Infrastructure That Points You in the Wrong Direction
A typical problem with personal knowledge infrastructures is that they can be poorly aligned with the demands of the job. For example, if a CEO wants to foster innovation but the infrastructure informs her about operational issues only, the CEO is likely to focus on things that aren’t of primary importance. A personal knowledge infrastructure not only reflects the rules of attention but also shapes those rules. Researchers have highlighted lessons from spectacular past failures, from the Challenger space shuttle disaster to the global financial crisis.10 Many of the managers in question were completely current on the wrong information — or information about the wrong things.
3. Setting Up a Personal Knowledge Infrastructure That Is Not “You”
A manager’s personal knowledge infrastructure can clash with his management style, both in terms of what he does, the tools he uses and the type of manager he would like to be. In our study, we observed a CEO who wanted to be a manager who delegated. However, his personal knowledge infrastructure systematically drove him to focus on details, which led him to take a hands-on approach — against his best intentions. The most effective managers we observed were those who reshaped their personal knowledge infrastructure to fit their work, their management style and what they considered important.
4. Starting With Technology Rather Than Personal Need
Last, some managers make the mistake of addressing the issue from the wrong end, considering technology first. Personal knowledge infrastructures need to be geared toward personal development, not toward buying new technologies. Rather than asking, “Is this technology good?,” CEOs should ask, “Will it do any good for me?”
Improving Your Personal Knowledge Infrastructure
So how do managers improve their personal knowledge infrastructures? We found that although CEOs easily recognize the importance of their personal knowledge infrastructure, they very rarely pause to reflect on its effectiveness and fit. More often than not, they discover its inadequacies through comparison with others’ practices or, more commonly, following breakdowns and failures. Developing, refining and testing the effectiveness or present fit of your personal knowledge infrastructure should be routine. (See “How to Improve Your Personal Knowledge Infrastructure.”)
How to Improve Your Personal Knowledge Infrastructure
For CEOs or other executives concerned about improving their personal knowledge infrastructure, we have developed six steps designed to initiate learning and reflection.
There is a great deal that individuals can do for themselves. The starting point is being aware of the composition and functioning of your personal knowledge infrastructure and also being candid about its internal contradictions, potential misfits and misalignments. In our study, we found that this was best done through discussions with others: a mentor, a coach, colleagues or a trusted counselor. After all, your personal knowledge infrastructure is very much a part of you. Having a personal knowledge infrastructure in place is one thing; being honest about how well-suited it is to your particular circumstances is very different.
To this end, in addition to studying CEOs in action, we developed the outlines of a reflection and developmental process one can apply to one’s own circumstances. This is a framework to guide individual and peer reflection, built around a set of questions. (See “Evaluating Your Personal Knowledge Infrastructure”)
Being a manager in today’s complex world requires becoming information-savvy in ways that are manageable and work for you in your specific context. What we learned from the CEOs we studied also applies more broadly to executives in general. Becoming and remaining practically knowledgeable is a critical task. It is a capability that managers need to learn, develop and continually refine, and it becomes increasingly important as the manager moves through his or her career and up the corporate ladder, when the risk of information overload significantly increases.
This blood test can tell you every virus you’ve ever had
An Interview with Dr. David Norton
In an interview with James Creelman, head of Palladium’s Knowledge and Research Center, Palladium Chairman Dr. David Norton explains why more and more government organizations are using tools such as the Balanced Scorecard and the Execution Premium Process™ (XPP) to effectively manage complexity in the 21st century. With particular reference to the military and police sectors, he explains how globalization and technology are changing the way work gets done, and how this is driving government entities to adopt these tools so as to better visualize and deliver on their mission and to manage inter- and intra-agency collaborations.
The Balanced Scorecard concept is now almost 25 years old. Why has it proven to be so enduringly popular?
The Balanced Scorecard was in the right place at the right time. By the early 1990s the economic model was changing from one that was product-based to service-based. In this new economy there were requirements for a model to manage knowledge and tools for managing intangible assets. Many organizations were realizing that in this new economy measuring financial performance was still critical but that they needed a new approach to understanding the more intangible drivers of financial success, and the Balanced Scorecard offered a way to do that.
It has endured because it delivered transformational results in many of the early adopters. Also, although originally a way to balance financial and non-financial measurement, it developed into more of a management system than just a measurement tool. The addition of the Strategy Map was also an important milestone, as this enabled organizations to better visualize the strategy and what they had to do to deliver it.
Since the mid-1990s the government sector has been a big user of the Balanced Scorecard, but usage has increased significantly in recent years and across the globe.
What has driven this uptake?
Leaders of government entities increasingly saw the Balanced Scorecard as a good idea. They had seen others succeed with its usage and decided to try it. Some of the early government successes, such as the City of Charlotte in the USA in the mid-1990s, also helped to spread the message that this new way of managing could work in the government or not-for-profit sectors. A small number of early adopters inspired a growing number of followers. It is not unusual for any new idea to take time to trickle through, and 20 years is a relatively short time.
The Balanced Scorecard is primarily a strategy implementation framework, yet many defense sector organizations have adopted it and focused more on “battle readiness.” In what important ways have defense organizations, such as Balanced Scorecard Hall of Fame™ inductees the Royal Norwegian Air Force and the US Army, tailored the Balanced Scorecard methodology for their own needs?
I would argue that “battle readiness” is a strategy. Every organization that we have worked with has a set of strategic themes that they must deliver on, rather than a one-dimensional strategy. Private sector firms have themes such as managing the core business, customer management, innovation, etc. The same is true for the military, which will have several themes that they must manage, such as operational efficiency and battle readiness. The Strategy Map enables them to see those themes and how they work together.
Specifically related to police organizations, Abu Dhabi Police, Dubai Police, the FBI, and the Royal Canadian Mounted Police are also inductees into the Hall of Fame. What did they do well that others can learn from?
Most organizations have complex missions, but these organizations have very complex missions. I say this because to succeed in their missions they have to interface with many other organizations; success is impossible without doing so. For example, tackling the problem of drugs requires interfacing with many other agencies, such as customs or the coast guard. The Royal Canadian Mounted Police, for example, built strategic themes around pieces of their mission to drive such cooperation in areas related to drugs and gangs, in which they did not have all the knowledge required to deal with the problems on their own. The Balanced Scorecard provided these organizations with a way to visualize and put into practice that integration and come up with a new paradigm for effective policing.
The Execution Premium framework is not just about strategy execution, but more broadly strategy management.
Why did you think it was important to expand on the original Balanced Scorecard concept?
This has been a natural evolution grounded in practical experience. Bob Kaplan and I began looking at a problem with measurement, and from that we developed the original Balanced Scorecard idea. From that we realized that the framework was most powerful when the strategic objectives were laid out as a map showing cause and effect,
and this took us to Strategy Maps. There was an evolution from how we measure to how we manage. The Balanced Scorecard also became a bridge to the management system – as examples, how we set performance objectives for
individuals and how we align investments in ways that best show the organization is delivering results. Measurement
itself does not guarantee results; for this to happen metrics have to be integrated into a broader management
system. We also realized early on the importance of leadership in using the Balanced Scorecard.
This takes us to the role of leadership, which along with Bob Kaplan you have repeatedly highlighted as the critical
determinant of successful strategy execution and was deemed as such by a recent global survey by the Palladium
Group. When it comes to strategic leadership, what must organizations do right?
The success of the Balanced Scorecard is always linked to visible usage by, and buy-in from, leadership. Leaders see it as a tool, and they have many tools to choose from. Those leaders who get the most from a Balanced Scorecard really use it as an agent of change, and strategy is just another word for change. I need to build effective teams at the senior level: how do I do that? I have to get the organization to support a change of direction: how do I do that? I need to build a high-performing culture across the globe: how do I do that? So the CEO or equivalent sees the Balanced Scorecard as their framework for describing critical strategic goals and a tool for managing that change.
To do this, a good leader has to combine both right brain and left brain thinking. The right brain is unstructured and
about intuition and creativity - seeing opportunities, inspiring others, etc. The left brain is about structure – using
management tools, measuring performance, etc. Both sides of the brain are important and together deliver change.
For good reasons, defense and police organizations tend to be much more hierarchical than others in the public
and private sectors. Does this lead to any unique challenges when implementing the Balanced Scorecard or the
Execution Premium framework?
Absolutely. Strategy is horizontal in nature and not vertical. Strategy is about delivering solutions to common challenges
that the organization is facing and this is at odds with a vertical structure.
This is why a Strategy Map and in particular strategic themes are powerful within organizations with fairly rigid
hierarchies. By identifying and laying out strategic themes on a map, these organizations are able to overlay a
horizontal form of management onto the necessary hierarchical structure. The themes enable the organizations to
more effectively drive and manage cross- and intra-organizational teamwork and pursuit of common goals.
How do you see the Balanced Scorecard/Execution Premium framework evolving over the next 3-5 years and are
there any particular implications for those organizations in the defense/police sectors?
The Balanced Scorecard and Execution Premium framework will become increasingly used to manage complexity.
And this complexity has two main drivers that are greatly impacting all firms and military and police agencies in
profound ways: globalization and technology.
First there’s globalization. As I have stressed, defense agencies now have to cooperate with other agencies across
the world to tackle increasingly globalized security and criminal activities: the Balanced Scorecard will help them
better manage the inherent complexities in doing so.
And then there’s technology. Obviously technology has changed the world in ways we were not able to even
comprehend a few decades ago and is further changing the world as we speak. This is having significant impacts
on military and police agencies: think about how social media and video are now used to both prevent and solve
complex crimes. Technology is enabling more seamless interaction within and among government agencies across
the world and is becoming more integrated into the structures of these organizations. The need for a framework
that allows the focus on managing such complexity will become increasingly mission-critical.
6 reasons why we’re underhyping the Internet of Things
The Four Phases of Design Thinking
Thomas Edison created the electric lightbulb and then wrapped an entire industry around it. The lightbulb is most often thought of as his signature invention, but Edison understood that the bulb was little more than a parlor trick without a system of electric power generation and transmission to make it truly useful. So he created that, too.
Getting Beneath the Surface
How Design Thinking Happens
Taking a Systems View
Getting Back to the Surface
How Intermountain Healthcare is using data and analytics to transform patient care
American health care is undergoing a data-driven transformation — and Intermountain Healthcare is leading the way. This MIT Sloan Management Review case study examines the data and analytics culture at Intermountain, a Utah-based company that runs 22 hospitals and 185 clinics. Data-driven decision making has improved patient outcomes in Intermountain’s cardiovascular medicine, endocrinology, surgery, obstetrics, and care processes — while saving millions of dollars in procurement and in its supply chain. The case study includes video clips of interviews and a downloadable PDF version.
The views of Utah’s Wasatch Mountains are spectacular from the east side of Intermountain Medical Center, but as 40-year-old Lee Pierce walked down a hallway on the fifth floor of the hospital’s administrative building, he hardly noticed them. Pierce, Intermountain’s chief data officer (CDO), was more focused on the giant countdown clock the implementation team had put up in the corridor. The clock was approaching zero, which marked the moment in February 2015 when Intermountain Healthcare would switch on its new electronic health records (EHR) system in two of its 22 hospitals and 24 of its 185 clinics.
Pierce was hardly the only health care executive concerned about a major EHR installation. Indeed, a year earlier, a key provision of the American Recovery and Reinvestment Act of 20091 went into effect, mandating that all health care providers adopt and demonstrate “meaningful use” of EHR systems to maintain their Medicaid and Medicare reimbursement levels.2 But while others scrambled to meet the deadline, Intermountain executives were thinking past it — because Intermountain was replacing an EHR system, not installing its first one.
In fact, Intermountain had created its own EHR system in the 1970s, helping the not-for-profit hospital develop a reputation as an innovator in evidence-based medicine. But that system had aged: It had become incompatible with new forms of input, like speech and data from wearable devices, and it was cumbersome and challenging for the nurses and physicians using it to navigate the antiquated interface to document and retrieve patient information.
Over the years, clinicians had learned to work with the system. It was part of a concerted effort to bring data-based insights to clinicians and managers across the Intermountain Healthcare organization. All clinical programs had embedded analytics support teams; procurement decisions were heavily influenced by data and analytics; and patient interactions were continuously enhanced by data, from the application of population health analytics to analyses of patient self-reports. A culture of data use was widespread among Intermountain’s clinicians and managers.
Even so, the switch to a new EHR system was expected to challenge Intermountain on two fronts: one technological, the other organizational. This was Intermountain’s second effort to update the technology behind its EHR system. An earlier attempt had been abandoned in 2012. Executives pulled the plug on a six-year overhaul involving tens of millions of dollars after deciding the technology was not going to work. This time, Intermountain leaders, including Pierce, were confident they had the right technology and the right systems in place to move the data and information where it needed to go.
There were concerns, however, about whether physicians were ready and willing to make a speedy transition to the new system. They had had only occasional interaction with the old system used in the hospitals, which meant, on the one hand, that physicians were unfamiliar with its interface, and on the other, that they would have to integrate technology into their approach to patient care in new ways.
Two months later, Pierce was at the Las Vegas airport returning from a data and analytics conference, standing with one of the Intermountain physicians working on the rollout. “You know, they said we would be up and running and as efficient as before in just a couple of weeks,” the physician commented. “Here we are, a couple of months in, and some people are still not there. We should have set the expectation that it will take a few weeks to months, depending on the physician’s comfort using technology, the complexity of individual workflows, and frequency of use.”
About Intermountain Healthcare
Intermountain Healthcare runs 22 hospitals and 185 clinics in Utah and Idaho. It employs more than 800 physicians. In 2014, it performed 150,000 surgeries and had 488,000 emergency room visits. It grew out of a system of 15 hospitals operated by the Church of Jesus Christ of Latter-day Saints, which donated the hospitals to their communities in 1975. Intermountain was formed as a secular operating company to oversee those hospitals. It also operates an insurer, SelectHealth, which had 750,000 members and $1.83 billion in revenues in 2014. Overall, in 2014 Intermountain Healthcare had $5.57 billion in revenues and an operating surplus of $301 million.
Pioneering Health Care Analytics
Computers barely existed when Intermountain began its quest to incorporate data analytics into its health care practices. In the 1950s, a cardiologist named Homer Warner joined one of the hospitals that eventually became part of the Intermountain Healthcare organization. Shortly thereafter, he began gathering data to understand why some heart patients had better outcomes than others. Warner would become known as the father of medical informatics — the use of computer programs to analyze patient data to determine treatment protocols — after he and some colleagues built a decision-support tool in 1968 called HELP (Health Evaluation through Logical Processing).3 HELP was one of the first EHR systems in the United States, and it provided doctors with diagnostic advice and treatment guidance. It was also effective in helping doctors identify the causes of adverse drug reactions.
Years later, Warner recalled that using computers to model diagnoses was not — at first — well received; some cardiologists were even insulted by claims that a computer could make a diagnosis. Despite the resistance, the system’s benefits began showing up in improved patient outcomes, and HELP became a key component in Intermountain’s approach to patient care. The innovation attracted attention from all over the world.4 In 1985, Intermountain began using the HELP system in all of its hospitals. Administrators saw an opportunity to put data-driven decision making at the forefront of the organization.
But it wasn’t easy.
Delivering an Analytics Culture
Over the next dozen years, Intermountain expanded its use of data-driven decision-support tools. In 1986, Intermountain hired Brent James, a physician with a master’s degree in statistics, to champion quality-improvement principles and initiatives across the organization. One early challenge was that expensive information technologies, such as data storage, were still improving, making the premise that large investments in data technology would improve care and lower costs somewhat risky. “It was really a decision made on faith at first, that if we invested in the systems, we would see results,” says Brent Wallace, chief medical officer (CMO) for the organization.
James focused on improving data quality and data-gathering techniques. As Mark Ott, chief of surgery at Intermountain, says, “I never want to give data to doctors that I can’t defend. Because once you’ve got bad data, it takes months to recover that level of trust. The single most important thing is the integrity of the data.” James adds that there needs to be a constant focus on data gathering, painstakingly mundane work that almost no one takes to naturally. “You have to have a data zealot who goes around and grabs teams and pulls them into line,” James says.
Helping physicians become comfortable with data became an important part of Intermountain’s approach to developing a data-oriented culture. A key facet of this approach was being as transparent as possible about data quality, CMO Wallace recalls:
When we first started presenting data to physicians about their own performance and how they were doing, most physicians, especially if they were not performing as well as they feel like they ought to be, have two comments. One is, “Well, the data really aren’t accurate. There are problems with the data.” And the second is, “I have sicker patients than my colleagues.” And you hear those two things over and over again.
We allow and actually encourage physicians to question the integrity of the data. If it’s a dataset around their own performance, we show them the names of the patients from whom the data was derived, and they can look at it and say, “Well, this isn’t my patient. This one really sees my partner.” And then we’ll take it out of their dataset. Or if they look at it and say, “You know, I just really don’t believe that this case costs this much money. I want to get in and see what were the contributing factors and challenge that. Have we really collected that accurately?”
And over time, many of our physicians who have been involved in this process iteratively have become pretty comfortable that the data we provide are accurate and okay. But they still know they have the capability to challenge it, if that is needed.
Intermountain's team-driven culture applies gentle peer pressure, extolling doctors or teams that have excellent results and encouraging others to take the same steps. Administrators in the surgical unit, for instance, show physicians how they are performing relative to their peers because they believe surgeons are competitive and want their names at the top of the board. This collegial approach comes in part because only a third of the company's doctors work directly for Intermountain. Another third work for affiliated medical practices, and the rest are independent and only occasionally interact with Intermountain. The system needs them all to contribute data that is as complete as possible, so that data quality doesn’t degrade.
In 1999, at the height of the Internet boom, Intermountain experienced something of an organizational epiphany when it discovered the power of data analytics to affect population health. That year, the American College of Obstetricians and Gynecologists recommended that doctors stop choosing to induce labor before the 39th week of pregnancy, because medical research showed that early induction carried significant risks for babies and mothers.
The hospital’s labor-and-delivery committee suggested that doctors should investigate the hospital’s elective induction rate. “We don’t have that problem here,” came the response from a majority of the obstetricians. The data said otherwise. In fact, 28% of Intermountain’s deliveries were elective preterm inductions, on par with the national average. Intermountain urged its doctors to think twice about performing them, but moving away from elective inductions was a bumpy process. With most deliveries’ timing now left to Mother Nature, many obstetricians had to get used to being on call again or working at odd hours. But eventually they accepted the changes in procedure, and by 2001, elective preterm inductions had fallen to less than 2% of all cases.
Hard work followed this organizational epiphany, as the organization spent years creating a common language for data across departments and hospitals. Colleen Roberts, who switched from being a nurse to a data manager in 2002 after earning a master’s degree in medical informatics, began building out data dictionaries. “Everybody knew that Emergent meant this, and Urgent meant this, but there weren’t clear definitions for every data element,” says Roberts. It took regular meetings with practicing clinicians to hammer out definitions that ultimately enabled Intermountain, for the first time, to directly compare hospitals and departments on a wide range of metrics. Over the last decade, the use of data has become completely ingrained in the culture, she says.
Today, “we never do a project or care initiative that we don’t first run baseline data to see where we were. And post implementation, we run data to see if we’ve shown improvement,” says Roberts, now director of operations for Intermountain’s cardiovascular clinical care unit.
As data analytics spread among Intermountain’s clinical care settings during the 2000s, the cost of gathering and storing data decreased rapidly, enabling more access to analytics. But the main reason analytics spread was not the cost of the technology but the results, how good the analytics were at helping patients.
An Appointment With Clinical Programs
Intermountain has set up multiple touch points for clinicians to access the data they need, or the data they want. Most of its 10 clinical programs, whether big ones like women’s and newborn and cardiovascular, or small specialty services like ear, nose, and throat, have their own data team, as does the clinical services group (pharmacy, imaging and radiology, nursing, physical therapy). Each data team consists of three people: a data manager who makes sure data is being collected correctly, a data analyst to flag important trends, and a data architect who pulls together data from various sources inside and outside Intermountain.
The data manager and data analyst are embedded in the clinical team’s staff and report to the clinical program’s operations manager. The data architects are based in a centralized IT department and report to managers who report to CDO Pierce. In addition, Intermountain has 240 data analysts spread throughout its facilities, as well as 70 researchers in the Homer Warner Center for Informatics Research, formed in 2011. A few of those report into Pierce’s group; the rest are involved in research projects.
In addition, the clinical programs’ operations directors spend part of their time ensuring that data is being gathered properly on the clinical side. There are even data abstracters — nurses assigned to gather data in the operating rooms and other locations — in part because Intermountain participates in a variety of national programs where hospitals contribute information on various procedures, which can require collecting more than a thousand points of data for some procedures.
Any Intermountain employee can make formal or informal requests for analytics support. Pierce notes that with 240 data analysts spread throughout the organization, many requests are made informally. They're water-cooler conversations or brief email exchanges along the lines of, “What does the data say about this kind of treatment?” Intermountain encourages this informal activity, though its analysts must make formally approved queries a priority.
Formal requests for analytics are processed through the internal Web portal. These requests include estimates of the likely time needed from data analysts, managers, and architects. If the combined time for the request is projected to exceed 40 hours, it must be approved and given a priority assessment at the monthly meeting of an information management council, chaired by Pierce, which handles analytics and data governance.
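As a rough illustration of that triage rule, here is a minimal Python sketch. The function and field names are hypothetical; only the 40-hour threshold and the council's role come from the description above.

```python
# Sketch of the analytics-request triage rule described above.
# Names and structure are assumptions; only the 40-hour
# threshold is taken from the text.

FORMAL_REVIEW_THRESHOLD_HOURS = 40

def route_request(analyst_hours: float, manager_hours: float,
                  architect_hours: float) -> str:
    """Return where an analytics request should be handled."""
    combined = analyst_hours + manager_hours + architect_hours
    if combined > FORMAL_REVIEW_THRESHOLD_HOURS:
        # Large requests go to the monthly information management
        # council for approval and priority assessment.
        return "information management council"
    return "standard portal queue"

print(route_request(20, 10, 15))  # 45 combined hours -> council
print(route_request(5, 2, 3))     # 10 combined hours -> standard queue
```

The point of the threshold is simply to keep small, informal questions flowing freely while forcing explicit prioritization of large commitments of analyst time.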
The cardiovascular practice, where Warner started the use of analytics, has expanded its use of analytics to support patient care not only through decision making but also at the policy level. Intermountain used data to decide, for instance, that only four of its hospitals should perform cardiovascular operations (surgeries and catheterizations), because concentrating procedure volumes and maintaining implicit controls over conditions was the best way to improve care and reduce costs. By concentrating expertise at each of the four hospitals, Intermountain reduced response times for certain emergency procedures, for which speedy interventions are closely connected to better health outcomes.
For example, on average about 15% of people who suffer ST-elevation myocardial infarctions (STEMI) — heart attacks that occur when coronary arteries suddenly become completely blocked — die within 30 days. Better outcomes are achieved when patients receive rapid intervention to unblock the artery. The national standard is 90 minutes for what’s called “door-to-balloon time,” which represents the amount of time from the moment the patient enters the hospital to relief of the blockage via a balloon inflated within the blocked artery. Beating that national average of door-to-balloon time would mean more lives saved.
To work toward that goal, in 2011 Intermountain’s cardiology leadership began working with STEMI teams to set internal time standards and measure results. Every time a heart attack patient was treated, the data on the operation was circulated to the whole team within a few days, a process known as rapid process improvement. This feedback loop helped Intermountain reduce the median door-to-balloon time to 57 minutes. In the last three years, all STEMI patients at Intermountain have gone door-to-balloon in less than 90 minutes. Intermountain’s rate of STEMI patient survival beyond 30 days is now at 96%. “That was purely data-driven — and without the data, we’d have no clue what was going on,” says Don Lappé, the chief of cardiology at Intermountain.
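The feedback loop described above tracks a simple metric. The following Python sketch uses made-up sample times; only the 90-minute national standard comes from the text.

```python
# Hypothetical door-to-balloon metrics, as a STEMI team might
# compute them in a rapid-process-improvement feedback loop.
# The sample times are invented for illustration.
from statistics import median

NATIONAL_STANDARD_MINUTES = 90

door_to_balloon_minutes = [52, 61, 57, 48, 73, 55, 66]

med = median(door_to_balloon_minutes)
under_standard = sum(t < NATIONAL_STANDARD_MINUTES
                     for t in door_to_balloon_minutes)
share = under_standard / len(door_to_balloon_minutes)

print(f"median: {med} min, under 90 min: {share:.0%}")
# -> median: 57 min, under 90 min: 100%
```

Circulating exactly this kind of summary to the whole team within days of each case is what the text calls rapid process improvement.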
Another example: The cardiovascular surgical team evaluated published research findings that suggested that blood sugar management helped heart patients after operations. Since surgery and anesthesia increase stress levels, which can cause spikes in blood sugar levels, the team asked their data analyst to build a query to examine average blood sugar levels before, during, and after surgery.5
The analysis showed that patient blood sugar levels reached between 300 and 400 mg/dL on average, well above the normal range of roughly 90 to 160 mg/dL. A related query showed that Intermountain patients who went home without having their blood sugar managed had more health issues, including needing to be readmitted to a hospital, than those who received blood sugar management.
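A minimal sketch of the kind of query the data analyst might have built, grouping average blood sugar by surgical phase. Column names and values are hypothetical.

```python
# Hypothetical blood-sugar query: average glucose before,
# during, and after surgery, and a flag for elevated post-op
# readings. All data and column names are invented.
import pandas as pd

readings = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2],
    "phase": ["pre", "intra", "post"] * 2,
    "glucose_mg_dl": [140, 310, 360, 120, 290, 330],
})

# Mean glucose per surgical phase.
avg_by_phase = readings.groupby("phase")["glucose_mg_dl"].mean()
print(avg_by_phase)

# Post-op readings above the roughly 90-160 mg/dL normal range.
elevated = readings[(readings["phase"] == "post")
                    & (readings["glucose_mg_dl"] > 160)]
print(len(elevated), "elevated post-op readings")
```

Even a simple aggregation like this is enough to surface the pattern the text describes: intra- and post-operative averages far above the pre-operative baseline.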
The cardiovascular surgical team evaluated research on the question with representatives from Intermountain’s four open-heart surgery programs and asked them to think about how to manage blood sugar levels. One hospital started testing blood sugar levels when patients were admitted, and put patients with high baseline blood glucose levels — even those who weren’t diabetic — on insulin. An anesthesiologist at one hospital devised a procedure where he would infuse patients with glucose and then adjust their insulin levels; he found that this caused patient blood sugar levels to fall below 200. He shared the results with his colleagues, who adopted the same techniques. The result from these efforts was a 50% drop in deaths after heart surgery as well as a reduction in time in intensive care units and shorter overall stays.
In 2014, Intermountain published its analysis of diabetics and angiograms in The Journal of the American Medical Association (JAMA).6 JAMA also published a commentary from a doctor at the Mayo Clinic arguing that what was really happening is that Intermountain does such a good job caring for diabetics that they face no higher risk of heart disease than the general population.7
In fact, information sharing has played an important role in how Intermountain providers manage blood sugar levels within their population of diabetic patients. The endocrinology data team analyzed which diabetic patients from across the entire Intermountain group had the lowest average blood sugar levels based on scores from a routine lab test. The practice team took this data and asked the doctors whose patients had the best scores what they had done to help their patients maintain their low levels.
Answers varied from using motivational tools to having their assistants call the patient every three months. The analysis gave all of the doctors with diabetes patients a way to connect their patients with the data by showing patients their scores and correlating scores with lifestyles. By doing so, the doctors were taking patient care, and analytics, outside the hospital.
If there has been a data holdout at Intermountain, it is orthopedics. It is effectively a self-contained department, in that orthopedic surgeries are usually one-time events, handled within an orthopedics group without a lot of patient follow-up except for physical therapy visits. The orthopedics practice does track short-term complications from procedures, such as infection rates, patient time out of work, and how many patients need to return to the operating room. There is a system used to collect physical therapy outcomes. The data from that system suggests that some orthopedists’ patients seem to recover more quickly, but it does not measure patients’ progress over time. It doesn’t show, for instance, whether full knee replacements create better long-term results than partial knee replacements.
Intermountain is evaluating different tools it can use to start to collect that data and use information to better analyze the impact of orthopedics on patient lives. “What I’d love to see is when the patient hits our system, wherever it is, a flag goes up and says ‘it’s been a year since this person had a knee replacement; fill out the survey and give us some follow-up,’” says CMO Wallace. “That will trigger other care-related questions in the EHR. When you can put that kind of information in front of doctors, they’ll start saying, ‘Huh? I’ve always been able to be the Lone Ranger, maybe it does make sense to talk to folks riding the range.’”
Intermountain’s chief of surgery, Mark Ott, gets reports on surgical infection rates every six months and is using that data to reduce infection rates in operating rooms. When the data showed that surgical infection rates at the flagship hospital, Intermountain Medical Center, were in line with national norms, he presented the findings to the surgeons there. He said, “You think you’re great, but compared to other hospitals in the country, you’re not above average.”
Intermountain uses a collaborative process to encourage behavioral change. Regarding infections, a committee of clinicians spent a year developing a list of 30 possible causes, then whittled it down to five and made recommendations of changes that would address them. Ott sent out a note announcing the five recommendations, and got, he says, “a bunch of people complaining — the usual thing.” In particular, they hated having to give up bringing personal items into the operating room, including fleece jackets they would wear to keep warm. “They literally hated that,” Ott says. “I would get calls all the time about how stupid that is.” Ott himself had to quit wearing his Boston Red Sox cap and instead cover his hair with disposable surgical caps. The doctors argued that there was no hard evidence that the recommendations would actually help. Ott agreed, but told them that in six to nine months he would have data — and if it didn’t show results, they could go back to the old ways.
In fact, infection rates fell to half the national standard. When the doctors got the data, they were delighted. But they also asked to relax the rules against personal items in the OR. Ott held firm, saying that since it was not clear how much each of the five factors worked, they needed to keep doing them all.
Ott also explained how data is being used to change the way Intermountain surgeons approach postoperative care following gall bladder removals. Each year, Intermountain performs thousands of gall bladder removals. In 90% of cases, patients receive postoperative antibiotics whether or not they have an infection. Ott believed that this standard practice of administering antibiotics was unnecessary. He asked for a data analysis on the use of antibiotics after gall bladder removal. While antibiotics aren’t expensive, they still cost something.
And if a patient has an allergic reaction to the antibiotic, or develops a drug-resistant C. difficile infection leading to colitis, treatment gets pricey. Ott found that the use rate varied across the system; most hospitals used antibiotics at a near 100% rate, while ambulatory care facilities, usually staffed by the same doctors, did not prescribe them at all for the same gall bladder removal surgery. Same doctors, same operation, just a different building. “Why is that?” Ott asks. He says the data from the different venues show no difference in patient results, so he’s encouraging surgeons to rethink prescribing antibiotics.
Ear, Nose, and Throat
In 2014, Intermountain began applying data analytics to ear, nose, and throat surgeries, a subspecialty service within the surgical services program. Wallace had observed that surgeons used four different methods to cauterize, or seal off, bleeding during tonsillectomies. Each method differs significantly in price. Wallace says the question became: Was one method better than the others in limiting bleeding and improving how patients fared? The data showed that there were essentially no differences in complications, length of stay, or hospital readmissions, says Wallace.
In fact, the oldest (and cheapest) method, electrocauterization, held a slight, though insignificant, statistical edge. When the data was presented to the surgeons, they did not exactly embrace the findings. “They said, ‘that’s all well and good, but — ’ especially for those that use the more expensive new stuff, ‘ — I think my patients do better, they feel better after surgery,’” Wallace says. So a follow-up survey is underway, to collect more data on patient recovery issues. In most cases, there are no dictators in the Intermountain process, says Ott. “We don’t tell the surgeons what to use. We say, ‘Here’s the data. You can use what you want.’”
Care Process Models
Intermountain’s doctors and nurses use dozens of different data-based decision-support tools (also known as care process models) to help them care for patients. In the cardiovascular unit, for instance, a tool runs every morning at 9:15 in all 22 of Intermountain’s hospitals, pulling readings from patients’ vital signs. It sends an email alert telling clinicians which patients are at risk of heart failure, including assessments of their likelihood of being readmitted once released, or of dying. That helps Intermountain adapt its care pathways and patient management, and accelerates patient education; it might mean assigning patients to palliative care or a hospice.
Brent Wallace, chief medical officer, Intermountain Healthcare; Colleen Roberts, director of operations, cardiovascular clinical care, Intermountain Healthcare; Lee Pierce, chief data officer, Intermountain Healthcare;
These tools help track things humans might miss, says Kim Henrichsen, Intermountain’s chief nursing officer. She says that over time, a patient’s vital signs can shift subtly, and tools built into the system analyze that data and will automatically send alerts to nurses to monitor patients or check specific vital signs. The algorithms also flag patients who appear to be at high risk for readmission based on previous data patterns, and may lead to Intermountain assigning home care to help reduce the likelihood of readmission. Over time, the hospital has also developed monitoring tools for patients who have a single episode, like hip surgery, versus those with a chronic condition, like chronic heart failure.
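The kind of rule-based alerting described above can be sketched very simply. The thresholds, field names, and scoring rule below are illustrative assumptions for a toy heart-failure screen, not Intermountain’s actual model, which draws on richer historical data:

```python
# Hypothetical sketch of a daily vital-signs alert, in the spirit of the
# tool described above. Thresholds, field names, and the scoring rule
# are invented for illustration.

def risk_score(vitals):
    """Crude heart-failure risk score from a patient's latest vitals."""
    score = 0
    if vitals["heart_rate"] > 100:          # resting tachycardia
        score += 2
    if vitals["systolic_bp"] < 90:          # hypotension
        score += 3
    if vitals["o2_saturation"] < 92:        # low oxygen saturation
        score += 2
    if vitals["weight_gain_kg_48h"] > 2:    # fluid retention signal
        score += 3
    return score

def patients_to_flag(readings, threshold=5):
    """Return IDs of patients whose score meets the alert threshold."""
    return [pid for pid, v in readings.items() if risk_score(v) >= threshold]

# Example morning run over two patients:
readings = {
    "pt-001": {"heart_rate": 112, "systolic_bp": 86,
               "o2_saturation": 90, "weight_gain_kg_48h": 2.5},
    "pt-002": {"heart_rate": 72, "systolic_bp": 118,
               "o2_saturation": 97, "weight_gain_kg_48h": 0.2},
}
print(patients_to_flag(readings))  # -> ['pt-001']
```

A production system would replace the hand-set thresholds with a model trained on outcome data, but the pipeline shape — pull readings, score, alert — is the same.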
Patients who have suffered heart failure are put on up to 14 different drugs, from aspirin to beta-blockers, after their release. Because of the number of medications, Intermountain developed a tool to automatically create the list of medications heart failure patients need. CMO Wallace says this lets clinicians spend their mental energy focusing on what is unique about the patient.
Industry analysts predict that supply costs will exceed hospitals’ top expense — labor — by 2020. The challenge, they say, is that a lack of price transparency and no system for sharing cost information leaves doctors unaware of their supply costs or how to reduce them by requesting equally effective but less expensive alternatives.8
At Intermountain, applying analytics to this challenge started in earnest in 2005, when the company started a supply chain organization. With 12,000 vendors, $1.3 billion in non-labor expenses, and a culture that ceded much purchasing authority to doctors, the supply chain managers had their work cut out for them. Perhaps the most significant challenge was finding a way to reduce expenses for physician preference items (PPIs). These are the devices or supplies that doctors request because they prefer them to comparable products. PPI suppliers worked hard to develop relationships with doctors to create physician loyalty to their products. But PPIs could consume as much as 40% of a hospital’s supply budget — and one study found nearly $5 billion in annual losses in the health care industry due to PPI-driven waste in the supply chains.
In 2014, Intermountain launched Intermountain ProComp, a system designed to reduce costs by tracking its 50 highest-volume procedures and presenting information to surgeons on their supply options in real time.
Launching ProComp has led to significant cost reductions. Ott’s data team dug through about a dozen different systems to figure out what various supplies cost. One thing they found was that some coronary surgeons used sutures that cost $750, while others used sutures that cost $250. The analytics revealed no appreciable difference in patient outcomes. Ott presented the data to the surgeons. “They were fascinated by that,” Ott says. “They had no idea that the things they were using cost so much.” Most of them stopped using the more expensive sutures.
Sometimes, though, Ott had to attack the problem from the supplier side. In bowel surgeries, Intermountain surgeons use two kinds of end-to-end anastomosis staplers. One type of stapler cost $270, the other $870. Doctors prefer the more expensive one; in fact, two-thirds of the surgeons use it. Ott says, “I’ve used them both. I don’t really think there’s a difference. But when I talk to my surgeons, they are adamant that the more expensive product is clearly better.”
They felt that way even after Ott showed them data that found the two staplers were equivalent. Surgeons said patients’ bowels leaked more after they used the cheaper stapler, which meant patients would get sick and need another operation. Or they said that it led to more bleeding after the operation.
Ott turned to his data analytics team, who pulled 170 cases from one of Intermountain’s hospitals and combined it with data from the American College of Surgeons’ National Surgical Quality Improvement Program. The data showed that leak rates for the two staplers were the same, at about 5%, and the only major bleeding event involved the more expensive stapler.
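The comparison described here — leak rates of about 5% for each stapler — is the kind of question a standard two-proportion z-test addresses. The counts below are invented for illustration (the source reports only the ~5% rate and the 170-case sample, and the actual analysis also pooled national registry data):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Normal-approximation z-test for the difference of two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented counts: ~5% leak rate in each arm of a 170-case sample
# split evenly between the two staplers.
z = two_proportion_z(success_a=4, n_a=85, success_b=5, n_b=85)
print(round(z, 3))  # |z| well below 1.96, so no significant difference at the 5% level
```

With samples this small and event rates this low, an exact test (e.g., Fisher’s) would be the more careful choice, but the conclusion — no detectable difference between the staplers — is the same.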
Ott went back to the surgeons, who acknowledged the data but still wanted to use the expensive stapler. Ott didn’t force them to quit using it. Instead, he showed his data to the supplier. “I said, ‘either you lower your price to the competitor price, or we’re taking you off the shelf.’ And they immediately lowered their price.” That one minor change saved Intermountain $235,000 a year. In its first year, ProComp cut $25 million from operating costs in its Surgical Services Clinical Program alone. It aims to cut costs by $400 million by 2018.
A New Record System
According to the Organisation for Economic Co-operation and Development, the United States spends, on a per capita basis, more than twice the average spent by 34 industrialized nations on health care ($8,745 in 2012 compared to an average of $3,484), but gets health results towards the bottom of the pack.9 Critics have fastened on the U.S. fee structure as a big part of the problem, arguing that the system is built around paying for visits, tests, and procedures, many of which are unnecessary, some even harmful. This provides an incentive for providers to focus on quantity of services over quality of care. Intermountain has seen this in action for more than 20 years. It believes its use of data has improved quality and therefore saved lives — more than 1,000 to date. But for all the benefits the data-centered care brings, it has been a struggle sometimes to pay for it. That change in elective inductions? It was a huge success for patients, but actually meant a revenue loss. In a value-based model, Intermountain would have been rewarded for the better health outcomes.
That is a reason why Intermountain is eager to move to a value-based business model where it gets paid for effectively caring for patients. In a value-based model, insurers will reward health care providers that lower costs by sharing the cost savings. The importance of value-based care to the future of U.S. health care is reflected in the U.S. Department of Health and Human Services’ recent announcement that it will tie half of all Medicare provider payments to value-based models by the end of 2018. CDO Pierce knows the effective use of data will be central to making the shift to a new way of doing business.
Intermountain’s new EHR is expected to play a pivotal role in helping the organization make this shift.
When selecting its new EHR, Intermountain placed its interest in value-based health care at the forefront of its decision making. It selected Cerner, a large EHR vendor based in Kansas City, Missouri. The executive team believed Cerner paid careful attention to the secondary use of data for back-end analytics, in addition to offering an excellent clinical transaction system that could help clinicians make better patient-care decisions. Intermountain’s configuration of Cerner products is called iCentra.
Once contracts were signed, Cerner set up shop in the offices next to the Intermountain Medical Center and relocated some of its top development talent to Utah. Pierce meets regularly with his counterparts at Cerner, which includes the occasional six-hour meeting to work through deployment strategies. He’s moved his office from the headquarters building to Intermountain Medical Center to be closer to the action but also because there were twists to the deal: Intermountain wanted to retain its own data management and analytic systems, resulting in the need for increased coordination between the organizations.
In the 18 months since they started working together, the two companies have formed a close relationship. Intermountain is consulting with Cerner on a massive Pentagon contract bid, and the two companies are discussing creating new products built on Intermountain’s data management processes and its data warehouse framework.
Intermountain has a standard three-phase approach to all of its technology rollouts. Implementation is the first phase, and includes all of the design, build, training, and “go-live” activities. To prepare for go-live, Intermountain assembled a mix of supports: classroom training, one-on-one coaching sessions, group simulations, super-user experts, physician coaches, and a telephone support help desk that can now remotely access a user’s screen to solve problems and offer guidance.
Once the system is up and running and technically stable in a given hospital — usually after three weeks — the adoption phase begins. It is typical for large variations in practice to surface only after go-live, so Intermountain has a process in place to identify and standardize these variations and adjust workflow designs. Some physicians may not be using the tools as they were designed to be used, or may be using them in a laborious way. The quick-order page, for example, was redesigned as a result of analytics built on data from the first weeks of use.
The third stage — optimization — typically occurs over a longer time frame, from four months out to many years. This stage reflects ongoing efforts to improve the effectiveness and efficiency of the system. The time frame for this stage depends on the scope of changes that need to be made to the system.
No system implementation is without bumps, so Pierce’s stomach didn’t exactly sink while standing there in the Las Vegas airport, talking to the physician who said that expectations around the adoption of the tools were too aggressive. Once Pierce was back in Salt Lake City, he set up a conversation with Sameer Badlani, who joined Intermountain in October 2014 as its first chief health information officer (CHIO). The CHIO role was created to reflect that caring for patients was going to expand beyond hospitals and clinics into people’s homes and communities.
“The common pushback is ‘I’m doing too much data entry, spending less time with a patient,’” Badlani told Pierce. Some of this came because they knew the old system better than the new. Some of it was that in the new system, doctors did need to spend more time inputting data, which they hadn’t really done a lot of before. And some of it was expectations. “They expect to be facile in a matter of two weeks, and it just doesn't work that way,” Badlani said.
In the old system, doctors got a piece of paper with an order on it for a prescription or follow-up, signed it, and sent it over to be input by a nurse or a unit clerk. But in the new system, it was taking five or six minutes to put in an order. “The physician is appropriately saying, ‘My day is getting longer,’” Badlani said.
Pierce grimaced. But as he talked to Badlani, he realized that the main problem was managing expectations and large-scale change management. While some doctors do adapt quickly, most have a longer, slower learning curve.
The iCentra system gives Intermountain the analytics to leverage and support change. They just needed to do a better job of explaining to doctors that yes, it might take six minutes to input data, but once it was in the system, patients were getting their next steps processed far more quickly. Errors in things like prescriptions were also dropping significantly. “When you say to a physician, ‘Look at what this does for your patient,’ that’s really powerful,” Badlani says.
After Pierce’s discussion with Badlani, he and iCentra leadership from Intermountain and Cerner began examining the iCentra analytics on how physicians were using the system, including how much time they spent on documentation and on order entry, for clues to where usability needed to be improved.
Pierce also found out that Intermountain had inadvertently run a test in the rollout. All the groups got the same basic training, but some groups went out and organized practice sessions on their own time. These groups as a rule handled the rollout much more effectively. So Badlani, along with the lead iCentra physician executive Mark Briesacher and the rest of the rollout team, started to develop a prescribed training methodology involving follow-up coaching sessions for the physicians after the initial classroom training and unit-based practice.
Badlani and Briesacher are focused on supporting the physicians and clinicians through this massive change. They are working with Pierce to use data in this process. “Our nurses and our doctors are believers,” Pierce says. “They’re seeking far more data, and they’re seeking far more opportunities to have the analytics, to prove better ways of providing care and lowering costs.”
Intermountain is not waiting for the industry to fully embrace value-based health care. In 2016, it will dive right in by launching a new insurance product that will make physicians and Intermountain jointly responsible for health care costs. Doctors who reduce costs will earn more income. Wallace thinks this will make them even more focused on data. “If there’s a surgeon in a group who’s not following that care process model, [the others] are going to look at that surgeon and say, ‘You start to follow the care process model, or you’re out.’ That’s a peer pressure model that can work well in some circumstances,” says Wallace.
Wallace cautions that Intermountain will not force this on people, but if doctors don’t adopt the process models, they won’t be able to participate in the shared-risk system that is coming to Intermountain. He thinks that by 2018 this will represent 50% to 80% of all Intermountain Healthcare billing.
There will be cultural challenges that emerge from the use of the new system — beyond just getting doctors and nurses to adopt it. Primary care doctors who need to refer patients to specialists will be able to see rankings of those specialists based on internal data and make decisions accordingly. “I’m going to be able to look at what their clinical outcomes are, what their costs are, what their patient satisfaction is,” Wallace says. “That’s going to be totally transparent among the group. Our goal is to make that ultimately publicly transparent.”
Pierce knows that transparency will put even more pressure on data quality. The specialist ranking project is set to launch in mid-2015, right around when Pierce will be evaluating how the most recent phase of rollouts of iCentra has gone. Looking at the iCentra launch should provide more data for analyzing ways to improve, so that each rollout in the future will go even more smoothly.
‘No Pain, No Gain’ in the Transition to Data-Driven Health Care
The heart of the latest analytics initiative at Intermountain Healthcare is the implementation of a new electronic health records (EHR) system. As the case study shows, despite Intermountain’s history of success with analytics, even the best system implementations can be difficult pills to swallow. They produce a lot of extra work for everyone, and they carry considerable risk and unexpected difficulties. For those learning new systems, the pain of the early stages of implementation is concrete and visceral; the promised benefits can be abstract and far less certain.
Yet despite the considerable effort and potential for difficulties, many aspects of Intermountain’s new EHR implementation are notable and laudable.
As it rolled out the new EHR system, Intermountain limited the number of hospitals and clinics involved in the initial deployment — to 2 of its 22 hospitals and 24 of its 185 clinics. These numbers are small enough to keep the project scope manageable, but at the same time, large enough to create opportunities to benefit from information exchange.
Furthermore, Intermountain has positioned the new EHR system as part of an inclusive analytics initiative, with everyone working together on something that, while difficult, has benefits for both patients and the organization itself. Too often, new initiatives leave users feeling forced into a new system by management or IT. The prior cardiovascular, endocrinology, and surgery examples each show that Intermountain uses collaborative approaches to benefit patients rather than fiat-based mandates.
Intermountain is building on a strong foundation: Its history of prior analytics innovation helps on both the technological and organizational fronts. Technologically, it has a solid infrastructure and technical experience that help reduce project uncertainty. Even an abandoned 2012 overhaul provides a basis for Intermountain technical staff to learn from and perhaps build on. Organizationally, executives and staff have had an analytics culture for many years, with many success stories that illustrate how analytics can transform patient care.
Intermountain’s approach to the role and limitations of technology is savvy. It positions tools as helping “track things that humans might miss” and allowing clinicians to “spend their mental energy focusing on what is unique about the patient.” This mindset helps with setting realistic expectations about what technology can and cannot do, and it reduces resentment of new technology. The key with analytics is to blend the strengths of technology with the strengths of people. Neither alone is sufficient.
The organization’s embrace of transparency is particularly notable. By providing access to data and being forthcoming about its limitations, Intermountain encourages a culture that works to improve data quality. As errors or shortcomings are found, the feedback improves processes. While conceptually easy for many organizations to avow, embracing feedback is difficult to do in practice. Intermountain demonstrates the cumulative benefits that result from building what it calls “the integrity of the data” in a way that engenders a “level of trust from the doctors.”
Intermountain shows signs of analytical maturity throughout the case study. We see senior-level leadership on analytics, a high value being placed on data in people’s day-to-day work, and a widespread analytics culture — all of which are associated with analytical maturity.
An important feature of analytical maturity is that organizations embed analytics in processes rather than simply regard analytics as a set of beneficial, but ad hoc, projects. A process approach is clear in the feedback loops for data quality, common languages for data “across departments and hospitals,” and structured processes for analytical decision making (e.g., operating room clothing). Analytics teams have defined roles (data analysts, data managers, and data architects); equally important, people have career paths and opportunities to progress within the organization.
When all factors are taken together — an analytics history, savvy blending of technology and people, transparency about data sources and quality, incremental implementation, non-adversarial culture, analytical maturity, and a process focus — Intermountain has created an enviable set of achievements around data that bodes well for its future.
But of course, no analytics initiative will be completely smooth, particularly when it involves new computer systems.
Since Intermountain’s new EHR system replaces another, there is a real danger of “second system effect.” With the first system, people are often just happy to get it to work. But a replacement system must do more than the first (otherwise, why replace?), and those building it tend to try to accomplish everything that was left out of the first system and to correct all of the earlier shortcomings. As a result, second systems can be what Frederick P. Brooks, Jr. calls “the most dangerous system.”
Hearing that doctors were expecting to be “facile in a matter of two weeks” must have been immensely frustrating to the project team. Despite what the case study calls a “standard three-phase approach” to training that included “classroom training, one-on-one coaching sessions, group simulations, super-user experts, physician coaches, and a telephone support help desk,” my guess is that Intermountain’s chief data officer Lee Pierce was ready to yank his hair out in the Las Vegas airport meeting with a physician who had heard “we would be up and running and as efficient as before in just a couple of weeks.” Where did expectations go astray?
Analytics initiatives bring challenges that differ depending on the organization’s analytical maturity. Beginners struggle to get basic infrastructure and processes established and to land the first, crucial successes needed to showcase the system’s value (and provide a foundation for continued building). Advanced organizations, on the other hand, may find they must undertake more complex systems or pervasive changes to continue to extract value from data. As a relatively advanced analytical organization, Intermountain may already have exhausted its basic opportunities for value from analytics.
With increasing complexity comes increased difficulty in showing value from data, and from the case study we see that this is clearly true of Intermountain’s new EHR system. Drawing conclusions from data is rarely straightforward, particularly in contexts as complex as health care. For example, it is wonderful to be able to collect follow-up data if an orthopedics patient later “hits our system, wherever it is.” A holistic overview is a great benefit of an EHR. However, what about patients who don’t hit the system again? It will be important to consider this source of potential bias in reaching conclusions about the strengths or shortcomings of the initial orthopedic treatment.
Or, in another example, consider the inadvertent test that took place during rollout. This is far from a true randomized test: there is something fundamentally different about groups that organize their own practice sessions compared with groups that do not, so causal conclusions will require analysis of the subsequent follow-up coaching sessions, among other considerations. Difficult questions like these are complex to analyze, but they also give analytically mature organizations the opportunity to derive value from analytics. Data from the new EHR system will support this complex analysis, but gathering the data is only the first of many steps.
Across the United States, both patients and physicians express legitimate concerns about EHR systems. Automation and the data that come with it are not free. Getting these platforms in place is costly; they impose a considerable “time tax” on people throughout the system. Physicians will spend more time using a system than writing a note by hand. Nurses will spend more time documenting.
This is true of most, if not all, changes. When organizations replaced typing pools with distributed word processing, managers spent more time typing than they had before. Yet going back to typing pools seems absurd now. I expect many of the changes induced by EHR will seem similarly absurd to return to. And the first generations of these systems will be clunkier than later generations. Unfortunately, these clunkier steps are largely inevitable on the path toward benefits from EHR systems and analytics.
Intermountain provides a nice example of many successes from embracing analytics, both historically and with its current initiatives. But even with this rich history, it will continue to have to work through many issues when it comes to deriving value from data. From that perspective, Intermountain’s story should be a cautionary tale for those looking to emulate it: less analytically mature organizations will be hard-pressed to encounter only the difficulties Intermountain has had, and can realistically expect more.
Despite the effort required, organizations everywhere (in health care and beyond) need to improve their ability to build value through data.
Reproduced from MIT Sloan Management Review