Channel Description:

Best content from the best sources, handpicked by Shyam. Sources include Harvard University, MIT, McKinsey & Co, Wharton, Stanford, and other top educational institutions. Domains include cybersecurity, machine learning, deep learning, big data, education, information technology, management, and others.







    Talent is what separates the best from the rest. The best-performing companies simply have better people. Right?

    That’s certainly what we thought before Bain & Company launched its in-depth investigation of workforce productivity. After assessing the practices of global companies and surveying senior executives, we discovered that the best companies have roughly the same percentage of star talent as the rest — no more, no less. It turns out that what separates the best-performing companies from others is the way they deploy talent.

    Bain performed detailed organizational audits on 25 global companies. We benchmarked the practices of these organizations relative to companies widely regarded as best-in-class. To complement this research, we collaborated with the Economist Intelligence Unit to survey more than 300 senior executives from large companies worldwide. We asked them to assess their workforce and to describe their people management practices, all with an eye toward understanding the drivers of workforce productivity. What we found surprised us, at least with respect to star talent:
    • On average, 15% of a company’s workforce — roughly one in seven employees — are A players, or “stars.”
    • The amount of star talent does not differ dramatically between the best-performing companies in our sample (the top quartile) and the rest (the average of the remaining three quartiles). Stars made up 16% for the best, and 14% for the rest.
    What does differ between the best and the rest is how each group deploys its star talent. We found two distinct deployment models at work:

    The best companies use intentional nonegalitarianism. The best-performing companies deploy their star talent in an intentionally nonegalitarian way. That is, they focus their stars on areas where these individuals can have the biggest impact on company performance. As a result, the vast majority of business-critical roles — upward of 95% — are filled by A-quality talent. In some technology companies, for example, software development is critical to business success. So the best-performing companies in this industry make sure that software development roles are filled with star talent. In other industries brand management matters more, so the A players tend to be clustered there. Stars are concentrated where they can make the biggest difference, which of course means that less A talent is available for other positions.

    The rest use unintentional egalitarianism. The remaining companies in our sample deploy star talent in an unintentionally egalitarian way. In other words, these companies attempt to spread their A players more or less evenly across all roles, so that one in seven professionals in every role is a star player, and the remaining six are average players. No team has more stars than any other; no roles are seen as more important than the rest.

    The egalitarian approach may seem fair, even admirable, but it does not produce superior results.
    Our research suggests that people-deployment practices account for a significant portion of the difference in productivity and performance between the best and the rest. To be sure, many other practices are at play as well. But people deployment is vitally important.

    What steps should organizations take to make the most of their star talent? Our research highlights five best practices:

    Know who the stars are in your organization. It is difficult to deploy scarce talent effectively without first identifying your company’s A players. Most companies employ some form of assessment based on performance and potential, typically as a vehicle for determining compensation and career progression. Following this approach, A players are employees who score highly on both dimensions.

    Know where your A players are (and could be) deployed. Knowing who your stars are is just the beginning. You also need to know how effectively they are being deployed. For each star in your company, ask two important and related questions:
    • Where are they currently deployed? What role is each star currently playing in the organization? This information will help you assess how effectively you are deploying scarce star talent.
    • How fungible are they? Could they perform some other role with the same (or similar) performance? Your most valuable people are both highly proficient in their current roles and highly versatile. If you find that you have underinvested scarce talent in a number of critical roles, versatile stars can help to fill these roles.
    Identify the business-critical roles in your company. Not all roles are created equal. Some are inherently more important than others in successfully executing a company’s strategy and delivering superior performance. The best companies identify these roles explicitly. They ask themselves: “Which roles benefit the most from star talent?” and, by implication, “Which roles can we afford to fill with ‘good enough’ talent?” Having the best software programmer in the world makes little difference if your business is consumer packaged goods. But having the very best brand managers and marketers may make a big difference. The best-performing companies put their talent where the money is.

    Treat star talent as a company-wide resource. Organizations commonly struggle with moving great talent from one part of the company to another. Your star talent can quickly become the property of a single business unit or function unless you have the processes and practices to ensure that these scarce resources are invested on behalf of your entire company, not just the division, business, geography, or function where they currently reside. Organizations that put these practices in place make better use of their existing talent and avoid the artificial shortages of talent that can be created by parochial hoarding of A players.

    Ensure that business-critical roles get first dibs on star talent. Once your leadership has the information it needs to determine who and where the stars are in your organization, it must be ruthlessly nonegalitarian in the way it assigns talent. It must make sure that business-critical roles are filled with A players first, and then turn its attention to roles that are important but less business-critical. Only then can you be assured that your star talent is being deployed as well as possible.
    Ever since the start of the “war for talent,” companies have invested billions to attract, develop, and retain the very best. Now that war looks like a stalemate: Most companies, on average, have the same share of stars. The companies that perform the best are the ones that treat star talent as the scarce, hard-won resource that it is.

    Reproduced from Harvard Business Review







    The Pakistani Hindu community will have a personal law for the first time as the Senate has passed 'The Hindu Marriage Bill 2017'. The bill, approved by the National Assembly on September 26, 2015, and passed by the Senate on Friday, will likely get presidential assent next week to become law, Dawn online reported.

    According to the bill, Hindu women will get documentary proof of their marriage.
    After approval, the law will be applicable to Pakistani Hindus in Punjab, Balochistan and Khyber Pakhtunkhwa. Sindh province already has its own Hindu marriage law.

    The bill presented in the Senate by Law Minister Zahid Hamid faced no opposition or objection. It was approved by the Senate Functional Committee on Human Rights on January 2 with an overwhelming majority. However, Senator Mufti Abdul Sattar of the Jamiat Ulema-e-Islam-Fazl had opposed the bill, saying the Constitution was vast enough to cater for such needs.

    Committee chairperson Senator Nasreen Jalil of the Muttahida Qaumi Movement on Friday said: "This was unfair. Not only against the principles of Islam but also a human rights violation that we have not been able to formulate a personal family law for the Hindus."

    Senators Aitzaz Ahsan, Jehanzeb Jamaldini and Sitara Ayaz, supporting the bill, said it was related to the marriage of Hindus living in Pakistan and had nothing to do with Muslims.

    Ramesh Kumar Vankwani, who had been working relentlessly for three years to have a Hindu marriage law, said: "Such laws will help discourage forced conversions and streamline the Hindu community after the marriage of individuals."

    Vankwani said it was difficult for Hindu women to prove their marriage.

    The law paves the way for 'Shadi Parath' -- similar to Nikahnama for Muslims -- to be signed by a pundit and registered with the relevant government department.

    However, the Hindu parliamentarians and members of the community had concerns over one of the clauses of the bill that deals with "annulment of marriage".

    According to the bill, one of the partners can approach the court for separation if any of them changes the religion.

    "The separation case should be filed before the conversion as it gives an option to the miscreants to kidnap a married woman, keep her under illegal custody and present her in a court that she has converted to Islam and does not want to live with a Hindu man," Vankwani said.

    The bill is broadly acceptable to Pakistani Hindus and relates to marriage, its registration, separation and remarriage, with the minimum age of marriage set at 18 years for both boys and girls.





    What is SAP HANA, express edition?
    SAP HANA, express edition is a streamlined version of SAP HANA that can run on laptops and other resource-constrained hosts, such as a cloud-hosted virtual machine. SAP HANA, express edition is free to use for in-memory databases up to 32 GB.



    New to SAP HANA, express edition?

    SAP HANA, express edition is targeted to run in resource-constrained environments and contains a rich set of capabilities for a developer to work with.



    In-memory OLTP and Column Store Database Server

    Eliminate disk bottlenecks and achieve groundbreaking performance with the SAP HANA in-memory database. SAP HANA is an ACID-compliant database that stores compressed data in memory in a columnar format and processes data in parallel across multiple processor cores, using single instruction, multiple data (SIMD) instructions.
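    Columnar, in-memory layout pays off because analytic queries typically scan one attribute across many records. The following Python sketch is purely illustrative (it is not SAP HANA code, and the table data is invented for the example); it contrasts row-wise and column-wise access patterns for the same aggregation:

```python
# Illustrative only: a toy row store vs. column store in plain Python.
# This is not SAP HANA's implementation; it merely shows why scanning a
# single attribute is cheaper when values are stored contiguously per column.

rows = [
    {"id": 1, "region": "EU", "revenue": 120},
    {"id": 2, "region": "US", "revenue": 300},
    {"id": 3, "region": "EU", "revenue": 180},
]

# Row layout: every record must be touched to read one attribute.
total_row = sum(r["revenue"] for r in rows)

# Column layout: the same data pivoted into one list per attribute.
columns = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "revenue": [120, 300, 180],
}

# Aggregating a single column scans one contiguous list -- the access
# pattern that in-memory column stores (and SIMD units) exploit.
total_col = sum(columns["revenue"])

assert total_row == total_col == 600
```

    Compression also works better per column, since values of one attribute tend to repeat (note the duplicated "EU" above).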

    Bring your own language and micro-services

    SAP HANA XS Advanced is delivered with the release and fully supports Apache TomEE Java and JavaScript/Node.js. XS Advanced uses a micro-services architecture based on Cloud Foundry.

    Predictive Analytics

    The Predictive Analysis Library (PAL) provides support for classic and universal predictive analysis algorithms, including:
    • Clustering
    • Classification
    • Time Series
    • Statistics
    • And more.
    PAL requires additional configuration of the base HXE server.
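    In SAP HANA, PAL algorithms run server-side and are invoked through SQL procedures rather than from application code. As a purely illustrative, server-free sketch of what one PAL algorithm family (clustering) computes, here is a minimal one-dimensional k-means in Python; the function name and sample data are invented for the example:

```python
# Illustrative only: a minimal k-means, the kind of clustering algorithm
# PAL exposes server-side. No SAP HANA code here -- a pure Python sketch.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            clusters[i].append(p)
        # Update step: move each center to the mean of its cluster
        # (keep the old center if a cluster ends up empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious one-dimensional clusters, around 1 and around 100.
data = [0.9, 1.0, 1.1, 99.0, 100.0, 101.0]
print(kmeans(data, k=2))  # centers near 1.0 and 100.0
```

    PAL's server-side versions of such algorithms operate directly on column-store tables, avoiding the cost of moving data out of the database.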

    Geospatial

    Store, process and visualize geospatial data within SAP HANA. You can also perform operations such as distance calculations, and determine the union and intersection of multiple objects. In addition, you can integrate geospatial data with other structured data.
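    In SAP HANA itself, such geospatial operations are exposed as SQL functions. As a hedged illustration of the kind of distance calculation involved, the Python sketch below computes a great-circle (haversine) distance between two latitude/longitude points; the coordinates are just example values:

```python
# Illustrative only: a great-circle distance of the kind a spatial engine
# computes; SAP HANA exposes such operations as SQL functions instead.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Berlin to Munich: roughly 500 km apart.
d = haversine_km(52.52, 13.405, 48.137, 11.575)
print(round(d), "km")
```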




    System requirements / features

    SAP HANA, express edition comes as a binary installer or as a pre-configured virtual machine image (ova file). If your host runs on an SAP HANA, express edition supported operating system, you can choose either the binary installer or the ova file. Otherwise, you must use the ova file.
    Operating systems supported by SAP HANA, express edition 1.0 SPS12 include:
    • SuSE Linux Enterprise for SAP Applications, 11.4, 12.0, 12.1
    • Red Hat Enterprise Linux 7.2
    Operating systems supported by SAP HANA, express edition 2.0 include:
    • SuSE Linux Enterprise for SAP Applications, 12.1
    SAP HANA, express edition databases are limited to 32 GB of RAM.




    SAP HANA, express edition diagram


    This diagram shows the HANA platform services available in SAP HANA, express edition. Please review the Feature Scope document for more details.

    Download Link 






    The prestigious journal The Lancet has published a large study identifying differences in the brains of people diagnosed with attention-deficit hyperactivity disorder (ADHD).

    It found ADHD is associated with the delayed development of five brain regions, and should be considered a brain disorder. This is vindication for people experiencing ADHD whose diagnosis is sometimes called into question as an invented condition used to label normal children who are not meeting unrealistic expectations of “normal” behaviours.

    Researchers from 23 centres in nine countries scanned the brains of people aged four to 63 years: 1,713 with and 1,529 without ADHD. When they analysed all the data they found people with ADHD had slightly smaller brains overall, and in five of the seven specific regions there was a definite but very slight reduction in size.

    They found these differences were more marked in children. When they analysed separately those who had and had not been treated with stimulant medication, they found no effect of medication. This suggests the differences are related to ADHD, and not an effect of treatment.

    Not all cases of ADHD are the same

    One important limitation of looking at brain images of people with ADHD relates to the diagnosis of ADHD, which is based on a person meeting a certain set of clinical criteria. Some of these are outcome-based and relate to a person’s ability to carry out tasks. For example, they may avoid tasks that require mental effort or leave tasks incomplete.

    The result of this – fewer tasks completed – could have more than one possible cause. The lack of precision in the cause makes it difficult to align the diagnosis exactly with brain images.
    Inefficiencies in the “thinking” function of the brain (called “executive functioning deficits”) have been identified in people with ADHD. These inefficiencies would make it harder for people with ADHD to carry out certain tasks, such as tasks that take a long time, are difficult and are not constantly rewarding or reinforcing. Therefore a person with ADHD might find motivation for homework extremely difficult to sustain, but electronic games could hold their attention for a far longer period.

    The diagnostic criteria for ADHD ignore the emotional aspect. Using present diagnostic criteria, at least 40% of individuals with ADHD also meet diagnostic criteria for oppositional defiant disorder, a childhood behavioural problem characterised by a negative attitude, disobedience and hostility.
    An even larger proportion probably have features of oppositional defiant disorder but do not reach the diagnostic threshold. This very substantial overlap requires explanation. The findings of the Lancet paper may indicate there is an emotional component that is intrinsic to ADHD.

    It is possible some people with ADHD do not experience an adequate level of emotional satisfaction or sense of achievement in completing everyday tasks. This deficiency in the emotional reward could be an additional problem for some people with ADHD. These individuals would find tasks not only more difficult but also less satisfying, reducing their motivation to achieve. They might also be more moody and disagreeable.

    Individuals with a combination of reduced emotional satisfaction (sometimes termed “reward deficiency”) and executive functioning deficits, would have two different mechanisms that would each serve to reduce their productivity.

    Both of these mechanisms would contribute to their symptoms of ADHD, as they would result in fewer tasks completed. So because there is more than one possible underlying mechanism contributing to certain features of ADHD, it could be anticipated that a large cohort of individuals with ADHD would show a mixed picture, with a variety of different brain structures affected.
    This would reflect differences in the balance of the deficits contributing to their symptoms. The study results are consistent with this concept – the scans show there is no single brain difference that can categorically diagnose ADHD, but they do involve brain centres related to emotion.

    A valid pathology

    The differences in the brains of people with ADHD confirm it is a valid diagnosis and the problems experienced by people with ADHD are genuine.

    However, neuroscience has moved ahead of the clinical understanding of ADHD that is based on the definition in the Diagnostic and Statistical Manual of Mental Disorders.

    We need more sophisticated but clinically relevant models that recognise ADHD results from a combination of deficits that interact to produce varying symptoms for every person who experiences ADHD.



    The Conversation




    Shyam's take...


    The study this article discusses was published in The Lancet. Thinking of the Lancet: much of my medical knowledge was built through the extensive reading I did of medical journals like the Lancet, BMJ, JAMA, AHA journals, and others.

    In those days, the medical college and the British Library were my favourite haunts, and medical and management content was my staple diet. This was because of my association with the medical profession and the knowledge sharing I did with medical professionals. Or, simply put, maybe

    I was suffering from ERS: "Excessive Reading Syndrome."

    I count among my contacts seasoned professionals and knowledgeable homemakers, and I am keen to have their feedback on ADHD and how they deal with it.

    Now, with the advent of social media, especially WhatsApp, we also have social media ADHD. Research is ongoing into this fast-developing aspect of ADHD.

    Have you gone through this experience at any time, or seen people going through it?

    Do you feel it could be because of low self-esteem, on account of various reasons, personal and professional?

    Or do you feel OCD could be a character trait resulting from this syndrome?
    Please feel free to share your comments, or DM me your opinion.



    "Soon there are going to be states in India that will start ageing and they will need people from other parts of India to look after their elderly." 




    At a time when many economically progressive states and cities are battling “anti-outsider” sentiments, Dr Arvind Subramanian, the Chief Economic Advisor (CEA) to the Government of India, said on Friday that India needed more migration to sustain the “peninsular states” and West Bengal, which have rapidly ageing populations equivalent to those found in advanced countries. Speaking on the topic, “The Economic Survey 2016-17: How India Surprises,” at the Indian Institute of Management-Ahmedabad (IIM-A), he said, “From a demographic point of view, we have two Indias.

    “We have an India that comprises of Tamil Nadu, Kerala, Karnataka, Andhra Pradesh and West Bengal, where populations have started becoming older and older, like some of the advanced countries. So we have the peninsular India, that resembles the advanced countries in terms of their age structure and then we have a young India; Bihar, Madhya Pradesh, Rajasthan, Orissa, which are much more young and dynamic.”

    “So this is going to be huge challenge and an opportunity…. Soon there are going to be states in India that will start ageing and they will need people from other parts of India to look after their elderly.

    “In a sense, we will need more migration in India, so that young demographic India with Bimaru states actually migrate more in order to look after the ageing population of some of the older, ageing states,” Subramanian added.

    Referring to the Economic Survey 2016-17 presented recently in Parliament, he pointed out that an estimated nine million migrants travel between states for work. This estimate was derived using data from Indian Railways for the period between 2011 and 2016.

    During his visit to his alma mater after 36 years, Subramanian also talked about rising disparity in India. According to him, in the last 15 years, poorer states in India have grown slower than the richer states.

    “So what it means is, the disparity within India is increasing and this is in sharp contrast to what is happening elsewhere in the world,” the CEA said, making comparisons with countries like China, where the gap between rich and poor provinces is narrowing.

    “If Gujarat is growing faster than Orissa, people will come from Orissa to Gujarat. So you will see people from Orissa becoming richer because of partaking in what is happening in Gujarat. But, this is not happening enough in India and this is a deep puzzle…,” he added.

    The economist also talked about how India needed to quickly cash in on its demographic dividend. “The other thing that we discovered is that, this window of opportunity that we call as demographic dividend is going to close very soon.

    “We estimate that by 2020, we will reach the peak of demographic dividend and after that the growth of young people will start declining very rapidly and very soon.”

    While taking questions from the audience, Subramanian, speaking about black money, said that more “sticks and carrots” were required after demonetisation to “arrest” the flow of black money.
    He also said that despite government programmes like Jan Dhan, India was “far away” from achieving financial inclusion.





    Probably the tail-end article on demonetisation in my collection.

    Everyone has a right to say and speak his/her mind, and every say needs to be heard, agreements and disagreements notwithstanding.




    Update: The Lok Sabha on February 8, 2017, passed the Specified Bank Notes (Cessation of Liabilities) Bill, 2017. Under the new law, holding, transferring or receiving old 500 and 1,000 rupee notes is a criminal offence. The bill has thus ended the liability of the RBI and the government for the demonetised currency notes.

    At eight o'clock on the evening of November 8, 2016, India was wide awake. The reason: in his televised special address, Prime Minister Narendra Modi had unveiled an economic tsunami in which 86 per cent of the currency, in Rs 500 and Rs 1,000 denominations, ceased to be legal tender.

    Since that day, the Reserve Bank of India and the Centre have taken many steps to lessen the problems the common man has faced due to demonetisation. At the same time, the Supreme Court has been flooded with public interest litigations (PILs) challenging the constitutional validity of the move. But the million-dollar question is: Is demonetisation legal?


    While the matter has been referred to the five-judge constitutional bench, Propguide studies the legal aspects of the order which has hit the country’s booming economy.

    Violates the constitutional right to property under Article 300A: The only right ever to be erased from the list of fundamental rights was the Right to Property. It was demoted to a mere constitutional right. So, is cash-rationing a valid restriction on the constitutional right to property? Yes, because cash and bank accounts are property.

    In Jayantilal vs RBI, in the context of the 1978 demonetisation, the top court had held that demonetisation is not merely a regulation of property, as the government is presently arguing, but constitutes compulsory acquisition of a "public debt" owed to the bearer of the notes declared illegal. Article 300A states that the state may deprive an individual of property only through 'law', and not by executive notification as the government has done.

    Excessive delegation: Section 26(2) of the Reserve Bank of India Act, 1934, says that on the recommendation of the central bank, the central government may pass a notification in the Gazette of India that any series of bank notes cease to be legal tender. Fixing the date from which the demonetisation would come into force is the foundation of section 26(2) and constitutes an "essential law-making function" which cannot be exercised by the central government on behalf of the central bank.

    Abridgement of fundamental rights: The currency ban caused a lot of hardship to the common man, as many people could not carry out their business and trade (Article 19(1)(g)), and it violated the Right to Life (Article 21) of those 100-odd people who died while standing in the long bank queues. While the government is within its rights to curtail fundamental rights in the larger public good, it needs to prove that the curbs were 'reasonable'.

    Act of Parliament needed: The precedent is that on the last two occasions of demonetisation (1946 and 1978), the law was effected through an ordinance. But this time the law was effected through a government notification. The rationale is that a mere notification cannot suffice for a rule that has such a draconian effect on the lives of people.

    Article 19(6) has no significance: The Centre would want to hide behind Article 19(6) of the Constitution but it has no relevance here. Article 19(6) says that nothing in Article 19(1)(g) shall affect the operation of any existing law in so far as it prevents the state from making any law imposing, in the interest of the general public, reasonable restrictions on the exercise of the rights conferred by the sub-clause. But the exception of Article 19(6) is not available to the central government as the notification is beyond "police powers".









    Boston Dynamics’ wheeled Handle robot received much fanfare earlier this month when DFJ partner Steve Jurvetson slipped us an early video from a company keynote. Today we have more details on Handle in the form of a shiny HD video.

    We already knew Handle could manage some pretty sick hurdles and spins, but the new video shows us how the robot can operate in tough environments — on hills, in the snow and over uneven terrain. It manages this at a height of 6.5 feet, which surpasses that of most humans. On wheels, it can move at a chipper nine mph and manage four-foot vertical jumps. If you’re wondering, the highest human jump ever recorded is 5.3 feet. (Take that, Handle!)









    To understand how advances in artificial intelligence are likely to change the workplace — and the work of managers — you need to know where AI delivers the most value.



    Major technology companies such as Apple, Google, and Amazon are prominently featuring artificial intelligence (AI) in their product launches and acquiring AI-based startups. The flurry of interest in AI is triggering a variety of reactions — everything from excitement about how the capabilities will augment human labor to trepidation about how they will eliminate jobs. In our view, the best way to assess the impact of radical technological change is to ask a fundamental question: How does the technology reduce costs? Only then can we really figure out how things might change.

    To appreciate how useful this framing can be, let’s review the rise of computer technology through the same lens. Moore’s law, the long-held view that the number of transistors on an integrated circuit doubles approximately every two years, dominated information technology until just a few years ago. What did the semiconductor revolution reduce the cost of? In a word: arithmetic.

    This answer may seem surprising since computers have become so widespread. We use them to communicate, play games and music, design buildings, and even produce art. But deep down, computers are souped-up calculators. That they appear to do more is testament to the power of arithmetic. The link between computers and arithmetic was clear in the early days, when computers were primarily used for censuses and various military applications. Before semiconductors, “computers” were humans who were employed to do arithmetic problems. Digital computers made arithmetic inexpensive, which eventually resulted in thousands of new applications for everything from data storage to word processing to photography.

    AI presents a similar opportunity: to make something that has been comparatively expensive abundant and cheap. The task that AI makes abundant and inexpensive is prediction — in other words, the ability to take information you have and generate information you didn’t previously have. In this article, we will demonstrate how improvement in AI is linked to advances in prediction. We will explore how AI can help us solve problems that were not previously prediction oriented, how the value of some human skills will rise while others fall, and what the implications are for managers. Our speculations are informed by how technological change has affected the cost of previous tasks, allowing us to anticipate how AI may affect what workers and managers do.

    Machine Learning and Prediction

    The recent advances in AI come under the rubric of what’s known as “machine learning,” which involves programming computers to learn from example data or past experience. Consider, for example, what it takes to identify objects in a basket of groceries. If we could describe how an apple looks, then we could program a computer to recognize apples based on their color and shape. However, there are other objects that are apple-like in both color and shape. We could continue encoding our knowledge of apples in finer detail, but in the real world, the amount of complexity increases exponentially.

    Environments with a high degree of complexity are where machine learning is most useful. In one type of training, the machine is shown a set of pictures with names attached. It is then shown millions of pictures that each contain named objects, only some of which are apples. As a result, the machine notices correlations — for example, apples are often red. Using correlates such as color, shape, texture, and, most important, context, the machine references information from past images of apples to predict whether an unidentified new image it’s viewing contains an apple.
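    A minimal sketch of this idea, assuming invented (red, green) color intensities as the learned correlates: the toy classifier below stores one mean feature vector per class and predicts whichever class is nearest. Real systems learn far richer correlates (shape, texture, context) from millions of labeled images; this is only an illustration:

```python
# Illustrative only: a toy nearest-mean classifier over hand-made
# (red, green) color features. The training data below is invented
# purely for the sketch.

# Labeled "training" examples: (red, green) intensity per fruit.
training = {
    "apple": [(0.9, 0.2), (0.8, 0.3), (0.85, 0.25)],
    "lime":  [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85)],
}

# "Learn" one prototype per class: the mean feature vector.
prototypes = {
    label: tuple(sum(v) / len(v) for v in zip(*examples))
    for label, examples in training.items()
}

def predict(features):
    """Predict the label whose prototype is closest to `features`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda label: dist2(features, prototypes[label]))

print(predict((0.88, 0.22)))  # a reddish object -> "apple"
```

    The machine never needs an explicit definition of "apple"; it only needs correlations between features and labels, which is why this approach scales to environments too complex to encode by hand.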

    When we talk about prediction, we usually mean anticipating what will happen in the future. For example, machine learning can be used to predict whether a bank customer will default on a loan. But we can also apply it to the present by, for instance, using symptoms to develop a medical diagnosis (in effect, predicting the presence of a disease). Using data this way is not new. The mathematical ideas behind machine learning are decades old. Many of the algorithms are even older. So what has changed?

    Recent advances in computational speed, data storage, data retrieval, sensors, and algorithms have combined to dramatically reduce the cost of machine learning-based predictions. And the results can be seen in the speed of image recognition and language translation, which have gone from clunky to nearly perfect. All this progress has resulted in a dramatic decrease in the cost of prediction.

    The Value of Prediction

    So how will improvements in machine learning impact what happens in the workplace? How will they affect one’s ability to complete a task, which might be anything from driving a car to establishing the price for a new product? Once actions are taken, they generate outcomes. (See “The Anatomy of a Task.”) But actions don’t occur in a vacuum. Rather, they are shaped by underlying conditions. For example, a driver’s decision to turn right or left is influenced by predictions about what other drivers will do and what the best course of action may be in light of those predictions.






    Among the three or four things that I have been really passionate about, when I first came, was the need to ramp up public investment. The government has embraced that wholeheartedly. Broadly, the views I have been advocating are reflected in this year’s budget as well. The other thing I feel really passionate about is the GST. I did want this to be a simple low-rate GST and whether I’ve been successful, we’ll have to wait and see. The job is to always create the sort of ecosphere of ideas where all these things are debated. Robert Solow has a famous line: “Ninety per cent of the job of a good economist is to flush out bad ideas like cockroaches”. Those are the things the people will never know — the bad ideas that are flushed out.

    Iyer: How has it been for you over the last three years in government? Does bureaucracy fascinate you?

    This is, by far, the most exciting thing I’ve done ever. I’m on a 24/7 high all the time and my wife, who is here, can testify to that. One of the big challenges to this is how you persuade people to go along with you. Navigating the bureaucracy is very much part of that. For me, watching and being part of this process is a daily learning experience.

    Iyer: There have been several reports on administrative reforms. Do you think certain things can be fixed?

    Fixing the Indian state is beyond Brahma, Vishnu, Shiva and all the gods put together. But, I do think a government can improve if they make use of outside talent. The marriage of the insiders with the outsiders produces results. If you don’t get outsiders, there is too much ossification and caution, too much hierarchy in the system to actually generate ideas. Equally, if you just rely on outsiders, they don’t know what’s going on and are not familiar with our ways of decision making.

    Iyer: You’ve often talked about the twin balance sheet problem. But banks have been unable to address this issue on their own. You advocated a bad bank. Why is it so politically unpalatable?
    If you remember, I came in November 2014, and the first document that we wrote was the Mid-Year Economic Analysis, in December 2014, which is when we coined the term ‘Twin Balance Sheet’ problem. At least some of us recognised it early on. It’s also honest to say that even we didn’t signal, at that point, the urgency and how serious we needed to be. We were also a little bit behind the curve. There is probably an in-built incentive in the system to not reveal the true extent of problems.

    We also thought that if growth were to pick up, it would kind of be ... a rising tide can cover all the jagged rocks, and so it would do that. A combination of all these things lulled people into a sense that the problem is not so big. The final bit of honesty is that it is a very hard problem, which should not be underestimated, because nowhere in the world is it easy to say I will forgive the debts of the private sector. Because you create moral hazards, you say what kind of system is this, crony capitalism and so on. The twin balance sheet problem is finally about that, of course. There was a lot of crony capitalism, but a lot of honest mistakes were made.

    Down the road, we realised that a big part of the solution has to be to write down the debts of many companies, several of which were large companies, and that is not going to be easy. Making those hard decisions to write down debt, it’s difficult for any political system to do and of course in India you have this overhang of the four Cs. Even honest bureaucrats within the public sector are kind of fearful.

    Damodaran: You mean the CAG, the CVC...

    I would say the Courts, CAG, CVC, CBI. I don’t mean to cast any aspersions on any of them. They’re doing their job. But there is a kind of pall of anxiety and nervousness hanging over decision making. Put that together anywhere in the world and it is difficult to write off. So, it’s a really hard thing and something we need to crack.

    Iyer: In the run-up to the ongoing elections, the campaign draws so much from religion-based politics. How does politics based so heavily on religion impact development?

    All evidence suggests that social harmony is an intrinsic precondition for economic development. We’ve been a very early democracy and a cleavaged democracy and that has obviously had an impact on our economic development. While we justly and rightly claim that we’re very proud of our political development, I think it has not been without some cost.








    “We are all in the gutter,” wrote Oscar Wilde, “but some of us are looking at the stars.” That is the nature of strategy through execution. You operate deep in the weeds, managing countless day-to-day tasks and transactions. At the same time, you keep a steady gaze on your company’s long-term goals and on ways you can stand out from your competitors.

    Having a close link between strategy and execution is critically important. Your strategy is your promise to deliver value: the things you do for customers, now and in the future, that no other company can do as well. Your execution occurs in the thousands of decisions made each day by people at every level of your company.

    Quality, innovation, profitability, and growth all depend on having strategy and execution fit together seamlessly. If they don’t fit — if you can’t deliberately align them in a coherent way — you risk operating at cross-purposes and losing your focus. This problem is all too common. In a recent Strategy& global survey, 700 business executives were asked to rate their company’s top leaders in terms of their skill at strategy creation and at execution. Only 8 percent were credited as being very effective at both.

    Strategy&, the strategy consulting business of PwC, has been studying the relationship between strategy and execution for years. We have found that the most iconic enterprises — companies such as Apple, Amazon, Danaher, IKEA, Starbucks, and the Chinese appliance manufacturer Haier, all of which compete successfully time after time — are exceptionally coherent. They put forth a clear winning value proposition, backed up by distinctive capabilities, and apply this mix of strategy and execution to everything they do.

    Any company can follow the same path as these successful firms, and an increasing number of companies are doing just that. If you join them, you will need to cultivate the ability to translate the strategic into the everyday. This means linking strategy and execution closely together by creating distinctive, complex capabilities that set your company apart, and applying them to every product and service in your portfolio. These capabilities combine all the elements of execution — technology, human skills, processes, and organizational structures — to deliver your company’s chosen value proposition.

    How do you accomplish this on a day-to-day basis? How do you get the strategists and implementers in your company to work together effectively? These 10 principles, derived from our experience at Strategy&, can help you avoid common pitfalls and accelerate your progress. For companies that truly embrace strategy through execution, principles like these become a way of life.

    1. Aim High

    Don’t compromise your strategy or your execution. Set a lofty ambition for your strategy: not just financial success but sustained value creation, making a better world through your products, services, and presence. Apple’s early goal of making “a computer for the rest of us,” which effectively shaped the personal computer industry, is a classic example.

    Next, aim just as high on the execution side, with a dedication to excellence that seems almost obsessive to outsiders. Apple, for instance, has long been known for its intensive interest in every aspect of product design and marketing, iterating endlessly until its notoriously demanding leaders are satisfied. The company’s leaders do not consider execution beneath them; it is part of what makes Apple special.

    Together, a strong long-term strategy and a fierce commitment to excellent execution can transform not only a company, but a regional economy. After the 1992 Olympics in Barcelona, a group of local political and business leaders realized, with some disappointment, that the event hadn’t triggered the economic growth they had expected. So they resolved to change the region’s economy in other ways. Led by the mayor, the group created a common base of technologies and practices and set up training programs for local enterprises. By 2014, after two decades of persistent effort, the city had become a hub for research and technology companies. One legacy of the Olympics is a group of about 600 sports-related companies with a collective annual revenue of US$3 billion and 20,000 employees.

    In carrying out this first principle, the top executives of your company must lead the way. They must learn to set lofty goals, establish a clear message about why those goals are relevant, and stick to them without compromise. This may take a while, because lofty goals require patience: the perseverance to hold your standards without lowering them, and the confidence to believe you will reach the goals soon enough. Leaders must demonstrate that courage and commitment, or no one else will. At the same time, don’t be surprised if the rewards start to appear sooner than you expect — both financial rewards and the intrinsic pleasure of working with highly capable people on relevant projects. With high aspirations (for example, IKEA’s goal of “creating a better everyday life for the many people” or Amazon’s self-proclaimed role as the “everything store”), you recruit talented people who are deeply committed to being there. That’s one way you’ll know that you’re aiming high enough: The whole organization will start to feel like a better place to work.


    2. Build on Your Strengths


    Your company has capabilities that set it apart, things you do better than anyone else. You can use them as a starting point to create greater success. Yet more likely than not, your strongest capabilities have been obscured over the years. If, like most companies, you pursue opportunities that crop up without thinking much about whether you have the prowess needed to capture them, you can gradually lose sight of what you do best, or why customers respond to it.

    Take an inventory of your most distinctive capabilities. Look for examples where you have excelled as a company, achieving greatly desired outcomes without heroic efforts. Articulate all the different things that had to happen to make these capabilities work, and figure out what it will take to build on your strengths, so that you can succeed the same way more consistently in the future.

    Sometimes a particular episode will bring to light new ways of building on your strengths. That’s what happened at Bombardier Transportation, a division of a Canadian firm and one of the world’s largest manufacturers of railroad equipment. To win a highly competitive bid for supplying 66 passenger train cars to a British rail operator, Bombardier shifted its manufacturing and commercial models to a platform-based approach, which allowed it to use and reuse the same designs for several different types of railway cars. “Platforming,” which was a new operational strategy for the industry, required adjustments to Bombardier’s supplier relationships and product engineering practices. But the benefits were immediate: lower costs, less technology risk, faster time-to-market, and better reliability.

    Bombardier won the bid — and, more importantly, learned from the experience, making the episode a model for other bids and contracts. When some Bombardier engineers complained about the platform approach on the grounds that it curtailed their creativity, the leadership had an immediate answer: The platform demonstrated capabilities that competitors couldn’t match and the company’s creativity could be focused on innovation. Additional contracts soon followed.

    The more knowledge you have about your own capabilities, the more opportunities you’ll have to build on your strengths. So you should always be analyzing what you do best, gathering data about your practices, and conducting postmortems. In every case, there is something to learn — about your operations, and also about the choices you make and the value you’re able to deliver.


    3. Be Ambidextrous


    In the physical world, ambidexterity is the ability to use both hands with equal skill and versatility. In business, it’s the ability to manage strategy and execution with equal competence. In some companies, this is known as being “bilingual”: able to speak the language of the boardroom and the shop floor or software center with equal facility. Ambidextrous managers can think about the technical and operational details of a project in depth and then, without missing a beat, can consider its broader ramifications for the industry. If strategy through execution is to become a reality, people across the enterprise need to master ambidexterity.

    Lack of ambidexterity can be a key factor in chronic problems. For instance, if IT professionals focus only on execution when they manage ERP upgrades or the adoption of new applications, they may be drawn to vendors for their low rates or expertise on specific platforms instead of their ability to design solutions that support the company’s business strategy. When the installation fails to deliver the capabilities that the company needs, there will be an unplanned revision; the costs will balloon accordingly, and the purchase won’t fulfill its promise.

    We recognize, of course, that not everyone needs to be equally conversant in the company’s strategy. A typical paper goods manufacturer, for example, employs chemists who research hydrogen bonds to discover ways to make paper towels more absorbent. They may not need to spend much time debating strategy in the abstract, but they do need to be aware of how their role fits in. Like the apocryphal bricklayer who sees himself as building a cathedral, the highly skilled technologists on your team must recognize that they are not merely fulfilling a spec but rather developing a technology unlike anyone else’s, for the sake of building highly distinctive capabilities. They might even help figure out what those capabilities should be.

    Similarly, your top leaders don’t have to be experts on hydrogen bonds or cloud-based SQL server hosting, but they do have to be conversant enough with technological and operational details to make the right high-level decisions. No longer can a senior executive credibly say, “I don’t use computers. My staff is my computer.” If your leaders aren’t ambidextrous, they risk being eclipsed or outperformed by someone who is.

    In The Self-Made Billionaire Effect: How Extreme Producers Create Massive Value (Portfolio, 2014), John Sviokla and Mitch Cohen suggest using the word producers to describe ambidextrous individuals. Self-made billionaires, such as Spanx founder Sara Blakely, POM Wonderful cofounder Lynda Resnick, Uniqlo founder Tadashi Yanai, and Morningstar founder Joe Mansueto, have this quality. They can both envision a blockbuster strategy and figure out in detail how to develop and sell it to customers. There are similarly ambidextrous people in every company, but they often go unappreciated. Find them, recognize and reward them, and give them opportunities to influence others.

    Foster ambidexterity in practices and processes as well as in people. For example, in your annual budgeting exercises, ask people to explain the relationship of each line item to the company’s strategy, and specifically to the capability it is enabling. Over time, this approach will channel investments toward projects with a more strategic rationale. 


    4. Clarify Everyone’s Strategic Role

    When the leaders of the General Authority of Civil Aviation (GACA) of Saudi Arabia decided to improve the way they ran the country’s 25 airports, they started with the hub in Riyadh, one of the largest airports in the country. They had already outsourced much of their activity, redesigning airport practices and enhancing operations. But not much had changed. Convening the directors and some department leaders, the head of the airport explained that some seemingly minor operational issues — long customs lines, slow boarding processes, and inadequate basic amenities — were not just problems in execution. They stood in the way of the country’s goal of becoming a commercial and logistics hub for Africa, Asia, and Europe. Individual airport employees, he added, could make a difference.

    The head of the airport then conducted in-depth sessions with employees on breaking down silos and improving operations. In these sessions, he turned repeatedly to a common theme: Each minor operational improvement would affect the attractiveness of the country for commercial travel and logistics. A wake-up call for staff, the sessions marked a turning point for the airport’s operational success. Other airports in the Saudi system are now expected to follow suit.

    The people in your day-to-day operations — wherever they are, and on whatever level — are continually called upon to make decisions on behalf of the enterprise. If they are not motivated to deliver the strategy, the strategy won’t reach the customers. It is well established that financial rewards and other tangible incentives will go only so far in motivating people. Workers cannot make a greater personal commitment unless they understand why their jobs make a difference, and why the company’s advancement will help their own advancement.

    Successful leaders spend a great deal of time and attention on the connection between strategy and personal commitment. One such leader has run the trade promotion effectiveness (TPE) capability at two global consumer packaged goods (CPG) companies over the past several years. CPG companies use this capability to build the momentum of key brands. It involves assembling assortments of products to promote, merchandising them to retailers, arranging in-store displays and online promotions, adjusting prices and discounts to test demand, and assessing the results. A great TPE capability consistently attracts customers and compels them to seek out the same products for months after the campaign ends. TPE and related activities often represent the second-largest item (after the cost of goods sold) on the P&L statement. This in itself indicates the capability’s strategic importance for CPG companies.

    In both enterprises, this executive took the time to go up and down the organization, making a case for why the specific mechanics of trade promotion matter to the value proposition of the company and, ultimately, to its survival. He made it a point to talk numbers but didn’t limit the conversation to them. “We spend billions at this company on promotions,” he might say. “We have to get back $100 million in added revenue next year, and another $100 million on top of that the year after.” He then urged employees to develop better promotions that would attract more consumers and increase their synergies with retailers. This combination of numbers and mission made it clear how people’s individual efforts could affect the company’s prospects. 


    5. Align Structures to Strategy


    Set up all your organizational structures, including your hierarchical design, decision rights, incentives, and metrics, so they reinforce your company’s identity: your value proposition and critical capabilities. If the structures of your company don’t support your strategy, consider removing them or changing them wholesale. Otherwise, they will just get in your way.

    Consider, for example, the metrics used to track the results delivered by call center employees. In many companies, these individuals must follow a script and check off that they’ve said everything on the list — even at the risk of irritating potential customers. Better instead to get employees to fully internalize the company’s strategy and grade them on their prowess at solving customer problems.

    Danaher, a conglomerate of more than 25 companies specializing in environmental science, life sciences, dental technologies, and industrial manufacturing technologies, is intensely focused on creating value through operational excellence. Critical to this approach are metrics built into the Danaher Business System, the company’s intensive continuous improvement program. Only eight key metrics, called “core value drivers” to underline their strategic relevance, are tracked constantly in all Danaher enterprises. The financial metrics (core growth, operating margin expansion, working capital returns, and return on invested capital) are used not just by investors but also by managers to evaluate the value of their own activities.

    Danaher also tracks two customer-facing metrics (on-time delivery and quality as perceived by customers), and two metrics related to employees (retention rates and the percentage of managerial positions filled by internal candidates). Lengthy in-person operating reviews, conducted monthly, are very data driven, focusing on solving problems and improving current practices. The metrics are posted on the shop floor, where anyone can see the progress that’s being made — or not being made — toward clear targets. The meetings are constructive: People feel accountable and challenged, but also encouraged to rise to the challenges.

    Data analytics is evolving to the point where it can help revitalize metrics and incentives. A spreadsheet is no longer enough to capture and analyze this body of material; you can use large information management systems programmed to deliver carefully crafted performance data. No matter how complex the input, the final incentives and metrics need to be simple enough to drive clear, consistent behavior. More generally, every structure in your organization should make your capabilities stronger, and focus them on delivering your strategic goals.

    6. Transcend Functional Barriers

    Great capabilities always transcend functional barriers. Consider Starbucks’ understanding of how to create the right ambience, Haier’s ability to rapidly manufacture home appliances to order, and Amazon’s aptitude for launching products and services enabled by new technologies. These companies all bring people from different functions to work together informally and creatively. Most companies have some experience with this. For example, any effective TPE capability brings together marketing, sales, design, finance, and analytics professionals, all working closely together and learning from one another. The stronger the cross-functional interplay and the more it is supported by the company’s culture, the more effective the promotion.

    Unfortunately, many companies unintentionally diminish their capabilities by allowing functions to operate independently. It’s often easier for the functional leaders to focus on specialized excellence, on “doing my job better” rather than on “what we can accomplish together.” Pressed for time, executives delegate execution to IT, HR, or operational specialists, who are attuned to their areas of expertise but not necessarily to the company’s overall direction. Collaborative efforts bring together people who don’t understand each other or, worse, who pursue competing objectives and agendas. When their narrow priorities conflict, the teams end up stuck in cycles of internal competition. The bigger a company gets, the harder it becomes to resolve these problems.

    You can break this cycle by putting together cross-functional teams to blueprint, build, and roll out capabilities. Appoint a single executive for each capability team, accountable for fully developing the capability. Ensure this person has credibility at all levels of the organization. Tap high-quality people from each function for this team, and give the leader the authority to set incentives for performance.

    There’s always the risk that these cross-functional teams will be seen as skunkworks, separate from the rest of the enterprise. To guard against this risk, you need a strong dotted line from each team member back to the original function. Sooner or later, the capabilities orientation will probably become habitual, affecting the way people (including functional leaders) see their roles: not as gatekeepers of their expertise, but as contributors to a larger whole.


    7. Become a Fully Digital Enterprise


    The seventh principle should affect every technological investment you make — and with luck, it will prevent you from making some outdated ones. Embrace digital technology’s potential to transform your company: to create fundamentally new experiences and interactions for your customers, your employees, and every other constituent. Until you use technology this way, many of your IT investments will be wasted; you won’t realize their potential in forming powerful new capabilities.

    Complete digitization will inevitably broaden your range of strategic options, enabling you to pursue products, services, and innovations that weren’t feasible before. For example, Under Armour began as a technologically enabled sports apparel company, specializing in microfiber-based synthetic fabrics that felt comfortable under all conditions. To keep its value proposition as an innovator, it aggressively expanded into fitness trackers and the development of smart apparel. The company is now developing clothing that will provide data that can both help athletes raise their game and point the way to design improvements.

    Adopting digital technology may mean abandoning expensive legacy IT systems, perhaps more rapidly than you had planned. Customers and employees have come to expect the companies they deal with to be digitally sophisticated. They now take instant access, seamless interoperability, smartphone connectivity, and an intuitively obvious user experience for granted. To be sure, it is expensive and risky to shift digital systems wholesale, and therefore you need to be judicious; some companies are applying the Fit for Growth approach to IT, in which they reconsider every expense, investing more only in those that are directly linked to their most important capabilities. (See “Building Trust while Cutting Costs,” by Vinay Couto, Deniz Caglar, and John Plansky.)

    Fortunately, cloud-based technologies provide many more options than were available before. To boost agility and reduce costs, you can outsource some tech activities, while keeping others that are distinctive to your business. You also can use embedded sensors and analytics to share data across your value chain and collaborate more productively (an approach known as “Industry 4.0” and the “Industrial Internet of Things”). The biggest constraint is no longer the cost and difficulty of implementation. It’s your ability to combine business strategy, user experience, and technological prowess in your own distinctive way. 


    8. Keep It Simple, Sometimes


    Many company leaders wish for more simplicity: just a few products, a clear and simple value chain, and not too many projects on the schedule. Unfortunately, it rarely works out that way. In a large, mainstream company, execution is by nature complex. Capabilities are multifaceted. Different customers want different things. Internal groups design new products or processes without consulting one another. Mergers and acquisitions add entirely new ways of doing things. Although you might clean house every so often, incoherence and complexity creep back in, along with the associated costs and bureaucracy.


    The answer is to constantly seek simplicity, but in a selective way. Don’t take a machete to your product lineup or org chart. Remember that not all complexity is alike. One advantage of aligning your strategy with your capabilities is that it helps you see your operations more clearly. You can distinguish the complexity that truly adds value (for example, a supply chain tailored to your most important customers) from the complexity that gets in your way (for example, a plethora of suppliers when only one or two are needed).

    As Vinay Couto, Deniz Caglar, and John Plansky explain in Fit for Growth: A Guide to Strategic Cost Cutting, Restructuring, and Renewal (Wiley, 2017), effective cost management depends on the ability to ruthlessly cut the investments that don’t drive value. Customer-facing activities can be among the worst offenders. Some customers need more tailored offerings or elaborate processes, but many do not.

    For example, Lenovo, a leading computer hardware company with twin headquarters in China and the U.S. (Lenovo’s ThinkPad computer business was acquired with its purchase of IBM’s personal computer business), has a strategy based on cross-pollination of innovation between two entirely different markets. The first is “relationship” customers (large enterprises, government agencies, and educational institutions), which purchase in large volume, need customized software, and are often legacy IBM customers. The second is “transactional” customers (individuals and smaller companies), typically buying one or two computers at a time, all seeking more or less the same few models; these customers, however, are sensitive to cost and good user experience.

    Lenovo has a single well-developed hardware and software innovation capability aimed at meeting the needs of both types of customers. But its supply chain capability is bifurcated. The relationship supply chain is complex, designed to provide enterprise customers with greater responsiveness and flexibility. Lenovo’s computer manufacturing plant in Whitsett, N.C., which opened in 2013, was designed for fast shipping, large orders, and high levels of customization. Meanwhile, the company maintains a simpler supply chain with manufacturing sites in low-cost locations for its transactional customers.

    The principle “keep it simple, sometimes” is itself more complex than it appears at first glance. It combines three concepts in one: First, be as simple as possible. Second, let your company’s strategy be your guide in adding the right amount of complexity. Third, build the capabilities needed to effectively manage the complexity inherent in serving your markets and customers.


    9. Shape Your Value Chain


    No company is an island. Every business relies on other companies in its network to help shepherd its products and services from one end of the value chain to the other. As you raise your game, you will raise the game of other operations you work with, including suppliers, distributors, retailers, brokers, and even regulators.

    Since these partners are working with you on execution, they should also be actively involved in your strategy. That means selling your strategy to them, getting them excited about taking the partnership to a whole new level, and backing up your strategic commitment with financing, analytics, and operational prowess. For example, when the Brazilian cosmetics company Natura Cosméticos began sourcing ingredients from Amazon rain forest villages, its procurement staff discovered that the supply would be sustainable only if they built deeper relationships with their suppliers. Beyond paying suppliers, they needed to invest in the suppliers’ communities. The company has held to that commitment even during down periods.

    Use leading-edge digital technology to align analytics and processes across your value chain. In the past, companies that linked operations to customer insight in innovative ways did it through vertical integration, by bringing all parts of the operation in-house. For example, Inditex created a robust in-house network that linked its Zara retail stores with its design and production teams. Real-time purchase data allowed designers to find out what was selling — and what wasn’t — more quickly than their competitors could. This approach has helped Zara introduce more items that would sell quickly while keeping costs down. And it has helped Inditex outpace its rivals in both profitability and growth.

    At the time Inditex developed its system, vertical integration was a prerequisite for that kind of linkage. But now the technology has changed, and in a cloud-based computing environment, you no longer need full vertical integration. You can achieve the same result through integrated business platforms (some managed by third-party logistics companies such as Genpact, and others being developed as joint ventures). By allowing several companies to share real-time data seamlessly, these platforms enable each participating company to set more ambitious strategic goals.


    10. Cultivate Collective Mastery


    The more bound your company is by internal rules and procedures for making and approving decisions, the slower it becomes. Hence the frustration leaders have with the pace of bureaucracy, in which people can’t make decisions because they don’t know what the strategic priorities are — or even what other stakeholders will think. In a world where disruption has become prevalent, your company can’t afford the time or expense of operating this way.

    The alternative is what we call collective mastery. This is a cultural attribute, often found in companies where strategy through execution is prevalent. It is the state you reach when communication is fluid, open, and constant. Your strategists understand what will work or not work because they talk easily with functional specialists. Your functional specialists know not only what they’re supposed to do, but why it matters. Everyone moves quickly and decisively, because they have the ingrained judgment to know who to consult, and when. People trust one another to make decisions on behalf of the whole.

    Many of the attributes of Silicon Valley companies owe a great deal to the high level of collective mastery in the area. The culture of these companies encourages risk taking, because it’s expected that people will make mistakes — not as a goal, of course, but in the process of learning. People expect their colleagues to be informal, quick-thinking, and unassuming. They rely on systems and processes only when they add value, and are willing to jettison them at other times. With this type of culture, people can focus on getting results.

    Collective mastery builds over time when people have the support and encouragement they need to work easily and readily across organizational boundaries, with a high level of trust and frequent informal contact. Even when they hold different perspectives, they get to the point where they understand one another’s thinking.

    To operate this way, you have to be flexible. That doesn’t mean giving up your strategy; you still should pursue only opportunities with which you have the capabilities to win. Indeed, knowing what you do best allows you to be closer to the customers who matter, and to give more autonomy to employees. Because you are less distracted by nonstrategic issues, you have the attention and resources to pursue worthwhile opportunities as soon as they arise. Collective mastery also makes it easier to conduct an experiment: to launch a project and learn from the response without making a huge commitment. This high level of fluidity and flexibility is essential for navigating in a volatile economic landscape.

    In the end, the 10 principles of strategy through execution will do more than help you achieve your business goals. They will also help build a new kind of culture, one in which people are aware of where you’re going and how you’re going to get there. The capabilities you build, and the value you provide, are larger than any individual can make them. But by creating the right kind of atmosphere, you make it possible to not just stand in the weeds and look at the stars, but reach a higher level than you may ever have thought you would.











    The strangeness of the quantum realm opens up exciting new technological possibilities.



    A BATHING cap that can watch individual neurons, allowing others to monitor the wearer’s mind. A sensor that can spot hidden nuclear submarines. A computer that can discover new drugs, revolutionise securities trading and design new materials. A global network of communication links whose security is underwritten by unbreakable physical laws. Such—and more—is the promise of quantum technology.

    All this potential arises from improvements in scientists’ ability to trap, poke and prod single atoms and wispy particles of light called photons. Today’s computer chips get cheaper and faster as their features get smaller, but quantum mechanics says that at tiny enough scales, particles sail through solids, short-circuiting the chip’s innards. Quantum technologies come at the problem from the other direction. Rather than scale devices down, quantum technologies employ the unusual behaviours of single atoms and particles and scale them up. Like computerisation before it, this unlocks a world of possibilities, with applications in nearly every existing industry—and the potential to spark entirely new ones.

    Strange but true

    Quantum mechanics—a theory of the behaviour at the atomic level put together in the early 20th century—has a well-earned reputation for weirdness. That is because the world as humanity sees it is not, in fact, how the world works. Quantum mechanics replaced wholesale the centuries-old notion of a clockwork, deterministic universe with a reality that deals in probabilities rather than certainties—one where the very act of measurement affects what is measured. Along with that upheaval came a few truly mind-bending implications, such as the fact that particles are fundamentally neither here nor there but, until pinned down, both here and there at the same time: they are in a “superposition” of here-there-ness. The theory also suggested that particles can be spookily linked: do something to one and the change is felt instantaneously by the other, even across vast reaches of space. This “entanglement” confounded even the theory’s originators.

    It is exactly these effects that show such promise now: the techniques that were refined in a bid to learn more about the quantum world are now being harnessed to put it to good use. Gizmos that exploit superposition and entanglement can vastly outperform existing ones—and accomplish things once thought to be impossible.

    Improving atomic clocks by incorporating entanglement, for example, makes them more accurate than those used today in satellite positioning. That could improve navigational precision by orders of magnitude, which would make self-driving cars safer and more reliable. And because the strength of the local gravitational field affects the flow of time (according to general relativity, another immensely successful but counter-intuitive theory), such clocks would also be able to measure tiny variations in gravity. That could be used to spot underground pipes without having to dig up the road, or track submarines far below the waves.

    Other aspects of quantum theory permit messaging without worries about eavesdroppers. Signals encoded using either superposed or entangled particles cannot be intercepted, duplicated and passed on. That has obvious appeal to companies and governments the world over. China has already launched a satellite that can receive and reroute such signals; a global, unhackable network could eventually follow.

    The advantageous interplay between odd quantum effects reaches its zenith in quantum computers. Rather than the 0s and 1s of standard computing, a quantum computer’s bits are in superpositions of both, and each “qubit” is entangled with every other. Using algorithms that recast problems in quantum-amenable forms, such computers will be able to chomp their way through calculations that would take today’s best supercomputers millennia. Even as high-security quantum networks are being developed, a countervailing worry is that quantum computers will eventually render obsolete today’s cryptographic techniques, which are based on hard mathematical problems.

    Long before that happens, however, smaller quantum computers will make other contributions in industries from energy and logistics to drug design and finance. Even simple quantum computers should be able to tackle classes of problems that choke conventional machines, such as optimising trading strategies or plucking promising drug candidates from scientific literature. Google said last week that such machines are only five years from commercial exploitability. This week IBM, which already runs a publicly accessible, rudimentary quantum computer, announced expansion plans. As our Technology Quarterly in this issue explains, big tech firms and startups alike are developing software to exploit these devices’ curious abilities. A new ecosystem of middlemen is emerging to match new hardware to industries that might benefit.

    The solace of quantum

    This landscape has much in common with the state of the internet in the early 1990s: a largely laboratory-based affair that had occupied scientists for decades, but in which industry was starting to see broader potential. Blue-chip firms are buying into it, or developing their own research efforts. Startups are multiplying. Governments are investing “strategically”, having paid for the underlying research for many years—a reminder that there are some goods, such as blue-sky scientific work, that markets cannot be relied upon to provide.

    Fortunately for quantum technologists, the remaining challenges are mostly engineering ones, rather than scientific. And today’s quantum-enhanced gizmos are just the beginning. What is most exciting about quantum technology is its as yet untapped potential. Experts at the frontier of any transformative technology have a spotty record of foreseeing many of the uses it will find; Thomas Edison thought his phonograph’s strength would lie in elocution lessons. For much of the 20th century “quantum” has, in the popular consciousness, simply signified “weird”. In the 21st, it will come to mean “better”.









    DSS-14 is NASA's 70-meter (230-foot) antenna located at the Goldstone Deep Space Communications Complex in California. It is known as the “Mars Antenna” as it was first to receive signals from the first spacecraft to closely observe Mars, Mariner 4, on March 18, 1966.
    Credits: NASA/JPL-Caltech


    Finding derelict spacecraft and space debris in Earth’s orbit can be a technological challenge. Detecting these objects in orbit around Earth’s moon is even more difficult. Optical telescopes are unable to search for small objects hidden in the bright glare of the moon.

    However, a new technological application of interplanetary radar pioneered by scientists at NASA’s Jet Propulsion Laboratory in Pasadena, California, has successfully located spacecraft orbiting the moon -- one active, and one dormant. This new technique could assist planners of future moon missions.


    “We have been able to detect NASA’s Lunar Reconnaissance Orbiter [LRO] and the Indian Space Research Organization’s Chandrayaan-1 spacecraft in lunar orbit with ground-based radar,” said Marina Brozović, a radar scientist at JPL and principal investigator for the test project. “Finding LRO was relatively easy, as we were working with the mission’s navigators and had precise orbit data where it was located. Finding India’s Chandrayaan-1 required a bit more detective work because the last contact with the spacecraft was in August of 2009.”


    Add to the mix that the Chandrayaan-1 spacecraft is very small, a cube about five feet (1.5 meters) on each side -- about half the size of a smart car. Although interplanetary radar has been used to observe small asteroids several million miles from Earth, researchers were not certain that an object this small could be detected as far away as the moon, even with the world’s most powerful radars. Chandrayaan-1 proved the perfect target for demonstrating the capability of this technique.






    This computer-generated image depicts Chandrayaan-1’s location at the time it was detected by the Goldstone Solar System Radar on July 2, 2016. In the graphic, the 120-mile (200-kilometer) wide purple circle represents the width of the Goldstone radar beam at lunar distance. The radar beam was pointed 103 miles (165 kilometers) off the lunar surface. The white box in the upper-right corner depicts the strength of the echo. As the spacecraft entered and exited the radar beam (purple circle), the echo from the spacecraft alternated between being very strong and very weak as the radar beam scattered from the flat metal surfaces. Once the spacecraft flew outside the beam, the echo was gone.
    Credits: NASA/JPL-Caltech

    While they all use microwaves, not all radar transmitters are created equal. The average police radar gun has an operational range of about one mile, while air traffic control radar reaches to about 60 miles. To find a spacecraft 237,000 miles (380,000 kilometers) away, JPL’s team used NASA's 70-meter (230-foot) antenna at NASA's Goldstone Deep Space Communications Complex in California to send a powerful beam of microwaves toward the moon. The radar echoes that bounced back from lunar orbit were then received by the 100-meter (330-foot) Green Bank Telescope in West Virginia.
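The gulf between these ranges is starker than it looks, because a radar's returned echo power falls off with the fourth power of distance. That scaling comes from the standard radar range equation, not from the article itself, so the sketch below is purely illustrative:

```python
# Monostatic radar: the echo power received back at the antenna scales
# as 1/r^4, because the outbound beam spreads as 1/r^2 and the
# reflected echo spreads as 1/r^2 again on the way home.
def relative_echo_power(r_miles: float, ref_miles: float = 1.0) -> float:
    """Echo power at range r, relative to the same radar at ref_miles."""
    return (ref_miles / r_miles) ** 4

# Ranges mentioned in the article:
air_traffic = relative_echo_power(60.0)      # air traffic control radar
lunar = relative_echo_power(237_000.0)       # distance to the moon

print(f"{air_traffic:.1e}")  # fraction of the 1-mile echo at 60 miles
print(f"{lunar:.1e}")        # fraction at lunar distance
```

The lunar echo is more than twenty orders of magnitude weaker than a one-mile echo, which is why the experiment paired Goldstone's huge transmitter with the 100-meter Green Bank dish as a receiver.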


    Finding a derelict spacecraft at lunar distance that has not been tracked for years is tricky because the moon is riddled with mascons (regions with higher-than-average gravitational pull) that can dramatically alter a spacecraft’s orbit over time, or even cause it to crash into the moon. JPL’s orbital calculations indicated that Chandrayaan-1 is still circling some 124 miles (200 kilometers) above the lunar surface, but it was generally considered “lost.”


    However, with Chandrayaan-1, the radar team took advantage of the fact that the spacecraft is in a polar orbit around the moon, so it would always cross above the lunar poles on each orbit. On July 2, 2016, the team pointed Goldstone and Green Bank at a location about 100 miles (160 kilometers) above the moon’s north pole and waited to see if the lost spacecraft crossed the radar beam. Chandrayaan-1 was predicted to complete one orbit around the moon every two hours and eight minutes. Something with the radar signature of a small spacecraft did cross the beam twice during four hours of observations, and the timing between detections matched the time it would take Chandrayaan-1 to complete one orbit and return to the same position above the moon’s pole.
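The quoted orbital period is easy to sanity-check with Kepler's third law for a circular orbit. The constants below are standard values for the moon, not figures taken from the article:

```python
import math

GM_MOON = 4902.8      # lunar gravitational parameter, km^3/s^2
R_MOON = 1737.4       # mean lunar radius, km
ALTITUDE = 200.0      # Chandrayaan-1's approximate altitude, km

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / GM)
a = R_MOON + ALTITUDE                 # orbital radius, km
period_s = 2 * math.pi * math.sqrt(a**3 / GM_MOON)

print(round(period_s / 60))  # ~128 minutes, i.e. about 2 hours 8 minutes
```

That matches the interval between the two beam crossings that identified the spacecraft.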




    Radar imagery acquired of the Chandrayaan-1 spacecraft as it flew over the moon’s south pole on July 3, 2016. The imagery was acquired using NASA's 70-meter (230-foot) antenna at the Goldstone Deep Space Communications Complex in California. This is one of four detections of Chandrayaan-1 from that day.
    Credits: NASA/JPL-Caltech


    The team used data from the return signal to estimate its velocity and the distance to the target.  This information was then used to update the orbital predictions for Chandrayaan-1.


    “It turns out that we needed to shift the location of Chandrayaan-1 by about 180 degrees, or half a cycle from the old orbital estimates from 2009,” said Ryan Park, the manager of JPL’s Solar System Dynamics group, who delivered the new orbit back to the radar team.  “But otherwise, Chandrayaan-1’s orbit still had the shape and alignment that we expected.”

    Radar echoes from the spacecraft were obtained seven more times over three months and are in perfect agreement with the new orbital predictions. Some of the follow-up observations were done with the Arecibo Observatory in Puerto Rico, which has the most powerful astronomical radar system on Earth. Arecibo is operated by the National Science Foundation with funding from NASA’s Planetary Defense Coordination Office for the radar capability.



    Hunting down LRO and rediscovering Chandrayaan-1 have provided the start for a unique new capability. Working together, the large radar antennas at Goldstone, Arecibo and Green Bank demonstrated that they can detect and track even small spacecraft in lunar orbit. Ground-based radars could play a part in future robotic and human missions to the moon, both as a collision hazard assessment tool and as a safety mechanism for spacecraft that encounter navigation or communication issues.


    JPL manages and operates NASA's Deep Space Network, including the Goldstone Solar System Radar, and hosts the Center for Near-Earth Object Studies for NASA's Near-Earth Object Observations Program, an element of the Planetary Defense Coordination Office within the agency's Science Mission Directorate.


    Some behaviors — yawning and scratching, for example — are socially contagious, meaning if one person does it, others are likely to follow suit. Now, researchers at Washington University School of Medicine in St. Louis have found that socially contagious itching is hardwired in the brain.

    Studying mice, the scientists have identified what occurs in the brain when a mouse feels itchy after seeing another mouse scratch. The discovery may help scientists understand the neural circuits that control socially contagious behaviors.

    “Itching is highly contagious,” said principal investigator Zhou-Feng Chen, PhD, director of the Washington University Center for the Study of Itch. “Sometimes even mentioning itching will make someone scratch. Many people thought it was all in the mind, but our experiments show it is a hardwired behavior and is not a form of empathy.”

    For this study, Chen’s team put a mouse in an enclosure with a computer screen. The researchers then played a video that showed another mouse scratching.

    “Within a few seconds, the mouse in the enclosure would start scratching, too,” Chen said. “This was very surprising because mice are known for their poor vision. They use smell and touch to explore areas, so we didn’t know whether a mouse would notice a video. Not only did it see the video, it could tell that the mouse in the video was scratching.”

    Next, the researchers identified a structure called the suprachiasmatic nucleus (SCN), a brain region that controls when animals fall asleep or wake up. The SCN was highly active after the mouse watched the video of the scratching mouse.

    When the mouse saw other mice scratching — in the video and when placed near scratching littermates — the brain’s SCN would release a chemical substance called GRP (gastrin-releasing peptide). In 2007, Chen’s team identified GRP as a key transmitter of itch signals between the skin and the spinal cord.

    “The mouse doesn’t see another mouse scratching and then think it might need to scratch, too,” Chen said. “Instead, its brain begins sending out itch signals using GRP as a messenger.”

    Chen’s team also used various methods to block GRP or the receptor it binds to on neurons. Mice whose GRP or GRP receptor were blocked in the brains’ SCN region did not scratch when they saw others scratch. But they maintained the ability to scratch normally when exposed to itch-inducing substances.

    Chen believes the contagious itch behavior the mice engaged in is something the animals can’t control.

    “It’s an innate behavior and an instinct,” he said. “We’ve been able to show that a single chemical and a single receptor are all that’s necessary to mediate this particular behavior. The next time you scratch or yawn in response to someone else doing it, remember it’s really not a choice or a psychological response; it’s hardwired into your brain.”


    Antibodies can bind to cells in a specific manner – where the Fab portion of the antibody binds to a high-affinity specific target, or the Fc portion of the antibody binds to the FcR on the surface of some cells.

    They can also bind to cells in a nonspecific manner, where the Fab portion binds to a low-affinity, non-specific target. Further, as cells die and membrane integrity is compromised, antibodies can bind non-specifically to intracellular targets.

    So, the question is, how can you identify and control for this observed nonspecific antibody binding? 

    To answer this question, many research groups started using a control known as the isotype control.
    The concept is that an antibody with the same isotype (both heavy and light chain) as the antibody of interest, but targeting a protein not present on the surface of the target cells, is used to label the cells. Cells that showed binding to the isotype control would be excluded, since that binding represents the non-specific binding of the cells.

    Why Isotype Controls Often Fall Short 

    Isotype controls were once the most popular negative control for flow cytometry experiments.


    They are still very often included by some labs, almost abandoned by others, and a subject of confusion for many beginners. What are they, why and when do I need them? Are they of any use at all, or just a waste of money?

    Most importantly, why do reviewers keep asking for them when they review papers containing flow data?

    Isotype controls were classically meant to show what level of nonspecific binding you might have in your experiment. The idea is that there are several ways that an antibody might react in undesirable ways with the surface of the cell.

    Not all of these can be directly addressed by this control (such as cross-reactivity to a similar epitope on a different antigen, or even to a different epitope on the same antigen). What it does do is give you an estimate of non-specific (non-epitope-driven) binding. This can be Fc mediated binding, or completely nonspecific “sticky” cell adhesion.

    In order to be useful, the isotype control should ideally be the same isotype, both in terms of species, heavy chain (IgA, IgG, IgD, IgE, or IgM) and light chain (kappa or lambda) class, the same fluorochrome (PE, APC, etc.), and have the same F:P ratio. F:P is a measurement of how many fluorescent molecules are present on each antibody.

    This, unfortunately, makes the manufacture of ideal isotype controls highly impractical. 

    There is even a case to be made that differences in the amino acid sequence of the variable regions of both the light and heavy chains might result in variable levels of undesirable adherence in isotypes versus your antibody of interest. 
    Moving Beyond Isotype Controls

    Many flow cytometry researchers are no longer using isotype controls, with some suggesting they be left out of almost all experiments.


    If you spend any time browsing the Purdue Cytometry list, you’ll see these same arguments presented in threads about isotype controls. 

    A report in Cytometry A presents options for controls in several categories, the options available, and pros and cons of each option. The report's section on isotype controls summarizes the problems with the use of isotype controls very clearly.

    A second report, in Cytometry B, likewise lays out the control options by category, with the pros and cons of each, and illustrates the differing levels of undesirable binding produced by the same isotype control clone sourced from different manufacturers.

    For example, the report shows how even the same isotype control clone can result in highly variable levels of undesirable staining.

    If you do use isotype controls in your experiment, they must match as many of the following characteristics as possible for your specific antibody — species, isotype, fluorochrome, F:P ratio, and concentration.


    Here are 5 cases against using isotype controls alone...

    1. Isotype controls are not needed for bimodal experiments.

    You don’t need isotype controls for experiments that are clearly bimodal. For example, if you are looking for T cells and B cells in peripheral blood, the negative cells also in the circulation will provide gating confidence.

    In a sample of lysed mouse blood, for example, it is extremely easy to pick out the CD4- and CD8-positive cells.



    2. Isotype controls are not sufficient for post-cultured cells.

    If you are using post-cultured cells, the isotype control might give you some information about the inherent “stickiness” of your cells.

    However, this measurement is not a value you can subtract from your specific antibody sample to determine fluorescence intensity or percent positive.

    Instead, the measurement is simply a qualitative measure of “stickiness” and the effectiveness of Fc-blocking in your protocol.

    3. Isotype controls should not be used as gating controls.

    If you are using multiple dyes in your search, and your concern is positivity by spectral overlap, you will be better served by using a fluorescence-minus-one control (FMO), in which all antibodies are included except the one you suspect is most prone to error from spectral overlap.

    4. Isotype controls should not be used to determine positivity.

    You should absolutely not be using isotype controls to determine positive versus negative cells — or, as mentioned in #3 above, as a gating control.

    5. Isotype controls are not always sufficient for determining non-specific antibody adherence.

    Isotype controls cannot always distinguish non-specific antibody adherence from, for example, free fluorochrome adherence. For this, you need to use isoclonic controls. If you add massive amounts of non-fluorochrome-conjugated monoclonal antibody to your staining reaction, your fluorescence should drop. If it does not, your issue is not nonspecific antibody binding, but free fluorochrome binding.








    Many businesses don’t yet know the answer to that question. But going forward, companies will need to develop greater expertise at valuing their data assets.

































    In 2016, Microsoft Corp. acquired the online professional network LinkedIn Corp. for $26.2 billion. Why did Microsoft consider LinkedIn to be so valuable? And how much of the price paid was for LinkedIn’s user data — as opposed to its other assets? Globally, LinkedIn had 433 million registered users and approximately 100 million active users per month prior to the acquisition. Simple arithmetic tells us that Microsoft paid about $260 per monthly active user.
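That per-user figure is just the headline price divided by the monthly active user count, a back-of-the-envelope calculation worth making explicit:

```python
# The article's simple arithmetic: deal price divided by monthly
# active users at the time of the acquisition.
deal_price_usd = 26.2e9          # Microsoft's price for LinkedIn
monthly_active_users = 100e6     # LinkedIn's approximate MAUs pre-deal

price_per_mau = deal_price_usd / monthly_active_users
print(round(price_per_mau))  # 262, i.e. "about $260 per monthly active user"
```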

    Did Microsoft pay a reasonable price for the LinkedIn user data? Microsoft must have thought so — and LinkedIn agreed. But the deal generated scrutiny from the rating agency Moody’s Investors Service Inc., which conducted a review of Microsoft’s credit rating after the deal was announced. What can be learned from the Microsoft–LinkedIn transaction about the valuation of user data? How can we determine if Microsoft — or any acquirer — paid a reasonable price?

    The answers to these questions are not clear. But the subject is growing increasingly relevant as companies collect and analyze ever more data. Indeed, the multibillion-dollar deal between Microsoft and LinkedIn is just one recent example of data valuation coming to the fore. Another example occurred during the Chapter 11 bankruptcy proceedings of Caesars Entertainment Operating Co., Inc., a subsidiary of the casino gaming company Caesars Entertainment Corp. One area of conflict was the data in Caesars’ Total Rewards customer loyalty program; some creditors argued that the Total Rewards program data was worth $1 billion, making it, according to a Wall Street Journal article, “the most valuable asset in the bitter bankruptcy feud at Caesars Entertainment Corp.” A 2016 report by a bankruptcy court examiner on the case noted instances where sold-off Caesars properties — having lost access to the customer analytics in the Total Rewards database — suffered a decline in earnings. But the report also observed that it might be difficult to sell the Total Rewards system to incorporate it into another company’s loyalty program. Although the Total Rewards system was Caesars’ most valuable asset, its value to an outside party was an open question.

    As these examples illustrate, there is no formula for placing a precise price tag on data. But in both of these cases, there were parties who believed the data to be worth hundreds of millions of dollars.

    Exploring Data Valuation

    To research data valuation, we conducted interviews and collected secondary data on information activities in 36 companies and nonprofit organizations in North America and Europe. Most had annual revenues greater than $1 billion. They represented a wide range of industry sectors, including retail, health care, entertainment, manufacturing, transportation, and government.

    Although our focus was on data value, we found that most of the organizations in our study were focused instead on the challenges of storing, protecting, accessing, and analyzing massive amounts of data — efforts for which the information technology (IT) function is primarily responsible.

    While the IT functions were highly effective in storing and protecting data, they alone cannot make the key decisions that transform data into business value. Our study lens, therefore, quickly expanded to include chief financial and marketing officers and, in the case of regulatory compliance, legal officers. Because the majority of the companies in our study did not have formal data valuation practices, we adjusted our methodology to focus on significant business events triggering the need for data valuation, such as mergers and acquisitions, bankruptcy filings, or acquisitions and sales of data assets. Rather than studying data value in the abstract, we looked at events that triggered the need for such valuation and that could be compared across organizations.

    All the companies we studied were awash in data, and the volume of their stored data was growing on average by 40% per year. We expected this explosion of data would place pressure on management to know which data was most valuable. However, the majority of companies reported they had no formal data valuation policies in place. A few identified classification efforts that included value assessments. These efforts were time-consuming and complex. For example, one large financial group had a team working on a significant data classification effort that included the categories “critical,” “important,” and “other.” Data was categorized as “other” when the value was judged to be context-specific. The team’s goal was to classify hundreds of terabytes of data; after nine months, they had worked through less than 20.

    The difficulty that this particular financial group encountered is typical. Valuing data can be complex and highly context-dependent. Value may be based on multiple attributes, including usage type and frequency, content, age, author, history, reputation, creation cost, revenue potential, security requirements, and legal importance. Data value may change over time in response to new priorities, litigation, or regulations. These factors are all relevant and difficult to quantify.

    A Framework for Valuing Data

    How, then, should companies formalize data valuation practices? Based on our research, we define data value as the composite of three sources of value: (1) the asset, or stock, value; (2) the activity value; and (3) the expected, or future, value. Here’s a breakdown of each value source:

    1. Data as Strategic Asset

    For most companies, monetizing data assets means looking at the value of customer data. This is not a new concept; the idea of monetizing customer data is as old as grocery store loyalty cards. Customer data can generate monetary value directly (when the data is sold, traded, or acquired) or indirectly (when a new product or service leveraging customer data is created, but the data itself is not sold). Companies can also combine publicly available and proprietary data to create unique data sets for sale or use.

    How big is the market opportunity for data monetization? In a word: big. The Strategy& unit of PwC has estimated that, in the financial sector alone, the revenue from commercializing data will grow to $300 billion per year by 2018.

    2. The Value of Data in Use

    Data use is typically defined by the application — such as a customer relationship management system or general ledger — and by the frequency of use, which is in turn a function of the application workload, the transaction rate, and the rate of data access.

    The frequency of data usage brings up an interesting aspect of data value. Conventional, tangible assets generally exhibit decreasing returns to use. That is, they decrease in value the more they are used. But data has the potential — not always, but often — to increase in value the more it is used. That is, data viewed as an asset can exhibit increasing returns to use. For example, Google Inc.’s Waze navigation and traffic application integrates real-time crowdsourced data from drivers, so the Waze mapping data becomes more valuable as more people use it.
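The contrast drawn above between decreasing and increasing returns to use can be sketched with two stylized value functions. The functional forms, rates, and figures below are illustrative assumptions, not measurements from the study:

```python
# Illustrative sketch: a conventional tangible asset loses value with use,
# while crowdsourced data (as in the Waze example) can gain value with use.
# The wear rate and contribution factor are invented for illustration.

def tangible_asset_value(initial_value: float, uses: int, wear_rate: float = 0.05) -> float:
    """A conventional asset depreciates with use (decreasing returns to use)."""
    return initial_value * (1 - wear_rate) ** uses

def crowdsourced_data_value(base_value: float, contributors: int) -> float:
    """Crowdsourced data grows more valuable as more people use and
    contribute to it (increasing returns to use)."""
    # Each contributor adds observations that refine the shared data set.
    return base_value * (1 + 0.1 * contributors)

truck = [tangible_asset_value(100.0, n) for n in (0, 10, 20)]
maps = [crowdsourced_data_value(100.0, n) for n in (0, 10, 20)]

print(truck)  # value falls with use
print(maps)   # value rises with use
```

The point is directional only: a truck loses value with every mile, while a crowdsourced map gains value with every contributor.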

    The major costs of data are in its capture, storage, and maintenance. The marginal costs of using it can be almost negligible. An additional factor is time of use: The right data at the right time — for example, transaction data collected during the Christmas retail sales season — may be of very high value.

    Of course, usage-based definitions of value are two-sided; the value attached to each side of the activity is unlikely to be the same. For example, for a traveler lost in an unfamiliar city, mapping data sent to the traveler’s cellphone may be of very high value for one use, but the traveler may never need that exact data again. On the other hand, the data provider may keep the data for other purposes — and use it over and over again — for a very long time.

    3. The Expected Future Value of Data

    Although the phrases “digital assets” or “data assets” are commonly used, there is no generally accepted definition of how these assets should be counted on balance sheets. In fact, if data assets are tracked and accounted for at all — a big “if” — they are typically commingled with other intangible assets, such as trademarks, patents, copyrights, and goodwill. There are a number of approaches to valuing intangible assets. For example, intangible assets can be valued on the basis of observable market-based transactions involving similar assets; on the income they produce or cash flow they generate through savings; or on the cost incurred to develop or replace them.
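The three valuation approaches named above — market-based transactions, income or cash flow, and development or replacement cost — can be sketched as simple functions. All figures, the discount rate, and the function names are hypothetical illustrations:

```python
# Illustrative sketch of the three intangible-asset valuation approaches.
# All inputs (comparable prices, cash flows, discount rate, costs) are
# hypothetical numbers chosen for the example.

def market_approach(comparable_sale_prices: list) -> float:
    """Value by reference to observed transactions in similar assets."""
    return sum(comparable_sale_prices) / len(comparable_sale_prices)

def income_approach(annual_cash_flows: list, discount_rate: float) -> float:
    """Value as the present value of the income or savings the asset generates."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(annual_cash_flows, start=1))

def cost_approach(development_cost: float, replacement_premium: float = 0.0) -> float:
    """Value as the cost incurred to develop or replace the asset."""
    return development_cost * (1 + replacement_premium)

print(market_approach([1.2e6, 0.9e6, 1.05e6]))         # 1050000.0
print(round(income_approach([200e3] * 5, 0.10)))       # 758157
print(cost_approach(600e3, replacement_premium=0.25))  # 750000.0
```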

    What Can Companies Do?

    No matter which path a company chooses to embed data valuation into company-wide strategies, our research uncovered three practical steps that all companies can take.

    1. Make valuation policies explicit and sharable across the company. It is critical to develop company-wide policies in this area. For example, is your company creating a data catalog so that all data assets are known? Are you tracking the usage of data assets, much like a company tracks the mileage on the cars or trucks it owns? Making implicit data policies explicit, codified, and sharable across the company is a first step in prioritizing data value.

    A few companies in our sample were beginning to manually classify selected data sets by value. In one case, the triggering event was an internal security audit to assess data risk. In another, the triggering event was a desire to assess where in the organization the volume of data was growing rapidly and to examine closely the costs and value of that growth.

    The strongest business case we found for data valuation was in the acquisition, sale, or divestiture of business units with significant data assets. We anticipate that in the future, some of the evolving responsibilities of chief data officers may include valuing company data for these purposes. But that role is too new for us to discern any aggregate trends at this time.
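The data catalog and “mileage” tracking suggested in step 1 might be sketched minimally as follows; the entry fields and names are hypothetical, not a prescribed schema:

```python
# Minimal sketch of a data catalog with usage tracking, in the spirit of
# step 1's "mileage" analogy. Field names and the example entry are invented.

from dataclasses import dataclass

@dataclass
class CatalogEntry:
    name: str
    owner: str
    classification: str   # e.g., "critical", "important", "other"
    access_count: int = 0 # the "mileage" on the data asset

    def record_access(self) -> None:
        self.access_count += 1

catalog = {}

def register(entry: CatalogEntry) -> None:
    """Add a data asset to the company-wide catalog so it is known."""
    catalog[entry.name] = entry

register(CatalogEntry("customer_transactions", "finance", "critical"))
catalog["customer_transactions"].record_access()
catalog["customer_transactions"].record_access()

print(catalog["customer_transactions"].access_count)  # 2
```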

    2. Build in-house data valuation expertise. Our study found that several companies were exploring ways to monetize data assets for sale or licensing to third parties. However, having data to sell is not the same thing as knowing how to sell it. Several of the companies relied on outside experts, rather than in-house expertise, to value their data. We anticipate this will change. Companies seeking to monetize their data assets will first need to address how to acquire and develop valuation expertise in their own organizations.

    3. Decide whether top-down or bottom-up valuation processes are the most effective within the company. In the top-down approach to valuing data, companies identify their critical applications and assign a value to the data used in those applications, whether they are a mainframe transaction system, a customer relationship management system, or a product development system. Key steps include defining the main system linkages — that is, the systems that feed other systems — associating the data accessed by all linked systems, and measuring the data activity within the linked systems. This approach has the benefit of prioritizing where internal partnerships between IT and business units need to be built, if they are not already in place.
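As a rough illustration of the top-down approach, the “systems that feed other systems” linkage can be modeled as a graph, with data activity rolled up through the linked systems. System names and activity figures below are invented for the sketch:

```python
# Illustrative sketch of the top-down approach: map which systems feed
# which, then roll up measured data activity through the linkages.
# System names and activity figures are hypothetical.

# "Feeds" graph: each system lists the downstream systems it sends data to.
feeds = {
    "mainframe_txn": ["crm", "general_ledger"],
    "crm": ["analytics"],
    "general_ledger": [],
    "analytics": [],
}

# Directly measured data activity per system (e.g., accesses per day).
activity = {"mainframe_txn": 500, "crm": 300, "general_ledger": 120, "analytics": 80}

def rolled_up_activity(system: str) -> int:
    """Total activity of a system plus every downstream system it feeds.
    Assumes the feeds graph is acyclic."""
    total = activity[system]
    for downstream in feeds[system]:
        total += rolled_up_activity(downstream)
    return total

print(rolled_up_activity("mainframe_txn"))  # 500 + (300 + 80) + 120 = 1000
```

Ranking systems by rolled-up activity is one way to see where IT–business-unit partnerships would matter most.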

    A second approach is to define data value heuristically — in effect, working up from a map of data usage across the core data sets in the company. Key steps in this approach include assessing data flows and linkages across data and applications, and producing a detailed analysis of data usage patterns. Companies may already have much of the required information in data storage devices and distributed systems.
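The bottom-up approach can likewise be sketched as a heuristic over a usage log; the log entries and the frequency-times-breadth scoring rule are assumptions made for illustration:

```python
# Illustrative sketch of the bottom-up approach: derive a heuristic value
# score from a log of data accesses. Log entries and the scoring rule are
# hypothetical.

from collections import Counter

# Access log as (dataset, application) pairs, e.g., from storage telemetry.
access_log = [
    ("orders", "crm"), ("orders", "analytics"), ("orders", "crm"),
    ("hr_records", "payroll"),
    ("clickstream", "analytics"), ("clickstream", "analytics"),
]

usage = Counter(dataset for dataset, _ in access_log)
breadth = {ds: len({app for d, app in access_log if d == ds}) for ds in usage}

# Heuristic score: frequency of use weighted by how many applications
# depend on the data set (the weighting is an assumption for illustration).
scores = {ds: usage[ds] * breadth[ds] for ds in usage}

print(sorted(scores.items(), key=lambda kv: -kv[1]))
# "orders" ranks highest: used 3 times across 2 applications, score 6
```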

    Whichever approach is taken, the first step is to identify the business and technology events that trigger the business’s need for valuation. A needs-based approach will help senior management prioritize and drive valuation strategies, moving the company forward in monetizing the current and future value of its digital assets.

    Reproduced from MIT Sloan Management Review

    Digital technology, despite its seeming ubiquity, has only begun to penetrate industries. As it continues its advance, the implications for revenues, profits, and opportunities will be dramatic.

    As new markets emerge, profit pools shift, and digital technologies pervade more of everyday life, it’s easy to assume that the economy’s digitization is already far advanced. According to our latest research, however, the forces of digital have yet to become fully mainstream. On average, industries are less than 40 percent digitized, despite the relatively deep penetration of these technologies in media, retail, and high tech.

    As digitization penetrates more fully, it will dampen revenue and profit growth for some, particularly the bottom quartile of companies, according to our research, while the top quartile captures disproportionate gains. Bold, tightly integrated digital strategies will be the biggest differentiator between companies that win and companies that don’t, and the biggest payouts will go to those that initiate digital disruptions. Fast-followers with operational excellence and superior organizational health won’t be far behind.

    The case for digital reinvention 


    These findings emerged from a research effort to understand the nature, extent, and top-management implications of the progress of digitization. We tailored our efforts to examine its effects along multiple dimensions: products and services, marketing and distribution channels, business processes, supply chains, and new entrants at the ecosystem level (for details, see sidebar “About the research”). We sought to understand how economic performance will change as digitization continues its advance along these different dimensions. What are the best-performing companies doing in the face of rising pressure? Which approach is more important as digitization progresses: a great strategy with average execution or an average strategy with great execution?

    The research-survey findings, taken together, amount to a clear mandate to act decisively, whether through the creation of new digital businesses or by reinventing the core of today’s strategic, operational, and organizational approaches.

    More digitization—and performance pressure—ahead

    According to our research, digitization has only begun to transform many industries (Exhibit 1). Its impact on the economic performance of companies, while already significant, is far from complete.

    This finding confirms what many executives may already suspect: by reducing economic friction, digitization enables competition that pressures revenue and profit growth. Current levels of digitization have already taken out, on average, up to six points of annual revenue growth and 4.5 points of growth in earnings before interest and taxes (EBIT). And there’s more pressure ahead, our research suggests, as digital penetration deepens (Exhibit 2).





    While the prospect of declining growth rates is hardly encouraging, executives should bear in mind that these are average declines across all industries. Beyond the averages, we find that performance is distributed unequally, as digital further separates the high performers from the also-rans. This finding is consistent with a separate McKinsey research stream, which also shows that economic performance is extremely unequal. Strongly performing industries, according to that research, are three times more likely than others to generate market-beating economic profit. Poorly performing companies probably won’t thrive no matter which industry they compete in.

    At the current level of digitization, median companies, which secure three additional points of revenue and EBIT growth, do better than average ones, presumably because the long tail of companies hit hard by digitization pulls down the mean. But our survey results suggest that as digital increases economic pressure, all companies, no matter what their position on the performance curve may be, will be affected.

    Uneven returns on investment

    That economic pressure will make it increasingly critical for executives to pay careful heed to where—and not just how—they compete and to monitor closely the return on their digital investments. So far, the results are uneven. Exhibit 3 shows returns distributed unequally: some players in every industry are earning outsized returns, while many others in the same industries are experiencing returns below the cost of capital. 





    These findings suggest that some companies are investing in the wrong places or investing too much (or too little) in the right ones—or simply that their returns on digital investments are being competed away or transferred to consumers. On the other hand, the fact that high performers exist in every industry (as we’ll discuss further in a moment) indicates that some companies are getting it right—benefiting, for example, from cross-industry transfers, as when technology companies capture value in the media sector.

    Where to make your digital investments

    Improving the ROI of digital investments requires precise targeting along the dimensions where digitization is proceeding. Digital has widely expanded the number of available investment options, and simply spreading the same amount of resources across them is a losing proposition. In our research, we measured five separate dimensions of digitization’s advance into industries: products and services, marketing and distribution channels, business processes, supply chains, and new entrants acting in ecosystems.

    How fully each of these dimensions has advanced, and the actions companies are taking in response, differ according to the dimension in question. And there appear to be mismatches between opportunities and investments. Those mismatches reflect advancing digitization’s uneven effect on revenue and profit growth, because of differences among dimensions as well as among industries. Exhibit 4 describes the rate of change in revenue and EBIT growth that appears to be occurring as industries progress toward full digitization. This picture, combining the data for all of the industries we studied, reveals that today’s average level of digitization, shown by the dotted vertical line, differs for each dimension. Products and services are more digitized, supply chains less so. 




    To model the potential effects of full digitization on economic performance, we linked the revenue and EBIT growth of companies to a given dimension’s digitization rate, leaving everything else equal. The results confirm that digitization’s effects depend on where you look. Some dimensions take a bigger bite out of revenue and profit growth, while others are digitizing faster. This makes intuitive sense. As platforms transform industry ecosystems, for example, revenues grow—even as platform-based competitors put pressure on profits. As companies digitize business processes, profits increase, even though little momentum in top-line growth accompanies them.

    The biggest future impact on revenue and EBIT growth, as Exhibit 4 shows, is set to occur through the digitization of supply chains. In this dimension, full digitization contributes two-thirds (6.8 percentage points of 10.2 percent) of the total projected hit to annual revenue growth and more than 75 percent (9.4 out of 12 percent) to annual EBIT growth.

    Despite the supply chain’s potential impact on the growth of revenues and profits, survey respondents say that their companies aren’t yet investing heavily in this dimension. Only 2 percent, in fact, report that supply chains are the focus of their forward-looking digital strategies (Exhibit 5), though headlining examples such as Airbnb and Uber demonstrate the power of tapping previously inaccessible sources of supply (sharing rooms or rides, respectively) and bringing them to market. Similarly, there is little investment in the ecosystems dimension, where hyperscale businesses such as Alibaba, Amazon, Google, and Tencent are pushing digitization most radically, often entering one industry and leveraging platforms to create collateral damage in others.

    Instead, the survey indicates that distribution channels and marketing are the primary focus of digital strategies (and thus investments) at 49 percent of companies. That focus is sensible, given the extraordinary impact digitization has already had on customer interactions and the power of digital tools to target marketing investments precisely. By now, in fact, this critical dimension has become “table stakes” for staying in the game. Standing pat is not an option.

    The question, it seems, looking at exhibits 4 and 5 in combination, is whether companies are overlooking emerging opportunities, such as those in supply chains, that are likely to have a major influence on future revenues and profits. That may call for resource reallocation. In general, companies that strategically shift resources create more value and deliver higher returns to shareholders. This general finding could be even more true as digitization progresses.

    Our survey results also suggest companies are not sufficiently bold in the magnitude and scope of their investments (see sidebar “Structuring your digital reinvention”). Our research (Exhibit 6) suggests that the more aggressively they respond to the digitization of their industries—up to and including initiating digital disruption—the better the effect on their projected revenue and profit growth. The one exception is the ecosystem dimension: an overactive response to new hyperscale competitors actually lowers projected growth, perhaps because many incumbents lack the assets and capabilities necessary for platform strategies.

    As executives assess the scope of their investments, they should ask themselves if they have taken only a few steps forward in a given dimension—by digitizing their existing customer touchpoints, say. Others might find that they have acted more significantly by digitizing nearly all of their business processes and introducing new ones, where needed, to connect suppliers and users.

    To that end, it may be useful to take a closer look at Exhibit 6, which comprises six smaller charts. The last of them totals up actions companies take in each dimension of digitization. Here we can see that the most assertive players will be able to restore more than 11 percent of the 12 percent loss in projected revenue growth, as well as 7.3 percent of the 10.4 percent reduction in profit growth. Such results will require action across all dimensions, not just one or two—a tall order for any management team, even those at today’s digital leaders.

    Looking at the digital winners

    To understand what today’s leaders are doing, we identified the companies in our survey that achieved top-quartile rankings in each of three measures: revenue growth, EBIT growth, and return on digital investment.

    We found that more than twice as many leading companies closely tie their digital and corporate strategies than don’t. What’s more, winners tend to respond to digitization by changing their corporate strategies significantly. This makes intuitive sense: many digital disruptions require fundamental changes to business models. Further, 49 percent of leading companies are investing in digital more than their counterparts do, compared with only 5 percent of the laggards, 90 percent of which invest less than their counterparts. It’s unclear which way the causation runs, of course, but it does appear that heavy digital investment is a differentiator.

    Leading companies not only invested more but also did so across all of the dimensions we studied. In other words, winners exceed laggards in both the magnitude and the scope of their digital investments (Exhibit 7). This is a critical element of success, given the different rates at which these dimensions are digitizing and their varying effect on economic performance. 

    Strengths in organizational culture underpin these bolder actions. Winners were less likely to be hindered by siloed mind-sets and behavior or by a fragmented view of their customers. A strong organizational culture is important for several reasons: it enhances the ability to perceive digital threats and opportunities, bolsters the scope of actions companies can take in response to digitization, and supports the coordinated execution of those actions across functions, departments, and business units.

    Bold strategies win

    So we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more, and more broadly and boldly, than other companies do. Then we tested two paths to growth as industries reach full digitization.

    The first path emphasizes strategies that change a business’s scope, including the kind of pure-play disruptions the hyperscale businesses discussed earlier generate. As Exhibit 8 shows, a great strategy can by itself retrieve all of the revenue growth lost, on average, to full digitization—at least in the aggregate industry view. Combining this kind of superior strategy with median performance in the nonstrategy dimensions of McKinsey’s digital-quotient framework—including agile operations, organization, culture, and talent—yields total projected growth of 4.3 percent in annual revenues. (For more about how we arrived at these conclusions, see sidebar “About the research.”)

    Most executives would fancy the kind of ecosystem play that Alibaba, Amazon, Google, and Tencent have made on their respective platforms. Yet many recognize that few companies can mount disruptive strategies, at least at the ecosystem level. With that in mind, we tested a second path to revenue growth (Exhibit 9).

    In the quest for coherent responses to a digitizing world, companies must assess how far digitization has progressed along multiple dimensions in their industries and the impact that this evolution is having—and will have—on economic performance. And they must act on each of these dimensions with bold, tightly integrated strategies. Only then will their investments match the context in which they compete.
