Channel Description:

Best content from the best sources, handpicked by Shyam. Sources include Harvard University, MIT, McKinsey & Co, Wharton, Stanford, and other top educational institutions. Domains include cybersecurity, machine learning, deep learning, big data, education, information technology, management, and others.



    Cybersecurity has become an 'inevitable essential' for organisations operating in cyberspace. However, to what degree it needs to be used, and to what degree it needs to be scaled up, is a strategic question that needs to be addressed.

    The customised cybersecurity model required for every business differs in scale and intensity and has to be addressed by cybersecurity experts.

    Image credit: Shyam's Imagination Library

    Protecting critical and sensitive information is of paramount importance in business and government, but plans must be in place to handle inevitable breaches too.
    Cybersecurity has become one of the biggest priorities for businesses and governments, as practically all of life migrates its way to data centers and the cloud. In this episode of the McKinsey Podcast, recorded at the Yale Cyber Leadership Forum in March, Sam Palmisano, chairman of the Center for Global Enterprise and the retired chairman and CEO of IBM, and Nathaniel Gleicher, head of cybersecurity strategy at data-and-cloud-security company Illumio, speak with McKinsey about how governments and companies can vastly improve their cyberprotections.

    Podcast transcript 

    Roberta Fusaro: Cybersecurity has become one of the biggest priorities for businesses and governments, as practically all of life migrates its way to data centers and the cloud. In this episode of the McKinsey Podcast, recorded at the Yale Cyber Leadership Forum in March, we catch up with two leading thinkers on security issues. Sam Palmisano is the retired chairman and CEO of IBM, who served as vice chair of the US Commission on Enhancing National Cybersecurity. Nathaniel Gleicher is the head of cybersecurity strategy at Illumio, a data and cloud security company.

    First up from the forum is Sam Palmisano, who, in this wide-ranging conversation with McKinsey’s Marc Sorel, makes the case that strong cybersecurity programs are critical for improved innovation and economic growth.

    Roberta Fusaro: Sam, thank you for joining us today. I want to talk a little bit about your work on the Commission on Enhancing National Cybersecurity. What was the original mandate?

    What was the process by which you came up with your findings? And what were some of the most surprising results?

    Sam Palmisano: Thank you, Roberta. The thing was that President Obama had reached the conclusion that the digital economy or the Internet is so fundamental now to economic growth and society that something needed to be done to make some recommendations to enhance it or strategically position it for the future.

    A great example is the Internet of Things, because it’s no longer just phones and desktop computers. It’s everything in life. It’s self-driving cars, it’s thermostats, it’s music players, it’s cameras.

    Now you take this infrastructure and you’re making billions of things that are computers, which are smart devices. But that’s what they are, they’re chips with software with all the vulnerabilities, unless you design for security from the beginning. And you’ve taken this problem and you’ve put it on steroids.

    The complexity there is one of getting consensus to go fast and address the issues prior to billions of things being out there that aren’t secure, which is the path we’re headed down.

    Marc Sorel: How do you think about what the private sector, and to some extent the social sector, need to do now to be part of that?

    Sam Palmisano: We need to form a private-public collaboration. The reason is that the government doesn’t have the skills to do this themselves. We spent nine months crawling through their statements of skill. They can argue all they want; they don’t. That doesn’t mean that elements of government don’t have some skill. But if you take the intelligence agencies out of this discussion and get to the commercial side, it doesn’t have the capability. They need the capability, so they have to form a partnership. The skills exist in the academic community, in the research universities, and in the technology community.

    Marc Sorel: Did you all as a commission see a model in the market today for what that collaboration could look like?

    Sam Palmisano: There are established entities within government that are a combination of academic, private sector, government, and technical. A lot of the technical communities come together.

    General Keith Alexander ran the Cyber Command Center. There were probably 20 of us that met once a quarter for five, six years. The same guys that were running IBM, Google, Dell, Microsoft, HP, and Verizon, plus all the government appropriate people would meet quarterly. The technical people would meet even more often to tackle some of these issues, and it was self-funding. We solved problems just by pitching in because it was in the best interest of everyone to solve some of these issues, and in the best interest of the industry because you wanted to expand and grow.

    To really do this, though, this was going to require funding. To solve the problem we’re talking about, it’s going to require some amount of money and research, like a DARPA or related fund, pick something like that as the funding source that government can coordinate, and then convene this body. Then do the work as we suggest. Now, the work is going to get complicated. Because there are two pieces to it. One is, let’s say for example, to come up with a standard for the Internet of Things that you would put in this device, this object. Then within that object, you’d have this standard. Then you’d also have a nutrition label on the standard. We called it the Cyber Star. It’s like the health seal that says, “OK, if you’re the manufacturer and you’ve complied with these standards, you get the star.” You get the Cyber Star.

    There were also guys that recommended a thing called secure, they call it clean pipes. With clean pipes, there are a lot of policy implications, a lot of criminal-justice-systems implications. But technically, you could create a clean path and you could have a secure path, and you could argue for certain areas where life is threatened.

    In the autonomous vehicles or drones or things where people could actually be seriously injured or die, you’d want a secure, clean path. You don’t want this on the open Internet.

    Marc Sorel: So you’re talking about creating a separate secure environment for these privileged parts of the ecosystem.

    Sam Palmisano: Right. Think of it as a commercial virtual private network but beyond that. Put that on steroids from an encryption and security perspective. For all these Internet of Things devices. Health, heart monitor, things you’re putting in your body. Pacemakers, et cetera. Defibrillators. Those kinds of things. Not Fitbits that you wear on your wrist, but serious things that could do serious harm like stop your heart. You want to have that information flowing in a secure way. In an encrypted, secure way. That doesn’t mean everything should be that. If you’re sharing your photos with friends, I don’t think you need that level or cost associated with those kinds of technologies.

    Marc Sorel: You’re basically saying at some level, there should be a tiering of Internets to acknowledge the degree of security required for different pieces of the ecosystem to communicate.

    Sam Palmisano: That is a solution to the problem. Now you have to make it commercially viable, which gets you into things like net neutrality. But if you were to technically solve the problem, you would begin to architect portions of the Internet. You can’t go recreate the past. It’s just too old, it’s too cobbled together. Let that be what it is.

    But anything that’s life-threatening or takes down the infrastructure or the world economy. Let’s just start there. The premise or the assumption is that you can’t solve this in the Internet as it exists today. It’s just too complicated. It’s too convoluted. It’s too open by design. That’s why it was so successful, because it was an open architecture. We had all these debates among all of the technical guys. And we said, “Look. We used to do this 40 years ago.” ATMs never got hacked. Money didn’t start spitting out on the curb because it was a secure connection. It was a proprietary network. We know how to do it technically.

    But there are people that did these things for years. We’ve moved onto an open innovative system which is terrific because it drives innovation at a much more rapid pace. It also gives people more economic opportunity to participate. That’s a big plus. But in certain areas where you’re dealing with, let’s say, major societal issues, we ought to go back to some of the classical approaches to how you design the systems.

    Roberta Fusaro: Most people today would say, “If I had to place a bet on who’s going to gain ground on whom and put space between themselves, it’s the attackers that are going to continue to distance themselves from the defenders in terms of capability.” Do you agree with that?

    Sam Palmisano: Eighty percent of the cybersecurity issues that have occurred in the commercial world are internal process and people. It’s not the disgruntled employees who got fired and therefore they gave somebody their access codes. It’s also people who didn’t protect their access codes or they tape it to their computer. Or they leave it in the top drawer of their desk, and the cleaning people can go get the stuff. You would get rid of half of your problems as an enterprise if you just train your folks and put controls in place.

    It’s a combination of monitoring, process training, audit people. Did you follow the process? So there’s an accountability in the system. That’ll clean up a lot of the stuff in the commercial world. Password authentication and end points. If the civilian side of government, .gov, did those things, they would clean up probably 95 percent of their problems and save a ton of money, too.

    We also talked about this idea, which never got traction in the commission report, but we thought it was a good idea where you basically would create a national ID like a credit bureau. You could create this national ID foundry where you get your birth certificate. You also get your digital identity at birth, and that digital identity is secure and protected. Now, you can modify for simple things—sharing your photos on the Internet—or you can modify it for very sophisticated things like financial transactions, your health information.

    Marc Sorel: Why didn’t it catch on?

    Sam Palmisano: In the commission itself?

    Marc Sorel: Yeah.

    Sam Palmisano: What we did was say that further studies should take place, and we recommended that Treasury further look at creating this kind of an entity. We also looked at commercial insurance, and its purpose.

    The purpose of commercial insurance was that if you agreed on the standards, and therefore you complied with those standards, you should be able to get higher liability coverage at a lower rate than somebody who didn’t.

    Our view was that would drive up the adoption rate because people are going to want to find an insurance policy for cyber. That’s going to happen. How do you get these companies to make the investments to move up the risk-protection curve? Well, you make it to their advantage by having insurance that says, “We could audit those standards. And if you’ve complied with those standards, like burglar alarm systems or fire alarms in your home, you’re going to get higher liability coverage at a lower rate.” That’s to make it an economic-based system versus a government-mandated system.

    The commission was very biased toward private-sector solutions versus government-mandated solutions. You need a private sector or an economically driven set of motivations to solve the problem.

    Roberta Fusaro: This has been a fascinating conversation. Thank you, Sam, for taking the time to be with us today.

    Sam Palmisano: Oh, thank you. It was great being with you.

    Next up from the forum is Nathaniel Gleicher, who describes how businesses can learn a lot from the model of protection used by the US Secret Service.

    Roberta Fusaro: Welcome, Nathaniel. Thank you for joining us today for the McKinsey Podcast.

    Nathaniel Gleicher: No problem. Glad that I could join.

    Roberta Fusaro: Your company has been providing cyber options for four or five years now, and I’m wondering how you’ve seen the market change over that time in terms of what customers are looking for or technologies that have emerged.

    Nathaniel Gleicher: There used to be a perception that cybersecurity was black magic, particularly outside of the technical community, and that outside of that community, people would sort of say, “I don’t understand this. Just make it work.” As long as you don’t hear anything, no news is good news. The increasing scope and scale of breaches and the degree to which organizations are moving into these exposed environments has changed that. If you look at business leaders, I think they are focused on how do you quantify the risks that you face, and how do you measure the benefit that you’re getting from the solutions you invest in? It’s a much more quantification-driven industry than it used to be. I don’t know that we’re very good at quantification yet. But the desire to quantify is an important change.

    Roberta Fusaro: Apart from quantification, are there other hot topics in cyber that you’re seeing or managing right now?

    Nathaniel Gleicher: Sometimes I think we do cybersecurity like fourth graders play soccer. Chase the ball across the field, the whole group runs. There are always hot topics. What’s interesting to me is that we’ve known for a while there are a few steps that if you took them, environments would be much more secure.

    Think about encrypting data, using strong passwords, white-listing your applications, segmenting your environment, and patching your vulnerabilities. People generally haven’t done these things because it’s been hard to figure out how to do them at scale across large organizations.

    One of the biggest challenges that we face in cybersecurity today is that we don’t really have a single, coherent strategic model to describe how to protect an environment. There are a lot of tactical models, so if you look at the SANS top 20, if you look at NIST, if you look at some of these other frameworks, they will tell you, you should be investing in encryption. You should be investing in segmentation. You should be investing in certain kinds of detection. They’ll tell you all the tools you should use and you can think about how to line them up, but it’s very tactical. It’s hard to find a model that lets you pull back and think about the threat as a whole.
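
    The default-deny idea behind white-listing and segmentation can be sketched in a few lines: allow only explicitly listed flows, and treat everything else as a policy violation. This is a hypothetical illustration, not any particular vendor's API; the tiers, ports, and flows below are invented for the example.

```python
# Hypothetical sketch of a default-deny ("allow-list") segmentation policy.
# All tier names, ports, and flows are invented for illustration.

ALLOWED_FLOWS = {
    # (source tier, destination tier, port)
    ("web", "app", 8443),
    ("app", "db", 5432),
    ("ops", "app", 22),
}

def is_permitted(src: str, dst: str, port: int) -> bool:
    """Default-deny: a flow is allowed only if it is explicitly listed."""
    return (src, dst, port) in ALLOWED_FLOWS

observed = [
    ("web", "app", 8443),   # expected traffic
    ("web", "db", 5432),    # web tier talking straight to the database: flag it
]

violations = [flow for flow in observed if not is_permitted(*flow)]
for src, dst, port in violations:
    print(f"policy violation: {src} -> {dst}:{port}")
```

    The point of the sketch is the strategic inversion Gleicher describes: instead of enumerating everything bad, you enumerate the small set of things that should happen, and anything outside it is, by construction, worth looking at.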

    I’m starting to see groups of companies trying to solve that problem, trying to think, how do you do these steps that don’t seem all that sexy, but that actually drive to security.

    Roberta Fusaro: What are some of the potential remedies?

    Nathaniel Gleicher: If you look at security disciplines through the ages, whether it’s law enforcement, executive protection, physical security for locations, military security, any of these sort of well-built disciplines, the foundation of every security discipline is understanding the environment you’re protecting and exerting control over that environment.

    In cybersecurity, we are not good at understanding the environment we’re defending. Most organizations don’t understand the network. They don’t understand what’s connected and what’s communicating with what. Because of that, they have relatively few options to control that environment. I mentioned before a few simple things people could do to strengthen their environment. Those are all about control, and what I mean by control, people often think there’s prevention, keeping the bad guys out, and then there’s detection and response, catching them once they get in.

    Those are both important components. In general today, people would tell you, you can’t invest all in one or the other, that prevention by itself isn’t enough. People are going to get in. What people miss in that debate is the reason detection and response works is because you understand your environment, and you control it.

    If you don’t know where your high-value assets are, and if you don’t know what connects to them, how someone would access them, it’s incredibly hard to know what you need to protect. If you don’t have the resources to control that, you’re defending an open field. So you have hundreds and hundreds of paths you need to defend, potential connections you need to worry about, and the attacker gets to move first. On the flip side, if you invest to understand your environment first, and control your environment first, it actually makes detection and response better.

    Roberta Fusaro: What are some ways to identify the crown jewels, the things that really do matter? I can imagine that that could be an incredibly difficult task, given all the assets that companies manage.

    Nathaniel Gleicher: It’s different for every organization, to some degree, but it’s about understanding business risk. The question is, what are the assets that I defend, or that my business relies on, such that if they were exposed or compromised, it would fundamentally harm the way I do business?

    Whether that’s health care data about your customers, or customer information, whether that’s the systems on which your business runs, whether that’s the exchanges across which you connect, every business has a different set of factors they need to judge. But often, if you think in terms of business risk, we’re pretty good at figuring that out because businesses have been measuring and concerned about risk for quite some time. It’s just a question of translating that and understanding the technical implications.

    A model that I like to use when I think about this is the way the Secret Service protects the president. The president is a lot like a high-value asset in a data center, in that he’s very valuable, very targeted, and also very exposed. The Secret Service doesn’t get to take the president, put him in a box somewhere, and have him not talk to anyone. He’s constantly talking to people, so the job is really about managing risk, which is similar to the way we’re protecting assets in the data center.

    When the Secret Service is protecting the president, if you imagine the president speaking in an auditorium, the Secret Service shows up months before the president is going to be there. The first thing they do is they map the auditorium to understand that if the president’s going to be here, speaking on this stage, here are all the attack vectors.

    Here are all the ways someone could reach the president. An auditorium is built for openness, so there are going to be a lot. The Secret Service tries to control that environment, to shrink the number of attack vectors. The reason they do this is, as we said before, if you have to watch a hundred attack vectors, it’s really expensive, and you’re really spread out thin. If you have to watch 20, you’re in much better shape as a defender. So you can say we don’t leave this doorway open, and no one’s going to sit in this portion of the auditorium. You can close things down to simplify your environment. That’s important for a lot of reasons, but the biggest reason is it makes detection much easier.

    If there’s a section of the auditorium where no one is supposed to sit, that doesn’t necessarily mean no one will show up there. People always do strange things. But if someone does, you know they’ve broken a policy. It’s not a false positive. There’s no risk of confusion. You can simply react, and it lets the Secret Service act much more quickly because rather than basing their actions on uncertain analysis, they create firm boundaries. When someone breaks a boundary, they know what to do. If the Secret Service wanted to, they have a lot of resources, they could put a metal detector at every seat in the auditorium.

    They could put one at every single seat. They could get the best metal detector in the world. The problem is, they would never do that. They would get thousands and thousands of alerts and lots of them would be because someone had a particularly heavy watch on, or had change in their pocket. Whatever it might be. In order to test those alerts, they would have to send Secret Service agents out into the auditorium to check each one. And Secret Service agents are really expensive, and they’re rare. It takes a long time to train them. They’re hard to find. What you really want to do, is take your precious resource, your Secret Service agents, and you want to direct them at the hardest, smallest slice of the problem.

    So take that and apply it to the data center. If you are detecting everything everywhere, and you don’t have control over the environment, you’re going to get a lot of alerts. The statistics we see right now back that up. Organizations get 500, 1,000 critical alerts a day, which is a huge number of alerts that supposedly you have to deal with.

    On average, organizations say they have the capacity to investigate something like 1 percent of them. So you’re investigating 1 percent of all these critical alerts. Quickly you start to turn things off because that data is dirty. If you’re following the model, you would do the same thing the Secret Service does. You don’t put a metal detector everywhere.

    What you do is you control the environment. You limit the places people can be, the paths they can take, so you know where to watch. So you know if this is my high-value asset in my data center, then if anything strange happens there, obviously it should be my highest priority. If anything strange happens in something connected to it that might be a secondary priority. You can start to prioritize these alerts and focus on the problems that matter more.
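
    That triage rule (strange activity on or near a high-value asset outranks everything else) can be expressed as a toy scoring function: rank alerts by how many network hops separate the affected system from the crown jewel. A minimal sketch, with an invented connectivity graph and system names:

```python
# Hypothetical sketch: rank alerts by proximity to a high-value asset.
# The connectivity graph and system names are invented for illustration.
from collections import deque

# Which systems can talk to which (undirected, for simplicity).
CONNECTIONS = {
    "crown-jewel-db": {"app-1", "app-2"},
    "app-1": {"crown-jewel-db", "web-1"},
    "app-2": {"crown-jewel-db"},
    "web-1": {"app-1", "laptop-7"},
    "laptop-7": {"web-1"},
}

def hops_to(asset: str, system: str) -> int:
    """Breadth-first search: hops from `system` to the high-value asset."""
    seen, queue = {system}, deque([(system, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == asset:
            return dist
        for nxt in CONNECTIONS.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return len(CONNECTIONS) + 1  # unreachable: lowest priority

alerts = ["laptop-7", "app-2", "web-1"]
# Fewer hops to the crown jewel = higher priority.
triage = sorted(alerts, key=lambda s: hops_to("crown-jewel-db", s))
print(triage)  # -> ['app-2', 'web-1', 'laptop-7']
```

    Real triage would weigh many more signals, but the shape is the same: knowing the environment (the graph) is what turns a flood of equal-looking alerts into an ordered queue.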

    Roberta Fusaro: What are some of the policies or regulations that are emerging that business executives need to concern themselves with?

    Nathaniel Gleicher: In a lot of ways, 2017 will be a year of regulation in cybersecurity. Not exactly the regulation people think about. I don’t know that it’ll come from DC. SWIFT, the financial-transactions organization, recently put out controls that all of its members need to comply with to segment and protect their SWIFT application.

    This is in response to all the criminal activity targeting SWIFT applications. That’s one. The New York DFS, the financial regulator, put out controls around cybersecurity quite recently. The European Union recently put out a new general data-protection regulation, which has a whole range of controls built into it, but there are specific pieces around where is data stored, and how is it stored, which raise serious concerns for companies.

    There are a lot of pieces coming out from different places that, depending on what industry you sit in, you need to watch. The pattern that I’m seeing, though, is each of these has components that require organizations to do a better job exerting control over the data in their possession.

    Organizations have said, “My data just pools in all these places. I don’t even know where it is. It moves through these systems too fast for me to follow.” It has been acceptable for companies not to know answers to these technical questions. You’re seeing these regulations start to come out that push back on that. There’s this increasing requirement on organizations to understand what’s happening in those systems, and where that data’s going.

    Roberta Fusaro: How might this increased oversight affect companies’ ability to innovate? So many new business models are data- and analytics-driven.

    Nathaniel Gleicher: There’s this old apocryphal joke that if we built cars like we built computers, cars would go 500 miles an hour, get 500 miles a gallon, and blow up once a week. We’ve made this choice, historically, around computer and Internet innovation that the consequences of unreliability aren’t all that high.

    We’d rather have rapid innovation, but what’s happening now is more and more you see the technical world, the Internet world, colliding or reconnecting with the physical world, whether it’s autonomous cars, whether it’s health innovation like you’re seeing, whether it’s integrating smart solutions into the home, whether it’s integrating smart solutions into our transportation framework.

    There are more and more opportunities integrating technology and smart solutions into the financial systems that our society runs on. There are more and more opportunities for surprisingly small bugs to cause very big chain effects in the physical world. The push and pull that you’re seeing is how do you maintain the pace of innovation that has been so valuable, and such an engine of economic growth, an engine of competitive edge for us, while still mitigating the risks of all of these autonomous systems, and more and more sophisticated systems that are impacting the physical world.

    Roberta Fusaro: What are the opportunities for VCs and start-ups in this changing environment?

    Nathaniel Gleicher: There are huge opportunities in pointing artificial intelligence solutions and orchestration solutions at problems that are incredibly hard to do at scale for large organizations. We tend to think of cybersecurity as a technology solution because that’s convenient.

    The truth is, it’s really an organizational solution. If you only have one computer, obviously anyone can make a computer secure by turning it off. But if you have one computer, if you have one system, a sophisticated defender is going to be much better able to protect that than if you have a thousand systems and hundreds of employees, or 10,000 systems, and hundreds or thousands of employees.

    The challenge is getting large organizations to operate in a coherent fashion, when large organizations are made up of people, and we aren’t always good at operating in a coherent fashion. What organizations really need, and where there’s real potential, is making it so that the things we talked about at the beginning (encryption, strong passwords, segmentation, white-listing applications, patching vulnerabilities) can be done reliably, consistently, and at scale. If we can do that, we will solve a large chunk of our security problem.
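
    One way to read "reliably, consistently, and at scale" is as a fleet-wide audit: for every system, report which baseline controls are still missing. A hypothetical sketch (the control names and inventory records are invented, not any real tool's schema):

```python
# Hypothetical sketch: audit a fleet's inventory against a baseline control set.
# Control names and host records are invented for illustration.

REQUIRED_CONTROLS = {"encryption", "strong_passwords", "app_whitelisting",
                     "segmentation", "patched"}

inventory = {
    "host-01": {"encryption", "strong_passwords", "app_whitelisting",
                "segmentation", "patched"},
    "host-02": {"encryption", "strong_passwords"},
}

def audit(fleet: dict) -> dict:
    """Return, per host, the set of baseline controls still missing."""
    return {host: REQUIRED_CONTROLS - controls
            for host, controls in fleet.items()
            if REQUIRED_CONTROLS - controls}

for host, missing in audit(inventory).items():
    print(f"{host}: missing {sorted(missing)}")
```

    The sketch is trivial for two hosts; the organizational problem Gleicher describes is running exactly this kind of check continuously across tens of thousands of systems and acting on the results.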

    Roberta Fusaro: Nathaniel, thank you so much for joining us today.

    Nathaniel Gleicher: Thank you for having me.

    About the author(s)

    Nathaniel Gleicher is the head of cybersecurity strategy at Illumio, and Sam Palmisano is chairman of the Center for Global Enterprise and retired chairman and CEO of IBM. Roberta Fusaro is a senior editor of McKinsey Publishing, and Marc Sorel is a consultant in McKinsey’s Washington, DC, office.



    Patrick Collison and Stripe are at the forefront of a new wave of cloud leaders.
    The world is moving online, and business is going with it. The companies that make that journey possible – providing everything from infrastructure to security, chat tools to marketing and HR – make up the wide-ranging and red hot category of cloud computing.

    For the second year, the Forbes Cloud 100 recognizes the best and brightest of the cloud. Compiled with the help of partners Bessemer Venture Partners and Salesforce Ventures, the list tracks candidates by operating metrics such as revenue and funding, with the help of 25 of their public cloud CEO peers.

    The companies of the Cloud 100 have worked with the world's largest corporations and solved small business headaches alike, fixed people's grammar online and traced government sponsored hacking attacks. They're led in 2017 by Stripe, the online payments company founded by Irish-born Patrick Collison and his brother John, valued at $9.2 billion. Hundreds of thousands of businesses use Stripe's software to handle sales and other transactions on their sites, including Facebook, Lyft, Target and Unicef.

    Stripe's joined by three other San Francisco companies in the top 5: file sharing and collaboration company Dropbox (No. 2), messaging platform Slack (3) and digital signatures unicorn DocuSign (4). But rounding out the group is a challenger to Stripe with sneaky-big revenue and its own multi-billion dollar valuation, Adyen (5). On a list still dominated by Silicon Valley, cofounder and CEO Pieter van der Does has quietly built his own would-be payments juggernaut. Adyen works with Facebook, too, but also Uber, Netflix and Spotify, processing $90 billion in transactions last year on $727 million in nearly-doubled revenue. The company bills itself as more international friendly than its California competitors, in part due to its scrappy Dutch roots. "We [have taken] it on as a badge of honor: Adyen, the unknown unicorn," van der Does says.

    A handful of last year's Cloud 100 companies were ineligible this year due to exits.

    The list features 25 newcomers from 2016's inaugural list, led by billionaire Tom Siebel's second act providing app-making software for the Internet of Things, C3 IoT (19). Siebel's joined by another repeat entrepreneur, Groupon cofounder Brad Keywell, in the fast-growing category that helps process vast amounts of data for businesses in aerospace, energy, manufacturing and more -- Keywell's Chicago startup Uptake now churns out four million predictions for its customers each week, good for a $2 billion valuation and a debut at No. 22.

    Data and analytics companies make up the most list companies of any category at 15%, led by Utah experience-management leader Qualtrics (6). Despite a population less than half that of the Bay Area or New York, Utah's emerging cloud scene accounts for six companies on this year's list, including three in the top 20, as a model ecosystem for outsized tech success forms around pre-IPO leaders Qualtrics, Domo (15) and Pluralsight (20), whose CEOs are friends.

    How the Cloud 100 breaks down by category.

    With strong showings by IT operations firms, security shops, marketing companies and more, the Cloud 100 is at its best in the diversity of its offerings. At cyber firm CrowdStrike (30), business is booming after the company linked Russian government-affiliated hackers with the Democratic National Committee hacks. In health tech, Nat Turner and Flatiron Health (51) are looking to manage every oncologist office in the U.S. as well as help with clinical trials; Jennifer Tejada and PagerDuty (41) help spot operational failure before a website goes down for the count. Toast (68) helps restaurants manage their businesses, while Grammarly (90) offers a Chrome plug-in that can help writers use better vocabulary and catch grammatical errors.

    Canva (100) CEO Melanie Perkins speaks for the mindset of many of the companies on this year's Cloud 100 list. The Australian design software maker already has 10 million users, but has her eyes firmly fixed on the future. "We have so much more to do. We feel like we've only done 1% of what is possible," Perkins says.

    View the full list here


    Always take backup.

    We hear it all the time on cop shows; in everyday life, it translates to something like, “It pays to have a Plan B” or allusions to the Robert Burns poem about “the best laid plans” often going awry.
    But new Wharton research shows that there is an important downside to making a backup plan: merely thinking through a backup plan may cause people to exert less effort toward their primary goal, and consequently be less likely to achieve it. Jihae Shin, a former Wharton Ph.D. student who is now a professor at the University of Wisconsin, and Katherine Milkman, a Wharton professor of operations, information and decisions, detail their findings in the paper "How Backup Plans Can Harm Goal Pursuit: The Unexpected Downside of Being Prepared for Failure," published in the journal Organizational Behavior and Human Decision Processes.

    The paper was inspired by a conversation that Shin and Milkman had when Shin was working to get an academic faculty job while completing the Ph.D. program at Wharton. While some of her peers were thinking about backup options in case they didn’t find a job in academia, Shin found herself not wanting to because she worried that, “if I make a backup plan, it could make me work less hard to achieve my goal, and ultimately lower my chances of success.”
    Shin and Milkman agreed that they should test Shin’s idea. In a series of experiments, they found that thinking through backup plans did quash people’s motivation to achieve their primary goal. For example, after all participants in one experiment were told that performing well on a task would earn them a free snack, or the privilege of leaving the study early, some were prompted to think about “another way they could have an extra 10 minutes or another way they could get a free snack,” Milkman notes.

    “When people were prompted to think about another way to achieve the same high-level outcome in case they failed in their primary goal, they worked less hard and did less well.”

    The researchers add that the effect wasn’t about putting a concrete backup plan in place. “Just thinking about it — you haven’t invented a backup plan, you haven’t created a safety net, you’ve just contemplated the existence of one” — causes people to lose focus on their goal, Milkman says.

    Outsourcing Plan B

    But can you really get through life without contemplating backup plans? Milkman says no – and nor should you. “There are huge benefits to making a backup plan,” Milkman points out. “If you don’t have one in life, sometimes it can be really disastrous.”

    What you can do, the researchers say, is to become more strategic about when and how to make a backup plan. “You might want to delay making a backup plan until after you have done everything you can to achieve your primary goal,” Shin says.

    Or you can outsource it. Milkman notes that while Shin was focusing on her goal of landing a faculty job in academia, Milkman and Shin’s other mentors were thinking about what she could do if it didn’t work out. “In a work environment, if an employee is given a task, you can tell him or her not to think about failure; just put all your eggs in one basket and know that it’s not your job to think about a backup plan,” Milkman says. “That’s the boss’s job, and the boss doesn’t have to tell the employee that he or she is worrying about it.” Alternately, Shin adds, companies can give one group of employees the job of pursuing a goal, and another group the responsibility of coming up with backup plans.
    The researchers note that the effect is only relevant to goals that are dependent on effort, rather than luck. In addition, while it’s often impossible for the most cautious among us not to think about what happens if our goals don’t fall into place, Shin says people can avoid making specific, detailed backup plans. “The more specific and detailed your backup plans, the more potent their negative effects will likely be,” Shin notes.

    “My dad told me when I was coming to the U.S. to do a Ph.D. that, ‘Nothing valuable in life is achieved easily,’” Shin adds. “I believe that persistence and grit toward a goal, which can be affected by making a backup plan, could make a difference in deciding who succeeds and who doesn’t.” Shin says one next direction for the research would be to examine whether the attractiveness of the backup plan impacts people’s level of motivation to achieve their primary goal — whether making an unattractive backup plan would hurt motivation less than making an attractive backup plan.

    That said, after their conversation about her job prospects, Shin suspected that Milkman might have been thinking about a backup plan for her. “For this I am thoroughly grateful,” Shin says.

    View at the original source


    Many executives are fond of promising to deliver growth, but far fewer realize those ambitions. This is because many fundamentally mismanage the growth gap, which is the difference between their growth goals and what their base businesses can deliver. Filling the gap requires either innovative new offerings or acquisitions. That’s where the trouble starts — it is easy to be fooled by rosy assumptions that, when analyzed in a disciplined way, turn out not to be practical.

    Let’s take the example of one large company we worked with, which posited that it needed $250 million in new revenue from innovative new products in five years. Spreadsheets were developed, resources were marshaled, budgets were approved, and the work began. It was decided that, given the company’s size, project selection should filter out new product ideas unless, at maturity, they could be expected to generate $50 million in revenue. Over the stipulated five-year time horizon, this seemed reasonable.

    We started mapping future projections to resource commitments with a framework called the Opportunity Portfolio, in which projects are evaluated with respect to their market and technical uncertainty, their resource intensity, and their upside potential.

    We assigned projects to four categories of opportunity (plus another category for innovations that support the core business). Positioning options have high technical but low market uncertainty, in which the major challenge is solving a technical problem of some kind. Scouting options have low technical but high market uncertainty, in which the major task is finding product/market fit to extend the reach of an existing capability. Stepping-stone options have both high technical and high market uncertainty. Finally, platform launches represent a new business that is ready to be scaled up. These have relatively lower uncertainty than an option. They may be generating revenue but usually not yet a lot of bottom line. They show enough promise that they will become mainstay core products in the next 12 months or so.
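As a rough sketch, the four-quadrant mapping above can be expressed as a simple lookup. The category names come from the framework itself; treating the low-technical/low-market quadrant as a platform launch is our reading, based on the observation that platforms carry relatively lower uncertainty than the options.

```python
def classify_opportunity(technical_uncertainty: str, market_uncertainty: str) -> str:
    """Map a project's uncertainty profile onto an Opportunity Portfolio category.

    Both inputs are "high" or "low". The low/low quadrant is treated here
    as a platform launch, since those are described as having relatively
    lower uncertainty than the options.
    """
    categories = {
        ("high", "low"): "positioning option",    # a technical problem to solve
        ("low", "high"): "scouting option",       # product/market fit to find
        ("high", "high"): "stepping-stone option",
        ("low", "low"): "platform launch",        # ready to be scaled up
    }
    return categories[(technical_uncertainty, market_uncertainty)]
```

A portfolio review can then tag each project, e.g. `classify_opportunity("high", "high")` returns `"stepping-stone option"`.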

    Projecting new revenues to the four areas in the Opportunity Portfolio was an easy exercise. As the following table shows, it led to a comforting view of the future growth potential of the current portfolio. Each block in the table denotes new revenue that year from maturing portfolio investments, resulting in cumulative new revenue, which can be found at the bottom of each column. Note that the table implicitly projects limited investment and a slow start to the new growth initiatives, with no new revenues in 2017, modest new revenues in 2018, and significant new revenues really only beginning in 2020 and 2021.

    The table offers an attractive view of the firm’s growth prospects, with a projected total of $620 million in new revenues by the 2022 timeframe.

    Beware of Spreadsheets

    And this is where spreadsheets, which a colleague of ours dubs “quantifications of fantasy,” can lead to unrealistic conclusions. The big problem is that spreadsheets tend to reduce the world to linear models, when in reality the growth process is nonlinear, sometimes even exponential. We’ve all seen those spreadsheets in which Year 2 revenue is Year 1 revenue plus 10%, and so on, and we know they don’t represent reality.

    Imposing just a bit of realistic discipline with respect to the likely times at which revenues will be realized led to a very different conclusion about when the growth program would show results and close the growth gap. We were particularly concerned about the timing of the firm’s proposed investments relative to its expectations for results.

    With the growth initiative just getting under way in 2017, the company’s own projections showed that significant new revenues would not be realized until 2020, representing a three-year lag between initiating its growth projects and reaping the rewards from them. Of particular concern on our part was how long it would take for each project to achieve 50% of its target revenue, testing the assumption of linear growth embedded in the projections.

    Modeling Nonlinear Growth

    To do this, we modeled the assumptions in the plan with a logistic growth model, a technique that incorporates nonlinear growth functions. It uses three inputs: the revenue goal for the investment at steady state, the assumed first-year revenue, and the inflection point, which is the time the company thought would be required to reach 50% of the revenue goal.
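As an illustrative sketch of the logistic model with those three inputs (the parameter values below are hypothetical, not the firm's actual figures), the growth rate can be solved from the first-year revenue and the curve evaluated year by year:

```python
import math

def logistic_revenue(t, target, first_year, inflection):
    """Projected revenue in year t under logistic (S-curve) growth.

    target     -- revenue goal for the investment at steady state
    first_year -- assumed revenue in year 1 (must be below target / 2)
    inflection -- year at which revenue reaches 50% of target
    """
    # Solve for the growth rate k from the year-1 constraint:
    #   target / (1 + exp(k * (inflection - 1))) == first_year
    k = math.log(target / first_year - 1) / (inflection - 1)
    return target / (1 + math.exp(-k * (t - inflection)))

# A hypothetical $50M-target project with $2M in year 1 and its midpoint in year 3:
trajectory = [round(logistic_revenue(t, 50, 2, 3), 1) for t in range(1, 7)]
# → [2.0, 8.5, 25.0, 41.5, 48.0, 49.6]
```

Summing such S-curves across projects, instead of linear ramps, is what exposes how little revenue late-starting projects can contribute by a fixed deadline.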

    This allowed us to create the following chart, based on the table above, for the likely trajectory of the revenue growth plans, given the assumptions about the inflection point, first-year revenue, and expected target revenue.

    This analysis revealed that attractive-looking cumulative revenue numbers in the plan did not take into account the dynamics of timing. Even though the table projected cumulative new revenues from the plan of $620 million, a dynamic view that takes timing into account shows that at best the new revenue is likely to be in the $180 million range — a far cry from the target.

    Projects started after 2019 would be of little help in hitting the portfolio target revenue in 2022, because they simply do not have the time needed to begin delivering results. This in turn called into question the planned strategy for resource deployment, which was essentially continuing as if these projects were still options, with small investments at the beginning that would ramp up only later on.

    Making the Transition from an Option to a Major Launch

    What executives often fail to realize is that when you make a commitment to launching a major new growth platform, the investment logic changes. Maximum resources are needed early on. If the firm sought to drive serious growth in 2017 and 2018, a lot more resources would be required much earlier. Moreover, not all projects will succeed, so to have nine projects become revenue-generating by 2020, in all likelihood over 20 projects will need to be started.

    This is a very common problem organizations experience when they decide that a project is ready to make the transition from being an option, in which the main goal is to search for a reliable, repeatable business, to a new growth platform. What many executives don’t understand is that this shift is a phase change. The project goes from essentially being an internal startup to becoming a full-fledged member of the corporate parent at scale. Often, a new team needs to be brought in, one with more operational expertise than the startup team. Organizational and technical debts need to be repaid. The metrics need to change. And all of this takes resources.

    Without realizing the significance of this shift, executives are tentative about putting the talent, resources, and commitment behind the program to assure its success. Unsurprisingly, the result of such timidity is that the project experiences a slow takeoff, leading many to lose faith in it before it ever had a chance.

    What is interesting is that simply as a function of timing and investment, the firm could potentially have been on track to hit its $250 million target by 2024, just not the stipulated timeframe of 2022. The executives making those rosy growth projections would justifiably have been criticized for making proclamations that were predictably unrealistic.

    So how can you bring more discipline to your growth projections and avoid getting sideswiped by a growth gap that could have been foreseen? Based on our experience, four actions can help:

    Take the time to assess what your growth gap potential is. It’s all too easy to assume that your current business will deliver the growth that your investors, employees, and other stakeholders are expecting. The process is not that complex: Simply look at the growth trends of your existing lines of business and compare them to where you think your strategy needs to be at some point in the future. Usually, there will be a gap.

    While it seems astonishing that leaders wouldn’t do this (and boards wouldn’t insist on it), we see it all the time. Sometimes, it is because leaders just won’t take the time away from day-to-day operations. Sometimes, it is because, oddly, it is no one’s job. And sometimes, there are simply too few people with the vantage point to see the trends across the entire enterprise. And unfortunately, executives in some companies are rewarded for essentially gaming their numbers rather than being realistic.

    Manage your portfolio to keep today’s business fresh while placing bets on the future. When we look at the once-great businesses that have stumbled (we’re looking at you, Blackberry), what we often see is very poorly diversified portfolios with an excessive focus on today’s core business. As PepsiCo’s Indra Nooyi observes:

    “It’s been a long time since you could talk about sustainable competitive advantage. The cycles are shortened. The rule used to be that you’d reinvent yourself once every seven to 10 years. Now it’s every two to three years. There’s constant reinvention: how you do business, how you deal with the customer.”

    In general, as the core business comes under pressure, you’ll need to make bets on some combination of acquisitions and organic growth. When time is tight, you’ll place more emphasis on acquisitions. If you have time and want to build a capability, organic growth or partnering makes more sense.

    Don’t apply linear thinking to projecting how your growth initiatives will unfold. It is an old saw that things change less than we expect in the short term and more than we expect in the long term. This refers to the very human tendency to think in terms of linear change, when we know that patterns of change in business are nonlinear, particularly patterns of growth. Amazon Web Services, for instance, went from being a concept to being a $10 billion-plus revenue business in less than 10 years, a torrid rate of nonlinear growth.

    Tools such as the logistic model above can help you test the financial assumptions in your growth plans in a way that recognizes these patterns. It may also help to look at a range of possible outcomes under different scenarios.

    Don’t allow your assumptions to become facts in your own mind. One of the biggest mistakes we see over and over again is thinking about your growth businesses using the same mental models that you use to think about your operating businesses. The growth journey is about learning, about discovery, and about finding a business model. It is a mistake to begin it thinking you know what the linear, measurable path will be.

    Research on the venture capital industry found that even these expert investors in innovation saw their portfolio companies take twice as long to generate half the revenue they were projecting. And, of course, the overall success rate for VC-backed startups is pretty low. There’s no reason to think your organization is going to outsmart seasoned VC investors on a regular basis. What you can expect is better results by making sure that your strategy and growth program are aligned.

    Unrealistic revenue projections or assumptions about how much growth you’re really going to get can lead to career-ending misses. Misses sap investor confidence, can cause dramatic stock price declines, and can lead to investors wielding metaphorical pitchforks. Better to do some smart thinking beforehand.

    Reproduced from Harvard Business Review


    From astrophysicists to entrepreneurs, technology leads drug makers to seek new skills.

    After many years building successful technology businesses, Jeremy Sohn never imagined that at 43 he would find himself on the payroll of a big pharmaceutical company. But 18 months ago he was appointed global head of digital business development and licensing at Swiss drug maker Novartis.

    His appointment is evidence of how an industry, slow to respond to the disruption of digitisation, is grasping its importance as it confronts pricing pressures, ever-vaster quantities of patient data and more empowered consumers. Digitisation is changing the way pharma interacts with payers, doctors and patients, leading drugmakers to seek out different skills and personality traits in employees.

    Germany’s Merck last year appointed 30-year-old James Kugler, who has a degree in biomedical engineering and a tech background, as its first chief digital officer. Boehringer Ingelheim, Europe’s biggest private drugmaker, hired Simone Menne as chief financial officer from airline Lufthansa. She is in charge of a new digital “lab”, recruiting data specialists and software developers.

    Mr Sohn, whose role at Novartis includes overseeing venture capital investments in technology companies — a growing trend in Big Pharma — says that working alongside highly qualified scientists, he “typically feels like the dumbest person in any meeting”. However, he and other external recruits have brought mindsets that are helping the group evolve from a pure science company into “a data [and] technology company”, he adds.

    According to Steven Baert, head of human resources, Novartis is starting to reap considerable benefits from digital investments, particularly in the speed and efficiency with which it can test medicines.

    He says: “We’re already seeing how real-time data capture can help analyse patient populations and demographics, to make it easier to recruit patients for clinical trials, and how real-time data-capture devices, like connected sensors and patient engagement apps, are helping to create remote clinical trials that aren’t site-dependent.”

    In the past five years, these changes have been visible in Novartis’s workforce.

    While staffing overall has risen by just over 20 per cent, the salesforce — the traditional bedrock of pharma companies, and their combined $1tn in global revenues — has increased by just 13 per cent. At the same time the number employed in “market access” — negotiating prices with payers, whether governments or insurers — has risen up to five times faster than the average growth rate and now stands at 1,100. 

    Novartis employs more than 1,200 dual-qualified mathematicians and engineers to analyse big data sets and calculate the value of new drugs — for instance, their potential to reduce hospitalisations and so cut costs. As recently as six years ago, not a single one was on the payroll. Behind these changes lie two key shifts. The first is the determination of cash-constrained global health systems to secure better value from the drugs they buy.

    The second is the advance of digital technology, which is increasingly playing a role in how patients manage their conditions and companies communicate the benefits of their medicines to doctors. GlaxoSmithKline, for example, employs more than 50 people to run webinars with physicians — a “multichannel media team” that did not exist five years ago.

    The UK drugmaker has begun hiring astrophysicists to work in research and development, keen to deploy their ability to visualise huge data sets.

    The company says these qualities are especially important as it seeks to use artificial intelligence to help spot patterns and connections amid a mass of information. At Boehringer, senior executives say that this level of disruption calls for agility and entrepreneurialism in employees — which in some cases may be better found outside the life sciences sector.

    Andreas Neumann, head of HR, explains that, although new CFO Ms Menne had “no clue” about pharma, she had worked in a sector that had faced substantial upheaval. “She has significant experience in an industry which is under tremendous cost pressure and has gone through a tremendous amount of change. And you can learn from that experience, as a company.”

    US-based Pfizer last year recognised this new landscape by establishing a division to bring together health economists; researchers measuring the outcomes produced by different medicines; and market access specialists.

    Previously these groups had been spread throughout the organisation.

    Andy Schmeltz, who heads the division, gives the example of Eliquis, an anticoagulant produced with Bristol-Myers Squibb. Data analysts processed “real world” evidence — derived from patients going about their normal lives, rather than taking part in a carefully managed trial — that suggested it was more cost effective than the long-established anticoagulant, Warfarin.

     Underpinning this work is a massive repository of data, from sources such as electronic medical records, that covers “over 300m lives”, says Mr Schmeltz. This, he says, “enables us to query the database and generate insights, even when we’re just trying to figure out the design of a trial and the feasibility of recruitment; are there enough patients out there that meet certain entry criteria? It enables us to make better decisions on clinical trial development. It also enables us to model different outcomes across different diseases.”

    At Merck, chief executive Stefan Oschmann enthuses about its new breed of digitally savvy employee, led by “forward-thinking” Mr Kugler. “We’re working on stuff like the connected lab,” he says, “a laboratory where everything, every container, every machine, every pipette, is smart and connected and captures data automatically . . . So we [employ] a very different type of people these days.”

    While the project is still in the planning stages, when complete it will allow staff to manage inventory and research across multiple labs and share findings more readily, as well as making it easier to access safety and regulatory compliance data. The pharma industry still has a considerable way to go before it exploits digital technology as successfully as many other sectors. A recent report by McKinsey, the global consultancy, assessed “digital maturity” under a range of categories including strategy and customer focus; pharma scored poorly, with only the public sector, an infamous digital laggard, coming out worse.

     Stefan Biesdorf, who leads McKinsey’s digital pharma and medical technology work in Europe, says: “While virtually every pharma company has either worked on its digital strategy or made plans about how to address the topic, compared with other industries pharma . . . still has a lot to do.” 

    One analyst describes some big pharma companies as “schizophrenic” about how to respond to digital advances, aware they needed to act but unsure how much investment to divert from their core mission of drug discovery. Alyse Forcellina, leader of the Americas healthcare practice at executive recruitment consultancy Egon Zehnder, says Big Pharma needs outsiders because “nobody in pharma is excellent at digital”.

    She warns, however, of the risk of “organ rejection” of new recruits who, for instance, may not understand that “many things are illegal or just not possible” in pharma, such as direct approaches to patients.

    Mr Baert of Novartis acknowledges there is also a danger that companies will hire the right people but fail to foster the internal culture required to take advantage of their expertise. He cites as a warning the example of Kodak, which was at the forefront of digital photography but failed to accelerate the shift to a new business model.

    At Boehringer, Mr Neumann acknowledges the process is not always smooth. But he is in no doubt about the potential gains if companies can create an environment in which diversity of background is seen as an advantage, not a threat.

    He says: “If you hire someone who is disruptive because you want disruption, you get what you have hired, right?”

    Force driving sales

    As pharma companies reshape their workforces for an evolving economic and regulatory climate, how far and how fast can the changes go? Some say it is possible to exaggerate the extent of the overhaul. Jo Walton, a pharma analyst at Credit Suisse, argues that the notion that drugmakers will be able to dispense with sales forces altogether is unrealistic. She says: “If you think how many new drugs are developed after a doctor leaves university and medical school, clearly doctors require some form of continuing medical education.”

    The most effective way for pharma groups to show the merits of their medicines is still by handing them out in doctors’ offices: “Putting a drug in a samples cabinet still requires someone to be in there,” she points out.

    Although the role of data analytics and health economics in demonstrating the value of drugs has grown, Steven Baert, head of HR at Novartis, acknowledges that “we’re not yet in a world where one can bring a product to patients without a sales force calling on physicians, which means that you need both today”.

    However, as insurers and governments increasingly develop ways of pricing drugs according to the outcomes they produce, an even more radical shake-up of the traditional pharma workforce is in prospect. Mr Baert says that matters are “moving in that direction [towards outcomes-based pricing], but it’s not yet a reality in one country, or in one disease area, or in one market”.

    “Do I expect that in five years the world will be completely different?” he says. “No, not yet. Do I expect that in 20 years we will see a very different picture? Absolutely.”


    A new weapon for the war on cancer: a broad-spectrum circulating tumor cell capture agent for diagnostics.

    Engineered opsonin protein captures circulating tumor cells in the bloodstream with high efficiency.

    Scanning electron microscope (SEM) image of FcMBL-coated beads (gray) attached to a tumor cell (red). Credit: Wyss Institute at Harvard University

    Cancerous tumors are formidable enemies, recruiting blood vessels to aid their voracious growth, damaging nearby tissues, and deploying numerous strategies to evade the body’s defense systems. But even more malicious are the circulating tumor cells (CTCs) that tumors release, which travel stealthily through the bloodstream and take up residence in other parts of the body, a process known as metastasis. While dangerous, their presence is also a valuable indicator of the stage of a patient’s disease, making CTCs an attractive new approach to cancer diagnostics. Unfortunately, finding the relative handful of CTCs among the trillions of healthy blood cells in the human body is like playing the ultimate game of needle-in-a-haystack: CTCs can make up as few as one in ten thousand of the cells in the blood of a cancer patient. This is made even more difficult by the lack of broad-spectrum CTC capture agents, as the most commonly used antibodies fail to recognize many types of cancer cells.

    To address this problem, a group of researchers at the Wyss Institute at Harvard University has adapted an engineered human blood opsonin protein known as FcMBL, which was originally developed as a broad-spectrum pathogen capture agent, to target CTCs instead. Using magnetic beads coated with FcMBL, they were able to capture >90% of seven different types of cancer cells. “We were able to rapidly isolate CTCs both in vitro and from blood, including some which are not bound by today’s standard CTC-targeting technologies,” says Michael Super, Ph.D., Lead Senior Staff Scientist at the Wyss Institute and co-author of the paper. “This new technique could become useful in cancer diagnostics.” The technology is described in Advanced Biosystems.

    Current CTC diagnostic systems frequently make use of a cancer cell marker, the epithelial cell adhesion molecule (EpCAM), which is highly expressed on the surface of tumor cells. However, EpCAM expression on cancer cells decreases when tumor cells transform into CTCs, ironically making EpCAM-based tests less useful precisely when it is most crucial to know that a patient’s cancer has metastasized.

    The Wyss Institute capture technology takes advantage of a protein naturally found in the body, mannose-binding lectin (MBL), which recognizes and binds to carbohydrates present on the surfaces of bacteria and other pathogens, marking them for destruction by the immune system. Healthy human cells have different carbohydrate patterns and are immune to MBL, but many cancer cells have aberrant carbohydrates that are similar to those found on pathogens and, therefore, are vulnerable to MBL binding.

    FcMBL-coated beads (gray) are able to bind to tumor cells (red) in large numbers, increasing capture efficiency. Credit: Wyss Institute at Harvard University

    The team previously developed a genetically engineered version of MBL in which the binding portion is fused to an antibody Fc fragment (FcMBL) to stabilize the molecule. Past studies showed that when tiny magnetic beads are coated with FcMBL and added to various pathogens, the FcMBL-coated beads attach to the surfaces of these cells like flies on flypaper so that, when a magnetic field is applied, the beads drag their bound cells along with them toward the magnet.

    To evaluate whether this system could specifically target CTCs, the researchers implanted fluorescently-labeled human breast cancer cells in mice, let the tumors develop for 28 days, and then tested the blood to determine the number of CTCs present. They then mixed the blood with FcMBL-coated beads and pulled the beads out of suspension with a magnet.

    “The FcMBL-coated beads are unlikely to be bound to normal cells, and so when we measured the movement of cancer cells versus normal cells, the cancer cells moved much faster because they were being dragged to the magnet by the beads,” explains first author Joo Kang, Ph.D., who was a Technology Development Fellow at the Wyss Institute while completing this study and is now an Assistant Professor at the Ulsan National Institute of Science and Technology. The concentration of CTCs present in the blood was also reduced by more than 93%, showing that FcMBL can effectively capture CTCs in the blood even after they have undergone the transitions that reduce EpCAM expression.

    The team then tested their system against six additional cancer cell types, including human non-small cell lung cancer, lung carcinoma, and glioblastoma. The FcMBL-coated beads captured all six types of tumor cells with >90% efficiency – which is comparable to EpCAM-targeting methods – and were also able to capture two types that are not successfully bound by anti-EpCAM antibodies (lung carcinoma and glioblastoma). “Our results suggest that while the EpCAM marker can be useful for some tumors, it becomes less and less useful over time as EpCAM expression decreases and the cell becomes metastatic,” says Super. “Our FcMBL system can either be used as an alternative to EpCAM-based diagnostics, or as a follow-up method once EpCAM ceases to be expressed.”

    Cancer cells (red) being bound by FcMBL-coated beads (gray). Credit: Wyss Institute at Harvard University

    The researchers hope to continue their studies to determine exactly which carbohydrate molecules FcMBL is targeting on CTCs, which could further improve the specificity and efficacy of capture. “The FcMBL opsonin technology has already been shown to be an extremely broad-spectrum capture agent for pathogens,” says senior author of the study and Wyss Founding Director Donald Ingber, who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and the Vascular Biology Program at Boston Children’s Hospital, as well as a Professor of Bioengineering at Harvard’s School of Engineering and Applied Sciences. “This new finding that it has similar broad-spectrum binding activity for many different types of circulating cancer cells is equally exciting, and once again demonstrates the power of leveraging biological design principles when developing new medical innovations.”

    Additional co-authors include Harry Driscoll from Giner, Inc, who was a Research Assistant at the Wyss Institute when this study was completed; Akiko Mammoto, M.D., Ph.D., from Boston Children’s Hospital and Harvard Medical School; and Alexander Watters, Ph.D., Bissrat Melakerberhan, and Alexander Diaz from the Wyss Institute.

    This work was supported under DARPA grant No. N66001-11-1-4180 and contract No. HR0011-13-C-0025. Opinions, interpretations, conclusions and recommendations are those of the authors and are not necessarily endorsed by the U.S. Army.

    View at the original source


    The Cognitive era is here, and it’s accelerating across industries. The cognitive computing market is estimated to grow from $2.5 billion in 2014 to more than $13 billion by 2019. Experts predict that by 2018, more than half of all consumers will interact with cognitive technology on a regular basis. But the journey to becoming an intelligent business is still new to many leaders, and watching and learning from other early adopters may be the best way to avoid common mistakes and overcome complex challenges.
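    As a back-of-envelope check, the market figures quoted above imply a compound annual growth rate of roughly 39 percent; a minimal sketch, taking the estimates as given rather than independently verified:

    ```python
    # Implied compound annual growth rate (CAGR) for the market estimates above.
    # Figures are the ones quoted in the text: $2.5B in 2014 -> $13B+ by 2019.
    start_billion = 2.5
    end_billion = 13.0
    years = 2019 - 2014

    # CAGR = (end / start)^(1/years) - 1
    cagr = (end_billion / start_billion) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")
    ```

    That works out to roughly 39% per year, which is the kind of growth rate behind the "accelerating" claim.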

    Businesses that can create actionable knowledge from large volumes of data can improve business outcomes, expand expertise, delight customers and continuously outthink the needs of the market. Early adopters like Honda, Hilton, Staples and GM are already gaining a major competitive advantage from their use of cognitive technologies. And hundreds of other companies are catching up.

    To understand how these early adopters are working on their business transformations, we surveyed more than 600 decision makers worldwide, at various stages of implementation of cognitive initiatives. The results of the survey weren’t just surprising; they were inspiring and encouraging, as we discovered exciting real-world applications, successes and valuable lessons we can all learn from.
    At this year’s World of Watson Conference in Las Vegas on Oct. 23-27, IBM thought leaders Susanne Hupfer and Nancy Pearson will share the full results of this survey in their session titled: “The Intelligent Enterprise: Building a Cognitive Business.”

    As a sneak peek into the results of this survey, we’re sharing 5 things we learned from speaking to these 600 early adopters of cognitive:

    1. Most businesses want to become cognitive, but many of them are only starting their journey
    Of the 600+ decision-makers we surveyed, about 65% of them said cognitive computing is extremely important to their business strategy and success. But only 22% of respondents said they had been using two or more cognitive technology capabilities for more than a year.

    While cognitive technologies are still new to many businesses, the race to the top is now fully under way. More than half the respondents said they had been using multiple cognitive technologies for less than a year or using one technology for more than a year. A quarter of them said they are planning to adopt cognitive and AI initiatives within the next two years.

    2. Business leaders see cognitive solutions as a key differentiator that gives them a competitive advantage
    These “thinking” businesses are already seeing positive business outcomes including improved customer service, sales, ad conversions, productivity, employee performance and revenue growth. More than half the respondents said they consider cognitive computing to be a key ingredient of their strategy to remain competitive within the next few years, and essential to the digital transformation of their businesses. Cognitive systems are able to put content into context, to quickly find the proverbial needle in a haystack and identify new patterns and actionable insights in ALL available data.

    3. While the opportunities are limitless, there are still many hurdles to overcome
    Businesses on their path to becoming more cognitive face some common challenges. Top adoption challenges include data security concerns, lack of skilled resources, roadmap struggles, maturity of these new technologies, and lack of unified sources of data. About half of our survey respondents said that they see the value in cognitive computing, but they struggle with a clear roadmap for adoption.

    4. It’s not enough to just have advanced analytics anymore
    Cognitive computing is essential to overcoming data challenges that conventional analytics cannot solve as it unlocks the hidden value of “dark data” that was previously unreadable by machines. At most companies, a lot of the data available — more than 80% of it — is “unstructured,” in the form of emails, social media posts, documents, videos, images, audio recordings, manuals etc. Traditional tools and machines can’t analyze this unstructured content to find insights and patterns, but cognitive systems like Watson can.

    5. Many business leaders share common goals for implementing cognitive solutions

    While their products and industries may vary, many business leaders share the same goals and challenges on their path to becoming truly cognitive.
    Top priorities include:
    • Improving productivity and efficiency
    • Reducing costs and compliance risks
    • Improving decision-making and planning across teams
    • Delivering more personalized and faster customer service
    • Scaling expertise to make every employee as good as their best employees
    Cognitive solutions can help businesses achieve all these goals and more. They create usable and meaningful knowledge from data to expand everyone’s expertise, continuously learning and adapting to outthink the needs of the market.

    View at the original source


    The competitors facing asset and wealth managers, banks, and insurance companies aren’t who we thought they were. Emerging technology is presenting growing opportunities for FinTechs.
    And change is fast. Customers seem to be changing their minds about what they value most. For select legacy institutions, this is a time of great opportunity. For others, it’s a sign of the end of an era.

    Technology trends

    It’s no secret that financial services has become a digital business. But the speed and extent of the transition is downright jarring. Artificial intelligence now drives the way leading firms provide everything from customer service to investment advice.

    Blockchain, with its ability to store information on distributed ledgers without a central clearinghouse, could upend a variety of businesses.

    Digital labor, or robotic process automation, is helping firms automate things they couldn’t do before, without having to hire an army of developers. And all of this depends on robust cybersecurity, to hold off threats that are coming from multiple directions.

    Business trends

    How business is conducted is shifting too. For decades, American firms have looked to the United Kingdom as the gateway to Europe, but Brexit could change this. Firms are focusing on jurisdictional analysis and what they’ll need to expand in the UK or move directly to the EU.

    In the US, the regulatory environment will likely be affected by new appointments to the federal agencies and some targeted Dodd-Frank rollback by Congress. And as the industry grapples with risk management culture, ethics, and trust, it often finds itself playing defense.

    Economic trends

    The economic backdrop for these forces also keeps changing. Asset and wealth managers, banks, and insurance companies once primarily competed against their own kind. They still do—but now, they also face competition from nontraditional market players with skills, funding, and attitude.

    And in a prolonged low interest rate environment, many now look at cost containment as one of the keys to survival. Finally, we see firms in a scramble for top line growth, organically and through acquisition, in a search for new revenue opportunities. Staying the same means falling behind.

    Top 10 issues/opportunities facing Financial Services in 2017

    1. Artificial intelligence now drives the way leading firms provide everything from customer service to #roboadvisor investment advice.

    2. Blockchain, with its ability to store information on distributed ledgers without a central clearinghouse, could upend a variety of businesses.

    3. For decades, American firms looked to the United Kingdom as the gateway to Europe, but #Brexit could change this.

    4. Financial institutions face competition from nontraditional #Fintech players with skills, funding, and attitude.

    5. In a prolonged low interest rate environment, many now look at cost containment as one of the keys to survival.

    6. Everything depends on robust cybersecurity to hold off threats that are coming from multiple directions.

    7. The regulatory environment next year will likely be affected by new appointments to the federal agencies and some targeted Dodd-Frank rollback by Congress, among other things.

    8. And as the industry grapples with risk management culture, ethics, and trust, it often finds itself playing defense.

    9. Digital labor, or robotic process automation, is helping firms automate things they couldn’t do before, without having to hire an army of developers.

    10. Finally, we see firms in a search for new revenue opportunities, either organically, or through acquisitions. Staying the same means falling behind.

    This full PwC report looks more broadly at these top issues facing financial institutions in the coming year.

    For each topic, we look at the current landscape, share our view on what will likely come next, and offer our thoughts on how you can turn the situation to your advantage.

    View at the original source


    I will be the first person to admit that I’m a city boy. I grew up in Seattle, where my main agricultural experience as a kid was the farmers who sold freshly picked fruits and vegetables at Pike Place Market.

    Since then I’ve visited lots of small farms as part of my work with the foundation. But nothing prepared me for where I recently found myself: in the wilds of the Australian outback watching a cattle rancher artificially inseminate a cow.

    It’s a pretty graphic procedure to say the least, but I was impressed by how high tech the whole process was at Wylarah Station (a station is the Australian term for a ranch). The Australian Agricultural Company—or AACo—relies on cutting edge genomics to breed wagyu beef cows, some of the most elite cattle in the world.

    AACo is one of the foremost experts in the developed world on tropical cattle production. Although they use innovation to raise higher quality beef that they can sell for a good price, I was more interested in learning about how their methods could help farmers in low income countries with similar climates.

    Farmers across sub-Saharan Africa are already raising cattle—beef and dairy—in massive numbers. Ethiopia, Sudan, and Tanzania are among the world’s top 15 cattle producing countries. While there are legitimate questions about whether the world can meet its appetite for animal products without destroying the environment, it’s a fact that many poor people rely on cattle for both nutrition and income. I believe they should be able to raise cattle as efficiently as farmers in rich countries do.

    I’m optimistic that technology can improve the quality of African cattle. A typical dairy cow in the United States produces nearly 30 liters of milk every day. Compare that to your average cow in Ethiopia, which produces just 1.69 liters of milk a day. If you want to increase milk yield, you can’t just take a high-producing Holstein cow from Wisconsin and drop it into the tropical savannah. Unlike indigenous breeds, temperate cattle have no natural resistance to tropical diseases—like trypanosomiasis, or sleeping sickness—and they struggle to get enough nutrition from local food sources.
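    To put those yield numbers in perspective, a quick calculation using the figures cited above as given:

    ```python
    # Daily milk yield comparison, using the figures cited in the text.
    us_yield = 30.0        # liters/day, typical U.S. dairy cow (approximate)
    ethiopia_yield = 1.69  # liters/day, average Ethiopian cow

    ratio = us_yield / ethiopia_yield
    print(f"A typical U.S. cow produces about {ratio:.0f}x more milk per day")
    ```

    A gap of nearly eighteen-fold is why even partial improvements from better breeding could matter so much to farmers' incomes.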

    Instead, you could breed cattle that will flourish in the local climate. That means using artificial insemination—like the process they use at Wylarah Station—to crossbreed a native female cow (with her built-in resilience to tropical heat and diseases) with a bull from a genetic line that produces lots of milk.

    Our foundation is already tackling this, but AACo’s technology could make the process much more precise than it is today. One of the things that amazed me most during my visit was how much they know about the ancestry of their cattle. The animals on their ranch have a more detailed family history than most people do. If farmers in Africa were equipped with the same level of knowledge, they could handpick the best possible cow parents and breed a better calf. But that leads us to another problem.

    Because they lack adequate storage, most African farmers rely on artificial insemination stations (yes, that’s what they’re really called) to provide sperm samples. Depending on how far a farmer lives from a station, the sample can sometimes heat up too much and effectively die before it is delivered. Many farmers decide not to take the risk. Instead they get their cows pregnant the old-fashioned way, which makes it harder to control genetic integrity and can lead to calves that are less resilient or produce less milk.

    AACo is looking into methods that extend the viability of sperm samples. Similar technology is currently used in Europe to improve the success rate of fertilization, but it hasn’t been tried yet with tropical cattle. If successful, it could double the amount of time a sample can survive outside of storage and make it easier for more farmers across Africa to use artificial insemination.

    Beyond breeding, Wylarah Station uses technology to ensure that their herds receive proper nutrition. I was surprised to see their ranch hands use smart watches to track how much the cows are drinking.
    In the past someone had to manually inspect all of the water troughs scattered across the ranch, driving hundreds of kilometers every day. Now they receive a notification on their watch when a sensor detects that a tank needs attention. The whole operation was a far cry from the John Wayne cowboy movies I used to watch as a kid.

    Not all of AACo’s innovative approaches could work in the poor world. It’s unlikely that every farmer in Africa will be wearing a smart watch anytime soon (if ever). But as smartphone usage continues to grow across the continent, it’s easy to imagine a future where Africans might use an app to order the perfect bull DNA or make sure their cattle are eating enough—something that an African ICT company called iCow is promoting in Kenya, Ethiopia, and Tanzania with help from our foundation.

    There’s a lot we can learn from Wylarah Station about how to more efficiently raise cattle, but I can’t ignore the big question: should we rely on animals for food at all? Eating too much meat contributes to higher levels of obesity and heart disease, and raising animals contributes to climate change. That’s why I’ve invested in companies working on meat substitutes, which could one day eliminate the need to raise and slaughter animals entirely.

    Although it might be possible to get people in richer countries to eat less meat, we can’t expect people in low income countries to follow suit. When I went vegetarian for a year in my late 20s, all I had to do to get my daily serving of protein was buy a can of beans or a container of tofu at the grocery store. It’s not so easy for families in poor communities to get the nutrition they need.

    For them, meat and dairy are a great source of high-quality proteins that help children fully develop mentally and physically. Just 20 grams of animal protein a day can combat malnutrition, which is why our foundation’s nutrition strategy wants to get more meat, dairy, and eggs into the diets of children in Africa. Cattle are also a huge economic driver in some parts of Africa. In Ethiopia alone, cattle account for 45 percent of their agricultural GDP. In addition, livestock can actually contribute to ecosystems by stimulating pasture growth, enhancing biodiversity, and recycling energy and nutrients.

    As more people in poor countries move into the middle class, they will likely eat more beef and drink more milk. But we can mitigate the impact of that growth on the environment by increasing production from the cows they already have. The cowboys of Wylarah Ranch have mastered the art of raising tropical cattle. I don’t know yet how African farmers can benefit from their expertise—our foundation is just starting to dig into this—but I’m excited about the possibilities.

    View at the original source


    Introducing relief for H-1B petitioners, the US Citizenship and Immigration Services (USCIS) has decided to resume premium processing for certain cap-exempt H-1B petitions effective immediately.

    The petitions covered are those filed by an institute of higher education, a nonprofit organisation related to or affiliated with an institute of higher education, or a non-profit research or governmental research organisation.

    Petitioners in these categories can now avail the benefits of premium processing even though their petitions are exempt from the cap.

    The official Twitter channel of USCIS tweeted the notification on Monday evening announcing their decision to resume H-1B premium processing for certain cap-exempt petitions. The H-1B visa has an annual cap of 65,000 visas each fiscal year.

    Moreover, there is an annual "master's cap" of 20,000 petitions filed for beneficiaries with a US master's degree or higher. Premium processing will also resume for petitions that may also be exempt if the beneficiary will be employed at a qualifying cap-exempt institution, organisation or entity.
    Starting Monday, those cap-exempt petitioners who are eligible for premium processing are allowed to file Form I-907, request for premium processing service for Form I-129, petition for a non-immigrant worker.

    The petitioner can file Form I-907 with an H-1B petition or separately for a pending H-1B petition. USCIS had previously announced that premium processing resumed on June 26 for H-1B petitions. The services mentioned that these petitions were filed on behalf of physicians under the Conrad 30 waiver program as well as interested government agency waivers.

    USCIS plans to resume premium processing of other H-1B petitions as workloads permit. USCIS will make additional announcements with specific details on when it will begin accepting premium processing for those petitions. Until then, premium processing remains temporarily suspended for all other H-1B petitions.

    View at the original source



    It has come to the notice of the Medical Council of India that certain unscrupulous elements are misleading students by alluring them that they would get them admitted to the MBBS course in Medical Colleges. In this regard, it is brought to the notice of all concerned that admissions to MBBS in all Medical Colleges falling within the purview of the Indian Medical Council Act, 1956 for the academic year 2017-18 have to be through the common counseling conducted by:

    1. The Directorate General of Health Services, Ministry of Health & Family Welfare, Government of India, for 15% All India Quota seats in Government Medical Colleges of the contributing states and for all MBBS seats in Medical Colleges run by Deemed Universities; or the Designated Authority of the State/Union Territory Government in respect of MBBS seats in Government Medical Colleges and Non-Governmental Medical Colleges.

    2. Admissions made without common counseling are impermissible and illegal.

    3. By way of this Advisory all Candidates interested in taking admission in Medical Colleges should check the status regarding grant of permission by the Central Government to the colleges from the website of Medical Council of India i.e. and only after verifying the status of the permission should proceed to take admission.

    4. It has also come to the notice of the Council that certain unscrupulous elements are promising students admission and are also demanding capitation fee for such admissions. These activities are illegal and students are cautioned not to be misled by any such frivolous statements made by these college authorities.  

    5. It is further brought to the notice of all concerned that the following Medical Colleges have not been granted permission by the Medical Council of India / Central Government for admitting students to MBBS for the academic years 2017-18 and 2018-19:
    S. No. State Name of the Medical College

    1.  Andhra Pradesh RVS Institute of Medical Sciences, Chittoor, Andhra Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    2.  Andhra Pradesh Nimra Institute of Medical Sciences, Andhra Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    3.  Andhra Pradesh Gayatri Vidya Parishad Institute of Health Care and Medical Technology, Visakhapatnam, AP
    Debarred from admission for the academic years 2017-18 & 2018-19.

    4.  Andhra Pradesh Apollo Institute of Medical Sciences & Research, Murakkambattu Village, Chittoor, Andhra Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    5.  Chhattisgarh Raipur Institute of Medical Sciences, Raipur, Chhattisgarh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    6.  Chhattisgarh Shri Shankaracharya Institute of Medical Sciences, Junwani, Bhilai, Chhattisgarh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    7.  Chhattisgarh Govt. Medical College, Ambikapur, Chhattisgarh
    Not permitted for admission for the academic year 2017-18.

    8.  Delhi Hamdard Institute of Medical Sciences & Research, New Delhi
    Debarred from admission for the academic years 2017-18 & 2018-19.

    9.  Gujarat Pramukhswami Medical College, Karamsad
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    10.  Haryana World College of Medical Sciences & Research, Village Gurawar, Jhajjar, Haryana
    Debarred from admission for the academic years 2017-18 & 2018-19.

    11.  Haryana N.C. Medical College & Hospital, Israna, Panipat, Haryana
    Debarred from admission for the academic years 2017-18 & 2018-19.

    12.  Jharkhand Patliputra Medical Sciences, Dhanbad, Jharkhand
    Not permitted for admission for the academic year 2017-18 against increased intake from 50 to 100. The college is recognized for 50 MBBS seats, hence, it is permitted for admission for 50 seats.

    13.  Karnataka Kanachur Institute of Medical Sciences & Research Centre, Mangalore, Karnataka
    Debarred from admission for the academic years 2017-18 & 2018-19.

    14.  Karnataka Akash Institute of Medical Sciences & Research Centre, Devanhalli
    Debarred from admission for the academic years 2017-18 & 2018-19.

    15.  Karnataka Sambharam Institute of Medical Sciences & Research, Kolar, Karnataka
    Debarred from admission for the academic years 2017-18 & 2018-19.

    16.  Karnataka Sri Siddhartha Medical College, Tumkur
    Not permitted for admission for the academic year 2017-18 against increased intake from 130 to 150. The college is recognized for 130 MBBS seats, hence, it is permitted for admission for 130 seats.

    17.  Karnataka Al Ameen Medical College & Hospital, Bijapur, Karnataka
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    18.  Karnataka Kempegowda Institute of Medical Sciences, Bangalore
    Not permitted for admission for the academic year 2017-18 against increased intake from 120 to 150. The college is recognized for 120 MBBS seats, hence, it is permitted for admission for 120 seats.

    19.  Kerala Kerala Medical College, Palakkad, Kerala
    Debarred from admission for the academic years 2017-18 & 2018-19.

    20.  Kerala S.R. Medical College & Research Centre, Thiruvananthapuram
    Debarred from admission for the academic years 2017-18 & 2018-19.

    21.  Kerala Al-Azhar Medical College and Super Speciality Hospital, Thodupuzha, Kerala
    Debarred from admission for the academic years 2017-18 & 2018-19.

    22.  Kerala Mount Zion Medical College, Pathanamthitta, Kerala
    Not permitted for admission for the academic year 2017-18.

    23.  Kerala DM Wayanad Institute of Medical Sciences, Wayanad, Kerala
    Debarred from admission for the academic years 2017-18 & 2018-19.

    24.  Kerala Government Medical College, Painav, Idukki, Kerala
    Not permitted for admission for the academic year 2017-18.

    25.  Kerala Kannur Medical College, Kannur
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    26.  Madhya Pradesh Sakshi Medical College & Research Centre, Guna, M.P.
    Debarred from admission for the academic years 2017-18 & 2018-19.

    27.  Madhya Pradesh Advanced Institute of Medical Sciences & Research Centre, Bhopal
    Debarred from admission for the academic years 2017-18 & 2018-19.

    28.  Madhya Pradesh Modern Institute of Medical Sciences, Indore, Madhya Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    29.  Madhya Pradesh Sri Aurobindo Medical College, Indore
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    30.  Maharashtra Institute of Medical Science and Research, Vidyagiri, Satara
    Debarred from admission for the academic years 2017-18 & 2018-19.

    31.  Maharashtra Jawahar Medical Foundation’s Annasaheb Chudaman Patil Memorial Medical College, Dhule
    Debarred from admission for the academic years 2017-18 & 2018-19.

    32.  Maharashtra Maharashtra Institute of Medical Sciences and Research, Talegaon, Dabhade, Pune
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    33.  Maharashtra Dr. D.Y. Patil Medical College, Hospital and Research Center, Navi Mumbai
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 150 to 250. The college is recognized for 150 MBBS seats, hence, it is permitted for admission for 150 seats.

    34.  Maharashtra Dr. Ulhas Patil Medical College & Hospital, Nashik
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    35.  Orissa Hi-Tech Medical College & Hospital, Rourkela
    Debarred from admission for the academic years 2017-18 & 2018-19.

    36.  Punjab Chintpurni Medical College, Gurdaspur
    Debarred from admission for the academic years 2017-18 & 2018-19.

    37.  Rajasthan American International Institute of Medical Sciences, Bedwas, Udaipur
    Debarred from admission for the academic years 2017-18 & 2018-19.

    38.  Rajasthan Ananta Institute of Medical Sciences & Research Centre, Nathdwara, Rajsamand, Udaipur, Rajasthan
    Debarred from admission for the academic years 2017-18 & 2018-19.

    39.  Tamil Nadu Ponnaiyah Ramajayam Institute of Medical Sciences, Kancheepuram, Chennai, Tamil Nadu
    Debarred from admission for the academic years 2017-18 & 2018-19.

    40.  Tamil Nadu Annaii Medical College Hospital & Research Institute, Kancheepuram, Tamil Nadu
    Debarred from admission for the academic years 2017-18 & 2018-19.

    41.  Tamil Nadu Karpagam Faculty of Medical Sciences & Research, Coimbatore
    Not permitted for admission for the academic year 2017-18.

    42.  Tamil Nadu Madha Medical College and Hospital, Thandalam, Chennai
    Debarred from admission for the academic years 2017-18 & 2018-19.

    43.  Tamil Nadu Melmaruvathur Adhiprasakthi Institute of Medical Sciences & Research, Melmaruvathur
    Debarred from admission for the academic years 2017-18 & 2018-19.

    44.  Telangana RVM Institute of Medical Sciences & Research Centre, Mulugu Mandal, Medak Distt., Telangana
    Debarred from admission for the academic years 2017-18 & 2018-19.

    45.  Telangana Mahavir Institute of Medical Sciences, Ranga Reddy, Vikarabad, Telangana
    Debarred from admission for the academic years 2017-18 & 2018-19.

    46.  Telangana Malla Reddy Medical College for Women, Jeedimetla, Hyderabad
    Debarred from admission for the academic years 2017-18 & 2018-19.

    47.  Telangana SVS Medical College, Mehboobnagar
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    48.  Telangana Mediciti Institute of Medical Sciences, Ghanpur, Ranga Reddy, A.P.
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    49.  Uttar Pradesh Glocal Medical College, Super Specialty Hospital & Research Center, Mirzapur, Saharanpur, U.P.
    Debarred from admission for the academic years 2017-18 & 2018-19.

    50.  Uttar Pradesh G.C.R.G. Institute of Medical Sciences, Lucknow, Uttar Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    51.  Uttar Pradesh Krishna Mohan Medical College & Hospital, Mathura, Uttar Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    52.  Uttar Pradesh Venkateshwara Institute of Medical Sciences, Gajraula, J.P. Nagar, Uttar Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    53.  Uttar Pradesh Saraswati Medical College, Unnao, Uttar Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    54.  Uttar Pradesh Prasad Institute of Medical Sciences, Lucknow
    Debarred from admission for the academic years 2017-18 & 2018-19.

    55.  Uttar Pradesh Varunarjun Medical College, Banthra, Distt. Shahjahanpur, Uttar Pradesh
    Debarred from admission for the academic years 2017-18 & 2018-19.

    56.  Uttar Pradesh Hind Institute of Medical Sciences, Ataria, Sitapur, Uttar Pradesh
    Not permitted for admission for the academic year 2017-18.

    57.  Uttar Pradesh Major S D Singh Medical College and Hospital, Fatehgarh, Farrukhabad
    Debarred from admission for the academic years 2017-18 & 2018-19.

    58.  Uttarakhand Shridev Suman Subharti Medical College, Dehradun, Uttarakhand
    Debarred from admission for the academic years 2017-18 & 2018-19.

    59.  Uttar Pradesh Saraswathi Institute of Medical Sciences, Hapur
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    60.  Uttar Pradesh Era's Medical College & Hospital, Lucknow
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    61.  Uttar Pradesh Rohilkhand Medical College & Hospital, Bareilly
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    62.  Uttar Pradesh Subharati Medical College, Meerut
    Debarred from admission for the academic years 2017-18 & 2018-19 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    63.  West Bengal Gouri Devi Institute of Medical Sciences, Durgapur, Burdwan, West Bengal
    Debarred from admission for the academic years 2017-18 & 2018-19.

    64.  West Bengal IQ-City Medical College, Burdwan, West Bengal
    Not permitted for admission for the academic year 2017-18.

    65.  West Bengal ICARE Institute of Medical Sciences & Research, Haldia, West Bengal
    Not permitted for admission for the academic year 2017-18.

    66.  West Bengal North Bengal Medical College, Darjeeling
    Not permitted for admission for the academic year 2017-18 against increased intake from 100 to 150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

    67.  West Bengal Midnapore Medical College, Midnapore
    Not permitted for admission for the academic year 2017-18 against increased intake from 100-150. The college is recognized for 100 MBBS seats, hence, it is permitted for admission for 100 seats.

                                                                                                              Sd/- (Dr. Reena Nayyar) Secretary I/C



    WASHINGTON: President Donald Trump has announced his support for legislation that would cut in half the number of legal immigrants allowed into the US while moving to a "merit-based" system favouring English-speaking skilled workers for residency cards.

    If passed by the Congress and signed into law, the legislation titled the Reforming American Immigration for Strong Employment (RAISE) Act could benefit highly-educated and technology professionals from countries such as India.

    The RAISE Act would scrap the current lottery system to get into the US and instead institute a points-based system for earning a green card. Factors that would be taken into account include English language skills, education, high-paying job offers and age.
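For illustration, the factors listed above could feed a simple scoring function. This sketch is purely hypothetical: the article names the factors (English skills, education, job-offer pay, age) but not the point values, so every weight and threshold below is invented.

```python
# Hypothetical illustration of a points-based green-card scoring system.
# All weights are invented for the sketch; the bill's actual values are
# not given in the article.

def score_applicant(english_level, education, offer_wage_ratio, age):
    """Return a points total; higher totals rank higher in the queue."""
    points = 0
    points += {"none": 0, "basic": 2, "fluent": 6}[english_level]
    points += {"high_school": 1, "bachelors": 5, "doctorate": 10}[education]
    # Reward job offers paying above the typical local wage.
    if offer_wage_ratio >= 1.5:
        points += 8
    elif offer_wage_ratio >= 1.0:
        points += 4
    # Favor working-age applicants.
    if 25 <= age <= 40:
        points += 5
    return points

print(score_applicant("fluent", "doctorate", 1.6, 30))  # 29
```

The point of the sketch is only the mechanism: applicants are ranked by an explicit score rather than selected by lottery.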

    "The RAISE Act will reduce poverty, increase wages, and save taxpayers billions and billions of dollars. It will do this by changing the way the US issues Green Cards to nationals from other countries. Green Cards provide permanent residency, work authorisation, and fast track to citizenship," Trump said at a White House event to announce his support to the RAISE Act.

    Standing along with the two top authors of the bill -- Senators Tom Cotton and David Perdue -- Trump said the RAISE Act ends chain migration, and replaces the low-skilled system with a new points-based system for receiving a Green Card.

    This competitive application process will favour applicants who can speak English, financially support themselves and their families, and demonstrate skills that will contribute to our economy, he said, adding that the RAISE Act prevents new migrants and new immigrants from collecting welfare, and protects US workers from being displaced.

    Trump said this legislation will not only restore America's competitive edge in the 21st century, but it will restore the sacred bonds of trust between America and its citizens.

    "This legislation demonstrates our compassion for struggling American families who deserve an immigration system that puts their needs first and that puts America first," he said.

    The RAISE Act will be re-orienting Green Card system towards people who can speak English, who have high degrees of educational attainment, who have a job offer that pays more, and a typical job in their local economy, who are going to create a new business, and who are outstanding in their field around the world, Senator Cotton said.

    Senator Perdue said the current system does not work. "It keeps America from being competitive, and it does not meet the needs of the economy today," he said.

    "Today we bring in 1.1 million legal immigrants a year. Over 50 per cent of our households of legal immigrants today participate in our social welfare system. Right now, only 1 out of 15 immigrants who come into our country come in with skills that are employable. We've got to change that," he said.

    "We can all agree that the goals of our nation's immigration system should be to protect the interests of working Americans, including immigrants, and to welcome talented individuals who come here legally and want to work and make a better life for themselves. Our current system makes it virtually impossible for them to do that," said Senator Perdue.

    According to Attorney General Jeff Sessions, the higher entry standards established in this proposal will allow authorities to do a more thorough job reviewing applicants for entry, therefore protecting the security of the US homeland.

    The additional time spent on vetting each application as a result of this legislation will also ensure that each application serves the national interest, he observed.

    View at the original source


    The successful user experience is about meeting a consumer’s needs on an individual level – a “segment of one” not “one-size-fits” all, many experts say. But what does that look like in practice? “What really differentiates companies is their personalization through data — which allows them to build unique experiences that lead to increased engagement and better outcomes, …” write Scott A. Snyder, president and CSO of Mobiquity and a senior fellow at Wharton, and Jason Hreha, founder of Dopamine, a behavior design firm, in this opinion piece.

    Today, design has a seat at the table. With the success of products like the iPod and the iPhone, businesses have realized that a good user experience is key for the bottom line.

    Yet even with this determined focus on design, most digital experiences fall short of user expectations. Of the 700 million websites that exist, 72% fail to consistently engage users or drive conversions. Of the 1.6 million apps available, just 200 account for 70% of all usage, and three out of four apps aren’t even used beyond the initial download.

    So where did things go wrong? Or more importantly, how can we get them right? Surprisingly, the answer does not lie with design. It lies with data.

    Netflix is an example of a company that pays attention to user experience. Early on Netflix chose not to charge late fees, like Blockbuster did, in order to help build its subscription DVD business. Netflix soon put Blockbuster out of business, but also came under threat from other online video streaming businesses like Sling and Roku. Fortunately, Netflix was able to use its viewing analytics to create personalized content recommendations, and even create its own shows geared toward viewer interests, such as House of Cards and Orange is the New Black.

    For Netflix, the user experience was the price of entry, and the viewing data they gathered and analyzed became the strategic advantage of the business. Because of their approach, we don’t order special TV/Movie packages anymore. Instead, thanks to Netflix analytics, we have our viewing experience tailored to who we are. This is one example of a new breed of data-driven user experiences created by companies like Amazon, Pandora, Sephora, Nike, Progressive and Disney.

    Good user experience design has become table stakes. If you don’t do it well, you can’t even get out of the gate in this hyper-competitive digital world. What really differentiates companies is their personalization through data — which allows them to build unique experiences that lead to increased engagement and better outcomes for the user and company. However, there is a fine line between “helpful” and “annoying” in the digital world, and the price of getting your data-driven personalization right or wrong may be the difference between a delighted customer and one who will never come back to your brand.


    How Can We Achieve Truly Personalized Experiences?

    There are three reasons why digital solutions fail to engage users long term and drive positive outcomes: segmentation, relevance and rewards.

    1. Segmentation: Behavioral

    Are you someone who likes competition and rewards? Or are you someone motivated by helpful nudges from friends and family? Do you respond to text messages during work, or do you catch up on your personal messages at night? Do you travel a lot? Do you have a “wearable” (or are willing to use one)?

    These are the types of questions we should be asking our users. We can either ask them directly, or infer answers from user interactions and behaviors. People are all different — but you wouldn’t know it by looking at most digital products. The majority of applications create a one-size-fits-all experience that fails to engage even a fifth of those who sign up. The good news is that we have the ability to collect individual behavioral data from users so that we can segment them more accurately, and present them with experiences that speak to their unique experiences and preferences.
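As a toy illustration of this kind of behavioral segmentation, a first pass can be rule-based before any machine learning is involved. All signal names, segment labels and thresholds below are invented for the sketch:

```python
# A minimal sketch of behavioral segmentation, assuming we log simple
# per-user interaction counts. Segment names and thresholds are invented.

def segment_user(events):
    """Map a dict of behavioral signals to a coarse segment label."""
    shares = events.get("shares_with_friends", 0)
    competitions = events.get("challenges_joined", 0)
    night_opens = events.get("opens_after_9pm", 0)
    total_opens = max(events.get("opens_total", 1), 1)

    if competitions > shares:
        return "competitor"   # motivated by competition and rewards
    if shares > 0:
        return "social"       # responds to nudges from friends and family
    if night_opens / total_opens > 0.5:
        return "night_owl"    # catches up on messages at night
    return "default"

print(segment_user({"challenges_joined": 4, "shares_with_friends": 1}))  # competitor
```

In practice these hand-written rules would be a starting point; clustering over the same behavioral features can refine the segments as data accumulates.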

    2. Relevance: Getting Context Right

    [Figure: an expanded model of context, spanning time, location, situation and emotion]

    In order to deliver relevant, impactful interactions at the right moment, we need to understand each user’s context. But context is much more than just time (when?) and location (where?). With richer data being collected from both users and third-party sources, context is now evolving to include situation (what am I doing?) and emotion (how am I feeling?). An expanded definition of context is shown in the figure above.

    (Reference: Mobiquity and Wireless Innovation Council Research, 2014)

    With this multifaceted model of context, we move closer to the ideal “segment of one” (a unique profile for each user at a given point in time). You would not want to send weight loss content to a customer who is maintaining a healthy weight, or give a shopping coupon to a stressed out traveler in an airport security line. Context-aware applications like Google Now and Tempo AI (acquired by Salesforce) leverage a user’s calendar as a source of context, so that they know when a user may be busy, in a meeting or enjoying downtime. This information is used by these applications to adjust their content and experience to fit the context-determined mindset of each individual user.
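The examples above (no weight-loss content for a healthy user, no coupon for a stressed traveler in a security line) amount to gating interactions on context. A minimal sketch of such a filter, using the four-part model and hypothetical field names:

```python
# Sketch of a context filter in the spirit of the four-part model above
# (time, location, situation, emotion). All field names are hypothetical.

def should_send_promotion(ctx):
    """Suppress interruptions when the user's context makes them unwelcome."""
    if ctx.get("situation") in {"in_meeting", "driving", "security_line"}:
        return False                       # busy: stay quiet
    if ctx.get("emotion") == "stressed":
        return False                       # bad moment for a pitch
    if ctx.get("local_hour", 12) < 8 or ctx.get("local_hour", 12) > 21:
        return False                       # respect quiet hours
    return True

print(should_send_promotion({"situation": "security_line"}))            # False
print(should_send_promotion({"situation": "downtime", "local_hour": 19}))  # True
```

A real system would infer `situation` and `emotion` from calendar, sensor and interaction data rather than receive them directly, but the gating logic is the same idea.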


    Most users are only willing to share their data if they perceive that they will get real benefits in return. More than 60% of consumers want real-time promotions, yet 67% don’t trust retailers with their data (Opinion Lab Survey, 2015). We have the opportunity to do better.

    3. Reward: Overcoming the Effort versus Benefit Challenge

    In order to get, you have to give. Unfortunately, most applications ask for too much and offer too little. It’s common for apps to have a long-winded sign-up process that asks you every question under the sun. This is not a recipe for success.

    Popular applications like Waze and Pandora are case studies in proper information gathering. They provide us with immediate benefits right after we download their apps. Waze improves our driving route in exchange for our location. Pandora gives us a personal DJ, tailored to our tastes, in exchange for rating the songs we’re listening to. In both these cases, our effort seems minimal in comparison to the benefits we get back. Contrast this with the majority of digital solutions that ask for a lot of data (like registration, profile, location, etc.) before delivering one ounce of benefit. We have to “earn the right” to collect the type of data we need to appropriately segment users. We have to win the “benefit versus effort” trade-off with our users by providing them with immediate, tangible benefits and using the data being collected to further personalize their experiences.

    Evolving to a Data-driven UX Approach

    Eighty-six percent of mobile marketers have reported success from personalization — including increased engagement, higher revenue, improved conversions, better user insights, and higher retention. However, only 1.5% of apps personalize their experiences (Mobile Marketing Automation Report, VB Insight, July 2015). In order to get to true personalization, and deliver greater benefit than effort, we need to make our user-experience (UX) design data-driven.

    A traditional UX design process starts with user research followed by user flow creation, persona creation, storyboards/wireframes creation, and (finally) a graphical mock-up or prototype of the design. The desired result of this process is a single beautiful design that attempts to deliver the best possible experience to meet the needs of all the different user types.


    But the reality is that all users are not the same — and they don’t all want to interact with your app in the same way. By knowing something about each user’s behaviors, motivations and contexts, we have the opportunity to deliver a variation of the core experience that is best suited to each individual user by using robust analytics and an adaptive user interface. In a data-driven UX approach like this, we start with the desired outcomes and behaviors we are trying to achieve with the target user base.

    We then develop an initial behavioral segmentation model, and identify the optimal interaction strategies and user experience for each segment. And finally, we use analytics and machine learning to have our system adapt over time, so we can further optimize the design and underlying interaction models.
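The "adapt over time" step above is naturally framed as an explore/exploit problem. One common approach — sketched here with invented variant names, and not claimed to be any particular vendor's implementation — is an epsilon-greedy bandit that learns the best UI variant per behavioral segment:

```python
import random
from collections import defaultdict

# Sketch: adapting the UI per behavioral segment with an epsilon-greedy
# bandit. Variant and segment names are invented; rewards would come from
# real engagement events in production.

class AdaptiveUI:
    def __init__(self, variants, epsilon=0.1):
        self.variants = variants
        self.epsilon = epsilon
        self.trials = defaultdict(lambda: defaultdict(int))
        self.wins = defaultdict(lambda: defaultdict(int))

    def choose(self, segment):
        if random.random() < self.epsilon:
            return random.choice(self.variants)   # keep exploring
        # Otherwise exploit the best-performing variant for this segment.
        def rate(v):
            t = self.trials[segment][v]
            return self.wins[segment][v] / t if t else 0.0
        return max(self.variants, key=rate)

    def record(self, segment, variant, engaged):
        self.trials[segment][variant] += 1
        if engaged:
            self.wins[segment][variant] += 1

# Usage: serve a variant, then feed back whether the user engaged.
ui = AdaptiveUI(["compact_feed", "rich_feed"])
ui.record("night_owl", "rich_feed", engaged=True)
variant = ui.choose("night_owl")
```

Over many interactions each segment converges on its own best variant, which is exactly the "variation of the core experience" per user group that the text describes.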

    The figure below depicts the difference between a traditional and data-driven UX approach.

    [Figure: the traditional vs. data-driven UX approach]

    Make it Real

    Data-driven UX design is a fundamental shift in how companies approach product design and development. While the journey is not easy, the potential payoff is huge in terms of long-term engagement and positive outcomes for your customers. If you want to move to this new model, you need to consider the following:

    1. Expand your definition of context beyond location and time. Situation and Emotion matter.

    2. Deliver immediate benefits to users before asking for more of their data. There is a fine line between useful and creepy.

    3. Segment your users based on digital behaviors, preferences, motivations and context to drive the most relevant interactions.

    4. Set up a big data and analytics environment capable of capturing and acting on behavioral analytics data in real time.

    5. Use analytics and machine learning to adapt the target interactions for each user-segment over time, based on user responses.

    6. Recruit a new breed of user experience designers—those with analytics skills to support the design of adaptive user experiences.

    7. Start with desired outcomes, then pilot and adjust quickly.

    It’s no longer good enough to know your customers. It’s what you do with that knowledge that really matters. Your customers are willing to engage and share their data if they perceive a real benefit for them.

    Are you ready to live up to your end of the bargain?

    View at the original source


    Image credit : Shyam's Imagination Library

    In recent weeks, a story about experimental Facebook machine learning research has been circulating with increasingly panicky, Skynet-esque headlines.

    “Facebook engineers panic, pull plug on AI after bots develop their own language,” one site wrote. “Facebook shuts down AI after it invents its own creepy language,” another added. “Did we humans just create Frankenstein?” asked yet another. One British tabloid quoted a robotics professor saying the incident showed “the dangers of deferring to artificial intelligence” and “could be lethal” if similar tech was injected into military robots.

    References to the coming robot revolution, killer droids, malicious AIs and human extermination abounded, some more or less serious than others. Continually quoted was this passage, in which two Facebook chat bots had learned to talk to each other in what is admittedly a pretty creepy way.

    Bob: I can i i everything else

    Alice: balls have zero to me to me to me to me to me to me to me to me to

    Bob: you i everything else

    Alice: balls have a ball to me to me to me to me to me to me to me to me

    The reality is somewhat more prosaic. A few weeks ago, FastCo Design did report on a Facebook effort to develop a “generative adversarial network” for the purpose of developing negotiation software.

    The two bots quoted in the above passage were designed, as explained in a Facebook Artificial Intelligence Research unit blog post in June, for the purpose of showing it is “possible for dialog agents with differing goals (implemented as end-to-end-trained neural networks) to engage in start-to-finish negotiations with other bots or people while arriving at common decisions or outcomes.”

    The negotiation system’s GUI. Gif Credit: Facebook AI Research

    The bots were never doing anything more nefarious than discussing with each other how to split an array of given items (represented in the user interface as innocuous objects like books, hats, and balls) into a mutually agreeable split.

    The intent was to develop a chatbot which could learn from human interaction to negotiate deals with an end user so fluently said user would not realize they are talking with a robot, which FAIR said was a success:

    “The performance of FAIR’s best negotiation agent, which makes use of reinforcement learning and dialog rollouts, matched that of human negotiators ... demonstrating that FAIR’s bots not only can speak English but also think intelligently about what to say.”

    When Facebook directed two of these semi-intelligent bots to talk to each other, FastCo reported, the programmers realized they had made an error by not incentivizing the chatbots to communicate according to human-comprehensible rules of the English language. In their attempts to learn from each other, the bots thus began chatting back and forth in a derived shorthand—but while it might look creepy, that’s all it was.

    “Agents will drift off understandable language and invent codewords for themselves,” FAIR visiting researcher Dhruv Batra said. “Like if I say ‘the’ five times, you interpret that to mean I want five copies of this item. This isn’t so different from the way communities of humans create shorthands.”
    Facebook did indeed shut down the conversation, but not because they were panicked they had untethered a potential Skynet. FAIR researcher Mike Lewis told FastCo they had simply decided “our interest was having bots who could talk to people,” not efficiently to each other, and thus opted to require them to write to each other legibly.
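Batra's example — repeating a token to encode a quantity — is simple enough to mock up. The toy decoder below is an illustration of the general idea only, not Facebook's actual agent code:

```python
# Toy illustration of the repetition shorthand described above: repeating
# a marker phrase n times stands for wanting n copies of an item.

def decode_shorthand(utterance, marker="to me"):
    """Count repetitions of the marker phrase to recover a quantity."""
    return utterance.count(marker)

line = "balls have zero to me to me to me to me to me to me to me"
print(decode_shorthand(line))  # 7 -- read as "give me seven of this item"
```

Decoded this way, the "creepy" transcript becomes an ordinary, if terse, bargaining protocol.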

    But in a game of content telephone not all that different from what the chat bots were doing, this story evolved from a measured look at the potential short-term implications of machine learning technology to thinly veiled doomsaying.

    There are probably good reasons not to let intelligent machines develop their own language which humans would not be able to meaningfully understand—but again, this is a relatively mundane phenomenon which arises when you take two machine learning devices and let them learn off each other. It’s worth noting that when the bot’s shorthand is explained, the resulting conversation was both understandable and not nearly as creepy as it seemed before.

    As FastCo noted, it’s possible this kind of machine learning could allow smart devices or systems to communicate with each other more efficiently. Those gains might come with some problems—imagine how difficult it might be to debug such a system that goes wrong—but it is quite different from unleashing machine intelligence from human control.

    In this case, the only thing the chatbots were capable of doing was coming up with a more efficient way to trade each other’s balls.

    There are good uses of machine learning technology, like improved medical diagnostics, and potentially very bad ones, like riot prediction software police could use to justify cracking down on protests. All of them are essentially ways to compile and analyze large amounts of data, and so far the risks mainly have to do with how humans choose to distribute and wield that power.

    Hopefully humans will also be smart enough not to plug experimental machine learning programs into something very dangerous, like an army of laser-toting androids or a nuclear reactor. But if someone does and a disaster ensues, it would be the result of human negligence and stupidity, not because the robots had a philosophical revelation about how bad humans are.

    At least not yet. Machine learning is nowhere close to true AI, just humanity’s initial fumbling with the technology. If anyone should be panicking about this news in 2017, it’s professional negotiators, who could find themselves out of a job.

    View at the original source


    From Bitcoin and Blockchain to hacks and ransomware, security is a hot topic in tech. According to a recent Cybersecurity Ventures Report, the cost of cybercrime damage is predicted to reach $6 trillion by 2021, and global spending on cybersecurity products and services for defending against cyber crime is projected to exceed $1 trillion by 2021. With that much at stake, the business world certainly has security on their minds.

    It’s critical for technology marketers to stay on the pulse of today’s trends – and what better way to do so than by following influential peers on LinkedIn? To learn from today’s best writers, content curators, and opinion leaders on all things security, look no further.

    Below, we’re showcasing five professionals providing a helpful portal into the world of security – those that can keep you — and your organization — in the know!

    Bill Brenner, Infosec Scribe at Sophos

    Bill has been on a number of “top tech” lists, and for good reason. He is active in the LinkedIn security community and focused on sharing real-time info about the current state of the industry. Following him ensures that you’ll be alerted about recent ransomware attacks – and fed great articles on how to combat it.

    One of Bill’s recent posts shared an angle not often seen, talking about his own “Rockstar” status in the security world and how it caused his content and point of view to become stale. When an influencer takes time to step back and assess his or her influence and POV, it’s anything but stale; Bill is a breath of fresh air in an oftentimes cluttered conversation.

    Tech marketer takeaway: If you think you’ve learned everything about a particular topic or industry, you’re probably wrong. As Bill put it, “Never stop seeking truth.” Your customers and prospects will thank you for this – it’s never fun talking to a person or company who thinks they have all the answers. Keep an open mind when it comes to new content, ideas and perspectives, and you’ll be a much better marketer for it!

    Maya Schirmann, CMO at Deep Instinct

    Maya is a great go-to connection when it comes to security marketing. Her company, Deep Instinct, is “the first company to apply deep learning to cybersecurity” and she also takes a “deep learning” approach to her own content. Tech marketers can benefit from her great approach to discussing security. As data breaches and hacks are in the headlines more often than not these days, Maya shares frequent helpful updates, podcasts and go-to lists on all of the latest cybersecurity news. She also covers a number of insightful topics in her published posts such as weak password management, “hacking highlights” and an Oscar-themed awards post containing actionable security advice for organizations both large and small.

    Tech marketer takeaway: Creatively tying your topic of choice to current events and impactful discussions is a great way to get noticed – and it’ll pique the interest of potential customers. Tying content into interesting trends and current events is an effective way to set your content apart.

    Steve Morgan, Founder & Editor-In-Chief, Cybersecurity Ventures

    Steve is a veritable fountain of security info – he’s written dozens of reports on cybercrime, cybersecurity products and services, and launched the “Cybersecurity 500” list of the hottest cybersecurity companies to watch each year. He shares updates on cybersecurity employment, defense firms, hacks and data breaches – and how companies can be proactively preparing for their IT security futures. Many of his shared posts are helpful because they are so proactive – they often include step-by-step instructions for companies looking to boost their cybersecurity efforts, where to best spend their security budgets and what risks they should be most aware of and on the lookout for. Steve doesn’t just want companies to be in the know – he wants them to do something about it!

    Tech marketer takeaway: Simply knowing about the current trends and buzzwords in an industry isn’t enough. To be truly effective as a security marketer, you must provide actionable insights to your customers and prospects. Everyone knows that cyberattacks can and will hit: but are you contributing to the conversation and equipping people who are preparing for the worst?

    Yotam Gutman, VP Marketing at CyberDB

    Yotam not only frequently posts interesting cybersecurity trends, he actively wants to engage and have meaningful discussions with his LinkedIn audience of almost 15,000 followers. He takes cybersecurity incredibly seriously and shares intricate, detailed data – but he also doesn’t shy away from sharing a fun story or video with his followers now and then. Yotam often attends IT security events around the world and takes the time to share what he’s learned – reminding his followers and their companies that amongst the hustle and hype around cybersecurity, it’s important to define who you are and have a definitive brand for your organization.

    Tech marketer takeaway: You don’t have to be a technical expert in all things security to come at this topic from a new angle and appeal to your tech buyers with your insights. Tech professionals want to be part of an active community that both gives out relevant info and engages in a more meaningful way (and events are a great way to do this). It’s all about the conversation!

    Bob Carver, Manager of Network Security, Verizon

    Bob knows his stuff when it comes to risk management, cyber resiliency and strategy, incident management and threat intelligence – he’s monitored tens of thousands of infected endpoints at Verizon, at one point overseeing the company’s Security Incident Response team. Follow Bob for a more in-depth take on many IT security conversations, from Botnet monitoring and cryptocurrency to the NSA. Bob’s content comes from the fact that he’s most likely “seen it all” throughout his career – and he wants to put that in-depth knowledge to good use for companies around the globe looking to proactively fight cyberattacks.

    Tech marketer takeaway: A little technical in-depth info goes a long way. It might take some serious time and a lot of research to truly understand complex security topics that your customers care about – but it’s worth the hassle in the end.

    As security is becoming more and more of a priority for many in the tech industry, it’s important to stay abreast of the latest security trends and innovations. These security leaders are invaluable within the LinkedIn technology community, and each showcases the true power of the LinkedIn network: that anyone can both access and participate in greater industry discussions across the globe. It’s time to take what we learn from them and put it into action!


    Electric cars have started to gain traction in the automotive market as the public desires a sustainable alternative to the traditional fossil fuel vehicles. Rather than develop an electric car to use electricity obtained from a standard grid, Sono Motors decided to take this concept a step further with Sion, a self-charging car that utilizes solar energy with solar panels wrapped around the exterior of the vehicle. The solar panels grant the owner the freedom to take their vehicle anywhere and know that the battery will consistently be charged.

    The solar panels installed in the exterior do not protrude from the frame of the car; instead they hug the roof, rear, front and sides of the car and are covered in a layer of polycarbonate. Sono Motors created two models, an Extender and an Urban. On a full battery charge, the Extender can travel about 155 miles straight, while the Urban can go up to roughly 75 miles. Both of the models can drive for nearly 19 miles by simply sitting in the sun, before drawing power from the battery. Once the battery depletes, it takes 40 minutes to get 80% of the battery’s full power back through an electrical outlet, or half a day sitting in the sun.
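The quoted figures can be sanity-checked with a little arithmetic. The article gives ranges and charging times but not the battery capacity, so this sketch works in fractions of a full charge rather than kilowatt-hours:

```python
# Back-of-the-envelope check of the figures quoted above. The battery
# capacity is not given in the article, so we work in fractions of a charge.

extender_range_mi = 155          # full charge, Extender model
solar_only_mi = 19               # miles gained per day of sun

# 80% recharge in 40 minutes from an outlet implies this average rate:
outlet_pct_per_min = 80 / 40     # 2% of the pack per minute

# A day in the sun adds about 19 mi, i.e. this fraction of an Extender charge:
solar_fraction = solar_only_mi / extender_range_mi

print(f"outlet: {outlet_pct_per_min:.0f}%/min")         # outlet: 2%/min
print(f"solar day = {solar_fraction:.0%} of a charge")  # solar day = 12% of a charge
```

A day of sun replacing roughly an eighth of a full charge is consistent with the article's "half a day sitting in the sun" figure for topping up after a partial deplete, with the outlet remaining the fast option.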

    Sustainable features extend to the inside of the vehicle, which contains an air filtering unit that uses a moss liner under the dashboard, called the breSono system. The moss, a special lichen, doesn’t require additional maintenance from the owner as the plants acquire the water they need to survive by absorbing it through the air. The plant also acts as a sound dampener to avoid hearing the engine and protects against potential fires.


    The developers behind the Sion decided to bring the idea forward with an Indiegogo crowdfunding campaign in 2016. While the vehicle cost €16,000, backers of the campaign did not have to offer all of this upfront. The developers broke it down into reasonably priced chunks, from €50 all the way to €14,080. Those who pledged more than €50 had the opportunity to get put at the front of the line to preorder the vehicle when they released it as well as the chance to test drive the car before anyone else. The Indiegogo campaign raised €180,000 for the Sion.

    Sono Motors continues to accept preorders on its website. The company offers preorders at four price tiers: €500, €2,000, €8,000, and €14,720, which is the full price of the car. Those who plan to buy the Sion on one of the installment plans will eventually need to pay what they owe, but at a discount. If any parts of the vehicle require maintenance, owners can go through Sono Motors. Backers should receive their Sions sometime in 2019, but an exact date remains unknown.



    The Reserve Bank of India (RBI) will transfer Rs 30,659 crore of its surplus to the government for the financial year 2016-17, less than half of the Rs 65,876 crore it transferred a year earlier.

    The RBI did not provide any reason for the decline in dividend but economists said this indicated the cost incurred by the central bank in printing new notes as well as in sterilising liquidity after old Rs 500 and Rs 1,000 currency notes were scrapped in November and subsequently returned to the banking system.

    The dividend paid is the lowest since 2011-12, when the RBI had transferred Rs 16,010 crore of its surplus to the government. In 2012-13, the central bank paid Rs 33,010 crore. The RBI’s financial year runs from July to June. The central bank is expected to publish its annual reports next week after its board met on Thursday to clear the accounts. 

    In 2012-13, the YH Malegam Committee recommended the central bank transfer its entire surplus to the government. The RBI has been transferring its entire surplus to the government since then. It paid Rs 52,679 crore in 2013-14 and Rs 65,896 crore in 2014-15.

    In the Union Budget for 2017-18, the government had accounted for a dividend of Rs 74,901 crore from the RBI and other nationalised banks. An official later said the RBI’s share would be Rs 58,000 crore. 

    RBI Governor Urjit Patel told a parliamentary panel in July that the central bank had not finished counting the old returned notes. 

    He has also said notes not returned remain the RBI’s liability and cannot be passed on to the government as dividend. 

    The Union Budget had not accounted for any special dividend from the RBI against demonetisation, which some economists had estimated would be in the lakhs of crores of rupees.

    The low actual dividends, meanwhile, will exert pressure on the government to meet its fiscal deficit. Care Ratings Chief Economist Madan Sabnavis said the fiscal deficit could increase from 3.2 per cent of the GDP to 3.4 per cent this year. At its peak, the excess liquidity parked by banks neared Rs 5 lakh crore, on which the central bank had to pay them 6 per cent interest. The average daily liquidity absorption continued to remain above Rs 2 lakh crore after demonetisation was announced.
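
The arithmetic behind this estimate can be sketched roughly. The figures below use the official's cited RBI share of the budgeted dividend and the actual transfer from the article; the nominal-GDP figure is an assumption for illustration, not from the article:

```python
# Back-of-envelope check (illustrative, with an assumed GDP figure) of how
# the RBI dividend shortfall could widen the fiscal deficit beyond 3.2% of GDP.
budgeted_rbi_share = 58_000      # Rs crore, RBI share cited by an official
actual_dividend = 30_659         # Rs crore, actual FY2016-17 transfer
gdp_estimate = 16_800_000        # Rs crore, ASSUMED nominal GDP for 2017-18

shortfall = budgeted_rbi_share - actual_dividend      # dividend shortfall
widening_pct = 100 * shortfall / gdp_estimate         # percentage points of GDP

print(f"Shortfall: Rs {shortfall:,} crore")
print(f"Deficit widening: ~{widening_pct:.2f} pp (3.2% -> ~{3.2 + widening_pct:.2f}%)")
```

On these assumptions the shortfall alone widens the deficit by roughly 0.16 percentage points; other slippages would account for the rest of the move toward 3.4 per cent.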

    According to Devendra Pant, chief economist of India Ratings & Research, the appreciation of the rupee against the dollar depressed returns, in rupee terms, on the RBI’s foreign holdings. The rupee has appreciated by more than 6 per cent against the dollar since January.


    A resident of Mumbra, a town on the outskirts of Mumbai, has been fighting for a refund from Amazon for the past two months, to no avail.

    Mumbra resident Mohammad Sarwar had ordered a 50-inch television set on Amazon's website in May and paid Rs 33,000 via credit card.

    Sarwar received the package on time but was advised not to open it until a technician came to install it.

    "They said I may inadvertently damage the TV while opening the box. I left the box intact, which I now realise was a big mistake," he told Mirror.

    When the technician arrived a day later, there was no TV inside the box but a 13-inch Acer monitor, which appeared to have been used before.

    Sarwar has been fighting for a refund since then.

    "I made several calls before a customer service agent said the refund would be issued only after I sent the package back. I was told I would have to bear the courier service charges. The suggestion left me furious, but I wanted my money back so I agreed," he said.

    The story does not end here. A courier company refused to send the monitor as it didn't have an office near his house.

    Meanwhile, Sarwar claims Amazon did not pay heed to his calls.

    "The e-tailer's customer support took its own sweet time every time I called. It kept transferring my calls from one agent to another. I even sent emails and shared my grievance on social media. Nothing happened," he said.

    An Amazon spokesperson told Mirror it was trying to resolve Sarwar's complaint. "We are in touch with the customer and are committed to resolving this at the earliest," the official said.

    "I understand as an e-commerce site, they (Amazon) have their limits, but they can't keep me hanging after delivering a wrong product. I will take the matter to a court, if that's what will make them take customers seriously," said Sarwar.  



    Desktop Alert, Inc., owner of patented network-centric emergency mass notification systems (EMNS) that deliver alerts in under one minute to military, government, healthcare, higher-education and industrial organizations, today announced that its industry-leading mass notification communication platform, Desktop Alert 5.x, has garnered three first-place awards in Government Security News’ (GSN) 2017 Airport, Seaport, Border Security Awards.

    Panel 2 of the Summit was moderated by Chuck Brooks, President for Government Relations and Marketing at Sutherland Global Services, who ran the most interactive panel of the day. With all guests being members of DC’s IT tech elite and the subject of the panel being future threats and new defense technologies, the ballroom was buzzing with questions and discourse. It’s safe to say that this panel ran much like a think tank, comprising DC’s greatest tech minds and fueled by the spirit of collaborative learning.

    Mr. Brooks has also been cited by LinkedIn as one of the top 5 of its 500 million members to follow on emerging-technology issues. LinkedIn will also feature Chuck in its upcoming blogs as a cybersecurity SME and advisor.

    Desktop Alert was named Best Mass Notification System and also a co-winner for Most Notable Implementation of New Technology – Solano County Implementation of Desktop Alert and Safekey. Desktop Alert subsidiary Metis Secure Solutions also won for Best Alert Beacon System.

    "We are honored to have been chosen as a multiple category winner. Our company's numerous years of products and services to the U.S. Army National Guard, U.S. Air National Guard and Northern Command proved pivotal in the award selection process," said Howard Ryan, Founder, Desktop Alert Inc.

    About Desktop Alert:  

    Worldwide, U.S. military organizations such as U.S. Northern Command, the United States National Guard, the United States Air Force Academy, the United States Military Academy at West Point, Multi-National Forces in Iraq and Afghanistan, the U.S. Air Force and the U.S. Army now utilize the Desktop Alert mass notification platform daily for their organizations' emergency communication requirements. Desktop Alert can contact thousands of users with desktop alerts and require receipt confirmation of the message. Those not verified can then be listed on a report and/or sent as a "Target Package" to be automatically contacted by other means such as email, SMS, phone calls and other devices.
