Using Advanced Analytics to Redefine IT Service Excellence

Enabling profitable revenue growth, business scalability, operational efficiency and higher customer satisfaction through rapid innovation has become the prime focus of the CIO office. Modern IT organizations have positioned themselves as being in charge of setting up the big data foundation that enables business teams across engineering, manufacturing, sales, marketing, operations and finance to rev up their growth engines, become more agile and reduce business risk.

While doing so, many IT organizations don’t have the time or the resources to effectively manage their own backyard, e.g. end-user productivity infrastructure and service desks. While there are tools like Zendesk and ServiceNow to manage IT service delivery and performance, these generally lack good analytical reporting capabilities. As a result, many IT organizations still rely on spreadsheet-based reporting.

Not only does it take a long time to generate IT service performance reports, it is also very difficult to extract any insights from them. Because these reports provide no predictive capabilities, it is very difficult to plan intelligently for future service improvements, and their overall usability is generally very poor.

There are plenty of third-party analytical solutions that integrate with these service management platforms. However, most of these cannot be used efficiently by end users. These tools force users to learn their semantics and grammar up front and require a lot of training, and they still expect an IT engineer to write the logic for answering any typical reporting request. This is equivalent to buying a car body, instead of a ready-to-drive car, that works only on a particular type of road. The driver is expected to fit the appropriate parts depending on the type of road, the weather conditions, and everything else about the route… Imagine having to do that day in and day out, and for every possible route… with business teams waiting in line for IT to help with the car assembly! I am not kidding!

At PMC Sierra, as part of our digital transformation, we encouraged end user adoption of analytical products that freed up the IT bandwidth and saved costs on training. Drastin is one such solution that showed potential to increase analytics usage across the enterprise. With its “Google-like” search interface and rich visualizations, anyone could start using it without much training.

Drastin is a natural language-based platform that learns from historical data and helps predict future performance. Its rich visualization dashboards make it easy to slice, dice and drill down into the data (diagnostic analytics), which improves overall user adoption. Also, since it is a cloud-based “analytics as a service” tool, one doesn’t have to tie down precious IT resources to managing it. Drastin’s search functionality feels natural and allows business users to make better decisions while removing the linear dependency on IT engineers. Within a few days of initial setup, and after some learning from the data by the tool, one could ask business queries in plain English (for example, “Monthly MTTR last year for network team”) and get an instantaneous response.
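A query like “Monthly MTTR last year for network team” ultimately boils down to a simple aggregation over ticket data. As a rough illustration of the metric itself (hypothetical ticket records, not Drastin’s internals):

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical ticket records: (team, opened, resolved)
tickets = [
    ("network", datetime(2015, 1, 5, 9, 0), datetime(2015, 1, 5, 13, 0)),
    ("network", datetime(2015, 1, 20, 8, 0), datetime(2015, 1, 21, 8, 0)),
    ("network", datetime(2015, 2, 3, 10, 0), datetime(2015, 2, 3, 16, 0)),
    ("desktop", datetime(2015, 1, 7, 9, 0), datetime(2015, 1, 7, 10, 0)),
]

def monthly_mttr(tickets, team):
    """Mean time to resolution (hours) per month for one team."""
    buckets = defaultdict(list)
    for t, opened, resolved in tickets:
        if t == team:
            hours = (resolved - opened).total_seconds() / 3600
            buckets[opened.strftime("%Y-%m")].append(hours)
    return {month: sum(v) / len(v) for month, v in sorted(buckets.items())}

print(monthly_mttr(tickets, "network"))
# → {'2015-01': 14.0, '2015-02': 6.0}
```

The value of a tool like the one described above is that nobody has to write or maintain this logic by hand for every new question.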

Good advanced analytics tools can eliminate the painful reporting process and give stakeholders easy access to KPIs, metrics and actionable insights without much training. The CIO office can gain deeper insight into IT service delivery across teams, drive the IT organization to effectively track performance for service improvements, and encourage data-driven decisions.

Advanced analytics tools can help redefine IT Service Excellence.

Posted in BIG Data

User-centric IT: Is it a Fad or a real Opportunity for IT to shine?

Traditionally, IT departments have deployed enterprise solutions by focusing first on business problems and only secondarily on user needs. It is no surprise that most such projects fail to deliver the intended business results due to low user adoption.

Information Technology started with data centers as the focal point of service delivery. This was the time when companies mandated central IT departments to establish company-wide frameworks, processes, standards and protocols. IT systems were costly and required in-house implementation by a group of IT experts. IT services in the market were scarce, hard to procure and often difficult to implement. IT defined, designed and rolled out systems that mostly ended up dictating what users could and could not do, and users could only use company-provided phones and laptops.

Fast forward to today…

Cloud technology has made IT services widely and easily accessible. Workday, Box, AWS and Office365 are some of the many ubiquitous services that require only the swipe of a card to procure. These cloud-based CRM, collaboration, HRIS and office productivity tools can be implemented in days, if not minutes. Add to this the plethora of smartphones and tablets that have now become the devices of choice for users. This is also labelled the consumerization of IT, where personal and business use of technology devices and applications blend hassle-free.

Employees of all ages are bringing their own devices and software into the office, and cloud adoption is shifting computing workloads out of the enterprise’s data centers.

Yes, you guessed it right: the days of the traditional IT approach are over.

Enter User-Centric IT…

Transformational IT organizations have realized the shift and have been re-thinking traditional service models. Such groups are designing new services starting with users at the center.

What is User-Centric IT?

The user-centric IT model encompasses the following principles:

  • User-centric IT serves the business by empowering people
  • User-centric IT adapts to the way people work, not the other way around
  • People, information and knowledge must connect in real time
  • Mobility is a work-style preference, not a device
  • Security should be inherent and transparent to the user experience

With changing times, most services either are or soon will be available in the cloud, and users will have the freedom to choose and bring their own devices. IT departments will run fewer and fewer services internally and must rethink their IT service models accordingly.

In the Future

According to Geoffrey Moore, Managing Director of Geoffrey Moore Consulting, “As the business world becomes increasingly digital, mobile and cloud-based, organizations must transform how they deliver IT. It’s no longer about the best or most cost effective technology stack – it’s about the technologies that make the information worker most productive. The user-centric IT initiative inspires this change, and as IT adopts this roadmap, users can access and share information faster, more easily and without barriers, thus leading to better business results.”

As companies adopt user-centric IT models, they will fundamentally change the way businesses deliver IT services.

Implementing this new technology deployment model will lead to higher adoption, agility, and success.



Posted in Leadership

Cognitive Computing – New Era of IT

When IBM’s Watson defeated former human champions on the Jeopardy! show, it marked the beginning of a new cognitive computing era. This new era of computing changes how humans and computers interact.

Cognitive systems will understand the world in the way that humans do: through senses, learning, and experience. Cognitive computing will help human experts to make better decisions.

According to IBM, “growth is accelerating as more of the world’s activity is expressed digitally. Not only is it increasing in volume, but also in speed, variety and uncertainty. Most data now comes in unstructured forms such as video, images, symbols and natural language – a new computing model is needed in order for businesses to process and make sense of it, and enhance and extend the expertise of humans. Rather than being programmed to anticipate every possible answer or action needed to perform a function or set of tasks, cognitive computing systems are trained using artificial intelligence (AI) and machine learning algorithms to sense, predict, infer and, in some ways, think.”

Building Blocks

Cognitive computers are fundamentally different from traditional computers. Traditional computers are built around a central processing unit (CPU), whereas cognitive computers are built to analyse information and draw insights from it. Traditional computers are designed to perform quick calculations and are programmed by humans to perform specific tasks, whereas cognitive computers are capable of learning from their interactions and continuously reprogramming themselves.

BIG Data, Artificial Intelligence and Interfaces are the three foundational technologies driving progress in the field of cognitive computing.

  • BIG Data
    • In recent times there has been huge growth in data, and this forms the raw intelligence for cognitive computing. The volume of data creates the potential to understand the environment around us with a depth and clarity that was not possible before. BIG data is a vast resource that helps machines think and interact like humans.
  • Artificial Intelligence and Machine learning
    • Using this ever-growing BIG data and applying deep analytics, cognitive computers are able to provide solutions to complex problems. To help us think better, cognitive computers need humanlike characteristics – learning, adapting, interacting and some form of understanding – and this need is met by the fast-evolving fields of artificial intelligence and machine learning.


  • Human / Machine Interface:
    • Humans and machines interact with each other using natural language processing and other sensors that aid in seeing and hearing each other. Cognitive computing will result in intelligent collaboration between humans and machines. The idea is not to replace human thinking with machine thinking, but to complement each other’s strengths.


While traditional computing applications involved programming by humans, cognitive systems will learn from their interactions with humans and draw inferences from vast amounts of data captured from sensor networks and other sources.

Cognitive computing is already making progress in the following areas:

  • Medical
  • Finance
  • Sales & Marketing

There are now API platforms available in the cloud that programmers can use to write their own cognitive applications. IBM’s Watson Developer Cloud is one such example.

According to a news article on Gigaom, applications built around the Watson cloud platform were scheduled for release in 2014, including a personal health assistant and a personal shopper; you can guess what these apps are supposed to do.

Other Cognitive platforms include Expect Labs’ MindMeld APIs, AlchemyAPI’s natural-language processing and computer vision services, and Jetpac’s DeepBelief neural network SDK for object recognition.

In Closing

We are crossing a new frontier in the evolution of computing and entering the era of cognitive systems. Scientists and engineers are pushing the boundaries of science and technology with the goal of creating machines that sense, learn, reason and interact with people in new ways. Cognitive systems will help people and organizations penetrate complexity and make better decisions—potentially transforming business and society.

The author of the book Smart Machines says, “In the programmable-computing era, people have to adapt to the way computers work. In the cognitive era, computers will adapt to people,” and that is very profound!

The Jeopardy! challenge pitted man against machine; the sky will be the limit when man and machine work together.


Posted in BIG Data, Cloud, Home

Collaborative Economy – Disrupting Businesses

Imagine a world in which, in addition to sharing pictures, opinions and activities on social media, people start sharing goods and services with each other.

Early indicators of this disruption can already be seen in the following areas:

  • Space – Office space, places to stay, e.g. Airbnb, LiquidSpace, OpenDesks
  • Goods – Pre-owned goods, custom products, e.g. Chegg, BookCrossing
  • Transportation – Cars, bikes, e.g. Uber, Lyft

Key drivers enabling shared economy include:

  • Connectivity – Allows consumers and sellers to plug into the platform easily to share with each other
  • Gravity – An engaging platform that attracts participants, both producers and consumers
  • Flow – Easy way of handling transactions to foster the exchange and co-creation of value

Challenges and Motivations

People are used to transacting with well-known, established businesses. Getting them to share assets rather than purchase them will require a change in their buying behaviors.

Key Challenges:

The following are some of the challenges of using shared services:

  • Risk of unknown suppliers
  • Uncertainty about quality of services
  • Lack of information about availability of shared services.

Key Motivators:

According to a recent survey of people who have used shared services, the following are the main factors that drive shared use of assets:

  • Ease of doing business, anytime and anywhere
  • Lower Pricing
  • Product Quality and Service
  • Recommendations or ratings

Who are the players in Shared Economy?

According to a report based on two surveys conducted between October 2013 and January 2014 by Vision Critical’s Voice of Market, with participants aged 18 and over, the demographics participating in the shared economy divide into the following three categories:

  • Neo-Sharers – Who have in the past year used at least one of several “emergent” sharing services, such as Airbnb and Kickstarter
  • Re-Sharers – Those who “buy and/or sell pre-owned goods online using well-established services like eBay and Craigslist,” but who have yet to graduate to neo-sharer status
  • Non-Sharers – those yet to participate, but many of whom intend to in the next year

Some other key points about the demographics:

  • Neo-sharers skew heavily towards younger groups
  • Sharers tend to come from the more affluent sections of society
  • Sharing of most goods and services tends to be concentrated in top urban centers
  • 51% of neo-sharers are women (compared to 50% of non-sharers)
  • 48% are aged 18-34 (versus 24% of non-sharers)
  • Neo-sharers have easy access to mobile technologies and the internet

Will Shared Economy Impact Your business?

According to the Altimeter research group, every car-sharing vehicle reduces car ownership by 9-13 vehicles; a revenue loss of at least $270,000 to an average auto manufacturer. The cascading effects reach into the rest of the ecosystem, including auto loans, car insurance, fuel and auto parts. Homeowners are displacing hotels. Consumers are skipping the bank and lending to each other.
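The revenue figure above is easy to sanity-check with back-of-envelope arithmetic; the average vehicle price used here is my assumption, not Altimeter’s:

```python
# Back-of-envelope check on the Altimeter figure above.
# Assumption (mine, not Altimeter's): an average new-vehicle
# price of roughly $30,000.
avg_vehicle_price = 30_000  # assumed, USD
vehicles_displaced_low, vehicles_displaced_high = 9, 13

loss_low = vehicles_displaced_low * avg_vehicle_price
loss_high = vehicles_displaced_high * avg_vehicle_price
print(f"${loss_low:,} - ${loss_high:,} per shared vehicle")
# → $270,000 - $390,000 per shared vehicle
```

At 9 vehicles displaced, the assumed price lines up with the quoted $270,000 floor.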

Shared economy is redefining the seller-buyer relationship. How will your business get impacted?

Smart and innovative companies are already adopting the value chain of the collaborative economy. Companies are re-aligning their business services along the following three models:

  • Company as a service
  • Providing a platform for the shared economy, e.g. eBay, Yerdle, Kickstarter
  • Motivating a marketplace, e.g. Airbnb

Reassess your business model, see where your business fits in the value chain of the collaborative economy, and then make the appropriate move.


Posted in Cloud, Home

Open Data is here now!

Imagine a world where vital information enables individuals to become more involved in their communities and more engaged in holding their respective governments accountable. Such a world would not only be beautiful but also a place with better transparency and civic innovation. Open Data, a new jargon term but definitely not a new concept, can make this scenario possible.

So, what is Open Data?

To put it simply – it is similar to open-source software, only it is data instead of software. Data that is freely available, and that can be republished or reused without copyright or patent restrictions, can be considered open data. According to Joel Gurin, open data is accessible public data that people, companies, and organizations can use to launch new ventures, analyze patterns and trends, make data-driven decisions, and solve complex problems. The extent to which vital data can be disclosed for public access makes open data a much-discussed topic at various levels of government in the developed and developing world.

Why is it important?

We can grasp the profound relevance of open data only when we understand how it impacts government policies and practices, innovation and other aspects such as consumer advocacy. McKinsey and Company recently estimated the annual value of open data to the world economy to top $3 trillion (from seven sectors alone). Needless to say, $3 trillion means a lot of headroom. The opportunity this opens up is enormous, both for governments and for private and public companies. McKinsey estimates that the education sector has the highest potential.

The combination of open data held by governments and the vast pool of social media data now offers huge scope for federal agencies and government departments to identify citizens’ problems and interact with them more meaningfully. Many organizations, such as the Federal Emergency Management Agency (FEMA) and the Massachusetts Bay Transportation Authority, have plans lined up to examine various social media platforms for information that could improve their functioning, like feedback on their service and threats against their officials. FEMA found an opportunity to improve its emergency response after recognizing the role social media outlets play in letting people ask for help or report injuries faster during natural disasters, and the vast pool of data thus created. It built an application that monitored Twitter activity to gauge public sentiment and the effectiveness of resource delivery to residents of New York and New Jersey after Hurricane Sandy.

Open government data gives a green light to entrepreneurs and companies to build systems that reveal little-known trends and use this data to further improve services.

How to analyze them?

Big data analysis is now happening globally, and the same tools and techniques can be used to analyze open data. Hackathons are becoming a popular practice these days: they create an ecosystem for coders, data junkies, developers and designers to explore better ways to deliver citizen services. Say participants are given data from the department of transportation or the railways. By analyzing it, they can uncover patterns and predict delays, or come up with better routing options. They can report the average delay per route and the main reasons for those delays, and even suggest options to lessen them.
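The route-delay analysis sketched above takes only a few lines; here is a minimal example over hypothetical open transit records (the field names and values are illustrative, not any agency’s actual schema):

```python
from collections import defaultdict

# Hypothetical open transit records: (route, delay_minutes, cause)
trips = [
    ("Route 7", 12, "signal failure"),
    ("Route 7", 3, "congestion"),
    ("Route 7", 0, None),
    ("Route 12", 25, "signal failure"),
    ("Route 12", 5, "weather"),
]

# Average delay per route
delays = defaultdict(list)
for route, delay, _ in trips:
    delays[route].append(delay)
avg_delay = {r: sum(d) / len(d) for r, d in delays.items()}

# Most common cause of delay (ignoring on-time trips)
causes = defaultdict(int)
for _, delay, cause in trips:
    if delay > 0 and cause:
        causes[cause] += 1
top_cause = max(causes, key=causes.get)

print(avg_delay)   # {'Route 7': 5.0, 'Route 12': 15.0}
print(top_cause)   # signal failure
```

With a real open dataset the records would be loaded from CSV or an API, but the aggregation logic is the same.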

So, is it truly open?

Not always, will be the answer. Making open data publicly accessible puts governments in a situation where third parties can manipulate the data; this is a serious risk, and governments are taking measures to prevent it. Many of these measures are still at a nascent stage of development. For example, the US government requires its agencies to run stringent privacy and confidentiality checks before releasing data; they also have to take the mosaic effect into account. Current measures include limiting reuse, restricting combined reuse and restricting international reuse.

The opportunities open data provides are nonetheless substantial. Though many government agencies release open data, they do so without consulting the companies and firms for which it should be useful, which can lead to the release of irrelevant data. The scenario is changing, however, in European countries like France and the UK, and in other countries like Mexico, where agencies are building feedback loops from data users back to government data providers, which will ultimately benefit the users.

With efforts pouring in from every corner, we can certainly hope that open data will play a major role in driving great advances in the fields of health, finance, data journalism, energy, agriculture and more.

Posted in BIG Data, Home

Gamification – How can it help your business?

Working extra hours on the project gets 50 points per hour. Finishing the task before the deadline gets bonus points for extra time. Convincing the client about your product gets 250 points (because that is the expert level). The team member who completes the maximum number of goals gets the golden badge for the best team player. I am not talking about an extended version of The Sims but an adaptation of gaming attributes in the real world, a trend which is now popularly known as Gamification.

In its simplest sense, Gamification is the implementation of game design techniques, game mechanics and/or game style into various aspects of real life, especially into dull and boring tasks, to make them creative and exciting. For example, Target, the retail chain, recently introduced Gamification into a dreary part of the workplace: they turned their checkout process into a game. Whenever a cashier checks someone out, a gameplay kicks in: while scanning, a red light tells them they were too slow to scan an item, while a green light means they were bang on. Cashiers are even given a real-time score to gauge their performance. Needless to say, this small change had a far-reaching effect on increasing and maintaining the interest and engagement of their employees.

Growing in popularity since 2010, Gamification is now applied to training and education, customer engagement, employee performance, personal development and various other aspects of day-to-day life. Where it is properly implemented, Gamification engages people and motivates them to solve problems, cultivate skills and alter behaviors. The practice is highly innovative and is being implemented successfully by many major companies across the world.

The 2011 campaign by the Volkswagen Group in China saw one of the early successes of Gamification. They invited participants to develop a new ‘people’s car’, giving them a tool to easily design a new vehicle and to post their designs for others to view. Visitors could pick their favorites, and the results were tracked on leaderboards so that contestants and the public could see how competing designs were faring. At least 33 million people visited the site by the end of the campaign’s first year, and 3 winning concepts were selected. That being an early example, Gamification is increasingly used in work environments today, where employee retention is a major pain point for management.

Companies are coming up with various ways of infusing game mechanics and other gaming attributes and then molding them to suit their work environment. It makes goals clearer, feedback timely, rules transparent and the flow of information balanced. The best part is that failures, once forbidden, punished and unspoken of, become expected, encouraged, spectacular and bragged about. It helps employees understand the relationship between their job and the organization’s mission.

Gamification basically works on three different levels in employees. It gives a sense of autonomy, where you feel in charge of your goals, which in turn motivates you to stick to them longer. It adds value to your task, and there is always a better chance of you completing a goal for its value. It gives an inconspicuous sense of competence: when you know that you are really getting onto something and it takes hard work rather than in-built talent, you will keep trying. According to Brian Burke of Gartner Inc., 40 percent of Global 1000 organizations will use Gamification as the primary mechanism to transform business operations by 2015.

So how can you establish a simple yet feasible Gamification strategy to shore up receding employee engagement? Introduce a puzzle or a challenging problem: a sense of problem-solving turns an unchallenging work environment into a proactive game zone for your employees. This is the age of multi-player games, and what better way to improve your team’s performance than framing tasks as challenging problems that can only be cracked by team effort? One of the joys of gaming is the thrill of exploring new territory, so provide your employees with problems that push them to attempt new things and learn new skills. Another is the awards and honors you receive towards the end: recognize employee effort with points, badges or creative medallions so that people feel rewarded for the challenges they took up. Finally, a game without tips is just as boring as an unguided work environment. Share valuable guidelines and innovative tips for work-related challenges so that employees feel guided at every level of work; this will also help you gather valuable employee feedback.
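The points-and-badges mechanics described above (and in this post’s opening paragraph) can be sketched in a few lines; the point values and badge thresholds here are illustrative rules of my own, not a recommendation:

```python
# Illustrative gamification rules, loosely following the opening
# paragraph: points for extra hours, a bonus for early finishes,
# big points for expert-level tasks, and badges at set thresholds.
POINTS_PER_EXTRA_HOUR = 50
EARLY_FINISH_BONUS = 100
EXPERT_TASK_POINTS = 250

def score(extra_hours, finished_early, expert_tasks):
    """Total points earned by one employee."""
    pts = extra_hours * POINTS_PER_EXTRA_HOUR
    if finished_early:
        pts += EARLY_FINISH_BONUS
    pts += expert_tasks * EXPERT_TASK_POINTS
    return pts

def badge(points):
    """Map a point total to a badge tier."""
    if points >= 500:
        return "gold"
    if points >= 250:
        return "silver"
    return "bronze"

alice = score(extra_hours=3, finished_early=True, expert_tasks=1)
print(alice, badge(alice))
# → 500 gold
```

The design work in a real program lies not in this code but in choosing rewards that reinforce the behaviors you actually want.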

While Gamification is an intriguing concept, it comes with pros and cons. If not implemented with a proper vision, it can produce unintended consequences. While many companies are positive about the long-term effects of Gamification, its short-term results are not so positively quantified. Though the strategy has seen good results in enhancing employee performance, it has seen some backlash in customer relations: when a game-based structure is introduced into selling, the sophistication and risks get high, as players may forget they are playing with real resources and not Monopoly money. But since the trend is still in its infancy, you can always start at the foundation and slowly adapt to its contingencies.

So, are you game?

Posted in Home, Leadership

Web-Scale Computing: Coming to a Data Center near you!

What if the IT competencies of Google, Facebook and Amazon (aka hyperscale data centers) could be replicated by your own IT teams in your company’s data centers? That is what web-scale computing in enterprise IT would look like. According to a recent Gartner research paper, the following six elements are required for a solid web-scale computing foundation:

  • Software-driven data center
  • Web-oriented architecture
  • Industrial-strength data centers
  • Agile development processes
  • Collaborative organization style
  • Learning culture

While larger enterprises can benefit from this new shift in Data center thinking, even smaller IT departments can benefit by improving the velocity of IT service delivery and reducing costs.

Hyperscale data centers (let’s call them Tier 1) like Amazon, Google and Facebook have shown how to deliver scalable services faster. Others (let’s call them Tier 2) have been offering various storage and collaboration services in the cloud, albeit at a smaller scale.

There are going to be both vertically-integrated and horizontally-connected public clouds. Some will be specialty clouds like Salesforce and Workday, while others will be plain-vanilla clouds that customers can use as building blocks (e.g. Microsoft Azure) to construct their own interconnected services using APIs or other integration services.

There is also a massive shift happening towards hardware commoditization, in which software sits on top of an abstraction layer that drives this low-cost hardware.

We all know that CRM, HRIS and B2B applications like Salesforce, Oracle HCM, Workday, Serus and many others have reached maturity. While companies feel comfortable deploying such applications in the public cloud, there are other applications they will tend to keep in private clouds for various reasons (security being one of the considerations).

In order to realize scalability, agility, security and a lower cost of ownership, companies will use an appropriate mix of public and private clouds, called a hybrid cloud.

For private cloud deployments, companies need to evaluate potential platforms (e.g. OpenStack) and the integration requirements between private and public cloud applications. IT architects need to get a handle on end-to-end architectures for smooth business operations in their software-driven data center initiatives.

It is in these private clouds that your IT teams will have to bring in the best practices of hyperscale data centers, and that is where web-scale computing comes into the picture.

As enterprise IT departments run more and more cloud workloads in their data centers and expand their use of BIG Data analytics, they will need to transform those data centers to emulate the successful web-scale computing models. To do this, they will need to consider the following:

  • A self-service infrastructure which can
    • Rapidly provision services
    • Move workloads between private and public clouds
    • Support multi-tenancy
    • Be managed simply
  • A high-performance, scalable network which provides the right balance of
    • Bandwidth
    • Throughput
    • Latency
  • Data processing (ERP-like transactional systems) and analytics capabilities (data warehouses, dashboards for drill-down and slicing/dicing) which support the new data repositories and applications
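As a sketch of the first bullet, a self-service provisioning request might look something like the following; the field names and validation rules are entirely illustrative, not any particular platform’s API:

```python
import json

# A hypothetical self-service provisioning request. In a real
# deployment this would be submitted to a cloud management
# platform's API rather than handled by a local stub.
request = {
    "tenant": "engineering",       # multi-tenancy: who owns the workload
    "workload": "build-farm",
    "target_cloud": "private",     # could later be moved to a public cloud
    "resources": {"vcpus": 16, "ram_gb": 64, "storage_gb": 500},
}

def provision(req):
    """Validate a request and 'provision' it (stub for a real backend)."""
    required = {"tenant", "workload", "target_cloud", "resources"}
    missing = required - req.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    # A real backend would now schedule VMs, networks and storage.
    return {"status": "provisioned", "tenant": req["tenant"]}

print(json.dumps(provision(request)))
```

The point of the self-service model is that requests like this are fulfilled automatically, in minutes, without a ticket queue.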

While web-scale computing is a new concept for enterprise IT departments, it will become a requirement in private cloud deployments, and we are going to hear much more about it in the coming months and years.

Posted in Cloud, Home