Workforce Analytics – Transforming Human Resource (HR) Management

Workforce analytics is what happens when big data meets HR.

Much as big data has revolutionized marketing and finance, workforce analytics is poised to transform human resource management. It is helping redefine the HR practice of talent management by enabling organizations to identify top performers, predict employee success, calculate compensation, reallocate resources, and analyze labor market trends.

Key HR Strategic Trends

According to a GigaOm research article, the following HR strategies have been gaining ground, thanks to the availability of better analytics and prediction software.

  • Reduce Attrition
  • Predict Employee Performance
  • Rethink Compensation

Some Use Cases

HR professionals who still depend on Excel spreadsheets to compare salaries against industry benchmarks are fast falling behind the times.

These days, workforce analytics can predict the impact of compensation quite accurately. Certain software solutions offer tools that crunch data such as revenue per sales rep, percentage increases in compensation plans, and bonus payments to predict the impact of compensation changes on employee productivity, satisfaction, and attrition.

Forward-thinking HR leaders may also use workforce analytics to model proposed incentive programs before actually rolling them out. Failing to pay a sales rep what he is worth or, worse, granting a new hire a better commission cut than a veteran can hurt employee retention and workforce morale. By modeling a variety of incentive programs and tracking how a slight variation in compensation might impact the bottom line, workforce analytics can save a company hundreds of thousands of dollars in lost revenue and lost workers.
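
To make that concrete, here is a minimal sketch of the idea in Python. Everything in it is invented for illustration: the response curves, the attrition probabilities, and the dollar figures alike. It simply shows how candidate commission plans can be compared in software before any rollout:

```python
import random
import statistics

def simulate_plan(commission_rate: float, n_reps: int = 100,
                  n_trials: int = 1000) -> float:
    """Monte Carlo estimate of mean annual net revenue for one commission plan.

    All response curves below are invented for illustration: sales effort
    is assumed to rise with the commission rate, attrition to fall with it.
    """
    totals = []
    for _ in range(n_trials):
        revenue = 0.0
        for _ in range(n_reps):
            # Hypothetical response curve: base sales plus an effort bonus
            # that grows with the commission rate, plus noise.
            sales = random.gauss(500_000 + 2_000_000 * commission_rate, 50_000)
            # Hypothetical attrition: underpaid reps are more likely to quit
            # mid-year, producing only half their expected sales.
            if random.random() < max(0.0, 0.30 - commission_rate):
                sales *= 0.5
            revenue += sales * (1 - commission_rate)  # net of commission paid
        totals.append(revenue)
    return statistics.mean(totals)

# Compare candidate plans before rolling any of them out.
for rate in (0.05, 0.10, 0.15):
    print(f"commission {rate:.0%}: projected net revenue ~ ${simulate_plan(rate):,.0f}")
```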

Working with Xerox, a maker of printers, Evolv found that one of the best predictors that a customer-service employee will stick with a job is that he lives nearby and can get to work easily. These and other findings helped Xerox cut attrition by a fifth in a pilot program that has since been extended. Evolv also found that workers who had joined one or two social networks tended to stay in a job longer; those who belonged to four or more did not.

In another instance, it was found that people who fill out online job applications using browsers that did not come preinstalled on the computer (as Microsoft’s Internet Explorer does on a Windows PC) but had to be deliberately installed (like Firefox or Google’s Chrome) perform better and change jobs less often. Some analysts think that people who bother to install a new browser may be the sort who take the time to reach informed decisions, and that such people make better employees.
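
As a toy illustration of how findings like these feed a predictive model, the following sketch trains a logistic-regression attrition classifier. It assumes scikit-learn and NumPy are available, and the data and feature names are entirely invented, merely echoing the examples above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Invented toy data: [commute_km, n_social_networks, installed_own_browser]
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(1, 60, 500),   # commute distance in km
    rng.integers(0, 6, 500),   # number of social-network memberships
    rng.integers(0, 2, 500),   # deliberately installed a browser? (0/1)
])
# Invented ground truth: longer commutes raise the odds of leaving.
y = (X[:, 0] + rng.normal(0, 10, 500) > 35).astype(int)  # 1 = left within a year

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```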

Watch Out For

I have heard some folks talk about how companies would like to cull activity from an employee’s LinkedIn account to predict potential job moves, or review location data from employees’ smartphones (yes, this is possible if a Mobile Device Management client is deployed on the phone). These types of data-gathering activities can be questionable under privacy laws, so consult your legal team before embarking on such initiatives.

Software Solutions

  • Evolv workforce science platform
  • Fusion HR Analytics from Oracle
  • Workforce Planning and Analytics Software from SAP SuccessFactors
  • Cognos Workforce Performance Talent Analytics from IBM
  • Workforce Analytics Solutions from Kronos

Conclusion

Algorithms and big data are powerful tools, but they need to be used wisely.

This is not something you want to miss out on if you are planning to be a big and successful enterprise. Many companies are recognizing the importance of workforce optimization and making it an essential part of their Human Resource Management or Enterprise Resource Planning systems. Several vendors offer state-of-the-art workforce optimization software that covers the activities needed to sustain a productive workforce, including human resources planning, payments and benefits, time-keeping and attendance, training and development, recruitment, performance management, and forecasting and scheduling. So if you still haven’t thought about workforce analytics, now is the time.

References:

http://searchfinancialapplications.techtarget.com/definition/workforce-analytics

http://searchfinancialapplications.techtarget.com/essentialguide/A-guide-to-HR-analytics

http://www.technologyreview.com/news/514901/the-machine-readable-workforce/

http://www.economist.com/news/business/21575820-how-software-helps-firms-hire-workers-more-efficiently-robot-recruiters

http://webreprints.djreprints.com/3097890100842.html

http://www.nytimes.com/2013/04/21/technology/big-data-trying-to-build-better-workers.html?_r=1&


DevOps – What is the fuss about?

Industry trends are changing how companies think about software development and delivery. According to IBM’s Institute for Business Value, the following software trends are impacting business competitiveness:

  • Proliferation of mobile devices
  • Explosion of unstructured data
  • Need to collaborate across multiple value chains
  • Cloud Platforms
  • Intelligent/connected devices

DevOps literally means software development and IT operations teams working together to benefit the business. The goal is to increase velocity and reduce friction.

Integrating Two Silos

Historically, development teams have been measured by their ability to change rapidly. IT operations teams are measured by their ability to maintain stability and 100 percent uptime. These two organizations, when managed separately, have competing goals, and the collaboration between them can be adversarial.

For faster delivery of software enhancements, development teams use Agile methodologies, the software version of lean manufacturing. “Just in time” thinking reduces the overhead of low-value reporting, traceability, inspections, and other traditionally minded quality-assurance activities. False precision in specifications and plans is replaced with lean thinking and agile methods that emphasize incremental iterations and small batch sizes, just-in-time and Kanban production, and accelerated cycle times, all while leveraging the creativity of knowledge workers. Lean thinking differentiates between value-creating activities and waste, and focuses on resolving the bigger uncertainties, such as architecturally significant decisions and integration testing, earlier in the process.

DevOps complements Agile delivery models for software engineering by bringing automation and monitoring to the IT operations side. It leverages collaboration, tool-chain pipelines, automation, and cloud adoption. The benefits show up as improved processes, more standardization, greater trust, and higher productivity across development, QA, and IT operations teams.

Key attributes of a DevOps Platform

As corporate IT departments begin their journey to explore DevOps, keep the following characteristics in mind:

  • Planning and collaboration
  • Orchestration
  • Monitoring
  • Configuration Management for automated provisioning
  • Code Repository
  • Test Management and automation

Additionally, while evaluating DevOps platforms, keep in mind both private and public cloud environments.
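
As a rough sketch of how several of these attributes (orchestration, test automation, automated deployment, monitoring) fit together, here is a deliberately simple pipeline script. The commands and the health-check endpoint are placeholders; substitute whatever your own tool chain uses:

```python
import subprocess
import sys
import urllib.request

# Placeholder steps: swap in your own build, test, and deploy commands.
PIPELINE = [
    ["git", "pull", "--ff-only"],                    # fetch latest code
    ["python", "-m", "pytest", "-q"],                # automated test gate
    ["docker", "build", "-t", "myapp:latest", "."],  # build the artifact
    ["docker", "compose", "up", "-d"],               # automated deployment
]
HEALTH_URL = "http://localhost:8080/health"  # assumed monitoring endpoint

def run_pipeline() -> None:
    for step in PIPELINE:
        print("running:", " ".join(step))
        if subprocess.run(step).returncode != 0:
            sys.exit(f"pipeline stopped: {' '.join(step)} failed")
    # Post-deploy verification: the monitoring attribute in miniature.
    with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
        print("health check status:", resp.status)

if __name__ == "__main__":
    run_pipeline()
```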

How can DevOps add value?

DevOps can help reap business benefits faster due to rapid delivery of software-based enhancements. Key benefits include:

  • Drives Business Agility through Continuous Integration and Continuous Delivery of Products
  • Improves infrastructure provisioning through automated deployments and self-service portals
  • Speeds Deployments by aligning development and operations with Enterprise Architecture guidelines
  • Improves Production Performance Monitoring and Proactive Incident Management through Automation, Development and Collaboration
  • Strengthens Service Delivery, Automation of Dev + QA + Ops Processes and Management with a Common Lifecycle View

Conclusion

Traditional approaches to software development and delivery are no longer sufficient. Manual processes are error-prone, break down, and create waste and delayed responses. Businesses can’t afford to focus on cost while neglecting speed of delivery, or to choose speed over managing risk. A DevOps approach offers a powerful solution to these challenges.

DevOps reduces time to customer feedback, increases quality, reduces risk and cost, and unifies process, culture, and tools across the end-to-end lifecycle: an adoption path of plan and measure, develop and test, release and deploy, and monitor and optimize.

Good luck with your DevOps implementations!

References

http://www.ibm.com/ibm/devops/us/en/

http://www.relevancelab.com/index.html


Data Loss Prevention

Data Loss Prevention: A Major Pain Point

Advances in mobile devices, the widespread adoption of high-speed internet (3G and above), and cloud services have opened up an array of opportunities for the business and technology world. It is only because of these advancements that we are hearing terms like BYOD (Bring Your Own Device) and BYOA (Bring Your Own Application). These practices have increased employee mobility to the extent that employees need not be physically present in the office to do their jobs; they can access company servers using their own devices from anywhere in the world. This is an amazing capability, yet it does not come without its drawbacks, the major one being data loss.

The importance of protecting data, sensitive or otherwise, cannot be stressed enough. Intellectual property is among a company’s most important and most vulnerable assets, and any loss or leak can be catastrophic. I am not talking about the threat of hacking or external interference here (of course these are major issues, but most companies have multiple layers of security against them); what I am talking about is internal, or insider, threats. How do you make sure that none of your employees accidentally or deliberately copies or sends sensitive information to an outside party? This is a huge problem now that your employees are mobile and no longer work only from the comfort and security of your office.

Do not think even for a second that your employees’ loyalty or good sense makes you safe. All it takes is one disgruntled employee or one careless mistake from a loyal one. As you know, once the genie is out of the bottle, it is out forever. So prevention is the only cure.

What Can Be Done?

There are several measures that can be taken in order to prevent data loss. Let us go through a few of these measures.

Data Classification

Let us admit it: no matter what we say, not all data are crucial. So the first step in protecting your data is to identify which data is worth spending time and resources to secure and which is not. Identify what can be public and what must be kept confidential. Understand that not all of your employees need access to all the data and information your company has; make sensitive information available only on a ‘need to know’ basis. This might not sound like much, but it is effective and goes a long way toward protecting your data.

Prevention

You have already classified your data and know which pieces are worth protecting, so how do you protect them? Manually checking every email or IM is not an option; it simply cannot be done, which is why software solutions are needed. Multiple solutions on the market have this capability. Choose solutions that can do the job by following your business rules, i.e. your predefined information disclosure policies. You can use software to track email, social media, and instant messages and make sure there are no policy violations. You should also employ document fingerprinting and endpoint monitoring to further secure your data. Encrypting sensitive data is also a great practice.
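
To show what such rule-based checking looks like in miniature, here is a sketch combining pattern matching with a crude document fingerprint. The patterns and the fingerprinted text are illustrative only; commercial DLP products use far richer detection than this:

```python
import hashlib
import re

# Illustrative patterns only; real products ship large, tuned rule sets.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn":      re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

# Hashes of documents already classified as sensitive (invented example).
SENSITIVE_FINGERPRINTS = {
    hashlib.sha256(b"Q3 acquisition target shortlist").hexdigest(),
}

def scan_outbound(text: str) -> list[str]:
    """Return the policy violations found in an outbound message."""
    hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
    if hashlib.sha256(text.encode()).hexdigest() in SENSITIVE_FINGERPRINTS:
        hits.append("fingerprinted_document")
    return hits

print(scan_outbound("Please wire to card 4111 1111 1111 1111 today."))
# -> ['credit_card']
```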

Awareness

The major challenge for data protection comes from your employees themselves. The vast majority of your employees do not want to harm the company and will not willingly disclose sensitive information to outsiders. Nonetheless, you need to make sure they are aware of the risks, and they should be properly informed of the company’s information disclosure policy. Gartner says the major channel of data loss is email. A proper IT protocol can ensure that no sensitive information is emailed outside the company network, but if an employee copy-pastes a document into a mail and instantly presses ‘send’, chances are the mail will go out with the information in it. To keep this from happening, employees should be well versed in the company’s policies. They should also be made aware of the dangers of sending data over IMs and social media.

Solutions

Network-based DLP solutions

Installed in your company’s network infrastructure, these solutions track and monitor data mobility and prevent sensitive data from being sent outside the secure network. These solutions normally monitor emails, IMs, social media, and more.

Storage-based DLP solutions

These solutions make sure that the data in your server is kept safe and secure. They mine your servers, SharePoint, and databases to make sure that no sensitive information is on non-secure platforms.

End-point based DLP solutions

They focus on endpoint systems such as PCs, laptops, tablets, and mobile devices to make sure that no sensitive data leaves your company in the form of printouts or copies on USB drives, CDs/DVDs, or portable hard drives. They will also track webmail, social media, IMs, and more, just to be certain.

Some DLP Solutions

  • Websense
  • Symantec
  • RSA
  • Fidelis Security Systems – IBM
  • Palisade Data Monitor – Palisade Systems
  • DLP Solutions from McAfee
  • DLP Platform from GTB Technologies

These are some of the widely accepted and effective DLP solutions available in the market.

As I have mentioned before, one cannot stress enough the importance of data and what its protection means to a company. There is no question that data loss has adverse effects; it is up to you to decide what to do to prevent it and how. Not taking adequate DLP measures would be a serious mistake.


Microservers in the cloud

Microservers

Changes brought about by the rise in cloud computing have had a huge impact on servers and server design. Unlike enterprise IT departments, cloud companies like Amazon, Google and Facebook tend to custom design their infrastructures to handle various types of workloads in the most efficient and economical ways.

One such category of workloads includes light computing tasks with lots of parallelism and frequent use of large amounts of data. That is where Microservers come in…

Microservers are designed to process lightweight, scale-out workloads for hyper-scale data centres. Typical workloads suited to Microservers include static web-page serving, entry-level dedicated hosting, and basic content delivery, among others. Because of the Microservers’ high-density, energy-efficient design, infrastructure (including the fan and power supply) can be shared by tens or even hundreds of physical server nodes.

Key Attributes of a Microserver

(1) Low-Power

Microservers are generally based on small form-factor, system-on-a-chip boards, which pack the CPU, memory and system I/O onto a single integrated circuit. Power consumption of Microserver boards is much less than that of the processors in a typical high-end server.

(2) Space Savers

Because of their small size, and the fact that Microservers require less cooling than their traditional counterparts, they can also be densely packed together to save physical space in the datacenter.

(3) Low-Cost

Since Microservers typically share the networking, power, and cooling infrastructure built into the server chassis, they can cost less to run than the alternatives.
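
A back-of-envelope calculation, with invented wattage and pricing figures, illustrates how the low-power argument can play out for a fleet of light workloads:

```python
# All numbers below are invented for illustration, not vendor figures.
TRAD_WATTS, TRAD_CAPACITY = 400, 16   # one big server hosting 16 light workloads
MICRO_WATTS, MICRO_CAPACITY = 20, 1   # one low-power node per light workload
KWH_PRICE, HOURS_PER_YEAR = 0.10, 8760

def yearly_power_cost(watts: float) -> float:
    """Electricity cost of running one box flat-out for a year."""
    return watts / 1000 * HOURS_PER_YEAR * KWH_PRICE

workloads = 160
trad = yearly_power_cost(TRAD_WATTS) * workloads / TRAD_CAPACITY
micro = yearly_power_cost(MICRO_WATTS) * workloads / MICRO_CAPACITY
print(f"traditional servers: ${trad:,.0f}/yr   microservers: ${micro:,.0f}/yr")
```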

 

Use-Cases

Microservers are typically suitable for static web-page serving, entry-level dedicated hosting, and basic content delivery. They also lend themselves to data-processing tasks where the workload can be parceled up and operated on in parallel, such as certain analytics jobs (think Hadoop), as well as handling data in non-relational databases.
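
Here is a toy, single-machine illustration of that scale-out pattern, with each worker process standing in for a low-power node: the map step counts matches in one parcel of the data, and the reduce step sums the partial results:

```python
from multiprocessing import Pool

def count_errors(log_chunk: list[str]) -> int:
    """Map step: count error lines in one parcel of the data."""
    return sum(1 for line in log_chunk if "ERROR" in line)

if __name__ == "__main__":
    # Synthetic log data standing in for a large, partitionable workload.
    logs = ["INFO ok", "ERROR disk", "INFO ok", "ERROR net"] * 250_000
    chunks = [logs[i::8] for i in range(8)]  # parcel out, one chunk per node
    with Pool(processes=8) as pool:
        partials = pool.map(count_errors, chunks)  # fan out across workers
    print("total errors:", sum(partials))          # reduce step
```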

Any Negatives?

(1) Not Suitable for compute intensive jobs

Because of their lower compute performance and smaller memory footprint, Microservers may not be suitable for handling mainstream enterprise IT or advanced scientific and technical computing workloads.

(2) Limits on parallelism

Adapting software to distribute workloads efficiently between Microservers is currently challenging and costly; software will often need to be rewritten to enable these servers to process tasks in parallel.

(3) Networking overheads

Deploying a large number of less powerful servers increases the number of ports required and switching overhead.

Conclusion

Facebook is putting ARM and Tilera processors through tests in its datacenters, as it explores using non-x86 chips for some applications to reduce its electricity bill and boost performance. Dell and HP (Project Moonshot) are coming up with their Microserver line of products. CPU wars between Intel and ARM are heating up in the Microserver area.

Microservers are going to be specialized and custom-built to handle specific use-cases and will have their own place in the data-centers, both in enterprise IT and hyperscale data-centers. They will sit alongside rather than replace traditional higher power, less specialized servers.

References

http://www.zdnet.com/inside-facebooks-lab-a-mission-to-make-hardware-open-source-7000004557/

http://www.intel.com/content/www/us/en/servers/microservers.html

http://www.zdnet.com/microservers-what-you-need-to-know-7000011486/


Software-Defined Supply Chain

Software-Defined Supply Chain: Digitization of design and manufacturing

After software-driven compute virtualization, we have all been witnessing software-driven virtualization of storage and networking. While corporate IT departments are now finding ways to implement software-driven data centers, hyperscale data centers (Amazon, Google, Facebook, etc.) have already been successfully driving the commoditization of hardware, enabled by software. Hence, it was only a matter of time before other business areas would start seeing the value in driving their value chains with software.

Now, according to the latest research from IBM, the hardware-driven supply chain is going to give way to the software-driven supply chain.

Any company that deals with supply chain management knows how painful a task it is. It is enormous, complex, elaborate, and absolutely vital. Everything from the timely manufacturing of products to the cost per product relies upon proper supply chain management. Any mistake, however small, can trigger a butterfly effect with catastrophic results for the company. This is why companies with major supply chain operations take the utmost care here. To do it properly, they have had to spend enormous sums of money on the process, a huge constraint on the company both financially and otherwise. But all of this is about to change.

What will be the change and how will it help?

According to IBM, the following three technologies are driving the future software-based supply chain:

  • 3D printing
  • Intelligent Robotics
  • Open-Source Electronics

These new technologies are creating a manufacturing environment that is driven by digital data.

These technologies will enable a reconfigured global supply chain. It will radically change the nature of manufacturing in the electronics industry, shifting global trade flows and altering the competitive landscape for both enterprise and government policy makers.

Today, the arbitrage of production costs almost always depends on the scale of production: the more you produce, the cheaper it gets. That is great for the giant MNCs, but what about the smaller players? Companies that do not have the luxury of manufacturing hundreds of thousands or millions of products at a time could do little to compete with the giants (a very unfair situation indeed).

But according to IBM, by leveraging the three emerging technologies (3D printing, intelligent robotics, and open-source electronics) and the ‘software-defined supply chain’, the smaller players will have a better fighting chance. “On average, it will be 23 percent cheaper to make products in a software defined supply chain,” says IBM’s paper.

To create a new product, companies need to manufacture new parts or components, and traditionally these cannot be made without creating new molds. This is a major constraint. With 3D printing, the need to create molds can be eliminated entirely: companies can directly print the components they need, which makes the process cheaper and faster. Intelligent robotics and open-source electronics bring similar advantages to the manufacturing process, and all three of these technologies are software-defined.
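
A simple, illustrative calculation (the numbers are invented, not taken from the IBM study) shows why removing the mold changes the economics at low volumes:

```python
# Invented figures for illustration only.
MOLD_COST = 50_000.0     # one-off tooling cost for injection molding
UNIT_MOLDED = 2.0        # per-part cost once the mold exists
UNIT_PRINTED = 7.0       # per-part cost of 3D printing, no tooling needed

for volume in (100, 1_000, 10_000, 100_000):
    molded = MOLD_COST / volume + UNIT_MOLDED  # tooling amortized over the run
    cheaper = "printing" if UNIT_PRINTED < molded else "molding"
    print(f"{volume:>7} units: molded ${molded:,.2f}/unit "
          f"vs printed ${UNIT_PRINTED:,.2f}/unit -> {cheaper} wins")
```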

Even more fascinating than the cost arbitrage is that achieving it does not require a very large scale of manufacturing. “Even more dramatic however, is the 90 percent decrease in the minimum economic scale of production required to enter the industry,” points out the paper. Now, how awesome is that?

The final product, be it a television or a smartphone, normally travels thousands of kilometers from the manufacturer to the supplier to the consumer. Looked at in the big picture, this transportation cost is itself enormous. Moving away from old-fashioned manufacturing and supply chain processes to the proposed software-defined supply chain can solve this problem to a large extent as well. Because the software-defined model comes as a service, a company need not spend a lump sum on a manufacturing plant and will be able to make its product anywhere, at any time.

With the astonishing results of its study, IBM is predicting a complete shift away from traditional supply chain management practices. The process can now be fully automated, managed, and controlled by intelligent software, lifting many of the major constraints and making it significantly more financially viable and optimized.

Any Negatives?

The ease of supply chain management and manufacturing means that the barrier to entry into any product-based industry will be much lower than it used to be. That means many more players, and competition could rise to unforeseen levels, but that can be a positive thing for customers and from an innovation standpoint. So, let us just say: may the best product win.

Conclusion

A cherry on top: according to IBM, the software-defined supply chain is ‘greener’ than the traditional one, which, needless to say, is a very positive aspect in today’s changing world. All in all, the possibilities this new vision opens up are enormous; it has the potential to change the world. Will the industrial world adopt the software-defined supply chain, and if so, how fast? These are all relevant questions. For the answers, we will have to wait.

References

List of open-source hardware projects

http://public.dhe.ibm.com/common/ssi/ecm/en/gbe03571usen/GBE03571USEN.PDF

http://www-935.ibm.com/services/us/gbs/thoughtleadership/software-defined-supply-chain/infographic/

http://www.open-electronics.org/


Predictive Analytics – Get ready for the future!

What is Predictive Analytics?

According to Wikipedia, predictive analytics is an area that encompasses a variety of techniques from statistics, modeling, machine learning, and data mining that analyze current and historical facts to make predictions about future, or otherwise unknown, events.

Predictive analytics enables companies to gain actionable and forward-looking insight from their data.

Predictive analytical applications have already made inroads in the banking, oil & gas, insurance, healthcare, and retail industries.

Technology Factors That Have Helped Push Predictive Analytical Processing

  • Faster CPUs
    • Commodity-based massively parallel processing combined with faster, multi-core CPUs has significantly reduced the time it takes to perform complex analytical processes
  • In-Memory Analytics
    • Lower memory prices and 64-bit addressing have made it possible to load massive amounts of data directly into memory, where it can be mined much faster
  • Big Data
    • Technologies like Hadoop, R, MapReduce, natural language processing, and text analytics have enabled companies to collect, analyse, and mine massive amounts of structured and unstructured data

Predictive Analytics:  Applications for an Enterprise

Predictive analytics is much more than the general forecasting that most companies currently do. Below are some of the key areas where I believe predictive analytical capabilities can be used to improve a company’s competitive edge:

  • Revenue Management: Pricing Optimization
  • Supply Chain: Inventory Optimization, Safety-stock management, Product-mix planning
  • Human Resource: Critical talent retention
  • Quality/ R&D:  Yield Predictions
  • IT Risk Management: Spam filtering, Data-Loss Prevention, Advanced Persistent Threats

Predictive Modeling

Predictive modeling is the process by which a model is created or chosen to try to best predict the probability of an outcome.

A generic modelling process starts with collecting data from various source systems, both internal and external. The next step involves creating statistical models based on certain assumptions/hypotheses. These models are then tested using sample data points, and eventually one model is selected for deployment. That model is thereafter allowed to mature over time using various game-theory and machine-learning techniques.

The following modelling techniques are in common use (a minimal model-selection sketch follows the list):

  • Group Method of data handling
  • Naïve Bayes
  • k-nearest neighbour algorithm
  • Majority Classifier
  • Support vector machines
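
As a minimal sketch of the test-and-select step described above, the following compares several of these techniques with five-fold cross-validation on a stand-in dataset; it assumes scikit-learn is installed:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # stand-in for historical data

candidates = {
    "majority":    DummyClassifier(strategy="most_frequent"),
    "naive bayes": GaussianNB(),
    "k-NN":        make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "SVM":         make_pipeline(StandardScaler(), SVC()),
}

# Test each candidate on held-out folds, then pick one for deployment.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:12s} mean accuracy: {scores.mean():.3f}")
```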

Open-Source Solutions

Some of the commonly available and popular open-source solutions include the following:

  • R
  • Weka
  • Orange
  • KNIME
  • RapidMiner

Final Word

Predictive analytics is not a new concept, but recent technological innovations have enabled us to start using these capabilities faster and more cost-effectively. A predictive analytical solution should be an integrated system covering end-to-end processes in a company’s supply chain, customer relationship, or financial management. Predictive analytical solutions can help a company uncover hidden opportunities and avoid pricing erosion or lost sales.

A saying often attributed to Albert Einstein goes: “The definition of insanity is doing the same thing over and over again, while expecting different results.” Investing in a good predictive analytics solution is one of the best investments a company can make in its business.

References

http://www.predictiveanalyticsworld.com/patimes/

http://www.predictiveanalyticstoday.com/

http://www.kaggle.com/

http://en.wikipedia.org/wiki/Predictive_analytics

http://en.wikipedia.org/wiki/Predictive_modelling

http://tdwi.org/Articles/2012/03/06/Predictive-Analytics-Growth.aspx?Page=1

http://tdwi.org/Articles/2012/05/01/5-Predictive-Analytics-Myths.aspx?Page=3


Supply Chain Collaboration

Supply Chain: Collaboration

Amid all the dust and heat of Cloud, Big Data, Bring Your Own Device (BYOD), and SocioMobile trends, supply chain seems to have taken a back seat. Supply chain has been a key topic of discussion and improvement for years now. Have we done enough? Do companies have complete end-to-end supply chain integration in place?

Supply chain collaboration in the semiconductor industry is still a challenge. First, let me define the collaboration areas:

  • Order Management: Placing a PO, PO Commits & Acknowledgements
  • Visibility: Work-in-progress, Advance Ship Notifications, Alerts/Notifications, Business Continuity
  • Design collaboration: Manufacturing instructions, Production specs, BOMs etc.
  • Financial Collaboration: Contract Compliance, Invoice automation
  • Supplier Performance & Compliance Management: Performance to schedule, conflict material management

Alright, Big Data has a role to play in all of the above. For improved collaboration, an analytical tool that gives you easy access plus slicing/dicing and drill-down capabilities definitely makes sense. Did I hear some of you say you would like alerts/notifications on your mobile devices? Sure, that makes sense! And yes, it would definitely make sense for collaboration tools to be cloud-based, giving us scalability and lower-cost models. There you go: Cloud, Big Data, and Mobile are back in action.

Why is Supply Chain Collaboration so difficult?

For those of you who have perfect collaboration in place, good for you. For lesser mortals, I think some of the following challenges exist:

  • Data Accuracy – Many transactions fail because the returned part# is wrong or PO line items get misaligned (see the reconciliation sketch after this list)
  • Lack of integration with MES systems on the supplier side – lot splits and merges and manual data entry cause issues even when we think we have a B2B link in place
  • Lack of complete integration on the buyer side – all the way to the planning engine and financial modules of the ERP system
  • Lack of systems expertise at smaller companies
  • Secure transmissions
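
As a minimal sketch of the data-accuracy problem, here is a reconciliation check for PO acknowledgements. The field names are hypothetical, not taken from any particular B2B standard:

```python
def reconcile(po_lines: list[dict], ack_lines: list[dict]) -> list[str]:
    """Flag line items where the supplier's acknowledgement disagrees with the PO."""
    issues = []
    acks = {a["line_no"]: a for a in ack_lines}
    for line in po_lines:
        ack = acks.get(line["line_no"])
        if ack is None:
            issues.append(f"line {line['line_no']}: missing from acknowledgement")
        elif ack["part_no"] != line["part_no"]:
            issues.append(f"line {line['line_no']}: part# {ack['part_no']} != {line['part_no']}")
        elif ack["qty"] != line["qty"]:
            issues.append(f"line {line['line_no']}: qty {ack['qty']} != {line['qty']}")
    return issues

po  = [{"line_no": 1, "part_no": "XC-100", "qty": 500}]
ack = [{"line_no": 1, "part_no": "XC-10",  "qty": 500}]
print(reconcile(po, ack))  # -> ["line 1: part# XC-10 != XC-100"]
```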

Do you know of companies that still run their planning engines in spreadsheets? Five years ago the situation was worse, but I believe the semiconductor industry has been slowly picking up on this one. Most companies now have good, solid transactional and planning systems in place. Some lucky ones have product life-cycle management systems too. But how many have good, solid collaboration infrastructures in place?

How can we improve overall collaboration?

In this age of information economy, we need to collectively find ways of better collaboration.

First things first: we should have a forum where supply chain partners discuss ways of improving collaboration. No one wins by hiding, so let’s come out in the open and discuss ways to share information and collaborate. Agreed-upon standards and protocols will act as a lubricant to help push this information engine.

New technological innovations (Cloud, Big Data, Mobile, etc.) can be utilized to improve supply chain collaboration. I hope someone comes up with a cloud-based collaboration hub where all the suppliers and buyers can get on board. Suppliers could be rated (Amazon-style) by buyers. Smaller suppliers with less IT competency could compete more effectively with big firms by using predefined adapters and formats for B2B collaboration.

In a software-driven enterprise, strong supply chain collaboration is a must to compete effectively.
