Business intelligence, or the art of using data to find hidden patterns and insights, has become an integral part of strategy development for leading organizations, including Fortune 500 companies, government agencies, and academic institutions. However, companies still struggle in several areas when implementing BI solutions. For such a project to succeed, the company should define the specific business problem, allocate resources, evaluate the outcomes from a business perspective, and finally operationalize the recommendations that emerge from the modeling process.
While several methodologies exist for implementing business intelligence projects, not many companies follow the same rigor when it comes to planning them. Successful implementations require that expected outcomes be clearly defined, resources earmarked, ballpark delivery estimates published, and risks captured in advance. Project managers can then work within these constraints to deliver the final project. To prepare even these basic inputs, however, companies should follow a structured approach to developing the business case for individual analytics use cases, which can then be evaluated by senior management. The most compelling can then be put onto the implementation queue.
In this article, we discuss a high-level framework to streamline business intelligence project planning. Using the four-step approach presented here will allow companies to make informed decisions about committing time and resources to data mining projects using a comprehensive cost-benefit analysis.
In order to best appreciate the nuances of this proposed framework, consider a large retailer looking to leverage business intelligence to improve sales while reducing acquisition costs. The chief marketing officer (CMO) has been tasked with this strategic objective and must now come up with an approach that has buy-in not just from senior management but also from individual department heads, whose cooperation would be imperative in any successful delivery. How should the CMO proceed?
We identify a four-step approach to translate this strategic objective into viable, tactical delivery assignments:

1. Break the high-level objective down into specific, data-oriented problem statements.
2. Develop high-level cost/benefit estimates for each option and prioritize.
3. Develop SMART goals for the prioritized options.
4. Establish global practices, tools, and staffing arrangements that apply across projects.
So, what does it mean to say "improve sales while reducing acquisition costs" to someone who is not the CEO or a company shareholder? This is a rather loaded remit, but what could it mean tactically? Breaking it down into smaller problem statements could result in something like this:

- Which marketing campaigns and channels deliver the most sales per dollar of acquisition spend?
- Could dynamic pricing lift sales without eroding margins?
- Could personalized product recommendations improve conversion among existing customers?
- Which customer segments are the most expensive to acquire, and why?
As can be seen, this step involves breaking down a high-level business objective into specific problem statements that make sense from a data perspective. All the possible interpretations above will have different data requirements, modeling approaches, risks, and cost/benefit profiles but these would be impossible to assess unless the larger objective is first broken down into smaller data problems.
As a rule, managers can frame problem statements by analyzing how the larger objective relates to every department and by identifying the specific optimizations that each department can do in order to deliver the larger business objective. Not only does this inclusive approach allow capturing of potential optimization options from all parts of the business, but it also helps in keeping key stakeholders engaged.
Say you ended up with over 50 possible analysis options that might deliver on the larger business objective. Marketing, product management, finance, customer service, and supply chain have all chipped in, resulting in a bloated agenda. How do you go about prioritizing?
To do this, individual optimization options must be further elaborated and subjected to a cost/benefit analysis. Taking the example above, suppose that we wanted to do a cost/benefit analysis for implementing dynamic pricing. What considerations would this entail? Examples could include:

- The cost of acquiring and preparing the data needed to model price sensitivity
- The modeling and integration effort required to serve prices within current systems
- The expected uplift in sales and margins, stated as a range
- Risks, such as customer backlash to fluctuating prices or competitor response
- Constraints, such as legacy pricing systems or regulatory limits
As can be seen, the focus in this step is to develop high-level estimates of benefits and costs for each option while also outlining key project constraints and assumptions. With this information, managers can make an objective assessment of which options to prioritize or reject. Implementing personalization and dynamic pricing, for example, may well deliver greater business benefit in the long run, but it might be possible to drive higher sales simply by optimizing campaign spend.
The key deliverables for this step include (for each option):

- A high-level estimate of the expected business benefit
- A ballpark estimate of implementation cost and effort
- Key assumptions, constraints, and risks
At this point, you should have a clear list of options with high-level cost estimates, albeit ones still based on a large number of assumptions that need further validation. Even so, this step acts as the first filter, screening out options that clearly do not make sense to pursue even on the basis of a cursory analysis.
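To illustrate, the screening in this step could be sketched as a simple risk-adjusted benefit/cost ranking. The option names, dollar figures, and success probabilities below are illustrative assumptions, not estimates from the article:

```python
# Hypothetical screening of BI use-case options by risk-adjusted
# benefit/cost ratio. All names and figures are illustrative assumptions.

options = [
    # (name, est. annual benefit $K, est. cost $K, probability of success)
    ("Dynamic pricing",             900, 400, 0.5),
    ("Personalization",             750, 500, 0.4),
    ("Campaign spend optimization", 400, 120, 0.8),
    ("Churn prediction",            300, 200, 0.7),
]

def score(benefit, cost, p_success):
    """Expected benefit per dollar of cost: higher ranks first."""
    return (benefit * p_success) / cost

# Rank the candidate options from most to least attractive
ranked = sorted(options, key=lambda o: score(*o[1:]), reverse=True)
for name, benefit, cost, p in ranked:
    print(f"{name:30s} score = {score(benefit, cost, p):.2f}")
```

Note how, under these assumed figures, a modest option like campaign spend optimization can outrank a bigger-ticket one once cost and risk are factored in, echoing the point above.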
Once a prioritized list of implementation options is available, the next step is to develop SMART goals for each. In the case of dynamic pricing, these could be:

- Increase online sales by X% over the next two quarters by deploying dynamic pricing on the top N product categories
- Hold gross margin erosion below Y% while doing so
- Deliver the first production pricing model within Z months, within the approved budget
These goals typically require a combination of acute business judgment, a high degree of familiarity with data preparation and modeling techniques, project management experience, a highly collaborative working environment, and, above all, senior stakeholders’ acceptance that the final results may well differ from these initial estimates. The effort in most cases is formidable, but having objective benchmarks provides constraints for implementation teams and affects everything from the choice of data integration technology and modeling techniques to the quality and quantity of resources deployed.
Of all the steps in the proposed planning process, Step 3 is the most technical and hands-on. For example, why settle for a figure of $X as the sales target? The number would typically come from establishing some kind of relationship between sales and individual personalization parameters (number of browsing sessions in a specific period, pages viewed in those sessions, time per page, past purchase history, any customer service interactions, etc.). This would require experienced data scientists, ETL specialists, statisticians, and product experts to sketch out details of the actual analysis, including a high-level overview of key influencing parameters, data quality requirements, modeling techniques to be deployed, and so on. For now, the aim is to state assumptions and run ballpark calculations that make the objective SMART.
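To make this concrete, the kind of ballpark calculation described here could be sketched as an ordinary least-squares fit on synthetic data. Every feature, figure, and coefficient below is an assumption for illustration, not real retail data:

```python
import numpy as np

# Hypothetical sketch: relate sales to personalization parameters with a
# linear model on synthetic data, then project a ballpark sales uplift.
rng = np.random.default_rng(42)
n = 500

# Synthetic per-customer features (stand-ins for real behavioral data)
sessions   = rng.poisson(6, n)        # browsing sessions in a period
pages      = rng.poisson(20, n)       # pages viewed in those sessions
past_spend = rng.gamma(2.0, 50.0, n)  # past purchase history ($)

# Synthetic "true" relationship plus noise
sales = 10 + 4 * sessions + 1.5 * pages + 0.3 * past_spend + rng.normal(0, 5, n)

# Fit ordinary least squares (intercept column plus the three features)
X = np.column_stack([np.ones(n), sessions, pages, past_spend])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

# Ballpark target: expected per-customer sales change if personalization
# lifts browsing sessions by 20% (an assumed scenario)
baseline = X.mean(axis=0)
uplift = baseline.copy()
uplift[1] *= 1.2
print(f"Estimated per-customer sales uplift: ${coef @ uplift - coef @ baseline:.2f}")
```

A real analysis would of course use the retailer's own data and a richer model; the point is that a defensible $X falls out of an explicit, assumption-laden calculation rather than a guess.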
At this time, managers should also be able to further refine the cost estimates obtained in the previous step. More importantly, though, this step attaches numerical values to benefits, thereby making the requirements unambiguous. Additionally, other details, including resource requirements, any hardware/software purchases required, and a refined list of key assumptions and risks, should become available.
Steps 1-3 above are repeated for each individual business objective. However, a successful enterprise-wide BI project also involves creating a set of global practices, assets, tools, staffing arrangements, and delivery methods that can be applied across projects with minimal customization. With specific reference to business intelligence, some of the activities in this one-time step might include:
Lining up roles and responsibilities. Business intelligence requires several roles including business consultants, data specialists, statisticians, and project management staff. Other roles may be necessary depending upon specific business context, but at a minimum, the key responsibilities for each of these roles should be clarified as part of a global staffing strategy.
Lining up the right data mining tool. For data mining to be successful from a business perspective, models must be developed quickly and deployed cost-effectively for use within current operational systems and business processes. Various tools (SAS, SPSS, R, etc.) exist in the market for automated model development, data integration, and model testing, and it is usually best to decide on these at the global level rather than letting each business intelligence project choose differently.
Standardizing on a data mining method. Adopting a consistent methodology for data mining lies at the heart of benefits realization from analytics investments. Such a method would typically involve:

- Understanding the business problem and defining success criteria
- Understanding and preparing the data
- Building and evaluating models
- Deploying models and monitoring results
A structured approach developed along these lines ensures purpose and repeatability, in addition to allowing data mining to be implemented as discrete projects with defined budgets and timelines. CRISP-DM is the de facto method used by many organizations worldwide, although many have chosen to tailor it to their own specific business context.
So, if the framework above is for planning business intelligence engagements, what are the steps involved in the actual implementation phase?
In order to appreciate the distinction, notice that the underlying philosophy of the approach above is to ensure that time and resource investments in business intelligence are prioritized based on an objective cost/benefit assessment of the various options. The activities involved rely largely on assumptions, business judgment, and quick calculations, with minimal hands-on work. The actual data integration, sampling, data preparation, model development, and model testing activities are deferred to the implementation phase. Of course, all these steps need to follow their own delivery method, but unlike the planning framework, that method is strikingly different in terms of focus, approach, and level of detail involved.
When applied properly, the planning framework above results in a concrete, repeatable set of steps for translating high-level business objectives into specific project initiatives, along with ballpark estimates for time, cost, and resource requirements. Developing this strategic context helps maximize the value of data mining investments and avoid the “ad-hoc trap,” wherein enterprise resources are wasted through poor prioritization, with no real advance insight into investment returns or business outcomes.