81% of company executives believe that using data analytics helps them gain new revenue opportunities, according to the 2016 State of Enterprise Data Quality study conducted by the IT data intelligence firm Blazent together with 451 Research. The study examined enterprises’ big data policies and strategies, along with the negative repercussions that misusing big data can have on businesses. It also found that 51% of executives think data analytics increases revenue and 49% believe it lowers costs.
At the same time, big data can damage business: the areas most affected by its misuse are revenue (as perceived by 42% of respondents) and decision making (39%). Furthermore, even though 40% of C-level executives and data scientists are ‘very confident’ in the enterprise data their companies use, 94% are aware that low-quality data can seriously hurt their business.
As can be seen, when businesses work with big data they often take a keeping-up-with-the-Joneses approach instead of following industry standards. There are many ways in which big data can damage your business. Here are some lifehacks that may help:
Big data becomes a problem when there is too much of it (70% of companies believe their data will increase by 20% in the coming year), flooding in from multiple directions simultaneously. Irrelevant data costs businesses endless man-hours spent on extracting necessary information from a sea of random facts. But leaving the issue as it is and hoping it will go away is not an option, so you will either have to pay for those man-hours or find a different way.
To consistently define and manage critical data and provide a single point of reference, invest in master data management (MDM) tools. Data cleansing and validation tools will help you eliminate irrelevant and unreliable data. Data stewardship tools will allow you to manage data assets and give relevant business users consistent, easy access to quality data. Every organization will benefit from general-purpose tools like ETL (extract, transform, load), visualization or profiling tools, while the specifics of your business niche will dictate the use of more focused tools like geo-coding ones.
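Under the hood, cleansing and validation tools apply rule-based checks like the ones below. Here is a minimal Python sketch of the idea, assuming customer records are simple dictionaries; the field names, sample records and validation rules are illustrative only:

```python
import re

# Hypothetical customer records; field names are illustrative only.
records = [
    {"id": 1, "email": "ann@example.com", "country": "US"},
    {"id": 2, "email": "not-an-email", "country": "US"},
    {"id": 1, "email": "ann@example.com", "country": "US"},  # duplicate of record 1
    {"id": 3, "email": "bob@example.com", "country": ""},    # required field missing
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def cleanse(rows):
    """Drop duplicates, rows with invalid emails, and incomplete rows."""
    seen, clean = set(), []
    for row in rows:
        if row["id"] in seen:
            continue  # duplicate record
        if not EMAIL_RE.match(row["email"]):
            continue  # fails the email validation rule
        if not all(row.values()):
            continue  # a required field is empty
        seen.add(row["id"])
        clean.append(row)
    return clean

print(cleanse(records))  # only record 1 survives all three checks
```

Real MDM and cleansing suites add survivorship rules, fuzzy matching and audit trails on top of this basic pattern, but the principle is the same: every record passes explicit quality gates before it reaches business users.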
Furthermore, you have to educate the employees responsible for data relevance on when and how each particular tool should be used. Today you can choose from a plethora of free resources, web-based or instructor-led training, or a mix of the two. Off-the-shelf data training will also do the trick.
According to the State of Enterprise Data Quality study, 47% of the business execs who took part stated that the main reason behind using low-quality data in business was migration. Besides, Bloor Research, an independent European IT research and analyst firm, found that more than half of all data migration projects end up costing more than planned and taking longer than estimated. So don’t treat data migration as a routine task that doesn’t demand special support from qualified experts.
You need to involve specialists from all departments, including real data users. Make sure they know data management, migration and governance industry best practices and help them collaborate effectively. Be aware of who has access to data and what tasks they are allowed to perform with it. Allocate powerful system resources to data migration and map out steps for each stage: before, during and after the migration.
A large online store started migrating to a new system without proper planning and scheduling and a professional analyst team. They were in a hurry to migrate, which ended in errors and more time spent on fixing them. The store had to press pause on some vital business processes, leading to financial losses and customer dissatisfaction.
Before you migrate, check the quality of existing data and validate business rules, redefining them where necessary. Choose a fitting strategy and draw up a careful roadmap based on scope and budget. Never migrate everything at once; proceed in logical iterations and stick to a realistic timeline. Add ongoing data testing and evaluation to the project lifecycle. Don’t skip validating migration results, both by the specialists responsible for the process and by your data users. Use this data migration checklist to begin migration and work closely with the team you hire. Run from IT specialists who think migration is a breeze, and choose a team that will treat this complicated process with all the seriousness it deserves.
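The “migrate in logical iterations” advice can be sketched as a batch loop that validates every record before committing it, so a bad batch gets flagged for review instead of corrupting the target system. This is an illustrative Python sketch, not a real migration tool; `write_batch` and `validate_row` stand in for whatever your target system and business rules actually provide:

```python
def migrate_in_batches(source_rows, write_batch, validate_row, batch_size=100):
    """Migrate in logical iterations; hold back any batch containing invalid rows."""
    migrated, rejected = 0, []
    for start in range(0, len(source_rows), batch_size):
        batch = source_rows[start:start + batch_size]
        bad = [r for r in batch if not validate_row(r)]
        if bad:
            rejected.extend(bad)   # flag for manual review, don't migrate
            continue
        write_batch(batch)         # commit only fully valid batches
        migrated += len(batch)
    return migrated, rejected

# Toy usage: the target is an in-memory list, validation checks a required field.
target = []
rows = [{"sku": "A1"}, {"sku": "A2"}, {"sku": None}]
done, bad = migrate_in_batches(rows, target.extend, lambda r: bool(r["sku"]), batch_size=2)
# The first batch migrates cleanly; the row with a missing SKU is held back.
```

Keeping the validation step inside the loop is what makes iterative migration safer than a big-bang cutover: each iteration gives you a checkpoint where testers and data users can verify results before the next batch moves.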
In large enterprises, different types of data are scattered in different parts of the organization, stored there for different purposes and owned by different parties that often don’t communicate as efficiently as they should. Sometimes one department is not using the data contained in another department, data that could provide actionable insight and bring tremendous value, if only all interested data consumers knew about it. The importance of “shining a light on ‘dark data’”, as Gartner calls it, cannot be overstated.
Remember that departmental division and the information silos it creates should not interfere with data dissemination throughout the enterprise. Rely on highly qualified professionals who can help empower the rest of the organization to use data for their goals, minimize bottlenecks and lags in data delivery and provide reliable ways to gather data and disseminate valuable insight to the business players who need it to drive business policy and make intelligent decisions.
An important step is to make a data inventory to help you meet your business goals. When you provide accessibility, don’t forget about security either – it’s a delicate balancing act. Data that can influence decision-making should be disseminated in the most efficient way for data consumers via reports, dashboards, data services, etc. Think about who the data consumers are and how data from one department can influence decision making in other parts of the organization.
In an online electronics store, information silos started to form when members of Marketing became distrustful of other departments. When a customer complaint on Facebook about a shopping cart glitch was reported to the corresponding developer team, the glitch was fixed easily. But since the two teams were not collaborating properly, Marketing published a generic response to the complaint: a roundabout statement that seemed snarky, publicized only to make the complaints go away. Customer zero, who posted the original glitch report, turned out to be a young programming enthusiast who quickly pointed out possible technical flaws in the system, making the store’s spokespeople not only look like liars but also significantly less technologically savvy than the 16-year-old developer wannabe. As his post went viral, the company had to fire staff members and introduce data sharing practices that require weekly communication and collaboration between departments.
To make sure no valuable data is withheld from a data user who can benefit from it, filter your data according to data roles, areas of expertise, interests and responsibilities, so that users don’t get turned off by being inundated with everything at once. Prioritize valuable, actionable data by relevance to users with specific roles. Make sure you distribute the latest data as soon as it becomes available and create an alert system urging users to check dashboards and analytics tools at once. Itransition’s teams from all departments get such alerts on a weekly basis.
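Role-based filtering like this can be as simple as a mapping from roles to the data topics they care about. A hypothetical Python sketch, with made-up roles and report topics:

```python
# Hypothetical role-to-topic mapping; the roles and topics are illustrative,
# not a real schema. In practice this lives in your data governance config.
ROLE_TOPICS = {
    "marketing": {"campaigns", "customer_feedback"},
    "finance": {"revenue", "costs"},
}

reports = [
    {"topic": "campaigns", "title": "Q3 ad performance"},
    {"topic": "revenue", "title": "Q3 revenue breakdown"},
    {"topic": "customer_feedback", "title": "NPS survey results"},
]

def reports_for(role):
    """Return only the reports relevant to the given role."""
    topics = ROLE_TOPICS.get(role, set())
    return [r for r in reports if r["topic"] in topics]

# Marketing sees campaign and feedback reports, but not finance data.
print([r["title"] for r in reports_for("marketing")])
```

The same mapping can drive the alert system: when a new report lands, notify only the roles whose topic set contains it, so nobody is inundated with irrelevant updates.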
When you try to predict customer behavior based on a set of purchasing patterns, the smallest faux pas can make customers feel violated, especially when it comes to sensitive issues. One case in point is Target’s infamous sharing of sensitive information with a father who received coupons for new baby products based on the purchases his teen daughter had made (even though he was not aware of his daughter’s pregnancy).
Just because you are able to use data-mining algorithms to predict customer behavior and customize marketing campaigns doesn’t mean you should act on it disregarding the risks. Another example is making data not intended to be seen by customers clearly visible: in an embarrassing turn of events, a grieving father received a letter from OfficeMax titled “Mike Seay/Daughter Killed in Car Crash/Or Current Business.” Since then, the company has claimed to have upgraded its data filters for flagging inappropriate information, but the damage was done.
Ask yourself: is the campaign missing the ethical context of data processing? Does the risk outweigh the potential benefit? Whenever you come across ethically sensitive data, make sure it stays private for internal use and doesn’t seep through to customers. Even though information is shared voluntarily, it doesn’t have to become a stick in the hands of a marketer. Customers acquired by invasive campaigns will hardly become a genuinely engaged audience that’s the best brand ambassador for your business.
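One practical safeguard is to strip ethically sensitive fields from any record before it reaches a customer-facing channel. A minimal sketch, assuming the list of sensitive field names comes from your own data governance policy (the field names here are invented for illustration):

```python
# Hypothetical set of field names treated as ethically sensitive; in practice
# this set is defined by your data governance policy, not hard-coded.
SENSITIVE_FIELDS = {"pregnancy_score", "health_notes", "bereavement_flag"}

def safe_for_marketing(record):
    """Return a copy of a customer record with sensitive fields stripped."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

customer = {"name": "Jane Doe", "segment": "repeat_buyer", "pregnancy_score": 0.91}
print(safe_for_marketing(customer))  # the sensitive inference never reaches marketing
```

The point is that sensitive inferences stay internal by default: a campaign tool only ever sees the filtered copy, so a slip like the examples above requires a deliberate policy change rather than a single careless export.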
If a scandal happens, do damage control in real time. Address the issue by publicizing the reasons for the mishap, along with the steps you will immediately take to reverse the damage; issue an official apology; try to contact every affected customer, showing them they matter. Follow up with updates on how the organization has learned and improved from its mistakes. A great example of damage control is the case of AppFirst deleting a database of customers entitled to a free level of AppFirst products. CEO and co-founder David Roth not only issued an apology and sent out explanatory emails to everyone affected, but also followed up by calling each customer to clarify the situation. It took the exec four days, but it was worth it: he learned that customers are forgiving when you take meaningful action right away.
Poor data quality is caused by the employees working with data, by malfunctioning machines, or by data being transported incorrectly and becoming corrupted. Even minor errors in big data processing have a detrimental impact if they influence decision making or affect customer satisfaction. In 2014 Golden Key International sent an envelope to their customer addressed as “Lisa Is A Slut McIntire”, a result of human error that a big data screening tool failed to both detect and fix.
In such situations, getting an address correction tool to standardize addresses, correct spelling and autofill missing components is not enough. Be critical of your data and don’t just assume it is correct. If the result of an analysis seems unexpected or suspicious, check the validity of the data by searching for errors and fixing them. You can also consult professionals for an explanation or visualize the data to find extreme outliers and their causes. When data is transported or migrated, be extra aware of possible errors and corruption.
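Visualizing data to find extreme outliers can also be approximated programmatically: flag any value that sits unusually far from the mean. A simple Python sketch using standard deviations (the order counts are made up; note that a small sample with one huge spike inflates the standard deviation, hence the lower threshold passed here):

```python
import statistics

def outliers(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical, nothing to flag
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Daily order counts with one suspicious spike (illustrative numbers).
daily_orders = [102, 98, 105, 97, 101, 100, 5000]
print(outliers(daily_orders, threshold=2.0))  # → [5000]
```

A flagged value is not automatically an error; it is a prompt to investigate, whether that means checking for a data-entry typo, a corrupted transfer or a genuine business event.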
Even though it’s easy to be seduced by every current must-have trend such as big data, using it without proper understanding of what it is and how it influences business may bring more harm than good. Big data decision-making is still a relative novelty, and it makes sense to consciously follow the lead of successful data-driven enterprises and industry tips.
Do you use big data in your decision-making? What did you do in situations where data was misused, and you had to instantly put out online fires?