As the end of summer approaches, I love the annual appearance of the windswept weather guy on our favorite news channel.
You know the one, he attempts to stand in the thrashing wind, slammed by waves that would toss a small boat onto the sidewalk, reporting on the nasty hurricane conditions as the wind blows him off camera and into the ocean.
You might feel like that weather guy every day as you get pummeled by a constant stream of data and numbers to review, trying to get a handle on your business.
There might be a break in the clouds, and you hope for a reprieve, only to realize it was the eye of the storm passing and you are about to get whipped again as the data cycle begins anew.
In 2012, 639,800 gigabytes of global IP data were transferred every minute. By the end of 2013, that had increased to 1,572,877 gigabytes each minute – a staggering 146% increase in just one year. This statistic highlights the tension we feel as we attempt to understand the state of our business.
The average time spent on devices increased 157% in just four years, no doubt a result of the explosion of platforms and the connectivity they bring long after the workday is over.
All this connectivity creates information: who our customers are, whether they liked our latest offers, their return rates, and their satisfaction after the sale. Data is crucial to understanding what is working – or not – and how our efforts are affecting the bottom line.
Without a plan to manage the flood of data, you’re rowing in the ocean with a small paddle, searching for the right information that can be the life preserver of profitability. We have been able to use data to understand and fine-tune our businesses for decades.
Business analysis had, until recently, been the exclusive domain of Fortune 1000 companies. Developments in technology give even the bakery on the corner the ability to test, measure and adjust flavors every hour if needed. With tools in the cloud, sophisticated data sorting and real-time analysis can tell us if chocolate or red velvet is a crowd pleaser.
We should be using all the data we can get our hands on. How do we manage the constant stream and make sense of it all? The key is having a plan and using the right tools to gain insight. A flight over the storm to see where you are now and where it’s headed – so that you can plan ahead and find the safest harbor.
We’ve created this guide to help in several ways. In this post – and in the tools we created that you can download for free – we outline six steps designed to help you figure out what data you have, organize and prioritize it, add what might be missing, and manage it for the future. The six steps covered in detail are:
Before we start, we need to assemble the right team.
A project of this size works best if you have a cross-functional team and senior management buy-in. This will ensure you get help from the people who create and use the reports and that they are invested in the future plan. The last thing you want is to have only a few departments involved in the effort; you run the risk of having an incomplete picture of the company.
Having the right team will help you frame the project as a positive contribution. Some people may feel threatened and fear that changing report creation will take away a task that is now their job.
To be sure that the project is viewed in a positive way, get buy-in from all levels of the company for the project first. Use internal communication tools to spread the word about the effort. This is key to a successful implementation. Help folks understand that increasing clarity around data creates a more profitable company and benefits everyone.
To develop a sound plan for managing data, you must begin with an understanding of what your organization is already spending time and effort collecting and creating. Having a team that represents all functional areas of the company is critical in this early stage. Your team can assist in gathering every report that is currently created. They will know if their team is creating a critical spreadsheet they rely on every day to manage their business.
Once you have gathered all the information, tear apart the reports to understand the information that is key for your business. This might appear to be a daunting task given the reams of paper in front of you, but it is critical in creating the right plan and flexible path for the future. Download a tool we have created that you can use to help sort your data.
The guide has several functional groups filled in that can be changed based on your organization. Typically they will be departments like finance, marketing, sales and operations. The groups can also be product or customer focused depending on your organization.
Chart the key data points from each existing report and their source. For example, operations might report the number of items produced for a given product line each day. Sales could report sales data by channel for that same product. They could be combined in an existing report. By separating the individual components, you can use that data in a different way, providing new insight into your business.
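As a sketch of that idea (the product names and figures below are invented for illustration, not from the guide), separating the production figure from the operations report and the sales figure from the sales report lets you join them by product and compute a new metric, such as sell-through:

```python
# Hypothetical example: two data points pulled from separate existing
# reports (operations and sales), keyed by product line.
production = {"chocolate": 120, "red velvet": 95}   # units produced per day
sales = {"chocolate": 110, "red velvet": 101}       # units sold per day, all channels

# Joining the two sources on product yields a view neither report had:
# sell-through rate (units sold as a share of units produced).
for product in production:
    sell_through = sales[product] / production[product]
    print(f"{product}: {sell_through:.0%} sell-through")
```

The same separated components can be recombined along other dimensions (by channel, by day) without rebuilding the original reports.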
After you have completed this step, it is important to define who is going to use the data – not everyone needs – or wants – access to everything. List the names and titles of people who currently access or use this data. This then becomes a guide for which team members and titles in the organization are granted access to the reports that are generated. Don’t worry if you are missing someone; that will become obvious in the testing phase. If someone is missing access to a key report, they will let you know right away.
The next step is to rank each piece of data. If it is mission critical, give it a score of 5. If it is a nice-to-have, give it a score of 3. And if you are asking yourself why in the world you would care about that information, give it a score of 1. This will create a scorecard that allows you to structure the data and make hard choices about your data needs. This scorecard can be used to define the most important information that forms the top-line page of the dashboard.
The scorecard can help you understand the data that should be the focus of your efforts. Start with the mission critical list. Some of the reports coded as nice-to-haves may be constructed from mission critical reports. The scorecard will also let you cut down what is not key to managing the business. Don’t get bogged down by others who want to look at more – people can only focus on a few points – this process forces focus. After assigning a score to each piece of data, move to the next stage of the project – understanding what you are missing and what you might need for the future.
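A minimal sketch of the 5/3/1 scorecard in code (the data points and their scores are placeholders standing in for your own inventory):

```python
# Each piece of data from the report teardown gets a 5 (mission critical),
# 3 (nice to have) or 1 (why do we care?). All entries are hypothetical.
scorecard = {
    "daily units produced": 5,
    "sales by channel": 5,
    "return rate": 3,
    "office supply spend": 1,
}

# Sort highest score first; the 5s become the top-line page of the dashboard.
ranked = sorted(scorecard.items(), key=lambda item: item[1], reverse=True)
top_line = [name for name, score in ranked if score == 5]
print(top_line)
```

Even a spreadsheet version of this works; the point is forcing a single ranked list instead of an undifferentiated pile of reports.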
Once you have sorted the scorecard and rated the data that is most important in managing your business, you can see what you might be missing.
Create a road map to gauge the data you are collecting now and the data you are going to need in the future as your business grows and expands.
Data can be organized and coded many different ways. But you can’t report on what you don’t have – so the scorecard will help you understand what might be missing. From here you can fill in the gaps and create a fuller picture to help manage the business. This might lead to generating data from new sources.
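The gap analysis itself can be sketched as a simple set comparison between what you collect today and what the road map says you will need (all field names here are hypothetical):

```python
# Data points currently collected, taken from the completed scorecard.
have = {"units produced", "sales by channel", "return rate"}

# Data points the road map says the business will need going forward.
need = {"units produced", "sales by channel", "return rate",
        "customer satisfaction", "cost per unit"}

# The difference is the gap new data sources must fill.
missing = need - have
print(sorted(missing))
```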
Some reports are generated to answer one question one time – but the report is still generated in case the question comes up again. The source data that is used to create an ad hoc report can be placed in the data storehouse, and by understanding what you have and how it can be combined, future ad hoc reporting – along with the time and effort it takes – can be minimized.
With the scorecard complete, it is time to take the most important pieces of data and meld them into a new tool that takes the numbers and presents them in a visually appealing dashboard. Unchaining the data from a basic spreadsheet can provide more frequent and automatic updating. There are many resources that can help – from simple dashboard tools to sophisticated enterprise-level business intelligence engines.
We created a quick guide to some of the most popular tools around – CLICK HERE TO READ.
The list provides a few tools to review and try. Most offer a free trial using your own data which can provide a good idea if they will work in your specific environment. The dashboard you choose should reflect the priorities that you outlined in your scorecard.
If the focus of the data is more financial in nature, you would want to ensure that the ones you test and confirm have robust capabilities in that area. If the data will be used in a more general way, then one that has a broad scope, but might limit the depth, could be a better choice.
The key is to not rush to a decision. Choose several to test and use a trial period to find the best fit. You are now ready for the next step in the data evolution – testing to ensure flawless execution.
Once you have chosen a dashboard tool to assist in creating the reporting environment, you will move into a critical testing phase. You should start the testing phase with sample data and, if all goes well, test with a live data feed. The timing of this stage will depend on tracking down reporting errors and fixing flaws. The new dashboards you create will be compared to existing reports to ensure accuracy. This is key: if reports are wrong, all the work could be wiped away, as teams will not trust the information.
Be patient and allow plenty of time on the schedule for this step. After a successful live test, you will want to set up a test site where you can provide access to end users and invite feedback. This test site should also include the permission levels and secure areas that will be part of the final product. This will ensure the visibility of sensitive data is executed the way that you want and need it to be. The final product should meet their needs – and if it doesn’t – be open and willing to change.
Depending on the size of your organization, you might have to start with one group or department and deploy each conversion one at a time. There is no shame in starting small and expanding. It is smart to start with a smaller data set instead of trying to manage an entire organization’s reporting needs.
Fewer data points translate into a smaller test footprint, making it easier to track down flaws in the testing and implementation phase.
With the right team, clear analysis, structure and tools, the tide of data that can overwhelm your business can be calmed into a smooth stream – one that illuminates potential and provides understanding, all while saving time and money.
To get a FREE PDF of the 6 Steps to Managing the Hurricane of Data – click this link:
To get a copy of the FREE scorecard tool to help you Manage the Hurricane of Data – click this link:
Remember that time a few years ago when you needed to get the market share for each product by package size by market trended over a 12 month moving average and raw monthly data? Me too!
And the presentation was in two days?
And the research group needed at least three days to pull the reports out of IRI or Nielsen, combine them with your in house sales data and specific geography, and print them out for you?
Not to mention the time it would take to ensure the numbers were correct.
Thankfully, that weekly request-and-run-around-the-building approach to business intelligence can now be a story told around the old brand manager campfire at the annual brand summit.
Today there are a myriad of tools and platforms – well over 100 – that can help you and your business churn through and manage the hurricane of data that comes in waves each day.
In fact, a recent report from Gartner estimates that more than half of net new purchasing is data-discovery driven.
All of these promise end users the ability to create reports and dashboards without the need for IT resources on an ongoing basis. Once the set-up and integration is complete, the IT team can effectively work issues as needed.
And you can drill down and twist the data to your heart’s content. Or the Division President’s heart – whichever.
Saving everyone the fire drill run around the building of a few years ago.
With so many different offerings to choose from, we’ve assembled a quick guide as a starting point. This is by no means meant to be an exhaustive listing, but it does provide you with some thoughts on a range of products with different capabilities.
GoodData offers cloud-based, flexible reporting using the company’s proprietary platform that can work with any data source. Combining large data sets from different sources in real time, GoodData creates great looking reports and dashboards. A sharing feature provides users the ability to collaborate in almost real time with others. Permissions can be configured in a number of ways, protecting sensitive figures as needed from team to team. Implementation is quick but does require some technical expertise. GoodData provides a great option for cloud-based analysis of your data.
iDashboards is a user-friendly, interactive dashboard tool that you can use to view critical business data. Featuring a wizard that helps connect data sources, the easy to use tool guides you through steps to personalize the visual display. A proprietary feature captures data that is similar across different dashboards, drawing attention to patterns. Data can also be combined for scenario planning. iDashboards is available as a local installation or in the cloud and is highly rated for mobile applications. If you need an easy to deploy tool for your data, iDashboards deserves a look.
Birst uses a unique two-tier structure that allows stable storage for one unified view of your data, and a second level, flexible interface giving business users the ability to analyze and discover, as well as generate dashboards and reports. It features a short development time, making it easy to use and integrate. There are several sets of vertical reports included, but end users can easily create reports on their own. Known for stellar customer service, there is also a highly engaged community that speeds deployment and offers ongoing assistance. Birst provides businesses of all sizes on-premise and cloud deployments that are flexible and can scale.
Qlik offers two products, both of which use an associative data engine that gives users the ability to filter data quickly without the typical use of queries. Qlik uses in-memory technology that allows real-time analysis of data. Different data sources can be combined and displayed in a drill-down visualizer. QlikView is designed for technical users to push business intelligence tools out to the company. Qlik Sense is a more recent product that allows IT to manage the data that business users build dashboards with. Both offer great visual tools for discovery, and Qlik Sense adds tools for storytelling and smart search. Available web-based and on-premise, Qlik offers free trials for both products.
Tableau provides the ability to rapidly develop and leverage business insight by combining a broad range of data sources. Featuring an easy to use, drag and drop interface, Tableau can be deployed and used at all levels of the company to analyze information and quickly change perspectives. The visual presentation of the data is highly rated, letting users perform free-form exploration and share in an appealing way. For advanced analytics, larger firms may need to integrate with third party applications. Tableau offers on-premise and cloud based deployments as well as a free trial.
Adaptive Insights offers a powerful cloud-based suite of products based on the design of Excel that helps you speed up initial deployment and ongoing new user training. A highly customized model can be built and used for analysis of real-time data, allowing for scenario planning and forecasting of multiple metrics – from financial and payroll to operations and marketing. The Discovery tool is designed to allow end users the capability to combine different data sets and custom views without programming help. Information can be expressed across any variable including time frame, customer, product, vendor or geography. Adaptive is a flexible and powerful solution no matter the complexity of the business.
Domo is a business management suite that features a powerful back-end in the cloud that combines any number of data sources. Domo allows you to connect with the software and data you already have, whether it is already in the cloud, on-premise or in a spreadsheet. Using an intuitive interface, data can be displayed at a high level and drilled into to get a closer look based on parameters you define. With a number of existing connectors to popular software, the time to get the system running is reduced. Domo offers an integrated solution for companies of all sizes.
Yellowfin BI started as an embedded reporting tool and retains high marks for offering a range of ways to present your data in a highly collaborative environment. Analytic tools are available in storyboards, mapping and data discovery as well as the standard dashboard and business intelligence views. Yellowfin also shines in ease of use, featuring interactive and intuitive radio buttons, sliders and checkboxes – user-friendly ways to drill into the data. Mobile access is another strong point, allowing access anywhere. Yellowfin offers a cloud-based and easy to use business intelligence tool.
BOARD has one product that uses a toolkit approach to offer broad capabilities and flexibility to handle business intelligence without programming help. The dashboard features drag and drop simplicity to build complex dashboards, charts and reports that allow for deep analysis and visualization. BOARD is custom designed by users, easy to set up and configure, cutting down timelines. It is deployed on a local client environment. BOARD offers a powerful customized self-service experience for any size firm.
Large enterprise options are not covered as I’m not trying to write the Bible of Business Intelligence, but here are the larger players that you can add to any comprehensive search:
IBM – Bought a legacy leader in the field, Cognos, in 2008 and has integrated the capability across many products. The recent introduction of Watson Analytics will add to their product line.
Microsoft – There are several product platforms from Microsoft – SQL Server, SharePoint and Power BI. Take some time to determine which ones fit your specific requirements.
Oracle – Oracle has so many products and variations, we’ve provided just three links here: one to an overview that shows the breadth of the product line, one to a more comprehensive enterprise product, OBIEE, and one to a focused business planning product, Hyperion. Again, take some time to understand which ones might fit.
SAP – A standard in the space, SAP has several broad product options including SAP BusinessObjects and SAP Lumira.
SAS – Analytics is what SAS excels in and the breadth of products shows. Just three links here in what is nearly an entire company dedicated to analytics and business intelligence.
With so many options to review, it is important to start with an understanding of your data and business requirements. We’ve created a guide and tools to help.
See our previous post on Managing the Hurricane of Data
If you have a solid idea of your data and how it can impact your business, you can then begin to work with some of the companies above that meet your business needs. Ask them to demonstrate their capabilities with your actual data and create reports for you.
Ask each company that is under serious consideration to create a proof of concept and outline the length of time it took to produce the proof of concept. Pay for this if you have to, as it will give you a great idea of the actual cost and time involved to implement the solution.
Create a test user group that can be a part of the demo to track the ease of use that so many providers talk about. Real users in real time give you a real idea of just how easy it is. Use a tool to evaluate all the options with a scoring system that can weight different capabilities.
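One way to sketch such a weighted scoring system (the criteria, weights, vendor names and trial scores below are illustrative assumptions, not recommendations):

```python
# Weight each capability by how much it matters to your business;
# weights should sum to 1.0. Everything here is a made-up example.
weights = {"ease of use": 0.4, "financial depth": 0.35, "mobile": 0.25}

# Raw 1-5 scores from the test user group's trial of each vendor.
trial_scores = {
    "Vendor A": {"ease of use": 4, "financial depth": 5, "mobile": 3},
    "Vendor B": {"ease of use": 5, "financial depth": 3, "mobile": 4},
}

def weighted_total(scores):
    # Multiply each criterion's raw score by its weight and sum.
    return sum(weights[c] * s for c, s in scores.items())

best = max(trial_scores, key=lambda v: weighted_total(trial_scores[v]))
print(best, round(weighted_total(trial_scores[best]), 2))
```

The weights force the team to agree up front on what matters most, so the winner reflects your priorities rather than the flashiest demo.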
We’ve created a tool you can download that helps.
Once you have all of the data, it should be clear which of the many options can work to provide your business with the best platform and tools for your current and future business intelligence needs.