As companies race to find value in their data and improve the customer experience, many have overlooked an obvious value-add: providing analytics for the customer. Client-facing analytics differentiates companies from the competition, helping increase market share, retention, and even direct revenue if packaged and productized.
Why aren’t all companies doing this already? The data is there, and reporting, dashboards, and visualizations are plentiful.
One of the most difficult aspects of providing analytics to customers is delivering a complete, appealing product. With analytics coming from a variety of internal business intelligence tools, packaging all of this information is far from easy. Users, both internal and external, expect a seamless analytics experience. And if bringing all of this content together into a single user experience weren’t challenging enough, each experience needs to be tailored to a different audience.
On top of wrangling content, curating different experiences, and creating a polished product, facilitating understanding is also a challenge. Chief Data Officers are tasked with fostering data culture, increasing BI adoption, and improving data literacy. However, these initiatives shouldn’t be limited to the internal organization. Companies need to extend this focus to clients and partners as well.
The Ideal Customer Analytics Portal
Given the needs and challenges discussed above, let’s describe the ideal external analytics portal for clients and partners:
The ideal analytics portal integrates reports and visualizations from all of your different BI tools, allows for the curation of content for different groups, including data literacy support, and provides an attractive, easy-to-use experience.
In summary, companies should aim to deliver analytics both internally and externally to maximize the value of their data and grow strong, data-informed relationships with clients and partners through an analytics portal.
Why Act Now?
If you provide your clients with informative analytics to help them grow their businesses, your business becomes an irreplaceable source of value. If you DON’T provide your clients and partners with analytics, someone else will.
The demand for analytics is continually increasing as companies use data to drive decision making. If you are not providing clients with informative analytics to help them grow their business, how can you be sure your service won’t be replaced by one that does?
Offer your clients a service that goes above and beyond the competition. Eliminate frustration by giving them full autonomy over their analytics environment.
How to create a Customer Portal with Digital Hive
Read more about our enterprise portal solutions, or get in touch if you want to chat about customer analytics portals – we can show you how simple it is to get set up!
Book a demo with a member of the team. See the full Digital Hive experience as well as some of the branded customer portals we’ve created.
Unless it pertains to politics or parking meters, people dislike change. Why? Change involves work, learning new skills, and the possibility of failure. This makes people uncomfortable and resistant. Maintaining the status quo is simply easier. When we look at why BI & Analytics initiatives fail, the reasons are not usually technical problems, but people problems related to change management and communication. Yet, in contrast to the average stakeholder, individuals who lead change are enthusiastic advocates and willing to put in the extra effort.
Why is this?
Champions of change understand the “Why, What, and How” of the change that is taking place.
Most stakeholders do not understand the “Why, What, and How.” This is where D&A strategies are failing. If you map these three critical pieces of information to current trends in Data & Analytics, the gap becomes very clear. Industry challenges include:
Understanding the potential value of data – Rise of the CDO (Data Culture)
Developing the skills to use data – less than 24% (Data Literacy)
The “Why” must be the basis for change, and if it does not bring significant value to stakeholders, the initiative is doomed from the start.
All three of these factors are important, but the first is the most critical. To determine value, there must be strong communication between the analytics team implementing technology and the stakeholders who will use it. This is when we need to determine:
What are the business goals or desired outcomes?
What decisions will be made to reach these outcomes?
What information is needed to make decisions and act?
This channel of communication between “Analytics” and “The Business” has historically been very weak. One reason that “change management” and “communication” are the most poorly executed components of an Enterprise Data Strategy is that Data & Analytics initiatives are championed by technologists. While data scientists might be some of the smartest people in the room, technology is their passion, not people. So, technology is what they focus on, and the people side of analytics gets neglected. This has led to the rise of “Analytics Translators” and other intermediaries.
The titles for this role are wide-ranging and have little consistency, but the need for an individual to lead the change management aspect of D&A initiatives is apparent. Call this person an “Analytics Translator”, a “Data Champion”, a consultant…whatever gets the job done effectively.
This is the hard part. Every company has unique needs, strengths, and weaknesses. There are a number of things to consider when improving the change management aspect of your Enterprise Data Strategy or Digital Transformation effort, in addition to a focus on the above:
Do you have a clear leader and advocate for D&A on the executive team? Hiring a CDO is now a must.
How does your analytics team work together and communicate with the rest of the business? Does your company have a dedicated analytics team or a Center of Excellence (CoE)? Should your analytics team be centralized or decentralized?
Do you have dedicated individuals responsible for developing Data Culture, Data Literacy, and driving adoption? This is a full-time job. Hire for it.
Is there a focus on business outcomes first and technology second?
We believe establishing an Enterprise Analytics Hub helps solve many of the challenges related to Data & Analytics change management. By centralizing all BI content in a single location and user experience, you establish a foundation from which to build a data culture, communicate with end-users, and receive feedback on business needs. You can also launch embedded data literacy campaigns, increase BI adoption by providing a single point of entry, and insulate end-users from the disruption that comes with the introduction of new tools and the sunsetting of legacy systems.
Has your company recently implemented Tableau, Qlik, or Power BI? Well, even those tools are now between 10 and 30 years old! Not to mention, they probably co-exist in your company with one of the other BI tools I mentioned.
Let’s fast forward. Arguably, the hottest analytics company on the market right now is ThoughtSpot. With their “Search and AI-driven” capabilities, they are at the cutting edge. If you have implemented ThoughtSpot, I am 99% certain it co-exists alongside AT LEAST one of the aforementioned BI tools.
“What’s the point, Spencer?”
Well, let’s tally up the number of BI tools you have. You certainly use Excel, most likely have a legacy BI tool like SAP, Oracle, or IBM, and there is a good chance you’ve introduced a 2nd generation data viz tool like Tableau or Power BI. If you are cutting edge, you might also have an augmented analytics tool like ThoughtSpot.
So you probably have 3 BI tools, if not more. Our thought exercise is supported by research from Gartner and Forrester, as well as an informal survey I conducted on LinkedIn, and I haven’t even touched on tools with analytics capabilities like Salesforce.
The point I am trying to make is that it’s very hard to keep up with innovation, resulting in the co-existence of many multi-generational analytics tools. Enterprise companies are simply too big and move too slowly to keep pace while simultaneously consolidating technology onto a single platform.
“Who cares? What’s the impact?”
Having multiple BI tools makes it harder to use analytics and make decisions. Your end-users end up worrying about analytics tools instead of DECISION MAKING. Multiple tools create silos of BI assets, making it difficult to find information, hard to drive BI adoption, and nearly impossible to establish data governance, consistency, or ease of use. This is a huge obstacle to establishing a strong data culture or effectively executing a change management strategy. To put it plainly, it makes things difficult. People don’t like difficult. People like easy. People like fast.
“What is the solution?”
I’m glad you asked! The solution is Digital Hive. Digital Hive is “Your Intelligent Enterprise Portal” that surfaces and recommends analytics in a personalized experience.
Recently, I finished my second Ironman 70.3 triathlon (for the record, that is NOT me in the photo). The triathlon is made up of three segments: swim, bike, and run. While the finishing time is reflected as a single value, there are five distinct times that comprise the total. Not surprisingly, the times for the swim, bike, and run portions are included, but also in the mix are the two transition times. A transition is the time spent between the different events. Once an athlete crosses the finish line, these five times are summarized for the final result.
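To make the "sum of the parts" point concrete, here is a tiny sketch of how a finishing time rolls up from the five splits. The split values below are hypothetical, not my actual race times:

```python
# Hypothetical Ironman 70.3 splits: the published result is just the sum
# of five parts -- swim, T1 (transition), bike, T2, run.
from datetime import timedelta

splits = {
    "swim": timedelta(minutes=38, seconds=12),
    "T1":   timedelta(minutes=4, seconds=5),
    "bike": timedelta(hours=2, minutes=48, seconds=30),
    "T2":   timedelta(minutes=3, seconds=10),
    "run":  timedelta(hours=1, minutes=58, seconds=45),
}

# sum() with a timedelta start value accumulates the five segments
total = sum(splits.values(), timedelta())
print(total)  # prints 5:32:42 -- the single number that hides the five-part story
```

All the context (weather, training hours, course conditions) is invisible in that one number, which is exactly the gap a data story fills.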
During this last race, I had some extra time (way more time than I expected) to ponder how the results are conveyed, and how there is so much more to the triathlon story than just the final time and the sum of the parts. There is the impact of the weather, how much training time was spent in each discipline, how effective that training was, conditions of the course, etc. Yet, when the dust settles and the results are in, it’s only these handful of metrics that are displayed.
The same scenario exists in analytics and business intelligence deployments. So much emphasis is placed on displaying the final results in a dashboard or report that a lot of supporting context is lost. Context that could help justify the results being seen, highlight and validate that previous business decisions have paid off, or worse, reveal that they had a negative impact. This is why the current trend, one that isn’t being adopted quickly enough in my opinion, is to start telling data stories.
A data story is one where, in addition to analytic assets, contextual information is included to help guide and inform the user. This way, users with varying levels of data literacy can all arrive at the same interpretation of the data. Special care must be taken not to lead the witness by framing the story with the author’s bias (Just the facts, ma’am. Just the facts), but by and large, data stories are much more effective at delivering the desired message to the masses.
As a data geek, I can tell you that a triathlon, and the whole training process, is FULL of metrics and various data points, so it’s a little disappointing that the results are displayed the way they are. Hmm, maybe a data story should be built around all of the data gathered while training for an event like a triathlon …
I had the opportunity to attend Tableau’s “Design tricks for great dashboards” webinar. The speaker was Andy Cotgreave, a visualization expert and Tableau veteran. During the webinar, Andy touched on ‘framing’. As a level set, framing is about the context in which data is presented, which is critical because author bias can creep in to lead the witness, err, dashboard viewer into arriving at a biased opinion. This has been a concern of mine for years, and it was great hearing Andy describe the problem. While the view of the data is static and there should be only one version of the truth, we are all unique individuals with different opinions, backgrounds, etc., and thus we don’t all interpret the data in the same way. I’ve always stated that if you showed one report, without context, in a management meeting, each member of the audience would interpret the report differently. Essentially, a report can be twisted to fit a lot of different narratives. During the webinar, Andy used these two visuals to explain this scenario.
The first is a well-known infographic (created by Simon Scarr) that was very impactful. This striking visual depicts the loss of life as a result of the military engagement in Iraq. Certain design choices were made, like the deliberate choice of colour (red) and the visual metaphor of dripping blood, to convey a very polarizing view.
But what if some simple changes were made to the infographic? I’m not talking wholesale changes to the layout and charts, I’m simply talking about tweaking the colour, orientation, and headline. As you can see below, taking the EXACT same infographic, rotating it 180 degrees, swapping out the red for a blue, and modifying the headline totally changes the narrative to something more positive.
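If you want to try the same-data/different-frame trick yourself, here is a minimal sketch using matplotlib (assumed available). The yearly values are hypothetical placeholders, not Scarr’s actual figures; only the colour, orientation, and headline differ between the two panels:

```python
# Same data, two frames: downward red bars read as "dripping blood",
# while upright blue bars read as "deaths on the decline".
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

years = [2003, 2004, 2005, 2006, 2007]
deaths = [580, 904, 945, 872, 961]  # hypothetical values for illustration

fig, (ax_red, ax_blue) = plt.subplots(1, 2, figsize=(8, 3))

# Frame 1: bars hang downward in red -- the polarizing, negative framing.
ax_red.bar(years, deaths, color="darkred")
ax_red.invert_yaxis()
ax_red.set_title("A bloody toll")

# Frame 2: identical data, upright and blue -- a neutral-to-positive framing.
ax_blue.bar(years, deaths, color="steelblue")
ax_blue.set_title("Deaths on the decline")

fig.savefig("framing.png")
```

Nothing about the underlying numbers changes; only the presentation does, which is exactly the author-bias risk framing introduces.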
At first glance of the original infographic, my visceral reaction was negative, and I had thoughts about how destructive and senseless war can be. When viewing the modified infographic, my first impression was “hey look, fewer people are dying”. Definitely a more positive narrative than the original infographic.
It is our role, as the data literate, to ensure that when building visualizations our personal bias and opinions don’t influence the interpretation of the results. Easier said than done, though. When tasked with creating new visualizations, I focus on the questions that the audience is looking to answer. Once the questions are understood, the focus shifts to providing the facts required to answer those questions. My preference is to not inject headlines or commentary into the visualizations. But without the commentary, aren’t the visualizations open for interpretation, thus propagating the ‘multi-version of the truth’ scenario?
Definitely, especially when the audience is not data literate and doesn’t have experience interpreting analytics. So how do we bridge the gap? To me, the best way to solve this problem is by focusing on finding correlation, and to a certain extent, causation (this is a slippery slope to injecting personal opinion, so beware), and adding that as context to support the analytics. When the data to support the decision-making process resides across different data sources or BI platforms, there is an opportunity to tell a larger, more complete data story. When commentary is placed into each individual visualization, legibility may suffer once multiple artifacts are brought together into the data story. Not only that, an opportunity is lost to establish correlations that transcend single visualizations and/or platforms. An effective data story should contain:
facts required to answer the audience’s questions
necessary visualizations to convey the data story
commentary that guides the audience to correlations in the data
When using multiple analytics solutions to tell the story, the emphasis should be on the data within the visualizations and not the technology that created them.
By sticking to the facts, a greater emphasis will be placed on the raw data and the correlations, versus forcing the audience into one potentially polarizing view or opinion.
April showers bring May flowers … but more importantly, April brings the Masters. My name is Scott, and I’m an analytics nerd with a golfing habit.
Although I enjoy the PGA tour, the Masters has always been special to me. Trevor and I (having both been fortunate enough to walk the fairways at Augusta on different occasions) were recently talking about the upcoming tournament. More specifically, who stood the best chance of winning and who didn’t. Still basking in the glow following the Gartner Data and Analytics conference, I thought surely there must be a way to use analytics to provide statistical evidence that our golf knowledge was above average.
Before I knew it, Trevor had built some Tableau workbooks that pulled in statistics from various sources, which then fed into a prediction model. Now, because the prediction wasn’t accurate (read: it didn’t align with my predicted outcome), I decided to enhance the prediction model … for accuracy purposes, of course. NOT so that the outcomes better aligned with my predictions. Having been impressed by a recent demo that I saw, my tool of choice was Power BI.
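For the curious, the kind of model we’re describing can be as simple as a weighted score over per-player stats. The stat names, weights, and player values below are all hypothetical stand-ins, not our actual model:

```python
# A minimal weighted-score prediction sketch: combine per-player golf stats
# into a single score and rank the field. Higher score = stronger pick.
def predict_score(stats, weights):
    """Weighted sum of a player's stats, using only the weighted keys."""
    return sum(weights[k] * stats[k] for k in weights)

# Hypothetical weights -- tweaking these is how one "enhances" the model ;)
weights = {"driving_accuracy": 0.2, "greens_in_regulation": 0.4, "putting": 0.4}

# Hypothetical normalized stats for two made-up players
players = {
    "Player A": {"driving_accuracy": 0.65, "greens_in_regulation": 0.72, "putting": 0.60},
    "Player B": {"driving_accuracy": 0.70, "greens_in_regulation": 0.68, "putting": 0.55},
}

ranked = sorted(players, key=lambda p: predict_score(players[p], weights), reverse=True)
print(ranked)  # prints ['Player A', 'Player B']
```

The real workbooks pulled live tour statistics, but the ranking idea is the same: pick features, weight them, sort the field.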
Ultimately, we ended up with a LOT of different visualizations (that mostly proved me wrong, or should I say, didn’t prove me right?!). Word quickly spread through the office, and to our friends, of the *cough* work *cough* that we were doing, and people naturally wanted to see the outcomes. Now, I don’t know if you’ve ever tried sharing many different visualizations, coming from different systems, to people who don’t really understand analytics, but let me tell you, it ain’t easy.
Fortunately, we both work on Theia, which is designed for telling data stories. Although our Masters debate, err, argument, err, research isn’t business related, we decided to create a data story to share with those interested parties. At the end of the day, a data story can be about anything, even a golf tournament. Plus, using Theia to tell our Masters story is a good way for us to justify our time and effort as ‘work related’.
I’m not going to share the winner of the 2019 Masters with you just yet, but keep your eyes open as we will be sharing the results through social media next week. Monday and Tuesday will showcase the analytics that went into the prediction, with the winner being revealed on Wednesday. Watch, or follow, @HeyTheia next week for all of our Masters fun, I mean work.
Semi-legal disclaimer: I did say that our golf knowledge was only above average, so don’t go making any large bets using our visualizations or predictions.