Those of us preaching the power of data – on LinkedIn, on stage at keynotes, and at yearly budgeting meetings – prescribe data as the solution to all problems. However, we all know the execution of a powerful analytics strategy is often more challenging than anticipated. These challenges are especially severe in higher education, and that was before a global pandemic and a looming economic recession. Now, the hurdles seem larger than ever: budgets are being significantly reduced or frozen, and analytics teams are being defunded and paralyzed in the face of uncertainty. Yet the greatest action universities can take today, the one thing they can control, is to double down on data. Those who choose to do so will evolve and emerge stronger. Those who don’t may not survive.
Adoption is essential
Higher education has always been slow to adopt new technology, which I think is understandable. Universities are large and complex. They have many layers of leadership and often rely on external funding. It’s this complexity that makes data such a valuable asset for university decision making, and in the past decade, major strides have been made to introduce data & analytics tools across departments. The results of these changes, however, have been slow to surface.
One of the reasons data investments are seeing less-than-stellar returns is the university’s organization and structure. Many groups operate independently from one another – especially when it comes to technology purchasing decisions – and this has resulted in a variety of siloed tools and technologies.
Efforts have been made to create university-wide analytics councils and information management teams, but working backwards to resolve the incompatibility of existing technologies is still difficult.
Now that valuable information is being generated by many different departments, visibility into the key information that could be a saving grace for universities in crisis mode is critical. Yet accessing, sharing, and acting on this information quickly is still next to impossible.
Everything in one place
Branded as ‘ConnectTo’, Pomona College’s advancement department used Digital Hive to create a single, unified information portal bringing together IBM Cognos, Microsoft Power BI, Tableau, and SSRS. Last year, Pomona College won a National Silver CASE Award for ‘ConnectTo’. Now, users have one place to go to easily find reports, dashboards, documents, training materials and more.
This has resulted in greater efficiency and productivity across departments, a more than fourfold increase in adoption, a reduction in technology management costs, and a direct impact on decision making around fundraising campaigns.
Future-Proofing
While actively investing in new technology projects during a period of uncertainty may seem risky, the greatest risk is in retreating to the status quo. In all areas of our society, both in business and our personal lives, we have experienced a steady increase in digital transformation.
The COVID-19 pandemic is not creating a new normal; it has simply accelerated the inevitable evolution of how we behave and interact. For many businesses that made digital investments early on, this period will mark an opportunity to accelerate past the competition. For others, it’s a wake-up call that the train is leaving the station and immediate action is required. Unfortunately for the rest, a lack of action when times are good and when times are bad will result in devastating consequences.
Click here to read more about how Digital Hive transformed Pomona College’s fundraising efforts, or book a demo to see Digital Hive in action.
As companies race to find value in their data and improve the customer experience, many have overlooked an obvious value-add: providing analytics for the customer. Client-facing analytics differentiates companies from the competition, helping increase market share, retention, and even direct revenue if packaged and productized.
Why aren’t all companies doing this already? The data is there, and the reports, dashboards, and visualizations are plentiful.
The Challenge
One of the most difficult aspects of trying to provide analytics to customers is delivering a complete, appealing product. With analytics coming from a variety of internal business intelligence tools, packaging all of this information is far from easy. Users, both internal and external, expect a seamless analytics experience. And if bringing all of this content together into a single user experience wasn’t challenging enough, each experience needs to be tailored for different audiences.
On top of wrangling content, curating different experiences, and creating an attractive product – facilitating understanding is also a challenge. Chief Data Officers are tasked with fostering data culture, increasing BI adoption, and improving data literacy. However, these initiatives shouldn’t be limited to the internal organization. Companies need to extend this focus to clients and partners as well.
The Ideal Customer Analytics Portal
Given the needs and challenges discussed above, let’s describe the ideal external analytics portal for clients and partners:
The ideal analytics portal integrates reports and visualizations from all of your different BI tools, allows content to be curated for different groups (including data literacy support), and provides an attractive, easy-to-use experience.
In summary, companies should aim to deliver analytics both internally and externally to clients and partners, using an analytics portal to maximize the value of data and grow together through strong data-informed relationships.
Why Act Now?
If you provide your clients with informative analytics to help them grow their businesses, your business will become an irreplaceable source of value. If you DON’T provide your clients and partners with analytics, someone else will.
The demand for analytics is continually increasing as companies use data to drive decision making. If you are not providing clients with informative analytics to help them grow their business, how will your service remain an irreplaceable source of value?
Offer your clients a service that goes above and beyond the competition. Remove frustration by giving them full autonomy over their analytics environment.
How to create a Customer Portal with Digital Hive
Read more about our enterprise portal solutions, or get in touch if you want to chat about customer analytics portals – we can show you how simple it is to get set up!
Book a demo with a member of the team. See the full Digital Hive experience as well as some of the branded customer portals we’ve created.
Recently, I finished my second Ironman 70.3 triathlon (for the record, that is NOT me in the photo). The triathlon is made up of three segments: swim, bike, and run. While the finishing time is reflected as a single value, there are five distinct times that comprise the total. Not surprisingly, the times for the swim, bike, and run portions are included, but also in the mix are the two transition times – the time spent between the different events. Once an athlete crosses the finish line, these five times are summed for the final result.
During this last race, I had some extra time (way more time than I expected) to ponder how the results are conveyed, and how there is so much more to the triathlon story than just the final time and the sum of the parts. There is the impact of the weather, how much training time was spent in each discipline, how effective that training was, conditions of the course, etc. Yet, when the dust settles and the results are in, it’s only these handful of metrics that are displayed.
The same scenario exists in analytics and business intelligence deployments. So much emphasis is placed on displaying the final results in a dashboard or report that a lot of supporting context is lost – context that could help justify the results being seen, or highlight and validate that previous business decisions have paid off or, worse, had a negative impact. This is why the current trend, one that isn’t being adopted quickly enough in my opinion, is to start telling data stories.
A data story is one where, in addition to analytic assets, contextual information is included to help guide and inform the user. This way, users with varying levels of data literacy can all arrive at the same interpretation of the data. Special care must be taken not to lead the witness by framing the story with the author’s bias (Just the facts, ma’am. Just the facts), but by and large, data stories are much more effective at delivering the desired message to the masses.
Being a data geek, I can tell you that a triathlon, and the whole training process, is FULL of metrics and various data points, so it’s a little disappointing that the results are displayed the way that they are. Hmm, maybe a data story should be built around all of the data gathered while training for an event like a triathlon …
Today I had the opportunity to attend Tableau’s “Design tricks for great dashboards” webinar. The speaker was Andy Cotgreave, a visualization expert and Tableau veteran. During the webinar, Andy touched on ‘framing’. As a level set, framing is about the context in which data is presented, which is critical because author bias can creep in to lead the witness, err, dashboard viewer into arriving at a biased opinion. This has been a concern of mine for years and it was great hearing Andy describe the problem so effectively.
Although the view of the data is static and there should be only one version of the truth, we are all unique individuals with different opinions, backgrounds, etc., and thus we don’t all interpret the data in the same way. I’ve always stated that if you showed one report, without context, in a management meeting, each member of the audience would interpret the report differently. Essentially, a report can be twisted to fit a lot of different narratives. During the webinar, Andy used these two visuals to explain this scenario.
The first is a well-known infographic (created by Simon Scarr) that was very impactful. This striking visual depicts the loss of life as a result of the military engagement in Iraq. Certain design choices were made, like the deliberate choice of colour (red) and the visual metaphor of dripping blood, to convey a very polarizing view.
But what if some simple changes were made to the infographic? I’m not talking wholesale changes to the layout and charts, I’m simply talking about tweaking the colour, orientation, and headline. As you can see below, taking the EXACT same infographic, rotating it 180 degrees, swapping out the red for a blue, and modifying the headline totally changes the narrative to something more positive.
At first glance of the original infographic, my visceral reaction was negative, and I had thoughts about how destructive and senseless war can be. When viewing the modified infographic, my first impression was “hey look, fewer people are dying”. Definitely a more positive narrative than the original infographic invoked.
It is our role, as the data literate, to ensure that when building visualizations our personal bias and opinions don’t influence the interpretation of the results. Easier said than done, though. When tasked with creating new visualizations, I focus on the questions that the audience is looking to answer. Once the questions are understood, the focus shifts to providing the facts required to answer those questions. My preference is to not inject headlines or commentary into the visualizations.
But without the commentary, aren’t the visualizations open to interpretation, thus propagating the ‘multiple versions of the truth’ scenario?
Definitely, especially when the audience is not data literate and doesn’t have experience interpreting analytics. So how do we bridge the gap? To me, the best way to solve this problem is by focusing on finding correlation, and to a certain extent causation (this is a slippery slope to injecting personal opinion though, so beware), and adding that as context to support the analytics. When the data to support the decision-making process resides across different data sources or BI platforms, there is an opportunity to tell a larger, more complete data story. When commentary is placed into each individual visualization, legibility may suffer when multiple artifacts are brought together into the data story. Not only that, an opportunity is lost to establish correlations that transcend single visualizations and/or platforms. An effective data story should contain:
The facts required to answer the audience’s questions
The necessary visualizations to convey the data story
Non-biased commentary that guides the audience to correlations in the data
When using multiple analytics solutions to tell the story, an emphasis on the data within the visualizations and not the technology that created them
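To make the correlation point concrete, here is a minimal sketch – with made-up file names, column names, and metrics, and no reference to any specific BI tool’s API – of how data exported from two different platforms could be joined and checked for a correlation before that finding is written up as non-biased commentary in a data story:

```python
import pandas as pd

# Hypothetical exports from two different BI platforms (file and column
# names are made up for illustration): monthly marketing spend from one
# tool and monthly new-donor counts from another.
spend = pd.read_csv("marketing_spend_export.csv")   # columns: month, spend
donors = pd.read_csv("new_donors_export.csv")       # columns: month, new_donors

# Line the two sources up on the shared month column.
combined = spend.merge(donors, on="month")

# Pearson correlation between the two metrics; a value near +1 or -1 hints
# at a relationship worth surfacing as commentary in the data story.
r = combined["spend"].corr(combined["new_donors"])
print(f"Correlation between marketing spend and new donors: {r:.2f}")
```

The point isn’t the tooling – any language or spreadsheet would do – it’s that the correlation is computed across sources and then offered as supporting context, rather than baked into a single visualization’s headline.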
By sticking to the facts, a greater emphasis will be placed on the raw data and the correlation, versus forcing the audience into one potentially polarizing view or opinion.