Three technologies are being talked about everywhere, but they are mistakenly intertwined and overlapped.
1. Data Catalog:
What it is: A data catalog is a centralized repository that contains metadata about data assets within an organization. It serves as a comprehensive inventory of available data sources, datasets, databases, tables, files, and other data-related resources. The catalog provides information such as data descriptions, data lineage, data quality, usage statistics, business terms, and access permissions. The primary purpose of a data catalog is to enable data discovery, facilitate data governance, and improve data collaboration across teams. Content producers (e.g., data analysts and data scientists) are the primary consumers of this service.
What it is not: A repository for all things upstream, like Power BI files, Tableau workbooks, notebooks, or report and dashboard definitions. All the data used in semantic layers, business definitions, and other analytical artifacts should have lineage traceable via a data catalog. An exception is where data products are produced from other source data; in those cases, that definition is required to trace lineage back fully.
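To make this concrete, here is a minimal sketch of what a single catalog entry might hold, written in plain Python purely for illustration; the field names are hypothetical rather than any vendor’s actual schema, but they mirror the metadata types described above.

    # Hypothetical catalog entry for one table (illustrative field names only).
    catalog_entry = {
        "asset": "warehouse.sales.orders",              # the data asset being described
        "type": "table",
        "description": "One row per customer order, loaded nightly",
        "lineage": ["crm.raw_orders", "erp.invoices"],  # upstream sources feeding this table
        "quality_score": 0.97,                          # e.g. share of rows passing quality checks
        "monthly_queries": 1842,                        # usage statistics
        "business_terms": ["Order", "Net Revenue"],     # links to the business glossary
        "access": ["analysts", "data-science"],         # who may use the asset
    }

Note that the entry describes the orders table – where it came from, how healthy it is, who may use it – without holding a single row of the data itself.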
2. Metrics Store:
What it is: A metrics store is a specialized system designed to be an additional, intermediate layer between the data source (database, warehouse, file) and other upstream systems, especially BI and analytics solutions. These repositories contain definitions of the underlying data and form a semantic or business layer that gives content users common ways of accessing and manipulating that data (e.g., shared calculations and normalizations). Content producers are the primary consumers of this service.
What it is not: A repository for data or analytics assets. Its job is to make creating upstream reports, dashboards, and visualizations easier with reusable business and calculation definitions.
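As a rough sketch (again in illustrative Python rather than any particular product’s syntax), a metrics store holds definitions like the one below – the logic, not the data – so every upstream tool computes the measure the same way.

    # Hypothetical metric definition: the store keeps the logic, never the rows.
    net_revenue = {
        "name": "net_revenue",
        "description": "Gross revenue minus refunds, in USD",
        "source": "warehouse.sales.orders",                   # where the raw data lives
        "calculation": "SUM(gross_amount) - SUM(refund_amount)",
        "dimensions": ["region", "order_date", "channel"],    # approved ways to slice it
        "owner": "finance-analytics",
    }

Any report or dashboard that needs net revenue references this single definition instead of re-implementing the calculation in its own tool.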
3. Analytics Catalog:
What it is: An analytics catalog contains the metadata associated with analytical assets and artifacts. It provides a centralized repository for storing and organizing analytical reports, dashboards, visualizations, and other analytics-related objects from various locations and vendors. The analytics catalog helps data analysts, data scientists, business users, and all consumers discover and access analytical assets, understand their context and business logic, and promote collaboration and reuse of analytical work within the organization. It also helps analytics and BI teams get a better understanding of usage and usability to help focus their efforts.
What it is not: It is not another Business Intelligence tool. It does not require access to data or replication of data. It is not a technology used to define metrics outside of other analytics systems in use.
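Contrasting with the two sketches above, hypothetical analytics catalog entries (again illustrative Python with invented names) point at finished assets living in different vendors’ tools and carry usage metadata – but no data and no metric logic of their own.

    # Hypothetical analytics catalog entries spanning multiple BI vendors.
    analytics_assets = [
        {"title": "Quarterly Sales Dashboard", "tool": "Power BI",
         "owner": "sales-analytics", "views_90d": 412},
        {"title": "Churn Drivers Report", "tool": "Tableau",
         "owner": "customer-insights", "views_90d": 87},
    ]

The views_90d counts are the kind of usage signal that helps an analytics and BI team decide where to focus its efforts.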
In summary:
–Data Catalog: Contains metadata about data assets (datasets, databases, files) to facilitate data discovery and data governance.
–Metrics Store: Contains business-ready definitions of data to facilitate data consumption.
–Analytics Catalog: Focuses on metadata related to analytical assets, reports, and dashboards, to support analytics collaboration and reuse.
While there may be some overlap in functionality (they all offer search, and they all live in the world of analytics), these three components serve different purposes and cater to different aspects of data management and analytics within an organization.
Let’s face it. Every provider of Business Intelligence (BI) and Analytics would love nothing more than to see organisations consolidate and unify under their platform or brand. It’s the reality of standardisation.
The expectation is a seamless transition, but is that the reality?
Do all tools deliver the same results?
On the surface, it might seem so – in the end, they all offer data visualisations. But delve deeper, and you’ll find vast disparities, from distinct authoring methods to the originality of their charts. A tool that offers reporting cannot simply be replaced by one that doesn’t. For instance, substituting Cognos’s reporting functionalities with a desktop tool, particularly with data governance or external consumer requirements, isn’t just ill-advised, it’s reckless.
It’s not just analytics vendors that are driving this narrative. Key decision-makers frequently opt to ‘switch’ tools when they procure a new one. The allure of reducing licences and associated costs is just too much for some, but like the sirens in the water, it often pulls our attention away from the real danger: the cost of switching. Let’s break it down:
Training
It’s great transitioning to a new tool with all its new bells and whistles, but you need to actually learn how to use the thing. Not only that, but you need to transfer any content from the existing system into the new one, placing a burden on the current content owners in terms of time and effort.
This becomes a people issue. Inform someone that you’re altering the tools and tasks they’re accountable for, and they’ll be inclined to shift to the new tool, or if that isn’t viable, they might choose to leave. If they leave, you risk delay and the loss of institutional knowledge.
Human Resources
The argument that existing staff will migrate the required content oversimplifies the business case. What about the new content and the aspirations of being data-driven that everyone is striving for? To demonstrate the value derived from this new expense promptly, you’ll need additional hands-on deck for these projects. More people equals more money. Money that you probably didn’t account for in the first place.
Infrastructure
Operating multiple systems simultaneously will inflate hosting costs. We mustn’t forget the databases and source systems. Recall the challenges of conducting load and stress testing against production sources.
Users
Losing users comes at a high price. This group is likely to voice the most objections. They may have advocated for improvements in the analytics experience, but standardisation implies a total change. During this period, this group is likely to fragment further. Rogue tools, new data export requests, or simply surrendering in the quest for the information they need can result in severe damage. This takes you further from the transparent and aligned data-driven culture you’re aiming for.
Keep in mind – Users who are vocal about their experience are the ones using it. Don’t mistake engagement and passion for bitching and moaning. You want engagement because it leads to enhancements and betterment across the board. Let’s not even talk about Outlook!
Double Licences
While you can try to minimise this expense, as the deployment of the new tools gets closer, users will need access to both. Validation and confidence building, as well as contingency planning if things go awry, are crucial for success (which is rare). It’s our firm belief that the BI and Analytics market figures floating around are inflated, as most users have multiple tools doing the same job.
Time
Choosing not to invest in people to do the work equates to prolonged timelines. All vendors advocate the ‘time to value’ concept, but this is only achievable with simplistic projects and some “Services” to assist or train along the way.
So, what do we truly gain when we switch from one tool to another?
Wasted time
Recreating content that is already accepted and available. (There are no migration tools available between vendors, only services groups.)
Increased expenditures
Training, duplication of resources (human and financial)
Missed opportunities
Forfeiting chances to create new content for new projects and analytics to effect better changes.
In this context, a SWITCH implies a:
Sudden
Wild
Increase in
Total
Cost of
Holistic ownership.
If you’re contemplating a switch, here’s some advice:
Rearticulate the desired outcomes.
Reimagine the problem without the solution.
Reassess how the solution will meet the outcomes and tackle the problems, equipped with the insights above to determine if the cost is justified.
In our opinion, the only solid argument for switching is if the current solution is no longer vendor-supported.
Your Mind is Set on Consolidation
If you’re still keen on consolidation, bear in mind the inevitable truth. There’s no one-size-fits-all “ring”. While BI platforms might be replaceable at a high cost, they’re no longer the only sources for analytics. Every solution provider has a strategy to introduce or enhance the analytics provided on their platform. Often, these aren’t accessible to other tools or necessitate manual Extract, Transform and Load (ETL) of data.
To put it another way, what does consolidation or standardisation truly mean for your business? Could a Unification strategy be a more fitting approach? Let’s consider some typical outcomes:
Single place for analytics
–Consolidation: FAIL. External applications still exist.
–Unification: PASS.

Reduced costs
–Consolidation: FAIL. Spend happens elsewhere as teams retool (shadow ops).
–Unification: PASS. Reduction of ‘legacy’ tooling content happens naturally over time, by users – as does the expense.

Happy users
–Consolidation: FAIL. Disruption; users are forced into a single tool; change is hard.
–Unification: PASS. The best tool for a job survives; minor changes; natural transitions.

Data Literacy
–Consolidation: FAIL. One tool can’t do it all; multiple places still exist.
–Unification: PASS. Users are unified in how they access analytics and are exposed to more analytics and use cases.
This should serve as an awakening for many data and analytics team owners (including executives). Consolidation has been a catastrophe, benefiting only service teams and vendors for the past 15+ years.
Some vendors were wise enough to see the value of a side-by-side approach, but the collective end users were overlooked, leading to negative outcomes.
There is a Harmony to All This
To put it simply, we think that Unification and harmony form the strategy with the most tangible benefits at the user level. It’s a strategy that’s already being implemented, with considerable expenditure, at the data layer with virtualisation, data catalogues, and metrics stores (but is likely to fail due to the lack of end-user consideration).
If you’re operating in an environment with multiple Data and Analytics tools (as everyone is), you owe it to yourself, your organisation, and your staff to explore unification and harmonisation.
*Sorry readers, we used a bad word. There may be more throughout this article, but if it encourages you to critically examine the concept of standardisation and its negative impact, then it’s worth it. Sorry, not sorry.*
Introduction
Yeah, you heard us right the first time. Standardisation in analytics tools, often hailed as the cornerstone of technological progress, is bullsh*t. Don’t believe us? Let’s put it another way. Clinging to a single standardised analytics and BI tool can stifle creativity, hinder flexibility, and ultimately slow down progress within your business.
But that’s just one side of the argument. In this article, we’ll dive into the controversial statement, shedding light on both the pros and cons of BI standardisation, and challenging the widely held belief that standardisation is always beneficial.
The double-edged sword of standardisation
Standardisation is the process of creating and implementing technical standards based on the consensus of different parties that include firms, users, interest groups, standards organisations, and governments. It’s supposed to help maximise compatibility, interoperability, safety, repeatability, and quality.
It’s often associated with a number of benefits, including:
Increased efficiency: By creating a common framework for businesses to operate within, standardisation helps them reduce costs and increase efficiency.
Improved quality: By ensuring that products and services meet certain standards, it can improve their quality.
Increased safety: As above, but with safety standards in place of quality standards.
Increased compatibility: Standardisation can help to increase compatibility between products and services, making it easier for businesses to work together.
Reduced spend: Removing expensive and overlapping tooling licences, the need for multiple ABI teams (one per tool), and overlapping infrastructure (per tool). Don’t forget the better rates that come with the increased buying power of a single chosen Analytics and BI vendor!
Sounds incredible, right? Who wouldn’t jump at the chance to be more efficient and cost-effective while improving the quality and safety of products and services? But as the subtitle suggests, this is a double-edged sword, and sadly this blade is pretty sharp.
You see, standardisation can also backfire for companies who embrace it. For example, it can:
Lead to vendor lock-in: When a company standardises on a particular Analytics and BI platform, it becomes more dependent on that vendor, making it more difficult and expensive to switch to a different platform in the future.
Reduce flexibility: Standardisation can reduce the flexibility of an analytics team, because a single vendor will do some things well and others not so well. The things it doesn’t do well force rigid workarounds for its limitations.
Stifle innovation: Standardisation can stifle innovation by discouraging developers from building new and unique analytics applications, because they may be reluctant to invest time and effort in applications that are not compatible with the standardised platform.
Opportunity cost: Migrating existing content from other Analytics and BI tools is time-consuming and costly. Typically, everything gets moved (without knowing its value), which means new projects aren’t being done and you’re missing opportunities while wasting time on content that isn’t needed.
Not so great now, is it?
Standardisation Simplifies Interoperability
Another reason that people gravitate toward analytics standardisation is the way it simplifies interoperability or removes the need for interoperation entirely. The allure of everything working together seamlessly, fostering compatibility, and reducing friction in user experience is too enticing to miss out on. I mean, imagine a world where every manufacturer had a different design for electric sockets or USB ports – chaos would ensue – we’re looking at you, Apple!
But does that mean you have to succumb to the other negatives we discussed? There may be a better way.
At Digital Hive, we like companies to have freedom within their analytics tech stack, utilising tools and services that tick every box based on need, not just a few because the others won’t play nicely together.
By layering Digital Hive over your analytics tech stack, you get the benefits of standardisation without the negatives that accompany it. Instead, you get to keep the ABI tools and services that work for your organisation and your individual business unit needs, while adding in a branded front end that is as simple or in depth as you need it to be.
Imagine a place where all your analytics assets live, easily accessible without having to reinvent the wheel on how it’s accessed. Now imagine having to standardise that content to fit a new product just because it plays nice with the flavour of the week tech that no one wants, but it’s part of the package you just bought. Got to get your money’s worth, right?
Stop Fitting Square Pegs in Round Holes
Okay, the title is a bit provocative, but you get the point. While standardisation offers undeniable benefits, it’s not a panacea. It can, and does, block innovation, reduce flexibility, and stifle competition.
The key is to strike a balance. By using Digital Hive to collate ABI software into one easily accessible front end, you can begin fostering an environment that encourages usage, improves the productivity of users and power users, adapts to change, and helps BI teams prioritise work and understand value. We can enjoy the benefits of standardisation without falling into its potential pitfalls. After all, in the dynamic world of technology, adaptability, speed, and balance are the keys to success.
Thus, it’s not standardisation itself that is bullsh*t; rather, it’s blind adherence to standardisation, without considering its potential drawbacks and the need for balance, that can lead us down a problematic path. By recognising this, we can navigate the complex landscape of technology with a more nuanced understanding and a greater potential for progress.
For more information about Digital Hive and how we can work with you to achieve amazing results, contact us today.
A few years ago I met Walter Isaacson, former Chairman of CNN, Editor of TIME, and author of Steve Jobs’ biography. If you can’t tell from his pedigree, Isaacson is a great storyteller. He also wrote about other famous innovators, including Benjamin Franklin, Albert Einstein, and Leonardo Da Vinci. I only had time to ask him one question, so I made it a good one: “What did Jobs, Franklin, Einstein, and Da Vinci have in common that made them such great visionaries?”
Isaacson smiled and responded, “All great innovators operate at the intersection of Art and Science.” I think Isaacson would agree this balance applies to data storytelling as well. Truly effective storytelling drives business action, and this occurs with the right mix of facts, visual presentation, and contextual narrative. Finding this balance is a challenge, but with the right tools and methodology, you can go from creating flashy dashboards to actually informing decisions.
Data Storytelling
Over the past decade, there has been a massive push for companies to leverage data. We are starting to see the rise of Chief Data Officers. Humans are visual by nature, so we have also seen increased adoption of user-friendly visualization tools like Tableau, Qlik, Power BI, and ThoughtSpot. As the push for data democratization and access to data continues to increase, we need to ensure data is being effectively communicated and consumed – not just put into a pretty dashboard.
What is Data Storytelling? Data Storytelling is translating data in an easy-to-understand way to help people take action in the business. There are three main components to data storytelling: storyboarding, data visualizations, and data narrative.
The art of communicating using data and analytics is still on the starting blocks. However, by establishing a methodology and using new technologies to support us, we can realize the full value of our data, inspire action, and transform Data Storytelling from an industry buzzword into an effective boardroom practice.
Rather than just deliver report requests, analytics teams must establish a dialogue with the business to understand the context. Context includes goals, challenges, and potential decisions that the business will make. In creating this dialogue, gaps in understanding will appear. These gaps will highlight the best questions to ask of the data. Ultimately, the answers to these questions will deliver the value business leaders have been seeking.
Using Technology for Storytelling
Once the context has been established and the right questions are being asked, analytics teams can use technology to help communicate information with a narrative to increase understanding. We use reports and data visualization tools now. Data visualization helps us see blatant patterns, but it isn’t ideal for communicating context and situational nuances. We also shouldn’t assume interpreting a visualization is easy for everyone. With the global data literacy rate struggling at around 24%, delivering an isolated report or visualization is risky – the information can easily be misinterpreted and lead to costly decisions.
New technology, like Digital Hive’s Enterprise Portal enables companies to easily balance the art and science of data storytelling so they can communicate and understand the entire business narrative – and ultimately make the best decisions.
By bringing together reports, visualizations, and dashboards from all of your different BI tools into a single storyboard, you can mix best-of-breed technology to deliver all of the facts. Contextually, you can incorporate video, custom messaging, presentations, and data literacy support assets to complete the narrative and inspire action.
The ideal balance of data, visualization, and narrative can now be achieved without the limitations of any one tool or technology because you can use all of your tools together seamlessly.
Conclusion
To increase the value of analytics for the business, we must find a greater balance between the art and science of data storytelling. When looking to improve the art, we must change the way analytics teams and the business communicate context. Then, we need to ask impactful questions of our data.
Finally, when delivering our findings, we should leverage technology to support us by using data visualization and data storytelling tools to communicate insight within a narrative.
*Image shows an example Digital Hive gameboard/storyboard with assets from multiple BI tools sitting side by side in a single view.
Digital Hive and Data Storytelling
Digital Hive dynamically displays content from any information system seamlessly in one unified platform – providing the easiest, most efficient, and customizable experience for the delivery and consumption of data stories on the market today. Behind the scenes, Digital Hive defends users from change-disruption, tracks analytics adoption, and reduces the IT backlog.
Organizations today are faced with more decision-making challenges than ever before. This is due to the sheer volume of data, disparate sources, and breadth of information that they must process to operate effectively—not to mention their competitors’ efforts to outmaneuver them at every turn.
Traditional business intelligence (BI) helps organizations make better decisions. However, these tools can’t solve all the challenges companies face. Technology, process, and people are three key pillars of transformation. Technology is constantly changing and innovating, and optimizing platforms and processes is the key to leveraging and delivering insights in the fastest and most effective way. Could an enterprise portal be the ticket you’ve been waiting for all this time?
Businesses need to become more intelligent, which means they need to make their organization more agile. They need to be able to adapt quickly and correctly, but in order to do that they need a modern platform for insight delivery from multiple tools. To deliver such a platform, organizations should invest in an Intelligent Enterprise Portal (IEP).
“Highly successful agile transformations typically delivered around 30 percent gains in efficiency, customer satisfaction, employee engagement, and operational performance; made the organization five to ten times faster; and turbocharged innovation.” – McKinsey & Company
This all sounds nice… but is it a necessity for all businesses? In this article we’ll highlight how enterprise portals can help businesses cut costs dramatically while driving value at the same time. We’ll also cover what an enterprise portal is, how it works, and why it shouldn’t be perceived as a ‘nice to have’.
What is an Enterprise Portal?
An enterprise portal is a central information hub that provides users with real-time access to critical organizational data and information. It acts as a web-based platform that combines all your existing business’s analytics and intelligence sources (on-premises and cloud), giving users a consistent interface across multiple technologies and a direct route to what they need. Over and above having a centralized analytics experience, the right intelligent enterprise portal can provide your users with that extra layer of value they need by learning from user behavior patterns, history, and peer activity.
Death by multiple systems?
Regardless of a user’s role in the business, most, if not all, daily activities involve manual processes because the people involved don’t have a direct route to the information they need. This leads to duplication of effort and the delays that come with it.
For example, let’s say there is a problem with an order. For a team – whether customer-facing or internal – to solve it, they need access to information across 25 different systems and applications. That’s 25 different platforms with 25 different logins, 25 different-looking portals, 25 different ways to navigate around a platform, 25 different ways to extract information, 25 portals that don’t know or understand the user… the list goes on. At even a conservative minute and a half per system, that’s 25 × 1.5 = 37.5 minutes wasted simply logging in and accessing what they need – they haven’t even started on the ‘solving’ part.
Employees spend 10+ hours each week searching for information (The Economic Times)
Lost productivity and profits
Accessing, navigating, and managing multiple systems affects your bottom line. It’s as simple as that. We’re talking about hours and days lost, reducing the productivity of your employees, and impacting their engagement and morale. In essence, businesses are paying people to ‘waste time’.
In most businesses where a lack of efficiency is called into question, it is usually put down to people spending too much time on their mobile devices, standing chatting at the coffee machine, or a meeting that overruns. But in today’s technology-driven world, could scenarios like the one above (managing 25 different platforms) be:
Costing you in dollars, time, and productivity?
Demoralizing employees?
The reason for low productivity and engagement?
Contributing towards low adoption of BI across the business?
Preventing the creation of a data-driven culture?
Stopping you from becoming a smarter organization?
So, in a year’s time, how much money might you be losing?
Keep it simple and centralized with an Enterprise Portal
The pressure to stay competitive is growing, and businesses that digitally transform are better placed because they are able to use analytics and information to make quick and informed decisions. Focus needs to be on the user, their experience, and what information they have at their disposal. Are inefficient processes and disconnected technologies hindering the business’s success?
Replacing the multiple-portal experience with a centralized view of critical information and analytics across existing systems and technologies – all within one unified interface – allows users to see what’s happening now and what’s coming up next.
This will not only save users time and eliminate inefficiencies, but it can also help users make better, faster decisions across your organization.
To find out how Digital Hive can help you drive change and centralize, contact us today!
Those of us preaching the power of data – on LinkedIn, on stage at keynotes, and at yearly budgeting meetings – prescribe data as the solution to all problems. However, we all know the execution of a powerful analytics strategy is often more challenging than anticipated. These challenges are especially severe in higher education, even before operating in the midst of a global pandemic and looming economic recession. Now, the hurdles seem larger than ever. Budgets are being significantly reduced or frozen, and analytics teams are being defunded and paralyzed in the face of uncertainty. Yet the greatest action universities can take today, the one thing they can control, is to double down on data. Those who choose to do so will evolve and emerge stronger. Those who don’t may not survive.
Adoption is essential
Higher education has always been slow to adopt new technology, which I think is understandable. Universities are large and complex. They have many layers of leadership and often rely on external funding. It’s this complexity that makes data such a valuable asset for university decision making, and in the past decade, major strides have been made to introduce data & analytics tools across departments. The results of these changes, however, have been slow to surface.
One of the reasons data investments are seeing less-than-stellar returns is the university’s organization and structure. Many groups operate independently from one another – especially when it comes to technology purchasing decisions – and this has resulted in a variety of siloed tools and technologies.
Efforts have been made to create university-wide analytics councils and information management teams, but working backwards to resolve the existing incompatibility of different technologies is still difficult.
Now, as valuable information starts to be generated by many different departments, having sight of the key information that could be a saving grace to universities in crisis mode is critical. However, accessing, sharing, and acting on this information quickly is still next to impossible.
Everything in one place
Pomona College
Branded as ‘ConnectTo’, Pomona College’s advancement department used Digital Hive to create a single, unified information portal bringing together IBM Cognos, Microsoft Power BI, Tableau, and SSRS. Last year, Pomona College won a National Silver CASE Award with ‘ConnectTo’. Now, users have one place to go to easily find reports, dashboards, documents, training materials, and more.
This has resulted in greater efficiency and productivity across departments, a more-than-quadrupled increase in adoption, and a reduction in technology management costs, and it has directly impacted decision-making with regard to fundraising campaigns.
Future-Proofing
While actively investing in new technology projects during a period of uncertainty may seem risky, the greatest risk is in retreating to the status quo. In all areas of our society, both in business and our personal lives, we have experienced a steady increase in digital transformation.
The COVID-19 pandemic is not creating a new normal, it has simply accelerated the inevitable evolution of how we behave and interact. For many businesses who made digital investments early on, this period will mark an opportunity to accelerate past the competition. For others, it’s a wake-up call that the train is leaving the station, and immediate action is required. Unfortunately for the rest, a lack of action when times are good and when times are bad, will result in devastating consequences.
Click here to read more about how Digital Hive transformed Pomona College’s fundraising efforts, or book a demo to see Digital Hive in action.