
How Interactive is your Dashboard?

One of the impacts of Generative AI is the way it has encouraged us to revisit a whole range of existing tools, frameworks and processes to consider whether long-standing deficiencies, gaps and frustrations could be resolved.  One subject of discussion was the many different aspects of corporate dashboarding, from the most pixel-perfect issues of custom visualization to the broadest psychological aspects of metric gaming.  This is very definitely a ‘brain dump’ with the following high-level sections:

  • context
  • data representation
  • broader context
  • analysis capabilities
  • the future?

CONTEXT

Just for fun, let’s play around with the origin of the term ‘dashboard’ for a second.  A dashboard is the front board of a horse-drawn buggy which protects the occupants from flying mud as the horses dash through the streets.  Presumably that was inherited into early motor car design, and when the instrumentation was placed on the driver’s side of the dashboard the term stuck.  Two elements are worth preserving from that: one is that you want it to be useful when you are going quickly, and the second: Jason made me laugh with “you want it to cut out all the ****”.

Business Performance Management and Measurement

‘Dashboard’ probably covers a range of use cases depending on the context, but the use case that pushes the boundaries is as part of a sophisticated Business Performance Measurement (ed. Neely) initiative.  The Andy Neely book includes a very useful definition: Performance Management ‘should be equated with purposeful action taken today designed to produce meaningful results tomorrow.’  A good interactive dashboard should deliver on that objective.

The Balanced Scorecard

One well-established framework that might be the subject of a dashboard is the Balanced Scorecard, which combines financial and non-financial, internal and external, and leading and lagging metrics.  It could work top to bottom and span different functional areas.  Objectives and Key Results (OKRs) is another popular framework worth looking at to focus your scope.

Control Surface

Most implementations of ‘interactivity’ limit their ambitions to the user manipulating the data visualization to filter, drill through, slice and dice, personalize etc.  Though this is very important and we will deal with it more below, a more ambitious objective would be to use the dashboard to interact with the underlying business processes themselves.  Think of the ‘control surface’ of a modern commercial airliner where the instrumentation and the controls are a unified whole.

DATA REPRESENTATION

Types of Analysis

You will probably have come across categorization of analysis along the lines of:

  • Descriptive – what happened?
  • Diagnostic – why did it happen?
  • Predictive – what is going to happen?
  • Prescriptive – what should you do about it?

Until now, most of these capabilities would have had to be anticipated at design time and built into the functionality of the dashboard.  The advent of readily accessible AI tools offering non-deterministic processes has turned this on its head.  Generative AI tools can now describe your results in ways previously unimagined and, when used in combination with other tools, it is difficult to conceive of a limit to their capabilities (see ‘ChatBots’ below).

Data Representation

Chapter 6 of Andy Kirk’s great book Data Visualisation: A Handbook for Data Driven Design covers the building blocks of data representation and encoding.  If the data speaks for itself it doesn’t need encoding – sometimes a simple table is sufficient.  He describes ‘marks’ (points, lines and shapes) and ‘attributes’ (colours, size, symbols etc) that you can use to convey a more complicated message.

You then need to match the chart type to the type of analysis you want to perform.

  • Categorical
  • Hierarchical
  • Relational
  • Temporal
  • Spatial

This is really the engine room of any dashboard design and build: choosing the right visual for your purpose and configuring the data and formatting to optimise the value and impact of the analysis.
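
As a trivial illustration of that matching exercise, here is a minimal sketch in Python: it maps the five analysis types listed above to a sensible default visual.  The chart names are illustrative assumptions, not a definitive mapping – any real tool offers many more options per category.

```python
# Minimal sketch: a default visual for each type of analysis (illustrative choices only).
DEFAULT_VISUALS = {
    "categorical": "bar chart",           # compare discrete categories
    "hierarchical": "treemap",            # part-to-whole across levels
    "relational": "scatter plot",         # relationship between two measures
    "temporal": "line chart",             # change over time
    "spatial": "choropleth map",          # variation across geography
}

def suggest_visual(analysis_type: str) -> str:
    """Return a default chart type for the requested kind of analysis."""
    return DEFAULT_VISUALS.get(analysis_type.lower(), "simple table")

print(suggest_visual("temporal"))  # -> line chart
```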

If the ‘native’ visuals are insufficient, then there are third-party options.

Custom Visuals

If you still can’t find quite what you are looking for, then plug-in custom visual options such as Deneb give you more of a blank canvas to work in.  It’s a non-trivial activity, but you could ask ChatGPT for help.
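
Deneb takes a Vega or Vega-Lite JSON specification.  As a minimal sketch (in Python purely to build and print the JSON), the spec below draws a simple bar chart; the field names ‘Category’ and ‘Sales’ are hypothetical, and ‘dataset’ assumes Deneb’s usual name for the data passed in from Power BI – check the Deneb documentation for your version.

```python
import json

# Minimal Vega-Lite bar chart spec of the kind Deneb accepts.
# 'Category' and 'Sales' are placeholder field names; 'dataset' is assumed to be
# the name Deneb gives to the data passed in from the Power BI visual.
spec = {
    "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
    "data": {"name": "dataset"},
    "mark": "bar",
    "encoding": {
        "x": {"field": "Category", "type": "nominal"},
        "y": {"field": "Sales", "type": "quantitative", "aggregate": "sum"},
    },
}

print(json.dumps(spec, indent=2))  # paste the output into the Deneb spec editor
```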

Power BI

We focus on Microsoft and Power BI but other dashboarding tools are available, notably Tableau which is part of the Salesforce platform.  For a thorough analysis of the tool landscape look for the Gartner Analytics and BI Critical Capabilities report.

Power BI is assessed by Gartner as the leading Analytics and BI tool even though it does not come quite top of the pile in pure capability terms.  This is due to its integration as a central component in the Microsoft Power Platform and the newly announced Microsoft Fabric.  It also benefits from the sheer number and engagement of its users and the quality of its documentation, samples, blogs and roadmap.

The documentation takes you through basic dashboard concepts step by step.  There is no need to duplicate that here, other than to note that a wide range of features is available to make your dashboard ‘interactive’.

ChatBots

Ways of adding smart(ish), interactive elements into dashboards have been emerging over the last few years with various AI/ML elements such as ‘Q&A’ or ‘Smart Commentary’.  The explosion of generative AI has of course opened the floodgates on that.  This post from Microsoft Build 2023 talks about including AI ‘beside, inside and outside’ which is a way of looking at integrating AI interaction into your apps, your business processes and your dashboards.

Process Interactivity

Interaction means more than ‘slice and dice’.  It means interaction with the underlying business or control process.  One very simple example is a periodic re-forecasting process.  If you start the year with a budget split into 12 months against which you match a forecast, you can then visualize and analyse the variances and re-forecast to ensure your total forecast is up to date.  All of this could be done through the dashboard itself, either by integrating a Power Apps budgeting repository or, at the very least, by adding a hyperlink to the underlying system.  Click refresh and the whole system is up to date.  Zebra BI have some good ideas about how to intuitively visualize simple variances.
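
As a minimal sketch of that re-forecasting loop (plain Python with made-up numbers): actuals replace the forecast for the elapsed months, variances are reported against budget, and the remaining months carry the latest forecast so the full-year position is always current.

```python
# Minimal re-forecasting sketch with made-up monthly figures.
budget   = [100] * 12        # original budget, split evenly across 12 months
actuals  = [95, 102, 98]     # months reported so far
forecast = [100] * 12        # latest forecast, revised as the year progresses

# For elapsed months the forecast is overwritten by actuals and the variance reported.
for month, actual in enumerate(actuals):
    forecast[month] = actual
    variance = actual - budget[month]
    print(f"Month {month + 1}: actual {actual}, budget {budget[month]}, variance {variance:+}")

# Full-year position = actuals to date + forecast for the remaining months.
full_year = sum(forecast)
print(f"Full-year forecast {full_year} vs budget {sum(budget)} ({full_year - sum(budget):+})")
```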

Smart Conditional Formatting

Of course conditional formatting is a common way of highlighting variances and bringing immediate visual focus to the value exceeding tolerance.  Where the high level variance is a function of lower level variances you could also think about nesting this kind of flag: Total revenue may be within tolerance but price/quantity/product mix is outside.  You could use measures to highlight this with some kind of flag to show there is a variance hidden under the top level value, inviting the user to drill through.
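
The nesting idea is easy to express.  The sketch below (plain Python, illustrative figures and a hypothetical 5% tolerance) flags the case where the parent value is within tolerance but one of its children is not – exactly the situation where you would invite the user to drill through.

```python
# Nested variance flags: the total may be within tolerance while a child value is not.
TOLERANCE = 0.05  # hypothetical 5% tolerance band

def within_tolerance(actual: float, budget: float) -> bool:
    return abs(actual - budget) / budget <= TOLERANCE

# Illustrative figures: total revenue looks fine, but product mix hides offsetting misses.
total = {"actual": 1010.0, "budget": 1000.0}
children = {
    "Product A": {"actual": 700.0, "budget": 600.0},  # well over budget
    "Product B": {"actual": 310.0, "budget": 400.0},  # well under budget
}

top_ok = within_tolerance(total["actual"], total["budget"])
hidden_issue = any(not within_tolerance(c["actual"], c["budget"]) for c in children.values())

if top_ok and hidden_issue:
    print("Total within tolerance, but variances below - flag for drill-through")
```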

Data Entry

Interaction could also include data entry on the face of the report, perhaps to update a forecast in response to a newly reported variance.  You don’t want to have to open another system to execute this.  This is something that initial iterations of dashboarding software did not seem too concerned with, but options are much more open now, whether that is integrating Power Apps or Adaptive Cards (though the latter link hasn’t been updated for a while, so Microsoft may have got bored of the idea).

Alternatively, you could look at embedding your Power BI report into your app.

Data Refresh Strategy

Your data refresh strategy must be in sync with your process cadence.  Many dashboards are refreshed on a batch schedule, which may suit a formal daily/weekly/monthly cadence but will not suit more dynamic use cases, so you might want to look at DirectQuery.

History

Another thing you need to think about is historic data.  SaaS platforms such as Microsoft Dynamics 365 typically don’t have the snapshot capability you need to capture the point-in-time data required to analyse trends and comparisons.  Discussion of data warehouse/data lake solutions is out of scope for this conversation, but bear it in mind.
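
If the source system cannot snapshot for you, a very simple pattern is to capture the data yourself on a schedule.  The sketch below (pandas, with a hypothetical sales extract and local parquet files) stamps each extract with a snapshot date and appends it to a history folder that a dashboard could read for trend analysis.

```python
# Minimal snapshot pattern: stamp today's extract and append it to a history store.
from datetime import date
from pathlib import Path

import pandas as pd

def snapshot(df: pd.DataFrame, history_dir: str = "snapshots") -> Path:
    """Write today's copy of the data with a snapshot_date column added."""
    out_dir = Path(history_dir)
    out_dir.mkdir(exist_ok=True)
    stamped = df.assign(snapshot_date=date.today().isoformat())
    path = out_dir / f"sales_{date.today().isoformat()}.parquet"
    stamped.to_parquet(path, index=False)  # needs pyarrow or fastparquet installed
    return path

# Hypothetical extract pulled from the operational system.
sales = pd.DataFrame({"opportunity": ["A", "B"], "value": [1200, 800]})
print(snapshot(sales))
```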

Grain

You should also remember that the grain of your user input data may not be the same as that of your operational data.  High-level sales budgets may be captured at the revenue-per-month level, or perhaps the price x quantity level described above, whereas the actual data might be at the level of individual sales transactions.
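
A sketch of the grain problem (pandas, made-up data): actuals arrive as individual sales lines, so they have to be rolled up to the budget grain – month, in this example – before a variance comparison is meaningful.

```python
# Align grains before comparing: transaction-level actuals vs a monthly budget.
import pandas as pd

# Budget captured at month level (hypothetical figures).
budget = pd.DataFrame({"month": ["2024-01", "2024-02"], "budget": [1000, 1100]})

# Actuals captured at the level of individual sales.
actuals = pd.DataFrame({
    "month": ["2024-01", "2024-01", "2024-02"],
    "sale_value": [400, 550, 1150],
})

# Roll actuals up to the budget grain, then compare.
monthly_actuals = actuals.groupby("month", as_index=False)["sale_value"].sum()
comparison = budget.merge(monthly_actuals, on="month", how="left")
comparison["variance"] = comparison["sale_value"] - comparison["budget"]
print(comparison)
```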

Influencers

Guy in a Cube are probably the ‘gold standard’ of technology influencers.  Their videos explore every aspect of the Power BI tool from its native capabilities to its integration with the Power Platform and beyond.

BROADER CONTEXT

Design Thinking

This site is a treasure trove of UX and UI design, far too much to try to summarise here.  The example I always quote is the design principle of ‘affordances’, which is ‘possible actions that an actor can readily perceive’.  Your dashboard should make intuitive sense both on first view and in facilitating further interrogation.

Multi Media Communication

This book is packed with evidence-based recommendations on how to optimize your multimedia communication.  It is broader than just visual dashboards but allows you to place your work in that broader context.  Two concepts in particular are essential: one is to ‘reduce extraneous processing’, i.e. cut the visual clutter that overloads cognitive capacity; the second is to ‘maximise essential processing’, i.e. maximise the cognitive fit of the visual representations (see Adaptive Cognitive Fit below).

Accessibility

You should make accessibility a first-class citizen in your design thinking, both as a matter of law and of best practice.  These guidelines will help.

Storytelling

Written, spoken and visual ‘storytelling’ are core skills for business professionals.  Edward Tufte demonstrates that here, and Andy Kirk’s blog is a rich source of inspiration.

Adaptive Cognitive Fit

Cognitive Fit Theory is a long-established idea that suggests that solving problems is easier when the data ‘representation’ suits the task in hand.  Adaptive Cognitive Fit takes that several steps further and suggests that initial AI inspection of the information ‘facets’ or characteristics of the source data, combined with an understanding of the task in hand, could drive a dynamic response to the data representation.  In short, rather than the dashboard designer having pre-conceptions about which visualization was appropriate, it could be adjusted at run time.
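
A toy version of that run-time decision (pandas, with deliberately crude heuristics standing in for any AI inspection): look at the ‘facets’ of whatever data arrives – column types and counts – and only then pick the representation, rather than fixing it at design time.

```python
# Toy 'adaptive' representation choice: inspect the data first, decide the visual second.
import pandas as pd

def choose_representation(df: pd.DataFrame) -> str:
    """Crude heuristics standing in for an AI inspection of the data's facets."""
    has_time = any(pd.api.types.is_datetime64_any_dtype(df[c]) for c in df.columns)
    numeric = df.select_dtypes(include="number").columns
    categorical = df.select_dtypes(exclude="number").columns
    if has_time and len(numeric) >= 1:
        return "line chart"      # temporal data: show the trend
    if len(numeric) >= 2:
        return "scatter plot"    # two measures: show the relationship
    if len(categorical) >= 1 and len(numeric) == 1:
        return "bar chart"       # one measure by category
    return "table"               # let the data speak for itself

df = pd.DataFrame({"date": pd.to_datetime(["2024-01-01", "2024-02-01"]), "sales": [10, 12]})
print(choose_representation(df))  # -> line chart
```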

Style Guide

You may want to comply with a corporate look and feel, in which case you should look at themes.  There are even theme generators, or you could have a conversation with ChatGPT to see if it would generate you something useful.  If you want further inspiration you could look at high-quality sources like the Financial Times.  The most sophisticated source we have found, if you want to take this sort of thing very seriously, is the International Business Communication Standards.
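
Power BI report themes are just JSON files, so generating one is straightforward.  The sketch below writes a minimal theme with a placeholder corporate palette; the hex values are invented, and only a few of the many theme properties are shown – see the report theme documentation for the full schema.

```python
# Minimal Power BI report theme: placeholder corporate colours (hex values are invented).
import json

theme = {
    "name": "Corporate placeholder theme",
    "dataColors": ["#1F4E79", "#2E75B6", "#9DC3E6", "#C55A11", "#70AD47"],
    "background": "#FFFFFF",
    "foreground": "#252423",
    "tableAccent": "#1F4E79",
}

with open("corporate-theme.json", "w") as f:
    json.dump(theme, f, indent=2)  # import the file from the Themes gallery in Power BI Desktop

print(json.dumps(theme, indent=2))
```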

Business Glossary

If you are considering investment in any one phase of the SDLC then improving requirements engineering delivers a disproportionate benefit for the cost.  Beyond that we would say the same of implementing a simple Business Glossary solution, perhaps in SharePoint.  This simple book on Business Knowledge Blueprints will get you going but for a full-fat version you could look at  The Semantics of Business Vocabulary and Business Rules.  This repository should be the source for terminology and algorithm definition that the dashboard should comply with.  When business users want to understand a term or calculation used on the report, they should intuitively turn to their own business definitions.

X-Ops

There are three flavours of X-Ops that may have a bearing on your overall analysis strategy in its broadest context:

DevOps.  Obviously the overall approach to development of mission critical analysis capability should be consistent with the organisation’s general SDLC approach to ensure that value, risk management, security, source control, change control etc are managed appropriately.

DataOps is a lifecycle approach to data and analytics and could inform your long-term strategy for moving up the maturity curve.

Cloud costs are overtaking on-premises costs in most organisations.  FinOps is a response to this changing paradigm and may be the subject of its own dashboard.

The Value of Data

It may seem self-evident that the data you are dashboarding is valuable but it pays to make that an explicit part of the business case.  Doug Laney’s Infonomics is where we always point people to ‘monetize, manage and measure’ your data but his more recent Data Juice provides a host of real use cases to consider. Bill Schmarzo is another source.

The Value of Measurement

Similarly, the ‘value of measurement’ might appear to be self-evident given its ubiquity, but once again it pays to understand the intended and unintended consequences of measurement.  Typically, measurement cultures can be seen as ‘directive’, ‘enabling’ or a combination of the two.  A few books might help you navigate the boundary cases.

You have to balance the benefits of ‘gamification’ with the tendency of people to ‘game’ any incentive framework.

ANALYSIS CAPABILITIES

Process Mining

Process mining is a method of analysing the system logs generated by business processes for process improvement opportunities.  It is a discipline championed by one of our favourite industry experts Wil van der Aalst and is part of the functionality of the Power Platform.
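
The raw material is simply an event log: a case id, an activity and a timestamp.  The sketch below (pandas, made-up log) counts the directly-follows transitions between activities, which is the simplest building block of process discovery; dedicated tools such as the Power Platform’s process mining go much further.

```python
# Simplest process-mining building block: directly-follows transitions in an event log.
import pandas as pd

log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2, 2, 2],
    "activity": ["Create PO", "Approve PO", "Pay Invoice",
                 "Create PO", "Approve PO", "Change PO", "Pay Invoice"],
    "timestamp": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-05",
                                 "2024-01-03", "2024-01-04", "2024-01-06", "2024-01-09"]),
})

log = log.sort_values(["case_id", "timestamp"])
log["next_activity"] = log.groupby("case_id")["activity"].shift(-1)
dfg = (log.dropna(subset=["next_activity"])
          .groupby(["activity", "next_activity"]).size()
          .reset_index(name="count"))
print(dfg)  # directly-follows graph edges with frequencies
```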

Causal Inference

If you were very ambitious you could explore visualizing the concept of ‘causality’, so why not make that explicit with causal inference?  Read Judea Pearl and look for examples relevant to your domain on the web.
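
To ground the idea, the classic starting point is the back-door adjustment, where P(Y | do(X)) is computed as the sum over z of P(Y | X, Z=z) x P(Z=z).  The sketch below (pandas, fabricated binary data and a single assumed confounder Z) computes that adjusted estimate rather than the naive conditional mean; libraries such as DoWhy wrap this up properly.

```python
# Back-door adjustment sketch: estimate E[Y | do(X=x)] controlling for a confounder Z.
import pandas as pd

# Fabricated binary data: Z confounds both the treatment X and the outcome Y.
df = pd.DataFrame({
    "Z": [0, 0, 0, 0, 1, 1, 1, 1],
    "X": [0, 0, 1, 1, 0, 1, 1, 1],
    "Y": [0, 1, 1, 1, 0, 0, 1, 1],
})

def adjusted_mean(data: pd.DataFrame, x_value: int) -> float:
    """E[Y | do(X=x)] = sum_z E[Y | X=x, Z=z] * P(Z=z)  (back-door formula)."""
    total = 0.0
    for z_value, p_z in data["Z"].value_counts(normalize=True).items():
        stratum = data[(data["X"] == x_value) & (data["Z"] == z_value)]
        total += stratum["Y"].mean() * p_z
    return total

effect = adjusted_mean(df, 1) - adjusted_mean(df, 0)
print(f"Adjusted effect of X on Y: {effect:.2f}")
```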

System Dynamics

While we are exploring these distinct sophisticated capabilities let me also throw in the potential of system dynamics for simulated forecasts of emergent, non-linear behaviour.  It’s a big topic in its own right but we use Stella Architect.
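
A tiny flavour of what a stock-and-flow model looks like in code (plain Python, invented parameters): a single ‘customers’ stock with a word-of-mouth adoption flow produces the characteristic non-linear S-curve that a straight-line forecast would miss.  Tools like Stella Architect let you build far richer versions of this graphically.

```python
# Minimal system-dynamics sketch: one stock ('customers') with a word-of-mouth adoption flow.
MARKET_SIZE = 10_000   # total potential customers (invented)
CONTACT_RATE = 0.0001  # adoptions per customer per potential customer per month (invented)
MONTHS = 36
DT = 1                 # time step in months (simple Euler integration)

customers = 100.0      # starting stock
history = []
for month in range(MONTHS):
    adoption = CONTACT_RATE * customers * (MARKET_SIZE - customers)  # flow into the stock
    customers += adoption * DT
    history.append(round(customers))

print(history)  # S-shaped growth: slow start, rapid middle, saturation near MARKET_SIZE
```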

Lineage and Provenance

Data Lineage is a tantalizing concept – the ability to traverse forwards and backwards through the data transformation process to understand the origin of the data, the effect of any transformation, and the impact and occurrence of the result.  It aims to answer questions like “Where did this data come from?” and “How has it been modified?”.  We first saw this proof of concept from Trifacta and then came across this from Collibra.  However, this capability from Power BI, announced a couple of years ago, has never developed to the ‘column level’ at which it might start to be useful.

Data Provenance is a related concept that answers questions like: “Who created this data?” and “Has the data been modified or tampered with?”  Power BI offers this but we have not yet had an excuse to explore it.
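
At its simplest, column-level lineage is just a graph you can walk.  The sketch below (plain Python, hypothetical column names) records which upstream columns each derived column was built from and traces any column back to its raw sources – the “Where did this data come from?” question.

```python
# Toy column-level lineage: each derived column records the upstream columns it was built from.
LINEAGE = {
    "report.margin_pct": ["warehouse.revenue", "warehouse.cost"],
    "warehouse.revenue": ["crm.opportunity_value"],
    "warehouse.cost":    ["erp.invoice_line_cost"],
}

def trace_sources(column: str) -> set:
    """Walk the lineage graph back to columns with no recorded upstream (the raw sources)."""
    upstream = LINEAGE.get(column)
    if not upstream:
        return {column}
    sources = set()
    for parent in upstream:
        sources |= trace_sources(parent)
    return sources

print(trace_sources("report.margin_pct"))
# -> {'crm.opportunity_value', 'erp.invoice_line_cost'}
```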

THE FUTURE

Dashboards on Demand?

To finish, all of the above presupposes you are going to go through an extensive and deliberate effort to implement dashboards as part of a corporate performance management process.  However, in a recent demonstration the wind was taken out of my sails by a very uncompromising salesperson who said ‘I just want to put raw data in one end, have the system visualize it, and ask questions of it.’  AI tools may make that scenario perfectly feasible.
