Evaluation of different landing pages on behavioural engagement with the CARA dashboard: A user research protocol

Abstract

Background

The CARA project set out to develop a data-visualisation platform that helps general practitioners develop a deeper understanding of their patient population, disease management and prescribing through dashboards. To support the continued use and sustainability of the CARA dashboards, dashboard performance and user engagement have to be optimised. User research places people at the centre of the design process and aims to evaluate the needs, behaviours and attitudes of users to inform the design, development and impact of a product.

Objective

To explore how different initial key messages impact the level of behavioural engagement with a CARA dashboard.

Methods

Participating general practices can upload their practice data for analysis and visualisation in CARA dashboards. Practices will be randomised to one of three different initial landing pages: the full dashboard, or one of two key messages followed by continuation to the full dashboard: a between-practice comparison (the practice’s prescribing compared with the average of all other practices) or a within-practice comparison (compared with the practice’s data for the same month of the previous year). Analysis will determine which of the three landing pages best encourages user interaction, measured as the number of ‘clicks’, ‘viewings’ and ‘sessions’. Dashboard usage data will be collected through Google Analytics.

Discussion

This study will provide evidence of behavioural engagement and its metrics during the implementation of the CARA dashboards to optimise and sustain interaction.

Trial registration

ISRCTN32783644 (Registration date: 02/01/2024).

Background

Most general practices have computerised patient management systems (PMS). However, few PMS provide options to perform data reporting and exploration, audit and feedback, in-depth analysis of aggregated patient data, or comparison and benchmarking with other general practices [1, 2]. The use of dashboards and data visualisations is growing in the health sector, particularly since the pandemic, when numerous exemplars were designed to allow fast and easy comparisons [3, 4]. However, these visual analytics processes require access to and integration of data [5], and challenges remain because data are recorded in different, often incompatible systems (data silos) for which project-specific, bespoke dashboards are designed [4, 6].

Dashboards are generally launched for a single purpose, such as research, exploration, analysis or business use. On the whole, dashboards are designed ‘for’ and not ‘with’ users and allow limited, if any, user interaction or engagement [3, 7, 8]. Once set up, dashboards require continuous input from the developers to maintain their relevance [3], and this input must be tailored to the end users’ needs. Developers should evaluate dashboard performance iteratively during development and implementation to promote uptake and use of the dashboard [9]. Researchers and developers have to maintain the effectiveness and relevance of the dashboard in terms of both health outcomes and user engagement.

User engagement is a complex concept that incorporates emotional, cognitive and behavioural dimensions [10]. Emotional engagement entails the processing of visual, tactile, auditory and interactive cues by the user’s attentional faculties. Cognitive engagement pertains to the problem-solving and sensory aspects of the experience, encompassing enjoyment, boredom, curiosity, stimulation of imagination and evoked interest. Behavioural engagement is linked to interactivity and is a key element in enhancing the sensory dimension of the experience [10]. It revolves around the user’s interactions with the product and their responses to its feedback, and includes the frequency and extent of user interactions as well as the time needed to accomplish tasks or engage with content. Interaction is a tangible representation of a user’s behavioural engagement with the product, signifying active user participation. The metrics employed typically involve tracking the number of clicks, pages visited, video durations watched and views of specific content pages [10].

User research combines qualitative and quantitative research methods to evaluate the needs, behaviours and attitudes of users to inform the design, development and impact of a product [11]. For the evaluation of a data dashboard, a process evaluation involves understanding how users interact with the dashboard (i.e. clicks, views and duration of interaction), which helps inform the dashboard’s adaptation to maximise user exposure [12]. Monitoring web traffic through Google Analytics provides quantitative data on dashboard usage [12, 13] and allows user perceptions or behaviour to be measured indirectly [12, 14,15,16]. These web traffic data also allow developers to estimate and infer the level of behavioural engagement with website platforms. Overall, strong behavioural engagement combines several Google Analytics indicators, such as a high number of returning users, a low bounce rate, a high number of pages viewed per session, a long mean session duration and a high goal conversion rate [12, 16, 17].
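To make these indicators concrete, the sketch below shows how such engagement metrics could be retrieved programmatically from a GA4 property using the Google Analytics Data API (google-analytics-data Python package). This is a minimal illustration only; the property ID, credentials setup and choice of metrics are assumptions for this example and are not specified in the protocol.

```python
# Minimal sketch: pull behavioural-engagement indicators from a GA4 property.
# Assumes the google-analytics-data package is installed and that
# GOOGLE_APPLICATION_CREDENTIALS points to a service-account key with access
# to the (hypothetical) property below.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # hypothetical GA4 property for one dashboard site


def engagement_report(property_id: str) -> dict:
    """Return engagement indicators for the last 30 days, keyed by date."""
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{property_id}",
        dimensions=[Dimension(name="date")],
        metrics=[
            Metric(name="sessions"),
            Metric(name="activeUsers"),
            Metric(name="screenPageViews"),
            Metric(name="averageSessionDuration"),
            Metric(name="bounceRate"),
        ],
        date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
    )
    response = client.run_report(request)
    return {
        row.dimension_values[0].value: [m.value for m in row.metric_values]
        for row in response.rows
    }


if __name__ == "__main__":
    print(engagement_report(PROPERTY_ID))
```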

The CARA project is designed as a sustainable data-sharing platform that can be applied across different PMS and helps general practitioners develop a deeper understanding of their patient population, disease management and prescribing through dashboards [18, 19]. The CARA infrastructure consists of a common data model (to combine data from different PMS), CARAconnect to upload data and CARA dashboards to visualise data. The first exemplar dashboard focuses on antibiotic prescribing and includes automated audit reports, filters (within practice) and between-practice comparisons [18, 19]. Antibiotics were identified by the Anatomical Therapeutic Chemical (ATC) code J01 and categorised into green (preferred – antibiotics are more effective, have fewer side effects and are less likely to lead to resistant infections) and red (non-preferred – antibiotics should not be used in primary care unless absolutely necessary) antibiotics according to Irish national guidelines [20, 21].
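As a minimal illustration of how such a green/red categorisation could be applied to prescribing records, the sketch below classifies ATC codes against configurable lists. The specific codes shown are hypothetical placeholders, not the Irish guideline lists [20, 21].

```python
# Illustrative sketch of classifying J01 prescriptions as green or red.
# The code sets below are placeholders, NOT the Irish guideline lists.
GREEN_ATC = {"J01CA04", "J01AA02"}   # placeholder "preferred" codes
RED_ATC = {"J01CR02", "J01MA02"}     # placeholder "non-preferred" codes


def classify_antibiotic(atc_code: str) -> str:
    """Label a prescription by its ATC code: green, red or unclassified."""
    if not atc_code.startswith("J01"):
        return "not an antibacterial (outside J01)"
    if atc_code in GREEN_ATC:
        return "green"
    if atc_code in RED_ATC:
        return "red"
    return "unclassified"


if __name__ == "__main__":
    for code in ["J01CA04", "J01CR02", "N02BE01"]:
        print(code, "->", classify_antibiotic(code))
```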

To design the CARA dashboard, action design research (ADR) was used; ADR generates knowledge about how to solve organisational problems in practice and includes four steps: (a) problem formulation, (b) building, intervention and evaluation, (c) reflection and learning and (d) formalisation of learning [22, 23]. During the development and testing of the CARA dashboard, it became clear that dashboard performance (i.e. behavioural engagement and interaction) is important for the continued use and sustainability of the dashboard. Hence, it is essential to implement user research to quantify how users (practices) interact with the dashboards and how user behaviour can be influenced. Previous research showed that behavioural interventions based on peer comparison can have a marked effect on the prescribing decisions of health professionals [24,25,26]. This study will implement user research to explore how different initial key messages impact the level of behavioural engagement with the CARA dashboard.

Method

Study design

General practices will be randomised to one of three different initial landing pages (groups): the full dashboard or one of two key messages. All practices have access to the same dashboards and view identical information, but two groups will be shown a key message before they can continue to the full CARA dashboard [27]. The key messages are similar and each focuses on one of the charts from the existing dashboard (Figs. 1 and 2); a sketch of the two comparisons follows the figure captions below.

Fig. 1 Details of the three different initial landing pages (groups)

Fig. 2 Overview of the allocation of practices to group A, B or C
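As an illustration of the two key messages, the sketch below derives a between-practice comparison (one practice against the average of all other practices) and a within-practice comparison (against the same month of the previous year) from monthly prescribing counts. This is a hypothetical example, not the CARA implementation; the column names and input structure are assumptions.

```python
# Minimal sketch (not the CARA implementation) of the two key-message comparisons.
# Assumes a pandas DataFrame with one row per practice per month:
#   practice_id, month (YYYY-MM), antibiotic_prescriptions
import pandas as pd


def between_practice_comparison(df: pd.DataFrame, practice_id: str, month: str) -> dict:
    """Compare one practice's prescribing with the average of all other practices."""
    current = df[df["month"] == month]
    own = current.loc[current["practice_id"] == practice_id, "antibiotic_prescriptions"].iloc[0]
    others = current.loc[current["practice_id"] != practice_id, "antibiotic_prescriptions"].mean()
    return {"practice": own, "average_of_other_practices": others}


def within_practice_comparison(df: pd.DataFrame, practice_id: str, month: str) -> dict:
    """Compare a practice's prescribing with the same month of the previous year."""
    prev_year_month = f"{int(month[:4]) - 1}{month[4:]}"   # e.g. '2024-03' -> '2023-03'
    own = df[df["practice_id"] == practice_id].set_index("month")["antibiotic_prescriptions"]
    return {"current_month": own.get(month), "same_month_previous_year": own.get(prev_year_month)}
```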

Participants

General practices that register with the CARA network for access to the dashboards will be randomised (at practice level) using a computer-generated system, which will allocate each practice to group A, B or C using 1:1:1 randomisation (Fig. 1). Practices are not allocated to different dashboards but to different initial landing pages, from which they can access the full dashboard.
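As an illustration of computer-generated 1:1:1 allocation, the sketch below assigns practices to groups A, B and C using permuted blocks of three. The block size and random seed are assumptions for this example; the protocol does not specify the randomisation algorithm.

```python
# Illustrative sketch of 1:1:1 allocation using permuted blocks of three.
# Block size and seed are illustrative assumptions, not protocol specifications.
import random


def allocate(practice_ids: list[str], seed: int = 42) -> dict[str, str]:
    """Assign each practice to landing-page group A, B or C in a 1:1:1 ratio."""
    rng = random.Random(seed)
    allocation: dict[str, str] = {}
    for block_start in range(0, len(practice_ids), 3):
        block_groups = ["A", "B", "C"]
        rng.shuffle(block_groups)                     # permute groups within each block
        block = practice_ids[block_start:block_start + 3]
        allocation.update(dict(zip(block, block_groups)))
    return allocation


if __name__ == "__main__":
    print(allocate([f"practice_{i:02d}" for i in range(1, 10)]))
```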

Recruitment

General practices will be invited to join the CARA network [18] through social media and direct mailing through the Irish College of General Practitioners. Registration with the CARA network [18] is open to all general practices. Once registered and after confirming the terms and conditions, the general practice receives a link to CARAconnect (to de-identify, extract and upload data) and can view their practice data in the CARA dashboards.

Outcomes

The primary outcome is behavioural engagement with the CARA dashboard (measured as the number of ‘clicks’, ‘viewings’ and ‘sessions’) after one month of dashboard usage, collected through Google Analytics (Table 1).

Table 1 Definitions of behavioural engagement outcomes (from Google Analytics [28])

Sample size

The sample size is calculated based on the conversion event (the number of clicks interacting with the antibiotic dashboard page, see Table 1) using a comparative usability test to determine differences between groups A, B and C (between-group design) [29, 30]. A comparative usability test is a method for describing and comparing the usability of more than one application [29, 30]. The variance estimate is taken from a usability test of a similar experiment, which reported the mean number of mouse clicks for dashboard usage (mean = 23.6, standard deviation = 14.8) [31]. A total sample size of 87 users or practices (29 per group) is estimated to be sufficient to observe a 10% difference [7, 32] at the 5% significance level [29, 30].

Data collection

This study will run for a limited time; dashboard usage data will be collected for each practice for one month from the time of enrolment. All practices registering with the CARA network will be included in the randomisation. Dashboard usage data will be collected through Google Analytics, which is an integrated part of CARA. Google Analytics provides anonymous data on behavioural engagement with a page, irrespective of who is engaging. Therefore, three identical websites (dashboards) will be set up, two of which will contain an additional landing page with a key message. At no stage will IP addresses be recorded or tracked. Table 1 shows the behavioural engagement data that will be collected.

Data analysis

The quantitative information collected will be summarised (descriptive characteristics table) using means and standard deviations or, if not normally distributed, medians and interquartile ranges. Categorical variables will be summarised with counts and proportions. Depending on the data distribution, analyses will use linear regression for continuous measures or generalised estimating equations for binary and count data. The N − 1 two-proportion test (A/B testing) will be used (for small sample sizes) to compare differences in proportions [33]. All statistical tests will be two-tailed and significance will be set at a p-value of 0.05. Analyses will be performed using R version 4.0.3.
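For illustration, the N − 1 two-proportion test described by Sauro and Lewis [33] adjusts the usual pooled two-proportion z statistic by a factor of √((N − 1)/N), where N is the combined sample size, which improves its accuracy for small samples. The sketch below is a minimal implementation of that formula; the example counts are hypothetical, not study data.

```python
# Minimal sketch of the N-1 two-proportion test (Sauro & Lewis).
# Example counts are hypothetical, not study data.
import math
from statistics import NormalDist


def n_minus_1_two_proportion_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Return (z, two-tailed p) comparing proportions x1/n1 and x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                       # pooled proportion
    n_total = n1 + n2
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) * math.sqrt((n_total - 1) / n_total) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))         # two-tailed p-value
    return z, p_value


if __name__ == "__main__":
    # e.g. 20 of 29 practices in group A vs 12 of 29 in group B clicked through
    print(n_minus_1_two_proportion_test(20, 29, 12, 29))
```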

Discussion

This paper outlines the protocol to explore how different initial key messages impact the level of behavioural engagement with the CARA dashboard. User research can determine a dashboard’s overall performance, ensures rigorous design and implementation, and assesses usefulness, operability, ease of use, satisfaction, the user interface, content and system capabilities [11, 34].

Although similar prescribing dashboards that include user involvement have been developed in Canada and the United Kingdom, no user involvement process or user research evaluation was described [35, 36]. User research is a process that should continue after the dashboard is launched and includes a process evaluation in a real-world setting [11, 30, 37]. However, process evaluations are still rarely used despite the proliferation of dashboards [4, 6]. For instance, Fazaeli S et al. designed and implemented a COVID-19 management dashboard, used usability testing to examine users’ satisfaction and incorporated feedback into the dashboard design [36]. However, they did not evaluate the complete user research process [10].

In general, little evidence is available on the evaluation of dashboard engagement or on the analysis of dashboard usage data [7, 8, 38]. User engagement is not static; it is a prolonged process that varies over time, and its measurement reflects the degree of involvement a user has with the dashboard or system [39, 40]. User engagement measures quantify usability, but a recent systematic review identified only one study that recorded dashboard engagement [7]. It was unclear whether any of the other existing dashboards developed for use in primary care and included in the review went through a usability evaluation process [7].

However, the application of behavioural engagement and its metrics during the development and implementation of dashboards needs to be explored to understand how users interact with the dashboard, how they make sense of the data presented, how effective and meaningful the dashboard design is and what benefit is gained from using the dashboard and its data [41]. Furthermore, engaging with dashboard users reveals challenges that may be addressed through dashboard design and implementation. For instance, peer performance visualisation is often used to promote dashboard engagement and motivation [42].

The CARA infrastructure will be implemented and tested in an incremental number of general practices. To support the continued use and sustainability of the CARA dashboards, their performance and engagement will be monitored so that updates and improvements can be implemented regularly. This study will improve our understanding of how users interact with the CARA dashboard and how different initial key messages impact behavioural engagement with it. Building on this research, the inclusion of key performance indicators for evaluating, measuring and improving prescribing will be a follow-on study for CARA.

Data availability

No datasets were generated or analysed during the current study.

Abbreviations

ADR: Action Design Research

CARA: Collaborate, Analyse, Research, Audit

PMS: Patient Management System

References

  1. Collins C, Homeniuk R. How many general practice consultations occur in Ireland annually? Cross-sectional data from a survey of general practices. BMC Fam Pract. 2021;22(1):40.

  2. Health Service Executive. Terms of agreement between the Department of Health, the HSE and the IMO regarding GP contractual reform and Service Development; 2019.

  3. Schulze A, Brand F, Geppert J, Bol GF. Digital dashboards visualizing public health data: a systematic review. Front Public Health. 2023;11:999958.

  4. Gardner L, Ratcliff J, Dong E, Katz A. A need for open public data standards and sharing in light of COVID-19. Lancet Infect Dis. 2021;21(4):e80.

  5. Gotz D, Borland D. Data-Driven Healthcare: challenges and opportunities for interactive visualization. IEEE Comput Graph Appl. 2016;36(3):90–6.

  6. Dagliati A, Malovini A, Tibollo V, Bellazzi R. Health informatics and EHR to support clinical research in the COVID-19 pandemic: an overview. Brief Bioinform. 2021;22(2):812–22.

  7. Garzon-Orjuela N, Parveen S, Amin D, Vornhagen H, Blake C, Vellinga A. The effectiveness of interactive dashboards to Optimise Antibiotic Prescribing in Primary Care: a systematic review. Antibiot (Basel) 2023, 12(1).

  8. Zhuang M, Concannon D, Manley E. A Framework for evaluating dashboards in Healthcare. IEEE Trans Vis Comput Graph. 2022;28(4):1715–31.

  9. Helminski D, Kurlander JE, Renji AD, Sussman JB, Pfeiffer PN, Conte ML, Gadabu OJ, Kokaly AN, Goldberg R, Ranusch A, et al. Dashboards in Health Care settings: protocol for a scoping review. JMIR Res Protoc. 2022;11(3):e34894.

  10. Oyedele Y, Greunen D, Veldsman A. UX Engagement and Interaction. In: 2018 7th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO); 29–31 Aug 2018: 792–798.

  11. Sauro J, Lewis JR. Chap. 2 - Quantifying user research. In: Quantifying the User Experience (Second Edition) edn. Edited by Sauro J, Lewis JR. Boston: Morgan Kaufmann; 2016: 9–18.

  12. Song MJ, Ward J, Choi F, Nikoo M, Frank A, Shams F, Tabi K, Vigo D, Krausz M. A process evaluation of a web-based Mental Health Portal (WalkAlong) using Google Analytics. JMIR Ment Health. 2018;5(3):e50.

  13. Plaza B. Google Analytics for measuring website performance. Tour Manag. 2011;32(3):477–81.

  14. Ogrodniczuk JS, Beharry J, Oliffe JL. An evaluation of 5-Year web analytics for HeadsUpGuys: a men’s Depression E-Mental Health Resource. Am J Mens Health. 2021;15(6):15579883211063322.

  15. Crutzen R, Roosjen JL, Poelman J. Using Google Analytics as a process evaluation method for internet-delivered interventions: an example on sexual health. Health Promot Int. 2013;28(1):36–42.

  16. Clark DJ, Nicholas D, Jamali HR. Evaluating information seeking and use in the changing virtual world: the emerging role of Google Analytics. Learn Publish. 2014;27(3):185–94.

  17. Vona P, Wilmoth P, Jaycox LH, McMillen JS, Kataoka SH, Wong M, DeRosier ME, Langley AK, Kaufman J, Tang L, Stein BD. A web-based platform to support an evidence-based Mental Health intervention: lessons from the CBITS web site. Psychiatric Serv. 2014;65(11):1381–4.

  18. CARA NETWORK. [https://caranetwork.ie].

  19. Garzon-Orjuela N, Garcia Pereira A, Vornhagen V, Stasiewicz K, Parveen S, Amin D, Porwol L, Vellinga A. Design of infrastructure for visualising and benchmarking patient data from general practice (CARA). Eur J Public Health. 2023;33(Suppl 2):ckad160.1210. https://doi.org/10.1093/eurpub/ckad160.1210.

  20. O’Connor N, Breen R, Carton M, Mc Grath I, Deasy N, Collins C, Vellinga A. Improving the quality of antibiotic prescribing through an educational intervention delivered through the out-of-hours general practice service in Ireland. Eur J Gen Pract. 2020;26(1):119–25.

  21. Government of Ireland. Ireland’s Second One Health National Action Plan on Antimicrobial Resistance 2021–2025; 2021.

  22. Sein MK, Henfridsson O, Purao S, Rossi M, Lindgren R. Action Design Research. MIS Q. 2011;35(1):37–56.

  23. Garzon-Orjuela N, Vornhagen H, Stasiewicz K, Garcia Pereira A, Vellinga A. Interactive dashboard development for Irish general practices to visualise and compare patient data. Eur J Public Health. 2023;33(Suppl 2):ckad160.575. https://doi.org/10.1093/eurpub/ckad160.575.

  24. Meeker D, Linder JA, Fox CR, Friedberg MW, Persell SD, Goldstein NJ, Knight TK, Hay JW, Doctor JN. Effect of behavioral interventions on Inappropriate Antibiotic Prescribing among Primary Care practices: a Randomized Clinical Trial. JAMA. 2016;315(6):562–70.

  25. Zeng Y, Shi L, Liu C, Li W, Li J, Yang S, Yang X, Huang Q, Yang L. Effects of social norm feedback on antibiotic prescribing and its characteristics in behaviour change techniques: a mixed-methods systematic review. Lancet Infect Dis. 2023;23(5):e175–84.

  26. Talat U, Schmidtke KA, Khanal S, Chan A, Turner A, Horne R, Chadborn T, Gold N, Sallis A, Vlaev I. A systematic review of Nudge interventions to optimize medication prescribing. Front Pharmacol. 2022;13:798916.

  27. CARA dashboard v1.0. [http://vmcara-vm2.datascienceinstitute.ie/dashboard/app].

  28. Dimensions and metrics. [GA4] Understand user metrics [https://support.google.com/analytics/answer/12253918?hl=en&ref_topic=11151952&sjid=17122983140791273295-EU]

  29. Sauro J, Lewis JR. Chap. 6 - What sample sizes do we need? Part 1: summative studies. In: Quantifying the User Experience (Second Edition) edn. Edited by Sauro J, Lewis JR. Boston: Morgan Kaufmann; 2016: 103–141.

  30. Lewis JR. Usability Testing. In: Handbook of Human Factors and Ergonomics edn.; 2006: 1275–1316.

  31. Rothfeld R. Advancing Web-based Dashboards: Providing Contextualised Comparisons in an Air Traffic Discovery Dashboard. 2015.

  32. Curtis HJ, Bacon S, Croker R, Walker AJ, Perera R, Hallsworth M, Harper H, Mahtani KR, Heneghan C, Goldacre B. Evaluating the impact of a very low-cost intervention to increase practices’ engagement with data and change prescribing behaviour: a randomized trial in English primary care. Fam Pract. 2021;38(4):373–80.

  33. Sauro J, Lewis JR. Chap. 5 - Is there a statistical difference between designs? In: Quantifying the User Experience (Second Edition) edn. Edited by Sauro J, Lewis JR. Boston: Morgan Kaufmann; 2016: 61–102.

  34. Almasi S, Bahaadinbeigy K, Ahmadi H, Sohrabei S, Rabiei R. Usability Evaluation of Dashboards: A Systematic Literature Review of Tools. Biomed Res Int 2023, 2023:9990933.

  35. MyPractice: Primary Care practice reports [https://www.hqontario.ca/QualityImprovement/Practice-Reports/Primary-Care].

  36. Building Rapid Interventions to reduce antibiotic resisTance (BRIT) [https://www.britanalytics.uk/].

  37. Few S. Information Dashboard Design: The Effective Visual Communication of Data / S. Few. 2006.

  38. Koneska E, Appelbe D, Williamson PR, Dodd S. Usage Metrics of web-based interventions evaluated in Randomized controlled trials: systematic review. J Med Internet Res. 2020;22(4):e15474.

  39. Bickmore T, Schulman D, Yin L. Maintaining Engagement in Long-Term interventions with Relational agents. Appl Artif Intell. 2010;24(6):648–66.

  40. Taki S, Lymer S, Russell CG, Campbell K, Laws R, Ong KL, Elliott R, Denney-Wilson E. Assessing user Engagement of an mHealth intervention: development and implementation of the growing healthy app Engagement Index. JMIR Mhealth Uhealth. 2017;5(6):e89.

  41. Hoang P. Measuring user Engagement from interactive media. The University of Bergen; 2022.

  42. Sarikaya A, Correll M, Bartram L, Tory M, Fisher D. What do we talk about when we talk about dashboards? IEEE Trans Vis Comput Graph. 2019;25(1):682–92.

Acknowledgements

Not applicable.

Funding

This work was funded by grant number RL-20200-03 from the Research Leader Awards (RL) 2020, Health Research Board, Ireland. Nathaly Garzón-Orjuela was funded by this grant and is a SPHeRE scholar.

Author information

Authors and Affiliations

Authors

Contributions

N.G.O. and A.V. conceived and designed the study; N.G.O. drafted the protocol; A.V., H.V. and C.B. made substantial contributions to the study protocol; A.V., H.V. and C.B. critically revised drafts. All authors participated in the protocol revision and approved the final version.

Corresponding author

Correspondence to Nathaly Garzón-Orjuela.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the UCD Research Ethics Committee (LS-LR-23-97-Orjuela-Vellinga). Furthermore, the CARA project previously received ICGP REC approval (ICGP_REC_21_0018) and amendment (October 2022). The use of Google Analytics is standard for websites. There is no practice consent form; however, the GP agreement of the CARA project includes the use of Google Analytics (https://caranetwork.ie/wp-content/uploads/2023/10/GP-agreement-CARA.pdf).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Garzón-Orjuela, N., Vornhagen, H., Blake, C. et al. Evaluation of different landing pages on behavioural engagement with the CARA dashboard: A user research protocol. BMC Prim. Care 25, 174 (2024). https://doi.org/10.1186/s12875-024-02420-6
