Measuring Success

Producing digital catalogues is a time- and resource-intensive activity. Through this study, we have generated a wealth of data suggesting that digital catalogues are a worthwhile endeavor for museums. These publications reach thousands of visitors, are respected for their scholarly merit, and offer users significant advantages over print publications in the amount of information they make available and the tools they provide for navigating and working with that information. This study has also helped our team identify some of the metrics that work best for measuring the success of digital catalogues. We found that certain metrics of success could be gathered with relative ease using web analytics or close-ended questions on pop-up surveys. In other cases, the rich qualitative data gathered through focus group conversations and open-ended survey questions yielded deeper insights, though these required more effort to digest. Conducting this study also highlighted the ways that the definition of success might vary depending on contextual factors and each museum’s goals for its digital publications.

Measuring Reach and Engagement

With careful interpretation, analytics data can provide many quantitative measures of success for digital catalogues, such as the number of visitors a catalogue receives and the depth of their engagement with it. Even more important is discoverability, since a digital catalogue that does not appear in web searches will not attract new visitors. In almost all cases, analytics metrics mean little on their own but take on greater significance when compared against other data, such as the following (a short analysis sketch follows the list):

  • A catalogue’s performance in a previous time period (e.g., How did the catalogue do in the first month of its launch, versus after a recent marketing push?)

  • The performance of a comparable catalogue or resource on the museum’s website (e.g., Is the catalogue receiving similar traffic to one featuring a very well-known artist? How does traffic to the museum’s collection pages compare to traffic to the catalogue?)

  • The reach of a similar print publication (e.g., How many copies did a comparable print publication sell in the first year, and how does this compare to the number of engaged visitors the digital catalogue received in that time period?)
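
As an illustration, the sketch below (in Python, using pandas) shows how such comparisons might be computed from a daily session export. The file name, column names, resource labels, and date ranges are all hypothetical placeholders, not figures from the study:

```python
import pandas as pd

# Hypothetical CSV export of daily session counts (e.g., from a web
# analytics tool), with columns: date, resource, sessions
df = pd.read_csv("sessions_by_day.csv", parse_dates=["date"])
catalogue = df[df["resource"] == "catalogue_a"]

# Comparison 1: launch month vs. the month after a marketing push.
launch = catalogue[catalogue["date"].between("2023-01-01", "2023-01-31")]["sessions"].sum()
push = catalogue[catalogue["date"].between("2023-06-01", "2023-06-30")]["sessions"].sum()
print(f"Launch month: {launch} sessions; post-push month: {push} "
      f"({(push - launch) / launch:+.0%})")

# Comparison 2: the catalogue vs. the museum's collection pages
# over the same year.
year = df[df["date"].between("2023-01-01", "2023-12-31")]
totals = year.groupby("resource")["sessions"].sum()
print(f"Catalogue traffic was {totals['catalogue_a'] / totals['collection_pages']:.0%} "
      "of collection-page traffic.")
```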

Web analytics that can be used as a basis for making these comparisons include the following:

Analytics for Measuring Reach and Engagement: Metrics and Interpretation

Discoverability:

  • Percentage of organic search and referral sessions

Referral sessions indicate that users are finding links to the catalogue on other websites, such as academic search engines, arts and culture blogs, and the websites of professional organizations. This type of traffic suggests the catalogues are accessible to the scholarly audiences they are intended to serve.

Organic search sessions indicate users are arriving at the catalogues after conducting a search via Google or another search engine. This traffic may be important for reaching additional audiences beyond the catalogues' primary target groups. (For more information on discoverability, see How are users finding the catalogues?)

Traffic to the catalogues:

  • Number of sessions
  • Number of visitors

Drilling deeper to discover the number of sessions or visitors that land on the catalogue's homepage, table of contents, or other pages of significance can help museums distinguish between visitors who are making use of the catalogue as a whole and visitors who are aware of, or interested in, only a few of its pages. The web structure of the catalogue and its relationship to other pages on the museum website are also important considerations. (For more information on how web structure affects catalogue traffic, see Appendix A: Further Analyses — The Unique Structure of the NGA's Online Editions.)

Depth of engagement:

  • Session duration
  • Percentage of sessions over x minutes
  • Pageviews/session

The length of time users spend in a catalogue is one measure of their engagement with the material. As an analytics metric, average session duration can be misleading due to outliers and the fact that Google records bounced sessions as having a duration of 0. The distribution of session durations gives a more complete picture of visitor engagement; alternatively, staff can focus on the percentage of sessions that last over 10 minutes (or another target of their choosing).
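
To make this concrete, here is a minimal sketch of a distribution-based engagement summary, assuming a hypothetical session-level CSV export; the file name and column names are illustrative:

```python
import pandas as pd

# Hypothetical session-level export: one row per session, with
# columns: session_id, duration_seconds, pageviews
sessions = pd.read_csv("catalogue_sessions.csv")

# Bounced sessions are recorded with a duration of 0 and drag the mean
# down, so summarize the distribution of non-bounced sessions instead.
engaged = sessions[sessions["duration_seconds"] > 0]
print(engaged["duration_seconds"].describe(percentiles=[0.25, 0.5, 0.75, 0.9]))

# Share of all sessions over the 10-minute target (any threshold works).
threshold = 10 * 60  # seconds
print(f"{(sessions['duration_seconds'] > threshold).mean():.1%} "
      "of sessions lasted over 10 minutes")

# Pageviews per session as a complementary engagement measure.
print(f"Mean pageviews/session: {sessions['pageviews'].mean():.1f}")
```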

In comparing analytics data for this study, we found that defining what counts as a catalogue session is open to interpretation and can make a tremendous difference when measuring catalogue reach and engagement. Visiting a single web page within a digital catalogue does not guarantee that a visitor is aware of the catalogue as a whole, in the way a reader holding a print publication would be. We found that the analytics data varied considerably depending on whether we looked at all catalogue sessions or only sessions that included a visit to the catalogue homepage. Museums may therefore want to set a minimum standard for what counts as a catalogue visit. (See How much traffic are the catalogues receiving? for more information.)
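
For example, a stricter definition could be applied by counting only sessions that include a pageview of the catalogue homepage or table of contents. The sketch below assumes a hypothetical pageview-level export; the file name, column names, and page paths are invented for illustration:

```python
import pandas as pd

# Hypothetical pageview-level export: one row per pageview, with
# columns: session_id, page_path
hits = pd.read_csv("catalogue_pageviews.csv")

all_sessions = hits["session_id"].nunique()

# Stricter definition: only sessions that include the catalogue homepage
# or table of contents count as a "catalogue visit".
key_pages = {"/catalogue/", "/catalogue/toc/"}  # assumed paths
qualified = hits[hits["page_path"].isin(key_pages)]["session_id"].nunique()

print(f"All sessions touching the catalogue: {all_sessions}")
print(f"Sessions including a key page: {qualified} "
      f"({qualified / all_sessions:.0%} of the total)")
```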

Measuring Usability and Performance

A digital catalogue can only be deemed successful if users are able to navigate it and find content of interest. Usability was therefore one focus of this study. We found that some of the keys to usability success included visitors being able to easily 1) access the table of contents, 2) determine their location within the publication as well as the larger museum website, and 3) find and use tools of interest. Some usability metrics can be tracked through analytics data:

Analytics for Usability and Performance: Metrics and Interpretation

Depth of engagement:

  • Session duration
  • Percentage of sessions over x minutes
  • Pageviews/session

Depth of engagement is also a measure of usability, since visitors are unlikely to persist in using a digital catalogue if they cannot find what they are looking for. (For more information on interpreting engagement metrics, see Measuring Reach and Engagement.)

Use of key pages and tools:

  • Percentage of sessions with homepage/TOC pageviews
  • Percentage of sessions over x minutes

Users who access the homepage and table of contents during a session are likely to encounter some key wayfinding signals that help them navigate the catalogue.

Tracking the use of special tools in digital catalogues (such as downloads of PDF versions and use of citation tools or image-viewing tools) requires experience in setting up analytics conversions and goals, which was beyond the scope of this study. Doing so, however, can give museums concrete data on how frequently these tools are used and whether they contribute to deeper engagement.
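
Once such events or goals are configured, the exported data can be summarized in a few lines. The sketch below is a hypothetical example: the file names, column names, and event names are invented, and the analysis simply compares engagement for sessions that did and did not use a tool:

```python
import pandas as pd

# Hypothetical exports once events/conversions are configured:
#   events.csv: session_id, event_name
#   catalogue_sessions.csv: session_id, duration_seconds
events = pd.read_csv("events.csv")
sessions = pd.read_csv("catalogue_sessions.csv")

# Assumed event names for the catalogue's special tools.
tool_events = {"pdf_download", "citation_copy", "image_zoom"}
tool_sessions = set(events[events["event_name"].isin(tool_events)]["session_id"])

sessions["used_tool"] = sessions["session_id"].isin(tool_sessions)
print(f"{sessions['used_tool'].mean():.1%} of sessions used at least one tool")

# Do tool users engage more deeply? Compare median session duration.
print(sessions.groupby("used_tool")["duration_seconds"].median())
```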

We also found that a simple question on the pop-up survey (“Did you find what you were looking for?”) provided a quick measure of usability. While our focus group homework and discussion questions also collected usability data, we found participants’ opinions were highly varied and sometimes contradictory. (See Functionality and Design for the full discussion.) This qualitative data helped us identify a few key trends and areas in which the catalogues might be improved, but we also learned not to get bogged down in the comments of a few disgruntled users.

Measuring Value to Audiences

An important measure of success for this study was the extent to which audiences view these catalogues as trustworthy, scholarly sources. Analytics metrics can provide certain indirect measures of scholarly value by telling us how visitors arrive at the catalogues and how deeply they engage with the content:

Analytics for Measuring Value to Audiences: Metrics and Interpretation

Scholarly referrals:

  • Percentage of referral sessions from academic search engines or other sites related to scholarly research

Referral sessions from academic search engines or other sites related to scholarly research are an encouraging indicator that users are finding these catalogues in the same places they find other scholarly material.

Depth of engagement following a scholarly referral:

  • Session duration
  • Percentage of sessions over x minutes
  • Pageviews/session

Measuring visitor engagement for sessions referred by academic search engines or other scholarly sites can hint at how much these visitors value the content they find. (For more information on interpreting engagement metrics, see Measuring Reach and Engagement.)
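
One way to operationalize this is to classify each referral against a curated list of scholarly domains. In the sketch below, the domain list, file name, and column names are illustrative assumptions rather than the study's actual categories:

```python
import pandas as pd

# Hypothetical session export with columns:
# session_id, referrer_domain, duration_seconds
sessions = pd.read_csv("referral_sessions.csv")

# Illustrative set of domains treated as scholarly referrers.
scholarly = {"scholar.google.com", "jstor.org", "worldcat.org"}
sessions["scholarly_referral"] = sessions["referrer_domain"].isin(scholarly)

share = sessions["scholarly_referral"].mean()
print(f"{share:.1%} of referral sessions came from scholarly sources")

# Compare depth of engagement for scholarly vs. other referrals.
print(sessions.groupby("scholarly_referral")["duration_seconds"].median())
```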

While analytics are easily collected, we found survey and focus group feedback indispensable for understanding how visitors are assessing the value of these digital catalogues. Some of the key questions that helped us measure this dimension of catalogue success include the following:

  • Please rate this resource on a scale of 1–5 for the following criteria: informative text

  • Is this resource something you would feel comfortable citing for your work? (response options: yes, no, N/A)

  • How do you feel this digital catalogue compares to printed resources you use in your work?

  • Are the scholarly essays in these catalogues on par with other kinds of sources you might cite?

Close-ended questions such as the first and second bullet above are useful for providing a quick quantitative assessment of a catalogue’s scholarly value, but the conversations generated by open-ended questions in this study (such as bullets three and four) provided valuable insights that can only be gleaned from qualitative data. (See Scholarly Content for a full discussion.)
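
For the close-ended items, the quantitative summary is simple to compute. The sketch below assumes a hypothetical survey export whose columns mirror the first two questions above; the file and column names are invented:

```python
import pandas as pd

# Hypothetical pop-up survey export with columns:
#   rating_informative_text (1-5), would_cite ("yes"/"no"/"na")
survey = pd.read_csv("popup_survey.csv")

# Mean rating on the 1-5 informative-text criterion.
print(f"Mean rating for informative text: "
      f"{survey['rating_informative_text'].mean():.2f} / 5")

# Share who would cite the resource, excluding N/A responses.
cite = survey[survey["would_cite"] != "na"]["would_cite"]
print(f"Would cite for their work: {(cite == 'yes').mean():.0%} "
      f"(n = {len(cite)})")
```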

This study also demonstrated that while these catalogues are respected as scholarly resources, they are being used for many purposes beyond scholarly research. They have a strong appeal as teaching resources, and they are also generating large amounts of traffic from art enthusiasts for personal use. (See Who is using the catalogues? for more information.) Future studies may seek to establish their own metrics for success based on how digital catalogues are serving these additional user groups.