Mark Freeman
An Analysis of Users of the National Park Service Web Catalog
Summary
A Web survey was placed on the National Park Service web catalog website in October 2013 to
investigate user segmentation. The survey ran for a month, gathering 39 responses. Segmentation
categories followed earlier studies, with the addition of a further sub-division by category for
researchers. The results show that educators are not using the site, though it was not possible to see if it was being used by K-12 students. Historians formed the largest group of research users of the site. The
study is part of a larger analysis of website visitation, and will provide a guide for website changes and
benchmark data for future studies.
This paper was written as part of a course requirement for the University of Tennessee Information Science Master's program in the School of Communications. The class was 565, Digital Libraries, and the
paper was reviewed by Dr. Awa Zhu, whose helpful suggestions form part of the final draft. Information
from this survey has also been presented to the National Park Service Museum Management Program.
Introduction
The National Park Service (NPS) web catalog presents the collections of national parks and historic sites.
In 2002, the NPS Museum Management Program developed the first iteration of the web catalog,
providing public access to textual museum records, as well as associated digital images. In doing so they
were following a trend of cultural institutions moving from descriptive websites to sites that included
dynamic collections content. In 2008 the National Academy of Public Administration’s (NAPA) Report,
“Saving Our History: A Review of National Park Cultural Resource Programs,” stated the following: “The
Panel recommends that NPS make public search tools more user friendly.” At this point, traffic at the
web catalog site was considered low, and participation from parks limited, with only 30 parks, from a
pool of 365, submitting objects to the site. Knowledge of the NPS park collections was low, a point
emphasized in user testing of the new website design – many of the testers didn’t know the NPS had
collections; they thought of the National Park Service as composed of places, not things.
In fact the NPS has huge holdings. The total collections run to over 135 million items. The potential pool for the web catalog is a more modest 84 million because of uncataloged items; 52 million of these are designated as archives, which means they may not be cataloged at the item level. The remaining 32
million objects are split across a number of distinct disciplines: archives, archeology, biology, ethnology,
geology, history, and paleontology. Individual parks decide whether to make their objects available on
the web catalog, a decision based largely on their resources, and they decide which of their objects to
put online.
In December 2011, the NPS website was re-launched, following an extensive redesign undertaken
through a co-operative agreement with the University of Tennessee and the NPS. The site currently
shares 2 million objects and 14 million archival records, spanning the full diversity of subject areas.
As noted by Hamma (2004, Introduction) “There is no such thing as a general visitor, no such thing as
someone just browsing through the on-line collections.” This is undoubtedly true of the web catalog site.
For the site redesign, it was hoped that K-12 students and teachers would use the site for school
projects. A secondary audience of park visitors, or people planning a trip, was also considered, with one
goal for the site being to increase public awareness of the parks’ collections. NPS employees, and then
researchers, were next on the list of intended audiences. These different publics were all considered in
the website re-design. Cultural (rather than natural history) objects were prioritized in the search, and
emphasis was given to presenting objects within a broader context of park information and collection
highlights (mini-exhibits). It was hoped that this approach would make the collections more useful for
website visitors coming to the site without specific information needs.
Since the redesign, the site can be assessed using simple statistics—number of visitors, page views, hits
(information based on server weblogs). Site visitation for the web catalog has jumped from 400 visits
per month to 4700 visits. This includes 1700 unique visitors per month (previous numbers unavailable).
Site traffic varies during the year, with the summer being quieter. More people look at the site at the
beginning of the week and Saturday has been the day showing the fewest visitors. The length of site
visits has increased from three minutes per visit, to six and a half minutes. These statistics provide
evidence that the new site is an improvement over the old. Additionally, tracking of search terms and the ability to see the records subsequently viewed have been added (and an analysis of these data is planned),
but it remains difficult to know who is looking at the website, and how well it is serving its visitors.
Without a clear idea of who has been using the site, it is difficult to optimize the design for multiple
audiences, and to know who the site is not reaching. To narrow this knowledge gap, an online survey
was placed on the website in October 2013. Over a four week period, website visitors answered three
simple questions relating to the reason for their site visit. This survey was based on previous museum
studies, but also designed for the particular nature of the NPS collections.
LITERATURE REVIEW/CONCEPTUAL FRAMEWORK
There have been many studies examining the audience for museums and digital collections: Sarraf
(1999); Goldman and Schaller (2004); Marty (2007); Marty (2008); Peacock and Brownbill (2007);
Fantoni, Stein and Bowman (2012). Museums, and indeed all website owners, realized that the base
web statistics - number of visitors, page views, hits – provide only the broadest sense of website
visitation.
A survey by Sarraf (1999) provided an early examination of museum website visitors. The survey was
web based, though not placed in a specific website but promoted through newsgroups and Listservs.
While the demographics of web users have changed since her study, her report pointed to an older age
group (60% of respondents were over 30), with 64% of respondents female, and 81% having some
college level education (Sarraf, 1999, p. 239). She examined visitor occupation, rather than motivation
for a specific visit, recording 27% of visitors as museum professionals, 18% as students and 12% as
educators. Sarraf also looked at why people were coming to museum websites with 17% recorded as
looking for research information – showing a research category among users, whether professionally, or
for personal use.
The use of museum websites for research was validated by Marty (2007), who conducted an online
survey of 1200 web visitors through promotion at nine different online museums. This wasn’t a pop-up
survey, but one structured more formally outside of the participating museum web pages. His questions
were focused on looking at the relationship between visits to physical museums and their websites,
asking questions about attitudes before and after museum visitation. His particular relevance to this
study is in showing that 69% of survey respondents were either likely, or very likely, to use online images
of artifacts/collections data in their daily lives, and 54% were likely, or very likely, to use online research
materials/archives (Marty, 2007, p. 348). This suggests a strong audience for at least informal research.
Marty also notes the limitations of the online survey method (low response rate, self-selecting group),
though he perhaps underplays the notion that such surveys reach only existing, rather than potential
audiences (p. 344).
Goldman and Schaller (2004) sought to understand both user motivation in visiting the websites and
subsequent satisfaction with their website experience. They used a web survey, activated through a pop-up box presented on a number of different museum websites. They looked at both self-efficacy (how
comfortable people felt in trying tasks), and how well expectations of the site were met. Again, the
problems with web surveys were acknowledged by the authors, and their survey had response rates of between 1% and 3%, making extrapolation of the data to a broader context difficult. What their survey did
include however was a breakdown by audience type. Teachers made up 23.9% of their sample and
students 53.6%. However, it should be noted that “student” was a self-selected category and included
people in all age groups.
A more holistic approach to understanding audience was taken by Peacock and Brownbill (2007). Their
paper commented on the limitations of existing visitor studies, and suggested a broader approach that
examined four paradigms commonly used in trying to understand visitors: audience and visitor studies,
marketing, product evaluation, and usability analysis. From their analysis of qualitative and quantitative
data, they split website visitors into four segments:
Browsers: defined as those accessing the website as part of general browsing
Visitors: defined as those using the website prior to a visit to the physical museum
Searchers: defined as those using the website for specific research
Transactors: defined as those using the website for financial transactions
The study also notes that they are “segment[ing] existing users rather than segmenting the market,
which includes those who don’t use museum Web sites, a much larger group” (Peacock & Brownbill,
2007, Use of Segmentation). Web-based surveys are presented to existing users, but it is important to
consider the audience that is not being reached by the survey.
The research by Peacock and Brownbill was further explored by Fantoni, Stein and Bowman (2012) through
the results of two museum website surveys. They first used open-ended questions to determine
people’s motivations in coming to the Indianapolis Museum of Art website. From this they determined
five categories of motivation:
Plan a visit to the museum
Find specific information for research or professional purposes
Find specific information for personal interest
Engage in casual browsing without looking for something specific
Make a transaction on the website
It can be seen that these categories, in this case coming from visitors themselves, closely match those of
Peacock and Brownbill. The differentiation between professional and personal research is an interesting
distinction, though how people may use these categories is unclear. The value in this form of user
segmentation is in the way it suggests improvements to website design. For different types of users, site
structure, language, and the importance of visual elements are likely to vary in importance. Personal
interest searchers may place less emphasis on faceted searching than professional researchers. Fantoni, Stein and Bowman were able to use Google Analytics to subsequently track these defined user segments
to see how they moved through the site and what areas of the site they used.
Combining, or aggregating, these studies is difficult. The researchers had different research questions,
different methodologies, and the studies took place over a period of twelve years. The table below is meant merely as a summary of some of the results from the papers outlined above.
Study | Searchers % | Visitors % | Browsers % | Students % | Teachers % | Museum % | Researchers %
Sarraf (1) | – | 2.6 | 67 | 18 | 12 | 27 | 17
Goldman (2) | 4 | – | – | 54 | 24 | – | –
Marty (3) | 80 | 58 | – | 48 | – | – | –
Peacock (4) | 23 | 16 | 48 | – | – | – | –
Fantoni (5) | 37 (16% professional, 21% personal) | 50 | 10 | 20 | – | – | 60

Visitor segmentation by percentage from different website visitation studies. Studies in chronological order.

1. Conflated information from more than one survey question (Sarraf, 1999).
2. Information extrapolated from multiple questions (Goldman & Schaller, 2004).
3. Looks at non-exclusive interest in website offerings, aggregated before and after a museum visit, taken from likely/very likely responses. Student % was inferred (with low precision) from interest in educational materials (Marty, 2007).
4. The fourth segment from Peacock and Brownbill was transactors (14%) (Peacock & Brownbill, 2007).
5. Fantoni also noted the transactor segment (3%) (Fantoni, Stein & Bowman, 2012).
RESEARCH DESIGN/METHODOLOGY
In designing questions for the NPS web catalog site it was important to take into consideration some of
the defining characteristics of the site. While many museums have a variety of objects, the NPS web
catalog has relatively large and diverse collections. Since the 2011 re-launch of the website, the number
of participating parks has risen to 90, and the number of objects online has risen to over two million
objects. The distribution of objects largely mirrors the overall collections, though proportionally biology
specimens are underrepresented on the web catalog.
Object Class | Web catalog count | % on web catalog* | % in overall collections*
Archives | 14,094,858 | – | –
Archeology | 1,965,512 | 86 | 84
History | 205,744 | 9 | 7.4
Art | 5,302 | <1 | <1
Ethnology | 2,164 | <1 | <1
Paleontology | 46,420 | 2 | 1.4
Biology | 36,113 | 1.6 | 6.75
Geology | 7,175 | <1 | <1
Total | 16,363,288 | 100 | 100

*Since archival counts were disproportionally high, and don’t reflect individual objects, they were not included in the percentages
The type of material included on the site clearly influences the potential audience. Archeology forms the
largest group of material, and its composition includes both historical and prehistoric material. However
the utility of the archeology information is strongly affected by the removal of location and contextual
information, because of concerns about looting. This likely affects the use of this material by
archeologists. Similarly the location and locality information is removed from the natural history records,
making these records of limited use for scientific research. Currently, while faceted search is available, it
is not optimized for natural history collections. For a limited number of archival records, Finding Aids are
available on the site, but they are not promoted through the search or site navigation. In order to better
understand the usefulness of the different object classes the survey needed to quantify not just a
“searcher” category, but also the type of researchers using the material.
Accordingly, a web survey was placed on the site to ask visitors why they had come to the site. The
categories used reflect the breakdown of user motivations summarized in earlier studies – visitor,
browser, searcher, transactor. However, two changes were made. First, the transactor category was omitted, since there are no commercial aspects to the site. Transactors can be understood to include those visitors who come to the site to save images, rather than just undertake financial transactions, but this was considered a secondary, rather than primary, motivation. Second, the web catalog site has some
particular characteristics that required a deeper understanding of the searcher demographic, so this
category was further segmented into research sub-categories to help understand the importance of the
collections to different research groups.
Before the survey was implemented feedback was requested from the Museum Management Program
at the NPS. With their support, Institutional Review Board (IRB) approval was sought, since the survey
involved human subjects. Because of IRB requirements, an initial panel was added to the survey
providing some information about the study, and asking that visitors confirm they were over 18 years of
age before continuing. Unfortunately, the IRB requirement meant a vitally important demographic was
removed from the survey – students under 18.
Initially the survey was placed on the web catalog mirror site which is not accessible to the public. It was
reviewed for functionality and language by the NPS museum management program staff and
consultants, and the survey results were discarded. Finally, the survey was placed on the NPS web
catalog (museum.nps.gov) site. The text was included through a combination of jQuery, CSS and html,
with the survey box placed through absolute positioning, and initially hidden. This was done to avoid
issues with browsers blocking pop-ups, and to make interaction with the survey as simple as possible.
The code was applied to just one page on the site - the object detail page, which uses dynamic content
to display single objects from the collections. To access this page, a visitor has to follow a link, make a
selection from a search query, or look at an object from a collection highlight. In this way visitors
reaching the web catalog site unintentionally, or those who leave after reviewing the home page, were
not included in the survey - only visitors that were actively engaged in using the site.
The script used random number generation to decide whether the survey box should appear. Based on the value drawn, 20% of visitors were presented with a survey box. Visitors could close the survey
without answering at any point. If they did so, a database record would record only that the survey had
been presented.
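The sampling decision described above can be sketched in a few lines. This is a minimal illustration, not the actual NPS code; the function name and the exact structure are assumptions, and the random value is passed in as a parameter so the logic is easy to follow:

```javascript
// Sketch of the 20% sampling logic described above (illustrative, not the
// production code). A visitor who has already been surveyed is skipped;
// otherwise roughly 1 in 5 visitors is shown the survey box.
function shouldPresentSurvey(alreadySurveyed, random) {
  if (alreadySurveyed) {
    return false; // cookie check: never present the survey twice
  }
  return random < 0.2; // present to roughly 20% of remaining visitors
}

// Example: draw the random value once per page view.
var present = shouldPresentSurvey(false, Math.random());
```

In the page script, a `true` result would trigger the jQuery code that reveals the hidden, absolutely positioned survey box.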
Initial survey page
Clicking on “Continue” dynamically changed the text inside the survey through jQuery scripting and css.
At this point visitors were presented with the survey questions. The first part asked users to self-identify
by profession. Two subsidiary questions asked if the visitor was currently at a National Park, and
whether they worked for the NPS. These questions were an attempt to gauge mobile usage, as well as to record what segment of the sample were NPS employees (one of the initial design target groups).
The survey questions mirrored earlier studies by Fantoni, Stein and Bowman (2012), and Peacock and
Brownhill (2007). The questions used radio buttons to record discrete categories for visitors asking: “Is
your interest in the web catalog as a:”
Casual Browser - corresponding to the casual browser and browser of earlier studies.
Current/potential visitor - matching “visitors” and “plan a visit” categories. The site can
be used to preview park exhibits, and understand the type of park collections prior to a visit.
Student/K-12 - while students under eighteen were asked to self-exclude, this category was left in place to see if the over-eighteen prohibition worked.
Student (undergraduate) – both this category and educator show use within traditional education roles.
Educator – this is a self-defining role, but educator was chosen rather than the narrower term “teacher”.
Researcher - this group was sub-divided (see below).
Other – visitors were able to self-identify as a non-defined group and record their
category through an open text box.
Researcher was broken down into:
Historian
Amateur Historian/Collector
Archaeologist
Natural Scientist
Curator
These groups reflect in part the categories of information on the site. The split between historian and
amateur historian/collector also allowed comparison with earlier studies’ breakdown between research
for personal and professional use. It should be noted that the question asked was couched in terms of
“your interest in the web catalog” rather than “what is your profession.” The survey attempted to
understand, for this website visit, why the visitor had come to the site, rather than their occupation.
Web survey questions in situ
All values were initially unchecked, and visitors could close the survey without answering. Clicking “Close”
recorded the answers in a database user log table. If nothing was clicked before closing, then an empty
record was written.
Scripting was used to set a browser cookie for anyone who received the survey. Before the random
generator was used the presence of the cookie was checked; if present the survey was not shown again,
meaning that visitors should not receive the survey twice. While this was not a perfect option (a visitor
could potentially receive the survey again on a different computer), without asking for identification this
approach seemed acceptable as providing minimal visitor interference.
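The cookie guard might look something like the following hypothetical sketch. The cookie name is an assumption, and the cookie string is taken as a parameter so the parsing can be shown without a browser (in the page itself it would come from `document.cookie`):

```javascript
// Hypothetical sketch of the cookie guard described above.
// The cookie name is an assumption, not the one actually used on the site.
var SURVEY_COOKIE = 'npsSurveyShown';

// document.cookie is a "name=value; name=value" list; check for our flag.
function hasSurveyCookie(cookieString) {
  return cookieString.split(';').some(function (part) {
    return part.trim().indexOf(SURVEY_COOKIE + '=') === 0;
  });
}

// Build the cookie string to set once the survey has been presented.
// A long expiry keeps the flag across sessions on the same machine.
function buildSurveyCookie() {
  var expires = new Date(Date.now() + 365 * 24 * 60 * 60 * 1000).toUTCString();
  return SURVEY_COOKIE + '=1; expires=' + expires + '; path=/';
}
```

In the page script, `hasSurveyCookie(document.cookie)` would be checked before the random draw, and `document.cookie` assigned from `buildSurveyCookie()` whenever the survey box was presented.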
The survey was intended to run for the month of October 2013 (to more easily match other web
statistics). Unfortunately, the shutdown of the Federal government during the early part of the month
meant the survey could not be started. Instead, data collection started on October 22, 2013 and ran
until November 20, 2013. Data were extracted by a simple SQL query and the query results saved to a
text file. The resulting text file (comma delimited) was then brought into an Excel spreadsheet for
analysis. The data are relatively straightforward, and simple bar charts were created for presentation.
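Once exported, the tabulation itself is straightforward. As an illustration (the actual analysis was done in Excel), the comma-delimited export could be summarized with a short script; the field layout assumed here (a header row, with the selected category in the first column, and an empty category for surveys closed without answering) is a guess, not the real export format:

```javascript
// Hypothetical sketch of tabulating the comma-delimited survey export.
// Assumes a header row and the chosen category in the first column; empty
// category fields are surveys that were presented but closed unanswered.
function categoryPercentages(csvText) {
  var counts = {};
  var answered = 0;
  csvText.trim().split('\n').slice(1).forEach(function (line) {
    var category = line.split(',')[0].trim();
    if (!category) {
      return; // unanswered: excluded from the category breakdown
    }
    counts[category] = (counts[category] || 0) + 1;
    answered += 1;
  });
  var percentages = {};
  Object.keys(counts).forEach(function (name) {
    // Round to one decimal place, as in the tables in this paper.
    percentages[name] = Math.round((counts[name] / answered) * 1000) / 10;
  });
  return percentages;
}
```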
Limitations of methodology
Web surveys typically have a low response rate. This survey ran for a month, generating only 39
responses. The 34% response rate, however, compares favorably to other studies and was likely
helped by the simplicity of the survey. Additionally, anyone under 18 had to close the survey without
answering.
All web surveys are self-selecting. People choose whether to answer or not, and it is possible that certain types of visitors – perhaps casual browsers – will be less likely to answer a survey. There could be a bias
in the results based on people’s motivation in taking the survey.
In the interest of minimizing the amount of text visitors had to read, the terms used in the survey were
not explicitly defined. As noted previously, the terminology mirrored the language of earlier web use
studies, but it is possible that people understood their roles differently from the survey terms. Visitors were asked about their interest in the web catalog, but through a casual reading, an individual could have answered in terms of his or her primary occupation. A visitor might have been casually browsing the collections, but since, for example, he or she worked as an archeologist, that is how they defined themselves in the survey.
Finally, and obviously, the survey can only tell us about existing users of the site, rather than the
potential audience. A primary issue for site visitation may be that the public is unaware that the
National Park collections exist. While some inferences can be drawn about who isn’t coming to the site, this survey is intended to be just one part of a larger examination of website users.
RESULTS
The survey was presented to web catalog visitors 115 times with 39 responses (a response rate of 34%).
Breakdown by category is shown below:
Percentage of respondents by category
What is immediately striking from the results is that educator was not selected as a category. Considering that educators were identified as a high target group for the site, their absence is disappointing. It is impossible to know what percentage of non-respondents were students, and being unable to see use by K-12 students in the survey was a limiting factor in the study. The 13% audience of undergraduates does show some use for education. The lack of educators is made more surprising when compared to a limited survey that was conducted on the previous site prior to the re-launch (Freeman 2010, p. 9-10). In that survey, 50% of respondents self-identified as educators, with the other 50% being included in research categories.
The low response for visitors may be partly explained by the fact that the survey was not available through mobile devices, and that the web catalog site is not currently marketed at parks. The relatively low numbers of browsers, compared to other studies, may also suggest that one problem of audience is that the site isn’t well known. Tracking (and potentially improving) search engine traffic may help here.
Research categories are shown below:
Percentage of researchers by category (Historian, Amateur Historian/Collector, Archaeologist, Curator, Other/Archivist)
The high percentage of historians (42%) did match the earlier survey (44%). One noticeable result was
the lack of respondents who identified as natural scientists. There is a concern about the usefulness of
the natural history data to scientists. The total natural history content is low (4% of the total site),
though a large herbarium collection is soon to be added, and the search tools are not currently
optimized for this type of content.
Archeologists accounted for 20% of the visitors. The survey also recorded the class of the object being viewed when the survey was triggered (see table below), with 19% of objects classified as archeology. A closer examination of the survey responses showed that the archeologists themselves were looking at non-archeology objects. The archeology data was therefore serving a wider audience than archeologists, while archeologists were not looking at the archeology data.
It is possible that the 30% of respondents identifying as “curator” reflect, in part, NPS employees. Thirty
percent identified as NPS employees or affiliates. This number seems high, but may reflect this group’s
higher motivation to complete the survey. Only 18% of people identified themselves as being on-site.
The survey was not placed on the mobile website pages and therefore did not reflect mobile use
(though tablets bring up the regular website). However, the number may also reflect the 30% identified
as NPS employees and affiliates, rather than current visitors.
Object class viewed when the survey was presented
When the survey was presented the object class was recorded. History objects were the most viewed
(53%). If archives are included in this category, history comprises over 70% of content viewed. A single
archival web catalog record can include thousands of archival objects so it is hard to compare the use of
archival records (19%) with the huge collections on the site, but usage (17%) suggests that this material
has potential for development. As noted previously, archeology comprised 19% of objects viewed. Since archeology objects comprise 86% of the total non-archival web catalog collections, this number seems
low. The low use of natural history data (biology, geology, paleontology) reflects a missing audience and
the need to make this material more accessible and useful to web catalog visitors.
Access counts for collection object records, by object class, were collected from April 2013 to December 2013. These numbers largely mirror the figures from the survey.
Object Class | Numbers | %
History | 6,615 | 60.4
Art | 531 | 4.9
Archives | 1,145 | 10.5
Archeology | 1,862 | 17.0
Ethnology | 358 | 3.3
Biology | 152 | 1.4
Geology | 33 | 0.3
Paleontology | 250 | 2.3
Total | 10,946 | 100
Again we can see low levels of queries on the scientific data – just under 4% of the total objects viewed. Archeology is 17%, which can be compared to the 86% of the web catalog data that is archeology records. History (60%) forms the largest use of objects, while comprising just 9% of the records on the web catalog.
The same data reveal that 14% of the records reviewed were collection highlight records (this statistic is based on database records rather than the adjusted count figures noted above). The high percentage of
collection highlight records does suggest strong interest in “exhibit” level data, which would be likely
associated with teaching and education, though this doesn’t appear to be supported by other data from
the survey.
DISCUSSIONS AND CONCLUSIONS
Since the web catalog re-launch in December 2011, the broad increases shown in web statistics have provided support for the success of the web catalog redesign. This survey, however, calls into question how successful the redesign has been in reaching the intended audiences. Even with the limitations of the survey―the low number of respondents and the inability to question K-12 students―the audience reflected in the survey is in inverse proportion to the intended priority of the target audiences: K-12 students, park visitors, NPS employees, and researchers.
The web catalog remained accessible during the recent federal government shutdown, though no work
was performed on the site. One interesting aspect of the shutdown was the prominence of the National
Parks as a symbol of the Federal Government. A recent survey (Hart, 2013) showed very strong support
for the National Parks across political lines. The survey did not ask about attitudes to NPS collections,
perhaps reflecting the low profile they have in the public consciousness. However, collections were
touched upon for the final survey question: “71% of voters rate as extremely important (37%) or quite
important (34%) the idea of connecting National Parks to America’s classrooms to excite students about
learning science, math, civics, and other topics, and help them see how what they learn applies to the
real world” (Hart, 2013, pp. 13-14). Clearly the K-12, and possibly undergraduate, demographic is seen as
a very important audience for the web catalog.
The survey data on researcher segmentation are less surprising, though it is clearly worth considering the utility of the natural history data to scientists as they are currently presented. These data may also need to be on scientific websites, to better serve that community. Further dialogue is needed with archeologists, to see how the archeology data can better serve its practitioners. Otherwise, for existing site visitors, adding more quality history data would seem to be valuable.
The survey has suggested some of the potential audiences currently under-represented in site visitation.
Future site promotion can now target these audiences. Clearly the site needs to do a better job reaching
out to K-12 students and teachers. Better linkage from the education sections of the parent NPS.gov site
should help. Within the last few weeks there has also been increased promotion through social media –
such as Facebook and Twitter – and content is being considered for other social media. Additionally
upcoming promotion of a mobile version of the site at parks will hopefully increase site use by park
visitors.
The results of this survey were provided to the NPS Museum Management Program to serve as a guide
for future changes. These are intended to increase overall site visitation, but they are also aimed at
certain audience sectors. New searching tools are intended to provide a better faceted search for
scientific data, which, it is hoped, will improve access for natural science researchers. A similar change to
better expose archival content should make the site more helpful for history researchers. For K-12 and
other educational users, teaching materials will be given more prominence on the site, with lesson plans
linked to collection highlights, and a new search option made available to list those collection highlights
with associated educational content.
Villaespesa and Tasich (2012) point to the need to place metrics in a wider context, suggesting that they should be aligned with wider organizational goals and that measurement become a continuous process. A planned examination of search terms might reveal more about what other types of material
people are interested in finding. Web statistics, including newly implemented Google analytics, will be
continually monitored and used to place these results in a broader context. This study can also be used
as benchmark data for future studies.
It is difficult to infer any wider conclusions about digital collections from the multiple studies that have
been done on user segmentation. The drop in “browsers” during the studies likely reflects a general
trend in web activity; people spend less time wandering the web and more time on sites with which they
are familiar. For the web catalog the “visitors” number seems an anomaly compared to other studies, but this can partly be explained by the web catalog’s focus on collections, rather than opening hours or directions. All the studies show that a large segment of website visitors are interested in research, or information about collections and exhibits; hopefully the breakdown of the types of researchers for
museum objects may inform later studies, and a follow up study tracking website behavior of these
different groups may inform future website design.
References
Fantoni, S. F., Stein, R., Bowman, G. (2012). Exploring the Relationship between Visitor Motivation and
Engagement in Online Museum Audiences. Museums and the Web 2012: Proceedings. Toronto:
Archives & Museum Informatics. Consulted October 15, 2013. Available at:
http://www.museumsandtheweb.com/mw2012/papers/exploring_the_relationship_between_visitor_mot.html
Freeman, M (2010). Planning for a New Park Museum Management Program website. Unpublished
report, National Park Service, Museum Management Program, Washington DC.
Haley Goldman, K., Schaller, D. (2004). “Exploring Motivational Factors and Visitor Satisfaction in
On-Line Museum Visits.” In J. Trant and D. Bearman (eds.). Museums and the Web 2004:
Proceedings. Toronto: Archives & Museum Informatics. Consulted October 13, 2013. Available at:
http://www.archimuse.com/mw2004/papers/haleyGoldman/haleyGoldman.html
Hamma, Kenneth (2004). The role of museums in online teaching, learning, and research. First Monday,
9(5). Consulted October 13, 2013. Available at
http://firstmonday.org/ojs/index.php/fm/rt/printerFriendly/1146/1066
Hart Research Associates and North Star Opinion Research (2013). Strong bipartisan support for
National Parks. The National Parks Conservation Association and National Park Hospitality
Association. Consulted November 28, 2013. Available at:
http://www.npca.org/assets/pdf/Suvey_Findings_Memo_Final.pdf
Marty, P. F. (2007). Museum websites and museum visitors: Before and after the museum visit. Museum
Management and Curatorship, 22(4), 337-360. DOI: 10.1080/09647770701757708
Peacock, D., Brownbill, J. (2007). “Audiences, Visitors, Users: Reconceptualising Users of Museum
Online Content and Services.” In J. Trant and D. Bearman (eds.). Museums and the Web 2007:
Proceedings. Toronto: Archives & Museum Informatics. Consulted October 15, 2013. Available at:
http://www.archimuse.com/mw2007/papers/peacock/peacock.html
Sarraf, Suzanne (1999). A Survey of Museums on the Web: Who Uses Museum Websites? Curator: The
Museum Journal, 42(3). Blackwell Publishing Ltd.
Villaespesa, E., Tasich, T. (2012). Making Sense of Numbers: A Journey of Spreading the Analytics Culture
at Tate. Museums and the Web 2012. Consulted November 10, 2013. Available at:
http://www.museumsandtheweb.com/mw2012/papers/making_sense_of_numbers_a_journey_of_spreading
APPENDIX A
Survey
Introduction
Please help us learn more about you.
The following survey consists of just three questions and contains no identifying or personal
information. It can be completed in under a minute. Your answers will help us improve the
web catalog. You can close the survey at any time without penalty, and no information will be
saved.
If you have questions about the survey, you can contact us here.
[x] I am over 18
Cancel Continue
Questions
Please help us learn more about you
Is your interest in the web catalog as a:
Casual browser
Current/potential Park Visitor
Student (K-12)
Student (college)
Educator
Researcher
Historian
Amateur Historian/Collector
Archaeologist
Natural Scientist
Curator [data entry text box provided]
Are you currently at a National Park?
Do you work for, or are you affiliated with, the National Park Service?
Cancel Continue
Survey Results

Surveys presented: 115
Responses: 39
Response rate: 33.9%

Prof Type        n    %
Browser          6    15.4
Visitor          1    2.6
Undergraduate    5    12.8
Educator         0    0.0
Researcher       24   61.5
Other            3    7.7
Total            39   100

                 n    %
On-site          5    12.8
NPS              8    20.5
Researchers       n    %
Amateur Hist      1    4.2
Archeologists     5    20.8
Historian         10   41.7
Curator           7    29.2
Other/Archivist   1    4.2
Total             24   100

Class             n    %
Archeology        19   18.6
Biology           3    2.9
Ethnology         5    4.9
Geology           1    1.0
History           54   52.9
History-Archives  19   18.6
Paleontology      1    1.0
Total             102  100
Summary of previous surveys and NPS data

Category      Sarraf1  Goldman2  Marty3  Peacock4  Fantoni5  NPS6
Visitors      2.6      4         80      23        50        3
Browsers      -        67        58      16        10        15
Searchers     -        -         48      48        37*       -
Students      18       54        15      -         -         13
Teachers      12       24        -       -         -         0
Museum        27       -         -       -         -         (20)
Researchers   17       20        60      -         -         61

* 37 (16% professional, 21% personal)
Visitor segmentation by percentage from different website visitation studies. Studies in chronological order.
1. Conflated information from more than one survey question (Sarraf, 1999).
2. Information extrapolated from multiple questions (Goldman & Schaller, 2004).
3. Looks at non-exclusive interest in website offerings, aggregated before and after a museum visit
and taken from likely/very likely responses. Student % was inferred (with low precision) from
interest in educational materials (Marty, 2007).
4. The fourth demographic from Peacock and Brownbill was transactors (14%) (Peacock &
Brownbill, 2007).
5. Fantoni also noted the transactor demographic (3%) (Fantoni, Stein & Bowman, 2012).
6. In this study “museum” was taken from a separate question asking about NPS affiliation.