Taxpayer Return on Investment in Florida Public Libraries Survey Results



Appendix IV - Survey Methodology

Telephone Survey
The Haas Center staff contracted with Oppenheim Research, a telephone survey firm located in Tallahassee, Florida, to conduct the statewide telephone survey. Oppenheim Research obtained the home telephone numbers used in the survey from Survey Sampling International (SSI). Generating a truly random sample of telephone numbers for individuals in a particular geographic region is becoming increasingly challenging. According to SSI, “In 2007, roughly 16% of all households had only wireless or cell phone service and only 82% of households could be reached on a landline telephone. Approximately 30% of the landline telephone households in the U.S. have unlisted numbers. Each year, about 20% of American households move, so that 12-15% of the residential numbers in a typical directory are disconnected over the life cycle of the directory. Samples drawn entirely from directories, and “plus-one” techniques based on directory seed numbers, often significantly under-represent unlisted households.”3

To overcome these difficulties, SSI developed random digit dialing (RDD) methods, which yield more active numbers and faster survey completion. In addition to the RDD method, phone numbers were distributed across all counties included in the sample (in this case, the State of Florida) in proportion to their density of listed telephone numbers. “All blocks within a county are organized in ascending order by area code, exchange, and block number. Once the quota has been allocated to all counties in the frame, a sampling interval is calculated by summing the number of listed residential numbers in eligible blocks within the county and dividing that sum by the number of sampling points assigned to the county. From a random start between zero and the sampling interval, blocks are systematically selected in proportion to their density of listed households. Once a block has been selected, a two-digit number is systematically selected in the range 00-99 and is appended to the exchange and block to form a 10-digit telephone number.”4 This methodology provides a very efficient random digit sample: each county's probability of selection equals its share of the state's listed telephone households. Business numbers were not included, nor were mobile phone numbers.
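The block-selection procedure quoted above can be sketched in code. The following Python is an illustrative sketch only, not SSI's implementation; the block prefixes and listed-household counts are hypothetical, and the two-digit suffix is drawn randomly rather than systematically.

```python
import random

def rdd_sample(blocks, n_numbers, seed=None):
    """Systematic probability-proportional-to-size selection of 100-blocks,
    with a two-digit suffix appended to form a 10-digit number.

    `blocks` maps an 8-digit prefix (area code + exchange + block)
    to its count of listed residential numbers."""
    rng = random.Random(seed)
    prefixes = sorted(blocks)             # ascending by area code, exchange, block
    total = sum(blocks.values())
    interval = total / n_numbers          # sampling interval
    point = rng.uniform(0, interval)      # random start in [0, interval)
    numbers, cum = [], 0
    for prefix in prefixes:
        cum += blocks[prefix]
        # a block is hit once per sampling point falling inside it,
        # so dense blocks are selected proportionally more often
        while point < cum:
            suffix = rng.randrange(100)   # two-digit number in 00-99
            numbers.append(f"{prefix}{suffix:02d}")
            point += interval
    return numbers
```

A block with 90 listed households is then roughly nine times as likely to contribute a sampled number as one with 10, which is the proportionality the quoted procedure describes.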

Initially, Oppenheim Research ordered 15,000 telephone numbers. Of those 15,000, they used 9,595 to obtain 905 survey completions, a response rate of 9.4%. Several screening criteria were applied at the beginning of the survey. After introducing the survey, interviewers asked respondents whether they, or someone in the household, were over 18 years of age. If so, respondents were then asked whether they had visited a public library in person or online in the last 12 months; if the answer was no, the survey did not continue. The survey was conducted from November 11, 2009 through December 18, 2009.
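The reported response rate is simply completions divided by the telephone numbers used; a quick check of the arithmetic (the function name is ours, not the study's):

```python
def response_rate(completions, numbers_used):
    """Crude response rate: completed surveys / telephone numbers used."""
    return completions / numbers_used

# Figures from the telephone survey above: 905 completions from 9,595 numbers
rate = response_rate(905, 9_595)   # ≈ 0.094, i.e. the 9.4% reported
```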

Online Survey
Several surveys were conducted online using Survey Monkey, an Internet-based survey program. These consisted of the online version of the library use survey, the survey of organizational library users, and the library census.

The printed in-library survey instrument used in the 2004 study was modified to permit respondents to answer the same questions as asked in the telephone version of the library users’ survey. A version of the survey in Spanish was provided. Links to the survey instruments were provided to the State Library and Archives and all public library directors for inclusion on the home pages of their respective websites. Where inclusion on the home page was not feasible, library directors were encouraged to post signs containing the link and encouraging library patrons to participate in the survey. A link to the survey was also posted on the Haas Center home page. The library user online survey ran from November 13, 2009 until January 4, 2010, and a total of 2,094 completed or partially completed surveys were received.

The survey of organizational library users consisted of a series of questions designed to elicit usage patterns and the economic value placed on public libraries by businesses, public and private schools and university libraries. Links to the online survey were e-mailed to a listing of special libraries, school superintendents, public school librarians and media specialists and private school principals. The survey was conducted from November 13, 2009 until January 4, 2010. A total of 167 completed or partially completed surveys were received.

The library census consisted of questions designed to elicit additional data not regularly reported to the State Library and Archives. Links to the survey were e-mailed to the public library directors. A total of 19 surveys were completed.

Survey Analysis
One general data analysis issue with many surveys is how to handle “outliers,” individual responses so large that they inflate estimated averages and totals. To err on the conservative side, we excluded an outlier whenever that single response expanded an estimate by 50 percent or more. The other typical survey issue is item non-response: a questionnaire is completed, but one or more questions (i.e., items) are left unanswered. In these cases, averages were calculated with the non-responses omitted.
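One way to operationalize the 50-percent criterion described above is sketched below. This is an illustration, not the study's actual code: it repeatedly checks whether the single largest response inflates the mean by 50 percent or more relative to the mean without it, and drops it if so.

```python
def drop_inflating_outliers(values, threshold=0.5):
    """Drop the largest response while its inclusion inflates the mean
    by `threshold` (default 50%) or more relative to the mean without it."""
    kept = list(values)
    while len(kept) > 1:
        mean_all = sum(kept) / len(kept)
        i = max(range(len(kept)), key=lambda j: kept[j])  # largest response
        rest = kept[:i] + kept[i + 1:]
        mean_rest = sum(rest) / len(rest)
        if mean_rest > 0 and mean_all >= (1 + threshold) * mean_rest:
            kept.pop(i)          # the outlier expands the estimate too much
        else:
            break                # remaining values are kept as-is
    return kept
```

For example, a single response of 300 among responses near 10 inflates the mean several-fold and would be excluded, while ordinary variation passes through unchanged.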

To take advantage of visit-related responses from both the household telephone and online surveys, we usually combined estimates from the two surveys, weighting each estimate by its respective number of responses. For example, the estimated average time to use alternatives was 94.5 minutes for the online survey and 73.5 minutes for the telephone survey. Using the weights calculated for that question, we arrive at a combined average of roughly 84 minutes (i.e., 94.5 × 0.58 + 73.5 × 0.42).
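The weighting scheme amounts to a response-count-weighted mean. A minimal sketch, with hypothetical response counts standing in for the per-question weights (the actual counts behind the 0.58/0.42 weights are not reported here):

```python
def combined_mean(means, counts):
    """Combine estimates from several surveys, weighting each mean
    by its number of responses to the question."""
    total = sum(counts)
    return sum(m * n / total for m, n in zip(means, counts))

# Hypothetical counts producing the 0.58/0.42 weights used in the example
estimate = combined_mean([94.5, 73.5], [58, 42])
```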

Some survey questions asked respondents to check a range of values. For example, we asked for annual household income in ranges of under $30,000, $30,000 to $50,000, $50,000 to $75,000, $75,000 to $100,000, and more than $100,000. In some instances we needed to estimate an average income from these responses. If the proportion of responses is about equal across the ranges, one could multiply each range's mid-point by the proportion of responses falling in that range and sum the products across the ranges. However, such values are often skewed in a log-normal manner, in which case a geometric average is used in lieu of a mid-point. This average is the square root of the product of the range endpoints; for example, the square root of $25,000 times $50,000, or $35,355. The outside values for the open-ended end ranges are approximated by examining the log-normal plots.
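The geometric-average calculation can be written out directly. The bracket endpoints below come from the document's own example; the function names are ours:

```python
import math

def geometric_midpoint(lo, hi):
    """Geometric mean of a bracket's endpoints, used instead of the
    arithmetic mid-point when values are log-normally skewed."""
    return math.sqrt(lo * hi)

def mean_from_brackets(brackets, proportions):
    """Weight each bracket's geometric mid-point by its response share."""
    return sum(p * geometric_midpoint(lo, hi)
               for (lo, hi), p in zip(brackets, proportions))

geometric_midpoint(25_000, 50_000)   # ≈ 35,355, the figure cited above
```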

To establish an hourly rate, for example to apply to the number of hours spent for work-related purposes in the library, we added a 25 percent fringe benefit rate to personal annual income and divided by 2,080 annual hours. Both of these values yield conservative estimates.
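The hourly-rate construction above is straightforward arithmetic; a sketch, where the $41,600 income is a hypothetical input rather than a survey figure:

```python
def hourly_rate(annual_income, fringe_rate=0.25, annual_hours=2_080):
    """Loaded hourly rate: annual income plus a fringe-benefit markup,
    divided by full-time annual hours."""
    return annual_income * (1 + fringe_rate) / annual_hours

hourly_rate(41_600)   # → 25.0 (i.e., 41,600 × 1.25 / 2,080)
```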

Adult users in the online survey were asked questions about taxes designated for public libraries and, on both library user surveys, adults were asked how much they would be willing to accept for, and to pay for, their library card. Adult residents pay an average of about $42 per adult in local taxes, and $47 per adult when state and federal tax contributions are included. When asked, “If someone would buy your public library card each year, how much would you ask for it?”, fifty-six percent of combined survey respondents said they would not give it up. Nearly 45 percent of telephone respondents said they would accept less than they pay in taxes; the rest indicated that they would accept only more than what they pay. Respondents were also asked, “If you paid a price for your library card each year instead of paying taxes, how much would you be willing to pay for it?”. The average amount they said they were willing to pay was less than the amount they actually pay, which may reflect the impact of the economic downturn on individuals’ sense of wealth. Yet adult users still demonstrate that they are willing to pay many times that amount over a year, considering the time and other costs they spend using their public library.

Online respondents tended to skip questions, particularly those requesting demographic information. Where appropriate, telephone survey demographic responses were used instead.

 

3 SSI, RDD Landline Sample Methodology, http://www.surveysampling.com/sites/all/files/imce/RDDLandline.pdf
4 Ibid.

