Electronic supplementary material
The online version of this article (doi:10.1186/s12889-015-2209-0) contains supplementary material, which is available to authorized users.
The authors declare that they have no competing interests.
MEB conducted all statistical analyses, performed the initial interpretation of the data, and drafted the manuscript. GDD participated in the study conception and data interpretation. AVH participated in study conception and design, acquisition of data, and in the analysis and interpretation of data. YK participated in data interpretation. TAB participated in the study conception and design, acquisition of data, and data analysis and interpretation. All authors read and approved the final manuscript.
Parks are increasingly being viewed as a resource that may influence youth obesity and physical activity (PA). Assessing park quality can be challenging, as few tools assess park characteristics geared towards youth PA. Additionally, no studies have compared reliability estimates of items assessed in different countries, hindering efforts to develop generalizable park audit items. Finally, new satellite imaging technology allows parks to be identified from the desktop; however, it remains unclear how this compares with park identification by direct observation. The purpose of this study is twofold: 1) to describe the development and reliability of a youth-oriented direct-observation park audit tool tested in Montreal, Canada, and 2) to compare reliability estimates of items drawn from a tool previously tested in Perth, Australia, with those of the same items tested in Montreal, Canada.
Items were drawn and adapted from two existing tools, and 13 new items were developed, for a total of 92 items. Parks were pre-identified using GIS software and then verified and audited on-site by observers. A total of 576 parks were evaluated. Cohen's kappa and percent agreement were used to assess the inter- and intra-rater reliability of each item. Inter-rater reliabilities of 17 items drawn from a tool previously tested in Australia were compared.
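To illustrate the two agreement statistics used, the sketch below computes percent agreement and Cohen's kappa for one binary audit item rated by two observers. The ratings are hypothetical and not drawn from the study data; this is a minimal implementation of the standard formulas, not the authors' analysis code.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of parks on which the two raters give the same rating."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)  # observed agreement
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Expected agreement if the two raters' marginal rating
    # distributions were independent
    categories = set(rater_a) | set(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of one item (e.g. "playground present": 1 = yes,
# 0 = no) across 10 parks
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]

print(percent_agreement(rater_a, rater_b))  # 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))  # 0.52
```

Note that kappa is always lower than raw percent agreement because it discounts the agreement expected by chance, which is why the study reports both statistics against separate thresholds (kappa ≥ 0.41 for moderate agreement, percent agreement ≥ 75 %).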
Eighty-six percent of items had ≥ 75 % agreement, and 83 % had kappa coefficients between 0.41 and 1. Among 40 test-retest episodes, kappa agreement was relatively high (≥ 0.40) for all but four items, and percent agreement was excellent (≥ 75 %) for all but eight items. Inter-rater reliability estimates of the 17 items tested in both Montreal and Perth were of similar magnitude.
The tool is generally reliable and can be used to assess park characteristics that may be associated with youth PA. The items tested in Montreal and Perth are likely generalizable to other urban environments.