“The Global City Indicators Facility provides an established set of city indicators with a globally standardized methodology that allows for global comparability of city performance and knowledge sharing.”
The Global City Indicators Facility (GCIF) is a web-based relational database hosted at the Faculty of Architecture, Landscape & Design of the University of Toronto in Canada. Its executive committees include both officials (some of them elected politicians) from cities of various sizes and regions and representatives of international organizations such as UN-Habitat and ICLEI.
The indicator themes are organized in two main categories:
- City services: education, finance, recreation, governance, energy, transportation, wastewater, fire and emergency response, health, safety, solid waste, urban planning, water.
- Quality of life: civic engagement, economy, shelter, culture, environment, social equity, technology and innovation.
Under these themes a total of 115 indicators are listed, differentiated into basic provision indicators, supporting indicators, and profile indicators. The GCIF describes the development of its indicators as a “rigorous screening process”, demands regular reporting by participating member cities, and underscores a specific set of criteria for the indicators. Among these criteria are important aspects such as the cost-effectiveness of collecting the necessary data and the limited complexity of the indicators. This might sound like a weakness of the indicators, but city practitioners may well confirm that data sets and lists of performance criteria pose particular barriers to local governments and administrations if they are hard to understand or difficult to assess. This becomes especially relevant in cities in developing countries, where local administrations are rather poorly equipped, but also in cities of industrialized nations in the current economic and financial crisis, where budgets are recklessly cut back.
In contrast to indicator development by international organizations or research institutions, which may not seek the input of the parties concerned, the GCIF discusses current and prospective indicators with member cities in order to address their needs (concerning performance comparability) and interests (concerning knowledge sharing).
Global City Indicators Facility Member Cities
If one scans the list of new, currently discussed indicators, the political nature of ‘indicators’ or criteria in city performance comparison becomes clear. At the same time, the GCIF could be seen as a reaction or alternative to “Creative City Indicators” and their human representative, Richard Florida, Director of the Martin Prosperity Institute and Professor of Business and Creativity at the Rotman School of Management, both also based at the University of Toronto. The GCIF discusses, for instance, the indicator of “annual HIV/AIDS death rate per 100,000 population”. There is a lot of politics in such an indicator (cf., for instance, the developments in Iran’s drug policy and the issue of data availability: 1, 2). Nevertheless, such an indicator also shows that the GCIF does not shy away from rather critical issues in city performance, thereby going beyond purely economic measures. The GCIF does include a creativity index, but at the same time it qualifies broad terms such as ‘green space’, ‘livable community’, or ‘gender equity’ by monitoring, among others, the following indicators:
- Percentage of the city’s solid waste that is disposed of in an open dump
- Square meters of public indoor recreation space per capita
- Transportation fatalities per 100,000 population
- Percentage of women employed in the city government workforce
Without any doubt, this set of indicators needs to be developed further and other important indicators need to be added. Still, the GCIF can convince critics with its approach: member cities do not simply apply criteria to their performance, but differentiate between certain types of indicators and compare their performance on these indicators only with a particular group of other cities that share, for instance, similar profiles. Such a targeted comparison is much more useful to public officials than a broad-brush one.
The GCIF describes its indicators as telling a story about a city when read as a whole. I would endorse this understanding by arguing that a city performance profile on the GCIF’s website, together with the city’s Wikipedia article, will in most cases not contradict what citizens are experiencing in their city, especially if they have lived there for some time.

Now, one could ask: Why should cities compare their “performances”? And why should a city “perform” at all? To keep a strong political-ideological stance out of the discussion, I would say that “city performance” stands for providing services to citizens and visitors and presenting these everyday activities (indirectly) to an audience (as part of city branding?). A comparison does not necessarily need to be understood as a neoliberal agenda of urban competitiveness. It shows cities where similar cities stand and what weaknesses they face; in many cases, similar cities will face similar problems and can discuss them when formulating and developing indicators at the GCIF or when meeting in similar institutions. Such a comparison is also a good starting point for addressing the most important flaws in a city government’s current activities. Thus, one essential question remains: What do city officials make of these data?