The best way to reach underserved, minority, or disadvantaged residents in your city satisfaction surveys.

We understand that communities and cities want to understand the needs of their minority and disadvantaged residents and to be confident that those residents are represented in their city satisfaction surveys. Capturing their responses is essential.

It used to be very hard to reach these residents in survey research. We had to take special care and use different methods to capture their opinions.

Those days are gone.

The latest digital trends impact how we survey these residents

According to Pew Research data reported by Statista, smartphone penetration in 2021 was similar across ethnic and minority groups. Even older adults are adopting smartphones at an increasing rate: a separate Statista report shows that in 2021, 61% of people over the age of 65 were using smartphones – a figure growing by 7 to 8 percentage points a year. In addition, Pew Research reports that in 2021, 76% of those making less than $30,000 per year had smartphones – and this figure grows 4% to 5% per year.

These figures suggest that mobile surveys are not only a valid option for capturing opinions from a broader, more representative audience; they may be the best way.

Of course, there is some common sense to this as well. Who do you know who enjoys sitting down to take a paper survey? Who even does that anymore? Meanwhile, we are all on our phones – even these typically hard-to-reach residents.

In short, we need to go where they are and use the communication tools they use.

Unfortunately, many government agencies and city governments still use telephone and mail surveys, in the mistaken belief that these modes are required to reach hard-to-find respondents. In fact, these modes make it more difficult to capture a representative sample of hard-to-reach residents. And, of course, they come with a number of methodological problems that we won’t go into here.

Designing mobile surveys takes more care

No one wants to take a monotonous, boring survey (which describes most paper surveys). And we know our audience has many more distractions than it did just a few years ago. So designing online surveys the way we did a few years ago won’t work; we need to design them specifically for a smartphone.

Most online and even mobile surveys are still designed for a computer, not a smartphone. If we want smartphone users to take surveys, we must consider the layout and design the survey specifically for a smartphone.

Also, smartphone users tend to have much higher expectations for an easy, intuitive user interface. Long grids of 5-point scales do not make for an enjoyable survey, and many options exist beyond these batteries of scales. Besides, humans don’t make decisions on 5-point scales – they make choices. See our video on the 5 Biggest Problems with Scales.

A good smartphone design requires simplifying the user interface and avoiding overly complicated layouts. A good rule of thumb is to keep the survey brief, with clear questions that are genuinely engaging, understandable, and easy to answer.

Is your community still relying primarily on mail surveys? Are you relying on vendors who still execute their surveys the same way they did back in the 1990s? The good news is that using smartphones in combination with additional methods will provide better data, faster, and at a lower cost. What’s not to like?

Gold Standard Benchmarking For City Satisfaction Surveys

Benchmarks can be very useful within a City Satisfaction Survey. They are an excellent way to compare your city to similarly situated cities, to capture and track data over time, and to ensure your government functions in a way that meets the evolving demands of its residents.

What you do NOT want to do is simply use the “off-the-shelf” benchmarks provided by your research vendor.

Bad benchmarks mean bad data, and we must have good data to make representative public policy.

If your city is a college town, you should compare yourself to other college towns. If you are a larger Midwestern city, you should compare yourself to other larger Midwestern cities. You get the idea.

The problem is that most “benchmarking data” commingles all types of cities. It represents an “average” city that does not exist. If you are a small coastal town, do you really want to be compared to Phoenix? That is what you get with most off-the-shelf benchmarking results. To make matters worse, this data may be several years old.

Bad benchmarks can result in bad decisions. They can lead to the wrong performance indicators and create situations in which government officials are held to unreasonable and inaccurate standards.

How can you tell a good benchmark from a bad one?

Consider the following questions:

  • Is the comparison fair? Did you know that different ethnic groups respond differently to surveys? Hispanics, for example, tend to give higher scores. Is your city’s ethnic makeup similar to that of the benchmark cities? And how do you know? (See the short illustration after this list.)
  • When were the measurements taken? And how much of a role does time play in the accuracy of the measurement? For example, measurements taken before COVID differ wildly from today’s on many data points. After all, the ongoing pandemic altered countless aspects of our lives and governance. If your benchmarks don’t account for these changes, they’re worthless.
  • What are you measuring? What you want to ask may not be included in benchmarking studies. After all, is your city like all others?
  • How old are the benchmarking scores? Satisfaction scores tend to increase over time as every city tries to improve, so benchmarks built from an older set of data may not be valid today. Many firms use benchmarks built from cumulative data spanning five years.
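
To make the fairness point above concrete, here is a minimal, purely illustrative sketch. The group names, population shares, and scores are hypothetical (not real survey data); it simply shows how demographic mix alone can move an average satisfaction score and distort a raw benchmark comparison:

```python
# Purely illustrative sketch: hypothetical groups, shares, and scores.
# It shows how demographic mix alone can shift an average satisfaction score.

def overall_mean(group_means, group_shares):
    """Population-wide mean satisfaction from per-group means and shares."""
    return sum(group_means[g] * group_shares[g] for g in group_means)

# Assume residents in every group rate services identically in both places...
group_means = {"group_a": 4.2, "group_b": 3.8}      # hypothetical 5-point means

your_city_mix = {"group_a": 0.40, "group_b": 0.60}  # hypothetical population shares
benchmark_mix = {"group_a": 0.15, "group_b": 0.85}

print(round(overall_mean(group_means, your_city_mix), 2))  # 3.96
print(round(overall_mean(group_means, benchmark_mix), 2))  # 3.86

# ...yet the raw averages differ by about 0.10 simply because the two
# populations are mixed differently. A fair benchmark matches (or adjusts
# for) the demographic makeup before comparing scores.
```

In other words, a gap in the headline score can come entirely from who was surveyed rather than how well the city is performing – which is exactly why the makeup of the benchmark matters.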

Better Benchmarking

A much-improved benchmark is one that . . .

  • Uses the exact same questionnaire.
  • At the exact same time as your resident survey.
  • In very “like” markets – markets that are very similar to yours.

This is what we do at True North. We execute your City Satisfaction Survey in like markets at the exact same time. This data becomes your true apples-to-apples comparison point – the gold standard. From there, we track results, giving you robust answers that show how your data moves against the benchmarks over time.

And this process does not need to cost much more. Most of the study costs are already covered as fixed costs – questionnaire design, programming, survey methodology, and more.

This is just one more reason why True North has been named a “Most Trusted” Market Research Company. You can trust us to deliver true insights to you – not just data.