Eating an Elephant: Lost in the Supermarket
“I’m all lost in the supermarket, I can no longer shop happily, I came in here for a special offer, guaranteed personality…” The Clash
I briefly mentioned the supermarket accessibility problem in the first installment of the “Eating an Elephant” series but did so without describing the actual complexity of the issue. I have no solution to proffer and, to the best of my knowledge, no one is researching this problem. I hope that Will Pearson sends in a comment on the matter, as he is far more expert in aspects of this topic than I am.
At a glance, the confusion of a supermarket affects sighted people as well as those of us with a vision impairment. The stores have thousands of products sorted by their similarity to other products, with the exception of displays of items on sale and products receiving extra promotion. These categorized items are distributed into aisles which contain packages of differing size, shape, color and prominence based upon how high or low they sit on a shelf. The sighted person can grow overwhelmed at the sheer vastness of visual noise, the wide array of colors and the way marketing types invent packaging to mislead the consumer as to the size and/or shape of the contents.
The sighted person, although their attention might scramble a bit, can nonetheless see that aisle four contains condiments and walk to it. While in this section, they can also see that Wish Bone salad dressing is discounted and that Paul Newman’s is not and make the choice as to which they would prefer. They can also easily find the highly recognizable
The person with a profound to severe vision impairment, though, has an extremely different experience. As I described in part one of this series, the customer service people at the store assign us a human to help us with the shopping. These people vary in competence, from illiterate, to unable to speak a language I might understand even a little, to very helpful. Even the best shopping companion, though, will start with the question, “So, what do you want?” A well prepared blink will have printed out a shopping list; the rest of us disorganized types are left to the wilds of the shopping experience.
Often, the answer to “What do you want to get?” is, “Lots of stuff.” This means that our companion has no clue where to start and we can only begin by rattling off items we definitely know we need.
Now, let’s return to the condiment aisle example we used for our sighted friends. By way of oversimplification, we can imagine that each side of the aisle contains the same number of shelves and that each product has exactly the same amount of shelf space. For our simplified example, we can view each product and variation thereof as having its own cubicle. To keep the arithmetic simple, we’ll say that each side is five shelves high with 20 product cubicles per shelf. Thus, we have 100 products on each side - a massive simplification.
Like our sighted counterpart, we know we want salad dressing.
So the blind shopper is presented with 200 products and variations in the aisle and may actually want to buy four or five items from this set. How can our companion, or possibly some as-yet-uninvented bit of technology, provide us with enough but not too much information about the items in the aisle?
If our companion or technology simply tells us everything in the aisle, we will somehow need to try to hold 200 separate offerings in short term memory. This breaks the memory bank and the attention model all at once and such information overload can be discounted out of hand.
We can be told all of the categories of items in the aisle: salad dressing, hot sauce, ketchup, mayonnaise, pickles and peppers, mustard, etc. Again, we’ve a big list of items that have only a generic description and much of which we can recall from previous visits to the market. So, we’re now getting a combination of too much data plus redundant information and we still haven’t found our first item.
Like our sighted friend, we want some thousand island salad dressing. For this example we’ll say that I am especially fond of Paul Newman’s and don’t care about Wish Bone even if it is on sale. I can tell my companion to get me the dressing I want and disregard all competitors. If, however, I consider salad dressing generically, I may want the item on sale or even the Publix store brand to save a little money. In that case, I need to tell my companion to list off the various brands and their prices – a boring and time-consuming process that leads only to the selection of a single product.
The next item,
The last two examples, a random item on sale and an impulse purchase, provide the most complex of the problems. There are two hundred items in this aisle, n items have sale tags (where n is some value between zero and 200) and all 200 minus the salad dressing and
Now, we can multiply our 200 items in the condiments aisle by the 20 aisles in the store and we have an incredibly overwhelming number of data points. Remove the constraints I placed on the number of items per aisle and we have a very complex distribution of stuff we may need or want to buy.
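The back-of-the-envelope arithmetic above can be tallied in a few lines. All of the numbers here are the illustrative assumptions from this article’s simplified store model, not real store data:

```python
# Tally of the simplified store model described in the text.
# Every constant is an assumption made for illustration.

SHELVES_PER_SIDE = 5      # shelves stacked on each side of an aisle
CUBICLES_PER_SHELF = 20   # identical product "cubicles" per shelf
SIDES_PER_AISLE = 2
AISLES = 20

products_per_aisle = SHELVES_PER_SIDE * CUBICLES_PER_SHELF * SIDES_PER_AISLE
products_in_store = products_per_aisle * AISLES

print(products_per_aisle)  # 200
print(products_in_store)   # 4000
```

Four thousand data points in even this artificially tidy store, against a short-term memory that psychologists usually peg at a handful of items at a time.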
With a companion, reading everything or even every category blows past short term memory limits and any attention model I’ve ever seen described for human beings. How then can a human companion, far smarter than any technology that may be invented in the short term future, determine the balance between too much, too little and the Goldilocks amount of information the consumer with vision impairment needs and/or wants to hear?
Last week, as Susan, my lovely wife of 21 years, and I drove south from Cambridge back to our home in Florida, we pulled off at an exit in South Carolina which had fast food joints on all four corners. Susan made the executive decision that we would eat at McDonald’s; she did not tell me that we had choices nor, of course, did she tell me which choices we had. One of the others was a Wendy’s, a crappy fast food place that I prefer over McDonald’s. Susan made the assumption that fast food was generic and that I wouldn’t care or even have an opinion on which I may prefer, which, in this case, was a fallacious assumption. Susan has been married to a blink for 21 years and still hasn’t developed the knack of finding the proper middle ground level of information – how then can we expect a randomly assigned supermarket companion to have even the slightest clue what we do and do not want to hear?
The most frequently described technology possibility is based on RFID, a standard that has been due to replace UPC for a pretty long time. With something like an RFID wand, the blind consumer can hear the items that they are near. The user could turn such a device to “category” mode or “sale item” mode or any of a number of categories of information that can be held on the product’s RFID tag combined with augmentative data on the store’s Wi-Fi system. I still think this will provide too much information in a manner too complex to be truly useful, but it seems to be the best idea I’ve heard so far. Practically, though, getting every supermarket and product manufacturer to retool for such a system is probably not going to happen for a long time to come, if ever.
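To make the “mode” idea concrete, here is a minimal sketch of how such a wand might filter what it announces. The product records, field names and modes are all invented for illustration; no real RFID hardware or store API is assumed:

```python
# Hypothetical sketch: filtering nearby-product data by a wand "mode".
# All names and fields here are made up for illustration.

def items_to_announce(nearby_products, mode):
    """Return only what the shopper asked to hear for the given mode."""
    if mode == "category":
        # One entry per category, not per product, to limit overload.
        return sorted({p["category"] for p in nearby_products})
    if mode == "sale":
        return [p["name"] for p in nearby_products if p.get("on_sale")]
    # Default: announce everything, which, as argued above, is too much.
    return [p["name"] for p in nearby_products]

shelf = [
    {"name": "Paul Newman's Thousand Island", "category": "salad dressing", "on_sale": False},
    {"name": "Wish Bone Thousand Island", "category": "salad dressing", "on_sale": True},
    {"name": "Publix Ketchup", "category": "ketchup", "on_sale": False},
]

print(items_to_announce(shelf, "sale"))      # ['Wish Bone Thousand Island']
print(items_to_announce(shelf, "category"))  # ['ketchup', 'salad dressing']
```

Even in this toy version, the hard part is not the filtering but deciding which mode the shopper wants at any given moment, which is exactly the Goldilocks problem the human companion faces.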
What can we, as people with vision impairment, do to solve the supermarket problem in the time before someone invents and distributes a device that might solve it? The first suggestion is to shop online and have one’s groceries delivered. These online grocery services are not available in all parts of the US and, returning to the problem of the current screen reader UI paradigm of reading everything as a list, slogging through a web site with zounds of items will take a really long time, will not do much to solve the sale item problem and will do little or nothing to help with impulse or new product purchases. This, of course, has the benefit of saving one some money by putting up a wall to our potential impulses, but it also leaves out the ability to discover items we may really enjoy.
Do any BC readers have any suggestions? If so, please leave comments to further the discussion.