Comment by londons_explore 5 days ago

As soon as you get to ~99% accuracy, you probably don't need to go further.

If the customer is accidentally billed for an orange instead of a tangerine 1% of the time, the customer probably won't notice or care, and as long as the errors aren't biased in favour of the shop, regulators and the taxman probably won't care either.

With that in mind, I suspect Amazon Go wasn't profitable due to poor execution, not an inherently bad idea.

Slartie 5 days ago

Actually, discount grocers operate on razor-thin margins of 2-4%. If your inaccuracy is geared to the benefit of your customer (because otherwise you'll be out of business due to the regulatory bodies) and thus removes just one percentage point of that, you suddenly lose a quarter to half of your earnings! And that goes ON TOP of the additional cost incurred with all that computer vision tech.
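The arithmetic behind that claim can be sketched quickly. This is a hypothetical illustration with assumed numbers (the 2-4% margins from the comment above, and the simplifying assumption that billing errors give away a flat 1% of revenue), not real Amazon Go figures:

```python
# Hypothetical margin-erosion arithmetic. All numbers are assumptions
# taken from the comment above, not real retailer data.

revenue = 100.0                         # per $100 of sales
margin_low, margin_high = 0.02, 0.04    # typical discount-grocer net margins
error_rate = 0.01                       # 1% of revenue lost to under-billing

for margin in (margin_low, margin_high):
    profit_before = revenue * margin
    profit_after = profit_before - revenue * error_rate
    lost_share = (profit_before - profit_after) / profit_before
    print(f"margin {margin:.0%}: profit ${profit_before:.2f} -> "
          f"${profit_after:.2f} ({lost_share:.0%} of earnings lost)")
```

At a 2% margin the 1% giveaway halves profit; at 4% it takes a quarter, which is where the "quarter to half of your earnings" range comes from.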

In addition to that, you'll have the problem of inventory differences, which is often cited as an even bigger problem with store theft than the loss of the stolen goods themselves. If the inventory numbers on your books differ too much from the inventory actually on the shelves, all your replenishment processes will suffer, eventually causing out-of-stock situations and thus loss of revenue. You may be able to eventually counter that by estimating losses due to billing inaccuracies, but that's another complexity that's not going to be free to tackle, so the 1% inaccuracy is going to cost you money on the inventory-difference front, no matter what.

  • SilverBirch 5 days ago

    And to add to that, it's not a neutral environment. If there's 1% of scenarios that are incorrect, people will figure out they haven't been billed for something, figure out why, and then tell their friends. Before you know it every teenager is walking into Amazon Fresh standing on one foot, taking a bag of Doritos, hopping over to the Coca Cola stand, putting the Doritos down, spinning 3 times, picking it up again and walking out of store, safe in the knowledge that the AI system has annotated the entire event as a seagull getting into the shop.

davidst 5 days ago

I don't have insight into what ultimately transpired at Amazon Go so take the following as speculation on my part.

It is unlikely the tech would be frozen when an acceptable accuracy threshold is reached:

1. There is a strong incentive to reduce operational costs by simplifying the hardware infrastructure and improving the underlying vision tech while maintaining acceptable accuracy. You can save money if you can reduce the number and quality of cameras, eliminate additional signal assistance from other inputs (e.g., shelves with load cells), and generally simplify overall system complexity.

2. There is business pressure to add product types and fixtures which almost always result in new customer behaviors. I mentioned coffee in my prior post. Consider what it would mean to add support for open-top produce bins and the challenge of complex customer rummaging. It would take a lot of high-quality annotated data and probably some entirely new algorithms, as well.

Both of those require maintaining a well-staffed annotation team working continuously for an extended time. And those were just the first two things that came to mind. There are likely more reasons that aren't immediately apparent.