Comment by BrenBarn a day ago

This is a nice explication of the sequence of events, but I think it's too lenient on the bad actors who've led us to where we are now. It says that "each step could be defended". Yes, but many of those steps can be defended only from a "fuck you, got mine" standpoint of grabbing as much as possible while not caring who gets screwed in the process.

> Software could finally be updated after it shipped. Bugs could be fixed. Security holes could be closed.

They could be. Is that what this technology was used for?

> Crash reports made it easier to fix real problems, update checks were convenient, and license activation reduced some kinds of piracy.

I am skeptical that automatically sending crash reports is worth the harm to users. A program can create crash reports and save them locally. Then, if there is a crash or bug, a support team can walk the user through finding and sending just the relevant reports. There is no reason to automatically send anything.
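This isn't even hard to build. Here's a minimal sketch of that local-only approach (Python; the directory name and handler are hypothetical, not anything from the article):

```python
# Minimal sketch of a local-only crash reporter: reports are written to
# disk and never transmitted. The ~/.myapp/crash-reports path is made up.
import sys
import traceback
from datetime import datetime
from pathlib import Path

CRASH_DIR = Path.home() / ".myapp" / "crash-reports"  # hypothetical location

def save_crash_report(exc_type, exc_value, exc_tb):
    """Write the traceback to a timestamped local file and tell the user where it is."""
    CRASH_DIR.mkdir(parents=True, exist_ok=True)
    report = CRASH_DIR / f"crash-{datetime.now():%Y%m%d-%H%M%S}.txt"
    report.write_text("".join(traceback.format_exception(exc_type, exc_value, exc_tb)))
    print(f"A crash report was saved to {report}.", file=sys.stderr)
    print("If you contact support, they can tell you which file to send.", file=sys.stderr)

sys.excepthook = save_crash_report  # invoked on any unhandled exception
```

The user stays in control: nothing leaves the machine unless they choose to send a file.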

> “Can we understand how people actually use this?”

> Again, that’s not an evil thought. In fact, it’s useful!

Evil and useful are not mutually exclusive. Whether it's evil depends on what it's useful for.

> Before analytics, if you wanted to understand user behavior, you had to ask people, watch them, or infer patterns from support tickets. That requires time, empathy, and effort.

If you are making a choice out of a desire to avoid spending time, effort and (especially) empathy, you might be doing something evil (even if it's useful to you).

> When experimentation becomes the primary decision-making tool, a strong product vision becomes optional.

> Not because anyone argues against vision, but because you don’t strictly need it anymore, and because backing a chart is safer than backing an opinion.

This sidesteps the issue of why either the vision or the chart is being backed. If you're backing it just to make your company more money, it's probably evil, whether it's a chart or a vision.

> Some categories are basically made of alerts: messaging, security, banking, calendars, delivery tracking, anything where timing actually matters.

Banking is not made of alerts. Delivery tracking is not made of alerts. Alerts may have valid uses in those contexts, but they're not the main event.

Delivery tracking, I think, is a good example of how notifications can be misused. People got deliveries all the time before push notifications. Most of the time you simply don't need to know what the notifications are telling you. What good does it do you to know that the package left Las Vegas and is now en route to San Bernardino? What good does it do you to know that the package was delivered at 3:47 if you won't be off work until 5pm anyway? When you get home, it'll either be there or it won't.

> The problem is that once a company builds the machinery to do it, that machinery becomes cheap to reuse, and the incentives gradually pull it away from “help the user succeed” toward “move the metric.”

That is evil if that metric is "help the company regardless of whether it helps the user". That is the issue here. The article consistently dances around the central issue, which is the underlying motives driving these actions. Printing words on a sheet of paper and posting it in the town square was an evil use of technology when done by a 19th-century charlatan to enrich himself by enticing saps into buying useless snake oil. Ruthlessly using any technology to pursue every possible gain regardless of the effect on others is unethical.

> Here are a few practical ways out.

Who are these directed at? Programmers? The article already says programmers hate doing this stuff. Bosses? Venture capitalists? We've already seen that they don't care. None of this is going to change unless these recommendations are aimed at the creation of normative guidelines to be enforced by law.

I really do appreciate the article and have saved it, because it does a great job of laying out how the choices were made. But I am so tired of people making excuses for evil behavior on the basis that "it's just technology" or "well, they were just trying to improve their product". Every company that did these evil things could have just settled for 2% growth instead of 2.5% and our world would be the better for it; and our world will be better still in another 30 years if we now enact punitive measures against those who continue to do these things.