Comment by srndsnd 9 hours ago

To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

It is insane to me that I can be notified via physical mail of months old data breaches, some of which contained my Social Security number, and that my only recourse is to set credit freezes from multiple credit bureaus.

nimbius an hour ago

I think the only reason we're seeing this revelation from a federal agency after 20 years is to boost the government's case against TikTok.

bilekas 6 hours ago

> To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

As nice as this is on paper, it will never happen; lobbyists exist. Not to put on a tinfoil hat, but why would any lawmaker slap the hand that feeds them?

Until there is an independent governing body which is permitted to regulate the tech industry as a whole, it won't happen. Consider the FDA: it decides which drugs and ingredients are allowed, and that's all fine. There could be a regulating body which could determine the risk to people's mental health, for example, from 'features' of tech companies. But getting that body created will require a tragedy, like the one that led to the FDA's creation in the first place. [1]

That's just my 2 cents.

1 : https://www.fda.gov/about-fda/fda-history/milestones-us-food....

  • Aerroon 2 hours ago

    >There could be a regulating body which could determine the risk to people's mental health for example from 'features' of tech companies etc.

    I think ideas like this are why it's not going to happen.

    Our understanding of mental health is garbage. Psychiatry used to be full of quackery and very well still might be. Treatment for something like depression boils down to "let's try drugs in a random order until one works". It's a field where a coin flip rivals the accuracy of studies. Therefore any regulating body in that area will just be political. It will be all about the regulators "doing something" because somebody wrote enough articles (propaganda).

    Problems like this are why people aren't interested in supporting such endeavors.

trinsic2 2 hours ago

Sounds like a bunch of crap the industry is already trying to sell the public, and no, it's not working, and yes, we can do without it.

bhhaskin 2 hours ago

If your identity gets stolen, you should be able to sue all the companies that had a leak.

layer8 6 hours ago

I’m completely sympathetic to making companies more liable for data security. However, until data breaches regularly lead to severe outcomes for subjects whose personal data was leaked, and those outcomes can be causally linked to the breaches in an indisputable manner, it seems unlikely for such legislation to be passed.

  • wepple 5 hours ago

    I forgot where I saw this, but the US govt recently announced that they see mass PII theft as a legitimate national security issue.

    It’s not just that you or I will be inconvenienced with a bit more fraud or email spam, but rather that large nation state adversaries having huge volumes of data on the whole population can be a significant strategic advantage

    And so far we typically see email+password+ssn be the worst data leaked; I expect attackers will put in more effort to get better data where possible. Images, messages, gps locations, etc

    • dantheman 3 hours ago
      • wepple 2 hours ago

        Very aware of that. That to me seemed like a targeted attack by a tracked APT group. What I’m referring to above is that the more vanilla attacks (ex: popular online mattress store gets popped) actually have national security implications, despite seeming like just an inconvenience

    • kragen 3 hours ago

      yes, privacy is not an individual problem; it's a civil defense problem, and not just when your opponent is a nation-state. we already saw this in 02015 during the daesh capture of mosul; here's the entry from my bookmarks file:

      https://www.facebook.com/dwight.crow/media_set?set=a.1010475... “#Weaponry and morale determine outcomes. The 2nd largest city of Iraq (Mosul) fell when 1k ISIS fighters attacked “60k” Iraqi army. 40k soldiers were artifacts of embezzlement, and of 20k real only 1.5k fought - these mostly the AK47 armed local police. An AK47 loses to a 12.7mm machine gun and armored suicide vehicle bombs. Finally, the attack was personal - soldiers received calls mid-fight threatening relatives by name and address. One army captain did not leave quickly enough and had two teenage sons executed.” #violence #Iraq #daesh

      of course the americans used this kind of personalized approach extensively in afghanistan, and the israelis are using it today in lebanon and gaza, and while it hasn't been as successful as they hoped in gaza, hamas doesn't exactly seem to be winning either. it's an asymmetric weapon which will cripple "developed" countries with their extensive databases of personal information

      why would a politician go to war in the first place if the adversary has the photos and imeis of their spouse, siblings, and children, so they have a good chance of knowing where they are at all times, and the politician can't hope to protect them all from targeted assassination?

      the policy changes needed to defend against this kind of attack are far too extreme to be politically viable. they need to be effective at preventing the mere existence of databases like facebook's social graph and 'the work number', even in the hands of the government. many more digital pearl harbors like the one we saw this week in lebanon will therefore ensue; countries with facebook, credit bureaus, and national identity cards are inevitably defenseless

      imposing liability on companies whose data is stolen is a completely ineffective measure. first, there's no point in punishing people for things they can't prevent; databases are going to get stolen if they're in a computer. second, the damage done even at a personal level can vastly exceed the recoverable assets of the company that accumulated the database. third, if a company's database leaking got your government overthrown by the zetas or daesh, what court are you going to sue the company in? one operated by the new government?

      • treypitt 2 hours ago

        Are you saying you think more critical government databases than OPM or security clearance rosters are inevitably going to be breached? I'd like to think the government or corporation can effectively protect some databases at least...

        • kragen an hour ago

          those are already pretty bad, but i think the really dangerous ones are things like verizon's billing records and customer location history, credit card transaction histories, license plate registrations, credit bureau histories, passport biometrics, enough voice recordings from each person for a deepfake, public twitter postings, etc.

          consider https://en.wikipedia.org/wiki/1943_bombing_of_the_Amsterdam_...:

          > The 1943 bombing of the Amsterdam civil registry office was an attempt by members of the Dutch resistance to destroy the Amsterdam civil registry (bevolkingsregister), in order to prevent the German occupiers from identifying Jews and others marked for persecution, arrest or forced labour. The March 1943 assault was only partially successful, and led to the execution of 12 participants. Nevertheless, the action likely saved many Jews from arrest and deportation to Nazi extermination camps.

          to avoid partisan debate, imagine a neo-nazi group takes over the us, which presumably we can all agree would be very bad. after they took over, how hard would it be for them to find all the jews? not just make a list of them, but physically find them? (much easier than it was in 01943, i'm sure we can agree.) how hard would it be for them to find all the outspoken anti-fascists? where could those anti-fascists hide?

          now, step it up a notch. how hard would it be for them to find all the jews before they take over? it wouldn't be that hard if the databases leak. and if you feel safe because you're not jewish, rest assured that neo-nazis aren't the only groups who are willing to use violence for political ends. someone out there wants you dead simply because of the demographic groups you belong to. the reason you haven't been seeing widespread political violence previously is that it hasn't been a winning strategy

          the situation is changing very fast

  • deegles 2 hours ago

    Nearly everyone's data has been leaked already. Any strong protections would only protect people who haven't been born yet imo.

  • EasyMark 3 hours ago

    They’d need a lot less security if they stopped spying on us and saving all of our most critical ID data, period.

  • Onavo 5 hours ago

    Then instead of regulating the companies, make SSNs easily revocable and unique per service. I don't understand why Americans are so opposed to a national ID despite the fact that every KYC service uses SSNs and driver's licenses.

    • candiddevmike 5 hours ago

      Because they're the mark of the beast or a step towards fascism or something.

      I don't think it would take much to convert Real IDs into a national ID; they are as close as they can get without "freaking people out".

      • Nevermark 2 hours ago

        Emphasizing that the number can be changed would really help there.

        People could even generate their own number (a private key) which they never give out, which appears differently to each account manager verifying it, and which they could still replace.

        When you choose your own number, it's only the Mark of the Beast if you are the Beast! * **

        * 666, 13, 69 and 5318008 expressly prohibited.

        ** Our offices only provide temporary tattoos.
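The "appears differently to each account manager" idea above can be sketched with standard cryptographic tooling: derive a distinct token per service from one private master secret, so a breach at one service leaks nothing linkable to any other, and "revoking" the number is just rotating the secret. A minimal illustration in Python (the secret value and service names here are made up for the example):

```python
import hashlib
import hmac

def service_scoped_id(master_secret: bytes, service_name: str) -> str:
    """Derive a per-service identifier from a private master secret.

    The person keeps master_secret private; each service sees only its
    own token, which is unlinkable to the tokens other services see.
    """
    digest = hmac.new(master_secret, service_name.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability

# Hypothetical services: the same person presents a different ID to each.
secret = b"some-randomly-generated-master-secret"
id_bank = service_scoped_id(secret, "example-bank.com")
id_telco = service_scoped_id(secret, "example-telco.com")
assert id_bank != id_telco  # a leak at one service reveals nothing about the other
```

A real scheme would also need key storage, recovery, and some way to vouch for the binding between person and secret, but the unlinkability property is the point here.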

    • mapt 5 hours ago

      The expansion of KYC and the hegemonic dominance of our global financial intelligence network is a recent infringement on our privacy that would not necessarily pass popular muster if it became well-known.

      Most of our population is still living in a headspace where transactions are effectively private and untraceable, from the cash era, and has not considered all the ways that the end of this system makes them potential prey.

      The fact is that the market is demanding a way to identify you both publicly and privately, and it will use whatever it needs to, including something fragile like a telephone number 2fa where you have no recourse when something goes wrong. It's already got a covert file on you a mile long, far more detailed than anything the intelligence agencies have bothered putting together. The political manifestation of anti-ID libertarians is wildly off base.

  • mapt 5 hours ago

    "What fraction of the FBI and CIA do the Communists have blackmail material on?"

arminiusreturns 7 hours ago

I agree. Let me tell you what just happened to me. After a very public burnout and spiral, a friend rescued me and I took a part-time gig helping a credit card processing company. About 2 months ago, the owner needed something done while I was out, and got their Uber driver to send an email. They emailed the entire customer database, including bank accounts, Social Security numbers, names, addresses, and finance data, to a single customer. When I found out (it was kept hidden from me for 11 days), I said "This is a big deal; here are all the remediations, and besides PCI we have 45 days by law to notify affected customers." The owner said "we aren't going to do that", and thus I had to turn in my resignation and am now unemployed again.

So, for trying to do the right thing, I am now scrambling for work, while the offender pretends nothing happened, has potentially compromised the entire customer base, and will likely suffer no penalty unless I report it to PCI, for which I would get no reward.

Why is it that everywhere I go, management is always doing shady stuff? I just want to do Linuxy/datacentery things for someone who's honest... /cry

My mega side project isn't close enough to do a premature launch yet. Despite my entire plan being to forgo VC/investors, I'm now considering compromising.

  • aftbit 7 hours ago

    >Why is it everywhere I go management is always doing shady stuff.

    Well here's a cynical take on this - management is playing the business game at a higher level than you. "Shady stuff" is the natural outcome of profit motivation. Our society is fundamentally corrupt. It is designed to use the power of coercive force to protect the rights and possessions of the rich against the threat of violence by the poor. The only way to engage with it AND keep your hands clean is to be in a position that lets you blind yourself to the problem. At the end of the day, we are all still complicit in enabling slave labor and are beneficiaries of policies that harm the poor and our environment in order to enrich our lives.

    >unless I report it to PCI, which I would get no reward for.

    You may be looking at that backwards. Unless you report it to PCI, you are still complicit in the mishandling of the breach, even though you resigned. You might have been better off reporting it over the owner's objections, then claiming whistleblower protections if they tried to terminate you.

    This is not legal advice, I am not a lawyer, I am not your lawyer, etc.

    • arminiusreturns 7 hours ago

      I did verify with an attorney that, since I wasn't involved and made sure the owner knew what was what, I had no legal obligation to disclose.

    • positus 7 hours ago

      The problem isn't society or profit motivation. It's people. Humanity itself is corrupt. There aren't "good people" and "bad people". There's only "bad people." We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

      • ragnese 6 hours ago

        > We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

        If the GP's story is true (and I have no reason to suspect otherwise), then there are clearly differences in the degree of "badness" between people. GP chose to resign from his job, while his manager chose to be negligent and dishonest.

        So, even if we're all bad people, there are less bad and more bad people, so we might as well call the less bad end of the spectrum "good". Thus, there are good and bad people.

  • ValentinA23 7 hours ago

    The DOJ has just launched a corporate whistleblower program; you should look into it, as it may cover your case:

    https://www.justice.gov/criminal/criminal-division-corporate...

    >As described in more detail in the program guidance, the information must relate to one of the following areas: (1) certain crimes involving financial institutions, from traditional banks to cryptocurrency businesses; (2) foreign corruption involving misconduct by companies; (3) domestic corruption involving misconduct by companies; or (4) health care fraud schemes involving private insurance plans.

    >If the information a whistleblower submits results in a successful prosecution that includes criminal or civil forfeiture, the whistleblower may be eligible to receive an award of a percentage of the forfeited assets, depending on considerations set out in the program guidance. If you have information to report, please fill out the intake form below and submit your information via CorporateWhistleblower@usdoj.gov. Submissions are confidential to the fullest extent of the law.

  • TinyRick 7 hours ago

    Why would you resign? You could have reported it yourself and then you would have whistleblower protections - if the company retaliated against you (e.g. fired you), you then would have had a strong lawsuit.

    • arminiusreturns 7 hours ago

      Because I don't want to be associated with companies that knowingly break the law and violate regulations. I've long had a reputation for integrity, and, having almost nothing else, it's one of the few things I have left.

      • TinyRick 7 hours ago

        So you would rather be known as someone who had an opportunity to report a violation and chose not to? From my perspective, it seems like you decided against acting with integrity in this situation: the moral thing would have been to report the violation, but you chose to look the other way and resign.

  • mikeodds 7 hours ago

    As in.. his actual Uber driver? He just handed his laptop over?

    • arminiusreturns 7 hours ago

      Yes. The owner is old, and going blind, but refuses to sell or hand over day to day ops to someone else, and thus must ask for help on almost everything. I even pulled on my network to find a big processor with a good reputation to buy the company, but after constant delays and excuses for not engaging with them, I realized to the owner the business is both their "baby" and their social life, neither of which they want to lose.

alsetmusic 9 hours ago

Regulation is key, but I don’t see it as likely when our society is poisoned by culture-war BS. Once we put that behind us (currently unlikely), we can pass sane laws reining in huge corporations.

OkeyDokey2 7 hours ago

[flagged]

  • dylan604 7 hours ago

    This does nothing for them being able to continue with shadow profiles and inferences about you based on data they gather from others in your social network. It is well beyond "data you provide". Like waaaaay beyond.

2OEH8eoCRo0 7 hours ago

I get a feeling that liability is the missing piece in a lot of these issues. Section 230? Liability. Protection of personal data? Liability. Minors viewing porn? Liability.

Lack of liability is screwing up the incentive structure.

  • brookst 7 hours ago

    I think I agree, but people will have very different views on where liability should fall, and on whether there should be a malicious / negligent / no-fault model.

    Section 230? Is it the platform or the originating user that's liable?

    Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

    Minors viewing porn? Is it the parents, the ISP, the distributor, or the creator that's liable?

    I'm not here to argue specific answers, just saying that everyone will agree liability would fix this, and few will agree on who should be liable for what.

    • TheOtherHobbes 7 hours ago

      It's not a solvable problem. Like most tech problems it's political, not technical. There is no way to balance the competing demands of privacy, security, legality, and corporate overreach.

      It might be solvable with some kind of ID escrow, where an independent international agency managed ID as a not-for-profit service. Users would have a unique biometrically-tagged ID, ID confirmation would be handled by the agency, ID and user behaviour tracking would be disallowed by default and only allowed under strictly monitored conditions, and law enforcement requests would go through strict vetting.

      It's not hard to see why that will never happen in today's world.

      • malfist 6 hours ago

        > It's not a solvable problem

        Lawnmower manufacturers said the same thing about making safe lawnmowers. Until government regulations forced them to

    • StanislavPetrov 6 hours ago

      >Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

      There absolutely should be, especially for personal data collected and stored without the express written consent of those being surveilled. They should have to get people to sign off on the risks of having their personal data collected and stored, be legally prevented from collecting and storing the personal data of people who haven't consented and/or be liable for any leaking or unlawful sharing/selling of this data.

zeroonetwothree 9 hours ago

If you aren’t directly harmed yet what liability would they have? I imagine if your identity is stolen and it can be tied to a breach then they would already be liable.

  • kibwen 8 hours ago

    The fact that my data can be stolen in the first place is already outrageous, because I neither consented to allowing these companies to have my data, nor benefit from them having my data.

    It's like if you go to an AirBNB and the owner sneaks in at night and takes photos of you sleeping naked and keeps those photos in a folder on his bookshelf. Would you be okay with that? If you're not directly harmed, what liability would they have?

    Personal data should be radioactive. Any company retaining it better have a damn good reason, and if not then their company should be burned to the ground and the owners clapped in irons. And before anyone asks, "personalized advertisements" is not a good reason.

    • ryandrake 8 hours ago

      That's the big problem with relying on tort law to curb this kind of bad corporate behavior: The plaintiff has to show actual injury or harm. This kind of bad behavior should be criminal, and the state should be going after companies.

    • lesuorac 8 hours ago

      I don't think thats a proper parallel.

      I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key. Later on (perhaps many years later), You are robbed. Does Person have liability for the robbery?

      Of course it also gets really muddy, because you'll have been renting the house out for those years, and during that time many people will have lost keys. So does liability get divided? Is it the most recent lost key?

      Personally, I think it should just be some statutory damages of probably a very small amount per piece of data.

      • pixl97 7 hours ago

        The particular problem comes in because the amount of data lost tends to be massive when these breaches occur.

        It's kind of like the idea of robbing a minute from someone's life: it's not very much to an individual, but across a large population it's a massive theft.

        • lesuorac 7 hours ago

          Sure and if you pay a statutory fine times 10 million then it becomes a big deal and therefore companies would be incentivized to protect it better the larger they get.

          Right now they probably get some near free rate to offer you credit monitoring and dgaf.
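The scaling argument above is easy to make concrete. Assuming a hypothetical statutory rate of $5 per leaked record (the figure is illustrative, not from any actual statute), the fine is trivial for a small leak and company-threatening at mass-breach scale:

```python
PER_RECORD_FINE = 5.00  # hypothetical dollars per leaked record

def breach_fine(records: int, per_record: float = PER_RECORD_FINE) -> float:
    """Statutory fine that scales linearly with the number of leaked records."""
    return records * per_record

# A 10k-record leak costs $50k; a 10M-record breach costs $50M.
for records in (10_000, 1_000_000, 10_000_000):
    print(f"{records:>12,} records -> ${breach_fine(records):>14,.2f}")
```

The linear schedule is the simplest possible design; real proposals might scale the rate by data sensitivity, but even a flat per-record amount changes the incentive to hoard data as a company grows.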

      • 8note 7 hours ago

        This version loses multiple important things:

        1. I have no control over what was stored.
        2. I have no control over where the storage is.

        The liability in this case is the homeowner/host, as you should have and had full ability to change out the locks.

        To make it more similar, I think you'd need one of the guests to have taken some amount of art off the wall, and brought it to a storage unit, and then the art later was stolen from the storage unit, and you don't have access to the storage unit.

        It's not as good as the naked pictures example because what's been taken is copies of something sensitive, not the whole thing

      • polygamous_bat 7 hours ago

        > I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key.

        This is not a direct analogue; a closer analogy would be when the guest creates a copy of the key (why?) without my direct consent (signing a 2138-page "user agreement" doesn't count) and, at some later point when I am no longer renting to them, loses the key.

        • lesuorac 7 hours ago

          I'm still much more interested in the answer to who is liable for the robbery.

          Just the Robber? Or are any of the key-copiers (instead of losers w/e) also?

    • JumpCrisscross 7 hours ago

      > before anyone asks, "personalized advertisements" is not a good reason

      The good reason is growth. Our AI sector is based on, in large part, the fruits of these data. Maybe it's all baloney, I don't know. But those are jobs, investment and taxes that e.g. Europe has skipped out on that America and China are capitalising on.

      My point, by the way, isn't pro surveillance. I enjoy my privacy. But blanket labelling personal data as radioactive doesn't seem to have any benefit to it outside emotional comfort. Instead, we need to do a better job of specifying which data are harmful to accumulate and why. SSNs are obviously not an issue. Data that can be used to target e.g. election misinformation are.

      • thfuran 5 hours ago

        So you're saying it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

        • JumpCrisscross 3 hours ago

          > it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

          No, I'm saying it's a commons with a benefit to utilisation. A lot of discussions around data involve zealots on both sides. (One side claims a god-given right to harvest everyone's personal information; the other acts like it's the crime of the century for their email address to be leaked.)

    • pc86 8 hours ago

      I mean, it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them. It's not a good analogy, so if you want to convince people like the GP of the points you're making, you need to make a better case, because that is not how the law is currently structured. "I don't like ads" is not a good reason, and comments like this that are seething with rage and hyperbole don't convince anyone of anything.

      • drawkward 8 hours ago

        What is the harm? It is not obvious to me, if the victim is unaware...unless you are alleging simply that there is some ill-defined right to privacy. But if that is so, why does it apply to my crotch and not my personal data?

      • JumpCrisscross 7 hours ago

        > it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them

        Sure. In those cases, there are damages and that creates liability. I'm not sure what damages I've ever faced from any leak of e.g. my SSN.

        • pixl97 7 hours ago

          I mean, most people won't until the day they find out there's a house in Idaho under their name (and yes, I've seen just this happen).

          The problem here is that, because of all these little data leaks, you as an individual now bear the cost of ensuring that others out there are not using your identity, and if it happens, you have to clean up the mess by proving it wasn't you in the first place.

    • ranger_danger 7 hours ago

      >I neither consented to allowing these companies to have my data, nor benefit from them having my data.

      I think both of those are debatable.

  • halJordan 7 hours ago

    This is the traditional way of thinking, and a good question, but it is not the only way.

    An able-bodied person can file complaints against any business that fails its Americans with Disabilities Act obligations. In fact, these complaints by able-bodied do-gooders are the de facto enforcement mechanism, even though these people can never suffer damage from that failure.

    The answer is simply to legislate the liability into existence.

  • idle_zealot 8 hours ago

    That's the whole problem with "liability", isn't it? If the harms you do are diffuse enough then nobody can sue you!

  • squeaky-clean 7 hours ago

    The same way you can get ticketed for speeding in your car despite not actually hitting anyone or anything.

  • bunderbunder 8 hours ago

    This is exactly why thinking of it in terms of individual cases of actual harm, as Americans have been conditioned to do by default, is precisely the wrong way to think about it. We're all familiar with the phrase "an ounce of prevention is worth a pound of cure", right?

    It's better to think of it in terms of prevention. This fits into a category of things where we know they create a disproportionate risk of harm, and we therefore decide that the behavior just shouldn't be allowed in the first place. This is why there are building codes that don't allow certain ways of doing the plumbing that tend to lead to increased risk of raw sewage flowing into living spaces. The point isn't to punish people for getting poop water all over someone's nice clean carpet; the point is to keep the poop water from soaking the carpet in the first place.

    • supertrope 7 hours ago

      Safety rules are written in blood. After a disaster there’s a push to regulate. After enough years we only see the costs of the rules and not the prevented injuries and damage. The safety regulations are then considered annoying and burdensome to businesses. Rules are repealed or left unenforced. There is another disaster…

      • bunderbunder 6 hours ago

        Tangentially, there was an internet kerfuffle a while back about someone getting in trouble for having flower planters hanging out the window of their Manhattan high-rise apartment, and people's responses really struck me.

        People from less dense areas generally saw this as draconian nanny state absurdity. People who had spent time living in dense urban areas with high rise residential buildings, on the other hand, were more likely to think, "Yeah, duh, this rule makes perfect sense."

        Similarly, I've noticed that my fellow data scientists are MUCH less likely to have social media accounts. I'd like to think it's because we are more likely to understand the kinds of harm that are possible with this kind of data collection, and just how irreparable that harm can be.

        Perhaps Americans are less likely to support Europe-style privacy rules than Europeans are because Americans are less likely than Europeans to know people who saw first-hand some of what was happening in Europe in the 20th century.