mindslight 2 days ago

I had assumed you were coming from a similar position, and your argument was more of a reductio-ad-absurdum.

But if you're not - the fact it's putting a chilling effect on this activity right here is a problem.

Another big problem is the complete inequity. It takes the digital equivalent of hopping over a fence and turns it into a serious federal felony with persecutors looking to make an example of the witch who can do scary things (from the perspective of suits).

Another glaring problem is that if the types of boundaries it creates are noble, then why does it leave individuals powerless to enforce such boundaries against corpos, being easily destroyed by clickwrap licenses and unequal enforcement? Any surveillance bugs/backdoors on a car I own are fundamentally unauthorized access, and yet I/we are powerless to use this law to press the issue.

  • monerozcash 2 days ago

    >I had assumed you were coming from a similar position, and your argument was more of a reductio-ad-absurdum.

    >But if you're not - the fact it's putting a chilling effect on this activity right here is a problem.

    I've personally had my own CFAA-related criminal troubles in the distant past, but I still have a hard time seeing the big problems with CFAA so often touted on HN.

    The activity of childish vandalism by flooding Mazda servers with garbage data? There's no chilling effect on simply not sending any data to Mazda.

    The activity proposed earlier was explicitly malicious in intent; why shouldn't there be a chilling effect on it? Do you not think the government should generally protect you from people taking explicitly malicious actions aimed at causing you harm?

    In this context, it is the motivation that makes the crime. You could absolutely modify your car so that the data sent to Mazda is replaced with zeroes or random data, but you would need to do so in good faith.

    Of course, when the activity is explicitly malicious as stated above ("poison their databases and statistics with fake data") it's not surprising that you'd be in violation of the law.

    >Another big problem is the complete inequity. It takes the digital equivalent of hopping over a fence and turns it into a serious federal felony with persecutors looking to make an example of the witch who can do scary things (from the perspective of suits).

    I just don't think this is actually happening. The cases often spoken of here are Auernheimer and Swartz.

    I have a hard time believing that anyone can read the court files in the Auernheimer case and argue in good faith that such behavior should be legal. Among other things, the court papers contain a chat log of the co-conspirators discussing how they should use the data they'd scraped from the buggy AT&T site to spam AT&T customers with malware. In the end that was too complicated, so they settled on leaking the data in the most damaging way possible to hurt AT&T share prices.

    Swartz performed an admirable act of civil disobedience and faced up to 6 months in prison for it (realistically, he'd most likely never have spent a day in prison). I think what Swartz did is admirable, but that doesn't mean what he did shouldn't have been illegal. Just as what Snowden did was admirable, but legalizing such activities would have catastrophic consequences.

    >Another glaring problem is that if the types of boundaries it creates are noble, then why does it leave individuals powerless to enforce such boundaries against corpos, being easily destroyed by clickwrap licenses and unequal enforcement?

    I feel like this is conflating the problems that CFAA seeks to address with a completely different set of problems.

    Corporations are bound by the CFAA just as much as you are; it's just that companies are rarely in the business of committing this sort of crime. Just as companies are rarely in the business of selling heroin.

    > Any surveillance bugs/backdoors on a car I own are fundamentally unauthorized access, and yet I/we are powerless to use this law to press the issue.

    The fact that the CFAA mostly does not address these particular issues is not a problem with the CFAA; people (or companies!) buying devices with software they don't like was never something the CFAA was intended to address.

    There are reasonable, effective legal solutions to surveillance like this, like the GDPR.

    • mindslight 20 hours ago

      I'm not really looking to litigate the larger point with you. I had really thought you were coming from a place of not liking the CFAA but interpreting it as harshly as possible (especially with that username!)

      In both this argument and our previous one, you've focused solely on intent to the exclusion of analyzing actual actions. You then attribute malevolence to the intent of the individuals acting, while giving a pass to the company (in this case Mazda) that is also operating with malicious/adversarial intent. You're missing that criminality also revolves around specific actions - in this case, unauthorized access.

      > people (or companies!) buying devices with software they don't like was never something CFAA was intended to address.

      It most certainly addresses this. If I loaded up a PC with a remote access trojan, sold it on the used market, and then spied on the buyer, I would be looking at a CFAA prosecution. This is exactly what companies are doing with embedded spyware, yet it goes unprosecuted.