Comment by adrian_b a day ago

SHA-1 is broken for use in digital signature algorithms, or in any other application that requires collision resistance.

There are a lot of applications for which collision resistance is irrelevant and for which the use of SHA-1 is fine, for instance in some random number generators.

On the CPUs where I have tested this (with hardware instructions for both hashes, e.g. some Ryzen and some Aarch64), SHA-1 is faster than SHA-256, though the difference is not great.

In this case, collision resistance appears irrelevant. There is no point in finding other strings that will produce the same validation hash. The correct input strings can be obtained by reverse engineering anyway, which has been done by the author. Here the hash was used just for slight obfuscation.
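For anyone who wants to reproduce the perf comparison on their own machine, a rough sketch with Python's hashlib (which typically delegates to OpenSSL, so hardware SHA instructions are used where available — results will vary by CPU and OpenSSL build):

```python
import hashlib
import time

def throughput_mb_s(name, data, rounds=200):
    """Rough MB/s estimate for the hashlib algorithm `name` over `data`."""
    hashlib.new(name)  # raises ValueError if the algorithm is unavailable
    start = time.perf_counter()
    for _ in range(rounds):
        hashlib.new(name, data).digest()
    elapsed = time.perf_counter() - start
    return len(data) * rounds / elapsed / 1e6

data = b"\x00" * (1 << 20)  # 1 MiB buffer
for name in ("sha1", "sha256"):
    print(f"{name}: {throughput_mb_s(name, data):.0f} MB/s")
```

This is only a ballpark micro-benchmark; small-message overhead and cache effects can shift the numbers considerably.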

Retr0id 21 hours ago

The perf difference between SHA1 and SHA256 was marginal on the systems I tested (3950x, M1 Pro), which makes SHA256 a no-brainer to me if you're just picking between those two (collision resistance is nice to have even if you "don't need it").

You're right that collision resistance doesn't really matter here, but there's a fair chance SHA1 will end up deprecated or removed from whatever cryptography library you're using for it, at some point in the future.

  • mjevans 16 hours ago

    When will CRC32c (also used in https://en.wikipedia.org/wiki/Ethernet_frame#Frame_check_seq... ), MD5, etc get removed? Sure they aren't supported for _security_ use, and should not be used by anything new. However the algorithms will likely continue to exist in libraries of some sort for the foreseeable future. Maybe someday in the distant future they'll just be part of a 'legacy / ancient hash and cryptography' library that isn't standard, but they'll continue to be around.

    SO many things also already standardize on SHA1 (or even weaker hashes) as a (non-security) anti-collision hash for either sharding storage sets (host, folder, etc) or just as already well profiled hash key algos.

    • Retr0id 16 hours ago

      CRC was never a cryptographic hash so there is no need to deprecate it.

      MD5 (and SHA1) is already absent or deprecated in many cryptography libraries, e.g. https://cryptography.io/en/latest/hazmat/primitives/cryptogr...

      Every time someone uses MD5 or SHA1 for something that isn't legacy-backcompat, they further delay their deprecation/removal unnecessarily.

    • unscaled 16 hours ago

      The difference, as you've already noted, is that the X-Browser-Validation header is new. It doesn't have to keep using SHA1, MD5, or CRC-32 to maintain compatibility with a protocol spec that predates the newer algorithms.

      • mjevans 9 hours ago

        The header is new, but what's it working with on the server side? Were there any other considerations for the selection of the value?

        Though in contrast to that, sometimes the criterion is just that only a given number of bits are useful, so the output of a different (stronger) hash is truncated to the desired size.

        Maybe part of the driving criteria is compatibility with, e.g., the oldest supported Android version? Or some version of Windows seen on legacy devices in poor countries? There might be good reasons beyond just 'the header is new, so everything must be state of the art'.
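The truncation idea mentioned above can be sketched like this (a hypothetical helper, not how any particular product actually does it) — take a modern hash and keep only as many bytes as the use case needs:

```python
import hashlib

def short_id(data: bytes, nbytes: int = 8) -> str:
    """Truncate a SHA-256 digest to `nbytes` bytes (hypothetical helper).

    Truncating a strong hash is a standard way to get a short,
    well-distributed identifier when full collision resistance
    is not required.
    """
    return hashlib.sha256(data).digest()[:nbytes].hex()

print(short_id(b"example"))  # 16 hex chars (8 bytes)
```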

  • JimDabell 20 hours ago

    There’s also the downside of every engineer you onboard spending time raising the same concern, and being trained to ignore it. You want engineers to raise red flags when they see SHA-1!

    Sometimes something that looks wrong is bad even if it’s technically acceptable.

    • unscaled 16 hours ago

      Not just engineers. Many off-the-shelf static analysis tools would happily jump at every mention of a deprecated algorithm such as SHA1 in your code. It's just too much noise, and the performance cost of SHA-256 is negligible on modern computers. If digest size or speed on older machines is a concern, there are other options like Blake2/3.

      There probably(?) isn't any serious vulnerability in using SHA-1 for an integrity identifier that is based on a hard-coded "API key", but I think algorithm hygiene is always a good thing. You don't want to train your engineers to use broken algorithms like SHA-1, "because it might be ok, idk".

      • adrian_b 3 hours ago

        It should be noted that using a parallelizable hash, like Blake2/3, does not provide higher speed by magic.

        Evaluating anything in parallel is a different trade-off between the time and the power needed to perform a computation: with an N-way parallel evaluation you hope to reduce the time by almost a factor of N, while increasing the power by a similar factor and without increasing the energy required for the computation by much.

        The time to compute a hash is not always the most important, especially when the hash computation can be overlapped with other data processing. In mobile and embedded applications the energy can be more important. In that case using the hardware instructions for SHA-256 or SHA-1 can provide energy savings over hashes like Blake2/3.

        So the best choice of hash function is affected by many factors; it is preferable not to choose the same function automatically regardless of the circumstances.

        Nowadays SHA-256 is widely supported in hardware and still secure enough for any application with a 128-bit security target, so it is fine as a default choice, but it may not be the best choice in every case.
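As a small aside on the BLAKE2 option discussed above: Python's stdlib exposes blake2b with a tunable digest size, so you can get a SHA-1-sized output without ad-hoc truncation (a sketch, not a recommendation for any particular workload):

```python
import hashlib

data = b"example payload"

# BLAKE2b (stdlib) lets you pick the digest size directly (1..64 bytes),
# so a 20-byte output matches SHA-1's width without truncating anything.
b2 = hashlib.blake2b(data, digest_size=20).hexdigest()
s1 = hashlib.sha1(data).hexdigest()
print(len(b2), len(s1))  # both 40 hex chars
```

Whether BLAKE2/3 or hardware-accelerated SHA-256 is faster (or cheaper in energy) on a given machine is exactly the kind of thing worth measuring rather than assuming.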