hi,

i'm not looking to use pakkero, i was just taking a look at Vanilla OS and ended up in this rabbit hole. btw, thanks for your work on these projects!

i was trying to understand the key derivation in pakkero and ran into a problem. i'm concluding that i must be misunderstanding it, because otherwise there seems to be a trivial way of attacking it.
if the key is just `sha512sum(Launcher + OFFSET1)`, what is stopping an attacker from postulating a low value for offset1, computing the hash/key, decrypting a block starting at offset1 with the computed key, checking for success (ELF header, entropy, or whatever), then increasing offset1 by its granularity (one byte) and looping, extending the previously calculated hash each time? roughly the sketch below.
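a minimal sketch of the loop i have in mind, assuming the key really is the SHA-512 of the launcher bytes up to offset1 (`tryDecrypt`, `looksLikePayload`, and `packed.bin` are made-up placeholders, not pakkero's actual API):

```go
package main

import (
	"bytes"
	"crypto/sha512"
	"fmt"
	"os"
)

// looksLikePayload is a cheap success oracle: here just the ELF magic,
// but an entropy check would work the same way.
func looksLikePayload(b []byte) bool {
	return bytes.HasPrefix(b, []byte("\x7fELF"))
}

// tryDecrypt stands in for whatever cipher is used with the derived key;
// only the first few plaintext bytes are needed for the oracle.
func tryDecrypt(ct, key []byte) []byte {
	_ = key
	return ct // placeholder: a real attack would decrypt here
}

func main() {
	packed, err := os.ReadFile("packed.bin") // the protected binary
	if err != nil {
		panic(err)
	}
	h := sha512.New()
	for offset := 1; offset < len(packed); offset++ {
		h.Write(packed[offset-1 : offset]) // extend the running hash by one byte
		key := h.Sum(nil)                  // Sum doesn't disturb the internal state
		if looksLikePayload(tryDecrypt(packed[offset:], key)) {
			fmt.Printf("candidate offset1 = %d\n", offset)
			return
		}
	}
}
```

the point is that each candidate costs one incremental hash update plus one cheap check, so scanning the whole 2M range takes seconds at most.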
with an offset1 in the recommended 2M range, the key space would be just 2^21, which is trivially bruteforceable, no need for any dynamic or static analysis. and the fatal flaw would be that the key space grows linearly with file size, which implies the file size grows exponentially with effective key size: it seems a key size of 64 bits would require a file of 2^64 bytes, about 1.7*10^7 TB, while 128 bits would require 2^128 bytes, about 3.1*10^26 TB (arithmetic below).
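spelling out that arithmetic (one key candidate per byte of file, 1 TB = 2^40 bytes):

$$
2^{21}\,\mathrm{B} = 2\,\mathrm{MiB},\qquad
2^{64}\,\mathrm{B} = 2^{24}\,\mathrm{TB} \approx 1.7\times10^{7}\,\mathrm{TB},\qquad
2^{128}\,\mathrm{B} = 2^{88}\,\mathrm{TB} \approx 3.1\times10^{26}\,\mathrm{TB}
$$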
so what am i getting wrong here? thanks!