
Secure random seed from Spark Cloud #25

Merged 5 commits into master on Sep 4, 2014

Conversation

towynlin (Contributor)

No description provided.

There are only 5 bytes left in the credentials, so on any future hardware that
supports 64-bit integers, using sizeof(seed) would read past the end of the
credentials buffer.

For future reference, there would be very little cryptographic risk
in starting the seed at offset 32 and allowing 8 bytes, duplicating
use of the initial message ID and token bytes as part of the seed.
However, there is no need to do so at this time.
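
For illustration, a minimal sketch of copying a fixed 4-byte seed rather than relying on sizeof(seed); the buffer length, offset, and function name here are assumptions, not the library's actual code:

```cpp
#include <stddef.h>
#include <stdint.h>
#include <string.h>

// Illustrative layout only: the real credentials length and seed offset are
// defined by the protocol, not by this sketch.
static const size_t CREDENTIALS_LEN = 40;
static const size_t SEED_OFFSET = CREDENTIALS_LEN - 5;   // 5 bytes remain from here

uint32_t extract_random_seed(const uint8_t* credentials)
{
    uint32_t seed;
    // Copy exactly 4 bytes. sizeof(seed) happens to be 4 here, but hard-coding
    // the count guards against a platform where the seed type widens to 8 bytes
    // and the copy would run past the end of the buffer.
    memcpy(&seed, credentials + SEED_OFFSET, 4);
    return seed;
}
```
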
@m-mcgowan

@andyw-lala said

it seems to me that the cloud should just
always send a new seed/nonce every time a core connects, that way there is
no extra handshake, and it is totally up to the core whether it uses it or
not, or even what it uses it for.

Sure, that's pretty much how it would be - a seed is sent as part of the handshake. The only change is that in this PR the seed is always used to call srand(); I'm asking that we make that behavior overridable so the random seed can be used for anything.
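
For context, a rough sketch of the kind of weak default hook being discussed, assuming a GCC-style toolchain; the function name random_seed_from_cloud and its signature are assumptions rather than the library's confirmed API:

```cpp
#include <stdint.h>
#include <stdlib.h>

// Default behavior: seed the C library PRNG with the value sent by the cloud.
// The weak attribute lets a user-supplied definition replace this at link time.
__attribute__((weak)) void random_seed_from_cloud(uint32_t seed)
{
    srand(seed);
}

// A user override in application code would simply be a plain (strong)
// definition of the same function, e.g.:
//
// void random_seed_from_cloud(uint32_t seed)
// {
//     my_entropy_pool_add(seed);   // hypothetical application function
// }
```
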

@m-mcgowan

@towynlin - my preference is for the weak function rather than any of the other syntaxes - it's going to be a niche case and not used by 99% of people so I feel no need to dress it up!

Will this change work against older versions of the local cloud?

towynlin (Contributor, Author) commented Sep 2, 2014

Yes. No cloud changes required. Just using some secure random bytes already being sent by the cloud but not currently being used by the core.

@m-mcgowan

Awesome, thanks, I was guessing the plaintext was padded with random bytes, but wanted to be sure.

@andyw-lala

The random bytes can cover the entire gamut of values, correct? They are not constrained to ASCII or anything? [just checking...]


@towynlin (Contributor, Author) commented Sep 2, 2014

Yes, secure random byte values 0-255.

…ion.

this function as weak linkage so that user code can redefine and override this behavior.
I assumed `make test` built the test suite using the libcore-communication-lib.a + headers. "assumptions are bad, mkay?"
… I don't fully understand why; the theory is that a weak declaration in the header seems to stop the linker from binding the implementation provided.
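
One plausible reading of that note, sketched with illustrative names and a GCC-style toolchain (not the PR's actual files):

```cpp
#include <stdint.h>
#include <stdlib.h>

// If the shared header declares the hook weak,
//
//     __attribute__((weak)) void random_seed_from_cloud(uint32_t seed);
//
// the application's override inherits the attribute, so both definitions end
// up weak and the linker may keep the library default rather than the
// override. Keeping the header declaration plain and attaching the attribute
// only to the library's default definition avoids that:

void random_seed_from_cloud(uint32_t seed);                       // header: plain declaration

__attribute__((weak)) void random_seed_from_cloud(uint32_t seed)  // library: weak default
{
    srand(seed);
}
```
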
m-mcgowan added a commit that referenced this pull request Sep 4, 2014
Secure random seed from Spark Cloud
m-mcgowan merged commit 9bd5b56 into master on Sep 4, 2014
m-mcgowan deleted the feature/secure-random-seed branch on September 4, 2014 at 19:18
towynlin pushed a commit to particle-iot/device-os that referenced this pull request Sep 4, 2014
Also, when particle-iot-archived/core-communication-lib#25
is merged, then the user will automatically have a secure random
seed after handshaking with the Spark Cloud.
@paulmand3l

@m-mcgowan did somebody force-push to master after this got merged? I can't find these changes in the latest master branch, and the history of spark_protocol.cpp shows its last commit on August 1st instead of the merge on September 4th.

@m-mcgowan

GitHub is telling me that the commit is available in the feature/hal branch. I'm not sure why it's not in master, but that's probably a good thing since it was intended as a development change. (It has since migrated to the merged repos in the develop branch of spark/firmware.) I hope that helps; sorry I can't provide more specific info.
