Each hub restricts individual users to a specified amount of compute resources.
The primary resources we care about, in order of importance, are:
Memory. This is inflexible - if a user's processes exceed the available RAM,
their kernel is killed. This is the most important resource to understand for our
use cases - almost all educational hubs are memory bound, rather than CPU or
storage bound.
CPU. A more flexible resource, since CPU availability is decided dynamically
by the Linux kernel. It can give a user a full CPU for one minute but only 0.01
of a CPU for the next five, with no consequence beyond a slowdown. This isn't
possible with memory. We want to make sure that users have as much CPU as they
need, but it's rarely an issue in practice - especially because cloud providers
won't sell you a lot of memory without a proportional amount of CPU.
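The hard memory cap and soft CPU cap described above are typically expressed as spawner options in JupyterHub. A minimal config sketch - the specific values here are illustrative examples, not recommendations:

```python
# jupyterhub_config.py -- illustrative per-user resource limits.
# mem_limit / cpu_limit and their *_guarantee counterparts are standard
# JupyterHub Spawner traits; the numbers are made up for this example.

# Hard memory cap: exceeding this gets the user's kernel killed.
c.Spawner.mem_limit = '1G'
# Memory guaranteed to each user; setting it below mem_limit
# is what allows overprovisioning.
c.Spawner.mem_guarantee = '512M'

# CPU is enforced softly: the kernel throttles rather than kills,
# so a low guarantee with a higher burst limit works well.
c.Spawner.cpu_limit = 1.0
c.Spawner.cpu_guarantee = 0.05
```

Note the asymmetry: a memory guarantee well below the limit is safe only because most users sit far under the cap most of the time, while the CPU settings lean on the kernel's ability to throttle transparently.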
Storage. Home directory storage persists across user sessions. Only code
should be kept in home directories, ideally cloned from git repositories. Even
so, many repos run to hundreds of megabytes, and users can also accidentally
write code that fills up the entire disk. We should ideally restrict each user
to something like a 10G maximum. This doesn't require us to provision 10G for
each user - we can safely overprovision, since a large number of users would
have to exceed 10G at the same time to cause issues.
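To make the overprovisioning argument concrete, here is a back-of-the-envelope sketch. The class size and usage figures are invented for illustration:

```python
# Back-of-the-envelope storage overprovisioning. All numbers are
# illustrative assumptions, not measurements from a real hub.
quota_per_user_gb = 10     # per-user cap we enforce
num_users = 500            # users with home directories on the hub
typical_usage_gb = 0.5     # most users keep only code, far under the cap

# Worst case: every user fills their quota simultaneously.
worst_case_gb = num_users * quota_per_user_gb

# Expected usage is far smaller than the worst case.
expected_gb = num_users * typical_usage_gb

# Provision expected usage plus generous headroom, not the worst case.
provisioned_gb = expected_gb * 4

print(worst_case_gb, expected_gb, provisioned_gb)
```

With these (made-up) numbers, provisioning 1000G of shared disk instead of the worst-case 5000G still leaves 4x headroom over expected usage - which is why per-user quotas can safely exceed the per-user share of actual disk.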
It's important for instructors to know these limits. The memory limit is a
primary driver in course design. These limits also define many other hosted
notebook providers - Colab's claim to fame is a free GPU, for example.