Topics
- Updated List of Articles Related to Deep Learning AMIs
- Launch an AWS Deep Learning AMI (in 10 minutes)
- Faster Training with Optimized TensorFlow 1.6 on Amazon EC2 C5 and P3 Instances
- New AWS Deep Learning AMIs for Machine Learning Practitioners
- New Training Courses Available: Introduction to Machine Learning & Deep Learning on AWS
- Journey into Deep Learning with AWS
Q. How do I keep track of product announcements related to DLAMI?
Here are two suggestions for this:
- Bookmark the "AWS Deep Learning AMIs" blog category: Updated List of Articles Related to Deep Learning AMIs.
- "Watch" the Forum: AWS Deep Learning AMIs
Q. Are the NVIDIA drivers and CUDA installed?
Yes. Some DLAMIs have different versions. The Deep Learning AMI with Conda has the most recent versions of any DLAMI. This is covered in more detail in CUDA Installations and Framework Bindings. You can also refer to the specific AMI's detail page on the marketplace to confirm what is installed.
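To confirm which CUDA versions a running instance actually has, a quick probe such as the following can help; the /usr/local paths are the DLAMI's usual layout but may differ between images:

```shell
# List the CUDA toolkits installed on this instance (usual DLAMI layout)
ls -d /usr/local/cuda* 2>/dev/null || echo "no CUDA toolkit found under /usr/local"

# The default toolkit is whatever the /usr/local/cuda symlink points at
if command -v nvcc >/dev/null 2>&1; then
    nvcc --version
fi
```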
Q. Is cuDNN installed?
Yes.
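You can confirm the installed cuDNN version on the instance itself. The header locations below are the usual ones, but they vary between images, so this is a best-effort probe:

```shell
# Best-effort probe for the installed cuDNN version; header locations vary by image
for f in /usr/local/cuda/include/cudnn_version.h \
         /usr/include/cudnn_version.h \
         /usr/local/cuda/include/cudnn.h; do
    if [ -f "$f" ]; then
        grep -E "#define CUDNN_(MAJOR|MINOR|PATCHLEVEL)" "$f"
        break
    fi
done
```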
Q. How do I see that the GPUs are detected and their current status?
Run nvidia-smi. This will show one or more GPUs, depending on the instance type, along with their current memory consumption.
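For scripting, the same information is available in machine-readable form through nvidia-smi's standard query flags; the guard below just handles CPU-only instances gracefully:

```shell
# Machine-readable GPU status via nvidia-smi's standard query flags
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,memory.used,memory.total --format=csv,noheader
else
    echo "nvidia-smi not found; this instance type likely has no NVIDIA GPU"
fi
```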
Q. Are virtual environments set up for me?
Yes, but only on the Deep Learning AMI with Conda.
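On the Deep Learning AMI with Conda you can list and activate those environments from the shell; the environment name in the comment (tensorflow_p36) is an example and differs between AMI releases:

```shell
# List the preconfigured Conda environments on the Deep Learning AMI with Conda
if command -v conda >/dev/null 2>&1; then
    conda env list
    # Then activate one, e.g. (name varies by AMI release):
    #   source activate tensorflow_p36
else
    echo "conda is not on PATH; is this the Deep Learning AMI with Conda?"
fi
```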
Q. What version of Python is installed?
Each DLAMI has both Python 2 and Python 3. The Deep Learning AMI with Conda has environments for both versions of each framework.
Q. Is Keras installed?
This depends on the AMI. The Deep Learning AMI with Conda has Keras available as a front end for each framework. The version of Keras depends on the framework's support for it.
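One way to confirm which Keras version a given environment provides is to query it directly; this assumes python is on PATH in the currently active environment:

```shell
# Print the Keras version in the currently active environment, if present
python -c "import keras; print(keras.__version__)" 2>/dev/null \
    || echo "Keras is not installed in this environment"
```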
Q. Is it free?
All of the DLAMIs are free. However, depending on the instance type you choose, the instance may not be free. See Pricing for the DLAMI for more info.
Q. I'm getting CUDA errors or GPU-related messages from my framework. What's wrong?
Check what instance type you used; many examples and tutorials require a GPU. If running nvidia-smi shows no GPU, you need to spin up another DLAMI using an instance with one or more GPUs. See Selecting the Instance Type for DLAMI for more info.
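As a first diagnostic, checking the driver from the OS separates "no GPU on this instance" from a framework configuration problem; a minimal sketch:

```shell
# Separate "no GPU on this instance" from a framework/CUDA configuration issue
if nvidia-smi >/dev/null 2>&1; then
    echo "GPU driver responds; the problem is likely in the framework setup"
else
    echo "no GPU visible from the OS; relaunch on an instance type with a GPU"
fi
```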
Q. Can I use Docker?
Docker has been pre-installed since version 14 of the Deep Learning AMI with Conda. Note that you will want to use nvidia-docker on GPU instances to make use of the GPU.
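A common smoke test for GPU access from a container is running nvidia-smi inside one. The image tag below is only an example and should match your driver; the fallback flag mentioned in the comment applies to newer Docker releases:

```shell
# Verify GPU access from inside a container (GPU instance required).
# nvidia-docker is the wrapper preinstalled on recent DLAMI versions;
# on newer Docker releases, 'docker run --gpus all ...' is equivalent.
if command -v nvidia-docker >/dev/null 2>&1; then
    # The image tag is an example; pick one compatible with your driver.
    nvidia-docker run --rm nvidia/cuda:11.0-base nvidia-smi
else
    echo "nvidia-docker not found; try 'docker run --gpus all ...' instead"
fi
```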
Q. What regions are Linux DLAMIs available in?
[See the AWS documentation website for more details]
Q. What regions are Windows DLAMIs available in?
[See the AWS documentation website for more details]