This repo is a hands-on lab showing Docker container isolation using a Python script that simulates deleting your Downloads folder.
For cybersecurity professionals, malware researchers, and anyone wanting to understand Docker security through hands-on practice.
> [!IMPORTANT]
> This is a simulation only - no files are actually deleted!
- ⚙️ Requirements
- 🎯 What This Demonstrates
- 🔥 Why This Matters for Security
- 🏗️ Lab Setup
- 🚀 Run the Lab
- 📝 Step by Step Docker Execution
- 🔍 Can't See the Files?
- 🛡️ Security Takeaways for Infosec
- 🔬 Real-World Security Uses
- ⚠️ Important Notes
- 👤 About Me & Contact
To run this lab, you need:
- Docker installed and running on your system
  - Docker Desktop for Windows/Mac
  - or Docker Engine for Linux
- Python 3.9+ (to run the script directly on your host for comparison)
- Basic terminal/command line knowledge
This lab shows the security difference between running potentially dangerous code on your host system vs inside a Docker container.
We will use a Python script that simulates deleting all files in your Downloads folder to demonstrate how Docker's containerization protects your system from malicious code.
This simulation demonstrates a crucial security concept:
If this were real malicious code:
- Running directly on your computer → Could delete your actual files 💥
- Running inside Docker container → Only affects container files ✅
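To make the idea concrete, here is a minimal sketch of the kind of logic a script like `fake_deleter.py` can use (an illustrative assumption, not necessarily the exact script in this repo): it only lists what it *would* delete and never calls a real deletion function.

```python
import os

# Target folder for the simulation. Inside the container this resolves to the
# container's own /root/Downloads, NOT your host's Downloads folder.
target = os.path.expanduser("~/Downloads")

if not os.path.isdir(target):
    print(f"No Downloads folder found at {target}")
    raise SystemExit(0)

items = sorted(os.listdir(target))
print(f"📁 Scanning folder: {target}")
print(f"🔍 Found {len(items)} items:")
for name in items:
    kind = "FOLDER" if os.path.isdir(os.path.join(target, name)) else "FILE"
    icon = "📁" if kind == "FOLDER" else "📄"
    print(f"{icon} {kind}: {name}")

print("🚨 SIMULATED DELETION:")
for name in items:
    # Deliberately no os.remove() or shutil.rmtree() here - we only print.
    print(f"❌ DELETED: {name}")

print("✅ Simulation complete! (Nothing was actually deleted)")
```

Run directly on your host, this scans your real Downloads folder; run inside the container, it can only ever see the container's own filesystem.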
Before starting, ensure you have Docker installed and running:
- Windows/Mac: Download and install Docker Desktop
- Linux: Install Docker Engine
Verify Docker is running:
```bash
docker --version
```
You should see something like: `Docker version 24.0.6, build ed223bc`
Option 1: Clone the repository
```bash
git clone https://github.com/ThiagoMaria-SecurityIT/Docker-Delete-Lab.git
cd Docker-Delete-Lab
```
Option 2: Download ZIP
- Click the "Code" button on GitHub
- Select "Download ZIP"
- Extract the ZIP file to your preferred location
- Open terminal in the extracted folder
Open your terminal/command prompt and navigate to where you downloaded the lab:
```bash
# Windows PowerShell
cd C:\path\to\Docker-Delete-Lab

# Windows Command Prompt
cd C:\path\to\Docker-Delete-Lab

# Linux/Mac
cd /path/to/Docker-Delete-Lab
```
Ensure you can see these files in your folder:
```
Docker-Delete-Lab/
├── Dockerfile        # Docker container configuration
├── fake_deleter.py   # Python script for the simulation
└── README.md         # This documentation
```
Quick check:
```bash
# List files in current directory
dir      # Windows
ls -la   # Linux/Mac
```
Once you're in the correct folder, you should see:
```
Docker-Delete-Lab/
├── Dockerfile
├── fake_deleter.py
├── README.md
└── images/
    ├── dockerpersist.png
    ├── dockerfiles.png
    └── dockerexc.png
```
Once you have:
- ✅ Docker installed and running
- ✅ Lab files downloaded
- ✅ Terminal open in the correct folder
You're ready to proceed to the "Run the Lab" section! 🚀
First, run the script directly on your host for comparison:
```bash
python fake_deleter.py
```
You'll see: a simulation showing which files in your Downloads folder would be affected (nothing is actually deleted).
Then build and run it inside Docker:
```bash
# Build the container
docker build -t delete-lab .

# Run it isolated
docker run --rm delete-lab
```
You'll see: the simulation now only shows container test files being affected.
Finally, mount your real Downloads folder into the container (replace `YourUserName` with your Windows user name):
```bash
docker run --rm -v "C:/Users/YourUserName/Downloads:/root/Downloads" delete-lab
```
This shows: how explicit permissions can bridge container isolation.
Now for the step-by-step breakdown of the Docker commands. First, navigate to your Docker-Delete-Lab folder:
```bash
cd path/to/your/Docker-Delete-Lab
```
Then build the image. This creates a runnable image from your Dockerfile:
```bash
docker build -t delete-lab .
```
What this does:
- `docker build` = build command
- `-t delete-lab` = tags (names) your image "delete-lab"
- `.` = uses the Dockerfile in the current directory
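The `.` matters because the Dockerfile drives the whole build. For orientation, here is a minimal sketch of what such a Dockerfile might look like (an assumption for illustration; the actual file in this repo may differ in its details):

```dockerfile
# Hypothetical sketch - the repo's real Dockerfile may differ
FROM python:3.9-slim

# Copy the simulation script into the image
WORKDIR /app
COPY fake_deleter.py .

# Create fake "Downloads" content inside the container's isolated filesystem
RUN mkdir -p /root/Downloads/test_folder_inside_container && \
    echo "test" > /root/Downloads/container_file1.txt && \
    echo "test" > /root/Downloads/container_file2.txt

# Run the simulation when the container starts
CMD ["python", "fake_deleter.py"]
```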
You should see output like:
```
[+] Building 15.2s (10/10) FINISHED
=> [internal] load build definition from Dockerfile
=> [internal] load .dockerignore
=> [internal] load metadata for docker.io/library/python:3.9-slim
...
=> => writing image sha256:abc123...
=> => naming to docker.io/library/delete-lab
```
Check that your image is in the list:
```bash
docker images
```
You should see:
```
REPOSITORY   TAG      IMAGE ID       CREATED         SIZE
delete-lab   latest   abc123def456   2 minutes ago   200MB
```
Execute your containerized application:
```bash
docker run --rm delete-lab
```
What this does:
- `docker run` = run command
- `--rm` = automatically removes the container when it stops (cleanup)
- `delete-lab` = the image name you built
You should see something like:
```
📁 Scanning folder: /root/Downloads
🔍 Found 3 items:
📄 FILE: container_file1.txt
📄 FILE: container_file2.txt
📁 FOLDER: test_folder_inside_container
🚨 SIMULATED DELETION:
❌ DELETED: container_file1.txt
❌ DELETED: container_file2.txt
❌ DELETED: test_folder_inside_container
✅ Simulation complete! (Nothing was actually deleted)
```
Putting it all together:
```bash
# 1. Go to your project folder
cd Docker-Delete-Lab

# 2. Build the image (only needed once, or when you change the Dockerfile)
docker build -t delete-lab .

# 3. Run the container (as many times as you want)
docker run --rm delete-lab

# 4. Optional: Run it multiple times to see that it's consistent
docker run --rm delete-lab
docker run --rm delete-lab
```
What's happening:
When you run `docker run --rm delete-lab`:
- ✅ The Python code DOES create the files inside the container
- ✅ The simulation DOES run and show you the "deletion"
- ❌ But the container gets destroyed immediately after the Python script finishes (because of `--rm`)
- The files exist ONLY during container execution
- The `--rm` flag automatically deletes the container when it stops
- The container filesystem is temporary by design
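You can verify this yourself by comparing a run with and without `--rm` (a quick sketch using standard Docker commands; the container name `delete-lab-test` is just an arbitrary example):

```bash
# Run WITHOUT --rm: the stopped container is kept after the script finishes
docker run --name delete-lab-test delete-lab

# The exited container (and its temporary filesystem) still exists...
docker ps -a --filter "name=delete-lab-test"

# ...until you remove it yourself
docker rm delete-lab-test

# Run WITH --rm: Docker cleans up automatically, nothing is left behind
docker run --rm delete-lab
docker ps -a --filter "ancestor=delete-lab"
```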
Run Interactive Mode (Let's go inside the container in real time!)
```bash
# Run interactively and explore
docker run -it delete-lab /bin/bash

# Now you're INSIDE the container!
ls -la /root/Downloads/   # See the created files
python fake_deleter.py    # Run the script manually
exit                      # Leave the container
```
The files ARE created during `docker build` (by the RUN commands in the Dockerfile), but they live inside the container's isolated filesystem. When the container stops, that entire filesystem disappears.
```bash
docker run -it delete-lab /bin/bash
```
*Make sure to open the terminal as administrator and inside the folder where you downloaded this repo (the one containing the Dockerfile), for example: `cd path/to/Docker-Delete-Lab` > Open Terminal.*
This behavior proves container isolation:
- Files created inside container → Stay inside container
- No pollution of your host system
- Automatic cleanup when container stops
- Perfect for running untrusted code!
So the fact that you can't see the files on your host is actually the security feature working as intended! 🎯
> [!TIP]
> Now you understand Docker security through hands-on practice.
If you get "docker not found":
- Docker isn't installed or running
- Start Docker Desktop (Windows/Mac) or Docker service (Linux)
If you get "permission denied":
- On Linux: Run with
sudoor add your user to docker group - On Windows/Mac: Docker Desktop should handle permissions
If build fails:
- Make sure you're in the correct folder with Dockerfile
- Check internet connection (needs to download Python image)
- Containers are Sandboxes: Code inside can't escape by default
- Default Deny: Docker starts completely isolated from your system
- Explicit Access: You control what the container can touch (see the example after this list)
- Malware Analysis: Safe environment to study dangerous code
- Damage Containment: Breaches stay inside the container
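To make the "Explicit Access" point concrete, here is a hedged sketch using standard `docker run` options (adapt the paths to your own setup; the read-only variants assume the script only reads and prints, which should hold for this simulation):

```bash
# Tighten isolation further: no network access, read-only root filesystem
docker run --rm --network none --read-only delete-lab

# Grant access deliberately, but limit the blast radius with a read-only mount
docker run --rm -v "C:/Users/YourUserName/Downloads:/root/Downloads:ro" delete-lab
```

Even when you do mount a host folder, the `:ro` suffix keeps the container from modifying it.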
- Malware Research: Analyze viruses without infecting your host (using a Docker container running ClamAV as a service)
- Penetration Testing: Safely run exploit tools
- Incident Response: Isolate compromised applications
- Security Training: Demonstrate attack concepts safely
- 🚫 No actual files are deleted - this is a simulation only
- 🔒 Docker provides strong isolation by default
- ⚡ Volume mounts (`-v`) intentionally reduce security for demonstration
- 📚 Perfect for understanding container security fundamentals
Thiago Maria - From Brazil to the World 🌎
Senior Information Security Professional | Security Risk & Compliance Specialist | AI Security Researcher | Software Developer | Post-Quantum Cryptography Enthusiast
My passion for programming and my professional background in security analysis led me to create this GitHub account to share my knowledge of information security, cybersecurity, Python, and AI development practices. My work primarily focuses on prioritizing security in organizations while ensuring usability and productivity.
Let's Connect:
👇🏽 Click on the badges below:


