Closes: #11781: Add support for Amazon S3 remote data sources (#11986)
* Add boto3 as a dependency

* Add Amazon S3 backend for remote data sources

* Update docs to include Amazon S3 support
jeremystretch committed Mar 15, 2023
1 parent 5cd3ad0 commit cacc418
Showing 6 changed files with 86 additions and 5 deletions.
4 changes: 4 additions & 0 deletions base_requirements.txt
@@ -2,6 +2,10 @@
# https://github.com/mozilla/bleach
bleach<6.0

# Python client for Amazon AWS API
# https://github.com/boto/boto3
boto3

# The Python web framework on which NetBox is built
# https://github.com/django/django
Django<4.2
10 changes: 6 additions & 4 deletions docs/models/core/datasource.md
@@ -14,15 +14,17 @@ The type of data source. Supported options include:

* Local directory
* git repository
* Amazon S3 bucket

### URL

The URL identifying the remote source. Some examples are included below.

-| Type | Example URL |
-|------|-------------|
-| Local | file:///var/my/data/source/ |
-| git | https://https://github.com/my-organization/my-repo |
+| Type      | Example URL                                        |
+|-----------|----------------------------------------------------|
+| Local     | file:///path/to/my/data/ |
+| git       | https://github.com/my-organization/my-repo |
+| Amazon S3 | https://s3.us-east-2.amazonaws.com/my-bucket-name/ |
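
For illustration, an Amazon S3 URL in the form above encodes the region in the hostname and the bucket name plus an optional key prefix in the path. A minimal sketch of that decomposition, mirroring the regex the new backend uses (the `parse_s3_url` helper is hypothetical, not part of NetBox):

```python
import re
from urllib.parse import urlparse

# Regional S3 endpoint pattern, as used by the new backend
REGION_REGEX = r's3\.([a-z0-9-]+)\.amazonaws\.com'

def parse_s3_url(url):
    """Split an S3 data source URL into (region, bucket, prefix). Hypothetical helper."""
    parsed = urlparse(url)
    m = re.match(REGION_REGEX, parsed.netloc)
    region = m.group(1) if m else None
    path = parsed.path.lstrip('/')
    bucket, _, prefix = path.partition('/')
    return region, bucket, prefix

region, bucket, prefix = parse_s3_url('https://s3.us-east-2.amazonaws.com/my-bucket-name/configs/')
# region='us-east-2', bucket='my-bucket-name', prefix='configs/'
```

When the hostname does not match the regional pattern, the backend's `_region_name` property returns `None`, letting boto3 fall back to its default region resolution.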

### Status

2 changes: 1 addition & 1 deletion docs/release-notes/version-3.5.md
@@ -10,7 +10,7 @@ The static home view has been replaced with a fully customizable dashboard. User

#### Remote Data Sources ([#11558](https://github.com/netbox-community/netbox/issues/11558))

-NetBox now has the ability to synchronize arbitrary data from external sources through the new [DataSource](../models/core/datasource.md) and [DataFile](../models/core/datafile.md) models. Synchronized files are stored in the PostgreSQL database, and may be referenced and consumed by other NetBox models, such as export templates and config contexts. Currently, replication from local filesystem paths and from git repositories is supported, and we expect to add support for additional backends in the near future.
+NetBox now has the ability to synchronize arbitrary data from external sources through the new [DataSource](../models/core/datasource.md) and [DataFile](../models/core/datafile.md) models. Synchronized files are stored in the PostgreSQL database, and may be referenced and consumed by other NetBox models, such as export templates and config contexts. Currently, replication from local filesystem paths, git repositories, and Amazon S3 buckets is supported, and we expect to introduce additional backends in the near future.

#### Configuration Template Rendering ([#11559](https://github.com/netbox-community/netbox/issues/11559))

2 changes: 2 additions & 0 deletions netbox/core/choices.py
@@ -10,10 +10,12 @@
class DataSourceTypeChoices(ChoiceSet):
LOCAL = 'local'
GIT = 'git'
AMAZON_S3 = 'amazon-s3'

CHOICES = (
(LOCAL, _('Local'), 'gray'),
(GIT, _('Git'), 'blue'),
(AMAZON_S3, _('Amazon S3'), 'blue'),
)


72 changes: 72 additions & 0 deletions netbox/core/data_backends.py
@@ -1,9 +1,14 @@
import logging
import os
import re
import subprocess
import tempfile
from contextlib import contextmanager
from pathlib import Path
from urllib.parse import quote, urlunparse, urlparse

import boto3
from botocore.config import Config as Boto3Config
from django import forms
from django.conf import settings
from django.utils.translation import gettext as _
@@ -115,3 +120,70 @@ def fetch(self):
yield local_path.name

local_path.cleanup()


@register_backend(DataSourceTypeChoices.AMAZON_S3)
class S3Backend(DataBackend):
parameters = {
'aws_access_key_id': forms.CharField(
label=_('AWS access key ID'),
widget=forms.TextInput(attrs={'class': 'form-control'})
),
'aws_secret_access_key': forms.CharField(
label=_('AWS secret access key'),
widget=forms.TextInput(attrs={'class': 'form-control'})
),
}

REGION_REGEX = r's3\.([a-z0-9-]+)\.amazonaws\.com'

@contextmanager
def fetch(self):
local_path = tempfile.TemporaryDirectory()

# Build the S3 configuration
s3_config = Boto3Config(
proxies=settings.HTTP_PROXIES,
)

# Initialize the S3 resource and bucket
aws_access_key_id = self.params.get('aws_access_key_id')
aws_secret_access_key = self.params.get('aws_secret_access_key')
s3 = boto3.resource(
's3',
region_name=self._region_name,
aws_access_key_id=aws_access_key_id,
aws_secret_access_key=aws_secret_access_key,
config=s3_config
)
bucket = s3.Bucket(self._bucket_name)

# Download all files within the specified path
for obj in bucket.objects.filter(Prefix=self._remote_path):
local_filename = os.path.join(local_path.name, obj.key)
# Build local path
Path(os.path.dirname(local_filename)).mkdir(parents=True, exist_ok=True)
bucket.download_file(obj.key, local_filename)

yield local_path.name

local_path.cleanup()

@property
def _region_name(self):
domain = urlparse(self.url).netloc
if m := re.match(self.REGION_REGEX, domain):
return m.group(1)
return None

@property
def _bucket_name(self):
url_path = urlparse(self.url).path.lstrip('/')
return url_path.split('/')[0]

@property
def _remote_path(self):
url_path = urlparse(self.url).path.lstrip('/')
if '/' in url_path:
return url_path.split('/', 1)[1]
return ''
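
The new backend's `fetch()` follows the same contract as the existing git backend: a context manager that materializes the remote files into a temporary directory, yields that directory's path, and cleans it up on exit. A standalone sketch of that pattern, with an in-memory dict standing in for the S3 bucket (no AWS call is made; `fetch_stub` is a hypothetical illustration, not NetBox code):

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def fetch_stub(objects):
    """Mimic S3Backend.fetch(): write each key/content pair into a temp
    directory, yield its path, then remove everything on exit."""
    local_path = tempfile.TemporaryDirectory()
    for key, data in objects.items():
        local_filename = os.path.join(local_path.name, key)
        # Build the local path, creating intermediate directories as needed
        os.makedirs(os.path.dirname(local_filename), exist_ok=True)
        with open(local_filename, 'w') as f:
            f.write(data)
    yield local_path.name
    local_path.cleanup()

with fetch_stub({'configs/router1.cfg': 'hostname router1'}) as path:
    # Caller reads the synchronized files here; the directory is
    # deleted as soon as the with-block exits
    contents = open(os.path.join(path, 'configs', 'router1.cfg')).read()
```

Because the yielded directory is temporary, anything a consumer wants to keep (as NetBox does via its DataFile records) must be copied out before the context manager exits.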
1 change: 1 addition & 0 deletions requirements.txt
@@ -1,4 +1,5 @@
bleach==5.0.1
boto3==1.26.91
Django==4.1.7
django-cors-headers==3.14.0
django-debug-toolbar==3.8.1
