Support disabling compile time http request #15

Merged
`README.md` — 27 changes: 26 additions & 1 deletion

@@ -32,7 +32,32 @@

```elixir
iex> Domainatrex.parse("blog.someone.id.au")
{:ok, %{domain: "someone", subdomain: "blog", tld: "id.au"}}
```


## Configuration

For maximum performance, `Domainatrex` reads the list of all known top-level domains at compile time.
By default, the package will attempt to fetch the latest list of TLDs from the web before falling
back to a local (potentially out-of-date) copy. You can configure this behavior in your
`config.exs` as follows:

- `:fetch_latest`: A Boolean flag to determine whether `Domainatrex` should try to fetch the latest
list of public suffixes at compile time; default is `true`
- `:public_suffix_list_url`: A charlist URL to the latest public suffix file that `Domainatrex` will
try to fetch at compile time; default is
`'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'`
- `:fallback_local_copy`: The path to the local suffix file that `Domainatrex` will use if it wasn't
able to fetch a fresh file from the URL, or if fetching updated files was disabled; default is
the `"lib/public_suffix_list.dat"` file included in the package.

Here's a complete example of how you might customize this behavior in your `config.exs`:

```elixir
config :domainatrex,
  # Explicitly allow the compile-time HTTP request that fetches the latest list of TLDs (the default)
  fetch_latest: true,
  # Fetch the list from the official source instead of the default GitHub mirror
  # (alternative URLs are not necessarily tested with Domainatrex!)
  public_suffix_list_url: 'https://publicsuffix.org/list/public_suffix_list.dat',
fallback_local_copy: "priv/my_app_custom_suffix_list.dat"
```
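Conversely, the new `:fetch_latest` option lets you disable the compile-time HTTP request entirely, so the local file is always used — useful for offline or hermetic builds. A minimal sketch (the fallback path shown is simply the package's bundled default, which you could point at your own copy instead):

```elixir
config :domainatrex,
  # Never hit the network at compile time; always read the local file
  fetch_latest: false,
  fallback_local_copy: "lib/public_suffix_list.dat"
```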

## Changelog

`lib/domainatrex.ex` — 16 changes: 11 additions & 5 deletions

```diff
@@ -3,19 +3,25 @@ defmodule Domainatrex do
   @moduledoc """
   Documentation for Domainatrex.
   """
-  @public_suffix_list_url Application.get_env(:domainatrex, :public_suffix_list_url, 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')
   @fallback_local_copy Application.get_env(:domainatrex, :fallback_local_copy, "lib/public_suffix_list.dat")
+  @fetch_latest Application.get_env(:domainatrex, :fetch_latest, true)
   @public_suffix_list nil

   :inets.start
   :ssl.start
-  case :httpc.request(:get, {@public_suffix_list_url, []}, [], []) do
-    {:ok, {_, _, string}} ->
-      @public_suffix_list to_string(string)
+
+  with true <- @fetch_latest,
+       public_suffix_list_url <- Application.get_env(:domainatrex, :public_suffix_list_url, 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'),
+       {:ok, {_, _, string}} <- :httpc.request(:get, {public_suffix_list_url, []}, [], []) do
+    @public_suffix_list to_string(string)
+  else
     _ ->
       case File.read @fallback_local_copy do
         {:ok, string} ->
-          Logger.error "[Domainatrex] Could not read the public suffix list from the internet, trying to read from the backup at #{@fallback_local_copy}"
+          if @fetch_latest do
+            Logger.error "[Domainatrex] Could not read the public suffix list from the internet, trying to read from the backup at #{@fallback_local_copy}"
+          end
+
           @public_suffix_list string
         _ ->
           Logger.error "[Domainatrex] Could not read the public suffix list, please make sure that you either have an internet connection or #{@fallback_local_copy} exists"
```
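The `with`/`else` chain is the heart of the change: every step must succeed for the compile-time fetch to run, and any failure — including `@fetch_latest` being `false` — drops into the fallback branch. A standalone sketch of that control flow, using hypothetical `fetch_remote/1` and `read_local/1` helpers in place of the real `:httpc.request/4` and `File.read/1` calls:

```elixir
defmodule SuffixListSketch do
  # Hypothetical condensation of the fetch-or-fallback logic in the diff above.
  def load(fetch_latest?, url, fallback_path) do
    with true <- fetch_latest?,
         {:ok, body} <- fetch_remote(url) do
      {:remote, body}
    else
      # Reached when fetching is disabled (`fetch_latest?` is false)
      # or the request fails; fall back to the local copy.
      _ ->
        case read_local(fallback_path) do
          {:ok, body} -> {:local, body}
          _ -> {:error, :no_suffix_list}
        end
    end
  end

  # Stubs for illustration only.
  defp fetch_remote(_url), do: {:error, :offline}
  defp read_local(_path), do: {:ok, "// ===BEGIN ICANN DOMAINS==="}
end
```

With these stubs, both `SuffixListSketch.load(false, nil, "x")` and a failed remote fetch take the fallback branch — matching the PR's goal of never requiring network access at compile time.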