Merge branch 'master' into cli
wochinge committed Feb 28, 2019
2 parents f11c138 + 4c9b39e commit dd3e518
Showing 53 changed files with 321 additions and 208 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -43,4 +43,5 @@ docs/key.pub
secrets.tar
.pytest_cache
src
test_download.zip

3 changes: 2 additions & 1 deletion .travis.yml
@@ -21,7 +21,8 @@ install:
- python -m spacy link en_core_web_md en
- pip install https://github.com/explosion/spacy-models/releases/download/de_core_news_sm-2.0.0/de_core_news_sm-2.0.0.tar.gz --no-cache-dir > jnk
- python -m spacy link de_core_news_sm de
- if [[ ! -f /tmp/cached/total_word_feature_extractor.dat ]]; then wget --quiet -P /tmp/cached/ https://s3-eu-west-1.amazonaws.com/mitie/total_word_feature_extractor.dat;
- if [[ ! -f /tmp/cached/total_word_feature_extractor.dat ]]; then
travis_wait wget --quiet -P /tmp/cached/ https://s3-eu-west-1.amazonaws.com/mitie/total_word_feature_extractor.dat;
fi
- mv /tmp/cached/total_word_feature_extractor.dat data/total_word_feature_extractor.dat
- pip list
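The `.travis.yml` change above wraps the MITIE feature-file download in `travis_wait` and only fetches it when the cached copy is missing. That download-only-if-absent pattern can be sketched as a small shell function (the URL and cache path mirror the snippet above; `travis_wait` is Travis-specific and omitted here):

```shell
#!/usr/bin/env bash
# Download a file into a cache directory only if it is not already there,
# and print the cached path either way.
fetch_cached() {
  local url="$1" cache_dir="$2"
  local file="$cache_dir/$(basename "$url")"
  if [[ ! -f "$file" ]]; then
    mkdir -p "$cache_dir"
    wget --quiet -P "$cache_dir" "$url"
  fi
  echo "$file"
}

# Usage mirroring .travis.yml (on Travis, wrap the call in travis_wait):
# fetch_cached "https://s3-eu-west-1.amazonaws.com/mitie/total_word_feature_extractor.dat" /tmp/cached
```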
14 changes: 9 additions & 5 deletions CHANGELOG.rst
@@ -9,21 +9,26 @@ This project adheres to `Semantic Versioning`_ starting with version 0.7.0.

Added
-----
- Added a detailed warning showing which entities are overlapping
- Authentication token can be also set with env variable `RASA_NLU_TOKEN`.
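The entry above adds the `RASA_NLU_TOKEN` environment variable as a way to set the server's authentication token; authenticated requests then carry the token as a `token` query parameter on the HTTP API. A minimal sketch, assuming a locally running server on port 5000 (the token value is illustrative):

```shell
# Export the token the server was started with (value is made up).
export RASA_NLU_TOKEN="my-secret-token"

# Build an authenticated /parse request URL; with a server running,
# pass it to curl, e.g.:  curl "$URL"
URL="http://localhost:5000/parse?q=hello&token=${RASA_NLU_TOKEN}"
echo "$URL"
```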

Changed
-------
- validate training data only if used for training
- applied spacy guidelines on how to disable pipeline components
- starter packs now also tested when attempting to merge a branch to master
- new consistent naming scheme for pipelines:
- ``tensorflow_embedding`` pipeline template renamed to ``supervised_embeddings``
- ``spacy_sklearn`` pipeline template renamed to ``pretrained_embeddings_spacy``
- requirements files, sample configs, and dockerfiles renamed accordingly
- `/train` endpoint now returns a zipfile of the trained model.
- replace pep8 with pycodestyle
- renamed ``rasa_nlu.evaluate`` to ``rasa_nlu.test``
- renamed ``rasa_nlu.test.run_cv_evaluation`` to
``rasa_nlu.test.cross_validate``
- renamed ``rasa_nlu.train.do_train()`` to ``rasa_nlu.train.train()``
- train command can now also load config from file

Removed
-------
- **removed python 2.7 support**
@@ -800,7 +805,6 @@ Added
- multithreading support of build in REST server (e.g. using gunicorn)
- multitenancy implementation to allow loading multiple models which
share the same backend

Fixed
-----
- error propagation on failed vector model loading (spacy)
33 changes: 20 additions & 13 deletions README.md
@@ -4,9 +4,11 @@
[![Supported Python Versions](https://img.shields.io/pypi/pyversions/rasa_nlu.svg)](https://pypi.python.org/pypi/rasa_nlu)
[![Build Status](https://travis-ci.com/RasaHQ/rasa_nlu.svg?branch=master)](https://travis-ci.com/RasaHQ/rasa_nlu)
[![Coverage Status](https://coveralls.io/repos/github/RasaHQ/rasa_nlu/badge.svg?branch=master)](https://coveralls.io/github/RasaHQ/rasa_nlu?branch=master)
[![Documentation Status](https://img.shields.io/badge/docs-stable-brightgreen.svg)](https://nlu.rasa.com/)
[![Documentation Status](https://img.shields.io/badge/docs-stable-brightgreen.svg)](https://rasa.com/docs/nlu/)
[![FOSSA Status](https://app.fossa.io/api/projects/git%2Bgithub.com%2FRasaHQ%2Frasa_nlu.svg?type=shield)](https://app.fossa.io/projects/git%2Bgithub.com%2FRasaHQ%2Frasa_nlu?ref=badge_shield)

<img align="right" height="244" src="https://www.rasa.com/assets/img/sara/sara-open-source-lg.png">

Rasa NLU (Natural Language Understanding) is a tool for understanding what is being said in short pieces of text.
For example, taking a short message like:

@@ -28,13 +30,13 @@ Rasa then uses machine learning to pick up patterns and generalise to unseen sentences.

You can think of Rasa NLU as a set of high level APIs for building your own language parser using existing NLP and ML libraries.

If you are new to Rasa NLU and want to create a bot, you should start with the [**tutorial**](https://nlu.rasa.com/tutorial.html).
If you are new to Rasa NLU and want to create a bot, you should start with the [**tutorial**](https://rasa.com/docs/nlu/quickstart/).

- **What does Rasa NLU do? 🤔** [Read About the Rasa Stack](http://rasa.com/products/rasa-stack/)

- **I'd like to read the detailed docs 🤓** [Read The Docs](https://nlu.rasa.com)
- **I'd like to read the detailed docs 🤓** [Read The Docs](https://rasa.com/docs/nlu/)

- **I'm ready to install Rasa NLU! 🚀** [Installation](https://nlu.rasa.com/installation.html)
- **I'm ready to install Rasa NLU! 🚀** [Installation](https://rasa.com/docs/nlu/installation/)

- **I have a question ❓** [Rasa Community Forum](https://forum.rasa.com)

@@ -49,7 +51,7 @@ will the next major release). If you want to use Rasa NLU with python

# Quick Install

For the full installation instructions, please head over to the documenation: [Installation](https://nlu.rasa.com/installation.html)
For the full installation instructions, please head over to the documentation: [Installation](https://rasa.com/docs/nlu/installation/)

**Via Docker Image**
From docker hub:
@@ -86,7 +88,7 @@ curl 'http://localhost:5000/version'

### Training New Models
[Examples](https://github.com/RasaHQ/rasa_nlu/tree/master/data/examples/rasa)
and [Documentation](https://nlu.rasa.com/dataformat.html) of the training data
and [Documentation](https://rasa.com/docs/nlu/dataformat/) of the training data
format are provided. But as a quick start, execute the command below to train
a new model
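The training command itself is collapsed in this diff view. As a hedged sketch of training via the HTTP `/train` endpoint (which, per the changelog in this commit, now returns a zipfile of the trained model) — the project name and config contents below are illustrative, not taken from this diff:

```shell
# Write a minimal NLU config (keys follow this repo's sample configs;
# training examples would normally be included in the payload as well).
cat > nlu_config.yml <<'EOF'
language: "en"
pipeline: "supervised_embeddings"
EOF

# With a server running locally, POST the config and save the returned
# model zip (commented out because it needs a running Rasa NLU server):
# curl --request POST \
#      --header 'content-type: application/x-yml' \
#      --data-binary @nlu_config.yml \
#      --output model.zip \
#      'http://localhost:5000/train?project=test_model'
```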

@@ -118,7 +120,7 @@ curl 'http://localhost:5000/parse?q=hello&project=test_model'
# FAQ

### Who is it for?
The intended audience is mainly __people developing bots__, starting from scratch or looking to find a drop-in replacement for [wit](https://wit.ai), [LUIS](https://www.luis.ai), or [Dialogflow](https://dialogflow.com). The setup process is designed to be as simple as possible. Rasa NLU is written in Python, but you can use it from any language through an [HTTP API](https://nlu.rasa.com/http.html). If your project is written in Python you can [simply import the relevant classes](https://nlu.rasa.com/python.html). If you're currently using wit/LUIS/Dialogflow, you just:
The intended audience is mainly __people developing bots__, starting from scratch or looking to find a drop-in replacement for [wit](https://wit.ai), [LUIS](https://www.luis.ai), or [Dialogflow](https://dialogflow.com). The setup process is designed to be as simple as possible. Rasa NLU is written in Python, but you can use it from any language through an [HTTP API](https://rasa.com/docs/nlu/http/). If your project is written in Python you can [simply import the relevant classes](https://rasa.com/docs/nlu/python/). If you're currently using wit/LUIS/Dialogflow, you just:

1. Download your app data from wit, LUIS, or Dialogflow and feed it into Rasa NLU
2. Run Rasa NLU on your machine and switch the URL of your wit/LUIS api calls to `localhost:5000/parse`.
@@ -128,14 +130,19 @@ The intended audience is mainly __people developing bots__, starting from scratc
* You don't have to make a `https` call to parse every message.
* You can tune models to work well on your particular use case.

These points are laid out in more detail in a [blog post](https://blog.rasa.com/put-on-your-robot-costume-and-be-the-minimum-viable-bot-yourself/). Rasa is a set of tools for building more advanced bots, developed by the company [Rasa](https://rasa.com). Rasa NLU is the natural language understanding module, and the first component to be open-sourced.
These points are laid out in more detail in a
[blog post](https://blog.rasa.com/put-on-your-robot-costume-and-be-the-minimum-viable-bot-yourself/).
Rasa is a set of tools for building more advanced bots, developed by
the company [Rasa](https://rasa.com). Rasa NLU is the natural language
understanding module, and the first component to be open-sourced.

### What languages does it support?
It depends. Some things, like intent classification with the `tensorflow_embedding` pipeline, work in any language.
Other features are more restricted. See details [here](https://nlu.rasa.com/languages.html)
The `supervised_embeddings` pipeline works in any language.
If you want to use pre-trained word embeddings, there are models available for
many languages. See details [here](https://rasa.com/docs/nlu/languages/)

### How to contribute
We are very happy to receive and merge your contributions. There is some more information about the style of the code and docs in the [documentation](https://nlu.rasa.com/contribute.html).
We are very happy to receive and merge your contributions. There is some more information about the style of the code and docs in the [documentation](https://rasa.com/docs/contributing/).

In general the process is rather simple:
1. create an issue describing the feature you want to work on (or have a look at issues with the label [help wanted](https://github.com/RasaHQ/rasa_nlu/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22))
@@ -160,7 +167,7 @@ pip install -r alt_requirements/requirements_dev.txt
pip install -e .
```

To test the installation use (this will run a very stupid default model; you need to [train your own model](https://nlu.rasa.com/tutorial.html) to do something useful!):
To test the installation use (this will run a very stupid default model; you need to [train your own model](https://rasa.com/docs/nlu/quickstart/) to do something useful!):

### Advanced Docker
Before you start, ensure you have the latest version of docker engine on your machine. You can check if you have docker installed by typing ```docker -v``` in your terminal.
@@ -172,7 +179,7 @@ docker run -p 5000:5000 rasa/rasa_nlu:latest-full

There are also three volumes, which you may want to map: `/app/projects`, `/app/logs`, and `/app/data`. It is also possible to override the config file used by the server by mapping a new config file to the volume `/app/config.json`. For complete docker usage instructions go to the official [docker hub readme](https://hub.docker.com/r/rasa/rasa_nlu/).

To test, run the command below after the container has started. For more info on using the HTTP API see [here](https://nlu.rasa.com/http.html#endpoints)
To test, run the command below after the container has started. For more info on using the HTTP API see [here](https://rasa.com/docs/nlu/http/#endpoints)
```
curl 'http://localhost:5000/parse?q=hello'
```
2 changes: 1 addition & 1 deletion alt_requirements/conda-requirements.txt
@@ -1,2 +1,2 @@
scipy==1.1.0
scikit-learn==0.19.1
scikit-learn==0.20.2
6 changes: 3 additions & 3 deletions alt_requirements/requirements_full.txt
@@ -2,13 +2,13 @@
-r requirements_bare.txt

# Spacy Requirements
-r requirements_spacy_sklearn.txt
-r requirements_pretrained_embeddings_spacy.txt

# Tensorflow Requirements
-r requirements_tensorflow_sklearn.txt
-r requirements_supervised_embeddings.txt

# MITIE Requirements
-r requirements_mitie.txt
-r requirements_pretrained_embeddings_mitie.txt

duckling==1.8.0
Jpype1==0.6.2
File renamed without changes.
@@ -4,5 +4,3 @@
tensorflow==1.12.0
scipy==1.1.0
sklearn-crfsuite==0.3.6
keras-applications==1.0.6
keras-preprocessing==1.0.5
@@ -20,7 +20,7 @@ WORKDIR ${RASA_NLU_HOME}

COPY . ${RASA_NLU_HOME}

RUN pip install -r alt_requirements/requirements_mitie.txt
RUN pip install -r alt_requirements/requirements_pretrained_embeddings_mitie.txt

RUN pip install -e .

34 changes: 34 additions & 0 deletions docker/Dockerfile_pretrained_embeddings_spacy_de
@@ -0,0 +1,34 @@
FROM python:3.6-slim

ENV RASA_NLU_DOCKER="YES" \
RASA_NLU_HOME=/app \
RASA_NLU_PYTHON_PACKAGES=/usr/local/lib/python3.6/dist-packages

# Run updates, install basics and cleanup
# - build-essential: Compile specific dependencies
# - git-core: Checkout git repos
RUN apt-get update -qq \
&& apt-get install -y --no-install-recommends build-essential git-core openssl libssl-dev libffi6 libffi-dev curl \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

WORKDIR ${RASA_NLU_HOME}

COPY . ${RASA_NLU_HOME}

# use bash always
RUN rm /bin/sh && ln -s /bin/bash /bin/sh

RUN pip install -r alt_requirements/requirements_pretrained_embeddings_spacy.txt

RUN pip install -e .

RUN pip install https://github.com/explosion/spacy-models/releases/download/de_core_news_sm-2.0.0/de_core_news_sm-2.0.0.tar.gz --no-cache-dir > /dev/null \
&& python -m spacy link de_core_news_sm de

VOLUME ["/app/projects", "/app/logs", "/app/data"]

EXPOSE 5000

ENTRYPOINT ["./entrypoint.sh"]
CMD ["start", "-c", "config.yml", "--path", "/app/projects"]
@@ -19,16 +19,12 @@ COPY . ${RASA_NLU_HOME}
# use bash always
RUN rm /bin/sh && ln -s /bin/bash /bin/sh

RUN pip install -r alt_requirements/requirements_spacy_sklearn.txt
RUN pip install -r alt_requirements/requirements_pretrained_embeddings_spacy.txt

RUN pip install -e .

RUN pip install https://github.com/explosion/spacy-models/releases/download/en_core_web_md-2.0.0/en_core_web_md-2.0.0.tar.gz --no-cache-dir > /dev/null \
&& python -m spacy link en_core_web_md en \
&& pip install https://github.com/explosion/spacy-models/releases/download/de_core_news_sm-2.0.0/de_core_news_sm-2.0.0.tar.gz --no-cache-dir > /dev/null \
&& python -m spacy link de_core_news_sm de

COPY sample_configs/config_spacy.yml ${RASA_NLU_HOME}/config.yml
&& python -m spacy link en_core_web_md en

VOLUME ["/app/projects", "/app/logs", "/app/data"]

@@ -16,11 +16,11 @@ WORKDIR ${RASA_NLU_HOME}

COPY . ${RASA_NLU_HOME}

RUN pip install -r alt_requirements/requirements_tensorflow_sklearn.txt
RUN pip install -r alt_requirements/requirements_supervised_embeddings.txt

RUN pip install -e .

COPY sample_configs/config_embedding.yml ${RASA_NLU_HOME}/config.yml
COPY sample_configs/config_supervised_embedding.yml ${RASA_NLU_HOME}/config.yml

VOLUME ["/app/projects", "/app/logs", "/app/data"]
