
[Bug] JSONDecodeError with dockerized analyzers #800

Closed
gaglimax opened this issue Jun 22, 2020 · 5 comments
Labels
category:bug Issue is related to a bug

Comments

@gaglimax

Describe the bug
Dockerized cortexneurons analyzers fail with a JSONDecodeError.

To Reproduce
Steps to reproduce the behavior:

  1. Download the latest version of Cortex-Analyzers (2.7.0).
  2. Generate analyzers-stable.json with update_catalogs.sh.
  3. Configure Cortex to run jobs with Docker, launch Cortex itself with Docker, and allow the Cortex user to access the Docker daemon on the host system (a config sketch follows below).
  4. Enable some analyzers (I tried with FileInfo_7_0 and EmlParser_1_2).
  5. Execute an analyzer.
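
For reference, step 3 might translate to an application.conf excerpt like the one below. This is only a sketch: the job.runner and analyzer.urls keys come from the Cortex documentation, but the paths and values here are assumptions for illustration.

analyzer {
  # Catalog file generated by update_catalogs.sh (path is illustrative)
  urls = ["/opt/analyzers.json"]
}

job {
  # Prefer the docker runner over direct process execution
  runner = [docker, process]
}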

Expected behavior
The job should succeed.

Additional information
The error is:

Traceback (most recent call last):
  File "FileInfo/fileinfo_analyzer.py", line 76, in <module>
    FileInfoAnalyzer().run()
  File "FileInfo/fileinfo_analyzer.py", line 14, in __init__
    Analyzer.__init__(self)
  File "/usr/local/lib/python3.8/site-packages/cortexutils/analyzer.py", line 17, in __init__
    Worker.__init__(self, job_directory)
  File "/usr/local/lib/python3.8/site-packages/cortexutils/worker.py", line 31, in __init__
    self._input = json.load(sys.stdin)
  File "/usr/local/lib/python3.8/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
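
For illustration, this failure mode is easy to reproduce outside Cortex: cortexutils reads the job input as JSON from stdin (worker.py line 31 above), so an empty or missing input stream produces exactly this error. A minimal sketch:

import io
import json

# cortexutils effectively does json.load(sys.stdin); an empty stream
# stands in here for the missing job file.
try:
    json.load(io.StringIO(""))
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)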

This is not a cortexutils version problem: the installed version is 2.0.0.
I only tried FileInfo_7_0 and EmlParser_1_2, but I think the issue happens with every cortexneurons analyzer.

Work environment

  • Server OS: Docker thehiveproject/cortex:3.0.1
  • Cortex version: 3.0.1
  • Cortex Analyzer name: all.

Possible solutions
I think the analyzer containers are not properly built.

@gaglimax added the category:bug label on Jun 22, 2020
@gaglimax
Author

I found the cause of the problem.

In my deployment, Cortex is dockerized, as well as the analyzers, so I gave the Cortex daemon the privileges to use Docker on the host.

When I launch an analysis, the Cortex daemon creates a file in /tmp. Then the analyzer container tries to mount /tmp to get this file. My problem was that the /tmp folder wasn't shared between the two containers.

So I fixed it by mounting /tmp inside the Cortex container.

To improve this, it would be a good idea to let the Cortex administrator choose where the files to analyze are saved (somewhere other than /tmp).
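
If such a setting exists (or gets added), it would presumably look something like the excerpt below. To be clear, the key name here is an assumption, not something confirmed in this thread; check the Cortex documentation for your version.

job {
  # Hypothetical setting: directory where Cortex writes job files,
  # instead of /tmp. It must be visible at the same path on the host
  # and inside the Cortex container, since the analyzer containers are
  # started by the host Docker daemon.
  directory = "/opt/cortex/jobs"
}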

@hermanmaleiane

Hi @gaglimax!

Can you please share your configuration?

Thanks in advance.

@gaglimax
Author

gaglimax commented Apr 8, 2021

Hi!

Here is the compose file:

services:
  cortex:
    image: artifactory.foo/cortex-3.0.1:1.0
    container_name: cortex
    command: --no-config
    networks:
      - default
    restart: unless-stopped
    volumes:
      - ./application.conf:/etc/cortex/application.conf
      - /var/log/cortex:/var/log/cortex
      - ./certificates/truststore.jks:/etc/cortex/truststore.jks
      - ./certificates/keystore.jks:/etc/cortex/keystore_elasticsearch.jks
      - ./analyzers/Custom-Analyzers:/opt/Custom-Analyzers
      - ./analyzers/analyzers.json:/opt/analyzers.json
      - ./analyzers/Cortex-Analyzers:/opt/Cortex-Analyzers
      - /var/run/docker.sock:/var/run/docker.sock
      - /tmp:/tmp
      - ./docker/.config.json:/usr/sbin/.docker/config.json
    userns_mode: "host"
    privileged: true
    expose:
      - 9001
    ports:
      - "2551:2551"

The trick was just to mount /tmp inside the Cortex container: when you start an analysis, Cortex creates the file to analyze in /tmp, and the container created for the analyzer automatically mounts /tmp (by design), so it can get the file.
I don't remember what was in ./docker/.config.json, but I'm pretty sure it wasn't related to this issue.
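
The underlying Docker behaviour can be checked independently of Cortex: any docker run issued through the shared /var/run/docker.sock is executed by the host daemon, so -v source paths are resolved on the host filesystem, not inside the Cortex container. A quick sketch (file names are illustrative):

# Run from inside a container that has /var/run/docker.sock mounted.
# The bind-mount source below is resolved by the HOST daemon:
echo '{"probe": true}' > /tmp/probe.json
docker run --rm -v /tmp/probe.json:/probe.json alpine cat /probe.json
# If /tmp is not shared with the host, the host daemon auto-creates an
# empty directory at /tmp/probe.json and the spawned container sees
# that directory instead of the file.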

@hermanmaleiane

Hi, thanks for sharing.
What about the permissions you applied so that Docker is able to write the files to /tmp?
I hope this fixes my issue.

@hermanmaleiane

The error I have now:

FileInfo_7_0

Traceback (most recent call last):
  File "FileInfo/fileinfo_analyzer.py", line 76, in <module>
    FileInfoAnalyzer().run()
  File "FileInfo/fileinfo_analyzer.py", line 40, in __init__
    self.error('Manalyze submodule is enabled, but either there is no method allowed (docker or binary)'
  File "/usr/local/lib/python3.8/site-packages/cortexutils/worker.py", line 155, in error
    self.__write_output({'success': False,
  File "/usr/local/lib/python3.8/site-packages/cortexutils/worker.py", line 122, in __write_output
    with open('%s/output/output.json' % self.job_directory, mode='w') as f_output:
PermissionError: [Errno 13] Permission denied: '/job/output/output.json'

My docker-compose.yml

version: '3.8'
services:
  cortex:
    image: thehiveproject/cortex:latest
    container_name: cortex
    command: --no-config
    networks:
      - default
    restart: unless-stopped
    volumes:
      - ./vol/cortex/application.conf:/etc/cortex/application.conf
      - /var/log/cortex:/var/log/cortex
      - /var/run/docker.sock:/var/run/docker.sock
      - /tmp:/tmp
    userns_mode: "host"
    privileged: true
    ports:
      - '0.0.0.0:9001:9001'
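
The PermissionError above suggests the user inside the analyzer container cannot write to the job directory that Cortex created under /tmp. A possible way to investigate, sketched with an illustrative job folder name (the actual naming may differ):

# On the host, inspect ownership of the job folder while a job runs:
ls -ln /tmp
# If the output/ subfolder is not writable by the UID the analyzer
# runs as, a blunt workaround is to open up the job tree:
sudo chmod -R a+rwX /tmp/<job-folder>   # folder name is illustrative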
