Unable to ship mordor logs to HELK #44

Open · blueteambram opened this issue May 29, 2021 · 9 comments
@blueteambram

I went through the walkthrough for installing HELK, and when I try to ingest the JSON files using the data-shipper script, I get an error saying that it is unable to open the JSON file. I was able to get it to work by instead passing the script a tar.gz data set, and it shows as complete, but when I go to the Discover tab in Kibana, no logs appear. Also, when I look at the Elasticsearch index management tab, it shows the winlogbeat-mordor index and the number of events parsed, but its health status is yellow.

@sec-balkan

Have you been able to solve it?

@blueteambram
Author

No, I have not

@automate-tim

The solution that enabled me to get it up and running is below.

The ingest-into-Elastic script currently only works with .tar.gz files; however, everything is in there to ingest a single JSON file, since that is what the compressed files contain. I haven't exported that functionality into a standalone script yet.

A workaround that worked for me was `tar -czvf mordor-data.tar.gz json-files-here`. (I've also got a fork that only has .tar.gz files; I'll work with the authors to figure out whether they want me to merge.)
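
For what it's worth, here is a minimal sketch of what direct single-file ingestion could look like, assuming the data set is newline-delimited JSON and the `elasticsearch` Python client (7.x) is installed; the index name `winlogbeat-mordor` is my assumption, not something the script guarantees:

```python
# A minimal sketch, not the project's shipper: bulk-index one
# newline-delimited JSON file into Elasticsearch. Assumes
# elasticsearch-py 7.x and an NDJSON input file; the index name
# "winlogbeat-mordor" is an assumption for illustration.
import json
from elasticsearch import Elasticsearch, helpers

def ship_json(path, url="http://localhost:9200", index="winlogbeat-mordor"):
    es = Elasticsearch(url)
    with open(path) as f:
        # One event per line, as in the Security-Datasets exports.
        actions = ({"_index": index, "_source": json.loads(line)}
                   for line in f if line.strip())
        helpers.bulk(es, actions)
```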

To address the second part @blueteambram mentioned about indices: I also had a yellow status but was able to use the index. Once Elasticsearch has parsed the winlogbeat-mordor index and it has events, you need to create a Kibana index pattern for it before you can use the search functionality in the Discover tab.
[Screenshot: creating the winlogbeat-mordor index pattern in Kibana (2021-06-11)]
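
If you'd rather script that step, here is a sketch using Kibana's saved objects API; I'm assuming Kibana 7.x on localhost:5601, and the pattern title and time field name are assumptions:

```python
# A sketch only: create the index pattern through Kibana's saved
# objects API instead of the UI. Assumes Kibana 7.x on localhost:5601;
# "winlogbeat-mordor*" and "@timestamp" are assumed values.
import requests

resp = requests.post(
    "http://localhost:5601/api/saved_objects/index-pattern",
    headers={"kbn-xsrf": "true"},  # header required by Kibana's API
    json={"attributes": {"title": "winlogbeat-mordor*",
                         "timeFieldName": "@timestamp"}},
)
resp.raise_for_status()
```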

Once you have created the index pattern, select it in the Discover tab to start hunting! Also remember these data sets were created years ago, so set the timeframe accordingly.
[Screenshot: selecting the index pattern in Discover and widening the time range (2021-06-11)]

This may not be a complete solution, but it is what ended up working for me so hopefully it can provide some guidance for anyone else experiencing this issue.

@sec-balkan

I can ingest it into winlogbeat-mordor, but the events are not shown in the dashboards.

@Cyb3rWard0g
Collaborator

Hello @tim-scythe, would you mind sharing more details about using the files as .tar.gz with Elasticsearch? Sorry all for the delay, and thank you for taking the time to test and share more details.

@automate-tim

> I can ingest it into winlogbeat-mordor, but the events are not shown in the dashboards.

Long delay between my replies, but an additional step to make sure the data shows up is to expand the time window. Some of this data was generated years ago, so you will need to expand your search window to the past three to four years to accommodate that.
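
For illustration, a minimal sketch of a query with an explicitly wide time window, assuming elasticsearch-py 7.x and that the events carry an `@timestamp` field (the date bound is arbitrary):

```python
# A sketch only: search with an explicitly wide @timestamp range so
# the older Mordor events are not filtered out. Assumes
# elasticsearch-py 7.x; the date bound is arbitrary.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
res = es.search(
    index="winlogbeat-mordor",
    body={"query": {"range": {"@timestamp": {"gte": "2019-01-01"}}}},
)
print(res["hits"]["total"])
```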

> Hello @tim-scythe, would you mind sharing more details about using the files as .tar.gz with Elasticsearch? Sorry all for the delay, and thank you for taking the time to test and share more details.

Hey @Cyb3rWard0g! When looking at sending data to Elasticsearch, I was using the script at https://github.com/OTRF/Security-Datasets/blob/master/scripts/data-shippers/Mordor-Elastic.py and noticed that it wasn't able to parse the zip files that contain the data, such as those under https://github.com/OTRF/Security-Datasets/tree/master/datasets/atomic/windows/discovery/host.

Code below is from Mordor-Elastic.py, lines 63-66:

```python
if args.recursive:
    paths = [ p for path in args.inputs for p in path.glob("**/*.tar.gz") if p.is_file() ]
else:
    paths = [ path for path in args.inputs if path.is_file() ]
```

and lines 81-86 only support the tar file type:

```python
tf = tarfile.open(path)
for m in tf.getmembers():
    if m.isfile():
        print(f"- Importing member file {m.name}...")
        logfile = f"{path}/{m.name}"
        mf = tf.extractfile(m)
```

I'll be honest: I tinkered for a few hours trying to get zip file extraction working in a similar manner, so the script could support both formats, but converting the zip files to tar ended up being the simpler implementation that actually worked.
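
For anyone who wants to script that conversion, a minimal sketch of the zip-to-tar.gz repackaging using only the standard library (file names are illustrative):

```python
# A sketch only: repackage a dataset zip as .tar.gz so the existing
# tarfile-based logic in Mordor-Elastic.py can consume it. File names
# are illustrative; only the standard library is used.
import tarfile
import tempfile
import zipfile
from pathlib import Path

def zip_to_targz(zip_path, targz_path):
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(tmp)
        with tarfile.open(targz_path, "w:gz") as tf:
            for p in Path(tmp).rglob("*.json"):
                tf.add(p, arcname=p.name)  # keep members flat

zip_to_targz("dataset.zip", "mordor-data.tar.gz")
```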

@l0gm0nk3y69

I am confused about why this script only works with .tar files ("The ingest-into-Elastic script currently only works with .tar.gz files") when the "Ship Data to HELK" page (https://securitydatasets.com/consume/helk.html) clearly shows it working with JSON files:

```
mordor/scripts/data-shippers/Mordor-Elastic.py --url http://localhost:9200 inputs empire_dcsync_dcerpc_drsuapi_DsGetNCChanges_2020-09-21185829.json
```

Did something change???

@automate-tim

Hey @l0gm0nk3y69, I was looking at ways to mass-ingest the data. At the time, each JSON file was compressed into a zip file, so it required unzipping the file and then using the ingestion script as you mentioned. The ingestion script worked on tar files and would extract each of the JSON files within them; however, it did not work the same way with zip-compressed files, which was the primary format used to store a lot of the data in this repository.

@mheikalpro

I am stuck importing data set files to HELK using the Mordor-Elastic.py script and get the following error: `TypeError: Positional arguments can't be used with Elasticsearch API methods. Instead only use keyword arguments.`
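
Not an official fix, but that TypeError typically comes from elasticsearch-py 8.x, which no longer accepts positional arguments in API methods. A sketch of the difference (the index and document values are illustrative):

```python
# A sketch only: elasticsearch-py 8.x raises this TypeError when API
# methods receive positional arguments. Values here are illustrative.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
doc = {"@timestamp": "2020-09-21T18:58:29", "event": "example"}

# Fails on elasticsearch-py 8.x:
#   es.index("winlogbeat-mordor", doc)
# Works (keyword arguments only):
es.index(index="winlogbeat-mordor", document=doc)
```

Pinning an older client (e.g. `pip install 'elasticsearch<8'`) or updating the script's calls to keyword arguments should both resolve it.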
