jobs UI: make it easy to copy/paste the output file paths #656
Comments
"For notebooks there is a "show in file browser" option. That would work" |
I know I can get the path of the file by extracting it from the product url, but is it also possible to use job_dir? I don't see it in the job information from dps, only job_dir_size |
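A rough sketch of the URL-to-path extraction being discussed. The mount root and URL layout here are assumptions for illustration, not the actual MAAP configuration:

```python
from urllib.parse import urlparse

def product_url_to_path(product_url, mount_root="/projects/my-private-bucket"):
    """Map an S3 product URL to a workspace file path.

    Hypothetical mapping: assumes the bucket contents are mounted at
    `mount_root` inside the workspace, so the object key becomes a
    path relative to that root.
    """
    key = urlparse(product_url).path.lstrip("/")
    return f"{mount_root}/{key}"

print(product_url_to_path("s3://my-bucket/dps_output/algo/run-1/out.tif"))
# → /projects/my-private-bucket/dps_output/algo/run-1/out.tif
```

This is the manual workaround the comment refers to; the rest of the thread is about exposing the path directly so users don't have to do this.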
The jobs UI already has the paths of the outputs. This is just to make it easier to either browse to them or copy/paste the path for terminal commands. Does that help?
@rtapella Where is the path to the outputs? I don't see it in the `jobInfo` object.
If you select a job in the list, the bottom panel should have details of that job. One of the sub-tabs is Outputs, which should list the path(s).
I only see the links.
To update on this ticket, we are going to add the file path to the jobs object by modifying the maap-api-nasa repository.
@grallewellyn As far as I know, that's correct. @sujen1412 Do you recall what process creates the
The
Do you know what creates the directory when a job is run for the first time by a user? MAAP-API? |
Since that path is an S3 object path, there is no concept of a directory here. The files are S3 objects with keys containing
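To illustrate the point above: S3 has no real directories, so a "folder" is just a shared key prefix, and listing one amounts to filtering keys on that prefix (which S3's `ListObjectsV2` does server-side via its `Prefix` parameter). A minimal sketch with made-up keys:

```python
# Example S3 keys (made up for illustration): the "directories" in
# these paths are purely a naming convention, not filesystem objects.
keys = [
    "dps_output/algo/2023/01/run-a/out.tif",
    "dps_output/algo/2023/01/run-a/_stdout.txt",
    "dps_output/algo/2023/02/run-b/out.tif",
]

def list_prefix(keys, prefix):
    """Emulate an S3 'directory listing' by filtering on key prefix."""
    return [k for k in keys if k.startswith(prefix)]

print(list_prefix(keys, "dps_output/algo/2023/01/"))
# → both run-a objects, but not run-b
```

This is why "what creates the directory" has no direct answer: the prefix simply comes into existence when the first object with that key is written.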
What in the DPS does this?
The dataset_ingest script is responsible for pushing datasets to S3: https://github.com/hysds/hysds/blob/1054c0588ff7a8b9875932581010d37502662a2e/hysds/dataset_ingest.py#L595
@sujen1412 What are the cases when nothing is in "products_staged"? I have a couple of job examples with products_staged as an empty array. Those jobs failed, but some other failed jobs have URLs in their "products_staged" (i.e. /triaged_job/...).
This particular example is one of the cases where there was no configured dataset recognition, which means the job might not have created an output directory, and the timestamp suggests it's really old job metadata from before job_triage was implemented. Do you see this in newer jobs, say in the last 2 months?
Sujen and I resolved this. The problem was that the job failed to even download the user's container, so Docker never started, which meant the job never produced output or triage. I only had a couple of jobs missing products_staged, so we will leave the file paths empty for those jobs since there really isn't anything to put.

In ops, I have a

Also, it seems like I will need to hard code in adding
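The fallback described above (leave the paths empty when nothing was staged) could look roughly like this. The field names come from the discussion, but the exact shape of the real DPS job object is an assumption:

```python
def staged_output_urls(job_info):
    """Return the staged product URLs for a job, or [] when nothing
    was staged (e.g. the container never started).

    Assumed shape: `products_staged` is a list of product dicts,
    each with a `urls` list; real DPS metadata may differ.
    """
    urls = []
    for product in job_info.get("products_staged") or []:
        urls.extend(product.get("urls", []))
    return urls

# A job that never ran simply yields an empty list of paths.
print(staged_output_urls({"products_staged": []}))  # → []
```

Using `or []` also covers jobs where the key is missing entirely, not just present-but-empty.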
The update from the hackathon:
to
So that Also, it is okay to hard code in
@grallewellyn just make sure these changes are committed to the appropriate devfiles (e.g. https://github.com/MAAP-Project/maap-workspaces/tree/main/devfiles/vanilla/devfile). I will publish these changes manually for the v3.1.4 release. I have a story for v3.1.5 to automate this step: #894
@bsatoriu Do we want this done for the v3.1.4 release, or is development supposed to be done by then? Also,
Can we push this to v3.1.5?
Sure. I think this was a bit more complicated than we expected, and if the opportunity cost on other features is too high, it's okay to push it back a release.
What is the use case for
@grallewellyn this PR fixes the file URL copy bug for the triaged-jobs mounted directory: MAAP-Project/maap-api-nasa#101
Jobs UI Usability Test
Typically someone will start from the "output" (== "product") section of a job result and go to that folder.
The file itself is not as important to copy/paste as the folder it's in (for `cd`).
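In other words, the most useful thing to put on the clipboard is the containing folder, which is just the dirname of the output path. A one-line sketch (the example path is hypothetical):

```python
import os

output_path = "/projects/dps_output/algo/run-1/out.tif"  # hypothetical path
folder = os.path.dirname(output_path)  # what users actually paste after `cd`
print(folder)
# → /projects/dps_output/algo/run-1
```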