
[server] Define how to handle results analysis #174

Closed
gnuletik opened this issue Apr 8, 2020 · 6 comments

@gnuletik
Collaborator

gnuletik commented Apr 8, 2020

What is the workflow, and what are all the cases to handle for results analysis?
We may skip framewise analysis for now and only take timewise analysis into account.

We may need to:

  • Add a new route providing the expected data type of an analysis result (PNG / JSON / HDF5)
  • Add a MIME field
  • Let analyzers declare their own data types

Some routes may be removed in the future.
For example, /visual currently returns the static string "details".

@gnuletik
Collaborator Author

gnuletik commented May 6, 2020

Hi @yomguy, do you have time to take a look at this?
That would save us some time :)
Thanks!

@yomguy
Member

yomguy commented May 19, 2020

Here is the workflow:

  • the user adds an analysis track with the + button from the player
  • the user selects a SubProcessor to run on top of the item
  • a Preset is created and associated with the SubProcessor as an Analysis
  • an Experience and a Task are created by the backend
  • the Task is run
  • ideally, the user knows that the task is running (see [API] Redefine workflow to notify client when a task status is updated #193)
  • an AnalysisTrack is created, linked to the Analysis
  • a new track is added to the player track list
  • the result_url is given by AnalysisTrack.result_url
  • the type of the result is given by Result.mime_type
  • if the type is an image, the player will display it in the track
  • if the type is None:
    • the result_id is given by Analysis.sub_processor_id
    • the relevant part of the total JSON data is given by the result_id
    • if the time_mode is time_wise, the data is drawn as is (no example yet)
    • if the time_mode is frame_wise, the signal is displayed by recomposing the time interval on the time axis with duration*samplerate/blocksize
  • the result of the Analysis for the given SubProcessor is displayed
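A minimal client-side sketch of that branching, in Python with the requests library; result_url, mime_type and file come from the workflow above, but the exact payload shapes and the absence of authentication are assumptions.

```python
import requests

def handle_analysis_track(analysis_track):
    """Dispatch on the Result type of an AnalysisTrack (dict from the API)."""
    # Follow AnalysisTrack.result_url to fetch the Result object.
    result = requests.get(analysis_track["result_url"]).json()

    mime_type = result.get("mime_type")
    if mime_type and mime_type.startswith("image/"):
        # Image result: the player displays it directly in the track.
        return {"kind": "image", "url": result.get("file")}
    if mime_type is None:
        # JSON result: the sub-result is selected with Analysis.sub_processor_id,
        # then drawn time-wise or frame-wise depending on its time_mode.
        return {"kind": "json", "result": result}
    # Anything else (e.g. audio/* transcoding results) is not drawn as a track.
    return {"kind": "other", "mime_type": mime_type}
```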

There are 4 types of data:

  • bitmap (width and height are given as parameters, with default values)
  • json (2D, timewise) (RARE)
  • json (2D, framewise): in this case, the backend should project the frame vector into the time space
  • single value: this type of data is obviously not returned in a track but as a global value
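To make the frame-wise case concrete, here is a minimal sketch (parameter names are illustrative) of projecting a frame vector back onto the time axis: duration * samplerate / blocksize frames cover the item, so frame i starts at i * blocksize / samplerate seconds.

```python
import numpy as np

def frames_to_time_axis(frame_values, samplerate, blocksize):
    """Map frame-wise analyzer values onto the time axis.

    Returns (times, values): times[i] is the start time in seconds
    of frame i, i.e. i * blocksize / samplerate.
    """
    values = np.asarray(frame_values)
    times = np.arange(len(values)) * blocksize / samplerate
    return times, values

# Example: a 10 s item at 44100 Hz with 1024-sample blocks yields
# about 10 * 44100 / 1024 ≈ 430 frames spread over the 10 s axis.
```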

@Tointoin please provide a property to Processor defining its data type

@Tointoin please serialize the MIME type of the result which is already a model field.

@gnuletik
Collaborator Author

Thanks @yomguy!
That will help the frontend development, and we can use these specs for the docs in the future.

Just some minor adjustments:

the user selects a SubProcessor to run on top of the item

The docs state that the createAnalysisTrack operation expects an Analysis instead of a SubProcessor.
See https://sandbox.wasabi.telemeta.org/timeside/api/docs/#operation/createAnalysisTrack
Should it be updated to expect a SubProcessor?

Are you OK if I edit your comment to clarify which steps should be done by the API and which by the client?

e.g.

  • [Client/createAnalysisTrack] the user adds an analysis track with the + button from the player
    • the user selects a SubProcessor to run on top of the item
  • [Server/App] an Experience and a Task are created
  • [Server/Worker] the Task is run
  • [Server/App] an AnalysisTrack is created, linked to the Analysis

I'll start implementing the results on the player once #191 is fixed.

@gnuletik
Collaborator Author

Some additional questions :)

What are the expected values of Result.mime_type?

if the type is an image, the player will display it in the track
if the type is None:

So the type (Result.mime_type) has two possible values, 'bitmap' or None? If that's true, @Tointoin, can you set an Enum (like the status field) on the mime_type field in the schema?

How to get the image from the API point of view?

if the type is an image, the player will display it in the track

Is it available in Result.file? If so, I can get the Result from AnalysisTracks[i].result_url.
Or should I make an additional request to retrieveResultVisualization (/timeside/api/results/{uuid}/visual/) if mime_type == 'bitmap'?

What is the data format and where is it stored?

json (2D, timewise) (RARE)
json (2D, framewise): in this case, the backend should project the frame vector into the time space
single value: this type of data is obviously not returned in a track but as a global value

Where are these values stored? In Result.hdf5 (available inside retrieveResult)?
What is the typing of this data?
For 2D JSON, is it an array of numbers?

Available fields are shown here: https://sandbox.wasabi.telemeta.org/timeside/api/docs/#operation/retrieveResult

Thanks!

@Tointoin
Collaborator

Tointoin commented May 19, 2020

Once again, thanks @yomguy for this spec!
Quick details on next steps for backend dev/refactor:

@Tointoin please provide a property to Processor defining its data type

A priori not necessary for now, given the workflow, the fact that mime_type is already in Result, and that the time mode is in the JSON serialization.

@Tointoin please serialize the MIME type of the result which is already a model field.

The result's MIME type is already serialized (not related to the item's MIME type here).

What I have to do ASAP is:

  • manually change Subprocessor and Analysis to have the spectrogram grapher and the aubio_pitch and onset_detection analyzers with the right subprocessor_id
  • refactor AnalysisTrackSerializer.get_result_url so that it only returns the /timeside/api/results/{uuid}/ route (see the sketch after this list)
  • discuss and resolve #193 (task status notification) as quickly as possible
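A hedged sketch of what that get_result_url refactor could look like, using standard Django REST Framework conventions; the field list, the result relation on the model and the "result-detail" view name are assumptions, not the actual timeside.server code.

```python
from rest_framework import serializers
from rest_framework.reverse import reverse

from timeside.server.models import AnalysisTrack  # import path assumed


class AnalysisTrackSerializer(serializers.ModelSerializer):
    result_url = serializers.SerializerMethodField()

    class Meta:
        model = AnalysisTrack
        fields = ["uuid", "analysis", "result_url"]  # illustrative field list

    def get_result_url(self, obj):
        # Return the plain /timeside/api/results/{uuid}/ route instead of
        # the /visual/ sub-route; the "result-detail" view name is assumed.
        request = self.context.get("request")
        return reverse("result-detail",
                       kwargs={"uuid": obj.result.uuid},
                       request=request)
```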

manually change Subprocessor and Analysis [...]

@yomguy please note that this will have to be redefined properly for a timeside.server instantiation from scratch in the timeside-create-boilerplate command, by no longer going through graphers (correcting plugins' _staging options? removing the obsolete DisplayAnalyzer?)

@Tointoin
Collaborator

What are the expected values of Result.mime_type?

@gnuletik the possible Result.mime_type values are, for now:

  • an empty mime_type (None) in the spec, meaning that you'll need to retrieve /timeside/results/{result_uuid}/json/

  • image/png (forget bitmap here 😅)

  • audio/x-flac, audio/mpeg or audio/ogg for transcoding results (not relevant here)

  • we could add an Enum field here with no breaking change other than a migration (see the sketch below)
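A sketch of that Enum idea with Django choices, which would only require a migration; the model and field definitions below are assumptions based on this thread, not the actual timeside.server code.

```python
from django.db import models


class Result(models.Model):
    MIME_TYPE_CHOICES = [
        ("", "Unspecified: JSON result, fetch /timeside/results/{uuid}/json/"),
        ("image/png", "PNG visualization"),
        ("audio/x-flac", "FLAC transcoding"),
        ("audio/mpeg", "MP3 transcoding"),
        ("audio/ogg", "OGG transcoding"),
    ]

    # Constrain mime_type to the values listed above; blank stands for None.
    mime_type = models.CharField(max_length=64, blank=True,
                                 choices=MIME_TYPE_CHOICES)
```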

How to get the image from the API point of view?

Available in AnalysisTracks[i].result_url.file after a bit of refactoring 😉
retrieveResultVisualization (/timeside/api/results/{uuid}/visual/) will be deprecated, or at least not used here.
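A minimal sketch of that image path, assuming the Result will expose its file URL once the refactor lands; the dict keys and the absence of authentication are assumptions.

```python
import requests

def fetch_result_image(analysis_track):
    """Download the PNG of an image result instead of calling /visual/."""
    result = requests.get(analysis_track["result_url"]).json()
    if result.get("mime_type") != "image/png":
        raise ValueError("Not an image result")
    return requests.get(result["file"]).content  # raw PNG bytes
```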

What is the data format and where is it stored?

Where are these values stored? In Result.hdf5 (available inside retrieveResult)?
What is the typing of this data? For 2D JSON, is it an array of numbers?

Ugliest part here: you'll have to deal with the result.hdf5 data structure:

  • First, from AnalysisTracks[i].result_url.mime_type == None, you know that you'll need /timeside/results/{uuid}/json/
  • Then you'll have to deal with a structure that won't have a proper schema for now
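A hedged sketch of that JSON path; the route comes from this thread, but the payload layout (a list of sub-results carrying id_metadata and data_object keys) is an assumption, since the structure has no schema yet.

```python
import requests

BASE = "https://sandbox.wasabi.telemeta.org/timeside"

def fetch_json_sub_result(result_uuid, sub_processor_id):
    """Fetch the undocumented JSON route and pick one sub-result."""
    payload = requests.get(f"{BASE}/results/{result_uuid}/json/").json()
    # Assumed layout: a list of sub-results identified by their processor id.
    for sub_result in payload:
        if sub_result.get("id_metadata", {}).get("id") == sub_processor_id:
            return sub_result.get("data_object")
    raise KeyError(f"No sub-result found for {sub_processor_id}")
```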

Examples here:

for our beloved "beat it" test item 😉

This route is not under /timeside/api/ and is not in the docs yet:

  • I could fix it by changing the route a bit and probably giving it a proper operation id
