This is a service repository with examples showing how to use the official DRES Client OAS to create participating clients for DRES.
Currently, there are examples in these languages:
To keep the examples somewhat aligned, we rely on the great Gradle OpenAPI Generator Plugin and thus on Gradle. However, since we use the Gradle wrapper, you should not really notice this.
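As a rough illustration, a minimal Gradle (Kotlin DSL) configuration of the OpenAPI Generator plugin could look like the sketch below. The plugin and Kotlin versions, the location of the DRES Client OAS file, and the package name are assumptions for illustration; the examples in this repository define their own values.

```kotlin
// build.gradle.kts -- minimal sketch, not the exact configuration used in the examples
plugins {
    kotlin("jvm") version "1.9.24"                 // assumed Kotlin version
    id("org.openapi.generator") version "7.4.0"    // assumed plugin version
}

openApiGenerate {
    generatorName.set("kotlin")                           // target language of the generated client
    inputSpec.set("$projectDir/dres-client-oas.json")     // assumed path to the DRES Client OAS
    outputDir.set("$projectDir/build/generated")          // where the generated client code ends up
    packageName.set("dev.dres.client")                    // assumed package for the generated code
}
```

Running `./gradlew openApiGenerate` then produces the client sources, which can be added to the project's source set and used like any other library.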
Besides that, the examples are in various languages with their own set of prerequisites.
These examples showcase how to use the OpenAPI Generator in combination with the official DRES Client OpenAPI Specification in order to generate client code in several languages. The generated code handles all communication with DRES; only wiring that generated code to the application logic is left to the user.
In these examples, we show how to obtain the generated code and how to use it. The same code generation and usage can be applied to existing projects in a similar fashion.
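As a sketch of what wiring the generated code to application logic can look like, the snippet below logs a client in against a DRES instance. All names here (the package `dev.dres.client`, the classes `UserApi` and `LoginRequest`, the `postApiV1Login` operation, the `sessionId` field, and the server URL) are assumptions based on a typical Kotlin generator run; consult the sources actually produced by your build for the real names.

```kotlin
// Minimal usage sketch -- the generated class and method names below are assumptions
// and depend on the OAS version and generator settings; check your generated sources.
import dev.dres.client.apis.UserApi          // hypothetical generated API class
import dev.dres.client.models.LoginRequest   // hypothetical generated request model

fun main() {
    // Point the generated client at a DRES instance (the URL is an assumption).
    val userApi = UserApi(basePath = "http://localhost:8080")

    // The generated code performs serialization and the HTTP round trip;
    // the application only supplies credentials and consumes the result.
    val session = userApi.postApiV1Login(LoginRequest(username = "user", password = "password"))
    println("Logged in with session id ${session.sessionId}")
}
```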
If you run into trouble with the examples shown here, please open an issue. However, issues with the server itself have to be submitted to the DRES issue page.
We kindly ask you to refer to the following paper in publications mentioning or employing DRES:
Rossetto L., Gasser R., Sauter L., Bernstein A., Schuldt H. (2021) A System for Interactive Multimedia Retrieval Evaluations. In: Lokoč J. et al. (eds) MultiMedia Modeling. MMM 2021. Lecture Notes in Computer Science, vol 12573. Springer, Cham. Link: https://doi.org/10.1007/978-3-030-67835-7_33
Bibtex:
@InProceedings{10.1007/978-3-030-67835-7_33,
author="Rossetto, Luca
and Gasser, Ralph
and Sauter, Loris
and Bernstein, Abraham
and Schuldt, Heiko",
editor="Loko{\v{c}}, Jakub
and Skopal, Tom{\'a}{\v{s}}
and Schoeffmann, Klaus
and Mezaris, Vasileios
and Li, Xirong
and Vrochidis, Stefanos
and Patras, Ioannis",
title="A System for Interactive Multimedia Retrieval Evaluations",
booktitle="MultiMedia Modeling",
year="2021",
publisher="Springer International Publishing",
address="Cham",
pages="385--390",
abstract="The evaluation of the performance of interactive multimedia retrieval systems is a methodologically non-trivial endeavour and requires specialized infrastructure. Current evaluation campaigns have so far relied on a local setting, where all retrieval systems needed to be evaluated at the same physical location at the same time. This constraint does not only complicate the organization and coordination but also limits the number of systems which can reasonably be evaluated within a set time frame. Travel restrictions might further limit the possibility for such evaluations. To address these problems, evaluations need to be conducted in a (geographically) distributed setting, which was so far not possible due to the lack of supporting infrastructure. In this paper, we present the Distributed Retrieval Evaluation Server (DRES), an open-source evaluation system to facilitate evaluation campaigns for interactive multimedia retrieval systems in both traditional on-site as well as fully distributed settings which has already proven effective in a competitive evaluation.",
isbn="978-3-030-67835-7"
}
Contributions are always welcome. Feel free to add an example in your preferred language and create a PR for it.