
Proposal to implement async API (Futures) #52

Open
hugues31 opened this issue Nov 1, 2017 · 37 comments

Comments

@hugues31
Owner

hugues31 commented Nov 1, 2017

Hello,

After some research, I decided to give Futures a try to come up with a simple async API. The main idea is to keep everything very straightforward for the user (the event loop is created by the API).

The idea originates from here: #9

The source code is:

#![feature(conservative_impl_trait)]

extern crate futures;
extern crate hyper;
extern crate tokio_core;
extern crate serde_json;

use futures::Future;
use futures::future;
use futures::stream::Stream;

use hyper::Client;
use serde_json::Value;


struct Api {
    core: tokio_core::reactor::Core,
}

impl Api {
    pub fn new() -> Api {
        Api {
            core: tokio_core::reactor::Core::new().unwrap(),
        }
    }

    pub fn time(&self) -> impl Future<Item = Value, Error = hyper::Error> {
        let url = "http://date.jsontest.com";
    
        let uri = url.parse::<hyper::Uri>().unwrap();

        let handle = self.core.handle();
        let client = Client::new(&handle);

        client.get(uri).and_then(|res| {
            res.body().concat2().and_then(move |body| {
                let v: Value = serde_json::from_slice(&body).unwrap();
                future::ok(v)
            })
        })
    }
}

fn main() {
    // create the API and the Tokio core
    let mut api = Api::new();

    // BENCHMARK THE SYNC VERSION
    let work_1 = api.time(); // create the handle, the client and send request inside a Future
    let res = api.core.run(work_1).unwrap();
    let time_1 = res["milliseconds_since_epoch"].as_f64().unwrap();

    let work_2 = api.time(); 
    let res = api.core.run(work_2).unwrap();
    let time_2 = res["milliseconds_since_epoch"].as_f64().unwrap();

    println!("\nSync version takes {} milliseconds to execute.", time_2 - time_1);

    // BENCHMARK THE ASYNC VERSION
    let work_async = api.time().join(api.time());
    let (res_1, res_2) = api.core.run(work_async).unwrap();
    let time_1 = res_1["milliseconds_since_epoch"].as_f64().unwrap();
    let time_2 = res_2["milliseconds_since_epoch"].as_f64().unwrap();

    println!("\nAsync version takes {} milliseconds to execute.\n", (time_2 - time_1).abs());
}

Cargo.toml :

[dependencies]
futures = "0.1"
hyper = "0.11"
tokio-core = "0.1"
serde_json = "1.0.5"

The code above returns:


Sync version takes 198 milliseconds to execute.

Async version takes 5 milliseconds to execute.

What do you think? Since it's a huge rewrite, it would be nice to have some feedback :)

@ainestal
Contributor

ainestal commented Nov 1, 2017

This fundamentally changes how the whole library works and will break any currently working application using it.

That being said, I like the idea because it enables using APIs that push events alongside APIs that poll all the time. That is the main argument for me to use this approach.

The performance is also a good point, but currently it is not too bad, especially considering the polling interval is around 1-2 s on the exchanges we are supporting. For me, in this context, a few milliseconds are not a big deal.

I especially agree with the statement:

The main idea is to keep everything very straightforward for the user

So go for it! If you need help let me know.

@hugues31
Owner Author

hugues31 commented Nov 1, 2017

@ainestal Thanks for your input. Indeed, this is a breaking change (Coinnect 0.6, here we go). Performance-wise, this is exactly the same as before if the Future contains only one request, but as the number of requests increases, the advantage of the async API grows. We can add an example where we compare the time taken to pull the price of the BTC-ETC pair for the 4 exchanges with the sync and async methods, as shown above. My guess is that the async version is ≈300% faster.
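
A rough sketch of what that comparison could look like, reusing the Api struct from my first post (the exchange-specific ticker() calls are hypothetical, so I just call time() four times; join4 is the futures 0.1 combinator that waits for all four):

// Sketch only: fire four requests and let a single run of the event loop
// drive them all concurrently.
let work = api.time().join4(api.time(), api.time(), api.time());
let (r1, r2, r3, r4) = api.core.run(work).unwrap();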

@crackcomm

Hey, I am currently working on streams @ crypto-bank/orca.
I will finish streams and I will catch up with you guys here.

In the meantime, what do you guys think about using #47 and implementing traits for generated code?

@ainestal
Contributor

ainestal commented Nov 1, 2017

@crackcomm I think it can be very useful to have the code generation

@hugues31
Owner Author

hugues31 commented Nov 1, 2017

@crackcomm From what I have seen, orca is a Poloniex websocket API implementation, right? Apart from that, I did not study the OpenAPI issue. My plan for now and the foreseeable future is just to write some Rust code and bring Coinnect to a stable version.

@crackcomm

@hugues31 I am currently working only on Poloniex because I am still studying stream design in rust.

@ainestal I would like to see that too, but currently, for APIs like Poloniex that do not provide OpenAPI descriptors, I'd rather write the code myself, like @hugues31.

I will also implement async clients in orca, but responses will be strongly typed.
I am using protocol buffers on every step for storage and remote capabilities.

This data is mostly streamed to Python, where it can be used for training with TensorFlow.
Currently the data is deserialized, but a rust-numpy implementation is a possible future step.

@ainestal
Contributor

ainestal commented Nov 1, 2017

@crackcomm I didn't take a close look at OpenAPI, but the code generation feels very useful for the future. In the case of exchanges that don't implement it, it is probably easier to create the description of the API than to code it in Rust.

@hugues31 I agree with the approach of coding it now, since we will get the value straight away. I prefer that over having to wait until we have the code generation and can then describe the whole API, which will take a long time before it is completed and usable.

@ainestal
Contributor

Hey @hugues31, how is the adventure with the code going? Let me know if I can help. For now I'm waiting until you are done, and then I'll help migrate whatever is left (I guess at least Bitstamp).

@hugues31
Owner Author

hugues31 commented Nov 20, 2017

Hello @ainestal! I'm ready to begin the implementation for one exchange just to see what it will look like. I suggest taking Poloniex as an example since it may interest more people than the others.
I will create a new branch and you are very welcome to submit your PRs. A lot of code needs to be changed before the first successful compilation though.
EDIT: In fact, we need to change all exchanges at once... And I'm facing several issues, like sharing the same client across the API as we do now (and not recreating it each time we make a request), returning errors with error_chain, and so on... Not an easy task.
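
For the client sharing part, what I have in mind is roughly this (sketch only, based on the snippet in my first post; the real structs have more fields):

// Sketch: keep one hyper Client alive in the struct instead of building a
// new one on every call (same crates and imports as my first post).
struct Api {
    core: tokio_core::reactor::Core,
    client: hyper::Client<hyper::client::HttpConnector>,
}

impl Api {
    pub fn new() -> Api {
        let core = tokio_core::reactor::Core::new().unwrap();
        let client = hyper::Client::new(&core.handle());
        Api { core: core, client: client }
    }
    // time()/ticker() would then call `self.client.get(uri)` instead of
    // creating a fresh handle and client on every request.
}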

@ainestal
Contributor

This is the kind of change that is way easier to do by pairing, but that's kind of complicated in our case.
@hugues31 you can get rid of the rest of the exchanges while working on Poloniex. We will include the rest afterwards, when the library is compiling again.

If you see that it is very difficult to make it work with all the constraints we had, try removing some of them (errors, recreation of the client, etc.); we can reintroduce them later.

@hugues31
Owner Author

You can check the new "async" branch.
I just wrote down the dependencies and started converting the code, but there is a lot to do. Feel free to PR on this branch. I don't have a lot of time ahead of me for now...

@ainestal
Contributor

I'll take a look when I can; currently I don't have a lot of spare time.

@ainestal
Contributor

@hugues31 Yesterday I was taking a look at the code and thinking about how to do it. My main assumption is that "we want to deliver futures to the users of the library". Is this correct?

If this is true, I will work on the code trying to change as few parts as possible in order to make it work.

@hugues31
Owner Author

@ainestal Yes, that should be the point. I'm not sure if including the tokio_core inside the API is a good idea though.

@ainestal
Contributor

@hugues31 I'm trying to implement just the return of Futures from ticker, orderbook, add_order, and balances, staying at the generic_api layer, without going any deeper into the API of the exchanges.
I'm trying to keep it simple, just returning a Future instead of a Result. From what I understand, if we include the tokio_core in the API then we will just return a Result, not a Future. That defeats the point of using futures. If we do that just to get asynchronicity, then we could use threads and not suffer the complexity of Futures.

The part that is really making me struggle to implement this is the errors. I don't know how to pass the error chain to the future.
I've been taking a look at rust-lang-deprecated/error-chain#90 and https://github.com/sezaru/futures-error-chain but those are all workarounds.

So currently I don't know how to implement it. I'm tempted to try to get rid of error_chain and see if I can make the library work without it, but that feels like going a step backwards. What are your views on this?
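
To make the discussion concrete, the kind of shape I've been trying to get working looks roughly like this (sketch only; it assumes foreign_links entries for hyper::Error and serde_json::Error in the error_chain! invocation, and reuses the Core/handle setup from the first post):

use errors::Error;            // error_chain-generated module (assumed)
use futures::{Future, Stream};

// Sketch, inside the exchange's impl block: map the hyper and serde_json
// errors into the chained Error via the From impls that `foreign_links`
// generates, then box the future.
pub fn ticker(&self) -> Box<Future<Item = Value, Error = Error>> {
    let uri = "https://poloniex.com/public?command=returnTicker"
        .parse::<hyper::Uri>()
        .unwrap();
    let client = Client::new(&self.core.handle());

    let fut = client
        .get(uri)
        .and_then(|res| res.body().concat2())
        .map_err(Error::from)                 // hyper::Error -> Error
        .and_then(|body| {
            serde_json::from_slice::<Value>(&body)
                .map_err(Error::from)         // serde_json::Error -> Error
        });

    Box::new(fut)
}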

@hugues31
Owner Author

@ainestal I don't understand when you say that with the tokio_core in the API we will return a Result. If you look at my example (first post), it works right away. And indeed, the errors need a lot of rework...

@ainestal
Contributor

@hugues31 What I mean is that returning the Result is also happening in the example. You are using let res = api.core.run(work_1).unwrap(); in main. What I understand from that is that time would be a function from our library and that core would be used by the user. That same behavior is what I'm trying to implement in our library. In the async branch you are implementing futures at the api level, but generic_api and the exchange trait keep using Result. I was thinking that, by doing that, the user just gets a result that will be synchronous and will happen no faster than what we are allowing in order to avoid getting banned from the exchanges (so 1 or 2 seconds usually, depending on the exchange). If we just want to provide the possibility of launching several calls to the exchanges (same or different exchanges) in parallel, we could implement threading and return Results, and it would be much easier to implement and use than futures.

So with that in mind, I want to implement futures at the exchange trait level, without modifying anything at any deeper level. Treating the functions of the trait as atomic operations that can be converted into futures has the effect we are looking for and keeps the implementation simpler than propagating futures through the whole library. But it still allows us to use futures across the whole library later if we need to.

Btw, I'm still struggling with the errors. I'm trying different approaches without much luck so far.
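
For the trait-level change itself, what I have in mind roughly looks like this (sketch only; the type names approximate the existing Coinnect types and the signatures are simplified):

use futures::Future;

// Sketch: the exchange trait hands back boxed futures instead of Results,
// while everything underneath can stay synchronous for now.
pub trait ExchangeApi {
    fn ticker(&mut self, pair: Pair) -> Box<Future<Item = Ticker, Error = Error>>;
    fn orderbook(&mut self, pair: Pair) -> Box<Future<Item = Orderbook, Error = Error>>;
    fn add_order(&mut self, order: Order) -> Box<Future<Item = OrderInfo, Error = Error>>;
    fn balances(&mut self) -> Box<Future<Item = Balances, Error = Error>>;
}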

@JP-Ellis

Any update on this feature? I've actually implemented an async version of the Poloniex API myself (not public yet, though may never be) before finding this repo. I'd be quite happy to try and implement this feature in this crate.

The general syntax in my crate is:

use poloniex::Client;
use tokio_core::reactor::Core;

let mut core = Core::new().unwrap();
let polo = Client::new(&core.handle()).unwrap();

// Launch Jobs
let tickers = polo.tickers();
let day_volume = polo.day_volume();

// Tickers
let tickers = core.run(tickers).unwrap();
assert!(tickers.contains_key("BTC_ETH"));

// Day volume
let (pairs, totals) = core.run(day_volume).unwrap();
assert!(pairs.contains_key("BTC_ETH"));
assert!(pairs["BTC_ETH"].contains_key("BTC"));
assert!(pairs["BTC_ETH"].contains_key("ETH"));
assert!(totals.contains_key("totalBTC"));

So it's not too dissimilar to what is proposed. It could be fairly easy for me to adapt what I have to make this crate async.

One notable difference is who owns the tokio Core. Your example has the API owning Core; however, this requires the API client to always be mutable, which I didn't really want. My example keeps Core separate so that multiple API clients can connect to the same Core (and whatever else is needed). Ideally, I would have had the API client be completely separate from Core; however, the underlying HttpsConnector within hyper requires a handle.

Would you like me to implement this feature in a fork?

@hugues31
Owner Author

hugues31 commented Dec 28, 2017

@JP-Ellis Hello and thank you for contributing to Coinnect. Yes, in my mind the tokio Core is owned by Coinnect to keep things simple. What are the disadvantages of having the tokio Core outside? The coinnect client is already mutable and we would need to change a lot of code to make it immutable (= remove features), so maybe the Core could be owned by the lib? You are free to start implementing this (long awaited!) feature; I will take a look at it regularly to see what we can do.

@ainestal
Contributor

In my mind the Core should also be outside of the library. The library would return futures and the Core would run them. If the Core is inside, the library would return Results, which is the same as it is doing now, with the difference of having some parallelism when running API calls for the same exchange; in that case just having a thread is much easier to implement and the final result is the same. With the Core outside the library, it would be possible to have true parallelism across all the exchanges.

Let me know if my assumptions are incorrect.
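
Roughly what I picture on the user's side (just a sketch; the per-exchange constructors taking a Handle and the exact Pair variant are assumptions):

use futures::Future;
use tokio_core::reactor::Core;

// Sketch: the user owns the event loop, the library only returns futures.
let mut core = Core::new().unwrap();

let mut kraken = KrakenApi::new(&core.handle());
let mut poloniex = PoloniexApi::new(&core.handle());

// Two different exchanges queried in parallel on the same event loop.
let work = kraken.ticker(Pair::BTC_ETH).join(poloniex.ticker(Pair::BTC_ETH));
let (kraken_ticker, poloniex_ticker) = core.run(work).unwrap();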

@hugues31
Owner Author

hugues31 commented Dec 28, 2017

@ainestal Look at my first post: the Core is owned by the struct but the methods return Futures. The code works. What's wrong with this idea? Let me know because I really want to implement an async API 😃 (or let someone do it, haha)

@ainestal
Contributor

ainestal commented Dec 28, 2017

@hugues31 I think the main difference I have in mind between having the Core as part of a specific exchange and having the Core outside them is reusability. If the Core is outside the library, a user would be able to use it for all the exchanges. If I understand correctly, that would allow all the interactions with any of the exchanges to be asynchronous and handled in the same manner, from the same Core.

I've been trying to implement futures in the outer layer with the ticker, orderbook, etc., without much luck. Maybe just using the newest version of hyper and using futures throughout our library is easier to implement. It sure looks like the proper approach to take. I just didn't want to do that as a first step because it involves changing everything in the library (almost all functions would have to work with futures).

Feel free to implement it, I think a non-perfect implementation of async is better than no implementation at all.

@hugues31
Owner Author

hugues31 commented Dec 28, 2017

@ainestal We can have Coinnect handle the different exchanges with the Core loop inside it. In fact, I see no point in letting the user handle it (the user can access the Core since it's public).
If I have sufficient time I will do a complete rewrite of this lib.

@tbourvon

My opinion on this issue is that it would be best to use the same Core structure for every exchange: the main module should dispatch it to each exchange API. Running multiple event loops would be inefficient and confusing.
Now regarding who should own the Core structure, I believe we could simply have two different constructors: one which creates a Core if the user doesn't care about it, and another which takes an existing Core from the user if they want to do more with the event loop. In concrete terms, we could use a reference to the Core throughout the code and optionally store a Core in another field.

How does that sound?
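
Something like this, as a very rough sketch (the names, and using a Handle rather than a &Core, are just placeholders):

use tokio_core::reactor::{Core, Handle};

// Sketch: the client either creates and owns its own event loop, or borrows
// a Handle to one the user already runs.
pub struct Coinnect {
    core: Option<Core>, // only populated when we created the loop ourselves
    handle: Handle,
}

impl Coinnect {
    /// The user doesn't care about the event loop: create and own it.
    pub fn new() -> Coinnect {
        let core = Core::new().unwrap();
        let handle = core.handle();
        Coinnect { core: Some(core), handle: handle }
    }

    /// The user already runs an event loop and just hands us a Handle.
    pub fn with_handle(handle: &Handle) -> Coinnect {
        Coinnect { core: None, handle: handle.clone() }
    }
}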

@tbourvon

(note that this architecture lets users choose between Futures and Results: they can call a generic API to manipulate futures themselves, or they can use the functions exposed in the main module to let Coinnect run the requests (which could still be built so that they run asynchronously and in parallel) and return Results)
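
To illustrate the Result-returning convenience layer on top of the future-returning one (everything here is hypothetical: the method name, the Pair/Ticker/Error types, and the core field from the constructor sketch above):

use futures::Future;

impl Coinnect {
    // Sketch: drive the future on the internally owned Core when we have one,
    // otherwise fall back to wait() (which assumes the user's loop is running
    // somewhere else), and hand back a plain Result.
    pub fn ticker_blocking(&mut self, pair: Pair) -> Result<Ticker, Error> {
        let fut = self.ticker(pair); // future-returning generic API
        match self.core.as_mut() {
            Some(core) => core.run(fut),
            None => fut.wait(),
        }
    }
}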

@hugues31
Owner Author

hugues31 commented Jan 3, 2018

I didn't think of using 2 constructors; I like this idea a lot. Is anyone working on that? I may have some time to prototype it.

@crackcomm

I believe a Future is not a limitation but only an enabling alternative to a Result.

See futures-await for a nice synchronous-feeling API.

The purpose of async/await is to provide a much more "synchronous" feeling to code while retaining all the power of asynchronous code!

Futures can be converted to a Result by calling .wait() and then just .unwrap().
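
For example (sketch only; note that with tokio-core something still has to drive the reactor the future was created on, so .wait() only helps when the Core is running elsewhere or the future does not depend on one):

// Blocking adapter sketch: turn a returned Future into a plain Result.
let ticker = api.ticker(pair) // hypothetical future-returning call
    .wait()                   // block the current thread until it resolves
    .unwrap();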

@hugues31
Owner Author

hugues31 commented Jan 3, 2018

@crackcomm Yes, no need to write a wrapper around functions returning a Future to get a Result; it is already simple to do :)

@crackcomm

@hugues31 I believe we should focus on creating boilerplate code to be used in #47.

@tbourvon

tbourvon commented Jan 3, 2018

I agree: since the OpenAPI codegen generates async code, it wouldn't be wise to try to convert the current exchange code to async. We should probably focus on the boilerplate, as @crackcomm said.

@rrichardson

Is there still interest in this lib (and this ticket)? I would like to use Coinnect and am interested in helping update it to async/await.

@hugues31
Owner Author

Hello @rrichardson,
I can confirm that Coinnect is NOT abandoned. I have some free time now, but I have left Rust aside for the moment, so my dev productivity is very low for async stuff. If you are willing to start a new branch implementing async for Coinnect, I will be very happy to review it and merge it into the master branch!

@rrichardson

@hugues31 -
I've gone ahead and created a working openapiv3 client code-generator that is at least adequate: https://github.com/rrichardson/openapi_generator
I forked it from https://gitlab.com/easymov/openapi_generator - I tried 4 different code-generators and they were all broken in various ways; this was the easiest starting point from which to create a working code-generator.

I've also created a provisional repo for the generated code for the various coinnect API clients here:
https://github.com/rrichardson/coinnect_codegen. It has a bug-fixed version of the Binance API spec that I used as my first target. I've not thoroughly tested the resulting client, but it compiles, so it must work :)

In the next week or two, I'll see what I can do about taking the coinnect data model and adapting it to an async version of ExchangeAPI.

At present, my primary interest is in collecting and storing market data, specifically detailed OrderFlow data over Websockets, so I will probably be expanding the API a bit.

After that comes the fun work of creating OpenAPI yamls for the rest of the "supported" exchanges. I guess, for now, if anyone has an API spec to submit, just make a PR to https://github.com/rrichardson/coinnect_codegen and I'll work on making sure the generator works for that spec.

@rrichardson

Also, thinking more about the streaming data aspects of this:
I think we should adopt a pub-sub stream mechanism using something like https://github.com/jonhoo/bus or https://github.com/Logicalshift/flo_stream as a basis of the implementation.

In fact, it might make sense to add a subscribe method to the ExchangeApi trait that will create a data-stream of the chosen instrument and route it into the pub-sub bus.

Thoughts?
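
Roughly the shape I'm imagining (sketch only; MarketEvent and the method signature are made up, and Error is just a stand-in):

use futures::stream::BoxStream;

// Stand-in error type; the real one would be whatever the library settles on.
type Error = Box<dyn std::error::Error + Send + Sync>;

// Placeholder event type; real variants would carry fully typed trade and
// order-book data.
pub enum MarketEvent {
    Trade { pair: String, price: f64, amount: f64 },
    OrderbookUpdate { pair: String },
}

pub trait ExchangeStreamApi {
    /// Open the websocket subscription for one instrument and return the
    /// resulting event stream, which the caller can then route into a
    /// pub-sub bus (bus, flo_stream, etc.).
    fn subscribe(&mut self, pair: &str) -> BoxStream<'static, Result<MarketEvent, Error>>;
}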

@hugues31
Owner Author

With time, I wonder how we can maintain Coinnect and keep it up to date with the various supported exchanges.
For now, this is "hand crafted": Kraken is missing a lot of trading pairs, for example.

We can generate most of the code from what you already implemented and an OpenAPI spec file.
However:

  • we need to write those OpenAPI files for each exchange (some exchanges like Binance offer one, and for the others it's simple enough to write our own)
  • OpenAPI does not support WebSockets, so we must use something similar like AsyncAPI. The websocket crate seems to be the de facto one for Rust but is no longer updated. What alternatives do we have? I like the subscribe method you mentioned.
  • I'm not well versed in code generation and in particular OpenAPI. I would like to keep the 'unified' API (à la ccxt) and I don't know if that's possible with OpenAPI.

I really don't know the best approach for long-term maintainability. In the end, I think we will end up with a lot of generated code AND a lot of hand-written code.

@rrichardson

We can make the codegen produce any code we want, including code that could lend itself well to directly fitting into the ExchangeApi trait. However, since every exchange has different naming conventions and models, custom code would still need to be written (maybe one day code-generated) to fulfill the ExchangeAPI trait directly. I do think that more work could be done to insulate the Coinnect APIs from the exchange APIs.

I'm hoping that for things like trading pairs and other specifics, we can just avoid specifying those altogether, and just pass those parameters through (in a non-typesafe manner) to the inner platform-specific API (which may or may not have types for the pairs).

I think we'd definitely want to write/copy API yaml specs for every platform/exchange. That's easier than writing code for every exchange.

OpenAPI does not support WebSockets, so we must use something similar like AsyncAPI. The websocket crate seems to be the de facto one for Rust but is no longer updated. What alternatives do we have? I like the subscribe method you mentioned.

Funny enough, this is the first I've ever heard of AsyncAPI :)

@rrichardson

There happen to be a couple of relevant discussions on /r/rust that relate exactly to trait implementations for separate APIs/endpoints. It might be overkill to go too far into trait-land, but the more implementation we can save in the long run, the better.

I just want to leave these links here so I can find them later:

https://www.reddit.com/r/rust/comments/lkdw6o/designing_a_new_architecture_for_rspotify_based/
https://docs.rs/gdbstub/0.4.3/gdbstub/target/ext/index.html#how-protocol-extensions-work---inlineable-dyn-extension-traits-idets
