fix(http client): more user-friendly error messages when decoding fails #853

Merged: 6 commits into master, Sep 15, 2022

Conversation

@niklasad1 (Member) commented Aug 11, 2022

This PR changes the HTTP client to decode the response as `Response<serde_json::Value> -> Response<R>` in order to get a more user-friendly error message.

Before:

```
response: Err(ParseError(Error("missing field `error`", line: 1, column: 3752798)))
```

After:

```
2022-08-11T13:57:12.033043Z  INFO http: r: Parse error: invalid type: string "lo", expected u8
```

Not sure about the overhead of doing this, but the async client already does this and it's more user-friendly.

Closes #298.
@niklasad1 niklasad1 requested a review from a team as a code owner August 11, 2022 14:02
@niklasad1 changed the title from "fix(http client): serde_json::Value -> T" to "fix(http client): more user-friendly error messages when decoding fails" on Aug 11, 2022
@niklasad1 (Member, Author):

I actually benched this and I couldn't find any significant regression when doing:

diff --git a/benches/bench.rs b/benches/bench.rs
index 0fd6c8e..6a28838 100644
--- a/benches/bench.rs
+++ b/benches/bench.rs
@@ -4,10 +4,10 @@ use crate::helpers::{ws_handshake, KIB};
 use criterion::*;
 use futures_util::future::{join_all, FutureExt};
 use futures_util::stream::FuturesUnordered;
-use helpers::{http_client, ws_client, SUB_METHOD_NAME, UNSUB_METHOD_NAME};
+use helpers::{http_client, to_human_friend_size, ws_client, SUB_METHOD_NAME, UNSUB_METHOD_NAME};
 use jsonrpsee::core::client::{ClientT, SubscriptionClientT};
 use jsonrpsee::http_client::HeaderMap;
-use jsonrpsee::types::{Id, ParamsSer, RequestSer};
+use jsonrpsee::types::{Id, ParamsSer, RequestSer, Response};
 use pprof::criterion::{Output, PProfProfiler};
 use tokio::runtime::Runtime as TokioRuntime;
 
@@ -16,7 +16,7 @@ mod helpers;
 criterion_group!(
 	name = types_benches;
 	config = Criterion::default().with_profiler(PProfProfiler::new(100, Output::Flamegraph(None)));
-	targets = jsonrpsee_types_v2
+	targets = types_serialize_request, types_decode_response
 );
 criterion_group!(
 	name = sync_benches;
@@ -62,8 +62,8 @@ fn v2_serialize(req: RequestSer<'_>) -> String {
 	serde_json::to_string(&req).unwrap()
 }
 
-pub fn jsonrpsee_types_v2(crit: &mut Criterion) {
-	crit.bench_function("jsonrpsee_types_v2_array_ref", |b| {
+pub fn types_serialize_request(crit: &mut Criterion) {
+	crit.bench_function("create_and_serialize_request_ref", |b| {
 		b.iter(|| {
 			let params = &[1_u64.into(), 2_u32.into()];
 			let params = ParamsSer::ArrayRef(params);
@@ -72,7 +72,7 @@ pub fn jsonrpsee_types_v2(crit: &mut Criterion) {
 		})
 	});
 
-	crit.bench_function("jsonrpsee_types_v2_vec", |b| {
+	crit.bench_function("create_and_serialize_request_alloc", |b| {
 		b.iter(|| {
 			let params = ParamsSer::Array(vec![1_u64.into(), 2_u32.into()]);
 			let request = RequestSer::new(&Id::Number(0), "say_hello", Some(params));
@@ -81,6 +81,46 @@ pub fn jsonrpsee_types_v2(crit: &mut Criterion) {
 	});
 }
 
+pub fn types_decode_response(crit: &mut Criterion) {
+	let serialized_vals: Vec<String> = [1 * KIB, 10 * KIB, 100 * KIB, 1_024 * KIB, 10_000 * KIB]
+		.into_iter()
+		.map(|size| serde_json::to_string(&Response::new("A".repeat(size), Id::Null)).unwrap())
+		.collect();
+
+	{
+		let mut group = crit.benchmark_group("decode_response");
+		for ser_json in &serialized_vals {
+			group.bench_with_input(
+				BenchmarkId::from_parameter(&to_human_friend_size(ser_json.len())),
+				&ser_json.len(),
+				|b, _| b.iter(|| black_box(serde_json::from_str::<Response<String>>(&ser_json).unwrap())),
+			);
+		}
+
+		group.finish();
+	}
+
+	{
+		let mut group = crit.benchmark_group("decode_response_twice");
+		for ser_json in &serialized_vals {
+			group.bench_with_input(
+				BenchmarkId::from_parameter(&to_human_friend_size(ser_json.len())),
+				&ser_json.len(),
+				|b, _| {
+					b.iter(|| {
+						black_box({
+							let raw_response = serde_json::from_str::<Response<serde_json::Value>>(&ser_json).unwrap();
+							serde_json::from_value::<String>(raw_response.result).unwrap()
+						})
+					})
+				},
+			);
+		}
+
+		group.finish();
+	}
+}
+
 trait RequestBencher {
 	const REQUEST_TYPE: RequestType;
 
diff --git a/benches/helpers.rs b/benches/helpers.rs
index 93d6b99..d8593d8 100644
--- a/benches/helpers.rs
+++ b/benches/helpers.rs
@@ -16,6 +16,8 @@ pub(crate) const ASYNC_METHODS: [&str; 3] = [SYNC_FAST_CALL, SYNC_MEM_CALL, SYNC
 
 // 1 KiB = 1024 bytes
 pub(crate) const KIB: usize = 1024;
+// 1 MB = 1024*1024 bytes
+pub(crate) const MB: usize = 1024 * 1024;
 
 /// Run jsonrpc HTTP server for benchmarks.
 #[cfg(feature = "jsonrpc-crate")]
@@ -207,3 +209,14 @@ pub(crate) async fn ws_handshake(url: &str, headers: HeaderMap) {
 	let uri: Uri = url.parse().unwrap();
 	WsTransportClientBuilder::default().max_request_body_size(u32::MAX).set_headers(headers).build(uri).await.unwrap();
 }
+
+pub(crate) fn to_human_friend_size(len: usize) -> String {
+	const EXCLUSIVE_KIB: usize = KIB - 1;
+	const EXCLUSIVE_MB: usize = MB - 1;
+
+	match len {
+		b @ 0..=EXCLUSIVE_KIB => format!("{}b", b),
+		b @ KIB..=EXCLUSIVE_MB => format!("{}kb", b / KIB),
+		b @ _ => format!("{}mb", b / MB),
+	}
+}

But it might be worse on some more complicated type than `String`.
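Not part of the PR's criterion benchmarks (the diff above covers those); just a throwaway sketch of how one could eyeball the cost of the intermediate `Value` for a made-up nested type, if the `String` benchmark feels too forgiving. The `Transfer` type and the input size are invented for illustration:

```rust
use std::time::Instant;

use serde::Deserialize;
use serde_json::Value;

// Made-up nested type standing in for "something more complicated than String".
#[allow(dead_code)]
#[derive(Debug, Deserialize)]
struct Transfer {
    from: String,
    to: String,
    amount: u64,
    memo: Option<String>,
}

fn main() {
    // A largish array of nested objects, to make the intermediate allocations visible.
    let items: Vec<Value> = (0..10_000)
        .map(|i| serde_json::json!({ "from": format!("acc{i}"), "to": "treasury", "amount": i, "memo": null }))
        .collect();
    let json = serde_json::to_string(&items).unwrap();

    // One-step decode straight into the target type.
    let t0 = Instant::now();
    let direct: Vec<Transfer> = serde_json::from_str(&json).unwrap();
    println!("direct:   {:?} ({} items)", t0.elapsed(), direct.len());

    // Two-step decode via `Value`, as the http client does after this PR. The
    // intermediate tree allocates a map and owned strings per object, which is
    // where any extra cost for nested types would show up.
    let t0 = Instant::now();
    let value: Value = serde_json::from_str(&json).unwrap();
    let twice: Vec<Transfer> = serde_json::from_value(value).unwrap();
    println!("two-step: {:?} ({} items)", t0.elapsed(), twice.len());
}
```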

Collaborator, inline review comment on `impl ClientT for HttpClient` (hunk `@@ -207,16 +207,19 @@`):


Dumb (unrelated to this PR) question: what's the purpose of this async block?

Member Author


It's part of the tracing stuff; that bug should be fixed by #846.

@jsdw (Collaborator) commented Aug 17, 2022

Ooh I see; you go via `Value` to benefit from the better error messages when `Value` fails to deserialize into the type. I'm not sure whether or not `RawValue` would give you that (still not sure whether it would actually work; I'd need to try it).

I am a bit worried about the perf side of things; making `Value`s is quite allocation heavy, much more so for complicated types than simpler ones. I wonder what benchmarks would best highlight that sort of difference.
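For reference, a guess at what the `RawValue` route could look like; this is not from this PR or any linked branch, the envelope type below is made up (not jsonrpsee's real `Response`), and it requires serde_json's `raw_value` feature. Whether the resulting error is as readable as the `from_value` one is exactly the open question:

```rust
use serde::Deserialize;
use serde_json::value::RawValue;

// Made-up envelope, not jsonrpsee's real `Response` type. Needs serde_json's
// `raw_value` feature enabled.
#[derive(Deserialize)]
struct Envelope<'a> {
    #[serde(borrow)]
    result: &'a RawValue,
}

fn main() {
    let body = r#"{"jsonrpc":"2.0","result":"lo","id":1}"#;

    // `RawValue` keeps `result` as an unparsed slice of the input, so there is no
    // intermediate `Value` tree to allocate.
    let envelope: Envelope = serde_json::from_str(body).unwrap();

    // The typed decode then only sees the small `result` slice; how friendly the
    // message is compared to `from_value` is what would need to be tried out.
    let err = serde_json::from_str::<u8>(envelope.result.get()).unwrap_err();
    println!("{err}"); // invalid type: string "lo", expected u8
}
```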

@niklasad1 (Member, Author) commented Aug 17, 2022

> Ooh I see; you go via `Value` to benefit from the better error messages when `Value` fails to deserialize into the type. I'm not sure whether or not `RawValue` would give you that (still not sure whether it would actually work; I'd need to try it).
>
> I am a bit worried about the perf side of things; making `Value`s is quite allocation heavy, much more so for complicated types than simpler ones. I wonder what benchmarks would best highlight that sort of difference.

Yeah, me too, that's why I benched it. We could take another route, which I implemented in https://github.com/paritytech/jsonrpsee/compare/na-more-user-friendly-decoding-errors?expand=1; the error message is not as good, but it should be cheaper.

@niklasad1 niklasad1 merged commit 9479530 into master Sep 15, 2022
@niklasad1 niklasad1 deleted the na-more-user-friendly-decoding-errors-2 branch September 15, 2022 17:12