A library for working with Apache Avro in Rust.
Please check our documentation for examples, tutorials, and the API reference.
We also support:
- C bindings for the crate at avro-rs-ffi
- A Python wrapper for the library at pyavro-rs
Add to your `Cargo.toml`:

```toml
[dependencies]
avro-rs = "^0.6"
```
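The examples below also use `serde`/`serde_derive`, `failure`, `md5`, and `sha2`. As a rough sketch, the extra dependencies could look like the following; the version requirements are placeholders rather than values taken from this project, and note that the MD5 implementation exposing the `Digest` trait is usually published as the `md-5` package even though it is imported as `md5`:

```toml
[dependencies]
avro-rs = "^0.6"
# placeholder version requirements; pick releases compatible with your toolchain
serde = "*"
serde_derive = "*"
failure = "*"
md-5 = "*"
sha2 = "*"
```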
Then try to write and read data in Avro format as shown below:
```rust
extern crate avro_rs;

#[macro_use]
extern crate serde_derive;
extern crate failure;

use avro_rs::{Codec, Reader, Schema, Writer, from_value, types::Record};
use failure::Error;

#[derive(Debug, Deserialize, Serialize)]
struct Test {
    a: i64,
    b: String,
}

fn main() -> Result<(), Error> {
    let raw_schema = r#"
        {
            "type": "record",
            "name": "test",
            "fields": [
                {"name": "a", "type": "long", "default": 42},
                {"name": "b", "type": "string"}
            ]
        }
    "#;

    // parse the schema from its JSON definition
    let schema = Schema::parse_str(raw_schema)?;
    println!("{:?}", schema);

    // a Writer needs a schema, something to write to and, optionally, a codec
    let mut writer = Writer::with_codec(&schema, Vec::new(), Codec::Deflate);

    // the Record type models a record matching the writer's schema
    let mut record = Record::new(writer.schema()).unwrap();
    record.put("a", 27i64);
    record.put("b", "foo");

    // schema validation happens on append
    writer.append(record)?;

    // structs deriving serde's Serialize can be appended directly
    let test = Test {
        a: 27,
        b: "foo".to_owned(),
    };
    writer.append_ser(test)?;

    // flush to make sure all data has been written
    writer.flush()?;

    // retrieve the encoded Avro data
    let input = writer.into_inner();

    // a Reader needs the schema and something to read from
    let reader = Reader::with_schema(&schema, &input[..])?;

    // each item yielded by the reader is a Result, since reading may fail
    for record in reader {
        println!("{:?}", from_value::<Test>(&record?));
    }

    Ok(())
}
```
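Compression is optional. If you don't need a codec, the plain `Writer::new` constructor can be used instead, and because Avro object container files embed the writer's schema, the data can usually be read back without supplying the schema again. The sketch below assumes the `Writer::new` and `Reader::new` constructors behave this way in the version you are using; check the API docs to confirm:

```rust
extern crate avro_rs;
extern crate failure;

use avro_rs::{Reader, Schema, Writer, types::Record};
use failure::Error;

fn main() -> Result<(), Error> {
    let raw_schema = r#"
        {
            "type": "record",
            "name": "test",
            "fields": [
                {"name": "a", "type": "long", "default": 42},
                {"name": "b", "type": "string"}
            ]
        }
    "#;
    let schema = Schema::parse_str(raw_schema)?;

    // Writer::new writes without compression (assumed to default to the null codec)
    let mut writer = Writer::new(&schema, Vec::new());

    let mut record = Record::new(writer.schema()).unwrap();
    record.put("a", 27i64);
    record.put("b", "foo");
    writer.append(record)?;
    writer.flush()?;

    let input = writer.into_inner();

    // the writer's schema is embedded in the container data,
    // so it does not need to be passed explicitly (assumed API)
    let reader = Reader::new(&input[..])?;
    for value in reader {
        println!("{:?}", value?);
    }

    Ok(())
}
```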
This library supports calculating the following fingerprints:
- SHA-256
- MD5
Note: Rabin fingerprinting is NOT SUPPORTED yet.
An example of calculating the supported fingerprints:
```rust
extern crate avro_rs;
extern crate failure;
extern crate md5;
extern crate sha2;

use avro_rs::Schema;
use failure::Error;
use md5::Md5;
use sha2::Sha256;

fn main() -> Result<(), Error> {
    let raw_schema = r#"
        {
            "type": "record",
            "name": "test",
            "fields": [
                {"name": "a", "type": "long", "default": 42},
                {"name": "b", "type": "string"}
            ]
        }
    "#;

    let schema = Schema::parse_str(raw_schema)?;

    println!("{}", schema.fingerprint::<Sha256>());
    println!("{}", schema.fingerprint::<Md5>());

    Ok(())
}
```
In order to ease decoding, the Binary Encoding specification of Avro data requires some fields to have their length encoded alongside the data.

If ill-formed data is passed to a `Reader`, the bytes meant to contain the length of the data may be bogus and could result in extravagant memory allocation.

To shield users from ill-formed data, avro-rs sets a limit (default: 512MB) on any allocation it will perform when decoding data.

If you expect some of your data fields to be larger than this limit, be sure to call the `max_allocation_bytes` function before reading any data. We leverage Rust's `std::sync::Once` mechanism to initialize this value; if any decoding call is made before a call to `max_allocation_bytes`, the limit will be 512MB throughout the lifetime of the program.
```rust
extern crate avro_rs;

use avro_rs::max_allocation_bytes;

fn main() {
    max_allocation_bytes(2 * 1024 * 1024 * 1024); // 2GB

    // ... happily decode large data
}
```
This project is licensed under the MIT License. Please note that this is not an official project maintained by Apache Avro.

Everyone is encouraged to contribute! You can contribute by forking the GitHub repo and making a pull request, or by opening an issue. All contributions will be licensed under the MIT License.