- Simple package that implements Avro encoding and decoding
- Written in Typescript
- Follows the Avro serialization conventions of Confluent's Schema Registry. During deserialization, schemas are obtained from the registry using the 4-byte id prefix on each payload. When serializing data, schemas are registered with the registry and the corresponding 4-byte id is prefixed to the payload (see the sketch after this list).
- Easy-to-use interface. All of Confluent's Schema Registry flows are implemented in the package
- Supports schema evolution. Avro-encoded payloads are converted into the format specified by the application's (reader) Avro schema.
- We use this to encode and decode messages when consuming from and producing to Kafka
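To make the framing concrete, here is a hedged sketch of how a Confluent-framed payload is laid out and resolved. It is illustrative only, not this package's internal code: the wire format shown (a `0x00` magic byte, then the 4-byte big-endian schema id, then the raw Avro body) is the standard Confluent convention, and `fetchSchemaById` is a hypothetical stand-in for the registry lookup.

```typescript
import * as avsc from "avsc";

// Hypothetical helper standing in for the Schema Registry lookup.
declare function fetchSchemaById(id: number): Promise<avsc.Schema>;

async function decodeConfluentPayload(
  encoded: Buffer,
  readerSchema: avsc.Schema
): Promise<unknown> {
  // Confluent framing: byte 0 is a 0x00 magic byte,
  // bytes 1-4 are the big-endian schema id, the rest is the Avro body.
  if (encoded[0] !== 0) {
    throw new Error("not a Confluent-framed Avro payload");
  }
  const schemaId = encoded.readInt32BE(1);
  const body = encoded.slice(5);

  // Schema evolution: resolve the writer schema (how the data was written)
  // against the reader schema (the shape the application expects).
  const writerType = avsc.Type.forSchema(await fetchSchemaById(schemaId));
  const readerType = avsc.Type.forSchema(readerSchema);
  const resolver = readerType.createResolver(writerType);

  return readerType.fromBuffer(body, resolver, true);
}
```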
```bash
npm install avro-cado
```
```typescript
import { createEncoder, createDecoder, Options } from "avro-cado";

// Avro schema
const avroSchema = {
  type: "record",
  name: "TestMessage",
  namespace: "com.flipp.node.kafka.TestMessage",
  doc: "Properties related to a TestMessage.",
  fields: [
    {
      name: "key",
      type: "string",
      doc: "The key for the message"
    },
    {
      name: "text",
      type: "string",
      doc: "The text for the message"
    }
  ]
};

// package options
const opts: Options = {
  schemaRegistry: "http://localhost:8081", // schema registry URL
  numRetries: 10, // number of attempts to call schemaRegistry
  wrapUnions: "auto", // avsc option
  subject: "test-value", // subject for schema registration
  schema: avroSchema // schema object as needed for avsc
};

// create the encoder (registers the schema with the registry)
const encodeFunc = await createEncoder(opts);

// encode a message
const message = { key: "1", text: "some text" }; // example payload
const encoded: Buffer = encodeFunc(message);

// create the decoder (schemas are fetched from the registry on demand)
const decodeFunc = createDecoder(opts);

// decode a message
const decoded = await decodeFunc(encoded);
```
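For context on the Kafka use case mentioned above, the sketch below shows one way the encode and decode functions might be wired into a Kafka client. This is illustrative only: kafkajs is used here as an example client and is not a dependency of this package, and the broker address, topic, and group id are placeholders.

```typescript
import { Kafka } from "kafkajs";

const kafka = new Kafka({ brokers: ["localhost:9092"] });

// Produce an Avro-encoded message using encodeFunc from createEncoder(opts).
const producer = kafka.producer();
await producer.connect();
await producer.send({
  topic: "test",
  messages: [{ value: encodeFunc({ key: "1", text: "some text" }) }],
});

// Consume and decode messages using decodeFunc from createDecoder(opts).
const consumer = kafka.consumer({ groupId: "avro-cado-example" });
await consumer.connect();
await consumer.subscribe({ topic: "test", fromBeginning: true });
await consumer.run({
  eachMessage: async ({ message }) => {
    if (message.value) {
      const decoded = await decodeFunc(message.value);
      console.log(decoded);
    }
  },
});
```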
- Ensure typescript is installed: `npm install -g typescript`
- Install all dependencies: `npm install`
- Make changes
- Run `npm run package`. This will remove the `release` directory, run `tsc`, and add the `release` directory back to git. The `release` directory needs to be updated on every commit to include changes in the library. (A sketch of what such a script might look like follows this list.)
- Update the `version` in `package.json`
- Add your changes to git, and commit.
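For orientation, the `package` script described above could be implemented along these lines in `package.json`. This is a hypothetical sketch, not necessarily the repository's actual script, and it assumes `tsconfig.json` points its `outDir` at `release`:

```json
{
  "scripts": {
    "package": "rm -rf release && tsc && git add release"
  }
}
```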
Before opening an issue or pull request, please read the Contributing guide.