# BIT Kafka Tools
BIT Team tools for working with Kafka, Avro and other miscellaneous tasks. They are split into several independent packages that can be imported separately.

## Working with node-kafka, avro and our confluent schema registry
| Package | Status | Description |
|---|---|---|
| @ovotech/kafka-avro-cli | Used in prod | A CLI for inspecting the confluent schema-registry, and for producing and consuming avro kafka events. |
| @ovotech/avro-stream | Used in prod | Serialize/deserialize kafka-node streams with avro data, using confluent schema-registry to hold the schemas. |
| @ovotech/kafka-pg-sink | Used in prod | Store kafka-node events into a postgres database. |
| @ovotech/schema-registry-api | Used in prod | A simple typescript node-fetch wrapper on the confluent schema-registry api. |
| @ovotech/re-pipeline | Used in prod | A node streams pipeline implementation that reconnects the pipes on error, once the error has been handled. |
| @ovotech/kafka-consumer | Used in prod | A generic kafka consumer |
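To illustrate the stream-based pattern these packages build on, here is a minimal Node `Transform` stream that serializes objects before they are written onward. This is a hedged sketch only: it uses plain JSON rather than avro, and `JsonSerializer` is a hypothetical name, not the actual API of @ovotech/avro-stream.

```typescript
import { Transform, TransformCallback } from "stream";

// Hypothetical sketch: a Transform stream that serializes incoming
// objects to buffers, standing in for the avro serialization that
// @ovotech/avro-stream performs against real registry schemas.
class JsonSerializer extends Transform {
  constructor() {
    // Accept objects on the writable side, emit Buffers on the readable side.
    super({ writableObjectMode: true });
  }

  _transform(event: unknown, _enc: BufferEncoding, done: TransformCallback): void {
    try {
      done(null, Buffer.from(JSON.stringify(event)));
    } catch (err) {
      done(err as Error);
    }
  }
}

// Usage: pipe an object stream through the serializer.
const serializer = new JsonSerializer();
serializer.on("data", (chunk: Buffer) => {
  console.log(chunk.toString()); // {"id":1,"type":"meter-read"}
});
serializer.write({ id: 1, type: "meter-read" });
serializer.end();
```

In the real packages the transform would encode against a schema fetched from the confluent schema registry instead of calling `JSON.stringify`.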
## Misc repos
| Package | Status | Description |
|---|---|---|
| @ovotech/winston-logger | Used in prod | Wrap winston logger to hide graylog semantics and implement safe static meta contexts with PII sanitisers |
| @ovotech/apollo-datasource-axios | Used in prod | A rest datasource that uses axios under the hood, which allows adding generic interceptors, adapters etc. Integrates with cache and cache policies. |
| @ovotech/bigquery-pg-sink | Used in prod | Stream the results of queries made by nodejs-bigquery into a postgres database. |
| @ovotech/influx-metrics-tracker | Used in prod | Track metrics and store them in an Influx database, with secondary logging if Influx is unavailable. |
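The PII sanitiser idea mentioned for @ovotech/winston-logger can be sketched as a function that redacts known-sensitive keys from log metadata before it reaches the logger. This is an illustration of the concept only; the key list and the recursive walk are assumptions, not the package's actual interface.

```typescript
// Hypothetical sketch of a PII sanitiser: redacts values for
// sensitive keys in (possibly nested) log metadata.
const SENSITIVE_KEYS = new Set(["email", "phone", "address", "name"]);

function sanitise(meta: Record<string, unknown>): Record<string, unknown> {
  const clean: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(meta)) {
    if (SENSITIVE_KEYS.has(key.toLowerCase())) {
      clean[key] = "[REDACTED]";
    } else if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      clean[key] = sanitise(value as Record<string, unknown>);
    } else {
      clean[key] = value;
    }
  }
  return clean;
}

console.log(sanitise({ accountId: "A1", email: "jo@example.com" }));
// { accountId: 'A1', email: '[REDACTED]' }
```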
## Getting Started
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
### Prerequisites
To get up and running, you will need the following tools. Use Homebrew for installation where possible.
#### Code

To install dependencies and run the code you will need node v12.4.1+ and yarn. The project has been tested with node 12.4.1.
```sh
brew install node@12
brew install yarn
```
### Installing
To get a development environment running:
Change to the root directory and install required packages.
```sh
# Install required packages
yarn

# Build packages that are dependencies of other packages
yarn build
```
## Running the tests
The tests require a running schema registry service, and we're using docker compose to start it, alongside kafka, zookeeper and postgres. So in the project's root directory run `docker-compose up`.

Then you can run the tests with `yarn test`.
### Coding Style Tests
Code style is enforced by using a linter (tslint) and Prettier.
## Deployment
- MANUALLY bump the package version along with your changes
- On merge to master, lerna will pick up the changes and output what will be published in the GitHub actions pipeline under `prepare-publish summary`
- MANUALLY approve deployment for the GitHub actions job `publish`
## Built With
### Languages / Core Tools
- Typescript - The primary language
- NPM - Node package registry
### Secondary Tooling
- jest - Testing framework used for unit and integration tests
- yarn - Package management
- lerna - Tool for managing JavaScript projects with multiple packages
- docker - Deployable containers for code
## Versioning
You'll need to bump the package version numbers yourself. Only updated packages with newer versions will be pushed to the npm registry.
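The publish rule above (only packages whose local version is newer than the published one get pushed) amounts to a semver comparison. As a rough illustration, and not lerna's actual implementation, the check could look like:

```typescript
// Illustrative semver comparison (no prerelease handling):
// returns true when `local` is strictly newer than `published`,
// i.e. when the package would be pushed to the registry.
function isNewer(local: string, published: string): boolean {
  const a = local.split(".").map(Number);
  const b = published.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] > b[i];
  }
  return false; // identical versions: nothing to publish
}

console.log(isNewer("1.2.1", "1.2.0")); // true
console.log(isNewer("1.2.0", "1.2.0")); // false
```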
## Contributing
Have a bug? File an issue with a simple example that reproduces it, so we can take a look and confirm.
Want to make a change? Submit a PR, explain why it's useful, and make sure you've updated the docs (this file) and the tests.
## License

This project is licensed under Apache 2 - see the LICENSE file for details.