# Everlog

CSV streams (channels, tables) let you give your logs structure.
## CLI

```sh
$ npm i -g everlog

# Help
$ everlog

# List all created channels
$ everlog list

# Get statistics for a channel: number of lines, files, etc.
$ everlog stats foo

# Read the last N lines from a channel. Supports the `offset` and `limit` params
$ everlog read foo

# Start the viewer web app
$ everlog server --port=5772
```
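The exact semantics of `offset` and `limit` are an assumption here, but a minimal sketch of how such tail-style reads typically behave may help (the `readLast` helper below is hypothetical, not part of the everlog CLI or API):

```javascript
// Hypothetical helper illustrating assumed `offset`/`limit` semantics:
// skip `offset` lines from the end, then return up to `limit` lines before that.
function readLast(lines, { offset = 0, limit = 10 } = {}) {
    const end = Math.max(0, lines.length - offset);
    const start = Math.max(0, end - limit);
    return lines.slice(start, end);
}

const lines = ['a', 'b', 'c', 'd', 'e'];
console.log(readLast(lines, { limit: 2 }));            // → [ 'd', 'e' ]
console.log(readLast(lines, { offset: 1, limit: 2 })); // → [ 'c', 'd' ]
```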
## API
- Initialize Everlog

  ```js
  import { Everlog } from 'everlog'

  await Everlog.initialize({
      directory: './logs/',
      slack: {
          token: '',
          channelId: ''
      }
  });
  ```
- Create log streams

  ```js
  const channel = Everlog.createChannel('foo', {
      // Keep only the last N files
      fileCountMax: 20,
      // Size limit for a file. When reached, the next file is created
      fileBytesMax: 500 * 1024,
      // Message count limit for a file. When reached, the next file is created
      fileMessagesMax: 10 ** 7,
      // Buffer N messages before writing
      messageBufferMax: 50,
      // Flush logs when there was no activity for N milliseconds
      writeTimeout: null,
      // ICsvColumn[]
      columns: [
          { name: 'Title', filterable: true },
          { name: 'MyVal', type: 'number', sortable: true, groupable: true },
          { name: 'Timestamp', type: 'date', sortable: true, groupable: true },
      ]
  });

  channel.writeRow([`Lorem ipsum`, 123, new Date()]);

  // On application exit, flush the data (in case something is still in the buffer)
  Everlog.flush();
  ```
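For intuition about what a channel row becomes on disk, here is a minimal sketch of RFC 4180-style CSV serialization for a row like the one above. This is illustrative only and is not Everlog's actual serializer:

```javascript
// Illustrative only: serialize one row to a CSV line.
// Dates are rendered as ISO strings; fields containing commas,
// quotes, or newlines are quoted, with inner quotes doubled.
function toCsvLine(row) {
    return row
        .map(v => (v instanceof Date ? v.toISOString() : String(v)))
        .map(s => (/[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s))
        .join(',');
}

console.log(toCsvLine(['Lorem, ipsum', 123, new Date(0)]));
// → "Lorem, ipsum",123,1970-01-01T00:00:00.000Z
```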
## Dev

- `/src/` (Core)
  - Collects events from a server or from custom streams, and proceeds with persistence or further propagation (Slack)
  - Creates a subapp to view the collected events
- `/www/` (Viewer)
  - Subapplication to view/sort/filter the collected events
- Development endpoints (unbuilt source):
  - web: http://localhost:5771/index.dev.html
  - api, e.g.: http://localhost:5771/api/logs/channels
### Prepare

```sh
> npm i
> cd www/
> npm i
```
### Start the example (dev project for the viewer)

```sh
# builds the core so it is available to the example as a lib
> npm run watch

# starts the demo server with Core and Viewer attached
> npm run example

# then navigate to http://localhost:5771/atma/monit/index.dev.html
```