Create depth limit option for the log line #1121
Comments
I am working on this. If I read it correctly, this code:

```js
const pino = require("pino")({ maxDepth: 3 });
const o = { a: { b: { c: { d: { e: { f: 'foobar' } } } } } }
pino.info(o)
```

would produce something like this:

I am currently testing with this code:

```js
const pino = require("pino")();
const nested = {};
const MAX_DEPTH = 10 * 784;
let currentNestedObject = null;
for (let i = 0; i < MAX_DEPTH; i++) {
  const k = 'nest_' + i;
  if (!currentNestedObject) {
    currentNestedObject = nested;
  }
  currentNestedObject[k] = {};
  currentNestedObject = currentNestedObject[k];
}
pino.info(nested);
```

This is a very edge case: a deeply nested object like this.
We immediately reach this line, which calls
In fact with that big
The same error is raised by the next
At this point the only way I see is to parse the object and remove all the keys past a given nesting level.
Is this the correct approach? |
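The trimming approach described in the comment above could be sketched roughly like this. The `trimDepth` helper name and the `'[Object]'` / `'[Array]'` placeholders are my own assumptions, not pino API:

```js
// Hypothetical helper (names are mine, not pino's): copy `value`, replacing
// anything nested deeper than `maxDepth` with a placeholder string.
function trimDepth(value, maxDepth, depth = 0) {
  if (value === null || typeof value !== 'object') return value;
  if (depth >= maxDepth) return Array.isArray(value) ? '[Array]' : '[Object]';
  const out = Array.isArray(value) ? [] : {};
  for (const key of Object.keys(value)) {
    out[key] = trimDepth(value[key], maxDepth, depth + 1);
  }
  return out;
}

const o = { a: { b: { c: { d: { e: { f: 'foobar' } } } } } };
console.log(JSON.stringify(trimDepth(o, 3))); // → {"a":{"b":{"c":"[Object]"}}}
```

Note that a recursive pre-pass like this still recurses once per nesting level, so on an object as deep as the test case above it only works while that fits in the call stack.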
Not really. Essentially we should embed json-stringify-safe and implement this feature there. |
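Doing the cut inside the serializer itself, as suggested, might look roughly like this sketch (the function name and placeholder string are hypothetical, and this uses plain `JSON.stringify` rather than json-stringify-safe):

```js
// Sketch (not pino's actual API): enforce the depth limit inside
// JSON.stringify via a replacer, tracking each object's depth in a WeakMap.
function stringifyWithDepth(obj, maxDepth) {
  const depths = new WeakMap();
  return JSON.stringify(obj, function (key, value) {
    if (value === null || typeof value !== 'object') return value;
    // The root holder object is never put in the map, so the
    // top-level object gets depth 0.
    const depth = depths.has(this) ? depths.get(this) + 1 : 0;
    if (depth >= maxDepth) return '[Object]';
    depths.set(value, depth);
    return value;
  });
}

const o = { a: { b: { c: { d: { e: { f: 'foobar' } } } } } };
console.log(stringifyWithDepth(o, 3)); // → {"a":{"b":{"c":"[Object]"}}}
```

Because anything past the limit is replaced with a string before `JSON.stringify` descends into it, this variant never walks the full depth of a pathologically nested object.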
I've had the same issue, though mostly around the size of the log and how much money it translates into. I've been using https://github.com/runk/dtrim for quite some time - it covers all the scenarios where you'd expect huge amounts of data to be logged: arrays, nested objects, buffers, big strings. It would be nice to see it used in pino, or to have similar functionality for truncating lengthy structures. |
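For illustration, a dtrim-style trim that caps depth, array length, and string length in one pass could look roughly like this. The option names and default limits here are made up; dtrim's real options may differ:

```js
// Sketch of dtrim-style trimming (option names and defaults are made up):
// cap nesting depth, array length, and string length in one recursive pass.
function trim(value, { depth = 4, array = 10, string = 128 } = {}, level = 0) {
  if (typeof value === 'string') {
    return value.length > string ? value.slice(0, string) + '...' : value;
  }
  if (value === null || typeof value !== 'object') return value;
  if (level >= depth) return Array.isArray(value) ? '[Array]' : '[Object]';
  if (Array.isArray(value)) {
    return value
      .slice(0, array)
      .map((v) => trim(v, { depth, array, string }, level + 1));
  }
  const out = {};
  for (const key of Object.keys(value)) {
    out[key] = trim(value[key], { depth, array, string }, level + 1);
  }
  return out;
}
```

For example, `trim({ a: { b: { c: 1 } } }, { depth: 2 })` keeps `a` and `b` but replaces the object under `b` with the `'[Object]'` placeholder.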
yes good catch! |
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs. |
As titled.
Previous ones #990 #1120