Hi,

UPDATE: It works on a Mac but fails on 64-bit Windows.

I'm trying to use your library to convert some relatively large SVG files (between 70 MB and 150 MB). It handles 50 MB files fine. I'm also converting them to pretty large PNGs (4800 × 4200). The code is attached below, along with an example SVG you can try. This is the error message I get.

Error:

```
Error: Command failed: D:\Slic3rSVGs\node_modules\svg2png\node_modules\phantomjs-prebuilt\lib\phantom\bin\phantomjs.exe D:\Slic3rSVGs\node_modules\svg2png\lib\converter.js {"width":4800,"height":4200}
Memory exhausted.

    at ChildProcess.exithandler (child_process.js:275)
    at emitTwo (events.js:126)
    at ChildProcess.emit (events.js:214)
    at maybeClose (internal/child_process.js:925)
    at Process.ChildProcess._handle.onexit (internal/child_process.js:209)
```

Code:
```js
const fs = require("pn/fs");
const svg2png = require("svg2png");

const maxPerIteration = 1;

// `seconds` holds the start timestamp; it is set elsewhere in the script.
async function runExport(files) {
  let promiseChunk = [];
  for (let i = 400; i < files.length; i++) {
    console.log('Starting iteration ' + i);
    const input = fs.readFileSync(files[i]);
    const saveName = 'output_' + i + '.png';
    let svgPromise = svg2png(input, { width: 4800, height: 4200 })
      .then(buffer => {
        fs.writeFile(saveName, buffer);
        if (i + 1 === files.length) {
          const end = new Date().getTime();
          console.log((end - seconds) / 1000); // elapsed time in seconds
        }
      })
      .catch(e => console.error(e));
    promiseChunk.push(svgPromise);
    if (i % maxPerIteration === 0) {
      await Promise.all(promiseChunk);
    }
  }
}
```
What I've done to remedy this:

I've increased the Node heap limit with:

`--max-old-space-size=4096`

I've also inspected the heap using a heap snapshot, and it seems to be using only about 350 MB in these cases (I could paste the snapshots later if that's useful). So I'm not entirely sure what is wrong. Any help would be greatly appreciated!
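One possible explanation for the small heap figure: `--max-old-space-size` only raises the V8 heap limit of the Node process itself, while the "Memory exhausted." message in the error above is reported by the separate `phantomjs.exe` child that svg2png spawns. A minimal sketch for logging the Node side's own memory use, to help confirm which process is actually growing (this reflects only the Node process, not the phantomjs child):

```js
// Sketch: print this Node process's memory usage in MB.
// Note: this covers only the Node process itself, not any child
// process such as the phantomjs.exe that svg2png spawns.
const usage = process.memoryUsage();
for (const key of Object.keys(usage)) {
  console.log(key + ": " + Math.round(usage[key] / 1024 / 1024) + " MB");
}
```

A small `heapUsed` here, alongside a growing `phantomjs.exe` in Task Manager, would be consistent with the ~350 MB heap snapshot figure above.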
The file:
https://drive.google.com/file/d/12dCoTPm3glBfw5oUsvyhmtwWTto4XjR0/view?usp=sharing