Why is it so hard to add threads to Node.js? Or web workers? #4
Ref: nodejs/node#13143 (comment)
But V8 is already there, and Chakra has web workers. Isn't that all we need? It looks like a small effort. Only for AI would we need that kind of horsepower, like facial recognition or speech. What is the extra part we need to implement when it is already in Chrome and Chakra?
Well, having shared memory available would be a huge plus for Workers. You can already get acceptable event-based process coordination using …
If you want simple WebWorkers, yes, that would be comparatively easy. If you want to provide the full Node.js API (including things such as …), that is a different story. Also, just to give everybody a status update, I’ll try to draft an EP text this week.
I guess, don't you think? Once we have web workers, shared memory can be implemented by many solutions, like …
Also, don't you think we should not try to implement everything right away? It will take a few versions and refactors to make it mature. If workers are isolated, why do you think I/O should not be included?
So we need the basic Workers first; later we can add shared data and Atomics as well, but not everything right away :)
That depends on what you mean by “in isolation”. What can you …
I’m not saying it should not be included, I’m saying we might need a few tricks to get it to work. Or generally: The more of Node’s internals we want to expose, the more carefully one needs to think about what could go wrong.
Because I’d like to make sure that we come up with an API that addresses everybody’s use cases, and doesn’t just take the path of least resistance. In particular, I would really not want to get us locked into a specific implementation and look back in a few months and have to say “yeah, this was a bad idea” (for example, it’s tempting to go for multi-thread support rather than multi-process support, but there are very valid points for not doing that).
Isn't it just, like, versions: …
What other use cases are there?
Just to clarify my comment: the big codebase I was referring to is …
That's true, if we leave out …
But guys, I can see there is code going in. So you are already working on it?
Besides, process.nextTick can work like a thread, can't it? Threads are overrated. I guess web workers are good, but overrated if we use process.nextTick.
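For context, a minimal, purely illustrative sketch (not taken from the thread) of why process.nextTick is not a substitute for threads: the deferred callback still runs on the single event-loop thread, so CPU-bound work blocks all timers and I/O.

```js
// process.nextTick only defers the callback; it still runs on the one
// event-loop thread, so this CPU-bound loop starves everything else.
setInterval(() => console.log('tick', Date.now()), 100);

process.nextTick(() => {
  const start = Date.now();
  while (Date.now() - start < 2000) {
    // busy loop: no timers or I/O callbacks run for about 2 seconds
  }
  console.log('busy work done');
});
```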
The main problems are generally around things like … . I don't think the hacky solutions to these problems are good, and I really hope to go the route of not trying to create lightweight processes.
Wouldn't each Worker have its own require and process? Not shared at all (but still a thread)?
@p3x-robot You can't decouple them; they use OS-level shared data.
I don't understand the question.
I guess someone has to write it, like the original author of Node.js did. :)
V8 has native web workers, and importScripts as well.
Can't a thread have read access to Node.js data? It should be read-only.
require.cache should be its own per thread.
I guess we don't have time to do it; there are lots of other things to do. Plus we can
use C++ async addons. It looks like no one will do it.
We rarely need threads, and if we really do, we can use C++.
I never need it at all. I tried a C++ async addon; it took one day. Shared
data, fully async.
I guess that is enough?
Patrik
@p3x-robot To a limited extent you can share data via transferring it between threads; however, some data like cwd and env come from the OS and are not thread-local. For V8, you cannot share JS objects between Isolates/threads, so you have to implement a layer like transferables. Even with that, however, syscalls like cwd are generally not thread-safe to mutate, which people do with standard Node APIs. I am stating that certain Node APIs are not safe to reify in workers, and workers must pass messages to the main thread in order to coordinate with them. This is similar to how the browser does not expose all the DOM APIs to workers due to threading issues.
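A minimal sketch of that coordination pattern, using the worker_threads API that eventually shipped (file names and message shapes are illustrative assumptions): the worker asks the main thread to mutate process-global state instead of touching it itself.

```js
// main.js: only the main thread touches process-global state such as cwd
const { Worker } = require('worker_threads');

const worker = new Worker('./worker.js');
worker.on('message', (msg) => {
  if (msg.type === 'chdir-request') {
    process.chdir(msg.dir); // process-wide change, done in one place
    worker.postMessage({ type: 'chdir-done', cwd: process.cwd() });
  }
});
```

```js
// worker.js: process.chdir() is not available in workers, so coordinate via messages
const { parentPort } = require('worker_threads');

parentPort.on('message', (msg) => {
  if (msg.type === 'chdir-done') console.log('cwd is now', msg.cwd);
});
parentPort.postMessage({ type: 'chdir-request', dir: '/tmp' });
```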
Subscribing to this issue. Having this in Node.js would enable me to develop apps that leverage WebWorkers to offload CPU-intensive operations (e.g. crypto) and make them work in the browser and Node.js in the same way. I would use it for js-ipfs and js-libp2p.
For now there are so many issues and it would be so slow that I am just using processes; even Chrome uses a process for everything. You can easily use https://www.npmjs.com/package/tiny-worker, check it out!
In order to move this forward, I think it might be good for people familiar with the internals to specify: …
This way we can build up a list, discuss, and converge on an outline of the end goal.
Food for thought: https://webkit.org/blog/7846/concurrent-javascript-it-can-work/
That comment is not in line with our code of conduct. I'd like to kindly ask that you consider removing it.
Can we please keep comments in this thread relevant?
@p3x-robot @AngelAngelov Personal attacks are considered a code of conduct offense, therefore I'd like to ask you two to remove or otherwise edit your comments.
What happened? Comments appear to have been removed.
@AngelAngelov I think https://github.com/Microsoft/napajs/blob/master/docs/api/node-api.md will be complete at some point, and you can use threads at will without web workers as well. Merry Xmas!
I wish you guys could take a look at this: https://github.com/alibaba/AliOS-nodejs/wiki/Workers-in-Node.js-based-on-multithreaded-V8
This is closed, given that worker threads were implemented in 10.5.
@addaleax Do the threads work with Atomics? I am testing; just asking if you have any info on it… Thanks!
@p3x-robot Yes, it does. :)
Hey guys! Have you started using threads in the Node.js versions that have worker threads enabled?
It would be great to have some rough numbers on when it's worth using workers at all. For example: is it possible to speed up functions that take only 10 ms? Or more general information about the overhead that occurs when using workers. I think the docs are really vague here.
That depends – do you spin up a Worker for every invocation? No, that’s not going to help here; it would normally take longer than 10 ms to do so. But if you use shared memory, or at least MessagePorts, to communicate the tasks to a Worker and back? Then, yes, that might very much be possible. Maybe it’s worth advertising the use of a worker thread pool more in the docs?
Fwiw, some reasons why no specific overhead measurements are mentioned in the docs are that this is going to depend on the actual machine the code is running on, and that we’re actively working on improving startup time, both for Node.js itself and for Workers.
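A minimal sketch of the "reuse one worker" idea (file names and the doubling task are assumptions, not an official pool API): the Worker is created once, and each call only pays for a postMessage round trip.

```js
// main.js: one long-lived worker instead of a new Worker per 10 ms job
const { Worker } = require('worker_threads');
const { once } = require('events');

const worker = new Worker('./task-worker.js');

async function runTask(payload) {
  worker.postMessage(payload);
  const [result] = await once(worker, 'message'); // one task in flight at a time
  return result;
}

runTask(21).then((r) => console.log(r)); // 42
```

```js
// task-worker.js: stays alive and serves many tasks
const { parentPort } = require('worker_threads');

parentPort.on('message', (payload) => {
  parentPort.postMessage(payload * 2); // stand-in for real CPU-bound work
});
```

A real pool would keep several such workers and hand out tasks to whichever one is idle.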
Of course, but we could generate some performance information to give folks an indication of pros and cons. @addaleax Do you know whether this is something the benchmarking WG is considering?
@davisjam Not that I know of. At this point the only benchmark we have is one for passing messages between workers (the …).
Yes, good idea. Maybe an example would be nice too.
@addaleax 50%+ speedup for my algorithm! Workers rock!
Yup – as far as V8/all JS stuff is concerned, every Worker is an independent instance. :)
I had problems with messages coming back from workers when the main function was called at a high frequency and the workers needed different amounts of time to return results. A classic async problem, I guess: old worker returns were smearing results into later-called functions. I couldn't find any solution online (the simple subchannel example in the Node docs produces the same error). I guess this is an edge case anyway, but it produces surprisingly strange results. My workaround is that I create a couple of subchannels for each worker, send each worker its ports, and tell each worker on every call which port it should use to communicate, always iterating through the array of ports. Receiving specific function results on the parent looks like this now: …
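The original snippet was not captured above, so the following is only an illustrative guess at the pattern described, using a dedicated MessagePort per call (the thread's actual version rotates through a pre-created array of ports instead).

```js
// main thread: a fresh MessageChannel per call, so a reply can only
// belong to the call that owns that port
const { MessageChannel } = require('worker_threads');

function callWorker(worker, payload) {
  return new Promise((resolve) => {
    const { port1, port2 } = new MessageChannel();
    port1.once('message', (result) => {
      port1.close();
      resolve(result);
    });
    // transfer port2 to the worker, which answers on that specific port
    worker.postMessage({ payload, replyPort: port2 }, [port2]);
  });
}
```

```js
// worker side
const { parentPort } = require('worker_threads');

parentPort.on('message', ({ payload, replyPort }) => {
  replyPort.postMessage(payload * 2); // stand-in for the real computation
  replyPort.close();
});
```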
@a1xon Just for clarification … is the issue that it’s not obvious how to tie requests to workers back to the responses if they can arrive out-of-order?
@addaleax Yes, I couldn't find anything online or in the docs to tackle that :\
A little bit later, I still think JS is a functional language and not about horsepower. For parallelism, it is a job for C++ and below, down to assembly or the video card. Node.js will never be good for heavy processing. It is like unicorn Node.js vs. atom/hydrogen bomb assembly. Totally different approaches.
@a1xon I’d say it isn’t a problem that’s unique to workers – adding something like a request ID to the passed message could help here, so that you don’t have to maintain multiple ports? I (and this is really just a personal opinion) think this is the kind of problem that we’d only tackle with a built-in solution if we were to provide some kind of built-in worker pool.
@p3x-robot Keep in mind that JS can definitely perform on the same order of magnitude as native languages, and there are things like WebAssembly that have a significant impact on the performance-in-JS world as well.
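To illustrate the request-ID suggestion above (names are illustrative, not an official API): tag every outgoing message with an ID and resolve the matching promise when a reply carries the same ID, so out-of-order replies no longer smear into the wrong call.

```js
// main thread: correlate out-of-order replies by request ID
const { Worker } = require('worker_threads');

const worker = new Worker('./worker.js');
const pending = new Map();
let nextId = 0;

worker.on('message', ({ id, result }) => {
  pending.get(id)(result); // resolve exactly the call that sent this id
  pending.delete(id);
});

function request(payload) {
  return new Promise((resolve) => {
    const id = nextId++; // monotonically increasing, so no collisions
    pending.set(id, resolve);
    worker.postMessage({ id, payload });
  });
}
```

```js
// worker.js: echo the id back with the result
const { parentPort } = require('worker_threads');

parentPort.on('message', ({ id, payload }) => {
  parentPort.postMessage({ id, result: payload * 2 });
});
```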
Of course, it's just an opinion. WebAssembly is awesome, just like addons or, at last, worker threads. I still have not found a good use case; we always use native bindings or native C/C++-based processes. But what we have not found yet, it is coming…
Agreed.
I think the question is "Is it fast enough for my purposes, and is it the bottleneck of the system on which I am processing?"
@davisjam In fact, I use request IDs a lot with Redis or socket.io for passing one-time events. As for bottlenecks, we use things like ImageMagick, or some C or C++, for the horsepower. I wanted threads so much and still I don't really need them at all. Of course, a worker thread can use a different core, so for whoever finds a good use case it will be totally awesome. Though threads can block each other if, for some weird reason, they end up on the same core, worker threads are a 1099999% feature; I wanted it so much, and now it is in my palm. HAPPY.
@addaleax @davisjam I tried to do it with only IDs in the first place, but in 1 out of 1000 cases the worker was called 2 or 3 times with the same randomly generated ID. So I switched to rotating ports, and it works like a charm now; I'm also still using IDs at the moment. Are you interested in the code? I can try to pretty it up a little.
That is valuable information, because I am sure I will face this use case at some point. Thanks so much.
@a1xon Do you think a counter or something like that would work, to avoid collisions? But either way, as long as you found something that works… :) You can feel free to share code if you think it contains feedback that we can apply to the Workers implementation, or that could be looked at for developing benchmarks/tests/etc., if you think that makes sense.
@addaleax Sorry, that was misleading.
Just uploaded the code. Hope I didn't strip too much from it.
@a1xon I’d be a bit wary of using randomly generated IDs; a simple incrementing counter avoids the problem:

```js
let counter = 0; // Use 0n if you want to be absolutely sure and use BigInt instead

const main = async someArgument => {
  // …
  let requestID = ++counter;
  // …
};
```

That way you can avoid collisions and don’t need to plan for them, even if they come with a very low frequency.
If you need a shared counter for peer-to-peer workers without going through a coordination thread, you can also create a SharedArrayBuffer and pass it around, and anyone wishing to use it can lock the structure and then increment the counter.
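A minimal sketch of that idea (the file name is an assumption); for a plain counter, Atomics.add already gives an atomic increment, so no explicit lock is needed:

```js
// main thread: create one shared counter and hand it to every worker
const { Worker } = require('worker_threads');

const shared = new SharedArrayBuffer(4); // room for one Int32 counter
for (let i = 0; i < 4; i++) {
  new Worker('./id-worker.js', { workerData: shared });
}
```

```js
// id-worker.js: all workers increment the same memory, no messages needed
const { workerData } = require('worker_threads');
const counter = new Int32Array(workerData);

function nextId() {
  return Atomics.add(counter, 0, 1); // atomically returns the previous value
}
console.log('got id', nextId());
```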
I’m closing the existing issues here. If you have feedback about the existing Workers implementation in Node.js 10+, please use #6 for that!
I tried a C++ async addon, and it took me one day (plus a cancel-thread function).
Why is it so hard to add a thread to Node.js?