Large data args do not work when larger than roughly 1.5 MB
Supplying a large-ish json object to the python child process for processing · Issue #179 · extrabacon/python-shell
Hi, I am trying to use python-shell (excellent package btw) to run python child processes to process large amounts of data for a webapp. This is being run on a computer that is running ubuntu 18.04. I am currently passing in the script n...
https://github.com/extrabacon/python-shell/issues/179
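For reference, a minimal sketch of the pattern that hits the limit (assuming python-shell v3's promise-based `run()` and a placeholder script name `process_data.py`): the whole JSON blob goes into argv, so the spawn fails once it exceeds the OS argument-size limits, around the ~1.5 MB reported in the issue.

```ts
import { PythonShell } from 'python-shell';

// Failing pattern: the entire JSON payload is passed as a single argv entry.
// Once it grows past the OS argument-size limit, the child process cannot be
// spawned ('process_data.py' is a placeholder script name).
const bigJson = JSON.stringify({ rows: new Array(200_000).fill({ value: 1 }) });

PythonShell.run('process_data.py', { args: [bigJson] })
  .then((messages) => console.log(messages))
  .catch((err) => console.error('spawn failed, payload too large for argv:', err));
```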
We need to pass the data through stdin instead (see the sketch after these links):
- input it through python-shell's stdin/stdout messaging
extrabacon/python-shell
A simple way to run Python scripts from Node.js with basic but efficient inter-process communication and better error handling. Features: reliably spawn Python scripts in a child process; built-in text, JSON and binary modes; custom parsers and formatters; simple and efficient data transfers through stdin and stdout streams; extended stack traces when an error is thrown. If the script exits with a non-zero code, an error will be thrown.
https://github.com/extrabacon/python-shell#exchanging-data-between-node-and-python
- and get it efficiently on the Python side by reading sys.stdin
Fastest stdin/out IO in python 3?
Probably not. In the end, print will call sys.stdout.write(). But since print is a built-in function, probably implemented in C, it might even be faster than calling sys.stdout.write(). Since all IO has to go through the object which sys.stdout returns, that's the bottleneck. The same is true for sys.stdin.
https://stackoverflow.com/questions/7982146/fastest-stdin-out-io-in-python-3
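A minimal sketch of the stdin round trip, assuming python-shell's JSON mode and a placeholder script `process_data.py` that reads one JSON line from sys.stdin, processes it, and prints one JSON line back:

```ts
import { PythonShell } from 'python-shell';

// JSON mode: send() serializes each message to one line on the child's stdin,
// and each line the script prints to stdout is parsed into a 'message' event.
const shell = new PythonShell('process_data.py', { mode: 'json' });

// A payload well past the ~1.5 MB argv ceiling streams through stdin fine.
const payload = { rows: new Array(200_000).fill({ value: 1 }) };
shell.send(payload);

shell.on('message', (result) => {
  // result is the parsed JSON line printed by the Python script
  console.log('result:', result);
});

// Close stdin so the Python script sees EOF and can exit.
shell.end((err) => {
  if (err) throw err;
});
```

On the Python side the script just iterates over sys.stdin, json.loads each line, and prints its JSON result, which is where the stdin/stdout speed question above comes in.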

Seonglae Cho