21 questions from the last 30 days
2 votes · 2 answers · 82 views
Why should I pass a function using initializer and can I use shared memory instead?
Take this MWE:
from multiprocessing import Pool
from time import perf_counter as now
import numpy as np
def make_func():
    n = 20000
    np.random.seed(7)
    M = np.random.rand(n, n)
    return ...
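A minimal sketch of the initializer pattern the title asks about, under the assumption that the expensive object is a large NumPy array: each worker builds it once in the initializer, so it is never pickled per task. The names init_worker and use_matrix are placeholders, and the array is shrunk to keep the example quick.

from multiprocessing import Pool
import numpy as np

_M = None  # per-worker global filled in by the initializer

def init_worker():
    # Runs once in every worker; the big array is created here instead of
    # being pickled and sent along with each task.
    global _M
    np.random.seed(7)
    _M = np.random.rand(2000, 2000)

def use_matrix(i):
    # Tasks only carry the small argument i; the array already lives in the worker.
    return float(_M[i].sum())

if __name__ == "__main__":
    with Pool(processes=4, initializer=init_worker) as pool:
        print(pool.map(use_matrix, range(10)))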
2 votes · 1 answer · 82 views
tqdm, multiprocessing and how to print a line under the progress bar
I am using multiprocessing and tqdm to show the progress of the workers. I want to add a line under the progress bar to show which tasks are currently being processed. Unfortunately, whatever I do ...
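One hedged way to get a text line under the bar is a second tqdm instance pinned to the next screen row and used only for its description; work, the task list, and the "last finished" message are placeholders, and showing tasks that are still running would need extra plumbing (for example a Manager list).

from multiprocessing import Pool
from tqdm import tqdm
import time

def work(task):
    time.sleep(0.1)
    return task

if __name__ == "__main__":
    tasks = list(range(50))
    with Pool(4) as pool:
        bar = tqdm(total=len(tasks), position=0)                  # the progress bar
        status = tqdm(total=0, position=1, bar_format="{desc}")   # text line below it
        for done in pool.imap_unordered(work, tasks):
            bar.update(1)
            status.set_description_str(f"last finished: task {done}")
        status.close()
        bar.close()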
2 votes · 1 answer · 82 views
Can `spawn` be made as memory efficient as `fork` with multiprocessing?
I am on Linux and have working multiprocessing code that uses fork. Here is a MWE version:
from multiprocessing import Pool
from time import perf_counter as now
import numpy as np
def make_func():
...
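A sketch of one workaround, assuming the large object is a NumPy array: the parent creates it once in multiprocessing.shared_memory and spawned workers attach to it by name in an initializer, so no worker rebuilds or copies it. Shapes and sizes here are illustrative.

from multiprocessing import Pool, set_start_method, shared_memory
import numpy as np

SHAPE, DTYPE = (2000, 2000), np.float64
_shm = None
_view = None

def attach(name):
    # Spawned workers start from a blank interpreter, so instead of
    # rebuilding the matrix they map the block created by the parent.
    global _shm, _view
    _shm = shared_memory.SharedMemory(name=name)
    _view = np.ndarray(SHAPE, dtype=DTYPE, buffer=_shm.buf)

def row_sum(i):
    return float(_view[i].sum())

if __name__ == "__main__":
    set_start_method("spawn")
    shm = shared_memory.SharedMemory(create=True, size=int(np.prod(SHAPE)) * 8)
    M = np.ndarray(SHAPE, dtype=DTYPE, buffer=shm.buf)
    M[:] = np.random.rand(*SHAPE)
    try:
        with Pool(4, initializer=attach, initargs=(shm.name,)) as pool:
            print(pool.map(row_sum, range(10)))
    finally:
        shm.close()
        shm.unlink()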
2 votes · 1 answer · 62 views
multiprocessing.Process with spawn vs subprocess.Popen
I have a bound C++ Python library with a class that can only be initialized once per process (unfortunately, due to legacy C++ code).
To overcome this, I created a subprocess wrapper around my class ...
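multiprocessing.Process with the spawn context gives the same once-per-process isolation as a hand-rolled subprocess.Popen wrapper, without inventing a protocol over stdin/stdout. A minimal sketch, with run_task and the object construction standing in for the legacy binding:

import multiprocessing as mp

def run_task(task, out_queue):
    # Hypothetical stand-in for the legacy class: construct it inside the
    # fresh child process, use it, and send the result back.
    engine = object()  # e.g. legacy.Engine()
    out_queue.put((task, type(engine).__name__))

if __name__ == "__main__":
    ctx = mp.get_context("spawn")   # spawn: children do not inherit parent state
    q = ctx.Queue()
    procs = [ctx.Process(target=run_task, args=(t, q)) for t in range(3)]
    for p in procs:
        p.start()
    results = [q.get() for _ in procs]   # drain before joining to avoid blocking
    for p in procs:
        p.join()
    print(results)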
2 votes · 0 answers · 93 views
How can I use multiprocessing in Python to perform millions of comparisons?
My goal is to run a comparison between 2 different states of a Rubik's cube class, which on its own is simple. The issue comes when you need to compute something on the order of 900 million comparisons ...
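For workloads like this, per-task overhead usually dominates, so a hedged sketch is to stream pairs through imap_unordered with a large chunksize; compare and the integer "states" are placeholders for the real cube comparison.

from multiprocessing import Pool
from itertools import combinations

def compare(pair):
    # Placeholder comparison; the real one would diff two cube states.
    a, b = pair
    return a == b

if __name__ == "__main__":
    states = range(2000)          # roughly 2 million pairs for the sketch
    pairs = combinations(states, 2)
    with Pool() as pool:
        # A large chunksize keeps pickling/IPC overhead per comparison small,
        # which matters when each comparison is cheap.
        matches = sum(pool.imap_unordered(compare, pairs, chunksize=10_000))
    print(matches)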
0 votes · 0 answers · 92 views
multiprocess library barely works
I'm using the multiprocess library to accelerate a CPU-bound task (a method inside a user-defined class).
The function processes a page of a document; in my example, a 500-page document takes around 20 ...
0 votes · 2 answers · 54 views
Logger does not inherit config from parent process
Consider the following minimal setup:
/mymodule
├── __init__.py
├── main.py
└── worker.py
__init__.py is empty
main.py:
import sys
import logging
import multiprocessing
from test.worker import ...
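Under the spawn start method, children begin with a blank interpreter and never see handlers configured in main.py, so one hedged fix is to re-run the logging setup in every worker via a Pool initializer (or at the top of the Process target). The module layout above is flattened into one file for the sketch.

import logging
import multiprocessing

def configure_logging():
    # Spawned workers do not inherit the parent's handlers, so the
    # configuration is repeated inside each worker.
    logging.basicConfig(
        level=logging.INFO,
        format="%(processName)s %(name)s %(levelname)s: %(message)s",
    )

def work(x):
    logging.getLogger("worker").info("processing %s", x)
    return x * x

if __name__ == "__main__":
    configure_logging()
    with multiprocessing.Pool(2, initializer=configure_logging) as pool:
        print(pool.map(work, range(4)))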
0 votes · 1 answer · 83 views
Why does Python multiprocessing code hang when using a generator?
I am trying to mimic a large data set with a Python generator and use multiprocessing. I am doing this to reduce memory usage, but the code gets stuck.
import multiprocessing
def process_data(data_chunk):
...
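Two things commonly cause the hang described here: Pool.map materialises the whole generator before dispatching, and on spawn platforms the pool must be created under an if __name__ == "__main__" guard. A minimal sketch with both, using a placeholder generator:

import multiprocessing

def process_data(chunk):
    return sum(chunk)

def data_chunks(n_chunks, chunk_size):
    # Yields chunks lazily instead of building the whole data set in memory.
    for i in range(n_chunks):
        yield list(range(i * chunk_size, (i + 1) * chunk_size))

if __name__ == "__main__":   # required with spawn, otherwise children re-import and hang
    with multiprocessing.Pool(4) as pool:
        # imap consumes the generator lazily; map would pull it all into a list first.
        for result in pool.imap(process_data, data_chunks(100, 1000), chunksize=4):
            pass
    print("done")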
2 votes · 0 answers · 74 views
Terminating pool early intermittently crashes when return size is large
I have a script that's trying to analyse some images in parallel.
For some reason, I get intermittent crashes if the func passed to the pool returns variable-sized data. This only happens if I try to exit ...
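A hedged guess at the mechanism: Pool.terminate() can kill a worker while it is still writing a large result into the result pipe, which intermittently corrupts the exchange. One safety-over-speed sketch stops submitting and lets in-flight results drain instead of terminating; analyse is a placeholder for the image function.

from multiprocessing import Pool
import numpy as np

def analyse(i):
    # Variable-sized result, like the image analysis described above.
    return np.zeros(np.random.randint(1, 500_000))

if __name__ == "__main__":
    pool = Pool(4)
    results = pool.imap_unordered(analyse, range(200))
    for result in results:
        if result.size > 400_000:   # early-exit condition
            break
    # Instead of pool.terminate(), stop cleanly: accept no new tasks, then wait
    # for workers to finish writing whatever they are currently returning.
    pool.close()
    pool.join()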
0 votes · 1 answer · 46 views
The thread running under multiprocessing.Process does not update its instance attributes [closed]
I want to run several class instances in parallel and update the instance attributes using Redis. On each class instance, I run a thread in the background to listen to Redis for changes. When I run ...
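The likely cause is that the Process gets its own copy of the instance, so attribute updates made by the child's thread never reach the parent. A minimal sketch of sharing state explicitly instead, with a Manager dict standing in for the Redis-fed attributes:

import multiprocessing
import time

def worker(shared):
    # Updates here happen in the child process; they are only visible to the
    # parent because shared is a Manager proxy, not a plain attribute.
    for i in range(3):
        shared["counter"] = i
        time.sleep(0.1)

if __name__ == "__main__":
    with multiprocessing.Manager() as manager:
        shared = manager.dict(counter=-1)
        p = multiprocessing.Process(target=worker, args=(shared,))
        p.start()
        p.join()
        print(shared["counter"])   # 2, visible in the parent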
0 votes · 1 answer · 73 views
Multiprocessing doesn't work on my computer, but does work on my company's computer [duplicate]
When I run the following code, the kernel keeps running forever. I tried the exact same code in JupyterLab on Cloudera from my company’s computer and it worked, but in Jupyter Notebook and JupyterLab ...
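A common explanation for the notebook-only hang is the spawn start method (Windows/macOS): children must be able to import the worker function, which fails for functions defined in a notebook's __main__. A hedged workaround is to keep the worker in a real module file; worker_funcs is a hypothetical module name.

# worker_funcs.py  (a separate importable file, not a notebook cell)
def square(x):
    return x * x

# notebook cell / main script
import multiprocessing
from worker_funcs import square

if __name__ == "__main__":
    with multiprocessing.Pool(4) as pool:
        print(pool.map(square, range(10)))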
2 votes · 1 answer · 55 views
Shared library cannot be pickled by multiprocessing or dill package [duplicate]
I have a set of C code that is called by ctypes in Python. I am trying to call this code in parallel fully independently. I'm currently using multiprocessing or pathos. These packages aren't a hard ...
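ctypes handles can't be pickled, so rather than passing the loaded library to workers, each worker can load it itself in a Pool initializer. A minimal sketch, using libm on Linux/macOS as a stand-in for the real shared object (path and function are placeholders):

import ctypes
import ctypes.util
from multiprocessing import Pool

_lib = None

def load_lib():
    # Each worker loads its own handle; nothing unpicklable crosses the pipe.
    global _lib
    _lib = ctypes.CDLL(ctypes.util.find_library("m"))
    _lib.cos.restype = ctypes.c_double
    _lib.cos.argtypes = [ctypes.c_double]

def call(x):
    return _lib.cos(x)

if __name__ == "__main__":
    with Pool(4, initializer=load_lib) as pool:
        print(pool.map(call, [0.0, 1.0, 2.0]))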
0 votes · 1 answer · 64 views
ProcessPoolExecutor() with asyncio hangs randomly
I have an async service that processes data. My current approach is to process different folds created using TimeSeriesSplit in parallel; since this is a CPU-heavy task I decided to use concurrent....
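The usual non-hanging shape for this is to keep the executor in the main coroutine and hand work to it through run_in_executor, with the CPU-heavy function defined at module level so it can be pickled. A minimal sketch with a placeholder fold function:

import asyncio
from concurrent.futures import ProcessPoolExecutor

def train_fold(fold_id):
    # CPU-heavy work; must be module-level so spawned workers can import it.
    return sum(i * i for i in range(10**6)) + fold_id

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=4) as pool:
        futures = [loop.run_in_executor(pool, train_fold, f) for f in range(5)]
        print(await asyncio.gather(*futures))

if __name__ == "__main__":
    asyncio.run(main())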
0 votes · 0 answers · 47 views
Page number in PyMuPDF multiprocessing with extract_text
The PyMuPDF documentation states that PyMuPDF does not support running on multiple threads.
So they use multiprocessing, and they do this weird thing with segments in the example code:
seg_size = int(...
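The seg_size trick just splits the page range into one contiguous slice per process; keeping the absolute page number with each extracted string preserves order after the results are merged. A sketch under the assumption of a local input.pdf, with each worker opening its own document handle:

import math
from multiprocessing import Pool
import fitz  # PyMuPDF

PDF_PATH = "input.pdf"   # hypothetical file
NUM_PROCS = 4

def extract(seg):
    start, stop = seg
    doc = fitz.open(PDF_PATH)   # one handle per worker; document handles don't pickle
    return [(page_no, doc[page_no].get_text()) for page_no in range(start, stop)]

if __name__ == "__main__":
    page_count = fitz.open(PDF_PATH).page_count
    seg_size = math.ceil(page_count / NUM_PROCS)
    segments = [(i, min(i + seg_size, page_count)) for i in range(0, page_count, seg_size)]
    with Pool(NUM_PROCS) as pool:
        parts = pool.map(extract, segments)
    pages = [item for part in parts for item in part]   # (page_no, text), in order
    print(len(pages))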
0 votes · 1 answer · 27 views
Combine multiprocessing with subprocess stdin
I am trying to populate input for my fzf script from multiple parallel processes.
fzf = subprocess.Popen(
    [
        "fzf",
    ],
    stdin=subprocess.PIPE,
    text=True,
)
so I want to ...
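One hedged arrangement is to let only the main process own fzf's stdin and have the pool workers just return their candidate lines; produce is a placeholder for the real per-process work.

import subprocess
from multiprocessing import Pool

def produce(i):
    # Stand-in for whatever each worker computes.
    return f"candidate-{i}"

if __name__ == "__main__":
    fzf = subprocess.Popen(["fzf"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
    with Pool(4) as pool:
        # Workers never touch the pipe; the parent forwards lines as they arrive,
        # so fzf starts filtering before all producers are finished.
        for line in pool.imap_unordered(produce, range(100)):
            fzf.stdin.write(line + "\n")
            fzf.stdin.flush()
    fzf.stdin.close()
    choice = fzf.stdout.read()
    fzf.wait()
    print(choice)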
1 vote · 0 answers · 35 views
Parallelizing a Genetic Algorithm in Python is slower
I've been coding a Genetic Algorithm to solve TSP using python's DEAP library. I've implemented my own Graph class, and an Algorithm class that runs the GA on a Graph instance. However, when adding ...
0 votes · 0 answers · 35 views
Python multithreading or multiprocessing or something else
So I am in the case where I want to train and predict with a few hundred models.
For training it is all fine, as TensorFlow seems to use all the computing power of the CPU cores.
When predicting I need to ...
0 votes · 0 answers · 26 views
I am using multiprocessing in my code; when I create an exe with PyInstaller the code executes several times depending on the processes I have
I am using multiprocessing in my code. The code works fine executed from cmd, VS Code, etc., but when I create an exe with PyInstaller the code executes from the beginning several times (I know because ...
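This is the classic frozen-executable symptom: under PyInstaller, spawned children re-run the script from the top unless multiprocessing.freeze_support() is called first thing under the main guard. A minimal sketch of the expected layout:

import multiprocessing

def work(x):
    return x * x

def main():
    with multiprocessing.Pool(4) as pool:
        print(pool.map(work, range(10)))

if __name__ == "__main__":
    multiprocessing.freeze_support()   # no-op outside a frozen exe, required inside one
    main()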
0 votes · 1 answer · 72 views
XIO: fatal IO error 0 (Success) on X server ":0" Kivy multiprocessing
Hi, I am trying to create a program where, when run, two windows open at the same time from the same app. For this, multithreading is needed, but it seems I get some strange errors:
XIO: fatal IO error ...
-1 votes · 0 answers · 32 views
Why does running an ElementWise pymoo Problem with a PyTorch model for evaluation work on Windows but not Linux?
I am currently working on an NSGA2 algorithm that is required to be evaluated ELEMENTWISE. The method for evaluating each input is a PyTorch model. I understand PyTorch models are extremely efficient ...
0 votes · 0 answers · 16 views
Process hangs when multiprocessing with XGBoost model batch prediction
Here's a batch prediction case using multiprocessing. Steps:
After the with mp.Pool(processes=num_processes) as pool block, there's a with Dataset(dataset_code) as data in the main process that uses a websocket to ...