Fixing the problem where importing numpy binds every process of a Python multiprocessing program to the same CPU core
If a Python program contains an `import numpy` statement, every process of a multiprocessing program may end up bound by default to the same CPU core, losing the performance advantage of multiple processes on a multi-core CPU. This is related to CPU affinity. The fix:
Import the affinity package and run:

affinity.set_process_affinity_mask(0, 2 ** multiprocessing.cpu_count() - 1)
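On Python 3.3+ the same reset can be done with the standard library alone, without the third-party affinity package. A minimal sketch, assuming a Linux host where `os.sched_setaffinity` is available (the function name and guard are standard library, not from the post):

```python
import os

def use_all_cores() -> int:
    """Allow the current process to run on every core (Linux only).

    Equivalent in effect to affinity.set_process_affinity_mask(0, 2**n - 1),
    but using only the standard library (Python 3.3+).
    Returns the number of cores the process may now run on.
    """
    if hasattr(os, "sched_setaffinity"):  # present on Linux
        os.sched_setaffinity(0, range(os.cpu_count()))
        return len(os.sched_getaffinity(0))
    # Other platforms do not restrict affinity this way on import.
    return os.cpu_count()

print(use_all_cores())
```

Calling `use_all_cores()` once after `import numpy` mirrors the affinity-package one-liner above.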
The original English post follows, for reference:
Python refuses to use multiple cores – solution
I was trying to get parallel Python to work and I noticed that if I run two Python scripts simultaneously – say, in two different terminals – they use the same core. Hence, I get no speedup from multiprocessing/parallel Python. After some searching around, I found out that in some circumstances importing numpy causes Python to stick all computations in one core. This is an issue with CPU affinity, and apparently it only happens for some mixtures of NumPy and BLAS libraries – other packages may cause the CPU affinity issue as well.
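On a modern Linux Python you can check for this symptom without any extra package. A small diagnostic sketch; `os.sched_getaffinity` is Linux-only, hence the guard:

```python
import os

# Inspect which CPU cores the current process is allowed to run on.
# If a NumPy/BLAS combination has pinned the process, this set shrinks
# to a single core after `import numpy`.
if hasattr(os, "sched_getaffinity"):  # Linux only
    cores = os.sched_getaffinity(0)
    print(f"process may run on {len(cores)} core(s): {sorted(cores)}")
else:
    print(f"affinity API unavailable; os.cpu_count() = {os.cpu_count()}")
```

Run it before and after `import numpy` to see whether your NumPy/BLAS build is affected.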
There’s a package called affinity (Linux only AFAIK) that lets you set and get CPU affinity. Download it, run `python setup.py install`, and run this in Python or ipython:
In [1]: import affinity
In [2]: affinity.get_process_affinity_mask(0)
Out[2]: 63
This is good: 63 is a bitmask corresponding to 111111 – meaning all 6 cores are available to Python. Now running this, I get:
In [4]: import numpy as np
In [5]: affinity.get_process_affinity_mask(0)
Out[5]: 1
So now only one core is available to Python. The solution is simply to set the CPU affinity appropriately after import numpy, for instance:
import numpy as np
import affinity
import multiprocessing

affinity.set_process_affinity_mask(0, 2 ** multiprocessing.cpu_count() - 1)
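The mask argument is just an all-ones bitmask with one bit per core, which is why `2 ** cpu_count() - 1` re-enables everything. For the 6-core machine in the post:

```python
# One bit per core: 2**n - 1 sets the lowest n bits, enabling cores 0..n-1.
n_cores = 6              # core count from the post's example machine
mask = 2 ** n_cores - 1
print(bin(mask), mask)   # 0b111111 63 -- matches get_process_affinity_mask above
```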
Reposted from: https://www.cnblogs.com/Arborday/p/9858108.html