While implementing a Gauss-Seidel solver in Python with NumPy, I discovered an interesting side effect. I tried to extract a minimal example:
#!/usr/bin/env python3
import timeit

number = 1000
startup = 'import numpy as np; N = 2048; X = np.arange(N * N).reshape((N, N));'
startup2 = startup + 'np.empty_like(X)'
example = ('X[1::2, :-1:2] = ('
           'X[:-1:2, 1::2] +'
           'X[1::2, :-1:2] +'
           'X[1::2, :-1:2] +'
           'X[:-1:2, :-1:2]) / 4')
Running print(timeit.timeit(example, setup=startup, number=number)) takes ~5 s on my machine, while print(timeit.timeit(example, setup=startup2, number=number)) takes ~4 s.
So it is around 1 s faster, even though there is an unnecessary array allocation from np.empty_like(X). I observed this effect on various machines and with various array sizes and iteration counts. I assume that evaluating the right-hand side of the assignment requires a temporary array allocation, and that NumPy somehow reuses the unused array created by np.empty_like(X) to speed up that temporary allocation.
Am I right with this assumption, or is something entirely different responsible for the difference in timing?
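One way to probe this hypothesis (a sketch I added, not part of the original question): record the data pointer of a throwaway array, free it, then force a temporary allocation of the same size and compare addresses. Whether the addresses coincide depends on the allocator (NumPy's internal caching and the C library's malloc/mmap behaviour), so the result is platform-dependent and not guaranteed.

```python
import numpy as np

N = 2048
X = np.arange(N * N).reshape((N, N))

# Allocate a throwaway array of the same size, record its data
# pointer, then release it so its buffer returns to the allocator.
scratch = np.empty_like(X)
freed_addr = scratch.__array_interface__['data'][0]
del scratch

# Force a temporary allocation of the same size; if the allocator
# recycles the freed block, the two addresses can coincide.
tmp = X + X
tmp_addr = tmp.__array_interface__['data'][0]

print(hex(freed_addr), hex(tmp_addr), freed_addr == tmp_addr)
```

On many systems the freed block is handed straight back for the temporary, but a mismatch does not disprove reuse at some other layer.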
If I remove the /4, so that the example becomes
example = ('X[1::2, :-1:2] = ('
           'X[:-1:2, 1::2] +'
           'X[1::2, :-1:2] +'
           'X[1::2, :-1:2] +'
           'X[:-1:2, :-1:2])')
then I cannot observe any difference in execution time between the two versions. So I assume that in this case the calculation can be done in place, and no temporary allocation occurs.
Is there a more explicit way to exploit this effect? Just writing np.empty_like(X) looks somewhat "hacky" to me.
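One more explicit alternative (a sketch I added, not from the original post) is to preallocate a work buffer once and accumulate the right-hand side into it with the out= argument of the ufuncs, so no full-size temporary is created per update. Here buf is a name introduced for illustration, and X is made a float array so the in-place division works.

```python
import numpy as np

N = 2048
X = np.arange(N * N, dtype=float).reshape((N, N))

# Preallocated work buffer with the shape of the updated sub-grid;
# in a solver loop this would be allocated once and reused.
buf = np.empty_like(X[1::2, :-1:2])

# Accumulate the right-hand side step by step; each np.add writes
# into the existing buffer instead of allocating a fresh temporary.
np.add(X[:-1:2, 1::2], X[1::2, :-1:2], out=buf)
np.add(buf, X[1::2, :-1:2], out=buf)
np.add(buf, X[:-1:2, :-1:2], out=buf)
buf /= 4
X[1::2, :-1:2] = buf
```

Since X is only written in the final assignment, the intermediate reads see the same values as in the one-line expression, so the result is identical.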
Thanks in advance!
from Is numpy reusing memory from unused arrays?