memory-management

Are there disadvantages of using __slots__?

Are there disadvantages of using __slots__? Question: I’m using Python 3.7 and Django. I was reading about __slots__. Evidently, __slots__ can be used to optimize memory allocation for a large number of objects by listing all the object properties ahead of time. class MyClass(object): __slots__ = ['name', 'identifier'] def __init__(self, name, identifier): self.name …

Total answers: 1
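A minimal sketch of the trade-off the question asks about (class names here are illustrative, not from the question): a slotted class has no per-instance `__dict__`, which saves memory but also means you cannot add attributes that were not declared in `__slots__`.

```python
import sys

class Plain:
    def __init__(self, name, identifier):
        self.name = name
        self.identifier = identifier

class Slotted:
    __slots__ = ("name", "identifier")
    def __init__(self, name, identifier):
        self.name = name
        self.identifier = identifier

p = Plain("a", 1)
s = Slotted("a", 1)

# Plain instances carry a per-instance dict; slotted ones do not.
print(sys.getsizeof(p.__dict__))  # overhead the slotted class avoids

# The commonly cited disadvantage: undeclared attributes are rejected.
try:
    s.extra = 1
except AttributeError as e:
    print("AttributeError:", e)
```

Exact byte savings vary by CPython version, but the `AttributeError` behavior is the main functional drawback to weigh.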

Why do two identical lists have a different memory footprint?

Why do two identical lists have a different memory footprint? Question: I created two lists l1 and l2, but each one with a different creation method: import sys l1 = [None] * 10 l2 = [None for _ in range(10)] print('Size of l1 =', sys.getsizeof(l1)) print('Size of l2 =', sys.getsizeof(l2)) But the output surprised me: …

Total answers: 3
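The gist of the usual answer, as a runnable sketch: `[None] * 10` knows the final length up front and can allocate exactly, while the list comprehension grows by repeated appends and typically over-allocates. Exact sizes depend on the CPython version, so they are printed rather than asserted here.

```python
import sys

l1 = [None] * 10                # length known up front: allocated exactly
l2 = [None for _ in range(10)]  # grown append-by-append: may over-allocate

print(sys.getsizeof(l1))
print(sys.getsizeof(l2))
print(l1 == l2)  # True: identical contents, potentially different footprint
```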

How to check if PyTorch is using the GPU?

How do I check if PyTorch is using the GPU? Question: How do I check if PyTorch is using the GPU? The nvidia-smi command can detect GPU activity, but I want to check it directly from inside a Python script. Asked By: vvvvv Answers: These functions should help: >>> import torch >>> torch.cuda.is_available() …

Total answers: 16
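A small helper along the lines of the accepted approach (the function name `describe_device` is mine, and the import is guarded so the sketch also runs where PyTorch is not installed):

```python
def describe_device():
    """Report whether PyTorch will run on GPU or CPU.

    Returns a short description string; if torch is not installed,
    says so instead of raising ImportError.
    """
    try:
        import torch
    except ImportError:
        return "torch not installed"
    if torch.cuda.is_available():
        idx = torch.cuda.current_device()
        return f"cuda:{idx} ({torch.cuda.get_device_name(idx)})"
    return "cpu"

print(describe_device())
```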

Python: how to get size of all objects in current namespace?

Python: how to get size of all objects in current namespace? Question: I have some code that I am running from my own package and the program is using a lot more memory (60GB) than it should be. How can I print the size of all objects (in bytes) in the current namespace in order …

Total answers: 3
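A sketch of one common answer, with a caveat worth noting: `sys.getsizeof` is shallow, so a list of large objects still reports only the list's own size. The helper name `sizes_of_namespace` is mine; pass it `globals()` or any namespace dict.

```python
import sys

def sizes_of_namespace(ns):
    """Return {name: shallow size in bytes} for a namespace dict,
    largest first. sys.getsizeof does NOT follow references, so
    containers of big objects can still look small."""
    return dict(sorted(
        ((name, sys.getsizeof(obj)) for name, obj in ns.items()),
        key=lambda kv: kv[1],
        reverse=True,
    ))

big = [0] * 10_000
small = "x"
report = sizes_of_namespace({"big": big, "small": small})
print(report)
```

For deep (recursive) sizes, third-party tools such as `pympler.asizeof` are the usual suggestion.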

How to concatenate multiple pandas.DataFrames without running into MemoryError

How to concatenate multiple pandas.DataFrames without running into MemoryError Question: I have three DataFrames that I’m trying to concatenate. concat_df = pd.concat([df1, df2, df3]) This results in a MemoryError. How can I resolve this? Note that most of the existing similar questions are on MemoryErrors occurring when reading large files. I don’t have that problem. …

Total answers: 10
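One frequently suggested workaround, sketched with small stand-in frames (the `frames` list and the temp-file path are hypothetical substitutes for the question's df1–df3): stream each DataFrame to disk and read the result back once, so the inputs and the concatenated output never coexist in memory.

```python
import os
import tempfile

import pandas as pd

# Hypothetical stand-ins for the question's df1, df2, df3.
frames = [pd.DataFrame({"x": range(3 * i, 3 * i + 3)}) for i in range(3)]

# Append each frame to one CSV instead of concatenating in memory;
# only the first write emits the header.
path = os.path.join(tempfile.gettempdir(), "combined.csv")
for i, df in enumerate(frames):
    df.to_csv(path, mode="w" if i == 0 else "a", header=(i == 0), index=False)

combined = pd.read_csv(path)
print(len(combined))  # 9
```

The trade-off is one round-trip through disk; for in-memory-only approaches, deleting the source frames (`del df1`) immediately after the concat is another commonly cited mitigation.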

Leveraging "Copy-on-Write" to Copy Data to Multiprocessing.Pool() Worker Processes

Leveraging "Copy-on-Write" to Copy Data to Multiprocessing.Pool() Worker Processes Question: I have a bit of multiprocessing Python code that looks a bit like this: import time from multiprocessing import Pool import numpy as np class MyClass(object): def __init__(self): self.myAttribute = np.zeros(100000000) # basically a big memory struct def my_multithreaded_analysis(self): arg_lists = [(self, i) for i …

Total answers: 3
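A sketch of the pattern the answers typically recommend (names here are illustrative): create the large read-only data at module level before the pool starts, and have workers read it through a global instead of passing `self` in each argument tuple. On fork-based platforms (e.g. Linux), child processes then inherit the memory pages copy-on-write rather than receiving a pickled copy per task.

```python
from multiprocessing import Pool

import numpy as np

# Large read-only array created BEFORE any pool is started; forked
# children share these pages copy-on-write instead of pickling them.
SHARED = np.arange(1_000_000)

def analyze(i):
    # Reads the inherited global; no per-call copy of SHARED is sent.
    return int(SHARED[i])

if __name__ == "__main__":
    with Pool(2) as pool:
        print(pool.map(analyze, [0, 1, 999_999]))
```

Note the copy-on-write benefit does not apply on platforms that use the `spawn` start method (Windows, and macOS by default on recent Python versions), where the global is re-created in each child instead.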

Does setting numpy arrays to None free memory?

Does setting numpy arrays to None free memory? Question: I have hundreds of really large matrices, with shapes like (600, 800) or (3, 600, 800). Therefore I want to de-allocate the memory used as soon as I don’t really need something anymore. I thought: some_matrix = None Should do the job, or is just the …

Total answers: 3
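The short version of the usual answer, demonstrated with a weak reference: assigning `None` drops one reference, and in CPython the buffer is freed only when the *last* reference goes away.

```python
import weakref

import numpy as np

a = np.zeros((600, 800))
b = a                       # a second reference to the same buffer
alive = weakref.ref(a)      # lets us observe when the array is freed

a = None                    # drops ONE reference; b still holds the data
print(alive() is not None)  # True: buffer not freed yet

b = None                    # last reference gone; CPython frees it now
print(alive() is None)      # True
```

So `some_matrix = None` does free the memory, but only if nothing else (another name, a container, a view) still references the array.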

Python itertool variations, memory maximum reached

Python itertool variations, memory maximum reached Question: I am currently looking to generate a list of numbers with a specific number of digits, my code currently as follows: | Python 2.7 | import itertools inp = raw_input('Number of digits to write?:') inp = int(inp) inp2 = raw_input('File name?:') inp2 = inp2 + '.txt' variants = …

Total answers: 2
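The standard fix for this class of problem, sketched in Python 3 (the function name and the file path are illustrative): iterate over `itertools.product` lazily and write each item as it is produced, instead of materializing all 10**n strings in a list first.

```python
from itertools import product

def digit_strings(n_digits):
    """Lazily yield every n_digits-long string of decimal digits,
    one at a time, instead of building all 10**n_digits in memory."""
    for combo in product("0123456789", repeat=n_digits):
        yield "".join(combo)

# Streaming to a file keeps memory use flat regardless of n_digits:
# with open("variants.txt", "w") as f:
#     for s in digit_strings(6):
#         f.write(s + "\n")

gen = digit_strings(2)
print(next(gen), next(gen), next(gen))  # 00 01 02
```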

Different ways of deleting lists

Different ways of deleting lists Question: I want to understand why: a = []; del a; and del a[:]; behave so differently. I ran a test for each to illustrate the differences I witnessed: >>> # Test 1: Reset with a = [] … >>> a = [1,2,3] >>> b = a >>> a = …

Total answers: 6
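The core distinction the answers draw, as a runnable sketch: `a = []` and `del a` act on the *name*, while `del a[:]` mutates the list *object*, which every other reference observes.

```python
a = [1, 2, 3]
b = a
a = []              # rebinds the NAME a; b still holds the old list
print(b)            # [1, 2, 3]

a = [1, 2, 3]
b = a
del a[:]            # clears the list object IN PLACE; b sees it too
print(b)            # []

a = [1, 2, 3]
del a               # unbinds the name entirely; the list is freed
                    # only once no other reference to it remains
```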