out-of-memory

MemoryError in Jupyter but not in Python

MemoryError in Jupyter but not in Python Question: I’m running CentOS Linux 7 (Core) with plenty of memory (total 125G, used 3.3G, free 104G, shared 879M, buff/cache 17G, available 120G) and 64-bit Anaconda (https://repo.anaconda.com/archive/Anaconda3-2022.10-Linux-x86_64.sh). I have set max_buffer_size to 64GB in both jupyter_notebook_config.json and jupyter_notebook_config.py, and just to make sure …

Total answers: 1
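
One way to narrow this down (a diagnostic sketch, not taken from the question): Jupyter's max_buffer_size only caps Tornado's websocket/request buffers, so a MemoryError that appears in the notebook but not in a plain python shell often points to a per-process limit on the kernel instead. Comparing the address-space limit in both environments makes that visible; only the standard-library resource module is used here, and the interpretation is an assumption.

import resource

# Run this once in a notebook cell and once in a plain `python` shell and compare.
# resource.RLIM_INFINITY (-1) means "unlimited"; a finite soft limit on the kernel
# process would explain a MemoryError that plain Python never hits.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
print("address-space limit (soft, hard):", soft, hard)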

Memory issue when creating a diagonal numpy array

Memory issue when creating a diagonal numpy array Question: I have created a diagonal numpy array: a = numpy.float32(numpy.random.rand(10)) a = numpy.diagonal(a) However, I face MemoryError since my matrix is extremely large. Is there any way to save memory? Asked By: Hossein Ayouqi || Source Answers: The best way to handle this case is to …

Total answers: 1
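
For context, numpy.diag on a length-n vector materialises a dense n×n matrix, which is what exhausts memory here; storing only the diagonal avoids it. A minimal sketch using scipy.sparse.diags (bringing in SciPy is an assumption, the question itself only uses NumPy):

import numpy as np
from scipy import sparse

n = 100_000
a = np.random.rand(n).astype(np.float32)

# numpy.diag(a) would allocate an n x n float32 array (~37 GiB for n = 100_000).
# The sparse form stores only the n diagonal entries.
d = sparse.diags(a, offsets=0, format="csr", dtype=np.float32)
print(d.shape, d.nnz)   # (100000, 100000) 100000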

MemoryError using MultinomialNB

MemoryError using MultinomialNB Question: I got a MemoryError while using sklearn.naive_bayes.MultinomialNB for Named Entity Recognition on a large dataset, where train.shape = (416330, 97896). data_train = pd.read_csv(path[0] + "train_SENTENCED.tsv", encoding="utf-8", sep='\t', quoting=csv.QUOTE_NONE) data_test = pd.read_csv(path[0] + "test_SENTENCED.tsv", encoding="utf-8", sep='\t', quoting=csv.QUOTE_NONE) print('TRAIN_DATA:\n', data_train.tail(5)) # FIT TRANSFORM X_TRAIN = data_train.drop('Tag', axis=1) X_TEST = data_test.drop('Tag', axis=1) v = DictVectorizer(sparse=False) …

Total answers: 1
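
The excerpt's DictVectorizer(sparse=False) is the usual culprit: a dense (416330, 97896) float64 matrix needs roughly 300 GB. A hedged sketch of the sparse alternative with a toy dataset standing in for the question's NER features (the feature names and labels below are invented for illustration):

from sklearn.feature_extraction import DictVectorizer
from sklearn.naive_bayes import MultinomialNB

# Tiny stand-in for the per-token feature dicts typically used in NER.
records_train = [{"word": "London", "is_title": 1}, {"word": "runs", "is_title": 0}]
y_train = ["B-LOC", "O"]

# Keep the default sparse output so only non-zero features are stored;
# MultinomialNB accepts the sparse matrix directly.
v = DictVectorizer(sparse=True)
X_train = v.fit_transform(records_train)

clf = MultinomialNB()
clf.fit(X_train, y_train)
print(clf.predict(v.transform([{"word": "Paris", "is_title": 1}])))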

Getting rid of max-pooling layer causes CUDA out-of-memory error in PyTorch

Getting rid of max-pooling layer causes CUDA out-of-memory error in PyTorch Question: Video card: GTX 1070 Ti 8 GB, batch size 64, input image size 128×128. I had the following UNet with a ResNet152 encoder, which worked fine: class UNetResNet(nn.Module): def __init__(self, encoder_depth, num_classes, num_filters=32, dropout_2d=0.2, pretrained=False, is_deconv=False): super().__init__() self.num_classes = num_classes self.dropout_2d = dropout_2d if encoder_depth == …

Total answers: 1
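
Some arithmetic behind the error (an illustration, not the asker's model): a 2×2 max-pool halves both spatial dimensions, so every feature map downstream of a removed pooling layer is four times larger, and so are the activations PyTorch keeps for the backward pass. A small CPU-side sketch of the size difference, using the question's batch size and input resolution and an assumed 64 channels:

import torch
import torch.nn.functional as F

# batch 64, 64 channels, 128x128 spatial size, float32
x = torch.randn(64, 64, 128, 128)
pooled = F.max_pool2d(x, kernel_size=2)

mib = lambda t: t.numel() * t.element_size() / 2**20
print(f"before pooling: {mib(x):.0f} MiB")       # 256 MiB for this one tensor
print(f"after 2x2 pool: {mib(pooled):.0f} MiB")  # 64 MiB

If the pooling layer has to go, the usual mitigations are a smaller batch size, mixed precision (torch.cuda.amp), or gradient checkpointing rather than a larger card.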

MemoryError when I merge two Pandas data frames

MemoryError when I merge two Pandas data frames Question: I have searched almost all over the internet, and somehow none of the approaches seem to work in my case. I have two large CSV files (each with a million+ rows and about 300-400 MB in size). They load fine into data frames using the read_csv function …

Total answers: 4
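
A common workaround when both frames load but the merge itself runs out of memory (a sketch under assumed file names and a hypothetical join column "key"): stream the larger CSV in chunks and merge piece by piece, so the full intermediate result never has to exist at once.

import pandas as pd

small = pd.read_csv("small.csv")                 # assumed: the smaller of the two files

pieces = []
for chunk in pd.read_csv("large.csv", chunksize=200_000):
    # Each chunk is merged independently, so only one chunk-sized
    # intermediate result is in memory at a time.
    pieces.append(chunk.merge(small, on="key", how="inner"))

merged = pd.concat(pieces, ignore_index=True)
print(len(merged))

Downcasting numeric columns and converting the join key to the category dtype before merging also shrink both frames, and are often enough to make a plain merge fit.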