spaCy with the joblib library generates _pickle.PicklingError: Could not pickle the task to send it to the workers

Question:

I have a large list of sentences (~7 million), and I want to extract the nouns from them.

I used the joblib library to parallelize the extraction process, as in the following:

import spacy
from tqdm import tqdm
from joblib import Parallel, delayed
nlp = spacy.load('en_core_web_sm')

class nouns:

    def get_nouns(self, text):
        doc = nlp(u"{}".format(text))
        return [token.text for token in doc if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]

    def parallelize(self, sentences):
        results = Parallel(n_jobs=1)(delayed(self.get_nouns)(sent) for sent in tqdm(sentences))
        return results

if __name__ == '__main__':
    sentences = ['we went to the school yesterday',
                 'The weather is really cold',
                 'Can we catch the dog?',
                 'How old are you John?',
                 'I like diving and swimming',
                 'Can the world become united?']
    obj = nouns()
    print(obj.parallelize(sentences))

When n_jobs in the parallelize function is greater than 1 (the traceback below was produced with n_jobs=2), I get this long error:

100%|██████████| 6/6 [00:00<00:00, 200.00it/s]
joblib.externals.loky.process_executor._RemoteTraceback: 
"""
Traceback (most recent call last):
  File "C:Python35libsite-packagesjoblibexternalslokybackendqueues.py", line 150, in _feed
    obj_ = dumps(obj, reducers=reducers)
  File "C:Python35libsite-packagesjoblibexternalslokybackendreduction.py", line 243, in dumps
    dump(obj, buf, reducers=reducers, protocol=protocol)
  File "C:Python35libsite-packagesjoblibexternalslokybackendreduction.py", line 236, in dump
    _LokyPickler(file, reducers=reducers, protocol=protocol).dump(obj)
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 267, in dump
    return Pickler.dump(self, obj)
  File "C:Python35libpickle.py", line 408, in dump
    self.save(obj)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 841, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 770, in save_list
    self._batch_appends(obj)
  File "C:Python35libpickle.py", line 797, in _batch_appends
    save(tmp[0])
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 725, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 718, in save_instancemethod
    self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
  File "C:Python35libpickle.py", line 599, in save_reduce
    save(args)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 725, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 395, in save_function
    self.save_function_tuple(obj)
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 594, in save_function_tuple
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 841, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 599, in save_reduce
    save(args)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 740, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 740, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 495, in save
    rv = reduce(self.proto)
  File "stringsource", line 2, in preshed.maps.PreshMap.__reduce_cython__
TypeError: self.c_map cannot be converted to a Python object for pickling
"""Exception in thread QueueFeederThread:
Traceback (most recent call last):
  File "C:Python35libsite-packagesjoblibexternalslokybackendqueues.py", line 150, in _feed
    obj_ = dumps(obj, reducers=reducers)
  File "C:Python35libsite-packagesjoblibexternalslokybackendreduction.py", line 243, in dumps
    dump(obj, buf, reducers=reducers, protocol=protocol)
  File "C:Python35libsite-packagesjoblibexternalslokybackendreduction.py", line 236, in dump
    _LokyPickler(file, reducers=reducers, protocol=protocol).dump(obj)
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 267, in dump
    return Pickler.dump(self, obj)
  File "C:Python35libpickle.py", line 408, in dump
    self.save(obj)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 841, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 770, in save_list
    self._batch_appends(obj)
  File "C:Python35libpickle.py", line 797, in _batch_appends
    save(tmp[0])
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 725, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 718, in save_instancemethod
    self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
  File "C:Python35libpickle.py", line 599, in save_reduce
    save(args)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 725, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 395, in save_function
    self.save_function_tuple(obj)
  File "C:Python35libsite-packagesjoblibexternalscloudpicklecloudpickle.py", line 594, in save_function_tuple
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 841, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 810, in save_dict
    self._batch_setitems(obj.items())
  File "C:Python35libpickle.py", line 836, in _batch_setitems
    save(v)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 599, in save_reduce
    save(args)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 740, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "C:Python35libpickle.py", line 623, in save_reduce
    save(state)
  File "C:Python35libpickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:Python35libpickle.py", line 740, in save_tuple
    save(element)
  File "C:Python35libpickle.py", line 495, in save
    rv = reduce(self.proto)
  File "stringsource", line 2, in preshed.maps.PreshMap.__reduce_cython__
TypeError: self.c_map cannot be converted to a Python object for pickling

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:Python35libthreading.py", line 914, in _bootstrap_inner
    self.run()
  File "C:Python35libthreading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "C:Python35libsite-packagesjoblibexternalslokybackendqueues.py", line 175, in _feed
    onerror(e, obj)
  File "C:Python35libsite-packagesjoblibexternalslokyprocess_executor.py", line 310, in _on_queue_feeder_error
    self.thread_wakeup.wakeup()
  File "C:Python35libsite-packagesjoblibexternalslokyprocess_executor.py", line 155, in wakeup
    self._writer.send_bytes(b"")
  File "C:Python35libmultiprocessingconnection.py", line 183, in send_bytes
    self._check_closed()
  File "C:Python35libmultiprocessingconnection.py", line 136, in _check_closed
    raise OSError("handle is closed")
OSError: handle is closed



The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File ".../playground.py", line 43, in <module>
    print(obj.parallelize(sentences))
  File ".../playground.py", line 32, in parallelize
    results = Parallel(n_jobs=2)(delayed(self.get_nouns)(sent) for sent in tqdm(sentences))
  File "C:\Python35\lib\site-packages\joblib\parallel.py", line 934, in __call__
    self.retrieve()
  File "C:\Python35\lib\site-packages\joblib\parallel.py", line 833, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "C:\Python35\lib\site-packages\joblib\_parallel_backends.py", line 521, in wrap_future_result
    return future.result(timeout=timeout)
  File "C:\Python35\lib\concurrent\futures\_base.py", line 405, in result
    return self.__get_result()
  File "C:\Python35\lib\concurrent\futures\_base.py", line 357, in __get_result
    raise self._exception
_pickle.PicklingError: Could not pickle the task to send it to the workers.

What is the problem in my code?

Asked By: Minions


Answers:

Q: What is the problem in my code?

Well, most probably the issue comes not from your code but from the "hidden" processing that takes place once n_jobs instructs joblib (which orchestrates this internally) to prepare that many exact copies of the main process, so that they can work independently of one another (effectively escaping GIL locking and mapping the multiple process flows onto physical hardware resources).

This step is responsible for making copies of all the Python objects involved, and it uses pickle to do so. The pickle module has well-known, principal limitations on what can and cannot be pickled.
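
As a minimal illustration of such a limitation (a made-up example, unrelated to spaCy): open file handles cannot be pickled at all.

import pickle

fh = open('some_file.txt', 'w')   # hypothetical file, for illustration only
pickle.dumps(fh)                  # raises TypeError: file handles cannot be pickled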

The error message confirms this:

TypeError: self.c_map cannot be converted to a Python object for pickling

One may try the trick of supplying Mike McKerns's dill module in place of pickle and test whether your "problematic" Python objects can be pickled with it without throwing this error.

dill has the same API signatures, so a plain import dill as pickle may help while leaving all the other code the same.
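
A minimal sketch of that test, assuming dill is installed and obj is the nouns instance from the question:

import dill as pickle  # drop-in replacement with the same dumps / loads API

# check whether the task joblib would ship survives a serialization round-trip
payload = pickle.dumps(obj.get_nouns)
restored = pickle.loads(payload)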

I had the same problems with distributing large models to, and collecting them back from, multiple processes, and dill was the way to go. Performance also increased.

Bonus: dill allows you to save / restore the full Python interpreter state!

This was a cool side effect of finding dill: once import dill as pickle is done, pickle.dump_session( <aFile> ) saves a complete, stateful copy of the Python interpreter session. It can be restored when needed (post-crash recovery; a trained and optimised ML model saved and restored with full state; an incrementally learning ML model saved with full state and re-distributed for remote restores across a deployed user base, etc.).
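
A minimal sketch of the session feature (the file name is arbitrary):

import dill

dill.dump_session('session.pkl')   # persist the whole interpreter session

# ... later, possibly in a fresh interpreter:
import dill
dill.load_session('session.pkl')   # restore every global that was saved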

Answered By: user3666197

An additional answer to my own question:

I didn't find a solution for joblib with spaCy. Instead, to parallelize the process, I found that spaCy provides a pipeline method, nlp.pipe, which can parse a large number of documents using multiple threads.

I applied it to the same example above:

import time
import spacy

nlp = spacy.load('en_core_web_sm')

class nouns:

    def get_nouns(self, sentences):
        start = time.time()
        # n_threads has been a deprecated no-op since spaCy v2.1;
        # on spaCy v3 use nlp.pipe(sentences, n_process=-1) instead
        docs = nlp.pipe(sentences, n_threads=-1)
        result = [' '.join([token.text for token in doc
                            if token.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']])
                  for doc in docs]
        print('Time Elapsed {} ms'.format((time.time() - start) * 1000))
        print(result)


if __name__ == '__main__':
    sentences = ['we went to the school yesterday',
                 'The weather is really cold',
                 'Can we catch the dog?',
                 'How old are you John?',
                 'I like diving and swimming',
                 'Can the world become united?']
    obj = nouns()
    obj.get_nouns(sentences)
Answered By: Minions

I had a similar problem when parallelizing lemmatization, but with another library, pymystem3.

from pymystem3 import Mystem
from joblib import Parallel, delayed
from tqdm import tqdm

# initialized at module level; this is the object that fails to pickle
mystem = Mystem()

def preprocess_text(text):
    ...
    tokens = mystem.lemmatize(text)
    ...
    text = " ".join(tokens)
    return text

data_set = Parallel(n_jobs=-1)(delayed(preprocess_text)(article) for article in tqdm(articles))

The solution was to move the initialization into the function:

def preprocess_text(text):
    ...
    mystem = Mystem()
    tokens = mystem.lemmatize(text)
    ...
    text = " ".join(tokens)
    return text

I suspect you could try the same with nlp = spacy.load, as sketched below.
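
A sketch of that idea applied to the question's code (loading inside the function is shown for clarity; in practice you would cache the model per worker, because spacy.load is expensive):

import spacy
from joblib import Parallel, delayed
from tqdm import tqdm

def get_nouns(text):
    # the model is created inside the worker process, so it never
    # has to be pickled and shipped across the process boundary
    nlp = spacy.load('en_core_web_sm')
    doc = nlp(text)
    return [t.text for t in doc if t.tag_ in ['NN', 'NNP', 'NNS', 'NNPS']]

results = Parallel(n_jobs=2)(delayed(get_nouns)(s) for s in tqdm(sentences))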

Answered By: Timur

Same issue here. I solved it by changing the backend from loky to threading in Parallel, as shown below.
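
Applied to the question's code, that is a one-argument change (a sketch; the threading backend sidesteps pickling entirely, though it only speeds things up where the underlying work releases the GIL):

results = Parallel(n_jobs=2, backend="threading")(delayed(self.get_nouns)(sent) for sent in tqdm(sentences))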

Answered By: Tommaso Di Noto

Just want to add my two cents: use @staticmethod instead of a regular method and spare the auto-injected self object, to prevent accidentally serializing a whole framework, as happened in my case (Flask). Frameworks do a lot of behind-the-scenes injection, which blows up the serialization dependencies.
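
A minimal sketch of the idea, with a hypothetical class (TextCleaner is made up for illustration):

from joblib import Parallel, delayed

class TextCleaner:

    @staticmethod
    def clean(text):
        # no auto-injected self, so pickling the task does not drag
        # the instance (and everything it references) along with it
        return text.strip().lower()

results = Parallel(n_jobs=2)(delayed(TextCleaner.clean)(s) for s in [' Hello ', ' World '])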

Answered By: lkaupp

Case study on how I fixed this:

Environment

  • Windows 10 x64
  • Python 3.9 or 3.10
  • joblib v1.1

Solution

Examine the stack trace very carefully; you will see a line something like this:

TypeError: self.c_map cannot be converted to a Python object for pickling

This tells you exactly which variable could not be serialized.

To fix, choose one option:

  1. Remove the variable from the function.
  2. Initialize the variable in the function from scratch.

In my case, I had to use a mixture of #1 and #2:

  1. Removed a variable that pointed to a class holding a handle to an open file (pickle cannot handle anything with an open file).
  2. A class variable could not be pickled, so I initialized it again inside the function (which removes the need to serialize this class and pass it to the new process).

Example code

# [Bugfix]. Add next line to initialize this again to eliminate pmap pickle error. Stacktrace is your friend!
hdb = HivedbApi(base_dir=hivedb_base_dir, table_name=table_name, partition_type=PartitionType.HiveFilePerDate)
hdb.write(df_trades)

In the example in the OP, I would hunt for some variable inside get_nouns() that could not be serialized; the stack trace will tell you exactly which variable it is stumbling over.
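
One way to hunt is to probe each piece the task would carry, one by one (a sketch; obj stands for whatever instance the failing task references):

import pickle

for name, value in vars(obj).items():
    try:
        pickle.dumps(value)
    except Exception as exc:
        print(name, 'cannot be pickled:', exc)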

Solutions that did not work

Nothing else on this page worked for me, including changing the backend to threading, switching the pickler to dill, annotating the function, changing the Python version, etc.

Bottom line:

Sometimes nothing can serialize a class, especially one that holds handles to open files. In this case, the only solution is to (a) remove these variables from the target function, or (b) reinitialize these variables inside the target function, as in the sketch below.
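
A minimal sketch of option (b), with hypothetical file names: open the handle inside the worker instead of passing an object that already holds one.

from joblib import Parallel, delayed

def count_lines(path):
    # the file handle is created fresh inside each worker,
    # so no open handle ever needs to be pickled
    with open(path) as fh:
        return sum(1 for _ in fh)

paths = ['a.txt', 'b.txt']   # assumed input files
counts = Parallel(n_jobs=2)(delayed(count_lines)(p) for p in paths)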

Answered By: Contango

In case you are still not able to find a solution:
I resolved the error by changing

Parallel(n_jobs=8)

to

Parallel(n_jobs=8, prefer="threads")

prefer="threads" is a soft hint that tells joblib to pick the threading backend, which avoids pickling the task altogether.

Answered By: Muhammad Fazeel