TypeError: cannot pickle 'sqlite3.Connection' object when running Ray Tune with Jupyter notebook script

Description of problem

I got a “TypeError: cannot pickle ‘sqlite3.Connection’ object” error when running the attached Ray Tune code. The code needs to import a notebook script called ttt.ipynb. The error does not occur when using a plain Python script instead.

Minimal code to reproduce problem

example.py (936 Bytes)

The ttt.ipynb is as follows:

  from brian2 import *

  def ttt():
      set_device('runtime')
      prefs.codegen.target = 'numpy'
      return 2.0

Full traceback of error (if relevant)

  PC:wxie:ray.tune>ipython example.py
  ---------------------------------------------------------------------------
  TypeError                                 Traceback (most recent call last)
  File ~/code_example/python/ray.tune/example.py:24, in <module>
       20 raw_log_name = "example"
       22 algorithm = HyperOptSearch(search_space, metric="SCORE", mode="max", n_initial_points=1)
  ---> 24 tuner = tune.Tuner(objective,
       25         tune_config = tune.TuneConfig(
       26             num_samples = 1,
       27             search_alg=algorithm,
       28             ),
       29         param_space=search_space,
       30         run_config = air.RunConfig(local_dir = raw_log_dir, name = raw_log_name)
       31         )
       33 results = tuner.fit()
       34 print(results.get_best_result(metric="SCORE", mode="max").config)

  File ~/anaconda3/envs/b2c/lib/python3.8/site-packages/ray/tune/tuner.py:152, in Tuner.__init__(self, trainable, param_space, tune_config, run_config, _tuner_kwargs, _tuner_internal)
      150 kwargs.pop(_SELF, None)
      151 if not self._is_ray_client:
  --> 152     self._local_tuner = TunerInternal(**kwargs)
      153 else:
      154     self._remote_tuner = _force_on_current_node(
      155         ray.remote(num_cpus=0)(TunerInternal)
      156     ).remote(**kwargs)

  File ~/anaconda3/envs/b2c/lib/python3.8/site-packages/ray/tune/impl/tuner_internal.py:125, in TunerInternal.__init__(self, restore_path, resume_config, trainable, param_space, tune_config, run_config, _tuner_kwargs)
      122     pickle.dump(self, fp)
      124 with open(experiment_checkpoint_path / _TRAINABLE_PKL, "wb") as fp:
  --> 125     pickle.dump(self._trainable, fp)
      126 self._maybe_warn_resource_contention()

  File ~/anaconda3/envs/b2c/lib/python3.8/site-packages/ray/cloudpickle/cloudpickle_fast.py:55, in dump(obj, file, protocol, buffer_callback)
       45 def dump(obj, file, protocol=None, buffer_callback=None):
       46     """Serialize obj as bytes streamed into file
       47
       48     protocol defaults to cloudpickle.DEFAULT_PROTOCOL which is an alias to
     (...)
       53     compatibility with older versions of Python.
       54     """
  ---> 55     CloudPickler(
       56         file, protocol=protocol, buffer_callback=buffer_callback
       57     ).dump(obj)

  File ~/anaconda3/envs/b2c/lib/python3.8/site-packages/ray/cloudpickle/cloudpickle_fast.py:627, in CloudPickler.dump(self, obj)
      625 def dump(self, obj):
      626     try:
  --> 627         return Pickler.dump(self, obj)
      628     except RuntimeError as e:
      629         if "recursion" in e.args[0]:

  TypeError: cannot pickle 'sqlite3.Connection' object
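For context, the failing step is just Python's pickler refusing to serialize a live database connection. The same TypeError can be reproduced in isolation with nothing but the standard library (this snippet is only an illustration of the error, not of the Ray Tune setup):

```python
import pickle
import sqlite3

# sqlite3.Connection wraps OS-level state (file handles, locks),
# so the pickle protocol rejects it outright
conn = sqlite3.connect(":memory:")
try:
    pickle.dumps(conn)
except TypeError as e:
    print(e)  # cannot pickle 'sqlite3.Connection' object
```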

Hi, I had a quick look and was able to reproduce the error. Running notebooks from another process can be a bit fragile, in particular when the code then runs things in separate parallel processes. I think the issue here is simply that your notebook uses from brian2 import *, which imports a lot of names into the notebook's namespace and then leads to name clashes when Ray serializes your code. Try importing brian2 by name instead, e.g. like this:

import brian2

def ttt():
    brian2.set_device('runtime')
    brian2.prefs.codegen.target = 'numpy'
    return 2.0

or, for slightly less typing:

import brian2 as b2
def ttt():
    b2.set_device('runtime')
    b2.prefs.codegen.target = 'numpy'
    return 2.0

In your small example, this is enough to make it work for me. Hope it works for your more complex code as well!
