Large File Crashing on Jupyter Notebook

I'm not sure this will help since the information you've provided is limited, but I ran into a similar issue on Python 3. Try adding this at the top of your notebook and see if it fixes the crash:

import os
# Allow duplicate OpenMP runtimes to coexist (workaround for the "OMP: Error #15" libiomp5 crash)
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'

The above is a band-aid: the setting is unsupported and may cause undefined behavior. If your data is too big to fit in memory, try reading it with dask instead:

import dask.dataframe as dd
# Lazily reads the CSV in partitions; accepts most of the same keyword arguments as pandas.read_csv
df = dd.read_csv(path)
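
A rough usage sketch (the file name "big.csv" and the column name "value" are placeholders, not from the question):

import dask.dataframe as dd

df = dd.read_csv("big.csv")            # lazy: builds a task graph, nothing is loaded yet
total = df["value"].sum().compute()    # executes in chunks that fit in memory
print(total)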

Jupyter Notebook kernel crashing non-stop

> File "/mnt/sda5/knuth/geo/email.py", line 1, in <module>

This line revealed the problem.

My current working directory contained a file of mine named "email.py". Since Jupyter Notebook was running from that directory, Python imported my file instead of the standard library's email package, and then crashed when it looked for something that only exists in the real package.

Two Lessons:

  • In Python, the current working directory has the highest priority on the import path, higher even than the standard library's install location.

  • Never give your files generic names (like email.py, random.py, or string.py) in the directory from which you run Jupyter Notebook or Python, or they will shadow standard-library packages. A quick way to check for this is shown below.
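
A minimal sketch of how to check whether a local file is shadowing a standard-library module (the module email matches the case above; the comments describe typical output, not the original setup):

import sys
import email

print(sys.path[0])     # usually '' or the notebook's directory, searched before the standard library
print(email.__file__)  # should point inside your Python installation, not at a local email.py

If email.__file__ points at your own email.py, rename or remove that file and restart the kernel.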

Thanks to Min RK for solving this on the Gitter channel.


