
Process hang while executing logging.warning under specific circumstances #1166

Open
constcast opened this issue May 1, 2016 · 11 comments

@constcast

I identified a kernel hang when calling the Python logging facility under certain circumstances. The hang is specific to use inside Jupyter, i.e. it does not happen in an IPython command shell.

The hang happens when I import a specific module from this repository: https://github.com/secdev/scapy/ (the issue can be reproduced with the version from pip as well).

from scapy.route6 import *

The hang happens at line 234, when this call executes:

warning("No route found for IPv6 destination %s (no default route?)" % dst)

warning() is a wrapper that calls

logging.getLogger("scapy.runtime").warning(x)

where x is the message. I am not able to reproduce this issue in a smaller test case, i.e. I cannot find other situations where a call to logging.warning() hangs the kernel.
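
For context, the wrapper is roughly equivalent to the following sketch (a simplified reconstruction, not scapy's exact source):

import logging

log_runtime = logging.getLogger("scapy.runtime")

def warning(x):
    # Emit a WARNING record on the "scapy.runtime" logger; in the notebook
    # this record ends up on the kernel's redirected stderr stream.
    log_runtime.warning(x)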

The reason I open this issue with Jupyter is that the Jupyter notebook is the only environment where this results in a hang. The code works without issues in the standard Python 2.7 interpreter and in the IPython command shell.

I experience the issue on Mac OS X with the following Python and package versions:

$ python --version
Python 2.7.10  

IPython and Jupyter versions are:

ipykernel==4.3.1
ipython==4.2.0
ipython-genutils==0.1.0
ipywidgets==5.1.2
jupyter==1.0.0
jupyter-client==4.2.2
jupyter-console==4.1.1
jupyter-core==4.1.0

Is there any additional information that I can provide to further debug this problem?

@takluyver
Member

That's very odd, I don't know why it would hang. If you interrupt it while it's hanging, do you get any traceback?

There was a change to the logging setup in the kernel recently - #127. Can you try installing ipykernel from master and see if that makes a difference?

@constcast
Author

There is no backtrace or any other information when I interrupt the kernel.

I installed the current ipykernel master (commit ffadc63). The behavior is no different from the version currently available from pip.

Do you have any ideas what I can do to further debug this issue?

@takluyver
Member

Do you have any ideas what I can do to further debug this issue?

Not really, I'm afraid. Double check that it is definitely the log call hanging and not anything else nearby in the code. Dig into the logging module to try to work out exactly what is hanging.

@slishak

slishak commented Jul 4, 2016

@takluyver do you think this is the same issue, and does it help identify what's going wrong? Thanks!

ipython/ipyparallel#161

@takluyver
Member

On Windows I'd guess that the issue is trying to write to a nonexistent stderr; once some buffer fills up, the next write call will block. We've seen this before on Windows. This issue was reported on a Mac, so I doubt it's the same, but the cause may be similar.
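
For illustration, the failure mode described here (a write that blocks once the buffer behind a never-drained stream fills up) can be reproduced on its own with a plain OS pipe; this is only a model of the symptom, not the kernel's actual stream wiring:

import os

# A pipe whose read end is never drained stands in for the broken stderr.
read_fd, write_fd = os.pipe()

chunk = b"x" * 65536
while True:
    os.write(write_fd, chunk)  # blocks forever once the pipe buffer is full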

@binh-vu

binh-vu commented Jan 5, 2017

I have the same issue when trying to import the rdflib module from https://github.com/RDFLib/rdflib.

When importing rdflib in plain Python, it logs this line: INFO:rdflib:RDFLib Version: 4.2.1. When importing it from Jupyter, the kernel just hangs there.

UPDATE:
I tried uninstalling and reinstalling the following packages from Anaconda, but the bug is still there:

jupyter==1.0.0
jupyter_client==4.4.0
jupyter_console==5.0.0
jupyter-core==4.2.0
ipykernel==4.5.0
ipyparallel==5.2.0
ipython==5.1.0
ipython-genutils==0.1.0
ipywidgets==5.2.2

Currently, I run this code (to disable logging) before importing the module, as a workaround for this bug:

import logging
logging.getLogger("rdflib").setLevel(logging.ERROR)

@joe1gi

joe1gi commented Jan 8, 2017

I ran into the same problem with one of my own packages last week. Ultimately, this appears to be caused by Python's import lock causing a deadlock when ipykernel.iostream.OutStream.flush is called (by logging.warning or otherwise) during an import. Because of changes to the import logic, the problem may disappear when running on Python >= 3.3 (not tested).

OutStream.flush schedules OutStream._flush in a separate thread (pub_thread), and, via jupyter_client.session.Session.send, Session.msg, Session.msg_header, Session.msg_id and uuid.uuid4, triggers an import in pub_thread which blocks if an import is running in the main thread. At the same time, OutStream.flush waits for pub_thread to return from _flush, resulting in a deadlock.
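
The shape of the deadlock can be sketched in isolation with a plain lock standing in for the Python 2 import lock (illustrative names only, not ipykernel's actual code):

import threading

import_lock = threading.Lock()  # stands in for the interpreter's import lock

def flush_in_pub_thread():
    # Models OutStream._flush -> Session.send -> uuid.uuid4: the first use of
    # a not-yet-imported module needs the import lock held by the main thread.
    with import_lock:
        pass  # never reached

# Models "import hello": the main thread holds the import lock while hello.py
# runs sys.stderr.flush(), which then waits on pub_thread.
with import_lock:
    t = threading.Thread(target=flush_in_pub_thread)
    t.start()
    t.join()  # OutStream.flush waiting for pub_thread -> deadlock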

Minimal example that triggers the behavior:

# hello.py
import sys
sys.stderr.write('hello')
sys.stderr.flush()

In a notebook cell:

import hello
# never returns

Execution traces of the main thread and pub_thread.
Running IPython 5.1.0 and Jupyter notebook 4.3.1 on Python 2.7.6.

@minrk
Member

minrk commented Jan 9, 2017

That deadlock should be fixed in ipykernel 4.5.2, so upgrading ipykernel should solve it.

@joe1gi

joe1gi commented Jan 10, 2017

Unfortunately, I see the same behavior with ipykernel 4.5.2 (other versions are jupyter_client 4.4.0, jupyter_core 4.2.1, notebook 4.3.1).

Trace of pub_thread

@ikudriavtsev

I was able to reproduce it with these simple steps:

  1. Created a file named e.g. jupyter_logging_test.py with the following contents and put the file on the PYTHONPATH:

     import logging
     logging.error('foo')

  2. Ran the jupyter-console command.
  3. Ran the import jupyter_logging_test statement in the console.
  4. It hangs.

Please note that it happens only on import, i.e. it doesn't happen when one runs the Python file as a script.

The environment:

ipykernel==4.4.1
ipython==5.0.0
ipython-genutils==0.1.0
ipywidgets==5.2.2

OS X Sierra 10.12.3 (16D32)

@ivanov
Member

ivanov commented Nov 18, 2023

I haven't verified if this is still an issue, but moving it to the ipykernel repo, since it looks like that's where the fix would need to go.

@ivanov ivanov transferred this issue from jupyter/jupyter Nov 18, 2023