Suppress excessive debug logs when consuming from RabbitMQ

If RabbitMQ is used as the RPC backend, oslo.messaging generates a large
amount of redundant timeout debug logs (several logs per second from
multiple OpenStack services, such as nova, heat, and cinder), of the
form 'Timed out waiting for RPC response: Timeout while waiting on RPC
response - topic: "<unknown>", RPC method: "<unknown>" info:
"<unknown>"'. This happens because each socket timeout exception is
raised through multiple levels of error recovery callbacks and then
logged again at each level.
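For illustration only (hypothetical names, not the actual
oslo.messaging code), a polling consume loop with a sub-second poll
timeout reproduces the flood: every quiet poll raises socket.timeout,
and each recovery layer that sees the exception logs the same line
again:

    import logging
    import socket

    LOG = logging.getLogger(__name__)
    POLL_TIMEOUT = 0.25  # assumption: sub-second poll interval

    def _raise_timeout(exc):
        # Old-style behavior: the timeout is logged here ...
        LOG.debug('Timed out waiting for RPC response: %s', exc)
        raise TimeoutError(str(exc))

    def _recoverable_error_callback(exc):
        # ... and logged again by the recovery callback one level up.
        LOG.debug('Timed out waiting for RPC response: %s', exc)

    def drain_once(sock):
        # Each quiet POLL_TIMEOUT interval ends in a socket.timeout
        # that travels through both callbacks, producing duplicate
        # debug lines several times per second.
        sock.settimeout(POLL_TIMEOUT)
        try:
            return sock.recv(4096)
        except socket.timeout as exc:
            _recoverable_error_callback(exc)
            _raise_timeout(exc)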

Moreover, the message attached to a socket.timeout exception is always
just "timed out", so these log lines carry no useful detail, and
oslo.messaging already implements a retry mechanism that recovers from
socket timeout failures. These logs should therefore be suppressed,
even at debug level, to save disk space and make debugging more
convenient.
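This is easy to verify against the standard library alone (the
unroutable test address below is an assumption, used only to force a
timeout):

    import socket

    try:
        sock = socket.socket()
        sock.settimeout(0.01)
        sock.connect(('10.255.255.1', 9))  # unroutable; forces a timeout
    except socket.timeout as exc:
        print(str(exc))  # always just "timed out" - nothing worth logging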

Change-Id: Iafc360f8d18871cff93e7fd721d793ecdef5f4a1
Closes-Bug: #1714558
(cherry picked from commit 147186c7b4)
Author: Zhen Qin, 2017-09-01 13:38:05 -04:00 (committed by Victor Stinner)
parent 623ba0d0e6
commit 2a567ca607
1 changed file with 4 additions and 5 deletions

@@ -1033,14 +1033,13 @@ class Connection(object):
         timer = rpc_common.DecayingTimer(duration=timeout)
         timer.start()
 
-        def _raise_timeout(exc):
-            LOG.debug('Timed out waiting for RPC response: %s', exc)
+        def _raise_timeout():
             raise rpc_common.Timeout()
 
         def _recoverable_error_callback(exc):
             if not isinstance(exc, rpc_common.Timeout):
                 self._new_tags = set(self._consumers.values())
-            timer.check_return(_raise_timeout, exc)
+            timer.check_return(_raise_timeout)
 
         def _error_callback(exc):
             _recoverable_error_callback(exc)
@@ -1073,9 +1072,9 @@ class Connection(object):
         try:
             self.connection.drain_events(timeout=poll_timeout)
             return
-        except socket.timeout as exc:
+        except socket.timeout:
             poll_timeout = timer.check_return(
-                _raise_timeout, exc, maximum=self._poll_timeout)
+                _raise_timeout, maximum=self._poll_timeout)
         except self.connection.channel_errors as exc:
             if exc.code == 406 and exc.method_name == 'Basic.ack':
                 # NOTE(gordc): occasionally multiple workers will grab
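
For context, here is a simplified sketch of the decaying-timer pattern
the patched loop relies on (names and signatures are simplified, not
the real rpc_common.DecayingTimer): a single overall deadline is
tracked, so individual socket timeouts stay silent and a timeout is
raised once at the end rather than logged on every poll.

    import time

    class DecayingTimer(object):
        # Simplified stand-in for rpc_common.DecayingTimer.
        def __init__(self, duration=None):
            self._duration = duration
            self._deadline = None

        def start(self):
            if self._duration is not None:
                self._deadline = time.monotonic() + self._duration

        def check_return(self, timeout_callback=None, maximum=None):
            # Return the remaining time budget; once it is spent,
            # invoke the callback a single time instead of logging
            # on every poll.
            if self._deadline is None:
                return maximum
            left = self._deadline - time.monotonic()
            if left <= 0 and timeout_callback is not None:
                timeout_callback()
            return left if maximum is None else min(left, maximum)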