
Commit e54f5a3

Remove unused skip_double_compressed_messages
This `skip_double_compressed_messages` flag was added in #755 in order to fix #718. However, grep'ing through the code, it looks like this is no longer used anywhere and doesn't do anything, so it is being removed.
1 parent ac5a935 commit e54f5a3

File tree

2 files changed: 0 additions, 16 deletions


kafka/consumer/fetcher.py

Lines changed: 0 additions & 8 deletions
@@ -55,7 +55,6 @@ class Fetcher(six.Iterator):
         'max_partition_fetch_bytes': 1048576,
         'max_poll_records': sys.maxsize,
         'check_crcs': True,
-        'skip_double_compressed_messages': False,
         'iterator_refetch_records': 1, # undocumented -- interface may change
         'metric_group_prefix': 'consumer',
         'api_version': (0, 8, 0),
@@ -98,13 +97,6 @@ def __init__(self, client, subscriptions, metrics, **configs):
                 consumed. This ensures no on-the-wire or on-disk corruption to
                 the messages occurred. This check adds some overhead, so it may
                 be disabled in cases seeking extreme performance. Default: True
-            skip_double_compressed_messages (bool): A bug in KafkaProducer
-                caused some messages to be corrupted via double-compression.
-                By default, the fetcher will return the messages as a compressed
-                blob of bytes with a single offset, i.e. how the message was
-                actually published to the cluster. If you prefer to have the
-                fetcher automatically detect corrupt messages and skip them,
-                set this option to True. Default: False.
         """
         self.config = copy.copy(self.DEFAULT_CONFIG)
         for key in self.config:
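
The surviving context lines show why deleting the key from DEFAULT_CONFIG is enough: the fetcher copies its defaults and then overrides only the keys it already knows about, so an unknown keyword is never consulted. The sketch below is a minimal, hypothetical reduction of that pattern -- SketchFetcher and its trimmed-down DEFAULT_CONFIG are illustrative, not the library's actual class, which also takes a client, subscription state, and metrics.

import copy

# Illustrative subset of the defaults shown in the diff; after this commit
# 'skip_double_compressed_messages' is no longer among the recognized keys.
DEFAULT_CONFIG = {
    'max_partition_fetch_bytes': 1048576,
    'check_crcs': True,
}

class SketchFetcher(object):
    def __init__(self, **configs):
        # Copy the defaults, then take overrides only for known keys;
        # anything else in `configs` is simply never looked at here.
        self.config = copy.copy(DEFAULT_CONFIG)
        for key in self.config:
            if key in configs:
                self.config[key] = configs[key]

f = SketchFetcher(check_crcs=False, skip_double_compressed_messages=True)
print(f.config)  # the removed flag never makes it into self.config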

kafka/consumer/group.py

Lines changed: 0 additions & 8 deletions
@@ -165,13 +165,6 @@ class KafkaConsumer(six.Iterator):
         consumer_timeout_ms (int): number of milliseconds to block during
             message iteration before raising StopIteration (i.e., ending the
             iterator). Default block forever [float('inf')].
-        skip_double_compressed_messages (bool): A bug in KafkaProducer <= 1.2.4
-            caused some messages to be corrupted via double-compression.
-            By default, the fetcher will return these messages as a compressed
-            blob of bytes with a single offset, i.e. how the message was
-            actually published to the cluster. If you prefer to have the
-            fetcher automatically detect corrupt messages and skip them,
-            set this option to True. Default: False.
         security_protocol (str): Protocol used to communicate with brokers.
             Valid values are: PLAINTEXT, SSL. Default: PLAINTEXT.
         ssl_context (ssl.SSLContext): Pre-configured SSLContext for wrapping
@@ -279,7 +272,6 @@ class KafkaConsumer(six.Iterator):
         'sock_chunk_bytes': 4096, # undocumented experimental option
         'sock_chunk_buffer_count': 1000, # undocumented experimental option
         'consumer_timeout_ms': float('inf'),
-        'skip_double_compressed_messages': False,
         'security_protocol': 'PLAINTEXT',
         'ssl_context': None,
         'ssl_check_hostname': True,
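
For callers, the migration is simply to stop passing the flag. Below is a minimal usage sketch after this change; 'my-topic' and 'localhost:9092' are placeholders, and the remaining options are ones that still appear in the diff context (check_crcs, consumer_timeout_ms).

from kafka import KafkaConsumer

# After this commit, 'skip_double_compressed_messages' is no longer a
# documented consumer option, so it is simply omitted here.
consumer = KafkaConsumer(
    'my-topic',                          # placeholder topic
    bootstrap_servers='localhost:9092',  # placeholder broker address
    check_crcs=True,                     # still documented: verify record CRCs
    consumer_timeout_ms=10000,           # stop iterating after 10s without messages
)

for message in consumer:
    print(message.topic, message.offset, message.value)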
