Python handling socket.error: [Errno 104] Connection reset by peer
![Hero image for Python handling socket.error: [Errno 104] Connection reset by peer](/img/ef7387a3-hero.webp)
Understand and resolve the common `Connection reset by peer` error in Python applications, particularly when using `urllib2` on Ubuntu systems.
The `socket.error: [Errno 104] Connection reset by peer` is a common and often frustrating error encountered in network programming, especially when dealing with HTTP requests in Python. This error indicates that the remote server abruptly closed the connection. It's not always an error in your Python code but rather a symptom of an issue on the server side, an intermediary network device, or how your client application interacts with the server.
Understanding the 'Connection reset by peer' Error
This error, often seen as `socket.error: [Errno 104] Connection reset by peer` or `ConnectionResetError: [Errno 104] Connection reset by peer` in Python 3, signifies that the other end of your network connection (the 'peer') has unexpectedly closed the connection. Unlike a graceful shutdown where both sides agree to close, a 'connection reset' is an abrupt termination. This usually happens when the remote server receives data that it doesn't expect or can't process, or when it decides to close the connection due to an internal error, timeout, or resource exhaustion. The server sends a TCP RST (reset) packet, which causes your client's socket to raise this error.
```mermaid
sequenceDiagram
    participant Client
    participant Server
    Client->>Server: SYN (Connection Request)
    Server->>Client: SYN-ACK (Acknowledge Request)
    Client->>Server: ACK (Acknowledge Acknowledge)
    Note over Client,Server: TCP Handshake Complete
    Client->>Server: HTTP Request (e.g., GET /data)
    Server--xClient: RST (Connection Reset)
    Client->>Client: socket.error: [Errno 104] Connection reset by peer
    Note over Server: Server abruptly closes connection
```
Sequence diagram illustrating a 'Connection reset by peer' scenario
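On Linux, the peer's RST surfaces as errno 104 (`ECONNRESET`), so if you work at the socket level you can detect it explicitly. The following is a minimal sketch under that assumption; the host, port, and hand-written HTTP request are placeholders for illustration:

```python
import errno
import socket

HOST, PORT = 'example.com', 80  # Placeholder endpoint

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(10)
try:
    sock.connect((HOST, PORT))
    sock.sendall('GET / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n' % HOST)
    data = sock.recv(4096)  # A TCP RST from the peer raises socket.error on send or recv
except socket.error as e:
    if e.errno == errno.ECONNRESET:  # Errno 104 on Linux
        print 'Peer reset the connection (errno 104).'
    else:
        raise
finally:
    sock.close()
```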
Common Causes and Diagnosis
Several factors can lead to a `Connection reset by peer` error. Identifying the root cause often requires examining both client-side behavior and potential server-side issues. Here are the most common culprits:
- Server-Side Issues: The server might be overloaded, crashed, or configured to terminate connections that exceed certain limits (e.g., idle timeouts, request size limits, rate limiting).
- Incorrect Client Request: Your Python application might be sending malformed HTTP requests, invalid headers, or data that the server cannot parse, causing it to reset the connection.
- Firewalls or Proxies: An intermediary firewall, proxy server, or load balancer might be terminating the connection due to security policies, timeouts, or misconfigurations.
- Keep-Alive Issues: If your client expects a persistent connection (HTTP Keep-Alive) but the server closes it prematurely, subsequent requests on the same socket will fail.
- SSL/TLS Handshake Failures: Mismatched SSL/TLS versions, invalid certificates, or other security-related issues during the handshake can cause the server to reset the connection.
- Server Restart/Crash: If the server process restarts or crashes while your client is connected, existing connections will be reset.
```python
import urllib2

try:
    response = urllib2.urlopen('http://example.com/api/data')
    html = response.read()
    print html
except urllib2.URLError as e:
    if hasattr(e, 'reason'):
        print 'Failed to reach a server.'
        print 'Reason: ', e.reason
    elif hasattr(e, 'code'):
        print 'The server couldn\'t fulfill the request.'
        print 'Error code: ', e.code
except Exception as e:
    print 'An unexpected error occurred:', e
```
Basic `urllib2` request that might encounter `socket.error`
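If a malformed request is suspected (see the causes above), it helps to see exactly what `urllib2` puts on the wire. One option is `urllib2`'s `debuglevel` setting, sketched below against the same placeholder URL; the outgoing request and the response headers are echoed to stdout:

```python
import urllib2

# Print the HTTP conversation (request line, headers, response status)
# so malformed headers or bodies are easy to spot.
debug_handler = urllib2.HTTPHandler(debuglevel=1)
opener = urllib2.build_opener(debug_handler)
urllib2.install_opener(opener)

try:
    response = urllib2.urlopen('http://example.com/api/data', timeout=10)
    print 'Status:', response.getcode()
except urllib2.URLError as e:
    print 'Request failed:', e
```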
Strategies for Resolution
Resolving `Connection reset by peer` involves a systematic approach. Here are several strategies:
- Implement Robust Error Handling and Retries: The most immediate client-side fix is to catch the error and implement a retry mechanism, possibly with exponential backoff. This handles transient network issues or temporary server overloads.
- Inspect Request Headers and Body: Ensure your HTTP requests are well-formed. Check `Content-Type`, `Content-Length`, `User-Agent`, and any custom headers; malformed requests are a common cause of server resets (see the header sketch after the retry example below).
- Reduce Request Frequency/Size: If you're making many requests or sending large payloads, the server might be rate-limiting or hitting resource limits. Try reducing the frequency or breaking down large requests.
- Verify Server Status and Logs: Confirm the server is running and check its error logs. Look for application errors, database connection issues, or signs of resource exhaustion (CPU, memory).
- Test with Different Clients/Tools: Use `curl` or a web browser to make the same request. If they succeed, the issue is likely specific to your Python code or `urllib2` configuration. If they also fail, the problem is likely server-side or network-related.
- Disable HTTP Keep-Alive (if applicable): With `urllib2` you might not have direct control, but if you're using `requests` or lower-level sockets, explicitly closing connections after each request can sometimes prevent issues with server-side keep-alive timeouts.
- Update Libraries: Ensure `urllib2` (or `requests`, if you migrate) and Python itself are up to date, as bug fixes related to networking are common.
- Check Network Configuration: Investigate firewalls, proxies, or VPNs that might be interfering with the connection. Temporarily disabling them (if safe and possible) can help diagnose.
```python
import urllib2
import time

def fetch_url_with_retry(url, max_retries=3, delay=2):
    for i in range(max_retries):
        try:
            response = urllib2.urlopen(url, timeout=10)  # Add a timeout
            return response.read()
        except urllib2.URLError as e:
            if hasattr(e, 'reason') and 'Connection reset by peer' in str(e.reason):
                print 'Attempt %d: Connection reset by peer. Retrying in %d seconds...' % (i + 1, delay)
                time.sleep(delay)
                delay *= 2  # Exponential backoff
            else:
                raise  # Re-raise other URLErrors
        except Exception as e:
            print 'Attempt %d: An unexpected error occurred: %s' % (i + 1, e)
            time.sleep(delay)
            delay *= 2
    raise Exception('Failed to fetch URL after %d retries: %s' % (max_retries, url))

# Example usage:
try:
    content = fetch_url_with_retry('http://example.com/api/data')
    print 'Successfully fetched content.'
    # print content  # Uncomment to see content
except Exception as e:
    print 'Final failure:', e
```
Implementing a retry mechanism with exponential backoff for `urllib2` requests
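To act on the header and keep-alive strategies listed above, you can build an explicit `urllib2.Request` instead of passing a bare URL. This is a sketch only; the URL, JSON payload, and header values are illustrative placeholders:

```python
import urllib2

url = 'http://example.com/api/data'   # Placeholder URL
payload = '{"query": "status"}'       # Placeholder JSON body (sent as a POST)

request = urllib2.Request(url, data=payload)
# Declare exactly what is being sent so the server can parse it.
request.add_header('Content-Type', 'application/json')
request.add_header('User-Agent', 'my-client/1.0')
# Hint that the connection should not be kept open; mainly relevant for
# clients that reuse sockets and hit server-side keep-alive timeouts.
request.add_header('Connection', 'close')

try:
    response = urllib2.urlopen(request, timeout=10)
    print 'Status:', response.getcode()
except urllib2.URLError as e:
    print 'Request failed:', e
```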
While `urllib2` is part of Python 2.7, for new projects or when migrating to Python 3, consider using the `requests` library. It offers a much more user-friendly API and handles many common HTTP issues more gracefully.
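As a rough equivalent of the retry helper above, here is a sketch using `requests` with a `Session` and `urllib3`'s `Retry`; the retry counts, backoff factor, and status codes are illustrative choices, and the code assumes `requests` (with its `urllib3` dependency) is installed:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient connection failures and the listed 5xx responses,
# backing off exponentially between attempts.
retries = Retry(total=3, backoff_factor=1, status_forcelist=[500, 502, 503, 504])
session = requests.Session()
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

try:
    response = session.get('http://example.com/api/data', timeout=10)
    response.raise_for_status()
    print(response.text[:200])
except requests.exceptions.ConnectionError as e:
    # A peer reset typically surfaces here, wrapping the underlying
    # ConnectionResetError / errno 104.
    print('Connection error: %s' % e)
except requests.exceptions.RequestException as e:
    print('Request failed: %s' % e)
```

With the adapter in place, resets that occur while connecting (and, for idempotent requests such as GET, while reading the response) are retried automatically before any exception reaches your code.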