Part of Proxy
Known subclasses: Proxy.ProxySSLHandler
Request handler for HTTP requests.
Line # | Kind | Name | Docs |
---|---|---|---|
60 | Method | __init__ | Undocumented |
79 | Method | is_ssl | Undocumented |
82 | Method | rewrite_referer | Rewrite the referer URL, stripping the proxy hostname part (see the sketch after the table). |
88 | Method | handle_redirect | Handle the Location header, saving the new location in the response object. |
102 | Method | parse_accept_encoding | Parse the incoming request's accept-encoding header to determine if we can gzip. |
132 | Method | do_GET | Undocumented |
135 | Method | do_POST | Undocumented |
138 | Method | do_GETPOST | Handle GET and POST requests: fetch the page, rewrite it and return it to the client. The 'post' boolean indicates whether this is a POST request; if so, the post data is read. |
287 | Method | rewrite_cookie | Rewrite the cookie to have the correct domain attribute (see the sketch after the table). |
308 | Method | handle_rewritable | Handle HTML, JS and CSS pages. |
362 | Method | reader | Function used to read blocks from the server response, gzip-decompressing them if necessary. |
373 | Method | writer | Function used to write blocks to the client, gzip-compressing them if necessary. |
386 | Method | handle_content | Undocumented |
454 | Method | is_blocked | Returns true if the remote host is blocked. |
471 | Method | is_disallowed | Returns true if the remote host is not allowed (invalid hostname or private-network address); see details below. |
511 | Method | handle_robot_block | If robots should be blocked and '/robots.txt' is requested, then send a fake robots.txt blocking robots. Returns True if we send the fake robots.txt. |
534 | Method | handle_own | No summary |
576 | Method | handle_file | Undocumented |
598 | Method | my_log_error | Log a request to error log. |
636 | Method | my_log_request | Log a request to access log. |
673 | Method | log_request | Override the log_request() method to not log anything. |
679 | Method | log_error | Override the log_error() method to not log anything. |
685 | Method | address_string | Override the address_string method to resolve the client address based on the configuration setting. |
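For illustration, here is a minimal sketch of the kind of Referer rewriting that rewrite_referer describes. The URL layout (remote pages served under http://&lt;proxy_host&gt;/&lt;remote_host&gt;/&lt;path&gt;) and the proxy_host parameter are assumptions for the example, not taken from the source:

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_referer(referer: str, proxy_host: str) -> str:
    """Strip the proxy hostname from a Referer URL.

    Assumes (hypothetically) that the proxy serves remote pages under
    http://<proxy_host>/<remote_host>/<path>, so the original URL can be
    recovered by dropping the proxy host and re-splitting the path.
    """
    parts = urlsplit(referer)
    if parts.netloc != proxy_host:
        return referer  # not our proxy; leave the referer untouched
    # path looks like "/<remote_host>/<rest-of-path>"
    remote_host, _, rest = parts.path.lstrip("/").partition("/")
    return urlunsplit((parts.scheme, remote_host, "/" + rest, parts.query, ""))

print(rewrite_referer("http://proxy.example:8080/www.example.org/page.html",
                      "proxy.example:8080"))
# -> http://www.example.org/page.html
```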
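Similarly, a hedged sketch of the cookie-domain rewrite described for rewrite_cookie, using the standard library's http.cookies; the proxy_domain parameter is hypothetical:

```python
from http import cookies

def rewrite_cookie(set_cookie_value: str, proxy_domain: str) -> str:
    """Rewrite the Domain attribute of a Set-Cookie header value.

    proxy_domain is a hypothetical parameter: the domain under which the
    proxy itself is reachable, so the browser sends the cookie back to it.
    """
    jar = cookies.SimpleCookie()
    jar.load(set_cookie_value)
    for morsel in jar.values():
        if morsel["domain"]:
            morsel["domain"] = proxy_domain
    # Re-serialize; header="" drops the "Set-Cookie:" prefix
    return jar.output(header="", sep="\r\n").strip()

print(rewrite_cookie("sid=abc123; Domain=.example.org; Path=/", "proxy.example"))
# -> sid=abc123; Path=/; Domain=proxy.example
```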
handle_redirect: Handle the Location header, saving the new location in the response object.
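A minimal sketch of this step, assuming the upstream response headers come from http.client and that redirect targets are mapped back through the proxy by a hypothetical rewrite_url callable:

```python
from urllib.parse import urljoin

def handle_redirect(response_headers, request_url, rewrite_url):
    """Extract and rewrite a redirect target from upstream response headers.

    response_headers: mapping-like object (e.g. http.client.HTTPResponse.headers)
    request_url:      the URL that was fetched, used to resolve relative targets
    rewrite_url:      hypothetical callable mapping an origin URL to its proxied form
    """
    location = response_headers.get("Location")
    if not location:
        return None
    # Location may be relative; resolve it against the request URL first
    absolute = urljoin(request_url, location)
    return rewrite_url(absolute)

# Usage: new_location = handle_redirect(resp.headers, url, lambda u: "/" + u)
```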
parse_accept_encoding: Parse the incoming request's accept-encoding header to determine if we can gzip.
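The exact parsing is not shown here; the following is a simplified sketch of Accept-Encoding negotiation for gzip, honouring q-values per RFC 7231:

```python
def client_accepts_gzip(accept_encoding: str) -> bool:
    """Return True if the Accept-Encoding header permits gzip (simplified)."""
    for element in accept_encoding.split(","):
        coding, _, params = element.partition(";")
        if coding.strip().lower() not in ("gzip", "*"):
            continue
        q = 1.0
        params = params.strip()
        if params.lower().startswith("q="):
            try:
                q = float(params[2:])
            except ValueError:
                q = 0.0
        return q > 0.0  # q=0 means "not acceptable"
    return False

print(client_accepts_gzip("gzip, deflate, br"))       # True
print(client_accepts_gzip("identity;q=1, gzip;q=0"))  # False
```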
do_GETPOST: Handle GET and POST requests: fetch the page, rewrite it and return it to the client. The 'post' boolean indicates whether this is a POST request; if so, the post data is read.
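A minimal sketch of how the GET/POST split can funnel into one handler in a BaseHTTPRequestHandler subclass. The fetch-and-rewrite work of the real do_GETPOST is omitted, and the Content-Length-based body read and the stand-in response are assumptions for the example:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ExampleHandler(BaseHTTPRequestHandler):
    """Illustrates the GET/POST merge only; the real handler also fetches,
    rewrites and relays the upstream page."""

    def do_GET(self):
        self.do_GETPOST(post=False)

    def do_POST(self):
        self.do_GETPOST(post=True)

    def do_GETPOST(self, post):
        body = b""
        if post:
            # The request body length comes from the Content-Length header
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"received %d bytes\n" % len(body))

# Usage: HTTPServer(("", 8080), ExampleHandler).serve_forever()
```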
reader: Function used to read blocks from the server response, gzip-decompressing them if necessary.
writer: Function used to write blocks to the client, gzip-compressing them if necessary.
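A sketch of block-wise gzip handling for both directions, using zlib with a gzip wrapper (wbits = 16 + MAX_WBITS); the function names and the file-like upstream/downstream arguments are assumptions, not the source's own API:

```python
import zlib

def make_reader(upstream, gzipped, blocksize=8192):
    """Yield decoded blocks from an upstream file-like object."""
    decomp = zlib.decompressobj(16 + zlib.MAX_WBITS) if gzipped else None
    while True:
        block = upstream.read(blocksize)
        if not block:
            break
        yield decomp.decompress(block) if decomp else block
    if decomp:
        yield decomp.flush()  # emit any buffered tail

def make_writer(downstream, gzip_out):
    """Return a write(block, final=False) callable that optionally gzips output."""
    comp = zlib.compressobj(9, zlib.DEFLATED, 16 + zlib.MAX_WBITS) if gzip_out else None

    def write(block, final=False):
        if comp:
            downstream.write(comp.compress(block))
            if final:
                downstream.write(comp.flush())
        else:
            downstream.write(block)
    return write
```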
is_disallowed: Returns true if the remote host is not allowed: its name is too long, it is not a valid hostname, it contains only a single label, or the hostname or its resolved addresses belong to private networks.
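A rough sketch of those checks using the standard library; the length limits and exact rules are assumptions and may differ from the real method:

```python
import ipaddress
import socket

def is_disallowed(hostname: str) -> bool:
    """Reject overlong, malformed, single-label or private-network hosts."""
    if len(hostname) > 253:                   # overlong name
        return True
    labels = hostname.rstrip(".").split(".")
    if len(labels) < 2:                       # single-label names are refused
        return True
    for label in labels:
        if not label or len(label) > 63:      # malformed hostname
            return True
    # The hostname itself may be an address literal inside a private network
    try:
        if ipaddress.ip_address(hostname).is_private:
            return True
    except ValueError:
        pass                                  # not an IP literal; resolve it
    # Any resolved address inside a private network disallows the host
    try:
        infos = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return True
    for *_, sockaddr in infos:
        if ipaddress.ip_address(sockaddr[0]).is_private:
            return True
    return False
```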
handle_robot_block: If robots should be blocked and '/robots.txt' is requested, send a fake robots.txt blocking robots. Returns True if the fake robots.txt was sent.
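A hedged sketch of that behaviour for a BaseHTTPRequestHandler-style object; the block_robots flag mirrors the configuration setting implied by the docstring and is hypothetical:

```python
FAKE_ROBOTS_TXT = b"User-agent: *\r\nDisallow: /\r\n"

def handle_robot_block(handler, block_robots):
    """Serve a robots.txt that disallows everything when robots are blocked.

    Returns True if the fake robots.txt was sent and the request needs no
    further handling.
    """
    if not (block_robots and handler.path == "/robots.txt"):
        return False
    handler.send_response(200)
    handler.send_header("Content-Type", "text/plain")
    handler.send_header("Content-Length", str(len(FAKE_ROBOTS_TXT)))
    handler.end_headers()
    handler.wfile.write(FAKE_ROBOTS_TXT)
    return True
```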