Proxy with Urllib2

Proxy with urllib2

import urllib2

# Route all subsequent urllib2 requests through the given HTTP proxy
proxy = urllib2.ProxyHandler({'http': '127.0.0.1'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
urllib2.urlopen('http://www.google.com')
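
The proxy value can also include a port and, if the proxy requires authentication, credentials in the URL. A minimal sketch, assuming a local proxy on port 8080 with placeholder credentials:

import urllib2

# Placeholder proxy address and credentials -- replace with your own
proxy = urllib2.ProxyHandler({'http': 'http://user:password@127.0.0.1:8080'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
response = urllib2.urlopen('http://www.google.com')
print response.read()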

How to send HTTP request using urllib2 with proxy

I think this is the syntax you're looking for:

import urllib2

proxy = urllib2.ProxyHandler({'http': '177.124.160.6'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
urllib2.urlopen('http://www.google.com/search')

Alternatively, with Python 2.7 you can use the requests library:

import requests

requests.request(
    method='GET',
    url='https://www.google.com/search',
    params=params,
    headers=headers,
    proxies=proxy)

Here each of params, headers, and proxies is a dictionary of key-value pairs.
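
As a rough sketch of what those dictionaries might look like (the query parameters, header values, and proxy address below are placeholders, not values from the question):

import requests

# Placeholder values for illustration only
params = {'q': 'python proxy'}
headers = {'User-Agent': 'Mozilla/5.0'}
proxy = {'http': 'http://127.0.0.1:8080',
         'https': 'http://127.0.0.1:8080'}

response = requests.request(
    method='GET',
    url='https://www.google.com/search',
    params=params,
    headers=headers,
    proxies=proxy)
print response.status_code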

How can I use a SOCKS 4/5 proxy with urllib2?

You can use the SocksiPy module. Simply copy the file "socks.py" to your Python lib/site-packages directory, and you're ready to go.

You must import socks before urllib2. (Try pip install PySocks.)

For example:

import socks
import socket

# Tell SocksiPy to use a SOCKS5 proxy for every new socket,
# then monkey-patch the socket module before urllib2 is imported
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, "127.0.0.1", 8080)
socket.socket = socks.socksocket

import urllib2
print urllib2.urlopen('http://www.google.com').read()

You can also try the pycurl library and tsocks; see their documentation for more detail.
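
For reference, a minimal pycurl sketch that sends a request through a SOCKS5 proxy (the proxy host and port are assumptions, not values from the question):

import pycurl
from StringIO import StringIO

buf = StringIO()
c = pycurl.Curl()
c.setopt(pycurl.URL, 'http://www.google.com')
# Assumed local SOCKS5 proxy; adjust host and port to match your setup
c.setopt(pycurl.PROXY, '127.0.0.1')
c.setopt(pycurl.PROXYPORT, 8080)
c.setopt(pycurl.PROXYTYPE, pycurl.PROXYTYPE_SOCKS5)
c.setopt(pycurl.WRITEFUNCTION, buf.write)
c.perform()
c.close()
print buf.getvalue()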

urllib2: allow redirects on a URL using a proxy

I think your answer is here: Proxy with urllib2

You use proxy_handler but don't seem to declare it.

import urllib2

proxy = urllib2.ProxyHandler({'http': '127.0.0.1'})
opener = urllib2.build_opener(proxy)
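
For completeness, a sketch of the full flow: the opener returned by build_opener includes urllib2's default HTTPRedirectHandler, so redirects are followed automatically once the proxy handler is declared and the opener is installed (the proxy address here is a placeholder):

import urllib2

# Placeholder proxy address
proxy = urllib2.ProxyHandler({'http': 'http://127.0.0.1:8080'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)

# 301/302 redirects are handled by the default HTTPRedirectHandler
response = urllib2.urlopen('http://www.google.com')
print response.geturl()    # final URL after any redirects
print response.getcode()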

What's the format for urllib2 request proxies?

Two ways to do this if you must use a urllib2.Request object:

Set environment variables http_proxy and/or https_proxy either outside of the Python interpreter, or internally using os.environ['http_proxy'] before you import urllib2.

import os
os.environ['http_proxy'] = 'http://user:password@localhost:8888'  # set before importing urllib2
import urllib2

req = urllib2.Request('http://www.blah.com')
f = urllib2.urlopen(req)

Or set the proxy and the Proxy-Authorization header on the request manually:

import urllib2
from base64 import b64encode

PROXY_USERNAME = 'user'
PROXY_PASSWORD = 'password'

req = urllib2.Request('http://www.blah.com')
req.set_proxy('localhost:8888', 'http')

# Basic proxy authentication: standard base64 of "user:password"
proxy_auth = b64encode('%s:%s' % (PROXY_USERNAME, PROXY_PASSWORD))
req.add_header('Proxy-Authorization', 'Basic %s' % proxy_auth)
f = urllib2.urlopen(req)

How can I open a website with urllib via proxy in Python?

By default, urlopen uses the environment variable http_proxy to determine which HTTP proxy to use:

$ export http_proxy='http://myproxy.example.com:1234'
$ python myscript.py # Using http://myproxy.example.com:1234 as a proxy

If you instead want to specify a proxy inside your application, you can give a proxies argument to urlopen:

import urllib

proxies = {'http': 'http://myproxy.example.com:1234'}
print("Using HTTP proxy %s" % proxies['http'])
urllib.urlopen("http://www.google.com", proxies=proxies)

Edit: If I understand your comments correctly, you want to try several proxies and print each proxy as you try it. How about something like this?

import time
import urllib

candidate_proxies = ['http://proxy1.example.com:1234',
                     'http://proxy2.example.com:1234',
                     'http://proxy3.example.com:1234']

for proxy in candidate_proxies:
    print("Trying HTTP proxy %s" % proxy)
    try:
        result = urllib.urlopen("http://www.google.com", proxies={'http': proxy})
        print("Got URL using proxy %s" % proxy)
        break
    except IOError:
        print("Trying next proxy in 5 seconds")
        time.sleep(5)

