Rotating your proxy IPs is essential to avoid bans, distribute load, and ensure reliable scraping. In this guide, we’ll walk through how to integrate ProxyHub’s IP rotation endpoints into a Python scraping workflow in just a few lines of code.
First, install the `requests` library:

```bash
pip install requests
```
Then configure your API key and the base URL for all ProxyHub calls:

```python
import requests

API_KEY = "YOUR_API_KEY"
BASE = f"https://proxy.montgomerynx.com/{API_KEY}"
```
Before rotating, you may want to know which IP you’re currently using:
```python
def get_current_ip():
    res = requests.get(f"{BASE}/ip")
    res.raise_for_status()
    return res.json()["ip"]

print("Current exit IP:", get_current_ip())
```
Endpoint: `GET /{api_key}/ip`
Returns: `{ "ip": "123.45.67.89" }`
To obtain a fresh exit IP on demand:
```python
def rotate_ip():
    res = requests.get(f"{BASE}/force_rotate_ip")
    res.raise_for_status()
    return res.json()["ip"]

new_ip = rotate_ip()
print("Rotated to new IP:", new_ip)
```
Endpoint: `GET /{api_key}/force_rotate_ip`
Returns: `{ "ip": "98.76.54.32" }`
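Depending on the provider's pool, a rotation request may occasionally hand back the same exit IP you already had (this is an assumption, not documented behavior). If that matters for your workload, you can wrap the two calls in a small helper that retries until the IP actually changes. The function name `rotate_until_changed` and its callable-based design are hypothetical; passing the getters as callables just keeps the logic easy to test:

```python
def rotate_until_changed(get_ip, rotate, max_tries=3):
    """Rotate until the exit IP differs from the starting one.

    get_ip and rotate are callables (e.g. the get_current_ip and
    rotate_ip helpers above) so the retry logic stays testable.
    """
    old_ip = get_ip()
    for _ in range(max_tries):
        new_ip = rotate()
        if new_ip != old_ip:
            return new_ip
    raise RuntimeError("exit IP did not change after rotation")
```

With the helpers defined earlier, you would call it as `rotate_until_changed(get_current_ip, rotate_ip)`.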
Putting it all together, here's a complete script that retries failed requests and rotates the IP between attempts:

```python
import requests
from time import sleep

API_KEY = "YOUR_API_KEY"
BASE = f"https://proxy.montgomerynx.com/{API_KEY}"
TARGET_URL = "https://httpbin.org/ip"

def get_current_ip():
    return requests.get(f"{BASE}/ip").json()["ip"]

def rotate_ip():
    return requests.get(f"{BASE}/force_rotate_ip").json()["ip"]

def fetch_with_retries(url, max_retries=3):
    for attempt in range(1, max_retries + 1):
        resp = requests.get(f"{BASE}/{url}")
        if resp.status_code == 200:
            return resp.text
        print(f"Attempt {attempt} failed (status {resp.status_code}), rotating IP…")
        rotate_ip()
        sleep(1)
    # All retries exhausted; surface the last HTTP error
    resp.raise_for_status()

if __name__ == "__main__":
    print("Starting IP:", get_current_ip())
    result = fetch_with_retries(TARGET_URL)
    print("Fetched data:", result)
    print("Final IP:", get_current_ip())
```
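The script above rotates only after a failure. You can also rotate proactively every N requests to spread load, and back off when the provider reports a quota error (a `402` response). Here's a minimal sketch reusing the same endpoints; `ROTATE_EVERY`, `BACKOFF_DELAYS`, and the helper names are hypothetical choices, not part of the ProxyHub API:

```python
import time
import requests

API_KEY = "YOUR_API_KEY"
BASE = f"https://proxy.montgomerynx.com/{API_KEY}"
ROTATE_EVERY = 50               # hypothetical: rotate after every 50 requests
BACKOFF_DELAYS = [1, 2, 4, 8]   # seconds to wait between quota retries

def should_rotate(request_count, every=ROTATE_EVERY):
    """True when it's time to proactively grab a fresh exit IP."""
    return request_count > 0 and request_count % every == 0

def fetch(url):
    """Fetch through the proxy, backing off on 402 Quota exceeded."""
    for delay in BACKOFF_DELAYS:
        resp = requests.get(f"{BASE}/{url}")
        if resp.status_code != 402:
            resp.raise_for_status()
            return resp.text
        time.sleep(delay)  # quota exceeded: wait, then retry
    raise RuntimeError("quota still exceeded after backing off")

def scrape(urls):
    results = []
    for count, url in enumerate(urls):
        if should_rotate(count):
            requests.get(f"{BASE}/force_rotate_ip")
        results.append(fetch(url))
    return results
```

Tune `ROTATE_EVERY` to your plan's limits; rotating too aggressively burns quota, rotating too rarely concentrates traffic on one exit IP.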
A few tips:

- Rotate every N requests to spread load.
- Don't call `/ip` before every single request; check only when needed.
- Watch for `402 Quota exceeded` responses and back off gracefully.

With just a couple of helper functions and two API calls, you can build a robust, rotation-enabled Python scraper in under 20 lines of code. Happy scraping!