Troubleshooting Pip Installation Issues on Dataproc with Internal IP Only

The Issue
Cloud platforms may seem like a world of quick clicks and easy deployments, but every checkbox you tick (or forget to untick) can completely change your experience. And guess what? That’s exactly what happened to me.
I was happily setting up my Cloud Dataproc cluster, confidently clicking through the configuration, until… boom! I left one wrong box checked: Internal IP only. And suddenly, pip install psycopg2 turned into a nightmare.
Running:
pip install psycopg2
Resulted in:
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7f12f19a5810>: Failed to establish a new connection: [Errno 101] Network is unreachable')': /simple/psycopg2/
No matter how many times I tried, it just wouldn’t work.
Root Cause
After much frustration (and some strong coffee), I realized that my Dataproc cluster was isolated from the internet due to the Internal IP only setting. This meant my poor nodes were trapped in a private network, unable to reach PyPI to download packages.
Why Did This Happen?
Dataproc clusters with Internal IP only are designed for security—keeping them safe from the wild internet. But that also means they can’t talk to PyPI, so pip install has nowhere to go.
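You can sanity-check this from outside the cluster, too. Listing the cluster's VM instances shows an empty EXTERNAL_IP column when Internal IP only is on (the name filter below is a placeholder for your cluster's name prefix):
gcloud compute instances list --filter="name ~ ^my-cluster"
If EXTERNAL_IP is blank for every node, nothing on the cluster can reach the public internet until you put a NAT in front of it.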
Symptoms
Pip install fails with network errors.
Retrying doesn’t help (but it does test your patience).
Other internet-dependent commands (wget, curl) also don’t work.
Spinning up another Dataproc cluster without Internal IP only magically fixes everything.
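Before blaming pip itself, it's worth confirming the diagnosis from a cluster node. A minimal check (the 10-second timeout is arbitrary) is to see whether PyPI is reachable at all:
curl -sS --connect-timeout 10 https://pypi.org/simple/ -o /dev/null && echo "PyPI reachable" || echo "no route to PyPI"
On an Internal IP only cluster, this fails with the same "Network is unreachable" error that pip reports.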
What’s the fix?
Now, I could have just rebuilt my cluster with the right settings, but where’s the fun in that? Instead, here’s how you can fix this without starting over.
Option 1: Use a Different Cluster (Easy Way Out)
If security isn’t a major concern, just create a new cluster without the Internal IP only setting. But hey, what’s the lesson in that?
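If you create clusters from the command line rather than the console, the fix is simply not passing the flag that corresponds to that checkbox. A rough sketch, with a placeholder cluster name and region (it's the --no-address flag that makes a cluster Internal IP only, so just leave it out):
gcloud dataproc clusters create my-cluster --region=us-central1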
Option 2: Enable Cloud NAT (Best Solution for Security)
You can maintain security and get internet access by setting up Cloud NAT:
Go to Google Cloud Console → Network services → Cloud NAT.
Create a new NAT gateway:
Select the VPC network and region where your Dataproc cluster is running.
Pick an existing Cloud Router in that region, or create one on the spot (Cloud NAT needs one).
Choose the subnet used by your cluster, or enable NAT for all subnets in the region if needed.
Save and apply the settings (an equivalent gcloud sketch follows below).
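If you'd rather script it, a rough gcloud equivalent looks like this. The router and gateway names here are made up, and the network and region should match wherever your cluster actually lives:
gcloud compute routers create dataproc-nat-router --network=default --region=us-central1
gcloud compute routers nats create dataproc-nat --router=dataproc-nat-router --region=us-central1 --auto-allocate-nat-external-ips --nat-all-subnet-ip-ranges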
After a few minutes, try pip install again—it should work like magic!
Option 3: Use an Internal PyPI Mirror (For the Hardcore Users)
Pre-download required .whl files on an internet-enabled machine and upload them to Google Cloud Storage (GCS).
Use Google Artifact Registry to host Python packages internally.
Configure pip to install from these internal sources instead of PyPI.
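As a hedged sketch of the Artifact Registry route (the project, repository, and region names below are placeholders): create a Python repository, upload your packages to it (for example with twine), and point pip at it instead of PyPI. You'll also need credentials on the cluster (the keyrings.google-artifactregistry-auth plugin) and Private Google Access on the subnet so the nodes can reach pkg.dev without an external IP.
gcloud artifacts repositories create my-python-repo --repository-format=python --location=us-central1
pip install psycopg2 --index-url https://us-central1-python.pkg.dev/my-project/my-python-repo/simple/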
Option 4: Manually Transfer the Package
On an internet-enabled machine, download the package:
pip download psycopg2
Upload the downloaded .whl file to Google Cloud Storage.
SSH into the Dataproc cluster, copy the wheel down from the bucket, and install it (pip can't read gs:// paths directly, so the file has to land on the local filesystem first):
gsutil cp gs://your-bucket-name/psycopg2*.whl .
pip install ./psycopg2*.whl
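One caveat: the wheel has to match the cluster's OS, CPU architecture, and Python version, and on Linux psycopg2 itself usually ships only as a source distribution (the prebuilt wheels live under the psycopg2-binary package). A sketch of the download step, assuming a Python 3.8, x86_64 cluster and the same placeholder bucket:
pip download psycopg2-binary --only-binary=:all: --platform manylinux2014_x86_64 --python-version 3.8 -d ./wheels
gsutil cp ./wheels/*.whl gs://your-bucket-name/
If the versions don't line up, pip on the cluster will reject the file as "not a supported wheel on this platform".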
Conclusion
So here’s the moral of the story: cloud computing is not just a few clicks; every checkbox matters. One tiny mistake (like leaving Internal IP only checked) can send you down a rabbit hole of troubleshooting your whole Dataproc workflow.
While Internal IP only is great for security, you need to plan ahead for dependencies like pip install. Either set up Cloud NAT, use an internal package repo, or get comfortable manually transferring files.
Next time, I’ll double-check all my checkboxes before deploying anything to the cloud.