IN1 Region Migration Guide
The IN1 (Bangalore) region will permanently shut down on April 30, 2026 UTC. After this date, all instances and data in the IN1 region, including notebooks, model weights, SSH keys, File Storage volumes, and all other stored files, will be permanently deleted.
Please complete your migration to IN2 (Noida) by April 20, 2026 UTC to ensure you have time to verify your data.
What's Happening
| Milestone | Date |
|---|---|
| New instance creation disabled on IN1 | Soon |
| Migration deadline (recommended) | April 20, 2026 UTC |
| IN1 region permanent shutdown | April 30, 2026 UTC |
Migration Steps
Step 1: Identify Your Data
Before migrating, take inventory of what you need to back up:
- Instance storage (`/home`): Your notebooks, scripts, and files
- File Storage (`/home/jl_fs`): Your persistent storage volumes
- Model weights and checkpoints: Any trained models
- Custom environments: Note any conda environments or installed packages
Run `du -sh /home/*` in your instance terminal to see the size of each directory.
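Beyond sizes, it helps to record a simple manifest you can check against after the transfer. The sketch below (a demo with stand-in paths; on your IN1 instance you would set `SRC=/home`) counts files and measures the tree:

```shell
# Sketch: record a simple inventory before migrating so you can check
# the copy later. SRC here is a demo directory created for the example;
# on your IN1 instance you would set SRC=/home instead.
SRC=$(mktemp -d)
touch "$SRC/notebook.ipynb" "$SRC/train.py"    # stand-in files for the demo
find "$SRC" -type f | wc -l > file_count.txt   # how many files to expect on IN2
du -sh "$SRC"                                  # total size of the tree
```

After uploading to IN2, run the same `find | wc -l` there and compare the counts.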
Step 2: Download Your Data
You have several options to download your data from IN1 instances:
Option A: Using Jupyter File Browser (Small Files)
- Open your instance in JupyterLab
- Navigate to the files you want to download
- Right-click and select Download
This method works best for individual files or small directories.
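Since the file browser downloads one file at a time, bundling a directory into a single archive first makes it much easier to grab. A minimal sketch (`DIR` is a demo directory; on IN1 you would archive something like `/home/my_project`):

```shell
# Sketch: bundle a directory into one archive so the Jupyter file
# browser can download it as a single file. DIR is a demo directory;
# on IN1 you would point tar at your real project path.
DIR=$(mktemp -d)
echo "print('hello')" > "$DIR/script.py"
tar czf project_backup.tar.gz -C "$DIR" .   # one downloadable file
tar tzf project_backup.tar.gz               # list the archive contents
```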
Option B: Using SSH/SCP (Recommended for Large Data)
For large datasets or complete backups, use scp from your local terminal:
```shell
# Download a single file
scp -P <port> root@<hostname>:/home/myfile.py ./local_folder/

# Download an entire directory
scp -r -P <port> root@<hostname>:/home/my_project ./local_folder/

# Download your entire home directory
scp -r -P <port> root@<hostname>:/home/ ./backup_folder/
```
To get your SSH connection details:
- Go to the Dashboard
- Click the SSH button on your instance
- The SSH command will be copied to your clipboard
IN1 and IN2 use different SSH connection formats. IN1 uses a bastion host with a custom port (-P <port>), while IN2 uses direct IP connections (default port 22, no -P flag needed). Always use the exact SSH command from the dashboard—the hostname and port vary per instance.
If you haven't set up SSH keys, follow our SSH setup guide first.
Option C: Using rsync (Best for Large Transfers)
For very large datasets with potential interruptions, rsync can resume transfers:
```shell
# Sync with progress and resume capability
rsync -avz --progress -e "ssh -p <port>" root@<hostname>:/home/ ./backup_folder/
```
Option D: Cloud Storage (For Very Large Datasets)
For datasets exceeding 100GB, consider uploading directly to cloud storage:
```shell
# Example: Upload to AWS S3
aws s3 sync /home/my_data s3://my-bucket/backup/

# Example: Upload to Google Cloud Storage
gsutil -m cp -r /home/my_data gs://my-bucket/backup/
```
Step 3: Create a New Instance on IN2
- Go to the Dashboard
- Click Launch Instance
- Select IN2 (Noida) as the region
- Configure your instance with the same settings as your IN1 instance:
  - GPU type
  - Storage size
  - Framework/template
Your SSH keys from IN1 will work on IN2 instances. No need to reconfigure.
Step 4: Upload Your Data to IN2
Get your IN2 instance's SSH command from the dashboard. Since IN2 uses direct IP connections, you typically won't need the -P flag.
Using SCP
```shell
# Upload a directory to your new IN2 instance
scp -r ./backup_folder/ root@<hostname>:/home/

# Upload with compression (faster for text/code)
scp -r -C ./backup_folder/ root@<hostname>:/home/
```
Using rsync
```shell
rsync -avz --progress ./backup_folder/ root@<hostname>:/home/
```
Step 5: Set Up File Storage on IN2 (If Applicable)
If you were using File Storage on IN1:
- Create a new File Storage volume on IN2 from the File Storage page
- Attach it to your IN2 instance
- Upload your data to `/home/jl_fs`
```shell
# Upload to File Storage
scp -r ./filestorage_backup/ root@<hostname>:/home/jl_fs/
```
File Storage volumes are region-specific and cannot be transferred between regions. You must create a new volume on IN2.
Step 6: Verify and Clean Up
- Verify your data: Ensure all files transferred correctly
- Test your workflows: Run your notebooks/scripts to confirm everything works
- Delete IN1 instances: Once verified, delete your IN1 instances to stop storage charges
- Delete IN1 File Storage: Delete any File Storage volumes on IN1
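Before deleting anything on IN1, a checksum comparison gives stronger assurance than eyeballing file lists. A minimal sketch (`ORIG` and `COPY` are demo directories; in practice you would run `sha256sum` over `/home` on IN1 and again over the copy on IN2, then diff the two outputs):

```shell
# Sketch: compare checksums between the original and the backup to
# confirm the transfer was complete. ORIG and COPY are demo stand-ins;
# in practice, run the checksum pass on IN1 and again on the IN2 copy.
ORIG=$(mktemp -d); COPY=$(mktemp -d)
echo "model weights" > "$ORIG/weights.bin"
cp "$ORIG/weights.bin" "$COPY/weights.bin"                   # simulate the transfer
(cd "$ORIG" && find . -type f -exec sha256sum {} + | sort) > orig.sums
(cd "$COPY" && find . -type f -exec sha256sum {} + | sort) > copy.sums
diff orig.sums copy.sums && echo "backup verified"
```

An empty diff means every file arrived byte-for-byte identical; any listed line points at a file to re-transfer.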
Recreating Your Environment
If you had custom conda environments, recreate them on IN2:
```shell
# On IN1: Export your environment
conda env export > environment.yml

# Download environment.yml to your local machine, then upload it to IN2

# On IN2: Create the environment
conda env create -f environment.yml
```
Or note your installed packages:
```shell
# On IN1: List installed packages
pip freeze > requirements.txt

# On IN2: Reinstall packages
pip install -r requirements.txt
```
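To confirm the reinstall actually reproduced the old environment, you can diff a fresh freeze on IN2 against the exported list. In this self-contained sketch both snapshots come from the same environment, so the diff is empty; in a real migration the first file would be the one you exported on IN1:

```shell
# Sketch: diff the IN1 package list against a fresh freeze on IN2 to
# confirm everything reinstalled. Both snapshots here come from the same
# environment (so the diff is empty); in a real migration,
# in1_requirements.txt is the file you exported on IN1.
pip freeze | sort > in1_requirements.txt   # stand-in for the IN1 export
pip freeze | sort > in2_requirements.txt   # taken on the IN2 instance
diff in1_requirements.txt in2_requirements.txt && echo "environments match"
```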
Need Help?
- Negative balance? If you need temporary credits to back up your data, contact support@jarvislabs.ai
- Technical issues? Check our troubleshooting guide or reach out to support
- Questions? Email support@jarvislabs.ai
FAQ
Q: Will my SSH keys work on IN2? A: Yes, your SSH keys are account-level and work across all regions.
Q: Can I keep the same instance name? A: Yes, you can use any name for your new IN2 instance.
Q: What happens if I don't migrate by April 30? A: All data in IN1 will be permanently deleted with no possibility of recovery.
Q: Can I migrate my paused instances? A: Yes, resume your paused instance, download the data, then create a new instance on IN2.