The HyperExport TUI now includes comprehensive cloud storage integration, allowing you to export VMs and upload them to S3, Azure, GCS, or SFTP directly from the interactive interface.
```bash
# Launch interactive mode
hyperexport --interactive
```

In the TUI:
1. Select VMs (Space to select, Enter to continue)
2. Press `u` to configure cloud upload
3. Choose a provider (S3, Azure, GCS, or SFTP)
4. Enter credentials step by step
5. Confirm and export
| Provider | Setup Required |
|---|---|
| Amazon S3 | Access key + Secret key |
| Azure Blob | Account name + Account key |
| Google Cloud Storage | Service account JSON |
| SFTP | Username + Password/Key |
VM Selection Screen:

| Key | Action |
|---|---|
| `u`/`U` | Configure cloud upload |
| `Space` | Select/deselect VM |
| `Enter` | Continue to confirmation |
| `a` | Select all |
| `n` | Deselect all |
| `1`-`7` | Quick filters |
| `q` | Quit |

Cloud Provider Selection:

| Key | Action |
|---|---|
| `↑`/`↓` | Navigate providers |
| `Enter` | Select provider |
| `s` | Toggle stream upload |
| `l` | Toggle keep local |
| `Esc` | Back to VM selection |

Credentials Input:

| Key | Action |
|---|---|
| Type | Enter text |
| `Backspace` | Delete character |
| `Enter` | Next field/Continue |
| `Esc` | Back to provider selection |

Confirmation Screen:

| Key | Action |
|---|---|
| `y`/`Enter` | Start export |
| `u` | Configure cloud upload |
| `n`/`Esc` | Go back |
| `q` | Quit |
1. `hyperexport --interactive`
2. Select VMs (Space key)
3. Press `u` for cloud upload
4. Select "Amazon S3"
5. Enter:
   - Bucket: `my-backups`
   - Region: `us-east-1`
   - Access Key: `AKIAIOSFODNN7EXAMPLE`
   - Secret Key: ••••••••
   - Prefix: `prod/vms`
6. Press `y` to start
7. Monitor progress
1. `hyperexport --interactive`
2. Select VMs
3. Press `u`
4. Select "Azure Blob Storage"
5. Enter:
   - Container: `vm-backups`
   - Account: `mystorageaccount`
   - Key: ••••••••
   - Prefix: `exports`
6. Start export
```bash
# Primary: S3
hyperexport --interactive
# Configure S3, export

# Secondary: Azure (run again)
hyperexport --interactive
# Configure Azure, export the same VMs

# Result: VMs backed up to both clouds
```
```bash
# Run all unit tests
go test -v ./cmd/hyperexport/

# Test cloud TUI specifically
go test -v -run TestCloud ./cmd/hyperexport/

# With coverage
go test -v -cover ./cmd/hyperexport/
```

```bash
# Set up the environment
export AWS_ACCESS_KEY_ID="your-key"
export AWS_SECRET_ACCESS_KEY="your-secret"
export TEST_S3_BUCKET="test-bucket"

# Run integration tests
go test -tags=integration -v ./cmd/hyperexport/

# Test a specific provider
go test -tags=integration -v -run TestS3Integration ./cmd/hyperexport/
```
See TESTING.md for complete testing documentation.
```
cmd/hyperexport/
├── tui_cloud.go                    # Cloud TUI implementation (600+ lines)
│   ├── Cloud provider selection
│   ├── Credentials input screens
│   ├── Upload progress visualization
│   └── Cloud storage browser
│
├── tui_cloud_test.go               # Unit tests (500+ lines)
│   ├── Configuration tests
│   ├── Phase transition tests
│   ├── Validation tests
│   └── Benchmarks
│
├── tui_cloud_integration_test.go   # Integration tests (400+ lines)
│   ├── S3 integration
│   ├── Azure integration
│   ├── GCS integration
│   ├── SFTP integration
│   └── Large file uploads
│
├── interactive_tui.go              # Main TUI (modified)
│   └── Integrated cloud support
│
├── cloud_storage.go                # Cloud interface
├── cloud_s3.go                     # S3 implementation
├── cloud_azure.go                  # Azure implementation
├── cloud_gcs.go                    # GCS implementation
└── cloud_sftp.go                   # SFTP implementation
```
```bash
# Amazon S3
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
export AWS_REGION="us-east-1"

# Azure Blob
export AZURE_STORAGE_ACCOUNT="mystorageaccount"
export AZURE_STORAGE_KEY="your-account-key"

# Google Cloud Storage
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"

# SFTP, password-based (less recommended)
export SFTP_PASSWORD="your-password"

# SFTP, key-based (recommended)
ssh-keygen -t rsa -b 4096 -f ~/.ssh/hypersdk_key
ssh-copy-id -i ~/.ssh/hypersdk_key.pub user@sftp.example.com
```
⚠️ Never commit credentials to version control.

✅ Best practices:
- Store credentials under `~/.config/`
- Use `--parallel 4`
- Use `--compress` (if bandwidth-limited)
- Use `--stream-upload` (skip local disk)

```bash
# Verify credentials are set
echo $AWS_ACCESS_KEY_ID

# Test with cloud CLI
aws s3 ls
az storage container list
gsutil ls
sftp user@host
```

```bash
# Increase timeout
hyperexport --upload-timeout 30m

# Check network
ping s3.amazonaws.com
ping blob.core.windows.net
```

```bash
# S3 - check IAM policies
aws iam list-user-policies --user-name hypersdk

# Azure - check account permissions
az role assignment list --assignee user@domain.com

# GCS - check service account
gcloud projects get-iam-policy PROJECT_ID
```
```bash
#!/bin/bash
export AWS_ACCESS_KEY_ID="$(cat ~/.aws/access_key)"
export AWS_SECRET_ACCESS_KEY="$(cat ~/.aws/secret_key)"

hyperexport \
  --batch production-vms.txt \
  --upload s3://backups/$(date +%Y-%m-%d) \
  --compress \
  --stream-upload
```
```bash
# Backup to 3 regions
for region in us-east-1 us-west-2 eu-west-1; do
  export AWS_REGION=$region
  hyperexport --vm critical-db \
    --upload s3://dr-$region/$(date +%Y%m%d)
done
```
```bash
# Encrypted backup to compliant storage
hyperexport --vm production-db \
  --encrypt \
  --encrypt-method aes256 \
  --upload s3://compliance-backups \
  --verify
```
```go
type cloudConfig struct {
	provider  CloudProvider // s3, azure, gcs, sftp
	bucket    string        // S3/Azure/GCS bucket/container
	region    string        // AWS region
	accessKey string        // Access credentials
	secretKey string        // Secret credentials
	host      string        // SFTP host
	port      string        // SFTP port
	prefix    string        // Path prefix
}

const (
	CloudProviderNone  CloudProvider = "none"
	CloudProviderS3    CloudProvider = "s3"
	CloudProviderAzure CloudProvider = "azure"
	CloudProviderGCS   CloudProvider = "gcs"
	CloudProviderSFTP  CloudProvider = "sftp"
)
```
Q: Can I use S3-compatible storage (MinIO, Wasabi)?
A: Yes, set a custom endpoint in the S3 configuration.

Q: How do I resume an interrupted upload?
A: Uploads automatically resume on retry. Use the `--resume` flag.

Q: Can I upload to multiple clouds simultaneously?
A: Not in the TUI, but it is possible from the command line with separate runs.

Q: Is my data encrypted during upload?
A: Yes, all providers use TLS/HTTPS by default.

Q: How do I delete old backups?
A: Use the cloud provider's lifecycle policies or the cloud browser in the TUI.

Q: What's the maximum file size?
A: S3: 5 TB, Azure: 190 TB, GCS: 5 TB, SFTP: unlimited (filesystem-dependent).
Same as HyperSDK main project (LGPL-3.0-or-later)