
HyperExport Cloud TUI - Quick Reference

Overview

The HyperExport TUI now includes comprehensive cloud storage integration, allowing you to export VMs and upload them to S3, Azure, GCS, or SFTP directly from the interactive interface.

Quick Start

# Launch interactive mode
hyperexport --interactive

# In TUI:
# 1. Select VMs (Space to select, Enter to continue)
# 2. Press 'u' to configure cloud upload
# 3. Choose provider (S3, Azure, GCS, or SFTP)
# 4. Enter credentials step-by-step
# 5. Confirm and export

Supported Providers

Provider               Icon   Setup Required
Amazon S3              ☁️     Access key + Secret key
Azure Blob             🔷     Account name + Account key
Google Cloud Storage   🌩️     Service account JSON
SFTP                   🔐     Username + Password/Key

Key Features

- Interactive Configuration
- Real-Time Progress
- Smart Features
Keyboard Shortcuts

VM Selection Screen:
  u/U       Configure cloud upload
  Space     Select/deselect VM
  Enter     Continue to confirmation
  a         Select all
  n         Deselect all
  1-7       Quick filters
  q         Quit

Cloud Provider Selection:
  ↑/↓       Navigate providers
  Enter     Select provider
  s         Toggle stream upload
  l         Toggle keep local
  Esc       Back to VM selection

Credentials Input:
  Type      Enter text
  Backspace Delete character
  Enter     Next field/Continue
  Esc       Back to provider selection

Confirmation Screen:
  y/Enter   Start export
  u         Configure cloud upload
  n/Esc     Go back
  q         Quit

Example Workflows

S3 Backup Workflow

1. hyperexport --interactive
2. Select VMs (Space key)
3. Press 'u' for cloud upload
4. Select "Amazon S3"
5. Enter:
   - Bucket: my-backups
   - Region: us-east-1
   - Access Key: AKIAIOSFODNN7EXAMPLE
   - Secret Key: ••••••••
   - Prefix: prod/vms
6. Press 'y' to start
7. Monitor progress

Azure Quick Upload

1. hyperexport --interactive
2. Select VMs
3. Press 'u'
4. Select "Azure Blob Storage"
5. Enter:
   - Container: vm-backups
   - Account: mystorageaccount
   - Key: ••••••••
   - Prefix: exports
6. Start export

Multi-Cloud Strategy

# Primary: S3
hyperexport --interactive
# Configure S3, export

# Secondary: Azure (run again)
hyperexport --interactive
# Configure Azure, export same VMs

# Result: VMs backed up to both clouds

Documentation

Testing

Unit Tests

# Run all unit tests
go test -v ./cmd/hyperexport/

# Test cloud TUI specifically
go test -v -run TestCloud ./cmd/hyperexport/

# With coverage
go test -v -cover ./cmd/hyperexport/

Integration Tests

# Setup environment
export AWS_ACCESS_KEY_ID="your-key"
export AWS_SECRET_ACCESS_KEY="your-secret"
export TEST_S3_BUCKET="test-bucket"

# Run integration tests
go test -tags=integration -v ./cmd/hyperexport/

# Test specific provider
go test -tags=integration -v -run TestS3Integration ./cmd/hyperexport/

See TESTING.md for complete testing documentation.

Code Structure

cmd/hyperexport/
├── tui_cloud.go                     # Cloud TUI implementation (600+ lines)
│   ├── Cloud provider selection
│   ├── Credentials input screens
│   ├── Upload progress visualization
│   └── Cloud storage browser
│
├── tui_cloud_test.go                # Unit tests (500+ lines)
│   ├── Configuration tests
│   ├── Phase transition tests
│   ├── Validation tests
│   └── Benchmarks
│
├── tui_cloud_integration_test.go    # Integration tests (400+ lines)
│   ├── S3 integration
│   ├── Azure integration
│   ├── GCS integration
│   ├── SFTP integration
│   └── Large file uploads
│
├── interactive_tui.go               # Main TUI (modified)
│   └── Integrated cloud support
│
├── cloud_storage.go                 # Cloud interface
├── cloud_s3.go                      # S3 implementation
├── cloud_azure.go                   # Azure implementation
├── cloud_gcs.go                     # GCS implementation
└── cloud_sftp.go                    # SFTP implementation

Environment Setup

AWS S3

export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
export AWS_REGION="us-east-1"

Azure Blob Storage

export AZURE_STORAGE_ACCOUNT="mystorageaccount"
export AZURE_STORAGE_KEY="your-account-key"

Google Cloud Storage

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"

SFTP

# Password-based (less secure; not recommended)
export SFTP_PASSWORD="your-password"

# Key-based (recommended)
ssh-keygen -t rsa -b 4096 -f ~/.ssh/hypersdk_key
ssh-copy-id -i ~/.ssh/hypersdk_key.pub user@sftp.example.com

Security Notes

⚠️ Never commit credentials to version control

✅ Best practices:

- Store credentials in environment variables or in files readable only by you (chmod 600)
- Prefer key-based SFTP authentication over passwords
- Rotate access keys regularly

Performance Tips

Optimize Upload Speed

  1. Use nearest region - Reduce latency
  2. Enable parallel uploads - --parallel 4
  3. Use compression - --compress (if bandwidth-limited)
  4. Stream mode - --stream-upload (skip local disk)

Reduce Costs

  1. Use lifecycle policies - Auto-delete old backups
  2. Choose appropriate storage class:
    • S3: Standard → Infrequent Access → Glacier
    • Azure: Hot → Cool → Archive
    • GCS: Standard → Nearline → Coldline
  3. Enable compression - Reduce storage by 30-70%

Troubleshooting

"Authentication failed"

# Verify credentials are set
echo $AWS_ACCESS_KEY_ID

# Test with cloud CLI
aws s3 ls
az storage container list
gsutil ls
sftp user@host

"Upload timeout"

# Increase timeout
hyperexport --upload-timeout 30m

# Check network
ping s3.amazonaws.com
ping blob.core.windows.net

"Insufficient permissions"

# S3 - check IAM policy
aws iam get-user-policy --user-name hypersdk

# Azure - check account permissions
az role assignment list --assignee user@domain.com

# GCS - check service account
gcloud projects get-iam-policy PROJECT_ID

Common Scenarios

Daily Automated Backup

#!/bin/bash
export AWS_ACCESS_KEY_ID="$(cat ~/.aws/access_key)"
export AWS_SECRET_ACCESS_KEY="$(cat ~/.aws/secret_key)"

hyperexport \
  --batch production-vms.txt \
  --upload s3://backups/$(date +%Y-%m-%d) \
  --compress \
  --stream-upload

Disaster Recovery

# Backup to 3 regions
for region in us-east-1 us-west-2 eu-west-1; do
  export AWS_REGION=$region
  hyperexport --vm critical-db \
    --upload s3://dr-$region/$(date +%Y%m%d)
done

Compliance Backup

# Encrypted backup to compliant storage
hyperexport --vm production-db \
  --encrypt \
  --encrypt-method aes256 \
  --upload s3://compliance-backups \
  --verify

API Reference

Cloud Configuration Structure

type cloudConfig struct {
    provider  CloudProvider  // s3, azure, gcs, sftp
    bucket    string        // S3/Azure/GCS bucket/container
    region    string        // AWS region
    accessKey string        // Access credentials
    secretKey string        // Secret credentials
    host      string        // SFTP host
    port      string        // SFTP port
    prefix    string        // Path prefix
}

Cloud Providers

const (
    CloudProviderNone  CloudProvider = "none"
    CloudProviderS3    CloudProvider = "s3"
    CloudProviderAzure CloudProvider = "azure"
    CloudProviderGCS   CloudProvider = "gcs"
    CloudProviderSFTP  CloudProvider = "sftp"
)

FAQ

Q: Can I use S3-compatible storage (MinIO, Wasabi)? A: Yes, set a custom endpoint in the S3 configuration.

Q: How do I resume an interrupted upload? A: Uploads resume automatically on retry; use the --resume flag.

Q: Can I upload to multiple clouds simultaneously? A: Not in the TUI, but you can run the command line multiple times with different providers.

Q: Is my data encrypted during upload? A: Yes, all providers use TLS/HTTPS by default.

Q: How do I delete old backups? A: Use cloud provider lifecycle policies or the cloud browser in TUI.

Q: What's the maximum file size? A: S3: 5TB, Azure: 190TB, GCS: 5TB, SFTP: unlimited (filesystem-dependent)

Support

License

Same as HyperSDK main project (LGPL-3.0-or-later)