Getting Started
Quick Start (5 minutes)
1. Install the Python SDK
2. Initialize Configuration
3. Your First Backup
# Initialize client
client = NebulaClient(
    endpoint="https://api.nebula.guard",
    api_key="your-api-key"
)

# Backup a file
backup = client.backup(
    path="/path/to/file.pdf",
    retention_years=7
)
print(f"Backup ID: {backup.id}")
4. Restore When Needed
client.restore(
    backup_id="backup-12345",
    destination="/restore/path"
)
API Access
Get your API key from the Dashboard Settings page
BYOC Setup
What is BYOC?
Bring Your Own Cloud (BYOC) means your data stays in your own AWS, Azure, or GCP accounts. NebulaProof orchestrates encryption, erasure coding, and proof generation, but never stores your plaintext data.
Azure BYOC Setup
- 1. Navigate to BYOC Setup: Go to Dashboard → Organization → BYOC Setup
- 2. Select Your Regions: Choose 6+ Azure regions for geographic diversity
- 3. Provide Azure Credentials: Enter your Azure Subscription ID, Tenant ID, Client ID, and Client Secret
- 4. Automatic Provisioning: Nebula creates storage accounts, containers, and configures access policies
- 5. Start Backing Up: Your BYOC infrastructure is ready in ~3 minutes
AWS & GCP Setup
AWS and GCP BYOC support coming in Q2 2025. Contact sales for early access.
Core Concepts
Zero-Knowledge Encryption
All data is encrypted client-side before it ever leaves your device. NebulaProof never has access to your plaintext data or encryption keys.
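To make the client-side step concrete, here is a minimal sketch of encrypting a file before upload using AES-256-GCM via the Python cryptography package. This is an illustration only; the SDK performs the equivalent work internally, and the key handling shown here (a locally generated key) is an assumption, not NebulaProof's actual key-management scheme:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit key locally; the key never leaves your device.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

with open("/path/to/file.pdf", "rb") as f:
    plaintext = f.read()

nonce = os.urandom(12)  # 96-bit nonce, unique per encryption
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Only the ciphertext (plus the nonce) would ever be uploaded; the key stays with you.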
Erasure Coding (Reed-Solomon)
Your data is split into 6 data shards + 4 parity shards. Any 6 of the 10 shards are enough to recover your complete file.
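One way to see why any 6 of the 10 shards suffice: Reed-Solomon treats the data as a polynomial and stores more evaluation points than the polynomial has coefficients. The sketch below is a toy Reed-Solomon-style encoder over a prime field, written only to illustrate that property; it is not NebulaProof's production coder, which would operate on byte streams (typically over GF(2^8)) for performance:

P = 2**31 - 1   # prime modulus for the toy field (illustrative choice)
K, N = 6, 10    # 6 data shards, 10 total shards; any K of N recover the data

def poly_mul(a, b):
    """Multiply two polynomials (coefficient lists, lowest degree first) mod P."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def encode(symbols):
    """Interpret K data symbols as polynomial coefficients; evaluate at N points."""
    assert len(symbols) == K
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(symbols)) % P)
            for x in range(1, N + 1)]

def decode(shards):
    """Recover the K coefficients from any K shards via Lagrange interpolation."""
    assert len(shards) == K
    xs = [x for x, _ in shards]
    coeffs = [0] * K
    for j, (xj, yj) in enumerate(shards):
        num, denom = [1], 1
        for m, xm in enumerate(xs):
            if m == j:
                continue
            num = poly_mul(num, [(-xm) % P, 1])   # multiply by (X - x_m)
            denom = denom * (xj - xm) % P
        scale = yj * pow(denom, -1, P) % P
        for i, c in enumerate(num):
            coeffs[i] = (coeffs[i] + scale * c) % P
    return coeffs

data = [104, 101, 108, 108, 111, 33]   # six small data symbols
shards = encode(data)                  # 10 shards: up to 4 can be lost
assert decode(shards[4:]) == data      # any 6 of the 10 recover the original

Losing any 4 shards still leaves 6 evaluation points, which uniquely determine the degree-5 polynomial and hence the original data.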
Shard Distribution
Shards are distributed across multiple cloud providers and geographic regions for maximum resilience.
Python SDK Reference
Installation
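The SDK's exact package name does not appear in this guide; assuming it is published to PyPI under a name like nebula-sdk (a placeholder), installation and the corresponding import would look roughly like this:

# Shell (package name is a placeholder):
#   pip install nebula-sdk

# The Python import path is likewise an assumption:
from nebula_sdk import NebulaClient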
Client Initialization
client = NebulaClient(
    endpoint="https://api.nebula.guard",
    api_key="your-api-key-here",
    timeout=30  # seconds
)
Backup Operations
backup = client.backup(
    path="/path/to/file.pdf",
    retention_years=7
)

# Advanced backup with options
backup = client.backup(
    path="/path/to/sensitive-data/",
    retention_years=10,
    regions=["us-east", "eu-west"],
    tags={"department": "legal", "classification": "confidential"},
    encryption="aes-256-gcm"
)
print(f"Backup ID: {backup.id}")
print(f"Status: {backup.status}")
Restore Operations
client.restore(
    backup_id="backup-12345-abcdef",
    destination="/restore/path"
)

# Point-in-time restore
client.restore(
    backup_id="backup-12345-abcdef",
    destination="/restore/path",
    point_in_time="2024-01-15T10:30:00Z"
)
GDPR Deletion with Proof
proof = client.delete_with_proof(
    backup_id="backup-12345-abcdef",
    reason="GDPR Article 17 - Right to Erasure"
)

# Access deletion certificate
print(f"All shards deleted: {proof.all_deleted}")
print(f"Recovery impossible: {proof.recovery_impossible}")
print(f"Certificate: {proof.certificate_url}")
List & Verify Backups
backups = client.list_backups()
for backup in backups:
    print(f"{backup.filename} - {backup.status}")

# Verify backup integrity (30 seconds)
result = client.verify_backup("backup-12345")
print(f"Integrity: {result.integrity_score}%")
API Reference
Interactive API Documentation
Full Swagger/OpenAPI docs available at: http://localhost:8000/docs
Authentication
All API requests require a Bearer token in the Authorization header:
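For example, using Python's requests library (the /v1/backups path is an illustrative guess; see the interactive docs above for the real routes):

import requests

headers = {"Authorization": "Bearer your-api-key-here"}

# List backups; the endpoint path shown here is assumed for illustration.
response = requests.get("https://api.nebula.guard/v1/backups", headers=headers)
response.raise_for_status()
print(response.json())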
Backup Endpoints
Proof Endpoints
Rate Limits
Cryptographic Proofs
What Are Cryptographic Proofs?
Unlike traditional backup systems that ask you to "trust us," NebulaProof provides mathematical proof that your data is:
- Uploaded and distributed correctly
- Stored in specific geographic locations
- Permanently deleted when requested
- Recoverable and intact
Proof Types
Upload Proofs (Merkle Trees)
Proves all shards were uploaded correctly with verifiable content hashes. Auditors can independently verify integrity.
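As a rough illustration of how a Merkle proof lets an auditor check one shard against a published root hash without seeing the other shards, here is a minimal SHA-256 sketch (the hash function and tree layout NebulaProof actually uses are not specified here, so treat those details as assumptions):

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_tree(leaves):
    """Return all tree levels, leaves first (last node duplicated on odd levels)."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]
        levels.append([h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

def inclusion_proof(levels, index):
    """Collect the sibling hash at every level for the leaf at `index`."""
    proof = []
    for lvl in levels[:-1]:
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]
        proof.append((lvl[index ^ 1], index % 2))   # (sibling hash, am-I-the-right-child?)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_right in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

shards = [f"shard-{i}".encode() for i in range(10)]
levels = merkle_tree(shards)
root = levels[-1][0]                    # the value published in the upload proof
proof = inclusion_proof(levels, 3)      # proof for shard 3 only
assert verify(shards[3], proof, root)   # auditor needs only shard 3 plus the proof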
Deletion Certificates (GDPR Article 17)
Cryptographic proof that all copies, including encryption keys and backups, have been permanently destroyed. Recovery is mathematically impossible.
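One way to see why recovery becomes impossible once keys are destroyed: with AES-GCM, discarding the only copy of the key leaves nothing that can decrypt the ciphertext. The snippet below is a conceptual illustration of that "crypto-shredding" idea, not a description of NebulaProof's internal key-destruction procedure:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"personal data", None)

del key   # the only copy of the key is destroyed

# Without the original key, decryption with any other key fails authentication.
try:
    AESGCM(AESGCM.generate_key(bit_length=256)).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("Ciphertext is unrecoverable without the destroyed key")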
Residency Attestations
Cloud provider-signed certificates proving data resides only in approved regions (e.g., "EU data never left EU").
Verification Proofs
30-second integrity checks using a challenge-response protocol. Proves backups are recoverable without a full restore.
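A simplified picture of a challenge-response check: at backup time the client records the expected answers to a handful of random challenges, and a storage node can later answer correctly only if it still holds the shard. This sketch compresses a real proof-of-retrievability scheme down to precomputed hash challenges, which is an assumption made purely for illustration:

import hashlib
import os
import secrets

shard = os.urandom(1024)   # stand-in for a stored shard

# At backup time: precompute expected responses to random nonces (kept by the client).
challenges = [os.urandom(16) for _ in range(5)]
expected = {n: hashlib.sha256(n + shard).hexdigest() for n in challenges}

# Later: challenge the storage node with one of the precomputed nonces.
nonce = secrets.choice(challenges)
response = hashlib.sha256(nonce + shard).hexdigest()   # computed by the node holding the shard

assert response == expected[nonce]   # passes only if the node still has the shard byte-for-byte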
Auditor Verification
External auditors can independently verify proofs without accessing your data:
- 1. Download the proof certificate from the Nebula dashboard
- 2. Verify cryptographic signatures using public keys (see the sketch below)
- 3. Check cloud provider attestations (AWS/Azure/GCP APIs)
- 4. Validate that Merkle tree proofs match the stored hashes
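For step 2, verifying a signature against a published public key can be done with standard tooling. Below is a sketch using the Python cryptography package and an Ed25519 key; the actual signature algorithm and certificate format NebulaProof uses are not specified in this guide, so both are assumptions:

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In practice the public key and signature come from the downloaded certificate;
# here we generate a throwaway key pair just to show the verification call.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

certificate_bytes = b'{"backup_id": "backup-12345", "all_shards_deleted": true}'
signature = private_key.sign(certificate_bytes)

try:
    public_key.verify(signature, certificate_bytes)
    print("Signature valid: certificate was issued by the key holder")
except InvalidSignature:
    print("Signature invalid: certificate has been tampered with")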
Compliance
GDPR
- Article 17: Right to erasure with cryptographic proof
- Articles 44-49: Data residency enforcement
- Article 32: State-of-the-art encryption (AES-256)
- Article 5: Audit logs for all data access
HIPAA
- Encryption at rest and in transit (TLS 1.3)
- Complete audit trail (PHI access tracking)
- Business Associate Agreement (BAA) available
- Secure disposal with deletion certificates
SOC 2 Type II
- Security: Zero-knowledge encryption
- Availability: 99.99% uptime SLA
- Confidentiality: Multi-cloud shard distribution
- Report available on request
SEC 17a-4 / FINRA
- Immutable storage (write-once-read-many)
- Retention policy enforcement (7+ years)
- Tamper-evident audit logs
- Independent verification capability
Security
Threat Model
| Threat | Mitigation |
|---|---|
| Nebula server compromised | Zero-knowledge architecture - we can't decrypt your data |
| Cloud provider breach | Shards are encrypted; an attacker would need 6 of the 10 shards plus keys, which are split across providers |
| Man-in-the-middle attack | TLS 1.3 + certificate pinning |
| Data corruption | Erasure coding + continuous integrity verification |
| Insider threat | Complete audit logs + RBAC + key splitting |
| DDoS attack | Rate limiting + CDN + load balancing |
Encryption Details
Security Best Practices
- 1. Use strong, unique API keys (rotate every 90 days)
- 2. Enable 2FA/MFA on your Nebula account
- 3. Store API keys in environment variables, never commit them to git (see the sketch after this list)
- 4. Use RBAC to limit access (principle of least privilege)
- 5. Monitor audit logs for suspicious activity
- 6. Test your emergency recovery kit quarterly
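A minimal sketch for item 3, reading the API key from an environment variable instead of hard-coding it (the NEBULA_API_KEY variable name is just a convention chosen for this example):

import os

# Assumes NebulaClient has been imported from the SDK as in the earlier examples.
api_key = os.environ["NEBULA_API_KEY"]   # raises KeyError if unset, so misconfiguration fails fast

client = NebulaClient(
    endpoint="https://api.nebula.guard",
    api_key=api_key
)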
Emergency Recovery
What If Nebula Shuts Down?
Your data is in your cloud accounts. We provide a standalone recovery CLI that works without our service.
You can recover your data even if NebulaProof ceases to exist. Zero vendor lock-in guaranteed.
Emergency Recovery Steps
- 1. Download Recovery Kit: Available in Dashboard → Settings → Emergency Recovery
- 2. Install Standalone CLI: pip install nebula-recovery-cli
- 3. Load Metadata: nebula-recovery load-metadata recovery-kit.json
- 4. Recover Your Data: nebula-recovery restore backup-12345 /output/path
What's in the Recovery Kit?
- Complete shard location map (which files in which cloud accounts)
- Encryption metadata (algorithms, parameters)
- Erasure coding parameters (Reed-Solomon configuration)
- Cloud provider access credentials (encrypted with your password)
- Standalone recovery tool (works offline, no Nebula servers needed)
Troubleshooting & FAQ
Backup is taking longer than expected
Large files can take time depending on your upload speed. Check your network bandwidth and consider uploading during off-peak hours. You can also pause/resume backups.
Upload failed with 'insufficient shards' error
This means we couldn't distribute all 10 shards. Check that you have at least 6 storage nodes configured in your BYOC setup. Add more regions if needed.
How do I rotate my API key?
Go to Dashboard → Settings → API Keys → Generate New Key. Update your applications with the new key, then revoke the old one.
Can I change retention period after backup?
Yes, but only to extend retention. You cannot shorten retention for compliance reasons (prevents accidental data loss).
What happens if one cloud provider goes down?
Nothing. Your data remains accessible as long as 6 of 10 shards are available. We automatically route around failed providers.
How much does cloud storage actually cost?
For Azure BYOC with 6 storage accounts: ~$2.84/month for storage + minimal transaction costs. We provide transparent cost estimates in the dashboard.
Can auditors verify proofs without our help?
Yes! Download the certificate PDF from the dashboard and share with auditors. They can independently verify cryptographic signatures using open-source tools.
Is my data really safe if you're compromised?
Yes. We use zero-knowledge encryption (data encrypted client-side before upload) and split keys across multiple cloud providers. We cannot decrypt your data even if we wanted to.