# Storage Troubleshooting
A debugging guide for common storage API issues: access denied errors, empty file lists, presigned URL problems, and timeouts, with error handling patterns and performance tips for each.
## Common Issues
### Access Denied (403 Errors)
**Problem:** Getting 403 errors when listing or accessing files.
**Solutions:**
- **Check credentials:**

```typescript
// Verify your environment variables are set
console.log('Access Key:', process.env.AWS_ACCESS_KEY_ID?.substring(0, 5) + '...')
console.log('Bucket:', process.env.S3_BUCKET_NAME)
```
- **Test connection:**

```typescript
const isHealthy = await storage.validation.connection()

if (!isHealthy) {
  console.log('Connection failed - check credentials and permissions')
}
```
- **Verify bucket permissions** (see the probe sketch after this list):
  - Ensure your access key has `s3:ListBucket`, `s3:GetObject`, and `s3:DeleteObject` permissions
  - For Cloudflare R2, check that your API token has the necessary permissions
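
If you're unsure which of those permissions is missing, you can probe each one through the storage API itself. This is a sketch: the probe key is hypothetical, and whether a denied call throws or returns `false` can vary by provider.

```typescript
// Sketch: exercise each required permission and report which call fails.
// '__permission-probe__' is a hypothetical throwaway key; deleting a key
// that doesn't exist is a no-op on most S3-compatible providers.
async function probePermissions() {
  const checks: Array<[string, () => Promise<unknown>]> = [
    ['s3:ListBucket', () => storage.list.files({ maxResults: 1 })],
    ['s3:GetObject', () => storage.validation.exists('__permission-probe__')],
    ['s3:DeleteObject', () => storage.delete.files(['__permission-probe__'])],
  ]

  for (const [permission, check] of checks) {
    try {
      await check()
      console.log(`${permission}: OK`)
    } catch (error) {
      console.log(`${permission}: FAILED`, error)
    }
  }
}
```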
### Empty File Lists
**Problem:** `storage.list.files()` returns an empty array but files exist.
**Solutions:**
- **Check prefix:**

```typescript
// Try without a prefix first
const allFiles = await storage.list.files()
console.log('Total files:', allFiles.length)

// Then with a specific prefix
const prefixFiles = await storage.list.files({ prefix: 'uploads/' })
console.log('Prefix files:', prefixFiles.length)
```
- **Verify bucket name:**

```typescript
const info = storage.getProviderInfo()
console.log('Current bucket:', info.bucket)
```
### Presigned URL Errors
**Problem:** Presigned URLs return 403 or expire immediately.
**Solutions:**
- **Check the expiration time:**

```typescript
// Use a longer expiration for testing
const url = await storage.download.presignedUrl('file.jpg', 3600) // 1 hour
```
- **Verify the file exists:**

```typescript
const exists = await storage.validation.exists('file.jpg')

if (!exists) {
  console.log('File does not exist')
}
```
- **Check bucket privacy settings** (see the sketch below):
  - Private buckets require presigned URLs
  - Public buckets can use direct URLs
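
A small helper makes the private/public distinction explicit in your code. This is a sketch: `bucketIsPublic` is a hypothetical flag from your own configuration (the storage client doesn't expose it), and the direct URL shape shown assumes AWS S3, so adjust it for your provider.

```typescript
// Sketch: choose the URL type based on bucket privacy.
// `bucketIsPublic` is a hypothetical app-level flag, not part of the storage client.
async function getFileUrl(key: string, bucketIsPublic: boolean): Promise<string> {
  if (bucketIsPublic) {
    // Public buckets can serve files from a direct URL (AWS S3 URL shape assumed)
    const info = storage.getProviderInfo()
    return `https://${info.bucket}.s3.amazonaws.com/${key}`
  }

  // Private buckets need a presigned URL (1 hour expiration here)
  return storage.download.presignedUrl(key, 3600)
}
```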
### Large File Operations Timeout
**Problem:** Operations on large datasets time out or fail.
**Solutions:**
- **Use pagination:**

```typescript
// Instead of loading all files at once
const allFiles = await storage.list.files() // ❌ May time out

// Use pagination
const result = await storage.list.paginated({ maxResults: 100 }) // ✅ Better
```
- **Use async generators for processing:**

```typescript
for await (const batch of storage.list.paginatedGenerator({ maxResults: 50 })) {
  console.log(`Processing ${batch.files.length} files`)
  // Process batch...
}
```
- **Batch operations** (a combined sketch follows this list):

```typescript
// Delete files in batches
const filesToDelete = ['file1.jpg', 'file2.jpg', /* ... many files */]
const batchSize = 100

for (let i = 0; i < filesToDelete.length; i += batchSize) {
  const batch = filesToDelete.slice(i, i + batchSize)
  await storage.delete.files(batch)
}
```
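
For very large cleanups you can combine the generator and batch-delete patterns above, so neither the listing nor the delete has to hold the full dataset in memory. A sketch, assuming `paginatedGenerator` accepts a `prefix` option the way `list.files` does:

```typescript
// Sketch: stream pages of keys and delete them page by page.
// Assumes paginatedGenerator accepts a prefix option (as list.files does).
async function deleteByPrefix(prefix: string) {
  for await (const batch of storage.list.paginatedGenerator({ prefix, maxResults: 100 })) {
    const keys = batch.files.map((file) => file.key)
    if (keys.length > 0) {
      await storage.delete.files(keys)
    }
  }
}
```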
## Error Handling Patterns
### Graceful Degradation
```typescript
async function getFilesSafely() {
  try {
    const files = await storage.list.files()
    return { success: true, files }
  } catch (error) {
    if (isPushduckError(error)) {
      console.log('Storage error:', error.code, error.context)

      // Handle specific error types
      if (error.code === 'NETWORK_ERROR') {
        return { success: false, error: 'Network connection failed', files: [] }
      }
      if (error.code === 'ACCESS_DENIED') {
        return { success: false, error: 'Access denied', files: [] }
      }
    }

    // Fallback for unknown errors
    return { success: false, error: 'Unknown error', files: [] }
  }
}
```
### Retry Logic
```typescript
async function withRetry<T>(
  operation: () => Promise<T>,
  maxRetries = 3,
  delay = 1000
): Promise<T> {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await operation()
    } catch (error) {
      if (attempt === maxRetries) throw error

      console.log(`Attempt ${attempt} failed, retrying in ${delay}ms...`)
      await new Promise(resolve => setTimeout(resolve, delay))
      delay *= 2 // Exponential backoff
    }
  }

  throw new Error('Max retries exceeded')
}

// Usage
const files = await withRetry(() => storage.list.files())
```
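
Retrying an `ACCESS_DENIED` error will never succeed, so it's worth failing immediately on non-transient errors. A sketch, assuming `NETWORK_ERROR` is the only error code you treat as transient:

```typescript
// Sketch: only retry errors that can plausibly succeed on a later attempt.
async function withRetryOnTransient<T>(
  operation: () => Promise<T>,
  maxRetries = 3,
  delay = 1000
): Promise<T> {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await operation()
    } catch (error) {
      // Assumption: NETWORK_ERROR is the only transient code worth retrying
      const transient = isPushduckError(error) && error.code === 'NETWORK_ERROR'
      if (!transient || attempt === maxRetries) throw error

      await new Promise(resolve => setTimeout(resolve, delay))
      delay *= 2 // Exponential backoff
    }
  }

  throw new Error('Max retries exceeded')
}
```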
## Performance Optimization
### Efficient File Listing
```typescript
// ❌ Inefficient - loads all metadata
const allFiles = await storage.list.files()

// ✅ Efficient - only get what you need
const recentFiles = await storage.list.files({
  maxResults: 50,
  sortBy: 'lastModified',
  sortOrder: 'desc'
})

// ✅ Even better - use pagination for large datasets
const result = await storage.list.paginated({ maxResults: 20 })
```
### Batch Metadata Retrieval
```typescript
// ❌ Inefficient - one API call per file
const infosOneByOne = []
for (const key of fileKeys) {
  const info = await storage.metadata.getInfo(key)
  infosOneByOne.push(info)
}

// ✅ Efficient - a single batch call
const fileInfos = await storage.metadata.getBatch(fileKeys)
```
### Smart Caching
```typescript
// Cache file lists for a short time
const cache = new Map()

async function getCachedFiles(prefix?: string) {
  const cacheKey = `files:${prefix || 'all'}`

  if (cache.has(cacheKey)) {
    const { data, timestamp } = cache.get(cacheKey)
    if (Date.now() - timestamp < 60000) { // 1 minute cache
      return data
    }
  }

  const files = await storage.list.files({ prefix })
  cache.set(cacheKey, { data: files, timestamp: Date.now() })
  return files
}
```
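
The usual failure mode with this pattern is a stale read after a write. One approach (a sketch, reusing the `cache` map above) is to drop the affected entry whenever you mutate the bucket:

```typescript
// Sketch: invalidate the cached listing after a mutation so readers
// don't see files that were just deleted.
async function deleteFilesAndInvalidate(keys: string[], prefix?: string) {
  await storage.delete.files(keys)
  cache.delete(`files:${prefix || 'all'}`)
}
```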
## Debugging Tips
### Enable Debug Logging
```typescript
// Check if debug mode is enabled in your config
const config = storage.getConfig()
console.log('Debug mode:', config.provider.debug)

// Storage operations will log detailed information when debug is true
```
### Inspect Configuration
```typescript
// Check your current configuration
const config = storage.getConfig()
console.log('Provider:', config.provider.provider)
console.log('Bucket:', config.provider.bucket)
console.log('Region:', config.provider.region)

// Check provider info
const info = storage.getProviderInfo()
console.log('Provider info:', info)
```
### Test Individual Operations
```typescript
// Test each operation individually
console.log('Testing connection...')
const isHealthy = await storage.validation.connection()
console.log('Connection:', isHealthy ? 'OK' : 'Failed')

console.log('Testing file listing...')
const files = await storage.list.files({ maxResults: 1 })
console.log('Files found:', files.length)

console.log('Testing file existence...')
if (files.length > 0) {
  const exists = await storage.validation.exists(files[0].key)
  console.log('First file exists:', exists)
}
```
## Getting Help
If you're still experiencing issues:
- **Check the logs** - Look for detailed error messages in your console
- **Verify environment variables** - Ensure all required variables are set (see the sketch below)
- **Test with minimal configuration** - Start with a basic setup and add complexity gradually
- **Check provider documentation** - Verify your bucket/account settings
- **Use the health check** - Run `storage.validation.connection()` to verify basic connectivity
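
For the environment variable check, a fail-fast startup assertion saves a round of debugging. A sketch: the exact variable list (including `AWS_SECRET_ACCESS_KEY`) is an assumption, so match it to what your provider actually requires.

```typescript
// Sketch: fail fast if required environment variables are missing.
// The variable names here are assumptions - adjust for your provider.
const required = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'S3_BUCKET_NAME']
const missing = required.filter((name) => !process.env[name])

if (missing.length > 0) {
  throw new Error(`Missing environment variables: ${missing.join(', ')}`)
}
```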