# AI & LLM Integration
URL: /docs/ai-integration
Access documentation content in AI-friendly formats for large language models and automated tools.
Pushduck documentation provides AI-friendly endpoints that make it easy for large language models (LLMs) and automated tools to access and process our documentation content.
## Available Endpoints
### 📚 Complete Documentation Export
Access all documentation content in a single, structured format:
```
GET /llms.txt
```
This endpoint returns all documentation pages in a clean, AI-readable format with:
* Page titles and URLs
* Descriptions and metadata
* Full content with proper formatting
* Structured sections and hierarchies
**Example Usage:**
```bash
curl https://your-domain.com/llms.txt
```
### 📄 Individual Page Access
Access any documentation page's raw content by appending `.mdx` to its URL:
```
GET /docs/{page-path}.mdx
```
**Examples:**
* `/docs/quick-start.mdx` - Quick start guide content
* `/docs/api/hooks/use-upload.mdx` - Hook documentation
* `/docs/guides/setup/aws-s3.mdx` - AWS S3 setup guide
## Use Cases
### 🤖 **AI Assistant Integration**
* Train custom AI models on our documentation
* Create chatbots that can answer questions about Pushduck
* Build intelligent documentation search systems
### 🔧 **Development Tools**
* Generate code examples and snippets
* Create automated documentation tests
* Build CLI tools that reference our docs
### 📊 **Content Analysis**
* Analyze documentation completeness
* Track content changes over time
* Generate documentation metrics (see the sketch below)
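For instance, documentation metrics can be computed by splitting the `/llms.txt` export on its page headings. A minimal TypeScript sketch (the host is a placeholder, and the delimiter regex assumes the content format documented below):

```typescript
// Fetch the full export and split it into pages. Each page starts with
// a "# Title" heading immediately followed by a "URL:" line.
const text = await fetch("https://your-domain.com/llms.txt").then((r) => r.text());
const pages = text.split(/\n(?=# .+\nURL: )/);

for (const page of pages) {
  const [title = "", url = ""] = page.split("\n");
  const wordCount = page.split(/\s+/).length;
  console.log(`${url.replace("URL: ", "")}: ${wordCount} words (${title.replace("# ", "")})`);
}
```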
## Content Format
The LLM endpoints return content in a structured format:
```
# Page Title
URL: /docs/page-path
Page description here
# Section Headers
Content with proper markdown formatting...
## Subsections
- Lists and bullet points
- Code blocks with syntax highlighting
- Tables and structured data
```
## Technical Details
* **Caching**: Content is cached for optimal performance
* **Processing**: Uses Remark pipeline with MDX and GFM support
* **Format**: Clean markdown with frontmatter removed
* **Encoding**: UTF-8 text format
* **CORS**: Enabled for cross-origin requests
## Rate Limiting
These endpoints are designed for programmatic access and don't have aggressive rate limiting. However, please be respectful:
* Cache responses when possible (see the sketch below)
* Avoid excessive automated requests
* Use appropriate user agents for your tools
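A sketch of a polite client that follows the first and last guidelines, caching in memory and identifying itself (Node 18+; the URL, TTL, and User-Agent string are placeholders):

```typescript
// Cache the export in memory so repeated calls don't re-fetch.
let cached: { body: string; fetchedAt: number } | null = null;
const TTL_MS = 60 * 60 * 1000; // re-fetch at most once per hour

export async function getDocs(): Promise<string> {
  if (cached && Date.now() - cached.fetchedAt < TTL_MS) {
    return cached.body;
  }
  const res = await fetch("https://your-domain.com/llms.txt", {
    // Identify your tool so the docs host can tell bots apart.
    headers: { "User-Agent": "my-docs-tool/1.0 (contact@example.com)" },
  });
  if (!res.ok) throw new Error(`Failed to fetch docs: ${res.status}`);
  cached = { body: await res.text(), fetchedAt: Date.now() };
  return cached.body;
}
```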
## Examples
### Python Script
```python
import requests
# Get all documentation
response = requests.get('https://your-domain.com/llms.txt')
docs_content = response.text
# Get specific page
page_response = requests.get('https://your-domain.com/docs/quick-start.mdx')
page_content = page_response.text
```
### Node.js/JavaScript
```javascript
// Fetch all documentation
const allDocs = await fetch("/llms.txt").then((r) => r.text());
// Fetch specific page
const quickStart = await fetch("/docs/quick-start.mdx").then((r) => r.text());
```
### cURL
```bash
# Download all docs to file
curl -o pushduck-docs.txt https://your-domain.com/llms.txt
# Get specific page content
curl https://your-domain.com/docs/api/hooks/use-upload.mdx
```
## Integration with Popular AI Tools
### OpenAI GPT
Use the `/llms.txt` endpoint to provide context about Pushduck in your GPT conversations.
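For example, you can pass the export as system-prompt context when calling the API directly. A minimal sketch (the model name, question, and `your-domain.com` host are placeholders; requires an `OPENAI_API_KEY` and trimming the docs to your model's context window):

```typescript
// Fetch the docs, then ask a chat model to answer questions grounded in them.
const docs = await fetch("https://your-domain.com/llms.txt").then((r) => r.text());

const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: `Answer questions about Pushduck using these docs:\n${docs}` },
      { role: "user", content: "How do I create an upload router?" },
    ],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```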
### Claude/Anthropic
Feed documentation content to Claude for detailed analysis and code generation.
### Local LLMs
Download content for training or fine-tuning local language models.
***
These AI-friendly endpoints make it easy to integrate Pushduck documentation into your development workflow and AI-powered tools!
# Examples & Demos
URL: /docs/examples
Experience pushduck with interactive demos and real-world examples. All demos use live Cloudflare R2 integration.
import { Callout } from "fumadocs-ui/components/callout";
import { Tabs, Tab } from "fumadocs-ui/components/tabs";
**Live Demos:** These are fully functional demos using real Cloudflare R2 storage. Files are uploaded to a demo bucket and may be automatically cleaned up. Don't upload sensitive information.
**Having Issues?** If uploads aren't working (especially with `next dev --turbo`), check our [Troubleshooting Guide](/docs/api/troubleshooting) for common solutions including the known Turbo mode compatibility issue.
## Interactive Upload Demo
The full-featured demo showcasing all capabilities:
**ETA & Speed Tracking:** Upload speed (MB/s) and estimated time remaining (ETA) appear below the progress bar during active uploads. Try uploading larger files (1MB+) to see these metrics in action! ETA becomes more accurate after the first few seconds of upload.
## Image-Only Upload
Focused demo for image uploads with preview capabilities:
## Document Upload
Streamlined demo for document uploads:
## Key Features Demonstrated
### ✅ **Type-Safe Client**
```typescript
// Property-based access with full TypeScript inference
const imageUpload = upload.imageUpload();
const fileUpload = upload.fileUpload();
// No string literals, no typos, full autocomplete
await imageUpload.uploadFiles(selectedFiles);
```
### ⚡ **Real-Time Progress**
* Individual file progress tracking with percentage completion
* Upload speed monitoring (MB/s) with live updates
* ETA calculations showing estimated time remaining (see the sketch below)
* Pause/resume functionality (coming soon)
* Comprehensive error handling with retry mechanisms
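In code, these metrics come straight off the hook state. A sketch (the `progress`, `uploadSpeed`, and `eta` field names follow the overall progress metrics listed in the roadmap; verify the exact shape against the hooks reference):

```typescript
"use client";
import { upload } from "@/lib/upload-client";

export function UploadStats() {
  // Aggregate metrics across all files in this route:
  // progress (0-100), uploadSpeed (bytes/s), eta (seconds).
  const { progress, uploadSpeed, eta, isUploading } = upload.imageUpload();

  if (!isUploading) return null;
  return (
    <p>
      {Math.round(progress)}% at {(uploadSpeed / 1024 / 1024).toFixed(1)} MB/s,
      ~{Math.ceil(eta)}s remaining
    </p>
  );
}
```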
### 🔒 **Built-in Validation**
* File type validation (MIME types)
* File size limits with user-friendly errors
* Custom validation middleware
* Malicious file detection
### 🌐 **Provider Agnostic**
* Same code works with any S3-compatible provider (see the sketch below)
* Switch between Cloudflare R2, AWS S3, DigitalOcean Spaces
* Zero vendor lock-in
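Concretely, a provider switch is confined to the server config. A sketch (the `cloudflareR2` key matches the examples below; the AWS variant's key and fields are assumptions, so check the provider setup guides for exact names):

```typescript
import { createUploadConfig } from "pushduck/server";

// Cloudflare R2 today; only the .provider(...) call changes if you move
// to another S3-compatible service. Routers and client code stay put.
const { s3 } = createUploadConfig()
  .provider("cloudflareR2", {
    accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
    bucket: process.env.R2_BUCKET!,
  })
  .build();

// Hypothetical AWS S3 variant (provider key and fields assumed):
// .provider("aws", {
//   region: process.env.AWS_REGION!,
//   bucket: process.env.AWS_S3_BUCKET_NAME!,
// })
```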
## Code Examples
```typescript
"use client";
import { upload } from "@/lib/upload-client";

export function SimpleUpload() {
  const { uploadFiles, files, isUploading } = upload.imageUpload();

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {files.map((file) => (
        <div key={file.name}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
  );
}
```
```typescript
// app/api/upload/route.ts
import { createUploadConfig } from "pushduck/server";
const { s3 } = createUploadConfig()
.provider("cloudflareR2", {
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
bucket: process.env.R2_BUCKET!,
})
.defaults({
maxFileSize: "10MB",
acl: "public-read",
})
.build();
const uploadRouter = s3.createRouter({
imageUpload: s3
.image()
.max("5MB")
.formats(["jpeg", "png", "webp"])
.middleware(async ({ file, metadata }) => {
// Custom authentication and metadata
const session = await getServerSession();
if (!session) throw new Error("Unauthorized");
return {
...metadata,
userId: session.user.id,
uploadedAt: new Date().toISOString(),
};
})
.onUploadComplete(async ({ file, url, metadata }) => {
// Post-upload processing
console.log(`Upload complete: ${url}`);
await saveToDatabase({ url, metadata });
}),
});
export const { GET, POST } = uploadRouter.handlers;
export type AppRouter = typeof uploadRouter;
```
```typescript
"use client";
import { upload } from "@/lib/upload-client";

export function RobustUpload() {
  const { uploadFiles, files, errors, reset } = upload.imageUpload();

  const handleUpload = async (fileList: FileList) => {
    try {
      await uploadFiles(Array.from(fileList));
    } catch (error) {
      console.error("Upload failed:", error);
      // Error is automatically added to the errors array
    }
  };

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
      />

      {/* Display errors */}
      {errors.length > 0 && (
        <div>
          <h4>Upload Errors:</h4>
          <ul>
            {errors.map((error, index) => (
              <li key={index}>{error}</li>
            ))}
          </ul>
          <button onClick={reset}>Clear Errors</button>
        </div>
      )}

      {/* Display files with status */}
      {files.map((file) => (
        <div key={file.name}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.status === "error" && <span>{file.error}</span>}
          {file.status === "success" && file.url && (
            <a href={file.url}>View File</a>
          )}
        </div>
      ))}
    </div>
  );
}
```
## Real-World Use Cases
### **Profile Picture Upload**
Single image upload with instant preview and crop functionality.
### **Document Management**
Multi-file document upload with categorization and metadata.
### **Media Gallery**
Batch image upload with automatic optimization and thumbnail generation.
### **File Sharing**
Secure file upload with expiration dates and access controls.
## Next Steps
# Pushduck
URL: /docs
Own your file uploads. The most comprehensive upload solution for Next.js.
import { Card, Cards } from "fumadocs-ui/components/card";
import { Step, Steps } from "fumadocs-ui/components/steps";
# Own Your File Uploads
*File uploads in Next.js have been overcomplicated for too long. Developers shouldn't need to cobble together multiple libraries, write custom middleware, and manage complex state just to handle file uploads. We believe the TypeScript ecosystem deserves better; hence, Pushduck.*
**The most comprehensive file upload solution for Next.js.** Guided setup, full TypeScript support, and everything you need out of the box.
```typescript
// It's really this simple
const upload = createUploadClient();

export function MyComponent() {
  const { uploadFiles, uploadedFiles, isUploading } = upload.imageUpload();

  return (
    <input
      type="file"
      onChange={(e) => uploadFiles(e.target.files)}
      disabled={isUploading}
    />
  );
}
```
## Why Pushduck?
File uploads should be **simple**, **secure**, and **scalable**. Other solutions make you choose between ease of use and control, or require vendor lock-in and ongoing costs.
Pushduck gives you:
* **Full ownership** of your upload infrastructure
* **Zero vendor lock-in** with provider-agnostic design
* **Production-grade features** without the complexity
* **Type-safe development** with full TypeScript inference
* **Community-driven** with real-world usage patterns
## Get Started
## Loved by Developers
> "Finally, an upload solution that just works. The TypeScript inference is incredible - I get autocomplete for everything and catch errors before they hit production."
โ **React Developer**, SaaS Startup
> "We migrated from Uploadthing to pushduck and cut our upload costs by 80%. The provider-agnostic design means we can switch S3-compatible providers anytime."
โ **CTO**, E-commerce Platform
> "The property-based client approach is genius. No more passing route names as strings - everything is type-safe and the DX is outstanding."
โ **Full-Stack Developer**, Agency
***
## Framework Agnostic
While optimized for Next.js, it works seamlessly across the JavaScript ecosystem:
## What's Included
Everything you need for production file uploads:
* ✅ **Validation & Security** - File type, size, and custom validation
* ✅ **Overall Progress Tracking** - Real-time aggregate progress, speed, and ETA across all files
* ✅ **Error Handling** - Comprehensive error states and recovery
* ✅ **Middleware System** - Custom logic for authentication, metadata, and processing
* ✅ **Type Inference** - Full TypeScript safety from server to client
* ✅ **Provider Support** - Cloudflare R2, AWS S3, DigitalOcean, MinIO, and more
* ✅ **Image Processing** - Built-in Sharp integration for optimization
* ✅ **Drag & Drop** - Ready-to-use components and hooks
* ✅ **Multi-file Support** - Concurrent uploads with progress aggregation
# Quick Start
URL: /docs/quick-start
Get production-ready file uploads working in your Next.js app in under 2 minutes.
import { Step, Steps } from "fumadocs-ui/components/steps";
import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
Get **production-ready file uploads** working in your Next.js app in under 2 minutes with our CLI tool. Interactive setup, just one command.
**🚀 New!** Use our CLI for instant setup: `npx @pushduck/cli@latest init` - handles everything automatically!
## Choose Your Setup Method
### ⚡ Interactive CLI Setup
Get everything set up instantly with our interactive CLI:
```bash
npx @pushduck/cli@latest init
```
That's it! The CLI will:
* ✅ Install dependencies automatically
* ✅ Set up your chosen provider (AWS S3, Cloudflare R2, etc.)
* ✅ Create API routes with type safety
* ✅ Generate example components
* ✅ Configure environment variables
* ✅ Create and configure your S3 bucket
**What you get:**
* Production-ready upload API in `app/api/upload/route.ts`
* Type-safe upload client in `lib/upload-client.ts`
* Example components in `components/ui/`
* Working demo page in `app/upload/page.tsx`
[**📚 Full CLI Documentation →**](/docs/guides/setup/cli-setup)
**Example CLI Output:**
```
┌──────────────────────────────────────────────────────────────┐
│                                                              │
│   🚀 Welcome to Pushduck                                     │
│                                                              │
│   Let's get your file uploads working in 2 minutes!          │
│                                                              │
└──────────────────────────────────────────────────────────────┘

🔍 Detecting your project...
✓ Next.js App Router detected
✓ TypeScript configuration found

? Which cloud storage provider would you like to use?
❯ AWS S3 (recommended)
  Cloudflare R2 (S3-compatible, global edge)
  DigitalOcean Spaces (simple, affordable)

✨ Generated files:
├── app/api/upload/route.ts
├── app/upload/page.tsx
├── components/ui/upload-button.tsx
├── lib/upload-client.ts
└── .env.example

🎉 Setup complete! Your uploads are ready.
```
### 🔧 Manual Setup
If you prefer to set things up manually or need custom configuration:
## Prerequisites
* Next.js 13+ with App Router
* An S3-compatible storage provider (we'll use Cloudflare R2 in this guide)
* Node.js 18+
## Install Pushduck
```bash
npm install pushduck
```
**Using a different package manager?**
```bash
npm install pushduck
```
```bash
pnpm add pushduck
```
```bash
yarn add pushduck
```
```bash
bun add pushduck
```
## Set Environment Variables
Create a `.env.local` file in your project root with your storage credentials:
```dotenv
# .env.local
CLOUDFLARE_ACCOUNT_ID=your_account_id
CLOUDFLARE_ACCESS_KEY_ID=your_access_key
CLOUDFLARE_SECRET_ACCESS_KEY=your_secret_key
CLOUDFLARE_BUCKET_NAME=your-bucket-name
```
**Don't have R2 credentials yet?** Follow our [Cloudflare R2 setup
guide](/docs/providers/cloudflare-r2) to create a bucket and get your credentials
in 2 minutes.
## Create Your Upload Router
Create an API route to handle file uploads:
```typescript
// app/api/s3-upload/route.ts
import { createUploadConfig } from "pushduck/server";
const { s3 } = createUploadConfig()
.provider("cloudflareR2", {
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
accessKeyId: process.env.CLOUDFLARE_ACCESS_KEY_ID!,
secretAccessKey: process.env.CLOUDFLARE_SECRET_ACCESS_KEY!,
bucket: process.env.CLOUDFLARE_BUCKET_NAME!,
region: "auto",
})
.build();
const router = s3.createRouter({
// Define your upload routes with validation
imageUpload: s3
.image()
.max("10MB")
.formats(["jpeg", "png", "webp"]),
documentUpload: s3.file().max("50MB").types(["application/pdf", "application/msword"]),
});
export const { GET, POST } = router.handlers;
// Export the router type for client-side type safety
export type Router = typeof router;
```
**What's happening here?**

* `s3.createRouter()` creates a type-safe upload handler
* `s3.image()` and `s3.file()` provide validation and TypeScript inference
* The router automatically handles presigned URLs, validation, and errors
* Exporting the type enables full client-side type safety
## Create Upload Client
Create a type-safe client for your components using the **recommended structured approach**:
**Recommended**: The structured client provides the best developer experience with property-based access, centralized configuration, and enhanced type safety.
```typescript
// lib/upload-client.ts
import { createUploadClient } from "pushduck/client";
import type { Router } from "@/app/api/s3-upload/route";
// Create a type-safe upload client (recommended)
export const upload = createUploadClient<Router>({
endpoint: "/api/s3-upload",
});
```
**Why this approach is recommended:**
* ✅ **Full type inference** from your server router
* ✅ **Property-based access** - `upload.imageUpload()` instead of strings
* ✅ **IntelliSense support** - see all available endpoints
* ✅ **Refactoring safety** - rename routes with confidence
* ✅ **Centralized config** - set headers, timeouts, and options once
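A sketch of that centralized configuration (the `headers` option is an assumption drawn from the bullet above; verify the exact option names in the client API reference):

```typescript
// lib/upload-client.ts
import { createUploadClient } from "pushduck/client";
import type { Router } from "@/app/api/s3-upload/route";

// Configured once; every `upload.*` route inherits these settings.
export const upload = createUploadClient<Router>({
  endpoint: "/api/s3-upload",
  // Assumed option: headers sent with every upload request.
  headers: { "x-upload-source": "web-app" },
});
```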
**Alternative**: You can also use the hook-based approach if you prefer traditional React patterns:
```typescript
// With type parameter (recommended)
const { uploadFiles } = useUploadRoute<Router>('imageUpload')
// Or without type parameter (also works)
const { uploadFiles } = useUploadRoute('imageUpload')
```
The structured client is still recommended for most use cases.
## Use in Your Components
Now you can use the upload client in any component with full type safety:
```typescript
// components/image-uploader.tsx
"use client";
import { upload } from "@/lib/upload-client";

export function ImageUploader() {
  const { uploadFiles, uploadedFiles, isUploading, progress, error } =
    upload.imageUpload();

  const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    const files = e.target.files;
    if (files) {
      uploadFiles(Array.from(files));
    }
  };

  return (
    <div>
      <input type="file" accept="image/*" multiple onChange={handleFileChange} />

      {isUploading && <p>Uploading... {Math.round(progress)}%</p>}
      {error && <p>{error.message}</p>}

      {uploadedFiles.length > 0 && (
        <ul>
          {uploadedFiles.map((file) => (
            <li key={file.name}>{file.name}</li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
## Add to Your Page
Finally, use your upload component in any page:
```typescript
// app/page.tsx
import { ImageUploader } from "@/components/image-uploader";

export default function HomePage() {
  return (
    <main>
      <h1>Upload Images</h1>
      <ImageUploader />
    </main>
  );
}
```
## 🎉 Congratulations!
You now have **production-ready file uploads** working in your Next.js app! Here's what you accomplished:
* ✅ **Type-safe uploads** with full TypeScript inference
* ✅ **Automatic validation** for file types and sizes
* ✅ **Progress tracking** with loading states
* ✅ **Error handling** with user-friendly messages
* ✅ **Secure uploads** using presigned URLs
* ✅ **Multiple file support** with image preview
## What's Next?
Now that you have the basics working, explore these advanced features:
## Need Help?
* 📚 **Documentation**: Explore our comprehensive [guides](/docs/guides)
* 💬 **Community**: Join our [Discord community](https://discord.gg/pushduck)
* 🐛 **Issues**: Report bugs on [GitHub](https://github.com/abhay-ramesh/pushduck)
* 📧 **Support**: Email us at [support@pushduck.com](mailto:support@pushduck.com)

**Loving Pushduck?** Give us a ⭐ on
[GitHub](https://github.com/abhay-ramesh/pushduck) and help spread the
word!
# Roadmap
URL: /docs/roadmap
Our vision for the future of file uploads in Next.js
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { File, Folder, Files } from "fumadocs-ui/components/files";
import { TypeTable } from "fumadocs-ui/components/type-table";
Our mission is to make file uploads **simple**, **secure**, and **scalable** for every developer and every use case.
## ✅ Completed
### Core Foundation
✅ **Universal Compatibility** - Works with 16+ frameworks and edge runtimes\
✅ **Type-Safe APIs** - Full TypeScript inference from server to client\
✅ **Multi-Provider Support** - AWS S3, Cloudflare R2, DigitalOcean Spaces, MinIO\
✅ **Production Security** - Presigned URLs, file validation, CORS handling\
✅ **Developer Experience** - Property-based client, comprehensive error handling\
✅ **Overall Progress Tracking** - Now provides real-time aggregate progress metrics
* `progress` - 0-100% completion across all files
* `uploadSpeed` - Combined transfer rate in bytes/second
* `eta` - Overall time remaining in seconds
### Setup & Tooling
✅ **Interactive CLI** - Guided setup with smart defaults and auto-detection\
✅ **Code Generation** - Type-safe API routes and client components\
✅ **Framework Detection** - Automatic Next.js App Router/Pages Router detection\
✅ **Environment Setup** - Automated credential configuration
### Documentation & Examples
✅ **Comprehensive Docs** - Complete API reference and integration guides\
✅ **Live Examples** - Working demos for all supported frameworks\
✅ **Migration Guides** - Step-by-step migration from other solutions\
✅ **Best Practices** - Security, performance, and architecture guidance
## ⚠️ Current Limitations
### Progress Tracking Constraints
✅ **Overall Progress Tracking** - Now provides real-time aggregate progress metrics
* `progress` - 0-100% completion across all files
* `uploadSpeed` - Combined transfer rate in bytes/second
* `eta` - Overall time remaining in seconds
### Upload Control Limitations
Current upload management has constraints for handling real-world scenarios:
❌ **No Resumable Uploads** - Cannot resume interrupted uploads from where they left off\
❌ **No Pausable Uploads** - Cannot pause ongoing uploads and resume later\
❌ **No Cancel Support** - Cannot cancel individual uploads in progress\
❌ **Limited Network Resilience** - No automatic retry on network failures or connection switching
These limitations may be addressed in future releases based on community feedback and use case requirements.
## 🚧 In Progress
### Enhanced Developer Experience
🚧 **Visual Studio Code Extension** - IntelliSense, snippets, and debugging tools\
🚧 **Enhanced Error Messages** - Contextual help and troubleshooting suggestions\
🚧 **Performance Monitoring** - Built-in metrics and optimization recommendations
### Advanced Features
🚧 **Image Processing Pipeline** - Automatic optimization, resizing, and format conversion\
🚧 **Video Processing** - Transcoding, thumbnail generation, and streaming support\
🚧 **Advanced Validation** - Content scanning, virus detection, and custom rules
## 📅 Planned
### Q3 2025 - Enterprise Features
* **Advanced Analytics** - Upload metrics, performance insights, and usage tracking
* **Enhanced Hook APIs** - onProgress callbacks and advanced upload state management
* **Advanced Upload Control** - Resumable, pausable uploads with cancel support and network resilience
* **Team Management** - Multi-user access, role-based permissions, and audit logs
* **Advanced Security** - Content scanning, encryption at rest, and compliance tools
* **SLA & Support** - Enterprise support plans and guaranteed uptime
### Q4 2025 - Platform Expansion
* **Mobile SDKs** - React Native, Flutter, and native iOS/Android support
* **Desktop Applications** - Electron and Tauri integration
* **Serverless Optimization** - Enhanced edge runtime support and cold start optimization
* **Global CDN** - Built-in content delivery and edge caching
### Q1 2026 - AI Integration
* **Smart Tagging** - Automatic content categorization and metadata extraction
* **Image Recognition** - Object detection, OCR, and content moderation
* **Intelligent Compression** - AI-powered optimization for different use cases
* **Content Insights** - Usage patterns and optimization recommendations
### Q2 2026 - Ecosystem Growth
* **Plugin Architecture** - Extensible middleware system for custom workflows
* **Third-party Integrations** - CMS platforms, e-commerce solutions, and productivity tools
* **Community Templates** - Shared configurations and best practices
* **Certification Program** - Training and certification for developers and teams
## 🎯 Long-term Vision
### Universal File Management Platform
Transform pushduck from an upload library into a comprehensive file management platform that handles the entire file lifecycle:
* **Intelligent Storage** - Automatic tier management and cost optimization
* **Global Distribution** - Multi-region replication and edge delivery
* **Advanced Processing** - Real-time transformation and processing pipelines
* **Collaborative Features** - Shared workspaces, comments, and version control
### Developer Ecosystem
Build a thriving ecosystem around pushduck:
* **Marketplace** - Community-driven plugins, templates, and integrations
* **Certification** - Professional training and certification programs
* **Events & Community** - Conferences, meetups, and developer advocacy
* **Enterprise Solutions** - Custom implementations and consulting services
## 💡 Ideas & Suggestions
Have ideas for pushduck? We'd love to hear them!
* [Feature Requests](https://github.com/abhay-ramesh/pushduck/discussions/categories/ideas)
* [Community Discord](https://discord.gg/pushduck)
* [Developer Survey](https://forms.gle/pushduck-feedback)
## 🤝 Contributing
Want to help build the future of file uploads? Check out our [Contributing Guide](/docs/contributing) to get started.
### Current Priorities
We're actively looking for contributors in these areas:
* **Framework Integrations** - Help us support more frameworks and platforms
* **Documentation** - Improve guides, examples, and API documentation
* **Testing** - Expand test coverage and add integration tests
* **Performance** - Optimize bundle size and runtime performance
* **Security** - Security audits and vulnerability assessments
***
*Last updated: June 2025*
This roadmap is community-driven. **Your feedback shapes our priorities.**
Join our [Discord](https://discord.gg/pushduck) or open an issue on
[GitHub](https://github.com/abhay-ramesh/pushduck) to influence what we
build next.
## Current Status
We've already solved the core problems that have frustrated developers for years:
✅ **Interactive CLI** - Guided setup with smart defaults and auto-detection\
✅ **Type Safety** - Full TypeScript inference for upload schemas\
✅ **Multiple Providers** - Cloudflare R2, AWS S3, Google Cloud, and more\
✅ **Production Ready** - Used by teams processing millions of uploads\
✅ **Developer Experience** - Property-based client access with enhanced IntelliSense
## What's Next
### 🚀 Q3 2025: Developer Experience Revolution
The CLI will automatically detect your project structure and configure everything:
```bash
npx @pushduck/cli@latest init
# ✓ Detected Next.js 14 with App Router
# ✓ Created upload route at /api/upload
# ✓ Added environment variables to .env.local
# ✓ Generated type-safe upload config
```
{" "}
No more YAML or complex configuration files. Build upload pipelines visually
and export to code with real-time preview of your upload components.
Hot-reload your upload components with live preview of validation, progress,
and error states. Perfect for rapid prototyping and design iteration.
Complete control over upload lifecycle with automatic recovery from network issues:
```typescript
const { files, uploadFiles, pauseUpload, resumeUpload, cancelUpload } = upload.images
// Pause individual uploads
await pauseUpload(fileId)
// Resume from where it left off
await resumeUpload(fileId)
// Cancel with cleanup
await cancelUpload(fileId)
// Automatic network resilience
const config = {
retryAttempts: 3,
networkSwitchTolerance: true,
resumeOnReconnect: true
}
```
### 🔒 Q4 2025: Enterprise & Security
```typescript
const s3Router = s3.createRouter({
images: s3.image()
.permissions(["user:upload", "admin:all"])
.auditLog(true)
.compliance("SOC2")
})
```
Built-in integration with leading security providers for automatic threat detection, content moderation, and policy enforcement.
Pre-built compliance workflows and automatic data handling policies with audit trails and data retention management.
### ⚡ Q1 2026: Performance & Scale
```typescript
const s3Router = s3.createRouter({
images: s3.image()
.processing({
resize: { width: 800, height: 600 },
format: "webp",
edge: true // Process at nearest edge location
})
})
```
Automatic cache warming and smart invalidation strategies for optimal performance. Includes built-in CDN integration with major providers.
Real-time dashboard showing upload success rates, processing times, storage costs, and performance bottlenecks with actionable insights.
### 🌐 Q2 2026: Ecosystem Expansion
Complete framework support with the same developer experience:
```typescript
// Vue 3 Composition API
import { createUploadClient } from '@pushduck/vue'
const upload = createUploadClient({
endpoint: '/api/upload'
})
const { files, uploadFiles, isUploading } = upload.imageUpload
```
```typescript
// Svelte stores
import { uploadStore } from '@pushduck/svelte'
const upload = uploadStore('/api/upload')
// Reactive stores for upload state
$: ({ files, isUploading } = $upload.imageUpload)
```
```typescript
// Pure JavaScript
import { UploadClient } from '@pushduck/core'
const client = new UploadClient('/api/upload')
client.upload('imageUpload', files)
.on('progress', (progress) => console.log(progress))
.on('complete', (urls) => console.log(urls))
```
{" "}
Native mobile SDKs with the same type-safe API you love on the web. Full
offline support with automatic retry and background uploads.
Automatic alt text generation, content categorization, duplicate
detection, and smart compression based on content analysis.
## Community Roadmap
### What You're Asking For
Based on community feedback, GitHub issues, and Discord discussions:
**🔥 High Priority (Next 3 months)**

* **Drag & Drop File Manager** - Visual file organization and bulk operations
* **Video Processing Pipeline** - Automatic transcoding and thumbnail generation
* **Better Error Messages** - More helpful error descriptions with suggested fixes
* **Upload Resume** - Automatic retry and resume for failed large file uploads
* **Real-time Collaboration** - Multiple users uploading to shared spaces

*These features have 100+ upvotes across GitHub and Discord*
{" "}
**๐ญ Exploring (6 months)** - **GraphQL Integration** - Native GraphQL
subscription support for upload progress - **Webhook Builder** - Visual
webhook configuration for upload events - **Template Gallery** - Pre-built
upload components for common use cases - **A/B Testing** - Built-in
experimentation for upload flows - **White-label Solution** - Fully
customizable upload interface *Join the discussion on these features in our
Discord*
**🔮 Vision (12+ months)**

* **No-Code Integration** - Zapier/Make.com connectors
* **Blockchain Storage** - IPFS and decentralized storage options
* **AI-Powered Optimization** - Automatic performance tuning
* **Cross-Platform Desktop** - Electron-based upload manager
* **Enterprise Marketplace** - Plugin ecosystem for custom integrations

*These are aspirational goals that depend on community growth*
## How We Prioritize
Our roadmap is driven by three key factors:
1. **Community Impact** - Features that solve real problems for the most developers
2. **Technical Excellence** - Maintaining our high standards for type safety and DX
3. **Ecosystem Health** - Building a sustainable, long-term solution
### Voting on Features
Have an idea or want to prioritize something? Here's how to influence our roadmap:
Use our feature request template with use cases and expected API design. Include code examples and real-world scenarios.
{" "}
Join our Discord server where we run monthly polls on upcoming features. Your
vote directly influences our development priorities.
First Friday of every month at 10 AM PT - open to all developers. Share your use cases and help shape the future.
## Behind the Scenes
### What We're Working on Right Now
**Week of June 23, 2025:**
* 🎨 Enhanced type inference for nested upload schemas
* 🧪 Testing framework for upload workflows
* 📚 Interactive examples in documentation
* 🐛 Bug fixes for edge cases in multi-part uploads
Follow our [GitHub project
board](https://github.com/abhay-ramesh/pushduck/projects) for real-time
updates on development progress.
### Development Structure
Our development process is organized around clear modules:
### Core Principles
As we build new features, we never compromise on:
* **Type Safety First** - Every feature must have full TypeScript support
* **Zero Breaking Changes** - Backward compatibility is non-negotiable
* **Performance by Default** - New features can't slow down existing workflows
* **Developer Happiness** - If it's not delightful to use, we rebuild it
## Get Involved
This roadmap exists because of developers like you. Here's how to shape the future:
### For Users
* **Share your use case** - Tell us what you're building
* **Report pain points** - What's still too complicated?
* **Request integrations** - Which providers or tools do you need?
### For Contributors
* **Code contributions** - Check our [contributing guide](https://github.com/abhay-ramesh/pushduck/blob/main/CONTRIBUTING.md)
* **Documentation** - Help improve examples and guides
* **Community support** - Answer questions in Discord and GitHub
### For Organizations
* **Sponsorship** - Support full-time development
* **Enterprise feedback** - Share your scale challenges
* **Partnership** - Integrate pushduck with your platform
***
**Ready to build the future of file uploads?** Join our [Discord
community](https://discord.gg/pushduck) and help us make file
uploads delightful for every Next.js developer.
# S3 Router
URL: /docs/api/s3-router
Type-safe upload routes with schema validation and middleware
The S3 router provides a type-safe way to define upload endpoints with schema validation, middleware, and lifecycle hooks.
## Basic Router Setup
```typescript
// app/api/upload/route.ts
import { s3 } from '@/lib/upload'
const s3Router = s3.createRouter({
imageUpload: s3
.image()
.max('5MB')
.formats(['jpeg', 'jpg', 'png', 'webp'])
.middleware(async ({ file, metadata }) => {
// Add authentication and user context
return {
...metadata,
userId: 'user-123',
uploadedAt: new Date().toISOString(),
}
}),
documentUpload: s3
.file()
.max('10MB')
.types(['application/pdf', 'text/plain'])
.paths({
prefix: 'documents',
}),
})
// Export the handler
export const { GET, POST } = s3Router.handlers;
```
## Schema Builders
### Image Schema
```typescript
s3.image()
.max('5MB')
.formats(['jpeg', 'jpg', 'png', 'webp', 'gif'])
.dimensions({ minWidth: 100, maxWidth: 2000 })
.quality(0.8) // JPEG quality
```
### File Schema
```typescript
s3.file()
.max('10MB')
.types(['application/pdf', 'text/plain', 'application/json'])
.extensions(['pdf', 'txt', 'json'])
```
### Object Schema (Multiple Files)
```typescript
s3.object({
images: s3.image().max('5MB').count(5),
documents: s3.file().max('10MB').count(2),
thumbnail: s3.image().max('1MB').count(1),
})
```
## Route Configuration
### Middleware
Add authentication, validation, and metadata:
```typescript
.middleware(async ({ file, metadata, req }) => {
// Authentication
const user = await authenticateUser(req)
if (!user) {
throw new Error('Authentication required')
}
// File validation
if (file.size > 10 * 1024 * 1024) {
throw new Error('File too large')
}
// Return enriched metadata
return {
...metadata,
userId: user.id,
userRole: user.role,
uploadedAt: new Date().toISOString(),
ipAddress: req.headers.get('x-forwarded-for'),
}
})
```
### Path Configuration
Control where files are stored:
```typescript
.paths({
// Simple prefix
prefix: 'user-uploads',
// Custom path generation
generateKey: (ctx) => {
const { file, metadata, routeName } = ctx
const userId = metadata.userId
const timestamp = Date.now()
return `${routeName}/${userId}/${timestamp}/${file.name}`
},
// Simple suffix
suffix: 'processed',
})
```
### Lifecycle Hooks
React to upload events:
```typescript
.onUploadStart(async ({ file, metadata }) => {
console.log(`Starting upload: ${file.name}`)
// Log to analytics
await analytics.track('upload_started', {
userId: metadata.userId,
filename: file.name,
fileSize: file.size,
})
})
.onUploadComplete(async ({ file, url, metadata }) => {
console.log(`Upload complete: ${file.name} -> ${url}`)
// Save to database
await db.files.create({
filename: file.name,
url,
userId: metadata.userId,
size: file.size,
contentType: file.type,
uploadedAt: new Date(),
})
// Send notification
await notificationService.send({
userId: metadata.userId,
type: 'upload_complete',
message: `${file.name} uploaded successfully`,
})
})
.onUploadError(async ({ file, error, metadata }) => {
console.error(`Upload failed: ${file.name}`, error)
// Log error
await errorLogger.log({
operation: 'file_upload',
error: error.message,
userId: metadata.userId,
filename: file.name,
})
})
```
## Advanced Examples
### E-commerce Product Images
```typescript
const productRouter = s3.createRouter({
productImages: s3
.image()
.max('5MB')
.formats(['jpeg', 'jpg', 'png', 'webp'])
.dimensions({ minWidth: 800, maxWidth: 2000 })
.middleware(async ({ metadata, req }) => {
const user = await authenticateUser(req)
const productId = metadata.productId
// Verify user owns the product
const product = await db.products.findFirst({
where: { id: productId, ownerId: user.id }
})
if (!product) {
throw new Error('Product not found or access denied')
}
return {
...metadata,
userId: user.id,
productId,
productName: product.name,
}
})
.paths({
generateKey: (ctx) => {
const { metadata } = ctx
return `products/${metadata.productId}/images/${Date.now()}.jpg`
}
})
.onUploadComplete(async ({ url, metadata }) => {
// Update product with new image
await db.products.update({
where: { id: metadata.productId },
data: {
images: {
push: url
}
}
})
}),
productDocuments: s3
.file()
.max('10MB')
.types(['application/pdf'])
.paths({
prefix: 'product-docs',
})
.onUploadComplete(async ({ url, metadata }) => {
await db.productDocuments.create({
productId: metadata.productId,
documentUrl: url,
type: 'specification',
})
}),
})
```
### User Profile System
```typescript
const profileRouter = s3.createRouter({
avatar: s3
.image()
.max('2MB')
.formats(['jpeg', 'jpg', 'png'])
.dimensions({ minWidth: 100, maxWidth: 500 })
.middleware(async ({ req }) => {
const user = await authenticateUser(req)
return { userId: user.id, type: 'avatar' }
})
.paths({
generateKey: (ctx) => {
return `users/${ctx.metadata.userId}/avatar.jpg`
}
})
.onUploadComplete(async ({ url, metadata }) => {
// Update user profile
await db.users.update({
where: { id: metadata.userId },
data: { avatarUrl: url }
})
// Invalidate cache
await cache.del(`user:${metadata.userId}`)
}),
documents: s3
.object({
resume: s3.file().max('5MB').types(['application/pdf']).count(1),
portfolio: s3.file().max('10MB').count(3),
})
.middleware(async ({ req }) => {
const user = await authenticateUser(req)
return { userId: user.id }
})
.paths({
prefix: 'user-documents',
}),
})
```
## Client-Side Usage
Once you have your router set up, use it from the client:
```typescript
// components/FileUploader.tsx
import { useUploadRoute } from 'pushduck'
export function FileUploader() {
const { upload, isUploading } = useUploadRoute('imageUpload')
const handleUpload = async (files: FileList) => {
try {
const results = await upload(files, {
// This metadata will be passed to middleware
productId: 'product-123',
category: 'main-images',
})
console.log('Upload complete:', results)
} catch (error) {
console.error('Upload failed:', error)
}
}
  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
        disabled={isUploading}
      />
      {isUploading && <p>Uploading...</p>}
    </div>
)
}
```
## Type Safety
The router provides full TypeScript support:
```typescript
// Types are automatically inferred
type RouterType = typeof s3Router
// Get route names
type RouteNames = keyof RouterType // 'imageUpload' | 'documentUpload'
// Get route input types
type ImageUploadInput = InferRouteInput<RouterType, 'imageUpload'>
// Get route metadata types
type ImageUploadMetadata = InferRouteMetadata<RouterType, 'imageUpload'>
```
# Troubleshooting
URL: /docs/api/troubleshooting
Common issues and solutions for pushduck
import { Callout } from "fumadocs-ui/components/callout";
import { Tabs, Tab } from "fumadocs-ui/components/tabs";
Common issues and solutions when using pushduck.
## Development Issues
### Next.js Turbo Mode Compatibility
**Known Issue:** pushduck has compatibility issues with Next.js Turbo mode (`--turbo` flag).
**Problem:** Uploads fail or behave unexpectedly when using `next dev --turbo`.
**Solution:** Remove the `--turbo` flag from your development script:
```json
{
"scripts": {
    // ❌ This may cause issues
    "dev": "next dev --turbo",

    // ✅ Use this instead
    "dev": "next dev"
}
}
```
```bash
# ❌ This may cause issues
npm run dev --turbo

# ✅ Use this instead
npm run dev
```
**Why this happens:** Turbo mode's aggressive caching and bundling can interfere with the upload process, particularly with presigned URL generation and file streaming.
## Upload Failures
### CORS Errors
**Problem:** Browser console shows CORS errors when uploading files.
**Symptoms:**
```
Access to XMLHttpRequest at 'https://bucket.s3.amazonaws.com/...'
from origin 'http://localhost:3000' has been blocked by CORS policy
```
**Solution:** Configure CORS on your S3 bucket. See the [provider setup guides](/docs/providers) for detailed CORS configuration.
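If you manage the bucket yourself, the policy generally needs to allow `PUT`/`POST` from your app's origin and expose `ETag`. A sketch that applies such a policy with the AWS SDK (bucket name and origins are placeholders; S3-compatible providers like R2 or MinIO accept the same command via a custom `endpoint`):

```typescript
import { S3Client, PutBucketCorsCommand } from "@aws-sdk/client-s3";

const client = new S3Client({ region: "us-east-1" }); // or { endpoint: ... } for R2/MinIO

await client.send(
  new PutBucketCorsCommand({
    Bucket: "your-bucket-name",
    CORSConfiguration: {
      CORSRules: [
        {
          AllowedOrigins: ["http://localhost:3000", "https://your-app.com"],
          AllowedMethods: ["GET", "PUT", "POST"],
          AllowedHeaders: ["*"],
          ExposeHeaders: ["ETag"], // needed for multipart uploads
          MaxAgeSeconds: 3600,
        },
      ],
    },
  })
);
```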
### Environment Variables Not Found
**Problem:** Errors about missing environment variables.
**Symptoms:**
```
Error: Environment variable CLOUDFLARE_R2_ACCESS_KEY_ID is not defined
```
**Solution:** Ensure your environment variables are properly set:
1. **Check your `.env.local` file exists** in your project root
2. **Verify variable names** match exactly (case-sensitive)
3. **Restart your development server** after adding new variables
```bash
# .env.local
CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key
CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key
CLOUDFLARE_R2_ACCOUNT_ID=your_account_id
R2_BUCKET=your-bucket-name
```
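One way to catch this class of error early is a small startup check that fails with a clear message instead of a mid-upload runtime error. A minimal sketch (variable names follow the example above; adjust for your provider):

```typescript
// lib/env.ts - fail fast if required storage credentials are missing
const required = [
  "CLOUDFLARE_R2_ACCESS_KEY_ID",
  "CLOUDFLARE_R2_SECRET_ACCESS_KEY",
  "CLOUDFLARE_R2_ACCOUNT_ID",
  "R2_BUCKET",
] as const;

for (const name of required) {
  if (!process.env[name]) {
    throw new Error(
      `Missing environment variable: ${name} (check .env.local and restart the dev server)`
    );
  }
}
```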
### File Size Limits
**Problem:** Large files fail to upload.
**Solution:** Check and adjust size limits:
```typescript
// app/api/upload/route.ts
const uploadRouter = s3.createRouter({
imageUpload: s3
.image()
.max("10MB") // Increase as needed
.formats(["jpeg", "png", "webp"]),
});
```
## Type Errors
### TypeScript Inference Issues
**Problem:** TypeScript errors with upload client.
**Solution:** Ensure proper type exports:
```typescript
// app/api/upload/route.ts
export const { GET, POST } = uploadRouter.handlers;
export type AppRouter = typeof uploadRouter; // ✅ Export the type

// lib/upload-client.ts
import { createUploadClient } from "pushduck/client";
import type { AppRouter } from "@/app/api/upload/route";

export const upload = createUploadClient<AppRouter>({ // ✅ Use the type
  endpoint: "/api/upload",
});
```
## Performance Issues
### Slow Upload Speeds
**Problem:** Uploads are slower than expected.
**Solutions:**
1. **Choose the right provider region** close to your users
2. **Check your internet connection** and server resources
3. **Consider your provider's performance characteristics**
### Memory Issues with Large Files
**Problem:** Browser crashes or high memory usage with large files.
**Solution:** File streaming is handled automatically by pushduck:
```typescript
// File streaming is handled automatically
// No additional configuration needed
const { uploadFiles } = upload.fileUpload();
await uploadFiles(largeFiles); // ✅ Streams automatically
```
## Getting Help
If you're still experiencing issues:
1. **Check the documentation** for your specific provider
2. **Enable debug logging** by setting `NODE_ENV=development`
3. **Check browser console** for detailed error messages
4. **Verify your provider configuration** is correct
**Need more help?** Create an issue on [GitHub](https://github.com/abhay-ramesh/pushduck/issues) with detailed information about your setup and the error you're experiencing.
# Manual Setup
URL: /docs/getting-started/manual-setup
Step-by-step manual setup for developers who prefer full control over configuration
import { Step, Steps } from "fumadocs-ui/components/steps";
import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
## Prerequisites
* Next.js 13+ with App Router
* An S3-compatible storage provider (we recommend Cloudflare R2 for best performance and cost)
* Node.js 18+
## Install Pushduck
```bash
npm install pushduck
```
```bash
pnpm add pushduck
```
```bash
yarn add pushduck
```
```bash
bun add pushduck
```
## Set Environment Variables
Create a `.env.local` file in your project root with your storage credentials:
```dotenv title=".env.local"
# Cloudflare R2 Configuration
CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key
CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key
CLOUDFLARE_R2_ACCOUNT_ID=your_account_id
CLOUDFLARE_R2_BUCKET_NAME=your-bucket-name
```
**Don't have R2 credentials yet?** Follow our [Cloudflare R2 setup guide](/docs/providers/cloudflare-r2) to create a bucket and get your credentials in 2 minutes.
```dotenv title=".env.local"
# AWS S3 Configuration
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=us-east-1
AWS_S3_BUCKET_NAME=your-bucket-name
```
**Don't have S3 credentials yet?** Follow our [AWS S3 setup guide](/docs/providers/aws-s3) to create a bucket and get your credentials in 2 minutes.
## Configure Upload Settings
First, create your upload configuration:
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
// Configure your S3-compatible storage
export const { s3, storage } = createUploadConfig()
.provider("cloudflareR2", {
accessKeyId: process.env.CLOUDFLARE_R2_ACCESS_KEY_ID!,
secretAccessKey: process.env.CLOUDFLARE_R2_SECRET_ACCESS_KEY!,
region: "auto",
endpoint: `https://${process.env.CLOUDFLARE_R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
bucket: process.env.CLOUDFLARE_R2_BUCKET_NAME!,
accountId: process.env.CLOUDFLARE_R2_ACCOUNT_ID!,
})
.build();
```
## Create Your Upload Router
Create an API route to handle file uploads:
```typescript
// app/api/s3-upload/route.ts
import { s3 } from "@/lib/upload";
const s3Router = s3.createRouter({
// Define your upload routes with validation
imageUpload: s3
.image()
.max("10MB")
.formats(["jpg", "jpeg", "png", "webp"]),
documentUpload: s3.file().max("50MB").types(["application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document"]),
});
export const { GET, POST } = s3Router.handlers;
// Export the router type for client-side type safety
export type Router = typeof s3Router;
```
**What's happening here?**

* `s3.createRouter()` creates a type-safe upload handler
* `s3.image()` and `s3.file()` provide validation and TypeScript inference
* The router automatically handles presigned URLs, validation, and errors
* Exporting the type enables full client-side type safety
## Create Upload Client
Create a type-safe client for your components:
```typescript
// lib/upload-client.ts
import { createUploadClient } from "pushduck";
import type { Router } from "@/app/api/s3-upload/route";
// Create a type-safe upload client
export const upload = createUploadClient<Router>({
baseUrl: "/api/s3-upload",
});
// You can also export specific upload methods
export const { imageUpload, documentUpload } = upload;
```
## Use in Your Components
Now you can use the upload client in any component with full type safety:
```typescript
// components/image-uploader.tsx
"use client";
import { upload } from "@/lib/upload-client";

export function ImageUploader() {
  const { uploadFiles, uploadedFiles, isUploading, progress, error } =
    upload.imageUpload();

  const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    const files = e.target.files;
    if (files) {
      uploadFiles(Array.from(files));
    }
  };

  return (
    <div>
      <input type="file" accept="image/*" multiple onChange={handleFileChange} />

      {isUploading && <p>Uploading... {Math.round(progress)}%</p>}
      {error && <p>{error.message}</p>}

      {uploadedFiles.length > 0 && (
        <ul>
          {uploadedFiles.map((file) => (
            <li key={file.name}>{file.name}</li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
## Add to Your Page
Finally, use your upload component in any page:
```typescript
// app/page.tsx
import { ImageUploader } from "@/components/image-uploader";

export default function HomePage() {
  return (
    <main>
      <h1>Upload Images</h1>
      <ImageUploader />
    </main>
  );
}
```
## 🎉 Congratulations!
You now have **production-ready file uploads** working in your Next.js app! Here's what you accomplished:
* ✅ **Type-safe uploads** with full TypeScript inference
* ✅ **Automatic validation** for file types and sizes
* ✅ **Progress tracking** with loading states
* ✅ **Error handling** with user-friendly messages
* ✅ **Secure uploads** using presigned URLs
* ✅ **Multiple file support** with image preview
**Turbo Mode Issue:** If you're using `next dev --turbo` and experiencing upload issues, try removing the `--turbo` flag from your dev script. There's a known compatibility issue with Turbo mode that can affect file uploads.
## What's Next?
Now that you have the basics working, explore these advanced features:
{" "}
{" "}
โ๏ธ Other Providers
Try Cloudflare R2 for better performance, or AWS S3, DigitalOcean, MinIO
Provider Setup โ
## Need Help?
* 📚 **Documentation**: Explore our comprehensive [guides](/docs/guides)
* 💬 **Community**: Join our [Discord community](https://discord.gg/pushduck)
* 🐛 **Issues**: Report bugs on [GitHub](https://github.com/abhay-ramesh/pushduck)
* 📧 **Support**: Email us at [support@pushduck.com](mailto:support@pushduck.com)

**Loving Pushduck?** Give us a ⭐ on
[GitHub](https://github.com/abhay-ramesh/pushduck) and help spread the
word!
# Quick Start
URL: /docs/getting-started/quick-start
Get file uploads working in your Next.js app in under 2 minutes with our CLI
import { Step, Steps } from "fumadocs-ui/components/steps";
import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
Get **production-ready file uploads** working in your Next.js app in under 2 minutes with our CLI tool. Interactive setup, just one command.
**🚀 New!** Use our CLI for instant setup: `npx @pushduck/cli@latest init` - handles everything automatically!
### ⚡ Interactive CLI Setup
Get everything set up instantly with our interactive CLI:
```bash
npx @pushduck/cli@latest init
```
```bash
pnpm dlx pushduck init
```
```bash
yarn dlx pushduck init
```
```bash
bunx pushduck init
```
That's it! The CLI will:
* ✅ **Auto-detect your package manager** (npm, pnpm, yarn, bun)
* ✅ Install dependencies using your preferred package manager
* ✅ Set up your chosen provider (Cloudflare R2, AWS S3, etc.)
* ✅ Create API routes with type safety
* ✅ Generate example components
* ✅ Configure environment variables
* ✅ Create and configure your storage bucket
**What you get:**
* Production-ready upload API in `app/api/upload/route.ts`
* Type-safe upload client in `lib/upload-client.ts`
* Example components in `components/ui/`
* Working demo page in `app/upload/page.tsx`
[**📚 Full CLI Documentation →**](/docs/api/cli/cli-setup)
**Example CLI Output:**
```
┌──────────────────────────────────────────────────────────────┐
│                                                              │
│   🚀 Welcome to Pushduck                                     │
│                                                              │
│   Let's get your file uploads working in 2 minutes!          │
│                                                              │
└──────────────────────────────────────────────────────────────┘

🔍 Detecting your project...
✓ Next.js App Router detected
✓ TypeScript configuration found
✓ Package manager: pnpm detected
✓ No existing upload configuration
✓ Project structure validated

? Which cloud storage provider would you like to use?
❯ Cloudflare R2 (recommended)
  AWS S3 (classic, widely supported)
  DigitalOcean Spaces (simple, affordable)
  Google Cloud Storage (enterprise-grade)
  MinIO (self-hosted, open source)
  Custom S3-compatible endpoint

✨ Generated files:
├── app/api/upload/route.ts
├── app/upload/page.tsx
├── components/ui/upload-button.tsx
├── lib/upload-client.ts
└── .env.example

📦 Installing dependencies with pnpm...
✓ pushduck
✓ @aws-sdk/client-s3
✓ react-dropzone

🎉 Setup complete! Your uploads are ready.
```
**Turbo Mode Issue:** If you're using `next dev --turbo` and experiencing upload issues, try removing the `--turbo` flag. There's a known compatibility issue with Turbo mode that can affect file uploads.
***
## Step-by-Step CLI Walkthrough
Here's exactly what happens when you run the CLI and how to make the best choices:
### Project Detection & Validation
```
🔍 Detecting your project...
✓ Next.js App Router detected
✓ TypeScript configuration found
✓ Package manager: pnpm detected
✓ No existing upload configuration
✓ Project structure validated
```
**What's happening:** The CLI automatically detects your project setup to ensure compatibility.
**If you see warnings:**
* ⚠️ **Pages Router detected**: Still works, but examples will be for App Router
* ⚠️ **No TypeScript**: JavaScript examples will be generated instead
* ⚠️ **Existing configuration**: CLI will ask if you want to overwrite
* ⚠️ **Package manager not detected**: Will default to npm
### Provider Selection
```
? Which cloud storage provider would you like to use?
❯ Cloudflare R2 (recommended)
  AWS S3 (classic, widely supported)
  DigitalOcean Spaces (simple, affordable)
  Google Cloud Storage (enterprise-grade)
  MinIO (self-hosted, open source)
  Custom S3-compatible endpoint
```
**How to choose:**
**Choose: Cloudflare R2 (recommended)**
* Zero egress fees (bandwidth is FREE)
* Global edge network with 200+ locations
* Simple setup with excellent documentation
* Best performance for most applications
**Choose: Cloudflare R2**
* No egress fees (bandwidth is free)
* $0.015/GB storage (cheaper than S3)
* Global edge network included
* Perfect for high-traffic applications
**Choose: Cloudflare R2**
* Global edge network with 200+ locations
* Automatic geographic distribution
* Faster uploads worldwide
* Built-in CDN functionality
**Choose: Google Cloud Storage**
* Enterprise-grade security and compliance
* Advanced analytics and monitoring
* Integration with Google Cloud ecosystem
* Multi-region redundancy options
**Use arrow keys** to navigate, **Enter** to select.
### Credential Detection & Setup
```
🔧 Setting up Cloudflare R2...

🔍 Checking for existing credentials...
✓ Found CLOUDFLARE_R2_ACCESS_KEY_ID
✓ Found CLOUDFLARE_R2_SECRET_ACCESS_KEY
✓ Found CLOUDFLARE_R2_ACCOUNT_ID
✗ CLOUDFLARE_R2_BUCKET_NAME not found
```
**What this means:**
* ✅ **Found credentials**: CLI detected existing environment variables
* ⚠️ **Missing credentials**: You'll be prompted to enter them
* ❌ **No credentials**: CLI will guide you through setup
**If prompted for credentials:**
```
? Enter your Cloudflare R2 Access Key ID: f1d2...
? Enter your Cloudflare R2 Secret Access Key: [hidden]
? Enter your Cloudflare Account ID: abc123...
? Enter your R2 bucket name: my-app-uploads-2024
```
**Pro tips:**
* Use a **unique bucket name** (globally unique across all R2)
* **Account ID** can be found in your Cloudflare dashboard
* **Don't have credentials?** Check our [Cloudflare R2 setup guide](/docs/providers/cloudflare-r2)
### Bucket Creation
```
? Create bucket automatically? (Y/n)
```
**Recommended: Yes** - The CLI will:
* Create the bucket with proper permissions
* Set up CORS configuration for web uploads
* Configure public read access for uploaded files
* Test the connection to ensure everything works
**Choose "No" if:**
* You already have a bucket configured
* Your organization requires manual bucket creation
* You need custom bucket policies
**Success looks like:**
```
✅ Created R2 bucket: my-app-uploads-2024
✅ Configured CORS for web uploads
✅ Set up public read permissions
✅ Connection test successful
```
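If you chose "No" and configured the bucket manually, you can verify that your credentials reach it with a short script. A sketch using the AWS SDK against R2 (the endpoint form follows the manual setup guide; env var names are the ones above):

```typescript
import { S3Client, HeadBucketCommand } from "@aws-sdk/client-s3";

const client = new S3Client({
  region: "auto",
  endpoint: `https://${process.env.CLOUDFLARE_R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.CLOUDFLARE_R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.CLOUDFLARE_R2_SECRET_ACCESS_KEY!,
  },
});

// Throws if the bucket doesn't exist or the credentials can't reach it.
await client.send(
  new HeadBucketCommand({ Bucket: process.env.CLOUDFLARE_R2_BUCKET_NAME! })
);
console.log("Bucket reachable with these credentials");
```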
### API Route Configuration
```
? Where should we create the upload API?
❯ app/api/upload/route.ts (recommended)
  app/api/s3-upload/route.ts (classic)
  Custom path
```
**Recommendations:**
* **`/api/upload`**: Clean, modern route name
* **`/api/s3-upload`**: If you want to be explicit about S3
* **Custom path**: If you have specific routing requirements
**The CLI will create:**
* Type-safe API route with validation
* Authentication middleware (ready to customize)
* Support for multiple upload types (images, documents)
* Proper error handling and responses
### Component Generation
```
? Generate example upload page?
❯ Yes, create app/upload/page.tsx with full example
  Yes, just add components to components/ui/
  No, I'll build my own
```
**Choose based on your needs:**
**Full example page** - Best for:
* Learning how the library works
* Quick prototyping and testing
* Getting a working demo immediately
**Components only** - Best for:
* Adding uploads to existing pages
* Custom UI integration
* Building your own demo
**No components** - Best for:
* Experienced developers
* Custom implementation requirements
* API-only usage
### File Generation & Installation
```
🛠️ Generating files...
✨ Created files:
├── app/api/upload/route.ts           # Type-safe API endpoint
├── app/upload/page.tsx               # Demo upload page
├── components/ui/upload-button.tsx   # Simple upload button
├── components/ui/upload-dropzone.tsx # Drag & drop component
├── lib/upload-client.ts              # Type-safe client
└── .env.example                      # Environment template
📦 Installing dependencies with pnpm...
✓ pushduck
✓ @aws-sdk/client-s3
✓ react-dropzone
🎉 Setup complete! Your uploads are ready.
```
**What happens:**
1. **Files generated** with your specific configuration
2. **Dependencies installed** using your detected package manager
3. **Types generated** for full TypeScript support
4. **Environment configured** with your settings
**Package Manager Support:**
* **npm**: `npm install` commands
* **pnpm**: `pnpm add` commands
* **yarn**: `yarn add` commands
* **bun**: `bun add` commands
**Next steps shown:**
```
🎉 Next steps:
1. Start your dev server: pnpm dev
2. Visit: http://localhost:3000/upload
3. Try uploading a file!
📚 Learn more:
• API Reference: /docs/api
• Providers: /docs/providers
• Examples: /docs/examples
```
***
## Common CLI Scenarios
### First-Time Setup (Recommended)
```bash
npx @pushduck/cli@latest init
# Follow prompts, choose Cloudflare R2, let CLI create bucket
```
```bash
pnpm dlx @pushduck/cli@latest init
# Follow prompts, choose Cloudflare R2, let CLI create bucket
```
```bash
yarn dlx @pushduck/cli@latest init
# Follow prompts, choose Cloudflare R2, let CLI create bucket
```
```bash
bunx @pushduck/cli@latest init
# Follow prompts, choose Cloudflare R2, let CLI create bucket
```
### Quick Cloudflare R2 Setup
```bash
# Works with any package manager
npx @pushduck/cli@latest init --provider cloudflare-r2
pnpm dlx @pushduck/cli@latest init --provider cloudflare-r2
yarn dlx @pushduck/cli@latest init --provider cloudflare-r2
bunx @pushduck/cli@latest init --provider cloudflare-r2
```
### AWS S3 Setup
```bash
# Skip provider selection, go straight to AWS S3
npx @pushduck/cli@latest init --provider aws
# Use AWS S3 for existing AWS infrastructure
```
### Components Only
```bash
# Generate API and components, no demo page
npx @pushduck/cli@latest init --skip-examples
```
### Preview Mode
```bash
# See what would be created without making changes
npx @pushduck/cli@latest init --dry-run
```
***
**Need help?** The CLI includes built-in help: `npx @pushduck/cli@latest --help`. For detailed documentation, see our [complete CLI reference](/docs/api/cli/cli-setup).
# Client-Side Approaches
URL: /docs/guides/client-approaches
Compare the structured client vs hook-based approaches for file uploads
***
title: Client-Side Approaches
description: Compare the structured client vs hook-based approaches for file uploads
------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Client-Side Approaches
Pushduck provides **two ways** to integrate file uploads in your React components. Both approaches now provide **identical functionality** including per-route callbacks, progress tracking, and error handling.
**Recommendation**: Use the **Enhanced Structured Client** approach for the best developer experience. It now provides the same flexibility as hooks while maintaining superior type safety and centralized configuration.
## Quick Comparison
```typescript
const upload = createUploadClient({
endpoint: '/api/upload'
})
// Simple usage
const { uploadFiles, files } = upload.imageUpload()
// With per-route callbacks (NEW!)
const { uploadFiles, files } = upload.imageUpload({
onSuccess: (results) => handleSuccess(results),
onError: (error) => handleError(error),
onProgress: (progress) => setProgress(progress)
})
```
**Best for**: Most projects - provides superior DX, type safety, and full flexibility
```typescript
const { uploadFiles, files } = useUploadRoute('imageUpload', {
onSuccess: (results) => handleSuccess(results),
onError: (error) => handleError(error),
onProgress: (progress) => setProgress(progress)
})
```
**Best for**: Teams that strongly prefer React hooks, legacy code migration
## Feature Parity
Both approaches now support **identical functionality**:
| Feature             | Enhanced Structured Client        | Hook-Based                   |
| ------------------- | --------------------------------- | ---------------------------- |
| Type Safety         | **Superior** - Property-based     | Good - Generic types         |
| Per-route Callbacks | **✅ Full support**                | ✅ Full support               |
| Progress Tracking   | **✅ Full support**                | ✅ Full support               |
| Error Handling      | **✅ Full support**                | ✅ Full support               |
| Multiple Endpoints  | **✅ Per-route endpoints**         | ✅ Per-route endpoints        |
| Upload Control      | **✅ Enable/disable uploads**      | ✅ Enable/disable uploads     |
| Auto-upload         | **✅ Per-route control**           | ✅ Per-route control          |
| Overall Progress    | **✅ progress, uploadSpeed, eta**  | ✅ progress, uploadSpeed, eta |
## API Comparison: Identical Capabilities
Both approaches now return **exactly the same** properties and accept **exactly the same** configuration options:
```typescript
// Hook-Based Approach
const {
uploadFiles, // (files: File[]) => Promise
files, // S3UploadedFile[]
isUploading, // boolean
errors, // string[]
reset, // () => void
progress, // number (0-100) - overall progress
uploadSpeed, // number (bytes/sec) - overall speed
eta // number (seconds) - overall ETA
} = useUploadRoute('imageUpload', {
onSuccess: (results) => handleSuccess(results),
onError: (error) => handleError(error),
onProgress: (progress) => setProgress(progress),
endpoint: '/api/custom-upload',
disabled: false,
autoUpload: true
});
// Enhanced Structured Client - IDENTICAL capabilities
const {
uploadFiles, // (files: File[]) => Promise
files, // S3UploadedFile[]
isUploading, // boolean
errors, // string[]
reset, // () => void
progress, // number (0-100) - overall progress
uploadSpeed, // number (bytes/sec) - overall speed
eta // number (seconds) - overall ETA
} = upload.imageUpload({
onSuccess: (results) => handleSuccess(results),
onError: (error) => handleError(error),
onProgress: (progress) => setProgress(progress),
endpoint: '/api/custom-upload',
disabled: false,
autoUpload: true
});
```
## Complete Options Parity
Both approaches support **identical configuration options**:
```typescript
interface CommonUploadOptions {
onSuccess?: (results: UploadResult[]) => void;
onError?: (error: Error) => void;
onProgress?: (progress: number) => void;
endpoint?: string; // Custom endpoint per route
disabled?: boolean; // Enable/disable uploads
autoUpload?: boolean; // Auto-upload when files selected
}
// Hook-based: useUploadRoute(routeName, options)
// Structured: upload.routeName(options)
// Both accept the same CommonUploadOptions interface
```
## Return Value Parity
Both approaches return **identical properties**:
```typescript
interface CommonUploadReturn {
uploadFiles: (files: File[]) => Promise;
files: S3UploadedFile[];
isUploading: boolean;
errors: string[];
reset: () => void;
// Overall progress tracking (NEW in both!)
progress?: number; // 0-100 percentage across all files
uploadSpeed?: number; // bytes per second across all files
eta?: number; // seconds remaining for all files
}
```
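For instance, a component can surface these overall fields directly. A minimal sketch, assuming `upload` is a structured client like the one created in the examples below (the formatting is illustrative):
```typescript
export function UploadStats() {
  const { isUploading, progress, uploadSpeed, eta } = upload.imageUpload()

  if (!isUploading) return null

  return (
    <p>
      {progress ?? 0}% uploaded at{' '}
      {((uploadSpeed ?? 0) / 1024 / 1024).toFixed(1)} MB/s
      (about {eta ?? 0}s remaining)
    </p>
  )
}
```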
## Enhanced Structured Client Examples
### Basic Usage (Unchanged)
```typescript
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from '@/lib/upload'
const upload = createUploadClient<AppRouter>({ endpoint: '/api/upload' })
export function SimpleUpload() {
  const { uploadFiles, files, isUploading } = upload.imageUpload()
  return (
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
  )
}
```
### With Per-Route Configuration (NEW!)
```typescript
export function AdvancedUpload() {
  const [progress, setProgress] = useState(0)
  const { uploadFiles, files, isUploading, errors, reset } =
    upload.imageUpload({
      onSuccess: (results) => {
        console.log('✅ Upload successful!', results)
        showNotification('Images uploaded successfully!')
      },
      onError: (error) => {
        console.error('❌ Upload failed:', error)
        showErrorNotification(error.message)
      },
      onProgress: (progress) => {
        console.log(`📊 Progress: ${progress}%`)
        setProgress(progress)
      }
    })
  return (
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
  )
}
```
### Multiple Routes with Different Configurations
```typescript
export function MultiUploadComponent() {
// Images with progress tracking
const images = upload.imageUpload({
onProgress: (progress) => setImageProgress(progress)
})
// Documents with different endpoint and success handler
const documents = upload.documentUpload({
endpoint: '/api/secure-upload',
onSuccess: (results) => updateDocumentLibrary(results)
})
// Videos with upload disabled (feature flag)
const videos = upload.videoUpload({
disabled: !isVideoUploadEnabled
})
  return (
    <div>
      {/* Render the images, documents, and videos upload sections */}
    </div>
  )
}
```
### Global Configuration with Per-Route Overrides
```typescript
const upload = createUploadClient({
endpoint: '/api/upload',
// Global defaults (optional)
defaultOptions: {
onProgress: (progress) => console.log(`Global progress: ${progress}%`),
onError: (error) => logError(error)
}
})
// This route inherits global defaults
const basic = upload.imageUpload()
// This route overrides specific options
const custom = upload.documentUpload({
endpoint: '/api/secure-upload', // Override endpoint
onSuccess: (results) => handleSecureUpload(results) // Add success handler
// Still inherits global onProgress and onError
})
```
## Hook-Based Approach (Unchanged)
```typescript
import { useUploadRoute } from 'pushduck/client'
export function HookBasedUpload() {
  const { uploadFiles, files, isUploading, errors } = useUploadRoute('imageUpload', {
    onSuccess: (results) => console.log('Success:', results),
    onError: (error) => console.error('Error:', error),
    onProgress: (progress) => console.log('Progress:', progress)
  })
  return (
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
  )
}
```
## Migration Guide
### From Hook-Based to Enhanced Structured Client
```typescript
// Before: Hook-based
const { uploadFiles, files } = useUploadRoute('imageUpload', {
onSuccess: handleSuccess,
onError: handleError
})
// After: Enhanced structured client
const upload = createUploadClient({ endpoint: '/api/upload' })
const { uploadFiles, files } = upload.imageUpload({
onSuccess: handleSuccess,
onError: handleError
})
```
### Benefits of Migration
1. **Better Type Safety**: Route names are validated at compile time
2. **Enhanced IntelliSense**: Auto-completion for all available routes
3. **Centralized Configuration**: Single place to configure endpoints and defaults
4. **Refactoring Support**: Rename routes safely across your codebase
5. **No Performance Impact**: Same underlying implementation
## When to Use Each Approach
### Use Enhanced Structured Client When:
* ✅ **Starting a new project** - best overall developer experience
* ✅ **Want superior type safety** - compile-time route validation
* ✅ **Need centralized configuration** - single place for settings
* ✅ **Value refactoring support** - safe route renames
### Use Hook-Based When:
* ✅ **Migrating existing code** - minimal changes required
* ✅ **Dynamic route names** - routes determined at runtime
* ✅ **Team strongly prefers hooks** - familiar React patterns
* ✅ **Legacy compatibility** - maintaining older codebases
## Performance Considerations
Both approaches have **identical performance** characteristics:
* Same underlying `useUploadRoute` implementation
* Same network requests and upload logic
* Same React hooks rules and lifecycle
The enhanced structured client adds zero runtime overhead while providing compile-time benefits.
***
**Full Feature Parity**: Both approaches now support the same functionality. The choice comes down to developer experience preferences rather than feature limitations.
## Detailed Comparison
### Type Safety & Developer Experience
```typescript
// ✅ Complete type inference from server router
const upload = createUploadClient<AppRouter>({
  endpoint: '/api/upload'
})
// ✅ Property-based access - no string literals
const { uploadFiles, files } = upload.imageUpload()
// ✅ IntelliSense shows all available endpoints
upload. // <- Shows: imageUpload, documentUpload, videoUpload...
// ✅ Compile-time validation
upload.nonExistentRoute() // ❌ TypeScript error
// ✅ Refactoring safety
// Rename routes in router → TypeScript shows all usage locations
```
**Benefits:**
* 🎯 **Full type inference** from server to client
* 🔍 **IntelliSense support** - discover endpoints through IDE
* 🛡️ **Refactoring safety** - rename with confidence
* 🚫 **No string literals** - eliminates typos
* ⚡ **Better DX** - property-based access feels natural
```typescript
// ✅ With type parameter - recommended for better type safety
const { uploadFiles, files } = useUploadRoute<AppRouter>('imageUpload')
// ✅ Without type parameter - also works
const { uploadFiles, files } = useUploadRoute('imageUpload')
// Type parameter provides compile-time validation
const typed = useUploadRoute<AppRouter>('imageUpload') // Route validated
const untyped = useUploadRoute('imageUpload') // Any string accepted
```
**Characteristics:**
* 🪝 **React hook pattern** - familiar to React developers
* 🤝 **Flexible usage** - works with or without type parameter
* 🧩 **Component-level state** - each hook manages its own state
* 🎯 **Type safety** - enhanced when using `useUploadRoute<AppRouter>`
* 📝 **IDE support** - best with type parameter
### Code Examples
**Structured Client:**
```typescript
import { upload } from '@/lib/upload-client'
export function ImageUploader() {
const { uploadFiles, files, isUploading, error } = upload.imageUpload()
  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {/* Upload UI */}
    </div>
  )
}
```
**Hook-Based:**
```typescript
import { useUploadRoute } from 'pushduck/client'
export function ImageUploader() {
const { uploadFiles, files, isUploading, error } = useUploadRoute('imageUpload')
  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {/* Same upload UI */}
    </div>
  )
}
```
**Structured Client:**
```typescript
export function FileManager() {
const images = upload.imageUpload()
const documents = upload.documentUpload()
const videos = upload.videoUpload()
  return (
    <div>
      {/* Render the images, documents, and videos upload sections */}
    </div>
  )
}
```
**Hook-Based:**
```typescript
export function FileManager() {
const images = useUploadRoute('imageUpload')
const documents = useUploadRoute('documentUpload')
const videos = useUploadRoute('videoUpload')
  return (
    <div>
      {/* Same three upload sections, one hook per route */}
    </div>
  )
}
```
**Structured Client:**
```typescript
// lib/upload-client.ts
export const upload = createUploadClient({
endpoint: '/api/upload',
headers: {
Authorization: `Bearer ${getAuthToken()}`
}
})
// components/secure-uploader.tsx
export function SecureUploader() {
const { uploadFiles } = upload.secureUpload()
// Authentication handled globally
}
```
**Hook-Based:**
```typescript
export function SecureUploader() {
const { uploadFiles } = useUploadRoute('secureUpload', {
headers: {
Authorization: `Bearer ${getAuthToken()}`
}
})
// Authentication per hook usage
}
```
## Conclusion
**Our Recommendation**: Use the **Enhanced Structured Client** approach (`createUploadClient`) for most projects. It provides superior developer experience, better refactoring safety, and enhanced type inference.
**Both approaches are supported**: The hook-based approach (`useUploadRoute`) is fully supported and valid for teams that prefer traditional React patterns.
**Quick Decision Guide:**
* **Most projects** → Use `createUploadClient` (recommended)
* **Strongly prefer React hooks** → Use `useUploadRoute`
* **Want best DX and type safety** → Use `createUploadClient`
* **Need component-level control** → Use `useUploadRoute`
### Next Steps
* **New Project**: Start with [createUploadClient](/docs/api/utilities/create-upload-client)
* **Existing Hook Code**: Consider [migrating gradually](/docs/guides/migration/enhanced-client)
* **Need Help**: Join our [Discord community](https://discord.gg/pushduck) for guidance
# Astro
URL: /docs/integrations/astro
Modern static site file uploads with Astro using Web Standards - no adapter needed!
***
title: Astro
description: Modern static site file uploads with Astro using Web Standards - no adapter needed!
------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Astro Integration
Astro is a modern web framework for building fast, content-focused websites with islands architecture. It uses Web Standards APIs and provides excellent performance with minimal JavaScript. Since Astro uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Astro API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: import.meta.env.AWS_ENDPOINT_URL!,
bucket: import.meta.env.S3_BUCKET_NAME!,
accountId: import.meta.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="src/pages/api/upload/[...path].ts"
import type { APIRoute } from 'astro';
import { uploadRouter } from '../../../lib/upload';
// Direct usage - no adapter needed!
export const ALL: APIRoute = async ({ request }) => {
return uploadRouter.handlers(request);
};
```
## Basic Integration
### Simple Upload Route
```typescript title="src/pages/api/upload/[...path].ts"
import type { APIRoute } from 'astro';
import { uploadRouter } from '../../../lib/upload';
// Method 1: Combined handler (recommended)
export const ALL: APIRoute = async ({ request }) => {
return uploadRouter.handlers(request);
};
// Method 2: Separate handlers (if you need method-specific logic)
export const GET: APIRoute = async ({ request }) => {
return uploadRouter.handlers.GET(request);
};
export const POST: APIRoute = async ({ request }) => {
return uploadRouter.handlers.POST(request);
};
```
### With CORS Support
```typescript title="src/pages/api/upload/[...path].ts"
import type { APIRoute } from 'astro';
import { uploadRouter } from '../../../lib/upload';
export const ALL: APIRoute = async ({ request }) => {
// Handle CORS preflight
if (request.method === 'OPTIONS') {
return new Response(null, {
status: 200,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type',
},
});
}
const response = await uploadRouter.handlers(request);
// Add CORS headers to actual response
response.headers.set('Access-Control-Allow-Origin', '*');
return response;
};
```
## Advanced Configuration
### Authentication with Astro
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: import.meta.env.AWS_ENDPOINT_URL!,
bucket: import.meta.env.S3_BUCKET_NAME!,
accountId: import.meta.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with cookie-based authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookies = req.headers.get('Cookie');
const sessionId = parseCookie(cookies)?.sessionId;
if (!sessionId) {
throw new Error('Authentication required');
}
const user = await getUserFromSession(sessionId);
if (!user) {
throw new Error('Invalid session');
}
return {
userId: user.id,
username: user.username,
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
// Helper functions
function parseCookie(cookieString: string | null) {
if (!cookieString) return {};
return Object.fromEntries(
cookieString.split('; ').map(c => {
const [key, ...v] = c.split('=');
return [key, v.join('=')];
})
);
}
async function getUserFromSession(sessionId: string) {
// Implement your session validation logic
// This could connect to a database, Redis, etc.
return { id: 'user-123', username: 'demo-user' };
}
```
## Client-Side Usage
### Upload Component (React)
```tsx title="src/components/FileUpload.tsx"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "../lib/upload";
const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
  endpoint: "/api/upload",
});
export default function FileUpload() {
  function handleUploadComplete(files: any[]) {
    console.log("Files uploaded:", files);
    alert("Upload completed!");
  }
  function handleUploadError(error: Error) {
    console.error("Upload error:", error);
    alert(`Upload failed: ${error.message}`);
  }
  return (
    <div>
      <h2>Image Upload</h2>
      {/* Prop names are illustrative - wire the handlers to your components */}
      <UploadButton
        route="imageUpload"
        onUploadComplete={handleUploadComplete}
        onUploadError={handleUploadError}
      />
      <h2>Document Upload</h2>
      <UploadDropzone
        route="documentUpload"
        onUploadComplete={handleUploadComplete}
        onUploadError={handleUploadError}
      />
    </div>
  );
}
```
### Upload Component (Vue)
```vue title="src/components/FileUpload.vue"
<template>
  <div>
    <h2>Image Upload</h2>
    <input type="file" multiple accept="image/*" @change="(e) => handleUpload(e, 'imageUpload')" />
    <h2>Document Upload</h2>
    <input type="file" multiple @change="(e) => handleUpload(e, 'documentUpload')" />
  </div>
</template>

<script setup lang="ts">
// Minimal sketch: wire the inputs to your upload client or POST to /api/upload
function handleUpload(event: Event, route: string) {
  const input = event.target as HTMLInputElement;
  console.log(`Selected ${input.files?.length ?? 0} file(s) for ${route}`);
}
</script>
```
### Using in Astro Pages
```astro title="src/pages/index.astro"
---
// Server-side code (runs at build time)
import FileUpload from '../components/FileUpload';
---
<html lang="en">
  <head>
    <title>File Upload Demo</title>
  </head>
  <body>
    <h1>File Upload Demo</h1>
    <FileUpload client:load />
  </body>
</html>
```
## File Management
### Server-Side File API
```typescript title="src/pages/api/files.ts"
import type { APIRoute } from 'astro';
export const GET: APIRoute = async ({ request, url }) => {
const searchParams = url.searchParams;
const userId = searchParams.get('userId');
if (!userId) {
return new Response(JSON.stringify({ error: 'User ID required' }), {
status: 400,
headers: { 'Content-Type': 'application/json' }
});
}
// Fetch files from database
const files = await getFilesForUser(userId);
return new Response(JSON.stringify({
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
}), {
headers: { 'Content-Type': 'application/json' }
});
};
async function getFilesForUser(userId: string) {
// Implement your database query logic
return [];
}
```
### File Management Page
```astro title="src/pages/files.astro"
---
// This runs on the server at build time or request time
const files = await fetch(`${Astro.url.origin}/api/files?userId=current-user`)
  .then(res => res.json())
  .catch(() => ({ files: [] }));

// Small helper for human-readable sizes
function formatFileSize(bytes: number) {
  return bytes < 1024 * 1024
    ? `${(bytes / 1024).toFixed(1)} KB`
    : `${(bytes / 1024 / 1024).toFixed(1)} MB`;
}
---
<html lang="en">
  <head>
    <title>My Files</title>
  </head>
  <body>
    <h1>My Files</h1>
    <h2>Uploaded Files</h2>
    {files.files.length === 0 ? (
      <p>No files uploaded yet.</p>
    ) : (
      <ul>
        {files.files.map((file: any) => (
          <li>
            <strong>{file.name}</strong>
            <span>{formatFileSize(file.size)}</span>
            <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
            <a href={file.url}>View File</a>
          </li>
        ))}
      </ul>
    )}
  </body>
</html>
```
## Deployment Options
```javascript title="astro.config.mjs"
import { defineConfig } from 'astro/config';
import vercel from '@astrojs/vercel/serverless';
export default defineConfig({
output: 'server',
adapter: vercel({
runtime: 'nodejs18.x',
}),
});
```
```javascript title="astro.config.mjs"
import { defineConfig } from 'astro/config';
import netlify from '@astrojs/netlify/functions';
export default defineConfig({
output: 'server',
adapter: netlify(),
});
```
```javascript title="astro.config.mjs"
import { defineConfig } from 'astro/config';
import node from '@astrojs/node';
export default defineConfig({
output: 'server',
adapter: node({
mode: 'standalone',
}),
});
```
```javascript title="astro.config.mjs"
import { defineConfig } from 'astro/config';
import cloudflare from '@astrojs/cloudflare';
export default defineConfig({
output: 'server',
adapter: cloudflare(),
});
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (matches the keys read in src/lib/upload.ts)
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_ENDPOINT_URL=https://your-account.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-cloudflare-account-id
# Astro
PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload
```
## Real-Time Upload Progress
```tsx title="src/components/AdvancedUpload.tsx"
import { useState } from 'react';
export default function AdvancedUpload() {
const [uploadProgress, setUploadProgress] = useState(0);
const [isUploading, setIsUploading] = useState(false);
  async function handleFileUpload(event: React.ChangeEvent<HTMLInputElement>) {
const files = event.target.files;
if (!files || files.length === 0) return;
setIsUploading(true);
setUploadProgress(0);
try {
// Simulate upload progress
for (let i = 0; i <= 100; i += 10) {
setUploadProgress(i);
await new Promise(resolve => setTimeout(resolve, 100));
}
alert('Upload completed!');
} catch (error) {
console.error('Upload failed:', error);
alert('Upload failed!');
} finally {
setIsUploading(false);
setUploadProgress(0);
}
}
  return (
    <div>
      <input type="file" onChange={handleFileUpload} disabled={isUploading} />
      {isUploading && (
        <p>{uploadProgress}% uploaded</p>
      )}
    </div>
  );
}
```
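The simulated loop above only demonstrates UI state. With pushduck's structured client you can drive the same state from real upload progress; a minimal sketch, assuming the `createUploadClient` API shown in the client approaches guide:
```tsx title="src/components/RealProgressUpload.tsx"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '../lib/upload';

const upload = createUploadClient<AppUploadRouter>({ endpoint: '/api/upload' });

export function RealProgressUpload() {
  // progress is the overall percentage across all selected files
  const { uploadFiles, isUploading, progress } = upload.imageUpload({
    onProgress: (p) => console.log(`upload at ${p}%`),
  });

  return (
    <div>
      <input
        type="file"
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {isUploading && <p>{progress ?? 0}% uploaded</p>}
    </div>
  );
}
```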
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `src/pages/api/upload/[...path].ts`
2. **Build errors**: Check that pushduck is properly installed and configured
3. **Environment variables**: Use `import.meta.env` instead of `process.env`
4. **Client components**: Remember to add `client:load` directive for interactive components
### Debug Mode
Enable debug logging:
```typescript title="src/lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (import.meta.env.DEV) {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Astro Configuration
```javascript title="astro.config.mjs"
import { defineConfig } from 'astro/config';
import react from '@astrojs/react';
import vue from '@astrojs/vue';
export default defineConfig({
integrations: [
react(), // For React components
vue(), // For Vue components
],
output: 'server', // Required for API routes
vite: {
define: {
// Make environment variables available
'import.meta.env.AWS_ACCESS_KEY_ID': JSON.stringify(process.env.AWS_ACCESS_KEY_ID),
}
}
});
```
Astro provides an excellent foundation for building fast, content-focused websites with pushduck, combining the power of islands architecture with Web Standards APIs for optimal performance and developer experience.
# Bun Runtime
URL: /docs/integrations/bun
Ultra-fast JavaScript runtime with native Web Standards support - no adapter needed!
***
title: Bun Runtime
description: Ultra-fast JavaScript runtime with native Web Standards support - no adapter needed!
-------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Bun Runtime
Bun is an ultra-fast JavaScript runtime with native Web Standards support. Since Bun uses Web Standard `Request` and `Response` objects natively, pushduck handlers work directly without any adapters!
**Web Standards Native**: Bun's `Bun.serve()` uses Web Standard `Request` objects directly, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
bun add pushduck
```
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create Bun server with upload routes**
```typescript title="server.ts"
import { uploadRouter } from './lib/upload';
// Direct usage - no adapter needed!
Bun.serve({
port: 3000,
fetch(request) {
const url = new URL(request.url);
if (url.pathname.startsWith('/api/upload/')) {
return uploadRouter.handlers(request);
}
return new Response('Not found', { status: 404 });
},
});
console.log('🚀 Bun server running on http://localhost:3000');
```
## Basic Integration
### Simple Upload Server
```typescript title="server.ts"
import { uploadRouter } from './lib/upload';
Bun.serve({
port: 3000,
fetch(request) {
const url = new URL(request.url);
// Method 1: Combined handler (recommended)
if (url.pathname.startsWith('/api/upload/')) {
return uploadRouter.handlers(request);
}
// Health check
if (url.pathname === '/health') {
return new Response(JSON.stringify({ status: 'ok' }), {
headers: { 'Content-Type': 'application/json' }
});
}
return new Response('Not found', { status: 404 });
},
});
console.log('🚀 Bun server running on http://localhost:3000');
```
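A quick smoke test of the routes above, assuming the server is running locally:
```bash
# Health check should return {"status":"ok"}
curl http://localhost:3000/health
```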
### With CORS and Routing
```typescript title="server.ts"
import { uploadRouter } from './lib/upload';
function handleCORS(request: Request) {
const origin = request.headers.get('origin');
const allowedOrigins = ['http://localhost:3000', 'https://your-domain.com'];
const headers = new Headers();
if (origin && allowedOrigins.includes(origin)) {
headers.set('Access-Control-Allow-Origin', origin);
}
headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
return headers;
}
Bun.serve({
port: 3000,
fetch(request) {
const url = new URL(request.url);
const corsHeaders = handleCORS(request);
// Handle preflight requests
if (request.method === 'OPTIONS') {
return new Response(null, { status: 200, headers: corsHeaders });
}
// Upload routes
if (url.pathname.startsWith('/api/upload/')) {
return uploadRouter.handlers(request).then(response => {
// Add CORS headers to response
corsHeaders.forEach((value, key) => {
response.headers.set(key, value);
});
return response;
});
}
// Health check
if (url.pathname === '/health') {
return new Response(JSON.stringify({
status: 'ok',
runtime: 'Bun',
timestamp: new Date().toISOString()
}), {
headers: {
'Content-Type': 'application/json',
...Object.fromEntries(corsHeaders)
}
});
}
return new Response('Not found', { status: 404 });
},
});
console.log('🚀 Bun server running on http://localhost:3000');
```
## Advanced Configuration
### Authentication and Rate Limiting
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const payload = await verifyJWT(token);
return {
userId: payload.sub as string,
userRole: payload.role as string
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
async function verifyJWT(token: string) {
// Your JWT verification logic here
// Using Bun's built-in crypto or a JWT library
return { sub: 'user-123', role: 'user' };
}
export type AppUploadRouter = typeof uploadRouter;
```
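The `verifyJWT` stub above always succeeds. A real implementation could use the `jose` library (the same approach shown in the Expo Router guide), assuming your tokens are signed with a `JWT_SECRET` environment variable:
```typescript
import { jwtVerify } from 'jose';

async function verifyJWT(token: string) {
  const secret = new TextEncoder().encode(process.env.JWT_SECRET!);
  // jwtVerify throws on invalid signatures or expired tokens
  const { payload } = await jwtVerify(token, secret);
  return { sub: payload.sub as string, role: payload.role as string };
}
```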
### Production Server with Full Features
```typescript title="server.ts"
import { uploadRouter } from './lib/upload';
// Simple rate limiting store
const rateLimitStore = new Map();
function rateLimit(ip: string, maxRequests = 100, windowMs = 15 * 60 * 1000) {
const now = Date.now();
const key = ip;
const record = rateLimitStore.get(key);
if (!record || now > record.resetTime) {
rateLimitStore.set(key, { count: 1, resetTime: now + windowMs });
return true;
}
if (record.count >= maxRequests) {
return false;
}
record.count++;
return true;
}
function getClientIP(request: Request): string {
// In production, you might get this from headers like X-Forwarded-For
return request.headers.get('x-forwarded-for') ||
request.headers.get('x-real-ip') ||
'unknown';
}
Bun.serve({
port: process.env.PORT ? parseInt(process.env.PORT) : 3000,
fetch(request) {
const url = new URL(request.url);
const clientIP = getClientIP(request);
// Rate limiting
if (!rateLimit(clientIP)) {
return new Response(JSON.stringify({
error: 'Too many requests'
}), {
status: 429,
headers: { 'Content-Type': 'application/json' }
});
}
// CORS
const corsHeaders = {
'Access-Control-Allow-Origin': process.env.NODE_ENV === 'production'
? 'https://your-domain.com'
: '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
};
// Handle preflight
if (request.method === 'OPTIONS') {
return new Response(null, { status: 200, headers: corsHeaders });
}
// Upload routes
if (url.pathname.startsWith('/api/upload/')) {
return uploadRouter.handlers(request).then(response => {
Object.entries(corsHeaders).forEach(([key, value]) => {
response.headers.set(key, value);
});
return response;
}).catch(error => {
console.error('Upload error:', error);
return new Response(JSON.stringify({
error: 'Upload failed',
message: process.env.NODE_ENV === 'development' ? error.message : 'Internal server error'
}), {
status: 500,
headers: {
'Content-Type': 'application/json',
...corsHeaders
}
});
});
}
// API info
if (url.pathname === '/api') {
return new Response(JSON.stringify({
name: 'Bun Upload API',
version: '1.0.0',
runtime: 'Bun',
endpoints: {
health: '/health',
upload: '/api/upload/*'
}
}), {
headers: {
'Content-Type': 'application/json',
...corsHeaders
}
});
}
// Health check
if (url.pathname === '/health') {
return new Response(JSON.stringify({
status: 'ok',
runtime: 'Bun',
version: Bun.version,
timestamp: new Date().toISOString(),
uptime: process.uptime()
}), {
headers: {
'Content-Type': 'application/json',
...corsHeaders
}
});
}
return new Response('Not found', { status: 404, headers: corsHeaders });
},
});
console.log(`🚀 Bun server running on http://localhost:${process.env.PORT || 3000}`);
console.log(`🌍 Environment: ${process.env.NODE_ENV || 'development'}`);
```
## File-based Routing
### Structured Application
```typescript title="routes/upload.ts"
import { uploadRouter } from '../lib/upload';
export function handleUpload(request: Request) {
return uploadRouter.handlers(request);
}
```
```typescript title="routes/api.ts"
export function handleAPI(request: Request) {
return new Response(JSON.stringify({
name: 'Bun Upload API',
version: '1.0.0',
runtime: 'Bun'
}), {
headers: { 'Content-Type': 'application/json' }
});
}
```
```typescript title="server.ts"
import { handleUpload } from './routes/upload';
import { handleAPI } from './routes/api';
const routes = {
'/api/upload': handleUpload,
'/api': handleAPI,
'/health': () => new Response(JSON.stringify({ status: 'ok' }), {
headers: { 'Content-Type': 'application/json' }
})
};
Bun.serve({
port: 3000,
fetch(request) {
const url = new URL(request.url);
for (const [path, handler] of Object.entries(routes)) {
if (url.pathname.startsWith(path)) {
return handler(request);
}
}
return new Response('Not found', { status: 404 });
},
});
```
## Performance Benefits
* **Fast runtime**: Bun benchmarks up to ~3x faster than Node.js on some workloads, which translates into excellent performance for file upload operations
* **Zero overhead**: No adapter layer means zero performance overhead - pushduck handlers run directly in Bun
* **All-in-one toolkit**: Built-in bundler, test runner, package manager, and more - no extra tooling needed
* **Native TypeScript**: Run TypeScript directly without compilation, perfect for rapid development
## Deployment
### Docker Deployment
```dockerfile title="Dockerfile"
FROM oven/bun:1 as base
WORKDIR /usr/src/app
# Install dependencies
COPY package.json bun.lockb ./
RUN bun install --frozen-lockfile
# Copy source code
COPY . .
# Expose port
EXPOSE 3000
# Run the app
CMD ["bun", "run", "server.ts"]
```
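Build and run the image (the tag name is illustrative):
```bash
docker build -t bun-upload-server .
docker run -p 3000:3000 bun-upload-server
```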
### Production Scripts
```json title="package.json"
{
"name": "bun-upload-server",
"version": "1.0.0",
"scripts": {
"dev": "bun run --watch server.ts",
"start": "bun run server.ts",
"build": "bun build server.ts --outdir ./dist --target bun",
"test": "bun test"
},
"dependencies": {
"pushduck": "latest"
},
"devDependencies": {
"bun-types": "latest"
}
}
```
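Run them with Bun's script runner:
```bash
bun run dev    # development with file watching
bun run start  # production server
bun run build  # bundle server.ts into ./dist
```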
***
**Bun + Pushduck**: The perfect combination for ultra-fast file uploads with zero configuration overhead and exceptional developer experience.
# Elysia
URL: /docs/integrations/elysia
TypeScript-first framework with Bun - Web Standards native, no adapter needed!
***
title: Elysia
description: TypeScript-first framework with Bun - Web Standards native, no adapter needed!
-------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Elysia
Elysia is a TypeScript-first web framework designed for Bun. Since Elysia uses Web Standard `Request` objects natively, pushduck handlers work directly without any adapters!
**Web Standards Native**: Elysia exposes `context.request` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
bun add pushduck
```
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create Elysia app with upload routes**
```typescript title="server.ts"
import { Elysia } from 'elysia';
import { uploadRouter } from './lib/upload';
const app = new Elysia();
// Direct usage - no adapter needed!
app.all('/api/upload/*', (context) => {
return uploadRouter.handlers(context.request);
});
app.listen(3000);
```
## Basic Integration
### Simple Upload Route
```typescript title="server.ts"
import { Elysia } from 'elysia';
import { uploadRouter } from './lib/upload';
const app = new Elysia();
// Method 1: Combined handler (recommended)
app.all('/api/upload/*', (context) => {
return uploadRouter.handlers(context.request);
});
// Method 2: Separate handlers (if you need method-specific logic)
app.get('/api/upload/*', (context) => uploadRouter.handlers.GET(context.request));
app.post('/api/upload/*', (context) => uploadRouter.handlers.POST(context.request));
app.listen(3000);
```
### With Middleware and CORS
```typescript title="server.ts"
import { Elysia } from 'elysia';
import { cors } from '@elysiajs/cors';
import { uploadRouter } from './lib/upload';
const app = new Elysia()
.use(cors({
origin: ['http://localhost:3000', 'https://your-domain.com'],
allowedHeaders: ['Content-Type', 'Authorization'],
methods: ['GET', 'POST']
}))
// Upload routes
.all('/api/upload/*', (context) => uploadRouter.handlers(context.request))
// Health check
.get('/health', () => ({ status: 'ok' }))
.listen(3000);
console.log(`🦊 Elysia is running at http://localhost:3000`);
```
## Advanced Configuration
### Authentication with JWT
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { jwtVerify } from 'jose';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with JWT authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
      try {
        // Verify the JWT (using jose here; swap in your own verification logic)
        const secret = new TextEncoder().encode(process.env.JWT_SECRET!);
        const { payload } = await jwtVerify(token, secret);
        return {
          userId: payload.sub as string,
          userRole: payload.role as string
        };
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
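On the client side, the structured client's `headers` option (covered in the client approaches guide) is one way to attach the bearer token that `privateUpload` expects. A sketch; `getAuthToken` is a placeholder for your own token storage:
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from './upload';

// Placeholder - read the token from wherever your auth flow stores it
function getAuthToken(): string {
  return localStorage.getItem('token') ?? '';
}

export const uploadClient = createUploadClient<AppUploadRouter>({
  endpoint: 'http://localhost:3000/api/upload',
  headers: {
    Authorization: `Bearer ${getAuthToken()}`,
  },
});
```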
### Full Production Setup
```typescript title="server.ts"
import { Elysia } from 'elysia';
import { cors } from '@elysiajs/cors';
import { rateLimit } from '@elysiajs/rate-limit';
import { swagger } from '@elysiajs/swagger';
import { uploadRouter } from './lib/upload';
const app = new Elysia()
// Swagger documentation
.use(swagger({
documentation: {
info: {
title: 'Upload API',
version: '1.0.0'
}
}
}))
// CORS
.use(cors({
origin: process.env.NODE_ENV === 'production'
? ['https://your-domain.com']
: true,
allowedHeaders: ['Content-Type', 'Authorization'],
methods: ['GET', 'POST']
}))
// Rate limiting
.use(rateLimit({
max: 100,
windowMs: 15 * 60 * 1000, // 15 minutes
}))
// Upload routes
.all('/api/upload/*', (context) => uploadRouter.handlers(context.request))
// Health check
.get('/health', () => ({
status: 'ok',
timestamp: new Date().toISOString()
}))
.listen(process.env.PORT || 3000);
console.log(`🦊 Elysia is running at http://localhost:${process.env.PORT || 3000}`);
```
## TypeScript Integration
### Type-Safe Client
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from './upload';
export const uploadClient = createUploadClient<AppUploadRouter>({
  endpoint: `${process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3000'}/api/upload`
});
```
### Client Usage
```typescript title="components/upload.tsx"
import { uploadClient } from '../lib/upload-client';
export function UploadComponent() {
const handleUpload = async (files: File[]) => {
try {
const results = await uploadClient.upload('imageUpload', {
files,
// Type-safe metadata based on your router configuration
metadata: { userId: 'user-123' }
});
console.log('Upload successful:', results);
} catch (error) {
console.error('Upload failed:', error);
}
};
  return (
    <input
      type="file"
      multiple
      onChange={(e) => {
        if (e.target.files) {
          handleUpload(Array.from(e.target.files));
        }
      }}
    />
  );
}
```
## Performance Benefits
* **Zero overhead**: No adapter layer means zero performance overhead - pushduck handlers run directly in Elysia
* **Bun performance**: Built for Bun's exceptional performance, perfect for high-throughput upload APIs
* **End-to-end types**: Full TypeScript support from server to client with compile-time safety
* **Rich ecosystem**: Extensive plugin ecosystem for authentication, validation, rate limiting, and more
## Deployment
### Production Deployment
```dockerfile title="Dockerfile"
FROM oven/bun:1 as base
WORKDIR /usr/src/app
# Install dependencies
COPY package.json bun.lockb ./
RUN bun install --frozen-lockfile
# Copy source code
COPY . .
# Expose port
EXPOSE 3000
# Run the app
CMD ["bun", "run", "server.ts"]
```
```bash
# Build and run
docker build -t my-upload-api .
docker run -p 3000:3000 my-upload-api
```
***
**Perfect TypeScript Integration**: Elysia's TypeScript-first approach combined with pushduck's type-safe design creates an exceptional developer experience with full end-to-end type safety.
# Expo Router
URL: /docs/integrations/expo
Full-stack React Native file uploads with Expo Router API routes - no adapter needed!
***
title: Expo Router
description: Full-stack React Native file uploads with Expo Router API routes - no adapter needed!
--------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Expo Router Integration
Expo Router is a file-based router for React Native and web applications that enables full-stack development with API routes. Since Expo Router uses Web Standards APIs, pushduck handlers work directly without any adapters!
**Web Standards Native**: Expo Router API routes use standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. Perfect for universal React Native apps!
## Quick Setup
**Install dependencies**
```bash
npx expo install expo-router pushduck
# For file uploads on mobile
npx expo install expo-document-picker expo-image-picker
# For file system operations
npx expo install expo-file-system
```
```bash
yarn expo install expo-router pushduck
# For file uploads on mobile
yarn expo install expo-document-picker expo-image-picker
# For file system operations
yarn expo install expo-file-system
```
```bash
pnpm expo install expo-router pushduck
# For file uploads on mobile
pnpm expo install expo-document-picker expo-image-picker
# For file system operations
pnpm expo install expo-file-system
```
```bash
bun expo install expo-router pushduck
# For file uploads on mobile
bun expo install expo-document-picker expo-image-picker
# For file system operations
bun expo install expo-file-system
```
**Configure server output**
Enable server-side rendering in your `app.json`:
```json title="app.json"
{
"expo": {
"web": {
"output": "server"
},
"plugins": [
[
"expo-router",
{
"origin": "https://your-domain.com"
}
]
]
}
}
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = s3.createRouter({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="app/api/upload/[...slug]+api.ts"
import { uploadRouter } from '../../../lib/upload';
// Direct usage - no adapter needed!
export async function GET(request: Request) {
return uploadRouter.handlers(request);
}
export async function POST(request: Request) {
return uploadRouter.handlers(request);
}
```
## Basic Integration
### Simple Upload Route
```typescript title="app/api/upload/[...slug]+api.ts"
import { uploadRouter } from '../../../lib/upload';
// Method 1: Combined handler (recommended)
export async function GET(request: Request) {
return uploadRouter.handlers(request);
}
export async function POST(request: Request) {
return uploadRouter.handlers(request);
}
// Method 2: Individual methods (if you need method-specific logic)
export async function PUT(request: Request) {
return uploadRouter.handlers(request);
}
export async function DELETE(request: Request) {
return uploadRouter.handlers(request);
}
```
### With CORS Headers
```typescript title="app/api/upload/[...slug]+api.ts"
import { uploadRouter } from '../../../lib/upload';
function addCorsHeaders(response: Response) {
response.headers.set('Access-Control-Allow-Origin', '*');
response.headers.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS');
response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
return response;
}
export async function OPTIONS() {
return addCorsHeaders(new Response(null, { status: 200 }));
}
export async function GET(request: Request) {
const response = await uploadRouter.handlers(request);
return addCorsHeaders(response);
}
export async function POST(request: Request) {
const response = await uploadRouter.handlers(request);
return addCorsHeaders(response);
}
```
## Advanced Configuration
### Authentication with Expo Auth
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { jwtVerify } from 'jose';
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = s3.createRouter({
// Private uploads with JWT authentication
privateUpload: s3
.image()
.max("5MB")
.formats(['jpeg', 'jpg', 'png', 'webp'])
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const secret = new TextEncoder().encode(process.env.JWT_SECRET!);
const { payload } = await jwtVerify(token, secret);
return {
userId: payload.sub as string,
platform: 'mobile'
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// User profile pictures
profilePicture: s3
.image()
.max("2MB")
.count(1)
.formats(['jpeg', 'jpg', 'png', 'webp'])
.middleware(async ({ req }) => {
const userId = await authenticateUser(req);
return { userId, category: 'profile' };
})
.paths({
generateKey: ({ metadata, file }) => {
return `profiles/${metadata.userId}/avatar.${file.name.split('.').pop()}`;
}
}),
// Document uploads
documents: s3
.file()
.max("10MB")
.types(['application/pdf', 'text/plain'])
.count(5)
.middleware(async ({ req }) => {
const userId = await authenticateUser(req);
return { userId, category: 'documents' };
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
async function authenticateUser(req: Request): Promise<string> {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
const secret = new TextEncoder().encode(process.env.JWT_SECRET!);
const { payload } = await jwtVerify(token, secret);
return payload.sub as string;
}
export type AppUploadRouter = typeof uploadRouter;
```
## Client-Side Usage (React Native)
### Upload Hook
```typescript title="hooks/useUpload.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '../lib/upload';
export const upload = createUploadClient({
endpoint: '/api/upload'
});
```
### Image Upload Component
```typescript title="components/ImageUploader.tsx"
import React, { useState } from 'react';
import { View, Text, TouchableOpacity, Image, Alert, Platform } from 'react-native';
import * as ImagePicker from 'expo-image-picker';
import { upload } from '../hooks/useUpload';
export default function ImageUploader() {
  const [selectedImage, setSelectedImage] = useState<string | null>(null);
const { uploadFiles, files, isUploading, error } = upload.imageUpload();
const pickImage = async () => {
// Request permission
if (Platform.OS !== 'web') {
const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync();
if (status !== 'granted') {
Alert.alert('Permission needed', 'Camera roll permission is required');
return;
}
}
const result = await ImagePicker.launchImageLibraryAsync({
mediaTypes: ImagePicker.MediaTypeOptions.Images,
allowsEditing: true,
aspect: [4, 3],
quality: 1,
});
if (!result.canceled) {
const asset = result.assets[0];
setSelectedImage(asset.uri);
// Create File object for upload
const file = {
uri: asset.uri,
name: asset.fileName || 'image.jpg',
type: asset.type || 'image/jpeg',
} as any;
uploadFiles([file]);
}
};
  return (
    <View>
      <TouchableOpacity onPress={pickImage} disabled={isUploading}>
        <Text>{isUploading ? 'Uploading...' : 'Pick Image'}</Text>
      </TouchableOpacity>
      {error && (
        <Text>Error: {error.message}</Text>
      )}
      {selectedImage && (
        <Image source={{ uri: selectedImage }} style={{ width: 200, height: 150 }} />
      )}
      {files.length > 0 && (
        <View>
          {files.map((file, index) => (
            <View key={index}>
              <Text>{file.name}</Text>
              <Text>{file.status === 'success' ? 'Complete' : `${file.progress}%`}</Text>
              {file.status === 'success' && file.url && (
                <Text>✓ Uploaded</Text>
              )}
            </View>
          ))}
        </View>
      )}
    </View>
  );
}
```
### Document Upload Component
```typescript title="components/DocumentUploader.tsx"
import React, { useState } from 'react';
import { View, Text, TouchableOpacity, Alert, FlatList } from 'react-native';
import * as DocumentPicker from 'expo-document-picker';
import { upload } from '../hooks/useUpload';
interface UploadedFile {
name: string;
size: number;
url: string;
}
export default function DocumentUploader() {
  const [uploadedFiles, setUploadedFiles] = useState<UploadedFile[]>([]);
const { uploadFiles, isUploading, error } = upload.documents();
const pickDocument = async () => {
try {
const result = await DocumentPicker.getDocumentAsync({
type: ['application/pdf', 'text/plain'],
multiple: true,
});
if (!result.canceled) {
const files = result.assets.map(asset => ({
uri: asset.uri,
name: asset.name,
type: asset.mimeType || 'application/octet-stream',
})) as any[];
const uploadResult = await uploadFiles(files);
if (uploadResult.success) {
const newFiles = uploadResult.results.map(file => ({
name: file.name,
size: file.size,
url: file.url,
}));
setUploadedFiles(prev => [...prev, ...newFiles]);
Alert.alert('Success', `${files.length} file(s) uploaded successfully!`);
}
}
} catch (error) {
Alert.alert('Error', 'Failed to pick document');
}
};
  return (
    <View>
      <TouchableOpacity onPress={pickDocument} disabled={isUploading}>
        <Text>{isUploading ? 'Uploading...' : 'Pick Documents'}</Text>
      </TouchableOpacity>
      {error && (
        <Text>Error: {error.message}</Text>
      )}
      <FlatList
        data={uploadedFiles}
        keyExtractor={(item, index) => index.toString()}
        renderItem={({ item }) => (
          <View>
            <Text>{item.name}</Text>
            <Text>{(item.size / 1024).toFixed(1)} KB</Text>
          </View>
        )}
      />
    </View>
  );
}
```
## Project Structure
Here's a recommended project structure for Expo Router with pushduck:
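The layout below collects the paths used throughout this guide:
```
app/
├── (tabs)/
│   ├── _layout.tsx            # Tab navigation
│   └── upload.tsx             # Upload screen
├── api/
│   └── upload/
│       └── [...slug]+api.ts   # Upload API route (server)
components/
├── ImageUploader.tsx
└── DocumentUploader.tsx
hooks/
└── useUpload.ts               # Typed upload client
lib/
└── upload.ts                  # Upload router configuration
```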
## Complete Example
### Main Upload Screen
```typescript title="app/(tabs)/upload.tsx"
import React from 'react';
import { View, Text, ScrollView, StyleSheet } from 'react-native';
import ImageUploader from '../../components/ImageUploader';
import DocumentUploader from '../../components/DocumentUploader';
export default function UploadScreen() {
  return (
    <ScrollView style={styles.container}>
      <Text style={styles.title}>File Upload Demo</Text>
      <View style={styles.section}>
        <Text style={styles.sectionTitle}>Image Upload</Text>
        <ImageUploader />
      </View>
      <View style={styles.section}>
        <Text style={styles.sectionTitle}>Document Upload</Text>
        <DocumentUploader />
      </View>
    </ScrollView>
  );
}
const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: '#fff',
},
title: {
fontSize: 24,
fontWeight: 'bold',
textAlign: 'center',
marginVertical: 20,
},
section: {
padding: 20,
borderBottomWidth: 1,
borderBottomColor: '#eee',
},
sectionTitle: {
fontSize: 18,
fontWeight: '600',
marginBottom: 15,
},
});
```
### Tab Layout
```typescript title="app/(tabs)/_layout.tsx"
import { Tabs } from 'expo-router';
import { Ionicons } from '@expo/vector-icons';
export default function TabLayout() {
  return (
    <Tabs>
      {/* Screen names are illustrative - match them to your app/(tabs) files */}
      <Tabs.Screen
        name="index"
        options={{
          title: 'Home',
          tabBarIcon: ({ color, size }) => (
            <Ionicons name="home" color={color} size={size} />
          ),
        }}
      />
      <Tabs.Screen
        name="upload"
        options={{
          title: 'Upload',
          tabBarIcon: ({ color, size }) => (
            <Ionicons name="cloud-upload" color={color} size={size} />
          ),
        }}
      />
    </Tabs>
  );
}
```
## Deployment Options
### EAS Build Configuration
Configure automatic server deployment in your `eas.json`:
```json title="eas.json"
{
"cli": {
"version": ">= 5.0.0"
},
"build": {
"development": {
"developmentClient": true,
"distribution": "internal",
"env": {
"EXPO_UNSTABLE_DEPLOY_SERVER": "1"
}
},
"preview": {
"distribution": "internal",
"env": {
"EXPO_UNSTABLE_DEPLOY_SERVER": "1"
}
},
"production": {
"env": {
"EXPO_UNSTABLE_DEPLOY_SERVER": "1"
}
}
}
}
```
Deploy with automatic server:
```bash
# Build for all platforms
eas build --platform all
# Deploy server only
npx expo export --platform web
eas deploy
```
### Development Build Setup
```bash
# Install dev client
npx expo install expo-dev-client
# Create development build
eas build --profile development
# Or run locally
npx expo run:ios --configuration Release
npx expo run:android --variant release
```
Configure local server origin:
```json title="app.json"
{
"expo": {
"plugins": [
[
"expo-router",
{
"origin": "http://localhost:8081"
}
]
]
}
}
```
### Local Development Server
```bash
# Start Expo development server
npx expo start
# Test API routes
curl http://localhost:8081/api/upload/presigned-url
# Clear cache if needed
npx expo start --clear
```
For production testing:
```bash
# Export for production
npx expo export
# Serve locally
npx expo serve
```
## Environment Variables
```bash title=".env"
# AWS/Cloudflare R2 Configuration
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=auto
AWS_ENDPOINT_URL=https://your-account.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-cloudflare-account-id
# JWT Authentication
JWT_SECRET=your-jwt-secret
# Expo Configuration (for client-side, use EXPO_PUBLIC_ prefix)
EXPO_PUBLIC_API_URL=https://your-domain.com
```
**Important**: Server environment variables (without `EXPO_PUBLIC_` prefix) are only available in API routes, not in client code. Client-side variables must use the `EXPO_PUBLIC_` prefix.
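For example (a minimal sketch; the route filename is illustrative):

```typescript title="app/api/env-check+api.ts"
// API route (server): non-prefixed secrets are available here
export function GET() {
  return Response.json({
    bucket: process.env.S3_BUCKET_NAME,        // defined on the server
    publicUrl: process.env.EXPO_PUBLIC_API_URL // also defined
  });
}
```

```typescript title="components/Example.tsx"
// Client code: only EXPO_PUBLIC_ variables are inlined into the bundle
const apiUrl = process.env.EXPO_PUBLIC_API_URL;   // defined
const secret = process.env.AWS_SECRET_ACCESS_KEY; // undefined in the client
```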
## Performance Benefits
* Share upload logic between web and native platforms with a single codebase.
* Direct access to native file system APIs for optimal performance on mobile.
* Built-in support for upload progress tracking and real-time status updates.
* Deploy to iOS, Android, and web with the same upload infrastructure.
## Troubleshooting
**File Permissions**: Always request proper permissions for camera and photo library access on mobile devices before file operations.
**Server Bundle**: Expo Router API routes require server output to be enabled in your `app.json` configuration.
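If API routes return 404, check that server output is enabled (merge this into your existing config):

```json title="app.json"
{
  "expo": {
    "web": {
      "output": "server"
    }
  }
}
```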
### Common Issues
**Metro bundler errors:**
```bash
# Clear Metro cache
npx expo start --clear
# Reset Expo cache
npx expo r -c
```
**Permission denied errors:**
```typescript
// Always check permissions before file operations
import * as ImagePicker from 'expo-image-picker';
const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync();
if (status !== 'granted') {
Alert.alert('Permission needed', 'Camera roll permission is required');
return;
}
```
**Network errors in development:**
```typescript
// Make sure your development server is accessible
const { upload } = useUpload('/api/upload', {
endpoint: __DEV__ ? 'http://localhost:8081' : 'https://your-domain.com',
});
```
**File upload timeout:**
```typescript
const { upload } = useUpload('/api/upload', {
timeout: 60000, // 60 seconds
});
```
### Debug Mode
Enable debug logging for development:
```typescript title="lib/upload.ts"
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{ /* config */ })
.defaults({
debug: __DEV__, // Only in development
})
.build();
```
This will log detailed information about upload requests, file processing, and S3 operations to help diagnose issues during development.
## Framework-Specific Notes
1. **File System Access**: Use `expo-file-system` for advanced file operations
2. **Permissions**: Always request permissions before accessing camera or photo library
3. **Web Compatibility**: Components work on web out of the box with Expo Router
4. **Platform Detection**: Use `Platform.OS` to handle platform-specific logic
5. **Environment Variables**: Server variables don't need `EXPO_PUBLIC_` prefix in API routes
# Express
URL: /docs/integrations/express
Popular Node.js framework integration with pushduck using adapters for req/res API
***
title: Express
description: Popular Node.js framework integration with pushduck using adapters for req/res API
-----------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Express
Express uses the traditional Node.js `req`/`res` API pattern. Pushduck provides a simple adapter that converts Web Standard handlers to Express middleware format.
**Custom Request/Response API**: Express uses `req`/`res` objects instead of Web Standards, so pushduck provides the `toExpressHandler` adapter for seamless integration.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create Express server with upload routes**
```typescript title="server.ts"
import express from 'express';
import { uploadRouter } from './lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
// Convert pushduck handlers to Express middleware
app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers));
app.listen(3000, () => {
console.log('Server running on http://localhost:3000');
});
```
## Basic Integration
### Simple Upload Route
```typescript title="server.ts"
import express from 'express';
import cors from 'cors';
import { uploadRouter } from './lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
// Middleware
app.use(cors());
app.use(express.json());
// Upload routes using adapter
app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers));
// Health check
app.get('/health', (req, res) => {
res.json({ status: 'healthy', timestamp: new Date().toISOString() });
});
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`🚀 Server running on http://localhost:${port}`);
});
```
### With Authentication Middleware
```typescript title="server.ts"
import express from 'express';
import jwt from 'jsonwebtoken';
import { uploadRouter } from './lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
app.use(express.json());
// Authentication middleware
const authenticateToken = (req: express.Request, res: express.Response, next: express.NextFunction) => {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1];
if (!token) {
return res.sendStatus(401);
}
jwt.verify(token, process.env.JWT_SECRET!, (err, user) => {
if (err) return res.sendStatus(403);
req.user = user;
next();
});
};
// Public upload route (no auth)
app.all('/api/upload/public/*', toExpressHandler(uploadRouter.handlers));
// Private upload route (with auth)
app.all('/api/upload/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers));
app.listen(3000);
```
## Advanced Configuration
### Upload Configuration with Express Context
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Profile pictures with authentication
profilePicture: s3
.image()
.max("2MB")
.count(1)
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
// Extract user from JWT token in Authorization header
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authentication required');
}
const token = authHeader.substring(7);
const user = await verifyJWT(token);
return {
userId: user.id,
userRole: user.role,
category: "profile"
};
}),
// Document uploads for authenticated users
documents: s3
.file()
.max("10MB")
.count(5)
.types([
"application/pdf",
"application/msword",
"application/vnd.openxmlformats-officedocument.wordprocessingml.document",
"text/plain"
])
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authentication required');
}
const token = authHeader.substring(7);
const user = await verifyJWT(token);
return {
userId: user.id,
category: "documents"
};
}),
// Public uploads (no authentication)
publicImages: s3
.image()
.max("1MB")
.count(1)
.formats(["jpeg", "png"])
// No middleware = public access
});
async function verifyJWT(token: string) {
// Your JWT verification logic
const jwt = await import('jsonwebtoken');
return jwt.verify(token, process.env.JWT_SECRET!) as any;
}
export type AppUploadRouter = typeof uploadRouter;
```
### Complete Express Application
```typescript title="server.ts"
import express from 'express';
import cors from 'cors';
import helmet from 'helmet';
import rateLimit from 'express-rate-limit';
import { uploadRouter } from './lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
// Security middleware
app.use(helmet());
app.use(cors({
origin: process.env.NODE_ENV === 'production'
? ['https://your-domain.com']
: ['http://localhost:3000'],
credentials: true
}));
// Rate limiting
const uploadLimiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // limit each IP to 100 requests per windowMs
message: 'Too many upload requests from this IP, please try again later.',
standardHeaders: true,
legacyHeaders: false,
});
// Body parsing middleware
app.use(express.json({ limit: '50mb' }));
app.use(express.urlencoded({ extended: true, limit: '50mb' }));
// Logging middleware
app.use((req, res, next) => {
console.log(`${new Date().toISOString()} - ${req.method} ${req.path}`);
next();
});
// Health check endpoint
app.get('/health', (req, res) => {
res.json({
status: 'healthy',
timestamp: new Date().toISOString(),
uptime: process.uptime(),
memory: process.memoryUsage(),
version: process.env.npm_package_version || '1.0.0'
});
});
// API info endpoint
app.get('/api', (req, res) => {
res.json({
name: 'Express Upload API',
version: '1.0.0',
endpoints: {
health: '/health',
upload: '/api/upload/*'
},
uploadTypes: [
'profilePicture - Single profile picture (2MB max)',
'documents - PDF, Word, text files (10MB max, 5 files)',
'publicImages - Public images (1MB max)'
]
});
});
// Upload routes with rate limiting
app.all('/api/upload/*', uploadLimiter, toExpressHandler(uploadRouter.handlers));
// 404 handler
app.use('*', (req, res) => {
res.status(404).json({
error: 'Not Found',
message: `Route ${req.originalUrl} not found`,
timestamp: new Date().toISOString()
});
});
// Error handler
app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => {
console.error('Express error:', err);
res.status(500).json({
error: 'Internal Server Error',
message: process.env.NODE_ENV === 'development' ? err.message : 'Something went wrong',
timestamp: new Date().toISOString()
});
});
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`🚀 Express server running on http://localhost:${port}`);
console.log(`📤 Upload endpoint: http://localhost:${port}/api/upload`);
});
```
## Project Structure
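A minimal layout matching the code block titles in this guide:

```
express-upload-api/
├── lib/
│   └── upload.ts        # Upload router configuration
├── middleware/
│   └── auth.ts          # JWT authentication middleware
├── routes/
│   └── uploads.ts       # Modular upload routes
└── server.ts            # Express application entry point
```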
## Modular Route Organization
### Separate Upload Routes
```typescript title="routes/uploads.ts"
import { Router } from 'express';
import { uploadRouter } from '../lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
import { authenticateToken } from '../middleware/auth';
const router = Router();
// Public uploads
router.all('/public/*', toExpressHandler(uploadRouter.handlers));
// Private uploads (requires authentication)
router.all('/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers));
export default router;
```
```typescript title="middleware/auth.ts"
import { Request, Response, NextFunction } from 'express';
import jwt from 'jsonwebtoken';
export const authenticateToken = (req: Request, res: Response, next: NextFunction) => {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1];
if (!token) {
return res.status(401).json({ error: 'Access token required' });
}
jwt.verify(token, process.env.JWT_SECRET!, (err, user) => {
if (err) {
return res.status(403).json({ error: 'Invalid or expired token' });
}
req.user = user;
next();
});
};
```
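To wire the modular router into the app, mount it in `server.ts` (a sketch; the `/api/upload` mount path is assumed to match the client endpoint used elsewhere in this guide):

```typescript title="server.ts"
import express from 'express';
import uploadRoutes from './routes/uploads';

const app = express();
app.use(express.json());

// Mounted at /api/upload, the router serves /api/upload/public/* and /api/upload/private/*
app.use('/api/upload', uploadRoutes);

app.listen(3000);
```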
# Fastify
URL: /docs/integrations/fastify
High-performance Node.js framework integration with pushduck using adapters
***
title: Fastify
description: High-performance Node.js framework integration with pushduck using adapters
----------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Fastify
Fastify is a high-performance Node.js web framework that uses custom `request`/`reply` objects. Pushduck provides a simple adapter that converts Web Standard handlers to Fastify handler format.
**Custom Request/Response API**: Fastify uses `request`/`reply` objects instead of Web Standards, so pushduck provides the `toFastifyHandler` adapter for seamless integration.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create Fastify server with upload routes**
```typescript title="server.ts"
import Fastify from 'fastify';
import { uploadRouter } from './lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const fastify = Fastify({ logger: true });
// Convert pushduck handlers to Fastify handler
fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers));
const start = async () => {
try {
await fastify.listen({ port: 3000 });
console.log('🚀 Fastify server running on http://localhost:3000');
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
```
## Basic Integration
### Simple Upload Route
```typescript title="server.ts"
import Fastify from 'fastify';
import cors from '@fastify/cors';
import { uploadRouter } from './lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const fastify = Fastify({
logger: {
level: 'info',
transport: {
target: 'pino-pretty'
}
}
});
// Register CORS
await fastify.register(cors, {
origin: ['http://localhost:3000', 'https://your-domain.com']
});
// Upload routes using adapter
fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers));
// Health check
fastify.get('/health', async (request, reply) => {
return {
status: 'healthy',
timestamp: new Date().toISOString(),
framework: 'Fastify'
};
});
const start = async () => {
try {
await fastify.listen({ port: 3000, host: '0.0.0.0' });
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
```
### With Authentication Hook
```typescript title="server.ts"
import Fastify from 'fastify';
import jwt from '@fastify/jwt';
import { uploadRouter } from './lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const fastify = Fastify({ logger: true });
// Register JWT
await fastify.register(jwt, {
secret: process.env.JWT_SECRET!
});
// Authentication hook
fastify.addHook('preHandler', async (request, reply) => {
// Only protect upload routes
if (request.url.startsWith('/api/upload/private/')) {
try {
await request.jwtVerify();
} catch (err) {
reply.send(err);
}
}
});
// Public upload routes
fastify.all('/api/upload/public/*', toFastifyHandler(uploadRouter.handlers));
// Private upload routes (protected by hook)
fastify.all('/api/upload/private/*', toFastifyHandler(uploadRouter.handlers));
const start = async () => {
try {
await fastify.listen({ port: 3000 });
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
```
## Advanced Configuration
### Upload Configuration with Fastify Context
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Profile pictures with authentication
profilePicture: s3
.image()
.max("2MB")
.count(1)
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authentication required');
}
const token = authHeader.substring(7);
const user = await verifyJWT(token);
return {
userId: user.id,
userRole: user.role,
category: "profile"
};
}),
// Document uploads for authenticated users
documents: s3
.file()
.max("10MB")
.count(5)
.types([
"application/pdf",
"application/msword",
"application/vnd.openxmlformats-officedocument.wordprocessingml.document",
"text/plain"
])
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authentication required');
}
const token = authHeader.substring(7);
const user = await verifyJWT(token);
return {
userId: user.id,
category: "documents"
};
}),
// Public uploads (no authentication)
publicImages: s3
.image()
.max("1MB")
.count(1)
.formats(["jpeg", "png"])
// No middleware = public access
});
async function verifyJWT(token: string) {
// Your JWT verification logic
const jwt = await import('jsonwebtoken');
return jwt.verify(token, process.env.JWT_SECRET!) as any;
}
export type AppUploadRouter = typeof uploadRouter;
```
### Complete Fastify Application
```typescript title="server.ts"
import Fastify from 'fastify';
import cors from '@fastify/cors';
import helmet from '@fastify/helmet';
import rateLimit from '@fastify/rate-limit';
import { uploadRouter } from './lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const fastify = Fastify({
logger: {
level: process.env.NODE_ENV === 'production' ? 'warn' : 'info',
transport: process.env.NODE_ENV !== 'production' ? {
target: 'pino-pretty'
} : undefined
}
});
// Security middleware
await fastify.register(helmet, {
contentSecurityPolicy: false
});
// CORS configuration
await fastify.register(cors, {
origin: process.env.NODE_ENV === 'production'
? ['https://your-domain.com']
: true,
credentials: true
});
// Rate limiting
await fastify.register(rateLimit, {
max: 100,
timeWindow: '15 minutes',
errorResponseBuilder: (request, context) => ({
error: 'Rate limit exceeded',
message: `Too many requests from ${request.ip}. Try again later.`,
retryAfter: Math.round(context.ttl / 1000)
})
});
// Request logging
fastify.addHook('onRequest', async (request, reply) => {
request.log.info({ url: request.url, method: request.method }, 'incoming request');
});
// Health check endpoint
fastify.get('/health', async (request, reply) => {
return {
status: 'healthy',
timestamp: new Date().toISOString(),
uptime: process.uptime(),
memory: process.memoryUsage(),
version: process.env.npm_package_version || '1.0.0',
framework: 'Fastify'
};
});
// API info endpoint
fastify.get('/api', async (request, reply) => {
return {
name: 'Fastify Upload API',
version: '1.0.0',
endpoints: {
health: '/health',
upload: '/api/upload/*'
},
uploadTypes: [
'profilePicture - Single profile picture (2MB max)',
'documents - PDF, Word, text files (10MB max, 5 files)',
'publicImages - Public images (1MB max)'
]
};
});
// Upload routes with rate limiting
fastify.register(async function (fastify) {
await fastify.register(rateLimit, {
max: 50,
timeWindow: '15 minutes'
});
fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers));
});
// 404 handler
fastify.setNotFoundHandler(async (request, reply) => {
reply.status(404).send({
error: 'Not Found',
message: `Route ${request.method} ${request.url} not found`,
timestamp: new Date().toISOString()
});
});
// Error handler
fastify.setErrorHandler(async (error, request, reply) => {
request.log.error(error, 'Fastify error');
reply.status(500).send({
error: 'Internal Server Error',
message: process.env.NODE_ENV === 'development' ? error.message : 'Something went wrong',
timestamp: new Date().toISOString()
});
});
// Graceful shutdown
const gracefulShutdown = () => {
fastify.log.info('Shutting down gracefully...');
fastify.close().then(() => {
fastify.log.info('Server closed');
process.exit(0);
}).catch((err) => {
fastify.log.error(err, 'Error during shutdown');
process.exit(1);
});
};
process.on('SIGTERM', gracefulShutdown);
process.on('SIGINT', gracefulShutdown);
const start = async () => {
try {
const port = Number(process.env.PORT) || 3000;
const host = process.env.HOST || '0.0.0.0';
await fastify.listen({ port, host });
fastify.log.info(`🚀 Fastify server running on http://${host}:${port}`);
fastify.log.info(`📤 Upload endpoint: http://${host}:${port}/api/upload`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
```
## Plugin-Based Architecture
### Upload Plugin
```typescript title="plugins/upload.ts"
import { FastifyPluginAsync } from 'fastify';
import { uploadRouter } from '../lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const uploadPlugin: FastifyPluginAsync = async (fastify) => {
// Upload routes
fastify.all('/upload/*', toFastifyHandler(uploadRouter.handlers));
// Upload status endpoint
fastify.get('/upload-status', async (request, reply) => {
return {
status: 'ready',
supportedTypes: ['images', 'documents', 'publicImages'],
maxSizes: {
profilePicture: '2MB',
documents: '10MB',
publicImages: '1MB'
}
};
});
};
export default uploadPlugin;
```
### Main Server with Plugins
```typescript title="server.ts"
import Fastify from 'fastify';
import uploadPlugin from './plugins/upload';
const fastify = Fastify({ logger: true });
// Register upload plugin
await fastify.register(uploadPlugin, { prefix: '/api' });
const start = async () => {
try {
await fastify.listen({ port: 3000 });
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();
```
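Because the plugin is registered with the `/api` prefix, its routes resolve to `/api/upload/*` and `/api/upload-status`.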
## Client Usage
The client-side integration is identical regardless of your backend framework:
```typescript title="client/upload-client.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '../lib/upload';
export const upload = createUploadClient({
endpoint: 'http://localhost:3000/api/upload',
headers: {
'Authorization': `Bearer ${getAuthToken()}`
}
});
function getAuthToken(): string {
return localStorage.getItem('auth-token') || '';
}
```
```typescript title="client/upload-form.tsx"
import { upload } from './upload-client';
export function DocumentUploader() {
const { uploadFiles, files, isUploading, error } = upload.documents();
  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);
    uploadFiles(selectedFiles);
  };

  return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} disabled={isUploading} />

      {error && (
        <p>Error: {error.message}</p>
      )}

      {files.map((file) => (
        <div key={file.id}>
          <span>{file.name}</span>
          {file.status === 'success' && (
            <a href={file.url} target="_blank" rel="noopener noreferrer">Download</a>
          )}
        </div>
      ))}
    </div>
  );
}
```
## Deployment
### Docker Deployment
```dockerfile title="Dockerfile"
FROM node:18-alpine
WORKDIR /app
# Copy package files
COPY package*.json ./
RUN npm ci --only=production
# Copy source code
COPY . .
# Build TypeScript
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```
### Package Configuration
```json title="package.json"
{
"name": "fastify-upload-api",
"version": "1.0.0",
"scripts": {
"dev": "tsx watch src/server.ts",
"build": "tsc",
"start": "node dist/server.js"
},
"dependencies": {
"fastify": "^4.24.0",
"pushduck": "latest",
"@fastify/cors": "^8.4.0",
"@fastify/helmet": "^11.1.0",
"@fastify/rate-limit": "^8.0.0",
"@fastify/jwt": "^7.2.0"
},
"devDependencies": {
"@types/node": "^20.0.0",
"tsx": "^3.12.7",
"typescript": "^5.0.0",
"pino-pretty": "^10.2.0"
}
}
```
### Environment Variables
```bash title=".env"
# Server Configuration
PORT=3000
HOST=0.0.0.0
NODE_ENV=development
JWT_SECRET=your-super-secret-jwt-key
# Cloudflare R2 Configuration
AWS_ACCESS_KEY_ID=your_r2_access_key
AWS_SECRET_ACCESS_KEY=your_r2_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
```
## Performance Benefits
* Fastify is one of the fastest Node.js frameworks, perfect for high-throughput upload APIs.
* Leverage Fastify's extensive plugin ecosystem alongside pushduck's upload capabilities.
* Excellent TypeScript support with full type safety for both Fastify and pushduck.
* Built-in schema validation, logging, and error handling for production deployments.
***
**Fastify + Pushduck**: High-performance file uploads with Fastify's speed and pushduck's universal design, connected through a simple adapter.
# Fresh
URL: /docs/integrations/fresh
Deno-powered file uploads with Fresh using Web Standards - no adapter needed!
***
title: Fresh
description: Deno-powered file uploads with Fresh using Web Standards - no adapter needed!
------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Fresh Integration
Fresh is a modern web framework for Deno that uses islands architecture for optimal performance. It uses Web Standards APIs and provides server-side rendering with minimal client-side JavaScript. Since Fresh uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Fresh API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install Fresh and pushduck**
```bash
# Create a new Fresh project
deno run -A -r https://fresh.deno.dev my-app
cd my-app
# Add pushduck to import_map.json
```
```json title="import_map.json"
{
"imports": {
"$fresh/": "https://deno.land/x/fresh@1.6.1/",
"preact": "https://esm.sh/preact@10.19.2",
"preact/": "https://esm.sh/preact@10.19.2/",
"pushduck/server": "https://esm.sh/pushduck@latest/server",
"pushduck/client": "https://esm.sh/pushduck@latest/client"
}
}
```
```bash
# Create a new Fresh project
deno run -A -r https://fresh.deno.dev my-app
cd my-app
# Install pushduck via npm (requires Node.js compatibility)
npm install pushduck
```
```bash
# Create a new Fresh project
deno run -A -r https://fresh.deno.dev my-app
cd my-app
# Install pushduck via yarn (requires Node.js compatibility)
yarn add pushduck
```
```bash
# Create a new Fresh project
deno run -A -r https://fresh.deno.dev my-app
cd my-app
# Install pushduck via pnpm (requires Node.js compatibility)
pnpm add pushduck
```
```bash
# Create a new Fresh project
deno run -A -r https://fresh.deno.dev my-app
cd my-app
# Install pushduck via bun (requires Node.js compatibility)
bun add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!,
secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!,
region: 'auto',
endpoint: Deno.env.get("AWS_ENDPOINT_URL")!,
bucket: Deno.env.get("S3_BUCKET_NAME")!,
accountId: Deno.env.get("R2_ACCOUNT_ID")!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="routes/api/upload/[...path].ts"
import { Handlers } from "$fresh/server.ts";
import { uploadRouter } from "../../../lib/upload.ts";
// Direct usage - no adapter needed!
export const handler: Handlers = {
async GET(req) {
return uploadRouter.handlers(req);
},
async POST(req) {
return uploadRouter.handlers(req);
},
};
```
## Basic Integration
### Simple Upload Route
```typescript title="routes/api/upload/[...path].ts"
import { Handlers } from "$fresh/server.ts";
import { uploadRouter } from "../../../lib/upload.ts";
// Combined handler for upload routes; add OPTIONS only if you need CORS preflight
export const handler: Handlers = {
  async GET(req) {
    return uploadRouter.handlers(req);
  },
  async POST(req) {
    return uploadRouter.handlers(req);
  },
  async OPTIONS(_req) {
    return new Response(null, {
      status: 200,
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
        'Access-Control-Allow-Headers': 'Content-Type',
      },
    });
  },
};
```
### With Middleware
```typescript title="routes/_middleware.ts"
import { MiddlewareHandlerContext } from "$fresh/server.ts";
export async function handler(
req: Request,
ctx: MiddlewareHandlerContext,
) {
// Add CORS headers for upload routes
if (ctx.destination === "route" && req.url.includes("/api/upload")) {
const response = await ctx.next();
response.headers.set("Access-Control-Allow-Origin", "*");
response.headers.set("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
response.headers.set("Access-Control-Allow-Headers", "Content-Type");
return response;
}
return ctx.next();
}
```
## Advanced Configuration
### Authentication with Fresh
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { getCookies } from "https://deno.land/std@0.208.0/http/cookie.ts";
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!,
secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!,
region: 'auto',
endpoint: Deno.env.get("AWS_ENDPOINT_URL")!,
bucket: Deno.env.get("S3_BUCKET_NAME")!,
accountId: Deno.env.get("R2_ACCOUNT_ID")!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with cookie-based authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookies = getCookies(req.headers);
const sessionId = cookies.sessionId;
if (!sessionId) {
throw new Error('Authentication required');
}
const user = await getUserFromSession(sessionId);
if (!user) {
throw new Error('Invalid session');
}
return {
userId: user.id,
username: user.username,
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
// Helper function
async function getUserFromSession(sessionId: string) {
// Implement your session validation logic
// This could connect to a database, Deno KV, etc.
return { id: 'user-123', username: 'demo-user' };
}
```
## Client-Side Usage
### Upload Island Component
```tsx title="islands/FileUpload.tsx"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "../lib/upload.ts";
const { UploadButton, UploadDropzone } = useUpload({
endpoint: "/api/upload",
});
export default function FileUpload() {
function handleUploadComplete(files: any[]) {
console.log("Files uploaded:", files);
alert("Upload completed!");
}
function handleUploadError(error: Error) {
console.error("Upload error:", error);
alert(`Upload failed: ${error.message}`);
}
  return (
    <div>
      <h2>Image Upload</h2>
      {/* Prop names below are illustrative; adjust to your pushduck version */}
      <UploadDropzone
        endpoint="imageUpload"
        onUploadComplete={handleUploadComplete}
        onUploadError={handleUploadError}
      />

      <h2>Document Upload</h2>
      <UploadButton
        endpoint="documentUpload"
        onUploadComplete={handleUploadComplete}
        onUploadError={handleUploadError}
      />
    </div>
  );
}
```
### Using in Pages
```tsx title="routes/index.tsx"
import { Head } from "$fresh/runtime.ts";
import FileUpload from "../islands/FileUpload.tsx";
export default function Home() {
  return (
    <>
      <Head>
        <title>File Upload Demo</title>
      </Head>
      <main>
        <h1>File Upload Demo</h1>
        <FileUpload />
      </main>
    </>
  );
}
```
## File Management
### Server-Side File API
```typescript title="routes/api/files.ts"
import { Handlers } from "$fresh/server.ts";
export const handler: Handlers = {
async GET(req) {
const url = new URL(req.url);
const userId = url.searchParams.get('userId');
if (!userId) {
return new Response(JSON.stringify({ error: 'User ID required' }), {
status: 400,
headers: { 'Content-Type': 'application/json' }
});
}
// Fetch files from database/Deno KV
const files = await getFilesForUser(userId);
return new Response(JSON.stringify({
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
}), {
headers: { 'Content-Type': 'application/json' }
});
},
};
async function getFilesForUser(userId: string) {
// Example using Deno KV
const kv = await Deno.openKv();
const files = [];
for await (const entry of kv.list({ prefix: ["files", userId] })) {
files.push(entry.value);
}
return files;
}
```
### File Management Page
```tsx title="routes/files.tsx"
import { Head } from "$fresh/runtime.ts";
import { Handlers, PageProps } from "$fresh/server.ts";
import FileUpload from "../islands/FileUpload.tsx";
interface FileData {
id: string;
name: string;
url: string;
size: number;
uploadedAt: string;
}
interface PageData {
files: FileData[];
}
export const handler: Handlers<PageData> = {
async GET(req, ctx) {
// Fetch files for current user
const files = await getFilesForUser("current-user");
return ctx.render({ files });
},
};
export default function FilesPage({ data }: PageProps<PageData>) {
function formatFileSize(bytes: number): string {
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
if (bytes === 0) return '0 Bytes';
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
  return (
    <>
      <Head>
        <title>My Files</title>
      </Head>
      <main>
        <h1>My Files</h1>
        <FileUpload />

        <h2>Uploaded Files</h2>
        {data.files.length === 0 ? (
          <p>No files uploaded yet.</p>
        ) : (
          <ul>
            {data.files.map((file) => (
              <li key={file.id}>
                <span>{file.name}</span>
                <span>{formatFileSize(file.size)}</span>
                <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
                <a href={file.url} target="_blank" rel="noopener noreferrer">
                  View File
                </a>
              </li>
            ))}
          </ul>
        )}
      </main>
    </>
  );
}
async function getFilesForUser(userId: string) {
// Implementation depends on your storage solution
return [];
}
```
## Deployment Options
```bash
# Deploy to Deno Deploy
deno task build
deployctl deploy --project=my-app --include=. --exclude=node_modules
```
```json title="deno.json"
{
"tasks": {
"build": "deno run -A dev.ts build",
"preview": "deno run -A main.ts",
"start": "deno run -A --watch=static/,routes/ dev.ts",
"deploy": "deployctl deploy --project=my-app --include=. --exclude=node_modules"
}
}
```
```dockerfile title="Dockerfile"
FROM denoland/deno:1.38.0
WORKDIR /app
# Copy dependency files
COPY deno.json deno.lock import_map.json ./
# Cache dependencies
RUN deno cache --import-map=import_map.json main.ts
# Copy source code
COPY . .
# Build the application
RUN deno task build
EXPOSE 8000
CMD ["deno", "run", "-A", "main.ts"]
```
```bash
# Install Deno
curl -fsSL https://deno.land/install.sh | sh
# Clone and run your app
git clone
cd
deno task start
```
```systemd title="/etc/systemd/system/fresh-app.service"
[Unit]
Description=Fresh App
After=network.target
[Service]
Type=simple
User=deno
WorkingDirectory=/opt/fresh-app
ExecStart=/home/deno/.deno/bin/deno run -A main.ts
Restart=always
[Install]
WantedBy=multi-user.target
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (matches lib/upload.ts)
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
# Fresh
PORT=8000
```
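Fresh's `dev.ts` task loads `.env` automatically via the standard library's dotenv module; if you run `main.ts` directly in production, load it yourself (a sketch, assuming `$std` is mapped in your import map):

```typescript title="main.ts"
// Load .env into Deno.env before the upload config reads it
import "$std/dotenv/load.ts";
```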
## Real-Time Upload Progress
```tsx title="islands/AdvancedUpload.tsx"
import { useState } from "preact/hooks";
export default function AdvancedUpload() {
const [uploadProgress, setUploadProgress] = useState(0);
const [isUploading, setIsUploading] = useState(false);
async function handleFileUpload(event: Event) {
const target = event.target as HTMLInputElement;
const files = target.files;
if (!files || files.length === 0) return;
setIsUploading(true);
setUploadProgress(0);
try {
// Simulate upload progress
for (let i = 0; i <= 100; i += 10) {
setUploadProgress(i);
await new Promise(resolve => setTimeout(resolve, 100));
}
alert('Upload completed!');
} catch (error) {
console.error('Upload failed:', error);
alert('Upload failed!');
} finally {
setIsUploading(false);
setUploadProgress(0);
}
}
  return (
    <div>
      <input type="file" onChange={handleFileUpload} disabled={isUploading} />

      {isUploading && (
        <div>
          <progress value={uploadProgress} max={100} />
          <span>{uploadProgress}% uploaded</span>
        </div>
      )}
    </div>
  );
}
```
## Deno KV Integration
```typescript title="lib/storage.ts"
// Example using Deno KV for file metadata storage
export class FileStorage {
  private kv: Deno.Kv;

  private constructor(kv: Deno.Kv) {
    this.kv = kv;
  }

  // Deno.openKv() is async, so construction goes through a static factory
  static async create(): Promise<FileStorage> {
    return new FileStorage(await Deno.openKv());
  }
async saveFileMetadata(userId: string, file: {
id: string;
name: string;
url: string;
size: number;
type: string;
}) {
const key = ["files", userId, file.id];
await this.kv.set(key, {
...file,
createdAt: new Date().toISOString(),
});
}
async getFilesForUser(userId: string) {
const files = [];
for await (const entry of this.kv.list({ prefix: ["files", userId] })) {
files.push(entry.value);
}
return files;
}
async deleteFile(userId: string, fileId: string) {
const key = ["files", userId, fileId];
await this.kv.delete(key);
}
}
// Top-level await is supported in Deno modules
export const fileStorage = await FileStorage.create();
```
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts`
2. **Import errors**: Check your `import_map.json` configuration
3. **Permissions**: Deno requires explicit permissions (`-A` flag for all permissions)
4. **Environment variables**: Use `Deno.env.get()` instead of `process.env`
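If you prefer not to grant blanket permissions with `-A`, run with an explicit allowlist instead (a sketch; add flags as your code requires):

```bash
deno run --allow-net --allow-env --allow-read dev.ts
```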
### Debug Mode
Enable debug logging:
```typescript title="lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (Deno.env.get("DENO_ENV") === "development") {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Fresh Configuration
```typescript title="fresh.config.ts"
import { defineConfig } from "$fresh/server.ts";
export default defineConfig({
plugins: [],
// Enable static file serving
staticDir: "./static",
// Custom build options
build: {
target: ["chrome99", "firefox99", "safari15"],
},
});
```
Fresh provides an excellent foundation for building modern web applications with Deno and pushduck, combining the power of islands architecture with Web Standards APIs and Deno's secure runtime environment.
# Hono
URL: /docs/integrations/hono
Fast, lightweight file uploads with Hono using Web Standards - no adapter needed!
***
title: Hono
description: Fast, lightweight file uploads with Hono using Web Standards - no adapter needed!
----------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Hono Integration
Hono is a fast, lightweight web framework built on Web Standards. Since Hono uses `Request` and `Response` objects natively, pushduck handlers work directly without any adapters!
**Web Standards Native**: Hono exposes `c.req.raw` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create Hono app with upload routes**
```typescript title="app.ts"
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload';
const app = new Hono();
// Direct usage - no adapter needed!
app.all('/api/upload/*', (c) => {
return uploadRouter.handlers(c.req.raw);
});
export default app;
```
## Basic Integration
### Simple Upload Route
```typescript title="app.ts"
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload';
const app = new Hono();
// Method 1: Combined handler (recommended)
app.all('/api/upload/*', (c) => {
return uploadRouter.handlers(c.req.raw);
});
// Method 2: Separate handlers (if you need method-specific logic)
app.get('/api/upload/*', (c) => uploadRouter.handlers.GET(c.req.raw));
app.post('/api/upload/*', (c) => uploadRouter.handlers.POST(c.req.raw));
export default app;
```
### With Middleware
```typescript title="app.ts"
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { logger } from 'hono/logger';
import { uploadRouter } from './lib/upload';
const app = new Hono();
// Global middleware
app.use('*', logger());
app.use('*', cors({
origin: ['http://localhost:3000', 'https://your-domain.com'],
allowMethods: ['GET', 'POST'],
allowHeaders: ['Content-Type'],
}));
// Upload routes
app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw));
// Health check
app.get('/health', (c) => c.json({ status: 'ok' }));
export default app;
```
## Advanced Configuration
### Authentication with Hono
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { verify } from 'hono/jwt';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with JWT authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const payload = await verify(token, process.env.JWT_SECRET!);
return {
userId: payload.sub as string,
userRole: payload.role as string
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
## Deployment Options
```typescript title="src/index.ts"
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload';
const app = new Hono();
app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw));
export default app;
```
```toml title="wrangler.toml"
name = "my-upload-api"
main = "src/index.ts"
compatibility_date = "2023-12-01"
[env.production]
vars = { NODE_ENV = "production" }
```
```bash
# Deploy to Cloudflare Workers
npx wrangler deploy
```
```typescript title="server.ts"
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload';
const app = new Hono();
app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw));
export default {
port: 3000,
fetch: app.fetch,
};
```
```bash
# Run with Bun
bun run server.ts
```
```typescript title="server.ts"
import { serve } from '@hono/node-server';
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload';
const app = new Hono();
app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw));
const port = 3000;
console.log(`Server is running on port ${port}`);
serve({
fetch: app.fetch,
port
});
```
```bash
# Run with Node.js
npm run dev
```
```typescript title="server.ts"
import { Hono } from 'hono';
import { uploadRouter } from './lib/upload.ts';
const app = new Hono();
app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw));
Deno.serve(app.fetch);
```
```bash
# Run with Deno
deno run --allow-net --allow-env server.ts
```
## Performance Benefits
* No adapter layer means zero performance overhead: pushduck handlers run directly in Hono.
* Hono is one of the fastest web frameworks, perfect for high-performance upload APIs.
* Works on Cloudflare Workers, Bun, Node.js, and Deno with the same code.
* Hono + pushduck creates incredibly lightweight upload services.
***
**Perfect Match**: Hono's Web Standards foundation and pushduck's universal design create a powerful, fast, and lightweight file upload solution that works everywhere.
# Next.js
URL: /docs/integrations/nextjs
Complete guide to integrating pushduck with Next.js App Router and Pages Router
***
title: Next.js
description: Complete guide to integrating pushduck with Next.js App Router and Pages Router
icon: "nextjs"
--------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Next.js Integration
Pushduck provides seamless integration with both Next.js App Router and Pages Router through universal handlers that work with Next.js's Web Standards-based API.
**Next.js 13+**: App Router uses Web Standards (Request/Response), so pushduck handlers work directly. Pages Router requires a simple adapter for the legacy req/res API.
## Quick Setup
**Install pushduck**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure your upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="app/api/upload/route.ts"
import { uploadRouter } from '@/lib/upload';
// Direct usage (recommended)
export const { GET, POST } = uploadRouter.handlers;
```
```typescript title="pages/api/upload/[...path].ts"
import { uploadRouter } from '@/lib/upload';
import { toNextJsPagesHandler } from 'pushduck/server';
export default toNextJsPagesHandler(uploadRouter.handlers);
```
## App Router Integration
Next.js App Router uses Web Standards, making integration seamless:
### Basic API Route
```typescript title="app/api/upload/route.ts"
import { uploadRouter } from '@/lib/upload';
// Direct usage - works because Next.js App Router uses Web Standards
export const { GET, POST } = uploadRouter.handlers;
```
### With Type Safety Adapter
For extra type safety and better IDE support:
```typescript title="app/api/upload/route.ts"
import { uploadRouter } from '@/lib/upload';
import { toNextJsHandler } from 'pushduck/adapters/nextjs';
// Explicit adapter for enhanced type safety
export const { GET, POST } = toNextJsHandler(uploadRouter.handlers);
```
### Advanced Configuration
```typescript title="app/api/upload/route.ts"
import { createUploadConfig } from 'pushduck/server';
import { getServerSession } from 'next-auth';
import { authOptions } from '@/lib/auth';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
const uploadRouter = createS3Router({
// Profile pictures with authentication
profilePicture: s3
.image()
.max("2MB")
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) {
throw new Error("Authentication required");
}
return {
userId: session.user.id,
category: "profile"
};
}),
// Document uploads for authenticated users
documents: s3
.file()
.max("10MB")
.types(["application/pdf", "text/plain", "application/msword"])
.middleware(async ({ req }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) {
throw new Error("Authentication required");
}
return {
userId: session.user.id,
category: "documents"
};
}),
// Public image uploads (no auth required)
publicImages: s3
.image()
.max("5MB")
.formats(["jpeg", "png", "webp"])
// No middleware = publicly accessible
});
export type AppUploadRouter = typeof uploadRouter;
export const { GET, POST } = uploadRouter.handlers;
```
## Pages Router Integration
Pages Router uses the legacy req/res API, so we provide a simple adapter:
### Basic API Route
```typescript title="pages/api/upload/[...path].ts"
import { uploadRouter } from '@/lib/upload';
import { toNextJsPagesHandler } from 'pushduck/adapters/nextjs-pages';
export default toNextJsPagesHandler(uploadRouter.handlers);
```
### With Authentication
```typescript title="pages/api/upload/[...path].ts"
import { createUploadConfig } from 'pushduck/server';
import { toNextJsPagesHandler } from 'pushduck/adapters/nextjs-pages';
import { getSession } from 'next-auth/react';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
// ... your config
})
.build();
const uploadRouter = createS3Router({
imageUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
// Convert Web Request to get session
const session = await getSession({ req: req as any });
if (!session?.user?.id) {
throw new Error("Authentication required");
}
return {
userId: session.user.id
};
})
});
export default toNextJsPagesHandler(uploadRouter.handlers);
```
## Client-Side Usage
The client-side code is identical for both App Router and Pages Router:
### Setup Upload Client
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from './upload';
export const upload = createUploadClient({
endpoint: '/api/upload'
});
```
### React Component
```typescript title="components/upload-form.tsx"
'use client'; // App Router
// or just regular component for Pages Router
import { upload } from '@/lib/upload-client';
import { useState } from 'react';
export function UploadForm() {
const { uploadFiles, files, isUploading, error } = upload.imageUpload();
  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);
    uploadFiles(selectedFiles);
  };

  return (
    <div>
      <input type="file" multiple accept="image/*" onChange={handleFileSelect} disabled={isUploading} />

      {error && (
        <p>Error: {error.message}</p>
      )}

      {files.length > 0 && (
        <ul>
          {files.map((file) => (
            <li key={file.id}>
              <span>{file.name}</span>
              <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
              <span>{file.status === 'success' ? 'Complete' : `${file.progress}%`}</span>
              {file.status === 'success' && file.url && (
                <a href={file.url} target="_blank" rel="noopener noreferrer">View</a>
              )}
            </li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
## Project Structure
Here's a recommended project structure for Next.js with pushduck:
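The tree below reflects the files used in this guide (a sketch; adjust to your app):

```
my-next-app/
├── app/
│   ├── api/
│   │   └── upload/
│   │       └── route.ts       # Upload API route (App Router)
│   └── upload/
│       └── page.tsx           # Upload page
├── components/
│   └── upload-form.tsx
├── lib/
│   ├── auth.ts                # NextAuth options
│   ├── upload.ts              # Upload router configuration
│   └── upload-client.ts       # Type-safe upload client
└── .env.local
```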
## Complete Example
### Upload Configuration
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { getServerSession } from 'next-auth';
import { authOptions } from './auth';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
const timestamp = Date.now();
const randomId = Math.random().toString(36).substring(2, 8);
return `${metadata.userId}/${timestamp}/${randomId}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Profile pictures - single image, authenticated
profilePicture: s3
.image()
.max("2MB")
.count(1)
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) throw new Error("Authentication required");
return { userId: session.user.id, type: "profile" };
}),
// Gallery images - multiple images, authenticated
gallery: s3
.image()
.max("5MB")
.count(10)
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) throw new Error("Authentication required");
return { userId: session.user.id, type: "gallery" };
}),
// Documents - various file types, authenticated
documents: s3
.file()
.max("10MB")
.count(5)
.types([
"application/pdf",
"application/msword",
"application/vnd.openxmlformats-officedocument.wordprocessingml.document",
"text/plain"
])
.middleware(async ({ req }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) throw new Error("Authentication required");
return { userId: session.user.id, type: "documents" };
}),
// Public uploads - no authentication required
public: s3
.image()
.max("1MB")
.count(1)
.formats(["jpeg", "png"])
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
### API Route (App Router)
```typescript title="app/api/upload/route.ts"
import { uploadRouter } from '@/lib/upload';
export const { GET, POST } = uploadRouter.handlers;
```
### Upload Page
```typescript title="app/upload/page.tsx"
'use client';
import { upload } from '@/lib/upload-client';
import { useState } from 'react';
export default function UploadPage() {
const [activeTab, setActiveTab] = useState<'profile' | 'gallery' | 'documents'>('profile');
const profileUpload = upload.profilePicture();
const galleryUpload = upload.gallery();
const documentsUpload = upload.documents();
const currentUpload = {
profile: profileUpload,
gallery: galleryUpload,
documents: documentsUpload
}[activeTab];
  return (
    <div className="max-w-2xl mx-auto p-6">
      <h1 className="text-2xl font-bold mb-6">File Upload Demo</h1>

      {/* Tab Navigation */}
      <div className="flex border-b mb-6">
        {[
          { key: 'profile', label: 'Profile Picture', icon: '👤' },
          { key: 'gallery', label: 'Gallery', icon: '🖼️' },
          { key: 'documents', label: 'Documents', icon: '📄' }
        ].map(tab => (
          <button
            key={tab.key}
            onClick={() => setActiveTab(tab.key as any)}
            className={`px-4 py-2 text-sm font-medium border-b-2 ${
              activeTab === tab.key
                ? 'border-blue-500 text-blue-600'
                : 'border-transparent text-gray-500 hover:text-gray-700'
            }`}
          >
            {tab.icon} {tab.label}
          </button>
        ))}
      </div>

      {/* Upload Interface */}
      <input
        type="file"
        multiple
        onChange={(e) => {
          const files = Array.from(e.target.files || []);
          currentUpload.uploadFiles(files);
        }}
        disabled={currentUpload.isUploading}
        className="block w-full text-sm text-gray-500 file:mr-4 file:py-2 file:px-4 file:rounded-full file:border-0 file:text-sm file:font-semibold file:bg-blue-50 file:text-blue-700 hover:file:bg-blue-100"
      />

      {/* File List */}
      {currentUpload.files.length > 0 && (
        <ul className="mt-6 space-y-2">
          {currentUpload.files.map((file) => (
            <li key={file.id} className="flex items-center gap-4">
              <span>{file.name}</span>
              <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
              <span>
                {file.status === 'success' && '✅'}
                {file.status === 'error' && '❌'}
                {file.status === 'uploading' && '⏳'}
                {file.status === 'pending' && '⏸️'}
              </span>
              {file.status === 'success' && file.url && (
                <a href={file.url} target="_blank" rel="noopener noreferrer">View</a>
              )}
            </li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
## Environment Variables
```bash title=".env.local"
# Cloudflare R2 Configuration
AWS_ACCESS_KEY_ID=your_r2_access_key
AWS_SECRET_ACCESS_KEY=your_r2_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
# Next.js Configuration
NEXTAUTH_SECRET=your-nextauth-secret
NEXTAUTH_URL=http://localhost:3000
```
## Deployment Considerations
* **Vercel**: environment variables configured in the dashboard, Edge Runtime compatible, automatic HTTPS
* **Netlify**: configure environment variables, works with Netlify Functions, CDN integration available
* **Other Node.js platforms**: complete Next.js compatibility, environment variable management, automatic deployments
***
**Next.js Ready**: Pushduck works seamlessly with both Next.js App Router and Pages Router, providing the same great developer experience across all Next.js versions.
# Nitro/H3
URL: /docs/integrations/nitro-h3
Universal web server file uploads with Nitro and H3 using Web Standards - no adapter needed!
***
title: Nitro/H3
description: Universal web server file uploads with Nitro and H3 using Web Standards - no adapter needed!
---------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Nitro/H3 Integration
Nitro is a universal web server framework that powers Nuxt.js, built on top of H3 (HTTP framework). It uses Web Standards APIs and provides excellent performance with universal deployment. Since Nitro/H3 uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Nitro/H3 uses Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and universal deployment capabilities.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="routes/api/upload/[...path].ts"
import { uploadRouter } from '~/lib/upload';
// Direct usage - no pushduck adapter needed!
// toWebRequest (auto-imported from h3) converts the event to a Web Standard Request
export default defineEventHandler(async (event) => {
  return uploadRouter.handlers(toWebRequest(event));
});
```
## Basic Integration
### Simple Upload Route
```typescript title="routes/api/upload/[...path].ts"
import { uploadRouter } from '~/lib/upload';
// Method 1: Combined handler (recommended)
export default defineEventHandler(async (event) => {
  return uploadRouter.handlers(toWebRequest(event));
});
// Method 2: Method-specific handlers (alternative - use one or the other)
// export default defineEventHandler(async (event) => {
//   const method = getMethod(event);
//   if (method === 'GET') {
//     return uploadRouter.handlers.GET(toWebRequest(event));
//   }
//   if (method === 'POST') {
//     return uploadRouter.handlers.POST(toWebRequest(event));
//   }
//   throw createError({
//     statusCode: 405,
//     statusMessage: 'Method Not Allowed'
//   });
// });
```
### With H3 Utilities
```typescript title="routes/api/upload/[...path].ts"
import { uploadRouter } from '~/lib/upload';
import {
  defineEventHandler,
  getMethod,
  setHeader,
  createError,
  toWebRequest
} from 'h3';
export default defineEventHandler(async (event) => {
  // Handle CORS
  setHeader(event, 'Access-Control-Allow-Origin', '*');
  setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
  setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type');
  // Handle preflight requests
  if (getMethod(event) === 'OPTIONS') {
    return '';
  }
  try {
    return await uploadRouter.handlers(toWebRequest(event));
  } catch (error) {
    throw createError({
      statusCode: 500,
      statusMessage: 'Upload failed',
      data: error
    });
  }
});
```
## Advanced Configuration
### Authentication with H3
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with session authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookies = req.headers.get('cookie');
const sessionId = parseCookie(cookies)?.sessionId;
if (!sessionId) {
throw new Error('Authentication required');
}
const user = await getUserFromSession(sessionId);
if (!user) {
throw new Error('Invalid session');
}
return {
userId: user.id,
username: user.username,
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
// Helper functions
function parseCookie(cookieString: string | null) {
if (!cookieString) return {};
return Object.fromEntries(
cookieString.split('; ').map(c => {
const [key, ...v] = c.split('=');
return [key, v.join('=')];
})
);
}
async function getUserFromSession(sessionId: string) {
// Implement your session validation logic
return { id: 'user-123', username: 'demo-user' };
}
```
## Standalone Nitro App
### Basic Nitro Setup
```typescript title="nitro.config.ts"
export default defineNitroConfig({
srcDir: 'server',
routeRules: {
'/api/upload/**': {
cors: true,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type'
}
}
},
experimental: {
wasm: true
}
});
```
### Server Entry Point
```typescript title="server/index.ts"
import { createApp, defineEventHandler, toNodeListener, toWebRequest } from 'h3';
import { uploadRouter } from './lib/upload';
const app = createApp();
// Upload routes (app.use matches by path prefix)
app.use('/api/upload', defineEventHandler(async (event) => {
  return uploadRouter.handlers(toWebRequest(event));
}));
// Health check
app.use('/health', defineEventHandler(() => ({ status: 'ok' })));
export default toNodeListener(app);
```
## Client-Side Usage
### HTML with Vanilla JavaScript
```html title="public/index.html"
File Upload Demo
```
### With Framework Integration
```typescript title="plugins/upload.client.ts"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "~/lib/upload";
export const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
  endpoint: "/api/upload",
});
```
## File Management
### File API Route
```typescript title="routes/api/files.get.ts"
import { defineEventHandler, getQuery, createError } from 'h3';
export default defineEventHandler(async (event) => {
const query = getQuery(event);
const userId = query.userId as string;
if (!userId) {
throw createError({
statusCode: 400,
statusMessage: 'User ID required'
});
}
// Fetch files from database
const files = await getFilesForUser(userId);
return {
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
};
});
async function getFilesForUser(userId: string) {
// Implement your database query logic
return [];
}
```
### File Management Page
```html title="public/files.html"
My Files
```
## Deployment Options
```typescript title="nitro.config.ts"
export default defineNitroConfig({
preset: 'vercel-edge',
// or 'vercel' for Node.js runtime
});
```
```typescript title="nitro.config.ts"
export default defineNitroConfig({
preset: 'netlify-edge',
// or 'netlify' for Node.js runtime
});
```
```typescript title="nitro.config.ts"
export default defineNitroConfig({
preset: 'node-server',
});
```
```typescript title="nitro.config.ts"
export default defineNitroConfig({
preset: 'cloudflare-workers',
});
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (referenced in lib/upload.ts)
AWS_ACCESS_KEY_ID=your_r2_access_key
AWS_SECRET_ACCESS_KEY=your_r2_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
# Nitro
NITRO_PORT=3000
NITRO_HOST=0.0.0.0
```
## Performance Benefits
Because H3 can hand pushduck a Web Standard `Request` directly, there is no adapter layer or per-request conversion overhead, and the same handler code runs unchanged on Node, serverless, and edge deployments.
## Middleware and Plugins
```typescript title="middleware/cors.ts"
export default defineEventHandler(async (event) => {
if (event.node.req.url?.startsWith('/api/upload')) {
setHeader(event, 'Access-Control-Allow-Origin', '*');
setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type');
if (getMethod(event) === 'OPTIONS') {
return '';
}
}
});
```
```typescript title="plugins/database.ts"
export default defineNitroPlugin(async (nitroApp) => {
  // Initialize database connection
  console.log('Database plugin initialized');
  // Add database to context on every request
  nitroApp.hooks.hook('request', async (event) => {
    event.context.db = await getDatabase();
  });
});
```
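With the `request` hook above in place, the connection is available as `event.context.db` in any handler; a minimal sketch (the `query` method is a stand-in for whatever client `getDatabase()` returns):
```typescript title="routes/api/items.get.ts"
export default defineEventHandler(async (event) => {
  // Attached by the database plugin's `request` hook above
  const db = event.context.db;
  // `query` is illustrative; use your actual client's API
  return await db.query('SELECT * FROM items');
});
```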
## Real-Time Upload Progress
```html title="public/advanced-upload.html"
Advanced Upload
```
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts`
2. **Build errors**: Check that pushduck and h3 are properly installed
3. **CORS issues**: Use Nitro's built-in CORS handling or middleware
4. **Environment variables**: Make sure they're accessible in your deployment environment
### Debug Mode
Enable debug logging:
```typescript title="lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (process.env.NODE_ENV === "development") {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Nitro Configuration
```typescript title="nitro.config.ts"
export default defineNitroConfig({
srcDir: 'server',
buildDir: '.nitro',
output: {
dir: '.output',
serverDir: '.output/server',
publicDir: '.output/public'
},
runtimeConfig: {
awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID,
awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
awsRegion: process.env.AWS_REGION,
s3BucketName: process.env.S3_BUCKET_NAME,
},
experimental: {
wasm: true
}
});
```
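Values declared in `runtimeConfig` can then be read inside any handler with Nitro's auto-imported `useRuntimeConfig()`; a minimal sketch:
```typescript title="routes/api/health.get.ts"
export default defineEventHandler(() => {
  const config = useRuntimeConfig();
  return {
    status: 'ok',
    bucket: config.s3BucketName,
  };
});
```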
Nitro/H3 provides an excellent foundation for building universal web applications with pushduck, offering flexibility, performance, and deployment options across any platform while maintaining full compatibility with Web Standards APIs.
# Nuxt.js
URL: /docs/integrations/nuxtjs
Vue.js full-stack file uploads with Nuxt.js using Web Standards - no adapter needed!
***
title: Nuxt.js
description: Vue.js full-stack file uploads with Nuxt.js using Web Standards - no adapter needed!
-------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Nuxt.js Integration
Nuxt.js is the intuitive Vue.js framework for building full-stack web applications. It uses Web Standards APIs and provides excellent performance with server-side rendering. Since Nuxt.js uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Nuxt.js server routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="server/utils/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="server/api/upload/[...path].ts"
import { uploadRouter } from '~/server/utils/upload';
// Direct usage - no pushduck adapter needed!
// toWebRequest (auto-imported from h3) converts the event to a Web Standard Request
export default defineEventHandler(async (event) => {
  return uploadRouter.handlers(toWebRequest(event));
});
```
## Basic Integration
### Simple Upload Route
```typescript title="server/api/upload/[...path].ts"
import { uploadRouter } from '~/server/utils/upload';
// Method 1: Combined handler (recommended)
export default defineEventHandler(async (event) => {
  return uploadRouter.handlers(toWebRequest(event));
});
// Method 2: Method-specific handlers (alternative - use one or the other)
// export default defineEventHandler({
//   onRequest: [
//     // Add middleware here if needed
//   ],
//   handler: async (event) => {
//     if (event.node.req.method === 'GET') {
//       return uploadRouter.handlers.GET(toWebRequest(event));
//     }
//     if (event.node.req.method === 'POST') {
//       return uploadRouter.handlers.POST(toWebRequest(event));
//     }
//   }
// });
```
### With Server Middleware
```typescript title="server/middleware/cors.ts"
export default defineEventHandler(async (event) => {
if (event.node.req.url?.startsWith('/api/upload')) {
// Handle CORS for upload routes
setHeader(event, 'Access-Control-Allow-Origin', '*');
setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type');
if (event.node.req.method === 'OPTIONS') {
return '';
}
}
});
```
## Advanced Configuration
### Authentication with Nuxt
```typescript title="server/utils/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import jwt from 'jsonwebtoken';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with JWT authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const payload = jwt.verify(token, process.env.JWT_SECRET!) as any;
return {
userId: payload.sub,
userRole: payload.role
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
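For local testing you can mint a matching token with `jsonwebtoken`'s `sign`; the claims below line up with what the middleware reads (`sub` and `role`):
```typescript
import jwt from 'jsonwebtoken';

// Issue a short-lived test token whose claims the middleware above expects
const token = jwt.sign(
  { sub: 'user-123', role: 'admin' },
  process.env.JWT_SECRET!,
  { expiresIn: '1h' }
);

// Send it as a Bearer token when calling the upload endpoint:
// fetch('/api/upload/...', { headers: { Authorization: `Bearer ${token}` } })
```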
## Client-Side Usage
### Upload Composable
```typescript title="composables/useUpload.ts"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "~/server/utils/upload";
export const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
  endpoint: "/api/upload",
});
```
### Upload Component
```vue title="components/FileUpload.vue"
Image Upload
Document Upload
```
### Using in Pages
```vue title="pages/index.vue"
File Upload Demo
```
## File Management
### Server-Side File API
```typescript title="server/api/files.get.ts"
export default defineEventHandler(async (event) => {
const query = getQuery(event);
const userId = query.userId as string;
if (!userId) {
throw createError({
statusCode: 400,
statusMessage: 'User ID required'
});
}
// Fetch files from database
const files = await $fetch('/api/database/files', {
query: { userId }
});
return {
files: files.map((file: any) => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
};
});
```
### File Management Page
```vue title="pages/files.vue"
<script setup lang="ts">
const { data } = await useFetch('/api/files', { query: { userId: 'current-user' } });
</script>
<template>
  <h1>My Files</h1>
  <h2>Uploaded Files</h2>
  <p v-if="!data?.files?.length">No files uploaded yet.</p>
  <ul v-else>
    <li v-for="file in data.files" :key="file.id">
      {{ file.name }} ({{ (file.size / 1024 / 1024).toFixed(2) }} MB, {{ new Date(file.uploadedAt).toLocaleDateString() }})
      <a :href="file.url" target="_blank">View File</a>
    </li>
  </ul>
</template>
```
## Deployment Options
```typescript title="nuxt.config.ts"
export default defineNuxtConfig({
nitro: {
preset: 'vercel-edge',
// or 'vercel' for Node.js runtime
},
runtimeConfig: {
awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID,
awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
awsRegion: process.env.AWS_REGION,
s3BucketName: process.env.S3_BUCKET_NAME,
}
});
```
```typescript title="nuxt.config.ts"
export default defineNuxtConfig({
nitro: {
preset: 'netlify-edge',
// or 'netlify' for Node.js runtime
},
runtimeConfig: {
awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID,
awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
awsRegion: process.env.AWS_REGION,
s3BucketName: process.env.S3_BUCKET_NAME,
}
});
```
```typescript title="nuxt.config.ts"
export default defineNuxtConfig({
nitro: {
preset: 'node-server',
},
runtimeConfig: {
awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID,
awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
awsRegion: process.env.AWS_REGION,
s3BucketName: process.env.S3_BUCKET_NAME,
}
});
```
```typescript title="nuxt.config.ts"
export default defineNuxtConfig({
nitro: {
preset: 'cloudflare-pages',
},
runtimeConfig: {
awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID,
awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
awsRegion: process.env.AWS_REGION,
s3BucketName: process.env.S3_BUCKET_NAME,
}
});
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (referenced in server/utils/upload.ts)
AWS_ACCESS_KEY_ID=your_r2_access_key
AWS_SECRET_ACCESS_KEY=your_r2_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
# JWT Secret (for authentication)
JWT_SECRET=your-jwt-secret
# Nuxt
NUXT_PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload
```
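If you also mirror these values into `runtimeConfig` (as the deployment examples above do), server routes can read them through `useRuntimeConfig()` instead of reaching for `process.env`; a minimal sketch:
```typescript title="server/api/config-check.get.ts"
export default defineEventHandler((event) => {
  // Keys match the runtimeConfig entries in nuxt.config.ts
  const config = useRuntimeConfig(event);
  return {
    region: config.awsRegion,
    bucket: config.s3BucketName,
    hasCredentials: Boolean(config.awsAccessKeyId && config.awsSecretAccessKey),
  };
});
```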
## Performance Benefits
Nuxt server routes hand pushduck a Web Standard `Request` with no adapter in between, so uploads add no per-request conversion overhead, and Nitro lets the same code deploy to Node, serverless, and edge targets.
## Real-Time Upload Progress
```vue title="components/AdvancedUpload.vue"
```
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `server/api/upload/[...path].ts`
2. **Build errors**: Check that pushduck is properly installed
3. **CORS issues**: Use server middleware for CORS configuration
4. **Runtime config**: Make sure environment variables are properly configured
### Debug Mode
Enable debug logging:
```typescript title="server/utils/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (process.dev) {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Nitro Configuration
```typescript title="nuxt.config.ts"
export default defineNuxtConfig({
nitro: {
experimental: {
wasm: true
},
// Enable debugging in development
devProxy: {
'/api/upload': {
target: 'http://localhost:3000/api/upload',
changeOrigin: true
}
}
}
});
```
Nuxt.js provides an excellent foundation for building full-stack Vue.js applications with pushduck, combining the power of Vue's reactive framework with Web Standards APIs and Nitro's universal deployment capabilities.
# Overview
URL: /docs/integrations/overview
Universal file uploads that work with any web framework - from Web Standards to custom request/response APIs
***
title: Overview
description: Universal file uploads that work with any web framework - from Web Standards to custom request/response APIs
-------------------------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Framework Integrations Overview
Pushduck provides **universal file upload handlers** that work with any web framework through a single, consistent API. Write your upload logic once and deploy it anywhere!
**Universal Design**: Pushduck uses Web Standards (Request/Response) at its core, making it compatible with both Web Standards frameworks and those with custom request/response APIs without framework-specific code.
## 🚀 Universal API
All frameworks use the same core API:
```typescript
import { createS3Router, s3 } from 'pushduck/server';
const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB"),
videoUpload: s3.file().max("100MB").types(["video/*"])
});
// Universal handlers - work with ANY framework
export const { GET, POST } = uploadRouter.handlers;
```
## Framework Categories
Pushduck supports frameworks in two categories:
**Web Standards frameworks (no adapter needed!)**: use `uploadRouter.handlers` directly.
* Hono
* Elysia
* Bun Runtime
* TanStack Start
* SolidJS Start
**Custom request/response frameworks**: simple adapters provided for seamless integration.
* Next.js (App & Pages Router)
* Express
* Fastify
## Quick Start by Framework
```typescript
// Works with: Hono, Elysia, Bun, TanStack Start, SolidJS Start
import { uploadRouter } from '@/lib/upload';
// Direct usage - no adapter needed!
app.all('/api/upload/*', (ctx) => {
return uploadRouter.handlers(ctx.request); // or c.req.raw
});
```
```typescript
// app/api/upload/route.ts
import { uploadRouter } from '@/lib/upload';
// Direct usage (recommended)
export const { GET, POST } = uploadRouter.handlers;
// Or with explicit adapter for extra type safety
import { toNextJsHandler } from 'pushduck/adapters/nextjs';
export const { GET, POST } = toNextJsHandler(uploadRouter.handlers);
```
```typescript
import express from 'express';
import { uploadRouter } from '@/lib/upload';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
app.all("/api/upload/*", toExpressHandler(uploadRouter.handlers));
```
```typescript
import Fastify from 'fastify';
import { uploadRouter } from '@/lib/upload';
import { toFastifyHandler } from 'pushduck/adapters/fastify';
const fastify = Fastify();
fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers));
```
## Why Universal Handlers Work
**Web Standards Foundation**
Pushduck is built on Web Standards (`Request` and `Response` objects) that are supported by all modern JavaScript runtimes.
```typescript
// Core handler signature
type Handler = (request: Request) => Promise<Response>
```
**Framework Compatibility**
Modern frameworks expose Web Standard objects directly:
* **Hono**: `c.req.raw` is a Web `Request`
* **Elysia**: `context.request` is a Web `Request`
* **Bun**: Native Web `Request` support
* **TanStack Start**: `{ request }` is a Web `Request`
* **SolidJS Start**: `event.request` is a Web `Request`
**Framework Adapters**
For frameworks with custom request/response APIs, simple adapters convert between formats:
```typescript
// Express adapter example
export function toExpressHandler(handlers: UniversalHandlers) {
return async (req: Request, res: Response, next: NextFunction) => {
const webRequest = convertExpressToWebRequest(req);
const webResponse = await handlers[req.method](webRequest);
convertWebResponseToExpress(webResponse, res);
};
}
```
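The two conversion helpers named above are illustrative rather than part of pushduck's public API; a minimal sketch of what they might look like on Node 18+, assuming Express's standard request and response objects:
```typescript
import type { Request as ExpressRequest, Response as ExpressResponse } from 'express';

// Sketch: build a Web Standard Request from an Express request
function convertExpressToWebRequest(req: ExpressRequest): Request {
  const url = `${req.protocol}://${req.get('host')}${req.originalUrl}`;
  return new Request(url, {
    method: req.method,
    headers: new Headers(req.headers as Record<string, string>),
    // Node request streams are async iterable, which Node's fetch accepts as a body
    body: ['GET', 'HEAD'].includes(req.method) ? undefined : (req as any),
    // @ts-expect-error `duplex` is required by Node's fetch for streaming bodies
    duplex: 'half',
  });
}

// Sketch: copy a Web Standard Response back onto the Express response
async function convertWebResponseToExpress(webResponse: Response, res: ExpressResponse) {
  res.status(webResponse.status);
  webResponse.headers.forEach((value, key) => res.setHeader(key, value));
  res.send(Buffer.from(await webResponse.arrayBuffer()));
}
```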
## Configuration (Same for All Frameworks)
Your upload configuration is identical across all frameworks:
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Image uploads with validation
imageUpload: s3
.image()
.max("5MB")
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const userId = await getUserId(req);
return { userId, category: "images" };
}),
// Document uploads
documentUpload: s3
.file()
.max("10MB")
.types(["application/pdf", "text/plain"])
.middleware(async ({ req }) => {
const userId = await getUserId(req);
return { userId, category: "documents" };
}),
// Video uploads
videoUpload: s3
.file()
.max("100MB")
.types(["video/mp4", "video/quicktime"])
.middleware(async ({ req }) => {
const userId = await getUserId(req);
return { userId, category: "videos" };
})
});
export type AppUploadRouter = typeof uploadRouter;
```
## Client Usage (Framework Independent)
The client-side code is identical regardless of your backend framework:
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from './upload';
export const upload = createUploadClient({
endpoint: '/api/upload'
});
```
```typescript title="components/upload-form.tsx"
import { upload } from '@/lib/upload-client';
export function UploadForm() {
  // Property-based access with full type safety
  const { uploadFiles, files, isUploading } = upload.imageUpload();
  const handleUpload = async (selectedFiles: File[]) => {
    await uploadFiles(selectedFiles);
  };
  return (
    <div>
      <input
        type="file"
        multiple
        disabled={isUploading}
        onChange={(e) => handleUpload(Array.from(e.target.files || []))}
      />
      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
  );
}
```
## Benefits of Universal Design
* **Portability**: Migrate from Express to Hono or Next.js to Bun without changing your upload implementation.
* **Zero overhead**: Web Standards native frameworks get direct handler access with no adapter overhead.
* **Learn once**: Master pushduck once and use it with any framework in your toolkit.
* **Future-proof**: As more frameworks adopt Web Standards, they automatically work with pushduck.
## Next Steps
Choose your framework integration guide:
* **Next.js**: Complete guide for Next.js App Router and Pages Router
* **Hono**: Fast, lightweight, built on Web Standards
* **Elysia**: TypeScript-first framework with Bun
* **Express**: Classic Node.js framework integration
***
**Universal by Design**: Write once, run anywhere. Pushduck's universal handlers make file uploads work seamlessly across the entire JavaScript ecosystem.
# Qwik
URL: /docs/integrations/qwik
Edge-optimized file uploads with Qwik using Web Standards - no adapter needed!
***
title: Qwik
description: Edge-optimized file uploads with Qwik using Web Standards - no adapter needed!
-------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Qwik Integration
Qwik is a revolutionary web framework focused on resumability and edge optimization. It uses Web Standards APIs and provides instant loading with minimal JavaScript. Since Qwik uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Qwik server endpoints use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and perfect for edge deployment.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!,
secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!,
bucket: import.meta.env.VITE_S3_BUCKET_NAME!,
accountId: import.meta.env.VITE_R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="src/routes/api/upload/[...path]/index.ts"
import type { RequestHandler } from '@builder.io/qwik-city';
import { uploadRouter } from '~/lib/upload';
// Direct usage - no adapter needed!
export const onGet: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
export const onPost: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
```
## Basic Integration
### Simple Upload Route
```typescript title="src/routes/api/upload/[...path]/index.ts"
import type { RequestHandler } from '@builder.io/qwik-city';
import { uploadRouter } from '~/lib/upload';
// Method 1: Combined handler (recommended)
export const onRequest: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
// Method 2: Separate handlers (if you need method-specific logic)
export const onGet: RequestHandler = async ({ request }) => {
return uploadRouter.handlers.GET(request);
};
export const onPost: RequestHandler = async ({ request }) => {
return uploadRouter.handlers.POST(request);
};
```
### With CORS Support
```typescript title="src/routes/api/upload/[...path]/index.ts"
import type { RequestHandler } from '@builder.io/qwik-city';
import { uploadRouter } from '~/lib/upload';
export const onRequest: RequestHandler = async ({ request, headers }) => {
// Handle CORS preflight
if (request.method === 'OPTIONS') {
headers.set('Access-Control-Allow-Origin', '*');
headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
headers.set('Access-Control-Allow-Headers', 'Content-Type');
return new Response(null, { status: 200 });
}
const response = await uploadRouter.handlers(request);
// Add CORS headers to actual response
headers.set('Access-Control-Allow-Origin', '*');
return response;
};
```
## Advanced Configuration
### Authentication with Qwik
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!,
secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!,
bucket: import.meta.env.VITE_S3_BUCKET_NAME!,
accountId: import.meta.env.VITE_R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with cookie-based authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookies = req.headers.get('Cookie');
const sessionId = parseCookie(cookies)?.sessionId;
if (!sessionId) {
throw new Error('Authentication required');
}
const user = await getUserFromSession(sessionId);
if (!user) {
throw new Error('Invalid session');
}
return {
userId: user.id,
username: user.username,
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
// Helper functions
function parseCookie(cookieString: string | null) {
if (!cookieString) return {};
return Object.fromEntries(
cookieString.split('; ').map(c => {
const [key, ...v] = c.split('=');
return [key, v.join('=')];
})
);
}
async function getUserFromSession(sessionId: string) {
// Implement your session validation logic
return { id: 'user-123', username: 'demo-user' };
}
```
## Client-Side Usage
### Upload Component
```tsx title="src/components/file-upload.tsx"
import { component$, useSignal, $ } from '@builder.io/qwik';
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "~/lib/upload";
export const FileUpload = component$(() => {
  const uploadProgress = useSignal(0);
  const isUploading = useSignal(false);
  const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
    endpoint: "/api/upload",
  });
  const handleUploadComplete = $((files: any[]) => {
    console.log("Files uploaded:", files);
    alert("Upload completed!");
  });
  const handleUploadError = $((error: Error) => {
    console.error("Upload error:", error);
    alert(`Upload failed: ${error.message}`);
  });
  return (
    <div>
      <h2>Image Upload</h2>
      {/* Prop names below are illustrative */}
      <UploadDropzone onUploadComplete={handleUploadComplete} onUploadError={handleUploadError} />
      <h2>Document Upload</h2>
      <UploadButton onUploadComplete={handleUploadComplete} onUploadError={handleUploadError} />
    </div>
  );
});
```
### Using in Routes
```tsx title="src/routes/index.tsx"
import { component$ } from '@builder.io/qwik';
import type { DocumentHead } from '@builder.io/qwik-city';
import { FileUpload } from '~/components/file-upload';
export default component$(() => {
  return (
    <main>
      <h1>File Upload Demo</h1>
      <FileUpload />
    </main>
  );
});
export const head: DocumentHead = {
title: 'File Upload Demo',
meta: [
{
name: 'description',
content: 'Qwik file upload demo with pushduck',
},
],
};
```
## File Management
### Server-Side File Loader
```typescript title="src/routes/files/index.tsx"
import { component$ } from '@builder.io/qwik';
import type { DocumentHead } from '@builder.io/qwik-city';
import { routeLoader$ } from '@builder.io/qwik-city';
import { FileUpload } from '~/components/file-upload';
export const useFiles = routeLoader$(async (requestEvent) => {
const userId = 'current-user'; // Get from session/auth
// Fetch files from database
const files = await getFilesForUser(userId);
return {
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
};
});
export default component$(() => {
  const filesData = useFiles();
  const formatFileSize = (bytes: number): string => {
    const sizes = ['Bytes', 'KB', 'MB', 'GB'];
    if (bytes === 0) return '0 Bytes';
    const i = Math.floor(Math.log(bytes) / Math.log(1024));
    return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
  };
  return (
    <div>
      <h1>My Files</h1>
      <FileUpload />
      <h2>Uploaded Files</h2>
      {filesData.value.files.length === 0 ? (
        <p>No files uploaded yet.</p>
      ) : (
        <ul>
          {filesData.value.files.map((file) => (
            <li key={file.id}>
              <span>{file.name}</span>
              <span>{formatFileSize(file.size)}</span>
              <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
              <a href={file.url} target="_blank">View File</a>
            </li>
          ))}
        </ul>
      )}
    </div>
  );
});
export const head: DocumentHead = {
title: 'My Files',
};
async function getFilesForUser(userId: string) {
// Implement your database query logic
return [];
}
```
## Deployment Options
```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { cloudflarePagesAdapter } from '@builder.io/qwik-city/adapters/cloudflare-pages/vite';
export default defineConfig(() => {
return {
plugins: [
qwikCity({
adapter: cloudflarePagesAdapter(),
}),
qwikVite(),
],
};
});
```
```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { vercelEdgeAdapter } from '@builder.io/qwik-city/adapters/vercel-edge/vite';
export default defineConfig(() => {
return {
plugins: [
qwikCity({
adapter: vercelEdgeAdapter(),
}),
qwikVite(),
],
};
});
```
```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { netlifyEdgeAdapter } from '@builder.io/qwik-city/adapters/netlify-edge/vite';
export default defineConfig(() => {
return {
plugins: [
qwikCity({
adapter: netlifyEdgeAdapter(),
}),
qwikVite(),
],
};
});
```
```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { denoServerAdapter } from '@builder.io/qwik-city/adapters/deno-server/vite';
export default defineConfig(() => {
return {
plugins: [
qwikCity({
adapter: denoServerAdapter(),
}),
qwikVite(),
],
};
});
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (referenced in src/lib/upload.ts)
VITE_AWS_ACCESS_KEY_ID=your_access_key
VITE_AWS_SECRET_ACCESS_KEY=your_secret_key
VITE_AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
VITE_S3_BUCKET_NAME=your-bucket-name
VITE_R2_ACCOUNT_ID=your-account-id
# Qwik
VITE_PUBLIC_UPLOAD_ENDPOINT=http://localhost:5173/api/upload
```
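Note that `VITE_`-prefixed variables are inlined into the client bundle at build time, so real credentials are safer kept in server-only variables. Qwik City exposes those on `requestEvent.env`; a minimal sketch (the route path is illustrative):
```typescript title="src/routes/api/env-check/index.ts"
import type { RequestHandler } from '@builder.io/qwik-city';

export const onGet: RequestHandler = async ({ env, json }) => {
  // env.get reads server-side variables that never reach the client bundle
  json(200, { hasBucket: Boolean(env.get('S3_BUCKET_NAME')) });
};
```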
## Performance Benefits
Qwik City passes pushduck a Web Standard `Request` directly, so uploads add no adapter overhead, and the same handlers run unchanged on the edge runtimes where Qwik's resumability shines.
## Real-Time Upload Progress
```tsx title="src/components/advanced-upload.tsx"
import { component$, useSignal, $ } from '@builder.io/qwik';
export const AdvancedUpload = component$(() => {
const uploadProgress = useSignal(0);
const isUploading = useSignal(false);
const handleFileUpload = $(async (event: Event) => {
const target = event.target as HTMLInputElement;
const files = target.files;
if (!files || files.length === 0) return;
isUploading.value = true;
uploadProgress.value = 0;
try {
// Simulate upload progress
for (let i = 0; i <= 100; i += 10) {
uploadProgress.value = i;
await new Promise(resolve => setTimeout(resolve, 100));
}
alert('Upload completed!');
} catch (error) {
console.error('Upload failed:', error);
alert('Upload failed!');
} finally {
isUploading.value = false;
uploadProgress.value = 0;
}
});
  return (
    <div>
      <input type="file" multiple onChange$={handleFileUpload} disabled={isUploading.value} />
      {isUploading.value && (
        <p>{uploadProgress.value}% uploaded</p>
      )}
    </div>
  );
});
```
## Qwik City Form Integration
```tsx title="src/routes/upload-form/index.tsx"
import { component$ } from '@builder.io/qwik';
import type { DocumentHead } from '@builder.io/qwik-city';
import { routeAction$, Form, zod$, z } from '@builder.io/qwik-city';
import { FileUpload } from '~/components/file-upload';
export const useUploadAction = routeAction$(async (data, requestEvent) => {
// Handle form submission
// Files are already uploaded via pushduck, just save metadata
console.log('Form data:', data);
// Redirect to files page
throw requestEvent.redirect(302, '/files');
}, zod$({
title: z.string().min(1),
description: z.string().optional(),
}));
export default component$(() => {
const uploadAction = useUploadAction();
  return (
    <div>
      <h1>Upload Files</h1>
      <Form action={uploadAction}>
        <input name="title" placeholder="Title" />
        <textarea name="description" placeholder="Description" />
        <FileUpload />
        <button type="submit">Submit</button>
      </Form>
    </div>
  );
});
export const head: DocumentHead = {
title: 'Upload Form',
};
```
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `src/routes/api/upload/[...path]/index.ts`
2. **Build errors**: Check that pushduck is properly installed and configured
3. **Environment variables**: Use `import.meta.env.VITE_` prefix for client-side variables
4. **Resumability**: Remember to use the `$` suffix for event handlers and functions (see the sketch below)
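A minimal sketch of the resumability rule from item 4: handlers passed to JSX must be wrapped in `$()` (or passed via `$`-suffixed props such as `onClick$`) so Qwik can serialize and lazily resume them:
```tsx
import { component$, $ } from '@builder.io/qwik';

export const Example = component$(() => {
  // Wrapped in $() so Qwik can serialize, code-split, and resume this handler
  const handleClick = $(() => console.log('clicked'));
  return <button onClick$={handleClick}>Click me</button>;
});
```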
### Debug Mode
Enable debug logging:
```typescript title="src/lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (import.meta.env.DEV) {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Qwik Configuration
```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
export default defineConfig(() => {
return {
plugins: [qwikCity(), qwikVite()],
preview: {
headers: {
'Cache-Control': 'public, max-age=600',
},
},
// Environment variables configuration
define: {
'import.meta.env.VITE_AWS_ACCESS_KEY_ID': JSON.stringify(process.env.VITE_AWS_ACCESS_KEY_ID),
}
};
});
```
Qwik provides a revolutionary approach to web development with pushduck, offering instant loading and resumability while maintaining full compatibility with Web Standards APIs for optimal edge performance.
# Remix
URL: /docs/integrations/remix
Full-stack React file uploads with Remix using Web Standards - no adapter needed!
***
title: Remix
description: Full-stack React file uploads with Remix using Web Standards - no adapter needed!
----------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# Remix Integration
Remix is a full-stack React framework focused on web standards and modern UX. It uses Web Standards APIs and provides server-side rendering with client-side hydration. Since Remix uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: Remix loader and action functions use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="app/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="app/routes/api.upload.$.tsx"
import type { ActionFunctionArgs, LoaderFunctionArgs } from "@remix-run/node";
import { uploadRouter } from "~/lib/upload";
// Direct usage - no adapter needed!
export async function loader({ request }: LoaderFunctionArgs) {
return uploadRouter.handlers(request);
}
export async function action({ request }: ActionFunctionArgs) {
return uploadRouter.handlers(request);
}
```
## Basic Integration
### Simple Upload Route
```typescript title="app/routes/api.upload.$.tsx"
import type { ActionFunctionArgs, LoaderFunctionArgs } from "@remix-run/node";
import { uploadRouter } from "~/lib/upload";
// Method 1: Combined handler (recommended)
export async function loader({ request }: LoaderFunctionArgs) {
return uploadRouter.handlers(request);
}
export async function action({ request }: ActionFunctionArgs) {
return uploadRouter.handlers(request);
}
// Method 2: Method-specific handlers (alternative - replaces Method 1)
// export async function loader({ request }: LoaderFunctionArgs) {
//   if (request.method === 'GET') {
//     return uploadRouter.handlers.GET(request);
//   }
//   throw new Response("Method not allowed", { status: 405 });
// }
// export async function action({ request }: ActionFunctionArgs) {
//   if (request.method === 'POST') {
//     return uploadRouter.handlers.POST(request);
//   }
//   throw new Response("Method not allowed", { status: 405 });
// }
```
### With Resource Route
```typescript title="app/routes/api.upload.$.tsx"
import type { ActionFunctionArgs, LoaderFunctionArgs } from "@remix-run/node";
import { uploadRouter } from "~/lib/upload";
// Handle CORS for cross-origin requests
export async function loader({ request }: LoaderFunctionArgs) {
// Handle preflight requests
if (request.method === 'OPTIONS') {
return new Response(null, {
status: 200,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type',
},
});
}
const response = await uploadRouter.handlers(request);
// Add CORS headers
response.headers.set('Access-Control-Allow-Origin', '*');
return response;
}
export async function action({ request }: ActionFunctionArgs) {
const response = await uploadRouter.handlers(request);
// Add CORS headers
response.headers.set('Access-Control-Allow-Origin', '*');
return response;
}
```
## Advanced Configuration
### Authentication with Remix
```typescript title="app/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { getSession } from '~/sessions';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with session authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookie = req.headers.get("Cookie");
const session = await getSession(cookie);
if (!session.has("userId")) {
throw new Error("Authentication required");
}
return {
userId: session.get("userId"),
username: session.get("username"),
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
## Client-Side Usage
### Remix Upload Hook
```typescript title="app/hooks/useUpload.ts"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "~/lib/upload";
export const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
  endpoint: "/api/upload",
});
```
### Upload Component
```tsx title="app/components/FileUpload.tsx"
import { UploadButton, UploadDropzone } from "~/hooks/useUpload";
export function FileUpload() {
  function handleUploadComplete(files: any[]) {
    console.log("Files uploaded:", files);
    alert("Upload completed!");
  }
  function handleUploadError(error: Error) {
    console.error("Upload error:", error);
    alert(`Upload failed: ${error.message}`);
  }
  return (
    <div>
      <h2>Image Upload</h2>
      {/* Prop names below are illustrative */}
      <UploadDropzone onUploadComplete={handleUploadComplete} onUploadError={handleUploadError} />
      <h2>Document Upload</h2>
      <UploadButton onUploadComplete={handleUploadComplete} onUploadError={handleUploadError} />
    </div>
  );
}
```
### Using in Routes
```tsx title="app/routes/_index.tsx"
import { FileUpload } from "~/components/FileUpload";
export default function Index() {
  return (
    <main>
      <h1>File Upload Demo</h1>
      <FileUpload />
    </main>
  );
}
```
## File Management
### Server-Side File Loader
```typescript title="app/routes/files.tsx"
import type { LoaderFunctionArgs } from "@remix-run/node";
import { json } from "@remix-run/node";
import { useLoaderData } from "@remix-run/react";
import { FileUpload } from "~/components/FileUpload";
import { getSession } from "~/sessions";
// Illustrative database client import (e.g. a Prisma client); adjust to your setup
import { db } from "~/db.server";
export async function loader({ request }: LoaderFunctionArgs) {
const cookie = request.headers.get("Cookie");
const session = await getSession(cookie);
if (!session.has("userId")) {
throw new Response("Unauthorized", { status: 401 });
}
const userId = session.get("userId");
// Fetch files from database
const files = await db.file.findMany({
where: { userId },
orderBy: { createdAt: 'desc' },
});
return json({
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
});
}
export default function FilesPage() {
const { files } = useLoaderData<typeof loader>();
function formatFileSize(bytes: number): string {
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
if (bytes === 0) return '0 Bytes';
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
  return (
    <div>
      <h1>My Files</h1>
      <FileUpload />
      <h2>Uploaded Files</h2>
      {files.length === 0 ? (
        <p>No files uploaded yet.</p>
      ) : (
        <ul>
          {files.map((file) => (
            <li key={file.id}>
              <span>{file.name}</span>
              <span>{formatFileSize(file.size)}</span>
              <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
              <a href={file.url} target="_blank" rel="noreferrer">View File</a>
            </li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
## Deployment Options
```typescript title="remix.config.js"
/** @type {import('@remix-run/dev').AppConfig} */
export default {
ignoredRouteFiles: ["**/.*"],
server: "./server.ts",
serverBuildPath: "api/index.js",
// Vercel configuration
serverConditions: ["workerd", "worker", "browser"],
serverDependenciesToBundle: "all",
serverMainFields: ["browser", "module", "main"],
serverMinify: true,
serverModuleFormat: "esm",
serverPlatform: "neutral",
};
```
```typescript title="remix.config.js"
/** @type {import('@remix-run/dev').AppConfig} */
export default {
ignoredRouteFiles: ["**/.*"],
server: "./server.ts",
serverBuildPath: ".netlify/functions-internal/server.js",
// Netlify configuration
serverConditions: ["deno", "worker", "browser"],
serverDependenciesToBundle: "all",
serverMainFields: ["browser", "module", "main"],
serverMinify: true,
serverModuleFormat: "esm",
serverPlatform: "neutral",
};
```
```dockerfile title="Dockerfile"
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
# Install all dependencies (dev dependencies are needed for `npm run build`)
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```
```json title="package.json"
{
"scripts": {
"build": "remix build",
"dev": "remix dev",
"start": "remix-serve build",
"typecheck": "tsc"
}
}
```
## Environment Variables
```bash title=".env"
# Cloudflare R2 Configuration (referenced in app/lib/upload.ts)
AWS_ACCESS_KEY_ID=your_r2_access_key
AWS_SECRET_ACCESS_KEY=your_r2_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket-name
R2_ACCOUNT_ID=your-account-id
# Session Secret
SESSION_SECRET=your-session-secret
# Database
DATABASE_URL=your-database-url
```
## Performance Benefits
Remix loaders and actions already receive Web Standard `Request` objects, so pushduck handlers run with zero adapter overhead on Node, serverless, and edge deployments alike.
## Real-Time Upload Progress
```tsx title="app/components/AdvancedUpload.tsx"
import { useState } from "react";
export function AdvancedUpload() {
const [uploadProgress, setUploadProgress] = useState(0);
const [isUploading, setIsUploading] = useState(false);
async function handleFileUpload(event: React.ChangeEvent<HTMLInputElement>) {
const files = event.target.files;
if (!files || files.length === 0) return;
setIsUploading(true);
setUploadProgress(0);
try {
// Simulate upload progress
for (let i = 0; i <= 100; i += 10) {
setUploadProgress(i);
await new Promise(resolve => setTimeout(resolve, 100));
}
alert('Upload completed!');
} catch (error) {
console.error('Upload failed:', error);
alert('Upload failed!');
} finally {
setIsUploading(false);
setUploadProgress(0);
}
}
  return (
    <div>
      <input type="file" multiple onChange={handleFileUpload} disabled={isUploading} />
      {isUploading && (
        <p>{uploadProgress}% uploaded</p>
      )}
    </div>
  );
}
```
## Form Integration
```tsx title="app/routes/upload-form.tsx"
import type { ActionFunctionArgs } from "@remix-run/node";
import { json, redirect } from "@remix-run/node";
import { Form, useActionData, useNavigation } from "@remix-run/react";
export async function action({ request }: ActionFunctionArgs) {
const formData = await request.formData();
const title = formData.get("title") as string;
const description = formData.get("description") as string;
// Handle form submission with file uploads
// Files are already uploaded via pushduck, just save metadata
return redirect("/files");
}
export default function UploadForm() {
const actionData = useActionData();
const navigation = useNavigation();
const isSubmitting = navigation.state === "submitting";
  return (
    <Form method="post">
      <h1>Upload Files</h1>
      <label>
        Title
        <input type="text" name="title" required />
      </label>
      <label>
        Description
        <textarea name="description" />
      </label>
      <label>
        Files
        <input type="file" name="files" multiple />
      </label>
      <button type="submit" disabled={isSubmitting}>
        {isSubmitting ? "Uploading..." : "Upload Files"}
      </button>
    </Form>
  );
}
```
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `app/routes/api.upload.$.tsx`
2. **Build errors**: Check that pushduck is properly installed
3. **Session issues**: Make sure your session configuration is correct
4. **CORS errors**: Add proper CORS headers in your resource routes
### Debug Mode
Enable debug logging:
```typescript title="app/lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (process.env.NODE_ENV === "development") {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
### Session Configuration
```typescript title="app/sessions.ts"
import { createCookieSessionStorage } from "@remix-run/node";
export const { getSession, commitSession, destroySession } =
createCookieSessionStorage({
cookie: {
name: "__session",
httpOnly: true,
maxAge: 60 * 60 * 24 * 30, // 30 days
path: "/",
sameSite: "lax",
secrets: [process.env.SESSION_SECRET!],
secure: process.env.NODE_ENV === "production",
},
});
```
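The session is typically populated in a login action and committed back as a cookie; a minimal sketch using the storage above (the authentication step itself is left out):
```typescript title="app/routes/login.tsx"
import type { ActionFunctionArgs } from "@remix-run/node";
import { redirect } from "@remix-run/node";
import { getSession, commitSession } from "~/sessions";

export async function action({ request }: ActionFunctionArgs) {
  const session = await getSession(request.headers.get("Cookie"));
  // Authenticate the user here, then store the id the loaders expect
  session.set("userId", "user-123");
  return redirect("/files", {
    headers: { "Set-Cookie": await commitSession(session) },
  });
}
```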
Remix provides an excellent foundation for building full-stack React applications with pushduck, combining the power of React with Web Standards APIs and progressive enhancement for optimal user experience.
# SolidJS Start
URL: /docs/integrations/solidjs-start
Full-stack SolidJS framework with Web Standards support - no adapter needed!
***
title: SolidJS Start
description: Full-stack SolidJS framework with Web Standards support - no adapter needed!
-----------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# SolidJS Start
SolidJS Start is a full-stack SolidJS framework with built-in Web Standards support. Since SolidJS Start provides `event.request` as a Web Standard `Request` object, pushduck handlers work directly without any adapters!
**Web Standards Native**: SolidJS Start's API handlers receive `event.request` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="src/routes/api/upload/[...path].ts"
import { APIEvent } from '@solidjs/start/server';
import { uploadRouter } from '~/lib/upload';
export async function GET(event: APIEvent) {
return uploadRouter.handlers(event.request);
}
export async function POST(event: APIEvent) {
return uploadRouter.handlers(event.request);
}
```
## Basic Integration
### API Route Handler
```typescript title="src/routes/api/upload/[...path].ts"
import { APIEvent } from '@solidjs/start/server';
import { uploadRouter } from '~/lib/upload';
// Method 1: Individual handlers
export async function GET(event: APIEvent) {
return uploadRouter.handlers.GET(event.request);
}
export async function POST(event: APIEvent) {
return uploadRouter.handlers.POST(event.request);
}
// Method 2: Combined handler (alternative approach)
// export async function handler(event: APIEvent) {
// return uploadRouter.handlers(event.request);
// }
```
### With CORS Middleware
```typescript title="src/routes/api/upload/[...path].ts"
import { APIEvent } from '@solidjs/start/server';
import { uploadRouter } from '~/lib/upload';
function addCORSHeaders(response: Response) {
response.headers.set('Access-Control-Allow-Origin', '*');
response.headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
return response;
}
export async function GET(event: APIEvent) {
const response = await uploadRouter.handlers.GET(event.request);
return addCORSHeaders(response);
}
export async function POST(event: APIEvent) {
const response = await uploadRouter.handlers.POST(event.request);
return addCORSHeaders(response);
}
export async function OPTIONS() {
return new Response(null, {
status: 200,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
}
});
}
```
## Advanced Configuration
### Authentication with SolidJS Start
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const user = await verifyJWT(token);
return {
userId: user.id,
userRole: user.role
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
async function verifyJWT(token: string) {
// Your JWT verification logic here
return { id: 'user-123', role: 'user' };
}
export type AppUploadRouter = typeof uploadRouter;
```
### Protected API Routes
```typescript title="src/routes/api/upload/[...path].ts"
import { APIEvent } from '@solidjs/start/server';
import { uploadRouter } from '~/lib/upload';
async function requireAuth(request: Request) {
const authHeader = request.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Response(JSON.stringify({ error: 'Authorization required' }), {
status: 401,
headers: { 'Content-Type': 'application/json' }
});
}
const token = authHeader.substring(7);
// Verify token logic here
return { userId: 'user-123' };
}
export async function GET(event: APIEvent) {
await requireAuth(event.request);
return uploadRouter.handlers.GET(event.request);
}
export async function POST(event: APIEvent) {
await requireAuth(event.request);
return uploadRouter.handlers.POST(event.request);
}
```
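Both `verifyJWT` in the router config and the token check in `requireAuth` are left as stubs. If you don't already have one, a minimal sketch using the `jose` library (an assumption; substitute whatever your auth stack provides) might look like this:

```typescript title="src/lib/auth.ts"
import { jwtVerify } from 'jose';

const secret = new TextEncoder().encode(process.env.JWT_SECRET!);

// Hypothetical helper: throws if the signature or expiry is invalid
export async function verifyJWT(token: string) {
  const { payload } = await jwtVerify(token, secret);
  return {
    id: payload.sub as string,
    role: (payload.role as string | undefined) ?? 'user',
  };
}
```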
## Client-Side Integration
### Upload Component
```typescript title="src/components/UploadDemo.tsx"
import { createSignal } from 'solid-js';
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '~/lib/upload';
const uploadClient = createUploadClient<AppUploadRouter>({
baseUrl: import.meta.env.DEV
? 'http://localhost:3000'
: 'https://your-domain.com'
});
export function UploadDemo() {
const [uploading, setUploading] = createSignal(false);
const [results, setResults] = createSignal<any[]>([]);
const handleUpload = async (files: File[]) => {
if (!files.length) return;
setUploading(true);
try {
const uploadResults = await uploadClient.upload('imageUpload', {
files,
metadata: { userId: 'demo-user' }
});
setResults(uploadResults);
console.log('Upload successful:', uploadResults);
} catch (error) {
console.error('Upload failed:', error);
} finally {
setUploading(false);
}
};
return (
<div class="space-y-4">
  <input
    type="file"
    multiple
    accept="image/*"
    onChange={(e) => {
      const files = e.currentTarget.files;
      if (files) {
        handleUpload(Array.from(files));
      }
    }}
    disabled={uploading()}
    class="block w-full text-sm text-gray-500 file:mr-4 file:py-2 file:px-4 file:rounded-full file:border-0 file:text-sm file:font-semibold file:bg-blue-50 file:text-blue-700 hover:file:bg-blue-100"
  />
  {uploading() && <p class="text-blue-600">Uploading...</p>}
  {results().length > 0 && (
    <div>
      <h3 class="font-semibold">Upload Results:</h3>
      <ul>
        {results().map((result) => (
          <li>{JSON.stringify(result)}</li>
        ))}
      </ul>
    </div>
  )}
</div>
);
}
```
### Page Integration
```typescript title="src/routes/upload.tsx"
import { Title } from '@solidjs/meta';
import { UploadDemo } from '~/components/UploadDemo';
export default function UploadPage() {
return (
<>
  <Title>File Upload Demo</Title>
  <UploadDemo />
</>
);
}
```
## Full Application Example
### Project Structure
```
src/
├── components/
│   └── UploadDemo.tsx
├── lib/
│   └── upload.ts
├── routes/
│   ├── api/
│   │   └── upload/
│   │       └── [...path].ts
│   ├── upload.tsx
│   └── (home).tsx
├── app.tsx
└── entry-server.tsx
```
### Root Layout
```typescript title="src/app.tsx"
import { Router } from '@solidjs/router';
import { FileRoutes } from '@solidjs/start/router';
import { Suspense } from 'solid-js';
import './app.css';
export default function App() {
return (
<Router
  root={(props) => (
    <>
      <nav>Upload Demo</nav>
      <Suspense>{props.children}</Suspense>
    </>
  )}
>
  <FileRoutes />
</Router>
);
}
```
### Home Page
```typescript title="src/routes/(home).tsx"
import { A } from '@solidjs/router';
import { Title } from '@solidjs/meta';
export default function Home() {
return (
<>
  <Title>SolidJS Start + Pushduck</Title>
  <h1>SolidJS Start + Pushduck</h1>
  <p>High-performance file uploads with SolidJS</p>
  <A href="/upload">Try Upload Demo</A>
</>
);
}
```
## Performance Benefits
* No adapter layer means zero performance overhead - pushduck handlers run directly in SolidJS Start.
* SolidJS provides exceptional performance with fine-grained reactivity and no virtual DOM.
* Complete type safety from server to client with shared types.
* Clean, organized API routes with SolidJS Start's file-based routing system.
## Deployment
### Vercel Deployment
```typescript title="app.config.ts"
import { defineConfig } from '@solidjs/start/config';
export default defineConfig({
server: {
preset: 'vercel'
}
});
```
### Netlify Deployment
```typescript title="app.config.ts"
import { defineConfig } from '@solidjs/start/config';
export default defineConfig({
server: {
preset: 'netlify'
}
});
```
### Node.js Deployment
```typescript title="app.config.ts"
import { defineConfig } from '@solidjs/start/config';
export default defineConfig({
server: {
preset: 'node-server'
}
});
```
### Docker Deployment
```dockerfile title="Dockerfile"
FROM node:18-alpine AS base
WORKDIR /app
# Install all dependencies (dev dependencies are needed for the build)
COPY package*.json ./
RUN npm ci
# Copy source code and build the app
COPY . .
RUN npm run build
# Drop dev dependencies for a smaller runtime image
RUN npm prune --omit=dev
# Expose port and start the app
EXPOSE 3000
CMD ["npm", "start"]
```
***
**SolidJS + Pushduck**: The perfect combination of SolidJS's exceptional performance and pushduck's universal design for lightning-fast file upload experiences.
# SvelteKit
URL: /docs/integrations/sveltekit
Fast, modern file uploads with SvelteKit using Web Standards - no adapter needed!
***
title: SvelteKit
description: Fast, modern file uploads with SvelteKit using Web Standards - no adapter needed!
----------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# SvelteKit Integration
SvelteKit is the official application framework for Svelte. It uses Web Standards APIs and provides excellent performance with minimal JavaScript. Since SvelteKit uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters!
**Web Standards Native**: SvelteKit server endpoints use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="src/routes/api/upload/[...path]/+server.ts"
import type { RequestHandler } from './$types';
import { uploadRouter } from '$lib/upload';
// Direct usage - no adapter needed!
export const GET: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
export const POST: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
```
## Basic Integration
### Simple Upload Route
```typescript title="src/routes/api/upload/[...path]/+server.ts"
import type { RequestHandler } from './$types';
import { uploadRouter } from '$lib/upload';
// Method 1: Combined handler (recommended)
export const GET: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
export const POST: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
// Method 2: Separate handlers (if you need method-specific logic)
// export const { GET, POST } = uploadRouter.handlers;
```
### With SvelteKit Hooks
```typescript title="src/hooks.server.ts"
import type { Handle } from '@sveltejs/kit';
import { sequence } from '@sveltejs/kit/hooks';
const corsHandler: Handle = async ({ event, resolve }) => {
if (event.url.pathname.startsWith('/api/upload')) {
if (event.request.method === 'OPTIONS') {
return new Response(null, {
status: 200,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type',
},
});
}
}
const response = await resolve(event);
if (event.url.pathname.startsWith('/api/upload')) {
response.headers.set('Access-Control-Allow-Origin', '*');
}
return response;
};
export const handle = sequence(corsHandler);
```
## Advanced Configuration
### Authentication with SvelteKit
```typescript title="src/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
import { getUserFromSession } from '$lib/auth';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with session authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const cookies = req.headers.get('Cookie');
// parseCookie is your cookie-parsing helper (e.g. `parse` from the `cookie` package)
const sessionId = parseCookie(cookies)?.sessionId;
if (!sessionId) {
throw new Error('Authentication required');
}
const user = await getUserFromSession(sessionId);
if (!user) {
throw new Error('Invalid session');
}
return {
userId: user.id,
username: user.username,
};
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
export type AppUploadRouter = typeof uploadRouter;
```
## Client-Side Usage
### Upload Component
```svelte title="src/lib/components/FileUpload.svelte"
<h2>Image Upload</h2>
<h2>Document Upload</h2>
```
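The upload logic behind such a component can use the same `createUploadClient` API shown in the other framework guides. A minimal sketch of the wiring (file path and function names are assumptions):

```typescript title="src/lib/upload-client.ts"
// A sketch; route names match those defined in src/lib/upload.ts
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '$lib/upload';

const uploadClient = createUploadClient<AppUploadRouter>({
  baseUrl: 'http://localhost:5173', // your app origin
});

// Call these from an <input type="file"> change handler in the component
export async function uploadImages(files: File[]) {
  return uploadClient.upload('imageUpload', { files });
}

export async function uploadDocuments(files: File[]) {
  return uploadClient.upload('documentUpload', { files });
}
```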
### Using in Pages
```svelte title="src/routes/+page.svelte"
<svelte:head>
  <title>File Upload Demo</title>
</svelte:head>

<h1>File Upload Demo</h1>
```
## File Management
### Server-Side File Listing
```typescript title="src/routes/files/+page.server.ts"
import type { PageServerLoad } from './$types';
import { db } from '$lib/database';
export const load: PageServerLoad = async ({ locals }) => {
const files = await db.file.findMany({
where: { userId: locals.user?.id },
orderBy: { createdAt: 'desc' },
});
return {
files: files.map(file => ({
id: file.id,
name: file.name,
url: file.url,
size: file.size,
uploadedAt: file.createdAt,
})),
};
};
```
```svelte title="src/routes/files/+page.svelte"
<script lang="ts">
  export let data;

  function formatFileSize(bytes: number): string {
    const sizes = ['Bytes', 'KB', 'MB', 'GB'];
    if (bytes === 0) return '0 Bytes';
    const i = Math.floor(Math.log(bytes) / Math.log(1024));
    return Math.round((bytes / Math.pow(1024, i)) * 100) / 100 + ' ' + sizes[i];
  }
</script>

<svelte:head><title>My Files</title></svelte:head>
<h1>My Files</h1>
<h2>Uploaded Files</h2>
{#if data.files.length === 0}
  <p>No files uploaded yet.</p>
{:else}
  <ul>
    {#each data.files as file}
      <li>
        {file.name} ({formatFileSize(file.size)}, {new Date(file.uploadedAt).toLocaleDateString()})
        <a href={file.url} target="_blank" rel="noreferrer">View File</a>
      </li>
    {/each}
  </ul>
{/if}
```
## Deployment Options
```javascript title="svelte.config.js"
import adapter from '@sveltejs/adapter-vercel';
import { vitePreprocess } from '@sveltejs/kit/vite';
/** @type {import('@sveltejs/kit').Config} */
const config = {
preprocess: vitePreprocess(),
kit: {
adapter: adapter({
runtime: 'nodejs18.x',
regions: ['iad1'],
}),
}
};
export default config;
```
```javascript title="svelte.config.js"
import adapter from '@sveltejs/adapter-netlify';
/** @type {import('@sveltejs/kit').Config} */
const config = {
kit: {
adapter: adapter({
edge: false,
split: false
}),
}
};
export default config;
```
```javascript title="svelte.config.js"
import adapter from '@sveltejs/adapter-node';
/** @type {import('@sveltejs/kit').Config} */
const config = {
kit: {
adapter: adapter({
out: 'build'
}),
}
};
export default config;
```
```javascript title="svelte.config.js"
import adapter from '@sveltejs/adapter-cloudflare';
/** @type {import('@sveltejs/kit').Config} */
const config = {
kit: {
adapter: adapter({
routes: {
include: ['/*'],
exclude: ['']
}
}),
}
};
export default config;
```
## Environment Variables
```bash title=".env"
# AWS Configuration
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_S3_BUCKET=your-bucket-name
# SvelteKit
PUBLIC_UPLOAD_ENDPOINT=http://localhost:5173/api/upload
```
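SvelteKit exposes `PUBLIC_`-prefixed variables to client code via `$env/static/public`, so the endpoint above can be read like this:

```typescript
// Anywhere in client-side code
import { PUBLIC_UPLOAD_ENDPOINT } from '$env/static/public';

const endpoint = PUBLIC_UPLOAD_ENDPOINT; // e.g. http://localhost:5173/api/upload
```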
## Real-Time Upload Progress
```svelte title="src/lib/components/AdvancedUpload.svelte"
{#if $isUploading}
  <progress value={$uploadProgress} max="100"></progress>
  <p>{$uploadProgress}% uploaded</p>
{/if}
```
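The `$isUploading` and `$uploadProgress` values are assumed to be writable Svelte stores that your upload code updates, for example:

```typescript title="src/lib/stores/upload.ts"
// Hypothetical stores backing the snippet above
import { writable } from 'svelte/store';

export const isUploading = writable(false);
export const uploadProgress = writable(0); // 0 to 100

// Call from your upload code as bytes are sent
export function reportProgress(loaded: number, total: number) {
  uploadProgress.set(Math.round((loaded / total) * 100));
}
```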
## Troubleshooting
**Common Issues**
1. **Route not found**: Ensure your route is `src/routes/api/upload/[...path]/+server.ts`
2. **Build errors**: Check that pushduck is properly installed
3. **CORS issues**: SvelteKit handles CORS automatically for same-origin requests
### Debug Mode
Enable debug logging:
```typescript title="src/lib/upload.ts"
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (import.meta.env.DEV) {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
SvelteKit provides an excellent foundation for building fast, modern web applications with pushduck, combining the power of Svelte's reactive framework with Web Standards APIs.
# TanStack Start
URL: /docs/integrations/tanstack-start
Full-stack React framework with Web Standards support - no adapter needed!
***
title: TanStack Start
description: Full-stack React framework with Web Standards support - no adapter needed!
---------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# TanStack Start
TanStack Start is a full-stack React framework with built-in Web Standards support. Since TanStack Start provides `event.request` as a Web Standard `Request` object, pushduck handlers work directly without any adapters!
**Web Standards Native**: TanStack Start's API handlers receive `{ request }` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead.
## Quick Setup
**Install dependencies**
```bash
npm install pushduck
```
```bash
yarn add pushduck
```
```bash
pnpm add pushduck
```
```bash
bun add pushduck
```
**Configure upload router**
```typescript title="app/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create API route**
```typescript title="app/routes/api.upload.$.ts"
import { createAPIFileRoute } from '@tanstack/start/api';
import { uploadRouter } from '../lib/upload';
export const Route = createAPIFileRoute('/api/upload/$')({
GET: ({ request }) => uploadRouter.handlers(request),
POST: ({ request }) => uploadRouter.handlers(request),
});
```
## Basic Integration
### API Route Handler
```typescript title="app/routes/api.upload.$.ts"
import { createAPIFileRoute } from '@tanstack/start/api';
import { uploadRouter } from '../lib/upload';
export const Route = createAPIFileRoute('/api/upload/$')({
// Method 1: Individual handlers
GET: ({ request }) => uploadRouter.handlers.GET(request),
POST: ({ request }) => uploadRouter.handlers.POST(request),
// Method 2: Universal handler (alternative approach)
// You could also create a single handler that delegates:
// handler: ({ request }) => uploadRouter.handlers(request)
});
```
### With Middleware
```typescript title="app/routes/api.upload.$.ts"
import { createAPIFileRoute } from '@tanstack/start/api';
import { uploadRouter } from '../lib/upload';
// Simple CORS middleware
function withCORS(handler: (ctx: any) => Promise<Response>) {
return async (ctx: any) => {
const response = await handler(ctx);
response.headers.set('Access-Control-Allow-Origin', '*');
response.headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
return response;
};
}
export const Route = createAPIFileRoute('/api/upload/$')({
GET: withCORS(({ request }) => uploadRouter.handlers.GET(request)),
POST: withCORS(({ request }) => uploadRouter.handlers.POST(request)),
OPTIONS: () => new Response(null, {
status: 200,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
}
}),
});
```
## Advanced Configuration
### Authentication with TanStack Start
```typescript title="app/lib/upload.ts"
import { createUploadConfig } from 'pushduck/server';
const { s3, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
export const uploadRouter = createS3Router({
// Private uploads with authentication
privateUpload: s3
.image()
.max("5MB")
.middleware(async ({ req }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Error('Authorization required');
}
const token = authHeader.substring(7);
try {
const user = await verifyJWT(token);
return {
userId: user.id,
userRole: user.role
};
} catch (error) {
throw new Error('Invalid token');
}
}),
// Public uploads (no auth)
publicUpload: s3
.image()
.max("2MB")
// No middleware = public access
});
async function verifyJWT(token: string) {
// Your JWT verification logic here
return { id: 'user-123', role: 'user' };
}
export type AppUploadRouter = typeof uploadRouter;
```
### Protected API Routes
```typescript title="app/routes/api.upload.$.ts"
import { createAPIFileRoute } from '@tanstack/start/api';
import { uploadRouter } from '../lib/upload';
// Authentication middleware
async function requireAuth(request: Request) {
const authHeader = request.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
throw new Response(JSON.stringify({ error: 'Authorization required' }), {
status: 401,
headers: { 'Content-Type': 'application/json' }
});
}
const token = authHeader.substring(7);
// Verify token logic here
return { userId: 'user-123' };
}
export const Route = createAPIFileRoute('/api/upload/$')({
GET: async ({ request }) => {
await requireAuth(request);
return uploadRouter.handlers.GET(request);
},
POST: async ({ request }) => {
await requireAuth(request);
return uploadRouter.handlers.POST(request);
},
});
```
## Client-Side Integration
### Upload Component
```typescript title="app/components/UploadDemo.tsx"
import { useState } from 'react';
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from '../lib/upload';
const uploadClient = createUploadClient<AppUploadRouter>({
baseUrl: process.env.NODE_ENV === 'development'
? 'http://localhost:3000'
: 'https://your-domain.com'
});
export function UploadDemo() {
const [uploading, setUploading] = useState(false);
const [results, setResults] = useState<any[]>([]);
const handleUpload = async (files: File[]) => {
if (!files.length) return;
setUploading(true);
try {
const uploadResults = await uploadClient.upload('imageUpload', {
files,
metadata: { userId: 'demo-user' }
});
setResults(uploadResults);
console.log('Upload successful:', uploadResults);
} catch (error) {
console.error('Upload failed:', error);
} finally {
setUploading(false);
}
};
return (
<div className="space-y-4">
  <input
    type="file"
    multiple
    accept="image/*"
    onChange={(e) => {
      if (e.target.files) {
        handleUpload(Array.from(e.target.files));
      }
    }}
    disabled={uploading}
    className="block w-full text-sm text-gray-500 file:mr-4 file:py-2 file:px-4 file:rounded-full file:border-0 file:text-sm file:font-semibold file:bg-blue-50 file:text-blue-700 hover:file:bg-blue-100"
  />
  {uploading && <p className="text-blue-600">Uploading...</p>}
  {results.length > 0 && (
    <div>
      <h3 className="font-semibold">Upload Results:</h3>
      <ul>
        {results.map((result, index) => (
          <li key={index}>{JSON.stringify(result)}</li>
        ))}
      </ul>
    </div>
  )}
</div>
);
}
```
### Page Integration
```typescript title="app/routes/upload.tsx"
import { createFileRoute } from '@tanstack/start';
import { UploadDemo } from '../components/UploadDemo';
export const Route = createFileRoute('/upload')({
component: UploadPage,
});
function UploadPage() {
return <UploadDemo />;
}
```
## Full Application Example
### Project Structure
```
app/
├── components/
│   └── UploadDemo.tsx
├── lib/
│   └── upload.ts
├── routes/
│   ├── api.upload.$.ts
│   ├── upload.tsx
│   └── __root.tsx
├── router.ts
└── main.tsx
```
### Root Layout
```typescript title="app/routes/__root.tsx"
import { createRootRoute, Outlet } from '@tanstack/start';
export const Route = createRootRoute({
component: RootComponent,
});
function RootComponent() {
return (
<div>
  <h1>TanStack Start + Pushduck</h1>
  <Outlet />
</div>
);
}
```
## Performance Benefits
* No adapter layer means zero performance overhead - pushduck handlers run directly in TanStack Start.
* Built on the latest React features with streaming and concurrent rendering.
* Complete type safety from server to client with shared types.
* Organized API routes with TanStack Start's file-based routing system.
## Deployment
### Vercel Deployment
```typescript title="app.config.ts"
import { defineConfig } from '@tanstack/start/config';
export default defineConfig({
server: {
preset: 'vercel'
}
});
```
### Netlify Deployment
```typescript title="app.config.ts"
import { defineConfig } from '@tanstack/start/config';
export default defineConfig({
server: {
preset: 'netlify'
}
});
```
### Docker Deployment
```dockerfile title="Dockerfile"
FROM node:18-alpine AS base
WORKDIR /app
# Install all dependencies (dev dependencies are needed for the build)
COPY package*.json ./
RUN npm ci
# Copy source code and build the app
COPY . .
RUN npm run build
# Drop dev dependencies for a smaller runtime image
RUN npm prune --omit=dev
# Expose port and start the app
EXPOSE 3000
CMD ["npm", "start"]
```
***
**Modern React + Pushduck**: TanStack Start's cutting-edge React architecture combined with pushduck's universal design creates a powerful, type-safe file upload solution.
# tRPC
URL: /docs/integrations/trpc
End-to-end typesafe file storage operations with tRPC - uploads handled by your framework!
***
title: tRPC
description: End-to-end typesafe file storage operations with tRPC - uploads handled by your framework!
-------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
# tRPC Integration
tRPC enables end-to-end typesafe APIs with excellent TypeScript integration. With pushduck, you use **tRPC for storage operations** (listing, deleting, metadata) while **file uploads happen through your framework's routes** using pushduck handlers. This gives you the best of both worlds: framework-native uploads with type-safe storage management.
**Separation of Concerns**: File uploads use your framework's native upload routes with pushduck handlers, while tRPC procedures handle all storage-related CRUD operations using pushduck's storage API for full type safety.
## Architecture Overview
* Client → Framework Upload Route → pushduck Upload Handler → S3/R2 Storage
* Client → tRPC Storage Procedures → pushduck Storage API → S3/R2 Storage
* On upload success, the client re-queries the tRPC storage procedures
## Quick Setup
**Install dependencies**
```bash
npm install @trpc/server @trpc/client pushduck
# Framework-specific tRPC packages:
# @trpc/next (for Next.js)
# @trpc/react-query (for React)
```
```bash
yarn add @trpc/server @trpc/client pushduck
# Framework-specific tRPC packages:
# @trpc/next (for Next.js)
# @trpc/react-query (for React)
```
```bash
pnpm add @trpc/server @trpc/client pushduck
# Framework-specific tRPC packages:
# @trpc/next (for Next.js)
# @trpc/react-query (for React)
```
```bash
bun add @trpc/server @trpc/client pushduck
# Framework-specific tRPC packages:
# @trpc/next (for Next.js)
# @trpc/react-query (for React)
```
**Configure storage and upload router**
```typescript title="lib/storage.ts"
import { createUploadConfig } from 'pushduck/server';
// Configure storage
export const { s3, storage, createS3Router } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build();
// Upload router for framework routes
export const uploadRouter = createS3Router({
imageUpload: s3.image().max("5MB"),
documentUpload: s3.file().max("10MB")
});
export type AppUploadRouter = typeof uploadRouter;
```
**Create tRPC router with storage operations**
```typescript title="server/trpc.ts"
import { initTRPC, TRPCError } from '@trpc/server';
import { z } from 'zod';
import { storage } from '~/lib/storage';
const t = initTRPC.create();
export const appRouter = t.router({
files: t.router({
// List files with pagination
list: t.procedure
.input(z.object({
prefix: z.string().optional(),
limit: z.number().min(1).max(100).default(20),
cursor: z.string().optional()
}))
.query(async ({ input }) => {
const result = await storage.list.paginated({
prefix: input.prefix,
maxKeys: input.limit,
continuationToken: input.cursor,
});
return {
files: result.files,
nextCursor: result.nextContinuationToken,
hasMore: result.isTruncated,
};
}),
// Get file metadata
getInfo: t.procedure
.input(z.object({ key: z.string() }))
.query(async ({ input }) => {
const info = await storage.metadata.getInfo(input.key);
if (!info.exists) {
throw new TRPCError({
code: 'NOT_FOUND',
message: 'File not found'
});
}
return info;
}),
// Get multiple files info
getBatch: t.procedure
.input(z.object({ keys: z.array(z.string()) }))
.query(async ({ input }) => {
return await storage.metadata.getBatch(input.keys);
}),
// Delete file
delete: t.procedure
.input(z.object({ key: z.string() }))
.mutation(async ({ input }) => {
const result = await storage.delete.file(input.key);
return { success: result.success };
}),
// Delete multiple files
deleteBatch: t.procedure
.input(z.object({ keys: z.array(z.string()) }))
.mutation(async ({ input }) => {
const result = await storage.delete.files(input.keys);
return {
success: result.success,
deleted: result.deleted,
errors: result.errors,
};
}),
// Generate download URL
getDownloadUrl: t.procedure
.input(z.object({
key: z.string(),
expiresIn: z.number().optional()
}))
.query(async ({ input }) => {
const result = await storage.download.presignedUrl(
input.key,
input.expiresIn
);
return { url: result.url, expiresAt: result.expiresAt };
}),
// Search files by extension
searchByExtension: t.procedure
.input(z.object({
extension: z.string(),
prefix: z.string().optional()
}))
.query(async ({ input }) => {
return await storage.list.byExtension(
input.extension,
input.prefix
);
}),
// Search files by size range
searchBySize: t.procedure
.input(z.object({
minSize: z.number().optional(),
maxSize: z.number().optional(),
prefix: z.string().optional()
}))
.query(async ({ input }) => {
return await storage.list.bySize(
input.minSize,
input.maxSize,
input.prefix
);
}),
// Get storage statistics
getStats: t.procedure
.input(z.object({ prefix: z.string().optional() }))
.query(async ({ input }) => {
const files = await storage.list.files({ prefix: input.prefix });
const stats = files.files.reduce((acc, file) => {
acc.totalSize += file.size;
acc.count += 1;
const ext = file.key.split('.').pop()?.toLowerCase() || 'unknown';
acc.byExtension[ext] = (acc.byExtension[ext] || 0) + 1;
return acc;
}, {
totalSize: 0,
count: 0,
byExtension: {} as Record<string, number>
});
return stats;
}),
}),
});
export type AppRouter = typeof appRouter;
```
**Create framework upload route**
```typescript title="app/api/upload/[...path]/route.ts"
import { uploadRouter } from '~/lib/storage';
// Handle file uploads through framework route
export const { GET, POST } = uploadRouter.handlers;
```
```typescript title="app/routes/api.upload.$.tsx"
import type { ActionFunctionArgs, LoaderFunctionArgs } from "@remix-run/node";
import { uploadRouter } from "~/lib/storage";
export async function loader({ request }: LoaderFunctionArgs) {
return uploadRouter.handlers(request);
}
export async function action({ request }: ActionFunctionArgs) {
return uploadRouter.handlers(request);
}
```
```typescript title="src/routes/api/upload/[...path]/+server.ts"
import type { RequestHandler } from './$types';
import { uploadRouter } from '$lib/storage';
export const GET: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
export const POST: RequestHandler = async ({ request }) => {
return uploadRouter.handlers(request);
};
```
**Create tRPC API route**
```typescript title="app/api/trpc/[trpc]/route.ts"
import { fetchRequestHandler } from '@trpc/server/adapters/fetch';
import { appRouter } from '~/server/trpc';
const handler = (req: Request) =>
fetchRequestHandler({
endpoint: '/api/trpc',
req,
router: appRouter,
createContext: () => ({}),
});
export { handler as GET, handler as POST };
```
```typescript title="server/index.ts"
import express from 'express';
import * as trpcExpress from '@trpc/server/adapters/express';
import { appRouter } from './trpc';
import { uploadRouter } from './storage';
import { toExpressHandler } from 'pushduck/adapters/express';
const app = express();
// tRPC for storage operations
app.use('/api/trpc', trpcExpress.createExpressMiddleware({
router: appRouter,
createContext: () => ({}),
}));
// pushduck for file uploads
app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers));
app.listen(3000);
```
```typescript title="server/index.ts"
import { createHTTPServer } from '@trpc/server/adapters/standalone';
import { appRouter } from './trpc';
// tRPC server for storage operations
const server = createHTTPServer({
router: appRouter,
createContext: () => ({}),
});
server.listen(3001); // Different port for tRPC
```
## Client-Side Integration
### React with tRPC and pushduck
```tsx title="components/FileManager.tsx"
import { trpc } from '~/lib/trpc';
import { useUpload } from 'pushduck/client';
import type { AppUploadRouter } from '~/lib/storage';
// Upload hooks are from pushduck (framework-native)
const { UploadButton, UploadDropzone } = useUpload<AppUploadRouter>({
endpoint: "/api/upload",
});
export function FileManager() {
// Storage operations are from tRPC (type-safe)
const {
data: files,
refetch,
fetchNextPage,
hasNextPage
} = trpc.files.list.useInfiniteQuery(
{ limit: 20 },
{
getNextPageParam: (lastPage) => lastPage.nextCursor,
}
);
const deleteFile = trpc.files.delete.useMutation({
onSuccess: () => refetch(),
});
const getDownloadUrl = trpc.files.getDownloadUrl.useMutation();
const fileStats = trpc.files.getStats.useQuery({});
const handleUploadComplete = async (uploadedFiles: any[]) => {
// Files are uploaded, refresh the list
await refetch();
console.log('Upload completed:', uploadedFiles);
};
const handleDelete = async (key: string) => {
if (confirm('Are you sure you want to delete this file?')) {
await deleteFile.mutateAsync({ key });
}
};
const handleDownload = async (key: string) => {
const result = await getDownloadUrl.mutateAsync({ key });
window.open(result.url, '_blank');
};
return (
<div className="space-y-8">
  {/* Upload Section - Uses pushduck hooks */}
  <section>
    <h2>Upload Files</h2>
    <h3>Images</h3>
    {/* Prop names are illustrative; match your pushduck version */}
    <UploadDropzone
      route="imageUpload"
      onUploadComplete={handleUploadComplete}
      onUploadError={(error) => alert(`Upload failed: ${error.message}`)}
    />
    <h3>Documents</h3>
    <UploadDropzone
      route="documentUpload"
      onUploadComplete={handleUploadComplete}
      onUploadError={(error) => alert(`Upload failed: ${error.message}`)}
    />
  </section>
  {/* Stats Section - Uses tRPC */}
  <section>
    <h2>Storage Statistics</h2>
    {fileStats.data && (
      <ul>
        <li>Total Files: {fileStats.data.count}</li>
        <li>Total Size: {formatFileSize(fileStats.data.totalSize)}</li>
        <li>Extensions: {Object.keys(fileStats.data.byExtension).join(', ')}</li>
      </ul>
    )}
  </section>
  {/* File List Section - Uses tRPC */}
  <section>
    <h2>Your Files</h2>
    {files?.pages[0]?.files.length === 0 ? (
      <p>No files uploaded yet.</p>
    ) : (
      <>
        <ul>
          {files?.pages.flatMap((page) => page.files).map((file) => (
            <li key={file.key}>
              <span>{file.key.split('/').pop()}</span>
              <span>{formatFileSize(file.size)}</span>
              <span>{new Date(file.lastModified).toLocaleDateString()}</span>
              <button
                onClick={() => handleDownload(file.key)}
                className="text-blue-500 hover:underline text-sm"
                disabled={getDownloadUrl.isLoading}
              >
                Download
              </button>
              <button
                onClick={() => handleDelete(file.key)}
                className="text-red-500 hover:underline text-sm"
                disabled={deleteFile.isLoading}
              >
                Delete
              </button>
            </li>
          ))}
        </ul>
        {hasNextPage && (
          <button
            onClick={() => fetchNextPage()}
            className="w-full bg-blue-500 text-white p-2 rounded hover:bg-blue-600"
          >
            Load More Files
          </button>
        )}
      </>
    )}
  </section>
</div>
);
}
function formatFileSize(bytes: number): string {
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
if (bytes === 0) return '0 Bytes';
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i];
}
```
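The `trpc` object imported above is assumed to be a standard `@trpc/react-query` client; a minimal sketch:

```typescript title="lib/trpc.ts"
// Assumed setup; adjust the import path to wherever your router lives
import { createTRPCReact } from '@trpc/react-query';
import type { AppRouter } from '~/server/trpc';

export const trpc = createTRPCReact<AppRouter>();
```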
### Advanced File Search
```tsx title="components/FileSearch.tsx"
import { useState } from 'react';
import { trpc } from '~/lib/trpc';
export function FileSearch() {
const [searchType, setSearchType] = useState<'extension' | 'size'>('extension');
const [extension, setExtension] = useState('');
const [minSize, setMinSize] = useState<number | undefined>();
const [maxSize, setMaxSize] = useState<number | undefined>();
// Type-safe search operations via tRPC
const searchByExtension = trpc.files.searchByExtension.useQuery(
{ extension },
{ enabled: searchType === 'extension' && !!extension }
);
const searchBySize = trpc.files.searchBySize.useQuery(
{ minSize, maxSize },
{ enabled: searchType === 'size' && (!!minSize || !!maxSize) }
);
const results = searchType === 'extension' ? searchByExtension.data : searchBySize.data;
return (
<div>
  {/* formatFileSize as defined in FileManager.tsx */}
  {results && (
    <ul>
      {results.files.map((file) => (
        <li key={file.key}>
          <span>{file.key}</span>
          <span>{formatFileSize(file.size)}</span>
        </li>
      ))}
    </ul>
  )}
</div>
);
}
```
## Authentication Integration
```typescript title="server/trpc.ts"
import { initTRPC, TRPCError } from '@trpc/server';
import { z } from 'zod';
import { storage } from '~/lib/storage';
// Create context with user authentication
export const createContext = async ({ req }: { req: Request }) => {
const authHeader = req.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
return { user: null };
}
try {
const token = authHeader.substring(7);
const user = await validateToken(token); // Your auth logic
return { user };
} catch {
return { user: null };
}
};
type Context = Awaited<ReturnType<typeof createContext>>;
const t = initTRPC.context<Context>().create();
// Auth middleware
const isAuthenticated = t.middleware(({ next, ctx }) => {
if (!ctx.user) {
throw new TRPCError({ code: 'UNAUTHORIZED' });
}
return next({ ctx: { ...ctx, user: ctx.user } });
});
const protectedProcedure = t.procedure.use(isAuthenticated);
export const appRouter = t.router({
files: t.router({
// User's files only
list: protectedProcedure
.input(z.object({ prefix: z.string().optional() }))
.query(async ({ input, ctx }) => {
// Scope to user's folder
const userPrefix = `users/${ctx.user.id}/${input.prefix || ''}`;
return await storage.list.files({ prefix: userPrefix });
}),
// User can only delete their own files
delete: protectedProcedure
.input(z.object({ key: z.string() }))
.mutation(async ({ input, ctx }) => {
// Ensure user owns the file
if (!input.key.startsWith(`users/${ctx.user.id}/`)) {
throw new TRPCError({ code: 'FORBIDDEN' });
}
return await storage.delete.file(input.key);
}),
}),
});
```
## Real-time Updates with Subscriptions
```typescript title="server/trpc.ts"
import { observable } from '@trpc/server/observable';
import { EventEmitter } from 'events';
const fileEventEmitter = new EventEmitter();
export const appRouter = t.router({
files: t.router({
// ... other procedures
// Real-time file updates
onUpdate: protectedProcedure
.subscription(({ ctx }) => {
return observable<{ type: 'uploaded' | 'deleted'; file: any }>((emit) => {
const onFileEvent = (data: any) => {
// Only emit events for this user's files
if (data.userId === ctx.user.id) {
emit.next(data);
}
};
fileEventEmitter.on('file:event', onFileEvent);
return () => {
fileEventEmitter.off('file:event', onFileEvent);
};
});
}),
}),
});
// Emit events from upload completion
export const emitFileEvent = (type: 'uploaded' | 'deleted', file: any, userId: string) => {
fileEventEmitter.emit('file:event', { type, file, userId });
};
```
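On the client, a component can consume this subscription and refresh cached queries. A sketch, assuming a WebSocket-capable tRPC link is configured:

```tsx
import { trpc } from '~/lib/trpc';

export function FileEventListener() {
  // trpc.useUtils() on recent @trpc/react-query (useContext on older versions)
  const utils = trpc.useUtils();

  trpc.files.onUpdate.useSubscription(undefined, {
    onData(event) {
      // Refresh the cached file list whenever this user's files change
      utils.files.list.invalidate();
      console.log(`File ${event.type}:`, event.file);
    },
  });

  return null;
}
```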
## Key Advantages
### **Clear Separation**
* **Uploads**: Framework-native routes with pushduck handlers
* **Storage Operations**: Type-safe tRPC procedures with pushduck storage API
* **Client**: Framework upload hooks + tRPC queries/mutations
### **Best of Both Worlds**
* **Framework-optimized uploads** (progress, validation, middleware)
* **Type-safe storage management** (list, delete, search, metadata)
* **Unified developer experience** with consistent patterns
### **Scalable Architecture**
* **Independent scaling** of upload and API operations
* **Flexible deployment** (separate services if needed)
* **Framework agnostic** storage operations
## Troubleshooting
**Common Issues**
1. **Mixed responsibilities**: Don't try to handle uploads in tRPC procedures - use framework routes
2. **Type mismatches**: Ensure storage operations use the same config as upload routes
3. **Authentication**: Sync auth between upload middleware and tRPC context
4. **CORS**: Configure CORS for both `/api/trpc` and `/api/upload` endpoints
### Debug Mode
```typescript title="lib/debug.ts"
// Enable debug logging for both systems
export const debugConfig = {
trpc: process.env.NODE_ENV === 'development',
storage: process.env.NODE_ENV === 'development',
};
// Storage debug logging
export const storage = createStorage(config).middleware?.(async (operation, params) => {
if (debugConfig.storage) {
console.log("Storage operation:", operation, params);
}
});
// Upload debug logging
export const uploadRouter = createS3Router({
// ... routes
}).middleware(async ({ req, file }) => {
if (debugConfig.storage) {
console.log("Upload request:", req.url);
console.log("File:", file.name, file.size);
}
return {};
});
```
This architecture gives you **framework-native file uploads** with **type-safe storage management**, combining the strengths of both pushduck and tRPC for a superior developer experience.
# AWS S3
URL: /docs/providers/aws-s3
Set up AWS S3 for production file uploads in under 5 minutes
***
title: AWS S3
description: Set up AWS S3 for production file uploads in under 5 minutes
-------------------------------------------------------------------------
import { Step, Steps } from "fumadocs-ui/components/steps";
import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
# AWS S3 Setup
Get **AWS S3** configured for production file uploads in under 5 minutes. This guide covers everything from bucket creation to security best practices.
**Why AWS S3?** The most trusted object storage service with 99.999999999%
durability, global CDN integration, and predictable pricing. Perfect for
applications that need reliable, scalable file storage.
## What You'll Accomplish
By the end of this guide, you'll have:
* ✅ A secure S3 bucket configured for web uploads
* ✅ IAM user with minimal required permissions
* ✅ CORS configuration for your domain
* ✅ Environment variables ready for production
* ✅ Cost optimization settings enabled
## Create AWS Account & S3 Bucket
If you don't have an AWS account, [sign up for free](https://aws.amazon.com/free/) - you get 5GB of S3 storage free for 12 months.
1. **Open S3 Console**: Go to [S3 Console](https://console.aws.amazon.com/s3/)
2. **Create Bucket**: Click "Create bucket"
3. **Configure Basic Settings**:
```bash
Bucket name: your-app-uploads-prod
Region: us-east-1 (or closest to your users)
```
**Bucket Naming**: Use a unique, descriptive name. Bucket names are global
across all AWS accounts and cannot be changed later.
4. **Block Public Access**: Keep all "Block public access" settings **enabled** (this is secure - we'll use presigned URLs)
5. **Enable Versioning**: Recommended for data protection
6. **Create Bucket**: Click "Create bucket"
## Configure CORS for Web Access
Your web application needs permission to upload files directly to S3.
1. **Open Your Bucket**: Click on your newly created bucket
2. **Go to Permissions Tab**: Click "Permissions"
3. **Edit CORS Configuration**: Scroll to "Cross-origin resource sharing (CORS)" and click "Edit"
4. **Add CORS Rules**:
```json
[
{
"AllowedHeaders": ["*"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
"AllowedOrigins": [
"http://localhost:3000",
"http://localhost:3001",
"https://localhost:3000"
],
"ExposeHeaders": ["ETag"],
"MaxAgeSeconds": 3000
}
]
```
```json
[
{
"AllowedHeaders": ["*"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
"AllowedOrigins": [
"https://yourdomain.com",
"https://www.yourdomain.com",
"https://staging.yourdomain.com"
],
"ExposeHeaders": ["ETag"],
"MaxAgeSeconds": 86400
}
]
```
5. **Save Changes**: Click "Save changes"
**Security Note**: Only add origins you trust. Wildcards (`*`) should never be
used in production - they allow any website to upload to your bucket.
## Create IAM User with Minimal Permissions
Create a dedicated user for your application with only the permissions it needs.
1. **Open IAM Console**: Go to [IAM Console](https://console.aws.amazon.com/iam/)
2. **Create User**:
* Click "Users" โ "Create user"
* Username: `your-app-s3-user`
* Select "Programmatic access" only
3. **Create Custom Policy**:
* Click "Attach policies directly"
* Click "Create policy"
* Use JSON editor and paste:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
"Resource": "arn:aws:s3:::your-app-uploads-prod/*"
},
{
"Effect": "Allow",
"Action": ["s3:ListBucket"],
"Resource": "arn:aws:s3:::your-app-uploads-prod"
}
]
}
```
4. **Name the Policy**: `YourApp-S3-Upload-Policy`
5. **Attach to User**: Go back to user creation and attach your new policy
6. **Create User**: Complete the user creation
**Replace Bucket Name**: Make sure to replace `your-app-uploads-prod` with
your actual bucket name in the policy JSON.
## Get Access Keys
Your application needs these credentials to generate presigned URLs.
1. **Select Your User**: In IAM Users, click on your newly created user
2. **Security Credentials Tab**: Click "Security credentials"
3. **Create Access Key**:
* Click "Create access key"
* Select "Application running outside AWS"
* Click "Next"
4. **Copy Credentials**:
* **Access Key ID**: Copy this value
* **Secret Access Key**: Copy this value (you'll only see it once!)
**Security Alert**: Never commit these keys to version control or share them
publicly. Use environment variables or secure key management services.
## Configure Environment Variables
Add your AWS credentials to your application.
```bash
# .env.local
AWS_ACCESS_KEY_ID=your_access_key_id
AWS_SECRET_ACCESS_KEY=your_secret_access_key
AWS_REGION=us-east-1
AWS_S3_BUCKET_NAME=your-app-uploads-prod
# Optional: Enable S3 debug logging
DEBUG=aws-sdk:*
```
```bash
# Use your hosting platform's environment variable system
# Never store production keys in .env files
# Vercel:
# vercel env add AWS_ACCESS_KEY_ID
# vercel env add AWS_SECRET_ACCESS_KEY
# Netlify:
# Add in Site settings > Environment variables
# Railway:
# Add in Variables tab
AWS_ACCESS_KEY_ID=your_access_key_id
AWS_SECRET_ACCESS_KEY=your_secret_access_key
AWS_REGION=us-east-1
AWS_S3_BUCKET_NAME=your-app-uploads-prod
```
## Test Your Configuration
Verify everything works by testing an upload.
1. **Start Your App**: Run your development server
2. **Test Upload**: Try uploading a file using your upload component
3. **Check S3**: Verify the file appears in your S3 bucket
4. **Check Access**: Verify you can access the uploaded file via its URL
If something's not working:
* ✅ Check CORS configuration matches your domain
* ✅ Verify IAM policy has correct bucket name
* ✅ Confirm environment variables are loaded
* ✅ Check browser console for specific error messages
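For a quick programmatic sanity check of the credentials and bucket policy, a small script can help. A sketch, assuming `@aws-sdk/client-s3` is installed (the list call exercises the `s3:ListBucket` permission granted above):

```typescript
// scripts/check-s3.ts (run with: npx tsx scripts/check-s3.ts)
import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

const client = new S3Client({ region: process.env.AWS_REGION });

const result = await client.send(
  new ListObjectsV2Command({
    Bucket: process.env.AWS_S3_BUCKET_NAME,
    MaxKeys: 1,
  })
);

console.log(`Bucket reachable; ${result.KeyCount ?? 0} object(s) sampled.`);
```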
## Congratulations!
Your AWS S3 bucket is now ready for production! Here's what you've accomplished:
* ✅ **Secure Storage**: Files are stored in AWS's enterprise-grade infrastructure
* ✅ **Cost Efficient**: Pay only for what you use, with free tier coverage
* ✅ **Globally Accessible**: Files available worldwide with low latency
* ✅ **Scalable**: Handles millions of files without configuration changes
* ✅ **Secure Access**: Minimal IAM permissions and proper CORS setup
## Cost Optimization
Keep your AWS bills low with these optimization tips:
### Storage Classes
```bash
# Standard: $0.023 per GB/month - for frequently accessed files
# Standard-IA: $0.0125 per GB/month - for infrequently accessed files
# Glacier: $0.004 per GB/month - for archival (retrieval takes hours)
```
### Lifecycle Policies
Set up automatic transitions to save money:
1. **Go to Your Bucket** → Management → Lifecycle rules
2. **Create Rule** (or apply the same rule programmatically, as sketched below):
* Transition to Standard-IA after 30 days
* Transition to Glacier after 90 days
* Delete incomplete multipart uploads after 1 day
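A sketch of the same rule via the AWS SDK v3 (an assumption: it requires credentials with `s3:PutLifecycleConfiguration`, which the minimal upload policy above does not grant; bucket name is the example from this guide):

```typescript
import {
  S3Client,
  PutBucketLifecycleConfigurationCommand,
} from '@aws-sdk/client-s3';

const client = new S3Client({ region: 'us-east-1' });

await client.send(
  new PutBucketLifecycleConfigurationCommand({
    Bucket: 'your-app-uploads-prod',
    LifecycleConfiguration: {
      Rules: [
        {
          ID: 'cost-optimization',
          Status: 'Enabled',
          Filter: { Prefix: '' }, // apply to all objects
          Transitions: [
            { Days: 30, StorageClass: 'STANDARD_IA' },
            { Days: 90, StorageClass: 'GLACIER' },
          ],
          AbortIncompleteMultipartUpload: { DaysAfterInitiation: 1 },
        },
      ],
    },
  })
);
```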
### Request Optimization
* **Use CloudFront**: Cache files globally to reduce S3 requests
* **Batch Operations**: Group multiple operations when possible
* **Monitor Usage**: Set up billing alerts for unexpected costs
## Security Best Practices
### Access Control
```json
// Example: User-specific upload paths
{
"prefix": "users/${user.id}/*",
"maxFileSize": "10MB",
"types": ["image/jpeg", "image/png"]
}
```
### Monitoring
1. **Enable CloudTrail**: Track all S3 API calls
2. **Set Up Alerts**: Monitor unusual access patterns
3. **Regular Audits**: Review IAM permissions quarterly
### Backup Strategy
```bash
# Cross-region replication for critical data
Source Bucket: us-east-1
Replica Bucket: us-west-2
Replication: Real-time
```
## What's Next?
Now that AWS S3 is configured, explore these advanced features:
* **Advanced Security**: Implement encryption, access logging, and compliance → Security Guide
## Pro Tips
**Naming Convention**: Use consistent bucket naming like `{company}-{app}-{environment}-uploads` for easy management across multiple projects.
**Cost Alert**: Set up AWS billing alerts at $5, $20, and $50 to avoid
surprise charges during development.
**Performance**: Place your S3 bucket in the same region as your application
server for fastest presigned URL generation.
***
**Need help with AWS S3 setup?** Join our [Discord community](https://discord.gg/pushduck) or check out the [troubleshooting guide](/docs/guides/going-live/troubleshooting) for common issues.
# Cloudflare R2
URL: /docs/providers/cloudflare-r2
Set up Cloudflare R2 for fast, cost-effective file uploads
***
title: Cloudflare R2
description: Set up Cloudflare R2 for fast, cost-effective file uploads
-----------------------------------------------------------------------
# Using Cloudflare R2
Set up Cloudflare R2 for lightning-fast file uploads with zero egress fees.
## Why Choose Cloudflare R2?
* **Global Performance**: Cloudflare's edge network for fast uploads worldwide
* **Cost Effective**: 10x cheaper than S3 with zero egress fees
* **S3 Compatible**: Works with existing S3 tools and libraries
* **Built-in CDN**: Automatic content distribution via Cloudflare's network
## 1. Create an R2 Bucket
1. Go to [Cloudflare Dashboard](https://dash.cloudflare.com/)
2. Click **"R2 Object Storage"** in the sidebar
3. Click **"Create bucket"**
4. Choose a unique bucket name (e.g., `my-app-uploads`)
5. Select your preferred location (Auto for global performance)
6. Click **"Create bucket"**
## 2. Configure Public Access (Optional)
### For Public Files (Images, Documents)
1. Go to your bucket settings
2. Click **"Settings"** tab
3. Under **"Public access"**, click **"Allow Access"**
4. Choose **"Custom domain"** or use the R2.dev subdomain
**Custom Domain Setup:**
```bash
# Add a CNAME record in your DNS:
# uploads.yourdomain.com -> your-bucket.r2.cloudflarestorage.com
```
### For Private Files
Keep public access disabled - files will only be accessible via presigned URLs.
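You can mint those presigned URLs with pushduck's storage API (the same `storage.download.presignedUrl` helper shown in the tRPC guide). A sketch, assuming your config exports `storage` from `createUploadConfig().build()`:

```typescript
import { storage } from '@/lib/upload';

// Short-lived link to a private object; expiry units per your pushduck version
const { url, expiresAt } = await storage.download.presignedUrl('private/report.pdf', 3600);
console.log(`Share until ${expiresAt}: ${url}`);
```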
## 3. Generate API Token
1. Go to **"Manage R2 API tokens"**
2. Click **"Create API token"**
3. Set permissions:
* **Object:Read** ✅
* **Object:Write** ✅
* **Bucket:Read** ✅
4. Choose **"Specify bucket"** and select your bucket
5. Click **"Create API token"**
6. **Save your Access Key ID and Secret Access Key**
## 4. Configure CORS (If Using Custom Domain)
In your R2 bucket settings, add this CORS policy:
```json
[
{
"AllowedHeaders": ["*"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
"AllowedOrigins": ["http://localhost:3000", "https://yourdomain.com"],
"ExposeHeaders": ["ETag", "Content-Length"]
}
]
```
## 5. Configure Your App
Add to your `.env.local`:
```bash
# Cloudflare R2 Configuration
AWS_ACCESS_KEY_ID=your_r2_access_key_id
AWS_SECRET_ACCESS_KEY=your_r2_secret_access_key
AWS_ENDPOINT_URL=https://account-id.r2.cloudflarestorage.com
AWS_REGION=auto
S3_BUCKET_NAME=your-bucket-name
# Optional: Custom domain for public files
CLOUDFLARE_R2_PUBLIC_URL=https://uploads.yourdomain.com
```
## 6. Update Your Upload Configuration
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
export const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
accountId: process.env.R2_ACCOUNT_ID!, // Found in R2 dashboard
bucket: process.env.S3_BUCKET_NAME!,
// Optional: Custom domain for faster access
customDomain: process.env.CLOUDFLARE_R2_PUBLIC_URL,
})
.defaults({
maxFileSize: "10MB",
acl: "public-read", // For public access
})
.build();
```
## 7. Test Your Setup
```bash
npx @pushduck/cli@latest test --provider r2
```
This will verify your R2 connection and upload a test file.
## ✅ You're Ready!
Your Cloudflare R2 is now configured! Files will be:
* **Uploaded globally** via Cloudflare's edge network
* **Served fast** with built-in CDN
* **Cost effective** with zero egress fees
## Performance Benefits
### Global Upload Acceleration
R2 automatically routes uploads to the nearest Cloudflare data center:
```typescript
// Automatic performance optimization
const { uploadFiles } = upload.imageUpload();
// Uploads are automatically optimized for:
// - Nearest edge location
// - Fastest route to storage
// - Automatic retry on connection issues
```
### Built-in CDN
Your uploaded files are automatically cached globally:
```typescript
// Files are served from 250+ locations worldwide
const imageUrl = file.url; // Automatically CDN-accelerated
```
## Advanced Configuration
### Worker Integration
Integrate with Cloudflare Workers for server-side processing:
```typescript
// Advanced R2 setup with Workers
export const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
// ... basic config
workerScript: "image-transform", // Optional: Transform images on upload
webhookUrl: "https://api.yourdomain.com/webhook", // Optional: Post-upload webhook
})
.build();
```
### Analytics & Monitoring
Track upload performance:
```typescript
.hooks({
onUploadComplete: async ({ file, metadata }) => {
// Track successful uploads
await analytics.track("file_uploaded", {
provider: "cloudflare-r2",
size: file.size,
type: file.type,
location: metadata.cfRay, // Cloudflare location
});
}
})
```
## Security Best Practices
* **Use scoped API tokens** - Only grant permissions to specific buckets
* **Enable custom domain** - Better security than r2.dev subdomain
* **Set up WAF rules** - Protect against abuse via Cloudflare dashboard
* **Monitor usage** - Set up billing alerts for unexpected usage
## Common Issues
**CORS errors?** → Check your domain is in AllowedOrigins and verify custom domain setup\
**Access denied?** → Verify API token has Object:Read and Object:Write permissions\
**Slow uploads?** → Ensure you're using the correct endpoint URL with your account ID\
**Custom domain not working?** → Verify CNAME record and bucket public access settings
## Cost Comparison
| Provider | Storage | Egress | Requests |
| ----------------- | ------------ | ------------- | ------------- |
| **Cloudflare R2** | $0.015/GB | **FREE** | $0.36/million |
| AWS S3 | $0.023/GB | $0.09/GB | $0.40/million |
| **Savings** | **35% less** | **100% less** | **10% less** |
***
**Next:** [Upload Your First Image](/guides/uploads/images) or try [DigitalOcean Spaces](/guides/setup/digitalocean-spaces)
# DigitalOcean Spaces
URL: /docs/providers/digitalocean-spaces
Set up DigitalOcean Spaces with CDN for fast, affordable file uploads
***
title: DigitalOcean Spaces
description: Set up DigitalOcean Spaces with CDN for fast, affordable file uploads
----------------------------------------------------------------------------------
# Using DigitalOcean Spaces
Set up DigitalOcean Spaces for fast, affordable file uploads with built-in CDN.
## Why Choose DigitalOcean Spaces?
* **Predictable Pricing**: Simple pricing with generous free tier
* **Built-in CDN**: Free CDN with all Spaces for global performance
* **S3 Compatible**: Works seamlessly with S3 tools and libraries
* **Easy Setup**: Simple configuration with great developer experience
## 1. Create a Space
1. Go to [DigitalOcean Cloud Panel](https://cloud.digitalocean.com/spaces)
2. Click **"Create a Space"**
3. Choose a datacenter region (closest to your users)
4. Enter a unique Space name (e.g., `my-app-uploads`)
5. Choose **"Restrict File Listing"** for security
6. Enable **"CDN"** for global distribution
7. Click **"Create a Space"**
## 2. Configure CDN (Recommended)
DigitalOcean automatically creates a CDN endpoint:
* **Space URL**: `https://my-app-uploads.nyc3.digitaloceanspaces.com`
* **CDN URL**: `https://my-app-uploads.nyc3.cdn.digitaloceanspaces.com`
### Custom Domain (Optional)
Set up your own domain for branded URLs:
1. Go to **"Settings"** in your Space
2. Click **"Add Custom Domain"**
3. Enter your domain (e.g., `uploads.yourdomain.com`)
4. Add a CNAME record in your DNS:
```bash
# DNS Record:
uploads.yourdomain.com -> my-app-uploads.nyc3.cdn.digitaloceanspaces.com
```
## 3. Generate API Keys
1. Go to **"API"** in your DigitalOcean dashboard
2. Click **"Spaces access keys"**
3. Click **"Generate New Key"**
4. Enter a name (e.g., "My App Uploads")
5. **Save your Access Key and Secret Key**
## 4. Configure CORS
In your Space settings, add this CORS configuration:
```json
[
{
"AllowedHeaders": ["*"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
"AllowedOrigins": ["http://localhost:3000", "https://yourdomain.com"],
"ExposeHeaders": ["ETag", "Content-Length"],
"MaxAgeSeconds": 3000
}
]
```
## 5. Configure Your App
Add to your `.env.local`:
```bash
# DigitalOcean Spaces Configuration
AWS_ACCESS_KEY_ID=your_spaces_access_key
AWS_SECRET_ACCESS_KEY=your_spaces_secret_key
AWS_ENDPOINT_URL=https://nyc3.digitaloceanspaces.com
AWS_REGION=nyc3
S3_BUCKET_NAME=your-space-name
# CDN Configuration (recommended)
DIGITALOCEAN_CDN_ENDPOINT=https://your-space-name.nyc3.cdn.digitaloceanspaces.com
# Or your custom domain:
# DIGITALOCEAN_CDN_ENDPOINT=https://uploads.yourdomain.com
```
## 6. Update Your Upload Configuration
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
export const { s3 } = createUploadConfig()
.provider("digitalOceanSpaces",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: process.env.AWS_REGION!, // e.g., 'nyc3', 'sfo3', 'ams3'
bucket: process.env.S3_BUCKET_NAME!,
// Use CDN for faster file serving
cdnEndpoint: process.env.DIGITALOCEAN_CDN_ENDPOINT,
})
.defaults({
maxFileSize: "50MB", // Spaces supports larger files
acl: "public-read",
})
.build();
```
## 7. Test Your Setup
```bash
npx @pushduck/cli@latest test --provider digitalocean
```
This will verify your Spaces connection and upload a test file.
## ✅ You're Ready!
Your DigitalOcean Space is configured! Benefits:
* **Global CDN** - Files served from 12+ locations worldwide
* **Affordable pricing** - $5/month for 250GB + 1TB transfer
* **High performance** - Built-in CDN acceleration
## CDN Benefits
### Automatic Global Distribution
Your files are automatically cached worldwide:
```typescript
// Files are served from the nearest CDN location
const imageUrl = file.url; // Automatically CDN-accelerated
// CDN locations include:
// - North America: NYC, SF, Toronto
// - Europe: Amsterdam, London, Frankfurt
// - Asia: Singapore, Bangalore
```
### Cache Control
Optimize caching for different file types:
```typescript
// Configure caching per upload type
const imageUpload = s3
.image()
.max("10MB")
.onUploadComplete(async ({ file, key }) => {
// Set cache headers for images (long cache)
await setObjectMetadata(key, {
"Cache-Control": "public, max-age=31536000", // 1 year
"Content-Type": file.type,
});
});
const documentUpload = s3
.file()
.types(["pdf", "docx"])
.onUploadComplete(async ({ file, key }) => {
// Shorter cache for documents
await setObjectMetadata(key, {
"Cache-Control": "public, max-age=86400", // 1 day
});
});
```
## 🔧 Advanced Configuration
### Multiple Regions
Deploy across multiple regions for redundancy:
```typescript
// Primary region (closest to users)
const primarySpaces = createUploadConfig().provider("digitalOceanSpaces",{
region: "nyc3", // New York
bucket: "my-app-uploads-us",
});
// Backup region
const backupSpaces = createUploadConfig().provider("digitalOceanSpaces",{
region: "ams3", // Amsterdam
bucket: "my-app-uploads-eu",
});
```
### Lifecycle Policies
Automatically manage old files:
```typescript
// Configure automatic cleanup
.hooks({
onUploadComplete: async ({ file, metadata }) => {
// Schedule deletion of temporary files
if (metadata.category === 'temp') {
await scheduleCleanup(file.key, { days: 7 });
}
}
})
```
## 💰 Pricing Breakdown
| Resource | Included | Overage |
| ------------ | --------- | -------- |
| **Storage** | 250 GB | $0.02/GB |
| **Transfer** | 1 TB | $0.01/GB |
| **CDN** | Included | Free |
| **Requests** | Unlimited | Free |
**Total**: $5/month for most small-medium apps
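For example, a month with 300 GB stored and 1.2 TB transferred works out to $5 + (50 GB × $0.02) + (200 GB × $0.01) = $8.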
## 🔒 Security Features
### Private Spaces
For sensitive files:
```typescript
// Configure private access
export const { s3, } = createUploadConfig()
.provider("digitalOceanSpaces",{
// ... config
acl: "private", // Files not publicly accessible
})
.defaults({
generatePresignedUrl: true, // Generate secure URLs
urlExpirationHours: 1, // URLs expire after 1 hour
})
.build();
```
### File Access Control
Control who can access what:
```typescript
.middleware(async ({ req, file }) => {
const user = await authenticate(req);
// Only allow users to upload to their own folder
const userPrefix = `users/${user.id}/`;
return {
userId: user.id,
keyPrefix: userPrefix,
};
})
```
## 📊 Monitoring & Analytics
### Usage Monitoring
Track your Space usage:
```typescript
// Monitor uploads in real-time
.hooks({
onUploadStart: async ({ file }) => {
await analytics.track("upload_started", {
provider: "digitalocean-spaces",
fileSize: file.size,
fileType: file.type,
});
},
onUploadComplete: async ({ file, metadata }) => {
await analytics.track("upload_completed", {
provider: "digitalocean-spaces",
duration: metadata.uploadTime,
cdnEnabled: true,
});
}
})
```
## 🚨 Common Issues
**CORS errors?** → Verify your domain is in AllowedOrigins and CORS is enabled\
**Slow uploads?** → Check you're using the correct regional endpoint\
**CDN not working?** → Verify CDN is enabled and using the cdn.digitaloceanspaces.com endpoint\
**Access denied?** → Check your API keys have Spaces read/write permissions\
**File not found?** → Ensure you're using the CDN endpoint for file access
## 🚀 Performance Tips
1. **Use CDN endpoints** for all file access (not direct Space URLs)
2. **Choose closest region** to your primary user base
3. **Enable gzip compression** for text files
4. **Set proper cache headers** for different file types
5. **Use progressive image formats** (WebP, AVIF) when possible
***
**Next:** [Upload Your First Image](/guides/uploads/images) or try [MinIO Setup](/guides/setup/minio)
# Google Cloud Storage
URL: /docs/providers/google-cloud
Set up Google Cloud Storage for scalable, global file uploads
***
title: Google Cloud Storage
description: Set up Google Cloud Storage for scalable, global file uploads
--------------------------------------------------------------------------
# Using Google Cloud Storage
Set up Google Cloud Storage (GCS) for scalable, global file uploads with Google's infrastructure.
## Why Choose Google Cloud Storage?
* **🌍 Global Infrastructure**: Google's worldwide network for fast access
* **🔄 S3 Compatible**: Works with S3-compatible libraries via XML API
* **💰 Competitive Pricing**: Cost-effective with multiple storage classes
* **🔒 Enterprise Security**: Google-grade security and compliance
* **⚡ High Performance**: Optimized for speed and reliability
## 1. Create a GCS Bucket
### Using Google Cloud Console
1. Go to [Google Cloud Console](https://console.cloud.google.com/storage)
2. Select or create a project
3. Click **"Create bucket"**
4. Configure your bucket:
* **Name**: Choose a globally unique name (e.g., `my-app-uploads-bucket`)
* **Location**: Choose region closest to your users
* **Storage class**: Standard (for frequently accessed files)
* **Access control**: Fine-grained (recommended)
5. Click **"Create"**
### Using gcloud CLI
```bash
# Install gcloud CLI (if not already installed)
curl https://sdk.cloud.google.com | bash
exec -l $SHELL
gcloud init
# Create bucket
gsutil mb -p your-project-id -c STANDARD -l us-central1 gs://my-app-uploads-bucket
```
## 2. Configure Public Access (Optional)
### For Public Files (Images, Documents)
```bash
# Make bucket publicly readable
gsutil iam ch allUsers:objectViewer gs://my-app-uploads-bucket
# Or using Console:
# Storage → Bucket → Permissions → Add → "allUsers" → "Storage Object Viewer"
```
### For Private Files
Keep default settings - files will only be accessible via signed URLs.
## 3. Create Service Account
1. Go to [IAM & Admin](https://console.cloud.google.com/iam-admin/serviceaccounts)
2. Click **"Create Service Account"**
3. Enter details:
* **Name**: `upload-service`
* **Description**: "Service account for file uploads"
4. Click **"Create and Continue"**
5. Grant roles:
* **Storage Admin** (or **Storage Object Admin** for bucket-specific access)
6. Click **"Continue"** → **"Done"**
## 4. Generate Service Account Key
1. Click on your service account
2. Go to **"Keys"** tab
3. Click **"Add Key"** → **"Create new key"**
4. Choose **"JSON"** format
5. **Download and securely store the JSON file**
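To confirm the key works before wiring it into your app, here is a minimal sketch using the official `@google-cloud/storage` client (the bucket name is a placeholder):
```typescript
import { Storage } from "@google-cloud/storage";

const storage = new Storage({
  projectId: process.env.GCS_PROJECT_ID,
  keyFilename: "./path/to/service-account-key.json",
});

// exists() resolves to a one-element tuple: [boolean]
const [exists] = await storage.bucket("my-app-uploads-bucket").exists();
console.log(exists ? "Credentials and bucket OK" : "Bucket not found");
```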
## 5. Configure CORS
```bash
# Create cors.json file
cat > cors.json << EOF
[
{
"origin": ["http://localhost:3000", "https://yourdomain.com"],
"method": ["GET", "PUT", "POST", "DELETE", "HEAD"],
"responseHeader": ["Content-Type", "ETag", "Content-Length"],
"maxAgeSeconds": 3600
}
]
EOF
# Apply CORS configuration
gsutil cors set cors.json gs://my-app-uploads-bucket
```
## 6. Configure Your App
Add to your `.env.local`:
```bash
# Google Cloud Storage Configuration
GOOGLE_APPLICATION_CREDENTIALS=./path/to/service-account-key.json
GCS_PROJECT_ID=your-project-id
GCS_BUCKET_NAME=my-app-uploads-bucket
# Optional: Custom domain for public files
GCS_PUBLIC_URL=https://storage.googleapis.com/my-app-uploads-bucket
# Or with custom domain: GCS_PUBLIC_URL=https://uploads.yourdomain.com
```
## 7. Update Your Upload Configuration
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
export const { s3, } = createUploadConfig()
.provider("gcs",{
projectId: process.env.GCS_PROJECT_ID!,
keyFilename: process.env.GOOGLE_APPLICATION_CREDENTIALS!,
bucket: process.env.GCS_BUCKET_NAME!,
// Optional: Custom public URL
publicUrl: process.env.GCS_PUBLIC_URL,
})
.defaults({
maxFileSize: "100MB", // GCS supports large files
acl: "publicRead", // For public access
})
.build();
```
## 8. Test Your Setup
```bash
npx @pushduck/cli@latest test --provider gcs
```
This will verify your GCS connection and upload a test file.
## ✅ You're Ready!
Your Google Cloud Storage is configured! Benefits:
* **Global performance** via Google's network
* **Automatic scaling** to handle any load
* **Enterprise-grade security** and compliance
* **Multiple storage classes** for cost optimization
## 🚀 Advanced Features
### Multi-Regional Storage
```typescript
// Configure multi-regional bucket for global performance
export const { s3, } = createUploadConfig()
.provider("gcs",{
// ... basic config
bucket: process.env.GCS_BUCKET_NAME!,
// Multi-regional configuration
location: "US", // or "EU", "ASIA"
storageClass: "MULTI_REGIONAL",
})
.build();
```
### Storage Classes
Optimize costs with different storage classes:
```typescript
// Configure lifecycle policies for cost optimization
.hooks({
onUploadComplete: async ({ file, key, metadata }) => {
// Move old files to cheaper storage classes
if (metadata.category === 'archive') {
await moveToStorageClass(key, 'COLDLINE');
} else if (metadata.category === 'backup') {
await moveToStorageClass(key, 'ARCHIVE');
}
}
})
```
### CDN Integration
```typescript
// Use Cloud CDN for faster global delivery
export const { s3, } = createUploadConfig()
.provider("gcs",{
// ... config
// Configure CDN endpoint
cdnUrl: "https://your-cdn-domain.com",
// Cache control headers
defaultCacheControl: "public, max-age=31536000",
})
.build();
```
## 🔒 Security Best Practices
### Bucket-Level IAM
```bash
# Create custom role with minimal permissions
gcloud iam roles create upload_service_role \
--project=your-project-id \
--title="Upload Service Role" \
--permissions="storage.objects.create,storage.objects.get"
# Bind role to service account
gcloud projects add-iam-policy-binding your-project-id \
--member="serviceAccount:upload-service@your-project-id.iam.gserviceaccount.com" \
--role="projects/your-project-id/roles/upload_service_role"
```
### Object-Level Security
```typescript
// Implement user-based access control
.middleware(async ({ req, file }) => {
const user = await authenticate(req);
// Generate user-specific path
const userPath = `users/${user.id}/${file.name}`;
return {
userId: user.id,
keyPrefix: `users/${user.id}/`,
metadata: {
uploadedBy: user.id,
uploadedAt: new Date().toISOString(),
}
};
})
```
### Signed URLs for Private Access
```typescript
// Generate time-limited access URLs
.hooks({
onUploadComplete: async ({ file, key }) => {
if (file.private) {
// Generate signed URL valid for 1 hour
const signedUrl = await generateSignedUrl(key, {
action: 'read',
expires: Date.now() + 60 * 60 * 1000, // 1 hour
});
return { ...file, url: signedUrl };
}
return file;
}
})
```
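The `generateSignedUrl` helper above is not provided by pushduck; one possible implementation using `@google-cloud/storage` v4 signing could look like this:
```typescript
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

async function generateSignedUrl(
  key: string,
  options: { action: "read"; expires: number }
): Promise<string> {
  // getSignedUrl resolves to a one-element tuple: [string]
  const [url] = await storage
    .bucket(process.env.GCS_BUCKET_NAME!)
    .file(key)
    .getSignedUrl({
      version: "v4",
      action: options.action,
      expires: options.expires, // absolute timestamp in milliseconds
    });
  return url;
}
```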
## 💰 Cost Optimization
### Storage Class Strategy
| Use Case | Storage Class | Cost | Access Pattern |
| ---------------- | ------------- | ------ | ---------------- |
| **Active files** | Standard | Higher | Frequent access |
| **Backups** | Nearline | Medium | Monthly access |
| **Archives** | Coldline | Lower | Quarterly access |
| **Long-term** | Archive | Lowest | Yearly access |
### Lifecycle Management
```typescript
// Automatic lifecycle transitions
const lifecyclePolicy = {
rule: [
{
action: { type: "SetStorageClass", storageClass: "NEARLINE" },
condition: { age: 30 }, // Move to Nearline after 30 days
},
{
action: { type: "SetStorageClass", storageClass: "COLDLINE" },
condition: { age: 90 }, // Move to Coldline after 90 days
},
{
action: { type: "Delete" },
condition: { age: 365 }, // Delete after 1 year
},
],
};
```
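The policy object above still has to be applied to the bucket. A sketch of doing that with `@google-cloud/storage`, which exposes `addLifecycleRule` for this purpose:
```typescript
import { Storage } from "@google-cloud/storage";

const storage = new Storage();
const bucket = storage.bucket(process.env.GCS_BUCKET_NAME!);

// Each call appends one rule to the bucket's lifecycle configuration
await bucket.addLifecycleRule({
  action: { type: "SetStorageClass", storageClass: "NEARLINE" },
  condition: { age: 30 },
});
await bucket.addLifecycleRule({
  action: { type: "Delete" },
  condition: { age: 365 },
});
```
Alternatively, `gsutil lifecycle set lifecycle.json gs://your-bucket` applies a JSON policy file directly.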
## 📊 Monitoring & Analytics
### Cloud Monitoring Integration
```typescript
// Track upload metrics
.hooks({
onUploadStart: async ({ file }) => {
await cloudMonitoring.createTimeSeries({
name: 'custom.googleapis.com/upload/started',
value: 1,
labels: {
file_type: file.type,
file_size_mb: Math.round(file.size / 1024 / 1024),
}
});
},
onUploadComplete: async ({ file, metadata }) => {
await cloudMonitoring.createTimeSeries({
name: 'custom.googleapis.com/upload/completed',
value: metadata.uploadDuration,
labels: {
success: 'true',
provider: 'gcs',
}
});
}
})
```
### Usage Analytics
```typescript
// Track storage usage and costs
const getStorageMetrics = async () => {
const bucket = storage.bucket(process.env.GCS_BUCKET_NAME!);
const [metadata] = await bucket.getMetadata();
return {
totalSize: metadata.storageClass?.totalBytes,
objectCount: metadata.storageClass?.objectCount,
storageClass: metadata.storageClass?.name,
location: metadata.location,
};
};
```
## 🌐 Custom Domain Setup
### 1. Verify Domain Ownership
```bash
# Add verification record to DNS
# TXT record: google-site-verification=your-verification-code
# Verify domain
gcloud domains verify yourdomain.com
```
### 2. Configure CNAME
```bash
# Add CNAME record to DNS
# uploads.yourdomain.com -> c.storage.googleapis.com
```
### 3. Update Configuration
```typescript
// Use custom domain for file URLs
export const { s3, } = createUploadConfig()
.provider("gcs",{
// ... config
customDomain: "https://uploads.yourdomain.com",
})
.build();
```
## 🚨 Common Issues
**Authentication errors?** → Check service account key path and permissions\
**CORS errors?** → Verify CORS configuration and allowed origins\
**Access denied?** → Check IAM roles and bucket permissions\
**Slow uploads?** → Choose region closer to your users\
**Quota exceeded?** → Check project quotas and billing account
## 🚀 Performance Tips
1. **Choose the right region** - closest to your users
2. **Use multi-regional** for global applications
3. **Enable CDN** for frequently accessed files
4. **Optimize image sizes** before upload
5. **Use parallel uploads** for multiple files
6. **Implement proper caching** headers
***
**Next:** [Upload Your First Image](/guides/uploads/images) or check out our [Examples](/examples)
# MinIO
URL: /docs/providers/minio
Set up MinIO for self-hosted S3-compatible object storage
***
title: MinIO
description: Set up MinIO for self-hosted S3-compatible object storage
----------------------------------------------------------------------
# Using MinIO
Set up MinIO for self-hosted, S3-compatible object storage with full control and privacy.
## Why Choose MinIO?
* **🏠 Self-Hosted**: Complete control over your data and infrastructure
* **🔄 S3 Compatible**: Drop-in replacement for AWS S3 API
* **💰 Cost Effective**: No cloud provider fees - just your server costs
* **🔒 Private**: Keep sensitive data on your own infrastructure
* **⚡ High Performance**: Optimized for speed and throughput
## 🚀 Quick Start with Docker
### 1. Start MinIO Server
```bash
# Create data directory
mkdir -p ~/minio/data
# Start MinIO server
docker run -d \
--name minio \
-p 9000:9000 \
-p 9001:9001 \
-v ~/minio/data:/data \
-e "MINIO_ROOT_USER=minioadmin" \
-e "MINIO_ROOT_PASSWORD=minioadmin123" \
quay.io/minio/minio server /data --console-address ":9001"
```
### 2. Access MinIO Console
1. Open `http://localhost:9001` in your browser
2. Login with:
* **Username**: `minioadmin`
* **Password**: `minioadmin123`
### 3. Create a Bucket
1. Click **"Create Bucket"**
2. Enter bucket name (e.g., `my-app-uploads`)
3. Click **"Create Bucket"**
### 4. Create Access Keys
1. Go to **"Identity"** → **"Service Accounts"**
2. Click **"Create service account"**
3. Enter a name (e.g., "Upload Service")
4. Click **"Create"**
5. **Save your Access Key and Secret Key**
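Before moving on, you can sanity-check the new keys with a short script (assumes `@aws-sdk/client-s3` is installed; the credentials are the ones you just generated):
```typescript
import { S3Client, ListBucketsCommand } from "@aws-sdk/client-s3";

const client = new S3Client({
  region: "us-east-1",
  endpoint: "http://localhost:9000",
  forcePathStyle: true, // MinIO requires path-style URLs
  credentials: {
    accessKeyId: "your_minio_access_key",
    secretAccessKey: "your_minio_secret_key",
  },
});

const { Buckets } = await client.send(new ListBucketsCommand({}));
console.log(Buckets?.map((b) => b.Name)); // should include "my-app-uploads"
```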
## 🏭 Production Docker Setup
### Docker Compose Configuration
```yaml
# docker-compose.yml
version: "3.8"
services:
minio:
image: quay.io/minio/minio:latest
container_name: minio
ports:
- "9000:9000" # API
- "9001:9001" # Console
volumes:
- minio_data:/data
environment:
MINIO_ROOT_USER: ${MINIO_ROOT_USER}
MINIO_ROOT_PASSWORD: ${MINIO_ROOT_PASSWORD}
command: server /data --console-address ":9001"
restart: unless-stopped
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
interval: 30s
timeout: 20s
retries: 3
volumes:
minio_data:
```
### Environment Variables
```bash
# .env
MINIO_ROOT_USER=your-admin-username
MINIO_ROOT_PASSWORD=your-secure-password-min-8-chars
```
### Start Production Server
```bash
docker-compose up -d
```
## 🌐 Production Deployment
### 1. Reverse Proxy Setup (Nginx)
```nginx
# /etc/nginx/sites-available/minio
server {
listen 80;
server_name uploads.yourdomain.com;
location / {
proxy_pass http://localhost:9000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# Handle large uploads
client_max_body_size 100M;
}
}
# Console (optional - for admin access)
server {
listen 80;
server_name minio-console.yourdomain.com;
location / {
proxy_pass http://localhost:9001;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
```
### 2. SSL/TLS Setup
```bash
# Install Certbot
sudo apt install certbot python3-certbot-nginx
# Get SSL certificates
sudo certbot --nginx -d uploads.yourdomain.com
sudo certbot --nginx -d minio-console.yourdomain.com
```
## 🔧 Configure Your App
Add to your `.env.local`:
```bash
# MinIO Configuration
AWS_ACCESS_KEY_ID=your_minio_access_key
AWS_SECRET_ACCESS_KEY=your_minio_secret_key
AWS_ENDPOINT_URL=http://localhost:9000
# For production: AWS_ENDPOINT_URL=https://uploads.yourdomain.com
AWS_REGION=us-east-1
S3_BUCKET_NAME=my-app-uploads
# Optional: Custom public URL
MINIO_PUBLIC_URL=https://uploads.yourdomain.com
```
## 📝 Update Your Upload Configuration
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
export const { s3, } = createUploadConfig()
.provider("minio",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
endpoint: process.env.AWS_ENDPOINT_URL!,
region: process.env.AWS_REGION!,
bucket: process.env.S3_BUCKET_NAME!,
// Force path-style URLs for MinIO
forcePathStyle: true,
// Optional: Custom public URL for file access
publicUrl: process.env.MINIO_PUBLIC_URL,
})
.defaults({
maxFileSize: "100MB", // MinIO handles large files well
acl: "public-read",
})
.build();
```
## 🧪 Test Your Setup
```bash
npx @pushduck/cli@latest test --provider minio
```
This will verify your MinIO connection and upload a test file.
## ✅ You're Ready!
Your MinIO server is configured! Benefits:
* **Full control** over your data
* **No cloud fees** - just server costs
* **High performance** for local/regional traffic
* **Complete privacy** for sensitive files
## 🔒 Security Configuration
### 1. Bucket Policies
Set up access control for your bucket:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "*"
},
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::my-app-uploads/*"
}
]
}
```
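MinIO accepts this policy through the standard S3 API, so you can apply it from code as well as from the console. A minimal sketch with the AWS SDK v3 (assumes `@aws-sdk/client-s3` is installed):
```typescript
import { S3Client, PutBucketPolicyCommand } from "@aws-sdk/client-s3";

const client = new S3Client({
  region: "us-east-1",
  endpoint: process.env.AWS_ENDPOINT_URL,
  forcePathStyle: true,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

// Apply the public-read policy shown above
await client.send(
  new PutBucketPolicyCommand({
    Bucket: "my-app-uploads",
    Policy: JSON.stringify({
      Version: "2012-10-17",
      Statement: [
        {
          Effect: "Allow",
          Principal: { AWS: "*" },
          Action: "s3:GetObject",
          Resource: "arn:aws:s3:::my-app-uploads/*",
        },
      ],
    }),
  })
);
```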
### 2. Access Key Management
Create restricted access keys:
```typescript
// Service account with limited permissions
const uploadOnlyPolicy = {
Version: "2012-10-17",
Statement: [
{
Effect: "Allow",
Action: ["s3:PutObject", "s3:PutObjectAcl", "s3:GetObject"],
Resource: "arn:aws:s3:::my-app-uploads/*",
},
],
};
```
### 3. Network Security
```yaml
# docker-compose.yml with network isolation
version: "3.8"
services:
minio:
# ... other config
networks:
- minio-network
ports:
# Only expose what's needed
- "127.0.0.1:9000:9000" # Bind to localhost only
- "127.0.0.1:9001:9001"
networks:
minio-network:
driver: bridge
```
## 📊 Monitoring & Maintenance
### Health Checks
```typescript
// Add health monitoring
.hooks({
onUploadStart: async () => {
// Check MinIO connectivity
const isHealthy = await checkMinIOHealth();
if (!isHealthy) {
throw new Error("MinIO server unavailable");
}
}
})
async function checkMinIOHealth() {
try {
const response = await fetch(`${process.env.AWS_ENDPOINT_URL}/minio/health/live`);
return response.ok;
} catch {
return false;
}
}
```
### Backup Strategy
```bash
#!/bin/bash
# backup-minio.sh - regular MinIO data backups
BACKUP_DIR="/backups/minio/$(date +%Y-%m-%d)"
mkdir -p $BACKUP_DIR
# Backup MinIO data
docker run --rm \
-v minio_data:/source:ro \
-v $BACKUP_DIR:/backup \
alpine tar czf /backup/minio-data.tar.gz -C /source .
# Backup configuration
docker exec minio mc admin config export /backup/
```
## 🚀 Performance Optimization
### 1. Storage Configuration
```yaml
# Optimized for performance
services:
minio:
# ... other config
environment:
# Performance tuning
MINIO_CACHE: "on"
MINIO_CACHE_DRIVES: "/tmp/cache"
MINIO_CACHE_QUOTA: "90"
volumes:
- minio_data:/data
- /tmp/minio-cache:/tmp/cache # Fast SSD cache
```
### 2. Connection Optimization
```typescript
// Optimize for high throughput
export const { s3, } = createUploadConfig()
.provider("minio",{
// ... config
maxRetries: 3,
retryDelayOptions: {
base: 300,
customBackoff: (retryCount) => Math.pow(2, retryCount) * 100,
},
// Connection pooling
maxSockets: 25,
timeout: 120000,
})
.build();
```
## 🔧 Advanced Features
### Multi-Tenant Setup
```typescript
// Different buckets per tenant
const createTenantConfig = (tenantId: string) =>
createUploadConfig()
.provider("minio",{
// ... base config
bucket: `tenant-${tenantId}-uploads`,
})
.middleware(async ({ req }) => {
const tenant = await getTenantFromRequest(req);
return { tenantId: tenant.id };
})
.build();
```
### Distributed Setup
```yaml
# Multi-node MinIO cluster
version: "3.8"
services:
minio1:
image: quay.io/minio/minio:latest
command: server http://minio{1...4}/data{1...2} --console-address ":9001"
# ... configuration
minio2:
image: quay.io/minio/minio:latest
command: server http://minio{1...4}/data{1...2} --console-address ":9001"
# ... configuration
# minio3, minio4...
```
## 🚨 Common Issues
**Connection refused?** → Check MinIO is running and port 9000 is accessible\
**Access denied?** → Verify access keys and bucket permissions\
**CORS errors?** → Set bucket policy to allow your domain\
**Slow uploads?** → Check network connection and server resources\
**SSL errors?** → Verify certificate configuration for custom domains
## 💡 Use Cases
### Development Environment
* **Local testing** without cloud dependency
* **Offline development** for air-gapped environments
* **Cost-free** development and testing
### Production Scenarios
* **Data sovereignty** requirements
* **High-security** environments
* **Edge computing** deployments
* **Hybrid cloud** strategies
***
**Next:** [Upload Your First Image](/guides/uploads/images) or explore [Google Cloud Storage](/guides/setup/google-cloud)
# S3-Compatible (Generic)
URL: /docs/providers/s3-compatible
Set up any S3-compatible storage service for flexible, vendor-agnostic file uploads
***
title: S3-Compatible (Generic)
description: Set up any S3-compatible storage service for flexible, vendor-agnostic file uploads
------------------------------------------------------------------------------------------------
import { Step, Steps } from "fumadocs-ui/components/steps";
import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
# S3-Compatible Storage
Connect to any S3-compatible storage service for flexible, vendor-agnostic file uploads with full type safety.
## Why Choose S3-Compatible?
* **🔧 Vendor Flexibility**: Works with any S3-compatible service
* **🏠 Self-Hosted Options**: Perfect for custom or self-hosted solutions
* **🔄 Standard API**: Uses the familiar S3 API across all providers
* **💰 Cost Control**: Choose providers based on your budget needs
* **🛡️ Data Sovereignty**: Keep data where you need it geographically
**Perfect for**: Self-hosted MinIO, SeaweedFS, Garage, custom storage solutions, or any S3-compatible service not explicitly supported by dedicated providers.
## Common S3-Compatible Services
| Service | Use Case | Best For |
| -------------------- | ------------------- | -------------------------------- |
| **SeaweedFS** | Distributed storage | High-performance clusters |
| **Garage** | Lightweight storage | Self-hosted, minimal resources |
| **Ceph RadosGW** | Enterprise storage | Large-scale deployments |
| **Wasabi** | Cloud storage | Cost-effective cloud alternative |
| **Backblaze B2** | Backup storage | Archive and backup scenarios |
| **Custom Solutions** | Specialized needs | Custom implementations |
## Identify Your S3-Compatible Service
First, gather the required information from your storage provider:
### Required Information
* **Endpoint URL**: The API endpoint for your service
* **Access Key ID**: Your access key or username
* **Secret Access Key**: Your secret key or password
* **Bucket Name**: The bucket/container where files will be stored
### Common Endpoint Patterns
```bash
# Self-hosted MinIO
https://minio.yourdomain.com
# SeaweedFS
https://seaweedfs.yourdomain.com:8333
# Wasabi (if not using dedicated provider)
https://s3.wasabisys.com
# Backblaze B2 (S3-compatible endpoint)
https://s3.us-west-000.backblazeb2.com
# Custom deployment
https://storage.yourcompany.com
```
## Verify S3 API Compatibility
Ensure your service supports the required S3 operations:
### Required Operations
* `PutObject` - Upload files
* `GetObject` - Download files
* `DeleteObject` - Delete files
* `ListObjects` - List bucket contents
* `CreateMultipartUpload` - Large file uploads
### Test API Access
```bash
# Test basic connectivity
curl -X GET "https://your-endpoint.com" \
-H "Authorization: AWS ACCESS_KEY:SECRET_KEY"
# Test bucket access
curl -X GET "https://your-endpoint.com/your-bucket" \
-H "Authorization: AWS ACCESS_KEY:SECRET_KEY"
```
```bash
# Configure AWS CLI for testing
aws configure set aws_access_key_id YOUR_ACCESS_KEY
aws configure set aws_secret_access_key YOUR_SECRET_KEY
aws configure set default.region us-east-1
# Test with custom endpoint
aws s3 ls s3://your-bucket --endpoint-url https://your-endpoint.com
```
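From Node, the same checks can be scripted as a small put/get round trip (assumes `@aws-sdk/client-s3` is installed; the endpoint, bucket, and keys are placeholders):
```typescript
import {
  S3Client,
  PutObjectCommand,
  GetObjectCommand,
} from "@aws-sdk/client-s3";

const client = new S3Client({
  region: "us-east-1",
  endpoint: "https://your-endpoint.com",
  forcePathStyle: true, // most self-hosted services need path-style URLs
  credentials: {
    accessKeyId: "YOUR_ACCESS_KEY",
    secretAccessKey: "YOUR_SECRET_KEY",
  },
});

// Upload a small object...
await client.send(
  new PutObjectCommand({
    Bucket: "your-bucket",
    Key: "connectivity-test.txt",
    Body: "hello",
  })
);

// ...then read it back to verify GetObject works too
const { Body } = await client.send(
  new GetObjectCommand({ Bucket: "your-bucket", Key: "connectivity-test.txt" })
);
console.log(await Body?.transformToString()); // "hello"
```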
## Configure CORS (If Required)
Many S3-compatible services require CORS configuration for web uploads:
### Standard CORS Configuration
```bash
# Using MinIO client (mc)
mc cors set your-bucket --rule "effect=Allow&origin=*&methods=GET,PUT,POST,DELETE&headers=*"
# For production, restrict origins:
mc cors set your-bucket --rule "effect=Allow&origin=https://yourdomain.com&methods=GET,PUT,POST,DELETE&headers=*"
```
```json
[
{
"AllowedHeaders": ["*"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
"AllowedOrigins": [
"http://localhost:3000",
"https://yourdomain.com"
],
"ExposeHeaders": ["ETag", "Content-Length"],
"MaxAgeSeconds": 3600
}
]
```
## Configure Environment Variables
Set up your environment variables for the S3-compatible service:
```bash
# .env.local
S3_ENDPOINT=https://your-storage-service.com
S3_BUCKET_NAME=your-bucket-name
S3_ACCESS_KEY_ID=your_access_key
S3_SECRET_ACCESS_KEY=your_secret_key
S3_REGION=us-east-1
# Optional: Force path-style URLs (required for most self-hosted)
S3_FORCE_PATH_STYLE=true
# Optional: Custom public URL for file access
S3_PUBLIC_URL=https://files.yourdomain.com
```
```bash
# Use your hosting platform's environment system
# Never store production keys in .env files
S3_ENDPOINT=https://your-storage-service.com
S3_BUCKET_NAME=your-bucket-name
S3_ACCESS_KEY_ID=your_access_key
S3_SECRET_ACCESS_KEY=your_secret_key
S3_REGION=us-east-1
S3_FORCE_PATH_STYLE=true
S3_PUBLIC_URL=https://files.yourdomain.com
```
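Path-style URLs address objects as `https://your-storage-service.com/your-bucket/key`, while virtual-hosted style uses `https://your-bucket.your-storage-service.com/key`. Most self-hosted services only support the former, which is why `S3_FORCE_PATH_STYLE=true` is the safe default here.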
## Update Your Upload Configuration
Configure pushduck to use your S3-compatible service:
```typescript
// lib/upload.ts
import { createUploadConfig } from "pushduck/server";
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: process.env.S3_ENDPOINT!,
bucket: process.env.S3_BUCKET_NAME!,
accessKeyId: process.env.S3_ACCESS_KEY_ID!,
secretAccessKey: process.env.S3_SECRET_ACCESS_KEY!,
region: process.env.S3_REGION || "us-east-1",
// Most S3-compatible services need path-style URLs
forcePathStyle: process.env.S3_FORCE_PATH_STYLE === "true",
// Optional: Custom domain for file access
customDomain: process.env.S3_PUBLIC_URL,
})
.defaults({
maxFileSize: "50MB",
acl: "public-read", // Adjust based on your needs
})
.build();
```
### Advanced Configuration
```typescript
// For services with specific requirements
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: process.env.S3_ENDPOINT!,
bucket: process.env.S3_BUCKET_NAME!,
accessKeyId: process.env.S3_ACCESS_KEY_ID!,
secretAccessKey: process.env.S3_SECRET_ACCESS_KEY!,
region: process.env.S3_REGION || "us-east-1",
forcePathStyle: true,
})
.paths({
// Organize files by service type
prefix: "uploads",
generateKey: (file, metadata) => {
const date = new Date().toISOString().split('T')[0];
const random = Math.random().toString(36).substring(2, 8);
return `${date}/${random}/${file.name}`;
},
})
.security({
allowedOrigins: [
process.env.FRONTEND_URL!,
"http://localhost:3000",
],
})
.build();
```
## Test Your Configuration
Verify everything works correctly:
```bash
# Test with pushduck CLI
npx @pushduck/cli@latest test --provider s3-compatible
# Or test manually in your app
npm run dev
```
### Manual Testing
```typescript
// Create a simple test route
// pages/api/test-upload.ts or app/api/test-upload/route.ts
import { s3 } from '@/lib/upload';
export async function POST() {
try {
// Test creating an upload route
const imageUpload = s3.image().max("5MB");
return Response.json({
success: true,
message: "S3-compatible storage configured correctly"
});
} catch (error) {
return Response.json({
success: false,
error: error.message
}, { status: 500 });
}
}
```
## ✅ You're Ready!
Your S3-compatible storage is now configured! You can now:
* ✅ **Upload files** to your custom storage service
* ✅ **Generate secure URLs** for file access
* ✅ **Use familiar S3 APIs** with any compatible service
* ✅ **Maintain vendor independence** with standard protocols
## 🔧 Service-Specific Configurations
### SeaweedFS
```typescript
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: "https://seaweedfs.yourdomain.com:8333",
bucket: "uploads",
accessKeyId: process.env.SEAWEEDFS_ACCESS_KEY!,
secretAccessKey: process.env.SEAWEEDFS_SECRET_KEY!,
region: "us-east-1",
forcePathStyle: true, // Required for SeaweedFS
})
.build();
```
### Garage
```typescript
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: "https://garage.yourdomain.com",
bucket: "my-app-files",
accessKeyId: process.env.GARAGE_ACCESS_KEY!,
secretAccessKey: process.env.GARAGE_SECRET_KEY!,
region: "garage", // Garage-specific region
forcePathStyle: true,
})
.build();
```
### Wasabi (Alternative to dedicated provider)
```typescript
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: "https://s3.wasabisys.com",
bucket: "my-wasabi-bucket",
accessKeyId: process.env.WASABI_ACCESS_KEY!,
secretAccessKey: process.env.WASABI_SECRET_KEY!,
region: "us-east-1",
forcePathStyle: false, // Wasabi supports virtual-hosted style
})
.build();
```
### Backblaze B2 (S3-Compatible API)
```typescript
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
endpoint: "https://s3.us-west-000.backblazeb2.com",
bucket: "my-b2-bucket",
accessKeyId: process.env.B2_ACCESS_KEY!,
secretAccessKey: process.env.B2_SECRET_KEY!,
region: "us-west-000",
forcePathStyle: false,
})
.build();
```
## 🔒 Security Best Practices
### Access Control
```typescript
// Implement user-based access control
.middleware(async ({ req, file }) => {
const user = await authenticate(req);
// Create user-specific paths
const userPath = `users/${user.id}`;
return {
userId: user.id,
keyPrefix: userPath,
metadata: {
uploadedBy: user.id,
uploadedAt: new Date().toISOString(),
}
};
})
```
### Bucket Policies
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::your-bucket/public/*"
},
{
"Effect": "Deny",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::your-bucket/private/*"
}
]
}
```
### Environment Security
```typescript
// Validate configuration at startup
const validateConfig = () => {
const required = [
'S3_ENDPOINT',
'S3_BUCKET_NAME',
'S3_ACCESS_KEY_ID',
'S3_SECRET_ACCESS_KEY'
];
for (const key of required) {
if (!process.env[key]) {
throw new Error(`Missing required environment variable: ${key}`);
}
}
};
validateConfig();
```
## 📊 Monitoring & Analytics
### Health Monitoring
```typescript
// Monitor storage service health
.hooks({
onUploadStart: async ({ file }) => {
// Check service availability
const isHealthy = await checkStorageHealth();
if (!isHealthy) {
throw new Error("Storage service unavailable");
}
},
onUploadComplete: async ({ file, metadata }) => {
// Track successful uploads
await analytics.track("upload_completed", {
provider: "s3-compatible",
service: process.env.S3_ENDPOINT,
fileSize: file.size,
duration: metadata.uploadTime,
});
}
})
async function checkStorageHealth(): Promise<boolean> {
try {
const response = await fetch(`${process.env.S3_ENDPOINT}/health`);
return response.ok;
} catch {
return false;
}
}
```
### Usage Analytics
```typescript
// Track storage usage patterns
const getStorageMetrics = async () => {
try {
// Use your service's API to get metrics
const metrics = await fetch(`${process.env.S3_ENDPOINT}/metrics`, {
headers: {
'Authorization': `Bearer ${process.env.S3_ACCESS_KEY_ID}`,
}
});
return await metrics.json();
} catch (error) {
console.error("Failed to fetch storage metrics:", error);
return null;
}
};
```
## 🚀 Performance Optimization
### Connection Pooling
```typescript
// Optimize for high throughput
export const { s3, config } = createUploadConfig()
.provider("s3Compatible", {
// ... config
maxRetries: 3,
retryDelayOptions: {
base: 300,
customBackoff: (retryCount) => Math.pow(2, retryCount) * 100,
},
timeout: 60000,
})
.build();
```
### Parallel Uploads
```typescript
// Enable multipart uploads for large files
.defaults({
maxFileSize: "100MB",
// Configure multipart threshold
multipartUploadThreshold: "25MB",
multipartUploadSize: "5MB",
})
```
## 🚨 Common Issues
### Connection Issues
**Certificate errors?** → Add SSL certificate or use `NODE_TLS_REJECT_UNAUTHORIZED=0` for development\
**Connection refused?** → Verify endpoint URL and port are correct\
**Timeout errors?** → Increase timeout settings or check network connectivity
### Authentication Issues
**Access denied?** → Verify access keys and bucket permissions\
**Invalid signature?** → Check secret key and ensure clock synchronization\
**Region mismatch?** → Verify the region setting matches your service
### Upload Issues
**CORS errors?** → Configure CORS policy on your storage service\
**File size errors?** → Check service limits and adjust `maxFileSize`\
**Path errors?** → Enable `forcePathStyle: true` for most self-hosted services
### Debugging Commands
```bash
# Test connectivity
curl -v "https://your-endpoint.com/your-bucket"
# Check bucket contents
aws s3 ls s3://your-bucket --endpoint-url https://your-endpoint.com
# Test upload
aws s3 cp test.txt s3://your-bucket/ --endpoint-url https://your-endpoint.com
```
## 💡 Use Cases
### Self-Hosted Solutions
* **Data sovereignty** requirements
* **Air-gapped** environments
* **Custom compliance** needs
* **Cost optimization** for high usage
### Hybrid Cloud
* **Multi-cloud** strategies
* **Disaster recovery** setups
* **Geographic distribution**
* **Vendor diversification**
### Development & Testing
* **Local development** without cloud dependencies
* **CI/CD pipelines** with custom storage
* **Testing environments** with controlled data
***
**Next:** [Upload Your First Image](/docs/guides/uploads/images) or explore [Configuration Options](/docs/api/configuration/upload-config)
# CLI Reference
URL: /docs/api/cli/cli-setup
Complete reference for the pushduck CLI tool with all commands, options, and examples
***
title: CLI Reference
description: Complete reference for the pushduck CLI tool with all commands, options, and examples
--------------------------------------------------------------------------------------------------
import { Callout } from 'fumadocs-ui/components/callout'
import { Card, Cards } from 'fumadocs-ui/components/card'
import { Steps, Step } from 'fumadocs-ui/components/steps'
import { Tab, Tabs } from 'fumadocs-ui/components/tabs'
import { Files, Folder, File } from 'fumadocs-ui/components/files'
**🚀 Recommended**: Use our CLI for the fastest setup experience
## Quick Start
Get your file uploads working in under 2 minutes with our interactive CLI tool.
```bash
npx @pushduck/cli@latest init
```
```bash
pnpm dlx @pushduck/cli@latest init
```
```bash
yarn dlx @pushduck/cli@latest init
```
```bash
bunx @pushduck/cli@latest init
```
The CLI will automatically:
* 🔍 **Detect your package manager** (npm, pnpm, yarn, bun)
* 📦 **Install dependencies** using your preferred package manager
* ☁️ **Set up your storage provider** (Cloudflare R2, AWS S3, etc.)
* 🛠️ **Generate type-safe code** (API routes, client, components)
* ⚙️ **Configure environment** variables and bucket setup
## What the CLI Does
* Detects App Router vs Pages Router
* Finds existing TypeScript configuration
* Checks for existing upload implementations
* Validates project structure
* AWS S3, Cloudflare R2, DigitalOcean Spaces
* Google Cloud Storage, MinIO
* Automatic bucket creation
* CORS configuration
* Type-safe API routes
* Upload client configuration
* Example components
* Environment variable templates
The CLI walks you through each step, asking only what's necessary for your specific setup.
## CLI Commands
### `init` - Initialize Setup
```bash
npx @pushduck/cli@latest init [options]
```
**Options:**
* `--provider <provider>` - Skip provider selection (aws|cloudflare-r2|digitalocean|minio|gcs)
* `--skip-examples` - Don't generate example components
* `--skip-bucket` - Don't create S3 bucket automatically
* `--api-path <path>` - Custom API route path (default: `/api/upload`)
* `--dry-run` - Show what would be created without creating
* `--verbose` - Show detailed output
**Examples:**
```bash
# Interactive setup with all prompts
npx @pushduck/cli@latest init
```
```bash
# Skip provider selection, use AWS S3
npx @pushduck/cli@latest init --provider aws
```
```bash
# Use custom API route path
npx @pushduck/cli@latest init --api-path /api/files
```
```bash
# Generate only components, skip bucket creation
npx @pushduck/cli@latest init --skip-bucket --skip-examples
```
### `add` - Add Upload Route
```bash
npx @pushduck/cli@latest add
```
Add new upload routes to existing configuration:
```bash
# Interactive route builder
npx @pushduck/cli@latest add
# Example output:
# ✨ Added imageUpload route for profile pictures
# ✨ Added documentUpload route for file attachments
# ✨ Updated router types
```
### `test` - Test Configuration
```bash
npx @pushduck/cli@latest test [options]
```
**Options:**
* `--verbose` - Show detailed test output
Validates your current setup:
```bash
npx @pushduck/cli@latest test
# Example output:
# ✅ Environment variables configured
# ✅ S3 bucket accessible
# ✅ CORS configuration valid
# ✅ API routes responding
# ✅ Types generated correctly
```
## Interactive Setup Walkthrough
### Step 1: Project Detection
```
🔍 Detecting your project...
✓ Next.js App Router detected
✓ TypeScript configuration found
✓ Package manager: pnpm detected
✓ No existing upload configuration
✓ Project structure validated
```
### Step 2: Provider Selection
```
? Which cloud storage provider would you like to use?
❯ Cloudflare R2 (recommended)
AWS S3 (classic, widely supported)
DigitalOcean Spaces (simple, affordable)
Google Cloud Storage (enterprise-grade)
MinIO (self-hosted, open source)
Custom S3-compatible endpoint
```
### Step 3: Credential Setup
```
🔧 Setting up Cloudflare R2...
🔍 Checking for existing credentials...
✓ Found CLOUDFLARE_R2_ACCESS_KEY_ID
✓ Found CLOUDFLARE_R2_SECRET_ACCESS_KEY
✓ Found CLOUDFLARE_R2_ACCOUNT_ID
✗ CLOUDFLARE_R2_BUCKET_NAME not found
? Enter your R2 bucket name: my-app-uploads
? Create bucket automatically? Yes
```
### Step 4: API Configuration
```
? Where should we create the upload API?
❯ app/api/upload/route.ts (recommended)
app/api/s3-upload/route.ts (classic)
Custom path
? Generate example upload page?
❯ Yes, create app/upload/page.tsx with full example
Yes, just add components to components/ui/
No, I'll build my own
```
### Step 5: File Generation
```
🛠️ Generating files...
✨ Created files:
├── app/api/upload/route.ts
├── app/upload/page.tsx
├── components/ui/upload-button.tsx
├── components/ui/upload-dropzone.tsx
├── lib/upload-client.ts
└── .env.example
📦 Installing dependencies...
✓ pushduck
✓ @aws-sdk/client-s3
✓ react-dropzone
🎉 Setup complete! Your uploads are ready.
```
## Generated Project Structure
After running the CLI, your project will have:
### Generated API Route
```typescript title="app/api/upload/route.ts"
import { s3 } from '@/lib/upload'
import { getServerSession } from 'next-auth'
import { authOptions } from '@/lib/auth'
const s3Router = s3.createRouter({
// Image uploads for profile pictures
imageUpload: s3.image()
.max("5MB")
.count(1)
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req, metadata }) => {
const session = await getServerSession(authOptions)
if (!session?.user?.id) {
throw new Error("Authentication required")
}
return {
...metadata,
userId: session.user.id,
folder: `uploads/${session.user.id}`
}
}),
// Document uploads
documentUpload: s3.file()
.max("10MB")
.count(5)
.types(["application/pdf", "text/plain", "application/msword"])
.middleware(async ({ req, metadata }) => {
const session = await getServerSession(authOptions)
if (!session?.user?.id) {
throw new Error("Authentication required")
}
return {
...metadata,
userId: session.user.id,
folder: `documents/${session.user.id}`
}
})
})
export type AppRouter = typeof s3Router
export const { GET, POST } = s3Router.handlers
```
### Generated Upload Client
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from '@/app/api/upload/route'
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload'
})
```
### Generated Example Page
```typescript title="app/upload/page.tsx"
import { UploadButton } from '@/components/ui/upload-button'
import { UploadDropzone } from '@/components/ui/upload-dropzone'
export default function UploadPage() {
return (
    <main>
      <h1>File Upload Demo</h1>
      <h2>Profile Picture</h2>
      <UploadButton />
      <h2>Documents</h2>
      <UploadDropzone />
    </main>
  )
}
```
## Environment Variables
The CLI automatically creates `.env.example` and prompts for missing values:
```bash title=".env.example"
# Cloudflare R2 Configuration (Recommended)
CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key_here
CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key_here
CLOUDFLARE_R2_ACCOUNT_ID=your_account_id_here
CLOUDFLARE_R2_BUCKET_NAME=your-bucket-name
# Alternative: AWS S3 Configuration
# AWS_ACCESS_KEY_ID=your_access_key_here
# AWS_SECRET_ACCESS_KEY=your_secret_key_here
# AWS_REGION=us-east-1
# AWS_S3_BUCKET_NAME=your-bucket-name
# Next.js Configuration
NEXTAUTH_SECRET=your_nextauth_secret_here
NEXTAUTH_URL=http://localhost:3000
# Optional: Custom S3 endpoint (for MinIO, etc.)
# S3_ENDPOINT=https://your-custom-endpoint.com
```
## Provider-Specific Setup
```bash
npx @pushduck/cli@latest init --provider cloudflare-r2
```
**What gets configured:**
* Cloudflare R2 S3-compatible endpoints
* Global edge network optimization
* Zero egress fee configuration
* CORS settings for web uploads
```bash
npx @pushduck/cli@latest init --provider aws
```
**What gets configured:**
* AWS S3 regional endpoints
* IAM permissions and policies
* Bucket lifecycle management
* CloudFront CDN integration (optional)
```bash
npx @pushduck/cli@latest init --provider digitalocean
```
**Required Environment Variables:**
* `AWS_ACCESS_KEY_ID` (DO Spaces key)
* `AWS_SECRET_ACCESS_KEY` (DO Spaces secret)
* `AWS_REGION` (DO region)
* `AWS_S3_BUCKET_NAME`
* `S3_ENDPOINT` (DO Spaces endpoint)
**What the CLI does:**
* Configures DigitalOcean Spaces endpoints
* Sets up CDN configuration
* Validates access permissions
* Configures CORS policies
```bash
npx @pushduck/cli@latest init --provider minio
```
**Required Environment Variables:**
* `AWS_ACCESS_KEY_ID` (MinIO access key)
* `AWS_SECRET_ACCESS_KEY` (MinIO secret key)
* `AWS_REGION=us-east-1`
* `AWS_S3_BUCKET_NAME`
* `S3_ENDPOINT` (MinIO server URL)
**What the CLI does:**
* Configures self-hosted MinIO endpoints
* Sets up bucket policies
* Validates server connectivity
* Configures development-friendly settings
## Troubleshooting
### CLI Not Found
```bash
# If you get "command not found"
npm install -g @pushduck/cli
# Or use npx for one-time usage
npx @pushduck/cli@latest init
```
### Permission Errors
```bash
# If you get permission errors during setup
sudo npx @pushduck/cli@latest init
# Or fix npm permissions
npm config set prefix ~/.npm-global
export PATH=~/.npm-global/bin:$PATH
```
### Existing Configuration
```bash
# Force overwrite existing configuration
npx @pushduck/cli@latest init --force
# Or backup and regenerate
cp app/api/upload/route.ts app/api/upload/route.ts.backup
npx @pushduck/cli@latest init
```
### Bucket Creation Failed
```bash
# Test your credentials first
npx @pushduck/cli@latest test
# Skip automatic bucket creation
npx @pushduck/cli@latest init --skip-bucket
# Create bucket manually, then run:
npx @pushduck/cli@latest test
```
## Advanced Usage
### Custom Templates
```bash
# Use custom file templates
npx @pushduck/cli@latest init --template enterprise
# Available templates:
# - default: Basic setup with examples
# - minimal: Just API routes, no examples
# - enterprise: Full security and monitoring
# - ecommerce: Product images and documents
```
### Monorepo Support
```bash
# For monorepos, specify the Next.js app directory
cd apps/web
npx @pushduck/cli@latest init
# Or use the --cwd flag
npx @pushduck/cli@latest init --cwd apps/web
```
### CI/CD Integration
```bash
# Non-interactive mode for CI/CD
npx @pushduck/cli@latest init \
--provider aws \
--skip-examples \
--api-path /api/upload \
--no-interactive
```
***
**Complete CLI Reference**: This guide covers all CLI commands, options, and use cases. For a quick start, see our [Getting Started guide](/docs/getting-started/cli-setup).
# Client Configuration
URL: /docs/api/configuration/client-options
Configure your upload client for the best developer experience with enhanced type inference
***
title: Client Configuration
description: Configure your upload client for the best developer experience with enhanced type inference
--------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
import { TypeTable } from "fumadocs-ui/components/type-table";
# Client Configuration
The upload client provides multiple APIs to suit different needs: **property-based access** for enhanced type safety, and **hook-based access** for familiar React patterns.
This guide focuses on the **enhanced client API** with property-based access.
This provides the best developer experience with full TypeScript inference and
eliminates string literals.
## Client Setup Structure
Organize your client configuration for maximum reusability. For example, keep the client in `lib/upload-client.ts` next to your router definition in `lib/upload.ts`.
## Basic Client Configuration
**Import your router types**
Start by importing the router type from your server configuration:
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload'
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload'
})
```
**Use property-based access**
Access your upload endpoints as properties with full type safety:
```typescript title="components/upload-form.tsx"
import { upload } from '@/lib/upload-client'
export function ImageUploadForm() {
const { uploadFiles, files, isUploading, error } = upload.imageUpload
// ^ Full TypeScript inference from your server router
const handleUpload = async (selectedFiles: File[]) => {
await uploadFiles(selectedFiles)
}
return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => handleUpload(Array.from(e.target.files || []))}
      />
      {files.map((file, index) => (
        <div key={index}>
          <span>{file.name}</span>
          <span>{file.status}</span> {/* 'pending' | 'uploading' | 'success' | 'error' */}
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
  )
}
```
**Handle upload results**
Process upload results with full type safety:
```typescript
const { uploadFiles, files, reset } = upload.documentUpload
const handleDocumentUpload = async (files: File[]) => {
try {
const results = await uploadFiles(files)
// results is fully typed based on your router configuration
console.log('Upload successful:', results.map(r => r.url))
// Reset the upload state
reset()
} catch (error) {
console.error('Upload failed:', error)
}
}
```
## Client Configuration Options
| Option       | Type                                  | Default | Description                                  |
| ------------ | ------------------------------------- | ------- | -------------------------------------------- |
| `endpoint`   | `string`                              | -       | Path to your upload API route                |
| `headers`    | `Record<string, string>`              | -       | Custom headers sent with upload requests     |
| `timeout`    | `number`                              | `30000` | Request timeout in milliseconds              |
| `retries`    | `number`                              | `3`     | Number of retry attempts for failed uploads  |
| `onProgress` | `(progress: UploadProgress) => void`  | -       | Global progress callback for all uploads     |
| `onError`    | `(error: UploadError) => void`        | -       | Global error handler for upload failures     |
### Advanced Client Configuration
```typescript title="lib/upload-client.ts"
import { createUploadClient } from "pushduck/client";
import type { AppRouter } from "./upload";
export const upload = createUploadClient<AppRouter>({
endpoint: "/api/upload",
// Custom headers (e.g., authentication)
headers: {
Authorization: `Bearer ${getAuthToken()}`,
"X-Client-Version": "1.0.0",
},
// Upload timeout (30 seconds)
timeout: 30000,
// Retry failed uploads
retries: 3,
// Global progress tracking
onProgress: (progress) => {
console.log(`Upload progress: ${progress.percentage}%`);
},
// Global error handling
onError: (error) => {
console.error("Upload error:", error);
// Send to error tracking service
trackError(error);
},
});
```
## Upload Method Options
Each upload method accepts configuration options:
| Option        | Type                                  | Description                              |
| ------------- | ------------------------------------- | ---------------------------------------- |
| `onProgress`  | `(progress: UploadProgress) => void`  | Progress callback for this upload        |
| `onSuccess`   | `(results: UploadResult[]) => void`   | Success callback when upload completes   |
| `onError`     | `(error: UploadError) => void`        | Error callback for this upload           |
| `metadata`    | `Record<string, any>`                 | Custom metadata to include with upload   |
| `abortSignal` | `AbortSignal`                         | AbortSignal to cancel the upload         |
### Upload Method Examples
```typescript
const { uploadFiles } = upload.imageUpload
// Simple upload
const results = await uploadFiles(selectedFiles)
console.log('Uploaded files:', results)
```
```typescript
const { uploadFiles } = upload.imageUpload
await uploadFiles(selectedFiles, {
onProgress: (progress) => {
console.log(`Upload ${progress.percentage}% complete`)
updateProgressBar(progress.percentage)
},
onSuccess: (results) => {
console.log('Upload successful!', results)
showSuccessNotification()
},
onError: (error) => {
console.error('Upload failed:', error)
showErrorNotification(error.message)
}
})
```
```typescript
const { uploadFiles } = upload.documentUpload
await uploadFiles(selectedFiles, {
metadata: {
category: 'contracts',
department: 'legal',
priority: 'high',
tags: ['confidential', 'urgent']
}
})
```
```typescript
const { uploadFiles } = upload.videoUpload
const abortController = new AbortController()
// Start upload
const uploadPromise = uploadFiles(selectedFiles, {
abortSignal: abortController.signal,
onProgress: (progress) => {
if (progress.percentage > 50 && shouldCancel) {
abortController.abort()
}
}
})
// Cancel upload after 10 seconds
setTimeout(() => abortController.abort(), 10000)
try {
await uploadPromise
} catch (error) {
if (error.name === 'AbortError') {
console.log('Upload was cancelled')
}
}
```
## Hook-Based API (Alternative)
For teams that prefer React hooks, the hook-based API provides a familiar pattern:
| Option      | Type                                 | Default | Description                             |
| ----------- | ------------------------------------ | ------- | --------------------------------------- |
| `onSuccess` | `(results: UploadResult[]) => void`  | -       | Success callback when uploads complete  |
| `onError`   | `(error: UploadError) => void`       | -       | Error callback for upload failures      |
| `disabled`  | `boolean`                            | `false` | Disable the upload functionality        |
### Hook Usage Examples
```typescript
import { useUpload } from 'pushduck/client'
import type { AppRouter } from '@/lib/upload'
export function ImageUploadComponent() {
const { uploadFiles, files, isUploading, error, reset } = useUpload<AppRouter>('imageUpload', {
onSuccess: (results) => {
console.log('Upload completed:', results)
},
onError: (error) => {
console.error('Upload failed:', error)
}
})
return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {files.map((file, index) => (
        <div key={index}>
          <span>{file.name}</span>
          {file.status === 'error' && <span>Failed: {file.error}</span>}
          {file.status === 'success' && <a href={file.url}>View</a>}
        </div>
      ))}
      <button onClick={reset}>Reset</button>
    </div>
  )
}
```
```typescript
export function MultiUploadComponent() {
const images = useUpload('imageUpload')
const documents = useUpload('documentUpload')
return (
    <div>
      {/* Render one upload section per hook, e.g. images and documents */}
    </div>
  )
}
```
```typescript
import { useUpload } from 'pushduck/client'
import { useCallback } from 'react'
import type { AppRouter } from '@/lib/upload'
export function useImageUpload() {
const upload = useUpload<AppRouter>('imageUpload', {
onSuccess: (results) => {
// Show success toast
toast.success(`Uploaded ${results.length} images`)
},
onError: (error) => {
// Show error toast
toast.error(`Upload failed: ${error.message}`)
}
})
const uploadImages = useCallback(async (files: File[]) => {
// Validate files before upload
const validFiles = files.filter(file => {
if (file.size > 5 * 1024 * 1024) { // 5MB
toast.error(`${file.name} is too large`)
return false
}
return true
})
if (validFiles.length > 0) {
await upload.uploadFiles(validFiles)
}
}, [upload.uploadFiles])
return {
...upload,
uploadImages
}
}
```
## Property-Based Client Access
The property-based client provides enhanced type inference and eliminates string literals:
### Type Safety Benefits
```typescript
const { uploadFiles } = upload.imageUpload
// ^ TypeScript knows this exists
const { uploadFiles: docUpload } = upload.nonExistentEndpoint
// ^ TypeScript error!
```
```typescript
upload. // IntelliSense shows: imageUpload, documentUpload, videoUpload
// ^ All your router endpoints are available with autocomplete
```
```typescript
// If you rename 'imageUpload' to 'images' in your router,
// TypeScript will show errors everywhere it's used,
// making refactoring safe and easy
```
### Enhanced Type Inference
The property-based client provides complete type inference from your server router:
```typescript
// Server router definition
export const router = createUploadRouter({
profilePictures: uploadSchema({
image: { maxSize: "2MB", maxCount: 1 },
}).middleware(async ({ req }) => {
const userId = await getUserId(req);
return { userId, category: "profile" };
}),
// ... other endpoints
});
// Client usage with full type inference
const { uploadFiles, files, isUploading } = upload.profilePictures;
// ^ uploadFiles knows it accepts File[]
// ^ files has type UploadFile[]
// ^ isUploading is boolean
// Upload files with inferred return type
const results = await uploadFiles(selectedFiles);
// ^ results is UploadResult[] with your specific metadata shape
```
## Framework-Specific Configuration
```typescript
// app/lib/upload-client.ts
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload'
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload',
headers: {
// Next.js specific headers
'x-requested-with': 'pushduck'
}
})
// app/components/upload-form.tsx
'use client'
import { upload } from '@/lib/upload-client'
export function UploadForm() {
const { uploadFiles, files, isUploading } = upload.imageUpload
// Component implementation...
}
```
```typescript
// src/lib/upload-client.ts
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload'
export const upload = createUploadClient<AppRouter>({
endpoint: process.env.REACT_APP_UPLOAD_ENDPOINT || '/api/upload'
})
// src/components/UploadForm.tsx
import React from 'react'
import { upload } from '../lib/upload-client'
export function UploadForm() {
const { uploadFiles, files, isUploading } = upload.imageUpload
// Component implementation...
}
```
```typescript
// lib/upload-client.ts
import { createUploadClient } from '@pushduck/vue'
import type { AppRouter } from './upload'
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload'
})
// components/UploadForm.vue
```
```typescript
// lib/upload-client.ts
import { uploadStore } from '@pushduck/svelte'
import type { AppRouter } from './upload'
export const upload = uploadStore('/api/upload')
// components/UploadForm.svelte
```
## Error Handling Configuration
Configure comprehensive error handling for robust applications:
| Option                 | Type                                                | Default   | Description                                            |
| ---------------------- | --------------------------------------------------- | --------- | ------------------------------------------------------ |
| `retryCondition`       | `(error: UploadError, attempt: number) => boolean`  | -         | Custom logic deciding whether a failed upload retries  |
| `maxConcurrentUploads` | `number`                                            | `3`       | Maximum number of concurrent file uploads              |
| `chunkSize`            | `number`                                            | `5242880` | Size of upload chunks for large files (bytes)          |
### Advanced Error Handling
```typescript
export const upload = createUploadClient({
endpoint: "/api/upload",
// Custom retry configuration
retries: 3,
retryDelays: [1000, 2000, 4000], // 1s, 2s, 4s
// Custom retry logic
retryCondition: (error, attemptNumber) => {
// Don't retry client errors (4xx)
if (error.status >= 400 && error.status < 500) {
return false;
}
// Retry server errors up to 3 times
return attemptNumber < 3;
},
// Concurrent upload limits
maxConcurrentUploads: 2,
// Large file chunking
chunkSize: 10 * 1024 * 1024, // 10MB chunks
// Global error handler
onError: (error) => {
// Log to error tracking service
if (error.status >= 500) {
logError("Server error during upload", error);
}
// Show user-friendly message
if (error.code === "FILE_TOO_LARGE") {
showToast("File is too large. Please choose a smaller file.");
} else if (error.code === "NETWORK_ERROR") {
showToast("Network error. Please check your connection.");
} else {
showToast("Upload failed. Please try again.");
}
},
});
```
## Performance Optimization
Configure the client for optimal performance:
### Upload Performance
```typescript
export const upload = createUploadClient({
endpoint: "/api/upload",
// Optimize for performance
maxConcurrentUploads: 3, // Balance between speed and resource usage
chunkSize: 5 * 1024 * 1024, // 5MB chunks for large files
timeout: 60000, // 60 second timeout for large files
// Compression for images
compressImages: {
enabled: true,
quality: 0.8, // 80% quality
maxWidth: 1920, // Resize large images
maxHeight: 1080,
},
// Connection pooling
keepAlive: true,
maxSockets: 5,
// Progress throttling to avoid UI updates spam
progressThrottle: 100, // Update progress every 100ms
});
```
## Real-World Configuration Examples
### E-commerce Application
```typescript
export const ecommerceUpload = createUploadClient({
endpoint: "/api/upload",
headers: {
Authorization: `Bearer ${getAuthToken()}`,
"X-Store-ID": getStoreId(),
},
onProgress: (progress) => {
// Update global upload progress indicator
updateGlobalProgress(progress);
},
onError: (error) => {
// Track upload failures for analytics
analytics.track("upload_failed", {
error_code: error.code,
file_type: error.metadata?.fileType,
store_id: getStoreId(),
});
},
// E-commerce specific settings
retries: 2, // Quick retries for better UX
maxConcurrentUploads: 5, // Allow multiple product images
compressImages: {
enabled: true,
quality: 0.9, // High quality for product images
},
});
// Usage in product form
export function ProductImageUpload() {
const { uploadFiles, files, isUploading } = ecommerceUpload.productImages;
const handleImageUpload = async (files: File[]) => {
await uploadFiles(files, {
metadata: {
productId: getCurrentProductId(),
category: "product-images",
},
onSuccess: (results) => {
updateProductImages(results.map((r) => r.url));
},
});
};
  return (
    <div>{/* Upload UI */}</div>
  );
}
```
### Content Management System
```typescript
export const cmsUpload = createUploadClient({
endpoint: "/api/upload",
headers: {
Authorization: `Bearer ${getAuthToken()}`,
"X-Workspace": getCurrentWorkspace(),
},
// CMS-specific configuration
timeout: 120000, // 2 minutes for large documents
retries: 3,
maxConcurrentUploads: 2, // Conservative for large files
onError: (error) => {
// Show contextual error messages
if (error.code === "QUOTA_EXCEEDED") {
showUpgradeModal();
} else if (error.code === "UNAUTHORIZED") {
redirectToLogin();
}
},
});
// Usage in content editor
export function MediaLibrary() {
const images = cmsUpload.images;
const documents = cmsUpload.documents;
const videos = cmsUpload.videos;
  return (
    <div>{/* Media library UI for images, documents, and videos */}</div>
  );
}
```
***
**Ready to upload?** Check out our [complete examples](/docs/examples) to see
these configurations in action, or explore our [provider setup
guides](/docs/guides/setup/aws-s3) to configure your storage backend.
# Path Configuration
URL: /docs/api/configuration/path-configuration
Complete guide to organizing your uploads with hierarchical path structures and custom file organization
***
title: Path Configuration
description: Complete guide to organizing your uploads with hierarchical path structures and custom file organization
---------------------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
import { TypeTable } from "fumadocs-ui/components/type-table";
# Path Configuration
Organize your uploads with powerful hierarchical path structures that provide clean file organization, prevent conflicts, and enable scalable storage patterns.
**New in v2.0:** The hierarchical path system allows global configuration to provide the foundation while route-level paths extend and nest within it - **no more overrides that lose configuration!**
## How Hierarchical Paths Work
The path system works in **layers** that build upon each other:
**Path Structure:** `{globalPrefix}/{routePrefix}/{globalGenerated}`
* **Global prefix:** `uploads` (foundation for all files)
* **Route prefix:** `images`, `documents` (category organization)
* **Global generated:** `{userId}/{timestamp}/{randomId}/{filename}` (file structure)
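For example, an `imageUpload` route with an `images` route prefix produces keys like the following (the user ID, timestamp, and random ID are illustrative):
```
uploads/images/user-123/1718031600000/a1b2c3/photo.jpg
```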
## Basic Path Configuration
### Global Foundation
Configure the base structure that all uploads will use:
```typescript title="lib/upload.ts"
import { createUploadConfig } from "pushduck/server";
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
// ... provider config
})
.paths({
// Global prefix - foundation for ALL uploads
prefix: "uploads",
// Global structure - used when routes don't override
generateKey: (file, metadata) => {
const userId = metadata.userId || "anonymous";
const timestamp = Date.now();
const randomId = Math.random().toString(36).substring(2, 8);
const sanitizedName = file.name.replace(/[^a-zA-Z0-9.-]/g, "_");
// Return ONLY the file path part (no prefix)
return `${userId}/${timestamp}/${randomId}/${sanitizedName}`;
},
})
.build();
```
### Route-Level Extensions
Extend the global foundation with route-specific organization:
```typescript title="app/api/upload/route.ts"
import { s3 } from "@/lib/upload";
const s3Router = s3.createRouter({
// Images: uploads/images/{userId}/{timestamp}/{randomId}/photo.jpg
imageUpload: s3
.image()
.max("5MB")
.paths({
prefix: "images", // Nests under global prefix
}),
// Documents: uploads/documents/{userId}/{timestamp}/{randomId}/report.pdf
documentUpload: s3
.file()
.max("10MB")
.paths({
prefix: "documents", // Nests under global prefix
}),
// General: uploads/{userId}/{timestamp}/{randomId}/file.ext
generalUpload: s3
.file()
.max("20MB")
// No .paths() - uses pure global configuration
});
```
**โจ Result:** Clean, predictable paths that scale with your application. Global config provides consistency while routes add organization.
## Advanced Path Patterns
### Custom Route Generation
Override the default structure for specific use cases:
```typescript
galleryUpload: s3
.image()
.max("5MB")
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const globalPrefix = globalConfig.prefix || "uploads";
const date = new Date();
const year = date.getFullYear();
const month = String(date.getMonth() + 1).padStart(2, "0");
const userId = metadata.userId || "anonymous";
// Custom path: uploads/gallery/2024/06/demo-user/photo.jpg
return `${globalPrefix}/gallery/${year}/${month}/${userId}/${file.name}`;
},
})
```
**Result:** `uploads/gallery/2024/06/demo-user/photo.jpg`
```typescript
productUpload: s3
.image()
.max("8MB")
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const globalPrefix = globalConfig.prefix || "uploads";
const category = metadata.category || "general";
const productId = metadata.productId || "unknown";
const timestamp = Date.now();
// Custom path: uploads/products/electronics/prod-123/1234567890/image.jpg
return `${globalPrefix}/products/${category}/${productId}/${timestamp}/${file.name}`;
},
})
```
**Result:** `uploads/products/electronics/prod-123/1234567890/image.jpg`
```typescript
profileUpload: s3
.image()
.max("2MB")
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const globalPrefix = globalConfig.prefix || "uploads";
const userId = metadata.userId || "anonymous";
const fileType = file.type.split('/')[0]; // image, video, etc.
// Custom path: uploads/users/user-123/profile/image/avatar.jpg
return `${globalPrefix}/users/${userId}/profile/${fileType}/${file.name}`;
},
})
```
**Result:** `uploads/users/user-123/profile/image/avatar.jpg`
### Environment-Based Paths
Organize files by environment for clean separation:
```typescript title="lib/upload.ts"
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{
// ... provider config
})
.paths({
// Environment-aware global prefix
prefix: process.env.NODE_ENV === "production" ? "prod" : "dev",
generateKey: (file, metadata) => {
const userId = metadata.userId || "anonymous";
const timestamp = Date.now();
const randomId = Math.random().toString(36).substring(2, 8);
return `${userId}/${timestamp}/${randomId}/${file.name}`;
},
})
.build();
```
**Result Paths:**
* **Development:** `dev/images/user123/1234567890/abc123/photo.jpg`
* **Production:** `prod/images/user123/1234567890/abc123/photo.jpg`
## Path Configuration API
### Global Configuration
The global `.paths()` call accepts a `prefix` string and a `generateKey` function:
```typescript
.paths({
prefix: "uploads",
generateKey: (file, metadata) => {
// Return the file path structure (without prefix)
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
```
### Route Configuration
Route-level `.paths()` accepts its own `prefix` and an optional `generateKey` that receives the full `PathContext`:
```typescript
.paths({
prefix: "images", // Nested under global prefix
generateKey: (ctx) => {
const { file, metadata, globalConfig, routeName } = ctx;
// Full control with access to global configuration
return `${globalConfig.prefix}/custom/${file.name}`;
}
})
```
### PathContext Reference
When using custom `generateKey` functions at the route level, you receive a context object:
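A minimal sketch of that context's shape, inferred from the fields destructured in the examples on this page (the actual type may carry more fields):
```typescript
interface PathContext {
  file: { name: string; type: string };       // the file being uploaded
  metadata: Record<string, any>;              // whatever your middleware returned
  globalConfig: { prefix?: string };          // your global .paths() settings
  routeName: string;                          // e.g. "imageUpload"
}
```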
## Real-World Examples
### E-commerce Platform
```typescript title="app/api/upload/route.ts"
const s3Router = s3.createRouter({
// Product images: uploads/products/{category}/{productId}/images/photo.jpg
productImages: s3
.image()
.max("8MB")
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const { productId, category } = await getProductContext(req);
return { productId, category, userId: await getUserId(req) };
})
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const { productId, category } = metadata;
return `${globalConfig.prefix}/products/${category}/${productId}/images/${file.name}`;
},
}),
// User avatars: uploads/users/{userId}/avatar/profile.jpg
userAvatar: s3
.image()
.max("2MB")
.formats(["jpeg", "png", "webp"])
.middleware(async ({ req }) => {
const userId = await getUserId(req);
return { userId, type: "avatar" };
})
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
return `${globalConfig.prefix}/users/${metadata.userId}/avatar/${file.name}`;
},
}),
// Order documents: uploads/orders/{orderId}/documents/{timestamp}/receipt.pdf
orderDocuments: s3
.file()
.max("10MB")
.types(["application/pdf", "image/*"])
.middleware(async ({ req }) => {
const { orderId } = await getOrderContext(req);
return { orderId, uploadedAt: new Date().toISOString() };
})
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const timestamp = Date.now();
return `${globalConfig.prefix}/orders/${metadata.orderId}/documents/${timestamp}/${file.name}`;
},
}),
});
```
### Content Management System
```typescript title="app/api/upload/route.ts"
const s3Router = s3.createRouter({
// Media library: uploads/media/{year}/{month}/{type}/filename.ext
mediaLibrary: s3
.file()
.max("50MB")
.middleware(async ({ req, file }) => {
const userId = await getUserId(req);
const mediaType = file.type.split('/')[0]; // image, video, audio
return { userId, mediaType };
})
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const date = new Date();
const year = date.getFullYear();
const month = String(date.getMonth() + 1).padStart(2, "0");
const randomId = Math.random().toString(36).substring(2, 8);
return `${globalConfig.prefix}/media/${year}/${month}/${metadata.mediaType}/${randomId}-${file.name}`;
},
}),
// Page assets: uploads/pages/{pageSlug}/assets/image.jpg
pageAssets: s3
.image()
.max("10MB")
.middleware(async ({ req }) => {
const { pageSlug } = await getPageContext(req);
return { pageSlug, userId: await getUserId(req) };
})
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
return `${globalConfig.prefix}/pages/${metadata.pageSlug}/assets/${file.name}`;
},
}),
// Temp uploads: uploads/temp/{userId}/{sessionId}/file.ext
tempUploads: s3
.file()
.max("20MB")
.middleware(async ({ req }) => {
const userId = await getUserId(req);
const sessionId = req.headers.get('x-session-id') || 'unknown';
return { userId, sessionId, temporary: true };
})
.paths({
prefix: "temp", // Simple prefix approach
}),
});
```
## Best Practices
### Path Organization
```typescript
// ✅ Good
.paths({ prefix: "user-avatars" })
.paths({ prefix: "product-images" })
// ❌ Avoid
.paths({ prefix: "imgs" })
.paths({ prefix: "files" })
```
```typescript
// ✅ Good - includes user and timestamp
return `${prefix}/users/${userId}/${timestamp}/${file.name}`;
// ❌ Avoid - no traceability
return `${prefix}/${file.name}`;
```
```typescript
// ✅ Good - timestamp + random ID
const timestamp = Date.now();
const randomId = Math.random().toString(36).substring(2, 8);
return `${prefix}/${userId}/${timestamp}/${randomId}/${file.name}`;
```
### Performance Tips
**Path Length Limits:** Most S3-compatible services have a 1024-character limit for object keys. Keep your paths reasonable!
* **Use short, meaningful prefixes** instead of long descriptive names
* **Avoid deep nesting** beyond 5-6 levels for better performance
* **Include timestamps** for natural chronological ordering
* **Sanitize filenames** to prevent special character issues
### Security Considerations
```typescript
// ✅ Sanitize user input in paths
const sanitizePathSegment = (input: string) => {
return input.replace(/[^a-zA-Z0-9.-]/g, "_").substring(0, 50);
};
.paths({
generateKey: (ctx) => {
const { file, metadata, globalConfig } = ctx;
const safeUserId = sanitizePathSegment(metadata.userId);
const safeFilename = sanitizePathSegment(file.name);
return `${globalConfig.prefix}/users/${safeUserId}/${Date.now()}/${safeFilename}`;
}
})
```
## Migration from Legacy Paths
**Upgrading from v1.x?** The old `pathPrefix` option still works but is deprecated. Use the new hierarchical system for better organization.
### Before (Legacy)
```typescript
// Old way - a single pathPrefix option applied to every upload
export const { POST, GET } = s3Router.handlers;
```
### After (Hierarchical)
```typescript
// New way - hierarchical configuration
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{ /* config */ })
.paths({
prefix: "uploads",
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
const s3Router = s3.createRouter({
images: s3.image().paths({ prefix: "images" }),
documents: s3.file().paths({ prefix: "documents" })
});
```
**Benefits of upgrading:**
* ✅ **Better organization** with route-specific prefixes
* ✅ **No configuration loss** - global settings are preserved
* ✅ **More flexibility** with custom generation functions
* ✅ **Type safety** with PathContext
* ✅ **Scalable patterns** for growing applications
**You're ready!** Your uploads now have clean, organized, and scalable path structures that will grow with your application.
# Server Router Configuration
URL: /docs/api/configuration/server-router
Complete guide to configuring your upload router with type safety and advanced features
***
title: Server Router Configuration
description: Complete guide to configuring your upload router with type safety and advanced features
----------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { File, Folder, Files } from "fumadocs-ui/components/files";
import { TypeTable } from "fumadocs-ui/components/type-table";
# Server Router Configuration
The server router is the heart of pushduck. It defines your upload endpoints with complete type safety and validates all uploads against your schema.
This guide covers the **enhanced router API** with property-based client
access. Looking for the legacy API? Check our [migration
guide](/docs/guides/migration/enhanced-client).
## Project Structure
A typical Next.js project with pushduck follows this structure:
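A sketch of that layout, using the file names referenced throughout this guide:
```
app/
├── api/
│   └── upload/
│       └── route.ts      # upload route handlers
lib/
├── upload.ts             # config, router, and exported types
└── upload-client.ts      # typed client for your components
```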
## Basic Router Setup
**Create your upload route**
Start by creating the API route that will handle your uploads:
```typescript title="app/api/upload/route.ts"
import { s3 } from '@/lib/upload'
const s3Router = s3.createRouter({
imageUpload: s3
.image()
.max('4MB')
.count(10)
.middleware(async ({ req }) => {
// Add your authentication logic here
return { userId: "user_123" }
}),
documentUpload: s3
.file()
.max('10MB')
.types(['application/pdf'])
.count(1)
})
export const { POST, GET } = s3Router.handlers
```
**Export router types**
Create a separate file to export your router types for client-side usage:
```typescript title="lib/upload.ts"
import { createUploadConfig } from 'pushduck/server'
// Configure upload with provider and settings
const { s3, storage } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL!,
bucket: process.env.S3_BUCKET_NAME!,
accountId: process.env.R2_ACCOUNT_ID!,
})
.build()
// Define your router
const s3Router = s3.createRouter({
imageUpload: s3.image().max('4MB').count(10),
documentUpload: s3.file().max('10MB').types(['application/pdf']).count(1)
})
// Export for use in API routes and client
export { s3, storage }
export type AppS3Router = typeof s3Router
```
**Create typed client**
Set up your client-side upload client with full type safety:
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppS3Router } from './upload'
export const upload = createUploadClient<AppS3Router>({
endpoint: '/api/upload'
})
```
## Schema Builder Reference
The `s3` schema builders provide a fluent API for defining your upload requirements:
### FileConfig Options
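As a quick sketch, the chainable options used throughout this guide look like this (values are illustrative):
```typescript
s3.image()                          // images only
  .max('4MB')                       // per-file size limit
  .count(10)                        // maximum number of files
  .formats(['jpeg', 'png', 'webp']) // allowed image formats

s3.file()                           // any file type
  .max('10MB')
  .types(['application/pdf'])       // MIME type whitelist
  .middleware(async ({ req }) => ({ userId: 'user_123' }))
```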
## Advanced Configuration
### Multiple File Types
You can define schemas that accept multiple file types:
```typescript
const s3Router = s3.createRouter({
mixedUpload: s3.object({
images: s3.image().max('4MB').count(5),
pdfs: s3.file().max('10MB').types(['application/pdf']).count(2),
documents: s3.file().max('5MB').types(['application/vnd.openxmlformats-officedocument.wordprocessingml.document']).count(3)
})
})
```
```typescript
const s3Router = s3.createRouter({
mediaUpload: s3.object({
images: s3.image()
.max('4MB')
.count(10)
.formats(['jpeg', 'jpg', 'png', 'webp']),
videos: s3.file()
.max('100MB')
.count(2)
.types(['video/mp4', 'video/quicktime', 'video/avi'])
})
})
```
```typescript
const s3Router = s3.createRouter({
genericUpload: s3.file()
.max('50MB')
.count(20)
.types([
'image/*', 'video/*', 'application/pdf',
'application/msword', 'text/plain'
])
})
```
### Global Configuration
Configure settings that apply to all upload endpoints:
**Deprecated:** The `pathPrefix` option is deprecated. Use the new [**hierarchical path configuration**](/docs/api/configuration/path-configuration) for better organization and flexibility.
```typescript
// ❌ Deprecated approach - use modern createUploadConfig() instead
// This section is kept for reference only
```
**For new projects, use the [hierarchical path system](/docs/api/configuration/path-configuration) instead:**
```typescript
// ✅ Modern approach with hierarchical paths
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{ /* config */ })
.paths({
prefix: "uploads",
generateKey: (file, metadata) => {
return `${metadata.userId}/${Date.now()}/${file.name}`;
}
})
.build();
const s3Router = s3.createRouter({
images: s3.image().paths({ prefix: "images" }),
documents: s3.file().paths({ prefix: "documents" })
});
```
### Multiple Providers
Support different storage providers for different upload types:
```typescript
// ✅ Modern approach with multiple provider configurations
import { createUploadConfig } from "pushduck/server";
const primaryConfig = createUploadConfig()
.provider("cloudflareR2",{
// Primary R2 config for production files
accessKeyId: process.env.R2_ACCESS_KEY_ID!,
secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
accountId: process.env.R2_ACCOUNT_ID!,
bucket: process.env.R2_BUCKET!,
})
.build();
const backupConfig = createUploadConfig()
.provider("aws",{
// AWS S3 config for backups
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
region: process.env.AWS_REGION!,
bucket: process.env.AWS_BACKUP_BUCKET!,
})
.build();
// Use primary config for main uploads
const s3Router = primaryConfig.s3.createRouter({
productImages: primaryConfig.s3.image().max("4MB").count(10),
// Backup handling would be done in lifecycle hooks
});
```
## Middleware Integration
Add authentication, logging, and custom validation:
### Authentication Middleware
```typescript
const s3Router = s3.createRouter({
privateUploads: s3.image()
.max("4MB")
.count(5)
.middleware(async ({ req, metadata }) => {
const session = await getServerSession(req);
if (!session?.user?.id) {
throw new Error("Unauthorized");
}
return {
...metadata,
userId: session.user.id,
userRole: session.user.role,
};
}),
publicUploads: s3.image()
.max("1MB")
.count(1), // No middleware = publicly accessible
});
```
### File Validation Middleware
```typescript
import { z } from "zod";
const s3Router = s3.createRouter({
profilePicture: s3.image()
.max("2MB")
.count(1)
.middleware(async ({ req, file, metadata }) => {
// Custom file validation
if (file.name.includes("temp") || file.name.includes("test")) {
throw new Error("Temporary files not allowed");
}
const userId = await getUserId(req);
return { ...metadata, userId };
}),
});
```
### Metadata Enhancement
```typescript
const s3Router = s3.createRouter({
documentUpload: s3.file()
.max("10MB")
.count(1)
.types(["application/pdf"])
.middleware(async ({ req, metadata }) => {
const enrichedMetadata = {
...metadata,
uploadedAt: new Date().toISOString(),
userAgent: req.headers.get("user-agent"),
ip: req.headers.get("x-forwarded-for") || "unknown",
};
return enrichedMetadata;
}),
});
```
## Lifecycle Hooks
Handle upload events for processing, notifications, and cleanup:
<TypeTable
  type={{
    onUploadStart: {
      description: "Called when upload starts",
      type: "(context) => void | Promise<void>",
    },
    onUploadProgress: {
      description: "Called during upload progress",
      type: "(context, progress) => void | Promise<void>",
    },
    onUploadComplete: {
      description: "Called when upload succeeds",
      type: "(context, result) => void | Promise<void>",
    },
    onUploadError: {
      description: "Called when upload fails",
      type: "(context, error) => void | Promise<void>",
    },
  }}
/>
```typescript
const s3Router = s3.createRouter({
imageUpload: s3.image()
.max("4MB")
.count(10)
.onUploadComplete(async ({ file, url, metadata }) => {
// Process uploaded image
await generateThumbnail(url);
await updateDatabase(file.key, metadata.userId);
})
.onUploadError(async ({ error, metadata }) => {
// Log errors and notify admins
console.error("Upload failed:", error);
await notifyAdmins(`Upload failed for user ${metadata.userId}`);
}),
});
```
## Type Safety Features
### Router Type Export
Export your router type for end-to-end type safety:
```typescript title="lib/upload.ts"
import { createUploadConfig } from "pushduck/server";
const { s3 } = createUploadConfig()
.provider("cloudflareR2",{ /* your config */ })
.build();
const s3Router = s3.createRouter({
// ... your configuration
});
export { s3 };
export type AppS3Router = typeof s3Router;
// Extract individual endpoint types
export type ImageUploadType = AppS3Router["imageUpload"];
export type DocumentUploadType = AppS3Router["documentUpload"];
```
### Custom Context Types
Define custom context types for your middleware:
```typescript
interface CustomContext {
userId: string;
userRole: "admin" | "user" | "guest";
organizationId?: string;
}
const s3Router = s3.createRouter({
upload: s3.image()
.max('4MB')
.count(5)
.middleware(async ({ req }): Promise<CustomContext> => {
// Your auth logic here
return {
userId: "user_123",
userRole: "user",
};
}),
});
```
## Real-World Examples
### E-commerce Product Images
```typescript
const ecommerceRouter = s3.createRouter({
productImages: s3.image()
.max('5MB')
.count(8)
.formats(['webp', 'jpeg'])
.middleware(async ({ req }) => {
const vendorId = await getVendorId(req);
return { vendorId, category: "products" };
})
.onUploadComplete(async ({ files, metadata }) => {
// Update product catalog
await updateProductImages(metadata.vendorId, files);
}),
});
```
### Document Management System
```typescript
const docsRouter = s3.createRouter({
contracts: s3.file()
.max('25MB')
.types(['application/pdf'])
.count(1)
.middleware(async ({ req }) => {
const { userId, companyId } = await validateContractUpload(req);
return { userId, companyId, confidential: true };
}),
proposals: s3.object({
pdfs: s3.file().max('50MB').types(['application/pdf']).count(3),
documents: s3.file().max('25MB').types(['application/vnd.openxmlformats-officedocument.wordprocessingml.document']).count(5),
}).middleware(async ({ req }) => {
const { userId, projectId } = await validateProposalUpload(req);
return { userId, projectId };
}),
});
```
### Social Media Platform
```typescript
export const socialRouter = createUploadRouter({
profilePicture: uploadSchema({
image: {
maxSize: "2MB",
maxCount: 1,
processing: {
resize: { width: 400, height: 400 },
format: "webp",
},
},
}),
postMedia: uploadSchema({
image: { maxSize: "8MB", maxCount: 4 },
video: { maxSize: "100MB", maxCount: 1 },
}).middleware(async ({ req }) => {
const userId = await authenticateUser(req);
return { userId, postType: "media" };
}),
});
```
## Security Best Practices
**Important:** Always implement proper authentication and file validation in
production environments.
### Content Type Validation
```typescript
export const router = createUploadRouter({
secureUpload: uploadSchema({
image: {
maxSize: "4MB",
maxCount: 5,
mimeTypes: ["image/jpeg", "image/png", "image/webp"], // Explicit whitelist
},
}).middleware(async ({ req, files }) => {
// Additional security checks
for (const file of files) {
// Validate file headers match content type
const isValidImage = await validateImageFile(file);
if (!isValidImage) {
throw new Error("Invalid image file");
}
}
return { userId: await getUserId(req) };
}),
});
```
### Rate Limiting
```typescript
import { ratelimit } from "@/lib/ratelimit";
export const router = createUploadRouter({
upload: uploadSchema({
any: { maxSize: "10MB", maxCount: 3 },
}).middleware(async ({ req }) => {
const ip = req.headers.get("x-forwarded-for") || "unknown";
const { success } = await ratelimit.limit(ip);
if (!success) {
throw new Error("Rate limit exceeded");
}
return { ip };
}),
});
```
***
**Next Steps:** Now that you have your router configured, learn how to
[configure your client](/docs/api/configuration/client-options) for the best
developer experience.
# Upload Configuration
URL: /docs/api/configuration/upload-config
Complete guide to configuring pushduck with the createUploadConfig builder
***
title: Upload Configuration
description: Complete guide to configuring pushduck with the createUploadConfig builder
---------------------------------------------------------------------------------------
# Upload Configuration
The `createUploadConfig()` builder provides a fluent API for configuring your S3 uploads with providers, security, paths, and lifecycle hooks.
## Basic Setup
```typescript
// lib/upload.ts
import { createUploadConfig } from 'pushduck/server'
const { storage, s3, config } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL,
bucket: process.env.S3_BUCKET_NAME,
accountId: process.env.R2_ACCOUNT_ID,
})
.build()
export { storage, s3 }
```
## Provider Type Safety
The `createUploadConfig().provider()` method provides full TypeScript type safety for provider configurations. When you specify a provider type, TypeScript will automatically infer the correct configuration interface and provide autocomplete and type checking.
### Features
* **Provider name autocomplete**: TypeScript suggests valid provider names
* **Configuration property validation**: Only valid properties for each provider are accepted
* **Required field checking**: TypeScript ensures all required fields are provided
* **Property type validation**: Each property must be the correct type
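For example, a property that belongs to a different provider is rejected at compile time (a sketch; `accountId` is an R2-specific field, not an AWS one):
```typescript
createUploadConfig().provider("aws", {
  region: "us-east-1",
  bucket: "your-bucket",
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  // accountId: "...", // ❌ Type error - not a valid property for "aws"
})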
## Provider Configuration
### Cloudflare R2
```typescript
createUploadConfig().provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: 'auto', // Always 'auto' for R2
endpoint: process.env.AWS_ENDPOINT_URL,
bucket: process.env.S3_BUCKET_NAME,
accountId: process.env.R2_ACCOUNT_ID, // Required for R2
})
```
### AWS S3
```typescript
createUploadConfig().provider("aws",{
region: 'us-east-1',
bucket: 'your-bucket',
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
})
```
### DigitalOcean Spaces
```typescript
createUploadConfig().provider("digitalOceanSpaces",{
region: 'nyc3',
bucket: 'your-space',
accessKeyId: process.env.DO_SPACES_ACCESS_KEY_ID,
secretAccessKey: process.env.DO_SPACES_SECRET_ACCESS_KEY,
})
```
### MinIO
```typescript
createUploadConfig().provider("minio",{
endpoint: 'localhost:9000',
bucket: 'your-bucket',
accessKeyId: process.env.MINIO_ACCESS_KEY_ID,
secretAccessKey: process.env.MINIO_SECRET_ACCESS_KEY,
useSSL: false,
})
```
### S3-Compatible (Generic)
For any S3-compatible storage service not explicitly supported:
```typescript
createUploadConfig().provider("s3Compatible",{
endpoint: 'https://your-s3-compatible-service.com',
bucket: 'your-bucket',
accessKeyId: process.env.S3_ACCESS_KEY_ID,
secretAccessKey: process.env.S3_SECRET_ACCESS_KEY,
region: 'us-east-1', // Optional, defaults to us-east-1
forcePathStyle: true, // Optional, defaults to true for compatibility
})
```
## Configuration Methods
### .defaults()
Set default file constraints and options:
```typescript
.defaults({
maxFileSize: '10MB',
allowedFileTypes: ['image/*', 'application/pdf', 'text/*'],
acl: 'public-read', // or 'private'
metadata: {
uploadedBy: 'system',
environment: process.env.NODE_ENV,
},
})
```
### .paths()
Configure global path structure:
```typescript
.paths({
// Global prefix for all uploads
prefix: 'uploads',
// Global key generation function
generateKey: (file, metadata) => {
const userId = metadata.userId || 'anonymous'
const timestamp = Date.now()
const randomId = Math.random().toString(36).substring(2, 8)
const sanitizedName = file.name.replace(/[^a-zA-Z0-9.-]/g, '_')
return `${userId}/${timestamp}/${randomId}/${sanitizedName}`
},
})
```
### .security()
Configure security and access control:
```typescript
.security({
requireAuth: true,
allowedOrigins: [
'http://localhost:3000',
'https://your-domain.com',
],
rateLimiting: {
maxUploads: 10,
windowMs: 60000, // 1 minute
},
})
```
### .hooks()
Add lifecycle hooks for upload events:
```typescript
.hooks({
onUploadStart: async ({ file, metadata }) => {
console.log(`Upload started: ${file.name}`)
// Log to analytics, validate user, etc.
},
onUploadComplete: async ({ file, url, metadata }) => {
console.log(`✅ Upload completed: ${file.name} -> ${url}`)
// Save to database
await db.files.create({
filename: file.name,
url,
userId: metadata.userId,
size: file.size,
})
// Send notifications
await notificationService.send({
type: 'upload_complete',
userId: metadata.userId,
filename: file.name,
})
},
onUploadError: async ({ file, error, metadata }) => {
console.error(`❌ Upload failed: ${file.name}`, error)
// Log to error tracking service
await errorService.log({
operation: 'file_upload',
error: error.message,
userId: metadata.userId,
filename: file.name,
})
},
})
```
## Complete Example
```typescript
// lib/upload.ts
import { createUploadConfig } from 'pushduck/server'
const { storage, s3, config } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL,
bucket: process.env.S3_BUCKET_NAME,
accountId: process.env.R2_ACCOUNT_ID,
})
.defaults({
maxFileSize: '10MB',
acl: 'public-read',
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
const userId = metadata.userId || 'anonymous'
const date = new Date()
const year = date.getFullYear()
const month = String(date.getMonth() + 1).padStart(2, '0')
const day = String(date.getDate()).padStart(2, '0')
const randomId = Math.random().toString(36).substring(2, 8)
return `${userId}/${year}/${month}/${day}/${randomId}/${file.name}`
},
})
.security({
allowedOrigins: [
'http://localhost:3000',
'https://your-domain.com',
],
rateLimiting: {
maxUploads: 20,
windowMs: 60000,
},
})
.hooks({
onUploadComplete: async ({ file, url, metadata }) => {
// Save to your database
await saveFileRecord({
filename: file.name,
url,
userId: metadata.userId,
uploadedAt: new Date(),
})
},
})
.build()
export { storage, s3 }
```
## Environment Variables
The configuration automatically reads from environment variables:
```bash
# .env.local
# Cloudflare R2
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com
S3_BUCKET_NAME=your-bucket
R2_ACCOUNT_ID=your-account-id
# AWS S3 (alternative)
AWS_REGION=us-east-1
AWS_S3_BUCKET=your-s3-bucket
# DigitalOcean Spaces (alternative)
DO_SPACES_REGION=nyc3
DO_SPACES_BUCKET=your-space
DO_SPACES_ACCESS_KEY_ID=your_key
DO_SPACES_SECRET_ACCESS_KEY=your_secret
```
## Type Definitions
```typescript
interface UploadConfig {
provider: ProviderConfig
defaults?: {
maxFileSize?: string | number
allowedFileTypes?: string[]
acl?: 'public-read' | 'private'
metadata?: Record<string, string>
}
paths?: {
prefix?: string
generateKey?: (
file: { name: string; type: string },
metadata: any
) => string
}
security?: {
requireAuth?: boolean
allowedOrigins?: string[]
rateLimiting?: {
maxUploads?: number
windowMs?: number
}
}
hooks?: {
onUploadStart?: (ctx: { file: any; metadata: any }) => Promise<void> | void
onUploadComplete?: (ctx: {
file: any
url: string
metadata: any
}) => Promise<void> | void
onUploadError?: (ctx: {
file: any
error: Error
metadata: any
}) => Promise<void> | void
}
}
```
# useUploadRoute
URL: /docs/api/hooks/use-upload-route
React hook for uploading files with reactive state management
***
title: useUploadRoute
description: React hook for uploading files with reactive state management
--------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { TypeTable } from "fumadocs-ui/components/type-table";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { formatETA, formatUploadSpeed } from "pushduck";
# useUploadRoute
A React hook that provides reactive state management for file uploads with progress tracking and error handling.
This hook follows familiar React patterns and is perfect for teams that prefer
the traditional hook-based approach. Both this and `createUploadClient` are equally valid ways to handle uploads.
## When to Use This Hook
Use `useUploadRoute` when:
* **Prefer React hooks** - familiar pattern for React developers
* **Granular control** needed over individual upload state
* **Component-level state** management preferred
* **Team preference** for hook-based patterns
## Alternative Approach
You can also use the structured client approach:
```typescript
// Hook-based approach
import { useUploadRoute } from 'pushduck/client'
const { uploadFiles, files } = useUploadRoute('imageUpload')
```
```typescript
// Structured client approach
import { createUploadClient } from 'pushduck/client'
const upload = createUploadClient({
endpoint: '/api/upload'
})
const { uploadFiles, files } = upload.imageUpload
```
Both approaches provide the same functionality and type safety - choose what feels more natural for your team.
## Basic Usage
```typescript
import { useUploadRoute } from "pushduck/client";
import { formatETA, formatUploadSpeed } from "pushduck";
import type { AppRouter } from "@/lib/upload";
export function ImageUploader() {
  // With type parameter (recommended for better type safety)
  const { uploadFiles, files, isUploading, error, reset, progress, uploadSpeed, eta } =
    useUploadRoute<AppRouter>("imageUpload");
  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);
    uploadFiles(selectedFiles);
  };
  return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} />
      {/* Overall progress tracking */}
      {isUploading && files.length > 1 && progress !== undefined && (
        <div>
          <p>Overall Progress: {Math.round(progress)}%</p>
          <p>Speed: {uploadSpeed ? formatUploadSpeed(uploadSpeed) : "0 B/s"}</p>
          <p>ETA: {eta ? formatETA(eta) : "--"}</p>
        </div>
      )}
      {/* Individual file progress */}
      {files.map((file) => (
        <div key={file.id}>
          <span>{file.name}</span>
          {file.status === "success" && <a href={file.url}>View</a>}
        </div>
      ))}
      {error && <p>{error.message}</p>}
      <button onClick={reset}>Reset</button>
    </div>
  );
}
```
## Overall Progress Tracking
The hook provides real-time overall progress metrics when uploading multiple files:
Overall progress tracking is especially useful for batch uploads and provides a better user experience when uploading multiple files simultaneously.
```typescript
const { progress, uploadSpeed, eta } = useUploadRoute("imageUpload");
// Progress: 0-100 percentage across all files
console.log(`Overall progress: ${progress}%`);
// Upload speed: Combined transfer rate in bytes/second
console.log(`Transfer rate: ${formatUploadSpeed(uploadSpeed)}`);
// ETA: Time remaining in seconds
console.log(`Time remaining: ${formatETA(eta)}`);
```
### Progress Calculation
* **progress**: Weighted by file sizes, not just file count
* **uploadSpeed**: Sum of all active file upload speeds
* **eta**: Calculated based on remaining bytes and current speed
* Values are `undefined` when no uploads are active
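A rough sketch of how such a size-weighted figure can be derived (illustrative only, not the library's actual implementation):
```typescript
// Weight each file's progress by its share of the total bytes
function overallProgress(files: { size: number; progress: number }[]): number | undefined {
  if (files.length === 0) return undefined;
  const totalBytes = files.reduce((sum, f) => sum + f.size, 0);
  const doneBytes = files.reduce((sum, f) => sum + f.size * (f.progress / 100), 0);
  return totalBytes > 0 ? (doneBytes / totalBytes) * 100 : undefined;
}
```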
## Hook Signature
```typescript
// With type parameter (recommended)
function useUploadRoute<TRouter>(
route: keyof TRouter,
options?: UseUploadOptions
): UseUploadReturn;
// Without type parameter (also works)
function useUploadRoute(
route: string,
options?: UseUploadOptions
): UseUploadReturn;
```
## Parameters
### Type Parameter Benefits
```typescript
// ✅ With type parameter - better type safety
const { uploadFiles } = useUploadRoute<AppRouter>("imageUpload");
// - Route names are validated at compile time
// - IntelliSense shows available routes
// - Typos caught during development
// ✅ Without type parameter - still works
const { uploadFiles } = useUploadRoute("imageUpload");
// - Works with any string
// - Less type safety but more flexible
// - Good for dynamic route names
```
## Options
<TypeTable
  type={{
    onSuccess: {
      description: "Callback when upload succeeds",
      type: "(results: UploadResult[]) => void",
    },
    onError: {
      description: "Callback when upload fails",
      type: "(error: UploadError) => void",
    },
    onProgress: {
      description: "Callback for progress updates",
      type: "(progress: UploadProgress) => void",
    },
    disabled: {
      description: "Disable upload functionality",
      type: "boolean",
      default: "false",
    },
    autoUpload: {
      description: "Automatically upload files when added",
      type: "boolean",
      default: "true",
    },
  }}
/>
## Return Value
<TypeTable
  type={{
    uploadFiles: {
      description: "Function to start uploading files",
      type: "(files: File[]) => Promise<void>",
    },
    files: {
      description: "Array of files with upload status",
      type: "UploadFile[]",
    },
    isUploading: {
      description: "Whether any upload is in progress",
      type: "boolean",
    },
    uploadedFiles: {
      description: "Successfully uploaded files",
      type: "UploadResult[]",
    },
    error: {
      description: "Upload error if any",
      type: "UploadError | null",
    },
    reset: {
      description: "Reset upload state",
      type: "() => void",
    },
    progress: {
      description: "Overall progress across all files (0-100)",
      type: "number | undefined",
    },
    uploadSpeed: {
      description: "Combined transfer rate in bytes per second",
      type: "number | undefined",
    },
    eta: {
      description: "Overall time remaining in seconds",
      type: "number | undefined",
    },
  }}
/>
## Advanced Examples
```typescript
const { uploadFiles, files } = useUploadRoute('documentUpload', {
onSuccess: (results) => {
toast.success(`Uploaded ${results.length} files`)
updateDocuments(results)
},
onError: (error) => {
toast.error(`Upload failed: ${error.message}`)
},
onProgress: (progress) => {
setGlobalProgress(progress.percentage)
}
})
```
```typescript
const images = useUploadRoute('imageUpload')
const documents = useUploadRoute('documentUpload')
return (
  <div>{/* Render an uploader for each route */}</div>
)
```
```typescript
const { uploadFiles, uploadedFiles } = useUploadRoute('attachments', {
onSuccess: (results) => {
setValue('attachments', results.map(r => r.url))
}
})
const onSubmit = (data) => {
// Form data includes uploaded file URLs
console.log(data.attachments)
}
```
***
**Flexible API:** Use this hook when you prefer React's familiar hook patterns
or need more granular control over upload state.
# useUpload
URL: /docs/api/hooks/use-upload
React hook for managing file uploads with full type safety and reactive state
***
title: useUpload
description: React hook for managing file uploads with full type safety and reactive state
------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { TypeTable } from "fumadocs-ui/components/type-table";
# useUpload
The `useUpload` hook provides a React-friendly API for managing file uploads with reactive state management, progress tracking, and error handling.
This hook is perfect for teams that prefer the familiar React hooks pattern.
For enhanced type safety with property-based access, see the [Client
Configuration](/docs/api/configuration/client-options) guide.
## Basic Usage
```typescript
import { useUpload } from "pushduck/client";
import type { AppRouter } from "@/lib/upload";
export function ImageUploader() {
const { uploadFiles, files, isUploading, error, reset } =
    useUpload<AppRouter>("imageUpload");
  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);
    uploadFiles(selectedFiles);
  };
  return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} />
      {files.map((file) => (
        <div key={file.id}>
          <span>{file.name}</span>
          {file.status === "success" && <a href={file.url}>View</a>}
          {file.status === "error" && <span>Error: {file.error}</span>}
        </div>
      ))}
      <button onClick={reset}>Reset</button>
    </div>
  );
}
```
## Hook API Reference
<TypeTable
  type={{
    uploadFiles: {
      description: "Function to start uploading files",
      type: "(files: File[], options?) => Promise<void>",
    },
    files: {
      description: "Array of files with upload status and progress",
      type: "UploadFile[]",
    },
    isUploading: {
      description: "Boolean indicating if any upload is in progress",
      type: "boolean",
    },
    uploadedFiles: {
      description: "Array of successfully uploaded files",
      type: "UploadResult[]",
    },
    error: {
      description: "Upload error if any occurred",
      type: "UploadError | null",
    },
    progress: {
      description: "Overall upload progress (0-100)",
      type: "number",
    },
    reset: {
      description: "Function to reset upload state",
      type: "() => void",
    },
    cancel: {
      description: "Function to cancel ongoing uploads",
      type: "() => void",
    },
  }}
/>
## Hook Options
<TypeTable
  type={{
    onSuccess: {
      description: "Callback when upload succeeds",
      type: "(results: UploadResult[]) => void",
    },
    onError: {
      description: "Callback when upload fails",
      type: "(error: UploadError) => void",
    },
    onProgress: {
      description: "Callback for upload progress updates",
      type: "(progress: UploadProgress) => void",
    },
    disabled: {
      description: "Disable upload functionality",
      type: "boolean",
      default: "false",
    },
    autoUpload: {
      description: "Automatically upload files when selected",
      type: "boolean",
      default: "true",
    },
    maxFiles: {
      description: "Maximum number of files allowed",
      type: "number",
    },
  }}
/>
## Advanced Usage Examples
### Upload with Callbacks
```typescript
export function AdvancedUploader() {
const { uploadFiles, files, isUploading } = useUpload(
"documentUpload",
{
onSuccess: (results) => {
toast.success(`Successfully uploaded ${results.length} files`);
// Update your app state
updateDocuments(results);
},
onError: (error) => {
toast.error(`Upload failed: ${error.message}`);
// Log error for debugging
console.error("Upload error:", error);
},
onProgress: (progress) => {
// Update global progress indicator
setGlobalProgress(progress.percentage);
},
maxFiles: 5,
}
);
const handleUpload = async (selectedFiles: File[]) => {
try {
await uploadFiles(selectedFiles, {
metadata: {
category: "user-documents",
timestamp: Date.now(),
},
});
} catch (error) {
// Handle specific upload errors
if (error.code === "FILE_TOO_LARGE") {
toast.error("One or more files are too large");
}
}
};
  return <div>{/* Upload UI */}</div>;
}
```
### Multiple Upload Types
```typescript
export function MultiTypeUploader() {
const images = useUpload("imageUpload", {
maxFiles: 10,
onSuccess: (results) => setImages((prev) => [...prev, ...results]),
});
const documents = useUpload("documentUpload", {
maxFiles: 5,
onSuccess: (results) => setDocuments((prev) => [...prev, ...results]),
});
const videos = useUpload("videoUpload", {
maxFiles: 2,
onSuccess: (results) => setVideos((prev) => [...prev, ...results]),
});
  return (
    <div>
      <input
        type="file"
        accept="image/*"
        multiple
        disabled={images.isUploading}
        onChange={(e) => images.uploadFiles(Array.from(e.target.files || []))}
      />
      <input
        type="file"
        accept=".pdf,.doc,.docx"
        multiple
        disabled={documents.isUploading}
        onChange={(e) => documents.uploadFiles(Array.from(e.target.files || []))}
      />
      <input
        type="file"
        accept="video/*"
        disabled={videos.isUploading}
        onChange={(e) => videos.uploadFiles(Array.from(e.target.files || []))}
      />
    </div>
  );
}
```
### Custom Upload Hook
Create reusable upload hooks for specific use cases:
```typescript
import { useUpload } from "pushduck/client";
import { useCallback } from "react";
import { toast } from "sonner";
export function useProfilePictureUpload() {
const upload = useUpload("profilePicture", {
maxFiles: 1,
onSuccess: (results) => {
toast.success("Profile picture updated!");
// Update user profile
updateProfile({ avatar: results[0].url });
},
onError: (error) => {
toast.error("Failed to update profile picture");
},
});
const uploadProfilePicture = useCallback(
async (file: File) => {
// Validate file before upload
if (file.size > 2 * 1024 * 1024) {
// 2MB
toast.error("Profile picture must be less than 2MB");
return;
}
if (!file.type.startsWith("image/")) {
toast.error("Please select an image file");
return;
}
await upload.uploadFiles([file], {
metadata: {
type: "profile-picture",
userId: getCurrentUserId(),
},
});
},
[upload.uploadFiles]
);
return {
...upload,
uploadProfilePicture,
};
}
// Usage
export function ProfilePictureUploader() {
const { uploadProfilePicture, files, isUploading } =
useProfilePictureUpload();
  return (
    <div>
      <input
        type="file"
        accept="image/*"
        onChange={(e) => {
          const file = e.target.files?.[0];
          if (file) uploadProfilePicture(file);
        }}
        disabled={isUploading}
      />
      {/* Show upload progress */}
    </div>
  );
}
```
## File State Management
### UploadFile Type
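A minimal sketch of the `UploadFile` shape, inferred from how `files` is used throughout this guide (the actual exported type may carry more fields):
```typescript
interface UploadFile {
  id: string;
  name: string;
  size: number;
  status: "pending" | "uploading" | "success" | "error";
  progress: number;     // 0-100 while uploading
  url?: string;         // set on success
  error?: string;       // set on failure
  originalFile?: File;  // the browser File, useful for retries
}
```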
### File Status Transitions
```typescript
export function FileStatusIndicator({ file }: { file: UploadFile }) {
  const getStatusIcon = () => {
    // Icon components below are illustrative placeholders
    switch (file.status) {
      case "pending":
        return <ClockIcon />;
      case "uploading":
        return <SpinnerIcon />;
      case "success":
        return <CheckIcon />;
      case "error":
        return <ErrorIcon />;
    }
  };
const getStatusText = () => {
switch (file.status) {
case "pending":
return "Waiting to upload...";
case "uploading":
return `Uploading... ${file.progress}%`;
case "success":
return "Upload complete";
case "error":
return `Error: ${file.error}`;
}
};
  return (
    <div>
      {getStatusIcon()}
      <span>{getStatusText()}</span>
    </div>
  );
}
```
## Error Handling
### Error Types
",
},
}}
/>
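A minimal sketch of the `UploadError` shape, inferred from the codes and properties handled in this guide (the actual exported type may differ):
```typescript
interface UploadError extends Error {
  code: string;                    // e.g. "FILE_TOO_LARGE", "NETWORK_ERROR"
  status?: number;                 // HTTP status when applicable
  metadata?: Record<string, any>;  // extra context, e.g. fileType
}
```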
### Error Handling Patterns
```typescript
export function ErrorHandlingUploader() {
const { uploadFiles, files, error } = useUpload("fileUpload", {
onError: (error) => {
switch (error.code) {
case "FILE_TOO_LARGE":
toast.error("File is too large. Please choose a smaller file.");
break;
case "INVALID_TYPE":
toast.error(
"File type not supported. Please choose a different file."
);
break;
case "QUOTA_EXCEEDED":
toast.error("Upload quota exceeded. Please upgrade your plan.");
break;
case "NETWORK_ERROR":
toast.error(
"Network error. Please check your connection and try again."
);
break;
default:
toast.error("Upload failed. Please try again.");
}
},
});
const handleRetry = () => {
const failedFiles = files.filter((file) => file.status === "error");
if (failedFiles.length > 0) {
// Convert UploadFile back to File for retry
const filesToRetry = failedFiles
.map((f) => f.originalFile)
.filter(Boolean);
uploadFiles(filesToRetry);
}
};
  return (
    <div>
      {/* Upload UI */}
      {error && (
        <div>
          <p>{error.message}</p>
          <button onClick={handleRetry}>Retry</button>
        </div>
      )}
    </div>
  );
}
```
## Integration Examples
### Form Integration
```typescript
import { useForm } from "react-hook-form";
import { useUpload } from "pushduck/client";
interface FormData {
title: string;
description: string;
files: UploadResult[];
}
export function DocumentForm() {
  const { register, handleSubmit, setValue, watch } = useForm<FormData>();
const { uploadFiles, uploadedFiles, isUploading } = useUpload(
"documentUpload",
{
onSuccess: (results) => {
// Update form data with uploaded files
setValue("files", results);
},
}
);
const onSubmit = (data: FormData) => {
console.log("Form submitted:", data);
// Submit form with uploaded file URLs
};
  return (
    <form onSubmit={handleSubmit(onSubmit)}>
      <input {...register("title")} placeholder="Title" />
      <textarea {...register("description")} placeholder="Description" />
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      />
      <button type="submit" disabled={isUploading}>
        {isUploading ? "Uploading..." : "Submit"}
      </button>
    </form>
  );
}
```
### Drag and Drop Integration
```typescript
import { useDropzone } from "react-dropzone";
import { useUpload } from "pushduck/client";
export function DragDropUploader() {
const { uploadFiles, files, isUploading } =
useUpload("imageUpload");
const { getRootProps, getInputProps, isDragActive } = useDropzone({
onDrop: (acceptedFiles) => {
uploadFiles(acceptedFiles);
},
accept: {
"image/*": [".png", ".jpg", ".jpeg", ".gif", ".webp"],
},
maxFiles: 10,
});
  return (
    <div {...getRootProps()}>
      <input {...getInputProps()} />
      {isDragActive ? (
        <p>Drop the files here...</p>
      ) : (
        <p>Drag & drop images here, or click to select files</p>
      )}
      {files.length > 0 && (
        <ul>
          {files.map((file) => (
            <li key={file.id}>{file.name}</li>
          ))}
        </ul>
      )}
    </div>
  );
}
```
***
**Next Steps:** Explore [utility
functions](/docs/api/utilities/create-upload-client) for advanced upload
client customization, or check out [complete examples](/docs/examples) to see
the hook in action.
# File Operations
URL: /docs/api/storage/file-operations
Complete guide to file listing, metadata, and delete operations
***
title: File Operations
description: Complete guide to file listing, metadata, and delete operations
----------------------------------------------------------------------------
# File Operations
## List Operations
### Basic File Listing
```typescript
// List all files
const files = await storage.list.files()
// List with options
const filtered = await storage.list.files({
prefix: 'uploads/',
maxResults: 50,
sortBy: 'lastModified',
sortOrder: 'desc'
})
```
### Paginated Listing
```typescript
// Get first page
const result = await storage.list.paginated({
maxResults: 20,
prefix: 'images/'
})
console.log(result.files) // FileInfo[]
console.log(result.hasMore) // boolean
console.log(result.nextToken) // string | undefined
// Get next page
if (result.hasMore) {
const nextPage = await storage.list.paginated({
maxResults: 20,
prefix: 'images/',
continuationToken: result.nextToken
})
}
```
### Filtered Listing
```typescript
// By file extension
const images = await storage.list.byExtension('jpg', 'photos/')
const pdfs = await storage.list.byExtension('pdf')
// By file size (bytes)
const largeFiles = await storage.list.bySize(1024 * 1024) // > 1MB
const mediumFiles = await storage.list.bySize(100000, 1024 * 1024) // 100KB - 1MB
// By date range
const recent = await storage.list.byDate(
new Date('2024-01-01'),
new Date('2024-12-31')
)
```
### Directory Listing
```typescript
// List "directories" (common prefixes)
const dirs = await storage.list.directories('uploads/')
// Returns: ['uploads/images/', 'uploads/documents/', 'uploads/videos/']
```
### Async Generator (Large Datasets)
```typescript
// Process large datasets efficiently
for await (const batch of storage.list.paginatedGenerator({ maxResults: 100 })) {
console.log(`Processing ${batch.files.length} files`)
// Process batch...
}
```
## Metadata Operations
### Single File Info
```typescript
const info = await storage.metadata.getInfo('uploads/image.jpg')
console.log(info.key) // 'uploads/image.jpg'
console.log(info.size) // 1024000
console.log(info.contentType) // 'image/jpeg'
console.log(info.lastModified) // Date object
console.log(info.etag) // '"abc123..."'
```
### Batch Metadata
```typescript
const keys = ['file1.jpg', 'file2.pdf', 'file3.mp4']
const results = await storage.metadata.getBatch(keys)
results.forEach(result => {
if (result.success) {
console.log(result.info.size)
} else {
console.log(result.error)
}
})
```
### Specific Metadata
```typescript
// Get individual properties
const size = await storage.metadata.getSize('file.jpg')
const contentType = await storage.metadata.getContentType('file.jpg')
const lastModified = await storage.metadata.getLastModified('file.jpg')
// Custom metadata
const customMeta = await storage.metadata.getCustom('file.jpg')
await storage.metadata.setCustom('file.jpg', {
userId: '123',
category: 'profile-image'
})
```
## Delete Operations
### Single File Delete
```typescript
const result = await storage.delete.file('uploads/old-file.jpg')
if (result.success) {
console.log('File deleted successfully')
} else {
console.log('Delete failed:', result.error)
}
```
### Batch Delete
```typescript
const keys = ['file1.jpg', 'file2.pdf', 'file3.mp4']
const result = await storage.delete.files(keys)
console.log(`Deleted: ${result.deleted.length}`)
console.log(`Failed: ${result.errors.length}`)
// Check individual results
result.errors.forEach(error => {
console.log(`Failed to delete ${error.key}: ${error.message}`)
})
```
### Delete by Prefix (Folder-like)
```typescript
// Delete all files with prefix
const result = await storage.delete.byPrefix('temp-uploads/')
console.log(`Deleted ${result.deletedCount} files`)
// With options
const preview = await storage.delete.byPrefix('old-files/', {
dryRun: true, // Preview only, don't delete
batchSize: 500, // Process in batches
maxFiles: 1000 // Limit total files
})
if (preview.dryRun) {
  console.log(`Would delete ${preview.totalFiles} files`)
}
```
## Validation Operations
### File Existence
```typescript
// Simple existence check
const exists = await storage.validation.exists('file.jpg')
// Existence with metadata
const result = await storage.validation.existsWithInfo('file.jpg')
if (result.exists) {
console.log('File size:', result.info.size)
}
```
### File Validation
```typescript
// Validate single file
const result = await storage.validation.validateFile('image.jpg', {
max: "5MB",
types: ['image/jpeg', 'image/png'],
min: "1KB"
})
if (result.valid) {
console.log('File is valid')
} else {
console.log('Validation errors:', result.errors)
}
// Validate multiple files
const results = await storage.validation.validateFiles(
['file1.jpg', 'file2.png'],
{ max: "10MB" }
)
```
### Connection Validation
```typescript
// Test S3 connection and permissions
const isHealthy = await storage.validation.connection()
console.log('S3 connection:', isHealthy ? 'OK' : 'Failed')
```
## Type Definitions
```typescript
interface FileInfo {
key: string
size: number
contentType: string
lastModified: Date
etag: string
metadata?: Record<string, string>
}
interface ListFilesOptions {
prefix?: string
maxResults?: number
sortBy?: 'key' | 'size' | 'lastModified'
sortOrder?: 'asc' | 'desc'
}
interface ValidationRules {
max?: string | number
min?: string | number
types?: string[]
requiredMetadata?: string[]
}
```
# Presigned URLs
URL: /docs/api/storage/presigned-urls
Secure file access with presigned URLs for private buckets
***
title: Presigned URLs
description: Secure file access with presigned URLs for private buckets
-----------------------------------------------------------------------
# Presigned URLs
Presigned URLs allow secure access to private S3 files without exposing credentials. They're essential for serving files from private buckets.
## Download URLs
### Basic Presigned URL
```typescript
// Generate URL valid for 1 hour (default)
const documentUrl = await storage.download.presignedUrl('private/document.pdf')
// Custom expiration (in seconds)
const imageUrl = await storage.download.presignedUrl('private/image.jpg', 3600) // 1 hour
const videoUrl = await storage.download.presignedUrl('private/video.mp4', 86400) // 24 hours
```
### Direct File URLs
For public buckets, get direct URLs:
```typescript
const publicUrl = await storage.download.url('public/image.jpg')
// Returns: https://bucket.s3.amazonaws.com/public/image.jpg
```
## Upload URLs
### Single Upload URL
```typescript
const uploadUrl = await storage.upload.presignedUrl({
key: 'uploads/new-file.jpg',
contentType: 'image/jpeg',
expiresIn: 300, // 5 minutes
maxFileSize: 5 * 1024 * 1024 // 5MB
})
console.log(uploadUrl.url) // Presigned URL for PUT request
console.log(uploadUrl.fields) // Form fields for multipart upload
```
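The returned URL can then be used directly from the browser. A minimal sketch, assuming a PUT-style presigned URL as noted above (multipart POST uploads would use `uploadUrl.fields` instead):
```typescript
// Browser-side: send the file bytes straight to storage
await fetch(uploadUrl.url, {
  method: 'PUT',
  headers: { 'Content-Type': 'image/jpeg' }, // must match the contentType above
  body: file, // a File or Blob selected by the user
})
```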
### Batch Upload URLs
```typescript
const requests = [
{ key: 'file1.jpg', contentType: 'image/jpeg' },
{ key: 'file2.pdf', contentType: 'application/pdf' },
{ key: 'file3.mp4', contentType: 'video/mp4' }
]
const urls = await storage.upload.presignedBatch(requests)
urls.forEach((result, index) => {
if (result.success) {
console.log(`Upload URL for ${requests[index].key}:`, result.url)
} else {
console.log(`Failed to generate URL:`, result.error)
}
})
```
## Frontend Usage Examples
### Direct File Access
```typescript
// API Route (app/api/files/[key]/route.ts)
import { storage } from '@/lib/upload'
import { NextRequest, NextResponse } from 'next/server'
export async function GET(
request: NextRequest,
{ params }: { params: { key: string } }
) {
try {
const url = await storage.download.presignedUrl(params.key, 3600)
return NextResponse.redirect(url)
} catch (error) {
return NextResponse.json(
{ error: 'File not found' },
{ status: 404 }
)
}
}
```
### File Viewer Component
```tsx
'use client'
import { useState, useEffect } from 'react'
interface FileViewerProps {
fileKey: string
}
export function FileViewer({ fileKey }: FileViewerProps) {
const [url, setUrl] = useState<string | null>(null)
const [loading, setLoading] = useState(true)
useEffect(() => {
async function getUrl() {
try {
const response = await fetch('/api/presigned', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
operation: 'get-download-url',
key: fileKey
})
})
const data = await response.json()
if (data.success) {
setUrl(data.url)
}
} finally {
setLoading(false)
}
}
getUrl()
}, [fileKey])
if (loading) return <p>Loading...</p>
if (!url) return <p>Failed to load file</p>
return (
  <iframe src={url} title={fileKey} />
)
}
```
### API Route for Presigned URLs
```typescript
// app/api/presigned/route.ts
import { storage } from '@/lib/upload'
import { NextRequest, NextResponse } from 'next/server'
export async function POST(request: NextRequest) {
try {
const { operation, key, expiresIn = 3600 } = await request.json()
switch (operation) {
  case 'get-download-url': {
    const downloadUrl = await storage.download.presignedUrl(key, expiresIn)
    return NextResponse.json({
      success: true,
      url: downloadUrl,
      expiresIn
    })
  }
  case 'get-upload-url': {
    const uploadUrl = await storage.upload.presignedUrl({
      key,
      expiresIn,
      contentType: 'application/octet-stream'
    })
    return NextResponse.json({
      success: true,
      ...uploadUrl
    })
  }
  default:
    return NextResponse.json(
      { success: false, error: 'Unknown operation' },
      { status: 400 }
    )
}
} catch (error) {
  return NextResponse.json(
    { success: false, error: error instanceof Error ? error.message : 'Unknown error' },
    { status: 500 }
  )
}
}
```
## Security Considerations
### Expiration Times
Choose appropriate expiration times:
```typescript
// Short-lived for sensitive files
const sensitiveUrl = await storage.download.presignedUrl('private/sensitive.pdf', 300) // 5 minutes
// Medium-lived for user content
const userUrl = await storage.download.presignedUrl('user/profile.jpg', 3600) // 1 hour
// Longer-lived for public content
const publicUrl = await storage.download.presignedUrl('public/banner.jpg', 86400) // 24 hours
```
### Access Control
Implement proper access control before generating URLs:
```typescript
// API Route with authentication
export async function POST(request: NextRequest) {
// Verify user authentication
const user = await getAuthenticatedUser(request)
if (!user) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const { key } = await request.json()
// Check if user can access this file
if (!canUserAccessFile(user.id, key)) {
return NextResponse.json({ error: 'Forbidden' }, { status: 403 })
}
// Generate presigned URL
const url = await storage.download.presignedUrl(key, 3600)
return NextResponse.json({ url })
}
```
## Type Definitions
```typescript
interface PresignedUrlOptions {
key: string
contentType?: string
expiresIn?: number
maxFileSize?: number
metadata?: Record<string, string>
}
interface PresignedUrlResult {
url: string
fields?: Record<string, string>
expiresAt: Date
}
```
# Quick Reference
URL: /docs/api/storage/quick-reference
Essential storage operations at a glance
***
title: Quick Reference
description: Essential storage operations at a glance
-----------------------------------------------------
# Storage API Quick Reference
## Setup
```typescript
// lib/upload.ts
import { createUploadConfig } from 'pushduck/server'
const { storage } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL,
bucket: process.env.S3_BUCKET_NAME,
accountId: process.env.R2_ACCOUNT_ID,
})
.build()
export { storage }
```
## Essential Operations
### List Files
```typescript
const files = await storage.list.files({ prefix: 'uploads/', maxResults: 50 })
const paginated = await storage.list.paginated({ maxResults: 20 })
const images = await storage.list.byExtension('jpg')
```
### File Info
```typescript
const info = await storage.metadata.getInfo('file.jpg')
const exists = await storage.validation.exists('file.jpg')
```
### Delete Files
```typescript
await storage.delete.file('old-file.jpg')
await storage.delete.files(['file1.jpg', 'file2.pdf'])
await storage.delete.byPrefix('temp/')
```
### Presigned URLs
```typescript
const downloadUrl = await storage.download.presignedUrl('private/file.pdf', 3600)
const uploadUrl = await storage.upload.presignedUrl({ key: 'new-file.jpg', contentType: 'image/jpeg' })
```
## API Route Example
```typescript
// app/api/files/route.ts
import { storage } from '@/lib/upload'
export async function GET() {
const files = await storage.list.files()
return Response.json({ files })
}
export async function DELETE(request: Request) {
const { key } = await request.json()
const result = await storage.delete.file(key)
return Response.json(result)
}
```
## Error Handling
```typescript
import { isPushduckError } from 'pushduck/server'
try {
await storage.list.files()
} catch (error) {
if (isPushduckError(error)) {
console.log(error.code, error.context)
}
}
```
## Types
```typescript
import type {
FileInfo,
ListFilesOptions,
ValidationRules
} from 'pushduck/server'
```
# Storage Instance
URL: /docs/api/storage/storage-instance
Object-style API for S3 file operations
***
title: Storage Instance
description: Object-style API for S3 file operations
----------------------------------------------------
# Storage Instance
The `StorageInstance` provides a clean, object-style API for all S3 operations. It groups related operations under namespaces for better discoverability.
## Getting the Storage Instance
The `storage` instance comes from your upload configuration, not created separately:
```typescript
// lib/upload.ts
import { createUploadConfig } from 'pushduck/server'
const { storage, s3, config } = createUploadConfig()
.provider("cloudflareR2",{
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: 'auto',
endpoint: process.env.AWS_ENDPOINT_URL,
bucket: process.env.S3_BUCKET_NAME,
accountId: process.env.R2_ACCOUNT_ID,
})
.defaults({
maxFileSize: '10MB',
acl: 'public-read',
})
.paths({
prefix: 'uploads',
generateKey: (file, metadata) => {
const userId = metadata.userId || 'anonymous'
const timestamp = Date.now()
const randomId = Math.random().toString(36).substring(2, 8)
return `${userId}/${timestamp}/${randomId}/${file.name}`
},
})
.build()
// Export the storage instance
export { storage }
```
Then use it in your API routes:
```typescript
// app/api/files/route.ts
import { storage } from '@/lib/upload'
export async function GET() {
const files = await storage.list.files()
return Response.json({ files })
}
```
## API Structure
The storage instance organizes operations into logical namespaces:
```typescript
storage.list.* // File listing operations
storage.metadata.* // File metadata operations
storage.download.* // Download and URL operations
storage.upload.* // Upload operations
storage.delete.* // Delete operations
storage.validation.* // Validation operations
```
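One representative call from each namespace (keys are illustrative):
```typescript
const files = await storage.list.files()                                  // list
const info = await storage.metadata.getInfo('uploads/file.jpg')           // metadata
const url = await storage.download.presignedUrl('uploads/file.jpg')       // download
const put = await storage.upload.presignedUrl({ key: 'uploads/new.jpg' }) // upload
await storage.delete.file('uploads/old.jpg')                              // delete
const exists = await storage.validation.exists('uploads/file.jpg')        // validation
```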
## Configuration Methods
### getConfig()
Get the current configuration (read-only):
```typescript
const config = storage.getConfig()
console.log(config.provider.bucket) // 'my-bucket'
```
### getProviderInfo()
Get provider information:
```typescript
const info = storage.getProviderInfo()
// Returns: { provider: 'aws-s3', bucket: 'my-bucket', region: 'us-east-1' }
```
## Error Handling
All storage operations throw structured `PushduckError` instances:
```typescript
import { isPushduckError } from 'pushduck/server'
try {
const files = await storage.list.files()
} catch (error) {
if (isPushduckError(error)) {
console.log(error.code) // Error code
console.log(error.context) // Additional context
}
}
```
## TypeScript Support
The storage instance is fully typed. Import types as needed:
```typescript
import type { FileInfo, ListFilesOptions } from 'pushduck/server'
const options: ListFilesOptions = {
prefix: 'uploads/',
maxResults: 100
}
const files: FileInfo[] = await storage.list.files(options)
```
# Storage Troubleshooting
URL: /docs/api/storage/troubleshooting
Common issues and solutions for storage operations
***
title: Storage Troubleshooting
description: Common issues and solutions for storage operations
---------------------------------------------------------------
# Storage Troubleshooting
## Common Issues
### Access Denied (403 Errors)
**Problem**: Getting 403 errors when listing or accessing files.
**Solutions**:
1. **Check credentials**:
```typescript
// Verify your environment variables are set
console.log('Access Key:', process.env.AWS_ACCESS_KEY_ID?.substring(0, 5) + '...')
console.log('Bucket:', process.env.S3_BUCKET_NAME)
```
2. **Test connection**:
```typescript
const isHealthy = await storage.validation.connection()
if (!isHealthy) {
console.log('Connection failed - check credentials and permissions')
}
```
3. **Verify bucket permissions**:
* Ensure your access key has `s3:ListBucket`, `s3:GetObject`, `s3:DeleteObject` permissions
* For Cloudflare R2, check your API token has the necessary permissions
### Empty File Lists
**Problem**: `storage.list.files()` returns empty array but files exist.
**Solutions**:
1. **Check prefix**:
```typescript
// Try without prefix first
const allFiles = await storage.list.files()
console.log('Total files:', allFiles.length)
// Then with specific prefix
const prefixFiles = await storage.list.files({ prefix: 'uploads/' })
console.log('Prefix files:', prefixFiles.length)
```
2. **Verify bucket name**:
```typescript
const info = storage.getProviderInfo()
console.log('Current bucket:', info.bucket)
```
### Presigned URL Errors
**Problem**: Presigned URLs return 403 or expire immediately.
**Solutions**:
1. **Check expiration time**:
```typescript
// Use longer expiration for testing
const url = await storage.download.presignedUrl('file.jpg', 3600) // 1 hour
```
2. **Verify file exists**:
```typescript
const exists = await storage.validation.exists('file.jpg')
if (!exists) {
console.log('File does not exist')
}
```
3. **Check bucket privacy settings** (see the sketch below):
* Private buckets require presigned URLs
* Public buckets can use direct URLs
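A small helper can pick the right URL style based on bucket visibility; the `isPublicBucket` flag here is a hypothetical app-level setting:
```typescript
// Hypothetical helper: choose direct vs presigned URLs by bucket visibility
async function getFileUrl(key: string, isPublicBucket: boolean) {
  return isPublicBucket
    ? storage.download.url(key)                // direct URL for public buckets
    : storage.download.presignedUrl(key, 3600) // presigned URL for private buckets
}
```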
### Large File Operations Timeout
**Problem**: Operations on large datasets timeout or fail.
**Solutions**:
1. **Use pagination**:
```typescript
// Instead of loading all files at once
const allFiles = await storage.list.files() // ❌ May timeout
// Use pagination
const result = await storage.list.paginated({ maxResults: 100 }) // ✅ Better
```
2. **Use async generators for processing**:
```typescript
for await (const batch of storage.list.paginatedGenerator({ maxResults: 50 })) {
console.log(`Processing ${batch.files.length} files`)
// Process batch...
}
```
3. **Batch operations**:
```typescript
// Delete files in batches
const filesToDelete = ['file1.jpg', 'file2.jpg', /* ... many files */]
const batchSize = 100
for (let i = 0; i < filesToDelete.length; i += batchSize) {
const batch = filesToDelete.slice(i, i + batchSize)
await storage.delete.files(batch)
}
```
## Error Handling Patterns
### Graceful Degradation
```typescript
import { isPushduckError } from 'pushduck/server'

async function getFilesSafely() {
try {
const files = await storage.list.files()
return { success: true, files }
} catch (error) {
if (isPushduckError(error)) {
console.log('Storage error:', error.code, error.context)
// Handle specific error types
if (error.code === 'NETWORK_ERROR') {
return { success: false, error: 'Network connection failed', files: [] }
}
if (error.code === 'ACCESS_DENIED') {
return { success: false, error: 'Access denied', files: [] }
}
}
// Fallback for unknown errors
return { success: false, error: 'Unknown error', files: [] }
}
}
```
### Retry Logic
```typescript
async function withRetry<T>(
  operation: () => Promise<T>,
  maxRetries = 3,
  delay = 1000
): Promise<T> {
for (let attempt = 1; attempt <= maxRetries; attempt++) {
try {
return await operation()
} catch (error) {
if (attempt === maxRetries) throw error
console.log(`Attempt ${attempt} failed, retrying in ${delay}ms...`)
await new Promise(resolve => setTimeout(resolve, delay))
delay *= 2 // Exponential backoff
}
}
throw new Error('Max retries exceeded')
}
// Usage
const files = await withRetry(() => storage.list.files())
```
## Performance Optimization
### Efficient File Listing
```typescript
// ❌ Inefficient - loads all metadata
const allFiles = await storage.list.files()

// ✅ Efficient - only get what you need
const recentFiles = await storage.list.files({
  maxResults: 50,
  sortBy: 'lastModified',
  sortOrder: 'desc'
})

// ✅ Even better - use pagination for large datasets
const result = await storage.list.paginated({ maxResults: 20 })
```
### Batch Metadata Retrieval
```typescript
// ❌ Inefficient - one API call per key
const fileInfos = []
for (const key of fileKeys) {
  const info = await storage.metadata.getInfo(key)
  fileInfos.push(info)
}

// ✅ Efficient - single batch call
const batchInfos = await storage.metadata.getBatch(fileKeys)
```
### Smart Caching
```typescript
// Cache file lists for a short time
const cache = new Map()
async function getCachedFiles(prefix?: string) {
const cacheKey = `files:${prefix || 'all'}`
if (cache.has(cacheKey)) {
const { data, timestamp } = cache.get(cacheKey)
if (Date.now() - timestamp < 60000) { // 1 minute cache
return data
}
}
const files = await storage.list.files({ prefix })
cache.set(cacheKey, { data: files, timestamp: Date.now() })
return files
}
```
## Debugging Tips
### Enable Debug Logging
```typescript
// Check if debug mode is enabled in your config
const config = storage.getConfig()
console.log('Debug mode:', config.provider.debug)
// The storage operations will log detailed information when debug is true
```
### Inspect Configuration
```typescript
// Check your current configuration
const config = storage.getConfig()
console.log('Provider:', config.provider.provider)
console.log('Bucket:', config.provider.bucket)
console.log('Region:', config.provider.region)
// Check provider info
const info = storage.getProviderInfo()
console.log('Provider info:', info)
```
### Test Individual Operations
```typescript
// Test each operation individually
console.log('Testing connection...')
const isHealthy = await storage.validation.connection()
console.log('Connection:', isHealthy ? 'OK' : 'Failed')
console.log('Testing file listing...')
const files = await storage.list.files({ maxResults: 1 })
console.log('Files found:', files.length)
console.log('Testing file existence...')
if (files.length > 0) {
const exists = await storage.validation.exists(files[0].key)
console.log('First file exists:', exists)
}
```
## Getting Help
If you're still experiencing issues:
1. **Check the logs** - Look for detailed error messages in your console
2. **Verify environment variables** - Ensure all required variables are set
3. **Test with minimal configuration** - Start with basic setup and add complexity gradually
4. **Check provider documentation** - Verify your bucket/account settings
5. **Use health check** - Run `storage.validation.connection()` to verify basic connectivity
# createUploadClient
URL: /docs/api/utilities/create-upload-client
Create a type-safe upload client with property-based access and optional per-route configuration
***
title: createUploadClient
description: Create a type-safe upload client with property-based access and optional per-route configuration
-------------------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
import { TypeTable } from "fumadocs-ui/components/type-table";
# createUploadClient
Create a type-safe upload client with **property-based access** and optional **per-route configuration**. This is the recommended approach for most projects.
**Enhanced in v2.0**: Now supports per-route callbacks, progress tracking, and error handling while maintaining superior type safety.
## Why Use This Approach?
* 🔒 **Superior Type Safety** - Route names validated at compile time
* 🎯 **Property-Based Access** - No string literals, full IntelliSense
* ⚡ **Per-Route Configuration** - Callbacks, endpoints, and options per route
* 🏗️ **Centralized Setup** - Single configuration for all routes
* 🛡️ **Refactoring Safety** - Rename routes safely across codebase
This utility function provides property-based access to your upload routes. You can also use the `useUploadRoute()` hook if you prefer traditional React patterns.
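For comparison, the hook-based equivalent of the same route looks like this:
```typescript
import { useUploadRoute } from 'pushduck/client'

// Same route, consumed via the traditional hook API
const { uploadFiles, files, isUploading } = useUploadRoute('imageUpload')
```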
## Basic Setup
**Create the upload client**
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload'
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload'
})
```
**Use in components**
```typescript title="components/upload-form.tsx"
import { upload } from '@/lib/upload-client'
export function UploadForm() {
const { uploadFiles, files, isUploading } = upload.imageUpload()
return (
  <input
    type="file"
    multiple
    onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
    disabled={isUploading}
  />
)
}
```
## Configuration Options
`createUploadClient` accepts the following options:

| Option | Type | Description |
| --- | --- | --- |
| `endpoint` | `string` | Base endpoint that serves all upload routes |
| `fetcher` | `(input, init) => Promise<Response>` | Custom fetch function, e.g. to attach auth headers |
| `defaultOptions` | `object` | Default callbacks and options applied to every route |
## Per-Route Configuration
Each route method now accepts optional configuration:
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `onSuccess` | `(results) => void` | - | Callback when upload completes successfully |
| `onError` | `(error: Error) => void` | - | Callback when upload fails |
| `onProgress` | `(progress: number) => void` | - | Callback for progress updates |
| `endpoint` | `string` | - | Override endpoint for this specific route |
| `disabled` | `boolean` | `false` | Disable uploads for this route |
| `autoUpload` | `boolean` | `true` | Automatically start upload when files are selected |
## Examples
```typescript
import { upload } from '@/lib/upload-client'
export function BasicUpload() {
// Simple usage - no configuration needed
const { uploadFiles, files, isUploading, reset } = upload.imageUpload()
return (
  <div>
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
    {files.map(file => (
      <div key={file.name}>
        {file.name}
        {file.status === 'success' && ' ✅'}
      </div>
    ))}
    <button onClick={reset}>Reset</button>
  </div>
)
}
```
```typescript
import { useState } from 'react'
import { upload } from '@/lib/upload-client'
import { toast } from 'sonner'
export function CallbackUpload() {
const [progress, setProgress] = useState(0)
const { uploadFiles, files, isUploading } = upload.imageUpload({
onSuccess: (results) => {
toast.success(`✅ Uploaded ${results.length} images!`)
console.log('Upload results:', results)
},
onError: (error) => {
toast.error(`❌ Upload failed: ${error.message}`)
console.error('Upload error:', error)
},
onProgress: (progress) => {
setProgress(progress)
console.log(`📊 Progress: ${progress}%`)
}
})
return (
  <div>
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
    {isUploading && <progress value={progress} max={100} />}
  </div>
)
}
```
```typescript
export function MultiUploadForm() {
// Different configuration for each upload type
const images = upload.imageUpload({
onSuccess: (results) => updateImageGallery(results)
})
const documents = upload.documentUpload({
onSuccess: (results) => updateDocumentLibrary(results),
onError: (error) => logSecurityError(error)
})
const videos = upload.videoUpload({
onProgress: (progress) => setVideoProgress(progress),
onSuccess: (results) => processVideoThumbnails(results)
})
return (
  <div>
    {/* Render an upload input for each of images, documents, and videos */}
  </div>
)
}
```
```typescript
// Global configuration with per-route overrides
const upload = createUploadClient({
endpoint: '/api/upload',
// These apply to all routes by default
defaultOptions: {
onProgress: (progress) => updateGlobalProgress(progress),
onError: (error) => logError(error)
}
})
export function MixedConfigUpload() {
// Inherits global onProgress and onError
const basic = upload.imageUpload()
// Overrides global settings + adds success handler
const premium = upload.documentUpload({
endpoint: '/api/premium-upload', // Different endpoint
onSuccess: (results) => {
// This overrides global behavior
handlePremiumUpload(results)
}
// Still inherits global onProgress and onError
})
return (
  <div>
    {/* Render the basic and premium upload inputs */}
  </div>
)
}
```
```typescript
const upload = createUploadClient({
endpoint: '/api/upload',
// Custom fetch function
fetcher: async (input, init) => {
const token = await getAuthToken()
return fetch(input, {
...init,
headers: {
...init?.headers,
'Authorization': `Bearer ${token}`
}
})
},
defaultOptions: {
onError: (error) => {
// Global error tracking
analytics.track('upload_error', { error: error.message })
toast.error('Upload failed. Please try again.')
}
}
})
export function AdvancedUpload() {
const { uploadFiles, files } = upload.secureUpload({
endpoint: '/api/secure-upload',
disabled: !user.hasPermission('upload'),
onSuccess: (results) => {
// Audit log for secure uploads
auditLog('secure_upload_success', {
files: results.length,
user: user.id
})
}
})
return <div>{/* secure upload UI */}</div>
}
```
## Type Safety Benefits
The structured client provides superior TypeScript integration:
```typescript
const upload = createUploadClient<AppRouter>({ endpoint: '/api/upload' })

// ✅ IntelliSense shows available routes
upload.imageUpload() // Autocomplete suggests this
upload.documentUpload() // And this
upload.videoUpload() // And this

// ❌ TypeScript error for non-existent routes
upload.invalidRoute() // Error: Property 'invalidRoute' does not exist

// ✅ Route rename safety
// If you rename 'imageUpload' to 'photoUpload' in your router,
// TypeScript will show errors everywhere it's used, making refactoring safe

// ✅ Callback type inference
upload.imageUpload({
onSuccess: (results) => {
// `results` is fully typed based on your router configuration
results.forEach(result => {
console.log(result.url) // TypeScript knows this exists
console.log(result.key) // And this
})
}
})
```
## Comparison with Hooks
| Feature | Enhanced Structured Client | Hook-Based |
| ---------------- | ---------------------------------- | ---------------------- |
| Type Safety | ✅ **Superior** - Property-based | ✅ Good - Generic types |
| IntelliSense | ✅ **Full route autocomplete** | ⚠️ String-based routes |
| Refactoring | ✅ **Safe rename across codebase** | ⚠️ Manual find/replace |
| Callbacks | ✅ **Full support** | ✅ Full support |
| Per-route Config | ✅ **Full support** | ✅ Full support |
| Bundle Size | ✅ **Same** | ✅ Same |
| Performance | ✅ **Identical** | ✅ Identical |
## Migration from Hooks
Easy migration from hook-based approach:
```typescript
// Before: Hook-based
import { useUploadRoute } from 'pushduck/client'
const { uploadFiles, files } = useUploadRoute('imageUpload', {
onSuccess: handleSuccess,
onError: handleError
})
// After: Enhanced structured client
import { upload } from '@/lib/upload-client'
const { uploadFiles, files } = upload.imageUpload({
onSuccess: handleSuccess,
onError: handleError
})
```
Benefits of migration:
* 🎯 **Better type safety** - Route names validated at compile time
* 🔍 **Enhanced IntelliSense** - Autocomplete for all routes
* 🏗️ **Centralized config** - Single place for endpoint and defaults
* 🛡️ **Refactoring safety** - Rename routes safely
* ⚡ **Same performance** - Zero runtime overhead
***
**Recommended Approach**: Use `createUploadClient` for the best developer experience with full flexibility and type safety.
# Production Checklist
URL: /docs/guides/going-live/production-checklist
Essential checklist for deploying pushduck to production safely
***
title: Production Checklist
description: Essential checklist for deploying pushduck to production safely
----------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Production Checklist
Ensure your file upload system is production-ready with this comprehensive checklist.
**Critical:** Never deploy file upload functionality to production without
proper security, monitoring, and error handling. Follow this checklist to
avoid common pitfalls.
## Security Checklist
**Authentication & Authorization**
* [ ] Authentication middleware is implemented on all upload routes
* [ ] Role-based access control (RBAC) is configured
* [ ] API endpoints are protected with proper authorization
* [ ] Test authentication flows work correctly
```typescript
// ✅ Secure endpoint
import { s3 } from "@/lib/upload";
const s3Router = s3.createRouter({
userFiles: s3.image()
.max("5MB")
.count(10)
.middleware(async ({ req, metadata }) => {
const session = await getServerSession(req)
if (!session) throw new Error("Authentication required")
return { ...metadata, userId: session.user.id }
})
})
export const { GET, POST } = s3Router.handlers;
```
**Environment Variables**
* [ ] All sensitive data is in environment variables (never in code)
* [ ] AWS credentials are properly configured
* [ ] JWT secrets are using strong, unique values
* [ ] Database connection strings are secure
```bash
# ✅ Required environment variables
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=us-east-1
AWS_S3_BUCKET_NAME=your-bucket
JWT_SECRET=your_strong_jwt_secret
NEXTAUTH_SECRET=your_nextauth_secret
```
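A startup guard that fails fast on missing variables catches misconfigured deployments early. A minimal sketch; the `lib/env.ts` location is just a suggestion:
```typescript
// lib/env.ts - fail fast if required configuration is missing
const required = [
  'AWS_ACCESS_KEY_ID',
  'AWS_SECRET_ACCESS_KEY',
  'AWS_REGION',
  'AWS_S3_BUCKET_NAME',
] as const

for (const name of required) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
}
```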
**Input Validation**
* [ ] File type validation is implemented
* [ ] File size limits are enforced
* [ ] File count limits are set
* [ ] Malicious file detection is in place
```typescript
// ✅ Comprehensive validation
userPhotos: s3.image()
.max("10MB")
.count(5)
.formats(["jpeg", "png", "webp"])
```
**Rate Limiting**
* [ ] Upload rate limiting is configured
* [ ] Per-user quotas are implemented
* [ ] IP-based rate limiting is active
* [ ] Burst protection is in place
```typescript
import { Ratelimit } from "@upstash/ratelimit"
import { Redis } from "@upstash/redis"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(10, "1 m"), // 10 uploads per minute
})
```
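The limiter is then applied inside your upload middleware. A sketch assuming the `ratelimit` instance above and a `getUserId` auth helper:
```typescript
// Sketch: enforce the limit before accepting an upload
.middleware(async ({ req, metadata }) => {
  const userId = await getUserId(req) // your auth helper
  const { success } = await ratelimit.limit(userId)
  if (!success) {
    throw new Error("Rate limit exceeded - try again later")
  }
  return { ...metadata, userId }
})
```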
## Infrastructure Checklist
**AWS S3 Configuration**
* [ ] S3 bucket is created with proper naming convention
* [ ] Bucket policies are configured correctly
* [ ] CORS is set up for your domain
* [ ] Public access is blocked unless specifically needed
* [ ] Versioning is enabled for important buckets
```json
// ✅ Secure CORS configuration
{
"CORSRules": [
{
"AllowedOrigins": ["https://yourdomain.com"],
"AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
"AllowedHeaders": ["*"],
"ExposeHeaders": ["ETag"]
}
]
}
```
**CDN & Performance**
* [ ] CloudFront distribution is configured for faster delivery
* [ ] Cache headers are set appropriately
* [ ] Image optimization is enabled
* [ ] Compression is configured
```typescript
// ✅ Cache headers configuration
const s3Router = s3.createRouter({
publicImages: s3.image()
.max("5MB")
.count(10)
.onUploadComplete(async ({ files }) => {
// Set cache headers for uploaded files
await setCacheHeaders(files, {
maxAge: 31536000, // 1 year
immutable: true
})
})
})
```
**Backup & Recovery**
* [ ] S3 cross-region replication is configured
* [ ] Backup strategy is documented and tested
* [ ] Point-in-time recovery is available
* [ ] Data retention policies are defined
```typescript
// ✅ Backup configuration example
const backupConfig = {
enabled: process.env.NODE_ENV === 'production',
replicationBucket: process.env.AWS_BACKUP_BUCKET,
retentionDays: 30,
crossRegion: true
}
```
## Performance Checklist
* [ ] Multipart uploads are enabled for large files
* [ ] Parallel upload chunks are configured
* [ ] Upload progress tracking is working
* [ ] Network error retry logic is implemented
```typescript
export const upload = createUploadClient({
endpoint: '/api/upload',
chunkSize: 5 * 1024 * 1024, // 5MB chunks
maxRetries: 3,
retryDelay: 1000,
parallel: true
})
```
* [ ] Database indexes are created for file queries
* [ ] Connection pooling is configured
* [ ] Query performance is monitored
* [ ] Database backup strategy is in place
```sql
-- ✅ Essential indexes
CREATE INDEX idx_files_user_id ON files(user_id);
CREATE INDEX idx_files_created_at ON files(created_at);
CREATE INDEX idx_files_status ON files(status);
```
* [ ] Redis/memory cache for frequently accessed data
* [ ] CDN caching for static files
* [ ] Application-level caching for metadata
* [ ] Cache invalidation strategy is defined
```typescript
import { cache } from '@/lib/cache'
export async function getUserFiles(userId: string) {
return cache.get(`user:${userId}:files`, async () => {
return await db.files.findMany({ where: { userId } })
}, { ttl: 300 }) // 5 minutes
}
```
## Monitoring Checklist
**Error Tracking**
* [ ] Error tracking service is integrated (Sentry, LogRocket, etc.)
* [ ] Upload errors are properly logged
* [ ] Error notifications are configured
* [ ] Error rate thresholds are set
```typescript
import * as Sentry from "@sentry/nextjs"
export const router = createUploadRouter({
userFiles: uploadSchema({
image: { maxSize: "5MB", maxCount: 10 }
}).onError(async ({ error, context }) => {
Sentry.captureException(error, {
tags: {
route: 'userFiles',
userId: context.userId
}
})
})
})
```
**Performance Monitoring**
* [ ] Upload speed metrics are tracked
* [ ] Server response times are monitored
* [ ] Resource usage is measured
* [ ] Uptime monitoring is active
```typescript
// ✅ Performance tracking
export const router = createUploadRouter({
userFiles: uploadSchema({
image: { maxSize: "5MB", maxCount: 10 }
}).beforeUpload(async ({ context }) => {
context.startTime = Date.now()
}).afterUpload(async ({ context, uploadedFiles }) => {
const duration = Date.now() - context.startTime
await trackMetric('upload_duration', duration, {
userId: context.userId,
fileCount: uploadedFiles.length
})
})
})
```
**Security Monitoring**
* [ ] Failed authentication attempts are logged
* [ ] Suspicious upload patterns are detected
* [ ] Rate limit violations are tracked
* [ ] Security alerts are configured
```typescript
import { auditLog } from '@/lib/audit'
export const router = createUploadRouter({
userFiles: uploadSchema({
image: { maxSize: "5MB", maxCount: 10 }
}).middleware(async ({ req }) => {
const session = await getServerSession(req)
await auditLog('upload_attempt', {
userId: session?.user?.id || 'anonymous',
ip: req.headers.get('x-forwarded-for'),
userAgent: req.headers.get('user-agent'),
timestamp: new Date()
})
if (!session) {
await auditLog('auth_failure', {
ip: req.headers.get('x-forwarded-for'),
endpoint: '/api/upload'
})
throw new Error("Authentication required")
}
return { userId: session.user.id }
})
})
```
## Deployment Checklist
**CI/CD Pipeline**
* [ ] Automated tests are passing
* [ ] Build process is successful
* [ ] Environment variables are configured
* [ ] Database migrations are applied
```yaml
# ✅ GitHub Actions example
name: Deploy to Production
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Run tests
run: npm test
- name: Build application
run: npm run build
- name: Deploy to production
run: npm run deploy
```
**Health Checks**
* [ ] Upload endpoint health check is implemented
* [ ] Database connectivity check is working
* [ ] S3 connectivity check is active
* [ ] Overall system health endpoint exists
```typescript
// app/api/health/route.ts
export async function GET() {
const checks = await Promise.allSettled([
checkDatabase(),
checkS3Connection(),
checkRedisConnection()
])
const isHealthy = checks.every(check => check.status === 'fulfilled')
return Response.json({
status: isHealthy ? 'healthy' : 'unhealthy',
checks: checks.map((check, i) => ({
name: ['database', 's3', 'redis'][i],
status: check.status
}))
}, { status: isHealthy ? 200 : 503 })
}
```
**Load Testing**
* [ ] Upload endpoints are load tested
* [ ] Concurrent user limits are determined
* [ ] Resource scaling thresholds are set
* [ ] Performance baselines are established
```javascript
// k6 load testing script example
import http from 'k6/http';
import { check } from 'k6';
export let options = {
vus: 10,
duration: '30s',
};
export default function() {
let response = http.post('https://yourapp.com/api/upload', {
file: http.file(open('./test-image.jpg', 'b'), 'test.jpg')
});
check(response, {
'status is 200': (r) => r.status === 200,
'upload successful': (r) => r.json('success') === true,
});
}
```
## Documentation Checklist
* [ ] **API Documentation** - All endpoints are documented with examples
* [ ] **Deployment Guide** - Step-by-step deployment instructions
* [ ] **Troubleshooting Guide** - Common issues and solutions
* [ ] **Security Documentation** - Security measures and best practices
* [ ] **Monitoring Runbook** - How to respond to alerts and incidents
## Legal & Compliance
* [ ] **Terms of Service** - File upload terms are clearly defined
* [ ] **Privacy Policy** - Data handling practices are documented
* [ ] **GDPR Compliance** - Data retention and deletion policies
* [ ] **Content Moderation** - Guidelines for acceptable content
* [ ] **Copyright Protection** - DMCA takedown procedures
## Final Verification
**Pre-Launch Testing**: Test all upload flows with real data in a staging
environment that mirrors production exactly.
### Critical Tests
1. **Authentication Flow**
* [ ] Test login/logout with upload permissions
* [ ] Verify unauthorized access is blocked
* [ ] Test token expiration handling
2. **Upload Functionality**
* [ ] Test single file uploads
* [ ] Test multiple file uploads
* [ ] Test large file uploads
* [ ] Test network interruption recovery
3. **Error Handling**
* [ ] Test file size limit enforcement
* [ ] Test invalid file type rejection
* [ ] Test quota limit enforcement
* [ ] Test server error responses
4. **Performance**
* [ ] Measure upload speeds under load
* [ ] Test concurrent user scenarios
* [ ] Verify CDN delivery performance
* [ ] Check mobile device compatibility
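Several of these checks can be automated against the health endpoint shown earlier. A minimal post-deploy smoke test sketch (run with `tsx` or similar):
```typescript
// smoke-test.ts - quick post-deploy verification
const res = await fetch('https://yourapp.com/api/health')
const body = await res.json()

if (res.status !== 200 || body.status !== 'healthy') {
  console.error('Health check failed:', body)
  process.exit(1)
}
console.log('All systems healthy:', body.checks)
```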
***
**Ready for Production**: Once all checklist items are complete, your file
upload system is ready for production deployment. Monitor closely during the
first few days and be prepared to respond quickly to any issues.
# Enhanced Client Migration
URL: /docs/guides/migration/enhanced-client
Migrate from hook-based API to property-based client access for better type safety
***
title: Enhanced Client Migration
description: Migrate from hook-based API to property-based client access for better type safety
-----------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";
# Enhanced Client Migration
Upgrade to the new property-based client API for enhanced type safety, better developer experience, and elimination of string literals.
The enhanced client API is **100% backward compatible**. You can migrate
gradually without breaking existing code.
## Why Migrate?
```typescript
// ❌ Old: String literals, no type safety
const { uploadFiles } = useUploadRoute("imageUpload")

// ✅ New: Property-based, full type inference
const { uploadFiles } = upload.imageUpload
```
```typescript
// ✅ Autocomplete shows all your endpoints
upload.
// imageUpload, documentUpload, videoUpload...
// ^ No more guessing endpoint names
```
```typescript
// When you rename routes in your router,
// TypeScript shows errors everywhere they're used
// Making refactoring safe and easy
```
## Migration Steps
**Install Latest Version**
Ensure you're using the latest version of pushduck:
```bash
npm install pushduck@latest
```
```bash
yarn add pushduck@latest
```
```bash
pnpm add pushduck@latest
```
```bash
bun add pushduck@latest
```
**Create Upload Client**
Set up your typed upload client:
```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload' // Your router type
export const upload = createUploadClient<AppRouter>({
endpoint: '/api/upload'
})
```
**Migrate Components Gradually**
Update your components one by one:
```typescript
import { useUploadRoute } from 'pushduck/client'
export function ImageUploader() {
const { uploadFiles, files, isUploading } = useUploadRoute('imageUpload')
return (
  <div>
    <input type="file" onChange={(e) => uploadFiles(e.target.files)} />
    {/* Upload UI */}
  </div>
)
}
```
```typescript
import { upload } from '@/lib/upload-client'
export function ImageUploader() {
const { uploadFiles, files, isUploading } = upload.imageUpload
return (
  <div>
    <input type="file" onChange={(e) => uploadFiles(e.target.files)} />
    {/* Same upload UI */}
  </div>
)
}
```
**Update Imports**
Once migrated, you can remove old hook imports:
```typescript
// Remove old imports
// import { useUploadRoute } from 'pushduck/client'
// Use new client import
import { upload } from '@/lib/upload-client'
```
## Migration Examples
### Basic Component Migration
```typescript
import { useUploadRoute } from 'pushduck/client'
export function DocumentUploader() {
const {
uploadFiles,
files,
isUploading,
error,
reset
} = useUploadRoute('documentUpload', {
onSuccess: (results) => {
console.log('Uploaded:', results)
},
onError: (error) => {
console.error('Error:', error)
}
})
return (
  <div>
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
    {files.map(file => (
      <div key={file.name}>{file.name}</div>
    ))}
    {error && <p>Error: {error.message}</p>}
    <button onClick={reset}>Reset</button>
  </div>
)
}
```
```typescript
import { upload } from '@/lib/upload-client'
export function DocumentUploader() {
const {
uploadFiles,
files,
isUploading,
error,
reset
} = upload.documentUpload
// Handle callbacks with upload options
const handleUpload = async (selectedFiles: File[]) => {
try {
const results = await uploadFiles(selectedFiles)
console.log('Uploaded:', results)
} catch (error) {
console.error('Error:', error)
}
}
return (
  <div>
    <input
      type="file"
      multiple
      onChange={(e) => handleUpload(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
    {files.map(file => (
      <div key={file.name}>{file.name}</div>
    ))}
    {error && <p>Error: {error.message}</p>}
    <button onClick={reset}>Reset</button>
  </div>
)
}
```
### Form Integration Migration
```typescript
import { useForm } from 'react-hook-form'
import { useUploadRoute } from 'pushduck/client'
export function ProductForm() {
const { register, handleSubmit, setValue } = useForm()
const { uploadFiles, uploadedFiles } = useUploadRoute('productImages', {
onSuccess: (results) => {
setValue('images', results.map(r => r.url))
}
})
return (
  <form>
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
    />
    <button type="submit">Save Product</button>
  </form>
)
}
```
```typescript
import { useForm } from 'react-hook-form'
import { upload } from '@/lib/upload-client'
export function ProductForm() {
const { register, handleSubmit, setValue } = useForm()
const { uploadFiles } = upload.productImages
const handleImageUpload = async (files: File[]) => {
const results = await uploadFiles(files)
setValue('images', results.map(r => r.url))
}
return (
  <form>
    <input
      type="file"
      multiple
      onChange={(e) => handleImageUpload(Array.from(e.target.files || []))}
    />
    <button type="submit">Save Product</button>
  </form>
)
}
```
### Multiple Upload Types Migration
```typescript
export function MediaUploader() {
const images = useUploadRoute('imageUpload')
const videos = useUploadRoute('videoUpload')
const documents = useUploadRoute('documentUpload')
return (
  <div>
    {/* Render uploaders for images, videos, and documents */}
  </div>
)
}
```
```typescript
import { upload } from '@/lib/upload-client'
export function MediaUploader() {
const images = upload.imageUpload
const videos = upload.videoUpload
const documents = upload.documentUpload
return (
  <div>
    {/* Render uploaders for images, videos, and documents */}
  </div>
)
}
```
## Key Differences
### API Comparison
| Feature | Hook-Based API | Property-Based API |
| ------------------ | ------------------------- | --------------------------- |
| **Type Safety** | Runtime string validation | Compile-time type checking |
| **IntelliSense** | Limited autocomplete | Full endpoint autocomplete |
| **Refactoring** | Manual find/replace | Automatic TypeScript errors |
| **Bundle Size** | Slightly larger | Optimized tree-shaking |
| **Learning Curve** | Familiar React pattern | New property-based pattern |
### Callback Handling
```typescript
const { uploadFiles } = useUploadRoute('images', {
onSuccess: (results) => console.log('Success:', results),
onError: (error) => console.error('Error:', error),
onProgress: (progress) => console.log('Progress:', progress)
})
```
```typescript
const { uploadFiles } = upload.images
await uploadFiles(files, {
onSuccess: (results) => console.log('Success:', results),
onError: (error) => console.error('Error:', error),
onProgress: (progress) => console.log('Progress:', progress)
})
```
## Troubleshooting
### Common Migration Issues
**Type Errors:** If you see TypeScript errors after migration, ensure your
router type is properly exported and imported.
```typescript
// ❌ Missing router type
export const upload = createUploadClient({
  endpoint: "/api/upload",
});

// ✅ With proper typing
import type { AppRouter } from "./upload";
export const upload = createUploadClient<AppRouter>({
  endpoint: "/api/upload",
});
```
### Gradual Migration Strategy
You can use both APIs simultaneously during migration:
```typescript
// Keep existing hook-based components working
const hookUpload = useUploadRoute("imageUpload");
// Use new property-based API for new components
const propertyUpload = upload.imageUpload;
// Both work with the same backend!
```
## Benefits After Migration
* **🎯 Enhanced Type Safety**: Catch errors at compile time, not runtime
* **🚀 Better Performance**: Optimized bundle size with tree-shaking
* **💡 Improved DX**: Full IntelliSense support for all endpoints
* **🔧 Safe Refactoring**: Rename endpoints without breaking your app
* **📦 Future-Proof**: Built for the next generation of pushduck features
***
**Migration Complete!** You now have enhanced type safety and a better
developer experience. Need help? Join our [Discord
community](https://discord.gg/pushduck) for support.
# Authentication & Authorization
URL: /docs/guides/security/authentication
Secure your file uploads with proper authentication and authorization patterns
***
title: Authentication & Authorization
description: Secure your file uploads with proper authentication and authorization patterns
-------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { TypeTable } from "fumadocs-ui/components/type-table";
# Authentication & Authorization
Secure your file upload endpoints with robust authentication and authorization middleware.
**Important:** Never expose upload endpoints without proper authentication in
production. Unprotected endpoints can lead to storage abuse and security
vulnerabilities.
## Authentication Patterns
### NextAuth.js Integration
```typescript
import { s3 } from "@/lib/upload";
import { getServerSession } from "next-auth";
import { authOptions } from "@/lib/auth";
const s3Router = s3.createRouter({
userFiles: s3.image()
.max("5MB")
.count(10)
.middleware(async ({ req, metadata }) => {
const session = await getServerSession(authOptions);
if (!session?.user?.id) {
throw new Error("Authentication required");
}
return {
...metadata,
userId: session.user.id,
userEmail: session.user.email,
};
}),
});
export const { GET, POST } = s3Router.handlers;
```
### JWT Token Validation
```typescript
import jwt from "jsonwebtoken";
const s3Router = s3.createRouter({
protectedUploads: s3.file()
.max("10MB")
.count(5)
.middleware(async ({ req, metadata }) => {
const token = req.headers.get("authorization")?.replace("Bearer ", "");
if (!token) {
throw new Error("Authorization token required");
}
try {
const payload = jwt.verify(token, process.env.JWT_SECRET!) as any;
return {
...metadata,
userId: payload.sub,
roles: payload.roles || [],
};
} catch (error) {
throw new Error("Invalid or expired token");
}
}),
});
```
### Custom Authentication
```typescript
const s3Router = s3.createRouter({
apiKeyUploads: s3.file()
.max("25MB")
.count(1)
.types(['application/pdf', 'application/msword'])
.middleware(async ({ req, metadata }) => {
const apiKey = req.headers.get("x-api-key");
if (!apiKey) {
throw new Error("API key required");
}
// Validate API key against your database
const client = await validateApiKey(apiKey);
if (!client) {
throw new Error("Invalid API key");
}
return {
...metadata,
clientId: client.id,
plan: client.plan,
quotaUsed: client.quotaUsed,
};
}),
});
```
## Authorization Strategies
### Role-Based Access Control (RBAC)
```typescript
const s3Router = s3.createRouter({
adminUploads: s3.file()
.max("100MB")
.count(50)
.middleware(async ({ req, metadata }) => {
const { userId, roles } = await authenticateUser(req);
if (!roles.includes("admin")) {
throw new Error("Admin access required");
}
return { ...metadata, userId, roles };
}),
moderatorUploads: s3.image()
.max("10MB")
.count(20)
.middleware(async ({ req, metadata }) => {
const { userId, roles } = await authenticateUser(req);
if (!roles.includes("admin") && !roles.includes("moderator")) {
throw new Error("Moderator access required");
}
return { ...metadata, userId, roles };
}),
userUploads: s3.image()
.max("5MB")
.count(5)
.middleware(async ({ req, metadata }) => {
const { userId } = await authenticateUser(req);
// Basic authentication only
return { ...metadata, userId };
}),
});
```
### Resource-Based Authorization
```typescript
const s3Router = s3.createRouter({
projectFiles: s3.file()
.max("25MB")
.count(10)
.middleware(async ({ req, metadata }) => {
const { userId } = await authenticateUser(req);
const projectId = new URL(req.url).searchParams.get("projectId");
if (!projectId) {
throw new Error("Project ID required");
}
// Check if user has access to this project
const hasAccess = await checkProjectAccess(userId, projectId);
if (!hasAccess) {
throw new Error("Access denied to this project");
}
return { ...metadata, userId, projectId };
}),
});
```
### Attribute-Based Access Control (ABAC)
```typescript
interface AccessContext {
userId: string;
userRole: string;
resourceType: string;
action: string;
environment: string;
}
async function checkAccess(context: AccessContext): Promise<boolean> {
// Complex policy evaluation
const policies = await getPolicies(context.userId);
return policies.some(
(policy) =>
policy.resource === context.resourceType &&
policy.actions.includes(context.action) &&
policy.environment.includes(context.environment)
);
}
export const router = createUploadRouter({
sensitiveFiles: uploadSchema({
document: { maxSize: "10MB", maxCount: 1 },
}).middleware(async ({ req }) => {
const { userId, role } = await authenticateUser(req);
const hasAccess = await checkAccess({
userId,
userRole: role,
resourceType: "sensitive-document",
action: "upload",
environment: process.env.NODE_ENV || "development",
});
if (!hasAccess) {
throw new Error("Access denied by policy");
}
return { userId, role };
}),
});
```
## Security Best Practices
```typescript
.middleware(async ({ req, files }) => {
for (const file of files) {
// Validate file headers
if (!await isValidFileType(file)) {
throw new Error("Invalid file type")
}
// Check for malicious content
if (await containsMalware(file)) {
throw new Error("File contains malicious content")
}
}
return { userId: await getUserId(req) }
})
```
```typescript
import { ratelimit } from '@/lib/ratelimit'
.middleware(async ({ req }) => {
const identifier = await getUserId(req) || getClientIP(req)
const { success } = await ratelimit.limit(identifier)
if (!success) {
throw new Error("Rate limit exceeded")
}
return { userId: identifier }
})
```
```typescript
.middleware(async ({ req, files }) => {
const userId = await getUserId(req)
const totalSize = files.reduce((sum, f) => sum + f.size, 0)
const quota = await getUserQuota(userId)
if (quota.used + totalSize > quota.limit) {
throw new Error("Storage quota exceeded")
}
return { userId }
})
```
## Client-Side Security
### Secure Token Handling
```typescript
// lib/upload-client.ts
export const upload = createUploadClient({
endpoint: "/api/upload",
headers: {
// Use secure token storage
Authorization: `Bearer ${getSecureToken()}`,
},
onError: (error) => {
if (error.status === 401) {
// Handle token expiration
refreshToken().then(() => {
// Retry the upload with new token
window.location.reload();
});
}
},
});
// Secure token storage
function getSecureToken(): string {
// Use httpOnly cookies or secure storage
return (
document.cookie
.split("; ")
.find((row) => row.startsWith("auth-token="))
?.split("=")[1] || ""
);
}
```
### CSRF Protection
```typescript
// Server-side CSRF validation
import { csrf } from "@/lib/csrf";
export const router = createUploadRouter({
protectedUploads: uploadSchema({
image: { maxSize: "5MB", maxCount: 10 },
}).middleware(async ({ req }) => {
// Validate CSRF token
const csrfToken = req.headers.get("x-csrf-token");
if (!csrf.verify(csrfToken)) {
throw new Error("Invalid CSRF token");
}
return { userId: await getUserId(req) };
}),
});
// Client-side CSRF token
export const upload = createUploadClient({
endpoint: "/api/upload",
headers: {
"X-CSRF-Token": getCsrfToken(),
},
});
```
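`getCsrfToken` is app-specific. One common pattern reads the token from a server-rendered meta tag; a hypothetical sketch:
```typescript
// Hypothetical helper: read the CSRF token from a server-rendered meta tag
function getCsrfToken(): string {
  return (
    document
      .querySelector('meta[name="csrf-token"]')
      ?.getAttribute("content") || ""
  );
}
```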
## Environment-Specific Security
```typescript
export const router = createUploadRouter({
devUploads: uploadSchema({
any: { maxSize: "100MB", maxCount: 100 }
}).middleware(async ({ req }) => {
if (process.env.NODE_ENV !== 'development') {
throw new Error("Development endpoint only")
}
// Relaxed auth for development
const userId = req.headers.get('x-dev-user-id') || 'dev-user'
return { userId }
})
})
```
```typescript
export const router = createUploadRouter({
stagingUploads: uploadSchema({
any: { maxSize: "50MB", maxCount: 20 }
}).middleware(async ({ req }) => {
// Basic auth for staging
const token = req.headers.get('authorization')
if (!token || !await validateStagingToken(token)) {
throw new Error("Invalid staging credentials")
}
return { userId: 'staging-user' }
})
})
```
```typescript
export const router = createUploadRouter({
prodUploads: uploadSchema({
any: { maxSize: "25MB", maxCount: 10 }
}).middleware(async ({ req }) => {
// Full security stack for production
const session = await getServerSession(req)
if (!session) throw new Error("Authentication required")
const ip = req.headers.get('x-forwarded-for')
await checkIPWhitelist(ip)
const { success } = await ratelimit.limit(session.user.id)
if (!success) throw new Error("Rate limit exceeded")
await auditLog('file_upload_attempt', {
userId: session.user.id,
ip,
timestamp: new Date()
})
return {
userId: session.user.id,
auditId: generateAuditId()
}
})
})
```
## Security Middleware Examples
### Multi-Factor Authentication
```typescript
export const router = createUploadRouter({
sensitiveUploads: uploadSchema({
document: { maxSize: "10MB", maxCount: 1 },
}).middleware(async ({ req }) => {
const { userId } = await authenticateUser(req);
const mfaToken = req.headers.get("x-mfa-token");
if (!mfaToken) {
throw new Error("MFA token required for sensitive uploads");
}
const isValidMFA = await verifyMFAToken(userId, mfaToken);
if (!isValidMFA) {
throw new Error("Invalid MFA token");
}
return { userId, mfaVerified: true };
}),
});
```
### IP Whitelisting
```typescript
export const router = createUploadRouter({
restrictedUploads: uploadSchema({
any: { maxSize: "50MB", maxCount: 5 },
}).middleware(async ({ req }) => {
const clientIP =
req.headers.get("x-forwarded-for") ||
req.headers.get("x-real-ip") ||
"unknown";
const allowedIPs = process.env.ALLOWED_IPS?.split(",") || [];
if (!allowedIPs.includes(clientIP)) {
throw new Error(`Access denied for IP: ${clientIP}`);
}
return { userId: await getUserId(req), clientIP };
}),
});
```
### Content Scanning
```typescript
import { scanFile } from "@/lib/virus-scanner";
export const router = createUploadRouter({
scannedUploads: uploadSchema({
any: { maxSize: "25MB", maxCount: 10 },
}).middleware(async ({ req, files }) => {
// Scan all files for malware
for (const file of files) {
const scanResult = await scanFile(file);
if (scanResult.threat) {
await logSecurityEvent({
type: "malware_detected",
filename: file.name,
threat: scanResult.threat,
ip: req.headers.get("x-forwarded-for"),
});
throw new Error("File contains malicious content");
}
}
return { userId: await getUserId(req) };
}),
});
```
***
**Security First:** Always implement multiple layers of security.
Authentication, authorization, input validation, and monitoring work together
to protect your application.
# Image Uploads
URL: /docs/guides/uploads/images
Complete guide to handling image uploads with optimization, validation, and processing
***
title: Image Uploads
description: Complete guide to handling image uploads with optimization, validation, and processing
---------------------------------------------------------------------------------------------------
import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { TypeTable } from "fumadocs-ui/components/type-table";
import { Files, Folder, File } from "fumadocs-ui/components/files";
# Image Uploads
Handle image uploads with built-in optimization, validation, and processing features for the best user experience.
Images are the most common upload type. This guide covers everything from
basic setup to advanced optimization techniques for production apps.
## Basic Image Upload Setup
### Server Configuration
```typescript
// app/api/upload/route.ts
import { s3 } from "@/lib/upload";
const s3Router = s3.createRouter({
// Basic image upload
profilePicture: s3.image()
.max('5MB')
.count(1)
.formats(['jpeg', 'png', 'webp']),
// Multiple images with optimization
galleryImages: s3.image()
.max('10MB')
.count(10)
.formats(['jpeg', 'png', 'webp', 'gif']),
});
export type AppS3Router = typeof s3Router;
export const { GET, POST } = s3Router.handlers;
```
### Client Implementation
```typescript
// components/image-uploader.tsx
import { upload } from "@/lib/upload-client";
export function ImageUploader() {
const { uploadFiles, files, isUploading } = upload.galleryImages;
const handleImageSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
const selectedFiles = Array.from(e.target.files || []);
uploadFiles(selectedFiles);
};
return (
  <div>
    <input type="file" multiple accept="image/*" onChange={handleImageSelect} />
    {files.map((file) => (
      <div key={file.name}>
        {file.status === "success" && (
          <img src={file.url} alt={file.name} />
        )}
        {file.status === "uploading" && (
          <progress value={file.progress} max={100} />
        )}
      </div>
    ))}
  </div>
);
}
```
## Image Validation & Processing
### Format Validation
```typescript
const s3Router = s3.createRouter({
productImages: s3.image()
.max('8MB')
.count(5)
.formats(['jpeg', 'png', 'webp'])
.dimensions({
minWidth: 800,
maxWidth: 4000,
minHeight: 600,
maxHeight: 3000,
})
.aspectRatio(16 / 9, { tolerance: 0.1 })
.middleware(async ({ req, file, metadata }) => {
// Custom validation
const imageMetadata = await getImageMetadata(file);
if (
imageMetadata.hasTransparency &&
!["png", "webp"].includes(imageMetadata.format)
) {
throw new Error("Transparent images must be PNG or WebP format");
}
if (imageMetadata.colorProfile !== "sRGB") {
console.warn(
`Image ${file.name} uses ${imageMetadata.colorProfile} color profile`
);
}
return {
...metadata,
userId: await getUserId(req),
...imageMetadata
};
}),
});
```
### Image Optimization
```typescript
const s3Router = s3.createRouter({
optimizedImages: s3.image()
.max('15MB')
.count(10)
.formats(['jpeg', 'png', 'webp'])
.dimensions({ maxWidth: 1920, maxHeight: 1080 })
.onUploadComplete(async ({ file, url, metadata }) => {
// Generate multiple sizes
await generateImageVariants(file, [
{ name: "thumbnail", width: 150, height: 150, fit: "cover" },
{ name: "medium", width: 800, height: 600, fit: "inside" },
{ name: "large", width: 1920, height: 1080, fit: "inside" },
]);
}),
});
```
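`generateImageVariants` is left to your application. A minimal sketch using the `sharp` library; the storage helpers are hypothetical stubs:
```typescript
import sharp from "sharp";

// Hypothetical storage helpers - wire these to your provider
declare function downloadOriginal(key: string): Promise<Buffer>;
declare function uploadVariant(key: string, data: Buffer): Promise<void>;

interface Variant {
  name: string;
  width: number;
  height?: number;
  fit?: keyof sharp.FitEnum;
}

// Resize the original into each requested variant and store it
async function generateImageVariants(file: { key: string }, variants: Variant[]) {
  const original = await downloadOriginal(file.key);
  for (const variant of variants) {
    const resized = await sharp(original)
      .resize(variant.width, variant.height, { fit: variant.fit ?? "inside" })
      .toBuffer();
    await uploadVariant(`${file.key}-${variant.name}`, resized);
  }
}
```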
## Advanced Image Features
### Responsive Image Generation
```typescript
interface ImageVariant {
name: string;
width: number;
height?: number;
quality?: number;
format?: "jpeg" | "png" | "webp";
}
const imageVariants: ImageVariant[] = [
{ name: "thumbnail", width: 150, height: 150, quality: 80 },
{ name: "small", width: 400, quality: 85 },
{ name: "medium", width: 800, quality: 85 },
{ name: "large", width: 1200, quality: 85 },
{ name: "xlarge", width: 1920, quality: 90 },
];
const s3Router = s3.createRouter({
responsiveImages: s3.image()
.max('20MB')
.count(5)
.formats(['jpeg', 'png', 'webp'])
.onUploadComplete(async ({ file, url, metadata }) => {
// Generate responsive variants
const variants = await Promise.all(
imageVariants.map((variant) => generateImageVariant(file, variant))
);
// Save variant information to database
await saveImageVariants(file.key, variants, metadata.userId);
}),
});
// Client-side responsive image component
export function ResponsiveImage({
src,
alt,
sizes = "(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw",
}: {
src: string;
alt: string;
sizes?: string;
}) {
const variants = useImageVariants(src);
if (!variants) return <img src={src} alt={alt} />;
const srcSet = [
`${variants.small} 400w`,
`${variants.medium} 800w`,
`${variants.large} 1200w`,
`${variants.xlarge} 1920w`,
].join(", ");
return (
);
}
```
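`useImageVariants` is an assumed client-side hook that looks up the variant URLs saved in `onUploadComplete`. A minimal sketch, assuming a hypothetical `/api/image-variants` endpoint in your app:

```typescript
import { useEffect, useState } from "react";

type VariantMap = Record<string, string>; // e.g. { small: "...", medium: "..." }

// Hypothetical hook: fetches the variant URLs saved by onUploadComplete.
// The /api/image-variants endpoint is an assumption, not part of Pushduck.
export function useImageVariants(src: string): VariantMap | null {
  const [variants, setVariants] = useState<VariantMap | null>(null);

  useEffect(() => {
    let cancelled = false;
    fetch(`/api/image-variants?src=${encodeURIComponent(src)}`)
      .then((res) => (res.ok ? res.json() : null))
      .then((data) => {
        if (!cancelled) setVariants(data);
      })
      .catch(() => {
        if (!cancelled) setVariants(null);
      });
    return () => {
      cancelled = true;
    };
  }, [src]);

  return variants;
}
```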
### Image Upload with Crop & Preview
```typescript
import { useState } from 'react'
import { ImageCropper } from './image-cropper'
import { upload } from '@/lib/upload-client'

export function ImageUploadWithCrop() {
  const [selectedFile, setSelectedFile] = useState<File | null>(null)
  const [croppedImage, setCroppedImage] = useState<Blob | null>(null)
  const { uploadFiles, isUploading } = upload.profilePicture

  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0]
    if (file) setSelectedFile(file)
  }

  const handleCropComplete = (croppedBlob: Blob) => {
    setCroppedImage(croppedBlob)
  }

  const handleUpload = async () => {
    if (!croppedImage) return
    const file = new File([croppedImage], 'cropped-image.jpg', {
      type: 'image/jpeg'
    })
    await uploadFiles([file])
    // Reset state
    setSelectedFile(null)
    setCroppedImage(null)
  }

  return (
    <div>
      {!selectedFile && (
        <input type="file" accept="image/*" onChange={handleFileSelect} />
      )}
      {selectedFile && !croppedImage && (
        <ImageCropper image={selectedFile} onCropComplete={handleCropComplete} />
      )}
      {croppedImage && (
        <div>
          <button onClick={() => setCroppedImage(null)}>Recrop</button>
          <button onClick={handleUpload} disabled={isUploading}>
            {isUploading ? 'Uploading...' : 'Upload'}
          </button>
        </div>
      )}
    </div>
  )
}
```
```typescript
// components/image-cropper.tsx
import { useState, useRef, useCallback } from 'react'
import ReactCrop, { Crop, PixelCrop } from 'react-image-crop'
import 'react-image-crop/dist/ReactCrop.css'

interface ImageCropperProps {
  image: File
  aspectRatio?: number
  onCropComplete: (croppedBlob: Blob) => void
}

export function ImageCropper({
  image,
  aspectRatio = 1,
  onCropComplete
}: ImageCropperProps) {
  const imgRef = useRef<HTMLImageElement>(null)
  const [crop, setCrop] = useState<Crop>({
    unit: '%',
    x: 25,
    y: 25,
    width: 50,
    height: 50
  })
  const imageUrl = URL.createObjectURL(image)

  const getCroppedImage = useCallback(async (
    image: HTMLImageElement,
    crop: PixelCrop
  ): Promise<Blob> => {
    const canvas = document.createElement('canvas')
    const ctx = canvas.getContext('2d')!
    const scaleX = image.naturalWidth / image.width
    const scaleY = image.naturalHeight / image.height

    canvas.width = crop.width * scaleX
    canvas.height = crop.height * scaleY
    ctx.imageSmoothingQuality = 'high'

    ctx.drawImage(
      image,
      crop.x * scaleX,
      crop.y * scaleY,
      crop.width * scaleX,
      crop.height * scaleY,
      0,
      0,
      canvas.width,
      canvas.height
    )

    return new Promise(resolve => {
      canvas.toBlob(blob => resolve(blob!), 'image/jpeg', 0.9)
    })
  }, [])

  const handleCropComplete = useCallback(async (crop: PixelCrop) => {
    if (imgRef.current && crop.width && crop.height) {
      const croppedBlob = await getCroppedImage(imgRef.current, crop)
      onCropComplete(croppedBlob)
    }
  }, [getCroppedImage, onCropComplete])

  return (
    <ReactCrop
      crop={crop}
      aspect={aspectRatio}
      onChange={(c) => setCrop(c)}
      onComplete={handleCropComplete}
    >
      <img ref={imgRef} src={imageUrl} alt="Crop preview" />
    </ReactCrop>
  )
}
```
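One caveat with the cropper sketch above: `URL.createObjectURL(image)` runs on every render, and each object URL holds memory until released. A production component should memoize the URL and revoke it with `URL.revokeObjectURL` in a `useEffect` cleanup.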
```typescript
// Server-side image processing after upload
const s3Router = s3.createRouter({
  profilePicture: s3.image()
    .max('10MB')
    .count(1)
    .formats(['jpeg', 'png', 'webp'])
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Generate avatar sizes
      await Promise.all([
        generateImageVariant(file, {
          name: 'avatar-small',
          width: 32,
          height: 32,
          fit: 'cover',
          quality: 90
        }),
        generateImageVariant(file, {
          name: 'avatar-medium',
          width: 64,
          height: 64,
          fit: 'cover',
          quality: 90
        }),
        generateImageVariant(file, {
          name: 'avatar-large',
          width: 128,
          height: 128,
          fit: 'cover',
          quality: 95
        })
      ])

      // Update user profile with new avatar
      await updateUserAvatar(metadata.userId, {
        original: url,
        small: getVariantUrl(file.key, 'avatar-small'),
        medium: getVariantUrl(file.key, 'avatar-medium'),
        large: getVariantUrl(file.key, 'avatar-large')
      })
    })
})
```
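`generateImageVariant`, `updateUserAvatar`, and `getVariantUrl` are again application helpers. The URL helper is typically a one-liner that derives a public path from the original key; a sketch, with `CDN_BASE_URL` as an assumed environment variable:

```typescript
// Hypothetical helper: map an S3 key and variant name to a public URL.
// The key layout must match wherever generateImageVariant stores variants.
function getVariantUrl(key: string, variantName: string): string {
  return `${process.env.CDN_BASE_URL}/${key}/${variantName}`;
}
```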
## Image Upload Patterns
### Drag & Drop Image Gallery
```typescript
import { useDropzone } from "react-dropzone";
import { upload } from "@/lib/upload-client";

export function ImageGalleryUploader() {
  const { uploadFiles, files, isUploading } = upload.galleryImages;

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    accept: {
      "image/*": [".jpeg", ".jpg", ".png", ".webp", ".gif"],
    },
    maxFiles: 10,
    onDrop: (acceptedFiles) => {
      uploadFiles(acceptedFiles);
    },
  });

  const removeFile = (fileId: string) => {
    // Implementation to remove file from gallery
  };

  return (
    <div>
      <div {...getRootProps()} className="dropzone">
        <input {...getInputProps()} />
        {isDragActive ? (
          <p>Drop the images here...</p>
        ) : (
          <div>
            <p>Drag & drop images here, or click to select</p>
            <p>Up to 10 images, max 10MB each</p>
          </div>
        )}
      </div>

      {files.length > 0 && (
        <div className="image-grid">
          {files.map((file) => (
            <div key={file.id}>
              {file.status === "success" && (
                <>
                  <img src={file.url} alt={file.name} />
                  <button onClick={() => removeFile(file.id)}>×</button>
                </>
              )}
              {file.status === "uploading" && (
                <progress value={file.progress} max={100} />
              )}
              {file.status === "error" && (
                <div>
                  <span>⚠️ Upload failed</span>
                  <button onClick={() => uploadFiles([file.originalFile])}>
                    Retry
                  </button>
                </div>
              )}
            </div>
          ))}
        </div>
      )}
    </div>
  );
}
```
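Note that react-dropzone's `accept` and `maxFiles` options only filter files in the browser for a smoother experience; the `.formats()` and `.count()` rules on the `galleryImages` route still enforce the same limits on the server.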
### Image Upload with Metadata
```typescript
const s3Router = s3.createRouter({
  portfolioImages: s3.image()
    .max('15MB')
    .count(20)
    .formats(['jpeg', 'png', 'webp'])
    .middleware(async ({ req, file, metadata }) => {
      const { userId } = await authenticateUser(req);

      // Extract and validate metadata
      const imageMetadata = await extractImageMetadata(file);

      // Return enriched metadata
      return {
        ...metadata,
        userId,
        uploadedBy: userId,
        uploadedAt: new Date(),
        originalFilename: file.name,
        fileHash: await calculateFileHash(file),
        ...imageMetadata,
      };
    })
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Save detailed image information
      await saveImageToDatabase({
        userId: metadata.userId,
        s3Key: file.key,
        url: url,
        filename: metadata.originalFilename,
        size: file.size,
        dimensions: {
          width: metadata.width,
          height: metadata.height,
        },
        format: metadata.format,
        colorProfile: metadata.colorProfile,
        hasTransparency: metadata.hasTransparency,
        exifData: metadata.exif,
        hash: metadata.fileHash,
      });
    }),
});
```
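`calculateFileHash` in the middleware is an assumed helper; since middleware runs on the server, a minimal sketch can use Node's built-in `crypto` module, again assuming the file exposes `arrayBuffer()`:

```typescript
import { createHash } from "crypto";

// Hypothetical helper: SHA-256 content hash, useful for de-duplicating uploads.
async function calculateFileHash(file: {
  arrayBuffer: () => Promise<ArrayBuffer>;
}): Promise<string> {
  const buffer = Buffer.from(await file.arrayBuffer());
  return createHash("sha256").update(buffer).digest("hex");
}
```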
## Performance Best Practices

### Client-Side Compression

Compressing images in the browser before upload reduces bandwidth and upload time:

```typescript
import { compress } from 'image-conversion'

export function optimizeImage(file: File): Promise<Blob> {
  return compress(file, {
    quality: 0.8,
    type: 'image/webp',
    width: 1920,
    height: 1080,
    orientation: true // Auto-rotate based on EXIF
  })
}

// Usage in upload component
const handleFileSelect = async (files: File[]) => {
  const optimizedBlobs = await Promise.all(
    files.map(file => optimizeImage(file))
  )
  // compress() returns Blobs; wrap them back into Files for upload
  const optimizedFiles = optimizedBlobs.map(
    (blob, i) => new File([blob], files[i].name, { type: blob.type })
  )
  uploadFiles(optimizedFiles)
}
```
### Progressive Loading

Show a blurred placeholder while the full-size image loads:

```typescript
import { useState } from 'react'

export function ProgressiveImage({
  src,
  blurDataURL,
  alt
}: {
  src: string
  blurDataURL: string
  alt: string
}) {
  const [isLoaded, setIsLoaded] = useState(false)

  return (
    <div style={{ position: 'relative' }}>
      {/* Blurred placeholder shown until the full image finishes loading */}
      {!isLoaded && <img src={blurDataURL} alt="" aria-hidden />}
      <img
        src={src}
        alt={alt}
        style={{ opacity: isLoaded ? 1 : 0, transition: 'opacity 0.3s' }}
        onLoad={() => setIsLoaded(true)}
      />
    </div>
  )
}
```
### Lazy Loading

Defer offscreen images until they scroll into view:

```typescript
import { useIntersectionObserver } from '@/hooks/use-intersection-observer'

export function LazyImage({
  src,
  alt,
  ...props
}: React.ImgHTMLAttributes<HTMLImageElement>) {
  const [ref, isIntersecting] = useIntersectionObserver({
    threshold: 0.1,
    rootMargin: '50px'
  })

  return (
    <div ref={ref}>
      {isIntersecting ? (
        <img src={src} alt={alt} {...props} />
      ) : (
        <span>Loading...</span>
      )}
    </div>
  )
}
```
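The `useIntersectionObserver` hook imported above lives in your app rather than in Pushduck. A minimal sketch built on the browser's `IntersectionObserver` API:

```typescript
import { useEffect, useRef, useState } from 'react'

// Hypothetical hook: reports whether the referenced element is in the viewport.
export function useIntersectionObserver(options?: IntersectionObserverInit) {
  const ref = useRef<HTMLDivElement | null>(null)
  const [isIntersecting, setIsIntersecting] = useState(false)

  useEffect(() => {
    const element = ref.current
    if (!element) return

    const observer = new IntersectionObserver(([entry]) => {
      setIsIntersecting(entry.isIntersecting)
    }, options)

    observer.observe(element)
    return () => observer.disconnect()
  }, [options])

  return [ref, isIntersecting] as const
}
```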
***
**Image Excellence**: With proper optimization, validation, and processing,
your image uploads will provide an excellent user experience while maintaining
performance and quality.