# AI & LLM Integration (/docs/ai-integration)

## AI & LLM Features

Pushduck documentation provides AI-friendly endpoints that make it easy for large language models (LLMs) and automated tools to access and process our documentation content.

## Available Endpoints

### πŸ“„ Complete Documentation Export

Access all documentation content in a single, structured format:

```
GET /llms.txt
```

This endpoint returns all documentation pages in a clean, AI-readable format with:

* Page titles and URLs
* Descriptions and metadata
* Full content with proper formatting
* Structured sections and hierarchies

**Example Usage:**

```bash
curl https://your-domain.com/llms.txt
```

### πŸ“‘ Individual Page Access

Access any documentation page's raw content by appending `.mdx` to its URL:

```
GET /docs/{page-path}.mdx
```

**Examples:**

* `/docs/quick-start.mdx` - Quick start guide content
* `/docs/api/client/use-upload-route.mdx` - Hook documentation
* `/docs/providers/aws-s3.mdx` - AWS S3 setup guide

## Use Cases

### πŸ€– **AI Assistant Integration**

* Train custom AI models on our documentation
* Create chatbots that can answer questions about Pushduck
* Build intelligent documentation search systems

### πŸ”§ **Development Tools**

* Generate code examples and snippets
* Create automated documentation tests
* Build CLI tools that reference our docs

### πŸ“Š **Content Analysis**

* Analyze documentation completeness
* Track content changes over time
* Generate documentation metrics

## Content Format

The LLM endpoints return content in a structured format:

```
# Page Title
URL: /docs/page-path

Page description here

# Section Headers

Content with proper markdown formatting...

## Subsections

- Lists and bullet points
- Code blocks with syntax highlighting
- Tables and structured data
```

## Technical Details

* **Caching**: Content is cached for optimal performance
* **Processing**: Uses a Remark pipeline with MDX and GFM support
* **Format**: Clean markdown with frontmatter removed
* **Encoding**: UTF-8 text format
* **CORS**: Enabled for cross-origin requests

## Rate Limiting

These endpoints are designed for programmatic access and don't have aggressive rate limiting. However, please be respectful:

* Cache responses when possible
* Avoid excessive automated requests
* Use appropriate user agents for your tools

## Examples

### Python Script

```python
import requests

# Get all documentation
response = requests.get('https://your-domain.com/llms.txt')
docs_content = response.text

# Get specific page
page_response = requests.get('https://your-domain.com/docs/quick-start.mdx')
page_content = page_response.text
```

### Node.js/JavaScript

```javascript
// Fetch all documentation
const allDocs = await fetch("/llms.txt").then((r) => r.text());

// Fetch specific page
const quickStart = await fetch("/docs/quick-start.mdx").then((r) => r.text());
```

### cURL

```bash
# Download all docs to file
curl -o pushduck-docs.txt https://your-domain.com/llms.txt

# Get specific page content
curl https://your-domain.com/docs/api/client/use-upload-route.mdx
```

## Integration with Popular AI Tools

### OpenAI GPT

Use the `/llms.txt` endpoint to provide context about Pushduck in your GPT conversations.

### Claude/Anthropic

Feed documentation content to Claude for detailed analysis and code generation.

### Local LLMs

Download content for training or fine-tuning local language models.

***

These AI-friendly endpoints make it easy to integrate Pushduck documentation into your development workflow and AI-powered tools!
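Since `/llms.txt` follows the page structure shown above, the export can be split back into per-page records. A minimal sketch, assuming each page begins with a `# Title` heading immediately followed by a `URL:` line; the `parseLlmsTxt` helper and `DocPage` shape are illustrative, not part of Pushduck:

```typescript
interface DocPage {
  title: string;
  url: string;
  body: string;
}

// Split the llms.txt export into one record per documentation page.
function parseLlmsTxt(text: string): DocPage[] {
  const pages: DocPage[] = [];
  // A new page starts at a top-level heading followed by a "URL:" line.
  const blocks = text.split(/^(?=# .+\nURL: )/m);
  for (const block of blocks) {
    const match = block.match(/^# (.+)\nURL: (\S+)\n?([\s\S]*)$/);
    if (match) {
      pages.push({ title: match[1].trim(), url: match[2], body: match[3].trim() });
    }
  }
  return pages;
}
```

This is handy for feeding individual pages into an embedding pipeline or a retrieval index instead of one monolithic document.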
# Comparisons (/docs/comparisons)

import { Callout } from "fumadocs-ui/components/callout";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";

## Overview

Choosing the right file upload solution depends on your project's requirements. This page compares Pushduck with popular alternatives to help you make an informed decision.

**TL;DR:** Pushduck is ideal if you want a **lightweight, self-hosted** solution with **full control** over your S3 storage, without vendor lock-in or ongoing upload fees.

**Note:** Pricing, features, and bundle sizes are approximate and current as of October 2025. Always verify current details from official sources before making decisions.

***

## Quick Comparison

| Feature               | Pushduck                                  | UploadThing          | Uploadcare           | AWS SDK              | Uppy                 |
| --------------------- | ----------------------------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
| **Bundle Size**       | \~7KB                                     | \~200KB+             | \~150KB+             | \~500KB+             | \~50KB+ (core)       |
| **Setup Time**        | 5 minutes                                 | 10 minutes           | 15 minutes           | 15-20 hours          | 30-60 minutes        |
| **Edge Runtime**      | βœ… Yes                                    | βœ… Yes               | βœ… Yes               | ❌ No                | βœ… Partial           |
| **Self-Hosted**       | βœ… Yes                                    | ❌ No                | ❌ No                | βœ… Yes               | βœ… Yes               |
| **Pricing Model**     | Free (S3 costs only)                      | Per upload           | Per upload + storage | Free (S3 costs only) | Free (S3 costs only) |
| **Type Safety**       | βœ… Full                                   | βœ… Full              | ⚠️ Partial           | ⚠️ Manual            | ❌ None              |
| **Multi-Provider**    | βœ… 6 (AWS, R2, DO, MinIO, GCS, S3-compat) | ❌ Own infrastructure | ❌ Own infrastructure | ❌ AWS only          | βœ… Yes               |
| **React Hooks**       | βœ… Built-in                               | βœ… Built-in          | βœ… Built-in          | ❌ Build yourself    | βœ… Available         |
| **Progress Tracking** | βœ… Automatic                              | βœ… Automatic         | βœ… Automatic         | ❌ Build yourself    | βœ… Automatic         |
| **Presigned URLs**    | βœ… Automatic                              | βœ… Automatic         | N/A (managed)        | ❌ Build yourself    | ⚠️ Manual            |
| **Best For**          | Developers                                | Rapid prototyping    | Enterprises          | Full AWS control     | UI flexibility       |

***

## Detailed Comparisons

### vs UploadThing

**UploadThing** is a managed file upload service with tight Next.js integration and developer-friendly DX.

**When to choose Pushduck:**

* βœ… You want to avoid per-upload fees
* βœ… You need edge runtime support (UploadThing requires the Node.js runtime)
* βœ… You want full control over storage and file URLs
* βœ… You're using multiple S3-compatible providers (R2, DigitalOcean, etc.)

**When to choose UploadThing:**

* βœ… You want zero infrastructure setup
* βœ… You prefer a managed service over self-hosting
* βœ… You're building a rapid prototype or MVP

```
Pushduck:     ~7KB (minified + gzipped)
UploadThing:  ~200KB+ (includes server runtime)
Difference:   28x smaller
```

**Why Pushduck is smaller:**

* Uses `aws4fetch` (lightweight AWS signer) instead of heavy dependencies
* No built-in UI components (bring your own)
* Focused on upload logic only
* Optimized for tree-shaking

**Pushduck:**

* Library: **Free (MIT license)**
* Costs: **S3 storage only** (\~$0.023/GB on AWS, free tier: 5GB + 20k requests/month)
* Example: 10k uploads/month @ 2MB each = \~$0.50/month

**UploadThing:**

* Free tier: 2GB storage + 100 uploads/month
* Pro: $20/month (50GB storage + 10k uploads)
* Enterprise: Custom pricing

**Cost Comparison (10k monthly uploads):**

* Pushduck + AWS S3: **\~$0.50/month**
* UploadThing Pro: **$20/month** (if within limits)

| Aspect               | Pushduck                  | UploadThing                    |
| -------------------- | ------------------------- | ------------------------------ |
| **Storage Provider** | Your choice (6 providers) | UploadThing's infrastructure   |
| **File URLs**        | Your domain/CDN           | UploadThing's CDN              |
| **Data Ownership**   | 100% yours                | Stored on their infrastructure |
| **Migration**        | Easy (standard S3)        | Requires re-uploading files    |
| **Vendor Lock-in**   | None                      | Medium                         |

***

### vs AWS SDK

**AWS SDK (`@aws-sdk/client-s3`)** is the official AWS library for S3 operations.
**When to choose Pushduck:**

* βœ… You need edge runtime support (AWS SDK requires Node.js)
* βœ… You want a smaller bundle (\~7KB vs \~500KB)
* βœ… You need React hooks and type-safe APIs
* βœ… You want multi-provider support (R2, DigitalOcean, MinIO)
* βœ… You prefer declarative schemas over imperative code
* βœ… You want presigned URLs handled automatically
* βœ… You want to avoid implementing upload infrastructure from scratch

**When to choose AWS SDK:**

* βœ… You need advanced S3 features (lifecycle policies, bucket management)
* βœ… You're already heavily invested in the AWS ecosystem
* βœ… You need multipart uploads for very large files (100GB+)
* βœ… You need direct, low-level control over every S3 operation

```
Pushduck:   ~7KB (core client, minified + gzipped)
AWS SDK:    ~500KB (@aws-sdk/client-s3)
Difference: 71x smaller ⚑️
```

**Why it matters:**

* Faster page loads
* Lower bandwidth costs
* Better mobile experience
* Improved Core Web Vitals

**Pushduck:**

```typescript
// Declarative schema
const router = s3.createRouter({
  imageUpload: s3.image()
    .maxFileSize('5MB')
    .middleware(async ({ req }) => {
      const user = await auth(req);
      return { userId: user.id };
    }),
});

// Client (React)
const { uploadFiles } = useUploadRoute('imageUpload');
```

**AWS SDK:**

```typescript
// Imperative code
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'us-east-1' });

const uploadFile = async (file: File) => {
  const command = new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: `uploads/${file.name}`,
    Body: await file.arrayBuffer(),
    ContentType: file.type,
  });
  await s3.send(command);
};

// + Manual progress tracking
// + Manual validation
// + Manual React state management
```

**Pushduck provides:**

* Type-safe schemas
* Built-in React hooks
* Automatic progress tracking
* Middleware system
* Multi-provider support

**AWS SDK provides:**

* Direct S3 control
* Advanced features
* Official AWS support

### What AWS SDK Requires You to Build

With AWS SDK, you need to **manually implement everything**:

#### 1. Presigned URL Generation

```typescript
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

// ❌ You must handle:
// - Creating an S3 client with credentials
// - Generating unique file keys
// - Setting correct content types
// - Configuring expiration times
// - Handling CORS headers
// - Managing bucket permissions

const s3Client = new S3Client({ region: 'us-east-1' });

const generatePresignedUrl = async (fileName: string) => {
  const key = `uploads/${Date.now()}-${fileName}`;

  const command = new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: key,
    ContentType: 'application/octet-stream', // Must set manually
  });

  const url = await getSignedUrl(s3Client, command, { expiresIn: 3600 });
  return { url, key };
};
```

#### 2. Client-Side Upload Logic

```typescript
// ❌ You must build:
// - XMLHttpRequest wrapper for progress tracking
// - Error handling and retry logic
// - AbortController for cancellation
// - State management for multiple files
// - Progress aggregation
// - File validation (size, type)

const uploadFile = (file: File, presignedUrl: string) => {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();

    xhr.upload.onprogress = (e) => {
      const progress = (e.loaded / e.total) * 100;
      // Update UI manually
    };

    xhr.onload = () => {
      if (xhr.status === 200) {
        resolve(xhr.response);
      } else {
        reject(new Error('Upload failed'));
      }
    };

    xhr.onerror = () => reject(new Error('Network error'));

    xhr.open('PUT', presignedUrl);
    xhr.setRequestHeader('Content-Type', file.type);
    xhr.send(file);
  });
};
```

#### 3. API Route Handler

```typescript
// ❌ You must implement:
// - Request parsing and validation
// - Authentication/authorization
// - File metadata validation
// - Error responses
// - Type safety

export async function POST(request: Request) {
  const { fileName, fileSize, fileType } = await request.json();

  // Validate manually
  if (fileSize > 10 * 1024 * 1024) {
    return Response.json({ error: 'File too large' }, { status: 400 });
  }

  // Auth manually
  const user = await authenticateUser(request);
  if (!user) {
    return Response.json({ error: 'Unauthorized' }, { status: 401 });
  }

  // Generate presigned URL
  const { url, key } = await generatePresignedUrl(fileName);
  return Response.json({ url, key });
}
```

#### 4. React Component State Management

```typescript
// ❌ You must manage:
// - File state (idle, uploading, success, error)
// - Progress for each file
// - Overall progress
// - Error messages
// - Upload speed and ETA calculations
// - Cleanup on unmount

const [files, setFiles] = useState([]);
const [isUploading, setIsUploading] = useState(false);
const [progress, setProgress] = useState(0);
const [errors, setErrors] = useState([]);

// Implement all the upload logic...
```

#### 5. CORS Configuration

```json
// ❌ You must configure S3 bucket CORS manually:
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://your-domain.com"],
      "AllowedMethods": ["PUT", "POST", "GET"],
      "AllowedHeaders": ["*"],
      "ExposeHeaders": ["ETag"]
    }
  ]
}
```

***

### What Pushduck Handles For You

```typescript
// βœ… Pushduck handles ALL of the above:

// Server (3 lines)
const router = s3.createRouter({
  imageUpload: s3.image().maxFileSize('5MB'),
});
export const { GET, POST } = router.handlers;

// Client (1 line)
const { uploadFiles, progress, isUploading } = useUploadRoute('imageUpload');
```

**Everything included:**

* βœ… Presigned URL generation (automatic)
* βœ… File validation (declarative)
* βœ… Progress tracking (real-time)
* βœ… Error handling (built-in)
* βœ… Multi-file support (automatic)
* βœ… React state management (handled)
* βœ… Type safety (end-to-end)
* βœ… Authentication hooks (middleware)
* βœ… CORS headers (automatic)
* βœ… Cancellation (AbortController)

***

### Time to Production

| Task                   | AWS SDK           | Pushduck         |
| ---------------------- | ----------------- | ---------------- |
| **Initial setup**      | 2-4 hours         | 5 minutes        |
| **Progress tracking**  | 1-2 hours         | Included         |
| **Error handling**     | 1-2 hours         | Included         |
| **Multi-file uploads** | 2-3 hours         | Included         |
| **Type safety**        | 2-4 hours         | Included         |
| **Testing**            | 4-6 hours         | Minimal          |
| **Total**              | **\~15-20 hours** | **\~30 minutes** |

**AWS SDK is a low-level tool.** You're responsible for building the entire upload infrastructure, handling edge cases, security, validation, progress tracking, and state management.

**Pushduck is a high-level framework.** All the infrastructure is built-in, tested, and production-ready out of the box.

***

### vs Uploadcare / Filestack

**Uploadcare** and **Filestack** are managed file upload platforms with built-in CDN, transformations, and processing.
**When to choose Pushduck:**

* βœ… You want to avoid per-upload and storage fees
* βœ… You need full control over file storage and URLs
* βœ… You prefer self-hosted over managed services
* βœ… You don't need built-in image processing (you can integrate Sharp, Cloudinary, etc.)
* βœ… You want to avoid vendor lock-in

**When to choose Uploadcare/Filestack:**

* βœ… You need built-in image/video processing
* βœ… You want zero infrastructure management
* βœ… You need a global CDN with automatic optimization
* βœ… You have budget for managed services

**Pushduck:**

* Library: **Free**
* Storage: **S3 costs** (\~$0.023/GB on AWS)
* CDN: **Optional** (CloudFront, Cloudflare, BunnyCDN)
* Processing: **Integrate your choice** (Sharp, Cloudinary, Imgix)

**Uploadcare:**

* Free: 3k uploads + 3GB storage/month
* Start: $25/month (10k uploads + 10GB)
* Pro: $99/month (50k uploads + 100GB)
* Enterprise: Custom

**Filestack:**

* Free: 100 uploads + 100 transformations/month
* Starter: $49/month (1k uploads)
* Professional: $249/month (10k uploads)

**Cost Example (10k monthly uploads, 20GB storage):**

* Pushduck + S3: **\~$0.50/month** (+ optional CDN \~$1-5)
* Uploadcare Start: **$25/month**
* Filestack Professional: **$249/month**

| Aspect             | Pushduck            | Uploadcare/Filestack |
| ------------------ | ------------------- | -------------------- |
| **Storage**        | Your S3 bucket      | Their infrastructure |
| **File URLs**      | Your domain         | Their CDN            |
| **Processing**     | Integrate as needed | Built-in             |
| **Data Migration** | Standard S3 API     | Proprietary API      |
| **Privacy**        | Full control        | Trust third party    |
| **Vendor Lock-in** | None                | High                 |

***

### vs Uppy

**Uppy** is a modular JavaScript file uploader with a focus on UI components and extensibility.
**When to choose Pushduck:**

* βœ… You're using React (Pushduck has first-class React support)
* βœ… You need type-safe APIs with TypeScript inference
* βœ… You want tighter Next.js integration
* βœ… You prefer schema-based validation over manual configuration

**When to choose Uppy:**

* βœ… You need a rich, pre-built UI (Dashboard, Drag & Drop, Webcam, Screen Capture)
* βœ… You're using vanilla JS or other frameworks (Vue, Svelte)
* βœ… You need resumable uploads (tus protocol)
* βœ… You want highly customizable UI components

**Pushduck:**

* **Focus:** Direct-to-S3 uploads with presigned URLs
* **UI:** Bring your own (minimal bundle size)
* **Type Safety:** First-class TypeScript, runtime validation
* **Backend:** Server-first (schema definitions, middleware, hooks)

**Uppy:**

* **Focus:** Modular file uploader with rich UI
* **UI:** Built-in components (Dashboard, Drag & Drop, etc.)
* **Type Safety:** TypeScript definitions available
* **Backend:** Agnostic (works with any backend)

**Choose Pushduck if:**

```typescript
// You want schema-based validation
const router = s3.createRouter({
  profilePic: s3.image()
    .maxFileSize('2MB')
    .types(['image/jpeg', 'image/png'])
    .middleware(auth)
    .onUploadComplete(updateDatabase),
});
```

**Choose Uppy if:**

```typescript
// You want rich UI out of the box
import Uppy from '@uppy/core';
import Dashboard from '@uppy/dashboard';
import Webcam from '@uppy/webcam';
import ScreenCapture from '@uppy/screen-capture';

const uppy = new Uppy()
  .use(Dashboard, { inline: true })
  .use(Webcam, { target: Dashboard })
  .use(ScreenCapture, { target: Dashboard });
```

***

## Decision Matrix

### Choose Pushduck if:

βœ… You want **full control** over storage and infrastructure\
βœ… You need **edge runtime** compatibility (Vercel, Cloudflare Workers)\
βœ… You prefer **self-hosted** over managed services\
βœ… You want to **minimize costs** (pay only S3 storage fees)\
βœ… You need **type-safe APIs** with TypeScript inference\
βœ… You're building with **React** and **Next.js**\
βœ… You want to avoid **vendor lock-in**\
βœ… You need support for **multiple S3-compatible providers**

***

### Choose UploadThing if:

βœ… You want **zero infrastructure setup**\
βœ… You prefer a **managed service** over self-hosting\
βœ… You're building a **rapid prototype** or **MVP**\
βœ… You want to **avoid S3 configuration**\
βœ… You're **okay with per-upload pricing**

***

### Choose AWS SDK if:

βœ… You need **advanced S3 features** (lifecycle, versioning, bucket management)\
βœ… You're **AWS-only** (not using other providers)\
βœ… You need **multipart uploads** for very large files (100GB+)\
βœ… You don't need **edge runtime** compatibility\
βœ… Bundle size is **not a concern**\
βœ… You have **15-20 hours** to build upload infrastructure from scratch\
βœ… You want **full low-level control** over every S3 operation\
⚠️ You're comfortable **manually implementing** presigned URLs, progress tracking, validation, error handling, and React state management

***

### Choose Uploadcare/Filestack if:

βœ… You need **built-in image/video processing**\
βœ… You want **zero infrastructure** management\
βœ… You need a **global CDN** with automatic optimization\
βœ… You have **budget for managed services** ($25-250/month)\
βœ… You want **all-in-one** (upload + storage + processing + CDN)

***

### Choose Uppy if:

βœ… You need **rich, pre-built UI** components\
βœ… You want **highly customizable** upload widgets\
βœ… You need **resumable uploads** (tus protocol)\
βœ… You're **not using React** (vanilla JS, Vue, Svelte)\
βœ… You want **modular architecture** (pick and choose plugins)

***

## Feature Comparison

### Core Features

| Feature               | Pushduck | UploadThing | AWS SDK    | Uploadcare | Uppy       |
| --------------------- | -------- | ----------- | ---------- | ---------- | ---------- |
| **Direct-to-S3**      | βœ…       | βœ…          | βœ…         | ❌         | βœ…         |
| **Presigned URLs**    | βœ…       | βœ…          | βœ…         | ❌         | ⚠️         |
| **Progress Tracking** | βœ…       | βœ…          | ⚠️ Manual  | βœ…         | βœ…         |
| **Multi-file Upload** | βœ…       | βœ…          | βœ…         | βœ…         | βœ…         |
| **Type Safety**       | βœ… Full  | βœ… Full     | ⚠️ Partial | ⚠️ Partial | ⚠️ Partial |
| **Schema Validation** | βœ…       | βœ…          | ❌         | ⚠️         | ⚠️         |
| **Middleware System** | βœ…       | βœ…          | ❌         | ❌         | ⚠️         |
| **Lifecycle Hooks**   | βœ…       | βœ…          | ❌         | ⚠️         | βœ…         |
| **React Hooks**       | βœ…       | βœ…          | ❌         | βœ…         | βœ…         |

### Storage & Providers

| Feature                  | Pushduck | UploadThing | AWS SDK | Uploadcare | Uppy |
| ------------------------ | -------- | ----------- | ------- | ---------- | ---- |
| **AWS S3**               | βœ…       | ❌          | βœ…      | ❌         | βœ…   |
| **Cloudflare R2**        | βœ…       | ❌          | ❌      | ❌         | βœ…   |
| **DigitalOcean Spaces**  | βœ…       | ❌          | ❌      | ❌         | βœ…   |
| **Google Cloud Storage** | βœ…       | ❌          | ❌      | ❌         | βœ…   |
| **MinIO**                | βœ…       | ❌          | ❌      | ❌         | βœ…   |
| **Backblaze B2**         | βœ…       | ❌          | ❌      | ❌         | βœ…   |
| **Custom Domain**        | βœ…       | ⚠️ Limited  | βœ…      | βœ…         | βœ…   |

### Runtime & Compatibility

| Feature                  | Pushduck | UploadThing | AWS SDK | Uploadcare | Uppy |
| ------------------------ | -------- | ----------- | ------- | ---------- | ---- |
| **Edge Runtime**         | βœ…       | βœ…          | ❌      | βœ…         | βœ…   |
| **Node.js**              | βœ…       | βœ…          | βœ…      | βœ…         | βœ…   |
| **Cloudflare Workers**   | βœ…       | βœ…          | ❌      | βœ…         | βœ…   |
| **Vercel Edge**          | βœ…       | βœ…          | ❌      | βœ…         | βœ…   |
| **Next.js App Router**   | βœ…       | βœ…          | βœ…      | βœ…         | βœ…   |
| **Next.js Pages Router** | βœ…       | βœ…          | βœ…      | βœ…         | βœ…   |
| **Remix**                | βœ…       | ⚠️          | βœ…      | βœ…         | βœ…   |
| **SvelteKit**            | βœ…       | ❌          | βœ…      | βœ…         | βœ…   |

***

## Bundle Size Breakdown

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Bundle Size Comparison (minified + gzipped)       β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€€
β”‚ Pushduck:     β–ˆβ–ˆ 7KB                              β”‚
β”‚ Uppy (core):  β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ 50KB                   β”‚
β”‚ Uploadcare:   β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ 150KB            β”‚
β”‚ UploadThing:  β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ 200KB     β”‚
β”‚ AWS SDK:      β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ 500KB     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

**Why bundle size matters:**

* **Faster initial page load** (especially on mobile)
* **Lower bandwidth costs**
* **Better Core Web Vitals** (LCP, FCP)
* **Improved SEO** (Google considers page speed)

***

## Pricing Breakdown

### Monthly Cost Example

**Scenario:** 10,000 uploads/month, 2MB average file size, 20GB total storage

| Solution                     | Monthly Cost | Breakdown                          |
| ---------------------------- | ------------ | ---------------------------------- |
| **Pushduck + AWS S3**        | **$0.50**    | Storage: $0.46, Requests: $0.04    |
| **Pushduck + Cloudflare R2** | **$0.30**    | Storage: $0.30, Egress: $0         |
| **AWS SDK + S3**             | **$0.50**    | Same as Pushduck (library is free) |
| **UploadThing Pro**          | **$20**      | (if within 10k upload limit)       |
| **Uploadcare Start**         | **$25**      | (if within 10k upload limit)       |
| **Filestack Professional**   | **$249**     | (10k uploads tier)                 |

**Note:** Pushduck and AWS SDK are libraries, not services. You pay only for S3 storage. Managed services (UploadThing, Uploadcare, Filestack) handle infrastructure but charge per-upload fees.
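As a back-of-the-envelope check on the library-only costs above, you can compute the S3 bill directly from raw prices. A sketch assuming approximate AWS us-east-1 prices at the time of writing (\~$0.023 per GB-month of storage, \~$0.005 per 1,000 PUT requests); verify against current AWS pricing before relying on it:

```typescript
// Rough monthly AWS S3 cost for a library-based setup (Pushduck or AWS SDK):
// storage charged per GB-month, uploads charged per 1,000 PUT requests.
function s3MonthlyCostUSD(storageGB: number, uploadsPerMonth: number): number {
  const storage = storageGB * 0.023;               // $0.023 per GB-month
  const putRequests = (uploadsPerMonth / 1000) * 0.005; // $0.005 per 1k PUTs
  return +(storage + putRequests).toFixed(2);
}

// The table's scenario: 20GB stored, 10k uploads/month -> roughly half a dollar
console.log(s3MonthlyCostUSD(20, 10_000));
```

Egress, free-tier allowances, and GET requests are ignored here, which is why the result lands slightly above the table's rounded $0.50 figure.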
***

## Migration Guide

### From AWS SDK to Pushduck

```typescript
// Before (AWS SDK)
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'us-east-1' });

const uploadFile = async (file: File) => {
  const command = new PutObjectCommand({
    Bucket: 'my-bucket',
    Key: `uploads/${file.name}`,
    Body: await file.arrayBuffer(),
  });
  await s3.send(command);
};

// After (Pushduck)
// Server
const { s3 } = createUploadConfig()
  .provider('aws', { bucket: 'my-bucket', region: 'us-east-1' })
  .build();

const router = s3.createRouter({
  fileUpload: s3.file().maxFileSize('10MB'),
});
export const { GET, POST } = router.handlers;

// Client
const { uploadFiles } = useUploadRoute('fileUpload');
```

**Benefits:**

* βœ… 71x smaller bundle (\~7KB vs \~500KB)
* βœ… Edge runtime compatible
* βœ… Built-in React hooks
* βœ… Type-safe APIs
* βœ… Automatic progress tracking

***

### From UploadThing to Pushduck

**Migration Steps:**

1. **Set up an S3 bucket** (one-time setup)
2. **Replace UploadThing config with Pushduck config**
3. **Update client imports**
4. **Migrate files** (optional; use UploadThing's S3 export if available)

**Data Ownership:**

* UploadThing: Files on their infrastructure (requires export)
* Pushduck: Files in your S3 bucket (you own them)

***

## Frequently Asked Questions

### "Should I use Pushduck or a managed service?"

**Use Pushduck if:**

* You want full control and ownership
* You want to minimize costs
* You're comfortable with basic S3 setup

**Use a managed service if:**

* You want zero infrastructure setup
* You need built-in processing (images, videos)
* You have budget for convenience

***

### "Is Pushduck production-ready?"

βœ… Yes! Pushduck is used in production by multiple projects.
**Production features:**

* Comprehensive error handling
* Health checks and metrics
* Battle-tested S3 upload flow
* Type-safe APIs
* Extensive test coverage

See: [Production Checklist](/docs/guides/production-checklist)

***

### "Can I migrate from Pushduck later?"

βœ… Yes, easily! Your files are in standard S3 buckets.

**Migration path:**

1. Your files are already in S3 (standard format)
2. Switch to any S3-compatible solution
3. No data migration needed (files stay in your bucket)
4. No vendor lock-in

***

### "Does Pushduck support image processing?"

⚠️ **Not built-in** (by design - keeps the bundle small).

**Integration options:**

* [Sharp](/docs/guides/image-uploads) - Server-side processing
* [Cloudinary](/docs/guides/image-uploads) - API-based
* [Imgix](/docs/guides/image-uploads) - URL-based
* Any image processing tool

See: [Image Uploads Guide](/docs/guides/image-uploads)

***

## Conclusion

**Pushduck** is ideal for developers who want:

* πŸͺΆ Lightweight library (\~7KB)
* πŸ”’ Full control over storage
* πŸ’° Minimal costs (S3 only)
* πŸš€ Edge runtime support
* πŸ”Œ No vendor lock-in

If you need a managed service with built-in processing and a global CDN, consider **Uploadcare** or **Filestack**. If you want rapid prototyping with zero S3 setup, consider **UploadThing**. For advanced S3 features and AWS-only projects, the **AWS SDK** is the right choice. For rich UI components and resumable uploads, **Uppy** is a great option.

***

**Ready to get started?** Head to the [Quick Start](/docs/quick-start) guide to set up Pushduck in 5 minutes.

# Examples & Demos (/docs/examples)

import { Callout } from "fumadocs-ui/components/callout";
import { Tabs, Tab } from "fumadocs-ui/components/tabs";

**Live Demos:** These are fully functional demos using real Cloudflare R2 storage. Files are uploaded to a demo bucket and may be automatically cleaned up. Don't upload sensitive information.
**Having Issues?** If uploads aren't working (especially with `next dev --turbo`), check our [Troubleshooting Guide](/docs/api/troubleshooting) for common solutions, including the known Turbo mode compatibility issue.

## Interactive Upload Demo

The full-featured demo showcasing all capabilities:

**ETA & Speed Tracking:** Upload speed (MB/s) and estimated time remaining (ETA) appear below the progress bar during active uploads. Try uploading larger files (1MB+) to see these metrics in action! ETA becomes more accurate after the first few seconds of upload.

## Image-Only Upload

Focused demo for image uploads with preview capabilities:

## Document Upload

Streamlined demo for document uploads:

## Key Features Demonstrated

### βœ… **Type-Safe Client**

```typescript
// Property-based access with full TypeScript inference
const imageUpload = upload.imageUpload();
const fileUpload = upload.fileUpload();

// No string literals, no typos, full autocomplete
await imageUpload.uploadFiles(selectedFiles);
```

### ⚑ **Real-Time Progress**

* Individual file progress tracking with percentage completion
* Upload speed monitoring (MB/s) with live updates
* ETA calculations showing estimated time remaining
* Pause/resume functionality (coming soon)
* Comprehensive error handling with retry mechanisms

### πŸ”’ **Built-in Validation**

* File type validation (MIME types)
* File size limits with user-friendly errors
* Custom validation middleware
* Malicious file detection

### 🌐 **Provider Agnostic**

* Same code works with any S3-compatible provider
* Switch between Cloudflare R2, AWS S3, DigitalOcean Spaces
* Zero vendor lock-in

## Code Examples

```typescript
"use client";

import { upload } from "@/lib/upload-client";

export function SimpleUpload() {
  const { uploadFiles, files, isUploading } = upload.imageUpload();

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />

      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
  );
}
```
```typescript
"use client";

import { upload } from "@/lib/upload-client";
import { useState } from "react";

export function MetadataUpload() {
  const [albumId, setAlbumId] = useState('vacation-2025');
  const [tags, setTags] = useState(['summer']);

  const { uploadFiles, files, isUploading } = upload.imageUpload({
    onSuccess: (results) => {
      console.log(`Uploaded ${results.length} images to album: ${albumId}`);
    }
  });

  const handleUpload = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);

    // Pass client-side context as metadata
    uploadFiles(selectedFiles, {
      albumId: albumId,
      tags: tags,
      visibility: 'private',
      uploadSource: 'web-app'
    });
  };

  return (
    <div>
      <input type="file" multiple onChange={handleUpload} disabled={isUploading} />

      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
  );
}
```
```typescript
// app/api/upload/route.ts
import { createUploadConfig } from "pushduck/server";

const { s3 } = createUploadConfig()
  .provider("cloudflareR2", {
    accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
    bucket: process.env.R2_BUCKET!,
  })
  .defaults({
    maxFileSize: "10MB",
    acl: "public-read",
  })
  .build();

const uploadRouter = s3.createRouter({
  imageUpload: s3
    .image()
    .maxFileSize("5MB")
    .formats(["jpeg", "png", "webp"])
    .middleware(async ({ file, metadata }) => {
      // Custom authentication and metadata
      const session = await getServerSession();
      if (!session) throw new Error("Unauthorized");

      return {
        ...metadata,
        userId: session.user.id,
        uploadedAt: new Date().toISOString(),
      };
    })
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Post-upload processing
      console.log(`Upload complete: ${url}`);
      await saveToDatabase({ url, metadata });
    }),
});

export const { GET, POST } = uploadRouter.handlers;
export type AppRouter = typeof uploadRouter;
```

```typescript
"use client";

import { upload } from "@/lib/upload-client";

export function RobustUpload() {
  const { uploadFiles, files, errors, reset } = upload.imageUpload();

  const handleUpload = async (fileList: FileList) => {
    try {
      await uploadFiles(Array.from(fileList));
    } catch (error) {
      console.error("Upload failed:", error);
      // Error is automatically added to the errors array
    }
  };

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
      />

      {/* Display errors */}
      {errors.length > 0 && (
        <div>
          <strong>Upload Errors:</strong>
          <ul>
            {errors.map((error, index) => (
              <li key={index}>{error}</li>
            ))}
          </ul>
          <button onClick={reset}>Clear errors</button>
        </div>
      )}

      {/* Display files with status */}
      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.status === "uploading" && <progress value={file.progress} max={100} />}
          {file.status === "error" && <span>{file.error}</span>}
          {file.status === "success" && file.url && (
            <a href={file.url}>View File</a>
          )}
        </div>
      ))}
    </div>
  );
}
```
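Transient network failures can also be retried around `uploadFiles` with a small generic helper. A sketch only: `withRetry` is not part of Pushduck's API, and the backoff values are arbitrary:

```typescript
// Retry an async operation with exponential backoff.
// Attempts the call up to `retries + 1` times before rethrowing the last error.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        // Backoff: 500ms, 1s, 2s, ... before the next attempt
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

Usage would look like `await withRetry(() => uploadFiles(selectedFiles))`. Note that blind retries are only safe for idempotent operations like PUT-based uploads.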
## Real-World Use Cases

### **Profile Picture Upload**

Single image upload with instant preview and crop functionality.

### **Document Management**

Multi-file document upload with categorization and metadata.

### **Media Gallery**

Batch image upload with automatic optimization and thumbnail generation.

### **File Sharing**

Secure file upload with expiration dates and access controls.

## Next Steps

* **⚑ Quick Start** - Get set up in 2 minutes with our CLI
* **πŸ“š API Reference** - Complete API documentation
* **☁️ Providers** - Configure your storage provider
* **🎬 Full Demo** - Complete upload experience
# How It Works (/docs/how-it-works) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Mermaid } from "@/components/mdx/mermaid"; ## Architecture Overview Pushduck is built on a **direct-to-S3 upload pattern** using presigned URLs, eliminating the need for your server to handle file data. **Core Principle:** Files go directly from the client to S3 storage, bypassing your server entirely. This enables infinite scalability and edge-compatible deployments. *** ## Upload Flow ### Complete Upload Process **Key Benefits:** * βœ… Server never touches file data (saves bandwidth) * βœ… Scales infinitely (S3 handles the load) * βœ… Edge-compatible (no file streaming needed) * βœ… Real-time progress tracking on client *** ## Component Architecture *** ## Configuration Flow *** ## Type Safety System *** ## Middleware Chain *** ## Storage Provider System **Key Insight:** All providers use the same S3-compatible API, so switching is just a configuration change. *** ## Client State Management *** ## Integration Points *** ## Comparison: Pushduck vs AWS SDK ### What You Need to Build with AWS SDK *** ## Key Takeaways Files upload directly to S3 storage, bypassing your server. This enables infinite scalability and edge deployment. \~7KB total bundle using `aws4fetch` instead of AWS SDK (\~500KB). 71x smaller, edge-compatible. End-to-end TypeScript inference from server schema to client hook. Catch errors at compile-time. Middleware and lifecycle hooks provide integration points without bloating the library with built-in features. Universal Web Standard handlers work with 16+ frameworks via thin adapters. Unified API works with 6 S3-compatible providers. Switch providers with just a config change. *** ## Next Steps **Ready to build?** Check out the [Quick Start](/docs/quick-start) guide to get Pushduck running in 5 minutes. 
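The direct-to-S3 pattern described above boils down to three steps: request a presigned URL from your server route, PUT the file straight to storage, and record the resulting key. Pushduck handles this handshake for you; the sketch below only illustrates the mechanics (the `/api/upload/presign` endpoint is hypothetical, and `fetchImpl` is injected so the sketch stays self-contained):

```typescript
// Illustrative only: the shape of a direct-to-S3 upload via presigned URL.
type FetchLike = (
  url: string,
  init?: { method?: string; body?: string }
) => Promise<{ ok: boolean; json(): Promise<unknown> }>;

async function directUpload(
  file: { name: string; type: string; body: string },
  fetchImpl: FetchLike
): Promise<string> {
  // 1. Ask your server for a presigned URL (metadata only, no file bytes)
  const presign = await fetchImpl("/api/upload/presign", {
    method: "POST",
    body: JSON.stringify({ name: file.name, type: file.type }),
  });
  const { url, key } = (await presign.json()) as { url: string; key: string };

  // 2. PUT the bytes directly to S3; your server never sees them
  const put = await fetchImpl(url, { method: "PUT", body: file.body });
  if (!put.ok) throw new Error(`Upload failed for ${file.name}`);

  // 3. The object key is what you persist or return to the UI
  return key;
}
```

In a real client, step 2 is typically performed with `XMLHttpRequest` or a streaming upload so that per-file progress events can be reported.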
**Learn More:**

* [Philosophy & Scope](/docs/philosophy) - What Pushduck does (and doesn't do)
* [API Reference](/docs/api) - Complete API documentation
* [Examples](/docs/examples) - Live demos and code samples

# Pushduck (/docs)

import { Card, Cards } from "fumadocs-ui/components/card";
import { Step, Steps } from "fumadocs-ui/components/steps";

## Simple S3 Uploads, Zero Vendor Lock-in

Upload files directly to S3-compatible storage. Lightweight (6KB), type-safe, and works everywhere. No monthly fees, no vendor lock-in, just **3 files and \~50 lines of code**.

```typescript
// Create your upload client
const upload = createUploadClient({ endpoint: '/api/upload' });

// Use anywhere in your app
export function MyComponent() {
  const { uploadFiles, files, isUploading } = upload.imageUpload();

  return (
    <input
      type="file"
      multiple
      onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      disabled={isUploading}
    />
  );
}
```
## Why Choose Pushduck? **Alternative to UploadThing** - Own your infrastructure, zero recurring costs. | Feature | Pushduck | UploadThing | | ------------------ | -------------------- | ----------------------- | | **Cost** | $0 (use your S3) | $10-25/month | | **Bundle Size** | 6KB | Managed client | | **Vendor Lock-in** | None - S3 compatible | Locked to their service | | **File Ownership** | Your S3 bucket | Their storage | | **Type Safety** | Full TypeScript | TypeScript support | | **Setup Time** | \~2 minutes | \~2 minutes | **Key benefits:** * βœ… **6KB bundle** - No heavy AWS SDK * βœ… **Type-safe** - Compile-time route validation * βœ… **Own your files** - Any S3-compatible provider * βœ… **No monthly fees** - Use your own S3 * βœ… **Focused library** - Does uploads, nothing else ## More Resources
## What's Included * βœ… **Progress Tracking** - Real-time progress, speed, and ETA * βœ… **Type Safety** - Full TypeScript from server to client * βœ… **Multi-Provider** - AWS S3, Cloudflare R2, DigitalOcean, MinIO * βœ… **Validation** - File type, size, and custom rules * βœ… **Storage Operations** - List, delete, and manage files * βœ… **Framework Support** - Next.js, Remix, Express, Fastify, and more * βœ… **Drag & Drop Components** - Copy-paste UI components via CLI πŸ“– **What we don't do** - File processing, analytics, team management. See [Philosophy](/docs/philosophy) for our focused scope. # Manual Setup (/docs/manual-setup) import { Step, Steps } from "fumadocs-ui/components/steps"; import { Callout } from "fumadocs-ui/components/callout"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; ## Prerequisites * Next.js 13+ with App Router * An S3-compatible storage provider (we recommend Cloudflare R2 for best performance and cost) * Node.js 18+ ## Install Pushduck npm pnpm yarn bun ```bash npm install pushduck ``` ```bash pnpm add pushduck ``` ```bash yarn add pushduck ``` ```bash bun add pushduck ``` ## Set Environment Variables Create a `.env.local` file in your project root with your storage credentials: Cloudflare R2 AWS S3 ```dotenv title=".env.local" # Cloudflare R2 Configuration CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key CLOUDFLARE_R2_ACCOUNT_ID=your_account_id CLOUDFLARE_R2_BUCKET_NAME=your-bucket-name ``` ```dotenv title=".env.local" # AWS S3 Configuration AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_REGION=us-east-1 AWS_S3_BUCKET_NAME=your-bucket-name ``` **Don't have credentials yet?** Follow our [Provider setup guide](/docs/providers) to create a bucket and get your credentials in 2 minutes. 
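Since every provider credential comes from the environment, it can help to fail fast at startup when one is missing. Here is a small illustrative guard (not a Pushduck API; the helper name is ours):

```typescript
// Illustrative startup guard: throw early if any required credential
// is missing from the environment.
function requireEnv(
  env: Record<string, string | undefined>,
  keys: string[]
): Record<string, string> {
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(keys.map((k) => [k, env[k] as string]));
}

// Usage with the R2 variables from .env.local above:
// requireEnv(process.env, [
//   "CLOUDFLARE_R2_ACCESS_KEY_ID",
//   "CLOUDFLARE_R2_SECRET_ACCESS_KEY",
//   "CLOUDFLARE_R2_ACCOUNT_ID",
//   "CLOUDFLARE_R2_BUCKET_NAME",
// ]);
```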
## Configure Upload Settings First, create your upload configuration: ```typescript // lib/upload.ts import { createUploadConfig } from "pushduck/server"; // Configure your S3-compatible storage export const { s3, storage } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.CLOUDFLARE_R2_ACCESS_KEY_ID!, secretAccessKey: process.env.CLOUDFLARE_R2_SECRET_ACCESS_KEY!, region: "auto", endpoint: `https://${process.env.CLOUDFLARE_R2_ACCOUNT_ID}.r2.cloudflarestorage.com`, bucket: process.env.CLOUDFLARE_R2_BUCKET_NAME!, accountId: process.env.CLOUDFLARE_R2_ACCOUNT_ID!, }) .build(); ``` ## Create Your Upload Router Create an API route to handle file uploads: ```typescript // app/api/s3-upload/route.ts import { s3 } from "@/lib/upload"; const s3Router = s3.createRouter({ // Define your upload routes with validation imageUpload: s3 .image() .maxFileSize("10MB") .formats(["jpg", "jpeg", "png", "webp"]), documentUpload: s3.file().maxFileSize("50MB").types(["application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document"]), }); export const { GET, POST } = s3Router.handlers; // Export the router type for client-side type safety export type Router = typeof s3Router; ``` **What's happening here?** - `s3.createRouter()` creates a type-safe upload handler - `s3.image()` and `s3.file()` provide validation and TypeScript inference - The router automatically handles presigned URLs, validation, and errors - Exporting the type enables full client-side type safety ## Create Upload Client Create a type-safe client for your components: ```typescript // lib/upload-client.ts import { createUploadClient } from "pushduck"; import type { Router } from "@/app/api/s3-upload/route"; // Create a type-safe upload client export const upload = createUploadClient({ baseUrl: "/api/s3-upload", }); // You can also export specific upload methods export const { imageUpload, documentUpload } = upload; ``` ## Use in Your Components Now 
you can use the upload client in any component with full type safety:

```typescript
// components/image-uploader.tsx
"use client";

import { upload } from "@/lib/upload-client";

export function ImageUploader() {
  const { uploadFiles, uploadedFiles, isUploading, progress, error } =
    upload.imageUpload();

  const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    const files = e.target.files;
    if (files) {
      uploadFiles(Array.from(files));
    }
  };

  return (
    <div>
      <input
        type="file"
        multiple
        accept="image/*"
        onChange={handleFileChange}
        disabled={isUploading}
      />

      {isUploading && <p>Uploading... {Math.round(progress)}%</p>}

      {error && <p role="alert">{error.message}</p>}

      {uploadedFiles.length > 0 && (
        <ul>
          {uploadedFiles.map((file) => (
            <li key={file.url}>
              <img src={file.url} alt="Uploaded image" />
              <p>{file.name}</p>
            </li>
          ))}
        </ul>
      )}
    </div>
); } ``` ## Add to Your Page Finally, use your upload component in any page: ```typescript // app/page.tsx import { ImageUploader } from "@/components/image-uploader"; export default function HomePage() { return (
    <main>
      <h1>Upload Images</h1>
      <ImageUploader />
    </main>
); } ```
## πŸŽ‰ Congratulations! You now have **production-ready file uploads** working in your Next.js app! Here's what you accomplished: * βœ… **Type-safe uploads** with full TypeScript inference * βœ… **Automatic validation** for file types and sizes * βœ… **Progress tracking** with loading states * βœ… **Error handling** with user-friendly messages * βœ… **Secure uploads** using presigned URLs * βœ… **Multiple file support** with image preview **Turbo Mode Issue:** If you're using `next dev --turbo` and experiencing upload issues, try removing the `--turbo` flag from your dev script. There's a known compatibility issue with Turbo mode that can affect file uploads. ## What's Next? Now that you have the basics working, explore these advanced features:
* **🎨 Enhanced UI** - Add drag & drop, progress bars, and beautiful components (Image Upload Guide β†’)
* **πŸ”’ Custom Validation** - Add authentication, custom metadata, and middleware (Router Configuration β†’)
* **☁️ Other Providers** - Try Cloudflare R2 for better performance, or AWS S3, DigitalOcean, MinIO (Provider Setup β†’)
* **⚑ Enhanced Client** - Upgrade to property-based access for better DX (Migration Guide β†’)
## Need Help? * πŸ“– **Documentation**: Explore our comprehensive [guides](/docs/guides) * πŸ’¬ **Community**: Join our [Discord community](https://pushduck.dev/discord) * πŸ› **Issues**: Report bugs on [GitHub](https://github.com/abhay-ramesh/pushduck) * πŸ“§ **Support**: Email us at [support@pushduck.com](mailto:support@pushduck.com) **Loving Pushduck?** Give us a ⭐ on [GitHub](https://github.com/abhay-ramesh/pushduck) and help spread the word! # Philosophy & Scope (/docs/philosophy) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; ## Our Philosophy Pushduck is a **focused upload library**, not a platform. We believe in doing one thing exceptionally well: > The fastest, most lightweight way to add S3 file uploads to any web application This document defines the boundaries of what Pushduck will and won't do, and explains why. *** ## Core Principles ### πŸͺΆ Lightweight First Bundle size is a feature, not an afterthought. Every dependency is carefully considered. **We use:** * `aws4fetch` (6.4KB) instead of AWS SDK (500KB+) * Native `fetch()` API * Zero unnecessary dependencies **Result:** Core library stays under 10KB minified + gzipped *** ### 🎯 Focused Scope Do one thing (uploads) exceptionally well, rather than many things poorly. **We believe:** * Specialized tools beat all-in-one solutions * Small, focused libraries are easier to maintain * Users prefer composing tools over vendor lock-in **Result:** You can replace Pushduck easily if needed, or use it alongside other tools *** ### πŸ”Œ Extensibility Over Features Provide hooks and APIs, not built-in everything. 
**We provide:** * Middleware system for custom logic * Lifecycle hooks for integration points * Type-safe APIs for extension **You implement:** * Your specific business logic * Integration with your services * Custom workflows **Result:** Maximum flexibility without bloat *** ### πŸ“š Document, Don't Implement Show users how to integrate, don't build the integration. **We provide:** * Clear integration patterns * Example code * Best practices documentation **We don't build:** * Database adapters * Auth providers * Email services * Analytics platforms **Result:** Works with any stack, no vendor lock-in *** ## βœ… What Pushduck Does ### Core Upload Features Upload files directly to S3 without touching your server. Reduces bandwidth costs and improves performance. Track upload progress, speed, and ETA. Per-file and overall progress metrics for multi-file uploads. Validate file size, type, count, and custom rules. Prevent invalid uploads before they reach S3. Works with AWS S3, Cloudflare R2, DigitalOcean Spaces, MinIO, and any S3-compatible provider. 
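The validation described above (size, type, count, custom rules) is conceptually just a pure check over file metadata before any bytes leave the browser. An illustrative sketch of that idea, not Pushduck's internal validator:

```typescript
type FileMeta = { name: string; size: number; type: string };
type Rules = { maxBytes: number; allowedTypes: string[] };

// Return a list of human-readable problems; an empty list means the file passes.
function validateFile(file: FileMeta, rules: Rules): string[] {
  const problems: string[] = [];
  if (file.size > rules.maxBytes) {
    problems.push(`${file.name} exceeds ${rules.maxBytes} bytes`);
  }
  if (!rules.allowedTypes.includes(file.type)) {
    problems.push(`${file.name} has disallowed type ${file.type}`);
  }
  return problems;
}
```

Running checks like this before requesting a presigned URL means invalid files are rejected without ever touching the network.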
### Storage Operations ```typescript // List files const files = await storage.list.files({ prefix: "uploads/", maxResults: 50 }); // Delete files await storage.delete.file("uploads/old.jpg"); await storage.delete.byPrefix("temp/"); // Get metadata const info = await storage.metadata.getInfo("uploads/doc.pdf"); // Generate download URLs const url = await storage.download.presignedUrl("uploads/file.pdf", 3600); ``` **What we provide:** * βœ… List files with pagination and filtering * βœ… Delete files (single, batch, by prefix) * βœ… Get file metadata (size, date, content-type) * βœ… Generate presigned URLs (upload/download) * βœ… Check file existence **What we don't provide:** * ❌ File search/indexing (use Algolia, Elasticsearch) * ❌ File versioning (use S3 versioning) * ❌ Storage analytics (provide hooks for your analytics) * ❌ Duplicate detection (implement via hooks) *** ### Developer Experience Intelligent type inference from server to client. Catch errors at compile time. Works with Next.js, React, Express, Fastify, and more. Web Standards-based. Interactive setup wizard, automatic provider detection, and project scaffolding. Test your upload flows without hitting real S3. Perfect for CI/CD. *** ### Optional UI Components Following the [shadcn/ui](https://ui.shadcn.com) approach: ```bash # Copy components into your project npx @pushduck/cli add upload-dropzone npx @pushduck/cli add file-list ``` **What we provide:** * βœ… Basic upload UI components (dropzone, file-list, progress-bar) * βœ… Headless/unstyled components you can customize * βœ… Copy-paste, not installed as dependency **What we don't provide:** * ❌ Full-featured file manager UI * ❌ Image gallery/carousel components * ❌ File preview modals * ❌ Admin dashboard components **Philosophy:** You own the code, you customize it. We provide starting points, not rigid components. 
*** ## ❌ What Pushduck Doesn't Do ### File Processing **Out of Scope** - Use specialized tools for these tasks **We don't process files. Use these instead:** | Task | Recommended Tool | Why | | --------------------- | -------------------------------------------------- | ---------------------------- | | Image optimization | [Sharp](https://sharp.pixelplumbing.com/) | Best-in-class, battle-tested | | Video transcoding | [FFmpeg](https://ffmpeg.org/) | Industry standard | | PDF generation | [PDFKit](https://pdfkit.org/) | Specialized library | | Image transformations | [Cloudflare Images](https://cloudflare.com/images) | Edge-optimized | | Content moderation | AWS Rekognition, Cloudflare | Purpose-built services | **Why not?** * These tools do it better than we ever could * Adding them would balloon our bundle size * Creates unnecessary dependencies * Limits user choice **Our approach:** Document integration patterns **⚠️ Bandwidth Note:** Server-side processing requires downloading from S3 (inbound) and uploading variants (outbound). This negates the "server never touches files" benefit. 
**Better options:** * **Client-side preprocessing** (before upload) - Zero server bandwidth * **URL-based transforms** (Cloudinary, Imgix) - Zero server bandwidth * See [Image Uploads Guide](/docs/guides/image-uploads) for detailed patterns ```typescript // Example: Integrate with Sharp import sharp from 'sharp'; const router = s3.createRouter({ imageUpload: s3.image() .onUploadComplete(async ({ key }) => { // ⚠️ Downloads file from S3 to server const buffer = await s3.download(key); // Process with Sharp const optimized = await sharp(buffer) .resize(800, 600) .webp({ quality: 80 }) .toBuffer(); // ⚠️ Uploads processed file back to S3 await storage.upload.file(optimized, `optimized/${key}`); }) }); ``` ```typescript // βœ… Better: Client-side preprocessing (recommended) import imageCompression from 'browser-image-compression'; function ImageUpload() { const { uploadFiles } = upload.images(); const handleUpload = async (file: File) => { // βœ… Compress on client BEFORE upload const compressed = await imageCompression(file, { maxSizeMB: 1, maxWidthOrHeight: 1920, }); // Upload already-optimized file await uploadFiles([compressed]); }; } ``` *** ### Backend Services **Integration Pattern** - We provide hooks, you connect services **We don't implement these services:** | Service | What We Provide | You Implement | | --------------- | ----------------------- | ------------------ | | Webhooks | Lifecycle hooks | Webhook delivery | | Notifications | `onUploadComplete` hook | Email/SMS sending | | Database | File metadata in hooks | DB storage logic | | Queue Systems | Hooks with context | Queue integration | | Background Jobs | Async hook support | Job processing | | Analytics | Hooks with event data | Analytics tracking | **Example Integration:** ```typescript import { db } from '@/lib/database'; const router = s3.createRouter({ fileUpload: s3.file() .onUploadComplete(async ({ file, key, url, metadata }) => { // You implement database logic await db.files.create({ data: 
{ name: file.name, size: file.size, url: url, s3Key: key, userId: metadata.userId, uploadedAt: new Date() } }); }) }); ``` ```typescript import { sendWebhook } from '@/lib/webhooks'; const router = s3.createRouter({ fileUpload: s3.file() .onUploadComplete(async ({ file, url }) => { // You implement webhook delivery await sendWebhook({ event: 'file.uploaded', data: { filename: file.name, url: url, timestamp: new Date().toISOString() } }); }) }); ``` ```typescript import { sendEmail } from '@/lib/email'; const router = s3.createRouter({ fileUpload: s3.file() .onUploadComplete(async ({ file, metadata }) => { // You implement email notifications await sendEmail({ to: metadata.userEmail, subject: 'File Upload Complete', body: `Your file "${file.name}" has been uploaded successfully.` }); }) }); ``` ```typescript import { queue } from '@/lib/queue'; const router = s3.createRouter({ fileUpload: s3.file() .onUploadComplete(async ({ file, key }) => { // You implement queue integration await queue.add('process-file', { fileKey: key, fileName: file.name, processType: 'thumbnail-generation' }); }) }); ``` ```typescript import { analytics } from '@/lib/analytics'; const router = s3.createRouter({ fileUpload: s3.file() .onUploadStart(async ({ file, metadata }) => { // Track upload start await analytics.track('upload_started', { userId: metadata.userId, fileSize: file.size, fileType: file.type }); }) .onUploadComplete(async ({ file, url, metadata }) => { // Track successful upload await analytics.track('upload_completed', { userId: metadata.userId, fileName: file.name, fileSize: file.size, fileUrl: url }); }) .onUploadError(async ({ error, metadata }) => { // Track errors await analytics.track('upload_failed', { userId: metadata.userId, error: error.message }); }) }); ``` **Why this approach?** * βœ… You're not locked into our choice of services * βœ… Use your existing infrastructure * βœ… Switch services without changing upload library * βœ… Keeps our bundle size minimal *** ### 
Platform Features **Not a Platform** - Pushduck is a library, not a SaaS **We will never build:** ❌ **User Management** - Use NextAuth, Clerk, Supabase Auth, etc.\ ❌ **Team/Organization Systems** - Build in your application\ ❌ **Permission/Role Management** - Implement in your middleware\ ❌ **Analytics Dashboards** - We provide hooks for your analytics\ ❌ **Admin Panels** - Build with your UI framework\ ❌ **Billing/Subscriptions** - Use Stripe, Paddle, etc.\ ❌ **API Key Management** - Implement in your system\ ❌ **Audit Logs** - Log via hooks to your logging service **Why not?** * Every app has different requirements * Would require a backend service (we're a library) * Creates vendor lock-in * Massive scope creep from our core mission **Our approach:** Provide middleware hooks ```typescript import { auth } from '@/lib/auth'; import { checkPermission } from '@/lib/permissions'; import { logAudit } from '@/lib/audit'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async ({ req, metadata }) => { // YOUR auth system const user = await auth.getUser(req); if (!user) throw new Error('Unauthorized'); // YOUR permissions system if (!checkPermission(user, 'upload:create')) { throw new Error('Forbidden'); } // YOUR audit logging await logAudit({ userId: user.id, action: 'file.upload.started', metadata: metadata }); return { userId: user.id }; }) }); ``` *** ### Authentication & Authorization **What we provide:** * βœ… Middleware hooks for auth checks * βœ… Access to request context (headers, cookies, etc.) 
* βœ… Integration examples with popular auth providers **What we don't provide:** * ❌ Built-in auth system * ❌ Session management * ❌ OAuth providers * ❌ API key generation * ❌ User database **Example Integrations:** ```typescript import { auth } from '@/lib/auth'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async ({ req }) => { const session = await auth.api.getSession({ headers: req.headers }); if (!session?.user) { throw new Error('Please sign in to upload files'); } return { userId: session.user.id, userEmail: session.user.email }; }) }); ``` ```typescript import { getServerSession } from 'next-auth'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async ({ req }) => { const session = await getServerSession(); if (!session?.user) { throw new Error('Please sign in to upload files'); } return { userId: session.user.id, userEmail: session.user.email }; }) }); ``` ```typescript import { auth } from '@clerk/nextjs'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async () => { const { userId } = auth(); if (!userId) { throw new Error('Unauthorized'); } return { userId }; }) }); ``` ```typescript import { createServerClient } from '@supabase/ssr'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async ({ req }) => { const supabase = createServerClient(/* config */); const { data: { user } } = await supabase.auth.getUser(); if (!user) { throw new Error('Unauthorized'); } return { userId: user.id }; }) }); ``` ```typescript import { verifyToken } from '@/lib/auth'; const router = s3.createRouter({ fileUpload: s3.file() .middleware(async ({ req }) => { const token = req.headers.get('authorization')?.replace('Bearer ', ''); if (!token) { throw new Error('No token provided'); } const user = await verifyToken(token); if (!user) { throw new Error('Invalid token'); } return { userId: user.id }; }) }); ``` *** ## 🎯 The Integration Pattern This is our core philosophy in action: ```typescript // 1. 
We handle uploads // 2. You connect your services via hooks // 3. Everyone wins const router = s3.createRouter({ fileUpload: s3.file() // YOUR auth .middleware(async ({ req }) => { const user = await yourAuth.getUser(req); return { userId: user.id }; }) // YOUR business logic .onUploadStart(async ({ file, metadata }) => { await yourAnalytics.track('upload_started', { userId: metadata.userId, fileSize: file.size }); }) // YOUR database .onUploadComplete(async ({ file, url, key, metadata }) => { await yourDatabase.files.create({ userId: metadata.userId, url: url, s3Key: key, name: file.name, size: file.size }); // YOUR notifications await yourEmailService.send({ to: metadata.userEmail, template: 'upload-complete', data: { fileName: file.name } }); // YOUR webhooks await yourWebhooks.trigger({ event: 'file.uploaded', data: { url, fileName: file.name } }); // YOUR queue await yourQueue.add('process-file', { fileKey: key, userId: metadata.userId }); // YOUR analytics await yourAnalytics.track('upload_completed', { userId: metadata.userId, fileName: file.name, fileSize: file.size, fileUrl: url, fileKey: key }); }) // YOUR error handling .onUploadError(async ({ error, metadata }) => { await yourErrorTracking.log({ error: error, userId: metadata.userId }); }) }); ``` **Benefits:** * πŸͺΆ Pushduck stays lightweight (only upload logic) * πŸ”Œ You use your preferred services * 🎯 No vendor lock-in * ⚑ No unnecessary code in your bundle * πŸ”§ Maximum flexibility *** ## πŸ€” Decision Framework When considering new features, we ask: ### βœ… Add if: 1. **Core to uploads** - Directly helps files get to S3 2. **Universally needed** - 80%+ of users need it 3. **Can't be solved externally** - Must be part of upload flow 4. **Lightweight** - Doesn't balloon bundle size 5. **Framework agnostic** - Works everywhere ### ❌ Don't add if: 1. **Better tools exist** - Sharp does image processing better 2. **Service-specific** - Requires backend infrastructure 3. 
**Opinion-heavy** - Database choice, auth provider, etc. 4. **UI-specific** - Every app needs different UI 5. **Platform feature** - User management, billing, etc. ### πŸ”Œ Provide hooks if: 1. **Common integration point** - Many users need it 2. **Can be external** - Services can be swapped 3. **Timing matters** - Needs to happen at specific point in upload lifecycle *** ## 🌟 What This Means For You ### As a User **You get:** * βœ… Lightweight, focused upload library * βœ… Freedom to choose your own tools * βœ… No vendor lock-in * βœ… Clear integration patterns * βœ… Stable, predictable API **You're responsible for:** * πŸ”§ Choosing and integrating your services * πŸ”§ Building your UI (or copy ours) * πŸ”§ Implementing your business logic * πŸ”§ Managing your infrastructure ### As a Contributor **Focus contributions on:** * βœ… Core upload features (resumable, queuing, etc.) * βœ… Framework adapters * βœ… Testing utilities * βœ… Documentation & examples * βœ… Integration guides **We'll reject PRs for:** * ❌ File processing features * ❌ Backend services (webhooks, notifications) * ❌ Database adapters * ❌ Auth providers * ❌ Platform features *** ## πŸ“š Further Reading Our development roadmap and planned features Guidelines for contributing to the project Patterns for integrating databases, auth, notifications, and more Complete examples showing integration patterns *** ## πŸ’¬ Questions? Have questions about scope or philosophy? * πŸ’­ [GitHub Discussions](https://github.com/abhay-ramesh/pushduck/discussions) * πŸ’¬ [Discord Community](https://pushduck.dev/discord) * πŸ› [GitHub Issues](https://github.com/abhay-ramesh/pushduck/issues) **Remember:** We're focused on being the best upload library, not the biggest. Every feature we say "no" to keeps Pushduck fast, lightweight, and maintainable. 
# Quick Start (/docs/quick-start) import { Step, Steps } from "fumadocs-ui/components/steps"; import { Callout } from "fumadocs-ui/components/callout"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Accordion, Accordions } from "fumadocs-ui/components/accordion"; Get file uploads working in **3 simple steps**. No overwhelming configuration, just the essentials. **Prefer automated setup?** Use `npx @pushduck/cli init` for zero-config setup. This guide is for manual installation. ### Install Pushduck npm pnpm yarn bun ```bash npm install pushduck ``` ```bash pnpm add pushduck ``` ```bash yarn add pushduck ``` ```bash bun add pushduck ``` ### Create API Route One file with your S3 config and upload route: ```ts title="app/api/upload/route.ts" import { createUploadConfig } from 'pushduck/server'; const { s3 } = createUploadConfig() .provider("aws", { bucket: process.env.AWS_BUCKET_NAME!, region: process.env.AWS_REGION!, accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, }) .build(); const router = s3.createRouter({ imageUpload: s3.image().maxFileSize('5MB'), }); export const { GET, POST } = router.handlers; export type AppRouter = typeof router; ``` **Environment variables**: Add your S3 credentials to `.env.local`. See [Provider Setup](/docs/providers) for getting credentials. ### Create Client & Use in Components Create a reusable client and use it in your components: ```ts title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppRouter } from '@/app/api/upload/route'; export const upload = createUploadClient({ endpoint: '/api/upload' }); ``` ```tsx title="app/upload-demo.tsx" 'use client'; import { upload } from '@/lib/upload-client'; export function UploadDemo() { const { uploadFiles, files, isUploading } = upload.imageUpload(); return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {files.map((file) => (
        <div key={file.name}>
          {file.name} - {file.progress}%
          {file.status === 'success' && <img src={file.url} alt={file.name} />}
        </div>
      ))}
    </div>
); } ```
## βœ… Done! That's it - **3 steps, 3 files, \~40 lines of code**, and you have production-ready file uploads! *** ## Need More? For production apps, you'll want to add authentication and custom paths: ```ts title="app/api/upload/route.ts" const { s3 } = createUploadConfig() .provider("aws", { /* ... */ }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => `${metadata.userId}/${Date.now()}/${file.name}` }) .build(); const router = s3.createRouter({ imageUpload: s3.image() .maxFileSize('5MB') .middleware(async ({ req }) => { const user = await getUser(req); if (!user) throw new Error('Unauthorized'); return { userId: user.id }; }), }); ``` See [Configuration Guide](/docs/api/configuration/upload-config) for all options. Using Remix, SvelteKit, Hono, or another framework? See the [Integrations](/docs/integrations) page for framework-specific examples. Configure CORS on your S3 bucket: ```json [ { "AllowedOrigins": ["http://localhost:3000", "https://yourdomain.com"], "AllowedMethods": ["GET", "PUT", "POST", "DELETE"], "AllowedHeaders": ["*"], "ExposeHeaders": ["ETag"] } ] ``` See [Provider Setup](/docs/providers) for detailed CORS configuration. Want drag & drop, progress bars, and styled components? ```bash npx @pushduck/cli add upload-dropzone npx @pushduck/cli add file-list ``` Or see our [Examples](/docs/examples) for production-ready components. ## Next Steps # Roadmap (/docs/roadmap) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { File, Folder, Files } from "fumadocs-ui/components/files"; import { TypeTable } from "fumadocs-ui/components/type-table"; ## Development Roadmap Our mission is to make file uploads **simple**, **secure**, and **scalable** for every developer and every use case. 
## βœ… Completed

### Core Foundation

βœ… **Universal Compatibility** - Works with 16+ frameworks and edge runtimes\
βœ… **Type-Safe APIs** - Full TypeScript inference from server to client\
βœ… **Multi-Provider Support** - AWS S3, Cloudflare R2, DigitalOcean Spaces, MinIO\
βœ… **Production Security** - Presigned URLs, file validation, CORS handling\
βœ… **Developer Experience** - Property-based client, comprehensive error handling\
βœ… **Overall Progress Tracking** - Real-time aggregate progress metrics

* `progress` - 0-100% completion across all files
* `uploadSpeed` - Combined transfer rate in bytes/second
* `eta` - Overall time remaining in seconds

### Setup & Tooling

βœ… **Interactive CLI** - Guided setup with smart defaults and auto-detection\
βœ… **Code Generation** - Type-safe API routes and client components\
βœ… **Framework Detection** - Automatic Next.js App Router/Pages Router detection\
βœ… **Environment Setup** - Automated credential configuration

### Documentation & Examples

βœ… **Comprehensive Docs** - Complete API reference and integration guides\
βœ… **Live Examples** - Working demos for all supported frameworks\
βœ… **Migration Guides** - Step-by-step migration from other solutions\
βœ… **Best Practices** - Security, performance, and architecture guidance

## ⚠️ Current Limitations

### Upload Control Limitations

Current upload management has constraints for handling real-world scenarios.
**The following features are NOT yet implemented but are planned for Q3 2025:** ❌ **No Resumable Uploads** - Cannot resume interrupted uploads from where they left off\ ❌ **No Pausable Uploads** - Cannot pause ongoing uploads and resume later\ ❌ **No Cancel Support** - Cannot cancel individual uploads in progress\ ❌ **Limited Network Resilience** - No automatic retry on network failures or connection switching These features are actively being designed and will be released based on community feedback and use case requirements. ## 🚧 In Progress ### Developer Experience 🚧 **Enhanced Error Messages** - Contextual help and troubleshooting suggestions\ 🚧 **Testing Utilities** - Mock S3 providers and testing helpers for CI/CD\ 🚧 **Performance Profiling** - Debugging hooks for upload performance analysis ## πŸ“‹ Planned ### Q3 2025 - Core Upload Features * βœ… **Enhanced Hook APIs** - onProgress callbacks and advanced upload state management (with `onStart` lifecycle) * **Advanced Upload Control** - Resumable, pausable uploads with cancel support and network resilience * **Upload Queue Management** - Concurrent upload limits, prioritization, and bandwidth throttling * **Chunk Upload Optimization** - Configurable chunk sizes and parallel chunk uploads for large files * **Better Error Recovery** - Automatic retry with exponential backoff and network change detection ### Q4 2025 - Framework Support * **Mobile SDKs** - React Native and Expo support for mobile apps * **Additional Framework Adapters** - Astro, Qwik, Solid.js, and Fresh * **Testing Framework** - Built-in test utilities and mock providers * **Migration Tools** - Automated migration from uploadthing, uppy, and other libraries ### Q1 2026 - Developer Experience * **Enhanced TypeScript** - Better type inference and IDE support * **Plugin System** - Hooks-based extension system for custom upload logic * **Debug Mode** - Detailed logging and upload lifecycle visualization * **Performance Monitoring Hooks** - Metrics 
collection for custom analytics integration ### Q2 2026 - Integration Ecosystem * **Integration Guides** - Documented patterns for Sharp, FFmpeg, Cloudflare Images, etc. * **Example Applications** - Production-ready examples for common use cases * **Community Adapters** - Framework adapters maintained by the community * **Best Practices Library** - Shared configurations and patterns ## 🎯 Long-term Vision ### Stay Focused, Stay Lightweight Pushduck will remain a **focused upload library**, not a platform. Our mission: > "The fastest, most lightweight way to add S3 file uploads to any web application" **What we'll always prioritize:** * πŸͺΆ **Minimal bundle size** - Keep dependencies light * 🎯 **Core upload features** - Do one thing exceptionally well * πŸ”§ **Extensibility** - Provide hooks, not built-in features for everything * πŸ“š **Integration guides** - Document how to integrate with other tools * 🌍 **Universal compatibility** - Work everywhere JavaScript runs **What we won't build:** * ❌ File processing (use Sharp, FFmpeg, etc.) * ❌ Content moderation (use Cloudflare, AWS services) * ❌ Analytics dashboards (provide hooks for your analytics) * ❌ Team management (that's your app's concern) * ❌ Platform features (we're a library, not a SaaS) πŸ“– **Read our full philosophy** - For detailed scope boundaries, integration patterns, and decision framework, see [Philosophy & Scope](/docs/philosophy) ## πŸ’‘ Ideas & Suggestions Have ideas for pushduck? We'd love to hear them! * [Feature Requests](https://github.com/abhay-ramesh/pushduck/discussions/categories/ideas) * [Community Discord](https://pushduck.dev/discord) * [GitHub Issues](https://github.com/abhay-ramesh/pushduck/issues) ## 🀝 Contributing Want to help build the future of file uploads? Check out our [Contributing Guide](https://github.com/abhay-ramesh/pushduck/blob/main/CONTRIBUTING.md) to get started. 
### Current Priorities We're actively looking for contributors in these areas: * **Core Upload Features** - Resumable uploads, better error recovery * **Framework Adapters** - Help us support more frameworks and platforms * **Documentation** - Improve guides, examples, and API documentation * **Testing** - Expand test coverage and add integration tests * **Performance** - Optimize bundle size and runtime performance * **Integration Guides** - Document patterns for common tools (Sharp, FFmpeg, etc.) *** *Last updated: June 2025* This roadmap is community-driven. **Your feedback shapes our priorities.** Join our [Discord](https://pushduck.dev/discord) or open an issue on [GitHub](https://github.com/abhay-ramesh/pushduck) to influence what we build next. ## Current Status We've already solved the core problems that have frustrated developers for years: βœ… **Interactive CLI** - Guided setup with smart defaults and auto-detection\ βœ… **Type Safety** - Full TypeScript inference for upload schemas\ βœ… **Multiple Providers** - Cloudflare R2, AWS S3, Google Cloud, and more\ βœ… **Production Ready** - Used by teams processing millions of uploads\ βœ… **Developer Experience** - Property-based client access with enhanced IntelliSense ## What's Next ### πŸš€ Q3 2025: Core Upload Features **Planned for Q3 2025** - Complete control over upload lifecycle with automatic recovery from network issues: ```typescript // PLANNED API - Not yet implemented const { files, uploadFiles, pauseUpload, resumeUpload, cancelUpload } = upload.images // Pause individual uploads await pauseUpload(fileId) // Resume from where it left off await resumeUpload(fileId) // Cancel with cleanup await cancelUpload(fileId) // Automatic network resilience const config = { retryAttempts: 3, networkSwitchTolerance: true, resumeOnReconnect: true } ``` Manage upload queues with prioritization and bandwidth throttling: ```typescript // PLANNED API const { uploadFiles } = upload.images({ queue: { maxConcurrent: 3, 
      maxBandwidth: '5MB/s',
      priority: 'high'
    }
  }
})
```

Optimize large file uploads with configurable chunk sizes and parallel processing:

```typescript
// PLANNED API
const { uploadFiles } = upload.videos({
  chunking: {
    chunkSize: '10MB',
    parallelChunks: 3,
    retryChunks: true
  }
})
```

Robust error handling with automatic retry and network change detection:

```typescript
// PLANNED API
const { uploadFiles } = upload.files({
  retry: {
    attempts: 3,
    exponentialBackoff: true,
    detectNetworkChange: true
  }
})
```

### 🌍 Q4 2025: Framework Support

Complete framework support with the same developer experience:

```typescript
// Vue 3 Composition API
import { createUploadClient } from '@pushduck/vue'

const upload = createUploadClient({ endpoint: '/api/upload' })
const { files, uploadFiles, isUploading } = upload.imageUpload
```

```typescript
// Svelte stores
import { uploadStore } from '@pushduck/svelte'

const upload = uploadStore('/api/upload')

// Reactive stores for upload state
$: ({ files, isUploading } = $upload.imageUpload)
```

```typescript
// Pure JavaScript
import { UploadClient } from '@pushduck/core'

const client = new UploadClient('/api/upload')

client.upload('imageUpload', files)
  .on('progress', (progress) => console.log(progress))
  .on('complete', (urls) => console.log(urls))
```

Upload support for React Native and Expo apps with the same type-safe API. Platform-specific optimizations for iOS and Android.

Built-in mock S3 providers, upload simulation utilities, and testing helpers for comprehensive test coverage in your CI/CD pipeline.
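The chunking options shown earlier in this section amount to splitting a file into fixed-size byte ranges that can be uploaded, and retried, independently. A minimal planner sketch, with assumed names since the feature is still planned and not part of the current pushduck API:

```typescript
// Illustrative chunk planner; not part of pushduck.
interface ChunkRange {
  index: number;
  start: number; // inclusive byte offset
  end: number;   // exclusive byte offset
}

function planChunks(totalBytes: number, chunkBytes: number): ChunkRange[] {
  const chunks: ChunkRange[] = [];
  for (let start = 0, index = 0; start < totalBytes; start += chunkBytes, index++) {
    // The final chunk may be shorter than chunkBytes.
    chunks.push({ index, start, end: Math.min(start + chunkBytes, totalBytes) });
  }
  return chunks;
}
```

A 25-unit file with 10-unit chunks yields the ranges `[0, 10)`, `[10, 20)`, `[20, 25)`; a `parallelChunks: 3` setting would then upload up to three of these ranges at once and retry only the ranges that fail.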
## Community Feedback

### What We're Hearing From You

Based on community feedback, GitHub issues, and Discord discussions, here are the most requested **in-scope** features:

**πŸ”₯ High Priority:**

* βœ… **Upload Resume & Retry** - Automatic retry and resume for failed uploads (Q3 2025)
* **Better Error Messages** - More helpful error descriptions with suggested fixes
* **Queue Management** - Control concurrent uploads and bandwidth throttling
* **Progress Customization** - More granular progress tracking hooks
* **Example Library** - More real-world examples and integration patterns

**πŸ’­ Under Discussion:**

* **GraphQL Integration** - Native GraphQL subscription support for upload progress
* **Webhook Support** - Custom webhook configuration for upload lifecycle events
* **Component Library** - Headless UI components for common upload patterns
* **Performance Profiling** - Built-in profiling tools for debugging upload issues

**Out of Scope:** We won't build features like file processing (use Sharp/FFmpeg), content moderation (use specialized services), or platform features (team management, dashboards). We're staying focused as a lightweight upload library.

## How We Prioritize

Our roadmap is driven by three key factors:

1. **Community Impact** - Features that solve real problems for the most developers
2. **Technical Excellence** - Maintaining our high standards for type safety and DX
3. **Ecosystem Health** - Building a sustainable, long-term solution

### Voting on Features

Have an idea or want to prioritize something? Here's how to influence our roadmap:

* Use our feature request template with use cases and expected API design. Include code examples and real-world scenarios.
* Join our Discord server where we run monthly polls on upcoming features. Your vote directly influences our development priorities.
* First Friday of every month at 10 AM PT - open to all developers. Share your use cases and help shape the future.
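The upload resume-and-retry behavior requested above typically pairs each retry with an exponentially growing delay. A self-contained sketch of that pattern (names and defaults are illustrative, not a pushduck API):

```typescript
// Illustrative retry-with-backoff helpers; not part of pushduck.

// Delay before retry n (0-based) grows geometrically: 500ms, 1000ms, 2000ms, ...
function backoffDelays(attempts: number, baseMs = 500, factor = 2): number[] {
  return Array.from({ length: attempts }, (_, n) => baseMs * factor ** n);
}

async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  // First try runs immediately; subsequent tries wait out the backoff delay.
  for (const delay of [0, ...backoffDelays(attempts - 1)]) {
    if (delay > 0) await new Promise((resolve) => setTimeout(resolve, delay));
    try {
      return await fn();
    } catch (err) {
      lastError = err; // e.g. a transient network failure; retry after backoff
    }
  }
  throw lastError;
}
```

A real implementation would also cap the maximum delay and abort early on non-retryable errors (such as a 403 from an expired presigned URL).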
## Development Principles As we build new features, we never compromise on: * **πŸͺΆ Lightweight First** - Bundle size is a feature, not an afterthought * **🎯 Focused Scope** - Do one thing (uploads) exceptionally well * **πŸ“˜ Type Safety** - Every feature must have full TypeScript support * **πŸ”„ Zero Breaking Changes** - Backward compatibility is non-negotiable * **⚑ Performance** - New features can't slow down existing workflows * **πŸ”Œ Extensibility** - Provide hooks, not built-in everything Follow our [GitHub project board](https://github.com/abhay-ramesh/pushduck/projects) for real-time updates on development progress. ## Get Involved This roadmap exists because of developers like you. Here's how to shape the future: ### For Users * **Share your use case** - Tell us what you're building * **Report pain points** - What's still too complicated? * **Request integrations** - Which providers or tools do you need? ### For Contributors * **Code contributions** - Check our [contributing guide](https://github.com/abhay-ramesh/pushduck/blob/main/CONTRIBUTING.md) * **Documentation** - Help improve examples and guides * **Community support** - Answer questions in Discord and GitHub ### For Organizations * **Sponsorship** - Support full-time development * **Enterprise feedback** - Share your scale challenges * **Partnership** - Integrate pushduck with your platform *** **Ready to build the future of file uploads?** Join our [Discord community](https://pushduck.dev/discord) and help us make file uploads delightful for every Next.js developer. 
# CLI Reference (/docs/api/cli) import { Callout } from 'fumadocs-ui/components/callout' import { Card, Cards } from 'fumadocs-ui/components/card' import { Steps, Step } from 'fumadocs-ui/components/steps' import { Tab, Tabs } from 'fumadocs-ui/components/tabs' import { Files, Folder, File } from 'fumadocs-ui/components/files' **πŸš€ Recommended**: Use our CLI for the fastest setup experience **Next.js Only**: The pushduck CLI currently only supports Next.js projects. Support for other frameworks is coming soon. ## Quick Start Get your file uploads working in under 2 minutes with our interactive CLI tool. npm pnpm yarn bun ```bash npx @pushduck/cli@latest init ``` ```bash pnpm dlx @pushduck/cli@latest init ``` ```bash yarn dlx @pushduck/cli@latest init ``` ```bash bun x @pushduck/cli@latest init ``` The CLI will automatically: * πŸ” **Detect your package manager** (npm, pnpm, yarn, bun) * πŸ“¦ **Install dependencies** using your preferred package manager * ☁️ **Set up your storage provider** (Cloudflare R2, AWS S3, etc.) * πŸ› οΈ **Generate type-safe code** (API routes, client, components) * βš™οΈ **Configure environment** variables and bucket setup ## What the CLI Does * Detects App Router vs Pages Router * Finds existing TypeScript configuration * Checks for existing upload implementations * Validates project structure * AWS S3, Cloudflare R2, DigitalOcean Spaces * Google Cloud Storage, MinIO * Automatic bucket creation * CORS configuration * Type-safe API routes * Upload client configuration * Example components * Environment variable templates The CLI walks you through each step, asking only what's necessary for your specific setup. 
## CLI Commands

### `init` - Initialize Setup

npm pnpm yarn bun

```bash
npx @pushduck/cli@latest init [options]
```

```bash
pnpm dlx @pushduck/cli@latest init [options]
```

```bash
yarn dlx @pushduck/cli@latest init [options]
```

```bash
bun x @pushduck/cli@latest init [options]
```

**Options:**

* `--provider <provider>` - Skip provider selection (aws|cloudflare-r2|digitalocean|minio|gcs)
* `--skip-examples` - Don't generate example components
* `--skip-bucket` - Don't create S3 bucket automatically
* `--api-path <path>` - Custom API route path (default: `/api/upload`)
* `--dry-run` - Show what would be created without creating
* `--verbose` - Show detailed output

**Examples:**

Quick Setup AWS Direct Custom API Path Components Only

```bash
# Interactive setup with all prompts
npx @pushduck/cli@latest init
```

```bash
# Skip provider selection, use AWS S3
npx @pushduck/cli@latest init --provider aws
```

```bash
# Use custom API route path
npx @pushduck/cli@latest init --api-path /api/files
```

```bash
# Generate only components, skip bucket creation
npx @pushduck/cli@latest init --skip-bucket --skip-examples
```

### `add` - Add Upload Route

```bash
npx @pushduck/cli@latest add
```

Add new upload routes to existing configuration:

```bash
# Interactive route builder
npx @pushduck/cli@latest add

# Example output:
# ✨ Added imageUpload route for profile pictures
# ✨ Added documentUpload route for file attachments
# ✨ Updated router types
```

### `test` - Test Configuration

```bash
npx @pushduck/cli@latest test [options]
```

**Options:**

* `--verbose` - Show detailed test output

Validates your current setup:

```bash
npx @pushduck/cli@latest test

# Example output:
# βœ… Environment variables configured
# βœ… S3 bucket accessible
# βœ… CORS configuration valid
# βœ… API routes responding
# βœ… Types generated correctly
```

## Interactive Setup Walkthrough

### Step 1: Project Detection

```
πŸ” Detecting your project...
βœ“ Next.js App Router detected βœ“ TypeScript configuration found βœ“ Package manager: pnpm detected βœ“ No existing upload configuration βœ“ Project structure validated ``` ### Step 2: Provider Selection ``` ? Which cloud storage provider would you like to use? ❯ Cloudflare R2 (recommended) AWS S3 (classic, widely supported) DigitalOcean Spaces (simple, affordable) Google Cloud Storage (enterprise-grade) MinIO (self-hosted, open source) Custom S3-compatible endpoint ``` ### Step 3: Credential Setup ``` πŸ”§ Setting up Cloudflare R2... πŸ” Checking for existing credentials... βœ“ Found CLOUDFLARE_R2_ACCESS_KEY_ID βœ“ Found CLOUDFLARE_R2_SECRET_ACCESS_KEY βœ“ Found CLOUDFLARE_R2_ACCOUNT_ID ⚠ CLOUDFLARE_R2_BUCKET_NAME not found ? Enter your R2 bucket name: my-app-uploads ? Create bucket automatically? Yes ``` ### Step 4: API Configuration ``` ? Where should we create the upload API? ❯ app/api/upload/route.ts (recommended) app/api/s3-upload/route.ts (classic) Custom path ? Generate example upload page? ❯ Yes, create app/upload/page.tsx with full example Yes, just add components to components/ui/ No, I'll build my own ``` ### Step 5: File Generation ``` πŸ› οΈ Generating files... ✨ Created files: β”œβ”€β”€ app/api/upload/route.ts β”œβ”€β”€ app/upload/page.tsx β”œβ”€β”€ components/ui/upload-button.tsx β”œβ”€β”€ components/ui/upload-dropzone.tsx β”œβ”€β”€ lib/upload-client.ts └── .env.example πŸ“¦ Installing dependencies... βœ“ pushduck βœ“ react-dropzone πŸŽ‰ Setup complete! Your uploads are ready. 
```

## Generated Project Structure

After running the CLI, your project will have:

### Generated API Route

```typescript title="app/api/upload/route.ts"
import { s3 } from '@/lib/upload'
import { getServerSession } from 'next-auth'
import { authOptions } from '@/lib/auth'

const s3Router = s3.createRouter({
  // Image uploads for profile pictures
  imageUpload: s3.image()
    .maxFileSize("5MB")
    .maxFiles(1)
    .formats(["jpeg", "png", "webp"])
    .middleware(async ({ req, metadata }) => {
      const session = await getServerSession(authOptions)
      if (!session?.user?.id) {
        throw new Error("Authentication required")
      }
      return {
        ...metadata,
        userId: session.user.id,
        folder: `uploads/${session.user.id}`
      }
    }),

  // Document uploads
  documentUpload: s3.file()
    .maxFileSize("10MB")
    .maxFiles(5)
    .types(["application/pdf", "text/plain", "application/msword"])
    .middleware(async ({ req, metadata }) => {
      const session = await getServerSession(authOptions)
      if (!session?.user?.id) {
        throw new Error("Authentication required")
      }
      return {
        ...metadata,
        userId: session.user.id,
        folder: `documents/${session.user.id}`
      }
    })
})

export type AppRouter = typeof s3Router
export const { GET, POST } = s3Router.handlers
```

### Generated Upload Client

```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from '@/app/api/upload/route'

export const upload = createUploadClient({
  endpoint: '/api/upload'
})
```

### Generated Example Page

```typescript title="app/upload/page.tsx"
import { UploadButton } from '@/components/ui/upload-button'
import { UploadDropzone } from '@/components/ui/upload-dropzone'

export default function UploadPage() {
  return (
    <div>
      <h1>File Upload Demo</h1>

      <section>
        <h2>Profile Picture</h2>
        {/* Props are illustrative; check the generated components for their actual API */}
        <UploadButton route="imageUpload" />
      </section>

      <section>
        <h2>Documents</h2>
        <UploadDropzone route="documentUpload" />
      </section>
    </div>
) } ``` ## Environment Variables The CLI automatically creates `.env.example` and prompts for missing values: ```bash title=".env.example" # Cloudflare R2 Configuration (Recommended) CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key_here CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key_here CLOUDFLARE_R2_ACCOUNT_ID=your_account_id_here CLOUDFLARE_R2_BUCKET_NAME=your-bucket-name # Alternative: AWS S3 Configuration # AWS_ACCESS_KEY_ID=your_access_key_here # AWS_SECRET_ACCESS_KEY=your_secret_key_here # AWS_REGION=us-east-1 # AWS_S3_BUCKET_NAME=your-bucket-name # Next.js Configuration NEXTAUTH_SECRET=your_nextauth_secret_here NEXTAUTH_URL=http://localhost:3000 # Optional: Custom S3 endpoint (for MinIO, etc.) # S3_ENDPOINT=https://your-custom-endpoint.com ``` ## Provider-Specific Setup ```bash npx @pushduck/cli@latest init --provider cloudflare-r2 ``` **What gets configured:** * Cloudflare R2 S3-compatible endpoints * Global edge network optimization * Zero egress fee configuration * CORS settings for web uploads ```bash npx @pushduck/cli@latest init --provider aws ``` **What gets configured:** * AWS S3 regional endpoints * IAM permissions and policies * Bucket lifecycle management * CloudFront CDN integration (optional) ```bash npx @pushduck/cli@latest init --provider digitalocean ``` **Required Environment Variables:** * `AWS_ACCESS_KEY_ID` (DO Spaces key) * `AWS_SECRET_ACCESS_KEY` (DO Spaces secret) * `AWS_REGION` (DO region) * `AWS_S3_BUCKET_NAME` * `S3_ENDPOINT` (DO Spaces endpoint) **What the CLI does:** * Configures DigitalOcean Spaces endpoints * Sets up CDN configuration * Validates access permissions * Configures CORS policies ```bash npx @pushduck/cli@latest init --provider minio ``` **Required Environment Variables:** * `AWS_ACCESS_KEY_ID` (MinIO access key) * `AWS_SECRET_ACCESS_KEY` (MinIO secret key) * `AWS_REGION=us-east-1` * `AWS_S3_BUCKET_NAME` * `S3_ENDPOINT` (MinIO server URL) **What the CLI does:** * Configures self-hosted MinIO endpoints * Sets up bucket 
policies * Validates server connectivity * Configures development-friendly settings

## Troubleshooting

### CLI Not Found

```bash
# If you get "command not found"
npm install -g @pushduck/cli

# Or use npx for one-time usage
npx @pushduck/cli@latest init
```

### Permission Errors

```bash
# If you get permission errors during setup
sudo npx @pushduck/cli@latest init

# Or fix npm permissions
npm config set prefix ~/.npm-global
export PATH=~/.npm-global/bin:$PATH
```

### Existing Configuration

```bash
# Force overwrite existing configuration
npx @pushduck/cli@latest init --force

# Or backup and regenerate
cp app/api/upload/route.ts app/api/upload/route.ts.backup
npx @pushduck/cli@latest init
```

### Bucket Creation Failed

```bash
# Test your credentials first
npx @pushduck/cli@latest test

# Skip automatic bucket creation
npx @pushduck/cli@latest init --skip-bucket

# Create bucket manually, then run:
npx @pushduck/cli@latest test
```

## Advanced Usage

### Custom Templates

```bash
# Use custom file templates
npx @pushduck/cli@latest init --template enterprise

# Available templates:
# - default: Basic setup with examples
# - minimal: Just API routes, no examples
# - enterprise: Full security and monitoring
# - ecommerce: Product images and documents
```

### Monorepo Support

```bash
# For monorepos, specify the Next.js app directory
cd apps/web
npx @pushduck/cli@latest init

# Or use the --cwd flag
npx @pushduck/cli@latest init --cwd apps/web
```

### CI/CD Integration

```bash
# Non-interactive mode for CI/CD
npx @pushduck/cli@latest init \
  --provider aws \
  --skip-examples \
  --api-path /api/upload \
  --no-interactive
```

***

**Complete CLI Reference**: This guide covers all CLI commands, options, and use cases. For a quick start, see our [Quick Start guide](/docs/quick-start).
# API Reference (/docs/api) import { Card, Cards } from "fumadocs-ui/components/card"; import { Callout } from "fumadocs-ui/components/callout"; ## Complete API Documentation Complete reference documentation for all pushduck APIs, from client-side hooks to server configuration and storage operations. **Type-Safe by Design**: All pushduck APIs are built with TypeScript-first design, providing excellent developer experience with full type inference and autocompletion. ## Client APIs * `useUpload` - Core upload hook with progress tracking * `useUploadRoute` - Route-specific uploads with validation **Perfect for**: React applications, reactive UIs * `createUploadClient` - Type-safe upload client * Property-based route access * Enhanced type inference **Perfect for**: Complex applications, better DX ## Server Configuration * Route definitions and validation * File type and size restrictions * Custom naming strategies **Essential for**: Setting up upload routes * Router configuration options * Middleware integration * Advanced routing patterns **Essential for**: Server setup and customization * Default upload options * Error handling configuration * Progress tracking settings **Essential for**: Client-side configuration * Dynamic path generation * Custom naming strategies * Folder organization **Essential for**: File organization ## Server APIs * Route definition and configuration * Built-in validation and middleware * Type-safe request/response handling **Core API**: The heart of pushduck * File listing and metadata * Delete operations * Presigned URL generation **Perfect for**: File management features ## Developer Tools * Project initialization * Component generation * Development utilities **Perfect for**: Quick setup and scaffolding * Error diagnosis and solutions * Performance optimization * Common gotchas and fixes **Essential for**: Problem solving ## Quick Reference ### Basic Server Setup ```typescript import { createS3Router, s3 } from 'pushduck/server'; const 
uploadRouter = createS3Router({ storage: { provider: 'aws-s3', region: 'us-east-1', bucket: 'my-bucket', credentials: { accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, }, }, routes: { imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB"), }, }); export const { GET, POST } = uploadRouter.handlers; ``` ### Basic Client Usage ```typescript import { useUpload } from 'pushduck/client'; function UploadComponent() { const { upload, uploading, progress } = useUpload({ endpoint: '/api/upload', route: 'imageUpload', }); const handleUpload = async (file: File) => { const result = await upload(file); console.log('Public URL:', result.url); // Permanent access console.log('Download URL:', result.presignedUrl); // Temporary access (1 hour) }; return (
    <div>
      <input
        type="file"
        onChange={(e) => handleUpload(e.target.files![0])}
      />
      {uploading && <p>Progress: {progress}%</p>}
    </div>
); } ``` ## Architecture Overview **Getting Started**: New to pushduck? Start with the [Quick Start](/docs/quick-start) guide, then explore the specific APIs you need for your use case. ## API Categories | Category | Purpose | Best For | | ------------------- | ----------------------- | ---------------------------------------- | | **Client APIs** | Frontend file uploads | React components, user interactions | | **Server APIs** | Backend upload handling | Route definitions, validation | | **Storage APIs** | File management | Listing, deleting, URL generation | | **Configuration** | Setup and customization | Project configuration, advanced features | | **Developer Tools** | Development workflow | Setup, debugging, optimization | ## Next Steps 1. **New to pushduck?** β†’ Start with [Quick Start](/docs/quick-start) 2. **Setting up uploads?** β†’ Check [S3 Router](/docs/api/s3-router) 3. **Building UI?** β†’ Explore [React Hooks](/docs/api/client) 4. **Managing files?** β†’ Use [Storage API](/docs/api/storage) 5. **Need help?** β†’ Visit [Troubleshooting](/docs/api/troubleshooting) # S3 Router (/docs/api/s3-router) ## S3 Router Configuration The S3 router provides a type-safe way to define upload endpoints with schema validation, middleware, and lifecycle hooks. 
## Basic Router Setup ```typescript title="app/api/upload/route.ts" // app/api/upload/route.ts import { s3 } from '@/lib/upload' const s3Router = s3.createRouter({ imageUpload: s3 .image() .maxFileSize('5MB') .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ file, metadata }) => { // [!code highlight] // Add authentication and user context return { ...metadata, userId: 'user-123', uploadedAt: new Date().toISOString(), } }), documentUpload: s3 .file() .maxFileSize('10MB') .types(['application/pdf', 'text/plain']) .paths({ prefix: 'documents', }), }) // Export the handler export const { GET, POST } = s3Router.handlers; // [!code highlight] ``` ## Schema Builders ### Image Schema ```typescript title="Image Schema Configuration" s3.image() .maxFileSize('5MB') // [!code highlight] .formats(['jpeg', 'jpg', 'png', 'webp', 'gif']) .dimensions({ minWidth: 100, maxWidth: 2000 }) .quality(0.8) // JPEG quality ``` ### File Schema ```typescript title="File Schema Configuration" s3.file() .maxFileSize('10MB') // [!code highlight] .types(['application/pdf', 'text/plain', 'application/json']) .extensions(['pdf', 'txt', 'json']) ``` ### Object Schema (Multiple Files) ```typescript title="Object Schema Configuration" s3.object({ images: s3.image().maxFileSize('5MB').maxFiles(5), // [!code highlight] documents: s3.file().maxFileSize('10MB').maxFiles(2), thumbnail: s3.image().maxFileSize('1MB').maxFiles(1), }) ``` ## Route Configuration ### Middleware Add authentication, validation, and metadata: ```typescript title="Middleware Example" .middleware(async ({ file, metadata, req }) => { // [!code highlight] // Authentication const user = await authenticateUser(req) if (!user) { throw new Error('Authentication required') // [!code highlight] } // File validation if (file.size > 10 * 1024 * 1024) { throw new Error('File too large') } // Return enriched metadata return { ...metadata, // Client-provided metadata (e.g., albumId, tags) userId: user.id, // [!code highlight] 
userRole: user.role, uploadedAt: new Date().toISOString(), ipAddress: req.headers.get('x-forwarded-for'), } }) ``` **Client Metadata Support:** The `metadata` parameter contains data sent from the client via `uploadFiles(files, metadata)`. This allows passing UI context like album selections, tags, or form data. The middleware can then enrich this client metadata with server-side data like authenticated user information. **Example with Client Metadata:** ```typescript // Client component const { uploadFiles } = upload.imageUpload(); uploadFiles(files, { albumId: 'vacation-2025', tags: ['beach', 'sunset'], visibility: 'private' }); // Server middleware receives and validates .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Validate client-provided albumId if (metadata?.albumId) { const album = await db.albums.findFirst({ where: { id: metadata.albumId, userId: user.id // Ensure user owns the album } }); if (!album) throw new Error('Album not found or access denied'); } return { // Client metadata (validated) albumId: metadata?.albumId, tags: metadata?.tags || [], visibility: metadata?.visibility || 'private', // Server metadata (trusted) userId: user.id, // From auth, NOT from client role: user.role, // From auth, NOT from client uploadedAt: new Date().toISOString() }; }) ``` **Security Warning:** Client metadata is UNTRUSTED user input. Always validate and never trust client-provided identity claims (userId, role, permissions, etc.). Extract identity from authenticated sessions on the server. 
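Following the warning above, one concrete pattern is to pass client metadata through an allowlist before merging in trusted server-side fields. A sketch that reuses the field names from the example (the helper itself is illustrative, not a pushduck API):

```typescript
// Illustrative validator for client-supplied upload metadata; not part of pushduck.
// Only allowlisted keys survive; identity claims (userId, role, etc.) are dropped
// because identity must come from the authenticated session, never the client.
const ALLOWED_KEYS = ['albumId', 'tags', 'visibility'] as const;

function sanitizeClientMetadata(raw: Record<string, unknown>): Record<string, unknown> {
  const clean: Record<string, unknown> = {};
  for (const key of ALLOWED_KEYS) {
    if (key in raw) clean[key] = raw[key];
  }
  return clean;
}
```

Middleware would then spread `sanitizeClientMetadata(metadata ?? {})` into the returned object before adding server-derived fields like `userId`, so a malicious client cannot smuggle in identity claims.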
### Path Configuration Control where files are stored: ```typescript .paths({ // Simple prefix prefix: 'user-uploads', // Custom path generation generateKey: (ctx) => { const { file, metadata, routeName } = ctx const userId = metadata.userId const timestamp = Date.now() return `${routeName}/${userId}/${timestamp}/${file.name}` }, // Simple suffix suffix: 'processed', }) ``` ### Lifecycle Hooks React to upload events: ```typescript .onUploadStart(async ({ file, metadata }) => { console.log(`Starting upload: ${file.name}`) // Log to analytics await analytics.track('upload_started', { userId: metadata.userId, filename: file.name, fileSize: file.size, }) }) .onUploadComplete(async ({ file, url, metadata }) => { console.log(`Upload complete: ${file.name} -> ${url}`) // Save to database await db.files.create({ filename: file.name, url, userId: metadata.userId, size: file.size, contentType: file.type, uploadedAt: new Date(), }) // Send notification await notificationService.send({ userId: metadata.userId, type: 'upload_complete', message: `${file.name} uploaded successfully`, }) }) .onUploadError(async ({ file, error, metadata }) => { console.error(`Upload failed: ${file.name}`, error) // Log error await errorLogger.log({ operation: 'file_upload', error: error.message, userId: metadata.userId, filename: file.name, }) }) ``` ## Advanced Examples ### E-commerce Product Images ```typescript const productRouter = s3.createRouter({ productImages: s3 .image() .maxFileSize('5MB') .formats(['jpeg', 'jpg', 'png', 'webp']) .dimensions({ minWidth: 800, maxWidth: 2000 }) .middleware(async ({ metadata, req }) => { const user = await authenticateUser(req) const productId = metadata.productId // Verify user owns the product const product = await db.products.findFirst({ where: { id: productId, ownerId: user.id } }) if (!product) { throw new Error('Product not found or access denied') } return { ...metadata, userId: user.id, productId, productName: product.name, } }) .paths({ generateKey: 
(ctx) => { const { metadata } = ctx return `products/${metadata.productId}/images/${Date.now()}.jpg` } }) .onUploadComplete(async ({ url, metadata }) => { // Update product with new image await db.products.update({ where: { id: metadata.productId }, data: { images: { push: url } } }) }), productDocuments: s3 .file() .maxFileSize('10MB') .types(['application/pdf']) .paths({ prefix: 'product-docs', }) .onUploadComplete(async ({ url, metadata }) => { await db.productDocuments.create({ productId: metadata.productId, documentUrl: url, type: 'specification', }) }), }) ``` ### User Profile System ```typescript const profileRouter = s3.createRouter({ avatar: s3 .image() .maxFileSize('2MB') .formats(['jpeg', 'jpg', 'png']) .dimensions({ minWidth: 100, maxWidth: 500 }) .middleware(async ({ req }) => { const user = await authenticateUser(req) return { userId: user.id, type: 'avatar' } }) .paths({ generateKey: (ctx) => { return `users/${ctx.metadata.userId}/avatar.jpg` } }) .onUploadComplete(async ({ url, metadata }) => { // Update user profile await db.users.update({ where: { id: metadata.userId }, data: { avatarUrl: url } }) // Invalidate cache await cache.del(`user:${metadata.userId}`) }), documents: s3 .object({ resume: s3.file().maxFileSize('5MB').types(['application/pdf']).maxFiles(1), portfolio: s3.file().maxFileSize('10MB').maxFiles(3), }) .middleware(async ({ req }) => { const user = await authenticateUser(req) return { userId: user.id } }) .paths({ prefix: 'user-documents', }), }) ``` ## Client-Side Usage Once you have your router set up, use it from the client: ```typescript // components/FileUploader.tsx import { useUploadRoute } from 'pushduck' export function FileUploader() { const { upload, isUploading } = useUploadRoute('imageUpload') const handleUpload = async (files: FileList) => { try { const results = await upload(files, { // This metadata will be passed to middleware productId: 'product-123', category: 'main-images', }) console.log('Upload complete:', 
results)
    } catch (error) {
      console.error('Upload failed:', error)
    }
  }

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
        disabled={isUploading}
      />
      {isUploading && <p>Uploading...</p>}
    </div>
) } ``` ## Type Safety The router provides full TypeScript support: ```typescript // Types are automatically inferred type RouterType = typeof s3Router // Get route names type RouteNames = keyof RouterType // 'imageUpload' | 'documentUpload' // Get route input types type ImageUploadInput = InferRouteInput<RouterType['imageUpload']> // Get route metadata types type ImageUploadMetadata = InferRouteMetadata<RouterType['imageUpload']> ``` # Troubleshooting (/docs/api/troubleshooting) import { Callout } from "fumadocs-ui/components/callout"; import { Tabs, Tab } from "fumadocs-ui/components/tabs"; ## Common Issues and Solutions ## Development Issues ### Next.js Turbo Mode Compatibility **Known Issue:** pushduck has compatibility issues with Next.js Turbo mode (`--turbo` flag). **Problem:** Uploads fail or behave unexpectedly when using `next dev --turbo`. **Solution:** Remove the `--turbo` flag from your development script: ```json { "scripts": { // ❌ This may cause issues "dev": "next dev --turbo", // βœ… Use this instead "dev": "next dev" } } ``` ```bash # ❌ This may cause issues npm run dev --turbo # βœ… Use this instead npm run dev ``` **Why this happens:** Turbo mode's aggressive caching and bundling can interfere with the upload process, particularly with presigned URL generation and file streaming. ## Upload Failures ### CORS Errors **Problem:** Browser console shows CORS errors when uploading files. **Symptoms:** ``` Access to XMLHttpRequest at 'https://bucket.s3.amazonaws.com/...' from origin 'http://localhost:3000' has been blocked by CORS policy ``` **Solution:** Configure CORS on your storage bucket. **Comprehensive CORS Guide:** For detailed CORS configuration, testing, and troubleshooting across all providers, see the [CORS & ACL Configuration Guide](/docs/guides/security/cors-and-acl).
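As a reference point, a typical S3-style CORS document looks like this (a sketch — the origins shown are placeholders; adjust them, the methods, and the headers for your provider):

```json
[
  {
    "AllowedOrigins": ["http://localhost:3000", "https://app.example.com"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedHeaders": ["*"],
    "ExposeHeaders": ["ETag"],
    "MaxAgeSeconds": 3000
  }
]
```

Exposing `ETag` matters for presigned uploads, since the browser otherwise cannot read it from the cross-origin response.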
**Quick fixes:** * See the [provider setup guides](/docs/providers) for basic CORS configuration * Ensure your domain is included in `AllowedOrigins` * Verify all required HTTP methods are allowed (`PUT`, `POST`, `GET`) * Check that required headers are included in `AllowedHeaders` ### Environment Variables Not Found **Problem:** Errors about missing environment variables. **Symptoms:** ``` Error: Environment variable CLOUDFLARE_R2_ACCESS_KEY_ID is not defined ``` **Solution:** Ensure your environment variables are properly set: 1. **Check your `.env.local` file exists** in your project root 2. **Verify variable names** match exactly (case-sensitive) 3. **Restart your development server** after adding new variables ```bash # .env.local CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key CLOUDFLARE_R2_ACCOUNT_ID=your_account_id R2_BUCKET=your-bucket-name ``` ### File Size Limits **Problem:** Large files fail to upload. **Solution:** Check and adjust size limits: ```typescript // app/api/upload/route.ts const uploadRouter = s3.createRouter({ imageUpload: s3 .image() .maxFileSize("10MB") // Increase as needed .formats(["jpeg", "png", "webp"]), }); ``` ## Type Errors ### TypeScript Inference Issues **Problem:** TypeScript errors with upload client. **Solution:** Ensure proper type exports: ```typescript // app/api/upload/route.ts export const { GET, POST } = uploadRouter.handlers; export type AppRouter = typeof uploadRouter; // βœ… Export the type // lib/upload-client.ts import type { AppRouter } from "@/app/api/upload/route"; export const upload = createUploadClient<AppRouter>({ // βœ… Use the type endpoint: "/api/upload", }); ``` ## Performance Issues ### Slow Upload Speeds **Problem:** Uploads are slower than expected. **Solutions:** 1. **Choose the right provider region** close to your users 2. **Check your internet connection** and server resources 3.
**Consider your provider's performance characteristics** ### Memory Issues with Large Files **Problem:** Browser crashes or high memory usage with large files. **Solution:** File streaming is handled automatically by pushduck: ```typescript // File streaming is handled automatically // No additional configuration needed const { uploadFiles } = upload.fileUpload(); await uploadFiles(largeFiles); // βœ… Streams automatically ``` ## Getting Help If you're still experiencing issues: 1. **Check the documentation** for your specific provider 2. **For CORS/ACL issues** see the [CORS & ACL Configuration Guide](/docs/guides/security/cors-and-acl) 3. **Enable debug logging** by setting `NODE_ENV=development` 4. **Check browser console** for detailed error messages 5. **Verify your provider configuration** is correct **Need more help?** Create an issue on [GitHub](https://github.com/abhay-ramesh/pushduck/issues) with detailed information about your setup and the error you're experiencing. # Client-Side Approaches (/docs/guides/client-approaches) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Client-Side Approaches Pushduck provides **two ways** to integrate file uploads in your React components. Both approaches now provide **identical functionality** including per-route callbacks, progress tracking, and error handling. **Recommendation**: Use the **Enhanced Structured Client** approach for the best developer experience. It now provides the same flexibility as hooks while maintaining superior type safety and centralized configuration. ## Quick Comparison ```typescript const upload = createUploadClient({ endpoint: '/api/upload' }) // Simple usage const { uploadFiles, files } = upload.imageUpload() // With per-route callbacks (NEW!) 
const { uploadFiles, files } = upload.imageUpload({ onStart: (files) => setUploadStarted(true), onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress) }) ``` **Best for**: Most projects - provides superior DX, type safety, and full flexibility ```typescript const { uploadFiles, files } = useUploadRoute('imageUpload', { onStart: (files) => setUploadStarted(true), onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress) }) ``` **Best for**: Teams that strongly prefer React hooks, legacy code migration ## Feature Parity Both approaches now support **identical functionality**:

| Feature | Enhanced Structured Client | Hook-Based |
| --------------------- | -------------------------------- | ---------------------------- |
| βœ… Type Safety | **Superior** - Property-based | Good - Generic types |
| βœ… Per-route Callbacks | **βœ… Full support** | βœ… Full support |
| βœ… Progress Tracking | **βœ… Full support** | βœ… Full support |
| βœ… Error Handling | **βœ… Full support** | βœ… Full support |
| βœ… Multiple Endpoints | **βœ… Per-route endpoints** | βœ… Per-route endpoints |
| βœ… Upload Control | **βœ… Enable/disable uploads** | βœ… Enable/disable uploads |
| βœ… Auto-upload | **βœ… Per-route control** | βœ… Per-route control |
| βœ… Overall Progress | **βœ… progress, uploadSpeed, eta** | βœ… progress, uploadSpeed, eta |

## API Comparison: Identical Capabilities Both approaches now return **exactly the same** properties and accept **exactly the same** configuration options: ```typescript
// Hook-Based Approach
const {
  uploadFiles,  // (files: File[]) => Promise
  files,        // S3UploadedFile[]
  isUploading,  // boolean
  errors,       // string[]
  reset,        // () => void
  progress,     // number (0-100) - overall progress
  uploadSpeed,  // number (bytes/sec) - overall speed
  eta           // number (seconds) - overall ETA
} =
useUploadRoute('imageUpload', { onStart: (files) => setUploadStarted(true), onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress), endpoint: '/api/custom-upload', }); // Enhanced Structured Client - IDENTICAL capabilities const { uploadFiles, // (files: File[]) => Promise files, // S3UploadedFile[] isUploading, // boolean errors, // string[] reset, // () => void progress, // number (0-100) - overall progress uploadSpeed, // number (bytes/sec) - overall speed eta // number (seconds) - overall ETA } = upload.imageUpload({ onStart: (files) => setUploadStarted(true), onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress), endpoint: '/api/custom-upload', }); ``` ## Complete Options Parity Both approaches support **identical configuration options**: ```typescript interface CommonUploadOptions { onStart?: (files: S3FileMetadata[]) => void; onSuccess?: (results: UploadResult[]) => void; onError?: (error: Error) => void; onProgress?: (progress: number) => void; endpoint?: string; // Custom endpoint per route } // Hook-based: useUploadRoute(routeName, options) // Structured: upload.routeName(options) // Both accept the same CommonUploadOptions interface ``` ## Return Value Parity Both approaches return **identical properties**: ```typescript interface CommonUploadReturn { uploadFiles: (files: File[]) => Promise; files: S3UploadedFile[]; isUploading: boolean; errors: string[]; reset: () => void; // Overall progress tracking (NEW in both!) 
progress?: number; // 0-100 percentage across all files uploadSpeed?: number; // bytes per second across all files eta?: number; // seconds remaining for all files } ``` ## Enhanced Structured Client Examples ### Basic Usage (Unchanged) ```typescript import { createUploadClient } from 'pushduck/client' import type { AppRouter } from '@/lib/upload' const upload = createUploadClient<AppRouter>({ endpoint: '/api/upload' }) export function SimpleUpload() { const { uploadFiles, files, isUploading } = upload.imageUpload() return ( <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> ) } ``` ### With Per-Route Configuration (NEW!) ```typescript export function AdvancedUpload() { const [progress, setProgress] = useState(0) const { uploadFiles, files, isUploading, errors, reset } = upload.imageUpload({ onStart: (files) => { console.log('πŸš€ Upload starting!', files) setUploadStarted(true) }, onSuccess: (results) => { console.log('βœ… Upload successful!', results) results.forEach(file => { console.log('Public URL:', file.url); // Permanent access console.log('Download URL:', file.presignedUrl); // Temporary access (1 hour) }); showNotification('Images uploaded successfully!') setUploadStarted(false) }, onError: (error) => { console.error('❌ Upload failed:', error) showErrorNotification(error.message) setUploadStarted(false) }, onProgress: (progress) => { console.log(`πŸ“Š Progress: ${progress}%`) setProgress(progress) } }) return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
      />
      {progress > 0 && <progress value={progress} max={100} />}
    </div>
) } ``` ### Multiple Routes with Different Configurations ```typescript export function MultiUploadComponent() { // Images with progress tracking const images = upload.imageUpload({ onStart: (files) => setUploadingImages(true), onProgress: (progress) => setImageProgress(progress) }) // Documents with different endpoint and success handler const documents = upload.documentUpload({ endpoint: '/api/secure-upload', onStart: (files) => setUploadingDocuments(true), onSuccess: (results) => { // Use presigned URLs for private document downloads updateDocumentLibrary(results.map(file => ({ id: file.id, name: file.name, downloadUrl: file.presignedUrl, // Secure, time-limited access permanentUrl: file.url, // For internal operations key: file.key }))); } }) // Videos with conditional logic in component const videos = upload.videoUpload({ onStart: (files) => setUploadingVideos(true) }) return (
    <div>{/* upload UIs for images, documents, and videos */}</div>
) } ``` ### Global Configuration with Per-Route Overrides ```typescript const upload = createUploadClient({ endpoint: '/api/upload', // Global defaults (optional) defaultOptions: { onStart: (files) => console.log(`Starting upload of ${files.length} files`), onProgress: (progress) => console.log(`Global progress: ${progress}%`), onError: (error) => logError(error) } }) // This route inherits global defaults const basic = upload.imageUpload() // This route overrides specific options const custom = upload.documentUpload({ endpoint: '/api/secure-upload', // Override endpoint onSuccess: (results) => handleSecureUpload(results) // Add success handler // Still inherits global onProgress and onError }) ``` ## Hook-Based Approach (Unchanged) ```typescript import { useUploadRoute } from 'pushduck/client' export function HookBasedUpload() { const { uploadFiles, files, isUploading, error } = useUploadRoute('imageUpload', { onStart: (files) => console.log('Starting upload:', files), onSuccess: (results) => console.log('Success:', results), onError: (error) => console.error('Error:', error), onProgress: (progress) => console.log('Progress:', progress) }) return ( <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> ) } ``` ## Migration Guide ### From Hook-Based to Enhanced Structured Client ```typescript // Before: Hook-based const { uploadFiles, files } = useUploadRoute('imageUpload', { onStart: handleStart, onSuccess: handleSuccess, onError: handleError }) // After: Enhanced structured client const upload = createUploadClient({ endpoint: '/api/upload' }) const { uploadFiles, files } = upload.imageUpload({ onStart: handleStart, onSuccess: handleSuccess, onError: handleError }) ``` ### Benefits of Migration 1. **Better Type Safety**: Route names are validated at compile time 2. **Enhanced IntelliSense**: Auto-completion for all available routes 3. **Centralized Configuration**: Single place to configure endpoints and defaults 4.
**Refactoring Support**: Rename routes safely across your codebase 5. **No Performance Impact**: Same underlying implementation ## When to Use Each Approach ### Use Enhanced Structured Client When: * βœ… **Starting a new project** - best overall developer experience * βœ… **Want superior type safety** - compile-time route validation * βœ… **Need centralized configuration** - single place for settings * βœ… **Value refactoring support** - safe route renames ### Use Hook-Based When: * βœ… **Migrating existing code** - minimal changes required * βœ… **Dynamic route names** - routes determined at runtime * βœ… **Team strongly prefers hooks** - familiar React patterns * βœ… **Legacy compatibility** - maintaining older codebases ## Performance Considerations Both approaches have **identical performance** characteristics: * Same underlying `useUploadRoute` implementation * Same network requests and upload logic * Same React hooks rules and lifecycle The enhanced structured client adds zero runtime overhead while providing compile-time benefits. *** **Full Feature Parity**: Both approaches now support the same functionality. The choice comes down to developer experience preferences rather than feature limitations. ## Detailed Comparison ### Type Safety & Developer Experience ```typescript // βœ… Complete type inference from server router const upload = createUploadClient({ endpoint: '/api/upload' }) // βœ… Property-based access - no string literals const { uploadFiles, files } = upload.imageUpload() // βœ… IntelliSense shows all available endpoints upload. // <- Shows: imageUpload, documentUpload, videoUpload... 
// βœ… Compile-time validation upload.nonExistentRoute() // ❌ TypeScript error // βœ… Refactoring safety // Rename routes in router β†’ TypeScript shows all usage locations ``` **Benefits:** * 🎯 **Full type inference** from server to client * πŸ” **IntelliSense support** - discover endpoints through IDE * πŸ›‘οΈ **Refactoring safety** - rename with confidence * 🚫 **No string literals** - eliminates typos * ⚑ **Better DX** - property-based access feels natural ```typescript // βœ… With type parameter - recommended for better type safety const { uploadFiles, files } = useUploadRoute<AppRouter>('imageUpload') // βœ… Without type parameter - also works const { uploadFiles, files } = useUploadRoute('imageUpload') // Type parameter provides compile-time validation const typed = useUploadRoute<AppRouter>('imageUpload') // Route validated const untyped = useUploadRoute('imageUpload') // Any string accepted ``` **Characteristics:** * πŸͺ **React hook pattern** - familiar to React developers * πŸ”€ **Flexible usage** - works with or without type parameter * 🧩 **Component-level state** - each hook manages its own state * 🎯 **Type safety** - enhanced when using `useUploadRoute<AppRouter>` * πŸ” **IDE support** - best with type parameter ### Code Examples **Structured Client:** ```typescript import { upload } from '@/lib/upload-client' export function ImageUploader() { const { uploadFiles, files, isUploading, error } = upload.imageUpload() return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {/* Upload UI */}
    </div>
) } ``` **Hook-Based:** ```typescript import { useUploadRoute } from 'pushduck/client' export function ImageUploader() { const { uploadFiles, files, isUploading, error } = useUploadRoute('imageUpload') return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {/* Same upload UI */}
    </div>
) } ```
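The property-based access shown above can be approximated with a typed Proxy. This is a minimal, self-contained sketch — not pushduck's actual implementation — illustrating why `upload.imageUpload` gets compile-time route checking while a plain string-based call cannot:

```typescript
// A sketch (not pushduck's real code) of how a property-based client can
// turn `client.routeName()` into a typed accessor.
type RouteNames = "imageUpload" | "documentUpload";

function createClient<T extends string>(config: { endpoint: string }) {
  // The Proxy intercepts property access, so `client.imageUpload` becomes a
  // route accessor; the generic T restricts which property names type-check.
  return new Proxy({} as Record<T, () => { route: T; endpoint: string }>, {
    get(_target, prop) {
      return () => ({ route: prop as T, endpoint: config.endpoint });
    },
  });
}

const client = createClient<RouteNames>({ endpoint: "/api/upload" });
console.log(client.imageUpload()); // { route: 'imageUpload', endpoint: '/api/upload' }
```

With this shape, `client.nonExistentRoute()` is a compile-time error, whereas a string argument would only fail at runtime.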
**Structured Client:** ```typescript export function FileManager() { const images = upload.imageUpload() const documents = upload.documentUpload() const videos = upload.videoUpload() return (
    <div>{/* upload UIs for images, documents, and videos */}</div>
) } ``` **Hook-Based:** ```typescript export function FileManager() { const images = useUploadRoute('imageUpload') const documents = useUploadRoute('documentUpload') const videos = useUploadRoute('videoUpload') return (
    <div>{/* upload UIs for images, documents, and videos */}</div>
) } ```
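When several routes upload at once, as in the `FileManager` examples above, you may want a single overall figure. A hypothetical byte-weighted aggregation — `RouteProgress` and `combineProgress` are illustrative names, not part of pushduck's API:

```typescript
// Hypothetical helper: combine per-route progress (0-100) into one overall
// figure by weighting each route by the bytes it is responsible for.
interface RouteProgress {
  progress: number;    // 0-100 for this route
  totalBytes: number;  // bytes queued on this route
  uploadSpeed: number; // bytes/sec for this route
}

function combineProgress(routes: RouteProgress[]) {
  const totalBytes = routes.reduce((sum, r) => sum + r.totalBytes, 0);
  if (totalBytes === 0) return { progress: 0, uploadSpeed: 0, eta: 0 };

  const uploaded = routes.reduce(
    (sum, r) => sum + (r.progress / 100) * r.totalBytes, 0);
  const uploadSpeed = routes.reduce((sum, r) => sum + r.uploadSpeed, 0);
  const remaining = totalBytes - uploaded;

  return {
    progress: Math.round((uploaded / totalBytes) * 100),
    uploadSpeed,
    eta: uploadSpeed > 0 ? remaining / uploadSpeed : Infinity,
  };
}

// 50% of 1000 bytes + 100% of 3000 bytes => 3500/4000 bytes overall
console.log(combineProgress([
  { progress: 50, totalBytes: 1000, uploadSpeed: 100 },
  { progress: 100, totalBytes: 3000, uploadSpeed: 0 },
]).progress); // 88
```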
**Structured Client:** ```typescript // lib/upload-client.ts export const upload = createUploadClient({ endpoint: '/api/upload', headers: { Authorization: `Bearer ${getAuthToken()}` } }) // components/secure-uploader.tsx export function SecureUploader() { const { uploadFiles } = upload.secureUpload() // Authentication handled globally } ``` **Hook-Based:** ```typescript export function SecureUploader() { const { uploadFiles } = useUploadRoute('secureUpload', { headers: { Authorization: `Bearer ${getAuthToken()}` } }) // Authentication per hook usage } ```
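If tokens can expire, computing headers lazily per request avoids baking a stale token into the client at creation time. A small sketch under that assumption — `buildAuthHeaders` and the token supplier are illustrative, not a pushduck API:

```typescript
// Hypothetical sketch: produce auth headers at request time so each upload
// picks up the current token rather than the one captured at client creation.
type HeaderFactory = () => Record<string, string>;

function buildAuthHeaders(getToken: () => string | null): HeaderFactory {
  return () => {
    const token = getToken();
    // Only attach Authorization when a token is actually available.
    return token ? { Authorization: `Bearer ${token}` } : {};
  };
}

let currentToken: string | null = "abc123";
const headers = buildAuthHeaders(() => currentToken);
console.log(headers()); // { Authorization: 'Bearer abc123' }
currentToken = null;
console.log(headers()); // {}
```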
## Conclusion **Our Recommendation**: Use the **Enhanced Structured Client** approach (`createUploadClient`) for most projects. It provides superior developer experience, better refactoring safety, and enhanced type inference. **Both approaches are supported**: The hook-based approach (`useUploadRoute`) is fully supported and valid for teams that prefer traditional React patterns. **Quick Decision Guide:** * **Most projects** β†’ Use `createUploadClient` (recommended) * **Strongly prefer React hooks** β†’ Use `useUploadRoute` * **Want best DX and type safety** β†’ Use `createUploadClient` * **Need component-level control** β†’ Use `useUploadRoute` ### Next Steps * **New Project**: Start with [createUploadClient](/docs/api/client/create-upload-client) * **Existing Hook Code**: Consider [migrating gradually](/docs/guides/migrate-to-enhanced-client) * **Need Help**: Join our [Discord community](https://pushduck.dev/discord) for guidance # Client-Side Metadata (/docs/guides/client-metadata) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Client-Side Metadata Client-side metadata allows you to pass contextual information from your UI directly to the server during file uploads. This enables dynamic file organization, categorization, and processing based on user selections and application state. **New Feature:** As of v0.1.23, you can now pass metadata from the client when calling `uploadFiles()`. This metadata flows through to your middleware, lifecycle hooks, and path generation functions. ## Why Use Client Metadata? 
Client metadata bridges the gap between UI context and server-side processing: * 🎯 **UI State** - Pass album selections, categories, or form data * 🏒 **Multi-tenant Context** - Send workspace, project, or organization IDs * 🏷️ **User Preferences** - Include tags, visibility settings, or custom fields * πŸ“Š **Dynamic Organization** - Organize files based on client context * 🎨 **Flexible Workflows** - Adapt to different use cases without API changes ## Basic Usage **Client: Pass metadata with uploadFiles** ```typescript import { upload } from '@/lib/upload-client'; export function ImageUploader() { const { uploadFiles } = upload.imageUpload(); const handleUpload = (files: File[]) => { // Pass metadata as second parameter uploadFiles(files, { albumId: 'vacation-2025', tags: ['beach', 'sunset'], visibility: 'private' }); }; return <input type="file" multiple onChange={(e) => handleUpload(Array.from(e.target.files || []))} />; } ``` **Server: Receive in middleware** ```typescript // app/api/upload/route.ts const s3Router = s3.createRouter({ imageUpload: s3.image() .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); return { ...metadata, // Client data: { albumId, tags, visibility } userId: user.id, // Server data from auth }; }) }); ``` **Use in hooks and path generation** ```typescript .paths({ generateKey: (ctx) => { // Metadata available in path generation return `users/${ctx.metadata.userId}/albums/${ctx.metadata.albumId}/${ctx.file.name}`; } }) .onUploadComplete(async ({ metadata, url }) => { // Metadata available in lifecycle hooks await db.images.create({ url, albumId: metadata.albumId, tags: metadata.tags, userId: metadata.userId }); }) ``` ## Real-World Examples ### Multi-Tenant SaaS Application ```typescript // Client component export function WorkspaceFileUpload({ workspace, project }: Props) { const { uploadFiles } = upload.documentUpload(); const handleUpload = (files: File[]) => { uploadFiles(files, { workspaceId: workspace.id, projectId: project.id, teamId:
workspace.team.id, folder: selectedFolder.path, permissions: { canEdit: currentUser.role === 'admin', canDelete: currentUser.role === 'admin', canShare: true } }); }; return <input type="file" multiple onChange={(e) => handleUpload(Array.from(e.target.files || []))} />; } ``` ```typescript // Server middleware - validates tenant isolation .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Verify user belongs to workspace const membership = await db.workspaceMemberships.findFirst({ where: { workspaceId: metadata?.workspaceId, userId: user.id } }); if (!membership) { throw new Error('Access denied to workspace'); } // Verify user can access project const project = await db.projects.findFirst({ where: { id: metadata?.projectId, workspaceId: metadata?.workspaceId } }); if (!project) { throw new Error('Project not found'); } return { // Validated client metadata workspaceId: metadata.workspaceId, projectId: metadata.projectId, folder: metadata.folder || '/', // Server metadata userId: user.id, userRole: membership.role, uploadedAt: new Date().toISOString() }; }) .paths({ generateKey: (ctx) => { const { metadata, file } = ctx; return `workspaces/${metadata.workspaceId}/projects/${metadata.projectId}${metadata.folder}/${file.name}`; } }) ``` ### E-Commerce Product Images ```typescript // Client: Product image upload with variants export function ProductImageManager({ product }: { product: Product }) { const [imageType, setImageType] = useState<'main' | 'gallery' | 'thumbnail'>('gallery'); const [selectedVariant, setSelectedVariant] = useState(null); const { uploadFiles, files } = upload.productImages(); const handleUpload = (files: File[]) => { uploadFiles(files, { productId: product.id, variantId: selectedVariant, imageType: imageType, sortOrder: product.images.length + 1, altText: `${product.name} - ${imageType} image` }); }; return (
    <div>
      {product.variants.length > 0 && (
        <select onChange={(e) => setSelectedVariant(e.target.value)}>
          {/* variant options */}
        </select>
      )}
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(Array.from(e.target.files))}
      />
    </div>
); } ``` ```typescript // Server: Organize by product and variant .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Verify user owns product const product = await db.products.findFirst({ where: { id: metadata?.productId, ownerId: user.id } }); if (!product) { throw new Error('Product not found or access denied'); } return { productId: metadata.productId, variantId: metadata.variantId, imageType: metadata.imageType, sortOrder: metadata.sortOrder, userId: user.id, merchantId: product.merchantId }; }) .paths({ generateKey: (ctx) => { const { metadata, file } = ctx; const variantPath = metadata.variantId ? `/variants/${metadata.variantId}` : ''; return `products/${metadata.productId}${variantPath}/${metadata.imageType}/${metadata.sortOrder}-${file.name}`; } }) .onUploadComplete(async ({ metadata, url }) => { await db.productImages.create({ productId: metadata.productId, variantId: metadata.variantId, type: metadata.imageType, url: url, sortOrder: metadata.sortOrder, altText: metadata.altText }); }) ``` ### Content Management System ```typescript // Client: Content upload with categorization export function CMSMediaUpload() { const [contentType, setContentType] = useState('article'); const [category, setCategory] = useState('technology'); const [tags, setTags] = useState([]); const [publishDate, setPublishDate] = useState(''); const { uploadFiles } = upload.mediaUpload(); const handleUpload = (files: File[]) => { uploadFiles(files, { contentType: contentType, category: category, tags: tags, publishDate: publishDate || new Date().toISOString(), featured: false, author: currentUser.username }); }; return (
    <div>
      {/* content type, category, and tag selectors omitted */}
      <input
        type="date"
        value={publishDate}
        onChange={(e) => setPublishDate(e.target.value)}
      />
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(Array.from(e.target.files))}
      />
    </div>
); } ``` ```typescript // Server: CMS organization .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Verify user has content creation permissions if (!user.permissions.includes('create:content')) { throw new Error('Insufficient permissions'); } return { contentType: metadata?.contentType || 'article', category: metadata?.category, tags: metadata?.tags || [], publishDate: metadata?.publishDate, authorId: user.id, authorName: user.name, status: 'draft' }; }) .paths({ generateKey: (ctx) => { const { metadata, file } = ctx; const date = new Date(metadata.publishDate); const year = date.getFullYear(); const month = String(date.getMonth() + 1).padStart(2, '0'); return `content/${metadata.contentType}/${year}/${month}/${metadata.category}/${file.name}`; } }) ``` ## Security Best Practices **⚠️ CRITICAL: Client metadata is UNTRUSTED user input.** Never trust identity claims, permissions, or security-related data from the client. Always validate and extract identity from authenticated sessions on the server. ### ❌ DON'T: Trust Client Identity ```typescript // ❌ BAD: Trusting client-provided userId uploadFiles(files, { userId: currentUser.id, // Client can fake this isAdmin: true, // Client can lie about this role: 'admin' // Never trust this from client }); .middleware(async ({ metadata }) => { return { userId: metadata.userId, // ❌ DANGEROUS! role: metadata.role // ❌ SECURITY RISK! 
}; }) ``` ### βœ… DO: Validate and Override ```typescript // βœ… GOOD: Server determines identity uploadFiles(files, { albumId: selectedAlbum.id, // βœ… OK - contextual data tags: selectedTags, // βœ… OK - user input (validate) visibility: 'private' // βœ… OK - user preference }); .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Server auth // Validate client data if (metadata?.albumId) { const album = await db.albums.findFirst({ where: { id: metadata.albumId, userId: user.id // Verify ownership } }); if (!album) throw new Error('Invalid album'); } return { // Client metadata (validated) albumId: metadata?.albumId, tags: sanitizeTags(metadata?.tags || []), // Server metadata (trusted) userId: user.id, // βœ… From auth role: user.role, // βœ… From auth uploadedAt: new Date().toISOString() }; }) ``` ## Validation Strategies ### Type Validation ```typescript .middleware(async ({ metadata }) => { // Validate data types if (metadata?.albumId && typeof metadata.albumId !== 'string') { throw new Error('Invalid albumId type'); } if (metadata?.tags && !Array.isArray(metadata.tags)) { throw new Error('Tags must be an array'); } if (metadata?.sortOrder && typeof metadata.sortOrder !== 'number') { throw new Error('Invalid sortOrder type'); } return { ...metadata }; }) ``` ### Value Validation ```typescript .middleware(async ({ metadata }) => { // Validate UUIDs const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i; if (metadata?.albumId && !uuidRegex.test(metadata.albumId)) { throw new Error('Invalid album ID format'); } // Validate enums const validVisibilities = ['public', 'private', 'unlisted']; if (metadata?.visibility && !validVisibilities.includes(metadata.visibility)) { throw new Error('Invalid visibility setting'); } // Sanitize strings const sanitizedTags = (metadata?.tags || []).map((tag: string) => tag.trim().toLowerCase().replace(/[^a-z0-9-]/g, '') ); return { ...metadata, tags: sanitizedTags }; 
}) ``` ### Database Validation ```typescript .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Verify referenced entities exist and user has access if (metadata?.projectId) { const project = await db.projects.findFirst({ where: { id: metadata.projectId, members: { some: { userId: user.id } } } }); if (!project) { throw new Error('Project not found or access denied'); } } if (metadata?.folderId) { const folder = await db.folders.findFirst({ where: { id: metadata.folderId, projectId: metadata.projectId, deleted: false } }); if (!folder) { throw new Error('Folder not found'); } } return { ...metadata, userId: user.id }; }) ``` ## TypeScript Support Define metadata interfaces for better type safety: ```typescript // Define your metadata interface interface UploadMetadata { albumId: string; tags: string[]; visibility: 'public' | 'private' | 'unlisted'; featured?: boolean; } // Client usage with type safety const handleUpload = (files: File[]) => { const metadata: UploadMetadata = { albumId: selectedAlbum.id, tags: selectedTags, visibility: visibilityOption, featured: isFeatured }; uploadFiles(files, metadata); }; // Server middleware with typed metadata .middleware(async ({ req, metadata }: { req: NextRequest; metadata?: UploadMetadata }) => { const user = await authenticateUser(req); return { ...metadata, userId: user.id }; }) ``` ## Common Use Cases ### Album/Gallery Organization ```typescript // Pass album selection from UI uploadFiles(files, { albumId: selectedAlbum.id, albumName: selectedAlbum.name, tags: selectedTags, visibility: albumSettings.defaultVisibility }); ``` ### Document Management ```typescript // Pass folder structure and metadata uploadFiles(files, { folderId: currentFolder.id, folderPath: currentFolder.fullPath, category: documentCategory, confidential: isConfidential, expiresAt: expirationDate }); ``` ### User Profile Assets ```typescript // Pass asset type and purpose uploadFiles(files, { assetType: 
'profile-picture', purpose: 'avatar', aspectRatio: '1:1', previousAssetId: currentAvatar?.id // For cleanup }); ``` ### Form Submissions ```typescript // Pass form context with uploads uploadFiles(files, { formId: formSubmission.id, formType: 'contact', attachmentType: 'supporting-document', relatedTo: formData.ticketId }); ``` ## Advanced Patterns ### Conditional Metadata ```typescript const handleUpload = (files: File[]) => { const metadata: any = { uploadSource: 'web-app', timestamp: Date.now() }; // Conditionally add metadata if (selectedAlbum) { metadata.albumId = selectedAlbum.id; } if (tags.length > 0) { metadata.tags = tags; } if (isAdminUser) { metadata.priority = 'high'; metadata.skipModeration = true; } uploadFiles(files, metadata); }; ``` ### Metadata from Form State ```typescript import { useForm } from 'react-hook-form'; export function FormWithUploads() { const { register, handleSubmit } = useForm(); const { uploadFiles } = upload.attachments(); const onSubmit = async (formData: any) => { // Upload files with form context await uploadFiles(selectedFiles, { formId: formData.id, category: formData.category, priority: formData.priority, department: formData.department, requestedBy: formData.requesterEmail }); // Then submit form await submitForm(formData); }; return ( <form onSubmit={handleSubmit(onSubmit)}> {/* ... form fields ... */} </form> )
; } ``` ### Dynamic Path Generation ```typescript // Client uploadFiles(files, { organizationId: org.id, departmentId: dept.id, projectCode: project.code, fileClass: 'confidential' }); // Server - organize by metadata .paths({ generateKey: (ctx) => { const { metadata, file } = ctx; const date = new Date(); const year = date.getFullYear(); const quarter = `Q${Math.ceil((date.getMonth() + 1) / 3)}`; return [ 'organizations', metadata.organizationId, 'departments', metadata.departmentId, year.toString(), quarter, metadata.fileClass, metadata.projectCode, file.name ].join('/'); } }) ``` ## Metadata Size Considerations **Size Limits:** While there's no hard limit on metadata size, keep it reasonable (\< 10KB). Large metadata objects increase request size and processing time. ### βœ… Good Metadata ```typescript // Compact and purposeful { albumId: 'abc123', tags: ['vacation', 'beach'], visibility: 'private', featured: false } ``` ### ⚠️ Avoid Large Metadata ```typescript // Too large - send via separate API call { albumId: 'abc123', fullImageData: base64EncodedImage, // ❌ Don't embed files entireUserProfile: { ... }, // ❌ Too much data allPreviousUploads: [ ... ], // ❌ Unnecessary complexNestedStructure: { ... } // ⚠️ Keep it simple } ``` ## Error Handling Handle metadata-related errors gracefully: ```typescript const { uploadFiles } = upload.imageUpload({ onError: (error) => { if (error.message.includes('album')) { toast.error('Selected album is invalid or you don\'t have access'); resetAlbumSelection(); } else if (error.message.includes('metadata')) { toast.error('Invalid upload settings. 
Please try again.'); } else { toast.error('Upload failed: ' + error.message); } } });
```

## Testing with Metadata

```typescript
import { render, fireEvent } from '@testing-library/react';
import { vi } from 'vitest';
import { GalleryUpload } from './gallery-upload'; // path assumed

test('uploads files with correct metadata', async () => {
  const mockUploadFiles = vi.fn();

  // Render the GalleryUpload component from the earlier example
  // (assumes upload.galleryImages is mocked to return mockUploadFiles)
  const { getByLabelText, getByTestId } = render(<GalleryUpload />);

  // Select album
  const albumSelect = getByLabelText('Album');
  fireEvent.change(albumSelect, { target: { value: 'vacation-2025' } });

  // Add tags
  const tagsInput = getByLabelText('Tags');
  fireEvent.change(tagsInput, { target: { value: 'beach,sunset' } });

  // Upload files (assumes data-testid="file-input" on the file input)
  const fileInput = getByTestId('file-input');
  const files = [new File(['content'], 'photo.jpg', { type: 'image/jpeg' })];
  fireEvent.change(fileInput, { target: { files } });

  // Verify metadata was passed correctly
  expect(mockUploadFiles).toHaveBeenCalledWith(
    expect.arrayContaining(files),
    expect.objectContaining({
      albumId: 'vacation-2025',
      tags: ['beach', 'sunset']
    })
  );
});
```

## Best Practices

### ✅ DO

* Pass UI state and user selections
* Validate metadata in middleware
* Use metadata for dynamic path generation
* Keep metadata size reasonable (\< 10KB)
* Define TypeScript interfaces for metadata
* Sanitize user input (tags, descriptions)
* Verify access to referenced entities (albums, projects)

### ❌ DON'T

* Trust client-provided identity (userId, role, permissions)
* Send sensitive data (passwords, tokens, secrets)
* Embed large objects (base64 files, entire datasets)
* Skip validation in middleware
* Use metadata for authentication
* Trust metadata without verification

## Migration Guide

If you're upgrading from a version without metadata support:

```typescript
// Before: No metadata support
const { uploadFiles } = upload.imageUpload();
uploadFiles(files);

// After: Add metadata (backward compatible)
const { uploadFiles } = upload.imageUpload();

// Still works without metadata
uploadFiles(files);

// Or add metadata
uploadFiles(files, { albumId: album.id });
```

The
feature is **100% backward compatible** - existing code continues to work without changes.

***

# Image Uploads (/docs/guides/image-uploads)

import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { TypeTable } from "fumadocs-ui/components/type-table";
import { Files, Folder, File } from "fumadocs-ui/components/files";

## Image Upload Guide

Handle image uploads with built-in optimization, validation, and processing features for the best user experience.

Images are the most common upload type. This guide covers everything from basic setup to advanced optimization techniques for production apps.

## Basic Image Upload Setup

### Server Configuration

```typescript
// app/api/upload/route.ts
import { s3 } from "@/lib/upload";

const s3Router = s3.createRouter({
  // Basic image upload
  profilePicture: s3.image()
    .maxFileSize('5MB')
    .maxFiles(1)
    .formats(['jpeg', 'png', 'webp']),

  // Multiple images with optimization
  galleryImages: s3.image()
    .maxFileSize('10MB')
    .maxFiles(10)
    .formats(['jpeg', 'png', 'webp', 'gif']),
});

export type AppS3Router = typeof s3Router;
export const { GET, POST } = s3Router.handlers;
```

### Client Implementation

```typescript
// components/image-uploader.tsx
import { upload } from "@/lib/upload-client";

export function ImageUploader() {
  const { uploadFiles, files, isUploading } = upload.galleryImages;

  const handleImageSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const selectedFiles = Array.from(e.target.files || []);
    uploadFiles(selectedFiles);
  };

  return (
{files.map((file) => (
{file.status === "success" && ( {file.name} )} {file.status === "uploading" && (
{file.progress}%
)}
))}
); }
```

## Client-Side Metadata for Images

Pass contextual information from your UI to organize and categorize images:

**New Feature:** You can now pass metadata directly from the client when uploading images. This allows you to send album IDs, tags, categories, or any contextual data from your UI to the server.

```typescript
// Client component
import { upload } from '@/lib/upload-client';
import { useState } from 'react';

export function GalleryUpload() {
  const [selectedAlbum, setSelectedAlbum] = useState('vacation-2025');
  const [tags, setTags] = useState<string[]>([]);
  const [isFeatured, setIsFeatured] = useState(false);

  const { uploadFiles, files } = upload.galleryImages({
    onSuccess: (results) => {
      console.log(`Uploaded to album: ${selectedAlbum}`);
    }
  });

  const handleUpload = (selectedFiles: File[]) => {
    // Pass UI state as metadata
    uploadFiles(selectedFiles, {
      albumId: selectedAlbum,
      tags: tags,
      featured: isFeatured,
      uploadSource: 'gallery-manager',
      category: 'user-content'
    });
  };

  return (
setIsFeatured(e.target.checked)} /> e.target.files && handleUpload(Array.from(e.target.files))} />
); } ``` **Server-side validation:** ```typescript // Server router .middleware(async ({ req, metadata }) => { const user = await authenticateUser(req); // Validate client-provided album access if (metadata?.albumId) { const hasAccess = await db.albums.canUserAccess(metadata.albumId, user.id); if (!hasAccess) throw new Error('Access denied to album'); } return { // Client metadata (validated) albumId: metadata?.albumId, tags: metadata?.tags || [], featured: metadata?.featured || false, // Server metadata (trusted) userId: user.id, uploadedAt: new Date().toISOString() }; }) .paths({ generateKey: (ctx) => { const { metadata, file } = ctx; // Use metadata in path generation return `albums/${metadata.albumId}/${metadata.featured ? 'featured/' : ''}${file.name}`; } }) ``` **Security:** Always validate client metadata in middleware. Never trust client-provided user IDs or permissions. ## Image Validation & Processing ### Format Validation ```typescript const s3Router = s3.createRouter({ productImages: s3.image() .maxFileSize('8MB') .maxFiles(5) .formats(['jpeg', 'png', 'webp']) .dimensions({ minWidth: 800, maxWidth: 4000, minHeight: 600, maxHeight: 3000, }) .aspectRatio(16 / 9, { tolerance: 0.1 }) .middleware(async ({ req, file, metadata }) => { // Custom validation const imageMetadata = await getImageMetadata(file); if ( imageMetadata.hasTransparency && !["png", "webp"].includes(imageMetadata.format) ) { throw new Error("Transparent images must be PNG or WebP format"); } if (imageMetadata.colorProfile !== "sRGB") { console.warn( `Image ${file.name} uses ${imageMetadata.colorProfile} color profile` ); } return { ...metadata, userId: await getUserId(req), ...imageMetadata }; }), }); ``` ## Image Processing Integration **⚠️ Important:** Pushduck handles **file uploads only**. Image processing (resizing, optimization, format conversion) requires external tools. The examples below show **integration patterns** using `.onUploadComplete()` hooks. 
**Popular image processing tools:** * **Sharp** - Fast Node.js processing (recommended) * **Cloudinary** - Full-service image CDN with transformations * **Imgix** - Real-time URL-based transformations See [Philosophy](/docs/philosophy#what-pushduck-doesnt-do) for why we don't include processing. ### Integration Example: Sharp for Resizing **⚠️ Bandwidth Tradeoff:** Server-side processing requires **downloading** the file from S3 to your server (inbound bandwidth), processing it, then **uploading** variants back (outbound bandwidth). **This negates Pushduck's "server never touches files" benefit!** **Better Alternatives:** * **Client-side preprocessing** - Resize before upload (see below) * **URL-based processing** - Cloudinary/Imgix transform via URL (no download) * **Async queue** - Process in background worker, not blocking upload ```typescript // First: npm install sharp import sharp from 'sharp'; const s3Router = s3.createRouter({ optimizedImages: s3.image() .maxFileSize('15MB') .maxFiles(10) .formats(['jpeg', 'png', 'webp']) .dimensions({ maxWidth: 1920, maxHeight: 1080 }) .onUploadComplete(async ({ file, url, metadata }) => { // ⚠️ This downloads the file from S3 (inbound bandwidth) const imageBuffer = await fetch(url).then(r => r.arrayBuffer()); // Process with Sharp (external tool) const variants = await Promise.all([ sharp(imageBuffer).resize(150, 150, { fit: 'cover' }).toBuffer(), sharp(imageBuffer).resize(800, 600, { fit: 'inside' }).toBuffer(), sharp(imageBuffer).resize(1920, 1080, { fit: 'inside' }).toBuffer(), ]); // ⚠️ Upload variants back to S3 (outbound bandwidth) // await uploadVariantsToS3(variants); }), }); ``` *** ## Better Alternative: Client-Side Preprocessing **βœ… Recommended Approach:** Process images **before** upload on the client side. This maintains Pushduck's "server never touches files" architecture and saves bandwidth. 
### Client-Side Resize Before Upload

```typescript
// Use browser-image-compression for client-side resizing
// npm install browser-image-compression
import imageCompression from 'browser-image-compression';
import { upload } from '@/lib/upload-client';

export function ClientSideImageUpload() {
  const { uploadFiles, isUploading } = upload.profilePicture();

  const handleImageSelect = async (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0];
    if (!file) return;

    // ✅ Resize on client BEFORE upload (no server bandwidth)
    const options = {
      maxSizeMB: 1,
      maxWidthOrHeight: 1920,
      useWebWorker: true,
    };

    try {
      const compressedFile = await imageCompression(file, options);

      // Upload the already-processed file
      await uploadFiles([compressedFile]);

      console.log('Original:', file.size / 1024, 'KB');
      console.log('Compressed:', compressedFile.size / 1024, 'KB');
    } catch (error) {
      console.error('Compression error:', error);
    }
  };

  return (
    <input type="file" accept="image/*" onChange={handleImageSelect} disabled={isUploading} />
  );
}
```

**Benefits:**

* ✅ **No server bandwidth** - file never touches your server
* ✅ **Faster uploads** - smaller files upload quicker
* ✅ **Lower S3 costs** - store smaller files
* ✅ **Edge compatible** - no Node.js processing
* ✅ **Better UX** - instant preview of processed image

***

## Alternative: URL-Based Processing (Cloudinary/Imgix)

**✅ Best of Both Worlds:** Upload original to S3, transform via URL without downloading. Zero server bandwidth!

```typescript
const s3Router = s3.createRouter({
  images: s3.image()
    .maxFileSize('10MB')
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Save original URL - NO download needed
      await db.images.create({
        userId: metadata.userId,
        originalUrl: url,
        s3Key: file.key,
      });

      // ✅ Cloudinary can fetch from your S3 URL and transform
      // No bandwidth on your server!
}), });

// Client: Use Cloudinary URLs for transformations
function ImageDisplay({ s3Url }: { s3Url: string }) {
  // Cloudinary fetches from S3 and transforms (their bandwidth, not yours)
  const cloudinaryUrl = `https://res.cloudinary.com/your-cloud/image/fetch/w_400,h_400,c_fill/${encodeURIComponent(s3Url)}`;

  return <img src={cloudinaryUrl} alt="Transformed" />;
}
```

***

## Advanced Patterns (Optional)

### Responsive Image Generation

```typescript
interface ImageVariant {
  name: string;
  width: number;
  height?: number;
  quality?: number;
  format?: "jpeg" | "png" | "webp";
}

const imageVariants: ImageVariant[] = [
  { name: "thumbnail", width: 150, height: 150, quality: 80 },
  { name: "small", width: 400, quality: 85 },
  { name: "medium", width: 800, quality: 85 },
  { name: "large", width: 1200, quality: 85 },
  { name: "xlarge", width: 1920, quality: 90 },
];

const s3Router = s3.createRouter({
  responsiveImages: s3.image()
    .maxFileSize('20MB')
    .maxFiles(5)
    .formats(['jpeg', 'png', 'webp'])
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Generate responsive variants
      const variants = await Promise.all(
        imageVariants.map((variant) => generateImageVariant(file, variant))
      );

      // Save variant information to database
      await saveImageVariants(file.key, variants, metadata.userId);
    }),
});

// Client-side responsive image component
export function ResponsiveImage({
  src,
  alt,
  sizes = "(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw",
}: {
  src: string;
  alt: string;
  sizes?: string;
}) {
  const variants = useImageVariants(src);

  if (!variants) return <img src={src} alt={alt} />;

  const srcSet = [
    `${variants.small} 400w`,
    `${variants.medium} 800w`,
    `${variants.large} 1200w`,
    `${variants.xlarge} 1920w`,
  ].join(", ");

  return <img src={src} srcSet={srcSet} sizes={sizes} alt={alt} />;
}
```

### Image Upload with Crop & Preview

```typescript
import { useState } from 'react'
import { ImageCropper } from './image-cropper'
import { upload } from '@/lib/upload-client'

export function ImageUploadWithCrop() {
  const [selectedFile, setSelectedFile] = useState<File | null>(null)
  const [croppedImage, setCroppedImage] = useState<Blob | null>(null)
  const { uploadFiles, isUploading } = upload.profilePicture

  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0]
    if (file) setSelectedFile(file)
  }

  const handleCropComplete = (croppedBlob: Blob) => {
    setCroppedImage(croppedBlob)
  }

  const handleUpload = async () => {
    if (!croppedImage) return

    const file = new File([croppedImage], 'cropped-image.jpg', {
      type: 'image/jpeg'
    })

    await uploadFiles([file])

    // Reset state
    setSelectedFile(null)
    setCroppedImage(null)
  }

  return (
{!selectedFile && ( )} {selectedFile && !croppedImage && ( )} {croppedImage && (
Cropped preview
)}
) } ```
```typescript
import { useState, useRef, useCallback } from 'react'
import ReactCrop, { Crop, PixelCrop } from 'react-image-crop'
import 'react-image-crop/dist/ReactCrop.css'

interface ImageCropperProps {
  image: File
  aspectRatio?: number
  onCropComplete: (croppedBlob: Blob) => void
}

export function ImageCropper({ image, aspectRatio = 1, onCropComplete }: ImageCropperProps) {
  const imgRef = useRef<HTMLImageElement>(null)
  const [crop, setCrop] = useState<Crop>({
    unit: '%',
    x: 25,
    y: 25,
    width: 50,
    height: 50
  })

  const imageUrl = URL.createObjectURL(image)

  const getCroppedImage = useCallback(async (
    image: HTMLImageElement,
    crop: PixelCrop
  ): Promise<Blob> => {
    const canvas = document.createElement('canvas')
    const ctx = canvas.getContext('2d')!

    const scaleX = image.naturalWidth / image.width
    const scaleY = image.naturalHeight / image.height

    canvas.width = crop.width * scaleX
    canvas.height = crop.height * scaleY

    ctx.imageSmoothingQuality = 'high'

    ctx.drawImage(
      image,
      crop.x * scaleX,
      crop.y * scaleY,
      crop.width * scaleX,
      crop.height * scaleY,
      0,
      0,
      canvas.width,
      canvas.height
    )

    return new Promise<Blob>(resolve => {
      canvas.toBlob(blob => resolve(blob!), 'image/jpeg', 0.9)
    })
  }, [])

  const handleCropComplete = useCallback(async (crop: PixelCrop) => {
    if (imgRef.current && crop.width && crop.height) {
      const croppedBlob = await getCroppedImage(imgRef.current, crop)
      onCropComplete(croppedBlob)
    }
  }, [getCroppedImage, onCropComplete])

  return (
Crop preview
) } ```
```typescript // Server-side image processing after upload const s3Router = s3.createRouter({ profilePicture: s3.image() .maxFileSize('10MB') .maxFiles(1) .formats(['jpeg', 'png', 'webp']) .onUploadComplete(async ({ file, url, metadata }) => { // Generate avatar sizes await Promise.all([ generateImageVariant(file, { name: 'avatar-small', width: 32, height: 32, fit: 'cover', quality: 90 }), generateImageVariant(file, { name: 'avatar-medium', width: 64, height: 64, fit: 'cover', quality: 90 }), generateImageVariant(file, { name: 'avatar-large', width: 128, height: 128, fit: 'cover', quality: 95 }) ]) // Update user profile with new avatar await updateUserAvatar(metadata.userId, { original: url, small: getVariantUrl(file.key, 'avatar-small'), medium: getVariantUrl(file.key, 'avatar-medium'), large: getVariantUrl(file.key, 'avatar-large') }) }) }) ```
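The variant examples above pass resize fits like `cover` and `inside` without explaining them. For intuition, `inside` scales the image uniformly so both dimensions fit within the bounding box. The helper below is a pure illustration of that math, not part of pushduck; note that Sharp's actual `fit: 'inside'` will also enlarge smaller images unless `withoutEnlargement: true` is set.

```typescript
// Compute output dimensions for a bounding-box ("inside") fit:
// scale uniformly so both dimensions fit within maxWidth x maxHeight.
export function fitInside(
  width: number,
  height: number,
  maxWidth: number,
  maxHeight: number
): { width: number; height: number } {
  const scale = Math.min(maxWidth / width, maxHeight / height);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// A 4000x3000 photo constrained to 1920x1080 is limited by height:
// scale = min(1920/4000, 1080/3000) = 0.36, giving 1440x1080.
```

The `cover` fit is the opposite choice: it uses `Math.max` of the two ratios and then crops the overflow, which is why the avatar variants above always come out exactly square.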
## Image Upload Patterns ### Drag & Drop Image Gallery ```typescript import { useDropzone } from "react-dropzone"; import { upload } from "@/lib/upload-client"; export function ImageGalleryUploader() { const { uploadFiles, files, isUploading } = upload.galleryImages; const { getRootProps, getInputProps, isDragActive } = useDropzone({ accept: { "image/*": [".jpeg", ".jpg", ".png", ".webp", ".gif"], }, maxFiles: 10, onDrop: (acceptedFiles) => { uploadFiles(acceptedFiles); }, }); const removeFile = (fileId: string) => { // Implementation to remove file from gallery }; return (
{isDragActive ? (

Drop the images here...

) : (

Drag & drop images here, or click to select

Up to 10 images, max 10MB each

)}
{files.length > 0 && (
{files.map((file) => (
{file.status === "success" && (
{file.name}
)} {file.status === "uploading" && (
{file.progress}%

{file.name}

)} {file.status === "error" && (
⚠️

Upload failed

)}
))}
)}
); }
```

### Image Upload with Metadata

```typescript
const s3Router = s3.createRouter({
  portfolioImages: s3.image()
    .maxFileSize('15MB')
    .maxFiles(20)
    .formats(['jpeg', 'png', 'webp'])
    .middleware(async ({ req, file, metadata }) => {
      const { userId } = await authenticateUser(req);

      // Extract and validate metadata
      const imageMetadata = await extractImageMetadata(file);

      // Return enriched metadata
      return {
        ...metadata,
        userId,
        uploadedBy: userId,
        uploadedAt: new Date(),
        originalFilename: file.name,
        fileHash: await calculateFileHash(file),
        ...imageMetadata,
      };
    })
    .onUploadComplete(async ({ file, url, metadata }) => {
      // Save detailed image information
      await saveImageToDatabase({
        userId: metadata.userId,
        s3Key: file.key,
        url: url,
        filename: metadata.originalFilename,
        size: file.size,
        dimensions: {
          width: metadata.width,
          height: metadata.height,
        },
        format: metadata.format,
        colorProfile: metadata.colorProfile,
        hasTransparency: metadata.hasTransparency,
        exifData: metadata.exif,
        hash: metadata.fileHash,
      });
    }),
});
```

## Performance Best Practices

```typescript
import { compress } from 'image-conversion'

export function optimizeImage(file: File): Promise<Blob> {
  return compress(file, {
    quality: 0.8,
    type: 'image/webp',
    width: 1920,
    height: 1080,
    orientation: true // Auto-rotate based on EXIF
  })
}

// Usage in upload component
const handleFileSelect = async (files: File[]) => {
  const optimizedFiles = await Promise.all(
    files.map(async (file) => {
      const blob = await optimizeImage(file)
      // Wrap the compressed Blob back into a File for uploadFiles
      return new File([blob], file.name, { type: blob.type })
    })
  )

  uploadFiles(optimizedFiles)
}
```

```typescript
import { useState } from 'react'

export function ProgressiveImage({
  src,
  blurDataURL,
  alt
}: {
  src: string
  blurDataURL: string
  alt: string
}) {
  const [isLoaded, setIsLoaded] = useState(false)

  return (
{alt} {alt} setIsLoaded(true)} />
) } ```
```typescript import { useIntersectionObserver } from '@/hooks/use-intersection-observer' export function LazyImage({ src, alt, ...props }) { const [ref, isIntersecting] = useIntersectionObserver({ threshold: 0.1, rootMargin: '50px' }) return (
{isIntersecting ? ( {alt} ) : (
Loading...
)}
) } ```
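The `ResponsiveImage` component earlier in this guide hard-codes its `srcSet` string. If your variant list is data-driven, a small pure helper keeps the markup and the variant config in sync. The variant shape below mirrors the `ImageVariant` interface above; the URL map is an assumption about your own storage layout, not a pushduck API.

```typescript
// Build a srcSet string from named variants and their intrinsic widths.
// `urls` maps variant names to public URLs (app-specific, not pushduck).
export function buildSrcSet(
  urls: Record<string, string>,
  variants: Array<{ name: string; width: number }>
): string {
  return variants
    .filter((v) => urls[v.name] !== undefined) // skip variants not yet generated
    .map((v) => `${urls[v.name]} ${v.width}w`)
    .join(', ');
}
```

For example, `buildSrcSet({ small: '/img/s.webp', medium: '/img/m.webp' }, [{ name: 'small', width: 400 }, { name: 'medium', width: 800 }])` yields a string suitable for an `<img srcSet={...}>` attribute, and silently drops variants whose processing hasn't finished.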
***

**Image Excellence**: With proper optimization, validation, and processing, your image uploads will provide an excellent user experience while maintaining performance and quality.

# Guides & Tutorials (/docs/guides)

import { Card, Cards } from "fumadocs-ui/components/card";
import { Callout } from "fumadocs-ui/components/callout";
import { Steps, Step } from "fumadocs-ui/components/steps";

## Learning Path & Tutorials

Learn how to build robust file upload features with pushduck through comprehensive guides covering everything from basic uploads to advanced production patterns.

**Progressive Learning**: These guides are organized from basic concepts to advanced patterns. Start with client approaches and work your way up to production deployment.

## Getting Started

* Hook-based vs Property-based clients
* When to use each approach
* Migration strategies
* Performance considerations

**Perfect for**: Understanding client patterns

## Upload Patterns

* Image validation and processing
* Automatic resizing and optimization
* Format conversion
* Progressive loading patterns

**Perfect for**: Photo sharing, profile pictures, galleries

## Security & Authentication

* User authentication strategies
* Role-based access control
* JWT integration
* Session management

**Essential for**: Secure applications

* CORS setup for different providers
* Access Control Lists (ACL)
* Public vs private uploads
* Security best practices

**Essential for**: Production deployments

## Migration & Upgrades

* Step-by-step migration process
* Breaking changes and compatibility
* Performance improvements
* Type safety enhancements

**Perfect for**: Upgrading existing projects

## Production Deployment

* Environment configuration
* Security considerations
* Performance optimization
* Monitoring and logging

**Essential for**: Going live safely

## Common Patterns

### Basic Upload Flow

**Configure Server Router**

Set up your upload routes with validation:

```typescript
const
uploadRouter = createS3Router({ routes: { imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB"), }, }); ``` **Implement Client Upload** Use hooks or client for reactive uploads: ```typescript const { upload, uploading, progress } = useUpload({ endpoint: '/api/upload', route: 'imageUpload', }); ``` **Handle Upload Results** Process successful uploads and errors: ```typescript const result = await upload(file); console.log('File uploaded:', result.url); ``` ### Authentication Pattern ```typescript // Server: Add authentication middleware const uploadRouter = createS3Router({ middleware: [ async (req) => { const user = await authenticate(req); if (!user) throw new Error('Unauthorized'); return { user }; } ], routes: { userAvatar: s3.image() .maxFileSize("2MB") .path(({ metadata }) => `avatars/${metadata.user.id}`), }, }); // Client: Include auth headers const { upload } = useUpload({ endpoint: '/api/upload', route: 'userAvatar', headers: { Authorization: `Bearer ${token}`, }, }); ``` ## Architecture Patterns ### Multi-Provider Setup ```typescript // Support multiple storage providers const uploadRouter = createS3Router({ storage: process.env.NODE_ENV === 'production' ? { provider: 'aws-s3', ... } : { provider: 'minio', ... 
}, routes: { // Your routes remain the same }, }); ```

### Route-Based Organization

```typescript
const uploadRouter = createS3Router({
  routes: {
    // Public uploads
    publicImages: s3.image().maxFileSize("5MB").public(),

    // User-specific uploads
    userDocuments: s3.file()
      .maxFileSize("10MB")
      .path(({ metadata }) => `users/${metadata.userId}/documents`),

    // Admin uploads
    adminAssets: s3.file()
      .maxFileSize("50MB")
      .middleware([requireAdmin]),
  },
});
```

## Performance Tips

**Optimization Strategies**:

* Use appropriate file size limits for your use case
* Implement client-side validation before upload
* Consider using presigned URLs for large files
* Enable CDN for frequently accessed files
* Implement progressive upload for large files

## Troubleshooting Quick Links

| Issue | Solution |
| ----------------- | -------------------------------------------------------------------------- |
| **CORS errors** | Check [CORS Configuration](/docs/guides/security/cors-and-acl) |
| **Auth failures** | Review [Authentication Guide](/docs/guides/security/authentication) |
| **Slow uploads** | See [Production Checklist](/docs/guides/production-checklist) |
| **Type errors** | Check [Enhanced Client Migration](/docs/guides/migrate-to-enhanced-client) |

## What's Next?

1. **New to pushduck?** → Start with [Client Approaches](/docs/guides/client-approaches)
2. **Building image features?** → Check [Image Uploads](/docs/guides/image-uploads)
3. **Adding security?** → Review [Authentication](/docs/guides/security/authentication)
4. **Going to production?** → Use [Production Checklist](/docs/guides/production-checklist)
5. **Need help?** → Visit our [troubleshooting guide](/docs/api/troubleshooting)

**Community Guides**: Have a useful pattern or solution? Consider contributing to our documentation to help other developers!
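One of the performance tips above, client-side validation before upload, can be sketched as a tiny framework-agnostic helper. The limits and messages here are illustrative, and this mirrors but can never replace the server-side `.maxFileSize()`/`.formats()` rules.

```typescript
// Pre-flight check to run before calling uploadFiles.
// The file argument uses a structural type so it accepts both
// browser File objects and plain metadata objects in tests.
interface UploadLimits {
  maxBytes: number;
  allowedTypes: string[]; // MIME types, e.g. 'image/jpeg'
}

export function validateBeforeUpload(
  file: { name: string; size: number; type: string },
  limits: UploadLimits
): { ok: true } | { ok: false; reason: string } {
  if (!limits.allowedTypes.includes(file.type)) {
    return { ok: false, reason: `${file.name}: type ${file.type} not allowed` };
  }
  if (file.size > limits.maxBytes) {
    return { ok: false, reason: `${file.name}: exceeds ${limits.maxBytes} bytes` };
  }
  return { ok: true };
}
```

Running this before the upload call gives users instant feedback and avoids wasting a presign round-trip on files the server would reject anyway.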
# Enhanced Client Migration (/docs/guides/migrate-to-enhanced-client)

import { Callout } from "fumadocs-ui/components/callout";
import { Card, Cards } from "fumadocs-ui/components/card";
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
import { Steps, Step } from "fumadocs-ui/components/steps";

## Migrating to Enhanced Client

Upgrade to the new property-based client API for enhanced type safety, better developer experience, and elimination of string literals.

The enhanced client API is **100% backward compatible**. You can migrate gradually without breaking existing code.

## Why Migrate?

```typescript
// ❌ Old: String literals, no type safety
const { uploadFiles } = useUploadRoute("imageUpload")

// ✅ New: Property-based, full type inference
const { uploadFiles } = upload.imageUpload
```

```typescript
// ✅ Autocomplete shows all your endpoints
upload. // imageUpload, documentUpload, videoUpload...
//      ^ No more guessing endpoint names
```

```typescript
// When you rename routes in your router,
// TypeScript shows errors everywhere they're used
// Making refactoring safe and easy
```

## Migration Steps

**Install Latest Version**

Ensure you're using the latest version of pushduck:

```bash
npm install pushduck@latest
```

```bash
yarn add pushduck@latest
```

```bash
pnpm add pushduck@latest
```

```bash
bun add pushduck@latest
```

**Create Upload Client**

Set up your typed upload client:

```typescript title="lib/upload-client.ts"
import { createUploadClient } from 'pushduck/client'
import type { AppRouter } from './upload' // Your router type

export const upload = createUploadClient<AppRouter>({
  endpoint: '/api/upload'
})
```

**Migrate Components Gradually**

Update your components one by one:

```typescript
import { useUploadRoute } from 'pushduck/client'

export function ImageUploader() {
  const { uploadFiles, files, isUploading } = useUploadRoute('imageUpload')

  return (
uploadFiles(e.target.files)} /> {/* Upload UI */}
) } ```
```typescript import { upload } from '@/lib/upload-client' export function ImageUploader() { const { uploadFiles, files, isUploading } = upload.imageUpload return (
uploadFiles(e.target.files)} /> {/* Same upload UI */}
) } ```
**Update Imports** Once migrated, you can remove old hook imports: ```typescript // Remove old imports // import { useUploadRoute } from 'pushduck/client' // Use new client import import { upload } from '@/lib/upload-client' ```
## Migration Examples ### Basic Component Migration ```typescript import { useUploadRoute } from 'pushduck/client' export function DocumentUploader() { const { uploadFiles, files, isUploading, error, reset } = useUploadRoute('documentUpload', { onSuccess: (results) => { console.log('Uploaded:', results) }, onError: (error) => { console.error('Error:', error) } }) return (
uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> {files.map(file => (
{file.name}
))} {error &&
Error: {error.message}
}
) } ```
```typescript import { upload } from '@/lib/upload-client' export function DocumentUploader() { const { uploadFiles, files, isUploading, error, reset } = upload.documentUpload // Handle callbacks with upload options const handleUpload = async (selectedFiles: File[]) => { try { const results = await uploadFiles(selectedFiles) console.log('Uploaded:', results) } catch (error) { console.error('Error:', error) } } return (
handleUpload(Array.from(e.target.files || []))} disabled={isUploading} /> {files.map(file => (
{file.name}
))} {error &&
Error: {error.message}
}
) } ```
### Form Integration Migration ```typescript import { useForm } from 'react-hook-form' import { useUploadRoute } from 'pushduck/client' export function ProductForm() { const { register, handleSubmit, setValue } = useForm() const { uploadFiles, uploadedFiles } = useUploadRoute('productImages', { onSuccess: (results) => { setValue('images', results.map(r => r.url)) } }) return (
uploadFiles(Array.from(e.target.files || []))} />
) } ```
```typescript import { useForm } from 'react-hook-form' import { upload } from '@/lib/upload-client' export function ProductForm() { const { register, handleSubmit, setValue } = useForm() const { uploadFiles } = upload.productImages const handleImageUpload = async (files: File[]) => { const results = await uploadFiles(files) setValue('images', results.map(r => r.url)) } return (
handleImageUpload(Array.from(e.target.files || []))} />
) } ```
### Multiple Upload Types Migration ```typescript export function MediaUploader() { const images = useUploadRoute('imageUpload') const videos = useUploadRoute('videoUpload') const documents = useUploadRoute('documentUpload') return (

Images

images.uploadFiles(e.target.files)} />

Videos

videos.uploadFiles(e.target.files)} />

Documents

documents.uploadFiles(e.target.files)} />
) } ```
```typescript import { upload } from '@/lib/upload-client' export function MediaUploader() { const images = upload.imageUpload const videos = upload.videoUpload const documents = upload.documentUpload return (

Images

images.uploadFiles(e.target.files)} />

Videos

videos.uploadFiles(e.target.files)} />

Documents

documents.uploadFiles(e.target.files)} />
) } ```
## Key Differences

### API Comparison

| Feature | Hook-Based API | Property-Based API |
| ------------------ | ------------------------- | --------------------------- |
| **Type Safety** | Runtime string validation | Compile-time type checking |
| **IntelliSense** | Limited autocomplete | Full endpoint autocomplete |
| **Refactoring** | Manual find/replace | Automatic TypeScript errors |
| **Bundle Size** | Slightly larger | Optimized tree-shaking |
| **Learning Curve** | Familiar React pattern | New property-based pattern |

### Callback Handling

```typescript
const { uploadFiles } = useUploadRoute('images', {
  onSuccess: (results) => console.log('Success:', results),
  onError: (error) => console.error('Error:', error),
  onProgress: (progress) => console.log('Progress:', progress)
})
```

```typescript
const { uploadFiles } = upload.images

await uploadFiles(files, {
  onSuccess: (results) => console.log('Success:', results),
  onError: (error) => console.error('Error:', error),
  onProgress: (progress) => console.log('Progress:', progress)
})
```

## Troubleshooting

### Common Migration Issues

**Type Errors:** If you see TypeScript errors after migration, ensure your router type is properly exported and imported.

```typescript
// ❌ Missing router type
export const upload = createUploadClient({
  endpoint: "/api/upload",
});

// ✅ With proper typing
export const upload = createUploadClient<AppRouter>({
  endpoint: "/api/upload",
});
```

### Gradual Migration Strategy

You can use both APIs simultaneously during migration:

```typescript
// Keep existing hook-based components working
const hookUpload = useUploadRoute("imageUpload");

// Use new property-based API for new components
const propertyUpload = upload.imageUpload;

// Both work with the same backend!
``` ## Benefits After Migration * **🎯 Enhanced Type Safety**: Catch errors at compile time, not runtime * **πŸš€ Better Performance**: Optimized bundle size with tree-shaking * **πŸ’‘ Improved DX**: Full IntelliSense support for all endpoints * **πŸ”§ Safe Refactoring**: Rename endpoints without breaking your app * **πŸ“¦ Future-Proof**: Built for the next generation of pushduck features *** **Migration Complete!** You now have enhanced type safety and a better developer experience. Need help? Join our [Discord community](https://pushduck.dev/discord) for support. # Production Checklist (/docs/guides/production-checklist) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Going Live Checklist Get your file uploads production-ready. Start with the 8 essentials belowβ€”most apps don't need more than this. **Quick Path to Production:** Complete the essential checklist below (8 items) and you're ready to deploy. Advanced optimizations can be added later as you scale. ## βœ… Essential Checklist (Required) These 8 items are critical for safe production deployment: **1. Authentication** * [ ] Auth middleware on all upload routes * [ ] Unauthenticated requests are blocked ```typescript const router = s3.createRouter({ userFiles: s3.image() .middleware(async ({ req }) => { const session = await getServerSession(req); if (!session) throw new Error("Auth required"); return { userId: session.user.id }; }) }); ``` **2. Environment Variables** * [ ] S3 credentials in `.env` (not in code) * [ ] Secrets are strong and unique ```bash AWS_ACCESS_KEY_ID=xxx AWS_SECRET_ACCESS_KEY=xxx AWS_REGION=us-east-1 S3_BUCKET_NAME=your-bucket ``` **3. File Validation** * [ ] File type restrictions (`.formats()`) * [ ] File size limits (`.maxFileSize()`) ```typescript userPhotos: s3.image() .maxFileSize("10MB") .maxFiles(5) .formats(["jpeg", "png", "webp"]) ``` **4. 
CORS Configuration** * [ ] CORS set up on S3 bucket * [ ] Only your domain is allowed See [CORS Setup Guide](/docs/guides/security/cors-and-acl) **5. Error Monitoring** * [ ] Error tracking enabled (Sentry/LogRocket) * [ ] Upload failures are logged ```typescript .onUploadError(async ({ error }) => { console.error('Upload failed:', error); // Sentry.captureException(error); }) ``` **6. Basic Rate Limiting** (Optional but recommended) * [ ] Prevent abuse with upload limits Use Upstash or Vercel KV for simple rate limiting. **7. Test Uploads** * [ ] Upload works in production environment * [ ] Files appear in S3 bucket correctly * [ ] URLs are accessible **8. Backup Strategy** * [ ] S3 versioning enabled (optional) * [ ] Know how to restore deleted files **βœ… Done!** If you've completed these 8 items, your upload system is production-ready. *** ## πŸš€ When You Need More **Most apps are production-ready with the 8 essentials above.** As you scale, consider: * **CDN integration** - For global audience or high traffic * **Advanced auth** - RBAC/ABAC for enterprise permissions (see [Authentication Guide](/docs/guides/security/authentication)) * **Redis caching** - For 10k+ requests/minute * **Multi-region** - For mission-critical redundancy *** ## Next Steps Deep dive into authentication patterns Configure CORS for your provider Common issues and solutions # Astro (/docs/integrations/astro) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; **🚧 Client-Side In Development**: Astro server-side integration is fully functional with Web Standards APIs. However, Astro-specific client-side components and hooks are still in development. You can use the standard pushduck client APIs for now. 
## Using pushduck with Astro Astro is a modern web framework for building fast, content-focused websites with islands architecture. It uses Web Standards APIs and provides excellent performance with minimal JavaScript. Since Astro uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Astro API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` **Configure upload router** ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.AWS_ENDPOINT_URL!, bucket: import.meta.env.S3_BUCKET_NAME!, accountId: import.meta.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; // Direct usage - no adapter needed! 
export const ALL: APIRoute = async ({ request }) => { return uploadRouter.handlers(request); }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; // Method 1: Combined handler (recommended) export const ALL: APIRoute = async ({ request }) => { return uploadRouter.handlers(request); }; // Method 2: Separate handlers (if you need method-specific logic) export const GET: APIRoute = async ({ request }) => { return uploadRouter.handlers.GET(request); }; export const POST: APIRoute = async ({ request }) => { return uploadRouter.handlers.POST(request); }; ``` ### With CORS Support ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; export const ALL: APIRoute = async ({ request }) => { // Handle CORS preflight if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type', }, }); } const response = await uploadRouter.handlers(request); // Add CORS headers to actual response response.headers.set('Access-Control-Allow-Origin', '*'); return response; }; ``` ## Advanced Configuration ### Authentication with Astro ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.AWS_ENDPOINT_URL!, bucket: import.meta.env.S3_BUCKET_NAME!, accountId: import.meta.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const 
uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const cookies = req.headers.get('Cookie'); const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | null) { if (!cookieString) return {}; return Object.fromEntries( cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement your session validation logic // This could connect to a database, Redis, etc. return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Component (React) ```tsx title="src/components/FileUpload.tsx" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "../lib/upload"; const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); export default function FileUpload() { function handleUploadComplete(files: any[]) { console.log("Files uploaded:", files); alert("Upload completed!"); } function handleUploadError(error: Error) { console.error("Upload error:", error); alert(`Upload failed: ${error.message}`); } return (

    <div>
      <section>
        <h2>Image Upload</h2>
        {/* Prop names are illustrative; check the pushduck client docs */}
        <UploadDropzone
          endpoint="imageUpload"
          onUploadComplete={handleUploadComplete}
          onUploadError={handleUploadError}
        />
      </section>

      <section>
        <h2>Document Upload</h2>
        <UploadDropzone
          endpoint="documentUpload"
          onUploadComplete={handleUploadComplete}
          onUploadError={handleUploadError}
        />
      </section>
    </div>
  );
}
```

### Upload Component (Vue)

```vue title="src/components/FileUpload.vue"
```

### Using in Astro Pages

```astro title="src/pages/index.astro"
---
// Server-side code (runs at build time)
import FileUpload from '../components/FileUpload';
---

<html lang="en">
  <head>
    <title>File Upload Demo</title>
  </head>
  <body>
    <h1>File Upload Demo</h1>
    <!-- client:load hydrates the component in the browser -->
    <FileUpload client:load />
  </body>
</html>
```

## File Management

### Server-Side File API

```typescript title="src/pages/api/files.ts"
import type { APIRoute } from 'astro';

export const GET: APIRoute = async ({ url }) => {
  const userId = url.searchParams.get('userId');

  if (!userId) {
    return new Response(JSON.stringify({ error: 'User ID required' }), {
      status: 400,
      headers: { 'Content-Type': 'application/json' }
    });
  }

  // Fetch files from database
  const files = await getFilesForUser(userId);

  return new Response(JSON.stringify({
    files: files.map(file => ({
      id: file.id,
      name: file.name,
      url: file.url,
      size: file.size,
      uploadedAt: file.createdAt,
    })),
  }), {
    headers: { 'Content-Type': 'application/json' }
  });
};

async function getFilesForUser(userId: string) {
  // Implement your database query logic
  return [];
}
```

### File Management Page

```astro title="src/pages/files.astro"
---
// This runs on the server at build time or request time
const files = await fetch(`${Astro.url.origin}/api/files?userId=current-user`)
  .then(res => res.json())
  .catch(() => ({ files: [] }));

// Small helper used by the template below
function formatFileSize(bytes: number): string {
  if (!bytes) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB'];
  const i = Math.min(Math.floor(Math.log(bytes) / Math.log(1024)), units.length - 1);
  return `${(bytes / 1024 ** i).toFixed(1)} ${units[i]}`;
}
---

<html lang="en">
  <head>
    <title>My Files</title>
  </head>
  <body>
    <h1>My Files</h1>

    <h2>Uploaded Files</h2>
    {files.files.length === 0 ? (
      <p>No files uploaded yet.</p>
    ) : (
      <ul>
        {files.files.map((file: any) => (
          <li>
            <strong>{file.name}</strong>
            <span>{formatFileSize(file.size)}</span>
            <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
            <a href={file.url}>View File</a>
          </li>
        ))}
      </ul>
    )}
  </body>
</html>
``` ## Deployment Options ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import vercel from '@astrojs/vercel/serverless'; export default defineConfig({ output: 'server', adapter: vercel({ runtime: 'nodejs18.x', }), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import netlify from '@astrojs/netlify/functions'; export default defineConfig({ output: 'server', adapter: netlify(), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import node from '@astrojs/node'; export default defineConfig({ output: 'server', adapter: node({ mode: 'standalone', }), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import cloudflare from '@astrojs/cloudflare'; export default defineConfig({ output: 'server', adapter: cloudflare(), }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Astro PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload ``` ## Performance Benefits ## Real-Time Upload Progress ```tsx title="src/components/AdvancedUpload.tsx" import { useState } from 'react'; export default function AdvancedUpload() { const [uploadProgress, setUploadProgress] = useState(0); const [isUploading, setIsUploading] = useState(false); async function handleFileUpload(event: React.ChangeEvent) { const files = event.target.files; if (!files || files.length === 0) return; setIsUploading(true); setUploadProgress(0); try { // Simulate upload progress for (let i = 0; i <= 100; i += 10) { setUploadProgress(i); await new Promise(resolve => setTimeout(resolve, 100)); } alert('Upload completed!'); } catch (error) { console.error('Upload failed:', error); alert('Upload failed!'); } finally { setIsUploading(false); setUploadProgress(0); } } return (
    <div>
      <input
        type="file"
        multiple
        disabled={isUploading}
        onChange={handleFileUpload}
      />
      {isUploading && (
        <div>
          <progress value={uploadProgress} max={100} />
          <span>{uploadProgress}% uploaded</span>
        </div>
      )}
    </div>
); } ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `src/pages/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck is properly installed and configured 3. **Environment variables**: Use `import.meta.env` instead of `process.env` 4. **Client components**: Remember to add `client:load` directive for interactive components ### Debug Mode Enable debug logging: ```typescript title="src/lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (import.meta.env.DEV) { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Astro Configuration ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import react from '@astrojs/react'; import vue from '@astrojs/vue'; export default defineConfig({ integrations: [ react(), // For React components vue(), // For Vue components ], output: 'server', // Required for API routes vite: { define: { // Make environment variables available 'import.meta.env.AWS_ACCESS_KEY_ID': JSON.stringify(process.env.AWS_ACCESS_KEY_ID), } } }); ``` Astro provides an excellent foundation for building fast, content-focused websites with pushduck, combining the power of islands architecture with Web Standards APIs for optimal performance and developer experience. # Bun Runtime (/docs/integrations/bun) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Using pushduck with Bun Bun is an ultra-fast JavaScript runtime with native Web Standards support. Since Bun uses Web Standard `Request` and `Response` objects natively, pushduck handlers work directly without any adapters! 
**Web Standards Native**: Bun's `Bun.serve()` uses Web Standard `Request` objects directly, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash bun add pushduck ``` ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Bun server with upload routes** ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; // Direct usage - no adapter needed! 
Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request); } return new Response('Not found', { status: 404 }); }, }); console.log('πŸš€ Bun server running on http://localhost:3000'); ``` ## Basic Integration ### Simple Upload Server ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); // Method 1: Combined handler (recommended) if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request); } // Health check if (url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok' }), { headers: { 'Content-Type': 'application/json' } }); } return new Response('Not found', { status: 404 }); }, }); console.log('πŸš€ Bun server running on http://localhost:3000'); ``` ### With CORS and Routing ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; function handleCORS(request: Request) { const origin = request.headers.get('origin'); const allowedOrigins = ['http://localhost:3000', 'https://your-domain.com']; const headers = new Headers(); if (origin && allowedOrigins.includes(origin)) { headers.set('Access-Control-Allow-Origin', origin); } headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization'); return headers; } Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); const corsHeaders = handleCORS(request); // Handle preflight requests if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: corsHeaders }); } // Upload routes if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request).then(response => { // Add CORS headers to response corsHeaders.forEach((value, key) => { response.headers.set(key, value); }); return response; }); } // Health check if 
(url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok', runtime: 'Bun', timestamp: new Date().toISOString() }), { headers: { 'Content-Type': 'application/json', ...Object.fromEntries(corsHeaders) } }); } return new Response('Not found', { status: 404 }); }, }); console.log('πŸš€ Bun server running on http://localhost:3000'); ``` ## Advanced Configuration ### Authentication and Rate Limiting ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with authentication privateUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = await verifyJWT(token); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); async function verifyJWT(token: string) { // Your JWT verification logic here // Using Bun's built-in crypto or a JWT library return { sub: 'user-123', role: 'user' }; } export type AppUploadRouter = typeof uploadRouter; ``` ### Production Server with Full Features ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; // Simple rate limiting store const rateLimitStore = new Map(); 
function rateLimit(ip: string, maxRequests = 100, windowMs = 15 * 60 * 1000) { const now = Date.now(); const key = ip; const record = rateLimitStore.get(key); if (!record || now > record.resetTime) { rateLimitStore.set(key, { count: 1, resetTime: now + windowMs }); return true; } if (record.count >= maxRequests) { return false; } record.count++; return true; } function getClientIP(request: Request): string { // In production, you might get this from headers like X-Forwarded-For return request.headers.get('x-forwarded-for') || request.headers.get('x-real-ip') || 'unknown'; } Bun.serve({ port: process.env.PORT ? parseInt(process.env.PORT) : 3000, fetch(request) { const url = new URL(request.url); const clientIP = getClientIP(request); // Rate limiting if (!rateLimit(clientIP)) { return new Response(JSON.stringify({ error: 'Too many requests' }), { status: 429, headers: { 'Content-Type': 'application/json' } }); } // CORS const corsHeaders = { 'Access-Control-Allow-Origin': process.env.NODE_ENV === 'production' ? 'https://your-domain.com' : '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type, Authorization', }; // Handle preflight if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: corsHeaders }); } // Upload routes if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request).then(response => { Object.entries(corsHeaders).forEach(([key, value]) => { response.headers.set(key, value); }); return response; }).catch(error => { console.error('Upload error:', error); return new Response(JSON.stringify({ error: 'Upload failed', message: process.env.NODE_ENV === 'development' ? 
error.message : 'Internal server error' }), { status: 500, headers: { 'Content-Type': 'application/json', ...corsHeaders } }); }); } // API info if (url.pathname === '/api') { return new Response(JSON.stringify({ name: 'Bun Upload API', version: '1.0.0', runtime: 'Bun', endpoints: { health: '/health', upload: '/api/upload/*' } }), { headers: { 'Content-Type': 'application/json', ...corsHeaders } }); } // Health check if (url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok', runtime: 'Bun', version: Bun.version, timestamp: new Date().toISOString(), uptime: process.uptime() }), { headers: { 'Content-Type': 'application/json', ...corsHeaders } }); } return new Response('Not found', { status: 404, headers: corsHeaders }); }, }); console.log(`πŸš€ Bun server running on http://localhost:${process.env.PORT || 3000}`); console.log(`πŸ“Š Environment: ${process.env.NODE_ENV || 'development'}`); ``` ## File-based Routing ### Structured Application ```typescript title="routes/upload.ts" import { uploadRouter } from '../lib/upload'; export function handleUpload(request: Request) { return uploadRouter.handlers(request); } ``` ```typescript title="routes/api.ts" export function handleAPI(request: Request) { return new Response(JSON.stringify({ name: 'Bun Upload API', version: '1.0.0', runtime: 'Bun' }), { headers: { 'Content-Type': 'application/json' } }); } ``` ```typescript title="server.ts" import { handleUpload } from './routes/upload'; import { handleAPI } from './routes/api'; const routes = { '/api/upload': handleUpload, '/api': handleAPI, '/health': () => new Response(JSON.stringify({ status: 'ok' }), { headers: { 'Content-Type': 'application/json' } }) }; Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); for (const [path, handler] of Object.entries(routes)) { if (url.pathname.startsWith(path)) { return handler(request); } } return new Response('Not found', { status: 404 }); }, }); ``` ## Performance Benefits Bun is 3x 
faster than Node.js, providing incredible performance for file upload operations. No adapter layer means zero performance overhead - pushduck handlers run directly in Bun. Built-in bundler, test runner, package manager, and more - no extra tooling needed. Run TypeScript directly without compilation, perfect for rapid development. ## Deployment ### Docker Deployment ```dockerfile title="Dockerfile" FROM oven/bun:1 as base WORKDIR /usr/src/app # Install dependencies COPY package.json bun.lockb ./ RUN bun install --frozen-lockfile # Copy source code COPY . . # Expose port EXPOSE 3000 # Run the app CMD ["bun", "run", "server.ts"] ``` ### Production Scripts ```json title="package.json" { "name": "bun-upload-server", "version": "1.0.0", "scripts": { "dev": "bun run --watch server.ts", "start": "bun run server.ts", "build": "bun build server.ts --outdir ./dist --target bun", "test": "bun test" }, "dependencies": { "pushduck": "latest" }, "devDependencies": { "bun-types": "latest" } } ``` *** **Bun + Pushduck**: The perfect combination for ultra-fast file uploads with zero configuration overhead and exceptional developer experience. # Elysia (/docs/integrations/elysia) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Using pushduck with Elysia Elysia is a TypeScript-first web framework designed for Bun. Since Elysia uses Web Standard `Request` objects natively, pushduck handlers work directly without any adapters! **Web Standards Native**: Elysia exposes `context.request` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead. 
## Quick Setup **Install dependencies** ```bash bun add pushduck ``` ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Elysia app with upload routes** ```typescript title="server.ts" import { Elysia } from 'elysia'; import { uploadRouter } from './lib/upload'; const app = new Elysia(); // Direct usage - no adapter needed! 
app.all('/api/upload/*', (context) => { return uploadRouter.handlers(context.request); }); app.listen(3000); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import { Elysia } from 'elysia'; import { uploadRouter } from './lib/upload'; const app = new Elysia(); // Method 1: Combined handler (recommended) app.all('/api/upload/*', (context) => { return uploadRouter.handlers(context.request); }); // Method 2: Separate handlers (if you need method-specific logic) app.get('/api/upload/*', (context) => uploadRouter.handlers.GET(context.request)); app.post('/api/upload/*', (context) => uploadRouter.handlers.POST(context.request)); app.listen(3000); ``` ### With Middleware and CORS ```typescript title="server.ts" import { Elysia } from 'elysia'; import { cors } from '@elysiajs/cors'; import { uploadRouter } from './lib/upload'; const app = new Elysia() .use(cors({ origin: ['http://localhost:3000', 'https://your-domain.com'], allowedHeaders: ['Content-Type', 'Authorization'], methods: ['GET', 'POST'] })) // Upload routes .all('/api/upload/*', (context) => uploadRouter.handlers(context.request)) // Health check .get('/health', () => ({ status: 'ok' })) .listen(3000); console.log(`🦊 Elysia is running at http://localhost:3000`); ``` ## Advanced Configuration ### Authentication with JWT ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import jwt from '@elysiajs/jwt'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: 
s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { // Use your JWT verification logic here const payload = jwt.verify(token, process.env.JWT_SECRET!); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ### Full Production Setup ```typescript title="server.ts" import { Elysia } from 'elysia'; import { cors } from '@elysiajs/cors'; import { rateLimit } from '@elysiajs/rate-limit'; import { swagger } from '@elysiajs/swagger'; import { uploadRouter } from './lib/upload'; const app = new Elysia() // Swagger documentation .use(swagger({ documentation: { info: { title: 'Upload API', version: '1.0.0' } } })) // CORS .use(cors({ origin: process.env.NODE_ENV === 'production' ? 
['https://your-domain.com'] : true, allowedHeaders: ['Content-Type', 'Authorization'], methods: ['GET', 'POST'] })) // Rate limiting .use(rateLimit({ max: 100, windowMs: 15 * 60 * 1000, // 15 minutes })) // Upload routes .all('/api/upload/*', (context) => uploadRouter.handlers(context.request)) // Health check .get('/health', () => ({ status: 'ok', timestamp: new Date().toISOString() })) .listen(process.env.PORT || 3000); console.log(`🦊 Elysia is running at http://localhost:${process.env.PORT || 3000}`); ``` ## TypeScript Integration ### Type-Safe Client ```typescript title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from './upload'; export const uploadClient = createUploadClient({ baseUrl: process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3000' }); ``` ### Client Usage ```typescript title="components/upload.tsx" import { uploadClient } from '../lib/upload-client'; export function UploadComponent() { const handleUpload = async (files: File[]) => { try { const results = await uploadClient.upload('imageUpload', { files, // Type-safe metadata based on your router configuration metadata: { userId: 'user-123' } }); console.log('Upload successful:', results); } catch (error) { console.error('Upload failed:', error); } }; return ( { if (e.target.files) { handleUpload(Array.from(e.target.files)); } }} /> ); } ``` ## Performance Benefits No adapter layer means zero performance overhead - pushduck handlers run directly in Elysia. Built for Bun's exceptional performance, perfect for high-throughput upload APIs. Full TypeScript support from server to client with compile-time safety. Extensive plugin ecosystem for authentication, validation, rate limiting, and more. ## Deployment ### Production Deployment ```dockerfile title="Dockerfile" FROM oven/bun:1 as base WORKDIR /usr/src/app # Install dependencies COPY package.json bun.lockb ./ RUN bun install --frozen-lockfile # Copy source code COPY . . 
# Expose port EXPOSE 3000 # Run the app CMD ["bun", "run", "server.ts"] ``` ```bash # Build and run docker build -t my-upload-api . docker run -p 3000:3000 my-upload-api ``` *** **Perfect TypeScript Integration**: Elysia's TypeScript-first approach combined with pushduck's type-safe design creates an exceptional developer experience with full end-to-end type safety. # Expo Router (/docs/integrations/expo) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; ## Using pushduck with Expo Router Expo Router is a file-based router for React Native and web applications that enables full-stack development with API routes. Since Expo Router uses Web Standards APIs, pushduck handlers work directly without any adapters! **Web Standards Native**: Expo Router API routes use standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. Perfect for universal React Native apps! 
## Quick Setup **Install dependencies** ```bash npx expo install expo-router pushduck # For file uploads on mobile npx expo install expo-document-picker expo-image-picker # For file system operations npx expo install expo-file-system ``` ```bash yarn expo install expo-router pushduck # For file uploads on mobile yarn expo install expo-document-picker expo-image-picker # For file system operations yarn expo install expo-file-system ``` ```bash pnpm expo install expo-router pushduck # For file uploads on mobile pnpm expo install expo-document-picker expo-image-picker # For file system operations pnpm expo install expo-file-system ``` ```bash bun expo install expo-router pushduck # For file uploads on mobile bun expo install expo-document-picker expo-image-picker # For file system operations bun expo install expo-file-system ``` **Configure server output** Enable server-side rendering in your `app.json`: ```json title="app.json" { "expo": { "web": { "output": "server" }, "plugins": [ [ "expo-router", { "origin": "https://your-domain.com" } ] ] } } ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3 } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = s3.createRouter({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; // Direct usage - no adapter needed! 
export async function GET(request: Request) { return uploadRouter.handlers(request); } export async function POST(request: Request) { return uploadRouter.handlers(request); } ``` ## Basic Integration ### Simple Upload Route ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; // Method 1: Combined handler (recommended) export async function GET(request: Request) { return uploadRouter.handlers(request); } export async function POST(request: Request) { return uploadRouter.handlers(request); } // Method 2: Individual methods (if you need method-specific logic) export async function PUT(request: Request) { return uploadRouter.handlers(request); } export async function DELETE(request: Request) { return uploadRouter.handlers(request); } ``` ### With CORS Headers ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; function addCorsHeaders(response: Response) { response.headers.set('Access-Control-Allow-Origin', '*'); response.headers.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS'); response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization'); return response; } export async function OPTIONS() { return addCorsHeaders(new Response(null, { status: 200 })); } export async function GET(request: Request) { const response = await uploadRouter.handlers(request); return addCorsHeaders(response); } export async function POST(request: Request) { const response = await uploadRouter.handlers(request); return addCorsHeaders(response); } ``` ## Advanced Configuration ### Authentication with Expo Auth ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { jwtVerify } from 'jose'; const { s3 } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, 
bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = s3.createRouter({ // Private uploads with JWT authentication privateUpload: s3 .image() .maxFileSize("5MB") .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const secret = new TextEncoder().encode(process.env.JWT_SECRET!); const { payload } = await jwtVerify(token, secret); return { userId: payload.sub as string, platform: 'mobile' }; } catch (error) { throw new Error('Invalid token'); } }), // User profile pictures profilePicture: s3 .image() .maxFileSize("2MB") .maxFiles(1) .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ req }) => { const userId = await authenticateUser(req); return { userId, category: 'profile' }; }) .paths({ generateKey: ({ metadata, file }) => { return `profiles/${metadata.userId}/avatar.${file.name.split('.').pop()}`; } }), // Document uploads documents: s3 .file() .maxFileSize("10MB") .types(['application/pdf', 'text/plain']) .maxFiles(5) .middleware(async ({ req }) => { const userId = await authenticateUser(req); return { userId, category: 'documents' }; }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); async function authenticateUser(req: Request): Promise<string> { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); const secret = new TextEncoder().encode(process.env.JWT_SECRET!); const { payload } = await jwtVerify(token, secret); return payload.sub as string; } export type AppUploadRouter = typeof
uploadRouter; ``` ## Client-Side Usage (React Native) ### Upload Hook ```typescript title="hooks/useUpload.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from '../lib/upload'; export const upload = createUploadClient<AppUploadRouter>({ endpoint: '/api/upload' }); ``` ### Image Upload Component ```typescript title="components/ImageUploader.tsx" import React, { useState } from 'react'; import { View, Text, TouchableOpacity, Image, Alert, Platform } from 'react-native'; import * as ImagePicker from 'expo-image-picker'; import { upload } from '../hooks/useUpload'; export default function ImageUploader() { const [selectedImage, setSelectedImage] = useState<string | null>(null); const { uploadFiles, files, isUploading, error } = upload.imageUpload(); const pickImage = async () => { // Request permission if (Platform.OS !== 'web') { const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync(); if (status !== 'granted') { Alert.alert('Permission needed', 'Camera roll permission is required'); return; } } const result = await ImagePicker.launchImageLibraryAsync({ mediaTypes: ImagePicker.MediaTypeOptions.Images, allowsEditing: true, aspect: [4, 3], quality: 1, }); if (!result.canceled) { const asset = result.assets[0]; setSelectedImage(asset.uri); // Create File object for upload const file = { uri: asset.uri, name: asset.fileName || 'image.jpg', type: asset.type || 'image/jpeg', } as any; uploadFiles([file]); } }; return ( <View> <TouchableOpacity onPress={pickImage} disabled={isUploading}> <Text>{isUploading ? 'Uploading...' : 'Pick Image'}</Text> </TouchableOpacity> {error && ( <Text>Error: {error.message}</Text> )} {selectedImage && ( <Image source={{ uri: selectedImage }} style={{ width: 200, height: 150 }} /> )} {files.length > 0 && ( <View> {files.map((file, index) => ( <View key={index}> <Text>{file.name}</Text> <Text>{file.status === 'success' ? 'Complete' : `${file.progress}%`}</Text> {file.status === 'success' && file.url && ( <Text>βœ“ Uploaded</Text> )} </View> ))} </View> )} </View> ); } ``` ### Document Upload Component ```typescript title="components/DocumentUploader.tsx" import React, { useState } from 'react'; import { View, Text, TouchableOpacity, Alert, FlatList } from 'react-native'; import * as DocumentPicker from 'expo-document-picker'; import { upload } from '../hooks/useUpload'; interface UploadedFile { name: string; size: number; url: string; downloadUrl?: string; } export default function DocumentUploader() { const [uploadedFiles, setUploadedFiles] = useState<UploadedFile[]>([]); const { uploadFiles, isUploading, error } = upload.documents(); const pickDocument = async () => { try { const result = await DocumentPicker.getDocumentAsync({ type: ['application/pdf', 'text/plain'], multiple: true, }); if (!result.canceled) { const files = result.assets.map(asset => ({ uri: asset.uri, name: asset.name, type: asset.mimeType || 'application/octet-stream', })) as any[]; const uploadResult = await uploadFiles(files); if (uploadResult.success) { const newFiles = uploadResult.results.map(file => ({ name: file.name, size: file.size, url: file.url, // Permanent URL downloadUrl: file.presignedUrl, // Temporary download URL (1 hour) })); setUploadedFiles(prev => [...prev, ...newFiles]); Alert.alert('Success', `${files.length} file(s) uploaded successfully!`); } } } catch (error) { Alert.alert('Error', 'Failed to pick document'); } }; return ( <View> <TouchableOpacity onPress={pickDocument} disabled={isUploading}> <Text>{isUploading ? 'Uploading...' : 'Pick Documents'}</Text> </TouchableOpacity> {error && ( <Text>Error: {error.message}</Text> )} <FlatList data={uploadedFiles} keyExtractor={(item, index) => index.toString()} renderItem={({ item }) => ( <View> <Text>{item.name}</Text> <Text>{(item.size / 1024).toFixed(1)} KB</Text> </View> )} /> </View> ); } ``` ## Project Structure Here's a recommended project structure for Expo Router with pushduck: <Files> <Folder name="app" defaultOpen> <Folder name="api"> <Folder name="upload"> <File name="[...slug]+api.ts" /> </Folder> </Folder> <Folder name="(tabs)"> <File name="_layout.tsx" /> <File name="upload.tsx" /> </Folder> </Folder> <Folder name="components"> <File name="ImageUploader.tsx" /> <File name="DocumentUploader.tsx" /> </Folder> <Folder name="hooks"> <File name="useUpload.ts" /> </Folder> <Folder name="lib"> <File name="upload.ts" /> </Folder> </Files> ## Complete Example ### Main Upload Screen ```typescript title="app/(tabs)/upload.tsx" import React from 'react'; import { View, Text, ScrollView, StyleSheet } from 'react-native'; import ImageUploader from '../../components/ImageUploader'; import DocumentUploader from '../../components/DocumentUploader'; export default function UploadScreen() { return ( <ScrollView style={styles.container}> <Text style={styles.title}>File Upload Demo</Text> <View style={styles.section}> <Text style={styles.sectionTitle}>Image Upload</Text> <ImageUploader /> </View> <View style={styles.section}> <Text style={styles.sectionTitle}>Document Upload</Text> <DocumentUploader /> </View> </ScrollView> ); } const styles = StyleSheet.create({ container: { flex: 1, backgroundColor: '#fff', }, title: { fontSize: 24, fontWeight: 'bold', textAlign: 'center', marginVertical: 20, }, section: { padding: 20, borderBottomWidth: 1, borderBottomColor: '#eee', }, sectionTitle: { fontSize: 18, fontWeight: '600', marginBottom: 15, }, }); ``` ### Tab Layout ```typescript title="app/(tabs)/_layout.tsx" import { Tabs } from 'expo-router'; import { Ionicons } from '@expo/vector-icons'; export default function TabLayout() { return ( <Tabs> <Tabs.Screen name="index" options={{ title: 'Home', tabBarIcon: ({ color, size }) => ( <Ionicons name="home" size={size} color={color} /> ), }} /> <Tabs.Screen name="upload" options={{ title: 'Upload', tabBarIcon: ({ color, size }) => ( <Ionicons name="cloud-upload" size={size} color={color} /> ), }} /> </Tabs> ); } ``` ## Deployment Options ### EAS Build Configuration Configure automatic server deployment in your `eas.json`: ```json title="eas.json" { "cli": { "version": ">= 5.0.0" }, "build": { "development": { "developmentClient": true, "distribution": "internal", "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } }, "preview": { "distribution": "internal", "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } }, "production": { "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } } } } ``` Deploy with automatic server: ```bash # Build for all platforms eas build --platform all # Deploy server only npx expo export --platform web eas deploy ``` ### Development Build Setup ```bash # Install dev client npx expo install expo-dev-client # Create development build eas build --profile development # Or run locally
npx expo run:ios --configuration Release npx expo run:android --variant release ``` Configure local server origin: ```json title="app.json" { "expo": { "plugins": [ [ "expo-router", { "origin": "http://localhost:8081" } ] ] } } ``` ### Local Development Server ```bash # Start Expo development server npx expo start # Test API routes curl http://localhost:8081/api/upload/presigned-url # Clear cache if needed npx expo start --clear ``` For production testing: ```bash # Export for production npx expo export # Serve locally npx expo serve ``` ## Environment Variables ```bash title=".env" # AWS/Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_REGION=auto AWS_ENDPOINT_URL=https://your-account.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-cloudflare-account-id # JWT Authentication JWT_SECRET=your-jwt-secret # Expo Configuration (for client-side, use EXPO_PUBLIC_ prefix) EXPO_PUBLIC_API_URL=https://your-domain.com ``` **Important**: Server environment variables (without `EXPO_PUBLIC_` prefix) are only available in API routes, not in client code. Client-side variables must use the `EXPO_PUBLIC_` prefix. ## Performance Benefits Share upload logic between web and native platforms with a single codebase. Direct access to native file system APIs for optimal performance on mobile. Built-in support for upload progress tracking and real-time status updates. Deploy to iOS, Android, and web with the same upload infrastructure. ## Troubleshooting **File Permissions**: Always request proper permissions for camera and photo library access on mobile devices before file operations. **Server Bundle**: Expo Router API routes require server output to be enabled in your `app.json` configuration. 
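Missing server-side variables are a common source of confusing upload failures. A small fail-fast helper (hypothetical, not part of pushduck) surfaces misconfiguration at startup instead of at upload time:

```typescript
// Hypothetical helper: read a required server-side environment variable,
// throwing immediately if it is missing or empty.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Calling `requireEnv('S3_BUCKET_NAME')` at the top of `lib/upload.ts` turns a silent misconfiguration into an immediate, readable error.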
### Common Issues **Metro bundler errors:** ```bash # Clear Metro cache npx expo start --clear # Reset Expo cache npx expo r -c ``` **Permission denied errors:** ```typescript // Always check permissions before file operations import * as ImagePicker from 'expo-image-picker'; const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync(); if (status !== 'granted') { Alert.alert('Permission needed', 'Camera roll permission is required'); return; } ``` **Network errors in development:** ```typescript // Make sure your development server is accessible const { upload } = useUpload('/api/upload', { endpoint: __DEV__ ? 'http://localhost:8081' : 'https://your-domain.com', }); ``` **File upload timeout:** ```typescript const { upload } = useUpload('/api/upload', { timeout: 60000, // 60 seconds }); ``` ### Debug Mode Enable debug logging for development: ```typescript title="lib/upload.ts" const { s3 } = createUploadConfig() .provider("cloudflareR2",{ /* config */ }) .defaults({ debug: __DEV__, // Only in development }) .build(); ``` This will log detailed information about upload requests, file processing, and S3 operations to help diagnose issues during development. ## Framework-Specific Notes 1. **File System Access**: Use `expo-file-system` for advanced file operations 2. **Permissions**: Always request permissions before accessing camera or photo library 3. **Web Compatibility**: Components work on web out of the box with Expo Router 4. **Platform Detection**: Use `Platform.OS` to handle platform-specific logic 5. 
**Environment Variables**: Server variables don't need `EXPO_PUBLIC_` prefix in API routes # Express (/docs/integrations/express) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; ## Using pushduck with Express Express uses the traditional Node.js `req`/`res` API pattern. Pushduck provides a simple adapter that converts Web Standard handlers to Express middleware format. **Custom Request/Response API**: Express uses `req`/`res` objects instead of Web Standards, so pushduck provides the `toExpressHandler` adapter for seamless integration. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Express server with upload routes** ```typescript title="server.ts" import express from 'express'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; const app = express(); // Convert pushduck handlers to Express middleware app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers)); app.listen(3000, () => { console.log('Server running on 
http://localhost:3000'); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import express from 'express'; import cors from 'cors'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; const app = express(); // Middleware app.use(cors()); app.use(express.json()); // Upload routes using adapter app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers)); // Health check app.get('/health', (req, res) => { res.json({ status: 'healthy', timestamp: new Date().toISOString() }); }); const port = process.env.PORT || 3000; app.listen(port, () => { console.log(`πŸš€ Server running on http://localhost:${port}`); }); ``` ### With Authentication Middleware ```typescript title="server.ts" import express from 'express'; import jwt from 'jsonwebtoken'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; const app = express(); app.use(express.json()); // Authentication middleware const authenticateToken = (req: express.Request, res: express.Response, next: express.NextFunction) => { const authHeader = req.headers['authorization']; const token = authHeader && authHeader.split(' ')[1]; if (!token) { return res.sendStatus(401); } jwt.verify(token, process.env.JWT_SECRET!, (err, user) => { if (err) return res.sendStatus(403); req.user = user; next(); }); }; // Public upload route (no auth) app.all('/api/upload/public/*', toExpressHandler(uploadRouter.handlers)); // Private upload route (with auth) app.all('/api/upload/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers)); app.listen(3000); ``` ## Advanced Configuration ### Upload Configuration with Express Context ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 
'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures with authentication profilePicture: s3 .image() .maxFileSize("2MB") .maxFiles(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { // Extract user from JWT token in Authorization header const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, userRole: user.role, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .maxFileSize("10MB") .maxFiles(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, category: "documents" }; }), // Public uploads (no authentication) publicImages: s3 .image() .maxFileSize("1MB") .maxFiles(1) .formats(["jpeg", "png"]) // No middleware = public access }); async function verifyJWT(token: string) { // Your JWT verification logic const jwt = await import('jsonwebtoken'); return jwt.verify(token, process.env.JWT_SECRET!) 
as any; } export type AppUploadRouter = typeof uploadRouter; ``` ### Complete Express Application ```typescript title="server.ts" import express from 'express'; import cors from 'cors'; import helmet from 'helmet'; import rateLimit from 'express-rate-limit'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; const app = express(); // Security middleware app.use(helmet()); app.use(cors({ origin: process.env.NODE_ENV === 'production' ? ['https://your-domain.com'] : ['http://localhost:3000'], credentials: true })); // Rate limiting const uploadLimiter = rateLimit({ windowMs: 15 * 60 * 1000, // 15 minutes max: 100, // limit each IP to 100 requests per windowMs message: 'Too many upload requests from this IP, please try again later.', standardHeaders: true, legacyHeaders: false, }); // Body parsing middleware app.use(express.json({ limit: '50mb' })); app.use(express.urlencoded({ extended: true, limit: '50mb' })); // Logging middleware app.use((req, res, next) => { console.log(`${new Date().toISOString()} - ${req.method} ${req.path}`); next(); }); // Health check endpoint app.get('/health', (req, res) => { res.json({ status: 'healthy', timestamp: new Date().toISOString(), uptime: process.uptime(), memory: process.memoryUsage(), version: process.env.npm_package_version || '1.0.0' }); }); // API info endpoint app.get('/api', (req, res) => { res.json({ name: 'Express Upload API', version: '1.0.0', endpoints: { health: '/health', upload: '/api/upload/*' }, uploadTypes: [ 'profilePicture - Single profile picture (2MB max)', 'documents - PDF, Word, text files (10MB max, 5 files)', 'publicImages - Public images (1MB max)' ] }); }); // Upload routes with rate limiting app.all('/api/upload/*', uploadLimiter, toExpressHandler(uploadRouter.handlers)); // 404 handler app.use('*', (req, res) => { res.status(404).json({ error: 'Not Found', message: `Route ${req.originalUrl} not found`, timestamp: new Date().toISOString() }); }); // Error 
handler app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => { console.error('Express error:', err); res.status(500).json({ error: 'Internal Server Error', message: process.env.NODE_ENV === 'development' ? err.message : 'Something went wrong', timestamp: new Date().toISOString() }); }); const port = process.env.PORT || 3000; app.listen(port, () => { console.log(`πŸš€ Express server running on http://localhost:${port}`); console.log(`πŸ“ Upload endpoint: http://localhost:${port}/api/upload`); }); ``` ## Project Structure A minimal Express project using pushduck might look like this: <Files> <File name="server.ts" /> <Folder name="lib" defaultOpen> <File name="upload.ts" /> </Folder> <Folder name="routes" defaultOpen> <File name="uploads.ts" /> </Folder> <Folder name="middleware" defaultOpen> <File name="auth.ts" /> </Folder> <File name=".env" /> <File name="package.json" /> </Files> ## Modular Route Organization ### Separate Upload Routes ```typescript title="routes/uploads.ts" import { Router } from 'express'; import { uploadRouter } from '../lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; import { authenticateToken } from '../middleware/auth'; const router = Router(); // Public uploads router.all('/public/*', toExpressHandler(uploadRouter.handlers)); // Private uploads (requires authentication) router.all('/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers)); export default router; ``` ```typescript title="middleware/auth.ts" import { Request, Response, NextFunction } from 'express'; import jwt from 'jsonwebtoken'; export const authenticateToken = (req: Request, res: Response, next: NextFunction) => { const authHeader = req.headers['authorization']; const token = authHeader && authHeader.split(' ')[1]; if (!token) { return res.status(401).json({ error: 'Access token required' }); } jwt.verify(token, process.env.JWT_SECRET!, (err, user) => { if (err) { return res.status(403).json({ error: 'Invalid or expired token' }); } req.user = user; next(); }); }; ``` # Fastify (/docs/integrations/fastify) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Using pushduck
with Fastify Fastify is a high-performance Node.js web framework that uses custom `request`/`reply` objects. Pushduck provides a simple adapter that converts Web Standard handlers to Fastify handler format. **Custom Request/Response API**: Fastify uses `request`/`reply` objects instead of Web Standards, so pushduck provides the `toFastifyHandler` adapter for seamless integration. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Fastify server with upload routes** ```typescript title="server.ts" import Fastify from 'fastify'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters'; const fastify = Fastify({ logger: true }); // Convert pushduck handlers to Fastify handler fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); const start = async () => { try { await fastify.listen({ port: 3000 }); console.log('πŸš€ Fastify server running on http://localhost:3000'); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import Fastify from 'fastify'; import cors from '@fastify/cors'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 
'pushduck/adapters'; const fastify = Fastify({ logger: { level: 'info', transport: { target: 'pino-pretty' } } }); // Register CORS await fastify.register(cors, { origin: ['http://localhost:3000', 'https://your-domain.com'] }); // Upload routes using adapter fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); // Health check fastify.get('/health', async (request, reply) => { return { status: 'healthy', timestamp: new Date().toISOString(), framework: 'Fastify' }; }); const start = async () => { try { await fastify.listen({ port: 3000, host: '0.0.0.0' }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ### With Authentication Hook ```typescript title="server.ts" import Fastify from 'fastify'; import jwt from '@fastify/jwt'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters'; const fastify = Fastify({ logger: true }); // Register JWT await fastify.register(jwt, { secret: process.env.JWT_SECRET! }); // Authentication hook fastify.addHook('preHandler', async (request, reply) => { // Only protect upload routes if (request.url.startsWith('/api/upload/private/')) { try { await request.jwtVerify(); } catch (err) { reply.send(err); } } }); // Public upload routes fastify.all('/api/upload/public/*', toFastifyHandler(uploadRouter.handlers)); // Private upload routes (protected by hook) fastify.all('/api/upload/private/*', toFastifyHandler(uploadRouter.handlers)); const start = async () => { try { await fastify.listen({ port: 3000 }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Advanced Configuration ### Upload Configuration with Fastify Context ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: 
process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures with authentication profilePicture: s3 .image() .maxFileSize("2MB") .maxFiles(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, userRole: user.role, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .maxFileSize("10MB") .maxFiles(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, category: "documents" }; }), // Public uploads (no authentication) publicImages: s3 .image() .maxFileSize("1MB") .maxFiles(1) .formats(["jpeg", "png"]) // No middleware = public access }); async function verifyJWT(token: string) { // Your JWT verification logic const jwt = await import('jsonwebtoken'); return jwt.verify(token, process.env.JWT_SECRET!) 
as any; } export type AppUploadRouter = typeof uploadRouter; ``` ### Complete Fastify Application ```typescript title="server.ts" import Fastify from 'fastify'; import cors from '@fastify/cors'; import helmet from '@fastify/helmet'; import rateLimit from '@fastify/rate-limit'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters'; const fastify = Fastify({ logger: { level: process.env.NODE_ENV === 'production' ? 'warn' : 'info', transport: process.env.NODE_ENV !== 'production' ? { target: 'pino-pretty' } : undefined } }); // Security middleware await fastify.register(helmet, { contentSecurityPolicy: false }); // CORS configuration await fastify.register(cors, { origin: process.env.NODE_ENV === 'production' ? ['https://your-domain.com'] : true, credentials: true }); // Rate limiting await fastify.register(rateLimit, { max: 100, timeWindow: '15 minutes', errorResponseBuilder: (request, context) => ({ error: 'Rate limit exceeded', message: `Too many requests from ${request.ip}. 
Try again later.`, retryAfter: Math.round(context.ttl / 1000) }) }); // Request logging fastify.addHook('onRequest', async (request, reply) => { request.log.info({ url: request.url, method: request.method }, 'incoming request'); }); // Health check endpoint fastify.get('/health', async (request, reply) => { return { status: 'healthy', timestamp: new Date().toISOString(), uptime: process.uptime(), memory: process.memoryUsage(), version: process.env.npm_package_version || '1.0.0', framework: 'Fastify' }; }); // API info endpoint fastify.get('/api', async (request, reply) => { return { name: 'Fastify Upload API', version: '1.0.0', endpoints: { health: '/health', upload: '/api/upload/*' }, uploadTypes: [ 'profilePicture - Single profile picture (2MB max)', 'documents - PDF, Word, text files (10MB max, 5 files)', 'publicImages - Public images (1MB max)' ] }; }); // Upload routes with rate limiting fastify.register(async function (fastify) { await fastify.register(rateLimit, { max: 50, timeWindow: '15 minutes' }); fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); }); // 404 handler fastify.setNotFoundHandler(async (request, reply) => { reply.status(404).send({ error: 'Not Found', message: `Route ${request.method} ${request.url} not found`, timestamp: new Date().toISOString() }); }); // Error handler fastify.setErrorHandler(async (error, request, reply) => { request.log.error(error, 'Fastify error'); reply.status(500).send({ error: 'Internal Server Error', message: process.env.NODE_ENV === 'development' ? 
error.message : 'Something went wrong', timestamp: new Date().toISOString() }); }); // Graceful shutdown const gracefulShutdown = () => { fastify.log.info('Shutting down gracefully...'); fastify.close().then(() => { fastify.log.info('Server closed'); process.exit(0); }).catch((err) => { fastify.log.error(err, 'Error during shutdown'); process.exit(1); }); }; process.on('SIGTERM', gracefulShutdown); process.on('SIGINT', gracefulShutdown); const start = async () => { try { const port = Number(process.env.PORT) || 3000; const host = process.env.HOST || '0.0.0.0'; await fastify.listen({ port, host }); fastify.log.info(`πŸš€ Fastify server running on http://${host}:${port}`); fastify.log.info(`πŸ“ Upload endpoint: http://${host}:${port}/api/upload`); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Plugin-Based Architecture ### Upload Plugin ```typescript title="plugins/upload.ts" import { FastifyPluginAsync } from 'fastify'; import { uploadRouter } from '../lib/upload'; import { toFastifyHandler } from 'pushduck/adapters'; const uploadPlugin: FastifyPluginAsync = async (fastify) => { // Upload routes fastify.all('/upload/*', toFastifyHandler(uploadRouter.handlers)); // Upload status endpoint fastify.get('/upload-status', async (request, reply) => { return { status: 'ready', supportedTypes: ['images', 'documents', 'publicImages'], maxSizes: { profilePicture: '2MB', documents: '10MB', publicImages: '1MB' } }; }); }; export default uploadPlugin; ``` ### Main Server with Plugins ```typescript title="server.ts" import Fastify from 'fastify'; import uploadPlugin from './plugins/upload'; const fastify = Fastify({ logger: true }); // Register upload plugin await fastify.register(uploadPlugin, { prefix: '/api' }); const start = async () => { try { await fastify.listen({ port: 3000 }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Client Usage The client-side integration is identical regardless of your backend 
framework: ```typescript title="client/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from '../lib/upload'; export const upload = createUploadClient<AppUploadRouter>({ endpoint: 'http://localhost:3000/api/upload', headers: { 'Authorization': `Bearer ${getAuthToken()}` } }); function getAuthToken(): string { return localStorage.getItem('auth-token') || ''; } ``` ```typescript title="client/upload-form.tsx" import { upload } from './upload-client'; export function DocumentUploader() { const { uploadFiles, files, isUploading, error } = upload.documents(); const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => { const selectedFiles = Array.from(e.target.files || []); uploadFiles(selectedFiles); }; return (
{error && (
Error: {error.message}
)} {files.map((file) => (
{file.name} {file.status === 'success' && ( Download )}
))}
); } ``` ## Deployment ### Docker Deployment ```dockerfile title="Dockerfile" FROM node:18-alpine WORKDIR /app # Copy package files COPY package*.json ./ RUN npm ci --only=production # Copy source code COPY . . # Build TypeScript RUN npm run build EXPOSE 3000 CMD ["npm", "start"] ``` ### Package Configuration ```json title="package.json" { "name": "fastify-upload-api", "version": "1.0.0", "scripts": { "dev": "tsx watch src/server.ts", "build": "tsc", "start": "node dist/server.js" }, "dependencies": { "fastify": "^4.24.0", "pushduck": "latest", "@fastify/cors": "^8.4.0", "@fastify/helmet": "^11.1.0", "@fastify/rate-limit": "^8.0.0", "@fastify/jwt": "^7.2.0" }, "devDependencies": { "@types/node": "^20.0.0", "tsx": "^3.12.7", "typescript": "^5.0.0", "pino-pretty": "^10.2.0" } } ``` ### Environment Variables ```bash title=".env" # Server Configuration PORT=3000 HOST=0.0.0.0 NODE_ENV=development JWT_SECRET=your-super-secret-jwt-key # Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_r2_access_key AWS_SECRET_ACCESS_KEY=your_r2_secret_key AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-account-id ``` ## Performance Benefits Fastify is one of the fastest Node.js frameworks, perfect for high-throughput upload APIs. Leverage Fastify's extensive plugin ecosystem alongside pushduck's upload capabilities. Excellent TypeScript support with full type safety for both Fastify and pushduck. Built-in schema validation, logging, and error handling for production deployments. *** **Fastify + Pushduck**: High-performance file uploads with Fastify's speed and pushduck's universal design, connected through a simple adapter. 
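The adapter boundary described above can be sketched in isolation. Conceptually, an adapter like `toFastifyHandler` bridges a Node-style request object to the Web `Request` that pushduck's handlers expect; the `NodeStyleRequest` shape and `toWebRequest` helper below are hypothetical illustrations of that first step, not pushduck's actual adapter code:

```typescript
// Hypothetical minimal shape of a Node-style framework request.
interface NodeStyleRequest {
  method: string;
  url: string; // path + query, e.g. "/api/upload/documents?x=1"
  headers: Record<string, string>;
}

// Build a Web-standard Request from a Node-style request object.
// A base origin is needed because Node frameworks carry only the path.
function toWebRequest(req: NodeStyleRequest, origin = "http://localhost"): Request {
  return new Request(new URL(req.url, origin), {
    method: req.method,
    headers: req.headers,
  });
}
```

The real adapter additionally forwards the request body and writes the resulting Web `Response` back to the framework's reply object.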
# Fresh (/docs/integrations/fresh) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; **🚧 Client-Side In Development**: Fresh server-side integration is fully functional with Web Standards APIs. However, Fresh-specific client-side components and hooks are still in development. You can use the standard pushduck client APIs for now. ## Using pushduck with Fresh Fresh is a modern web framework for Deno that uses islands architecture for optimal performance. It uses Web Standards APIs and provides server-side rendering with minimal client-side JavaScript. Since Fresh uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Fresh API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. 
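Because a pushduck handler is just a function from a Web `Request` to a `Response`, a Fresh route can delegate to it with no conversion layer. A minimal sketch, using a hypothetical stand-in for `uploadRouter.handlers` (the real handler performs presigning and validation):

```typescript
// The Web-standard handler shape that both pushduck and Fresh agree on.
type UniversalHandler = (req: Request) => Promise<Response>;

// Hypothetical stand-in for uploadRouter.handlers — just echoes the request path.
const handlers: UniversalHandler = async (req) =>
  new Response(JSON.stringify({ path: new URL(req.url).pathname }), {
    headers: { "content-type": "application/json" },
  });

// In a real Fresh route module this object would be `export const handler`.
const handler = {
  GET: (req: Request) => handlers(req),
  POST: (req: Request) => handlers(req),
};
```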
## Quick Setup **Install Fresh and pushduck** ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Add pushduck to import_map.json ``` ```json title="import_map.json" { "imports": { "$fresh/": "https://deno.land/x/fresh@1.6.1/", "preact": "https://esm.sh/preact@10.19.2", "preact/": "https://esm.sh/preact@10.19.2/", "pushduck/server": "https://esm.sh/pushduck@latest/server", "pushduck/client": "https://esm.sh/pushduck@latest/client" } } ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via npm (requires Node.js compatibility) npm install pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via yarn (requires Node.js compatibility) yarn add pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via pnpm (requires Node.js compatibility) pnpm add pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via bun (requires Node.js compatibility) bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!, secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!, region: 'auto', endpoint: Deno.env.get("AWS_ENDPOINT_URL")!, bucket: Deno.env.get("S3_BUCKET_NAME")!, accountId: Deno.env.get("R2_ACCOUNT_ID")!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="routes/api/upload/[...path].ts" import { Handlers } from "$fresh/server.ts"; import { uploadRouter } from 
"../../../lib/upload.ts"; // Direct usage - no adapter needed! export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="routes/api/upload/[...path].ts" import { Handlers } from "$fresh/server.ts"; import { uploadRouter } from "../../../lib/upload.ts"; // Method 1: Combined handler (recommended) export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, }; // Method 2: Universal handler export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, async OPTIONS(req) { return new Response(null, { status: 200, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type', }, }); }, }; ``` ### With Middleware ```typescript title="routes/_middleware.ts" import { MiddlewareHandlerContext } from "$fresh/server.ts"; export async function handler( req: Request, ctx: MiddlewareHandlerContext, ) { // Add CORS headers for upload routes if (ctx.destination === "route" && req.url.includes("/api/upload")) { const response = await ctx.next(); response.headers.set("Access-Control-Allow-Origin", "*"); response.headers.set("Access-Control-Allow-Methods", "GET, POST, OPTIONS"); response.headers.set("Access-Control-Allow-Headers", "Content-Type"); return response; } return ctx.next(); } ``` ## Advanced Configuration ### Authentication with Fresh ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getCookies } from "https://deno.land/std@0.208.0/http/cookie.ts"; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!, secretAccessKey: 
Deno.env.get("AWS_SECRET_ACCESS_KEY")!, region: 'auto', endpoint: Deno.env.get("AWS_ENDPOINT_URL")!, bucket: Deno.env.get("S3_BUCKET_NAME")!, accountId: Deno.env.get("R2_ACCOUNT_ID")!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const cookies = getCookies(req.headers); const sessionId = cookies.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper function async function getUserFromSession(sessionId: string) { // Implement your session validation logic // This could connect to a database, Deno KV, etc. return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Island Component ```tsx title="islands/FileUpload.tsx" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "../lib/upload.ts"; const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); export default function FileUpload() { function handleUploadComplete(files: any[]) { console.log("Files uploaded:", files); alert("Upload completed!"); } function handleUploadError(error: Error) { console.error("Upload error:", error); alert(`Upload failed: ${error.message}`); } return (

Image Upload

Document Upload

); } ``` ### Using in Pages ```tsx title="routes/index.tsx" import { Head } from "$fresh/runtime.ts"; import FileUpload from "../islands/FileUpload.tsx"; export default function Home() { return ( <> File Upload Demo

File Upload Demo

); } ``` ## File Management ### Server-Side File API ```typescript title="routes/api/files.ts" import { Handlers } from "$fresh/server.ts"; export const handler: Handlers = { async GET(req) { const url = new URL(req.url); const userId = url.searchParams.get('userId'); if (!userId) { return new Response(JSON.stringify({ error: 'User ID required' }), { status: 400, headers: { 'Content-Type': 'application/json' } }); } // Fetch files from database/Deno KV const files = await getFilesForUser(userId); return new Response(JSON.stringify({ files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }), { headers: { 'Content-Type': 'application/json' } }); }, }; async function getFilesForUser(userId: string) { // Example using Deno KV const kv = await Deno.openKv(); const files = []; for await (const entry of kv.list({ prefix: ["files", userId] })) { files.push(entry.value); } return files; } ``` ### File Management Page ```tsx title="routes/files.tsx" import { Head } from "$fresh/runtime.ts"; import { Handlers, PageProps } from "$fresh/server.ts"; import FileUpload from "../islands/FileUpload.tsx"; interface FileData { id: string; name: string; url: string; size: number; uploadedAt: string; } interface PageData { files: FileData[]; } export const handler: Handlers = { async GET(req, ctx) { // Fetch files for current user const files = await getFilesForUser("current-user"); return ctx.render({ files }); }, }; export default function FilesPage({ data }: PageProps) { function formatFileSize(bytes: number): string { const sizes = ['Bytes', 'KB', 'MB', 'GB']; if (bytes === 0) return '0 Bytes'; const i = Math.floor(Math.log(bytes) / Math.log(1024)); return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i]; } return ( <> My Files

My Files

Uploaded Files

{data.files.length === 0 ? (

No files uploaded yet.

) : (
{data.files.map((file) => (

{file.name}

{formatFileSize(file.size)}

{new Date(file.uploadedAt).toLocaleDateString()}

View File
))}
)}
); } async function getFilesForUser(userId: string) { // Implementation depends on your storage solution return []; } ``` ## Deployment Options ```bash # Deploy to Deno Deploy deno task build deployctl deploy --project=my-app --include=. --exclude=node_modules ``` ```json title="deno.json" { "tasks": { "build": "deno run -A dev.ts build", "preview": "deno run -A main.ts", "start": "deno run -A --watch=static/,routes/ dev.ts", "deploy": "deployctl deploy --project=my-app --include=. --exclude=node_modules" } } ``` ```dockerfile title="Dockerfile" FROM denoland/deno:1.38.0 WORKDIR /app # Copy dependency files COPY deno.json deno.lock import_map.json ./ # Cache dependencies RUN deno cache --import-map=import_map.json main.ts # Copy source code COPY . . # Build the application RUN deno task build EXPOSE 8000 CMD ["deno", "run", "-A", "main.ts"] ``` ```bash # Install Deno curl -fsSL https://deno.land/install.sh | sh # Clone and run your app git clone cd deno task start ``` ```systemd title="/etc/systemd/system/fresh-app.service" [Unit] Description=Fresh App After=network.target [Service] Type=simple User=deno WorkingDirectory=/opt/fresh-app ExecStart=/home/deno/.deno/bin/deno run -A main.ts Restart=always [Install] WantedBy=multi-user.target ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Fresh PORT=8000 ``` ## Performance Benefits ## Real-Time Upload Progress ```tsx title="islands/AdvancedUpload.tsx" import { useState } from "preact/hooks"; export default function AdvancedUpload() { const [uploadProgress, setUploadProgress] = useState(0); const [isUploading, setIsUploading] = useState(false); async function handleFileUpload(event: Event) { const target = event.target as HTMLInputElement; const files = target.files; if (!files || files.length === 0) return; setIsUploading(true); setUploadProgress(0); try { // Simulate upload 
progress for (let i = 0; i <= 100; i += 10) { setUploadProgress(i); await new Promise(resolve => setTimeout(resolve, 100)); } alert('Upload completed!'); } catch (error) { console.error('Upload failed:', error); alert('Upload failed!'); } finally { setIsUploading(false); setUploadProgress(0); } } return (
{isUploading && (

{uploadProgress}% uploaded

)}
); } ``` ## Deno KV Integration ```typescript title="lib/storage.ts" // Example using Deno KV for file metadata storage export class FileStorage { // Deno.openKv() is async, so construction goes through an async factory // instead of doing async work in the constructor private constructor(private kv: Deno.Kv) {} static async create(): Promise<FileStorage> { return new FileStorage(await Deno.openKv()); } async saveFileMetadata(userId: string, file: { id: string; name: string; url: string; size: number; type: string; }) { const key = ["files", userId, file.id]; await this.kv.set(key, { ...file, createdAt: new Date().toISOString(), }); } async getFilesForUser(userId: string) { const files = []; for await (const entry of this.kv.list({ prefix: ["files", userId] })) { files.push(entry.value); } return files; } async deleteFile(userId: string, fileId: string) { const key = ["files", userId, fileId]; await this.kv.delete(key); } } export const fileStorage = await FileStorage.create(); ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts` 2. **Import errors**: Check your `import_map.json` configuration 3. **Permissions**: Deno requires explicit permissions (`-A` flag for all permissions) 4. **Environment variables**: Use `Deno.env.get()` instead of `process.env` ### Debug Mode Enable debug logging: ```typescript title="lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (Deno.env.get("DENO_ENV") === "development") { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Fresh Configuration ```typescript title="fresh.config.ts" import { defineConfig } from "$fresh/server.ts"; export default defineConfig({ plugins: [], // Enable static file serving staticDir: "./static", // Custom build options build: { target: ["chrome99", "firefox99", "safari15"], }, }); ``` Fresh provides an excellent foundation for building modern web applications with Deno and pushduck, combining the power of islands architecture with Web Standards APIs and Deno's secure runtime environment. 
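The CORS middleware shown earlier boils down to copying a `Response` and setting three headers. A framework-agnostic sketch of the same idea (the `withCors` helper name is hypothetical; the header values match the middleware above):

```typescript
// Copy a Response and attach the CORS headers used by the upload middleware.
// Copying into a fresh Response avoids issues with immutable header guards.
function withCors(res: Response, origin = "*"): Response {
  const headers = new Headers(res.headers);
  headers.set("Access-Control-Allow-Origin", origin);
  headers.set("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
  headers.set("Access-Control-Allow-Headers", "Content-Type");
  return new Response(res.body, {
    status: res.status,
    statusText: res.statusText,
    headers,
  });
}
```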
# Hono (/docs/integrations/hono) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; ## Using pushduck with Hono Hono is a fast, lightweight web framework built on Web Standards. Since Hono uses `Request` and `Response` objects natively, pushduck handlers work directly without any adapters! **Web Standards Native**: Hono exposes `c.req.raw` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Hono app with upload routes** ```typescript title="app.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Direct usage - no adapter needed! 
app.all('/api/upload/*', (c) => { return uploadRouter.handlers(c.req.raw); }); export default app; ``` ## Basic Integration ### Simple Upload Route ```typescript title="app.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Method 1: Combined handler (recommended) app.all('/api/upload/*', (c) => { return uploadRouter.handlers(c.req.raw); }); // Method 2: Separate handlers (if you need method-specific logic) app.get('/api/upload/*', (c) => uploadRouter.handlers.GET(c.req.raw)); app.post('/api/upload/*', (c) => uploadRouter.handlers.POST(c.req.raw)); export default app; ``` ### With Middleware ```typescript title="app.ts" import { Hono } from 'hono'; import { cors } from 'hono/cors'; import { logger } from 'hono/logger'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Global middleware app.use('*', logger()); app.use('*', cors({ origin: ['http://localhost:3000', 'https://your-domain.com'], allowMethods: ['GET', 'POST'], allowHeaders: ['Content-Type'], })); // Upload routes app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); // Health check app.get('/health', (c) => c.json({ status: 'ok' })); export default app; ``` ## Advanced Configuration ### Authentication with Hono ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { verify } from 'hono/jwt'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: s3 .image() .maxFileSize("5MB") 
.middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = await verify(token, process.env.JWT_SECRET!); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ## Deployment Options ```typescript title="src/index.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); export default app; ``` ```toml title="wrangler.toml" name = "my-upload-api" main = "src/index.ts" compatibility_date = "2023-12-01" [env.production] vars = { NODE_ENV = "production" } ``` ```bash # Deploy to Cloudflare Workers npx wrangler deploy ``` ```typescript title="server.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); export default { port: 3000, fetch: app.fetch, }; ``` ```bash # Run with Bun bun run server.ts ``` ```typescript title="server.ts" import { serve } from '@hono/node-server'; import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); const port = 3000; console.log(`Server is running on port ${port}`); serve({ fetch: app.fetch, port }); ``` ```bash # Run with Node.js npm run dev ``` ```typescript title="server.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload.ts'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); Deno.serve(app.fetch); ``` ```bash # Run with Deno deno 
run --allow-net --allow-env server.ts ``` ## Performance Benefits No adapter layer means zero performance overhead - pushduck handlers run directly in Hono. Hono is one of the fastest web frameworks, perfect for high-performance upload APIs. Works on Cloudflare Workers, Bun, Node.js, and Deno with the same code. Hono + pushduck creates incredibly lightweight upload services. *** **Perfect Match**: Hono's Web Standards foundation and pushduck's universal design create a powerful, fast, and lightweight file upload solution that works everywhere. # Framework Integrations (/docs/integrations) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; ## Supported Frameworks Pushduck provides **universal file upload handlers** that work with any web framework through a single, consistent API. Write your upload logic once and deploy it anywhere! **Universal Design**: Pushduck uses Web Standards (Request/Response) at its core, making it compatible with both Web Standards frameworks and those with custom request/response APIs without framework-specific code. ## 🌟 Universal API All frameworks use the same core API: ```typescript import { createS3Router, s3 } from 'pushduck/server'; const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB"), videoUpload: s3.file().maxFileSize("100MB").types(["video/*"]) }); // Universal handlers - work with ANY framework export const { GET, POST } = uploadRouter.handlers; ``` ## Framework Categories Pushduck supports frameworks in two categories: **No adapter needed!** Use `uploadRouter.handlers` directly. * Hono * Elysia * Bun Runtime * TanStack Start * SolidJS Start **Simple adapters provided** for seamless integration. 
* Next.js (App & Pages Router) * Express * Fastify ## Quick Start by Framework ```typescript // Works with: Hono, Elysia, Bun, TanStack Start, SolidJS Start import { uploadRouter } from '@/lib/upload'; // Direct usage - no adapter needed! app.all('/api/upload/*', (ctx) => { return uploadRouter.handlers(ctx.request); // or c.req.raw }); ``` ```typescript // app/api/upload/route.ts import { uploadRouter } from '@/lib/upload'; // Direct usage (recommended) export const { GET, POST } = uploadRouter.handlers; // Or with explicit adapter for extra type safety import { toNextJsHandler } from 'pushduck/adapters'; export const { GET, POST } = toNextJsHandler(uploadRouter.handlers); ``` ```typescript import express from 'express'; import { uploadRouter } from '@/lib/upload'; import { toExpressHandler } from 'pushduck/adapters'; const app = express(); app.all("/api/upload/*", toExpressHandler(uploadRouter.handlers)); ``` ```typescript import Fastify from 'fastify'; import { uploadRouter } from '@/lib/upload'; import { toFastifyHandler } from 'pushduck/adapters'; const fastify = Fastify(); fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); ``` ## Why Universal Handlers Work **Web Standards Foundation** Pushduck is built on Web Standards (`Request` and `Response` objects) that are supported by all modern JavaScript runtimes. 
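That foundation is easy to verify in isolation: `Request` and `Response` are runtime globals in Node 18+, Deno, Bun, and Cloudflare Workers, so constructing and inspecting them requires no framework imports at all:

```typescript
// Request/Response are Web-standard globals — the same code runs on
// Node 18+, Deno, Bun, and Cloudflare Workers without any imports.
const request = new Request("https://example.com/api/upload/imageUpload", {
  method: "POST",
  headers: { "content-type": "application/json" },
});

const response = new Response(JSON.stringify({ ok: true }), {
  status: 200,
  headers: { "content-type": "application/json" },
});
```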
```typescript // Core handler signature type Handler = (request: Request) => Promise<Response> ``` **Framework Compatibility** Modern frameworks expose Web Standard objects directly: * **Hono**: `c.req.raw` is a Web `Request` * **Elysia**: `context.request` is a Web `Request` * **Bun**: Native Web `Request` support * **TanStack Start**: `{ request }` is a Web `Request` * **SolidJS Start**: `event.request` is a Web `Request` **Framework Adapters** For frameworks with custom request/response APIs, simple adapters convert between formats: ```typescript // Express adapter example export function toExpressHandler(handlers: UniversalHandlers) { return async (req: Request, res: Response, next: NextFunction) => { const webRequest = convertExpressToWebRequest(req); const webResponse = await handlers[req.method](webRequest); convertWebResponseToExpress(webResponse, res); }; } ``` ## Configuration (Same for All Frameworks) Your upload configuration is identical across all frameworks: ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Image uploads with validation imageUpload: s3 .image() .maxFileSize("5MB") .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: "images" }; }), // Document uploads documentUpload: s3 .file() .maxFileSize("10MB") .types(["application/pdf", "text/plain"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: 
"documents" }; }), // Video uploads videoUpload: s3 .file() .maxFileSize("100MB") .types(["video/mp4", "video/quicktime"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: "videos" }; }) }); export type AppUploadRouter = typeof uploadRouter; ``` ## Client Usage (Framework Independent) The client-side code is identical regardless of your backend framework: ```typescript title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from './upload'; export const upload = createUploadClient({ endpoint: '/api/upload' }); ``` ```typescript title="components/upload-form.tsx" import { upload } from '@/lib/upload-client'; export function UploadForm() { // Property-based access with full type safety const { uploadFiles, files, isUploading } = upload.imageUpload(); const handleUpload = async (selectedFiles: File[]) => { await uploadFiles(selectedFiles); }; return (
handleUpload(Array.from(e.target.files || []))} /> {files.map(file => (
{file.name} {file.url && View}
))}
); } ``` ## Benefits of Universal Design Migrate from Express to Hono or Next.js to Bun without changing your upload implementation. Web Standards native frameworks get direct handler access with no adapter overhead. Master pushduck once and use it with any framework in your toolkit. As more frameworks adopt Web Standards, they automatically work with pushduck. ## Next Steps Choose your framework integration guide: Complete guide for Next.js App Router and Pages Router Fast, lightweight, built on Web Standards TypeScript-first framework with Bun Classic Node.js framework integration *** **Universal by Design**: Write once, run anywhere. Pushduck's universal handlers make file uploads work seamlessly across the entire JavaScript ecosystem. # Next.js (/docs/integrations/nextjs) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; ## Next.js Integration Pushduck provides seamless integration with both Next.js App Router and Pages Router through universal handlers that work with Next.js's Web Standards-based API. **Next.js 13+**: App Router uses Web Standards (Request/Response), so pushduck handlers work directly. Pages Router requires a simple adapter for the legacy req/res API. 
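One practical consequence of this design: the `generateKey` callback used in the configurations in this guide is an ordinary pure function, so the key scheme can be unit-tested in isolation. The types below are hypothetical stand-ins for pushduck's real `file`/`metadata` arguments:

```typescript
// Hypothetical shapes standing in for pushduck's file/metadata arguments.
interface FileInfo {
  name: string;
}
interface KeyMetadata {
  userId: string;
}

// Mirrors the userId/timestamp/filename scheme used in this guide's configs.
// Accepting `now` as a parameter keeps the function deterministic for tests.
function generateKey(
  file: FileInfo,
  metadata: KeyMetadata,
  now: number = Date.now(),
): string {
  return `${metadata.userId}/${now}/${file.name}`;
}
```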
## Quick Setup **Install pushduck** npm pnpm yarn bun ```bash npm install pushduck ``` ```bash pnpm add pushduck ``` ```bash yarn add pushduck ``` ```bash bun add pushduck ``` **Configure your upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ // [!code highlight] accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, // [!code highlight] }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), // [!code highlight] documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** App Router Pages Router ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; // Direct usage (recommended) export const { GET, POST } = uploadRouter.handlers; ``` ```typescript title="pages/api/upload/[...path].ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsPagesHandler } from 'pushduck/adapters'; export default toNextJsPagesHandler(uploadRouter.handlers); ``` ## App Router Integration Next.js App Router uses Web Standards, making integration seamless: ### Basic API Route ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; // Direct usage - works because Next.js App Router uses Web Standards export const { GET, POST } = uploadRouter.handlers; ``` ### With Type Safety Adapter For extra type safety and better IDE support: ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsHandler } from 'pushduck/adapters'; // Explicit adapter for enhanced type safety export const { GET, POST } = toNextJsHandler(uploadRouter.handlers); ``` ### Advanced 
Configuration ```typescript title="app/api/upload/route.ts" import { createUploadConfig } from 'pushduck/server'; import { getServerSession } from 'next-auth'; import { authOptions } from '@/lib/auth'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); const uploadRouter = createS3Router({ // Profile pictures with authentication profilePicture: s3 .image() .maxFileSize("2MB") .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .maxFileSize("10MB") .types(["application/pdf", "text/plain", "application/msword"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id, category: "documents" }; }), // Public image uploads (no auth required) publicImages: s3 .image() .maxFileSize("5MB") .formats(["jpeg", "png", "webp"]) // No middleware = publicly accessible }); export type AppUploadRouter = typeof uploadRouter; export const { GET, POST } = uploadRouter.handlers; ``` ## Pages Router Integration Pages Router uses the legacy req/res API, so we provide a simple adapter: ### Basic API Route ```typescript title="pages/api/upload/[...path].ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsPagesHandler } from 'pushduck/adapters'; export default 
toNextJsPagesHandler(uploadRouter.handlers); ``` ### With Authentication ```typescript title="pages/api/upload/[...path].ts" import { createUploadConfig } from 'pushduck/server'; import { toNextJsPagesHandler } from 'pushduck/adapters'; import { getSession } from 'next-auth/react'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ // ... your config }) .build(); const uploadRouter = createS3Router({ imageUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { // Convert Web Request to get session const session = await getSession({ req: req as any }); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id }; }) }); export default toNextJsPagesHandler(uploadRouter.handlers); ``` ## Client-Side Usage The client-side code is identical for both App Router and Pages Router: ### Setup Upload Client ```typescript title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from './upload'; export const upload = createUploadClient({ endpoint: '/api/upload' }); ``` ### React Component ```typescript title="components/upload-form.tsx" 'use client'; // App Router // or just regular component for Pages Router import { upload } from '@/lib/upload-client'; import { useState } from 'react'; export function UploadForm() { const { uploadFiles, files, isUploading, error } = upload.imageUpload(); const handleFileSelect = (e: React.ChangeEvent) => { const selectedFiles = Array.from(e.target.files || []); uploadFiles(selectedFiles); }; return (
    <div className="space-y-4">
      <input
        type="file"
        multiple
        accept="image/*"
        onChange={handleFileSelect}
        disabled={isUploading}
      />

      {error && (
        <p className="text-red-600">Error: {error.message}</p>
      )}

      {files.length > 0 && (
        <ul className="space-y-2">
          {files.map((file) => (
            <li key={file.id} className="flex items-center gap-4">
              <span>{file.name}</span>
              <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
              <span>
                {file.status === 'success' ? 'Complete' : `${file.progress}%`}
              </span>
              {file.status === 'success' && file.url && (
                <a href={file.url} target="_blank" rel="noreferrer">View</a>
              )}
            </li>
          ))}
        </ul>
      )}
    </div>
); } ``` ## Project Structure Here's a recommended project structure for Next.js with pushduck: ## Complete Example ### Upload Configuration ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getServerSession } from 'next-auth'; import { authOptions } from './auth'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { const timestamp = Date.now(); const randomId = Math.random().toString(36).substring(2, 8); return `${metadata.userId}/${timestamp}/${randomId}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures - single image, authenticated profilePicture: s3 .image() .maxFileSize("2MB") .maxFiles(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new Error("Authentication required"); return { userId: session.user.id, type: "profile" }; }), // Gallery images - multiple images, authenticated gallery: s3 .image() .maxFileSize("5MB") .maxFiles(10) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new Error("Authentication required"); return { userId: session.user.id, type: "gallery" }; }), // Documents - various file types, authenticated documents: s3 .file() .maxFileSize("10MB") .maxFiles(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new 
Error("Authentication required"); return { userId: session.user.id, type: "documents" }; }), // Public uploads - no authentication required public: s3 .image() .maxFileSize("1MB") .maxFiles(1) .formats(["jpeg", "png"]) // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ### API Route (App Router) ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; export const { GET, POST } = uploadRouter.handlers; ``` ### Upload Page ```typescript title="app/upload/page.tsx" 'use client'; import { upload } from '@/lib/upload-client'; import { useState } from 'react'; export default function UploadPage() { const [activeTab, setActiveTab] = useState<'profile' | 'gallery' | 'documents'>('profile'); const profileUpload = upload.profilePicture(); const galleryUpload = upload.gallery(); const documentsUpload = upload.documents(); const currentUpload = { profile: profileUpload, gallery: galleryUpload, documents: documentsUpload }[activeTab]; return (

    <div className="max-w-2xl mx-auto p-8">
      <h1 className="text-2xl font-bold mb-6">File Upload Demo</h1>

      {/* Tab Navigation */}
      <div className="flex gap-2 mb-6">
        {[
          { key: 'profile', label: 'Profile Picture', icon: 'πŸ‘€' },
          { key: 'gallery', label: 'Gallery', icon: 'πŸ–ΌοΈ' },
          { key: 'documents', label: 'Documents', icon: 'πŸ“„' }
        ].map(tab => (
          <button
            key={tab.key}
            onClick={() => setActiveTab(tab.key as typeof activeTab)}
            className={activeTab === tab.key ? 'font-semibold underline' : ''}
          >
            {tab.icon} {tab.label}
          </button>
        ))}
      </div>

      {/* Upload Interface */}
      <input
        type="file"
        multiple={activeTab !== 'profile'}
        onChange={(e) => {
          const files = Array.from(e.target.files || []);
          currentUpload.uploadFiles(files);
        }}
        disabled={currentUpload.isUploading}
        className="block w-full text-sm text-gray-500 file:mr-4 file:py-2 file:px-4 file:rounded-full file:border-0 file:text-sm file:font-semibold file:bg-blue-50 file:text-blue-700 hover:file:bg-blue-100"
      />

      {/* File List */}
      {currentUpload.files.length > 0 && (
        <ul className="mt-6 space-y-2">
          {currentUpload.files.map((file) => (
            <li key={file.id} className="flex items-center justify-between border rounded p-2">
              <span>{file.name}</span>
              <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
              <span>
                {file.status === 'success' && 'βœ…'}
                {file.status === 'error' && '❌'}
                {file.status === 'uploading' && '⏳'}
                {file.status === 'pending' && '⏸️'}
              </span>
              {file.status === 'success' && file.url && (
                <a href={file.url} target="_blank" rel="noreferrer">View</a>
              )}
            </li>
          ))}
        </ul>
      )}
    </div>
); } ``` ## Environment Variables ```bash title=".env.local" # Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_r2_access_key AWS_SECRET_ACCESS_KEY=your_r2_secret_key AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-account-id # Next.js Configuration NEXTAUTH_SECRET=your-nextauth-secret NEXTAUTH_URL=http://localhost:3000 ``` ## Deployment Considerations * Environment variables configured in dashboard * Edge Runtime compatible * Automatic HTTPS * Configure environment variables * Works with Netlify Functions * CDN integration available * Complete Next.js compatibility * Environment variable management * Automatic deployments *** **Next.js Ready**: Pushduck works seamlessly with both Next.js App Router and Pages Router, providing the same great developer experience across all Next.js versions. # Nitro/H3 (/docs/integrations/nitro-h3) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; ## Using pushduck with Nitro/H3 Nitro is a universal web server framework that powers Nuxt.js, built on top of H3 (HTTP framework). It uses Web Standards APIs and provides excellent performance with universal deployment. Since Nitro/H3 uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Nitro/H3 uses Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and universal deployment capabilities. 
## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; // Direct usage - no adapter needed! export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; // Method 1: Combined handler (recommended) export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); // Method 2: Method-specific handlers export default defineEventHandler(async (event) => { const method = getMethod(event); if (method === 'GET') { return uploadRouter.handlers.GET(event.node.req); } if (method === 'POST') { return uploadRouter.handlers.POST(event.node.req); } throw createError({ statusCode: 405, statusMessage: 'Method Not Allowed' }); }); ``` ### With H3 Utilities ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; import { defineEventHandler, getMethod, setHeader, createError } from 'h3'; export default defineEventHandler(async (event) => { // Handle CORS setHeader(event, 
'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); // Handle preflight requests if (getMethod(event) === 'OPTIONS') { return ''; } try { return await uploadRouter.handlers(event.node.req); } catch (error) { throw createError({ statusCode: 500, statusMessage: 'Upload failed', data: error }); } }); ``` ## Advanced Configuration ### Authentication with H3 ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getCookie } from 'h3'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with session authentication privateUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const cookies = req.headers.cookie; const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | undefined) { if (!cookieString) return {}; return Object.fromEntries( cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement 
your session validation logic return { id: 'user-123', username: 'demo-user' }; } ``` ## Standalone Nitro App ### Basic Nitro Setup ```typescript title="nitro.config.ts" export default defineNitroConfig({ srcDir: 'server', routeRules: { '/api/upload/**': { cors: true, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type' } } }, experimental: { wasm: true } }); ``` ### Server Entry Point ```typescript title="server/index.ts" import { createApp, toNodeListener } from 'h3'; import { uploadRouter } from './lib/upload'; const app = createApp(); // Upload routes app.use('/api/upload/**', defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); })); // Health check app.use('/health', defineEventHandler(() => ({ status: 'ok' }))); export default toNodeListener(app); ``` ## Client-Side Usage ### HTML with Vanilla JavaScript ```html title="public/index.html" File Upload Demo

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <title>File Upload Demo</title>
</head>
<body>
  <h1>File Upload Demo</h1>

  <h2>Image Upload</h2>
  <input type="file" id="image-input" accept="image/*" multiple />

  <h2>Document Upload</h2>
  <input type="file" id="document-input" multiple />

  <!-- Upload script omitted: wire these inputs to POST files to /api/upload/* -->
</body>
</html>

``` ### With Framework Integration ```typescript title="plugins/upload.client.ts" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "~/lib/upload"; export const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); ``` ## File Management ### File API Route ```typescript title="routes/api/files.get.ts" import { defineEventHandler, getQuery, createError } from 'h3'; export default defineEventHandler(async (event) => { const query = getQuery(event); const userId = query.userId as string; if (!userId) { throw createError({ statusCode: 400, statusMessage: 'User ID required' }); } // Fetch files from database const files = await getFilesForUser(userId); return { files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }; }); async function getFilesForUser(userId: string) { // Implement your database query logic return []; } ``` ### File Management Page ```html title="public/files.html" My Files

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <title>My Files</title>
</head>
<body>
  <h1>My Files</h1>

  <h2>Uploaded Files</h2>
  <ul id="file-list"></ul>

  <!-- Script omitted: fetch /api/files?userId=... and render into #file-list -->
</body>
</html>

``` ## Deployment Options ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'vercel-edge', // or 'vercel' for Node.js runtime }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'netlify-edge', // or 'netlify' for Node.js runtime }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'node-server', }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'cloudflare-workers', }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Nitro NITRO_PORT=3000 NITRO_HOST=0.0.0.0 ``` ## Performance Benefits ## Middleware and Plugins ```typescript title="middleware/cors.ts" export default defineEventHandler(async (event) => { if (event.node.req.url?.startsWith('/api/upload')) { setHeader(event, 'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); if (getMethod(event) === 'OPTIONS') { return ''; } } }); ``` ```typescript title="plugins/database.ts" export default async (nitroApp) => { // Initialize database connection console.log('Database plugin initialized'); // Add database to context nitroApp.hooks.hook('request', async (event) => { event.context.db = await getDatabase(); }); }; ``` ## Real-Time Upload Progress ```html title="public/advanced-upload.html" Advanced Upload
``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck and h3 are properly installed 3. **CORS issues**: Use Nitro's built-in CORS handling or middleware 4. **Environment variables**: Make sure they're accessible in your deployment environment ### Debug Mode Enable debug logging: ```typescript title="lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (process.env.NODE_ENV === "development") { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Nitro Configuration ```typescript title="nitro.config.ts" export default defineNitroConfig({ srcDir: 'server', buildDir: '.nitro', output: { dir: '.output', serverDir: '.output/server', publicDir: '.output/public' }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, }, experimental: { wasm: true } }); ``` Nitro/H3 provides an excellent foundation for building universal web applications with pushduck, offering flexibility, performance, and deployment options across any platform while maintaining full compatibility with Web Standards APIs. # Nuxt.js (/docs/integrations/nuxtjs) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; **🚧 Client-Side In Development**: Nuxt.js server-side integration is fully functional with Web Standards APIs. However, Nuxt.js-specific client-side components and hooks are still in development. You can use the standard pushduck client APIs for now. 
## Using pushduck with Nuxt.js Nuxt.js is the intuitive Vue.js framework for building full-stack web applications. It uses Web Standards APIs and provides excellent performance with server-side rendering. Since Nuxt.js uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Nuxt.js server routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` **Configure upload router** ```typescript title="server/utils/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="server/api/upload/[...path].ts" import { uploadRouter } from '~/server/utils/upload'; // Direct usage - no adapter needed! 
export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server/api/upload/[...path].ts" import { uploadRouter } from '~/server/utils/upload'; // Method 1: Combined handler (recommended) export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); // Method 2: Method-specific handlers export default defineEventHandler({ onRequest: [ // Add middleware here if needed ], handler: async (event) => { if (event.node.req.method === 'GET') { return uploadRouter.handlers.GET(event.node.req); } if (event.node.req.method === 'POST') { return uploadRouter.handlers.POST(event.node.req); } } }); ``` ### With Server Middleware ```typescript title="server/middleware/cors.ts" export default defineEventHandler(async (event) => { if (event.node.req.url?.startsWith('/api/upload')) { // Handle CORS for upload routes setHeader(event, 'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); if (event.node.req.method === 'OPTIONS') { return ''; } } }); ``` ## Advanced Configuration ### Authentication with Nuxt ```typescript title="server/utils/upload.ts" import { createUploadConfig } from 'pushduck/server'; import jwt from 'jsonwebtoken'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: s3 .image() .maxFileSize("5MB") 
.middleware(async ({ req }) => { const authHeader = req.headers.authorization; if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = jwt.verify(token, process.env.JWT_SECRET!) as any; return { userId: payload.sub, userRole: payload.role }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ## Client-Side Usage ### Upload Composable ```typescript title="composables/useUpload.ts" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "~/server/utils/upload"; export const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); ``` ### Upload Component ```vue title="components/FileUpload.vue" ``` ### Using in Pages ```vue title="pages/index.vue" ``` ## File Management ### Server-Side File API ```typescript title="server/api/files.get.ts" export default defineEventHandler(async (event) => { const query = getQuery(event); const userId = query.userId as string; if (!userId) { throw createError({ statusCode: 400, statusMessage: 'User ID required' }); } // Fetch files from database const files = await $fetch('/api/database/files', { query: { userId } }); return { files: files.map((file: any) => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }; }); ``` ### File Management Page ```vue title="pages/files.vue" ``` ## Deployment Options ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'vercel-edge', // or 'vercel' for Node.js runtime }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export 
default defineNuxtConfig({ nitro: { preset: 'netlify-edge', // or 'netlify' for Node.js runtime }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'node-server', }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'cloudflare-pages', }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # JWT Secret (for authentication) JWT_SECRET=your-jwt-secret # Nuxt NUXT_PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload ``` ## Performance Benefits ## Real-Time Upload Progress ```vue title="components/AdvancedUpload.vue" ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `server/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck is properly installed 3. **CORS issues**: Use server middleware for CORS configuration 4. **Runtime config**: Make sure environment variables are properly configured ### Debug Mode Enable debug logging: ```typescript title="server/utils/upload.ts" export const uploadRouter = createS3Router({ // ... 
routes }).middleware(async ({ req, file }) => { if (process.dev) { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Nitro Configuration ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { experimental: { wasm: true }, // Enable debugging in development devProxy: { '/api/upload': { target: 'http://localhost:3000/api/upload', changeOrigin: true } } } }); ``` Nuxt.js provides an excellent foundation for building full-stack Vue.js applications with pushduck, combining the power of Vue's reactive framework with Web Standards APIs and Nitro's universal deployment capabilities. # Qwik (/docs/integrations/qwik) import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; **🚧 Client-Side In Development**: Qwik server-side integration is fully functional with Web Standards APIs. However, Qwik-specific client-side components and hooks are still in development. You can use the standard pushduck client APIs for now. ## Using pushduck with Qwik Qwik is a revolutionary web framework focused on resumability and edge optimization. It uses Web Standards APIs and provides instant loading with minimal JavaScript. Since Qwik uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Qwik server endpoints use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and perfect for edge deployment. 
## Quick Setup **Install dependencies** ```bash npm install pushduck ``` **Configure upload router** ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!, bucket: import.meta.env.VITE_S3_BUCKET_NAME!, accountId: import.meta.env.VITE_R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().maxFileSize("5MB"), documentUpload: s3.file().maxFileSize("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; import { uploadRouter } from '~/lib/upload'; // Direct usage - no adapter needed! export const onGet: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; export const onPost: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; import { uploadRouter } from '~/lib/upload'; // Method 1: Combined handler (recommended) export const onRequest: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; // Method 2: Separate handlers (if you need method-specific logic) export const onGet: RequestHandler = async ({ request }) => { return uploadRouter.handlers.GET(request); }; export const onPost: RequestHandler = async ({ request }) => { return uploadRouter.handlers.POST(request); }; ``` ### With CORS Support ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; 
import { uploadRouter } from '~/lib/upload'; export const onRequest: RequestHandler = async ({ request, headers }) => { // Handle CORS preflight if (request.method === 'OPTIONS') { headers.set('Access-Control-Allow-Origin', '*'); headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); headers.set('Access-Control-Allow-Headers', 'Content-Type'); return new Response(null, { status: 200 }); } const response = await uploadRouter.handlers(request); // Add CORS headers to actual response headers.set('Access-Control-Allow-Origin', '*'); return response; }; ``` ## Advanced Configuration ### Authentication with Qwik ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!, bucket: import.meta.env.VITE_S3_BUCKET_NAME!, accountId: import.meta.env.VITE_R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .maxFileSize("5MB") .middleware(async ({ req }) => { const cookies = req.headers.get('Cookie'); const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .maxFileSize("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | null) { if (!cookieString) return {}; return Object.fromEntries( 
cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement your session validation logic return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Component ```tsx title="src/components/file-upload.tsx" import { component$, useSignal } from '@builder.io/qwik'; import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "~/lib/upload"; export const FileUpload = component$(() => { const uploadProgress = useSignal(0); const isUploading = useSignal(false); const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); const handleUploadComplete = $((files: any[]) => { console.log("Files uploaded:", files); alert("Upload completed!"); }); const handleUploadError = $((error: Error) => { console.error("Upload error:", error); alert(`Upload failed: ${error.message}`); }); return (

    <div class="file-upload">
      <section>
        <h2>Image Upload</h2>
        {/* Props are illustrative; wire handleUploadComplete / handleUploadError
            to the callback props your component version exposes */}
        <UploadButton endpoint="imageUpload" />
      </section>

      <section>
        <h2>Document Upload</h2>
        <UploadDropzone endpoint="documentUpload" />
      </section>
    </div>

); }); ``` ### Using in Routes ```tsx title="src/routes/index.tsx" import { component$ } from '@builder.io/qwik'; import type { DocumentHead } from '@builder.io/qwik-city'; import { FileUpload } from '~/components/file-upload'; export default component$(() => { return (

    <main class="container">
      <h1>File Upload Demo</h1>
      <FileUpload />
    </main>

); }); export const head: DocumentHead = { title: 'File Upload Demo', meta: [ { name: 'description', content: 'Qwik file upload demo with pushduck', }, ], }; ``` ## File Management ### Server-Side File Loader ```typescript title="src/routes/files/index.tsx" import { component$ } from '@builder.io/qwik'; import type { DocumentHead } from '@builder.io/qwik-city'; import { routeLoader$ } from '@builder.io/qwik-city'; import { FileUpload } from '~/components/file-upload'; export const useFiles = routeLoader$(async (requestEvent) => { const userId = 'current-user'; // Get from session/auth // Fetch files from database const files = await getFilesForUser(userId); return { files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }; }); export default component$(() => { const filesData = useFiles(); const formatFileSize = (bytes: number): string => { const sizes = ['Bytes', 'KB', 'MB', 'GB']; if (bytes === 0) return '0 Bytes'; const i = Math.floor(Math.log(bytes) / Math.log(1024)); return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i]; }; return (

    <div>
      <h1>My Files</h1>

      <FileUpload />

      <h2>Uploaded Files</h2>

      {filesData.value.files.length === 0 ? (
        <p>No files uploaded yet.</p>
      ) : (
        <ul>
          {filesData.value.files.map((file) => (
            <li key={file.id}>
              <strong>{file.name}</strong>
              <span>{formatFileSize(file.size)}</span>
              <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
              <a href={file.url} target="_blank" rel="noopener noreferrer">
                View File
              </a>
            </li>
          ))}
        </ul>
      )}
    </div>
  );
});

export const head: DocumentHead = {
  title: 'My Files',
};

async function getFilesForUser(userId: string) {
  // Implement your database query logic
  return [] as Array<{
    id: string;
    name: string;
    url: string;
    size: number;
    createdAt: string;
  }>;
}
```

## Deployment Options

The adapter wiring below is a simplified sketch. In practice, each adapter is installed with `npm run qwik add <integration>` (e.g. `cloudflare-pages`, `vercel-edge`, `netlify-edge`, `deno`), which generates the canonical adapter config for you.

**Cloudflare Pages**

```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { qwikCloudflarePages } from '@builder.io/qwik-city/adapters/cloudflare-pages/vite';

export default defineConfig(() => {
  return {
    plugins: [
      qwikCity({
        adapter: qwikCloudflarePages(),
      }),
      qwikVite(),
    ],
  };
});
```

**Vercel Edge**

```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { qwikVercel } from '@builder.io/qwik-city/adapters/vercel-edge/vite';

export default defineConfig(() => {
  return {
    plugins: [
      qwikCity({
        adapter: qwikVercel(),
      }),
      qwikVite(),
    ],
  };
});
```

**Netlify Edge**

```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { qwikNetlifyEdge } from '@builder.io/qwik-city/adapters/netlify-edge/vite';

export default defineConfig(() => {
  return {
    plugins: [
      qwikCity({
        adapter: qwikNetlifyEdge(),
      }),
      qwikVite(),
    ],
  };
});
```

**Deno**

```typescript title="vite.config.ts"
import { defineConfig } from 'vite';
import { qwikVite } from '@builder.io/qwik/optimizer';
import { qwikCity } from '@builder.io/qwik-city/vite';
import { qwikDeno } from '@builder.io/qwik-city/adapters/deno/vite';

export default defineConfig(() => {
  return {
    plugins: [
      qwikCity({
        adapter: qwikDeno(),
      }),
      qwikVite(),
    ],
  };
});
```

## Environment Variables

```bash title=".env"
# AWS Configuration (server-side only: secrets must not use the VITE_ prefix)
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_S3_BUCKET=your-bucket-name

# Qwik
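# Note: Vite inlines every VITE_-prefixed variable into the client bundle,
# so only values that are safe to expose publicly should carry the prefix.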
VITE_PUBLIC_UPLOAD_ENDPOINT=http://localhost:5173/api/upload
```

## Performance Benefits

Qwik's resumability pairs well with upload-heavy pages:

* **No hydration cost**: components resume on the client instead of re-executing, so the page is interactive immediately
* **Lazy-loaded handlers**: code behind `$(...)` boundaries (upload callbacks, progress handlers) downloads only when the user interacts
* **Small initial bundle**: upload logic does not ship until a file input or dropzone is actually used

## Real-Time Upload Progress

```tsx title="src/components/advanced-upload.tsx"
import { component$, useSignal, $ } from '@builder.io/qwik';

export const AdvancedUpload = component$(() => {
  const uploadProgress = useSignal(0);
  const isUploading = useSignal(false);

  const handleFileUpload = $(async (event: Event) => {
    const target = event.target as HTMLInputElement;
    const files = target.files;

    if (!files || files.length === 0) return;

    isUploading.value = true;
    uploadProgress.value = 0;

    try {
      // Simulate upload progress
      for (let i = 0; i <= 100; i += 10) {
        uploadProgress.value = i;
        await new Promise(resolve => setTimeout(resolve, 100));
      }

      alert('Upload completed!');
    } catch (error) {
      console.error('Upload failed:', error);
      alert('Upload failed!');
    } finally {
      isUploading.value = false;
      uploadProgress.value = 0;
    }
  });

  return (
    <div>
      <input
        type="file"
        onChange$={handleFileUpload}
        disabled={isUploading.value}
      />
      {isUploading.value && (
        <div>
          <progress value={uploadProgress.value} max={100} />
          <span>{uploadProgress.value}% uploaded</span>
        </div>
      )}
    </div>
  );
});
```

## Qwik City Form Integration

```tsx title="src/routes/upload-form/index.tsx"
import { component$ } from '@builder.io/qwik';
import type { DocumentHead } from '@builder.io/qwik-city';
import { routeAction$, Form, zod$, z } from '@builder.io/qwik-city';
import { FileUpload } from '~/components/file-upload';

export const useUploadAction = routeAction$(async (data, requestEvent) => {
  // Handle form submission
  // Files are already uploaded via pushduck, just save metadata
  console.log('Form data:', data);

  // Redirect to files page
  throw requestEvent.redirect(302, '/files');
}, zod$({
  title: z.string().min(1),
  description: z.string().optional(),
}));

export default component$(() => {
  const uploadAction = useUploadAction();

  return (

    <div>
      <h1>Upload Files</h1>