# AI & LLM Integration URL: /docs/ai-integration Access documentation content in AI-friendly formats for large language models and automated tools. *** title: AI & LLM Integration description: Access documentation content in AI-friendly formats for large language models and automated tools. --------------------------------------------------------------------------------------------------------------- # AI & LLM Integration Pushduck documentation provides AI-friendly endpoints that make it easy for large language models (LLMs) and automated tools to access and process our documentation content. ## Available Endpoints ### ๐Ÿ“„ Complete Documentation Export Access all documentation content in a single, structured format: ``` GET /llms.txt ``` This endpoint returns all documentation pages in a clean, AI-readable format with: * Page titles and URLs * Descriptions and metadata * Full content with proper formatting * Structured sections and hierarchies **Example Usage:** ```bash curl https://your-domain.com/llms.txt ``` ### ๐Ÿ“‘ Individual Page Access Access any documentation page's raw content by appending `.mdx` to its URL: ``` GET /docs/{page-path}.mdx ``` **Examples:** * `/docs/quick-start.mdx` - Quick start guide content * `/docs/api/hooks/use-upload.mdx` - Hook documentation * `/docs/guides/setup/aws-s3.mdx` - AWS S3 setup guide ## Use Cases ### ๐Ÿค– **AI Assistant Integration** * Train custom AI models on our documentation * Create chatbots that can answer questions about Pushduck * Build intelligent documentation search systems ### ๐Ÿ”ง **Development Tools** * Generate code examples and snippets * Create automated documentation tests * Build CLI tools that reference our docs ### ๐Ÿ“Š **Content Analysis** * Analyze documentation completeness * Track content changes over time * Generate documentation metrics ## Content Format The LLM endpoints return content in a structured format: ``` # Page Title URL: /docs/page-path Page description here # Section Headers Content with proper markdown formatting... ## Subsections - Lists and bullet points - Code blocks with syntax highlighting - Tables and structured data ``` ## Technical Details * **Caching**: Content is cached for optimal performance * **Processing**: Uses Remark pipeline with MDX and GFM support * **Format**: Clean markdown with frontmatter removed * **Encoding**: UTF-8 text format * **CORS**: Enabled for cross-origin requests ## Rate Limiting These endpoints are designed for programmatic access and don't have aggressive rate limiting. However, please be respectful: * Cache responses when possible * Avoid excessive automated requests * Use appropriate user agents for your tools ## Examples ### Python Script ```python import requests # Get all documentation response = requests.get('https://your-domain.com/llms.txt') docs_content = response.text # Get specific page page_response = requests.get('https://your-domain.com/docs/quick-start.mdx') page_content = page_response.text ``` ### Node.js/JavaScript ```javascript // Fetch all documentation const allDocs = await fetch("/llms.txt").then((r) => r.text()); // Fetch specific page const quickStart = await fetch("/docs/quick-start.mdx").then((r) => r.text()); ``` ### cURL ```bash # Download all docs to file curl -o pushduck-docs.txt https://your-domain.com/llms.txt # Get specific page content curl https://your-domain.com/docs/api/hooks/use-upload.mdx ``` ## Integration with Popular AI Tools ### OpenAI GPT Use the `/llms.txt` endpoint to provide context about Pushduck in your GPT conversations. 
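For example, here is a minimal sketch of supplying the export as context for a chat completion (it assumes the official `openai` Node SDK; the domain and model name are placeholders):

```typescript
import OpenAI from "openai";

const openai = new OpenAI(); // expects OPENAI_API_KEY in the environment

export async function askAboutPushduck(question: string) {
  // Pull the AI-friendly documentation export
  const docs = await fetch("https://your-domain.com/llms.txt").then((r) => r.text());

  // Supply the docs as system context for the model
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: `Answer questions using this Pushduck documentation:\n\n${docs}` },
      { role: "user", content: question },
    ],
  });

  return completion.choices[0].message.content;
}
```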
### Claude/Anthropic Feed documentation content to Claude for detailed analysis and code generation. ### Local LLMs Download content for training or fine-tuning local language models. *** These AI-friendly endpoints make it easy to integrate Pushduck documentation into your development workflow and AI-powered tools! # Examples & Demos URL: /docs/examples Experience pushduck with interactive demos and real-world examples. All demos use live Cloudflare R2 integration. *** title: Examples & Demos description: Experience pushduck with interactive demos and real-world examples. All demos use live Cloudflare R2 integration. ------------------------------------------------------------------------------------------------------------------------------ import { Callout } from "fumadocs-ui/components/callout"; import { Tabs, Tab } from "fumadocs-ui/components/tabs"; **Live Demos:** These are fully functional demos using real Cloudflare R2 storage. Files are uploaded to a demo bucket and may be automatically cleaned up. Don't upload sensitive information. **Having Issues?** If uploads aren't working (especially with `next dev --turbo`), check our [Troubleshooting Guide](/docs/api/troubleshooting) for common solutions including the known Turbo mode compatibility issue. ## Interactive Upload Demo The full-featured demo showcasing all capabilities: **ETA & Speed Tracking:** Upload speed (MB/s) and estimated time remaining (ETA) appear below the progress bar during active uploads. Try uploading larger files (1MB+) to see these metrics in action! ETA becomes more accurate after the first few seconds of upload. ## Image-Only Upload Focused demo for image uploads with preview capabilities: ## Document Upload Streamlined demo for document uploads: ## Key Features Demonstrated ### โœ… **Type-Safe Client** ```typescript // Property-based access with full TypeScript inference const imageUpload = upload.imageUpload(); const fileUpload = upload.fileUpload(); // No string literals, no typos, full autocomplete await imageUpload.uploadFiles(selectedFiles); ``` ### โšก **Real-Time Progress** * Individual file progress tracking with percentage completion * Upload speed monitoring (MB/s) with live updates * ETA calculations showing estimated time remaining * Pause/resume functionality (coming soon) * Comprehensive error handling with retry mechanisms ### ๐Ÿ”’ **Built-in Validation** * File type validation (MIME types) * File size limits with user-friendly errors * Custom validation middleware * Malicious file detection ### ๐ŸŒ **Provider Agnostic** * Same code works with any S3-compatible provider * Switch between Cloudflare R2, AWS S3, DigitalOcean Spaces * Zero vendor lock-in ## Code Examples ```typescript "use client"; import { upload } from "@/lib/upload-client"; export function SimpleUpload() { const { uploadFiles, files, isUploading } = upload.imageUpload(); return (
    <div>
      <input
        type="file"
        multiple
        accept="image/*"
        onChange={(e) => uploadFiles(Array.from(e.target.files || []))}
        disabled={isUploading}
      />
      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
); } ```
```typescript // app/api/upload/route.ts import { createUploadConfig } from "pushduck/server"; const { s3, } = createUploadConfig() .provider("cloudflareR2",{ accountId: process.env.CLOUDFLARE_ACCOUNT_ID!, bucket: process.env.R2_BUCKET!, }) .defaults({ maxFileSize: "10MB", acl: "public-read", }) .build(); const uploadRouter = s3.createRouter({ imageUpload: s3 .image() .max("5MB") .formats(["jpeg", "png", "webp"]) .middleware(async ({ file, metadata }) => { // Custom authentication and metadata const session = await getServerSession(); if (!session) throw new Error("Unauthorized"); return { ...metadata, userId: session.user.id, uploadedAt: new Date().toISOString(), }; }) .onUploadComplete(async ({ file, url, metadata }) => { // Post-upload processing console.log(`Upload complete: ${url}`); await saveToDatabase({ url, metadata }); }), }); export const { GET, POST } = uploadRouter.handlers; export type AppRouter = typeof uploadRouter; ``` ```typescript "use client"; import { upload } from "@/lib/upload-client"; export function RobustUpload() { const { uploadFiles, files, errors, reset } = upload.imageUpload(); const handleUpload = async (fileList: FileList) => { try { await uploadFiles(Array.from(fileList)); } catch (error) { console.error("Upload failed:", error); // Error is automatically added to the errors array } }; return (
    <div>
      <input
        type="file"
        multiple
        accept="image/*"
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
      />

      {/* Display errors */}
      {errors.length > 0 && (
        <div role="alert">
          <p>Upload Errors:</p>
          <ul>
            {errors.map((error, index) => (
              <li key={index}>{error}</li>
            ))}
          </ul>
        </div>
      )}

      {/* Display files with status */}
      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          <span>{file.status}</span>
          {file.status === "uploading" && (
            <progress value={file.progress} max={100} />
          )}
          {file.status === "error" && (
            <span>{file.error}</span>
          )}
          {file.status === "success" && file.url && (
            <a href={file.url} target="_blank" rel="noreferrer">View File</a>
          )}
        </div>
      ))}
    </div>
); } ```
## Real-World Use Cases ### **Profile Picture Upload** Single image upload with instant preview and crop functionality. ### **Document Management** Multi-file document upload with categorization and metadata. ### **Media Gallery** Batch image upload with automatic optimization and thumbnail generation. ### **File Sharing** Secure file upload with expiration dates and access controls. ## Next Steps
* ⚡ **Quick Start** - Get set up in 2 minutes with our CLI
* 📚 **API Reference** - Complete API documentation
* ☁️ **Providers** - Configure your storage provider
* 🎬 **Full Demo** - Complete upload experience
# Pushduck URL: /docs Own your file uploads. The most comprehensive upload solution for Next.js. *** title: Pushduck description: Own your file uploads. The most comprehensive upload solution for Next.js. --------------------------------------------------------------------------------------- import { Card, Cards } from "fumadocs-ui/components/card"; import { Step, Steps } from "fumadocs-ui/components/steps"; # Own Your File Uploads *File uploads in Next.js have been overcomplicated for too long. Developers shouldn't need to cobble together multiple libraries, write custom middleware, and manage complex state just to handle file uploads. We believe the TypeScript ecosystem deserves better—hence, Pushduck.* **The most comprehensive file upload solution for Next.js.** Guided setup, full TypeScript support, and everything you need out of the box.

```typescript
// It's really this simple
const upload = createUploadClient();

export function MyComponent() {
  const { uploadFiles, uploadedFiles, isUploading } = upload.imageUpload();

  return (
    <input
      type="file"
      multiple
      onChange={(e) => e.target.files && uploadFiles(Array.from(e.target.files))}
      disabled={isUploading}
    />
  );
}
```
## Why Pushduck? File uploads should be **simple**, **secure**, and **scalable**. Other solutions make you choose between ease of use and control, or require vendor lock-in and ongoing costs. Pushduck gives you: * **Full ownership** of your upload infrastructure * **Zero vendor lock-in** with provider-agnostic design * **Production-grade features** without the complexity * **Type-safe development** with full TypeScript inference * **Community-driven** with real-world usage patterns ## Get Started
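The fastest path is the interactive CLI covered in the [Quick Start](/docs/quick-start) guide; it installs dependencies, generates a type-safe API route and client, and configures your storage bucket:

```bash
npx @pushduck/cli@latest init
```

Prefer full control? Follow the [Manual Setup](/docs/getting-started/manual-setup) guide instead.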
## Loved by Developers > "Finally, an upload solution that just works. The TypeScript inference is incredible - I get autocomplete for everything and catch errors before they hit production." — **React Developer**, SaaS Startup > "We migrated from Uploadthing to pushduck and cut our upload costs by 80%. The provider-agnostic design means we can switch S3-compatible providers anytime." — **CTO**, E-commerce Platform > "The property-based client approach is genius. No more passing route names as strings - everything is type-safe and the DX is outstanding." — **Full-Stack Developer**, Agency *** ## Framework Agnostic While optimized for Next.js, it works seamlessly across the JavaScript ecosystem.
## What's Included Everything you need for production file uploads: * โœ… **Validation & Security** - File type, size, and custom validation * โœ… **Overall Progress Tracking** - Real-time aggregate progress, speed, and ETA across all files * โœ… **Error Handling** - Comprehensive error states and recovery * โœ… **Middleware System** - Custom logic for authentication, metadata, and processing * โœ… **Type Inference** - Full TypeScript safety from server to client * โœ… **Provider Support** - Cloudflare R2, AWS S3, DigitalOcean, MinIO, and more * โœ… **Image Processing** - Built-in Sharp integration for optimization * โœ… **Drag & Drop** - Ready-to-use components and hooks * โœ… **Multi-file Support** - Concurrent uploads with progress aggregation # Quick Start URL: /docs/quick-start Get production-ready file uploads working in your Next.js app in under 2 minutes. *** title: Quick Start description: Get production-ready file uploads working in your Next.js app in under 2 minutes. ---------------------------------------------------------------------------------------------- import { Step, Steps } from "fumadocs-ui/components/steps"; import { Callout } from "fumadocs-ui/components/callout"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; # Quick Start Get **production-ready file uploads** working in your Next.js app in under 2 minutes with our CLI tool. Interactive setup, just one command. **๐Ÿš€ New!** Use our CLI for instant setup: `npx @pushduck/cli@latest init` - handles everything automatically! ## Choose Your Setup Method ### โšก Interactive CLI Setup Get everything set up instantly with our interactive CLI: ```bash npx @pushduck/cli@latest init ``` That's it! The CLI will: * โœ… Install dependencies automatically * โœ… Set up your chosen provider (AWS S3, Cloudflare R2, etc.) * โœ… Create API routes with type safety * โœ… Generate example components * โœ… Configure environment variables * โœ… Create and configure your S3 bucket **What you get:** * Production-ready upload API in `app/api/upload/route.ts` * Type-safe upload client in `lib/upload-client.ts` * Example components in `components/ui/` * Working demo page in `app/upload/page.tsx` [**๐Ÿ“š Full CLI Documentation โ†’**](/docs/guides/setup/cli-setup) **Example CLI Output:** ``` โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ” โ”‚ โ”‚ โ”‚ ๐Ÿš€ Welcome to Pushduck โ”‚ โ”‚ โ”‚ โ”‚ Let's get your file uploads working in 2 minutes! โ”‚ โ”‚ โ”‚ โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜ ๐Ÿ” Detecting your project... โœ“ Next.js App Router detected โœ“ TypeScript configuration found ? Which cloud storage provider would you like to use? โฏ AWS S3 (recommended) Cloudflare R2 (S3-compatible, global edge) DigitalOcean Spaces (simple, affordable) โœจ Generated files: โ”œโ”€โ”€ app/api/upload/route.ts โ”œโ”€โ”€ app/upload/page.tsx โ”œโ”€โ”€ components/ui/upload-button.tsx โ”œโ”€โ”€ lib/upload-client.ts โ””โ”€โ”€ .env.example ๐ŸŽ‰ Setup complete! Your uploads are ready. 
``` ### ๐Ÿ”ง Manual Setup If you prefer to set things up manually or need custom configuration: ## Prerequisites * Next.js 13+ with App Router * An S3-compatible storage provider (we'll use AWS S3 in this guide) * Node.js 18+ ## Install Pushduck ```bash npm install pushduck ``` **Using a different package manager?** ```bash npm install pushduck ``` ```bash pnpm add pushduck ``` ```bash yarn add pushduck ``` ```bash bun add pushduck ``` ## Set Environment Variables Create a `.env.local` file in your project root with your S3 credentials: ```dotenv # .env.local AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_REGION=us-east-1 AWS_S3_BUCKET_NAME=your-bucket-name ``` **Don't have S3 credentials yet?** Follow our [AWS S3 setup guide](/docs/guides/setup/aws-s3) to create a bucket and get your credentials in 2 minutes. ## Create Your Upload Router Create an API route to handle file uploads: ```typescript // app/api/s3-upload/route.ts import { createUploadConfig } from "pushduck/server"; const { s3 } = createUploadConfig() .provider("cloudflareR2",{ accountId: process.env.CLOUDFLARE_ACCOUNT_ID!, accessKeyId: process.env.CLOUDFLARE_ACCESS_KEY_ID!, secretAccessKey: process.env.CLOUDFLARE_SECRET_ACCESS_KEY!, bucket: process.env.CLOUDFLARE_BUCKET_NAME!, region: "auto", }) .build(); const router = s3.createRouter({ // Define your upload routes with validation imageUpload: s3 .image() .max("10MB") .formats(["jpeg", "png", "webp"]), documentUpload: s3.file().max("50MB").types(["application/pdf", "application/msword"]), }); export { router as POST }; // Export the router type for client-side type safety export type Router = typeof router; ``` **What's happening here?** - `s3.createRouter()` creates a type-safe upload handler - `s3.image()` and `s3.file()` provide validation and TypeScript inference - The router automatically handles presigned URLs, validation, and errors - Exporting the type enables full client-side type safety ## Create Upload Client Create a type-safe client for your components using the **recommended structured approach**: **Recommended**: The structured client provides the best developer experience with property-based access, centralized configuration, and enhanced type safety. ```typescript // lib/upload-client.ts import { createUploadClient } from "pushduck/client"; import type { Router } from "@/app/api/s3-upload/route"; // Create a type-safe upload client (recommended) export const upload = createUploadClient({ endpoint: "/api/s3-upload", }); ``` **Why this approach is recommended:** * โœ… **Full type inference** from your server router * โœ… **Property-based access** - `upload.imageUpload()` instead of strings * โœ… **IntelliSense support** - see all available endpoints * โœ… **Refactoring safety** - rename routes with confidence * โœ… **Centralized config** - set headers, timeouts, and options once **Alternative**: You can also use the hook-based approach if you prefer traditional React patterns: ```typescript // With type parameter (recommended) const { uploadFiles } = useUploadRoute('imageUpload') // Or without type parameter (also works) const { uploadFiles } = useUploadRoute('imageUpload') ``` The structured client is still recommended for most use cases. 
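For completeness, here is a sketch of the hook-based alternative with the type parameter spelled out (the generic usage and the `pushduck` import path follow the hook examples elsewhere in these docs and may differ slightly in your version):

```typescript
// components/hook-based-uploader.tsx
"use client";

import { useUploadRoute } from "pushduck";
import type { Router } from "@/app/api/s3-upload/route";

export function HookBasedUploader() {
  // Passing the router type gives the hook the same inference
  // as the structured client above.
  const { uploadFiles } = useUploadRoute<Router>("imageUpload");

  return (
    <input
      type="file"
      multiple
      onChange={(e) => e.target.files && uploadFiles(Array.from(e.target.files))}
    />
  );
}
```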
## Use in Your Components Now you can use the upload client in any component with full type safety: ```typescript // components/image-uploader.tsx "use client"; import { upload } from "@/lib/upload-client"; export function ImageUploader() { const { uploadFiles, uploadedFiles, isUploading, progress, error } = upload.imageUpload(); const handleFileChange = (e: React.ChangeEvent) => { const files = e.target.files; if (files) { uploadFiles(Array.from(files)); } }; return (
    <div>
      <input
        type="file"
        accept="image/*"
        multiple
        onChange={handleFileChange}
        disabled={isUploading}
      />

      {isUploading && (
        <p>Uploading... {Math.round(progress)}%</p>
      )}

      {error && (
        <p role="alert">{error.message}</p>
      )}

      {uploadedFiles.length > 0 && (
        <div>
          {uploadedFiles.map((file) => (
            <div key={file.url}>
              <img src={file.url} alt="Uploaded image" />
              <p>{file.name}</p>
            </div>
          ))}
        </div>
      )}
    </div>
); } ``` ## Add to Your Page Finally, use your upload component in any page: ```typescript // app/page.tsx import { ImageUploader } from "@/components/image-uploader"; export default function HomePage() { return (

    <main>
      <h1>Upload Images</h1>
      <ImageUploader />
    </main>
); } ```
## ๐ŸŽ‰ Congratulations! You now have **production-ready file uploads** working in your Next.js app! Here's what you accomplished: * โœ… **Type-safe uploads** with full TypeScript inference * โœ… **Automatic validation** for file types and sizes * โœ… **Progress tracking** with loading states * โœ… **Error handling** with user-friendly messages * โœ… **Secure uploads** using presigned URLs * โœ… **Multiple file support** with image preview ## What's Next? Now that you have the basics working, explore these advanced features:

* 🎨 **Enhanced UI** - Add drag & drop, progress bars, and beautiful components (Image Upload Guide →)
* 🔒 **Custom Validation** - Add authentication, custom metadata, and middleware (Router Configuration →)
* ☁️ **Other Providers** - Switch to Cloudflare R2, DigitalOcean, or MinIO (Provider Setup →)
* ⚡ **Enhanced Client** - Upgrade to property-based access for better DX (Migration Guide →)
## Need Help? * ๐Ÿ“– **Documentation**: Explore our comprehensive [guides](/docs/guides) * ๐Ÿ’ฌ **Community**: Join our [Discord community](https://discord.gg/pushduck) * ๐Ÿ› **Issues**: Report bugs on [GitHub](https://github.com/abhay-ramesh/pushduck) * ๐Ÿ“ง **Support**: Email us at [support@pushduck.com](mailto:support@pushduck.com) **Loving Pushduck?** Give us a โญ on [GitHub](https://github.com/abhay-ramesh/pushduck) and help spread the word! # Roadmap URL: /docs/roadmap Our vision for the future of file uploads in Next.js *** title: Roadmap description: Our vision for the future of file uploads in Next.js ----------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { File, Folder, Files } from "fumadocs-ui/components/files"; import { TypeTable } from "fumadocs-ui/components/type-table"; # Roadmap Our mission is to make file uploads **simple**, **secure**, and **scalable** for every developer and every use case. ## โœ… Completed ### Core Foundation โœ… **Universal Compatibility** - Works with 16+ frameworks and edge runtimes\ โœ… **Type-Safe APIs** - Full TypeScript inference from server to client\ โœ… **Multi-Provider Support** - AWS S3, Cloudflare R2, DigitalOcean Spaces, MinIO\ โœ… **Production Security** - Presigned URLs, file validation, CORS handling\ โœ… **Developer Experience** - Property-based client, comprehensive error handling\ โœ… **Overall Progress Tracking** - Now provides real-time aggregate progress metrics * `progress` - 0-100% completion across all files * `uploadSpeed` - Combined transfer rate in bytes/second * `eta` - Overall time remaining in seconds ### Setup & Tooling โœ… **Interactive CLI** - Guided setup with smart defaults and auto-detection\ โœ… **Code Generation** - Type-safe API routes and client components\ โœ… **Framework Detection** - Automatic Next.js App Router/Pages Router detection\ โœ… **Environment Setup** - Automated credential configuration ### Documentation & Examples โœ… **Comprehensive Docs** - Complete API reference and integration guides\ โœ… **Live Examples** - Working demos for all supported frameworks\ โœ… **Migration Guides** - Step-by-step migration from other solutions\ โœ… **Best Practices** - Security, performance, and architecture guidance ## โš ๏ธ Current Limitations ### Progress Tracking Constraints โœ… **Overall Progress Tracking** - Now provides real-time aggregate progress metrics * `progress` - 0-100% completion across all files * `uploadSpeed` - Combined transfer rate in bytes/second * `eta` - Overall time remaining in seconds ### Upload Control Limitations Current upload management has constraints for handling real-world scenarios: โŒ **No Resumable Uploads** - Cannot resume interrupted uploads from where they left off\ โŒ **No Pausable Uploads** - Cannot pause ongoing uploads and resume later\ โŒ **No Cancel Support** - Cannot cancel individual uploads in progress\ โŒ **Limited Network Resilience** - No automatic retry on network failures or connection switching These limitations may be addressed in future releases based on community feedback and use case requirements. 
## ๐Ÿšง In Progress ### Enhanced Developer Experience ๐Ÿšง **Visual Studio Code Extension** - IntelliSense, snippets, and debugging tools\ ๐Ÿšง **Enhanced Error Messages** - Contextual help and troubleshooting suggestions\ ๐Ÿšง **Performance Monitoring** - Built-in metrics and optimization recommendations ### Advanced Features ๐Ÿšง **Image Processing Pipeline** - Automatic optimization, resizing, and format conversion\ ๐Ÿšง **Video Processing** - Transcoding, thumbnail generation, and streaming support\ ๐Ÿšง **Advanced Validation** - Content scanning, virus detection, and custom rules ## ๐Ÿ“‹ Planned ### Q3 2025 - Enterprise Features * **Advanced Analytics** - Upload metrics, performance insights, and usage tracking * **Enhanced Hook APIs** - onProgress callbacks and advanced upload state management * **Advanced Upload Control** - Resumable, pausable uploads with cancel support and network resilience * **Team Management** - Multi-user access, role-based permissions, and audit logs * **Advanced Security** - Content scanning, encryption at rest, and compliance tools * **SLA & Support** - Enterprise support plans and guaranteed uptime ### Q4 2025 - Platform Expansion * **Mobile SDKs** - React Native, Flutter, and native iOS/Android support * **Desktop Applications** - Electron and Tauri integration * **Serverless Optimization** - Enhanced edge runtime support and cold start optimization * **Global CDN** - Built-in content delivery and edge caching ### Q1 2026 - AI Integration * **Smart Tagging** - Automatic content categorization and metadata extraction * **Image Recognition** - Object detection, OCR, and content moderation * **Intelligent Compression** - AI-powered optimization for different use cases * **Content Insights** - Usage patterns and optimization recommendations ### Q2 2026 - Ecosystem Growth * **Plugin Architecture** - Extensible middleware system for custom workflows * **Third-party Integrations** - CMS platforms, e-commerce solutions, and productivity tools * **Community Templates** - Shared configurations and best practices * **Certification Program** - Training and certification for developers and teams ## ๐ŸŽฏ Long-term Vision ### Universal File Management Platform Transform pushduck from a upload library into a comprehensive file management platform that handles the entire file lifecycle: * **Intelligent Storage** - Automatic tier management and cost optimization * **Global Distribution** - Multi-region replication and edge delivery * **Advanced Processing** - Real-time transformation and processing pipelines * **Collaborative Features** - Shared workspaces, comments, and version control ### Developer Ecosystem Build a thriving ecosystem around pushduck: * **Marketplace** - Community-driven plugins, templates, and integrations * **Certification** - Professional training and certification programs * **Events & Community** - Conferences, meetups, and developer advocacy * **Enterprise Solutions** - Custom implementations and consulting services ## ๐Ÿ’ก Ideas & Suggestions Have ideas for pushduck? We'd love to hear them! * [Feature Requests](https://github.com/abhay-ramesh/pushduck/discussions/categories/ideas) * [Community Discord](https://discord.gg/pushduck) * [Developer Survey](https://forms.gle/pushduck-feedback) ## ๐Ÿค Contributing Want to help build the future of file uploads? Check out our [Contributing Guide](/docs/contributing) to get started. 
### Current Priorities We're actively looking for contributors in these areas: * **Framework Integrations** - Help us support more frameworks and platforms * **Documentation** - Improve guides, examples, and API documentation * **Testing** - Expand test coverage and add integration tests * **Performance** - Optimize bundle size and runtime performance * **Security** - Security audits and vulnerability assessments *** *Last updated: June 2025* This roadmap is community-driven. **Your feedback shapes our priorities.** Join our [Discord](https://discord.gg/pushduck) or open an issue on [GitHub](https://github.com/abhay-ramesh/pushduck) to influence what we build next. ## Current Status We've already solved the core problems that have frustrated developers for years: โœ… **Interactive CLI** - Guided setup with smart defaults and auto-detection\ โœ… **Type Safety** - Full TypeScript inference for upload schemas\ โœ… **Multiple Providers** - Cloudflare R2, AWS S3, Google Cloud, and more\ โœ… **Production Ready** - Used by teams processing millions of uploads\ โœ… **Developer Experience** - Property-based client access with enhanced IntelliSense ## What's Next ### ๐Ÿš€ Q3 2025: Developer Experience Revolution The CLI will automatically detect your project structure and configure everything: ```bash npx @pushduck/cli@latest init # โœ… Detected Next.js 14 with App Router # โœ… Created upload route at /api/upload # โœ… Added environment variables to .env.local # โœ… Generated type-safe upload config ``` {" "} No more YAML or complex configuration files. Build upload pipelines visually and export to code with real-time preview of your upload components. Hot-reload your upload components with live preview of validation, progress, and error states. Perfect for rapid prototyping and design iteration. Complete control over upload lifecycle with automatic recovery from network issues: ```typescript const { files, uploadFiles, pauseUpload, resumeUpload, cancelUpload } = upload.images // Pause individual uploads await pauseUpload(fileId) // Resume from where it left off await resumeUpload(fileId) // Cancel with cleanup await cancelUpload(fileId) // Automatic network resilience const config = { retryAttempts: 3, networkSwitchTolerance: true, resumeOnReconnect: true } ``` ### ๐Ÿ”’ Q4 2025: Enterprise & Security ```typescript const s3Router = s3.createRouter({ images: s3.image() .permissions(["user:upload", "admin:all"]) .auditLog(true) .compliance("SOC2") }) ``` Built-in integration with leading security providers for automatic threat detection, content moderation, and policy enforcement. Pre-built compliance workflows and automatic data handling policies with audit trails and data retention management. ### โšก Q1 2026: Performance & Scale ```typescript const s3Router = s3.createRouter({ images: s3.image() .processing({ resize: { width: 800, height: 600 }, format: "webp", edge: true // Process at nearest edge location }) }) ``` Automatic cache warming and smart invalidation strategies for optimal performance. Includes built-in CDN integration with major providers. Real-time dashboard showing upload success rates, processing times, storage costs, and performance bottlenecks with actionable insights. 
### ๐ŸŒ Q2 2026: Ecosystem Expansion Complete framework support with the same developer experience: ```typescript // Vue 3 Composition API import { createUploadClient } from '@pushduck/vue' const upload = createUploadClient({ endpoint: '/api/upload' }) const { files, uploadFiles, isUploading } = upload.imageUpload ``` ```typescript // Svelte stores import { uploadStore } from '@pushduck/svelte' const upload = uploadStore('/api/upload') // Reactive stores for upload state $: ({ files, isUploading } = $upload.imageUpload) ``` ```typescript // Pure JavaScript import { UploadClient } from '@pushduck/core' const client = new UploadClient('/api/upload') client.upload('imageUpload', files) .on('progress', (progress) => console.log(progress)) .on('complete', (urls) => console.log(urls)) ``` {" "} Native mobile SDKs with the same type-safe API you love on the web. Full offline support with automatic retry and background uploads. Automatic alt text generation, content categorization, duplicate detection, and smart compression based on content analysis. ## Community Roadmap ### What You're Asking For Based on community feedback, GitHub issues, and Discord discussions: **๐Ÿ”ฅ High Priority (Next 3 months)** - **Drag & Drop File Manager** - Visual file organization and bulk operations - **Video Processing Pipeline** - Automatic transcoding and thumbnail generation - **Better Error Messages** - More helpful error descriptions with suggested fixes - **Upload Resume** - Automatic retry and resume for failed large file uploads - **Real-time Collaboration** - Multiple users uploading to shared spaces *These features have 100+ upvotes across GitHub and Discord* {" "} **๐Ÿ’ญ Exploring (6 months)** - **GraphQL Integration** - Native GraphQL subscription support for upload progress - **Webhook Builder** - Visual webhook configuration for upload events - **Template Gallery** - Pre-built upload components for common use cases - **A/B Testing** - Built-in experimentation for upload flows - **White-label Solution** - Fully customizable upload interface *Join the discussion on these features in our Discord* **๐Ÿ”ฎ Vision (12+ months)** - **No-Code Integration** - Zapier/Make.com connectors - **Blockchain Storage** - IPFS and decentralized storage options * **AI-Powered Optimization** - Automatic performance tuning - **Cross-Platform Desktop** - Electron-based upload manager - **Enterprise Marketplace** - Plugin ecosystem for custom integrations *These are aspirational goals that depend on community growth* ## How We Prioritize Our roadmap is driven by three key factors: 1. **Community Impact** - Features that solve real problems for the most developers 2. **Technical Excellence** - Maintaining our high standards for type safety and DX 3. **Ecosystem Health** - Building a sustainable, long-term solution ### Voting on Features Have an idea or want to prioritize something? Here's how to influence our roadmap: Use our feature request template with use cases and expected API design. Include code examples and real-world scenarios. {" "} Join our Discord server where we run monthly polls on upcoming features. Your vote directly influences our development priorities. First Friday of every month at 10 AM PT - open to all developers. Share your use cases and help shape the future. 
## Behind the Scenes ### What We're Working on Right Now **Week of June 23, 2025:** * ๐Ÿ”จ Enhanced type inference for nested upload schemas * ๐Ÿงช Testing framework for upload workflows * ๐Ÿ“š Interactive examples in documentation * ๐Ÿ› Bug fixes for edge cases in multi-part uploads Follow our [GitHub project board](https://github.com/abhay-ramesh/pushduck/projects) for real-time updates on development progress. ### Development Structure Our development process is organized around clear modules: ### Core Principles As we build new features, we never compromise on: * **Type Safety First** - Every feature must have full TypeScript support * **Zero Breaking Changes** - Backward compatibility is non-negotiable * **Performance by Default** - New features can't slow down existing workflows * **Developer Happiness** - If it's not delightful to use, we rebuild it ## Get Involved This roadmap exists because of developers like you. Here's how to shape the future: ### For Users * **Share your use case** - Tell us what you're building * **Report pain points** - What's still too complicated? * **Request integrations** - Which providers or tools do you need? ### For Contributors * **Code contributions** - Check our [contributing guide](https://github.com/abhay-ramesh/pushduck/blob/main/CONTRIBUTING.md) * **Documentation** - Help improve examples and guides * **Community support** - Answer questions in Discord and GitHub ### For Organizations * **Sponsorship** - Support full-time development * **Enterprise feedback** - Share your scale challenges * **Partnership** - Integrate pushduck with your platform *** **Ready to build the future of file uploads?** Join our [Discord community](https://discord.gg/pushduck) and help us make file uploads delightful for every Next.js developer. # S3 Router URL: /docs/api/s3-router Type-safe upload routes with schema validation and middleware *** title: S3 Router description: Type-safe upload routes with schema validation and middleware -------------------------------------------------------------------------- # S3 Router The S3 router provides a type-safe way to define upload endpoints with schema validation, middleware, and lifecycle hooks. 
## Basic Router Setup ```typescript // app/api/upload/route.ts import { s3 } from '@/lib/upload' const s3Router = s3.createRouter({ imageUpload: s3 .image() .max('5MB') .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ file, metadata }) => { // Add authentication and user context return { ...metadata, userId: 'user-123', uploadedAt: new Date().toISOString(), } }), documentUpload: s3 .file() .max('10MB') .types(['application/pdf', 'text/plain']) .paths({ prefix: 'documents', }), }) // Export the handler export const { GET, POST } = s3Router.handlers; ``` ## Schema Builders ### Image Schema ```typescript s3.image() .max('5MB') .formats(['jpeg', 'jpg', 'png', 'webp', 'gif']) .dimensions({ minWidth: 100, maxWidth: 2000 }) .quality(0.8) // JPEG quality ``` ### File Schema ```typescript s3.file() .max('10MB') .types(['application/pdf', 'text/plain', 'application/json']) .extensions(['pdf', 'txt', 'json']) ``` ### Object Schema (Multiple Files) ```typescript s3.object({ images: s3.image().max('5MB').count(5), documents: s3.file().max('10MB').count(2), thumbnail: s3.image().max('1MB').count(1), }) ``` ## Route Configuration ### Middleware Add authentication, validation, and metadata: ```typescript .middleware(async ({ file, metadata, req }) => { // Authentication const user = await authenticateUser(req) if (!user) { throw new Error('Authentication required') } // File validation if (file.size > 10 * 1024 * 1024) { throw new Error('File too large') } // Return enriched metadata return { ...metadata, userId: user.id, userRole: user.role, uploadedAt: new Date().toISOString(), ipAddress: req.headers.get('x-forwarded-for'), } }) ``` ### Path Configuration Control where files are stored: ```typescript .paths({ // Simple prefix prefix: 'user-uploads', // Custom path generation generateKey: (ctx) => { const { file, metadata, routeName } = ctx const userId = metadata.userId const timestamp = Date.now() return `${routeName}/${userId}/${timestamp}/${file.name}` }, // Simple suffix suffix: 'processed', }) ``` ### Lifecycle Hooks React to upload events: ```typescript .onUploadStart(async ({ file, metadata }) => { console.log(`Starting upload: ${file.name}`) // Log to analytics await analytics.track('upload_started', { userId: metadata.userId, filename: file.name, fileSize: file.size, }) }) .onUploadComplete(async ({ file, url, metadata }) => { console.log(`Upload complete: ${file.name} -> ${url}`) // Save to database await db.files.create({ filename: file.name, url, userId: metadata.userId, size: file.size, contentType: file.type, uploadedAt: new Date(), }) // Send notification await notificationService.send({ userId: metadata.userId, type: 'upload_complete', message: `${file.name} uploaded successfully`, }) }) .onUploadError(async ({ file, error, metadata }) => { console.error(`Upload failed: ${file.name}`, error) // Log error await errorLogger.log({ operation: 'file_upload', error: error.message, userId: metadata.userId, filename: file.name, }) }) ``` ## Advanced Examples ### E-commerce Product Images ```typescript const productRouter = s3.createRouter({ productImages: s3 .image() .max('5MB') .formats(['jpeg', 'jpg', 'png', 'webp']) .dimensions({ minWidth: 800, maxWidth: 2000 }) .middleware(async ({ metadata, req }) => { const user = await authenticateUser(req) const productId = metadata.productId // Verify user owns the product const product = await db.products.findFirst({ where: { id: productId, ownerId: user.id } }) if (!product) { throw new Error('Product not found or access denied') } return 
{ ...metadata, userId: user.id, productId, productName: product.name, } }) .paths({ generateKey: (ctx) => { const { metadata } = ctx return `products/${metadata.productId}/images/${Date.now()}.jpg` } }) .onUploadComplete(async ({ url, metadata }) => { // Update product with new image await db.products.update({ where: { id: metadata.productId }, data: { images: { push: url } } }) }), productDocuments: s3 .file() .max('10MB') .types(['application/pdf']) .paths({ prefix: 'product-docs', }) .onUploadComplete(async ({ url, metadata }) => { await db.productDocuments.create({ productId: metadata.productId, documentUrl: url, type: 'specification', }) }), }) ``` ### User Profile System ```typescript const profileRouter = s3.createRouter({ avatar: s3 .image() .max('2MB') .formats(['jpeg', 'jpg', 'png']) .dimensions({ minWidth: 100, maxWidth: 500 }) .middleware(async ({ req }) => { const user = await authenticateUser(req) return { userId: user.id, type: 'avatar' } }) .paths({ generateKey: (ctx) => { return `users/${ctx.metadata.userId}/avatar.jpg` } }) .onUploadComplete(async ({ url, metadata }) => { // Update user profile await db.users.update({ where: { id: metadata.userId }, data: { avatarUrl: url } }) // Invalidate cache await cache.del(`user:${metadata.userId}`) }), documents: s3 .object({ resume: s3.file().max('5MB').types(['application/pdf']).count(1), portfolio: s3.file().max('10MB').count(3), }) .middleware(async ({ req }) => { const user = await authenticateUser(req) return { userId: user.id } }) .paths({ prefix: 'user-documents', }), }) ``` ## Client-Side Usage Once you have your router set up, use it from the client: ```typescript // components/FileUploader.tsx import { useUploadRoute } from 'pushduck' export function FileUploader() { const { upload, isUploading } = useUploadRoute('imageUpload') const handleUpload = async (files: FileList) => { try { const results = await upload(files, { // This metadata will be passed to middleware productId: 'product-123', category: 'main-images', }) console.log('Upload complete:', results) } catch (error) { console.error('Upload failed:', error) } } return (
    <div>
      <input
        type="file"
        multiple
        onChange={(e) => e.target.files && handleUpload(e.target.files)}
        disabled={isUploading}
      />
      {isUploading && <p>Uploading...</p>}
    </div>
) } ``` ## Type Safety The router provides full TypeScript support: ```typescript // Types are automatically inferred type RouterType = typeof s3Router // Get route names type RouteNames = keyof RouterType // 'imageUpload' | 'documentUpload' // Get route input types type ImageUploadInput = InferRouteInput // Get route metadata types type ImageUploadMetadata = InferRouteMetadata ``` # Troubleshooting URL: /docs/api/troubleshooting Common issues and solutions for pushduck *** title: Troubleshooting description: Common issues and solutions for pushduck ----------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Tabs, Tab } from "fumadocs-ui/components/tabs"; # Troubleshooting Common issues and solutions when using pushduck. ## Development Issues ### Next.js Turbo Mode Compatibility **Known Issue:** pushduck has compatibility issues with Next.js Turbo mode (`--turbo` flag). **Problem:** Uploads fail or behave unexpectedly when using `next dev --turbo`. **Solution:** Remove the `--turbo` flag from your development script: ```json { "scripts": { // โŒ This may cause issues "dev": "next dev --turbo", // โœ… Use this instead "dev": "next dev" } } ``` ```bash # โŒ This may cause issues npm run dev --turbo # โœ… Use this instead npm run dev ``` **Why this happens:** Turbo mode's aggressive caching and bundling can interfere with the upload process, particularly with presigned URL generation and file streaming. ## Upload Failures ### CORS Errors **Problem:** Browser console shows CORS errors when uploading files. **Symptoms:** ``` Access to XMLHttpRequest at 'https://bucket.s3.amazonaws.com/...' from origin 'http://localhost:3000' has been blocked by CORS policy ``` **Solution:** Configure CORS on your S3 bucket. See the [provider setup guides](/docs/providers) for detailed CORS configuration. ### Environment Variables Not Found **Problem:** Errors about missing environment variables. **Symptoms:** ``` Error: Environment variable CLOUDFLARE_R2_ACCESS_KEY_ID is not defined ``` **Solution:** Ensure your environment variables are properly set: 1. **Check your `.env.local` file exists** in your project root 2. **Verify variable names** match exactly (case-sensitive) 3. **Restart your development server** after adding new variables ```bash # .env.local CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key CLOUDFLARE_R2_ACCOUNT_ID=your_account_id R2_BUCKET=your-bucket-name ``` ### File Size Limits **Problem:** Large files fail to upload. **Solution:** Check and adjust size limits: ```typescript // app/api/upload/route.ts const uploadRouter = s3.createRouter({ imageUpload: s3 .image() .max("10MB") // Increase as needed .formats(["jpeg", "png", "webp"]), }); ``` ## Type Errors ### TypeScript Inference Issues **Problem:** TypeScript errors with upload client. **Solution:** Ensure proper type exports: ```typescript // app/api/upload/route.ts export const { GET, POST } = uploadRouter.handlers; export type AppRouter = typeof uploadRouter; // โœ… Export the type // lib/upload-client.ts import type { AppRouter } from "@/app/api/upload/route"; export const upload = createUploadClient({ // โœ… Use the type endpoint: "/api/upload", }); ``` ## Performance Issues ### Slow Upload Speeds **Problem:** Uploads are slower than expected. **Solutions:** 1. **Choose the right provider region** close to your users 2. **Check your internet connection** and server resources 3. 
**Consider your provider's performance characteristics** ### Memory Issues with Large Files **Problem:** Browser crashes or high memory usage with large files. **Solution:** File streaming is handled automatically by pushduck: ```typescript // File streaming is handled automatically // No additional configuration needed const { uploadFiles } = upload.fileUpload(); await uploadFiles(largeFiles); // โœ… Streams automatically ``` ## Getting Help If you're still experiencing issues: 1. **Check the documentation** for your specific provider 2. **Enable debug logging** by setting `NODE_ENV=development` 3. **Check browser console** for detailed error messages 4. **Verify your provider configuration** is correct **Need more help?** Create an issue on [GitHub](https://github.com/abhay-ramesh/pushduck/issues) with detailed information about your setup and the error you're experiencing. # Manual Setup URL: /docs/getting-started/manual-setup Step-by-step manual setup for developers who prefer full control over configuration *** title: Manual Setup description: Step-by-step manual setup for developers who prefer full control over configuration ------------------------------------------------------------------------------------------------ import { Step, Steps } from "fumadocs-ui/components/steps"; import { Callout } from "fumadocs-ui/components/callout"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; ## Prerequisites * Next.js 13+ with App Router * An S3-compatible storage provider (we recommend Cloudflare R2 for best performance and cost) * Node.js 18+ ## Install Pushduck ```bash npm install pushduck ``` ```bash pnpm add pushduck ``` ```bash yarn add pushduck ``` ```bash bun add pushduck ``` ## Set Environment Variables Create a `.env.local` file in your project root with your storage credentials: ```dotenv title=".env.local" # Cloudflare R2 Configuration CLOUDFLARE_R2_ACCESS_KEY_ID=your_access_key CLOUDFLARE_R2_SECRET_ACCESS_KEY=your_secret_key CLOUDFLARE_R2_ACCOUNT_ID=your_account_id CLOUDFLARE_R2_BUCKET_NAME=your-bucket-name ``` **Don't have R2 credentials yet?** Follow our [Cloudflare R2 setup guide](/docs/providers/cloudflare-r2) to create a bucket and get your credentials in 2 minutes. ```dotenv title=".env.local" # AWS S3 Configuration AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_REGION=us-east-1 AWS_S3_BUCKET_NAME=your-bucket-name ``` **Don't have S3 credentials yet?** Follow our [AWS S3 setup guide](/docs/providers/aws-s3) to create a bucket and get your credentials in 2 minutes. 
## Configure Upload Settings First, create your upload configuration: ```typescript // lib/upload.ts import { createUploadConfig } from "pushduck/server"; // Configure your S3-compatible storage export const { s3, storage } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.CLOUDFLARE_R2_ACCESS_KEY_ID!, secretAccessKey: process.env.CLOUDFLARE_R2_SECRET_ACCESS_KEY!, region: "auto", endpoint: `https://${process.env.CLOUDFLARE_R2_ACCOUNT_ID}.r2.cloudflarestorage.com`, bucket: process.env.CLOUDFLARE_R2_BUCKET_NAME!, accountId: process.env.CLOUDFLARE_R2_ACCOUNT_ID!, }) .build(); ``` ## Create Your Upload Router Create an API route to handle file uploads: ```typescript // app/api/s3-upload/route.ts import { s3 } from "@/lib/upload"; const s3Router = s3.createRouter({ // Define your upload routes with validation imageUpload: s3 .image() .max("10MB") .formats(["jpg", "jpeg", "png", "webp"]), documentUpload: s3.file().max("50MB").types(["application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document"]), }); export const { GET, POST } = s3Router.handlers; // Export the router type for client-side type safety export type Router = typeof s3Router; ``` **What's happening here?** - `s3.createRouter()` creates a type-safe upload handler - `s3.image()` and `s3.file()` provide validation and TypeScript inference - The router automatically handles presigned URLs, validation, and errors - Exporting the type enables full client-side type safety ## Create Upload Client Create a type-safe client for your components: ```typescript // lib/upload-client.ts import { createUploadClient } from "pushduck"; import type { Router } from "@/app/api/s3-upload/route"; // Create a type-safe upload client export const upload = createUploadClient({ baseUrl: "/api/s3-upload", }); // You can also export specific upload methods export const { imageUpload, documentUpload } = upload; ``` ## Use in Your Components Now you can use the upload client in any component with full type safety: ```typescript // components/image-uploader.tsx "use client"; import { upload } from "@/lib/upload-client"; export function ImageUploader() { const { uploadFiles, uploadedFiles, isUploading, progress, error } = upload.imageUpload(); const handleFileChange = (e: React.ChangeEvent) => { const files = e.target.files; if (files) { uploadFiles(Array.from(files)); } }; return (
    <div>
      <input
        type="file"
        accept="image/*"
        multiple
        onChange={handleFileChange}
        disabled={isUploading}
      />

      {isUploading && (
        <p>Uploading... {Math.round(progress)}%</p>
      )}

      {error && (
        <p role="alert">{error.message}</p>
      )}

      {uploadedFiles.length > 0 && (
        <div>
          {uploadedFiles.map((file) => (
            <div key={file.url}>
              <img src={file.url} alt="Uploaded image" />
              <p>{file.name}</p>
            </div>
          ))}
        </div>
      )}
    </div>
); } ``` ## Add to Your Page Finally, use your upload component in any page: ```typescript // app/page.tsx import { ImageUploader } from "@/components/image-uploader"; export default function HomePage() { return (

    <main>
      <h1>Upload Images</h1>
      <ImageUploader />
    </main>
); } ```
## ๐ŸŽ‰ Congratulations! You now have **production-ready file uploads** working in your Next.js app! Here's what you accomplished: * โœ… **Type-safe uploads** with full TypeScript inference * โœ… **Automatic validation** for file types and sizes * โœ… **Progress tracking** with loading states * โœ… **Error handling** with user-friendly messages * โœ… **Secure uploads** using presigned URLs * โœ… **Multiple file support** with image preview **Turbo Mode Issue:** If you're using `next dev --turbo` and experiencing upload issues, try removing the `--turbo` flag from your dev script. There's a known compatibility issue with Turbo mode that can affect file uploads. ## What's Next? Now that you have the basics working, explore these advanced features:

* 🎨 **Enhanced UI** - Add drag & drop, progress bars, and beautiful components (Image Upload Guide →)
* 🔒 **Custom Validation** - Add authentication, custom metadata, and middleware (Router Configuration →)
* ☁️ **Other Providers** - Try Cloudflare R2 for better performance, or AWS S3, DigitalOcean, MinIO (Provider Setup →)
* ⚡ **Enhanced Client** - Upgrade to property-based access for better DX (Migration Guide →)
## Need Help? * ๐Ÿ“– **Documentation**: Explore our comprehensive [guides](/docs/guides) * ๐Ÿ’ฌ **Community**: Join our [Discord community](https://discord.gg/pushduck) * ๐Ÿ› **Issues**: Report bugs on [GitHub](https://github.com/abhay-ramesh/pushduck) * ๐Ÿ“ง **Support**: Email us at [support@pushduck.com](mailto:support@pushduck.com) **Loving Pushduck?** Give us a โญ on [GitHub](https://github.com/abhay-ramesh/pushduck) and help spread the word! # Quick Start URL: /docs/getting-started/quick-start Get file uploads working in your Next.js app in under 2 minutes with our CLI *** title: Quick Start description: Get file uploads working in your Next.js app in under 2 minutes with our CLI ----------------------------------------------------------------------------------------- import { Step, Steps } from "fumadocs-ui/components/steps"; import { Callout } from "fumadocs-ui/components/callout"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; Get **production-ready file uploads** working in your Next.js app in under 2 minutes with our CLI tool. Interactive setup, just one command. **๐Ÿš€ New!** Use our CLI for instant setup: `npx @pushduck/cli@latest init` - handles everything automatically! ### โšก Interactive CLI Setup Get everything set up instantly with our interactive CLI: ```bash npx @pushduck/cli@latest init ``` ```bash pnpm dlx pushduck init ``` ```bash yarn dlx pushduck init ``` ```bash bunx pushduck init ``` That's it! The CLI will: * โœ… **Auto-detect your package manager** (npm, pnpm, yarn, bun) * โœ… Install dependencies using your preferred package manager * โœ… Set up your chosen provider (Cloudflare R2, AWS S3, etc.) * โœ… Create API routes with type safety * โœ… Generate example components * โœ… Configure environment variables * โœ… Create and configure your storage bucket **What you get:** * Production-ready upload API in `app/api/upload/route.ts` * Type-safe upload client in `lib/upload-client.ts` * Example components in `components/ui/` * Working demo page in `app/upload/page.tsx` [**๐Ÿ“š Full CLI Documentation โ†’**](/docs/api/cli/cli-setup) **Example CLI Output:** ``` โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ” โ”‚ โ”‚ โ”‚ ๐Ÿš€ Welcome to Pushduck โ”‚ โ”‚ โ”‚ โ”‚ Let's get your file uploads working in 2 minutes! โ”‚ โ”‚ โ”‚ โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜ ๐Ÿ” Detecting your project... โœ“ Next.js App Router detected โœ“ TypeScript configuration found โœ“ Package manager: pnpm detected โœ“ No existing upload configuration โœ“ Project structure validated ? Which cloud storage provider would you like to use? โฏ Cloudflare R2 (recommended) AWS S3 (classic, widely supported) DigitalOcean Spaces (simple, affordable) Google Cloud Storage (enterprise-grade) MinIO (self-hosted, open source) Custom S3-compatible endpoint โœจ Generated files: โ”œโ”€โ”€ app/api/upload/route.ts โ”œโ”€โ”€ app/upload/page.tsx โ”œโ”€โ”€ components/ui/upload-button.tsx โ”œโ”€โ”€ lib/upload-client.ts โ””โ”€โ”€ .env.example ๐Ÿ“ฆ Installing dependencies with pnpm... โœ“ pushduck โœ“ @aws-sdk/client-s3 โœ“ react-dropzone ๐ŸŽ‰ Setup complete! Your uploads are ready. ``` **Turbo Mode Issue:** If you're using `next dev --turbo` and experiencing upload issues, try removing the `--turbo` flag. 
There's a known compatibility issue with Turbo mode that can affect file uploads. *** ## Step-by-Step CLI Walkthrough Here's exactly what happens when you run the CLI and how to make the best choices: ### Project Detection & Validation ``` ๐Ÿ” Detecting your project... โœ“ Next.js App Router detected โœ“ TypeScript configuration found โœ“ Package manager: pnpm detected โœ“ No existing upload configuration โœ“ Project structure validated ``` **What's happening:** The CLI automatically detects your project setup to ensure compatibility. **If you see warnings:** * โš ๏ธ **Pages Router detected**: Still works, but examples will be for App Router * โš ๏ธ **No TypeScript**: JavaScript examples will be generated instead * โš ๏ธ **Existing configuration**: CLI will ask if you want to overwrite * โš ๏ธ **Package manager not detected**: Will default to npm ### Provider Selection ``` ? Which cloud storage provider would you like to use? โฏ Cloudflare R2 (recommended) AWS S3 (classic, widely supported) DigitalOcean Spaces (simple, affordable) Google Cloud Storage (enterprise-grade) MinIO (self-hosted, open source) Custom S3-compatible endpoint ``` **How to choose:** **Choose: Cloudflare R2 (recommended)** * Zero egress fees (bandwidth is FREE) * Global edge network with 200+ locations * Simple setup with excellent documentation * Best performance for most applications **Choose: Cloudflare R2** * No egress fees (bandwidth is free) * $0.015/GB storage (cheaper than S3) * Global edge network included * Perfect for high-traffic applications **Choose: Cloudflare R2** * Global edge network with 200+ locations * Automatic geographic distribution * Faster uploads worldwide * Built-in CDN functionality **Choose: Google Cloud Storage** * Enterprise-grade security and compliance * Advanced analytics and monitoring * Integration with Google Cloud ecosystem * Multi-region redundancy options **Use arrow keys** to navigate, **Enter** to select. ### Credential Detection & Setup ``` ๐Ÿ”ง Setting up Cloudflare R2... ๐Ÿ” Checking for existing credentials... โœ“ Found CLOUDFLARE_R2_ACCESS_KEY_ID โœ“ Found CLOUDFLARE_R2_SECRET_ACCESS_KEY โœ“ Found CLOUDFLARE_R2_ACCOUNT_ID โš  CLOUDFLARE_R2_BUCKET_NAME not found ``` **What this means:** * โœ… **Found credentials**: CLI detected existing environment variables * โš ๏ธ **Missing credentials**: You'll be prompted to enter them * โŒ **No credentials**: CLI will guide you through setup **If prompted for credentials:** ``` ? Enter your Cloudflare R2 Access Key ID: f1d2... ? Enter your Cloudflare R2 Secret Access Key: [hidden] ? Enter your Cloudflare Account ID: abc123... ? Enter your R2 bucket name: my-app-uploads-2024 ``` **Pro tips:** * Use a **unique bucket name** (globally unique across all R2) * **Account ID** can be found in your Cloudflare dashboard * **Don't have credentials?** Check our [Cloudflare R2 setup guide](/docs/providers/cloudflare-r2) ### Bucket Creation ``` ? Create bucket automatically? 
(Y/n) ``` **Recommended: Yes** - The CLI will: * Create the bucket with proper permissions * Set up CORS configuration for web uploads * Configure public read access for uploaded files * Test the connection to ensure everything works **Choose "No" if:** * You already have a bucket configured * Your organization requires manual bucket creation * You need custom bucket policies **Success looks like:** ``` โœ… Created R2 bucket: my-app-uploads-2024 โœ… Configured CORS for web uploads โœ… Set up public read permissions โœ… Connection test successful ``` ### API Route Configuration ``` ? Where should we create the upload API? โฏ app/api/upload/route.ts (recommended) app/api/s3-upload/route.ts (classic) Custom path ``` **Recommendations:** * **`/api/upload`**: Clean, modern route name * **`/api/s3-upload`**: If you want to be explicit about S3 * **Custom path**: If you have specific routing requirements **The CLI will create:** * Type-safe API route with validation * Authentication middleware (ready to customize) * Support for multiple upload types (images, documents) * Proper error handling and responses ### Component Generation ``` ? Generate example upload page? โฏ Yes, create app/upload/page.tsx with full example Yes, just add components to components/ui/ No, I'll build my own ``` **Choose based on your needs:** **Full example page** - Best for: * Learning how the library works * Quick prototyping and testing * Getting a working demo immediately **Components only** - Best for: * Adding uploads to existing pages * Custom UI integration * Building your own demo **No components** - Best for: * Experienced developers * Custom implementation requirements * API-only usage ### File Generation & Installation ``` ๐Ÿ› ๏ธ Generating files... โœจ Created files: โ”œโ”€โ”€ app/api/upload/route.ts # Type-safe API endpoint โ”œโ”€โ”€ app/upload/page.tsx # Demo upload page โ”œโ”€โ”€ components/ui/upload-button.tsx # Simple upload button โ”œโ”€โ”€ components/ui/upload-dropzone.tsx # Drag & drop component โ”œโ”€โ”€ lib/upload-client.ts # Type-safe client โ””โ”€โ”€ .env.example # Environment template ๐Ÿ“ฆ Installing dependencies with pnpm... โœ“ pushduck โœ“ @aws-sdk/client-s3 โœ“ react-dropzone ๐ŸŽ‰ Setup complete! Your uploads are ready. ``` **What happens:** 1. **Files generated** with your specific configuration 2. **Dependencies installed** using your detected package manager 3. **Types generated** for full TypeScript support 4. **Environment configured** with your settings **Package Manager Support:** * **npm**: `npm install` commands * **pnpm**: `pnpm add` commands * **yarn**: `yarn add` commands * **bun**: `bun add` commands **Next steps shown:** ``` ๐Ÿš€ Next steps: 1. Start your dev server: pnpm dev 2. Visit: http://localhost:3000/upload 3. Try uploading a file! 
๐Ÿ“š Learn more: โ€ข API Reference: /docs/api โ€ข Providers: /docs/providers โ€ข Examples: /docs/examples ``` *** ## Common CLI Scenarios ### First-Time Setup (Recommended) ```bash npx @pushduck/cli@latest init # Follow prompts, choose Cloudflare R2, let CLI create bucket ``` ```bash pnpm dlx pushduck init # Follow prompts, choose Cloudflare R2, let CLI create bucket ``` ```bash yarn dlx pushduck init # Follow prompts, choose Cloudflare R2, let CLI create bucket ``` ```bash bunx pushduck init # Follow prompts, choose Cloudflare R2, let CLI create bucket ``` ### Quick Cloudflare R2 Setup ```bash # Works with any package manager npx @pushduck/cli@latest init --provider cloudflare-r2 pnpm dlx pushduck init --provider cloudflare-r2 yarn dlx pushduck init --provider cloudflare-r2 bunx pushduck init --provider cloudflare-r2 ``` ### AWS S3 Setup ```bash # Skip provider selection, go straight to AWS S3 npx @pushduck/cli@latest init --provider aws # Use AWS S3 for existing AWS infrastructure ``` ### Components Only ```bash # Generate API and components, no demo page npx @pushduck/cli@latest init --skip-examples ``` ### Preview Mode ```bash # See what would be created without making changes npx @pushduck/cli@latest init --dry-run ``` *** **Need help?** The CLI includes built-in help: `npx @pushduck/cli@latest --help`. For detailed documentation, see our [complete CLI reference](/docs/api/cli/cli-setup). # Client-Side Approaches URL: /docs/guides/client-approaches Compare the structured client vs hook-based approaches for file uploads *** title: Client-Side Approaches description: Compare the structured client vs hook-based approaches for file uploads ------------------------------------------------------------------------------------ import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; # Client-Side Approaches Pushduck provides **two ways** to integrate file uploads in your React components. Both approaches now provide **identical functionality** including per-route callbacks, progress tracking, and error handling. **Recommendation**: Use the **Enhanced Structured Client** approach for the best developer experience. It now provides the same flexibility as hooks while maintaining superior type safety and centralized configuration. ## Quick Comparison ```typescript const upload = createUploadClient({ endpoint: '/api/upload' }) // Simple usage const { uploadFiles, files } = upload.imageUpload() // With per-route callbacks (NEW!) 
const { uploadFiles, files } = upload.imageUpload({ onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress) }) ``` **Best for**: Most projects - provides superior DX, type safety, and full flexibility ```typescript const { uploadFiles, files } = useUploadRoute('imageUpload', { onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress) }) ``` **Best for**: Teams that strongly prefer React hooks, legacy code migration ## Feature Parity Both approaches now support **identical functionality**: | Feature | Enhanced Structured Client | Hook-Based | | --------------------- | -------------------------------- | ---------------------------- | | โœ… Type Safety | **Superior** - Property-based | Good - Generic types | | โœ… Per-route Callbacks | **โœ… Full support** | โœ… Full support | | โœ… Progress Tracking | **โœ… Full support** | โœ… Full support | | โœ… Error Handling | **โœ… Full support** | โœ… Full support | | โœ… Multiple Endpoints | **โœ… Per-route endpoints** | โœ… Per-route endpoints | | โœ… Upload Control | **โœ… Enable/disable uploads** | โœ… Enable/disable uploads | | โœ… Auto-upload | **โœ… Per-route control** | โœ… Per-route control | | โœ… Overall Progress | **โœ… progress, uploadSpeed, eta** | โœ… progress, uploadSpeed, eta | ## API Comparison: Identical Capabilities Both approaches now return **exactly the same** properties and accept **exactly the same** configuration options: ```typescript // Hook-Based Approach const { uploadFiles, // (files: File[]) => Promise files, // S3UploadedFile[] isUploading, // boolean errors, // string[] reset, // () => void progress, // number (0-100) - overall progress uploadSpeed, // number (bytes/sec) - overall speed eta // number (seconds) - overall ETA } = useUploadRoute('imageUpload', { onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress), endpoint: '/api/custom-upload', disabled: false, autoUpload: true }); // Enhanced Structured Client - IDENTICAL capabilities const { uploadFiles, // (files: File[]) => Promise files, // S3UploadedFile[] isUploading, // boolean errors, // string[] reset, // () => void progress, // number (0-100) - overall progress uploadSpeed, // number (bytes/sec) - overall speed eta // number (seconds) - overall ETA } = upload.imageUpload({ onSuccess: (results) => handleSuccess(results), onError: (error) => handleError(error), onProgress: (progress) => setProgress(progress), endpoint: '/api/custom-upload', disabled: false, autoUpload: true }); ``` ## Complete Options Parity Both approaches support **identical configuration options**: ```typescript interface CommonUploadOptions { onSuccess?: (results: UploadResult[]) => void; onError?: (error: Error) => void; onProgress?: (progress: number) => void; endpoint?: string; // Custom endpoint per route disabled?: boolean; // Enable/disable uploads autoUpload?: boolean; // Auto-upload when files selected } // Hook-based: useUploadRoute(routeName, options) // Structured: upload.routeName(options) // Both accept the same CommonUploadOptions interface ``` ## Return Value Parity Both approaches return **identical properties**: ```typescript interface CommonUploadReturn { uploadFiles: (files: File[]) => Promise; files: S3UploadedFile[]; isUploading: boolean; errors: string[]; reset: () => void; // Overall progress tracking (NEW in both!) 
progress?: number; // 0-100 percentage across all files uploadSpeed?: number; // bytes per second across all files eta?: number; // seconds remaining for all files } ``` ## Enhanced Structured Client Examples ### Basic Usage (Unchanged) ```typescript import { createUploadClient } from 'pushduck/client' import type { AppRouter } from '@/lib/upload' const upload = createUploadClient<AppRouter>({ endpoint: '/api/upload' }) export function SimpleUpload() { const { uploadFiles, files, isUploading } = upload.imageUpload() return ( <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> ) } ``` ### With Per-Route Configuration (NEW!) ```typescript import { useState } from 'react' export function AdvancedUpload() { const [progress, setProgress] = useState(0) const { uploadFiles, files, isUploading, errors, reset } = upload.imageUpload({ onSuccess: (results) => { console.log('โœ… Upload successful!', results) showNotification('Images uploaded successfully!') }, onError: (error) => { console.error('โŒ Upload failed:', error) showErrorNotification(error.message) }, onProgress: (progress) => { console.log(`๐Ÿ“Š Progress: ${progress}%`) setProgress(progress) } }) return (
<div> <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} /> {/* progress component markup was lost in this export; a plain <progress> element is assumed */} {progress > 0 && <progress value={progress} max={100} />} </div>
) } ``` ### Multiple Routes with Different Configurations ```typescript export function MultiUploadComponent() { // Images with progress tracking const images = upload.imageUpload({ onProgress: (progress) => setImageProgress(progress) }) // Documents with different endpoint and success handler const documents = upload.documentUpload({ endpoint: '/api/secure-upload', onSuccess: (results) => updateDocumentLibrary(results) }) // Videos with upload disabled (feature flag) const videos = upload.videoUpload({ disabled: !isVideoUploadEnabled }) return (
) } ``` ### Global Configuration with Per-Route Overrides ```typescript const upload = createUploadClient({ endpoint: '/api/upload', // Global defaults (optional) defaultOptions: { onProgress: (progress) => console.log(`Global progress: ${progress}%`), onError: (error) => logError(error) } }) // This route inherits global defaults const basic = upload.imageUpload() // This route overrides specific options const custom = upload.documentUpload({ endpoint: '/api/secure-upload', // Override endpoint onSuccess: (results) => handleSecureUpload(results) // Add success handler // Still inherits global onProgress and onError }) ``` ## Hook-Based Approach (Unchanged) ```typescript import { useUploadRoute } from 'pushduck/client' export function HookBasedUpload() { const { uploadFiles, files, isUploading, error } = useUploadRoute('imageUpload', { onSuccess: (results) => console.log('Success:', results), onError: (error) => console.error('Error:', error), onProgress: (progress) => console.log('Progress:', progress) }) return ( uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> ) } ``` ## Migration Guide ### From Hook-Based to Enhanced Structured Client ```typescript // Before: Hook-based const { uploadFiles, files } = useUploadRoute('imageUpload', { onSuccess: handleSuccess, onError: handleError }) // After: Enhanced structured client const upload = createUploadClient({ endpoint: '/api/upload' }) const { uploadFiles, files } = upload.imageUpload({ onSuccess: handleSuccess, onError: handleError }) ``` ### Benefits of Migration 1. **Better Type Safety**: Route names are validated at compile time 2. **Enhanced IntelliSense**: Auto-completion for all available routes 3. **Centralized Configuration**: Single place to configure endpoints and defaults 4. **Refactoring Support**: Rename routes safely across your codebase 5. **No Performance Impact**: Same underlying implementation ## When to Use Each Approach ### Use Enhanced Structured Client When: * โœ… **Starting a new project** - best overall developer experience * โœ… **Want superior type safety** - compile-time route validation * โœ… **Need centralized configuration** - single place for settings * โœ… **Value refactoring support** - safe route renames ### Use Hook-Based When: * โœ… **Migrating existing code** - minimal changes required * โœ… **Dynamic route names** - routes determined at runtime * โœ… **Team strongly prefers hooks** - familiar React patterns * โœ… **Legacy compatibility** - maintaining older codebases ## Performance Considerations Both approaches have **identical performance** characteristics: * Same underlying `useUploadRoute` implementation * Same network requests and upload logic * Same React hooks rules and lifecycle The enhanced structured client adds zero runtime overhead while providing compile-time benefits. *** **Full Feature Parity**: Both approaches now support the same functionality. The choice comes down to developer experience preferences rather than feature limitations. ## Detailed Comparison ### Type Safety & Developer Experience ```typescript // โœ… Complete type inference from server router const upload = createUploadClient({ endpoint: '/api/upload' }) // โœ… Property-based access - no string literals const { uploadFiles, files } = upload.imageUpload() // โœ… IntelliSense shows all available endpoints upload. // <- Shows: imageUpload, documentUpload, videoUpload... 
// โœ… Compile-time validation upload.nonExistentRoute() // โŒ TypeScript error // โœ… Refactoring safety // Rename routes in router โ†’ TypeScript shows all usage locations ``` **Benefits:** * ๐ŸŽฏ **Full type inference** from server to client * ๐Ÿ” **IntelliSense support** - discover endpoints through IDE * ๐Ÿ›ก๏ธ **Refactoring safety** - rename with confidence * ๐Ÿšซ **No string literals** - eliminates typos * โšก **Better DX** - property-based access feels natural ```typescript // โœ… With type parameter - recommended for better type safety const { uploadFiles, files } = useUploadRoute<AppRouter>('imageUpload') // โœ… Without type parameter - also works const { uploadFiles, files } = useUploadRoute('imageUpload') // Type parameter provides compile-time validation const typed = useUploadRoute<AppRouter>('imageUpload') // Route validated const untyped = useUploadRoute('imageUpload') // Any string accepted ``` **Characteristics:** * ๐Ÿช **React hook pattern** - familiar to React developers * ๐Ÿ”ค **Flexible usage** - works with or without type parameter * ๐Ÿงฉ **Component-level state** - each hook manages its own state * ๐ŸŽฏ **Type safety** - enhanced when using `useUploadRoute<AppRouter>` * ๐Ÿ” **IDE support** - best with type parameter ### Code Examples **Structured Client:** ```typescript import { upload } from '@/lib/upload-client' export function ImageUploader() { const { uploadFiles, files, isUploading, error } = upload.imageUpload() return (
<div> <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> {/* Upload UI */} </div>
) } ``` **Hook-Based:** ```typescript import { useUploadRoute } from 'pushduck/client' export function ImageUploader() { const { uploadFiles, files, isUploading, error } = useUploadRoute('imageUpload') return (
<div> <input type="file" multiple onChange={(e) => uploadFiles(Array.from(e.target.files || []))} disabled={isUploading} /> {/* Same upload UI */} </div>
) } ```
**Structured Client:** ```typescript export function FileManager() { const images = upload.imageUpload() const documents = upload.documentUpload() const videos = upload.videoUpload() return (
<div>{/* upload UI for images, documents and videos; markup omitted in this export */}</div> ) } ``` **Hook-Based:** ```typescript export function FileManager() { const images = useUploadRoute('imageUpload') const documents = useUploadRoute('documentUpload') const videos = useUploadRoute('videoUpload') return (
<div>{/* same upload UI, driven by the hook results */}</div> ) } ```
**Structured Client:** ```typescript // lib/upload-client.ts export const upload = createUploadClient({ endpoint: '/api/upload', headers: { Authorization: `Bearer ${getAuthToken()}` } }) // components/secure-uploader.tsx export function SecureUploader() { const { uploadFiles } = upload.secureUpload() // Authentication handled globally } ``` **Hook-Based:** ```typescript export function SecureUploader() { const { uploadFiles } = useUploadRoute('secureUpload', { headers: { Authorization: `Bearer ${getAuthToken()}` } }) // Authentication per hook usage } ```
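All of the client snippets on this page assume a server-side router that exports the `AppRouter` type they import. For reference, a minimal sketch of such a router is shown below; the route names (`imageUpload`, `documentUpload`, `videoUpload`, `secureUpload`) and the provider settings are illustrative, so adjust them to your own project:

```typescript
// lib/upload.ts - minimal server router sketch (illustrative values)
import { createUploadConfig } from 'pushduck/server'

const { s3, createS3Router } = createUploadConfig()
  .provider("cloudflareR2", {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
    region: 'auto',
    endpoint: process.env.AWS_ENDPOINT_URL!,
    bucket: process.env.S3_BUCKET_NAME!,
    accountId: process.env.R2_ACCOUNT_ID!,
  })
  .build()

export const uploadRouter = createS3Router({
  imageUpload: s3.image().max("5MB"),
  documentUpload: s3.file().max("10MB"),
  videoUpload: s3.file().max("100MB"),
  secureUpload: s3.image().max("5MB"),
})

// The client imports this type as AppRouter
export type AppRouter = typeof uploadRouter
```

The routes you define here are exactly the properties that `createUploadClient` exposes and the route names that `useUploadRoute` accepts, so keeping the exported type in sync is all that client-side inference needs.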
## Conclusion **Our Recommendation**: Use the **Enhanced Structured Client** approach (`createUploadClient`) for most projects. It provides superior developer experience, better refactoring safety, and enhanced type inference. **Both approaches are supported**: The hook-based approach (`useUploadRoute`) is fully supported and valid for teams that prefer traditional React patterns. **Quick Decision Guide:** * **Most projects** โ†’ Use `createUploadClient` (recommended) * **Strongly prefer React hooks** โ†’ Use `useUploadRoute` * **Want best DX and type safety** โ†’ Use `createUploadClient` * **Need component-level control** โ†’ Use `useUploadRoute` ### Next Steps * **New Project**: Start with [createUploadClient](/docs/api/utilities/create-upload-client) * **Existing Hook Code**: Consider [migrating gradually](/docs/guides/migration/enhanced-client) * **Need Help**: Join our [Discord community](https://discord.gg/pushduck) for guidance # Astro URL: /docs/integrations/astro Modern static site file uploads with Astro using Web Standards - no adapter needed! *** title: Astro description: Modern static site file uploads with Astro using Web Standards - no adapter needed! ------------------------------------------------------------------------------------------------ import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Astro Integration Astro is a modern web framework for building fast, content-focused websites with islands architecture. It uses Web Standards APIs and provides excellent performance with minimal JavaScript. Since Astro uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Astro API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.AWS_ENDPOINT_URL!, bucket: import.meta.env.S3_BUCKET_NAME!, accountId: import.meta.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; // Direct usage - no adapter needed! 
export const ALL: APIRoute = async ({ request }) => { return uploadRouter.handlers(request); }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; // Method 1: Combined handler (recommended) export const ALL: APIRoute = async ({ request }) => { return uploadRouter.handlers(request); }; // Method 2: Separate handlers (if you need method-specific logic) export const GET: APIRoute = async ({ request }) => { return uploadRouter.handlers.GET(request); }; export const POST: APIRoute = async ({ request }) => { return uploadRouter.handlers.POST(request); }; ``` ### With CORS Support ```typescript title="src/pages/api/upload/[...path].ts" import type { APIRoute } from 'astro'; import { uploadRouter } from '../../../lib/upload'; export const ALL: APIRoute = async ({ request }) => { // Handle CORS preflight if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type', }, }); } const response = await uploadRouter.handlers(request); // Add CORS headers to actual response response.headers.set('Access-Control-Allow-Origin', '*'); return response; }; ``` ## Advanced Configuration ### Authentication with Astro ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.AWS_ENDPOINT_URL!, bucket: import.meta.env.S3_BUCKET_NAME!, accountId: import.meta.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const cookies = req.headers.get('Cookie'); const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | null) { if (!cookieString) return {}; return Object.fromEntries( cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement your session validation logic // This could connect to a database, Redis, etc. 
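  // Hypothetical example of what this lookup could be (names are illustrative):
  //   const raw = await redis.get(`session:${sessionId}`)
  //   return raw ? (JSON.parse(raw) as { id: string; username: string }) : null
  // The stub below just returns a demo user.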
return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Component (React) ```tsx title="src/components/FileUpload.tsx" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "../lib/upload"; const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); export default function FileUpload() { function handleUploadComplete(files: any[]) { console.log("Files uploaded:", files); alert("Upload completed!"); } function handleUploadError(error: Error) { console.error("Upload error:", error); alert(`Upload failed: ${error.message}`); } return (

<div>{/* Markup lost in this export: an "Image Upload" section (UploadDropzone) and a "Document Upload" section (UploadButton), both wired to handleUploadComplete and handleUploadError; prop names depend on your generated components. */}</div>
); } ``` ### Upload Component (Vue) ```vue title="src/components/FileUpload.vue" ``` ### Using in Astro Pages ```astro title="src/pages/index.astro" --- // Server-side code (runs at build time) --- File Upload Demo

<!-- Page markup lost in this export: an HTML page titled "File Upload Demo" that renders the FileUpload component as an interactive island (remember the client:load directive). -->

``` ## File Management ### Server-Side File API ```typescript title="src/pages/api/files.ts" import type { APIRoute } from 'astro'; export const GET: APIRoute = async ({ request, url }) => { const searchParams = url.searchParams; const userId = searchParams.get('userId'); if (!userId) { return new Response(JSON.stringify({ error: 'User ID required' }), { status: 400, headers: { 'Content-Type': 'application/json' } }); } // Fetch files from database const files = await getFilesForUser(userId); return new Response(JSON.stringify({ files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }), { headers: { 'Content-Type': 'application/json' } }); }; async function getFilesForUser(userId: string) { // Implement your database query logic return []; } ``` ### File Management Page ```astro title="src/pages/files.astro" --- // This runs on the server at build time or request time const files = await fetch(`${Astro.url.origin}/api/files?userId=current-user`) .then(res => res.json()) .catch(() => ({ files: [] })); --- My Files

<!-- element tags below are assumed; only the text content and the expressions survived this export -->
<h1>My Files</h1>
<h2>Uploaded Files</h2>
{files.files.length === 0 ? (
  <p>No files uploaded yet.</p>
) : (
  <ul>
    {files.files.map((file: any) => (
      <li>
        <span>{file.name}</span>
        <span>{formatFileSize(file.size)}</span>
        <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
        <a href={file.url}>View File</a>
      </li>
    ))}
  </ul>
)}
<!-- formatFileSize is a helper you would define yourself; it was not part of the surviving example -->
``` ## Deployment Options ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import vercel from '@astrojs/vercel/serverless'; export default defineConfig({ output: 'server', adapter: vercel({ runtime: 'nodejs18.x', }), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import netlify from '@astrojs/netlify/functions'; export default defineConfig({ output: 'server', adapter: netlify(), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import node from '@astrojs/node'; export default defineConfig({ output: 'server', adapter: node({ mode: 'standalone', }), }); ``` ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import cloudflare from '@astrojs/cloudflare'; export default defineConfig({ output: 'server', adapter: cloudflare(), }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Astro PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload ``` ## Performance Benefits ## Real-Time Upload Progress ```tsx title="src/components/AdvancedUpload.tsx" import { useState } from 'react'; export default function AdvancedUpload() { const [uploadProgress, setUploadProgress] = useState(0); const [isUploading, setIsUploading] = useState(false); async function handleFileUpload(event: React.ChangeEvent) { const files = event.target.files; if (!files || files.length === 0) return; setIsUploading(true); setUploadProgress(0); try { // Simulate upload progress for (let i = 0; i <= 100; i += 10) { setUploadProgress(i); await new Promise(resolve => setTimeout(resolve, 100)); } alert('Upload completed!'); } catch (error) { console.error('Upload failed:', error); alert('Upload failed!'); } finally { setIsUploading(false); setUploadProgress(0); } } return (
<div>
  <input type="file" onChange={handleFileUpload} disabled={isUploading} />
  {isUploading && (
    <div>
      {/* progress bar markup assumed; only the label text survived this export */}
      <progress value={uploadProgress} max={100} /> <span>{uploadProgress}% uploaded</span>
    </div>
  )}
</div>
); } ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `src/pages/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck is properly installed and configured 3. **Environment variables**: Use `import.meta.env` instead of `process.env` 4. **Client components**: Remember to add `client:load` directive for interactive components ### Debug Mode Enable debug logging: ```typescript title="src/lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (import.meta.env.DEV) { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Astro Configuration ```javascript title="astro.config.mjs" import { defineConfig } from 'astro/config'; import react from '@astrojs/react'; import vue from '@astrojs/vue'; export default defineConfig({ integrations: [ react(), // For React components vue(), // For Vue components ], output: 'server', // Required for API routes vite: { define: { // Make environment variables available 'import.meta.env.AWS_ACCESS_KEY_ID': JSON.stringify(process.env.AWS_ACCESS_KEY_ID), } } }); ``` Astro provides an excellent foundation for building fast, content-focused websites with pushduck, combining the power of islands architecture with Web Standards APIs for optimal performance and developer experience. # Bun Runtime URL: /docs/integrations/bun Ultra-fast JavaScript runtime with native Web Standards support - no adapter needed! *** title: Bun Runtime description: Ultra-fast JavaScript runtime with native Web Standards support - no adapter needed! ------------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; # Bun Runtime Bun is an ultra-fast JavaScript runtime with native Web Standards support. Since Bun uses Web Standard `Request` and `Response` objects natively, pushduck handlers work directly without any adapters! **Web Standards Native**: Bun's `Bun.serve()` uses Web Standard `Request` objects directly, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash bun add pushduck ``` ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Bun server with upload routes** ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; // Direct usage - no adapter needed! 
Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request); } return new Response('Not found', { status: 404 }); }, }); console.log('๐Ÿš€ Bun server running on http://localhost:3000'); ``` ## Basic Integration ### Simple Upload Server ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); // Method 1: Combined handler (recommended) if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request); } // Health check if (url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok' }), { headers: { 'Content-Type': 'application/json' } }); } return new Response('Not found', { status: 404 }); }, }); console.log('๐Ÿš€ Bun server running on http://localhost:3000'); ``` ### With CORS and Routing ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; function handleCORS(request: Request) { const origin = request.headers.get('origin'); const allowedOrigins = ['http://localhost:3000', 'https://your-domain.com']; const headers = new Headers(); if (origin && allowedOrigins.includes(origin)) { headers.set('Access-Control-Allow-Origin', origin); } headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization'); return headers; } Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); const corsHeaders = handleCORS(request); // Handle preflight requests if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: corsHeaders }); } // Upload routes if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request).then(response => { // Add CORS headers to response corsHeaders.forEach((value, key) => { response.headers.set(key, value); }); return response; }); } // Health check if (url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok', runtime: 'Bun', timestamp: new Date().toISOString() }), { headers: { 'Content-Type': 'application/json', ...Object.fromEntries(corsHeaders) } }); } return new Response('Not found', { status: 404 }); }, }); console.log('๐Ÿš€ Bun server running on http://localhost:3000'); ``` ## Advanced Configuration ### Authentication and Rate Limiting ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = await verifyJWT(token); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public 
access }); async function verifyJWT(token: string) { // Your JWT verification logic here // Using Bun's built-in crypto or a JWT library return { sub: 'user-123', role: 'user' }; } export type AppUploadRouter = typeof uploadRouter; ``` ### Production Server with Full Features ```typescript title="server.ts" import { uploadRouter } from './lib/upload'; // Simple rate limiting store const rateLimitStore = new Map(); function rateLimit(ip: string, maxRequests = 100, windowMs = 15 * 60 * 1000) { const now = Date.now(); const key = ip; const record = rateLimitStore.get(key); if (!record || now > record.resetTime) { rateLimitStore.set(key, { count: 1, resetTime: now + windowMs }); return true; } if (record.count >= maxRequests) { return false; } record.count++; return true; } function getClientIP(request: Request): string { // In production, you might get this from headers like X-Forwarded-For return request.headers.get('x-forwarded-for') || request.headers.get('x-real-ip') || 'unknown'; } Bun.serve({ port: process.env.PORT ? parseInt(process.env.PORT) : 3000, fetch(request) { const url = new URL(request.url); const clientIP = getClientIP(request); // Rate limiting if (!rateLimit(clientIP)) { return new Response(JSON.stringify({ error: 'Too many requests' }), { status: 429, headers: { 'Content-Type': 'application/json' } }); } // CORS const corsHeaders = { 'Access-Control-Allow-Origin': process.env.NODE_ENV === 'production' ? 'https://your-domain.com' : '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type, Authorization', }; // Handle preflight if (request.method === 'OPTIONS') { return new Response(null, { status: 200, headers: corsHeaders }); } // Upload routes if (url.pathname.startsWith('/api/upload/')) { return uploadRouter.handlers(request).then(response => { Object.entries(corsHeaders).forEach(([key, value]) => { response.headers.set(key, value); }); return response; }).catch(error => { console.error('Upload error:', error); return new Response(JSON.stringify({ error: 'Upload failed', message: process.env.NODE_ENV === 'development' ? 
error.message : 'Internal server error' }), { status: 500, headers: { 'Content-Type': 'application/json', ...corsHeaders } }); }); } // API info if (url.pathname === '/api') { return new Response(JSON.stringify({ name: 'Bun Upload API', version: '1.0.0', runtime: 'Bun', endpoints: { health: '/health', upload: '/api/upload/*' } }), { headers: { 'Content-Type': 'application/json', ...corsHeaders } }); } // Health check if (url.pathname === '/health') { return new Response(JSON.stringify({ status: 'ok', runtime: 'Bun', version: Bun.version, timestamp: new Date().toISOString(), uptime: process.uptime() }), { headers: { 'Content-Type': 'application/json', ...corsHeaders } }); } return new Response('Not found', { status: 404, headers: corsHeaders }); }, }); console.log(`๐Ÿš€ Bun server running on http://localhost:${process.env.PORT || 3000}`); console.log(`๐Ÿ“Š Environment: ${process.env.NODE_ENV || 'development'}`); ``` ## File-based Routing ### Structured Application ```typescript title="routes/upload.ts" import { uploadRouter } from '../lib/upload'; export function handleUpload(request: Request) { return uploadRouter.handlers(request); } ``` ```typescript title="routes/api.ts" export function handleAPI(request: Request) { return new Response(JSON.stringify({ name: 'Bun Upload API', version: '1.0.0', runtime: 'Bun' }), { headers: { 'Content-Type': 'application/json' } }); } ``` ```typescript title="server.ts" import { handleUpload } from './routes/upload'; import { handleAPI } from './routes/api'; const routes = { '/api/upload': handleUpload, '/api': handleAPI, '/health': () => new Response(JSON.stringify({ status: 'ok' }), { headers: { 'Content-Type': 'application/json' } }) }; Bun.serve({ port: 3000, fetch(request) { const url = new URL(request.url); for (const [path, handler] of Object.entries(routes)) { if (url.pathname.startsWith(path)) { return handler(request); } } return new Response('Not found', { status: 404 }); }, }); ``` ## Performance Benefits Bun is 3x faster than Node.js, providing incredible performance for file upload operations. No adapter layer means zero performance overhead - pushduck handlers run directly in Bun. Built-in bundler, test runner, package manager, and more - no extra tooling needed. Run TypeScript directly without compilation, perfect for rapid development. ## Deployment ### Docker Deployment ```dockerfile title="Dockerfile" FROM oven/bun:1 as base WORKDIR /usr/src/app # Install dependencies COPY package.json bun.lockb ./ RUN bun install --frozen-lockfile # Copy source code COPY . . # Expose port EXPOSE 3000 # Run the app CMD ["bun", "run", "server.ts"] ``` ### Production Scripts ```json title="package.json" { "name": "bun-upload-server", "version": "1.0.0", "scripts": { "dev": "bun run --watch server.ts", "start": "bun run server.ts", "build": "bun build server.ts --outdir ./dist --target bun", "test": "bun test" }, "dependencies": { "pushduck": "latest" }, "devDependencies": { "bun-types": "latest" } } ``` *** **Bun + Pushduck**: The perfect combination for ultra-fast file uploads with zero configuration overhead and exceptional developer experience. # Elysia URL: /docs/integrations/elysia TypeScript-first framework with Bun - Web Standards native, no adapter needed! *** title: Elysia description: TypeScript-first framework with Bun - Web Standards native, no adapter needed! 
------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; # Elysia Elysia is a TypeScript-first web framework designed for Bun. Since Elysia uses Web Standard `Request` objects natively, pushduck handlers work directly without any adapters! **Web Standards Native**: Elysia exposes `context.request` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash bun add pushduck ``` ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Elysia app with upload routes** ```typescript title="server.ts" import { Elysia } from 'elysia'; import { uploadRouter } from './lib/upload'; const app = new Elysia(); // Direct usage - no adapter needed! app.all('/api/upload/*', (context) => { return uploadRouter.handlers(context.request); }); app.listen(3000); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import { Elysia } from 'elysia'; import { uploadRouter } from './lib/upload'; const app = new Elysia(); // Method 1: Combined handler (recommended) app.all('/api/upload/*', (context) => { return uploadRouter.handlers(context.request); }); // Method 2: Separate handlers (if you need method-specific logic) app.get('/api/upload/*', (context) => uploadRouter.handlers.GET(context.request)); app.post('/api/upload/*', (context) => uploadRouter.handlers.POST(context.request)); app.listen(3000); ``` ### With Middleware and CORS ```typescript title="server.ts" import { Elysia } from 'elysia'; import { cors } from '@elysiajs/cors'; import { uploadRouter } from './lib/upload'; const app = new Elysia() .use(cors({ origin: ['http://localhost:3000', 'https://your-domain.com'], allowedHeaders: ['Content-Type', 'Authorization'], methods: ['GET', 'POST'] })) // Upload routes .all('/api/upload/*', (context) => uploadRouter.handlers(context.request)) // Health check .get('/health', () => ({ status: 'ok' })) .listen(3000); console.log(`๐ŸฆŠ Elysia is running at http://localhost:3000`); ``` ## Advanced Configuration ### Authentication with JWT ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import jwt from '@elysiajs/jwt'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) 
.build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { // Use your JWT verification logic here const payload = jwt.verify(token, process.env.JWT_SECRET!); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ### Full Production Setup ```typescript title="server.ts" import { Elysia } from 'elysia'; import { cors } from '@elysiajs/cors'; import { rateLimit } from '@elysiajs/rate-limit'; import { swagger } from '@elysiajs/swagger'; import { uploadRouter } from './lib/upload'; const app = new Elysia() // Swagger documentation .use(swagger({ documentation: { info: { title: 'Upload API', version: '1.0.0' } } })) // CORS .use(cors({ origin: process.env.NODE_ENV === 'production' ? ['https://your-domain.com'] : true, allowedHeaders: ['Content-Type', 'Authorization'], methods: ['GET', 'POST'] })) // Rate limiting .use(rateLimit({ max: 100, windowMs: 15 * 60 * 1000, // 15 minutes })) // Upload routes .all('/api/upload/*', (context) => uploadRouter.handlers(context.request)) // Health check .get('/health', () => ({ status: 'ok', timestamp: new Date().toISOString() })) .listen(process.env.PORT || 3000); console.log(`๐ŸฆŠ Elysia is running at http://localhost:${process.env.PORT || 3000}`); ``` ## TypeScript Integration ### Type-Safe Client ```typescript title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from './upload'; export const uploadClient = createUploadClient({ baseUrl: process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3000' }); ``` ### Client Usage ```typescript title="components/upload.tsx" import { uploadClient } from '../lib/upload-client'; export function UploadComponent() { const handleUpload = async (files: File[]) => { try { const results = await uploadClient.upload('imageUpload', { files, // Type-safe metadata based on your router configuration metadata: { userId: 'user-123' } }); console.log('Upload successful:', results); } catch (error) { console.error('Upload failed:', error); } }; return ( { if (e.target.files) { handleUpload(Array.from(e.target.files)); } }} /> ); } ``` ## Performance Benefits No adapter layer means zero performance overhead - pushduck handlers run directly in Elysia. Built for Bun's exceptional performance, perfect for high-throughput upload APIs. Full TypeScript support from server to client with compile-time safety. Extensive plugin ecosystem for authentication, validation, rate limiting, and more. ## Deployment ### Production Deployment ```dockerfile title="Dockerfile" FROM oven/bun:1 as base WORKDIR /usr/src/app # Install dependencies COPY package.json bun.lockb ./ RUN bun install --frozen-lockfile # Copy source code COPY . . # Expose port EXPOSE 3000 # Run the app CMD ["bun", "run", "server.ts"] ``` ```bash # Build and run docker build -t my-upload-api . 
docker run -p 3000:3000 my-upload-api ``` *** **Perfect TypeScript Integration**: Elysia's TypeScript-first approach combined with pushduck's type-safe design creates an exceptional developer experience with full end-to-end type safety. # Expo Router URL: /docs/integrations/expo Full-stack React Native file uploads with Expo Router API routes - no adapter needed! *** title: Expo Router description: Full-stack React Native file uploads with Expo Router API routes - no adapter needed! -------------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Expo Router Integration Expo Router is a file-based router for React Native and web applications that enables full-stack development with API routes. Since Expo Router uses Web Standards APIs, pushduck handlers work directly without any adapters! **Web Standards Native**: Expo Router API routes use standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. Perfect for universal React Native apps! ## Quick Setup **Install dependencies** ```bash npx expo install expo-router pushduck # For file uploads on mobile npx expo install expo-document-picker expo-image-picker # For file system operations npx expo install expo-file-system ``` ```bash yarn expo install expo-router pushduck # For file uploads on mobile yarn expo install expo-document-picker expo-image-picker # For file system operations yarn expo install expo-file-system ``` ```bash pnpm expo install expo-router pushduck # For file uploads on mobile pnpm expo install expo-document-picker expo-image-picker # For file system operations pnpm expo install expo-file-system ``` ```bash bun expo install expo-router pushduck # For file uploads on mobile bun expo install expo-document-picker expo-image-picker # For file system operations bun expo install expo-file-system ``` **Configure server output** Enable server-side rendering in your `app.json`: ```json title="app.json" { "expo": { "web": { "output": "server" }, "plugins": [ [ "expo-router", { "origin": "https://your-domain.com" } ] ] } } ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3 } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = s3.createRouter({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; // Direct usage - no adapter needed! 
export async function GET(request: Request) { return uploadRouter.handlers(request); } export async function POST(request: Request) { return uploadRouter.handlers(request); } ``` ## Basic Integration ### Simple Upload Route ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; // Method 1: Combined handler (recommended) export async function GET(request: Request) { return uploadRouter.handlers(request); } export async function POST(request: Request) { return uploadRouter.handlers(request); } // Method 2: Individual methods (if you need method-specific logic) export async function PUT(request: Request) { return uploadRouter.handlers(request); } export async function DELETE(request: Request) { return uploadRouter.handlers(request); } ``` ### With CORS Headers ```typescript title="app/api/upload/[...slug]+api.ts" import { uploadRouter } from '../../../lib/upload'; function addCorsHeaders(response: Response) { response.headers.set('Access-Control-Allow-Origin', '*'); response.headers.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS'); response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization'); return response; } export async function OPTIONS() { return addCorsHeaders(new Response(null, { status: 200 })); } export async function GET(request: Request) { const response = await uploadRouter.handlers(request); return addCorsHeaders(response); } export async function POST(request: Request) { const response = await uploadRouter.handlers(request); return addCorsHeaders(response); } ``` ## Advanced Configuration ### Authentication with Expo Auth ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { jwtVerify } from 'jose'; const { s3 } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = s3.createRouter({ // Private uploads with JWT authentication privateUpload: s3 .image() .max("5MB") .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const secret = new TextEncoder().encode(process.env.JWT_SECRET!); const { payload } = await jwtVerify(token, secret); return { userId: payload.sub as string, platform: 'mobile' }; } catch (error) { throw new Error('Invalid token'); } }), // User profile pictures profilePicture: s3 .image() .max("2MB") .count(1) .formats(['jpeg', 'jpg', 'png', 'webp']) .middleware(async ({ req }) => { const userId = await authenticateUser(req); return { userId, category: 'profile' }; }) .paths({ generateKey: ({ metadata, file }) => { return `profiles/${metadata.userId}/avatar.${file.name.split('.').pop()}`; } }), // Document uploads documents: s3 .file() .max("10MB") .types(['application/pdf', 'text/plain']) .count(5) .middleware(async ({ req }) => { const userId = await authenticateUser(req); return { userId, category: 'documents' }; }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); async function 
authenticateUser(req: Request): Promise { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); const secret = new TextEncoder().encode(process.env.JWT_SECRET!); const { payload } = await jwtVerify(token, secret); return payload.sub as string; } export type AppUploadRouter = typeof uploadRouter; ``` ## Client-Side Usage (React Native) ### Upload Hook ```typescript title="hooks/useUpload.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from '../lib/upload'; export const upload = createUploadClient({ endpoint: '/api/upload' }); ``` ### Image Upload Component ```typescript title="components/ImageUploader.tsx" import React, { useState } from 'react'; import { View, Text, TouchableOpacity, Image, Alert, Platform } from 'react-native'; import * as ImagePicker from 'expo-image-picker'; import { upload } from '../hooks/useUpload'; export default function ImageUploader() { const [selectedImage, setSelectedImage] = useState(null); const { uploadFiles, files, isUploading, error } = upload.imageUpload(); const pickImage = async () => { // Request permission if (Platform.OS !== 'web') { const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync(); if (status !== 'granted') { Alert.alert('Permission needed', 'Camera roll permission is required'); return; } } const result = await ImagePicker.launchImageLibraryAsync({ mediaTypes: ImagePicker.MediaTypeOptions.Images, allowsEditing: true, aspect: [4, 3], quality: 1, }); if (!result.canceled) { const asset = result.assets[0]; setSelectedImage(asset.uri); // Create File object for upload const file = { uri: asset.uri, name: asset.fileName || 'image.jpg', type: asset.type || 'image/jpeg', } as any; uploadFiles([file]); } }; return ( {isUploading ? 'Uploading...' : 'Pick Image'} {error && ( Error: {error.message} )} {selectedImage && ( )} {files.length > 0 && ( {files.map((file) => ( {file.name} {file.status === 'success' ? 'Complete' : `${file.progress}%`} {file.status === 'success' && file.url && ( โœ“ Uploaded )} ))} )} ); } ``` ### Document Upload Component ```typescript title="components/DocumentUploader.tsx" import React, { useState } from 'react'; import { View, Text, TouchableOpacity, Alert, FlatList } from 'react-native'; import * as DocumentPicker from 'expo-document-picker'; import { upload } from '../hooks/useUpload'; interface UploadedFile { name: string; size: number; url: string; } export default function DocumentUploader() { const [uploadedFiles, setUploadedFiles] = useState([]); const { uploadFiles, isUploading, error } = upload.documents(); const pickDocument = async () => { try { const result = await DocumentPicker.getDocumentAsync({ type: ['application/pdf', 'text/plain'], multiple: true, }); if (!result.canceled) { const files = result.assets.map(asset => ({ uri: asset.uri, name: asset.name, type: asset.mimeType || 'application/octet-stream', })) as any[]; const uploadResult = await uploadFiles(files); if (uploadResult.success) { const newFiles = uploadResult.results.map(file => ({ name: file.name, size: file.size, url: file.url, })); setUploadedFiles(prev => [...prev, ...newFiles]); Alert.alert('Success', `${files.length} file(s) uploaded successfully!`); } } } catch (error) { Alert.alert('Error', 'Failed to pick document'); } }; return ( {isUploading ? 'Uploading...' 
: 'Pick Documents'} {error && ( Error: {error.message} )} index.toString()} renderItem={({ item }) => ( {item.name} {(item.size / 1024).toFixed(1)} KB )} /> ); } ``` ## Project Structure Here's a recommended project structure for Expo Router with pushduck: ## Complete Example ### Main Upload Screen ```typescript title="app/(tabs)/upload.tsx" import React from 'react'; import { View, Text, ScrollView, StyleSheet } from 'react-native'; import ImageUploader from '../../components/ImageUploader'; import DocumentUploader from '../../components/DocumentUploader'; export default function UploadScreen() { return ( File Upload Demo Image Upload Document Upload ); } const styles = StyleSheet.create({ container: { flex: 1, backgroundColor: '#fff', }, title: { fontSize: 24, fontWeight: 'bold', textAlign: 'center', marginVertical: 20, }, section: { padding: 20, borderBottomWidth: 1, borderBottomColor: '#eee', }, sectionTitle: { fontSize: 18, fontWeight: '600', marginBottom: 15, }, }); ``` ### Tab Layout ```typescript title="app/(tabs)/_layout.tsx" import { Tabs } from 'expo-router'; import { Ionicons } from '@expo/vector-icons'; export default function TabLayout() { return ( ( ), }} /> ( ), }} /> ); } ``` ## Deployment Options ### EAS Build Configuration Configure automatic server deployment in your `eas.json`: ```json title="eas.json" { "cli": { "version": ">= 5.0.0" }, "build": { "development": { "developmentClient": true, "distribution": "internal", "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } }, "preview": { "distribution": "internal", "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } }, "production": { "env": { "EXPO_UNSTABLE_DEPLOY_SERVER": "1" } } } } ``` Deploy with automatic server: ```bash # Build for all platforms eas build --platform all # Deploy server only npx expo export --platform web eas deploy ``` ### Development Build Setup ```bash # Install dev client npx expo install expo-dev-client # Create development build eas build --profile development # Or run locally npx expo run:ios --configuration Release npx expo run:android --variant release ``` Configure local server origin: ```json title="app.json" { "expo": { "plugins": [ [ "expo-router", { "origin": "http://localhost:8081" } ] ] } } ``` ### Local Development Server ```bash # Start Expo development server npx expo start # Test API routes curl http://localhost:8081/api/upload/presigned-url # Clear cache if needed npx expo start --clear ``` For production testing: ```bash # Export for production npx expo export # Serve locally npx expo serve ``` ## Environment Variables ```bash title=".env" # AWS/Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_REGION=auto AWS_ENDPOINT_URL=https://your-account.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-cloudflare-account-id # JWT Authentication JWT_SECRET=your-jwt-secret # Expo Configuration (for client-side, use EXPO_PUBLIC_ prefix) EXPO_PUBLIC_API_URL=https://your-domain.com ``` **Important**: Server environment variables (without `EXPO_PUBLIC_` prefix) are only available in API routes, not in client code. Client-side variables must use the `EXPO_PUBLIC_` prefix. ## Performance Benefits Share upload logic between web and native platforms with a single codebase. Direct access to native file system APIs for optimal performance on mobile. Built-in support for upload progress tracking and real-time status updates. Deploy to iOS, Android, and web with the same upload infrastructure. 
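To make the environment-variable note above concrete, here is a small sketch of the split between server-only and `EXPO_PUBLIC_*` variables. The file names are hypothetical and the variables are the ones from the `.env` example:

```typescript
// app/api/env-check+api.ts - API routes run on the server,
// so server-only variables like JWT_SECRET are readable here.
export async function GET() {
  return new Response(
    JSON.stringify({ hasJwtSecret: Boolean(process.env.JWT_SECRET) }),
    { headers: { 'Content-Type': 'application/json' } }
  );
}

// lib/api-url.ts - client-side code only sees EXPO_PUBLIC_* variables;
// process.env.JWT_SECRET would be undefined here.
export function getApiBaseUrl(): string {
  return process.env.EXPO_PUBLIC_API_URL ?? 'http://localhost:8081';
}
```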
## Troubleshooting **File Permissions**: Always request proper permissions for camera and photo library access on mobile devices before file operations. **Server Bundle**: Expo Router API routes require server output to be enabled in your `app.json` configuration. ### Common Issues **Metro bundler errors:** ```bash # Clear Metro cache npx expo start --clear # Reset Expo cache npx expo r -c ``` **Permission denied errors:** ```typescript // Always check permissions before file operations import * as ImagePicker from 'expo-image-picker'; const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync(); if (status !== 'granted') { Alert.alert('Permission needed', 'Camera roll permission is required'); return; } ``` **Network errors in development:** ```typescript // Make sure your development server is accessible const { upload } = useUpload('/api/upload', { endpoint: __DEV__ ? 'http://localhost:8081' : 'https://your-domain.com', }); ``` **File upload timeout:** ```typescript const { upload } = useUpload('/api/upload', { timeout: 60000, // 60 seconds }); ``` ### Debug Mode Enable debug logging for development: ```typescript title="lib/upload.ts" const { s3 } = createUploadConfig() .provider("cloudflareR2",{ /* config */ }) .defaults({ debug: __DEV__, // Only in development }) .build(); ``` This will log detailed information about upload requests, file processing, and S3 operations to help diagnose issues during development. ## Framework-Specific Notes 1. **File System Access**: Use `expo-file-system` for advanced file operations 2. **Permissions**: Always request permissions before accessing camera or photo library 3. **Web Compatibility**: Components work on web out of the box with Expo Router 4. **Platform Detection**: Use `Platform.OS` to handle platform-specific logic 5. **Environment Variables**: Server variables don't need `EXPO_PUBLIC_` prefix in API routes # Express URL: /docs/integrations/express Popular Node.js framework integration with pushduck using adapters for req/res API *** title: Express description: Popular Node.js framework integration with pushduck using adapters for req/res API ----------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Express Express uses the traditional Node.js `req`/`res` API pattern. Pushduck provides a simple adapter that converts Web Standard handlers to Express middleware format. **Custom Request/Response API**: Express uses `req`/`res` objects instead of Web Standards, so pushduck provides the `toExpressHandler` adapter for seamless integration. 
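Conceptually, an adapter like `toExpressHandler` only has to bridge the two APIs: build a Web `Request` from Express's `req`, invoke the pushduck handler, and write the returned `Response` back through `res`. The sketch below illustrates that idea only; it is not pushduck's actual implementation, and the real adapter also handles streaming bodies and edge cases:

```typescript
import type { Request as ExpressRequest, Response as ExpressResponse } from 'express';

// Illustration only: convert an Express request into a Web Standard Request.
function toWebRequest(req: ExpressRequest): Request {
  const url = `${req.protocol}://${req.get('host')}${req.originalUrl}`;
  return new Request(url, {
    method: req.method,
    headers: req.headers as Record<string, string>,
    // Real adapters forward the body as a stream; JSON is enough for a sketch.
    body: ['GET', 'HEAD'].includes(req.method) ? undefined : JSON.stringify(req.body),
  });
}

// Illustration only: write a Web Standard Response back through Express.
async function sendWebResponse(res: ExpressResponse, response: Response): Promise<void> {
  res.status(response.status);
  response.headers.forEach((value, key) => res.setHeader(key, value));
  res.send(await response.text());
}
```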
## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Express server with upload routes** ```typescript title="server.ts" import express from 'express'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; const app = express(); // Convert pushduck handlers to Express middleware app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers)); app.listen(3000, () => { console.log('Server running on http://localhost:3000'); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import express from 'express'; import cors from 'cors'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; const app = express(); // Middleware app.use(cors()); app.use(express.json()); // Upload routes using adapter app.all('/api/upload/*', toExpressHandler(uploadRouter.handlers)); // Health check app.get('/health', (req, res) => { res.json({ status: 'healthy', timestamp: new Date().toISOString() }); }); const port = process.env.PORT || 3000; app.listen(port, () => { console.log(`๐Ÿš€ Server running on http://localhost:${port}`); }); ``` ### With Authentication Middleware ```typescript title="server.ts" import express from 'express'; import jwt from 'jsonwebtoken'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; const app = express(); app.use(express.json()); // Authentication middleware const authenticateToken = (req: express.Request, res: express.Response, next: express.NextFunction) => { const authHeader = req.headers['authorization']; const token = authHeader && authHeader.split(' ')[1]; if (!token) { return res.sendStatus(401); } jwt.verify(token, process.env.JWT_SECRET!, (err, user) => { if (err) return res.sendStatus(403); req.user = user; next(); }); }; // Public upload route (no auth) app.all('/api/upload/public/*', toExpressHandler(uploadRouter.handlers)); // Private upload route (with auth) app.all('/api/upload/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers)); app.listen(3000); ``` ## Advanced Configuration ### Upload Configuration with Express Context ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures with authentication 
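  // Note: whatever each route's middleware returns becomes the `metadata`
  // object passed to `generateKey` above, which is how the
  // `${metadata.userId}/...` object keys are produced.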
profilePicture: s3 .image() .max("2MB") .count(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { // Extract user from JWT token in Authorization header const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, userRole: user.role, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .max("10MB") .count(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, category: "documents" }; }), // Public uploads (no authentication) publicImages: s3 .image() .max("1MB") .count(1) .formats(["jpeg", "png"]) // No middleware = public access }); async function verifyJWT(token: string) { // Your JWT verification logic const jwt = await import('jsonwebtoken'); return jwt.verify(token, process.env.JWT_SECRET!) as any; } export type AppUploadRouter = typeof uploadRouter; ``` ### Complete Express Application ```typescript title="server.ts" import express from 'express'; import cors from 'cors'; import helmet from 'helmet'; import rateLimit from 'express-rate-limit'; import { uploadRouter } from './lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; const app = express(); // Security middleware app.use(helmet()); app.use(cors({ origin: process.env.NODE_ENV === 'production' ? 
['https://your-domain.com'] : ['http://localhost:3000'], credentials: true })); // Rate limiting const uploadLimiter = rateLimit({ windowMs: 15 * 60 * 1000, // 15 minutes max: 100, // limit each IP to 100 requests per windowMs message: 'Too many upload requests from this IP, please try again later.', standardHeaders: true, legacyHeaders: false, }); // Body parsing middleware app.use(express.json({ limit: '50mb' })); app.use(express.urlencoded({ extended: true, limit: '50mb' })); // Logging middleware app.use((req, res, next) => { console.log(`${new Date().toISOString()} - ${req.method} ${req.path}`); next(); }); // Health check endpoint app.get('/health', (req, res) => { res.json({ status: 'healthy', timestamp: new Date().toISOString(), uptime: process.uptime(), memory: process.memoryUsage(), version: process.env.npm_package_version || '1.0.0' }); }); // API info endpoint app.get('/api', (req, res) => { res.json({ name: 'Express Upload API', version: '1.0.0', endpoints: { health: '/health', upload: '/api/upload/*' }, uploadTypes: [ 'profilePicture - Single profile picture (2MB max)', 'documents - PDF, Word, text files (10MB max, 5 files)', 'publicImages - Public images (1MB max)' ] }); }); // Upload routes with rate limiting app.all('/api/upload/*', uploadLimiter, toExpressHandler(uploadRouter.handlers)); // 404 handler app.use('*', (req, res) => { res.status(404).json({ error: 'Not Found', message: `Route ${req.originalUrl} not found`, timestamp: new Date().toISOString() }); }); // Error handler app.use((err: Error, req: express.Request, res: express.Response, next: express.NextFunction) => { console.error('Express error:', err); res.status(500).json({ error: 'Internal Server Error', message: process.env.NODE_ENV === 'development' ? err.message : 'Something went wrong', timestamp: new Date().toISOString() }); }); const port = process.env.PORT || 3000; app.listen(port, () => { console.log(`๐Ÿš€ Express server running on http://localhost:${port}`); console.log(`๐Ÿ“ Upload endpoint: http://localhost:${port}/api/upload`); }); ``` ## Project Structure ## Modular Route Organization ### Separate Upload Routes ```typescript title="routes/uploads.ts" import { Router } from 'express'; import { uploadRouter } from '../lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; import { authenticateToken } from '../middleware/auth'; const router = Router(); // Public uploads router.all('/public/*', toExpressHandler(uploadRouter.handlers)); // Private uploads (requires authentication) router.all('/private/*', authenticateToken, toExpressHandler(uploadRouter.handlers)); export default router; ``` ```typescript title="middleware/auth.ts" import { Request, Response, NextFunction } from 'express'; import jwt from 'jsonwebtoken'; export const authenticateToken = (req: Request, res: Response, next: NextFunction) => { const authHeader = req.headers['authorization']; const token = authHeader && authHeader.split(' ')[1]; if (!token) { return res.status(401).json({ error: 'Access token required' }); } jwt.verify(token, process.env.JWT_SECRET!, (err, user) => { if (err) { return res.status(403).json({ error: 'Invalid or expired token' }); } req.user = user; next(); }); }; ``` # Fastify URL: /docs/integrations/fastify High-performance Node.js framework integration with pushduck using adapters *** title: Fastify description: High-performance Node.js framework integration with pushduck using adapters ---------------------------------------------------------------------------------------- import { 
Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; # Fastify Fastify is a high-performance Node.js web framework that uses custom `request`/`reply` objects. Pushduck provides a simple adapter that converts Web Standard handlers to Fastify handler format. **Custom Request/Response API**: Fastify uses `request`/`reply` objects instead of Web Standards, so pushduck provides the `toFastifyHandler` adapter for seamless integration. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Fastify server with upload routes** ```typescript title="server.ts" import Fastify from 'fastify'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const fastify = Fastify({ logger: true }); // Convert pushduck handlers to Fastify handler fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); const start = async () => { try { await fastify.listen({ port: 3000 }); console.log('๐Ÿš€ Fastify server running on http://localhost:3000'); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server.ts" import Fastify from 'fastify'; import cors from '@fastify/cors'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const fastify = Fastify({ logger: { level: 'info', transport: { target: 'pino-pretty' } } }); // Register CORS await fastify.register(cors, { origin: ['http://localhost:3000', 'https://your-domain.com'] }); // Upload routes using adapter fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); // Health check fastify.get('/health', async (request, reply) => { return { status: 'healthy', timestamp: new Date().toISOString(), framework: 'Fastify' }; }); const start = async () => { try { await fastify.listen({ port: 3000, host: '0.0.0.0' }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ### With Authentication Hook ```typescript title="server.ts" import Fastify from 'fastify'; import jwt from '@fastify/jwt'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const fastify = Fastify({ logger: true }); // Register JWT await fastify.register(jwt, { secret: process.env.JWT_SECRET! 
}); // Authentication hook fastify.addHook('preHandler', async (request, reply) => { // Only protect upload routes if (request.url.startsWith('/api/upload/private/')) { try { await request.jwtVerify(); } catch (err) { reply.send(err); } } }); // Public upload routes fastify.all('/api/upload/public/*', toFastifyHandler(uploadRouter.handlers)); // Private upload routes (protected by hook) fastify.all('/api/upload/private/*', toFastifyHandler(uploadRouter.handlers)); const start = async () => { try { await fastify.listen({ port: 3000 }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Advanced Configuration ### Upload Configuration with Fastify Context ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures with authentication profilePicture: s3 .image() .max("2MB") .count(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, userRole: user.role, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .max("10MB") .count(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authentication required'); } const token = authHeader.substring(7); const user = await verifyJWT(token); return { userId: user.id, category: "documents" }; }), // Public uploads (no authentication) publicImages: s3 .image() .max("1MB") .count(1) .formats(["jpeg", "png"]) // No middleware = public access }); async function verifyJWT(token: string) { // Your JWT verification logic const jwt = await import('jsonwebtoken'); return jwt.verify(token, process.env.JWT_SECRET!) as any; } export type AppUploadRouter = typeof uploadRouter; ``` ### Complete Fastify Application ```typescript title="server.ts" import Fastify from 'fastify'; import cors from '@fastify/cors'; import helmet from '@fastify/helmet'; import rateLimit from '@fastify/rate-limit'; import { uploadRouter } from './lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const fastify = Fastify({ logger: { level: process.env.NODE_ENV === 'production' ? 'warn' : 'info', transport: process.env.NODE_ENV !== 'production' ? { target: 'pino-pretty' } : undefined } }); // Security middleware await fastify.register(helmet, { contentSecurityPolicy: false }); // CORS configuration await fastify.register(cors, { origin: process.env.NODE_ENV === 'production' ? 
['https://your-domain.com'] : true, credentials: true }); // Rate limiting await fastify.register(rateLimit, { max: 100, timeWindow: '15 minutes', errorResponseBuilder: (request, context) => ({ error: 'Rate limit exceeded', message: `Too many requests from ${request.ip}. Try again later.`, retryAfter: Math.round(context.ttl / 1000) }) }); // Request logging fastify.addHook('onRequest', async (request, reply) => { request.log.info({ url: request.url, method: request.method }, 'incoming request'); }); // Health check endpoint fastify.get('/health', async (request, reply) => { return { status: 'healthy', timestamp: new Date().toISOString(), uptime: process.uptime(), memory: process.memoryUsage(), version: process.env.npm_package_version || '1.0.0', framework: 'Fastify' }; }); // API info endpoint fastify.get('/api', async (request, reply) => { return { name: 'Fastify Upload API', version: '1.0.0', endpoints: { health: '/health', upload: '/api/upload/*' }, uploadTypes: [ 'profilePicture - Single profile picture (2MB max)', 'documents - PDF, Word, text files (10MB max, 5 files)', 'publicImages - Public images (1MB max)' ] }; }); // Upload routes with rate limiting fastify.register(async function (fastify) { await fastify.register(rateLimit, { max: 50, timeWindow: '15 minutes' }); fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); }); // 404 handler fastify.setNotFoundHandler(async (request, reply) => { reply.status(404).send({ error: 'Not Found', message: `Route ${request.method} ${request.url} not found`, timestamp: new Date().toISOString() }); }); // Error handler fastify.setErrorHandler(async (error, request, reply) => { request.log.error(error, 'Fastify error'); reply.status(500).send({ error: 'Internal Server Error', message: process.env.NODE_ENV === 'development' ? 
error.message : 'Something went wrong', timestamp: new Date().toISOString() }); }); // Graceful shutdown const gracefulShutdown = () => { fastify.log.info('Shutting down gracefully...'); fastify.close().then(() => { fastify.log.info('Server closed'); process.exit(0); }).catch((err) => { fastify.log.error(err, 'Error during shutdown'); process.exit(1); }); }; process.on('SIGTERM', gracefulShutdown); process.on('SIGINT', gracefulShutdown); const start = async () => { try { const port = Number(process.env.PORT) || 3000; const host = process.env.HOST || '0.0.0.0'; await fastify.listen({ port, host }); fastify.log.info(`๐Ÿš€ Fastify server running on http://${host}:${port}`); fastify.log.info(`๐Ÿ“ Upload endpoint: http://${host}:${port}/api/upload`); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Plugin-Based Architecture ### Upload Plugin ```typescript title="plugins/upload.ts" import { FastifyPluginAsync } from 'fastify'; import { uploadRouter } from '../lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const uploadPlugin: FastifyPluginAsync = async (fastify) => { // Upload routes fastify.all('/upload/*', toFastifyHandler(uploadRouter.handlers)); // Upload status endpoint fastify.get('/upload-status', async (request, reply) => { return { status: 'ready', supportedTypes: ['images', 'documents', 'publicImages'], maxSizes: { profilePicture: '2MB', documents: '10MB', publicImages: '1MB' } }; }); }; export default uploadPlugin; ``` ### Main Server with Plugins ```typescript title="server.ts" import Fastify from 'fastify'; import uploadPlugin from './plugins/upload'; const fastify = Fastify({ logger: true }); // Register upload plugin await fastify.register(uploadPlugin, { prefix: '/api' }); const start = async () => { try { await fastify.listen({ port: 3000 }); } catch (err) { fastify.log.error(err); process.exit(1); } }; start(); ``` ## Client Usage The client-side integration is identical regardless of your backend framework: ```typescript title="client/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from '../lib/upload'; export const upload = createUploadClient({ endpoint: 'http://localhost:3000/api/upload', headers: { 'Authorization': `Bearer ${getAuthToken()}` } }); function getAuthToken(): string { return localStorage.getItem('auth-token') || ''; } ``` ```typescript title="client/upload-form.tsx" import { upload } from './upload-client'; export function DocumentUploader() { const { uploadFiles, files, isUploading, error } = upload.documents(); const handleFileSelect = (e: React.ChangeEvent) => { const selectedFiles = Array.from(e.target.files || []); uploadFiles(selectedFiles); }; return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} disabled={isUploading} />

      {error && (
        <p>Error: {error.message}</p>
      )}

      {files.map((file) => (
        <div key={file.name}>
          <span>{file.name}</span>
          {file.status === 'success' && (
            <a href={file.url}>Download</a>
          )}
        </div>
      ))}
    </div>
); } ``` ## Deployment ### Docker Deployment ```dockerfile title="Dockerfile" FROM node:18-alpine WORKDIR /app # Copy package files COPY package*.json ./ RUN npm ci --only=production # Copy source code COPY . . # Build TypeScript RUN npm run build EXPOSE 3000 CMD ["npm", "start"] ``` ### Package Configuration ```json title="package.json" { "name": "fastify-upload-api", "version": "1.0.0", "scripts": { "dev": "tsx watch src/server.ts", "build": "tsc", "start": "node dist/server.js" }, "dependencies": { "fastify": "^4.24.0", "pushduck": "latest", "@fastify/cors": "^8.4.0", "@fastify/helmet": "^11.1.0", "@fastify/rate-limit": "^8.0.0", "@fastify/jwt": "^7.2.0" }, "devDependencies": { "@types/node": "^20.0.0", "tsx": "^3.12.7", "typescript": "^5.0.0", "pino-pretty": "^10.2.0" } } ``` ### Environment Variables ```bash title=".env" # Server Configuration PORT=3000 HOST=0.0.0.0 NODE_ENV=development JWT_SECRET=your-super-secret-jwt-key # Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_r2_access_key AWS_SECRET_ACCESS_KEY=your_r2_secret_key AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-account-id ``` ## Performance Benefits Fastify is one of the fastest Node.js frameworks, perfect for high-throughput upload APIs. Leverage Fastify's extensive plugin ecosystem alongside pushduck's upload capabilities. Excellent TypeScript support with full type safety for both Fastify and pushduck. Built-in schema validation, logging, and error handling for production deployments. *** **Fastify + Pushduck**: High-performance file uploads with Fastify's speed and pushduck's universal design, connected through a simple adapter. # Fresh URL: /docs/integrations/fresh Deno-powered file uploads with Fresh using Web Standards - no adapter needed! *** title: Fresh description: Deno-powered file uploads with Fresh using Web Standards - no adapter needed! ------------------------------------------------------------------------------------------ import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Fresh Integration Fresh is a modern web framework for Deno that uses islands architecture for optimal performance. It uses Web Standards APIs and provides server-side rendering with minimal client-side JavaScript. Since Fresh uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Fresh API routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. 
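Because the handlers accept a standard `Request` and return a standard `Response`, you can also exercise them outside a route, for example in a quick Deno test. This is a sketch only, assuming the upload router configured in the setup below; the test file name and the URL shape are assumptions, and the exact status code depends on how the request maps onto your routes:

```typescript title="upload_handler_test.ts"
import { uploadRouter } from "./lib/upload.ts";

Deno.test("upload handlers return a standard Response", async () => {
  // Any Fetch API Request can be passed straight in, no framework involved.
  const res = await uploadRouter.handlers(
    new Request("http://localhost:8000/api/upload"),
  );

  if (!(res instanceof Response)) {
    throw new Error("expected a Web Standard Response");
  }
  console.log("status:", res.status);
});
```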
## Quick Setup **Install Fresh and pushduck** ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Add pushduck to import_map.json ``` ```json title="import_map.json" { "imports": { "$fresh/": "https://deno.land/x/fresh@1.6.1/", "preact": "https://esm.sh/preact@10.19.2", "preact/": "https://esm.sh/preact@10.19.2/", "pushduck/server": "https://esm.sh/pushduck@latest/server", "pushduck/client": "https://esm.sh/pushduck@latest/client" } } ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via npm (requires Node.js compatibility) npm install pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via yarn (requires Node.js compatibility) yarn add pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via pnpm (requires Node.js compatibility) pnpm add pushduck ``` ```bash # Create a new Fresh project deno run -A -r https://fresh.deno.dev my-app cd my-app # Install pushduck via bun (requires Node.js compatibility) bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!, secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!, region: 'auto', endpoint: Deno.env.get("AWS_ENDPOINT_URL")!, bucket: Deno.env.get("S3_BUCKET_NAME")!, accountId: Deno.env.get("R2_ACCOUNT_ID")!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="routes/api/upload/[...path].ts" import { Handlers } from "$fresh/server.ts"; import { uploadRouter } from "../../../lib/upload.ts"; // Direct usage - no adapter needed! 
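// Fresh calls each method handler with (req, ctx); pushduck only needs the Request.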
export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="routes/api/upload/[...path].ts" import { Handlers } from "$fresh/server.ts"; import { uploadRouter } from "../../../lib/upload.ts"; // Method 1: Combined handler (recommended) export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, }; // Method 2: Universal handler export const handler: Handlers = { async GET(req) { return uploadRouter.handlers(req); }, async POST(req) { return uploadRouter.handlers(req); }, async OPTIONS(req) { return new Response(null, { status: 200, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type', }, }); }, }; ``` ### With Middleware ```typescript title="routes/_middleware.ts" import { MiddlewareHandlerContext } from "$fresh/server.ts"; export async function handler( req: Request, ctx: MiddlewareHandlerContext, ) { // Add CORS headers for upload routes if (ctx.destination === "route" && req.url.includes("/api/upload")) { const response = await ctx.next(); response.headers.set("Access-Control-Allow-Origin", "*"); response.headers.set("Access-Control-Allow-Methods", "GET, POST, OPTIONS"); response.headers.set("Access-Control-Allow-Headers", "Content-Type"); return response; } return ctx.next(); } ``` ## Advanced Configuration ### Authentication with Fresh ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getCookies } from "https://deno.land/std@0.208.0/http/cookie.ts"; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: Deno.env.get("AWS_ACCESS_KEY_ID")!, secretAccessKey: Deno.env.get("AWS_SECRET_ACCESS_KEY")!, region: 'auto', endpoint: Deno.env.get("AWS_ENDPOINT_URL")!, bucket: Deno.env.get("S3_BUCKET_NAME")!, accountId: Deno.env.get("R2_ACCOUNT_ID")!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const cookies = getCookies(req.headers); const sessionId = cookies.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper function async function getUserFromSession(sessionId: string) { // Implement your session validation logic // This could connect to a database, Deno KV, etc. 
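  // For example, with Deno KV (an assumption - adapt to however you store sessions):
  //   const kv = await Deno.openKv();
  //   const session = await kv.get<{ id: string; username: string }>(["sessions", sessionId]);
  //   if (session.value) return session.value;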
return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Island Component ```tsx title="islands/FileUpload.tsx" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "../lib/upload.ts"; const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); export default function FileUpload() { function handleUploadComplete(files: any[]) { console.log("Files uploaded:", files); alert("Upload completed!"); } function handleUploadError(error: Error) { console.error("Upload error:", error); alert(`Upload failed: ${error.message}`); } return (

    <div>
      <section>
        <h2>Image Upload</h2>
        {/* Dropzone for the imageUpload route; wire handleUploadComplete and
            handleUploadError to its callbacks (prop names depend on your pushduck version). */}
        <UploadDropzone />
      </section>

      <section>
        <h2>Document Upload</h2>
        {/* Button for the documentUpload route, wired the same way. */}
        <UploadButton />
      </section>
    </div>

); } ``` ### Using in Pages ```tsx title="routes/index.tsx" import { Head } from "$fresh/runtime.ts"; import FileUpload from "../islands/FileUpload.tsx"; export default function Home() { return ( <> File Upload Demo

File Upload Demo

); } ``` ## File Management ### Server-Side File API ```typescript title="routes/api/files.ts" import { Handlers } from "$fresh/server.ts"; export const handler: Handlers = { async GET(req) { const url = new URL(req.url); const userId = url.searchParams.get('userId'); if (!userId) { return new Response(JSON.stringify({ error: 'User ID required' }), { status: 400, headers: { 'Content-Type': 'application/json' } }); } // Fetch files from database/Deno KV const files = await getFilesForUser(userId); return new Response(JSON.stringify({ files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }), { headers: { 'Content-Type': 'application/json' } }); }, }; async function getFilesForUser(userId: string) { // Example using Deno KV const kv = await Deno.openKv(); const files = []; for await (const entry of kv.list({ prefix: ["files", userId] })) { files.push(entry.value); } return files; } ``` ### File Management Page ```tsx title="routes/files.tsx" import { Head } from "$fresh/runtime.ts"; import { Handlers, PageProps } from "$fresh/server.ts"; import FileUpload from "../islands/FileUpload.tsx"; interface FileData { id: string; name: string; url: string; size: number; uploadedAt: string; } interface PageData { files: FileData[]; } export const handler: Handlers = { async GET(req, ctx) { // Fetch files for current user const files = await getFilesForUser("current-user"); return ctx.render({ files }); }, }; export default function FilesPage({ data }: PageProps) { function formatFileSize(bytes: number): string { const sizes = ['Bytes', 'KB', 'MB', 'GB']; if (bytes === 0) return '0 Bytes'; const i = Math.floor(Math.log(bytes) / Math.log(1024)); return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i]; } return ( <> My Files

My Files

Uploaded Files

{data.files.length === 0 ? (

No files uploaded yet.

) : (
{data.files.map((file) => (

{file.name}

{formatFileSize(file.size)}

{new Date(file.uploadedAt).toLocaleDateString()}

View File
))}
)}
); } async function getFilesForUser(userId: string) { // Implementation depends on your storage solution return []; } ``` ## Deployment Options ```bash # Deploy to Deno Deploy deno task build deployctl deploy --project=my-app --include=. --exclude=node_modules ``` ```json title="deno.json" { "tasks": { "build": "deno run -A dev.ts build", "preview": "deno run -A main.ts", "start": "deno run -A --watch=static/,routes/ dev.ts", "deploy": "deployctl deploy --project=my-app --include=. --exclude=node_modules" } } ``` ```dockerfile title="Dockerfile" FROM denoland/deno:1.38.0 WORKDIR /app # Copy dependency files COPY deno.json deno.lock import_map.json ./ # Cache dependencies RUN deno cache --import-map=import_map.json main.ts # Copy source code COPY . . # Build the application RUN deno task build EXPOSE 8000 CMD ["deno", "run", "-A", "main.ts"] ``` ```bash # Install Deno curl -fsSL https://deno.land/install.sh | sh # Clone and run your app git clone cd deno task start ``` ```systemd title="/etc/systemd/system/fresh-app.service" [Unit] Description=Fresh App After=network.target [Service] Type=simple User=deno WorkingDirectory=/opt/fresh-app ExecStart=/home/deno/.deno/bin/deno run -A main.ts Restart=always [Install] WantedBy=multi-user.target ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Fresh PORT=8000 ``` ## Performance Benefits ## Real-Time Upload Progress ```tsx title="islands/AdvancedUpload.tsx" import { useState } from "preact/hooks"; export default function AdvancedUpload() { const [uploadProgress, setUploadProgress] = useState(0); const [isUploading, setIsUploading] = useState(false); async function handleFileUpload(event: Event) { const target = event.target as HTMLInputElement; const files = target.files; if (!files || files.length === 0) return; setIsUploading(true); setUploadProgress(0); try { // Simulate upload progress for (let i = 0; i <= 100; i += 10) { setUploadProgress(i); await new Promise(resolve => setTimeout(resolve, 100)); } alert('Upload completed!'); } catch (error) { console.error('Upload failed:', error); alert('Upload failed!'); } finally { setIsUploading(false); setUploadProgress(0); } } return (
    <div>
      <input type="file" onChange={handleFileUpload} disabled={isUploading} />

      {isUploading && (
        <div>
          <progress value={uploadProgress} max={100} />
          <p>{uploadProgress}% uploaded</p>
        </div>
      )}
    </div>
); } ``` ## Deno KV Integration ```typescript title="lib/storage.ts" // Example using Deno KV for file metadata storage export class FileStorage { private kv: Deno.Kv; constructor() { this.kv = await Deno.openKv(); } async saveFileMetadata(userId: string, file: { id: string; name: string; url: string; size: number; type: string; }) { const key = ["files", userId, file.id]; await this.kv.set(key, { ...file, createdAt: new Date().toISOString(), }); } async getFilesForUser(userId: string) { const files = []; for await (const entry of this.kv.list({ prefix: ["files", userId] })) { files.push(entry.value); } return files; } async deleteFile(userId: string, fileId: string) { const key = ["files", userId, fileId]; await this.kv.delete(key); } } export const fileStorage = new FileStorage(); ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts` 2. **Import errors**: Check your `import_map.json` configuration 3. **Permissions**: Deno requires explicit permissions (`-A` flag for all permissions) 4. **Environment variables**: Use `Deno.env.get()` instead of `process.env` ### Debug Mode Enable debug logging: ```typescript title="lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (Deno.env.get("DENO_ENV") === "development") { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Fresh Configuration ```typescript title="fresh.config.ts" import { defineConfig } from "$fresh/server.ts"; export default defineConfig({ plugins: [], // Enable static file serving staticDir: "./static", // Custom build options build: { target: ["chrome99", "firefox99", "safari15"], }, }); ``` Fresh provides an excellent foundation for building modern web applications with Deno and pushduck, combining the power of islands architecture with Web Standards APIs and Deno's secure runtime environment. # Hono URL: /docs/integrations/hono Fast, lightweight file uploads with Hono using Web Standards - no adapter needed! *** title: Hono description: Fast, lightweight file uploads with Hono using Web Standards - no adapter needed! ---------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Hono Integration Hono is a fast, lightweight web framework built on Web Standards. Since Hono uses `Request` and `Response` objects natively, pushduck handlers work directly without any adapters! **Web Standards Native**: Hono exposes `c.req.raw` as a Web Standard `Request` object, making pushduck integration seamless with zero overhead. 
## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create Hono app with upload routes** ```typescript title="app.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Direct usage - no adapter needed! app.all('/api/upload/*', (c) => { return uploadRouter.handlers(c.req.raw); }); export default app; ``` ## Basic Integration ### Simple Upload Route ```typescript title="app.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Method 1: Combined handler (recommended) app.all('/api/upload/*', (c) => { return uploadRouter.handlers(c.req.raw); }); // Method 2: Separate handlers (if you need method-specific logic) app.get('/api/upload/*', (c) => uploadRouter.handlers.GET(c.req.raw)); app.post('/api/upload/*', (c) => uploadRouter.handlers.POST(c.req.raw)); export default app; ``` ### With Middleware ```typescript title="app.ts" import { Hono } from 'hono'; import { cors } from 'hono/cors'; import { logger } from 'hono/logger'; import { uploadRouter } from './lib/upload'; const app = new Hono(); // Global middleware app.use('*', logger()); app.use('*', cors({ origin: ['http://localhost:3000', 'https://your-domain.com'], allowMethods: ['GET', 'POST'], allowHeaders: ['Content-Type'], })); // Upload routes app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); // Health check app.get('/health', (c) => c.json({ status: 'ok' })); export default app; ``` ## Advanced Configuration ### Authentication with Hono ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { verify } from 'hono/jwt'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.get('authorization'); if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = await verify(token, process.env.JWT_SECRET!); return { userId: payload.sub as string, userRole: payload.role as string }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ## Deployment 
Options ```typescript title="src/index.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); export default app; ``` ```toml title="wrangler.toml" name = "my-upload-api" main = "src/index.ts" compatibility_date = "2023-12-01" [env.production] vars = { NODE_ENV = "production" } ``` ```bash # Deploy to Cloudflare Workers npx wrangler deploy ``` ```typescript title="server.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); export default { port: 3000, fetch: app.fetch, }; ``` ```bash # Run with Bun bun run server.ts ``` ```typescript title="server.ts" import { serve } from '@hono/node-server'; import { Hono } from 'hono'; import { uploadRouter } from './lib/upload'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); const port = 3000; console.log(`Server is running on port ${port}`); serve({ fetch: app.fetch, port }); ``` ```bash # Run with Node.js npm run dev ``` ```typescript title="server.ts" import { Hono } from 'hono'; import { uploadRouter } from './lib/upload.ts'; const app = new Hono(); app.all('/api/upload/*', (c) => uploadRouter.handlers(c.req.raw)); Deno.serve(app.fetch); ``` ```bash # Run with Deno deno run --allow-net --allow-env server.ts ``` ## Performance Benefits No adapter layer means zero performance overhead - pushduck handlers run directly in Hono. Hono is one of the fastest web frameworks, perfect for high-performance upload APIs. Works on Cloudflare Workers, Bun, Node.js, and Deno with the same code. Hono + pushduck creates incredibly lightweight upload services. *** **Perfect Match**: Hono's Web Standards foundation and pushduck's universal design create a powerful, fast, and lightweight file upload solution that works everywhere. # Next.js URL: /docs/integrations/nextjs Complete guide to integrating pushduck with Next.js App Router and Pages Router *** title: Next.js description: Complete guide to integrating pushduck with Next.js App Router and Pages Router icon: "nextjs" -------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Next.js Integration Pushduck provides seamless integration with both Next.js App Router and Pages Router through universal handlers that work with Next.js's Web Standards-based API. **Next.js 13+**: App Router uses Web Standards (Request/Response), so pushduck handlers work directly. Pages Router requires a simple adapter for the legacy req/res API. 
## Quick Setup **Install pushduck** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure your upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; // Direct usage (recommended) export const { GET, POST } = uploadRouter.handlers; ``` ```typescript title="pages/api/upload/[...path].ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsPagesHandler } from 'pushduck/server'; export default toNextJsPagesHandler(uploadRouter.handlers); ``` ## App Router Integration Next.js App Router uses Web Standards, making integration seamless: ### Basic API Route ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; // Direct usage - works because Next.js App Router uses Web Standards export const { GET, POST } = uploadRouter.handlers; ``` ### With Type Safety Adapter For extra type safety and better IDE support: ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsHandler } from 'pushduck/adapters/nextjs'; // Explicit adapter for enhanced type safety export const { GET, POST } = toNextJsHandler(uploadRouter.handlers); ``` ### Advanced Configuration ```typescript title="app/api/upload/route.ts" import { createUploadConfig } from 'pushduck/server'; import { getServerSession } from 'next-auth'; import { authOptions } from '@/lib/auth'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); const uploadRouter = createS3Router({ // Profile pictures with authentication profilePicture: s3 .image() .max("2MB") .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id, category: "profile" }; }), // Document uploads for authenticated users documents: s3 .file() .max("10MB") .types(["application/pdf", "text/plain", "application/msword"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id, category: "documents" }; }), // Public image uploads (no auth required) publicImages: s3 .image() .max("5MB") .formats(["jpeg", "png", "webp"]) // No middleware = publicly accessible }); export type AppUploadRouter = typeof uploadRouter; export const { GET, POST } = uploadRouter.handlers; ``` ## Pages Router 
Integration Pages Router uses the legacy req/res API, so we provide a simple adapter: ### Basic API Route ```typescript title="pages/api/upload/[...path].ts" import { uploadRouter } from '@/lib/upload'; import { toNextJsPagesHandler } from 'pushduck/adapters/nextjs-pages'; export default toNextJsPagesHandler(uploadRouter.handlers); ``` ### With Authentication ```typescript title="pages/api/upload/[...path].ts" import { createUploadConfig } from 'pushduck/server'; import { toNextJsPagesHandler } from 'pushduck/adapters/nextjs-pages'; import { getSession } from 'next-auth/react'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ // ... your config }) .build(); const uploadRouter = createS3Router({ imageUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { // Convert Web Request to get session const session = await getSession({ req: req as any }); if (!session?.user?.id) { throw new Error("Authentication required"); } return { userId: session.user.id }; }) }); export default toNextJsPagesHandler(uploadRouter.handlers); ``` ## Client-Side Usage The client-side code is identical for both App Router and Pages Router: ### Setup Upload Client ```typescript title="lib/upload-client.ts" import { createUploadClient } from 'pushduck/client'; import type { AppUploadRouter } from './upload'; export const upload = createUploadClient({ endpoint: '/api/upload' }); ``` ### React Component ```typescript title="components/upload-form.tsx" 'use client'; // App Router // or just regular component for Pages Router import { upload } from '@/lib/upload-client'; import { useState } from 'react'; export function UploadForm() { const { uploadFiles, files, isUploading, error } = upload.imageUpload(); const handleFileSelect = (e: React.ChangeEvent) => { const selectedFiles = Array.from(e.target.files || []); uploadFiles(selectedFiles); }; return (
    <div>
      <input type="file" multiple onChange={handleFileSelect} disabled={isUploading} />

      {error && (
        <p>Error: {error.message}</p>
      )}

      {files.length > 0 && (
        <ul>
          {files.map((file) => (
            <li key={file.name}>
              <span>{file.name}</span>
              <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
              <span>{file.status === 'success' ? 'Complete' : `${file.progress}%`}</span>
              {file.status === 'success' && file.url && (
                <a href={file.url} target="_blank" rel="noreferrer">View</a>
              )}
            </li>
          ))}
        </ul>
      )}
    </div>
); } ``` ## Project Structure Here's a recommended project structure for Next.js with pushduck: ## Complete Example ### Upload Configuration ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getServerSession } from 'next-auth'; import { authOptions } from './auth'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { const timestamp = Date.now(); const randomId = Math.random().toString(36).substring(2, 8); return `${metadata.userId}/${timestamp}/${randomId}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Profile pictures - single image, authenticated profilePicture: s3 .image() .max("2MB") .count(1) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new Error("Authentication required"); return { userId: session.user.id, type: "profile" }; }), // Gallery images - multiple images, authenticated gallery: s3 .image() .max("5MB") .count(10) .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new Error("Authentication required"); return { userId: session.user.id, type: "gallery" }; }), // Documents - various file types, authenticated documents: s3 .file() .max("10MB") .count(5) .types([ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain" ]) .middleware(async ({ req }) => { const session = await getServerSession(authOptions); if (!session?.user?.id) throw new Error("Authentication required"); return { userId: session.user.id, type: "documents" }; }), // Public uploads - no authentication required public: s3 .image() .max("1MB") .count(1) .formats(["jpeg", "png"]) // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ### API Route (App Router) ```typescript title="app/api/upload/route.ts" import { uploadRouter } from '@/lib/upload'; export const { GET, POST } = uploadRouter.handlers; ``` ### Upload Page ```typescript title="app/upload/page.tsx" 'use client'; import { upload } from '@/lib/upload-client'; import { useState } from 'react'; export default function UploadPage() { const [activeTab, setActiveTab] = useState<'profile' | 'gallery' | 'documents'>('profile'); const profileUpload = upload.profilePicture(); const galleryUpload = upload.gallery(); const documentsUpload = upload.documents(); const currentUpload = { profile: profileUpload, gallery: galleryUpload, documents: documentsUpload }[activeTab]; return (

      <div>
        <h1>File Upload Demo</h1>

        {/* Tab Navigation */}
        <div>
          {[
            { key: 'profile', label: 'Profile Picture', icon: '👤' },
            { key: 'gallery', label: 'Gallery', icon: '🖼️' },
            { key: 'documents', label: 'Documents', icon: '📄' }
          ].map(tab => (
            <button
              key={tab.key}
              onClick={() => setActiveTab(tab.key as 'profile' | 'gallery' | 'documents')}
            >
              {tab.icon} {tab.label}
            </button>
          ))}
        </div>

        {/* Upload Interface */}
        <input
          type="file"
          multiple
          onChange={(e) => {
            const files = Array.from(e.target.files || []);
            currentUpload.uploadFiles(files);
          }}
          disabled={currentUpload.isUploading}
          className="block w-full text-sm text-gray-500 file:mr-4 file:py-2 file:px-4 file:rounded-full file:border-0 file:text-sm file:font-semibold file:bg-blue-50 file:text-blue-700 hover:file:bg-blue-100"
        />

        {/* File List */}
        {currentUpload.files.length > 0 && (
          <ul>
            {currentUpload.files.map((file) => (
              <li key={file.name}>
                <span>{file.name}</span>
                <span>{(file.size / 1024 / 1024).toFixed(2)} MB</span>
                <span>
                  {file.status === 'success' && '✅'}
                  {file.status === 'error' && '❌'}
                  {file.status === 'uploading' && '⏳'}
                  {file.status === 'pending' && '⏸️'}
                </span>
                {file.status === 'success' && file.url && (
                  <a href={file.url} target="_blank" rel="noreferrer">View</a>
                )}
              </li>
            ))}
          </ul>
        )}
      </div>
); } ``` ## Environment Variables ```bash title=".env.local" # Cloudflare R2 Configuration AWS_ACCESS_KEY_ID=your_r2_access_key AWS_SECRET_ACCESS_KEY=your_r2_secret_key AWS_ENDPOINT_URL=https://your-account-id.r2.cloudflarestorage.com S3_BUCKET_NAME=your-bucket-name R2_ACCOUNT_ID=your-account-id # Next.js Configuration NEXTAUTH_SECRET=your-nextauth-secret NEXTAUTH_URL=http://localhost:3000 ``` ## Deployment Considerations * Environment variables configured in dashboard * Edge Runtime compatible * Automatic HTTPS * Configure environment variables * Works with Netlify Functions * CDN integration available * Complete Next.js compatibility * Environment variable management * Automatic deployments *** **Next.js Ready**: Pushduck works seamlessly with both Next.js App Router and Pages Router, providing the same great developer experience across all Next.js versions. # Nitro/H3 URL: /docs/integrations/nitro-h3 Universal web server file uploads with Nitro and H3 using Web Standards - no adapter needed! *** title: Nitro/H3 description: Universal web server file uploads with Nitro and H3 using Web Standards - no adapter needed! --------------------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Nitro/H3 Integration Nitro is a universal web server framework that powers Nuxt.js, built on top of H3 (HTTP framework). It uses Web Standards APIs and provides excellent performance with universal deployment. Since Nitro/H3 uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Nitro/H3 uses Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and universal deployment capabilities. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; // Direct usage - no adapter needed! 
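// Note: `event.node.req` is Node's IncomingMessage, not a Web Standard Request.
// If a Fetch API Request is needed here, recent h3 versions expose `toWebRequest(event)`;
// treat this as an assumption and verify against your installed h3/pushduck versions.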
export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; // Method 1: Combined handler (recommended) export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); // Method 2: Method-specific handlers export default defineEventHandler(async (event) => { const method = getMethod(event); if (method === 'GET') { return uploadRouter.handlers.GET(event.node.req); } if (method === 'POST') { return uploadRouter.handlers.POST(event.node.req); } throw createError({ statusCode: 405, statusMessage: 'Method Not Allowed' }); }); ``` ### With H3 Utilities ```typescript title="routes/api/upload/[...path].ts" import { uploadRouter } from '~/lib/upload'; import { defineEventHandler, getMethod, setHeader, createError } from 'h3'; export default defineEventHandler(async (event) => { // Handle CORS setHeader(event, 'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); // Handle preflight requests if (getMethod(event) === 'OPTIONS') { return ''; } try { return await uploadRouter.handlers(event.node.req); } catch (error) { throw createError({ statusCode: 500, statusMessage: 'Upload failed', data: error }); } }); ``` ## Advanced Configuration ### Authentication with H3 ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; import { getCookie } from 'h3'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with session authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const cookies = req.headers.cookie; const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | undefined) { if (!cookieString) return {}; return Object.fromEntries( cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement your session validation logic return { id: 'user-123', username: 'demo-user' }; } ``` ## Standalone Nitro App ### Basic Nitro Setup ```typescript title="nitro.config.ts" export default defineNitroConfig({ srcDir: 'server', routeRules: { '/api/upload/**': { cors: true, headers: { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'GET, POST, OPTIONS', 'Access-Control-Allow-Headers': 'Content-Type' } } }, experimental: { wasm: true } }); ``` ### Server Entry Point ```typescript title="server/index.ts" import { 
createApp, defineEventHandler, toNodeListener } from 'h3';
import { uploadRouter } from './lib/upload';

const app = createApp();

// Upload routes
app.use('/api/upload/**', defineEventHandler(async (event) => {
  return uploadRouter.handlers(event.node.req);
}));

// Health check
app.use('/health', defineEventHandler(() => ({ status: 'ok' })));

export default toNodeListener(app);
```

## Client-Side Usage

### HTML with Vanilla JavaScript

```html title="public/index.html"

<h1>File Upload Demo</h1>

<section>
  <h2>Image Upload</h2>
  <input id="image-input" type="file" accept="image/*" multiple />
</section>

<section>
  <h2>Document Upload</h2>
  <input id="document-input" type="file" multiple />
</section>
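<!--
  A minimal sketch of client-side wiring for the inputs above. The element ids
  and the console logging are illustrative assumptions; the actual upload call
  against the /api/upload/* routes depends on how you drive pushduck from the
  browser (for example, a small bundle built on pushduck's client helpers).
-->
<script>
  function watchInput(id, routeName) {
    const input = document.getElementById(id);
    input.addEventListener('change', () => {
      const files = Array.from(input.files || []);
      console.log(`Selected ${files.length} file(s) for ${routeName}:`, files.map((f) => f.name));
      // Hand the selected files to your upload logic here.
    });
  }

  watchInput('image-input', 'imageUpload');
  watchInput('document-input', 'documentUpload');
</script>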

```

### With Framework Integration

```typescript title="plugins/upload.client.ts"
import { useUpload } from "pushduck/client";
import type { AppUploadRouter } from "~/lib/upload";

export const { UploadButton, UploadDropzone } = useUpload({
  endpoint: "/api/upload",
});
```

## File Management

### File API Route

```typescript title="routes/api/files.get.ts"
import { defineEventHandler, getQuery, createError } from 'h3';

export default defineEventHandler(async (event) => {
  const query = getQuery(event);
  const userId = query.userId as string;

  if (!userId) {
    throw createError({
      statusCode: 400,
      statusMessage: 'User ID required'
    });
  }

  // Fetch files from database
  const files = await getFilesForUser(userId);

  return {
    files: files.map(file => ({
      id: file.id,
      name: file.name,
      url: file.url,
      size: file.size,
      uploadedAt: file.createdAt,
    })),
  };
});

async function getFilesForUser(userId: string) {
  // Implement your database query logic
  return [];
}
```

### File Management Page

```html title="public/files.html"

<h1>My Files</h1>

<h2>Uploaded Files</h2>
<ul id="file-list"></ul>
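<!--
  A minimal sketch (illustrative, not from the original docs) of how this page
  could populate the list above from the /api/files route shown earlier. The
  userId value and the #file-list element are assumptions.
-->
<script>
  async function loadFiles() {
    const response = await fetch('/api/files?userId=current-user');
    if (!response.ok) {
      console.error('Failed to load files:', response.status);
      return;
    }

    const { files } = await response.json();
    const list = document.getElementById('file-list');
    list.innerHTML = '';

    for (const file of files) {
      const item = document.createElement('li');
      const link = document.createElement('a');
      link.href = file.url;
      link.target = '_blank';
      link.textContent = `${file.name} (${(file.size / 1024 / 1024).toFixed(2)} MB)`;
      item.appendChild(link);
      list.appendChild(item);
    }
  }

  loadFiles();
</script>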

``` ## Deployment Options ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'vercel-edge', // or 'vercel' for Node.js runtime }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'netlify-edge', // or 'netlify' for Node.js runtime }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'node-server', }); ``` ```typescript title="nitro.config.ts" export default defineNitroConfig({ preset: 'cloudflare-workers', }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # Nitro NITRO_PORT=3000 NITRO_HOST=0.0.0.0 ``` ## Performance Benefits ## Middleware and Plugins ```typescript title="middleware/cors.ts" export default defineEventHandler(async (event) => { if (event.node.req.url?.startsWith('/api/upload')) { setHeader(event, 'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); if (getMethod(event) === 'OPTIONS') { return ''; } } }); ``` ```typescript title="plugins/database.ts" export default async (nitroApp) => { // Initialize database connection console.log('Database plugin initialized'); // Add database to context nitroApp.hooks.hook('request', async (event) => { event.context.db = await getDatabase(); }); }; ``` ## Real-Time Upload Progress ```html title="public/advanced-upload.html" Advanced Upload
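<!--
  A minimal sketch of browser-side progress tracking with XMLHttpRequest. It
  assumes your upload flow has already produced a URL to upload the file to
  (the uploadUrl parameter below is a placeholder, not a pushduck API), and
  the element ids are illustrative.
-->
<input id="file-input" type="file" />
<progress id="upload-progress" value="0" max="100"></progress>

<script>
  function uploadWithProgress(file, uploadUrl) {
    return new Promise((resolve, reject) => {
      const xhr = new XMLHttpRequest();
      xhr.open('PUT', uploadUrl);

      // Update the progress bar as the browser streams the file
      xhr.upload.onprogress = (event) => {
        if (event.lengthComputable) {
          const percent = Math.round((event.loaded / event.total) * 100);
          document.getElementById('upload-progress').value = percent;
        }
      };

      xhr.onload = () => (xhr.status < 300 ? resolve(xhr) : reject(new Error(`HTTP ${xhr.status}`)));
      xhr.onerror = () => reject(new Error('Network error'));
      xhr.send(file);
    });
  }

  document.getElementById('file-input').addEventListener('change', (event) => {
    const file = event.target.files && event.target.files[0];
    if (!file) return;
    // Obtain an upload URL from your upload route first, then:
    // uploadWithProgress(file, uploadUrl).then(() => console.log('Done'));
  });
</script>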
``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `routes/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck and h3 are properly installed 3. **CORS issues**: Use Nitro's built-in CORS handling or middleware 4. **Environment variables**: Make sure they're accessible in your deployment environment ### Debug Mode Enable debug logging: ```typescript title="lib/upload.ts" export const uploadRouter = createS3Router({ // ... routes }).middleware(async ({ req, file }) => { if (process.env.NODE_ENV === "development") { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Nitro Configuration ```typescript title="nitro.config.ts" export default defineNitroConfig({ srcDir: 'server', buildDir: '.nitro', output: { dir: '.output', serverDir: '.output/server', publicDir: '.output/public' }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, }, experimental: { wasm: true } }); ``` Nitro/H3 provides an excellent foundation for building universal web applications with pushduck, offering flexibility, performance, and deployment options across any platform while maintaining full compatibility with Web Standards APIs. # Nuxt.js URL: /docs/integrations/nuxtjs Vue.js full-stack file uploads with Nuxt.js using Web Standards - no adapter needed! *** title: Nuxt.js description: Vue.js full-stack file uploads with Nuxt.js using Web Standards - no adapter needed! ------------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Nuxt.js Integration Nuxt.js is the intuitive Vue.js framework for building full-stack web applications. It uses Web Standards APIs and provides excellent performance with server-side rendering. Since Nuxt.js uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Nuxt.js server routes use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="server/utils/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="server/api/upload/[...path].ts" import { uploadRouter } from '~/server/utils/upload'; // Direct usage - no adapter needed! 
export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); ``` ## Basic Integration ### Simple Upload Route ```typescript title="server/api/upload/[...path].ts" import { uploadRouter } from '~/server/utils/upload'; // Method 1: Combined handler (recommended) export default defineEventHandler(async (event) => { return uploadRouter.handlers(event.node.req); }); // Method 2: Method-specific handlers export default defineEventHandler({ onRequest: [ // Add middleware here if needed ], handler: async (event) => { if (event.node.req.method === 'GET') { return uploadRouter.handlers.GET(event.node.req); } if (event.node.req.method === 'POST') { return uploadRouter.handlers.POST(event.node.req); } } }); ``` ### With Server Middleware ```typescript title="server/middleware/cors.ts" export default defineEventHandler(async (event) => { if (event.node.req.url?.startsWith('/api/upload')) { // Handle CORS for upload routes setHeader(event, 'Access-Control-Allow-Origin', '*'); setHeader(event, 'Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); setHeader(event, 'Access-Control-Allow-Headers', 'Content-Type'); if (event.node.req.method === 'OPTIONS') { return ''; } } }); ``` ## Advanced Configuration ### Authentication with Nuxt ```typescript title="server/utils/upload.ts" import { createUploadConfig } from 'pushduck/server'; import jwt from 'jsonwebtoken'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with JWT authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const authHeader = req.headers.authorization; if (!authHeader?.startsWith('Bearer ')) { throw new Error('Authorization required'); } const token = authHeader.substring(7); try { const payload = jwt.verify(token, process.env.JWT_SECRET!) 
as any; return { userId: payload.sub, userRole: payload.role }; } catch (error) { throw new Error('Invalid token'); } }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; ``` ## Client-Side Usage ### Upload Composable ```typescript title="composables/useUpload.ts" import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "~/server/utils/upload"; export const { UploadButton, UploadDropzone } = useUpload({ endpoint: "/api/upload", }); ``` ### Upload Component ```vue title="components/FileUpload.vue" ``` ### Using in Pages ```vue title="pages/index.vue" ``` ## File Management ### Server-Side File API ```typescript title="server/api/files.get.ts" export default defineEventHandler(async (event) => { const query = getQuery(event); const userId = query.userId as string; if (!userId) { throw createError({ statusCode: 400, statusMessage: 'User ID required' }); } // Fetch files from database const files = await $fetch('/api/database/files', { query: { userId } }); return { files: files.map((file: any) => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }; }); ``` ### File Management Page ```vue title="pages/files.vue" ``` ## Deployment Options ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'vercel-edge', // or 'vercel' for Node.js runtime }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'netlify-edge', // or 'netlify' for Node.js runtime }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'node-server', }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { preset: 'cloudflare-pages', }, runtimeConfig: { awsAccessKeyId: process.env.AWS_ACCESS_KEY_ID, awsSecretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, awsRegion: process.env.AWS_REGION, s3BucketName: process.env.S3_BUCKET_NAME, } }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration AWS_REGION=us-east-1 AWS_ACCESS_KEY_ID=your_access_key AWS_SECRET_ACCESS_KEY=your_secret_key AWS_S3_BUCKET=your-bucket-name # JWT Secret (for authentication) JWT_SECRET=your-jwt-secret # Nuxt NUXT_PUBLIC_UPLOAD_ENDPOINT=http://localhost:3000/api/upload ``` ## Performance Benefits ## Real-Time Upload Progress ```vue title="components/AdvancedUpload.vue" ``` ## Troubleshooting **Common Issues** 1. **Route not found**: Ensure your route is `server/api/upload/[...path].ts` 2. **Build errors**: Check that pushduck is properly installed 3. **CORS issues**: Use server middleware for CORS configuration 4. **Runtime config**: Make sure environment variables are properly configured ### Debug Mode Enable debug logging: ```typescript title="server/utils/upload.ts" export const uploadRouter = createS3Router({ // ... 
routes }).middleware(async ({ req, file }) => { if (process.dev) { console.log("Upload request:", req.url); console.log("File:", file.name, file.size); } return {}; }); ``` ### Nitro Configuration ```typescript title="nuxt.config.ts" export default defineNuxtConfig({ nitro: { experimental: { wasm: true }, // Enable debugging in development devProxy: { '/api/upload': { target: 'http://localhost:3000/api/upload', changeOrigin: true } } } }); ``` Nuxt.js provides an excellent foundation for building full-stack Vue.js applications with pushduck, combining the power of Vue's reactive framework with Web Standards APIs and Nitro's universal deployment capabilities. # Overview URL: /docs/integrations/overview Universal file uploads that work with any web framework - from Web Standards to custom request/response APIs *** title: Overview description: Universal file uploads that work with any web framework - from Web Standards to custom request/response APIs ------------------------------------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; # Framework Integrations Overview Pushduck provides **universal file upload handlers** that work with any web framework through a single, consistent API. Write your upload logic once and deploy it anywhere! **Universal Design**: Pushduck uses Web Standards (Request/Response) at its core, making it compatible with both Web Standards frameworks and those with custom request/response APIs without framework-specific code. ## ๐ŸŒŸ Universal API All frameworks use the same core API: ```typescript import { createS3Router, s3 } from 'pushduck/server'; const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB"), videoUpload: s3.file().max("100MB").types(["video/*"]) }); // Universal handlers - work with ANY framework export const { GET, POST } = uploadRouter.handlers; ``` ## Framework Categories Pushduck supports frameworks in two categories: **No adapter needed!** Use `uploadRouter.handlers` directly. * Hono * Elysia * Bun Runtime * TanStack Start * SolidJS Start **Simple adapters provided** for seamless integration. * Next.js (App & Pages Router) * Express * Fastify ## Quick Start by Framework ```typescript // Works with: Hono, Elysia, Bun, TanStack Start, SolidJS Start import { uploadRouter } from '@/lib/upload'; // Direct usage - no adapter needed! 
app.all('/api/upload/*', (ctx) => { return uploadRouter.handlers(ctx.request); // or c.req.raw }); ``` ```typescript // app/api/upload/route.ts import { uploadRouter } from '@/lib/upload'; // Direct usage (recommended) export const { GET, POST } = uploadRouter.handlers; // Or with explicit adapter for extra type safety import { toNextJsHandler } from 'pushduck/adapters/nextjs'; export const { GET, POST } = toNextJsHandler(uploadRouter.handlers); ``` ```typescript import express from 'express'; import { uploadRouter } from '@/lib/upload'; import { toExpressHandler } from 'pushduck/adapters/express'; const app = express(); app.all("/api/upload/*", toExpressHandler(uploadRouter.handlers)); ``` ```typescript import Fastify from 'fastify'; import { uploadRouter } from '@/lib/upload'; import { toFastifyHandler } from 'pushduck/adapters/fastify'; const fastify = Fastify(); fastify.all('/api/upload/*', toFastifyHandler(uploadRouter.handlers)); ``` ## Why Universal Handlers Work **Web Standards Foundation** Pushduck is built on Web Standards (`Request` and `Response` objects) that are supported by all modern JavaScript runtimes. ```typescript // Core handler signature type Handler = (request: Request) => Promise ``` **Framework Compatibility** Modern frameworks expose Web Standard objects directly: * **Hono**: `c.req.raw` is a Web `Request` * **Elysia**: `context.request` is a Web `Request` * **Bun**: Native Web `Request` support * **TanStack Start**: `{ request }` is a Web `Request` * **SolidJS Start**: `event.request` is a Web `Request` **Framework Adapters** For frameworks with custom request/response APIs, simple adapters convert between formats: ```typescript // Express adapter example export function toExpressHandler(handlers: UniversalHandlers) { return async (req: Request, res: Response, next: NextFunction) => { const webRequest = convertExpressToWebRequest(req); const webResponse = await handlers[req.method](webRequest); convertWebResponseToExpress(webResponse, res); }; } ``` ## Configuration (Same for All Frameworks) Your upload configuration is identical across all frameworks: ```typescript title="lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: process.env.AWS_ACCESS_KEY_ID!, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: process.env.AWS_ENDPOINT_URL!, bucket: process.env.S3_BUCKET_NAME!, accountId: process.env.R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Image uploads with validation imageUpload: s3 .image() .max("5MB") .formats(["jpeg", "png", "webp"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: "images" }; }), // Document uploads documentUpload: s3 .file() .max("10MB") .types(["application/pdf", "text/plain"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: "documents" }; }), // Video uploads videoUpload: s3 .file() .max("100MB") .types(["video/mp4", "video/quicktime"]) .middleware(async ({ req }) => { const userId = await getUserId(req); return { userId, category: "videos" }; }) }); export type AppUploadRouter = typeof uploadRouter; ``` ## Client Usage (Framework Independent) The client-side code is identical regardless of your backend framework: ```typescript title="lib/upload-client.ts" 
import { createUploadClient } from 'pushduck/client';
import type { AppUploadRouter } from './upload';

export const upload = createUploadClient<AppUploadRouter>({
  endpoint: '/api/upload'
});
```

```typescript title="components/upload-form.tsx"
import { upload } from '@/lib/upload-client';

export function UploadForm() {
  // Property-based access with full type safety
  const { uploadFiles, files, isUploading } = upload.imageUpload();

  const handleUpload = async (selectedFiles: File[]) => {
    await uploadFiles(selectedFiles);
  };

  return (
    <div>
      <input
        type="file"
        multiple
        disabled={isUploading}
        onChange={(e) => handleUpload(Array.from(e.target.files || []))}
      />

      {files.map(file => (
        <div key={file.id}>
          <span>{file.name}</span>
          {file.url && <a href={file.url}>View</a>}
        </div>
      ))}
    </div>
); } ``` ## Benefits of Universal Design Migrate from Express to Hono or Next.js to Bun without changing your upload implementation. Web Standards native frameworks get direct handler access with no adapter overhead. Master pushduck once and use it with any framework in your toolkit. As more frameworks adopt Web Standards, they automatically work with pushduck. ## Next Steps Choose your framework integration guide: Complete guide for Next.js App Router and Pages Router Fast, lightweight, built on Web Standards TypeScript-first framework with Bun Classic Node.js framework integration *** **Universal by Design**: Write once, run anywhere. Pushduck's universal handlers make file uploads work seamlessly across the entire JavaScript ecosystem. # Qwik URL: /docs/integrations/qwik Edge-optimized file uploads with Qwik using Web Standards - no adapter needed! *** title: Qwik description: Edge-optimized file uploads with Qwik using Web Standards - no adapter needed! ------------------------------------------------------------------------------------------- import { Callout } from "fumadocs-ui/components/callout"; import { Card, Cards } from "fumadocs-ui/components/card"; import { Tab, Tabs } from "fumadocs-ui/components/tabs"; import { Steps, Step } from "fumadocs-ui/components/steps"; import { File, Folder, Files } from "fumadocs-ui/components/files"; # Qwik Integration Qwik is a revolutionary web framework focused on resumability and edge optimization. It uses Web Standards APIs and provides instant loading with minimal JavaScript. Since Qwik uses standard `Request`/`Response` objects, pushduck handlers work directly without any adapters! **Web Standards Native**: Qwik server endpoints use Web Standard `Request`/`Response` objects, making pushduck integration seamless with zero overhead and perfect for edge deployment. ## Quick Setup **Install dependencies** ```bash npm install pushduck ``` ```bash yarn add pushduck ``` ```bash pnpm add pushduck ``` ```bash bun add pushduck ``` **Configure upload router** ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!, bucket: import.meta.env.VITE_S3_BUCKET_NAME!, accountId: import.meta.env.VITE_R2_ACCOUNT_ID!, }) .build(); export const uploadRouter = createS3Router({ imageUpload: s3.image().max("5MB"), documentUpload: s3.file().max("10MB") }); export type AppUploadRouter = typeof uploadRouter; ``` **Create API route** ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; import { uploadRouter } from '~/lib/upload'; // Direct usage - no adapter needed! 
export const onGet: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; export const onPost: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; ``` ## Basic Integration ### Simple Upload Route ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; import { uploadRouter } from '~/lib/upload'; // Method 1: Combined handler (recommended) export const onRequest: RequestHandler = async ({ request }) => { return uploadRouter.handlers(request); }; // Method 2: Separate handlers (if you need method-specific logic) export const onGet: RequestHandler = async ({ request }) => { return uploadRouter.handlers.GET(request); }; export const onPost: RequestHandler = async ({ request }) => { return uploadRouter.handlers.POST(request); }; ``` ### With CORS Support ```typescript title="src/routes/api/upload/[...path]/index.ts" import type { RequestHandler } from '@builder.io/qwik-city'; import { uploadRouter } from '~/lib/upload'; export const onRequest: RequestHandler = async ({ request, headers }) => { // Handle CORS preflight if (request.method === 'OPTIONS') { headers.set('Access-Control-Allow-Origin', '*'); headers.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS'); headers.set('Access-Control-Allow-Headers', 'Content-Type'); return new Response(null, { status: 200 }); } const response = await uploadRouter.handlers(request); // Add CORS headers to actual response headers.set('Access-Control-Allow-Origin', '*'); return response; }; ``` ## Advanced Configuration ### Authentication with Qwik ```typescript title="src/lib/upload.ts" import { createUploadConfig } from 'pushduck/server'; const { s3, createS3Router } = createUploadConfig() .provider("cloudflareR2",{ accessKeyId: import.meta.env.VITE_AWS_ACCESS_KEY_ID!, secretAccessKey: import.meta.env.VITE_AWS_SECRET_ACCESS_KEY!, region: 'auto', endpoint: import.meta.env.VITE_AWS_ENDPOINT_URL!, bucket: import.meta.env.VITE_S3_BUCKET_NAME!, accountId: import.meta.env.VITE_R2_ACCOUNT_ID!, }) .paths({ prefix: 'uploads', generateKey: (file, metadata) => { return `${metadata.userId}/${Date.now()}/${file.name}`; } }) .build(); export const uploadRouter = createS3Router({ // Private uploads with cookie-based authentication privateUpload: s3 .image() .max("5MB") .middleware(async ({ req }) => { const cookies = req.headers.get('Cookie'); const sessionId = parseCookie(cookies)?.sessionId; if (!sessionId) { throw new Error('Authentication required'); } const user = await getUserFromSession(sessionId); if (!user) { throw new Error('Invalid session'); } return { userId: user.id, username: user.username, }; }), // Public uploads (no auth) publicUpload: s3 .image() .max("2MB") // No middleware = public access }); export type AppUploadRouter = typeof uploadRouter; // Helper functions function parseCookie(cookieString: string | null) { if (!cookieString) return {}; return Object.fromEntries( cookieString.split('; ').map(c => { const [key, ...v] = c.split('='); return [key, v.join('=')]; }) ); } async function getUserFromSession(sessionId: string) { // Implement your session validation logic return { id: 'user-123', username: 'demo-user' }; } ``` ## Client-Side Usage ### Upload Component ```tsx title="src/components/file-upload.tsx" import { component$, useSignal } from '@builder.io/qwik'; import { useUpload } from "pushduck/client"; import type { AppUploadRouter } from "~/lib/upload"; export const FileUpload = component$(() => { const 
uploadProgress = useSignal(0);
  const isUploading = useSignal(false);

  const { UploadButton, UploadDropzone } = useUpload({
    endpoint: "/api/upload",
  });

  const handleUploadComplete = $((files: any[]) => {
    console.log("Files uploaded:", files);
    alert("Upload completed!");
  });

  const handleUploadError = $((error: Error) => {
    console.error("Upload error:", error);
    alert(`Upload failed: ${error.message}`);
  });

  return (

    <div>
      <div>
        <h3>Image Upload</h3>
        {/* Wire handleUploadComplete / handleUploadError into these components as needed */}
        <UploadDropzone />
      </div>

      <div>
        <h3>Document Upload</h3>
        <UploadButton />
      </div>
    </div>
); }); ``` ### Using in Routes ```tsx title="src/routes/index.tsx" import { component$ } from '@builder.io/qwik'; import type { DocumentHead } from '@builder.io/qwik-city'; import { FileUpload } from '~/components/file-upload'; export default component$(() => { return (

    <div>
      <h1>File Upload Demo</h1>
      <FileUpload />
    </div>
); }); export const head: DocumentHead = { title: 'File Upload Demo', meta: [ { name: 'description', content: 'Qwik file upload demo with pushduck', }, ], }; ``` ## File Management ### Server-Side File Loader ```typescript title="src/routes/files/index.tsx" import { component$ } from '@builder.io/qwik'; import type { DocumentHead } from '@builder.io/qwik-city'; import { routeLoader$ } from '@builder.io/qwik-city'; import { FileUpload } from '~/components/file-upload'; export const useFiles = routeLoader$(async (requestEvent) => { const userId = 'current-user'; // Get from session/auth // Fetch files from database const files = await getFilesForUser(userId); return { files: files.map(file => ({ id: file.id, name: file.name, url: file.url, size: file.size, uploadedAt: file.createdAt, })), }; }); export default component$(() => { const filesData = useFiles(); const formatFileSize = (bytes: number): string => { const sizes = ['Bytes', 'KB', 'MB', 'GB']; if (bytes === 0) return '0 Bytes'; const i = Math.floor(Math.log(bytes) / Math.log(1024)); return Math.round(bytes / Math.pow(1024, i) * 100) / 100 + ' ' + sizes[i]; }; return (

    <div>
      <h1>My Files</h1>

      <FileUpload />

      <h2>Uploaded Files</h2>

      {filesData.value.files.length === 0 ? (
        <p>No files uploaded yet.</p>
      ) : (
        <ul>
          {filesData.value.files.map((file) => (
            <li key={file.id}>
              <span>{file.name}</span>
              <span>{formatFileSize(file.size)}</span>
              <span>{new Date(file.uploadedAt).toLocaleDateString()}</span>
              <a href={file.url} target="_blank" rel="noreferrer">
                View File
              </a>
            </li>
          ))}
        </ul>
      )}
    </div>
); }); export const head: DocumentHead = { title: 'My Files', }; async function getFilesForUser(userId: string) { // Implement your database query logic return []; } ``` ## Deployment Options ```typescript title="vite.config.ts" import { defineConfig } from 'vite'; import { qwikVite } from '@builder.io/qwik/optimizer'; import { qwikCity } from '@builder.io/qwik-city/vite'; import { qwikCloudflarePages } from '@builder.io/qwik-city/adapters/cloudflare-pages/vite'; export default defineConfig(() => { return { plugins: [ qwikCity({ adapter: qwikCloudflarePages(), }), qwikVite(), ], }; }); ``` ```typescript title="vite.config.ts" import { defineConfig } from 'vite'; import { qwikVite } from '@builder.io/qwik/optimizer'; import { qwikCity } from '@builder.io/qwik-city/vite'; import { qwikVercel } from '@builder.io/qwik-city/adapters/vercel-edge/vite'; export default defineConfig(() => { return { plugins: [ qwikCity({ adapter: qwikVercel(), }), qwikVite(), ], }; }); ``` ```typescript title="vite.config.ts" import { defineConfig } from 'vite'; import { qwikVite } from '@builder.io/qwik/optimizer'; import { qwikCity } from '@builder.io/qwik-city/vite'; import { qwikNetlifyEdge } from '@builder.io/qwik-city/adapters/netlify-edge/vite'; export default defineConfig(() => { return { plugins: [ qwikCity({ adapter: qwikNetlifyEdge(), }), qwikVite(), ], }; }); ``` ```typescript title="vite.config.ts" import { defineConfig } from 'vite'; import { qwikVite } from '@builder.io/qwik/optimizer'; import { qwikCity } from '@builder.io/qwik-city/vite'; import { qwikDeno } from '@builder.io/qwik-city/adapters/deno/vite'; export default defineConfig(() => { return { plugins: [ qwikCity({ adapter: qwikDeno(), }), qwikVite(), ], }; }); ``` ## Environment Variables ```bash title=".env" # AWS Configuration VITE_AWS_REGION=us-east-1 VITE_AWS_ACCESS_KEY_ID=your_access_key VITE_AWS_SECRET_ACCESS_KEY=your_secret_key VITE_AWS_S3_BUCKET=your-bucket-name # Qwik VITE_PUBLIC_UPLOAD_ENDPOINT=http://localhost:5173/api/upload ``` ## Performance Benefits ## Real-Time Upload Progress ```tsx title="src/components/advanced-upload.tsx" import { component$, useSignal, $ } from '@builder.io/qwik'; export const AdvancedUpload = component$(() => { const uploadProgress = useSignal(0); const isUploading = useSignal(false); const handleFileUpload = $(async (event: Event) => { const target = event.target as HTMLInputElement; const files = target.files; if (!files || files.length === 0) return; isUploading.value = true; uploadProgress.value = 0; try { // Simulate upload progress for (let i = 0; i <= 100; i += 10) { uploadProgress.value = i; await new Promise(resolve => setTimeout(resolve, 100)); } alert('Upload completed!'); } catch (error) { console.error('Upload failed:', error); alert('Upload failed!'); } finally { isUploading.value = false; uploadProgress.value = 0; } }); return (
    <div>
      <input
        type="file"
        multiple
        onChange$={handleFileUpload}
        disabled={isUploading.value}
      />

      {isUploading.value && (
        <p>{uploadProgress.value}% uploaded</p>
      )}
    </div>
); }); ``` ## Qwik City Form Integration ```tsx title="src/routes/upload-form/index.tsx" import { component$ } from '@builder.io/qwik'; import type { DocumentHead } from '@builder.io/qwik-city'; import { routeAction$, Form, zod$, z } from '@builder.io/qwik-city'; import { FileUpload } from '~/components/file-upload'; export const useUploadAction = routeAction$(async (data, requestEvent) => { // Handle form submission // Files are already uploaded via pushduck, just save metadata console.log('Form data:', data); // Redirect to files page throw requestEvent.redirect(302, '/files'); }, zod$({ title: z.string().min(1), description: z.string().optional(), })); export default component$(() => { const uploadAction = useUploadAction(); return (

Upload Files