Solving Cloudflare Worker Resource Limits: A Deep Dive into Performance Optimization

2025-08-24 by Roel Kristelijn


When your production application suddenly starts throwing Error 1102: Worker exceeded resource limits, it's time for some serious performance detective work. This is the story of how we diagnosed, analyzed, and solved a critical resource limit issue in our Next.js blog deployed on Cloudflare Workers.

The Problem: Error 1102 in Production

The Incident

```
Error 1102 Ray ID: 97444d3e0ece590c • 2025-08-24 16:42:14 UTC
Worker exceeded resource limits
```

Our blog suddenly became inaccessible, throwing this cryptic error. Users couldn't access any pages, and the Worker was consistently hitting resource limits.

Understanding Cloudflare Worker Limits

Cloudflare Workers have strict resource constraints:

| Resource | Free Plan | Paid Plan |
| --- | --- | --- |
| CPU Time | 10ms | 50ms |
| Memory | 128MB | 128MB |
| Bundle Size | 1MB | 10MB |
| Subrequests | 50 | 1000 |

When any of these limits are exceeded, you get Error 1102.

Root Cause Analysis

Initial Investigation

Let's examine what was happening in our application:

```bash
# Check bundle size
ls -lh src/data/posts.json
# Output: -rw-r--r-- 1 staff 226K Aug 24 13:33 posts.json
```

226KB for a single JSON file! This was our first red flag.

Memory Usage Pattern Analysis

On every request, the Worker imported and parsed the full posts.json before it could render anything, so the memory and CPU cost was paid up front regardless of which page was requested.

The Data Structure Problem

Our posts.json contained:

```json
{
  "posts": [
    {
      "id": "01-creating-nextjs-project",
      "slug": "creating-nextjs-project...",
      "title": "Creating a Next.js Project",
      "excerpt": "Brief description...",
      "content": "# Very long content with Mermaid diagrams, code blocks, etc..."
    }
    // ... 18 more posts with full content
  ]
}
```

Problem: Every page load imported the entire 226KB file, even when only metadata was needed.
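For contrast, here is a rough sketch of what the original loader looked like; the file path and the `Post` type follow the structure described above, but the snippet is illustrative rather than copied from the repository:

```ts
// Illustrative "before" loader: a single static import pulls the full 226KB
// posts.json (content included) into every Worker invocation.
import postsData from '@/data/posts.json';

// Even the listing page, which only needs titles and excerpts, receives
// every post's full markdown body. (Post is the full post type used elsewhere.)
export function getAllPosts(): Post[] {
  return postsData.posts;
}
```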

Performance Impact Analysis

| Metric | Impact | Consequence |
| --- | --- | --- |
| Bundle Size | 226KB loaded on import | Slow Worker startup |
| Memory Usage | All posts in memory | High memory pressure |
| CPU Time | JSON parsing + processing | Exceeded 10ms limit |
| Network | Large bundle transfer | Increased latency |

The Solution: Lazy Loading Architecture

Strategy Overview

We implemented a two-tier data architecture:

  1. Metadata Layer: Lightweight index (10.9KB)
  2. Content Layer: Individual files loaded on-demand

Implementation Deep Dive

Step 1: Data Separation Script

```js
// scripts/optimize-for-workers.js
const fs = require('fs');
const path = require('path');

// POSTS_FILE, CONTENT_DIR, and METADATA_FILE are path constants defined
// elsewhere in the script.
async function optimizeForWorkers() {
  const postsData = JSON.parse(fs.readFileSync(POSTS_FILE, 'utf8'));

  // Create lightweight metadata
  const metadata = {
    posts: postsData.posts.map(post => ({
      id: post.id,
      slug: post.slug,
      title: post.title,
      excerpt: post.excerpt,
      date: post.date,
      author: post.author,
      // Remove content field
    })),
    slugs: postsData.slugs,
    generatedAt: postsData.generatedAt
  };

  // Save individual content files
  for (const post of postsData.posts) {
    const contentFile = path.join(CONTENT_DIR, `${post.slug}.json`);
    fs.writeFileSync(contentFile, JSON.stringify({ content: post.content }));
  }

  fs.writeFileSync(METADATA_FILE, JSON.stringify(metadata, null, 2));
}
```
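For reference, the shapes the script writes out look roughly like this; the interface names are ours, but the fields follow the mapping above:

```ts
// Shape of posts-metadata.json: everything except the heavy content field.
interface PostMetadata {
  id: string;
  slug: string;
  title: string;
  excerpt: string;
  date: string;
  author: string;
}

interface PostsMetadataFile {
  posts: PostMetadata[];
  slugs: string[];
  generatedAt: string;
}

// Shape of each individual content file (data/content/<slug>.json).
interface PostContentFile {
  content: string;
}
```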

Step 2: Optimized Data Loading

```ts
// src/lib/posts-static.ts
import postsMetadata from '@/data/posts-metadata.json'; // Only 10.9KB!

const contentCache = new Map<string, string>();

async function loadPostContent(slug: string): Promise<string> {
  if (contentCache.has(slug)) {
    return contentCache.get(slug)!;
  }

  try {
    // Dynamic import - only loads when needed
    const contentModule = await import(`@/data/content/${slug}.json`);
    const content = contentModule.content || '';

    contentCache.set(slug, content);
    return content;
  } catch (error) {
    console.error(`Error loading content for ${slug}:`, error);
    return '';
  }
}

// Metadata only - fast and lightweight
export function getAllPosts(): Omit<Post, 'content'>[] {
  return postsMetadata.posts.map(post => ({
    id: post.id,
    slug: post.slug,
    title: post.title,
    date: post.date,
    author: post.author,
    excerpt: post.excerpt,
  }));
}

// Full post with content - loaded on demand
export async function getPostBySlug(slug: string): Promise<Post | undefined> {
  const postMeta = postsMetadata.posts.find(post => post.slug === slug);

  if (!postMeta) return undefined;

  const content = await loadPostContent(slug);
  return { ...postMeta, content };
}
```
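To show how this is consumed, here is a hedged sketch of a post page using these helpers; the route and component structure are illustrative, not copied from the repository:

```tsx
// app/posts/[slug]/page.tsx (illustrative)
import { getAllPosts, getPostBySlug } from '@/lib/posts-static';

export function generateStaticParams() {
  // Enumerating slugs only needs the 10.9KB metadata index.
  return getAllPosts().map(post => ({ slug: post.slug }));
}

export default async function PostPage({ params }: { params: { slug: string } }) {
  // Only this post's ~12KB content file is imported here.
  const post = await getPostBySlug(params.slug);
  if (!post) return <p>Post not found</p>;
  return <article>{post.content}</article>;
}
```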

New Request Flow

With the split in place, a listing request touches only the 10.9KB metadata index, while an individual post request loads that index plus a single ~12KB content file on demand.

Performance Analysis

Bundle Size Optimization

```bash
# Before optimization
Original size: 226.3KB

# After optimization
Metadata size: 10.9KB
Space saved: 95.2% (215.4KB)
```

Memory Usage Comparison

| Scenario | Before | After | Improvement |
| --- | --- | --- | --- |
| Posts listing | 226KB | 10.9KB | 95.2% reduction |
| Single post | 226KB | 10.9KB + ~12KB | ~90% reduction |
| Multiple posts | 226KB | 10.9KB + (n × ~12KB) | Scales linearly |

CPU Time Analysis

With the 226KB parse removed from the hot path, per-request CPU time dropped from beyond the 10ms free-plan limit to well under it; most operations now complete in under 5ms.

Network Bandwidth Analysis

Initial Bundle Transfer

| Component | Before | After | Savings |
| --- | --- | --- | --- |
| Metadata | 226KB | 10.9KB | 215.1KB |
| Content | Included | On-demand | Variable |
| Total Initial | 226KB | 10.9KB | 95.2% |

Runtime Loading Patterns

```ts
// Posts listing: Only metadata needed
const posts = getAllPosts(); // 10.9KB loaded

// Individual post: Metadata + specific content
const post = await getPostBySlug('my-post'); // 10.9KB + ~12KB
```

Implementation Challenges & Solutions

Challenge 1: Dynamic Imports in Workers

Problem: Cloudflare Workers have limitations with dynamic imports.

Solution: Use dynamic imports whose path pattern is fixed at build time, so the bundler can discover and include every possible content file:

```ts
// This works in Workers
const contentModule = await import(`@/data/content/${slug}.json`);

// This doesn't work in Workers
const contentModule = await import(dynamicPath);
```

Challenge 2: Type Safety

Problem: Posts without content need different types.

Solution: Flexible type definitions:

```ts
interface PostCardProps {
  post: Omit<Post, 'content'> | Post; // Supports both
}
```
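A quick usage sketch building on the interface above (the component body is illustrative): because the card only reads metadata fields, the same component works for both the listing and the detail page:

```tsx
function PostCard({ post }: PostCardProps) {
  // Only metadata fields are accessed, so a post without content renders fine.
  return (
    <a href={`/posts/${post.slug}`}>
      <h2>{post.title}</h2>
      <p>{post.excerpt}</p>
    </a>
  );
}
```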

Challenge 3: Build Pipeline Integration

Problem: The optimization step needs to run automatically on every build.

Solution: Integrated build pipeline:

```json
{
  "scripts": {
    "prebuild": "node scripts/generate-posts-data.js && node scripts/optimize-for-workers.js"
  }
}
```
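Because npm runs pre-scripts automatically, `prebuild` fires on every `npm run build`, so the metadata split cannot silently fall out of date in CI or local builds.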

Results & Impact

✅ Deployment Success

```bash
# Successful deployment output
✨ Success! Uploaded 7 files (66 already uploaded) (1.77 sec)
Total Upload: 13450.13 KiB / gzip: 2695.83 KiB
Worker Startup Time: 24 ms
Deployed next-blog triggers (1.27 sec)
https://next-blog.rkristelijn.workers.dev
```

Performance Metrics

| Metric | Before | After | Improvement |
| --- | --- | --- | --- |
| Error Rate | 100% (Error 1102) | 0% | ✅ Resolved |
| Bundle Size | 226KB | 10.9KB | 95.2% reduction |
| Memory Usage | High | Low | ~95% reduction |
| Startup Time | Slow | Fast | Significantly improved |
| Scalability | Limited | High | Linear scaling |

Resource Utilization

After the optimization, the Worker sits comfortably within the free-plan limits listed earlier: a 24ms startup, sub-5ms CPU time for most operations, and only a fraction of the memory previously held per request.

Lessons Learned

1. Monitor Bundle Sizes Early

```bash
# Add to CI/CD pipeline
npm run build | grep "First Load JS"
```
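For a harder guarantee, a small Node script (the file name and byte budget here are hypothetical) can fail CI when the metadata index grows past a threshold:

```ts
// scripts/check-bundle-size.ts (hypothetical): fail the build when the
// metadata index exceeds a fixed byte budget.
import { statSync } from 'fs';

const BUDGET_BYTES = 50 * 1024; // 50KB, matching the guideline further below
const size = statSync('src/data/posts-metadata.json').size;

if (size > BUDGET_BYTES) {
  console.error(`posts-metadata.json is ${size} bytes, over the ${BUDGET_BYTES} byte budget`);
  process.exit(1);
}
console.log(`posts-metadata.json is ${size} bytes, within budget`);
```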

2. Implement Lazy Loading from Start

Don't load everything upfront. Design for on-demand loading:

```ts
// Good: Load what you need
const metadata = getPostMetadata();

// Bad: Load everything
const allData = getAllDataIncludingContent();
```

3. Use Appropriate Data Structures

```ts
// For listings: Metadata only
interface PostSummary {
  id: string;
  title: string;
  excerpt: string;
  date: string;
}

// For details: Full content
interface PostDetail extends PostSummary {
  content: string;
}
```

4. Cache Strategically

```ts
// Cache expensive operations
const contentCache = new Map<string, string>();

// But don't cache everything
// Cache only frequently accessed content
```
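One way to keep such a cache from growing without bound is a simple size cap with oldest-first eviction; the limit below is an illustrative choice, not taken from the actual implementation:

```ts
const MAX_CACHED_POSTS = 20; // illustrative cap

function cacheContent(slug: string, content: string) {
  if (contentCache.size >= MAX_CACHED_POSTS) {
    // Map preserves insertion order, so the first key is the oldest entry.
    const oldest = contentCache.keys().next().value;
    if (oldest !== undefined) contentCache.delete(oldest);
  }
  contentCache.set(slug, content);
}
```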

Best Practices for Cloudflare Workers

1. Bundle Size Management

  • Keep initial bundles under 50KB
  • Use dynamic imports for large content
  • Monitor bundle sizes in CI/CD

2. Memory Optimization

  • Load data on-demand
  • Implement intelligent caching
  • Avoid loading entire datasets

3. CPU Time Management

  • Minimize JSON parsing
  • Use efficient algorithms
  • Profile critical paths

4. Monitoring & Alerting

```ts
// Add performance monitoring
console.time('operation');
await expensiveOperation();
console.timeEnd('operation');

// Monitor in production
if (process.env.NODE_ENV === 'production') {
  // Log performance metrics
}
```

Future Optimizations

1. Content Compression

```bash
# Gzip content files
gzip content/*.json
# Potential 60-80% additional savings
```

2. Edge Caching

```ts
// Cache content at Cloudflare edge
const cacheKey = `post-content-${slug}`;
const cached = await caches.default.match(cacheKey);
```
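A slightly fuller sketch of that idea, assuming the content files are addressable by URL; the helper name and structure are ours, using the Workers Cache API (`caches.default`):

```ts
// Cache-first lookup for content requests at the current edge location.
async function getCachedContent(request: Request): Promise<Response> {
  const cache = caches.default;

  const cached = await cache.match(request);
  if (cached) return cached;

  const response = await fetch(request);
  if (response.ok) {
    // Store a copy so subsequent requests at this PoP skip the origin.
    await cache.put(request, response.clone());
  }
  return response;
}
```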

3. Incremental Loading

```ts
// Load content sections progressively
const sections = await loadPostSections(slug);
```
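`loadPostSections` does not exist yet; a hedged sketch of what it might look like, assuming the build step also emitted per-post section files alongside the content files:

```ts
// Hypothetical: content split into sections at build time, loaded a few at a time.
async function loadPostSections(slug: string, count = 2): Promise<string[]> {
  const mod = await import(`@/data/content/${slug}-sections.json`);
  const sections: string[] = mod.sections ?? [];
  return sections.slice(0, count);
}
```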

4. Service Worker Caching

```ts
// Client-side caching for repeat visits
self.addEventListener('fetch', event => {
  if (event.request.url.includes('/content/')) {
    event.respondWith(cacheFirst(event.request));
  }
});
```
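The `cacheFirst` helper above is not defined in the post; a minimal sketch using the browser Cache API could look like this:

```ts
async function cacheFirst(request: Request): Promise<Response> {
  const cache = await caches.open('post-content');

  const cached = await cache.match(request);
  if (cached) return cached;

  const response = await fetch(request);
  await cache.put(request, response.clone());
  return response;
}
```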

Conclusion

The Error 1102 "Worker exceeded resource limits" taught us valuable lessons about performance optimization in serverless environments. By implementing lazy loading and reducing our bundle size by 95.2%, we not only solved the immediate problem but also created a more scalable architecture.

Key Takeaways:

  1. Monitor resource usage proactively - Don't wait for errors
  2. Design for lazy loading - Load only what you need, when you need it
  3. Optimize bundle sizes - Every KB matters in serverless environments
  4. Implement intelligent caching - Balance memory usage with performance
  5. Test at scale - Resource limits become apparent under load

The solution demonstrates that with careful analysis and strategic optimization, even complex applications can run efficiently within Cloudflare Worker constraints while maintaining excellent performance and user experience.

Our blog now handles traffic smoothly, scales efficiently, and stays well within resource limits - proving that sometimes the best optimization is simply not loading what you don't need.


Performance Stats:

  • 95.2% bundle size reduction (226KB → 10.9KB)
  • Zero Error 1102 incidents since optimization
  • Linear scalability for additional content
  • Sub-5ms CPU time for most operations

The complete optimization code and scripts are available in our GitHub repository.