This guide provides recommendations and best practices for using RunCache effectively in your applications. Following these guidelines will help you optimize performance, maintain data consistency, and avoid common pitfalls.
Cache Key Design
Use Structured Key Naming
Adopt a consistent key naming convention to make pattern matching more effective and your code more maintainable:
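One common scheme is `entity:id:attribute` (for example, `user:1:profile`). A small helper like the one below (hypothetical, not part of RunCache's API) can keep keys consistent across a codebase:

```javascript
// Hypothetical helper for building structured cache keys (not part of RunCache)
function buildKey(...parts) {
  return parts
    .map((part) => String(part).replace(/:/g, '_')) // avoid accidental separators
    .join(':');
}

buildKey('user', 1, 'profile');  // 'user:1:profile'
buildKey('user', 1, 'settings'); // 'user:1:settings'
```

Centralizing key construction also makes later pattern-based operations (`user:1:*`) more reliable.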
Keep Source Functions Pure
Source functions should not depend on external mutable state; capture any values they need explicitly:
// Good - pure function
await RunCache.set({
  key: 'user:1',
  sourceFn: () => fetchUserById(1)
});

// Avoid - depends on external state
let currentUserId = 1;
await RunCache.set({
  key: 'current-user',
  sourceFn: () => fetchUserById(currentUserId) // Will break if currentUserId changes
});
Handle Errors Properly
Always implement error handling in your source functions:
await RunCache.set({
  key: 'api-data',
  sourceFn: async () => {
    try {
      const response = await fetch('https://api.example.com/data');
      if (!response.ok) {
        throw new Error(`API returned ${response.status}`);
      }
      const data = await response.json();
      return JSON.stringify(data);
    } catch (error) {
      console.error('Source function error:', error);
      throw error; // Rethrow to let RunCache handle it
    }
  }
});
Optimize for Performance
Keep source functions efficient:
Fetch only the data you need
Use appropriate caching headers for API requests
Consider batching related requests
// Inefficient - fetches too much data
await RunCache.set({
  key: 'user-name',
  sourceFn: async () => {
    const response = await fetch('https://api.example.com/users/1');
    const user = await response.json();
    return user.name; // Only needed the name
  }
});

// Better - uses a more targeted endpoint
await RunCache.set({
  key: 'user-name',
  sourceFn: async () => {
    const response = await fetch('https://api.example.com/users/1/name');
    return response.text();
  }
});
Automatic Refetching
Use for Critical Data Only
Automatic refetching is most beneficial for:
Frequently accessed data
Data that should never be unavailable
Data where freshness is important but not critical
// Good candidate for autoRefetch
await RunCache.set({
  key: 'global-settings',
  sourceFn: () => fetchGlobalSettings(),
  ttl: 5 * 60 * 1000,
  autoRefetch: true
});

// May not need autoRefetch
await RunCache.set({
  key: 'user-activity-log',
  sourceFn: () => fetchUserActivity(),
  ttl: 30 * 60 * 1000
  // No autoRefetch as it's less critical
});
Implement Proper Error Handling
Always handle potential failures in automatic refetching, for example by listening for refetch failures and serving the stale value until the source recovers.
Pattern Matching
Use Specific Patterns
Use patterns that are as specific as possible to avoid unintended matches:
// Too broad - might match unrelated keys
await RunCache.delete('user*');
// Better - more specific pattern
await RunCache.delete('user:*');
// Best - most specific pattern for the use case
await RunCache.delete('user:1:*');
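To see why the broader pattern is risky, consider a minimal glob-style matcher in which `*` matches any sequence of characters (a sketch of the general idea; RunCache's actual matching rules may differ):

```javascript
// Minimal glob matcher: '*' matches any sequence of characters
function matchesPattern(pattern, key) {
  // Escape regex metacharacters except '*', then turn '*' into '.*'
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  const regex = new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
  return regex.test(key);
}

matchesPattern('user*', 'username-index');  // true - unintended match
matchesPattern('user:*', 'username-index'); // false
matchesPattern('user:*', 'user:1:profile'); // true
```

With `user*`, any key that merely starts with the word "user" is swept up; the `:` separator in `user:*` scopes the match to your structured key namespace.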
Consider Performance Implications
Pattern matching operations require checking all cache keys, which can be expensive for large caches:
// Inefficient for large caches - checks all keys
const allUserData = await RunCache.get('user:*');
// More efficient - use specific patterns when possible
const specificUserData = await RunCache.get('user:1:*');
Tags and Dependencies
Use Tags for Broad Grouping
Tags are ideal for grouping related cache entries that need to be invalidated together:
// Set entries with tags
await RunCache.set({
  key: 'user:1:profile',
  value: '...',
  tags: ['user:1', 'profile']
});

await RunCache.set({
  key: 'user:1:settings',
  value: '...',
  tags: ['user:1', 'settings']
});

// Later, invalidate all user:1 data
RunCache.invalidateByTag('user:1');
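Conceptually, tag invalidation maintains an index from each tag to the set of keys carrying it, so a single call can drop a whole group without scanning every entry. A toy model (illustrative only, not RunCache's internals):

```javascript
// Toy tag-indexed cache to illustrate tag-based invalidation
class TaggedCache {
  constructor() {
    this.entries = new Map();  // key -> value
    this.tagIndex = new Map(); // tag -> Set of keys carrying that tag
  }

  set(key, value, tags = []) {
    this.entries.set(key, value);
    for (const tag of tags) {
      if (!this.tagIndex.has(tag)) this.tagIndex.set(tag, new Set());
      this.tagIndex.get(tag).add(key);
    }
  }

  invalidateByTag(tag) {
    for (const key of this.tagIndex.get(tag) ?? []) {
      this.entries.delete(key);
    }
    this.tagIndex.delete(tag);
  }
}
```

Because the index is keyed by tag, invalidation cost scales with the size of the tagged group, not the size of the whole cache.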
Use Dependencies for Derived Data
Dependencies are ideal for derived data that depends on base data:
// Primary data
await RunCache.set({
  key: 'user:1:profile',
  value: JSON.stringify({ name: 'John Doe' })
});

// Dependent data
await RunCache.set({
  key: 'user:1:dashboard',
  value: JSON.stringify({ widgets: [...] }),
  dependencies: ['user:1:profile'] // This entry depends on the profile
});

// When user:1:profile changes, user:1:dashboard will be automatically invalidated
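The cascade works roughly like a reverse index from each key to its dependents, walked recursively on invalidation. A sketch (assuming acyclic dependencies; not RunCache's implementation):

```javascript
// Toy model of dependency-based cascading invalidation (assumes no cycles)
class DepCache {
  constructor() {
    this.entries = new Map();    // key -> value
    this.dependents = new Map(); // key -> Set of keys that depend on it
  }

  set(key, value, dependencies = []) {
    this.entries.set(key, value);
    for (const dep of dependencies) {
      if (!this.dependents.has(dep)) this.dependents.set(dep, new Set());
      this.dependents.get(dep).add(key);
    }
  }

  invalidate(key) {
    this.entries.delete(key);
    // Cascade to everything that depends on this key
    for (const dependent of this.dependents.get(key) ?? []) {
      this.invalidate(dependent);
    }
    this.dependents.delete(key);
  }
}
```

This recursive walk is also why the next guideline matters: the deeper the dependency graph, the further a single invalidation propagates.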
Avoid Deep Dependency Chains
Deep dependency chains can lead to widespread invalidations:
// Potentially problematic - deep dependency chain
A → B → C → D → E → F → G
// Better - flatter dependency structure
A → B, C, D
E → F, G
Event System
Keep Event Handlers Lightweight
Event handlers should be fast and non-blocking:
// Good - lightweight handler
RunCache.onExpiry((event) => {
  console.log(`Cache expired: ${event.key}`);
});

// Avoid - heavy processing in handler
RunCache.onExpiry(async (event) => {
  // DON'T do this - blocks the event loop
  await heavyProcessing();
  await networkRequest();
});
If you need to perform heavy work, delegate it to a separate process or queue.
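One lightweight way to do that in-process is to have the handler only enqueue a task, and drain the queue elsewhere. A sketch (the names `enqueue` and `drainQueue` are illustrative, not RunCache APIs):

```javascript
// Simple in-memory work queue: handlers only enqueue; a drain loop does the work
const workQueue = [];

function enqueue(task) {
  workQueue.push(task);
}

async function drainQueue() {
  while (workQueue.length > 0) {
    const task = workQueue.shift();
    await task(); // heavy work happens here, outside the event handler
  }
}

// The event handler stays lightweight:
// RunCache.onExpiry((event) => enqueue(() => reprocessExpiredKey(event.key)));
```

For heavier workloads, the same idea extends to a worker thread, child process, or external job queue.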
Clean Up Event Listeners
Remove event listeners when they're no longer needed:
// In a component or module initialization
function initializeModule() {
  // Set up event listeners
  RunCache.onKeyRefetch('module-data', handleRefetch);

  // Return cleanup function
  return () => {
    RunCache.clearEventListeners({
      event: EVENT.REFETCH,
      key: 'module-data'
    });
  };
}

// At startup
const cleanup = initializeModule();

// Later, when the module is unloaded
cleanup();
Memory Management
Set Appropriate Cache Size Limits
Configure maximum cache size based on your application's memory constraints:
// Configure cache with size limits
RunCache.configure({
  maxEntries: 1000, // Limit to 1000 entries
  evictionPolicy: EvictionPolicy.LRU
});
Choose the Right Eviction Policy
Select an eviction policy that matches your access patterns:
LRU (Least Recently Used): Best for most applications where recently accessed items are likely to be accessed again
LFU (Least Frequently Used): Better for cases where access frequency is more important than recency
NONE: Use only when you want manual control over eviction or have other TTL-based mechanisms
// For most applications
RunCache.configure({
  maxEntries: 1000,
  evictionPolicy: EvictionPolicy.LRU
});

// For frequency-based patterns
RunCache.configure({
  maxEntries: 1000,
  evictionPolicy: EvictionPolicy.LFU
});
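To build intuition for what LRU eviction does, here is a minimal version built on `Map`'s insertion-order iteration (a sketch for illustration, not RunCache's implementation):

```javascript
// Minimal LRU cache: Map preserves insertion order, so the first key
// in iteration order is the least recently used one.
class LruCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark this entry as most recently used
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Evict the least recently used entry (first in insertion order)
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}
```

An LFU variant would track an access count per entry instead and evict the entry with the lowest count, which favors long-lived hot items over recently touched ones.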
Optimize Value Size
Minimize the size of cached values:
// Bad - storing unnecessary data
await RunCache.set({
  key: 'user-profile',
  value: JSON.stringify({
    name: 'John Doe',
    email: 'john@example.com',
    avatar: largeBase64EncodedImage, // Unnecessarily large
    fullHistory: completeUserHistory, // Rarely needed data
    // ...more data
  })
});

// Better - store only what's needed
await RunCache.set({
  key: 'user-profile',
  value: JSON.stringify({
    name: 'John Doe',
    email: 'john@example.com',
    avatarUrl: '/avatars/john.jpg', // Just the URL, not the image
    // ...essential data only
  })
});

// Store rarely needed data separately
await RunCache.set({
  key: 'user-profile:full-history',
  value: JSON.stringify(completeUserHistory),
  ttl: 3600000 // With shorter TTL
});
Persistent Storage
Choose the Right Storage Adapter
Select the appropriate storage adapter based on your environment and requirements:
LocalStorageAdapter: For browser environments with small cache data
IndexedDBAdapter: For browser environments with larger cache data
FilesystemAdapter: For Node.js applications
// For browser with small data
RunCache.configure({
  storageAdapter: new LocalStorageAdapter()
});

// For browser with larger data
RunCache.configure({
  storageAdapter: new IndexedDBAdapter()
});

// For Node.js applications
RunCache.configure({
  storageAdapter: new FilesystemAdapter()
});
Configure Auto-Save Frequency
Match the auto-save interval to how often your cached data changes, balancing freshness of the persisted snapshot against storage I/O overhead:
// For frequently changing data, save more often
RunCache.setupAutoSave(60000); // 1 minute

// For relatively stable data, save less frequently
RunCache.setupAutoSave(3600000); // 1 hour
Error Handling
Handle Cache Misses Gracefully
Always handle potential cache misses:
// Good - handles cache miss
const userData = await RunCache.get('user:1');
if (userData) {
  // Cache hit
  const user = JSON.parse(userData);
  displayUser(user);
} else {
  // Cache miss
  showLoadingIndicator();
  const user = await fetchUserFromApi(1);
  await RunCache.set({ key: 'user:1', value: JSON.stringify(user) });
  displayUser(user);
}
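This hit-or-fetch pattern is common enough to wrap in a small read-through helper. A sketch (a plain `Map` stands in for the cache so the example is self-contained; the fetcher is whatever loader your application uses):

```javascript
// Read-through helper: return the cached value, or fetch, store, and return it.
// `cache` is anything with get/set - a plain Map here, standing in for a cache.
async function getOrFetch(cache, key, fetcher) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached; // cache hit
  }
  const value = await fetcher(key); // cache miss: load from the source
  cache.set(key, value);
  return value;
}

// Usage:
const users = new Map();
getOrFetch(users, 'user:1', async () => ({ name: 'John Doe' }))
  .then((user) => console.log(user.name)); // prints "John Doe"
```

With RunCache specifically, passing a `sourceFn` to `RunCache.set` achieves the same read-through behavior without a wrapper; the helper is useful when the fetch logic lives at the call site.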
Implement Circuit Breakers
For critical systems, implement circuit breakers to prevent cascading failures:
let cacheFailures = 0;
const FAILURE_THRESHOLD = 5;
const RESET_TIMEOUT = 60000; // 1 minute

function recordCacheFailure() {
  cacheFailures++;
  if (cacheFailures === FAILURE_THRESHOLD) {
    console.error('Cache circuit opened due to repeated failures');
    setTimeout(() => {
      cacheFailures = 0;
      console.log('Cache circuit reset');
    }, RESET_TIMEOUT);
  }
}

async function getCachedData(key) {
  if (cacheFailures >= FAILURE_THRESHOLD) {
    // Circuit is open, bypass cache
    return fetchDataFromSource(key);
  }
  try {
    const data = await RunCache.get(key);
    if (data) return JSON.parse(data);

    // Cache miss, fetch from source
    const sourceData = await fetchDataFromSource(key);
    try {
      await RunCache.set({ key, value: JSON.stringify(sourceData) });
    } catch (error) {
      recordCacheFailure();
    }
    return sourceData;
  } catch (error) {
    // Fall back to source on cache error
    recordCacheFailure();
    return fetchDataFromSource(key);
  }
}
Performance Optimization
Batch Related Operations
Group related cache operations to minimize overhead:
// Less efficient - multiple separate operations
for (const id of userIds) {
  await RunCache.set({ key: `user:${id}:profile`, value: profiles[id] });
}

// More efficient - batch processing
async function batchSetProfiles(profiles) {
  const operations = Object.entries(profiles).map(([id, profile]) =>
    RunCache.set({ key: `user:${id}:profile`, value: profile })
  );
  await Promise.all(operations);
}

await batchSetProfiles(profiles);
Implement Cache Warming
Pre-populate critical cache entries on application startup:
async function warmCache() {
  console.log('Warming cache...');

  // Check if entries already exist
  const hasConfig = await RunCache.has('app:config');
  const hasGlobalData = await RunCache.has('app:global-data');

  // Warm up only what's missing
  if (!hasConfig) {
    const config = await fetchAppConfig();
    await RunCache.set({ key: 'app:config', value: JSON.stringify(config) });
  }

  if (!hasGlobalData) {
    const globalData = await fetchGlobalData();
    await RunCache.set({ key: 'app:global-data', value: JSON.stringify(globalData) });
  }

  console.log('Cache warming complete');
}

// Call on application startup
warmCache().catch((error) => console.error('Cache warming failed:', error));
Use Middleware for Cross-Cutting Concerns
Implement middleware for functionality that applies to all cache operations:
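One common shape for such middleware is a composed chain of `(operation, next)` functions, where each layer can act before and after delegating to the next. The sketch below illustrates the general idea and is not RunCache's actual middleware API:

```javascript
// Generic middleware chain: each middleware wraps the next handler
function composeMiddleware(middlewares, finalHandler) {
  return middlewares.reduceRight(
    (next, middleware) => (op) => middleware(op, next),
    finalHandler
  );
}

// Example: a logging middleware applied to every cache operation
const logging = (op, next) => {
  console.log(`cache ${op.type}: ${op.key}`);
  return next(op);
};

const handler = composeMiddleware([logging], (op) => `handled:${op.key}`);
handler({ type: 'get', key: 'user:1' }); // logs, then returns 'handled:user:1'
```

Cross-cutting concerns like logging, metrics, and key prefixing fit this pattern well because they apply uniformly without touching each call site.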