# Timeout Configuration

ERA Agent provides flexible timeout configuration at multiple levels. This guide shows you how to control execution timeouts and container lifecycle for optimal performance.
## Quick Reference

| Timeout Type | Default | Scope | Can Change at Runtime? |
|---|---|---|---|
| Execution Timeout | 30 seconds | Per-request | ✅ Yes (per-run override) |
| Session Default Timeout | 30 seconds | Per-session | ✅ Yes (PATCH /api/sessions/{id}) |
| Container Sleep | 5 minutes | Global | ❌ No (requires redeploy) |
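The first two rows of the table map directly onto request bodies. As an illustrative sketch (these helpers are hypothetical, not part of any official client; the field names match the curl examples throughout this guide):

```python
# Hypothetical payload builders for the two runtime-configurable timeouts.
# Field names ("timeout", "default_timeout") are taken from the API
# examples in this guide; sending the requests is left to your HTTP client.

def run_body(code, timeout=None):
    """Body for POST /api/sessions/{id}/run; per-request timeout is optional."""
    body = {"code": code}
    if timeout is not None:
        body["timeout"] = timeout  # overrides the session default for this run
    return body

def session_body(session_id, default_timeout=30):
    """Body for POST /api/sessions; sets the session-wide default timeout."""
    return {
        "language": "python",
        "session_id": session_id,
        "persistent": True,
        "default_timeout": default_timeout,
    }

print(run_body("print('hi')", timeout=60))
```

The container sleep duration, by contrast, is a deployment-time setting and has no request-body equivalent.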
## Execution Timeout

The execution timeout controls how long a single code execution can run before being terminated.

### Per-Request Timeout

Override the timeout for a specific execution:
```bash
# 60 second timeout
curl -X POST https://anewera.dev/api/sessions/my-session/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import time; time.sleep(45); print(\"Done!\")",
    "timeout": 60
  }'

# 5 minute timeout for long operations
curl -X POST https://anewera.dev/api/sessions/my-session/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "# Long-running data processing",
    "timeout": 300
  }'

# 10 minute timeout
curl -X POST https://anewera.dev/api/sessions/my-session/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "# Very long operation",
    "timeout": 600
  }'
```

### Session Default Timeout
Set a default timeout for all executions in a session:

```bash
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "long-runner",
    "persistent": true,
    "default_timeout": 120
  }'
```

Now all executions in this session will use 120 seconds by default:

```bash
# Uses 120 second timeout (session default)
curl -X POST https://anewera.dev/api/sessions/long-runner/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import time; time.sleep(90); print(\"Completed!\")"
  }'

# Override with 300 seconds for this specific run
curl -X POST https://anewera.dev/api/sessions/long-runner/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import time; time.sleep(250); print(\"Long task done!\")",
    "timeout": 300
  }'
```

### Updating Session Default Timeout
You can update the default timeout of an existing session at any time:

```bash
# Update session timeout from 30s to 5 minutes
curl -X PATCH https://anewera.dev/api/sessions/long-runner \
  -H "Content-Type: application/json" \
  -d '{
    "default_timeout": 300
  }'
```

This is useful when:
- You realize your workload needs more time after creating the session
- Different phases of processing have different time requirements
- You want to adjust timeout based on runtime conditions
Example workflow:
```bash
# 1. Create session with standard timeout
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "adaptive-timeout",
    "persistent": true,
    "default_timeout": 30
  }'

# 2. Run quick tasks with 30s timeout
curl -X POST https://anewera.dev/api/sessions/adaptive-timeout/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "print(\"Quick task\")"
  }'

# 3. Need longer timeout for next phase - update it!
curl -X PATCH https://anewera.dev/api/sessions/adaptive-timeout \
  -H "Content-Type: application/json" \
  -d '{
    "default_timeout": 600
  }'

# 4. Now long-running tasks use 10 minute timeout by default
curl -X POST https://anewera.dev/api/sessions/adaptive-timeout/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "# Long data processing task that takes 8 minutes..."
  }'
```

## Timeout Priority
Timeout values are applied in this order:

1. Per-request timeout (highest priority)
2. Session `default_timeout`
3. Global default (30 seconds)
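The same resolution, expressed as a runnable Python sketch (this mirrors the priority list above; it is not the server's actual implementation):

```python
def effective_timeout(request_timeout=None, session_default=None):
    """Resolve the timeout applied to a run.

    Priority: per-request value, then the session's default_timeout,
    then the global 30-second default.
    """
    if request_timeout is not None:
        return request_timeout
    if session_default is not None:
        return session_default
    return 30

print(effective_timeout(60, 120))    # 60 - per-request wins
print(effective_timeout(None, 120))  # 120 - session default
print(effective_timeout())           # 30 - global default
```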
```javascript
// Priority example
timeout = request.timeout || session.default_timeout || 30
```

## Container Lifecycle
### Container Sleep Duration

Containers stay alive between requests to avoid cold starts. Configure how long they stay active:

**Current setting:** 5 minutes

**Location:** `cloudflare/src/index.ts`
```typescript
export class EraAgent extends Container {
  defaultPort = 8787;
  sleepAfter = '5m'; // Container stops after 5 minutes of inactivity
}
```

### Changing Container Sleep Duration

Edit `cloudflare/src/index.ts` and redeploy:

```typescript
export class EraAgent extends Container {
  defaultPort = 8787;
  sleepAfter = '30m'; // Keep alive for 30 minutes
  // or: sleepAfter = '1h'; // Keep alive for 1 hour
  // or: sleepAfter = '2h'; // Keep alive for 2 hours
}
```

Then deploy:

```bash
cd cloudflare
npx wrangler deploy
```

### Benefits of Longer Container Lifetime
- ✅ Faster subsequent requests - No cold start overhead
- ✅ Persistent in-memory caches - Data survives between requests
- ✅ Reuse package installations - Dependencies stay loaded
- ✅ Lower latency - Instant response for frequent operations
### Trade-offs

- ⚠️ Higher resource usage - Containers kept in memory
- ⚠️ Billing implications - Longer-running containers cost more
- ⚠️ Memory leaks - Long-lived containers may accumulate memory issues
## Use Cases & Recommendations

### Quick Scripts (< 30 seconds)

Default settings work great:
```bash
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "quick-task",
    "persistent": false
  }'

# Run with default 30s timeout
curl -X POST https://anewera.dev/api/sessions/quick-task/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "print(\"Hello World!\")"
  }'
```

### Data Processing (1-5 minutes)
Set a session default timeout:

```bash
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "data-processor",
    "persistent": true,
    "default_timeout": 300,
    "setup": {
      "pip": {
        "requirements": "pandas numpy"
      }
    }
  }'

# All runs use 5 minute timeout automatically
curl -X POST https://anewera.dev/api/sessions/data-processor/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import pandas as pd; df = pd.read_csv(\"large_file.csv\"); result = df.groupby(\"category\").sum(); print(result)"
  }'
```

### Long-Running Operations (5+ minutes)
Use a per-request timeout override:

```bash
# Create session with reasonable default
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "ml-training",
    "persistent": true,
    "default_timeout": 120
  }'

# Most operations use 2 minute default
curl -X POST https://anewera.dev/api/sessions/ml-training/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "# Data preprocessing"
  }'

# Override for long training runs
curl -X POST https://anewera.dev/api/sessions/ml-training/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "# Train model for 15 minutes",
    "timeout": 900
  }'
```

### Web Scraping with Rate Limits
Balance timeout with API rate limits:

```bash
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "scraper",
    "persistent": true,
    "default_timeout": 180,
    "allowInternetAccess": true,
    "setup": {
      "pip": {
        "requirements": "requests beautifulsoup4"
      }
    }
  }'

# Scrape with delays for rate limiting
curl -X POST https://anewera.dev/api/sessions/scraper/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import requests; import time; urls = [...]; results = []; [results.append(requests.get(url).text) or time.sleep(2) for url in urls]; print(len(results))"
  }'
```

### Batch Processing
Process items with progress tracking:
```python
import json
import time

# Load session data
with open('.session_data.json', 'r') as f:
    data = json.load(f)

if 'processed' not in data:
    data['processed'] = []
    data['pending'] = list(range(100))  # 100 items to process

# Process batch (within timeout)
start_time = time.time()
max_duration = 110  # Leave 10s buffer for 120s timeout

while data['pending'] and (time.time() - start_time) < max_duration:
    item = data['pending'].pop(0)

    # Process item (simulate work)
    time.sleep(1)
    result = f"processed_{item}"

    data['processed'].append(result)

# Save progress
with open('.session_data.json', 'w') as f:
    json.dump(data, f)

print(f"Processed: {len(data['processed'])}/100")
print(f"Remaining: {len(data['pending'])}")
```

Run repeatedly until complete:
```bash
# Run multiple times until all items are processed
while true; do
  RESULT=$(curl -s -X POST https://anewera.dev/api/sessions/batch/run \
    -H "Content-Type: application/json" \
    -d '{"code": "..."}')

  REMAINING=$(echo "$RESULT" | jq -r '.stdout' | grep "Remaining" | awk '{print $2}')

  if [ "$REMAINING" = "0" ]; then
    echo "Batch complete!"
    break
  fi

  echo "Still processing... $REMAINING items remaining"
  sleep 5
done
```

## Handling Timeouts
Section titled “Handling Timeouts”Detecting Timeout
Section titled “Detecting Timeout”Exit code 124 indicates timeout:
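If you are checking the result from Python rather than the shell, a minimal sketch of the same check (assuming the JSON response shape used throughout this guide, with a top-level `exit_code` field):

```python
def timed_out(result: dict) -> bool:
    """True when a run response reports exit code 124 (the timeout code)."""
    # Compare as a string so numeric and string exit codes both match.
    return str(result.get("exit_code")) == "124"

print(timed_out({"exit_code": 124, "stdout": ""}))    # True
print(timed_out({"exit_code": 0, "stdout": "Done"}))  # False
```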
```bash
RESULT=$(curl -s -X POST https://anewera.dev/api/sessions/my-session/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import time; time.sleep(60)",
    "timeout": 10
  }')

EXIT_CODE=$(echo "$RESULT" | jq -r '.exit_code')

if [ "$EXIT_CODE" = "124" ]; then
  echo "Operation timed out!"
else
  echo "Operation completed"
fi
```

### Graceful Timeout Handling

Save progress before timeout:
```python
import json
import time
import signal

# Load data
with open('.session_data.json', 'r') as f:
    data = json.load(f)

# Track progress
if 'items_processed' not in data:
    data['items_processed'] = 0

# Setup timeout handler
def save_and_exit(signum, frame):
    with open('.session_data.json', 'w') as f:
        json.dump(data, f)
    print(f"Timeout! Saved progress: {data['items_processed']} items")
    exit(124)

# Register signal handler (optional, VM will kill anyway)
signal.signal(signal.SIGTERM, save_and_exit)

# Process items
items = list(range(100))
start_time = time.time()
max_duration = 25  # Buffer for 30s timeout

for item in items:
    if time.time() - start_time > max_duration:
        break

    # Process
    time.sleep(0.5)
    data['items_processed'] += 1

# Save progress
with open('.session_data.json', 'w') as f:
    json.dump(data, f)

print(f"Completed: {data['items_processed']} items")
```

### Retry Logic
Section titled “Retry Logic”#!/bin/bash
SESSION_ID="retry-demo"MAX_RETRIES=3RETRY_COUNT=0
while [ $RETRY_COUNT -lt $MAX_RETRIES ]; do RESULT=$(curl -s -X POST https://anewera.dev/api/sessions/$SESSION_ID/run \ -H "Content-Type: application/json" \ -d '{ "code": "import time; time.sleep(40)", "timeout": 30 }')
EXIT_CODE=$(echo "$RESULT" | jq -r '.exit_code')
if [ "$EXIT_CODE" = "0" ]; then echo "Success!" exit 0 elif [ "$EXIT_CODE" = "124" ]; then RETRY_COUNT=$((RETRY_COUNT + 1)) echo "Timeout! Retry $RETRY_COUNT/$MAX_RETRIES" sleep 2 else echo "Error: Exit code $EXIT_CODE" exit 1 fidone
echo "Failed after $MAX_RETRIES retries"exit 1Best Practices
### 1. Choose Appropriate Timeouts

```
# ✅ Good: Match timeout to expected duration
"timeout": 60           # For a 45-second operation

# ❌ Bad: Excessive timeout
"timeout": 3600         # For a 10-second operation

# ✅ Good: Session default for a consistent workload
"default_timeout": 120  # All operations take ~90s

# ✅ Good: Override for exceptions
"timeout": 600          # Special long-running operation
```

### 2. Add Progress Tracking
Section titled “2. Add Progress Tracking”import jsonimport time
with open('.session_data.json', 'r') as f: data = json.load(f)
data['progress'] = { 'total': 100, 'completed': 0, 'started_at': time.time()}
for i in range(100): # Process item time.sleep(0.5)
# Update progress data['progress']['completed'] = i + 1
# Save every 10 items if (i + 1) % 10 == 0: with open('.session_data.json', 'w') as f: json.dump(data, f)
# Final savewith open('.session_data.json', 'w') as f: json.dump(data, f)
print(f"Processed: {data['progress']['completed']}")3. Use Callbacks for Long Operations
For operations that exceed practical timeout limits, use callbacks:
```bash
curl -X POST https://anewera.dev/api/sessions \
  -H "Content-Type: application/json" \
  -d '{
    "language": "python",
    "session_id": "async-task",
    "persistent": true,
    "data": {
      "callback_url": "https://your-app.com/webhook"
    }
  }'
```

Code with callback:
```python
import json
import requests

with open('.session_data.json', 'r') as f:
    data = json.load(f)

callback_url = data.get('callback_url')

# Do work
result = process_long_task()

# Notify completion
if callback_url:
    requests.post(callback_url, json={
        'status': 'completed',
        'result': result
    })
```

See Callbacks & Webhooks for more details.
### 4. Break Down Large Tasks

```bash
# ❌ Bad: One giant operation
curl ... -d '{"code": "process_all_100k_records()", "timeout": 3600}'

# ✅ Good: Batched operations
for batch in {1..100}; do
  curl ... -d '{"code": "process_batch('$batch')", "timeout": 60}'
done
```

### 5. Monitor Container Usage
```bash
# Check session info
curl https://anewera.dev/api/sessions/my-session | jq '{
  id: .id,
  language: .language,
  default_timeout: .default_timeout,
  last_run: .last_run_at
}'
```

## Monitoring & Debugging
### Check Execution Duration

```bash
RESULT=$(curl -s -X POST https://anewera.dev/api/sessions/my-session/run \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import time; time.sleep(5); print(\"Done\")",
    "timeout": 30
  }')

# Check duration
echo "$RESULT" | jq '{
  exit_code: .exit_code,
  duration: .duration
}'
```

Output:

```json
{
  "exit_code": 0,
  "duration": "5.123s"
}
```

### Track Session Performance
Section titled “Track Session Performance”import jsonimport time
with open('.session_data.json', 'r') as f: data = json.load(f)
if 'metrics' not in data: data['metrics'] = { 'runs': 0, 'total_duration': 0, 'avg_duration': 0 }
start = time.time()
# Your code heretime.sleep(2)
duration = time.time() - start
# Update metricsdata['metrics']['runs'] += 1data['metrics']['total_duration'] += durationdata['metrics']['avg_duration'] = data['metrics']['total_duration'] / data['metrics']['runs']
with open('.session_data.json', 'w') as f: json.dump(data, f)
print(f"Runs: {data['metrics']['runs']}")print(f"Avg duration: {data['metrics']['avg_duration']:.2f}s")Troubleshooting
### Operation Always Times Out

**Problem:** Code consistently hits the timeout.

**Solutions:**

1. Increase the timeout:

   ```bash
   curl ... -d '{"code": "...", "timeout": 120}'
   ```

2. Optimize the code:

   ```python
   # ❌ Slow
   results = [heavy_function(x) for x in large_list]

   # ✅ Faster
   results = [optimized_function(x) for x in large_list[:100]]
   ```

3. Break the work into smaller operations.
### Container Cold Starts

**Problem:** The first request is always slow.

**Solution:** Increase the `sleepAfter` duration:

```typescript
sleepAfter = '30m'; // Keep warm for 30 minutes
```

### Inconsistent Performance
**Problem:** Some runs are fast, others slow.

**Causes:**
- Container cold start vs. warm
- Network latency (for HTTP requests)
- Variable data sizes
**Solution:** Set appropriate timeouts and monitor:
```
# Use default_timeout for consistency
"default_timeout": 90  # Handles both fast and slow cases
```

## Next Steps
Section titled “Next Steps”- Learn about Data & Communication for state management
- See Callbacks & Webhooks for async operations
- Explore Multi-File Projects for complex workflows
- Check Code Management for reusable code patterns