🔍 Code Extractor

function main_v21

Maturity: 48

Asynchronous main function that runs a reMarkable tablet file watcher as a separate process, monitoring a specified folder for new documents, staging them for external processing, and uploading the processed responses back to reMarkable Cloud.

File: /tf/active/vicechatdev/e-ink-llm/run_remarkable_bridge.py
Lines: 23 - 99
Complexity: complex

Purpose

This function serves as the entry point for a bridge service between reMarkable Cloud and a local processing system. It authenticates with reMarkable Cloud, watches a designated folder for new documents, stages them for processing by copying them to a bridge directory, waits for processed responses, and uploads the results back to reMarkable Cloud. It is designed to enable automated workflows in which documents written on a reMarkable tablet are processed by external systems (such as LLMs) and the responses are automatically synced back.

Source Code

async def main():
    """Run reMarkable file watcher as a separate process"""
    
    # Parse command line arguments
    if len(sys.argv) < 2:
        print("Usage: python run_remarkable_bridge.py <config_json>")
        sys.exit(1)
    
    config = json.loads(sys.argv[1])
    
    # Initialize reMarkable manager
    manager = RemarkableCloudManager()
    
    # Authenticate if needed
    one_time_code = config.get('one_time_code')
    if not await manager.authenticate(one_time_code):
        print("❌ reMarkable authentication failed!")
        sys.exit(1)
    
    # Set up file watcher
    watch_folder = config.get('watch_folder', '/E-Ink LLM Input')
    poll_interval = config.get('poll_interval', 60)
    output_folder = config.get('output_folder', '/E-Ink LLM Output')
    bridge_dir = Path(config.get('bridge_dir', '/tmp/remarkable-bridge'))
    bridge_dir.mkdir(exist_ok=True)
    
    watcher = RemarkableFileWatcher(manager, watch_folder, poll_interval)
    
    async def process_file(document, local_file):
        """Process a new reMarkable file"""
        print(f"📄 Processing reMarkable document: {document.name}")
        
        # Copy file to bridge directory for main process
        input_file = bridge_dir / f"INPUT_{document.id}_{local_file.name}"
        import shutil
        shutil.copy2(local_file, input_file)
        
        # Create metadata file
        metadata = {
            'document_id': document.id,
            'document_name': document.name,
            'original_path': str(local_file),
            'output_folder': output_folder,
            'timestamp': str(datetime.now())
        }
        metadata_file = bridge_dir / f"META_{document.id}.json"
        with open(metadata_file, 'w') as f:
            json.dump(metadata, f, indent=2)
        
        print(f"📁 Staged for processing: {input_file}")
        
        # Wait for main process to create response
        response_file = bridge_dir / f"OUTPUT_{document.id}.pdf"
        timeout = 300  # 5 minutes timeout
        start_time = time.time()
        
        while not response_file.exists() and (time.time() - start_time) < timeout:
            await asyncio.sleep(5)
        
        if response_file.exists():
            print(f"📤 Uploading response to reMarkable Cloud...")
            success = await manager.upload_document(
                response_file, output_folder, f"Response to {document.name}"
            )
            if success:
                print(f"✅ Response uploaded successfully")
                # Clean up
                response_file.unlink(missing_ok=True)
                input_file.unlink(missing_ok=True)
                metadata_file.unlink(missing_ok=True)
            else:
                print(f"❌ Failed to upload response")
        else:
            print(f"⏰ Timeout waiting for response processing")
    
    # Start watching
    await watcher.start_watching(process_file)

Return Value

This function does not return a value (it implicitly returns None). It runs indefinitely as a file watcher service until interrupted; it exits early with sys.exit(1) if the configuration argument is missing from the command line or reMarkable authentication fails.
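A minimal runner sketch is shown below. The actual __main__ block of run_remarkable_bridge.py is not included in this extract, so the shutdown handling here is an assumption rather than the script's own code.

import asyncio
import sys

if __name__ == "__main__":
    try:
        # Runs the watcher until the process is interrupted
        asyncio.run(main())
    except KeyboardInterrupt:
        # Assumed shutdown behaviour; the real script may differ
        print("Watcher stopped by user")
        sys.exit(0)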

Dependencies

  • asyncio
  • json
  • sys
  • tempfile
  • pathlib
  • time
  • datetime
  • shutil
  • remarkable_cloud

Required Imports

import asyncio
import json
import sys
import tempfile
from pathlib import Path
from remarkable_cloud import RemarkableCloudManager
from remarkable_cloud import RemarkableFileWatcher
import time
from datetime import datetime

Conditional/Optional Imports

These imports are only needed under specific conditions:

import shutil

Condition: imported inside the nested process_file function, where it is used to copy the downloaded file into the bridge directory.
Status: required (conditional)

Usage Example

import asyncio
import json

# Prepare configuration
config = {
    'one_time_code': 'abc123xyz',
    'watch_folder': '/E-Ink LLM Input',
    'poll_interval': 60,
    'output_folder': '/E-Ink LLM Output',
    'bridge_dir': '/tmp/remarkable-bridge'
}

# Run from command line:
# python run_remarkable_bridge.py '{"one_time_code": "abc123xyz", "watch_folder": "/E-Ink LLM Input", "poll_interval": 60, "output_folder": "/E-Ink LLM Output", "bridge_dir": "/tmp/remarkable-bridge"}'

# Or programmatically (note: main() is a coroutine, so it must be run
# inside an event loop rather than awaited at module level):
import sys
sys.argv = ['run_remarkable_bridge.py', json.dumps(config)]
asyncio.run(main())
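Because the watcher is meant to run as a separate process, a parent program could also launch the script directly. The sketch below is illustrative; the launch mechanism actually used by the main process is not part of this extract.

import json
import subprocess

config = {
    'watch_folder': '/E-Ink LLM Input',
    'poll_interval': 60,
    'output_folder': '/E-Ink LLM Output',
    'bridge_dir': '/tmp/remarkable-bridge'
}

# Spawn the bridge watcher as a child process
proc = subprocess.Popen(
    ['python', 'run_remarkable_bridge.py', json.dumps(config)]
)

# ... the parent process does its own work, then stops the watcher:
# proc.terminate(); proc.wait()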

Best Practices

  • Ensure the bridge directory has appropriate read/write permissions before running
  • The function expects a separate main process to monitor the bridge directory and create OUTPUT_*.pdf files in response to INPUT_* files (a sketch of that consumer loop follows this list)
  • Handle the 5-minute timeout appropriately; consider adjusting it based on the expected processing time
  • Clean up bridge directory periodically to prevent accumulation of orphaned files
  • Implement proper error handling and logging in production environments
  • The one_time_code is only needed for initial authentication; subsequent runs can use cached credentials
  • Monitor disk space in the bridge directory as files are temporarily stored there
  • Consider implementing graceful shutdown handling for SIGTERM/SIGINT signals
  • The poll_interval should be balanced between responsiveness and API rate limits
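
The following is a minimal sketch of the consumer side of the bridge protocol, assuming the same bridge directory and INPUT_/META_/OUTPUT_ file-name conventions as the source above. process_document() is a hypothetical placeholder for whatever pipeline (e.g. an LLM call) produces the response PDF; it is not part of the documented code.

import json
import time
from pathlib import Path

BRIDGE_DIR = Path('/tmp/remarkable-bridge')

def process_document(input_file, metadata):
    """Hypothetical placeholder: return the response PDF as bytes."""
    raise NotImplementedError

def watch_bridge(poll_seconds=5):
    while True:
        for meta_file in BRIDGE_DIR.glob('META_*.json'):
            doc_id = meta_file.stem[len('META_'):]
            metadata = json.loads(meta_file.read_text())
            inputs = list(BRIDGE_DIR.glob(f'INPUT_{doc_id}_*'))
            output_file = BRIDGE_DIR / f'OUTPUT_{doc_id}.pdf'
            # Skip entries with no staged input or an answer already written
            if not inputs or output_file.exists():
                continue
            pdf_bytes = process_document(inputs[0], metadata)
            # The watcher picks this file up, uploads it, and cleans up
            output_file.write_bytes(pdf_bytes)
        time.sleep(poll_seconds)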

Similar Components

AI-powered semantic similarity - components with related functionality:

  • function main_v65 72.8% similar

    Entry point function that orchestrates a complete synchronization of a reMarkable tablet's content, displaying progress and summary statistics.

    From: /tf/active/vicechatdev/e-ink-llm/cloudtest/sync_replica.py
  • function main_v62 72.7% similar

    Entry point function that orchestrates the analysis of a document uploaded through a reMarkable app, saves results and logs, and reports success or failure.

    From: /tf/active/vicechatdev/e-ink-llm/cloudtest/test_real_app_upload.py
  • function main_v15 72.2% similar

    A test function that uploads a PDF document to reMarkable cloud, syncs the local replica, and validates the upload with detailed logging and metrics.

    From: /tf/active/vicechatdev/e-ink-llm/cloudtest/test_raw_upload.py
  • function main_v104 71.5% similar

    A test function that uploads a PDF document to a reMarkable tablet folder using the folder's hash value as the parent identifier instead of its UUID, then verifies the upload through replica synchronization.

    From: /tf/active/vicechatdev/e-ink-llm/cloudtest/test_hash_parent_upload.py
  • function main_v103 70.0% similar

    Asynchronous main entry point for a test suite that validates Mixed Cloud Processor functionality, including authentication, discovery, and dry-run operations for reMarkable and OneDrive integration.

    From: /tf/active/vicechatdev/e-ink-llm/test_mixed_mode.py