If you encounter issues while using browseruse-bench, here are ways to get help.

Common Issues

Installation Issues

Try installing with uv:
pip install uv
uv sync
Ensure Chrome is installed and in PATH, or use Lexmount Cloud Browser.
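A quick way to check for a Chrome binary on PATH is a sketch like the following; the binary names vary by OS and distribution, so adjust the list as needed:

```shell
# Look for a Chrome/Chromium binary on PATH (names differ across platforms)
found=""
for bin in google-chrome google-chrome-stable chromium chromium-browser; do
  if command -v "$bin" >/dev/null 2>&1; then
    found="$bin"
    break
  fi
done
echo "${found:-Chrome not found in PATH}"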
Ensure Node.js version >= 18:
node --version
npm install -g @agent-tars/cli@0.3.0

Runtime Issues

Increase timeout duration:
uv run scripts/run.py --agent browser-use --benchmark LexBench-Browser --timeout 600
Check that the API key in the .env file is correct:
cat .env | grep API_KEY
  1. Check LEXMOUNT_API_KEY and LEXMOUNT_PROJECT_ID
  2. Confirm network access to Lexmount API
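For step 1, a small sketch that counts how many of the two variables appear in .env (the variable names are the ones listed above; the file path is assumed to be the project root .env):

```shell
# Count the required Lexmount variables present in .env
# (prints "found 0 of 2" if the file is missing or empty)
count=$(grep -cE '^LEXMOUNT_(API_KEY|PROJECT_ID)=' .env 2>/dev/null || true)
echo "found ${count:-0} of 2 required variables"
```

Anything less than 2 means a variable is missing or commented out.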

Evaluation Issues

Check OPENAI_API_KEY or EVAL_MODEL_BASE_URL configuration.
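A minimal sketch to report whether each evaluation variable is set in the current shell (variable names are the ones mentioned above; this checks exported environment variables, not the .env file):

```shell
# Report whether each evaluation variable is exported in this shell
for var in OPENAI_API_KEY EVAL_MODEL_BASE_URL; do
  if [ -n "$(printenv "$var")" ]; then
    echo "$var is set"
  else
    echo "$var is NOT set"
  fi
done
```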
Try lowering the score threshold:
uv run scripts/eval.py --agent browser-use --benchmark LexBench-Browser --score-threshold 50

View Logs

Runtime Logs

Runtime logs are saved in the logs/ directory:
# View latest logs
ls -lt logs/ | head -5

# Tail log content
tail -f logs/<latest_log>.log

Debug Mode

Enable debug mode for more detailed output:
uv run scripts/run.py \
  --agent browser-use \
  --benchmark LexBench-Browser \
  --mode first_n --count 1 \
  --debug

Submit Issue

If none of the above methods solves your problem, please submit an issue on GitHub:
  1. Visit GitHub Issues
  2. Click “New Issue”
  3. Provide the following information:
    • Problem description
    • Reproduction steps
    • Error logs
    • System environment (OS, Python version, etc.)
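The environment details requested above can be gathered with standard commands; a sketch (binary names like python3 and node may differ on your system):

```shell
# Collect system environment info for the issue report
uname -srm
python3 --version 2>/dev/null || echo "python3: not found"
node --version 2>/dev/null || echo "node: not found"
```

Paste the output into the "Environment Info" section of the issue template below.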

Issue Template

## Problem Description
Briefly describe the problem encountered

## Reproduction Steps
1. Run command `...`
2. See error `...`

## Error Logs
Paste error logs

## Environment Info
- OS: macOS / Linux / Windows
- Python: 3.10 / 3.11 / 3.12
- browseruse-bench version: ...

Community Support

  • GitHub Discussions: Questions and discussions
  • GitHub Issues: Report bugs and feature requests

Commercial Support

For commercial support or enterprise deployment inquiries, please contact support@lexmount.com.