Evaluate 30 Logging Libraries for Your Stack
Run one Devin session per library to score pricing, performance, and SDK quality, then merge everything into a ranked comparison table.

Write a research prompt with a consistent template
The key to useful parallel research is giving every session the same checklist. Each session researches one library independently, so the template ensures results are directly comparable when merged.

Open a new Advanced Devin session (click the sparkle icon in the top-left of the input box), then switch to the Start Batch Sessions tab.
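A template along these lines keeps every session's output mergeable. The specific fields below are illustrative, not prescribed by the feature; swap in the criteria that matter for your stack:

```
Research the logging library {LIBRARY} and fill in this report:

- Pricing: license type; cost of any hosted/enterprise tier (score 1-5, plus notes)
- Performance: published benchmarks and overhead claims (score 1-5, plus notes)
- SDK quality: docs, typings, API ergonomics, release cadence (score 1-5, plus notes)
- Community: maintenance activity, open issues, responsiveness
- Verdict: one-paragraph recommendation

Cite the sources you used for each section.
```

Keeping the scores numeric makes the final ranking step mechanical rather than a judgment call across 30 free-form writeups.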
Review and approve the proposed sessions
After submitting, Advanced Devin parses your list and proposes one session per library, showing you a preview of the proposed sessions. Review the list and click Approve to launch all sessions simultaneously. Each session runs independently, browsing the library's website, reading documentation, checking developer forums, and filling in the template.

If you want to skip or add libraries, edit the list before approving. You can also attach a playbook to ensure every session follows the same research methodology.
Collect and compare results
Once all sessions complete, Advanced Devin automatically merges the individual reports into a single comparison. The output follows whatever format you requested, such as a compiled, spreadsheet-style comparison table.

You can ask follow-up questions in the same Advanced session, since it has context from all the child sessions.

Once you've picked a winner, you can launch a Devin session directly from the same Advanced session to set up the library in your repo.
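Conceptually, the merge step is just a rank-and-tabulate over the per-library reports. A minimal sketch of that idea in Python, assuming each child session returned the template's numeric fields as a dict (library names are real, but the scores and field names here are hypothetical):

```python
# Hypothetical per-library reports, as if each child session
# returned the template's scored fields as a dict.
reports = [
    {"library": "pino",    "pricing": 5, "performance": 5, "sdk_quality": 4},
    {"library": "winston", "pricing": 5, "performance": 3, "sdk_quality": 4},
    {"library": "bunyan",  "pricing": 5, "performance": 4, "sdk_quality": 3},
]

# Rank by total score across the criteria, highest first.
criteria = ["pricing", "performance", "sdk_quality"]
ranked = sorted(reports, key=lambda r: sum(r[c] for c in criteria), reverse=True)

# Render a simple Markdown comparison table.
header = "| library | " + " | ".join(criteria) + " | total |"
rows = [
    "| {} | {} | {} |".format(
        r["library"],
        " | ".join(str(r[c]) for c in criteria),
        sum(r[c] for c in criteria),
    )
    for r in ranked
]
print("\n".join([header, "|---" * (len(criteria) + 2) + "|"] + rows))
```

The same shape scales to 30 libraries; the hard part is keeping the scores comparable, which is exactly what the shared template buys you.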
Go deeper on the shortlist
Once you have a shortlist, start targeted follow-up sessions for deeper evaluation.
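A follow-up prompt can go much deeper than the survey pass, since it only targets a few candidates. For example (the libraries and tasks here are illustrative):

```
For each of {pino, winston}: wire the library into our sample service,
measure request latency with and without logging enabled, and report
the overhead, the integration effort, and any footguns you hit.
```

Hands-on integration sessions like this surface issues that documentation research alone tends to miss.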
Tips
