1. Create a test-writing playbook

Your e-commerce monorepo has 30+ modules, but only a handful have meaningful test coverage. You want to go from 44% overall coverage to 80%, starting with the 8 worst offenders. Before launching parallel sessions, you need a playbook that ensures all 8 sessions write tests the same way.

Ask Devin to create the playbook for you — just describe your testing conventions in any session. This playbook becomes the shared instruction set for every parallel session. You can also add Knowledge entries about your test utilities, mocking patterns, or any project-specific quirks (e.g., “always call resetMocks() in afterEach”).
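An illustrative playbook-creation prompt. The wording, the Jest setup, and the file layout here are assumptions to adapt to your repo; `resetMocks()` is the project-specific helper mentioned above:

```text
Create a playbook for writing unit tests in this repo. Conventions:
- Use Jest with the shared setup in test/setup.ts
- Co-locate tests as <module>/__tests__/<name>.test.ts
- Always call resetMocks() in afterEach
- Never hit the network; mock external services instead
- Report before/after line coverage in the PR description
```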
2. Launch 8 parallel sessions at 6 PM

At the end of your workday, open a new Devin session from the Devin home page and describe the batch task:
  1. Select your test-writing playbook from the dropdown
  2. Describe the task in the prompt
  3. Review the proposed sessions — Devin lists each module with its current coverage and confirms which sessions it will create:
Proposed sessions (the 8 lowest-coverage modules):
  1. src/services/PaymentService — 31% coverage
  2. src/services/UserService — 38% coverage
  3. src/api/routes/billing — 42% coverage
  4. src/middleware/auth — 44% coverage
  5. src/services/NotificationSvc — 47% coverage
  6. src/components/Checkout — 49% coverage
  7. src/utils/validation — 51% coverage
  8. src/services/SearchService — 53% coverage

Start 8 parallel sessions? (y/n)
  4. Approve the batch and close your laptop. All 8 sessions launch simultaneously on separate Devin machines, each following your playbook independently.
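An illustrative batch prompt for the steps above (the wording is an assumption; adapt it to your repo and playbook name):

```text
Using my test-writing playbook, find the 8 modules with the lowest line coverage
and start one parallel session per module. Each session should raise its module
to at least 80% line coverage and open a separate PR with a coverage summary.
```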
3. Wake up to 8 PRs

By morning, each session has finished and opened its own PR. You’ll see 8 PRs in your repository, each containing new test files and a coverage summary:
Module                       | Before | After  | PR     | Status
-----------------------------|--------|--------|--------|--------
src/services/PaymentService  |  31%   |  87%   | #412   | Ready
src/services/UserService     |  38%   |  84%   | #413   | Ready
src/api/routes/billing       |  42%   |  91%   | #414   | Ready
src/middleware/auth          |  44%   |  82%   | #415   | Ready
src/services/NotificationSvc |  47%   |  85%   | #416   | Ready
src/components/Checkout      |  49%   |  83%   | #417   | Ready
src/utils/validation         |  51%   |  93%   | #418   | Ready
src/services/SearchService   |  53%   |  86%   | #419   | Ready

Overall coverage: 44% -> 68% (+24 pts)

Merge the PRs in any order — since each session only adds new test files to its own module, conflicts are rare. If two sessions touched a shared test helper, resolve the conflict manually or ask Devin to fix it.
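If you use the GitHub CLI, a minimal sketch for merging the batch (PR numbers are from the table above; assumes `gh` is authenticated for your repo, and the squash/delete-branch flags are a choice, not a requirement):

```shell
# Merge each PR in turn, stopping at the first failure (e.g., a conflict).
# MERGE_CMD is overridable so you can dry-run, e.g. MERGE_CMD=echo.
merge_prs() {
  local cmd="${MERGE_CMD:-gh pr merge}"
  for pr in "$@"; do
    $cmd "$pr" --squash --delete-branch || return 1
  done
}

# Example: merge_prs 412 413 414 415 416 417 418 419
```

A dry run with `MERGE_CMD=echo merge_prs 412 413` prints the merge commands instead of executing them, which is useful for checking the PR list before committing to it.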
4. Run a second batch for the next tier

One overnight batch won’t hit your 80% target across the entire codebase. The next evening, run a follow-up batch for the next tier of modules. You can also shift from unit tests to integration tests for critical user flows. Two nights of parallel sessions can take a codebase from under 50% coverage to over 80% — work that would take an engineer weeks of dedicated effort.
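To pick the next tier mechanically, you can rank modules by current coverage. A sketch assuming Jest's `json-summary` reporter has written `coverage/coverage-summary.json` and that `jq` is installed; the 50–70% band is an arbitrary cutoff for illustration:

```shell
# next_tier <summary.json>: list files with line coverage in [50%, 70%),
# lowest first, one "pct%<TAB>file" pair per line.
next_tier() {
  jq -r '
    to_entries
    | map(select(.key != "total"))
    | map({file: .key, pct: .value.lines.pct})
    | map(select(.pct >= 50 and .pct < 70))
    | sort_by(.pct)
    | .[] | "\(.pct)%\t\(.file)"
  ' "$1"
}

# Example: next_tier coverage/coverage-summary.json
```

The output doubles as the module list for the next evening's batch prompt.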