Resolving conflicts between suppliers and engineers will be critical to maintaining a fair and valuable platform. Although the primary method of securing a contract should be comprehensive automated testing, there will undoubtedly be room for disagreement over submitted work. For these cases, we offer a "dispute resolution process" in which contracts are arbitrated to a fair conclusion by three randomly chosen technical peers.

  1. The supplier contests the job results. 3% of the bounty plus 3% of the buy-in are used to incentivize the resolution process. This makes contesting painful for both parties and incentivizes working it out between themselves. The 6% is escrowed for the reviewers when they accept the role and paid out once they submit their reviews, as sketched after this list.
  2. Option A: A "conflict resolution" job is automatically posted, which needs 3 participants to start. Option B: 3 technical DAO members are randomly chosen to participate, each with an opportunity to decline, in which case another is picked. Option C: We start a Kleros Court workflow.
  3. The 3 resolvers review the job description and all submitted material and make a decision.
  4. This decision can be contested one more time, but the job supplier has to pay an extra amount, something like 10% of the job value, which makes the resolution contract more lucrative. This should be an absolute last resort and won't happen often, because at that point 16% of the contract pay-out value is gone. We are not doing this for the MVP; maybe later.
  5. The contract automatically pays out to the winner of the dispute resolution process.
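
To make the fee math concrete, here is a minimal TypeScript sketch. The constants mirror the 3% + 3% rule in step 1 and the possible 10% re-contest fee in step 4; the function name and the bigint/basis-point representation are our assumptions, not part of the design:

```typescript
// Dispute fee math for step 1 (and the optional step-4 re-contest).
// Amounts are token base units (bigint) to avoid floating-point error.

const BPS = 10_000n;              // basis-point denominator
const DISPUTE_FEE_BPS = 300n;     // 3% of the bounty and 3% of the buy-in (step 1)
const RECONTEST_FEE_BPS = 1_000n; // extra ~10% of the job for a second contest (step 4)

/** Fee escrowed when the results are contested: 3% of bounty + 3% of buy-in. */
function disputeFee(bounty: bigint, buyIn: bigint): bigint {
  return (bounty * DISPUTE_FEE_BPS) / BPS + (buyIn * DISPUTE_FEE_BPS) / BPS;
}

// Example: a 1000-unit bounty with a 1000-unit buy-in escrows 60 units,
// which are later split across the 3 reviewers.
console.log(disputeFee(1_000n, 1_000n));         // 60n
console.log((1_000n * RECONTEST_FEE_BPS) / BPS); // 100n extra on a re-contest
```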

Decision Criteria

Arbitration is not a perfect process and will rely primarily on human value judgements. Criteria should be established for determining reviewer qualifications:

  1. Reviewers should be technically competent in the area being reviewed. A sufficiently senior software engineer should be capable of reviewing code in any language, running it, and determining whether the submission is valuable enough to warrant a contract pay-out. Reviewers should self-attest to their abilities in their profile using tags, which will be used to auto-assign contracts (see the selection sketch after this list).
  2. If reviewers feel uncomfortable with their ability to make a fair assessment, they have the opportunity to opt out, and another reviewer will be chosen.
  3. The submitted work should reasonably fulfill the project requirements; fulfilling 90-100% of them should be our target for a passing grade. Small, easily fixable issues should not necessarily fail the contract. Overall, we are looking for value provided per the terms of the contract: if a competent, domain-skilled reviewer finds the work sufficiently valuable, the contract should pass.
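
A minimal sketch of tag-based reviewer selection with opt-out, covering points 1 and 2 above. The `ReviewerProfile` shape, the `pickReviewers` signature, and the use of `Math.random()` are illustrative assumptions (a production version would need a fairer randomness source):

```typescript
// Hypothetical reviewer profile: tags are self-attested skills.
interface ReviewerProfile {
  address: string;
  tags: string[]; // e.g. ["solidity", "rust", "backend"]
}

/** Randomly draft `count` reviewers whose tags overlap the job's tags.
 *  A drafted reviewer may decline (point 2), in which case we re-draw. */
function pickReviewers(
  pool: ReviewerProfile[],
  jobTags: string[],
  count: number,
  declines: (r: ReviewerProfile) => boolean,
): ReviewerProfile[] {
  const remaining = pool.filter((r) => r.tags.some((t) => jobTags.includes(t)));
  const chosen: ReviewerProfile[] = [];
  while (chosen.length < count && remaining.length > 0) {
    const i = Math.floor(Math.random() * remaining.length);
    const [candidate] = remaining.splice(i, 1); // remove so no one is drafted twice
    if (!declines(candidate)) chosen.push(candidate);
  }
  if (chosen.length < count) throw new Error("not enough qualified reviewers");
  return chosen;
}
```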

Rewarding Reviewers

In order to incentivize reviews by anons, community members, or DAO trustees, we need to reward reviewers. The 6% of contract value escrowed when the dispute is opened will be distributed evenly to the 3 reviewers upon submission of their reviews, as sketched below.
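
The 6% rarely divides evenly in integer token units; here is a minimal split sketch, assuming the leftover dust goes to the first reviewer (that tiebreak is our choice, not part of the design):

```typescript
/** Split the escrowed fee evenly across reviewers. Integer-division
 *  dust (at most 2 base units for 3 reviewers) goes to the first one. */
function splitReward(fee: bigint, reviewers: number): bigint[] {
  const share = fee / BigInt(reviewers);
  const dust = fee - share * BigInt(reviewers);
  return Array.from({ length: reviewers }, (_, i) => (i === 0 ? share + dust : share));
}

console.log(splitReward(100n, 3)); // [34n, 33n, 33n]
```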

MVP

What is the fastest way to implement dispute resolution?

In our MVP, we will offer centralized dispute resolution: a pool of core DAO members will act as the reviewers, with 3 chosen at random, as sketched below.
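
A minimal sketch of the centralized MVP: pick 3 reviewers at random from a hardcoded core-member list and let a simple majority decide the pay-out. The majority rule and the member names are our assumptions; the process above only says the resolvers "make a decision":

```typescript
// Centralized MVP: 3 random core members, simple majority decides.
const CORE_MEMBERS = ["alice", "bob", "carol", "dave", "erin"]; // placeholder names

function pickThree(members: string[]): string[] {
  const pool = [...members];
  const picked: string[] = [];
  while (picked.length < 3) {
    const i = Math.floor(Math.random() * pool.length);
    picked.push(pool.splice(i, 1)[0]); // draw without replacement
  }
  return picked;
}

type Vote = "supplier" | "engineer";

/** With 3 votes there is always a strict majority. */
function resolve(votes: [Vote, Vote, Vote]): Vote {
  const forSupplier = votes.filter((v) => v === "supplier").length;
  return forSupplier >= 2 ? "supplier" : "engineer";
}

console.log(pickThree(CORE_MEMBERS));                       // e.g. ["erin", "bob", "alice"]
console.log(resolve(["supplier", "engineer", "engineer"])); // "engineer"
```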