Picking a Winner: Check that Box

Hello and welcome back to our series on army appearance judging. I’m going through the most commonly used approaches and talking about the pros and cons of each. Last time I talked about crowdsourcing, and this time I’m going to talk about objective criteria, aka checklists.

As a quick reminder, this is intended to be food for thought for Tournament Organizers (TOs), and not a defense of soft scores or a discussion of whether we should incentivize people to improve their hobby skills. I’m just going to assume that if you’re reading this, you’re already on board. 😉 In the interest of full disclosure, I feel like I should tell you that the checklist approach is my least favorite. However, I’m going to do my best to keep that bias under control and give it a fair shake. It does have its uses in the right situations, after all.

For the objective criteria approach, the TO writes a rubric of criteria and assigns a point value to each individual criterion. The criteria might describe base-level expectations, such as “the entire army is fully painted to at least a three-color tabletop standard” or “all units meet minimum model count,” or require examples of more advanced techniques, such as model conversions, wet blending, object source lighting, etc. The TO then applies the rubric to each army at the event, checks off the criteria that a given army satisfies, and totals the point values of those criteria to arrive at an appearance score. One of the commonly stated goals of this approach is to be less subjective than other approaches, since the criteria are meant to be bright-line rules.
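To make the mechanics concrete, here’s a minimal sketch of how a checklist score comes together. The criteria and point values are entirely made up for illustration; any real rubric is the TO’s to write.

```python
# A minimal sketch of checklist-style scoring. The criteria and point
# values here are hypothetical -- a real rubric is up to the TO.

RUBRIC = {
    "fully painted to a three-color tabletop standard": 10,
    "all bases textured and painted": 5,
    "unit fillers or conversions present": 5,
    "advanced technique attempted (wet blending, OSL, etc.)": 5,
}

def score_army(satisfied):
    """Total the point values of every criterion the army satisfies."""
    return sum(points for criterion, points in RUBRIC.items()
               if criterion in satisfied)

# An army that checks the first two boxes scores 15.
print(score_army({"fully painted to a three-color tabletop standard",
                  "all bases textured and painted"}))
```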

Objective Criteria Pros

  • Very Transparent: In my mind, transparency is the biggest pro for this approach and the thing it does better than any other approach. You just can’t beat a clear list of binary criteria for determining why you received a certain score. If the TO publishes the rubric in advance, you can even know what your score will be before you get to the event, and possibly do some quick hobby work to satisfy a few more criteria and bring your score up.
  • Low Level of Effort for the TO: It might not be as low as crowdsourcing, but overall this approach can be done with a low level of effort on the part of the TO. The clearer your criteria, the quicker and easier the evaluation is. Some TOs even allow players to fill out their own checklists and then just do a quick pass to make sure they were filled out correctly.
  • Low Level of Hobby Expertise Required for the TO: Again, it’s not as low as crowdsourcing, but applying a clear rubric of objective criteria doesn’t require a high degree of hobby skill and knowledge. Writing a good rubric and assigning appropriate point values does require more expertise, but the TO doesn’t necessarily have to write their own. I’ll talk about that more under potential pitfalls.

Objective Criteria Cons

  • Prescriptive: If transparency is the biggest pro for this approach, then the flip side of the coin has to be its biggest con. The more clearly defined the criteria are, the more prescriptive the rubric becomes. The more prescriptive the rubric becomes, the more it limits the flexibility and creativity of hobbyists. While it might encourage hobbyists to experiment in order to meet the criteria listed in the rubric, it does nothing to encourage or reward going above and beyond, or experimenting with techniques and effects the rubric doesn’t cover. It can also punish hobbyists who pick a specific theme and execute it to a high degree, but whose theme doesn’t lend itself to all of the prescribed criteria. For example, a hobbyist might create a beautiful Herd or Forces of Nature army with a very primal, animalistic theme. It wouldn’t make sense for such an army to contain metallic surfaces, so the hobbyist has to decide whether to stick to the theme and paint any weapons or armor as bone or stone, losing any points reserved for the “non-metallic metal” technique, or to go against the theme and look they want just to check a box. This approach rewards throwing everything and the kitchen sink at the hobby, but doesn’t necessarily incentivize hobbyists to create better-looking armies.
  • Not Actually Objective: I know, I know. I decided to call this approach “objective criteria,” and now here I am trying to tell you that it isn’t really objective. But hear me out. While the individual criteria might each be objective in a vacuum, the entire checklist isn’t. The decision to include or exclude criteria, and especially the point values to assign to each, are still subjective and based on the opinion of whoever writes the checklist. I’m not sure that you actually can remove all the subjectivity from appearance judging. By trying to do so, this approach introduces the additional problem of being prescriptive without actually accomplishing its goal.

Potential Pitfalls

  • Objectionable Criteria: One of the motivations for removing subjectivity from the appearance judging process is to limit the number of disagreements over the results. But after attending tournaments for many years, I have yet to read a checklist that didn’t contain at least one item I felt didn’t belong there. I don’t mean to go off on a rant or anything, but… one of the most memorable for me was “characters and unit leaders have been given additional attention to make them stand out.” I, and the other top hobbyist at the event, spent a long time arguing with the judge over this one, since we had both painted every model in our armies to the highest standard we could. In order to satisfy this criterion, we would have had to paint the rest of our armies to a lower standard than we did. Criteria that are overly prescriptive of techniques, models, or model vendors, criteria based on incorrect assumptions about how far top-tier hobbyists will go, and legacy criteria left over from previous game systems are all likely to draw objections from hobbyists. The only advice I can give here, if you want to use this approach, is to be proactive about soliciting feedback well before the event and to iterate on the rubric until you’ve addressed as many of the objections as you feel you need to up front.
  • No Consideration for Quality: Another downside of the checklist approach is that it doesn’t really reward how well any of the prescribed techniques are performed, just that they are attempted. The approach, again, incentivizes hobbyists to attempt lots of techniques, but doesn’t reward them for perfecting those techniques. This is another pitfall that can be particularly frustrating for more highly skilled hobbyists, since they can feel that the higher quality of their results is not being adequately recognized and rewarded. The two ways I’ve seen TOs try to deal with this problem are adding more criteria to the rubric that require higher and higher quality levels, or changing the value of some criteria to a range instead of a flat score. Both of those solutions go against the pros of the approach, though, as they require more effort from the TO during the event, introduce more subjectivity dependent on the TO’s judging capabilities, and make the criteria less clear cut. At that point, you’ve basically created a hybrid approach between objective criteria and subjective assessment that suffers from the cons of both without fully benefiting from the pros of either.
  • The Criteria Can’t Cover Everything: Some armies are works of art, and it’s hard to put into words exactly what makes them appealing, because they are unique. Sometimes it might be a particularly clever theme executed to a high level across the army. Sometimes it might be the way color choices and painting techniques just make the whole thing pop. Sometimes it might be the way the models were selected, converted, and composed to tell a clear and compelling story. Sometimes the overall effect of an army just makes you say “wow.” I’ve seen rubrics that contain a “wow factor” criterion, or even a pool of discretionary points for the TO to award based on the general effect of an army. Those are the best ways I can think of to handle the “I can’t explain it, but I know it when I see it” quality of some armies within this approach. But I feel like those solutions suffer from the same problems as the lack of consideration for quality: you end up with a hybrid approach that is both prescriptive and subjective, while requiring more of the TO. The sketch after this list shows roughly what such a hybrid rubric ends up looking like.
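For what it’s worth, here is a rough sketch of those hybrid variants, again with hypothetical criteria and values: flat checkboxes, a couple of range-scored criteria, and a discretionary “wow factor” pool. The ranges and the pool are exactly where the judgment calls, and the subjectivity, creep back in.

```python
# A sketch of the hybrid variants described above, with hypothetical
# criteria and values: flat checkboxes, range-scored criteria judged by
# the TO, and a discretionary "wow factor" pool.

FLAT_CRITERIA = {"fully painted": 10, "bases finished": 5}
RANGED_CRITERIA = {"quality of blending": (0, 5),  # TO awards a value in range
                   "overall cohesion": (0, 5)}
WOW_FACTOR_POOL = 5                                # discretionary points

def score_hybrid(checked, judged, wow):
    flat = sum(p for c, p in FLAT_CRITERIA.items() if c in checked)
    ranged = 0
    for criterion, (low, high) in RANGED_CRITERIA.items():
        ranged += max(low, min(high, judged.get(criterion, 0)))  # clamp to range
    return flat + ranged + max(0, min(WOW_FACTOR_POOL, wow))

# A fully painted, based army judged 4/5 on blending and 3/5 on
# cohesion, plus 2 discretionary points, scores 24.
print(score_hybrid({"fully painted", "bases finished"},
                   {"quality of blending": 4, "overall cohesion": 3},
                   2))
```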

As a TO, it’s up to you to know your community. If you are building a community from the ground up, starting with mostly new players, then the transparency of this approach can incentivize them to begin working on their hobby skills with clear and easy to accomplish milestones. If you have an existing community who are mostly content to meet the basic appearance expectations, then this approach can incentivize them to explore and experiment with more advanced techniques by offering clear direction and rewards. But as your community grows beyond that point, the cons of this approach really begin to cause friction, and that friction only grows as your players’ skills continue to improve.

About Mike Adkins

I'm the admin for the site. You might run into me at events in the eastern US. I'm one of the Artistocrats, which means I get stomped by Alex Chaves and Mike Austin on the regular.


One Comment on “Picking a Winner: Check that Box”

  1. So I prefer checklist-style paint scoring, as I feel it’s more likely to reward effort than however it is non-rubric scoring works (that right there shows how opaque paint judging without a checklist feels on the other side). I’ve experienced sour grapes with checklists as well, but those are when checklists are tiered so that ‘higher level’ boxes can’t be checked unless ‘lower’ ones are. To wit, I make *very* themed armies, and I appreciate when checklists include theme or tone or model selection on the sheet, because IMO it’s a huge part of the KOW hobby for me, as we blessedly don’t have prescribed model lines to stick with. Unfortunately my experience has been that theme wasn’t unlocked until things like ‘freehand’ or ‘painted standards in multiple units’ were, which my armies rarely have (I’m an icon man, not a cloth banner man).

    Which is, I guess, to agree that checklists can stumble in their construction; however, I will always prefer the transparency they provide, as well as, frankly, the higher paint scores, from my experience of having the same army judged different ways.

    Aside: I used to be a teacher and currently write textbooks, where rubrics are absolutely a positive for the receiving party.
