Last year Philip reflected in a Q&A on how the Crown’s A&S competition went, and I thought I’d do the same this year. However, I want to focus more specifically on how the EK competition rubric performed.
Last year it functioned, but not as well as we wanted. We had made significant changes before last year’s competition, but those changes brought new problems of their own. Beyond ideas for modifying the rubric itself, we also learned last year that it would be very important to work with potential judges to introduce them to the rubric and help them calibrate with us on how best to use it during judging. We hoped to create a more consistent judging experience, which was somewhat lacking last year.
So, since last year’s competition, our rubric deputy Magnus worked with Philip, Elena, and me to make changes to the rubric. We updated language, tested the rubric, listened to feedback, made more updates, and tested again. We modified how scores were calculated to make judges feel more comfortable and to provide entrants with more detailed feedback. We did all this while holding rubric training sessions online and in person at events throughout the kingdom, including at Pennsic.
And, you know what, we think this process worked! We still have some changes to make to the general rubric before next year’s competition, but those changes are comparatively small. All in all, the general rubric performed very well this year, giving us a range of different scores for entrants, and hopefully giving entrants detailed feedback about what they did well and how they can take their projects further next time.
This year we also significantly updated our performance rubric, and used it to decent success with two competition entrants. Since this rubric is much newer and much less well tested, we will be making more significant changes to it based on feedback from judges and entrants in the coming months, and we thank our performing arts entrants and judges for their patience in working with a product that was not quite finished.
We have received a few bits of feedback from entrants so far this year, including comments about how we might better prepare entrants for their judging experience, especially those entrants who make it to the final round. We are listening, and we will do our best to make changes to improve the experience for next year.
I would also like to point out something that I found quite encouraging about our competition this year: the number of non-laurels we had judging or shadow judging the competition. 11 out of 24 judges were not laurels, and 5 of those individuals were able to participate fully in the judging process (i.e., they were not shadow judges) because they had attended one or more rubric training sessions. While experienced and knowledgeable content-expert judges will ALWAYS be important and very much needed at Crown’s A&S, the rubric and the standardization it brings allow us to open up judging to many more people, and that is exciting!
Our new rubric deputy, Elena, will continue to work with the rubric throughout the summer and fall, making changes, testing those changes, and organizing rubric training workshops, so that next year we have a better product and an even larger pool of judges and entrants who are comfortable with the rubric.
If you have questions about rubric training, or wish to offer additional feedback on the rubrics themselves, please e-mail Elena at firstname.lastname@example.org
Thank you for listening,