Entropy, or more precisely information entropy, is the average amount of information contained in each message received; it's a measure of uncertainty. What does this have to do with tabletop RPGs? Well, we're constantly sending messages during our games, aren't we? The GM says something, the players respond, the dice are rolled, and the story evolves.
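To make "average amount of information" concrete: for a set of possible outcomes with probabilities p, Shannon entropy is H = -sum(p * log2(p)), measured in bits. Here's a minimal sketch in Python (the function name and interface are my own, just for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution.

    probs: the probability of each distinct outcome; they should sum to 1.
    Outcomes with zero probability contribute nothing, so they're skipped
    (this also avoids log2(0)).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```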
Now, if something has zero entropy, it carries zero information. A die roll that always fails, or flipping a two-headed coin, are examples. You know it's going to fail, so why roll? You know it's going to land heads, so why flip it? Taking 20 is also a zero-entropy event: you know you'll succeed. Just skip the bull and tell me the story already.
Why do I find this important? We're used to handling odds of success or failure in our games: 17 or better on a d20 to do something, 10 or better on 2d6, and so on, you get the idea. We usually think of dice rolls as the random events in our games, but they're not the only ones, and entropy is quite different from odds of success or failure. The two are linked, but not the same: a small chance of success can still add a significant amount of entropy, and thus uncertainty, to a game. So where do we draw the line? At what point do we stop rolling dice and just rule the outcome?
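To put rough numbers on that, here is the sketch above applied to the examples mentioned so far (probabilities are mine, worked out from the stated rolls):

```python
# Two-headed coin: one certain outcome, zero entropy, zero information.
print(entropy([1.0]))          # 0.0 bits

# Fair coin flip: maximum uncertainty for two outcomes.
print(entropy([0.5, 0.5]))     # 1.0 bit

# 17 or better on a d20: 4 faces succeed, so p = 0.2.
print(entropy([0.2, 0.8]))     # ~0.722 bits

# 10 or better on 2d6: 6 of 36 combinations succeed, so p = 1/6.
print(entropy([1/6, 5/6]))     # ~0.650 bits
```

Note how a roll with only a 20% chance of success still carries almost three quarters of a coin flip's worth of uncertainty.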
Well, you see, the catch is that rulings, GM fiat, and fudged dice are also random events. They're human-controlled random events, but random nonetheless, because as a player I can't know a priori what the GM is going to rule. As a GM, I can't know what my players are going to do; I don't know whether a player will use a bennie or not.
Entropy is something I seldom see brought up when talking about resolution mechanics. In upcoming posts I'll examine some game mechanics and dice rolls and work out the entropy involved. Do these actions add to the story, or just distract us with needless activity?