Thus spake Brent Easton:
- “Rolling-release” like, that is implementing feature by feature, then
releasing that feature into main V4 when deemed stable enough
This would be my choice, with the initial release being quite modest
e.g. to the level required to implement a relatively simple hex game
like Battle for Moscow.
It’s funny you picked that—it’s the game I used for the XML example
I posted yesterday.
As well as the structural design that is going on now, I would like to
see some thought to developing a list of game ‘mechanisms’, for want of
a better word, that must be supported. It is important that, as we design
and build, these core ‘mechanisms’ remain supported. Things like piece
‘ownership’ and changing ownership (dealing etc.), information hiding
(masking, hidden).
We should start making a list.
The only mechanisms which I think will be difficult to support via
properties are ones involving hidden information, and the difficulty has
nothing to do with properties themselves, only with the nature of hidden
information.
I think the problem becomes clearer by dividing cases. If there’s hidden
information, then either
1. a player cannot access that information unilaterally, or
2. a player can access that information unilaterally,
   a. where doing so will trigger notification of another party, or
   b. without anyone else knowing he has.
Secure systems, where the player either doesn’t have the hidden
information at all or has it only encrypted with a strong cipher, are in
category 1.
A lot of browser-based games with hidden information are like this—
presumably the server does not hand out information to clients that they
are not permitted to see. Systems in category 2.a are things like ACTS,
if I understand it correctly: You can draw a card any time you want, but
your opponent will know that you have. Every virtual tabletop system I
know of is in category 2.b, though the amount of effort needed to access
the hidden information varies.
Which category makes sense for a game depends on both the game and the
players. Lots of games have no hidden information, so for those it
doesn’t matter. For other games, there’s a trade-off between trust and
convenience. Category 1 systems require player interaction if the
player triggering the revelation of some hidden information is not the
one who possesses the key to it. This can be acceptable for real-time
games, or for PBEM games where such revelation happens infrequently
(e.g., many games have secret victory conditions set at the start and
revealed at the end), but could become onerous for games where hidden
information is revealed often (e.g., think of a card game where one
player has the ability to steal a card from some other player every
turn, or a game which has sighting distances for units and real fog-of-
war).
Category 2.a systems don’t require intervention by other players to
reveal information, but also don’t have any protections against
information being revealed improperly. E.g., I can draw a card, and
everyone is notified that I drew a card, but there’s no way to undo
the revelation if I wasn’t supposed to draw a card. If mistakes are
not a problem (or not possible) then this kind of system is fine, and
could expedite PBEM play quite a bit.
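A category 2.a mechanism can be sketched in a few lines: any player may
draw unilaterally, but every draw broadcasts a notification to everyone.
This is only an illustration under assumed names (NotifyingDeck, the
notify callback); it is not any real VASSAL API.

```python
class NotifyingDeck:
    """Toy sketch of a category 2.a mechanism: access is unilateral,
    but the *act* of access is always published to all players."""

    def __init__(self, cards, notify):
        self._cards = list(cards)
        self._notify = notify  # callback that broadcasts to all players

    def draw(self, player):
        card = self._cards.pop()
        # Everyone learns THAT a card was drawn, not WHICH card.
        self._notify(f"{player} drew a card ({len(self._cards)} left)")
        return card

log = []
deck = NotifyingDeck(["ace", "king", "queen"], log.append)
card = deck.draw("Alice")  # card is private; the log entry is public
```

Note that only the fact of access is published, never its content, and
(as above) nothing in the design can retract the notification if the
draw was a mistake.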
Category 2.b systems rely on players not peeking at things they’re not
supposed to see. For me, this is fine when playing games with my
friends—it’s just like playing games face-to-face. I trust them not
to look at my cards when I leave the table to get a beer. But it’s not
so suitable if you’re running a PBEM tournament full of people who don’t
know each other. For that case, having a system in one of the other two
categories means that you don’t have to worry about playing against
cheaters, and you also don’t have to worry about being called a cheater
when you’ve guessed well or had an exceptional run of luck.
Finally, there’s the issue of data recovery with encrypted category 1
systems. People will lose their passwords. If they’ve chosen good
passwords (not “12345” or “password”) and we’re using strong
encryption, then even government-level efforts will not get their
data back from the ciphertext. This might be solvable simply by making
it easy for players to store their data unencrypted locally, and only
encrypt the data in files meant to be sent to others.
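That split (plaintext local saves, encryption applied only to files meant
for export) might look like the following toy sketch. The keystream here
is a homemade SHA-256 counter construction purely for illustration; a
real implementation would use an established authenticated cipher such
as AES-GCM from a vetted library.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream from SHA-256. NOT production crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_for_export(key: bytes, plaintext: bytes) -> bytes:
    # Local saves stay plaintext; only the exported blob is encrypted.
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_import(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))
```

Since the player's own copy is never encrypted, a lost password costs
only the ability to read blobs received from others, not the player's
own data.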
That doesn’t help in the case where you need to recover the data after
a player vanishes. This can be a big, big problem in long multi-player
games. E.g., what do you do if all of the force size data is encrypted
and one of your Empires in Arms players stops responding and needs to
be replaced? There is fortunately a way around this using public key
crypto, in that you can encrypt the data so that it can be decrypted by
some number of other players acting in concert, but not by fewer
players—the idea being that, say, if the remaining six players in your
EiA game agree it’s ok to release the remaining player’s secret
information, then it’s for a good reason.
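The k-of-n idea here is essentially Shamir secret sharing: each player
holds one point on a random polynomial whose constant term is the secret
(say, the key protecting the vanished player’s data), and any k points
reconstruct it while fewer reveal nothing. A toy sketch over a prime
field, for illustration only; a real system would use a vetted library.

```python
import random

_P = 2**127 - 1  # a Mersenne prime, large enough for small secrets

def split(secret: int, k: int, n: int):
    # Random degree-(k-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(_P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, _P) for i, c in enumerate(coeffs)) % _P
    # One share per player: the point (x, f(x)) for x = 1..n.
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % _P
                den = den * (xi - xj) % _P
        secret = (secret + yi * num * pow(den, -1, _P)) % _P
    return secret
```

So in the EiA example, the missing player’s key would be split 4-of-6
among the others at game start; any four who agree can recover it, but
no three can learn anything on their own.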
So, this issue is rather complex. I think there are at least three
scenarios here we should support:
A) encrypted hidden information stored in saved games
B) unencrypted hidden information stored in saved games
C) hidden information stored in the game server
–
J.