Game model

Is it too late to make game model adjustments?

Some things should be modeled as amounts and flows, not percentages. Levers and dials can be percentages, but production, population, pollution, and most ‘things’ should be amounts with flow control.

population × productivity = GDP

Starting values, UK:
66.65 million × 0.0425 (£m of GDP per person, i.e. ~£42,500) ≈ £2.83 trillion

Now all the levers in the game can work on population and productivity, up and down. Initialize the game with current country values. Limit the simulation to a certain number of years to keep things from going off the rails, but I think amount-based rather than percentage-based will make the simulation more stable rather than less.
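The amount-and-flow idea above could be sketched roughly like this; all function names and flow numbers here are illustrative, not from the actual game:

```python
# Sketch: population and productivity are absolute amounts, levers act
# on the flows, and GDP is derived each turn. Flow values are made up.

def step(population, productivity, pop_flow, prod_flow):
    """Advance one turn: flows change the amounts, GDP is derived."""
    population += pop_flow        # e.g. births - deaths + migration
    productivity += prod_flow     # e.g. effects of education, tech policy
    gdp = population * productivity
    return population, productivity, gdp

# Initialise with the UK figures from the post:
population = 66.65e6      # people
productivity = 42500.0    # £ GDP per person (the 0.0425 above, in £m)

for year in range(5):     # cap the run at a fixed number of years
    population, productivity, gdp = step(population, productivity,
                                         pop_flow=0.2e6, prod_flow=500.0)
print(f"GDP after 5 years: £{gdp/1e12:.2f}T")
```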

It’s tricky because the game is actually a neural network, where almost everything internally is modelled as a neuron with a value from 0 to 1 or -1 to 1. We then layer on an interpretation of those values to calculate everything, so we have upper limits to GDP, wages etc. and then do the scaling.

I don’t know if that’s the ideal way to model it or not, but it keeps the internal processing consistent, which is very helpful.

The capping often seems weird, tbh. Not sure how much can be done about this, but maybe running values through an s-curve of sorts to compress extreme numbers would be more sensible.

If you could do a logistic curve this would be fantastic. You can keep the 0 to 1 scale and just increment along a logistic; it’s a natural fit for neural networks, which often use logistic functions. Right now the values do seem to respond linearly, which leads to quick ceilings. A logistic curve would fix this problem. Depending on how your values are represented there are different ways of implementing it. Either this way:

Or the way it’s used in biological modeling, if you want to define your center (the EC50).

Or define your carrying capacity.

I do ecological and biological modeling and use those second two formulations.

A second thing you could try (and I don’t know if this would work): you should be able to set a different scale factor at each node. Move GDP in 0.01 increments and something else, like tax rates, in increments of 1.

I hope this is not too late, because often these decisions have to be made at the start of model creation.


This is awesome, and very much appreciated.
It’s definitely not too late, because each neural connection in the game is basically defined by an equation and some weights at each end, so the simplest looks like
and we also support an extra variable, like
which is evaluated as
but the ^ operator is just one that’s defined in code. We could replace it with any symbol that we map to any calculation; it’s just some programming :D.

We also support any value being any name (this is new for Democracy 4), so you can do this:

0.25*(1.0-Pollution)^EnergyEfficiency if you like!

The thing is, adding the extra variable, the ^, and the named parameters happened fairly late during dev, so we have a bunch of really simple, really dumb equations in the game that will get refined and improved as we go through alpha.
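Purely as an illustration (this is a sketch, not the game’s actual parser), here’s one way a named-parameter equation with a custom ^ operator could be evaluated:

```python
# Sketch: evaluate an equation string like
# "0.25*(1.0-Pollution)^EnergyEfficiency" against named node values,
# mapping the custom ^ operator onto Python's ** operator.

def evaluate(equation, values):
    """Evaluate an equation string against a dict of named node values."""
    expr = equation.replace("^", "**")
    # Restrict eval to the supplied names only (no builtins).
    return eval(expr, {"__builtins__": {}}, dict(values))

nodes = {"Pollution": 0.4, "EnergyEfficiency": 2.0}
print(evaluate("0.25*(1.0-Pollution)^EnergyEfficiency", nodes))  # 0.25*0.6^2 = 0.09
```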


Any s-curve transform (such as the logistic function or hyperbolic tangent, though there are more) will automatically squash every value down to at most one fixed bound and at least another. You could use completely unbounded variables at every step and then, as a last step, slap a logistic over it, and it’ll still end up in the required range.

Although most naturally, stuff like GDP probably shouldn’t be strictly limited at all, though you might wanna show it on a logarithmic plot, since a couple dollars more or less of GDP don’t actually matter that much if you already have billions or trillions, but halving your GDP certainly would.

Another related thing that might be interesting to use in some situations is Softmax. Which you could use if mutually exclusive things happen and you gotta weigh them against voter group affiliation or something.
Softmax will take any list of values and turn them into a new list of values that sum to 1, so you can use them as percentage weights for future steps. Not sure if any of your logic thus far could use this, but it might help.
Really, Softmax is like a generalization of the logistic curve to arbitrarily many inputs.
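A minimal softmax along the lines described; the input scores below (standing in for, say, voter-group affinities) are made up:

```python
import math

def softmax(scores):
    """Turn any list of scores into weights that sum to 1.
    (Subtracting the max is a standard numerical-stability trick
    and doesn't change the result.)"""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

weights = softmax([2.0, 1.0, 0.5])
print(weights)  # usable as percentage weights for mutually exclusive outcomes
```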


I was thinking about this a bit more, and it takes some thought as to which value you want to model with which curve. First, taxes and any of the policy sliders make sense going linearly from 0 to 1 in 0.1 steps. Outcomes mostly seem to fit s-curves: crime, population, pollution, even GDP. For infrastructure capacity such as hospitals or congestion, it would depend on whether the demand response is implicit or explicit. In this case I think demand shift is modeled externally, so things like traffic congestion could be modeled linearly from 0 to 100. If you wanted to make it implicit, the s-curve would imply demand and supply shift, so that congestion and hospital use would never hit 100. I think the shorthand is just to think about which values you never want to hit 0 or 100%.

I agree with kram1032 that GDP is not strictly limited, but things like population and GDP can be modeled as a series of s-curves, where your population or GDP does reach a cap, but then, due to a technological change, that cap is lifted to a higher limit. For example, maybe the population limit in middle-ages England was 5m, then 10m, then 55m; as the technology progresses, population and GDP caps get raised. I don’t think this is necessary, but if you wanted to, you could model this by having a succession of GDP nodes that you switch out. For example, you can start with the logistic capped 0-1 and move in 0.1 increments, simulating a space from 0-100. Then, after a technological transition, you can set your curve to increments of 0.05 (rebasing your position x counts into the new curve), and you have effectively moved your cap from 100 to 500, and so on.
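The succession-of-s-curves idea could be sketched like this, reusing the 5m/10m/55m caps from the post; the growth rate and transition midpoints are invented numbers:

```python
import math

def logistic(x, k, x0):
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

def population(t):
    # Each tech era adds another logistic step on top of the last cap:
    # (extra capacity unlocked, midpoint of that transition in turns).
    # Steps: 5m base, +5m to reach 10m, +45m to reach 55m.
    eras = [(5e6, 0.0), (5e6, 20.0), (45e6, 40.0)]
    return sum(cap * logistic(t, k=0.5, x0=mid) for cap, mid in eras)

print(population(10.0), population(60.0))  # climbs toward the 55m ceiling
```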

Anything that’s effectively a percentage should be given between 0 and 1, but ideally not capped.
One way to do that is to go to odds instead, which run from 0 to infinity, or logarithmic odds, which run from -infinity to infinity; you can then just add or multiply odds as seems logical and, once you’re done with that, go back to percentages. This would be quite the same as filtering stuff through an s-curve, at least in effect.
If you prefer to work directly on percentages, your operations should only be:
p*q (“and”)
1-p (“not”/negation)
p^a for a>0 (weakens a percentage; if used on a negation, strengthens it instead)
min(p,q) “less likely”
max(p,q) “more likely”
“Or” is actually hard to model with a single percentage. You’d really need distributions for that, I think, so you can model how much overlap there is. If two things are mutually exclusive, “or” is just “+” (for independent things it would be p + q − p·q). Either way, this only makes sense if the summed percentages stay at most 1 under all circumstances.
Same for weighted sums.
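The odds round-trip described above might look like this as a sketch; the +1.0 “evidence” nudge is just an arbitrary example:

```python
import math

def to_logit(p):
    """(0, 1) -> (-inf, inf): probability to log-odds."""
    return math.log(p / (1.0 - p))

def from_logit(x):
    """The inverse: the logistic function, back to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Do unbounded arithmetic in log-odds space, then convert back:
# the result is guaranteed to land strictly between 0 and 1 again.
p = 0.7
boosted = from_logit(to_logit(p) + 1.0)
print(boosted)  # larger than 0.7, still below 1
```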

Anything that may well span many orders of magnitude, such as debt or GDP, probably ought to be displayed logarithmically, not necessarily with a hard upper limit. Though if you prefer to keep the current look, making it an s-curve as a final transform for visualization is certainly an option.
It would be kinda cool, though, if the various curves could be made to roughly match the development of actual countries given similar policies. S-curves alone just aren’t gonna do that.

Internally, for calculations, you probably don’t ever want to cap anything unless it explicitly makes sense (you can’t have less than 0 or more than #population people, say).

The tech limit stuff depends on food production and medicinal advances mostly, though population today, in developed countries, is kept in check not via such tech limitations, but really via family planning and contraceptives. Food production today is above what’s necessary to feed everybody well and healthily. It’s more a matter of that food reaching the right people than one of producing that food at all. There are big inefficiencies here but the trend is improving. (AFAIK actual starvation has become kinda rare these days, though in lots of poorer countries, stunting is a huge problem. Wherein children don’t outright die, but they get too little food to properly develop, so they will look like six year olds by the time they are like twelve or something)

Really, that’s one of the biggest reasons why adding population mechanics could be really interesting. If you’re going to model an African country, families just look very different there, especially rural ones. Things slowly change and life expectancy goes up and, with some delay, average child count goes down, slowly changing the makeup of the entire population.
And in fact, even developed countries are still in the process of reaching this as-yet final stage, which has implications for how many retired people we can expect to have in the near and far-ish future, along with the associated costs of pensions and social care.
Depending on where you live, the local age-demographics could look wildly different, and with it come wildly different problems.
Some of that is already somewhat modelled, but I think there could be lots of improvements in that regard.

I agree, of course, on all the parameters you mentioned being important, but I was thinking of trying to simplify the model into fewer parameters: just a tech parameter directly affecting GDP and population levels. Less realistic, but simpler. It would make sense that the tech parameter would drive population below the actual cap.

I suppose what I’m saying is the opposite of simplifying, heh

I see so many problems, big and small, in the world today, many of which would be very hard to model in the game as it stands now, that imo it would help to actually model some things more finely.
That doesn’t, of course, mean that everything should be made finer. Some other things could perhaps be simplified. It would take quite some fiddling and pondering to see what would be better, though.
The other reason why more, not less, complication might be decent is that it could serve to de-trivialize many decisions. Right now a lot of the policies are, for any given playstyle, effectively no-brainers. And not just that you want them, but also that you want to, like, minimize or maximize them. It’s effectively a binary choice: Do I want this? Yes. All the way yes.
Very rarely are there policies where you truly are forced to take a measured approach.
More complex systems could perhaps support that kind of thing more easily.
But I guess ultimately it’s about preference…

Right now a lot of the policies are, for any given playstyle, effectively no-brainers. And not just that you want them, but also that you want to, like, minimize or maximize them. It’s effectively a binary choice: Do I want this? Yes. All the way yes.
Very rarely are there policies where you truly are forced to take a measured approach.

I tend to agree with this right now; some elements definitely feel binary, or like a preferred path, because of the current setup or effects.

As long as introducing complexity doesn’t take too much away from understanding what sort of impact your decisions make (e.g. it’d be nice to not have to read a Wikipedia page on monetary policy before making a decision! ha), then complex systems, enhancements, or more nuance in general is definitely welcome.

Have you played Fate of the World? Very complete climate and economic model with material flows. It allows you to see graphs of all the model parameters. I enjoyed digging into the model as the game played out.


I have not but that sounds very interesting

I’m keeping in mind that cliff mentioned earlier that the game interface was too cluttered and complex for most players. I think he can actually simulate more values but show fewer values and “knobs” on the interface. It can be made arbitrarily complex but kept hidden unless you drill down, via right-clicking through to the variable graphs. See the game “Fate of the World” for an example.


Honestly, one of the strong suits of this game is that it makes influences extremely clear through its stellar interface. There are some minor rough edges (effects that have different long- and short-term influences could be clarified, and imo the capping stuff is a little weird), but for the most part I think that helps a lot in making this actually rather complex game very, very approachable.

IMO that’s only a first impression. Play for five minutes and that’s no longer true.

I like the game interface, but I think what is displayed can be simplified. It has to be if you want to increase the number of simulated values. Right now I think there are too many values sitting at the top level, and more of the complexity should be hidden (but able to be exposed if the user wants to go down a level). I am thinking about how to do this, but I realize this is also a matter of opinion.

That is entirely fair and I can see that.

I think it’s cool that you can dig into the nitty gritty, but less of it being exposed to the player without digging might be a good idea.

Maybe there could even be a setting modifying how many, or even which, of the blue effects you want to see, with the default being fewer than currently. (Though which ones?)

I want to go back to this point. I think some values should have their primary determinant be what the value was last turn. I’m mostly thinking of things like development or infrastructure, which are harder to develop than to maintain (although I suppose you could model infrastructure spending by having a simulation value for “amount of infrastructure”, and a policy for infrastructure spending whose cost is lowered by how much infrastructure already exists). But other than hacks like that, the Democracy neural-net model means you can’t “build” anything and have it stick around without it being actively maintained to the degree that created it in the first place.
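One way to sketch that kind of persistence is a “stock” node whose next value is mostly its own previous value; the inertia coefficient and input levels here are arbitrary examples:

```python
# Sketch: infrastructure mostly persists from turn to turn, and this
# turn's inputs only nudge it. The inertia coefficient is made up.

def update_stock(previous, inputs, inertia=0.95):
    """Blend last turn's value with this turn's (much weaker) inputs."""
    return inertia * previous + (1.0 - inertia) * inputs

infrastructure = 0.2
for turn in range(50):
    # Heavy spending (inputs=0.8) builds the stock up only gradually...
    infrastructure = update_stock(infrastructure, inputs=0.8)
# ...but once built, it also decays only gradually if spending stops.
print(round(infrastructure, 3))  # has crept toward 0.8 without reaching it
```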

This is very exciting. I almost want to volunteer to assist in rewriting some of those old equations (there are an awful lot of them), but god knows another person working on something like this would make the job about 4 times harder for everyone lol