There's been a lively discussion around the economy and how to possibly fix it. So here are my 2c on the matter:
I know that there's a tendency for the IRE games not to snipe each other's mechanics, but we should at least take a peek at how some of them tackled the economy problem and perhaps use that knowledge as we look for solutions that fit Lusternia. In particular, we can focus on two game economies: Achaea's and Starmourn's.
In Achaea, the commodity production of the game is driven almost entirely by players through mining (for ores and stone) and foraging (for basically everything else). Miners and foragers go out to look for 'resource nodes' and spend time and gold to harvest those nodes. It is, therefore, a gold drain (possibly the biggest drain in the game). Miners and foragers will then sell the resources they gathered to other players or (more importantly) cities. This is because cities have improvements (constructs, in the Lusternian version) that require yearly upkeep. Basically, the system is this:
- Players bash up gold (gold faucet)
- Cities/houses get a cut of the credit purchases of citizens
- Cities/houses sell those credits to earn gold
- Cities/houses/players buy commodities to upkeep their constructs (for players, this usually translates to ships or private housing).
- Miners and foragers sell commodities
- Miners and foragers get commodities by spending time and gold (gold drain).
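The loop above can be sketched as a toy simulation. To be clear, all the numbers and names here are invented for illustration; none of them reflect actual Achaean values:

```python
# Toy model of the Achaea-style faucet/drain loop described above.
# All quantities are made-up placeholders, not real game values.

def simulate_tick(world):
    """Run one abstract 'tick' of the faucet/drain loop."""
    # 1. Players bash up gold (faucet: new gold enters the economy).
    world["player_gold"] += 1000

    # 2. Miners/foragers spend gold harvesting nodes (drain: gold
    #    leaves the economy entirely) and gain commodities.
    world["player_gold"] -= 200
    world["player_comms"] += 50

    # 3. The city buys commodities for construct upkeep: gold merely
    #    circulates (city -> player), but the commodities themselves
    #    are destroyed by the upkeep -- that's the commodity sink.
    price_per_comm = 10
    upkeep_need = 30
    world["player_gold"] += upkeep_need * price_per_comm
    world["city_gold"] -= upkeep_need * price_per_comm
    world["player_comms"] -= upkeep_need
    return world

world = {"player_gold": 0, "city_gold": 5000, "player_comms": 0}
world = simulate_tick(world)
print(world)  # {'player_gold': 1100, 'city_gold': 4700, 'player_comms': 20}
```

The point of the sketch is just the shape of the flows: gold enters via bashing, exits via harvesting costs, and commodities exit via upkeep, so both currencies have a faucet and a sink.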
The quantity of resource nodes that spawn is limited by the admin. There have been instances where the admins modified the spawn rates based on the game's needs, but this is manual and thus requires the admins to actually look at the economy and decide whether a modification is needed. The system works because the need for commodities is constant enough (specifically, through the IG-year upkeep of city constructs) and the supply is limited enough (via the spawn rates) that miners/foragers can almost always sell for a profit (iron, for example, barely breaks even, but silver is in very high demand almost constantly).
Furthermore, the storage of excess commodities past a certain amount has a constant gold cost for the miners and foragers. If the cost is not paid, then the commodities are lost forever. Thus, hoarding is discouraged.
There is a problem wherein more experienced miners/foragers can box out newer blood for resource nodes, though.
For Starmourn, the system is more or less the same, except that the bigger consumers of commodities are the players (to outfit their ships and keep a supply of things like missiles, batteries for lasers, and the like). Organizations also have 'city constructs', though, that require constant upkeep in terms of commodities. One big difference is that the spawn rate of resource nodes is largely affected by how much of that commodity was used up within a time period in the past. For example, if 10k units of helium-11 and 2k units of vandium were consumed in the past RL week, then the next RL week will have an increased spawn rate of helium-11 and a decreased rate for vandium. (These are just examples; the exact length of the time periods hasn't been revealed, I think.)
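Since Starmourn's actual weighting and period length aren't public, here's one guess at the mechanic: the next period's node spawns are distributed in proportion to each commodity's share of last period's consumption.

```python
# Hedged sketch of consumption-driven spawn rates. The proportional
# rule and the base_nodes figure are my assumptions, not Starmourn's
# actual (unpublished) formula.

def next_spawn_weights(consumed, base_nodes=100):
    """Distribute next period's node spawns in proportion to
    each commodity's share of last period's consumption."""
    total = sum(consumed.values())
    return {c: round(base_nodes * amt / total) for c, amt in consumed.items()}

# Last RL week's consumption, using the example numbers from above:
consumed = {"helium-11": 10_000, "vandium": 2_000}
print(next_spawn_weights(consumed))
# {'helium-11': 83, 'vandium': 17}
```

The appeal of a rule like this is that it's self-tuning: heavily consumed commodities spawn more without an admin having to watch the market and adjust rates by hand, which is exactly the manual step Achaea's system requires.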
Like Achaea, there are fees for storage to discourage hoarding.
The game was still fairly new when I was playing it, and the economy was still finding its footing, so I can't really comment on how stable the design was. It looked sound, though!
So, in the case of Lusternia, we should take a good look at where commodities go. From my perspective, it seems like a lot of them are used up by players (to craft and, more importantly, to refine items for aethertrading). Cities and communes also use up a bit to raise constructs, while guilds consume them for the upkeep of research. The first question that needs to be answered is: how many commodities are leaving the system, and what kinds are they (milk, sugar, cloth, steel, etc.)?
Then, we take a look at the production of commodities. Almost all of it is tied to villages. Villages send a portion of the commodities in their commodity shops to whoever they are loyal to at that moment. Players can perform quests to increase the commodities stored in the village as well as to get some directly into their own hands. The second question, then, is: how many commodities can be generated via quests and tithes, and are they in line with the commodity drain from the first question? This is difficult to pinpoint because of the passive nature of tithes: if no one decides to buy out the leather from Shanthmark, for example, then the village will passively tithe a larger amount of leather every RL day. There is also another question to be asked:
should there be alternative methods of generating commodities other than villages? I personally think there should be; village ownership is dictated largely by how 'strong' your organization is, and if you have no villages, you almost certainly have no commodity income. Organizations on the 'downswing' should have an alternative method of getting commodities to feed their research and constructs.
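The passive-tithe behavior described earlier (unsold stock piles up, so the daily tithe grows) can be made concrete with a small sketch. The fixed-fraction rule and the 5% rate are my own placeholder assumptions; Lusternia's actual tithe formula isn't public as far as I know:

```python
# Sketch of a passive village tithe: each RL day the village sends a
# fixed fraction of its shop stock to the org it is loyal to. The 5%
# rate is an invented placeholder, not Lusternia's real formula.

def daily_tithe(shop_stock, tithe_rate=0.05):
    """Tithe a fixed fraction of the current shop stock."""
    return int(shop_stock * tithe_rate)

# If the shop holds 200 leather, the day's tithe is modest.
print(daily_tithe(200))  # 10
# If nobody buys the leather out and stock piles up to 900,
# the same rate yields a much larger daily tithe.
print(daily_tithe(900))  # 45
```

This also shows the tension mentioned below: a player buying out the shop injects gold into the village (a drain) but shrinks the stock, and therefore the tithe.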
Finally...well, this isn't a question. But, just like Achaea and Starmourn, we should tie the gold system more closely to the commodity system. They're largely separate at the moment: you spend time either generating commodities or generating gold. You *could* spend gold to buy commodities from village shops (this is a gold drain, since the spent gold is removed from the system), but as was said before, this negatively affects the commodity generation of tithes.
I'll try to come up with more fleshed-out suggestions, but for the moment, these are my thoughts on the matter. Discuss!