Tuesday, December 13, 2016

Spell points by formula: 5E variant rule

[snip]

I have been using the DMG spell point system since the DMG first came out. I understand your issues with not liking tables, and frankly I don't see a big reason why you couldn't just interpolate a simple formula and use that instead--it's already a fairly linear progression. One formula that comes fairly close is: spell points = 4 * LEVEL ^ (1 + LEVEL/100), rounded to the nearest integer. At level 1 this gives you 4 spell points; at level 5 you have 22; at level 11 you have 57; and by level 20 you have 146. Those numbers are all reasonably close to the DMG values at those levels (4, 27, 73, 133), although somewhat underpowered at low levels--the formula doesn't quite catch up to the DMG numbers until level 16, except for an anomaly at level 2 (where it actually runs ahead of the DMG) and a smaller one at level 4 (where it matches exactly). But it does remain within two character levels of the DMG progression at all times.

Spell points by level (DMG vs. formula):

Level    DMG    Formula
  1        4        4
  2        6        8
  3       14       12
  4       17       17
  5       27       22
  6       32       27
  7       38       32
  8       44       38
  9       57       44
 10       64       50
 11       73       58
 12       73       65
 13       83       73
 14       83       81
 15       94       91
 16       94      100
 17      107      110
 18      114      121
 19      123      133
 20      133      146
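
If you'd rather let a computer do the arithmetic, here is a minimal Python sketch of the formula above. The function name and the DMG comparison list are just mine for illustration (the DMG values are copied from the table above), and depending on how you round, an entry or two may land a point off the table.

def spell_points(level):
    # Proposed formula: 4 * level^(1 + level/100), rounded to the nearest integer.
    return round(4 * level ** (1 + level / 100))

# DMG spell point progression for levels 1-20, for comparison.
dmg = [4, 6, 14, 17, 27, 32, 38, 44, 57, 64,
       73, 73, 83, 83, 94, 94, 107, 114, 123, 133]

for level in range(1, 21):
    print(level, dmg[level - 1], spell_points(level))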

Furthermore, I doubt the missing spell points would be all that sorely missed, since spell point systems give greater flexibility and there is less pressure to conserve some of every type of slot. A regular PHB 9th-level wizard with only a 3rd-level slot and two 1st-level slots remaining would be quite nervous; but a spell point wizard with 9 spell points left is likely to be relatively cool and collected, because he can still cast any of his memorized spells (even a 5th-level spell costs only 7 points) and still have enough left to power a 1st-level spell like Shield or Expeditious Retreat for emergencies. If you gave me a choice between running a spell point wizard under this formula or a PHB spell slot wizard, I'd take spell points every time.

My opinions on the 6th+ level slot issue are mostly theorycraft, because I've only played characters at those levels in one-shots. IMO the biggest impact of that rule is that it makes multiclassing more attractive: since you can't get multiple 6th+ slots per day anyway, and you already have plenty of spell points, you might as well consider investing two levels in Rogue or Fighter or Warlock or something somewhere along the line instead of sticking with pure spellcaster classes.

Aesthetically I don't like the 6th+ limitation because it keeps this from being a true spell point system; it's really a hybrid spell slot/spell point system, because you still have to track slots 6, 7, 8, and 9 separately. But I don't have an elegant solution either, because 5E does clearly intend to keep a lid on 6th+ level spells in a way that it doesn't for spell levels 1-5. (E.g. Arcane Recovery doesn't work with them, Sorcerers can't create them from sorcery points, etc.) If you held a knife to my throat and made me come up with a solution right now, I would simply increase the cost exponentially after 5th level and drop the 1/day restriction: spells above 5th level cost 14 * 1.4^(LEVEL - 5) spell points, rounded to the nearest integer.

Level 6: 20 spell points
Level 7: 27 spell points
Level 8: 38 spell points
Level 9: 54 spell points
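
For completeness, here's the same idea as a Python sketch of that house-rule cost (again, this is just my proposed rule above, not anything from the DMG):

def high_level_cost(spell_level):
    # House-rule cost for 6th+ level spells: 14 * 1.4^(spell_level - 5),
    # rounded to the nearest integer.
    return round(14 * 1.4 ** (spell_level - 5))

for spell_level in range(6, 10):
    print(spell_level, high_level_cost(spell_level))  # 20, 27, 38, 54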

Only the mightiest wizards could ever dream of casting multiple high-level spells in a day, and doing so would drain them utterly. That seems like it would maintain the flavor of 6th+ level spells in 5E: they're rare and significant.



Saturday, December 3, 2016

Optimization and Performance

Some good thoughts here: http://joeduffyblog.com/2010/09/06/the-premature-optimization-is-evil-myth/

I am personally used to writing code where 100 CPU cycles matters. So invoking a function that acquires a lock by way of a shared-memory interlocked instruction that may take 100 cycles is something I am apt to think hard about; even more worrisome is if that acquisition could block waiting for 100,000 cycles. Indeed this situation could become disastrous under load. As you can tell, I write a lot of systems code. If you're working on a network-intensive application, on the other hand, most of the code you write is going to be impervious to 100 cycle blips, and more sensitive to efficient network utilization, scalability, and end-to-end performance. And if you're writing a little one-time script, or some testing or debugging program, you may get away with ignoring performance altogether, even multi-million cycle network round-trips.

To be successful at this, you'll need to know what things cost. If you don't know what things cost, you're just flailing in the dark, hoping to get lucky. This includes rule-of-thumb orders of magnitude for primitive operations – e.g. reading/writing a register (nanoseconds, single-digit cycles), a cache hit (nanoseconds, tens of cycles), a cache miss to main memory (nanoseconds, hundreds of cycles), a disk access including page faults (micro- or milliseconds, millions of cycles), and a network round-trip (milliseconds or seconds, many millions of cycles) – in addition to peering beneath the opaque abstractions provided by other programmers to understand their best, average, and worst case performance.

Clearly the concerns and situations you must work to avoid change quite substantially depending on the class of code you are writing, and whether the main function of your program is delivering a user experience (where usability reigns supreme), delivering server-side throughput, etc. Thinking this through is crucial, because it helps avoid true "premature optimization" traps where a programmer ends up writing complicated and convoluted code to save 10 cycles, when he or she really needs to be thinking about architecting the interaction with the network more thoughtfully to asynchronously overlap round-trips. Understanding how performance impacts the main function of your program drives all else.
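
To make that last point concrete, here is a rough Python sketch of overlapping round-trips (my own illustration, not something from the linked post). The fetch() coroutine is a hypothetical stand-in for a real network call, with asyncio.sleep simulating latency; the overlapped version finishes in roughly one round-trip of wall-clock time instead of N.

import asyncio

async def fetch(resource, latency=0.05):
    # Hypothetical stand-in for a network round-trip; the sleep simulates
    # tens of milliseconds of latency (many millions of cycles).
    await asyncio.sleep(latency)
    return f"response for {resource}"

async def sequential(resources):
    # One round-trip after another: total time is roughly N * latency.
    return [await fetch(r) for r in resources]

async def overlapped(resources):
    # Issue every request up front and await them together:
    # total time is roughly one latency, regardless of N.
    return await asyncio.gather(*(fetch(r) for r in resources))

if __name__ == "__main__":
    items = ["a", "b", "c", "d"]
    print(asyncio.run(sequential(items)))
    print(asyncio.run(overlapped(items)))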

