i've started moving cohost posts of mine, so far mostly gamedev ones, to an internet forum you are welcome to post yours on

having decided (or rather realised) that i can't be arsed, right now, figuring out what approach i would like to take, if any, to an actual blog

go post imo

you couldn't be more welcome. the more you post the less self conscious other people are about posting. the modern internet has made it feel weird to make a forum thread for something that doesn't feel significant enough. or to try to get people to join your forum actually! it's not weird! you're not weird i'm not weird! post about whatever it's fine

https://impromptu.zone/

Dev Scoops: Simulating Eyes in Half-Life 2

The Gman's eyeballs in 2003 · My eyes at some point around 2017

Years ago, I was making a stealth game in Unreal that I've since had to cancel amid circumstances beyond my control. At one point I was trying to make my characters' eyes nice, and the gold standard for that was (and arguably still is) Half-Life 2.

HL2 takes a novel approach: the eyes are not rotating sphere meshes with bones, they're more-or-less flat planes with a shader on 'em that makes 'em look like balls and points the iris/pupil where you tell it. The eye "plane" can be stretched as the eyelids open or close without affecting the visual, and you don't get any mesh intersection issues (which is why you've never seen the Gman's eyeball push through his eyelid even though Gmod exists) or the uncanny appearance of rotating "with the head".
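Here's roughly how I picture the "point the iris where you tell it" part working, as a toy Python sketch - this isn't Valve's shader, and the plane axes and the travel scale here are made up for illustration:

```python
def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def iris_uv(gaze_dir, right=(1, 0, 0), up=(0, 1, 0), travel=0.25):
    """Project the desired gaze direction onto the eye plane's axes to get
    a UV offset from the plane's center at (0.5, 0.5). `travel` caps how
    far the iris can slide before it'd leave the visible eyeball."""
    g = normalize(gaze_dir)
    return (0.5 + dot(g, right) * travel, 0.5 + dot(g, up) * travel)
```

Looking straight down the plane's forward axis lands the iris dead center at (0.5, 0.5); glance to the side and the u coordinate slides over, no bones or rotating geometry involved.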

To get this right, I asked Valve's Ken Birdwell how they got such a good sense of eye contact out of this shader back in 2003. Here's the scoop:


In HL2 and other games, there are three main textures.

There’s the sclera texture which is textured and shaded independently. It has some hand painted darkening around the edges to simulate self shadowing from the eyelids, as well as a shader to do simulated subsurface scattering of light and a clamping cone to reject light sources that would pass through the skull. There’s better ways to do this now, but in olden days there weren’t.

The second texture is the iris texture, which is literally just a planar projection onto a sphere. The underlying visible “eyeball” vertices will need to be spherical, and you’ll need to know the original size of the eyeball, and the size of the iris texture, or more accurately the area of the iris texture that actually contains the pixels for the iris. Once you do that, you’re halfway done.

The next tricky bit is to simulate the cornea bulge with a third texture. We originally did this by creating a simple run-time texture that mapped each light source into a point on a simulated “sclera ball” and cornea (partial sphere about half the radius of the eye, offset a bit) using a super simple ray tracer. This is about 90% of what makes “eyes” work. You don’t cue off the reflections on the sphere of the eye or even the iris, you mostly cue off the reflections off the cornea bulge to judge view direction, and their subtle differences between the ball and cornea, and differences between both eyes.

Do all that, and make sure after placement the iris/cornea is about 4-5% wall-eyed (the fovea is offset from centerline of the cornea axis), and suddenly you’ll make “eye contact”.

All the numbers for this can be found in any basic eye anatomy book, and don’t worry about eye twist (your eyes slightly spin when you look around due to how the muscles are attached, but it’s not human perceivable)
I think an example for this code might still exist in the SDK, maybe in hlmv? I know eventually it all got replaced with a fancy shader that does it all on one pass, but the HL2 era version didn’t and the code might all still be there.

If you really want to get fancy, then you’ll want to do a geometry shader and let the cornea bulge deform the eyelids. That, and play with pupil size and they’ll be alive! It all just depends on how close you want to get to the character, and how much CPU/GPU you’re willing to spend.
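For what it's worth, here's a toy take on that "super simple ray tracer" idea - assuming a distant viewer and a distant light, the specular highlight sits where the sphere's normal is the half-vector between the view and light directions, so you can place it directly. This is my simplification, definitely not Valve's actual code, and the eye/cornea numbers below are made up (cornea ~half the eye's radius, offset a bit, like Ken says):

```python
def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def highlight_point(center, radius, view_dir, light_dir):
    """With viewer and light treated as distant, the highlight lands where
    the sphere normal equals the half-vector of the two directions."""
    h = normalize(tuple(a + b for a, b in
                        zip(normalize(view_dir), normalize(light_dir))))
    return tuple(c + radius * n for c, n in zip(center, h))

# The depth cue: the same light lands on different spots on the big
# "sclera ball" vs the smaller, offset cornea sphere.
EYE_CENTER, EYE_RADIUS = (0.0, 0.0, 0.0), 1.0
CORNEA_CENTER, CORNEA_RADIUS = (0.0, 0.0, 0.6), 0.5
```

Run the same light through both spheres and you get two slightly different highlight positions - that mismatch, and the mismatch between the two eyes, is what Ken says you actually cue off to judge view direction.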

On the Unreal end, I made an EyeComponent that any actor with this style of eye uses to manage their gaze and convergence, and it all worked out pretty well, although I never shipped anything using this method. I did also end up simulating pupil dilation and cornea bump!

Click that dang ol' Dev Scoops tag for more Dev Scoops, if you want!

dev scoops: Max Payne Kung Fu Edition

The most impressive video game mod of all time is, without question, Max Payne: Kung Fu Edition, which, despite being by far the most accomplished Payne mod code-wise and having a ton of custom animations high-quality enough to fit in with the rest of Max Payne, was developed by one guy, Kenneth Yeung. Here's some gameplay I recorded just now:

https://youtu.be/YGkFDvx6FQM

There were a lot of Max Payne mods, but this was the only one that really reached farther than custom weapons, levels and characters - not many had custom animations or code. This had far more developed melee combat, combos, a leveling-up system, wall jumps and wall runs, and different "shootdodge" moves when you used it standing vs running. Nobody else had figured out how to do any of that stuff! And besides being an incredible achievement, it also just plays really well, even today - it and Oni will be my main touchstones if I ever get to make the melee-heavy shooter of my dreams.

A while back I tweeted at Kenneth about a vague memory I had of something he'd done to get beyond the limits of Max Payne's modding tools:


me:

Hey man, I have a vague memory of a Kung Fu mod dev anecdote of yours where...something like, you weren't able to do a trace for your wall run in the MP1 SDK, so you used a bullet? Wondering if that rings any bells for ya

Kenneth Yeung:

Wow, you have an incredible memory of 20 years ago! And yes you're spot on, let me take a trip down Max Payne modding memory lane...

The game didn't have any sort of special event triggers for player collisions with arbitrary walls, but I noticed that bullets COULD differentiate between different collision types (ie they could trigger a different event when hitting a character vs hitting level geometry). So in the player's sideways jump animation, I shot tons of invisible and extremely short-lived bullets sideways out of your hip, which would spawn a special type of explosion only if they hit a wall.

That explosion only affected the player, and if the player ever took damage from that explosion type it would immediately kick you into a wall run animation. There were multiple versions of these special bullets+explosions, one for each wall run and wall jump direction.

A similar thing was used for the player's attack combo animations. There were invisible bullets flying out of your fists and feet, looking for collisions with enemy bodies in order to trigger the next animation in the attack combo.

I eventually ran out of new animations you were allowed to add to a character, which is when I stopped adding more moves. But that's probably a good thing, otherwise I might have been working on the kung fu mod for 10 more years.
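In plain Python, the pattern Kenneth describes looks something like this - the world representation, grid snapping, and names here are all made up, it's just to show the "abuse bullets as collision probes" idea:

```python
# The engine only tells you what a *bullet* hit, so you fire a short,
# invisible probe sideways during the jump and map "hit a wall" to
# "start the wall run". (Toy model: the world is a dict of snapped
# 2D coords -> surface type.)

WALL, ENEMY, NOTHING = "wall", "enemy", "nothing"

def probe(world, origin, direction, max_range=0.5):
    """March a short 'bullet' ray and report the first thing it hits."""
    for step in range(1, 11):
        t = max_range * step / 10
        point = (origin[0] + direction[0] * t, origin[1] + direction[1] * t)
        hit = world.get((round(point[0], 1), round(point[1], 1)))
        if hit:
            return hit
    return NOTHING

def on_side_jump(world, player_pos, side_dir):
    """Fired during the sideways-jump animation: if the hip-level probe
    hits a wall, kick off the wall-run state; otherwise keep falling."""
    return "wall_run" if probe(world, player_pos, side_dir) == WALL else "fall"
```

The real mod, of course, had to route this through bullets spawning special explosions that only hurt the player, because that damage event was the only hook available - which is exactly the kind of lateral thinking that made this mod special.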

Here's the trailer from back in the day, which had me hyped as hell - note the Jet Li and Jackie Chan models, which didn't end up coming with the mod (and you can barely tell who they are here, but the screenshots on the mod's website looked really good).

https://www.youtube.com/watch?v=CnegCg638N4

Also, Remedy added an homage to the mod as an easter egg in Max Payne 2, which is lovely:

dev scoops: dhabih eng reveals a previously undiscovered half-life 2 easter egg

Dev Scoops: XCOM/2's art direction

xcom 1 · xcom 1 · xcom 1 · xcom 2

I had a moan about XCOM 1 being prettier than XCOM 2 the other day and got some nice replies from the Art Director of both, Greg Foertsch.¹

Having replayed both again just lately, it's striking to me how much more interesting 1 is visually. It uses baked lighting (Unreal's Lightmass, first seen in Gears 2) and gets lovely results; 2 doesn't, and gets no indirect lighting.

XCOM 2 also has less of the nice cartoony style of 1, going for something closer to generically modern - lots of clean shiny surfaces with screenspace reflections (not part of UE3, so either done custom for XCOM 2 or backported from UE4), very flat, very sparse. The best-looking areas (forests, etc) break the visual up with too much noise, which feels like an occasional bandaid over the missing global illumination - GI gives a scene some amount of "free" variation on top of just being very pleasant.

My assumption is that 2 ditched static lighting in favour of more semi-procgen level assembly and swappable time of day and stuff, which you can still statically light, but with a lot of extra hassle. Personally, it's hard for me to see it as worth it without any other GI subbing in.

As I said on Twitter, though, I'll probably be in the minority in caring what this type of game looks like at all. Which is where Greg came in:

Greg Foertsch:

I appreciate that minority. The baked lighting in XCOM:EU served it well and along with a lot of familiar environments, really resonated with players. Your assessment of the games is correct. The design problems in X2 we were trying to tackle dictated some of the decisions that were made. XCOM:EU will always be my favorite of the two. Lots of good insights in your comments. I could talk about this stuff all day

And in response to someone else, about the dynamic lighting:

Correct. The addition of procedural levels and dynamic lighting together had a significant impact on my approach to the art direction

Greg seems cool, thanks Greg


  1. (i like to document these sorts of twitter interactions on cohost now because i think cohost will actually tell me before it shuts off for good)

Dev Scoops: Max Payne 2's character shadows

A while ago I tweeted this about Max Payne 2's really nice shadow - this was 2003, remember:

Max Payne in 2 only ever casts one shadow, so it just moves around based on - I guess? - averaging the relevant lights. It's a nice shadow though. A lot of UE3 games did this too, I think including Batman AA?

Luckily, a couple of Remedy folks who worked on the game saw it!

Petri Häkkinen:

Hey, that looks familiar! I wrote the shadow rendering code for Max Payne 2. :) It’s basically a CPU decal with 2 shadow maps, hard and soft, blended together. Direction & intensity is sampled from the radiosity lightmaps. @jaakkolehtinen wrote the GI system.

Jaakko Lehtinen:

It’s a shame we never talked about this, or the (what I still think is cool) distributed radiosity solver that scales well with scene complexity by breaking things down with portals and mediating radiance between “rooms” with 4D light fields in a 2-level iterative manner. It’s based on a 1st order SH irradiance volume, which is equivalent to an ambient term (DC) and two directional lights, positive & negative, from opposite directions (linear terms). Shadow direction and strength come from the latter: in a uniformly lit spot, the shadow fades away.
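If you want the gist of that last bit in code: with first-order SH, the three linear coefficients form a vector, and that vector is basically your shadow. A toy sketch (my reading of Jaakko's description, not Remedy's code - I'm glossing over the actual SH normalization constants):

```python
def shadow_from_sh(dc, linear):
    """`dc` is the ambient (DC) term, `linear` the three directional
    coefficients of a 1st order SH irradiance sample. The dominant light
    direction is the normalized linear vector; shadow strength scales with
    how directional the lighting is relative to the ambient, so in a
    uniformly lit spot (linear ~ 0) the shadow fades away entirely."""
    mag = sum(c * c for c in linear) ** 0.5
    if mag < 1e-6:
        return None, 0.0  # uniform lighting: no shadow at all
    direction = tuple(c / mag for c in linear)
    strength = min(1.0, mag / max(dc, 1e-6))
    return direction, strength
```

Which matches what Petri said: the decal system just samples direction and intensity out of the baked lightmaps at Max's feet, no runtime ray casting needed.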

EDIT: After I posted this very post you're reading, someone else chimed in!

Peter Hajba:

So, what Jaakko was saying there was that the shadow (light) direction is baked into every point in the rooms where the light is rendered on, so you didn't need to cast any rays to cast the shadow, just look up the direction value from the spot Max Payne was standing on.
Radiosity rendering is pretty neat. Basically it goes onto each point on a surface, looks around to see if there are any lights visible, and then decides if that spot is lit. That's the first pass. Then in the second pass the renderer also sees the lit surfaces with color.
So if there are strongly coloured surfaces, the light applied to each point gets coloured by that. Then you run a third pass and the light bounces a third time, and more, until you have beautiful baked lighting.
We had a little distributed render farm at the Remedy office. Whenever we let our work computers idle, they would start calculating Radiosity lighting on the Max Payne levels.
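Peter's pass-by-pass description maps onto a classic gather-style radiosity loop. A miniature sketch - the scene and the hand-set form factors in the test below are made up, a real solver computes them from geometry and visibility:

```python
def radiosity(emission, reflectance, form_factors, passes=3):
    """emission[i]: light patch i emits (direct light, pass one);
    reflectance[i]: its albedo; form_factors[i][j]: fraction of patch j's
    light that reaches patch i. Each pass, every patch gathers light
    bounced off every other patch, so color bleeds a step further."""
    n = len(emission)
    radiance = list(emission)  # pass one: direct/emitted light only
    for _ in range(passes):
        gathered = []
        for i in range(n):
            incoming = sum(form_factors[i][j] * radiance[j]
                           for j in range(n) if j != i)
            gathered.append(emission[i] + reflectance[i] * incoming)
        radiance = gathered
    return radiance
```

Each extra pass is another bounce, exactly like Peter's second and third passes - and since every patch's gather is independent, you can see why it split so nicely across a farm of idle office machines.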

Thanks Remedy gang!

dev scoops: Fire Propagation in Weird West

Here's something I did on Weird West that I thought was cool: fire propagation across grass and foliage. Lots of stuff is flammable in the game, and there's a general elemental signal system that handles most of those interactions, but foliage needed some special handling.

Maps full of foliage had already been made using instanced foliage meshes, which are super performant but aren't Actors, so they don't have Components, which everything else uses to catch fire/get wet/electrified etc, and I didn't want LDs to have to do anything for this to work.

Foliage being painted using the Unreal foliage tool

One approach is to swap each foliage instance out for an actor, either in-editor or at runtime when some fiery event happens, but that's potentially pretty bad perf-wise (spawning very many actors), and doesn't account for the arbitrary density of these instances - that is, you might have a field where every blade of grass is an instance, or maybe it's in big clumps, or it's multiple foliage types so it's both, and you just want a nice even fire spread across this field. As you may now have guessed, here I cluster the instances.

The blueprint function called when something wants to interact with foliage

When stuff burns, it calls this function, which checks for foliage in that location/radius and looks those meshes up in a data table. The table also has stuff like "what fx to play if we cut this plant with a machete" + actors to spawn on break (eg corn plant spawns corn if cut).

The Foliage Data Table

If the foliage we found is burnable/chunkable, we spawn a single actor to represent that whole radius of foliage, add meshes to the actor at the same transforms as the original instances, and delete the foliage instances, swapping a few meters of foliage instances for 1 actor. That actor can now use our Signal system for burning/getting wet/whatever, just like characters do, and starts doing this process again for any foliage around it. Keeping this performant is just a balance of how fast your fire spreads vs runs out of fuel.
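The clustering step, stripped down to plain Python (Unreal types swapped for tuples, and the proxy dict here stands in for the spawned actor - names are just for illustration):

```python
def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def cluster_foliage(instances, origin, radius):
    """Split `instances` (list of (x, y) transforms) into the ones inside
    the burn radius, which all get handed to a single proxy 'actor', and
    the ones left behind as cheap instanced meshes. The proxy keeps the
    original transforms so the swap is invisible to the player."""
    r2 = radius * radius
    claimed = [p for p in instances if dist2(p, origin) <= r2]
    remaining = [p for p in instances if dist2(p, origin) > r2]
    proxy = {"meshes": claimed, "state": "burning"} if claimed else None
    return proxy, remaining
```

However dense the field is - one instance per grass blade or big clumps of multiple types - a few meters of foliage always collapses into one actor, which is what keeps the spread even and the actor count sane.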
Our fire rules are pretty much the same for most types of object, but foliage wants to act a little more impressive even if it means being a touch unpredictable, so we have a few specific settings, eg wind contribution. I like to keep these in a data asset so designers can get at 'em without touching code (including BP). For performance, none of these fires actually creates any light directly. When there's a foliage fire going on, we create one non-shadow-casting fire light that moves and changes size to cover the entire burning area. Ditto sound. Same idea as my old fire system.
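The one-light trick is just a bounding volume over whatever's currently burning - something like this sketch (made-up names, 2D for brevity):

```python
def fire_light(burn_points, padding=1.0):
    """One non-shadow-casting light sized to cover every burning point:
    centered on the centroid, radius reaching the farthest point plus a
    little padding so the glow doesn't clip at the edge of the fire."""
    n = len(burn_points)
    cx = sum(p[0] for p in burn_points) / n
    cy = sum(p[1] for p in burn_points) / n
    radius = max(((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5
                 for p in burn_points)
    return (cx, cy), radius + padding
```

Recompute this as the fire spreads and you get one light (and one sound) that grows with the blaze, instead of a light per burning mesh.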

We also use all the Cool Kid Shader Perf Tricks, like having the burning foliage turn to embers and threshold out using Time in the shader rather than wasting any CPU on it, which ends up meaning we can have very large dangerous exciting fires with generally quite a low perf hit. I'm pretty happy with the result, and it ends up a system where you get these kinds of anecdotes, so who can complain:

For cutting foliage, we use the same function but it's much simpler: check in front of the player for foliage, look it up in the table, destroy the foliage, spawn the tabled FX/items etc for that foliage, or replace with an actor. This game doesn't do it, but if we wanted to be able to cut down trees with an axe using this system we could have it going in like 20 minutes.

Also, this was all done in Blueprint. Blueprint is the best.