any day now I will release a video game

but cohost is shutting down, so i won't be able to post about it.

you might be reading this in the future, when it's already out. if so, you should go buy it and leave a positive review! at the link below!

if it isn't out, you should wishlist it at the same link. i won't be able to edit this post when it does come out, so if you're reading this not right when i posted it, you actually have to click the link to find out if it's out. that's quite intriguing, isn't it?

https://store.steampowered.com/app/867150/InFlux_Redux/

anyway, if you do happen to notice when it does come out, please do help me spread word of its existence however you think you can. i spent years on it on and off, it would be nice to get money

also there is an eggbug in it

A NEW TRAILER FOR MY GAME IS OUT. COHOST MAY DIE SLIGHTLY BEFORE IT SHIPS SO PROBABLY WISHLIST IT NOW? SO YOU CAN BUY IT WHEN IT COMES OUT?

https://www.youtube.com/watch?v=JJsQPUANK98

It's a remake I've been doing on the side for like 6 years of a game I put out 11 years ago; I have no marketing or anything I'm just droppin' it sometime in the next few weeks so it would be nice if some of you bought it and put a positive review on it so Steam doesn't necessarily just swallow it right up! Thanks gang!!!

https://store.steampowered.com/app/867150/InFlux_Redux/

for fun, I've been making a game for a few weeks using weird west assets and the imsim framework i was making some time ago. it's third person. yesterday i made it so holstered items get slotted around the body

i also made a neato intro cutscene you can see on twitter https://x.com/joewintergreen/status/1833435748303049067?t=DS40YLpFoROo7wQL7ALUpA&s=19
or the other one https://bsky.app/profile/did:plc:ht6gfr46gk4nfkkfpqwh3su4/post/3l3ytu7xmgx26

my Deus Ex Lipsyncing Fix Mod: making of

https://www.youtube.com/watch?v=oxTWU2YgzfQ

Back in 2021 I made a mod for Deus Ex 1 that fixes the lipsyncing and blinking, which, I betcha didn't know, was broken since ship. Everything I wrote about it is on Twitter, and it oughta be somewhere else, so here's a post about it.

I guess I was playing DX1 and thinking, geez, was this lipsync always this bad? In a weird way? It's insta-snapping mouth shapes, but they're not always the same mouth shapes. Is this broken? I couldn't find anything online about it, but I did find this article: an interview with Chris Norden, a coder on DX, where he goes into the lipsyncing and how it was, at one point, super elaborate and amazing, and they had to pare it back for performance reasons. I thought I'd check how much of this was done in Unrealscript (since the C++ source for DX is nowhere) and whether I could un-pare it. It turns out it was an extremely simple fix to get it as good as I got it, and I think that's as good as you can get it until someone leaks the source code.

I'd messed around with lipsyncing stuff before and was familiar with the broad strokes of how it tends to work via my intense familiarity with Half-Life 2: you figure out, hopefully automatically, the sounds (phonemes) present in a sound file ("oo", "ah", whatever) and map those to mouth shapes (visemes), then when the audio plays, move the mouth into the right shape for the phoneme we're in at this moment. The figuring-out process is called "phoneme extraction", at least by Valve, and Valve do this offline, because it takes a sec. In Valve's case they append this phoneme information to the end of the .wav file, and it looks like this:


PLAINTEXT
{
Okay, I don't blame you for hesitating, but if we're gonna do this thing, then let's just get through it. 
}
WORDS
{
WORD Okay 0.064 0.224
{
111 ow 0.014 0.096 1.000
107 k 0.096 0.142 1.000
101 ey 0.142 0.220 1.000
}
WORD I 0.224 0.352
{
593 ay 0.220 0.310 1.000
105 iy 0.310 0.364 1.000
}
WORD don't 0.352 0.496
{
100 d 0.364 0.396 1.000
111 ow 0.396 0.456 1.000
110 n 0.456 0.496 1.000
}

, etc. Phonemes, start times, end times. Easy!
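(If you ever wanted to consume data like that yourself, the lookup side is about as simple as it looks. A rough sketch in Unreal-flavoured C++, with invented names, not Valve's code: given a list of (phoneme, start, end) entries, find the phoneme active at the current playback time.)

#include "CoreMinimal.h"

struct FPhonemeSpan
{
	FName Phoneme;     // e.g. "ow", "k", "ey"
	float Start = 0.f; // seconds into the wav
	float End = 0.f;
};

FName ActivePhonemeAt(const TArray<FPhonemeSpan>& Spans, float Time)
{
	for (const FPhonemeSpan& Span : Spans)
	{
		if (Time >= Span.Start && Time < Span.End)
		{
			return Span.Phoneme;
		}
	}
	return NAME_None; // silence: close the mouth
}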

My assumption about why Deus Ex's super cool lipsyncing was too expensive to ship: they don't seem to save this information anywhere, so I guess they were figuring out the phonemes in realtime. If that's correct, it's sort of a bummer - doing what Valve did would have scooped the whole cost out. Maybe there was more to it.

Anyway, the Unrealscript. Deus Ex predates Unreal having skeletal animation; it's all vertex animation. The character heads have a few vertex anims - relevant here: 7 visemes and a blink. nextphoneme is set from somewhere outside this code (probably a cpp audio system I can't access) to A, E, F, M, O, T or U (it doesn't matter which is which, and I don't remember), or X, which means nothing (close the mouth). Then this Unrealscript on the character sets the head's anim sequence to the appropriate pose. This all happens on tick, but only if IsSpeaking. We have a tweentime we're using to blend between these poses, so we should be seeing nice smooth blending, the lack of which is why I'm here in the first place! So what's the problem?

The main thing is a dodgy frame rate check:

// update the animation timers that we are using
	animTimer[0] += deltaTime;
	animTimer[1] += deltaTime;
	animTimer[2] += deltaTime;

	if (bIsSpeaking)
	{
		// if our framerate is high enough (>20fps), tween the lips smoothly
		if (Level.TimeSeconds - animTimer[3]  < 0.05)
			tweentime = 0;
		else
			tweentime = 0.1;

"tweentime" is how long it takes to blend to the next viseme in seconds; if 0, it's an instant snap. The intent here is to skip blending entirely if our framerate is so low that it looks better snapping the lips around than showing any in-between poses, only it doesn't work. The code is keeping Level.TimeSeconds from the previous frame and subtracting that from the current Level.TimeSeconds to get deltatime, which if it's less than 0.05, we're assumed to be getting less than 20fps. So it's flipped.

Also, 0.1 is just way too fast a value; I suspect I know the reason for that, and I'll come back to it*. I increased it to 0.35 to make the blends take long enough to really see.

With that fixed, the lipsync is smooth! Hooray! But it's not perfect: at the end of a line, when the audio finishes, we don't smoothly close the mouth; we snap the mouth shut instantly. This is because we're only doing any blending if bIsSpeaking=true, which it suddenly isn't. The perf hit of this function no longer matters at all, so I just skip that check too: every character always gets to run lipsync. Tweentime is also local to this function and initialises at 0, so I had to set it to 0.3 to get blending even when we have no phoneme.

Blinking was also way too fast, so fast as to be invisible, so I slowed it down a ton. Now you can see 'em blinkin'.

So now we have nice blinking and smooth mouth movement, but there's one thing that still sucks: presumably as part of the optimisation that made this ship at all, nextphoneme does not update every tick, or anywhere near every tick. It doesn't even update at a fixed rate - sometimes you'll get a good amount of updates in a sentence, sometimes one or two. This means that all the smooth blending in the world won't get you a correct result unless you happen to get lucky: JC can be speaking the M in "a bomb" and you're still back on the "a". As far as I can tell there's no way to fix this right now - the code that updates the phonemes just needs to do it every tick, and it don't, and it's not Unrealscript so I can't touch it. If the time between phoneme updates was at least consistent, you could set tweentime to that duration and make your blend take as long as it takes for a new phoneme to show up, but it ain't. So close!

*In the interview where Norden alludes to this amazing lipsync demo they had going on before they optimised it down, I assume it was initially getting a new phoneme every tick, and that is probably when they set 0.1 seconds as a blend duration. If you're getting constant new phonemes, blending super fast to the next one makes sense; it's only when you're not that a slower blend time looks good.

There's a lot of jank to this code. The silliest thing about it might be that it lives in ScriptedPawn, Deus Ex's NPC class, which does not share an immediate parent with the player character, so this whole function is just duplicated between the two classes.

Anyway, here's the whole function after I futzed with it.

// lip synching support - DEUS_EX CNN
//
function LipSynch(float deltaTime)
{
	local name animseq;
	local float rnd;
	local float tweentime;

	// update the animation timers that we are using
	animTimer[0] += deltaTime;
	animTimer[1] += deltaTime;
	animTimer[2] += deltaTime;

	if (bIsSpeaking)
	{
		// if our framerate is high enough (>20fps), tween the lips smoothly
		
//JOE CHANGE: 
//This used to set tweentime to 0 (no blend) if it thought FPS was low, else 0.1. It was 
//backwards though, the result was the opposite. 
//Even 0.1 is too fast to look good though. Anyway, skip the check, we don't care
//
//		if (Level.TimeSeconds - animTimer[3]  < 0.05)
//			tweentime = 0.4;
//		else
			tweentime = 0.36;

//Also, ideally tweentime would be the duration until the next time we get a phoneme update?
//But I don't know where that update comes from at the moment

		// the last animTimer slot is used to check framerate
		animTimer[3] = Level.TimeSeconds;

		if (nextPhoneme == "A")
			animseq = 'MouthA';
		else if (nextPhoneme == "E")
			animseq = 'MouthE';
		else if (nextPhoneme == "F")
			animseq = 'MouthF';
		else if (nextPhoneme == "M")
			animseq = 'MouthM';
		else if (nextPhoneme == "O")
			animseq = 'MouthO';
		else if (nextPhoneme == "T")
			animseq = 'MouthT';
		else if (nextPhoneme == "U")
			animseq = 'MouthU';
		else if (nextPhoneme == "X")
			animseq = 'MouthClosed';

		if (animseq != '')
		{
			if (lastPhoneme != nextPhoneme)
			{
				lastPhoneme = nextPhoneme;
				TweenBlendAnim(animseq, tweentime);
				TimeLastPhoneme = Level.TimeSeconds;
			}
		}
		

//		if ((Level.TimeSeconds - TimeLastPhoneme) >= tweentime*0.8 && TimeLastPhoneme != 0)
//		{
//		TweenBlendAnim('MouthClosed', 0.2);
//		nextPhoneme = "X";
//		lastPhoneme = "A";
//		TimeLastPhoneme = Level.TimeSeconds;
//		}
	}
	else
	if (bWasSpeaking)
	{
		bWasSpeaking = false;
		
//JOE: I added this tweentime set. Without it it was 0 as initialised, so the jaw snapped shut

		tweentime = 0.3;
		TweenBlendAnim('MouthClosed', tweentime);
	}

	// blink randomly
	if (animTimer[0] > 0.5)
	{
		animTimer[0] = 0;
		if (FRand() < 0.4)
			PlayBlendAnim('Blink', 0.2, 0.1, 1);
	}

	LoopHeadConvoAnim();
	LoopBaseConvoAnim();
}
[archived]

the thing about development hell is that often there is no "the game as originally envisioned" to be pined for or found with "just a little more time"

a lot of times you end up in development hell because the core vision is vague, contradictory, keeps changing, has big holes in it. It's not that those games fell short of their original ideals so much as they were trying to build a brick house on a swamp

sometimes, if you're lucky, you firm it up in the right places and it stabilizes

other times it just sinks

[archived]

It's layered like a lasagna

I just finished a three-year stint on a AAA game project that has been "in development" for at least six years. This client has now gone back to essentially pre-production. One of the things they're struggling with is carrying around a huuuuge amount of technical debt for games that never shipped.

We're talking things like an inventory system (built two games ago) that was never fit for purpose stacked on top of a backpack system (built three games ago) that was a colossal hack, which has to interact with "pockets" (built one game ago) for the player character to put their weapon in. Except that the inventory system is on an entirely different backend than the pocket system and doesn't replicate equipped weapons correctly, so the UI team (that's me!) has to assign a weapon picture directly to a player weapon slot based on their pocket index.

Throughout the project, I would poke the lasagne only to find it rotten all the way through. There was no "original vision" to RETVRN to. The vision was always inspired by whatever game was currently in vogue. As a UI programmer, I was constantly asked to display information to the player that simply did not exist. But because it was always for an "important demo," my team would have to fake the data themselves. This was a very bad idea (and I told them as much!!) because it meant that upper management would look at the game and see massive improvements while it was just another layer of load-bearing paint.

#gamedev#AAA#AAAAAAAAAAAAAAAAAAAAAA
[archived]

[archived]

boy have i seen this

i was on a project that i joined at a point where they had spent years in the kind of "preproduction" where you allocate a chunk of time to figure out what you're trying to make by doing a build that you know you'll throw out when you figure out the real goal. then you throw it out and start fresh from a better position.

except they had neither thrown it out nor figured out a design. chunks of it had been thrown out and restarted at random, with no more vision of the final product than they'd had years ago. at all times, they were working on some funding or conf demo milestone that precluded any actual game development; half the time it was a "vertical slice" that was just pretending to be a vertical slice because there was nothing, even on paper or in someone's brain, to be a slice of.

every task i would get was a trap: there's no way to build anything in a way that supports the design when there is no design. you either go out on a limb with a system you've designed, which you get told is wrong even though there is no right, or you do your best to accommodate multiple of whatever options you imagine they might end up wanting and never find out if you succeeded or not because they never commit to any design. when this happens, it's like you didn't do the work at all; nobody has a task to interact with the result of your task so eventually they forget about it and task it to someone else again and now there's three inventory systems. you try to just take ownership of a feature because nobody else has, and get your hand slapped, so you go talk privately to the many designers who have been hired and find out the same thing is happening to them.

this is obvs cartoonishly bad mismanagement but some of these traps are distressingly easy to fall into the second you're not doing strictly-for-free indie development: the second you're working towards "the build" for "the conference" or "the milestone" instead of just chipping away at the game, there's a huge risk you never get back to what you were supposed to be doing. there's always another build

A funny thing that happened was my remake of my 11-year-old game failed Steam build review and one of the issues was "seems identical to the game you put out 11 years ago. explain why this should be its own product" like damn. I was expecting a steam review not a steam review

so i was working on a game. the game has haystacks sometimes. if you set fire to the haystack, there's a one in six chance that when it burns away, it turns out to contain a treasure chest! you can get some loot

i implement this. much later, QA reports a bug: the chance of a treasure chest spawning isn't one in six. uhhh ok. i have a look and can't repro. i look at the code and it's the same "random chance" code used everywhere, so if there was anything wrong with it, this wouldn't be where it came up. i have a bit more of a think and then send it back, CNR.

the bug bounces back and forth for 2 weeks as the tester keeps saying it's still not one in six. i am down a few rabbit holes thinking about, could we be not spawning the chest even though the result comes in true? is some other class destroying chests? are some levels containing instances of the haystack that are busted in some way? much to think about

eventually i'm like, ok, what exactly are you seeing in your repro. old mate says "i burn 6 haystacks. sometimes there's no treasure chest at all. sometimes i burn 6 haystacks and get more than one treasure chest. so that's not a 1 in 6 chance"

i now have a new, much more challenging task: explain this without embarrassing anyone
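(for the record, the arithmetic: with an independent 1-in-6 roll per haystack, burning 6 of them gets you zero chests about (5/6)^6 ≈ 33% of the time, exactly one chest about 6 × (1/6) × (5/6)^5 ≈ 40% of the time, and two or more the remaining ~27%. "sometimes none, sometimes more than one" is exactly what a 1 in 6 chance looks like.)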

a thing i have never known

is how some unreal engineers never provide tooltips for variables and functions exposed to blueprint

do you know how easy it is to do? you literally comment it. that's it. the editor uses the comment as the tooltip

so many people never do this
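for the c++ side it's literally just this - a minimal made-up example, not from any real project; the comment sitting directly above a UPROPERTY or UFUNCTION is what the editor shows as the tooltip:

// hypothetical example component, purely to show where the comments go
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "HearingComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UHearingComponent : public UActorComponent
{
	GENERATED_BODY()

public:
	// Maximum distance (in cm) at which this component can hear a sound.
	// The editor shows this comment as the tooltip in the Details panel
	// and on Blueprint getter/setter nodes.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="Hearing")
	float MaxHearingDistance = 1500.f;

	// True if Location is within MaxHearingDistance of the owner.
	// Same deal: this comment becomes the Blueprint node's tooltip.
	UFUNCTION(BlueprintCallable, Category="Hearing")
	bool CanHearLocation(const FVector& Location) const;
};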

if you want shorter games with worse graphics made by people who are paid more to work less, graphics tech advancements should please you

it's massively, massively more feasible than ever before for one-person devs/tiny studios to rapidly, iteratively make stuff, all kinds of stuff, from the fancy-graphic'd to the deliberately lo-fi, with fewer limitations, because of the same tech advancements that are mostly marketed via boring shit like a realistic building exploding or a couple of photorealistic guys punching each other.

the actual cost of making games is now almost completely decoupled from their graphical fidelity and that's a good thing. every task in game dev takes less time than it ever took. everything is cheaper to do.

so when you see a big company grind their workers into paste forcing them to deliver more and more on tighter and tighter schedules, please be aware that it has nothing to do with graphics or hardware or gamers' demands for anything, and everything to do with the capitalist thinking that says: oh, you can work twice as fast now? i will give you half the time


there's been a shift from better hardware directly meaning "better graphics" to it meaning "devs can make a wider range of stuff more quickly using systems/workflows/techniques that weren't previously feasible", which to players (or even devs not using the stuff) looks the same as nothing changing.

so you're like, "this looks like a game from 10 years ago! or near enough! why's it need a 10 years newer computer!" because being able to target that higher spec meant we could waste less time, work more iteratively, experiment more and make better stuff. we can be more expressive because we're less limited by process. that doesn't even mean you're going to like the output more, but it does mean it's less compromised and closer to what the author wanted to make. unless the boss is a prick

games all want an SSD now, including games that don't appear to be doing anything so very much wilder than was happening in 2009, but the SSD is letting us do that stuff in less time. like development time. the cleverer ways we were making it work back then took time, and forced premature commitments, and limited what we were able to do. invisible compromises nobody ever found out about were getting made constantly. remember how thief 3 had all its levels awkwardly split into chunks so it could run on the xbox? that's just an unusually visible compromise. it's surrounded by worse ones you'll never hear about

tech people have been chasing "dynamic global illumination" for years, and now it's here, and players don't care, because we had many ways to make stuff be fine without it. but it's expanding vastly the range of stuff you can feasibly do as a dev. think of the lighting in Mirror's Edge that everyone loves: it took many hours to bake every time they wanted to see it, then you'd make a tweak and have to bake all over again, so you just didn't iterate on your lighting all that much, it wasn't practical. now we can get those same results 60 times a second, but you need better hardware. but you're making better art because you don't have to wait ages every time you want to try something.

this is the same issue we always have in trying to convey the value of good level design and narrative tools: players (or anyone but the author directly benefiting) don't see the value. but the value is in everything: you make different, differently interesting stuff, because of the better tools.

every time i have the job of advocating for writers to have better tools (i make this my job everywhere i go) someone insists it's pointless because there are words in the game already. some people just cannot be convinced there's a benefit to a thing unless you can jump into a parallel universe to show them exactly what the result would have been without it

[archived]

The Steam page for InFlux Redux now has recent screenshots on it

Consider wishlisting this, the long-awaited and quite intensely remastered version of the puzzle game I shipped 11 years ago when I was like 2 years old

here is the steam page

#influxredux#game dev#unreal engine#indie dev#indie games
[archived]

not rechosting/sharing this on every platform available to you would be pretty fucked up behaviour honestly

here is the link, come on. come on! https://store.steampowered.com/app/867150/InFlux_Redux/

doing PCG tests with Weird West assets

Something very satisfying about making a forest and then running a road spline through it and having the forest automatically make way for the road, or drop a building in and not get grass or trees spawning where the building is

path tracing experiments with weird west assets

becoming a real Guy Who Has Websites

I guess so far they're all Resources (rumour has it, really fuckin good ones) but it's fun, you know. I think I'm gonna do more. This "world wide web" thing might really have legs

[archived]

unreal scoops dot com

I didn't like that I ran out of pinned messages on my gamedev discord so I've started putting those scoops on a website on the internet instead:

http://www.unrealscoops.com

#gamedev#unreal engine#unrealscoops
[archived]

I have updated http://Unrealscoops.com

so that now, instead of being a static website, it pulls the scoop list from my forum at Impromptu Dot Zone (http://impromptu.zone). So you can now submit your own scoops via the forum.

Made a quick video to show someone some basic level editor workflow stuff in unreal

just watched back this video i did about soldier AI in half-life 2 years ago and i still agree with it! what a relief

https://www.youtube.com/watch?v=OsDM7GKb0xU

once again thanking myself for creating my sound splines

impromptu dot zone

So I have a good and lovely Discord gamedev community, where much useful info is shared, but Discord is a lousy place for information to live

so now there is an internet forum for that same community and those same scoops: http://impromptu.zone. feel free to join

First I made https://unrealscoops.com, which is good, but with a forum it's much easier to add a new scoop, and accept scoop submissions.

Unreal is the only scoop genre on the forum at the moment but I'll probably add others, it's not specifically an Unreal community

Also you can share general art here, make a thread for a dev log for your project or whatever if you want, but y'know, if nobody does and it's just the new Unrealscoops that's fine too

As with the discord, no dickheads or assholes will be accommodated

made a new portal material

Finally trying the material based radial motion blur from ContentExamples (since regular motion blur is bad at things that spin). Works p well

added impact sparks when you boost into stuff

tested something in influx redux, almost quit, but then decided to go to the beach.

real fun to be tossed around by the waves.

https://youtu.be/pH0i7R7Yy-o

when a game dev has unusual steam activity

UE5 Auto Material Instance Maker's on the Marketplace

You can now buy my cool material instance helper tool on the Unreal Marketplace. Also it's still on Itch.

https://www.unrealengine.com/marketplace/en-US/product/571ececbf55d4f77b76525743172a964

https://impromptu-games.itch.io/


Sellin' my Light Audit Tool

Selling this since a bunch of people offered to buy it. It's a helper widget I made to help track down expensive lights in UE5. Sorts all the lights in your scene by how expensive they probably are (based on how many shadow casting objects are in the radius, etc).

Also gives you an easy way to directly change properties of LightComponents inside Actors without selecting the actor and then the component.

Features:

-Sort all lights in your scene by a cost estimate. This includes lights that are components on actors, not just Light Actors.

-Change properties on individual Light Components, regardless of their owning actor, more easily than usual.

-Preview Max Draw Distance: Optionally show only lights whose MaxDrawDistance the editor camera is currently within, i.e. culled lights are skipped.

-Scale Attenuation Radii tool - takes every selected light and scales its radius by some multiplier. Eg: enter 0.5 to halve every light's radius, which can be handy if they were all different radii. (There's a rough sketch of this one after the feature list.)

-Scale Intensity tool - same thing, but for intensities.

-Select All Lights Matching Color - takes the selected light, and selects every light in the scene of that same color, optionally including not just Light Actors but all actors with LightComponents.

-Tool shows an indicator when the light you're dealing with was added via Blueprint script, and therefore will probably (not definitely) get its settings stomped by the same script.
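(The actual tool is an Editor Utility Widget doing all of this in Blueprint; just to give a sense of how simple the guts of, say, the attenuation scaler are, here's roughly the equivalent in editor C++, with illustrative names:)

#include "Editor.h"
#include "Engine/Selection.h"
#include "GameFramework/Actor.h"
#include "Components/PointLightComponent.h"

// Editor-only: scale the attenuation radius of every point light on every
// selected actor by a multiplier (0.5 halves them, 2.0 doubles them).
void ScaleSelectedLightRadii(float Multiplier)
{
	TArray<AActor*> SelectedActors;
	GEditor->GetSelectedActors()->GetSelectedObjects<AActor>(SelectedActors);

	for (AActor* Actor : SelectedActors)
	{
		TArray<UPointLightComponent*> Lights;
		Actor->GetComponents<UPointLightComponent>(Lights);

		for (UPointLightComponent* Light : Lights)
		{
			Light->Modify(); // keep the change undoable
			Light->SetAttenuationRadius(Light->AttenuationRadius * Multiplier);
		}
	}
}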

#gamedev#UE5#unreal engine#editor utility widget#tools#itch.io
[archived]

[archived]

If you prefer to buy your unreal junk through the unreal marketplace even though itchio is better, my light audit tool is on there now

[archived]

UE5 Auto Material Instance Maker Tool

I just made another neato time-saving tool in Unreal, maybe I'll whack it on the marketplace, let me know if that is something that you would like it if I did it

https://www.youtube.com/watch?v=sXzwHp9EwBQ

[archived]

you can buy this now

UE5 Auto Material Instance Maker Tool

I just made another neato time-saving tool in Unreal, maybe I'll whack it on the marketplace, let me know if that is something that you would like it if I did it

https://www.youtube.com/watch?v=sXzwHp9EwBQ

The excellent Liz Edwards (@lizaledwards) sent me her excellent Sam Fisher model and I faffed about getting some spooky shots in Unreal with it

I skinned it myself (which I've never done before) inside Unreal, so any ugliness is my fault


Light Audit Tool, Again

https://youtu.be/afg6mz3_oVk

I've had a Light Audit Tool kickin' around for a long time and gotten a lot of use out of it on a few projects. Showed it off recently and folks were into it, so I remade it from scratch; it is now cooler. Might sell it or somethin'.

Now with:

-option to only show lights whose MaxDrawDistance our editor cam is within
-Scale Attenuation Radii or intensities on all selected lights
-Select lights matching color
-shows if a light was added from blueprint

portal collision

jumped back into my old unreal portal project and fixed it up to use the newer DynamicMeshComponent instead of ProceduralMeshComponent to cut holes in walls for collisions, which is much faster and less fucked up. plus it means the holes are portal shaped now instead of secretly squares.

the problem this solves is, when you place a portal on a wall, then try to walk through it, you hit the wall. so i'm disabling collision on the wall, making an invisible copy of the wall, cutting a hole in that, and letting that be the collision for the wall with the portal on it until you remove the portal. p cool
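very roughly, the shape of it looks like this. not the actual project code, and the two helpers at the bottom are hypothetical stand-ins for however you copy the geometry and boolean-subtract the portal shape (geometry script's mesh booleans would do it):

#include "Components/StaticMeshComponent.h"
#include "Components/DynamicMeshComponent.h"

// Rough sketch of the collision-proxy idea described above.
// CopyWallIntoDynamicMesh and CutPortalHole are hypothetical helpers.
UDynamicMeshComponent* MakePortalCollisionProxy(UStaticMeshComponent* Wall, const FTransform& PortalTransform)
{
	// 1. The real wall keeps rendering, but stops colliding.
	Wall->SetCollisionEnabled(ECollisionEnabled::NoCollision);

	// 2. Invisible dynamic-mesh copy of the wall that exists only to collide.
	UDynamicMeshComponent* Proxy = NewObject<UDynamicMeshComponent>(Wall->GetOwner());
	Proxy->RegisterComponent();
	Proxy->AttachToComponent(Wall, FAttachmentTransformRules::KeepWorldTransform);
	Proxy->SetWorldTransform(Wall->GetComponentTransform());
	Proxy->SetVisibility(false);

	CopyWallIntoDynamicMesh(Wall, Proxy);   // hypothetical helper
	CutPortalHole(Proxy, PortalTransform);  // hypothetical helper: mesh boolean subtract

	// 3. When the portal is removed: destroy the proxy and re-enable the
	//    wall's own collision (not shown).
	return Proxy;
}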

constrained curl noise is my jam of late

3d curl noise but then force all the particles to a plane/surface, looks fuckin sick

light audit tool

here's a cool editor utility widget I made some time ago, been useful on a few projects. sorts all the lights in your scene by how expensive they probably are (based on how many shadow casting objects are in the radius etc).

it's hard to extract level geometry from splinter cell 3 (chaos theory)

it's unreal engine 2, which you'd think would mean there was something for that, but people are generally only interested in models, not maps, so there are no tools for this, and every ue2 game seems to differ just enough to make its sdk useless for getting anything out of another ue2 game. best bet seems to be something like "ninja ripper" but i have had pretty shit results with that so far (that's the "just rip whatever the fuck i am rendering right now" stuff and has all the problems you would expect plus exciting ones you would not expect).

splinter cell 3 does actually have a level editor but that's for the multiplayer, which is actually a separate game, and doesn't help with the cooked singleplayer maps

someone leak me the splinter cell 3 dev repo

enjoying niagara

had a crack at some auto-phoneme-generated lipsync this arvo. I think the phonemes themselves are alright and the jank is mainly the shapes and blending, but also this is using Rhubarb which is meant for 2d stuff and only does 9 phonemes

the goal would be something comparable to HL2, whose basically-automatic phoneme extraction is extremely good and detailed. You might as well just mocap your whole face with a phone these days, but that's probably not practical for every line; it's nice to have a passable lipsync for any wav

https://youtube.com/shorts/nSHnQ58wkyc?si=r5tTxhLCyanD0DUL
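for reference, rhubarb can export its mouth cues as a plain list of time + mouth shape per line (shapes A-H plus X), and getting that into unreal is about this much code. rough sketch with invented names, not necessarily how this video was wired up:

#include "CoreMinimal.h"
#include "Misc/FileHelper.h"

struct FMouthCue
{
	float Time = 0.f; // seconds into the wav
	FName Shape;      // A-H or X
};

// Parse a "time <tab> shape" per-line cue file into an array we can sample
// during playback.
bool LoadMouthCues(const FString& FilePath, TArray<FMouthCue>& OutCues)
{
	TArray<FString> Lines;
	if (!FFileHelper::LoadFileToStringArray(Lines, *FilePath))
	{
		return false;
	}

	for (const FString& Line : Lines)
	{
		FString TimeStr, ShapeStr;
		if (Line.Split(TEXT("\t"), &TimeStr, &ShapeStr))
		{
			FMouthCue Cue;
			Cue.Time = FCString::Atof(*TimeStr);
			Cue.Shape = FName(*ShapeStr.TrimStartAndEnd());
			OutCues.Add(Cue);
		}
	}
	return OutCues.Num() > 0;
}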

The PhysX Guy

When Zelda: TOTK came out and everyone was goin' nuts about the physics, I wrote a Twitter thread about the state of physics in video games, which I'll put below because it's interesting. In the thread, I allude to (and request not to be @'d by) a PhysX Guy, a specific guy who worked on PhysX who always materialises on Twitter to make pedantic defenses of PhysX to anyone who thinks PhysX sucks (which it does).

Anyway, I only just noticed that in response to the thread, this guy registered a whole new Twitter account just to address it, and what he reckons are misconceptions about PhysX, although functionally they aren't, as has been explained to him a billion times. I completely missed this whole thing until like 8 months later! Anyway, I thought I would boost it now because he put the work in and got zero engagement back.

Here's the thread copied into Cohost for preservation


Can't stress enough that the physics being amazing in TOTK is because of Nintendo paying for Havok. It's the same reason No Man's Sky has the best Barrel Rolling Down A Hill and HL2 had the best Blow Up A Bunch Of Crates With A Grenade. Most games use something cheaper and worse. Half-Life Alyx also has extremely good physics, it's not Havok but Valve's own thing called Rubikon made partly by, I'm told, ex-Havok folks.

Unreal 5 has a new physics engine called Chaos, which started rough but is now getting pretty Havoky. 99% of stuff on Unreal or Unity uses an old version of PhysX that the PhysX guy who always materialises when I talk about this reckons both companies did a bad job implementing. For whatever reason, PhysX in Unreal and Unity sucks, but gets used by everyone because it's free. Everyone whose gameplay doesn't rely on physics, that is! Because it's unreliable. Mostly if a game requires you to stand on a physics object to progress, it's not PhysX.

Rocket League is on Unreal, but they implemented a third party physics solution called Bullet because it's not feasible to have cars punt a ball around in multiplayer with PhysX (caveat for That Guy: I'm sure it is in your ideal environment nobody really has access to).

BOTW also had amazing physics; you could stand on a ball! Shit never went through other shit! You could stasis-smash shit real hard into other shit no problem! Havok baby. PhysX, you can't let something spin too fast or it'll freak out. (do not @ me, PhysX guy).

So if you wondered why for a while around HL2, it was "oh my god, physics gameplay is the future" and then it wasn't, it's because for a minute, the only physics available was Expensive-But-Good and as soon as Shitty-But-Free was available everyone went for that. It took everything HL2 did off the table, but you could have a ton of impressive but basically cosmetic physics happening and it didn't cost you anything. So 20 years later we get what is basically the next game to lean real heavy on physics, and it's Havok again.

A lot of people blame the weak-ass CPU, but for me it's always the weak-ass GPU and limited memory. The Switch is effectively a 7 year old mid-range android phone, and it's very difficult to get to 30fps (forget 60, for most things) without making your game look like total ass. This is a big issue for #influxredux whenever I try to put that on there - a lot of that game is about looking and sounding nice, and there aren't a lot of options that will survive the Switch's gpu and memory situation.

Until I moved it to Lumen recently, Redux was using baked lighting via Lightmass, which is great visually and for performance - static lighting has functionally no cost on most machines. But on Switch, you can hit a memory wall pretty fast, crashing because you're out of memory, because your pretty lightmaps were too large and too numerous. Crunch the lightmaps down enough to consistently avoid that and you lose most of the visual benefit. Dynamic lighting then becomes more attractive, but of course it's got a big perf hit and, on Switch, will never look that good.

On [redacted project], which I ported to the Switch, static lighting wasn't an option, and shadows were important to the visual for some objects. I ended up doing a trick where I disabled cascaded shadowmaps and most other dynamic shadows, but enabled the Far Shadow, which is basically an opt-in extra whole-scene shadow meant for selectively enabling dynamic shadows on objects far from the camera (out of the usual range for shadows). I tweaked the distance on the "Far Shadow" so that it was super near and was basically the only shadow we ever saw, and then selectively enabled shadows only on objects that needed them, like buildings and nearby characters. Any time you can flip an expensive feature from opt-out to opt-in like that, it can end up being a big win.
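In property terms, the trick boils down to roughly this (an illustrative sketch, not the actual project setup; the names are the stock UDirectionalLightComponent / UPrimitiveComponent properties, the values are made up):

#include "Components/DirectionalLightComponent.h"
#include "Components/StaticMeshComponent.h"

// Illustrative only: the far-shadow trick described above.
void SetUpNearFarShadow(UDirectionalLightComponent* Sun, UStaticMeshComponent* ImportantMesh)
{
	// Turn off the normal cascaded shadow maps entirely.
	Sun->DynamicShadowDistanceMovableLight = 0.f;

	// Enable the "far" shadow and pull its distance way in, so it's
	// effectively the only whole-scene shadow that ever renders.
	Sun->FarShadowDistance = 3000.f;
	Sun->FarShadowCascadeCount = 1;
	Sun->MarkRenderStateDirty();

	// Shadow casting is now effectively opt-in per object: only flag the
	// things that actually need a shadow.
	ImportantMesh->bCastFarShadow = true;
	ImportantMesh->MarkRenderStateDirty();
}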

I also did the port of Adios to the Switch, which I wrote a post about here.

I really love the console in Source 2. I dunno why Unreal's output log and console don't show the print color specified in PrintString/log etc.

Also the "find" command that lists all cvars containing the query rules, as does just having the console draggable/resizable

probably i ship this remaster in 2024

probably

slightly revamped my Moving-From-Unity-To-Unreal website

Took some screenshots

for the sweethearts on my remake's community hub who were hoping it'd be out this year but it's not gonna be, here are those

tweaked my respawn effect a bit

goin for an abe's oddysee vibe

i will not be getting over nanite displacement for some time

Been spending a lot of time making Editor Utility Widgets

in Unreal (basically whole new elaborate editor UI and tooling using mostly Blueprint). getting a lot of mileage out of the Asset Registry now that I have bothered to get into it. FOR INSTANCE this function to return all Data Tables in the project that match a specified struct. It's real nice that a designer can just, eg, create a new loot table and fill it out and do nothing else and the game magically knows about their newly tabled loot. Or whatever it might be. Doing that took a little bit of figuring out so here is the graph
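The graph itself isn't reproduced here, but in C++ terms the equivalent is roughly this. A hedged sketch: it leans on the "RowStructure" tag that Data Tables write into the Asset Registry, and the exact GetAssetsByClass signature / tag value format shifts a little between engine versions:

#include "AssetRegistry/AssetRegistryModule.h"
#include "Engine/DataTable.h"
#include "Modules/ModuleManager.h"

// Find every DataTable asset in the project whose row struct matches RowStruct,
// without loading every table (we only load the ones whose tag matches).
TArray<UDataTable*> FindDataTablesWithRowStruct(const UScriptStruct* RowStruct)
{
	TArray<UDataTable*> Result;

	FAssetRegistryModule& AssetRegistryModule =
		FModuleManager::LoadModuleChecked<FAssetRegistryModule>("AssetRegistry");

	TArray<FAssetData> Assets;
	// UE5.1+ takes a class path; older versions take a class FName here.
	AssetRegistryModule.Get().GetAssetsByClass(UDataTable::StaticClass()->GetClassPathName(), Assets);

	for (const FAssetData& Asset : Assets)
	{
		FString RowStructureTag;
		if (Asset.GetTagValue(TEXT("RowStructure"), RowStructureTag) &&
			RowStructureTag.Contains(RowStruct->GetName()))
		{
			if (UDataTable* Table = Cast<UDataTable>(Asset.GetAsset()))
			{
				Result.Add(Table);
			}
		}
	}
	return Result;
}

The nice bit is exactly what's described above: the designer makes a new table and fills it out, and a lookup like this picks it up with zero extra registration.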

eggbug asked…

i've made a few gb studio games and i enjoyed that. but i ran out of game ideas. i can get other art project ideas easily but not games. what part of my brain do i need to exercise to come up with a good & simple game idea?

i think you need to examine why (and whether) you want to make games. a huge number of people end up in the industry who never did examine their desire to do so and are basically just unhappy. that might not be you, but thinking about what your interest actually is (and it's not just "making games") might get you on the thought-path that leads to where you don't yet realise you fancy going

jedi game tech test

just made public a video of the tech test i did one time when i applied to work on the jedi game (i got the offer! but did not accept)

https://www.youtube.com/watch?v=FZ-8mvRuO_Y


Spent 8 hours on this Jedi Survivor tech test when I was interviewing for a Technical Narrative Designer job there a while back. Got offered the job, but didn't take it. The test called for "a system to manage the narrative", plus timed lifts that are easy for level designers to set up, a trap that shoots a projectile from offscreen, etc etc.

The main thing is the dialogue. It's different every time you play, and remembers which lines it randomly chose so that it can choose different lines later that reference the previous lines. Eg: If you take falling damage and the character complains about his shins, he might mention his shins later, but he won't if he didn't.

Note:
-Some of the dialogue has random variations
-There's a chance for some different dialogue when you arrive at the temple if you died a lot on the way
-He'll only complain about his shins once
-He'll only mention feeling like he can jump the bottomless pit now if he previously remarked on the bottomless pit
-He'll only say "this again?!" on a falling death if he's previously died a falling death
-He won't complain about his sore feet if you haven't taken any falling damage
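The test itself isn't public, but the remembering part it describes is basically just a set of tags: record which line actually got picked, then gate later lines on that. A minimal sketch with invented names, not the test code:

#include "CoreMinimal.h"

// Minimal sketch of "remember which random line we picked so later lines can
// reference it". All names here are invented for illustration.
struct FNarrativeMemory
{
	TSet<FName> SpokenTags;

	// Record that a line with this tag was actually said.
	void MarkSpoken(FName Tag) { SpokenTags.Add(Tag); }

	// Only allow a line if everything it calls back to was actually said.
	bool CanSay(const TArray<FName>& RequiredTags) const
	{
		for (const FName& Tag : RequiredTags)
		{
			if (!SpokenTags.Contains(Tag))
			{
				return false;
			}
		}
		return true;
	}
};

// Usage: if the shins complaint won the falling-damage roll earlier,
// Memory.MarkSpoken("ShinsComplaint") was called, so a later line that
// requires {"ShinsComplaint"} can play; otherwise it's skipped.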

started makin a little game about hackin stuff

bein a cool little hacker guy

https://youtu.be/UaM1eg4p5fk

Dev Scoops: Simulating Eyes in Half-Life 2

[images: the Gman's eyeballs in 2003; my eyes at some point around 2017]

Years ago, I was making a stealth game in Unreal that I've since had to cancel amid circumstances beyond my control. At one point I was trying to make my characters' eyes nice, and the gold standard for that was (and arguably still is) Half-Life 2.

HL2 takes a novel approach: the eyes are not rotating sphere meshes with bones, they’re more-or-less flat planes with a shader on ‘em that makes 'em look like balls and points the iris/pupil where you tell it. The eye "plane" can be stretched as the eyelids open or close without affecting the visual, and you don't get any mesh intersection issues (which is why you've never seen the gman's eyeball push through his eyelid even though Gmod exists) or the uncanny appearance of rotating "with the head".

To get this right, I asked Valve's Ken Birdwell about how they got such a good sense of eye contact with this shader back in 2003. Here's the scoop:


In HL2 and other games, there are three main textures.

There’s the sclera texture which is textured and shaded independently. It has some hand painted darkening around the edges to simulate self shadowing from the eyelids, as well as a shader to do simulated subsurface scattering of light and a clamping cone to reject light sources that would pass through the skull. There’s better ways to do this now, but in olden days there weren’t.

The second texture is the iris texture, which is literally just a planar projection onto a sphere. The underlying visible “eyeball” vertices will need to be spherical, and you’ll need to know the original size of the eyeball, and the size of the iris texture, or more accurately the area of the iris texture that actually contains the pixels for the iris. Once you do that, you’re half way done.

The next tricky bit is to simulate the cornea bulge with a third texture. We originally did this by creating a simple run-time texture that mapped each light source into a point on a simulated “sclera ball” and cornea (partial sphere about half the radius of the eye, offset a bit) using a super simple ray tracer. This is about 90% of what makes “eyes” work. You don’t cue off the reflections on the sphere of the eye or even the iris, you mostly cue off the reflections off the cornea bulge to judge view direction, and their subtle differences between the ball and cornea, and differences between both eyes.

Do all that, and make sure after placement the iris/cornea is about 4-5% wall-eyed (the fovea is offset from centerline of the cornea axis), and suddenly you’ll make “eye contact”.

All the numbers for this can be found in any basic eye anatomy book, and don’t worry about eye twist (your eyes slightly spin when you look around due to how the muscles are attached, but it’s not human perceivable).

I think an example for this code might still exist in the SDK, maybe in hlmv? I know eventually it all got replaced with a fancy shader that does it all on one pass, but the HL2 era version didn’t and the code might all still be there.

If you really want to get fancy, then you’ll want to do a geometry shader and let the cornea bulge deform the eyelids. That, and play with pupil size and they’ll be alive! It all just depends on how close you want to get to the character, and how much CPU/GPU you’re willing to spend.

On the Unreal end, I made an EyeComponent that any actor with this style of eye uses to manage their gaze and convergence, and it all worked out pretty well, although I never shipped anything using this method. I did also end up simulating pupil dilation and cornea bump!

Click that dang ol' Dev Scoops tag for more Dev Scoops, if you want!

pretty game

game dev post: Porting Adios To The Nintendo Switch Entertainment System

Couple years ago, I handled the port of Adios, the game about the pig farmer who doesn't want to dispose of corpses for the mob anymore, to the Switch. It took a week. The Switch is weak as piss, so this is always a little bit of a pain, but the port ended up looking visually pretty good and runs at a solid 30fps, usually at the full 720p in handheld mode. Sometimes it has to dynamic-res to a lower res, but it's usually not noticeable thanks to Unreal's temporal upsampling.


I really wanted to hit 60fps on the Switch, but ultimately the necessary visual sacrifices wouldn't have been worth it on a game where you aren't twitch-reacting to anything and where the nice art is a lot of the vibe, and the vibes are a lot of the whole thing. Responsiveness did benefit a lot from Unreal's Low Latency Frame Syncing, and much as some folks hate it, enabling motion blur helped a ton with keeping camera rotation from feeling juddery at that awful 30fps.

The worst thing to deal with was memory. It's a small game, but everything's in one pretty big level, and it's statically lit, and there are multiple times of day, so lightmaps are a substantial memory cost. On the other hand, it wouldn't have been worth lighting it dynamically, because the static lighting also carries a lot of vibe, and saves a lot of render time. The lightmaps did need to be downrezzed a lot though, which fortunately you can do in Unreal without having to rebake the lighting.

Every texture in the game was also drastically reduced in resolution, which again, is easy to do per-platform on Unreal, but I did have to spend a lot of time assigning texturegroups so that it wasn't done indiscriminately - landscape textures and characters stayed reasonably high res at 512, small inconsequential objects got as low as 64, more important objects were more like 256, and any normal maps were crunched even harder (since they mainly informed the static lighting at build time).

A lot of memory and GPU perf was also clawed back by mass-LODing every single mesh in the game, which sounds worse than it is; I have a tutorial about it. I also removed convex collision from every mesh, since that too costs memory, and collided per-poly with the simplest LODs instead.

I also ended up using Unreal's extremely dated Precomputed Visibility System to squeeze the most perf out on the Switch. This is an old system made specifically for Infinity Blade a long time ago, which will have had a similar set of issues.

I did some other work on Adios before it came out for PC/Xbox, mostly dejanking. At the time it had no real lipsync - just some facial expressions and looping "I am talking" anims. I had no time to do a real lipsync (or you know I would), but I got the jaw flexing with the wav amplitude, which makes a massive difference, and it's still a little more sophisticated than Half-Life 1's lipsync because that "I am talking" anim is still playing and giving us some flex in the lips, even if it's functionally random.

Another eleventh-hour dejank was that when the character you spend the whole game with interacts with anything, he teleports into position, then plays an animation. This looked pretty rough, but there wasn't time to make any real changes to how it worked, so I just smoothed it out by interpolating his mesh's transform on tick, from what it was last tick to the desired position. He still teleports, but now his body takes a second to catch up.
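Roughly the shape of that catch-up trick, as a sketch rather than the actual Adios code (the character class and the bCatchingUp / MeshTargetOffset members are invented for illustration):

#include "GameFramework/Character.h"

// AFarmerCharacter, bCatchingUp and MeshTargetOffset are hypothetical.
// The actor has already snapped to the interaction spot; each tick, ease the
// visible mesh from wherever it is back toward its normal offset.
void AFarmerCharacter::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	if (bCatchingUp)
	{
		const FVector Target = GetActorLocation() + MeshTargetOffset;
		const FVector Smoothed = FMath::VInterpTo(
			GetMesh()->GetComponentLocation(), Target, DeltaSeconds, 8.f);
		GetMesh()->SetWorldLocation(Smoothed);

		if (FVector::DistSquared(Smoothed, Target) < 1.f)
		{
			bCatchingUp = false;
		}
	}
}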

If there'd been time/money for it, the game would have benefited from a lot of character tech-anim stuff that would have been fun to do: better eyes, facial anim, lipsync, proper blending and layering - but I think the character's "performance" still comes over surprisingly well.

ughiguessiwanttomovefromunitytounreal.com

Decided to do something useful: whipped up a quick Internet Site for all the Unity folks who now want to learn Unreal and don't know where (or whether) to start because it's daunting as hell.

www.ughiguessiwanttomovefromunitytounreal.com

my unreal engine 5 cohost app

now shows shares correctly and also comment counts, usernames, avatars, dates and times, etc. i wanna make it pull cohost/tumblr/masto/bluesky posts in one timeline and also let me multipost.

cohost ue5

images, tags, comment counts, asks. no youtube embeds or gifs (lmao how the fuck am i gonna do those)

game dev post: the Class Viewer in unreal

I'm always telling people to turn on the Class Viewer in Unreal so now I have a video about what that is and why you should, just to save myself typing paragraphs about it to everyone I meet

https://youtu.be/72omGTlkfG8

Yall wanna see something fucking stupid

[archived]

more diving through the air

https://www.youtube.com/watch?v=1wwgByozbw0&feature=youtu.be

I'm really very psyched with this for ~a day's work. Character animation has always been pretty intimidating to me but I think this looks good

#gamedev#animation#UE5#unreal engine
[archived]

firing a gun whilst flying through the air

more diving through the air

https://www.youtube.com/watch?v=1wwgByozbw0&feature=youtu.be

I'm really very psyched with this for ~a day's work. Character animation has always been pretty intimidating to me but I think this looks good

[archived]

not an animator

but i made an animation today

#animation#gamedev#unreal engine
[archived]

i did another one!

now if i do a left and right one (and make sure they are the same duration/distance travelled) i can make a blendspace and have a stew goin

i've really almost never animated before, being able to do it inside unreal rather than use some dogshit-or-i-am-dogshit-at-using-it other software is entirely responsible for my bothering at all

#animation#gamedev#Max Payne#unreal engine
[archived]

a blendspace

[archived]

not an animator

but i made an animation today

#animation#gamedev#unreal engine
[archived]

i did another one!

now if i do a left and right one (and make sure they are the same duration/distance travelled) i can make a blendspace and have a stew goin

i've really almost never animated before, being able to do it inside unreal rather than use some dogshit-or-i-am-dogshit-at-using-it other software is entirely responsible for my bothering at all

not an animator

but i made an animation today

still kinda blown away by this metahuman bullshit

shadowplay is the most useful game dev tool of all time. unexpected bug? i have it on video. not sure if i did what i thought i did 5 minutes ago? i have it on video. accidentally dismissed something i should have read? i have it on video. amazing.

dev scoops: Max Payne Kung Fu Edition

The most impressive video game mod of all time is for certain Max Payne: Kung Fu Edition, which, despite being by far the most accomplished Payne mod code-wise and having a ton of custom animations high quality enough to fit in with the rest of Max Payne, was developed by one guy, Kenneth Yeung. Here's some gameplay I recorded just now:

https://youtu.be/YGkFDvx6FQM

There were a lot of Max Payne mods, but this was the only one that really reached farther than custom weapons, levels and characters - not many had custom animations or code. This had far more developed melee combat, combos, a leveling-up system, wall jumps and wall runs, and different "shootdodge" moves when you used it standing vs running. Nobody else had figured out how to do any of that stuff! And besides being an incredible achievement, it also just plays really well, even today - it and Oni will be my main touchstones if I ever get to make the melee-heavy shooter of my dreams.

A while back I tweeted at Kenneth about a vague memory I had of something he'd done to get beyond the limits of Max Payne's modding tools:


me:

Hey man, I have a vague memory of a Kung Fu mod dev anecdote of yours where...something like, you weren't able to do a trace for your wall run in the MP1 SDK, so you used a bullet? Wondering if that rings any bells for ya

Kenneth Yeung:

Wow, you have an incredible memory of 20 years ago! And yes you're spot on, let me take a trip down Max Payne modding memory lane...

The game didn't have any sort of special event triggers for player collisions with arbitrary walls, but I noticed that bullets COULD differentiate between different collision types (ie they could trigger a different event when hitting a character vs hitting level geometry). So in the player's sideways jump animation, I shot tons of invisible and extremely short-lived bullets sideways out of your hip, which would spawn a special type of explosion only if they hit a wall.

That explosion only affected the player, and if the player ever took damage from that explosion type it would immediately kick you into a wall run animation. There were multiple versions of these special bullets+explosions, one for each wall run and wall jump direction.

A similar thing was used for the player's attack combo animations. There were invisible bullets flying out of your fists and feet, looking for collisions with enemy bodies in order to trigger the next animation in the attack combo.

I eventually ran out of new animations you were allowed to add to a character, which is when I stopped adding more moves. But that's probably a good thing, otherwise I might have been working on the kung fu mod for 10 more years.

Here's the trailer from back in the day, which had me hyped as hell - note the Jet Li and Jackie Chan models, which didn't end up coming with the mod (and you can barely tell who they are here, but the screenshots on the mod's website looked really good).

https://www.youtube.com/watch?v=CnegCg638N4

Also, Remedy added an homage to the mod as an easter egg in Max Payne 2, which is lovely:

Half-Life Alyx's Level Design Tools: They Are So Great

On Twitter once I did a big thread where I went through Half-Life Alyx's SDK - so this is Source 2 - and had a big ol' yarn about 'em. That thread's turned into a useful resource, so here it is in Cohost Post Form.


Gonna look at Half-Life: Alyx's SDK and tweet about it a little. If you don't know already, I've historically had a big old axe to grind with modern level design tools! Still do. Neither Unity nor Unreal has level design tools worth mentioning built-in. This looks more promising.

Of course, Valve's not Unity or Epic - Valve just makes games; they're not big into licensing. These are internal tools, so there's gonna be jank. It's part of why actually devving on this engine would be a bad idea. But we sure can steal their awesome tool ideas!

An initial interesting thing is that we boot up straight into an Asset Browser, not a level editor. It doesn't assume we want to deal with maps, which is interesting, and reminds me of the days when UnrealEd had an Unrealscript code editor built right in. First thing I'm gonna open is Hammer though.

Already this is hot as hell, UI-wise. We have a lot of uncaptioned buttons that we probably mouseover the first time to find out what they do, and after that they're always one click away - we're not slowing our users down with submenus just so we can fit a caption in. Refreshing!

We've also got geo editing modes front and center, which bodes extremely well. Often level designers aren't able to edit geo in level editors at all, or if they can it's real clumsy. Isn't that wild?? Level design? Sort of important? Gotta iterate? Can't usually do it anymore? Wild. We've also got a vertical toolbar super similar to the old Hammer/Worldcraft one, which was always great. For reference, in Unity and Unreal, if you can reach this stuff at all it's behind many tabs and dropdowns. More clicks, more time, more frustration, less iteration, bad.

Yeah, this is really nice. First time opening the editor! Sure, I'm making trash, but look what I can do! I can drag stuff out and make changes to it super fast, which is great for level design, but also it's more powerful modelling-functionality-wise than something like ProBuilder.

This is the most functionality I've ever seen in a level editing tool, and so far it all operates more intuitively than the same operations in any modeling program I've used (Max, Maya, etc). And I'm barely into this yet!

Here's something I always point out to people making editors: in Hammer (1 and 2), if you select something or drag something out, the dimensions are right there. Nothing else does this. UE4 has a ruler, cool. Why does it not just draw these numbers on whatever I select?

Look at these UV tools. A whole bunch o' buttons for each of Align, Scale, Shift, Rotate, Fit, and crucially Justify. Hammer 1 had these, but this is a better UI. Nobody else bothers with this at all. If they do, there are at least twice as many clicks involved, which defeats the purpose.

I'm sure that Very Many of Valve's systems and tools are janky and terrible compared to Unreal's, but these level design tools? I am literally seeing one good design choice and tripping over four others on the way to tweet about it. I am drowning in The Good Shit.

https://youtu.be/pBUddmjob20

Hotspot materials! An incredible feature! It's extremely good! We all would have had stuff like this a decade ago if AAA had even attempted to reconcile increased graphical fidelity with the needs of level design, rather than throwing LD under the Env Art Bus. This little house was made with just hammer geometry and hotspot materials. Looks fuckin' great.


These are all set up in something called the subrect editor, which is dead simple and cool. Like, you can make your own ones of these so easy.

Time to look into something called Tile Meshes which I've also been hearing about for a while. They sound pretty sick.

https://youtu.be/CzCDCPPIKF8

Oh I see, it's sorta like hotspots but for meshes + a bit jankier. You start with a quad, and if it matches the dimensions of any of the meshes in the tileset it picks one, also choosing the right mesh for corners. I kind of want the meshes to scale to span the distance rather than leaving gaps, though.

It's incredibly pleasing that the folks at Valve have even been building their tools from this angle. For years, folks just got artists to churn out dozens and dozens of bespoke and/or modular meshes for mundane shit like this, and all it did was cost money and stop LDs from working. These tools are amazing and everything, but none of it is a tech advancement. We could have had this ages ago. We didn't because of people just resolutely not giving a shit about the field of level design, including most people who make in-editor geometry tools. For programmers, modeling tools inside game engines seem to be an interesting challenge for whatever reason; making those tools useful for level design seems not to be.

ProBuilder, for instance, is barely better than nothing for level design, but it's not interested in level design. It just wants to exist for its own sake. And if your in-engine geo tools aren't useful for level design, they're good for nothing at all - artists aren't ever going to use them.

There's also the issue that it's programmers making these tools, and programmers are the people who use tools the least. So the only people able to make these things are the people least qualified to do it. Unless your coders are like, multidisciplinary or interested and empathetic. If you're making level design or geometry tools for game dev? Ape Valve, or make bad tools. Them's your choices at this point.

Here are some fullbright screenshots of Half-Life Alyx maps made largely out of Hammer geo:


And here's some with no props, only geo made in Hammer (no tilemeshes either, so of the stuff that's hidden, not much of it was actually bespoke for the scene). In any other current engine, this would be unfeasible to have a level designer do. Someone whipped up these concrete hallways, with their grates and trim and worn edges, in no time at all, without an artist being involved.




This thread has had eyeballs on it from Valve, Epic and Unity, but I'm not too optimistic about it having an impact - engineers' ability to stare directly at this problem without seeing it has held for 15 years and might easily hold for 15 more. Still, can't hurt!

Someone suggested that this might be less of a problem in Unreal if Epic were still doing linear games, and it's actually interesting that I don't think that would do it. After having had great LD tools in UE1 and 2, Epic actually sort of led the charge away from them around UE3. Their in-editor geo tools got worse from there, and kept getting worse into UE4 as support gradually went away entirely. Things like Gears of War wanted the extra fidelity you could get from making maps out of expensive Lego pieces versus their existing LD tools, and they could afford the pieces. It's a worse workflow, but it "works" if you can afford it.

Of course, it doesn't work enough to hold a candle to the work on display in games that licensed UE1 or 2, and it doesn't work at all if you're not extremely cashed up. And bit by bit, you push your level designers away from their levels, and start designing the game itself around that absence.

Unreal has been the engine to chase for a very long time, nothing has ever seriously competed with it, so these decisions have rippled out. If you used it, which a ton of AAA did, it was make levels the Epic way or make a lot of work for yourself that you might not be equipped for, and the effects of that flow on forever to affect indie development and everything else.

So dogfooding works for a lot of things, and it's a lot of why Unreal is an amazing engine, but you'd need more than that for this. You'd almost have to build some massive project around understanding the things that have been lost. Feels like Hammer 2 was that.

In summary, let level designers make levels and we'll just call you when we want the rest of an owl

modular meshes 101

in fairness to the Epic Games engineers who i cut so little slack, this particular thing is pretty much over now - it's trivial to cut or bend a mesh like this into the shape you require

will ya though? will ya fuck

A while back I had a quick go at modding Doom 3 to turn this cutscene into a half-life style scripted sequence. Like most of the cutscenes in Doom 3, it takes control of the camera to show you something that's happening right in front of you anyway. It's a bit broken here, but I always think Doom 3 needs a halflifeification mod. Fix all the cutscenes to not be cutscenes and do the HL1 follow/unfollow thing with the scientists and security guards. Would elevate the whole thing

https://www.youtube.com/watch?v=CMzF3ZghRmc

Dev Scoops: XCOM/2's art direction

xcom 1 xcom 1 xcom 1 xcom 2

I had a moan about XCOM 1 being prettier than XCOM 2 the other day and got some nice replies from the Art Director of both, Greg Foertsch.1

Having replayed both again just lately, it's striking to me how much more interesting 1 is visually. It uses baked lighting (Unreal's Lightmass, first seen in Gears 2) and gets lovely results; 2 doesn't, and gets no indirect lighting.

XCOM 2 also has less of the nice cartoony style of 1, going for something closer to generically modern - lots of clean shiny surfaces with screenspace reflections (not part of UE3, so done custom for XCOM 2 or backported from UE4), very flat, very sparse. The best-looking areas (forests, etc) break the visual up with too much noise, which feels like an occasional bandaid on the lack of global illumination - GI gives the scene some amount of "free" variation on top of just being very pleasant.

My assumption is that 2 ditched static lighting in favour of more semi-procgen level assembly and swappable time of day and stuff, which you can still statically light, but with a lot of extra hassle. Personally, it's hard for me to see it as worth it without any other GI subbing in.

As I said on Twitter, though, I'll probably be in the minority in caring what this type of game looks like at all. Which is where Greg came in:

Greg Foertsch:

I appreciate that minority. The baked lighting in XCOM:EU served it well and along with a lot of familiar environments, really resonated with players. Your assessment of the games is correct. The design problems in X2 we were trying to tackle dictated some of the decisions that were made. XCOM:EU will always be my favorite of the two. Lots of good insights in your comments. I could talk about this stuff all day

And in response to someone else, about the dynamic lighting:

Correct. The addition of procedural levels and dynamic lighting together had a significant impact on my approach to the art direction

Greg seems cool, thanks Greg


  1. (i like to document these sorts of twitter interactions on cohost now because i think cohost will actually tell me before it shuts off for good)

you should never have uninstalled it

Speaking of Oni, I got ahold of a bunch of its animations (thanks to someone on the Oni Central discord) and recreated Oni's combo system in Unreal.

https://youtu.be/Yz4n44YLYMM

Here's the twitter thread I did at the time, which I will probably turn into a cohost post at some point soon.

gotta ask yourself the question,

https://www.youtube.com/watch?v=U8OgaV6Lu94

(I made a video to demonstrate some basic brush operations/how good Hammer is to somebody who makes tools at Epic Games)

eggbug asked…

I'm sure you've talked about it previously, but can you elaborate on why in your opinion "modular meshes [...] don't let you do level design"?

Here is what Orwell has to say on the matter.

And here are some videos I made many years ago about it all.

https://youtu.be/xOBEy-zIotE

https://youtu.be/6ubu76gEvM8

EDIT: I made another one, god help me. This one is just to demonstrate some basic brush operations/how good Hammer is to somebody who makes tools at Epic Games. It was pretty fruitless in that discussion, but LDs may find it cathartic

https://www.youtube.com/watch?v=U8OgaV6Lu94

EDIT: and here's another video I forgot about that's got some better ways to do things using the third party plugin MeshTool and the UE5 tool CubeGrid.

https://www.youtube.com/watch?v=e7u5tqOCMk8

eggbug asked…

say a thought about Dishonored. maybe multiple

Dishonored rules, obviously. Here are some selected thoughts I have had about Dishonored.

  • It is insane how well they migrated Dishonored 1's systems from UE3 to id tech for Dishonored 2. Everything feels the same, except D1's cool camera animations for locomotion didn't make it across, which is a bummer but a small one.

  • Dishonored has some of the best level design around. I don't think they did anything crazy tools-wise except use brushes, which probably they were used to coming from Source. This was at a time where most people had switched to modular meshes, which, yeah, don't let you do level design. Folks think they do but they don't.

  • It's interesting replaying the games (which I've recently been doing) now that I've known and worked with Raf Colantonio and some of the rest of the team. They make immersive sims the only way that really works - systems-first, "design" second, with a big leap of faith that your systems, once finished, are going to output a good time. Raf said once that they only "found the fun" on Dishonored like a few months before ship, which is wild but also just what has to happen, I think, to make an imsim.

  • Dishonored's expansions are the best expansions there are. I always want this type of thing - an immediate followup from a dev who, having just shipped almost this exact thing, is now incredible at it. Everyone should always do this. The best Half-Life 1 chapter is Uplink, the standalone Half-Life demo Valve made right after they shipped Half-Life. It's better than anything in the main game. A friend at Valve at that time told me that a bunch of them, right after HL1, just wanted to roll straight onto HL2 on all the same tech, but there was a "fancy new technology" mandate from on high. I want to visit the parallel universe where they were permitted to do that, instead of condemned to dev hell for years.

  • Dishonored impresses the hell out of me narratively for a lot of reasons, but my favourite is: It has a character who's basically the Gman, the Cigarette Smoking Man; famously a trope that if you attempt to resolve it you'll fuck it up and make the whole setting retroactively less interesting. In Death of the Outsider, they resolve it without fucking it up. Amazing.

If you could wave a magic wand and have your ideal video game studio to work in, what would that look like? How big, what roles, and what would your job be?

I think all I want really is to be given a decent amount of free rein to do what I'm good at in service of someone else's cool vision (because I usually can't be fucked having a vision, and I don't want one if I'm not in charge of it), which turns out to be a taller order than I would have really thought at any company that's more than a few people (which will surprise zero people who have worked in AAA, but I'm in Australia so I more or less haven't). I'm usually credited as a Technical Designer because that's the nice desirable vague title for "does lots of shit that's both technical and designey" and I guess that's fine.

I'm happiest when I'm making my own stuff, but barring that, when I'm building tools and systems and workflows to empower someone else (writers/designers mainly) to be able to insert their work into the game unilaterally. I think any time someone creative has to get their work into the game via someone technical, it's a terrible failure on the engineering side, albeit not always an avoidable one. I'm least happy when someone is putting up arbitrary roadblocks that stop me from doing what I know is the right thing, which happens all the time the closer you get to AAA, either because someone is actively recalcitrant or because the company is too roles-based to know what to do with the input of a generalist.

So basically I want to work for an indie that's small enough that I can basically be a tech director, which is what I functionally am anyway half the time, but cashed up enough that I don't take a pay cut doing it. Make it happen Johnnemann

cassie @porglezomp asked…

I remember I originally followed you on Twitter for level design tool review stuff. Do you have opinions on level design tools that should exist but don’t? I always want to hear about tools

i have many such opinions but i’ve spent so much time expressing them over the years that i’ve somewhat burned out on it. being extremely correct is not in itself very satisfying

i would say though that there has been progress, of a kind, on this, on the unreal side. at some point they made a tool called CubeGrid which is included in UE5 (and i think late UE4) and it’s an extremely good blocking-out tool. the modeling tools inside ue5 have pretty bad ux, but they’re very powerful and they’re increasingly exposing everything you need to make your own editor tools using that tech, which i think is probably going to be the way of things in the future - it won’t be long now until you can build trenchbroom inside ue5, and have it handle all the mesh admin on its own, and sell that on the marketplace and the whole issue will have been addressed. maybe it will be me who does that. probably it won’t

Dev Scoops: Max Payne 2's character shadows

A while ago I tweeted this about Max Payne 2's really nice shadow. This was 2003, remember.

In Max Payne 2, Max only ever casts one shadow, so it just moves around based on - I guess? - averaging the relevant lights. It's a nice shadow though. A lot of UE3 games did this too, I think including Batman AA?

Luckily, a couple of Remedy folks who worked on the game saw it!

Petri Häkkinen:

Hey, that looks familiar! I wrote the shadow rendering code for Max Payne 2. :) It’s basically a CPU decal with 2 shadow maps, hard and soft, blended together. Direction & intensity is sampled from the radiosity lightmaps. @jaakkolehtinen wrote the GI system.

Jaakko Lehtinen:

It’s a shame we never talked about this, or the (what I still think is cool) distributed radiosity solver that scales well with scene complexity by breaking things down with portals and mediating radiance between “rooms” with 4D light fields in a 2-level iterative manner. It’s based on a 1st order SH irradiance volume, which is equivalent to an ambient term (DC) and two directional lights, positive & negative, from opposite directions (linear terms). Shadow direction and strength come from the latter: in a uniformly lit spot, the shadow fades away.
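(To unpack the last bit a little: the three linear SH coefficients form a vector pointing towards the dominant light, and how long that vector is relative to the ambient term tells you how directional the lighting is at that spot. Here's my own back-of-envelope version of reading a shadow out of that - not Remedy's code, and the SH conventions are hand-waved:)

CPP
// My reading of "shadow direction and strength come from the linear terms" -
// a sketch, not Remedy's code. dc is the ambient (DC) coefficient, lin* are
// the three first-order (linear) coefficients sampled at Max's position.
#include <cmath>

struct ShadowInfo { float dirX, dirY, dirZ; float strength; };

ShadowInfo ShadowFromFirstOrderSH(float dc, float linX, float linY, float linZ)
{
    const float len = std::sqrt(linX * linX + linY * linY + linZ * linZ);
    ShadowInfo out{0.f, 0.f, -1.f, 0.f};
    if (len > 1e-6f && dc > 1e-6f)
    {
        // The shadow is cast away from the dominant light direction...
        out.dirX = -linX / len;
        out.dirY = -linY / len;
        out.dirZ = -linZ / len;
        // ...and fades out where the lighting is uniform (linear terms near 0).
        out.strength = std::fmin(len / dc, 1.0f);
    }
    return out;
}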

EDIT: After I posted this very post you're reading, someone else chimed in!

Peter Hajba:

So, what Jaakko was saying there was that the shadow (light) direction is baked into every point in the rooms where the light is rendered on, so you didn't need to cast any rays to cast the shadow, just look up the direction value from the spot Max Payne was standing on.
Radiosity rendering is pretty neat. Basically it goes onto each point on a surface, looks around to see if there are any lights visible, and then decides if that spot is lit. That's the first pass. Then in the second pass the renderer also sees the lit surfaces with color.
So if there are strongly coloured surfaces, the light applied to each point gets coloured by that. Then you run a third pass and the light bounces a third time, and more, until you have beautiful baked lighting.
We had a little distributed render farm at the Remedy office. Whenever we let our work computers idle, they would start calculating Radiosity lighting on the Max Payne levels.

Thanks Remedy gang!

answered someone's unreal question just now with a visual aid

you can have it too, as a gift

I just made a cool new tool

This UE5 tool automatically creates and assigns textures and material instances for you to paint to with the Texture Painting Tool, meaning you can, e.g., have a dirt mask in your material and paint to it right from the viewport, and paint different dirt per instance of your mesh.

This is extra useful because Nanite meshes can't have vertex color per-instance right now, but they can do this.

Unfortunately I then found out the Texture Painting Tool is broken right now, making this all but useless until they fix it

But it only took me an hour or so, so whatever

i've been mesh-to-metahumaning old game characters.

behold: epic metapaul and sam metafisher

Feel free to join my Discord game dev server, link in bio, if you are rad. No fuckheads. Here are some nice things people have said about it

How do you feel about abstraction vs models for game mechanics? e.g. having a realistic system for aiming a shot vs "roll to hit"

It's pretty situational I guess, but I usually enjoy things less the more abstracted or board-game-rulesy they are, especially in a real-time context. But the less that stuff is exposed for what it is, the less I mind it - I probably don't mind having a 10% chance of headshotting a guy, but I do mind knowing about it, versus just taking the shot if it feels like I can make it.

Half-Life 2 does an interesting thing nobody knows about, where bullets fired by NPCs appear to just be sprayed out, but actually are doing a roll on whether they hit the target or not, based on a value that represents the character's aptitude with the weapon they're holding, which is tabled somewhere - eg, cops might be worse at using shotguns than soldiers are. This seems a little overengineered for what HL2 ended up being, and functionally I don't think it really ends up executing (HL2 gun-wielders are pretty much turrets in the end) but it also hooks into another cool system where if a bullet has rolled a miss, it tries instead to hit something interesting near the target, like a physics object.
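The shape of that, sketched out (this isn't Valve's code - the proficiency table and all the names are made up):

CPP
// A minimal sketch of the idea: accuracy is a property of the shooter, not
// the gun, and a shot that rolls a miss gets redirected at something
// interesting near the target instead of at nothing.
#include <cstddef>
#include <cstdlib>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// hitChance would come from a per-character, per-weapon proficiency table -
// e.g. cops worse with shotguns than soldiers are.
Vec3 PickBulletTarget(float hitChance,
                      const Vec3& intendedTarget,
                      const std::vector<Vec3>& interestingThingsNearTarget)
{
    const float roll = static_cast<float>(std::rand()) / static_cast<float>(RAND_MAX);

    if (roll <= hitChance || interestingThingsNearTarget.empty())
        return intendedTarget; // a genuine attempt on the target

    // Rolled a miss: rather than spraying at nothing, pick a nearby prop,
    // explosive barrel, window, etc. so the player sees the "miss" do something.
    const std::size_t pick = std::rand() % interestingThingsNearTarget.size();
    return interestingThingsNearTarget[pick];
}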

Only vaguely related to the question, but here are some old thoughts on ways to make hitscan weapons feel cool and fair in an FPS. But the HL2 thing seems in line with this - considering accuracy to be a person value, not a gun value.

hell yeah tim sweeney should open source unreal engines 1 and 2

ETPC @ETPC asked…

should tim sweeny open source unreal engine 1 and 2

absolutely. i gather the reason this doesn't happen (at least for 1) is a valve-style "it would require a fair bit of work nobody ever feels like doing". but it would fucking rule

i feel like it's not acknowledged enough that unreal engine 4/5 is in the weird position of being the only commercially available game engine, in the sense of what a "game engine" traditionally was: tools and tech emerging from game development, not just for game development. it's not a good state of affairs

i'm not real into it myself, but there are folks who specifically want to make mega-lightweight games that run on anything and look like they're from 2001 or earlier, and it sucks that the tools from that time still exist but aren't available. there's no good reason you can't take the tools that made Deus Ex, which were great tools, and do things that would have made a pentium 3 die on the spot, in a shipping game, on steam.

some of the doom/quake engines are open source, but tools-wise those never were as good as unreal, because epic was already a licensing-first company back then, actively trying to build tools for licensees to use and be happy with. unreal 1 would be empowering for indie developers in ways that things like unity and godot never quite can be, and all it would have to be is available. doesn't seem like all that tall an order

game dev post: Weird West and the Case of the Cursed Strawberry. cw: unsatisfying conclusion

a meme inspired by this tale, from weird west player "kida"

So in this game Weird West, you can cut foliage. Like if you've got a machete or something you can cut away grass or bushes or whatever, and some of them drop stuff. I did a post about some of that stuff. A little while after ship, I got a bug assigned to me where sometimes, a player will be in a certain level, cutting up foliage, and suddenly they can't interact with certain objects. Most objects. Including the travel zones that let you leave the level. So they're just stuck forever.


Debugging the interaction code doesn't really give me anything, it basically just says there's nothing there. After a while it turns out that this happens only on levels with strawberry bushes, if the player cuts the strawberry bushes. And not every time! But a lot of the time.

So my first thought is, something is going wrong with removing these foliage strawberry-bush instances, probably a weird collision thing. Maybe the collision for the strawberry bush instance sticks around after the instance is removed, with an invalid transform, and becomes an invisible volume blocking the interaction detection traces. OR SOMETHING. So I remove collision from the strawberry bush (unused anyway).

No good, still happens. Try various physics/collision visualisation viewmodes, nothing appears to change when this issue presents itself. In fact, I can delete the whole InstancedFoliageActor and it still happens. I unhook various parts of the foliage-removal code and it still happens.

One of the many things I can't interact with is the strawberries, but if I go into the editor after it spawns, and drag the strawberry a few meters away from where the bush was, suddenly I can grab it. This tricks me into still thinking it's to do with the bush I cut, and I waste more time.

So far I haven't really considered the strawberries themselves being a problem, because strawberries can be found in other places also, and they're fine, and when I cut the bush, the strawberries fall to the ground and roll around fine, and about four out of five times, the bug never even happens. Sometimes you can cut up every bush in the map and you're fine.

After a few more hours of investigating this, I go into the editor while playing, and try to drag a strawberry like I've been doing, and it doesn't move. The other strawberries I can throw around fine, but this one is stuck. This is the Cursed Strawberry.

https://youtu.be/xHNy5iE0oKc

Every time I cut a strawberry plant there's roughly a 1 in 5 chance of one of the strawberries it creates being cursed. Sometimes you cut the whole grove away and you're fine, sometimes, a cursed strawberry. If I delete the strawberry, the bug goes away, interaction works, the player can leave.

Knowing there's a cursed strawberry doesn't really help though, because it's identical in every way to every other strawberry in the game, except it breaks the game, and also doesn't fall to the ground or move or anything, even as the next strawberry over behaves normally.

But then it turns out, if I manually change its location, it's fixed. The strawberry falls to the ground and stops messing everything up and I can eat it. Although now I don't even want to eat it.

So my "fix" is, 1 second after any strawberry spawns, it teleports itself 1cm up, and that lifts the curse. So if you notice strawberries bounce a bit after they spawn, that's not a bounce, that's a stupid hack to make the game not break.
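In Unreal C++ the whole fix amounts to something like this (the real one was a couple of Blueprint nodes, and a proper version would also check the actor still exists when the timer fires):

CPP
#include "GameFramework/Actor.h"
#include "TimerManager.h"

// Call on any freshly spawned strawberry (or any pickup, really).
// One second later it nudges the actor 1cm upward, which lifts the curse.
static void ScheduleCurseLift(AActor* Pickup)
{
    FTimerHandle Handle;
    Pickup->GetWorldTimerManager().SetTimer(Handle, [Pickup]()
    {
        // Not a bounce - a stupid hack to make the game not break.
        Pickup->SetActorLocation(Pickup->GetActorLocation() + FVector(0.f, 0.f, 1.f));
    }, /*InRate=*/1.0f, /*bLoop=*/false);
}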

I was incredibly thrown by certain objects being interactable, or seeming to only be interactable, when you moved them away from the cursed strawberry or the strawberry plants and shit like that. It was just random. Stacks of red herrings.

This is presumably a down-deep physx or unreal bug, and we'll never know what caused it unless someone else on here has seen it, because there really is nothing special about these strawberries. Especially compared to corn. Corn is the same!!!!! It's the same!!!!

game dev post: cooking and crafting in Weird West

More stuff I did on Weird West: the Cooking/Crafting system.

You can cook raw food over a fire or stove, use forges and tanning racks to craft weapon upgrades and armour, smelt ingots from nuggets or melt ingots back down into nuggets, etc. Arguably what I did for this ended up being a little overengineered versus the game's requirements.


Most cooking is simple - pickups have a CookedPickupClass, so, a Raw Chicken is easily cookable into a Roast Chicken in this UI. We also do this in-world - you can drop a raw chicken out of your inventory, set fire to it, and it'll become a roast chicken! But the system also supports Recipes.

For a long time, we had a janky placeholder system for cooking that I had hooked into the dialogue system - if you wanted to cook a chicken, you literally had to talk to the stove about it. When it was time to make a proper UI, it still wasn't clear how complex the system had to be. Is it always just RawItem->CookedItem? Are there different types of crafting stations? Do they overlap? Can you roast a chicken on a forge? Can one recipe have multiple ingredients? Can you only cook a pie using the cooking UI or can you get it by throwing all the ingredients in a pot like Breath of the Wild?

All that being unknown, I decided it'd be best to just support all that stuff in a data-driven way, rather than doing the minimum viable thing and risk someone being sad they can't do a "get every ingredient to make an apple pie" quest. I gather this is a pretty Arkane way to operate (WolfEye was made out of a lot of ex-Arkane folks) - a ton of Weird West was "build a system out even though we're not sure how it fits into the design yet and see what happens". In the final game, you pretty much just cook raw food into cooked food, but the system I built fully supports multiple ingredients (with a specific number of each) and multiple results - a feature only used at ship to let you upgrade guns using worse guns and metal ingots, and also for one or two easter eggs.

So this example recipe will let you use a raw chicken, an apple, a bone, a bottle of beer and some cornbread to craft 3 cans of beans and a shotgun:

Crafting stations can support multiple Types, so while it's no longer enabled in the game, you can allow a forge to also cook meat. Anyway, the cool thing about this system is the cool thing about everything data-driven: if someone wants to add a recipe, they can do it in a simple UI or a spreadsheet without adding any code or talking to a coder, and bam, the recipe is in the game. It's nice when it works out that you can take a relatively vague prompt and put together a system that handles most of the possibilities designers might end up wanting, and isn't gonna mess you up later.
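To make that concrete, a recipe row in a system like this can be as simple as the following (this is my sketch, not the actual Weird West data layout - all the names are made up):

CPP
#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "GameFramework/Actor.h"
#include "CraftingRecipe.generated.h" // hypothetical header name

USTRUCT(BlueprintType)
struct FRecipeItemStack
{
    GENERATED_BODY()

    // The pickup class, e.g. RawChicken, MetalIngot, Shotgun.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TSoftClassPtr<AActor> ItemClass;

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    int32 Count = 1;
};

USTRUCT(BlueprintType)
struct FCraftingRecipe : public FTableRowBase
{
    GENERATED_BODY()

    // Which station types can make this: Campfire, Stove, Forge, TanningRack...
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TArray<FName> StationTypes;

    // Multiple ingredients, each with a specific count.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TArray<FRecipeItemStack> Ingredients;

    // Multiple results - e.g. 3 cans of beans and a shotgun.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TArray<FRecipeItemStack> Results;
};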

The beaut ingame UI for this was designed by Julien Mario, who you might recognise from his similar work on Deathloop, and implemented by me, which was great. UI implementation is a hell of a lot more enjoyable when you have a super pro designer handling that end of it.

game dev post: Weird West dissolvin' walls

https://youtu.be/IevdRqGL9Zo

One of many things I worked on on Weird West is the hiding of walls/floors/etc between player and camera.

When I joined the project, roof-hiding and the-floors-above-you hiding were all that was in; the camera went top-down at about 80 degrees when you entered a building. I found that a bit disorienting and wanted to try to figure out a way to smoothly hide the walls, but as with my fire propagation stuff, this was complicated by the fact that a ton of levels had already been built and I didn't want to add complexity or fixup work onto other folks' tasks. So this is a wall-hiding system that doesn't know what a wall is.


Walls dissolve (using TAA dither) from ankle-height up in a plumbob shape:


Initially I was just trying to math-bodge my way out of the problems that come with using a cylinder, but this is better and was a suggestion from Mike Blackney (it's what Dead Static Drive does). The plumbob shape (which is just two dot products multiplied together) gets rid of a lot of the cases close to the player character where bits of wall off to the sides of the player would be hidden incorrectly with a cylinder. I also spheremask a small area right around the player for when you're right up against a wall and don't want to lose sight of your character.
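For anyone wanting to try the same trick, here's the gist of the mask as I understand it, written out as plain C++ rather than material nodes (my reconstruction - the shipping material also has the ankle-height cutoff and the spheremask in it):

CPP
// The plumbob mask: two dot products, one anchored at the camera and one at
// the player, multiplied together. The result is ~1 in a spindle/plumbob
// shaped region between the two points and falls off outside it.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Norm(Vec3 v) { float l = std::sqrt(Dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }

// Returns 0..1; feed this into the dither/opacity of the wall material.
float PlumbobMask(Vec3 pixelWorldPos, Vec3 cameraPos, Vec3 playerPos)
{
    // How much this pixel lies "towards the player" as seen from the camera...
    const float fromCamera = Dot(Norm(Sub(pixelWorldPos, cameraPos)),
                                 Norm(Sub(playerPos, cameraPos)));
    // ...and "towards the camera" as seen from the player.
    const float fromPlayer = Dot(Norm(Sub(pixelWorldPos, playerPos)),
                                 Norm(Sub(cameraPos, playerPos)));
    // Multiplying the two is only ~1 between the camera and the player,
    // so walls behind or beside the player don't get faded.
    return std::clamp(fromCamera * fromPlayer, 0.0f, 1.0f);
}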

We're just fading wall pixels based on where they are relative to the player and camera, so if you walk into a tight corner towards the bottom of the screen, most of the walls won't hide, which is an edge case that was deemed acceptable, especially since you have control of the camera pitch. The ideal thing, of course, would be to actually know which wall actors to hide when you're in a given room, but we don't. All this wall-fading stuff only kicks in when you're under a roof, except there are some objects that opt into doing it at all times, like trees when you're in a forest.

Everything in Weird West uses the same three or four materials, so it was easy to set up material params to opt-out of this per object; eg doors, doorframes, interactables never dissolve (unless they're on a dissolved floor of a building). You could also disable or enable it per-object instead of per-material by adding a component to that actor, which just set the appropriate material parameters on the owner. None of these wall meshes were designed with this effect in mind, so initially it looked a bit bad, but then VFX cool-guy Ewan Croft made the wall material two-sided and had the backfaces coloured black, which worked nicely and means the interior wall parts you can suddenly see just look like they're in darkness, rather than like missing polys.

The roofs hide when you're under them, but also when you're aiming under them, which was a hassle to get right consistently. If you're aiming into a building you want to see who's in there (it might be a baddie) but you also want to show the roof if you're aiming at someone on the roof, even though your aim line might be going straight through a bottom-floor window of the same building. Plus, if you do this (find which buildings we're aiming into) with a bunch of traces out in your aim direction from your aim location, moving either your character or your aim a little bit can get different results from frame to frame, and now we're flickering the roof and nobody is having a good time. Plus the windows aren't at a consistent height!

To fix the frame-to-frame issue I just rounded the player location we do all our traces from to the nearest half-meter-ish. This needed some tweaking to not just skip certain windows as you strafe past while aiming into them. If you're not aiming, I think I just ended up tracing up from the player to find the roof, and hiding the right building layers; if there is no roof I think I'm doing radial traces out in a few-meters-wide circle around the player, and looking for a nearby-enough roof to be worth hiding. If you ARE aiming, I ignore the player's location and do quite a lot of traces out in the aim direction in like a 30 degree arc, checking upwards for a roof every couple of meters out. The aim direction is rounded to the nearest 35 degrees-ish, again so that we don't flicker the roofs as we thumb the right stick.
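The quantising itself is nothing fancy - something along these lines (values approximate, function names mine):

CPP
#include "CoreMinimal.h"

// Round the trace origin to roughly the nearest half meter (50 Unreal units)
// so tiny player movements don't flip which roof the traces find each frame.
FVector QuantiseTraceOrigin(const FVector& PlayerLocation)
{
    return FVector(FMath::GridSnap(PlayerLocation.X, 50.f),
                   FMath::GridSnap(PlayerLocation.Y, 50.f),
                   PlayerLocation.Z);
}

// Same idea for the aim direction: snap the yaw to coarse steps so thumbing
// the right stick doesn't flicker the roofs on and off.
float QuantiseAimYaw(float AimYawDegrees)
{
    return FMath::GridSnap(AimYawDegrees, 35.f);
}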

The aiming system ended up having sort of a soft lock-on thing; if you're considered to be "aiming" at something it's highlighted red. That's convenient - if it happens, I just forget all this other stuff and act like the player is standing where the target is standing. So if someone is sniping you from a top level window, and you're aiming at them, you can see the whole top floor. The way layers were set up for the game means you can't easily do both "show building A's top floor" and "show building B's bottom floor", but I don't think it ever comes up.

The actual hiding-chunks-of-building stuff, as opposed to the fadey walls, was all Jouan Amate I believe - the results of the traces I mentioned go into a C++ system of his which hides stuff using Layers (an oft-forgotten Unreal feature). Characters, furniture, etc, which needs to hide along with the floor it's on, has a FloorSensingComponent, and that lets a guy suddenly become visible if he walks out onto the balcony of a hidden floor of a building, for instance. Or hide a vulture when it lands on a hidden roof. That stuff was all very clever and I was pleased not to have to do it. Thanks Jouan!

dev scoops: Fire Propagation in Weird West

Here's something I did on Weird West that I thought was cool: fire propagation across grass and foliage. Lots of stuff is flammable in the game, and there's a general elemental signal system that handles most of those interactions, but foliage needed some special handling.

Maps full of foliage had already been made using instanced foliage meshes, which are super performant but aren't Actors, so they don't have Components, which everything else uses to catch fire/get wet/electrified etc, and I didn't want LDs to have to do anything for this to work.

[image: foliage being painted using the Unreal foliage tool]

One approach is to swap each foliage instance out for an actor, either in-editor or at runtime when some firey event happens, but that's potentially pretty bad perf-wise (spawning very many actors), and doesn't account for the arbitrary density of these instances - that is, you might have a field where every blade of grass is an instance, or maybe it's in big clumps, or it's multiple foliage types so it's both, and you just want a nice even fire spread across this field. As you may now have guessed, here I cluster the instances.

[image: the Blueprint function called when something wants to interact with foliage]

When stuff burns, it calls this function, which checks for foliage in that location/radius and looks those meshes up in a data table. The table also has stuff like "what fx to play if we cut this plant with a machete" + actors to spawn on break (eg corn plant spawns corn if cut).

[image: the Foliage Data Table]

If the foliage we found is burnable/chunkable, we spawn a single actor to represent that whole radius of foliage, add meshes to the actor at the same transforms as the original instances, and delete the foliage instances, swapping a few meters of foliage instances for 1 actor. That actor can now use our Signal system for burning/getting wet/whatever, just like characters do, and starts doing this process again for any foliage around it.

Keeping this performant is just a balance of how fast your fire spreads vs runs out of fuel. Our fire rules are pretty much the same for most types of object, but foliage wants to act a little more impressive even if it means being a touch unpredictable, so we have a few specific settings, eg wind contribution. I like to keep these in a data asset so designers can get at 'em without touching code (including BP).

For performance, none of these fires actually creates any light directly. When there's a foliage fire going on, we create one non-shadow-casting fire light that moves and changes size to cover the entire burning area. Ditto sound. Same idea as my old fire system.
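If you wanted to roll the same approach yourself, the clustering step boils down to something like this (the original is all Blueprint; this is a hand-waved C++ equivalent, and ClumpClass stands in for whatever actor class has your burn/Signal handling on it):

CPP
#include "CoreMinimal.h"
#include "Components/HierarchicalInstancedStaticMeshComponent.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"

void ConvertFoliageToClump(UWorld* World,
                           UHierarchicalInstancedStaticMeshComponent* Foliage,
                           TSubclassOf<AActor> ClumpClass,
                           const FVector& Center, float Radius)
{
    // All painted instances of this foliage type within the burn radius.
    const TArray<int32> Instances = Foliage->GetInstancesOverlappingSphere(Center, Radius);
    if (Instances.Num() == 0)
    {
        return;
    }

    // One actor represents the whole clump, however dense the painting was.
    AActor* Clump = World->SpawnActor<AActor>(ClumpClass, Center, FRotator::ZeroRotator);

    for (int32 Index : Instances)
    {
        FTransform InstanceTransform;
        Foliage->GetInstanceTransform(Index, InstanceTransform, /*bWorldSpace=*/true);

        // Recreate the instance as an ordinary mesh component on the clump actor.
        UStaticMeshComponent* Mesh = NewObject<UStaticMeshComponent>(Clump);
        Mesh->SetStaticMesh(Foliage->GetStaticMesh());
        Mesh->RegisterComponent();
        Mesh->AttachToComponent(Clump->GetRootComponent(), FAttachmentTransformRules::KeepWorldTransform);
        Mesh->SetWorldTransform(InstanceTransform);
    }

    // Swap a few meters of foliage instances for one actor that can burn, get wet, etc.
    Foliage->RemoveInstances(Instances);
}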

We also use all the Cool Kid Shader Perf Tricks, like having the burning foliage turn to embers and threshold out using Time in the shader rather than wasting any CPU on it, which ends up meaning we can have very large dangerous exciting fires with generally quite a low perf hit. I'm pretty happy with the result, and it ends up a system where you get these kinds of anecdotes, so who can complain:

For cutting foliage, we use the same function but it's much simpler: check in front of the player for foliage, look it up in the table, destroy the foliage, spawn the tabled FX/items etc for that foliage, or replace with an actor. This game doesn't do it, but if we wanted to be able to cut down trees with an axe using this system we could have it going in like 20 minutes.

Also, this was all done in Blueprint. Blueprint is the best.

car game

i was the black car in this vid, so when I died i thought I'd have to rerecord but then the green car made it the whole way. always nice to see your ai babies succeed

https://www.youtube.com/watch?v=WLXOPbJLDmg

i'm quite happy with the camera, which was the first thing i did, it took literally like 5 mins and it's doing exactly what i wanted

more car game

lil car game: placing works, back car explodes if it gets too far from the front car

https://www.youtube.com/watch?v=rxCyHuPCEZs

car game

tonight i made an arcadey-feeling race-some-cars thing. pretty much just a spline road, basic car ai and the camera. i will either never touch this again or do weapons next

https://www.youtube.com/watch?v=pa6qNv2Pm2c

one time i made a vector field painter

https://www.youtube.com/watch?v=U4iHYJ-_Mps

it is pretty cool. you can export vector fields out of it and use 'em for your particles and so forth (not just in unreal)

games should require specific input to pull the pin on a grenade,

separate from throwing it. 90% of the potential interactivity of a grenade is collapsed when it explodes

in a game where you can throw a grenade without pulling the pin

-force enemies to break cover, unaware grenade is not live

-dialogue opportunities when they realise

-a free "throw distraction" verb without adding a system

-shootable non-live grenade doubles as remote bomb

-can throw one in somebody's face, stunning and freakin' em out

-share grenades with pals

-risk/reward, enemies can pick em up. "hey free grenade"

-enemies approach non-exploded thrown grenades tentatively, because throwing a non-live grenade is a weird thing to do. what if it's just defective

-if you do this too much they start expecting it and getting cocky about your grenades, with hilarious consequences

as long as i am only throwing a grenade to flush somebody out of cover, why waste it. foolishness

i uh. might have remastered this game a little bit more than i meant to

edsplash.bmp

in the unreal engine, edsplash.bmp is the name of the splash image presented to the user while the editor loads. often a given edsplash will never be seen outside its development studio. but sometimes an edsplash escapes onto the internet. these are their stories

unrealed 2.0

unrealed 2.1

bioshock drawing

expand for more juicy edsplash


gears of war

splinter cell: chaos theory

unrealed 3

primal carnage extinction

borderlands 1

borderlands 2

rainbow six: raven shield

bioshock with a little sister on it

bioshock again

bioshock 2

dead man's hand

thanks 2 my man @vectorpoem for some of these

randomly remembered the time i met palmer luckey when the oculus dk2 was new. i was like, "positional tracking is super cool, but the camera is a bummer. keen for in a few years when you can do it without an external camera"

he goes "no, that will never happen. it's impossible"

the time i made headcrabs in unreal

this was before half-life alyx had been announced

https://www.youtube.com/watch?v=sZZuA5xI2OU

weird west in the physical world

Came home to this! First game I've worked on and had a physical copy of.

Knocked this up the other week in ue5 in a couple hours. Megascans and free marketplace stuff

true game dev anecdote from the development of Weird West

i was on a call trying to demo something unrelated, and ran past a chicken, scaring it; it ran through a campfire, caught fire, flapped into a cornfield, and set the entire cornfield alight. the corn fire generated enough popcorn to crash the editor

this was a direct self-own since i had myself implemented the chicken-spooking, fire propagation and cooking systems

Unreal Engine Plugins That I Reckon Are Good

Here is my list of unreal engine plugins that I like

Mesh Tool by Nate Mary is a general modeling tool and the best way to do level design in the editor, though UE5's CubeGrid tool is really good too (but has fewer functions than this). You can use both together, both just output meshes. I asked Nate to add Hotspot Texturing like Source 2 has, and the madman did it. The pictures above are all of MeshTool stuff, all of it whipped up real quick.

HammUEr is a map importer that @turfster made for me like 6 years ago, which lets you import maps, models, textures, materials, etc, from the Source Engine, Quake 1-3 and Doom 3 engines, etc. It even handles pulling assets out of WADs, or exporting your textures from Unreal into Source Engine format so you can do your mapping in Hammer and import back in. It's wild and Turfster's great.


Nvidia's DLSS plugin is ridiculously easy to set up. You just drop it in and bam, you've got DLSS like all the new AAA games, ain't gotta do anything. AMD FSR is the same deal apparently but I haven't used it.

Accumulation-Based Motion Blur by @TheEnbyWitch is a really good Accumulation Motion Blur plugin. What that is is basically the motion blur from Metal Gear Solid and games like that. You know the one. It's really well-integrated, even with Sequencer, so you can use it for cutscenes and stuff as well as gameplay. Super worth it.

UIWS (Unified Interactive Water System) and UIPF (Unified Interactive Foliage System) by @elliotgraytho are really good water and foliage interaction systems. UE5 has built-in ripply reactive water now, but depending on what you want to do this can be easier to work with. The foliage plugin is pretty unmatched, even if you only use the shader interaction (eg: the ball in #influxredux rolling through the grass flattens it). Used both of these in Weird West!

Voxel Plugin is an absolutely massive project that lets you make voxel-based games, whether that's a Minecraft or a Teardown or a No Man's Sky or a Valheim or what. Everything you'd want to be supported is in there, plus a bunch of stuff you never would have thought of. I always hear that Epic are super impressed with it too, and support seems great.

design thoughts: inventing a new special infected for left 4 dead

Hi I'm Joe and here's what I'd do if someone at Valve put a gun to my head and told me to add a new type of Special Infected to the video game Left 4 Dead


First thing to think about is, as a goal, you need an issue you want to address. For me, in Left 4 Dead, that's players rushing. Whether it's co-op or versus, players in L4D can pretty much just speedrun everything, if they want, and get away with it. I'm hostile to this. I don't like those players - they can only have a good time if everyone else has a bad time.

So, my Special Infected is the Lurker.

-The Lurker always spawns ahead of the survivors on the escape route (an L4D concept which is just "the path from map start to map end").

-It doesn't spawn very often, and only once per map.

-It looks and acts identically to an idle common infected (normal zombie), and does all the usual idle behaviours: leaning on walls, staggering around, vomiting.

-If any player gets too close, it erupts into a terrifying mess of bloody tentacles like The Thing and does low-damage minor-knockdown slashes within a several-meter radius, still staggering around. It doesn't incapacitate like other special infected, but it knocks you around.

-If it takes too much damage, it disengages and tries to escape by getting low and snaking off ahead. If it gets away, you might encounter it again, back in stealth mode.

-It doesn't do much damage, but it has a lot of health, so fighting it takes time, and draws some attention. It's also faster than a player, so you can't just run away.

-Pre-aggro, there's two ways to know it's not a common infected:

  1. Regardless of noise or damage, it never aggros unless you're close to it, it just keeps doing those idle zombie behaviours.
  2. If you shoot it in the head, it "dies" like a normal Z, but like 5-10 seconds later it gets up again.

This means you can't confidently detect a Lurker without drawing attention from other enemies by shooting (which matters a lot or a little, contextually), or by slowing down and taking a few seconds to be sure. You're disincentivised from rushing because the level of danger ahead is unclear, since every zombie is a little bit of a suspect, at least some of the time.

I think this could be a pretty good solution to how, for a certain type of player, a lot of the survival-horror tension goes away when you realise that instead of cooperating with the experience, you can just run fast.

In versus, as the Lurker, you get a third-person view, and play a The Ship-style game of trying to look plausible as an NPC, while trying to get close to an area where the survivors will have to come close to you. For instance, playing as a Lurker, you might stagger into the path the survivors need to stealth along to avoid disturbing the Witch.

I wish L4D supported modding so I could actually give this a try!

Thanks for coming to my Zed Talk


f.e.a.r: first encounter assault recon rules

The video game called F.E.A.R: First Encounter Assault Recon gets a bum rap, even though everyone knows it rules. The bum rap is that everyone thinks it's just the AI and the guyshoot that rule in F.E.A.R: First Encounter Assault Recon, when actually a ton else about F.E.A.R: First Encounter Assault Recon also rules.


You're on some kind of little X-Files XCOM squad, which is a great idea, and it's called F.E.A.R (First Encounter Assault Recon), which obviously rules, and this whole idea isn't taken seriously by anyone not on the squad, which is perfect. The intro is super succinct. It's got, I reckon, the best largely-audio-log-based story in any game, where the "audio logs" are all voicemail messages left on landline telephones, the least contrived approach to this ever taken. They're genuinely good performances of out-of-context tidbits from which you can piece together several stories of relatively mundane capitalist evil that frame the supernatural horror stuff really nicely.

A lot of folks reckon the horror stuff is clumsy or naff, or the occasional jumpscares are not scary, or they are scary in a cheap way, but I see it as pretty well-executed, and what I suspect is actually happening is that people have a hard time switching gears from awesome-feeling shootouts into spooky slow stuff. A horror experience is something you have to fully cooperate with, sort of the opposite of a challenging shooter. A game with basically the same horror stuff but none of the action movie stuff is Condemned, by the same folks, and that scared everyone.

One of F.E.A.R: First Encounter Assault Recon's side stories is the minor mystery of an entire part of town being abandoned (it's been poisoned by runoff from the corporation the game is about) and I always thought that would have been a perfect tie-in to Condemned (a game about a city full of ultra violent people being affected by something which might as well be the same runoff stuff), had Monolith decided they wanted a Shared Universe Situation, but Shared Universe Situations are a nerd-impulse best resisted, so this, too, rules.

F.E.A.R: First Encounter Assault Recon does the Silent Protagonist Who The Twist Is They're Central To The Plot thing, which doesn't rule now, because everything did it, but it sort of did rule at the time. The Twist is almost exactly the same in its content and delivery as Bioshock's, but earlier; I have a hard time thinking it wasn't an influence. It's also a story where the evil supernatural force thing turns out to be totally justified in its boundless wrath and rage, which always rules.

Besides all this, it really does have the best FPS AI and guyshoot ever devised, and constantly when you play it you see very cool little VFX or design touches that you don't see anywhere else, plus you can slide kick a guy so hard he goes through a wall, and it's like two bucks on Steam or whatever.

#F.E.A.R: First Encounter Assault Recon#the sequels do not rule#turn off AA or the soft shadows don't work#Video game thoughts

i was going to just comment on joe's post (good as always) but as usual when a comment gets past three paragraphs, i became filled with doubt before i finished it and investigated and discovered my convictions were false

minor F.E.A.R. spoilers below


i played fear when it wasn't yet over a decade old and it was the best horror game i'd ever played in my life. it did things that we didn't know were possible in games yet and nobody talked about it and i didn't understand why. if joe's right that people talk shit on this game i don't know why. it was a masterpiece and while i don't know if it holds up, there was nothing to complain about when it was new.

my comment was about the best scare in any horror game in history. and i'm talking about forwards and back because nobody is going to outdo this, or at least that's what i remembered. here is my memory of the scare

in the middle of the game, you're getting onto a ladder, like you've done 3,000 times already. this is a "you can see your feet" type of game (one of the sole two genres of FPS) so you have to animation-priority your way onto it. you press E, and your character puts their foot on the top rung, and the camera swings around in the standard getting-onto-a-ladder animation you've seen dozens of times already, which happens to end with your field of view below the level of the floor you were just on.

2/3 into the animation, as your character is past the point of no return and has to finish getting onto this ladder if you don't want to plummet to the floor, your camera sweeps across alma standing six inches away from you directly at the top of the ladder. but the animation is locked in, and just continues past her as you settle, and ends with her out of your field of view. by the time you can look up or climb up, it's been a quarter second; you know she won't be there. you still check, though, and you're still freaked the fuck out when she isn't there.

i remember being absolutely floored by this scene. turns out it didn't happen at all like that.

https://www.youtube.com/watch?v=Fo7HA7Y89XI

yes, you animation-priority your way onto the ladder, but it's much faster and clunkier than i recall. when the animation ends, you're just staring at alma, standing dead still. she then begins to dissolve. here, in 2023, this has no punch whatsoever. she just... stands there and melts?? i am laughing, almost. what? i was scared by this?

well.

in 2006 or whatever when i first played this, i had never played an FPS with animation priority motion. i was used to half life, etc. where ladders were basically magnets that your character could slide up and down on. you couldn't see your hands or feet, so you couldn't see them reach out and touch things. you were not a person, you were a camera equipped with v_ models

fear felt so organic, so real, so heavy. i had learned, by this point in the game, that i could not get off a ladder without waiting through the animation. i knew that i was glued to that ladder as soon as i was on it, and it made me uncomfortable. i actively felt less safe and in control because i couldn't do absurd "air control" quake bullshit in this game. i liked that, a lot, i liked that my character had weight and physics and couldn't do a zero-zero jump six feet straight up and eight feet to the side off of a ladder.

but that was so novel that it made me actually feel even more constrained than i really was. i am positive that, at the time, i believed this scene played out exactly the way I said above, and that's because i was so scared of the way the game had manipulated me, had made me take an action that made me vulnerable which i had internalized up to that point as perfectly safe, that I saw things that didn't exist.

i told myself that getting onto a ladder was safe. more specifically, i told myself that turning my back to a place i'd already looked at was safe, because in 2005, videogames didn't know which direction you were looking.

in 2005 i had never seen an FPS that changed the world state or triggered script actions based on whether you were looking in a particular direction. this is of course very easy nowadays - it was easy then, too, but i had never seen it and i don't think it was common.

but even moreso, getting onto a ladder in a half life era game wasn't an event. no game, to that point, had ever changed the world state based on your being on a ladder, because that wasn't an action. you didn't press a button to climb onto a ladder, it was a movement state, you could enter and exit "being on a ladder" by moving one pixel left and right, to contact or not contact it.

fear, however, made it a specific action which you had to commit to, an Event, a little micro cutscene you had to watch, and that threw me off so hard. i had accepted "this game uses an animation to put you on ladders," sure, but it had never occurred to me that that meant that the game was taking over my body when i clambered onto one, and what the implications of that were.

see, in all other FPSes, you couldn't do something like make an object appear behind the player, because game scripts couldn't seem to make anything happen at a resolution finer than a couple seconds. if clive barker undying wanted demons to spawn, they just kind of popped into existence and hung out for a couple seconds before doing anything. that, plus the seeming inability to tell where the player was looking that i had observed, plus the fact that this was a PC game, where the player could survey their entire surroundings by just whipping the mouse half an inch to the side on their desk, meant that i had simply never conceived of it being possible to introduce an NPC to the world without them either running in from another room, or spawning in full view of the player.

i had been subconsciously scared when fear took my instant mouse control away for those brief moments when clambering onto ladders. it had bothered me that i didn't get to choose where i was looking, so i couldn't survey my surroundings for attackers, but it didn't click that this unsettled me until i climbed onto that ladder and was stuck, mid-animation, staring at alma.

the reason i remember this being so slow and deliberate is because i was genuinely panicking, so time perceptually slowed down. the reason i don't remember her being visible at the end of the animation is because my animal brain said "welp, we just got eaten by a Predator i guess" and shut off my hippocampus in order to move my consciousness to the Penalty Box to watch my fuckup on looping Instant Replay

game of the year

#gravis-gaming-dot-net

FEAR, HL2, AI, Only Doing Stuff When The Player Is Looking, Ways To Make AI Seem Smart

(gravis did a comment on my FEAR post so long he made it a post, and now I'm doing that with that post, and this ends up being about all kinds of things)

Half-Life 2 (and maybe 1, but I don't think so?) had a trigger type called trigger_look, which was like a regular trigger, but it only executed if you were both standing in the trigger and looking at the specified entity. They used it heavily for Gman stuff - in HL2, if you walk backwards into an area you're supposed to see the Gman in, and then turn around, that's when he straightens his tie and walks away, he only does it if you're looking.

I remember thinking this was incredibly cool at the time, and it was, not for the ability to do that check necessarily, but for exposing it to level designers as a trigger type - a coder at Valve saw the problem of "player isn't looking when the cool thing happens" and provided LDs (who were the ones making all the scripted sequences) with an easy way to say "wait til the player's looking, then do it". It's a level of facilitation by code of LD that, these days, you're surprised (and often pathetically grateful) when you see it.


HL2, while I have my issues with it, was a rare case of a sequel seeing a lot of the core concepts that worked in the first one, which are hardly ever the ones that players notice, and digging deeper into those. The "do cool thing only when player is looking" stuff is everywhere in HL2 - if an NPC's bullet is going to miss its target and hit an explosive barrel, only do that when the player's gonna see it, otherwise the player doesn't know why the thing is on fire. The fast zombie only leaps at you when you're looking at it. The Strider only fires its big alt-fire cannon when someone's gonna see.
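The check itself is nothing special, which is sort of the point - the value is in handing it to level designers as a ready-made thing. It amounts to something like this (my sketch, not Valve's trigger_look; the angle and trace channel are placeholder choices):

CPP
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"

// "Is the player actually looking at this thing right now" - a view-cone dot
// product plus an occlusion trace, so the cool scripted thing only fires when
// it'll be seen.
bool IsPlayerLookingAt(const AActor* Target,
                       const FVector& CameraLocation,
                       const FVector& CameraForward,
                       float MaxAngleDegrees = 30.f)
{
    const FVector ToTarget = (Target->GetActorLocation() - CameraLocation).GetSafeNormal();
    const float CosAngle = FVector::DotProduct(CameraForward.GetSafeNormal(), ToTarget);
    if (CosAngle < FMath::Cos(FMath::DegreesToRadians(MaxAngleDegrees)))
    {
        return false; // not within the player's view cone
    }

    // Make sure a wall isn't in the way before firing the scripted sequence.
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(Target);
    const bool bBlocked = Target->GetWorld()->LineTraceSingleByChannel(
        Hit, CameraLocation, Target->GetActorLocation(), ECC_Visibility, Params);
    return !bBlocked;
}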

While the AI in HL2 was, for me, a big miss (and I did a youtube video about it) this "look trigger" stuff is, I reckon, a big piece of the oft-neglected puzzle of making game AI feel smart, a goal which I've come to reckon is basically at odds with how most games are produced (at least beyond indie, indie being usually not where a ton of AI work gets done).

There are a ton of games, real expensive AAA games, with what you'd call, if you were a person who writes AI, really good AI, but players don't note the AI, and have no stories to tell about it.

The last two Splinter Cell games are like this - the AI is doing cool, impressive things, and nobody is impressed; it comes across as serviceable. Nobody says it's bad, nobody says it's good. But the very first time you play a level of those games, you are impressed. You're impressed when you're hanging off the side of a building, and an alerted guy who thought he saw you running towards the window a second ago actually bothers to stick his flashlight out the window and check the side of the building. That's incredibly cool, the first time. It stops being cool the same mission, because they do it constantly, even if they never find you there. Suddenly we're living in a world where the outside upper wall of a multi-storey building is the first place you look for an intruder. Everyone knows that!

The issue is rarity: behaviours are cool because they're rare. AI looks smart when it deals correctly with an unusual situation. Like in FEAR, the way a scared soldier who's the last in his squad will throw himself onto his belly to crawl under a truck, and this only happens when there's a scared last guy in the squad and he doesn't think you can see him and there's a truck. You can easily never see this happen!

But the way big games are produced, if a thing is cool, it cost money, it continues to cost money to keep it working, and you better justify that investment by putting the Thing on the screen whenever you can. What's the point if nobody sees it, right? But actually, what's the point if everyone sees it all the time?

I reckon the best approach to making smart, reactive-feeling AI in a real-time game that people relate to, remember, talk about, get freaked out by (especially, but not necessarily, in an action or horror thing) is to really budget and stagger out your cool behaviours, the way some of your smarter games will handle dynamic dialogue (eg Left 4 Dead), such that they're genuinely rare - one-offs, even - and to make sure that when they do execute, they have the player's attention. There was a pre-release interview about Half-Life 2 where some dev said the zombies were so smart, they could encounter a locked door, punch out the little window in the door, reach through and unlock it from the inside. That didn't ship, but imagine if it had, and they'd made it super unlikely? Wouldn't that be the coolest fuckin' thing that ever happened to you?
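
(To be concrete about "budget and stagger": here's a toy sketch of the kind of gate I mean - not from any shipped game, everything in it is made up - where each cool behaviour fires at most once in a long while, and only if we think the player will actually see it.)

CPP
// Toy "rare moment" gate: a behaviour only fires if the player is watching and it
// hasn't fired recently. Invented names; just illustrating the budgeting idea.
#include <string>
#include <unordered_map>

class RareMomentBudget
{
public:
    // Returns true (and spends the moment) only if the behaviour may fire right now.
    bool TryFire(const std::string& Behaviour, double NowSeconds,
                 bool bPlayerIsWatching, double CooldownSeconds = 600.0)
    {
        if (!bPlayerIsWatching)
            return false;                    // nobody's looking: save it for later
        auto It = LastFired.find(Behaviour);
        if (It != LastFired.end() && NowSeconds - It->second < CooldownSeconds)
            return false;                    // keep it rare: don't repeat the trick
        LastFired[Behaviour] = NowSeconds;
        return true;
    }

private:
    std::unordered_map<std::string, double> LastFired;
};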

a funny thing happened when i told a lie on the internet

i suggested that this photo i took near my house was a screenshot out of unreal engine 5, and many people believed me and were amazed, or didn't but were unsure, or recognised it as a troll but thought it was a troll on people thinking it was a photo (which it was), or just got confused and annoyed. it was pretty great

but then: somebody spent 3 weeks recreating the photo in unreal engine 5

pretty great

quick-inventory bar thing

https://youtu.be/5ApeFN0bs3k

It ain't much, but i like to do something on this project most days (which is not to say that's what happens) so I made a particle effect for when you crowbar the keypad open. Niagara is so good once you get to grips with it, this took like 4 minutes

coming soon: manual hacking

made it so you can use your crowbar to pry keypads off the wall, with a view to using your electronics skill (when skills are implemented) to bypass the lock, possibly also through the use of some kind of multitool dealio

https://youtu.be/JtxEN_c8FrU

ammo pickups, reloadin' and weapon mods

ammo as inventory item

finally handled ammo properly in my imsim thing. ammo is stored in the inventory (but all pooled in one item) and you draw from that item's quantity when you reload

https://www.youtube.com/watch?v=ZUxZXkF8pUU
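
(the actual implementation isn't shown in the vid, but the logic is about this simple - these structs are invented stand-ins for my real classes)

CPP
// Pooled-ammo reload: one inventory item holds all the rounds of a given type,
// and reloading just draws from its quantity. Stand-in types, not the real ones.
#include <algorithm>

struct InventoryItem
{
    int Quantity = 0;   // e.g. a single "pistol ammo" item pooling every round you own
};

struct Weapon
{
    int MagazineSize = 12;
    int RoundsInMagazine = 0;

    void Reload(InventoryItem& AmmoPool)
    {
        const int Needed = MagazineSize - RoundsInMagazine;
        const int Taken = std::min(Needed, AmmoPool.Quantity);
        AmmoPool.Quantity -= Taken;   // the pooled item loses what the gun gains
        RoundsInMagazine += Taken;
    }
};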

my Weird West first person mod is out

https://www.youtube.com/watch?v=yNskDpRj6IA

https://www.youtube.com/watch?v=t2avT6QQYx8&t=4s

Today Weird West gets its final patch! It introduces mod support, and includes the First Person Mod that WolfEye got me to finish up after I put out a vid of a much jankier version I pooped out a while back. Here's a kind of low-energy making-of I threw together.

joe availability report

I'm looking for work that starts early next year, Unreal stuff, you know, generally being good at Unreal. http://www.impromptugames.com has the scoop on stuff I have done before, and also an email link. Holler if you want to hire a person to make your stuff good

silencer

did the stuff so weapon mods can add attachments to the weapon now. neato

https://youtu.be/S31DWrru39c

weapon mods online

behold: items that can be attached to other items to modify the parent item in some arbitrary way.

https://www.youtube.com/watch?v=obpaFdBcCVw
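
(if you want the gist without watching the vid: conceptually it's something like the below, though the real thing is Unreal items and components rather than these made-up structs)

CPP
// One way to do "items attached to items that modify the parent arbitrarily":
// every mod carries a function that tweaks the parent's stats. Invented types.
#include <functional>
#include <vector>

struct WeaponStats
{
    float Damage = 10.f;
    float NoiseLevel = 1.f;
};

struct ItemMod
{
    std::function<void(WeaponStats&)> Apply;   // arbitrary tweak to the parent item
};

struct ModdableWeapon
{
    WeaponStats BaseStats;
    std::vector<ItemMod> AttachedMods;

    WeaponStats GetEffectiveStats() const
    {
        WeaponStats Stats = BaseStats;
        for (const ItemMod& Mod : AttachedMods)
            Mod.Apply(Stats);                  // e.g. a silencer multiplies NoiseLevel down
        return Stats;
    }
};

// usage: Pistol.AttachedMods.push_back({ [](WeaponStats& S) { S.NoiseLevel *= 0.1f; } });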

unreal engine general beginner tips for persons who are beginning with unreal engine

Spent some time on a comment this morning to help someone out who was having trouble starting out in Unreal, but it might as well be a post, so here it is:

Unreal is really good, but it doesn't teach itself well, which frustrates me a lot. Once you're across it, it's wildly empowering, but getting across it is way harder than it needs to be, so you're not alone here at all.


The most important thing is just to be in a community of helpful folks who know it well already or are also learning it. The Unreal forums are mostly not that place. I run a discord which is a good one; there's a link in my bio. There's also a discord that longtime Unity user Alex Rose made for Unity users jumping ship to Unreal, which I'm in too, and those people are helpful to each other: https://twitter.com/AlexRoseGames/status/1542871322136662016?s=20&t=yf642ZAODq-8SDjQcHUtMQ

I don't do well with video tutorials either mostly, but I did make the vid below, which isn't a tutorial but a rapid-fire 20-minute list of stuff people often don't find out on their own. It's not supposed to teach you this stuff as much as give you a feel for the sorts of things that are there and not surfaced. I also used to tutor folks on this stuff for money, which I recommend if you can find someone good.

https://www.youtube.com/watch?v=VhYDqkTqZPg

It's frustrating initially, but the learning curve on most things is one steep initial cliff (find out what this even is) and then an immediate dropoff (oh, this is easy).

There's an official docs page for transitioning from Unity also: https://docs.unrealengine.com/5.0/en-US/unreal-engine-for-unity-developers/

Other general tips:

-never assume you have to roll your own anything (character movement basics, standing on moving platforms, decals, databases, camera systems, AI systems, pathfinding, profiling tools, localisation systems, dynamic music, networking, state machines, projectiles, damage, vehicles, anything). Sometimes you do, but always check

-95% of UE4 docs/tutorials also apply to UE5. They moved some stuff around in the UI but most stuff is the same

-You should work primarily in Blueprint, not C++. This applies equally (at least initially, until you're someone who Knows Unreal) to everyone. Some people will fight me on this but they're all wrong

-C++ in unreal is a different beast than C++ in other places; if it shits you in other places it's still worth a try here

-Read the entire Gameplay Framework doc; it's important, and there is no project where the Gameplay Framework isn't the right way to work

-Opening and docking the Class Viewer panel (Tools->Class Viewer) will help you passively observe the way the engine is structured. In UE1 and 2 this was docked at all times; they never shoulda hidden it away

-Download the Content Examples example project through the Epic launcher, it's pretty exhaustive

-You can theme the UE5 editor in Editor Settings, which is good if the default dark theme in UE5 is too dark for you

Hope this helps!

Made an editor utility widget for capturing inventory icon textures for my pickups! You just pick the item you want, tweak its offsets in a table until you like the result, and hit save. Now my inventory looks nice

https://youtu.be/vwNWvHAoiGY
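
(the widget itself is Blueprint, but the guts of the capture step amount to roughly this - the function name, the offset handling and all the parameters here are just mine for illustration)

CPP
// Rough guess at what an icon-capture step like this does: park the item in front
// of a scene capture using its per-item offset from the table, render one frame,
// then export the render target to disk. Invented names throughout.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "GameFramework/Actor.h"
#include "Kismet/KismetRenderingLibrary.h"

void CaptureItemIcon(USceneCaptureComponent2D* Capture, AActor* Item,
                     const FVector& Offset, const FString& OutDir, const FString& FileName)
{
    // position the item relative to the capture component using the tweakable offsets
    Item->SetActorLocation(Capture->GetComponentLocation()
                         + Capture->GetForwardVector() * Offset.X
                         + Capture->GetRightVector()   * Offset.Y
                         + Capture->GetUpVector()      * Offset.Z);

    Capture->CaptureScene();   // render one frame into the assigned render target
    UKismetRenderingLibrary::ExportRenderTarget(Item, Capture->TextureTarget,
                                                OutDir, FileName);   // write it out
}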

whacked in a descriptive tooltip for the inventory items

today i added stacking for inventory items. baked my noodle for a minute there

https://youtu.be/PU6lRCSetmY
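
(for the curious, the noodle-baking part mostly boils down to "top up existing stacks first, then open new ones" - sketched below with made-up structs, not my real inventory classes)

CPP
// Stacking: fill existing stacks of the same item type up to the max stack size,
// then start new stacks for whatever's left over. Stand-in types.
#include <algorithm>
#include <vector>

struct ItemStack
{
    int ItemTypeId = 0;
    int Count = 0;
};

struct Inventory
{
    int MaxStackSize = 10;
    std::vector<ItemStack> Stacks;

    void AddItems(int ItemTypeId, int Count)
    {
        for (ItemStack& Stack : Stacks)             // top up matching stacks first
        {
            if (Count <= 0) break;
            if (Stack.ItemTypeId != ItemTypeId || Stack.Count >= MaxStackSize) continue;
            const int Moved = std::min(MaxStackSize - Stack.Count, Count);
            Stack.Count += Moved;
            Count -= Moved;
        }
        while (Count > 0)                           // then open fresh stacks
        {
            const int Moved = std::min(MaxStackSize, Count);
            Stacks.push_back({ ItemTypeId, Moved });
            Count -= Moved;
        }
    }
};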

Added retractable inventory, and equipping for items besides weapons - like the way in DX you can equip anything and see it held out in front of you, allowing for "use thing on thing" potentially

https://youtu.be/ItWWBWQzE08

notifications ui for my lil imsim framework thing

https://youtu.be/ZPDnbzTFwV0

deus ex/general imsim systems project

https://www.youtube.com/watch?v=kerFDwJwKMI

Started a new little just-for-fun project the other day where I just build systems that're in Deus Ex in case I ever want to make Deus Ex. Got inventories, weapons, moving objects around, breakables, inventories that drop everything if their owner breaks, doors, locks, keys, keypads.

It's a whole thread on twitter. Sure would like to be able to drop vids into cohost posts

weird west first person mode

https://www.youtube.com/watch?v=sJoHJ5ZABrE

After Weird West came out (I worked on it for ~3 years as a Technical/Systems Designer) I spent some time on a weekend mocking up a first-person mode. It's janky and uncomfortable, but pretty cool! The folks in charge were kind enough to let me share it.

this was unofficial, unsanctioned buffoonery on my part; do not expect a first-person mode to get added to Weird West

first guy

i never made a character before, but tonight i did! modelled him inside ue5 and skinned him in blender, i guess. pleased!

Joni (Oni but Joe)

I have been gettin' asked about my Oni UE4 project, Joni, from a while ago so I did a quick vid where I show how it all works. Pretty simple but perhaps of interest to some folks

https://www.youtube.com/watch?v=Yz4n44YLYMM

Made a quick vid explaining Hotspot Texturing in Unreal using the MeshTool plugin, and how, combined with UE5's CubeGrid, this makes for a pretty slick level design workflow

https://youtu.be/e7u5tqOCMk8

here's some pretty pictures from a game i had to cancel due to circumstances beyond my control. map by me and @trashbang

https://www.youtube.com/watch?v=VoSVRiMkVo0

here's an old trailer for a game idea i was noodlin' with at one time

been relighting this forest a bit. there were too many trees, so i did a little blueprint utility to thin 'em out.
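
(the utility was nothing fancy - the C++ equivalent of what it does is roughly the below; the function name, the keep fraction and the rest are all invented for illustration)

CPP
// Rough equivalent of a tree-thinning utility: walk the instanced static mesh
// components on an actor and delete a random fraction of their instances.
#include "Components/InstancedStaticMeshComponent.h"
#include "GameFramework/Actor.h"

void ThinOutInstances(AActor* FoliageActor, float KeepFraction = 0.6f)
{
    TArray<UInstancedStaticMeshComponent*> Components;
    FoliageActor->GetComponents<UInstancedStaticMeshComponent>(Components);

    for (UInstancedStaticMeshComponent* Comp : Components)
    {
        // remove back-to-front so the remaining instance indices stay valid
        for (int32 i = Comp->GetInstanceCount() - 1; i >= 0; --i)
        {
            if (FMath::FRand() > KeepFraction)
            {
                Comp->RemoveInstance(i);
            }
        }
    }
}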

influx redux

it's a game

game coming out soon looks like this

it's "influx redux" and it's not gonna make any money

Unreal Engine Tips You Might Not Discover Quickly On Your Own

made a 20 minute rapid fire tips video for people new to unreal (especially from unity)

https://www.youtube.com/watch?v=VhYDqkTqZPg