I need practice doing more of these until I stop rambling on and repeating myself… but anyway.
Quick video talking about and demonstrating the FOV settings we now have in Ground Branch.
Personally, I have a 24″ monitor that I sit about 90cm/3′ from.
Then again, I sit about 10mm away from the lenses of the Rift.
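As a rough illustration of how viewing distance relates to a "natural" FOV setting (the 53cm figure below is an assumption: roughly the width of a 16:9 24″ panel, not a number from the game), the horizontal FOV that geometrically matches a monitor can be computed as:

```python
import math

def matched_hfov(screen_width_cm: float, distance_cm: float) -> float:
    """Horizontal FOV (degrees) at which on-screen angles match real-world angles."""
    return math.degrees(2 * math.atan(screen_width_cm / (2 * distance_cm)))

# A 16:9 24" monitor is roughly 53cm wide; viewed from 90cm:
print(round(matched_hfov(53, 90), 1))  # ~32.8 degrees
```

Anything wider than that on such a setup is a trade: more peripheral awareness, but distorted apparent distances.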
I put in some good hours on the demo map and thought I would share some of the results. This is also the same map that Kris showed in his post the other day.
This is a multi-purpose map that we are using to test out various game types, profile performance, run various visual benchmarks and prototype day/night/weather setups. It will also be the first map to be released in a public preview build.
Spent a good amount of time on replication this week. The original plan was to use a system that converted loadouts from their JSON format into a replication-friendly version, which would then be updated and sent from server to clients as required. This worked fine, but was never quite as easy as I would have liked.
Only missed it by a few weeks.
An Oculus Rift DevKit arrived near the end of May. I was hoping to have some footage of its use in Ground Branch, but alas, I don’t have the UE3 Oculus Rift SDK (yet). Rather frustrating, as it is fairly awesome, despite being low-res.
The Oculus Rift development kits have begun shipping to developers and despite the relatively low resolution and missing head translation, much praise is being heaped upon them.
Will Ground Branch be supporting it?
The short answer is, yes.
The long answer is, $#&@ yes.
The Rift brings a lot to the tactical FPS genre, not to mention gaming and non-gaming applications alike. The only issue the Rift has is that there are no decent interfaces to go with it – we’re stuck using a mouse/keyboard combo or a gamepad.
The problem with any new device is getting past the super-human accuracy we are given when we use a mouse. Any non-mouse implementation will either need to match a mouse’s accuracy or provide a substantial benefit above and beyond this limitation.
You can see an interesting way of using a Razer Hydra in the following video:
I could see an implementation similar to this that replaces aiming, sprinting, reloading and the like with simple realistic hand movements.
Hand to magazine, hold trigger.
Both hands down.
Swing weapon out.
Not only more immersive, but faster and with more control than a button press.
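Purely as an illustration of the idea (every gesture and action name here is made up, nothing from the actual game), a scheme like that boils down to a simple gesture-to-action table:

```python
# Hypothetical gesture-to-action bindings for a motion controller;
# all names are invented for illustration.
GESTURE_ACTIONS = {
    ("hand_to_magazine", "trigger_held"): "reload",
    ("both_hands_down",): "lower_weapon",
    ("swing_weapon_out",): "ready_weapon",
}

def resolve(*gesture: str) -> str:
    """Look up the action bound to a gesture combination."""
    return GESTURE_ACTIONS.get(tuple(gesture), "no_action")

print(resolve("hand_to_magazine", "trigger_held"))  # reload
```

The hard part isn't the table, of course; it's reliably recognising the gestures in the first place.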
Will that be enough to make up for the lack of accuracy?
I don’t know.
I really don’t want to see “no mouse/keyboard” servers, but wouldn’t be surprised if it happens.
Took care of several issues related to the true first person viewpoint and thought I’d share a quick in-game vid of it in action. I’m not sure why, but the volume of my voice alternates a bit in the uploaded version. I’ll try to figure out why before the next vid.
Due to a dose of Real Life™, I did not have it in me to do updates or socialise much in the past few weeks. Thankfully, things have progressed and much of the stress is gone.
On the bright side, one way I deal with stress is to distract myself by coding… a lot. I’ve done everything from updating more loadout and UI related code (booo!) to gametypes and AI (woohoo!).
One thing that can happen with game development that spans a lengthy time period is that art done early in the process can start to feel outdated and fall behind current standards. This isn’t always a bad thing (it’s the gameplay, man!), but I felt the weapon models/textures needed an upgrade since they are so visible. The work previously done by Snowfella was top notch and served its purpose very well. So again, my hat’s off to him… and we are still using some of his work in the finished game as well.
So let’s start off with the high-res version of the AK-74. This model is used to bake normal and ambient occlusion maps for the in-game model. I’ll do some more updates down the road to show some of the process.
New UE3 build this week, which delayed work in other areas a bit, but no matter.
After finishing an update of this kind, I like to check things by making a quick map in the editor, chucking in random stuff and running it with multiple players. I was already in the process (before the update) of more tests involving loadouts, so I thought I’d combine the two.
Basic bot code has been sitting there for ages. I just created a nav mesh, told them to use a loadout and let ’em rip. Considering the level of asset optimisation (none) and the age of the machine (2008), I’m happy with the results.
Also been testing and updating firearm related code.
With any luck, we’ll be shooting at each other soon, which will be nice.
You know, read out of context, that could sound weird…
Pretty rough (lighting etc), but the information travels across the network correctly, which is the most important thing.
Loadouts are saved in the JSON format to make sharing and editing them a lot easier. To make a loadout easier to send across the network, the client reduces it to references into an item list that has been synchronised between the server and client, then sends that to the server.
After verifying its contents against admin, gametype or scenario restrictions, the server then distributes it to all clients as required.
This allows for plenty of player customisation, but ensures that the server remains king.
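The compression step described above can be sketched roughly like this (the item list contents and loadout fields are assumptions for illustration, not the actual game data):

```python
import json

# Item list assumed to be synchronised between server and client ahead of time.
ITEM_LIST = ["AK-74", "M4A1", "frag_grenade", "smoke_grenade", "plate_carrier"]

def compress_loadout(loadout_json: str) -> list[int]:
    """Client side: reduce a JSON loadout to indices into the shared item list."""
    loadout = json.loads(loadout_json)
    return [ITEM_LIST.index(item) for item in loadout["items"]]

def expand_loadout(indices: list[int]) -> list[str]:
    """Server/other clients: resolve the indices back into item names."""
    return [ITEM_LIST[i] for i in indices]

packet = compress_loadout('{"items": ["AK-74", "frag_grenade"]}')
print(packet)                  # [0, 2]
print(expand_loadout(packet))  # ['AK-74', 'frag_grenade']
```

Sending small integers instead of full JSON keeps the packets tiny, and since the server resolves every reference itself, a modified client can never smuggle in gear that isn't on the shared list.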
Another decent step towards a playable build 🙂