Sunday, October 27, 2013

DualShock 4 for SDL layout.

Here's a quick dump of getting DualShock 4 support into your SDL title; it should work for both SDL 1.2 and 2.0.  If you're not using the controller mapping in SDL ( like for Steam OS titles ) but want to do it old-school, here's the button layout in SDL terms:

You can detect the pad via the string "Sony Computer Entertainment Wireless Controller".  Here's a dump from Linux dmesg with more USB details if you want to fingerprint it:

[ 1154.896287] usb 3-1: new full-speed USB device number 5 using xhci_hcd
[ 1154.913200] usb 3-1: New USB device found, idVendor=054c, idProduct=05c4
[ 1154.913209] usb 3-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[ 1154.913215] usb 3-1: Product: Wireless Controller
[ 1154.913220] usb 3-1: Manufacturer: Sony Computer Entertainment
[ 1154.913420] usb 3-1: ep 0x84 - rounding interval to 32 microframes, ep desc says 40 microframes
[ 1154.913435] usb 3-1: ep 0x3 - rounding interval to 32 microframes, ep desc says 40 microframes
[ 1154.919473] input: Sony Computer Entertainment Wireless Controller as /devices/pci0000:00/0000:00:14.0/usb3/3-1/3-1:1.0/input/input15
[ 1154.919743] hid-generic 0003:054C:05C4.0007: input,hidraw4: USB HID v1.11 Gamepad [Sony Computer Entertainment Wireless Controller] on usb-0000:00:14.0-1/input0

Analog Sticks and Triggers
Axis 0 - Left Stick Horiz
Axis 1 - Left Stick Vert
Axis 2 - Right Stick Horiz
Axis 5 - Right Stick Vert
Axis 3 - Analog L2
Axis 4 - Analog R2

Directional Pad ( hat values )
Up - 1
Right - 2
Down - 4
Left - 8
Up+Right - 3
Down+Right - 6
Up+Left - 9
Down+Left - 12
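
If you'd rather use symbolic names than raw numbers, those values line up with SDL's hat masks, so a bit test covers the diagonals for free.  A minimal sketch -- DecodeDpad is my own helper name, the SDL_HAT_* constants are real:

/* Decode an SDL hat value into individual directions.  The raw values
   above are exactly SDL's hat masks ( SDL_HAT_UP = 1, SDL_HAT_RIGHT = 2,
   SDL_HAT_DOWN = 4, SDL_HAT_LEFT = 8 ), so Up+Right ( 3 ) etc just work. */
static void DecodeDpad( Uint8 value, bool* up, bool* right, bool* down, bool* left )
{
   *up    = ( value & SDL_HAT_UP )    != 0;
   *right = ( value & SDL_HAT_RIGHT ) != 0;
   *down  = ( value & SDL_HAT_DOWN )  != 0;
   *left  = ( value & SDL_HAT_LEFT )  != 0;
}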

Digital Buttons
Button 0 - Square
Button 1 - Cross
Button 2 - Circle
Button 3 - Triangle
Button 4 - L1
Button 5 - R1
Button 6 - L2
Button 7 - R2
Button 8 - Share
Button 9 - Options
Button 10 - L3
Button 11 - R3
Button 12 - PS Button
Button 13 - Touchpad depress
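
For readability you might wrap those indices in an enum; the names here are my own, not anything official:

/* Hypothetical DS4 button indices matching the table above. */
enum DS4Button
{
   DS4_SQUARE = 0, DS4_CROSS, DS4_CIRCLE, DS4_TRIANGLE,
   DS4_L1, DS4_R1, DS4_L2, DS4_R2,
   DS4_SHARE, DS4_OPTIONS, DS4_L3, DS4_R3,
   DS4_PS, DS4_TOUCHPAD
};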

Setup Loop
Here are some generic code snippets for setup and detection of the controller.

/* Enable joystick events once, before opening devices. */
SDL_JoystickEventState( SDL_ENABLE );

for ( int i = 0, count = SDL_NumJoysticks(); i < count; ++i )
{
   /* eg "Sony Computer Entertainment Wireless Controller" */
   /* ( on SDL 1.2 use SDL_JoystickName( i ) instead ) */
   const char* joystickName = SDL_JoystickNameForIndex( i );
   sdlDevice->mJoystick = SDL_JoystickOpen( i );
   InputJoystickMap( joystickName, sdlDevice->mJoystick, i );
}


Event Loop Style Sampling
Here's a generic SDL event loop to access the pad input frame by frame if you like.

SDL_Event event;
while ( SDL_PollEvent( &event ) )
{
    switch ( event.type )
    {
    case SDL_JOYDEVICEADDED:
        InputJoystickSignalAdd( event.jdevice.which );
        break;

    case SDL_JOYDEVICEREMOVED:
        InputJoystickSignalRemove( event.jdevice.which );
        break;

    case SDL_JOYHATMOTION:
        InputJoystickSignalDpad( event.jhat.which, event.jhat.hat, event.jhat.value );
        break;

    case SDL_JOYAXISMOTION:
        InputJoystickSignalAxis( event.jaxis.which, event.jaxis.axis, event.jaxis.value );
        break;

    case SDL_JOYBUTTONUP:
        InputJoystickSignalButtonRelease( event.jbutton.which, event.jbutton.button, event.jbutton.state );
        break;

    case SDL_JOYBUTTONDOWN:
        InputJoystickSignalButtonPress( event.jbutton.which, event.jbutton.button, event.jbutton.state );
        break;

   ...
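
One extra note for the axis handler: event.jaxis.value is a raw signed 16-bit value, and you'll almost certainly want a dead zone before normalizing it.  A minimal sketch -- NormalizeAxis is my name, and the threshold is just a starting point to tune per pad:

/* Normalize a raw axis value ( -32768..32767 ) to -1..1 with a dead zone,
   rescaled so the output still sweeps the full range outside the zone. */
static float NormalizeAxis( Sint16 raw, Sint16 deadZone )
{
    if ( raw > -deadZone && raw < deadZone )
        return 0.0f;

    return ( raw < 0 ) ? ( raw + deadZone ) / ( 32768.0f - deadZone )
                       : ( raw - deadZone ) / ( 32767.0f - deadZone );
}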


Monday, October 7, 2013

Terrain and Biomes and Flora oh my.

So I dusted off the terrain / biome system to hook it up to the voxel terrain system, since it makes a lot more sense to model very large landmasses in a way people can understand.  Piecewise density functions aren't a bad thing, but even if you expose them with a UI flow graph -- who the fuck that actually wants to use a flow graph chart is also going to understand them?  Exactly.

Better to allow for a 'big picture' visualization of what those damn things do, I say.  Take for example this simple gem for generating large landmasses... clamp your density by masks of large water bodies and scale it by smaller ones.  Simple and effective.  Rock composition is the variable that affects water erosion the most.  It happens in nature everywhere -- that's why you get everything from the Grand Canyon to the salt flats.  Forget elevation as the main factor.

In fact you can completely ignore the physics and go for a purely artistic version of a bay like the one below, using only three of the density function operators no less -- a playground for artists to start hacking away in.  Each KaDensity function takes a list of parameters and returns a density.  They're pretty self-explanatory:

KaDensityNoiseSimplex2d : octaves, frequency, lacunarity, gain
KaDensitySaturate : density
KaDensityMul : density_A, density_B
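
To make that concrete, here's how the bay below might be composed -- a hedged sketch, since I'm guessing at the exact call signatures, and the constants are placeholders:

// Large water bodies: low frequency noise thresholded into a hard mask.
float large = KaDensitySaturate( KaDensityNoiseSimplex2d( 4, 0.0005f, 2.0f, 0.5f ) );

// Smaller bodies: higher frequency noise, same treatment.
float small = KaDensitySaturate( KaDensityNoiseSimplex2d( 6, 0.002f, 2.0f, 0.5f ) );

// Combined mask used to clamp / scale the terrain density.
float waterMask = KaDensityMul( large, small );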

Here each stage is generated via simplex noise and saturated to threshold the values, then combined to produce composite images.  You don't have to teach artists about physics, or generate all the voxel terrain for them to visualize while dialing it in... it makes continent iteration quicker.  This mask alone has a large effect on the Terrain and Biomes, as you can see.  From here you can pass it to the VoxelSector density samplers and get your 3d terrain as a GPU visualization and / or a CPU mesh from isosurface extraction, as seen previously on the blog.

BTW I have found the ideal width of a river is ~1km, so these pixels are about 1km^2, while I typically generate VoxelSectors at about 1/4km^3 to give a sense of scale.  That makes these preview windows about 512x512km, while the camera in game tends to only go to a view distance of 12km.  From here you can feed masks back into each other, such as for the Fauna mask generator.  :)

Generating Water Mask
Terrain + Biomes Visualization

Flora Visualization

Friday, September 27, 2013

Retro? freyja 3d. Future? Public alpha test for kagura?

Had a free hour between 0145 and 0245, so I "ported" freyja 3d to a modern Linux system.  Kind of funny how much things have changed since then...  it was mostly a hassle due to cmake being so horribad.  If premake4 had been around back then it would have been much easier.  In case you didn't know, the cmake system ties into a hand rolled cross compiler setup for building for platforms like Windows from Linux.  Good times.  I mostly fixed it up to use as a model viewer / tweaker / debugger for some blender exported skinned meshes with animations.

 

Also got the per voxel sector procedural deco system up and running, but didn't post anything about it, as there is a long way to go to get the cubemap gen, depth/height gen, etc in for IBL fun per sector.  Not much to look at, especially without even volumetric fog / aerial perspective enabled.  I was basically just testing 'fill' with this screen capture.  The bright yellow dots are markers that flash on/off to visualize the sector layout and show where the 'smart sector' object is located.  It is mostly for coordinating deco batches and the cached/dynamic maps mentioned above.

About to get back to working on graphics after making several script based camera rigs and character controllers to test the new scripting API improvements, including fast math lib exposure and the ability to dynamically bind script classes from the native side.  May toss in a quick shooter gameplay demo like I did in midgard... that was 11 years ago!  I feel old, as the last engine I wrote from scratch with network gameplay is older than most game companies today.

Once I have a polished network model for gameplay I may do a public alpha test.  If that goes well... ah, kickstarter?  This time it will be an action rpg-ish death match in an open world environment.  I also need to get the Makehuman models animated for demo content I can release.  Currently I'm using "stand-in" animated models that can't be redistributed with the demo, as you likely noticed.  Right now that means you can plug in any md5mesh model and play, and I'm not going to encrypt the script files or shaders, since I want to demo with the web browser as editor so people can mod it.  It'll be like old times.  I thought about purchasing some stock animated models in FBX and converting them... but I would have to reverse engineer the FBX binaries.  If you have some to sell and are willing to export to md5 for me, I will pay extra to save that time for coding on my engine.  ;)

Failing that I may release a single player version to let people kill and loot some mobs, and focus on gameplay net code later.  Hell, going single player + editor worked for minecraft.  I already support a couple of network servers per game client for the editor over httpd and a script console via telnet.  The dedicated server part is easier than this, but I have yet to put time into it.  Time for bed! 0310  :)


Wednesday, September 11, 2013

Better Look

Here's a better look at the Voxel Sandbox with the LoD off with an inlay:

Monday, September 9, 2013

Voxel Terrain Sector Preview

Got the basic voxel terrain in today.  Most of the time was spent on new ways to generate and enqueue Voxels, RenderMesh, and PhysicsMesh in various worker threads and sync them amongst all the subsystems.  Also refactored some of the math library and noise generation while in there to avoid overlap.
Voxel Terrain Sectors 2x1x2

The scale of this image required me to adjust my camera... a typical player would be about as big as a pixel here.  Going to hook this up to my biome and erosion simulation next.  That way flora and fauna will spawn with rivers and 'set piece' locations, and have some more high frequency details... I had some crazy stuff like ocean trenches prototyped, but who would ever see that?  Took out the randomized caves for now too, as they should be created by the biome / erosion system.  The trick will be to convert it to density functions properly -- eg account for overlapping terrain.  I don't think trees grow underground.

This is just a 2x1x2 voxel terrain patch to get a sense of scale... I have mostly been testing with 6x2x6 and 9x3x9 for a complete world.  It is all hooked up to script, so you can spawn more terrain as the player runs around until you run out of floating point sanity, if you like.

Memory overhead is super low at ~1/4 the size of the older heightmap generated terrain.  That said, you could still make worlds far smaller than one walk cycle of animation or a short sound clip.  :D

Planning on adding a real-time editor later that works by altering density in a selected Voxel Sector for artist controlled editing via terrain brush.  It would be nice to make it work with touchscreens via the web editor in real-time too, but that's not high on my list.  The Elevation and Biome splat controller needs to come in next.  Maybe add some slope blending adjustments to the basic shader / materials too.

The most fun will be allowing scripted density functions by exposing the native structs holding function pointers and args to script, so they run in the density summation as addends.  In other words, alter the density functions from script ( data driven ) at any time, not just the voxel field... well, laters.
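
For the curious, the native side of that could look something like this -- a minimal sketch, all names and signatures mine, not the engine's actual API:

/* A density addend bindable from script; the sampler just sums them. */
typedef float ( *KaDensityFn )( const float* args, float x, float y, float z );

struct KaDensityAddend
{
   KaDensityFn mFn;      /* native density function           */
   float       mArgs[8]; /* parameters, tweakable from script */
};

static float SampleDensity( const KaDensityAddend* addends, int count,
                            float x, float y, float z )
{
   float density = 0.0f;
   for ( int i = 0; i < count; ++i )
      density += addends[i].mFn( addends[i].mArgs, x, y, z );
   return density;
}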

Saturday, September 7, 2013

Isosurface Extraction done quick...

Cross post from Google+:

Voxels and Triplanar Mapping only took a day and a half thanks to the excellent docs from Nvidia and my awesome little engine that could. Going to work in real-time editing and maybe tessellation guided by isosurface extraction later if I don't get sidetracked with other big features. This is the isosurface converted on the CPU to a standard mesh so it can use the CPU side physics, eg voxels -> mesh. Will be doing GPU side physics to use with noise based particles / fluid simulation at some point. Also GPU only is good for real-time editing.

BTW I was using marching cubes for this, but I'm still toying with naive surface nets for low end mobile platforms, as they produce much less polygon soup when done. Also with naive surface nets you can generate less dense CPU side physics meshes and then render in higher detail on the GPU.

I also tossed in a preview of a simple 2 texture triplanar 'cave' material for the same shader.

Here are the algorithms:

Chapter 1. Generating Complex Procedural Terrains Using the GPU

Polygonising a scalar field

Wednesday, August 21, 2013

Hey guys. It's been a long time since I posted anything about my engine kagura. I've started working on Physically-Based Lighting, and Clustered Deferred Shading with Forward+ will be the next big renderer change. Yes, it may seem I enjoy rewriting my renderer fairly often. That's why I architected a pluggable renderer! ;) First a screenshot of the latest alpha build with some horrible debug spew all over the place:

Currently I have Lambert with Normalized Blinn-Phong enabled, as I have stopped toying with Oren-Nayar to get to some real business -- removing the MDR system to replace it with a standard HDR system with filmic tone mapping.

Still using random assets from the web and what I can do with blender.  Really need to contract a professional artist to do a rigged character model with a walk cycle soon.




Also the editor has fully generic property mappings now.  This means you can bind script from native code or script to procedurally generate the UI on the target device, such as a desktop web browser, nexus 7, or galaxy note in my case(s).  It's not pretty, but it gets the job done.  From native code this is how a bind looks:

KAEDITOR_PROPERTY_RANGE( "Exposure", 0.1, 4.0, mCamera->Exposure(), 0.01, Exposure );

This generates the JSON subpacket to be issued to the HTML5 editor when requested. It would look something like this for a full refresh:
"KaCameraComponent" : { "Properties" : [ { "name" : "Debug", "type" : "bool", "value" : 0, "script" : "$type.SetDebug( $instance, $value )" },{ "name" : "Culling", "type" : "bool", "value" : 1, "script" : "$type.SetVisbility( $instance, $value )" },{ "name" : "Field of View", "type" : "float", "value" : 60.000000, "script" : "$type.SetFieldOfView( $instance, $value )" },{ "name" : "Exposure", "type" : "range", "value" : { "min" : 0.100000, "max" : 4.000000, "value" : 2.000000, "step" : 0.010000 }, "script" : "$type.Exposure( $instance, $value )" },{ "name" : "Near", "type" : "float", "value" : 0.100000, "script" : "$type.SetClipNear( $instance, $value )" },{ "name" : "Far", "type" : "float", "value" : 12000.000000, "script" : "$type.SetClipFar( $instance, $value )" },{ "name" : "Clear Color", "type" : "color", "value" : [0.219000,0.219000,0.219000,1.000000], "script" : "$type.ClearColor( $instance, $value )" },... ] } }

This is then processed in javascript into HTML widgets with signals, updated values, and handlers. A preprocessor in javascript allows for Lua script expansion to save time, yielding something like this for the KaTelemetry callback ( html -> js -> lua on the remote host / game ):

KaTelemetry( "KaCameraComponent.Exposure( KaGameObject.GetComponent( KaGameObject.FindByGuid( 0 ), 'KaCameraComponent' ), 1.97 )" )

All this happens while you drag your exposure slider around. Yeah, I decided to put it on the camera instead of in the Image Filter material while I was testing the tone mapping and linear color changes. This also allows for very complex scripting across the HTML editor and the game host / remote server, but this is a simple example. I need to clean it up and optimize it so I can do a full study and write-up at some point. :D