
andrewmac

Everything posted by andrewmac

  1. The two most expensive parts of it, the voxelization and the light injection, couldn't *easily* be multithreaded, but they certainly could be. The problem is that both touch the geometry and light lists, and light injection relies on raycasts; none of those are thread safe. I personally don't believe we'll ever see this go real time. We could probably do something like relighting, where you call a command to reinject the lighting and propagate it again. That would help for scene lighting changes, and perhaps calling it per-tick during a time-of-day sequence would be acceptable, but I really don't think we'll see it happening per-frame and being worth the cost. We won't know for sure until some artists start playing around with it, but I do believe the best way to utilize these volumes is, as you said, placing multiple volumes around the level and positioning/tweaking them until you get a result you like. They aren't an exact science; sometimes you'll really like the results in one spot but not in another, and artist-placed volumes will alleviate that.
  2. @az From digging in the DDS code for a minute I'd say you'll want to look at this bit: https://github.com/Azaezel/Torque3D/blob/9b2b4ed7da47a33cbdcea92c33bcd7d6cdb4fcb3/Engine/source/gfx/bitmap/ddsLoader.cpp#L751-L770 It puts the DDS into a GBitmap and then saves it to disk. Should be trivial to change that into returning the GBitmap after creating it. Then you can just treat it like any other map.
  3. @jay1ne : Not too much longer. I've gotta merge JeffR's improved voxelization, get it properly detecting material texture colors, sort out an issue with normals, and then make sure my reflection angles are all correct. After that, though, I think it'll be ready to be tested in some dev environments. There's a ton of improvements and tweaks that can and will be done for a while, but as far as being able to place a volume, generate some GI and use it in a game? Maybe a week away. @buckmaster : I'm not personally diving into the PBR side of things; az has a handle on that. I did, however, lay all the groundwork and make material info available in the reflection shader, allowing az to shape the reflections in a more PBR-esque way. PBR really needs something to do glossy reflections that doesn't cost as much as dynamic cubemaps, and while this doesn't completely fill the void, it's a step in the right direction. I think the glossy reflections from this combined with a screenspace reflection shader should produce plausible glossy reflections for PBR, which is a HUGE part of PBR. PBR without reflections is like a sandwich without bread.
  4. Option 1 sucks for artists, as they'd have to "compile" a combined map every time they wanted to make a slight change to the maps used. Also, option 4 is not really an option; it's a supplement to option 1, since the tool would spit out the RGB map to go with that option. Option 2 is the most flexible, but it makes the shaders and material features uglier, since you have to account for two possible input methods: either the combined map or the separate maps. The cleanest and fastest way to handle this would be to combine your 3 optional maps into 1 combined map.. but that just leads to.. Option 3, which I believe to be the best option. Artists supply roughness, metalness, and AO as separate maps and change/tweak them as they please, and then during the material loading phase we generate a single combined map to upload to the GPU by grabbing the specific channels from each of the 3 maps. This is really quite trivial as long as the maps are the same resolution (which they should be).
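The load-time channel packing from option 3 can be sketched like this. This is a minimal standalone sketch, not engine code: `packRMA` is a made-up name, and it assumes all three maps are already decoded to equal-sized 8-bit buffers.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Pack three single-channel maps (roughness, metalness, AO) into one
// interleaved RGB map at material load time. Assumes all maps share
// the same resolution, as recommended above.
std::vector<uint8_t> packRMA(const std::vector<uint8_t>& rough,
                             const std::vector<uint8_t>& metal,
                             const std::vector<uint8_t>& ao)
{
    assert(rough.size() == metal.size() && metal.size() == ao.size());
    std::vector<uint8_t> combined(rough.size() * 3);
    for (size_t i = 0; i < rough.size(); ++i)
    {
        combined[i * 3 + 0] = rough[i]; // R = roughness
        combined[i * 3 + 1] = metal[i]; // G = metalness
        combined[i * 3 + 2] = ao[i];    // B = ambient occlusion
    }
    return combined;
}
```

The resulting buffer is what would be uploaded to the GPU as the single combined texture.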
  5. Yeah, as az said, translucency only bleeds light through. For this reason, it isn't expensive to apply. It would have worked well for foliage, I think, as you could bleed light through the foliage.
  6. http://i.imgur.com/yj8MWI2.gif Good point. I've got nothing on that one.
  7. He shudders at PHP because he's one of those Haskell guys :P
  8. http://i.imgur.com/HFR45lz.jpg
  9. The static/dynamic shadowmap mixing will use more memory because it keeps both maps, but the total rendered polys should be roughly the same, since stock renders all the shadows all the time. Unless things are being shadow-cast more than once, there should be no difference in polys with the static/dynamic shadow system. I have no idea why deferred shading is reporting more polys in those screenshots. The reflection on the water seems very different, and it would have a major impact on polys if it were not working correctly or were rendering different distances than stock. Additionally, if you're using dynamic cubemaps on anything for PBR in that scene, you're gonna tank your performance and increase your polys significantly.
  10. I think the easiest place to multithread that will have a noticeable impact and isn't a crazy project would be the asset loading. I started the research on it one day and it didn't look too bad. The key thing is creating some kind of state an asset can be in that's essentially "Loading". It won't be displayed but will still exist as an object in the world. Once that's in place you can just offload the asset loading to another thread and then mark the asset as ready/loaded once it's done processing. This should alleviate the hangs you experience when you quickly rotate the camera and do other things that cause hiccups. The man who multithreads that renderer deserves a lifetime supply of whiskey. He'll also need it for the PTSD he'll surely have after the project is complete. I've looked it up and down 3 or 4 times now and I keep coming to the same conclusion: it would make more sense to gut it, build it properly, and then go through all the existing code and update it to use the new threaded render system. It's what I started concluding when doing BGFX. In some cases you can fix a house by replacing one wall at a time.. but I think in this case the house should be torn down and rebuilt. Just my two cents anyway.
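The "Loading" state idea could look roughly like this. A minimal sketch using std::thread and an atomic state flag; `Asset` and its members are hypothetical, not Torque classes.

```cpp
#include <atomic>
#include <cassert>
#include <thread>

// The three states an asset can be in. While Loading, the asset still
// exists as an object in the world but is skipped by rendering.
enum class AssetState { Unloaded, Loading, Ready };

struct Asset
{
    std::atomic<AssetState> state{AssetState::Unloaded};

    // Placeholder for the real disk read / decode work.
    void loadBlocking() { /* read file, decode, fill staging buffers... */ }

    // Offload the load to a worker thread; the main thread polls
    // `state` and starts displaying the asset once it flips to Ready.
    void loadAsync()
    {
        state = AssetState::Loading;
        std::thread([this] {
            loadBlocking();
            state = AssetState::Ready;
        }).detach();
    }
};
```

In practice you'd push these jobs onto a worker pool rather than spawning a thread per asset, but the state machine is the important part.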
  11. I think we graduated from cornell box to sponza. Still have some work to do though: Direct Lighting: http://i.imgur.com/KVnlkFl.jpg Direct + Indirect: http://i.imgur.com/heYZTyW.jpg
  12. I spent the last few days experimenting with spherical harmonics. It seems like the natural progression people take when doing light propagation, and I see now why that is. With spherical harmonics you can store intensity of a color in a direction instead of just storing the color. You can also store more than one by adding them together. It's some neat math. I made this javascript demo while figuring them out: http://andrewmac.ca/spherical_js/2d_lpv.html What you see is a wall lit by two lights. As you press propagate it will propagate the light outward in the reflected direction. What's neat is it's a single row of voxels holding light data from both lights. As you propagate the two lights separate and go in opposite directions. My old propagation method was more or less a blur and would have just blurred those colors together into yellowish and pushed them outward in all directions. This new propagation method produces much better results: 2KtFuiEP2Bk As shown in the video propagated light will show up on dynamic objects. They just can't occlude it, so if you stand in between the box and the wall the box will still be lit green from the wall. Also, the player will have green ambient light on sides not facing the green wall. It's not perfect, but it's plausible.
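For the curious, the core of the spherical harmonics trick can be shown in a few lines: a two-band (4 coefficient) SH set stores intensity per direction, and sets from different lights can simply be added together. A minimal sketch; function names are illustrative, not engine code.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Two-band SH: 1 constant term + 3 linear terms.
using SH4 = std::array<float, 4>;

// Real SH basis functions for bands 0 and 1, evaluated in a direction.
SH4 shBasis(float x, float y, float z)
{
    return { 0.282095f, 0.488603f * y, 0.488603f * z, 0.488603f * x };
}

// Project a directional light of a given intensity into SH.
SH4 shProject(float x, float y, float z, float intensity)
{
    SH4 b = shBasis(x, y, z);
    for (float& c : b) c *= intensity;
    return b;
}

// Two lights in one cell: just add their coefficient sets.
SH4 shAdd(const SH4& a, const SH4& b)
{
    SH4 r;
    for (int i = 0; i < 4; ++i) r[i] = a[i] + b[i];
    return r;
}

// Evaluate the stored lighting in a direction (clamped to zero).
float shEvaluate(const SH4& sh, float x, float y, float z)
{
    SH4 b = shBasis(x, y, z);
    float sum = 0.0f;
    for (int i = 0; i < 4; ++i) sum += sh[i] * b[i];
    return std::fmax(sum, 0.0f);
}
```

Evaluating a single projected light gives a strong response along the light's direction and almost nothing opposite it, which is exactly why two opposing lights can share one cell and still separate during propagation instead of blurring together.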
  13. I just did a push with some usability updates. I added a field flag so the gui inspector can display buttons instead of the hacked checkboxes: http://i.imgur.com/FPQmJtA.png If anyone wants to try it out, you just place the volume around the area you want to use it on and then go through the steps from top to bottom:
    - voxelSize: the size of each voxel in torque units. When you change this you need to regen the volume.
    - regenVolume: voxelizes the geometry within the volume and clears all the light and propagation grids. You can visualize it in the editor with showVoxels.
    - injectLights: pulls the pointlights from the scene (pointlights only for now, will add spot and sun very soon). Visualize in the editor with showDirectLight.
    - propagateLights: propagates the lights through the grid. You can press this as many times as you want until you get a result you like. Visualize it in the editor with showPropagated.
    - exportPropagated: exports the results of propagation to the GPU to be displayed.
    - exportDirectLight: exports the results of injectLights to be used for reflections (optional). Check off renderReflection to enable the shader to display reflections within the volume. This property will save with the volume so you can turn it on/off.
    - saveResults/loadResults: exports the results from your propagation grid and your direct light grid to the file path chosen in fileName. These results are loaded back in when the mission loads (or when you press the load button). The saved results hold only what needs to be sent to the GPU; you cannot propagate them further or work with them in the editor after loading from saved files. You have to start back at regenVolume and go through all the steps again. By doing it this way the final results are loaded straight to the GPU from temp buffers, which gives minimal memory usage when using them in a game.
  14. I don't think you'll ever get any kind of good shadows out of it. You want your pixel perfect shadows for things like fences. You'll always need a dynamic shadow solution. The shadows can be fixed, it's just a different project is all.
  15. @8bitprodigy Correct. @buckmaster Those are just stock pointlight shadows. Fixing the shadows is beyond the scope of this project, but perhaps I'll take a look at them when/if I drag out the old Variance Shadow Maps project I have on my github (branch is vsm).
  16. I just did a push to the branch with two great updates. First, I cleaned up the options in the properties a bit and added the reflection shader as an option that's rendered on top of the regular stuff. You can turn it on/off. Second, voxels now detect the diffuse color assigned to the material. The color detected is blended with the light that bounces off it. This finally gives us some color bleed and reflection color: http://i.imgur.com/Kdevy2A.png http://i.imgur.com/2iUxAWd.png
  17. andrewmac

    Better Foliage

    Maybe if the normals were just weighted heavily in the Z+ direction using a configurable value in the material from 0 - 1. That way it wouldn't be so cut and dry as all upward-facing normals, but instead would let you scale the normals towards Z+ using a value from 0 - 1. I'm trying to avoid having to place a new material info flag into the gbuffer to flag foliage and light it differently. If I could provide a workable solution by altering the normals before the lighting stage, that would be ideal. Worst case scenario, I'll just bite the bullet and make a custom lighting flag for foliage.
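The normal-biasing idea boils down to a lerp toward Z+ followed by a renormalize. A sketch of the math only; `biasNormalUp` is a made-up name, and the real version would live in the shader before the lighting stage.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Blend the geometric normal toward straight-up (Z+) by a material
// weight in [0, 1], then renormalize. A weight of 0 leaves the normal
// unchanged; a weight of 1 makes every normal point straight up.
Vec3 biasNormalUp(Vec3 n, float weight)
{
    Vec3 r{ n.x * (1.0f - weight),
            n.y * (1.0f - weight),
            n.z * (1.0f - weight) + weight };
    float len = std::sqrt(r.x * r.x + r.y * r.y + r.z * r.z);
    if (len > 0.0f) { r.x /= len; r.y /= len; r.z /= len; }
    return r;
}
```

Intermediate weights give the "scaled toward Z+" look without the hard cutoff of forcing all normals upward.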
  18. andrewmac

    Better Foliage

    Would a potential engine-side solution be a material property that overwrites the normals that are written into the gbuffer for that material with upward-facing ones? You could enable it on the grass material and the grass would be lit as though it had all upward-facing normals.
  19. So, JeffR had another one of his crazy ideas: couldn't we trace a ray into the voxel grid in real time to do glossy reflections? Last night I set out to answer that question. After a number of hours of fighting with angles, normals, etc., I emerged victorious! -myAkLhwP4Q As you can see, the calculations are still a little rough around the edges, but I'd say it's a good proof of concept for the idea. Combine this with a screen-space reflection shader and I think we could get pretty decent glossy reflections that still work with off-screen objects.
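The core of the trace is just marching the reflection ray through the voxel grid until something solid is found. A minimal CPU-side sketch of the idea; the real version runs in the reflection shader against the light volume, and the names, grid size, and step size here are purely illustrative.

```cpp
#include <cassert>
#include <cmath>

constexpr int GRID = 8;

struct Hit { bool found; int x, y, z; };

// Fixed-step march of a reflection ray through a voxel occupancy
// grid, returning the first solid voxel hit. Coordinates are in
// voxel units; the ray gives up once it leaves the volume.
Hit raymarch(const bool occupancy[GRID][GRID][GRID],
             float ox, float oy, float oz,
             float dx, float dy, float dz)
{
    const float step = 0.25f; // quarter of a voxel per step
    for (int i = 0; i < GRID * 8; ++i)
    {
        int x = (int)std::floor(ox);
        int y = (int)std::floor(oy);
        int z = (int)std::floor(oz);
        if (x < 0 || y < 0 || z < 0 ||
            x >= GRID || y >= GRID || z >= GRID)
            break; // ray left the volume
        if (occupancy[x][y][z])
            return { true, x, y, z };
        ox += dx * step;
        oy += dy * step;
        oz += dz * step;
    }
    return { false, -1, -1, -1 };
}
```

A production version would use a DDA traversal instead of fixed steps and sample the stored light color at the hit voxel, but the proof of concept is this simple.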
  20. Why not just make a forum called Resources below Blogs?
  21. I think it's something like 128 players. What you should ask yourself is what kind of things will your server be responsible for? If it's going to handle things like collision, raytracing projectiles, etc then I would take torque and start making the modifications you need to support the amount of clients you're planning for. The main reason being that in order to do proper collision and raytracing you'll need the information about the geometry and you're just gonna be reinventing the wheel making your new server able to load all of this when torque already has the infrastructure to do it all. On the other hand, if your server is there to just keep score, or distribute certain elements across lots of clients (mingleplayer, like dark souls, you put down a message somewhere and other players can see it) then I would write my own server to do it.
  22. I just discovered that I messed up the code loading the final result into the volume texture. By loading it in the wrong order, x was z and z was x, so nothing was displaying right. After fixing that, the results from the cornell box are a lot better: http://i.imgur.com/nGj7qjy.png What it really needs now is the ability to pull the color from the material and store it in the geometry voxel so the bounced light can pick up that color. That way it'll light the backside of the box with red on the left side and green on the right side.
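The bug above comes down to index order when flattening (x, y, z) into the linear buffer: writing with one convention and reading with x and z swapped silently transposes the whole volume. A small sketch of the convention, assuming a row-major x-fastest layout for illustration (not necessarily the engine's actual layout).

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Flatten a (x, y, z) voxel coordinate into a linear buffer index,
// x-fastest. Every reader and writer of the buffer must agree on
// this exact formula, or the volume comes out transposed.
inline size_t flatten(size_t x, size_t y, size_t z,
                      size_t width, size_t height)
{
    return x + width * (y + height * z);
}
```

Pinning the convention down in one shared helper like this is the easy way to keep the write and read sides from drifting apart.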
  23. @buckmaster fuuuu @LukasPJ this is pretty neat. I like this system. @TRON HI TRON!