Everything posted by JeffR
-
Very cool! Also, as a heads up, you can format a block of code via the code2 tags: open the block with code2=cpp (with it in brackets, of course) for C++-style syntax highlighting, and then close with /code2 (also in brackets). Went ahead and did that for your post :)
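For anyone else wondering what that looks like in practice, usage is roughly this (the inner snippet is just a throwaway example):

```
[code2=cpp]
// any C++ snippet goes here
int main() { return 0; }
[/code2]
```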
-
Sounding pretty good :D
-
So what part of it doesn't look correct, to you? Not saying you're wrong, just trying to understand what you feel is off about it.
-
Don't have access to one, so that limits the options a good bit. Mango seems to be the guy that's got access to all the VR toys, so he may do that at some point, but I don't believe that anyone is actively working on it currently. If we could get our hands on the hardware and stuff it'd be worth looking into though!
-
Anyone have any experience making their own editor?
JeffR replied to chriscalef's topic in TorqueScript
I've implemented a few small editor deals since then and it hasn't been as profoundly terrible, so I think part of the difficulty stemmed from bugs that have since been fixed. That said, I feel there's a better way to approach the entire thing.

The first that occurred to me was basically: rather than swapping the UIs around, the world editor UI is always visible, but it has callback hooks to a Tool class that enacts the specific behavior. This would drastically simplify the swapping shenanigans, as we only have to worry about what the editor UI has as its 'Active Tool'. Change that via a simple function call and it changes the tool it's pointed at, and all interaction functions, such as your mouse clicks and the like, trickle down to the tool for handling. This would make it WAY easier to add new tools and just drop things in, and altogether make things more self-contained per-tool.

Another thought I had, which would tie into the above, is to build in different pointer modes. Ie, regular mouse pointer (like your regular world editor mode), paint brush pointer (like the terrain editors or Forest), etc. So rather than needing an entire render override for the editor view just to enable stuff like the paint mode, you'd set the pointer mode to radial brush mode, set the radius (and an option for grid-alignment for stuff like terrain), and the masks for the raycasts it fires out when rendering and also painting. This way the tool can just worry about the results of the interaction, rather than micromanaging the interaction itself.

Then, obviously, the tools can just have script functions out the wazoo for hooks into the editor suite itself.

Does that approach sound like it'd make more sense?
-
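To illustrate the 'one UI, swappable Active Tool' idea, here's a minimal sketch (in Python rather than TorqueScript for brevity; EditorUI, Tool, setActiveTool and friends are hypothetical names, not actual engine classes):

```python
class Tool:
    """Base class: the editor forwards interaction events to whichever
    tool is active, and each tool overrides only what it needs."""
    def onMouseDown(self, x, y):
        pass

    def onMouseDragged(self, x, y):
        pass


class SelectTool(Tool):
    def __init__(self):
        self.selected = None

    def onMouseDown(self, x, y):
        # A real implementation would raycast into the scene here.
        self.selected = (x, y)


class TerrainPaintTool(Tool):
    def __init__(self, radius=5):
        self.radius = radius
        self.painted = []

    def onMouseDragged(self, x, y):
        # A real implementation would apply a radial brush to the terrain.
        self.painted.append((x, y))


class EditorUI:
    """The single, always-visible editor UI. Swapping tools is just a
    function call; all interaction trickles down to the active tool."""
    def __init__(self):
        self.activeTool = Tool()

    def setActiveTool(self, tool):
        self.activeTool = tool

    # Interaction callbacks the UI would receive from the engine:
    def onMouseDown(self, x, y):
        self.activeTool.onMouseDown(x, y)

    def onMouseDragged(self, x, y):
        self.activeTool.onMouseDragged(x, y)


editor = EditorUI()
select = SelectTool()
paint = TerrainPaintTool(radius=8)

editor.setActiveTool(select)
editor.onMouseDown(10, 20)      # handled by SelectTool

editor.setActiveTool(paint)     # swapping tools: one call, no UI swap
editor.onMouseDragged(30, 40)   # handled by TerrainPaintTool

print(select.selected)
print(paint.painted)
```

The payoff is that adding a new tool is just subclassing Tool and registering it; the editor UI itself never changes.
-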
Well, effectively, that's what the grid shows. It shows spheres with 0-1 roughness and metalness, one for each axis. Same idea as this display: http://i.imgur.com/Gt7VoVn.jpg
-
It should generate a console.log file in the same directory as the exe.
-
@Chelaru Can you clarify what looks WIP? Do you not like the demo model? Or does the PBR look incorrect to you?
-
openSimEarth gets statics, and goes back to decal roads.
JeffR replied to chriscalef's topic in Show-off
Yeah, I did pull that to test; there was some issue with the terrain texture fix they added, so I'll try and get that sorted out and then we can get it merged in. As for the edge issue, I did create a new map with 2 terrains side by side, both flat, and plunked the Cheetah right on the seam, and all 4 wheels (which use raycasts) correctly matched to the ground. So the best guess so far is that, going off your mention that it seems to be a problem when the opposite sides of the terrain block have differing heights, the terrain is trying to have wrapping occur even if that setting is off. If that's the case, that should narrow things down quite a bit.
-
openSimEarth gets statics, and goes back to decal roads.
JeffR replied to chriscalef's topic in Show-off
I'd seen a few other people comment about casts and stuff breaking down on terrain edges, but hadn't had much luck replicating it. I take it it happens at every block seam?
-
Do you have a log file from when it crashes from loading a model?
-
Can you post some screenshots? Bit easier to have an idea of graphical problems that way.
-
Are you familiar with the entity/component stuff I've been working on? The E/C stuff has a redone collision scheme, making it way easier to set up collisions on stuff in whatever way the user wants.
-
T3D's gotten a lot of work done on it to make it use SDL2 for most of the platform stuffs. It may be worth checking that out and seeing if that'd be more prudent than trying to maintain the platform layers manually. If all the kinks with the platformSDL stuff get worked out, then conceptually you'd only need to maintain tidbits of glue code, rather than entire platform layers.
-
Dynamically, as in while the game is running? So, like leg IK? In that case, the ShapeConstructor methods wouldn't work. Those are for configuring the shape to your needs before spawning. Once it's created, the ShapeConstructor methods don't affect it outside of reloading. To manually change bone positions in code during runtime, you'd need to look into the animation code in ShapeInstance, and the HandsOff mask for bones. That lets the bones be animated by code rather than animation sequences.
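To give a feel for the hands-off idea, here's a toy sketch (Python, and none of these names are the actual TSShapeInstance API): bones flagged as hands-off ignore the animation sequence and take whatever transform code assigns, e.g. an IK solver's output.

```python
def blend_pose(sequence_pose, code_pose, hands_off):
    """Per-bone: use the code-driven transform for hands-off bones,
    the sequence-driven transform for everything else."""
    final = {}
    for bone, seq_transform in sequence_pose.items():
        if bone in hands_off and bone in code_pose:
            final[bone] = code_pose[bone]   # driven by code (e.g. leg IK)
        else:
            final[bone] = seq_transform     # driven by the animation
    return final

# Transforms reduced to 2D offsets just for illustration:
sequence_pose = {"hip": (0, 0), "knee": (10, 0), "foot": (20, 0)}
code_pose = {"knee": (12, 3), "foot": (22, 1)}  # e.g. IK solver output
hands_off = {"knee", "foot"}

pose = blend_pose(sequence_pose, code_pose, hands_off)
print(pose)
```

The real thing works on full node transforms inside the shape instance's animation pass, but the masking logic is the same shape of problem.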
-
Feel free to copy the rig once that dude is ready to go :)
-
I needed a baseline player model and rig for prototyping, and eventually to be used for the characters in my games. So I rigged up a simple Low-poly dude to a simple skeleton, with some nice controls to make it easier to utilize effectively. http://ghc-games.com/public/RedGuy.png Figured I'd toss him up here, and see if anyone had a use for it. As said, it's my baseline rig, so I plan to use it in general, but if others have an interest, I could take suggestions/ideas for some improvements. Who knows, maybe given how light Red Guy is comparatively(and that there'd be source art), we could make him the default player model in the templates :P Link to him is here
-
So, given how weird git can be, you guys may not be familiar with an easy way to pull and test PRs. As such, I've started a new wiki page and added an instruction set for utilizing TortoiseGit for pulling down PRs, here. Anyone that's familiar with other methods or tools is encouraged to add onto that page with a new section. Like the page says, the more people we get testing on the PRs, the more assured we can be that they work properly and don't introduce any new issues.
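For anyone who'd rather stay on the command line, vanilla git can do the same thing: GitHub exposes every pull request at the read-only ref pull/&lt;ID&gt;/head, so you can fetch it straight into a local branch. The PR number below is just a placeholder; this snippet prints the two commands you'd run from inside your local clone:

```shell
# Placeholder PR number; substitute the one you want to test.
PR=1234
# Fetch the PR into a local branch, then switch to it:
echo "git fetch origin pull/${PR}/head:pr-${PR}"
echo "git checkout pr-${PR}"
```

When you're done testing, a plain `git checkout development` (or whatever your working branch is) puts you back.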
-
To clarify on it, with the spacing correction back in there, does it still not import correctly into blender without first passing it through FBX and OBJ?
-
Generated it via CMake
-
Shadergen just generates the procedural shaders. The material classes largely act as the binding between the procedural shaders and the classes to be rendered. Classes that utilize the common, pre-written shaders, like the water classes, just call to the shaders directly.
-
@kylehagin Namely that breakpoints aren't triggered, so I can't stop the application in certain areas, check variable values, etc. Does ddd allow for that?
-
I'd made a passing mention of it in the 3.8 announcement blog as a goal for 3.9, but the plan is to start work on a node-based material/shader authoring interface and an updated shadergen. I'd gotten started on the node-graph UI stuffs; most of the work is trying to figure out exactly how to design the new shadergen system to operate to generate everything. I'm doing some prelim R&D on the subject, and once I have something provable, I'll post an update about it. If you guys have any ideas or feedback on how you think it should operate, I'm completely all ears. If we can hammer out a general approach, it'd be easier to iterate the specifics of the implementation.
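As a strawman for that discussion (pure Python sketch, nothing from the actual codebase, and the node/class names are made up): the core of a node-based shadergen tends to reduce to a graph walk where each node emits its snippet of shader code and hands the variable it wrote to downstream nodes.

```python
class Context:
    """Collects emitted shader lines and hands out unique variable names."""
    def __init__(self):
        self.lines = []
        self.counter = 0

    def fresh(self, base):
        self.counter += 1
        return f"{base}{self.counter}"


class TextureSample:
    def __init__(self, sampler):
        self.sampler = sampler

    def emit(self, ctx):
        var = ctx.fresh("texCol")
        ctx.lines.append(f"vec4 {var} = texture({self.sampler}, uv);")
        return var


class Constant:
    def __init__(self, value):
        self.value = value

    def emit(self, ctx):
        var = ctx.fresh("const")
        ctx.lines.append(f"vec4 {var} = vec4({self.value});")
        return var


class Multiply:
    def __init__(self, a, b):
        self.a, self.b = a, b

    def emit(self, ctx):
        a = self.a.emit(ctx)   # upstream nodes emit first...
        b = self.b.emit(ctx)
        var = ctx.fresh("mul")
        ctx.lines.append(f"vec4 {var} = {a} * {b};")
        return var             # ...and feed their output variables downstream


# Graph: diffuseMap sample * 0.5 -> final color
graph = Multiply(TextureSample("diffuseMap"), Constant(0.5))
ctx = Context()
result = graph.emit(ctx)
shader = "\n".join(ctx.lines + [f"fragColor = {result};"])
print(shader)
```

A real version would memoize emit() so shared nodes aren't duplicated, handle types other than vec4, and split vertex/pixel stages, but the walk-and-emit shape is the part worth agreeing on early.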
