Everything posted by JackStone

  1. Hello, thank you for your reply. I had a good look at that link, and I do think it helped a lot. It is possible that I am closer than I thought. I actually have code that rotates a point (using Rodrigues' rotation formula). However, this seems to rotate about the wrong origin point. I think if I could translate the nodes back to the origin, rotate them, then translate them again by the same distance, I could accomplish the rotation. The angle to rotate by seems to be given by the code you posted. Wikipedia says about Rodrigues formula: "By extension, this can be used to transform all three basis vectors to compute a rotation matrix in SO(3), the group of all rotation matrices, from an axis–angle representation" So, this should work... I will give it a go as soon as I can, and post an update either way, thanks again!
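     For reference, a minimal sketch of the translate-rotate-translate idea described above, written in plain C++ rather than T3D's math types (the vector struct and function names here are illustrative only, and the pivot is assumed to be the planet centre):

     #include <cmath>

     // Illustrative vector type, not T3D's Point3F.
     struct Vec3 { double x, y, z; };

     Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
     Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
     Vec3 operator*(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
     double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
     Vec3 cross(Vec3 a, Vec3 b)
     {
         return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
     }

     // Rodrigues' rotation of a point about an arbitrary pivot: translate so
     // the pivot sits at the origin, rotate about the unit axis k by theta,
     // then translate back by the same amount.
     Vec3 rotateAboutPivot(Vec3 p, Vec3 pivot, Vec3 k, double theta)
     {
         Vec3 v = p - pivot;                       // pivot to origin
         Vec3 vrot = v * std::cos(theta)
                   + cross(k, v) * std::sin(theta)
                   + k * (dot(k, v) * (1.0 - std::cos(theta)));
         return vrot + pivot;                      // translate back
     }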
  2. Sorry, I only just noticed this. I have looked at those tutorials in detail, thanks Daniel, they are hugely useful.
  3. Hi, Thanks for your reply, I think you're right, translating probably won't do it. Ah, and there is matrix math again :P I'm still not great with that stuff, it's probably why I haven't figured this out yet. From what I am told, I don't need to actually convert to a spherical coordinate system, is this correct? I can use the translation algorithm on the Cartesian coordinates? I will have to look into matrix math then, and see if I can figure this out. Thanks!
  4. I have the 3D gravity completely done, that was fairly easy. That just pulls the player towards the center of the world though; it doesn't seem to help with moving/offsetting the terrain. The player is already moving in a spherical path, really, since the terrain itself is working and paging properly. The only thing I need to do is reset the positions of the terrain and the player when the player has walked a certain distance. This sketch illustrates what is essentially happening: the player is standing at the origin, with the nodes close by rendering (the red X is the origin, and the nodes with the red lines are rendering). They then walk towards the blue X, which is a distance of 10,000 meters. They have now moved in a curve around the terrain. I am now rendering the nodes with the blue lines under them. This works fine. What I now need to do is move the player back to the red X, which is simple, AND move the BLUE nodes back to the origin as well, such that the player doesn't notice a transition. Moving the nodes is easy, but they are always offset in some way.
  5. Of course, you are absolutely right! I realised this myself a day or so ago, and your post put the final little piece into place. The problem wasn't with the *planet*, it was with the *player*. I thought I was making this easier by moving the player along one axis only, but of course that is essentially what caused the problem. What I need to do is move the *player* in a circular path. That is going to take a little time to figure out with my current implementation, but with all the work I've done on spherical rotation so far, it shouldn't be too difficult! Thanks for your help, that was the last little piece of the puzzle that I needed!
  6. Yeah, I am basically just doing this: nn1 += moveoffsetvec; Where the moveoffsetvec is a simple vector. I was just doing: nn1.x -= 10000, which was even simpler. However, if you look at this image: http://phoenixgamedevelopment.com/blog/wp-content/uploads/2016/09/PhoenixGameDevelopment-04_09_2016-01_43_54-AM.jpg If I move point 4 by the offset distance, it will be beneath p3, since it is offset on Z. So, I am assuming that this is due to the curvature of the planet, and I need to move the new nodes along that curvature, but I may be totally wrong about that...
  7. Hi, I have run into something of a brick wall with my spherical terrain implementation (again!). I may have gotten myself into a situation that can't actually be resolved here; I'm still trying to figure out what the problem is exactly, and I'm hoping someone here can shed at least a little light on what's going on. Basically, I have a very large planet rendering in T3D by dividing it up using a quad tree and rendering only nodes close to the player. For the purposes of this argument, the player is close to the surface of the planet and has moved 10,000 meters into the next "zone", so I want to render the next set of nodes. This is what gnuplot shows: http://phoenixgamedevelopment.com/blog/wp-content/uploads/2016/08/PhoenixGameDevelopment-28_08_2016-08_06_07-AM.jpg This is correct. P1 (pink) are the original nodes at the player's starting position; these are now no longer needed, as the player has moved on to P2 (green), which is now rendering. So far so good. The problem is that I need to implement a floating origin system; I can't have the player walking all around the planet, since there will be all kinds of precision issues. So, I reset the player back to the origin, and I now need to move the P2 nodes back along the same vector that I moved the player back (let's assume it's 10,000 units along the X axis for now). Now, the player should be at the origin, and the P2 nodes should be rendering as if they were in their original position, but they have been moved back 10,000 units. The X axis is indeed correct, but for some reason, the nodes are in the wrong positions in the Y and Z axes. Each time I increase the distance by 10,000 units, this error increases, causing the nodes' positions to be completely incorrect. This is the best image I have of what's happening. This is after just one movement by 10,000 units: http://phoenixgamedevelopment.com/blog/wp-content/uploads/2016/08/PhoenixGameDevelopment-28_08_2016-08_19_15-AM.jpg These two groups of nodes should be in exactly the same place, but as can be seen, the second set of green nodes is offset along the Z axis. I have tried all manner of solutions, from changing the positions in the spherical terrain code, changing them in prepRenderImage, scaling and rotating, using Rodrigues' rotation formula, and adding an offset to the Z axis to try to compensate for the shift. Nothing seems to work. I have confirmed that when I move the nodes by 10,000 units, then move them back by 10,000 units, they are *exactly* the same, which means that whatever is happening is not some random occurrence; for some reason there is an offset being added to the Y and Z axes of these nodes when I change their X values. I suspect that the fact that the nodes are on a curve (being part of a spherical terrain surface) must be the issue. It could also be to do with the local coordinates of the nodes, which are between -1 and +1; could this be an issue? Another thing that I just thought of is some kind of an issue caused by the scale of the actual world object in the mission editor. Does anyone have any advice on where I could continue looking?
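     To make the curvature effect above concrete, here is a minimal sketch (plain C++ with illustrative numbers, not T3D code; the radius value is an assumption) showing how far a surface node 10,000 m along the sphere sits below the starting point. A straight translation back along X cannot cancel that drop, which would explain nodes ending up offset in Z; only a rotation about the planet centre maps surface points back onto surface points.

     #include <cmath>
     #include <cstdio>

     int main()
     {
         const double R = 6371000.0; // assumed Earth-like planet radius, metres
         const double s = 10000.0;   // arc length the player walked

         // A node s metres along the surface sits lower (towards the planet
         // centre) than the start point by R * (1 - cos(s / R)).
         double drop = R * (1.0 - std::cos(s / R));
         std::printf("vertical offset after %.0f m: %.2f m\n", s, drop);
         // ~7.85 m for these numbers, and it grows with every additional step.
         return 0;
     }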
  8. I actually got this working, and it turned out to be so simple. I just needed to change runAcc.z = 0; to runAcc.z = runAcc.z * mDataBlock->airControl; inside else if (!mSwimming && mDataBlock->airControl > 0.0f). That enables full 3D rotation and movement. This is pretty cool; I can walk all around any object in 3 dimensions now.
  9. Alright, the camera issue is fixed. I now just have one issue left, and that is that the player's movement direction changes when their orientation changes. I.e., pressing the forward key when the player is oriented with Z up moves them forward, but when Y is up, the forward key moves them in a different direction. Do I need to somehow rotate the move vector by the mOrient quaternion?
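     That symptom usually means the move input is still being treated as world-space when it is really expressed in the player's local frame. A rough sketch of rotating it by the orientation follows; it is not verified against the engine, and it assumes QuatF::setMatrix() and MatrixF::mulV() behave as in stock T3D.

     // Sketch only: rotate the local move input by mOrient so "forward"
     // follows the player's current orientation. Adjust the calls to
     // whatever your branch of the engine actually exposes.
     Point3F localMove(move->x, move->y, move->z);

     MatrixF orientMat(true);         // identity
     mOrient.setMatrix(&orientMat);   // orientation quaternion -> rotation matrix

     Point3F worldMove;
     orientMat.mulV(localMove, &worldMove);  // local frame -> world frame

     // worldMove is what should drive the acceleration from here on.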
  10. Hi, Thanks again for your help. I am using standard move, and the code you posted *almost* fixed the issue. The camera is stable, but now, for some reason, the player's head moves by itself to face a particular direction, and every time I move the camera, it returns to that same spot. So there must be something causing the player to move their head? It's not the doQuatOrientation code, I checked that. I'm also having issues combining the move vector (the actual player movement) with the gravity vector. The two vectors seem to cancel out at certain points, making the player unable to move. I am currently combining them simply by adding them and normalising: Point3F m = Point3F(move->x, move->y, move->z); gravvec += m; gravvec.normalize(); VectorF acc(gravvec); This, again, *almost* works, but the direction of the player's movement along X and Y changes as their orientation changes. I just saw your edit about the pack/unpack, I will have a look at that tomorrow. I am slowly making progress with this, thanks again!
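     One reason adding and normalising can cancel out: when the move vector points against gravity, their sum shrinks towards zero. A hedged alternative sketch (not the resource's code; gravvec, worldMove, and the two strength scalars are placeholders) is to strip the component of the movement that lies along gravity and apply the two accelerations separately:

     // Sketch only. Assumes gravvec is the unit vector towards the planet
     // centre and worldMove is the move input already rotated into world
     // space (see the previous sketch). gravityStrength and moveStrength
     // are hypothetical tuning scalars.
     Point3F tangentMove = worldMove - gravvec * mDot(worldMove, gravvec);

     // Gravity keeps pulling the player down while the input only ever
     // pushes them across the surface, so the two can no longer cancel.
     VectorF acc = gravvec * gravityStrength + tangentMove * moveStrength;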
  11. Irei1as, you, sir, are a genius! Thank you very much for that! Your code works great! I have the orientation working perfectly now. However, I have run into issues with the camera, as you said. I am using a plane at the moment, but even so, as the player rotates the camera, it seems to interfere with the new code that I added to orient the player. I assume I have to combine the "look" quaternion with the new quaternion that I added; do you have any ideas how I would go about doing this? My current code is:
     if (!contactNormal.isZero())
     {
        Con::printf("CONTACTNORMAL %f %f %f", contactNormal.x, contactNormal.y, contactNormal.z);
        doQuatOrientation(contactNormal);
     }
     else
     {
        Con::printf("NO CONTACT NORMAL");
        Point3F cogvec = Point3F(0, 0, 0) - getPosition();
        cogvec.normalize();
        cogvec *= -1;
        doQuatOrientation(cogvec);
     }
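     On combining the two rotations: quaternions compose by multiplication, so one approach is to build the look rotation as a yaw about the player's newly aligned up axis and multiply it with the alignment quaternion. The sketch below is an assumption, not Irei1as's code; it assumes QuatF::mul(a, b) sets the quaternion to a * b as in stock T3D, and lookQuat is a hypothetical quaternion built from the camera yaw.

     // Sketch only: compose surface alignment with the player's look rotation.
     QuatF alignQuat;   // the orientation produced by doQuatOrientation()
     QuatF lookQuat;    // hypothetical: yaw about the player's local up axis

     QuatF finalOrient;
     finalOrient.mul(alignQuat, lookQuat);  // apply the look on top of the alignment
     mOrient = finalOrient;
     // Depending on the engine's multiplication convention, the two
     // arguments may need to be swapped.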
  12. Ok, so I think I have reduced this to a simple test case. When the player is standing on top of a cube, the contact normal is (0,0,1), and the quaternion should be QuatF(Point3F(0,0,0)) for the player to be correctly rotated. When they stand on the side of the cube, facing the negative X axis, the contact normal is (-1,0,0), which it should be. The quaternion here, for correct facing, should be QuatF(Point3F(0,1.57,0)), which is basically a 90 degree rotation about Y. The question is, how do I mathematically derive that quaternion from the contact normal and the player position, etc., alone?
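     The standard derivation is to rotate the world up axis onto the contact normal: the rotation axis is the cross product of the two vectors and the angle is the arc-cosine of their dot product. A hedged sketch follows; it is not engine-verified (it assumes the usual T3D helpers mCross, mDot, mAcos, AngAxisF, and QuatF::set), and the degenerate case where the normal is parallel to world up needs the special handling shown.

     // Sketch: build the quaternion that rotates world up (0,0,1) onto the
     // contact normal, using the axis-angle form.
     Point3F worldUp(0.0f, 0.0f, 1.0f);
     Point3F n = contactNormal;
     n.normalize();

     Point3F axis;
     mCross(worldUp, n, &axis);               // rotation axis
     F32 angle = mAcos(mDot(worldUp, n));     // rotation angle

     if (axis.isZero())
        axis.set(1.0f, 0.0f, 0.0f);           // up and normal are (anti)parallel: pick any perpendicular axis
     else
        axis.normalize();

     QuatF orient;
     orient.set(AngAxisF(axis, angle));       // e.g. normal (-1,0,0) gives a 90 degree rotation whose axis lies along Y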
  13. Thank you, this seems to have helped me a little. I had seen that code before, but hadn't used it, since I was working on a system using a vector from the player's position to the terrain center. I have replaced my spherical terrain with a simple cube for testing, and I can obtain a contact normal of the form "1,0,1" with the player standing on one of the faces. However, I now need to convert this VectorF into a QuatF. I tried just setting the QuatF to the contactNormal, which compiled, but doesn't work. There is obviously a trick to this that I am not getting.
  14. This would be ideal, but I'm not sure how to grab that normal. The other thing is that I'm not sure how to align the player to face a particular normal. I am doing this at the moment:
     Point3F gravityvec = teraincenterpos - getPosition();
     gravityvec.normalize();
     QuatF q = QuatF(gravityvec);
     mOrient = q;
     It seems to be close to working, but it doesn't work in all cases.
  15. I have made some progress with this. If I add this to updateMove():
     QuatF q = QuatF(0, 0, 1, 0);
     mOrient = q;
     I can change the orientation of the player. I just need to know what to set mOrient to in order to face the center of the terrain.
  16. Yes, I do see a lot of that code in there. I have actually implemented that resource, and it does seem to be working. I had forgotten that; I just realised now after looking at that link. It seems then that all I need to do is use the mOrient from that resource to change the orientation of the player in updateMove()? I'm not sure how to do that though, any ideas?
  17. I am currently working on a spherical terrain implementation in T3D, which is going well, but I have hit a snag with the player's movement. I have implemented 3D gravity according to an old resource that I found, and this works great; it pulls the player object towards the center of the world. In updateMove(), I am adding the 3D gravity by doing this:
     VectorF a = terpos - getPosition();
     a.normalize();
     VectorF acc = a;
     The problem here is that even though the player *moves* towards the terrain center, they are not oriented towards the terrain center. I.e., their feet don't face the ground, which means they can't walk on the terrain unless they are standing at or very close to the north pole. I have done some research into this, but I have not made a whole lot of progress. There is some code in updateMove() that looks promising, such as:
     // If we don't have a runSurface but we do have a contactNormal,
     // then we are standing on something that is too steep.
     // Deflect the force of gravity by the normal so we slide.
     // We could also try aligning it to the runSurface instead,
     // but this seems to work well.
     if ( !runSurface && !contactNormal.isZero() )
        acc = ( acc - 2 * contactNormal * mDot( acc, contactNormal ) );
     What I want to do, basically, is align the player's orientation with the contactNormal. Is this correct? How would this be done? Thanks!
  18. As I mentioned in my previous post, I am attempting to render planet-sized objects in T3D. I was trying to implement a solution based on scaling the planet and dividing it into chunks, each of which was less than 10,000 units, but this hasn't worked. I am now considering changing T3D's position and scaling variables for my custom "spherical terrain" object type to use F64s instead of F32s. I found this thread on the old forum: https://www.garagegames.com/community/forums/viewthread/82413 which seems to indicate that this involves interacting closely with the rendering layer. My question is, how complex would this be to do? I am trying to simply render an object the size of a planet; I don't want to change the entire engine to 64-bit. This is the effect when I try to do this with the current engine: http://phoenixgamedevelopment.com/blog/wp-content/uploads/2016/06/PhoenixGameDevelopment-12_06_2016-08_10_21-PM.jpg http://phoenixgamedevelopment.com/blog/wp-content/uploads/2016/06/PhoenixGameDevelopment-12_06_2016-08_08_36-PM.jpg Clearly these are 32-bit precision issues. Can anyone shed any light on the right approach here? Is it just a matter of going through the rendering system and changing F32/Point3F to F64/Point3D?
  19. Hello, For some time I have been working on a spherical terrain implementation in T3D. I have completed most of the basic work, and the terrain looks pretty good. The terrain contains many "nodes", each of which is square, containing two triangular polygons. The vertex data for the polygons is in local space, and is between -1 and +1. I scale the vertex data by the "radius" of the world. The next goal I would like to tackle is rendering this terrain at a realistic planetary scale. I am not sure how to solve this issue, but I have spent some time thinking about it. First of all, I am aware that the main issue is the floating point precision problem with 32-bit floats. I am assuming that I will be using a "floating origin" system, i.e., the player will move around the world, but once they reach a certain distance from the origin, the entire world will be translated such that the player is set back to the origin. This is a fairly standard solution to this problem, and I believe it is what Kerbal Space Program uses. Rendering a planet at long ranges is also fairly easy. I can just render the planet at the maximum distance that can safely be processed, say 10,000 units, and then scale the planet to simulate a much greater distance. I.e., a planet rendered at 10,000 units from the origin but scaled down by 50% will appear to be the same size as a planet rendered at 20,000 units from the origin and not scaled at all. While the player is standing on the terrain, I should be able to treat the terrain as essentially flat, since the curvature won't be visible at that altitude. So, I could just render everything within the safe render distance of the player, and then cull anything beyond that. The player shouldn't notice this. The problem I am having comes from the middle ground between being very near to the terrain and being very far. For example, what if the player is in the upper atmosphere? They are high enough to see the curvature of the earth, and yet close enough that I can't scale the planet to a realistic size, since its radius would then be greater than the safe value for a 32-bit float. Can anyone shed any light on how I would solve this? I think there is something I am missing conceptually. I have done some testing, and scaling the terrain up to the size of a planet does cause rendering artifacts. What I need is some kind of floating origin system for the individual nodes of the terrain...
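     To make the scaling argument concrete: the apparent size of a sphere depends only on the ratio of its radius to its distance, so shrinking both by the same factor leaves the on-screen size unchanged. A minimal sketch (plain C++, purely illustrative numbers, not T3D code):

     #include <cmath>
     #include <cstdio>

     // Apparent angular diameter of a sphere of radius r seen from distance d.
     double angularSize(double r, double d) { return 2.0 * std::asin(r / d); }

     int main()
     {
         // A planet of radius 5,000 units at its true distance of 20,000 units...
         double trueSize   = angularSize(5000.0, 20000.0);
         // ...versus the same planet scaled down by 50% and drawn at 10,000 units.
         double scaledSize = angularSize(5000.0 * 0.5, 20000.0 * 0.5);
         std::printf("%f vs %f radians\n", trueSize, scaledSize);  // identical
         return 0;
     }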
  20. That's great, thanks a lot! I'll check that out as soon as I can.
  21. Ok, I didn't realise that SteamVR has to be running; I assumed just the Oculus app needed to be running. I can now get video on my Rift, but I have no head tracking, and the camera position seems to be wrong. The controller also doesn't work, but that's not a major problem. I will keep digging, I am almost there! EDIT: I should point out that head tracking works fine in the Oculus Home screen and all of the Rift demos, so it's not a hardware issue.
  22. I think I might have found the source of the issue. After creating the canvas with VR enabled, I see this error in the console: "VR: HMD device has an invalid adapter." I'm digging through the source now. Could this be the cause, or is it inconsequential?
  23. Ok, so I am almost there! I have my Rift, and it's working great. I am now attempting to run my T3D projects on it. I have enabled "unknown sources", and I am running the D3D11/OpenVR example project given in this thread. Without enabling VR, the project works fine, no issues. When I enable VR (in windowed mode), the headset picks up an app being loaded and switches out of the home screen, but both my desktop view and the HMD view are black. When I hit escape, I can see an "are you sure you want to quit?" message appearing on the headset and the desktop app. When I run in fullscreen, I get a crash, something about DirectX not being able to run in full screen. I am assuming I don't need to run in fullscreen mode? Is there anything obvious that I am missing here?
  24. Hi, Has anyone tried creating an accumulation volume in the DX 11 build? I am getting a crash with "unable to compile shader", this doesn't happen on the Dx9 build, but I can't actually see the accumulation on Dx9 either. If anyone could confirm that accumulation works on DX 11, I will at least know if it's just my build or not. I am using the Dx11/OpenVR build.