Anyone have any experience making their own editor?


Just wondering if anyone out there has a solid understanding of the black magic involved in creating a new editor. I recall having a boatload of issues when I created Ecstasy Motion as its own editor, mostly related to not being able to reliably control windows when switching between editors, and in general just not understanding all the callbacks and things that need to be set up just so in order to play nice with the world editor and menus.

I'm about to get involved in this again for an openSimEarth editor, and I've completely forgotten most of what I had to do for EM, but I came across this old posting of mine from 2012:


Back then @JeffR was sharing my pain, at least. Have you been back to this neck of the woods since, Jeff? Anything gotten easier?

Nothing has really broken for me yet this time around, I'm just barely getting started... but I expect the process to be long and difficult, so figured a preemptive cry for help here probably wouldn't be a bad idea...

In any event, if others are interested I can share whatever I figure out.

I've implemented a few small editor deals since then and it hasn't been as profoundly terrible, so I think part of the difficulty stemmed from bugs that have since been fixed.

That said, I feel there's a better way to approach the entire thing.

The first idea that occurred to me: rather than swapping the UIs around, the world editor UI is always visible, but it has callback hooks into a Tool class that enacts the specific behavior.

This would drastically simplify the swapping shenanigans: we only have to worry about what the editor UI has as its 'Active Tool'. Changing that is a simple function call that repoints the tool, and all interaction functions, such as your mouse clicks and the like, trickle down to the tool for handling.

This would make it WAY easier to add new tools and just drop things in, and altogether make things more self-contained per tool.
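Something like this shape, roughly. This is just a sketch of the idea with made-up class names (EditorUI, SelectTool, etc. are hypothetical, not actual Torque API):

```cpp
#include <cassert>
#include <memory>
#include <string>

// A tool gets the interaction callbacks forwarded from the editor UI.
struct Tool {
    virtual ~Tool() = default;
    virtual std::string name() const = 0;
    virtual void onMouseDown(int x, int y) = 0;
};

struct SelectTool : Tool {
    int clicks = 0;
    std::string name() const override { return "select"; }
    void onMouseDown(int, int) override { ++clicks; }
};

struct PaintTool : Tool {
    int strokes = 0;
    std::string name() const override { return "paint"; }
    void onMouseDown(int, int) override { ++strokes; }
};

// The editor UI itself never gets swapped out; only the tool it points
// at changes, via a single function call.
struct EditorUI {
    std::unique_ptr<Tool> activeTool;
    void setActiveTool(std::unique_ptr<Tool> t) { activeTool = std::move(t); }
    void onMouseDown(int x, int y) {
        if (activeTool) activeTool->onMouseDown(x, y); // trickle down
    }
};
```

Dropping in a new tool is then just another Tool subclass plus one setActiveTool call, with no UI swapping involved.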

Another thought I had, which would tie into the above, is to build in different pointer modes.

I.e., regular mouse pointer (like your regular world editor mode), paint-brush pointer (like the terrain editors or forest), etc.

So rather than needing an entire render override for the editor view just to enable stuff like the paint mode, you'd set the pointer mode to radial-brush mode, set the radius (with an option for grid alignment, for stuff like terrain), and the masks for the raycasts it fires out when rendering and also painting.

This way the tool can just worry about the results of the interaction, rather than micromanaging the interaction itself.
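As a rough sketch of what I mean, something like the below, where the editor view owns the pointer config and hands the tool only the resolved hit (again, all names here are hypothetical, not existing engine API):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// The editor view owns the pointer configuration; tools never touch it.
enum class PointerMode { Arrow, RadialBrush };

struct PointerConfig {
    PointerMode mode = PointerMode::Arrow;
    float radius = 0.0f;       // brush radius, only meaningful for RadialBrush
    bool snapToGrid = false;   // grid alignment, e.g. for terrain painting
    float gridSize = 1.0f;
    uint32_t raycastMask = 0;  // which object types the pointer ray may hit
};

// What the tool actually receives: the resolved result of the interaction.
struct BrushHit { float x, z, radius; };

// Turn a raw world-space hit point into the snapped/sized result the
// active tool cares about.
BrushHit resolveHit(const PointerConfig& cfg, float x, float z) {
    if (cfg.snapToGrid) {
        x = std::round(x / cfg.gridSize) * cfg.gridSize;
        z = std::round(z / cfg.gridSize) * cfg.gridSize;
    }
    float r = (cfg.mode == PointerMode::RadialBrush) ? cfg.radius : 0.0f;
    return { x, z, r };
}
```

The paint tool then just consumes BrushHit results each frame instead of doing its own raycasts and snapping.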

Then, obviously, the tools can just have script functions out the wazoo for hooks into the editor suite itself.

Does that approach sound like it'd make more sense?
