What would a browser-based pipeline look like?
So I’m fully on the browser-based app bandwagon, but what would that technology look like when implemented in a traditional game pipeline?
You have a totally portable UI. To some extent, you can get this with 3ds Max and .NET, or Maya and PyQt. With both of those, though, there is still a significant reliance on the host platform, and inevitably there are layers of quirks (I can only speak for 3ds Max, where learning how to use C# .NET UI components was a never-ending nightmare spanning the full spectrum of problems, but I assume, based on intuition and posts on tech-artists.org, that the experience is similar in Maya). With a browser, you have a really, truly portable UI that you can use from any app or from the browser itself. You can just use one of the available .NET/Qt controls to host a browser inside your tool.
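As an illustration, here is a minimal sketch of hosting a browser-based tool UI in a Qt widget. It assumes a standalone PyQt5 install with the QtWebEngine module (inside Maya you would do the same thing with PySide2's QWebEngineView and parent it to the Maya main window); the URL is just a placeholder for wherever your tool pages are served.

```python
# Minimal sketch: hosting a browser-based tool UI inside a Qt widget.
# Assumes PyQt5 with the QtWebEngine module installed; the URL is a placeholder.
import sys
from PyQt5.QtCore import QUrl
from PyQt5.QtWidgets import QApplication
from PyQt5.QtWebEngineWidgets import QWebEngineView

app = QApplication(sys.argv)
view = QWebEngineView()
# The tool UI is just a web page; point it at wherever your tool server lives.
view.load(QUrl("http://localhost:8080/lod-tool"))
view.resize(800, 600)
view.show()
sys.exit(app.exec_())
```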
You have a totally decoupled UI. The decoupling is even more important than the portability. Nothing you do in JavaScript or HTML is going to be dependent upon the quirks of Max or Maya, so you should be able to use the app entirely outside of Maya/Max with few or no changes.
Well guess what, Insomniac has been doing this stuff for a while already. And it looks fucking awesome.
How does the UI communicate with your app? The benefits of abstracted UIs are great when you’re just using standalone tools inside your 3d app, but what about tools that need to interact with the scene? Well, the answer here is to develop all that awesome communication infrastructure you’ve been thinking about ;) Studios like Volition have pipelines that allow 3ds Max and Python to talk to each other, and the same capabilities exist in Maya. So your UI, hosted in your 3D app, talks to a service (local or otherwise), which then talks back to the 3D app.
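Here is a bare-bones sketch of what that middle service layer could look like, assuming Maya has opened a command port (for example via cmds.commandPort(name=":20200", sourceType="python")); the port number, endpoint, and message fields are made up for the example.

```python
# Sketch of the local service: the browser UI POSTs a JSON command here,
# and the service forwards a Python snippet to Maya over its command port.
import json
import socket
from http.server import BaseHTTPRequestHandler, HTTPServer

MAYA_COMMAND_PORT = 20200  # assumes Maya ran cmds.commandPort(name=":20200", sourceType="python")

def send_to_maya(python_snippet):
    """Forward a Python snippet to Maya over its open command port."""
    with socket.create_connection(("127.0.0.1", MAYA_COMMAND_PORT)) as sock:
        sock.sendall(python_snippet.encode("utf-8"))

class ToolHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The browser UI POSTs JSON like {"action": "generate_lod_group", "settings": {...}}
        length = int(self.headers.get("Content-Length", 0))
        command = json.loads(self.rfile.read(length))
        # The service could log, validate, or farm the job out here;
        # for the sketch we just tell Maya what was requested.
        send_to_maya("print(%r)" % ("UI requested: " + command["action"]))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'{"status": "queued"}')

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), ToolHandler).serve_forever()
```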
Which is awesome, or redundant, depending on how excited you are. It seems like a redundant, and complex, step. But to me it is a box of possibilities. First, you can do anything on the backend- logging, for example- that is completely transparent to your tools. But far more interesting is that you’ve introduced a layer of abstraction that can allow you to, say, farm an expensive operation out through your service. I mean, normally the barrier to entry here is high- you’d need to set up all the client/server infrastructure. But if you go down the browser-based pipeline road, you have that infrastructure set up by default, so you basically get the flexibility for free. Imagine:
You have a UI that has a ‘generate LoD group’ button and settings. You click it. It sends a message to a local service that says, ‘Tell Maya I want to generate an LoD group with these settings.’ Maya gets the command and sends info back to the server- ‘Server, here is the info you need to generate the LoDs.’ The server then sends a message back to Maya, and to 3 remote machines, and each one generates an LoD. Maya finishes and updates the scene with its generated LoD and 3 placeholders. As the remote machines report progress, they send their LoD info back to the local service, and the local service says, ‘Hey Maya, here’s that updated LoD you asked for,’ and Maya updates the scene.
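To make the round trip concrete, here is a rough sketch of the messages each hop might carry. The field names, hosts, and settings are hypothetical; the point is that every step is just data flowing through the service.

```python
# Illustrative message shapes for the LoD round trip described above.
# All names and values are made up for the example.
ui_to_service = {
    "target": "maya",
    "action": "generate_lod_group",
    "settings": {"levels": 4, "reduction": [1.0, 0.5, 0.25, 0.1]},
}

maya_to_server = {
    "action": "lod_job",
    "mesh_data": "<serialized mesh>",  # whatever intermediate format you export
    "settings": ui_to_service["settings"],
}

# The server fans the job out: one level stays local, three go to remote machines.
server_to_workers = [
    {"worker": host, "mesh_data": maya_to_server["mesh_data"], "level": i}
    for i, host in enumerate(["local", "farm01", "farm02", "farm03"])
]

# Each worker reports back, and the local service pushes results into the Maya scene as they land.
worker_result = {"level": 2, "mesh_data": "<lod mesh>", "status": "done"}
```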
That sounds complex, but think about how much of that you already have, or could use for other things. The 3d-app/service layers can be used, and may already exist, for any form of interop communication (like COM). The data structures and functionality you’d need to send data to and from Maya can be used to generate LoDs, or just export meshes, or anything else you can think of doing with mesh data outside of Maya. The remote farming ability can be used for distributed processing of anything.
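To put that reuse in code terms, here is a tiny, hypothetical sketch of how a service might route commands: once the transport exists, LoD generation, mesh export, or anything else is just another registered handler, not new infrastructure. Every name here is made up for illustration.

```python
# Hypothetical command registry on the service side: new pipeline operations
# are just new entries, reusing the same transport and mesh data structures.
ACTIONS = {}

def action(name):
    """Register a handler for a named command."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("generate_lod_group")
def generate_lod_group(payload):
    ...  # hand mesh data off to local/remote LoD workers

@action("export_mesh")
def export_mesh(payload):
    ...  # same mesh serialization, different consumer

def dispatch(message):
    """Route any incoming command to whatever handler claims it."""
    return ACTIONS[message["action"]](message.get("payload"))
```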
So now we move much closer towards what I’ve discussed with the Object Model Pipeline, except it happens much more flexibly, naturally, and asynchronously. Services expose the functionality to nearly all of your tools- basically anything you could want to use outside of your application- and you can write anything against those services.
Ambitious, but feasible, and not for the faint of heart. I’ll certainly be pushing for this, and we’ll see how it goes.