Backburner fun!

Hi All,

So, now that I’ve got the hang of fiddling with Backburner via MaxScript (whilst trying to avoid its… subtleties, as described here and here), I’ve been having some fun using Backburner for some interesting tasks.

One of the most useful, if obvious – inasmuch as it’s right there in the cmdjob help entry – is submitting After Effects renders to the farm. It’s very easy to write a little bit of code, wrapped up in a UI, to make the tasklist file described there (a comma-separated file detailing the name and frame range of each task). Submit this along with the location of the After Effects .aep file and the comp name and you’re good to go.
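
For illustration, here’s a minimal MaxScript sketch of that submission, using HiddenDOSCommand to fire off cmdjob. The manager name (managerName), all paths and the comp details are made up, and the flag spellings (-taskList, -taskName, the %tn/%tp tokens) should be checked against the cmdjob help entry on your version:

    -- Write the tasklist file: one comma-separated line per task,
    -- here "comp name, first frame, last frame" (all made up).
    taskFile = @"X:\ae\tasklist.txt"
    f = createFile taskFile
    format "MyComp,0,100\n" to:f
    close f

    -- Build the cmdjob command line. %tn is the task name (column 1,
    -- per -taskName:1) and %tp2/%tp3 are the remaining columns.
    cmd = "cmdjob -jobName:aeRender -manager:managerName " +
          "-taskList:\"" + taskFile + "\" -taskName:1 " +
          "\"C:\\Program Files\\Adobe\\After Effects\\aerender.exe\" " +
          "-project \"X:\\ae\\myProject.aep\" -comp \"%tn\" -s %tp2 -e %tp3"
    HiddenDOSCommand cmd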

But it got me thinking… all you’re really doing when you do this is submitting a command via cmd.exe to the machines in question. So… why not go further? The first thing I thought of was a response to a problem we had at work, where we received a file from an off-site colleague that contained a plugin none of us had installed – and neither had the farm. We had the choice of either stripping out the offending objects (if we could find the damn things) and then replicating its functionality without using the (free) plugin, or embarking on the usually arduous process of installing the plugin on all the workstations and all the render nodes. It’s just copying some .dlo files into Max’s /plugins/ directory, but still, if only there were some way of giving a universal copy command across the network…

The basic code was very simple – it offered the user the ability to select a file, a destination, and one of the groups (mentioned in the first link up there) on the farm. It then creates a small .bat file which is, effectively, just a command to copy the file to the destination (i.e. COPY “X:\network\file.dlo” “C:\plugins\”, then exit 0) and sends a separate job to Backburner for each node in the group, with only that node offered as a server for each job. Once the underlying code was done, I added a few tweaks, such as the ability to add entire folders to the mix, but it’s really just an expansion on this pretty basic concept, as the sketch below shows.
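
A hedged sketch of the per-node copy job itself – again, the node names, paths and manager name are illustrative, and in practice the node list would come from one of the group files:

    -- Write the tiny .bat file that every node will run.
    srcFile = @"X:\network\file.dlo"
    destDir = @"C:\plugins\"        -- wherever your Max /plugins/ lives
    batFile = @"X:\network\copyPlugin.bat"
    f = createFile batFile
    format "COPY \"%\" \"%\"\n" srcFile destDir to:f
    format "exit 0\n" to:f
    close f

    -- One cmdjob per node, restricted to that node via -servers.
    nodes = #("node01", "node02", "node03")  -- e.g. read from a group file
    for n in nodes do
    (
        cmd = "cmdjob -jobName:copy_" + n + " -manager:managerName " +
              "-servers:" + n + " \"" + batFile + "\""
        HiddenDOSCommand cmd
    )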

Putting these two examples together – After Effects jobs and copying files – gives me some fancy ideas for a bit of fun, but the most obvious one to me was… fonts! Unfortunately, when you install a font it isn’t just a matter of copying it to the Windows /fonts/ folder – there’s also a registry entry that gets added. Otherwise the above script would be enough to install fonts network-wide – very handy if your AE job uses a non-standard font. However, it shouldn’t be hard to add a checkbox to the above script which effectively adds a line to the generated .bat file performing the appropriate registry tweak – I just haven’t done it yet! A rough sketch of those lines follows.
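
An untested sketch of what those extra .bat lines might look like, written out from MaxScript. I’ve used reg.exe’s REG ADD here rather than regedit proper, as it’s easier to script; the registry key and the “(TrueType)” value-name convention are the standard Windows ones, but the font and file names are made up:

    fontFile = "MyFont.ttf"          -- hypothetical font file
    fontName = "My Font (TrueType)"  -- value name Windows expects
    f = createFile @"X:\network\installFont.bat"
    format "COPY \"X:\\network\\fonts\\%\" \"C:\\Windows\\Fonts\\\"\n" fontFile to:f
    format "REG ADD \"HKLM\\SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Fonts\" /v \"%\" /t REG_SZ /d \"%\" /f\n" fontName fontFile to:f
    format "exit 0\n" to:f
    close f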

But if anyone has any other cool ideas on how Backburner could be leveraged for useful or fun tasks, please let me know! If anyone wants any more information on how to do the above in detail, feel free to email me at dan-grover@dan-grover.com.

3D World Article

Hi All,

Just a quick note to mention that I have an article in this month’s 3D World (March 2013). It’s a Q&A about scene scale and how to deal with it in 3ds Max. I hope those that read it will find it useful!

[image: the article]

AND it has the word “sexiest” in it!

Beautiful tree there courtesy of Lauren Scott.

Dan

A note about Maxscript and Backburner

Another thing that I have discovered during my trials and tribulations with getting Maxscript and Backburner to play nicely is that Groups on Backburner don’t behave as they should under 3ds Max 2013. Before product update 6, you could not really get any information about groups at all – all requests to GetGroupName returned an empty string.

With product update 6 come some steps forward – it now returns the correct group name! – but little else. You still can’t reliably return a list of the servers in a group. You cannot create groups, nor can you edit or delete them. This is true whether your connection to the manager has queue control or not. I have therefore come up with a solution that started out as a temporary fix until a new product update or Max 2014 came around to fix it, but it has turned out to work fairly robustly, so I see no need to change it, even should groups get fixed!

The process basically involves having groups defined outside of Backburner. In my case, I have a folder full of files – text files, incidentally, though they are never seen by the user – each of which has, as its file name, the desired name of the “group”, while the text file itself contains a comma-separated list of server names. These are created, edited and deleted using a simple script that reads and writes the text files. The same files are read by my Backburner submission script, which then supplies the server list contained in the file to Backburner (in the case of netrender via the “job.submit servers:server_array” parameter, and in the case of cmdjob as part of the “-servers server1,server2,server3” flag), as sketched below. This has the advantage of being very quick and simple, as well as robust – so long as no bugs are brought in that mess with the (currently functioning) use of servers in Backburner submission.
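
A minimal sketch of the reading side, assuming a hypothetical groupsDir holding one .txt per “group”, each containing a single comma-separated line of server names:

    groupsDir = @"X:\pipeline\groups\"  -- hypothetical location

    -- Return the server names stored in a group file as an array.
    fn readGroup groupName =
    (
        local f = openFile (groupsDir + groupName + ".txt") mode:"r"
        local names = filterString (readLine f) ","
        close f
        for n in names collect (trimLeft (trimRight n))
    )

    -- Join the names into cmdjob's comma-separated form
    -- (assumes the group isn't empty).
    fn joinNames arr =
    (
        local s = arr[1]
        for i = 2 to arr.count do s += "," + arr[i]
        s
    )

    -- e.g. job.submit servers:(readGroup "fastNodes")   -- netrender, per the
    --      post; your Max version may want NetServer objects rather than strings
    -- e.g. "-servers:" + joinNames (readGroup "fastNodes")   -- cmdjob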

The main downside is that the server list is fixed at submission: though you can change the contents of the text files defining the groups whenever you like, this doesn’t change jobs that have already been submitted. Of course, you can always change which servers are assigned to a job in the Backburner monitor, so it’s not like you need to resubmit if you realise there was a problem in the group.

Anyway, that’s my solution, and hopefully it could help someone if they find themselves in a similar position.

MaxScript, Backburner and Dependencies!

Hi All,

I have today finally made progress with a problem I’ve been contending with, on and off, for a few weeks now! The problem is using MaxScript and Backburner with dependencies. Just submitting Max renders to Backburner using MaxScript isn’t a problem, though for some inexplicable reason dependencies are not supported. There are a handful of options out there to try and solve this problem, namely…

- Sending the job to Backburner suspended, without any dependencies, then setting the dependencies by editing the (unused) <DependsOn> XML tag in the Backburner job folder, before archiving and unarchiving the job (so that the XML file gets re-read). This is problematic, as the only way to archive and unarchive a job (also not a function available via MaxScript!) is via Telnet, either through Python or dotNET. I managed, very vaguely, to get this working using dotNET, but Telnet is not the most elegant of things, especially in an automated system.

- Setting post-render scripts that perform a certain task when a render is complete. The problem here is that the post-render script is called every time a render server finishes its part of the job – which might not necessarily be the end of the actual job, of course. Whilst this is potentially surmountable by having the script check which frame was just rendered, it also means that each render node needs a licensed copy of Max if it is to open and close files.

- Using cmdjob.exe, the command-line backburner queue magician. This is what I have ended up using.

The solution is actually relatively elegant now. I can’t post actual code as this is for a paid job, but the basic process is this: cmdjob.exe can send jobs to Backburner which have dependencies. So what you do is have a cmdjob task launch your actual render tasks. Instead of submitting scenes directly from MaxScript to Backburner, have cmdjob.exe run on Backburner, call up a copy of Max and then run a script on it, with the script containing instructions to send a render job to Backburner. Because this job is not sent until the cmdjob is executed, and because cmdjobs can have dependencies, you can effectively have dependencies through MaxScript.
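
The core primitive looks something like this – a sketch with assumed names throughout, in which cmdjob opens 3ds Max on a node and runs one of the generated scripts (3dsmax.exe’s -U MAXScript switch runs a script file on startup):

    -- Launch a cmdjob that opens Max and runs the first generated script.
    -- managerName and all paths are illustrative; check the flag spellings
    -- against cmdjob's own help on your Backburner version.
    cmd = "cmdjob -jobName:chain_step1 -manager:managerName " +
          "\"C:\\Program Files\\Autodesk\\3ds Max 2013\\3dsmax.exe\" " +
          "-U MAXScript \"X:\\jobs\\shot\\scripts\\01_particles.ms\""
    HiddenDOSCommand cmd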

Which is easier said than done! So here is a more detailed approach to the process:

- The user loads a scene they want to render.

- When they run the script, a series of other scripts are generated and saved alongside the .max file. These scripts contain all the instructions needed to alter the scene as per the user’s wishes (for example, there may be three scripts: one simulates and saves out a particle sequence; one loads this sequence and pre-calculates a GI solution; the third loads both the generated particles and the pre-calculated GI solution and renders the final frames) and then submit the scene to Backburner via netrender, before closing the instance of Max.

- Instead of simply sending the first job to the render farm via Backburner in MaxScript, a command-line call is made to cmdjob.exe (more info here) to load a copy of Max and run a certain script – in this case, the first one that we just generated. The crucial thing to know here is that cmdjob.exe jobs CAN be set to be dependent on other jobs.

- So the cmdjob.exe job is sent to Backburner, and is picked up by a render node. This machine needs to have a licensed copy of Max, and for that reason I recommend a special machine dedicated to these sorts of tasks. It doesn’t need a fast processor or fancy graphics card, but it does need a lot of RAM, as it will be opening all your Max scenes.

- This machine opens Max and runs the script that was generated. This script basically contains all the options needed to be useful (such as turning on any particle generators, setting output paths for pre-calculated GI, etc.), then submits the scene to Backburner via netrender (Job A). Crucially, it then also submits the NEXT cmdjob.exe task to Backburner, dependent on the job it’s just sent (Job B, dependent on Job A) – see the sketch after this list. It then closes that instance of Max, and Backburner sets the task as finished.

- Next, Backburner finds itself with two new tasks – Job A and Job B from above. Job B is dependent on Job A. So Job A is set off to render, and when it’s finished, Job B begins – and the same process as the previous step begins all over again. This time it loads up the next script, submits that render (Job C), and submits another cmdjob (Job D), again dependent on the one it just sent.

- This process continues ad infinitum.
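
To make those steps concrete, here’s a hedged sketch of what the tail of each generated script might look like once cmdjob has opened Max on the helper node. The manager name, job names and paths are all illustrative, and the -dependencies flag spelling should be verified against your cmdjob help:

    -- ...scene setup first: enable particle generators, set GI paths, etc...

    -- Submit the current scene as a normal netrender job (Job A).
    nrm = NetRender.GetManager()
    nrm.Connect #manual "managerName"  -- assumed manager host
    job = nrm.NewJob()
    job.name = "JobA_particles"
    job.Submit()  -- a server list (e.g. from a group file) can be supplied here

    -- Queue the next cmdjob (Job B), dependent on Job A, so the chain
    -- only continues once Job A has finished rendering.
    cmd = "cmdjob -jobName:JobB_chain -manager:managerName " +
          "-dependencies:JobA_particles " +
          "\"C:\\Program Files\\Autodesk\\3ds Max 2013\\3dsmax.exe\" " +
          "-U MAXScript \"X:\\jobs\\shot\\scripts\\02_gi.ms\""
    HiddenDOSCommand cmd

    -- Close this instance so Backburner marks the cmdjob task complete.
    quitMax #noPrompt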

There are a lot of complications here. Do you want to hard-code the whole process of what submits what? My solution was to have the very original script run by the user generate all the scripts (.ms files prefixed with numbers indicating their order) to be used by the cmdjobs, and one of the last lines of each of those scripts was to “fileIn” another script (“fileIn” executes another script file in place – MaxScript’s rough equivalent of an include). This script deletes the .ms that called it and looks in the folder to see if there are any more. If there are, it launches a new cmdjob running the first script alphabetically. Thus, when that script runs and submits its next job, it will again delete the script which called it and look for the next. This way, I can have almost infinite scripts all daisy-chaining off one another. A sketch of that chain script follows.
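
Something along these lines, assuming the generated scripts live in a hypothetical scriptsDir, are numbered so that they sort in execution order (so the caller is always the first remaining script), and that the calling script has stored the name of the netrender job it just submitted in a global (lastJobName here) so the next cmdjob can depend on it:

    scriptsDir = @"X:\jobs\shot\scripts\"  -- hypothetical
    global lastJobName  -- set by the calling script before the fileIn

    (
        local pending = sort (getFiles (scriptsDir + "*.ms"))
        if pending.count > 0 do
        (
            -- The first script in sorted order is the one that just ran.
            deleteFile pending[1]
            pending = sort (getFiles (scriptsDir + "*.ms"))
            if pending.count > 0 do
            (
                -- Launch a new cmdjob that opens Max and runs the next one.
                local cmd = "cmdjob -jobName:chain_" + (getFilenameFile pending[1]) +
                            " -manager:managerName -dependencies:" + lastJobName +
                            " \"C:\\Program Files\\Autodesk\\3ds Max 2013\\3dsmax.exe\"" +
                            " -U MAXScript \"" + pending[1] + "\""
                HiddenDOSCommand cmd
            )
        )
    )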

This took me a while to work out (thanks in no small part to a few frustrating Max bugs!) but it’s working quite well, with the added benefit of allowing the user to submit jobs to backburner that could previously not have been sent there. If you have any questions about the process, please feel free to email me at dan-grover@dan-grover.com and I’ll try and help as best I can!

Thanks,

Dan

Another New Script!

Hi,

I’ve just posted up another new script! You can read all about it here in the scripts section. What it basically does is provide a nice, easy way of integrating large, complicated models into a scene with lots of other large, complicated models! It makes the files manageable, the renders more efficient and the viewport faster! I developed it in order to create a model of the entirety of London, which in total was hundreds of millions of polys.

Please note, the script requires VRay!

Dan

Nacue Create Event

I should probably have posted this before the event occurred, but today has inspired me to be a lot more active with my blog!

I spent the entire day at my old haunt, the University of Hertfordshire, attending an event run by the fantastic gang over at Nacue/Create. They work with students to help them get into the workplace, with a specific concentration on entrepreneurial endeavours. I did the keynote talk at the start, about my job and how I ended up there, before running a very small, informal interactive session and finally acting as a judge for their Games Jam game design/branding competition, run by dojit games.

The entire event was incredibly inspirational for me – it really enthused me (in a way that, if I’m being honest, I genuinely didn’t expect). The enthusiasm and mutual encouragement the students showed one another was heartwarming, and the genuine friendships that have developed around this group are plain to see. I felt like I got a lot more from the event than I gave, truth be told!

On another subject, I hope to get a bunch more scripts up and online as soon as possible. There are the same old legal worries as always, but I think they’re navigable. I have a particularly useful one that remaps and renames 3ds Max asset manager files (everything from material bitmaps to linked VRay proxies!) – something I had been looking for for quite some time before I eventually decided to bite the bullet and just write the thing (with help from a friend and colleague of mine, Simone Nastasi).

New Site

Hi!

I had unimaginable levels of trouble with the last site, so I’m just trashing the whole lot and starting from scratch! I’ll be grabbing most of the content from the old one, so it’ll all be back up soon. I also plan to make the scripting section of the site significantly beefier, so stay tuned for that.

Dan