Point Cloud Export Script Updated

2020 Edit: The latest version of this can be found in the GitHub repo – the actual .py linked to here is older and quite a bit slower, though it’ll still work. 

I haven’t posted in a while!

I’ve updated the Point Cloud Export script, which kicks point clouds generated in Nuke into a .csv format that Thinkbox’s PRT Loader in 3ds Max can load up. This change makes it approximately 10x faster, with the biggest speed improvements on larger exports. Under the hood, the iterator has been changed into a generator, so the data is produced on the fly as it’s written rather than being built up in (and deleted from) big lists. Wahoo.
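
For anyone curious, the shape of the change is roughly this (a minimal sketch, not the actual script – point_data here is just a stand-in for however the points come out of Nuke):

    def iter_points(point_data):
        # Yield one row at a time instead of building a full list in memory
        # (the old version built lists and deleted items from them as it went).
        for x, y, z, r, g, b in point_data:
            yield "%f,%f,%f,%f,%f,%f\n" % (x, y, z, r, g, b)

    def export_csv(point_data, path):
        with open(path, "w") as out:
            for line in iter_points(point_data):
                out.write(line)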

You can download the new version here, and I’ve also put it onto a GitHub repo which you can find here, should you be so inclined – it’s for my benefit more than anyone else’s, but it’s a good practice for anyone to get into!

I just started working as a Technical Director at Taylor James in London. So far, so good!

Thanks,
Dan

Converting Assets from Max VRay to Maya VRay

We’re producing a film at work in Maya, rather than our normal 3ds Max. We’re trying to get a *ton* of assets from Max VRay over to Maya VRay, and it’s proving to be difficult – more so than you’d think!

We’re using .vrscene files to get the shaders over, but the mesh doesn’t come in as “normal” mesh (I asked one of our Maya chaps about this – I have very little experience with it myself!) and has some limitations. So we’re shunting the mesh over via FBX and the shaders over via .vrscene.

HOWEVER, something funky was happening to the shader names, which meant we couldn’t script it to automatically apply the appropriate shaders from the .vrscene to the mesh from the FBX. As such, we now have a custom exporter for Max which renames all the materials with a specific prefix (after smashing apart meshes with multi-subs and applying individual materials to each object), exports the .vrscene, then applies Standard materials with the same names to the meshes and exports the FBX. With this, we can take the shaders from the .vrscene (with the specific prefixes) and link them up with the Blinns on the FBX (with the same prefixes).
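
For the curious, the Maya-side matching boils down to something like the sketch below (written with maya.cmds; the XFER_ prefix is a placeholder, and it assumes the .vrscene import keeps the prefixed material names – treat it as the idea rather than our actual script, which also handles the exceptions described further down):

    import maya.cmds as cmds

    PREFIX = "XFER_"  # placeholder for whatever prefix the Max exporter writes

    def reassign_prefixed_shaders():
        # Pair each prefixed Blinn placeholder (from the FBX) with the V-Ray
        # material of the same prefixed name (from the .vrscene), then move
        # the geometry across to the V-Ray material's shading group.
        # In practice, name clashes and namespaces need extra handling.
        for blinn in cmds.ls(PREFIX + "*", type="blinn"):
            vray_mats = [m for m in cmds.ls("*" + blinn, materials=True)
                         if cmds.nodeType(m).startswith("VRay")]
            if not vray_mats:
                continue
            vray_sgs = cmds.listConnections(vray_mats[0], type="shadingEngine") or []
            if not vray_sgs:
                continue
            for sg in cmds.listConnections(blinn, type="shadingEngine") or []:
                members = cmds.sets(sg, query=True) or []
                if members:
                    cmds.sets(members, edit=True, forceElement=vray_sgs[0])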

But the fun doesn’t end there! We had to write a few special exceptions in the prefix-naming function to allow for VRayBlend materials, 2-Sided materials etc., because they don’t convert over properly in the .vrscene converter, so the Maya script that matches it all up sorts that out too.

Then you have some maps that don’t work with the .vrscene converter, even though they *do* work with VRay, such as the Composite map. Its functionality can’t be duplicated with daisy-chained VRayCompTex maps due to the lack of per-layer opacity, so now the custom exporter has to move those maps over manually (because the .vrscene converter doesn’t) and write out a text file to the directory so that a human can at least see how it was all put together. We could probably automate this into a Maya Layered Texture, but we haven’t got there yet.

I could go on! The upshot is that this is way harder than I thought (because I thought “export as .vrscene” was the extent of it!)

Are we missing an obvious trick here? Or is it actually just a slog?

Nuke PointCloud to Max Python Script

Here it is:

Click here to download!

It should be pretty self-explanatory – select a relevant BakedPointCloud node, run the script, tell it where to save the .csv file and you’re good to go. You’ll then have a .csv file that can be loaded into Max using Thinkbox/Krakatoa’s PRT Loader, and it’ll store all the colour information as well as location. The only thing you’ll need to do is rotate the PRT Loader 90 degrees, since Max is Z-up. Once you do that, any cameras or geometry you move between Nuke and Max will align perfectly (since the FBX exporter – as well as the great MaxScript Nuke’em – automatically re-orients).
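
If you’d rather roll your own, the core of the idea looks roughly like this (a simplified sketch rather than the full script; the serializePoints/serializeColors knob names and the three-floats-per-point layout are worth double-checking against your own BakedPointCloud node):

    import nuke

    def export_selected_point_cloud():
        node = nuke.selectedNode()
        if node.Class() != "BakedPointCloud":
            nuke.message("Please select a BakedPointCloud node first.")
            return

        path = nuke.getFilename("Where should the .csv be saved?", "*.csv")
        if not path:
            return

        # Assumption: the baked data sits on the serializePoints /
        # serializeColors knobs as flat lists of floats, three per point.
        points = node["serializePoints"].toScript().split()
        colours = node["serializeColors"].toScript().split()

        with open(path, "w") as out:
            for i in range(0, len(points), 3):
                out.write("%s,%s,%s,%s,%s,%s\n" % (
                    points[i], points[i + 1], points[i + 2],
                    colours[i], colours[i + 1], colours[i + 2]))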

Let me know if you have any questions, comments, or otherwise enjoy it!

Thanks,
Dan

Some Progress!

Well, I’ve made some progress of sorts.

In my last post I mentioned wanting to get a point cloud from NukeX (generated from a camera track) into Max’s new(ish) point cloud system. Well, the good news is that I’ve now got the point cloud – including colour data – from Nuke and into Max. The bad news is that it’s not using Max’s own point cloud system, as that requires a very particular format, the exact structure of which eludes me.

What I have done, courtesy of a smart idea from Dave Wortley, is bring it into Max using Thinkbox’s PRT Loader, which can be grabbed as part of the Krakatoa demo. Krakatoa’s great and I advise you all to take a look, but if you don’t want to make the investment in buying it just yet, the demo supports all the PRT loading your body (and RAM) can handle, including what we need.

The actual process involves running a Python script inside NukeX with the required point cloud node selected. It’ll spit out a .csv file which can be loaded into the PRT Loader (and if you create the PRT Loader at the origin, its location will match up perfectly with any cameras you WriteGeo out from Nuke in .fbx format, as long as you ensure the scale is 1.0 when you import them). At that point, you have the camera and a great point cloud from which to build up a proxy model of the scene, safe in the knowledge that the point cloud was generated using the same camera you’ll be projecting from.

Once I tidy up the code, I’ll release it on here – at the moment it has no UI and it just spits files out to your desktop (Y-up, no less) – but I’ll try to clean it up and get it posted ASAP.

In the meantime, please take a quick look at the video below showing how it’s working so far:

NukeX Pointcloud in 3ds Max from Dan Grover on Vimeo.

Thanks,

Dan

Nuke and Max with Point Clouds

Does anyone know how to get Nuke’s point clouds into Max 2015 onwards?

Max 2015 onwards supports point caches in both the viewport and when rendering (with MR at least), but only in specific file formats – Reality Capture scan files, i.e. .rcs and .rcp – whereas Nuke simply pokes out an .fbx file. You *can* pull this into Max, but it just slaps in a load of “Point” objects – i.e. dummies, for all intents and purposes. Whilst better than nothing, it’s pretty close to useless really.

I’m making it my mission to find out how to do this, and I can’t believe there’s no way already. But, in the likely event that it remains hidden from me, I’m taking a crack at it. It might require either some Python on the Nuke side or some MaxScript on the Max side, but I’ll get there!

Probably.

Dan

Python Adventure

My Raspberry Pi arrives tomorrow, and I have my book all about coding in Python which I’ve started to read through. I also started looking at the Max SDK documentation for the Python stuff – it’s going to be a long road, but I’m excited about it!

I currently have a MaxScript that remaps assets from wherever they are to another, single folder, by copying all the assets there and then remapping, including XRefs (and nested XRefs). This is a key part of our cloud-based rendering system, but the problem is that some of the XRef Max files are very large, and whilst they zip up nice and tiny, it’s not possible to remotely request a local unzip on the server they get uploaded to. So what I’m hoping to do with Python (the server also runs a WAMP stack) is have a standalone Python script up there that listens on a port and unzips files on request. Or perhaps I can do it by making a small text file in a given folder that the Python script will check, have it unzip the file named in that text file, then delete the text file? I’ll need to experiment…
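
The drop-a-text-file version might look something like this (a rough sketch; the folder path and polling interval are placeholders):

    import os
    import time
    import zipfile

    WATCH_DIR = r"C:\uploads\unzip_requests"  # placeholder folder on the server
    POLL_SECONDS = 10

    def watch():
        # Poll for small "request" text files; each one names a zip to extract.
        while True:
            for name in os.listdir(WATCH_DIR):
                if not name.endswith(".txt"):
                    continue
                request = os.path.join(WATCH_DIR, name)
                with open(request) as f:
                    zip_path = f.read().strip()
                if os.path.isfile(zip_path):
                    with zipfile.ZipFile(zip_path) as z:
                        z.extractall(os.path.dirname(zip_path))
                os.remove(request)  # clear the request so it isn't re-run
            time.sleep(POLL_SECONDS)

    if __name__ == "__main__":
        watch()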

Dan

Raspberry Pi and Python fun!

I’ve been so, so lazy about learning Python. I’ve long since wanted to actually write standalone or web apps (and we have a WAMP stack running in the cloud courtesy of Amazon, so there’s definitely somewhere to use it – in fact, I know exactly what I want to do with it!), but I’m starting with something else – a Raspberry Pi and a book teaching Python for beginners with a Raspberry Pi in mind! So I’m adding a new tag, and I hope to update this blog on how I do.

Dan

Backburner fun!

Hi All,

So, now that I’ve got a handle on fiddling with Backburner via MaxScript (whilst trying to avoid its… subtleties, as described here and here), I’ve been having some fun using Backburner for some interesting tasks.

One of the most useful, if obvious, uses – insomuch as it’s right there in the cmdjob help entry – is submitting After Effects renders to the farm. It’s very easy to write a little bit of code wrapped up in a UI to make the task-list file described there (a comma-separated file detailing the name and frame range of each task). Submit this along with the location of the After Effects .aep file and the comp name and you’re good to go.
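
Generating the task-list file itself is trivial; something along these lines (a sketch – the exact column layout and the chunk size should be checked against the cmdjob help entry rather than taken from here):

    def write_tasklist(path, comp_name, start, end, chunk=10):
        # One line per task: a task name plus the frame range it covers,
        # comma-separated. cmdjob hands each line out as a separate task.
        with open(path, "w") as f:
            for first in range(start, end + 1, chunk):
                last = min(first + chunk - 1, end)
                f.write("%s_%04d-%04d,%d,%d\n" % (comp_name, first, last, first, last))

    # e.g. write_tasklist(r"X:\jobs\mycomp_tasks.txt", "MyComp", 1, 250)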

But it got me thinking… all you’re really doing when you do this is submitting a command via cmd.exe to the machines in question. So… why not go further? The first thing I thought of was a response to a problem we had at work, where we received a file from an off-site colleague that contained a plugin which neither we nor the farm had installed. We had the choice of either stripping out the offending objects (if we could find the damn things) and then replicating their functionality without using the (free) plugin, or going through the usually arduous process of installing the plugin on all the workstations and all the render nodes. It’s just copying some .dlo files into Max’s /plugins/ directory, but still – if only there were some way of giving a universal copy command across the network…

The basic code was very simple – it offered the user the ability to select a file, a destination, and one of the groups (mentioned in the first link up there) on the farm. It then creates a small .bat file which is, effectively, just a command to copy the file to the destination (i.e. COPY “X:\network\file.dlo” “C:\plugins\” followed by exit 0) and sends a separate job to Backburner for each node in the group, with only that node offered as a server for its job. Once the underlying code was done, I added a few tweaks, such as the ability to add entire folders to the mix, but it’s really just an expansion on this pretty basic concept.
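
Stripped right down, the idea looks something like this (the real tool is MaxScript; the node names and paths below are placeholders, and the actual cmdjob submission flags are left to its help entry):

    import os

    NODES = ["RN-01", "RN-02", "RN-03"]             # placeholder render node names
    SOURCE = r"X:\network\file.dlo"                 # file the user picked
    DEST = r"C:\Program Files\Autodesk\3ds Max\plugins"  # destination they chose
    BAT_DIR = r"X:\network\deploy_bats"

    def make_copy_bats():
        # One tiny .bat per node: copy the file, then exit cleanly so
        # Backburner marks the task as complete.
        paths = []
        for node in NODES:
            bat = os.path.join(BAT_DIR, "copy_%s.bat" % node)
            with open(bat, "w") as f:
                f.write('COPY /Y "%s" "%s"\n' % (SOURCE, DEST))
                f.write("exit 0\n")
            paths.append((node, bat))
        # Each .bat is then submitted as its own cmdjob job, with only the
        # matching node offered as a server (flags omitted - see cmdjob help).
        return paths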

Putting these two examples together – After Effects jobs and copying files – gives me some fancy ideas for a bit of fun, but the most obvious one to me was… fonts! Unfortunately, installing a font isn’t just a matter of copying it to the Windows /fonts/ folder – there’s also a registry entry that gets added. Otherwise the above script would be enough to install fonts network-wide – very handy if your AE job uses a non-standard font. However, it shouldn’t be hard to add a checkbox to the script that appends a line to the generated .bat file to perform the appropriate registry tweak – I just haven’t done it yet!
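
For reference, the extra lines that checkbox would append to the generated .bat are roughly these (an untested sketch using reg.exe rather than regedit, since it’s a one-liner; the font file and name are placeholders, and 64-bit registry redirection may also need a look):

    import os

    def font_install_lines(font_file, font_name):
        # Extra .bat lines for a font: copy the file into the Windows fonts
        # folder, then add the registry value Windows uses to register it.
        # font_file / font_name are placeholders supplied by the UI.
        return [
            'COPY /Y "%s" "C:\\Windows\\Fonts\\"' % font_file,
            'REG ADD "HKLM\\SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Fonts" '
            '/v "%s (TrueType)" /t REG_SZ /d "%s" /f'
            % (font_name, os.path.basename(font_file)),
        ]

    # e.g. font_install_lines(r"X:\network\MyFont.ttf", "My Font")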

But if anyone has any other cool ideas on how Backburner could be leveraged for useful or fun tasks, please let me know! If anyone wants any more information on how to do the above in detail, feel free to email me at dan-grover@dan-grover.com.