avclubvids wrote: ↑
Wed Jan 02, 2019 9:04 pm
I'm still playing with the Karta tools but I don't think I saw any for extracting depth or disparities from an equirectangular stereo pair, correct?
The reason KartaVR includes *so many* assorted comp examples is to present small workflow ideas one-by-one. Add in the fact that the macro nodes are all exposed as GroupOperators, and it means artists who love tinkering with nodes can tear everything open, remix the tools, and do things nobody else has done in exactly the way you want.
If you feel like getting your hands dirty, and would like to learn something neat along the way, poke around *inside* the macro nodes in the following examples. You will have started the journey toward building your own solution that extracts disparity from stereo equirectangular pairs, with every stage of the process under your control.
Step 1. Create a 6 DOF Fisheye to Equirectangular based Z360 Disparity Stitch
This example shows how 6 circular fisheye images, shot with a single Sony A7SII camera, a Peleng 8mm fisheye lens, a Nodal Ninja panoramic head, and a Jasper Engineering stereo slide rail, can be stitched with disparity depth output.
The fisheye media was processed and stitched in Fusion Studio as 3 stereo pairs into an equirectangular stereo 6DOF output, delivered in an over/under RGBZ "Z360" layout.
To explore this scene:
Go to the KartaVR Example 360VR Stitching Comps webpage and download "West-Dover-Forest-Z360-Disparity-Depth-Stitch.zip" (73 MB).
Also read the overview summary that goes along with the Fusion comp, titled "West Dover Forest Z360 Disparity Depth Stitch".
Step 2. Explore the PanoramicWrap Filters to see how a seamless left/right border effect was done
There are many possible ways that you can get Fusion to do a left/right border seamless expansion of an equirectangular frame so your effects don't get a hard seam line in the output.
Here is one of those approaches:
Open up any of the KartaVR "Source Composition" files that have "PanoramicWrap" at the end of their name. Those files are provided by the Reactor "KartaVR/Tools/KartaVR Tools | Source Compositions" atom package and are installed to:
The examples in that folder with "PanoramicWrap" at the end of their name include:
- BlurPanoramicWrap attrs.txt
- DefocusPanoramicWrap attrs.txt
- DepthBlurPanoramicWrap attrs.txt
- GlowPanoramicWrap attrs.txt
- SharpenPanoramicWrap attrs.txt
- UnSharpenMaskPanoramicWrap attrs.txt
The .txt file that sits next to each .comp file gives a basic guide to how the raw nodes were turned into GroupOperator macros, along with a very short summary of which attributes needed to be exposed in the macro, the default control ranges, and other notes.
If you opened this file for example:
You would see the following content inside of Fusion or Resolve:
The .txt file that lives alongside each .comp file in the "source compositions" folder lets you browse the Fusion composite before the node has been turned into a macro, and get an idea of which controls are adjustable.
If you add a DepthBlurPanoramicWrap macro to your comp using the Select Tool UI, you can see how the GUI was laid on top of the raw nodes you saw earlier. Since this macro is a GroupOperator node type, you can still expand the node and peek inside of it at any time, too.
The "Defocus Blur Glow Sharpen Unsharpen.comp" example is provided by Reactor in the "KartaVR/Comps/KartaVR Example Comps" category. You should install that atom package, along with the sample 360VR media that comes in the Reactor "KartaVR/KartaVR Images" atom.
The KartaVR documentation page here shows the PanoramicWrap nodes
You can open up the example file on disk at:
Reactor:/Deploy/Comp/KartaVR/Defocus Blur Glow Sharpen Unsharpen.comp
This example is neat because an internal gradient control lets you change the falloff strength of the effect on the PanoramicWrap nodes. This could be used for a soft diffusion effect to make the frame look misty, or you could move the control handles to the bottom of the frame and do a nadir-based tripod blurring effect with little effort.
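If it helps to see the wrap idea outside of Fusion, here is a minimal NumPy sketch of the same concept the PanoramicWrap nodes implement: pad the equirectangular frame horizontally with wrapped-around columns, run the filter, then crop back. The function names and the toy box blur below are my own hypothetical illustrations, not KartaVR code.

```python
# Hypothetical sketch (not KartaVR's implementation): seamless left/right
# wrapping for filtering an equirectangular frame held in a NumPy array.
import numpy as np

def filter_with_wrap(image, filter_fn, pad=32):
    """Pad the frame horizontally with wrapped columns, apply the filter,
    then crop back, so the effect blends cleanly across the 360 seam."""
    # Copy `pad` columns from each side onto the opposite edge.
    padded = np.concatenate([image[:, -pad:], image, image[:, :pad]], axis=1)
    filtered = filter_fn(padded)
    # Crop the padding away to restore the original width.
    return filtered[:, pad:-pad]

def box_blur(img, radius=4):
    """A naive horizontal box blur used as a stand-in for any filter."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, img)
```

As long as the pad width is larger than the filter's reach, pixels near the left edge blend with pixels from the right edge, which is exactly why the node-based version avoids a hard seam line.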
Step 3. Remap multi-channel image data into an equirectangular frame
The next example is here to help you get comfortable "shuffling" non-RGB beauty pass data from the Fusion 3D system into the different image channels in your comp.
A CustomTool node is one way to do this.
Or you could use a ChannelBooleans node.
Or you could write a fuse if you were into code.
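To make the "shuffling" idea concrete, here is a tiny NumPy sketch of what a ChannelBooleans-style repack amounts to conceptually: taking a color pass and a depth pass and packing them into a single over/under RGBZ frame. The function name and layout details are my own illustration, not the actual node internals.

```python
# Hypothetical sketch of packing RGB + Z passes into an over/under
# "Z360"-style frame (color on top, depth below), using NumPy arrays.
import numpy as np

def z360_over_under(rgb, depth):
    """Pack a color image (H, W, 3) and a depth map (H, W) into a
    single over/under RGBZ frame of shape (2H, W, 3)."""
    # Grey-encode the single-channel depth so it stacks with the color.
    depth_rgb = np.repeat(depth[:, :, None], 3, axis=2)
    # Stack vertically: color over depth.
    return np.concatenate([rgb, depth_rgb], axis=0)
```

Inside Fusion, the CustomTool/ChannelBooleans route does the same kind of rerouting per channel, just expressed as node connections instead of array operations.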
Let's open up the "Boxworld EquirectangularRenderer3DAdvanced.comp" example from:
If you view the output of this comp in the Fusion Viewer window's channel controls, you will see lots of extra image data merged into an equirectangular projection. This goes beyond the typical RGBA data most VR people use when their media comes from PNG/JPG/MP4 sources.
If you expand that GroupOperator node you can see all the live-rendered cubic views being merged into equirectangular image projection outputs. Then a series of ChannelBooleans nodes take that data and push it into a custom equirectangular multi-channel output.
You can see another variation on this approach in the Z360 atom provided "Z360Renderer3D.setting" node. If you have the KartaVR comp examples loaded then you can access an RGBZ based roller coaster track example comp at
Reactor:/Deploy/Comp/KartaVR/Roller Coaster Ride Z360Renderer3D.comp
The example node shown above generates the following RGBZ top/bottom color + depthmap output in the Fusion viewer window:
By expanding the GroupOperator node contents, you will see the internal nodes, including a set of CustomTool nodes that help to remap the rendered image z-depth range.
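The z-depth remap those CustomTool nodes perform is conceptually just a linear range normalization. Here is a hedged NumPy sketch of that idea; the function name and the clamped 0..1 target range are my assumptions for illustration, not the exact expressions used inside the macro.

```python
# Hypothetical sketch of a CustomTool-style z-depth remap: linearly
# normalize a raw rendered z range into 0..1 for viewing/encoding.
import numpy as np

def remap_depth(z, near, far):
    """Linearly remap z values from [near, far] to [0, 1], clamped."""
    return np.clip((z - near) / (far - near), 0.0, 1.0)
```

In the node graph the same math shows up as a per-pixel intermediate expression, with near/far exposed so you can fit the scene's depth range into a displayable greyscale image.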
Step 4. Check out the Z360Stereo and Z360Mesh3D macros
After you have played around inside the other KartaVR examples on this page, it can be fun to open up the Z360Stereo and Z360Mesh3D comps in Fusion to see how they use common nodes inside the macros to attempt omni-stereo based 6DOF conversions without relying on a raytraced stereo lens shader and slit-scan style rendering.
Sure, a dedicated C++ Fusion SDK based node that does this process would be better. And having your own Red Manifold filmed footage would be great, too. I'd sure love to have one of those cameras.
But for kicks, you can get pretty far with just plain old Fusion nodes. Since the Z360 macros don't generate disparity data on their own, you could still use them in a pinch on Fusion (Free) if you were cool with a 3840x2160px maximum output resolution.
Here are the two main Z360 comp examples to poke through.
The Z360 Stereo demo looks like this:
It's all nodes - Do with them as you want!
If you kitbash the node workflow concepts in these demos, you could start to make a slightly crude but very workable stereo disparity tool for processing just about any image projection.
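To show how crude a starting point can be, here is a brute-force NumPy sketch of stereo disparity by horizontal block matching on a left/right pair. This is my own toy illustration of the general technique, not what KartaVR's nodes do; note too that on real equirectangular footage the disparity geometry varies with latitude, which this naive per-row search ignores.

```python
# Hypothetical, very crude disparity sketch: per-pixel horizontal block
# matching between greyscale float left/right images (same idea a node
# based tool approximates, minus all the refinement passes).
import numpy as np

def block_match_disparity(left, right, max_disp=16, patch=5):
    """Return a per-pixel disparity map via sum-of-absolute-differences
    (SAD) block matching along scanlines."""
    h, w = left.shape
    half = patch // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            # Search candidate right-image patches shifted left by d.
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(ref - cand).sum()  # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Everything a real tool would add on top of this - cost smoothing, sub-pixel refinement, edge-aware filtering, wrap handling at the 360 seam - is exactly the kind of stage you can bolt on node by node.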
After you explore these Fusion examples and poke around inside them for a while to get comfortable, I'd be happy to step you through building a "disparity from an equirectangular stereo pair" node in a step-by-step fashion over the next few days once your feet are wet... assuming this is something you'd like to push further.
Also, if you want to see an explainer on how Fusion macros are packaged, I made a "Macro Building Essentials" WSL thread that shows a few techniques for encapsulating any collection of your own Fusion nodes from your personal comps into your own macros, with the GUI controls you choose to expose to the end artist, along with their ranges.
At first, macro building can look like a scary topic to someone who hasn't done it before, but after a few small tests, you can make some customized tools that will save you lots of time on repetitive tasks. And with those macro editing "skillz", you also have the power to re-work any MacroOperator or GroupOperator node you come across in your time using the Resolve Fusion page and Fusion Standalone.