Disparity to KartaVR Z360

Moderator: SecondMan

rwcreations
Posts: 6
Joined: Tue Sep 26, 2017 4:51 pm

Disparity to KartaVR Z360

#1

Post by rwcreations » Sat May 25, 2019 10:41 am

I am really enjoying Fusion+KartaVR for processing my Insta360 Pro stereo images. I'm a bit stumped right now, though. I would like to extract the disparity from an over/under equirectangular, do some cleanup, and then convert back to over/under. The disparity map I'm getting is very weak and needs inverting to match the KartaVR doc examples. The final over/under has very little stereo separation.

SecondMan
Site Admin
Posts: 3774
Joined: Thu Jul 31, 2014 5:31 pm
Answers: 7
Location: Vancouver, Canada
Been thanked: 146 times

Re: Disparity to KartaVR Z360

#2

Post by SecondMan » Sat May 25, 2019 12:06 pm

I don't know the answer to your question, but when posting Imgur links you don't need to wrap them in [img][/img] tags - the board will take care of that.

Alternatively, you can upload images to Imgur directly, as described here: viewtopic.php?f=2&t=2057


AndrewHazelden
Fusionator
Posts: 1532
Joined: Fri Apr 03, 2015 3:20 pm
Answers: 8
Location: West Dover, Nova Scotia, Canada
Been thanked: 100 times

Re: Disparity to KartaVR Z360

#3

Post by AndrewHazelden » Sat May 25, 2019 6:55 pm

Hi RW.

Would you be willing to send me a copy of the original un-stitched Insta360 Pro fisheye views for that shot? I'd love to try making a 100% Fusion-processed example comp that goes from the camera's unprocessed native footage to a 6DOF-compatible Z360-style Over/Under Color + Depth output. :)


I'm not at my Fusion-based workstation at the moment, so here are a few comments made before I've had a chance to look at your comp. :)

The documentation for each of the controls on the Z360Stereo node is here.

BTW, the Z360Stereo node's "Stereo Depth Displacement" attribute can take either a positive or a negative value in the number field. This lets you work with either style of depthmap (white-to-black or black-to-white near/far depth ranges) without having to invert the greyscale image data beforehand. If you want more 3D depth in your output, doubling this value should be your first step.

If you want to tune the strength of the Z360Stereo node's output as you go from a Z360-style Over/Under Color+Depth image to left/right eye stereo color views, it can be handy to look at the full-frame equirectangular image in Fusion's viewer window while wearing anaglyph 3D glasses. You can enable Fusion's Stereo3D viewer mode with the "VStack" + Anaglyph options engaged. For depth comparisons, it is useful to look at the whole image without a panoramic viewer active, so you can see the complete LatLong image at once and notice defects across the frame. (Being able to quickly reverse your anaglyph glasses on your face also helps you spot when the depth range is inverted and the generated stereo output is cross-eyed.)

With the anaglyph 3D glasses on, you can fine-tune the "Stereo Depth Displacement" strength to make the depth "punchier" and stronger without accidentally over-driving the values to the point of causing viewer discomfort.

Also, you might find it handy to tweak the "Stereo Displace Convergence" attribute to set the zero parallax distance in the scene to a value that converges the depth at the yellow metal hand-rail in the foreground.

The Z360Stereo node's "ZDepth - Contrast" and "ZDepth - Gamma" controls can be used to color-correct the depth data and refine how hard the transition between the near/far depth zones is.
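
If it helps to picture how those controls interact, here is a rough Python/numpy sketch of the general depth-displacement idea. To be clear, this is a hypothetical illustration of the technique, not the Z360Stereo node's actual internals, and the parameter names are just stand-ins for the controls mentioned above:

[code]
# Hypothetical sketch of depth-displacement based stereo generation from an
# equirectangular Color + Depth pair. Not KartaVR's actual implementation.
import numpy as np

def z360_stereo_sketch(color, depth, displacement=0.02, convergence=0.5,
                       contrast=1.0, gamma=1.0):
    """color: HxWx3 equirectangular image; depth: HxW greyscale in 0-1."""
    # Contrast and gamma refine the near/far transition of the depthmap.
    d = np.clip((depth - 0.5) * contrast + 0.5, 0.0, 1.0) ** gamma

    # Pixels at the convergence depth get zero parallax; nearer and farther
    # pixels shift horizontally in opposite directions per eye. A negative
    # displacement flips the depth direction, which is why an inverted
    # (black-to-white) depthmap still works without pre-inverting it.
    h, w = d.shape
    shift = (d - convergence) * displacement * w  # horizontal shift in pixels

    xs = np.arange(w)
    left = np.empty_like(color)
    right = np.empty_like(color)
    for y in range(h):
        left[y] = color[y, np.round(xs + shift[y]).astype(int) % w]
        right[y] = color[y, np.round(xs - shift[y]).astype(int) % w]
    return left, right
[/code]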


When you are using Fusion's native "DisparityToZ" node, the Camera tab's "Camera Mode" control is often easier to use if you switch it to "Artistic" mode. You can then hold down the "Sample" (Fu16) or "Pick" (Fu9) buttons on the "Foreground Disparity" and "Background Disparity" controls to use them like an eye-dropper color picker and select those values directly from the actual imagery in your viewer window. This will refine the range of the depth output and give better results.
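
As a rough mental model (my assumption for illustration, not DisparityToZ's documented internals), the two picked values behave like the endpoints of a linear remap from disparity into a normalized depth range:

[code]
import numpy as np

def artistic_disparity_to_z(disparity, fg_disp, bg_disp):
    """Map a sampled foreground/background disparity range to 0-1 depth.

    fg_disp and bg_disp stand in for the values you would eye-dropper from
    the nearest and farthest objects in the viewer. The linear remap is an
    assumption made for this sketch.
    """
    z = (disparity - fg_disp) / (bg_disp - fg_disp)
    return np.clip(z, 0.0, 1.0)
[/code]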

At the bottom of the Inspector window is a "Falloff" control that lets you go from a hyperbolic to a linear depth output. Sliding this Falloff control can help bend reality for the depth in the scene. You could think of it like a greyscale version of HDR tone-mapping an image: it makes it easier to jam a large depth range into a compressed distance.
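
You could picture that Falloff blend with a little sketch like the one below. The curve shape and the constant are invented for illustration and are not Fusion's exact math:

[code]
import numpy as np

def apply_falloff(z, falloff):
    """Blend a hyperbolic (1/z-style) depth response toward a linear one.

    z: normalized depth in 0-1. falloff: 0.0 = fully hyperbolic,
    1.0 = fully linear. k is an arbitrary compression constant.
    """
    k = 0.1
    hyperbolic = (z * (1.0 + k)) / (z + k)  # compresses the far depth range
    return (1.0 - falloff) * hyperbolic + falloff * z
[/code]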

The CopyAux node's "[x] Enable Remapping" control is great for scaling large floating point depth ranges into a 0-1 color range. You can also invert the depth range on the output in this node. This range remapping is handy when you want to make an Over/Under Color/Depth JPEG or PNG image, or an MP4 movie, from your disparity-generated depth information, since those formats have to be written as 8 or 10 bit per channel output.
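
Conceptually, that remapping step is doing something like this minimal numpy sketch (a simplified stand-in, not the CopyAux node's exact code):

[code]
import numpy as np

def remap_depth_for_8bit(depth, invert=False):
    """Scale a floating point depth buffer into 0-1, optionally inverted,
    so it survives being written to an 8-bit Over/Under Color+Depth image."""
    lo, hi = float(depth.min()), float(depth.max())
    z = (depth - lo) / max(hi - lo, 1e-12)
    if invert:
        z = 1.0 - z
    return np.round(z * 255.0).astype(np.uint8)
[/code]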

KartaVR Example Comps

Using KartaVR for more advanced workflows assumes a certain familiarity with Fusion's native tools, node-based compositing in general, and the Fusion UI.

The following KartaVR examples won't cover every detail of the Fusion Studio disparity-mapping-to-Z360Stereo workflow, but they should get you closer to the end result you are after:

Use Reactor to install the "KartaVR > Comps > KartaVR Example Comps" atom package.

Then go to the Fusion "File > Open..." menu item. In the "Open File" dialog, expand the PathMaps section on the left side of the window and click the "Reactor:" PathMap entry. Then navigate to the "Reactor:/Deploy/Comps/KartaVR/" folder in the folder portion of the dialog.

(If you've never heard the term before, a PathMap is a custom relative filepath in Fusion that is defined in Fusion's preferences.)
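
For the scripting-minded, you can also check where a PathMap points from Fusion's script console. This should work there since the comp variable is predefined, but treat it as a quick sketch:

[code]
# Run in Fusion's Py3 script console. MapPath() expands a Fusion PathMap
# into an absolute filesystem path.
print(comp.MapPath("Reactor:/Deploy/Comps/KartaVR/"))
[/code]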

There are two Z360 examples in this folder - "Z360 Stereo.comp" and "Z360Mesh3D.comp" - that show how the Z360 nodes convert the Over/Under Color/Depth images into either left/right eye stereo color views or a 3D depth mesh you can view in Fusion.

There is also a "Stereo 3D Roto Conversion.comp" that shows how to use rotoshapes in Fusion to make a rough but usable depthmap by hand from a 2D monoscopic image.

On the Andrew Hazelden Blog page for "KartaVR Example 360VR Stitching Comps", take a look at these resources:

Creating Stereo Video Based Disparity Depthmaps

West Dover Forest Z360 Disparity Depth Stitch


rwcreations
Posts: 6
Joined: Tue Sep 26, 2017 4:51 pm

Re: Disparity to KartaVR Z360

#4

Post by rwcreations » Sat May 25, 2019 7:43 pm

Thanks Andrew! Your tool and examples are so thorough that I've been spending a ton of time reviewing the sample comps. It's Fusion's disparity tool that I wasn't finding many examples for in this case. This ancient Eyeon tutorial ( www.youtube.com/watch?v=ulsJKAV_DBM ) is the best one I could find.

Yes, I can definitely give you the camera images from this shot. Send me a PM and I'll shoot you a URL.

-RW


AndrewHazelden
Fusionator
Posts: 1532
Joined: Fri Apr 03, 2015 3:20 pm
Answers: 8
Location: West Dover, Nova Scotia, Canada
Been thanked: 100 times

Re: Disparity to KartaVR Z360

#5

Post by AndrewHazelden » Mon May 27, 2019 2:25 pm

rwcreations wrote:
Sat May 25, 2019 10:41 am

I am really enjoying Fusion+KartaVR for processing my Insta360 Pro stereo images. I'm a bit stumped right now, though. I would like to extract the disparity from an over/under equirectangular, do some cleanup, and then convert back to over/under. The disparity map I'm getting is very weak and needs inverting to match the KartaVR doc examples. The final over/under has very little stereo separation.

Hi @rwcreations

Here is a snapshot of the Fusion-stitched Z360 color + depthmap output I got from processing your original unstitched Insta360 Pro fisheye images with KartaVR. I've PM'ed you a link to the Fusion project files I prepared (1.68 GB download) so you can explore them. :)

This is the first time I've processed footage from this camera system, so it should be possible to get more consistent depthmap results after a little more experimentation and tinkering with the workflow.


rwcreations
Posts: 6
Joined: Tue Sep 26, 2017 4:51 pm

Re: Disparity to KartaVR Z360

#6

Post by rwcreations » Wed May 29, 2019 8:09 pm

For those interested, I put the image in question up on Veer so you can view it in stereo 360: https://veer.tv/photos/carrie-loader-v3-437506

Note this version was processed in Fusion but did not use the Zdepth method. I'll put a Zdepth version here shortly.

-RW