Would you be willing to send me a copy of the original un-stitched Insta360 Pro fisheye views for that shot? I'd love to try building a 100% Fusion-processed example comp that goes from the camera's unprocessed native footage to a 6DOF-compatible Z360-style Over/Under Color + Depth output.
I'm not at my Fusion based workstation at the moment so here are a few comments made without having a chance to look at your comp yet.
The documentation for each of the controls on the Z360Stereo node is here.
BTW, the Z360Stereo node's "Stereo Depth Displacement" attribute accepts either a positive or a negative value in the number field. This lets you work with either style of depthmap (white-to-black or black-to-white near/far ranges) without having to invert the greyscale image data beforehand. For more 3D depth in your output, a good first step is to double this value.
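To see why a signed displacement handles either depthmap polarity, here is a rough NumPy sketch. The `stereo_shift` function and its displacement-around-convergence model are my own illustrative assumptions, not the node's actual internal math:

```python
import numpy as np

# Toy example: the same three pixels (near, mid, far) encoded with
# both depthmap polarities.
white_near = np.array([0.9, 0.5, 0.1])   # 1.0 = near, 0.0 = far
black_near = 1.0 - white_near            # 0.0 = near, 1.0 = far

def stereo_shift(depth, displacement, convergence=0.5):
    """Horizontal shift for one eye; zero at the convergence depth."""
    return displacement * (depth - convergence)

# A positive displacement suits the white-near map...
shift_a = stereo_shift(white_near, 0.05)
# ...and simply flipping its sign handles the black-near map,
# with no need to invert the greyscale image first.
shift_b = stereo_shift(black_near, -0.05)
```

Both calls produce identical per-pixel shifts, which is the whole point of the signed number field.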
If you want to tune the strength of the Z360Stereo node's output as it converts a Z360-style Over/Under Color+Depth image into left/right-eye stereo color views, it can be handy to look at the full-frame equirectangular image in Fusion's viewer window while wearing anaglyph 3D glasses. Enable Fusion's Stereo3D viewer mode with the "VStack" + Anaglyph options engaged. For depth comparisons it is useful to look at the image without a panoramic viewer active, so you can see the complete LatLong frame at once and notice defects across it. (Being able to quickly flip your anaglyph glasses around on your face also helps you spot when the depth range is inverted and the generated stereo output is cross-eyed.)
With the anaglyph 3D glasses on, you can fine-tune the "Stereo Depth Displacement" strength to make the depth "punchier" and stronger without accidentally over-driving the values to the point of causing viewer discomfort.
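If you are curious what that anaglyph viewer mode is doing under the hood, a standard red/cyan anaglyph is just a per-channel merge of the two eye views. Here is a minimal NumPy sketch (the real Stereo3D viewer likely does additional color handling on top of this):

```python
import numpy as np

# Stand-in left/right eye equirectangular frames (H x W x RGB).
h, w = 4, 8
left = np.random.rand(h, w, 3)
right = np.random.rand(h, w, 3)

# Red/cyan anaglyph: red channel from the left eye,
# green and blue channels from the right eye.
anaglyph = np.dstack([left[..., 0], right[..., 1], right[..., 2]])

# Reversing the glasses on your face is equivalent to swapping the
# eyes here, which is why it quickly reveals an inverted
# (cross-eyed) depth range.
reversed_view = np.dstack([right[..., 0], left[..., 1], left[..., 2]])
```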
Also, you might find it handy to tweak the "Stereo Displace Convergence" attribute to set the zero parallax distance in the scene to a value that converges the depth at the yellow metal hand-rail in the foreground.
The Z360Stereo node's "Zdepth - Contrast" and "ZDepth - Gamma" controls can be used to color-correct the depth data and refine how hard the transition between the near/far depth zones is.
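The underlying math here is ordinary greyscale grading. A hedged sketch of what gamma and contrast do to a depthmap — these are generic color-correction formulas, not necessarily the exact curves the Z360Stereo node applies:

```python
import numpy as np

depth = np.linspace(0.0, 1.0, 5)  # normalized depthmap values

# Gamma: a power curve that re-balances near vs. far detail
# without moving the black or white endpoints.
gamma = 0.5
graded = depth ** (1.0 / gamma)

# Contrast: push values away from mid-grey, which hardens the
# transition between the near and far depth zones.
contrast = 1.5
graded = np.clip((graded - 0.5) * contrast + 0.5, 0.0, 1.0)
```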
When you are using Fusion's native "DisparityToZ" node, the Camera tab's "Camera Mode" control is often easier to use if you switch it to "Artistic" mode. You can then hold down the "Sample" (Fu16) or "Pick" (Fu9) buttons on the "Foreground Disparity" and "Background Disparity" controls to use them like an eye-dropper color-picker cursor and select those values directly from the actual imagery in your viewer window. This refines the range of the depth output and gives better results.
At the bottom of the Inspector window is a "Falloff" control that lets you go from a hyperbolic to a linear depth output. Sliding this Falloff control can help bend reality for the depth in the scene; you could think of it like a greyscale version of HDR tone-mapping an image, which makes it easier to jam a large depth range into a compressed distance.
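One way to picture that hyperbolic-to-linear slider (my own back-of-the-envelope model, not the node's documented formula) is as a blend between a physically derived 1/disparity depth and an evenly spread linear one:

```python
import numpy as np

disparity = np.linspace(0.1, 1.0, 5)  # normalized; larger = nearer

# Hyperbolic: physically derived Z ~ 1/disparity, so most of the
# grey range is spent on the foreground.
z_hyper = 1.0 / disparity
z_hyper = (z_hyper - z_hyper.min()) / (z_hyper.max() - z_hyper.min())

# Linear: spread the scene evenly over the grey range, like
# tone-mapping the depth.
z_linear = (disparity.max() - disparity) / (disparity.max() - disparity.min())

# The falloff slider can be thought of as blending the two curves.
falloff = 0.5
z = (1 - falloff) * z_hyper + falloff * z_linear
```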
The CopyAux node's "Enable Remapping" checkbox is great for scaling large floating-point depth ranges into a 0-1 color range. You can also invert the depth range on the output in this node. This range remapping is handy when you want to make an Over/Under Color/Depth JPEG or PNG image, or an MP4 movie, from your disparity-generated depth information, since that output has to be written at 8 or 10 bits per channel.
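The remapping itself is simple to picture. A sketch of the min/max normalization (plus the optional inversion) that gets a float depth pass ready for 8/10-bit export — the variable names here are mine, not the node's:

```python
import numpy as np

# Raw floating point depth from a disparity solve, in scene units.
z = np.array([1.2, 5.0, 37.5, 240.0])

# Remap into 0-1 so the data survives an 8/10-bit Over/Under export.
near, far = z.min(), z.max()
remapped = (z - near) / (far - near)

# Optionally invert so white = near, flipping the depthmap
# polarity here instead of with a separate node.
inverted = 1.0 - remapped
```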
KartaVR Example Comps
Using KartaVR for more advanced workflows assumes a certain familiarity with Fusion's native tools, node-based compositing in general, and the Fusion UI.
The following KartaVR examples won't cover every detail of the Fusion Studio disparity-mapping-to-Z360Stereo workflow, but they should get you closer to the end result you are after:
Use Reactor to install the "KartaVR > Comps > KartaVR Example Comps" atom package.
Then go to Fusion's "File > Open..." menu item. In the Open File dialog, expand the PathMaps section on the left side of the window and click the "Reactor:" PathMap entry. In the folder view of the dialog, navigate to the "Reactor:/Deploy/Comps/KartaVR/" folder.
(If you've never heard of the term PathMap before, a PathMap is a custom relative filepath in Fusion that is defined in Fusion's preferences.)
There are two Z360 examples in this folder, "Z360 Stereo.comp" and "Z360Mesh3D.comp", which show how the Z360 nodes convert the Over/Under Color/Depth images into either left/right-eye stereo color views or a 3D depth mesh you can view in Fusion.
There is also a "Stereo 3D Roto Conversion.comp" that shows how to use rotoshapes in Fusion to make a rough but usable depthmap by hand from a 2D monoscopic image.
On the Andrew Hazelden Blog page for "KartaVR Example 360VR Stitching Comps" take a look at these resources:
Creating Stereo Video Based Disparity Depthmaps
West Dover Forest Z360 Disparity Depth Stitch