Something I've been meaning to do for a while is get Sitrec working in VR, so you can see exactly what the observer would see from that vantage point. I've now got a basic system working. It still has a bunch of bugs, but it's already useful.
You just load Sitrec in your headset's browser (here I'm using a Meta Quest 3), then click on "Enter VR". This moves your headset's viewpoint to the look camera position (i.e., the simulated viewpoint normally shown on the right of the screen).
The example above is a very simple reproduction of Fravor's encounter, based on his verbal description. You see the Tic-Tac about 20 seconds in, along with a 300 ft patch of water.
VR is good because it gives you the actual optical size of objects. The Tic-Tac is visible from 20,000 feet, but its shape and details are not very clear. Unfortunately, the view is limited by the resolution of the headset; an Apple Vision Pro would be better than the Meta Quest.
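As a rough sanity check on why detail is hard to make out at that range, you can compute the angular size directly. A minimal sketch (the ~40 ft object length is my assumption for illustration, not from the post; the 20,000 ft distance is from above):

```python
import math

def angular_size_deg(object_size_ft, distance_ft):
    """Angular size in degrees of an object seen broadside at a given distance."""
    return math.degrees(math.atan(object_size_ft / distance_ft))

# Hypothetical numbers: a ~40 ft object at the 20,000 ft distance mentioned above.
deg = angular_size_deg(40, 20000)
arcmin = deg * 60
print(f"{deg:.3f} deg = {arcmin:.1f} arcmin")  # about 0.115 deg = 6.9 arcmin
```

At the roughly 25 pixels per degree commonly cited for the Quest 3, that works out to only about three pixels across, which is consistent with the object being visible but its shape unclear.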
I'll refine this over time. I need to figure out how to script/edit the motion of both objects to match the full narrative - and then maybe set it to his JRE interview audio.
https://www.metabunk.org/sitrec/?cu...aws.com/1/Fravor Encounter/20251129_005045.js