That's my thought. I still think this problem could also be a card- or driver-specific issue, but there's not much I think I could do if that turns out to be the case. Trying to rule out any potential JOGL-related issue (i.e. either internal or in how I've used it) seemed like an easier first step.
At the time of this writing, the git tree can be found here: https://gitlab.com/ghost-in-the-zsh/sage2/
The relevant branches, commits, and files are as follows:
1. Commit where the issue was introduced: 49e879f5263ee1d15a479cbd51b9ea068629f8e0
2. Commit/diff with the workaround: 55bcea2efd78fb2272aaffea161a1e5c4f6d2690
Since most development takes place on my laptop, I didn't observe the issue until I tried running the tests on my desktop several months later; I got it "fixed" yesterday, before posting.
The relevant files are:
1. GL4RenderSystem implements the GLEventListener and contains the GLCanvas setup I had quoted before (as it was in the commits prior to the workaround).
2. GenericSceneManager was invoking RenderSystem.swapBuffers(), which internally invoked canvas.swapBuffers() as shown above (again, in the commits prior to the workaround).
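For anyone skimming without checking out the repo, the delegation chain described above can be sketched as follows. This is a minimal, self-contained model, not the actual SAGE2 code: the class names (GL4RenderSystem, GenericSceneManager) come from the post, but the bodies are stand-ins, and JOGL's GLCanvas is replaced with a counter stub so the sketch runs without JOGL on the class path.

```java
// Minimal model of the delegation chain described above.
// NOTE: this is a stand-in sketch, not the real SAGE2 classes;
// JOGL's GLCanvas is replaced by a stub so it runs standalone.
final class StubCanvas {
    int swapCount = 0;
    // stands in for JOGL's GLCanvas.swapBuffers()
    void swapBuffers() { swapCount++; }
}

final class GL4RenderSystem {
    final StubCanvas canvas = new StubCanvas();
    // RenderSystem.swapBuffers() simply delegates to the canvas
    void swapBuffers() { canvas.swapBuffers(); }
}

final class GenericSceneManager {
    final GL4RenderSystem renderSystem = new GL4RenderSystem();
    void renderFrame(int cameraCount) {
        for (int i = 0; i < cameraCount; i++) {
            // each camera renders its viewport here (omitted in this sketch)
        }
        renderSystem.swapBuffers(); // one swap after all cameras finish
    }
}

public class DelegationSketch {
    public static void main(String[] args) {
        GenericSceneManager mgr = new GenericSceneManager();
        mgr.renderFrame(4);
        System.out.println("swaps=" + mgr.renderSystem.canvas.swapCount);
    }
}
```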
Although you might not need to go this far, please note that if you actually intend to run test cases from the tests package, you should check out the feature/final-updates-and-documentation branch of the project and also add this separate math library to your class path.
In hindsight, I did wonder about this based on some odd results I saw. In a commit without the workaround, tests.ray.sage2.scene.MultiViewportTest does display a scene, but the lower-right camera/viewport does not show its content, while the other 3 cameras seem to display the scene correctly (the test has 4 cameras looking at the same scene from different angles).
I remember the thought of a race condition crossing my mind: since the GLEventListener runs on a separate thread, the scene manager could continue and end up asking the render system's canvas to swap the buffers before the last camera had a chance to finish. OTOH, that didn't seem to explain why other tests with a single camera for the scene (e.g. NodeBasedCameraTest) would display a completely black screen.
I should note that, depending on which combination of buffer-swapping mode and manual swapBuffers() calls you use, you could actually see enough through the severe flickering to realize that the scene had been rendered, even if it was not being displayed properly.
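That flicker pattern would be consistent with swapping twice per frame, e.g. if auto-swap stays enabled while a manual swap is also issued: the second swap puts the stale back buffer on screen, so the rendered scene is only visible part of the time. A tiny double-buffer simulation illustrates the effect; this is purely illustrative (no JOGL here), and the buffer labels are hypothetical.

```java
// Simulates a double-buffered canvas to show why swapping twice per
// frame can flicker: the second swap re-presents the stale buffer.
// Purely illustrative; not JOGL code.
public class DoubleSwapSketch {
    public static void main(String[] args) {
        String front = "stale", back = "stale";
        for (int frame = 1; frame <= 2; frame++) {
            back = "frame" + frame;            // render into the back buffer
            // manual swap: the freshly drawn frame becomes visible
            String t = front; front = back; back = t;
            System.out.println("after manual swap: showing " + front);
            // auto-swap fires as well: the stale contents come back
            t = front; front = back; back = t;
            System.out.println("after auto swap:   showing " + front);
        }
    }
}
```

Each frame alternates between the rendered image and stale contents, which matches "seeing the scene through severe flickering".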
I've not read the JOGL unit tests; I was relying on the javadocs. If, after looking at the code above, you have a more specific suggestion, I'd appreciate it.
What about the reason I provided for wanting to do it?
Is it incorrect to think that, if I have different cameras rendering into different viewports in the same window, JOGL will swap buffers for each camera per frame rather than just once per frame (i.e. only after all cameras have finished rendering)?
More concretely, if I have 4 cameras, I don't want 4 buffer swaps; rather, I want only 1 buffer swap after all 4 cameras have rendered into their viewports.
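The invariant I'm after can be sketched like this. It's stub code, not JOGL: the worker threads and counters are stand-ins, and in actual JOGL terms I believe this would correspond to disabling auto-swap on the canvas (setAutoSwapBufferMode(false)) and issuing a single swapBuffers() call yourself once all viewports are done.

```java
import java.util.concurrent.CountDownLatch;

// Sketch of the desired invariant: N cameras render (here, on worker
// threads, mirroring the idea that rendering happens off the caller's
// thread), and exactly ONE buffer swap happens after ALL have finished.
// Stand-in code; names and counters are hypothetical.
public class SingleSwapSketch {
    static int renders = 0;     // guarded by the synchronized block below
    static int swaps = 0;

    public static void main(String[] args) throws InterruptedException {
        final int cameras = 4;
        CountDownLatch done = new CountDownLatch(cameras);
        for (int i = 0; i < cameras; i++) {
            new Thread(() -> {
                synchronized (SingleSwapSketch.class) { renders++; }
                done.countDown();   // this camera's viewport is finished
            }).start();
        }
        done.await();               // wait for ALL cameras, then swap once
        swaps++;
        System.out.println("renders=" + renders + " swaps=" + swaps);
    }
}
```

The CountDownLatch also addresses the race I speculated about earlier: the swap cannot happen until the last camera has counted down.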
While I've heard of Ardor3D Continuation before, I'm quite time-constrained and don't really have the time to search through a non-trivial, unfamiliar code base aimlessly, just hoping to discover something, so if you have a specific suggestion on the right way to do this, I'd appreciate it.
I understand that, but I would've at least expected the issue to also show up on my laptop if I had simply not done something I should have. That never happened. I'm not dismissing the possibility; I'm just saying that the evidence at this time seems to point elsewhere.