BrickFarmer (65 posts):
As previously mentioned, I've been trying to track down the change that caused my GL_POLYGON_OFFSET_FILL / depth buffer issue. This had been working fine for many years until recently. I've narrowed it down to something that changed between b224 and b226 (b225 also, but that seems to be a partial build). When I try to trace this back to Hudson, the dates/times don't seem to tie up with anything relevant, so I suspect the change was actually one day earlier: 'choose and use graphics capability' (96af6c9bf2d683115996214cd895f9f9ef7ceea6).
Further testing has shown that prior to b225 a 24-bit depth buffer was selected automatically; after that, only a 16-bit buffer is selected. So it seems like the code fails to detect the best available depth? If I manually select 24-bit in my request, then things are working again :)
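For anyone hitting the same thing, a minimal sketch of what the explicit 24-bit request looks like; it assumes the JOGL 2 javax.media.opengl packages of that era (later builds moved to com.jogamp.opengl), and the surrounding canvas setup is only illustrative:

import javax.media.opengl.GLCapabilities;
import javax.media.opengl.GLProfile;
import javax.media.opengl.awt.GLCanvas;

public class Depth24Request {
    public static void main(String[] args) {
        // Explicitly ask for a 24-bit depth buffer instead of relying on the default,
        // which dropped from 24 to 16 bits around b225.
        final GLProfile profile = GLProfile.getDefault();
        final GLCapabilities caps = new GLCapabilities(profile);
        caps.setDepthBits(24);

        // The canvas built from these capabilities then requests the deeper buffer.
        final GLCanvas canvas = new GLCanvas(caps);
        // ... add the canvas to a Frame and register a GLEventListener as usual.
    }
}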
Experiments: https://github.com/WhiteHexagon
Sven (Administrator, 2933 posts):
On Wednesday, December 15, 2010 13:01:54 BrickFarmer [via jogamp] wrote:
> Further testing has shown that prior to b225 a 24-bit depth buffer was selected automatically; after that, only a 16-bit buffer is selected. [...] If I manually select 24-bit in my request, then things are working again :)

A perfect analysis, good job. Indeed, I reduced the default depth value of GLCapabilities from 24 to 16 (2aa296771e3e8dd6cf027f27b0455d1803244bfe), since some GPUs don't support such a high value. So this will be the default now; feel free to request a higher value :) Sorry for the inconvenience.

~Sven
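If you want to verify which depth was actually granted (rather than what was requested), the chosen capabilities can be queried from the drawable inside init(); a rough sketch, again assuming the JOGL 2 javax.media.opengl API:

import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLCapabilitiesImmutable;
import javax.media.opengl.GLEventListener;

public class DepthCheckListener implements GLEventListener {
    @Override
    public void init(GLAutoDrawable drawable) {
        // Report the capabilities the window system actually granted,
        // e.g. 16 vs. 24 depth bits.
        final GLCapabilitiesImmutable chosen = drawable.getChosenGLCapabilities();
        System.out.println("Chosen depth bits: " + chosen.getDepthBits());
    }

    @Override public void display(GLAutoDrawable drawable) { }
    @Override public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
    @Override public void dispose(GLAutoDrawable drawable) { }
}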
BrickFarmer (65 posts):
I thought the code was designed to pick out the best bit depth on offer? At least I saw something during my investigations that implied that, but I could be mistaken.
Experiments: https://github.com/WhiteHexagon
Sven (Administrator, 2933 posts):
On Wednesday, December 15, 2010 13:50:05 BrickFarmer [via jogamp] wrote:
> I thought the code was designed to pick out the best bit depth on offer? [...]

If you explicitly pass a CapsChooser, that one may do it. If not, and the native stack recommends a format, we use that one. The native recommendation is what the WGL/GLX ARB choose functions provide; I thought this matches the native behavior best. But you may override it with your own chooser .. and maybe we just have a bug :) If you like, your analysis is very welcome as always, for sure, while I'm trying to fix the Windows BITMAP bug (offscreen rendering if no GLPbuffer is available).

~Sven
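For completeness, a custom chooser that prefers the deepest available depth buffer might look roughly like the sketch below. The chooseCapabilities signature shown is the JOGL 2 javax.media.opengl/javax.media.nativewindow one and has varied between releases, so treat it as an assumption rather than the exact interface of build 226:

import java.util.List;

import javax.media.nativewindow.CapabilitiesImmutable;
import javax.media.opengl.GLCapabilitiesChooser;
import javax.media.opengl.GLCapabilitiesImmutable;

// Sketch of a chooser that picks the pixel format with the most depth bits,
// falling back to the window system's recommendation if nothing better is found.
public class DeepestDepthChooser implements GLCapabilitiesChooser {
    @Override
    public int chooseCapabilities(final CapabilitiesImmutable desired,
                                  final List<? extends CapabilitiesImmutable> available,
                                  final int windowSystemRecommendedChoice) {
        int bestIndex = windowSystemRecommendedChoice;
        int bestDepth = -1;
        for (int i = 0; i < available.size(); i++) {
            final CapabilitiesImmutable c = available.get(i);
            if (c instanceof GLCapabilitiesImmutable) {   // also skips null entries
                final int depth = ((GLCapabilitiesImmutable) c).getDepthBits();
                if (depth > bestDepth) {
                    bestDepth = depth;
                    bestIndex = i;
                }
            }
        }
        return bestIndex;
    }
}

The chooser would then be passed in alongside the GLCapabilities when the canvas or window is created; the exact constructor or factory overload depends on the toolkit in use.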