I'm quite stuck here, and can't continue any development until this mystery is explained.
I'm loading a 2049 x 2049 texture (the size doesn't matter -- this one is just an example). I want to pass it as GL2.GL_RGB or GL2.GL_BGR (that also doesn't matter), so I prepare my ByteBuffer as follows:

ByteBuffer.allocateDirect(2049 * 2049 * 3).order(ByteOrder.nativeOrder());

Later I fill it with the texture data and call glTexImage2D, and here's what I get as a response:

Exception in thread "AWT-EventQueue-0" java.lang.IndexOutOfBoundsException: Required 12597251 remaining bytes in buffer, only had 12595203

So now let's count: 2049 * 2049 * 3 = 12595203. Why does it ask me for 12597251 bytes? Since I had one of last year's JOGL releases installed, I decided to install the newest one (2011-03-03), but the problem is still there. Maybe I don't understand something?

BTW, if I add an alpha channel (change GL2.GL_RGB to GL2.GL_RGBA) and extend the buffer to 2049 * 2049 * 4, everything works fine. This is a complete mystery to me...

Edit: I've just noticed that 12597251 - 12595203 = 2048 = 2049 - 1. There must be some dependency, eh?
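One possible explanation (my assumption -- nothing in this thread confirms it): OpenGL's default GL_UNPACK_ALIGNMENT is 4, so the binding assumes each pixel row is padded to a multiple of 4 bytes. An RGB row of 2049 pixels is 2049 * 3 = 6147 bytes, padded to 6148; 6148 bytes for each of the first 2048 rows plus 6147 for the last row gives exactly 12597251. With RGBA, the row size 2049 * 4 = 8196 is already a multiple of 4, which would explain why the alpha version works. A self-contained sketch of that arithmetic (the method names are mine, not JOGL's):

```java
// Sketch of how a GL binding might compute the required image buffer size
// when GL_UNPACK_ALIGNMENT is 4 (the OpenGL default).
public class UnpackAlignmentDemo {

    // Pad a row of 'rowBytes' up to the next multiple of 'alignment'.
    public static int alignedRowSize(int rowBytes, int alignment) {
        return ((rowBytes + alignment - 1) / alignment) * alignment;
    }

    // Bytes required for a width x height image: every row except the
    // last is padded to the alignment; the last row needs no padding.
    public static int requiredBytes(int width, int height, int bytesPerPixel, int alignment) {
        int row = width * bytesPerPixel;
        return alignedRowSize(row, alignment) * (height - 1) + row;
    }

    public static void main(String[] args) {
        System.out.println(requiredBytes(2049, 2049, 3, 4)); // 12597251 -- matches the exception
        System.out.println(requiredBytes(2049, 2049, 4, 4)); // 16793604 == 2049 * 2049 * 4, so RGBA "just works"
        // A likely fix (hypothetical, untested here): call
        // gl.glPixelStorei(GL2.GL_UNPACK_ALIGNMENT, 1) before glTexImage2D,
        // or allocate the buffer with the padded row stride.
    }
}
```

If this is the cause, it is not an NPOT issue at all: any width whose RGB row size is not a multiple of 4 would hit it.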
Administrator
>> BTW, if I add alpha channel (change from GL2.GL_RGB to GL2.GL_RGBA) and extend buffer to 2049 * 2049 * 4 - everything works fine - this is a complete mystery to me...

Maybe you are reading the alpha channel from the texture; that is why 2049 * 2049 * 4 worked!

Cheers, Rami
I'm not reading anything; currently I'm talking ONLY about the glTexImage2D call, which immediately throws the exception posted above.
Administrator
>> I'm not reading anything, currently I'm talking ONLY about "glTexImage2D" call which immediately throws an exception posted above.

OK, tried it; this works for me!

IntBuffer test = IntBuffer.allocate(1);
ByteBuffer buff = ByteBuffer.allocate(4096 * 4096 * 3);
gl.glGenTextures(1, test);
gl.glBindTexture(GL3.GL_TEXTURE_2D, test.get(0));
gl.glTexImage2D(GL3.GL_TEXTURE_2D, 0, GL3.GL_RGB, tex_width, tex_height, 0, GL3.GL_RGB, GL3.GL_UNSIGNED_BYTE, buff);

Is this what you're doing? Can you post more info about your hardware etc.?

Cheers, Rami
Alright, I figured it out, but I need some explanations. Rami, please try 2049 x 2049, or 4097 x 4097, or whatever (2^n + 1) x (2^n + 1) sizes, and report your results. I've just tried power-of-two sizes and they worked, but why doesn't it work with 2^n + 1 sizes? I believe it should, shouldn't it!?

Radeon HD 5870
Administrator
Nope. Not all cards support non-power-of-two sizes. But you can overcome this by transforming your texture to a power-of-two one.

Cheers, Rami
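Transforming to a power-of-two size, as suggested, usually means rounding each dimension up to the next power of two and then rescaling or padding the image data. A minimal sketch of the rounding step (the method name is mine, not a JOGL API):

```java
// Round a texture dimension up to the next power of two -- a common
// workaround for hardware without NPOT (non-power-of-two) texture support.
public class Pot {

    public static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1; // keep doubling until we reach or pass n
        }
        return p;
    }

    public static void main(String[] args) {
        System.out.println(nextPowerOfTwo(2049)); // 4096
        System.out.println(nextPowerOfTwo(2048)); // 2048 (already a power of two)
    }
}
```

Note the cost: a 2049 x 2049 image rounds up to 4096 x 4096, nearly quadrupling the memory footprint, which matters for the large textures discussed in this thread.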
I have to compress the texture using OpenGL, and I need it to be of size 2^n + 1.
Administrator
In reply to this post by Haroogan
That's strange -- Ralph just encountered a similar problem on another thread (http://forum.jogamp.org/Textures-program-works-on-one-machine-but-not-on-another-td2566528i20.html) with an ATI Radeon HD 4870. Both cards should support ARB_texture_non_power_of_two, though I can't find docs online to confirm this.
Haroogan, if you run etc\test.bat (or etc/test.sh for Linux/Mac) from the JOGL 2 distribution, do you see GL_ARB_texture_non_power_of_two in the list of extensions printed at the end?
WW, I've just run the test and got this:

java.lang.NoClassDefFoundError: com/jogamp/newt/opengl/GLWindow
Caused by: java.lang.ClassNotFoundException: com.jogamp.newt.opengl.GLWindow
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
Could not find the main class: com.jogamp.newt.opengl.GLWindow. Program will exit.
Exception in thread "main"

It seems like I'm doing something wrong? How do I run this test properly? :)
Administrator
That's strange that etc/test.sh doesn't work, but we'll assume that the ATI 5870 supports NPOT, since we've confirmed on the other thread that the 4870 supports NPOT.
Are you running the same code as Rami posted above (i.e., with no calls to any part of JOGL except through gl.gl*() methods)? If Rami's code is failing for you, I'm not sure where the problem can be, since those gl* calls are almost a straight pass-through to the native code. There's a little JOGL code in there to get buffer offsets, but that code doesn't seem to know about NPOT. Maybe check your GL2 object and make sure it's 2.1 or later (that you're not somehow using OpenGL ES or something)? I'll try this on an old ATI card tonight to see what happens.
Yeah, as I've said, no intermediate calls, just glTexImage2D. It works fine with powers of 2, but with 2^n + 1 it throws the exception I described in the first post.

Moreover, I'll go into some detail here. I'm trying to do S3 Texture Compression using my graphics card. Yeah, I know there are specific tools, but right now I have to do it on my own, so I have to compress non-power-of-2 images. But since non-power-of-2 doesn't work for some reason, let's talk about powers of 2, because there are some issues on the texture compression topic as well.

First, to learn how many compression formats my card supports, I call glGetIntegerv with GL2.GL_NUM_COMPRESSED_TEXTURE_FORMATS. Guess what I got :) I'll tell you: 0. So if I believe the result of that glGetIntegerv call, my high-end video card does not support texture compression! But that was just the beginning. Ignoring the 0, I tried to compress a 2048x2048 image forcefully, and it worked! So I got my DXT1 compression on a power-of-2 image (despite the 0 result from glGetIntegerv). Finally, I tried to compress a 4096x4096 image the same way, and it threw a GL_OUT_OF_MEMORY error -- again, on a high-end video card?!

All these bugs are really strange. WW, it would be great if you could test all these cases on your machines. Waiting for your feedback :)
Administrator
I tried this last night, but my old ATI Mobility Radeon x300 is too old for this -- it doesn't support NPOT textures at all (the extension is really not present).
I'm seeing some other strange behavior with this ATI card too, since build 271. You might try some very old versions of JOGL 2, to see if this NPOT bug appears after a certain date. That would help us pin down if there was some code change in JOGL 2 that's somehow causing this. Try going backwards in the dev builds, by increments of 50 or so, and see if NPOT starts working. If it does (before you get back to b150 or so, which is the earliest), then you can do a bisection search to see which exact build broke it. |
Administrator
In reply to this post by Haroogan
Also, try looking at the GL2 object in your program in the debugger, and see if it shows up as a GL 2.1+ object. If it's showing OGL 1.1 or something, you may not be getting the HW accelerated driver at all (I've occasionally seen this before on laptops in JOGL 1 when restarting from sleep mode or undocking monitors).
I'll try all this a little bit later, but the problem is still evident. It must be JOGL, because NPOT textures work fine when I add an alpha channel. As I said in my very first post: when I specify an NPOT texture with the RGB or BGR format, the exception is thrown; if I use BGRA or RGBA with an NPOT texture, everything works fine. I'm 100% sure you have to check the code inside the glTexImage function, or whatever may affect it, because it is a JOGL exception...
Administrator
Sorry for the difficulty of my suggestion, but I think it's the best way to find out if some change to the JOGL 2 code is causing the problem. If you can give us an exact JOGL build number that's the first one to fail, it will narrow down the possibilities considerably.
Or, if you look at old builds and JOGL has always failed like this, that will be helpful too. Maybe this bug has always been there and we didn't realize it.

In the meantime, I'm still looking at the ATI HotSpot crash bugs that have been reported by others (since I can duplicate that bug on my machine). There may be some commonality here, as it looks like the driver function pointers are somehow set wrong.
Exception in thread "AWT-EventQueue-0" javax.media.opengl.GLException: Thread[AWT-EventQueue-0,6,main] glGetError() returned the following error codes after a call to glTexImage2D(<int> 0xDE1, <int> 0x0, <int> 0x1908, <int> 0x1001, <int> 0x1001, <int> 0x0, <int> 0x1908, <int> 0x1401, <java.nio.Buffer>): GL_OUT_OF_MEMORY ( 1285 0x505)

I was just trying to glTexImage2D two 4097x4097 textures. The first one was uploaded to the GPU successfully, but the second one throws the above exception. This means I can never have two textures of that size on the GPU. Moreover, an 8193x8193 texture somehow loads to the GPU without any problems, yet its size is twice that of two 4097x4097 textures combined. This proves I have enough memory (but I don't think we really need to prove that with a Radeon HD 5870, do we? :)). Guys, you really have to fix this, I can't work like that :)

Right now I'm going to refresh the drivers; we'll see how it goes...

Reinstalling the drivers didn't help -- this is JOGL...
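The size claim above checks out: at 4 bytes per texel, one 8193x8193 texture needs roughly twice the memory of two 4097x4097 textures together, so raw VRAM capacity is indeed an unlikely culprit (though a driver can still report GL_OUT_OF_MEMORY for reasons other than capacity, e.g. failing to find a suitable allocation for an odd NPOT size -- that part is speculation). Quick arithmetic:

```java
// Compare the raw memory footprints mentioned in the post, assuming
// 4 bytes per texel (RGBA, GL_UNSIGNED_BYTE).
public class TexSize {

    public static long bytes(int width, int height, int bytesPerTexel) {
        return (long) width * height * bytesPerTexel;
    }

    public static void main(String[] args) {
        long one8193 = bytes(8193, 8193, 4);     // ~256 MiB for one 8193x8193 texture
        long two4097 = 2 * bytes(4097, 4097, 4); // ~128 MiB for two 4097x4097 textures
        System.out.println(one8193); // 268500996
        System.out.println(two4097); // 134283272
    }
}
```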
Is your texture data buffer a direct NIO buffer, or is it allocated on the heap? I recommend using Buffers.newFooBuffer() for buffer allocation whenever possible. Since you are dealing with large buffers, you may want to raise the JVM's direct memory limit (-XX:MaxDirectMemorySize=) in case the buffer wouldn't fit when directly allocated. If you are already using direct NIO buffers... please report back.

Best regards, Michael

--
http://michael-bien.com/
What's the difference between

ByteBuffer.allocateDirect(n).order(ByteOrder.nativeOrder());

and

Buffers.newFooBuffer()?

I've already said that I'm using the first one. All 3 issues plus this one still remain. And I guess you don't know how to fix them, do you? ;)
No difference.

So you are using direct buffers -- good. Now I have one less code path to trace through :)

-michael
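For what it's worth, both forms should yield a direct, native-ordered buffer. Here is a quick JDK-only check of those properties (Buffers.newDirectByteBuffer() lives in JOGL's GlueGen runtime and is assumed, not exercised, here):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Verify that ByteBuffer.allocateDirect(...).order(nativeOrder()) gives a
// direct buffer in the platform's byte order -- the same properties the
// JOGL Buffers helper is expected to provide.
public class DirectBufferCheck {
    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocateDirect(64 * 3)
                                   .order(ByteOrder.nativeOrder());
        System.out.println(buf.isDirect());                         // true
        System.out.println(buf.order() == ByteOrder.nativeOrder()); // true
        System.out.println(buf.capacity());                         // 192
    }
}
```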
I can tell you more: I've just tried to load two 2049x2049 textures to the GPU, and it's OK. But you see, two 4097x4097 textures already throw the out-of-memory exception. Also, please have a look at my very first post about NPOT textures and NIO buffer sizes.