
Re: OpenGL 3.2 in OS X 10.7

Posted by JStoecker on Jul 20, 2011; 11:02pm
URL: https://forum.jogamp.org/OpenGL-3-2-in-OS-X-10-7-tp3186330p3186974.html

No kidding Sven, this is way overdue. Here is a sample C program that I used to check the context. I found this code on a forum somewhere, since in the past I've only used GLUT for the window:

#include <OpenGL/OpenGL.h>
#include <OpenGL/gl3.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  CGLContextObj ctx;
  CGLPixelFormatObj pix;
  GLint npix;

  /* Request a 3.2 core profile (new in OS X 10.7). */
  CGLPixelFormatAttribute attribs[] = {
    kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute) kCGLOGLPVersion_3_2_Core,
    (CGLPixelFormatAttribute) 0 };

  if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || pix == NULL) {
    fprintf(stderr, "No pixel format with a 3.2 core profile available\n");
    return 1;
  }
  CGLCreateContext(pix, NULL, &ctx);
  CGLDestroyPixelFormat(pix);
  CGLSetCurrentContext(ctx);

  printf("Vendor:   %s\n", glGetString(GL_VENDOR));
  printf("Renderer: %s\n", glGetString(GL_RENDERER));
  printf("Version:  %s\n", glGetString(GL_VERSION));
  printf("GLSL:     %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));

  CGLSetCurrentContext(NULL);
  CGLDestroyContext(ctx);
  return 0;
}


Compiled with "gcc -framework OpenGL <filename>", and the output after running:

Vendor:   ATI Technologies Inc.
Renderer: AMD Radeon HD 6750M OpenGL Engine
Version:  3.2 ATI-7.2.9
GLSL:     1.50


To be honest, I can't really answer your questions about how OS X handles the binaries and downgrades. I'm actually a Linux user and only recently purchased my first Mac (to the dismay of some of my Linux peers). So far I've had no issues running any JOGL programs I've written with RC2. I *think* Lion is 64-bit only, though.