GLCanvas: "Unsupported color depth" on VmWare Linux [message #1860196]
Fri, 21 July 2023 12:23
Tom Wheeler — Messages: 17 — Registered: June 2018 — Junior Member
I have an Eclipse RCP application that uses SWT's GLCanvas for OpenGL graphics. It works fine on Windows 10 and 11 as well as on a native Linux PC.
However, when I run the application on Fedora 34 Linux inside VMware Workstation Pro 17.0.2, I get an "org.eclipse.swt.SWTException: Unsupported color depth".
To dig into the problem, I installed the org.eclipse.swt sources and wrote a small Java test program. That way I found that the line failing in GLCanvas is:
long infoPtr = GLX.glXChooseVisual (xDisplay, OS.XDefaultScreen (xDisplay), glxAttrib);
After this method call, 'infoPtr' is 0, which means that the call has failed.
Here is my Java test code:
package openglexperiments;

import org.eclipse.swt.SWT;
import org.eclipse.swt.opengl.GLCanvas;
import org.eclipse.swt.opengl.GLData;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

public class GLExperiments {

    public void go() {
        Display display = new Display();
        Shell shell = new Shell(display);
        shell.setText("OpenGL Experiments");
        shell.setSize(600, 400);
        shell.open();

        GLData data = new GLData();
        // data.doubleBuffer = true;
        // data.depthSize = 32;
        // data.sampleBuffers = 1;
        // data.samples = 4;
        // data.redSize = 8;
        // data.greenSize = 8;
        // data.blueSize = 8;
        // data.alphaSize = 8;
        System.out.println("data=" + data);

        // This constructor throws the SWTException on the VMware guest:
        GLCanvas glcanvas = new GLCanvas(shell, SWT.NONE, data);

        while (!shell.isDisposed()) {
            if (!display.readAndDispatch())
                display.sleep();
        }
        display.dispose();
    }

    public static void main(String[] args) {
        GLExperiments glExperiments = new GLExperiments();
        glExperiments.go();
        System.exit(0);
    }
}
I experimented a little with the contents of GLData, but even when not setting any of its attributes, it still failed.
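To see what glXChooseVisual is actually being asked for, here is a hedged, self-contained Java sketch of how the GLData fields end up in the glxAttrib array. This is my own simplified model, not the actual SWT source; the GLX_* values are the standard constants from GL/glx.h, and the class and method names are invented for the sketch.

```java
import java.util.Arrays;

// Simplified model (an assumption, NOT the real org.eclipse.swt.opengl.GLCanvas
// code) of how GLData-like fields could be translated into the attribute list
// that glXChooseVisual receives. GLX_* values are the standard glx.h constants.
public class GlxAttribSketch {

    static final int GLX_RGBA         = 4;
    static final int GLX_DOUBLEBUFFER = 5;
    static final int GLX_RED_SIZE     = 8;
    static final int GLX_GREEN_SIZE   = 9;
    static final int GLX_BLUE_SIZE    = 10;
    static final int GLX_DEPTH_SIZE   = 12;
    static final int None             = 0;

    /** Builds a glXChooseVisual attribute list from GLData-like fields. */
    static int[] buildAttribList(boolean doubleBuffer, int redSize, int greenSize,
                                 int blueSize, int depthSize) {
        int[] attrib = new int[32];
        int pos = 0;
        attrib[pos++] = GLX_RGBA;                       // always requested
        if (doubleBuffer)  attrib[pos++] = GLX_DOUBLEBUFFER;
        if (redSize > 0)   { attrib[pos++] = GLX_RED_SIZE;   attrib[pos++] = redSize; }
        if (greenSize > 0) { attrib[pos++] = GLX_GREEN_SIZE; attrib[pos++] = greenSize; }
        if (blueSize > 0)  { attrib[pos++] = GLX_BLUE_SIZE;  attrib[pos++] = blueSize; }
        if (depthSize > 0) { attrib[pos++] = GLX_DEPTH_SIZE; attrib[pos++] = depthSize; }
        attrib[pos] = None;                             // list terminator
        return Arrays.copyOf(attrib, pos + 1);
    }

    public static void main(String[] args) {
        // Default GLData-like settings yield only {GLX_RGBA, None}. Per the
        // glXChooseVisual man page, omitting GLX_DOUBLEBUFFER means only
        // SINGLE-buffered visuals are considered, which some drivers do not
        // offer at all.
        System.out.println(Arrays.toString(buildAttribList(false, 0, 0, 0, 0)));

        // A list equivalent to the one in the working C test below
        // (attribute order is irrelevant to glXChooseVisual):
        System.out.println(Arrays.toString(buildAttribList(true, 0, 0, 0, 24)));
    }
}
```

Two things this suggests trying, both hedged: the glXChooseVisual documentation says that without GLX_DOUBLEBUFFER only single-buffered visuals are considered, and 32-bit depth buffers are rarely available, so each of the commented-out combinations above contains at least one attribute that could make the call fail on its own. A run with only data.doubleBuffer = true and data.depthSize = 24 (matching the C test's attribute list) might behave differently.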
I then wanted to see whether a C program that uses glXChooseVisual() would run fine.
I got the C-source code from here:
https://www.khronos.org/opengl/wiki/Programming_OpenGL_in_Linux:_GLX_and_Xlib
// -- Written in C -- //
#include <stdio.h>
#include <stdlib.h>
#include <X11/X.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <GL/glu.h>

Display              *dpy;
Window                root;
GLint                 att[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None };
XVisualInfo          *vi;
Colormap              cmap;
XSetWindowAttributes  swa;
Window                win;
GLXContext            glc;
XWindowAttributes     gwa;
XEvent                xev;

void DrawAQuad() {
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-1., 1., -1., 1., 1., 20.);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0., 0., 10., 0., 0., 0., 0., 1., 0.);

    glBegin(GL_QUADS);
    glColor3f(1., 0., 0.); glVertex3f(-.75, -.75, 0.);
    glColor3f(0., 1., 0.); glVertex3f( .75, -.75, 0.);
    glColor3f(0., 0., 1.); glVertex3f( .75,  .75, 0.);
    glColor3f(1., 1., 0.); glVertex3f(-.75,  .75, 0.);
    glEnd();
}

int main(int argc, char *argv[]) {
    dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        printf("\n\tcannot connect to X server\n\n");
        exit(0);
    }

    root = DefaultRootWindow(dpy);

    vi = glXChooseVisual(dpy, 0, att);
    if (vi == NULL) {
        printf("\n\tno appropriate visual found\n\n");
        exit(0);
    }
    else {
        printf("\n\tvisual %p selected\n", (void *)vi->visualid); /* %p creates hexadecimal output like in glxinfo */
    }

    cmap = XCreateColormap(dpy, root, vi->visual, AllocNone);
    swa.colormap = cmap;
    swa.event_mask = ExposureMask | KeyPressMask;

    win = XCreateWindow(dpy, root, 0, 0, 600, 600, 0, vi->depth, InputOutput,
                        vi->visual, CWColormap | CWEventMask, &swa);
    XMapWindow(dpy, win);
    XStoreName(dpy, win, "VERY SIMPLE APPLICATION");

    glc = glXCreateContext(dpy, vi, NULL, GL_TRUE);
    glXMakeCurrent(dpy, win, glc);
    glEnable(GL_DEPTH_TEST);

    while (1) {
        XNextEvent(dpy, &xev);

        if (xev.type == Expose) {
            XGetWindowAttributes(dpy, win, &gwa);
            glViewport(0, 0, gwa.width, gwa.height);
            DrawAQuad();
            glXSwapBuffers(dpy, win);
        }
        else if (xev.type == KeyPress) {
            glXMakeCurrent(dpy, None, NULL);
            glXDestroyContext(dpy, glc);
            XDestroyWindow(dpy, win);
            XCloseDisplay(dpy);
            exit(0);
        }
    } /* this closes while(1) { */
} /* this closes int main(int argc, char *argv[]) { */
The code is compiled using this command:
gcc -o quad quad.c -lX11 -lGL -lGLU
Somewhat surprisingly, the C program worked just fine. This proves that OpenGL (via GLX) is working on my virtual Linux machine; I am just not able to use it from GLCanvas.
Here is a little bit of the output from glxinfo:
Extended renderer info (GLX_MESA_query_renderer):
Vendor: VMware, Inc. (0x15ad)
Device: SVGA3D; build: RELEASE; LLVM; (0x405)
Version: 21.1.8
Accelerated: no
Video memory: 1MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 3.3
Max compat profile version: 3.3
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 2.0
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: SVGA3D; build: RELEASE; LLVM;
OpenGL core profile version string: 3.3 (Core Profile) Mesa 21.1.8
OpenGL core profile shading language version string: 3.30
Does anybody have an idea how to make GLCanvas work on a Linux guest under VMware?