Cool VL Viewer forum http://sldev.free.fr/forum/
Not quite a bug report, nor a help request... OpenGL on Mac? http://sldev.free.fr/forum/viewtopic.php?f=6&t=2473
Author: Gwyneth Llewelyn [ 2024-03-11 23:33:44 ]
Post subject: Not quite a bug report, nor a help request... OpenGL on Mac?
Bear with me for (possibly) posting on the wrong board — moderators are welcome to move it to whatever might be the right one. I have basically the following question... it is Mac-related, so I understand perfectly that this is one of those not-fully-supported things that possibly only @Catten may be able to address — if he's got time, of course!

Sooo today I noticed that, by default (and as it should!), Cool VL Viewer 1.32.0.12 falls back to the so-called "compatibility mode", OpenGL 2.1. So far, so good, but even my 10-year-old MacBook Pro (Retina, 15-inch, Mid 2014) supports far more recent OpenGL than that — according to GLView, the built-in "performance" graphics card is an NVIDIA GT 750M, which fully supports OpenGL 4.1 (and passes all the tests with flying colours!).

As such, I tried out the cute little setting which allows the Cool VL Viewer to "override" the ancient "default" OpenGL 2.1 "compatibility mode" and go straight to 3.2 or even 4.1 if it's supported; in my case, it certainly is, and the Cool VL Viewer "About" box shows that 4.1 is recognised and active. However, two things are strange...
Both are a bit baffling. I could expect that, for some reason, the viewer would crash (the 3.x and 4.1 support being outdated to the point that invoking it would cause a panic and a subsequent application crash). I could also expect that the hardware-based OpenGL libraries would not respond properly, and the viewer would fall back to the software version instead — with a drastic loss of performance, obviously. Instead, I have this weird situation where everything seems to be working — no crashing! — no performance seems to be lost, but none has been gained either — yet the textures refuse to load, which is very weird, since they're all in the cache (I can see them there!).

Here is how it looks to me: https://gyazo.com/3125d7b485767ce432a9afa35fcb4487

Notice that not a single texture has loaded, although all the meshes have, as well as the texture colours. When looking at the texture console, you can see that there are lots of textures with 1x1 sizes and 0 bytes — all of which are in CRE mode, i.e. "being created". But... they never are! With the build menu focused on a single face and requesting the texture data (via the Advanced menu), this seems to be confirmed — the image is still known to the system as having 1x1 dimensions (and zero bytes): https://gyazo.com/c9fdf25229cbce664e8f49b60045f871

I've also done the following: teleport to the same location with an alt and see what happens. The alt's (official SL) viewer has no problems whatsoever with any of the textures in that particular scene; all get correctly retrieved from the cache and rendered naturally. The alt can also see my main — all the textures will rez eventually (they take perhaps a bit longer than I'd expected, given that all those textures are already on the proxy... but I guess there is the need to check whether all textures are "fresh" enough, and retrieve them if they aren't). The point here is that everything works — there is nothing wrong — it's just that OpenGL 4.1 is struggling with the textures for some reason.

Reverting to OpenGL 2.1, of course, makes Cool VL display everything exactly as expected; textures stop being set to 1x1 and get refreshed/retrieved with the correct settings. Proxy settings have no relevance — it works no matter what the proxy settings are. The full log for this session can be seen on this gist. You can see that there are no major texture-related issues being reported, and you can see that OpenGL 4.1 is indeed selected/activated, but from my perspective, the textures are simply being ignored...

Now, I'm well aware that there are a lot of settings out there to tweak, and I cannot possibly say which ones I've turned off and which I've turned on. All I can say is that whatever setting I tried to change for OpenGL 4.1 didn't have any effect on the texture loading — but it didn't break OpenGL 2.1, either. I'm still undecided whether this is a bug, or whether it's just something on my ancient PowerBook that prevents OpenGL 4.1 from being properly launched/initialised... in which case the Cool VL Viewer might be tricked into believing that all textures are being properly loaded into VRAM while, in fact, they're not. But the viewer itself might not be to blame.
I don't know, and, unfortunately, merely from what I can observe (in-world debugging consoles and external logs written to disk...), I cannot decide if there are, indeed, bug(s) in the code that initialises the OpenGL 4.1 library for usage by the Cool VL Viewer — or if it's just my obsolete and outdated Mac that cannot activate some of the more sophisticated features of its graphics card(s)...
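In case it helps to show the kind of low-level check I have in mind, here is a tiny, purely illustrative snippet I put together for this post (it is not taken from the viewer's sources, and the helper name is made up): it uploads a 4x4 dummy texture and asks the driver what dimensions that texture really ended up with, which is essentially what the 1x1, 0-byte CRE entries above suggest is failing.

Code:
// Hypothetical self-test, not viewer code: upload a tiny RGBA texture and
// query back its real size. Needs a current OpenGL context; on macOS,
// include <OpenGL/gl3.h> (core profile) and link with -framework OpenGL.
#include <OpenGL/gl3.h>
#include <cstdio>

bool texture_upload_works()
{
    unsigned char pixels[4 * 4 * 4] = {0};   // 4x4 RGBA, all black

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 4, 4, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // Ask the driver what level 0 of the texture actually looks like now.
    GLint w = 0, h = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &w);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &h);
    GLenum err = glGetError();

    std::printf("uploaded 4x4, driver reports %dx%d, glGetError=0x%x\n", w, h, err);

    glDeleteTextures(1, &tex);
    return err == GL_NO_ERROR && w == 4 && h == 4;
}

If a check of that kind succeeded in a 4.1 core context here, the problem would more likely sit in the viewer's texture pipeline or in my settings than in the driver itself.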
Author: Henri Beauchamp [ 2024-03-11 23:37:12 ]
Post subject: Re: Not quite a bug report, nor a help request... OpenGL on
Enable the "Core GL profile" in the graphics preferences: this should force OpenGL v3.2 (or 4.1: macOS decides by itself) usage on macOS... You will need to restart the viewer after enabling the core profile.
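For reference, and only as a rough sketch of what that checkbox boils down to on the SDL2 side (illustrative code written for this reply, not the viewer's actual window code): on macOS you only get a v3.2/v4.1 context when a core profile is explicitly requested at context creation, for instance:

Code:
// Illustrative SDL2 probe, not Cool VL Viewer code. Build against SDL2 and
// link -framework OpenGL on macOS.
#include <SDL.h>
#include <SDL_opengl.h>
#include <cstdio>

int main(int argc, char* argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    // Request a core profile; macOS then decides by itself whether to hand
    // back a 3.2 or a 4.1 core context. Without these attributes, only the
    // 2.1 compatibility context is available.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

    SDL_Window* win = SDL_CreateWindow("GL core profile probe",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
    SDL_GLContext ctx = win ? SDL_GL_CreateContext(win) : nullptr;
    if (ctx) {
        // Reports what the OS actually handed back.
        std::printf("Got context: %s\n", (const char*)glGetString(GL_VERSION));
        SDL_GL_DeleteContext(ctx);
    } else {
        std::fprintf(stderr, "Context creation failed: %s\n", SDL_GetError());
    }
    if (win) SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}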
Author: Gwyneth Llewelyn [ 2024-04-05 13:13:35 ]
Post subject: Working towards a solution!
Uh-oh... sorry for being so late in replying to this thread, but now I get the following error:
This is before I can even get into the Preferences panel to change the "Core GL profile". Note that your assessment of the issue seems to be 100% accurate, at least according to the Khronos community: https://community.khronos.org/t/glsl-sy ... l/110076/6

The logs, of course, are too huge to post here, so I'm just adding the few relevant entries:
All right, so, since I can't enter Preferences, I thought — what if this is the kind of preference that is available via the user_settings.xml file? All I needed to know, of course, was what it's called. But — ha! No information whatsoever anywhere I searched! Thankfully, one of the many advantages of open-source software is, well, the ability to read the code. And so I looked for one of the settings mentioned on the above-quoted Khronos community forum: SDL_GL_CONTEXT_PROFILE_CORE. A quick search showed that this flag is only set once, namely on line 792 (or so) of indra/llwindow/llwindowsdl.cpp, after checking whether LLRender::sGLCoreProfile is set to true. LLRender::sGLCoreProfile, in turn, is assigned the value read from the preferences file in indra/newview/llappviewer.cpp, from a key named "RenderGLCoreProfile" — hooray, I only needed to set that key to Boolean 1 (= true), and that was it: Cool VL Viewer launched, I could fiddle with the Preferences again, and Help > About... now shows the expected string OpenGL version: 4.1 NVIDIA-16.0.13. Hooray!

Or... not quite. For some weird reason, once that setting is activated, textures do not load at all. That's actually quite strange, since the viewer behaves "as if" all textures had been properly loaded, in terms of FPS; also, the rest of the object data (i.e. location, rotation, polygonal mesh, etc.) is being loaded and placed correctly in-world. Turning on the Texture Console, I get the following display: https://gyazo.com/245505cfba834a99e8890af815f42d79

Note the following weird issues:
Afterwards, I noticed that there are not one but two parameters regarding the Core GL profile: RenderGLCoreProfile and RenderGLContextCoreProfile. The first is the one documented on the SL Wiki. The second... also seems to be used... since it appears in the user_settings.xml of other viewers. So I tried both on, both off, and every combination in between.

With "Core GL profile" unchecked and after restarting, rendering immediately works (in compatibility 2.1 mode, that is!) and produces the following (same location, same camera positioning): https://gyazo.com/e8315612708c020d9a09729d243a6727

Since this area has little traffic and doesn't have many objects in sight (I use it for testing because of that!), all textures were previously cached (either locally on disk or by the HTTP proxy server), which meant that all of them loaded in about a second (I'm not joking!). This, in fact, is what I expect to happen under normal conditions, with a static avatar that has been in a certain region for a long time, allowing all textures to be properly cached. And that's exactly what happens when OpenGL is in "compatibility mode" at 2.1.

Now, I wanted to do similar testing on other viewers, and the results are curious. Firestorm (latest stable version, still on the v6 renderer) accepts those two parameters, but it has a nasty tendency to overwrite them every time it launches; even with some minor hacks to force it to accept these settings, Firestorm refuses to do so and somehow seems hard-coded to stick to OpenGL 2.1. I guess it's something the developers decided to do for a good reason (perhaps they had the same issue/bug with textures not loading under OpenGL 4.x on older cards?...). Firestorm does not crash, nor give an error; it simply refuses to do anything with the settings and remains in compatibility mode.

The official Second Life Viewer, with the new v7 rendering engine, works flawlessly under OpenGL 4.x. While its texture console display differs from the Cool VL Viewer's, it should be a good reference for when everything is working as expected: https://gyazo.com/605d9ef5207214d0381be0b74b020411

Note a few things:
To conclude:
My guess is that this cannot be very hard to figure out, and perhaps it doesn't even need any code — it could be just an option that I should turn on or off somewhere. Who knows, perhaps by default, switching Enable Core GL on turns off the setting for enabling textures...

For the sake of argument, here is what my settings show: https://gyazo.com/7aba15965305cca7a6137ac110f424d2

Note that I'm attempting to force values (and I even tried setting DisableVRAMCheck to TRUE, just in case...) in the hope of getting more predictable VRAM calculations, but either everything I put there is wrong, or it's irrelevant in this case. And here is the log, searching by the keyword "texture":
Logs searching for "RAM":
Oh — I almost forgot! When OpenGL 4.1 is being "forced", the built-in Web browser also stops working! I only noticed that because the splash screen doesn't load (at first I assumed it was just being stubborn and loading slowly). Back in compatibility mode, the LL splash screen appears instantly. Launching the built-in browser under OpenGL 4.1 simply results in a black background. No errors — it says "loading", then simply assumes that everything has been loaded and displayed — but nothing was actually drawn. It remains a black screen (exactly like the splash screen on start).

Here are the logs from trying to use the built-in browser to connect to https://status.secondlifegrid.net/:
The only weird thing there is the empty MIME type, but since I generally don't pay attention to those things, my best guess is that this is actually what happens with all requests... there was no error in the logs, though.

Sorry for the long post. Please let me know if I can send you more data that might be pertinent to debugging this issue further!
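As a small postscript: when comparing viewers like this, a quick and generic way to double-check what kind of context each one really ended up with (independently of what its About box claims) is to ask the context itself. This is just a sketch I wrote for illustration, assuming a current OpenGL context; it is not code from any of the viewers:

Code:
// Hypothetical helper, not viewer code: report the current context's version
// and profile. The 3.x queries raise GL_INVALID_ENUM on a plain 2.1 context,
// in which case the values simply stay at 0.
#include <OpenGL/gl3.h>   // macOS core profile header; link -framework OpenGL
#include <cstdio>

void report_current_context()
{
    // Works on every context, old or new.
    std::printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));

    GLint major = 0, minor = 0, mask = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &mask);

    if (major >= 3) {
        const char* profile =
            (mask & GL_CONTEXT_CORE_PROFILE_BIT) ? "core" :
            (mask & GL_CONTEXT_COMPATIBILITY_PROFILE_BIT) ? "compatibility" : "unknown";
        std::printf("Context: OpenGL %d.%d, %s profile\n", major, minor, profile);
    } else {
        std::printf("Context: legacy (2.1-style), version queries unavailable\n");
    }
}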
Author: Henri Beauchamp [ 2024-04-05 14:02:02 ]
Post subject: Re: Working towards a solution!
You either messed up the settings, or, by forcing OpenGL v4.1 via macOS instead of simply ticking the Core GL profile box in the Graphics settings, you enabled PBR rendering (which would NOT be possible with the default OpenGL v2.1 compatibility mode); then, when you restart the viewer without fiddling with the macOS settings, it starts in PBR mode with shaders it cannot compile under the OpenGL v2.1 compatibility profile...

Please wipe out your settings_coolvlviewer_*.xml files (all of them, since if you hacked previous versions' settings, the viewer will crash again by trying to load an older hacked version) and launch the viewer, then go to the Graphics prefs, tick the Core profile check box, and restart the viewer: it shall then start in core profile mode and allow you to enable PBR.
Author: Gwyneth Llewelyn [ 2024-05-26 01:07:51 ]
Post subject: Re: Not quite a bug report, nor a help request... OpenGL on
Aw... I do apologise for the lack of reply... But I just wanted to confirm that everything you said about tweaking/forcing the configuration was absolutely correct! Once I deleted all those settings and let Cool VL Viewer 'rebuild' the configuration file on its own, everything immediately started to work.

Aye, I can confirm that OpenGL was still being detected as 4.1; PBR materials were available to select (and there was an immediately perceptible difference!), and all the other advanced options became available. Still, I cannot say how much is being done in hardware, and how much in software. What I can say is that the new renderer code being fed to the GPU is much better than the previous generation — so I almost always get an improvement in graphics speed with the "new renderer", even at its lowest settings — and everything still looks much better than before.

Thank you again for your patience and your most thorough explanations! I've humbly learned my lesson... although I love to tinker and break things.
Page 1 of 1 | All times are UTC
Powered by phpBB® Forum Software © phpBB Group https://www.phpbb.com/