Cool VL Viewer forum




Not quite a bug report, nor a help request... OpenGL on Mac? 

Joined: 2012-07-08 17:37:36
Posts: 10
Location: Neufreistadt, Confederation of Democratic Simulators, Second Life
Bear with me for (possibly) posting on the wrong board — moderators are welcome to move it to whatever might be the right one :)

I have basically the following question... it is Mac-related, so I understand perfectly that this is one of those not-fully-supported things that possibly only @Catten may be able to address — if he's got time, of course!

Sooo today I noticed that, by default (and as it should!), Cool VL Viewer 1.32.0.12 starts in the so-called "compatibility mode", OpenGL 2.1.

So far, so good, but even my 10-year-old MacBook Pro (Retina, 15-inch, Mid 2014) supports much more recent OpenGL — according to GLView, the built-in "performance" graphics card is an Nvidia GT 750M which fully supports OpenGL 4.1 (and passes all the tests with flying colours!).

As such, I tried out the cute little setting which allows the Cool VL Viewer to "override" the ancient "default" OpenGL 2.1 "compatibility mode" and go straight to 3.2 or even 4.1 if supported; in my case, it certainly is, and the Cool VL Viewer "About" box shows that 4.1 is recognised and active.

However, two things are strange...

  1. FPS is about the same as with 2.1, which is sort of weird; I'd expect a bit more performance, since many more shaders can be pushed into the renderer, freeing up the CPU for other tasks;
  2. Textures don't load — although they are (mostly) on disk already (and those that aren't should be stored on the internal HTTP/S proxy server I run at home).

Both are a bit baffling. I could expect that, for some reason, the viewer would crash (the 3.x and 4.1 support being outdated to the point that invoking it would cause a panic and a subsequent application crash). I could also expect that the hardware-based OpenGL libraries would not respond properly, and the viewer would fall back to the software version instead — with a drastic loss of performance, obviously.

Instead, I have this weird situation where everything seems to be working — no crashing! — no performance seems to be lost (but none has been gained, either) — yet the textures refuse to load, which is very weird, since they're all in the cache (I can see them there :) ).

Here is how it looks to me: https://gyazo.com/3125d7b485767ce432a9afa35fcb4487

Notice that not a single texture has loaded, although all the meshes have, as well as the texture colour.

When looking at the texture console, you can see that there are lots of textures with 1x1 sizes and 0 bytes — all of them in the CRE state, i.e. "being created". But... they never are! With the build menu focused on a single face and requesting the texture data via the Advanced menu, this seems to be confirmed — the image is still known to the system as having 1x1 dimensions (and zero bytes): https://gyazo.com/c9fdf25229cbce664e8f49b60045f871

I've also done the following: teleport to the same location with an alt and see what happens. The alt's (official SL) viewer has no problems whatsoever with any of the textures in that particular scene; all get correctly retrieved from the cache and rendered naturally. The alt can also see my main — all the textures will rez eventually (they take perhaps a bit longer than I'd expect, given that all those textures are already on the proxy... but I guess they need to be checked for freshness, and retrieved again if they aren't). The point here is that everything works — there is nothing wrong — it's just that OpenGL 4.1 is struggling with the textures for some reason.

Reverting to OpenGL 2.1, of course, makes the Cool VL Viewer display everything exactly as expected; textures stop being set to 1x1 and get refreshed/retrieved with the correct settings. Proxy settings have no relevance — it works no matter what they are.

The full log for this session can be seen on this gist. You can see that there are no major texture-related issues being reported, and you can see that OpenGL 4.1 is indeed selected/activated, but from my perspective, the textures are simply being ignored...

Now, I'm well aware that there are a lot of settings out there to tweak, and I cannot possibly say which ones I've turned off and which I've turned on. All I can say is that whatever setting I tried to change for OpenGL 4.1 didn't have any effect on the texture loading — but it didn't break OpenGL 2.1, either.

I'm still undecided if this is a bug or if it's just something on my ancient MacBook Pro that prevents OpenGL 4.1 from being properly launched/initialised... in which case the Cool VL Viewer might be tricked into believing that all textures are being properly loaded into VRAM while, in fact, they're not. But the viewer itself might not be to blame. I don't know, and, unfortunately, merely from what I can observe (in-world debugging consoles and the logs written to disk), I cannot decide whether there are, indeed, bug(s) in the code that initialises the OpenGL 4.1 library for use by the Cool VL Viewer — or if it's just my obsolete and outdated Mac that cannot activate some of the more sophisticated features of its graphics card(s)...


2024-03-11 23:33:44

Joined: 2009-03-17 18:42:51
Posts: 5554
Enable the "Core GL profile" in the graphics preferences: this should force OpenGL v3.2 (or 4.1: macOS decides by itself) usage on macOS...

You will need to restart the viewer after enabling core profile.


2024-03-11 23:37:12

Joined: 2012-07-08 17:37:36
Posts: 10
Location: Neufreistadt, Confederation of Democratic Simulators, Second Life
Uh-oh... sorry for being so late in answering this thread, but now I get the following error:

Code:
Unrecoverable error
ERROR: LLShaderMgr::loadShaderFile: Unsupported GLSL Version.

This is before I can even enter into the Preferences panel to change the "Core GL profile".

Note that your assessment of the issue seems to be 100% accurate and correct, at least according to the Khronos community: https://community.khronos.org/t/glsl-sy ... l/110076/6

The logs, of course, are huge to post here, so I'm just adding the few relevant entries:

Code:
2024-04-03 20:24:12Z INFO: LLGLManager::initGL: Advertised OpenGL version: 2.1
2024-04-03 20:24:12Z INFO: LLGLManager::initGL: Advertised GLSL version: 1.20
2024-04-03 20:24:12Z INFO: LLGLManager::initExtensions: Could not initialize GL_ARB_occlusion_query2
2024-04-03 20:24:12Z INFO: LLGLManager::initGL: Max anisotropy: 16
2024-04-03 20:24:12Z INFO: LLGLManager::initGL: Estimating usable VRAM for textures based on reported
 total VRAM (this is inaccurate): 1024 MB.
2024-04-03 20:24:12Z INFO: LLDiskCache::purge: 7237 files found in cache. Checking the total size and
 possibly purging old files...
2024-04-03 20:24:12Z INFO: LLDiskCache::purge: Cache check took 102ms to execute. Cache size: 228190633 bytes.
2024-04-03 20:24:13Z INFO: LLFeatureManager::loadGPUClass: GPU 'NVIDIA Corporation NVIDIA GeForce GT750M OpenGL Engine' already benchmarked and deemed compatible.
2024-04-03 20:24:13Z INFO: LLFeatureManager::applyBaseMasks: Setting GPU class to: Class2
2024-04-03 20:24:13Z INFO: LLViewerWindow::LLViewerWindow: LLVertexBuffer initialization done.
2024-04-03 20:24:13Z INFO: LLFeatureManager::applyBaseMasks: Setting GPU class to: Class2
2024-04-03 20:24:13Z WARNING: LLFeatureList::maskList: Mask attempting to reenabling disabled feature, ignoring RenderReflectionProbeLevel
2024-04-03 20:24:13Z WARNING: LLFeatureList::maskList: Mask attempting to reenabling disabled feature, ignoring RenderReflectionProbes
2024-04-03 20:24:13Z WARNING: LLFeatureManager::applyFeatures: Feature RenderReflectionProbeLevel not available !
2024-04-03 20:24:13Z WARNING: LLFeatureManager::applyFeatures: Feature RenderReflectionProbes not available !
2024-04-03 20:24:13Z INFO: LLImageGLThread::LLImageGLThread: Initializing with 6 worker threads.
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:1/6
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:2/6
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:3/6
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:4/6
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:5/6
2024-04-03 20:24:13Z INFO: LLThreadPool::run: Starting thread: ThreadPool:LLImageGL:6/6
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:1/6with context: 600000dbcfb0
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:2/6with context: 600000db36b0
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:3/6with context: 600000c24df0
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:4/6with context: 600000db3790
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:5/6with context: 600000c250f0
2024-04-03 20:24:13Z INFO: LLImageGLThread::run: Initializing GL for thread ThreadPool:LLImageGL:6/6with context: 600000c25100
2024-04-03 20:24:13Z INFO: LLViewerTextureList::getMaxVideoRamSetting: Recommended max texture RAM: 1024 MB - System RAM: 16384 MB.
2024-04-03 20:24:13Z INFO: LLViewerTextureList::updateMaxResidentTexMem: Total usable VRAM: 1024 MB - Usable frame buffers VRAM: 256 MB - Usable texture VRAM: 768 MB - Maximum total texture memory set to: 1536 MB - Maximum total GL bound texture memory set to: 768 MB
2024-04-03 20:24:13Z INFO: LLViewerTextureList::init: Preloading images (any crash would be the result of a missing image file)...
2024-04-03 20:24:13Z INFO: LLViewerTextureList::init: Images preloading successful.
2024-04-03 20:24:13Z INFO: LLFontRegistry::dump: LLFontRegistry dump:
... etc for fonts, all fine there...
2024-04-03 20:24:14Z INFO: LLAppViewer::initWindow: Initializing environment classes...
2024-04-03 20:24:14Z INFO: LLWLSkyParamMgr::initClass: Initializing.
2024-04-03 20:24:14Z INFO: LLWLSkyParamMgr::loadPresets: Loading Default WindLight sky settings from/Applications/CoolVLViewer1.32.0.12.app/Contents/Resources/app_settings/windlight/skies/
... all fine here as well...
2024-04-03 20:24:14Z INFO: LLAppViewer::initWindow: Initializing the render pipeline...
2024-04-03 20:24:14Z INFO: LLViewerShaderMgr::setShaders: Using up to 1 texture index channels.
2024-04-03 20:24:14Z INFO: LLViewerShaderMgr::setShaders:
~~~~~~~~~~~~~~~~~~
 Loading Shaders:
~~~~~~~~~~~~~~~~~~
2024-04-03 20:24:14Z INFO: LLViewerShaderMgr::setShaders: Using GLSL 1.20
2024-04-03 20:24:14Z INFO: LLViewerShaderMgr::loadBasicShaders: Screen space reflections disabled
2024-04-03 20:24:14Z INFO: LLViewerShaderMgr::loadBasicShaders: Reflection probes disabled.
2024-04-03 20:24:14Z /Users/karstenbrogaard/Develop/CoolVLViewer/linden/indra/llrender/llshadermgr.cpp(710) : error
2024-04-03 20:24:14Z ERROR: LLShaderMgr::loadShaderFile: Unsupported GLSL Version.
2024-04-03 20:25:31Z INFO: LLThreadPool::close: ThreadPool:Texture cache was informed of viewer crash.
2024-04-03 20:25:31Z INFO: LLThreadPool::close: ThreadPool:Texture cache: closing queue...
2024-04-03 20:25:31Z INFO: LLThreadPool::close: ThreadPool:Texture cache shutdown complete with an empty queue.
... etc all other queues get properly shut down as well, and the log ends here.


All right, so, since I can't enter Preferences, I thought — what if this is the kind of preference that is available via the user_settings.xml file? All I needed to know, of course, was how it's called. But — ha! No information whatsoever anywhere I searched!

Thankfully, one of the many advantages of open-source software is, well, the ability to read the code. And so I looked for one of the settings that is mentioned on the above-quoted Khronos user community forum: SDL_GL_CONTEXT_PROFILE_CORE. A quick search showed that this flag is only set once, namely, on line 792 (or so) of indra/llwindow/llwindowsdl.cpp, after checking if LLRender::sGLCoreProfile is set to true or not.

LLRender::sGLCoreProfile, in turn, is assigned the value read from the preferences file in indra/newview/llappviewer.cpp, from a key named "RenderGLCoreProfile" — hooray, I only needed to set that key to Boolean 1 (= true), and that was it: Cool VL Viewer launched, and then I could fiddle with the Preferences again, to great success — now Help > About... shows the expected string OpenGL version: 4.1 NVIDIA-16.0.13.
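In case it helps anyone else locked out of the Preferences panel by the crash: with the viewer closed, the key can be added by hand to the settings file. The fragment below is only a sketch of the usual LLSD settings layout used by SL-derived viewers — the key name is the one found in llappviewer.cpp, but the exact file name, location, and surrounding fields may differ for your install, so treat the details as assumptions:

```xml
<!-- Hypothetical fragment for the user settings file (LLSD format). -->
<!-- Only the key name "RenderGLCoreProfile" comes from the sources;  -->
<!-- the Comment/Type/Value layout is the usual one and may differ.   -->
<key>RenderGLCoreProfile</key>
<map>
  <key>Comment</key>
  <string>Request a core GL profile context at startup</string>
  <key>Type</key>
  <string>Boolean</string>
  <key>Value</key>
  <integer>1</integer>
</map>
```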

Hooray!

Or... not quite. For some weird reason, once that setting is activated, textures do not load at all. That's actually quite weird, since the viewer behaves "as if" all textures have been properly loaded, in terms of FPS; also, the remaining info about object assets (i.e. their location, rotation, polygonal mesh, etc) is being loaded and placed correctly in-world. Turning on the Texture Console, I get the following display:

https://gyazo.com/245505cfba834a99e8890af815f42d79

Note the following weird issues:
  1. Free VRAM is shown as -1/1024 which looks strange to me. 1024 seems to be a reasonable value for my ancient Nvidia card (it has only 2 GB total).
  2. Notice how the HTTP and UDP values are both at zero. I assume this means that no requests have been made?
  3. Textures are in the process of being created (CRE is active) but are all 1x1 and have 0 bytes.
  4. DDis is correctly set to 0 (highest priority), (Req) is often 0 but sometimes 5; however, Dis is always (-1) for all textures.
  5. Even so-called "system icons" (look at the inventory!) haven't loaded, although it's certain those are locally stored.
Also note that the above image was taken after waiting several minutes without moving the camera, in the hope that it was just a question of waiting longer.

Afterwards, I noticed that there are not one, but two parameters regarding the Core GL Profile: RenderGLCoreProfile and RenderGLContextCoreProfile. The first is the one documented on the SL Wiki. The second... also seems to be used... since it appears in the user_settings.xml of other viewers. So I tried turning both on, both off, and each combination of the two.

With "Core GL profile" unchecked and restarting, the rendering immediately works (in compatibility 2.1 mode, that is!), and produces the following (same location, same camera positioning):

https://gyazo.com/e8315612708c020d9a09729d243a6727

Since this area has little traffic and doesn't have many objects in sight (I use it for testing because of that!), all textures were previously cached (either locally on disk or by the HTTP proxy server), which meant that all of them loaded in about a second (I'm not joking!). This, in fact, is what I expect to happen under normal conditions, with a static avatar that has been in a certain region for a long time, allowing all textures to be properly cached. And that's exactly what happens when OpenGL is in "compatibility mode" at 2.1.

Now, I wanted to do similar testing on other viewers, and the results are curious. Firestorm (latest stable version, still on the v6 renderer) accepts those two parameters, but it has a nasty tendency to try to overwrite them every time it launches; even with some minor hacks to force it to accept these settings, Firestorm refuses to do so, and is hard-coded somehow to stick to OpenGL 2.1. I guess it's something the developers decided to do for some good reason (perhaps they had the same issue/bug with textures not loading under OpenGL 4.X on older cards?...). Firestorm does not crash, nor give an error; it simply refuses to do anything with the settings and remains in compatibility mode.

The official Second Life Viewer, with the new v7 rendering engine, works flawlessly under OpenGL 4.X. While the texture console info display is different from the one in the Cool VL Viewer, this should be a good reference when everything is working as expected:

https://gyazo.com/605d9ef5207214d0381be0b74b020411

Note a few things:
  • The v7 Official SL Viewer reports more memory to play with, part of which is clearly being used from RAM, not VRAM; I need to tweak those settings at some stage.
  • I forgot that I have different camera settings, so the view is not the same. But it's in the general area (I essentially just logged off from Cool VL Viewer and logged in using the Official SL Viewer).
  • Only after taking the snapshot did I notice that not all textures had fully rezzed (!), as you can see on some of the trees (they're blurred). Interestingly (or perhaps not!), the Cool VL Viewer is way, way faster at rendering all textures in the cache. Mind you, the Official SL Viewer doesn't take "an eternity", it just takes a bit longer — enough for the snapshot to still catch the textures being rezzed.

To conclude:
  • Enabling Core GL definitely forces the Mac to tell the card to switch to OpenGL 4.1 (in my case), exactly as expected.
  • Firestorm, for some reason that only their developers know, refuses to accept these settings (but also doesn't give warnings/errors).
  • The v7 renderer from the Official SL Viewer works 100% as expected with OpenGL 4.1.
  • Cool VL Viewer accepts the Core GL setting, reports that it has successfully activated OpenGL 4.1, and proceeds internally without errors, assuming that "everything" is working "as expected". Object meshes and positioning are correctly loaded (the frame rate is the same with or without textures, for some strange reason!), but no textures are being loaded — neither the built-in ones, nor those in the disk cache, nor those from the proxy server. No warnings/errors are reported in the logs (or, if they are, they weren't obvious to me!).

My guess is that this cannot be something very hard to figure out, and perhaps it doesn't even need any code — it could be just an option that I should turn on or off somewhere. Who knows, perhaps by default, when switching Enable Core GL on, it turns off the setting for enabling textures :) More likely, I think, is the way VRAM and RAM are being set up/calculated — this might be differently interpreted under OpenGL 2.1 (where the calculations are correct) and OpenGL 4.X, where the current settings I've got don't make sense or give contradictory results or whatever — preventing Cool VL Viewer from knowing how much VRAM is really available for textures, and finding none (thus the -1 value...).

For the sake of the argument, here is what my settings show:

https://gyazo.com/7aba15965305cca7a6137ac110f424d2

Note that I'm attempting to force values (and I even tried to set DisableVRAMCheck to TRUE, just in case...) in the hope of getting more predictable VRAM calculations, but either everything I put there is wrong, or it's irrelevant for this case.

And here is the log, searching by the keyword "texture":

Code:
2024-04-05 12:44:28Z INFO: LLTextureCache::LLTextureCache: Initializing with 2 worker threads...
2024-04-05 12:44:28Z INFO: LLThreadPool::run: Starting thread: ThreadPool:Texture cache:1/2
2024-04-05 12:44:28Z INFO: LLThreadPool::run: Starting thread: ThreadPool:Texture cache:2/2
2024-04-05 12:44:28Z INFO: LLThread::threadRun: Running thread Texture fetch with Id: 0x700009357000
2024-04-05 12:44:29Z INFO: LLTextureCache::initCache: Headers: 1048576 Textures size: 3603 MB
2024-04-05 12:44:29Z INFO: LLGLManager::initGL: Estimating usable VRAM for textures based on reported total VRAM (this is inaccurate): 1024 MB.
2024-04-05 12:44:29Z WARNING: LLViewerTextureList::getMaxVideoRamSetting: Overriding the detected VRAM amount with the VRAMOverride debug settings: 1024MB of VRAM assumed.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::getMaxVideoRamSetting: Recommended max texture RAM: 1536 MB - System RAM: 16384 MB.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::getMaxVideoRamSetting: Usable texture RAM: 1536 MB -System RAM: 16384 MB.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::updateMaxResidentTexMem: Total usable VRAM: 1024 MB - Usable frame buffers VRAM: 256 MB - Usable texture VRAM: 768 MB - Maximum total texture memory set to: 1536 MB - Maximum total GL bound texture memory set to: 768 MB
2024-04-05 12:44:29Z INFO: LLViewerTextureList::init: Preloading images (any crash would be the result of a missing image file)...
2024-04-05 12:44:29Z INFO: LLViewerTextureList::init: Images preloading successful.
2024-04-05 12:44:30Z INFO: LLViewerShaderMgr::setShaders: Using up to 16 texture index channels.
2024-04-05 12:44:30Z INFO: LLGLSLShader::createShader: Creating shader: Splat texture rect shader - Level: 2 - File: interface/splattexturerectV.glsl
2024-04-05 12:44:30Z INFO: LLGLSLShader::createShader: Creating shader: Splat texture rect shader - Level: 2 - File: interface/splattexturerectF.glsl
2024-04-05 12:44:30Z INFO: LLGLSLShader::createShader: Creating shader: One texture no color shader - Level: 2 - File: interface/onetexturenocolorV.glsl
2024-04-05 12:44:30Z INFO: LLGLSLShader::createShader: Creating shader: One texture no color shader - Level: 2 - File: interface/onetexturenocolorF.glsl
2024-04-05 12:44:35Z INFO: LLUserAuth::authenticate: Options: inventory-root, inventory-skeleton, inventory-lib-root, inventory-lib-owner, inventory-skel-lib, agent_appearance_service, initial-outfit, gestures, event_categories, event_notifications, classified_categories, adult_compliant, buddy-list, ui-config, max_groups, max-agent-groups, map-server-url, search-server-url, login-flags, global-textures, account_level_benefits, END
2024-04-05 12:44:39Z INFO: get_S32_value:   - texture_upload_cost: 10
2024-04-05 12:44:51Z INFO: LLViewerStats::sendStats: Misc stats: string_1:  - string_2: Texture Time: 23.82, Total Time: 13.33
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: head-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: upper-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: lower-baked UUID: 6a02a7a3-dcd2-2c36-8180-808080800023
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: eyes-baked UUID: 6a02a7a3-dcd2-2c36-8180-808080800023
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: skirt-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: hair-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: leftarm-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: leftleg-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: aux1-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: aux2-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:44:52Z INFO: LLVOAvatarSelf::setNewBakedTexture: New baked texture: aux3-baked UUID: 1d010000-76bf-1429-475f-d1228a32b527
2024-04-05 12:45:08Z WARNING: LLTextureFetchWorker::onCompleted: Texture: 634bcfcb-d6ba-0a95-3943-c1e88a08b1b0 CURL GET FAILED, status: Http_404 - reason: Not Found
2024-04-05 12:45:21Z WARNING: LLTextureFetchWorker::doWork: Texture 634bcfcb-d6ba-0a95-3943-c1e88a08b1b0: failed harder
2024-04-05 12:45:21Z WARNING: LLViewerTexture::updateFetch: No data received for image 634bcfcb-d6ba-0a95-3943-c1e88a08b1b0, setting as missing. decode_priority = 1.2317e+07 - mRawDiscardLevel = 32767 - current_discard = -1
2024-04-05 12:45:21Z WARNING: LLViewerTexture::setIsMissingAsset: 634bcfcb-d6ba-0a95-3943-c1e88a08b1b0: Marking image as missing
2024-04-05 12:49:41Z INFO: LLViewerStats::sendStats: Misc stats: string_1:  - string_2: Texture Time: 313.70, Total Time: 313.34


Logs searching for "RAM":

Code:
2024-04-05 12:44:29Z INFO: LLGLManager::initGL: Estimating usable VRAM for textures based on reported total VRAM (this is inaccurate): 1024 MB.
2024-04-05 12:44:29Z WARNING: LLViewerTextureList::getMaxVideoRamSetting: Overriding the detected VRAM amount with the VRAMOverride debug settings: 1024MB of VRAM assumed.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::getMaxVideoRamSetting: Recommended max texture RAM: 1536 MB - System RAM: 16384 MB.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::getMaxVideoRamSetting: Usable texture RAM: 1536 MB -System RAM: 16384 MB.
2024-04-05 12:44:29Z INFO: LLViewerTextureList::updateMaxResidentTexMem: Total usable VRAM: 1024 MB - Usable frame buffers VRAM: 256 MB - Usable texture VRAM: 768 MB - Maximum total texture memory set to: 1536 MB - Maximum total GL bound texture memory set to: 768 MB
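
As a side note on those numbers: the budget in the log lines above is at least internally consistent, if one assumes (my reading of the reported figures, not the viewer's documented formula) that a quarter of the VRAM is reserved for frame buffers and that the total texture memory cap is twice the usable texture VRAM:

```python
# Reconstructing the apparent VRAM budget from the log above.
# These relations are guessed from the reported numbers, not taken
# from the viewer's code.
total_vram = 1024                          # MB, detected/overridden VRAM
frame_buffers = total_vram // 4            # 256 MB reserved for frame buffers
texture_vram = total_vram - frame_buffers  # 768 MB usable for textures
max_total_texture = 2 * texture_vram       # 1536 MB total texture memory cap
max_gl_bound = texture_vram                # 768 MB GL-bound texture cap

print(frame_buffers, texture_vram, max_total_texture, max_gl_bound)
# → 256 768 1536 768
```

So whatever is going wrong, the memory arithmetic itself matches the logged values; the -1 "free VRAM" reading seems to be a separate issue.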


Oh — I almost forgot!

When OpenGL 4.1 is being "forced", the built-in Web browser also stops working! I just noticed that because the splash screen doesn't load (I assumed it was just being stubborn and loading slowly). Back in compatibility mode, the LL splash screen appears instantly. Launching the built-in browser under OpenGL 4.1 results simply in a black background. No errors — it says "loading", then simply assumes that everything has been loaded and displayed — but nothing was actually done. It remains a black screen (exactly like the splash screen on start).

Here are the logs when trying to use the built-in browser to connect to https://status.secondlifegrid.net/:

Code:
2024-04-05 13:01:03Z INFO: LLThread::threadRun: Running thread LLPluginProcessCreationThread with Id: 0x70000e79b000
2024-04-05 13:01:03Z INFO: LLAppearanceMgr::serverAppearanceUpdateSuccess: Request OK.
2024-04-05 13:01:04Z INFO: LLViewerThrottleGroup::sendToSim: Sending throttle settings, total BW 12286
2024-04-05 13:01:04Z INFO: LLViewerThrottle::updateDynamicThrottle: Easing network throttle to 12582912
2024-04-05 13:01:06Z INFO: LLPluginProcessParent::receiveMessage: plugin version string: Dullahan 1.12.3/CEF 91.1.21/Chromium 91.0.4472.114
2024-04-05 13:01:06Z INFO: LLPluginProcessParent::receiveMessage: Message class: base -> version: 1.0
2024-04-05 13:01:06Z INFO: LLPluginProcessParent::receiveMessage: Message class: media -> version: 1.0
2024-04-05 13:01:06Z INFO: LLPluginProcessParent::receiveMessage: Message class: media_browser -> version: 1.0
2024-04-05 13:01:08Z INFO: LLViewerMediaImpl::updateMediaImage: Initializing media placeholder withmovie image id: ce73176b-ac01-c456-c87b-0ea829869f33
2024-04-05 13:01:22Z INFO: LLCircuitData::dumpResendCountAndReset: Circuit: 54.184.243.93:12035 resent 1 packets
2024-04-05 13:01:53Z INFO: LLViewerMediaImpl::navigateInternal: media id = ce73176b-ac01-c456-c87b-0ea829869f33 - url = https://status.secondlifegrid.net/ - mime_type =
2024-04-05 13:01:54Z INFO: LLViewerMediaImpl::loadURI: Asking media source to load URI: https://status.secondlifegrid.net/


The only weird thing there is the empty MIME type, but since I generally don't pay attention to those things, my best guess is that this is actually what happens with all requests... There was no error in the logs, though.




Sorry for the long post. Please let me know if I can send you more data that might be pertinent to further debug this issue!


2024-04-05 13:13:35

Joined: 2009-03-17 18:42:51
Posts: 5554
Gwyneth Llewelyn wrote:
Uh-oh... sorry for being so late in answering this thread, but now I get the following error:

Code:
Unrecoverable error
ERROR: LLShaderMgr::loadShaderFile: Unsupported GLSL Version.

This is before I can even enter into the Preferences panel to change the "Core GL profile".
.../...
Code:
2024-04-03 20:24:14Z /Users/karstenbrogaard/Develop/CoolVLViewer/linden/indra/llrender/llshadermgr.cpp(710) : error

This corresponds to a llerrs (a voluntary crash): somehow, you managed to get the viewer to execute a part of the code it should never have reached, as explained in the sources:
Code:
   if (major_version == 1 && minor_version < 30)
   {
      if (gUsePBRShaders)
      {
         // We should NEVER get here: OpenGL v3.1 is the minimum requirement
         // for PBR.
         llerrs << "Unsupported GLSL Version." << llendl;
      }
.../...

You either messed up with the settings, or by forcing OpenGL v4.1 via macOS instead of simply ticking the Core GL profile box in the Graphics settings, you enabled PBR rendering (which would NOT be possible with the default, OpenGL v2.1 compatibility mode); then when you restart the viewer without fiddling with macOS settings, it starts in PBR mode with shaders it cannot compile in OpenGL v2.1 compatibility profile...

Please, wipe out your settings_coolvlviewer_*.xml files (all of them, since if you hacked a previous version's settings, the viewer will crash again when trying to load the older hacked version) and launch the viewer, then go to the Graphics prefs, tick the Core profile check box, and restart the viewer: it shall then start in core profile mode and allow you to enable PBR.

Gwyneth Llewelyn wrote:
Or... not quite. For some weird reason, once that setting is activated, textures do not load at all.
Make sure the "Open GL worker threads" spinner is set to zero (it should be, by default): macOS' OpenGL implementation sucks rocks and will likely not allow shared GL profiles to properly work (deadlocks & Co), thus the textures stuck in the "CRE" (creation) state...

Gwyneth Llewelyn wrote:
Note the following weird issues:
  1. Free VRAM is shown as -1/1024 which looks strange to me. 1024 seems to be a reasonable value for my ancient Nvidia card (it has only 2 GB total).
Not a problem: this is just because macOS' OpenGL implementation does not have any function to report VRAM usage. The viewer can do without it.

Quote:
Afterwards, I noticed that there are not one, but two parameters regarding the Core GL Profile: RenderGLCoreProfile and RenderGLContextCoreProfile. The first is the one that is documented on the SL Wiki. The second... also seems to be used... since it appears in the user_settings.xml for other viewers. So I turned both on, or off, or alternatively on or off, etc.
The Cool VL Viewer uses its own settings, which differ in number, types and names from all other viewers. There is no "RenderGLContextCoreProfile" setting in my viewer, which uses "RenderGLCoreProfile" instead.

Gwyneth Llewelyn wrote:
Oh — I almost forgot!

When OpenGL 4.1 is being "forced", the built-in Web browser also stops working!
Just do not force it !!! Let the application initialize as it should or can. This also holds true for CEF !

Gwyneth Llewelyn wrote:
The only weird thing there is the empty MIME type, but since I generally don't pay attention to those things, my best guess is that this is actually what happens with all requests... there was no error on the logs, though.
These are totally harmless warnings, which are the result of failed MIME type probes: not an issue at all, and they won't prevent the web pages from loading.


2024-04-05 14:02:02