PSA: GLX is not thread safe, even if you use XLockDisplay()/XUnlockDisplay(). Xlib-XCB does not fix this. First Google result on this is a wrong Stack Overflow answer. GLX uses lots of per-process (not per-display) globals. You must use a process-wide lock around each GLX call.
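(A minimal sketch of what that process-wide lock can look like, for readers following along. The FFI declarations and the wrapper name here are illustrative, not from any particular library; only glXMakeCurrent itself is a real GLX entry point.)

```rust
use std::ffi::{c_int, c_ulong, c_void};
use std::sync::Mutex;

// One lock for the whole *process*, not one per Display: GLX
// implementations keep per-process globals, so XLockDisplay() alone
// cannot make them safe.
static GLX_LOCK: Mutex<()> = Mutex::new(());

// Opaque X11/GLX types, declared just enough for this sketch.
type Display = c_void;
type GLXDrawable = c_ulong;
enum GLXContextOpaque {}
type GLXContext = *mut GLXContextOpaque;

#[link(name = "GL")]
extern "C" {
    fn glXMakeCurrent(dpy: *mut Display, drawable: GLXDrawable, ctx: GLXContext) -> c_int;
}

// Hypothetical wrapper: *every* GLX call in the program goes through a
// helper like this, so no two threads are ever inside libGLX at once.
pub fn glx_make_current(dpy: *mut Display, drawable: GLXDrawable, ctx: GLXContext) -> bool {
    let _guard = GLX_LOCK.lock().unwrap();
    unsafe { glXMakeCurrent(dpy, drawable, ctx) != 0 }
}
```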
(The real answer is: Use a library—like the one I’m about to release :)—to interface with the GPU. Don’t use GLX directly. You will drown in bugs.)
Replying to @pcwalton
Your escapades here remind me of the Joe Armstrong chapter in Coders at Work about interfacing with X11 from Erlang. :) https://books.google.com/books?id=2kMIqdfyT8kC&pg=PA211&lpg=PA211&dq=joe+armstrong+%22x+windows%22+coders+at+work&source=bl&ots=Mm9rqfFJyD&sig=ACfU3U1mAfSNa9kEk55pf74TBmiOzNHhrQ&hl=en&sa=X&ved=2ahUKEwiKioiBwvzlAhUdHzQIHdEGCVUQ6AEwAHoECAsQAg#v=onepage&q=joe%20armstrong%20%22x%20windows%22%20coders%20at%20work&f=false
Replying to @chadaustin
Unfortunately I can't use that solution. There are two original sins: 1. GLX is an *Xlib* standard, not an X protocol standard as it should be. 2. NVIDIA being proprietary means I have to code to GLX and can't just bypass this nonsense with GBM.
Replying to @pcwalton @chadaustin
What is the reason for GLX and not EGL? I'm not super deep into it, but we just replaced our Linux-y things with EGL. Any reason we should stay with GLX?
Replying to @rikarends @chadaustin
GLX works in more environments, like macOS/XQuartz
(Also I'm unsure how reliable texture from pixmap is with EGL, especially on NVIDIA. There is usually only one way that actually works.)
Replying to @pcwalton @chadaustin
I wouldn't know when I'd need texture from pixmap. Is that for uploading textures from system RAM, or something more specific?
Replying to @rikarends @chadaustin
It's for sharing textures across threads/processes. Support for that is the main reason why I'm writing a glutin replacement right now.
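(A rough sketch of the GLX_EXT_texture_from_pixmap binding being discussed, reusing the types and GLX_LOCK from the sketch above. The constant and the glXBindTexImageEXT signature come from the extension spec; the helper and its surrounding setup, such as the already-created GLXPixmap, are assumptions.)

```rust
use std::ptr;

// From the GLX_EXT_texture_from_pixmap spec.
const GLX_FRONT_LEFT_EXT: c_int = 0x20DE;

// glXBindTexImageEXT must be looked up at runtime via glXGetProcAddress;
// its type, per the spec:
type BindTexImageExt = unsafe extern "C" fn(
    dpy: *mut Display,
    drawable: GLXDrawable,
    buffer: c_int,
    attrib_list: *const c_int,
);

// Hypothetical helper: binds an X pixmap (already wrapped in a GLXPixmap
// created from a texture-capable fbconfig) to the currently bound GL texture.
unsafe fn bind_pixmap_texture(bind: BindTexImageExt, dpy: *mut Display, glx_pixmap: GLXDrawable) {
    // Serialized through the same process-wide lock as every other GLX call.
    let _guard = GLX_LOCK.lock().unwrap();
    bind(dpy, glx_pixmap, GLX_FRONT_LEFT_EXT, ptr::null());
}
```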
Replying to @pcwalton @chadaustin
Ahh okay. Yeah, I'm going for a single-core-owns-the-render-API design here: no cross-thread direct gfx API accesses.
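(A sketch of that single-owner design, assuming a hypothetical RenderCommand type: one thread owns the graphics API, and every other thread talks to it over a channel, so cross-thread API access is impossible by construction.)

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical command set; a real renderer would have many more variants.
enum RenderCommand {
    UploadTexture { id: u32, rgba: Vec<u8> },
    Draw,
    Quit,
}

fn spawn_render_thread() -> mpsc::Sender<RenderCommand> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // The GL/GLX (or Metal/DX11) context lives only on this thread.
        for cmd in rx {
            match cmd {
                RenderCommand::UploadTexture { id, rgba } => {
                    // glTexImage2D etc. would go here.
                    let _ = (id, rgba);
                }
                RenderCommand::Draw => { /* issue draw calls, swap buffers */ }
                RenderCommand::Quit => break,
            }
        }
    });
    tx
}
```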
The other reason for surfman's existence is to let the app choose the GPU on multi-GPU systems like laptops. This is extremely annoying to do cross-platform :)
Replying to @pcwalton @chadaustin
Yeah, on Linux that sounds dreadful. On Metal/DX11 it's easy though. I'm still happy I chose to just get it over with and write all the backends whilst unifying shaders and render models. It's surprisingly stable, for our own use cases at least so far.