Oliver Kreylos
@okreylos
I build holodecks for a living.
Davis, CA · doc-ok.org · Joined March 2013

Oliver Kreylos’s Tweets

Oh for goodness' sake. The Vulkan VR compositor is now working nicely, but I notice it doesn't look *quite* right, and then realize that Vulkan not only reads textures upside-down, but also treats clip space that way, so things weren't obviously wrong, just slightly so.
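The mismatch described above is between OpenGL conventions (clip-space +Y up, texture origin bottom-left) and Vulkan conventions (clip-space +Y down, texture origin top-left). One common workaround is to negate the Y row of the projection matrix before uploading it. A minimal sketch of that fix, using plain nested lists rather than any real Vulkan API (the identity "projection" is a placeholder for illustration only):

```python
# Vulkan's clip space has +Y pointing down, the opposite of OpenGL.
# Negating the projection matrix's second row makes GL-style geometry
# land right-side up. Illustrative sketch, not tied to any Vulkan binding.

def flip_projection_y(m):
    """Return a copy of 4x4 row-major matrix m with its Y row negated."""
    return [[-v for v in row] if i == 1 else row[:]
            for i, row in enumerate(m)]

def transform(m, p):
    """Apply 4x4 row-major matrix m to point p = (x, y, z, w)."""
    return tuple(sum(m[i][j] * p[j] for j in range(4)) for i in range(4))

# Identity stands in for a real projection matrix here.
proj = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
vk_proj = flip_projection_y(proj)

print(transform(vk_proj, (0.5, 0.25, -1.0, 1.0)))
# y comes out negated: (0.5, -0.25, -1.0, 1.0)
```

The other standard fix is a viewport with negative height (allowed since VK_KHR_maintenance1, core in Vulkan 1.1), which performs the flip at rasterization time instead of in the vertex shader.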
The 3D scenery was very well done and the painting integration was great, but it had forced movement throughout and no positional head tracking (it appeared the 3D scenes were pre-rendered into full-sphere stereoscopic video), so it was also quite nauseating.
We went to a Van Gogh exhibition on Sunday, and it had a VR component (with 31 Quest headsets) which took us on a tour of 8 paintings in the context of a 3D recreation of Arles. The VR was done by an outfit called "Dirty Monitor."
Ah, that feeling when you spend two hours debugging your CRC algorithm because it doesn't line up with other implementations, and then realize your results were different because your test string had a newline character at the end.
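The tweet doesn't say which CRC variant was involved; standard CRC-32 (as in Python's zlib) is enough to show how a single stray trailing byte changes everything. "123456789" is the conventional check string, with the well-known result 0xCBF43926:

```python
import zlib

# The canonical CRC-32 check string. One invisible trailing newline
# produces a completely different checksum.
clean = b"123456789"
noisy = b"123456789\n"

print(hex(zlib.crc32(clean)))  # 0xcbf43926, the standard check value
print(hex(zlib.crc32(noisy)))  # entirely different
```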
I made a "video poster" for a cardiology conference at UC Davis Medical Center last week, and just hit the "publish" button because why not: youtu.be/p3eZfEFAMIY The project is about machine learning, but I used VR during data prep.
And one more thing. I've been throwing virtual balls a lot this evening, and I can tell that I'm getting better, much better than I was in the video. That's what I like about VR when it works, i.e., behaves predictably: you don't need auto-aim, you can practice an actual skill.
Oh my. In the throwing video, I mention how one should not average reported positions to get velocity. But I forgot to mention the deep reason: averaging reduces random noise. That is *not* random noise. My excuse: it was 1am.
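One concrete cost of averaging that the tweet's argument implies: for a moving tracker, the mean of the last n position samples equals the true position (n - 1)/2 sample periods in the past, so you buy noise reduction with latency, while error that isn't zero-mean random noise survives untouched. A toy demonstration with made-up numbers (dt, v, and n are arbitrary):

```python
# For noise-free motion at constant velocity v, averaging the last n
# position samples reports where the device was (n - 1) / 2 sample
# periods ago: pure added latency. Illustrative values only.

dt = 0.01   # sample period (s), hypothetical
v  = 2.0    # velocity (units/s), hypothetical

samples = [v * dt * i for i in range(100)]   # noise-free linear motion

n = 8
avg = sum(samples[-n:]) / n                  # averaged "current" position
true_now = samples[-1]

lag = (true_now - avg) / v                   # time the average lags behind
print(round(lag / dt, 6))  # 3.5 sample periods, i.e. (n - 1) / 2
```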
I started looking into Vulkan over the weekend for improved HMD support in Vrui, and 3,500 lines of C++ code later, I can draw a static image onto my Index's display:
Update: Index driver sends canting info as 3x4 matrix via SetDisplayEyeToHead entry point, where L/R eye x = ±IPD/2 (sent redundantly), and yaw angle is exactly ±5°. Thanks everyone; problem solved. :)
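As described above, the per-eye transform is a 3x4 matrix combining a half-IPD translation in x with a ±5° yaw. A sketch of what building such a matrix might look like; the row-major 3x4 layout matches OpenVR's HmdMatrix34_t, but the yaw sign convention and the `eye_to_head` helper here are assumptions, not Valve's specification:

```python
import math

def eye_to_head(ipd, cant_deg, left):
    """Hypothetical 3x4 eye-to-head transform for a canted display:
    a yaw rotation about the vertical (y) axis plus a half-IPD
    translation along x. Row-major; the sign of the yaw per eye is
    an assumption for illustration."""
    yaw = math.radians(cant_deg if left else -cant_deg)  # assumed sign
    c, s = math.cos(yaw), math.sin(yaw)
    tx = -ipd / 2.0 if left else ipd / 2.0
    return [[  c, 0.0,   s,  tx],   # rotation about y, translation in x
            [0.0, 1.0, 0.0, 0.0],
            [ -s, 0.0,   c, 0.0]]

m = eye_to_head(ipd=0.064, cant_deg=5.0, left=True)
print(m[0][3])                                     # -0.032: left eye at -IPD/2
print(math.degrees(math.atan2(m[0][2], m[0][0])))  # ~5.0 degrees of yaw
```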
I'm late to the party as usual, but am nonetheless excited about putting this thing through its paces. The Vrui VR toolkit natively supports canted displays due to its CAVE heritage, but I still need to figure out how to read the necessary parameters from the driver.
While browsing through some software on GitHub I found this stunning bit of code (variable names changed): output = ((3000 + ((input)&1) * 500 + (((input) >> 1) & 1) * 1000 + (((input) >> 2) & 1) * 2000) - 250) How? Why?
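The puzzle has a tidy resolution: the per-bit weights (500, 1000, 2000) are just 500 times the bit values (1, 2, 4), so over its 3-bit input the whole expression collapses to a plain linear ramp. A quick check:

```python
def stunning(input):
    # The expression as quoted in the tweet (variable names changed there).
    return ((3000 + ((input) & 1) * 500
                  + (((input) >> 1) & 1) * 1000
                  + (((input) >> 2) & 1) * 2000) - 250)

# Each bit contributes 500 * 2**bit, so for a 3-bit input this is
# simply 2750 + input * 500 written the hard way.
for i in range(8):
    assert stunning(i) == 2750 + i * 500

print([stunning(i) for i in range(8)])
# [2750, 3250, 3750, 4250, 4750, 5250, 5750, 6250]
```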
I was invited to write a guest post for the European Geosciences Union's geodynamics blog:
Quote Tweet
Oliver Kreylos @okreylos (Researcher at UC Davis KeckCAVES @keckcaves and UC Davis DataLab @UCDavisDataSci) shows us an immersive way of visualising 3D data through virtual reality 😎 #EGUBlogs @EuroGeosciences blogs.egu.eu/divisions/gd/2
Tee hee. I find it amusing that the only time anyone from UC Davis admin ever cares about the AR Sandbox is when the web site goes down and they get peppered with support requests from people around the world.