The Road to Alexandria (IG: UPX)

Got some assistance with the weird axis-flipping stuff that was going on… as I suspected, it was to do with the way quaternion rotation and double cover work. I also realised I was doing my rotations all wrong. To rotate around the HMD, I need to translate the parent back to the origin, change the rotation by multiplying it by an incremental quaternion, normalise the output, then translate back to the point which is center stage.
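The translate–rotate–normalise–translate recipe above can be sketched with a minimal hand-rolled quaternion. This is only a sketch of the idea, not hotham's API; the type and function names here are my own, and in practice the engine's own maths types would replace them.

```rust
// Rotate a point around a pivot (the HMD position) with an incremental
// quaternion, normalising after accumulation to keep drift at bay.

#[derive(Clone, Copy, Debug)]
struct Quat { w: f32, x: f32, y: f32, z: f32 }

impl Quat {
    // Rotation of `angle` radians about the (unit) Y axis.
    fn from_y_rotation(angle: f32) -> Self {
        let h = angle * 0.5;
        Quat { w: h.cos(), x: 0.0, y: h.sin(), z: 0.0 }
    }
    // Hamilton product: apply `o` then `self`.
    fn mul(self, o: Quat) -> Quat {
        Quat {
            w: self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
            x: self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
            y: self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
            z: self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w,
        }
    }
    // Renormalise to counter floating-point drift after many increments.
    fn normalize(self) -> Quat {
        let n = (self.w*self.w + self.x*self.x + self.y*self.y + self.z*self.z).sqrt();
        Quat { w: self.w/n, x: self.x/n, y: self.y/n, z: self.z/n }
    }
    // Rotate a vector: v' = q v q^-1 (unit quaternion assumed).
    fn rotate(self, v: [f32; 3]) -> [f32; 3] {
        let p = Quat { w: 0.0, x: v[0], y: v[1], z: v[2] };
        let inv = Quat { w: self.w, x: -self.x, y: -self.y, z: -self.z };
        let r = self.mul(p).mul(inv);
        [r.x, r.y, r.z]
    }
}

// Translate to the pivot's origin, rotate, translate back.
fn rotate_about_point(q: Quat, point: [f32; 3], pivot: [f32; 3]) -> [f32; 3] {
    let local = [point[0]-pivot[0], point[1]-pivot[1], point[2]-pivot[2]];
    let r = q.rotate(local);
    [r[0]+pivot[0], r[1]+pivot[1], r[2]+pivot[2]]
}

fn main() {
    // 90° about Y around a pivot at (1, 0, 0): the point (2, 0, 0)
    // lands at roughly (1, 0, -1) in the usual right-handed convention.
    let q = Quat::from_y_rotation(std::f32::consts::FRAC_PI_2).normalize();
    let p = rotate_about_point(q, [2.0, 0.0, 0.0], [1.0, 0.0, 0.0]);
    println!("{:?}", p);
}
```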

Sad fact is, I’m just not yet used to thinking about quats and affines in my head. Although I will get there. I have a feeling I should find some more material on computer algebra for games/VR before the end of the month so I can bone up on my knowledge. It’s like, I can visualise the translations and the associated matrices in my head, but only if I slow down and treat it like the meditation it is.

Sidenote: I just experimented with a very cool VR app called Radiant, which lets you fly through a psychoactive pattern field with trancey music going on in the background. I had the intuition to run my sub in the background on my speakers, thinking psychoactive visuals should help open that bridge between conscious and subconscious to let the sub have a deeper effect. So we’ll see if that theory holds any weight.


Had a nice bottle of chianti tonight to reflect upon the exit from this side of the veil of the oldest member of my family line, who made 99 years of age before the system and her advanced age finally got to her. While checking the Discord where I got the help, I finally figured out the solution to my own problem. The solution to rotating around the HMD is to translate stage space (to which all objects are parented) by the HMD’s global transform, rotate everything by the increment, then translate back again. Then, no matter what the direction of rotation of the headset, the objects should rotate around the HMD correctly. I’m going to test this out tomorrow morning to confirm I got everything right.

I didn’t even have to read the 900 pages of mathematics I downloaded today, I just had to get close enough to my subconscious to allow Mind’s Eye to do its job! LOL. It was amazing how clearly I saw the solution in my mind’s eye; there was no doubt in my mind that I had the correct solution. I even saw the order of multiplication of the matrices to get the final affine.


Well, it turned out my assumptions about the rotation weren’t exactly correct, but they were close enough. I did some adb logcat debug prints this morning to check what the global and local orientations and translations of the head mounted display were as the rotation was happening. Even as the stage was rotating around me, the HMD LocalTransform remained relatively constant, changing only when I moved my head or body location slightly.

This should mean that I can indeed rotate stage space so objects move around the HMD. It’s looking like that problem I was having is due to making some assumptions about how the local transform of the HMD worked. It’s still going to be a complex transformation, but I can now safely assume that the HMD’s LocalTransform is fixed for a given movement of the user from their guardian origin.

Even though I try to keep this a non-personal journal as much as possible, I have to post this in memory of our departed 99-year-old.

Back during the legendary Christmas parties my family would host, under the auspices of one of the sons of this amazing woman, we would get charged up over the course of the day of Christmas festivities. Then, when the time was right, those of us who understood the traditions of our Christmas (it’s always “the best ham ever!”) would wait for the male and female patrons of the family to get nicely charged up, and then put this on for us to all sing our hearts out to.

So don’t forget, even if the most evil of mothercluckers are trying to kill us all, Pete’s message is as true now as it was then. With the help of J&M and the rest of the sunshine band, we shall overcome, some day.

Sorry.

So yesterday I finally got the rotation problem figured out. It’s reasonably simple when you think about it in the right way; unfortunately it’s very easy to overcomplicate affine transformations and matrices.

When the user is on their point of origin, the transformation is simple: you take the global transform matrix, including its existing translation, and multiply it by the rotational increment affine. This means that the translation as well as the existing rotation matrix get further rotated.

However, the final transform of the head-mounted display is global * hmd. This means that each multiplication by a rotational increment not only increases the global rotation as required, but also rotates the translation of the HMD’s local transform. So if the user moves off their origin, they end up describing a circle about the world with a radius equivalent to the translation amount in the HMD’s local transform.

To fix this, you have to subtract the rotational increment times that translation from the global transform’s translation, leaving the HMD in place within the world but rotated correctly. For seated games this minor difference from the point of origin is negligible, although it can still cause some disorientation while moving.
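The correction described above amounts to conjugating the rotation by translations: the new translation is t' = R·t + (p − R·p), where p is the pivot (the HMD position in stage space). A minimal sketch with plain arrays and a Y-axis rotation (function names are mine, not hotham's):

```rust
// Apply a rotation pivoted about a point, so the pivot maps to itself.

fn rot_y(angle: f32, v: [f32; 3]) -> [f32; 3] {
    let (s, c) = angle.sin_cos();
    [c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2]]
}

// Naive version: rotate the translation too. If the user has stepped off
// their origin, this swings them on a circle of radius |pivot|.
fn rotate_stage_naive(t: [f32; 3], angle: f32) -> [f32; 3] {
    rot_y(angle, t)
}

// Corrected version: subtract the part of the rotated translation that
// would move the pivot, leaving the HMD fixed but everything else rotated.
fn rotate_stage_about(t: [f32; 3], pivot: [f32; 3], angle: f32) -> [f32; 3] {
    let rt = rot_y(angle, t);
    let rp = rot_y(angle, pivot);
    [rt[0] + pivot[0] - rp[0], rt[1] + pivot[1] - rp[1], rt[2] + pivot[2] - rp[2]]
}

fn main() {
    // HMD at (1, 0, 0) in stage space: after the corrected update, the
    // pivot itself must not move, while other points orbit around it.
    let pivot = [1.0f32, 0.0, 0.0];
    let fixed = rotate_stage_about(pivot, pivot, std::f32::consts::FRAC_PI_2);
    let naive = rotate_stage_naive(pivot, std::f32::consts::FRAC_PI_2);
    println!("corrected: {:?}, naive: {:?}", fixed, naive);
}
```

The naive version sends (1, 0, 0) to (0, 0, −1); the corrected version leaves it where it is, which is exactly the "circle around the world" symptom and its fix.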

Now that I have that sorted out, the other aspect of navigation (up and down) should be simple. I’m going to try translating upwards/downwards in increments while the grip button is held, based on recording the global stage-space position of the left/right controllers and finding the difference from frame to frame. This should produce a nice smooth translation up and down in space. I also have to get my models ready now and get started with all the other businessy things that have been on hold the last few days.
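The grip-drag idea above can be sketched like this (a sketch of the per-frame bookkeeping only; the names are mine, and the real controller pose would come from the input context each tick):

```rust
// While the grip is held, add the frame-to-frame change in the
// controller's stage-space height to an accumulated world offset.

struct GripDrag {
    last_controller_y: Option<f32>, // None when the grip is released
    world_y_offset: f32,
}

impl GripDrag {
    fn new() -> Self {
        GripDrag { last_controller_y: None, world_y_offset: 0.0 }
    }

    // Call once per frame with the grip state and the controller's global Y.
    fn tick(&mut self, grip_held: bool, controller_y: f32) {
        if grip_held {
            if let Some(last) = self.last_controller_y {
                self.world_y_offset += controller_y - last; // per-frame delta
            }
            self.last_controller_y = Some(controller_y);
        } else {
            self.last_controller_y = None; // reset so release/regrab doesn't jump
        }
    }
}

fn main() {
    let mut drag = GripDrag::new();
    drag.tick(true, 1.00);  // grip pressed: baseline frame, no delta yet
    drag.tick(true, 1.05);  // hand raised 5 cm
    drag.tick(true, 1.12);  // raised further
    drag.tick(false, 1.12); // released: offset frozen
    println!("{}", drag.world_y_offset);
}
```

Resetting the baseline on release is what makes repeated grab-lift-release strokes accumulate smoothly instead of jumping.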

Take homes from today’s work:

  • I really need to fix that rotation function some time in the next week so it doesn’t cause me problems down the track. Subtracting the rotational increment times the translation from the original forward affine translation didn’t work, and I suspect that’s because the forward affine I was subtracting from was not multiplied by the original rotation. I really need to calculate the full new rotation minus the original rotated translation and subtract that from the global transform.
  • I started reading the Vulkan tutorial at this link. What I’ve learned so far from reading this and then looking at the hotham code is that HH seems to get its speed from its shaders (not just the fact it’s done in Rust). There’s a separate vertex and fragment shader for the GUI layer.
  • I’ve downloaded the SPIR-V specification from Khronos’ site. I really need to understand what Vulkan is doing under the hood to get how all this fits together. It does seem to be adaptable to pretty much any situation from the way it’s put together, and in Rust uses the vk-shader-macros crate to compile GLSL to SPIR-V bytecode either inline or at compile time.

I’m focusing on understanding this because I really want to be able to create my own overlays and draw text on the screen, and implement my own GUI layers that might use some arbitrary input other than egui.

The more I read the HH source code, the more I’m glad I chose this particular library for my upper layer, because it gives you direct access to the Vulkan and OpenXR contexts in order to further extend the existing system (which already includes a basic physics system and GUI system, somewhat limited, though I understand v0.3 may change the playing field significantly). There’s also an input_context which includes all the data returned from OpenXR about the left and right controllers and the HMD.

What it doesn’t do so far is allow access to some of the more exotic aspects of OpenXR, such as the eye gaze extensions. But from what I’ve read so far, using the EngineBuilder I can specify the extension set I need to load and then manage the eye gaze extension calls myself separately at the right point in the tick loop each frame.

The path forward is clear: understand all the above and how the vertex buffers and mesh creation work in the stress-test example code, then use that to create my own dynamic textured meshes from what will eventually become the polars and/or poppler/PDF layer to my apps.
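As a warm-up for those dynamic textured meshes, here is a sketch of the vertex/index-buffer layout for a quad tessellated into an n×n grid with UVs spanning the texture. Pure data layout, no Vulkan or hotham calls, and the struct name is my own:

```rust
// Build a quad of size w x h tessellated into an n x n grid of cells,
// with positions in the XY plane and UVs covering [0, 1] x [0, 1].

#[derive(Debug, Clone, Copy, PartialEq)]
struct Vertex {
    position: [f32; 3],
    uv: [f32; 2],
}

fn tessellated_quad(w: f32, h: f32, n: usize) -> (Vec<Vertex>, Vec<u32>) {
    // (n + 1)^2 vertices on the grid's lattice points.
    let mut vertices = Vec::with_capacity((n + 1) * (n + 1));
    for row in 0..=n {
        for col in 0..=n {
            let u = col as f32 / n as f32;
            let v = row as f32 / n as f32;
            vertices.push(Vertex { position: [u * w, v * h, 0.0], uv: [u, v] });
        }
    }
    // Two triangles per grid cell, indexed into the vertex buffer.
    let mut indices = Vec::with_capacity(n * n * 6);
    for row in 0..n {
        for col in 0..n {
            let i = (row * (n + 1) + col) as u32;
            let stride = (n + 1) as u32;
            indices.extend_from_slice(&[i, i + 1, i + stride, i + 1, i + stride + 1, i + stride]);
        }
    }
    (vertices, indices)
}

fn main() {
    let (verts, idx) = tessellated_quad(2.0, 1.0, 4);
    println!("{} vertices, {} indices", verts.len(), idx.len()); // 25 vertices, 96 indices
}
```

The index buffer is the part that actually matters for Vulkan: it lets each lattice vertex be shared by up to six triangles instead of being duplicated.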


Feeling what is either recon or my subconscious dwelling on some other larger problem today. I had difficulty sitting still and reading through the Vulkan material, and even other stuff I found on the same topic seemed confusing as all heck. Also urges to PMO or other useless wastes of time, which I quickly squashed. I also keep feeling bad about not doing as much focused work on the code as I feel I could if I didn’t have other things on my mind. Despite this, I know at my final meeting with my DJP mentor he seemed fairly impressed by the progress I had made, which I described to them in lackluster fashion, so really it probably is just carpal diem syndrome (that nasty thing you get when your reach exceeds your grasp).

Aside from this, my dream life has been pretty active; I had some really odd dreams the other night of a young kitty cat in two separate dreams, the first time running away and the second time head-butting me and snuggling up in a ridiculously oversized bed with all sorts of blankets and plush coverings. And there were other meaningful dreams involving the death of some saintly figure and some notes I held from them; I get the sense they involved breathing exercises. It was not long after having run a single loop of Dreams v2 (not RoD), but whatever is happening, my sleep processing seems to be increasing. I don’t know if this is a temporary or more permanent thing.


Continuing to understand in greater detail the way hotham is put together. I still don’t know how I’m going to put my HUD or a rendered PDF page into the texture that shows up on the wall of a skybox or some other rendered outcome using the library. I originally thought I would need to hook into the graphics pipeline; now, I’m not so sure. I still feel a lot of frustration at the slowness of my progress today. I know it’s probably reconciliation because I’m running a lot of stuff at the same time, but that doesn’t make it any easier.

(other stuff deleted)


Captain John Sheridan : What does the candle represent?

Delenn : Life.

Captain John Sheridan : Whose Life?

Delenn : All Life. Every Life. We are all born as molecules in the hearts of a billion stars. Molecules that do not understand politics or policies or differences. Over a billion years, we foolish molecules forget who we are and where we came from. In desperate acts of ego, we give ourselves names, fight over lines on maps, and pretend that our light is better than everyone else’s. The flame reminds us of the piece of those stars that lives on inside us. The spark that tells us: “You should know better.” The flame also reminds us that life is precious as each flame is unique. When it goes out, it’s gone forever. And there will never be another quite like it. So many

– Babylon 5, Season 5, Episode 16 - And All My Dreams, Torn Asunder


I think I am beginning to understand what I was missing in the understanding of creating fluidity between the upper and lower levels of consciousness (conscious and subconscious). A long time ago back in my IRC days, a 32 or 33 (I can’t remember which, pretty sure the latter) was revealing to me what they said was the secret of their degree, essentially telling me that the Stone was created within the human (so that it is fluid and coherent like wax, like the Stone is said to be). I now understand this to be a dangerous practice when done by itself (and perhaps this is why I didn’t work with the technique back then, other than the fact that I laughed at this representation of the Magnum Opus as psychological mumbo jumbo).

Doing this technique by itself would indeed create the world in which Crowley proclaimed every man and woman to be a star, one of pure individualities disconnected from the source (and thus able to be controlled and manipulated). This is not the whole Magnum Opus however, more an inversion of the true one, as is usually the case in systems that focus on the image and not the source of the light. Uniting the conscious and subconscious alone would be the alchemical equivalent of creating mud.

The upper elements need to be joined first. Only then can the lower elements be worked/transformed correctly. And I’m sure you know what I mean when I say the upper elements.


Today was a rest day, as much as I can have a rest day. I still managed to get in some research into my app at the end of the day.

Rust has a library, pdfium-render, built on top of Google’s pdfium library for easy rendering of PDF content. With some work I can output such rendered pages into an image texture to map over a tessellated mesh, to do part of what I want to do eventually. Also, for basic text and rasterised graphics, ab-glyph has an iterator that should do the job if I can get a RefMut to the image data or pass a manipulated buffer into a function that updates a created image texture. I still need to research whether Oculus has access to its own native version of pdfium on device or whether I’ll need to link a prebuilt .so to get things working.
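I haven't pinned down the pdfium-render or ab-glyph call signatures yet, so this sketch only shows the buffer plumbing that either would feed into: copying a grayscale coverage/page buffer into a tightly packed RGBA8 buffer of the kind that could later be uploaded as an image texture. Everything here is stdlib-only and the function name is my own.

```rust
// Expand a grayscale source buffer (one byte per pixel) into an RGBA8
// destination buffer (four bytes per pixel), fully opaque.

fn blit_gray_to_rgba(src: &[u8], w: usize, h: usize, dst: &mut [u8]) {
    assert_eq!(src.len(), w * h, "source must be w*h grayscale bytes");
    assert_eq!(dst.len(), w * h * 4, "destination must be w*h RGBA8 pixels");
    for (i, &g) in src.iter().enumerate() {
        let o = i * 4;
        dst[o] = g;       // R
        dst[o + 1] = g;   // G
        dst[o + 2] = g;   // B
        dst[o + 3] = 255; // A: opaque page
    }
}

fn main() {
    let page = [0u8, 128, 255, 64]; // a 2x2 "rendered page"
    let mut texture = vec![0u8; 2 * 2 * 4];
    blit_gray_to_rgba(&page, 2, 2, &mut texture);
    println!("{:?}", &texture[..8]); // [0, 0, 0, 255, 128, 128, 128, 255]
}
```

If pdfium already hands back BGRA or RGBA, this step collapses to at most a channel swizzle; the grayscale path is mainly what an ab-glyph coverage rasteriser would produce.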

Also, my insights this morning about the upper elements told me a good starting point for improving on taking action for KB. Got to look into that this week at some point.


I wonder what the effect of playing some of these subs into a Spurling coil would be.


I feel so dumb yet edumacated :stuck_out_tongue:

I had been trying to define for myself the difference between a texture and a material, and what all these different maps really represent. I finally made my way to this tutorial today, which explains a lot more about why the Blender workflow is so complex with the Principled BSDF shader. I had no clue about the metallic or specular workflows and how the different texture maps fit together; I’d been working mainly off the diffuse map (not even albedo) and not even dealing with baking the normals of the higher-poly model onto the lower one.

This gives me a much better appreciation for the amount of work a game engine needs to do to render every frame, and why 3D scenes nowadays need GPUs to do their work. All this dealing with the fresnel effect, separating the lighting from the model and so on, was something I didn’t need to deal with in my virtual world, because the models so far have been really simple image-textured models, even with the bevelling of the photo frame object I imported for testing. And I hope I do not need to deal too much with the metallic channel or reflections in my first-person-perspective apps (most simple games seem to use very little complex lighting).

Now I have to figure out how my image-textured models are showing up material-wise in Vulkan so I can replicate this in my dynamic surfaces.

Also, I can now understand a lot better why the GUI context is wrapped in its own shader. Effectively the image returned from egui is tessellated into a mesh (presumably a pixel per quad), then the vertex shader transforms the input coordinates into screen coordinates and does a conversion from sRGB to linear colour space. The fragment shader simply tints the output for the fragment by multiplying the input colour by the font texture information at that coordinate. So now, if I can understand how that font texture map is generated, I’ll understand how to write my own variation of it if I ever need to create my own GUI or overlay (which I probably will have to).
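The sRGB-to-linear conversion that shader performs is the standard piecewise sRGB transfer function, which can be written on the CPU side like this (a reference implementation of the well-known formula, not hotham's actual shader code):

```rust
// Standard piecewise sRGB-to-linear conversion for one channel in [0, 1]:
// a linear segment near black, a 2.4 power curve elsewhere.

fn srgb_to_linear(c: f32) -> f32 {
    if c <= 0.04045 {
        c / 12.92
    } else {
        ((c + 0.055) / 1.055).powf(2.4)
    }
}

fn main() {
    // Mid-grey 0.5 in sRGB is roughly 0.214 linear, which is why
    // skipping this conversion makes GUI colours look washed out.
    println!("{:.3}", srgb_to_linear(0.5));
}
```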


A very spiritually focused day after last night. The Road to Alexandria starts to open up tomorrow when I get to start my branding. I also have a few things I need to purchase when that cash comes in for additional safety. Going to get that underway tonight if I can in prep for what is coming.

My recent run of the Revelation series, once only in a row, seems to have borne out its manifestation today, not more than 2 days or so from the run. Lots of good information today; I’m starting to understand the enemy’s plans better. I continue to have faith I am on the right path.


Got a good code review in this morning on the custom rendering code, although it’s still going to need a bunch of my focus this weekend to fully comprehend the way the render context, the Vulkan context, and the presentation buffers all interact with the numerous other components in the code, to figure out what it’s truly doing at every level. It’s not that the code is difficult to understand; it just has lots of moving parts, and I’m only starting to fully comprehend using unsafe code in Rust.

As I got a little spare cash, I invested today in a new keyboard, trackball mouse and hard drive to hopefully speed up development. Unfortunately that means I have to adapt my workflow, which ironically slows me down a little. But eventually it will be worth it, and my wrists and sanity will thank me for it. I don’t know how many times I have cursed the non-working back-arrow key on my laptop or the lack of a numpad.

Business name registration was significantly less than I thought it was going to be (by a factor of 3), so after I budget tomorrow I’ll finally get all my branding and registrations sorted out. Then it’s forward from there, getting a proper desktop setup for development.

Code-wise, not a lot got done today. I’ve started downloading the Vulkan tutorial series I found some time ago to the local external drive for later viewing. Confirmed my further protection is on its way and the existing protection is beginning to work in reminding me to stay in the presence more often, which seems to be the key to avoiding PMO.

When you feel the presence of that omnipresent intelligence, the desire to go down that path just loses its steam. Funny how that works. That’s how the Benedict medal is meant to work, by constantly reminding the wearer of His Presence. In that Presence the adversary cannot remain unless it is the will of Him, which is what makes it such an excellent tool which is sadly not used much nowadays.

Another tool is one which was often carried by soldiers during WWI, and in various forms is still widely used today. Whoever knew combat or battle R’s were a thing? Yet it’s obvious if you think about it, since no-one needs protection in that sphere more than a man on the front line. Especially with the Pardon C, which isn’t well known these days.

The reason I’ve been going down that path is recognising an increased level of activity that would cause you to need that protection. I suppose I shouldn’t be surprised given how close we are to the time of the trumpet. And because of the importance of the work I’ll be doing in putting these apps together, I have been feeling a lot of that very distracting stuff impacting my ability to work on the project, which requires me to double down on remaining in Presence in order to stay focused on the job.

I guess ultimately because of this increased Vigo-like activity (can you say Zuul mf?) I’m inadvertently being pushed down the path of discovering what helped Walter Russell to accomplish what he did by having to push against this increasing torrent of distraction. And in the process figuring out how fear is a mind trick that can only work when you do not walk in Presence.

I think one of the major stumbling blocks I find with trying to learn Vulkan is the incredibly long constant names and structure names. VK_STRUCTURE_TYPE_PIPELINE_LAYOUT_CREATE_INFO – ugh, that’s 45 characters right there. In Rust, assuming you’ve imported ash, it’s &vk::PipelineLayoutCreateInfo::builder() followed by a bunch of function calls to set the fields, and then build(). Then there’s all the terminology. Uniform buffers, descriptor sets, swapchains, staging buffers, vertex buffers, index buffers, buffers out your wazoo! All just to efficiently pipeline the creation of some form of animated view of a world, no matter how simple that world happens to be! Then you have to deal with materials, textures, UV coordinates, fragment interpolation, image sampling, and before long you begin wondering what did I set out to do in the first place? :stuck_out_tongue:

The perfect response/rebuke to those who reject aspects of science or history because “it’s not in the Bible”: John 7:16-17.

“… My doctrine is not mine, but his that sent me. 17 If any man will do his will, he shall know of the doctrine, whether it be of God, or whether I speak of myself.”

Here the master speaks of receiving knowledge he may or may not have told them, directly from the Creator, through obedience to the same. In other words, knowledge of things not previously spoken of may be given by direct revelation. Here we can learn that the view described above is not in line with the teachings of the book itself.

Despite my slow technical progress over the last week, I am beginning to have faith that I can create the app or apps I have been tasked with writing. This is because I now understand where the inspiration, the protection and the focus are meant to come from in creating that, and how to obtain them. I am starting to see that there are aspects to the creation of this app even more foundational than learning Vulkan and hotham.

It’s why my business failed previously: the wrong mindset and approach, the wrong rationale, the wrong modus operandi. Getting that right up front is critical to being able to make the mental focus and action-taking work dependably, and so the work I have been doing on the weekend and today is just as crucial to the development of my business as the code itself. Also, the fleshing out of the R&R app idea I had originally will only come from connection with the wellspring of creativity which gave me the idea in the first place. Knowing that, I continue moving forward slowly but surely with my task.

Edit to add notes. In addition to the purely traditional modalities of reaching this state, which use the items of the faith and the wisdom of the book, there is another component which is related to and reinforces the Presence, as well as developing those gifts of the spirit that are called extra-sensory but are of G-d, in a practical, reproducible and measurable way. This is known as Magnetism, and is elaborated on by Marco Paret in his book Magnetism and Energetic Ascent, which points out the pitfalls in the development process. I must integrate these two sets of practices in order to reach the higher degrees of focus needed: practising these exercises in conjunction with running Khan Black, along with periodic runs of IG UPX in between runs of KB/Rev Series/WB. This should provide the necessary structure to successfully develop the focus-without-interruption required for successful implementation of the app in the shortest possible time frame.