The Road to Alexandria (IG: UPX)

As I go to bed tonight, I feel satisfied that, having done all the self-reflection, video watching and planning today and over the last several days, I am ready to take on the code again, knowing exactly which principles and actions must act on one another.

The Paret materials I discussed here and elsewhere, and the related exercises, can be applied both to spiritual exercises such as prayer and to everyday activities such as consuming the Vulkan content and reviewing that content within the HH code base. Additionally, the material I watched today on the HS is starting to mesh with the Paret stuff and the question of how to find the right inspirations to bring the app together.

So I have specific activities I can undertake sequentially to move forward with confidence, to develop the Wanted, the IG: UPX and the Rev series progression. It's taken me a while to crystallize the best attack plan to tackle both the physical and otherworldly distractions in one, but I have a system now that is capable of doing that while also developing my personal energy at the same time. All of the uncertainties around the steps and actions to the goal have been removed, which means the path can be clearly visualised and executed.

First, return to learning the Vulkan API; so far, so good. Brendan Galea's Game Engine Tutorials are the best I've come across so far and go into a lot of detail about what each struct and function call does. Today's work so far has been reviewing the creation of vertex buffers and how they connect with the vertex shaders. The vertex buffers need to be mapped and then either flushed across to the device or updated automatically via the host coherent bit flag. The structure of each vertex needs to be mapped via a descriptor for each binding.
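To make the "descriptor for each binding" idea concrete: the binding descriptor carries the stride of a vertex, and each attribute descriptor carries a byte offset into it. A std-only sketch (the `Vertex` layout here is the tutorial-style position+color example, not hotham's actual vertex type, and plain integers stand in for the real Vulkan descriptor structs):

```rust
// A tutorial-style vertex: position + color. repr(C) keeps field order and
// offsets predictable, which is exactly what the attribute descriptors encode.
#[repr(C)]
struct Vertex {
    position: [f32; 3],
    color: [f32; 3],
}

// What VkVertexInputBindingDescription::stride would be for this layout.
fn binding_stride() -> usize {
    std::mem::size_of::<Vertex>()
}

// What the two VkVertexInputAttributeDescription::offset values would be.
fn attribute_offsets() -> (usize, usize) {
    let v = Vertex { position: [0.0; 3], color: [0.0; 3] };
    let base = &v as *const Vertex as usize;
    let pos = &v.position as *const [f32; 3] as usize - base;
    let col = &v.color as *const [f32; 3] as usize - base;
    (pos, col)
}

fn main() {
    assert_eq!(binding_stride(), 24);         // six f32s, tightly packed
    assert_eq!(attribute_offsets(), (0, 12)); // color starts after 3 f32s
    println!("stride={} offsets={:?}", binding_stride(), attribute_offsets());
}
```

The same arithmetic is what the shader's `location` bindings line up against.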

The same function call used to allocate vertex buffers is also used for other types of buffers like uniform buffers.

If I'm understanding the principle so far, for wall panels or HUD displays I can create a couple of uniform buffers which I memcpy to and flush, and then add some custom fragment shaders to sample the uniform buffers as image samplers using the UV mapping coordinate of the fragment.
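One detail that bites with the memcpy-and-flush path: for non-coherent memory, Vulkan requires the flushed range's offset and size to be aligned to `nonCoherentAtomSize`. A std-only sketch of the rounding (the helper name is mine, not hotham's or ash's):

```rust
// vkFlushMappedMemoryRanges needs offset to be a multiple of
// nonCoherentAtomSize and size rounded up to one (clamped to the
// allocation size). Returns the aligned (offset, size) pair.
fn align_flush_range(offset: u64, size: u64, atom: u64, alloc_size: u64) -> (u64, u64) {
    let start = offset - offset % atom;                    // round offset down
    let end = ((offset + size + atom - 1) / atom) * atom;  // round end up
    (start, end.min(alloc_size) - start)
}

fn main() {
    // Atom size 64, 1024-byte allocation: a write at [100, 150) must be
    // flushed as the aligned range [64, 192).
    assert_eq!(align_flush_range(100, 50, 64, 1024), (64, 128));
}
```

With the host coherent bit flag set, none of this is needed, which is why the tutorials treat it as the easy default.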

The finer details of how that all fits together in the code are going to have to wait until I go through the chapters on uniform buffers and textures, and read the rest of how the HH code base custom render example fits together.

My CR came today in the mail along with its companion pieces. Very happy with how quickly it reached me; it helps me feel secure that I am moving forward in right alignment. So far my use of the exercises mentioned the other day has been helpful, but it will take some time to become a regular thing. Like I said on another thread, I never realized until recently how RoM had interacted with helping me orient myself in that respect, until I went back and read the copy to see the modules I had overlooked when I originally bought it.

Woken from a second set of nightmares in as many nights. Thankfully this lot weren't half as bad as the previous night, with all my teeth falling out and some weird metallic plate coming from my upper jaw, the night before the CR came. Now that it's here, it was an earlier evening than usual and I had a dream involving PUSA (not the band) and a fat, sweaty man who pointed a gun at me and my friend in the dream after he had lost his mom earlier.

Ended up waking up with really hot palms, as though energy had been flowing but with resistance. Same with the soles of the feet, unnaturally warm. Both times it felt like something had been trying to poke its way through into my dream state because I won't play with it in waking life :wink: Tonight, wearing that CR, things weren't half as bad. Gonna keep a close eye on that dream situation; the funny thing was, before feeling tired enough to fall asleep and have that dream tonight, I had resisted a temptation not long before.

Two new concepts I finally have words for, which occur when working on 3D presentation within Vulkan and other libraries.

  1. Z fighting. I had seen this in my test scene within hotham, especially with complex objects like my horned man object. Moving slightly one way or another, you would get a shimmering effect. Similar effects occur on the periphery of a scene as the polygons become smaller. The problem occurs when two polygons occupy the same spot in space, including due to the precision constraints of the f32 or f16 being used to draw the object. Galea's tutorial includes a perspective matrix which has the property of reducing z fighting, but even with this, if you have a high-poly model, there will be a point at which z-fighting rears its head. This is where careful retopology and texture painting are going to have to come in.
  2. Extrinsic vs intrinsic rotations. This is one which blew my mind when I read it. You can transform the one into the other by changing the order of reading / multiplication. I’m still trying to get my head around how that works mathematically.
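The extrinsic/intrinsic equivalence can actually be checked numerically, which helped it click for me: rotating about fixed world axes in x, then y, then z order gives the same matrix as rotating about the body's own axes in z, then y', then x'' order, because both readings are the same product Rz·Ry·Rx, just read from opposite ends. A std-only sketch:

```rust
type Mat3 = [[f64; 3]; 3];

fn mul(a: Mat3, b: Mat3) -> Mat3 {
    let mut m = [[0.0; 3]; 3];
    for i in 0..3 {
        for j in 0..3 {
            for k in 0..3 {
                m[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    m
}

fn rx(t: f64) -> Mat3 {
    let (s, c) = t.sin_cos();
    [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]
}
fn ry(t: f64) -> Mat3 {
    let (s, c) = t.sin_cos();
    [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
}
fn rz(t: f64) -> Mat3 {
    let (s, c) = t.sin_cos();
    [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
}

// Extrinsic: each new rotation is about a fixed world axis, so it
// pre-multiplies whatever came before.
fn extrinsic_xyz(ax: f64, ay: f64, az: f64) -> Mat3 {
    mul(rz(az), mul(ry(ay), rx(ax)))
}

// Intrinsic: each new rotation is about the already-rotated body axis,
// so it post-multiplies.
fn intrinsic_zyx(az: f64, ay: f64, ax: f64) -> Mat3 {
    mul(mul(rz(az), ry(ay)), rx(ax))
}

fn approx_eq(a: Mat3, b: Mat3) -> bool {
    (0..3).all(|i| (0..3).all(|j| (a[i][j] - b[i][j]).abs() < 1e-12))
}

fn main() {
    // Same angles, opposite reading order, identical matrix.
    assert!(approx_eq(extrinsic_xyz(0.3, 0.5, 0.7), intrinsic_zyx(0.7, 0.5, 0.3)));
}
```

Mathematically it is just associativity of matrix multiplication: the product never changes, only which frame you imagine each factor acting in.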

After I finish reading on projection matrices, I’m on to view transforms and uniform buffers, which is where I will find the solution to my problem of dynamic texturing. I’m tempted to pull an all nighter, but we’ll see how things go.
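The depth-precision side of the z-fighting point above can be sketched numerically too. With a standard perspective depth mapping, two surfaces 5 cm apart at 100 m distance land in the same 16-bit depth bucket but in different 24-bit ones (the numbers here are my own illustration, not from the tutorial):

```rust
// Standard perspective depth, mapping z in [near, far] onto [0, 1].
fn depth(z: f64, near: f64, far: f64) -> f64 {
    far * (z - near) / ((far - near) * z)
}

// Snap a depth value to an n-bit depth buffer.
fn quantize(d: f64, bits: u32) -> u64 {
    (d * ((1u64 << bits) - 1) as f64).round() as u64
}

fn main() {
    let (near, far) = (0.1, 1000.0);
    let d1 = depth(100.0, near, far);
    let d2 = depth(100.05, near, far); // 5 cm behind the first surface
    // A 16-bit depth buffer can no longer tell them apart; a 24-bit one
    // still can. The 16-bit collision is what shows up as z-fighting.
    assert_eq!(quantize(d1, 16), quantize(d2, 16));
    assert_ne!(quantize(d1, 24), quantize(d2, 24));
}
```

The hyperbolic mapping concentrates almost all the precision near the camera, which is why the shimmer appears on distant and peripheral geometry first.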

8.07pm. Restate my assumptions (throwback to the movie Pi)

This essay which I found reference to when seeking information on claims of renewing prenatal jing (aka parallel with KB objectives and the Paret stuff) reminded me again of the concept of harmonic resonance, aka gan ying in this context, and connected with dharana and samadhi in another tradition, as well as the type of projection or union of consciousness people like Viktor Schauberger and Slim Spurling experienced in their professions.

It reminds me that this universal principle, present also in Greek philosophy, found its way into Western thought through masters such as Plotinus, Porphyry and others who spoke of the chain of being, eventually crystallizing into systems such as Kabbalistic thought via the Silk Road as investigation of this internal science in Western culture dried up. It remained only in hidden form in the work of people such as Bruno, in Renaissance thought, in demonologies and angelologies, and in an obscured form in alchemy and old theories of telluric currents and the like.

The process of applying the will or awareness in a sustained sense like the fire of the wheel, as spoken of in Paret’s method, also seems to have connection to this kind of gan ying or resonance through directed expression under the force of will, although in an abstract kind of way.

Ultimately I find myself wondering, can we combine several of these systems with the revelation of the master who returns regularly in order to assure the success of our final outcome? After all, there are many different ways of spreading a fire, especially when the speed of light is not a barrier.

I know the answer is yes as this has been done before, particularly in several Indian lineages. Focusing on the aspects of Virgo, Sol, Sag A, Terra Magneta, to speak obliquely, and the relevant patterns and images well known in our own culture, united with the direct links, should yield interesting results.

But also other elements are required to ensure the full range of development of the energy field in a healthy way. Kidneys, bones and adrenals and their resonances in particular.

This is coming up because of looking at the existing timeline of the tick of the tock and considering how to use methods devoid of the possibility of sabotage by the “flyers”/inorganic ones, to ensure goals in all areas are accomplished. It needs considerable thought to be done methodically.

Meanwhile actual work on the app continues unabated, but the plan for organizing action taking on the several subs together is taking more depth of form with ideas like this.

BOOM! And there it is.

In the writings of Catholic seer Marie Julie Jahenny, she gives, among her predictions for the final days, a few very interesting ones. One is in regards to these super lightning storms Ben is warning about.

She mentions within her ecstasies the word lightning over 20 times! In her description of the three days of darkness, she mentions lightning entering people's homes but not blowing out the beeswax candles (which have a high dielectric constant). Prior to this event: "There will be in the thunder something strange. There will be lightning without thunder: for half a day, the earth will be covered." And again: "The lightnings of heaven will succeed with a rapid violence. Fire from heaven will travel the earth to an appalling width: the vengeful lightning will burn any point that produces the fruit."

This should be read in conjunction with 2 Thessalonians 2:7: "For the mystery of lawlessness doth already work: only there is one that restraineth now, until he be taken out of the way". Chan Thomas' postlude to the Adam and Eve story (a previously classified writing on magnetic excursions and pole shifts) speaks of the changes that occur in men as the magnetic field of the earth weakens, citing as illustration experiments with rats in aluminium cylinders within which an artificially reduced magnetic field was simulated. The more I read all this, the more I am convinced of how the warnings and prophecies in the Bible directly foretell that cyclical weakening magnetic event, the same one the Hopi mention when the twin brothers leave their posts at the poles, and the corresponding easily identifiable consequences for earthquakes, volcanoes, the failure of the jet stream, the corresponding climate changes, etc.

OMG: This is pure gold, and 100% in line with what I think too.

If you want to get into “spirituality”, don’t be like the guy who wrote the email to Louis :wink: honestly the tone is just cringe af. One of the best spiritual and even just moral principles you can uphold is radical honesty with yourself and others, and being able to admit when you fukd up. If you can’t do that, how are you ever going to earn the respect of your peers, let alone your customers?

So, actions today… last night I ended up sleeping several hours between 10 and midnight… woke up and could not get back to bed until 3am, energy levels way too high. Slept until 6.30. After morning activities, went to the shopping centre and visited the Japanese dollar store (where everything is 3.30, that's inflation for ya). Ended up picking up a bunch of stationery products and kitchen utensils, and little Japanese snacks I haven't touched yet which are probably loaded with MSG.

Picked up ordinary groceries, upscaled a bit in quality due to the recent return. Picked up tobacco and a bottle of fine brivari for an occasional reward for good behavior ;).

Got home, and barely two hours have gone by, and with not more than one sip of said brivari washed down with a generous amount of juice, I feel overwhelmed by tiredness after all the walking and lugging I did today. Stick Paragon on the headphones and boom, I'm out like a light for over two hours, and feeling heavily groggy when I wake up, slight thick mucus build-up, hands and feet hot as two nights ago. Was it the brivari? Was it the sugar content in the snacks I ate when I got home, which I usually try to keep to a minimum? Who knows. Going to have to keep an eye on this. Granted, the muscular exertion was more than usual; the only thing sub-wise I had done prior to going out was a partial loop of PSITU, state shifting from which was partial but positive at the time. Something is off.

Methionine with my usual supplement regime helped clear out whatever was going on last night. Finally getting into the uniform buffer code this morning, and I realised it's not the true solution to my problem. However, I found code that will help me reach my goals.

This goes into following the Vulkan Tutorials in Rust including the texture mapping aspect. I’m building it now to make sure it compiles and picking the code apart and comparing it to the HH code base today (edit: confirmed examples build and run on Windows). I hope over the weekend I’ll be able to get this sorted out once and for all so I can start working on the creative aspect of all this.

So I now have a methodical process for what I need to do. Basically I’m going through load_models_from_glb_data and breaking down each part of what it does, and what each part of each function it calls does, recursively.

Eventually it reaches the point where the textures are loaded in via the import context. Once I finish recursively writing down everything that happens I can derive the full structure and process. Already I can see the code creates an import context from each glb buffer.file and that context’s implementation then loads all the meshes and materials from the glb file as HH objects, and then Material::load calls Texture::load, and then it loads the texture into the render context and returns the index.

Now it becomes a question of how to unpack creating a mesh with a material and texture without going through this rigmarole, by constructing a material separately with an empty texture, and updating the texture image separately.

All the while I keep thinking… damn, this library needs better documentation than it has. Maybe that's why I'm here to do explanatory videos.

Synchronicities have been hot and fast, and I had an eye-fucking or two today, one from a pretty gorgeous Asian girl (no capacity to take any action on it though, as I was on moving transport at the time), so I think Wanted is starting to have an effect, it's just been slow for me. And I still haven't had an opportunity to go out to a bar/tavern where I might actually have a chance to speak to someone who engages me. I had hoped tonight would be the night to do that, but after today's action I'm exhausted.

After the markets my feet and hip were already killing me by that point, and then I misjudged the road I had to walk down to find the bus stop home, and ended up walking another half hour or more in hilly terrain before I found the right one, even with GPS. By that point I was literally crying from the pain. Had to go pick up groceries and wine afterwards (for the pain, ye ken?) and then walk home with my feet and hip still feeling like I’d gone through a crucifixion.

So yeah, I’m upwardly mobile at best tonight.

The funny as fuck thing was that I was praying last night to the BVM, using the phosphene, to help me use WB to find her. At the markets today I met a sweet couple, an Italian lady not quite old enough to be my grandmother but older than my parents, selling off kitsch that belonged to their parents. One of those pieces of kitsch happened to be a porcelain image of the BVM, her hands clasped in prayer, edged in gold with roses on her gown. I bought it off them for 10 bucks, along with a photo frame that showed an angel and the master sharing communion with someone dressed in a nun's outfit.

I also bought the guy’s mother’s set of mini goblets (they could be a communion set for all I know, I could drain one of those goblets in a single gulp) in a purple felt case.

When I was talking with the Italian grandmother, she was telling me it was a Madonna like I was an imbecile. I pointed out that I had a matching version of her around my neck. She was like "oh, are you a Catholic?" All these thoughts and feelings went through my head, the chief of which was "I don't fucking like being labeled!" Because when I think of that label I still think of people like my grandparents' generation, or the brothers and sisters in the Catholic high school I was sent to without a choice, very dry and mild individuals who are nothing like me. Heck, if you have to label me in that damned box, maybe I'm like Daredevil. Except with a lower pain threshold :stuck_out_tongue:

Thing is, I separate my faith in a God who came down to Earth and incarnated like a mf avatar from any kind of label because #1 that’s not the only thing I believe and #2 people who did use that label in my past made very piss poor decisions. You could say if they were the marketing team or site reps for the avatar, they really fked up. And also, I just don’t like labels to hem me in. I am me, I am not some religious box you can put me in, I have my own personal relationship with the Creator and I like it that way.

So I smiled at her and said “after a fashion…”

All up I gave that couple 35 bucks to take some of their family heirlooms off their hands and I think they went to a safe place with me.

So bottom line, I went home with some very nice religious and kitchenware paraphernalia as well as a beer mug/glass holder in pewter from another store owner which featured a handle which was a naked woman with her back arched for pleasure, with her hair and feet forming the edges of the handle. The scene engraved around the outside was a Chinese dragon plus pagoda plus bridge over the river style scene. Very nice.

I’m probably going to be moving like a nursing home soul the rest of tonight. And no I didn’t run in to my hearts desire woman down at the markets. Somehow I didn’t think that was going to happen anyway. But at least I got a nice eye fucking or two on my way there and back. Call it a Wanted Hobbits tale or something. LOL. Next week, perhaps.

So my feet and hip this morning still felt smashed up; it's beginning to heal but still smarts whenever I stand up.

I'm starting to move forward with understanding colliders and understanding the parenting system a lot better now. Still don't have the colliders for the walls working properly, it seems; photo frames still float through the floor despite it being a rigid body with a collider. I think the problem is that the floor object in the scene which does generate collisions is using a collider shape of a half space and is picking up collisions from above all the time, whereas my collider is a cuboid whose position may or may not have been properly set with respect to its parent rigid body. It's hard to tell what's going on without further debugging.

The recon I had previously with WB seems to have slackened off and with the additional energy I’m generating from KB the mental bandwidth is starting to return to normal. Going to keep monitoring over the rest of this week and hopefully by end of this week I will be back to full speed in my implementation. There was definitely a period there where my motivation fell through the floor thanks to the energy requirements of WB.

I think I’m finally understanding the reason for my mixed experience with subliminals. I’ve understood for a while the need for circulation of the energy within the body for information to flow into the main brain and provide insights. But there’s knowing the cause of the problem and there’s knowing the solution.

The two or three methods that I have used in the past that inevitably provide results in getting the subs talking to me seem to have something in common. The methods, of mixed effectiveness, are:

  • Getting stoned on the right type of weed (this is one I haven’t done in a long time, both for financial reasons and personal spiritual and logistical reasons). Times in the past after having a suitably large amount of the plant and then walking about the city or being in the atmosphere of an event, there were several times when everything around me just started talking to me and made the manifestations of the subs clearly apparent. Images or concepts that otherwise would just be intellectual became pregnant with meaning. Alcohol also has an effect to some extent but in a less desirable way.
  • After sex or after a masturbation session, especially if it has more physical component to it than visual or intellectual. Insights and integration between the sub and the waking brain would happen for a period.
  • Physical exercise for a sustained period of time, or some activity which requires activation of endurance capacities.

The common thread seems to be the activation of the brain in the gut. Marijuana does this (especially a sativa) in a very clear way (think of the sudden desire to eat which comes in those circumstances). Physical exercise does it through establishing an increased level of activity in the enteric brain and a greater collaboration between the head and the torso through having to constantly send commands to the limbs to move and so on.

Sexuality does it in a similar way by activating a more primal mode of being, which after it is balanced with the forebrain, produces this same two way flow of communication. Of course people have long known the secret of using sex to create states of genius (which was something that got Crowley into trouble and earned him a great deal of his notoriety). But it is notoriously difficult to guide without training.

I remember back when I did fire manipulation (causing the flames of a fire to follow my hands or move at will without being burned), the times I was able to do it successfully involved a very strange state of consciousness, which was brought on through what was likely gut-empowerment.

Most of the time however, I do not have a solid connection with the gut, being very intellectual and cerebral. The more I read into the way animal magnetism works, the more it seems to me that this is a key I need to focus on. Paret seems to suggest that the images that form in the brain receive their energy and origin within the solar plexus. He suggests a session of focus on this area before beginning any strenuous intellectual task.

Given the problems I seem to have had recently with getting back into the coding side of things, it seems a balancing of the two brains is in order to help Index Gate achieve its stated goals.

So, having had a chance to dive down part of that rabbit hole from last night, I can confirm the sources quoted are not infallible and make some errors which are absolutely inexcusable, even though there is still a wealth of information there, shared freely, which sheds a lot of light on the hierarchy and the mistakes they have made and are making. Quite an interesting read. And I can tell the work today on testing the solar plexus theory of being able to focus properly has given good initial results; I will need to continue down that path tomorrow.

Okay, now I feel dumb :laughing:

After all of that debugging of colliders and the physics simulation, I discovered the problem was… I had inserted the SharedShape of the collider into the world, and not Collider::new on that SharedShape. So there really was no collider present for the rigid body, hence the flying through walls. Not only that, but because of the exchanging of the y and z axes in Blender vs OpenXR, I had named the floor and ceiling with negz and posz, which were negative and positive y in the game.

Now when I rerun the physics simulations, cuboid colliders work and the floor and ceiling act like they should. Makes a lot of difference when my bloody mind can focus to see the stupid error!

Edit: Fixed gravity as well; things fall at an appropriate rate, although I don't like how fast 9.8 m/s² looks. Maybe I need to make this half Earth gravity at least, to add a little spring in my step! :stuck_out_tongue:

Understanding the texture creation is coming along as well. The pbr shader includes material flags in the lsw and a texture id in the msw as an index into a texture buffer passed to the shader. Simple. I can create an empty texture of size type VkExtent2D, determine some suitable material flags (pretty sure it's just going to be flags of 1, or HAS_BASE_COLOR_TEXTURE with no AO map), with 0 and 0 for the other fields, but I need to have a look at one of the textures loaded from glTF to confirm first.
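The lsw/msw packing above is just word-level bit twiddling. A sketch of how I read it (the layout and the flag constant are my reading of the shader, not a confirmed hotham API):

```rust
// Assumed flag value from my reading of the pbr shader.
const HAS_BASE_COLOR_TEXTURE: u32 = 1;

// Flags in the least significant word, texture index in the most
// significant word, one u32 per material slot.
fn pack_material(flags: u32, texture_id: u32) -> u32 {
    (texture_id << 16) | (flags & 0xFFFF)
}

fn unpack_material(packed: u32) -> (u32, u32) {
    (packed & 0xFFFF, packed >> 16) // (flags, texture_id)
}

fn main() {
    let m = pack_material(HAS_BASE_COLOR_TEXTURE, 7);
    assert_eq!(m, 0x0007_0001);
    assert_eq!(unpack_material(m), (HAS_BASE_COLOR_TEXTURE, 7));
}
```

If the real shader uses a different word order, the two helpers just swap their shifts; the round-trip property is the part worth testing.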

Then I push the new material into the materials buffer and create a simple mesh with primitive data using that material id. And then finally I can update the texture image using the Vulkan context methods, after using ab_glyph and buffer manipulation to draw the text.

That's my plan anyway. Quite a bit of work just to print some shit out to the screen. But the good thing is that I should then be able to maybe UV map the entire texture around a curved mesh, without even having to write my own shader, if I'm getting this right.

Progress:

After morning forums and news feeds, did some work to arouse the energy in my solar plexus and move mind energy closer to body energy. Ran a micro loop of GLMC, 6 min, and began work around 12 or 12.30.

I had to split the initialization routines in two, because the borrow checker dislikes passing engine through as mutable when there is also a mutable reference to world. So I executed my code to add a dynamic screen outside the initialization for the textures.
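The borrow checker problem above has a well-known minimal shape: you can't pass `&mut engine` and `&mut engine.world` to the same call, but destructuring splits the borrows into disjoint fields. A sketch with hypothetical stand-in types (not hotham's actual `Engine`):

```rust
// Stand-in types for illustration only.
struct World { entities: u32 }
struct Engine { world: World, frame: u32 }

// Calling this as setup(&mut engine.frame, &mut engine.world) via one
// &mut Engine would be two overlapping mutable borrows; splitting the
// borrows field-by-field is what makes it legal.
fn setup(frame: &mut u32, world: &mut World) {
    world.entities += 1;
    *frame += 1;
}

fn main() {
    let mut engine = Engine { world: World { entities: 0 }, frame: 0 };
    // Destructure the &mut: each field gets its own disjoint borrow.
    let Engine { world, frame } = &mut engine;
    setup(frame, world);
    assert_eq!(engine.world.entities, 1);
    assert_eq!(engine.frame, 1);
}
```

Splitting the initialization in two achieves the same thing at a coarser granularity: each half only holds the one mutable borrow it actually needs.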

To add a plane, I started with creating position, index and vertex buffers.

Each vertex needs to know its normal (in this case, pointing outwards on the z axis towards the player), its texture coordinate (presumably from 0 to 1 as a float), and joint indices and weights if skinning. Since I’m not skinning, left those alone after consulting the shader code.

The index buffer lists the indices into the vertex buffer for the vertices of triangles in this case in order.
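The plane data itself is small enough to sketch in full. A std-only version of the buffers described above (the `Vertex` field set mirrors what I described, minus the skinning attributes; the UV convention here, v increasing downward, is an assumption that may need flipping):

```rust
// Per-vertex data for an unskinned quad: position, normal, texture coord.
struct Vertex {
    position: [f32; 3],
    normal: [f32; 3],
    uv: [f32; 2],
}

// A w-by-h quad centred on the origin, facing +Z (toward the player).
fn quad(w: f32, h: f32) -> (Vec<Vertex>, Vec<u32>) {
    let n = [0.0, 0.0, 1.0];
    let vertices = vec![
        Vertex { position: [-w / 2.0, -h / 2.0, 0.0], normal: n, uv: [0.0, 1.0] },
        Vertex { position: [ w / 2.0, -h / 2.0, 0.0], normal: n, uv: [1.0, 1.0] },
        Vertex { position: [ w / 2.0,  h / 2.0, 0.0], normal: n, uv: [1.0, 0.0] },
        Vertex { position: [-w / 2.0,  h / 2.0, 0.0], normal: n, uv: [0.0, 0.0] },
    ];
    // Two triangles, counter-clockwise when viewed from +Z.
    let indices = vec![0, 1, 2, 2, 3, 0];
    (vertices, indices)
}

fn main() {
    let (v, i) = quad(4.8, 2.4);
    assert_eq!(v.len(), 4);
    assert_eq!(i.len(), 6);
    // Every index must point at a real vertex.
    assert!(i.iter().all(|&ix| (ix as usize) < v.len()));
}
```

The winding order matters if back-face culling is on: if the quad is invisible from the player's side, reversing the index order is the usual first fix.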

Next was creating and storing an empty texture, in this case 1024x1024, in which I'll eventually be storing some text, probably in at least a 12 point font or larger. I'll need to play with the sizing to see how legible 12 point is. It should mean 12 point gives about 85 glyphs of roughly 5.6 cm each across the 4.8 m wall, which is way big enough and may need to be scaled down. There is also a function for creating a texture using an existing image, but that involves figuring out the mip levels I want to use and creating an image buffer before setting everything else up, so I left that for now.
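The sizing estimate works out like this (assuming roughly 12 px per glyph on the 1024 px texture, which is my own approximation of a 12 point glyph at this density):

```rust
// How many glyphs fit across the texture at a given pixel width per glyph.
fn glyphs_across(texture_px: u32, glyph_px: u32) -> u32 {
    texture_px / glyph_px
}

// Physical width of one glyph when the texture spans a wall of wall_m metres.
fn glyph_width_cm(wall_m: f64, glyphs: u32) -> f64 {
    wall_m * 100.0 / glyphs as f64
}

fn main() {
    let g = glyphs_across(1024, 12);
    assert_eq!(g, 85);
    let cm = glyph_width_cm(4.8, g);
    assert!((cm - 5.6).abs() < 0.1); // roughly 5.6 cm per glyph
}
```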

Packed the texture index into the msw, and confirmed through some debugging that my other walls had been loaded from glTF using the unlit workflow and a base color texture, which seemed appropriate for my color TV screen on the wall. Pushed the material into the materials buffer, created the required vector of primitives from the vertex buffer data to instantiate the mesh data object, and then added that mesh data as a mesh on the render context to add it to the resources of the renderer.

Finally, spawned that at the world origin with an appropriate dynamic bundle including global and local transforms for the mesh, and an indicator of visibility. After wrestling with it for about two hours or so, I finally fired up the apk in the headset to confirm: awesome, it doesn’t crash! And double awesome, one wall of the skybox is now almost entirely black, with space at the bottom and top of the cube to confirm the texturing of the entirely black window is working.

Now the next step is to create a staging buffer I can update with image data and use ab_glyph to draw text in this part of the screen, and update the texture periodically in vulkan, to see how fast this dynamic updating of a wall textured with writing and other symbols is.

12/08/2023: Rest day

There were no subliminals played overnight during sleep or this morning through to now 5.37pm. I may run a microloop of GLMC later and see how far I’m willing to stretch it. Intuitively I don’t feel going beyond 8 minutes just yet is advisable.

Action taking: other than reviewing my code from yesterday and considering how I'm going to map Unicode characters to glyphs, it's been mainly a day devoted to the spiritual, in line with what a Saturday ought to be. I watched part of a sermon of John MacArthur, and considered my insights so far from KB and the other subs. Continued researching topics related to MJJ and her work, as well as scientific aspects of our spirituality. No PMO or other artificial stimulation.

13/08/2023: Code implementation

Spent most of the day battling the borrow checker and getting incredibly frustrated. ab_glyph has FontRef with a lifetime, or FontVec with owned data which cannot be cloned, or FontArc which means slow access to the ref. Take my bloody pick. After battling to implement an Option<FontRef<'a>> or Option<Box<FontRef<'a>>> and getting all sorts of problems with the borrow checker, I tried FontVec, which didn't want to clone. Finally gave up in disgust and wrapped my font handling routines in a separate object, clone()d the font data read from the APK, implemented default(), made it an Option and got rid of the Clone trait on my State object, because stuff it, I only need one persistent state throughout the entire program, so why the hell did the original example try to implement Clone for a single state anyway?

Now I can at least assign the font object to my state without it sending the borrow checker's panties into a twist; there are no sub-references to state anywhere that will cause problems propagating the reference to function calls, and the program compiles.
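The shape of the workaround, with a std-only stand-in in place of ab_glyph's FontVec: own the cloned bytes inside one wrapper so nothing in State carries a lifetime, and implement Default so State can too.

```rust
// Stand-in wrapper: owns the font bytes cloned from the APK read.
// In the real code this would hold an ab_glyph FontVec instead.
#[derive(Default)]
struct Fonts {
    data: Option<Vec<u8>>,
}

impl Fonts {
    fn load(&mut self, apk_bytes: &[u8]) {
        // Clone once, own forever; no lifetime escapes into State.
        self.data = Some(apk_bytes.to_vec());
    }
    fn glyph_source(&self) -> Option<&[u8]> {
        self.data.as_deref()
    }
}

// State owns Fonts by value, so it needs neither a lifetime nor Clone.
#[derive(Default)]
struct State {
    fonts: Fonts,
}

fn main() {
    let mut state = State::default();
    state.fonts.load(b"fake-font-bytes");
    assert_eq!(state.fonts.glyph_source(), Some(&b"fake-font-bytes"[..]));
}
```

Trading one up-front clone of the font data for a lifetime-free State is the whole bargain here, and for a single persistent state it's clearly worth it.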

Next step I have to allocate a buffer somewhere to contain the image data to upload via the vulkan context and ensure I’ve used the right format for the draw calls in my font_texturing module.

So much BS just to get some text to appear on screen! I’m tempted to just keep hacking away at it tonight even though my eyes are telling me to take a break because heck, why should it be such a struggle?

So close to having the text drawing code working! I think my problem now is just a misunderstanding of what the API for ab_glyph is meant to do.

I uploaded the Vulkan image initially, and because I was using 16-bit values for my indices to iterate the array, at first the color gradient I created repeated several times across the top of the texture and went no further. When I set the index to y*4096 + x*4 as u32s and then convert to usize, and set the color to 255*x/1024 & 255 as u8, the nice red color gradient, non-gamma-corrected, spreads across the screen.
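My guess at why the gradient repeated across the top, assuming the indices really were 16-bit: with a 1024x1024 RGBA texture the row stride is 4096 bytes, so the byte index y*4096 + x*4 overflows 16 bits after just 16 rows and the writes wrap back to the start of the buffer.

```rust
// Byte index into a 1024-wide RGBA texture, with 16-bit arithmetic
// (wrapping, as the overflowing writes effectively did)...
fn index_u16(x: u16, y: u16) -> u16 {
    y.wrapping_mul(4096).wrapping_add(x.wrapping_mul(4))
}

// ...and with 32-bit arithmetic, which is what fixed it.
fn index_u32(x: u32, y: u32) -> u32 {
    y * 4096 + x * 4
}

fn main() {
    // Row 16 wraps back onto row 0 with 16-bit indices (16 * 4096 = 65536)...
    assert_eq!(index_u16(0, 16), index_u16(0, 0));
    // ...but not with 32-bit ones.
    assert_ne!(index_u32(0, 16), index_u32(0, 0));
    assert_eq!(index_u32(0, 16), 65536);
}
```

That matches the symptom exactly: the same 16 rows of gradient stamped over and over at the top, and nothing below them.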

I figured out that, regardless of the position used in with_scale_and_position, the x and y locations obtained were just within an array of coverage based on point size. So I added an offset based on x and y position plus char index plus a suitable y offset, and got… a series of rectangles displayed on the screen! Each roughly the size the corresponding character would be. I guess this is what outline means in outline_glyph. Duh :stuck_out_tongue:

So now I have to figure out how to get the actual rasterized character glyph to draw. And maybe just use outline_glyph to calculate the coverage of each character. Back to the lab again! Gah!
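My understanding so far, for the record: outline_glyph's draw callback hands back (x, y, coverage) relative to the glyph's own pixel bounds, so placing the glyph means adding a pen/bounds origin before writing into the texture. A std-only sketch with a closure standing in for the per-pixel callback (no ab_glyph dependency, so the callback semantics here are my assumption of the API):

```rust
// Write one glyph's coverage into a single-channel texture at `origin`.
// `coverage(x, y)` stands in for the per-pixel callback that the real
// rasterizer would invoke with glyph-relative coordinates.
fn blit_glyph(
    tex: &mut [u8],
    tex_w: usize,
    origin: (usize, usize),
    glyph_size: (usize, usize),
    coverage: impl Fn(usize, usize) -> f32,
) {
    for gy in 0..glyph_size.1 {
        for gx in 0..glyph_size.0 {
            // Glyph-relative (gx, gy) plus the placement origin.
            let idx = (origin.1 + gy) * tex_w + origin.0 + gx;
            tex[idx] = (coverage(gx, gy) * 255.0) as u8;
        }
    }
}

fn main() {
    let mut tex = vec![0u8; 8 * 8];
    // A fake 2x2 glyph that is fully "inked".
    blit_glyph(&mut tex, 8, (3, 2), (2, 2), |_, _| 1.0);
    assert_eq!(tex[2 * 8 + 3], 255); // glyph's top-left landed at (3, 2)
    assert_eq!(tex[0], 0);           // rest of the texture untouched
}
```

For the RGBA case the inner write becomes four bytes per pixel, blending the coverage against the background color instead of overwriting.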

OK, glyphs are working now… I just have to figure out alignment of the glyphs and harmonize them with the background color, and then add a system to redraw each frame with some dynamic text to see how it handles tearing.