Since @James mentioned making a response, and I've already shared my journey elsewhere on IG, I decided to focus on how I originally got into programming and how I think that relates to the question. I've probably mentioned this on another thread, but it bears repeating as one potential path of taking action.
Back in the late 1980s / early 1990s, our family, which had ties to the field of education, bought or came into possession of a Microbee: an old Z80-based microcomputer with 64K of RAM, 4K of which was used for a programmable character generator (PCG). Making graphics back then meant modifying the character set by poking bit patterns into memory at the appropriate addresses, then placing the modified characters into the screen memory that represented the display.
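To make that concrete, here is a toy Python model of the poke-a-character workflow. The addresses, cell size, and screen width are illustrative placeholders, not the real Microbee memory map:

```python
# Toy model of PCG-style character graphics (addresses and sizes are
# made up for illustration, not the actual Microbee memory map).
PCG_BASE = 0xF800            # hypothetical start of PCG RAM
SCREEN_BASE = 0xF000         # hypothetical start of screen RAM
memory = bytearray(0x10000)  # a flat 64K address space

def poke(addr, value):
    """Write one byte into the simulated address space."""
    memory[addr] = value & 0xFF

def define_char(char_code, rows):
    """Redefine a character: each row is one byte, one bit per pixel."""
    base = PCG_BASE + char_code * 16   # assume 16 bytes per character cell
    for i, row in enumerate(rows):
        poke(base + i, row)

def put_char(x, y, char_code, width=64):
    """Place a character code into screen memory at column x, row y."""
    poke(SCREEN_BASE + y * width + x, char_code)

# Redefine character 1 as a small diamond shape, then put it on screen.
define_char(1, [0x18, 0x3C, 0x7E, 0xFF, 0xFF, 0x7E, 0x3C, 0x18])
put_char(10, 5, 1)
```

The key idea is that there was no drawing API at all: graphics meant knowing which bytes lived where and writing them directly.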
There were a number of free games on 5.25" floppy disk back then, some of which were written in an early version of BASIC that provided the facility to poke bytes at specific addresses. So I would read through their source code, or step through it, to understand what they were doing.
Some games simply called routines written in Z80 assembly language, so I would use the debugger to disassemble the machine code and work out how it was doing something cool, like fading the screen to black pixel by pixel.
When the 386 later came out, this low-level experience of examining code and reading about machine internals carried over to learning MS-DOS and its various interrupts (INT 21h for DOS services, INT 10h for video, INT 9 for the keyboard, etc.). I learned how these interrupts worked as one of the lowest-level APIs available on the machine, and eventually how compilers were developed to translate source code into the machine code needed to build a working program. I cemented this knowledge by building programs in Borland's Turbo Pascal, the compiler of the day, which allowed inline assembly language within your Pascal code. For example, I built a text-based CD player that called the interrupts assigned to the CD-ROM interface to do things like skip to the next track, eject the disc, and so on.
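The "interrupts as an API" idea can be sketched as a dispatch table: an interrupt number selects a handler, and register values select the service. This is only a toy model in Python, not how a CPU actually vectors interrupts, though the AH=02h and AH=4Ch function numbers shown are real INT 21h services:

```python
# Toy model of DOS-style software interrupts: a vector table maps an
# interrupt number to a handler, and "register" values pick the service.

def int21h(regs):
    """Sketch of the DOS services handler: AH selects the function."""
    if regs["ah"] == 0x02:        # DOS function 02h: write character in DL
        print(chr(regs["dl"]), end="")
    elif regs["ah"] == 0x4C:      # DOS function 4Ch: terminate, exit code in AL
        regs["exit"] = regs["al"]

vector_table = {0x21: int21h}     # INT 21h -> DOS services

def do_int(number, regs):
    """Software interrupt: look up the vector and call its handler."""
    vector_table[number](regs)

regs = {"ah": 0x02, "al": 0, "dl": ord("A"), "exit": None}
do_int(0x21, regs)                # prints "A"
```

Calling an interrupt was essentially a function call by number: you loaded the registers, issued `INT 21h`, and the operating system did the rest.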
It wasn’t until late in my university life, and more so after uni, that I finally got into interpreted or loosely typed languages like Perl or Python. And these languages had their own idiosyncrasies and ways of doing things. For example, Perl had/has the *FOO{THING} glob syntax which, combined with symbolic references, lets you turn a text string containing a function's name into a reference to the function itself. And so I could create configuration files where different pluggable modules could be used for different purposes, all driven by the same underlying calling code.
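A rough Python analog of that plugin pattern (the `"module:function"` spec format here is my own convention for the sketch, not a standard): resolve a name read from configuration into a callable, so one dispatch path serves any backend.

```python
# Python analog of the Perl name-to-function trick: turn a string from
# a config file into the function object it names.
import importlib

def resolve(spec):
    """Turn a 'module:function' string into the function itself."""
    module_name, func_name = spec.split(":")
    module = importlib.import_module(module_name)
    return getattr(module, func_name)

# A config file might declare the serializer backend as "json:dumps";
# the calling code stays identical whichever backend is configured.
serialize = resolve("json:dumps")
print(serialize({"plugin": "json"}))
```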
I maintain that that spirit of investigating the internals of the machine, which is what used to be known as hacking, is what allowed me to become good at development. It certainly helped when approaching typed languages that deal with references and memory management, like C and Rust.
You have to develop an innate curiosity about how the whole thing is put together, to a greater or lesser extent. And that’s what will keep you learning.
Back in the early days of computing, before it became all about C# this and Unity that, or this API vs that API, or language against language (which is all rubbish), this spirit of curiosity and pushing the limits was much stronger among the people working in the field. I would download and run executables people had written in assembly language, something few people would do nowadays because of the potential for viruses, where someone had written code to rotate and Gouraud-shade a 3D model, or melt someone's face, etc., all within the limitations of a DOS protected-mode application and without any language other than assembly. These were known as demos, and the demo scene was where the talent cut their teeth.
I would say: it’s easy to write code if all you’re doing is using other people's libraries and doing plug-and-play type stuff or learning an API. And this is the sort of work that they’re trying to make obsolete with AI. BUT: the real genius of coding is in being able to push the limitations of the hardware you’re running on, and I very sincerely doubt that is ever going to fall to AI, as it requires someone who understands the internals at a very deep level. Have the curiosity to go beyond the languages that hold your hand and get into FFIs, memory management, and so on. Because you’ll learn techniques there that you won’t learn anywhere else.
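As one small taste of what "getting into FFIs" looks like, here is a minimal `ctypes` sketch, assuming a Unix-like system where the C library can be located. It calls libc's `strlen` directly, which means you have to care about the C signature yourself:

```python
# Minimal FFI sketch: calling the C library's strlen from Python.
# Assumes a Unix-like system where ctypes can locate libc.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature so ctypes marshals the types correctly:
#   size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

length = libc.strlen(b"hello, world")
print(length)  # 12
```

Getting the `argtypes`/`restype` declarations wrong corrupts data or crashes the process, which is exactly the kind of lesson about memory and calling conventions that hand-holding languages never force you to learn.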