Great, I appreciate what you are doing, and I now have a better understanding of what you are trying to say.

Linux is used as such a broad term these days (by all of us) that it's hard to talk about it as one thing. Originally, and technically still, Linux was an open-source, Unix-like kernel, the core of an operating system rather than a complete one.
Nowadays, what most people mean by Linux is that kernel plus accessories, languages, desktop environments, and application programs rolled into named operating systems: Debian, Ubuntu, Mint, and Red Hat are among the best known, along with many others such as Arch, Slackware, Puppy, and the one I now use, EasyOS.
About the only thing one can say they have in common is the Linux kernel, which is not a desktop environment or even a set of terminal commands, but the core software that manages hardware, memory, processes, and files. Because Linux is modeled on Unix, it is a multi-user system oriented toward institutional infrastructure, with many lower-privileged users and groups, an extensive permissions structure, and a separate class of administrators whom we would normally associate with an IT department.
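To make that permissions structure concrete (a minimal sketch; the file name here is just an example): every file carries an owner, a group, and three separate sets of read/write/execute permissions, one each for the owner, the group, and everyone else.

```shell
# Create a file and restrict it: the owner can read and write,
# members of the file's group can only read, everyone else gets nothing.
touch notes.txt
chmod 640 notes.txt

# List it in long format; the leading "-rw-r-----" shows the three
# permission sets, followed by the owner and group names.
ls -l notes.txt
```

Only the file's owner or an administrator (root) can change these permissions, which is exactly the split between ordinary users and the IT-department class of administrators described above.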
What has developed since then are many different ideas about what a graphical personal-computer operating system should look like, how it should behave, what applications should be included out-of-the-box, and what optional programs it should be capable of adding (i.e., what program repositories it should be able to access).
All of the myriad programs and code supporting this vast set of systems and possibilities were developed by individuals and groups volunteering their time to A.) reverse-engineer computers and hardware to write drivers for them (since almost no hardware manufacturers provided drivers for Linux systems), and B.) create systems where blocks of code could be reused and recombined into ever more sophisticated applications. To enable this kind of reuse, the concepts of dependencies and open-source licensing were essential.
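To give the dependency idea a concrete face (a sketch, assuming a typical GNU/Linux system): almost every program reuses code from shared libraries rather than carrying its own copy, and `ldd` will list which libraries a given binary depends on.

```shell
# Show the shared libraries (dependencies) that the "ls" command is
# linked against; nearly every program depends on the C library (libc).
ldd "$(command -v ls)"
```

This is the same reuse principle, scaled up: a distribution's package manager tracks these dependencies between whole packages so that installing one program automatically pulls in the building blocks it needs.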