I've been working on a graphical writing system that I call Maegrashoda, and on what the possible phonetics of the language could be. Admittedly, the phonetics won't work all that well: it's a Derrida-inspired sign language and so isn't phonocentric. Because the language begins with signs and symbols rather than phonemes, the symbol set is strictly larger than the set of possible tongue movements, and symbols that are often used in conjunction might require the mouth to move in ways that are impossible so close together. The precedence of the symbol set is what makes the phonemes unworkable. All that being said, here are the symbols.
The idea here is to make a graphical language that is as simple as possible, so that people can write in it and communicate with each other regardless of what their own written language is. Perhaps that's not all that important given that there are multiple free online programs that will do language conversion - rather, the point is to see if I can find a symbolic root of language. This could be useful not only in programming languages but also in artificial intelligence, where you might be able to teach a computer to convert visual images into a symbolic language that could also be understood by human beings. Seeing as children often learn through the visual field, this could be useful in - say - converting large amounts of driverless-car data into symbols that could then be used to teach an artificial intelligence how to match symbols together to form sentences, rather than being a mere stochastic parrot. However, I'm getting ahead of myself.
The five types of symbols correspond to grammar, subjects, verbs, adjectives, and objects. I make the argument that these are the only five types of symbols that you need to make a language. If I had backend capability (I don't at the moment), I would make a program where a user could define their own symbols; the language is generative, so you define words as you write and then reuse commonly used characters. Since I don't, what I've been doing is making the symbols in pixel art and then saving them, which is unfortunately a bit time consuming. As a symbol space, a 50x50 pixel-art image allows 2^2500 possible combinations of symbols (fewer that are recognizably different to a person), so even a small symbol is useful. QR codes have been a thing for a while, after all. Again, the point here is to create a language that is understood by both a computer and a human.
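As a quick back-of-the-envelope sketch (nothing here is part of the language itself, just the arithmetic behind the claim above):

```python
# Each pixel in a black-and-white grid is independently on or off,
# so an n-by-n canvas has 2**(n*n) distinct raw images.
def raw_symbol_space(side: int) -> int:
    return 2 ** (side * side)

print(raw_symbol_space(50))            # 2**2500
print(len(str(raw_symbol_space(50))))  # about 753 decimal digits
```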
I'll save the philosophical basis for another section - it's related to phenomenology and what I think about the world more generally - and focus on this page on the system itself, which I'll be adding to a bit every day until I have the entirety of the language. It's not large; it's just a bit of a pain in the ass to get all the symbols uploaded and written out. There are a couple of interesting examples as well, including the definitions of truth, law, chaos, and pattern, the solution to Bertrand Russell's paradox, two cartouches, and a couple of other examples. It uses set theory, which I'm rather fond of, as well as recursion. The grammar is similar to a LISP with generics.
The first set of characters is grammar (the "Ma"). These characters are grammatical rules that should be familiar to anyone who has some experience with computer science. A couple of examples should make this clearer if it isn't already. In addition to logical operators, there are a couple of characters that are used to define new characters, either new grammatical rules themselves or new characters of the other four types. It should be understood that every symbol is a set. This should make sense and is one of the foundational principles of the language. The reason is that every word can itself be broken down into a set of characteristics that are included in its definition, until you get to complete semantics, or definitions of what "is" is (which leads to Being, part of the philosophical and phenomenological context that I won't go into here). For example, "cat" would include the characteristics "four-legged, furry, has two eyes" and so on. (I ignore the cynical argument that you can shave a cat, mostly because "shaved cat" is a subset of cat itself - what matters is not so much what exists as what we think of when we think of the word. We are thinking in terms of subjective thoughtforms as opposed to objective reality; I make the argument that objective reality is a subset of the intersection of subjective thoughtforms in the philosophy section, which as of this writing is yet to be written.)
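To make the "every symbol is a set" idea concrete, here is a minimal sketch in Python (the characteristic names are placeholders of mine, not characters of the language):

```python
# A word is modelled as a set of characteristics; each characteristic could
# itself be expanded into a further set, recursively, toward raw phenomenology.
cat = frozenset({"four-legged", "furry", "has two eyes"})
shaved_cat = cat - {"furry"}  # one reading of the "shaved cat" aside: drop a characteristic

print(shaved_cat <= cat)  # True: its characteristics are a subset of cat's
print("furry" in cat)     # True: set membership is how a definition is queried
```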
Here are the grammar rules.
- the "and" operator
- the "or" operator
- variable operator (when defining grammar)
- definition operator (for non-grammatical characters)
- variable operator (when defining non-grammar)
- such that
- sufficiently close to such that two variables are similar
- the "not" operator
- a generic operator meaning "a set"
- implies
- contained in (another set)
- item contained (in this set)
- definition operator (for grammatical characters)
Broadly speaking, those are the only grammatical rules that you need, which is why the language is so powerful. Its simplicity means that you can add new rules to define new grammar if you need to, but that isn't a requirement for the language to make sense. Yes, binary is most likely the simplest of all possible languages; I believe that for human language this may be the simplest that can be read and understood. And it has some useful definitions and grammatical constructions that come out of it as a result. Most of the operations have their basis in set theory as the mathematical foundation for writing the language, whereas the structure is defined as a list similar to LISP syntax. Somewhere in the opening chapter of Heidegger's Being and Time (which I disagree with in almost its entirety, save the bit about phenomenology) he wishes for an ideal philosophical language to express phenomenology itself, and I believe this is currently the closest thing to it. The "ideal language" has been something philosophers have been looking for since philosophy began, so I'm not picking on Heidegger in particular.
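A minimal sketch of how this small operator inventory might be encoded (the enum names are my own shorthand for the thirteen rules listed above, not official names for the glyphs):

```python
from enum import Enum, auto

class Ma(Enum):
    AND = auto()
    OR = auto()
    VAR_GRAMMAR = auto()     # variable operator, when defining grammar
    DEFINE = auto()          # definition operator for non-grammatical characters
    VAR = auto()             # variable operator, when defining non-grammar
    SUCH_THAT = auto()
    SIMILAR = auto()         # sufficiently close to, such that two variables are similar
    NOT = auto()
    A_SET = auto()           # generic operator meaning "a set"
    IMPLIES = auto()
    CONTAINED_IN = auto()    # contained in another set
    CONTAINS = auto()        # item contained in this set
    DEFINE_GRAMMAR = auto()  # definition operator for grammatical characters

# Expressions are LISP-style nested lists: operator first, arguments after.
example = [Ma.IMPLIES, [Ma.A_SET, "set 1"], [Ma.A_SET, "set 2"]]
```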
I should mention that rather than use p5.js it might be easier to make this in pure CSS - not to mention quicker on page load and transfer between HTML files. All sorts of things. I could probably make two other widgets that would be useful (I may or may not). One widget would just let you draw each of the shapes on a canvas and then save it, rather than asking the Bing AI to draw the shape for me. The other would have a character recognizer look at what you had drawn and then suggest shapes for you to pick. The latter sounds complicated, but what you would do is have a submatrix go through and determine whether there are more black squares or white squares. So if you have a 60 by 60 canvas, the submatrix would be 10 pixels, then you would have one that is 5 pixels, then one that is one pixel. Then you measure the percentage similarity between the canvas the user drew on and the idealized canvases of all the other characters. I know how to do it; I just don't know if I want to spend my time on the website making widgets or explaining my language and adding content, which is more useful. So if I have some time later and am bored I'll come back to it.
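Here is a rough sketch of that matching idea, assuming the canvas is a 60x60 grid of 0/1 values and the idealized characters are stored the same way (the function names and the averaging scheme are my own assumptions, not a spec):

```python
def block_densities(grid, block):
    """Fraction of filled pixels in each block-by-block tile of a square 0/1 grid."""
    n = len(grid)
    densities = []
    for r in range(0, n, block):
        for c in range(0, n, block):
            cells = [grid[r + i][c + j] for i in range(block) for j in range(block)]
            densities.append(sum(cells) / len(cells))
    return densities

def similarity(drawn, ideal, blocks=(10, 5, 1)):
    """Percentage similarity across coarse-to-fine block sizes."""
    scores = []
    for b in blocks:
        d, i = block_densities(drawn, b), block_densities(ideal, b)
        scores.append(1 - sum(abs(x - y) for x, y in zip(d, i)) / len(d))
    return 100 * sum(scores) / len(scores)

def suggest(drawn, idealized):
    """Rank the known characters (a dict of name -> grid) by similarity to the drawing."""
    return sorted(idealized, key=lambda name: similarity(drawn, idealized[name]), reverse=True)
```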
So this would be the Maegrashoda as tiles. It looks much cleaner and easier to work with, and using AI to generate the files makes it quite a bit simpler as well. In an ideal world there would be some way to just draw a file and have an AI intuit what you were drawing, but through iteration you can get the AI to make the tiles close enough.
Let's take a look at the definition of the first word, "law", and you can see how the language works and how powerful it is. The word is defined strictly with grammar, which corresponds roughly to what I believe would be its platonic idealized form, rather than the shadows in the cave. That this could be said of any word at all, given the nebulousness of language, is fascinating. Here I rely on intuition and proof by contradiction: I invite my audience to come up with a contradictory result that is more true, on the basis of phenomenological feeling. I don't believe this is possible.
This looks confusing at first, but it's structured simply, like a LISP, and has similarities with proof-based languages such as Coq. It should be noted that all the definition variables in this instance are pure (i.e. they have a dot in the variable name and the definition symbol), meaning that the word is composed only of grammatical sets as opposed to non-grammatical sets. That this word "law" can be defined this way means that it exists in a platonic idealized space, which further means that there are words that do and words that do not - which is itself quite interesting. If we write this out line by line and then explain the structure of the whole, it will all of a sudden become intuitively obvious in a way that is quite pleasing. The symbols are offset with [] below; a nested-list transcription follows the breakdown.
1) [Law] [(within a set)] [is defined to be] [that variable] [such that]
2) [[set 1] and [set 2]] [within] [the set of discussion] [are such that]
3) [[set 1] [implies] [set 2]] [is such that]
4) [there exists] [a set 3] [such that]
5) [set 1] [implies] [a law on set 3] [which implies] [set 2]
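Here is that same five-line definition transcribed as one nested list, reusing shorthand operator names from the earlier sketch; the variable names are placeholders and "THERE_EXISTS" stands in for the symbol combination on line 4:

```python
# law(U) := that variable such that, for any set1 and set2 within U where
# set1 implies set2, there exists a set3 with set1 -> law(set3) -> set2.
law = ["DEFINE", "law", ["A_SET", "U"], "SUCH_THAT",
       [[["A_SET", "set1"], "AND", ["A_SET", "set2"]], "CONTAINED_IN", "U", "SUCH_THAT",
        [["set1", "IMPLIES", "set2"], "SUCH_THAT",
         ["THERE_EXISTS", ["A_SET", "set3"], "SUCH_THAT",
          ["set1", "IMPLIES", ["law", "set3"], "IMPLIES", "set2"]]]]]
```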
So let's take an example. The law of gravity is such that if "I throw the ball" and "it lands over there", there must be an intervening law called "gravity" that sits between the implication of the first two elements within the set. What is interesting here is that this works with recursion! And the definition of law is recursion-agnostic, meaning we haven't specified to what depth we are assuming the recursion to be true. In other words, this new word "gravity" is merely a placeholder for something we don't know, and so within the gravity set there is "gravity 1" implies "gravity 2" and so on. And so we begin to talk about the force of each particle of the ball attracting each particle of the ground and so forth. (This assumes we are saying that the "only" cause of the ball landing where it does is gravity and not the collection of forces - such as wind resistance - that affect the ball; the sets are also agnostic to breadth in this case, and we are taking an idealized instance.)
This doesn't just work for scientific law, but for social law as well (such as what "law" means in the context of civil or criminal law). In the case of law in the United States, to make sure that laws don't conflict, the courts use precedent so that no two laws end up with implications that conflict at some level of depth. In this instance the recursion would stop at the level of depth that a jury could understand. This generally, but not always, takes the form of "if this law is true then it applies to this case, but the implication also implies that this other thing must be true, by which it would conflict with that law". What is important to understand when looking at what I am doing is that each word is itself a set and therefore contains a collection of items which can be expanded as a list. So every word (or in this case thoughtform, as it's a broader collection of sensations that includes words) is expandable to the depth of phenomenology. Whereas in the case of gravity we are asking "what does gravity *mean*", and therefore it's the inner term of the last line that expands, in the case of the law we are assuming that there is expansion behind the new law we are describing, which might conflict with precedent. Here, precedent must be such that there is no conflict of implication down to the depth that someone can hold the law in their mind.
What is so interesting is that these areas are so widely separated from each other that they do not seem to have anything to do with one another in the slightest. And yet the idea, or "thoughtform", of "law" takes precedence over both the governance and the scientific definitions in a set-theoretic, categorical way. Since philosophy is so often concerned with what takes precedence in order to arrive at true statements, set theory should be used, as that is in some sense what set theory as a mathematical device is designed to do. So not only does language contain conceptual fields of inquiry, but to the extent that set theory contains language itself, the definitions of words that don't have standard spoken equivalents may themselves be important in understanding how people think and how reality itself is structured.
When I originally worked on my philosophy, I thought that the "money problem" would be the most interesting introduction, but I believe it created an incredibly large misunderstanding of my position (and so should be written about elsewhere). Overall this is an introduction to the importance of Maegrashoda, but it's meant primarily as an introduction to my philosophy as such. In any event, my philosophy isn't important in the sense that I want followers or anyone to believe in it, so to the extent that those are things you need, it can safely be ignored. Moving on...
Here's how I define truth. We are faced with an issue, as defining truth is rather philosophically contentious. Is belief in God true? Is it as true as what time your train arrives, or that your shoelaces are untied? The entirety of my philosophical framework is based around phenomenology, which makes things like epistemology difficult, as it starts from the belief that all of the world is described in such a way that objective reality is the intersection of all subjective reality. Further, it is all based on sets. So when I talk about truth, what I am talking about is the intersection of things which are true within a set. Not only do we have to understand what is and isn't true, we have to do so in a way that is both necessary and sufficient, so that we aren't leaving holes in either direction. What makes this even worse is that we are assuming things like conditional statements as axiomatic, which I don't have a solution for in terms of justification (and how do we qualify what justification is?).

I make the following claim, which I believe to be true (again - this may be bootstrapping). A set is considered true if all subsets within the set are true, the set doesn't include the converse of any of its subsets (so A and not-A cannot both be included in the same set), and - so that the set isn't just a set of "facts" - the transitive property holds: namely that A implies B and B implies C implies that A implies C. If all of those conditions are met then the set under consideration is a true set, and if they don't hold then it isn't. In this way religious beliefs can in a sense be true, in that they must be morally or ecstatically consistent (we are considering thoughtforms which at their base also include emotion, which may not be transitively true in many cases), while on the other hand you can have more scientific truths. In this way you could also have a system of categorizing the differences between sets on the basis of how this definition of truth differs between them, and then see if you can make connections between fields of inquiry based on the difference in their thoughtform sets.

So by defining this and thinking about what we mean by the word truth, we are again approaching a larger understanding of what reality as such is composed of. The only way I know of to prove this wrong is to come up with counterexamples in a proof by contradiction (which, given the long history of the search for truth, may be easy), and then come up with a solution that is more elegant. The point here is that using sets makes the search for truth, in a way that encompasses language, rather straightforward.
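As a minimal sketch of that three-part test (the representation is mine: atomic claims as strings, their converses marked with a "not " prefix, and implications as ordered pairs; the recursive "all subsets are true" condition is taken as given for atomic claims):

```python
from itertools import product

def is_true_set(claims, implications):
    """claims: a set of strings; "not X" is the converse of "X".
    implications: a set of (a, b) pairs meaning "a implies b"."""
    # A claim and its converse cannot both be in a true set.
    for c in claims:
        if "not " + c in claims:
            return False
    # Transitivity must hold: a -> b and b -> c require a -> c in the set.
    for (a, b), (c, d) in product(implications, repeat=2):
        if b == c and (a, d) not in implications:
            return False
    return True

print(is_true_set({"it rains", "the ground is wet"},
                  {("it rains", "the ground is wet")}))    # True
print(is_true_set({"it rains", "not it rains"}, set()))    # False: contains a converse
```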
In the spirit of getting shit done as opposed to playing around with programming, I'm just going to photocopy a bunch of notes and add them as pictures to the website. I'll make them look pretty later.
First, a small detour. Here is the verb "to make", where we see two things going on. First, and most importantly, we need to redefine our dot and non-dot operators for definitions of variables, and the variable itself, to be recursive in the case of dots and non-recursive in the case of non-dots, as there are recursive and non-recursive definitions. This is a mistake we were making in the original usage (and a pun on "make"!). I also give an example of using "make" to say "I have made a mistake", which violates what would be considered logical but is understood in context by using a sentence (here there is no definition operator, so we know we are using a sentence). There is an interplay between understanding things in context and understanding them through definitions. Typically you would use symbols that were already known in the language and then add definitions as necessary. So "I have made a mistake" is transcribed literally as "I have been doing the making such that the making itself doesn't result in the making of the thing which it is the function of the making to make" (the last being a contradiction of the word "making" itself, but one which is understood in context). If you needed a definition of "mistake" that didn't rely on context you could make it stricter by defining it, but again, since all words are ambiguous until you start defining what things are as absolute reality (at the level of raw sensation into inputs and outputs, as we are expressing language as being based on phenomenology across experience and time), context is important. In essence the depth of recursion is functionally infinite, and you define until you get to a basic understanding of the intersection of phenomenal reality.
If you look at the noun "food", we have some differences. First, as it's a noun it's not a function operator, and so it doesn't have brackets around a set over which it's being defined. Second, we define food based on the action to which it corresponds: food is that thing which it is possible to eat. So there are a number of ways to arrange the symbols so that they make sense, and you can have multiple definition operators in the same phrase. This is especially useful if you are making a sentence such as "John goes to the _something_, where _something_ is defined as that building where people go to read books" - you then have both that John is going to the library and the definition of the library at once.
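As a sketch of how a definition can ride along inside a sentence (again nested lists with placeholder strings of mine; the real version would use tiles and a newly introduced character rather than the word "library"):

```python
# "Food" is defined through the action it corresponds to: that which it is possible to eat.
food = ["DEFINE", "food", ["A_SET", "x", "SUCH_THAT", ["possible", ["eat", "x"]]]]

# One sentence that both uses the new character and defines it inline:
# "John goes to the X, where X is that building where people go to read books."
sentence = ["John", "goes to",
            ["DEFINE", "library",
             ["A_SET", "building", "SUCH_THAT",
              ["people", "go there", "SUCH_THAT", ["people", "read", "books"]]]]]
```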
So here we have what truth is, and we are back to using the recursive operator: any true set must itself be composed of true sets, must not be composed of the recursion of non-true sets, and the transitive property must hold. Here it is important to reiterate that this is again a functional operator on a set, so you can have multiple sets for which this property holds, but a "true set as it applies to sociology" is not a "true set as it applies to physics", while at the same time the rules for what constitutes truth remain consistent. There is also some similarity between truth and law, in how the recursive property is defined. In point of fact, truth and law are most likely equivalent at some specific depth of recursion, but they approach that recursion in different ways, meaning that they are fields that intersect but not necessarily at the same speed. This is incredibly interesting, and I have no idea how it would be proven mathematically or pragmatically, but it seems to hold.
Here we have pattern and chaos (which I have cheekily drawn as a halo and devil's horns, though they also read as a circle and a broken circle - in this way the patterns we make with the language can be suggestive of cultural artifacts, in much the same way that there are (mathematical) collisions between spoken words where "words have two meanings"). Here we have an interesting phenomenon, which will become important later when I talk about philosophy: not only are there words that are recursive versus non-recursive within language, there are words that are time-dependent and non-time-dependent. Which means that if we start breaking words up into categories of mathematical meaning, we start to see that there are fields of meaning. These are the only two major fields that I have seen so far, but it would be interesting to find what other fields there are and whether there can be said to be a finite number of such fields (which I believe there are, at least pragmatically - but mathematically?). We also have an intuition that the words themselves preserve the predicate rules for negation: for pattern "there exists at least one example of implication", whereas for chaos "there exists strictly none", and in predicate logic the negation of "there exists an instance" is "there does not exist an instance" (equivalently, "for all instances, not"). And so "pattern is equivalent to (not chaos)" and vice versa, which strengthens the result.
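A minimal sketch of that duality, treating a thoughtform set as just its collection of implications (an encoding of mine, not the tiles themselves):

```python
def pattern(implications) -> bool:
    """There exists at least one implication within the set."""
    return len(implications) > 0

def chaos(implications) -> bool:
    """There exists strictly no implication within the set."""
    return len(implications) == 0

# The predicate-negation relationship: pattern(S) is equivalent to not chaos(S).
for s in [set(), {("a", "b")}, {("a", "b"), ("b", "c")}]:
    assert pattern(s) == (not chaos(s))
```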
And I'm going to take a break on this part of what I'm working on at the moment and start adding other bits to the website. I've demonstrated how you can start using the language to come up with context-dependent sentences that are understandable, and I've demonstrated some important results for words that can be said to be "platonically correct", meaning that they are the best known philosophical value for what the word itself means - namely truth, law, pattern, and chaos - which I believe is quite impressive, validates much of what would be considered "philosophically true" in the strictest meaning of the term, and adds validity to the idea that this language encompasses other languages in its a priori-ness. Tomorrow I'll be adding some "cartouches" and talking about what that means, in addition to coming up with some more pragmatic examples of the usage of the language as well as some of its drawbacks.