Perception Theory (Perception is Everything) — Three Applications
In the presentation of a theory of human existence, Perception is Everything [Jan., 2016], it was suggested the theory could be applied to almost every aspect of human experience. The model paints the picture of the objective/subjective duality of human existence as the interactive dual flow (or flux) of real-world, empirical, and veridical data bombarding our senses and of imaginative, conceptual, and non-veridical data generated by our mind, all encased within the organ we call the brain. The two sides of the duality need not be at odds, and both sides are necessary; the objective and the subjective are in a symbiotic relationship that has evolved out of this necessity; what and who we are simultaneously exist because of this symbiosis that dwells in the head of every human individual. No two humans are alike because no two symbioses in two brains are alike.
This post briefly demonstrates how the perception model of Perception is Everything [Jan., 2016] can be used to contribute insights into I. Development of Self-Consciousness in a Human Infant, II. Education, and III. The Origin of Politics.
I. Development of Self-Consciousness in a Human Infant — That the human mind has the ability to develop a concept of “self,” as opposed to “others,” is commonly seen as fundamentally human. It might not be unique to our species, however, as we cannot perceive the world as individuals of other species do. Often pet owners are convinced their dog or cat behaves as if it is aware of its own individuality. But that might be just too much anthropomorphism cast toward Rover or Garfield by their loving owners. So fundamental is our self-consciousness, most views would assert its development must commence just after birth, and my perception theory is no exception.
The human baby is born with its “nature” genetically dealt by the parents and altered by the “nurture” of the quality of its gestation within the mother’s womb (or within the “test tube” early on, or within the artificial womb of the future). The world display screen in the head of the baby (Perception is Everything [Jan., 2016]) has to be primitive at birth, limited to whatever could bombard it veridically and non-veridically while in the womb. (Can a baby sense empirical data? Can a baby dream? Are the reflex movements of the fetus, which the mother can feel before birth, recorded in the memory of the fetus?) Regardless of any answers to these questions, perception theory would describe the first moments after the cutting of the umbilical cord as the beginning of a “piece of star-stuff contemplating star-stuff all around it” (Perception is Everything [Jan., 2016]). The event causing the baby to take its first breath begins the lifelong empirical veridical flux entering one “side” of the baby’s world display screen, triggering from the other “side” of the screen an imaginative non-veridical flux in response. The dual flux has begun; the baby is “alive” as an individual, independent of the symbiosis with its mother’s body; its life as a distinct person has begun.
The unique “long childhood” of Homo sapiens (due to the size-of-the-birth-canal/size-of-the-baby’s-skull-after-9-months’-gestation consideration), the longest “childhood” of any species before the offspring can “make it on its own” — a childhood necessarily elongated, else we would not be here as a species today — assures the world display screen is so primitive that the first few days, weeks, and months of each of us are never remembered as our memory develops on the non-veridical side of the screen. It takes a while for memory generated from the empirical veridical flux to be able to create a counter flow of imaginative non-veridical flux back to the screen. Perception is Everything [Jan., 2016] indicates the dual flow is necessary for the screen to become “busy” enough to be noticed by the “mind’s eye,” that within us that “observes” the screen. No doubt all of us first had our screens filled by perceptions of the faces of caretakers (usually dominated by our mother’s face) and by sensations of sound, touch, smell, and taste as our bodies adapted to the cycles of eating, eliminating, and sleeping. During waking hours in which we were doing none of these, we began to focus on the inputs of our senses. Inevitably we process these inputs non-veridically and become aware of them; just as inevitably we at some point become aware of a “perceiver,” an observer of these inputs; we form the idea that “something” is perceiving, that this “something” relates to our caretaker(s) (whose face(s) we always feel good seeing), and that this “something” is us. In each individual, the development of a subjective “I” is normally “there” in the head within a few months (the exact time interval probably differing for each individual); a distinction between “me” and “not-me” begins. This distinction is self-consciousness in-the-making, or “proto-self-consciousness.”
That distinction between “me” and “not-me” is vital and fundamental for each piece of star-stuff beginning to contemplate his or her “fellow” star-stuff — contemplation that is constantly painting an increasingly complex world display screen inside his or her head. Early on, anything that “disappears” when eyes are closed is “not-me;” anything that is hungry, that likes things put into a hole below the eyes to quench that hunger, that experiences discomfort periodically way below the eyes, and that feels tactile sensations from different locales in the immediate vicinity (through the skin covering all the body, as well as through the “hole below,” the mouth) is “me.” Eventually, “me” is refined further to include those strange appendages that can be moved at will (early volition) and put into the hunger hole below the eyes, two of which are easy to put in (hands and fingers) and two of which are harder to put in (feet and toes). That face that seems to exist to make “me” feel better and even happy turns out to be part of “not-me,” and it becomes apparent that much of “not-me” does not necessarily make “me” feel better, but is interesting nonetheless. Reality is being sorted out in the young brain into that which is sorted and that which sorts, the latter of which is the “mind’s eye,” self-consciousness.
In time, “me” can move at will, and that which can move thus is the “housing” and boundary limiting “me.” As soon as the faces “me” can recognize are perceived to represent other “me’s,” the distinction between “me” and “you” begins, soon followed by “me,” “you,” and “them.” Some “you’s” and “them’s” don’t look like other “you’s” and “them’s,” such as household pets. Still other “you’s” and “them’s” don’t move on their own like “me” (soon to be “I”) does, such as dolls and stuffed animals. “You’s” and “them’s” separate into two categories — “alive” and “not-alive.” As quantity becomes a more developed concept, it soon becomes apparent that outside “me” there are more “not-alives” than “alives;” “not-alives” soon are called “things,” and “alives” take on unique identities as “me” learns to recognize and later speak names. Things are also non-veridically given names, and the genetic ability to quickly learn language “kicks in,” as does the genetic ability to count and learn math. In a few months’ time, existence for “me” has become both complex and fixating to its mind/brain, and is growing at an increasing rate (accelerated growth). The name non-veridically given to “me” is the subjective “I” or the objective “myself” — both of which are understood to be self-consciousness.
This clearly is an approach similar to a psychology of infants, which might deal eventually with the development of the ego and the id. The approach using perception theory allows a seamless tracing of the development of the human mind back before birth, employing a more objective way of talking about subjectivity than that possessed by some other psychological approaches; it is an approach based upon evolutionary psychology. In addition, it is clear that the emergence of self-consciousness according to perception theory demands a singular definition of the “self” or of “I” or of “myself,” in order to avoid the problems of schizophrenia and its multiple personalities. Perhaps the widespread phenomenon of children making up “imaginary friends” is an evolved coping mechanism in the individual child’s imagination in order to avoid schizophrenia; an imaginary friend is not the same as the self-consciousness producing such a “friend.” Just like the individual brain, self-consciousness is singularly unique, in ontological resonance with the brain.
II. Education — Perception theory is compatible with the idea of what education should be. Education is not a business turning students into future consumers; education is not a sports team turning students into participants; education is not training to turn students into operators of everything from computer keyboards to spaceship control panels. Instead, education is but the development of students’ minds (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]). The word “but” here is somewhat misleading, as it suggests that education might be simple. However, education is so complex that as yet we have no science of education (#1 on the “List” in Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]). Perception theory indicates why education is so complex as to defy definition and “sorting out.” Defining education is like the brain trying to define its own development, or like a piece of star-stuff trying to self-analyze and contemplate itself instead of the universe outside itself. At this writing, I am inclined to say that a more definitive sorting out of what education is and how it is accomplished inside individual brains is not impossible in the way that an individual’s seeing his/her own brain activity is impossible, or in the way that another person’s seeing my subjective world display screen in my head is impossible (the “subjective trap”) (Perception is Everything [Jan., 2016]).
Following this optimistic inclination, education is seen as developing in individual brain/minds a continuous and stable dual flow of veridical flux and non-veridical flux upon the individual’s world display screen (Perception is Everything [Jan., 2016]). A “balance” of this dual flow is seen in Perception is Everything [Jan., 2016] as a desired “mid-point” of a spectrum of sanity, the two ends of which denote extreme cases of veridical insanity and non-veridical insanity. Therefore, the goal of education is to make the probability of becoming unbalanced away from this mid-point, in either direction, as small as possible; in other words, education attempts, ideally, to concentrate and focus the student’s non-veridical mind upon the veridical as much as possible. The non-veridical vigor of “figuring out” the veridical from “out there” outside the brain is matched by the vigor of the empirical bombardment of that same veridical daily data. Another way to state this goal of education is making this focus a life-long habit — making this focus a comfortable, “natural,” and “fun” thing for the non-veridical mind to do for all time. Defining education in this manner seems compatible and resonant with the way our mind/brain seems to be constructed (with the necessary duality of the objective and the subjective); our mind/brains seem evolved to be comfortable at the mid-point without struggling to get or stay there; self-educated individuals are those fortunate enough to have discovered this comfort mostly on their own; graduates of educational institutions who become life-long scholars have been guided by teachers and other “educators” to develop this “comfort zone” in their heads. Education, in this sense, is seen as behaving compatibly with the structure of the brain/mind that has assured our survival over our evolution as a species.
In order to successfully, comfortably, and delightfully spend our individual spans of time in accordance with the evolution of our mind/brains, we must live a mental life of balance of the two fluxes; education, properly defined and thought upon in individual mind/brains, assures this balance, and therefore assures lives of success, comfort, and delight. He/she who is so educated uses his/her head “in step” with the evolution of that head.
We evolved not to be religious, political, or artistic; we evolved to be in awe of the universe, not to be in awe of the gods, our leaders, or our creations. We evolved not to be godly, patriotic, or impressive; we evolved to survive so that our progeny can also survive. Religion, politics, and the arts are products of our cultural evolution, invented by our non-veridical minds to cope with surviving in our historical past. In my opinion, these aspects of human culture do not assure the balance of the two fluxes that maximizes the probability of our survival. Only focusing upon the universe of which we are a part will maximize that probability — thinking scientifically and “speaking” mathematically, in other words. Education, therefore, is properly defined as developing the scientifically focused mind/brain; that is, developing skills of observation, pattern recognition, mathematical expression, skepticism, imagination, and rational thinking. But it is not an education in a vacuum, without the ethical aspects of religion, the social lessons of political science and history, and the imaginative exercises of the arts. In this manner religious studies, social studies, and the fine arts (not to mention vocational education) all can be seen as ancillary, participatory, and helpful in keeping the balance of the two fluxes, as they all strengthen the mind/brain to observe, recognize, think, and imagine (i.e. they exercise and maintain the “health” of the non-veridical). I personally think non-scientific studies can make scientific studies even more effective in the mind/brain than scientific studies would be without them; non-scientific studies are excellent exercises in developing imagination, expression, senses of humor, and insight — attributes as important in doing science as in doing non-science.
The “well-rounded” scholar appreciates the roles both the objective and the subjective play in the benefit of culture better than the “specialist” scholar does, though both types of scholars should understand that the focus of all study, scientific or not, should be upon the veridical, the universe “out there.” Not everyone can develop their talents, interests, and skills in the areas of science, math, engineering, and technology, but those who do not can focus their talents, interests, and skills toward developing some aspect of humanity-in-the-universe — toward exploring the limitless ramifications of star-stuff in self-contemplation.
Therefore, education, Pre-K through graduate school, needs a new vertical coordination or alignment of all curricula. ALL curricula should be taught in a self-critical manner, as science courses are taught (or should be taught if they are not). An excellent example of what this means is the set of philosophy courses I took in undergraduate and graduate school. Virtually all the philosophy courses I took or audited were taught in the sequence of a presentation of X, of good things about X, and of bad things about X. In other words, all courses, regardless of level, should be taught as being fallible, not dogmatic, and subject to criticism. A concept of reliable knowledge, not absolute truth, should be developed in every individual mind/brain, so that reliability is proportional to verification when tested against the “real world,” the origin of the veridical flux upon our world display screen; what “checks out” according to a consensus of widely-accepted facts and theories is seen as more reliable than something supported by no such consensus. Hence, the philosophy of education should be the universal fallibility of human knowledge; even the statement of universal fallibility should be considered fallible. Material of all curricula should be presented as being for consideration, not as authoritative; schools are not to be practitioners of dogma or propagators of propaganda. No change should occur in the incentive to learn the material if it is all considered questionable, as material often continues to be learned in order to pass each and every course through traditional educational assessment (tests, exams, quizzes, etc.). And one does not get diplomas (and all the rights and privileges that come with them) unless one passes his/her courses.
Certainly the best incentive to learn material, with no consideration of its fallibility other than it’s all fallible, is the reward of knowing for its own sake; for some students, the fortunate ones, the more one knows, the more one wants to know; just the knowing is its own reward. Would that a higher percentage of present and future students felt that way about what they were learning in the classroom!
The “mantra” of education in presenting all-fallible curricula is embodied in the statement of the students and for the students. Institutions of learning exist to develop the minds of students; socialization and extracurricular development of students are secondary or even tertiary compared to the academic development of students, as important as these secondary and tertiary effects obviously are. As soon as students are in the upper years of secondary schooling the phrase by the students should be added to the other two prepositional phrases; in other words, by the time students graduate from secondary schools, they should have first-hand experience with self-teaching and tutoring, and with self-administration through student government and leadership in other student organizations. Teachers, administrators, coaches, sponsors, and other school personnel who do not do what they do for the sake of students’ minds are in the wrong personal line of work.
Educational goals of schools should be the facilitation of individual student discovery of likes, dislikes, strengths, weaknesses, tastes, and tendencies. Whatever diploma a student clutches should be understood as the completion of a successful regimen of realistic self-analysis; to graduate at some level should mean each student knows him/herself in a level-appropriate sense; at each level each student should be simultaneously comfortable with and motivated by a realistic view of who and what he/she is. Education should strive to have student bodies free of “big-heads,” bullies, “wall-flowers,” and “wimps.” Part of the non-academic, social responsibility of schools should be help for students who, at any level, struggle, for whatever reason, in reaching a realistic, comfortable, and inspiring self-assessment of themselves. Schools are not only places where you learn stuff about reality outside the self; they are places where you learn about yourself. Students who know a lot “outside and inside” themselves are students demonstrating that the two fluxes upon the world display screens in their heads are in some sense balanced. (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014])
Consequently, the only time education should be seen as guaranteeing equality is at the beginning, at the “start-line” the first day in grade K. Education is in the “business” of individual development, not group development; there is no common “social” mind or consciousness — there is only agreement among individual brain/minds. Phrases like “no child left behind” have resulted in overall mediocrity, rather than overall improvement. Obviously, no group of graduates at any level can be at the same level of academic achievement, as each brain has gained knowledge in its own, unique way; some graduates emerge more knowledgeable, more talented, and more skilled than others; diverse educational results emerge from the diversity of our brain/minds; education must be a spectrum of results because of the spectrum of our existence, our ontology, of countless brain/minds. Education, therefore, should be seen as the guardian of perpetual equal opportunity from day 1 to death, not the champion of equal results anywhere along the way.
[Incidentally, one of the consequences of “re-centering” or “re-focusing” the philosophy, the goals, and the practices of education because of perception theory may be a surprising one. One aspect of a scientific curriculum compared to, say, an average “humanities” curriculum is that in science, original sources are normally not used, unless it is a history and philosophy of science course (Is history/philosophy of science a humanities course?). I am ending a 40-year career of teaching physics, mostly the first-year course of algebra-based physics for high school juniors and seniors, and, therefore, ending a 40-year career of introducing students to the understanding and application of Isaac Newton’s three laws of motion and Newtonian gravitational theory. Never once did I read to my physics students, nor did I ever assign them to read, a single passage from Philosophiae Naturalis Principia Mathematica, Newton’s introduction to the world of these theories. Imagine studying Hamlet but never reading Shakespeare’s original version or some close revision of the original!
The reason for this comparison is easy to see (but not easy for me to put in few words): science polices its own content; if nature does not verify some idea or theory, that idea or theory is thrown out and replaced by something different that does a better job of explaining how nature works. At any moment in historical time, the positions throughout science are expected to be the best we collectively know at that moment. Interpretations and alternative views outside the present “best-we-know” consensus are the right and privilege of anyone who thinks about science, but until those interpretations and views start making better explanations of nature than the consensus, they are ignored (and, speaking as a scientist, laughed at).
Though many of the humanities are somewhat more “scientific” than in the past — for instance, history is more and more seen as a forensic science striving to recreate the most reasonable scenes of the past — they are by definition focused on the non-veridical rather than the veridical. They are justified in education, again, because they aid and “sharpen” the non-veridical to deal with the veridical with more insight than we have shown in the past. The problems we face in the future are better handled not only with knowledge and application of science, math, engineering, and technology, but also with knowledge of what we think about, of what we imagine, of the good and bad decisions we have made collectively and individually in the past, and of the myriad ways we can express ourselves, especially about the veridical “real” world. Since the original sources of these “humanities” studies are seen as being as applicable today as when they were written — since they, unlike Newton, were not describing reality, but only telling often imaginative, indemonstrable, and unverifiable stories about human behavior to which humans today can still relate — the original authors’ versions are usually preferred over modern “re-hashes” of the original story-telling.
The interest in the humanities lies in relating to the non-veridical side of the human brain/mind, while the interest in the sciences lies in the world’s reflecting what is said about it; Newton’s laws of motion are “cool” not because of the personality and times of Isaac, but because they appear to most people today “true;” Hamlet’s soliloquies are “cool” not because they help us understand the world around us, but because they help us understand and deal with our non-veridical selves, which makes their creator, Shakespeare, also “cool;” the laws of motion, not Newton, are relevant today, but Shakespeare’s play is relevant today because in its original form it still leads to a myriad of possibly useful interpretations. What leads to veridical “truth” is independent of its human source; what leads to non-veridical “stories” is irrevocably labeled by its originator.
To finally state my bracketed point on altered education as promised above the opening bracket: science, math, and engineering curricula should be expanded to include important historical details of scientific ideas, so that the expulsion of the bad ideas of the past, as well as the presentation of the good ideas of the present, is included. Including the reasons the expunged ideas are not part of the curriculum today would be the “self-critical” part of science courses. Science teachers would be reluctant to add anything to the curriculum because of lack of time, true enough, but the clever science teacher can find the few seconds needed by being more anecdotal in his/her lessons, which would require being more knowledgeable of the history and philosophy of science. Hence, all curricula in the education suggested by perception theory would be similar — cast in the universal mold of a presentation of X, of good things about X, and of bad things about X.]
III. The Origin of Politics (The “Toxic Twin”) — Perception is Everything [Jan., 2016] makes dealing with human politics straightforward: politics, in all likelihood, not only originated just as religion and its attendant theology originated, it has developed along lines so similar that politics could be considered the “toxic twin” of theology, in that it can turn as toxic (dangerous) to humanity as theology can. (Citizens! (I) Call For the Destruction of the Political Professional Class [Nov., 2012], Citizens! (II) The Redistribution of Wealth [Jan., 2013], Citizens! (III) Call for Election Reform [Jan., 2013], The United States of America — A Christian Nation? [June, 2012], An Expose of American Conservatism — Part 1 [Dec., 2012], An Expose of American Conservatism — Part 2 [Dec., 2012], An Expose of American Conservatism — Part 3 [Dec., 2012], Sorting Out Jesus [July, 2015], At Last, a Probable Jesus [Sept., 2015], & Jesus — A Keeper [Sept., 2015]) In order for us to survive in our hunter-gatherer past, leaders and organizers were apparently needed as much as shamans, or proto-priests; someone or a group of someones (leader, chief, council, elders, etc.) had to decide the best next thing for the collective group to do (usually regarding the procuring of food for the group’s next eating session, or regarding threats to the group from predators, storms, or enemy groups over the next hill, etc.); just as someone was approached to answer the then-unanswerable questions, like where the storms come from and why so-and-so had to die, leaders of the group were looked to for solving the group’s practical and social problems. In other words, politics evolved out of necessity, just like religion. Our non-veridical capabilities produced politics to meet real needs, just as they produced religion to meet real needs.
But, just as theology can go toxic, so can politics and politics’ attendant economic theory. Voltaire’s statement that those who can make you believe absurdities can make you commit atrocities applies to political and economic ideology just as it does to gods and god stories. Anything based purely upon non-veridical imagination is subject to Voltaire’s statement. However, I think politics has an “out” that theology does not. Theology is epistemologically trapped, in that one god, several gods, or any god story cannot be shown to be truer (better in describing reality) than another god, other several gods, or another god story. Politics is not so trapped, in my opinion, as it does not have to be “attached at the hip” to religion, as has been demonstrated in human history since the 18th century. Politics can be shown to be “better” or “worse” than its previous version by comparing the political and social outcome of the “before” with that of the “after.” No political solution solves all human problems, if for no other reasons than that such problems continually evolve in a matter of weeks or less, and that no political installment can anticipate the problems it will encounter, even when it has solved the problems of the “before.” Nonetheless, I think one can argue that the fledgling United States of America created by the outcome of the American Revolution and the birth of the U.S. Constitution was better than the colonial regime established in the 13 colonies under the reign of George III. The same can be said about the independent nations that emerged peacefully from the British Empire, like India, Canada, and Australia, though the USA, India, Canada, and Australia were and are never perfect and free from “birth pangs.”
What are the political attributes that are “better” than what was “before?” Many of the references cited just above point out many of them — a list I would not claim to be complete or sufficient. Overall, however, the history of Western and Eastern Civilization has painfully demonstrated, at the cost of the spilled blood of millions (Thirty Years’ War, Napoleonic Wars, World War I, World War II, etc.), that theocracies and monarchies are “right out.” [Here I am applying the philosophy that history is not so much a parade of great individuals but, rather, is more aptly seen as a parade of great ideas — a parade of non-veridical products much better than other such products.] Democracies only work for small populations, so a representative form of government, a republic, works for the larger populations of the modern world. Clearly, secular autocracies and dictatorships are also “right out.” Class structure of privilege and groundless entitlement still rears its ugly head even in representative republican governments, in the form of rule-by-the-few of power (oligarchies) and/or wealth (plutocracies). To prevent oligarchies and plutocracies, elected representative government officials should be limited in how long they can serve, so that they cannot become a political professional class (limited terms of office); in other words, politicians should be paid so that they cannot make a profit.
[Almost exactly the same things can be said of government work staffs and other non-elected officials — the bureaucrats of “big government.” Terms of service should be on a staggered schedule of limitations so that some “experience” is always present among both the elected and their staffs; bureaucrats should be paid in order that they cannot become a professional class of “bean-counters” at taxpayer expense; public service should be kept based upon timely representation, and civil service should be kept based upon a system of timely merit; politicians are elected by voters, and bureaucrats are selected by civil service testing — both groups subject to inevitable replacement.]
This, in turn, calls for severe restrictions on lobbying of elected officials of all types (making lobbying a crime?). Preventing oligarchies and plutocracies of any “flavor” can only be effective if the overall political philosophy applied is a liberal one (“liberal” meaning the opportunity to achieve wealth, power, and influence while simultaneously working so that others around you, all over the globe, can achieve the same — all without unjust expense to someone else’s wealth, power, and influence). The philosophy of such a liberal posture I call “liberalist,” meaning that freedom, equality, and brotherhood (the liberte, egalite, and fraternite of the French Revolution) are all three held constantly at equal strength. When one or two of the three are reduced at the relative boosting of the other two or one, then things like the atrocities of the French Terror, the atrocities of fascism, the atrocities of communism, or the atrocities of unregulated capitalism result.
[The word “equality” in political philosophy as used above must be distinguished from the “equality” issue of education in II. above. When the US Declaration of Independence says that “all men are created equal,” that does not mean equal in knowledge, talents, and skills; rather, it means a shared, universal entitlement to basic human rights — in the Declaration’s words, “life, liberty, and the pursuit of happiness.” We all have equal rights, not equal educational results; equal rights does not mean equal brain/minds — something the Terror tragically and horribly did not grasp; equal rights to education does not mean equal knowledge, talents, and skills for graduates — something too many “educators” tragically do not grasp. Perception theory would suggest political equality is different from educational equality; the word “equality” must be understood in its context whenever the appropriate adjective is not attached to it. The difference is crucial: political equality is vital to the healthy social organization of the species, while educational “equality” (equal results, rather than equal opportunity) is tragic and harmful to the individual brain/minds of the species. Awareness of this difference, or always making this semantic distinction, should avoid unnecessary confusion.]
Certain Western European countries, such as the Scandinavian countries, have shown the future of political systems toward which all nations should strive in accordance with liberal, liberalist views. If anything is needed by the population at large, then a socialist program is called for to deal with all fairly — such as social security, free public education through the university level, postal service, public transportation, universal single-payer health care, public safety, state security, and “fair-share” taxation of all who earn and/or own. No one is allowed to achieve personal gain through regulated capitalism or through leadership in any of these socialist programs except upon merit, meaning his/her gain (in wealth, power, and/or influence) comes at no unjust loss to someone else and is based solely upon the successful individual’s talents, skills, and knowledge; competition in capitalism and in program leadership is both necessary and in need of limits. It is OK to “lose” in the game of capitalism, as long as one loses “fair and square;” every business success and every business failure must be laid at the feet of the entrepreneur. The political system with its social programs is merely the crucible of both individual success and individual failure, and the system must continually monitor and regulate that crucible so as to assure perpetual and equal opportunity for all. Regulation of the political system crucible is achieved by the electors of political leadership and program leadership — regulation keeping the programs, like capitalism, perpetually merit-based, fair, and just. This is a system of “checks and balances” toward which every political system should strive.
History has taught us that the foregoing is not a description of some “pie-in-the-sky” Utopia; it is a description of what the past has painfully shown to be “the way” of avoiding a theology-like toxicity in politics. Politics is not doomed to be theology’s “toxic twin,” but it will be so doomed if the bloody lessons of its past are not heeded. In my opinion, it really is not complicated: it is better to liberally trade, tolerate, and befriend than to conservatively exploit, distrust, and demonize. Politically speaking, we need to securely develop a xenophilia to replace our prehistoric and insecure xenophobia. This “xeno-development” is one of the great lessons taught by the modern world over the last 300 years, and this “xeno-development” is begged by perception theory.
RJH