Citizenship and Culture

Each of these five articles of faith – in the Almighty, our institutions, ourselves, the Founders and our neighbors – is critical to the central issue of our history, citizenship: what it means (rights and responsibilities) to be a citizen, who is a citizen, and how one becomes a citizen. Central to any discussion of citizenship is the role of culture in creating the communities of which we all are citizens.

 A community is “a small or large social unit (a group of people) who have something in common [culturally], such as [behavioral] norms, religion, values, or identity [and, most importantly, have faith in the community – the people and the institutions]. Often – but not always – communities share a sense of place that is situated in a [defined] geographical area.” For the purposes of this discussion, the community of interest is that of the United States of America – all 330,000,000 of us. How was this community created, by whom, for whom and to what end?

 Well, it was created by about 4 million of us in post-revolution 1787-88 – predominantly religious and/or commercial colonists from the British Isles (75%), with a smattering of Dutch and Germans. All colonists of European ancestry (except British loyalists), partners in the fight, were considered citizens upon the establishment of the United States. Some free blacks also enjoyed certain benefits of citizenship, but not all. Only male landowners could vote.

 The predominant culture in the new nation was that of the Western Tradition – carried across the ocean along with the ideals of personal rights and responsibilities, literature, music, dance, written laws, representative government, a moral sense of right and wrong, religious freedom, the concept of private property, a passion for truth, a sense of fair play and a yearning for liberty.

 This tradition came down to them primarily through the ancient Celtic civilization of Central Europe, originating near the headwaters of the Rhine and Danube Rivers (as early as 4000 BC – having migrated from the Fertile Crescent with the expansion of agriculture and from the Samara Culture of the Caucasus – hence Caucasian – and the steppes of Central Asia), who then migrated west to the Atlantic and North Sea coasts of continental Europe and finally across the water to Ireland and England. Later, as Germanic tribes, they migrated north across the islands of the Kattegat to southern Sweden and Norway to become the Norse.

 The Celts were later joined in England by their Continental cousins – the Angles, Saxons, Jutes and others from what are now Holland, Belgium, Denmark and northern Germany. Finally, in the 9th century, their coastal areas were invaded by the Norse, better known as the Vikings – from Denmark into eastern England and from Norway into western Scotland and the area around Dublin in Ireland.

 They were the descendants of the Greek, Roman and Christian traditions, the Dark Ages, the Carolingians, the great imperial monarchies, the Age of Discovery, the Black Plague, serfdom, feudalism, mercantilism, the invasion of the Mohammedans, the Reformation and centuries of religious wars, the Renaissance, the Enlightenment and the creation of the scientific method, the Puritan ethic and finally, a proto-capitalism and republican-democracy. They said what they meant and meant what they said. They were unique in the world and in all human history. For them, finally and for all time – the individual mattered and that was their cultural center.

They were no longer British, or European, or Russian, or Mediterranean, or Spanish, or North African, or Middle Eastern, or Asian or South Asian, or Sub-Saharan African, or Central or South American, or Aboriginal – they were Americans – having evolved for over a century on their own on the far side of the Atlantic and having been annealed through six years of war with the world’s most powerful military – on their farms, in their yards and houses, cities and towns. Finally, they were free.

 When their time came, they created “We, the People” and proclaimed themselves citizens of the United States of America “with a firm reliance on the protection of Divine Providence,” and with their first act proclaimed a brilliant, written Constitution within which to enshrine their culture for all time, plain and simple – a Constitution to be preserved, protected and defended, under oath – the most sacred form of personal commitment – by all who would ever serve under its auspices.

 It was, simply, the most profound political statement of all time! Having had enough of imperial governmental abuse and revolution, they chose evolution instead and included provisions in the Great Document to preserve their culture from all enemies, foreign and domestic, as well as to allow the People (not the government) to adopt new provisions as their culture evolved over time.

Today, tribes (Mestizos, Somalis, Kurds, Bedouins, etc.) from other (non-Western) cultures are attempting to insert their tribes’ cultural truths into America, under the rubric of diversity and multiculturalism, and are demanding that the American culture change to accept their particular cultural norms. But insular tribal cultures and a cosmopolitan American culture are inherently incompatible.

The Constitutionally enshrined process for maintaining a viable American culture worked well enough for two centuries, but cracks have begun to show because of the efforts of a new form of domestic enemy who also favors imperial government – the PLDC.

Any discussion of citizenship and culture would do well to consider the brilliant and evocative work of Leti Volpp, Professor, Boalt School of Law, University of California, Berkeley, in The Culture of Citizenship. [The italics are mine. This is high scholarship and a profound and convincing argument on one of the great issues of our history, which I summarize later.]

 “I begin this part by examining how recent multicultural scholarship has turned to concepts of citizenship to solve the question of how the [classic] liberal state is to respond to cultural difference. I show the[ir] circularity at work.

 To be a citizen, one must rid oneself of certain forms of cultural excess. But citizenship itself is a culturally specific formation.

 Political theorists have devoted much energy in recent years to two questions: first, the debate over the rights of ethnic minorities in multi-ethnic societies, discussed in such terms as multiculturalism and the politics of recognition; and, second, the responsibilities of democratic citizenship, in what some have called the civic-virtue debate.

 As Audrey Macklin has noted, one way to understand the historical bifurcation between these two debates is that the former focuses on substantive questions as to the content of multiculturalism, while the latter focuses on procedural questions as to the virtues, practices and responsibilities of democratic citizenship.

 I am concerned that the attempt to address multiculturalism on the terrain of citizenship is not a salutary turn. The recent reliance upon democratic citizenship as a “solution” to the questions posed by multiculturalism will reflect the limits of a focus on procedural as opposed to substantive rights, with all the much-criticized problems of focusing on process versus outcome, formal versus substantive measures of justice.

 At the same time, focusing on “civic virtue” masks the manner in which the questions of democratic citizenship are value-laden, as the other continues to haunt what is positioned as the purportedly neutral activity of deliberative democracy. Pivotal to my argument is the claim that liberal ideas about citizenship presume nations devoid of culture, against which we find racialized others overburdened with culture. The assumption is that public attachments to culture are contradictory to citizenship. [This raises the question: How can “racialized others overburdened with culture” ever become citizens?]

 When I refer to the substantive content of multiculturalism, I mean the question of what kinds of minority practices (language rights, indigenous claims, ethnic claims) the state chooses to [either] ban or accommodate. The debate over the content of multiculturalism includes controversy over the normative issues raised by minority “rights”, such as the principles on which minority rights should be tolerated or not.

 By the procedural questions involved in democratic citizenship, I mean the question of what kinds of citizenship practices are necessary for a flourishing democracy. Thus, one way of thinking about “substance” versus “process” is that the first inquiry considers the substance of practices engaged in by individual members of minority groups; the second inquiry considers the rules through which individuals participate in citizenship [a province of the majority].

 To be a citizen means to “transcend one’s [unique, individual] ethnic, religious and other particularities and to think and act as a member of a political community [as defined above]” and may be characterized by the belief that minorities’ rights are derived from and vested in the enabling power of liberalism, which positions the non-ethnic as an autonomous, rational and self-sufficient individual, in contrast to the disorderly, irrational, culturally motivated other.

 That the specific cultural content of varieties of citizenship assumed in the literature is somehow absent, so that citizenship appears cultureless, is vividly apparent in Kymlicka’s and Norman’s description of the virtues and practices of democratic citizenship. Citizenship is presented as a blank schematic into which questions of culture are only to be newly introduced [an absurdity].

 Following the work of William Galston, the authors suggest that “responsible citizenship” should be understood as made up of four distinct types of civic virtues: 1) general virtues: courage, law-abidingness, loyalty; 2) social virtues: independence, open-mindedness; 3) economic virtues: work ethic, capacity to delay self-gratification, adaptability to economic and technological change; and, finally, 4) political virtues: capacity to discern and respect the rights of others, willingness to demand only what can be paid for, ability to evaluate the performance of those in office, and willingness to engage in public discourse.

 These virtues appear to be culturally specific values that support what we could consider the Protestant work ethic and the spirit of capitalism [which are the specific virtues of the Western Tradition that animated the Founders and Drafters] molding the values of socially dominant groups into the identity of the citizen.

 These purportedly neutral virtues are not culture-less. Yet somehow these virtues reappear in contemporary debates on multiculturalism and citizenship as the abstract guidelines through which cultural difference is to be processed.

 Cultural attachments are thought to inhibit one’s ability to engage in several distinct forms of citizenship. My focus here is on liberal discourses of citizenship. There are other conceptions of citizenship that posit a more robust relationship between citizenship and a common culture. We could note here Rey Chow’s work which draws out the “protestant ethnic” as the figure created by the contemporary belief of salvation in secular modernism and a capitalist economy.

 I rely upon the work of legal academic Linda Bosniak, who suggests separating citizenship into four different discourses: 1) citizenship as formal legal status (which differentiates the citizen from the alien); 2) citizenship as rights; 3) citizenship as political activity, and; 4) citizenship as identity/solidarity. In thinking through these distinct forms of citizenship, the first question is what cultural attachments one must shed (or never have been attached to) in order to gain citizenship as a formal legal status, to be naturalized as a citizen.

 In the context of the United States, as a contemporary matter, “good moral character” is a prerequisite to naturalization. The requirement of demonstrating “good moral character” contains within it the disavowal of various behaviors thought to inhibit the practice or possibility of good U.S. citizenship, including [respecting the rights of others,] gambling and prostitution. Formal citizenship means the qualifications required to possess the legal status of a citizen — in the United States, as granted by the Constitution or by statute. Citizenship as rights signifies the rights necessary to achieve full and equal membership in society.

 This approach tracks efforts to gain the enjoyment of civil, political and social rights in Western capitalist societies. In the context of the United States, citizenship as rights is premised on a liberal notion of rights, and the failure to be fully enfranchised through the enjoyment of rights guaranteed under the Constitution has been described as exclusion or as “second-class citizenship.”

 Citizenship as political activity posits political engagement in the community as the basis for citizenship, as exemplified both by republican theories that played a key role in the founding of American democracy, as well as by a recent renaissance of civic republicanism. Lastly, citizenship as identity, or citizenship as solidarity, refers to people’s collective experience of themselves, their affective ties of identification and solidarity.

 Alternatively, Will Kymlicka and Wayne Norman provide a schematic of citizenship, which differentiates among status, identity, activity/civic virtue, and the citizenship ideal of social cohesion (which they identify as citizenship at the level of the political community as a whole against concerns of fragmentation). Their citizenship as status encompasses both of Bosniak’s concepts of citizenship as formal legal status and citizenship as rights.

 (Requirements of Naturalization): No person, except as otherwise provided in this title, shall be naturalized, unless such applicant… during all the period referred to in this subsection has been and still is a person of good moral character, attached to the principles of the Constitution of the United States, and well-disposed to the good order and happiness of the United States.

 For naturalization, one must show that one has been a person of good moral character for the statutory period (typically five years, or three years if married to a U.S. citizen, or one year for Armed Forces expedited citizenship) prior to filing for naturalization. The question of good moral character examines not only present but past practices as well; in certain instances, it is of no use even to disavow these practices. A past history of gambling or prostitution bars one from American citizenship.

 Secondly, in the discourse of citizenship as political activity, one is to approach one’s engagement in the republic free of corruption, and free of other ties of loyalty and attachments [to other political entities representing other cultures – see “birthright citizenship”]. Here, an attachment to culture might be thought to inhibit one’s ability to function as a citizen in the political sense.

 As an example, the idea of Chinese immigrants as being under the sway of foreign despots and engaged in loyalties based upon community ties, inhibiting their ability to follow the rule of law, fueled historic arguments that the Chinese could not become members of the American political body — they could not “understand republican values.”

 Arguably, this vision of Chinese-Americans lingers today, so that Chinese-Americans continue to be characterized as disloyal, nepotistic and deceitful, as illustrated in the “Asian campaign finance” scandals. Thus, ideas about cultural others are used to frustrate the ability of these others to engage as virtuous participants in the republic. In addition, we can isolate the argument that cultural attachments are thought to inhibit one’s ability to engage in citizenship in another sense, which is that to be a citizen of certain Western democratic states, particular values must be accepted as a baseline for membership — for example, the value of gender equality. The immigrant other must be emancipated from the group and group values of gender subordination to qualify as a citizen [think Islam].

 [Two examples:] The idea that immigrants may espouse contradictory values is thought to preclude them from citizenship, which requires shared moral commitments. We know from Marx’s “On the Jewish Question” that the Jewish other was to be “emancipated” from the group and the purported group value of “hucksterism” in order to qualify as a citizen. Today, if, over the last five years, he or she has engaged in various criminal offenses, including gambling, has been involved in prostitution, has smuggled “illegal aliens” into the U.S., has been a “habitual drunkard,” or has not supported his or her dependents, citizenship will not be granted.

 This discourse of the inability to understand republican values was invoked in 1866 in Congress during deliberations as to whether to lift the racial ban on naturalization to allow Chinese to naturalize (the answer was no). In this discussion, the values that immigrants are assumed to espouse, as exemplified by their conduct, are thought to be totally separable from their identities.

 According to the conventional narrative of U.S. citizenship, once upon a time there were identity-based exclusions from citizenship, so that one’s status — first as nonwhite, later as racially ineligible to naturalize — precluded one from membership; today, citizenship is inclusive of all and no longer features such identity-based exclusions. Today we have neutral, non-identity based restrictions premised upon conduct — all one has to do is to share certain visions and normative beliefs about American democracy, and act accordingly.

 However, this vision of a progression from status to conduct ignores two facts. First, the historical status-based exclusions were, in fact, premised upon assumptions regarding conduct or behavior. The first federal citizenship statute, passed by Congress in 1790, restricted naturalization to “free white aliens.” Act of Mar. 26, 1790, ch. 3.

 Birthright citizenship was not guaranteed regardless of race until the passage of the Fourteenth Amendment, which the Supreme Court, in Wong Kim Ark in 1898, ruled applied to Chinese-Americans as well as African-Americans (but not to “children of members of the Indian tribes, standing in a peculiar relation to the National Government”).

 U.S. v. Wong Kim Ark, 169 U.S. 649, 682 (1898). The law was amended in 1870 to add “aliens of African nativity or African descent.” Act of July 14, 1870, ch. 255, § 7, 16 Stat. 254. This led to the “racial prerequisite” cases — litigation where non-citizens attempted to prove they should be considered eligible for naturalization. The racial restrictions were not entirely lifted until 1952.

 [Birthright citizenship will be discussed in a separate chapter below.]

 In addition, the idea that status and conduct can be separable should be thrown into question. The assumption that Chinese immigrants were incapable of comprehending republican values was used to justify the retention of race-based restrictions to naturalization in 1870. Thus, although we remember exclusions as having been status-based, they were in fact premised upon assumptions about normative behavior. Second, the conduct- or behavior-based restrictions that exist today, while conventionally understood as neutral, are both constitutive of – and the product of – status.

 As one example, we could take the idea of the “terrorist” — a conduct-based distinction that purports to separate out one who engages in terrorism from the citizen. This distinction is at the same time foundationally about a particularized status of national origin in the context of governmental policies of detaining, deporting, interrogating and excluding those who are nationals of countries with significant Al Qaeda [and ISIS] presence or activity. But the governmental policies are not purely based on national origin, or else they would target nationals of countries such as Germany, which is a country with significant Al Qaeda presence or activity. Rather, countries with predominantly Muslim populations are targeted, so that the targeted status fuses national origin and religion.

 Moreover, when we think about the governmental policies targeting residents of certain neighborhoods in the United States, arresting and detaining individuals based upon their appearance, we understand that the targeted status fuses national origin, religion, and race. We could note here the case of Yaser Hamdi, required to “renounce” his citizenship in exchange for being released from U.S. custody, as evidence of the notion that one cannot be both a “citizen” and a “terrorist.” [People who are presumed to participate in] hate-violence are [commonly] identified by a fusion of national origin, religion and race, so that an “Arab, Middle Eastern or Muslim” appearance is believed to reflect or augur conduct befitting of “terrorists.”

 At this point, it seems impossible to separate who is likely to engage in terrorist behavior from assumptions about that person’s race, religion and national origin. Recognizing that identity, racial and otherwise, is a matter of conduct as well as status, and that the two are, in fact, mutually constitutive of one another, should help us in framing a new understanding of the story of citizenship.

Historically, identity — as construed in terms of both status and conduct — has excluded certain persons from citizenship, and it continues to do so in the contemporary moment, now construed in terms of culture. If we consider this together with the fact that the perception of cultural behavior is subject to a kind of selective recognition — so that problematic behavior is thought to be characteristic of the culture of entire nations, rather than the product of individual deviants — we can see the perversity of current configurations of the relationship between citizenship and culture.

 One’s cultural identity constitutes a predictor [understandably, without contradictory first-hand experience] of problematic behavior [in the minds of a nation’s citizens]. To be a citizen, one must not engage in problematic behavior [nor frighten the community with problematic precursors of untoward behavior – demanding Sharia Law in the United States, for instance – if domestic tranquility is to be honored]. Both the cultural norms underlying citizenship and the problematic behavior of those who are already recognized as citizens are made invisible [because both the good citizen and the problematic one have been exposed to the accepted cultural norms and subsequently have chosen their own path]. Next time: Citizenship v. Culture.

Citizenship v. Culture (cont.)

What gave rise to this opposition [conflict?] between citizenship and culture?

 “Let us examine the words of New York Times columnist Thomas Friedman, who began a column as follows:

 “If you listen closely to the emerging debate about Iraq, one of the themes you can start to hear is that culture matters — and therefore this whole Iraq adventure may be a fool’s errand. Because the political culture in the Arab world — where family and tribal identities have always trumped the notion of the citizen — is resistant to democracy. I believe culture does matter, although I have no idea how much it explains the absence of Arab democracies. But I also believe cultures can change under the weight of history, economic reform and technological progress…”

 Friedman is suggesting that “culture” in the “Arab world” bears an adversarial relationship to the “notion of the citizen” and “democracy” that the United States is seeking to encourage in Iraq. In fact, the contradiction between citizenship and culture may be so extreme that the “whole Iraq adventure may be a fool’s errand.” How did such oppositional ideas regarding “citizenship” and “culture” emerge?

 We can look to the roots of Western modernity and concepts of the citizen that have shaped the vision of the West as the site of citizenship, and the rest of the world as the site of culture. The citizen is the rights-bearing subject. Citizenship is, borrowing from the phrasing of Hannah Arendt, the “right to have rights.” The bearer of rights is the human; what characterizes the human is the capacity to reason. Ideas about culture have shaped who is thought to have the capacity to reason, so that some are thought to engage in reason, while others engage in irrational conduct, explained as the product of culture. [The subjugation of women in Muslim society is a prime example of citizenship suppression by culture.]

 As Wendy Brown writes: “‘[W]e’ have culture while culture has ‘them,’ or we have culture while they are a culture. Or, we are a democracy while they are a culture.” [These are] particular disavowals that, in fact, were constitutive exclusions to what became known as the modern and the West. We should think here, for example, of the Haitian Revolution, and the documentation showing how freedom and emancipation came to be associated with the French and American Revolutions and not the Haitian Revolution. Black self-emancipation from white enslavement is absent from the normative narration of human freedom (depicted instead as white self-emancipation from monarchy and feudalism or, alternatively, as white emancipation of blacks from white — or black — enslavement).

“The Haitian Revolution was a successful anti-slavery and anti-colonial insurrection by self-liberated slaves against French colonial rule in Saint-Domingue, now the sovereign nation of Haiti. It began in 1791 and ended in 1804 with the former colony’s independence. It was the only slave uprising that led to the founding of a state both free from slavery and ruled by non-whites and former captives. With the recent increase in Haitian Revolutionary Studies, it is now widely seen as a defining moment in the history of racism in the Atlantic World.

Its effects on the institution of slavery were felt throughout the Americas. The ending of French rule and the abolition of slavery in the former colony by the former slaves was followed by their successful defense of the freedoms they won, and, with the collaboration of mulattoes, their independence from rule by white Europeans. 

It represents the largest slave uprising since Spartacus’ unsuccessful revolt against the Roman Republic nearly 1,900 years before. It challenged long-held beliefs about black inferiority and about enslaved persons’ capacity to achieve and maintain their own freedom. The rebels’ organizational capacity and tenacity under pressure became the source of stories that shocked and frightened slave owners.”

[In keeping with our thesis of whole truth, it must be recognized that Haiti’s success as an independent state has been marginal at best. Haiti ranks 163rd out of 188 countries on the United Nations index of human development, the only nation in the Western Hemisphere in the lowest category.]

“[Because of the sole success of Haiti, the history of m]odernity (specifically Western modernity) has been constructed as the product of white freedom, rather than of black enslavement or black self-emancipation; as the product of individual autonomy and rationality, rather than of cultural and group-based determinism; as the product of public engagement, not of private interactions; as the product of activity by the masculine, not the feminine.

 [The fact that] gender is the current sore point of otherness in debates about multiculturalism may reflect the historical bifurcation between public and private, the conceptualization of the “inner world” as the space of cultures and traditions, with women in the home space constituting a bulwark against the outside world. The citizen is engaged in the public; the private is the space for cultural practices. [This gives rise to the idea that, when culture becomes public, the citizens’ space is breached.]

 For example, this [conception] is [apparent] in the way the British dealt with law in India, leaving religiously based personal laws covering all issues concerning the family to the private sphere (although it bears mention that British colonialism had a great impact on what those family laws were to be) while aligning the laws of the public with British law. Thus, when the discussion focuses on how to “incorporate” immigrants into citizenship – which is equated with incorporating them into the public – one finds the discussion turning time and again to how to emancipate immigrant women from the private.

 [Abigail Adams and Mary Wollstonecraft were early (late 18th Century) champions of the emancipation of women in the Western Tradition.]

 Scholars who write about multiculturalism and citizenship have focused on the question of how to incorporate the “cultural other” into citizenship. But this question is premised upon a false bifurcation. Citizenship is constituted through the exclusion of “cultural others”; the “cultural other” creates the citizen through contrast and negation. The borders of citizenship are always patrolled, whether these borders are national or normative ones that admit and exclude on the basis of behavior — and so some are always excluded, whether for territorial, moral, or cultural reasons.

 Due to this othering and exclusion, the citizen can be assumed (falsely) to be absent of culture. Thus, the citizen emerges through distinction from the cultural other, who is repudiated from citizenship through total identification with an unassimilable cultural difference, and through simultaneous denial that the citizen might share similar cultural values.

 In the words of Renato Rosaldo, “… cultural difference is used to account for the acts of those who are not full citizens. Whereas the “noncitizen” is described as culturally motivated, the citizen’s motivations are explained as the embrace of universal liberal values and rational choice, or as the product of psychological pressures.” But in this narrative, both the citizen’s cultural values and the culture of citizenship are obliterated.

 Tolerance suggests that there is a dominant majority extending its beneficence to a minority community — I choose to tolerate you, or I choose to tolerate what you do. Only some cultures are depicted as tolerant, while non-Western practices or regimes are defined through their intolerance. And tolerance poses as both a universal value and an impartial practice. Thus, a state that espouses the value of tolerance masks the fact that tolerance is always conferred by the dominant while its object is “inevitably figured as something more lowly,” and elides [ignores] how the object of tolerance is aligned with difference, placing it outside of the universal and purifying the tolerant entity of all intolerance.

 Tolerance presumes a difference that is to be tolerated, and a majority that does not practice those norms. But the cultural practices that are the subject of tolerance (or banning) are not unique to minority communities. This selective ascription is rendered invisible because we associate behavior with the identity of the actor, in selectively labeling it as the product of “their” culture, or as the product of individual acts. Thus, the call for “tolerance” will emerge in a context when the practices to be tolerated are in fact [but not universally] norms of the community that tolerates, but are denied as such.

 We must attend to when, and in what contexts, the state feels the need to “tolerate,” and when the language of toleration disappears. The state will feel no need to assert the language of tolerance once the community at issue is considered to be made up of citizens, engaged in acts invisible to us as cultural practices. Unlike the bearers of cultural difference, citizens are part of our everyday world.”

 To summarize this discussion: National citizenship is a right granted to persons by virtue of their national origin. National origin is conferred through one’s birth parents who, it is assumed, will impart the intricacies of the cultural norms to their offspring. By definition, a nation’s citizens form a majority in that nation. A citizen is a member of a community, in this case, the community forming the United States of America. Communities are formed by citizens with common and acceptable personal beliefs and behaviors toward their fellow citizens – commonly called culture.

 The fact that members of a community think and act similarly with respect to their fellow citizens is a good thing. In fact, it is so essential that the Drafters included the requirement in the Preamble to our Constitution. To wit:

 “We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution …”

 Therefore, the citizen’s attachment to culture is not only essential, it is mandated by the Constitution.

 So, as to the question of “naturalized citizenship”: any consideration of the substance of practices engaged in by individual members of minority groups seeking citizenship must also consider the rules through which individuals participate in citizenship, which is the province of the majority. Both the citizen’s cultural values and the culture of citizenship are critical to the process. The question society asks is: “Do I choose to tolerate you, or choose to tolerate what you do?”

 In the context of the United States, “good moral character” is a prerequisite to naturalization. The requirement of demonstrating “good moral character” contains within it the disavowal of various behaviors thought to inhibit the practice or possibility of good U.S. citizenship. It is also necessary that one approach one’s engagement in the republic free of corruption, and free of other ties of loyalty and attachments. Here, an attachment to a foreign culture might be thought to inhibit one’s ability to function as a citizen in the political sense.

 One’s cultural identity constitutes a predictor [understandably, without contradictory first-hand experience] of problematic behavior [in the minds of a nation’s citizens]. To be a citizen, one must not engage in problematic behavior. Citizenship is constituted through the exclusion of problematic “cultural others”. The centrality of culture to citizenship is, therefore, immutable.

Should any foreign citizens – by definition “cultural others” – wish to come to America to work and become citizens of the United States, they must first renounce any ties of loyalty to their former community and demonstrate the civic virtues described above for an extended period of time. They must show the desire and ability to assimilate into the American community – to become American – and they must understand that the road to citizenship will be hard and that only through struggle will it become cherished.

 That desire and ability must focus on the cultural characteristics discussed above: moral courage, law-abidingness, loyalty, independence, open-mindedness, a determined work ethic, the capacity to delay self-gratification, an adaptability to economic and technological change, the capacity to discern and respect the rights and beliefs of others – especially fellow citizens – the willingness to demand only what can be paid for, the ability to evaluate the performance of those in office and, finally, a willingness to engage in public discourse.

 The desire to become American is paramount because the Founders were intent upon creating a new type of nation – one that had moved beyond the tribal culture that had plagued Western Civilization from the time of the Greeks. Tribal culture is insular, elitist, segregationist, likely misogynist and hierarchical (caste) – all anathema to the characteristics valued by the Founders as a whole (although there were individual delegates to the Constitutional Convention who were slave owners and patriarchal). Even then, the Convention provided a means to change the Constitution when the People, as a whole, had evolved to the point where racism and misogyny were rejected.

 Wherever their cultural beliefs and practices differ from the recognized American culture, that diversity must be confined to the private sphere and must not violate any of the inalienable rights inherent in the American compact – in either the public or private sphere. Cultural others must enter the community quietly and assimilate invisibly (called “fitting in”) because that, too, is part of the American culture – as are belief in the whole truth, education, American exceptionalism, personal responsibility and respect for America and American citizens.

 A case in point: A noted professional woman, a political refugee accepted into the United States years ago – thus preventing her return to her violent homeland, where she may very well have been killed – states, in response to policies meant to ensure that terrorists don’t use refugee status to infiltrate America: “America has gone from being oppressed (by the British) to being the oppressor (present tense).”

 This ignorant, or just plain careless, view denies the historic role of Americans in helping displaced persons for virtually our entire history. This history inspired Emma Lazarus when America was barely a century old and now, as we approach the quarter-millennium mark, the cost of this merciful policy has risen to the point that our economic and political structure is in danger of collapse.

 Remaining on the same course with respect to external forces will lead to the creation of internal forces that will change the character of America forever as economic imbalances literally tear the country apart. We simply cannot continue to import poverty and need.

 The American citizens, as a whole, understand the importance of immigration in our history and for the future. They also understand that designing an immigration system that favors those able to grow and improve their America is not only intelligent, it is necessary.

 Controlling our destiny is our right, and not admitting all of the world’s needy is not oppression; it is an appreciation of the fact that, if the United States becomes economically and politically unstable, there will be no one left to help anyone, anywhere.

 There is a similar theme in the American military establishment. A person joins the Army, Navy or Air Force but a person becomes a United States Marine. Marine basic training “breaks down candidates and rebuilds them as Marines.” They adopt the culture of the Marine Corps – Semper Fidelis – Always Faithful. Similarly, a “cultural other” must become American.

 A serious violation of this compact must result in the loss of the privileges of American citizenship for both natural and naturalized citizens. Such a violation must necessarily contain a renunciation of allegiance to the Union, either explicitly or through implication, and must result from the application of due process.

 For example, if a natural born American citizen joins a terrorist organization which, necessarily, would bring him or her into armed conflict with Americans or America’s national interests, he or she would be judicially stripped of American citizenship, either directly or in absentia.

 Or, if a university professor publicly advocates the assassination of the President of the United States as part of a lecture to university students, not only would he or she be in violation of 18 U.S.C. § 871, but he or she would also be engaged in a conspiracy to deny to millions of Americans the most fundamental right of the American citizen – to be governed by officials they vote for.

 This, of course, is subversion of the highest order, i.e., treason – “the crime of betraying one’s country, especially by attempting to kill the leader or overthrow the government.” Certainly, this is an implied renunciation of allegiance to the Union.

 Since the Constitution is a reflection of the American culture at the founding, can the American culture be changed without the Constitution changing? The answer is “under some circumstances” – but what would those be?

 The answer must be based upon the relationship between the Constitution and the People. What is the nature of that relationship? The Constitution grants and guarantees God-given rights and essential responsibilities that are devolved from the People to the several governments – federal, State and local. The representatives of the People are held accountable to the People for their performance in positions where they must preserve, protect and defend those rights and the attendant culture and exercise those responsibilities. An accounting is held during free and fair elections with only justices of the several courts serving in positions relatively unaccountable to the People.

 The powers granted by the People to those citizens holding the several offices described in the Constitution are not absolute, either in scope or duration – they have been defined and limited by the People and, as such, can be recalled at the pleasure of the People. The several governments have no purview to define or limit the rights inherent in the state of citizenship without due process under Constitutional law, or without permitting the People – the citizens of the United States – to change the Constitution and to define and limit new or modified powers after considering their effect on the American culture.

 Should any of the several governments unilaterally redefine or limit any of the rights of the People, it is the duty of the People, per Jefferson’s Declaration, “to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.”

 Perhaps the most essential concept contained within the Constitution is that of the “equality” of all citizens – that is, their inherent and inalienable right to be treated equally before the law in all aspects of their existence. Put simply, all citizens are entitled to the same opportunity to enjoy the benefits conferred by citizenship. What applies to one applies to all; but when governments create inequities – by promulgating practices that benefit one or some citizens while burdening others – that right is violated for some and not for others. That is inequality and is contrary to the tenets of the Constitution.

 Therefore, the answer to the question “Can the American culture be changed without the Constitution changing?” is: only if all persons affected are treated equally by the changes. A simple example will suffice. A pregnant woman has an abortion. The two persons involved in this endeavor are treated differently. The mother has her wishes fulfilled. The baby is executed. Hardly equal treatment. In order for the cultural acceptance of abortion to be Constitutional, the Constitution must be amended to reflect the will of all the People, not just a majority of the nine justices of the Supreme Court. Since that hasn’t happened, Roe v. Wade is unconstitutional law even though the culture of abortion is thriving.

 A second facet of the question posed above deals with the uncontrolled flooding of America by non-citizens intent on changing the American culture to fit their individual whims. Contemporary America is being invaded by various tribes who demand that their tribal culture be accepted by all American citizens under the rubric of diversity. Essentially, large numbers of tribal immigrants from Mexico (see the discussion of Mexican history in Atlas Speaks) and from Central America, the Middle East, Africa and South Asia have come to America and, with the assistance of acolytes of the PLDC, established enclaves in many American urban areas. For example:

 About 61 percent of Mexican immigrants live in just two states, namely California (36%) and Texas (25%). In 2015, persons of Mexican ancestry made up 11.1% of the United States’ population, as 35.8 million U.S. residents identified as being of full or partial Mexican ancestry. Of that total, 11 million are illegal.

 The United States is home to the second-largest Mexican community in the world, second only to Mexico itself, and comprising more than 24% of the entire Mexican-origin population of the world.

 East Los Angeles, California, an unincorporated community of roughly 130,000, is synonymous with Mexican Americans: 97% of residents are Hispanic, and 88% of its Mexican-origin residents are immigrants.

The Dallas/Fort Worth area has the fifth-largest Mexican-American population, with over 1.5 million people of Mexican origin in the Metroplex (the third-largest foreign-born Mexican population in the US).

 San Antonio, Texas – Mexican Americans make up over half of the population in the city proper (53.2%, 705,530), the second-largest Mexican population of any city in the US.

Oakland – California’s third-largest Mexican-American city, with over 100,000 (over 25%), after Long Beach (over 130,000, about 30%).

Houston, Texas – third-largest Mexican-ancestry community in the United States – 750,000 (37%).

El Paso, Texas – largest Mexican-American community bordering a state of Mexico – 435,000 (77%).

Denver, Colorado – Colorado has the eighth-largest population of Hispanics, the seventh-highest percentage of Hispanics, the fourth-largest population of Mexican-Americans and the sixth-highest percentage of Mexican-Americans in the United States – 176,000 (35%) in Denver itself. According to the 2010 census, there are over 1 million Mexican-Americans in Colorado.

 Laotian immigration to the United States started shortly after the Vietnam War. Refugees began arriving in the U.S. after a Communist government came to power in Laos in 1975 and, by 1980, the Laotian population of the U.S. reached 47,683, according to census estimates. These numbers increased dramatically during the 1980s, so that the census estimated 147,375 people by 1990. The group continued to grow, somewhat more slowly, to 167,792 by 2000 and to 240,532 by 2008. California has the largest population (58,424, including about 12,000 in San Francisco).

 2010 American Community Survey data indicate that there are approximately 85,700 people with Somali ancestry in the US. Of those, around 25,000, or about one-third, live in Minnesota. Most Somalia-born people in the United States live in the Minneapolis-St. Paul-Bloomington area (17,320). The next largest concentrations of Somalis are in Columbus, Ohio (8,280) and Seattle-Tacoma-Bellevue (7,850).

 Between 2007 and 2011, there were approximately 151,515 Ethiopia-born residents in the United States. According to Aaron Matteo Terrazas, “if the descendants of Ethiopian-born migrants (the second generation and up) are included, the estimates range upwards of 460,000 in the United States (of which approximately 350,000 are in Washington, DC; 96,000 in Los Angeles; and 10,000 in New York).”

 Known as the “Cambodian capital of the United States”, Long Beach has 20,000 residents of Cambodian descent (4% of the city’s population). It is believed to be home to the second largest population of Cambodian immigrants outside of Southeast Asia.

The city has its own Cambodian consulate. Many of the Cambodians in Long Beach came to the United States as refugees from Democratic Kampuchea between 1975 and 1979, as well as after the 1978 invasion and occupation of Cambodia by Vietnam.

 According to the 2000 census, Chinatown was the Los Angeles neighborhood with the highest percentage of residents who were born outside the United States – 72.4%.

Koreatown and Westlake were next. In Chinatown, China (55.3%) and Mexico (12.4%) were the most common places of foreign birth.

The 2000 U.S. census counted 9,610 residents in the 0.91-square-mile Chinatown neighborhood, excluding the population of the Los Angeles County Jail complex. That made an average of 10,568 people per square mile, which included the empty Cornfield area.

The Manhattan Chinatown is one of nine Chinatown neighborhoods in New York City, as well as one of twelve in the New York metropolitan area, which contains the largest ethnic Chinese population outside of Asia, enumerating an estimated 819,527 uni-racial individuals as of 2014.

According to the San Francisco Planning Department, Chinatown is “the most densely populated urban area west of Manhattan”, with 15,000 residents living in 20 square blocks. In the 1970s, the population density in Chinatown was seven times the San Francisco average. The estimated total population in the 2000 Census was 100,574 residents.

As of the Census of 2010, there were 285,068 people in St. Paul, MN; 15.0% were Asian, largely Hmong from Laos.

The U.S. Census Bureau in 2010 estimated that there were 363,699 U.S. residents of Pakistani descent living in the United States. About 80,000 of them live in and around New York City.

There are about 90,000 Iraqis living in the United States. About 32,000 of them live in Dearborn, MI and about 21,000 of them live in San Diego, CA.

There are more than 1.75 million sub-Saharan African immigrants in the United States. About 400,000 live in New York City, Washington, DC and Atlanta, GA.

There are about 20,000 Kurds in America (native to the Fertile Crescent in Northern Iraq, Western Turkey and Northwestern Iran). Approximately 13,000 live in Nashville, TN.

[There are more than 4 million immigrant Asian-Americans in the Los Angeles and New York metropolitan areas alone – more than Hillary Clinton’s popular-vote margin in the 2016 presidential election. That fact is not lost on the PLDC.]

 Little Saigons, Little Pakistans and Little Baghdads seem like foreign lands within America’s borders. While the idea may seem quaint and helpful for the residents, in reality it prevents the assimilation and citizenship that should be the goal of any immigration policy.

 In these enclaves, immigrants can, if they choose, live out their lives without ever needing to learn to read or write English or to participate or interact in any way with Americans. These are areas where Americans fear to go because they are obviously not welcome – the only signs that you’re in America are some of the street signs – and because the culture within the limits of these enclaves is unknown and help in understanding it is not readily available.

The upshot of all of these statistics is that none of these people would be living in America if America were not the most vibrant, unified economic power in the world. But should the unexamined immigration policies of the United States not be fundamentally changed, this last refuge for many will no longer exist, because the impossibility of assimilating these immigrants will tear apart the Constitutional culture of America.

Recovery of Respect

“We are in a role reversed society. More and more [children] are defining the boundaries rather than parents [who are historically challenged with] teaching them core values, family customs, and patriotism. [In these circumstances], there is little to no discipline for poor behavior. The word “excuse” has replaced “accountability”. Rarely do you see a kid hold a door for someone, say hello, or even hang up the phone when talking to you. They EXPECT everything the world can offer them rather than see what they can offer the world. The more disrespectful people get, the quicker it spreads to others. One can only imagine how unruly this society will actually get before the trend reverses again.”

 “The family, which is the base structure of every society [except Islam where the theocratic state replaces the parents as first teachers and moral guides, relegating women, for instance, to secondary status to any male children] must begin to right their wrongs with regards to restructuring their value systems because most youths learn from the elders in their families and, if truly there has to be a positive change in society, the family must play its role as the major primary agent of socialization.

Government and other authorities, especially in leadership positions, must see themselves as role models for young persons and begin to be responsible adults [by becoming accountable for their own actions – or inactions]. Parents must stop blaming others – white people, black people, the police, their children’s teachers and coaches – for their own failures or shortcomings. They must realize that the future of tomorrow depends on the foundations laid today and youths cannot become trusted leaders if they cannot follow in trust.

Youths must encourage themselves by interacting with one another and creating social networks that can easily strengthen them when faced with discouraging attitudes about moral issues. This can bring about the institution of a strong and viable moral base founded on principles that work.”

“The national goals for the teaching of national consciousness and national unity and the inculcation of the right type of values and attitudes for the survival and success of the individual and the nation can become a reality if – and only if – the nation adopts public moral values that will be recognized as its core identity and encourage their spread among its youths by proving that this identity penetrates all facets of life and is worthwhile.

Youths are major determinants of the level of development in any society. Without youths, there can be no sustenance of society as no society can be self-sustaining without its human components of which the major workforce is the nation’s youth. Realizing this, young people ought to understand their importance and worth in society and begin to value themselves as purveyors of the fortune and progress of any society.

Youths must be committed to imbibing moral values and upholding them through life as these values will shape the ways they think, speak and even act. Moral standards can rise in contemporary societies if the zeal of youths brings about the restoration of moral values. To uphold moral values, youths must be disciplined and resist immoral behavior that will eventually culminate in loss of any moral compass. Youths can be the agents of change in their communities by accepting the challenge to live by moral codes and become “lights in perceived darkness” to show the way forward for all.”

Parents have abdicated their accountability for this moral formation for too many generations. It cannot be recovered because the younger generations have watched the hypocritical behavior of the senior generations for too long. A pattern shift – featuring sound moral behavior by the younger generations – is now required to restore the nation’s moral footing. This is not impossible, because children are born with an innate sense of fairness, of good and evil and of right and wrong. The task now is how to stop screwing them up.

“Through their formative years, the lives of America’s youth are focused on their education and whether they will attend college at some American university. American universities used to assume four goals. First, their general education core taught students how to reason inductively and imparted an aesthetic sense through acquiring a broad knowledge of the significant figures, accomplishments and events of man’s journey in general and of Western Civilization in particular. Second, campuses encouraged edgy speech and raucous expression — and exposure to all sorts of weird ideas and mostly unpopular thoughts. College talk was never envisioned as boring, politically correct megaphones echoing orthodox pieties. Third, four years of college trained students for productive careers. Implicit was the university’s assurance that its degree was a wise career investment. Finally, universities were not monopolistic price gougers. They sought affordability to allow access to a broad middle class that had neither federal subsidies nor lots of money. The American undergraduate university is now failing on all four counts.

A bachelor’s degree is no longer proof that any graduate can read critically or write effectively. National college-entrance-test scores have [on an absolute scale] generally declined over the last few [decades] and grading standards have as well. Too often, universities emulate greenhouses where fragile adults are coddled as if they were hothouse orchids. Hypersensitive students are warned about “micro-aggressions” that in the real world would be imperceptible. Apprehensive professors are sometimes supposed to offer “trigger warnings” that assume students are delicate “Victorians” who cannot handle landmark authors such as Joseph Conrad or Mark Twain. “Safe spaces” are designated areas where traumatized students can be shielded from supposedly hurtful or unwelcome language that should not exist in a just and fair world.

One might have concluded from all this doting that 21st-century American youth culture — rap lyrics, rough language, spring break indulgences, sexual promiscuity, epidemic drug usage — is not savage. Hip culture seems to assume that its 18-year-old participants are jaded sophisticated adults. Yet the university treats them as if they are preteens in need of vicarious chaperones. Universities entice potential students with all sorts of easy loan packages, hip orientations, and perks like high-tech recreation centers and upscale dorms.

But, on the backside of graduation, such bait-and-switch attention vanishes when it is time to help departing students find jobs. College often turns into a six-year experience. The unemployment rate of college graduates is at near-record levels. Universities have either failed to convince employers that English or history majors make ideal job candidates, or they have failed to ensure that such bedrock majors can, in fact, speak, write, and reason well.

The collective debt of college students and graduates is more than $1 trillion. Such loans result from astronomical tuition costs that for decades have spiked more rapidly than the rate of inflation. Today’s campuses have a higher administrator-to-student ratio than ever before. Those who actually teach are now a minority of university employees [and many of those who do teach are relatively inexperienced. Several years ago, the teaching assistants (TAs) at an elite Ivy League university went on strike for more pay. As many as 75% of the undergraduate classes had to be canceled]. Various expensive “centers” address student problems that once were considered either private matters or well beyond the limited resources of the campus.”

Finally, the observation must be made about our society in general. If people don’t respect others, they probably don’t really respect themselves either. We are then left with the probability that if we cannot respect ourselves, we cannot experience shame (an enormously complex issue discussed below) and if we cannot experience shame, we cannot function in human society where shame has been integral to our progress as an ordered civilization – so integral that it is central to the “creation story” of virtually all civilizations.

The key to respect is humility – not thinking less of yourself but thinking of yourself less – and others more. It’s not difficult but, in today’s world of celebrity, athletes thumping their chests and stars flaunting their material wealth, humility is a difficult sell.

In fact, probably since the end of World War II and the beginning of post-war America – especially with the influence of pediatrician, Dr. Benjamin Spock, author of The Common Sense Book of Baby and Child Care in 1946 (which has sold more than 60 million copies worldwide) – children have been increasingly shielded from the pain of shame by their family and friends until it has effectively lost its ability to motivate individuals to moderate or change behavior that is disruptive (at least) or destructive (at worst) to the domestic tranquility of us all.

“Shame is a painful, social emotion that can be seen as resulting ‘…from comparison of the self’s action with the self’s standards, but which may equally stem from comparison of the self’s state of being with the ideal social context’s standard.’ Thus, shame may stem from volitional action or simply self-regard; no action by the shamed being is required: simply existing is enough. Both the comparison and standards are enabled by socialization. Though usually considered an emotion, shame may also variously be considered an affect, cognition, state or condition.

The roots of the word shame are thought to derive from an older word meaning “to cover”; as such, covering oneself, literally or figuratively, is a natural expression of shame. Nineteenth-century scientist Charles Darwin, in his book The Expression of the Emotions in Man and Animals, described shame-affect as consisting of blushing, confusion of mind, downward cast eyes, slack posture, and lowered head, and he noted observations of shame-affect in human populations worldwide. He also noted the sense of warmth or heat (associated with the vasodilation of the face and skin) occurring in intense shame.

A “sense of shame” is the consciousness or awareness of shame as a state or condition. Such shame cognition may occur as a result of the experience of shame-affect or, more generally, in any situation of embarrassment, dishonor, disgrace, inadequacy, humiliation or chagrin. A condition or state of shame may also be assigned externally, by others, regardless of one’s own experience or awareness.

“To shame” generally means to actively assign or communicate a state of shame to another. Behaviors designed to “uncover” or “expose” others are sometimes used for this purpose, as are utterances like “Shame!” or “Shame on you!” Finally, to “have shame” means to maintain a sense of restraint against offending others (as with modesty, humility and deference) while to “have no shame” is to behave without such restraint (as with excessive pride or hubris). [Does sexting suggest anything?]

The location of the dividing line between the concepts of shame, guilt, and embarrassment is not fully standardized.  According to cultural anthropologist Ruth Benedict, shame is a violation of cultural or social values while guilt feelings arise from violations of one’s internal values. Thus, shame arises when one’s ‘defects’ are exposed to others, and results from the negative evaluation (whether real or imagined) of others; guilt, on the other hand, comes from one’s own negative evaluation of oneself, for instance, when one acts contrary to one’s values or idea of one’s self. (Thus, it might be possible to feel ashamed of thought or behavior that no one actually knows about [since one fears their discovery] and conversely, to feel guilty about actions that gain the approval of others.)

Psychoanalyst Helen B. Lewis argued that, “The experience of shame is directly about the self, which is the focus of evaluation. In guilt, the self is not the central object of negative evaluation, but rather the thing done is the focus.”  Similarly, Fossum and Mason say in their book Facing Shame that “While guilt is a painful feeling of regret and responsibility for one’s actions, shame is a painful feeling about oneself as a person.”

Following this line of reasoning, psychiatrist Judith Lewis Herman concludes that “Shame is an acutely self-conscious state in which the self is ‘split,’ imagining the self in the eyes of the other; by contrast, in guilt the self is unified.” Clinical psychologist Gershen Kaufman’s view of shame is derived from that of affect theory, namely that shame is one of a set of instinctual, short-duration physiological reactions to stimulation. In this view, guilt is considered to be a learned behavior consisting essentially of self-directed blame or contempt, with shame occurring consequent to such behaviors making up a part of the overall experience of guilt. [It is this definition that is so prevalent in today’s Internet-connected society.]

Here, self-blame and self-contempt mean the application, toward (a part of) one’s self, of exactly the same dynamic that blaming of, and contempt for, others represents when applied interpersonally. Kaufman saw that mechanisms such as blame or contempt may be used as a defending strategy against the experience of shame, and that someone who has a pattern of applying them to himself may well attempt to defend against a shame experience by applying self-blame or self-contempt. This, however, can lead to an internalized, self-reinforcing sequence of shame events for which Kaufman coined the term “shame spiral”. (Notice, however, that “shame spiral” or “spiral of shame” might also be used to indicate “public shaming”, i.e., the behavior of attacking somebody en masse for his or her viewpoints or particular words; this can especially refer to cyber-bullying.) Shame can also be used as a strategy when feeling guilt, in particular when there is the hope of avoiding punishment by inspiring pity.

[If any of this sounds familiar, it is because, every day, the news is filled with stories about people, many times public officials, who employ these tactics as part of their public persona. But don’t blame them – this is the world we, as a society, have created, and you, too, would be guilty of “public shaming”. As comedian Oliver Hardy would say – “A fine mess you’ve gotten us into.”]

One view of the difference between shame and embarrassment says that shame does not necessarily involve public humiliation while embarrassment does; that is, one can feel shame for an act known only to oneself, but in order to be embarrassed one’s actions must be revealed to others. In the field of ethics (moral psychology in particular), however, there is debate as to whether shame is a heteronomous emotion (one that comes from others), i.e., whether shame involves recognition on the part of the ashamed that they have been judged negatively by others.

Another view of the dividing line between shame and embarrassment holds that the difference is one of intensity. In this view, embarrassment is simply a less intense experience of shame; it is adaptive and functional. Extreme, or toxic, shame is a much more intense experience, and one that is not functional. In fact, in this view, toxic shame can be debilitating. The dividing line, then, is between functional and dysfunctional shame. This includes the idea that shame has a function or benefit for the [individual].

[Philosopher] Immanuel Kant and his followers held that shame is heteronomous; Bernard Williams and others have argued that shame can be autonomous (it comes from oneself). Shame may carry the connotation of a response to something that is morally wrong, whereas embarrassment is the response to something that is morally neutral but socially unacceptable. Another view of shame and embarrassment says that the two emotions lie on a continuum and differ only in intensity. Simply put: a person who feels guilt is saying “I did something bad,” while someone who feels shame is saying “I am bad.” There is a big difference between the two.

Gershen Kaufman summed up many of the consequences of shame in one paragraph of his book on the psychology of shame:

“…shame is important because no other affect is more disturbing to the self, none more central for the sense of identity. In the context of normal development, shame is the source of low self-esteem, diminished self-image, poor self-concept, and deficient body-image. Shame itself produces self-doubt and disrupts both security and confidence. It can become an impediment to the experience of belonging and to shared intimacy…. It is the experiential ground from which conscience and identity inevitably evolve. In the context of pathological development, shame is central to the emergence of alienation, loneliness, inferiority and perfectionism. It plays a central role in many psychological disorders as well, including depression, paranoia, addiction, and borderline conditions. Sexual disorders and many eating disorders are largely disorders of shame. Both physical abuse and sexual abuse also significantly involve shame.”

Also, “…shame has been found to be a very strong predictor of Post-traumatic Stress Disorder…”.

Shame has not been uncommon in everyday life in Western Civilization, but its presence has diminished over the centuries and decades. Where the Catholic Church in the Middle Ages often used shame to coerce behavior from believers or to force hesitant people to join the Church, today it is more common to see shame in the senses described below.

·         The belief that one is, or is perceived by others to be, inferior or unworthy of affection or respect because of one’s actions, thoughts, circumstances, or experiences: “I felt shame for having dropped out of school.”

·         Respect for propriety or morality: “Have you no shame?”

·         A condition of disgrace or dishonor; ignominy: “It was an act that brought shame on the whole family.”

·         A regrettable or unfortunate situation: “It was a shame how the place had fallen apart, with tall scorched grass and sagging gutters.” (Tom Drury)

·         One that brings dishonor, disgrace, or condemnation: “I would forget the shames that you have stained me with.” (Shakespeare)

It has been suggested that narcissism in adults is related to defenses against shame, and that narcissistic personality disorder is connected to shame as well [although untreated early childhood traumas have also been posited as probable sources].

“The joy-shame connection has its roots in the early mother-infant relationship and is best understood in terms of affect theory. We come into this world primed for joyful interaction with our caretakers. Enjoyment-joy is one of the nine genetically built-in affects; when it is interrupted – say, when our joyful interest in mother is met with indifference or worse – shame affect is the result. If babies have that early experience again and again, when attachment between infant and mother goes badly awry, core shame is the result and the complex of emotions around enjoyment-joy shuts down. In other words, the potential experience of joy threatens the emergence of shame and must be avoided at all costs. Heavy defenses against shame thus limit the opportunity to experience joy in life.”

Shame is considered one aspect of socialization in all societies. According to the anthropologist Ruth Benedict, cultures may be classified by their emphasis on the use of either shame or guilt to regulate the social activities of individuals. Shared opinions and expected behaviors, and the potential associated feelings of shame, have in any case proven effective in guiding the behavior of a group or society.

Shame may be used by those people who commit relational aggression, and may occur in the workplace as a form of overt social control or aggression. Shaming is also a central feature of punishment, shunning and ostracism. In this sense, “the real purpose of shaming is not to punish crimes but to create the kind of people who don’t commit them”. In addition, shame is often seen in victims of child neglect and child abuse.

With the advent of the Internet, a new form of shame was created – the weaponization of shame. Known in the literature as a shame campaign, it is a tactic in which particular individuals are singled out because of their behavior or suspected offenses against the self-styled culture of the perpetrators of the digital assault, often by marking them publicly online in social media, much as Hester Prynne was marked in Nathaniel Hawthorne’s colonial American classic The Scarlet Letter. Public humiliation, historically expressed by confinement in the stocks, is another example.

The difference is that now literally millions of people can, and do, participate in the public humiliation. As more people participate in this behavior, more and more become inured to the effects of shame, become shameless, and so become unable to function in a diverse society longing for tranquility. These cowards become dangerous cultural disruptors, as the increasing number of juvenile suicides resulting from online shaming, sometimes called cyber-bullying, graphically demonstrates.

Monica Lewinsky became infamous during the Clinton presidency. As a survivor, she has written:

“I know I’m not alone when it comes to public humiliation. No one, it seems, can escape the unforgiving gaze of the Internet, where gossip, half-truths, and lies take root and fester. We have created, to borrow a term from historian Nicolaus Mills, a “culture of humiliation” that not only encourages and revels in Schadenfreude but also rewards those who humiliate others, from the ranks of the paparazzi to the gossip bloggers, the late-night comedians, and the Web “entrepreneurs” who profit from clandestine videos.

Yes, we’re all connected now. We can tweet a revolution in the streets or chronicle achievements large and small. But we’re also caught in a feedback loop of defame and shame, one in which we have become both perps and victims. We may not have become a crueler society—although it sure feels as if we have—but the Internet has seismically shifted the tone of our interactions. The ease, the speed, and the distance that our electronic devices afford us can also make us colder, more glib, and less concerned about the consequences of our pranks and prejudice. Having lived humiliation in the most intimate possible way, I marvel at how willingly we have all signed on to this new way of being.”

So, shame has evolved from a useful tool of a loving society, used to encourage citizens to contribute positively to the domestic tranquility, to become a weapon wielded by self-appointed, hate-filled – frequently anonymous, absolutely unaccountable – censors, who view their role in society as that of avenger, or Inquisitor, of whatever they deem unworthy on any particular day.

Commenting on a passage from Rousseau’s Emile, one philosopher argues,

“…our insecurity is inseparable from our sociability, and both from our propensity to emotional attachment; if we think of ourselves as like the self-sufficient gods, we fail to understand the ties that join us to our fellow humans. Nor is that lack of understanding innocent. It engenders a harmful perversion of the social, as people who believe themselves above the vicissitudes of life treat other people in ways that inflict, through hierarchy, miseries that they culpably fail to comprehend.” Rousseau asks, “Why are kings without pity for their subjects? It is because they count on never being human beings.”

Emotions of compassion, grief, fear, anger and shame are in that sense essential and valuable reminders of our common humanity. Unless we can recover them and embrace them as a critical part of who we are as people and as Americans, we will not be able to recover respect in our culture and, without respect for ourselves, for others and for our nation’s institutions, we will be unable to recover America.

This is not just an issue for the white, heterosexual, Christian, male-centric part of society that most social commentators blame for America’s ills. Consider a modern conundrum: the LGBTQ community demands respect for its lifestyle and condemns disrespect. At the same time, the LGBTQ community condemns respect for Islam because of that group’s belief that LGBTQs should be killed for their lifestyle.

The loss, not the lack, of respect by all, for all, in our society is the primary cause of the decline of our own beloved nation, both in our own eyes and in the eyes of the world. It has infected, not just effected and affected, every segment of society and prevents any solution to the obvious problems that keep all of us from attaining the American Dream.

The ultimate cure for this disease must be comprehensive and universal. Just treating the symptoms will not do. It must emanate from a national desire and consensus that each and every one of us, from the age of consent unto death, must be singularly and collectively responsible to ourselves, our loved ones and our fellow citizens for the treatment of this disease.

The cure must involve: the reconstitution of the nuclear family; the rededication of our public schools to the education and socialization, not the indoctrination, of our children; the resolution of those placed in positions of authority to work for, and set the example for, the betterment of the People and not of themselves; the rediscovery by the press, the media and the infotainment industry as a whole that the unadulterated and whole truth is the coin of their realm; the realization by judges, prosecutors and all officers of the courts that all laws, rules and regulations must be applied equally to all citizens and all other persons within their jurisdiction; and the recognition by those who consider themselves to be the outsiders in society that their goal must be to accept the invitation of the Constitution to join the American parade in every respect, and not to seek to alter its route by forcing their fellow citizens to do or accept things that they consider offensive to their Constitution, their life experience, their belief system or their personal moral and ethical codes.

Absent these processes, this nation will die because, in our Constitutional republic – the world’s first – all citizens have an inalienable right to life, liberty, the pursuit of their dreams, the freedom of speech, the freedom to practice any legally recognized religion, access to a free and honest press, the freedom to assemble peaceably, the freedom to petition their governments for a redress of grievances without retribution, the unfettered ability to keep and bear arms, and respect – from all – in their persons and in their personal beliefs.

Next: Citizenship v. Culture.

The Rejection of Respect

“We tend to blame the younger generation for these crude behaviors, but the truth is that the situation is degrading all ages and levels of society [refer back to the “rap industry” which spans both coasts and involves all levels of society in its manufacture and distribution]. It is commonplace to see couples openly insulting each other in public and treating each other with absolutely no common courtesy (a sliding scale which leads directly to physical and verbal abuse).

Just as unfortunate, and equally common, is disrespectful and dishonest treatment between colleagues in the business world, who fall back on tricks, half-truths and crude vocabulary to make ends meet. And then, to add insult to injury, these issues are left to be resolved by enormous and costly governmental programs that can do nothing in the face of this irreversible deterioration of personal relationships without the involvement and commitment of everyday people in their everyday lives.

The lessons of courtesy and good manners taught by parents at home, or perhaps by teachers at school, are too often forgotten and increasingly absent in mainstream education. This phenomenon is a byproduct of an absurd social model that certain politicians attempt to impose, where manners must be re-taught to adults by companies that offer courses in protocol and courtesy to professionals in business environments. But we mustn’t be fooled: to be successful in this world, we have to make good use of good manners and courtesy [in our personal and private interactions] from the beginning; it won’t do to call a possible partner “dude,” say “Hey, babe” to the future mother of your children, or “No way will I take that, you jerk” to your friend.

The deterioration of verbal communication is an evident and alarming symptom of the absence of good manners. The most elemental level of any society is personal relationships. Positive relationships are built on courtesy, and a culture that has no regard for polite speech is on the path to swift decline [with society to soon follow]. Our tolerance for rude and discourteous behavior seems infinite, and no relationship is immune to the effects of disrespectful conduct and coarse treatment, whether we’re dealing with insults from other drivers, curt and disagreeable treatment by customer service reps, or the rude and aggressive attitudes of people on the street or at work. These behaviors contribute to the lowering of our standards for polite behavior, standards that concern and affect all of us, and we tolerate or justify this inappropriate and unacceptable behavior because [try as we might, if we are the only ones trying, we will never overcome the masses of people that just don’t care.]”

We have arrived at the point in our culture where the nation’s youth take their social cues not from parents and teachers, but from pop music, including rap (discussed above), and from light-hearted fare on television and streaming on their personal computers, laptops and iPhones. These productions – save for the Disney Channel and Nickelodeon, which are aimed at pre-teens and pre-pubescent teens – almost universally feature young characters behaving in rude, crude and unacceptable ways, insulting their parents, other adults and their friends without any consequences at all except an inane and raucous “laugh track” – an artificial compilation of crowd laughter inserted in these shows where the directors believe laughter should be heard – and probably would be, if the dialogue or action were actually funny.

“So, while manners may seem unimportant, they’re really vital, because discourteous and vulgar conduct tarnishes the dignity of people and of the society that allows them. Good manners, like any learned behavior, require practice and effort. Certain social and governmental models that [many Americans] favor would have people believe that effort and hard work aren’t necessary for success and material comforts. The best example – the “War on Poverty” and Lyndon Johnson’s “Great Society”. As we have seen, this kind of government program promises certain citizens the world for their votes, and creates generations of citizens who think they have a right to “free” healthcare, a certain type of housing, and no obligation to work or make any effort for any of these things, without any consideration for the citizens who work and pay for these “free” programs. In this type of system, overachievers are looked down upon as show-offs, and formal manners and courteous treatment are often considered unnecessary artifices, or crude behavior and discourtesy are favored in order to mock the achievers.

Society also needs a means by which citizens can indicate the values they place on the goods and services being offered independent of the marketplace. In a free society, the marketplace and the laws of supply and demand largely determine the financial rewards earned by producers. But this mechanism isn’t sufficient. The market alone often doesn’t reflect the intangible values offered by different goods and services. Some of the goods and services most critical to the functioning of a modern society are those least able to justify an attractive level of return solely on the basis of their direct market value.

Artificial interventions in the marketplace, like our previous efforts at prohibiting alcohol and our current efforts toward prohibiting [addictive] drugs can also cause massive distortions, causing the marketplace to deliver the greatest financial rewards to the producers of those goods and services claimed to be of the least real value to society. Even an entirely free market fails to comprehensively reflect the collective values of society [that manifests itself in respect].

For instance, throughout most of history the earnings of prostitutes have exceeded those of teachers. While both professions can be argued to be providing valuable services to their customers – and, in the minds of some, the differential rewards provided by the market are entirely valid – most societies have made some attempt to encourage one over the other. The granting or withholding of respect has historically provided this very useful mechanism. Poor teachers were respected citizens while rich prostitutes were not. The desire to be respectable citizens provided sufficient encouragement for many individuals to choose teaching over prostitution.

But, in our current obsession with political correctness we have disturbed this most valuable function of society. Where respect previously was something of value that an individual had to earn through their principles, abilities and deeds, we’re now told that respect has been decreed to be yet another entitlement. Where respect previously served as a badge of honor and measure of the regard of one’s peers, we’re now instructed that the rituals of respect must be most carefully observed when interacting with those we might privately consider least worthy, lest one be accused of the modern equivalent of the “Scarlet Letter” – “the micro-aggression”.

Society as a whole is suffering significant side effects from the forced distortion of our natural inclinations toward conditional respect. By degrading the value of respect, we’ve also effectively degraded the value of those related socially useful concepts that respect formerly rewarded – concepts like honor and personal integrity. Why bother with such inconvenient restraints when society is prohibited from showing its displeasure by withholding its respect?

Having compromised society’s more subtle means of encouraging reasonable accommodation of the accepted norms of behavior in our day-to-day affairs, we’re obliged to resort to more overtly invasive means of accomplishing this necessary function. Where the practice of conditional respect once provided a significant and relatively automatic moderating influence, we, as a society, must now increasingly turn to heavy-handed legislative “solutions” to deal with ever more minor aspects of everyday life.

Just as our concept of respect itself is multidimensional, the side effects of eliminating conditional respect are similarly not limited to the single dimension of moderating person-to-person interactions. The “respectable” professions formerly drew adequate numbers of high quality individuals who were motivated more by their dedication and desire for the respect of society than by the limited salaries. Now that conditional respect has been removed from the equation, society must pay ever larger piles of hard cash to convince ever lower qualified individuals to perform these most necessary jobs.

As an example, consider the well-documented decline of our educational system. In the past, our highly respected but underpaid teachers provided America with the finest educational system in the world. However, in order to avoid offending those undeserving of our respect, we’ve greatly cheapened those extremely valuable intangible aspects of the job that had formerly attracted the best applicants. As a result, we must now attempt to compensate for this lost value with financial rewards and other concessions, such as shorter hours and longer holidays, that compromise the education of the students.

While admittedly there remain a few exceptional teachers who continue to struggle to educate their charges in spite of the ever-increasing obstacles, the general failure of the education system indicates that they have become the minority. The teachers we fail to properly respect today put in ever-fewer hours failing to teach ever-smaller classes of ever-more violent and out-of-control students. In order to convince ever-less qualified individuals to occupy this ever-less attractive position solely for the money and perks, we now pay teachers average salaries double the national average earned by the parents of those functionally illiterate children that graduate, unprepared for any meaningful role in life, from our now politically correct but functionally crippled schools.

Is the feel-good effort to expand the distribution of the abstract aspects of life, in the way we have so successfully expanded access to material goods, worth the huge costs and side effects? Is the ever-increasing intrusive regulation of the most minute details of society by government worth the illusory benefits of forcing society to show respect for those most worthy of our disrespect? Allowing those with only a superficial understanding of how society really works to dismantle its mechanisms, based solely on uninformed and misguided emotional reactions to perceived inequities [think Occupy Wall Street, Black Lives Matter and Not My President], invites disaster. The principles and functions that have proven effective in the past sometimes remain the best alternative after all.

In today’s society, with our emphasis on building self-esteem and self-confidence in ourselves and our children, we have forgotten the importance of self-respect. We often hear the words interchanged, as if they mean the same thing. The modern misuse of the word self-respect has watered down not only its original meaning but also its value in our lives. Going back to older English usage, the definition of self-respect was “proper regard for the dignity of one’s person” or “to hold in honor”. It comes from within and is not reliant upon [elements] outside of ourselves, such as physical appearance, public image, wealth, social status, praise, accomplishments, awards or achievements. Rather than being something you “build” or “earn”, self-respect comes from no other reason than the fact that you have the right to dignity because you are a human being. It means to honor yourself as a person regardless of your life circumstances. Knowing this important difference will affect the choices you make and therefore the quality of your life.

When we think of the people we truly admire now and in history, it’s because of their self-respect and not their outside achievements. We can all name more than one celebrity who had “it all” – talent, awards, money and fame, only to have it all come crashing down. Their self-esteem and self-confidence might have brought them to the top but their lack of self-respect brought them down.

Young children build their sense of self-respect from their interactions with others. When they are made to feel special and valued, children grow to respect themselves. A positive sense of one’s self allows the maturing child to respect others. Self-respect is at the heart of respecting others. Children need to learn to nurture their self-respect rather than become dependent upon constant praise and attention from outside in order to be happy. More important, all adults need to learn to protect, support and encourage self-respect in all of our children. It is a civic responsibility.

People who have self-respect: 

  1. Know who they are and never apologize for who they are.
  2. Like and excel at being who they are.
  3. Won’t settle for relationships that are not good for them.
  4. Don’t concern themselves with how others perceive them.
  5. Are able to say ‘no’ to things that don’t suit them.
  6. Won’t compromise their values to ‘fit in’.
  7. Take responsibility for themselves.
  8. Are honest with themselves.
  9. Know their time is valuable.
  10. Never give up on themselves.

 The great thing about self-respect is that no one can take it away from you. It also frees you from the expectations of others. You are not swayed by another person’s opinion of you, the next fad or the latest diet. Instead of always seeking, you really know that who you are is enough. You will want to learn and have new experiences, but not to change who you are or to please others. Nurture your self-respect. Take quiet time alone in nature, meditation or prayer. Learn to become comfortable in your own skin and honor the fact that what makes you unique is enough.

As a society, we appear to have lost the instinct for kindness and the willingness to extend the hand of friendship. Our responses to children, to older people, to strangers, are all conditioned by a concern not to offend and a fear of getting involved. The social evils of today highlight a real concern about the way in which society increasingly values people for their economic contribution, at the expense of kindness and compassion. Some blame the nature of regulation – while providing protection for some, it seems to have intimidated the majority. Others feel there has been a general decline in values: individual advancement is seen as more significant than the ability to care for others.

Whatever the reasons, we are uncomfortable with the society we have created. The idea of the common good has been lost and we are experiencing a severe social recession – the effects of which are far more devastating and long-lasting than any economic recession.

“Real respect is something that is earned. One earns another’s respect by voluntarily doing the things mentioned above, such as taking that person’s feelings, needs and thoughts into consideration. Respect seems to be like a boomerang in the sense that you must send it out before it will come back to you. Respect cannot be demanded or forced, though sometimes people mistakenly believe that it can. Since a baby has no concept of respect, and feels only its own needs when born, the only successful way to teach a child what respect is, is to earn the respect of the child as they slowly grow into a thinking human being.

The way this is done is first of all by attending to the child’s natural needs, such as to be fed and nurtured. As the child grows, his needs change. He has increasingly sophisticated psychological needs. He begins to express his own views, his own preferences, and he has an increasing need for freedom, autonomy and independence. This is when the adults in his life can treat him with increasing respect and thereby earn his respect in return.

It doesn’t make sense to think of respecting a baby in the same way that we say we respect an adult. Yet on some level the two concepts are similar. This similarity has to do with our voluntarily helping that person with their needs. In either case, we must first accept the needs. For example, if a baby needs to be fed at three in the morning we don’t do it begrudgingly if we respect his natural needs; we simply accept that the infant has a natural need to eat at that particular moment. Likewise, if an adolescent or an adult need to talk, we accept this need and we show respect by listening voluntarily.”

As soon as a parent begins to believe that these needs are burdens – respect begins to deteriorate and self-respect soon follows. An interesting common denominator among successful African-American professional athletes is that the great majority were raised by their single-parent mothers. Those mothers – truly angels-among-us – usually worked multiple jobs, made sure their children did their school-work and got their children to their practices and games. What followed was unimaginable economic success!

Unfortunately, only a few hundred of the millions of young African-Americans per year get the chance to be professional athletes. The example, however, has been set. The problem is showing the millions of single-parent mothers of all races and ethnicities how to respect their children as well. Respecting someone means respecting their feelings and their survival needs.

Those in positions of authority too often expect and try to demand that those beneath them show ‘respect.’ But if they have not first earned respect by showing it (which is done by respecting the other person’s feelings and needs), they may find that their power is actually based on fear. Once a person no longer fears such an authority figure, then the authority figure’s power base quickly disappears out from under them, often leaving them feeling frustrated, powerless, confused and resentful.

A New York City gang member was asked why he carried a gun. He replied: “Before I had this gun, I didn’t get no respect [read: fear]. Now I do.” Similarly, teachers and parents often believe that if a child obeys them, or says “Yes, Sir / No, Sir,” it means the child respects them. Several teachers have said they felt more respected when there was more ‘discipline’ in the classrooms. When one probes deeper, without fail they make it clear that they are talking about a time when there was more use of corporal punishment in school, and thus more fear of physical pain for disobedience.

There is a danger in mislabeling fear as respect. To use an analogy, consider what would happen if two jars in the medicine cabinet were mislabeled. What if poison ivy lotion were labeled as cough syrup, or chlorine as contact lens cleaner?

Here are some comparisons between fear and respect:

  • Fear is toxic. Respect is nurturing.
  • Fear destroys self-confidence. Respect builds it.
  • Fear is life-threatening. Respect is life-enhancing.
  • Fear is forced. Respect is earned.
  • Fear is learned. Respect is earned.

When we do not feel respected by our parents while we are living with them, we have an unmet need to feel respected later in life. This is such an obvious statement, yet it needs to be said. It is one of the clearest examples of what happens when our emotional needs are not filled in the right amounts at the right time by our parents. People who did not feel respected by their parents tend to take things personally later in life. They may make a big “scene” over something which to other people would seem small. They do this because they are in pain from the lack of respect which they are still feeling, one which originated many years earlier, but likely was not allowed to be expressed.

They may demand to be respected by their employees, their children, their students and the sales clerks in the supermarket. They may seek positions of power where they have authority over others as a way of trying to fill their unmet need for respect. But when they are in positions of authority it is easy for them to confuse respect and fear. When they are feared, they are not respected. When they try to use authority and fear as a substitute they find that they still feel unfulfilled, since you can never get enough of a substitute. On the other hand, another consequence might be that they have such low self-esteem that they never feel worthy of respect. In this case, they will let people take advantage of them, abuse them and manipulate them.

Outside of the home, teachers are one of the first representatives of authority in society. If they earn the respect of their students, the students are likely to respect others in positions of authority and society will tend to function a bit more smoothly. Teacher training programs, in a typical university for example, do not show future teachers how to earn the respect and cooperation of the students. They are then significantly unprepared when they reach the classrooms.

 If a person begins studying to become a teacher with the belief that teachers should be respected or obeyed just because of their position as a teacher, it will be very hard to change this belief. What might be needed, then, is some way of filtering prospective teachers based on their beliefs. While this idea may make some people feel uncomfortable, the reality is that a person’s beliefs do significantly affect their attitudes, and attitudes affect the classroom environment. Beliefs also affect a person’s ability to be taught new things, especially new ideas.

 At present, teacher training programs do not test a future teacher for their open-mindedness. Instead, their ability to adapt to the status quo is much more highly valued. Much depends, of course, on the people who design and control the teacher training curricula and the admissions and graduation processes. Their beliefs will obviously affect the system itself and the future teachers created by the system.

If a teacher or future teacher is emotionally needy, and they have an unmet emotional need to feel in control or to feel important, it will be almost impossible for them to treat students with respect regardless of their training and preparation. On the other hand, there are many teachers and future teachers who agree that respect needs to be earned, so they just need to be offered practical skills to help them learn how to do this.”

One can say with some conviction that people no longer care about how their interactions with others affect the outside world. “Today, everything is about instant gratification and pleasure. Attitudes such as: ‘I can do whatever I want to anyone, but they dare not do it back to me.’ ‘I can close the door in someone’s face, push them aside in annoyance, or publicly humiliate them’. ‘Anything that gives me a shot of pleasure is acceptable – because I am the exception’.

 This mentality is being fed by social media. People are given the perfect place to totally immerse themselves in themselves – if you wanted to, you could literally spend all day every day for the rest of your life looking at nothing but pictures of yourself, and doing nothing but talking about yourself. None of your 5,000 “friends” will say anything, because it would be hypocritical of them.

Next time: Youth and adults: roles reversed.

Respect

“Respect is an acknowledgement of the inherent worth and innate rights of the individual and the collective in society. The value of respect was cherished in the past, especially respect for youth, elders and moral values in a changing society. This is because it was assumed that elders were more knowledgeable and wiser than young persons. Respect was also given to elders as they comported themselves in manners worthy of emulation. Respect comes with honour and dignity as the individual or group of individuals are regarded as important personalities in their own rights.”

 With Mortimer Adler’s philosophical cautions about what is good and what is evil in mind, “We cannot begin to analyze what moral values are without first having an understanding of what morality and moral education are. Morality is derived from the Latin word mores which means “manners” or “morals”. Morality is “an accepted code of human conduct in a society”. Morality entails “having [accepted patterns of behavior] that will regulate dealings of men who can choose to abide by these laws because they know it is good sense to do so”. Being moral or being morally conscious means adopting standards or principles to guide one’s “actions and conduct in society”. Moral education is a “program of study which teaches the pupil about behaving in accordance with what is good while rejecting what is bad.” It is a holistic approach to stimulate character building and moral development.

Moral education should lead youths to develop from a stage of little understanding of moral principles, often characterized by pre-morality, to a stage where an individual is not forced to be moral [but] is personally convinced of the standards that ought to guide his/her conduct in society. Moral values are taught in moral education as certain acceptable, valuable and cherished qualities that are worthwhile in developing a sound character. Values are codes of behavior “considered worthwhile, desirable, right and good and thus craved for and applied daily to enhance existence by the people”. Values determine people’s identity and cultural continuity. Moral values are essential values that determine individuals’ “perception of morality and moral consciousness in society.”

 “Moral values include truthfulness, patience, obedience, honesty, integrity, hard work, responsibility, respect, tolerance, loyalty, public spiritedness, freedom, respect for human life and dignity of persons. Others include justice, fairness and equality. Moral values are taught so as to be [internalized] by members of the society to enhance character development and promote good moral upbringing and moral health in individuals. “Moral health is manifested in individuals when a person becomes capable of understanding the principles of moral conduct and is committed to behaving morally in his dealings with others”. Principles of moral conduct can only be effectively understood and practiced when moral values are strongly adhered to by individuals in society. Societies cannot legislate ethics and morals but, can certainly legislate restrictions upon unethical and immoral behaviors.

 The most fundamental respect is “respect for human life and the dignity of all persons. Human life has always been sacred or sacrosanct [in human society]. Life has always been important and the dignity of persons has been pursued because the origin of life has been linked with “gods” or “deities”. Human belief in the ultimate power of “immortal gods” has caused men to have great respect for human life and regard persons with dignity. Traditional societies promoted the security of life because of their beliefs in its sacredness. Before life was taken, it had to be a necessary sacrifice after a series of interventions of the people for adequate cleansing.

 Traditional societies … believed in character development and functionalism of members and they pursued this cause with utmost sincerity. Most of their determination to pursue just causes arose from their religious beliefs in the supremacy of their “gods” and the punishment(s) that could arise as a result of their disobedience of these “gods”. In the words of modern social-scientists, “… the yardstick for measuring an educated man is his morals, manners, obedience to authority and respect for the customs, conventions, superstitions and laws of the land”.

As American citizens expect that their personal human and Constitutional rights be respected, so too must they accept the responsibility to respect the rights of others.

 How has the decline of American mores affected how citizens respect both rights and responsibilities? Each of the core components and institutions of society discussed above that are under attack in America is, at a fundamental level, dependent upon the mutual respect of her citizens. “Respect is important because it shows that one values another as an individual, and that he honors the personal rights and dignity of the person as a fellow human being. People who are disrespectful – even to the unrespectable – often have few friends and alliances, and others do not enjoy being near them.

 Because all of this [now] happens in the spotlight of the interconnected digital world, virtually everyone is cognizant of the disemboweled standards of behavior celebrated, or at least not condemned, in the 24-hour news cycle [and the continuous social-media cycle].

 What has come to be known as “political correctness” started with the reasonably defensible idea of discouraging gratuitous disrespect based on sex, race, or religion. Finding general acceptance for their initial attack on the racial insult considered most offensive to blacks, the advocates of the new political correctness expanded their campaign to include nearly any derogatory term that doesn’t refer to white men. But in the process of demonizing what has come to be called hate speech, the advocates of political correctness didn’t stop at just eliminating disrespect. Somehow the concept was distorted to equate a lack of positive respect with disrespect. No gradations or conditions were allowed. It was all or nothing. It was the “thought police”!

 Like medieval mathematicians trying to deny the concept of the number zero, the overzealous advocates of political correctness have sought to deny that a neutral position showing no disrespect nor unearned positive respect is entirely valid. But in mandating positive respect as a basic entitlement, [they have] compromised the core meaning and purpose of [something less than an absolute unconditional endorsement – known as] conditional respect as a tool of society.

It would hardly be appropriate use of society’s resources to attempt to address such issues as minor rudeness or neglected hygiene with the criminal justice system, and yet society needs some way to encourage preferred patterns of behavior. Conditional respect has always supplied that function. While it is entirely possible to live without the respect of one’s peers, the natural desire for respect does provide a potentially powerful non-coercive, nonviolent influence on most of us. Societies have always granted or withheld respect from individuals for real or perceived minor deviations as a way of encouraging compliance with mainstream society’s expectations.

Now, consider how political correctness has artificially distorted our concept of conditional respect. In the past, at least in theory, an individual earned the respect of his peers by his principles, behavior and accomplishments. Of course, we don’t live in a perfect world and there have always been distortions in society’s implementation of respect. Too often the concept of respect has been twisted into a function of hereditary titles, improperly acquired wealth, skin color, or other invalid criteria. On a superficial level that primarily highlights the aberrations of conditional respect, it’s an almost unavoidable step to wanting to force society to show respect for those it has previously and unfairly denied that respect. What is overlooked – in the emotionally attractive desire to share society’s “wealth” of respect in the same manner as we attempt to share material wealth – is that respect plays a far more complex role in society [than material wealth].

Gaining the respect and positive regard of those around us appears to be the natural desire of most people. We start out life seeking the approval of our parents. This need for approval is so strong that some schools of thought in child rearing hold that when properly managed it is sufficient to provide all needed control and motivation. While there is dissension on this point, it is generally agreed that the need for parental approval is a powerful influence in early life. As we grow up, the desire for the respect of our peers becomes for many an even more powerful influence on their views and behaviors. Much of the wasteful mortality of the teenage years results from misguided attempts to gain the respect of peers by pushing the limits of the rebellion that is such a hallmark of that period of life.

Man’s obsession with respect has always been a primary component of our exceptionally unhealthy attraction to warfare. The demand for respect at a national level is well documented as a primary cause of nations’ initiating war [along with self-defense]. However, while patriotic slogans and nationalistic pride may be effective in convincing otherwise rational individuals to put on a uniform and take up arms, these abstract concepts lose some of their focus in the terrible immediacy of combat. At the point where warm flesh meets cold steel, men fight more for the respect of their fellow soldiers than for their ideals. While it may have been belief in ideals that got them into the trenches in the first place, it is the fear of losing the respect of their fellow soldiers that provides the powerful motivation needed to convince those otherwise rational men to climb out of their relatively safe trenches, and charge headlong into the terrible carnage of the enemy’s machine-guns.

The desire for respect can be so powerful that those who have been denied the respect of mainstream society often turn to fear as a means of forcing at least the illusion of respect. One of the major attractions to gang culture is its promise to provide the appearance of respect to those unable or unwilling to earn real respect. Real or imagined offenses involving the arcane rituals of respect in the violent world of gangsterism are more often the cause for murder among gang members than financial issues. To gang members, even the shallow contrived illusion of respect that results from fear is preferable to the disrespect their failures earn them from society at large. [This type of forced respect is not so different from the artificial respect demanded by political correctness.]

Since our desire for respect is such a powerful influence, it’s hardly surprising that it has become both an integral factor in society and a focus of attention by those attempting to implement ever broader egalitarianism. Human societies tend to involve individuals interacting across a broad range of levels and purposes. We attempt to address the inappropriate aberrations at the extremes of the range through the sanctions offered by the criminal justice system. But a functional society also involves interactions at levels where individual events fall below the threshold for involvement of enforcement agencies, but which in large measure define the quality of life of the citizens.

With [civilization modernizing at a dizzying pace] came a lot of changes – both positive and negative. One negative change that came with “the dawn of the post-war era” was [an increasing] moral decadence and laxity. Society has witnessed a fall in moral standards and an increased interest in pleasure and enjoyment as opposed to more serious things. Moral decadence has resulted in [disorderliness] at all levels in the society and its resultant effects are seen in a lackadaisical attitude toward [the dignity of] work; our readiness to lie, cheat and embezzle; a lack of dignity and respect for human life; and the monster of corruption.

 Values that are held and pursued today [especially by political and business leaders, and celebrities leading largely unexamined lives] include: dishonesty, disrespect, intolerance and lack of cooperation, exclusively-profit-oriented relationships, profanity of life and abuse of human dignity, loss of pride in hard work and an increased interest in the pursuit of injustice and other crimes; all in a bid to acquire wealth and/or power by adopting the philosophy that “the end justifies the means”. Modern societies are experiencing the wave of corruption driven by the “get rich [or powerful] quick syndrome”.

 Public goods and resources are audaciously stolen by individuals who are in positions of authority in a bid to acquire wealth for themselves and secure the future of their families, caring less about the pain and burdens to be borne by other members of the society as a consequence of their actions. Today, people take pride in telling lies and celebrating those that do likewise, engaging in ungodly practices and embellishment of various criminal acts. Integrity is lacking in the interactions of men with one another and flagrant abuse of the laws and of human rights is the order of the day.”

 For contrast, the students at the nation’s military academies live by a different code – called the “Honor Code”. It demands that students “… shall never lie, cheat or steal nor tolerate anyone who does.” How refreshing.

 Authority [it must be understood] is the legitimate or [widely] socially approved use of power. It is the legitimate power which one person or a group [exercises] over another. The element of legitimacy is vital to the notion of authority and is the main means by which authority is distinguished from the more general concept of power.

 Power can be exerted by the use of force or violence. Authority, by contrast, depends on the acceptance [respect] by [peers – as in “the People” or] subordinates of the rights of those above them [in an institution or position] to give them orders or directives. This [respect] is accomplished in a myriad of ways.”

 Sociologist “Max Weber divided legitimate authority into three types:

  • The first type discussed by Weber is Rational-legal authority. It is that form of authority which depends for its legitimacy on formal rules and established laws of the state, which are usually written down and are often very complex. The power of rational-legal authority is set out in the Constitution. Modern societies depend on rational-legal authority. Government officials wield this type of authority in most [modern] countries of the world. Bureaucracies are the result of this type of authority.
  • The second type of authority is Traditional authority, which derives from long-established customs, habits and social structures. When power passes from one generation to another, it is known as traditional authority. The right of hereditary monarchs to rule furnishes an obvious example. The Tudor dynasty in England and the ruling families of Mewar, in Rajasthan (India) are some examples of traditional authority. [Some might say that the father being described as the “head of the household” might be construed as traditional authority in patriarchal societies.]
  • The third form of authority is Charismatic authority. Here, the charisma of the individual or the leader plays an important role. Charismatic authority is that authority which is derived from “the gift of grace” or when the leader claims that his authority is derived from a “higher power” (e.g. God or natural law or rights) or “inspiration”, that is superior to both the validity of traditional and rational-legal authority and followers accept this and are willing to follow this higher or inspired authority, in the place of the authority that they have hitherto been following.” [Joan of Arc is a great historical example. The best examples in post-war America are Dr. Martin Luther King and the Reverend Billy Graham.]

There is a fine line between the charismatic and the charlatan that, in the modern world, has blurred to the point of absurdity as the most fundamental characteristic of a self-styled charismatic is physical attractiveness. This is most blatantly evident in the world of marketing. No unattractive people need apply as, apparently, attractiveness equates to authority. In fact, most marketing experts agree that Abraham Lincoln could never be elected President of the United States in 21st Century America.

“George Bernard Shaw wrote “…the reasonable man adapts himself to the world; the unreasonable man persists in trying to adapt the world to himself; therefore, all progress depends on the unreasonable man”. [For instance,] youths of today can make a difference in their societies by standing out in the crowd and upholding moral values in a morally bankrupt society.” [What a sad statement.]

 Manners and courtesy are aspects of modern societies that are experiencing serious deterioration, and we are doing nothing to remedy this problem; traditional values in social relations are being erased by new and so-called “modern” behaviors that are, in reality, inconsiderate and often coarse. Bad manners have thus been converted into a growing problem that affects all levels of society: family, work, friendships, business, and politics, not to mention their negative effect on romantic and personal relationships in general.

Bad manners and discourteousness increase when we leave behind basic standards of polite behavior in favor of rude and disrespectful treatment [of others – something we would object to if we were the target]. These bad [behaviors which] result in rude conduct tend to be worse in areas with socialist tendencies [since socialism thrives on the desire to “normalize” the “other” person by demonizing them, which then justifies taking from those that have and giving to those that want] but are not limited to such environments, and represent a worrisome societal model that devalues all of us: the elderly, women, as well as those who are considered different because of race or physical aspects, etc.

The absence of courtesy and good manners creates societies where [certain] individuals [aren’t allowed] personal dignity and subjects them to an environment where crude behavior and inappropriate conduct are considered “normal” – even encouraged.” [Want evidence? The Trump campaign thrived on rudeness. After his inauguration, he has been subjected to non-stop rudeness from the PLDC. He certainly earned it but its cost to the very operation of government is troubling. Want another example? Consider the world of “rap” and its effect on the African-American youth in the urban core.]

Consider in particular, rap “lyrics”. Rap “lyrics” are words set to an “urban” beat allegedly evoking life on the streets of America’s major urban centers. Rap stars are emulated by youth of all backgrounds – privileged and poor, especially black but also Latino and white, male and female – and are celebrated in society with frequent, well-marketed public appearances, major motion pictures and uncritical news coverage. Some are extremely successful producers of rap music in their own right – celebrated business tycoons whose fortunes were made off the debasement of police, women and white people and the celebration of drug dealers.

The lyrics to their recordings, however, evoke the most ruthless, misogynist, anti-police, drug-centered and violent view of modern America imaginable and yet, they are celebrated by America’s charismatic leaders. The decline in respect among Americans for Americans can be directly related back – among other factors that have been discussed above – to this phenomenon which has engulfed America’s inner-city youth and their suburban wannabes – America’s next generation of leaders. Here are some examples – by no means the worst:

“I’m ’bout to dust some cops off/cop killer, better you than me/cop killer, fawk police brutality” – Ice-T;

 “A young n*gga on the warpath/And when I’m finished, it’s gonna’ be a bloodbath/Of cops, dying in L.A.” – Ice Cube;

 “Put molly (a brain-stupefying drug) all in her champagne/She ain’t even know it/ I took her home and I enjoyed that/She ain’t even know it” – Rick Ross, “U.O.E.N.O”;

 “Bout to put rims on my skateboard wheels/Beat that p*ssy up like Emmett Till” – Lil’ Wayne;

 “Fawk money/I don’t rap for dead presidents/I’d rather see the president dead/It’s never been said/But I set precedents and the standards” – Eminem;

 “Slob on my knob like corn on the cob / Check in with me and do your job” – Three 6 Mafia;

“Pregnant p***y is the best you can get/ F***** a b**** while her baby s***** d***”. – UGK;

“The b**** tried to gag me. So, I had to kill her. Yeah, straight hittin’/ Now listen up and lemme tell you how I did it/ Yo, I tied her to the bed, I was thinking the worst but yo I had to let my n****** f*** her first yeah/ Loaded up the 44 yo, then I straight smoked the h**”. – NWA;

“When I first met you, you was a h**/ I tried to reform you, bomb you, warn you and teach you/ But couldn’t reach you, and you’re still a h**/ Your father said you was a h** and when you leave me, b**** you’re gonna be a h**”. – The RZA

“B****** know I’m that n*****/ talking four door Bugatti, I’m the life of the party/ Let’s get these h*** on the Molly” – Rick Ross, Jay-Z

  “I’m a pimp in every sense of the word, b****/ Better trust than believe ’em, in the cut where I keep ’em til I need a n**/ til I need to beat the guts” – UGK;

“Then it’s, beep, beep and I’m pickin’ ’em up/ Let ’em play with the d*** in the truck” – Jay-Z

“B****** ain’t s*** but h*** and tricks/ L*** on these n*** and s*** the d***/ Get the f*** out after you’re done/ And I hop in my ride to make a quick run” – Snoop Dogg

“S***, you think I won’t choke no w****/ ’til the vocal chords don’t work in her throat no more?! (Ah!) These m************ are thinking I’m playing/ Thinking I’m saying the s*** ‘cause I’m thinking it just to be saying it/ (Ah!) Put your hands down b****, I ain’t gonna shoot you/ I’ma pull you to this bullet, and put it through you” – Eminem

“All you had to do was step up to her/ She was in the bathroom sayin’ ‘One at a time’” – Too Short;

“I’ll leave ’em lookin’ like a rape victim/ Any girl who steps to it/ Ends up gettin’ their stomach pumped like Rod Stewart/ I do a damn good job” – Kool G. Rap;

“Now the title ‘b****’ don’t apply to all women/ But all women have a little b**** in ‘em (yeah)/ It’s like a disease that plagues their character, takin’ the women of America” – Ice Cube;

“So, don’t be a n**** sex slave b****/ Don’t try to be brave b****/ You be a dead b**** in the grave b****/ B**** you think a n**** and come beef out” – Kool G. Rap;

“Treat ’em like a prostitute (do what?)/ Don’t treat no girlie well until you’re sure of the scoop/ ’cause all they do is they hurt and trample” –  Slick Rick;

“Girls always ask me why I f*** so much/ I say ‘What’s wrong, baby doll, with a quick nut?’/ ‘Cause you’re the one, and you shouldn’t be mad/ I won’t tell your mama if you don’t tell your dad/ I know he’ll be disgusted when he sees your p**** busted/ Won’t your mama be so mad if she knew I got that a**?” – Fresh Kid Ice.

Had enough?

“In today’s societies, it’s common for people to regard courtesy as old-fashioned and out-of-date, so that increasingly more individuals behave rudely, making interaction difficult and creating an unpleasant social environment that makes people want to run and hide. Bad manners can be observed anytime, anywhere.

This sort of discourtesy is ever present and examples are too numerous to count or even mention: the disrespectful treatment of elderly people; invitations that aren’t responded to in any way; the lack of commitment to any event, job, or person; confirming attendance with no intention of attending; the strange disappearance of “please” and “thank you” from most people’s vocabulary; line-jumping; serial texters and cell-phone addicts who talk on the phone, as well as read and send text messages instead of paying attention to physically present persons; the friend or colleague who never offers to pick up the bill at lunch, or even pay their own way; repulsive children (the spitting image of their parents) who think that the world rotates around them and behave obnoxiously because of it, etc., etc.

If you think the children aren’t watching, consider the words of Natalie Qabazard, a 15-year-old senior at Ursuline High School.

 “In contemporary America, it seems as though more and more teenagers are inclined to act disrespectfully toward adults. The notion that the youth must respect their elders has completely vanished. What has replaced this attitude is a sense of disobedience, noncompliance and rudeness. Some contributors include the lack of discipline from parents, the mimicking of friends’ attitudes toward adults and how the media portrays disrespectful teenagers as being hip.

Modern families contain parents who are more driven and focused on their careers and less focused on the success of their family. According to essortment.com, “Becoming a teenager brings with it a host of new emotions, attitudes and behaviors. As kids age 13 to 19 move from childhood to maturity, they often experiment with language to express their boundaries and talk back to parents in ways that are inappropriate. It then becomes the parents’ duty to instruct their children how to speak with respect to authorities.”

The problem arises when parents fail to teach their children the correct way of behaving toward adults. Once threats are made, the parents back down and the teenager feels powerful. Now, the teenager has control over the parent, causing the parent to feel weak and powerless.

 [In the inner cities, children are empowered by the omnipresence of urban gangs ready to provide a “home” for estranged children who have been confronted with strong parenting in their family. In their (the child’s) eyes, there is no downside to spending as little time as possible at home and as much time as possible with the “kool kids” in the local gang.]

Now more than ever, teens are mimicking the disrespectful and disobedient attitude which their friends exhibit at school. This can mainly be seen between a student and a teacher. The same attitude that is being used toward parents is used against school officials. Schools should enforce more disciplinary action against these rude teens so as to make them pay for their lack of respect. As teenagers go about their daily lives, they observe others being rude to their friends and their parents, so they in turn do the same. Peer pressure is increasing; therefore, teens increasingly conform to their peers’ expectations.

The media portrays disrespectful teenagers as being “cool” and therefore has contributed to this epidemic. We see more and more disrespectful teenagers on TV because it is entertaining to watch. However, this should not be at the expense of our future society’s behavior.

On the popular reality show "My Super Sweet Sixteen," spoiled adolescent girls treat their parents with a lack of respect in order to get what they want. It is apparent, in this TV show, that the parents of these 16-year-olds only care about buying their children happiness when, in fact, the child feeds off of this carelessness and would like to take the power away from the parents and claim it for herself. Televising such acts promotes these behaviors – hence more of them.

Teenagers must end this form of verbal abuse because, if this behavior persists, America will become a country filled with insolence. How would the rest of the world esteem America if the president were arrogant, rude and disrespectful? America is known for its stature as a nation filled with kind and respectful people. However, with the way that our generation proceeds into the future, that stature will likely plummet."

 “Out of the mouths of babes…”

Next time: The example set by today’s “adults”.

Dereliction of Duty

So, what were these millions of young Americans, both the drug users and the sober, to think when no official responsibility was ever assigned for the disaster of Vietnam? Shouldn’t someone have been held responsible for the cover-up or the conscious denial of the obvious danger to the troops as well as the campaign, especially for the initial surge of drug use in the early years of the buildup?

 The lesson learned was that nobody was held to have had any personal responsibility for the loss of countless lives when, clearly, Army units were dysfunctional because soldiers were high – not the individuals who did drugs in the field, not their squad or platoon or company officers who may have shared a toke with them, not the battalion or regimental officers who had to construct action reports and determine what went right and what went wrong and certainly not the generals in Saigon and Washington – nor their civilian superiors.

 The fact is they all knew (and, if they claimed they didn't, they were admitting to dereliction in the face of the enemy), no one did anything about it, and the slaughter went on. Afterwards, the Army, as an institution, slunk away to conduct two decades of self-analysis. The government eliminated the draft. The servicemen and women returned to civilian life scarred by their experience; tormented by the fiasco created by those responsible for their safety; disillusioned with the moral and ethical train wreck they had witnessed; and scorned by their peers for their collective failures.

 If no one would be held responsible for that, then certainly there was no such attribute as personal responsibility. Even the draft-dodgers who fled to Canada were welcomed home with open arms by Democrat President Jimmy Carter during his first week in office with no assignment of accountability for abandoning their country in time of need.

 This new acceptable flaw in the American character has festered since the end of our Vietnam involvement, has animated our unfortunates, their families and descendants, and has manifested itself in what we see today in a declining, but still high, America where no public official in a Democrat administration – especially those of Bill Clinton and Barack Obama – has ever been held accountable for any transgression, of which there were many, as we have seen.

 The truth is that there were those responsible for all of it – but they got away with it because they were never held accountable.

 ·         The esteemed academic institutions – to whom we entrusted our impressionable children but who exposed them to drugs (often by example) and to the extra/contra-Constitutional New Dealers and other ne'er-do-wells with their radical acolytes in academia.

 ·         The generals – to whom we entrusted our precious children but who allowed them to be exposed to a culture of drugs and the torpor they create that would kill too many of them.

 ·         The modern-day progressive/liberals – to whom we entrusted our personal sovereignty in a Democrat Congress almost exclusively for more than forty years and who have systematically dismantled the protections of life, liberty and private property in our Constitution, all the while blaming failure on others (mainly the Republicans) – the very antithesis of personal responsibility.

 ·         The press/media – to whom we entrust the truth – if for no other reason than the incomprehensible reporting of the Tet offensive in South Vietnam in 1968. The Wall Street Journal's Arthur Herman wrote in 2008 [my emphases]:

 “On January 30, 1968, more than a quarter million North Vietnamese soldiers and 100,000 Viet Cong irregulars launched a massive attack on South Vietnam. But the public didn’t hear about who had won this most decisive battle of the Vietnam War, the so-called Tet offensive, until much too late.

 Media misreporting of Tet passed into our collective memory. That picture gave antiwar activism an unwarranted credibility that persists today in Congress, and in the media reaction to the war in Iraq. The Tet experience provides a narrative model for those who wish to see all U.S. military successes – such as the Petraeus surge [which led to ultimate military victory in Iraq, only to be undone by an inept State Department during the Obama administration] – minimized and glossed over.

In truth, the war in Vietnam was lost on the propaganda front, in great measure due to the press’s pervasive misreporting of the clear U.S. victory at Tet as a defeat. Forty years is long past time to set the historical record straight.

The Tet [the Vietnamese Lunar New Year celebration] offensive came at the end of a long string of communist setbacks. By 1967 their insurgent army in the South, the Viet Cong, had proved increasingly ineffective, both as a military and political force. Once American combat troops began arriving in the summer of 1965 [and despite debilitating drug-use among American soldiers], the communists were mauled in one battle after another, despite massive Hanoi support for the southern insurgency with soldiers and arms.

By 1967 the VC had lost control over areas like the Mekong Delta – ironically, the very place where reporters David Halberstam and Neil Sheehan had first diagnosed a Vietnam “quagmire” that never existed.

The Tet offensive was Hanoi’s desperate throw of the dice to [1] seize South Vietnam’s northern provinces using conventional armies, while [2] simultaneously triggering a popular uprising in support of the Viet Cong. Both failed. Americans and South Vietnamese soon put down the attacks, which began under cover of a cease-fire to celebrate the Tet lunar new year.

CBS news anchor Walter Cronkite’s public verdict that the 1968 Tet offensive was a “defeat” for the U.S. is widely seen as a turning point in American support for the war. Cronkite falsely claimed that the Vietcong had held the American embassy for six hours and that the offensive “went on for two months.”

The facts [available to Cronkite at the time] show that Tet was actually a major defeat for the communist enemy. In actuality, the Viet Cong loss was devastating and by March 2, when U.S. Marines crushed the last North Vietnamese pockets of resistance in the northern city of Hue, the VC had lost 80,000-100,000 killed or wounded without capturing a single province. The Tet offensive was a last gasp for the Communists and the U.S. had won every single battle.

Tet was a particularly crushing defeat for the VC. It had not only failed to trigger any uprising but also cost them "our best people," as former Viet Cong doctor Duong Quynh Hoa later admitted to reporter Stanley Karnow. Yet the very fact of the U.S. military victory – "The North Vietnamese," noted National Security official William Bundy at the time, "fought to the last Viet Cong" – was spun otherwise by most of the U.S. press.

As the Washington Post‘s Saigon bureau chief Peter Braestrup documented in his 1977 book, The Big Story, the desperate fury of the communist attacks including on Saigon, where most reporters lived and worked, caught the press by surprise. (Not the military: It had been expecting an attack and had been on full alert [and reinforcing outlying areas] since Jan. 24.)

[More critically,] it also put many reporters in physical danger for the first time. Braestrup, a former Marine, calculated that only 40 of 354 print and TV journalists covering the war at the time had seen any real fighting. Their own panic deeply colored their reportage, suggesting that the communist assault had flung Vietnam into chaos.

Their editors at home, like CBS’s Walter Cronkite, seized on the distorted reporting to discredit the military’s version of events. The Viet Cong insurgency was in its death throes, just as U.S. military officials assured the American people at the time. Yet the press version painted a different picture.

To quote Braestrup, “the media tended to leave the shock and confusion of early February, as then perceived, fixed as the final impression of Tet” and of Vietnam generally. “Drama was perpetuated at the expense of information,” and “the negative trend” of media reporting “added to the distortion of the real situation on the ground in Vietnam.”

[It should also be remembered that this was Lyndon Johnson's war – Johnson being, to the press, the unworthy and crude southerner who had succeeded their beloved, assassinated Northeastern Democrat icon, John Kennedy. The press had never accepted Johnson, and upon him they projected their anguish over their untimely loss. His failures were their victories.]

The North Vietnamese were delighted. On the heels of their devastating defeat, Hanoi increasingly shifted its propaganda efforts toward the media and the antiwar movement.

Causing American (not South Vietnamese) casualties, even at heavy cost, became a battlefield objective in order to reinforce the American media’s narrative of a failing policy in Vietnam.

Yet thanks to the success of Tet, the numbers of Americans dying in Vietnam steadily declined — from almost 15,000 in 1968 to 9,414 in 1969 and 4,221 in 1970 – by which time the Viet Cong had ceased to exist as a viable fighting force. One Vietnamese province after another witnessed new peace and stability. By the end of 1969 over 70% of South Vietnam’s population was under government control, compared to 42% at the beginning of 1968. In 1970 and 1971, American ambassador Ellsworth Bunker estimated that 90% of Vietnamese lived in zones under government control.

However, all this went unnoticed [by dis-informed Americans] because misreporting about Tet had left the image of Vietnam as a botched counterinsurgency – an image nearly half a decade out of date. The failure of the North’s next massive invasion over Easter 1972, which cost the North Vietnamese army another 100,000 men and half their tanks and artillery, finally forced it [North Vietnam] to sign the peace accords in Paris and formally to recognize the Republic of South Vietnam.

[By historical standards, this should have been called a victory and a worthy outcome for those who had sacrificed their all for the cause because, by] August 1972 there were no U.S. combat forces left in Vietnam, precisely because, contrary to the overwhelming mass of press reports, American policy there had been a success.

To [the Democrat] Congress [the progressive/liberal press] and the [dis-informed] public, however, the war had been nothing but a debacle. And by withdrawing American troops, Republican President Nixon gave up any U.S. political or military leverage on Vietnam’s future.

With U.S. military might out of the equation, the North quickly cheated on the Paris accords. When its re-equipped army launched a massive attack in 1975, Congress refused to redeem Nixon’s pledges of military support for the South – as we have seen. Instead, accidental President Gerald Ford bowed to what the media had convinced the American public was inevitable: the fall of Vietnam.

Accuracy in Media founder and longtime AIM Report Editor Reed Irvine noted that Cronkite “contributed a great deal to our defeat in Vietnam.” According to one former North Vietnamese leader, Bui Tin, collapse of our political will was “essential to our strategy.” The war could not be won militarily but it could be won at home with the help of anti-war activists and a hostile media.

 In an interview with the Wall Street Journal after his retirement, he said that visits from such anti-war advocates as Jane Fonda and Ramsey Clark "gave us confidence that we should hold on in the face of battlefield reverses. We were elated when Jane Fonda, wearing a red Vietnamese dress, said at a press conference that she was ashamed of American actions in the war and that she would struggle along with us." Ah, that's 'Hanoi Jane', who made a fortune selling exercise videos to women who still don't know or care about her treason."

Was anyone in the press held responsible for the deceitful reporting about Vietnam and its resulting contribution to wasted American deaths? Of course not. Cronkite went on to become an icon and “the most trusted man in America.”

The Founders are, in fact, the icons of the American story because they embodied personal responsibility to the point of sacrificing all that they had – including their lives – for the cause of liberty. The blueprint they gave us was first and foremost a treatise on personal responsibility for the trust we place in our elected and appointed officials and for the accountability we must demand of them.

 If our leaders abdicate their responsibilities, the entire system collapses. But today we have moral and ethical cowards for leaders – afraid of the power of the truth, the burden of personal responsibility and the greatness of American exceptionalism.

 The demise of these character traits in our democracy, enhanced by the unintended consequences of the New Deal, the Great Society and the Vietnam War, is leading us down the road to the loss of self-esteem, ambition, industriousness, hope and, finally, the American Dream. How could this happen? The answer can be found in a story which began less than 100 years ago.

 What began with such promise at the eleventh hour of the eleventh day of the eleventh month in 1918 – when communications by telegraph and written letter were primitive compared to ours – has unfolded over the century since. Those antiquated methods gave people time to think about what they were communicating, and the brevity, creativity and thoughtfulness they demanded supplied clarity to their words. Now, instantaneous video images cascade down on anyone, anywhere, at any time.

 This period began with Western Europe decimated and the entire world in chaos. Into this chaos swept a new horde from the east, led by modern khans, intent upon also subduing Western civilization – as their horse-borne predecessors had done fifteen centuries before – when the fall of Rome left Europe in chaos. Only this horde came spouting platitudes about “the workers of the world uniting” under a Communistic ruler – the Soviet Union.

 From the Asian steppes to the northern Kush to the Pacific Ocean, they created a modern terrorist state. Later they inspired many New Deal activists in America, Mao in China, Ho in Vietnam, Kim in Korea, Pol Pot in Cambodia and, finally, a cleric hiding behind the black robes of an ancient, terrorist pseudo-religion – Khomeini in Iran.

 Wherever allegedly educated but, in actuality, mis-informed, dis-informed, politically indoctrinated and spiritually exploited people can be found, terrorism – no longer born of poverty or confined by the state, but freed to become a state of mind – threatens that region of the world. Exploiting the spiritual void in the West with a spiritual fanaticism, radical Islamist terrorists seek ultimate world power at the expense of all people. They don't need great armies, just willing minds.

 In August 1945, with Nazi Germany defeated and the surrender of the Japanese Empire at hand and with World War II entering the history books, America stood astride the globe as if a New Colossus – foretold by poet Emma Lazarus in 1883 – truly the “shining city on a hill” from Jesus’ Sermon on the Mount – cited by both Democrat John Kennedy and Republican Ronald Reagan.

 It was a time when all American children could dream any dream and could be assured that they would be able to pursue that dream, and be the people they desired to be, unfettered and unafraid of malevolent powers within or without the United States. Their only challenge was how big they were willing to dream. It was a glorious and innocent time into which I was born – and headed toward a boundless future upon the shoulders of our parents – America’s “greatest generation”.

 Over the next two generations – through law and happenstance, as we have seen – African-Americans, women and most other “protected” groups would fully gain the rights and opportunities that the Constitution promised to those who practiced personal responsibility – thereby becoming able, if they so desired, to join the American parade. Consider these “firsts for African-Americans”:

 Major league baseball player in the 20th Century: Jackie Robinson, 1947.

Woman gold medalist (Summer Games; individual): Alice Coachman, 1948.

Draftee to play in the NFL: Wally Triplett, halfback (Penn State). Picked by the Detroit Lions 1949.

Wimbledon tennis champion: Althea Gibson, 1957. 

NHL hockey player, Boston Bruins: Willie O’Ree, 1958.

NASCAR stock car driver to win a major race: Wendell Oliver Scott, 1963.

U.S. Senator (elected): Edward Brooke, 1966.

U.S. cabinet member: Robert C. Weaver, 1966.

Woman federal judge: Constance Baker Motley, 1966.

U.S. Supreme Court Justice: Thurgood Marshall, 1967.

Mayor of major city, Cleveland: Carl Stokes, 1967.

Male tennis Major Champion: Arthur Ashe, 1968.

Woman U.S. Representative: Shirley Chisholm, 1969.

Woman cabinet officer: Patricia Harris, 1977.

Governor (elected): L. Douglas Wilder, 1989.

Woman mayor of a major U.S. city: Sharon Pratt Dixon Kelly, 1991.

Woman U.S. Senator: Carol Moseley Braun, 1992.

U.S. Secretary of State: Colin Powell, 2001.

Woman Secretary of State: Condoleezza Rice, 2005.

U.S. President: Barack Obama, 2009.

 Consider these other “firsts for American women”:

Woman who is an American citizen to be canonized: Mother Frances Xavier Cabrini (1850-1917), 1946.

Woman baseball scout: Edith Houghton, 1946.

Woman in the U.S. to undergo astronaut testing: Jerrie Cobb, 1960.

Woman to serve as Secretary of Health, Education, and Welfare: Oveta Culp Hobby, 1953. She was also the first director of the Women's Army Auxiliary Corps (WAAC).

Woman to break the sound barrier: Jacqueline Cochran by flying an F-86 over California, 1953.

Woman nominated for President of the United States by a major political party (Republican National Convention): Margaret Chase Smith, 1964.

Asian-American woman elected to Congress: Patsy Takemoto Mink, of Hawaii, 1965.

Woman to own a seat on the New York Stock Exchange: Muriel “Mickey” Siebert, 1967.

Female jockey to ride in the Kentucky Derby: Diane Crump, 1970.

Woman rabbi in the United States: Sally Jean Priesand, 1972.

Woman to conduct at New York's Metropolitan Opera House: Sarah Caldwell, 1976.

Woman appointed Secretary of Commerce: Juanita Kreps, 1977.

Female Supreme Court Justice: Sandra Day O'Connor, 1981.

Additionally, Title IX of the Education Amendments of 1972 was passed, prohibiting sex discrimination and mandating rough equivalency in public education and federally assisted programs, especially female high school and collegiate athletics.

Those days are long gone. Now our children are threatened every hour of every day by evil forces within and without the United States – forces that threaten their dreams and oppose their desire to be who they want to be – politically correct forces (and the chaos they breed) within and barbaric forces without. We shield them as best we can while they grow and encourage them to be more than they think they can be, but the day comes when we have to release them to face a dangerous, coarse, cruel, crushing and cancerous world that has metastasized under the influence of ignorant, cowardly, narrow-minded and self-centered political opportunists, both here and abroad.

It is heartbreaking to see such joy and optimism on idealistic and determined young faces, knowing full well the frustration and disappointment they will have to endure and that they still may be unable to reach their goals. Sure, struggle is good for the soul, but senseless struggle against needless, politically inspired or bureaucratic interference and stumbling blocks is just stupid. We must seek to eliminate these hindrances by eliminating the forces that create and maintain them generation after generation.

 Why? Because this is where we’re going in America with government control of our lives (read “mis-informed” and “dis-informed”) from cradle to grave as discussed earlier. Consider this from columnist and author Todd Starnes:

 “The city of Houston has issued subpoenas demanding a group of pastors turn over any sermons dealing with homosexuality, gender identity or same sex marriage to the city’s first openly lesbian mayor. And those ministers who fail to comply could be held in contempt of court.”

 “The city’s subpoena of sermons and other pastoral communications is both needless and unprecedented,” an Alliance Defending Freedom attorney said in a statement. 

“The city council and its attorneys are engaging in an inquisition designed to stifle any critique of its actions.”

ADF, a nationally known law firm specializing in religious liberty cases, is representing five Houston pastors. They filed a motion in Harris County Court to stop the subpoenas, arguing they are "overbroad, unduly burdensome, harassing, and vexatious."

"Political and social commentary is not a crime," ADF said. "It is protected by the First Amendment." The subpoenas are just the latest twist in an ongoing saga over Houston's new non-discrimination ordinance. The law, among other things, would allow men to use the ladies room and vice versa. The city council approved the law.

The Houston Chronicle reported opponents of the ordinance launched a petition drive that generated more than 50,000 signatures – far more than the 17,269 needed to put a referendum on the ballot. However, the city threw out the petition over alleged irregularities. After opponents of the bathroom bill filed a lawsuit, the city’s attorneys responded by issuing the subpoenas against the pastors.

“City council members are supposed to be public servants, not ‘Big Brother’ overlords who will tolerate no dissent or challenge,” said an ADF attorney.  “This is designed to intimidate pastors.” The mayor will not explain why she wants to inspect the sermons.

However, ADF suspects the mayor wants to publicly shame the ministers. An ADF attorney said he anticipates they will hold up their sermons for public scrutiny. In other words – the city is rummaging for evidence to "out" the pastors as anti-gay bigots.

Among those slapped with a subpoena is the senior pastor of Grace Community Church. He was ordered to produce all speeches and sermons related to the mayor, homosexuality and gender identity. The mega-church pastor was also ordered to hand over “all communications with members of your congregation” regarding the non-discrimination law. “This is an attempt to chill pastors from speaking to the cultural issues of the day. The mayor would like to silence our voice. She’s a bully.”

The executive director of the Texas Pastor Council also received a subpoena. He said he will not be intimidated by the mayor. "We're not afraid of this bully," he said. "We're not intimidated at all. We are not going to yield our First Amendment rights. This is absolutely a complete abuse of authority." He called the actions by Houston's mayor "obscene" and said they "should not be tolerated. This is a shot across the bow of the church."

"This is the moment I wrote about in my book, God-Less America," he said. "I predicted that the government would one day try to silence American pastors. I warned that under the guise of 'tolerance and diversity' elected officials would attempt to deconstruct (that word, again) religious liberty. Sadly, that day arrived sooner than even I expected."

A pastor compared the culture war skirmish to the 1836 Battle of San Jacinto, fought in present-day Harris County, Texas. It was a decisive battle of the Texas Revolution. "This is the San Jacinto moment for traditional family. This is the place where we stop the LGBT assault on the freedom to practice our faith." Who will hold these public officials personally responsible for their assault on the Constitution? Probably no one – and that's a shame.

So, in this climate, how does one instill personal responsibility? One method is to introduce children to shame. Shame, you say. How cruel, you say. Well, like most of the central teachings of Western civilization – contained in the Bible – the value and importance of shame in the development of civilized human beings has been forgotten.

 “Of course you should shame your children,” says Sam Sorbo, a nationally syndicated radio host, and author of The Answer: Proof of God in Heaven.  She studied biomedical engineering at Duke before pursuing a career as an international fashion model and actress. She lives in L.A. with her husband and three children, and advocates home schooling.

 “That is your job, as a responsible parent. Shame is one of our most useful tools! Shame is the barrier between decency and depravity in a moral society.” When Adam and Eve were in the Garden of Eden, they “were both naked and were not ashamed.” (Gen 2:25). Then they ate the fruit of the Tree of Knowledge of Good and Evil.

 From their (original) sin, they discovered shame; they clothed themselves – why? Because they felt ashamed, and that changed the way they looked. They blushed, and probably sweated, too. They fashioned fig leaves to cover not their nakedness, but their shame. After God received their confessions, he created garments for them and sent them (in shame) from the garden.

 Merriam-Webster defines shame as ‘a painful [personal] emotion caused by the consciousness of guilt, shortcoming, or impropriety.’ It’s also the susceptibility to having such an emotion. Shameless is the inability to feel disgrace, which describes many of our politicians and other political actors these days.

 President Obama was not ashamed of having lied to the American people about their options under ObamaCare. His shameless “apology” stated, ‘I am sorry that they – you know – are finding themselves in this situation, based on assurances they got from me.’

 He was not ashamed about the Benghazi deaths, which occurred on his ‘watch’, though, in truth, we don’t know exactly what he was doing while the ambassador and three others suffered and died. He did imply he was ashamed by a video he had no role in making. Clearly, he misunderstands the word.

 He could not be shamed into action, though people and pundits everywhere were calling for some answer to the Islamic State’s genocide, Putin’s military maneuvers, and the shameful situation at our southern border. Unashamed, he continued his [fundraisers and] vacations. But he also called for air strikes and humanitarian aid in Iraq, citing, in a New York Times interview, his lack of follow up in Libya as a lesson-learned.

 'So that's a lesson that I now apply every time I ask the question, "Should we intervene militarily?"' Mr. Obama said. 'Do we have an answer the day after?' It's just great that he's getting on-the-job training, but notice that his hubris prevents a more open nod to the Bush doctrine of nation-building – which he destroyed in his haste to withdraw from Iraq, but now clearly espouses! And witness: no apology at all to the [Iraqi and Syrian] Christians who are being starved or beheaded. Shameful.

 After years of claiming the troop withdrawal from Iraq as his great achievement – because it can now be seen as disastrous – Obama just changed his narrative and blamed President George W. Bush for it. ‘… As if this was my decision.’ Immediately following that he blamed President Maliki as well. His blatant rejection of culpability speaks not to his sense of shame but his lack thereof.

 Our societal break with shame traces most clearly to the Bill Clinton White House.  Before that, we had Watergate, a spying and lying tale resulting in the resignation of a president, Nixon, who valued the office of the presidency too much to bring the disgrace of impeachment upon it.

 Not so for Bill Clinton, who was not at all ashamed of his deplorable seduction of a young woman intern less than half his age. The progressive women’s movement lifted no voice against his predatory behavior because they supported him politically. Shameful.

 Then Clinton unabashedly lied directly to the American public on TV. Impeachment, though inevitable, failed in the Senate, which was unwilling to shame the man they adored, despite his transgressions. Now he is one of the highest paid speakers in the world. Shameless.

 That Clinton's and Obama's actions demanded impeachment is clear, except that We, the People have lost our ability to shame. We shun the very idea of it, perhaps because we all feel our own so vividly. 'Judge not, that you be not judged. For with the judgment you pronounce you will be judged, and with the measure you use it will be measured to you.' (Matt 7:1-2). Are we seeking so strongly not to be judged that we refuse to judge others? Shameful.

 Recently, we witnessed the late Canadian politician Rob Ford’s brother defend the former Toronto mayor’s right to retain his seat despite his excessive illegal drug use, by pointing his finger at the parliament in Canada and accusing them of smoking pot. But is this redistribution of shame? Does our own shame mean we lose the right to shame others? In that case, there is no more right and wrong.

 It is asymmetrical warfare, in that shame works only on those who can feel it – but for the others, well, there is no amount of scorn the PLDC cannot withstand. They simply turn the finger back on those who would correct them. "What difference, at this point, does it make?" (Read, "You will not shame me!") And so, good people are disarmed of their ability to defend what's right and condemn what's deplorable because of societal defects in others.

 Homer Simpson said, 'It takes two to lie. One to lie and one to listen.' That's even truer of shame. Former New York Congressman Anthony Weiner ought to be ashamed, but no. As a forgiving society, we've given him a (snickering) pass – though he, thankfully, didn't get reelected. 'Yes, you shouldn't tweet your private parts, but haven't we all done something regrettable?' If we did, we ought to be ashamed!

 But (luckily), tweeting inappropriate photos isn’t the same thing as taking some post-it notes from work. So good people ought not to allow themselves to be shamed into submission, just because they aren’t perfect. If it’s wrong, speak up, stand up, and if you are also guilty (of something, aren’t we all?), handle that on your own time. Don’t be a hypocrite, but don’t be a cowardly doormat, either. Or worse, a sycophant, supportive of bad behavior for other rewards.

 It is incumbent on a just society to shame those deserving of it. It is simple self-protection. We no longer have pillories, and the mainstream press clearly has no concept (witness the terrorist-graced Rolling Stone magazine), so the moral society must make up the difference. Shame on them if they don't."

 Strong words, valuable words, necessary words.

 How much energy has been wasted to surmount these senseless obstacles created by people who are too self-absorbed to appreciate what damage their truly ignorant actions have wrought at the expense of our children's dreams and their loss of innocence?

 How much greater would the good have been – and hopefully will again be – if the forces of creative energy and of enlightened self-interest and selfless interest were allowed to blossom fully in sunlight, instead of in the darkness of evil let loose in the world by these maniacally selfish people?

 How much more glorious the reward for the millions of lives sacrificed during this nation’s wars to preserve our traditional way of life?

Next time: Recovering respect.

Personal Responsibility

The fourth casualty in this loss of faith has been personal responsibility. Therefore, personal responsibility must be the next principle restored.

 The fairly quick demise of the ethos of personal responsibility in the American culture began with a two-pronged attack in the mid-sixties. One enabler, of course, was the Democrat administration of President Lyndon Johnson – specifically, the Great Society launched in 1964. It began as a “War on Poverty” in America but soon became the largest welfare system in human history, morphing into the current “nanny state” that supplies virtually all of life’s necessities for the favored group – the minority poor – although other favored groups, such as the LGBTQ community (lesbian, gay, bisexual, transgender, questioning) and illegal immigrants, can and do get some benefits – all with no obligation on the part of the beneficiaries to earn any benefit or to take on any responsibility to try to work their way out of poverty.

 “Nearly 50 years after the release of the U.S. Department of Labor report “The Negro Family: The Case for National Action” – which was highly controversial and widely criticized at the time – a new Urban Institute study found that the alarming statistics in the report back then “have only grown worse, not only for blacks, but for [poor] whites and Hispanics as well.”

 The older report was an unsparing look at the roots of black poverty issued at the height of the Civil Rights movement and at the start of the War on Poverty. Commonly referred to as the Moynihan Report, named for its author, the late Democrat Senator and intellectual Daniel Patrick Moynihan, it called for more government action to improve the economic prospects of black families – a role never envisioned by the Founders who believed (through experience) that communities should and would care for the downtrodden. 

 The challenges to the well-being of black families chronicled back then included acute and concentrated poverty in low-income black neighborhoods populated by underemployed and unemployed residents; crime; inequality in access to affordable and adequate housing, employment opportunities, education, health care, and the criminal justice system; high rates of non-marital births and children raised in households headed by single women; and social welfare policies that undermined the role of black men. None of those conditions has changed appreciably for the better and most are worse – much worse.

 The percentage of black children born to unmarried mothers, for comparison, tripled between the early 1960s and 2009, remaining far higher than the percentage of white children born to unmarried mothers. About 20 percent of black children were born to unmarried mothers in the early 1960s, compared with 2 to 3 percent of white children. By 2009, nearly three-quarters (73 percent) of black births and thirty percent of white births occurred outside marriage. Hispanics fell between whites and blacks and followed the same rising trend.

 But, since the mid-1960s, America has practiced social engineering – spending tax money directly in an attempt to improve the lives of those who don’t have very much. Those payments are called entitlements – more accurately, “unearned benefits” – and they are now so high that they threaten to bankrupt the entire nation. Liberal Americans tend to support the entitlement society, while conservatives are more inclined to promote individualism, earning entitlements and smaller entitlement spending overall.

 Instead of alleviating the material deprivation of the poor, a new class of professionals was created to provide services to the poor. This approach would, over the decades, lead to the entrenchment of an entire class of bureaucrats acting as middlemen between the poor and the resources the state (federal and State) was distributing, the effect of which was to further inflate the power of the state and create a lobby for the preservation of those interests, making any reform difficult [if not impossible].”

 President (Barack) Obama, of course, is a liberal. And the Democratic Party is now dominated by the progressive/liberal left. That’s why, over the past fifty years, federal spending has “broken the bank” – doubling the national debt in the past eight years alone to $20,000,000,000,000 – that’s $20 trillion. The essential mistake that Barack Obama, who never experienced real poverty, made is that he believed Lyndon Johnson’s Great Society entitlements could elevate the poor to prosperity. They obviously haven’t and, in historical fact, cannot.

 So, let’s review some statistics on poverty and entitlements in America since the “War on Poverty” began.

 There are now about 108 million people receiving some sort of welfare and yet there are only 102 million Americans with full-time jobs!

 Welfare program participation is not predicated on race. Whites and African-Americans each make up about 40% of the welfare population while Hispanics comprise about 15%.

 There are 69 separate federal welfare needs-based assistance programs covering food, housing, social services, education, cash, vocational training, health, utilities, child care and transportation.

 The average family not participating in any welfare program earns about $137 per day. The average family that is participating in welfare programs receives about $168 per day.

 Today, single mothers are better off earning just $29,000 per year and collecting $28,000 in welfare benefits than earning $69,000 per year and paying normal income taxes!

 While 24% of native born Americans participate in welfare programs, over 36% of immigrants do. For participants in the “food stamp” program, 41% have been in the program for more than ten years.

 In 1960, welfare made up 3% of the federal budget and defense spending was about 50%. By 2010, welfare made up 25% of the federal budget and defense spending was about 20%.

 In 1962, 12% – about 22 million Americans – were dependent upon some form of “unearned benefit” government program. By 2014, 24% – about 75 million Americans – were dependent upon some form of “unearned benefit” government program (not Social Security, Medicare/Medicaid or military pensions).

 In the 1960s, the federal government spent about $900 million on welfare programs. In the first decade of the new century, the federal government spent about $10 trillion on welfare programs. About 20% of welfare recipients are classified as lifetime welfare clients.

In 1965, the poverty rate in this country stood at 14 percent. Now, after untold trillions (over $50 trillion) have been spent fighting poverty, the poverty rate is 14.3 percent – 49 million Americans, including 15.9 million children. Amazing, is it not?

 The great State of Wisconsin instituted a “workfare” program in 1986. At the time, over 100,000 people were on the welfare rolls and only 10% of those held a job of any kind. By 1997, the welfare rolls had been reduced to about 40,000 and 70% held jobs! During the same period, national welfare rolls increased from 36 million to over 40 million.

 According to an NBC poll, the People believe that the primary causes of poverty in America are: too much welfare (lack of personal initiative) – 24%; lack of jobs – 18%; breakdown in the family – 13%; lack of work ethic – 10%.

 The conclusion – America is bankrupting itself with an entitlement philosophy that does little to alleviate poverty. The federal government “Entitlement Environment” has destroyed any sense of personal responsibility for one’s own welfare – in effect, leaving one’s welfare in the hands of the federal government.

 But, this situation has developed despite the fact that it is unconstitutional – yes, it violates the tenets of the Constitution. In the Preamble of the Constitution we are told that one of the reasons the U. S. Constitution was created was to promote the general welfare of the People. This provision anticipates the RIGHT of Americans to have their government serve the welfare of the People in their collective needs – that is, their GENERAL welfare – and not use the resources of the People for the benefit of certain States or certain people, which would be SPECIAL welfare.

 “The term “general welfare” was used in the Articles of Confederation and elsewhere to refer to the well-being of the whole People. The Founders did not want the power and resources of the federal government to be used for the special benefit of any one region or any one State. Nor were the resources of the People to be expended for the benefit of any particular group or any special class of citizens. (see Making of America, p. 244)

 The entire American concept of “freedom to prosper” was based on the belief that man’s instinctive will to succeed in a climate of liberty would result in the whole People prospering together. It was believed that even the poor could lift themselves through training or education and individual effort to become independent and self-sufficient. This was a central tenet of Christianity. It is still known as the “Puritan Ethic”.

The idea was to maximize prosperity, minimize poverty, and make the whole nation rich. Where people suffered the loss of their crops or became [involuntarily] unemployed, the more fortunate in the community were to help. And those who were enjoying “good times” were encouraged to save up in store for the misfortunes which seem to come to everybody someday. Hard work, frugality, thrift, and compassion became the key words in the American ethic. See Franklin’s Poor Richard’s Almanack.

Within a short time, the Americans, as a people, were on their way to becoming the most prosperous and best-educated nation in the world. The key was using the government to protect equal rights, not to provide equal things. Colonial visionary and Revolutionary leader Samuel Adams said the ideas of a welfare state were made unconstitutional by the Founders:

“The utopian schemes of leveling [redistribution of the wealth] and a community of goods [central ownership of all the means of production and distribution] are as visionary and impracticable as those which vest all property in the Crown. [These ideas] are arbitrary, despotic, and, in our government, unconstitutional.”

 “The Founders had a deep concern for the poor and needy. Disciples of the collectivist Left in the Founders’ day as well as our own have insisted that compassion for the poor requires that the federal government become involved in taking from the “haves” and giving to the “have nots.” Benjamin Franklin had been one of the “have nots,” and after living several years in England where he saw government welfare programs in operation, he had a considerable amount to say about these public charities and their counterproductive compassion.

Franklin wrote a whole essay on the subject and told one of his friends: “I have long been of your opinion, that your legal provision for the poor (in England) is a very great evil, operating as it does to the encouragement of idleness. We have followed your example, and begin now to see our error, and, I hope, shall reform it.” A survey of Franklin’s views on counterproductive compassion might be summarized as follows:

Compassion which gives a drunk the means to increase his drunkenness is counterproductive.

 Compassion which breeds debilitating dependency and weakness is counterproductive.

 Compassion which blunts the desire or necessity to work for a living is counterproductive.

 Compassion which smothers the instinct to strive and excel is counterproductive.

 Nevertheless, the Founders recognized that it is a mandate of God to help the poor and underprivileged. It is interesting how they said this should be done. Franklin wrote:

 “To relieve the misfortunes of our fellow creatures is concurring with the Deity; it is godlike; but, if we provide encouragement for laziness, and supports for folly, may we not be found fighting against the order of God and Nature, which perhaps has appointed want and misery as the proper punishments for, and cautions against, as well as necessary consequences of, idleness and extravagance? Whenever we attempt to amend the scheme of Providence, and to interfere with the government of the world, we had need be very circumspect, lest we do more harm than good.”

“Nearly all of the Founders seem to have acquired deep convictions that assisting those in need had to be done through means which might be called “calculated” compassion. Highlights from their writings suggest the following:

Do not completely care for the needy – merely help them to help themselves.

 Give the poor the satisfaction of “earned achievement” instead of rewarding them without achievement.

 Allow the poor to climb the “appreciation ladder”– from tents to cabins, cabins to cottages, cottages to comfortable houses.

 Where emergency help is provided, do not prolong it to the point where it becomes habitual.

 Strictly enforce the scale of “fixed responsibility.”

 The first and foremost level of responsibility is with the individual himself; the second level is the family; then the church; next the community; finally, the county, and, in a disaster or emergency, the state. Under no circumstances was the federal government to become involved in public welfare. The Founders felt it would corrupt the government and also the poor – how right they were. No constitutional authority exists for the federal government to participate in so-called social welfare programs. (see Making of America, pp. 218-220)

 “The U. S. Constitution states in Article I, section 8: The people of the states empower the Congress to expend money (for the enumerated purposes listed in Article I, section 8), provided it is done in a way that benefits the general welfare of the whole People. 

 Thomas Jefferson explained that this clause was not a grant of power to “spend” for the general welfare of the People, but was intended to “limit the power of taxation” to matters which provided for the welfare of “the Union or the welfare of the whole nation.” In other words, federal taxes could not be levied for States, counties, cities, or special interest groups. (see Making of America, p. 387)

When the federal Constitution was being considered for ratification by the State conventions, some people were still suspicious of the “general welfare” clause and tried to claim that these two words could authorize any kind of welfare. The general welfare clause in Article 1, Section 8 of the Constitution reads: “The Congress shall have Power to … provide for the common Defence and general Welfare of the United States; …”

 It is an introductory phrase which is followed, after a semi-colon, by a specific list of the 17 things the new government would be authorized to do, such as: to establish Post Offices, coin money, make Treaties, establish standard weights and measures, provide for a Navy, punish pirates, punish counterfeiting, fund a temporary army, declare war, and exercise exclusive jurisdiction over all cases in the future Washington, D.C., etc.

 To counter those rumors that the General Welfare Clause in the proposed Constitution would authorize any kind of welfare, James Madison, in The Federalist #41, explained its clear intent. He stated that it “is an absurdity” to claim that the General Welfare Clause confounds or misleads, because this introductory clause is followed by enumeration of specific particulars that explain and qualify, i.e., “limit” the meaning of the phrase “general welfare” to the specified and identified categories.

 By the way, The Federalist Papers are not just some antiquated editorial opinions, they are, according to the Supreme Court in Cohens v. Virginia, the exact record of the intent of the Constitution (also see Coleman v. Miller). Cohens v. Virginia, 19 U.S. 264 (1821), was a Supreme Court decision, under John Marshall, most noted for the Court’s assertion of its power to review State supreme court decisions in criminal law matters when the defendant claims that their Constitutional rights have been violated. It cited Federalist #82 extensively.

 Also, the U.S. Supreme Court in S. Carolina v. U.S., 199 US 437 (1905) wrote: “The Constitution is a written instrument. As such, its meaning does not alter. That which it meant when it was adopted, it means now …”

 So, the Constitution was ratified under the assurance that it would never be interpreted to provide welfare to individuals. And indeed, to this very day, the U.S. Government cannot legally provide entitlements to ordinary Americans. Here are some examples to support the continuity of this mindset:

 ·         As a member of Congress in 1794, James Madison opposed an appropriation to assist refugees – a once and future problem. He said: “I cannot undertake to lay my finger on that Article of the Constitution which granted a right to Congress of expending, on objects of benevolence, the money of their constituents.”

 ·         President Franklin Pierce in 1854 vetoed a bill to help the mentally ill. He said: “I cannot find any authority in the Constitution for public charity …. [this] would be contrary to the letter and the spirit of the Constitution and subversive to the whole theory upon which the Union of these States is founded.”

 ·         A passage long attributed to Abraham Lincoln (actually written in 1916 by the Rev. William J. H. Boetcker) says: “You cannot strengthen the weak by weakening the strong. You cannot help small men by tearing down big men. You cannot help the poor by destroying the rich. You cannot lift the wage earner by tearing down the wage payer. You cannot keep out of trouble by spending more than your income. You cannot help men permanently by doing for them what they could and should do themselves.”

 ·         In 1887, President Grover Cleveland vetoed an Appropriation to provide disaster aid to victims of a Texas drought. His veto stated: “I feel obliged to withhold my approval of the plan to indulge in benevolent and charitable sentiment through the appropriation of public funds … I find no warrant for such an appropriation in the Constitution. The lesson should be constantly enforced that though the people should support the government, the government should not support the people.”

 ·         In Busser v. Snyder, 37 ALR 1515 (1925) the Court stated: “An Old Age Assistance Law is prohibited by a constitutional provision that no appropriation shall be made for charitable or benevolent purposes to any person.”

 Also in the Busser case:  “The term ‘poor,’ as used by lawmakers, describes those who are destitute and helpless, unable to support themselves, and without means of support.” Helping the poor is consistent with the Judeo-Christian ethic. If you cannot take care of yourself, others are allowed to take care of you, even if you don’t like it. John 21:18.

 “But, the Supreme Court unlawfully laid the foundation for what turned out to be an amendment to the Constitution in the 1936 Butler case, where “general welfare” was twisted to allow “special welfare”, and the federal budget jumped from six billion to six hundred billion dollars in one generation. (see Making of America, p. 255) Should the federal government be involved in social welfare? You be the judge.

 ·         Also, the U.S. Supreme Court in Meyer v. Nebraska, 262 US 390, (1923) concluded: “it is the natural duty of the parent to give his children education suitable to their station in life …”

 ·         And the U.S. Supreme Court in Plyler v. Doe, 457 US 202, (1982) concluded: “… [public] education is not a fundamental right …”

 Democrat President Lyndon Johnson’s solution to the problems of the poor was the “Great Society” of the mid-1960s. He said:

 “I intend to establish working groups to prepare a series of White House conferences and meetings – on the cities, on natural beauty, on the quality of education, and on other emerging challenges. And from these meetings and from this inspiration and from these studies we will begin to set our course toward the Great Society.

 The solution to these problems does not rest on a massive program in Washington, nor can it rely solely on the strained resources of local authority. They require us to create new concepts of cooperation, a creative federalism…”

 Johnson’s “creative federalism” was designed to eliminate the federalism defined by the Constitution and singularly responsible for the success of the United States and its People for nearly 200 years. The Founders counted on the ability to make individual choices, the quest to succeed and the willingness to struggle for prosperity of the American people. Nothing could have been more subversive of the intent of the Founders and, trillions of unconstitutionally wasted dollars later, the social condition of the “welfare addicted” poor in America is worse than when the “Great Society” began.

 But, there was a second major contributor to the demise of personal responsibility in the mid-60s and it came from a most unexpected source – the United States Army (not to be confused with the United States Marine Corps – which is a branch of the United States Navy).

 The expansion of the American involvement in the Vietnamese Civil War under Lyndon Johnson in 1965 resulted in a force that grew to more than 500,000 servicemen and women in Vietnam by 1968 and remained near that level until the early 1970s. Tours in-country usually lasted a year so that, over the course of the war, literally millions of young Americans, many of them draftees, saw action.

 Many of the young Army draftees came directly from the academic environment which, as we have seen, was in some turmoil and a staple of that environment was the celebration and use of recreational drugs – primarily marijuana (although LSD and opiates were not far behind). When they shipped out to war, they brought the drug culture with them.

 Because many of the young leaders in the field – the (non-U.S. Military Academy) lieutenants, captains and majors – came from the same social environment as their troops, the drug use was overlooked, probably even participated in, by the young officers. It became epidemic and the Army brass had no solution – so it was universally tolerated. The only alternative was to remove the Army from the field.

 A substantial percentage (some studies suggest as high as 67%) of the American Army fought the Vietnamese War stoned and the best that can be said is that the Army achieved a stalemate after a savage and gruesome struggle – some of it involving war crimes. Then, of course, the liberals in the Congress, facing no opposition from a stricken Nixon administration, abandoned the fledgling and imperfect Vietnamese democracy (and also Cambodia) to a genocide at the hands of the communists from North Vietnam and their Soviet and Chinese sponsors, in which more than 3 million innocent people were slaughtered.

As a result of the failure in Army leadership during the Vietnamese War, the Army virtually “retired from the field” as an effective institution for more than a decade. Soul searching during that time resulted in a very cautious fighting force, even though the creation of the “all-volunteer” military was beginning to produce the most professional fighting force the world had ever seen. The Army really didn’t return to its former status until it took the field in the Persian Gulf War under President George H.W. Bush in the early 1990’s.

 In contrast to the Army’s performance in Vietnam, the United States Marine Corps performed in a typically stellar fashion. Under one of my former teachers, General Victor Krulak, they pacified large areas of the coastal country under the Combined Action Program – a geographic area pacification program that provided safety and security in return for cooperation against the Viet Cong guerrillas and North Vietnamese regulars in the areas around the pacified villages.

 The problem was that there were just not enough Marines to pacify the whole country. Maybe a lesson can be learned about combat effectiveness from the Marines. After all, an individual can be in the Army, but an individual IS a United States Marine. There’s a difference and a distinction.

Next time: The demise of accountability.

The Frontier

The importance of the frontier in American history is largely forgotten today but, in fact, it was essential to the forming of the America that we all grew up in.

 “The Frontier Thesis is the argument, advanced by legendary historian Frederick Jackson Turner in 1893, that American democracy was formed by the American Frontier. He stressed the process—the moving frontier line—and the impact it had on pioneers going through the process (of establishing communities and local governments). He also stressed results, especially that American democracy was the primary result, along with egalitarianism, a lack of interest in high culture, and violence.

 ‘American democracy was born of no theorist’s dream; it was not carried in the Sarah Constant to Virginia, nor in the Mayflower to Plymouth. It came out of the American forest, and it gained new strength each time it touched a new frontier,’ said Turner. 

 In the thesis, the American frontier established liberty by releasing Americans from European mindsets and eroding old, dysfunctional customs. The frontier had no need for standing armies, established churches, aristocrats or nobles, nor for landed gentry who controlled most of the land and charged heavy rents. Frontier land was free for the taking (and militias could be raised if circumstances warranted).

 Turner first announced his thesis in a paper entitled “The Significance of the Frontier in American History”, delivered to the American Historical Association in Chicago in 1893. He won very wide acclaim among historians and intellectuals. Turner elaborated on the theme in his advanced history lectures and in a series of essays published over the next 25 years, collected along with his initial paper as The Frontier in American History.

 Turner’s emphasis on the importance of the frontier in shaping American character influenced the interpretation found in thousands of scholarly histories. By the time Turner died in 1932, 60% of the leading history departments in the U.S. were teaching courses in frontier history along Turnerian lines.”

 With the closing of the frontier near the end of the 19th Century, what I call the (millennia-long) Epoch of Conquest came to an end and has remained absent from world history ever since. Of course, there have been significant attempts at conquest and some minor successes (Tibet, Crimea, etc.) and colossal failures (Nazi Germany, Imperial Japan and the Soviet Union), but two world wars and a long, shadowy Cold War have set a new standard for relations between nations, large and small.

America’s romance with the frontier virtually created the motion picture industry, with the cowboy/lawman/modest-hero morality play of the “good guys” defeating the “bad guys” becoming the staple of the “silent movie”, which transitioned well to the “talkies” and to television. Americans crossed the frontier into space and to the moon and continue to push frontiers in science and the arts. The “frontier thesis” is alive and well in the 21st Century.

 America’s conquest of the continent, over aboriginal tribes that had been warring with, enslaving and conquering each other for (probably) millennia as well, finally brought peace to North America.

Like any people conquered by a larger, more advanced foe, many aboriginal tribes continue to be angry and resentful. They have been unable or unwilling, in large part, to assimilate into America’s Western culture and so continue to suffer separately on reservations set aside for them by the conquering Americans. Many minority conquered groups throughout the world are similarly challenged. The Palestinians come to mind. Gambling casino set-asides are helping some tribes but many of them refuse to help less fortunate tribes.

 It is interesting to note that not all aboriginal peoples in North America have resisted assimilation into America’s Western culture. The tribes of Alaska have demonstrated a desire and ability to live and thrive in both cultures and their example is compelling.

 The Tsimshian people are originally from the coast of British Columbia. In 1887, an Anglican British missionary, Father William Duncan, obtained permission from the U.S. government to create a Reserve (not a reservation) in Alaska for the tribe because it had few rights in Canada. More than 800 tribal members, with all of their possessions, crossed 70 miles of open ocean in 50 canoes to reach the reserve on Annette Island, near Ketchikan, Alaska.

 There they established a community which has thrived on fishing, lumber, tourism and trade with local American communities that has brought them economic self-sufficiency. They send an inordinate number of sons and daughters to serve with distinction in the U.S. armed forces and continue to honor their benefactor, Father Duncan.

 They work hard to preserve their ancient culture in custom, dance and song and have a vigorous program to pass along their language to the next generation through formal school programs that will result in the next generation having three times as many fluent speakers as the current one. They have been so successful in their desire to live prosperously in both cultures that other local tribes have joined their community, such as the Tlingit and the Aleut.

 The Eskimo people are originally from Siberia and live above the Arctic Circle. They are close neighbors with the Athabascan people, whose ancestors also came from Siberia and who live just to the south. Although two distinct tribes with distinct language and customs, they have long been allies in survival in the unforgiving north.

 They too are working hard to preserve their culture with language programs in schools and have successful communities throughout rural Alaska which are closely integrated with the greater Alaskan economy – benefitting, as all Alaskans do, from the booming oil industry, trade in original art and goods and tourism.

 Of course, the Frontier Thesis, which I studied at the graduate level and found compelling, has fallen out of favor among the intelligentsia at the elite universities and hence in the public schools. However, to me the old saw rings true: “Right is right no matter who is against it and wrong is wrong no matter how many are for it.”

 Continuing our history: as a result of the Spanish-American War (1898), the United States gained political control over the Spanish possessions of Cuba, Puerto Rico, Guam and the Philippines. Cuba gained total independence in 1902; the Philippines in 1946 – after insurrection and conquest by Japan; Guam and Puerto Rico became U.S. territories. Puerto Rico is well on its way to becoming our 51st state.

 This accretion of territory through victory in war has been labelled by many a 20th Century historian (especially American historians) as a sure sign that America is an imperialist nation. The argument fails on virtually all points. To wit:

 The Spanish-American War was foisted on the American government by the yellow press – see Hearst and Pulitzer above. Absolutely no strategic planning went into any preparation for war with Spain. In the late 1890s, U.S. public opinion was agitated by anti-Spanish propaganda led by newspaper publishers such as Joseph Pulitzer and William Randolph Hearst, who used yellow journalism to call for war. The business community across the United States had just recovered from a deep depression – the “Panic of 1893”, and feared that a war would reverse the gains. They lobbied vigorously against going to war.

On February 15, 1898, the U.S. Navy cruiser, USS Maine exploded in Havana harbor, resulting in the death of about 260 of her crew. American newspapers immediately blamed Spanish treachery, and “Remember the Maine” became the rallying cry for the Spanish-American War. Later investigations concluded that the tragedy was most likely an accident caused by a fire on board the ship.

Subsequently, political pressure from the Democrat Party pushed the administration of Republican President William McKinley into a war that he had wished to avoid. Spain promised time and again that it would reform, but never delivered. The United States sent an unrealistic ultimatum to Spain demanding that it surrender control of Cuba. The communication read in part:

“We do not want the island. The President has evidenced in every way his desire to preserve and continue friendly relations with Spain. He has kept every international obligation with fidelity. He wants an honorable peace. He has repeatedly urged the government of Spain to secure such a peace. She still has the opportunity to do it, and the President appeals to her from every consideration of justice and humanity to do it. Will she? Peace is the desired end.”

First Madrid declared war, and Washington then followed suit.

As Assistant Secretary of the Navy, Theodore Roosevelt took immediate action. Because his boss, the Secretary of the Navy, was away from the office when war erupted, Roosevelt assumed the title of Acting Secretary of the Navy and sent a telegram to Commodore George Dewey, who commanded the U.S. fleet in the Asian-Pacific. The telegram instructed him that, if war should erupt between Spain and the United States, he was to take offensive action against the Philippine Islands, which were then part of the Spanish Empire.

Dewey followed his orders. Within days after war was declared, Dewey sailed silently from Hong Kong toward Manila, and on the morning of May 1, 1898, launched a surprise attack on the Spanish fleet anchored in the bay. Within mere hours, Dewey had simultaneously captured the Philippines and demonstrated the power of the United States Navy for the very first time.

The rebirth of the United States Navy from neglect following the Civil War began, after a scathing report by the Navy’s Board of Inspection, with the 1883 authorization of the ABCD ships — USS Atlanta, USS Boston, USS Chicago, and USS Dolphin. The first three were protected open-ocean cruisers. They featured hulls made of steel, but without full armor protection.

 Instead, coal bunkers were expected to provide some protection to vital interior machinery. These ships also featured mixed propulsion, able to travel under either sail or steam power. The Dolphin was a smaller vessel intended to rapidly carry messages in the age before wireless communication.

 The first American battleship was the USS Texas, a 2nd-class battleship. She was authorized in 1886, but not completed until 1895, rendering her somewhat obsolete. Her main armament featured a mix of 12-inch and 6-inch guns. Her two main 12-inch gun turrets were mounted along the sides of the ship, staggered forward and aft.

 Perhaps the most famous ship of the new Steel Navy was the protected cruiser USS Olympia. Commissioned in 1895, she was a significant upgrade over the cruisers of the ABCD ships. She served as the flagship of Commodore Dewey’s squadron in the Battle of Manila Bay during the Spanish-American War. She was modernized several times, and participated in convoy duty during World War I. In 1921, the Olympia had the honor of transporting the Unknown Soldier home from France.

 Much like the Texas, the USS Maine was a 2nd-class battleship featuring staggered, side-mounted main guns. She was commissioned in 1895, and originally classified as an armored cruiser. In 1898, the Maine was dispatched to Havana to protect American interests in the midst of Cuban unrest over Spanish rule.

 The dynamite cruiser Vesuvius was a short-lived experiment in weapons technology. She was commissioned in 1890, and featured three fixed gun tubes, through which projectiles were launched with compressed air. Since the guns could not be moved, the entire ship had to be aligned directly towards the target. These almost noiseless guns were used to bombard shore targets during the Spanish-American War. Ultimately, the technology demonstrated by the Vesuvius was deemed a failure, and was abandoned.

 The battleship USS Oregon was one of the most successful and celebrated ships of the new Steel Navy. Commissioned in 1896, she was part of a class of battleships that also included the USS Indiana and USS Massachusetts. At the time of her commissioning, her speed, protection, and firepower made her one of the finest ships in the world.

 Stationed on the West Coast at the outbreak of the Spanish-American War, she was ordered to make a historic dash around South America to reinforce the American squadron operating in the Caribbean. She steamed 14,000 miles around Cape Horn in 66 days, a voyage that was exuberantly celebrated in the media, and with songs and poems. When the Battle of Santiago finally erupted on July 3rd, the Oregon performed with distinction, leading the American fleet to an overwhelming victory over its Spanish adversary.

 “The main issue of the war was Cuban independence; the ten-week war was fought in both the Caribbean and the Pacific. U.S. naval power proved decisive, allowing expeditionary forces to disembark in Cuba against a Spanish garrison already facing nationwide Cuban insurgent attacks and further wasted by yellow fever. Numerically superior Cuban, Philippine, and U.S. forces obtained the surrender of Santiago de Cuba and Manila. Madrid sued for peace with two obsolete Spanish squadrons sunk in Santiago de Cuba and Manila Bay and a third, more modern fleet recalled home to protect the Spanish coasts.

The result was the 1898 Treaty of Paris, negotiated on terms favorable to the U.S., which allowed it temporary control of Cuba and ceded ownership of Puerto Rico, Guam, and the Philippine islands. The cession of the Philippines involved payment of $20 million ($575,760,000 today) to Spain by the U.S. to cover infrastructure owned by Spain.

The defeat and collapse of the Spanish Empire was a profound shock to Spain’s national psyche, and provoked a thorough philosophical and artistic revaluation of Spanish society known as the “Generation of ’98”. The United States gained several island possessions spanning the globe and a rancorous new debate over the wisdom of expansionism. It was one of only five U.S. wars (against a total of eleven sovereign states) to have been formally declared by Congress.”

This chapter in American history cannot fairly be described as imperialism. It was unplanned; militarily unprepared for; a strategic accident made possible by the existence of a modern, professional Navy built for defensive purposes with fewer than ten steel-hulled warships and an aggressive Roosevelt in an accidentally strategic position. There were no plans to manage any territorial gains and no capability to exploit these territories for their natural resources – which America had in abundance.

Several years after the war, America, under President Teddy Roosevelt, created Panama from part of what had been Colombia and established the Panama Canal Zone – taking over a failed French project conceived by Frenchman Ferdinand de Lesseps, the architect of the Suez Canal – while completing the most ambitious construction project in history: the Panama Canal. The canal reverted to Panamanian control in 1999 under an agreement signed by President Jimmy Carter in September 1977.

 Within the past 100 years the United States has helped win World War I (1914-1918), with allies Britain, France and Italy against the Central Powers of Europe (Germany and Austria-Hungary); survived the greatest economic collapse in world history, the Great Depression (1929-1939); won World War II against Nazi Germany and the Empire of Japan (1941-1945); rebuilt Europe and Japan; won the Cold War against the Soviet Union (1946-1991); prevented Communist Chinese and Soviet domination of Southeast Asia; built the strongest economy in history and put men on the moon!

 Today, with the rise of Radical Militant Islamist terrorism and the desire of its most committed members to establish a modern yet savage, barbaric, sadistic and, in some respects, pre-civilization Caliphate across most of the historic lands of Islamic influence – from the Seventh Century to the height of the Ottoman Empire – the Epoch of Conquest could begin anew.

 And all of this, over all of these years, was accomplished by people just like you and me and our neighbors, each with a dream of their own and a belief that that dream can be fulfilled through hard work and the opportunity to try and, if failing, to try again and to be supported in the quest by family, friends, neighbors, church, community and country. Building a new society using old ideals – this certainly is an exceptional record for an exceptional country – our America.

 Unfortunately, even the leaders of the PLDC have only a passing acquaintance with this reality.

 President Obama linked gun control (not the broken mental-healthcare industry) to the June 18, 2015 shooting at the AME church in Charleston, SC, saying the nation needed to “come to grips” with the issue in the wake of the massacre that killed nine people. Speaking to the nation on television from the White House briefing room, he said:

 “At some point, we as a country will have to reckon with the fact that this type of mass violence does not happen in other advanced countries.”

 Oh, really? That, of course, was another colossal lie – only the latest in a list that stretches back to his 2008 campaign. In this case, the facts are readily available. Lone-wolf gunmen went on murderous rampages in: Oslo, Norway 2011 – 77 people killed; the Netherlands 2011 – 6; Belgium – 6; Germany 2009 – 15; England 2010 – 12; Baku, Azerbaijan 2009 – 12; Kanhajok, Finland 2009 – 5; Finland 2008 – 10; Tuusula, Finland 2007 – 8; South Africa 2002 – 10; Nanterre, France 2002 – 8; Zug, Switzerland 2001 – 14; Osaka, Japan 2001 – 8; Port Arthur, Australia 1996 – 35; South Korea 1982 – 56.

 The statistics represent more people killed in more of these lone-wolf incidents around the world than in the U.S. over the same period – many in countries with rigorously restrictive gun laws – leading to the obvious conclusion that gun laws are irrelevant to persons fixated on murder and/or beset with mental illness. In fact – irrefutable fact – every single one of the perpetrators of mass killings in the United States as far back as records go – every single one – had passed the required and rigorous federal and various State background checks required to lawfully purchase a firearm!

 So, obviously, restrictive gun laws are not the solution. A commitment to real, realistic, results-oriented mental-healthcare – one that protects both the mentally ill and the population at large from the mayhem that is commonplace in America today – is the proper course. It is a virtual certainty that, after each of these incidents, evidence will be uncovered showing that the individual responsible for the killings had displayed obvious signs of mental instability – obvious even to non-mental-health workers like family members, friends, classmates, co-workers and acquaintances. Also a near certainty is that no one wanted to confront the issue by voicing their concerns to someone who might have been able to help – because they didn’t want to offend!

 Why are they afraid to offend? Because the (former) President of the United States will call them racists, or bigots or just plain haters. He did that, consistently, and his acolytes in the PLDC have taken up the battle cry themselves. Just listen to the evening news. The President even refused to call the worldwide Islamic Radical Extremists “terrorists” – which is what they call themselves – proudly.

 So, why would our President say such things? To degrade the United States in the eyes of the world because he, along with most progressive/liberal/Democrats, does not believe that we are exceptional at all – as he has specifically stated. In his mind, obviously, we are a people who have done more harm than good in the world since our founding. He has specifically spoken out about white police officers shooting black citizens on numerous occasions – while black thugs are massacring fellow blacks in unprecedented numbers.

 As usual, the real issue is being ignored because it does not fit the standard progressive/liberal/Democrat template – that white America is inherently evil when it comes to non-whites. Yet, over the past 35 years in America, an estimated 324,000 (yes – 324,000!!!) black Americans have been killed at the hands of fellow blacks, amounting to more than 93% of the total black deaths by violence. That is more deaths than the Union Army suffered in the Civil War to free black Americans from slavery. It would take all of America’s local police more than 3,375 years to kill that many black Americans. (See below)

 A six-year FBI study conducted between 2007 and 2012 found that, on average, local police forces kill about 400 people every year in America. Of those, only 96, or less than 25%, involve blacks killed by white police officers. The 2007 U.S. Bureau of Justice Statistics report shows the agency reported that blacks were victims of 7,999 homicides in 2005. Blacks killed by other blacks numbered 7,440 murders in that year alone.

 Acknowledging the Judeo-Christian doctrine and the Constitution’s fiat that all lives are important, compared to those 7,999 victims, the 96 blacks unfortunately killed by white police officers that year amount to only 1.2% of the total violent deaths – compared to 93% of the deaths caused by other blacks – that’s 7500% more. The mantra chanted by the nation’s first black President – that blacks are being “hunted” by white police officers is a monumental lie – perpetrated and perpetuated by the PLDC.
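
The percentages above follow directly from the figures already quoted in this section. A quick sketch (the inputs are simply the numbers cited in the text, not an independent data pull):

```python
# Figures as quoted above (BJS 2005 report; FBI study averages, 2007-2012).
black_homicide_victims_2005 = 7_999   # total black homicide victims, 2005
black_on_black_2005 = 7_440           # killed by other blacks, 2005
killed_by_white_police_per_year = 96  # annual average, per the cited FBI study
black_deaths_35_years = 324_000       # estimated black-on-black deaths, 35 years

pct_white_police = killed_by_white_police_per_year / black_homicide_victims_2005 * 100
pct_black_on_black = black_on_black_2005 / black_homicide_victims_2005 * 100
years_at_police_rate = black_deaths_35_years / killed_by_white_police_per_year

print(round(pct_white_police, 1))    # 1.2 (percent)
print(round(pct_black_on_black, 1))  # 93.0 (percent)
print(round(years_at_police_rate))   # 3375 (years)
```

Note that 93.0 / 1.2 is roughly a 77-fold difference, which is the basis of the "7500% more" comparison in the text.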

 Then again, President Obama may have been relying on one of the compromised progressive/liberal news outlets for his statistics. Consider this.

 The New York Times claimed that “…there should be no disputing” a new report, which claims there were 722 gun deaths nationwide from May 2007 to February 2015 attributed to concealed permit holders that were not self-defense. The Times asserts: “The full death toll attributable to concealed carry is undoubtedly larger.” The Times was too eager, or too incompetent, to question a report from the Violence Policy Center. A cursory Google search would have shown the Violence Policy Center (VPC) has a history of making up numbers to support its lies.

 The VPC keeps a record of permit holder abuses in each state. Take the claimed worst state, Michigan. The VPC cites state police and media reports indicating that permit holders committed 277 suicides or murders during the period from 2007 through 2015 (217 suicides and 60 murders). If those numbers are accurate, the VPC apparently believes that 38% of all 722 deaths nationwide that it attributed to permitted concealed handguns occurred in Michigan! Huh?

 First of all, suicides are not in any meaningful way linked to the act of carrying a permitted concealed handgun outside of one’s home. The Michigan State Police reports it does not collect information on how the suicides were committed, just that permit holders committed suicide.

 Interestingly, the 2013 suicide rate among Michigan permit holders (6.2 per 100,000 permit holders) is lower than the rate among the general adult population (16.59 per 100,000). Typically, suicides – with or without guns – take place at home. So, again, what do these numbers have to do with the concealed-carry debate?

 The VPC’s murder and manslaughter statistics are just as problematic. This is how the Michigan State Police reports the number of pending cases and convictions: Total: Pending 18, Convicted 12. The VPC totals differ only slightly: 19 pending cases and 13 convictions. But the problems arise in what the VPC includes in its count.

 The VPC makes what is perhaps its most inexcusable mistake by adding the “pending” and “conviction” numbers together. Convictions are obviously what should be counted. After all, some of the “pending” cases represent legitimate self-defense cases. Adding them more than doubles the supposed total number of murders. In addition, since murder cases often take years before going to trial, some of the homicides may have occurred well before 2007. If a case is pending for three years, the VPC counts that as three separate murders!

 There is even more numerical nonsense. The VPC then adds in 26 cases that were reported in newspapers or other media outlets over the same years. However, either the Michigan State Police had already counted those cases in the official statistics, or the cases were never legally pursued. All in all, the VPC has managed to at least triple-count the true number of cases of permit holders killing people. Furthermore, the vast majority of these killings were suicides, or legitimate self-defense shootings.

 Assume, for the sake of argument, that the VPC was accurate in claiming concealed-handgun permits were responsible for 722 deaths over almost eight years. It puts things in context when one realizes there are currently more than 12 million concealed-handgun permits in the U.S. An annual death toll of about 93 (722 deaths over roughly 7¾ years) would mean that, each year, 0.00078% of concealed-carry permit holders were responsible for a shooting death. Removing suicides from the total reduces the rate to 0.00054%.
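
Taking the VPC’s own totals at face value, the rate arithmetic works out as follows. A minimal sketch, assuming the 7¾-year span (May 2007 to February 2015) and the 12-million-permit figure quoted in the text:

```python
# VPC-claimed totals as quoted above; this merely reproduces the division.
deaths = 722                  # claimed deaths, May 2007 - Feb 2015
years = 7.75                  # roughly 7 3/4 years in that window
permit_holders = 12_000_000   # concealed-carry permits nationwide (lower bound)

annual_deaths = deaths / years                          # about 93 per year
annual_rate_pct = annual_deaths / permit_holders * 100  # share of permit holders

print(round(annual_deaths))       # 93
print(round(annual_rate_pct, 5))  # 0.00078
```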

 Sort of blows a hole in the “gun nut” theory that the progressive/liberal/Democrat cabal has been peddling, doesn’t it?

Fortunately, nothing in President Obama’s version of reality about white America has in any way affected the exceptional nature of the American people as a whole – a people who have contributed more than $5 trillion over the past 50 years in a failed effort to eradicate poverty in America – most especially when it comes to violence against African-American citizens.

Sensationalizing the few instances of white-on-black police shootings – or mass shootings by whites in the black community – for political gain does not change the facts – it only perpetuates tensions in the nation’s black community and postpones the day when America is allowed to become truly “post-racial”.

 In point of fact, the exceptional nature of the three-quarters of the American people who do not align themselves with the progressive/liberal/Democrat cabal is even more evident when it comes to race relations. It has historically been the Democrat Party that has perpetuated the abuse of blacks in America. After all, it was the Democrat South that decided it was better to fight a war against the rest of America than allow the slaves to be set free. It was the Democrat Party that permitted and protected “Jim Crow” in the South for one hundred years after Emancipation. And it has been the Democrat Party, in control of Congress for forty years, that has led the black community into “economic slavery” by perpetually proclaiming a phony “racism” by the rest of America at every opportunity since the great civil rights battles of the 1960s – battles in which Congressional Republicans voted in higher percentages than Democrats to pass the Civil Rights Act and the Voting Rights Act.

 Because the progressive/liberal/Democrat cabal controlled government at all levels for so long, it created the myth that government was the solution to economic hardship when, in truth, government cannot create wealth – it can only redistribute it. That is what the progressive/liberal/Democrat cabal has tried to do, and it has failed miserably, as we have shown. Continuing down this path will only continue to undermine that which has made America exceptional. That is truly subversive.

 America’s wealth was created by our immigrant ancestors, black, brown, yellow and white together – visionaries all – builders, explorers, farmers, hunters, inventors, laborers, merchants, seamen, singers, soldiers, trappers, traders, writers – industrious, ingenious, indomitable, incomparable spirits – bustling, brave, brazen dreamers who were willing to risk – never articulated any better than by English poet Rudyard Kipling:

 If you can keep your head when all about you
 Are losing theirs and blaming it on you,
 If you can trust yourself when all men doubt you,
 But make allowance for their doubting too;
 If you can wait and not be tired by waiting,
 Or being lied about, don’t deal in lies,
 Or being hated, don’t give way to hating,
 And yet don’t look too good, nor talk too wise:

 If you can dream—and not make dreams your master;
 If you can think—and not make thoughts your aim;
 If you can meet with Triumph and Disaster
 And treat those two impostors just the same;
 If you can bear to hear the truth you’ve spoken
 Twisted by knaves to make a trap for fools,
 Or watch the things you gave your life to, broken,
 And stoop and build ’em up with worn-out tools:

 If you can make one heap of all your winnings
 And risk it on one turn of pitch-and-toss,
 And lose, and start again at your beginnings
 And never breathe a word about your loss;
 If you can force your heart and nerve and sinew
 To serve your turn long after they are gone,
 And so hold on when there is nothing in you
 Except the Will which says to them: ‘Hold on!’

 If you can talk with crowds and keep your virtue,
 Or walk with Kings—nor lose the common touch,
 If neither foes nor loving friends can hurt you,
 If all men count with you, but none too much;
 If you can fill the unforgiving minute
 With sixty seconds’ worth of distance run,
 Yours is the Earth and everything that’s in it,
 And—which is more—you’ll be exceptional, my son!
 You’ll be American.

 (With sincere apologies to Kipling)

 So, it is obvious that some Americans are not blessed with the “exceptional” gene; instead they are infected with the “subversive” virus. Without the progressive/liberal/Democrat cabal, race relations in America would already be at the post-racial stage and American exceptionalism would be universal.

 Finally, it is truly tragic that the first African-American President in American history – elected with more than 95% of the African-American vote – chose not only to ignore the plight of America’s African-American community, but to actively encourage division among the communities through his race-baiting comments and by injecting his Justice Department into every local issue with even the slightest of racial overtones. That community remains mired in progressive/liberal poverty, its families decimated by failed federal policies that glorify dependence upon government rather than family self-sufficiency, with generations lost to the bigotry of low expectations – and with blame laid solely at the feet of a white America that would be thrilled to have the African-American community join the parade.

 By missing the historic and exceptional opportunity to heal America’s racial wounds, he was, at the very least, guilty of malpractice – perhaps of criminal negligence – and has cemented his legacy as the most pedestrian and unexceptional of Presidents who has condemned the African-American community to continued suffering at the hands of the PLDC – which is the only community that benefits from division in America. Next time: Personal Responsibility.

American Exceptionalism

As alluded to above, the third casualty in this loss of faith has been the belief in an American exceptionalism – and that belief must be the next principle restored. So, let’s look at more truth that will never make its way into the public-schools or the news or, by any other means, into America’s general consciousness.

 The belief in an American exceptionalism has been the animating force behind America’s purpose since the beginning. In fact, it pre-dates the idea of America itself. The brave adventurers who sailed in hundreds of tiny, flimsy, wooden ships to a New World surely must have thought of themselves as exceptional, and not in an egotistical way. They had exceptional dreams and an exceptional faith that they could be successful in a new, hostile and uncharted world. Throughout our history, immigrants coming to our shores certainly shared the same faith and optimism.

 Massachusetts Bay Colony’s Governor John Winthrop said in his famous “City on a Hill” sermon in 1630:

 “… the Lord will be our God and delight to dwell among us, as his owne people and will commaund a blessing upon us in all our wayes, soe that wee shall see much more of his wisdome power goodnes and truthe then formerly wee have beene acquainted with, wee shall finde that the God of Israell is among us, when tenn of us shall be able to resist a thousand of our enemies, when hee shall make us a prayse and glory, that men shall say of succeeding plantacions: the lord make it like that of New England: for wee must Consider that wee shall be as a Citty upon a Hill, the eies of all people are uppon us;”

 But before we can appreciate how far we have come in our first 250 years, we need to understand the context of the world we have been sharing with other great nations during the same period. So, what have the other great powers been doing during the past 250 years since our birth in 1776?

 England has been shrinking from an empire – upon which the sun never set – to the United Kingdom of England, Wales, Northern Ireland and Scotland (oh, wait, Scotland’s virtually gone). This was because the cost of keeping an empire was too much, even for the doughty British. England governed one of the Middle Eastern mandates after the First World War.

 France had a bloody Reign of Terror in the 1790s, the Emperor Napoleon who set Europe ablaze; was captured; exiled to the island of Elba; escaped to raise an army and fight again; was defeated at Waterloo; exiled to the island of St Helena in the South Atlantic where he died (probably poisoned), then a series of failed republican governments and one last monarchy under Napoleon III until the Franco-Prussian War in 1870; and the first of many more republics was installed. France governed one of the Middle Eastern mandates after World War I and spent most of World War II under German occupation.

 Germany didn’t exist until 1871. It then began to compete for influence and empire with relatives in England, Russia and France under Chancellor Otto von Bismarck, leading to World War I. A bitter peace imposed by the Allied Nations at Versailles in 1919 led to an economic collapse and the rise of Adolf Hitler and, subsequently, to World War II. Following the destruction of the Third Reich in 1945, Germany was split into East (under a breathtakingly inept Russian domination) and West Germany, which was rebuilt under the Marshall Plan and, now unified, is the leading economy in modern Europe.

 Spain was a leading imperial power when America declared independence. In the years since, it competed with England for empire, finally losing the last of its possessions – almost by accident – to the United States during the Spanish-American War in 1898-99. Spain endured a bloody civil war in the 1930s, was a sideshow in World War II and has struggled with Basque separatists over the last few decades.

 Italy didn’t exist until the 1860s, when the various cities and states were unified under a legend in his own mind – Victor Emmanuel. A minor imperial power, it sided with the Allied Powers in World War I, brought the Fascist Benito Mussolini to power in the 1920s, invaded Ethiopia in the 1930s, sided with the Nazis and Adolf Hitler in World War II, was destroyed and rebuilt under the Marshall Plan and has endured many, many democratic governments since.

 China was under the thumb of the British Empire for most of this period. Although fighting alongside the allies in World War II, it soon fell to the Communists of Mao Tse-tung in 1949, when American congressional liberals caused American support to dwindle. It backed the Communist governments of Kim Il-Sung in North Korea in the 1940s, precipitating the Korean War, and of Ho Chi Minh in North Vietnam in the 1950s, precipitating the Vietnam War. It has recently been flexing its own imperialist wings in Southeast Asia and in support of an unstable, nuclear-capable North Korea.

 Japan was a mysterious and closed society until Commodore Matthew Perry opened trade negotiations in the 1850s. Under an Emperor, it challenged Imperial Russia for Far East power in the early years of the 20th Century, winning the Russo-Japanese War in 1905 and joined the Allied Powers in World War I. Chafing under demands for naval reductions, Japan left the world disarmament negotiations and invaded Manchuria in 1931 to begin fulfilling imperialist ambitions. Attacking America at Pearl Harbor on December 7, 1941, Japan entered into war against the United States and its allies. The Empire was completely devastated during the war, finally surrendering after losing entire cities to American atomic bombs in August 1945. It was rebuilt by America after the war and is now a prosperous and valued ally.

 Russia was an imperialist monarchy when America declared independence. The Tsars brutalized the population until workers inspired by communist leader Vladimir Lenin rose up to challenge imperial power in the early 20th Century. Despite concessions resulting in a form of parliamentary government (the Duma), the communists mounted a revolutionary uprising in 1917 which overthrew the Tsar, resulting in his execution and that of the Royal Family at Yekaterinburg in July 1918. Lenin ruled for several years, died suddenly and was replaced by Joseph Stalin. Stalin allied with Nazi Germany early in World War II but, after Hitler turned on Russia, allied himself with America and her allies.

 After victory in Europe, the Soviets occupied Eastern Europe and allied with Communist China after 1948 to challenge the Free World for world domination – a struggle known as the Cold War. Finally, their economy collapsed under competition with the United States under President Ronald Reagan and the empire known as the Soviet Union dissolved in 1991. It now operates under a corrupt oligarchy (the government is owned by wealthy and corrupt officials) and has begun to reassemble the old Imperial Russia.

 India was part of the British Empire until 1947.

 The Ottoman Empire’s history is not as commonly known. It was founded by Turks under Osman Bey in 1299 and consisted of many of the lands of the Caliphate established by the followers of Mohammed in the century following the invention of Islam. With the conquest of Constantinople by Mehmed II in 1453, the Ottoman state was transformed into an empire. With Constantinople as its capital and control of lands around the Mediterranean basin and the Persian Gulf, the Ottoman Empire was at the center of interactions between the Eastern and Western worlds for six centuries, primarily along the Silk Road, upon which the Venetian merchant Marco Polo was a prominent traveler in the late 13th Century.

 This was not, as stated above, the first attempt by Muslims to build a world empire or what they refer to as “The Caliphate”. In the century following Mohammed’s founding of Islam and his instruction to convert the world or, failing that, kill all “infidels”, they conquered vast areas of the Middle East as far as the Indus River and the Persian Gulf, North Africa as far south as the Horn of Africa and as far west as the Atlantic Ocean, and crossed the Straits of Gibraltar into Spain and Southern France.

 Their conquests were halted at the Battle of Tours (732AD), fought in an area between the cities of Poitiers and Tours in north-central France. The battle pitted Frankish and Burgundian forces under Charles Martel against an army of the Umayyad Caliphate led by Abdul Rahman Al Ghafiqi, Governor-General of al-Andalus, the Andalusian region of Spain. The Franks were victorious. Al Ghafiqi was killed, and Martel subsequently extended his authority in the south.

 During the 16th and 17th Centuries, in particular at the height of its power under the reign of Suleiman the Magnificent, the Ottoman Empire was a powerful multinational, multilingual empire controlling much of Southeast Europe (reaching as far as the gates of Vienna in 1529), Western Asia and the Caucasus, North Africa and the Horn of Africa.

The Empire declined after this due to “degenerate Sultans, incompetent Grand Viziers, debilitated and ill-equipped armies, corrupt officials, avaricious speculators, grasping enemies, and treacherous friends.” By the mid-19th Century, the Ottoman Empire was called the “sick man of Europe”. The last quarter of the 19th and the early part of the 20th Century saw some 7–9 million Turkish-Muslim refugees from the lost territories of the Caucasus, Crimea, the Balkans and the Mediterranean islands migrate to Anatolia and Eastern Thrace.

 In November 1914, the Empire entered World War I on the side of the Central Powers (Germany and Austria-Hungary), taking part in the Middle Eastern theater. There were several important Ottoman victories in the early years of the war, such as the Battle of Gallipoli and the Siege of Kut [a town 100 miles south of Baghdad], but there were setbacks as well, such as the disastrous Caucasus Campaign against the Russians.

 In 1915, as the Russian Caucasus Army continued to advance into ancient Armenia, aided by some Ottoman Armenians, the Ottoman government started the deportation and massacre of its ethnic Armenian population, resulting in what became known as the Armenian Genocide [denied to this day by the Turks]. Massacres were also committed against the Greek and Assyrian minorities.

 The “Arab Revolt”, which began in 1916, turned the tide against the Ottomans on the Middle Eastern front. The empire was dissolved in the aftermath of World War I, leading to the emergence of the new state of Turkey in the Ottoman Anatolian heartland, as well as the creation [with the help of T.E. Lawrence – “Lawrence of Arabia” – and the remarkable and influential adventurer and archeologist, Gertrude Bell] of the modern Middle Eastern states of Israel, Lebanon, Syria, Iraq, Jordan and the Persian Gulf states under European “mandates”, which were governed by European powers until after World War II.

 The occupation of Constantinople and Izmir led to the establishment of a Turkish national movement which won the Turkish War of Independence (1919–22) under the leadership of Mustafa Kemal Atatürk. The sultanate was abolished November 1, 1922. Turkey, a secular, Turkic (non-Arab) state, is now in the process of applying for membership in the European Union.

 Things look a little different from this side of the oceans – oceans that separate us from the rest of the world much more than just geographically. What have our ancestors been doing in America since our beginnings?

 The first settlers had a simple purpose – survival. Imagine having no idea where you were going; no idea where you were when you got there; no idea if the aboriginals were friendly; no idea where your next meal would come from; no idea how cold it would get or how you would shelter; no idea what was poisonous or what animals were dangerous; no idea if you would even live to see tomorrow. Just imagine.

 But live they did, and prospered. They filled the entire East Coast of the continent with settlers. They were governed by appointed English governors who set up local governments after the fashion they had known in England using the (unwritten) English Constitution and laws as a model. They set about being good Englishmen.

 Then England virtually abandoned them to their own devices for more than half a century after England’s “Glorious Revolution of 1688” and the subsequent civil wars over the fall of the Stuart Dynasty. During this time, they became a unique people. They became Virginians, New Yorkers, Pennsylvanians, Rhode Islanders, Carolinians and the rest. The residents of these “states” (more like ‘states-of-mind’) became “citizens” and cooperated with each other to become “united colonies”.

The Glorious Revolution, also called the Revolution of 1688, recalls the overthrow of King James II of England, Scotland and Ireland by a union of English Parliamentarians with the Dutch “stadtholder” William III of Orange-Nassau (William of Orange). William’s successful invasion of England with a Dutch fleet and army led to his ascending the English throne as William III of England jointly with his wife Mary II of England (daughter of James II), in conjunction with the written documentation of a Bill of Rights, in 1689.

King James’ policies of religious tolerance after 1685 met with increasing opposition from members of leading political circles, who were troubled by the king’s Catholicism and his close ties with Catholic France. The crisis facing the king came to a head in 1688 with the birth of the King’s son, James Francis Edward Stuart. This changed the existing line of succession by displacing the heir-presumptive, his daughter Mary, a Protestant and the wife of William of Orange, with the young James now as heir-apparent.

The establishment of a Roman Catholic dynasty in the kingdoms now seemed likely. Some of the most influential leaders of the English Parliament’s Tories united with members of the opposition Whigs (a group known as the Immortal Seven – one bishop and six nobles) and set out to resolve the crisis by inviting William of Orange to England – an invitation which the “stadtholder”, who feared an Anglo-French alliance, had indicated was a condition for a military intervention.

After consolidating political and financial support, William crossed the North Sea and English Channel with a large invasion fleet in November 1688 (numbering about 15,000 men), landing at Torbay (the first successful invasion of England since that by Isabella of France in 1326 to depose her erratic husband, Edward II, in favor of her son, the great Edward III). After only two minor clashes between the two opposing armies in England, and anti-Catholic riots in several towns, James’s regime collapsed, largely because of a lack of resolve shown by the king.

However, this was followed by the protracted Williamite War in Ireland and Dundee’s Rising in Scotland. In England’s distant American colonies, the revolution led to the collapse of the Dominion of New England and the overthrow of the Province of Maryland‘s government.

The “risings” were a series of uprisings, rebellions, and wars in England, Scotland and Ireland occurring between 1688 and 1746. The uprisings had the aim of returning James II of England, and later his descendants of the House of Stuart, to the throne of Great Britain after they had been deposed by Parliament during the Glorious Revolution. The major risings were known respectively as “the Fifteen” and “the Forty-five”, after the years in which they occurred (1715 and 1745).

After the House of Hanover succeeded to the British throne in 1714, in the person of King George I (whose mother was a granddaughter of James I), upon the death of Queen Anne – daughter of James II, who left no surviving children – the risings continued and intensified. They continued until the last rising (“the Forty-five”), led by Charles Edward Stuart (the Young Pretender), who was soundly defeated at the last battle ever fought on British soil – the bloody Battle of Culloden in 1746. This ended any realistic hope of a Stuart restoration.

Following a defeat of his forces at the Battle of Reading on December 9, 1688, James and his wife fled England; James, however, returned to London for a two-week period that culminated in his final departure for France on December 23.

By threatening to withdraw his troops, William, in February 1689, convinced a newly chosen Convention Parliament to make him and his wife joint monarchs – the reign of William and Mary.

The Revolution permanently ended any chance of Catholicism being re-established in England. For British Catholics its effects were disastrous both socially and politically: Catholics were denied the right to vote and sit in the Westminster Parliament for over a century; they were also denied commissions in the army, and the monarch was forbidden to be Catholic or to marry a Catholic, this latter prohibition remaining in force until the UK’s Succession to the Crown Act of 2013 removed it in 2015.

The Revolution led to limited toleration for Nonconformist Protestants, although it would be some time before they had full political rights. It has been argued, mainly by Whig historians, that James’ overthrow began modern English parliamentary democracy: the Bill of Rights of 1689 has become one of the most important documents in the political history of Britain and never since has the monarch held absolute power.

Internationally, the Revolution was related to the War of the Grand Alliance on mainland Europe. It has been seen as the last successful invasion of England. It ended all attempts by England in the Anglo-Dutch Wars of the 17th century to subdue the Dutch Republic by military force.

However, the resulting economic integration and military co-operation between the English and Dutch navies shifted the dominance in world trade from the Dutch Republic to England and later to Great Britain. In turn, this would embolden the British, under a new monarch – George III – into aggressive trade policies toward her North American colonies.

Eventually, after 1763, with George III on the English throne, the colonials’ world changed. Coming together as a People for the first time, they protested as Englishmen the imposition of the Stamp Act of 1765, which was the first internal tax levied directly on American colonists by the British government. The act came at a time when the British Empire was deep in debt from the Seven Years’ War (1756-63) in both Europe and North America and looking to its North American colonies as a revenue source – even though the colonists had fought alongside British regulars against the French and their tribal allies on the North American continent, helping the Mother Country win that war.

 (Thanks for your sacrifice now, here is what you owe us for the opportunity! I don’t think the colonists agreed.)

 Arguing that only their own representative assemblies could tax them, since they had no representatives in Parliament, the colonists insisted that the act was unconstitutional. Parliament repealed the Stamp Act in 1766, but issued a Declaratory Act at the same time to reaffirm its authority to pass any colonial legislation it saw fit. The issues of taxation and representation raised by the Stamp Act strained relations with the colonies to the point that, 10 years later, the colonists rose in armed rebellion against the British.

 The war for independence lasted from 1775 to the victory at Yorktown in 1781. By 1787, the ineffective government created by the Articles of Confederation was failing and American leaders called for a Constitutional Convention of the States to create a new form of government. Their work was ratified by citizens in each state by mid-1788 and the United States of America was off and running.

 And run they did. Over the next century, Americans built a nation of laws, not men. As they conquered the western frontier and tamed a continent, they brought their Constitution with them. In that context, they were the last of the conquerors in North America, ending that chapter of documented human history known as the “Epoch of Conquest” – a chapter which accompanied the rise of civilization throughout the world and which saw, in North America, thousands of years of aboriginal tribal conflict and countless bloody and brutal tribal conquests from Atlantic to Pacific.

 As soon as the Constitution was ratified, Americans began a campaign to abolish slavery, a movement known as Abolition. Their forerunners, anti-slavery delegates to the Constitutional Convention, had managed to include in Article I, Section 9 a provision permitting Congress to end the importation of slaves after twenty years (by 1808). Although not entirely successful in the divisive national debates surrounding the Missouri Compromise of 1820, the Compromise of 1850 and the Kansas-Nebraska Act of 1854 – all addressing the entry of new States into the Union as free or slave states – they kept their campaign alive in America’s consciousness.

 Unfortunately, their efforts were not enough to prevent civil war, but they were important in the 1860 election of Republican Abraham Lincoln, a candidate the slave states feared would ultimately try to end slavery in America. Even before he was inaugurated in March 1861, southern States were seceding from the Union. With that fact as prologue, Lincoln set about to save the Union.

 By war’s end in 1865, more than 300,000 Americans fighting for the Union had died. Lincoln had freed the southern slaves to encourage them to escape their masters and join the Union war effort. He had also played a critical role in the monumental Congressional victory of the passage of the 13th Amendment to the Constitution, which forever ended slavery in America – a scourge that had existed since before the dawn of recorded history and which still exists today, especially in areas of the world under Islamic control.

 After the Civil War, essentially our second revolution, America embarked on an Industrial Revolution, entered the modern world, became socially responsible and emerged as a world power after the Spanish-American War – fought, in part, to end the reconcentrado, or concentration camp system, a new form of slavery imposed by Spain in 1896 on its colony in Cuba, only 90 miles from the American coast. Cuba’s rural population was forcibly confined to centrally located garrison towns, where thousands died from disease, starvation, and exposure.

 During this same period, Americans closed the frontier by establishing States, counties and municipalities that were finally contiguous from sea to sea and connected to each other by road and rail.

 Working from east to west and from west to east, the Central Pacific Railroad had completed the first rail route through the Sierra Nevada Mountains by 1868. More than 4,000 workers, of whom two thirds were Chinese immigrants, had laid more than 100 miles of track at altitudes above 7,000 ft. to meet the line of the Union Pacific Railroad – built primarily by Irish immigrants – and the railheads finally met at Promontory Summit, Utah Territory, where the First Transcontinental Railroad in the United States was officially completed on May 10, 1869. A specially chosen Chinese and Irish crew had taken only 12 hours to lay the final 10 miles of track in time for the ceremony. No Chinese workers were allowed to participate in the historic commemorative photograph. Next time: The Frontier Thesis.

Chaos at Home, Cancer in the Classroom

In these PEG workshops, “teachers are trained to make sure black kids ‘feel respected,’ and to listen to their complaints without judgment or criticism.” Misbehaving kids are handed a “talking stick” and encouraged to emote about the issues underlying their anger. More often than not, they are treated as victims, even if they start fights or threaten teachers.

 No longer can teachers in these programs deal swiftly with a disruptive child by removing him from class. Conflicts take days, even weeks to resolve as schools coordinate talking circles around the schedules of teachers, principals, counselors, parents and even campus police — all of whom must take time out and meet to deal ever-so-delicately with a single problem student.

 And that doesn’t include the in-class circles also required under the restorative approach. Teachers are trained never to snap at a mouthy student interrupting a lesson but rather to gather students in a circle to share their feelings about the problem. Even if such pow-wows defuse conflicts, they take an inordinate amount of time away from academic instruction. They also give troublemakers incentive to continue causing trouble [which gives them the attention and street-cred they crave].

 Instead of being kicked out of school or suffering other serious punishment, even repeat offenders can negotiate the consequences for their bad behavior, which usually involve paper-writing and “dialogue sessions.” “RJ (restorative justice) can encourage misbehavior by lavishing attention on students for committing infractions,” warns Paul Bruno, who participated in talking circles while teaching middle school in Oakland and South-Central Los Angeles. In fact, he added in a Scholastic.com blog, “the circles may unwittingly allow already assertive students to leverage their social dominance even further inside the classroom.”

Restorative justice activists argue the program combats bias that contributes to disproportionate discipline, suspensions, drop-outs and the “school-to-prison pipeline.” But all too often, it merely provides rowdy students an excuse for continued bad behavior.

 New York public schools may get their suspension numbers “right” under the new racially correct discipline standards. But their enrollment numbers will likely suffer in the process, as more students — and teachers — transfer to safer private or charter schools. In a misguided effort to be “fair” to a few, politicians are hurting the education of the many. Don’t believe it? Read on.

 According to the Huffington Post, citing U.S. Census Bureau data, “New York spent $19,076 per student in the 2011 fiscal year, as compared to the national average of $10,560.” In 90 of the city’s public schools, Families for Excellent Schools found that not a single African-American or Hispanic student received a passing grade on state tests! If there were a correlation between spending and achievement, it ought to show in grade and graduation performance and state test scores, but it doesn’t.

 A Heritage Foundation study concludes: “Continued spending increases (on public education) have not corresponded with equal improvement in American educational performance. Long-term National Assessment of Education Progress reading scale scores and high school graduation rates show that the performance of [public-school educated] American students has not improved dramatically in recent decades even though education spending has soared.” [How “fair” is that to the children?]

 This approach, of course, is right out of the PLDC playbook – deny, obfuscate, lie with statistics, lead with your feelings, ignore reality and lie, lie, lie until you get more loyal voters who, with clouded judgment, are addicted to your freebies – in this case free child-sitting, free breakfast and lunch (sometimes dinner, too), free transportation, free school supplies, free entertainment from teachers and staff and free stress-relieving, cage-fighting lessons using live opponents – teachers.

 The report cards children receive – that all children receive – are an evaluation of the quality of their home environment as well as of their performance in the classroom. Students who have behavioral issues are reflecting their upbringing and are raising the red flag of failure for their family. So, in one respect the PLDC is right – these children should not be held solely responsible for their behavior. It is their family that is principally responsible and must be held accountable for their children’s behavior.

 Any successful intervention for these children must be conducted in the home; must include the parents, siblings and extended family; must be conducted by people trained in social work – not by teachers, who are neither trained nor equipped to solve these problems; and must begin at the first sign of trouble.

 In many circumstances, social services do not get involved at the family level until a child is hospitalized or killed as a result of a home environment that has been dysfunctional for years. These situations can be prevented by action when the red flags first go up – which they do as early as pre-school.

 But, in order to be successful, the state must be ready and willing to remove the child from the abusive environment at the outset. Without this possibility, there will be insufficient incentive for the family to take the intervention seriously. They must have a stake in the outcome as much as society does – a society that realizes that a product of a dysfunctional environment is going to cost everyone for that child’s entire life and to which society turns a blind eye at its own peril.

 This situation will not get better with time because we are now three generations into a public-school system controlled from Washington, DC under Lyndon Johnson’s Great Society umbrella that begins with the premise that African-Americans cannot make it in America without the cradle-to-grave help from the federal government. The federal government has succeeded in making that belief a reality. The tragedy is that it is so disgustingly wrong and so cruelly destructive for so many good kids.

 I dread the number of teacher assaults and murders it is going to take before America admits that the solution to the problem is not in the classroom – it is in the living room. It is in the living rooms (if I were proposing a study, this would be my thesis) that these kids learn disrespect for authority, aggressive behavior and the verbal, psychological and physical skills of intimidation and violence; where they are first exposed to racism and racial hatred; and where they are conditioned to view the real world through the lens of parents, siblings, extended family, neighbors and friends who learned these lessons from the previous generation.

 Of course, this situation is not limited to the African-American community. It is prevalent in all economically challenged communities where the lack of responsibility as a social indicator is widespread and has been passed down for generations. Virtually any excuse for the inability to succeed is accepted, despite the opportunities that society provides in the form of temporary relief from need and paths to employment. Without a moral sense of responsibility to family, community and employers, however, success is impossible.

 The conditions that have led to the poverty and hopelessness in the inner cities are not primarily racial – they are the result of social pathology, which crosses all boundaries – racial, ethnic, gender, national origin and the rest.

 There are countless studies showing that children (for the most part) are born into this world as loving, forgiving, curious, happy little people. In the African-American community specifically, for all of the reasons we have read about in this treatise, by the time their babies get to school, they are already psychologically damaged – some irretrievably so – and they act out the lessons they have learned at home.

 The unique problem with the African-American community is that they have been conditioned for generations – perhaps centuries – to distrust the white community – the very community that has demonstrated success in creating sound family, educational and economic communities throughout the nation in urban, suburban and rural areas.

 Suggesting that African-Americans follow the White-American example will fall on deaf, if not openly hostile, ears. For those same generations, so-called black leaders, who continue to heavily influence the African-American community, have betrayed their people by working to create the government dependent society that exists today.

There are African-American voices of reason in America. Dr. Thomas Sowell and Dr. Walter Williams are great examples of honest brokers for the African-American community. It is difficult for them to find a widespread public voice, however, because the press/media prefer the flamboyant race-baiters who strive for the sensational – with an eye toward the profit line – and perhaps to avoid their extortion industry.

Dr. Sowell is an American economist, social theorist, political philosopher, author and Senior Fellow at the Hoover Institution, Stanford University. He was born in North Carolina, but grew up in Harlem, New York and has served on the faculties of several universities, including Cornell University and the University of California, Los Angeles. He has also worked for think-tanks such as the Urban Institute. Since 1980, he has worked at the Hoover Institution at Stanford University. Sowell has written more than thirty books (a number of which have been reprinted in revised editions), and his work has been widely anthologized. He is a National Humanities Medal recipient.

Dr. Williams is an American economist, commentator, and academic and the John M. Olin Distinguished Professor of Economics at George Mason University. Williams has been a professor of economics at George Mason University since 1980, and was chairman of the University’s Economics department from 1995 to 2001. He had previously been on the faculty of Los Angeles City College, California State University – Los Angeles, Temple University and Grove City College. Williams was awarded an honorary doctorate degree at Universidad Francisco Marroquin. Williams has written ten books and hundreds of articles. His syndicated column has been published weekly in approximately 140 newspapers across the United States, as well as on several web sites by Creators Syndicate. He also wrote and hosted documentaries for PBS in 1985. The “Good Intentions” documentary was based on his book The State Against Blacks.

Because the United States was founded on Judeo-Christian principles, perhaps a consideration of the role of a Christian education is worth some thought. Listen to Rod Dreher, the senior editor for The American Conservative.

 “Education has to be at the core of Christian survival—as it always was,” says Michael Hanby, a professor of religion and philosophy of science at Washington’s Pontifical John Paul II Institute. One of the most important pieces of the Benedict Option movement is the spread of classical Christian schools.

“Rather than letting their children spend forty hours a week learning “facts” with a few hours of worldview education slapped on top, parents need to pull them from public schools and provide them with an education that is rightly ordered—that is, one based on the premise that there is a God-given, unified structure to reality and that it is discoverable. They need to teach them Scripture and history. Building schools that can educate properly will require churches, parents, peer groups, and fellow traveler Christians to work together. It will be costly, but it will be worth it.

 For serious Christian parents, education cannot be simply a matter of building their child’s transcript to boost her chance of making it into the Ivy League. If this is the model your family follows (perhaps with a sprinkle of God on top for seasoning), you will be hard-pressed to form countercultural Christian adults capable of resisting the disorders of our time.

 The kind of schooling that will build a more resilient, mature faith in young Christians is one that imbues them with a sense of order, meaning, and continuity. It’s one that integrates knowledge into a harmonious vision of the whole, one that unites all things that are, were, and ever will be in God.”

 “Every educational model presupposes an anthropology: an idea of what a human being is. In general, the mainstream model is geared toward equipping students to succeed in the workforce, to provide a pleasant, secure life for themselves and their future families, and ideally, to fulfill their personal goals—whatever those goals might be. The standard Christian educational model today takes this model and adds religion classes and prayer services.

 But from a traditional Christian perspective, the model is based on a flawed anthropology. In traditional Christianity, the ultimate goal is to love and serve God with all one’s heart, soul, and mind, to achieve unity with Him in eternity. To prepare for eternal life, we must join ourselves to Christ and strive to live in harmony with the divine will. To be fully human is to be fully conformed to that reality—as C. S. Lewis would say, to the things that are— through cooperating with God’s freely given grace. To be humanized is to grow—by contemplation and action, and through faith and reason—in the love of the Good, the True, and the Beautiful. These are all reflections of the Triune God, in Whom we live and move and have our being.

 To compartmentalize education, separating it from the life of the church, is to create a false distinction. Saint Benedict, in his Rule, called the monastery “a school for the service of the Lord.” This was no mere figure of speech. In the Benedictine tradition, learning is wholly integrated into the life of prayer and work. Today our education system fills students’ heads with facts, with no higher aspiration than success in worldly endeavor. Since the High Middle Ages, the pursuit of knowledge for its own sake has been slowly separated from the pursuit of virtue. Today the break is clean.

 Educator Martin Cothran, a national leader in the classical Christian school movement, says that many Christians today don’t realize how the nature of education has changed over the past hundred years. The progressivism of the 1920s involved using schools to change the culture. The vocationalism of the 1940s and 1950s tried to use schools to conform children to the changed culture. But the traditional way of education, which reigned from the Greco-Roman period until the modern era, was about passing on a culture, and one culture in particular: the culture of the West, and for most of that time, the Christian West.”

 “The classical education of the pagans that was transformed by the church attempted to inculcate in each new generation an idea of what a human being should be, through constantly having examples of ideal humanity set in front of it, and by studying the great deeds of great men,” Cothran told me. “This was a culture with a definite and distinctive goal: to pass on the wisdom of the past and to produce another generation with the same ideals and values—ideals and values based on its vision of what a human being was.”

 [This is precisely the educational background and mindset of the Founders – to pass on to the generations the wisdom of their experience, and the culture they had created to express it, in the daily life of the new nation.]

 “That’s what education was for over two millennia,” he continued. “It is now something that retains the old label, but is not the same thing. It is not even the same kind of thing. It has been abandoned in the modern school— including many Christian ones. Even many Christian parents who do not accept the political correctness of today’s schools have completely bought into the utilitarian concept of education.”

 “To be sure, there is nothing wrong in principle with learning something useful or achieving excellence in science, the arts, literature, or any other field of the intellect. But mastery of facts and their application is not the same thing as education, any more than an advanced degree in systematic theology makes one a saint.

 The separation of learning from virtue creates a society that esteems people for their success in manipulating science, law, money, images, words, and so forth. Whether or not their accomplishments are morally worthy is a secondary question, one that will seem naïve to many if it occurs to them at all.

 “If a Christian [you may substitute the “Western tradition of the moral and ethical”] way of living isn’t integrated with students’ intellectual and spiritual lives, they’ll be at risk of falling away through no fault of their own. As John Mark Reynolds, who recently founded Houston’s Saint Constantine School, puts it, ‘Christian young people who have had a personal, life-changing encounter with Christ, and who know Christian apologetics but have not integrated them into their lives, are more vulnerable than they think. They have to learn how to translate the conversion experience and intellectual knowledge of the faith into a Christian way of living—or their faith will remain fragile.

 “If it’s true that a simplistic, anti-intellectual Christian faith is a thin reed in the gale of academic life, it is also true that faith that’s primarily intellectual—that is, a matter of mastering information—is deceptively fragile. Equipping Christian students to thrive in a highly secularized, even hostile environment is not a matter of giving them a protective shell. The shell may crack under pressure or be discarded. Rather, it must be about building internal strength of mind and heart.’”

 Public education in America is catastrophically broken. The federal government has systematically destroyed the States’ ability and responsibility to educate their children – a time-honored tradition in America since the founding.

 However, quality children’s education is occurring in America. It is thriving in private schools, parochial schools, charter schools and home schools throughout the land, but it is reaching only about 10% of America’s children – roughly 350,000 out of 3.5 million graduates. Even worse, only 12% of the remaining 3 million or so public school graduates will attain a college degree, while 45% of private school graduates are successful in college!

 Unfortunately, the public education lobby – the PLDC, academia and the teachers’ union hierarchy – refuses to adopt the means and methods for educational success demonstrated by private schools, which succeed at half the cost of public education. Why?

 Because the essence of their cabal is the ignorance of the citizenry – more specifically, the voter. If the children can be indoctrinated in the progressive/liberal tradition of big-government cradle-to-grave serfdom, then the PLDC elites can enjoy virtually unlimited and unchallenged power to pursue their own version of happiness. That, of course, is pathologically sadistic, and its eradication, therefore, must be at the very heart of the campaign to restore America.

 Specifically, why haven’t American schools improved? For the most part, America’s teachers are dedicated, hard-working and caring people. We know what’s wrong with the government and the education bureaucracy, but what’s going wrong in the classroom?

 According to John Stossel: “The education establishment says, ‘We don’t have enough money!’ But American schools spend more per student than other countries. Spending tripled during [the past fifty years] and class sizes dropped. But test scores stay flat.

 “Schools adopted all sorts of new technologies, from projectors to personal computers to ‘smart’ whiteboards. None of these inventions improved outcomes … (E)ducational quality has been stuck in the era of disco and leisure suits for 40 years, while the rest of the world has passed it by.”

 The main reason for that is that most schools are controlled by government. Government is a monopoly, and monopolies resist change. Actually, most of us resist change. We don’t want to give up the way we’ve always done things. Certainly, few of us want to work harder, or differently. We get set in our ways.

 But when there is competition, we can’t get away with that. If we don’t adopt better ways of doing things, we go out of business. That forces innovation.

 But government-run schools never go out of business, and the teachers’ unions’ support for politicians who will toe the line will never waver. Principals, school boards and teachers – especially union teachers – have little incentive to try anything new. The story may be familiar from the acclaimed movie Stand and Deliver.

 In that film, actor Edward James Olmos played math teacher Jaime Escalante. Escalante taught at California’s Garfield High School. The student body was, and is, composed of some of the most “disadvantaged” students in America. Yet more Garfield High students passed advanced placement calculus tests than did students from Beverly Hills High.

 Escalante was the reason. He was simply a better teacher. Escalante was born to two teachers of Aymara ancestry in 1930 in La Paz, Bolivia. He was proud of his Aymara heritage and, as an adult, he would proclaim, “The Aymara knew math before the Greeks and Egyptians.”

 Some of his former students said, “Escalante worked as if his life depended on the success of his students.” The results were beyond belief … literally. His students did so well on the state calculus test that authorities accused them of cheating and made them take the test again. The students aced it the second time.

 What made Escalante a better teacher? One student said, “He built a relationship with each student, knew them by name, knew their story. … Students didn’t want to disappoint him.”

 The movie made Escalante famous, but he didn’t change. He kept teaching at Garfield, telling students that even though they were poor, “With enough drive and hard work, the sky is the limit.” “The lessons I learned from Jaime, I apply them every day,” a former student said. “With my children, I talk about Jaime and about ‘ganas’ – desire. Nothing’s for free. You have to work really hard if you want to achieve anything.”

 Stand and Deliver has a happy ending, but what happened in real life was no fairy tale.

In any other field, we might expect this combination of success, scalability, and publicity to have catapulted Escalante to the top of his profession and spread his teaching model across the country. That isn’t what happened.

 Garfield’s union teachers resented Escalante’s fame and work ethic.

 A former Garfield student who now is a teacher said, “The problem was that Escalante’s classes were big. … He was setting a precedent, giving the message to the administrator: ‘If Escalante can do it, why not you?'”

 The union used its organizing power to get the votes to oust Escalante as math department chairman. Escalante then quit. How sad for Jaime Escalante and how sad for the children of Garfield High.

 The math program’s decline at Garfield became apparent following the departure of Escalante and other teachers associated with its inception and development. In just a few years, the number of AP calculus students at Garfield who passed their exams dropped by more than 80%.

In the mid-1990s, Escalante became a strong supporter of English-only education efforts. In 1997, he joined Ron Unz’s English for the Children initiative, which eventually ended most of the bilingual education in California schools favored by the PLDC.

In 2001, after many years of preparing teenagers for the AP calculus exam, Escalante returned to his native Bolivia. He lived in his wife’s hometown, Cochabamba, and taught at Universidad Privada del Valle. He returned to the United States frequently to visit his children.

In early 2010, Escalante faced financial difficulties from the cost of his cancer treatment. Cast members from Stand and Deliver, including Edward James Olmos, and some of Escalante’s former pupils, raised funds to help pay for his medical bills.

He moved to Sacramento, California, to live with his son in the city of Rancho Cordova. He taught at Hiram Johnson High School, very similar to Garfield High School. He died in 2010, at 79, at his son’s home while undergoing treatment for bladder cancer.

As you can see, the people in the union hierarchy of the AFT and the NEA and their State affiliates are merely more thugs of the PLDC. They practice intimidation of America’s teachers, who love their jobs and cannot risk losing them by speaking out against the outrageous behavior of people who hold positions of power but bear no responsibility, because those people are protected by the tentacles of the PLDC.

 How enlightening for those in America who actually care about educating our children to be responsible and capable citizens. The solution is apparent – the education industry needs to be freed from the teachers’ unions and must provide actual leadership to America’s leaders. It won’t come from government; it must come from the People – the parents and grandparents on the front lines of fighting for their children’s future.

 A modern version of a 7th-century Anglo-Saxon serf’s “Oath of Fealty” to his lord (read: Democrat office holder) seems appropriate for today’s – and tomorrow’s – victims of the PLDC’s “lords of the schools” and their wicked campaign for ignorance through education:

 “I will be true and faithful, and love all which he loves and shun all which he shuns, according to the laws of his God and the order of the world. Nor will I ever with will or action, through word or deed, do anything which is unpleasing to him, on condition that he will hold to me as I shall deserve it, because I submitted myself to him and chose his will.”

Next time: American Exceptionalism