Popular Posts


Children, by Henry Wadsworth Longfellow





Henry Wadsworth Longfellow (1807-1882) taught at Harvard University and was a lifelong poet who experimented with many styles throughout his career, including free verse. He garnered instant fame with his first poetry collections, Voices of the Night (1839) and Ballads and Other Poems (1841). "Children" is part of Birds of Passage and was written in 1858.



Come to me, O ye children!
  For I hear you at your play,
And the questions that perplexed me
  Have vanished quite away.

Ye open the eastern windows,
  That look towards the sun,
Where thoughts are singing swallows
  And the brooks of morning run.

In your hearts are the birds and the sunshine,
  In your thoughts the brooklet's flow,
But in mine is the wind of Autumn
  And the first fall of the snow.

Ah! what would the world be to us
  If the children were no more?
We should dread the desert behind us
  Worse than the dark before.

What the leaves are to the forest,
  With light and air for food,
Ere their sweet and tender juices
  Have been hardened into wood,—

That to the world are children;
  Through them it feels the glow
Of a brighter and sunnier climate
  Than reaches the trunks below.

Come to me, O ye children!
  And whisper in my ear
What the birds and the winds are singing
  In your sunny atmosphere.

For what are all our contrivings,
  And the wisdom of our books,
When compared with your caresses,
  And the gladness of your looks?

Ye are better than all the ballads
  That ever were sung or said;
For ye are living poems,
  And all the rest are dead.



(Note: The theme of the dichotomy Dead/Alive, otherwise referred to as Awake/Asleep, is the most prevalent and celebrated theme among the best-remembered works of the great poets of history.  This theme is anchored in millennia-old archetypes and conveys a hidden psychological message about the state of mind in which it is best to live.  Most of the poems posted on this website express that theme in one way or another; such is the case in, for example, Walt Whitman's "O Me! O Life!", e.e. cummings's "since feeling is first", Robert Frost's "The Road Not Taken", John Donne's "Death Be Not Proud", and William Shakespeare's "Sonnet 69" and "Sonnet 94".

This very same theme may also be found frequently among the lyrics of contemporary popular music, and its presence seems to correlate with whether a song will still be played [and/or held in high esteem] decades after its composition and initial release.  For an example of this trend, and an accompanying analysis, see Ever wonder what Hotel California means?)


Borders Change: Short history primers on the Napoleonic Wars and the Unification of Germany and Italy




Human minds may turn fickle when not properly nurtured with knowledge, hindering the capacity to discern among concepts that we generally call intelligence.  This is, of course, harmful insofar as individual lives are concerned, but a collective lack of perspective can be devastating when accumulated ignorance finally reaches the political realm.  This seems to be occurring on the present geopolitical scene, and the current situation has all the ingredients required to cause massive ruin to lives, nations, and entire peoples.  One of the cognitive dynamics that has led to the present situation is the general public's tenuous grasp of history.

The international agreements that ended the Second World War, along with the culmination of a long-drawn-out (and hard-fought) process of decolonization by the European imperial powers of the 19th and early 20th centuries, established national borders that, for better or worse, had until now been relatively stable by historical standards.  Though some borders did fluctuate, particularly in Africa and in Eastern Europe following the downfall of the Soviet Union, the resulting territorial alterations may be deemed small within the grand scope of history.  But it was precisely that long, 70-year period that made current societies lose their grasp of how quickly states can expand or be dismantled.



Historical shifts have a way of building up slowly and then happening all at once.  This was the case in 19th-century Europe, as can be appreciated in the videos embedded below.  One of the long-standing factors behind the present geopolitical situation is that the national borders created via decolonization and the post-World War II conventions rarely considered the underlying histories of the cultures they were carving up into states, but rather largely followed the borders crafted by the imperialist expansions.  Creating and legitimizing a United Nations, complete with its own armed forces, afforded stability to the 20th-century arrangement.  But why would this configuration last forever?  Why expect, even assume, that humanity would not revert to the historical mean insofar as the reshaping of national borders is concerned?  Two recent events have provided a wake-up call: the Russian Federation's annexation of Crimea and the rise of a transnational revolutionary force - the Islamic State - fluidly carving its own borders over existing national ones.

In March 2014, the Russian Federation annexed the port territories of Crimea basically over a weekend, even though the political buildup to that event had been going on for years (if not decades or centuries), and within days the Western powers had recognized that what used to be Ukraine was now part of the Russian Federation.  How quickly can 70 years of international law become utterly obsolete!  Personally, I found it strange how little importance was given to an invasion within Europe that shattered previous political convention.  But then again, might makes right... Who was going to stand up to the Russians?  Clearly, the answer is no one at all.

The case is entirely different with the Islamic State, which lacks the advanced weaponry necessary to deter the European powers.  Note that the Kurds have quietly seceded from northern Iraq and Syria, forming a de facto Kurdish state.  Only Turkey bombs that new state, because of the Kurdish presence in Turkey's own eastern provinces.  Most of the intervening coalition considers Kurdistan an ally.


I will not delve into the causes of the IS (or ISIS or ISIL or Daesh...) conflict, as that would be a very long list, nor will I speculate as to the possible outcomes of that war, since nothing can be stated with any certainty as things now stand.  Instead, it is worth merely pointing out that the geopolitical events of our era act as a reminder to the last few generations of a phenomenon that they had begun to forget, of a historical fact that needs to be kept in mind as we move forward: state borders are not set in stone and, when they shift, they change quickly and drastically, setting a novel world stage in the process.


To make this point clear, the three short videos that follow provide succinct three-minute history primers on massive border changes that occurred in the heart of Europe during the 19th century (not that long ago!).  If you think nothing like this could ever happen again, I have a bridge that I would like to sell you.

The first video depicts the process of German Unification; the second summarizes Italian Unification.  Finally, since both of these rapid sequences of events were made possible by Bonaparte's empire, a third three-minute video covering the Napoleonic Wars is provided.

Never forget.  When people forget, that is when history repeats itself.  Our cognitive dynamics must change with regard to our worldview in light of recent events, so that we may be prepared for the geopolitical fluidity that is likely to come.

I hope, at the very least, that you find these animated primers enjoyable, as a fun way to remember events that most of us do not tend to think about even though they have defined the shape of the Western world.










"Invictus: The Unconquerable" by William Ernest Henley






Out of the night that covers me,
     Black as the Pit from pole to pole,
I thank whatever gods may be
     For my unconquerable soul.

In the fell clutch of circumstance
     I have not winced nor cried aloud,
Under the bludgeonings of chance
     My head is bloody, but unbowed.

Beyond this place of wrath and tears
     Looms but the horror of the shade,
And yet the menace of the years
     Finds, and shall find me, unafraid.

It matters not how strait the gate,
     How charged with punishments the scroll,
I am the master of my fate:
     I am the captain of my soul.


Mortals of Habit: Dying by - and the Death of - the 40-Hour Workweek




Creatures of habit that we are, we seldom revamp things that have been around since before we were born.  An example is the 8-hour workday.  Since when has it been around?  The eight-hour movement, or 40-hour movement, has its origins in the workers' struggles of the Industrial Revolution.  Karl Marx saw its vital importance to workers' health, writing in Das Kapital: "By extending the working day, therefore, capitalist production [...] not only produces a deterioration of human labour power by robbing it of its normal moral and physical conditions of development and activity, but also produces the premature exhaustion and death of this labour power itself."  Studies have shown that the 8-hour day is not optimally productive.  As Evan Robinson recently wrote: "working over 21 hours continuously is equivalent to being legally drunk. Longer periods of continuous work drastically reduce cognitive function and increase the chance of catastrophic error."



When these findings are discussed in HR circles, people agree on the data and its conclusions, but do HR departments plan on executing a less conventional arrangement that would make the day more productive?  No. Sweden, in an attempt at reform and at finding work-life balance, has recently been moving to a 6-hour workday (or see here, or better here, for an explanation).

Conventionality has been the killer of many great ideas. The great unknown has been terrifying traditional types for eons.  As I write this, I am standing up.  I’d never considered being able to stand up and write.  Not because I didn’t think it was possible; it just didn’t cross my mind.  The fact that I have access to that option makes me want to try it.  If you are expecting me to say that it is better than sitting down, I am sorry to disappoint.  It is not; it is simply different.  But different is what you need sometimes, and sometimes you just need your good ol’ chair.

The point I am trying to make is that there are so many ways to go through our day. There are countless alternatives for how to structure our day, our time, and our life. So many people say they are looking for adventure or, when asked what they seek in a significant other, answer someone adventurous. If that were really true, why wouldn’t they take a different way home? That’s adventurous, even if on a smaller scale than what they may have envisioned. Instead of going home after work, take a scenic route or improvise a road trip to the next state over. We don’t do any of these things. Well, I try not to: it upsets my family, and I don’t want them to seriously consider placing a tracking chip on me, even though that would spare me from texting while driving to notify them of my whereabouts, thereby making the road a much safer place because I would not be swerving and trying to be grammatically correct at the same time. But now I am going off on a tangent...

I am just tired of discussion, and of studies showing us how we can improve our day, while nothing revolutionary happens. The graph below shows the relationship between productivity and annual hours worked.



A paper by John Pencavel suggests that reducing working hours is good for productivity.

Don’t get me wrong. I think it is great that IKEA sells standing desks. That’s something. But I am still holding out for the 4-hour workday. Or, perhaps, we may see a Basic Living Wage implemented in our lifetimes, with recent pieces such as this one by Scott Santens making the point forcefully based on new research by the IMF and OECD.  Perhaps the time has come for some real change that redefines the position of labor in our societies in a way that strikes a better work-life balance.  I can hope.  Can't I?


Ressentiment in the Present Age, by Søren Kierkegaard




Excerpt from Søren Kierkegaard, The Present Age, translated by Alexander Dru, with a foreword by Walter Kaufmann, 1962, pp. 49–52.



It is a fundamental truth of human nature that man is incapable of remaining permanently on the heights, of continuing to admire anything. Human nature needs variety. Even in the most enthusiastic ages people have always liked to joke enviously about their superiors. That is perfectly in order and is entirely justifiable so long as after having laughed at the great they can once more look upon them with admiration; otherwise the game is not worth the candle. In that way ressentiment finds an outlet even in an enthusiastic age. And as long as an age, even though less enthusiastic, has the strength to give ressentiment its proper character and has made up its mind what its expression signifies, ressentiment has its own, though dangerous importance. […]

The more reflection gets the upper hand and thus makes people indolent, the more dangerous ressentiment becomes, because it no longer has sufficient character to make it conscious of its significance. Bereft of that character, reflection is cowardly and vacillating, and according to circumstances interprets the same thing in a variety of ways. It tries to treat it as a joke, and if that fails, to regard it as an insult, and when that fails, to dismiss it as nothing at all; or else it will treat the thing as a witticism, and if that fails, then say that it was meant as a moral satire deserving attention, and if that does not succeed, add that it was not worth bothering about. [...]

Ressentiment becomes the constituent principle of want of character, which from utter wretchedness tries to sneak itself a position, all the time safeguarding itself by conceding that it is less than nothing. The ressentiment which results from want of character can never understand that eminent distinction really is distinction. Neither does it understand itself by recognizing distinction negatively (as in the case of ostracism) but wants to drag it down, wants to belittle it so that it really ceases to be distinguished. And ressentiment not only defends itself against all existing forms of distinction but against that which is still to come.

The ressentiment which is establishing itself is the process of leveling, and while a passionate age storms ahead setting up new things and tearing down old, raising and demolishing as it goes, a reflective and passionless age does exactly the contrary; it hinders and stifles all action; it levels. Leveling is a silent, mathematical, and abstract occupation which shuns upheavals. In a burst of momentary enthusiasm people might, in their despondency, even long for a misfortune in order to feel the powers of life, but the apathy which follows is no more helped by a disturbance than an engineer leveling a piece of land. At its most violent a rebellion is like a volcanic eruption and drowns every other sound. At its maximum the leveling process is a deathly silence in which one can hear one’s own heart beat, a silence which nothing can pierce, in which everything is engulfed, powerless to resist.

One man can be at the head of a rebellion, but no one can be at the head of the leveling process alone, for in that case he would be leader and would thus escape being leveled. Each individual within his own little circle can co-operate in the leveling, but it is an abstract power, and the leveling process is the victory of abstraction over the individual. The leveling process in modern times corresponds, in reflection, to fate in antiquity. The dialectic of ancient times tended towards leadership (the great man over the masses and the free man over the slave); the dialectic of Christianity tends, at least until now, towards representation (the majority views itself in the representative, and is liberated in the knowledge that it is represented in that representative, in a kind of self-knowledge); the dialectic of the present age tends towards equality, and its most consequent but false result is leveling, as the negative unity of the negative relationship between individuals.

It must be obvious to everyone that the profound significance of the leveling process lies in the fact that it means the predominance of the category ‘generation’ over the category ‘individuality’.




How to Meditate, by Jack Kerouac




-lights out-
fall, hands a-clasped, into instantaneous
ecstasy like a shot of heroin or morphine,
the gland inside of my brain discharging
the good glad fluid (Holy Fluid) as
i hap-down and hold all my body parts
down to a deadstop trance-Healing
all my sicknesses-erasing all-not
even the shred of a 'I-hope-you' or a
Loony Balloon left in it, but the mind
blank, serene, thoughtless. When a thought
comes a-springing from afar with its held-
forth figure of image, you spoof it out,
you spuff it off, you fake it, and
it fades, and thought never comes-and
with joy you realize for the first time
'thinking's just like not thinking-
So I don't have to think
any
more'


How do human minds work?: The Cognitive Revolution and Paradigm Change in Cognitive Science




During the first half of the 20th century, empiricism permeated most fields related to the study of human minds, particularly epistemology and the social sciences. The pendulum swung toward empiricism at the end of the 19th century in reaction to the introspective and speculative methods that had become the standard in disciplines like psychology, psychophysics and philosophy. Based on technical advances mostly achieved in Russia and the United States, behaviorism took form, threatening to absorb philosophy of language and linguistics (e.g., respectively, Quine 1960, and Skinner 1948, 1957).  In reaction to that movement, Cognitive Science emerged as an alternative for those discontent with the reigning versions of empiricism, that is, as a rationalist alternative.

After Chomsky (1959) pounced upon Skinner's Verbal Behavior, he would later reassert his victory as a vindication of rationalism in the face of “a futile tendency in modern speculation”, stating that he did not "see any way in which his proposals can be substantially improved within the general framework of behaviorist or neobehaviorist, or, more generally, empiricist ideas that has dominated much of modern linguistics, psychology, and philosophy" (Chomsky 1967).  Noam Chomsky’s assault, backed by the research program offered alongside it (Chomsky 1957), would be followed by twenty-five years of almost completely uncontested rationalist consensus.  Thus, the Cognitive Revolution is best understood as a rationalist revolution.

Researchers in the newly delineated interdisciplinary field coincided in arguing that the mind employs syntactic processes over amodal (i.e., context-independent) structured symbols, some of which must be innate.  The computer metaphor guided the formulation of models, whereby mind is to nervous system what software is to hardware.  Conceived as a new scientific epistemology, Cognitive Science built bridges across separate disciplines.  Though each field has its own terminology, potentially straining effective communication, academics could converge on the view that thought, reasoning, decision-making, and problem-solving are logical, syntactic, serial processes over structured symbols.  As such, it may be suggested that the rationalist framework greatly facilitated the gestation and institutional validation of Cognitive Science as an academic domain in its own right.  Human cognition could be thought of as a Turing machine (Turing 1936), perhaps one similar to a von Neumann architecture (von Neumann 1945), obeying George Boole's (1854) Laws of Thought, and this computational foundation worked equally well for generative linguists, cognitive psychologists, neuroscientists, computer programmers focused on artificial intelligence, and analytic philosophers fixated on the propositional calculus of inference and human reason.  Consequently, most textbooks on cognition contain a few diagrams like the one below.


Models that abide by the aforementioned rationalist premises are known as classicalist or as having a Classical Cognitive Architecture (Fodor and Pylyshyn 1988). It wasn’t until the mid-80s, with the resurgence of modeling via artificial neural networks, that the rationalist hegemony began to crack at the edges, as increasing emphasis was placed on learning algorithms based on association, induction, and statistical mechanisms that for the most part attempted to do away with innate representations altogether.  This resurgence threw Cognitive Science into what Bechtel, Abrahamsen & Graham (1998) called an identity crisis, which they date from 1985 until the time of that publication.  Almost two decades later, the identity crisis remains unresolved, as this new approach has been met with fierce resistance, displaying the unnerving, painstakingly slow characteristics of a Kuhnian paradigm shift (Kuhn 1962).



In Hume Variations (2003), Jerry Fodor, the most prominent and radical rationalist philosopher of Cognitive Science alive today, rescued the Cartesian in Hume, along with his naïve Faculty Psychology, at the cost of sacrificing Hume's associationist view of learning.  Of course Fodor did this: that maneuver renders Hume a rationalist, and Cartesian linguistics and reason are central to the inaugural program of Cognitive Science, a framework that Fodor helped construct from the very beginning.  Chomsky's (1966) Cartesian Linguistics traces many of the developments of his own linguistic theory, including the key distinction between surface structure and deep structure, to the Port-Royal Grammar published by Arnauld and Lancelot in 1660.  The Port-Royal Grammar and the Port-Royal Logic (Arnauld and Nicole 1662) were both heavily influenced by the work of René Descartes.  However, the evidence is quickly mounting in a way that suggests that the maneuver needed is the opposite of Fodor's: to rescue the associationist theory of learning while discarding the Cartesian aspects and the folk Faculty Psychology present in Hume's philosophy of mind.

A brief comparison between the prototypical rationalist and empiricist stances is provided in the following table.



Of these positions, the rationalist / empiricist distinction in philosophy of mind rests squarely on the issue of representational nativism. The other facets (listed in mind, processes, and representations above) seem to follow from what would be needed, wanted or expected of a cognitive architecture if there were either some or no innate ideas.

That there are no innate ideas is the core tenet of empiricist philosophy of mind. Hume believed that the mind was made up of faculties, a modular association of distinct associative engines, but he left open the question of whether the faculties arise out of experience (or ‘custom’) or are innately specified (and to what extent). There are two main reasons to think the former is the case.  First, uncommitted neural networks approximate functions, both of the body and of the world, paving the way for functional organization through processes of neural self-organization. Second, committed neural networks bootstrap one another towards the approximation of more complicated functions; as this occurs, the domain-general processes of neurons give way to domain-specific functional organizations. However, though the representations that constitute these domain-specific processes can become increasingly applicable to variable contexts, they never become wholly amodal, that is, context-independent, because domain-specific functions are anchored in domain-general associative processes that are inherently context-dependent or modal. (See How You Know What You Know for a review of the scientific research that supports these two reasons.)

Having said this, it must be noted that neither rationalism nor empiricism actually constitutes a theory of anything at all; their core is only one hypothesis: either there are some innate ideas or there are none. There is, however, a third possibility: that ideas do not exist, at least not in minds, which would render the rationalist/empiricist debate obsolete (cf. Brooks 1991). This third option notwithstanding, even though neither empiricism nor rationalism is actually a theory of mind, it is possible to build one in the spirit of either proposition. That is what Locke, Berkeley, and Hume did; it is also what Noam Chomsky did, and what Lawrence Barsalou is doing now (whose research program is stated in Barsalou 1999).

Be that as it may, the rationalist consensus that dominated Cognitive Science's first thirty years cannot be explained by mere technological or technical factors. While someone could argue that connectionism did not appear until the mid-80s because neural networks could not yet be artificially implemented, this claim would be historically unfounded. Bechtel, Abrahamsen & Graham (1998) pinpoint September 11, 1956 as the date of birth of Cognitive Science. Though one may be reluctant to accept such a specific date, it is clear that the interdisciplinary field emerged around then, plus or minus a few years. However, already in 1943, McCulloch and Pitts had proposed an abstract model of neurons and shown how any logical function could be represented in networks of these simple units of computation. By 1956, several research teams had tried their hand at implementing neural networks on digital computers (see, e.g., the project of Rochester, Holland, Haibt & Duda 1956 at IBM).  By the early '60s, not only had the idea been explored, Rosenblatt (1962) had even tried building artificial neural networks as actual machines, using photovoltaic cells, instead of just simulating them on digital computers.
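To make the McCulloch-Pitts result concrete, here is a minimal sketch of their threshold unit (the Python rendering is mine, not anything from the 1943 paper): each unit fires when its weighted input sum reaches a threshold, standard logical functions fall out of particular weight and threshold settings, and a function like XOR requires wiring two layers of such units together.

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Logical functions as particular weight/threshold settings
AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], 1)
NOT = lambda a:    mp_neuron([a],    [-1],   0)

# XOR is not computable by a single unit; it needs a second layer
XOR = lambda a, b: AND(OR(a, b), NOT(AND(a, b)))

assert [AND(1, 1), OR(0, 1), NOT(1), XOR(1, 0), XOR(1, 1)] == [1, 1, 0, 1, 0]
```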

When Cognitive Science emerged, the technological tools existed so that research could have gone the rationalist’s or the empiricist’s way, or at least remained neutral on the matter; however, as the Cognitive Revolution is best understood as a rationalist revolution, nativism was hailed, construction began on a Universal Grammar (a project that failed miserably, by the way), decision-making processes were construed as syntactic manipulations on explicit symbol structures (Newell, Shaw, and Simon 1959, Anderson 1982), and neural networks were taken as simple instruments of pattern recognition that could serve to augment a classical cognitive architecture or, at most, to implement what would ultimately be a rationalist story. Fodor & Pylyshyn (1988) were surprisingly blunt on this last point by stating that the issue of connectionism constituting a model of cognition “is a matter that was substantially put to rest about thirty years ago” when the Cognitive Revolution took place. It took thirty years of work for frustration to set in with rationalist approaches; only then would connectionism reappear, augmented by the tools of dynamical systems theory, as a viable alternative to the rationalist or classicalist conception of cognition.


Paradigm Change in Artificial Intelligence


The term ‘connectionist’ was introduced by Donald Hebb (1949) and revived by Feldman (1981) to refer to a class of neural networks that compute through their connection weights. Thousands of connectionist nets, similar to some degree or other to the schematic below, have been created since the 1950s. The wide variety of artificial neural networks is due not only to the function each has been created (and raised) to carry out, which constrains the type of inputs and outputs to which the system has access, but also to their specific architecture—the number of neurons each layer contains, the kind of connections these exhibit, the number of layers, and the class of learning algorithm that calibrates its connection weights.
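As a rough illustration of those architectural degrees of freedom (a sketch of my own, with arbitrary layer sizes and a classic sigmoid activation, not any particular published network), the snippet below makes layer count, units per layer, and connection weights explicit parameters:

```python
import math
import random

def make_network(layer_sizes, seed=0):
    """Create weight matrices for a fully connected feedforward net.
    layer_sizes like [4, 3, 2] means 4 inputs, a hidden layer of 3 units, 2 outputs."""
    rng = random.Random(seed)
    return [[[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
            for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

def forward(network, activations):
    """Propagate activations layer by layer through the connection weights."""
    for layer in network:
        activations = [1 / (1 + math.exp(-sum(w * a for w, a in zip(unit, activations))))
                       for unit in layer]  # each unit is a classic sigmoid neuron
    return activations

net = make_network([4, 3, 2])  # one arbitrary architecture among endlessly many
print(forward(net, [1.0, 0.5, -0.3, 0.8]))
```

A learning algorithm would then calibrate those weights against examples; here they are simply left at their random initial values.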


A clear and very simple example of a connectionist net (seen below) was developed by McClelland and Rumelhart (1981) for word recognition. The three-layer network proceeded from the visual features of letters to the recognition of words through localist representations of letters in the hidden layer (for a richer discussion, see McClelland 1989). Given its function and its use of localist representations, both the mode of presentation of the input and the mode of generation of the output were constrained by the features of written language, which in turn delineated the network’s design.
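As a hedged illustration of what 'localist' means here, the toy snippet below (a drastic simplification of my own, not McClelland and Rumelhart's actual interactive activation model, which also has top-down and inhibitory connections) scores each word unit by the evidence arriving from letter-position units:

```python
# Toy localist word recognition: one unit per word, one detector per letter position.
WORDS = ["able", "trip", "time", "cart"]

def recognize(letters_seen):
    """Each word unit accumulates activation from matching letter-position detectors."""
    scores = {word: sum(1 for pos, ch in enumerate(word)
                        if letters_seen.get(pos) == ch)
              for word in WORDS}
    return max(scores, key=scores.get), scores

# Partial visual input: 't' at position 0, 'i' at position 1, 'e' at position 3
best, scores = recognize({0: "t", 1: "i", 3: "e"})
print(best, scores)  # "time" wins with three matching letter units
```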


Borrowed from the Empirical Philosophy of Science Project at the Natural Computation Lab of the University of California, San Diego, the graph below illustrates the transition from the classicalist paradigm to the connectionist one by presenting the frequency of appearance (by year) of the lexical items ‘expert system’ and ‘neural network’ in peer-reviewed academic journals of Cognitive Science.  It can be clearly seen that interest in neural networks supplanted the 1980s craze for expert systems.


For those unfamiliar with the matter, an expert system is a decision-making program that is supposed to mimic the inferences of an expert in a given field.  Basically, the shell of the program is an inference engine that works logically and syntactically, and this engine must be given a knowledge base: a finite set of "If X, then Y" rules the sum of which ought to allow it to perform its target function correctly most of the time.  Typically, an expert system asks you questions or prompts you to input specific data and, using those inputs, the inference engine goes through its knowledge base to provide you with an answer.  Expert systems may be created for purposes of prediction, planning, monitoring, debugging, and perhaps most prominently diagnosis, among several other possible purposes.  WebMD's symptom checker, which you may have used once or twice, is perhaps the best-known example; you click on the symptoms you have, its inference engine passes your data through its knowledge base, and it provides you with a list of all the sicknesses you may be suffering from.  If you have used that symptom checker more than twice in your life, you probably know how inaccurate it tends to be, even to the point of being ludicrous at times.  In stark contrast, many artificial neural networks have been created for detecting all sorts of cancers and can do so with 99% accuracy, that is, better than almost any doctor, like this one for breast cancer created by a student during her junior year of high school.  This is just one of countless domains where empiricist approaches vastly outperform their rationalist counterparts.
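To make the split between inference engine and knowledge base concrete, here is a minimal forward-chaining sketch; the rules are invented for illustration and come from no real diagnostic system:

```python
# Knowledge base: "If X, then Y" rules, each pairing a set of findings with a conclusion.
KNOWLEDGE_BASE = [
    ({"fever", "cough"}, "possible flu"),
    ({"sneezing", "itchy eyes"}, "possible allergy"),
    ({"fever", "stiff neck"}, "see a doctor urgently"),
]

def inference_engine(facts):
    """Fire every rule whose conditions are all present among the input facts."""
    return [conclusion for conditions, conclusion in KNOWLEDGE_BASE
            if conditions <= facts]  # set inclusion: all conditions satisfied

print(inference_engine({"fever", "cough", "stiff neck"}))
# ['possible flu', 'see a doctor urgently']
```

A real expert system adds hundreds or thousands of such rules, certainty factors, and an interview loop, but the shell remains this same engine running over a swappable knowledge base.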

As a funny digression, I once had to make an expert system for a graduate class and built a program that would ask you 16 socioeconomic and political questions, from which it would diagnose your preferred political philosophy (e.g., anarchism, liberalism, republicanism, communism, constitutional monarchism, fascism, and so on).  My artificial intelligence professor took it with him to the School of Engineering to test it out on his students, and when I saw him again, he commented that he was impressed by how accurate it was.  It was definitely more accurate than WebMD but, then again, medical diagnosis is a far more complicated knowledge domain with many more possible outputs, so that is an unfair comparison.  On an unrelated but also funny note, my other artificial intelligence professor told the story of how he had lost faith in artificial neural networks in grad school, when he created a system that would either approve or reject a bank loan application. He would input the demographic and personal income data as well as the loan information, and the network would respond with a simple Approve or Reject.  But he created the network with a twist: he deliberately trained it on a racist data set, in such a way that the network wouldn't give out any prime loans to anyone who wasn't white.  He wanted to see if the network would ever learn the error of its ways or at least acknowledge its racism, but it never did, and he said that at that moment he lost all faith in connectionist networks.  When he finished telling the story, I immediately raised my hand and said: "You do realize that that is exactly what happens with many bankers in real life, right?  Your network didn't fail; it behaved like a human would."


Reframing Cognitive Science


The seeds of empiricism have been sprouting almost everywhere. The last thirty years have seen an ever-increasing portion of scientific research dedicated, even if reluctantly, to proving some of the central tenets of empiricist theory of mind or attempting to articulate mechanisms to augment it.

In artificial intelligence, connectionist architecture emerged in the '80s as a clear and feasible alternative to symbolic approaches (a.k.a. good old-fashioned artificial intelligence or GOFAI; Haugeland 1985, Dreyfus 1992). The tools of dynamical systems theory, widely used in the field of physics, bolstered connectionism to provide a robust account of a system’s ontogenetic evolution through time (van Gelder 1999). Connectionism provided what behaviorism lacked: powerful learning mechanisms that could account not only for how intelligent agents derive knowledge from experience but also for how we can surpass that limited amount of information to conceive an unlimited number of possibilities; furthermore, the tools of dynamical systems theory opened the possibility of seeing what goes on inside the ‘black box’, while also helping psychology get in sync with physics and neurology. In this sense, connectionism ought not to be confused with behaviorism, because neural network architectures permit an agent to surpass the limited stimulus-response patterns that it encounters (Lewis and Elman 2001, Elman 1998). It should be noted, however, that connectionist computation is not synonymous with empiricism; it is, in fact, entirely compatible with rationalist postulates, as exemplified by Optimality Theory (Prince & Smolensky 1997), an attempt to implement universal grammar via a connectionist architecture. Nevertheless, this compatibility is a token truism that goes both ways: artificial neural networks and Turing machines exhibit equivalent computational power inasmuch as either can implement any definable function, which is why most people simulate neural networks on common personal computers (currently, the best open-source, free software for creating your own neural network with relative ease is Emergent, a program hosted by the University of Colorado that runs on Windows, Macintosh OS's, and Linux-Ubuntu, and can be downloaded here). Looking beyond this universal computational compatibility, connectionism clearly opens the door to empiricism, and the vast majority of connectionist models do away with rationalist tenets and clearly partake of the long-standing empiricist tradition, even if many of their authors aren't willing to admit this publicly because of the entrenched stigma branded into that philosophical label.

In linguistics, a clear alternative to generativism surfaced during the 1980s in the form of Cognitive Linguistics (Langacker 1987, Lakoff 1987). Though cognitive linguistics is not wholeheartedly committed to an empiricist theory of mind, its rejection of the fundamental tenets of generativism is in itself a retreat from the rationalist consensus that stood almost uncontested. Specifically, its rejection of an autonomous, modular universal grammar and its grounding of linguistic abilities in domain-general learning and associative mechanisms represent a big leap towards empiricism. Moreover, as linguistics increasingly meshes with psychology and connectionism, slowly but surely an associationist flavor that had long been wiped out by Chomsky and his followers returns to the field. In consequence, much work in linguistics is being fruitfully redirected from devising categorical acquisition schemes toward testing statistical learning algorithms for the acquisition of syntax as well as for syntax's prehistoric origins (e.g., Hazlehurst and Hutchins 1998, Hutchins and Hazlehurst 1995) and also for how grammar changes throughout history (see, e.g., Hare and Elman 1995).

In psychology, many connectionist-friendly accounts have been offered. Perhaps the most ambitious is Barsalou’s (1999) perceptual symbol systems, an account that takes a firm empiricist stance in the face of rationalist psychology by dissolving the distinction between perception and conception. Moreover, the perceptual symbol systems approach has been recently applied, though not without difficulties, to theory of discourse (Zwaan 2004) and to theory of concepts (Prinz 2002). Still, this is not the only empiricist current in psychology, as the domain of psycholinguistics has been propelled mostly by psychologists, like Elizabeth Bates and Brian MacWhinney, and has led to findings and models that are very compatible with the tenets of empiricism (see, e.g., Thelen and Bates 2003, Tomasello 2006, Goldberg 2004, MacWhinney 2013).  Not to mention that many of the early proponents of the parallel distributed processing (or PDP) approach to Cognitive Science, like Rumelhart and McClelland, were psychologists by profession.

Empiricist cognitive architecture has gained a voice in every discipline in the cognitive sciences. The increasing acceptance of empiricism is leading not only to the testing of a rapidly growing number of so-inspired hypotheses but also to a vast reinterpretation of earlier findings in light of radically different postulates. What has been taking place is clearly a Kuhnian paradigm shift. Hence, an exorbitant amount of work remains to be done. For starters, oddly enough, several empiricist researchers are not convinced that their standing agendas are in fact empiricist, that is, that replacing ‘empiricist’ with ‘interactionist’ or with ‘emergentist’ does not black out the ‘empiricist’.

Consider, for example, the book Rethinking Innateness: A Connectionist Perspective on Development (Elman et al. 1996). After a thorough and outstanding assault on rationalism and defense of empiricism, the group goes on to assert “We are not empiricists” (p. 357). Like many other fearful academics, they view the label ‘empiricist’ as a stigma, not unlike having to bear the Scarlet Letter. It is about time that this stigma be removed, and in that spirit I offer a few clarifications. First, regardless of what Chomsky and Fodor would like us to believe, behaviorism and empiricism are not synonymous, as most versions of connectionism clearly illustrate. Even the simplest neural learning algorithms, such as error backpropagation, offer what behaviorism could not: statistical means that can carry cognition from learning through finite data to understanding an infinite number of possibilities. Second, consider the following excerpt—

"We are neither behaviorists nor radical empiricists. We have tried to point out throughout this volume not only that the tabula rasa approach is doomed to failure, but that in reality, all connectionist models have prior constraints of one sort or another. What we reject is representational nativism." (Elman et al. 1996 1996, p. 365)

In Rethinking Innateness, the authors distinguish between three kinds of possible innate constraints: representational, architectural, and chronotopic (timing). A prime example of an architectural constraint is the characteristic six-layer structure of the human neocortex; for chronotopic constraints, think of embryonic cell migrations. As stated above, the group offers a wealth of innate architectural and chronotopic constraints but rejects representational constraints. It is the wealth of mechanisms that can go into delineating what kind of tabula the mind is that leads them to suggest that interactionism entails the falsity of empiricism. But empiricists have never shunned innateness altogether. The empiricist-rationalist distinction rests squarely on the issue of innate mental representations.

Advancing a strong view of architectural and chronotopic constraints does not distance one from the notion of a tabula rasa. The interaction of the many constraints with the world shapes the tabula—no sane empiricist would ever deny this!—but that does not render the tabula un-rasa; it just delineates what kind of tabula it is (i.e., a nervous system, not a DVD or a 35mm film or an infinite magnetic tape). To put it simply, denying all innate architectural and chronotopic features would be tantamount to claiming that children resemble their parents only because their parents raise them.  No one ever claimed that! The debate between rationalists and empiricists has always been about whether there are certain pieces of knowledge represented in the mind that are simply not learned. If you reject representational nativism yet do not reject the existence of something like ideas or mental representations, then you are committed to the tabula rasa, whether you like it or not. It may be unpopular, but it is nevertheless so, because rejecting representational nativism without discarding mental representation amounts to affirming that there are no innate ideas. That the type of tabula determines what kind of information can be written on it, and that human brains are highly structured, does not entail the falsity of empiricism, unless representation is preprogrammed into the slate. Without unlearned representations, a highly structured and complex tabula is as concordant with empiricism as a simple and amorphous pattern-seeking agent.

Clearly, the type of slate that is proposed today is different from what was proposed during the Enlightenment. To Hume, the mind was primarily a passive photocopier of experience; in contrast, current neural networks are much more active in their assimilation of environmental information. Moreover, while Hume thought that human minds associate the compiled copies of experience according to three domain-general types of association, connectionist neural networks are universal approximators that modularize as functional approximations consolidate in response to the details of the surrounding environment and, in consequence, readily develop mechanisms that go beyond association through association itself (see How You Know What You Know for a review). Advancing a stronger, more complex view of the cognitive slate does not distance the account from empiricism so long as it rejects representational nativism, just as Elman et al. (1996) did.

It is telling that connectionists naturally gravitate toward empiricism in spite of the stigma surrounding the tradition and even their own explicit assertions and roundabout philosophical identifications. Ultimately, the hallmark dispute between connectionists and classicalists is the question of what kind of tabula the mind is, a question that does not directly concern the rationalist/empiricist distinction but results from it by entailment. It is really just a practical matter that, whereas syntactic or logical engines require innate representations, complex neuronal slates like ours do not. Then again, it is also a practical matter that the only intelligent beings we know of are born with highly complex neural networks. Deep down, I am inclined to think that Fodor’s Informational Atomism is logically correct—if the mind works like a logical or syntactic engine, then all simple concepts must be innate. As Barsalou (1999) notes, there are no accounts on offer for how simple symbols could be acquired by a classical cognitive architecture or any logical or syntactic engine, and this may very well be because no such account is possible. This admission, however, should not lead us to accept Fodor’s theory of concepts, but rather should convince us that the mind is not a Turing machine (like the image below) or a syntactic engine (cf. Pinker 2005).



As the evidence has mounted, even Chomsky has had to abandon most of the original postulates of generative linguistics, including the important distinction between surface structure and deep structure and the view that syntax is a totally autonomous faculty that does not derive from or associate at all with the lexicon.  The Minimalist Program (Chomsky 1995) reduced the philosophical rationalism of Chomsky's theory to such an extent that several academics who have based their own work on generative models, suddenly finding themselves in a theoretical void that threatens to undermine their research, have chosen either to ignore it entirely or to attempt to undermine the program.  But this is just one example of how rationalist philosophy of mind is undergoing its slow death, weakening as the data pile up.  As the first generation of cognitive scientists dies out and the third generation starts to assume positions of power, the stigma branded upon empiricism will weaken.  The likely result is a renewal that will allow funding to flow to new experimental techniques and to innovative practical applications across the interrelated disciplines.  Exciting times lie ahead.

-------

REFERENCES

- Anderson, J.R. (1982). “Acquisition of cognitive skill”. Psychological Review 89: 369-406.
- Arnauld, A. & Lancelot, C. (1660). General and Rational Grammar: The Port-Royal Grammar. J. Rieux and B.E. Rollin (trans.). The Hague: Mouton, 1975.
- Arnauld, A. & Nicole, P. (1662). Logic, or The Art of Thinking; being The Port-Royal Logic. Thomas Spencer Baynes (trans.). Edinburgh: Sutherland and Knox, 1850.
- Barsalou, L.W. (1999). “Perceptual symbol systems.” Behavioral and Brain Sciences, 22: 577-609.
- Bechtel, W., Abrahamsen, A. & Graham, G. (1998). "The Life of Cognitive Science". A Companion to Cognitive Science. W. Bechtel & G. Graham (eds.). Massachusetts: Blackwell Publishers Ltd.
- Boole, G. (1854). An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities. London: Macmillan.
- Brooks, R.A. (1991). “Intelligence Without Representation.” Artificial Intelligence Journal 47: 139–160.
- Chomsky, N. (1957). Syntactic Structures. New York: Mouton de Gruyter.
- Chomsky, N. (1959). "A Review of B. F. Skinner's Verbal Behavior." Language, 35, No. 1: 26-58.
- Chomsky, N. (1966). Cartesian Linguistics: A Chapter in the History of Rationalist Thought. New York: Harper & Row.
- Chomsky, N. (1967). “Preface to the 1967 reprint of ‘A Review of Skinner's Verbal Behavior’.” Readings in the Psychology of Language. Leon A. Jakobovits & Murray S. Miron (eds.). Prentice-Hall, Inc. pp. 142-143.
- Chomsky, N. (1995). The Minimalist Program. Cambridge, MA: MIT Press.
- Dreyfus, H.L. (1992). What Computers Still Can’t Do: A Critique of Artificial Reason. Cambridge, MA: MIT Press.
- Elman, J. L. (1998). “Connectionism, artificial life, and dynamical systems: New approaches to old questions.” A Companion to Cognitive Science. W. Bechtel & G. Graham (eds.) Oxford: Basil Blackwood.
- Elman, J.L., Bates, E.A., Johnson, M.H., Karmiloff-Smith, A., Parisi, D., Plunkett, K. (1996). Rethinking Innateness: A Connectionist Perspective on Development. Cambridge, MA: MIT Press.
- Feldman, J.A. (1981). “A connectionist model of visual memory.” Parallel Models of Associative Memory. G.E. Hinton & J.A. Anderson (eds.). New Jersey: Erlbaum.
- Fodor, J.A. (2003). Hume Variations. New York: Oxford University Press.
- Fodor, J.A. & Pylyshyn, Z.W. (1988). “Connectionism and Cognitive Architecture: A Critical Analysis.” Cognition 28: 3-71.
- Goldberg, A.E. (2004). “But do we need Universal Grammar? Comment on Lidz et al. (2003).” Cognition 94: 77-84.
- Hare, M. & Elman, J.L. (1995). “Learning and morphological change.” Cognition 56: 61-98.
- Haugeland, J. (ed.) (1985). Artificial Intelligence: The Very Idea. Cambridge, MA: MIT Press.
- Hazlehurst, B. & Hutchins, E. (1998). “The emergence of propositions from the co-ordination of talk and action in a shared world.” Language and Cognitive Processes 13(2/3): 373-424.
- Hebb, D. (1949). The Organization of Behavior: A Neuropsychological theory. New York: Wiley.
- Hutchins, E. & Hazlehurst, B. (1995). “How to invent a lexicon: the development of shared symbols in interaction.” Artificial Societies: the computer simulation of social life. N. Gilbert & R. Conte (eds.). London: UCL Press. pp. 157-189.
- Kuhn, T. (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1970. (2nd revised edition)
- Lakoff, G. (1987). Women, Fire, and Dangerous Things: What Categories Reveal About the Mind. Chicago: The University of Chicago Press.
- Langacker, R.W. (1987). Foundations of Cognitive Grammar. Stanford, CA: Stanford University Press.
- Lewis, J.D., & Elman, J.L. (2001). “Learnability and the statistical structure of language: Poverty of stimulus arguments revisited.” Proceedings of the 26th Annual Boston University Conference on Language Development.
- MacWhinney, B. (2013). “The Logic of a Unified Model”. S. Gass and A. Mackey (eds.). Handbook of Second Language Acquisition. New York: Routledge. pp. 211-227.
- McClelland, J.L. & Rumelhart, D.E. (1981). “An interactive activation model of context effects in letter perception: Part 1. An account of basic findings.” Psychological Review 88: 375-407.
- McClelland, J.L. (1989). “Parallel distributed processing: Implications for cognition and development.” Morris, R. (ed.) Parallel distributed processing: Implications for psychology and neurobiology. New York: Oxford University Press.
- McCulloch, W.S. & Pitts, W. (1943). “A logical calculus of the ideas immanent in nervous activity.” Bulletin of Mathematical Biophysics 5: 115–137.
- Newell, A., Shaw, J.C. & Simon, H.A. (1959). “Report on a general problem-solving program”. Proceedings of the International Conference on Information Processing, pp. 256-264.
- Pinker, S. (2005). "So How Does The Mind Work?" Mind and Language 20, 1: 1-24.
- Prince, A. & Smolensky, P. (1997). “Optimality: From Neural Networks to Universal Grammar”. Science 275: 1604-1610.
- Prinz, J.J. (2002). Furnishing the Mind. Massachusetts: MIT Press.
- Quine, W.V.O. (1960). Word and Object. Massachusetts: MIT Press.
- Rochester, N., Holland, J.H., Haibt, L.H., & Duda, W.L. (1956). “Tests on a cell assembly theory of the action of the brain, using a large digital computer.” IRE Transactions on Information Theory 2: 80-93.
- Rosenblatt, F. (1962). Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, D.C.: Spartan Books.
- Skinner, B.F. (1957). Verbal Behavior. Acton, MA: Copley, 1992.
- Thelen, E. & Bates, E. (2003). “Connectionism and dynamic systems: are they really different?” Developmental Science 6, 4: 378-391.
- Tomasello, M. (2006). “Acquiring linguistic constructions”. Handbook of Child Psychology. Kuhn, D. & Siegler, R. (eds.). New York: Wiley.
- Turing, A.M. (1936). "On Computable Numbers, with an Application to the Entscheidungsproblem". Proceedings of the London Mathematical Society, 2, 42: pp. 230–65, 1937.
- van Gelder, T.J. (1999). “Defending the dynamical hypothesis.” Dynamics, Synergetics, Autonomous Agents: Nonlinear Systems Approaches to Cognitive Psychology and Cognitive Science. W. Tschacher & J.P. Dauwalder (eds.) Singapore: World Scientific. pp. 13-28.
- von Neumann, J. (1945). "First Draft of a Report on the EDVAC". Originally confidential [property of the United States Army Ordnance Department].
- Zwaan, R.A. (2004). “The Immersed Experiencer: Toward an embodied theory of language comprehension.” The Psychology of Learning and Motivation 44: 35-62.





Be not sad, by James Joyce







Be not sad because all men
Prefer a lying clamour before you:
Sweetheart, be at peace again --
Can they dishonour you?

They are sadder than all tears;
Their lives ascend as a continual sigh.
Proudly answer to their tears:
As they deny, deny.


Test Your Romantic Relationship Attachment Style




Have you ever wondered how it is that you in particular bond during romantic relationships? It just so happens that this has been a topic of scientific research for over 30 years.  As a result, if you are willing to be honest, you can test yourself and get a pretty clear picture.  This post provides a link to a free psychological test where you can do just that, as well as some background information so that you may better understand your results.




Attachment Theory stems from the seminal work of John Bowlby, who began publishing papers on the subject in 1958 and developed the ideas into a full-blown model in the trilogy of books Attachment and Loss: Volume I: Attachment was published in 1969, Volume II: Separation: Anxiety & Anger in 1973, and Volume III: Loss: Sadness & Depression in 1980.

Mary Ainsworth developed the Strange Situation Protocol to empirically observe the behavior of infants from 12 to 20 months of age.  The protocol was usually carried out as follows:

Episode 1: Mother (or other familiar caregiver), Baby, Experimenter (30 seconds)
Episode 2: Mother, Baby (3 mins)
Episode 3: Mother, Baby, Stranger (3 mins or less)
Episode 4: Stranger, Baby (3 mins)
Episode 5: Mother, Baby (3 mins)
Episode 6: Baby Alone (3 mins or less)
Episode 7: Stranger, Baby (3 mins or less)
Episode 8: Mother, Baby (3 mins)

Though many observations were important in all "Episodes", the key observations are mostly obtained in Episodes 5 and 8, when the infant's response to the caregiver's return reveals the primary characteristics of its pattern of behavior in relation to its primary caregiver, usually the mother.  The classification system that resulted from these and further experiments is commonly referred to as Attachment Styles.


Infant Attachment Styles


There are 4 attachment styles:
  1. Secure Attachment
  2. Anxious-Resistant Insecure Attachment, also commonly called Ambivalent Attachment
  3. Anxious-Avoidant Insecure Attachment
  4. Disorganized/Disoriented Attachment
There are subcategories to each of these four styles; the following descriptions are just a summary.  Secure infants will readily explore the surroundings, interact with the stranger, and get upset when the mother departs, but are happy when she returns.  Secure attachment develops when caregivers are readily available to the infants and are able to satisfy their needs in an optimal manner.  In contrast, anxious-resistant infants tend not to interact with the stranger even when their mother is present; upon her departure they often appear distressed, yet when she comes back they want to approach her but express anger or helplessness instead.  Anxious-resistant attachment results from parents who are unpredictable and respond inconsistently to the infant's needs.  Perhaps worse, infants develop an anxious-avoidant insecure attachment when their attempts at closeness are rejected and, furthermore, their needs are repeatedly unattended to; as a result, the infants learn that communication is useless and begin to camouflage their distress by seeming aloof and unresponsive.  Anxious-avoidant infants do not explore much, do not show anger when the mother leaves, and either ignore her when she returns or simply turn away.

Disorganized/disoriented attachment puzzled researchers at first, such that many subjects were improperly classified in the early experiments, until Mary Main added this fourth category once there was enough data to discern the pattern.  Infants with a disorganized attachment style display tense, jerky movements that attempt to contain crying, movements that stop when they do cry. Overwhelmed by fear, these infants behave inconsistently and contradictorily, often displaying clear signs of psychological dissociation; nonetheless, about half of these infants still approach their caregivers without resistance or avoidance.  This disoriented attachment style may sometimes be the result of abuse; in barely a majority of cases, it stems from the mother having suffered trauma shortly before or after childbirth, or having had a major loss (like the death of a parent) that she did not fully process, such that she became severely depressed.


Adult Relationship Attachment Styles


An individual's attachment style may change over the years depending on the quality of their experiences during development.  And although romantic relationships differ in many ways from caregiver-infant relationships, romantic bonds involve many of the same core dynamics, and traces of those first attachments tend to carry over into adulthood, in many cases remaining constant.

The adult romantic attachment styles are:
  1.   Secure
  2.   Anxious-Preoccupied
  3.   Dismissive-Avoidant
  4.   Fearful-Avoidant   

These four styles can be graphed on a four-quadrant chart, with Anxiety on the X-axis and Avoidance on the Y-axis.  The secure style thus occupies the low-anxiety, low-avoidance quadrant, as in the sketch below.
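
For readers who want to reproduce the chart, here is a minimal sketch using Python's matplotlib.  The quadrant assignments follow the description above (Anxiety on the X-axis, Avoidance on the Y-axis); the axis ranges and the exact coordinates of each label are illustrative choices of mine:

import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(6, 6))
ax.axhline(0, color="black")   # horizontal midline (low vs. high Avoidance)
ax.axvline(0, color="black")   # vertical midline (low vs. high Anxiety)

# Each style occupies one quadrant, given as (anxiety, avoidance).
styles = {
    "Secure":              (-0.5, -0.5),   # low anxiety, low avoidance
    "Anxious-Preoccupied": ( 0.5, -0.5),   # high anxiety, low avoidance
    "Dismissive-Avoidant": (-0.5,  0.5),   # low anxiety, high avoidance
    "Fearful-Avoidant":    ( 0.5,  0.5),   # high anxiety, high avoidance
}
for name, (x, y) in styles.items():
    ax.text(x, y, name, ha="center", va="center")

ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.set_xticks([])
ax.set_yticks([])
ax.set_xlabel("Anxiety (low to high)")
ax.set_ylabel("Avoidance (low to high)")
plt.show()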


Now that you have enough background information....


NOTE: Choose Survey B.






------------------
Other psychological personality tests you may enjoy:



How to Relax Completely in 10 Seconds




Constant anxiety is a major part of everyday life for millions of individuals.  The prognosis for anxiety disorders is among the worst within the diverse families of psychopathology.  From a medical perspective, treatment typically consists of prescribing benzodiazepines (e.g., lorazepam, clonazepam, diazepam), which produce dependence and tolerance.  These medications relieve the symptoms but leave the causes untreated.  From a pure psychotherapy perspective, the prognosis for anxiety is just as bad: Cognitive-Behavioral Therapy, the most widely employed technique nowadays, targets the specific ideas that trigger feelings of anxiety, but this is ineffective because of the nature of anxiety itself.  Unlike phobias - fears tied to specific triggers - anxiety results from persistent fear that has lost its triggers and spread throughout the brain.  If you manipulate some ideas through frequent repetition, the anxiety simply resurfaces elsewhere, again because the causes are not being treated.

But not all is hopeless.  Relaxation techniques, used properly and frequently, both relieve anxiety and rewire the very neural networks that generate it.  I previously posted a technique for combating morning anxiety by listening to, and singing along with, a specific adaptation of Beethoven's Ode to Joy.  In what follows, I provide instructions for a shorter and far more effective relaxation technique.



How to Relax in 10 Seconds


The following technique is not well known, but it works like a charm.  You will have to stand up and adopt what I call the "Receptive Position", a variant of the so-called Anatomical Position.



So here are the instructions for how to relax in 10 seconds with the Receptive Position:
Step 1:  Stand up straight, shoulders back but relaxed.

Step 2:  Raise your chin a little (as in a "proud" emotional stance).

Step 3:  Drop your arms to your sides and completely release any tension that might be hiding there.

Step 4:  Turn your palms to face forward and try again to relax your arms. (This is the hardest part of the exercise; if it causes you some pain, you may angle your palms slightly towards you, so long as they still face mostly forward rather than towards your body.)

Step 5:  Make sure your body is as free of tension as you can possibly get it to be.

Step 6:  Close your eyes.

Step 7:  Breathe deeply, counting in silence every exhalation until you reach 10. (If you are extra stressed, breathe and count each exhalation until 15.)

Step 8:  Upon counting 10 (or 15), immediately open your eyes. 


Do it!  After finishing, ask yourself: how do you feel at that precise moment?

If you are so anxious that your first attempt caused you some physical discomfort, please just do the exercise one more time.  This really does work for everyone.

Once you've learned how to perform this easy procedure correctly, you can repeat it whenever you feel anxious or overly stressed, provided you can find a place with some privacy.

I hope that this exercise has provided you immediate relief.


BONUS:  You can check how anxious you are by rising onto the tips of your toes as you inhale, then lowering yourself as you exhale.  Be careful!  If you are anxious, you will feel as if you are falling as you rise onto your toes (a vertigo-like feeling).  In contrast, if you are not anxious, elevating yourself in this way will not cause you any discomfort.


Each and All, by Ralph Waldo Emerson








Little thinks, in the field, yon red-cloaked clown,
Of thee, from the hill-top looking down;
And the heifer, that lows in the upland farm,
Far-heard, lows not thine ear to charm;
The sexton tolling the bell at noon,
Dreams not that great Napoleon
Stops his horse, and lists with delight,
Whilst his files sweep round yon Alpine height;
Nor knowest thou what argument
Thy life to thy neighbor's creed has lent:
All are needed by each one,
Nothing is fair or good alone.

I thought the sparrow's note from heaven,
Singing at dawn on the alder bough;
I brought him home in his nest at even;—
He sings the song, but it pleases not now;
For I did not bring home the river and sky;
He sang to my ear; they sang to my eye.

The delicate shells lay on the shore;
The bubbles of the latest wave
Fresh pearls to their enamel gave;
And the bellowing of the savage sea
Greeted their safe escape to me;
I wiped away the weeds and foam,
And fetched my sea-born treasures home;
But the poor, unsightly, noisome things
Had left their beauty on the shore
With the sun, and the sand, and the wild uproar.

The lover watched his graceful maid
As 'mid the virgin train she strayed,
Nor knew her beauty's best attire
Was woven still by the snow-white quire;
At last she came to his hermitage,
Like the bird from the woodlands to the cage,—
The gay enchantment was undone,
A gentle wife, but fairy none.

Then I said, "I covet Truth;
Beauty is unripe childhood's cheat,—
I leave it behind with the games of youth."
As I spoke, beneath my feet
The ground-pine curled its pretty wreath,
Running over the club-moss burrs;
I inhaled the violet's breath;
Around me stood the oaks and firs;
Pine cones and acorns lay on the ground;
Above me soared the eternal sky,
Full of light and deity;
Again I saw, again I heard,
The rolling river, the morning bird;—
Beauty through my senses stole,
I yielded myself to the perfect whole.


Primer on Roman History: The Punic Wars, the conflicts that defined our world forever




A large part of understanding ourselves is being aware of the history that underlies the structures, symbols, and institutions that surround us, that we actively internalize (even if unawares), and that continually condition us to be a certain way and not another.  If you have some idea of how the brain works, you also know that the way we human beings presently behave is due in large part to the cultural evolution that has given rise to the contexts that engulf us.

Our world would not be as it is had the Punic Wars not gone the way they did.  These massive, epic wars were fought between a Roman Republic that did not yet control even all of Italy and Carthage, a city settled by Phoenicians whose empire stretched across half the North African shore, southern Spain, and the many islands just west of the Italian peninsula.





The following four short, animated videos provide an entertaining account of all the elements that make the Punic Wars so gripping and so important.  Enjoy!


(For mobile users who may not be able to see the videos embedded below, please go to the following links: 1) Rome: The Punic Wars I  2) Rome: The Punic Wars II  3) Rome: The Punic Wars III  4) Rome: The Punic Wars IV)








Please see this Ancient Rome Song for a much shorter, musical primer on ancient Roman history.

Know yourself!


Some truths can only be stated in hidden ways





WARNING:  Do not stare at these images for longer than 5 minutes.  Staring at one for 15 minutes can change your color perception for several months.  This may make you curious, but please DON'T DO IT.  See the McCollough Effect for more information.




Human perception is fragile and all too flexible because of the way neural connections work - chiefly, how fast they fire and how quickly any (and every) neural network can change.


Because we live in rigidly structured societies, with thousands of clear and unbreakable rules that cannot be ignored (e.g., get naked in public and see what happens!  No, I'm kidding....please don't!), there are some basic human truths - truths of nature - that can barely be stated at all; when they are expressed, they must be hidden deep within metaphors.  The following is a clear example:



(For mobile users who cannot see the video embedded above, please click https://www.youtube.com/watch?v=pVegpypXN1I)


"Know Thyself" was inscribed at the forecourt of the Temple of Apollo at Delphi.  Visitors to Delphi looking for advice from their Oracle, the greatest oracle that has ever existed, would find this message along their way.  I strongly urge you to follow it ---  Know yourself!

A little-known fact about the Oracle of Delphi (one that sheds light on the reality of psychics) is that visitors were made to wait for days before they finally entered to hear the advice they were seeking.  The visitors almost always left baffled, perplexed, and astounded by the quality of the advice they received.  What they didn't know is that, as soon as a person arrived, the Oracle's attendants would send out scouts to gather as much information as they possibly could about that person.  There was nothing magical about the psychics at all.  It was due diligence, long training, and experience that made the Oracle of Delphi the most sacred and most influential oracle that has ever existed.


In the words of Walt Whitman:

You will hardly know who I am or what I mean,
But I shall be good health to you nevertheless,
And filter and fibre your blood.

Failing to fetch me at first keep encouraged,
Missing me one place search another,
I stop somewhere waiting for you.


Published in Leaves of Grass, Final "Deathbed Edition", 1892.



