
Crash and War Anger

All of us love validation – especially when it comes from an admired source.

That’s the way I feel after reading the NYT review by Fareed Zakaria – possibly my most admired journalist.

The review is of a book called Crashed, by an eminent scholar writing about the consequences of the crash of 2008. The review is below.

It validates my deep belief that the seeds of Trump’s victory go back to the “crash” of 2008. It was a moment of major negative “reset” for far too many Americans. Their savings, or their employability, or their home values, or their prospects for credit changed so negatively that it created an emergent body politic. The new body politic was characterized by a primary sentiment: seething anger. More importantly, it was characterized by a call to action: “throw the bums out!”.

The deep irony here is that democracy handed the angry a “throw the bums out” choice that many didn’t want – Barack Obama.

But their anger at inside-the-beltway Republicans, and George W. Bush, was so strong that – inside the ballot box – they pulled the lever for Obama.

When Donald Trump had the courage to viciously criticize the Republican establishment, and especially Bush, he was speaking directly to this new body politic. If their sentiment was resentment, they found their gladiator in Trump.

My only beef with the book and the review is that they do not go back far enough.

I believe the seeds of Trump’s victory go back to 9/11. It was that fateful day that itself created a new body politic, whose primary sentiment was “We are under attack and we must fight back.”

George W. Bush was responding to that scary, new sentiment when he announced not just one, but two new wars. History will record that the Iraq War – which cost trillions – was a major mistake. History will be somewhat kinder about the war in Afghanistan, which led the nation into a massively expensive 15+ year engagement of limited success and many, many unintended consequences.

So my point is that 9/11 rained holy hell on the nation – because of the new body politic of “we are under attack and we must fight back” – by pushing a very weak leader, George W. Bush, to start two wars that almost immediately looked incompetent and wrong.

The 2008 crash was the final straw. Two stupid wars and a major economic reset were enough to push most Americans over the edge to a seething anger and a “throw the bums out” call to action.

It took the decade after to weave a tapestry of cause and effect, supported by right wing media. Never mind that most of the tapestry was a fabrication. Never mind that he was a serial liar. It was soothing to have a gladiator (Trump) who spoke the truth about the subjects that really mattered: “those folks in Washington don’t know what they are doing and they need to go”; “you are being screwed by the economic resets”; and “the war in Iraq was a major mistake”.

There is very little question in my mind that Donald Trump will go down in history as our worst president. He will be remembered for his failures, his indecency and his lack of integrity. He will be remembered for his failures abroad, where he embarrasses us and plays the fool, and his failures at home, where he depletes the treasury and breaks the back of the Affordable Care Act. Everywhere he goes, he does what is bad, and undoes decades of progress in defining what is good, e.g. environmental regulation. His indecency and his lack of integrity will leave lasting scars on the office, but hopefully schools and parents will now have an example of what not to be, how not to act.

How did the nation get to this horrible outcome? We will only have perspective on this decades from now, but the “second draft of history”, to me, traces it all back to 9/11 and its two awful wars. The cruel irony was that after eight Bush years of misguided foreign adventurism, the American economy collapsed. It was the straw that broke the camel’s back, causing all of us to say “we are mad as hell and we need to throw the bums out!”

So the math, looking back thirty years from now, might well be:

Afghan war + Iraq war + economic crash = Obama
(Obama was the backlash. We threw the bums out for him, and thank God he was as level headed and as competent and decent as he was)

Economic reset for most Americans + unresolved racist and nationalistic impulses + Comey + Russia = Trump
(Trump was the backlash to Obama, supported by all events above)

CREDIT: https://www.nytimes.com/2018/08/10/books/review/adam-tooze-crashed.html?rref=collection%2Fsectioncollection%2Fbook-review&action=click&contentCollection=review&region=rank&module=package&version=highlights&contentPlacement=1&pgtype=sectionfront
Looking Back at the Economic Crash of 2008

By Fareed Zakaria

Aug. 10, 2018

Crashed: How a Decade of Financial Crises Changed the World

By Adam Tooze
706 pp. Viking. $35.

Steve Bannon can date the start of the Trump “revolution.” When I interviewed him for CNN in May, in Rome, he explained that the origins of Trump’s victory could be found 10 years ago, in the financial crisis of 2008. “The implosion of those world capital markets has never really been sorted out,” he told me. “The fuse that was lit then that eventually brought the Trump revolution is the same thing that’s happened here in Italy.” (Italy had just held elections in which populist forces had won 50 percent of the vote.) Adam Tooze would likely agree. An economic historian at Columbia University, he has written a detailed account of the financial shocks and their aftereffects, which, his subtitle asserts, “changed the world.”

If journalism is the first rough draft of history, Tooze’s book is the second draft. A distinguished scholar with a deep grasp of financial markets, Tooze knows that it is a challenge to gain perspective on events when they have not yet played out. He points out that a 10-year-old history of the crash of 1929 would have been written in 1939, when most of its consequences were ongoing and unresolved. But still he has persisted and produced an intelligent explanation of the mechanisms that produced the crisis and the response to it. We continue to live with the consequences of both today.

As is often the case with financial crashes, markets and experts alike turned out to have been focused on the wrong things, blind to the true problem that was metastasizing. By 2007, many were warning about a dangerous fragility in the system. But they worried about America’s gargantuan government deficits and debt — which had exploded as a result of the Bush administration’s tax cuts and increased spending after 9/11. It was an understandable focus. The previous decade had been littered with collapses when a country borrowed too much and its creditors finally lost faith in it — from Mexico in 1994 to Thailand, Malaysia and South Korea in 1997 to Russia in 1998. In particular, many fretted about the identity of America’s chief foreign creditor — the government of China. Yet it was not a Chinese sell-off of American debt that triggered the crash, but rather, as Tooze writes, a problem “fully native to Western capitalism — a meltdown on Wall Street driven by toxic securitized subprime mortgages.”

Tooze calls it a problem in “Western capitalism” intentionally. It was not just an American problem. When it began, many saw it as such and dumped the blame on Washington. In September 2008, as Wall Street burned, the German finance minister Peer Steinbrück explained that the collapse was centered in the United States because of America’s “simplistic” and “dangerous” laissez-faire approach. Italy’s finance minister assured the world that its banking system was stable because “it did not speak English.”

In fact this was nonsense. One of the great strengths of Tooze’s book is to demonstrate the deeply intertwined nature of the European and American financial systems. In 2006, European banks generated a third of America’s riskiest privately issued mortgage-backed securities. By 2007, two-thirds of commercial paper issued was sponsored by a European financial entity. The enormous expansion of the global financial system had largely been a trans-Atlantic project, with European banks jumping in as eagerly and greedily to find new sources of profit as American banks. European regulators were as blind to the mounting problems as their American counterparts, which led to problems on a similar scale. “Between 2001 and 2006,” Tooze writes, “Greece, Finland, Sweden, Belgium, Denmark, the U.K., France, Ireland and Spain all experienced real estate booms more severe than those that energized the United States.”

But while the crisis may have been caused in both America and Europe, it was solved largely by Washington. Partly, this reflected the post-Cold War financial system, in which the dollar had become the hyperdominant global currency and, as a result, the Federal Reserve had truly become the world’s central bank. But Tooze also convincingly shows that the European Central Bank mismanaged things from the start. The Fed acted aggressively and also in highly ingenious ways, becoming a guarantor of last resort to the battered balance sheets of American but also European banks. About half the liquidity support the Fed provided during the crisis went to European banks, Tooze observes.

Before the rescue and even in its early stages, the global economy was falling into a bottomless abyss. In the first months after the panic on Wall Street, world trade and industrial production fell at least as fast as they did during the first months of the Great Depression. Global capital flows declined by a staggering 90 percent. The Federal Reserve, with some assistance from other central banks, arrested this decline. The Obama fiscal stimulus also helped to break the fall. Tooze points out that almost all serious analyses of the stimulus conclude that it played a significant positive role. In fact, most experts believe it ended much too soon. He also points out that large parts of the so-called Obama stimulus were the result of automatic government spending, like unemployment insurance, that would have happened no matter who was president. And finally, he notes that China, with its own gigantic stimulus, created an oasis of growth in an otherwise stagnant global economy.

The rescue worked better than almost anyone imagined. It is worth recalling that none of the dangers confidently prophesied by legions of critics took place. There was no run on the dollar or American treasuries, no hyperinflation, no double-dip recession, no China crash. American banks stabilized and in fact prospered, households began saving again, growth returned slowly but surely. The governing elite did not anticipate the crisis — as few elites have over hundreds of years of capitalism. But once it happened, many of them — particularly in America — acted quickly and intelligently, and as a result another Great Depression was averted. The system worked, as Daniel Drezner notes in his own book of that title.

But therein lies the unique feature of the crash of 2008. Unlike that of 1929, it was not followed by a Great Depression. It was not so much the crisis as the rescue and its economic, political and social consequences that mattered most. On the left, the entire episode discredited the market-friendly policies of Tony Blair, Bill Clinton and Gerhard Schroeder, disheartening the center-left and emboldening those who want more government intervention in the economy in all kinds of ways. On the right, it became a rallying cry against bailouts and the Fed, buoying an imaginary free-market alternative to government intervention. Unlike in the 1930s, when the libertarian strategy was tried and only deepened the Depression, in the last 10 years it has been possible for the right to argue against the bailouts, secure in the knowledge that their proposed policies will never actually be implemented.

Bannon is right. The crash brought together many forces that were around anyway — stagnant wages, widening inequality, anger about immigration and, above all, a deep distrust of elites and government — and supercharged them. The result has been a wave of nationalism, protectionism and populism in the West today. A confirmation of this can be found in the one major Western country that did not have a financial crisis and has little populism in its wake — Canada.

The facts remain: No government handled the crisis better than that of the United States, which acted in a surprisingly bipartisan fashion in late 2008 and almost seamlessly coordinated policy between the outgoing Bush and incoming Obama administrations. And yet, the backlash to the bailouts has produced the most consequential result in the United States.

Tooze notes in his concluding chapter that experts are considering the new vulnerabilities of a global economy with many new participants, especially the behemoth in Beijing. But instead of a challenge from an emerging China that began its rise outside the economic and political system, we are confronting a quite different problem — an erratic, unpredictable United States led by a president who seems inclined to redo or even scrap the basic architecture of the system that America has painstakingly built since 1945. How will the world handle this unexpected development? What will be its outcome? This is the current crisis that we will live through and that historians will soon analyze.

Fareed Zakaria is a CNN anchor, a Washington Post columnist and the author of “The Post American World.”

Ireland’s History


CREDIT: http://www.livinginireland.ie/en/culture_society/a_brief_history_of_ireland/
CREDIT: http://www.wesleyjohnston.com/users/ireland/past/history/index.htm

Ireland is beautiful – so beautiful that it is overwhelming.

It is blessed with natural beauty: beautiful land; abundant rivers, streams, harbors; hilltops and long views everywhere.

It is also blessed with what history has added: lovely villages, great ports, winding roads, castles and saved remnants of its proud past.

As the visitor moves from county to county, a few will ask: “What is Ireland’s history?”

The answer, of course, is complex – and every Irish man, woman and child proudly feels all the glory and all the pain. It is a rich heritage. And it is by no means monolithic.

Ireland’s heritage is conquest, first by Celts, then by Irish Kings, then by Vikings, then Anglo-Normans, and finally by English Kings. From conquest comes rebellion – the Irish are rebels to the core. With rebellion comes, way too frequently, crushing defeat and retribution.

In spite of this, using today’s lens, Ireland looks resilient and proud – the Irish always come back! And they look victorious – ultimately triumphing. It is an exciting history of leaders, their shifting lands, their shifting allegiances, treacheries and betrayals, and evolving forms of government and administration of justice.

The history below starts from the beginning. This overview, by contrast, starts from the end – today – and goes backward, from present-day Ireland to the first evidence of human activity.

Today, Ireland is split into two – the Republic of Ireland (4.8 million population) and Northern Ireland (1.8 million). The split is the legacy of the Irish War of Independence of 1919-1922, which culminated in this compromise – one that neither side liked and that has been contentious ever since.

The Republic of Ireland is independent, and joined the EU in 1973. Northern Ireland is part of the United Kingdom. It was created in 1920, and the War of Independence left it unchanged.

From 1798 to 1922, Ireland lost population and failed to recover from two brutal events: the crushing defeat of the Irish rebellion in 1798, and the devastating Potato Famine of 1846-1848. The English rulers were not kind during this period, which exacerbated the simmering hatred that the Irish felt for the English.

Why such hatred? Because, from 1541 to 1798, the English attempted to impose their laws and customs, and the Irish resisted, and sometimes the resistance became open rebellion. These rebellions were often brutally crushed, and retribution was swift. The Irish hated them for it. The English brought to Ireland hated movements of all kinds. There were movements to take away Irish religion; to take away Irish land; to take away Irish rights; and to invoke cruel forms of genocide (for example, look at the history of the decimating Irish potato famine of 1846-8). Henry VIII kicked off this horrible period when he declared himself King of Ireland in 1541, with the Pope’s blessing.

From 1169 to 1541, Ireland was in the hands of Anglo-Norman lords and Irish Kings. The lords were the result of the successful conquest of much of Ireland by “Strongbow”. The conquest began with his landing in 1169. Strongbow was himself an English Earl, but he brought with him a coalition of English and Normans.

From the eighth century until the end of the twelfth century – 400 years – Ireland was ruled by Irish Kings and Vikings. It was also a period of early Christianity, with monasteries being built throughout the country. Viking influence faded as Irish Kings began to dominate.

Before the fifth century, Ireland was an island of farmers. Amazingly, the island missed a period of Roman Empire domination. Historians record that Roman leaders discussed invading Ireland, but those ideas were never implemented. As a result, Ireland is one of the few countries in Europe without legacies from the Roman Empire.

The intrigue, the conquest, the rebellion, the retribution: is Ireland, this small island of six million people, a microcosm of the world? Does it provide universal insights?

Possibly. Here are two:

Incentives to control land and expand reach are powerful.

The story of Ireland is a story of powerful men exploiting opportunities. Any Irish King, Norman Lord, or English King encouraged loyalists to form alliances which bound them to fight for land. Once victorious, the loyalist would bring that victory back to the King, and – more often than not – the King would grant that land back to the loyalist (Lord) who had conquered it. This, of course, was only done with a pledge of continuing loyalty from the loyalist, and a pledge to give back to the King payments in the form of taxes, armies, or whatever the King required.

Conquest breeds resentment, and even hatred.

The story of Ireland is a story of conquest, followed by new rules and retribution for the conquered. We know change is hard, and culture change is the hardest of all. In all cases, imposed changes fostered resentment, and retribution, often harsh, bred hatred. When it surfaced, it became rebellion.


Early Irish History
Ireland’s early settlers trace to about 10,000 years ago – a relatively late stage in European terms. Farming began around 4000 BC. Celts came to Ireland from mainland Europe around 300 BC. Ireland’s language, Irish (or Gaeilge), stems from the Celtic language.

Early Christian and Viking Ireland – 600 AD
Saint Patrick and other missionaries brought Christianity in the early to mid-5th century, replacing the indigenous pagan religion by the year 600 AD. The Rock of Cashel stands today as a testimony to this long history of missionaries for Christ, led by Saint Patrick.

At the end of the 8th century, Vikings from Scandinavia began to invade and then gradually settle into and mix with Irish society. The Vikings founded Dublin, Ireland’s capital city, in 988. Following the defeat of the Vikings by Brian Boru, the High King of Ireland, at Clontarf in 1014, Viking influence faded.

The Anglo-Norman Era and “Strongbow” – 1166
The English and Normans arrived in the twelfth century (1166), driving out the Vikings. They brought walled towns, castles, churches, and monasteries. They also increased agriculture and commerce in Ireland.

They were actually invited. The then-King of Leinster visited England and met with King Henry there, as well as many noblemen. His mission: garner support to retake his lands. The King demurred, so he sought the support of Anglo-French noblemen in re-establishing his prior dominance (he had been badly defeated in a battle by the “High King of Ireland”, who was from Connaught).

Several noblemen offered help. They were led by the Earl of Pembroke, also known as Strongbow. Strongbow’s army landed in 1169 and quickly re-took Leinster.

Importantly, they went further and also defeated Dublin, exiling its Viking king. In 1171, the existing King of Leinster (Mac Murchada) died, and Strongbow became King of Leinster.

The English King Henry decided to sail to Ireland in 1171. Strongbow pledged his loyalty to Henry, and every major Irish king did as well (other than O’Connor of Connacht and O’Neill in the north).

In the years that followed, the Anglo-Normans consolidated their new lands by building castles, setting up market towns and engaging in large-scale colonisation. Prince John arrived in Waterford in 1185 and initiated the building of a number of castles in the South East region. An example is Dungarvan Castle (King John’s Castle).

Importantly, the Anglo-Normans did not move the native Irish from their land in any major way.

Ireland became a Kingdom in 1199 with Papal approval. All English laws were extended to Ireland in 1210. By 1261, most of Ireland was ruled by Anglo-Norman lords, under the watchful eye of the King of England. This was to be short-lived.

In fact, by 1316, the King had essentially lost control of Ireland, except for Dublin. Cleverly, the few remaining Irish Kings (the Irish Lords in Ulster, O’Neill and O’Donnell) allied with Robert Bruce, the crowned King of Scotland, whose military prowess had become legendary. With Bruce’s help, they essentially defeated the Anglo-Normans in 1315. Edward Bruce was named King of Ireland until he was killed by the Normans in 1318.

From 1261 to 1541, English Kings made several major attempts to regain control. Several invasions during this period were deeply resented by all locals, including the lords. They were hailed as successes at the time, but proved short-lived. By 1450, English control of Ireland had been reduced to Dublin.

Meanwhile, across the centuries, many Norman lords and their descendants essentially “went native” – increasingly adopting Irish customs and culture through inter-marriage and the like.

Anglo-Norman Feudalism – Counties, Liberties and Charters
The Irish adventure of English Kings can best be understood by studying incentives – the motives by which men went to war, won land, and then subjugated themselves to the King.

Before the “feudal” system of government and tax collection is discussed, a simple method of keeping score is useful. The score in question is: how can you easily measure how much control England had at any given moment in time? The answer lies in counting how many of three jurisdictions existed at any point in time:

Counties were areas in Ireland that were under the complete control of the King and his feudal system.
Liberties were areas in Ireland that were loyal to the King, but did not participate in tributes to the King via feudal arrangements.
Charters were areas in Ireland that were independent of the King, completely under Irish Kings and Lords. Importantly, however, these areas had treaties with the King that specified how they could co-exist.

Knowing this, the King had every incentive to create Counties, which would pay taxes and pledge their allegiance to the Crown. The King’s men would conquer lands that were Charters, and make them Liberties or Counties. Then, they would slowly evolve any remaining Liberties into Counties. This describes the first three centuries of evolving Irish government. In 1250, the vast majority of northern Ireland was chartered land, while Counties were strongest on the east coast.

So how did it come to pass that Ireland was mostly counties by the 16th century?

The incentives for the warrior noblemen were enormous. The King, on seeing that Irish land had been conquered by his loyal subject, would frequently make a land grant back to that nobleman (mostly of the land that had just been conquered) – in return for the feudal arrangements discussed below.

Anglo-Norman society was based on the “feudal” system of government, a standard practice throughout Europe in that day. Under this system, the King owned all land. He in turn granted it to Lords. In return, the Lords agreed to pay an annual “tribute” to the King. This could take the form of money, goods, or even armies in times of war.

The Lords, in turn, granted parcels of their lordships to peasants (ordinary people) in return for money, a soldier in time of war, or some goods. Many Lords set up market towns in their lordships to encourage trade and to convert goods into money. At the bottom of the hierarchy were landless peasants who were granted a plot on another peasant’s land in return for manual labour on the farm.

The Irish system, by contrast, saw no overall ownership of land; rather, each individual Lord had absolute ownership of his land. The commoners worked on the Lord’s land in return for accommodation and food. Peasants were granted land by a Lord in return for an annual payment of crops. The Lords, in turn, were granted land by the King.

Henry VIII, “Plantations” – 1534
King Henry VIII declared himself head of the Church in England in 1534. He also ensured that the Irish Parliament declared him King of Ireland in 1541, with the support of the Pope.

There followed 250 years of brutality that culminated in a deep hatred by the Irish for the English.

From 1541 to 1798, Ireland was in the grip of the English Crown. Those 250 years are scandalous because the English brought to Ireland hated movements of all kinds: movements to change the Irish religion (from Catholic to Anglican); to steal Irish land; to take away Irish rights; and in many ways to invoke on the Irish people a peculiar form of genocide (for example, look at the history of the decimating Irish potato famine of 1846-8). Henry VIII kicked off this horrible period when he declared himself King of Ireland in 1541. The English brutally ruled from 1541 to 1798, and the Irish hated them for it.

Enforcing his will to change England and Ireland from Catholic to Protestant (he named himself the head of the Anglican church), the King adopted a “plantation” policy. Under this policy, Protestants received massive land grants, displacing Catholic land-holders. Thousands of English and Scottish Protestant settlers arrived during his reign. Catholics lost their land.

From this period on, a common theme became Irish rebellion followed by crushing defeat followed by retribution. The victors, the English, understandably attempted to impose their will. In the process of doing so, though, it became a hated history – of depriving Catholics of their land, and then their rights, and then their very lives.

Bloody 17th Century and “Penal Laws”
The 17th century was a bloody one in Ireland. England imposed the “Penal Laws” on Ireland. These laws took away rights from Catholics. They took away the right, for example, to rent or own land above a certain value. They outlawed Catholic clergy. They forbade Catholic higher education and entry into the professions. During the 18th century, the Penal Laws eased, but by then resentment and hate dominated the country.

Defeated During Rebellion – 1798
In 1782, Henry Grattan (a Protestant) successfully agitated for a more favourable trading relationship with England and for greater legislative independence for the Parliament of Ireland. Inspired by the French Revolution, in 1791 an organisation called the United Irishmen was formed with the ideal of bringing Irish people of all religions together to reform and reduce Britain’s power in Ireland. Its leader was a young Dublin Protestant called Theobald Wolfe Tone. The United Irishmen were the inspiration for the armed rebellion of 1798. Despite attempts at help from the French, the rebellion failed, and in 1801 the Act of Union was passed, uniting Ireland politically with Britain.

Catholic Emancipation and Daniel O’Connell – 1829
In 1829, one of Ireland’s greatest leaders, Daniel O’Connell, known as ‘the great liberator’, was central in getting the Act of Catholic Emancipation passed in the parliament in London. He succeeded in getting the total ban on voting by Catholics lifted; they could now also become Members of the Parliament in London.

After this success, O’Connell aimed to cancel the Act of Union and re-establish an Irish parliament. However, this was a much bigger task, and O’Connell’s approach of non-violence was not supported by all. Such political issues were overshadowed, however, by the worst disaster and tragedy in Irish history – the Great Famine.

The Great Famine – 1845-1847
When a potato blight destroyed the crops of 1845, 1846, and 1847, disaster followed. During this decade, Ireland’s population plummeted from 8 to 4 million. Two million died. Others left – seeking refuge in America. Potatoes were the staple food of a growing population at the time. The response of the British government also contributed to the disaster. While millions of people were starving, Ireland was forced to export abundant harvests of wheat and dairy products.

The famine brought death. But, with lasting consequences, it also brought resentment and more hate.

First Move to “Home Rule” Defeated – 1877 – Irish Home Rule Party and Charles Stewart Parnell
“Home Rule” became the cry of all who wanted self-government in Ireland. Until 1877, there was no effective challenge to Britain’s rule over Ireland. Then, at the age of 31, Charles Stewart Parnell (1846-91) became leader of the Irish Home Rule Party, which became the Irish Parliamentary Party in 1882.

Parnell failed to achieve Home Rule. For his efforts, though, he was widely recognised as ‘the uncrowned king of Ireland’. His efforts gave the idea of Home Rule legitimacy.

Irish Unionists – led by Sir Edward Carson in Northern Ireland
In Ulster, in the north of Ireland, the majority of people were Protestants. They favoured the union with Britain – fearing that they would suffer retribution and a loss of rights as a minority in a Catholic-controlled country. The Unionist Party was led by Sir Edward Carson. Carson threatened an armed struggle for a separate Northern Ireland if independence was granted to Ireland.

Second Move to Home Rule Successful – 1912 – but not enacted because of WWI
A Home Rule Bill was passed in 1912 but, crucially, it was not brought into law. The Home Rule Act was suspended at the outbreak of World War One in 1914. Many Irish nationalists believed that Home Rule would be granted after the war if they supported the British war effort. John Redmond, the leader of the Irish Parliamentary Party, encouraged people to join the British forces, and many did.

However, a minority of nationalists did not trust the British government, leading to one of the most pivotal events in Irish history: the Easter Rising.

Declaration of Independence I – Easter Rising – Irish rebels defeated – 1916
On April 24 (Easter Monday) 1916, two groups of armed rebels, the Irish Volunteers and the Irish Citizen Army, seized key locations in Dublin. The Irish Volunteers were led by Padraig Pearse and the Irish Citizen Army was led by James Connolly. Outside the GPO (General Post Office) in Dublin city centre, Padraig Pearse read the “Proclamation of the Republic” (the Irish Declaration of Independence). The Proclamation declared an Irish Republic independent of Britain.

Battles ensued with casualties on both sides and among the civilian population. The Easter Rising finished on April 30th with the surrender of the rebels. The majority of the public was actually opposed to the Rising. However, public opinion turned when the British administration responded by executing many of the leaders and participants in the Rising.

All seven signatories to the proclamation were executed including Pearse and Connolly.

Declaration of Independence II – 1919
Two key figures involved in the Rising who avoided execution were Éamon de Valera and Michael Collins. In the December 1918 elections the Sinn Féin party, led by Éamon de Valera, won a majority of the Ireland-based seats in the House of Commons. On 21 January 1919 the Sinn Féin members of the House of Commons gathered in Dublin to form an Irish Republic parliament, Dáil Éireann, unilaterally declaring power over the entire island.

War of Independence – 1919-1921
What followed is known as the ‘war of independence’, in which the Irish Republican Army – the army of the newly declared Irish Republic – waged a guerrilla war against British forces from 1919 to 1921. One of the key leaders of this war was Michael Collins.

Michael Collins led an operation that ended with the label “Bloody Sunday”: a single day of violence in Dublin on November 21, 1920, during the Irish War of Independence. In total, 32 people were killed, including thirteen British soldiers and police, sixteen Irish civilians, and three Irish republican prisoners.

The day began when IRA units directed by Michael Collins set out to assassinate the ‘Cairo Gang’ – British undercover intelligence agents. IRA members went to a number of addresses and shot dead fourteen people.

Later that afternoon, members of the Auxiliary Division and the RIC opened fire on the crowd at a Gaelic football match in Croke Park, killing eleven civilians and wounding at least sixty. That evening, three IRA suspects being held in Dublin Castle were beaten and killed by their captors, who claimed the men were trying to escape.

Overall, while its events cost relatively few lives, Bloody Sunday was considered a victory for the IRA: Collins’s operation severely damaged British intelligence, and the net effect was to increase support for the IRA at home and abroad.

Third attempt at Home Rule adopted – 1920 Government of Ireland Act – Creates Southern and Northern Ireland under Home Rule
This Act of Parliament was intended to keep Ireland part of the United Kingdom through Home Rule institutions. The Act established two new subdivisions of Ireland: the six north-eastern counties were to form “Northern Ireland”, while the larger part of the country was to form “Southern Ireland”.

Home Rule never took effect in Southern Ireland, due to the Irish War of Independence, which resulted instead in the Anglo-Irish Treaty and the establishment in 1922 of the Irish Free State. However, the institutions set up under this Act for Northern Ireland continued to function until they were suspended by the British parliament in 1972 as a consequence of the Troubles. The Parliament of Northern Ireland created under the Act consisted of a majority of Protestants, and there was relative stability for decades.

Anglo-Irish Treaty Ends War of Independence – Divides Ireland – Dec 1921
In December 1921 a treaty was signed by the Irish and British authorities. One of the key provisions of the treaty was a compromise. Ireland was to be divided into Northern Ireland (6 counties) and the Irish Free State (26 counties) which was established in 1922.

As with the Government of Ireland Act of 1920, most signatories of the treaty hoped that the division of the country into the Irish Free State and Northern Ireland would be temporary. However, the division has stayed in place for almost a century.

While a clear level of independence was finally granted to Ireland, the contents of the treaty split Irish public and political opinion.

Instead of bringing peace, the signing of the Treaty plunged the country into civil war.

Civil War – 1922-1923
A Civil War followed from 1922 to 1923 between pro and anti treaty forces, with Collins (pro-treaty) and de Valera (anti-treaty) on opposing sides. The consequences of the Civil war can be seen to this day where the two largest political parties in Ireland have their roots in the opposing sides of the civil war – Fine Gael (pro-treaty) and Fianna Fáil (anti-treaty).

Republic of Ireland – 1937 Constitution – EU Membership – 1973
The 1937 Constitution re-established the state as a fully sovereign state named simply Ireland (Éire); it was formally declared a republic in 1949.
In 1973 Ireland joined the European Economic Community (now the European Union).

Northern Ireland Catholics Rebel – 1968
Stability in Northern Ireland ended in the late 1960s due to systematic discrimination against Catholics. 1968 saw the beginning of Catholic civil rights marches in Northern Ireland. These protests led to violent reactions from some Protestant loyalists and from the police force. What followed was a period known as ‘the Troubles’ when nationalist/republican and loyalist/unionist groups clashed.

In 1969 British troops were sent to maintain order and to protect the Catholic minority. However, the army soon came to be seen as a tool of the Protestant majority by the minority Catholic community.

Bloody Sunday – 1972 and “The Troubles”

This was reinforced by events such as Bloody Sunday in 1972 when British forces opened fire on a Catholic civil rights march in Derry killing 13 people. An escalation of paramilitary violence followed with many atrocities committed by both sides.

Between 1969 and 1998 it is estimated that well over 3,000 people were killed by paramilitary groups on opposing sides of the conflict.

Peace with Belfast Agreement – 1998
The period of ‘the Troubles’ is generally agreed to have finished with the Belfast (or Good Friday) Agreement of April 10th, 1998.

Since 1998 considerable stability and peace have come to Northern Ireland. In 2007 the formerly bitterly opposed Democratic Unionist Party (DUP) and Sinn Féin began to co-operate in government together in Northern Ireland.

20th Century to present day
In the 1980s the Irish economy was in recession and large numbers of people emigrated for employment reasons. Many young people emigrated to the United Kingdom, the United States of America and Australia.
Economic reforms in the 1980s, along with membership of the European Community (now European Union), created one of the world’s highest economic growth rates. Ireland in the 1990s, so long considered a country of emigration, became a country of immigration. This period in Irish history was called the Celtic Tiger.


Bill Gates recommended GRID as one of his five favorite books in 2016. Here is what Business Insider said:

“‘The Grid: The Fraying Wires Between Americans and Our Energy Future’ by Gretchen Bakke

“The Grid” is a perfect example of how Bill Gates thinks about book genres the way Netflix thinks about TV and movies.

“This book, about our aging electrical grid, fits in one of my favorite genres: ‘Books About Mundane Stuff That Are Actually Fascinating,'” he writes.

Growing up in the Seattle area, Gates’ first job was writing software for a company that provided energy to the Pacific Northwest. He learned just how vital power grids are to everyday life, and “The Grid” serves as an important reminder that they really are engineering marvels.

“I think you would also come to see why modernizing the grid is so complex,” he writes, “and so critical for building our clean-energy future.”

My son received it as a Christmas gift, and stayed up all night finishing it. I ordered it the same day he told me.

Finally, a readable history of energy. Why does our grid look as it does?

GRID describes the incredible role that Jimmy Carter played in the creation of the Department of Energy and the passage of two major pieces of legislation, including the National Energy Act.

GRID traces the emergence of the California wind energy industry. According to the author, the industry emerged in spite of bad technology; its growth was traceable instead to enormous tax credits. The federal tax credit was 25%, and California doubled it to 50%. Today Texas and California are by far the largest producers of wind energy in the US.

GRID traces energy from Thomas Edison to Samuel Insull, who was his personal secretary. It was Insull who formulated, and then implemented, an ambitious plan to centralize the nation’s power grid. Until he took over in Chicago, no one had figured out how to create, through government regulation and clever pricing, what today is an effective monopoly. What makes this even more remarkable: the monopolies are largely for-profit.

GRID traces the emergence of energy policy, beginning with President Jimmy Carter.

It includes the Energy Policy Act of 1978 and the Energy Policy Act of 1982.

Postscript: I just read the book a second time, and was struck by its notes at the end, its index, and its general comprehensiveness.

I guess, for me, the big ideas in this book can be boiled down as follows:

LOAD IS DOWN: the planet is rife with innovations that save electricity – and most of them come without burden to the consumer (unlike turning thermostats down, wearing sweaters, etc.). As a result, demand for electricity peaked in 2007 and is unlikely to go higher until at least 2040.

GENERATION IS UP: At the same time, the ways to generate power better are increasing. Solar panels have dropped at least 50% in cost in a decade, while getting more effective. Wind turbines are excellent, and are continuing to improve. Coal generators are being slowly replaced by natural gas. Natural gas plants have desirable properties beyond generation, e.g. they can start up quickly and can come down quickly.

GENERATION IS BECOMING MORE RESILIENT AND MORE DISTRIBUTED. After a decade of blackouts largely traceable to storms and poor line maintenance, the push is on for resilience, and it is working. The means to resilience is distributed generation (DG), which will ultimately prove very beneficial. However, because of regulatory roadblocks, perverse incentives, and a host of other complexities, it will be some time before the benefits of resilient DG are fully realized.

PREDICTING LOAD IS IMPROVING: Predicting load in five-minute increments is improving. Smart meters and smart algorithms make it entirely plausible to predict load well 24 hours ahead, and extremely well 4 hours ahead.

PREDICTING GENERATION IS IMPROVING: the book tells horror stories about DG increasing instability and unpredictability. How can a utility plan for a surge due to a scorching sun? A big breeze? I find these horror stories suggestive of where this dysfunction will all end up, namely: prediction will improve dramatically through better weather forecasting and better detailed knowledge of all contributing generators.

A NEW MATCHING OF LOAD TO GENERATION IS VISIBLE. For all the horror stories, I think the future looks bright because matching predictable load to predictable generation is doable today, and will become a norm in the future once all the roadblocks are removed.

ASYNCHRONOUS POWER IS ALMOST HERE. Just as email is asynchronous while telephony is synchronous, electricity has always been a synchronous technology – because there has never been a way of storing it. The world is moving fast toward asynchronous power because of batteries. When this happens, the world is going to change very fast.

TIME OF DAY PRICING WILL ACCELERATE ALL CHANGES. I am shocked at how pathetic time-of-day pricing is. It’s ubiquitous – but pathetic. Once time-of-day pricing sends market signals that discourage peak power use, managers will take increasing advantage of using power (load) when it is cheapest and avoiding it when it is most expensive, and then we will begin to see thousands of innovative solutions for accomplishing this very simple goal.

The Reality Project

Here is an amazing compilation of a certain category of world events, by Leon Newton. Newton obviously has a POV, which can be independently understood. But what I find amazing is the diligence of the compilation.

Thanks to my friend Usman Mirza for letting me know about this.

Leon explains his work below:

“The Messengers” is a compilation of the works of many critical thinkers over the years who have tried to explain existing conditions or to warn us of future conditions if we continue to do what we are doing.

The Reality Project File
Messages in History

“2015 Headlines” contains articles, videos and audio presentations from around the world. It is different from most other news sources in that it includes Energy as a separate category. At the bottom of the “Headlines” sheet are four videos (selected from “The Messengers”) which, I believe describe our present economic predicament.

Headlines in history

“The Reality Project Power Flow Chart” is my interpretation of how C. Wright Mills described the roles of the Power Elite, corporations, and governmental entities. It also includes many of the entities that were created by the Powell Memo, including the propaganda machine that has so divided this country.

The Reality Project Flow Chart

History of Computing


Eckert and Mauchly develop UNIVAC, the first commercially marketed computer. It is used to compile the results of the U.S. census, marking the first time this census is handled by a programmable computer.
In his paper “Computing Machinery and Intelligence,” Alan Turing presents the Turing Test, a means for determining whether a machine is intelligent.
Commercial color television is first broadcast in the United States, and transcontinental black-and-white television is available within the next year.
Claude Elwood Shannon writes “Programming a Computer for Playing Chess,” published in Philosophical Magazine.
Eckert and Mauchly build EDVAC, which is the first computer to use the stored-program concept. The work takes place at the Moore School at the University of Pennsylvania.
Paris is the host to a Cybernetics Congress.
UNIVAC, used by the Columbia Broadcasting System (CBS) television network, successfully predicts the election of Dwight D. Eisenhower as president of the United States.
Pocket-sized transistor radios are introduced.
Nathaniel Rochester designs the 701, IBM’s first production-line electronic digital computer. It is marketed for scientific use.
The chemical structure of the DNA molecule is discovered by James D. Watson and Francis H. C. Crick.
Philosophical Investigations by Ludwig Wittgenstein and Waiting for Godot, a play by Samuel Beckett, are published. Both documents are considered of major importance to modern existentialism.
Marvin Minsky and John McCarthy get summer jobs at Bell Laboratories.
William Shockley’s Semiconductor Laboratory is founded, thereby starting Silicon Valley.
The Remington Rand Corporation and Sperry Gyroscope join forces and become the Sperry-Rand Corporation. For a time, it presents serious competition to IBM.
IBM introduces its first transistor calculator. It uses 2,200 transistors instead of the 1,200 vacuum tubes that would otherwise be required for equivalent computing power.
A U.S. company develops the first design for a robotlike machine to be used in industry.
IPL-II, the first artificial intelligence language, is created by Allen Newell, J. C. Shaw, and Herbert Simon.
The new space program and the U.S. military recognize the importance of having computers with enough power to launch rockets to the moon and missiles through the stratosphere. Both organizations supply major funding for research.
The Logic Theorist, which uses recursive search techniques to solve mathematical problems, is developed by Allen Newell, J. C. Shaw, and Herbert Simon.
John Backus and a team at IBM invent FORTRAN, the first scientific computer-programming language.
Stanislaw Ulam develops MANIAC I, the first computer program to beat a human being in a chess game.
The first commercial watch to run on electric batteries is presented by the Lip company of France.
The term Artificial Intelligence is coined at a computer conference at Dartmouth College.
Kenneth H. Olsen founds Digital Equipment Corporation.
The General Problem Solver, which uses recursive search to solve problems, is developed by Allen Newell, J. C. Shaw, and Herbert Simon.
Noam Chomsky writes Syntactic Structures, in which he seriously considers the computation required for natural-language understanding. This is the first of the many important works that will earn him the title Father of Modern Linguistics.
An integrated circuit is created by Texas Instruments’ Jack St. Clair Kilby.
The Artificial Intelligence Laboratory at the Massachusetts Institute of Technology is founded by John McCarthy and Marvin Minsky.
Allen Newell and Herbert Simon make the prediction that a digital computer will be the world’s chess champion within ten years.
LISP, an early AI language, is developed by John McCarthy.
The Defense Advanced Research Projects Agency, which will fund important computer-science research for years in the future, is established.
Seymour Cray builds the Control Data Corporation 1604, the first fully transistorized supercomputer.
Jack Kilby and Robert Noyce each develop the computer chip independently. The computer chip leads to the development of much cheaper and smaller computers.
Arthur Samuel completes his study in machine learning. The project, a checkers-playing program, performs as well as some of the best players of the time.
Electronic document preparation increases the consumption of paper in the United States. This year, the nation will consume 7 million tons of paper. In 1986, 22 million tons will be used. American businesses alone will use 850 billion pages in 1981, 2.5 trillion pages in 1986, and 4 trillion in 1990.
COBOL, a computer language designed for business use, is developed by Grace Murray Hopper, who was also one of the first programmers of the Mark I.
Xerox introduces the first commercial copier.
Theodore Harold Maiman develops the first laser. It uses a ruby cylinder.
The recently established Defense Department’s Advanced Research Projects Agency substantially increases its funding for computer research.
There are now about six thousand computers in operation in the United States.
Neural-net machines are quite simple and incorporate a small number of neurons organized in only one or two layers. These models are shown to be limited in their capabilities.
The first time-sharing computer is developed at MIT.
President John F. Kennedy provides the support for space project Apollo and inspiration for important research in computer science when he addresses a joint session of Congress, saying, “I believe we should go to the moon.”
The world’s first industrial robots are marketed by a U.S. company.
Frank Rosenblatt defines the Perceptron in his Principles of Neurodynamics. Rosenblatt first introduced the Perceptron, a simple processing element for neural networks, at a conference in 1959.
The Artificial Intelligence Laboratory at Stanford University is founded by John McCarthy.
The influential Steps Toward Artificial Intelligence by Marvin Minsky is published.
Digital Equipment Corporation announces the PDP-8, which is the first successful minicomputer.
IBM introduces its 360 series, thereby further strengthening its leadership in the computer industry.
Thomas E. Kurtz and John G. Kenny of Dartmouth College invent BASIC (Beginner’s All-purpose Symbolic Instruction Code).
Daniel Bobrow completes his doctoral work on Student, a natural-language program that can solve high-school-level word problems in algebra.
Gordon Moore’s prediction, made this year, says integrated circuits will double in complexity each year. This will become known as Moore’s Law and prove true (with later revisions) for decades to come.
Marshall McLuhan, via his Understanding Media, foresees the potential for electronic media, especially television, to create a “global village” in which “the medium is the message.”
The Robotics Institute at Carnegie Mellon University, which will become a leading research center for AI, is founded by Raj Reddy.
Hubert Dreyfus presents a set of philosophical arguments against the possibility of artificial intelligence in a RAND corporate memo entitled “Alchemy and Artificial Intelligence.”
Herbert Simon predicts that by 1985 “machines will be capable of doing any work a man can do.”
The Amateur Computer Society, possibly the first personal computer club, is founded by Stephen B. Gray. The Amateur Computer Society Newsletter is one of the first magazines about computers.
The first internal pacemaker is developed by Medtronics. It uses integrated circuits.
Gordon Moore and Robert Noyce found Intel (Integrated Electronics) Corporation.
The idea of a computer that can see, speak, hear, and think sparks imaginations when HAL is presented in the film 2001: A Space Odyssey, by Arthur C. Clarke and Stanley Kubrick.
Marvin Minsky and Seymour Papert present the limitation of single-layer neural nets in their book Perceptrons. The book’s pivotal theorem shows that a Perceptron is unable to determine if a line drawing is fully connected. The book essentially halts funding for neural-net research.
The GNP, on a per capita basis and in constant 1958 dollars, is $3,500, or more than six times as much as a century before.
The floppy disc is introduced for storing data in computers.
c. 1970
Researchers at the Xerox Palo Alto Research Center (PARC) develop the first personal computer, called Alto. PARC’s Alto pioneers the use of bit-mapped graphics, windows, icons, and mouse pointing devices.
Terry Winograd completes his landmark thesis on SHRDLU, a natural-language system that exhibits diverse intelligent behavior in the small world of children’s blocks. SHRDLU is criticized, however, for its lack of generality.
The Intel 4004, the first microprocessor, is introduced by Intel.
The first pocket calculator is introduced. It can add, subtract, multiply, and divide.
Continuing his criticism of the capabilities of AI, Hubert Dreyfus publishes What Computers Can’t Do, in which he argues that symbol manipulation cannot be the basis of human intelligence.
Stanley H. Cohen and Herbert W. Boyer show that DNA strands can be cut, joined, and then reproduced by inserting them into the bacterium Escherichia coli. This work creates the foundation for genetic engineering.
Creative Computing starts publication. It is the first magazine for home computer hobbyists.
The 8-bit 8080, which is the first general-purpose microprocessor, is announced by Intel.
Sales of microcomputers in the United States reach more than five thousand, and the first personal computer, the Altair 8800, is introduced. It has 256 bytes of memory.
BYTE, the first widely distributed computer magazine, is published.
Gordon Moore revises his observation on the doubling rate of transistors on an integrated circuit from twelve months to twenty-four months.
Kurzweil Computer Products introduces the Kurzweil Reading Machine (KRM), the first print-to-speech reading machine for the blind. Based on the first omni-font (any font) optical character recognition (OCR) technology, the KRM scans and reads aloud any printed materials (books, magazines, typed documents).
Stephen G. Wozniak and Steven P. Jobs found Apple Computer Corporation.
The concept of true-to-life robots with convincing human emotions is imaginatively portrayed in the film Star Wars.
For the first time, a telephone company conducts large-scale experiments with fiber optics in a telephone system.
The Apple II, the first personal computer to be sold in assembled form and the first with color graphics capability, is introduced and successfully marketed. (JCR buys first Apple II at KO in 1978.)
Speak & Spell, a computerized learning aid for young children, is introduced by Texas Instruments. This is the first product that electronically duplicates the human vocal tract on a chip.
In a landmark study by nine researchers published in the Journal of the American Medical Association, the performance of the computer program MYCIN is compared with that of doctors in diagnosing ten test cases of meningitis. MYCIN does at least as well as the medical experts. The potential of expert systems in medicine becomes widely recognized.
Dan Bricklin and Bob Frankston establish the personal computer as a serious business tool when they develop VisiCalc, the first electronic spreadsheet.
AI industry revenue is a few million dollars this year.
As neuron models are becoming potentially more sophisticated, the neural network paradigm begins to make a comeback, and networks with multiple layers are commonly used.
Xerox introduces the Star Computer, thus launching the concept of desktop publishing. Apple’s LaserWriter, available in 1985, will further increase the viability of this inexpensive and efficient way for writers and artists to create their own finished documents.
IBM introduces its Personal Computer (PC).
The prototype of the Bubble Jet printer is presented by Canon.
Compact disc players are marketed for the first time.
Mitch Kapor presents Lotus 1-2-3, an enormously popular spreadsheet program.
Fax machines are fast becoming a necessity in the business world.
The Musical Instrument Digital Interface (MIDI) is presented in Los Angeles at the first North American Music Manufacturers show.
Six million personal computers are sold in the United States.
The Apple Macintosh introduces the “desktop metaphor,” pioneered at Xerox, including bit-mapped graphics, icons, and the mouse.
William Gibson uses the term cyberspace in his book Neuromancer.
The Kurzweil 250 (K250) synthesizer, considered to be the first electronic instrument to successfully emulate the sounds of acoustic instruments, is introduced to the market.
Marvin Minsky publishes The Society of Mind, in which he presents a theory of the mind where intelligence is seen to be the result of proper organization of a hierarchy of minds with simple mechanisms at the lowest level of the hierarchy.
MIT’s Media Laboratory is founded by Jerome Weisner and Nicholas Negroponte. The lab is dedicated to researching possible applications and interactions of computer science, sociology, and artificial intelligence in the context of media technology.
There are 116 million jobs in the United States, compared to 12 million in 1870. In the same period, the number of those employed has grown from 31 percent to 48 percent, and the per capita GNP in constant dollars has increased by 600 percent. These trends show no signs of abating.
Electronic keyboards account for 55.2 percent of the American musical keyboard market, up from 9.5 percent in 1980.
Life expectancy is about 74 years in the United States. Only 3 percent of the American workforce is involved in the production of food. Fully 76 percent of American adults have high-school diplomas, and 7.3 million U.S. students are enrolled in college.
NYSE stocks have their greatest single-day loss due, in part, to computerized trading.
Current speech systems can provide any one of the following: a large vocabulary, continuous speech recognition, or speaker independence.
Robotic-vision systems are now a $300 million industry and will grow to $800 million by 1990.
Computer memory today costs only one hundred millionth of what it did in 1950.
Marvin Minsky and Seymour Papert publish a revised edition of Perceptrons in which they discuss recent developments in neural network machinery for intelligence.
In the United States, 4,700,000 microcomputers, 120,000 minicomputers, and 11,500 mainframes are sold this year.
W. Daniel Hillis’s Connection Machine is capable of 65,536 computations at the same time.
Notebook computers are replacing the bigger laptops in popularity.
Intel introduces the 16-megahertz (MHz) 80386SX, 2.5 MIPS microprocessor.
Nautilus, the first CD-ROM magazine, is published.
The development of HyperText Markup Language by researcher Tim Berners-Lee and its release by CERN, the high-energy physics laboratory in Geneva, Switzerland, leads to the conception of the World Wide Web.
Cell phones and e-mail are increasing in popularity as business and personal communication tools.
The first double-speed CD-ROM drive becomes available from NEC.
The first personal digital assistant (PDA), a hand-held computer, is introduced at the Consumer Electronics Show in Chicago. The developer is Apple Computer.
The Pentium 32-bit microprocessor is launched by Intel. This chip has 3.1 million transistors.
The World Wide Web emerges.
America Online now has more than 1 million subscribers.
Scanners and CD-ROMs are becoming widely used.
Digital Equipment Corporation introduces a 300-MHz version of the Alpha AXP processor that executes 1 billion instructions per second.
Compaq Computer and NEC Computer Systems ship hand-held computers running Windows CE.
NEC Electronics ships the R4101 processor for personal digital assistants. It includes a touch-screen interface.
Deep Blue defeats Garry Kasparov, the world chess champion, in a regulation tournament.
Dragon Systems introduces Naturally Speaking, the first continuous-speech dictation software product.
Video phones are being used in business settings.
Face-recognition systems are beginning to be used in payroll check-cashing machines.
The Dictation Division of Lernout & Hauspie Speech Products (formerly Kurzweil Applied Intelligence) introduces Voice Xpress Plus, the first continuous-speech-recognition program with the ability to understand natural-language commands.
Routine business transactions over the phone are beginning to be conducted between a human customer and an automated system that engages in a verbal dialogue with the customer (e.g., United Airlines reservations).
Investment funds are emerging that use evolutionary algorithms and neural nets to make investment decisions (e.g., Advanced Investment Technologies).
The World Wide Web is ubiquitous. It is routine for high-school students and local grocery stores to have web sites.
Automated personalities, which appear as animated faces that speak with realistic mouth movements and facial expressions, are working in laboratories. These personalities respond to the spoken statements and facial expressions of their human users. They are being developed to be used in future user interfaces for products and services, as personalized research and business assistants, and to conduct transactions.
Microvision’s Virtual Retina Display (VRD) projects images directly onto the user’s retinas. Although expensive, consumer versions are projected for 1999.
“Bluetooth” technology is being developed for “body” local area networks (LANs) and for wireless communication between personal computers and associated peripherals. Wireless communication is being developed for high-bandwidth connection to the Web.
Ray Kurzweil’s The Age of Spiritual Machines: When Computers Exceed Human Intelligence is published, available at your local bookstore!


2009
A $1,000 personal computer can perform about a trillion calculations per second.
Personal computers with high-resolution visual displays come in a range of sizes, from those small enough to be embedded in clothing and jewelry up to the size of a thin book.
Cables are disappearing. Communication between components uses short-distance wireless technology. High-speed wireless communication provides access to the Web.
The majority of text is created using continuous speech recognition. Also ubiquitous are language user interfaces (LUIs).
Most routine business transactions (purchases, travel, reservations) take place between a human and a virtual personality. Often, the virtual personality includes an animated visual presence that looks like a human face.
Although traditional classroom organization is still common, intelligent courseware has emerged as a common means of learning.
Pocket-sized reading machines for the blind and visually impaired, “listening machines” (speech-to-text conversion) for the deaf, and computer-controlled orthotic devices for paraplegic individuals result in a growing perception that primary disabilities do not necessarily impart handicaps.
Translating telephones (speech-to-speech language translation) are commonly used for many language pairs.
Accelerating returns from the advance of computer technology have resulted in continued economic expansion. Price deflation, which had been a reality in the computer field during the twentieth century, is now occurring outside the computer field. The reason for this is that virtually all economic sectors are deeply affected by the accelerating improvement in the price performance of computing.
Human musicians routinely jam with cybernetic musicians.
Bioengineered treatments for cancer and heart disease have greatly reduced the mortality from these diseases.
The neo-Luddite movement is growing.

2019

A $1,000 computing device (in 1999 dollars) is now approximately equal to the computational ability of the human brain.
Computers are now largely invisible and are embedded everywhere -- in walls, tables, chairs, desks, clothing, jewelry, and bodies.
Three-dimensional virtual reality displays, embedded in glasses and contact lenses, as well as auditory "lenses," are used routinely as primary interfaces for communication with other persons, computers, the Web, and virtual reality.
Most interaction with computing is through gestures and two-way natural-language spoken communication.
Nanoengineered machines are beginning to be applied to manufacturing and process-control applications.
High-resolution, three-dimensional visual and auditory virtual reality and realistic all-encompassing tactile environments enable people to do virtually anything with anybody, regardless of physical proximity.
Paper books and documents are rarely used, and most learning is conducted through intelligent, simulated software-based teachers.
Blind persons routinely use eyeglass-mounted reading-navigation systems. Deaf persons read what other people are saying through their lens displays.
Paraplegic and some quadriplegic persons routinely walk and climb stairs through a combination of computer-controlled nerve stimulation and exoskeletal robotic devices.
The vast majority of transactions include a simulated person.
Automated driving systems are now installed in most roads.
People are beginning to have relationships with automated personalities and use them as companions, teachers, caretakers, and lovers.
Virtual artists, with their own reputations, are emerging in all of the arts.
There are widespread reports of computers passing the Turing Test, although these tests do not meet the criteria established by knowledgeable observers.

2029

A $1,000 (in 1999 dollars) unit of computation has the computing capacity of approximately 1,000 human brains.
Permanent or removable implants (similar to contact lenses) for the eyes, as well as cochlear implants, are now used to provide input and output between the human user and the worldwide computing network.
Direct neural pathways have been perfected for high-bandwidth connection to the human brain. A range of neural implants is becoming available to enhance visual and auditory perception and interpretation, memory, and reasoning.
Automated agents are now learning on their own, and significant knowledge is being created by machines with little or no human intervention.
Computers have read all available human- and machine-generated literature and multimedia material.
There is widespread use of all-encompassing visual, auditory, and tactile communication using direct neural connections, allowing virtual reality to take place without having to be in a "total touch enclosure."
The majority of communication does not involve a human. The majority of communication involving a human is between a human and a machine.
There is almost no human employment in production, agriculture, or transportation. Basic life needs are available for the vast majority of the human race.
There is a growing discussion about the legal rights of computers and what constitutes being "human."
Although computers routinely pass apparently valid forms of the Turing Test, controversy persists about whether or not machine intelligence equals human intelligence in all of its diversity.
Machines claim to be conscious. These claims are largely accepted.

2049

The common use of nanoproduced food, which has the correct nutritional composition and the same taste and texture as organically produced food, means that the availability of food is no longer affected by limited resources, bad crop weather, or spoilage.
Nanobot swarm projections are used to create visual-auditory-tactile projections of people and objects in real reality.

2072

Picoengineering (developing technology at the scale of picometers, or trillionths of a meter) becomes practical.

By the year 2099

There is a strong trend toward a merger of human thinking with the world of machine intelligence that the human species initially created.
There is no longer any clear distinction between humans and computers.
Most conscious entities do not have a permanent physical presence.
Machine-based intelligences derived from extended models of human intelligence claim to be human, although their brains are not based on carbon-based cellular processes, but rather electronic and photonic equivalents. Most of these intelligences are not tied to a specific computational processing unit.
The number of software-based humans vastly exceeds those still using native neuron-cell-based computation.
Even among those human intelligences still using carbon-based neurons, there is ubiquitous use of neural-implant technology, which provides enormous augmentation of human perceptual and cognitive abilities. Humans who do not utilize such implants are unable to participate meaningfully in dialogues with those who do.
Because most information is published using standard assimilated knowledge protocols, information can be instantly understood. The goal of education, and of intelligent beings, is discovering new knowledge to learn.
Femtoengineering (engineering at the scale of femtometers, or one thousandth of a trillionth of a meter) proposals are controversial.
Life expectancy is no longer a viable term in relation to intelligent beings.

Some many millenniums hence . . .

Intelligent beings consider the fate of the Universe.
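The timeline's quantitative claims imply a remarkably steady exponential: $1,000 buys about a trillion (10^12) calculations per second in 2009, roughly one human brain's worth in 2019, and about 1,000 brains' worth in 2029. A short sketch can recover the doubling time those data points imply. Note the ~2×10^16 calculations-per-second figure for one human brain is an assumption (Kurzweil's commonly cited estimate), not stated in this excerpt:

```python
import math

def doubling_time(years_elapsed, growth_factor):
    """Years per doubling implied by a given growth factor over an interval."""
    return years_elapsed / math.log2(growth_factor)

BRAIN = 2e16  # assumed calc/s for one human brain (Kurzweil's estimate; not in this excerpt)

# 2009 -> 2019: from ~1e12 calc/s per $1,000 to ~one brain
print(f"{doubling_time(10, BRAIN / 1e12):.2f} years per doubling")  # ~0.70

# 2019 -> 2029: from one brain to "approximately 1,000 human brains"
print(f"{doubling_time(10, 1000):.2f} years per doubling")  # ~1.00
```

Under these assumptions, both decades imply capacity per dollar doubling roughly every year, consistent with the "accelerating returns" the text describes as outpacing classic Moore's-law doubling rates.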