Dave Winer blogged about how good it would be if students and journalists ran their own servers: http://dave.smallpict.com/2013/08/26/studentsCanRunTheirOwnServers. In part this is such a good idea because of a new wrinkle in the state of things: the government's massive and oppressive surveillance on its own citizens. In part it is a good idea due to the old notions of self-reliance: running your own server today is like maintaining your own car engine in 1938. In comments, Dave was lauded for the proposal and a wish was expressed of a box that would handle the tough stuff for you: you would customize it but it would give you the basic hardware and allow you to proceed from there (rather like buying a car in 1938 before you change your own oil, tune it up, and maintain it).

There are in fact a couple of projects underway to realize something like this. One was proposed by Eben Moglen, the Columbia University law professor. His idea was a server based on the SheevaPlug model that would operate in a mesh network and guarantee (insofar as possible) one's privacy and anonymity in the face of oppressive governments. This was hailed, work was begun, and today you can see how far they have gotten at https://www.freedomboxfoundation.org.

There is also a variant of this work based on the Debian Linux distribution, at https://wiki.debian.org/FreedomBox.

08/27/13; 11:56:07 AM

Nate Hoffelder at The Digital Reader writes up another writer's consideration of the natural death cycle of various physical media. He finishes with a couple of bizarre conclusions:

  1. ``So in conclusion, I’m going to go against my source and predict an indefinite lifespan for current ebook formats (barring some unpredictable random occurrence).''

  2. ``This is one of those times where it is safe to bet on proprietary over open formats.''

Mr Hoffelder is usually sharper than this. His #1 runs counter to everything we know about the digital era, as does his #2. More on the second notion:

Kinds of ebooks

  • An ebook (the field Mr Hoffelder covers) is a digital file. Such files fall along a range from proprietary to open formats.

  • 1. Binary Blobs These are the most proprietary. They can be opened and read only with software that understands the file format, and the reader software itself is usually also proprietary. To read this format 100 years from now, you will need a computer and the hardware and software to read whatever physical media the file lives on (these latter will be assumed common to all ebook formats). But you will also need the software that can open and display the contents of the binary blob.

  • 2. Compressed and Encrypted These are compressed (maybe by proprietary compression algorithms, or by standard compression algorithms) and encrypted. These files are almost as proprietary as the binary blobs. To read them you must have software that can uncompress the file and software that will pass the proper key to decrypt it. Usually a proprietary file such as this will give readers one tool that will do both these jobs.

  • 3. Compressed but Not Encrypted These are compressed (again, by either proprietary or standard compression algorithms) but not encrypted. All you need is to be able to uncompress them.

  • 4. Not Compressed but Encrypted These are not compressed, but they are encrypted, so you will need some sort of software that will let you enter a key to unlock the book contained within the encryption wrapper.

  • 5. Neither Compressed Nor Encrypted These are just text, with some degree of markup, of which there are a number of levels:

    • a. Heavy Markup Open Document Format and Rich Text Format are both markup schemes that are very verbose: in order to find the text content of the book, you must wade through a sea of markup language after traversing a continent of header information. Usually, though, you will use software that understands and interprets the markup jungle employed.

    • b. Medium Markup Medium markup might be HTML, whose header is only an islet, not a continent, and whose markup of the content is mere puddles, easily skipped over. This is the densest, most heavily marked-up format that you could read as raw text after a few hours' (at most) study of the markup language used.

    • c. Light Markup Light or minimal markup schemes -- the best-known and most widely used now seems to be John Gruber's Markdown -- use the least amount of markup to render the text, and are generally designed so that you could read the raw text with little or no knowledge of the markup language; the intention behind such schemes is that the markup be almost self-explanatory, though some notions have evolved from conventions of email that a reader 100 years from now might need help understanding.

    • d. No Markup -- Text Only These are plain text: seven-bit ASCII, or 8-, 16-, or 32-bit encodings. Seven-bit ASCII will need the least amount of software to decode, and higher-bit encodings only a little more. Such files might declare their encoding at the start of the file, as HTML files are encouraged to do.
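Item (d) above can be made concrete with a short sketch (Python here, purely for illustration): the narrower the encoding, the less machinery a future reader needs to decode it.

```python
# Seven-bit ASCII needs the least software to decode; wider encodings
# such as UTF-16 need a little more. The byte-order mark (BOM) at the
# start of a UTF-16 file plays the role of the declared encoding
# mentioned above.
ascii_bytes = "plain text".encode("ascii")    # one byte per character
utf16_bytes = "plain text".encode("utf-16")   # 16-bit units, BOM prefix

assert ascii_bytes.decode("ascii") == "plain text"
assert utf16_bytes.decode("utf-16") == "plain text"
assert utf16_bytes[:2] in (b"\xff\xfe", b"\xfe\xff")   # the BOM
```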

Every ebook that comprises what was called a `book' 60 years ago -- i.e., text -- will contain text. This text is the content and the heart of the ebook onion. Around that text is a wrapper of markup, more or less verbose. And that markup-wrapped text is then either encrypted or compressed, or both, as further wrappings. And this may then be blobbed into a binary file format.
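The onion just described can be sketched in a few lines of code. This is a toy model only: the XOR cipher below is an assumption standing in for real DRM, not any vendor's actual scheme, and the markup is a made-up wrapper.

```python
# A toy model of the ebook "onion": text, wrapped in markup, then
# compressed, then "encrypted." The XOR cipher is a stand-in, not any
# vendor's actual DRM scheme.
import zlib

text = "It was a dark and stormy night."
marked_up = "<p>" + text + "</p>"                       # markup wrapper
compressed = zlib.compress(marked_up.encode("utf-8"))   # compression wrapper
key = 0x5A
encrypted = bytes(b ^ key for b in compressed)          # encryption wrapper

# Reading the book means peeling every layer in reverse order:
decrypted = bytes(b ^ key for b in encrypted)
recovered = zlib.decompress(decrypted).decode("utf-8")
assert recovered == marked_up   # and at the heart of it all: plain text
```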

Therefore, in order to read any ebook in a proprietary format, all the succeeding, interior layers of the ebook onion must also be readable.

In other words, to read any proprietary format ebook, you must be able to read text.

Text is the only format that will die only when the physical media the file is etched on, and all methods of reading data off that media, crumble away.

08/14/13; 11:35:05 AM

There is something missing in most analyses of how the newspaper industry has faltered over the past 20 years in the US.

Let's go back to the 1950s. Television came with lots of entertainment and news, all for `free' once you bought your set and antenna and hooked it up within range of a local station. By contrast, movies cost you every time you wanted to see one -- plus you had to leave home and reach a moviehouse (and for the growing numbers of people who moved to the suburbs, this meant driving to the city and paying for parking). Television grew and movies failed. Television brought in more and more money and profits. These profits could be sunk into (relatively) unprofitable arms like the networks' and local channels' news departments. Watching the Nightly News was free and what is more, it was effortless, compared to poring through all the sections of the evening newspaper.

The evening dailies fell and were largely gone by when? -- early 1970s or so?

Once the evening dailies closed up, the morning dailies had a monopoly grip on their market. In the major metro areas, enough people still cared to read papers to support 2 or more dailies, but in the smaller markets, one newspaper and only one remained.

In the olden days of multiple dailies, newspapers competed with one another. In the continuing give-and-take of this competition, newspapers were always looking for ways to please their readers and peddle more papers. This was a matter not only of getting more-popular features, but of always studying and learning how better to compete.

But once a paper got its market all to itself, it no longer had to compete head-to-head with other papers. Instead the competition turned slantwise, against radio and television news. And slantwise competition is very different from head-to-head competition. In slantwise competition you simply focus on what makes your offering different from the competition. Hone your forte and let the others hone theirs. And in the case of television news, which had replaced the evening papers, the two competitors could compete on 2 fronts primarily:

    1. time difference
    2. space difference

Television gave a quick rundown of what happened, earlier today. Newspapers, with more space, gave a more detailed rundown of what happened, yesterday.

  • (There remains one other distinction between the 2 media: television wants, and even needs, to get everything on camera; the news heads on television news disdain what they cannot film. Print news, on the other hand, makes no such distinctions: everything there is words alone. This makes it possible for newspapers to report on what has not and cannot be filmed. Investigations are far easier for print media to cover than they are for television.)

The monopoly newspapers unlearned how to compete -- in general. They stood alone and wielded outsize influence in their markets and their states, playing kingmaker. And all the extra features in a newspaper -- the sections that had once been highly regarded as the focus of a paper's competitive edge -- were allowed to go stale. No longer was the keen humorist a highly-prized addition to a paper's staff. The comics section was printed smaller, smaller, smaller. This meant that the continuing comic, the soap opera and adventure strip, fell out of favor, and the simple gag strip dominated.

When the internet came along and offered news with the immediacy of live radio and tv, but in print with all the aspects of the newspaper, the monopoly papers let it happen. Many wished the future away. Others proudly boasted of their tradition going back a century or more and proclaimed ``We will always be here because we always have been here.''

They didn't compete with the internet; they no longer knew how.

They still don't know how.

08/14/13; 09:49:21 AM

Think of the last time a London department store got bombed.

Last week? Last month? Last year?

These bombings were pretty common in the 1980s. The rebels in Northern Ireland were responsible. They were attacking the British Empire at its heart, with the hope that the bombings could turn the British public against the lingering Imperial occupation of Ireland.

In the 1990s, after the intransigent Thatcher government was voted out, the British Empire entered into more relaxed negotiations with the rebels (aided by President Clinton's government here, which acted in some ways as intermediary). Though the Empire still occupies Northern Ireland, power-sharing has reached that part of Ireland, and the bombings have stopped.

Now consider the price the American people have been paying for supporting the American Empire -- massive, debt-funded warfare and global imperial war bases, mass domestic spying and surveillance, being groped and x-rayed at places of mass transport, the militarization of their police forces ... not to mention the occasional bombing and attempted bombing perpetrated by global rebels against our empire.

These are the price we must pay, we are told ... but, pay for what?

  • For security. For peace. For world domination?

Most Americans would recoil at this last. We little like to think of ourselves as aggressors, as conquerors, as occupiers, as the Evil Empire George Lucas and President Reagan warned us against. But if you look at the facts, objectively, what do you see?

The price for liberty is high, our leaders tell us. In fact, they tell us, the price for liberty is nothing less than liberty itself!

Wow. Nothing less than the total collapse of education in America (and a truly massive government campaign of propaganda) could get ``most of the people'' believing that ``most of the time.''

The answer to terrorism is simple. And as a bonus, its costs are negative -- it will in fact save the Americans money.

End the empire. Give over plans of world domination. Close the global war bases. End the wars, which number more than even Secretary of Defense Panetta can count.

Leave them alone and, just as the British Empire found out 20 years ago, they will then leave us alone.

08/13/13; 11:48:59 AM

Apple Inc. (formerly `Apple Computer, Inc.') has struggled with the cloud in various endeavors such as MobileMe and iCloud, but it seems to me that Apple would do better to return to its original effort in this area: the iDisk.

Give everyone with an Apple device and iTunes ID webspace and a web page. Then Apple could offer HTML apps that would work across every platform and on every device. Provide a basic API that would allow Mac apps and iOS apps to save to and draw from the same storage pool the web page does. Provide a free level of storage and lease out more storage as the user wants.

Don't try to be Facebook, don't try to be Twitter. But a Dropbox Apple could be.

08/13/13; 08:03:06 AM

On 11 Aug 2013, I wrote up a `working definition' of Film Noir:

  • the world is darker than you think

There are implications in the word `darker' here.

    1. darker means you are moving out of a lighter place
    2. there is a limit to how dark `dark' can get
    3. people get uncomfortable feeling uncomfortable feelings -- such as fear, anxiety, doubt, loathing
    4. when you get as dark as dark can get, you can only stay there or begin to go back toward the light
    5. when you are sated with fear, anxiety, doubt, loathing et al., you move naturally, even without willing to, back toward more comfortable feelings such as security, safety, confidence, love
    6. when you are sick of sharing depravity, you willingly seek out goodness and health

Three unique things came to American popular culture, and through it to Hollywood commercial movies, in this time period:

    1. Freudian psychoanalysis
    2. the World War and what American military men (and perhaps especially teenaged men) saw and did during the years of combat
    3. the disruption to society as the government took almost the entire male workforce into the military and replaced them with women

Basically, Americans lost their innocence as they confronted questions such as basic identity (wife or welder? man or murderer? provider or provided-for?), saw their world and its assumptions turned on its head, and were asked to ponder ideas they never had before about motivation, morals, and mental `illness.'

So, film noir both soothed and aggravated these deep anxieties.

Film noir sprang from this loss of innocence. It cannot occur to a generation that was never innocent to begin with. The modern (post-1970) dark films, the neo-noirs, are more homage to past styles and tribute to and partial send-up of the attitudes that underlay the noir years of the 1940s and 1950s.

Adding to this is the whole subject of technology and style. Hollywood moved from black-and-white film stocks to color, and film grains shrank as stocks grew faster, capable of registering images in less and less light. All these technical changes made the rich, stark chiaroscuro possible in a 1946 black-and-white film very difficult, even out of the question, in a 1963 color film. Though it might be possible in a digital age to recreate the characteristics of 1940s panchromatic black-and-white film, doing so would be merely an exercise in style, a deliberate affectation, rather than working with the tools available to explore the problems of the day.

08/13/13; 08:02:42 AM

The term `film noir' is a messy one. This is how I define it.

Context

  • The very term `film noir' came after its first claimed exemplars. It was coined, after the pulp paperback imprint Série Noire, by a French critic when French cinemas flooded with Hollywood movies after the fall of Germany in 1945. Seeing these films all at once, contrasted with memories of sunnier, more hopeful Hollywood productions from the pre-war period, the French were struck by a very evident change in the sensibilities of studio filmmakers.

    • It is important to remember, though, that in Hollywood itself the term `film noir' was not used, and the men and women who made the movies that were later called film noir were unaware they were making any such thing -- that any such thing existed. They were instead working within the standard genres Hollywood had laid down long before, such as Westerns, dramas, crime films, mysteries, musicals, biopics and the like. So in a very real sense, `film noir' is an artificial distinction, an alien term from outside Hollywood.
  • Film noir belongs to an historical period of roughly the 1940s and 1950s. The trend reached its height in the late 1940s, and during the 1950s it gradually ebbed away in the face of more moralistic Westerns and the gray flannel suit era of rising suburbia.

  • The roots of film noir come out of several trends both within and outside of America, including the films of late-1930s France and mid-1920s Germany, the crime novels and pulps of 1930s America, psychoanalysis, events of the war, and the anxieties of big-city living.

  • All these combined to manifest in a sensibility or attitude in American popular culture in all media -- radio, fiction, comic books, and movies.

The sensibility

  • The sensibility or attitude inherent in film noir can be summed up in 7 words: the world is darker than you think.

  • Darkness

    • I am deliberately vague when I use the word `darkness' because the sensibility that lay behind film noir was never defined or sharply outlined at the time, either among those who produced this type of tale or among those who consumed it. More: the trend came upon the public gradually, so that it took an outsider sitting in cinemas in Paris, seeing the movies of the past half-dozen years all at once, even to mark the change that, now we have the term `film noir' to aid our perception, seems so plain to us.

    • Darkness exists in many senses here:

      • dark as in unknown -- and unknowable
      • dark as in mysterious (hence the popularity of mystery tales in the noir era)
      • dark as in inspiring terror or fear or dread
      • dark as in sinful, bad, or evil
      • dark as in night, and the things that pertain to the nightland, including dreams (the opening through which Freudian psychoanalysis crept into the movies)
      • dark as in shadows
      • dark as in death and mourning
      • dark as in the alien, foreign, and strange (akin to the mysterious)

The definition

  • With these thoughts in mind I define film noir then as:

    • The manifestation in commercial Hollywood movies of the general sensibility in American culture between roughly 1940--1960 that the world is darker than you think.
  • `Darker' carries its own implications. Once any depth of `darkness', in whatever form the darkness might take, has been marked and plumbed, later noir tales must take us deeper -- darker. It is not enough, once heroes have been proven to be corrupt, to show us another corrupt hero. He must be more corrupt, or corrupt in some darker sense -- twisted, sick in the head. Once a certain ugliness of aspect has been revealed to us, we need the next one to be darker, and uglier.

  • Such a tendency is common to most artistic movements. The creators strive to capture some sensibility, some aesthetic. They move deeper into its realm, and achieve greater mastery over depicting it. Audiences want more of the frisson, the shock of the new sensation. Until it wears itself out. It can go no farther, can expose no more, can achieve no greater mastery. And a new sensibility takes its place.

08/11/13; 04:13:02 PM

A century ago, the Federal Reserve was instituted in the USA. Since then the nation has experienced a general inflationary trend, broken by periods of recession and depression; on the whole, the greenback has bought less and less. So much so that, according to the official Federal Reserve accounting, $0.03 in 1913 would equal $1.00 today.

  • (And this is according to the official reckonings, which have been tweaked several times over the past 40 years -- always with the effect of diminishing how inflation has been counted. The reality is that inflation has been greater than the official tally, and more likely $0.02, or less, in 1913 dollars would equal a single 2013 dollar.)

Back in 1913, there were ``Dime Novels'' and even ``Nickel Novels'' costing $0.10 and $0.05 respectively.

So it is no wonder that the big new hardback releases now cost $30.00, or $0.90 in 1913 money.

But looking at ebooks, we see that in Amazon's Kindle store, the bottom price for a publisher wishing to accrue 70% of the sale price is $2.99, and the lowest price Amazon allows is $0.99. (Between $0.99 and $2.98, the publisher accrues 35% of the sale price.) Putting these into 1913 dollars gives us $0.10 and $0.03 (again by the official, tweaked, Federal Reserve reckoning of inflation over the past century).
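The conversions above can be checked with a few lines of arithmetic, using the official factor the article cites:

```python
# Back-of-the-envelope check, assuming the official Federal Reserve
# factor that $0.03 in 1913 roughly equals $1.00 in 2013.
FACTOR = 1.00 / 0.03    # 2013 dollars per 1913 dollar, about 33.3

def to_1913(dollars_2013):
    """Convert a 2013 price into (approximate) 1913 dollars."""
    return dollars_2013 / FACTOR

print(round(to_1913(30.00), 2))   # $30 hardback: $0.90 in 1913 money
print(round(to_1913(2.99), 2))    # Kindle 70%-royalty floor: about a dime
print(round(to_1913(0.99), 2))    # Kindle minimum price: about $0.03
```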

Today's ebook is the modern Dime and Nickel Novel.

08/11/13; 09:37:22 AM

The Mac OS (now supplanted by OS X) has always stood apart. It comes from a single company and is designed to be run on that company's hardware. (Like Sun Microsystems, which alas is no more.) Buyers who get Macs in general do so out of personal preference for both sides of the experience -- they like Apple's hardware and they like Apple's OS. They prefer both.

  • (There are a few geeks who prefer Apple's hardware and like OS X for its BSD underpinnings, and a very small number who prefer Apple's hardware as a platform on which to run Microsoft Windows.)

Microsoft Windows came to predominate the PC industry, achieving a de facto monopoly in the 1990s. But though the software came from a single company, the hardware came from any company that either licensed the OS or built compatible hardware on which the OS could be installed (such as whitebox computer makers or home brewers).

MS-Windows was a buyer's only choice when looking at mainstream low-price PC hardware. Many schools and businesses standardized on the Wintel ecosystem in the late 1980s and through the 1990s. Thus many buyers who got Wintel PCs did so not out of a personal preference for MS-Windows but rather for a few other reasons:

    1. It was the cheapest alternative, the only one you could afford
    2. It was the only hardware that would run the application programs you had already bought for your current PC
    3. It was what your school required, or was the only choice that ran the application programs your school required
    4. It was what your office required, or was the only choice that ran the application programs your office required

This meant that there were significant numbers of Wintel PC buyers who were only (grudgingly) buying hardware bundled with MS-Windows because they had no choice, not out of any personal preference for the OS. MS-Windows was just what came with the PC you got; it was the default option, rarely chosen for itself. Any gripes a buyer had with MS-Windows came to the fore, whereas the good parts of the OS were taken for granted.

This state of affairs led a goodly number of geeks to adopt Linux or one of the flavors of BSD (and later, when OpenSolaris was developed and could be freely installed on x86 platforms, Solaris).

On today's mobile platforms, we find a few OS choices:

    1. iOS
    2. Android
    3. Windows Phone
    4. Blackberry
    5. and a few minor alternatives like FirefoxOS, Ubuntu for Mobile, Tizen, et al.

Apple, as on the desktop, pioneered with the modern smartphone and touch-tablet. Apple makes its own iOS to run on its own hardware. Microsoft, quite late to the game, follows its previous model on the desktop, but also makes its own hardware; to date the company has had limited success in winning OEMs to license the Windows Phone OS, and even less success in licensing Windows RT.

Android here is like MS-Windows: Google's Android division writes the OS, and hardware companies license and adapt it for use on their own hardware. But since Google based Android on Linux, the OS is freely available, and an OEM can install almost any flavor of Android (excepting only the stopgap first tablet release of Android 3.0 Honeycomb) on their own hardware.

Android is like Linux. So many `open' geeks like it. But Android is also like MS-Windows in that it is controlled by one company. It also involves a new wrinkle to the game due to the devices being of necessity online: official Android OS releases include Google code that reports back to Google. Google knows who you are, where you are, and what you are doing with your device. And through Google's servers, so does the US Government.

There is another wrinkle here: there is fierce competition and rapid development in the mobile hardware world. So updates to the Android OS might not work on past hardware. More: most mobile hardware that hooks into a telco system is controlled by the telco (Apple is the exception here, and Microsoft is trying to cleave to that side) -- thus whatever version of Android that comes on a smartphone will only be updated at your carrier's desire, and will usually be larded with applications that serve your carrier's interests rather than your own as buyer. And each hardware partner in the Android world seeks to differentiate its offerings, and will add its own layers of user interface and application programs.

I like the look of iOS, and since I use a Mac, it makes sense for me to go iOS in any tablet or smartphone. But this only locks me in further to Apple's world. Just as I installed and played with various Linux distros on my PCs in the past, so I am leaning toward Android tablets. But then, they report to Googleplex...

Not sure, in sum, what I will do.

08/10/13; 08:55:04 AM

Looking at the difference between the capital Greek chi and its lower-case version -- Χ and χ -- I must say I like the look of the lower-case χ better. The upper-case Χ looks too much like a capital X.
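The two characters can be told apart by Unicode code point; a quick sketch (Python, for illustration) distinguishes the Greek chi from the Latin X it resembles:

```python
# Identify the glyphs by their Unicode names and code points.
import unicodedata

cap_chi, small_chi = "\u03a7", "\u03c7"
print(hex(ord(cap_chi)), unicodedata.name(cap_chi))     # GREEK CAPITAL LETTER CHI
print(hex(ord(small_chi)), unicodedata.name(small_chi)) # GREEK SMALL LETTER CHI
print(hex(ord("X")), unicodedata.name("X"))             # LATIN CAPITAL LETTER X
```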

08/10/13; 07:37:33 AM

Articles in the local paper about ticks. The only good correlation between weather conditions and tick population is said to be humidity. Ticks hate dryness: get the humidity down enough and they will die within 8 hours -- a tick left on a dry counter, for instance, is dead in 8 hours.

Problem is that the humidity levels on the forest floor run quite a bit higher than general atmospheric humidity as reported from the weather stations. Humidity levels below 60% relative humidity will wreak havoc on tick populations, but when the weather stations in Rhode Island report 60% relative humidity, the relative humidity on the forest floor is more like 85%.

Nevertheless, one bit of advice is: when you come in from spending time in the woods or yard or other tick haven, wash your clothes -- but dry your clothes before you wash them.

08/09/13; 08:55:58 AM

For the past 5 years, the computer industry has moved away from the desktop PC toward mobile (small-screen) and HDTV (big-screen) technology. All the advances most prominent in the tech news have concerned cell phones, tablets, and watching video at home.

TV screens got bigger and cheaper and smarter. They toyed with 3D (now looking like a fad again), faster refresh rates, increased color gamuts, and different technologies, from plasma to LCD to LCD with LED backlighting to OLED to SED and FED (both of which proved too difficult to manufacture economically enough to compete with LCD tech), and now to UHD, or 2160p, and beyond to 4320p.

Meanwhile mobile screens got denser, with smaller and smaller pixels backed by ever more powerful ARM systems-on-chips and accompanying graphics subchips, with more cores.

There is one more potential play here, and that is the Gigantic Screen.

These screens are wall-sized, and just as the TV screen and the mobile small screen are controlled with their own distinct methods -- the TV controlled by a small wand-like remote control, and the mobile screen controlled by touch (and pens in some cases), so the Gigantic Screen calls for its own method of control.

This is a sector where Microsoft is the most advanced, with its experience with the table-sized Surface computer (a brand that preceded the company's current Surface tablet branding) and its purchase of Perceptive Pixel, with its technology for controlling computer displays running some 80 inches (such as the display said to dominate Microsoft CEO Steve Ballmer's office).

What looks like the best way to control these gigantic screens is not direct touch, since that would mean reaching to full arm's length and maybe even walking a step or two. Rather the best way to control these screens involves smaller hand and arm gestures (and maybe gestures with the whole body) that are interpreted by visual sensors in three dimensional space. Here again Microsoft with its Kinect has demonstrated the most advanced technology available today. But the Leap Motion controller also lives in this space and could well be used as the control of the screen gigantic.

Voice Control could also be used for these gigantic screens, and as voice control is useful for the small screens as well, it is likely to be incorporated into the gigantic screen control alongside gestures. These two complement each other, as voice control also includes a simple method of text entry, and gestures allow for more natural manipulation of icons and screen-objects that would be cumbersome using voice alone.

Today's gigantic screens, as they exist in Mr Ballmer's office, are only displays for traditional PC towers. But there is lots of space behind a big screen hanging on a wall -- more than enough for the guts of a desktop PC spread out along a single plane, like the components in an Apple iMac and its all-in-one offspring from Dell, HP and other PC makers.

The gigantic screen also can serve as a larger large screen, with 2160P and larger resolutions.

Small screen engines (today's smartphones and tablets) could hook into the gigantic screen as windows that float over the UHD or desktop PC display. The small screen device such as a smartphone could also be used as a control for the gigantic screen: gyroscopes in the smartphone can work like gestures for optical sensors, and the microphones in a smartphone allow for voice input to be sent to the gigantic screen from across the room without shouting.

08/08/13; 07:29:31 AM

(Thoughts on general political and military trends in the world today.)

Two empires dominate the world today. One is more political and military, the other is more economic.

The Military Empire

  • The political and military empire is based in Washington, D.C., and is run largely through NATO and Europe, though Japan and Israel are key members in their respective spheres.

  • This empire will invade and conquer any nation that it deems weak and small enough to make the conquest easy. The empire is powered largely by the US government's massive borrowing power, which funds an immense and high-tech military. This empire in its current configuration, a global outgrowth of the western-hemisphere empire that predated it, began in the years following the Second World War. It announced itself with the atom-bombing of Hiroshima and Nagasaki and has followed up on this technological bent through intercontinental missiles, spy satellites and orbiting bombing-platform plans, and its current craze for semi-autonomous drone warfare.

  • The empire is reluctant to shed its own blood and hence its preference for technology. Better to slaughter a hundred innocents abroad than have a single man of its own land wounded, seems to be the implied policy.

  • The empire has been engaged in a plan of world domination ever since the fomenting of revolutions in central Europe helped break up the old Soviet empire, its only check and rival at the time.

The Economic Empire

  • The economic empire is an empire of the Ruling Class. This class consists of wealthy individuals, key politicians, and powerful corporations, among whom the international banks predominate.

  • The economic empire is based in no one city but a web of cities including New York, London, Bonn, Tokyo, Geneva and others. This is an international or transnational empire.

  • The economic empire therefore, although it is allied with the Military Empire operating out of Washington D.C., does not entirely identify with any one nation, and has gladly undermined national economies in pursuit of its own interests, which consist chiefly of maintaining and if possible growing the share of the world's wealth its members hold.

Cooperation and Rivalry

  • These two empires cooperate at times. For example, the military empire used the economic empire's banking tools to weaken Iraq before invading and conquering that nation. The same methods of economic warfare are currently (2000s on) attacking Iran, with various plans for invasion and conquest, or even wholesale destruction, bruited about over the course of the past eight years or so.

  • But the two empires can also compete. For example, the economic empire seems to think nothing of undermining the economy of the USA, host nation to the military empire, if doing so will further the aims of increasing the wealth of the ruling wealthy individuals and corporations. And whilst the military empire has in the past few years been building towards war with China, the economic empire has carried on doing whatever it can to assist and further the advancement of the Chinese economy.

The Ruling Class and Democracy

  • Since the economic empire consists of the wealthy players in the world, any national government suffering from enough corruption will come to be dominated by the economic empire. Popular movements in semi-democratic republics, however, push back against the total domination of politics by the ruling class. The relation between the ruling class and democracy is an uneasy one. The ruling class sneers at democracy in principle, but recognizes that doing away with democracy entirely could well lead to bloody uprisings on the part of the masses. Therefore the ruling class seems content to allow for the semblance of democratic forms so long as the ruling class gets to predominate in the decision-making. Still, there are signs the ruling class is finding democratic forms increasingly nettlesome, and so for example will push forward an `austerity' agenda in Greece and Spain to the very limits of armed revolution.
08/08/13; 06:49:58 AM

There have been studies lately about climate change and conservatives, seeking an answer to the question, `Why do conservatives reject the science behind climate change?'

To me the answer is a logical one, having nothing to do with personality types, modes of thinking, or any other organic cause.

The environment is the Alpha and Omega of the commons. When the environment is threatened, the free market has no solution. To paraphrase President Reagan, `When it comes to the environment, the free market is not the solution to the problem. The free market is the problem.'

  • (If there is any market protection for the environment, the capitalist apologists have yet to advance it.)

Thus, logically speaking:

    • If pesticides poison ground or water, the pesticides must be banned or regulated by the government.
    • If heavy metals poison ground or water, industries that use and egest heavy metals must be banned or regulated by the government.
    • And if production of CO2 leads to climate change, then all production of CO2 must be banned or regulated by the government.

But conservatives favor industry and distrust government. In any field where government is the only conceivable solution, and where industry (and individual property rights) lead to the problem and must be curtailed, the scientific conclusions of that field must be distrusted and rejected.

It would not be enough for these people to say, `Yes, DDT is harming the environment and poisoning many valuable insects and birds -- but banning or regulating it is not the best solution,' because then they would be called upon to produce a better solution. Only, you see, they have no solution to offer. And once the problem is admitted by both conservatives and liberals, both right and left, and there is only one solution offered, then that solution must be adopted.

And so, conservatives can only reject the science. Or at least cast as much doubt upon it as they can.

08/07/13; 10:17:22 AM

Text in an unexpanded node will not be found by the browser's Find command.

The `Outliner' menu has commands to expand all and expand all subs. These make it easier to search.

Tags I guess will be up to me. Hashtags probably. Then expand all and search.

08/07/13; 09:30:17 AM

I noticed a funny thing yesterday: viewing the Fargo outline locally with the OPML Editor, the words I selected as italic showed up as italic -- with no markup around them. I did not notice if the words showed as italic in TextWrangler, but then, how could they?

After doing my search and replace, I did not look for italics. But later on I was struck by the thought that maybe my italics had vanished in editing and saving in the text editor. I looked over the secession post from yesterday, and the italics remain.

I wonder how. I am glad they did, because I was afraid I would have to mark up all italics and bold using html. That wouldn't be terrible, but it would make the outline here a bit less readable.

  • On the other hand, something like a copyright sign or trademark will have to be represented in a sort of plaintext style like (c) and (tm), or else with the html entities &copy; and &trade; -- which are much nicer to read in the rendering, but a bit more difficult in the raw text outline.

  • Just as I could get an &ndash; or an &mdash; along with &ldquo;Quotes&rdquo; and &lsquo;quotes&rsquo; ... but these get balky to write, mess up the plain text in looking it over, and are prone to mistyping a character or two, messing up entirely.

08/07/13; 07:42:47 AM

I changed this outline to lower-ASCII using TeX encoding, except for ellipses -- I made them into three periods instead of code like \ldots.

Tried doing this using the OPML Editor but I am not familiar enough with it, so I fell back to TextWrangler, my usual text editor on OS X.

Checking back here, all seems well. Changes worked as far as I can see.

  • (Now I have to remember to retrain my fingers not to type the curly quotes and en-dashes.)
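The search-and-replace pass I did by hand in TextWrangler could be scripted. Here is a minimal sketch in Python, assuming TeX-style conventions for the quotes and dashes; the mapping and function name are mine for illustration, not anything from Fargo or the OPML Editor:

```python
# Convert curly punctuation to lower-ASCII, TeX style.
# Hypothetical mapping based on the conventions described above.
REPLACEMENTS = {
    "\u2018": "`",    # left single quote  -> backtick
    "\u2019": "'",    # right single quote -> straight apostrophe
    "\u201c": "``",   # left double quote  -> double backtick
    "\u201d": "''",   # right double quote -> two apostrophes
    "\u2013": "--",   # en-dash  -> double hyphen
    "\u2014": "---",  # em-dash  -> triple hyphen
    "\u2026": "...",  # ellipsis -> three periods, not \ldots
}

def to_lower_ascii(text: str) -> str:
    """Replace each upper-ASCII character with its plaintext stand-in."""
    for char, ascii_form in REPLACEMENTS.items():
        text = text.replace(char, ascii_form)
    return text
```

Running each post through something like this before saving would keep the outline in ASCII 7 even when my fingers slip back into typing the curly characters.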
08/06/13; 04:01:56 PM

Think I will move to ASCII 7 from now on. TeX workarounds would get me most of what I want: `quotes' like that, en-dashes as double-hyphens -- and em-dashes as triple-hyphens ---.

Look into changing this outline at home using a text editor -- or OPML Editor if it could manage it.

What almost pushed me to this decision is a blog post from Doc Searls: he quotes an article that uses an upper-ASCII encoding different from that of his blog, so an rsquo comes out as 2 or 3 characters.

Staying all-ASCII 7 means this will not happen.

08/06/13; 03:39:03 PM

Couple years back there were rumors that Intel wanted to fab Apple's ARM chips for them. Didn't seem too believable but there it was. Intel even had a weasel-word comment on the story saying they were open to trying out something like that, only it would be on custom x86 chip designs. x86 was still the byword in Intel land.

I think this is a mistake. It represents an old way of thinking.

First, Intel is now a vertical monopoly: they own the patents and copyrights on x86, keep extending that protection by designing new proprietary connections for their SoCs, and they make the chips, make the auxiliary chips, and sell those chips to OEMs. A vertical monopoly is the kind that regulators lick their chops over.

Second, Intel has more fab capacity than any other company, but with the global PC market peaking, Intel faces a worrisome future in which its capacity rises (shrinking fab processes and larger wafers mean more chips coming off any one line) but sales stall or even diminish, while ARM chip sales continue to climb.

So Intel could solve both these puzzles by devoting one of their fabs to non-x86 chips. Or even to foundry services -- go on, Intel, fab some AMD chips for them. Fab Apple ARM SoCs.

  • A year or so ago, there were reports that TSMC, Taiwan Semiconductor Manufacturing, had been approached by both Apple and Qualcomm to buy up TSMC's entire output. TSMC turned both offers down, as the story goes, in order to play the field, apparently liking their chances for bigger profits by arranging smaller runs for different customers. Apple didn't want to depend on Samsung any longer, and needed massive fab capacity to churn out gazillions of chips, and the same seems to be true of Qualcomm. Well, nobody has the capacity of Intel.

Intel would want monopoly margins on a non-monopoly business model, but it seems as though just keeping the fabs open and running, and perfecting their manufacturing processes through successive die shrinks, should be enough of a motivation for the company.

The golden age of monopoly over an exponentially growing industry might be over. Intel could prepare for the future and stave off any monopoly investigations at the same time. Lower margins could be offset by greater volume resulting in an increased net.

08/06/13; 03:25:27 PM

News via Gigaom.com:

http://gigaom.com/2013/08/06/ibm-hopes-google-et-al-can-breathe-life-into-its-power-chip-franchise/

IBM seems to be offering its POWER server chips, the last word in the old Apple-Motorola-IBM PowerPC alliance, in a way that reminds me of the ARM model. IBM will be open to design customization by its customers, especially big buyers such as Google.

08/06/13; 10:27:10 AM

resilience.org has an interesting article, a chapter from a book on Vermont, regarding the chances that Vermont might secede from the USA and even make a go of it:

http://www.resilience.org/stories/2013-08-05/farewell-to-empire

I have been hearing all this talk about secession ever since the racist Southern states, Texas foremost among them, objected most vociferously to having a President who is (half) black. So I looked through the US Constitution to see what it had to say.

The Constitution has a couple clauses on joining the union. And one on becoming a separate state apart from the state to which you currently belong, the way West Virginia was created (probably illegally, but judged to be legal by the Supreme Court) during the Civil War.

The Constitution has nothing to say about leaving. Nothing at all.

Moreover, what the Southern States tried to do in 1860 was ruled unconstitutional.

The only clause that might cover the act of secession is the Tenth Amendment, which reserves to the states (or the people) all powers not delegated to the federal government in the rest of the constitution. Logically, if a state had the power originally to join the union, it has the power to leave the union. And if a territory belonging to the federal government (say, Colorado) gains by statehood all the powers and rights that the other states have, then the former territory also has the right to leave. However...

Two objections arise.

    1. The US Supreme Court has never recognized a right to secede in the Constitution. What is more, the Tenth Amendment never seems to have had any force anyhow. What decisions ever came down on the side of the states vs. the federal government that relied on the Tenth Amendment?
    2. The recent history of the federal government, to wit the four terms of Presidents George W Bush and Barack Obama, indicates that whenever an individual challenges the federal government, an example will be made. Look at Bradley Manning, Thomas Drake, and now Edward Snowden. Should we expect the Federal Government to act any less harshly at the prospect that it might lose an entire state?

Vermont might be small potatoes to an Imperial Power like the USA today. But the precedent of letting even little Vermont go would surely embolden Texas and Louisiana and Florida and maybe Oregon and Washington State and California to consider what their prosperity and fortunes might be as independent sovereign states. And the Federal Government could not allow that. Would not allow it.

I wonder, too, whether the citizens who advocate secession realize what it would entail. For Vermont that would mean a full border crossing into New Hampshire, New York, and Massachusetts. All travel to Europe would entail crossing through Canada or the USA, or at least flying through those countries' airspace. (The citizens of Texas, Oregon, California, and the other states that are not landlocked would at least have the ability to sail the sea without having to ask another nation for permission to go abroad.)

A US President need not even send in bombers and ground troops to dissuade Vermonters from sticking to their secessionist ways. There are plenty of `softer' powers at hand to inflict pain on those who say they are no longer a part of the Empire. Taxes at the border, a complete closure of the border, and other means would be more than enough. Vermont would be unlikely to find a sympathetic ear in Ottawa either -- Canada, since Stephen Harper took power, has shown herself to be a willing partner to US Imperial aims, and Washington could always sweeten the deal by agreeing to buy more tar sands...

It might be a nice fantasy to dream of independence but it ain't gonna happen.

08/06/13; 08:59:57 AM

I wonder about using Fargo as an online repository. This might run like a linkblog, and whenever I see a web page that looks interesting enough to remember, I could add a post to Fargo with a link to it. Just as a personal reminder. I could also quote from the page a relevant paragraph or two.

At least it would get me and keep me in the habit of posting to Fargo.

08/05/13; 11:28:11 AM

Thinking about AMD has me depressed. The PC industry has moved from high-powered desktops to low-powered laptops. Intel has moved that way and beats AMD on performance per watt. Thus Intel gets to go on charging monopoly prices for its chips, while AMD sells fewer chips and at lower prices. AMD ends up with higher costs to produce its chips (smaller fab runs) and lower prices to entice PC OEMs to put AMD chips in their laptops.

So AMD ends up with smaller profits, less money to put into research for future chips, and they fall farther behind Intel.

What could AMD do about this?

I was wondering whether AMD could license the ARM architecture, and produce a chip that has both x86 and ARM cores. This is something very few other companies even could do. VIA could, but AMD is likely bigger than VIA by as much as Intel is bigger than AMD. Other ARM licensees don't make x86 chips. And Intel would not want to go down this road: Intel is set on `x86 everywhere', as the company announced some half-dozen years ago.

But, why?

  • What good would a hybrid x86-ARM chip be? Running Android and a Linux or MS-Windows OS at the same time? Running iOS and OS X at the same time?

  • You could have both operating systems running, the way the unix-derived world can run several user spaces at once.

  • You could operate your x86 system when you need power, and switch to your ARM system when you want to extend battery life.

  • You could run some programs on one system and others on the other.

  • The ARM chip system might go on operating while the x86 chip system is asleep -- your laptop would look like it is asleep but the ARM system would be collecting email.

  • In one sense it would operate like ARM's big.LITTLE model, in which Cortex-A15 cores are lashed to Cortex-A7 cores, and the system switches back and forth as power is needed.

    • In this way it might make sense for such a hybrid chip to operate a hybrid OS. Ubuntu runs on both x86 and ARM. And MSFT has Windows and Windows RT now, and supposedly apps from the Metro store will run on either OS. (Going to the traditional desktop might engage the x86 cores, and going back to Metro would shut down the x86 cores and fire up the ARM cores.) Android I think also runs on both ARM and x86, though the x86 variant is a few generations behind the ARM variant.
08/05/13; 11:13:54 AM

Dave wrote about the NSA spying scandal and drifted into the common complaint that `it is all the boomer generation's fault' ... and he added:

  • ``When we were younger we did protest. Study the history. We were even effective at stopping the war we objected to. Our crime, if you want to think of it as that, is that we became middle-aged, and decided to live our lives instead of trying to change the world.''

I think this is right. But I am a lot harsher on us boomers than Dave is. Basically:

  • The boomer generation is the first in history (?) to know that what they are doing, as a nation, as a world, as a civilization, is wrong -- and go on doing it.

  • We started to change the world. We could have finished what we began -- and the whole world would be a lot better-off now if we had. And then we had kids, went back to work, went back to church, left the communes, and took away from our kids all the indulgences and freedoms we demanded and enjoyed for ourselves.

When you think of it in these terms, it becomes a very damning accusation.

  • (Note: not all boomers ever knew or acknowledged the evils of their society, and not all boomers went on doing The Usual, or tried new things and retreated to The Usual. And like as not, the boomers who changed things in the 1960s represented only a small fraction of the whole generation -- which underscores, if true, just how few people you really need to spark great social change.)
08/04/13; 11:21:43 AM

Discussions of right and left, conservative and liberal, raise the notion of authoritarians and where they fit into these axes.

An authoritarian fears the general freedom of his fellow citizens. He looks up to anyone in a uniform whether it be a police uniform or an army uniform. He wants order and security. He fears change, and this is why he fears the freedom of his fellow citizens: they might change things too much.

There is a general affinity between those who fit this description and conservatives as I define them. But the anxieties of authoritarians go far beyond the slim bounds of who has how much power to decide things in the society.

But in theory, I think it's possible for an authoritarian to be on the left wing or the right. (In practice it seems the overwhelming numbers of authoritarians line up on the right side of the axis.) That's for left and right. But as for the conservative-liberal axis, since liberals urge change and conservatives resist it, it seems that by my descriptions, no liberals can be authoritarians, and all authoritarians would find a lot in the conservative doctrines with which to agree.

08/04/13; 11:38:26 AM

Last built: Mon, Mar 17, 2014 at 9:41 AM

By SWP Pond, Thursday, August 1, 2013 at 11:24 AM.