Patterns in static

Some fluff, some info

09 October 03. Here is some text for you to enjoy.

The blog finally looks like a blog. I've got the deep-sounding-because-it's-incoherent-and-out-of-context title, and that list of links to the right. Further, I'd like to point out to you that I am not using any sort of blog engine, as other blog manufacturers do. No---this is all hand-crafted HTML, giving you that added value that only a human touch can give. And after today's entry, it'll have the first overdigressing rant.

So on the subject of the links at right, I feel that the Something Awful link deserves some explanation. It's exceptionally puerile stuff, seemingly written by people who are in the late high school to early college phase. So what do I like about it?

It's text.

There's the ongoing debate as to whether literacy is in decline, with every single generation since Plato's complaining that we're worse off than we used to be, and that kids are dumber. I'm generally bothered by the lack of perspective in urgent claims that people have been making for the last thousand years. E.g., the environmental crisis, wherein we've gone too far this time and we really will all suffocate on our own pollution in the next few decades; the overpopulation crisis, wherein we're going to run out of land and food any day now; the violence crisis, wherein the people out there today are infinitely more hostile than people, say, during the crusades. You'd think we'd have learned since Malthus. I class the decline in education in there among crises humans have been facing since the origin of the word crisis (circa 1540).

Having given all that blather, I'm going to talk about how people read less now than they used to, focusing on two fields: advertising and humor. On the advertising front, just open any magazine from the 1970s or earlier. The advertisements, for booze, records, whatever, were filled with lengthy blocks of text. Those blocks are gone for good, it seems. The only blocks of text we see in modern ads are those that are federally mandated. Maybe some day even the Surgeon General's warning will just be replaced by a little `no smoking' sign.

I'm not entirely sure what this says about modern consumers. If it's true that advertisers who know their public succeed and those who don't fail, then this seems to imply that modern consumers just require less information (or solely pictographic information) to make consumption choices, that consumers of today deal with text in a fundamentally different manner. Don't know if this is true. It could just be that printing full-color pictures is cheaper now, or maybe it's just that text takes up space on the page where you could be putting boobs.

On the humor side, flash back to National Lampoon, from before we were born (early 70s). It was mostly text. You had articles in which the authors would take on some character or describe some funny situation. It included photos and cartoons, but not in any great proportion. [Just as Al Franken's latest book, which is all text save for one comic section, is still a book of essays.]

Puttering around the Net, you find that it's all comics, or splashy multimedia. E.g., Suck, or Modern Humorist. The funnies in the newspaper are of course all pictorial, save for maybe Dave Barry. So Something Awful stands out as one of the few sources of daily humor in written form. I'm not going to say silly things about how written humor is superior to pictorial, just that I think it's neat that even with all the bit-throwing ability we have now, people can still make us laugh by just talking.

Of course, SA does have incidental photos, some of which clearly and strikingly outdo the text. E.g., have a look at the Russian Brides pictured in this one.

`But Ben,' you protest, `what about SA's content! It's frequently completely puerile.' Well, that depends on your scale. I mean, it sure beats Howard Stern. And both Mr. Stern and Something Awful beat National Lampoon by a mile on the insultingness front. I mean, we're a long way from when you could write about spics and coons in prose, serious or not, and still get respect. For this I am grateful. Sure, there are lots of ways to take political correctness too far, but the movement has made some real progress, even if most of us don't perceive it, and that progress is embodied first and foremost in humor.

Oh, and Something Awful also talks about video games a lot, which the reader may or may not be able to relate to (but compare with this page, which is friggin' hilarious whether you play video games or not). However, they have the absolutely brilliant strategy of only reviewing bad stuff (thus the name, I guess). You know how movie reviews are useful whether the movie is good or bad, but are funny only when the movie is crap? [The Filthy Critic did an all-around great job of being funny regardless of the material. Sigh.] So why not just ditch the reviews of good stuff and stick to crap? This is a strategy which many folks have come upon, but only Something Awful had the bravery to live up to it.

And that's why I like SA.


20 October 03. An annoying economist trick

So Jack Snow, Secretary of the Treasury, stopped by for lunch. We had lunch in the press room, which made it feel like a press conference with salad.

Secretary Snow has been around: he was a professor of Economics for a while, head of a business coalition, stuff. Not a dumb guy. He was also smart enough to acknowledge that many of the people in the room knew more facts and/or were better economists than he.

So one of the primary points of interest among these economist people was around Bush's proposed tax cuts. Consensus among the economists was that they're a dumb idea, which will blow up the budget deficit to immense proportions. The logic being: if the government cuts taxes, it gets less money. See how that works?

Now, economists are enamored of something which some call `The law of unintended consequences', the gist of which is that everything is interrelated, so when you change one variable, you need to take into account everything else in the system too. The usual application is as follows:

1. Advocate of policy P makes a claim that it will change variable X for the better.
2. Opponent of policy P then invokes the law of unintended consequences by expanding the framework to include variable Y. If the effect on variable Y is detrimental, then the opponent proclaims with satisfaction that policy P is dumb and should not be implemented.

This has the rhetorical benefits that it makes the opponent look like a more subtle thinker, and allows him/her to keep pointing out variable Y every time anybody says anything about policy P. It has three rhetorical problems: it leaves open the question that maybe we could open up the framework a little further to include variable Z, where policy P is the best thing to happen to variable Z since frigging sliced bread; it fails to take into account the (often extremely difficult) question of whether the benefit to variable X outweighs the detriment to variable Y; and after the third or fourth time opponent points out variable Y, it gets really annoying and frustrating.

So Secretary Snow points out that cutting taxes will help to make the economy more efficient. A more efficient economy means a larger tax base, and therefore more tax revenue.

We can apply the above critiques in sequence: it doesn't take into account other possible expansions of the framework, like how not taxing dividends breaks the corporation-as-person metaphor, and allows for shifty dealing between the CEO's personal accounts and those of the corporation. Don't tax dividends, and suddenly every corporation is going to be paying huge dividends instead of paying salaries or keeping that money in the corporation.

Next, it doesn't take into account the relative weight of two effects: we've cut tax rates but raised the tax base. Determining which effect will prevail is a hard question, which we should defer to the economists who put out the effort to analyze this question in detail. Their verdict: there is no fucking way that the expanded tax base will make up for the cut in rates. The cuts will absolutely, positively, balloon the budget deficit. This is using any model you want, including the super-optimistic model of the Congressional Budget Office, which reports to Secretary Snow.
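
The arithmetic behind that verdict is short enough to sketch out. Revenue is rate times base, so a rate cut pays for itself only if the base grows proportionally more than the rate falls; here is a quick illustration in Python, with rates invented for the example rather than taken from any actual proposal:

    def breakeven_base_growth(old_rate, new_rate):
        """Revenue = rate * base. Holding revenue fixed, the base must
        grow by old_rate/new_rate - 1 to offset a cut in the rate."""
        return old_rate / new_rate - 1

    # A hypothetical cut from a 35% rate to a 30% rate:
    print(f"{breakeven_base_growth(0.35, 0.30):.1%}")  # base must grow about 16.7%

The economists' unanimity amounts to saying that no plausible model delivers anywhere near that much extra base from the cut itself.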

The sad part comes from that third critique of the law of unintended consequences: confronted with a room full of really smart economists (and me) who were unanimously in agreement that making these tax cuts permanent is a supremely bad idea, he had no recourse but to annoyingly keep repeating that the tax base will grow. He did this in a number of ways, affably joking about how those silly people in the business press don't know the difference between debt capital and equity capital (like it matters here), talking about how we need to encourage innovation, and the other usual neoconservative chestnuts.

Though the lunch was `off the record', I don't really feel that I'm betraying anything here: somewhere between most and all sane economists agree that Bush's tax cuts will expand the deficit---at a time when Baby Boomers are about to start claiming Social Security and the USA needs to rebuild Iraq, since it's broken; and Secretary Snow stands by his boss in supporting these cuts. The big question I'm left wondering is whether Secretary Snow believed what he was saying. He's married to this position, either by his own beliefs or those of President Bush; how many experts does it take to get a person to divorce himself from a bad belief?


28 October 03. Internet as television

Television is a medium whereby data is sent from a few central points to a multitude of receiving machines. The reader may judge for him/her/itself, but people on the receiving side are often characterized as passive, mindless consumers.

The internet is a set of computers which use a standardized communication method to exchange data. Any computer on the network can address every other computer, and serve or receive data with its peers. There are some standards as to how that data gets shunted around, but the flow of data is basically arbitrary.
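
To make that symmetry concrete, here is a minimal sketch in Python: the same few lines can take either role, so `server' and `client' are postures, not kinds of machine. The port number is an arbitrary choice for the example.

    import socket
    import sys

    PORT = 9999  # arbitrary

    def listen():
        """Wait for a peer to connect, then print what it sends."""
        with socket.socket() as s:
            s.bind(("", PORT))
            s.listen(1)
            conn, addr = s.accept()
            with conn:
                print(addr[0], "says:", conn.recv(1024).decode())

    def send(host, text):
        """Connect to a listening peer and send it some text."""
        with socket.socket() as s:
            s.connect((host, PORT))
            s.sendall(text.encode())

    if __name__ == "__main__":
        # python peer.py listen          -- play the receiving side
        # python peer.py HOST "hi there" -- play the sending side
        if sys.argv[1] == "listen":
            listen()
        else:
            send(sys.argv[1], sys.argv[2])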

Now, the reader's experience with the internet is probably more like the definition of television than the definition of internet. You go to a website like nytimes.com or salon.com or hornywetmidgets.com and information is beamed into your house, and you consume that information. Oh, you can use the Net like the mail to send one-to-one emails, and many web sites do have interactive details such as the `add to cart' button, but the flow of information is basically one way.

There are many forces out there trying to keep it as much like TV as possible. The majority of digital-protection schemes which protect the members of the RIAA and the MPAA will have the side-effect of making the Net more like TV. For example, the peer-to-peer networks that we read about in the news every day exactly fit the internet definition above. A site such as itunes.com, where you can download music that they put out for you, is a closer fit to the above definition of television.

But I digress. In Internetland, everybody can be a content provider to everybody else. Those who think this is how it should be can all have a blog.

Within the grand scheme of human history, this is new.

Nice, democratic people that we are, we like to think that everybody should have a voice, all the time. But that one isn't entirely obvious. It's like libertarianism: hordes of free marketeers insist that the world would be a better place with zero regulation, but such a state has never, ever existed. [As a reply, a few thousand libertarians are trying to make one.] Equal access to public media has also never existed.

And the truth is, that if you give everybody a chance to talk, most of them will, indeed, talk about the completely banal, idiotic, or vaguely offensive. [I'd give examples, but how to cull it down?] So you get people who complain about the bloggers, saying that the Net is filled with blog noise from self-appointed experts such as this arse.

But people are really good at filtering dumb content. Yeah, they still think Fox News is Fair and Balanced (tm), but I expect that even that facade has been cracked, as they keep suing other content providers who parody them, such as Fox Broadcasting. [Ex post note: this lawsuit was just a joke by Matt Groening. But I'm leaving the darn links.] Or to give an example on the other end of the production spectrum, most people, when happening upon a Chick publication lying around, will successfully gather the clues and work out that this is the work of a crackpot.

Despite its innovative features, online media readily follow traditional media this way. I have full faith in the abilities of those who stumble upon this to realize that even though I'm a world authority on the application of Bayesian updating to models of simultaneous conviviality, that doesn't give me the slightest license to blather endlessly on why Jack Snow is annoying.

More meta: With that in mind, I've submitted the site to Google, which I count as the blog going live, as complete strangers will now stumble upon this work and be forced to work out whether it's worth reading or not. We'll see where it goes. [By its own purchases, Google seems OK with blogs, by the way.]

I have to admit that most of what I look for in web sites myself is the TV-like sort of thing, wherein I watch words come up and look at them while I'm eating. Y'know, sentences that you don't have to read all the way through because you have something to click on in the middle of each of them. I guess that's what I'm providing, and that's OK.

Despite my lack of authority, you're hopefully finding some entertainment value in these pontifications. One reader suggested that I shoot for something more personal and less beat-you-about-the-head-and-neck witty, but it's sorta hard to put personal content here. A simple `I still think about `Lissa Tom a lot' would probably deeply disturb some subset of the world's population. I certainly don't want to end up like these characters.

Oh, and while we're on the subject of this document, the reader will note that I'm only updating on even-numbered days, thus saving the reader the endless torment of hitting reload over and over again on at least the odd-numbered days. See, I really do care.





22 January 04. More on the dumbness of elections

[Just aimless bitching today, but yes, I did write an academic paper giving statistical evidence backing this stuff up. Except the paper focuses more on the leptokurtosis of the error term in probit estimates of voter turnout than on underwear.]

Miss JATMM of Mount Vernon, VA, points out that the last episode, about DC's taxation w/o representation, is indicative of the general hypocrisy of the political system: after all, when the Southern states felt underrepresented, they started shootin', and the Republican party has always pushed for more states' rights, and everybody all around has always been up for the `Equal representation' thing. Yet all this rhetoric evaporates when it comes to talking about DC getting the right to vote.

[Figure: voter turnout vs. thong underwear]

So let me tell you what I've been working on, which I presented at the conference last weekend: evidence (but not proof) that voting behaves like a fashion statement. Some groups (e.g., yuppies) are all about it, and some groups (e.g., urban Blacks again) don't go for it. We can argue all day about whether higher turnout is a good thing or not---after all, low turnout means that only people who really care and have some amount of information will show up, and people are free to not vote if they choose not to. But spotty low turnout, where certain groups consistently show up and others consistently don't, distorts the outcome of even the simplest binary decisions.

As discussed previously, no measure of group will is true or infallible, but some are much more perverse than others. A big factor in that perverseness is when there are systematic distortions in voter turnout based on factors unrelated to the governing of the nation, which are exactly the factors that actually induce most people to show up to vote.

The probability of being the pivotal voter who has a true and honest effect on the election is nil, so turnout is based on everything else: either an irrational belief that one person can make a difference when he very probably won't, or voters who wish to express their opinion for other reasons, such as a sense that this is what civic-minded people do, or because they're bored or because their friends voted so why shouldn't they.
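
To put a number on `nil': treat each of the n other voters as a fair coin flip, which is the setup most favorable to the pivotal-voter story, and the chance of the exact tie that would make your vote decisive still shrinks like one over the square root of n. A back-of-envelope sketch:

    import math

    def p_pivotal(n):
        """Chance that n fair-coin voters split exactly evenly, so that one
        more vote decides the race: roughly sqrt(2/(pi*n)) for large n."""
        return math.sqrt(2 / (math.pi * n))

    for n in (10_000, 1_000_000, 100_000_000):
        print(f"{n:>11,} voters: P(pivotal) is about {p_pivotal(n):.0e}")

And any predictable lean away from a dead heat drives the probability toward zero far faster than that.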

Adding to the problem, the theory says that there are multiple equilibria in such settings. Why do some groups display their thong underwear and some groups like to wear big hats? It's random and arbitrary, based on history, luck, and maybe the preferences of the group members. But it has an effect in the current world: if all of your friends are displaying thong underwear, you too will start to feel pressure to do the same, and may eventually find yourself lingering at the thong rack at the department store. Conversely, if the same you is in the typical office setting, you will feel pressure not to display your underwear, regardless of how strongly you feel that it should be freed from the constraints of your pants. The same people can have different levels of equilibrium thong turnout, depending on the situation---it's not at all inherent to the people themselves.
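
Here is a toy version of that multiple-equilibria claim, with the threshold and starting shares invented purely for illustration: everyone participates if enough of the group did last round, so the very same rule settles at opposite equilibria depending on where history happened to start the group off.

    def settled_turnout(initial_share, threshold=0.4, rounds=20):
        """Best-response dynamics: everyone participates iff last round's
        participation cleared the threshold. Two stable outcomes exist."""
        share = initial_share
        for _ in range(rounds):
            share = 1.0 if share >= threshold else 0.0
        return share

    print(settled_turnout(0.6))  # starts above the threshold: settles at 1.0
    print(settled_turnout(0.2))  # same rule, different history: settles at 0.0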

You may feel that it's a bit silly for me to be comparing turnout to underwear choice, but there are some important similarities. The first is that all costs for both expensive underwear and turning out to vote are your own; the second is that most or all benefits go to the people around you. After all, you can't see your own lower back, and most of the policy of this nation of 265,000,000 has nothing to do with you.

Why you choose to wear what you wear or why you choose to vote is a personal decision which includes any of a number of factors which are known only to you. But for many, one of those factors is how others treat them as a result of their actions. The important point here is that this treatment differs depending on the group: some groups find voting to be very important and reward those who vote, and some groups are indifferent or even hostile. This affects the final ballot count, and distorts the perception of `public will' in systematic ways.

In an alternate universe, only urban Black males tie sweaters around their necks, while yuppies wear five-sizes-too-large baggy pants. In an alternate universe, Blacks were historically allowed to vote, and the non-voting social norm that now pervades the cultures of Blacks, immigrants, and other historically disenfranchised groups never had a chance to take root. But a voting system here in the real world, where these groups have been disenfranchised, and do have a social norm different from the social norms of other groups, is a voting system whose outcome relies heavily on history, luck, and the whims of fashion.

Of course, the groups that are most likely to show low turnout are the groups that are generally economically disenfranchised (you get to work out which caused which), and are generally represented by the Labor party, or as we call it here in the U.S.A., the Democrats. This makes the entire process a partisan one: Reps want low turnout among these groups, and Dems want higher turnout. Which brings us to yet more hypocrisy of the political system. A good representative democracy may not have high turnout, but it has turnout which is representative of the people it, um, represents. Yet attempts to make turnout representative are not forthcoming among the Republican leadership.

Next time you're hanging out with your exceedingly patriotic pro-Bush pal, ask him/her why he/she believes that the 572,000 citizens of the District of Columbia shouldn't be represented in Congress while the 494,000 citizens of Wyoming should be. Next time you meet a Republican harping about liberal bias, ask them whether they're bothered that even the crappy measures we use to determine public opinion today are themselves biased against the poor and historically disenfranchised.

Oh, you know what the answers will be, but it's fun anyway.




06 February 04. A lament about bad design

OK, this was going to be a lament about any of a number of things, with a general discussion of how to design things to be useful but not annoying. However, I fear that it's going to turn into a rant about MSFT. Sorry, guys.

It's about designing things for dumb people. This is, by itself, not a bad thing. After all, designing for less cognitive effort benefits us all, just as features designed for the handicapped are often embraced by able people who just want it easier. [I've been using the Twiddler lately; it would be most useful for people who only have one functioning hand, but I like it because it lets me drink tea and type at the same time.] Or, at the other extreme, here is an article about how an unintuitive interface killed John Denver.

The two prime examples of this would be advertising and MSFT products. I think the whole thing about idiot-proofing Windows has been discussed to death, and needs no elaboration here. I've already talked about how advertising has gone from a long textual evocation of the product (with bold headlines for those who are just skimming) to a picture of the product being held against a pair of breasts.

Don't get me wrong, intuitive interfaces and things that dumb or inattentive people can readily digest are not necessarily bad. But what makes them horrible is when the design makes it impossible or too difficult for not-dumb people to go beyond the dumb level.

For example, when waiting for a subway train, I am confronted with a number of large, backlit ads right in front of me, and I typically have about ten minutes to kill. This is the perfect opportunity for the vendor to tell me all about the product, in great, backlit detail. And yet all settle for a picture. and a tag line. with inappropriately placed periods. where a comma or hyphen will do.

This is true for both SBUX, which will just show you a picture of a cup (which we presume contains coffee), and Boeing/McDonnell-Douglas, which outbid SBUX for ad placement at the Pentagon Metro station, and advertises bombers and helicopters. Surely there's more information that we need to know about the latest bomber than about a cup of coffee?

But the advertising won't tell me. If I care, there's nothing for me to do for ten minutes but to stare at the picture some more. Oh, I could look elsewhere, but I'm not elsewhere. I'm on a subway platform, waiting. I want to see the information that got thrown away in a desperate attempt to get the point across with a minimum of cognitive effort, and am frustrated that I can't.

The other prime example of this is of course anything written for MSFT Windows. It's easy to sit down and use, which I would be an arse to be annoyed by, but it's supremely difficult to go beyond the easy stuff. I spent an hour yesterday trying to get the cute little browser thing to hide files beginning with a dot. I even wrote Dell tech support, who blew me off. As you can plainly see, if I want something that seems possible, but isn't, I will be frustrated and unable to continue to function as a normal human being.

My favorite foil whom I've linked to before, Joel, goes on and on about how frustration comes from having things that don't work the way you expect them to, adding little bits of cognitive effort and annoyance to your day. Joel probably describes many people, but I am most frustrated by tools that just plain don't work. Screwdrivers are truly counterintuitive, if you ask me (to make the screw go out, turn counterclockwise?), but I learned the righty-tighty/lefty-loosey thing. When even that doesn't help (like when the screw is upside-down, or with the few reverse-threaded nuts on a bike), I am indeed frustrated. But I am infinitely more frustrated when the screwdriver is made from cheap metal and bends when the screw is too hard to undo.
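
Back to those dot-files for a moment. As best I can tell, Explorer hides files carrying Windows' hidden-file attribute, not files whose names begin with a dot, so the nearest thing to a fix is to stamp that attribute on by hand. A sketch, assuming Python on a Windows box, with the directory as a placeholder:

    # Windows-only: Explorer keys on the hidden attribute, not a leading dot.
    import ctypes
    from pathlib import Path

    FILE_ATTRIBUTE_HIDDEN = 0x2  # constant from the Win32 headers

    def hide_dotfiles(directory="."):
        """Set the hidden attribute on every dot-file in the directory."""
        for path in Path(directory).glob(".*"):
            if not ctypes.windll.kernel32.SetFileAttributesW(str(path), FILE_ATTRIBUTE_HIDDEN):
                raise ctypes.WinError()

    hide_dotfiles()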

Implicit to all of these things is a promise: I will tell you about my product; I will help you make your document look just right; I will unscrew your screws. Sometimes getting that promise to work takes some compromise from both sides, which is how life is. It's not the compromises but the broken promises that really hurt.

Somewhere, I read about how temperature gauges are less common on cars now, since somebody worked out that most people interact with the gauge by just looking to see if it's in the red, and panicking if it is. So why bother with a gauge? Instead, you just get a little light that tells you when the temperature is in what would have been the gauge's red part. I told Mr. DRC of Santa Monica, CA---a car expert if ever there was one---about this, and he had a hissy fit, listing three dozen things you can learn from a temperature gauge beyond whether it's in the red. So in following Joel's advice about minimizing cognitive effort for 95% of drivers, the other 5% are frustrated and dejected.

It doesn't have to be that way. Design that includes the lazy doesn't have to exclude those who care, and if it does, it's as bad a design as one that only makes sense if you study it for an hour.

I tried to come up with more examples of where things have been redesigned for the lowest common denominator and thus shut out those who care, but couldn't think of anything really good and pervasive. Television has always been written for dumb people, and since there's a time constraint, you have to pick your level of information and stick with it---unlike a print ad, it's physically impossible to say more. There are thousands of books with `for Dummies' in the title, but there have always been such how-to books, and for every such book, there's another that goes into all the detail you could want. This is even true of management books, which are typically the most supremely oversimplified books in existence, since businessmen often have a pompously overinflated idea of what their time is worth. Perhaps you, dear reader, can leave some suggestions in the box below.

Meanwhile, I have nothing but a lament about the two realms where withholding from the consumer is vehemently defended as a good thing: working with PCs, and advertising. One particular item stands out as the intersection of the two: MSFT PowerPoint, a computer program for creating advertising presentations. Its design makes summarization and minimization of cognitive effort easy and information dissemination difficult. E.g., as a counter to the too-difficult design interface which caused a disaster above, PowerPoint's design is partly responsible for the destruction of a Space Shuttle.

on Friday, September 29th, Lure Knightstalker said

Items in an average day which have been LCD'ed (Lowest Common Denominator'ed); yes, some are subsets of others.

Windows, all versions
Any chat program which changes emoticons to real icons for those who can't read them
Cell phones (good thing, but overdone), particularly cellphones with multifunction buttons (top of button is up, center is select, when on a call it is end call, when not on a call...)
Microwaves (modern, not old). Microwaves used to have 1 or 2 dials (time, power) and the button to help open the door. Now they have overly complex sets of keypads. This in and of itself is not a bad thing, but when it makes me take more time than the minimum (place food, close door, twist dial) I see that as a bad thing.
Cars (good thing, no more choke or messing with the accelerator to turn on)
The list goes on, but these are the most egregious examples I can think of right now.

on Saturday, January 19th, Lure Knightstalker said

Another one: the front panel of a VCR. Used to be you could do anything from the front panel, including play, record, setup functions, change channel, etc. More and more there are stunted front panels which allow play, ff, rew, eject and power, but have no way to enter the menu, change channels, or any other functions. TVs don't have this problem: they always had volume, channel, and power, and they still do; for them it's been feature growth, and oftentimes the menu and front panel will still allow you to access all functions of the TV (thanks to a menu button).



22 April 04. RSS and objects

public:

This guy makes a brilliant point about magazine articles: "I had begun to notice that people refer to a magazine article by mentioning the magazine, not the author, but with a book they typically don't remember the publisher, but only the author[...]." In this context (but another post) the guy explains why RSS saves us: it lets us pick authors that we like and design a nameless, virtual newspaper/magazine which is entirely by the people we are most interested in. Our virtual magazine can even have lots of comics (see the links page) and Dave Barry. It gives me that 90's optimism that yes, the Web really can revolutionize publishing and information dissemination.

Remember the `zine revolution, where a collection of a few people printed stuff up and made their pals read it and also left it at the bookstore hoping that a few strangers would also read a few pages? There's blogging with RSS for ya.

private:

So I myself have now been using an RSS reader for a little over a week. The results? My apartment is much cleaner, and I have no dishes lingering in the sink. I think I'm like a lot of people in that I guide my life based on the item on my `to do' list that is the least onerous at the moment. This had meant hitting <F5> on Paul's blog and seeing if he's said anything in the last five minutes, but now that's entirely obsolete, since the RSS reader does that automatically. As much as I love chycks in eyeliner, even that site from a few days ago has become onerous.
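
For the curious, the mechanics are nothing deep. Here is a sketch of the polling loop an RSS reader runs on your behalf, using the feedparser library; the feed URL and the half-hour interval are stand-ins:

    import time
    import feedparser  # a common Python RSS/Atom parser

    FEED = "http://example.com/rss.xml"  # stand-in for Paul's blog
    seen = set()

    while True:
        for entry in feedparser.parse(FEED).entries:
            key = entry.get("id", entry.link)
            if key not in seen:  # announce only the new items
                seen.add(key)
                print(entry.title, "->", entry.link)
        time.sleep(30 * 60)  # hits <F5> so you don't have to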

In fact, generally, looking for new content is among the most onerous things I can think of. As much as I may give the impression otherwise, I hate clicking on things and hoping they'll turn up something good; I really do think 90% of the content online is crap; and buying stuff online is so painful at this point that I'd rather go without than suffer the requisite half an hour of aimless clicking that goes into buying anything with plastic or silicon parts.

And so, having barred the joy of <F5>-ing the sites I really do like, I'm down to sweeping my floor and doing dishes. I guess it's sort of a geek thing to do, to automate and make efficient your downtime (here's a great example).

Oh, and I also read the NYT and the Economist more, since they now push themselves to me rather than requiring that I click on a link. I am thus notified within half an hour any time a U.S. serviceman dies in Iraq.

virtual:

I bought a big pile of records the other day. I'm increasingly feeling what the luddites of old said about CDs: they're just not fun compared to records. There's no tactile joy, nothing to do with your hands or your eyes. The little CD booklet really doesn't compare to the big square sleeve that hipsters have lately taken to framing and hanging on the wall. There's no ritual to putting a CD in the little motorized tray. As previously noted, if you have to get up every twenty minutes to flip the darn record, you're more likely to listen than if you'd just put a playlist on in the background.

To go even further, walking through Chinatown in Manhattan a month or two ago, I happened upon a pile of 78 RPM records, from circa 1915-1925. They do indeed put the records of the 80s to shame: these things don't wobble or bend, and they weigh something substantial. They feel good to hold. These 78s are vaguely Jewish in nature, like Cohen calls his tailor on the 'Phone (comedy monologue, it says) and the Yiddisher klezmer orchestra. I wish I could hear them.

I was raised more on CDs, though, so the step from CDs to MP3s on a hard drive was a trivial one, since pushing little plastic buttons and clicking on the picture of a button are about the same experience. Now I've got an efficient, streamlined system for playing music that involves absolutely no tactile involvement at all. Perhaps this is why I'm so into good computer keyboards---but compare the keys on your keyboard with the keys on a piano (not to be confused with a MIDI keyboard). In the end, convenience and cheapness will always win out over tactile fun. That's why CDs made records basically disappear, and why MP3s threaten to make the entire concept of music purchased with a tangible physical medium obsolete.

For me, this brings up two questions: first, how far will the virtualization of things go? Will all our media be on screens and speakers; our cars, tools, and other assorted things with buttons replaced with little touch-screens and voice commands; the soft parts replaced by pictures of soft parts; and once-heavy things like glass jars, wood furniture, and telephones replaced with cheap, light, and fully functional plastic counterparts? What'll be left? Which brings us to the central question:

What will we do with our hands?

The visitors from the future are always drawn as having gigantic heads and tiny hands. I wonder if the future really is in not touching things. I guess it can go one of two ways. We may not care at all, since our heads are getting bigger, or we may start to care much more about the things that are basically impossible to replace with non-tactile substitutes: clothing, food, people.

Which is how my RSS feed has made my life better: by streamlining the way I waste time online, I'm forced to read physical books, put my hands under running water, play records, and live more in the tactile world.




01 September 04. Disneyfication in Oz

We've all complained about the Disneyfication of stories, where, as my guest blogger described it: ``All of the disagreeable or immoral aspects had been excised from the story, so that the one presented had almost no bearing on the original story, other than the famous names and the most basic story elements.'' I'd always thought of this as a relatively modern phenomenon, based in celluloid, so imagine my surprise when I read this introduction to a book entitled The Wizard of Oz:

"Folklore, legends, myths and fairy tales have followed childhood through the ages, for every healthy youngster has a wholesome and instinctive love for stories fantastic, marvelous and manifestly unreal. The winged fairies of Grimm and Andersen have brought more happiness to childish hearts than all other human creations.

Yet the old time fairy tale, having served for generations, may now be classed as "historical" in the children's library; for the time has come for a series of newer "wonder tales" in which the stereotyped genie, dwarf and fairy are eliminated, together with all the horrible and blood-curdling incidents devised by their authors to point a fearsome moral to each tale. Modern education includes morality; therefore the modern child seeks only entertainment in its wonder tales and gladly dispenses with all disagreeable incident.

Having this thought in mind, the story of "The Wonderful Wizard of Oz" was written solely to please children of today. It aspires to being a modernized fairy tale, in which the wonderment and joy are retained and the heartaches and nightmares are left out.

L. Frank Baum

Chicago, April, 1900."

Any further comments I'd had on the subject are obvious and left as an exercise to the reader.

In an attempt to keep with the blog's current title, I should also mention some of the story's economic aspects. It has been posited in no less than the Journal of Political Economy (the University of Chicago's house journal) that the story is an allegory of the gold-standard debate (gold is measured in ounces = Oz, but it was her silver slippers which saved poor Dorothy, et cetera). But, alas, the political-allegory interpretation is the entirely false offspring of a too-vivid imagination. This article gives an interesting account of the rise and fall of the gold-standard interpretation.





12 November 04. The death of the mystery of travel

At the airport today. Now that they ask you to kick your shoes off when you arrive, it feels more like home. The TSA just needs to offer us tea at the security checkpoint too. So I'm sitting here in my socks, watching airplanes take off, laptop plugged in, headphones on. With my eyes closed (nothing reduces eyestrain like typing with one's eyes closed), I can fool myself into thinking I'm at home right now.

I've been working on this setup for quite a while now. Decent headphones, the laptop, a nice travel mug: it's all part of my ongoing efforts to make myself entirely location-independent. Drop me off in Caracas and I'm ready to write models and papers. [Yes, Venezuela uses the same voltage.] I've moved approximately once a year every year of my life, so everywhere I go, I have in the back of my mind that it's still temporary, so I shouldn't get too attached to anything weighing more than a few kilos. Not driving has added to this on a different scale: if I'm on the other side of town without some little box that I desperately need, it's an annoyingly long trip to get back to it. If I can't throw it in a bag and take it with me, it's not worth depending on. Back in the day, if you wanted to leave the house or work, that meant doing without certain things; now you never have to make do again.

This thread all began with the camp stove, which is the same mentality for the outback: just `cause you're out in the middle of nowhere doesn't mean that you have to make do without tea. Gosh, solar technology is fast improving, so feel free to bring your laptop too (or at least your Palm Pilot and keyboard, which I've done). The boring entry from a few days ago was intended to help with making sure your away computer is as much like your home computer as possible.

Or ice camping. You go out into the country, after a heavy snow, where everything is blindingly white and you can't hear anything but that indescribable sound of snow crunching, and then when you stop you can't hear anything at all and you wonder if the entire Earth has fallen off the face of itself. Having found such a spot, where it's nothing but you and white snow and white sky and cold, you take out an axe and start carving in the snow. You can use your tent or you can make yourself an igloo. You can carve chairs for yourself, around the camp stove you bought, and there, you and endless whiteness can have tea.

I remember the first time that I'd flown as a conscious person (as opposed to as an infant), which was in high school. I spent the entire flight glued to the window---especially the take-off, which is one of the greatest achievements of humanity that you can literally feel in your gut. Now, shoeless, I sleep through them. If I am awake, I'm back on the laptop, pretending I'm at home.

Oh, I know that travel is supposed to be transformative and eye-opening, but I've found ways to bypass all of that unpleasantness and potential change. Thanks to the miracles of modern technology, even travel to the most distant lands can feel just like another day at home, albeit colder, warmer, or with people who speak English with an odd accent.




on Thursday, November 18th, zzzzoe said

Yeeucch, what's up with this new format?



10 August 05. Red lines, red lines

But repetition bothers me. This is no doubt derived from my geekdom, which presumes that if something is said, you're done. Saying it twice adds no value at all, because the theory says that the reader's brain ain't any different the second time around.

I've been all political lately, and politics is all about repetition these days. Democrats need an echo chamber, many have (repeatedly) said. 'Stay on message' means that different people all say the same darn thing. You get sick of it, especially when the point being repeated rings false. There were no WMDs, and repeating it over and over again doesn't make them appear.

Something in me takes repetition from an author to mean that he/she/it is out of things to say.

But repeat I did. I'd be too embarrassed to give links, but there are two articles out there which appeared over the course of a week, and one more which will be out any day now, and they're all from very similar templates. They're all about software patents: one aimed at the general law community, one at the engineering world, and one at the policy audience. But anybody who's interested in the topic is probably involved in all three of these worlds, if not a couple on the side. I guess I just feel guilty for recycling sentences.

I got horrible reviews from my game theory students. They thought I was unclear and blah blah blah. To digest and interpret their arguments: I didn't repeat myself enough. Ms MV of South Park, WA, explained to me that repetition is the key to learning, because no two students learn in exactly the same manner. You first present it in a manner perfect for the visual learner, and then in a manner perfect for the kinetic learner, all the while in a manner that the aural learner will follow, and soon they all have it down. Compare with the mathematician's presumption that if you present the information one way or another, then the listener can work it out for his/her/its own self.

Getting back to me, I guess I always knew that the mathematician's belief is false on a base level, because some math books are so frigging hard to read. Before learning econometrics, I spent days of my life staring at Goldberger's 'metrics textbook, wondering what was wrong with me, since every symbol is indeed defined and every concept derived. But now that I'm reasonably confident with the subject, I look back on that book and am still totally lost, because every symbol is defined and every concept derived exactly once.

Oh, I could give you more examples, but you get the picture, eh. I got back the edits on my little bookie-poo, and the editor inserted a great many sentences that begin `As already noted'. I'm trying to learn here, and to not just cut them all with a note saying `this is redundant'.


multiple audiences:
I'm comfortable with them---perhaps _too_ comfortable. It's related to the multiple meanings of words, like trivial. To the mathematician, copying a phone book is trivial; to the layloser, it's a step beyond. To an economist, if the Sultan of Brunei buys a sandwich from a beggar for ten bucks, then the resulting allocation (beggar has ten bucks and nowhere to sleep; Sultan is still a billionaire and has had lunch) is Pareto efficient, while the rest of us think that the allocation is probably still at least a few tens of thousands of dollars away from anything they'd call efficient.
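
For anyone who wants the economists' term pinned down, here is the textbook definition in standard notation (my wording, nothing specific to this example):

    % An allocation x is Pareto efficient if no feasible alternative
    % makes somebody better off without making somebody else worse off.
    \[
    x \text{ is Pareto efficient} \iff
    \nexists\, y \text{ feasible such that }
    u_i(y) \ge u_i(x) \text{ for all } i
    \text{ and } u_j(y) > u_j(x) \text{ for some } j.
    \]

By this standard the Sultan-and-beggar outcome qualifies, since any reshuffling that helps the beggar takes something from the Sultan; that is exactly why the word grates on everybody else's ears.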

My readership, both of this dumb little website and any policy writing I do, is going to be both people who may be smart but who aren't in my specializations, and people who may or may not be smart but are indeed economists or computer geeks or whatever. So what to do with the multitude of words which have a double meaning?

Or let's say that you're in the literary set, and want to describe your camping trip in the context of Tennyson's Ulysses: to strive, to seek, and not to apply bandages until we get home. Those readers who know Tennyson will pick it up no problem and maybe, maybe even chuckle; the other readers will think it's kind of weird.

My own test is that if all my intended audiences will be able to read the sentence based on their understanding and get something out of it, then it's a good sentence. Yes, different readers will read it differently, and some will take me to be very eccentric. The more sensitive ones who are not in the know will realize that something is up---ay, there's the rub [vaguely a Hamlet reference]---and this is why my editor redlined any sentence which is not simultaneously legible on all dimensions to all possible audiences. No use of trivial that would be potentially read as awkward to a congressman; no asides to the economists; no references to textbook legal cases unless I've spent a page giving the histories of the litigants.

For the primary law of the essay---be it academic paper, policy brief, or even book---is that it must make the reader feel smarter. If there's a section headed `don't worry about this part if you're not an economist', then those readers who aren't economists will feel left out.

Oh, what a detriment this law is to the essay that is good by so many other measures. The essay which the reader can interact and argue with, the essay which the reader can really study and learn from, the essay which points the reader to future reading, or more generally, the essay as a work of art. They dragged you to the art museum when you were ten, and you got _something_ out of the paintings, and now that you're older and took some art history classes, you get other things out of it, perhaps closer to what the artist had intended. No point belaboring the definition of art, but as a social norm, visual arts are generally free to be ambiguous and multilayered, while nonfiction text is expected to be clear to the lowest common denominator.

I'm trying to repeat myself more than my incorrect gut wants me to, but this `all essays must speak to only one narrowly-defined audience' thing is not working for me. There are just too many people out there with different brains. Just as we repeat the point with imagery and with abstractions so the visual learners and the engineers alike will learn from the essay, the essay should have content which is pleasing to the artist and the mathematician alike, even if the other party is reminded that there are other people out there who know things that the reader doesn't.





PS:
my bathroom reading of late has been Moby Dick, by Mr. HM of NY, NY. Why didn't anybody tell me this book is so darn _funny_?

It has a chapter on cetology (the study of whales). Have been wondering of late whether this would fly in the modern world. I mean, it's a novel, about life on the high seas, about people, about obsession, and then the guy goes off for pages on types of blow spout and the anatomy of porpoises.

So I wonder how the book would fare under the marker of a modern editor. If the book were written today, would the cetology chapter see the light of the bookstore? Am I being too cynical to presume that all modern editors would just put a big red X through such a lengthy and irrelevant digression?


on Wednesday, August 10th, zoe said

In other words, art by committee sucks. I hear ya, my friend...



10 January 06. Fall of the house of Simpson

George Meyer is generally regarded as one of the driving forces behind the Simpsons. Here are three data points on the guy:



That first interview really is a great interview. If you have limited time, stop reading this and go read it. I've done so about fifteen times, and keep referring to it in conversation. One of the first people to link to my little blog, Mr. PH of Seattle, WA, did so partly because I had linked to the interview in an early post and thus "rescued a great ... interview from the cyber-depths." He said so in this post, but that's just about me so don't bother clicking through on it; instead you should be reading his comments on waterboarding. The poster in the second of the three links above says she cuddles up with that interview at night, and to a great extent I don't think she's exaggerating.

So when I saw the link to the third interview, all I could think was `darn, I'd better send this away to Mr. PH of Seattle, WA, as quick as I can!!' But then I got to reading it, and it wasn't the same. It was, well, bitter.

The jokes in Army Man go back and forth among just surreal, commentaries on pop culture, and just cruel. As many people fall out of windows as complain about coffee. All the cultural commentary and surreality and just a dab of the cruelty transferred over to the Simpsons. And, y'know, in the right context, cultural commentary is easy, because everything we do is a little funny, just by pointing out the minutiæ of life and their futile motivations. `Look at the cover of this religious pamphlet,' a character would observe, `It has a sunrise on it. I feel more serene already.' Our world is filled with such clichés, to the point that we don't even think about them. Much of George Meyer's sense of humor was built around that, and you can see it in interview #1, where he talks about advertising and buying tabs of LSD for five bucks.

I think some would call that sort of thing 'observational humor', and that label usually has a derogatory slant because of all those people who have quipped `you ever notice how many Starbuck'ses there are these days?' and expected us to laugh. As a matter of fact, no, I live in Baltimore, a city that has 640,000 people in city limits and _seven_ SBUXes. Compare with DC: 550,000 and 59 SBUXim (which is indeed enough for two on every corner). But observational humor is funny when it's done by somebody who really does observe the weird stuff that lurks in our institutions and our social structures.

The later interview was different, as the balance between that wide-eyed sense of screwing around with day-to-day pabulum and the cruelty shifted toward the cruel: the model for comedy he cites in this interview is Candide. And ya know, watching people fall down is funny, and will always be funny forevermore, but it doesn't say what Meyer's earlier observations used to say about culture, religion, government, and all that other weird stuff we fill 24 hours of every day with.


on Wednesday, January 11th, ds said

Great links - but i wouldn't say he sounded bitter - think he laughed too much, in a semi-good natured way, usually. actually he explicitly referred to being 'bitter' in the past and basically how he got over it (towards the bottom)

The Simpsons really suck this year. but the first time i said that was ~1994 so i probably can't be trusted. i really do think it's gotten progressively sappier since righty christian groups started praising its family values, which started even before when that new yorker piece came out (eg http://www.snpp.com/other/articles/holyfamily.html). I still watch it tho



11 April 06. Anti-intellectual


Pundit is a term from Hindi meaning “wise and learned man”, but it is usually used sarcastically in modern parlance. But, y'know, I don't feel so sarcastic about it. You can decide the “wise” part for yourself, but having spent a couple of years studying the narrow topic of subject matter expansion in patent law, I am confident describing myself as an authority. It's been months since I've heard a new argument on either side of the debate, and the new facts I'm learning are increasingly fine details. I don't feel any hubris when I say that nobody is going to blindside me on the tiny, narrow bit of subject that I have chosen for myself.

And ya know, most of the arguments that I have presented in various media and to various bigwigs over the last few months are arguments that numerous non-experts have also made.

I often run into people who divide academic results into two categories: (1) things anybody could have come up with after a bit of thought, and (2) things that are too esoteric to be worth anything. Some exceptions are made for chemists and engineers, whose work the commonsense folk have some sense is esoteric but will somehow eventually lead to new toys or a cure for something, but everybody else---the mathematicians who study tensors in R^14, the biologists who study odd tropical flora, and most importantly, the anthropologists and sociologists and economists who study people, whom we all study every day---are wasting their time and our money.

Findings
Nor is the righteous `my common sense trumps their PhDs' attitude restricted to the stereotypical hick. The back page of Harper's magazine, the page most magazines reserve for the humorous finale, is the Findings section, which lists a series of out-of-context study results. From the March 2006 issue: “...It was discovered that guppies experience menopause and that toxic waste in the Arctic was turning polar bears into hermaphrodites. ...A survey found that Americans are becoming less repulsed by the sight of obese people. Scientists launched a study to determine what sorts of clothing make a woman's bottom look too big. A study found that Americans are more miserable today than they were in 1991, and British researchers discovered that many young girls enjoy mutilating their Barbie dolls.”

OK, what are we to make of this? What message is being sent? Mashing together the studies means that the findings do not add up to any real image of the world, even if the page does categorize the findings for some sense of flow. Readers can't drop these tidbits into cocktail party conversation, because they only have one piece of information and so aren't armed for even the simplest follow-up. Interested readers can't learn more, because there are no citations. More importantly, there is no context: we are not given the reason for studying guppy reproductive systems, so we don't know why a scientist would care to do such a thing.

Being the back page, we know that it's supposed to be humorous, and with everything taken out of context, it can be, the way that so many statements out of context or in a different context are funny. But there's also the sense of laughing at the scientists. The subject of every sentence (but the passive-voice ones) is a researcher or a study or a survey. If the editors just wanted to list facts, they'd say “Americans are becoming less repulsed...” but instead they waste ink pointing out that “A study found that Americans are becoming less repulsed...”.

If there were an American Association Against Science, they would probably reprint the Findings page verbatim. The AAAS would ask, in big red letters, "Why are we spending money on this?" and the answer to why would not be anywhere to be found.

But you know that I spend all day studying obscure features of people's behavior and reading math books, so it's no surprise that I'm anti-anti-intellectual. It's no secret that if I had an anti-intellectual in the room here, I'd tell him or her (reading from Harper's again) “New data suggested that Uranus is more chaotic than was previously thought.”

[See, statements in a different context are downright hilarious!]

But it goes further than my kind of academic. The anti-intellectual sentiment--the insistence that it's either common sense or it's not worth the trouble--is a belief that there is no such thing as an expert. It is the myopic belief that if I don't know it, then there's nothing to know. As such, the anti-intellectual sentiment is often aimed at targets far afield from intellectuals.

At the Baltimore Museum of Art, the same establishment that houses Picasso's Mother and Child, hang such aggressively simple works of art as two silkscreen reprints of the Last Supper and a curtain of blue and silver beads. Some readers will recognize the first as a work by Andy Warhol, and thus know the context: Mr. Warhol felt that the repetition and mutation of familiar images created new perspectives. For the second, as for a great deal of art that was clearly easy to execute, we don't know the context at all.[1] But even though we don't know it, there is a context. The guy went to art school, has had a few focal ideas that drove all his work, and has done years of pieces that led to this simple bead curtain.

So what is an expert to do? One approach is to always stick to things that are obscure and look hard. Make sure that every study, every work of art, every essay says fuc* you, I'm an expert and you can't do what I do. But we value people who make it look effortless, whether they're figure skating, producing a painting, or running regressions. We always value simplicity, so if all it takes to get across the message is a curtain of beads, then why overcomplicate things to remind the viewer that it took years of work to get there? Some of the best guitarists out there never really ventured past four chords, while the guys who can play intricate solos are often dubbed wankers.

I'm glad I wrote my PhD thesis, and love the idea of a thesis in general, whether for high school seniors, BAs, or anywhere in between. A good thesis means that the author has become an expert in some tiny, irrelevant little corner of the world. Research ability by itself is valuable, and it's good practice for when the student needs to be an authority in something of more practical value, but it also gives the student an idea of what the other experts of the world have gone through to get to their simple ends. Remember that part in Zoo Story where the guy says that “sometimes it's necessary to go a long distance out of the way in order to come back a short distance correctly”? A student who has gone a long way in becoming an expert, and must then reduce that to the sort of ten-second summaries that we all give to friends and family, will have a better understanding of the long distance that other experts have gone before they could string together simple words or beads or chords.


Footnotes

[1] Sorry, I can't help the art snobs in the audience with the guy's name. Enjoy being in the dark with me here.



[link][no comments]



26 October 06. Is Ruby halal?


The starting point here is last episode's essay on programming languages, and this here is basically an explanation and generalization of why I wrote it. For those who didn't read it (and I don't blame ya), here's a summary in the form of a description of my ideal girlfriend: she should be an Asian Jewess, around 172-174cm tall, gothy, sporty, significantly smarter than me, significantly cuter than me, significantly better socialized than me, willing to hang out with me, very well organized but endlessly spontaneous, enjoys walks along the beach, does intellectually challenging work that involves being outdoors, and plays guitar in a rock band. Yeah.

So: too bad half of those things contradict the other half, eh.

The first key difference between the problem of picking a programming language and the problem of picking a significant other is that the programming language doesn't have to like you back. The liking-you-back issue creates many volumes' worth of interesting stories, all of which I will ignore here, in favor of the other key difference: unlike many girl/boyfriends, programs are often shared among friends and coworkers, meaning that there are externalities in my arbitrary, personal-preference choice.

Personal preference plus externalities is the perfect recipe for never-ending, repetitive debate.

Debating the undebatable
Under Jewish law, one must never say the Name of God. In fact, there is none--it's sort of a mythical incantation, used to breathe life into Golems and otherwise tell monotheistic fairy tales. Under Islamic law, one must speak the Name of God when slaughtering an animal for the animal's flesh to be halal. My reading here is that there is therefore no way for meat to be both halal and kosher.

And let's note, by the way, that kosher and halal laws are not cast as rules about keeping clean for the sake of disease prevention. They're ethical laws, meaning that, like personal preference, they can't really be debated. It's not like somebody will finally find the correct answer and write it down for everybody to see. We can't even agree to basic axioms like `you should be nice to people' or `don't be wasteful'.

Do ethical laws induce externality problems? From the looks of it, yes they do, because so many people spend so much time trying to get other people to conform to their personal ethics. Ethics are an extreme form of that other personal preference, æsthetics, and seeing somebody commit what you consider to be an unethical act is often on par with watching somebody wearing a floppy brown sweater with spandex safety orange tights.

Fortunately, almost everybody understands that there is no point going up to Mr. Brown-and-orange and telling him he needs to change, because we all know exactly how the conversation will go: some variant of `I have my own personal preferences' or `who are you to impose your arbitrary choices upon me'. That is, it would be a boring argument, because there is fundamentally no right answer.

When does human life begin? I have no idea, and anybody who says otherwise is guilty of hubris.

Gee, that was a fun debate, wasn't it.

And the problem with that non-debate, as with this essay, is that it has no emotionally satisfying conclusion. The natural form of a debate is for one side to present its best arguments, the other side to present its own, and then both sides go home and think about it. But the form of debate that is emotionally satisfying has a resounding conclusion, where one side tearfully confesses to the other, `OK, I was wrong!' But with arguments of ethics or personal preference, this sort of resolution happens about once every never.

But there's a simple way to fix this problem: invent statistics.

After all, not all debates are mere issues of personal preference. A question like `will building this road or starting this war improve the economy' has a definite answer, though we're typically not smart enough to know it. There are valid grounds for debate there.

But for ethics and personal preference issues, we can still make it look like there are valid grounds for debate. Find out whether abortions decrease crime [the paper that claims this, by Steven “Freakonomics” Levitt and another not-famous economist, has been shown to be based on erroneous calculations (PDF)], find out whether people commit more errors when commas are used as separators or terminators, run benchmarks, accuse the author of the file system you don't like of being a murderer. With enough haphazard facts, any debate about pure personal preference regarding simple trade-offs can be extended to years of tedium.

This turns debates that should be of the natural form (both sides state opinions, then go home) into the resounding form of debate, where both sides attempt to get the other side to tearfully confess the errors of its ways. But the sheen of facts doesn't change the fundamental nature of debates over ethics or personal preference, and because these are debates where nobody is actually wrong, nobody will ever be convinced to bring about an emotionally satisfying conclusion. We instead simply have a new variant on the recipe for tedious, never-ending debate.

Relevant previous entries:
The one three years ago when I advocated C, and came off sounding very reasonable, I think.
The one where I complain about an especially vehement set of proselytizers.
The one where I talk about the value of stable standards.




[link][2 comments]

on Sunday, October 29th, rd said

-i think Levitt claims the mistakes weren't significant and dont alter his main conclusions

-your claim that ethical debates are fundamentally unresolvable might also be a matter of opinion. Eg, some people might think that human life starts at x and this can be proven axiomatically, we just havent figured out how to do it yet. it's possible (but highly unlikely?) that at some point 'somebody will finally find the correct answer and write it down for everybody to see', no?

on Monday, October 30th, the author said

Yes, Donohue and Levitt have a response (PDF) to the claims. I have not had time to really look at the `metrics on any of the papers involved; if anybody has, let me know. But the key allegation is that the original abortion-prevents-crime paper used ln(arrests) as the dependent variable, and if you redo the regression with the much more sensible ln(arrests per capita) and jiggle the variables a bit, then the effect disappears. In the response linked above, Donohue and Levitt respond that if you do use ln(arrests per capita) and jiggle the variables more, then the effect reappears.

In short, we have a specification fight. I think we should throw out specifications of the regression with ln(arrests). The authors of the critique, Foote and Goetz, found a valid specification where abortions have no relation to crime, and then Donohue and Levitt found another specification where abortions reduce crime. For my part, all I can do is modestly and respectfully comment that I told you so.
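If you want to see mechanically why ln(arrests) is the suspect dependent variable, here's a toy sketch--invented data of my own, emphatically not the Donohue-Levitt panel, and statsmodels rather than anybody's actual code. When population happens to correlate with the covariate of interest, ln(arrests) picks up a spurious effect that ln(arrests per capita) doesn't:

    # Toy illustration of the specification fight -- synthetic data,
    # not the real panel. There is no true effect anywhere below.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 1000
    abortion_rate = rng.uniform(0, 30, n)      # hypothetical covariate
    # Population correlates with the covariate; per-capita crime does not.
    log_pop = 12 + 0.05 * abortion_rate + rng.normal(0, 0.2, n)
    log_crime_pc = rng.normal(-4, 0.3, n)      # ln(arrests per capita)
    log_arrests = log_crime_pc + log_pop       # ln(arrests)

    X = sm.add_constant(abortion_rate)
    print(sm.OLS(log_arrests, X).fit().params[1])   # spurious, near 0.05
    print(sm.OLS(log_crime_pc, X).fit().params[1])  # correctly near zero

Neither regression settles anything about the real papers, of course; the point is only that the per-capita specification is the one that isn't fooled by population growth.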



21 November 06. Is IBM evil?


In the 1940s, a number of IBM's subsidiaries assisted the Nazi government in implementing the logistics of the Holocaust, to some extent being entrepreneurs who originated some of the ideas that made the whole thing possible. For example, every serial number tattooed to a victim's arm corresponded to a punchcard manufactured and processed by IBM. The involvement is so well documented that even IBM doesn't deny it. When Edwin Black organized the facts and wrote the story for the lay-reader in a book entitled IBM and the Holocaust (Black, 2001), IBM's official response was to play down the book by pointing out that it didn't say anything that a host of historians didn't already know.

But what does this piece of history say about the IBM of today? By way of discussion, and for the sake of not sounding like a crackpot (always a hazard when talking about the Holocaust), I offer a few social science approaches to the Holocaust before returning to the question of what IBM (and you) should do today.

Game theory
Game Theory begins with a situation in which people have an abundance of actions they could take, but the best action for each person depends on the actions other people choose. John Nash proved that given a few simple technical conditions, there are always equilibria, wherein everybody implicitly agrees to behave in a certain manner and nobody can do better by unilaterally deviating. This is why he won a Nobel prize and has books and movies about his life.

The problem with Nash equilibria is that there are often many of them. One or the other may be more likely, but before the fact, they're all possible. For example, everybody in England chooses to drive on the left, whereas in most other countries the equilibrium given the same situation is for everybody to drive on the right. Why'd it happen one way or another? Historians who have studied the question can amalgamate all the random events into some compelling stories, but here's my own summary: it's basically arbitrary.
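To make the multiple-equilibria point concrete, here is a toy script--my own illustration, nothing from any game theory package--that brute-forces the pure-strategy equilibria of the driving game:

    # Two drivers each pick a side; matching is fine, mismatching is a crash.
    import itertools

    sides = ["left", "right"]

    def payoff(a, b):
        return (1, 1) if a == b else (-10, -10)

    def is_nash(a, b):
        # Nash check: neither driver gains by unilaterally switching sides.
        ua, ub = payoff(a, b)
        return (all(payoff(alt, b)[0] <= ua for alt in sides)
                and all(payoff(a, alt)[1] <= ub for alt in sides))

    print([p for p in itertools.product(sides, sides) if is_nash(*p)])
    # prints [('left', 'left'), ('right', 'right')]

Both answers are stable, and nothing inside the game picks between them--that choice is left to history, which is to say, to the arbitrary.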

If we had asked people in the Germany of 1935 whether they would assist in mass killings, all but a handful would have said no; yet in the Germany of 1945 we found enough Germans who said yes that mass killings were efficiently and extensively conducted. Why'd Germany as a country choose this approach to getting out of the depression when other countries just chose to have people build extraneous public works? Why'd the society switch from a peaceful equilibrium to a violent one? Many thousands of pages have been written on the subject, the basic conclusion of which is: it's basically arbitrary.

To be literal about Game Theoretic examples, consider Chess. There is nothing inherent to the setup of the game that causes a given outcome. Sometimes white wins, sometimes black wins, depending on what the people playing the game do. Similarly in social situations: sometimes one side prevails, sometimes the other, and we never know which it will be until after the fact, and if we put similar people in a similar situation, the other outcome could easily prevail. Conversely, equilibria which occurred elsewhere in time or space can always crop up again; the best we can do is try to bias things in one direction or another, by taking away black's knight, setting social norms about not killing Jews, or establishing rules explicitly outlawing hate crimes.

What the historians say

Hannah Arendt wrote the seminal book on the question, Eichmann in Jerusalem (Arendt, 1963). This is the book that coined the phrase `the banality of evil' to describe people like Eichmann, a dull bureaucrat who didn't think twice about the implications of his paper shuffling.

The moral (one of many): any organization is capable of evil, because the size of the organization allows an action to be broken down into bite-size, palatable pieces to be farmed out to people who would never approve of the whole. Their individual roles seem trivial and relatively blameless, and just as the officers at Nuremberg claimed that they were “just following orders,” everybody in an organization has somebody else they can point their finger at. Yet the end result is an equilibrium which nobody would have volunteered to bring about.

Ronald Wintrobe, in the final chapter of The Political Economy of Dictatorship (Wintrobe, 1998), extends the story that any individual only makes a marginal contribution by describing the bureaucrats who are entrepreneurs within the system, working hard to have a more-than-marginal influence. They advance in the bureaucracy by taking the initiative and having ideas which will help the organization achieve its goals more effectively. They do not `just follow orders' but take action to help the world along to the evil equilibrium. Wintrobe says Eichmann was a bureaucratic entrepreneur of this sort; Black shows that the heads of IBM's German subsidiary were. Such entrepreneurs always exist, pressing the society to move toward the evil equilibrium, in a manner that creates business or influence for them.

These authors show us the structure of the evil equilibria. There will always be people who are callous to moral considerations and will attempt to shift the organization to their benefit and the detriment of the rest of the world. Then, most of the people who have to take action to bring about a bad outcome can't see the big picture and so have no idea where their actions are leading. So the protest singers have the right idea: large organizations (IBM, the government) have a comparative advantage in implementing evil equilibria, and we need to maintain especial vigilance over them.

Within this context, the big question is: what can these organizations do to ensure that the organization won't fall into an evil equilibrium, either through manipulation by bureaucratic entrepreneurs or just by wandering into them?

The IBM question
Let's return to present-day IBM, and the question of whether they're evil. Yeah, their laptops are all an evil-looking black; they make a server called The Intimidator; they're a big, blocky bureaucracy like every other big, blocky bureaucracy. But that's mere cosmetic evilness. As should be clear to this point, evil does not hit anybody over the head, and those who say that IBM's German subsidiary didn't know what the Nazi regime was up to are to some extent correct [but see Edwin Black's comments].

But today's IBM chooses to take a simple, insidious course which exacerbates its past: it tries to forget.

In a press release discussing Edwin Black's book, IBM states that it “[...] looks forward to and will fully cooperate with appropriate scholarly assessments of the historical record.” This follows discussion of the logistics of which universities house IBM documents. The message is clear: nobody in Armonk, NY, would be willing to operate a gas chamber, so the matter is a “scholarly” and “historical” question.

If the important moral question were “Should IBM be held accountable and pay reparations that would affect its balance sheet?” then IBM's insistence on averting its collective eyes makes sense--the IBM of today doesn't want to have to pay the debts incurred by the IBM of yesterday. But there is a far more important question: how do we keep such things as genocides or mass internments from ever happening again? This is the question that affects us today, and the question that IBM can best contribute to, and yet seems to go out of its way to avoid.

The above press release was written in February 2001, so IBM didn't know any better, but the follow-up of March 2002 doesn't seem to say anything to change the claim that this is a question for researchers, not the people who head today's organizations and build today's machines. IBM's business conduct guides say nothing about refusing business from parties with suspect intentions or who aim to trample the rights of citizens [as of 2 May 2003]. As far as I could ascertain from their publicly available information and from correspondence with employees, IBM has made no changes that would ensure that its bureaucracy can not re-entangle itself in those past misdeeds which it “categorically condemns.”

Generalization
From IBM's second press release discussing Edwin Black's book: “A review in The New York Times concluded that the author's `... case is long and heavily documented, and yet he does not demonstrate that I.B.M. bears some unique or decisive responsibility for the evil that was done.'” Here is the full review. I agree with the reviewer: IBM was not unique or decisive.

IBM is the paragon for this essay because their work is dull and doesn't seem related to anything like what we picture oppression on a mass scale to look like. Also, there is nothing hypothetical about their situation: a subsidiary did provide substantial assistance to Germany's eugenically-oriented goals, and its official statements of today do make an effort to forget that. Yet everything we could say about IBM we could say about any other organization or person: each of us is capable of assisting in evil, there are situations which would tempt any one of us to do so, and all of us are more comfortable just not thinking about it.

Many people with whom I have discussed this topic point out that government-sponsored genocide is unlikely in the USA, so the game is fundamentally different. This would be to see white win a dozen games and to assume that this means black can never win. The game may be biased toward white, but that is by no means a proof of impossibility. Over the lifetimes of our elder citizens, the USA has gone through many periods which we collectively look back on and exclaim, `What were we thinking?' How did Japanese citizens wind up spending years imprisoned in internment camps for no reason? How did McCarthy manage to ruin the lives of hundreds of political enemies? Forty years ago, lynchings weren't prosecuted as crimes. No, this stuff wasn't genocide, but it certainly wasn't OK, either.

Others I have met contend that the situation was much more ambiguous in the 1940s than it is now, and it wasn't so clear-cut that IBM shouldn't have been involved. This is entirely the point. If it happens again, it will be just as not-clear-cut until after the fact, so we must plan for it before it happens.

You
So ask yourself, given that you have perfect retrospective knowledge of history, what you would have refused to do. Would you have supported and aided in the registration of minorities? If not, then you should not support it now. Would you have accepted that other people around you were being imprisoned without habeas corpus? If not, then do not just assume that things will turn out differently this time around. Would you be comfortable if your boss asked you to work toward ethically suspect activities? Rather than worrying about it if it happens, make sure that your organization has rules in place now to ensure that such a situation can't happen in the future. There will always exist amoral bureaucratic entrepreneurs pushing us toward an evil equilibrium, but we can do a lot to lower the probability that they will succeed.

The game is not different. All of the ingredients of the situation of Germany or the USA in the 1940s are around today: we have bureaucracies, different races and countries, a government, and people. I'm not proclaiming that the sky is falling, and am not predicting genocides. But conversely, many people look at the horrors of the past and see them as something which was committed by monsters who are incomparable to the noble souls who populate the world now. But it was subtle, and if it happens again, it will be subtle again. Some arbitrary sequence of events could push us toward an evil equilibrium just like before, and there are no new safeguards in place. The only difference between now and then is that we have the experience of history, marking red flags along the way. To see those flags and do nothing about them would be, well, evil.

About the author: I wrote this essay on an IBM Thinkpad--one of eight I have owned (mostly Thinkpad 560s and Thinkpad 570s). I recently refused a job interview solicited by a contractor for the Department of Homeland Security.

Relevant previous entries:
The one about the folly of talking about IBM's will at all.

Bibliography

Hannah Arendt. Eichmann in Jerusalem: A Report on the Banality of Evil. Viking Press, 1963.

Edwin Black. IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America’s Most Powerful Corporation. Crown, 2001.

Ronald Wintrobe. The Political Economy of Dictatorship. Cambridge University Press, 1998.





[link][a comment]

on Saturday, January 1st, Jack said

IBM IS EVIL.



30 August 07. Neil Diamond, "America"


This song should be the USA's national anthem.

It is the only song about the United States of America that doesn't just make me cynical. The current anthem is something about watching a battle over Baltimore's Fort McHenry, which is not something many of us can really relate to. Other songs that are more direct and keep saying things like “at least I know I'm free” just remind me of the eternal vigilance that is the price of that freedom, and how many little flaws that freedom has today.

I once heard an ad executive on the radio, talking about how the USA can sell itself to the parts of the world that dislike it. Sorry, I'm not going to be able to find you a proper reference. He said that if you gave his agency a million bucks, it wouldn't be able to come up with a better tag line than the land of opportunity. It's a moniker that says nothing but optimism, hope, and prosperity.

So that's why I love Neil Diamond's America. Despite an occasional jump to third person, it's from the perspective of the people who still have that optimism that things will be better in the USA, and feel that it's worth giving up everything for that optimism. It has impact because there really are people like that, and we all recognize that expressing your optimism about a country by leaving behind friends, family, and everything you know means a lot more than expressing love of a country with a bumper sticker.

When I was a kid, by the way, this was one of only two or three English-language albums that we had in our collection (being that I'm an immigrant myself; see prior column). So part of my affection for the song is that I heard it a few hundred times. I always thought the cover of Hot August Night looked smarmy, though, and perhaps it's why I still dislike jeans jackets.

Of course, it'll never become our national anthem, because it's about immigration, from the first person perspective. It's about a naïve optimism about the Land of Opportunity.

Here is a snippet from a New York Times article about a pair of anti-immigration politicians:

[The anti-immigrant politicians'] main arguments for ridding the town of illegal immigrants come down to this: their presence has led to both rising crime and overcrowded schools. As it turns out, however, the crime rate in Carpentersville has actually been cut in half over the past 10 years; and while the schools were, indeed, overcrowded four to five years ago..., class sizes have now been reduced — although it did require the passage of a tax referendum. [From “Our Town”, by Alex Kotlowitz Published: August 5, 2007]
It is much like a dozen comparable anecdotes from all over the country (e.g., here).

Opponents of immigration present a standard syllogism for why the USA must bar the door: (1) the USA is fast going downhill, (2) it is going downhill because of immigrants, and (therefore) we must bar the door to immigrants. Without premise (1), the conclusion loses its power. Without premise (1), immigration opponents are left with abstract economic arguments about how things are OK now, but will all fall apart any minute now--and even that is only plausible when there is some tangible present evidence that opportunities are only barely forthcoming.

In short, opposing immigration requires manufacturing a perception of scarcity.

More next time.



[link][no comments]



14 September 07. My immense disappointment with the software industry


I dislike Jane Austen novels. Any novel by any author is about people, even in sci-fi where the people are dressed up as sentient chunks of silicon. The characters do things, things happen to them, and the novel is interesting for the reactions of the characters. But Jane Austen novels cut to the core, because nothing actually happens. The light coming through the swaying tree branch makes it look like one character flinched, the other responds to the perceived flinch, the first responds to the response, and soon you have a plotline built out of nothing. I acknowledge that others love her writing, but I can't stand it.

Next point. To a great extent, ethics is about efficiency. If I told you that most of ethics is about individual self-interest versus achieving the most benefit for the group at large, you'd wonder whether to class that as self-evident or a tautology. The acts we find most deeply unethical--the smashed car window to steal somebody's favorite CDs, the tire company that shipped tires that fall apart and kill people because the accountants thought they could save two cents an hour in labor costs--are the ones that are most destructive for the least gain.

So I'm finding the politics of software to be increasingly onerous. The politics of coal mining is about people finding jobs, the production of heat and electricity, and environmental destruction. Things happen, and on that backdrop, people do the usual asshole things to promote their self-interest and get more money than the other guys. Human blood was likely spilled in the examples of unethical action above.

But there is nothing tangible to software. It's just information, whose best claim to actually existing is a weak differential in a magnetic field. A few episodes ago, I wrote about people who were tortured to advance Yahoo!'s hopes of transmitting its weak electromagnetic disturbances across China, and the time before, I had written about how the US Trade Representative has actively worked to ensure the death of thousands (if not millions) of people with AIDS in the name of ownership of information. And hey, how about a story about a man who threatened to kill a child so he could keep putting out spam.

Look, I know all the arguments for why software and intellectual property matter, and respect most of them. When I'm not writing about software, I'm writing books and law review papers about software. If computers didn't exist, I have no idea how I'd buy food. But I can still acknowledge that it is all smoke, shadow, and æther.

Can software make the world a better place? No doubt. Information, human connection, learning about the world through data analysis beyond human ability, fun--we can get all of that from software. And further, well-written software is write-once, use forever, because copying code is so easy. I really have downloaded, compiled, and used software written in the 1980s. I use the GNU's edition of grep a dozen times a day, as do literally millions of people around the world, and it hasn't seen any substantial changes in five years--even the changes in 2002 weren't so substantial. That is amazing value for effort.

Tragedies are not about people being stupid and dying, but about people falling from grace. The software industry's fall from infinite abundance to endless bickering is a tragedy.

Even the ones who are allegedly on the pro-efficiency team are asses. Just today I read about the GNU's lawyer having a drag-out argument with the guy who named the O'Reilly Press after himself. I won't bother linking to it--you'd be better off spending your time reading Jane Austen directly. Gosh, I've received a nastygram or two from the lead author of a stats package because s/he didn't like the concept behind my stats library. Pardon my yelling, but THEY'RE FUNCTIONS TO SHUNT MATRICES. THERE IS NOTHING TO POLITIC OVER. You use yours, I'll use mine, and at worst there's some duplicated effort.

“But B,” you protest, “hadn't you heard? People suck.” And I would say that yes, I realize this, but I'd been hoping for something a little better in the new world. I mean, it's just so easy out here, like a fantasy novel. You want a castle on a cloud? No problem, get coding. Write your fiction, and it will compile and run and on a good day spit out correct answers. My real failure was in thinking that people's suckiness came from scarcity, so in a world without scarcity, people wouldn't suck. But no, people just make something up and call it scarce. When you light your taper at my torch, my flame is no dimmer--but now I'm not the only guy on the block with light.

So where we could have invented abundance, people die and are tortured over invented scarcity.

Manufacturing scarcity
This is all an extended example of a class of actions that create ethical qualms in virtually everybody. They take a situation of general abundance and well-being, and manufacture scarcity therefrom. That's why it's unpleasant to talk to most anti-immigration activists: as per the last column, they take the world around us, a Land of Opportunity, and cast it as a wasteland of scarcity.

Don't manufacture scarcity is the neoclassical, capitalist, free-trade equivalent to the old track, “Kids, be free. Do whatever you want, be whoever you want to be, just as long as you don't hurt anybody.”

It's a simple and easy way to maintain some ethics, and all of the above examples manage to blare right past it. The hardcore neoclassicists are surprised that you can do things in a neoclassically perfect system that still cause others harm, like building barriers between them and the things they need, which you can then promise to drop should they pay the right fee.

If we agree that people are subjective beings, and that their perception of value depends on things besides just their immediate gut need, then it is possible to create scarcity simply by claiming that there is scarcity, as does the anti-immigration team with its constant hammering away at how the USA is a land of scarcity.

The problem is that the whole system rides on scarcity. Some of these forms of manufacturing scarcity simply consist of withholding your services or goods from others until they pay up. We think this is OK, and the system clearly won't work without that type of scarcity. For most of the world's desirable goods, there is simply not enough to go around--our infinite desires have to live on a tiny planet.

But there are other cases where the scarcity is an artificially built wall keeping folks from what they would otherwise have access to, like the formula for a medicine or a job opportunity. Intellectual property law is designed from the ground up to create scarcity in the use of an idea, which is why IP law makes so many people queasy: just because some dude is the first guy to come up with some idea, he has the right to manufacture scarcity of that idea the world over? Franz Kafka died in 1924, but if I disseminate his writings far and wide, I'll get sued? It takes a few steps of argument to make that scarcity make sense; sometimes the argument works and sometimes it falls flat.

As usual, I have no conclusion, but intend only to point out that even though our system depends on the existence (and sometimes creation) of scarcity, there are still distinctions between creating scarcity in one's own goods and services and building artificial barriers so you can charge a fee to drop them. Free marketeers see all manufacturing of scarcity as necessary and good, and the hippie kids see all manufacturing of scarcity as evil, but both extremes go nowhere. It's a case-by-case question that doesn't admit sweeping generalizations like `all patents are bad' or `all free market actions are Pareto-improving'.



[link][a comment]

on Saturday, September 15th, A librarian you know said

Listen-okay, I know how -much- you hate Miss Austen, but what did she do to you? I promise if she was around today she'd be the coolest kid in school and definitely opposed to software patents. AND she'd know how to persuade others of their evilness. (smirky)



24 December 07. Social technology


Technology has indisputably advanced. You don't need examples from me to see that progress marches on, and that we humans are doing a lot better than we were a millennium ago. There were mysteries that we couldn't solve even a decade ago that are just run-of-the-mill now.

Meanwhile, you may recall my discussion of the Byzantine-Ottoman war. The interpersonal conflicts we have today are exactly the ones we were having a thousand years ago. People read Machiavelli or The Bible because these books address interpersonal conflicts that we have at the office all the time.

There are a dozen clichés about the contrast: the more things change, the more they stay the same, and history repeats itself, and maybe a few lines out of Shakespeare about the immutability of human nature. But it's worth asking why. ¿Why has our wonderful ability to invent new means of doing things and new technologies had absolutely no effect on how we go about interacting with other people? ¿Why does history repeat itself when biology and mathematics and physiology don't?

Here's my definition of Economics for the day: Economics is the study of conflict as a technological problem. We take the same tools that we used to study integrated circuits and populations of beavers and apply them to the problems of allocating resources among humans. And if you don't like how Economists are doing it, then there are other fields that cover the same questions of how people interact, with different approaches and methods: Sociology, Game theory, Anthropology, Political Science, Law.

There's a fundamental optimism to it, that if we think hard enough about conflict, then we can resolve it better. We can actually build upon history rather than repeating it.

When I was at a Social Sciences department in a traditionally physical sciences school, we often got a line in the way of, `well, the social sciences aren't really a hard science.' The standard witty retort is that no, they are much more of a hard science, because human behavior (especially in groups) is the hardest problem science has ever faced. Underlying that little witticism is a faith that the questions are hard but still solvable. Yeah, people's emotions and reactions are hard to predict, but we used to say that about the weather, and our three-day forecasts get better every year.

I'm increasingly losing faith in the fundamental optimistic premise of Economics. Believe it or not, the Iraq war was a big part of my loss of faith, because it destroyed any illusion that I'd had that I was somehow on the edge of history. Nope, I'm just Generation X, a demographic whose key distinguishing feature is its complete lack of any distinguishing features. The real problems of how people behave in groups--the Shakespeare plots--aren't going to be solved, aren't even going to be step-toward-solved, by anybody alive today. We have them better classed and categorized and dissected (Prisoner's Dilemma, Stag Hunt, Matching Pennies), but the final step of teaching people to behave differently when faced with a Stag Hunt is still well beyond anything we economists could even contemplate.

Also on the list of completely obvious statements: a sense of progress is important. To a great extent, I think this is why we geeks are more interested in doing math and building stuff than in dealing with people. It's not that doing math is somehow easier, cleaner, or even less political--the math journals are still edited by humans, after all. But once you've proven a theorem or built a new toy, there it is, and the technique is now a part of our stock of elements, forever more. Meanwhile, any conflict you have now will be had again, possibly by you and the same other party, over and over. Once you kick the bucket, the theorem is still proven, but everything you learned about interpersonal anything is lost forever, and will have to be relearned by the next batch of people.

All of which inexorably leads us to the question of what our technological progress counts for when our social technology is resisting any sense of progress at all. We can rephrase the statement that history repeats itself to say that history never progresses, and I couldn't imagine anything more disheartening.



[link][5 comments]

on Wednesday, December 26th, someone in DC said

Now, wait a minute. I sense your cynicism oozing from this one, ahem. Do you really think that just because social technical progress hasnt occurred in the 30-odd years of your life that it isnt happening? I mean---thats not much time to really gauge human change, no? I guess I feel that its an easy-out to focus all human energy on reliable mathematical and hard science truths (because they can be re-proved and easily defended). If everyone took your perspective (give up on emotional human study) and focused on purely the purely hard sciences, wed never get closer to understanding our human boundaries.

on Wednesday, December 26th, sue doc said

Are you saying MySpace isn't social technical progress???? WHAT ABOUT MYSPACE?????

on Wednesday, December 26th, the author said

It's not just my first-hand observations over the last few decades, but how they match up with history and literature over the millennia. I mean, lines about how we can learn from conflicts we read about in books from two hundred years ago and histories from millennia ago are so common as to be cliché.

I'm inclined to put the burden on you, dear anonymous reader in DC, to point out what has changed, Facebook and MySpace aside? What pieces of history are you willing to say are obsolete and will never repeat?

on Thursday, December 27th, me said

Wow this is a profoundly depressing view of humanity and of social science. Do you really believe this or was it just a bad day!? I think there is progress. I think things like the foundation of the ICC or the passage of certain labor laws show this. Of course there are steps backwards, but I still think there is progress.

on Monday, December 31st, Sarah said

What do you mean by social technology?

In certain areas, where certain philosophical ideas have caught on, we've gotten human rights law, passive resistance from Gandhi and MLK, labor protections in Europe and somewhat in the US, and more protections for women, children, and minorities than before, in some countries. That's a kind of progress, though it's not based on clean-cut assumptions as the sciences are, and thus harder to determine, I guess.

I suppose making laws is a kind of progress, in an ass-backwards way. This is probably obvious. People made certain laws because they found that it benefited the society over all, though not necessarily the individual on the short term (taxes, e.g.). Of course, this isn't the kind of thing that would spread like wildfire, like certain kinds of technology would. And then, some groups have got different laws that they like better, so fuck taxes! And then war starts. Or something. Yeah, it's too bad.
