• evan

blog moved

As described elsewhere, I've quit LiveJournal. If you're interested in my continuing posts, you should look at one of these (each contains feed autodiscovery metadata as well):

http://neugierig.org/software/blog/ -- (new, just created) the continuation of this.
http://neugierig.org/software/chromium/notes/ -- detailed posts about Chrome.
http://neugierig.org/ -- small life updates, in a few sentences each.

I've used this blog since November 2003, when I decided I should start writing things online so that I could learn from it: partly because writing a thought out helps crystallize it, and even more for the comments I get from readers who know more than I do. I would like to continue to hear from you. Please don't be afraid to email.
  • evan

memcache job offers

I get occasional recruiter spam that specifically calls out "my work on memcached".

This is pretty funny because all I did was make some trivial contributions seven years ago (!), back when it was still Brad and Anatoly's toy. (Looking at the logs now, it appears to be right during my brief attempt at using OS X -- I guess I did the first patch for OS X support. And the first and only change to the AUTHORS file mentions me in its commit message, despite not actually putting me in the file.)

I can only imagine people are looking at the ohloh kudo rank, which currently puts me at #2 for some reason (ahead of Brad, hah).

On one hand, it's kinda cool that recruiters are trying to get with the times ("[companies] won't speak to [candidates] unless they see their github account 1st", wrote one). On the other, when they mention memcached it reveals they don't really know what they're doing.
  • evan

closest computer

I have a computer in the closet that serves music. I haven't thought much about the computer as I've used it -- I just know it's always available on my network and ready when I want it. Sometimes I forget about it for months.

Today I am cleaning the closet and want to move (unplug) the computer. I thought to check uptime:
18:15:41 up 700 days,  7:00,  1 user,  load average: 0.00, 0.00, 0.00
Not bad!
  • evan

max-width user css

I run my browsers wide (full-screen, actually -- tiling window manager and all). Most sites continue to work fine like this. Some, like Wikipedia and LiveJournal, make the text as wide as the window, resulting in unreadably wide lines. I had been just dealing with this for years when I realized some simple user CSS would help:
$ cat .config/google-chrome/Default/User\ StyleSheets/Custom.css 
p, td, li {
  max-width: 50em;
}
(For some reason you have to run Chrome with --enable-user-stylesheet for that to work. Other browsers have similar mechanisms.)
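For instance, Firefox reads a per-profile user stylesheet; a sketch of the same rule there (the chrome/ path is from memory, so treat it as an assumption):

```css
/* saved as chrome/userContent.css inside the Firefox profile directory */
p, td, li {
  max-width: 50em;
}
```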
  • evan

no go

Two friends of mine were pretty enthusiastic about the Go language, so I tried writing a program in it yesterday. It was frustrating: despite the language having some really good ideas (my heart sang a bit when running "ldd" on my binary revealed it uses no shared libraries), it felt pretty profoundly ugly.

I don't mean the syntax (well, truthfully, the syntax is pretty ugly -- hard to be too proud of tutorial snippets like sum(&[3]int{1,2,3})) but rather that there are all of these strange non-orthogonal bits that stick out like half-hammered nails. Like having all of value, reference, and pointer types (with accompanying weird FAQ entries) or strange extra builtins (iota, close).

My program wanted to read a set of structured objects from an input file, so I needed a growable array of objects. I wanted my function to return an array (er, slice) of these. It would seem natural to use the vector package but that only lets you get back an array of untyped interfaces (the no-polymorphism thing); it appears the best I can do is copy the vector's contents into a new array once I'm done reading the file.
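Concretely, the dance ends up looking something like this (a minimal sketch; record and readRecords are made-up names, and a plain []interface{} stands in here for vector.Vector's untyped storage):

```go
package main

import "fmt"

// record stands in for the structured objects read from the input file;
// the name is hypothetical.
type record struct {
	id int
}

// readRecords mimics the vector-era pattern: append into an untyped
// container, then copy everything into a typed slice at the end,
// with one type assertion per element.
func readRecords() []record {
	var untyped []interface{} // what the vector hands back: interfaces
	for i := 1; i <= 3; i++ {
		untyped = append(untyped, record{id: i})
	}
	typed := make([]record, len(untyped))
	for i, v := range untyped {
		typed[i] = v.(record) // panics if the element isn't a record
	}
	return typed
}

func main() {
	recs := readRecords()
	fmt.Println(len(recs), recs[0].id)
}
```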

I hope that with more time they will be able to improve the documentation (it's not at all clear to me when pointers are needed, or why var buf [10]byte; file.Read(&buf) needs that & or how else I'm supposed to get a slice from an array) as well as grow more consistency into the language. I think many of the underlying concepts (the way interfaces work, even the "polymorphism makes things too complicated" rule) are really good but actually using the language left me feeling let down.
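(Update on the &buf question: slicing turns out to be the answer -- buf[:] yields a slice backed by the array's storage. A minimal sketch, assuming current Go, where Read takes a []byte and the old &buf spelling no longer applies:)

```go
package main

import (
	"fmt"
	"strings"
)

// readGreeting demonstrates getting a slice from an array: buf[:]
// produces a slice backed by buf, which is what io.Reader.Read wants.
func readGreeting() string {
	var buf [10]byte
	r := strings.NewReader("hello") // stands in for the file
	n, _ := r.Read(buf[:])
	return string(buf[:n])
}

func main() {
	fmt.Println(readGreeting())
}
```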
  • evan

livejournal kids

Neat image from Jack Dorsey.

Every so often someone will ask me about Twitter and I'll dig up a random day from Brad's LJ in 1999 and talk about how cyclical ideas are, how some were ahead of their time and how others were just not good. Then I think about how five years from now nobody will remember any of this and I get a little wistful.

On the other hand, at the WWDC WebKit party last week I got talking to a new Apple intern -- an undergrad, probably somewhere around 21 years old -- and he asked what I worked on before Google. It turns out he and his friends were happy LJ users! So perhaps my five years estimate can be pushed out a bit.
  • evan

microsoft, apple, adobe

Recall ten years ago, back when Microsoft dominated computing -- the time of the DOJ investigation. People (and governments) railed against Microsoft for their anticompetitive practices (which were undeniably scummy). The silly "M$" abbreviation became common. Eventually they were legally forced to open up their APIs.

It's interesting to contrast this with today's Apple. (Consider that Java programs also ran terribly on Windows; there's a legitimate user-experience argument that Java "should not have been allowed" until they could make it take under 10 seconds to start up.) While Apple is certainly not a monopoly, I think the moral objection -- at least the one I had -- to Microsoft's behavior came not so much from their abusing a monopoly as from the fact that zero-sum behavior is wasteful in our non-zero-sum world. That is, seeing someone spend their energy destroying others rather than out-improving their competitors is always a downer, even if they're nobody.

But while people complain about Apple, it's only a few people; users don't care at all (aside from the blue lego), and even most developers affected by Apple's shenanigans retain the necessary cognitive dissonance to rationalize it. Why? I think the important difference between Microsoft then and Apple now is that Microsoft's products just weren't very good.

That is, once we're at a state where we're ready to blame, we talk about it in terms of the morality and what is good for the world; but we only get to the point of blaming when we're unhappy as consumers. (Consider how quickly people forgive Facebook for the variety of privacy fuckups they've made -- it's because, even with the bad, people love using Facebook.) It's the same reason nobody's too upset that it's Adobe in particular getting cut out: as a user of the web, I know that Flash just isn't very good.

Rather than arguing now, Adobe could have helped their case more in the past by working with browser vendors to improve their plugin: by making it not be the top source of crashes for browsers (I found it hilarious that, before the work to move plugins out of process was finished, Safari had extra code specifically to blame Flash when it crashed), by not dropping the ball on protecting their users, and by leading the pack on GPU integration to make Flash outperform the <video> tag that browser vendors effectively introduced as a workaround for Flash's poor integration.

This probably wouldn't have directly helped the current situation with Apple (whose primary interest, public statements to the contrary, is more in locking in developers and users to their platform). But if both users and developers loved Flash, it would mean that Flash would have been more of a fundamental piece of the internet rather than that thing we grudgingly tolerate to make YouTube work, putting Adobe in a stronger bargaining position with respect to new gadgets.

(More fundamentally, I have long wondered about the wisdom of a company that has tied nearly all of its success to an operating system distributed by another company -- one that has historically, repeatedly shown it only grudgingly allows third-party software in niches where it hasn't yet had the time to write its own version. But maybe that problem is endemic to all operating systems.)