Foo Hack » Isaac Schlueter on Web Development

ADHD and Web Development
Posted by Isaac on Mon, 22 Oct 2007 20:02:43 +0000

I’m working on a site right now with Yahoo! that has not yet been released to the public, so I can’t say too much about it. When I first started here, I was on the games team. Being a fairly avid chess player, the games site was a haunt of mine as early as 1997. (In fact, it was the first Yahoo! offering that I used on a regular basis, and human-on-human chess remains one of my favorite Yahoo! products.) However, most of the actual content on the site was not terribly thrilling to me, since I’m not as into video games as I once was, and the latest match-three clone doesn’t get me going. It was a website, with business goals and technical hurdles, and it was fun to solve the problems, but that’s about as deep as it went.

Later, on the brand universe sites, we never really worked on a brand that I was all that into, and, between you, me, and the internet, I wasn’t really all that happy with the direction that the product went in. Building modules was fun and challenging, and there were definite good times, but I never went to those sites except to show someone what I had done.

This product, however, may well become my new starting page once it releases, or at least a frequently accessed bookmark. Unfortunately, it’s been a real challenge to actually get stuff done instead of just consuming all the great content that it is finding.

In a fit of cosmic irony, I came across an article in the New York Times about ADHD. While there’s really nothing particularly controversial in Peter Steinberg’s analysis, I was struck by this:

For those of us who have “attention-surplus disorder” [...] this knowledge-based economy has been a godsend. We thrive.

But attention disorder cases, up to 5 to 15 percent of the population, are at a distinct disadvantage. What once conferred certain advantages in a hunter-gatherer era, in an agrarian age or even in an industrial age is now a potentially horrific character flaw, making people feel stupid or lazy and irresponsible, when in fact neither description is apt.

I’ve worn a lot of hats in my career. I’ve peddled vacuums, fixed computers, written documentation, led training classes, managed products from concept to deployment, designed logos, and built software in ASP, PHP, JavaScript, CSS, VB, and C++. I’ve walked a technically illiterate grandmother through a registry patch over the telephone. I was more successful in some of these roles than in others, but all in all, I’ve done alright. I’d definitely say that I’m a “knowledge worker.” And I was diagnosed with AD(H)D as a child. I took Ritalin until I rebelled against the establishment in high school and decided that I didn’t need to be dosed. By all accounts, I still have ADHD. And I don’t think I’d be as good at what I do if I didn’t have this so-called “disorder.”

In the words of Tyler Durden, “You are not a beautiful and unique snowflake.” While we all have our particular unique combination of traits, I’m quite sure that there are lots and lots of people smarter and more successful than I am. But, not to be too vain, I’ve thrived in this knowledge economy, and I’m equally sure that I’m not the only person with an “attention deficit” who has managed to turn this mindset to their advantage.

Why don’t people stop to wonder why their attention deficit child can’t sit still and do spelling homework? Because that crap is BOOOOORRRRIIINNNNGGGG. Everyone’s attention pretty much works the same way. We all have different levels of instability in our attention. My wife can start folding clothes, and just keep going until they’re done. She’s a teensy bit OCD, which is just the other side of the attention-stability spectrum from ADHD. (That’s probably why we make such a good team.)

One commonly cited symptom of ADHD is the ability to get deeply immersed in certain tasks, almost to a ridiculous degree. As a kid in the summer, I would put a new RPG into the Nintendo and not move until I had beaten it. I had a deficit of attention like the Pope has a deficit of Catholicism. A meteor could have landed on our roof, and I might not have noticed. These days, it’s web pages. 7:00 rolls around, and it takes an act of extreme willpower to “switch off.”

In programming, the right kind of laziness is good. Each bit of functionality should only have to be written once; copy-and-paste programming is the ultimate in wtf-ery. A person with ADHD avoids boring behaviors because, to them, boredom feels like dying, and changing tasks is easy. While “normal” people feel a sense of fear or nervousness that the things they leave won’t get done, those of us with ADHD are quite content that we’ll get to it later, after we do something else. I’m sure that there’s some evolutionary justification for this, but all I know is that I’d gladly get punched in the nose rather than write out those fucking spelling words 3 times each ever again. (Incidentally, there’s some research out there indicating that childhood spelling education has little if any effect on adult spelling ability.)
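That “right kind of laziness” is what programmers call the DRY principle (Don’t Repeat Yourself). A toy sketch in JavaScript, the kind of thing I mean (the function names and the 8% tax rate here are invented for illustration, not taken from any real project):

```javascript
// Copy-and-paste programming: the same tax logic pasted into two places.
// When the rate changes, someone has to hunt down and fix every copy.
function bookTotal(price) {
  return price + price * 0.08;
}
function gameTotal(price) {
  return price + price * 0.08;
}

// The lazy (DRY) version: the logic is written exactly once.
const TAX_RATE = 0.08;
function totalWithTax(price) {
  return price + price * TAX_RATE;
}

console.log(totalWithTax(10)); // roughly 10.8 (floating point)
```

The boring part, writing the same thing twice, simply never happens; changing the rate becomes a one-line edit instead of a scavenger hunt through the codebase.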

I often hear this when I tell people that I’m glad that I have ADHD: Well, you’re high-functioning. My (sister, nephew, mother, uncle, friend) is so ADHD he can’t even hold down a job. Listen, either your friend with ADHD hasn’t found a job they love, or their problems run deeper than an attention instability. “High functioning,” with respect to ADHD, is just a euphemism for “found something you like doing that happens to pay well.” I’m just as ADHD as I ever was. My wife knows better than to send me to the store without a list, and is frequently (if understandingly) annoyed at my forgetful and distractible nature.

Most of the programmers I know, and nearly all of the web developers I know, have been diagnosed at some time or another with ADHD. Now, of course, this is anecdotal evidence, and utterly, painfully unscientific. I’m the first to cast doubt when there is a lack of hard stats on a topic, and I’d absolutely love for some sociologist to do a proper study on this. But how is it that so many engineers and artists and executives making a living on the web have been diagnosed with ADHD, if this disorder is such a liability in the Knowledge Economy?

I suspect that a part of the problem is that many people learn their sense of self-worth and competence at an early age, mostly based on their ability to please their parents and teachers. A child with ADHD doesn’t want to do his spelling homework, and, speaking from experience, being made to is a frustrating and emotionally painful experience. It’s embarrassing. A child who is embarrassed will naturally begin to think, I’m no good at this. That thought sticks. It shapes their choices and the things that they’re interested in. Labeling ADHD a “disorder” and subjecting children to constant humiliation and parental frustration sends the message that they aren’t good at much of anything. At the other extreme, ADHD children are often thought to need special help and attention, and the parent who wants to do right by their child shelters them from any kind of environment where they might fail. They quickly learn that they’re not as good as the other kids, and stop pushing themselves.

The most important thing is for a child to feel competent and successful at something that challenges them. This is actually pretty easy with ADHD kids, but only if parents are willing to look beyond the carbon-copy stimulation that most public schools offer. Competence breeds interest, and interest motivates action, which in turn leads to more competence. It’s a virtuous cycle. The non-high-functioning ADHD patients probably need a shrink more than they need medication. Their instability, properly harnessed, could be a source of tremendous power in the world.

I’ll admit, occasionally I just don’t feel like doing something, and it’s really hard to get moving. The drive that sustains me for hours of intensity can also turn in the other direction, and creating a new iTunes playlist becomes so fascinating I can’t tear myself away. But the benefit of web development is that there are always a lot of different things to do, and it all tends to feel like playing. The deadlines just add another element to the process, like the countdown in a video game that makes the adrenaline rise up and sharpen your wits. But the slow times require you to be a patient teacher with yourself, and be firm but understanding. So, I take 4-hour days sometimes, or spend the better part of the day reading through my friends’ twitter updates. Tomorrow I’ll make it up. This slack-and-attack rhythm isn’t so good in some fields, but it actually works quite well in most areas of the Knowledge Economy, since the person with unstable attention frequently has a great breadth and depth of knowledge about the many things that their interest drove them to. If, of course, enough time is spent in “attack” mode to make a difference.

Speaking of which, we have a demo tomorrow. So, writing this blog post probably isn’t the best use of my time…

When will the Bubble Bubble burst? (Or: Who really listens to these pundits?)
Posted by Isaac on Mon, 06 Aug 2007 17:00:06 +0000

So, I stumbled across an article by John Dvorak, who seems to think that we’re headed for another bubble burst in the next few years, and that it’ll be worse than the 2001 tech crash. In fact, he asserts that “Every single person working in the media today who experienced the dot-com bubble in 1999 to 2000 believes that we are going through the exact same process and can expect the exact same results,” an absurdly over-reaching generalization that made me literally laugh out loud.

I’ve heard this before. Every once in a while, a relative or friend of mine hears someone talking on the news about how Microsoft is going to buy Yahoo, or how we’re headed for a crash, or how the internet will be outdated by some new thing in 6 months, or how “web2.0 is changing everything,” and so on, and sometimes calls me to ask what I think of it, since I’m the person they know with some skin in that game. The vast majority of punditry is utter trash. Like any fortune teller, pundits get by on one or two hits, and rely on the short attention span of the crowd to hide their nearly endless string of failed predictions. The more radical the claim, the more likely it is to get attention, because people so often fall victim to the “Cloud Insurance” scam. (“What if this guy is right, and the clouds really are going to fall… then I’ll be glad I bought that insurance!”) The more attention a pundit gets, and the more he can whip the audience up into a frenzy, the more likely it is that his past failed predictions will be forgotten.

I don’t want to come off as decrying punditry as such. (That would sort of make me a Dvorak of Pundits, wouldn’t it?) No, there is some very good analysis out there. One mark of a good analyst is a history of successful predictions. For example, if you look through the “iTulip’s Record” links at the top of iTulip’s homepage, you’ll find some very sound predictions that were insightful, well-informed, and ultimately accurate. In fact, this article from November 1999 called the bubble for what it was.

Back in the 90s, there was a great new thing that changed the way that humans interact, communicate, and generally live their lives. Big technological innovations rarely come along and change the landscape of daily life. The railroad, the steam engine, the printing press, the television—these kinds of things have made our world smaller, and in so doing, have had huge effects on the way that we live and work. By changing the way that we live, they also made certain skills and commodities much more (or less) valuable than they were prior to the change. Things that were never possible now are, and there’s new money to be made.

By definition, no one has any experience in a new landscape-changing technology when it shows up. New ventures cost money, and carry a risk of failure. Not wanting to be left behind, there is often a frenzy of new ideas, along with a corresponding frenzy of spending and speculation. Money floods into the market, artificially pushing up prices above their “real” value. Most of those ideas fail, and most of that money is thus wasted on unproductive ends. Eventually, the economy does what economies do, and there is a correction.

Not all economic bubbles are the same. Some are big, and some are small. The stock market crash of 1929 was the bursting of a bubble, and it kicked off the Great Depression. By comparison, the dot-com bubble was pretty tame, though it did reduce quite a few “paper millionaires” to pennies. (Not all economic bubbles are stock-market bubbles. Anything that is overvalued and becomes an object of speculation will eventually come crashing back to reality. Tulips and Beanie Babies come to mind.)

A speculative bubble is a dangerous thing to get wrapped up in, and it highlights the importance of not betting more than you have and of always diversifying your investments. However, in the long run, it’s a necessary part of the feedback cycle that produces a quality marketplace. Yahoo, eBay, Amazon, and Google were never going to go under during the dot-com bubble. They had to tighten their belts a notch or two, but good ideas that are executed wisely will always do alright.

I personally think that there really aren’t any Amazon- or Google-sized ideas left on the web. That’s not to say that we can’t innovate or can’t make money, of course, but at this point, it’s the 37signals and Twitters and Flickrs and YouTubes, small teams creating value from simple good ideas on a shoestring budget, that are making out well. And of course, Yahoo and Google and the rest are doing what they do, buying the successful startups when it makes sense to do so. VC has been flowing back into Silicon Valley, but in much smaller streams, and with the benefit of a big failure to learn from. The benefit of doing things on a shoestring is that it doesn’t cost as much, and you get to keep more of the purchase price if you are eventually bought. It is a much more stable ecosystem than it was 6 years ago.

Granted, “Web2.0” is a silly marketing buzzword, and I think that most engineers and businesspeople pretty much understood that from the start. Social networking is an interesting phenomenon, but ultimately nothing new. (Weren’t we networking socially before web2.0?) Some things are getting attention and interest and $$$ that probably shouldn’t, and there will be corrections in the future. But the web crash of 2001 is not going to happen again—at least, not until the next life-changing piece of technology comes by and gets us all excited.

What I want to know is this: When will we as a society stop listening to doom-and-gloom pundits like Dvorak? Did 2001 traumatize us so much that we can’t even admit the possibility of a fortunate future? I think there is a sort of “interest bubble” that somewhat mirrors the phenomenon of an economic speculative bubble. You invest your agreement, and in return, you get to say “I told you so.” The problem is, when that bubble pops, no one notices.