I just started a technology podcast I'm calling the "Tech Tricks Podcast." The first show was posted yesterday, the 21st, and runs about 35 minutes. Blogs are great, but podcasts add the extra dimension of audio, which makes things that much more interesting.
I'll, of course, continue to post content here, but please have a look at this new show and tell me what you think.
I hadn't realized just how much your day-to-day experience colors your overall perception. For example, listen to this little (absolutely true) story.
Last night, my wife and I decided to go to the store to get some food. It's been fairly cold recently, so we decided to have a quick look at the weather to see if we should wear a coat or just a light jacket. The thermometer showed that it wasn't too cold, so we both decided to take just a light fleece. We had a fun evening; it was cool out, but not too cold, just like we had envisioned.
Now that you've heard the story, let me fill you in on the details. We live in Colorado up in the "high country," at around 7000 ft elevation. When I said it had been "fairly cold," I meant that temperatures had dropped to -15°F the night before; that's before factoring in wind chill. Days had stayed in the single digits most of the week.
So when we decided that it was light-fleece weather, it was actually 25°F outside, still well below freezing. Of course, it's not humid here, so it was a nice, cozy 25°; but still, well below freezing. So how exactly did I begin to think that this sort of weather is not that bad? Well, my criteria for this sort of decision have changed. For example, above about 10° it no longer hurts to breathe. That makes me feel a lot warmer. Above 20°, I can make it from my car to the store without losing too much body heat. I never intend to spend more than 2 minutes out in the weather at a time, so extended exposure really doesn't even factor into the decision. Having spent all my childhood in Phoenix, I still find my new perception on this subject quite surprising.
I noticed my computer exhibiting a strange sort of behavior today. I recognized exactly what was going on, but I decided to take a few screenshots and write about it because most people are unaware that this happens. Here's how it goes:
Confusion in the Task Manager
You notice that your computer is behaving as though it's under heavy load, but you can't find which application is hogging the CPU. You take a look at your task manager and see something like this. Look, in particular, at the areas that I circled:
Here, the process list shows that 7% of the CPU time is being taken by googletalk, while the remaining 93% is spent idle. Those numbers add up just fine. However, at the bottom, we see that 61% of the processor time is in use -- that's a whole lot more than 7%. So what's using the other 54%?
I know some of you have seen this before and probably thought something devious was going on. Could it be a virus? Perhaps spyware? I'm sure you've heard about rootkits--programs that hide their existence from the user. Could this perhaps be a sign of a rootkit?
Well, the reality is a whole lot less exciting. What we're really dealing with here is bad reporting. Once again, as in the case of the Sony rootkit fiasco, Mark Russinovich gives us the tools to see what's really going on. One of his free utilities, Process Explorer, gives us a more accurate view than the built-in task manager. Have a look at the following screenshot, and look, in particular, at the first three processes listed. This screenshot was taken soon after the previous one, so the numbers won't match.
This will all probably make a lot more sense with a bit of explanation...
Interrupts and DPCs
One of the primary responsibilities of the operating system is to schedule time for each process that requests use of the CPU. Most of a program's run time is spent waiting--waiting for you to type something, waiting for a file to open, that kind of thing. When a program is ready to do something, the operating system schedules it a time slot. Yet even on computers with over a hundred processes running, most of the time there isn't any process that's ready to run. The OS schedules this left-over time to process number zero, the "Idle" process. This special-purpose process sends the CPU a HALT instruction that tells the CPU to go into low-power mode and wait for something to happen (like a keypress, for example).
So, we've got X number of programs running, plus process number zero, the "Idle" process. Between these, we can account for all the time that's allocated by the process scheduler. However, this isn't necessarily all of the time that gets used by the CPU. The OS kernel itself also uses CPU time, but it doesn't ever have to wait in line for the scheduler. This code, most of which belongs to hardware drivers (your video card driver, for example), runs under a totally different set of rules.
Kernel CPU time is, for the most part, divided into two categories: time spent on interrupts, and time spent on Deferred Procedure Calls (DPCs). These are really two heads of the same beast; the distinction comes from what kind of code you're dealing with and exactly when that code has to run. The important point is that interrupts and DPCs aren't part of the normal process schedule, but do take up (sometimes significant) CPU time.
So, what we saw in the first screenshot was the result of the fact that DPC and interrupt time isn't reported by Task Manager. At the time of the screenshot, about 58% of the CPU time was being taken by DPCs and interrupts, leaving about 42% of the CPU time for the scheduler to use as necessary. Of that remaining 42% which the scheduler had to work with, 93% went unused and 7% went to googletalk. Some quick math (42% x 93%) tells us that the real time spent idle was only 39%. Googletalk only used 3% of the total CPU time, which was 7% of the time allocated to the scheduler.
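The quick math above is easy to verify with a few lines of Python. The percentages are the ones from my screenshots, so treat them as illustrative:

```python
# Task Manager's percentages only describe the slice of CPU time the
# scheduler controls -- the 42% left over after DPCs and interrupts.
scheduler_share = 0.42       # fraction of total CPU time visible to the scheduler
idle_reported = 0.93         # Task Manager's "System Idle Process" figure
googletalk_reported = 0.07   # Task Manager's figure for googletalk

# Convert each reported figure into a fraction of *total* CPU time.
idle_actual = scheduler_share * idle_reported
googletalk_actual = scheduler_share * googletalk_reported

print(f"Real idle time:      {idle_actual:.0%}")       # 39%
print(f"Real googletalk use: {googletalk_actual:.0%}") # 3%
```

In other words, the "93% idle" figure was only 93% of 42%, which is why Process Explorer and Task Manager appear to disagree.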
Confused yet? Well, here's the executive summary: Windows' built-in Task Manager does a poor job at reporting CPU usage because it doesn't directly report the time that is used by the Windows kernel (drivers in particular). The per-process percentages are actually calculated based on the remaining time after the drivers have already taken their piece of the pie. This can lead to boatloads of confusion when trying to diagnose a problem, particularly when the real culprit is a driver. Process Explorer by Sysinternals does report DPC and interrupt time, thus bringing balance back to the universe.
If you want to find out more about DPCs, interrupts, and Windows process scheduling, check out Chapter 3 of the book Microsoft Windows Internals.
My wife asked for an Xbox for her birthday this year.
I would imagine that most of you won't get past that first line; particularly the men, who are wondering how I managed to find such a girl. But stay with me: this is a very serious essay about video games and wasting time, not about how "cool" my wife is.
This birthday request got me thinking. Unless you count flight simulators, I haven't really played video games much since I left college. It was also in college that I developed an almost obsessive dislike of wasting time. The constant, oppressive weight of never-ending class assignments gave me a sort of persistent edginess such that I could only mentally justify avoiding what I had to do if I was still doing something productive. I still can’t sit down and watch TV unless I feel like I’m accomplishing something, like learning something on the History channel.
In looking around trying to decide what’s worth getting, I played a few of the games they have nowadays, and was surprised to find that I didn’t feel like I was wasting time at all. That’s kind of odd, because playing video games has been, for many years now, the quintessential time-wasting activity. If anything is a waste of time, it’s video games, right? So why doesn’t it feel that way?
This really surprised and interested me. As hypersensitive as I am about using my time productively, I really expected to be as turned-off about the prospect of sitting in front of a TV controlling a virtual character as I am about watching Friends or CSI. What I came up with is that a well-written game has all of the intellectual elements of a "good use of time" that I have come to expect. These elements can all be boiled down to two basic principles:
First of all, and most importantly, these games are very intellectually stimulating. They make you think in much the same way as, say, fixing a broken radio would. Now, I’m not talking about games like Donkey Kong, Space Invaders, or any of the rescue-the-princess games we grew up with. Some of these newer games are complex puzzles with subtle clues buried deep inside intricate plots. A lot like a good murder mystery. Not all games are this way, but I’m not really interested in the others.
The other important element is that there is a sense of goals and accomplishment. This is a very common element, because games without it never become very popular. In order to spend much time with a game, you have to feel like you’re actually doing something. But that’s just it: when you turn the thing off, you realize that you’ve accomplished absolutely nothing.
So that must be it. If you haven’t really accomplished anything, you’ve wasted your time, right? Well, perhaps. Tolkien’s Lord of the Rings series was a great set of books; but after I read them, I had nothing new and useful to show for it. Everything I had learned from those books pertained only to a fantasy world that didn’t exist. No real-world knowledge at all; well, nothing that I hadn't already heard, at least. Furthermore, it took days to finish those books. I spent more time reading just those books alone than I did in all my video game playing over the past 8 years combined. And yet, do parents tell their kids that reading the classics will rot their brains? No, of course not.
Video games, I think, have a certain stigma associated with them for a couple of reasons. First of all, early technology made it difficult to make a game mentally challenging but still fun to play. That, combined with a lack of imagination on the part of the programmers, led to a large divide between games like “Final Fantasy” and “Mario: Fun with Numbers.” Of course, kids do what they enjoy, and these simple but challenging games provided kids with the emotional feedback they craved by giving them simple goals which they could accomplish and feel good about. In fact, with video games, the noticeable accomplishment frequency, and therefore the reward-to-time-spent ratio, is much higher than any other readily available activity. These kids, if left to their own devices, will seek out the activity that gives them the most positive feedback—so they’ll play video games all day if they can.
Which brings me to my next point: if a child spends an inordinate amount of time at any activity, even reading, the parent will conclude that the child is wasting time. It doesn’t matter what the activity is; the child could be playing video games, watching TV, playing baseball, solving puzzles, assembling models, or even studying advanced calculus. If he fills the entirety of his free time with the same activity, the parent will be displeased. The kid will be “wasting his time playing baseball,” or “wasting his time with those stupid puzzles.” Video games are just too consistently entertaining.
I see no reason why a good game can’t be as beneficial as a decent book. It's a fairly new and novel medium that has yet to be fully exploited, I think. Most games, like most books, are, indeed, a waste of time (have you seen what passes as literature these days?). Others sharpen your mind like a good game of chess. Of course, I'm certainly not saying that Call of Duty, as educational as it is, should be used in schools to teach about WWII; but I think that we will eventually have games that will serve that exact purpose. What could better teach you what it was like than reliving the experience yourself? Oddly enough, true-to-life realism and historical accuracy are goals that the game industry is aggressively pursuing. The industry has a long way to go yet, but it also has some powerful potential that shouldn't be ignored.
I like RSS; I use it extensively to track interesting blogs, product releases, and now podcasts, all using Thunderbird's RSS feed manager. It quickly became obvious to me that an email client really is the perfect match for RSS feeds, since the content so closely resembles an email message to begin with. I therefore couldn't come up with any good reason to use Firefox's RSS-powered "live bookmarks." That is, until now.
Like most of us, I suppose, I run into a lot of web content that seems really interesting, but I often don't care to read it at the moment. Perhaps I'm looking for something else, or perhaps it explains how to do something that I'm not working on right now. So that's what bookmarks are for, right? That may be, but it hasn't worked too well for me. I've been disappointed with my bookmark setup for a few crucial reasons.
The first is portability. I use at least 3 computers regularly, and what I bookmark here I want to be available anywhere. Using a bookmark service like Delicious solves that problem; but it also introduces my other problem: ease-of-use. Delicious is, in fact, about as easy to use as they could possibly make the site. But I want something that no website can offer; I don't want to have to go to their website. I want complete browser integration, like my bookmarks toolbar. I decided that the only solution was to write an extension to integrate Delicious bookmarks directly into Firefox. Then I observed--quite correctly--that I was far too lazy to do that. And then, and this is the cool part, I realized that Delicious and Firefox developers had already done the hard work; I just have to "turn on" the existing capability.
So, there's the background; here's the solution. This solution only works with Firefox, not Internet Explorer. It almost works with the new IE version 7, but Microsoft unfortunately left out some very critical pieces in their implementation.
Delicious will serve up your bookmarks either on their website, or in handy RSS form. This works very well with Firefox's RSS bookmark feature, allowing you to put a "Folder" of Delicious-served bookmarks right into your normal bookmarks collection, anywhere you might otherwise display your own browser-served bookmarks. That includes my old friend, the bookmarks toolbar. So, here's what you do.
Go to your Delicious account (or Delirious -- same exact concept, but open-source), and select a tag you want to add as a bookmark folder. (Did I explain that these "bookmark folders" are actually the tags you already use? Well, they are.) Now, do you see the little orange RSS icon in the address bar? Click it.
When you click it, you get a drop-down list of RSS feeds to use. You want the feed of bookmarks, not the feed of tags. It will then ask you where to put the "live bookmark" and what to call it. You can pick whatever you want, but I'd suggest calling it something short (like the name of the tag) and creating it in the bookmarks toolbar folder. Go ahead and repeat that process for all the other tags you want quick access to. What you end up with is something that looks quite a bit like this:
Of course, there's no rule that says you can only use your bookmarks. It works just as well with anybody's bookmark collection. If you want to, you can create a normal bookmark folder (even on the bookmarks toolbar) and put any or all of your "live" bookmarks folders inside it. If you have a lot of computers to do this on, you can get one set up and then copy your "bookmarks.html" file to the others. If you want to do that but you have no idea what I'm talking about, contact me and I'll walk you through it.
One limitation that I haven't addressed yet is that each live bookmark collection only uses a single Delicious tag. This means that you may want to have a few tags that you use specifically for classifying bookmarks in your browser's collection. Since Delicious lets you specify any number of tags for a given entry, that's not a problem. Also, the other RSS feed that we didn't use, the feed of tags, is one that lists your Delicious tags rather than bookmark entries. This drops you off on the tag's Delicious page. This may be useful to you if you use a LOT of tags, or you want to link to someone else's tags collection. Another limitation is the number of bookmarks it will display under one tag. On my browser, it will display the top 31 and clip the rest--the others just don't fit on the screen. If you have more than that, perhaps you'll want to consider a finer-grained classification system. There is no limit (that I've seen) on the number of RSS bookmark folders you can create, so go ahead and create as many as you deem necessary.
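The reason this whole trick works is that a feed of bookmarks is just an RSS document with one item per bookmark, which Firefox turns into one menu entry per item. If you're curious what the browser actually sees, here's a minimal sketch in Python using only the standard library. The feed snippet and URLs below are made up for illustration; a real Delicious feed carries more fields, but the title/link structure is the part live bookmarks use:

```python
import xml.etree.ElementTree as ET

# A tiny, made-up RSS 2.0 snippet shaped like a feed of bookmarks:
# one <item> per bookmark, each with a <title> and a <link>.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>my-tag bookmarks</title>
    <item><title>Process Explorer</title><link>http://example.com/procexp</link></item>
    <item><title>Live bookmarks howto</title><link>http://example.com/livebm</link></item>
  </channel>
</rss>"""

root = ET.fromstring(FEED)

# Pull out (title, link) pairs -- essentially what Firefox does to
# build the entries inside a live bookmark folder.
bookmarks = [(item.findtext("title"), item.findtext("link"))
             for item in root.iter("item")]

for title, link in bookmarks:
    print(f"{title} -> {link}")
```

Nothing Delicious-specific is going on, which is why the same trick works with anybody's feed of links, not just your own tags.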
I hope these ideas help you make better use of your bookmarks collection. If you're part of that unwashed 90% who still use IE instead of Firefox, perhaps this will give you one more reason to upgrade to Firefox. Give it a try and you probably won't go back.