Wednesday, 29 June, 2005
I ran across an interesting Associated Press article aggregated on Yahoo today: Intense Interval Training Deemed Effective. Researchers measured the strength and endurance of two groups of college students, all of whom were healthy and exercised regularly, but weren't really athletes. One group did three 30-minute interval sessions per week for two weeks. The other group did no specific training, but continued their normal activities, including basketball, jogging, or aerobics.
It's little surprise to learn that endurance and strength improved in the group that did the interval training. They nearly doubled their endurance, although how endurance was measured was not specified. That the control group showed no change also doesn't come as much of a surprise.
The surprising thing about this research is how shallow it is. Athletes have known for years that focused interval training is a valuable part of an overall training schedule. If you ever played football or ran track in high school, you probably remember your coach making you run sprints until you thought you were going to puke or pass out. Working the body at high heart rates (above your lactate threshold) for brief periods builds strength and increases your ability to use oxygen. If the measure of endurance is how long you can operate at a high heart rate, then of course interval training will increase your endurance.
The article does state that interval training is not effective as a normal exercise regimen because it's difficult, painful, and doesn't burn enough calories. Most people do not have the motivation to complete even one interval training session without a lot of encouragement from somebody else who's right there: a coach or a workout partner. Four to seven 30-second all-out efforts are hard. Your lungs burn. Your legs ache. Your head spins. After two or three intervals, you can't believe how long 30 seconds is. Halfway through the workout you start thinking, "Why am I doing this? Why should I put myself through such pain?" Without somebody pushing, only the most highly motivated individuals will push through the pain and complete the interval workout at 100% effort. I know. I've done it. Interval workouts are effective, but they're unbelievably painful. A 30-minute interval workout in the morning will leave you tired for the rest of the day.
There are other disadvantages to interval training. The most common problem that athletes experience is overtraining. If one interval workout per week is good, wouldn't two or three be better? All too many athletes--even advanced athletes--overdo the interval training by not giving their bodies time to rest and rebuild. They show improvement for the first few weeks--maybe a month--and then their performance levels off or falls. The most common reaction then is to increase the intensity, until finally they injure themselves: pull a muscle, develop a stress fracture, or just run their bodies down to the point that they have no energy.
Endurance cyclists--those who compete in 24-hour races, Paris-Brest-Paris, the Race Across America, and similar events--have tried replacing much of their aerobic conditioning with intervals, and have mostly failed. Interval training is important for building strength and endurance at high heart rates, but a 30-minute interval workout cannot help with conditioning your back, neck, shoulders, forearms, wrists, butt, and feet for the rigors of spending all day in the saddle. Believe me, I've tried it. Although my legs and cardiovascular system were just fine after 100 miles, the rest of my body was wiped out. Interval training is a valuable addition to a training program, but certainly not a substitute for hours on the bike and the aerobic conditioning you need for all-around health.
Tuesday, 28 June, 2005
Related to the previous note, how the heck do I securely destroy a CD or DVD? I know I can make it unreadable to a standard drive by shattering it or popping it in the microwave for about two seconds. But that doesn't really affect the bits that are written to the media. Could somebody melt the plastic cover off the CD and read the raw media using techniques similar to those used to reconstruct information from damaged hard drives? Seems to me the best way would be to shred the media.
Several companies make devices that purport to make your data unreadable. Aleratec makes a DVD/CD shredder that doesn't actually shred the media in the normal sense. It simply impresses a waffle pattern on the plastic covering, making the CD unreadable by any normal means. That would stop the average dumpster diver from stealing your information, but it wouldn't deter somebody who is more motivated.
Many companies make devices that actually shred the media in much the same way that a paper shredder shreds paper. In fact, some paper shredders will shred CDs, DVDs, and diskettes. Provided the resulting pieces are small enough, it would be very difficult (but not impossible) for even the most motivated black hat to obtain any useful data. This option gives the most bang for the buck.
As with hard drives, the only sure way to destroy the data would be to melt the media into so much slag. I suspect a machine to do that would be way too expensive unless you really don't want the government to have any chance at reconstructing your data. But if they were willing to go to that extreme, they'd probably be better off getting a warrant and seizing your computer.
I think I'll invest in a new paper shredder.
Tuesday, 28 June, 2005
I'm not sure what to call the program I'm looking for. I thought it would be called an archive manager, but a Google search on that term returns backup programs, file managers, and other file manipulation or viewing tools, but nothing like what I'm looking for. Let me explain.
I have CD backups of my systems going back almost 10 years. Some of the data on them is over 20 years old. We'll leave the discussion of how useful most of that stuff is for another time. The thing is, I have a dozen or more CDs with archives of old email messages, old source code, and all manner of stuff. I'm afraid to throw any of the old CDs out because I don't know if they contain stuff that isn't on the newer backups. It's clear, though, that I can't continue to add to my backup CD collection indefinitely.
What I want to do is consolidate the backups into a single directory structure, remove duplicate information, and make sure that the most recent version of any duplicated file is the one that's kept. I know I can't fully automate the process, as I'll want the ability to manually resolve any changes, but a program should be able to do most of the grunt work for me. Here's what I envision.
- Copy the entire contents of the oldest backup CD to a directory on the hard drive.
- Insert the next most recent backup and start a program that will compare the new CD with the existing file structure.
- All files that exist on the new CD but not on the original are copied without question.
- Files that have duplicate names, modification dates, sizes, and contents (using a CRC or MD5 hash) are ignored (not copied).
- Files that have duplicate names but different dates or contents are copied to the destination and assigned a version number, and are flagged in the user interface for further action.
After each CD is processed, I manually resolve the changed files by deleting one version, or by renaming so that the most recent version is the one that's kept.
If I apply that methodology to every one of my backup CDs, by the time I'm done I should have a single top-level directory that contains all of the information from the 10 years of backup CDs. I can then go through that directory and delete anything I no longer want to save (I really don't need those emails from 1985), burn a single CD, and delete the working directory from my hard drive. The next time I want to do a backup, I copy the backup CD to a working directory and then use the program to make the comparisons of the data on my hard drive with the backup image, copy the necessary files, and allow me to burn the result to CD.
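The comparison step above can be sketched in a few dozen lines of Python. This is just a sketch of what I have in mind, not any existing product: the function and directory names are mine, contents are compared with an MD5 hash, and conflicting files get a hypothetical ".v2" suffix so they show up for manual resolution.

```python
import hashlib
import shutil
from pathlib import Path

def md5sum(path, chunk=1 << 20):
    """Hash file contents so identical files match regardless of name or date."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def merge(cd_dir, archive_dir):
    """Merge one backup CD image into the consolidated archive directory.

    Returns the list of conflicting files that need manual resolution.
    """
    cd_dir, archive_dir = Path(cd_dir), Path(archive_dir)
    conflicts = []
    for src in cd_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(cd_dir)
        dst = archive_dir / rel
        if not dst.exists():
            # New file: copy without question.
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
        elif (src.stat().st_size == dst.stat().st_size
              and md5sum(src) == md5sum(dst)):
            # Exact duplicate: ignore (not copied).
            continue
        else:
            # Same name, different contents: keep a versioned copy and flag it.
            shutil.copy2(src, dst.with_name(dst.name + ".v2"))
            conflicts.append(rel)
    return conflicts
```

Run against each CD image in turn, oldest to newest, the `conflicts` list is exactly the set of files I'd have to look at by hand after each pass.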
I have to think there's a similar program available. But I don't know what it is or even what to call it. Any ideas?
Monday, 27 June, 2005
If I were to formalize my approach to debugging, the first rule would be "identify the cause of a problem before you attempt to fix it." Poking around making changes that might possibly fix the problem without first stepping through the code and seeing where things go wrong exposes you to several risks: you can make the problem worse, you can create other problems, or you can mask the original problem temporarily only to find it reappear at some later time. Unless you know what's causing a problem, you cannot reliably say that you fixed it. The worst thing that can happen is to have a bug "just disappear."
That rule holds for debugging other things, too. You wouldn't start randomly replacing parts on your car to eliminate a squeak, or go out and buy a new battery without first testing and re-charging the existing one. The process of debugging is methodical: locate, as closely as possible, the cause of the problem, and then change one thing at a time until the problem is solved. Any other approach is a waste of time and an invitation to cause more problems.
As computers and operating systems have become more complicated, it's become increasingly difficult to locate the causes of problems. For any given set of symptoms, there are untold possible culprits. Is it a hardware or software problem? Is the problem with the driver, the application settings, or a conflict with another program or device in the system? I've seen Microsoft Word crash because of a bad video driver. Why the video driver should cause an application program to crash is beyond me, but it is (or at least it used to be) fairly common.
Of all the subsystems I've had trouble with over the years, sound has been the worst. In the 13 years or so since I've had sound on the computer, the only sound card I never had trouble with was the original SoundBlaster that I bought in 1992 or 1993. Low audio, excessive hiss, inoperative microphone, IRQ and driver conflicts, application conflicts or poor application software, you name it and I've probably experienced the problem. I have no idea why the sound subsystem has to be so difficult.
Today I was trying to use the integrated SigmaTel sound hardware on my laptop to record some speech, and I couldn't get any audio. My first suspicion was that the microphone was bad, so I tried a different one. No dice. I booted my old system and plugged my headset into that. The microphone worked like a champ. So the problem is with the laptop. I poked around in the settings, but didn't find anything promising. A quick search of Dell's support site revealed that there is a new driver (released on June 20) that addresses some problems. Downloading and installing the new driver didn't fix the problem. Then I ran the diagnostic utility. It showed that the hardware was working fine, and when I ran Sound Recorder again it actually recorded my voice.
Why did running a diagnostic change the behavior of the sound subsystem? Will I have to run the diagnostic so that it can perform some magic every time I want to use the microphone?
When people learn that I program computers for a living, they often ask me if I can help them with a problem. My standard response these days is: "I just program these machines. I have no idea how to actually use them." They laugh, but there is quite a bit of truth in that statement. I like for things to make sense, and as time goes on there is less and less sense to be made out of configuring my computer. Maybe I'm just getting old.
Sunday, 26 June, 2005
The U.S. Supreme Court on Thursday released its opinion in the Kelo v. New London (PDF) case. When I saw a news headline saying that the decision had been handed down I decided to read the entire decision myself before reading any of the commentary.
In 2000, the city of New London, CT approved a development plan intended to revitalize the city. The 90 acres to be redeveloped were adjacent to a large manufacturing facility proposed by the pharmaceutical company Pfizer. As part of the development plan the city obtained property from willing sellers and attempted to use its power of eminent domain to force the remainder of the property owners to relinquish their property for "just compensation" as required by the Fifth Amendment of the Constitution. Several of the landowners protested, arguing that the takings violated the "public use" clause of the Fifth Amendment.
The city argued that economic development is a valid reason for exercising eminent domain, even though the property in question would eventually be sold or leased to private parties. The case worked its way through the Connecticut courts and finally to the State Supreme Court, which held that the takings were valid. The U.S. Supreme Court agreed to hear the case. Thursday the Court ruled in favor of the city, upholding in a 5-4 decision the ruling of the Connecticut Supreme Court.
I was dismayed when I first heard about the decision, thinking that this was a huge departure from the idea of private property. But in reading the Court's decision I learned that this is just one more step down that road. Two prior cases in particular have established government's ability to use its power of eminent domain to transfer property from one private party to another.
In Berman v. Parker (1954), the Court upheld a redevelopment plan targeting a blighted area of Washington, D.C. in which most of the housing was beyond repair. The city invoked eminent domain to condemn the property, use some of it for public purposes, and sell or lease the remainder to private parties. To complete its plan the city also took some non-blighted property and the owner protested. In a unanimous ruling the Court ruled in favor of the city, stating:
We do not sit to determine whether a particular housing project is or is not desirable. The concept of the public welfare is broad and inclusive... The values it represents are spiritual as well as physical, aesthetic as well as monetary. It is within the power of the legislature to determine that the community should be beautiful as well as healthy, spacious as well as clean, well-balanced as well as carefully patrolled. In the present case, the Congress and its authorized agencies have made determinations that take into account a wide variety of values. It is not for us to reappraise them. If those who govern the District of Columbia decide that the Nation's Capital should be beautiful as well as sanitary, there is nothing in the Fifth Amendment that stands in the way.
In Hawaii Housing Authority v. Midkiff, the Court considered a Hawaii statute whereby title was taken from landowners (lessors) and transferred to other private parties (lessees) in order to reduce the concentration of land ownership. The Ninth Circuit (which ruled against the Hawaii Housing Authority) held that the taking was "a naked attempt on the part of the state of Hawaii to take the property of A and transfer it to B solely for B's private use and benefit." The Supreme Court, in another unanimous decision, reversed the Ninth Circuit's ruling, holding that the State's purpose of eliminating the "social and economic evils of a land oligopoly" qualified as a valid public use.
In light of the Court's ruling on those two cases, it is no surprise that the Court ruled in favor of the city of New London in the current case. The Court rarely reverses itself, and the Kelo v. New London case is similar enough to those two and to others that the outcome isn't a huge departure from other rulings. It does place in question, though, the value of the takings clause in the Fifth Amendment.
I'm inclined to favor Justice Thomas's dissenting opinion: that the Court should strictly interpret the Fifth Amendment, that this case should have been decided in favor of the petitioners, and that the Berman and Midkiff decisions should no longer be blindly used as precedent when deciding similar cases in the future.
But nobody asked me.
Saturday, 25 June, 2005
Today and tomorrow mark ARRL Field Day 2005, the annual ham radio event where we all head out to remote locations, set up temporary stations, and see how many stations we can contact. Okay, so it's a 2-day geek fest for radio nerds. We actually do benefit from it. The idea is to simulate operating for an extended period in an emergency, using power other than from the commercial power lines. Mostly that means gas-powered generators.
There are many different categories of stations and many different ways to earn extra points beyond the points you get for making contacts. Points are awarded for having an elected official visit the site, having a press release published in the paper, making a contact via one of the amateur radio satellites, using special modes, and using natural power. For the second year in a row our club's natural power source was me riding a bicycle.
I wrote up last year's experiment here. That setup used a small Plymouth alternator to convert my pedaling into electrical power that was stored in a battery. We charged the battery and used it to make our required five contacts to earn the 100 points for a natural power source. There were two problems with that setup. First, a lot of my pedaling effort was wasted exciting the field for the alternator. Second, it just wasn't very sexy. Pointing to a battery and saying "I charged that by pedaling" doesn't have quite the same effect as driving the radio directly from the bicycle generator.
This year my friend Steve Cowell (KI5YG) got hold of a 24 volt electric scooter motor and wired up a voltage regulator. If you turn an electric motor, it acts like a generator. We attached the motor to the bicycle using the same V-belt we used last year, put a 22,000 μF capacitor in-line to isolate me from the transmitter load, and attached the radio to the voltage regulator. 32 minutes of medium-hard pedaling later, we had our five contacts. That was a whole lot easier than the hours and hours of pedaling I had to do last year to charge the battery.
Thursday, 23 June, 2005
Tuesday I wrote a bit about securely erasing data from a hard drive and I mentioned Darik's Boot and Nuke. DBAN is a nifty little system that will write a bootable diskette or CD that you can use to completely (as much as reasonably possible) eliminate all traces of your data from a hard drive. The image that it writes contains a pared-down bootable Linux system and the program that actually erases the data. It's all quite easy to use.
I used the free Eraser program to create the DBAN bootable disk and then popped that into my Devil Machine (a Celeron 666 lab machine) to test it out. DBAN supports a number of different secure erasure techniques, ranging from very low to very high security. The default is the DOD 5220.22-M method, which the program ranks as medium security. At that level your data probably won't hide from the FBI or the NSA, but your average identity thief or local law enforcement crime lab wouldn't be able to do anything with it. I tried to use the more secure Gutmann technique, but the program failed because it was unable to allocate enough memory. I don't know why, exactly, but I didn't feel like futzing with it. Besides, I've never used this lab machine for anything critical, so the chances of it containing anything personal or incriminating are vanishingly small.
It took DBAN right at two hours to make the seven passes required to securely erase my 100 GB drive. I guess it would take 10 or more hours to complete the 35 passes of the Gutmann method? I'll try it on one of the other systems.
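The back-of-the-envelope estimate is straightforward if you assume each pass takes roughly the same time on the same drive. (Gutmann's paper specifies 35 passes for his full method.)

```python
# Seven DOD 5220.22-M passes took two hours on the 100 GB drive.
# Assuming each pass takes about the same time, estimate the
# 35-pass Gutmann method on that same drive.
dod_passes = 7
dod_hours = 2.0
gutmann_passes = 35

gutmann_hours = gutmann_passes * dod_hours / dod_passes
print(gutmann_hours)  # → 10.0
```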
My only question now is: how do I prove that the thing actually worked? I'm no dummy, but I have absolutely no way to verify that the program did what it claims to do. I could download the source code and build my own version of the program to ensure that what I ran is indeed what the author wrote. I'm even capable of understanding what the code does. I could prove that it actually implements the secure erasure methods that it claims.
I could inspect data on the individual disk sectors, but all that will tell me is that the drive electronics can't discern any meaningful data. I don't have the equipment or the knowledge to inspect the drive any other way.
I'm satisfied that what I downloaded works, and I'm not going to fret about it. But this illustrates a fundamental truth about security. At some point you have to trust somebody. I'm smarter than your average computer user (about computers, anyway), able to read and understand source code and inspect disk sectors to see if any of my data remains in a normally readable form. But even I have to take somebody's word for it that the method used by DBAN actually makes it difficult or impossible to reconstruct meaningful data from my hard drive.
Wednesday, 22 June, 2005
One definition of entropy is, "Inevitable and steady deterioration of a system or society." We've all seen it: absent constant maintenance, systems move from a state of order to a state of disorder. Weeds grow in the flower garden and your desk becomes cluttered and disorganized. The same thing happens with software systems' code. At the start of a project everything is clean. The directory structure is nicely laid out and the source code repository exactly mirrors the code on the disk. Formal frequent (ideally, daily) build procedures ensure that the source control system remains in sync with what the developers are doing.
But at some point the project goes into "crunch" mode. Either the team gets behind schedule or has to rush to get a bug fix out the door for a critical client or magazine review. Maybe the product ships and a year later a lone developer hacks in some changes quickly and doesn't follow all the formal procedures to maintain the fidelity of the source code control system. At some point, the project gets out of sync. Six years later another lone developer picks up the code and tries to puzzle it all out. Version control systems are supposed to guard against exactly this kind of decay. They do several things well:
- They serve as a central repository for the project's source code.
- They maintain a revision history so that it's possible to retrieve all versions of the code and view every change made from the project's inception up to the most recent version.
- They control access to the source code, ensuring that only authorized users can read or update the code base, and that changes are recorded in the proper order (i.e. that older versions don't overwrite newer versions).
However, no version control system that I've used can prevent users from subverting it. It's possible to check in files that aren't used in the project, and to use files without checking them in to the database. Everything works fine until somebody decides to grab the latest version from the database and try to build the project. There's no way for the system to enforce the rules, and no way short of trying to build the project to prove that the rules have been followed. Maintaining a project's source integrity requires active thought by the team members all the time.
Microsoft Visual Studio and Visual Studio .NET, and some other development environments have varying degrees of integration with version control systems. These integrated systems work well as long as everybody follows the rules. The problem is that the rules aren't precisely defined, they're hard to follow, and they're absurdly easy to break unintentionally.
The only way to ensure that your project will build successfully at any time is to create and maintain a daily build procedure that gets the latest version of the code from the repository into a clean directory structure and builds the entire project. Every programmer on the project is notified of the build status every day. This technique has been proven many times over the years, and is recommended by any project management book or seminar produced in the last five years. Martin Fowler calls it Continuous Integration. For a more friendly discussion of the topic and links to other resources, see Joel Spolsky's Daily Builds are Your Friend.
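The daily build boils down to three steps: wipe the working directory, pull the full tree from the repository, and build it, reporting the result to the team. Here's a minimal sketch of that loop; the repository URL, the use of Subversion, and the make command are all placeholders for whatever your shop actually uses.

```python
import subprocess
import sys

def run(cmd, cwd=None):
    """Run one build step; print its output and return True on success."""
    result = subprocess.run(cmd, cwd=cwd, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"FAILED: {' '.join(cmd)}\n{result.stdout}{result.stderr}")
    return result.returncode == 0

def daily_build(repo_url, work_dir):
    """Check out the entire tree into a clean directory and build it."""
    steps = [
        (["rm", "-rf", work_dir], None),            # start from a clean slate
        (["svn", "checkout", repo_url, work_dir], None),
        (["make", "all"], work_dir),                # or msbuild, ant, etc.
    ]
    for cmd, cwd in steps:
        if not run(cmd, cwd):
            return "BUILD BROKEN"   # mail this to every programmer on the project
    return "BUILD OK"
```

The important part isn't the tooling, it's that the checkout is into a clean directory every time, so nothing that lives only on a developer's machine can prop the build up.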
I'd be willing to bet that almost all successful large software systems use a similar technique. I'd also be willing to bet that most unsuccessful large systems can point to the lack of a daily build process as a major contributor to the project's failure.
Here's the kicker, though. A daily build process will ensure that you can build your project, and automated testing can ensure that the built version actually works. But there does not appear to be a way to ensure that all the rules are followed and that the project file remains in sync with version control. It's possible to add files to version control without adding them to the project file, and as long as your daily build procedure pulls down the entire source tree, you'll never know it. The only way you can ensure that the project and the source code control stay in sync is to manually open the project from the version control into a clean directory. And that isn't going to happen every day. Or even every month. Instant entropy.
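You could at least detect the drift, if not prevent it, by diffing the set of files the project file claims against the files actually in the checked-out tree. A sketch, assuming you've already parsed the project file into a list of relative paths (the function name and the ignore list are mine):

```python
from pathlib import Path

def find_drift(project_files, source_dir, ignore_suffixes=(".obj", ".exe")):
    """Compare the files a project file references against what's on disk.

    project_files: relative paths parsed out of the project file.
    Returns (on_disk_only, in_project_only); both are empty when the
    project file and the source tree are in sync.
    """
    claimed = {Path(p) for p in project_files}
    actual = {
        p.relative_to(source_dir)
        for p in Path(source_dir).rglob("*")
        if p.is_file() and p.suffix not in ignore_suffixes
    }
    return sorted(actual - claimed), sorted(claimed - actual)
```

Run as part of the daily build, a non-empty result on either side is the early warning that somebody added a file to one place and not the other.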
Daily builds will keep your project on schedule. Build early and build often. But no automated tool is going to prevent entropy in your project's structure. That's just the way it is. It's a dirty little secret that most programmers either don't recognize or prefer not to discuss.
Tuesday, 21 June, 2005
After two months with the laptops, it's certain that our old desktop machines won't be in daily use anymore. I don't know yet who's going to get them, but three of the four old machines will be leaving the house soon. They're not much by today's standards, but a 750 MHz Pentium III with 768 MB of RAM and an 80 GB hard drive would make a decent home file server or browser, email, and word processing machine.
Before I give the machines away, I want to make sure that all personal data is wiped from the drives. That turns out to be a lot more difficult than you might think.
As you probably know, when you delete a file in Windows the data isn't actually erased. What Windows does is "move" it to the Recycle Bin by just changing the location of an index entry. All the data remains on the disk. Even if you tell Windows to delete the file rather than move it to the recycle bin, the data isn't erased. Only the index entry is deleted. Windows will re-use the space taken by the file at some point, but there's no guarantee when. Somebody with just a little knowledge of disk formats can easily pull the data from the disk.
One solution to this problem is to overwrite the file with random data before deleting it. In theory that will deter the casual snoop who knows a little bit about reading individual disk sectors from reading your files. But it only works for files that you explicitly delete. It won't prevent the snoop from gathering data from backup files created by Word or other programs, or from reading the pieces of the Windows swap file that are scattered over your disk. The swap file is especially insidious because it can contain information that you never actually saved to disk. If Windows gets busy and needs to free up some RAM, it will write stuff from memory to disk. That nasty letter you wrote to your boss but didn't actually save might very well be floating around on your hard drive.
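The overwrite-before-delete idea is only a few lines of code, which is part of why so many utilities offer it. Here's a minimal sketch; it has exactly the limitation described above, in that it touches only the file's current allocation and does nothing about swap files, application backup copies, or blocks the file used to occupy.

```python
import os

def shred_file(path, passes=3):
    """Overwrite a file's contents with random bytes, then delete it.

    Deters the casual snoop reading raw sectors; it does NOT address
    swap files, backup copies, or previously freed blocks.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force each pass out to the disk, not the cache
    os.remove(path)
```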
Deleting individual files is not secure enough. To ensure that people can't get data from you, you have to wipe the entire drive. Some people say that formatting the drive is good enough. But Windows leaves certain areas of the disk untouched when it formats. And even the areas that it overwrites aren't as secure as you would think.
I don't completely understand the physics of why, but when you write data to a location on disk, the previous data isn't fully destroyed. It's almost child's play, using equipment and software that's fairly commonplace these days, to reconstruct the original data. In fact, it's possible to reconstruct (with decreasing levels of accuracy) several generations of data in a particular location. For example, if you wrote an "E", then over-wrote that with "B", and then wrote over the same location again with an "X", it's quite likely that a skilled operator with good equipment would be able to reconstruct what you did. Frightening, isn't it? Read Peter Gutmann's Secure Deletion of Data from Magnetic and Solid-State Memory for a little better explanation.
The only positively sure way to securely remove data from the drive would be to destroy the drive. Either grind the disk surface into dust, or melt it down. Acid is more effective than burning, but it's usually possible in either case to reconstruct some data. But if you want to give away a computer with a working hard drive, how do you prevent people from getting at your old data?
The answer is found in a utility called Darik's Boot and Nuke (DBAN), which makes several passes over your entire hard drive, writing specific patterns that are constructed to obscure the previous data. The method used is described in Peter Gutmann's article linked above, and also in the National Industrial Security Program Operating Manual of the US Department of Defense (aka DOD directive 5220.22-M). The basic idea is to make so many different generations of overwritten data that it's virtually impossible to reconstruct the last generation that had actual good data. If downloading and using DBAN by itself seems daunting, download the free Eraser tool, which has an option to create the DBAN boot disk for you.
I haven't actually used DBAN yet. Give me a couple of days and I'll let you know how it works.
Monday, 20 June, 2005
I responded to an email message the other day and got a failure notice in return. It seems that Road Runner has blocked direct connections to its inbound mail servers from the residential dynamic space. See Road Runner Mail Blocks for more information.
This isn't a problem for most people, I know. But I send mail through the SMTP server on my laptop rather than connect to Road Runner's SMTP server. The primary reason is so I can send mail when I'm traveling. Unless I'm connected to Road Runner's dynamic space, I don't have access to their SMTP server. I guess I could change the outbound server when I'm at home, but it's always annoying to change it again whenever I go off network.
I've run into similar problems with other servers that treat my home-based server as "suspect." The solution there usually is to wait 30 minutes or so and try again. Those servers are set up to give a non-permanent error to suspect servers on the first attempt, but to allow the message to go through on a retry. The idea here is to discourage spambots that use a shotgun approach to email and don't check return status. I don't know why Road Runner didn't implement that technique instead of the iron curtain.
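That retry-then-allow technique is usually called greylisting, and the decision logic is simple enough to sketch. This is my own toy model of it, not Road Runner's (or anyone's) actual implementation; real greylisters key on more than just the sender and track much more state.

```python
import time

class Greylist:
    """Temporarily reject mail from (client, sender) pairs we haven't seen.

    A legitimate mail server queues the message and retries after a delay,
    and is then allowed through. A fire-and-forget spambot that never
    checks return status never comes back, so its mail is never accepted.
    """

    def __init__(self, delay_seconds=1800):   # ~30 minutes, as described above
        self.delay = delay_seconds
        self.first_seen = {}

    def check(self, client_ip, sender, now=None):
        now = time.time() if now is None else now
        key = (client_ip, sender)
        if key not in self.first_seen:
            self.first_seen[key] = now        # remember it, reject this attempt
            return "451 Temporary failure, try again later"
        if now - self.first_seen[key] >= self.delay:
            return "250 OK"                   # a patient retry gets through
        return "451 Temporary failure, try again later"
```

The 451 response is a standard SMTP transient-failure code, which is exactly why well-behaved servers treat it as "try again later" rather than a bounce.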
I could, of course, use my Web-based client when I'm on the road. Except that I detest Web interfaces and it's difficult then to get an archive of the sent messages back down to my laptop once I get home.
Would somebody please fix the email spam problem? I've only been waiting for the last five years.
Sunday, 19 June, 2005
When I visit the grocery store I often make a detour by the beer cooler to see if there's anything new. I don't drink a whole lot of beer these days, so I can afford to splurge on the high-priced imports or boutique beers from time to time. The local supermarket has an impressive selection--a far cry from the Miller/Coors/Budweiser fare of most places.
The other day I ran across Steel Reserve (warning, Shockwave required), a product of the Steel Brewing Company. What caught my eye was the promise of 8.1% alcohol. I've always liked high-gravity beers because they're usually more flavorful than most styles that you find. Although you have to be careful: barley wines, Russian Imperial stouts, and doppelbocks can get to be so strong that the taste isn't enjoyable.
I don't know what style Steel Reserve is supposed to be. I've seen some commentary on the Web refer to it as a "malt liquor," but that's like calling a Chevy Suburban an SUV. "Malt liquor" seems to be the generic term for any high gravity lager.
Whatever it's supposed to be, it's not very good. I poured two glasses for Debra and me. The color is good--darker than a normal beer. No appreciable head on the brew, and I wasn't especially careful in the pour. The first taste is somewhat sweet. The aftertaste is a little sour. There is nothing in between. No complex or subtle flavors and an indifferent mouth feel. My comment after finishing the glass was, "Well, that was interesting." Debra was less impressed than I was.
Steel Reserve tastes better than your average American beer, but there are plenty of other brews that I prefer. It has two redeeming qualities: price and alcohol content. If you're looking to get hammered quickly and inexpensively, I'd recommend Steel Reserve. One glass of that stuff and I was buzzing pleasantly. It's hard to imagine that people buy it in 40 oz bottles and drink the whole thing themselves. I doubt I could stand the taste for that long, and I'm certain that 40 ounces at 8.1% alcohol would lay me out cold.
I don't think I need to try that one again.
Saturday, 18 June, 2005
Sometimes it seems the more I learn about computers, the less I actually understand. Who would have thought that it would be so difficult to back data up to an external device? See June 16 for the preliminaries.
When I finally uninstalled Iomega's automatic backup software, Windows magically let me safely remove the device. That's problem one solved. One would think that just disabling the software would be enough, but apparently there's something that gets loaded at startup and holds a reference to the drive. Is this another example of hardware manufacturers' inability to create usable software?
Michael Covington saw Thursday's note and recommended Ntbackup. Yes, the program that comes with Windows. I had to reformat the Iomega drive with NTFS so that it could hold the backup file. Ntbackup writes the backup into a single large file, a "feature" that I'm not entirely happy with. The only saving grace is that Ntbackup is installed by default on Windows 2000, XP, and 2003. Twenty minutes after I formatted the drive I had a backup of all my data. Not bad. I'll need to explore the program a bit so I can learn how to schedule backups and such, but Ntbackup looks like the right solution.
Because Ntbackup stores everything in a single file, those files can get very large. The backup file on my system, for example, is more than 15 gigabytes. FAT32 partitions have a maximum file size of 4 gigabytes, so if you want to use Ntbackup you have to format the backup drive with NTFS. Unless you make really small backups.
Encouraged by a successful backup of my system, I turned to Debra's. But for some reason I can't format her Western Digital external drive with NTFS. Windows keeps telling me that something is accessing the drive. I have no idea what that something might be. I guess it's time to do a little research. Google is my friend.
I feel a rant coming on.
Friday, 17 June, 2005
One of the groups most critical of the Broadband over Power Lines (BPL) initiative is the Amateur Radio Relay League (ARRL)--ham radio operators. Their (our) contention is that BPL causes harmful interference across the HF and lower VHF radio spectrum. Many uninformed critics of the ARRL and hams in general say that our opposition to BPL is just a knee-jerk reaction--that the interference caused by BPL is minimal at best and easily countered by turning up the amplifier. (Which works, by the way, for transmitting. But the interference causes problems with reception.) They go on to point out (wrongly) that ham radio is a dying hobby and that any value provided by amateurs was in the past. In short, in the minds of BPL advocates, we hams are stuck in the past and refuse to give up our useless hobby and release our radio spectrum for other uses.
See the ARRL's BPL area, Broadband Over Power Line (BPL) and Amateur Radio for more information about the ham community's position on this topic.
One of the primary purposes of the amateur radio service is, according to the FCC, technical investigation. Over the years since the service was created, amateurs have made many innovations in advancing the state of the art in radio communications. Kevin McQuiggin has a detailed discussion in his thesis Amateur Radio and Innovation in Telecommunications Technology. Click on the link at the top of the page to see the entire thesis in a PDF file. While it's true that opportunities for innovation are fewer now than in the past, amateurs do still experiment and often pioneer new technologies or techniques for using current technology.
The July 2005 issue of QST, the magazine of the ARRL, has an article about a group of hams in Virginia's Shenandoah Valley who used commercial off-the-shelf 802.11 wireless gear to create a wide-area wireless network. This gear is intended for use at a range of 250 to 300 feet. By adding some home-built microwave antennas, they were able to extend the range in an early experiment to 34 miles. For Field Day last year, they set up a solar powered 802.11 router at a remote site and connected it via microwave link to another relay station 17 miles away. That station relayed to a home station that was 2.4 miles away. Users at the Field Day site were able to check their email and browse the Web from their laptops, all over a wireless link constructed from standard wireless gear.
That is innovation. You can bet that people who are trying to put together community-wide or city-wide wireless links will be talking to these guys.
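The reason a pair of home-built dishes can stretch 300-foot gear to 34 miles is straightforward link-budget arithmetic. Here's a rough sketch using the free-space path loss formula. All of the power, gain, and sensitivity figures below are my own illustrative assumptions (not numbers from the QST article), and free-space loss ignores terrain and Fresnel-zone clearance, so treat this as a feasibility estimate only:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (Friis formula), d in km, f in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

miles = 34
d_km = miles * 1.609
loss = fspl_db(d_km, 2437)          # 802.11b channel 6, about 2437 MHz

tx_power_dbm = 15                   # typical consumer access point
antenna_gain_dbi = 24               # high-gain dish or grid at each end
rx_sensitivity_dbm = -90            # typical sensitivity at 1 Mb/s

# Received power = TX power + both antenna gains - path loss
rx = tx_power_dbm + 2 * antenna_gain_dbi - loss
link_ok = rx > rx_sensitivity_dbm   # the link closes with margin to spare
```

With stock rubber-duck antennas (2 dBi or so) the same arithmetic comes out some 40 dB short, which is exactly why the antennas are where the innovation happened.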
Thursday, 16 June, 2005
When we decided to replace our desktop machines at home with laptops, Debra was concerned about carrying around a computer that has all of our financial data on it. To address that, and to provide a convenient backup mechanism, we purchased a 120 GB Western Digital external USB hard drive. Problem solved. Quicken is on the laptop and its data is stored on the external drive. Plus, with regular backups we're only out the cost of the hardware if we somehow lose the computer.
My original intention was to back up my laptop on one of my desktop machines. That turned out to be a real pain in the neck, and I'm trying to get rid of the desktops. They create way too much heat and noise. So today I went down to Fry's and picked up an Iomega 160 GB drive. Less than a buck a gigabyte for an external drive is just amazing.
The Iomega drive has a very annoying feature: it turns on automatically when it sees a signal on the USB port. So if it's plugged into my laptop when I power up, the drive comes on. That wouldn't be so bad, I guess, except that then I can't turn it off. Windows XP tells me that the device cannot be stopped, and hitting the power switch on the drive has no effect. This isn't much of a problem in practice because I only use the drive for backups. The only time it is a problem is when I finish a backup and want to disconnect the drive. Since I have no way of knowing how long Windows is going to hold on to its buffer before flushing it to the drive, I have to restart my computer before I can turn the drive off. I'll keep the drive now that I've written data to it, but I probably won't buy another Iomega unless I can find a way around this problem.
Both of these drives, by the way, come with backup software. In theory this is a good thing. In practice, it stinks. I learned a long time ago that hardware manufacturers' software stinks. I'll never understand why hardware manufacturers continue to provide such crap utility software with their products rather than licensing something that's actually useful. I especially dislike the Western Digital software because it writes the backup files in a compressed format that can be read only by that backup software. At least Iomega's software gives me the option to write real files to the disk. Sure, writing compressed files saves space. But at the expense of portability. I'd rather be able to take the drive to any computer and read the files directly rather than having to track down the backup utility and install it. And with 160 gigabytes I won't be running out of space on my backup volume any time soon.
I'm in the market for some good backup software. Any suggestions?
Tuesday, 14 June, 2005
Need to write a report for your pointy-haired boss? Be sure to visit the Web Economy Bullshit Generator for all those cool phrases that he's expecting to see.
On a related note, I was perusing one of my many bookshelves last week when I ran across The Official Politically Correct Dictionary and Handbook. I don't know where I obtained this gem, but it's a real hoot. The book contains four sections:
- Part I: A Dictionary of Politically Correct (PC) Terms and Phrases presents the basics. Here's where you will find the definitions of new terms like "hair disadvantaged" and "heterocentrism." According to the book's user's guide, "Part I is also particularly helpful when you want to find out what oppressions you've been subjected to, and what to call the perpetrators."
- Part II: A Politically Incorrect/Politically Correct Dictionary is a thesaurus and bilingual dictionary combined. Use it to look up outmoded and offensive terms like "fat" and "prisoner," to learn the new, non-offensive terms like "possessing an alternative body image" and "client of the correctional system."
- Part III: Other Suspect Words, Concepts, and "Heroes" to Be Avoided and/or Discarded goes beyond words and covers core thoughts, customs, and beliefs that corrupted our culture. You'll learn, for example, that dating is the result of phallocentric social conditioning, the purpose of which is the continued subjugation of women in order to maintain the white male dominant culture.
- Part IV: Know Your Oppressor: A Bilingual Glossary of Bureaucratically Suitable (BS) Language is a guide to the correct-speak of corporate, political, and military bureaucracies. Acid rain is "poorly buffered precipitation," and influence peddling is really just "using one's credibility for accomplishing an objective."
The sad part about The Handbook is that there is a reference for almost every usage described. Government, corporate, and academic leaders actually use these terms and hold the beliefs that are described in the book. It's frightening in some ways. And very funny. But it's like a book of Gary Larson "Far Side" cartoons--best taken in small doses lest you overdose on hilarity.
The book was copyright 1992, so you might not be able to find it in stores. Surprisingly, it's still available on Amazon. But don't worry if you can't find the book. Just search Google for politically correct dictionary.
Sunday, 12 June, 2005
Can somebody identify this lizard for me? Debra has books on Texas bugs and snakes, but nothing on lizards or other reptiles. I didn't see anything that even remotely resembles this thing in any of the sites that matched my search for Texas lizards or amphibians.
I know the picture isn't terribly good, but it was about midnight when I snapped this picture on my porch. The small sized picture actually looks better than the full image, but if you're interested go ahead and click on the image at left for a larger view.
[Update 6/17: Reader consensus is that this is probably a juvenile Mediterranean Gecko. Thanks to all who responded. Somebody also suggested that I give an estimate of the lizard's size. At the bottom left of the picture is the edge of a 2x4 stud. That edge is 1.5" wide, and it looks like the lizard is three to four times that length. So figure between 4.5 and 6 inches from nose to tail.]
I'm a little disappointed that my camera wasn't able to get a better picture than this. The picture at left is the first one I took, using the camera's normal snapshot setting. I tried several of the night mode settings, but every one turned out somewhat blurred, probably because I wasn't able to keep the camera steady enough. Slower shutter speeds require a steadier hand.
Saturday, 11 June, 2005
Today begins phase two of the house remodel. Phase one was the garage conversion that took a little over four years because we started it with no money, no clear plan, and no schedule. Phase two won't take nearly as long, nor will it be as expensive. The idea is to bring the house up to modern standards. The house was built in 1978 and, as you can see by the pictures (click on the image for a larger view), it shows. Our remodel plans are simple: replace the floor coverings, probably with wood, fix any cracks in the drywall, replace the cheap hollow core doors, remove the acoustic popcorn from the ceilings, and replace the door trim and baseboards with something a little less plain. That will take care of the two spare bedrooms and the hallway.
I started the demolition work this afternoon by removing the carpet and scraping the ceiling in one of the bedrooms. Those are simple projects that I can get done in a weekend. We have a contractor coming in next week to give us some more solid numbers on installing floors, doors, and the rest.
Phase three of the remodel is the master bedroom and bathroom, which will get much the same treatment as the rest of the rooms. Following that we'll probably spruce up the outside before gutting the kitchen--a very expensive proposition.
Thursday, 09 June, 2005
The movie Hotel Rwanda tells the story of Paul Rusesabagina, the manager of the Mille Collines hotel in Kigali during the 1994 Rwandan genocide. I was pleasantly surprised by the movie. It's drama, of course, and I suspect that the writer and director took a few liberties, but it's a reasonably good portrayal of the events. Rather than try to review the movie myself, I'll encourage you to see it or to read one of the many reviews available on the Web.
To recap the real events: in a 100-day period starting on April 6, 1994, somewhere around 800,000 (some put the number at 1,000,000) people in Rwanda were killed by "militias" armed mostly with clubs and machetes. People were rounded up door to door, robbed, raped, enslaved as prostitutes, and hacked and beaten to death, all in broad daylight. Most of the people killed were minority Tutsi, but many Hutus were also murdered for their moderate views. The History Place has a good summary here.
The United Nations, despite ample warning of impending problems, did absolutely nothing to avert the crisis or help end it once it had begun. In typical bureaucratic fashion, the U.N. issued statements condemning the killings, being very careful not to use the word "genocide," as that would have forced action. For the most part, people around the world looked at the news on their television screens and said, "That's horrible," before flipping the channel. You see similar reactions today to events in the Darfur province of Sudan.
A lot of people in the United States are of the opinion that something like that couldn't happen here. "Those Africans are just crazy," is a commonly held belief. "We're civilized," is another, implying that somehow our society is above such things. Or maybe they're implying that whites are above such behavior, conveniently forgetting the Nazi counter example that occurred only 60 years ago. Whether derived from elitism, racism, or something else, the idea that a genocide-type event could not happen in the U.S. is just plain wrong. It doesn't take much of a stretch to envision it happening here.
In what I think is the most likely scenario, workers displaced by outsourcing or off shoring, or who have had their formerly high wages for jobs like construction or meat packing reduced by competition from immigrants (legal or illegal) form roaming militias that terrorize or kill immigrants. "Immigrant" would probably have a broad definition in this case, encompassing anybody who doesn't fit the stereotype of a white American. Anybody who opposed such a militia would be dispensed with as well. I don't think such a thing would be nationwide, but I can easily envision flare-ups in large cities, National Guard troops called in to restore order, and tensions escalating until at some point there is a pitched battle between National Guard troops and one of these militias.
The only reason that hasn't happened yet is because things are still going relatively well. A 6% unemployment rate, bad as it is, doesn't make for a critical mass of dissatisfied citizens, especially when almost all of the unemployed here live far better than the vast majority of people in the world. Things are a lot different if 35% of skilled workers become unemployed and are struggling just to eat. Then you're talking about tens of millions of people who have few prospects and a lot of time to nurture hatred for the groups they think are the cause. It's not a pretty picture. If idle hands are indeed the devil's tools, you probably don't want to imagine the combination of idle hands and a well-nurtured grudge.
I'd like to think that we're a long way from that happening, but I'm not so sure. Current trends have me a little worried. As conditions in Mexico and Central and South America deteriorate it becomes more attractive for people living there to risk entering the U.S. illegally in search of a better life. One can hardly blame them. This is true of people from other countries as well but Latin American immigrants are more prevalent simply because they're closer. It's a whole lot easier to hitch a ride north than to hop a freighter from Asia, Europe, or Africa.
The Minuteman Project, benign though it may be, is the first real step towards a large militia. Here we have a well organized private group policing the border because the government agencies in charge either can't or won't do it. This is ringing warning bells throughout the government, but in typical bureaucratic fashion our elected representatives are misinterpreting the signals. The Minuteman Project's members are seriously concerned about illegal immigrants taking American jobs, overwhelming our social welfare system, and changing our way of life. Claiming that Minuteman supporters are racist or causing problems with our neighbors to the south will not change the fact that people are becoming increasingly less tolerant of illegal immigration. Government officials at all levels should stop issuing statements to further their petty political goals. They must act to stem the tide of illegal immigration. That means securing the borders, denying services to non-citizens, and tracking down and deporting illegal immigrants. Failure to do so will eventually result in a violent confrontation.
My personal position on immigration is quite a bit more relaxed than most. I believe we should tighten our borders to prevent illegal immigration while at the same time removing some of the roadblocks that prevent productive people from obtaining work visas, green cards, and U.S. citizenship. We should welcome people who strive to enter our country legally, often at great risk to themselves. Our society was built by risk takers and dreamers who came to this country and worked to provide a better life for themselves and their families. Currently we have our priorities backwards. We make it very difficult for those attempting to enter legally and at the same time allow untold thousands to enter illegally.
If you want to see what the hard liners have to say, start at Michelle Malkin's Immigration Blog and follow some of the links. These people are serious and they're becoming angry. The longer things remain as they are, the more people join the ranks of the hard liners. A few fringe wackos is disturbingly funny. An organized group of motivated people who support the fringe wackos is very frightening.
Wednesday, 08 June, 2005
If you've ever been involved in a large development project, you probably experienced the code formatting nightmare. How many spaces to indent? Where does the open brace go? Should you indent case labels? There's little possibility that you'll convince a group of programmers that your way is best, and invariably you'll end up with an imposed standard that nobody supports 100%. Imposing the standard is the easy part. Enforcing it is something else entirely.
Visual Studio 2005, which will ship when the .NET Framework 2.0 ships, has some pretty impressive code formatting built in to the C# editor. With it, you can set formatting standards that are enforced automatically as the code is typed. And you can reformat code almost instantly. It's very cool stuff.
Tuesday, 07 June, 2005
I've never been a big fan of browser based Web applications. Programs that run in the browser suffer from a number of major drawbacks, any one of which should discourage developers from writing browser applications. When you consider all of the drawbacks, I find it inconceivable that anybody would seriously consider distributing a browser app.
The most obvious problem is platform incompatibility. When developing a browser application, you have to decide which browser versions you want to support. Will it be just Internet Explorer 6? What about Firefox and earlier Internet Explorer versions? Will you want to support Opera, Netscape, or Safari? Once you've decided on a platform, then you need to determine if you'll write to the lowest common denominator (the subset of functionality supported by all of the selected browsers) or if you will support browser-specific functionality. Whatever you decide, you have to be willing to accept that your application will look and probably operate slightly differently in each different browser.
Backward compatibility is another issue that you will be faced with. The major browser development teams make an attempt to support old functionality in new versions of their browsers, but their track record is less than stellar. Many a Web application has broken with the introduction of a new Internet Explorer version. One can argue that operating system version upgrades also introduce incompatibilities, but I've never seen a Windows upgrade break things to the extent that I've seen an Internet Explorer upgrade break Web applications.
If the above doesn't dissuade you, consider the user interface nightmare that's inherent in a browser based application. First, there are two menus: the browser's menu and the Web application's menu. It's possible, I guess, to combine the two into a single physical menu, but that will confuse your users almost as much as (perhaps more than) having two menus.
Then consider the effect of hijacking shortcut keys. In Internet Explorer, for example, Alt+D moves the focus to the address bar, and Alt+Left Arrow is the keyboard shortcut for the Back button. Those are just two of the many keyboard shortcuts that I have come to depend on in Internet Explorer and Firefox. Browser applications, though, hijack or disable many of the shortcut keys. I'm not sure if it's the J2SE Runtime Environment used by Internet Explorer or the DigiChat software that runs under it, but something hijacks those keyboard shortcuts.
As if that's not enough, control tab order is goofy, page load and unload events have unexpected side effects, scrolling is weird, etc. In general, browser based applications try to look like client applications, but little differences combine to make them slow, clunky, confusing, and unpredictable. I can't understand why users put up with them.
And that's the frustrating part. Users will complain about browser based applications, but they still use the darned things. It's just so much more convenient to allow automatic installation of an ActiveX object or Java applet than to download and install a rich client application. Microsoft has attempted to address this with their zero-impact deployment of client applications in the .NET Framework, but that initiative is stymied by the relatively small installed base of .NET systems.
Browser based applications are, unfortunately, the only reasonable method of providing rich Web-based content. I am of the opinion that the inherent problems in the platform outweigh any perceived benefit of such applications, but I also realize that I'm in the minority. In my experience, both from the user's and the developer's perspective, browser based applications aren't worth the trouble.
Sunday, 05 June, 2005
It's not terribly surprising to find that laptop computers outsold desktops, but I didn't expect it to happen quite this soon. The article points out that laptop prices have dropped 17% in the last year, whereas desktops have dropped only 4%. Laptop quality also has improved, as has battery life. Combined with very good video performance, large hard drives, and CD burners, laptops today are more powerful and less expensive than top-of-the-line computers were three or four years ago. Add wireless capability (95% of today's laptops have wireless) and the near ubiquity of free wireless access in most of the U.S., and it just makes sense to buy a laptop or notebook computer.
A laptop makes a whole lot of sense for a home computer even if you won't be traveling much. My Dell Latitude is just slightly larger than a book. It takes very little space on my desk, and I can very easily disconnect it and put it in a drawer or on a shelf out of the way. Since I installed a wireless router, I can also take the computer to any room in the house or even outside by the pool if I want to do some leisurely reading or play games away from the office. Debra can put her laptop on the kitchen counter to follow a recipe if she doesn't want to print it. I can do research while I'm watching the brew pot, or shop for car parts while I'm out in the garage working on the Mustang.
Traditional desktop computers still have their uses, although the large footprint and high power requirements make them much less attractive. Desktops these days seem to be clustered at the edges: sub-$500 entry level machines, and high end performance machines. I still find my old desktops useful as lab machines, and I suspect that many like having a fixed server for file and printer sharing, but for the most part the age of the general purpose desktop appears to be coming to an end. Cheap USB hard drives and wireless printer nodes have almost eliminated the need for centralized computing in a home environment.
I'm interested to see what manufacturers do now. Available computing resources, even in laptops, have surpassed most people's needs. 2 GHz is more than sufficient for listening to music while surfing the Web or answering email. Or to watch a DVD movie in full screen. I expect we'll see more people ditching their old desktops for laptops, but what then? Where's the "killer app" that will require twice the horsepower and entice people back onto that three year upgrade cycle that drove the 90s tech boom?
Friday, 03 June, 2005
I'm still going through my pictures from Japan and trying to write down as many of my memories as I can before they fade with time.
On my Saturday morning walk I spied what looked like a Christian church tucked away a couple of blocks off the arterial street I was walking on. This was something that I hadn't seen yet in Japan. I'd probably wandered past dozens of shrines or temples without knowing it. Seeing this church nestled among the modern buildings in the middle of Tokyo was kind of strange. It's a beautiful building with some well-tended gardens. I was in a hurry so I didn't go inside.
I've mentioned before that space is at a premium in Tokyo. Most people live in apartments and have no yard to speak of. I certainly didn't see much unpaved space. But there are trees along most streets and many people keep little gardens of potted plants outside their buildings. During my morning walks I often saw the owners watering their gardens. The language barrier prevented us from speaking, but they seemed okay with me taking pictures.
Thursday, 02 June, 2005
I ran across some shots today that Debra took of Charlie the day after he showed up in the yard. The vet estimated Charlie's age then at about 9 months. He was malnourished, dehydrated, and suffering from a very advanced case of demodectic mange. If you ever wanted to know what a mangy pit bull looks like, here's a good example. Click on the pictures to view full size.
It took about eight weeks to eliminate the staph infection and clear up the skin lesions caused by the mange, and for Charlie to put on 20 pounds. In the shot above, taken on July 15, 2002, he weighed 50 pounds. In this shot, taken on September 9, he weighed 70 pounds. Ideally we'd like to keep him between 70 and 75, but he's been tipping the scales at close to 80 lately. I guess we need to exercise him more.
He was already housebroken and mostly leash trained when we got him, and it seemed as though somebody had begun obedience training as well. Charlie is loving, happy, playful, protective, and an endless source of amusement. He also has a lazy ear. Whoever dumped or lost this dog missed out on a very good pet. Debra and I weren't looking for another dog when he showed up, but now we can't imagine not having him.
Wednesday, 01 June, 2005
Wandering the blogs this morning, I ran across the Autorantic Virtual Moonbat. What a hoot! Type anything into the text box and you're greeted with a senseless babble of incomprehensible political views. It's like watching TV politics shows or listening to talk radio except there's no person there trying to convince you that the opinions expressed are worth paying attention to.
You can play with the Moonbat on the linked page, or copy the HTML code to include either version on your own Web page. Thank Sean Gleeson for this morning's laugh.
"Moonbat," by the way, is the term that conservative bloggers use to describe modern liberals, peace protestors, and other ideological opponents. The Wikipedia entry says that it's an epithet similar to "Feminazi" or "Idiotarian." In the little reading I've done of conservative blogs, I've only seen it used to describe people who are way out on the fringe.