November 15, 2009

Easing Operating System Installs

For many years I have been a fan of periodically wiping my computer and starting clean. The habit began with my first computer that could get on the internet. I downloaded all kinds of stuff, from games to utilities to OS enhancements, and with each newly installed program the system would crawl a bit more.

At the time, in the mid-90's, companies were putting out products left and right promising to speed up your PC, make your applications work better, and reclaim valuable disk space. I tried them all, and they had one thing in common: they were useless. For every CPU cycle you got back, two were wasted by the program; for every bit returned, two were used; and so on.

Then there was the malware. Notable pieces such as the Michelangelo virus and the Melissa worm came out in the mid to late 90's, and with them came the media hype and exposés on hackers and their work. People became frantic about virus protection and internet security, products like Norton and McAfee became household names, and those required installs took even more of that precious hard drive space and those CPU cycles.

I finally became fed up with all the smoke and mirrors and snake oil and started reinstalling the OS and applications instead. I tried to make the process as painless as possible by backing up all my applications before a reinstall so I could start installing them immediately afterward. That cut down the time after the install, but I still had to locate each installer on my computer, or find the latest version on the internet, and burn them to CD.

Throughout this time people had advocated making recoverable images of the system using programs like Acronis True Image or Norton Ghost. I have always thought this was a silly, pointless endeavor. First, reinstalling the OS takes nearly as long as restoring an image, so little time is saved. Second, OSes are constantly getting updates, and those are the first thing I want installed; I do not want an application interfering with an OS update or opening a hole before I have time to patch it. Third, applications are constantly being updated and new ones keep coming out. The whole reason for the reinstall is to be clean and fresh; restoring from an image leaves me with the old and new version of every app installed, and I have wasted all the time I was supposedly saving.

That process remained a yearly event around New Year's, continuing into the Windows XP era. When the first year of XP rolled around I did not even think about reinstalling; everything still worked pretty well. For the first time in almost ten years I found I was upgrading my PC more frequently than my OS. When I did a major hardware upgrade I still reinstalled everything just to be on the safe side. That still involved backing up some applications and tracking down others; as the internet became more accessible and searchable the hunt got easier, but it was never painless.

With Windows 7 coming out I thought I was in for the same adventure. Then I found a new tool called Ninite which eases the application install process. Ninite lets you select from a fairly large catalog of applications, then builds a single installer that installs them all for you at once. The installer checks whether you are running a 32-bit or 64-bit OS and picks the appropriate version, and during installation it accepts all the defaults and declines any browser toolbars, adware, or spyware. This site saved me so much pain and time during my Windows 7 install; it did not eliminate the pain, but it sure helped.

Thank you Ninite!

Cleaning and Organizing

You may have noticed recently that some old posts have changed, or that your feed reader has reposted some articles as new; all of these centered on the goals postings. I subscribed to my own feed and realized that the goals posts really obscure the meaty posts people probably came to read. So I reorganized the site into two blogs, moved the goals posts over to the new one, and cleaned up the articles that referred to them. Caffeinated Code will remain where I post the topic entries, and Caffeinated Goals will have the goals entries.

October 22, 2009

Backing Up Is Hard To Do

I'm sure everyone has a story of losing valuable data; not the stolen kind, the destroyed kind.

There are so many ways things can go wrong: drive motors fail, platters become unbalanced, chips go bad, electrical noise, communication failures, programs go awry, viruses, the list goes on... And on... And on... I can relate stories of each type; however, I have never found any of them to be as bad as letting a human near a computer.

I have done many stupid things to my data: deleted the wrong folder, formatted the wrong drive, thought I had backed things up, not saved often enough, and so on. The worst thing I have ever done was on a shared web host, where I ran "rm -rf /" instead of "rm -rf ./" and, as a lowly user, had the permissions to do it (for all the non-*nix'ers out there, that removes all files and folders recursively from the root of the file system, which can include every disk and remotely mounted file system, without prompting). Stories like this come from everywhere: friends, relatives, or even giant companies like T-Mobile and Microsoft with the loss of user data on the T-Mobile Sidekick after a server failure in the Microsoft/Danger cloud.

So this gets me to the meat of the story: I'm a bit of a bit freak (haha, yeah, I know, soul-crushing pun). The recent Sidekick data loss got me thinking about more robust backup solutions, so I first thought about which questions need to be answered in order to decide what approach best suits the need. Here are the five most important items I could come up with:
  • Is user error a concern? - Unintentionally deleted a file/directory
  • Do old versions and/or copies need to be kept? - Overwrote important calculations
  • Is timeliness a factor? - The contract only allows an hour of downtime a year
  • How many people and devices are there? - One computer, a small office with a few computers, or a large number
  • Is security a concern? - Do certain people only need access to certain pieces of data, or is it OK for everyone to see everything?
User error is the easiest to solve, as all it requires is making frequent backups; in doing so the likelihood of complete data loss drops to near zero and data is recoverable to some relatively recent point in time. There is software built right into most operating systems, and then there is software like Acronis True Image® which makes bootable images of a system, turning recovery into: throw in a DVD, reboot, and say go.
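
To make the "frequent backups" idea concrete, here is a minimal Python sketch that copies a folder into a new timestamped snapshot each time it runs. The folder locations are hypothetical placeholders; point them at whatever you actually want protected.

    import shutil
    from datetime import datetime
    from pathlib import Path

    # Hypothetical locations for illustration -- adjust to your own setup.
    SOURCE = Path.home() / "Documents"
    BACKUP_ROOT = Path.home() / "Backups"

    def snapshot():
        """Copy SOURCE into a new timestamped folder under BACKUP_ROOT."""
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        destination = BACKUP_ROOT / f"Documents-{stamp}"
        shutil.copytree(SOURCE, destination)
        print(f"Backed up {SOURCE} to {destination}")

    if __name__ == "__main__":
        snapshot()  # run from Task Scheduler or cron so the backups stay frequent

Run something like this on a schedule and an accidental delete only costs you the changes made since the last snapshot.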

Versioning can be solved with a version control system. These are often used by programmers to manage source code so many developers can work on it at the same time; systems like Subversion, Git, and Mercurial exist for this purpose. This makes the process slightly more complicated, as it is no longer just a file system to deal with. The first new piece is a central repository, which is similar to a file server, just a bit more advanced. Users ask the repository for a local copy (a check out), which they use to view and make changes; once they are done they send their changes back to the repository (a check in) so others have them available. Once a system like this is in place it keeps each check in of a file as a separate version, so old versions can be looked at and used. These systems can grow very large because of all the copies of data, so keep that in mind if more than text files will be stored in them.
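
Here is a small sketch of that check out / check in cycle, driving Git from Python. The repository URL, file name, and commit message are hypothetical, and it assumes Git is installed and the central repository already exists on the server; Subversion or Mercurial follow the same basic rhythm with their own commands.

    import subprocess

    # Hypothetical central repository on a home server.
    REPO = "ssh://homeserver/srv/git/documents.git"

    def git(*args, cwd=None):
        """Run a git command and raise an error if it fails."""
        subprocess.run(["git", *args], cwd=cwd, check=True)

    # "Check out": ask the repository for a local copy to work in.
    git("clone", REPO, "documents")

    # ...edit files inside ./documents as usual...

    # "Check in": record the change and send it back so others can see it.
    git("add", "budget.ods", cwd="documents")   # hypothetical file
    git("commit", "-m", "Update the October numbers", cwd="documents")
    git("push", "origin", "master", cwd="documents")

Each check in becomes a separate version in the repository, so earlier states of any file remain available later.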

Timeliness can only be solved by having spares of everything on hand, or better yet a replicated hot-spare system ready to go. RAID sounds good at first but is a pretty poor solution on its own: if the power goes out the write fails, if corrupt data is written it is still corrupt, and RAID cards are expensive and you NEED to always keep a spare on hand.

The number of users affects whether a server is needed. If only one computer is used, it is senseless to go any further than a drive or folder on that system. For a small number of computers a basic NAS can be a good, inexpensive solution, though they often lack the sophisticated controls of a full-fledged server. On a larger scale, where additional services (e.g. FTP, Windows file sharing, SSH, a web server) or access restrictions are needed, a server may be the best choice.

The last issue, security, follows from the answer on the number of users: a NAS may have some access controls to keep users away from data they shouldn't see, but a server running Windows, Linux, or Unix will have the full set of controls.

In my case I am implementing a solution for my home; my concerns are user error, versioning, and the number of devices (we have several computers, media servers, and other connected devices). My solution is a server to handle the number of devices, running Git to version my documents and code, plus a NAS with a large amount of storage for automatic backups.

Be safe - protect your bits ;-P

October 13, 2009

Now to Caffeinated Code

Several times I have ventured into blogging only to fall vastly behind and effectively stop posting. Well, I wanted to bring my past articles forward into this new blog and start fresh and clean.

My plan for the blog is to start out posting a full article monthly; longer term this will transition to something more frequent, hopefully to the point of multiple posts each week. I have never been an avid writer, so this is also an exercise in improving that, and with it a lesson in what works for me; so far that is mostly free-form writing with a bit of internal mind mapping mixed in for good measure. One of the toughest things for me to get over is pressing the publish button: is the grammar OK, is it structurally sound, blah, blah. I really have to work on that one.

Update 11/5/09 - I removed much of this post since it had to do with my goals; to keep things clean those now live in a separate blog at goals.caffeinated-code.com