28 November 2007

Technical HowTo: High Availability Monitoring, Part 1

This blog entry is going to be quite technical, so people with a sensitive disposition might want to skip ahead to other entries. O.K., you've been warned!

My goal here is to describe the process of setting up a monitoring system for a High Availability network security appliance. Specifically, this is work for a customer who is going to implement one of our AGORA systems (see my earlier blog from this week) in a High Availability configuration. A key feature of this monitoring system is that it should detect failure of the primary system and switch to a secondary automatically, according to a set of rules.

Now, High Availability means different things to different people. In my case, I interpret it to mean any system which, when correctly implemented, will reduce the probability of a system failure. Since a system is made up of different parts, we isolate those subsystems which are most likely to fail, and put measures in place to detect or prevent their failure.

My goal here is to develop a network-based monitoring subsystem, which will continuously monitor and measure the performance of the target system, and activate special counter-measures in the event of a subsystem failure. I plan to use off-the-shelf components wherever possible, especially open-source tools running in a Linux environment (although not all of the tools selected are of this type.) I think this approach is worth documenting, in case others want to adopt something similar -- and you can learn from my mistakes.

The first step is to choose the development environment. I am going to develop within a VMware appliance, i.e. a virtual machine. This will make it easier for the customer to implement at their own site. I happen to be using a MacBook Pro for this work, but it could just as easily be an Ubuntu Linux or even Windows XP box. Some of the features and tools I plan to implement include:

  • Cacti -- used for time-series graphing of various metrics; particularly useful for showing trends.

  • SmokePing -- a nice RRDtool-based tool to show network latency. Network performance is of particular interest.

  • Perl -- the general-purpose scripting language for writing new functionality.

  • XAMPP -- one of my favourite bundles of Apache, MySQL, Perl and PHP.

  • Mon/Nagios/Hobbit -- I'll select one of these network monitoring tools.

  • VMware -- used to run a virtual machine, for portability.

  • CentOS -- the Linux distribution I chose for running the monitoring system inside the VMware VM.

I considered using a solution such as keepalived, but thought that might be more complex than I need. Plus I like re-inventing wheels...

Preparing the Development System

My first task is to connect to our VMWare server, and build the development environment. This box is stored in our data centre, and only provides access via SSH. Therefore, I am going to tunnel in via SSH, using VNC to get access to the graphical environment.

On Mac OS X, I have chosen "Chicken of the VNC" as my VNC client. Since I need to tunnel in via SSH, I open a terminal window and type the command directly:
ssh root@vmware-dev -L 5901:localhost:5901

In the VNC client, I then connect to display 1 on localhost (VNC display 1 corresponds to TCP port 5901), which tunnels through to the remote system. After entering the password, I am looking at the VMware server's desktop.

Now I use the VMware server's interface to create a virtual machine, using the Red Hat Enterprise Linux 4 template (which is the closest match for CentOS.) I allocate only 640 MB of RAM (this machine will be running as a Web server, and I won't install X11.) I don't need a physical CD, as I have downloaded the CentOS ISO images onto the VMware server, and just need to mount an image as if it were the CD drive. I switch on the VM, and it boots straight into the CentOS installer.

I run through the installation options, selecting mostly the defaults. I made the VM with only 8 GB of disk, so I have chosen a minimal install; I'll add the other packages I need later. My first step, however, will be to use YUM to install any required security patches and updates for the minimal install, then download and install my Web environment, XAMPP. I will also add the VMware Tools, as these are important for good time synchronization (which matters for security applications), because NTP and friends don't play nicely with virtual machines due to clock-tick latency correction.

Here are the commands used:

wget http://www.apachefriends.org/download.php?xampp-linux-1.6.4.tar.gz
yum update

There was around 48 MB of updates for the CentOS packages -- mostly new versions of tools and the kernel, with a few minor security fixes.

See the Apache Friends web site for details on installing XAMPP. Just follow the instructions for hardening it, then make it start on boot: symbolically link the lampp script into /etc/init.d and register it with chkconfig.
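For reference, the boot-time setup looks roughly like this -- a sketch assuming XAMPP was unpacked to its default /opt/lampp location (and note that if your lampp script lacks the chkconfig header comments, you may need to add them by hand before chkconfig will accept it):

```shell
# Make XAMPP (the "lampp" script) start automatically on boot under CentOS.
# Assumes XAMPP has already been extracted to /opt/lampp.
ln -s /opt/lampp/lampp /etc/init.d/lampp   # chkconfig looks in /etc/init.d
chkconfig --add lampp                      # register with SysV init
chkconfig lampp on                         # enable for the default runlevels
chkconfig --list lampp                     # verify the runlevel settings
```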


The first tool I chose to install was SmokePing, by Tobias Oetiker. It's a great tool for visualizing network behaviour, which is an important part of any network-based service. I simply followed the comprehensive installation guide; later, I found a more beginner-friendly SmokePing install guide as well.

For the convenience of the reader, I will paste the commands needed below. I decided to use binary distributions, rather than building from source, to avoid installing too many prerequisites in the VM.

yum install libart_lgpl
yum install perl-Time-HiRes
wget http://dag.wieers.com/rpm/packages/rrdtool/rrdtool-1.2.23-1.el4.rf.i386.rpm
wget http://dag.wieers.com/rpm/packages/rrdtool/perl-rrdtool-1.2.23-1.el4.rf.i386.rpm
# Note both RPMs have to be installed with a single command, to avoid a dependency loop
rpm -Uvh rrdtool-1.2.23-1.el4.rf.i386.rpm perl-rrdtool-1.2.23-1.el4.rf.i386.rpm

wget http://downloads.sourceforge.net/echoping/echoping-6.0.2.tar.gz?use_mirror=heanet
yum install curl
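The echoping tarball still needs to be built from source -- a standard configure/make sequence, sketched below on the assumption that gcc, make and the curl development headers are present (on CentOS, roughly "yum install gcc make curl-devel"):

```shell
# Unpack, configure, build and install echoping 6.0.2 from the tarball
# downloaded above. Run the install step as root.
tar xzf echoping-6.0.2.tar.gz
cd echoping-6.0.2
./configure     # adjust feature flags here if the defaults don't suit
make
make install    # installs to /usr/local/bin by default
```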

I'll continue this in Part 2.


I came across a real gem while browsing the BBC News web site today. Japanese culture contains the delightful concept of "inemuri" (居眠り), which translates as napping or dozing. What's interesting about it is that it is culturally acceptable, in certain circumstances, to fall asleep in meetings or other social gatherings.

Apparently, it is intended to show that you have sacrificed much of your regular sleep for your work, and is considered a kind of macho display -- "look how hard I work, because now I cannot stay awake!" Naturally, like many Japanese traditions, it is not for everyone -- only those of superior social status can afford to indulge in front of their underlings, or those who have no status at all.

Now, I just need to figure out how to incorporate this into the university classes I teach.... hmmmm....

26 November 2007

AGORA Audit Compliance Appliance

I'm really excited about the AGORA Audit and Compliance Appliance which my company has developed, and which is starting to see some traction in the market.

The idea actually came from one of our large Banking customers. It's a simple idea (as some of the best ones are), but one which we haven't really seen elsewhere on the market. The "elevator pitch" is as follows:

Your company or bank has just outsourced some key IT activities -- e.g., application development or database administration. It made sense financially, and you're covered by SLAs, so you know what service you can expect. But you no longer have real control over who is doing what, and when, to your customers' data. A firewall or VPN solution doesn't really help, because it's designed only to keep out unauthorized persons -- the outsourced company has full access, so how do you track what they are doing?

Some systems, like Oracle, let you turn on database auditing -- but if you outsource the DBA function, then your DBA can turn it off. So most of the time, you just have to trust people -- until something goes wrong, some critical table is dropped, or some vital information leaks -- and then you're stuck, because where do you start investigating?

This is the business problem solved by AGORA: a secure application gateway appliance which sits between your internal systems and the authorized persons who need access, and keeps indelible records of all activity -- down to the level of scanning the network protocol in real time, and recording every keystroke or SQL query sent by the external administrator, transparently and with no noticeable impact on performance. It supports SSH, Oracle SQL*Net, Microsoft SQL TDS, HTTP/HTTPS, Telnet, FTP and even X11 protocols. All traffic is captured in separate files, linked to the uniquely-identified user who started the session.

A separate auditor user role can login via the Web interface, and review audit logs of the various sessions managed by the system. The workflow management is integrated with a built-in trouble ticket system, so audit logs of access to a service can be linked to specific problems or activities. We also tie the sessions in with specific VPN-authenticated users (we support Check Point VPN, Open VPN or even pre-shared SSH keys for authentication of users.)

We've recently added plug-in modules for HTTP and HTTPS auditing, which also track all files uploaded to or downloaded from a remote Web server. Our latest version of the software will include SSH session audit (with the ability to play back sessions in real time), as well as X11 sessions. The system works by protocol inspection of every packet -- extracting the audit-relevant information, associating it with a specific two-factor-authenticated user, and writing it to a secure, tamper-proof logging system, including packet payloads (such as SQL commands or SSH terminal sessions.)

We're planning to offer the AGORA system as a hardware appliance for high-performance requirements -- but it's currently available as a software installation, or as a VMware virtual appliance. When installed on a VMware server, the same functionality is available, though with somewhat reduced performance (depending upon the hardware.)

The system uses email and Web interfaces to communicate with its users. Typically, when a trouble ticket is opened against one of the many production databases, an email goes to the support co-ordinator for the company, who assigns it to the next available technician (such as a Database Administrator, or DBA) with the appropriate access rights. Upon receiving the notification email, the technician can click on a Web link, which dynamically opens a port on the firewall (accessed through the VPN), giving access to the relevant service. This starts the audit session, and also keeps track of when activity occurs (which is very useful for SLA verification.)

Naturally, because the system is ticket-based, it blocks access to resources for which no ticket is available -- it can also restrict access to specific time periods -- and will automatically close access when the ticket expires.
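To be clear, I can't show AGORA's internals here, but the ticket-gated access idea is easy to sketch in plain iptables terms. The following is purely illustrative (the address, port and ticket lifetime are all made up), not the product's actual implementation:

```shell
# Illustrative sketch only: grant one technician access for a ticket's lifetime.
TECH_IP=10.0.0.42      # hypothetical technician's VPN address
SVC_PORT=1521          # hypothetical service port (e.g. an Oracle listener)
TICKET_MINUTES=60      # access window attached to the ticket

# Open the port for this user when the ticket is activated...
iptables -A FORWARD -s "$TECH_IP" -p tcp --dport "$SVC_PORT" -j ACCEPT

# ...and schedule the rule's removal for when the ticket expires.
echo "iptables -D FORWARD -s $TECH_IP -p tcp --dport $SVC_PORT -j ACCEPT" \
    | at "now + $TICKET_MINUTES minutes"
```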

In summary, this is a great tool for organizations that need to provide positive auditing of access to critical or sensitive internal resources by outside users (such as DBAs or developers), without requiring special logging to be enabled directly on every resource. With the increasing requirements of Basel II, ISO27001 and Sarbanes-Oxley for compliance programs, such an audit appliance will become essential in every large enterprise.

23 November 2007

Digital Rights and the right to be paid

A recent article in the International Herald Tribune by computer scientist and composer Jaron Lanier argues the case for a new model of compensating artists, writers and other creative types. Having earlier advocated Internet piracy, he now admits he was wrong, and that the Web's promise of increased opportunities to get paid for creative output has not materialized.

In my view, the situation is not as dire as he implies. Yes, there are many writers who would like to earn a living from the Internet, but it's simply not going to happen, due to the huge numbers of "wannabes", and limits to the demand for paid content. Aggregation services tend to function as filters for quality -- in much the same way as publishers trawl through piles of submitted manuscripts, looking for the hidden gem that might turn a profit -- but ultimately, the market will decide.

Simple economics suggests that not every writer can be paid for their writing -- there are simply too many of them, and a huge influx of enthusiastic amateurs has made it even more difficult for good writers to have their voice heard. Fortunately, I believe that the filtering mechanisms will adapt naturally as the ecosystem develops, as we already see many fine writers are featured on Blogs such as BoingBoing, Salon, Technorati and even Digg and Kuro5hin.

Whether these writers make money is an interesting question, which cuts to the heart of Lanier's thesis -- that the advertising model (as supported by Google's AdWords) is not enough to earn a decent living, and that some other micropayment model is required to solve the "free rider" problem. Technically, such systems exist, but they tend to live behind "walled gardens" (such as AOL), or are burdened with restrictive Digital Rights Management (DRM), as with Amazon's popular new Kindle e-book reader.

For me, the more interesting issue is that the content providers -- or more specifically, the publishers -- haven't yet come to terms with the demands of their customers. Currently, many of us watch TV laden with excessive advertising, which disrupts our enjoyment of great programs like "Dexter" and "Heroes." Increasingly, however, there is a new generation of Internet-literate scofflaws who spurn the advertising, and prefer to trade (mostly illegally) in high-definition digital downloads of their favourite TV shows and movies.

As this trend increases, advertisers will see a decline in their revenues, leading to attempts by studios to be more restrictive with DRM -- an effort which is doomed to fail, for good technical reasons. Their only hope is to adapt their business model (as Apple's wildly-successful iTunes has shown can be done with music), so that consumers have more choice over what they download -- and pay a fair price for content which is not locked down with DRM that restricts their options for viewing the shows they want to see.

Ultimately, it may be that a reputation-based system will evolve (such as Cory Doctorow's "Whuffie") -- but I'm not holding my breath. History has shown that artists and writers need some support from the wealthy to create their best works -- but until we achieve a post-scarcity economy, there will always be a surplus of artists and writers (however talented) starving in a garret.

Media Center selection update

My latest thinking is that I will probably buy either a Sony PS3 or Microsoft XBOX 360 as a Media Center. The real issue is going to be DIVX/XVID support. There are rumours that both Sony and Microsoft have finally recognized that support for these codecs in their player firmware is important to some customers. Sony has apparently added a patch in the latest firmware to support selection of this type of file -- but there is no firm date on when it will be able to play them, so I will wait until that turns up before making a decision.

Also of some interest is whether NAS storage (e.g., a Samba mount) could be used with either system. We'll see....

Dance Review: The Beggar and the Bird

Dances with Birds

A drama of self discovery in movement, pantomime and special effects.

It was a chilly Thursday night at the Odeon Theater in Vienna, as the lobby thrummed with anticipation. Nearly 250 people had turned up for the premiere of "The Beggar and the Bird," a Dance and Music performance created by New Zealand Choreographer, Amber Stephens. Together with musician Natalie Jean-Marain, and dancer Albert Kessler, Stephens has produced an original story that entwines soaring vocal improvisation with pyrotechnic displays of Modern Dance energy.

Upon entering the grand portico of the Odeon Theater (formerly an agricultural trading exchange, complete with fluted marble pillars and elegant staircases), the audience found themselves viewing a broad stage, flanked on either side by two mysterious seated plaster figures – apparently the chrysalises from which some strange female figures had recently emerged. A tall banner of newspaper clippings of an actress’ life hung at stage left, while a wheeled mirror waited in the wings. A small group of musicians huddled silently at the rear, accompanied by an elegant singer, seated on a tall stool.

From her first moments on stage, Stephens led us into the interior life and feelings of each character she played. First on stage was the Diva, so upright in posture, silently miming her daily superficialities, while allowing us to glimpse the loneliness beneath the mask. Clad in a simple cocktail dress, she conveyed through gestures and facial expressions the reality of the unreflected life, diverting but shallow.

The story introduces a range of characters, in a transformative journey that leaves each affected by their interactions. The Diva is world-weary, a woman of ambition and power, capable of art, yet selfish and sometimes cruel. In an impressive display of on-stage metamorphosis, the Diva then changes into the Beggar. Initially restrained, the dancing becomes more frenetic, arms gyrating, with twirls, rhythmic breathing, dips and falls, as an insistent drumming begins to be heard.

Events begin to take a darker turn, when the Beggar meets its Shadow – Albert Kessler – who leads the Beggar down paths of power and control, which culminate in obsession, and the total abjection of the Bird, cast down into an emotional well, from which only the newly-awakened compassion of the Beggar can rescue it. The Shadow mirrors the darker side of the Beggar, engaging in a physically demanding pas de deux of puppetry and power, with great leaps, rolls, martial jabs and lifts, as well as much floor work.

Some loss of self seems to be a prerequisite for the classical journey of self-discovery, charting unknown territories of one's internal world, to discover its deeper meaning. This journey is not without missteps, as we learn when the Diva meets the Bird – played by Jean-Marain – whose wings materialize in subtle vocalizations and static poses, aided by a costume of silver and feathers.

Like Kate Bush's Aerial, Jean-Marain invents a language of birds, with its “Kirikeeks” and “Kurruuuuu” cries, evoking the lilt of a forest-dwelling bird-of-paradise. The Beggar, dances to these songs – as Stephens dances patterns that mirror the soaring voice of Jean-Marain. This interaction between Beggar and Bird is the core of the performance, as the Bird sings and the dancer reflects them in motion, a sound-driven marionette. Soon, however, the flow of influence is reversed, and the Beggar delights in exercising control over the Bird's song – with disastrous consequences for the Bird.

At the climax, the Shadow is reintegrated, the Bird redeemed, and the Beggar arises, transformed – ready for the next stage of a journey reflecting the labyrinth of our own life changes. The stunning finale, sung with English lyrics by Jean-Marain, lifts the energy and leaves an impression of serenity and self-acceptance.

Written and choreographed by Stephens, who remained throughout on stage, the work incorporates elements of modern dance, Brazilian capoeira, floor work, hip-hop, and allusions to classical ballet, all performed to a high technical standard. The music, performed live on guitar, piano and extensive percussion, hinted at Shamanic drumming, Arabic motifs and Spanish flamenco themes, emphasizing the different stages of the story, and deftly supporting the high-energy levels of the two dancers. The sparse staging included a curious mirror through which the dancer passed parts of her self, as if seeking to reflect on her actions.

The mystery of the plaster bodies was only resolved at the end of the performance, when a large screen behind the musicians showed the process of applying plaster to the dancer, which hardened and then was shed as if emerging from a cocoon – an apt metaphor for the transformation which we had just witnessed.

The performance had been a huge challenge, the creative team acknowledged later, with hundreds of hours of rehearsals, and the management of extensive details of costume design, staging, musical composition, choreographic research and improvisation. In the end, the work seemed more than justified, and was well received by the audience. The team plans to take the show abroad, to Dance and Arts festivals around the world over the next few years, as well as producing variations of the story in other media, to retell the modern myth of the Beggar and the Bird in different forms.

Disclaimer: The writer contributed the Website design for this performance, but has no beneficial connection with the performers.

"The Beggar and the Bird"
Odeon Theater
Nov. 8, 2007
Choreographer/Principal Dancer: Amber Stephens
Music/Singer: Natalie Jean-Marain
Dancer: Albert Kessler
Website: www.beggarandbird.com

13 November 2007

First Impressions

Just for fun, I thought I'd post another picture of myself onto this Blog. Many people think they know me -- but some might be surprised by what they see in the picture.

The weapon in my right hand is a Czech-made copy of an AK47, with bipod rest. In my left hand is a 9mm automatic pistol. The picture was taken in 2005, somewhere in Slovakia.

07 November 2007

Return on Security Investment (ROSI)

Earlier this year, I prepared a presentation for a Security Conference, which includes a concept which I think other readers might find interesting. It's the "Return on Security Investment." Basically, the idea is to perform a Risk Assessment, and to calculate the probabilities of occurrence of various scenarios which can cause losses or other damage.

Next, you determine the most appropriate controls to mitigate or eliminate those risks, and determine their costs. For example, if you know that there is a 2% chance that the annual Spring rains will bring major floods, and you have a house near a river, you might expect that repairs of the damage caused by flooding could cost you 100,000 of your local currency. You consider various options for protecting your house, e.g., installing flood defenses, diverting the river, putting in basement pumps, etc.

Given that an event with a 2% annual probability is expected, on average, once every 50 years, we can then analyse whether investing in counter-measures -- i.e., security controls -- will cost us more than the event itself. Assuming we normalize for the time value of money (Net Present Value), a single loss event cost of 100,000 means an average cost of 2,000 per year (recall we expect this event on average once every 50 years.) So, if the capital and operational (CAPEX/OPEX) costs of the controls exceed 2,000 currency units per year, then it's probably not a good investment for us.

In which case, our next step would be to try to transfer the risk -- i.e., by finding an insurer who would sell us 100,000 worth of flood insurance coverage for say 1,800 units per year -- which would be a good financial decision, based on Return on Security Investment (ROSI.) Of course, our insurer would be likely to be using a similar basis of calculation -- but they have advantages of scale and usually superior sources of information on risk, and therefore may well offer a better price.
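The arithmetic above is simple enough to sketch as a script. The figures are the hypothetical flood example from the text, and the flood-defence and insurance costs are invented for illustration:

```shell
#!/bin/sh
# ROSI sketch: compare annualized loss expectancy (ALE) with mitigation costs.
LOSS=100000          # single loss expectancy, in local currency
RETURN_PERIOD=50     # a 2% annual probability ~ once every 50 years on average

ALE=$(( LOSS / RETURN_PERIOD ))          # 100000 / 50 = 2000 per year
echo "Annualized loss expectancy: $ALE"

# Hypothetical annual costs of the two options discussed in the text.
for option in "flood-defences:2500" "insurance:1800"; do
    name=${option%%:*}
    cost=${option##*:}
    if [ "$cost" -lt "$ALE" ]; then
        echo "$name at $cost/year: worth it (cheaper than the expected loss)"
    else
        echo "$name at $cost/year: not worth it (costs more than the risk)"
    fi
done
```

On these made-up numbers, the 1,800/year insurance beats the 2,000/year expected loss, while the 2,500/year defences do not -- which is exactly the insurer's reasoning described above.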

In the final analysis, the worst thing we can do is to do nothing, and hope for the best.

06 November 2007

Media Center Extenders

At home, I've been running a Pinnacle ShowCenter 200 (the older model), which has been fine for the past year or so -- until last week, when it suddenly stopped displaying any of the text in the menus. This was really weird -- the system would boot just fine, showing the logo, and the ShowCenter logo in the upper right corner -- but the names of TV shows would not appear at all.

At first I suspected this was due to a recent upgrade of the Linux-based back end -- I'm using MTPCenter, which has done a great job running under LAMPP on my Ubuntu system. I recently upgraded to MTPCenter 2.0 and thought this might be the issue -- but the new version's menus displayed fine for a few days, and downgrading still failed to bring the menu items back.

A related glitch is that fast-forward used to show a percentage indicator -- this no longer displays (although fast-forward itself works just fine.) The strange thing is, I can navigate through the menus by sound, and by viewing the MTPCenter through a standard Web browser -- and the programs stream just fine. My guess is that the character generator for the fonts is broken. I've tried everything with the ShowCenter that I can think of, and also tried hacking on the CSS in MTPCenter to change the font display and background, but with no result.

Anyway, I decided to replace the Pinnacle with something with more capabilities. My ideal Media Center should be able to do the following:

1) Stream MP2 Video, MP3s, DivX and XVID
2) Stream from Internet radio (e.g., www.sky.fm.)
3) View pictures from network storage
4) Work with Linux
5) NOT require a Windows box anywhere
6) Use a remote
7) Output to HDMI or at least component with up to 1080p to my HD TV
8) Handle AC3 audio, and at least Dolby 5.1
9) Handle MKV wrappers
10) Maybe in future play from either Blu-ray or HD DVD.
11) Noiseless low temperature operation
12) I don't want to spend more than 250 Euros, or "roll my own."
13) I don't want DVR or recording functionality

So, I started looking around for some options.

I first got interested in the XBMC Open Source application, which looks really cool. It does most of what I need, but only seems to work on the original Xbox (and not the Xbox 360 or Elite), which means that I won't be able to use HDMI output, or even plug in a HD DVD in future.

I considered the Sony PS3, but am not clear on whether or not it can play DivX or XVID. My guess is it's probably a "no" -- and I don't really like Sony as a brand, although getting the Blu-ray drive is tempting.

I also considered the AppleTV, or even a small Apple box, but the former lacks the decoders I want, while the latter is too expensive -- and both lack HDMI (although realistically speaking, DVI output would probably suffice.)

So there are still some more options.

First up is the Xbox 360 with the optional HD DVD drive. A little expensive, but there is the benefit of getting access to games like HALO -- but who has time for games these days? I barely have enough time to watch Heroes or Prison Break! I don't like supporting Microsoft anyway, although I could *console* myself (nasty pun, that) with the thought that each Xbox sold is a loss for M$. I also don't like the way the Xbox 360 enforces code signing and other nasty DRM measures, and am not aware of a simple "mod" for the 360 which won't invalidate the warranty. So that's out.

Next option is a Mac Mini. I bought one of these for my father, and he seems to use it, but not for TV. The output is DVI I think, but no HDMI -- although probably good enough. While I like Apple boxes (we already have three at home), I can't really justify spending the 600 Euros it costs here in Austria for the smallest model. So that's off the list for now, at least until I win the lottery (which isn't going to happen, since I don't buy tickets!)

I don't really want a box with a built-in hard drive, since I have enough disk space on other machines. I'm using my Ubuntu Box as my TV and media file server, running SAMBA and Azureus, and therefore simply want a box which streams over the LAN, without using Microsoft software anywhere if I can avoid it.

Looking around I found the Linksys DMA2100.

Also interesting is a D-Link box at only 180 Euros -- but I'm not sure whether it has a wired LAN port as well as wireless. It's due out at the end of November, so we'll see. There also seems to be a US version with different specs, the D-Link DSM-750, which looks nice, although it's more expensive, and I don't know if it will be available in Europe.

02 November 2007

Time Machine on older PowerBooks

A trap for young players (yes, me!) when implementing Time Machine on older PowerBook machines.

Leopard will install and run just fine -- and you can plug an external drive into the USB port, and backups will work -- but it's totally impractical, because certain older PowerBook models have only USB 1.1, rather than USB 2.0, which makes backups very, very slow.

My recommendation -- if you have an old Powerbook, then try to get an external drive that uses Firewire rather than USB.

01 November 2007

Using Apple's Time Machine

I've been using OS X 10.5 (Leopard) for a few days, and felt I'd share some of my findings, mainly with regard to Time Machine.

I purchased a family pack, and have upgraded four Macs successfully so far. In my view, Time Machine is a great idea, and worth the price of admission alone, especially for those who haven't done backups before.

Personally, I use Mozy for my basic backup needs, but like the possibility of additional layers of backup which Time Machine provides.  

So, there are a couple of things I discovered:

1. It is possible to trigger Time Machine manually.  Simply hold down the "ctrl" key, then click the Time Machine icon in the dock -- a menu will pop up, which contains the item "Back Up Now".   This is documented in the online help.

2. Time Machine will not activate itself (apparently) when a MacBook Pro is running on battery power. It will wait until mains power is connected, and then schedule itself.

3. Time Machine doesn't handle encrypted files well. Specifically, it won't back up individual files stored in an encrypted file system -- instead, it will back up the entire file system as a unit. This is not too surprising, considering that backing up the unencrypted files would be a security risk. I guess Apple will be working on some workaround for this, but I don't see an easy fix, due to the key management issues.

4. Time Machine apparently does not use encryption, or even compression, for files stored on the backup device. In my view this is a deficiency which should be corrected in the future, or by third-party add-ons. Admittedly it is a tricky problem, because the typical user would be unable to deal with the key management issues. I think Mozy has a reasonable approach in this regard, but it's up to each person to decide how to manage encryption keys.

5. The caveat regarding encrypted file systems also applies to virtual machines, which I believe are treated as a monolithic whole -- it's hard to imagine how it could be otherwise, especially when the virtual machines (I use Parallels, but this also applies to VMware) are not running. Maybe VM vendors will in future expose their guests' file systems to the fsevents mechanism which tracks file changes, and allow Time Machine to selectively back up only the changes in the guest operating systems -- after all, the FAT32, NTFS and ext3 formats are well known.

6. It seems the 5160 build of Parallels has an issue with running VMs which have been restored from Time Machine. I was able to make OS X kernel panic by trying to run a WinXP VM which I had restored.