
Ubuntu 8.04

Back in the early part of 2008 we decided that we wanted to take a fresh look at Linux on the desktop. To do so we would start with a “switcher” article, giving us the chance to start anew and talk about some important topics while gauging the usability of Linux.

That article was supposed to take a month. As I have been continuously reminded, it has been more than a month. So oft delayed but never forgotten, we have finally finished our look at Ubuntu 8.04, and we hope it has been worth the wait.

There are many places I could have started this article, but the best place to start is why this article exists at all. Obviously some consideration comes from the fact that this is my job, but I have been wanting to seriously try a Linux distribution for quite some time. With so much time having passed since the last desktop Linux article here at AnandTech, this was an excellent opportunity to give Linux a shot and do something about our Linux coverage at the same time.

After I threw this idea at Anand, the immediate question was which distribution of Linux we should use. Strictly speaking, Linux is just an operating system kernel; more colloquially, it is the combination of the Linux kernel and the GNU toolset (hence the less common name GNU/Linux). This leaves a wide variety of actual distributions out there, each one its own combination of GNU/Linux, applications, window managers, and more, forming a complete operating system.

Since our target was a desktop distribution with a focus on home usage (rather than being exclusively enterprise focused), the decision was Ubuntu, which has established a strong track record of being easy to install, easy to use, and well supported by its user community. The Linux community has a reputation of being hard to get into for new users, particularly when it comes to getting useful help that doesn’t involve being told to read some esoteric manual (the RTFM mindset), and this is something I wanted to avoid. Ubuntu also has a reputation for not relying on the CLI (Command-Line Interface) as much as some other distributions, which is another element I was shooting for – I may like the CLI, but only when it easily allows me to do a task faster. Otherwise I’d like to avoid the CLI when a GUI is a better way to go about things.

I should add that while we were fishing for suggestions for the first Linux distro to take a look at, we got a lot of suggestions for PCLinuxOS. On any given day I don’t get a lot of email, so I’m still not sure what that was about. Regardless, while the decision was to use Ubuntu, it wasn’t made in absence of considering any other distributions. Depending on the reception of this article, we may take a look at other distros.

But with that said, this article serves two purposes for us. It is first and foremost a review of Ubuntu 8.04. And with 9.04 out, I’m sure many of you are wondering why we’re reviewing anything other than the latest version of Ubuntu. The short answer is that Ubuntu subscribes to the “release early, release often” mantra of development, which means there are many versions, not all of which are necessarily big changes. 8.04 is a Long Term Support release; it’s the most comparable kind of release to a Windows or Mac OS X release. This doesn’t mean 9.04 is unimportant (which is why we’ll get to it in Part 2), but we wanted to start with a stable release, regardless of age. We’ll talk more about this when we discuss support.

The other purpose for this article is that it’s also our baseline “introduction to Linux” article. Many components of desktop distributions do not vary wildly for the most part, so much of what we talk about here is going to be applicable in future Linux articles. Linux isn’t Ubuntu, but matters of security, some of the applications, and certain performance elements are going to apply to more than just Ubuntu.

Background

I think it's impossible to offer a purely objective review of an operating system – qualitative matters like the GUI and nebulous concepts like “ease of use” can't be measured. There is a degree of subjectivity in such a review, and I believe it's important to relate that in this article. To that end, a bit of background on myself is probably going to be helpful in relating my point of view on matters before jumping into Ubuntu. This section was written prior to my even touching Ubuntu, so that it reflects my expectations rather than my experience.

Based on the computers I have owned and the operating systems I have used, I would best be classified as a Windows user. Like many of our readers (and our editors) I have lived the Microsoft life, starting from DOS and going straight through to Vista. I have clocked far more time on Windows than anything else, and it's fair to say that's where my skills (troubleshooting and otherwise) are strongest.

With that said, I am by no means limited to just a single OS. As was customary for most American schools in the 90s, I had access to the requisite Apple IIs and Macintoshes. But to be frank, I didn't care for Mac OS Classic in the slightest – it was a remarkable OS in 1984, and even in 1993 in the age of Windows 3.1, but by the time Windows 95 rolled around it was more of a nuisance to use than anything else. In a cruel joke, when I started working in IT in 2001 I was tasked with using the newly released Mac OS X 10.0 “Cheetah” full-time to gauge its readiness for use on the organization's Macs.

Apple didn't ship Mac OS X as the default OS on their Macs at that time, which should tell you a lot. Nevertheless, while I abhorred Mac OS Classic, Mac OS X was far more bearable. The interface was better than anything else at the time (if a bit too shiny), application crashes didn't (usually) take out the OS, and the Terminal was a thing of beauty. Sure, Windows has a command line environment, but it didn't compare to the Terminal. Mac OS X was a mess, but there were nuggets to be found if you could force yourself to use it.

I'll save you the history of Mac OS X, and we'll pick up in 2004, where Apple had improved Mac OS X a great deal with the release of 10.3 “Panther.” At this point I was a perfectly happy Mac user for my day job, and I probably would have used one at home too if it wasn't for the hefty price of a Mac and the fact that it would require having an entirely separate computer next to my gaming PC. A bit later in what was probably a bad idea, I convinced Anand to try a Mac based on the ease of use and productivity features. This resulted in A Month With A Mac, and he hasn't left the platform since.

Finally we'll jump to the present day. I'm still primarily a Windows user since I spend more time on my desktop PC, while my laptop is a PowerBook G4. I would rather be a Mac user, but not a lot has changed in terms of things preventing me from being one. To replace my PC with a Mac would require throwing down money on a workstation-class Mac Pro that is overkill for my processing needs, not to mention my wallet.

I also am not a fan of dual-booting. Time booting is time wasted, and while I am generally not concerned about boot times, dual booting a Mac would involve rebooting my desktop far more often than the occasional software installation or security update currently requires. It also brings about such headaches as instant message logging being split in two places, difficulty accessing documents due to file system/format differences, and of course the inability to simultaneously access my games and my Mac applications. In theory I could game from within Mac OS X, but in reality there are few native games and virtual machines like Parallels and the Mac branch of Wine are lacking in features, compatibility, and performance.

I also find the Mac to be a weak multimedia viewing platform. I'll get into this more once we start talking about multimedia viewing under Ubuntu since much of the underlying software is the same, but for now I'll say that libavcodec, the standard building block for virtually all *nix media players, is particularly lacking in H.264 performance because the stable branch is single-threaded.

So while I'm best described as a Windows user, a more realistic description would be a Windows user that wants to be a Mac user, but can't bear to part with Windows' games or media capabilities.

As for my experience with Linux, it is not nearly as comprehensive. The only time I ever touched Linux was in college, where our department labs were Dells running Linux and the shell accounts we used for assignments were running off of a small Linux cluster. I never touched the Red Hat machines beyond quickly firing up Netscape Navigator to check my email; otherwise the rest of my Linux usage was through my shell account, where I already had ample experience with the CLI environment through Mac OS X's terminal.

My expectations for Ubuntu are that it'll be similar to Mac OS X when it comes to CLI matters - and having already seen screenshots of Ubuntu, that the GUI experience will be similar to Windows. I am wondering whether I am going to run into the same problems that I have with Mac OS X today, those being the aforementioned gaming and multimedia issues. I have already decided that I am going to need to dual-boot between Ubuntu and Vista to do everything I want, so the biggest variable here is just how often I'll need to do so.

It's Free – Gratis

When doing the initial research for this article, one of my goals was to try to identify all of the reasons why I would want to use Ubuntu. While there are many reasons, a lot of them are what amount to edge cases. At the risk of being accused of shortchanging Ubuntu here, after using Ubuntu for quite some time the main reasons came down to this: It's free, and it's secure. That's it. Many of the popular Linux applications can be found for Windows, non-gaming performance largely isn't a concern on a high-end desktop such as mine, and no one is making any serious claims about ease of use when compared to Mac OS X. Ubuntu is free and Ubuntu is secure, but that's about it.

We'll start with “free”, since that's one of the fundamental subjects. When we say Ubuntu is free, there are two elements to that. The first is that Ubuntu costs nothing; it is free (gratis). The second is that Ubuntu's source code is open and can be modified by anyone; it is free (libre). This is expressed in the popular and simplified slogans of “free as in beer” and “free as in speech.” Many software products are freeware (e.g. Futuremark's PCMark) but fewer products are open source. The former does not necessitate the latter or vice versa, although practical considerations mean that most open source software is also freeware in some fashion since you can't keep people from compiling the source code for themselves.

There's fairly little to explain with respect to Ubuntu being freeware. It can be downloaded directly from the Ubuntu website in the form of an ISO disc image, and copied, installed, and wiped as many times as anyone would like. Ubuntu's corporate sponsor/developer Canonical also sells it for a nominal price (currently it's listed on Amazon for $12), but there is no difference between the retail version and the download version. It's a free operating system, and free is a very good price.

Being free does mean giving up some things that would normally come with purchased software. Official support is the first casualty: since it's a free OS, there is no one being paid to support users. We'll dive into support in depth in a bit, but for now it's enough to remember that Ubuntu does not come with official support. Support options are limited to the Ubuntu Knowledge Base, the forums, and whatever additional help can be found on the internet.

There's more to being able to offer Ubuntu for free than just not offering official support. Incidental expenses of assembling and distributing Ubuntu are covered by Canonical, which expects to eventually make a profit from Ubuntu by selling enterprise support. Development of Ubuntu and the underlying Linux components is done by a variety of volunteers working in their spare time, along with paid employees from companies such as Novell, IBM, Red Hat, and others who use Linux in their commercial products and have a vested interest in its development.

However - and this is where we're going to take a bit of a detour - there is also the matter of who is not paid because Ubuntu is free. The United States patent system allows for ideas and methods to be patented, along with the more typical physical devices. What this means is that everything from encryption methods to video codecs to file systems can be and are patented by a variety of companies. As a result a lot of technologies in common use are patented, and those patents must be licensed for use when it comes to the United States (and many other countries with similar patent systems). Ubuntu includes software that uses patented material, but since Ubuntu is free, no one is paying those license fees.

Now I want to be very clear here that the reason I bring this up is because it's interesting, not because it's a problem. The chief example of where patents are an issue is media playback. MP3, MPEG-2, H.264, AAC, and other common formats have paid license requirements. This directly rears its head for the user when you first fire up Ubuntu's movie player and attempt to play a movie using a patented codec. Ubuntu goes to great lengths to point out that it needs to use a patented codec to play the material, and that unless the user has a valid license for the codecs it may be a patent violation to play the material, ultimately giving the user the option to download what Ubuntu calls the “restricted” codec set that is not distributed for legal reasons.

With that said, the legal issues are entirely theoretical for the end user. While using the restricted codecs is technically a patent violation, to our knowledge no individual has ever been sued or otherwise harassed over this, and we don't expect that to ever change. The licensing bodies like MPEG-LA are concerned with commercial products using their property – if someone is making money from their property, they want a piece of it. They are not concerned with home use of their codecs, and quite frankly users have nothing to be concerned about.

It should also be noted that Ubuntu (and other Linux distros) are not alone in this. VLC, Media Player Classic, various Windows codec packs, and many other free media players are also technically in violation of patent law for the same reasons. Even if someone is a Windows user, there's still a good chance they're violating patent law. For all practical purposes it's very hard to avoid being an IP violator, no matter the platform.

Meanwhile, for those that absolutely must stay on the right side of the law, there are options, but they're not pretty. Canonical sells licensed software packages that can play back most media formats: Cyberlink's PowerDVD Linux for DVD playback, and the Fluendo Complete Playback Pack for everything else. However, the price may be shocking: being legit is going to cost you $50 for PowerDVD and another $40 for Fluendo. This makes for a small but notable difference from Windows and Mac OS X, where it's entirely possible to be both free and legitimate through legal software that is given away at no cost – Winamp, QuickTime, DivX, and Flip4Mac all fall under this umbrella. Again, this makes no practical difference – no one holding a patent cares – but it's something any Ubuntu user trying to play back media is going to have to pay attention to for a fleeting moment.

Ultimately, the important bits to take away from this are that Ubuntu is free as in beer, and for the price you're only giving up official support. There are some patent issues, but since no one on either side actually cares, it doesn't matter. If nothing else, Ubuntu will be the best-priced operating system you will ever use, and price matters.

It's Free – Libre

While the value of “free as in beer” is easy to describe, the value of “free as in speech” – otherwise known as libre – is harder to relate. Nonetheless, rather large books have been written on the subject, so we'll try to stick with something condensed.

Virtually everything distributed with Ubuntu is an open source program in some manner. Many of the components of Ubuntu, such as the Linux kernel and the GNU toolset, are licensed under the GNU General Public License (GPL), which in a nutshell requires that any software distributed under the GPL license include either the source code itself or a way to get the source code. Other bits of Ubuntu are under slightly different licenses with slightly different legal requirements, but the outcome is effectively the same. Ubuntu is free - you can get the source code to it and modify/distribute it as you see fit.
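
As a quick sketch of what this means in practice, the source for nearly any package Ubuntu ships can be fetched with the package tools themselves (this assumes the deb-src lines in /etc/apt/sources.list are enabled; coreutils is just an example package):

    sudo apt-get update                # refresh the package lists
    apt-get source coreutils           # download and unpack the package's source code
    sudo apt-get build-dep coreutils   # install everything needed to rebuild it locally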

But when we're talking about Ubuntu, there's more than just being able to access the source, as most of the development teams that are responsible for the programs included in Ubuntu have their projects open for public participation. So not only can you take the code and modify it, but if your modifications are good enough they can be submitted back to the main project and possibly be included in a future version of the software. The fundamental idea of open source software is that users are empowered to see how software works and to modify it as they see fit. Other lesser benefits also exist, such as protecting authors' rights by preventing people from taking the code and improving it without sharing it (the GPL), and making sure all the authors are properly credited.

This does not always make open source relevant for the user, however. The fundamental benefits of open source software are for people who are programmers, but most users are not programmers. Being able to see and edit the code is not necessarily useful if you don't know how to use it. Even with a background in programming, I would be hard pressed to quickly contribute significant code changes to most projects; very few programs are small and simple enough to easily jump into these days.

Still, there are some definite benefits for those of us that can't throw out code like Linux's chief architect Linus Torvalds. The most direct benefit, of course, is that this software exists at all. Since all of the software in Ubuntu is free as in beer, many of these programs are not written by paid developers. Open source as a default state makes it easier for people to contribute to the development of software, and that means it's easier for such gratis software to be continually developed in the first place.

Open source software is also a benefit for the longevity of software. Since no one person has absolute control over a project, no one can terminate it. This means that someone else can pick up a project and continue should the original developer(s) quit, as is sometimes the case with old software. It also allows for software to be forked, which is to take the code from a project and create a derivative separate from the original project – the benefit being that a forked project can be taken in a different direction than the original developer may want. As proof of the importance of forking, a number of programs in Ubuntu are forks of older projects; X.Org, the implementation of X11 (otherwise known as just X) that serves as Ubuntu's base windowing system, is itself a fork of XFree86.

Finally, open source software is beneficial to overall software security. If you can see the source, you can analyze it for possible bugs. If you can edit the source, you can fix those bugs rather than wait for someone else to do so - and the importance of this can't be overstated. The direct relevance to the average user is once again limited here, since most people cannot read or write code, but it does filter down through benefits such as rapid patching of security vulnerabilities in some cases. The security benefits of Ubuntu being open source are some of the most important reasons we consider Ubuntu to be secure.

In short: even if you can't code you benefit from Ubuntu being a free (libre) operating system.

It’s Secure

Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful to be that way. Ubuntu is just as capable as any other desktop OS out there when it comes to catching malware if the user is dedicated enough. The dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two reasons. The first is practical, and the second is technical.

To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer to that is “yes” and yet that’s not quite how things work. Or more simply put: when’s the last time you’ve seen a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any kind of significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates, but other than a few proof of concept trojan horses, the only time anyone seems to be making a real effort to break into a Mac is to win one.

So I am going to call Ubuntu, with its smaller-still user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and there’s a number of years’ worth of history with the similar Mac OS X that says that’s not going to change. There just aren’t any credible threats to be worried about right now.

With that said, there are plenty of good technical reasons, too, for why Ubuntu is secure; it may be practically secure, but it would also be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that can be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through these attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.

Less enthusing, however, is that – in part because of the lack of services to expose – the OS does not ship with an enabled firewall. The Linux kernel does have built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from how Windows Vista ships, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does), which necessitates pulling down a 3rd party package or configuring it via the CLI.

Operating System | Inbound | Outbound
Windows Vista | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them
Ubuntu 8.04 | All applications allowed; no GUI to change this | All applications allowed; no GUI to change this
Mac OS X 10.5 | All applications allowed; simple GUI to allow blocking them | All applications allowed; no GUI to change this

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly the same as how I leave Mac OS X set up – all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to all that ails computer security, but they are a good idea. I would rather see Ubuntu ship like Vista does, with an active firewall blocking incoming connections.
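
For anyone who wants the Vista-style default-deny behavior anyhow, it can be configured from the CLI. What follows is a minimal sketch using iptables directly (ufw, a simpler CLI front-end, also first shipped with 8.04); note that rules set this way are not persistent across reboots without additional tooling:

    # Allow loopback traffic and replies to connections this machine initiated...
    sudo iptables -A INPUT -i lo -j ACCEPT
    sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    # ...then drop any remaining unsolicited incoming traffic.
    sudo iptables -P INPUT DROP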

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates modern Windows security, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software does not expect free rein of the system when running, and hence should not present a modern vulnerability. This is more an artifact of previous design than a feature, but it bears mentioning as a pillar of total security.

Moving on, there is an interesting element of Ubuntu’s design being more secure, but I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, and Ubuntu falls under this umbrella somewhat. Because the OS is based heavily around a package manager and signed packages, it’s not well geared towards installing software outside of the package manager. Depending on how it’s packaged, many downloaded applications need to be manually assigned an executable flag before they can be run, significantly impairing a user’s ability to blindly run anything they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit – it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
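
To illustrate the executable-flag hurdle (the file name here is purely hypothetical):

    ./downloaded-app            # fails: bash reports "Permission denied"
    chmod +x ./downloaded-app   # the user must deliberately mark the file executable
    ./downloaded-app            # only now does the program actually run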

Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute bit support helps to prevent buffer overflow attacks, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo security mechanism keeps software running with user privileges unless specifically authenticated to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy features that enable further locking down the OS, although these are generally overkill for home use.
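
As a concrete illustration of the sudo mechanism (a minimal sketch; the file path is just an example):

    # Writing to a system location fails with ordinary user privileges...
    echo "test" > /etc/example.conf                  # "Permission denied"
    # ...and succeeds only when explicitly elevated, after re-authenticating:
    sudo sh -c 'echo "test" > /etc/example.conf'
    sudo rm /etc/example.conf                        # clean up the example file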

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because it allows for anyone to check the source code for security vulnerabilities and to fix them. Conversely, being able to see the source code means that such vulnerabilities cannot be completely obscured from public view.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu – Firefox for example has been patched for vulnerabilities seven times now. There are enough vulnerabilities that I don’t believe just counting them is a good way to decide if Ubuntu being open source has a significant impact on improving its security. Plus this comes full-circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu also gains a security advantage in that users can manually patch flaws immediately (assuming they know how), and security updates are pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday; while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.

Finally, looking at Ubuntu there are certainly areas for improvement with security. I’ve already touched on the firewall abilities, but sandboxing is the other notable weakness here. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes using different methods, with similar results. Meanwhile Chrome is not yet available for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.

Ubuntu – Long Term Support

One item of particular interest with Ubuntu is their development schedule. Because a typical Linux distribution is composed of many applications from many different parties, the Ubuntu developers do not directly control or develop a lot of the software included in Ubuntu. Furthermore Ubuntu tries to be a complete desktop environment rather than just an operating system, which means it includes a wider variety of software than what’s found in Windows and Mac OS X.

What this amounts to is that Ubuntu needs to provide future patch support for the included applications while compensating for the fact that it doesn’t develop many of these programs itself. Coupled with this is the fact that 3rd party application development is not necessarily synchronized to Ubuntu’s release schedule, and some applications (and the kernel itself) can have a rather rapid development rate.

Trying to deal with all of these factors, Ubuntu has settled on two classes of releases. Every 6 months – in October and April – Ubuntu takes what’s ready and releases a new version of the OS. For 1st party material this is tied to some goal for the release (such as replacing the audio daemon), while for 3rd party software this may be as simple as grabbing the latest version. This puts regular Ubuntu versions in an unusual position when trying to classify them – a release is significantly more than a Mac OS X point update, still more than a Windows service pack, and yet generally encompasses less than a new version of either OS. But at the same time, there’s no guarantee that any given release of Ubuntu won’t break software compatibility or binary driver compatibility, which puts it up there with major OS releases.

Furthermore, because of the need to provide security updates for all of these different programs across all of these different versions, Ubuntu has a very short support cycle. In that cycle only bug fixes and security updates are issued; software is not otherwise changed, as each release is intended to represent a stable platform. A regular release is only supported for 1.5 years, which means, for example, that support for 7.10 Gutsy, the immediate predecessor to 8.04 Hardy Heron, expired this past April. This pushes new versions of Ubuntu back towards the idea of them being closer to a service pack or a point release. In essence, it’s intended that everyone using regular versions of Ubuntu will stick to a relatively rapid upgrade treadmill.

But this obviously doesn’t work for everyone, which results in there being two classes of Ubuntu. What we’re looking at today, 8.04, is what Ubuntu calls a long term support (LTS) release. Every 2 years a version of Ubuntu is labeled as a LTS release, which entails a much greater effort on the developer’s part to support that edition of the OS. The standard support period is 3 years instead of 1.5 years, and for the server edition of the OS that becomes 5 years.

This makes the LTS releases more comparable to Mac OS X and Windows, both of which have long support periods in excess of 3 years. This is also why we’re starting with a review of Hardy, in spite of it being over a year old now, because it’s the current LTS release. Regular short-support Ubuntu releases have their place, but they are not intended for long-term use. Coming from Windows or Mac OS X, a LTS release is the comparable equivalent.

Operating System | Mainstream Support | Extended Support
Windows | 5 years | 5 additional years
Ubuntu | 1.5 years | None
Ubuntu LTS | 3 years | None
Mac OS X | So long as it's the newest OS | So long as it's one version behind

Unfortunately, in spite of the LTS designation, not all of the applications in an LTS release are intended to be used for such a long period of time, nor are their developers willing to support them for that length of time. Take Firefox for example: the last Ubuntu LTS release, 6.06 Dapper, shipped with Firefox 1.5. Mozilla ended support for Firefox 1.x very quickly after Firefox 2 shipped, and you can’t even get support for 2.x now that 3.x has been out for quite some time. This leaves the Ubuntu developers in charge of supplying security updates for the older versions of Firefox they still support, which while better than the alternative (no security patches) isn’t necessarily a great solution.

The Ubuntu developers have done a good job of staying on top of the matter (they just published a new 1.5 security patch as recently as last month) but it highlights the fact that the Ubuntu developers do not always have the resources to maintain both a stable platform and the necessary security updates. So while an LTS release is supposed to be supported for 3 years, in reality not every component is going to make it that long.

Digging through the bug lists for Dapper and Hardy, I get the impression that these kinds of cracks only occur in less-used software (particularly software that is not part of the default install, such as VLC). So users who need to stick with the base OS for the entire life of an LTS release, but don’t mind upgrading a few applications, can go that route and cover all of their bases. Unfortunately this is easier said than done, and we’ll get to why that is when we discuss the package manager.

What this amounts to is that if you’re the kind of person that intends to run a computer and an OS for a very long period of time – say on the scale of XP, which turns 8 this year – Ubuntu likely isn’t a good fit for you.

What’s the Value of Technical Support, Anyhow?

Besides patching bugs and security vulnerabilities, the other aspect of “support” is technical support; help for when things go wrong. As I mentioned earlier, Ubuntu is free, and one of the conditions of this is that there is no official technical support for Ubuntu for the user. To be fair, there are some purchasable support options for larger organizations that can afford a support contract, but for the average desktop user this isn’t accessible. So as far as we’re concerned, Ubuntu doesn’t have any official technical support.

I spent quite some time chewing over the idea of just how valuable technical support is. I have never made a technical support call for desktop software, partly because I’m capable of finding and fixing the issue myself through the magic of Google, and partly because calling for technical support seems to be a futile exercise in being fed canned support scripts. So many possible things can go wrong with software that the person on the other end of the line may not be able to help you, which makes me question the value of technical support for software.

Trying to come up with a resolution for this matter, I posted a poll last year in our forums to get some user feedback. The skills of the people who inhabit our forums versus those who read our site means that this poll is not a scientifically valid poll, nor is it even a fair poll; it’s greatly biased towards the techie crowd like myself. Nevertheless, I wanted to know who uses technical support when they have it.

I had theorized that the results of the poll would end up reflecting my own views, and this is exactly what happened. When our forum participants were asked if they had ever called Microsoft for technical support with Windows (excluding activation issues), only 9 out of 52 votes (17.3%) were a “yes.” Clearly, among our techie crowd, the majority of users do not use their technical support options.

Based on this, I do not believe that technical support for a software product is valuable for the people most likely to be installing Ubuntu on their own. Or in other words: So what if Ubuntu doesn’t come with technical support? It’s not like most of us would use it anyhow.

However, I should take a moment to separate software-only technical support from total technical support. It becomes another matter entirely when you can get support for a complete computer from an OEM. They can support both the hardware and the software, and that means they can (probably) help you solve issues when what looks like an issue with one element is really an issue with the other.

The benchmark here is Apple, since they make both their hardware and their software, which puts them a step above Dell and other PC OEMs that are a bit more separated from the software. What I’m getting at is that even if Ubuntu came with technical support, it would be of limited value since they cannot help you with your hardware. If you need real support, you’re better off buying a computer from an OEM who can support all that you need (although we should note that even for computers sold with Ubuntu, the OEM does not usually handle the software support…).

Finally, just to throw out an example of how useless technical support can be even when you have it, let’s take a look at Windows (we’d take a look at the Mac, but OS support is bundled with the hardware). Even for a retail copy of Windows, which Microsoft offers direct support for, you only get free technical support for 90 days after activation. After that you’re out $59 per incident. It’s effectively installation and post-installation support, not support for continuing use.

In the end, not only would technical support likely be of little benefit for most people once they’re past the installation process, but there’s no real precedent for offering technical support on just the OS. So while there’s no technical support for Ubuntu, it ultimately doesn’t matter, because no one else provides cheap extended technical support for just their OS either.

A Word on Drivers and Compatibility

As we mentioned earlier, Ubuntu and the Linux kernel are open source projects, distributed in large part under the GPL license. Owing largely to the philosophies of the GPL, Linux handles drivers in a notably different fashion than Mac OS X and Windows.

In a nutshell, the developers of the Linux kernel believe in the open source movement and wish for all related software to be open source. Furthermore, they do not like the implications of attaching a closed source “binary blob” driver to the Linux kernel: if something goes wrong, it can be impossible to debug the issue if it occurs in a driver for which they do not have the code. As such they have moral and technical objections to the Linux kernel supporting external drivers, and actively discourage the creation of such drivers. This is done through mechanisms such as not having a fixed API for external drivers, and by not holding back kernel changes just because they would break external drivers. Drivers that they do have the code for can usually just be recompiled against the new kernel, and are unaffected as a result. The upshot is that “binary blob” drivers are systematically opposed.

For the most part, this works fine. Not all hardware is supported under Linux because not everyone is willing to share the specifications and data needed to make a driver, but more than enough device manufacturers are willing to share such data that Linux generally supports non-esoteric hardware quite well. There is one class of notable hold-outs here however, and that’s the GPU manufacturers, namely ATI and NVIDIA.

Compared to other drivers, GPU drivers are different for two reasons. First is the sheer complexity of the drivers: besides interfacing with the hardware, the drivers are responsible for memory management, compiling/optimizing shader code, and providing a great deal of feedback. This in essence makes GPU drivers their own little operating system – one that their developers aren’t necessarily willing to share. The second significant difference is that, because of the above, GPU drivers are among the only drivers that have a compelling reason to be updated regularly; they need to be updated to better support newer games and to fix bugs in the complex code that runs through them.

Complicating matters further is that some intellectual property in GPUs and drivers is not the property of the company who makes the GPU. AMD doesn’t own everything in their Universal Video Decoder, and just about everyone has some SGI IP in their drivers. In the interest of protecting that IP, it is difficult to release the code for those drivers containing other companies’ IP.

Because of all of this, manufacturer-supplied GPU drivers are not always open source. Intel and S3 do well in this respect (largely because they have few tricks to hide, I suspect), but hyper-competitive NVIDIA and AMD do not. AMD has been looking to rectify this, and back in 2007 we discussed their starting work on a new open source driver. Development has been progressing slowly, and for the R6xx and R7xx hardware, the open source driver is not yet complete. Meanwhile NVIDIA has shown no real interest in an open source driver for their current hardware.

So if you want to use a modern, high-performance video card with Linux, you have little choice but to also deal with a binary blob driver for that card, and this becomes problematic since, as we mentioned, Linux is designed to discourage such a thing. Both AMD and NVIDIA have found ways around this, but the cost is that installing a binary driver is neither easy nor bug-free.

The fundamental method that both use to accomplish this is a kernel shim. Both analyze the kernel headers to identify how the kernel is organized, then compile a shim against that kernel. One side of the shim absorbs the kernel’s lack of a stable API, while the other side provides the stable API that NVIDIA’s and ATI’s driver cores need.

Ubuntu in particular takes this one step further: in the interest of promoting greater out-of-the-box hardware compatibility, it includes a version of the binary drivers with the distribution. This is unusual for a Linux distribution and has earned Ubuntu some flak, since it’s not strictly adhering to some open source ideals, but it also means that we were not forced to play with driver installers to get Ubuntu fully working. Ubuntu had no issues with either our AMD 2900XT or NVIDIA 8800GTX cards, both of which were picked specifically because we wished to test Ubuntu on suitably old hardware that existed in time for Ubuntu to include support for it. With that said, the drivers Ubuntu includes are understandably old (once again owing to the idea of a stable platform), which means we can’t avoid installing drivers if we want better performance and application compatibility.

And this is where “easy” comes to an end. We’ll first start with AMD’s installer, the easier of the two. They have a GUI installer that installs a driver along with a Linux version of the Catalyst Control Center. It’s spartan, but it gets the job done.

NVIDIA on the other hand does not have a GUI installer – their installer is a text-mode installer that requires shutting down the X server (the GUI) in order to install. It’s difficult to overstate just how hard this makes driver installation. Not only is doing all of this completely non-obvious, but it requires interfacing with the CLI in a way we were specifically trying to avoid. It’s something that becomes bearable with experience, but I can’t call it acceptable.
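
For the curious, the process boils down to something like the following, run from a text console (Ctrl+Alt+F1) rather than from within the GUI; the installer file name is illustrative, matching the 173.14 driver series current at the time:

    sudo /etc/init.d/gdm stop                      # shut down GDM and the X server
    sudo sh NVIDIA-Linux-x86-173.14.12-pkg1.run    # run NVIDIA's text-mode installer
    sudo /etc/init.d/gdm start                     # bring the GUI back up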

Driver upgrades are an issue on both sides, because the installers are not completely capable of finding and eliminating older versions of the binary drivers. In one instance, for the NVIDIA drivers we had to track down a rather sizable shell script that automatically deleted the old drivers before installing the new ones, as that was deemed the “right” way to install the drivers. We had less of an issue with ATI’s drivers, but to be fair, the primary card I used for my time with Ubuntu was the 8800GTX, so I can’t confidently say there aren’t other issues that I simply never ran into.

The Ubuntu community does supply tools to help with GPU driver installations. One such tool is EnvyNG, which reduces the driver installation process to selecting the driver you want to install and letting it do the rest. This is a far easier way to install drivers; in the right situation it’s even easier than driver installation under Windows. But it suffers from needing to have the latest driver data hardcoded into it, which means you can only use it to install drivers it knows about, and nothing newer. It’s not regularly updated (as of this writing the latest driver versions it has are NV 173.14.12 and ATI Catalyst 8.6), so it’s good for installing newer drivers, but not the newest drivers.

The other tool is access to Ubuntu’s Personal Package Archives, which are a collection of user-built binaries that can be installed through the Ubuntu package manager (more on this later). It’s harder to use than EnvyNG, but anyone can build a PPA, which makes updates more likely. As it’s user-generated however, this still means that there won’t always be the latest drivers available, which means we’re still back to using ATI and NVIDIA’s installers.

As it stands, installing new GPU drivers on Ubuntu is between an annoyance and unbearable, depending on how many hoops you need to jump through. It’s certainly not easy.

The other problem with GPU drivers is that they do not always stay working. Among the issues we encountered was ATI’s driver failing to work after installing an Ubuntu update, and an NVIDIA driver that kept rebooting the system during testing for reasons we never determined (once we wiped the system, all was well).

Our final issue with the state of GPU drivers on Ubuntu is their overall quality. With a bit of digging we can come up with issues on both sides of the aisle, so it’s not as if either side is clean here. But with that said, we only ended up experiencing issues with ATI’s drivers. We encountered some oddities when moving windows that were eventually fixed in the Catalyst 9.3 drivers. It turns out the problem was that ATI’s drivers lacked support for redirected OpenGL rendering; the Linux site Phoronix has a great article on what this is, including videos, that explains the importance of this change.

Ultimately we hate to sound like we’re beating a dead horse here, but we can’t ignore the GPU driver situation on Ubuntu (and really, Linux as a whole). The drivers have too many issues, and installing newer drivers to fix those issues is too hard. Things could be worse – Ubuntu could distribute driver updates only with OS updates, à la Apple – but they could also be better. For the moment it’s the weakest point of Ubuntu when it comes to installing it on a high-end system.

The Package Manager – A Love/Hate Relationship

Out of every piece of software in Ubuntu, the package manager is the single most monumental and unique piece of the operating system. I can tell you about Evolution (Ubuntu’s email client) or Totem (Ubuntu’s media player), and even if you’ve never used these programs it would be easy to relate them to other things you likely have used. Relating a package manager is a bit harder. The use of a package manager – and going further than that, completely relying on one – changes the OS experience entirely. Some of these changes are good and some are bad, driving what has become a love/hate relationship with apt, Ubuntu’s package manager.

Rather than trying to explain what a package manager is, it would be easier to point out the things that already are package managers. Package managers are more common than most people would think, as several familiar systems use package managers without it manifesting in an obvious way. My iPhone runs a package manager – two in fact – one being the iTunes App Store and the other being apt (the same as Ubuntu) sitting underneath Cydia. Steam is also a package manager, taking care of its own little microcosm of games, mods, and SDKs. Most people have used a package manager without realizing it.

But none of them take it as far as Ubuntu. Steam only uses package management to install games; the iPhone via apt takes it a little further to install a wider base of applications and frameworks; but none of them integrate package management into the OS like Ubuntu does. Everything in Ubuntu is a package, starting with the kernel and moving on to drivers and applications. And the ramifications of this are huge.

When you go to install an Ubuntu application, there is no need to track down an installer for it, make sure it’s the latest version, make sure it’s not really a Trojan or virus-infected, etc. All of the applications bundled with an Ubuntu release sit on Ubuntu’s servers as packages. Finding software to install (if it didn’t already come on the CD) is as easy as firing up the Add/Remove Applications application and looking for the application you’d like to install. And if you don’t know what you want to install? Ubuntu will tell you all about whatever application you’re looking at.

Once an application is installed, the package manager will keep track of that application. It can uninstall the application if you need to remove it, or make sure it’s up to date if at some point a newer version (such as a bug fix) is published. The package manager brings everything together.
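
The same operations are all available from the CLI through apt as well; here is a minimal sketch using VLC as the example package:

    apt-cache search "media player"   # search package names and descriptions
    apt-cache show vlc                # read about a package before installing it
    sudo apt-get install vlc          # install it, dependencies resolved automatically
    sudo apt-get remove --purge vlc   # uninstall it, configuration files and all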

From an application perspective it’s little different than the iTunes App Store, but compared to what other OSes do it’s a big deal. How many different applications install their own updater service? Even though Microsoft and Apple consolidate updating their own software into their respective software update systems, they can’t do that for everyone else’s applications. Chrome, Flash, Java, etc. all have updaters running in the background just to keep their respective applications up to date. And while these updater applications are small compared to what they’re tasked to monitor, it’s nonetheless a waste of resources. Why do you need many applications to do the same job? On Ubuntu, you don’t.

On Ubuntu, the package manager is also in charge of keeping the OS itself updated, which is where we see it significantly diverge from our earlier example of the iTunes App Store. Mixed in with application updates are updates to various system components, each one dutifully made into its own package. This makes it very easy for Ubuntu to distribute component updates as needed (rather than bundling them together as larger updates), but it’s also a bit daunting – there are a lot of updates, even when starting from Ubuntu 8.04.3. Nevertheless, for the curious types, this allows you to see exactly what’s being updated, and usually there’s a note attached with a meaningful explanation as to why.
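
Keeping the entire system current – OS components and applications alike – comes down to the same two commands (a sketch):

    sudo apt-get update         # refresh the package lists from Ubuntu's servers
    sudo apt-get dist-upgrade   # apply every pending update, kernel included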

Ubuntu’s package manager is the most foolproof way to install and maintain software I’ve ever used on a computer. And that’s why I love it.

The package manager is also the outlet of my frustrations with Ubuntu, for many of the same reasons. Everything in Ubuntu is a package. There are no drag-and-drop installs like in Mac OS X, and there are no MSI/NSIS/InstallShield installs like in Windows; there is only the package. The problem is that the package manager is an extremely self-limiting device when combined with Ubuntu’s software distribution philosophy, as we mentioned earlier. Ubuntu isn’t just distributing an OS on which you run programs; they’re distributing the programs themselves, and it’s all one stable platform.

You’ll first discover how frustrating this can be when you decide that you would like a newer version of some piece of software than what Ubuntu offers. We’ll take Wine for example, which develops at a rapid pace. If you want to be able to install the latest version of Wine, rather than version 1.0.1 that comes with Ubuntu, you’ll need to follow these instructions, which are composed of adding new repository entries to apt, followed by downloading and importing an authentication key into apt so that it will trust the packages. Only then can you go back into the package manager and tell it to install the latest version of Wine.
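
The gist of those instructions looks something like this; the repository URL and key location below are placeholders, with the real values coming from the Wine project's instructions:

    # Add the third-party repository to apt's list of package sources...
    sudo sh -c 'echo "deb http://wine.example.org/apt hardy main" > /etc/apt/sources.list.d/winehq.list'
    # ...import the key apt will use to verify that repository's packages...
    wget -q http://wine.example.org/apt/key.gpg -O- | sudo apt-key add -
    # ...and only then will the package manager offer the newer version.
    sudo apt-get update
    sudo apt-get install wine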

The Ubuntu project does offer a slightly simpler alternative through the Personal Package Archives, which are packages uploaded by users and hosted by the project. PPA repositories are a bit easier to install than a standard DEB repository, but the primary draw of PPAs is that there’s additional software available as packages for easier upgrading and maintenance. However, since PPAs are maintained entirely by users, they’re unreliable as a source of updates, and not everything is made available for Hardy.
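
Subscribing to a PPA under Hardy is a similarly manual affair; the repository line below is hypothetical (the add-apt-repository helper that automates this arrived in later Ubuntu releases):

    sudo sh -c 'echo "deb http://ppa.launchpad.net/someuser/ubuntu hardy main" >> /etc/apt/sources.list'
    sudo apt-get update   # the PPA's packages now show up in the package manager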

As a result of all of this, the package manager makes software installation on Ubuntu a good deal harder than doing the same thing on Mac OS X or Windows. And if you want a piece of software that’s neither the default Ubuntu version nor the latest version from another repository, good luck: the package manager is designed to make upgrading easy, not necessarily downgrading.

The package manager exists to the detriment of any other way to install software. Technically software packages can be distributed outside of a repository, but in my experience that is very uncommon. After that you have the shell script containing a binary blob (which may or may not be recognized and opened correctly) and the more bearable but rare compressed folder. You are, for better or worse, stuck with the package manager in most cases.
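
When a standalone package is available, installing it bypasses the repositories entirely; a sketch with a hypothetical .deb file:

    sudo dpkg -i some-application.deb   # install a standalone package directly
    sudo apt-get -f install             # have apt repair any missing dependencies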

This is why I hate the package manager. To the credit of its developers, this is more a flaw in the philosophy of Ubuntu than in the technology, but the package manager is the minion enforcing the harsh realities of that philosophy. It’s here where the wheels start falling off of Ubuntu. It works well when you want to run software that Ubuntu provides in its main repositories – but only software that they provide. Installing any other software is at times a nightmare.

I’ll close out this section reflecting on the iTunes App Store one more time. In spite of being a package manager, I have no qualms with it. Apple doesn’t tie app versions with OS versions, so I can always grab the latest version. Meanwhile if I need an older version it’s not easy, but double-clicking on archived IPA files is still less troubling than trying to pull off something similar with Ubuntu.

True nirvana for software installation and updating lies between Ubuntu’s strict package manager, and Windows’ loose environment of installers. Apple found one solution, though certainly not the only one. Ubuntu would do well to find a similar way to meet in the middle. As much as I love a unified installer and updater, as done by Ubuntu it causes me more frustration than enjoyment. I consider the package manager to be the worst regular experience of Ubuntu.

UI & Usability

To put Ubuntu’s GUI in the context of existing operating systems, I’d lump it in with Windows XP. If you can use Windows XP, then you’re going to be right at home with Ubuntu. The window layouts are similar, the buttons are the same, and many of the shortcut key combinations are the same. Whether it’s intentional or not I can’t say, but given the similarities, it’s a very easy transition to Ubuntu coming from Windows.

But there are some important differences between Ubuntu and XP, and they start to make themselves apparent almost immediately. The taskbar and its conjoined twin the start menu (the Menu Bar in Ubuntu) have been separated – the taskbar gets the bottom of the screen and the menu bar gets the top. Because the menu bar is always visible by default, this looks close to Mac OS X, but due in large part to the fact that applications do not share the menu bar as they do in Mac OS X, it’s functionally much more like XP. Joining it up top are the Ubuntu equivalents of the quick launch toolbar and the system tray. This leaves the taskbar at the bottom, containing running applications along with the controls for Ubuntu’s virtual desktop implementation.

This is something I find works quite well on narrow screens but is a wash on larger screens and widescreens. By putting the menu bar and the taskbar on different physical bars, it leaves more space for active applications in the taskbar while not forcing the menu bar to be compacted. Depending on how cluttered your complete taskbar may have been under Windows, this can buy you enough space to comfortably fit another couple of active applications, which may not be much but can make all the difference in some situations. The cost, however, is that you lose additional vertical real estate compared to having everything on one bar. Hiding the bars can get this space back, but it has been my experience that most people hate auto-hiding bars, which may very well be why no OS has them auto-hiding by default.

At first glance, the menu bar is just different enough from XP’s start menu to throw some people for a loop. The contents of the start menu have been broken up a bit: Applications is Windows’ All Programs, Places is My Recent Documents, and System is the Control Panel. Coming from Windows, the two biggest changes are that most applications are organized by functionality rather than each application getting its own subfolder in the Applications menu, and that what would be found in the Control Panel is now split between the Preferences and Administration submenus under System, based on whether it adjusts a per-user preference or a system preference (and hence would need administrative access).

Nautilus, the Ubuntu file manager, really drives home the idea that Ubuntu works like Windows. It takes the “file manager is a web browser” concept just as far as Windows ever did, which isn’t necessarily a good thing given how old (not to mention dead) the concept is, and it makes Nautilus feel a bit dated. Beyond that, there’s little to differentiate it from Windows XP’s Explorer.

Multitasking is also handled in a very XP-like fashion. Beyond the taskbar, alt-tab switches among applications just as it does on Windows (or cmd-tab on Mac OS X). Notably, Ubuntu has copied some of the more interesting quirks of both Windows Vista and Mac OS X. From Windows Vista it inherits the ability to see the contents of a window when alt-tabbing, and from Mac OS X it inherits the ability to close an inactive window without needing to focus on it, allowing you to keep focus on whatever you’re working on.

Ubuntu also has one more trick up its sleeve when it comes to multitasking, and that’s virtual desktops. Virtual desktops, or workspaces as they’re called in Ubuntu, allow for the creation of multiple workspaces in a single user session, such that different windows can be in different workspaces, completely hidden when that workspace is not active. It’s been a feature of various *nix operating systems for ages, and Apple added this feature as Spaces in 10.5 Leopard. Windows has no built-in equivalent.
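As an aside for the CLI-inclined, workspaces can even be driven from the command line with the wmctrl utility, which to my knowledge sits in the Ubuntu repositories rather than the default install:

    # Switch to the second workspace (numbering starts at 0)
    wmctrl -s 1
    # Send the currently active window to the third workspace
    wmctrl -r :ACTIVE: -t 2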

I’ve tried using this approach before as Spaces, and again on Ubuntu with workspaces, and I fully admit I don’t “get it.” The idea of being able to move a window completely out of your way by keeping it in another workspace makes sense, but in practice I find I have to go chase down a window I need when it’s off in another workspace. I know there are plenty of people out there who can make good use of workspaces, so it may well just be a personal flaw. It’s a neat concept, but I haven’t been able to make it work for me.

Moving on, one thing I find that Ubuntu does well is bridging the look of the OS with and without eye candy. Windows Vista does a very poor job of this, and it’s immediately obvious whether Aero is running or not. The style choices for Vista were clearly based on Aero, so if for any reason Aero is disabled, you get the 2D-only Vista Basic UI, which poorly compensates for the lack of transparency. Ubuntu, on the other hand, looks nearly identical in static screens; only the lack of subtle window shadows gives away when Ubuntu is running without visual effects (Ubuntu’s name for 3D-accelerated desktop compositing). Most people will never run Ubuntu with desktop compositing disabled, just as most people will never run Windows Vista with Aero disabled; nevertheless, this is one of those subtle design choices that impressed me.


An example of Ubuntu's hardware compositing. Hardware composited on the left, software on the right.

With desktop compositing enabled, the experience is similar to that of Windows Vista or Mac OS X. Windows fade out of view, shrink and grow, etc., just as they do in the other two. I feel like I should be writing more here, but there’s just not a lot to say; it’s the same desktop compositing everyone else has, including the UI tricks that serve to accelerate user interaction. The one thing that did catch my eye, however, is that Ubuntu includes a UI feature called Scale that is virtually identical to Mac OS X’s Exposé. As a self-proclaimed Exposé junkie I find this most welcome, as it’s my preferred way to multitask with a large number of windows. As a result, there have been a couple of times where I have found my workflow under Ubuntu to be smoother than under Vista, though Mac OS X still surpasses both.

However I’m much less enthusiastic about the icons Ubuntu uses, and there’s one element in particular that nearly drives me insane: executables/binaries don’t have icons at all. In Windows, executables can be packed with resources such as icons, and in Mac OS X, app bundles contain icon files that give the bundle its icon. On Ubuntu, however, executables don’t have their own icons. Ubuntu can assign a custom icon to anything, but this is apparently remembered by the file manager rather than actually attached to the file. By default, the only things with custom icons are the Launchers (a type of shortcut) that Ubuntu automatically creates for installed applications. Everything else is either issued a default icon for its type or, in the case of certain media types (e.g. images), given a thumbnail.
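Those Launchers, for the curious, are plain text files following the freedesktop.org .desktop format, and this is where the icon association actually lives – in the file, not the binary. A minimal example, with the paths being hypothetical:

    [Desktop Entry]
    Type=Application
    Name=Some Game
    Exec=/opt/somegame/run.sh
    Icon=/opt/somegame/icon.png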

In an ideal world this isn’t an issue, because everything is installed to the system and has its own Launcher somewhere in the menu bar, but with software that doesn’t directly install (such as programs distributed in compressed folders) this isn’t handled automatically. In place of an application-specific icon, executables get a generic executable icon, which, worse yet, is shared by more than just executables. As an example we have a screenshot of the folder for the demo of Penny Arcade Adventures: Episode 2. Can you figure out which item launches the game?

The right answer, the document-like item called RainSlickEp2 (which is actually a shell script), is completely non-obvious. If this were Windows or Mac OS X, there would be an appropriate custom icon over the right item. Meanwhile, not only are we lacking a custom icon, but the binary icon is used directly in 3 different places, and as an overlay on top of a document icon in a 4th place – and only 1 of those items is even an executable binary. And while I had hoped this was an issue with just this game, it extends to everything else; even Firefox’s actual executable lacks an icon. As it turns out, the Linux executable format, ELF, simply doesn’t have the ability to contain icons.
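Absent icons, the most reliable way to tell what anything actually is turns out to be the CLI’s file utility, which inspects a file’s contents rather than trusting its name – hardly the experience a GUI should require. The output will look something along these lines:

    $ file RainSlickEp2
    RainSlickEp2: Bourne shell script text executable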

I hate to harp on this issue, but I am absolutely dumbfounded by it. Usability goes straight down the tubes the moment you need to use non-packaged software because of this – and because the DEB package format is not a Linux-wide standard, there’s a lot of software like that. On a GUI, there need to be graphical elements to work with.

On the flip side, I find it interesting that Ubuntu has icons in certain places where Windows and Mac OS X do not. Action buttons such as “open” and “close” have icons embedded in them, while the other two OSs have always left these buttons sparser, containing just the text. The upshot is that with icons in your buttons, you don’t necessarily need to be able to read the text to use the OS, so long as you understand the meaning of the icons. It’s easily the most drastic difference between the Ubuntu and Windows/Mac OS X GUIs that I have noticed. At the same time, it’s so different that even after a year I still don’t know quite what to make of it – it often results in big, silly buttons where something smaller would do. The jury is still out on whether this is a good difference or not.

I would also like to touch on the directory structure of Ubuntu, as it falls under the nebulous umbrella of usability once you have to start traversing it. Because Linux is a spiritual successor to the ancient Unix systems of years past, it has kept the Unix directory structure. This is something I believe to be a poor idea.

I don’t believe I’ve ever seen a perfect directory structure on an operating system, but some are better than others. As an example, here’s a list of some of the more important Linux root directories: bin, boot, dev, etc, home, mnt, opt, sbin, usr, and var. And if this were Windows Vista: Boot, Program Files, ProgramData, Users, and Windows.

The problem I have with the Ubuntu directory structure is that the locations of very few things are obvious. Firefox, for example, is in /usr/lib/Firefox, while on Windows it would be in /Program Files/Firefox. Why /usr/lib/? I have no idea. There’s a logical reason for that placement, but there’s absolutely nothing intuitive about it. Microsoft is no saint here (how many things are crammed into /Windows and /Windows/System32?), but at least the location of user-installed programs is completely and utterly obvious: Program Files. And on Mac OS X it’s even easier: /Applications. This all adheres to a standard, the Filesystem Hierarchy Standard, but that just means the confusion itself has been standardized.
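For reference, here’s a rough cheat sheet of what those root directories are actually for:

    /bin   - essential command-line binaries
    /boot  - the kernel and bootloader files
    /dev   - device nodes representing hardware
    /etc   - system-wide configuration files
    /home  - user home directories and data
    /usr   - installed programs and their libraries
    /var   - variable data such as logs and caches

Logical enough once you know it, but none of it is anything a newcomer could guess.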

Thankfully, and to be fair, there’s little reason to go through the entire contents of the OS partition looking for something, but if you ever need to do so, it can be a frustrating experience. Ubuntu would benefit greatly from a more intuitive structure, something I’m convinced is possible given that Apple has pulled it off with Mac OS X, which keeps the *nix directory structure in Darwin but hides it as much as possible. I’d also like to see user data kept in /Users, as on Windows and Mac OS X, rather than /home, but Rome wasn’t built in a day… There is much room for improvement here.

Wrapping things up, when I first started with Ubuntu I did not have very high expectations as far as usability was concerned. I expected Ubuntu to be functional but not necessarily exceptional – GUI design is an ugly, hard job; just how good could it be on a free OS? For all the reasons I like Mac OS X, I can’t sing high praises about Ubuntu’s GUI or usability, but it surpassed my initial expectations. Other than the icon issue, there are no glaring flaws in Ubuntu’s GUI or its usability. It’s not a revolutionary or even evolutionary GUI, but it does come off as a very solid facsimile of Windows XP with a few unique quirks and the eye candy of Vista and Mac OS X thrown in, and that’s something I’m satisfied with. A satisfactory GUI is not a bad thing – given just how difficult GUI design is, it’s quite an accomplishment.

As an aside, I’m not a big fan of the default orange/brown color scheme for Hardy. It can be changed easily enough, although I’ve always thought they could do better for a default scheme. I hear 9.10 may finally do away with the orange, so we’ll see what we get in October.

Installation

In terms of difficulty, right up there with making a good GUI is making a good installer. History is riddled with bad OS installers, with pre-Vista Windows being the most well-known example. Text mode installers running on severely castrated operating systems reigned for far too long. Microsoft of course improved on this with Windows Vista in 2006, but even as late as the end of 2007 they were still releasing new operating systems, such as Windows Home Server, that used a partial text mode installer.

The reason I bring this up is that good OS installers are still a relatively recent development in the PC space, which is all the more reason I am very impressed with Ubuntu’s installer. It’s the opposite of the above, and more.

Right now Ubuntu is the underdog in a Windows-dominated world, and its installation & distribution strategies have accordingly been built around this. It’s undoubtedly a smart choice: if installing Ubuntu wiped out Windows the way installing Windows wipes out Ubuntu, it would be nigh impossible to get anyone to try it, since “try it out” and “make it so you can’t boot Windows” are mutually exclusive. Ubuntu plays its position very well in a few different ways.

First and foremost, the Ubuntu installation CD is not just an installer, but a live CD. It’s a fully bootable and usable copy of Ubuntu that runs off of the CD and does not require any kind of installation. The limitations of this are obvious, since you can’t install additional software and CD access times are more than an order of magnitude higher than those of a hard drive, but nevertheless it lets you give Ubuntu a cursory glance to see how it works without needing to install anything. Live CDs aren’t anything new for Linux as a whole, but it bears mentioning: it’s an excellent strategy for letting people try out the OS.

This also gives Ubuntu a backdoor into Windows users’ computers, because as a complete CD-bootable OS it can be used to recover trashed Windows installations when the Windows recovery agent can’t get the job done. It can read NTFS drives out of the box, allowing users to back up anything they can read to another drive, such as a USB flash drive. It also has a pretty good graphical partition editor, GParted, for when worst comes to worst and it’s time to start formatting. The Ubuntu live CD is not a complete recovery kit in and of itself (it can’t clean malware infections, for example, so it’s more of a tool of last resort), but it’s a tool that has a purpose and serves it well.
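In practice the rescue routine is only a couple of commands from a live CD session; mounting a Windows partition read-only for backup looks something like this, with the device name depending on your disk layout:

    # Create a mount point and mount the Windows partition read-only
    sudo mkdir /mnt/windows
    sudo mount -t ntfs-3g -o ro /dev/sda1 /mnt/windows

From there it’s a simple matter of copying files off to an attached USB drive.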

Better yet, once you decide that you want to try an installed version of Ubuntu but don’t want to take the plunge of messing with partitions, Ubuntu has a solution for that too. Wubi, the Windows-based Ubuntu Installer, allows you to install Ubuntu as a flat file on an existing NTFS partition. Ubuntu can then boot off of that flat file, having never touched a partition table or the master boot record (instead inserting an Ubuntu entry into the Windows BCD). This brings all the advantages of moving up from a live CD to an installed copy of Ubuntu, but without the system changes and absolute commitment a full install entails. Wubi installations are also easily removable, which further drives home the point.

Now the catch with a Wubi installation is that it’s meant to be a halfway house between a live CD and a full installation, and it’s not necessarily meant for full-time use. Because Ubuntu lives in a flat file inside of an NTFS partition, there are performance issues stemming from the lower performance of the NTFS-3G driver compared to raw hard drive access, along with both external fragmentation of the flat file and internal fragmentation inside of it. An unclean shutdown also runs a slight risk of introducing corruption into the flat file or the NTFS file system, something the Wubi documentation makes sure to point out. As such, Wubi is a great way to try out Ubuntu, but a poor way to continue using it.

Finally, once you’ve decided to go the full distance, there’s the complete Ubuntu installation procedure. As we’ve previously mentioned, the Ubuntu CD is a live CD, so installing Ubuntu first entails booting up the live CD – in our experience this is a bit slower than booting a pared-down, installation-only OS environment such as Vista’s Windows PE. It should be noted that although you can use GParted at this point to make space to install Ubuntu, this task is better left in the hands of Windows and its own partition-shrinking ability, due to one gotcha: Windows can move files around to make space where GParted cannot.

Once the installation procedure starts, it’s just 6 steps to install the OS, covering the language, time zone, keyboard layout, installation location, and the credentials for the initial account. Notably, the installation procedure advertises 7 steps, but I’ve only ever encountered 6; step 6 is always skipped. This puts it somewhere behind Mac OS X (which merely entails picking a partition and installing; credentials are handled later) and well ahead of Windows, since you don’t need a damn key.

The only thing about the Ubuntu installation procedure that ruffles my feathers is that it doesn’t do a very good job of simplifying the installation when you want to install to a new partition that isn’t the only empty one. This is an artifact of how Linux handles its swapfile – while Windows and Mac OS X create a file on the same partition as the OS, Linux keeps its swap on a separate partition. There are some good reasons for doing this, such as preventing fragmentation of the swapfile and always being able to place it after the OS (which puts it further out on the disk, for higher transfer rates), but the cost is ease of installation. Ubuntu’s easy installation modes are for when you want to install to a whole drive (wiping away its contents in the process) or to the largest empty chunk of unpartitioned space. Otherwise, you must play with GParted as part of the installation procedure.

This means that unless you’re installing to an entire disk or already have a single free chunk of space that is also the largest, the most efficient way to install Ubuntu is to play with partitions ahead of time so that the area you wish to install to is the largest free area. It’s a roundabout way to install Ubuntu, and it can be particularly inconvenient if you’re setting up a fresh computer and intend to do more than just dual boot.
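For reference, the end result of all of this partitioning is that a finished installation records the dedicated swap partition as its own entry in /etc/fstab, something along these lines (the device name is illustrative):

    # The swap partition gets its own fstab entry rather than a swapfile
    /dev/sda5    none    swap    sw    0    0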

Once all of the steps are completed, Ubuntu begins installing and is done in a few minutes. Upon completion, Ubuntu installs its bootloader of choice, GRUB, quickly searches for other OS installations (primarily Windows), and adds entries for them to the GRUB boot menu. When this is done, the customary reboot occurs, and when the system comes back up you’re faced with the GRUB boot menu – you’re ready to use Ubuntu. Ubuntu doesn’t treat its first boot as anything special, and there are no welcome or registration screens to deal with (I’m looking at you, Apple). It boots up, and you can begin using it immediately. It’s refreshing, to say the least.
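Under the hood, those menu entries live in GRUB’s configuration file – /boot/grub/menu.lst for the GRUB Legacy that ships with 8.04 – and the Windows entry it generates simply chainloads the Windows bootloader, roughly like so (the partition numbers will vary):

    title        Windows Vista
    rootnoverify (hd0,0)
    chainloader  +1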

The actual amount of time required to install Ubuntu is on the order of a few minutes, thanks in large part to its dainty size. Ubuntu comes on a completely filled CD, weighing in at 700MB, while Windows Vista comes on a DVD-5 at over 3GB and Mac OS X on a whopping DVD-9 at nearly 8GB. It’s fast to download (not that you can download Windows or Mac OS X) and fast to install.

We’ll get to the applications in depth in a bit, but I’d like to quickly touch on the default installation of Ubuntu. Inside that 700MB are not only the core components of the OS and a web browser, but the complete OpenOffice.org suite and the Evolution email client too. You can literally install Ubuntu and do most common tasks without ever needing to install anything else beyond security and application updates. Consider the amount of time it takes to install Microsoft Office on a Windows machine or a Mac, and that’s that much more time saved. Canonical is getting the most out of the 700MB a CD can hold.
