I recently got a new laptop from work. It's a refurbished Dell Latitude E6330 with an Intel Core i5 processor, a 13″ screen, and a 120GB SSD that came with Windows 7 Pro. I haven't used Windows regularly in quite some time (I've been using a WinXP VM on the rare occasion I need to do something in Windows), but now that I've spent a while with Windows 7 I can say it's really a very good OS. That said, I missed my Linux tools, and quickly grew frustrated every time I hit a task that I knew I could do more easily in Linux, or that required yet another proprietary and expensive application. Trust me, you can quickly get used to having access to a vast repository of software that can be installed in seconds, safely and for free.
Since this is a machine owned by my company (which paid good money for it, Windows license included), I can't in good conscience just blow away Windows and replace it with Linux. The SSD, while incredibly fast, isn't really big enough to carve up into partitions for another OS. Besides, the main reason I have the machine is to run some software which requires Windows. That said, there are times when I'd like to use the machine for personal tasks as well, and I'd rather not have my personal information saved on the work machine. This got me thinking about how I could boot the machine from a "persistent" USB drive. Persistent in this sense means that the drive would not only boot the OS (a bootable live CD or USB drive does that), but also store personal settings (wifi settings, themes, installed programs, and so on) and personal files, so that the next time I boot up, it's all there.
I've used various flavors of Linux over the years, first starting with Red Hat, then moving to Mandrake, but for the last few years it's mainly been Ubuntu serving as my OS of choice. This is probably just because it's so undeniably easy to get running, but lately the tide of my love affair with Ubuntu has started to ebb. First it was Canonical's introduction of the Unity interface, which I never really warmed up to; then their choice to integrate Amazon offerings into the desktop search, which can be disabled but still rankles me on principle; and recently I've been put off by what seems to be a general decline in the stability of the OS. I've been hearing a lot about Arch lately and thought it was time I checked that out anyway.
Possibly because of my use of Ubuntu I had gotten fairly comfortable using the Gnome desktop environment, and part of my dissatisfaction with Ubuntu actually has to do with flaws in Gnome. Random crashes always seemed to lead back to gnome-shell or some other Gnome component. Linus Torvalds once penned a tirade on G+ back in 2012 about how incredibly difficult it had become to change even the simplest desktop preferences in Gnome, and that post has remained in the back of my mind. A while ago I had dabbled with Xfce4 as a replacement but wasn't very happy with that either. It was time for a drastic change, and the little bit of reading I had done suggested that KDE deserved a fresh look.
Before I settled on Arch I tried out a bunch of Linux distributions. In the old days I would have had to burn the downloaded ISO image to a CD (which always took longer than I wanted to wait), but newer machines like this Dell have the ability to boot from USB, so I got good at creating a bootable USB drive fairly quickly. There are a bunch of programs which make this task trivial; I'll try to remember to add links to these later. My experience with trying new distros generally follows the same pattern: boot up, figure out how hard it is going to be to get the wifi going, evaluate the interface. If I can't get the wifi up (and quickly), the distro has very little chance of getting played with at all. Since I don't work at a desk, but usually in a comfy chair or on the couch where there's no Ethernet handy, working wifi is a must. Ethernet generally works out of the box in most distros, but wifi can be hit and miss. This Dell uses an Intel Centrino 6205 wireless chip which seems to be well supported, so getting wifi functional was more a problem of "how to do it" than "does it work?".
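If you already have a Linux box handy, you don't even need a special program; plain old dd does the job (the ISO filename and target device below are placeholders, and getting of= wrong will destroy data, so check with lsblk first):

```
# dd bs=4M if=archlinux.iso of=/dev/sdX && sync
```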
Vanilla Arch Linux isn't a "noob"-friendly distro by any measure. Booting the latest ISO dumped me at a command line: no graphical interface, and not very many programs installed by default, just the most basic Linux tools. Arch is a "base" distribution, which provides a solid foundation upon which to build a custom Linux installation. By following the installation guide on the Arch wiki, getting past each hurdle was a breeze. Arch did make me work to get what I wanted, but in this way I was sort of forced to learn the Arch "way", and my install ended up being a custom distro with just the stuff I wanted. Arch wasn't as intimidating to me as it would have been years ago; a beginner who has never seen any of this before has no way of knowing what they want to end up with.
A bootable, persistent USB drive can be created in a lot of ways, but here's what I did, as best I can remember now. This is NOT a how-to guide; these are just my notes. For a new Arch install, follow the official Arch Installation Guide and the rest of the Arch wiki when you bump into things that don't work or don't know how to do something.
Just to reduce the confusion factor of booting USB and installing to USB at the same time, I burned a CD (I keep a few rewritables that I use for this purpose so I don't end up with a bunch of coasters, since Linux distros are updated frequently) and booted up with a spare USB flash drive plugged in. (edit: it's only later that I realized that the Arch live CD has both x86_64 (64-bit) and i686 (32-bit) flavors on it. I initially installed x86_64, which is why I couldn't boot the drive on a 32-bit machine later.)
In addition to creating a bootable USB installation, I also wanted to be able to plug the USB drive into a Windows machine and use it. Partly this is for the utility of transferring files back and forth between machines, but also because if somebody were to plug the stick in, I wanted it to work and not give a Windows user the message that the drive was unformatted. It would just be too easy to blow away all my hard work. I was able to partition the drive easily enough using fdisk. I created a 2GB FAT32-formatted partition for Windows, and designated the remaining 14GB as ext4 for Linux. (edit: the second time I set up Arch on another USB drive I ran into all kinds of problems getting Windows to see the Windows partition as "initialized". I had used fdisk to partition the drive and mkfs to format it. I ended up having to delete the small partition in Windows and create a new partition from there, formatting it as FAT32. This had the side effect of bonking my partition table. Arch would still load, but with an initial error. I fixed this by running through once with testdisk. I'll also note here that both drives were created with MBR, not GPT, partitioning; fdisk reports it as "dos" type. When I tried to use GPT I could never get Windows to see the volumes.)
I didn't create a swap or boot partition, because I read that using swap might wear out the drive faster than otherwise, and why would I need a separate boot partition if I only intend to boot one OS? I'm sure there's a good reason; I just can't think of one.
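For what it's worth, the same layout can be scripted with sfdisk (fdisk's scriptable sibling) and rehearsed on a throwaway image file before touching the real device. The sizes here are scaled way down, and on the real stick you'd substitute the actual device and follow up with mkfs.fat -F 32 on the first partition and mkfs.ext4 on the second:

```shell
# Rehearse the MBR ("dos") layout on a scratch image file first;
# substitute the real device (double-checked with lsblk!) when ready.
# Layout: a small FAT32 (type c) partition for Windows, then the rest
# as Linux (type 83) with the bootable flag (*) set on it.
truncate -s 64M usb.img
sfdisk usb.img <<'EOF'
label: dos
,32MiB,c
,,83,*
EOF
sfdisk -d usb.img
```

The last command dumps the resulting partition table so you can sanity-check it before repeating the exercise on the real drive.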
I used the lsblk command to list the partitions:
NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
sda 8:0 0 119.2G 0 disk
|-sda1 8:1 0 105.6G 0 part
`-sda2 8:2 0 13.7G 0 part
sdb 8:16 1 15G 0 disk
|-sdb1 8:17 1 1.9G 0 part /run/media/user/USBDRIVE
`-sdb2 8:18 1 13.2G 0 part /
sr0 11:0 1 1024M 0 rom
I mounted the /dev/sdb2 (Linux) partition on /mnt/arch and used pacstrap /mnt/arch base to install Arch to the partition. I used fdisk to toggle the bootable flag on it. Of course, this isn't enough to boot, which I found out by attempting to do so. It turned out that I needed to install GRUB to the drive. To do this you can't be using the partition, so I booted to the live CD again, then re-created a /mnt/arch directory and mounted the USB installation to it with:
# mkdir /mnt/arch
# mount /dev/sdb2 /mnt/arch
Then I used the handy arch-chroot /mnt/arch command, which basically binds several important system mounts from the USB installation into the live CD environment and chroots in. This put me at a different command line prompt, and I found I had to install GRUB with pacman -S grub before I could go ahead and run grub-install /dev/sdb against the USB drive, followed by grub-mkconfig -o /boot/grub/grub.cfg
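Put together, the whole bootloader dance from the live CD looks something like this (assuming the stick's Linux partition is still showing up as /dev/sdb2):

```
# mkdir /mnt/arch
# mount /dev/sdb2 /mnt/arch
# arch-chroot /mnt/arch
(chroot) # pacman -S grub
(chroot) # grub-install /dev/sdb
(chroot) # grub-mkconfig -o /boot/grub/grub.cfg
```

Note that grub-install targets the whole drive (/dev/sdb), not the partition.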
I ran lsblk often, because rebooting into Windows (before I figured out how to get wifi working) to read up on how to proceed seemed to have the effect of shifting these devices around, such that at one point the SSD would be sda and at another it would appear as sdb. Since installing to the wrong partition would be disastrous (from Windows' perspective, anyway), I was careful to keep tabs on this.
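A safer way to keep the drives straight is to identify them by something that doesn't shift around, like the model and serial number encoded in the /dev/disk/by-id names (the exact output will of course differ per machine):

```
# ls -l /dev/disk/by-id/ | grep -i usb
# lsblk -o NAME,MODEL,SIZE,TRAN
```

The TRAN column in the second command shows the transport, so a usb entry is unmistakably the stick and a sata entry the internal SSD.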
Getting wifi working was critical to most of the above (you can't install using pacstrap if you're not on the internet), both from the live CD and afterwards in the bootable USB as explained above. Luckily, Arch has made this incredibly simple with the wifi-menu script, which saw my wifi and connected to it without a problem from the command line. Trying to connect to an access point with a hidden SSID was a different matter. I never could get wifi-menu to successfully connect to a hidden network, but I did figure out how to do so manually:
In /etc/netctl/examples I found a file called wpa_supplicant_static which provides the info you need to create a network connection using wpa_supplicant (I needed to install that package to connect to my WPA2 network, by the way) with a static IP (192.168.10.191). I copied that out to the main /etc/netctl folder, renamed it to the (hidden) SSID MyWifi, then modified the details. The description line gives away the file's purpose:
Description='A simple WPA encrypted wireless connection using a static IP'
There's also a line in there to select whether the SSID is hidden:
# Uncomment this if your ssid is hidden
which has to be uncommented. Then I was able to start a wireless connection using:
netctl start MyWifi
To have the wifi connect automatically on boot:
netctl enable MyWifi
which just creates the symlink for the systemd service.
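Fleshed out, my profile ended up looking something like this; the interface name, passphrase, gateway, and DNS below are placeholders for my actual setup, and the Hidden line has been uncommented:

```
Description='A simple WPA encrypted wireless connection using a static IP'
Interface=wlp2s0
Connection=wireless
Security=wpa
ESSID='MyWifi'
Key='wireless-passphrase-here'
# Uncomment this if your ssid is hidden
Hidden=yes
IP=static
Address=('192.168.10.191/24')
Gateway='192.168.10.1'
DNS=('192.168.10.1')
```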
I installed xorg-server, xf86-video-intel (this machine has Intel HD4000 graphics), and xorg-xinit (which creates a file called .xinitrc in /etc/skel that I copied to my home directory and modified) to get X running. Also the synaptics touchpad package (I forget the exact name now). Of course, X without a desktop environment like Gnome or KDE is not very satisfying. Before I decided to go with KDE I did actually install Gnome, but quickly got frustrated with it, since the experience of a bare-bones Gnome 3 is even worse than Ubuntu's more fleshed-out install of Gnome, which wasn't doing it for me already. I wasn't able to rip out everything with pacman -Runs gnome, but most of it went away, and I can still run some basic Gnome apps if needed. Getting KDE up and going was easier, and the configurability of the desktop is much more robust. KDE looks great by comparison, and I've been using Gnome for years!
I set up screen edges so that moving the mouse to the upper left corner triggers Desktop Grid, which is a lot like Exposé on the Mac and something I'd gotten used to from Gnome 3 (I had stopped using Unity last year). At first Desktop Grid wasn't available as a screen edge trigger because Desktop Effects needed to be enabled (although the checkbox says you can enable it anytime with a key combo, doing so doesn't make the effects available to screen edges).
The only other issue is something I haven't had to deal with in a long time. Since I basically gave up on Windows a long time ago and have only run it since in a VM, I had forgotten entirely about the system clock problem that is created when you dual boot Linux with Windows. The problem stems from the fact that Windows interprets the system clock as being in local time, whereas Linux treats the system clock as being in UTC (GMT) and displays the time to the user as an offset from UTC (for example, New York is UTC - 5 hours). Either method works on its own, but if you dual boot, each OS can get confused. Time differences like this can also make it impossible to access some secure websites. The best fix for this is to change a registry setting in Windows to force it to accept the computer's hardware clock as UTC time.
For Win7 (not XP), you can create a .reg file with the following contents and import it into the registry for the fix:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation]
"RealTimeIsUniversal"=dword:00000001
Just as a test I tried to boot my new USB drive on my old MacBook, but it won't boot. I'm not sure why, but I'm also not really interested in figuring that out since the Mac is already running Linux. It also wouldn't work on the old Thinkpad, which is likely because that's a 32-bit machine. It worked great on the Intel NUC, though that's already running Linux too. This drive was only the proof of concept, though. I'd like to get a SanDisk Cruzer Fit (one of those really small USB drives that look like a wireless mouse adapter) eventually, so there's less chance of physically breaking the thing when it's plugged into the laptop. (edit: I picked one of these up and went through the whole process again using the i686 architecture.)
Sure, Arch isn't as fast as it would be running directly off the SSD, but it's still pretty darn fast. Now there's only a lot of desktop-preference tweaking and application installation left to do, but so far Arch and KDE have really impressed me.
I used Kate to write up these notes.
update 140219: An important consideration with a persistent bootable OS on a flash drive is that the files on the drive are not encrypted. While you might not have anything to hide, it's likely that you'd rather be a bit more careful with your passwords, email, financial info, or other personal information. While there are a lot of ways to deal with this, I chose to go the simple route and used ecryptfs to encrypt a private directory in my home folder. Anything I put in that folder is available to me when I've booted up and logged into the OS, but when the drive is offline, anyone attempting to read the disk cannot access anything I've put inside that folder.
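The setup itself is just a few commands (this sketch assumes the ecryptfs-utils package and the ecryptfs kernel module are available):

```
$ sudo pacman -S ecryptfs-utils
$ sudo modprobe ecryptfs
$ ecryptfs-setup-private
$ ecryptfs-mount-private
  ... anything saved under ~/Private is now encrypted on disk ...
$ ecryptfs-umount-private
```

ecryptfs-setup-private creates the ~/Private mountpoint along with the encrypted ~/.Private backing directory, and the mount/umount commands unlock and lock it per session.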