Getting started with Windows Server 2008

This past year I finally took the plunge and started learning Windows Server. At first I was extremely apprehensive, but the more I thought about it, the more I realized that a lot of companies use Windows Server successfully, and for good reasons. In addition, most environments run a mix of Windows, Linux and Solaris, so learning it would help me understand all the pieces of the environments I support.

Being the scientific person I am, I decided to do some research to see just how viable Windows Server was. To begin my experiments, I picked up a copy of Windows Server 2008 Inside Out. This was a fantastic book, and really laid the groundwork for how Windows, Active Directory and the various network services work. The book piqued my interest, my true geek came out, and I committed myself to learning more.

To go into more detail, I signed up for a number of Microsoft training courses. The first class I took was Maintaining a Microsoft SQL Server 2005 Database. This course helped out quite a bit, and provided some immediate value when I needed to do some work on my VMware VirtualCenter database (VMware VirtualCenter uses SQL Server as its back-end database).

After my SQL skills were honed by reading and hands-on practice, I signed up for Configuring and Troubleshooting a Windows Server 2008 Network Infrastructure and Configuring and Troubleshooting Windows Server 2008 Active Directory servers. These classes were pretty good, and went into a lot of detail on the inner workings of Active Directory, DNS, WINS, DHCP, DFS, role management and group policy, as well as a bunch of other items related to security and user management.

While the courses were useful, I wouldn’t have attended them if I had to pay for them out of my own pocket (my employer covered the cost of the classes). I’m pretty certain I could have learned just as much by reading Windows Server 2008 Inside Out, Active Directory: Designing, Deploying, and Running Active Directory and spending a bunch of time experimenting with each service (this is the best way to learn, right?).

That said, I’m planning to get Microsoft certified this year. That is probably the biggest reason to take the classes, since they lay out the material in a single location. I haven’t used any of the Microsoft certification books, but I suspect they would help you pass the tests without dropping a lot of loot on the courses. Once I take and pass all of the tests I’ll make sure to update this section with additional detail.

If you’ve been involved with Windows Server, I’d love to hear how you got started. I’m hoping to write about some of the Windows-related stuff I’ve been doing, especially the work related to getting my Linux hosts to participate in an Active Directory environment. Microsoft and Windows Server are here for the foreseeable future, so I’m planning to understand their pros and cons and use them where I see a good fit. There are things Solaris and Linux do better than Windows Server, and things Windows Server does better than Linux and Solaris. Embracing the right tool (even if it is a Microsoft product) for the job is the sign of a top-notch admin in my book. Getting myself to think that way took a LOT of work. ;)

A nice graphical interface for smartmontools

In my article “out SMART your hard drive,” I discussed smartmontools and the smartctl command line utility in detail. The article shows how to view SMART data on a hard drive, how to conduct self-tests, and how to configure smartmontools to generate alerts when a drive has failed or is about to fail. Recently I learned about GSmartControl, which is a graphical front-end to smartmontools. While I’ve only played with it a bit, it looks like a pretty solid piece of software! The project website has a number of screenshots, and you can download the source from there as well. Nice!
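As a refresher on the alerting side that GSmartControl sits in front of, smartd reads its monitoring rules from /etc/smartd.conf. A minimal sketch of an entry (the device name and e-mail address below are placeholders, not taken from any real configuration) might look like:

```shell
# /etc/smartd.conf fragment: monitor /dev/sda with the standard set of
# checks (-a) and e-mail a hypothetical address when smartd detects a
# problem (-m). Adjust the device and address for your environment.
/dev/sda -a -m admin@example.com
```

After editing the file, smartd needs to be restarted (or sent a HUP signal) to pick up the change.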

A simple and easy way to copy a file system between two Linux servers

During my tenure as a SysAdmin, I can’t recall how many times I’ve needed to duplicate the contents of a file system between systems. I’ve used a variety of solutions to do this, including array-based replication, database replication and tools such as rsync and tar piped to tar over SSH. When rsync and tar were the right tool, I often finished the work asking myself why there wasn’t a generic file system replication tool. Well, it appears there is. The cpdup utility provides an easy-to-use way to copy a file system from one system to another. In its most basic form you can call cpdup with a source and destination file system:

$ cpdup -C -vv -d -I /data

root@'s password: 
Handshaked with fedora2
Scanning /data ...
Scanning /data/conf ...
/data/conf/          copy-ok
/data/conf/smb.conf              copy-ok
/data/conf/named.conf            copy-ok
Scanning /data/www ...
Scanning /data/www/content ...
Scanning /data/www/cgi-bin ...
Scanning /data/dns ...
Scanning /data/lost+found ...
cpdup completed successfully
1955847 bytes source, 1955847 src bytes read, 0 tgt bytes read
1955847 bytes written (1.0X speedup)
3 source items, 8 items copied, 0 items linked, 0 things deleted
3.8 seconds  1007 Kbytes/sec synced   503 Kbytes/sec scanned

This will cause the entire contents of /data to be migrated over SSH to /data on the remote server. It appears cpdup picks up everything, including devices and special files, so this would be a great utility for cloning systems. There are also options to replace files that already exist on the remote end, remove files that are no longer on the source, and various others that can be used to customize the copy operation. Nifty utility, and definitely one I’ll be adding to my utility belt.
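For comparison, the “tar piped to tar” trick I mentioned above is still handy when cpdup isn’t installed. Here is a minimal local sketch (the paths and the remote hostname in the comment are made up for illustration):

```shell
#!/bin/sh
# Local sketch of replicating a directory tree with tar piped to tar.
# Over SSH the same idea becomes something like:
#   tar -cpf - -C /data . | ssh root@remotehost 'tar -xpf - -C /data'
# (remotehost is a placeholder). The local form below runs anywhere:
set -e
mkdir -p /tmp/src_demo/conf /tmp/dst_demo
echo "hello" > /tmp/src_demo/conf/smb.conf

# -c creates an archive, -p preserves permissions, -f - writes it to
# stdout; the second tar reads the stream from stdin and unpacks it
# relative to the destination directory given with -C.
tar -cpf - -C /tmp/src_demo . | tar -xpf - -C /tmp/dst_demo

ls -R /tmp/dst_demo
```

Unlike cpdup, this copies everything every time; it has no notion of only transferring files that changed, which is where rsync or cpdup earn their keep.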