Well, I am off on a much-needed vacation. For my American readers, this weekend represents our "4th of July" Independence Day holiday. What better way to celebrate than to drive hundreds of miles from one side of the country to the other? In this case, from the North side down to the South side.
I am armed with two books on this subject. The first is part of a series on American Road Trips, which details the roadside attractions to be found along the Great River Road. We will start up in Minnesota and work our way Southward, covering a total of eight states in eight days along the Mississippi River.
The second book is Alton Brown's "Feasting on Asphalt: The River Run". This book describes Alton's ride Northward up the Mississippi River, detailing the restaurants and foods he enjoyed, so I will have to read the chapters in reverse.
Special thanks to Roy Buol, mayor of Dubuque, Iowa, whom I [met in Scottsdale earlier this year], for the idea to come visit his fine city, considered one of the Smarter Cities in the USA, thanks to IBM technology.
I don't know if I will have internet access along the way, or have the time and/or energy to blog, tweet (@az990tony) or upload photos during the trip. We'll see.
technorati tags: IBM, Smarter Cities, Dubuque Iowa, Great River Road, Mississippi River
Congratulations to my colleague and close friend, Harley Puckett, who celebrated his 25th anniversary of service here at IBM. This is known internally as joining the "Quarter Century Club" or QCC. This is not just a figure of speech: the members of this club hold get-togethers and barbecues throughout the year.
Here is Harley welcoming Ken Hannigan and others he worked with back in Tivoli Storage Manager (TSM) software development.
Our manager, Bill Terry, presenting Harley with a plaque.
By the time I got to the cake, it was half gone!
technorati tags: IBM, TSM, Harley Puckett, QCC
Confused about what storage solutions you need? IBM now has a [Storage Evaluation Tool] that you can use to find out about IBM's latest products, solutions and offerings.
The tool is customized for different industries, job roles, and challenge areas. Give it a try!
technorati tags: IBM, Storage, Evaluation Tool
Continuing my saga for my [New Laptop], I have gotten all my programs operational, transferred and organized all my data, and am now ready for testing. You can read the previous posts in this series: [Day 1], [Day 2], [Day 3], [Day 4].
At this point, you might be thinking, "Testing? Just use your laptop already, deal with problems as you find them!" In my case, I need to sign off that the new laptop meets my needs, and then send back my previous laptop, wiped clean of all passwords and data. I have until the end of June to do this.
The value of testing is to avoid problems later, perhaps at an inconvenient time such as a business trip or client briefing. It is better to work out any issues while I am still in the office, connected to the internal IBM intranet on a high-speed wired connection. Also, I plan to do a Physical-to-Virtual (P-to-V) conversion of my Windows XP C: drive to run as a virtual guest OS on Linux, so I want to make sure the image is in working order before the conversion. That said, here is what my testing encountered.
- Of the 134 applications I had identified as being installed on my old laptop, I determined that I only needed about 70 of them. The others I did not bother to install on the new system.
- I had not thought about the "addons" and "plugins" that attach themselves inside browsers or other applications. I made sure that Flash, Shockwave and Java worked correctly on all three browsers: IE6, Firefox and Opera.
- One of my "plugins" is an application called [iSpring Pro], which plugs into Microsoft PowerPoint. I thought I had Microsoft Office installed, but found out the standard IBM build had only the viewers. I installed Microsoft Office 2003 Standard Edition with PowerPoint, Excel and Word. I then realized that I did not have the original v4.3 installation file for iSpring Pro, so I downloaded the latest v5 from their website. However, my license key is only for version 4, so a quick email got this resolved, and the nice folks at iSpring Solutions sent me the v4.3 installation file.
Shameless Plug: We use iSpring Pro to record our voices with PowerPoint slides to generate web videos for the [IBM Virtual Briefing Center] which we use to complement face-to-face briefings. This allows attendees to review introductory materials to prepare for their visit to Tucson, or to stay up-to-date on products and features in between annual visits. If you have not checked out the IBM Virtual Briefing Center, now is a good time to see what videos and other resources we have out there. You can even request to schedule a briefing in Tucson!
- Testing out iSpring Pro, I realized that there are no jacks for my headset. On my old ThinkPad T60, I had two jacks, one green for headphone and one pink for microphone. My headset has two cables, one for each, which I use for the recordings, as well as for online webinars and training sessions. Apparently, the ThinkPad T410 went with a single 3.5mm "combo" audio jack that handles both roles. Fortunately, there is a [Headset Buddy] adapter that merges the two cables from my headset into the combo jack on my new laptop. I ordered one, which will arrive some time next week.
- My new laptop doesn't fit my old docking station either. I had set the docking station aside while I had the two laptops latched together for the file transfers, but now that I am done with the old laptop, I discovered that my new T410 doesn't fit. I ordered a new one.
- Using find, grep, awk, sort and uniq, I was able to generate a list of all the file extensions in my Documents folder. I found old Lotus 1-2-3, Freelance Graphics, and Word Pro files. I thought Lotus Symphony would handle these, but it does not. I installed an old version of Lotus SmartSuite, which includes these programs, so that I can process these files.
- I also found pptx, docx and xlsx files in the extensions list, which represent the new Microsoft Office 2007 formats. I installed the Microsoft Office Compatibility Pack, which allows Office 2003 to read these files.
- Lastly, I installed a few programs that support a wide variety of file formats. VideoLAN's [VLC] plays a variety of audio and video files. [7-Zip] packs and unpacks a variety of archive files. (Note: Another program, BitZipper, also supports a variety of archive formats, but the install will corrupt your Firefox and IE browsers with new tool bars, change your search engine default, and install a lot of other unwanted software. Cleaning up the mess can be time-consuming. You have been warned!) I also installed [MadEdit], a binary/hex/text editor that will open any file to see what kind of format it has inside. From this, I was able to determine that some of my extension-less files were GIF, RTF or PDF format, and rename them accordingly.
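As an aside, the file-extension inventory I mention above is easy to reproduce. Here is a minimal sketch in Python rather than the find/grep/awk/sort/uniq pipeline I actually ran; the D:\Documents path is just a placeholder for wherever your files live.

```python
import os
from collections import Counter

def extension_inventory(root):
    """Walk a directory tree and count files by extension (lowercased)."""
    counts = Counter()
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            counts[ext or "(none)"] += 1  # flag extension-less files too
    return counts

if __name__ == "__main__":
    # Placeholder path; point this at your own Documents folder.
    for ext, n in extension_inventory(r"D:\Documents").most_common():
        print(f"{n:6d}  {ext}")
```

Sorting by count makes the surprises (sixteen media players' worth of formats, extension-less mystery files) jump right out.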
With the testing done, I am ready to go wipe my old system of all passwords and data!
technorati tags: IBM, ThinkPad, T60, T410, P-to-V, Flash, Shockwave, iSpring, PowerPoint, Headset Buddy, audio jack, Lotus, Symphony, SmartSuite, VideoLAN, VLC, 7-Zip, MadEdit
Continuing my saga for my [New Laptop], I have gotten all my programs operational, and now it is a good time to re-evaluate how I organize my data. You can read the previous posts in this series: [Day 1], [Day 2], [Day 3].
I started my career at IBM developing mainframe software. The naming convention was simple: you had 44-character dataset names (DSN), which could be divided into qualifiers separated by periods. Each qualifier could be up to 8 characters long. The first qualifier was called the "high level qualifier" (HLQ) and the last one was the "low level qualifier" (LLQ). Standard naming conventions helped with ownership and security (RACF), catalog management, policy-based management (DFSMS), and data format identification. For example:

PROD.PAYROLL.JCL
TEST.PAYROLL.JCL
PEARSON.TEST.JCL

In the first case, we see that the HLQ is "PROD" for production, the application is PAYROLL, and this file holds job control language (JCL). The LLQ often identified the file type. The second could be a version for testing a newer release of this application. The third represents user data, in which case my userid PEARSON would have my own written TEST JCL. I have seen successful naming conventions with 3, 4, 5 and even 6 qualifiers. The full dataset name remains the same, even if the file is moved from one disk to another, or migrated to tape.
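For fun, the naming rules described above can be sketched as a small validation function. This is my own illustration, not an IBM utility, and the character class only approximates the real MVS rules (qualifiers start with a letter or a "national" character @ # $).

```python
import re

# One qualifier: first character A-Z or national (@ # $), then up to 7
# alphanumeric/national/hyphen characters -- an approximation of MVS rules.
QUALIFIER = re.compile(r"^[A-Z@#$][A-Z0-9@#$-]{0,7}$")

def is_valid_dsn(dsn):
    """Check a dataset name against the naming rules described above."""
    if not 1 <= len(dsn) <= 44:      # 44-character total limit
        return False
    return all(QUALIFIER.match(q) for q in dsn.split("."))

# The HLQ is the first qualifier, the LLQ is the last.
def hlq(dsn):
    return dsn.split(".")[0]

def llq(dsn):
    return dsn.split(".")[-1]
```

So `is_valid_dsn("PROD.PAYROLL.JCL")` passes, while a nine-character qualifier or a 45-character name fails, which is exactly the kind of check RACF and DFSMS conventions leaned on.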
(We had to help one client who had all their files with single-qualifier names, no more than 8 characters long, all in the Master Catalog (root directory). They wanted to implement RACF and DFSMS, and needed help converting all of their file names and related JCL to a 4-qualifier naming convention. It took seven months to make this transformation, but the client was quite pleased with the end result.)
While the mainframe has a restrictive approach to naming files, the operating systems on personal computers provide practically unlimited choices. File systems like NTFS or ext3 support filenames up to 255 characters long, and pathnames of tens of thousands of characters. The problem is that when you move a file from one disk to another, or even from one directory structure to another, the pathname changes. If you rely on the pathname to provide critical information about the meaning or purpose of a file, that information can get lost when moving files around.
I found several websites that offered organization advice. On The Happiness Project blog, Gretchen Rubin [busts 11 myths] about organization. On Zenhabits blog, Leo Babauta offers [18 De-cluttering tips].
Peter Walsh's [Tip No. 185] suggests using nouns to describe each folder. Granted these are about physical objects in your home or office, but some of the concepts can apply to digital objects on your disk drive.
Other websites were specific to organizing digital files on your personal computer. On her Lifehacker blog, Gina Trapani shows her approach to [Organizing "My Documents"]. Chanel Wood offers her [How to organize your computer and still remember where you put everything], based on a simple alphabetic system. Microsoft offers [9 tips to organize files better]. Most of the advice was common sense, but this one, from Peter Walsh's [Tip No. 190], I found amusing:
"Use the computer’s sorting function. Put “AAA” (or a space) in front of the names of the most-used folders and “ZZZ” (or a bullet) in front of the least-used ones, so the former float to the top of an alphabetical list and the latter go to the bottom."
Personally, I hate spaces anywhere in directory and file names, and the thought of putting a space at the front of one to make it float to the top is even worse. Rather than resorting to naming folders with AAA or ZZZ, why not just limit the total number of files or directories so they are all visible on the screen? I often sort by date to access my most frequently-accessed or most-recently-updated files.
Of all the suggestions I found, Peter Walsh's "Use Nouns" seemed to be the most useful. Wikipedia has a fascinating article on [Biological Classification]. Certainly, if all living things can be put into classifications with only seven levels, we should not need more than seven levels of file system directory structure either! So, this is how I decided to organize my files on my new ThinkPad T410:
- C: Drive
Windows XP operating system programs and applications. I have structured this so that if I had to replace my hard disk entirely while traveling, I could get a new drive and restore just the operating system on this drive, and a few critical data files needed for the trip. I could then do a full recovery when I was back in the office. If I was hit with a virus that prevented Windows from booting up, I could re-install the Windows (or Linux) operating system without affecting any of my data.
- D: Drive
This will be for my most active data, files and databases. I have Windows "My Documents" pointing to the D:\Documents directory. Under Archives, I will keep files for events that have completed, projects that have finished, and presentations I used that year. If I ever run out of space on my disk drive, I would delete or move off these archives first. I have a single folder for all Downloads, from which I can move files to a more appropriate folder after I decide where to put them. My Office folder holds administrative items, like org charts, procedures, and so on.
As a consultant, many of my files relate to Events: these could be Briefings, Conferences, Meetings or Workshops. These are usually one to five days in duration, so here I can hold background materials for the clients involved, agendas, my notes on what transpired, and so on. I keep my Presentations separately, organized by topic. I am also involved with Projects that might span several months, or ongoing tasks and assignments. I also keep my Resources separately; these could be templates, training materials, marketing research, whitepapers, and analyst reports.
A few folders I keep outside of this structure on the D: drive. [Evernote] is an application that provides "folksonomy" tagging. This is great in that I can access it from my phone, my laptop, or my desktop at home. Install-files are all those ZIP and EXE files to install applications after a fresh Windows install. If I ever had to wipe clean my C: drive and re-install Windows, I would then have this folder on D: drive to upgrade my system. Finally, I keep my Lotus Notes database directory on my D: drive. Since these are databases (NSF) files accessed directly by Lotus Notes, I saw no reason to put them under the D:\Documents directory structure.
- E: Drive
This will be for my multimedia files. These don't change often, are mostly read-only, and could be restored quickly as needed.
I'll give this new re-organization a try. Since I have to take a fresh backup to Tivoli Storage Manager anyway, now is the best time to re-organize the directory structure and update my dsm.opt options file.
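For readers unfamiliar with the TSM backup-archive client, the kind of dsm.opt change I have in mind looks roughly like this. This is an illustrative sketch, not my actual configuration; the drive list and the exclude choice are made-up examples.

```
* dsm.opt -- illustrative fragment only
PASSWORDACCESS GENERATE

* Back up all three drives now that data lives on D: and E:
DOMAIN C: D: E:

* Skip re-downloadable files to save server space (my choice, not a rule)
EXCLUDE.DIR "D:\Documents\Downloads"
```

With the DOMAIN statement updated, an incremental backup will pick up the re-organized structure on its next pass.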
technorati tags: mainframe, DFSMS, HLQ, LLQ, DSN, naming convention, RACF, JCL, file system, de-clutter, organization, Peter Walsh, Windows, Linux, TSM
Continuing my saga for my [New Laptop], let's recap my progress so far:
- [Day 1 afternoon], I received the laptop from shipping on Wednesday, took a backup of the factory install image to an external USB drive, and re-partitioned to run both Windows and Linux operating systems.
- [Day 2], I spent Thursday using the "Migration Assistant" tool, and completed the operation sending the rest of my data over to the /dev/sda6 NTFS partition.
So now, Friday (day 3), I get to install any applications that were not part of the pre-installed image. Thankfully, I had planned ahead and figured out the 134 different applications that I had on my old system. I printed out a copy of my spreadsheet, and used it as a checklist to systematically go through the list. For each one, I determined one of the following:
- FACTORY pre-installed

If I could find the application already installed, either the same version or newer, or a functional equivalent, then I would mark it down as being part of the factory build. Of those programs pre-installed, I am quite pleased that the settings were carried over during yesterday's file transfer. For example, my bookmarks and bookmarklets on Firefox are all intact. However, the transfer did not carry forward all of my Firefox addons, so these I had to install separately.
- ISSI Download
IBM Standard Software Installer is our internal website for IBM and select third-party software for the different operating systems supported. Many of the ISSI programs were already included in the factory build, such as Lotus Notes, Lotus Symphony, Firefox browser, and so I had very few left remaining to do manually from ISSI.
- INSTALL from D:\Install-Files
As I mentioned in my previous post, I saved the installation ZIP or EXE files, as well as any license keys, URLs and other useful information needed to re-install each application.
- COPY over from D:\Prog-Files
Many programs don't have installation files, because they don't need to update the registry or create Desktop icons or Taskbar management buttons. For these I can just copy the directory over to C:\Program Files.
- WEB Download
In some cases, the Install-File was fairly downlevel, so I downloaded a fresh copy from the Web. In other cases, I forgot to save the ZIP or EXE, so this was the backup plan.
- DEFER for later install
I worked down the list alphabetically, but some programs needed other programs to be installed first, or I needed to find the license registry key, or whatever. This allowed me to focus on the most important programs first. Others I might defer indefinitely until I need them, such as programs to access Second Life, or to build software for Lego Mindstorms robots.
- SKIP those applications no longer required
Some programs just don't need to be on my new system. This includes software to manage printers I no longer have, drivers to attach to gadgets and devices I no longer own, and software that might have been specific to the old ThinkPad T60. This was also a good time to "de-duplicate" similar applications. For example, I have decided to limit myself to just three browsers: Firefox, Opera, and Internet Explorer (IE6).
The planning paid off. I was able to confirm or install all of my applications today and have a fully working Windows XP system partition. I celebrated by taking another backup.
technorati tags: IBM, Lotus, Notes, Symphony, Firefox, Opera, IE6, ThinkPad, T410, T60
Continuing my saga regarding my [New Laptop], I managed on [Wednesday afternoon] to prepare my machine with separate partitions for programs and data. I was hoping to wrap things up on day 2 (Thursday), but nothing went smoothly.
Just before leaving late Wednesday evening, I thought I would try running the "Migration Assistant" overnight by connecting the two laptops with a REGULAR Ethernet cable. The instructions indicated that in "most" cases, two laptops can be connected using a regular "patch cord" cable. These are the kind everyone has, connecting their laptop to the wall socket for a wired connection to the corporate intranet, or their personal computers to the LAN hubs at home. Unfortunately, the connection was not recognized, so I suspected that this was one of the exceptions not covered.
(There are two types of Ethernet cables. The ["patch cord"] connects computers to switches. The ["crossover" cable] connects like devices, such as computers to computers, or switches to switches. Four years ago, I used a crossover cable to transfer my files over, and assumed that I would need one this time as well.)
Thursday morning, I borrowed a crossover cable from a coworker. It was bright pink and only about 18 inches long, just enough to have the two laptops side by side. If the pink crossover cable were any shorter, the two laptops would be back to back. I kept the old workstation in the docking station, which allowed it to remain connected to my big flat screen, mouse and keyboard, and to use the docking station's RJ45 to connect to the corporate intranet. That left the RJ45 on the left side of the old system to connect via crossover cable to the new system. But that didn't work, of course, because the docking station overrides the side port, so we had to completely "undock" and go native laptop-to-laptop.
Restarting the Migration Assistant, I unplugged the corporate intranet cable from the old laptop and put one end of the pink cable into the Ethernet port of each laptop. On the new system, Migration Assistant asked me to set up a password and provided an IP address like 169.254.aa.bb with a netmask of 255.255.0.0, and I was supposed to type this IP address over on the old system for it to reach out and connect. It still didn't connect.
We tried a different pink crossover cable, no luck. My colleague Harley brought over his favorite "red" crossover cable, that he has used successfully many times, but still didn't work. The helpful diagnostic advice was to disable all firewall programs from one or both systems.
I disabled Symantec Client Firewall on both systems. Still not working. I even tried booting both systems in "safe" mode, using MSCONFIG to set the reboot mode to "safe with networking". Still not working. At this point, I was afraid that I would have to use the alternate approach, which was to connect both systems to our corporate 100 Mbps network, which would be painfully slow. I only have one active LAN cable in my office, so the second computer would have to sit outside in the lobby.
Looking at the IP address on the old system, it was 9.11.xx.yy, assigned by our corporate DHCP, so not even in the same subnet as the new computer. So, I created profiles in ThinkVantage Access Connections on both systems, with 192.168.0.yy netmask 255.255.255.0 on the old system and 192.168.0.bb on the new system. This worked, and a connection between the two systems was finally recognized.
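The subnet mismatch above is easy to check with Python's ipaddress module. The specific addresses here are placeholders standing in for the masked octets in my story:

```python
import ipaddress

# The Migration Assistant self-assigned a link-local (169.254/16) address on
# the new laptop, while corporate DHCP had given the old laptop a 9.x address.
new_laptop = ipaddress.ip_interface("169.254.10.20/16")  # placeholder for 169.254.aa.bb
old_laptop = ipaddress.ip_interface("9.11.30.40/16")     # placeholder for 9.11.xx.yy

# Different networks, so neither side could reach the other directly.
print(new_laptop.network)                   # 169.254.0.0/16
print(new_laptop.ip in old_laptop.network)  # False

# The fix: put both ends on the same /24, as I did with Access Connections.
old_fix = ipaddress.ip_interface("192.168.0.1/24")  # placeholder for 192.168.0.yy
new_fix = ipaddress.ip_interface("192.168.0.2/24")  # placeholder for 192.168.0.bb
print(old_fix.network == new_fix.network)   # True
```

Once both interfaces sit in the same network, each side can find the other without any routing in between, which is all a crossover cable gives you.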
Since I had 23GB of system files and programs on my old C: drive, and 80GB of data on my old D: drive, I didn't think I would run out of space on my new 40GB C: drive and 245GB D: drive, but the Migration Assistant disagreed! It wanted to put my D:\Documents on my new C: drive, and refused to continue. I had to remove D:\Documents from the list so that it could proceed, processing only the programs and system settings on the C: drive. It took 61 minutes to scan 23GB on my C: drive and identify 12,900 files to move, representing 794MB of data. Seriously? Less than 1GB of data moved!
It then scanned all of the programs I had on my old system, and decided that there were none that needed to be moved or installed on the new system. The closing instructions explained there might be a few programs that need to be manually installed, and some data that needed to be transferred manually.
Given the performance of Migration Assistant, I decided to just set up a direct network mapping of the new D: drive as Y: on my old system, and drag and drop my entire folder over. Even at 1000 Mbps, this still took the rest of the day. I also backed up C:\Program Files using [System Rescue CD] to my external USB drive, and restored it as D:\Prog-Files, just in case. In retrospect, I realize it would have been faster to dump my D: drive to my USB drive and restore it on the new system.
I'll leave the process of re-installing missing programs for Friday.
technorati tags: Ethernet, Patchcord, Crossover, firewall, SysRescCD, USB
Continuing my rant from Monday's post [Time for a New Laptop], I got my new laptop Wednesday afternoon. I was hoping the transition would be quick, but that was not the case. Here were my initial steps prior to connecting my two laptops together for the big file transfer:
- Document what my old workstation has
Back in 2007, I wrote a blog post on how to [Separate Programs from Data]. I have since added a Linux partition for dual-boot on my ThinkPad T60.
|/dev/sda1|26GB|NTFS|C:|Windows XP SP3 operating system and programs|
|/dev/sda2|12GB|ext3|/(root)|Red Hat Enterprise Linux 5.4|
|/dev/sda6|80GB|NTFS|D:|My Documents and other data|
I also created a spreadsheet of all my tools, utilities and applications. I combined and deduplicated the list from the following sources:
- Control Panel -> Add/Remove programs
- C:\Program Files
- Start -> Programs panels
- Program taskbar at bottom of screen
- D:\Install-Files directory

The last one was critical. Over the years, I have gotten into the habit of saving those self-installing ZIP or EXE files into a separate directory, D:\Install-Files, so that if I had to uninstall an application, due to conflicts or compatibility issues, I could re-install it without having to download it again.
So, I have a total of 134 applications, which I have put into the following rough categories:
- AV - editing and manipulating audio, video or graphics
- Files - backup, copy or manipulate disks, files and file systems
- Browser - Internet Explorer, Firefox, Opera and Google Chrome
- Communications - Lotus Notes and Lotus Sametime
- Connect - programs to connect to different Web and Wi-Fi services
- Demo - programs I demonstrate to clients at briefings
- Drivers - attach or sync to external devices, cell phones, PDAs
- Games - not much here, the basic solitaire, minesweeper and pinball
- Help Desk - programs to diagnose, test and gather system information
- Projects - special projects like Second Life or Lego Mindstorms
- Lookup - programs to lookup information, like American Airlines TravelDesk
- Meeting - I have FIVE different webinar conferencing tools
- Office - presentations, spreadsheets and documents
- Platform - Java, Adobe Air and other application runtime environments
- Player - do I really need SIXTEEN different audio/video players?
- Printer - print drivers and printer management software
- Scanners - programs that scan for viruses, malware and adware
- Tools - calculators, configurators, sizing tools, and estimators
- Uploaders - programs to upload photos or files to various Web services
- Backup my new workstation
My new ThinkPad T410 has a dual-core i5 64-bit Intel processor, so I burned a 64-bit version of [Clonezilla LiveCD] and booted the new system with that. The new system has the following configuration:
|/dev/sda1|320GB|NTFS|C:|Windows XP SP3 operating system, programs and data|
There were only 14.4GB of data, so it took just 10 minutes to back up to an external USB disk. I ran it twice: first using the option to dump the entire disk, and second to dump only the selected partition. The results were roughly the same.
- Run Workstation Setup Wizard
The Workstation Setup Wizard asks for all the pertinent information (location, time zone, userid/password) needed to complete the installation.
- Re-Partition Disk Drive
I burned a 64-bit version of [System Rescue CD] and ran [Gparted] to re-partition this disk into the following:
|/dev/sda1|40GB|NTFS|C:|Windows XP SP3 operating system and programs|
|/dev/sda2|15GB|ext3|/(root)|Ubuntu Desktop 10.04 LTS|
|/dev/sda6|245GB|NTFS|D:|My Documents and other data|
- Redefine Windows directory structure
I made two small changes to connect C: to D: drive.
- Changed "My Documents" to point to D:\Documents, which moves the files over from C: to D: to their new target location. See the [Microsoft procedure] for details.
- Edited C:\notes\notes.ini to point to D:\notes\data to store all the local replicas of my email and databases.
- Install Ubuntu Desktop 10.04 LTS
My plan is to run Windows and Linux guests through virtualization. I decided to try out Ubuntu Desktop 10.04 LTS, affectionately known as Lucid Lynx, which can support a variety of different virtualization tools, including KVM, VirtualBox-OSE and Xen. I have two identical 15GB partitions (sda2 and sda3) that I can use to hold two different systems, or one can be a subdirectory of the other. For now, I'll leave sda3 empty.
- Take another backup of my new workstation
I took a fresh backup of all three partitions (sda1, sda2, sda6) with Clonezilla.
The next step involved a cross-over Ethernet cable, which I don't have. So that will have to wait until Thursday morning.
technorati tags: IBM, Lenovo, ThinkPad, T60, T410, Intel, Clonezilla, SysRescCD, Gparted, Windows, Ubuntu, Linux, Lucid, LTS
Well, it's Tuesday, and you all know what that means... IBM announcements!
This week, IBM announced the IBM Tivoli Storage Productivity Center for Disk Midrange Edition, affectionately referred to as "MRE". This is basically TPC for Disk but with two key differences:
- A special license that covers only DS3000, DS4000, DS5000 series, whether natively attached or virtualized behind SAN Volume Controller.
- A new pricing model based on the number of controllers and drawers, rather than on TB managed. For example, if you have a DS5300 and two expansion drawers, then you pay for three units of MRE. As you upgrade from smaller-capacity disks to larger-capacity disks, your license costs won't increase. This eliminates the quarterly hassle of "truing up" your software licenses to match actual capacity, which TB-based licensing requires.
This includes the [new DS3500 model] that was announced last month. This was part of the set of solutions to [help midsized businesses].
A fresh new blogger on the scene, Anthony Vandewerdt (IBM), covers [10 things I like about the IBM DS3500], saving me the trouble.
For more information on Tivoli Storage Productivity Center for Disk Midrange Edition, see the IBM [Announcement Letter].
technorati tags: IBM, Tivoli, Productivity Center, Midrange+Edition, MRE, TPC, DS3000, DS3500, DS4000, DS5000, Anthony Vandewerdt
My how time flies. This week marks my 24th anniversary working here at IBM. This would have escaped me completely, had I not gotten an email reminding me that it was time to get a new laptop. IBM manages these on a four-year depreciation schedule, and I received my current laptop back in June 2006, on my 20th anniversary.
When I first started at IBM, I was a developer on DFHSM for the MVS operating system, now called DFSMShsm on the z/OS operating system. We all had 3270 [dumb terminals], large cathode ray tubes affectionately known as "green screens", and all of our files were stored centrally on the mainframe. When Personal Computers (PC) were first deployed, I was assigned the job of recommending who got them when. We were getting 120 machines, in five batches of 24 systems each, spaced out over the next two years. I was concerned that everyone would want to be part of the first batch, so I put out a survey, asking questions about how familiar people were with personal computers, whether they owned one at home, whether they were familiar with DOS or OS/2, and so on.
It was actually my last question that helped make the decision process easy:
How soon do you want a Personal Computer to replace your existing 3270 terminal?
- 1-60 days
- 61-120 days
- 121-180 days
- 181-240 days
- As late as possible
I had five options, and roughly 24 respondents checked each one, making my job extremely easy. Ironically, once the early adopters of the first batch discovered that these PCs could be used for more than just 3270 terminal emulation, many of the others wanted theirs sooner.
Back then, IBM employees resisted any form of change. Many took their new PC, configured it to be a full-screen 3270 emulation screen, and continued to work much as they had before. My mentor, Jerry Pence, would print out his emails and file them into hanging file folders in his desk credenza. He did not trust saving them on the mainframe, so he was certainly not going to trust storing them on his new PC. One employee used his PC as a doorstop, claiming he would continue to use his 3270 terminal until they took it away from him.
Moving forward to 2006, I was one of the first in my building to get a ThinkPad T60. It was so new that many of the accessories were not yet available. It had Windows XP on a single-core 32-bit processor, 1GB of RAM, and a huge 80GB disk drive. The built-in 1GbE Ethernet went unused for a while, as we still had a 16 Mbps Token Ring network.
I was the marketing strategist for IBM System Storage back then, and needed all this excess power and capacity to handle all my graphic-intense applications, like GIMP and Second Life.
Over the past four years, I made a few slight improvements. I partitioned the hard drive to dual-boot between Windows and Linux, and created a separate partition for my data that could be accessed from either OS. I increased the memory to 2GB and replaced the disk with a 120GB drive.
A few years ago, IBM surprised us by deciding to support Windows, Linux and Mac OS computers. Actually, it made a lot of sense. IBM's world-renowned global services manages help-desk support for over 500 other companies in addition to the 400,000 employees within IBM, so they already knew how to handle these other operating systems. Now we can choose whichever we feel makes us more productive. Happy employees are more productive, of course. IBM's vision is that almost everything you need to do would be supported on all three OS platforms:
- Lotus Notes
Access your email, calendar, to-do list and corporate databases via Lotus Notes on Windows, Linux or Mac OS. Corporate databases store our confidential data centrally, so we don't have to keep it on our local systems. We can make local replicas of specific databases for offline access, and these are encrypted on our local hard drives for added protection. Emails can link directly to specific entries in a database, so huge attachments don't slow down email traffic. IBM also offers LotusLive, a public cloud offering for companies that want to get out of managing their own Lotus Domino email repositories.
- Lotus Symphony
Create presentations, documents and spreadsheets on Windows, Linux or Mac OS. Lotus Symphony is based on the open source OpenOffice code and is compatible with Microsoft Office, which allows us to open and update files directly in Microsoft's PPT, DOC and XLS formats.
- Firefox Browser
Many of our corporate applications have now been converted to be browser-accessible, and the Firefox browser is available on Windows, Linux and Mac OS. This is a huge step forward, in my opinion, as we often had to download applications just to do the simplest things, like submitting a time-sheet or a travel expense reimbursement. I manage my blog, Facebook and Twitter entirely from web-based applications.
The irony here is that the world is switching back to thin clients, with data stored centrally, and the popularity of Web 2.0 has helped this along. People are using Google Docs or Microsoft Office Online to avoid having to store anything locally on their machines. This vision positions IBM employees well for emerging cloud-based offerings.
Sadly, we are not yet completely free of Windows. Some of our Lotus Notes databases use Windows-only APIs to access our Siebel databases. I have encountered PowerPoint presentations and Excel spreadsheets that just don't render correctly in Lotus Symphony. And finally, some of our web-based applications work only in Internet Explorer! Worse, the corporate standard is the outdated IE6, which by itself is reason enough to switch to the Firefox, Chrome or Opera browsers. I have to put special tags on my blog posts to suppress YouTube and other embedded objects that IE6 doesn't support.
So this leaves me with two options: get a Mac and run Windows on the side as a guest operating system, or get a ThinkPad and run Windows or Windows/Linux. I've opted for the latter, and put in my order for a ThinkPad T410 with a dual-core 64-bit Intel i5 processor (VT-capable, to provide hardware assistance for virtualization), 4GB of RAM, and a huge 320GB drive. It will come with Windows XP installed as one big C: drive, so it will be up to me to re-partition it for a Windows/Linux dual-boot and/or run Windows and Linux as guest operating systems.
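For what it's worth, here is a rough sketch of how I might carve up the new 320GB drive. The partition names and sizes are hypothetical, just to sanity-check that a dual-boot layout with a shared data partition actually fits:

```python
# Hypothetical dual-boot layout for a 320GB drive; sizes in GB.
DRIVE_GB = 320

layout = {
    "Windows XP (NTFS)":   80,
    "Linux root (ext3)":   40,
    "Linux swap":           4,
    "Shared data (FAT32)": 196,  # readable and writable from both OSes
}

used = sum(layout.values())
assert used <= DRIVE_GB, "partition plan exceeds drive capacity"

for name, size in layout.items():
    print(f"{name:<20} {size:>4} GB")
print(f"{'free':<20} {DRIVE_GB - used:>4} GB")
```

FAT32 is chosen for the shared partition only because both operating systems can read and write it out of the box; the trade-off is its 4GB per-file limit.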
(Full disclosure to make the FTC happy: This is not an endorsement for Microsoft or against Apple products. I have an Apple Mac Mini at home, as well as Windows and Linux machines. IBM and Apple have a business relationship, and IBM manufactures technology inside some of Apple's products. I own shares of Apple stock, I have friends and family that work for Microsoft that occasionally send me Microsoft-logo items, and I work for IBM.)
I have until the end of June to receive my new laptop, re-partition it, re-install all my programs, reconfigure all my settings, and transfer over my data, so that I can send my old ThinkPad T60 back. IBM will probably refurbish it and send it off to a deserving child in Africa.
If you have an old PC or laptop, please consider donating it to a child, school or charity in your area. To help out a deserving child in Africa or elsewhere, consider contributing to the [One Laptop Per Child] organization.
technorati tags: , Anniversary, DFHSM, MVS, DFSMShsm, z/OS, dumb terminals, cathode ray tube, personal computer, DOS, OS/2, ThinkPad, cloud computing, Web20, Windows, Linux, MacOS, Apple, Microsoft, OLPC
The BP oil spill in the Gulf of Mexico is a good reminder that all organizations should practice, and be prepared to execute, their contingency plans. In this most recent case, the [Deepwater Horizon] oil platform suffered an explosion on April 20, resulting in oil spewing out at an estimated 19,000 barrels per day. While some bloggers have argued that BP failed to plan, and therefore planned to fail, I find that hard to believe. How could a billion-dollar multinational company not have contingency plans?
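To put that flow rate in perspective, here is a quick back-of-the-envelope calculation. The 19,000 barrels/day figure is the estimate cited above; the June 1 checkpoint date is arbitrary, just for illustration:

```python
from datetime import date

BARRELS_PER_DAY = 19000      # estimated flow rate cited above
GALLONS_PER_BARREL = 42      # one US oil barrel = 42 US gallons

# Days from the April 20 explosion to an arbitrary checkpoint (June 1)
days = (date(2010, 6, 1) - date(2010, 4, 20)).days

total_barrels = days * BARRELS_PER_DAY
total_gallons = total_barrels * GALLONS_PER_BARREL

print(days)           # 42 days
print(total_barrels)  # 798000 barrels
print(total_gallons)  # 33516000 gallons
```

That is nearly 800,000 barrels in just the first six weeks, at the estimated rate.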
The truth is, BP did have plans. Karen Dalton Beninato of New Orleans' City Voices discusses BP's Gulf of Mexico Regional Oil Spill Response Plan (OSRP) in her article [BP's Spill Plan: What they knew and when they knew it]. A [redacted 90-page version of the OSRP] is available on their website.
The plan indicates that it can take 30 days from the time a deep offshore leak starts until the oil reaches the shoreline, giving OSRP participants plenty of time to take action.
(Having former politicians [blame environmentalists] for this crisis does not help much either. At least deep offshore rigs give you 30 days to react to a leak before the oil reaches the shoreline; rigs closer to shore would only shorten that reaction time. And allowing onshore oil rigs does not mean oil companies would discontinue their deep offshore operations. There are thousands of oil rigs in the Gulf of Mexico. While extracting oil in the beautiful Arctic National Wildlife Refuge [ANWR] might be safer, it does not eliminate the threat entirely, and any leak there would be just as damaging to the local plants and animals.)
So perhaps the current crisis was not the result of a lack of planning, but of inadequate practice and execution. The same is true for IT Business Continuity / Disaster Recovery (BC/DR) plans. In all cases, there are four critical parts: planning, mitigation, practice, and execution.
- Planning
The planning team needs to anticipate every possible incident, determine the risks involved and the likelihood of impact, and either accept each risk or decide to mitigate it. This can include natural disasters (hurricanes, fires, floods) and technical issues (computer viruses, power outages, network disruptions).
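As an illustration of that first step, a simple risk register might score each incident by likelihood and impact, and flag anything above a threshold for mitigation. The incidents, scores, and threshold below are all hypothetical:

```python
# Hypothetical risk register: (incident, likelihood 1-5, impact 1-5)
risks = [
    ("hurricane",          2, 5),
    ("flood",              2, 4),
    ("computer virus",     4, 3),
    ("power outage",       4, 4),
    ("network disruption", 3, 3),
]

MITIGATE_THRESHOLD = 10  # scores at or above this get mitigated

for incident, likelihood, impact in risks:
    score = likelihood * impact
    action = "mitigate" if score >= MITIGATE_THRESHOLD else "accept"
    print(f"{incident:<20} score={score:>2} -> {action}")
```

The point is not the particular numbers, but that every anticipated incident gets an explicit accept-or-mitigate decision on record.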
- Mitigation
Mitigation can involve taking backups, keeping replicated copies at a remote location, creating bootable media, training the appropriate employees, and maintaining written, documented procedures. IBM's Unified Recovery Management approach can protect your entire IT operation, from the laptops of mobile employees, to remote office/branch office (ROBO) locations, to regional and central data centers.
- Practice
When was the last time you practiced your Business Continuity / Disaster Recovery plan? I have seen this done at a variety of levels. At the lowest level, it is all done on paper, in a conference room, with all participants talking through their respective actions. These are often called "walk-throughs". At the highest level, you turn off power to your data center --on a holiday weekend, to minimize the impact on operating revenues-- and have the team bring up applications at the alternate site.
As many as 80 percent of these BC/DR exercises are considered failures, in that, had a real disaster actually occurred, the participants are convinced they would not have met their Recovery Time Objective (RTO) targets. However, they are not complete failures if they help improve the plans, identify new incidents that were not previously considered, and train the participants in recovery procedures.
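Measuring a drill against RTO targets can be as simple as comparing actual recovery times with the objectives. Here is a hypothetical scorecard; the applications and times are made up for illustration:

```python
# Hypothetical drill results: app -> (actual recovery hours, RTO target hours)
results = {
    "email":     (6, 4),
    "payroll":   (10, 24),
    "web store": (30, 8),
    "reporting": (20, 48),
}

# Any application that took longer than its target missed its RTO
failures = {app for app, (actual, target) in results.items() if actual > target}

for app, (actual, target) in results.items():
    status = "MISSED" if app in failures else "met"
    print(f"{app:<10} recovered in {actual:>2}h (RTO {target:>2}h) -> {status}")

print(f"{len(failures)} of {len(results)} applications missed their RTO")
```

Even a drill with misses like these is valuable, provided each miss feeds back into an updated plan.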
- Execution
The last part is execution. In my career, I have been onsite for many Disaster Recovery exercises, as well as after real disasters have occurred. It no longer surprises me how many people assume that if they have plans in place, have made preparations, and run one to three practice drills per year, the actual execution will follow naturally. While the book [Execution] by Bossidy and Charan is not focused on IT BC/DR plans per se, it is a great read on how to manage the actual execution of any kind of business plan. I have read this book and recommend it.
If you have not tested your IT department's BC/DR plans recently, perhaps it's time to dust off your copy, review it, and schedule some time for practice.
technorati tags: , Gulf of Mexico, Deepwater Horizon, oil spill, OSRP, Business Continuity, Disaster Recovery
Well, it feels like Tuesday and you know what that means... "IBM Announcement Day!" Actually, today is Wednesday, but since Monday was the Memorial Day holiday here in the USA, my week is day-shifted. Yesterday, IBM announced its latest IBM FlashCopy Manager v2.2 release. Fellow blogger Del Hoobler (IBM) has also posted something on this over at the [Tivoli Storage Blog].
IBM FlashCopy Manager replaces two previous products: Tivoli Storage Manager for Copy Services and Tivoli Storage Manager for Advanced Copy Services. To say people were confused between these two would be an understatement: the first was for Windows, and the second was for UNIX and Linux operating systems. The solution? A new product that replaces both and supports Windows, UNIX and Linux! Thus, IBM FlashCopy Manager was born. I introduced this product back in 2009 in my post [New DS8700 and other announcements].
IBM Tivoli Storage FlashCopy Manager provides what most people with "N series SnapManager envy" are looking for: application-aware point-in-time copies. This product takes advantage of the underlying point-in-time interfaces available on various disk storage systems:
- FlashCopy on the DS8000 and SAN Volume Controller (SVC)
- Snapshot on the XIV storage system
- The Microsoft Volume Shadow Copy Service (VSS) interface on the DS3000, DS4000, DS5000 and non-IBM gear that supports this Windows protocol
For Windows, IBM FlashCopy Manager can coordinate the backup of Microsoft Exchange and SQL Server. The new version 2.2 adds support for Exchange 2010 and SQL Server 2008 R2, including the ability to recover an individual mailbox or mail item from an Exchange backup. The data can be recovered directly to an Exchange server, or to a PST file.
For UNIX and Linux, IBM FlashCopy Manager can coordinate the backup of DB2, SAP and Oracle databases. Version 2.2 adds support for specific Linux and Solaris operating systems, and provides a new database cloning capability. Basically, database cloning restores a database under a new name, with all the appropriate changes to allow its use for other purposes, like development, test or training. A new "fcmcli" command line interface allows IBM FlashCopy Manager to be used with custom applications or file systems.
A common misperception is that IBM FlashCopy Manager requires IBM Tivoli Storage Manager backup software to function. That is not true. You have two options:
- Stand-alone Mode
In stand-alone mode, it's just you, the application, IBM FlashCopy Manager and your disk system. IBM FlashCopy Manager coordinates the point-in-time copies, maintains the correct number of versions, and allows you to back up and restore directly disk-to-disk.
- Unified Recovery Management with Tivoli Storage Manager
Of course, the risk of relying only on point-in-time copies is that, in most cases, they reside on the same disk system as the original data (the exception being virtual disks from the SAN Volume Controller). IBM FlashCopy Manager can be combined with IBM Tivoli Storage Manager so that the point-in-time copies are also copied off to a local or remote TSM server; if the disk system containing both the source and the point-in-time copies fails, you still have the TSM backup copy. In this approach, you can restore from the point-in-time copies or from the TSM backups.
IBM FlashCopy Manager is an excellent platform for connecting application-aware functionality with hardware-based copy services.
technorati tags: IBM, Announcements, FlashCopy, FlashCopy+Manager, Microsoft, Windows, VSS, UNIX, AIX, Solaris, Linux, TSM, Exchange, SQL+Server, SAP, DB2, Oracle