This blog is for the open exchange of ideas relating to IBM Systems, storage and storage networking hardware, software and services.
(Short URL for this blog: ibm.co/Pearson )
Tony Pearson is a Master Inventor, Senior IT Architect and Event Content Manager for [IBM Systems Technical University] events. With over 30 years at IBM, Tony is a frequent traveler, speaking to clients at events throughout the world.
Lloyd Dean is an IBM Senior Certified Executive IT Architect in Infrastructure Architecture. Lloyd has held numerous senior technical roles during his 19-plus years at IBM. Most recently, he has been leading efforts across the Communication/CSI market as a senior Storage Solution Architect/CTS covering the Kansas City territory. In prior years, Lloyd supported industry accounts as a Storage Solution Architect, and before that as a Storage Software Solutions specialist in the ATS organization.
Lloyd currently supports North America storage sales teams in his Storage Software Solution Architecture SME role on the Washington Systems Center team. His current focus is IBM Cloud Private; he will be delivering and supporting sessions at Think 2019 and Storage Technical University on the value of IBM storage in this high-value solution, a part of the IBM Cloud strategy. Lloyd maintains Subject Matter Expert status across the IBM Spectrum Storage software solutions. You can follow Lloyd on Twitter (@ldean0558) and on LinkedIn (Lloyd Dean).
Tony Pearson's books are available on Lulu.com! Order your copies today!
Safe Harbor Statement: The information on IBM products is intended to outline IBM's general product direction and it should not be relied on in making a purchasing decision. The information on the new products is for informational purposes only and may not be incorporated into any contract. The information on IBM products is not a commitment, promise, or legal obligation to deliver any material, code, or functionality. The development, release, and timing of any features or functionality described for IBM products remains at IBM's sole discretion.
Tony Pearson is an active participant in local, regional, and industry-specific interests, and does not receive any special payments to mention them on this blog.
Tony Pearson receives part of the revenue proceeds from sales of books he has authored listed in the side panel.
Tony Pearson is not a medical doctor, and this blog does not reference any IBM product or service that is intended for use in the diagnosis, treatment, cure, prevention or monitoring of a disease or medical condition, unless otherwise specified on individual posts.
The developerWorks Connections platform will be sunset on December 31, 2019. On January 1, 2020, this blog will no longer be available. More details available on our FAQ.
Well, it's that time of year again. While every corporate blogger waits for their employer to release last year's earnings report, we are forced to find other things to write about that comply with [corporate "black-out" rules].
"Insanity is doing the same thing over and over again but expecting different results." -- Albert Einstein
In addition to being a technical consultant for IBM, I am also a certified yoga instructor with formal training. Back in 2004, I co-founded the Tucson Laughter Club, based on [Hasya yoga], a form of yoga that incorporates breathing, stretching and laughter exercises. The two jobs are actually similar, in which I am standing in front of a group of people, telling them what to do and how to do it.
January is the month when gyms and yoga classes are filled with new students who have made New Year's Resolutions. Invariably, I am asked "What should I do to lose weight, get fit, and sleep better?"
(Note: I am neither a medical doctor nor registered dietician. I can share with you ideas that have worked for me (or my yoga students) that might help you achieve your goals. I strongly suggest you read books and consult with medical experts as necessary.)
I always tell them the same answer. But first, I make them promise they won't share the secret with anyone, and that I will whisper it in their ear. After I get their nod of agreement, I whisper "Eat Less and Exercise More."
I get the same quizzical look every time. The response is typically "That's your big secret? Everyone knows that!" If that's true, why are nearly a third of all Americans obese, out-of-shape, and/or sleep-deprived? The answer is the knowing-doing gap.
While the book [The Knowing-Doing Gap], by Jeffrey Pfeffer and Robert Sutton, is focused on why businesses fail to achieve their goals, I think many of the principles apply to individuals trying to reach their health goals:
Understand "Why" before "How". People are quick to follow processes and procedures, rather than understanding the underlying biology, chemistry, or physiology.
Knowing comes from doing and teaching others. Learning is best done by trying a lot of things, learning from what works and what does not, thinking about what was learned, and trying again.
Actions speak louder than words, thoughts, and elegant plans. Without taking some action, learning is more difficult and less efficient because it is not grounded in real experience. When I was in Japan, one of the employees told me their boss was NATO, which stood for "No Action, Talk Only!"
There is no doing without mistakes. In building a culture of action, one of the most critical elements is how you treat yourself when things go wrong. Even well-planned actions can go wrong. All learning involves some failure, something from which one can continue to learn.
Fear fosters knowing-doing gaps, so drive out fear. Do you fear making mistakes? Do you fear success? Do you fear people will make fun of you for trying something outside your comfort zone? Drive out that fear!
Measure what matters and what can help turn knowledge into action. Peter Drucker is often quoted as saying "If you can't measure it, you can't manage it!" The trick is to figure out which measurements lead to corrective actions.
If you have problems keeping any of your New Year's Resolutions, try to figure out why. Is it because you didn't know what to do? Or, more likely, you know what you needed to do, but didn't do it? Feel free to enter your comments below!
(What does this have to do with Storage? When IBM got back into networking in a big way, they had to decide whether to combine it with one of the existing groups, or form its own group. IBM decided to merge networking with storage, which makes sense since the primary purpose of most networks is to access or transmit information stored somewhere else.)
Last April, the Wharton School and the Institute for the Future convened a one-day [After Broadband] workshop in San Francisco, California, that brought together a group of leading technologists, entrepreneurs, academics and policymakers to explore the future of broadband over the next decade.
Continuing this week's theme about the future, fellow blogger, published author, and futurist David Houle is coming out with a new book this month titled [Entering the Shift Age]. This is a follow-on to his book, [The Shift Age].
Since this book cites IBM studies explicitly, his PR department asked me to review it. If you are an aspiring author who has a book you want me to review, and it relates to the topics my blog covers, like Cloud, Big Data, storage, and the explosion of information, feel free to send me a copy!
(FTC Disclosure: I work for IBM. I was not paid by anyone to mention this book on my blog. I was provided an "Uncorrected Advanced Copy" of this book at no cost to me for this review. I do not know David Houle personally, have not read any of his prior works, nor have I ever seen him speak at public events. This post is neither a paid nor celebrity endorsement of this author, his book, nor any other books by this author.)
First, let's get a few details out of the way:
Title: Entering the Shift Age, 284 pages
Author: David Houle, futurist
Genre: Non-fiction, trends and predictions
Publisher: Sourcebooks, Inc.
Publish date: January 2013
As I mentioned in my post [Historians vs. Futurists], there is only one past, but there are many potential futures. There seems to be as many futurists out there as there are potential futures. I suspect not everyone will agree with all that David has written. However, this reminds me of one of my favorite quotes:
"When two futurists always agree, one is no longer necessary." -- old Italian adage
In his book, David asks a series of thought-provoking questions, then answers them with his views and opinions on how the future will roll out:
Is humanity now entering a new age that is different than the Information age?
If so, what should we call it?
Which forces are driving this new age?
How will this impact various aspects and institutions of society?
David feels humanity is indeed entering a new age, which he calls the Shift Age. This is driven by three forces: the shift to globalization of culture and politics, the flow of power and influence to individuals, and the acceleration of electronic connectedness.
In a sense, David is like a hunter-gatherer from the Stone age, hunting down trends and gathering ideas from others. In much the same way my compost brings renewed purpose to the rinds and pits of my fruits and vegetables, David's book does a good job paraphrasing the works of many of today's leading futurists.
David predicts that the decade we are now in, the 2010s, will mark the end of the Information age, serving as a transition period to this new era that will lead to transformations in government, education, health, technology, and energy.
Over the past two weeks, I had time to enjoy a variety of movies, several of whose stories wrapped around key moments of transition:
"Gone with the Wind", as well as the new offering "Lincoln" from Steven Spielberg. Both are set in the 1860's, the time of the [American Civil War], pitting the Industrial-age forces of the North, against the Agricultural-age economy of the South. This time saw the transition from slavery to freedom.
"Doctor Zhivago", set in the time of World War I, on the German-Russian front, as well as the Russian Revolution of 1917, and the resulting Civil War between the Red Guard and the White Army. This saw the transition from a Russian government ruled by Czars, to one ruled by the people through Communism.
"Lawrence of Arabia", also set in the time of World War I, but south in Arabia. T. E. Lawrence was able to bring several warring Arab tribes together to defeat the Turks, and was a key figure in the transition to an Arab National Council.
Some might call these completely unexpected [Black Swan] events, while others might feel they are merely fortunate (or misfortunate) sequences of events that led to inevitable social change. Has something happened, or will something happen later this decade, that will drive us to leave the Information Age?
David's previous book, The Shift Age, was published back in 2007, and a lot has happened in the past six years: a global financial melt-down recession; the Arab Spring uprisings in the Middle East; Barack Obama was elected and re-elected; man-made climate change in the form of hurricanes, tsunamis and superstorms hit various parts of the world; brush fires lit up Australia, and BP's Deepwater Horizon oil rig exploded off the Gulf coast, just to name a few.
David's new book reflects the impact of these recent events, from discussions on his [Evolutionshift] blog, to Q&A sessions he has after his public speaking presentations. For those who are not interested in the wide array of topics he covers in this one book, David also offers [a dozen different mini-eBooks] that cover specific topics like [Technology, Energy and Health].
My Rating: Moist and Flaky
Who should read this book: If you are a time-traveler from 1975 who came to this decade to learn all about what your future has in store, but can only select one book to read before you zoom back to your own time period, this would be the book I recommend.
I do not want to imply this is a quick read, or one that you can't put down once you start reading it. Just like you should not gulp down a full bottle of cheap Vodka in one sitting, this book should be read over a series of days, as I did, so that you can mull over in your mind the different points and thoughts he is trying to convey.
Today is the last day of 2012, so it is only fitting to end the year looking forward to the future!
While I have been accused of being a historian, I consider myself a bit of a futurist. Since 2006, I have been blogging about the future of technology, including Cloud, Big Data, and the explosion of information. As a consultant for the IBM Executive Briefing Center, I present to clients IBM's future plans, strategies, and product roadmaps.
(Fellow blogger Mark Twomey on his Storagezilla blog has a humorous post titled [Stuff your Predictions], expressing his disdain for articles this time of year that predict what the next 12 months will bring. Don't worry, this is not one of those posts!)
What exactly is a futurist? Biologists study biology. Technologists study technology. But a person can't simply time-travel to the future, read the newspaper, make observations, take notes, and then go back in time to share his findings.
Here are the key differences between historians and futurists:
Historians: There is only one past. Futurists: There are many possible futures.
Historians: Only about six percent of all the people who have ever lived are alive today, so historians must study history through the writings, tools, and remains of those who have passed on. Futurists: They study the past and the present, looking for patterns and trends.
Historians: They search for insight. Futurists: They search for foresight.
Historians: They build a framework to explain what happened and why. Futurists: They build a framework to express what is possible, probable, and perhaps even preferable.
A common framework for both is the concept of the various "Ages" that humanity has been through:
Around 200,000 years ago, in the middle of what archaeologists refer to as the [Paleolithic Era], man walked upright and used tools made of stone to hunt and gather food. Humans were nomadic and travelled in tribes to follow the herds of animals as they migrated season to season. The History Channel had a great eight-hour series called [Mankind: The Story of All of Us] that started here, and worked all the way up to modern times.
About 10,000 years ago, humans got tired of chasing after their meals, and settled down, growing their food instead. Grains like wheat, rice, and corn became staples of most diets around the world. Civilization evolved, and people traded what they grew or made in exchange for items they needed or wanted.
About 300 years ago, humans developed machines to help do things, and even to help build other machines. While farmers harnessed oxen to plow fields, and horses to speed up travel and communication, these were all based on muscle power.
Machines like the steam engine were powered by coal, petroleum, or natural gas. Today, one gallon of gasoline can do the work of 600 man-hours of human muscle power, or [move a ton of freight 400 miles].
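That 600 man-hours figure can be sanity-checked with back-of-the-envelope arithmetic. The energy and human-output numbers below are my own rough assumptions, not figures from the linked source:

```shell
# Rough check of "one gallon of gasoline ~ 600 man-hours of muscle power".
# Assumed figures: ~120,000 BTU per gallon of gasoline, ~0.293 Wh per BTU,
# and ~60 W of sustained human muscle output.
wh_per_gallon=$(( 120000 * 293 / 1000 ))   # about 35,160 Wh per gallon
man_hours=$(( wh_per_gallon / 60 ))        # about 586 hours -- roughly 600
echo "$man_hours man-hours per gallon"
```

So the claim is at least in the right ballpark, which is exactly the kind of "understand why, not just how" check the earlier post argued for.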
Cities grew up with skyscrapers of steel, connected by trains, planes and automobiles. Communications with the telegraph, telephone, radio and television replaced sending message on horseback.
The forces that drove humanity to the Industrial age clashed with the culture and identity established during the Agricultural age. I highly recommend futurist Thomas Friedman's book [The Lexus and the Olive Tree] that covers these conflicts.
When exactly did the Information age begin? Did it start with Gutenberg's muscle-powered [Printing Press] in the year 1450, or the first punched card in 1725?
Futurist [Alvin Toffler] published his book The Third Wave in 1980. He coined the phrase "Third Wave" to describe the transition from the Industrial age to the Information age.
While IBM mainframes were processing information in the 1950's, many people associate the Information Age with the IBM Personal Computer (1981) or the World Wide Web (1991). Over 100 years ago, IBM started out in the Industrial age, with business machines like meat scales and cheese slicers. IBM led the charge into the Information Age, and continues that leadership today.
In any case, value went from atoms to bits. Computers and mobile devices transfer bits of data, information and ideas, from nearly anyplace on the planet to another, in seconds.
Ideas and content are now king, rather than the land, buildings, machines and raw materials of the Industrial age. In 1975, less than 20 percent of a business's assets were intangible. By 2005, over 80 percent were.
While the Industrial age was dominated by left-brain thinking, the Information Age requires the creativity of right-brain thinking. I highly recommend Daniel Pink's book, [A Whole New Mind] that covers this in detail.
"The future is already here -- it's just not very evenly distributed!" -- William Gibson (1993)
The problem with looking back through history as a series of "Ages" is that they really didn't start and end on specific days. The Agricultural age didn't end on a particular Sunday evening, with the Industrial age starting up the following Monday morning.
There are still people on the planet today in the Stone age. On my last visit to Kenya, I met a nomadic tribe that still lives this way. Huts were temporarily constructed from sticks and mud, and abandoned when it was time to move on.
A short-sighted charity built a one-room school house for them, hoping to convince the tribe that staying in one place for education was more important than hunting and gathering food in a nomadic lifestyle. Some stayed and starved.
In the United States, about 2 percent of Americans grow food for the rest of us, with enough left over to make ethanol and give food aid to other countries.
Sadly, the Standard American Diet continues to be foods mostly processed from wheat, rice and corn, even though our human genetic make-up has not yet evolved from a "Paleolithic" mix of [meats, nuts and berries].
There are still people on the planet today in the Industrial age. American schools are still geared to prepare children for Industrial-age jobs, yet they still take "summer vacation", a holdover from when children worked the fields during the Agricultural age. Seth Godin's book [Stop Stealing Dreams] is a great read on what we should do about this.
Wrapping up my series on a [Laptop for Grandma], I finally have something that I think meets all of my requirements! Special thanks to Guidomar and the rest of my readers who sent in suggestions!
I could have called this series "The Good, the Bad, and the Ugly". The [Cloud-oriented choices] weren't bad per se, but expected persistent Internet connection. The [Low-RAM choices] were not ugly per se, but had limited application options. The ones below were good, in that they helped me decide what would be just right for grandma.
Linux Mint 9
One of my readers, Guidomar, suggested Linux Mint Xfce. At LinuxFest Northwest 2012, Bryan Lunduke indicated that [Linux Mint] is the fastest growing Linux in popularity. You can watch his 43-minute presentation of [Why Linux Sucks!] on YouTube.
The latest version is Mint 14, but that has grown so big it has to be installed on a DVD, as it will no longer fit on a 700MB CD-ROM. Since I don't have a DVD drive on this Thinkpad R31, I dropped down to the latest Gnome edition that did fit on a LiveCD, which was Mint 9.
(In retrospect, I could have used the [PLoP Boot Manager CD], and installed the latest Linux Mint 14 from a USB memory stick! My concern was that if a distribution didn't fit on a CD-ROM, it was expecting a more modern computer overall, and thus would probably require more than 384MB of RAM as well.)
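For anyone trying that route, the sequence looks something like the sketch below. The ISO file name, the checksum placeholder, and /dev/sdX are all placeholders, not real values; dd will overwrite whatever device it is pointed at, so confirm the target with lsblk first.

```shell
# Sketch: verify a downloaded ISO, then write it to a USB stick.
# File name, checksum, and /dev/sdX are placeholders -- substitute the
# real values, since dd silently overwrites the target device.
ISO=linuxmint-14-dvd-32bit.iso
echo "<published-sha256-checksum>  $ISO" | sha256sum -c -   # abort if this fails
sudo dd if="$ISO" of=/dev/sdX bs=4M conv=fsync
sync
```

Verifying the checksum first is the important habit: a half-downloaded ISO written to USB tends to fail in confusing ways halfway through the install.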
Linux Mint is actually a variant of Ubuntu, which means that it can tap into the thousands of applications already available. Mint 9 is based on Ubuntu 10.04 LTS.
One of the nice features of Linux Mint is that there are versions with full [Codecs] installed. A codec is a coder/decoder software routine that can convert a digital data stream or signal, such as for audio or video data. Many formats are proprietary, so codecs are generally not open source, and often not included in most Linux distros. They can be installed manually by the Linux administrator. Windows and Mac OS are commercially sold and don't have this problem, as Microsoft and Apple take care of all the licensing issues behind the scenes.
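On an Ubuntu-based distribution, the manual codec install is typically a single meta-package. The package names below are the commonly used ones, but check your release's repositories before relying on them:

```shell
# Pull in the commonly packaged MP3/video codecs on an Ubuntu base.
sudo apt-get update
sudo apt-get install ubuntu-restricted-extras
# On Linux Mint, the equivalent meta-package is mint-meta-codecs.
```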
The installation went smoothly. It would have gladly set up a dual-boot with Windows for me, but instead I opted to wipe the disk clean and install fresh for each Linux distribution I tried.
Running it was a different matter. The screen would go black and crash. There just wasn't enough memory.
Since [Peppermint OS] was partially based on Lubuntu, I thought I would give [Lubuntu 12.04] a try. The difference is that Peppermint OS is based on Xfce (as is Xubuntu), but Lubuntu claims to have a smaller memory footprint using Lightweight X11 Desktop Environment (LXDE). This version claims to run in 384MB, which is what I have on grandma's Thinkpad R31.
There are two installers. The main installer requires more than 512MB to run, so I used the alternate text-based Installer-only CD, which needs only 192MB.
The LXDE GUI is simple and straightforward. As with Peppermint OS, I did have to install the Codec plugins. However, the time-to-first-note was less than two minutes, so we can count this as a success!
Linux Mint 12 LXDE edition
Circling back to Linux Mint, I realized that my problem up above was choosing the wrong edition. Linux Mint comes in various editions; the main edition I had selected was based on Gnome, which requires at least 512MB of RAM.
Other editions are based on KDE, Xfce and LXDE. Linux Mint 9 LXDE requires only 192MB of RAM, and the newer Linux Mint 12 LXDE requires only 256MB. I chose the latter, and the install went pretty much the same as Mint and Lubuntu above.
The music player that comes pre-installed is called [Exaile], which supports playlists, audio CDs, and a variety of other modern features, so no reason to install Rhythmbox or anything else. Grandma can even rip her existing audio CDs to import her music into MP3 format. Time-to-first-note was about two minutes.
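Exaile handles ripping from its GUI; for the command-line inclined, the same job can be sketched with cdparanoia and lame. Both are real tools, though whether they are pre-installed varies by distro:

```shell
# Rip every track on an audio CD to WAV, then encode each to MP3.
sudo apt-get install cdparanoia lame   # if not already present
cdparanoia -B                          # writes track01.cdda.wav, track02.cdda.wav, ...
for f in track*.cdda.wav; do
    # strip the .cdda.wav suffix and encode with variable bit rate
    lame -V2 "$f" "${f%.cdda.wav}.mp3"
done
```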
The best part: the OS only takes up about 4GB of disk, leaving about 15GB for MP3 music files!
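How much music is that? A rough estimate, assuming MP3s average about 1MB per minute of audio (roughly 128 kbps, which is my assumption, not a measured figure):

```shell
# 15 GB of free space at ~1 MB per minute of MP3 audio.
minutes=$(( 15 * 1000 ))        # ~15,000 minutes of music
echo "$(( minutes / 60 )) hours"   # about 250 hours
```

Plenty of room for grandma's collection, in other words.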
Lubuntu and Linux Mint LXDE were similar, but I decided to go with the latter because I like that they do not force version upgrades. This is a philosophical difference: Ubuntu likes to keep everyone on the latest supported releases, so it will often remind you it's time to upgrade. Linux Mint prefers an if-it-ain't-broke-don't-fix-it approach, which means less ongoing maintenance for me.
A few finishing touches to make the system complete:
A nice wallpaper from [InterfaceLift]. This website has high-res photographs that are just stunning.
Power management and screen-saver settings, set to a nice pink background with white snowflakes falling.
A small collection of her MP3 music pre-loaded so that she would have something to listen to while she learns how to rip CDs and copy over the rest of her music.
Icons on the main desktop for Exaile, My Computer, Home Directory, and the Welcome Screen.
A larger font size, to make it easier to read.
Update settings that only look for levels "1" and "2". There are five levels, but "1" and "2" are considered the safest, tested versions. Also, an update is only done if it does not involve installing or removing other packages. This should offer some added stability.
I considered installing [ClamAV] for anti-virus protection, but since this laptop will not be connected to the Internet, I decided not to burn up CPU cycles. I also considered installing [Team Viewer], which would allow me remote access to her system if anything should ever fail. However, since she does not have Wi-Fi at home, and lives only a few minutes across town, I decided to leave this off.
Once again, I want to thank all of my readers for their suggestions! I learned quite a lot on this journey, and am glad that I have something that I am proud to present to grandma: boots quickly enough, simple to use, and does not require on-going maintenance!