If you don't have time to look through it, here's the quick list with some of my own thoughts:
A slow cooker – As we demand more smarts from our devices, they have to be able to run some kind of operating system. Linux is a great fit here. Its freedom allows a lot of flexibility to experiment, and it can also be made extremely tiny and embedded.
Research robot – This strange humanoid robot is designed for research. The project's site features a video of it going through its paces, including a bit of soccer action.
Underwater tsunami sensors – any device that needs to work with a degree of autonomy and be able to interface with other things can benefit from Linux. It tends to just run and run and run and run.
Computer engineer Barbie – OK... this one creates a lot of confusion for me. I'm not thrilled with the image that Barbie presents to girls, yet so many of them seem to connect with it. This one is clearly doing something more intellectual, though to me she looks more like she's in marketing, and... Aaaaaaggggghhh! However, she does have Tux sitting on her shelf and there is some binary code on one of her screens. It hurts my brain (and my eyes a little) but it's still kinda cool.
Airplane entertainment systems – I've seen this one in action. The plane I was on was having some problems with the system, but it was bravely trying to restart and reconnect. This is especially amusing for me when I think about the people who tell me how unsuitable Linux is for multimedia. As Sheldon would say: "That's poppycock!"
Bark-activated dog door – Again, something interactive that has to work reliably for a long time.
Animated menus – I admit that these big changing menu monitors drive me crazy, because what I want is always on the wrong screen when I'm trying to decide. However, if I were going to build one, Linux is the obvious choice.
Gas station ads – Not really my favorite thing... but ditto on why I'd use Linux here.
North Korea - I don't guess this is exactly an endorsement one would seek out here in capitalist-land, but... oh, well!
The International Space Station - Are you seeing a trend here? Places where you can't call someone to come reload the operating system seem to find some value in Linux.
Old stuff – One of the pastimes of Linux enthusiasts is finding places it will run. This obviously includes bleeding-edge hardware, but it also includes some legacy hardware... like computers some kids think Ben Franklin used. (Ben Franklin didn't actually use a computer.) The cool thing is that when someone gets it to work they're usually pretty open about sharing, so you can dig that old thing out of the closet and breathe a little new geeky life into it. Here are a few projects that might amuse you: m68k, Wii-Linux, PS2
Calculators, watches and other common devices – Tiny... reliable...
The Lego supercomputer – The availability of open computing platforms like the Raspberry Pi has created new opportunities to combine technology with quirky artistic sensibilities.
Titanium surfboard robots – durability... reliability... Am I repeating myself?
Quirky ideas that don't catch on, but someone had to try anyway – I love to see unfettered creativity and innovation. Sometimes these things pan out and become the next aglet. Sometimes they just go on to history's "Well... that was interesting" pile. Since the primary requirement of experimenting with Linux is mere desire it's a great platform for crazy ideas and mad science.
Cow milking system – durability... reliability... and apparently sensitivity and gentleness too.
I love all of these weird uses of Linux. I know this merely scratches the surface. Share anything else you've seen.
I haven't written here for a while. A lot of changes have been going on behind the scenes for me. Around the time that I wrote my last entry I was brought into a new team within ISV Developer Relations at IBM. As a result, I am no longer editor for the Linux and Open Source content on developerWorks. This is bittersweet for me, as I really enjoyed helping authors to shape their information to share with people who want to learn. In my little intro on this blog I mention that I am really passionate about Linux and Open Source. This is not hyperbole. I've run a Linux desktop for more than ten years. We have no Microsoft Windows or Mac OS in my house. Personally, I don't find this a hardship. I do pretty much what I want and need on my computers and don't seem to have the struggles that others do. (I'm hearing a story from one friend who has been through ten different Microsoft tech support people over a period of days, still trying to get his Windows 8 activated. Bleh!)
I've always found that it wasn't that Linux and open source couldn't do things that people needed, but that they just weren't aware of what was available or how they had become tool-bound. Perhaps a Linux environment tends to provide the most benefit to a technical person with a sense of curiosity. In any case, it was a joy to help provide a variety of information to help people try things and broaden their horizons. I got to work with some really talented authors and make a difference to hundreds of thousands of people. Wow!
My new role is going to take me more behind the scenes. I'll still be enabling people to get their messages out, but in a greater variety of ways. It's all fairly new, so I don't really have stories to share. I have no doubt that I'll get food for blogging in my new role.
I will continue to use this space to talk about technology and social issues that I think make a difference. I'll continue to share things that I do with Linux and open source so that others can explore them as well. As my activities become more varied I may have even more to share!
Along those lines, I recently had to put together a booklet for a non-IBM project. I had done the writing in a word processor, but when it came time to generate the final format I decided that I really wanted to work with a publishing tool rather than a word processor. In a previous role, ages back, I did a lot of work with Adobe PageMaker. I remember renting time on Macs at Kinko's to create signs, fliers and such. Once I designed a set of sample pages for an elementary level math book using PageMaker. The pages were being shown in an editorial meeting for a major textbook publisher and it was the first time they had gotten a printed mock-up as opposed to a literally pasted page. (That sounds ancient doesn't it?)
It's certainly possible to do a booklet in a word processor, and LibreOffice works fine for that. (Have you downloaded v4 yet?) However, a desktop publishing tool is designed for a greater degree of precision in layout. If you can get your head wrapped around the different approach to thinking, you really get a more finished product. The most popular open source publisher project I am aware of is Scribus, and the last time I used it was to do an 8-page, full-color mini-magazine for a convention. I know that many will consider Scribus to be lacking compared with a PageMaker or a Quark, but the $850 price tag on Quark makes it a little out of my reach, and it's not available for Linux. Scribus is free and available for Windows, Mac OS and Linux. I essentially had to wish for it and it was installed on my system. Here's a sample from their SourceForge gallery.
Of course, working with page layout is different from other kinds of content editing. Publishing programs are much more concerned with controlling how things look than with helping you to write. It's literally like having a lot of little pieces of paper that you are going to paste onto a blank page. You assign what is on each little piece of paper: a graphic, some text, and so on. Some of this is handled by "frames" in a word processor, but in layout software it just seems to be a little easier somehow... at least to me... depending on the project.
Because of the "pagey" nature of this application, it was easier for me to set up something to be how I wanted it and not have things move around on me when I changed something. I had a few challenges with setting up a dynamic Table of Contents, something that is almost magical in LibreOffice. I will probably review this tutorial by Bruce Byfield before my next document.
If you want to play with layout software, Scribus isn't a bad place to start. It will acquaint you with the basic concepts of layout, primarily separating your design concepts from your content. For a big project it might still be worth using a commercial product, but for a little newsletter, booklet or other projects where you are assimilating various elements into a single document, Scribus might do the trick.
Now, when I'm designing a single page, like a sign, postcard or flier, from scratch, I will tend to use Inkscape. I've even done large, full-color banners with this and enjoy it very much.
More adventures later. I'm tinkering with Blender, the free, open 3D modeling/animation/compositing software. If you think moving from word processing to page layout is mind blowing you just wait 'til you see what happens in this paradigm! I used Blender to create the header graphic for this blog. Everything is composed of 3D elements which were lit and moved around, then "photographed". It's pretty interesting stuff.
For everyone who contributed to the Linux and open source areas on developerWorks for me, you have my sincerest gratitude. It was an honor to work with all of you. For the readers who gave feedback, sometimes in the most forceful ways, you have my gratitude as well. It's hard to function in a vacuum, and your input constantly shaped my approach to what we covered and how. I look forward to my new adventure.
OK. I'm a little more caught up after the holiday. I'm pleased to announce that the Learning PHP series on developerWorks has gotten some updates. (Part 1 was done earlier; the Part 2 and 3 updates were just published.) If you want to dive in and start making use of this popular language, this is a great way to do it. I'm actually going to roll up my digital sleeves and walk through the tutorials myself as a coder rather than as an editor. I need to tinker with PHP a lot more and this will get me going quickly.
There's lots going on with the changing of the year, however. Here are a few things of interest.
Linux: 2012 was a very good year
I came across this article from PC World, a publication that is not necessarily known for pushing Linux. There are a lot of interesting points in there. Linux is making money. People are getting stuff done with GNU/Linux. Gaming companies are starting to turn their eyes to the platform. For myself, 2012 was the year when I had less confusion than at any other time in my life when I told people I ran Linux. Nearly everyone I spoke with about it had heard of Ubuntu and many said they were considering loading it up themselves if they hadn't already. I know that for many things Linux still makes up a small percentage bump on the user map, but it is going gangbusters in the background and it just isn't going away!
Of course, it isn't always pretty in Linux land. I don't know if it speaks to passion or just poor socialization, but a recent blog post in the Real World Linux community discusses the heated exchange between Linus Torvalds and one of the kernel maintainers when a patch broke something in userspace. I've heard a lot of discussion about whether or not professionals have exchanges like that. They may not where you have worked, but I've witnessed some pretty strongly worded conversations in my career. It's probably not something recommended in the people management handbook, but it does happen. Hopefully everyone will make nice and move on. Of course, ten years ago that exchange wouldn't have been any kind of news whatsoever.
One of the interesting side effects of tools like Twitter is that it gives you a view of the information pie that is hard to get any other way. So much information comes from so many people, with tags identifying groups and trends. Twitter has provided a 2012 retrospective page showing what was hot in a variety of topics. My stuff hasn't bubbled up to the top so far, but it's interesting to see who has. It's also interesting (and sometimes appalling) to see what people have found important enough to share and discuss. This kind of data is going to become more robust as we go.
Of course, the new year is not all about looking back. It's also about looking forward. This slide show from InfoWorld has their picks for what is "highly anticipated". I'll admit that some of them don't especially grab me. (Let's see if you can figure out which ones!)
I'm intrigued to see the future direction of Android, though the way providers tend to implement it I won't be able to enjoy it with my existing device. I really wish I could handle Android like I do installing Linux on devices!
The pending evolution of wireless protocols is also intriguing. Wouldn't it be amusing if we eventually solved the high-speed-everywhere question with wireless rather than by running cables?
The flexible displays are also pretty interesting and could show up in some surprising areas. Imagine touch interaction that doesn't need to be flat anymore.
I'm also curious to see what happens with the new innovations in energy usage that are evolving. I'm sure there will be more on that later.
Happy New Year, all. I hope you got some kind of a break and are ready to start making a difference with all that you do.
I'm going through my post-holiday email with a shovel right now. I have some more "New Year" kinds of things to discuss, but can't quite get to it today. Here are a few tidbits of fun for you in the meantime.
I was watching a British program called QI (Season 10, Episode 1; I would post a link, but some of the conversation was a little naughty for the workplace. If you find it on your own, that's your problem.)
That's so OMG!
When we think of the abbreviation OMG (short for "Oh, my God") we usually think of bubbly girls texting away. It seems, however, that its first recorded use was in a 1916 letter from Baron John Arbuthnot Fisher to Winston Churchill. Lord Fisher wrote, "I hear that a new order of Knighthood is on the tapis—O.M.G. (Oh! My God!)—Shower it on the Admiralty!" He even clarified the abbreviation... though that would seem to defeat the purpose. I suppose I'll never really be Noble. You can find the quote in his memoirs, published in 1920. (http://cmwosdu.de/W3h8Pz )
I unfriend thee!
Many people have been cleaning up their friend lists on Facebook, and it seems that "unfriending" someone is a new and potentially volatile activity. This one is even older! The first recorded usage is in a 1659 letter by Thomas Fuller in which he writes, "I hope, Sir, that we are not mutually unfriended by this difference which hath happened betwixt us." You can see the original letter in his biography: http://cmwosdu.de/XjQPM7
I must use this the next time I have to let a connection go because they get too crazy!
Get your systray back in Ubuntu!
Finally, here's a little tip I found. I was installing blueproximity, a tool that lets me use my mobile phone as a lock token for my Ubuntu desktop. Essentially, when the phone leaves the area of the desktop, the desktop automatically locks, and it automatically unlocks when the phone returns. That's kind of handy! However, I was having some difficulty with it, partly because the systray icon did not come up in Unity. (Yes, I've been using Unity on my desktop. I hate to say it, but I actually am getting used to it.)
I've missed other icons there, but this was the one that finally annoyed me into action. It turns out the default behavior of Unity is to limit items on the systray to those that have been whitelisted. A simple command turned them all back on, including ones that I didn't realize were missing. Unity with a fully active systray is much nicer. Here's the command to turn them all on:
gsettings set com.canonical.Unity.Panel systray-whitelist "['all']"
I used that command, logged out and back in, and, voila! All my icons are back and functional. Of course, you can whitelist just a few of them. More details can be found in the original discussion in the Ubuntu forums where I found my answer: http://cmwosdu.de/S5y1hK
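If turning everything back on feels like overkill, the same key takes a list of specific applications instead of 'all'. Here's a minimal sketch; the application names are just examples, and it assumes you're in a Unity session where the com.canonical.Unity.Panel schema exists (check with `gsettings get` first):

```shell
# Hypothetical sketch: inspect the current systray whitelist, guarding
# against systems where gsettings or the Unity schema isn't present.
if command -v gsettings >/dev/null 2>&1; then
    current=$(gsettings get com.canonical.Unity.Panel systray-whitelist 2>&1 || true)
    # To allow only specific applications, you would then run something like:
    #   gsettings set com.canonical.Unity.Panel systray-whitelist "['blueproximity', 'skype']"
else
    current="gsettings not available (probably not a Unity session)"
fi
echo "$current"
```

Either way, log out and back in for the panel to pick up the change.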
Happy New Year! More retrospective, futurespective kind of writing tomorrow or Friday when I can breathe!
From: Scam Mail
DO YOU NEED A LOAN, IF YES REPLY FOR MORE INFO.
At least they are honest about scamming you.
IBM Linux on POWER
I think Linux and the POWER architecture are an outstanding combination. I would really like to see a new POWER laptop to load Linux on. (Apple used to be a good source for that until they decided to go Intel.) POWER servers are competitive in price with el-cheapo models, especially when you factor in consolidating services into a single box. I came across this video which talks about Linux on POWER.
Steve Southworth is funny
Steve Southworth posted this picture with the caption: "The original camera phone."
New VoltDB article in developerWorks/opensource
How many databases do we need exactly? When are we going to have enough? If you have a shop where you are able to support a single database, say DB2, then that's great! However, it's likely that you need to have flexibility in your database, either because you can't always get what you need from the administrators, or you are dealing with customers who have varying situations, or some other unexpected situation that you can't predict. It's always good to have more tools in your toolbox. Take a look at VoltDB and play with it. Here's what author Simon Buckle had to say about his article:
"Having problems scaling your relational database? Thinking about switching to a NoSQL datastore in order to scale but you need your data to be consistent at all times? These are the kind of tradeoffs that you would normally have to consider when deciding whether to go down one route or the other but it is possible to have both scalability and data consistency. Introducing VoltDB.
"VoltDB is an in-memory database written in Java with the scalability of NoSQL data stores, but without the consistency problem; it is ACID compliant. VoltDB is queried using SQL, so it will be familiar to those who have worked with relational databases before. This article will cover the basics of how to install VoltDB and how to use it, so you can integrate it into your application(s). Finally, it will also discuss VoltCache, a scalable key-value store with a memcached compatible API built using VoltDB."
The VoltDB developers tout their project as revolutionizing your application design methodology to get things out fast! Check it out, play with it and you'll be the smartest one in the room when someone brings it up!
Open Source alternatives
Ages back, I wrote a popular blog entry called "Start your learning with Open Source." It must have struck some sort of a chord because it's gotten more than 15K hits since 2009. One of the most common conversations I have with people about open-source software is about substitutes for the current software that they are using. I reference a few sources in that blog entry, but there is also an evolving wiki in the Real World Open Source community with a list of open source software. You'll find some handy suggestions in there. You can also add your own. It's just getting started, so we've barely scratched the surface. Take a look and contribute.
Don't forget that there is a discussion board there as well where you can bring up your questions and problems. Have another place where you do those discussions? Tell me where and I'll add them to the bookmarks or feeds in the community.
I caught an article today: "Linux users targeted by mystery drive-by rootkit". I stand by my belief that Linux is the most secure environment I have used, and I enjoy the freedom from many of the security issues that friends experience. However, it would be ridiculous to imagine that Linux, or any environment, is immune from attack.
You probably already know the security basics. Don't hang out at weird websites and let anything and everything run. Use things like an ad blocker in Firefox to help cut out little scripts and things that you don't want to run. When something pops up asking whether you'd like to install or run something you don't understand, don't click "Yes". How do you know you're OK, though?
First, pay attention! You can tell when your system is not behaving normally. When the network seems clogged or processes start getting chunky that could be a sign that things are running on your system that you don't know about. Don't ignore that. Do something about it.
The first step is to look at the processes that are running. In Linux a basic ps aux will give you information about what is running. If you tinker a lot, like I do, you may get all kinds of things turned on when you install them to play with. The other day I noticed that I had a web server and two database servers that had been left active after playing around with them. Often when you install that kind of software it will set itself to automatically start. These are the kinds of things that can create danger for you if you don't realize they are running.
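As a sketch, here's the sort of quick sweep I mean; the server names in the grep are just examples of things that commonly get left running after a session of tinkering:

```shell
# Quick process sweep: what's running, and are any of the usual
# suspects (web or database servers) still alive from past tinkering?
ps aux | head -n 6        # header plus the first few processes, with owner and CPU/memory use
ps aux | grep -E 'apache2|httpd|mysqld|mongod' | grep -v grep || true
```

If the second command prints anything, that's a service you should either recognize or shut off.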
For seeing how system resources are being used, top is a good quick check. It is a console-based system monitor that shows you what is using resources on your machine. Here's a sample:
Based on my snapshot, my audio and Firefox are the biggest pigs. I also note that I have mongod running, which is a database engine that I thought I had disabled. It may be that something is using it, or I may not have shut it off correctly. I need to look into that. As a "basic user" I probably don't need to know about all of these processes... but as a "technical user" I really should understand them, at least well enough to know that they are normal.
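By the way, if you want a one-shot, text-only snapshot of what top sees, say, to paste into a note or an email, batch mode will do it:

```shell
# Run top once, non-interactively, and keep just the summary area plus
# the first few process lines. (Sorting options vary between versions
# of top, so this sticks to the portable flags.)
top -b -n 1 | head -n 15
```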
Dealing with root kits and other nasties
Keeping an eye on all of your running processes is probably not what you want to do. It's good to know that you can spot check when things are misbehaving, but you want to be proactive and stop things before they start. Here are a few things that might help.
Clam Antivirus – Clam was the first antivirus software that I discovered for Linux. It runs on other platforms as well and seems to be pretty good stuff. Clam does what any antivirus system does: it scans files and compares them with signatures of known viruses. Of course, the value is only as good as the definitions. Clam definitions seem to be updated pretty regularly and it's easy to automate the process. At the very least you should have something like this available. Admittedly, the only virus files that I've ever found with this have been dormant Windows viruses that someone sent me in emails... but it's good to know that.
chkrootkit – This is a common tool available through Linux distributions. It looks for a number of common exploits and reports issues.
rkhunter – another popular rootkit detector that is available through the Ubuntu repositories. This tool works best if you install it onto a "clean" system, i.e., one that you know is uninfected. Ideally you would set it up immediately after installing the operating system and let it initialize. rkhunter looks for unexpected changes to system files and alerts you to possible mischief.
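As a sketch of how these fit together, here's the sort of routine I have in mind. It assumes the three tools are installed from your distribution's repositories (on Ubuntu: clamav, chkrootkit and rkhunter); the actual scan commands are shown as comments because they need root and take a while:

```shell
# Check which of the scanners are actually installed before relying on them.
for tool in clamscan chkrootkit rkhunter; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: installed"
    else
        echo "$tool: not found (install it from your distribution's repositories)"
    fi
done
# A typical sweep, once they're all in place, looks something like:
#   sudo freshclam                  # update Clam's virus definitions
#   clamscan -r --infected "$HOME"  # recursively scan, reporting only hits
#   sudo chkrootkit                 # look for known rootkit signatures
#   sudo rkhunter --check           # compare system files against the baseline
```

Scheduling the sweep from cron is the obvious next step, so you don't have to remember to run it.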
Of course, if you are serious about digging into root kit detection, you will want to look deeper than just running a tool. Here is an excellent article on Symantec's web site: "Detecting Rootkits And Kernel-level Compromises In Linux" which goes into quite a bit of detail about the technical side of this sort of forensics.
Cyber attacks seem to be the way of the future. No one is immune, but you can make yourself less of a target. Some say that eternal vigilance is the price of freedom, and this probably goes for software too.
I got a message when I tried to run a browser-based application that was truly out of Dilbert:
XXXXXXXXXX is temporarily unavailable at this time for any of the following reasons:
Status and additional information are posted on the XXXXXXXXXX System Status page. We apologize for the inconvenience and will bring the application online as soon as possible. Please try again later.
The status page did tell me what was going on, but the first read was a little silly.
Update to Ubuntu 12.10
The other day I did my update to Ubuntu 12.10 on my laptop. The update went smoothly, though it took a while. The one wish I had was for a way to have it automatically use the recommended response for dealing with config files during the update. The way it works now, I have to hit a button from time to time. I'm sure there is a way to do this, but I haven't researched it. Maybe someone out there can point me in the right direction.
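For what it's worth, one approach I've seen, at the apt-get level at least (I haven't verified how it interacts with the graphical release upgrader, so treat this as a sketch), is to pass dpkg the options that auto-answer configuration-file prompts with the default:

```shell
# Sketch: tell dpkg to keep your existing config files (--force-confold)
# unless the package maintainer's default applies cleanly (--force-confdef).
# The -s flag makes this a simulation, so it's safe to try as-is.
if command -v apt-get >/dev/null 2>&1; then
    out=$(apt-get -s \
          -o Dpkg::Options::="--force-confdef" \
          -o Dpkg::Options::="--force-confold" \
          dist-upgrade 2>&1 || true)
else
    out="apt-get not found (not a Debian/Ubuntu system)"
fi
echo "$out" | head -n 3
```

Drop the -s (and add sudo) to do it for real.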
Overall things seem OK. I'm getting some mysterious system component crashes that seem par for the course with an update on this laptop (Lenovo w500). Whatever is crashing doesn't seem to be affecting my normal activity, so it's not troublesome. I expect the next serious round of updates to magically make all of those things go away. I feel that a few things are a little more spry (especially in the Unity desktop) but I have no measurable benchmark.
I have to say that I really like updating Linux. In Windows and other systems, where a major upgrade is actually the purchase of a new product, it always seemed a pain. (I'm seeing all sorts of unrest about Windows 8 and am thankful that I don't have to play there.) In Linux I get a little note saying that there's a major distribution update and I hit the button. It's been very pleasant.
Of course, I have a server at a church that suffered some neglect for a while that needs to be updated by hand because it fell too far behind. That is inconvenient, but workable. If you keep things up to date it generally all goes pretty well.
PDFs on the fly
I use PDFs all the time. I think they are a terrific way to share documents. They save trees but provide a controlled look and feel, and their openness makes them easy for anyone to read regardless of tools or operating systems. I trust PDF as an archival format much more than I trust any of the word processing formats out there, even OpenDocument, I'm sorry to say.
I started working with PDFs a lot when I started using OpenOffice.org, LibreOffice and the like. It was difficult to convince others that they needed to download software, even if it was free, to read my documents. Then I realized that the vast majority of the time I don't really need someone to edit the document, just view it. All my OpenDocument tools had a PDF button built in, so... voila! Easy sharing with no complaints.
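That PDF button also has a command-line cousin, which is handy when you want to convert a whole folder of documents at once. A small sketch (it assumes LibreOffice is installed, and the file names and output directory are just examples):

```shell
# Sketch: batch-convert every ODT file in the current directory to PDF.
# --headless runs LibreOffice without its GUI; --outdir picks the target.
if command -v libreoffice >/dev/null 2>&1; then
    out=$(libreoffice --headless --convert-to pdf --outdir ./pdf ./*.odt 2>&1 || true)
else
    out="libreoffice not found on this system"
fi
echo "$out"
```

The same invocation works for other input formats LibreOffice can open, which makes it a nice little archival pipeline.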
Generating PDFs has become more common and there's no reason why you can't include that functionality in your own programs. The developerWorks article "Generate PDF files from Java applications dynamically" has just been updated by the author to include the latest techniques. Take a look and see how you might be able to harness this power for yourself.
Today I was pointed to the article "How would you fix the Linux desktop?" through slashdot. (Yes, another one of those articles.) I am quite comfortable using a Linux desktop and have been for nearly a decade, so it's not very mysterious to me. My family also uses Linux as a desktop with no real complaints. However, this seems to remain a controversy. It reminds me a little about my daughter talking about her school lunch.
My daughter just turned ten. The other day she was talking about all of the terrible things they are doing at the school cafeteria. They've removed some of the dishes she liked and put, in her opinion, poor alternatives in their place. I should say that my daughter is not a pizza and hot dogs sort of diner. She likes sushi and different kinds of vegetables when they are well prepared. Her description of what was going on did sound a little poor, but it's an institution's approach to being told to provide more "healthy choices" while also adhering to a giant list of restrictions, primarily budgetary. I would probably eat it, but not look forward to it. I suggested that my daughter could always take her lunch and we could keep them interesting. I don't think she even heard me.
We have a lot of choices that we would rather not act on. "I hate my job," says someone... but doesn't really want to leave and find another one. "I hate the environment in my city," says another... but won't move someplace where they say they'd be happier. We complain, but we don't act, because we are not so unsatisfied that we think it's worth the effort to make a change. This truth means that most complaints fall on deaf ears, because providers know that we likely won't do anything. If Walmart knew that "I'm never shopping here again" didn't have a silent "unless I find that I'm desperate for something and everyplace else is closed, or I happen to be somewhere and Walmart is the only place I recognize, or I know I need something cheap" then they would probably be a lot more attentive.
So, in desktop land, though people might be disappointed with their Windows or Mac OS experience, they likely won't really try to make a move. Once the disappointment is voiced it has been served, and one can simply get on with things.
Some say that the problem is not enough applications and that there are barriers to writing applications that work across Linux desktops. I don't know how true that is. I regularly play with different desktops on my Linux installation (you can change it every time you log in if desired). All of the programs I run work fine across the desktops... though the experience changes slightly as the desktop features rearrange. It seems that it is largely a matter of the application letting go of the things that the desktop does rather than trying to emulate them. Maybe I'm missing something.
There are really only about a dozen things that most people do with a computer. Applications exist for those. Developers of popular software could provide a Linux version as easily as they provide a version for Windows and Mac OS. Arguably, if they started to use some of the existing open development techniques that are used for Linux applications, they could more easily write things that run on all of the operating systems with a single code base. There are several examples of this in existing open-source software.
People don't use the Linux desktop because, for the most part, they just don't care. They use whatever they're given. If IT turned around and gave them a Linux desktop and management said it was the new policy, people would use it. Oh, they would complain, just like people do about the store they go back to again and again, but they wouldn't quit their job over it. As long as someone has to make an effort to be different, it will only be those who already do that sort of thing in their lives who take on Linux, and discover the benefits it gives them. Everyone who prefers to "go with the flow" can discover what flows downhill.
This is just a quick one. I've started a week of System Z training, to better understand this technology. I think System Z is a bigger deal than people imagine, and a lot of our future could benefit greatly from more people being aware of and taking advantage of this powerful computing technology.
On the one hand, I was aware that this is not for the squeamish. It's true that System Z can create a powerful computing environment that allows many people to simply do what they do without having to worry about everything that runs under the hood. However, to manage such a universe takes some willingness to get your hands dirty. It reminded me of some of my earliest days of computing, where one had to be so close to the moving parts to connect to networks and do anything besides simply run one application at a time.
Yes, there is a lot to successfully harnessing the power of a System Z environment, but it's not really beyond anyone who has a basic grounding in technology. Like anything worthwhile, it takes some focus, and some work... and practice... but the rewards can be so great. Personally, I think open, highly mobile devices on the front end with plenty of Big Iron type of power on the back end is the shortest distance between here and Star Trek. There is still plenty of room for openness in such an environment... though I'm wrestling a little with my classmates on that one.
Today was pretty brain-filling though. Off to enjoy some amazing chili at Tolbert's with my parents. Then more brain stuffing tomorrow.
Fear not... more on blogging coming soon... just not enough brain for it right now.
I recently had a call with some people who are interested in contributing to the Real World Open Source and Real World Linux communities here on developerWorks! Yay! I would like to see a lot more input from people in these places. As part of that conversation, they asked me to outline my recommendations for people new to writing in this environment. I decided that this might be of interest to the general public, so I'm posting here rather than replying privately through email.
Writing on developerWorks is not like having your own WordPress or other blog. You can do a good deal of customization to make it fit your own preferences, but you will need to fit into the overall developerWorks framework. This framework may change around you, so the general rule of thumb is "Keep It Super Simple". Your content is what is most important here, not any bells and whistles you might add, so write things that do well with plain, clean HTML. I actually prefer to do my writing in an HTML editor. I tend to use Kompozer, an open-source editor. Unfortunately, development on this project seems to have stalled out, but it's still my favorite. It produces clean HTML with no muss or fuss and lets me easily put something together that I can just paste into place. You can use any editor you choose that can save HTML. However, bear these things in mind:
Don't use a lot of styles and parameters on the HTML. Sometimes
you need to, but keep it to a minimum. This will make your article
behave better when it's published.
Be cautious about what comes out of a word processor. When I write something in LibreOffice and then paste it into the blog, there is a lot of hidden style information that ends up in the HTML. This is ugly and bulky and will do weird things to your entries. Be prepared to clean up anything that you do.
Including images is good, but you will be working with a simpler
subset of formatting options. You will also need to upload your image
to the developerWorks server or reference the link externally.
Posting audio and video is good, but remember that this can
sometimes be unpredictable. For example, when posting a YouTube video,
you must use the old-school <object>
code rather than a simpler <iframe>.
Someday this may change, but for now it is necessary. Some embeds
simply will not work.
The included HTML editor is decent, but a little thin for me. I
have two browser plugins that I use to help me write entries that I
have not pre-written in Kompozer.
Write Area - This plugin will give you a fuller editor that you can invoke in any text area with a right-click. It provides more formatting options for links and images. Unfortunately, it does not include a spell checker, so be sure to double-check your work. I use this a lot! (I'm using it now.) It's been a real help for getting around any site that has a limited window in which to write. It's free for Firefox. I'm sure that people with other browser preferences will find similar add-ons. I'm just telling you what I use.
Scribefire - This plugin provides more than
just an editor. It is a blog management system, allowing you to work
with various blogs on different sites. It will give me a list of the
blogs that I use and let me edit or create a new entry for any of them.
This can be handy, but it sometimes does some strange things with more advanced formatting. (Remember, I said to keep it simple?) Another feature is that it allows me to publish the same thing on multiple blogs at once. I did run into one issue, which I mentioned in a previous post: do not use the '#' symbol in your article titles with Scribefire. This caused it to get lost when trying to aggregate my existing entries. That was very frustrating for quite a while until I tracked down the issue.
There are other blogging tools which are compatible with
developerWorks, but these are the ones that I generally use.
Any pictures that you want to use need to either live on the system or be linked with the URL. For some content, especially copyrighted content, I just link to it. That saves some of the usage hassles and
acts as an automatic credit to the owner. For example, Dilbert cartoons
are a great thing to include from time to time and they have a simple
method for linking to their content.
If you're going to do things like this, you should expect to have
to tweak the HTML from time to time. Sometimes developerWorks seems to
alter things that are not entered through the raw HTML view. (That's
the <h> button in your toolbar if you are using the default blog
editor.) HTML is nothing to be afraid of, and many of you are technical
people anyway, so you should feel comfortable with it.
For some pictures, though, it's best to upload them. If you are
using the default editor, uploading is automatic. You click the icon to
insert a picture and it gives you a chance to upload your picture. I
will often use this step just to get the picture up and then go into
Write Area to manipulate it and make it look nice with the article.
(Screenshot: the add-image tool.)
You can also upload an image file directly. Select the Settings
link, next to New Entry. On the Create & Edit tab you'll
find File Uploads. You can manage everything here. Note that this interface acts much like old-school FTP, so you can't overwrite a file in place. If you need to change something, you need to delete it and then upload the new one. That leaves a brief window where the file may not exist, but it's pretty quick.
Copy the link for an existing file and you can use a conventional <img>
tag to include it.
Bookmarking major links
I quickly got annoyed by the steps it takes to get to areas like file management. They are easy to find, but require a number of clicks. This was easily remedied with a few bookmarks. I have bookmarks set up for my main blog and the entries
page. These reside in a folder on my bookmark toolbar, so it's
pretty easy to jump right to the spot I want. If I did more file
management I'd probably set up a bookmark for it as well, but it's just
as easy to go to the entries page and then click over to files.
(Two quick clicks versus three slow ones.) It seems like such a
silly thing, but it really helped me a lot.
Contributing to the Real World communities
That should get you started with basic blogging. If there are
questions that I have raised rather than answered I'll be happy to
address them. You can email
me or comment here. I may make this a living document and update it
rather than writing additional chapters. I've set up the Real World Open Source and Real World Linux communities so that any member
can draft an article. Simply become a member and start one. When you
submit it, I'll be notified and can release it. Feel free to use this
to post a great topical discovery or idea without taking on the burden
of maintaining your own blog. If you decide to start your own, let me
know and I'll follow it.
Computer security fascinates me. I freely admit that I don't have the chops that many do for cracking into or securing systems, but I do alright for myself... on securing systems, that is. I'm certainly not claiming in any way that I spend time engaged in any activity that could be construed as subversive or illegal... Dang! Awkward...
Of course, this is the situation one gets into when taking an interest in the "dark arts" of computing. People assume that you are claiming to be some sort of criminal mastermind when actually you are simply fascinated by how the bad guys do things. Just as someone who likes to watch true crime documentaries on TV is not necessarily using them to plan their weekend, many people interested in "Black Hat" hacking are not looking to lead the next charge of Anonymous. So if you were interested in attending the recent Black Hat 2012 conference in Las Vegas, it was probably hard to make a strong connection between that and what you are paid to do. That's OK. Though the event is over, there is a reasonable archive of conference material on the web site, including papers, presentations and even some source code! (Use at your own risk.) There's not much in the way of video from the site right now, but a YouTube search brings up material -- though most of it is from Black Hat 2012 in Europe. I'm guessing, though, that techniques and vulnerabilities don't change much by crossing the ocean, so you can probably get a lot from them.
I'll keep my eyes open and try to report additional material as I find it.
IP Law Talk
The other day I was reading about a patent license agreement between a major software company and a minor company for an undisclosed amount regarding undisclosed patents. The story was non-news, unless you're into corporate celebrity, but the discussion had some interesting thoughts expressed. At least they tried to be interesting. They ultimately turned into the sort of juvenile brawl that such discussions do because everyone is out to win. The part of the discussion that really caught my attention was why a company might not want to disclose their patents. Since Linux and Open Source software frequently comes under fire for allegedly violating patents this is interesting to me. The conversation is often along these lines:
Patent holding company: The villainous developers of these open-source projects are stealing our IP and violating our patents and they must pay.
Open-source developers: Uhhh... we don't think we are.
Patent holding company: Oh, yes you are. In fact we have been striking numerous deals with people who agree that this is a violation.
Open-source developers: Wow, you really do seem to be making deals with people. Maybe there is something to this. What patents are we violating, so that we can fix that?
Crickets: (chirp) (chirp) (chirp)
OK... that wasn't completely fair and read more like a Dilbert cartoon, but I hope you see the fun side of it. It seems to me that if my goal were to prevent people from infringing on my intellectual property, I would want to proclaim loudly and clearly what was being stolen from me so they could and would cease and desist. That doesn't seem to be the way it works out, for some reason. There are non-disclosure agreements (NDAs), behind-the-scenes business, announcements that are simultaneously widespread and secretive. It can be very confusing.
Well, it turns out that a new community has formed on trying to understand and relate to Intellectual Property Law. It's your chance to ask your questions and voice your own experiences with people who deal with this every day. It's called IP Law Talk, and should be a fascinating place. I wonder if they know about this weird patent slide show.
Has the Command Line outstayed its welcome?
This is the question asked by a Linux Insider story. I'm going to apologize for being a little prejudiced here, but I just don't understand someone technical who wants to do everything with a mouse. Even when I'm supporting Windows I will jump into the command line, because I can get information faster by typing "ipconfig /all" than by browsing around with the mouse. I use icon-based launchers and I find them very handy; I recently talked about how I use them to keep my Firefox identities clear. However, there are some things that I can just flat-out do more efficiently from the command line. I can then combine those things into a script, which I can place under an icon if I so desire. Macro recordings of mouse movements just don't seem to have the same capabilities.
I know that many people get nervous about the command line. They don't type well. They don't have the commands memorized. It can be frustrating until you get used to it. But there is a heavy price for a graphical interface in system resources which could and should be used for other things if the interface is only rarely required.
I hope that you aren't afraid of the command line. If you'd like to explore it in Linux there's a nice tutorial as part of our Learn Linux 101 series. Windows folks can look at this site. You don't have to use it all the time (though I admit that I do). It's nice to have it around, though, for when the other tools aren't working. As an example, when some program has taken over my graphical interface, it's nice to be able to switch to a command session to see what's happening and kill the offending processes. I've been able to use ssh from my phone to connect to my laptop when the keyboard wasn't responding and fix things without having to reboot. Is that geeky? You bet! But that skill comes in handy when you're dealing with bigger problems.
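Here's a minimal sketch of that kind of rescue. Since the real offender varies, this demo spawns a stand-in process (a long `sleep`), then locates and terminates it the way you would a frozen desktop application from a text console or an ssh session:

```shell
# Simulate a runaway program with a long-running sleep, then locate
# and terminate it. In real life you'd find the PID with something
# like "ps aux | grep appname" instead of capturing it from $!.
sleep 300 &
PID=$!

# Confirm the process exists before acting on it.
ps -p "$PID" > /dev/null && echo "found process $PID"

kill -TERM "$PID"             # ask politely first
wait "$PID" 2>/dev/null || true   # reap it; a stubborn one would get -KILL

# -0 sends no signal, it just checks whether the process still exists.
kill -0 "$PID" 2>/dev/null || echo "process gone"
```

On a real system the only differences are how you reach the shell (Ctrl+Alt+F2 or ssh) and how you find the PID.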
There has been some controversy about comments by Valve co-founder, Gabe Newell, calling Windows 8 a "catastrophe" and saying that Linux was part of Valve's future strategy. (Don't take my word for it. See the story by the BBC.) I admit that I haven't had as much time for games for a while, and when I do I am more likely to want to play a "human contact" game with dice and faces rather than having more computer time. However, it's no secret that Linux has been woefully thin in the gaming area. This is ironic, because I think that the tools and libraries available to Linux could make it an outstanding platform for media and gaming. It's just not where game creators focus.
Perhaps something like the Steam platform working more with Linux will make a difference. Of course, this is a future play: Valve has announced enthusiasm but not an actual Linux release of Steam. It could get pretty interesting, though. While browsing through the gaming world I also found that Good Old Games does not support Linux now, but might respond to interest, especially if things work out well for Steam.
I did find a site, Desura, which already supports Linux. I downloaded a few of their free games to test and just might go for some of the paid titles as well. As entertainment becomes more network- and browser-based, the native platform should matter less and less. I'm interested to see what happens. If anyone is already using Desura and knows games I should check out, let me know!
Some time back-- actually quite a while back-- I wrote a series of articles called the Windows to Linux roadmap. Now that I'm editor of the Linux site on developerWorks, I have to look at these things from a different perspective and it is bittersweet to watch them age. Ubuntu wasn't around at that time, which is my primary environment now. There are also tools that have come along to make management easier when, at the time, Webmin was really the only consistent tool I could find. (Webmin is still around, by the way, and I still might consider it if I was managing servers and needed to help share management with people who didn't have a strong Linux background.)
One of the articles I was looking over today was the one on doing backups. In 2003 the backup landscape was pretty dismal, at least from where I could see it. Were I to write that article today I would have more tools to discuss, my favorite being rsync. Rsync was actually around when I wrote the articles, but it was one of those resources that lurked in the shadows, like so many little tools do. Essentially, rsync is designed to do file duplication, but it tries to be as efficient as possible by transferring only the delta (the changes) in files when it can. It has a number of options and can be set up to do transfers through the network and over encrypted tunnels if desired. I wrote a little script that I run manually whenever I wish to do a backup... though I could run it automatically if I chose... and probably should.
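A minimal sketch of that kind of script follows. To keep it runnable as a demo it copies a scratch directory; in real use SRC would be your home directory, the local destination your USB mount, and the commented-out second leg would point at a real remote host (the hostname and key path shown are hypothetical).

```shell
# cya.sh -- sketch of a dual-destination rsync backup.
SRC=/tmp/cya-demo/src/          # trailing slash: copy contents, not the dir
LOCAL_DEST=/tmp/cya-demo/usb/   # stand-in for a mounted USB drive
mkdir -p "$SRC" "$LOCAL_DEST"
echo "important data" > "${SRC}notes.txt"
mkdir -p "${SRC}.cache" && echo "junk" > "${SRC}.cache/tmp.dat"

# Paths (wildcards allowed) that should never be backed up.
printf '%s\n' '.cache/' 'Trash/' > /tmp/cya-demo/excludes

# -a preserves ownership, permissions, and timestamps; after the first
# run only changed files are transferred, which is rsync's whole trick.
rsync -a --exclude-from=/tmp/cya-demo/excludes "$SRC" "$LOCAL_DEST"

# The remote leg would look like this (hypothetical host and key),
# pushing the same tree through an encrypted ssh tunnel with key-based
# authentication:
# rsync -a --exclude-from=/tmp/cya-demo/excludes \
#   -e "ssh -i $HOME/.ssh/backup_key" "$SRC" me@backup.example.com:backup/
```

After it runs, the destination holds notes.txt but not the excluded .cache directory.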
This does a backup to my local USB drive and also does a dump to a network machine, through an encrypted tunnel. This device could be anywhere as long as I could access it over the network, and you'll notice that I am accessing it through an Internet address, so it works when I'm on the road as well. Note also that I'm doing key-based authentication in ssh.
The --exclude-from parameter lets me set up a file containing paths (with wild cards) that I do not want to back up. Things like the Trash, cache files, etc.
The first backup is a bear because it has to transfer all of the data. After that it's easier because it only addresses changes. Of course, one problem with this is that it doesn't take into account file deletions. rsync can do that, but I found that it defeated the purpose of the backup if I was trying to recover files that I'd deleted accidentally. So, I set up another script, which I call cya-purge.sh, that handles that sort of clean-up. I run it periodically, when I'm pretty sure that I don't have something I need to restore.
This second script is identical, except for the --delete parameter, which tells rsync to remove files that are no longer on my system.
I agree that my solution is somewhat inelegant, and probably more hands-on than many people would prefer their backup to be. However, at the time that's really what I was looking for and I still enjoy doing it this way. I have a lot of granular control over this and don't have to mess with interfaces or anything like that. It's simple.
Of course, my hairy-man approach to backups is not going to be to most people's taste. For them/you there is duplicity, an elegant front end to working with rsync that handles bundling files into smaller chunks suitable for storing on remote networks. It also manages the backup, keeping files around for a period of time and then allowing them to leave gracefully... something that I would like to get my own scripts to do when I have time to wrap my brain around it. Duplicity is the default backup solution in Ubuntu, so if you have that turned on, you are using it!
My first experience with duplicity was not great. It spent a few hours doing a full backup of my user directory (gigs and gigs of data) and then deleted it when it was done. I never did figure out why. However, when I recently tried it again through the Ubuntu control panel it seemed to work fine. I would need to do some tinkering to see how best to emulate my current system of dual backups to a local and a remote device, but it might be worth the trouble. I was amused to see, when I looked at the settings to refresh my memory, that the automatic backup for today had already occurred, and that I did not notice. That's a good sign!
Of course, there are a number of backup solutions that have evolved over the last nine years or so since I penned -- or should I say keyboarded -- that article. Notable ones are Bacula, fwbackups and Amanda. At some point I may dig into them a little more, but in the meantime you will probably enjoy what you can do with rsync. I should point out that there are ways to use rsync in Windows as well. Take a look at this article if you want to explore that.
I was reading things through my Twitter feed the other day and came across this article by Steven J. Vaughan-Nichols discussing the choice that Google, Canonical and others have made to not use Linux in the name of their products. It's not going to turn your world upside down, and it's fairly trivial on some levels, but it is interesting. I use both Ubuntu and Android. I selected them because they have Linux as their foundation, but more specifically because out of similar choices they just did what I wanted the way I wanted.
There is a good deal of discussion about the fact that there are so many Linux distributions. "There's too much choice!" In reality most of these offerings are distinctive in some way, and merely share their foundational parts of Linux and GNU tooling. So far Android has been very successful. Ubuntu seems to be the Linux-based computing environment that more people recognize. Those are good things.
When I became editor of the Open Source site on developerWorks I was inundated with various databases and data frameworks and other similar pieces. Databases and such are fairly successful in the open-source world because that sort of work is a kind of voodoo to a lot of people. It largely runs behind the scenes, giving up data when I ask for it and hiding data away when I tell it to. It's an easy place to insert open source without upsetting people, because they don't necessarily deal with the moving parts anyway.
As time has passed I've seen a lot more in the NoSQL areas and with cloud, mobile and all the strange and unusual places we try to put software nowadays I can appreciate the need to know about as many alternatives as you can. As long as the data remains open, flexibility on how you interact with it is handy and can help you turn a bad situation into an innovative opportunity.
When I can buy a 2TB drive to sit on my desk for $99 do I really need to worry about drive tuning? I would say that makes it even more important! What a shame to have a big giant drive and then waste a good deal of it because the data isn't partitioned optimally. I'm still interested in learning more about different drive tuning techniques, especially since I run Linux, because I can mix and match some of that a little more than I might in other environments.
As some of you know, I've been playing around with Blender, the free, open-source 3D modelling, animation and compositing software. I'm still just a baby, but I'm slowly learning how to do interesting things with it. Today I wanted to design a logo for a community group I'm building, and I wanted to do something unusual. Tinkering with Blender, I found that one can import SVG files created in Inkscape and then manipulate them to have depth. I took some silhouettes that I was playing with and managed to create the following graphic. (Be sure to click on it and see it full-sized.)
Admittedly, my picture won't win any design awards, but it really shows what can be done by bringing things into a 3D environment and playing with light and such, rather than simply drawing. I'll be doing much more with this. Of course, once it's designed, it's easy to move the camera around to get different perspectives and even shoot some sort of video where you move through the picture.
Blender is just one of the coolest things. I'm making this image available under Creative Commons, using the (cc-by) license. Please feel free to use it as you wish, just please give me credit.
Yesterday I spent some time with a dear friend in San Antonio, Texas who is in his 70s. We couldn't get together in person, because we're a couple of hours apart, but he got a new iPad and I realized that he should be able to use video technology. It went pretty well. We got him going in fairly short order and were able to talk face-to-face. His health has made it difficult for him to get around as much, so this is going to give him the chance to have more people time. That's good. That's what technology should do. It's good technology too, which works from his iPad and my Linux and my Android and someone else's Windows. Technology should break down barriers, not create them.
Here's the weird part, though. Before contacting him, I discovered that Skype, the tool we decided to start with, had an updated Linux version out. This is weird. Skype has been traditionally "fringe friendly" to Linux. We've been back-leveled for years with little or no interest in moving things much forward. Suddenly, Microsoft, one of the Great Satans, makes the update happen. Maybe it was already in the works and they just pulled aside the curtain. Maybe they jumped in and put a team on it. I don't know. I do know that the last thing I expected from Microsoft's purchase of Skype was for them to make it easier for me.
Microsoft takes a lot of punching from me. I used to use Windows exclusively and now I just don't care for it -- though I support people who do, and I don't whine about it. Many of their applications and development methods have been problematic for a more open world, which is frustrating and often counterproductive. I've been hearing that Microsoft is cultivating a new perspective which may be beneficial to an open world. Suddenly they've done something that helps. I have to give them a tip of the hat for that. Updating Skype is a big deal. I don't expect I'll be using Windows any time soon, but I'd like to have the chance to interact with Microsoft technologies in an open way. I'd love to start working with them rather than around them. Maybe things really are changing. Perhaps we've really entered the Age of Aquarius.
In a previous entry I described how I was using ffmpeg to do screen captures for demos. I wanted to share a few new tricks with that. I wanted to make it easier to shoot demos, so I created a special wallpaper when I'm doing one. My screen dimensions are 1920x1200. So, I created a background of that size using GIMP and a graphic I found on the web. I haven't fully vetted this for copyright, so if this belongs to you and you are upset, let me know and I'll redo it.
Within that background I drew a 1280x720 box (the resolution for HD video) and bordered it with a yellow line. Now I know where to put all of my frames when I do the demo. Anything outside of that box will not be recorded.
Finally, I altered my capture command by changing -i :0.0 to -i :0.0+139,152. This tells ffmpeg to offset the capture by 139 pixels on the x axis and 152 pixels on the y axis.
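Putting the pieces together, the full capture command might look something like the sketch below. The offset is the part described above; the rest of the x11grab flags (frame rate, codec, preset) are my own assumptions about a reasonable setup, not necessarily the exact original invocation. The block only prints the command rather than recording, so it's safe to try anywhere.

```shell
# Capture the 1280x720 box drawn at (139,152) on a 1920x1200 desktop.
X_OFF=139    # pixels from the left edge of the screen
Y_OFF=152    # pixels from the top edge

CAPTURE="ffmpeg -f x11grab -video_size 1280x720 -framerate 30 \
  -i :0.0+${X_OFF},${Y_OFF} -c:v libx264 -preset ultrafast demo.mkv"

if [ -n "$DISPLAY" ]; then
  echo "A display is available; to record, run: $CAPTURE"
else
  echo "No display detected; the command would be: $CAPTURE"
fi
```

The `:0.0+x,y` syntax is the key part: display 0, screen 0, grab origin shifted by the given pixel offsets.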
It is highly unlikely that you will be able to use my wallpaper as-is, though you are welcome to try. I'm sharing it freely (provided that it is freely available from the originator!). You will probably need to make your own for your situation. Now, when I'm doing a demo, I call this up. The video gets a black background (which I could easily repaint any time I wish) and I can run other things around my capture without having to edit them out.
I said I was going to check out Mint to satisfy my curiosity about why this might have made such a leap on distrowatch.com. Well, I loaded it up and wanted to give you a quick reaction.
I've run a variety of Linux distributions as my desktop over the last many years. For the last several years I've stuck with Ubuntu. I fired up Mint from a USB drive to see if it might be compelling. My original goal was to use some file system linking magic to try out some of my key applications, like Lotus Notes, while booted into the other environment. Unfortunately, my encrypted drive made it difficult for me to do that. I didn't really feel that I needed to, though. I saw enough just poking around.
In general, I didn't see anything wrong with Mint. It loaded up running GNOME and seemed to talk to my laptop just fine. This is not really surprising, however; it would be bigger news to me if it did not talk to my laptop at this stage of Linux development. Here's a decent video walkthrough of what I saw. It's about 8 minutes, but shows you all of the key points.
Overall, Mint was a decent experience, but not something that I felt I needed to switch to right away. In general I think that many people went to Mint to run away from having Unity as the default desktop. If you haven't had the Unity experience, here's a demo of that as well.
Personally, I didn't care so much for Unity. However, it was not difficult to change it. In fact, I found a nice little video which shows how easy it is to change. (I like this woman. I wonder if I could get her to join us over here on developerWorks?)
Another concern for people was that some of the codecs were not supported "out of the box" on Ubuntu. This was really not a problem either. In the Ubuntu repository they have a series of "restricted extras". You'll find it by searching the software repository. Ubuntu doesn't install these things by default because of licensing issues, but if you install them it's fine. This is an extremely simple step. Perhaps Mint supports more codecs, but it's just not that vital to me. I'm comfortable with simply adding elements as I find that I need them.
So, for now I'm not going to worry too much about Mint. Perhaps when I do an installation for someone in the future, or on another machine, I may give Mint a try. (My wife's machine has been running Kubuntu for a while and might be due for a refreshment.) Feel free to add your own opinions.
The other day I was commenting on an article question and pointed the reader to distrowatch.com, my favorite site for keeping up with the latest Linux distributions. I noticed that the top-listed distribution, and one that continues to grow rapidly, is Mint. I was intrigued. As someone who has used Ubuntu for the last several years, I was curious what might be different about Mint, and why the sudden growth.
Essentially, Mint is based on Ubuntu, which is, in turn, based on Debian. Apparently, their goal is to overcome some of the issues people have with an out-of-the-box Ubuntu installation, such as missing codecs for playing DVDs and other media. It's all compatible with Ubuntu repositories, so it's able to reap the benefits of software that is already set up that way. Why is it suddenly so popular? Some say that it's because of Ubuntu's Unity, and their push to change the desktop experience -- whether you want it changed or not.
(skip this paragraph if you already know all about Linux and desktop choices)
If you are a Windows or Mac user, you probably tend to think of the operating system as a set of icons and tools that you push around with your mouse. This is pretty much the way these things are sold. (Would you like one mouse button or two?) In Linux, this aspect of computing is known as the "desktop" and it is an optional layer. Many don't bother running a desktop on something like a server where it is rarely accessed locally, allowing those memory and compute resources to be used for other things. This decoupling of the desktop means that not only can you choose not to run one, you can choose which one you run. There are many choices, the most popular being KDE and Gnome. You can load one, or another, or have several on your system and switch between them at will depending on your taste.
When I updated my Ubuntu to 11, it replaced my desktop with Unity. It's designed to work more like a smart-phone or tablet environment, and I couldn't stand it. If I wanted to be running a tablet I would use a tablet. I use a laptop or desktop machine for a reason. Fortunately, I found an article which told me "How to replace Ubuntu 11.10's Unity desktop with good ol' GNOME". That got me working again and was a testament to the flexibility that I enjoy about Linux... but it was still a pain. Ubuntu has apparently made this move to help attract people to their distribution, using the same logic that sells Macs and Windows. This is probably sound marketing and good business for bringing in new customers. It has the side effect of angering the existing technical customer base who have grown to enjoy and trust the Ubuntu environment. (I guess I can't speak for the whole base, just me and a few friends.)
So, what do you do when your distribution starts to tear away from the things that made you choose it in the first place? In the Linux and Open Source world, someone creates a fork from that path to the pit of despair that continues to embrace what you like and maybe adds a little more. In this case, that's Mint. I have downloaded it and will do some tests to see if my IBM software will still run. If it does, there is a good chance that I will join the thousands who are moving from Ubuntu to Mint. I've got nothing against Ubuntu. I still support it... they've just started to leave me behind as a customer. I hope they figure that out.
To me, this is a good story. It reinforces that flexibility that makes the open source world so interesting to me. Linux has always been about choice and access to technology. That is constant. I'll let you know what I think of Mint. If you try it first, let me know.
[The ideas stated here are my own and don't necessarily represent IBM's positions, strategies, or opinions.]
Legal trials of using Open Source
Now that open-source software is making it past technical acceptance, we are starting to hit speed bumps from legal. (Is the first class a lawyer takes in college "No! 101"?) I hear time and time again that I can't use a specific open-source project at IBM because there are problems with the license. In some cases these are real issues, as developers specifically license their software to discourage commercial usage (a noble sentiment, I'm sure, but detrimental to widespread use). In most cases, though, I think the project creators unwittingly create obstacles to adoption simply by not knowing the legal ramifications of their licenses and how they might be interpreted by a corporate lawyer.
If Open Source is going to forge ahead, this has got to be tackled. At FOSDEM 2012 (Free and Open Source Software Developers' European Meeting) there was a DevRoom to discuss the legal issues of Open Source led by Richard Fontana from Red Hat. You can see details about who was involved in his blog entry, "The first FOSDEM Legal Issues DevRoom". Recordings may become available through Karen Sandler's Free as in Freedom podcast.
I'm going to SXSW 2012
I just got my pass to SXSW, the awesome music, film and interactive conference held each year in Austin, Texas. I am really psyched about seeing how everything transitions from one area to another and who checks in and out of the experience and who floats through the whole thing. Most of the technical focus will come through the Interactive conference. However, technology is going to bleed into everything throughout the event from a very practical perspective. I'm going to keep my eyes open and share everything I can. I'll have more information on where to look as things get closer. If you're going to SXSW, let me know!
[Remember that even though I work for IBM I am an individual with my
own thoughts and ideas. Anything I write here may not necessarily
represent the views of the IBM Corporation or its partners... though I'm
hoping that's only a matter of time before they catch up.]
It is hard to imagine a world without C. It is such a fundamental part of computing today, the foundation of many things that you use. Every deeply technical developer I've dealt with has some C chops, and many prefer it. Linux, most of the GNU tools and many other software components you use are written in C. It was a groundbreaking departure from the sort of low-level programming demanded by computing in the early 1970s, and it changed everything.
The Tiobe Programming Community Index tracks the popularity of programming languages based on search-engine data. Java has fallen recently and C has risen, bringing them almost neck and neck. Not bad for something that many would consider "old school" programming!
What makes C so relevant? Part of it is its legacy. Many of the foundations of computing, such as operating system components, are written in C, so it's vital if you want to work with things at a low level. But the thinking behind C also makes room for elegant, portable, fast, maintainable code. That's pretty good stuff! (Developers can, of course, squander all those benefits and create chunky, non-portable code; but you wouldn't do that, would you?)
Ironically, many of the things that developers choose over C are actually built on C, including Java, Python, Perl and others. The virtual machines, the compilers and interpreters are generally written in C.
If you haven't looked at C, the original bible for it is The C Programming Language, written by Brian W. Kernighan and Dennis M. Ritchie. This is still a popular resource as the foundations have not moved a whole lot. There is also the Eclipse CDT project, which lets you do C development right in Eclipse. However, a quick look at Linux and other open source development repositories will show you a cornucopia of options for compilers, integrated development environments and more.
Rest in peace Dennis. You made a big difference in the computing world. Arguably, Steve Jobs stood on your shoulders.
A while back I turned my router over to dd-wrt. So far I
really have nothing to complain about. It just works. I had
a power outage a while back and it came back with no surprises.
We do multiple laptops and Netflix streaming all together and it seems
to be fine.
It's gotten me to thinking about other sorts of dedicated Linux
possibilities. Now, I'm not looking to do some sort of embedded
project. That's cool, but just not where I tend to spend my
time. However, I do have some little projects that would be nice
for a dedicated system and I just don't have a spare computer.
Fortunately there are a few ways for me to deal with this. Let me tell
you what I'm doing.
Virtualization with kvm
I run Linux full time on all my computers. It does what I need
and I seem to find more things that it can do every day. I'm very
satisfied with a Linux environment. However, in the corporate
world, I still find people who have a half-finished glass of Microsoft
Kool-Aid in front of them and create situations where I must have
access to a Windows system. I'm also finding that as Windows
moves further and further away from what it was when I used it
regularly that I can't help people in my head any more. If
someone wants help with something and they're on Windows, I have to get
a version in front of me so I can find the right ways to configure
things. (I suppose I could just refuse to help, but I've always
wanted people to be able to use technology, especially if they are
motivated and just need some advice.)
So, I don't have a machine that I'm willing to set aside running
Windows, waiting for me to need it. I'm also not willing to do a
dual boot, because it means that I have to stop doing the productive
things that I'm doing to go into this other environment while I fulfill
this other requirement. The answer is obviously virtualization.
I've been a VMWare user for some time. I used it for many
different situations. For example, I had a lab where we were
using Symantec's Ghost as part of our solution to image systems.
I had a Linux server set up with a file share which worked fine for
copying images. However, the multicast imaging only worked on
Windows. (*sigh*) I ended up setting up a thin Windows
virtual environment. It talked to the file system with Samba and
allowed me to do the multicasting. It seemed a long way around,
but it prevented me from dedicating an entire new piece of hardware to
run a single program, or from having to find Windows equivalents to all
the other things that this server was doing simply because of one program.
However, being an open-source kinda guy, I've wanted to have a freer approach to virtualization. I knew about qemu and kvm, but had only fired
them up a little. I already had working virtual machines and just
hadn't devoted the time to figure out how to recreate things in a new
environment. Then, poking around in some forums, I found a simple set
of instructions to convert a VMWare image to a kvm image:
qemu-img convert "Windows XP Professional.vmdk" -O qcow2 Win_XP_Pro.img
That was it. I ran that command, waited a while for everything to
copy and I had an image I could fire up with kvm or qemu. (Essentially
they are the same; kvm adds kernel elements such as hardware virtualization support.)
I found that Ubuntu had a nifty little front end called aqemu (see the screenshot below).
It seems to do what I was used to through the VMWare workstation
interface. There are several things that I'm not used to yet. I still
need to learn how to tweak devices, boot from CDs and create
snapshots. I know it can be done, I just need to figure out the
methodology. However, simply running the converted image has worked
just fine. It's lean and mean. Next I'm going to see how it works on
a server. I currently run a machine with similar requirements to the
lab machine I described above. (This one services a church's
infrastructure and I loaded VMWare Server to run their antivirus
management software without having to dedicate a whole new machine.)
I'm hoping to replace it with a simpler kvm approach.
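For the curious, firing up a converted image is about a one-liner. This is a sketch only: the kvm wrapper command and the flags here are assumptions that vary by distribution (newer systems spell it qemu-system-x86_64 -enable-kvm), and you'll want to tune memory and devices to taste.

```shell
# Sketch: boot the converted qcow2 image under kvm.
# 1024 MB of RAM and user-mode networking are assumptions; adjust as needed.
kvm -m 1024 -hda Win_XP_Pro.img -net nic -net user
```

That's the whole appeal for me: the disk image is just a file, and the "machine" is one command away.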
I also want to tell you about my little flash-drive solutions... but
I don't have time to finish this right now. I found out that through
Amazon I could get 16GB flash drives for less than $20, so I've ordered a few. They
should be in tomorrow, so I can tell you more about what I've actually
done, rather than the idea. So, this is to be continued...