If you don't have time to look through it, here's the quick list with some of my own thoughts:
A slow cooker – As we demand more smarts from our devices, they have to be able to run some kind of operating system. Linux is a great fit here. Its freedom allows a lot of flexibility to experiment, and it can also be extremely tiny and embedded.
Research robot – This strange humanoid robot is designed for research. Their site features a video of it going through its paces, including a bit of soccer action.
Underwater tsunami sensors – Any device that needs to work with a degree of autonomy and be able to interface with other things can benefit from Linux. It tends to just run and run and run and run.
Computer engineer Barbie – OK... this one creates a lot of confusion for me. I'm not thrilled with the image that Barbie presents to girls, yet so many of them seem to connect with it, and this one is clearly doing something more intellectual... though to me she looks more like she's in marketing and... Aaaaaaggggghhh! However, she does have Tux sitting on her shelf and there is some binary code on one of her screens. It hurts my brain (and my eyes a little) but it's still kinda cool.
Airplane entertainment systems – I've seen this one in action. The plane I was on was having some problems with the system, but it was bravely trying to restart and reconnect. This is especially amusing for me when I think about the people who tell me how unsuitable Linux is for multimedia. As Sheldon would say: "That's poppycock!"
Bark-activated dog door – Again, something interactive that has to work reliably for a long time.
Animated menus – I admit that these big, changing menu monitors drive me crazy because what I want is always on the wrong screen while I'm trying to decide. However, if I were going to do one, Linux is the obvious choice.
Gas station ads – Not really my favorite thing... but ditto on why I'd use Linux here.
North Korea - I don't guess this is exactly an endorsement one would seek out here in capitalist-land, but... oh, well!
The International Space Station - Are you seeing a trend here? Places where you can't call someone to come reload the operating system seem to find some value in Linux.
Old stuff – One of the pastimes of Linux enthusiasts is finding places it will run. This obviously includes bleeding-edge hardware, but also some legacy hardware... like computers some kids think Ben Franklin used. (Ben Franklin didn't actually use a computer.) The cool thing is that when someone gets it to work they're usually pretty open about sharing, so you can dig that old thing out of the closet and breathe a little new geeky life into it. Here are a few projects that might amuse you: m68k, Wii-Linux, PS2
Calculators, watches and other common devices – Tiny... reliable...
The Lego supercomputer – The availability of open computing platforms like Raspberry Pi has created new opportunities to combine technology with quirky artistic sensibilities.
Titanium surfboard robots – durability... reliability... Am I repeating myself?
Quirky ideas that don't catch on, but someone had to try anyway – I love to see unfettered creativity and innovation. Sometimes these things pan out and become the next aglet. Sometimes they just go on to history's "Well... that was interesting" pile. Since the primary requirement of experimenting with Linux is mere desire it's a great platform for crazy ideas and mad science.
Cow milking system – Durability... reliability... and apparently sensitivity and gentleness too.
I love all of these weird uses of Linux. I know this merely scratches the surface. Share anything else you've seen.
I haven't written here for a while. A lot of changes have been going on behind the scenes for me. Around the time that I wrote my last entry I was brought into a new team within ISV Developer Relations at IBM. As a result, I am no longer editor for the Linux and Open Source content on developerWorks. This is bittersweet for me, as I really enjoyed helping authors to shape their information to share with people who want to learn. In my little intro on this blog I mention that I am really passionate about Linux and Open Source. This is not hyperbole. I've run a Linux desktop for more than ten years. We have no Microsoft Windows or Mac OS in my house. Personally, I don't find this a hardship. I do pretty much what I want and need on my computers and don't seem to have the struggles that others do. (I'm hearing a story from one friend who has been through ten different Microsoft tech support people over a period of days, still trying to get his Windows 8 activated. Bleh!)
I've always found that it wasn't that Linux and open source couldn't do things that people needed, but that they just weren't aware of what was available or how they had become tool-bound. Perhaps a Linux environment tends to provide the most benefit to a technical person with a sense of curiosity. In any case, it was a joy to help provide a variety of information to help people try things and broaden their horizons. I got to work with some really talented authors and make a difference to hundreds of thousands of people. Wow!
My new role is going to take me more behind the scenes. I'll still be enabling people to get their messages out, but in a greater variety of ways. It's all fairly new, so I don't really have stories to share. I have no doubt that I'll get food for blogging in my new role.
I will continue to use this space to talk about technology and social issues that I think make a difference. I'll continue to share things that I do with Linux and open source so that others can explore them as well. As my activities become more varied I may have even more to share!
Along those lines, I recently had to put together a booklet for a non-IBM project. I had done the writing in a word processor, but when it came time to generate the final format I decided that I really wanted to work with a publishing tool rather than a word processor. In a previous role, ages back, I did a lot of work with Adobe PageMaker. I remember renting time on Macs at Kinko's to create signs, fliers and such. Once I designed a set of sample pages for an elementary level math book using PageMaker. The pages were being shown in an editorial meeting for a major textbook publisher and it was the first time they had gotten a printed mock-up as opposed to a literal paste-up. (That sounds ancient, doesn't it?)
It's certainly possible to do a booklet in a word processor, and LibreOffice works fine for that. (Have you downloaded v4 yet?) However, a desktop publishing tool is designed for a greater degree of precision in layout. If you can get your head wrapped around the different way of thinking, you really get a more finished product. The most popular open source publisher project I am aware of is Scribus, and the last time I used it was to do an 8-page, full-color mini-magazine for a convention. I know that many will consider Scribus to be lacking as compared to a PageMaker or a Quark, but the $850 price tag on Quark makes it a little out of my reach, and it's not available for Linux. Scribus is free and available for Windows, Mac OS and Linux. I essentially had to wish for it and it was installed on my system. Here's a sample from their SourceForge gallery.
Of course, working with page layout is different from other kinds of content editing. Publishing programs are much more concerned with controlling how things look than helping you to write. It's literally like having a lot of little pieces of paper that you are going to paste onto a blank page. You assign what is on each little piece of paper, a graphic, some text, etc. Some of this is handled by "frames" in a word processor, but in layout software it just seems to be a little easier somehow... at least to me... depending on the project.
Because of the "pagey" nature of this application, it was easier for me to set up something to be how I wanted it and not have things move around on me when I changed something. I had a few challenges with setting up a dynamic Table of Contents, something that is almost magical in LibreOffice. I will probably review this tutorial by Bruce Byfield before my next document.
If you want to play with layout software, Scribus isn't a bad place to start. It will acquaint you with the basic concepts of layout, primarily separating your design concepts from your content. For a big project it might still be worth using a commercial product, but for a little newsletter, booklet or other projects where you are assimilating various elements into a single document, Scribus might do the trick.
Now, when I'm designing a single page, like a sign, postcard or flier, from scratch, I will tend to use Inkscape. I've even done large, full-color banners with this and enjoy it very much.
More adventures later. I'm tinkering with Blender, the free, open 3D modeling/animation/compositing software. If you think moving from word processing to page layout is mind blowing you just wait 'til you see what happens in this paradigm! I used Blender to create the header graphic for this blog. Everything is composed of 3D elements which were lit and moved around, then "photographed". It's pretty interesting stuff.
For everyone who contributed to the Linux and open source areas on developerWorks for me, you have my sincerest gratitude. It was an honor to work with all of you. For the readers who gave feedback, sometimes in the most forceful ways, you have my gratitude as well. It's hard to function in a vacuum and your input constantly shaped my approach to what we covered and how. I look forward to my new adventure.
OK. I'm a little more caught up after the holiday. I'm pleased to announce that the Learning PHP series on developerWorks has gotten some updates. (Part 1 was done earlier; updates to Parts 2 and 3 just got published.) If you want to dive in and start making use of this popular language this is a great way to do it. I'm actually going to roll up my digital sleeves and walk through the tutorials myself as a coder rather than as an editor. I need to tinker with PHP a lot more and this will get me going quickly.
There's lots going on with the changing of the year, however. Here are a few things of interest.
Linux: 2012 was a very good year
I came across this article from PC World, a publication that is not necessarily known for pushing Linux. There are a lot of interesting points in there. Linux is making money. People are getting stuff done with GNU/Linux. Gaming companies are starting to turn their eyes to the platform. For myself, 2012 was the year when I had less confusion than at any other time in my life when I told people I ran Linux. Nearly everyone I spoke with about it had heard of Ubuntu and many said they were considering loading it up themselves if they hadn't already. I know that for many things Linux still makes up a small percentage bump on the user map, but it is going gangbusters in the background and it just isn't going away!
Of course, it isn't always pretty in Linux land. I don't know if it speaks to passion or just poor socialization, but a recent blog post in the Real World Linux community discusses the heated exchange between Linus Torvalds and one of the kernel maintainers when a patch broke something in userspace. I've heard a lot of discussion about whether or not professionals have exchanges like that. They may not where you have worked, but I've witnessed some pretty strongly worded confrontations in my career. It's probably not something recommended in the people management handbook, but it does happen. Hopefully everyone will make nice and move on. Of course, ten years ago that exchange wouldn't have been any kind of news whatsoever.
One of the interesting side effects of tools like Twitter is that it gives you a view of the information pie that is hard to get any other way. So much information comes from so many people with tags identifying groups and trends. Twitter has provided a 2012 retrospective page showing what was hot in a variety of topics. My stuff hasn't bubbled up to the top so far, but it's interesting to see who has. It's also interesting – and sometimes appalling – to see what people have found important enough to share and discuss. This kind of data is going to become more robust as we go.
Of course, the new year is not all about looking back. It's also about looking forward. This slide show from InfoWorld has their picks for what is "highly anticipated". I'll admit that some of them don't especially grab me. (Let's see if you can figure out which ones!)
I'm intrigued to see the future direction of Android, though the way providers tend to implement it I won't be able to enjoy it with my existing device. I really wish I could handle Android like I do installing Linux on devices!
The pending evolution of wireless protocols is also intriguing. Wouldn't it be amusing if we eventually fix the high-speed everywhere questions with wireless rather than running cables?
The flexible displays are also pretty interesting and could show up in some surprising areas. Imagine touch interaction that doesn't need to be flat anymore.
I'm also curious to see what happens with the new innovations in energy usage that are evolving. I'm sure there will be more on that later.
Happy New Year, all. I hope you got some kind of a break and are ready to start making a difference with all that you do.
I'm going through my post-holiday email with a shovel right now. I have some more "New Year" kinds of things to discuss, but can't quite get to it today. Here are a few tidbits of fun for you in the mean time.
I was watching a British program called QI (Season 10, Episode 1. I would post a link but some of the conversation was a little naughty for the workplace. If you find it on your own that's your problem.)
That's so OMG!
When we think of the abbreviation OMG (short for "Oh, my God") we usually think of bubbly girls texting away. It seems, however, that its first recorded use was in a 1916 letter from Baron John Arbuthnot Fisher to Winston Churchill. Lord Fisher wrote I hear that a new order of Knighthood is on the tapis—O.M.G. (Oh! My God!)—Shower it on the Admiralty! He even clarified the abbreviation... though that would seem to defeat the purpose. I suppose I'll never really be Noble. You can find the quote in his memoirs, published in 1920. (http://cmwosdu.de/W3h8Pz )
I unfriend thee!
Many people have been cleaning up their friend lists on Facebook and it seems that "unfriending" someone is a new and potentially volatile activity. This one is even older! The first recorded usage is in a 1659 letter by Thomas Fuller in which he writes I hope, Sir, that we are not mutually unfriended by this difference which hath happened betwixt us. You can see the original letter in his biography: http://cmwosdu.de/XjQPM7
I must use this the next time I have to let a connection go because they get too crazy!
Get your systray back in Ubuntu!
Finally, here's a little tip I found. I was installing blueproximity, a tool to let me use my mobile phone as a lock token for my Ubuntu desktop. Essentially, when the phone leaves the area of the desktop, it automatically locks and automatically unlocks when it returns. That's kind of handy! However, I was having some difficulty with it, some of which was an issue because the systray icon did not come up in Unity. (Yes, I've been using Unity on my desktop. I hate to say it, but I actually am getting used to it.)
I've missed other icons there, but this was the one that finally annoyed me into action. It turns out the default behavior of Unity is to limit items on the systray to those that have been whitelisted. A simple command turned them all back on, including ones that I didn't realize were missing. Unity with a fully active systray is much nicer. Here's the command to turn them all on:
gsettings set com.canonical.Unity.Panel systray-whitelist "['all']"
I used that command, logged out and back in, and, voila! All my icons are back and functional. Of course, you can whitelist just a few of them. More details can be found at the original discussion in the Ubuntu forums where I found my answer: http://cmwosdu.de/S5y1hK
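If you'd rather be conservative about it, you can peek at the current whitelist before overwriting it, so you can always put the old value back with gsettings set. A minimal sketch, assuming a Unity desktop where gsettings is available:

```shell
# Read the current systray whitelist before changing anything.
# On a system without gsettings (or without the Unity schema),
# we just report that instead of failing.
if command -v gsettings >/dev/null 2>&1; then
    CURRENT=$(gsettings get com.canonical.Unity.Panel systray-whitelist 2>/dev/null \
              || echo "Unity schema not found")
else
    CURRENT="gsettings not available on this system"
fi
echo "Current systray whitelist: $CURRENT"
```

Save that output somewhere before you flip the setting to "['all']" and you have a painless undo path.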
Happy New Year! More retrospective, futurespective kind of writing tomorrow or Friday when I can breathe!
From: Scam Mail
DO YOU NEED A LOAN, IF YES REPLY FOR MORE INFO.
At least they are honest about scamming you.
IBM Linux on POWER
I think Linux and the POWER architecture are an outstanding combination. I would really like to see a new POWER laptop to load Linux on. (Apple used to be a good source for that until they decided to go Intel.) POWER servers are competitive in price with el-cheapo models, especially when you factor in consolidating services into a single box. I came across this video which talks about Linux on POWER.
Steve Southworth is funny
Steve Southworth posted this picture with the caption: "The original camera phone."
New VoltDB article in developerWorks/opensource
How many databases do we need exactly? When are we going to have enough? If you have a shop where you are able to support a single database, say DB2, then that's great! However, it's likely that you need to have flexibility in your database, either because you can't always get what you need from the administrators, or you are dealing with customers who have varying situations, or some other unexpected situation that you can't predict. It's always good to have more tools in your toolbox. Take a look at VoltDB and play with it. Here's what author Simon Buckle had to say about his article:
"Having problems scaling your relational database? Thinking about switching to a NoSQL datastore in order to scale but you need your data to be consistent at all times? These are the kind of tradeoffs that you would normally have to consider when deciding whether to go down one route or the other but it is possible to have both scalability and data consistency. Introducing VoltDB.
"VoltDB is an in-memory database written in Java with the scalability of NoSQL data stores, but without the consistency problem; it is ACID compliant. VoltDB is queried using SQL, so it will be familiar to those who have worked with relational databases before. This article will cover the basics of how to install VoltDB and how to use it, so you can integrate it into your application(s). Finally, it will also discuss VoltCache, a scalable key-value store with a memcached compatible API built using VoltDB."
The VoltDB developers tout their project as revolutionizing your application design methodology to get things out fast! Check it out, play with it and you'll be the smartest one in the room when someone brings it up!
Open Source alternatives
Ages back, I wrote a popular blog entry called "Start your learning with Open Source." It must have struck some sort of a chord because it's gotten more than 15K hits since 2009. One of the most common conversations I have with people about open-source software is about substitutes for the software they are currently using. I reference a few sources in that blog entry, but there is also an evolving Wiki in the Real World Open Source community with a list of open source software. You'll find some handy suggestions in there. You can also add your own. It's just getting started, so we've barely scratched the surface. Take a look and contribute.
Don't forget that there is a discussion board there as well where you can bring up your questions and problems. Have another place where you do those discussions? Tell me where and I'll add them to the bookmarks or feeds in the community.
I caught an article today: "Linux users targeted by mystery drive-by rootkit". I stand by my belief that Linux is the most secure environment that I have used and I enjoy the freedom from many of the security issues that friends experience. However, it would be ridiculous to imagine that Linux – or any environment – is immune from attack.
You probably already know the security basics. Don't hang out at weird websites and let anything and everything run. Use things like Adblocker in Firefox to help cut out little scripts and things that you don't want to run. When things pop up saying would you like to install or run something that you don't understand, don't click "Yes". How do you know you're OK, though?
First, pay attention! You can tell when your system is not behaving normally. When the network seems clogged or processes start getting chunky that could be a sign that things are running on your system that you don't know about. Don't ignore that. Do something about it.
The first step is to look at the processes that are running. In Linux a basic ps aux will give you information about what is running. If you tinker a lot, like I do, you may get all kinds of things turned on when you installed them to play with them. The other day I noticed that I had a web server and two database servers that had been left active after playing around with them. Often when you install that kind of software it will set itself to automatically start. These are the kinds of things that can create danger for you if you don't realize they are running.
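To make that raw listing more digestible, you can sort it yourself. Here's a quick sketch that shows the top memory consumers (column 4 of ps aux output is %MEM):

```shell
# Print the header row, then the five processes using the most memory.
ps aux | head -n 1
ps aux | tail -n +2 | sort -nrk4 | head -n 5
```

Swap the sort column to 3 (%CPU) if you're hunting a processor hog instead. Anything near the top that you don't recognize is worth a closer look.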
For seeing how the system resources are running, top is a good quick check. It is a console-based system monitor that will show you what is using resources on your machine. Here's a sample:
Based on my snapshot, my audio and Firefox are the biggest pigs. I also note that I have mongod running, which is a database engine that I thought I had disabled. It may be that something is using it, or I may not have shut it off correctly. I need to look into that. As a "basic user" I probably don't need to know about all of these processes... but as a "technical user" I really should understand them, at least well enough to know that they are normal.
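By the way, top also has a batch mode, which is handy when you want a snapshot like mine to paste into a note or log from a cron job rather than watching the interactive display:

```shell
# -b runs top non-interactively; -n 1 takes a single snapshot.
# head trims it to the summary plus the first few processes.
top -b -n 1 | head -n 12
```

Run it a few times when the machine feels healthy and you'll have a baseline to compare against when things start getting chunky.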
Dealing with root kits and other nasties
Keeping an eye on all of your running processes is probably not what you want to do. It's good to know that you can spot check when things are misbehaving, but you want to be proactive and stop things before they start. Here are a few things that might help.
Clam Antivirus – Clam was the first anti-virus software that I discovered for Linux. It runs on other platforms as well and seems to be pretty good stuff. Clam does what any antivirus system does. It scans files and compares them with signatures of known viruses. Of course, the value is only as good as the definitions. Clam definitions seem to be updated pretty regularly and it's easy to automate the process. At the very least you should have something like this available. Admittedly, the only virus files that I've ever found with this have been dormant Windows viruses that someone sent me in emails... but it's good to know that.
chkrootkit – This is a common tool available through Linux distributions. It looks for a number of common exploits and reports issues.
rkhunter – Another popular root kit detector that is available through the Ubuntu repositories. This tool works best if you install it onto a "clean" system, i.e. one that you know is uninfected. Ideally you would set it up immediately after installing the operating system and let it initialize. rkhunter looks for unexpected changes to system files and alerts you to possible mischief.
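To give a feel for how the two detectors fit together, here's a hypothetical wrapper script that runs whichever of them is installed and reports the rest. The option names come from the tools' standard usage (rkhunter's --check and --skip-keypress; chkrootkit takes no arguments for a full scan), and both need root to inspect everything:

```shell
# Run each root kit detector that is actually present; note the missing ones.
MISSING=""
for tool in chkrootkit rkhunter; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "=== running $tool ==="
        case "$tool" in
            chkrootkit) sudo chkrootkit ;;
            rkhunter)   sudo rkhunter --check --skip-keypress ;;
        esac
    else
        MISSING="$MISSING $tool"
        echo "$tool is not installed; skipping"
    fi
done
```

Dropped into cron, something like this gives you a regular sweep instead of relying on your memory to run the checks by hand.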
Of course, if you are serious about digging into root kit detection, you will want to look deeper than just running a tool. Here is an excellent article on Symantec's web site: "Detecting Rootkits And Kernel-level Compromises In Linux" which goes into quite a bit of detail about the technical side of this sort of forensics.
Cyber attacks seem to be the way of the future. No one is immune, but you can make yourself less of a target. Some say that eternal vigilance is the price of freedom, and this probably goes for software too.
I got a message when I tried to run a browser-based application that was truly out of Dilbert:
XXXXXXXXXX is temporarily unavailable at this time for any of the following reasons:
Status and additional information are posted on the XXXXXXXXXX System Status page. We apologize for the inconvenience and will bring the application online as soon as possible. Please try again later.
The status page did tell me what was going on, but the first read was a little silly.
Update to Ubuntu 12.10
The other day I did my update to Ubuntu 12.10 on my laptop. The update went smoothly, though it took a while. The one wish that I had was for a way to have it automatically use the recommended response when dealing with config files during the update. The way it works now I have to hit a button from time to time. I'm sure there is a way to do this, but I haven't researched it. Maybe someone out there can point me in the right direction.
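Since I asked: the closest thing I know of is telling dpkg up front how to answer those config-file questions. This is a sketch for ordinary apt-get upgrades rather than the graphical release upgrader itself, shown here as a dry run so nothing actually changes:

```shell
# --force-confold keeps your locally edited config files;
# --force-confdef takes the package default when you haven't changed one.
# The -s flag simulates the upgrade so you can preview the result safely.
if command -v apt-get >/dev/null 2>&1; then
    if DEBIAN_FRONTEND=noninteractive apt-get -y -s upgrade \
        -o Dpkg::Options::="--force-confdef" \
        -o Dpkg::Options::="--force-confold"; then
        STATUS=ok
    else
        STATUS=failed
    fi
else
    echo "apt-get not found; this tip is Debian/Ubuntu specific"
    STATUS=skipped
fi
echo "dry run result: $STATUS"
```

For the real thing, drop the -s and run it under sudo. The same Dpkg options can go in /etc/apt/apt.conf.d/ if you want the behavior permanently.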
Overall things seem OK. I'm getting some mysterious system component crashes that seem par for the course with an update on this laptop (Lenovo w500). Whatever is crashing doesn't seem to be affecting my normal activity, so it's not troublesome. I expect the next serious round of updates to magically make all of those things go away. I feel that a few things are a little more spry (especially in the Unity desktop) but I have no measurable benchmark.
I have to say that I really like updating Linux. In Windows and other systems, where a major upgrade is actually the purchase of a new product, it always seemed a pain. (I'm seeing all sorts of unrest about Windows 8 and am thankful that I don't have to play there.) In Linux I get a little note saying that there's a major distribution update and I hit the button. It's been very pleasant.
Of course, I have a server at a church that suffered some neglect for a while that needs to be updated by hand because it fell too far behind. That is inconvenient, but workable. If you keep things up to date it generally all goes pretty well.
PDFs on the fly
I use PDFs all the time. I think they are a terrific way to share documents. They save trees but provide a controlled look and feel, and their openness makes them easy for anyone to read regardless of tools or operating systems. I trust PDF as an archival format much more than I trust any of the word processing formats out there, even OpenDocument, I'm sorry to say.
I started working with PDFs a lot when I started using OpenOffice.org, LibreOffice and the like. It was difficult to convince others that they needed to download software, even if it was free, to read my documents. Then I realized that the vast majority of the time that I don't really need someone to edit the document, just view it. All my open document tools had a PDF button built in so, voila! Easy sharing with no complaints.
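That PDF button has a command-line equivalent, which is nice when you want to convert a whole folder of documents at once. A sketch, assuming LibreOffice's soffice binary is on your PATH (mydocument.odt is just a stand-in for your own file):

```shell
# Convert a document to PDF without ever opening the GUI.
if command -v soffice >/dev/null 2>&1; then
    HAVE_SOFFICE=yes
    soffice --headless --convert-to pdf --outdir ./pdf mydocument.odt \
        || echo "conversion failed (is mydocument.odt actually here?)"
else
    HAVE_SOFFICE=no
    echo "soffice not found; install LibreOffice to use this"
fi
```

Wrap that in a loop over *.odt and you have a one-line batch converter.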
Generating PDFs has become more common and there's no reason why you can't include that functionality in your own programs. The developerWorks article "Generate PDF files from Java applications dynamically" has just been updated by the author to include the latest techniques. Take a look and see how you might be able to harness this power for yourself.
Today I was pointed to the article "How would you fix the Linux desktop?" through slashdot. (Yes, another one of those articles.) I am quite comfortable using a Linux desktop and have been for nearly a decade, so it's not very mysterious to me. My family also uses Linux as a desktop with no real complaints. However, this seems to remain a controversy. It reminds me a little about my daughter talking about her school lunch.
My daughter just turned ten. The other day she was talking about all of the terrible things they are doing at the school cafeteria. They've removed some of the dishes she liked and put, in her opinion, poor alternatives in their place. I should say that my daughter is not a pizza and hot dogs sort of diner. She likes sushi and different kinds of vegetables when they are well prepared. Her description of what was going on did sound a little poor, but it's an institution's approach to being told to provide more "healthy choices" while also adhering to a giant list of restrictions, primarily budgetary. I would probably eat it, but not look forward to it. I suggested that my daughter could always take her lunch and we could keep them interesting. I don't think she even heard me.
We have a lot of choices that we would rather not act on. "I hate my job," says someone... but doesn't really want to leave and find another one. "I hate the environment in my city," says another... but won't move someplace where they say they'd be happier. We complain, but we don't act, because we are not so unsatisfied that we think it's worth the effort to make a change. This truth means that most complaints fall on deaf ears because providers know that we likely won't do anything. If Walmart knew that "I'm never shopping here again" didn't have a silent "unless I find that I'm desperate for something and everyplace else is closed, or I happen to be somewhere and Walmart is the only place I recognize, or I know I need something cheap" then they would probably be a lot more attentive.
So, in desktop land, though people might be disappointed with their Windows or MacOS experience, they likely won't really try to make a move. Once the disappointment is voiced it has been served and one can simply get on with things.
Some say that the problem is not enough applications and that there are barriers to writing applications that work across Linux desktops. I don't know how true that is. I regularly play with different desktops on my Linux installation (you can change it every time you log in if desired). All of the programs I run work fine across the desktops... though the experience changes slightly as the desktop features rearrange. It seems that it is largely a matter of the application letting go of the things that the desktop does rather than trying to emulate them. Maybe I'm missing something.
There are really only about a dozen things that most people do with a computer. Applications exist for those. Developers of popular software could provide a Linux version as easily as they provide a version for Windows and Mac OS. Arguably, if they started to use some of the existing open development techniques that are used for Linux applications they could more easily write things that run on all of the operating systems with a single code base. There are several examples of this in existing open-source software.
People don't use the Linux desktop because they just don't care for the most part. They use whatever they're given. If IT turned around and gave them a Linux desktop and management said it was the new policy, people would use it. Oh, they would complain, just like people do about the store they go back to again and again, but they wouldn't quit their job over it. As long as someone has to make an effort to be different, it will only be those who already do that sort of thing in their lives who take on Linux, and discover the benefits it gives them. Everyone who prefers to "go with the flow" can discover what flows downhill.
This is just a quick one. I've started a week of System Z training, to better understand this technology. I think System Z is a bigger deal than people imagine, and there is a lot in our future that could benefit greatly from more people being aware of and taking advantage of this powerful computing technology.
On the one hand, I was aware that this is not for the squeamish. It's true that System Z can create a powerful computing environment that allows many people to simply do what they do without having to worry about everything that runs under the hood. However, to manage such a universe takes some willingness to get your hands dirty. It reminded me of some of my earliest days of computing, where one had to be so close to the moving parts to connect to networks and do anything besides simply run one application at a time.
Yes, there is a lot to successfully harnessing the power of a System Z environment, but it's not really beyond anyone who has a basic grounding in technology. Like anything worthwhile, it takes some focus, and some work... and practice... but the rewards can be so great. Personally, I think open, highly mobile devices on the front end with plenty of Big Iron type of power on the back end is the shortest distance between here and Star Trek. There is still plenty of room for openness in such an environment... though I'm wrestling a little with my classmates on that one.
Today was pretty brain-filling though. Off to enjoy some amazing chili at Tolbert's with my parents. Then more brain stuffing tomorrow.
Fear not... more on blogging coming soon... just not enough brain for it right now.
I recently had a call with some people who are interested in contributing to the Real World Open Source and Real World Linux communities here on developerWorks! Yay! I would like to see a lot more input from people in these places. As part of that conversation they asked me to outline my recommendations for people new to writing in this environment. I decided that this might be of interest to the general public, so I'm posting it here rather than replying privately through email.
Writing in developerWorks is not like having your own Wordpress or
other blog. You can do a good deal of customization to make it fit your
own preferences, but you will need to fit into the overall
developerWorks framework. This framework may change around you, so the
general rule of thumb is "Keep It Super Simple". Your content is what
is most important here, not any bells and whistles that you might add,
so write things that do well with plain, clean HTML. I prefer to do my
writing in an HTML editor, actually. I tend to use Kompozer, an
open-source editor. Unfortunately, development on this project seems to
have stalled out, but it's still my favorite editor. It produces clean
HTML with no muss or fuss and allows me to easily put something
together which I can just paste into place. You can use any editor that
you choose which can save HTML. However, bear these things in mind:
Don't use a lot of styles and parameters on the HTML. Sometimes
you need to, but keep it to a minimum. This will make your article
behave better when it's published.
Be cautious about what comes out of a word processor. When I write something in LibreOffice and then paste it into the blog, a lot of hidden style information ends up in the HTML. This is ugly and bulky and will do weird things to your entries. Be prepared to clean up anything that you do.
Including images is good, but you will be working with a simpler
subset of formatting options. You will also need to upload your image
to the developerWorks server or reference the link externally.
Posting audio and video is good, but remember that this can
sometimes be unpredictable. For example, when posting a YouTube video,
you must use the old-school <object>
code rather than a simpler <iframe>.
Someday this may change, but for now it is necessary. Some embeds
simply will not work.
The included HTML editor is decent, but a little thin for me. I
have two browser plugins that I use to help me write entries that I
have not pre-written in Kompozer.
Write Area - This plugin will give you a
fuller editor that you can invoke in any text area with a right-click.
It provides more formatting options for links and images. Unfortunately it does not include a spell checker, so be sure to double-check your work. I use this a lot! (I'm using it now.) It's been a real help for getting around any site that has a limited window in which to write. It's free for Firefox. I'm sure that people with other browser preferences will find similar add-ons. I'm just telling you what I use.
Scribefire - This plugin provides more than
just an editor. It is a blog management system, allowing you to work
with various blogs on different sites. It will give me a list of the
blogs that I use and let me edit or create a new entry for any of them.
This can be handy, but it sometimes does some strange things with more
advanced formatting. (Remember, I said to keep it simple?) Another feature is that it can publish the same entry to multiple blogs at once. I did run into one issue, which I mentioned in a previous post: do not use the '#' symbol in your article titles with Scribefire. It caused Scribefire to get lost when trying to aggregate my existing entries, which was very frustrating until I tracked down the problem.
There are other blogging tools which are compatible with
developerWorks, but these are the ones that I generally use.
Any pictures that you want to use need to either live on the system
or be linked with the URL. For some content, especially copyrighted
content, I just link to it. That saves some of the usage hassles and
acts as an automatic credit to the owner. For example, Dilbert cartoons
are a great thing to include from time to time and they have a simple
method for linking to their content.
If you're going to do things like this, you should expect to have
to tweak the HTML from time to time. Sometimes developerWorks seems to
alter things that are not entered through the raw HTML view. (That's
the <h> button in your toolbar if you are using the default blog
editor.) HTML is nothing to be afraid of, and many of you are technical
people anyway, so you should feel comfortable with it.
For some pictures, though, it's best to upload them. If you are
using the default editor, uploading is automatic. You click the icon to
insert a picture and it gives you a chance to upload your picture. I
will often use this step just to get the picture up and then go into
Write Area to manipulate it and make it look nice with the article.
Add image tool.
You can also upload an image file directly. Select the Settings
link, next to New Entry. On the Create & Edit tab you'll
find File Uploads. You can manage everything here. Note that
this interface acts much like old-school FTP, so you can't overwrite an existing file. If you need to change something, delete the old file and then upload the new one. That leaves a brief window where the file doesn't exist, but it's pretty quick.
Copy the link for an existing file and you can use a conventional <img>
tag to include it.
Bookmarking major links
I quickly got annoyed by some of the steps to getting to areas like
file management. They are easy to find, but require a number of
clicks to get there. This was easily remedied with a few bookmarks. I have bookmarks set up for my main blog and the entries
page. These reside in a folder on my bookmark toolbar, so it's
pretty easy to jump right to the spot I want. If I did more file
management I'd probably set up a bookmark for it as well, but it's just
as easy to go to the entries page and then click over to files.
(Two quick clicks versus three slow ones.) It seems like such a
silly thing, but it really helped me a lot.
Contributing to the Real World communities
That should get you started with basic blogging. If there are
questions that I have raised rather than answered I'll be happy to
address them. You can email
me or comment here. I may make this a living document and update it
rather than writing additional chapters. I've set up the Real World Open Source and Real World Linux communities so that any member
can draft an article. Simply become a member and start one. When you
submit it, I'll be notified and can release it. Feel free to use this
to post a great topical discovery or idea without taking on the burden
of maintaining your own blog. If you decide to start your own, let me
know and I'll follow it.
Computer security fascinates me. I freely admit that I don't have the chops that many do for cracking into or securing systems, but I do alright for myself... on securing systems, that is. I'm certainly not claiming in any way that I spend time engaged in any activity that could be construed as subversive or illegal... Dang! Awkward...
Of course, this is the situation one gets into when taking an interest in the "dark arts" of computing. People assume that you are claiming to be some sort of criminal mastermind when actually you are simply fascinated by the nature of how bad guys do things. Just as someone who likes to watch true crime documentaries on TV is not necessarily using them to plan their weekend, many people interested in "Black Hat" hacking are not looking to lead the next charge of Anonymous. So if you had an interest in attending the recent Black Hat 2012 conference in Las Vegas, it was likely hard to make a strong connection between that and what you are paid to do. That's OK. Though the event is over, there is a reasonable archive of conference material on the web site, including papers, presentations and even some source code! (Use at your own risk.) There's not much in the way of video from the site right now, but a YouTube search brings up material-- though most of it is from Black Hat 2012 in Europe. I'm guessing, though, that techniques and vulnerabilities don't change much by crossing the ocean, so you can probably get a lot from them.
I'll keep my eyes open and try to report additional material as I find it.
IP Law Talk
The other day I was reading about a patent license agreement between a major software company and a minor one, for an undisclosed amount, regarding undisclosed patents. The story was non-news, unless you're into corporate celebrity, but the discussion had some interesting thoughts expressed. At least they tried to be interesting. They ultimately turned into the sort of juvenile brawl that such discussions do, because everyone is out to win. The part of the discussion that really caught my attention was why a company might not want to disclose its patents. Since Linux and open-source software frequently come under fire for allegedly violating patents, this is interesting to me. The conversation often goes along these lines:
Patent holding company: The villainous developers of these open-source projects are stealing our IP and violating our patents and they must pay.
Open source developers: Uhhh... we don't think we are.
Patent holding company: Oh, yes you are. In fact we have been striking numerous deals with people who agree that this is a violation.
Open source developers: Wow, you really do seem to be making deals with people. Maybe there is something to this. What patents are we violating so that we can fix that?
Crickets: (chirp) (chirp) (chirp)
OK... that wasn't completely fair and read more like a Dilbert cartoon, but I hope you see the fun side of it. It seems to me that if my goal was to prevent people from infringing on my intellectual property, I would want to proclaim loudly and strongly what was being stolen from me so they could and would cease and desist. That doesn't seem to be the way it works out, for some reason. There are non-disclosure agreements (NDAs), behind-the-scenes business, announcements that are simultaneously widespread and secretive. It can be very confusing.
Well, it turns out that a new community has formed around understanding and relating to intellectual property law. It's your chance to ask your questions and voice your own experiences with people who deal with this every day. It's called IP Law Talk, and it should be a fascinating place. I wonder if they know about this weird patent slide show.
Has the Command Line outstayed its welcome?
This is the question asked by a Linux Insider story. I'm going to apologize for being a little prejudiced here, but I just don't understand someone technical who wants to do everything with a mouse. Even when I'm supporting Windows I will jump into the command line, because I can get information faster by typing "ipconfig /all" than by browsing around with the mouse. I use icon-based launchers and I find them very handy. I recently talked about how I use them to keep my Firefox identities clear. However, there are some things that I can just flat do more efficiently using the command line. I can then combine those things into a script which I can place under an icon if I so desire. Macro recordings of mouse movements just don't seem to have the same capabilities.
I know that many people get nervous about the command line. They don't type well. They don't have the commands memorized. It can be frustrating until you get used to it. But there is a heavy price for a graphical interface in system resources which could and should be used for other things if the interface is only rarely required.
I hope that you aren't afraid of the command line. If you'd like to explore it in Linux there's a nice tutorial as part of our Learn Linux 101 series. Windows folks can look at this site. You don't have to use it all the time (though I admit that I do). It's nice to have it around, though, for when the other tools aren't working. As an example, when some program has taken over my graphical interface, it's nice to be able to switch to a command session to see what's happening and kill the offending processes. I've been able to use ssh from my phone to connect to my laptop when the keyboard wasn't responding and fix things without having to reboot. Is that geeky? You bet! But that skill comes in handy when you're dealing with bigger problems.
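That console-rescue trick boils down to finding the offending process and killing it. Here's a minimal sketch of what I type in those situations, wrapped in a helper function of my own invention (the function name is hypothetical; pgrep and kill are the standard tools):

```shell
#!/bin/sh
# Sketch: find a runaway process by name and kill it.
# Usable from a text console (Ctrl+Alt+F2) or over ssh when the GUI is stuck.
kill_by_name() {
    name="$1"
    # pgrep -x matches the process name exactly and prints matching PIDs
    pids=$(pgrep -x "$name")
    if [ -z "$pids" ]; then
        echo "no process named '$name' found"
        return 1
    fi
    echo "killing '$name' (pid(s): $pids)"
    # Default TERM signal; reach for kill -9 only as a last resort
    kill $pids
}
```

From a phone's ssh client the typing is painful, but two short commands beat a reboot.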
There has been some controversy about comments by Valve co-founder, Gabe Newell, calling Windows 8 a "catastrophe" and saying that Linux was part of Valve's future strategy. (Don't take my word for it. See the story by the BBC.) I admit that I haven't had as much time for games for a while, and when I do I am more likely to want to play a "human contact" game with dice and faces rather than having more computer time. However, it's no secret that Linux has been woefully thin in the gaming area. This is ironic, because I think that the tools and libraries available to Linux could make it an outstanding platform for media and gaming. It's just not where game creators focus.
Perhaps something like the Steam platform working more with Linux will make a difference. Of course, this is a future play: Steam has announced enthusiasm, but not a release, for Linux. It could get pretty interesting, though. While browsing through the gaming world I found that Steam is looking to Linux. Another site, Good Old Games, does not support Linux now, but might respond to interest, especially if things work well for Steam.
I did find a site, Desura, which already supports Linux. I downloaded a few of their free games to test and just might go for some of the paid titles as well. As entertainment becomes more network and browser based, the native platform should matter less and less. I'm interested to see what happens. If anyone is already using Desura and knows games I should check out, let me know!
Some time back-- actually quite a while back-- I wrote a series of articles called the Windows to Linux roadmap. Now that I'm editor of the Linux site on developerWorks, I have to look at these things from a different perspective, and it is bittersweet to watch them age. Ubuntu, which is my primary environment now, wasn't around at that time. There are also tools that have come along to make management easier when, at the time, Webmin was really the only consistent tool I could find. (Webmin is still around, by the way, and I might still consider it if I were managing servers and needed to share management with people who didn't have a strong Linux background.)
One of the articles I was looking over today was the one on doing backups. In 2003 the backup landscape was pretty dismal, at least from where I could see it. Were I to write that article today I would have more tools to discuss, my favorite being rsync. Rsync was actually around when I wrote the articles, but it was one of those resources that lurked in the shadows, like so many little tools do. Essentially, rsync is designed to do file duplication, but it tries to be as efficient as possible by transferring only the delta (the changes) in files when it can. It has a number of options and can be set up to do transfers through the network and over encrypted tunnels if desired. I wrote a little script that I run manually whenever I wish to do a backup... though I could run it automatically if I chose... and probably should.
This does a backup to my local USB drive and also does a dump to a network machine, through an encrypted tunnel. This device could be anywhere as long as I could access it over the network, and you'll notice that I am accessing it through an Internet address, so it works when I'm on the road as well. Note also that I'm doing key-based authentication in ssh.
The --exclude-from parameter lets me set up a file containing paths (with wild cards) that I do not want to back up. Things like the Trash, cache files, etc.
The first backup is a bear because it has to transfer all of the data. After that it's easier because it only addresses changes. Of course, one problem with this is that it doesn't take into account file deletions. rsync can do that, but I found that it defeated the purpose of the backup if I was trying to recover files that I'd deleted accidentally. So, I set up another script, which I call cya-purge.sh, to handle that sort of clean-up. I run it periodically, when I'm pretty sure that I don't have something I need to restore.
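Here's a sketch of that clean-up pass, again with placeholder paths:

```shell
#!/bin/sh
# Sketch of the periodic purge pass: the same kind of rsync call, plus
# --delete, which removes files from the backup copy that no longer exist
# in the source. Run it only when you're sure there's nothing left to restore!
purge_backup() {
    src="$1"
    dest="$2"
    rsync -av --delete "$src" "$dest"
}
```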
This second script is identical, except for the --delete parameter, which tells rsync to remove files that are no longer on my system.
I agree that my solution is somewhat inelegant, and probably more hands-on than many people would prefer their backup to be. However, at the time that's really what I was looking for and I still enjoy doing it this way. I have a lot of granular control over this and don't have to mess with interfaces or anything like that. It's simple.
Of course, my hairy-man approach to backups is not going to be to most people's taste. For them/you there is duplicity, an elegant front end to rsync that handles bundling files into smaller chunks, suitable for storing on remote networks. It also manages the backup to keep files around for a period of time and then allow them to leave gracefully... something that I would like to get my own scripts to do when I have time to wrap my brain around it. Duplicity is the default backup solution in Ubuntu, so if you have that turned on, you are using it!
My first experience with duplicity was not great. It spent a few hours doing a full backup of my user directory (gigs and gigs of data) and then deleted it when it was done. I never did figure out why. However, when I recently tried it again through the Ubuntu control panel it seemed to work fine. I would need to do some tinkering to see how best to emulate my current system of dual backups to a local and a remote device, but it might be worth the trouble. I am amused to see, looking at the settings to refresh my memory, that the automatic backup for today has already occurred, and that I did not notice. That's a good sign!
Of course, there are a number of backup solutions that have evolved over the last nine years or so since I penned-- or should I say keyboarded-- that article. Notable ones are Bacula, fwbackups and Amanda. At some point I may dig into them a little more, but in the meantime you will probably enjoy what you can do with rsync. I should point out that there are ways to use rsync in Windows as well. Take a look at this article if you want to explore that.
I was reading things through my Twitter feed the other day and came across this article by Steven J. Vaughan-Nichols discussing the choice that Google, Canonical and others have made to not use Linux in the name of their products. It's not going to turn your world upside down, and it's fairly trivial on some levels, but it is interesting. I use both Ubuntu and Android. I selected them because they have Linux as their foundation, but more specifically because out of similar choices they just did what I wanted the way I wanted.
There is a good deal of discussion about the fact that there are so many Linux distributions. "There's too much choice!" In reality most of these offerings are distinctive in some way, and merely share their foundational parts of Linux and GNU tooling. So far Android has been very successful. Ubuntu seems to be the Linux-based computing environment that more people recognize. Those are good things.
When I became editor of the Open Source site on developerWorks I was inundated with various databases and data frameworks and other similar pieces. Databases and such are fairly successful in the open-source world because that sort of work is a kind of voodoo to a lot of people. A database largely runs behind the scenes, gives up data when I ask for it, and hides data away when I tell it to. It's an easy place to insert open source without upsetting people, because they don't necessarily deal with the moving parts anyway.
As time has passed I've seen a lot more in the NoSQL areas and with cloud, mobile and all the strange and unusual places we try to put software nowadays I can appreciate the need to know about as many alternatives as you can. As long as the data remains open, flexibility on how you interact with it is handy and can help you turn a bad situation into an innovative opportunity.
When I can buy a 2TB drive to sit on my desk for $99 do I really need to worry about drive tuning? I would say that makes it even more important! What a shame to have a big giant drive and then waste a good deal of it because the data isn't partitioned optimally. I'm still interested in learning more about different drive tuning techniques, especially since I run Linux, because I can mix and match some of that a little more than I might in other environments.
As some of you know, I've been playing around with Blender, the free, open-source 3D modelling, animation and compositing software. I'm still just a baby, but I'm slowly learning how to do interesting things with it. Today I wanted to design a logo for a community group I'm building. I wanted to do something unusual. Tinkering with Blender, I found that one could import SVG files, created in Inkscape, and then manipulate them to have depth. I took some silhouettes that I was playing with and managed to create the following graphic. (be sure to click on it and see it full sized)
Admittedly, my picture won't win any kind of design awards, but it really shows what can be done by bringing things into a 3D environment and playing with light and such rather than simply drawing. I'll be doing much more with this. Of course, once it's designed, it's easy to move the camera around to get different perspectives and even shoot some sort of video where you move through the picture.
Blender is just one of the coolest things. I'm making this image available under Creative Commons, using the (cc-by) license. Please feel free to use it as you wish, just please give me credit.
Yesterday I spent some time with a dear friend in San Antonio, Texas who is in his 70s. We couldn't get together in person, because we're a couple of hours apart, but he got a new iPad and I realized that he should be able to use video technology. It went pretty well. We got him going in fairly short order and were able to talk face-to-face. His health has made it difficult for him to get around as much, so this is going to give him the chance to have more people time. That's good. That's what technology should do. It's good technology too, which works from his iPad and my Linux and my Android and someone else's Windows. Technology should break down barriers, not create them.
Here's the weird part, though. Before contacting him, I discovered that Skype, the tool we decided to start with, had an updated Linux version out. This is weird. Skype has been traditionally "fringe friendly" to Linux. We've been back-leveled for years with little or no interest in moving things much forward. Suddenly, Microsoft, one of the Great Satans, makes the update happen. Maybe it was already in the works and they just pulled aside the curtain. Maybe they jumped in and put a team on it. I don't know. I do know that the last thing I expected from Microsoft's purchase of Skype was for them to make it easier for me.
Microsoft takes a lot of punching from me. I used to use Windows exclusively and now I just don't care for it-- though I support people who do, and I don't whine about it. Many of their applications and development methods have been problematic for a more open world, which is frustrating and often counterproductive. I've been hearing that Microsoft is cultivating a new perspective which may be beneficial to an open world. Suddenly they've done something that helps. I have to give them a tip of the hat for that. Updating Skype is a big deal. I don't expect I'll be using Windows any time soon, but I'd like to have the chance to interact with Microsoft technologies in an open way. I'd love to start working with them rather than around them. Maybe things really are changing. Perhaps we've really entered the Age of Aquarius.
In a previous entry I described how I was using ffmpeg to do screen captures for demos. I wanted to share a few new tricks with that. I wanted to make it easier to shoot demos, so I created a special wallpaper when I'm doing one. My screen dimensions are 1920x1200. So, I created a background of that size using GIMP and a graphic I found on the web. I haven't fully vetted this for copyright, so if this belongs to you and you are upset, let me know and I'll redo it.
Within that background I drew a 1280x720 box (the resolution for HD video) and bordered it with a yellow line. Now I know where to put all of my frames when I do the demo. Anything outside of that box will not be recorded.
Finally, I altered my capture command by changing -i :0.0 to -i :0.0+139,152. This tells ffmpeg to offset the capture by 139 pixels on the x axis and 152 pixels on the y axis.
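Put together, the capture line would look something like the sketch below. Only the 1280x720 box size and the +139,152 offset come from this setup; the frame rate, output filename, and option spellings are my own placeholders (older ffmpeg builds use -s and -r instead of -video_size and -framerate, so check yours). It's echoed rather than run, since actually capturing needs a live X display:

```shell
#!/bin/sh
# Reconstructed x11grab capture line. Only the geometry and offset are from
# this entry; the rest are placeholder choices.
build_capture_cmd() {
    size="1280x720"      # the yellow box drawn on the wallpaper
    offset="+139,152"    # x,y offset of the box from the screen's top-left
    out="${1:-demo.mkv}"
    echo "ffmpeg -f x11grab -video_size $size -framerate 25 -i :0.0$offset $out"
}

# Print the command so you can check the geometry before running it
build_capture_cmd demo.mkv
```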
It is highly unlikely that you will be able to use my wallpaper as-is, though you are welcome to try. I'm sharing it freely (provided that it is freely available from the originator!). You will probably need to make your own for your situation. Now, when I'm doing a demo, I call this up. The video gets a black background (which I could easily repaint any time I wish) and I can run other things around my capture without having to edit them out.
I said I was going to check out Mint to satisfy my curiosity about why this might have made such a leap on distrowatch.com. Well, I loaded it up and wanted to give you a quick reaction.
I've run a variety of Linux distributions as my desktop over the last many years. For the last several years I've stuck with Ubuntu. I fired up Mint from a USB drive to see if it might be compelling. My original goal was to use some file system linking magic to try out some of my key applications, like Lotus Notes, while booted into the other environment. Unfortunately, my encrypted drive made it difficult for me to do that. I didn't really feel that I needed to, though. I saw enough just poking around.
In general, I didn't see anything wrong with Mint. It loaded up running Gnome and talked to my laptop just fine. That's not really surprising, though; at this stage of Linux development it would be bigger news if it did not. Here's a decent video walkthrough of what I saw. It's about 8 minutes, but shows you all of the key points.
Overall, Mint was a decent experience, but not something that I felt I needed to switch to right away. In general, I think many people went to Mint to get away from having Unity as the default desktop. If you haven't had the Unity experience, here's a demo of that as well.
Personally, I didn't care so much for Unity. However, it was not difficult to change it. In fact, I found a nice little video which shows how easy it is to change. (I like this woman. I wonder if I could get her to join us over here on developerWorks?)
Another concern for people was that some of the codecs were not supported "out of the box" on Ubuntu. This was really not a problem either. In the Ubuntu repository they have a series of "restricted extras". You'll find it by searching the software repository. Ubuntu doesn't install these things by default because of licensing issues, but if you install them it's fine. This is an extremely simple step. Perhaps Mint supports more codecs, but it's just not that vital to me. I'm comfortable with simply adding elements as I find that I need them.
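From the command line this is a one-liner. The package name below is the one Ubuntu has used for this bundle (ubuntu-restricted-extras); confirm it against your own release's repository before installing, since naming can change:

```shell
# Search first to confirm the package name for your release, then install.
apt-cache search restricted-extras
sudo apt-get install ubuntu-restricted-extras
```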
So, for now I'm not going to worry too much about Mint. Perhaps when I do an installation for someone in the future, or on another machine, I may give Mint a try. (My wife's machine has been running Kubuntu for a while and might be due for a refreshment.) Feel free to add your own opinions.
The other day I was commenting on an article question where I pointed someone to distrowatch.com, my favorite site for keeping up with the latest in Linux distributions. I noticed that the top-listed distribution, and one continuing to grow rapidly, is Mint. I was intrigued. As someone who has used Ubuntu for the last several years, I was curious about what might be different about Mint, and why the sudden growth.
Essentially, Mint is based on Ubuntu, which is, in turn, based on Debian. Apparently, its goal is to overcome some of the issues that people have with an out-of-the-box Ubuntu installation, such as missing codecs for playing DVDs and other media. It's all compatible with Ubuntu repositories, so it can reap the benefits of software that is already set up that way. Why is it suddenly so popular? Some say it's because of Ubuntu's Unity, and their push to change the desktop experience-- whether you want it changed or not.
(skip this paragraph if you already know all about Linux and desktop choices)
If you are a Windows or Mac user, you probably tend to think of the operating system as a set of icons and tools that you push around with your mouse. This is pretty much the way these things are sold. (Would you like one mouse button or two?) In Linux, this aspect of computing is known as the "desktop" and it is an optional layer. Many don't bother running a desktop on something like a server where it is rarely accessed locally, allowing those memory and compute resources to be used for other things. This decoupling of the desktop means that not only can you choose not to run one, you can choose which one you run. There are many choices, the most popular being KDE and Gnome. You can load one, or another, or have several on your system and switch between them at will depending on your taste.
When I updated my Ubuntu to 11, it replaced my desktop with Unity. Unity is designed to work more like a smart-phone or tablet environment, and I couldn't stand it. If I wanted to be running a tablet I would use a tablet; I use a laptop or desktop machine for a reason. Fortunately, I found an article which told me "How to replace Ubuntu 11.10's Unity desktop with good ol' GNOME". That got me working again and was a testament to the flexibility that I enjoy about Linux... but it was still a pain. Ubuntu has apparently made this move to help attract people to their distribution, using the same logic used to sell Mac or Windows. This is probably sound marketing and good business for bringing in new customers. It has the side effect of angering the existing technical customer base who have grown to enjoy and trust the Ubuntu environment. (I guess I can't speak for the whole base, just me and a few friends.)
So, what do you do when your distribution starts to tear away from the things that made you choose it in the first place? In the Linux and Open Source world, someone creates a fork from that path to the pit of despair that continues to embrace what you like and maybe adds a little more. In this case, that's Mint. I have downloaded it and will do some tests to see if my IBM software will still run. If it does, there is a good chance that I will join the thousands who are moving from Ubuntu to Mint. I've got nothing against Ubuntu. I still support it... they've just started to leave me behind as a customer. I hope they figure that out.
To me, this is a good story. It reinforces that flexibility that makes the open source world so interesting to me. Linux has always been about choice and access to technology. That is constant. I'll let you know what I think of Mint. If you try it first, let me know.
[The ideas stated here are my own and don't necessarily represent IBM's positions, strategies, or opinions.]
Legal trials of using Open Source
Now that open-source software is making it past technical acceptance we are starting to hit speed bumps from legal. (Is the first class a lawyer takes in college "No! 101"?) I hear time and time again that I can't use a specific open-source project at IBM because there are problems with the license. In some cases these are real issues, as developers specifically license their software to discourage commercial usage (a noble sentiment, I'm sure, but detrimental to widespread use). In most cases, though, I think that the project creators unwittingly create obstacles to adoption by just not knowing the legal ramifications of their licenses and how they might be interpreted by a corporate lawyer.
If Open Source is going to forge ahead, this has got to be tackled. At FOSDEM 2012 (Free and Open Source Software Developers' European Meeting) there was a DevRoom to discuss the legal issues of Open Source led by Richard Fontana from Red Hat. You can see details about who was involved in his blog entry, "The first FOSDEM Legal Issues DevRoom". Recordings may become available through Karen Sandler's Free as in Freedom podcast.
I'm going to SXSW 2012
I just got my pass to SXSW, the awesome music, film and interactive conference held each year in Austin, Texas. I am really psyched about seeing how everything transitions from one area to another and who checks in and out of the experience and who floats through the whole thing. Most of the technical focus will come through the Interactive conference. However, technology is going to bleed into everything throughout the event from a very practical perspective. I'm going to keep my eyes open and share everything I can. I'll have more information on where to look as things get closer. If you're going to SXSW, let me know!
[Remember that even though I work for IBM I am an individual with my
own thoughts and ideas. Anything I write here may not necessarily
represent the views of the IBM Corporation or its partners... though I'm
hoping that's only a matter of time before they catch up.]
It is hard to imagine a world without C. It is such a fundamental part of computing today, the foundation of many things that you use. Every deeply technical developer that I've dealt with has some C chops, and many prefer it. Linux, most of the GNU tools and many other software components that you use are written in C. It was a groundbreaking departure from the sort of low-level programming that computing demanded in the early 1970s and it changed everything.
The TIOBE Programming Community Index tracks the popularity of programming languages based on search-engine data. Java has fallen recently and C has risen, bringing them almost neck and neck. Not bad for something that many would consider "old school" programming!
What makes C so relevant? Part of it is its legacy. Many of the foundations of computing, such as operating system kernels, are written in C, so it's vital if you want to work with things at a low level. But the thinking behind C also makes room for elegant, portable, fast, maintainable code. That's pretty good stuff! (Developers can squander all those benefits and create chunky, non-portable code; but you wouldn't do that, would you?)
Ironically, many of the things that developers choose over C are actually built on C, including Java, Python, Perl and others. The virtual machines, the compilers and interpreters are generally written in C.
If you haven't looked at C, the original bible for it is The C Programming Language, written by Brian W. Kernighan and Dennis M. Ritchie. This is still a popular resource, as the foundations have not moved a whole lot. There is also the Eclipse CDT project, which lets you do C development right in Eclipse. However, a quick look at Linux and other open source development repositories will show you a cornucopia of options for compilers, integrated development environments and more.
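If you want to dip a toe in, the first program from that book still compiles unchanged today. A quick sketch, assuming you have gcc installed (on Ubuntu it comes with the build-essential package):

```shell
# Write the classic first program from Kernighan and Ritchie to a file...
cat > hello.c <<'EOF'
#include <stdio.h>

int main(void)
{
    printf("hello, world\n");
    return 0;
}
EOF

# ...then compile and run it.
gcc hello.c -o hello
./hello
```

It prints "hello, world", just as it did decades ago. Not bad for "old school".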
Rest in peace Dennis. You made a big difference in the computing world. Arguably, Steve Jobs stood on your shoulders.
[Remember that even though I work for IBM I am an individual with my
own thoughts and ideas. Anything I write here may not necessarily
represent the views of the IBM Corporation or its partners... though I'm
hoping that's only a matter of time before they catch up.]
A while back I turned my router over to dd-wrt. So far I
really have nothing to complain about. It just works. I had
a power outage a while back and it came back with no surprises.
We do multiple laptops and Netflix streaming all together and it seems
to be fine.
It's gotten me to thinking about other sorts of dedicated Linux
possibilities. Now, I'm not looking to do some sort of embedded
project. That's cool, but just not where I tend to spend my
time. However, I do have some little projects that would be nice
for a dedicated system and I just don't have a spare computer.
Fortunately there are a few ways for me to deal with this. Let me tell
you what I'm doing.
Virtualization with kvm
I run Linux full time on all my computers. It does what I need
and I seem to find more things that it can do every day. I'm very
satisfied with a Linux environment. However, in the corporate
world, I still find people who have a half-finished glass of Microsoft
Kool-Aid in front of them and create situations where I must have
access to a Windows system. I'm also finding that as Windows moves further and further away from what it was when I used it regularly, I can't help people off the top of my head any more. If
someone wants help with something and they're on Windows, I have to get
a version in front of me so I can find the right ways to configure
things. (I suppose I could just refuse to help, but I've always
wanted people to be able to use technology, especially if they are
motivated and just need some advice.)
So, I don't have a machine that I'm willing to set aside running
Windows, waiting for me to need it. I'm also not willing to do a
dual boot, because it means that I have to stop doing the productive
things that I'm doing to go into this other environment while I fulfill
this other requirement. The answer is obviously virtualization.
I've been a VMware user for some time. I've used it in many different situations. For example, I had a lab where we were using Symantec's Ghost as part of our solution to image systems. I had a Linux server set up with a file share which worked fine for
copying images. However, the multicast imaging only worked on Windows. (*sigh*) I ended up setting up a thin Windows virtual environment. It talked to the file system with Samba and allowed me to do the multicasting. It seemed a long way around, but it prevented me from dedicating an entire new piece of hardware to run a single program, or from having to find Windows equivalents to all the other things that this server was doing simply because of one program.
However, being an open-source kinda guy, I've wanted to have a freer approach to virtualization. I knew about qemu and kvm, but had only fired
them up a little. I already had working virtual machines and just
hadn't devoted the time to figure out how to recreate things in a new
environment. Then, poking around in some forums, I found a simple set of instructions to convert a VMware image to a kvm image:
qemu-img convert "Windows XP Professional.vmdk" -O qcow2 Win_XP_Pro.img
That was it. I ran that command, waited a while for everything to copy, and I had an image I could fire up with kvm or qemu. (Essentially they are the same; kvm adds kernel support for hardware virtualization.)
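Booting the converted image from the command line is just as terse. Here is a sketch of the sort of invocation I mean; the memory size and the disc image name are made up, and on some distributions the command is qemu-kvm or qemu-system-x86_64 rather than kvm:

```shell
# Boot the converted image with 1 GB of RAM.
kvm -m 1024 -hda Win_XP_Pro.img

# Boot from a CD image instead (handy for installers), with the
# hard disk image still attached as the first drive.
kvm -m 1024 -hda Win_XP_Pro.img -cdrom install-disc.iso -boot d
```
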
I found that Ubuntu had a nifty little front end called aqemu (see the screenshot below). It seems to do what I was used to through the VMware Workstation interface. There are several things that I'm not used to yet. I still
need to learn how to tweak devices, boot from CDs and create
snapshots. I know it can be done, I just need to figure out the
methodology. However, simply running the converted image has worked
just fine. It's lean and mean. Next I'm going to see how it works on
a server. I currently run a machine with similar requirements to the
lab machine I described above. (This one services a church's
infrastructure and I loaded VMware Server to run their antivirus
management software without having to dedicate a whole new machine.)
I'm hoping to replace it with a simpler kvm approach.
I also want to tell you about my little flash-drive solutions... but
I don't have time to finish this right now. I found out that through
Amazon I could get 16GB flash drives for less than $20, so I've ordered a few. They
should be in tomorrow, so I can tell you more about what I've actually
done, rather than the idea. So, this is to be continued...
[Remember that even though I work for IBM I am an individual with my
own thoughts and ideas. Anything I write here may not necessarily
represent the views of the IBM Corporation or its partners... though I'm
hoping that's only a matter of time before they catch up.]
Lately I've seen a number of articles like "Why malware for Macs is on its way"
talking about the discovery of a malware kit designed for Macintosh
systems. For those who don't know, there are actually toolkits that are
sold to help people design attacks on systems. If you've heard of
"script kiddie" attacks, then this is the sort of thing that they mean.
Basically someone who doesn't know a lot about hacking into a system uses one of these kits, much as you or I would use a library to draw graphics, and focuses on their core business of ripping off credit card numbers or what have you.
Most of these kits have been centered around Windows, and they have borne much
fruit. As a Linux user I haven't really had much trouble with that
sort of thing. Neither have Macintosh users. An argument has floated
around for a while. Is it that the architectures of these environments
are somehow superior to Windows, or is it that the market share was
small enough that no one cared to exploit it? Well... we are about to
see. With the emergence of these kits there should be more attempts on Macintosh systems. Will they hold up to the strain or will they fall and require the same sort of scrutiny that a Windows box requires?
Linux is obviously further down the line so I probably don't have to sweat things too much yet. However, the BSD
base of Mac OS X makes the environments hauntingly similar. If the
attacks are highly successful on Macs, then they might transfer easily
to a Linux environment.
Here are a few things that I plan to do to make sure that I have at least a little peace of mind:
Keep that firewall running. I try not to be a control
freak about that, but some basic blocking is always warranted. I might
do some digging and really harden it up. If the malware can't get out
then it can't do its job.
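To give a flavor of what "basic blocking" means in practice, here is a minimal iptables sketch. It assumes a plain desktop Linux box and must run as root; many distributions now wrap this in friendlier front ends like ufw:

```shell
# Always allow local loopback traffic.
iptables -A INPUT -i lo -j ACCEPT

# Allow replies to connections that this machine initiated.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Drop everything else arriving unsolicited.
iptables -P INPUT DROP
```
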
Keep an eye out for weird processes. I
do take a peek at my running processes from time to time, especially
when things seem to be slowing down. I have general familiarity with
what is connected to what and try to look into things that I don't
recognize. I suppose a bad process could hide itself, but at least
I'll catch the less stealthy ones.
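That "peek" is nothing more exotic than this (the output columns vary a little between systems):

```shell
# Show the top CPU consumers; anything unfamiliar near the top
# of this list is worth a quick web search.
ps aux --sort=-%cpu | head -n 15

# Show which processes have network connections open.
netstat -tupn 2>/dev/null | head -n 15
```
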
Practice safe computing. Fortunately, being an open-source kinda guy, I don't tend to find myself hunting for pirated software (warez).
Usually I can get everything I need right from the Ubuntu repository.
However, I still end up poking around from time to time for other
stuff. I should be cautious about unknown binary packages and try to
get everything from the project site. If I am using a repository, make
sure that I look for news about it. If it's distributing bad stuff the
community will likely know and tell me... but I have to look.
Run ClamAV. The ClamAV software is free and easy to deal with. I'll keep it running and up-to-date.
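On Ubuntu that's a couple of commands. A sketch, assuming the packages are in your distribution's repository as they are in Ubuntu's:

```shell
# Install the scanner and the signature-update tool.
sudo apt-get install clamav

# Fetch the latest virus signatures.
sudo freshclam

# Recursively scan the home directory, listing only infected files.
clamscan -r -i "$HOME"
```
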
It's a shame that we have to think about any of this. Computing
should be open and easy. But as long as the bad guys are out there and
our laws and conventions make it so easy for people to impersonate me
with a few numbers then I need to deal with it.
I really hope that the Mac and Linux environments prove a little tougher than Windows. I guess we'll see.
I suppose I should preface this article by reminding people that this blog is based on my own thoughts and observations and does not necessarily represent the opinions of the IBM Corporation, its partners or small, blue furry creatures from Alpha Centauri.
In an article for Business Week, HP CEO Leo Apotheker said that HP was going to start releasing systems that can run WebOS, recently acquired from Palm, alongside Microsoft Windows. His goal is to create an environment where WebOS will have a
wider reach and hopefully attract more applications. Right now WebOS has about 6,000 apps, compared to the hundreds of thousands available for Android and Apple. It's an interesting approach and might make a difference,
depending on how it's implemented. I don't know full details, but it
sounds as though it will be a sort of dual boot scenario.
What a different sort of a world! There was a time when such a move would be followed
by rumors of impending crushing blows to HP. I doubt that will be the
case this time. I don't think that their experiment will be fully
successful, because people are driven by their tools. (Boy! Are they
driven by their tools!) If it's a dual boot environment, users would have to shut down one environment to get to the other. Unless there is
an absolutely killer app on the WebOS side, that switch will happen
seldom, if ever.
It got me to thinking, though, about a similar scenario with my own
preferred environment, Linux. If you were going to partition a system
to have both environments, you could do the dual boot thing, or you
could use virtual machine technology to run one operating system from within the other. Obviously, my preference would be to
start up Linux and run Windows in the virtual machine, but it could go
the other way. In this way people could learn the tools in the new
area without having to shut down and restart to get to the ones in the
other. I recommend a scenario like this to people who are 90%
motivated to move to Linux, but 10% terrified. A partnership with
VMware or similar would surely provide the commercial security that
people like for such a thing, though one could always use open
approaches to virtual machines like VirtualBox. I'm just guessing that
a large company that wanted to do this would want to do it in cooperation
with other large companies who have similar goals. That seems to be
the way that it goes.
Yet, even in this situation, it's all about the tools. If your
people are still beholden to tools that only work in one environment,
they just aren't going to have any need for the other one. It will
just become a big wrapper. In order to be successful, the new
environment has to have as many of the tools as possible. Many of the
most mature open source tools are available across platforms. Mozilla
tools such as Firefox and Thunderbird are easily available. Now that
more and more work is being done through browsers, getting everyone
into running Firefox on Linux could make an immediate difference.
LibreOffice is multi-platform. (I know that it's moderately fresh as a project, but it picks up where the OpenOffice.org code left off when Oracle acquired Sun.)
I know that IBM's commercial products, Symphony and Lotus Notes, run in
a Linux environment. I'm sure digging would find some more.
Now that people are using their phones to do things that they used
to do on a computer, they've become just a little more open to having
some tools that work a little differently. Their views of computing
have changed. It's an outstanding time to start trying something
different and the open-source world has a number of offerings. I'm
curious to see if this HP thing goes and if others will emerge from
their fear and start doing things differently as well.
If you're looking for places to start, I still like this list of Open Source projects on Wikipedia. There's also a site, osalt.com, which
is specifically designed to help you explore open source tools that
might work for you, based on what you are using now. If you end up
doing anything in these areas, I'd like to hear about it. Maybe we'll
find a way to share it with everyone.
A few interesting stories today. The first one, while not very deep and more of an op-ed piece, talks about Oracle's apparent battling with open source. It's called "Shuttleworth: Oracle dooms its prospects in open source business." Oracle has made some interesting choices as the new stewards-- yes, stewards! --of the open-source projects they received with Sun. Another, "Jailbreakers Smell Trouble in New Apple Security Patent," speculates about how some of the new patents that Apple has filed may have more to do with giving them a way to catch iPhone jailbreakers than with enhancing user security.
It will be interesting to see how all of this plays out. What I see is an increased interest by many technology consumers in openness. Just as the era of the PC created a world unfettered from the rule of the system administrator, the next age of computing will allow a tech-savvy user to have more freedom and flexibility with their equipment... provided that they are permitted. We are going to discover soon how much of your technology you actually own. Imagine if you bought a car but were not permitted to outfit it with whatever enhancements you desired to go faster, be safer or just look funky. Imagine being prohibited from cannibalizing your old electronics or motor parts to create your own toys, tools and inventions. I can see voiding the warranty and refusing to support a device which has been altered... but surely when I pay that much money for a piece of hardware I should be able to tinker with it. I'm hoping that the response to these sorts of things will be for consumers to go to technologies where they are not subject to such controls. We'll see. Likewise, if Oracle continues to tighten their grip on their open-source projects, I hope that many will do what has happened in the past when an open-source project has started to turn away from its community responsibilities: they'll turn to another option which is still open.
I agree that these companies are well within their rights to protect their property. However, if your need for protection exceeds the desire of the consumer to put up with your restrictions they will go elsewhere. Like I said... I can't wait to see what happens next.
On the other hand, there are a couple of interesting looks at where openness is blossoming. Here is an article, "What would persuade you to ditch Windows for Ubuntu 10.04?," which says what I've said for some time: Linux is ready for you if you want it. It's a nice description of the things that Ubuntu has done to make itself friendly enough to be my choice as a daily work environment, as well as the choice for people I have helped migrate from Windows to Linux. It's all just going to get better and better.
Finally, there is movement in an area where you would expect openness to be unwelcome: the military. I've seen a few articles lately about how the military is finding advantages to open-source tools. Do a little searching and you'll find them easily. This makes sense to me. Most of the guys I have met who are truly dangerous don't wish for a lot of fancy James Bond gadgets if they get into trouble. They just want a nice sharp pencil. I think that soldiers are discovering that there is an advantage to having transparency to the technology that they use when lives are on the line. It will be interesting to see where all that goes as well.
Enough for now. I wish I had a great tag line to leave you with... but I don't.
One of the resources that comes into my mailbox is a publication called ITWire out of Australia. I enjoy getting a different perspective on people using technology, though every once in a while I get confused when they talk about technology issues before their government.
Today brought a fun piece called "A Win-Lin situation: moving a small office over to Linux." It's yet another example of someone making the move to Linux successfully. They used Wine, via CrossOver Linux, to allow some Windows programs to run in the new environment. If your business is absolutely bound to Windows applications then you really have no choice but to use such a solution. However, I would personally be very uncomfortable being held hostage by a piece of software like that... but that's an individual choice. The point is that what on its face appeared to be an insurmountable obstacle to a Linux environment had an answer. It simply took someone with a strong Linux background to help work out the details.
If you are a Linux advocate, remember the amount of fear that some people have about technology. It really is a fear. It may not be a rational fear to you, but we all have some sort of thing that makes us irrationally uncomfortable. Maybe it's speaking in public. Maybe it's dealing with certain kinds of people. Maybe it's heights, or snakes or blood. If you have such a thing, try to realize that there are people who are just as nervous about technology. They don't need to be told how illogical or ignorant they are. They need to be reassured by someone who can confidently lead them to the answer. Get arrogant at this stage and the answer is simple: "No! because you're a jerk!"
I'm sure that the above paragraph really doesn't apply to any of my readers here. You are all helpful people who have your eye on the best technology for the job and creating an environment that is productive and easy-to-maintain. Keep your eye open for examples like the one above and learn what you can from them. Use them to help answer questions for people who just have no idea what Linux offers or why you would want to change.
A champion is supposed to help protect the weak. In your case, you are a knight in shining penguin armor who can lead the charge to a better way. Do it with honor and respect for those you protect. Lead them forward!
Alas! LinuxCon has begun and I'm stuck here, up to my-- well... never you mind! I'm just not there this year. However, conventions about open-source things are increasingly friendly to virtual participation, so maybe I'll be able to look in a little.
Today I already saw an interesting area on Open Source Education. I will continue to say that I think that good use of open source technologies in any school curriculum will go a long way to making kids smarter and more in tune with technology. Why? Because open source is about technology! It's about how things work and why they work. It's the technological equivalent of those little plastic models that showed you the transparent person with all the guts showing. OK... maybe it's not that graphic... but the skin is certainly pretty easy to peel back.
Right now I haven't seen a lot of education in primary schooling which teaches kids how to enjoy getting into what makes technology happen. I think they get exposed to some commercial tools, but does that really teach what they need? Linux and open source software create an environment where kids can work with any aspect of the infrastructure, from basic program usage to development to security to... I've said it all before! I just can't believe that there aren't more teachers out there who are excited about technology helping their students to do cool stuff. Maybe there are many and I just don't hear about them. Even so, it shouldn't be pockets of experimentation; it should be a core value. Open source environments allow you to dissect technology like a frog. They provide complete transparency down to any level that you want to explore. Any children who get a grounding like that are going to skyrocket ahead.
Maybe it's a money thing. Maybe the issue is that funding and free products all go together and a system that centered learning on an open-source environment would have to give up a lot. Maybe there's pressure not to go there by... er... "Them!" (Giant ants?) I really don't understand.
I will say that the availability of open-source software gave me learning opportunities that I never would have had otherwise. I still go and tinker with different technologies because it's so easy to get a basic start with things that are open. That knowledge allowed me once to secure a lab so tightly that we upset the security people (because they couldn't scan it to tell us whether or not it was safe). I want kids and up-and-comers to have that same joy of learning, discovery and exploration that I enjoy. I hope it becomes more available-- or should I say permitted.
Linux Hype vs. Reality
Dang! While I was scribbling on the stuff above, this story from opensource.com came up about a panel discussion on the perception and reality of Linux today. Wow! I would have loved to have been in that room! Check it out! Bottom line: Linux is not going anywhere and there is nothing really stopping you from putting it to work for you now... if you really want to...
How's your Second Life?
I know that Second Life is not as fully buzzword compliant as it used to be, but I wandered back in to look around and realized that there is some cool stuff there. For example, I attended a live concert with some good music and then had a wonderful conversation with someone about life, the universe and everything. Second Life can be a diverting way to interact with people. So, I'm proposing an experiment. I have staked out a location on IBM's property. You can reach it by going here: http://slurl.com/secondlife/IBM%20Business%20Center/68/164/32. This is not my official office. (I'm not actually important enough to have my own designated space in IBM's land. *sigh*) However, it looked like the sort of place that a group of people could sit around and talk about stuff. So, I'm going to try to find a spot on my calendar where I can go and hang out there and anyone who wants to join me is welcome. We can talk about living the open-source lifestyle. If you go there now you'll probably find me sleeping. Feel free to tap me for a friend request. My user name is "Cmwalden Newman" in the weird language of SL.
If you are a Firefox user, you may have heard about the vulnerability
discovered which could allow malicious web sites to steal passwords that
you have stored in your password safe. You didn't know that? It could
suck. I don't have the details, but you can get a hint in the
description of the session "Breaking Browsers: Hacking Auto-Complete" at
the Black Hat conference. (That's where security-conscious people get together and talk about bad-guy stuff.)
The upshot is that after this conference, the precise method for doing
this will be out in the open, and there may be a lot of enterprising
hooligans who immediately make use of it. Get your passwords out of
Firefox now! I found a handy tool that will pull the passwords from your local repository and help you dump them into another format before you clear them out of Firefox. I
know that sounds alarming, but you save it to your local system and run
it from there. (It will warn you if you try to run it from the
Internet.) It will show you a list of your passwords and let you copy
them into another file. I dumped them into a spreadsheet. (ODS format, of course!)
So... what to do with this file? I don't feel much better having a spreadsheet lying around on my system with passwords to everything.
True, it's much less likely that someone will poke around on my file
system than that people will mess with my browser... but it's still not
a good idea. It's time to crank up the encrypted file space!
I've talked from time to time about working with encrypted file
systems, but not much beyond that. But now it's pretty urgent and I
want to make sure that I have an easy-to-use space available right now
for this and other sensitive information for which I need better
habits. I know that encryption sounds hard, but it's really not that
bad. There's a lovely open-source, multi-platform tool called TrueCrypt that makes this all pretty easy to handle. Don't think encryption will make that much of a difference? Take a peek at this article
on how long it takes to break passwords of varying complexity. Good
encryption with a good password will likely surpass the attention span or statute of limitations for most situations.
How easy was this to do? I installed TrueCrypt, which took a few
minutes of downloading and script-running. I fired up the program
which, incidentally, had a nice GUI. I created a 1GB volume which
resides as a file on my file system. It's formatted internally just
like a file system and it mounts that way too. I could easily have put
it on a flash drive if I wanted to. TrueCrypt also supports encrypting
partitions. Now I have a moderately safe repository that I can save my
spreadsheet into. I can mount it when I need to and not have to do
anything too weird with it. I can also keep multiple things in it,
consolidating my secured items. In Linux (and Mac OS X as well, I think), it's easy to make a symbolic link to a file. That means that
I can take some key configuration and data files and store them in my
encrypted area, but allow the applications to deal with them as though
they were standard. I can explain that in more detail if someone is
interested. There is probably a way to do that in Windows by now, but I just don't know what it is. Maybe someone can fill us in.
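Since I offered, here is the pointer trick in miniature. To keep the sketch self-contained I use an ordinary directory, ~/secure, as a stand-in for the mounted TrueCrypt volume, and a throwaway file as the "spreadsheet"; all the path names here are made up:

```shell
# Stand-ins for this demo: a fake mount point and a fake spreadsheet.
mkdir -p "$HOME/secure"
touch "$HOME/passwords.ods"

# Move the sensitive file into the (notionally encrypted) area...
mv "$HOME/passwords.ods" "$HOME/secure/passwords.ods"

# ...and leave a symbolic link at the old path, so applications keep
# finding the file there whenever the volume is mounted.
ln -s "$HOME/secure/passwords.ods" "$HOME/passwords.ods"
```

When the volume isn't mounted, the link simply points at nothing, which is exactly what you want.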
So, I'm sorry to bear the news. I rather like the convenience of
the password safe... but it's just not safe right now. And don't feel
that putting Firefox's password file in your encrypted volume will
help. The problem is that Firefox will give up your password if it's
asked in the right way. We need to make sure that Firefox doesn't know
the password. Ultimately I'm sure this will be fixed. Then it may be
safe to go back. There are also other password safe tools that might
be helpful... but for now, I think I'm going to go with the old-fashioned copy-and-paste approach with the spreadsheet.
I hope that all of you will take this stuff seriously and give TrueCrypt a try.
We really do need to start taking personal responsibility for securing our communications. Government is too slow and too clumsy to do it for us (not to mention that they don't want anything to be secured from them). Manufacturers have too many points of view to accommodate to make it automatic. It has to be the right solution for you. Start with this and before you know it I bet you'll be asking me about encrypting everything else.
I was catching up on my Slashdot articles and found an interesting note about a major computer manufacturer (hint: you'll find out who it is if you follow the links) who has been dancing in and out of the closet about supporting Linux as a viable option for users. Their latest round has boiled the choice down to Windows versus Ubuntu. They have made the following conclusions about which choice you should make:
Choose WINDOWS if:
You are already using WINDOWS programs (e.g. Microsoft Office, iTunes, etc.) and want to continue using them
You are familiar with WINDOWS and do not want to learn new programs for email, word processing, etc.
You are new to using computers
Choose UBUNTU if:
You do not plan to use Microsoft Windows
You are interested in open source
It's hard to argue with the point that if your computing is largely wrapped up in proprietary Windows software, you should be using Windows. It's also hard to argue with the idea that you should use Ubuntu if you are interested in open source programming. I guess where I disagree is the idea that Windows is the best starting place for people who are new to computers.
I've been away from Windows for a while. I can
never completely escape it, because people I know who have Windows
still ask me for help with problems. I have noticed that as things have passed through Windows XP, then Vista, then Windows 7 (Vista 2.0), I have more and more difficulty keeping up with how to configure things.
I can always find the answers with a quick web search, but the point is
that I have to look it up. If it were truly easy and intuitive, then it
would naturally lead me to the answers. So, why would starting with
Windows put a new user at an advantage?
If you are new to computing then you don't have
any expectations. You are learning technologies from the beginning. At
that point I don't see that either system would make any difference to
you. Sure, you might have more Windows users to throw rocks at-- I mean
more Windows users within a stone's throw to help you out... but you
might also know a few people who use Linux. Ubuntu has an easy install and
an almost magical way of finding and installing software. The default
settings are all pretty reasonable for a typical user. It's designed to
connect you to the Internet and get you browsing with no special
software loads or changes.
In addition, someone new to computers would have
access to tools for art design, media editing, programming, security,
and any number of other interests. All they have to do is hit the button and
it's theirs. True, the applications may not be the most common
commercial editions of these tools, but this person is new. They are
trying to learn how technologies work. How better to introduce them
than to provide freely available resources that will let them
experiment? As an example, I'll give GIMP
(which has a
very nice article in the Open Source zone right now). GIMP stands
for the GNU Image Manipulation Program. It's an editor that provides
enormous capabilities to edit photos and other pixel-based graphics
which you would use on the web and in documents. It rivals Adobe
Photoshop in its functions and I have actually used tutorials that were
written for Photoshop to learn skills in GIMP. Someone learning about
pixel-editing can learn a great deal with this tool and put out results
that are usable by anyone. (GIMP supports a ridiculous number of
graphic formats.) There are many other applications that are similar.
I really want to challenge this last point. In
fact, I think that someone new to computers who starts with Linux will
fail to develop a number of bad habits that seem to occur with people
who grow up with Windows. They'll find community assistance early on.
(The Ubuntu help forums are very friendly and usually provide
tutorial-quality answers for solving problems.) They'll learn that
there are options available for software and that they can and should
make choices when they select a tool. They'll grow with their curiosity
rather than be driven by fear that they may or may not be licensed
correctly for what they are doing.
I know many will disagree with me, but I'd love a
chance to take a group of kids and raise them through a Linux program
versus a Windows curriculum. I think the Linux kids will have a broader,
more creative view of technology and will dive into a community-driven,
open, global world. I think they'll be people who look for solutions
rather than waiting for answers. It could be a beautiful thing.
I was poking around on some information about OpenSim for an author and stumbled across a video about using a Wii remote to create a low-cost Multi-touch Whiteboard. Wow! I admit that I'd never given a whole lot of thought about how the Wii remote works. I was curious, but I just assumed it would be pretty complicated. I had never realized that the Wii communicated through Bluetooth and that the sensor on the front was an infrared camera.
So, I poked around a little more and found a page on CWiiD, which outlines how to get my laptop, running Ubuntu, to communicate with the Wii remote. I was able to set it up to use the Wii as a mouse, interpreting the tilt sensors-- which was challenging to use! I was also able to get it to work as a mouse using the infrared sensor. In this mode the Wii remote gives its position relative to an infrared source. (Your little Wii bar acts as the source for games, but any light source that generates IR will work, even a candle!) My office lights all use the little swirly fluorescent bulbs, but the bathroom across the hall from my office has incandescents. I just turned on the bathroom light and was immediately able to move my mouse by waving the Wii remote in the air like a magic wand-- though I had to point it at the bathroom. However, a desk lamp with an incandescent light bulb would do. I suppose I could mount a red LED somewhere if I wanted to be specific. (What a great use for a lava lamp!)
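Under the hood, the IR-pointer mode boils down to a simple coordinate mapping: the Wiimote's camera reports each IR source's position on a 1024x768 grid (its native resolution), and the driver scales that into a screen position. Here's a rough sketch of that math in Python; the function name and the choice of which axis to mirror are my own illustration, not CWiiD's actual API:

```python
# Sketch of the IR-pointer math a tool like CWiiD performs.
# Assumption: the Wiimote's IR camera reports positions on a
# 1024x768 grid. Whether an axis needs mirroring depends on how
# you hold the remote, so treat the flip below as illustrative.

IR_W, IR_H = 1024, 768  # Wiimote IR camera resolution

def ir_to_screen(ir_x, ir_y, screen_w, screen_h):
    """Map one IR source position to a screen coordinate."""
    # Moving the remote to the right shifts the IR dot left in the
    # camera's view, so mirror the X axis before scaling.
    x = (IR_W - 1 - ir_x) * screen_w // IR_W
    y = ir_y * screen_h // IR_H
    return x, y

# Pointing dead-center at the light source lands mid-screen:
print(ir_to_screen(511, 384, 1920, 1080))  # (960, 540)
```

With the sensor bar you get two IR dots and a driver can average or track them; with a single candle or bathroom bulb, one dot is all you need, which is exactly the magic-wand trick above.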
Anyway... it's been a while since I've been tickled with technology. This is silly, but fun and easy to do. Check it out!
Did you ever want to turn to a manager, or someone in marketing, or just anyone who doesn't live personally through the pain of what it takes to move an application from concept to clickability, and say "If you think it's so easy, why don't you just do it yourself?" If you are currently employed as a developer, it's likely that you have resisted that rather career-limiting urge. Yet, dreams can come true. Google has released a tool called the "App Inventor," which provides a drag-and-drop environment that allows anyone who can design a slide presentation to put together an Android application.
The good news is that some of the more ridiculous application requests may take care of themselves. The bad news is that this will make for a lot of weird stuff out there in the Android application market.
App Inventor runs across platforms, which is the typical Google way. I haven't tried it out yet, but I will soon. Maybe I'll post some sort of weird app for you to enjoy. If you do something with it, why not upload it onto My dW and share it?
This just in... or at least into my inbox: "Google Closes the Blinds on Windows." It looks like the company, which successfully became a verb and has built a love/hate relationship with many in the technology world, has made a choice. Apparently they are in the process of removing Windows from the organization and offering Linux or Mac for all of their users. In this report, Andrew Storms, the director of security operations at nCircle Security, comments that the move smells more like cost-cutting than security. "[But it's] been
cleverly spun into a PR effort to strike at Microsoft," he said.
Even if it is a cost-cutting move, so what? If removing Windows from your organization saves money, money that one might want for other things, like paying employees, marketing, development of new business, things that go way beyond providing a platform on which to run your applications, what's wrong with that? If you could buy fuel for your car more cheaply and still get where you want to go, wouldn't you do it? If you could refinance your mortgage and get a better deal, wouldn't you do it?
As the open-source movement continues to provide multi-platform tools, and as more and more functionality moves into the browser, the platform running your applications becomes less and less relevant. For many, the only thing really holding them back from making such a move is the will to move. If this all really goes through as rumored, perhaps that example will provide the will for other companies and individuals to officially make the change they've been hinting at for years. If that happens, it will make for some interesting changes in the software and hardware industries. Developers who have kept their development Windows-centric, claiming that there was no point in supporting anything but Windows, may suddenly discover that they have multi-platform approaches after all.
I think that finally breaking that ice will help computing to integrate in areas that have been held up. I remember several years ago seeing some of the technology concepts that had your house working together. IBM had this ad about how little bits of technology planted around could make life easier. Frankly, I think centering our computing around Windows has made it very difficult to do this sort of ubiquitous technology. It wasn't that it was impossible to have more integration between Windows and things that couldn't run Windows. It just didn't appear to be easy for some reason. As a Linux user, I became frustrated with the way that gadgets I would get came packaged with all of this Windows software and no information about how to interact with the device in any other way. Generally this was not a problem, because in many cases Linux would recognize the device as a file system or something, and I stumbled across an open-source tool to help me out. A good example is the calibre program, which I use to manage my Sony book reader. It makes book management very easy to deal with and even helps me convert between formats to make for a better experience. I can group ebooks together in a series to help me find them and read them in order. It's nice. (It's also multi-platform!) A bad example is my TomTom navigator, which, despite the fact that it uses Linux as its operating system, has its interface software available only for Windows (and more recently Mac OS X). It's begun to be supplanted by my Droid phone for navigation... which is a shame, because I think the TomTom does a better job in many ways.
It's all about standards. Not "this is the only platform that people should use, so you should only bother programming with our tools in our way" sort of standards, but true open standards that are independent of tools and platforms. The more we adopt predictable ways for technology to communicate, the easier it will be for people to create devices and software that can do it. Perhaps this sort of quaking in the desktop world will cause people to wonder what they'll do if the market changes, and they'll start reaching for those standards. Whatever happens next, I think it's going to be interesting.