Do you smell that burning? That's all of the passion of people on a US Presidential election day. Pardon us if everyone is a little weird today. Of course, one of the discussions that I find pretty interesting is the "Vote, but only vote for someone who matters" discussion. It contains phrases like "Unless you vote for someone on this list you are throwing your vote away."
No, I'm not turning the blog political today, but I am going to spin this thinking a little. Essentially the idea is that you should only vote for someone who has a chance of winning. Does that mean that if your candidate loses, you "threw your vote away"? You have to vote for what works for you, and you run the risk that you might not be in the majority.
Ready for the spin? Here it comes!
I see being a GNU/Linux and open-source software user as the same kind of thing. I use software to get things done. Sometimes I have a business requirement to use a particular tool, but often the finished product is all that matters. I have consistently been able to explore new skills and do things that I had never tried before simply because I could download an open tool without having to wrangle money or licenses from my organization. It is true that the majority of computing is done on two specific platforms and there are clearly leading tools for specific tasks. However, in my quest to get things done I still have all my choices open. In most cases I'm able to do things in a way that is compatible with people who use commercial tools, so no one has to worry about it. In any case I'm learning about a particular skill in the tried and true fashion of simply exploring.
If you are voting today you should vote your conscience and not worry about whether your vote will count. They all count... even if they are only recognized by the people who received them... a sign that they shouldn't give up their cause, even if they don't win. If you are trying to get something done or build skills and you can't get your hands on the "right stuff" look around for open alternatives. You will accomplish something and either find that it works well enough for your needs or that you can more clearly demonstrate the value of having the tool that you want. "I did this with what I have, but if you buy me that I can do these other things."
Doing something almost always accomplishes more than doing nothing.
There's a Storm brewing
I want to take a quick opportunity to send my heart out to the people who are victims of the storms along the East Coast of the United States. Nature is a tremendous force that we do not fully appreciate until there are events like this. I hope your family and friends are safe and unharmed. My condolences to all who did not make it through unscathed.
I'm going to turn to a different storm, now. Twitter has created the interesting phenomenon of a massive stream of real-time data from people all over the world. There is an incredible wealth of information in there, if only you could get it out. One of the tools which might help is Twitter Storm. While not brand new it's a relatively new player in the data space. M. Tim Jones takes some time to introduce it and lead you into the basics in this developerWorks article, "Process real-time big data with Twitter Storm - An introduction to streaming big data". Take a look at it and other material in the Open Source section.
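If you haven't seen Storm's model before, the core idea is simple: "spouts" emit a stream of tuples and "bolts" transform or aggregate them, wired together into a topology. Here's a toy analogy of that shape in plain Python -- this is not the Storm API, just the classic word-count topology idea boiled down so you can see what the article builds on:

```python
# Toy analogy of Storm's model: a "spout" emits tuples, "bolts" process them.
# Plain Python, NOT the Storm API -- just the shape of the idea.

def tweet_spout():
    """Pretend stream of tweets (a real spout would read from the firehose)."""
    for text in ["storm is neat", "big data storm", "hello world"]:
        yield text

def split_bolt(stream):
    """Split each tweet into individual words."""
    for text in stream:
        for word in text.split():
            yield word

def count_bolt(stream):
    """Keep running word counts, like the canonical word-count topology."""
    counts = {}
    for word in stream:
        counts[word] = counts.get(word, 0) + 1
    return counts

counts = count_bolt(split_bolt(tweet_spout()))
print(counts["storm"])  # "storm" appears twice in the stream above, prints 2
```

The real thing, of course, distributes those bolts across a cluster and handles failures for you -- that's where the article picks up.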
I got a message when I tried to run a browser-based application that was truly out of Dilbert:
XXXXXXXXXX is temporarily unavailable at this time for any of the following reasons:
Status and additional information are posted on the XXXXXXXXXX System Status page. We apologize for the inconvenience and will bring the application online as soon as possible. Please try again later.
The status page did tell me what was going on, but the first read was a little silly.
Update to Ubuntu 12.10
The other day I did my update to Ubuntu 12.10 on my laptop. The update went smoothly, though it took a while. The one wish that I had was that there was a way to have it automatically use the recommended response for dealing with config files on the updates. The way it works now I have to hit a button from time to time. I'm sure there is a way to do this, but I haven't researched it. Maybe someone out there can point me.
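Since writing that I've seen that dpkg has "force" options for exactly this, and apt can pass them through. I haven't run a full upgrade this way myself, so consider this a sketch rather than a recommendation:

```shell
# One way to take the default answer for config-file prompts during an upgrade.
# --force-confdef takes the maintainer's recommended action when there is one;
# --force-confold keeps your existing config file otherwise.
sudo apt-get -o Dpkg::Options::="--force-confdef" \
             -o Dpkg::Options::="--force-confold" \
             dist-upgrade
```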
Overall things seem OK. I'm getting some mysterious system component crashes that seem par for the course with an update on this laptop (Lenovo w500). Whatever is crashing doesn't seem to be affecting my normal activity, so it's not troublesome. I expect the next serious round of updates to magically make all of those things go away. I feel that a few things are a little more spry (especially in the Unity desktop) but I have no measurable benchmark.
I have to say that I really like updating Linux. In Windows and other systems where a major upgrade is actually the purchasing of a new product it always seemed a pain. (I'm seeing all sorts of unrest about Windows 8 and am thankful that I don't have to play there.) In Linux I get a little note saying that there's a major distribution update and I hit the button. It's been very pleasant.
Of course, I have a server at a church that suffered some neglect for a while that needs to be updated by hand because it fell too far behind. That is inconvenient, but workable. If you keep things up to date it generally all goes pretty well.
PDFs on the fly
I use PDFs all the time. I think they are a terrific way to share documents. They save trees but provide a controlled look and feel and their openness makes them easy for anyone to read regardless of tools or operating systems. I trust PDF as an archival format much more than I trust any of the word processing formats out there, even OpenDocument, I'm sorry to say.
I started working with PDFs a lot when I started using OpenOffice.org, LibreOffice and the like. It was difficult to convince others that they needed to download software, even if it was free, to read my documents. Then I realized that the vast majority of the time that I don't really need someone to edit the document, just view it. All my open document tools had a PDF button built in so, voila! Easy sharing with no complaints.
Generating PDFs has become more common and there's no reason why you can't include that functionality in your own programs. The developerWorks article "Generate PDF files from Java applications dynamically" has just been updated by the author to include the latest techniques. Take a look and see how you might be able to harness this power for yourself.
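To show there's no magic in "PDF on the fly," here's a from-scratch sketch that builds a one-page PDF with no library at all. The article does this properly in Java; in real code you'd use a library (Python folks might reach for ReportLab), but the raw format is just numbered objects plus a byte-offset index:

```python
# Generate a minimal one-page PDF by hand -- no libraries, just the format.
# A sketch for illustration; use a real PDF library for production work.
def tiny_pdf(text):
    """Return the bytes of a one-page PDF that prints `text` in Helvetica."""
    content = f"BT /F1 24 Tf 72 720 Td ({text}) Tj ET".encode()
    objects = [
        b"<< /Type /Catalog /Pages 2 0 R >>",
        b"<< /Type /Pages /Kids [3 0 R] /Count 1 >>",
        b"<< /Type /Page /Parent 2 0 R /MediaBox [0 0 612 792] "
        b"/Contents 4 0 R /Resources << /Font << /F1 5 0 R >> >> >>",
        b"<< /Length %d >>\nstream\n" % len(content) + content + b"\nendstream",
        b"<< /Type /Font /Subtype /Type1 /BaseFont /Helvetica >>",
    ]
    out = b"%PDF-1.4\n"
    offsets = []
    for i, body in enumerate(objects, start=1):
        offsets.append(len(out))  # record the byte offset of each object
        out += b"%d 0 obj\n" % i + body + b"\nendobj\n"
    xref_pos = len(out)
    out += b"xref\n0 %d\n" % (len(objects) + 1)
    out += b"0000000000 65535 f \n"  # the required free-entry at object 0
    out += b"".join(b"%010d 00000 n \n" % off for off in offsets)
    out += (b"trailer\n<< /Size %d /Root 1 0 R >>\nstartxref\n%d\n%%%%EOF\n"
            % (len(objects) + 1, xref_pos))
    return out

with open("hello.pdf", "wb") as f:
    f.write(tiny_pdf("Hello, developerWorks"))
```

Open the result in any PDF viewer and there's your document -- which is exactly why libraries that do this for you are such an easy win.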
I receive a lot of SPAM in my email. Some of it is fairly entertaining. Here is one I got today:
ECONOMIC AND FINANCIAL CRIME COMMISSION (EFCC)
#15 AWOLOWO ROAD,
IKOYI, LAGOS. NIGERIA
Motto:Eagle Eye Of The Law.
We are writing to know if it is true that you are DEAD? Because we received a notification from one Mr.Graham Simon Brad of United Kingdom stating that you are DEAD and you gave him the right to claim your compensation funds which you have with us.He also stated you died in a Train Crash in India and he has been calling our office regarding this issue, but we cannot proceed with him until we confirm this by not hearing from you after 7days.We would like you to know that this is the second time we are notifying you about your compensation fund but we do not want to assume until after this final notice because, we have with us his international passport which he sent to us and also his bank account details where he wants the money transferred into.
Be advised that we have made all arrangements for you to receive and confirm your funds without any stress and delay.All we want to know now is whether you are dead or still alive because Mr.Graham Simon Brad message came as a shock to us and we can not just proceed with him until we confirm if what he said is real and if it happens that we do not hear from you in 7days then we say: MAY YOUR SOUL REST IN PERFECT PEACE.
May the peace of the Lord be with you wherever you may be now.
... and my response:
Mr. Graham is correct. I was crushed beneath the wreckage in that train crash in what was a blessedly quick death. I have been communicating with him by Ouija board to make arrangements for him to receive these funds to help arrange for special vacations to expose Goth kids to sunlight and upbeat music.
Please cooperate with him in this matter in every way. Should you need to contact me further, please arrange a séance and I will endeavor to answer any questions you may have.
Halloween comes only once a year, but you can carry it in your heart all year round. I've always enjoyed the spooky stuff and there are those who really go all out to celebrate that season. As it turns out, people are applying a lot of do-it-yourself technology to make their own spooky effects. Some are just front porch surprises. Others are in very professional haunted attractions. Unless you are extremely industrious and don't need sleep it's probably too late for you to do much with these ideas, but it's never too early to start for next year.
I'm actually thinking about doing something with web cams and a laptop. It could be fun!
Mirror, mirror on the wall...
Mirrors have long played a part in horror tales. Here is a genius way to create your own ghostly mirror effect. With LED monitors as cheap as they are now, this is actually not too difficult to manage. Here is the video and then a link to the instructions.
One of my most vivid memories from Walt Disney's Haunted Mansion is the singing statues. This enterprising individual has figured out a way to recreate this effect on his porch, activated through an Arduino circuit!
Of course, not all of these things are incredibly high tech. You could probably accomplish this one over the weekend with a few parts from the discount store. It shows how to set up a poor person's gobo (a light with a shape in it) using a pen light, a cheap compact from the makeup department and some clip art. How simple is that? Of course it would be easy enough to make the light something that was not battery operated and even control it through some switching. The mirror technique is handy, though. I'd always seen this done through lensing.
Of course, if you have the means to do a more sophisticated projection you can use techniques similar to those used by the Bates Haunt. He talks about Photoshop, but it would be perfectly simple to use open-source GIMP instead.
On Sunday, October 14, 2012, my family watched Felix Baumgartner plummet from more than 120,000 feet into history. If you missed this fascinating event, here is the video. Interestingly it comes in at just about the same time that we tuned in, so you get to see the same thing that I saw. At 12:16 or so in the video there was a sudden cut away with silence that was very alarming. I thought something had gone wrong.
What an event. The touchdown was so smooth and perfect! My daughter and I talked about the precision of the checklist and all of the planning that had gone into this.
While this is amazing, what also interests me is how we came to watch it. A number of things fell into place that would have not been available even a few years ago. It's an interesting example of how my life has changed due to social media.
First, I had not heard about this jump. I guess I don't follow the correct news. A friend on Facebook pointed out that it was going to happen, expressing her dismay that this was not being covered seriously by the news. I noted it with interest, but apparently not enough interest to mark the time. I got distracted.
Later, my family was sitting down to watch the special features on the Avengers Blu-ray disc, and I happened to check Facebook on my phone. My friend had posted again that the jump was happening now and being broadcast live on the Discovery Channel. Excitedly, we adjusted our plans and decided the special features could happen at any time. We went to switch over, only to find that in cutting back our TV subscription (we watch little "normal" TV) we had dropped the Discovery Channel. How inconvenient!
Fortunately, it was easy enough for me to find a Live YouTube broadcast of the event. (This is a fairly new addition to YouTube.) I was struggling to get it to come up through our TV player, but it was no problem to bring it up full screen on my laptop. The three of us sat and watched fascinated as Felix dropped himself from that weather balloon in yet another example of space becoming accessible to the rest of us.
I often become frustrated at the ways my work technologies and personal technologies can clash and how things don't always work the way I'd like. Yet this is an example of how it all came together. That's really amazing!
POSIX Semaphore APIs using System V Semaphore APIs
As I noted above, the world is changing. My experience with the Freefall event was an interesting combination of interfaces with mysterious technological services. Yes, I have some idea of how they are likely implemented, but I don't really care about the details. My TV, my phone and my laptop all have the potential to interact with these things through Android and Linux and who knows what. I have no idea where the actual bits and bytes are processed and I don't really care. It's magic. It's Star Trek. It's just there!
This is increasingly the way that our world works today. What used to be a specific application on a specific server becomes a series of whatevers running wherever. A side-effect that you may not expect from this is that functions that used to be dispersed on multiple servers with their own raison d'être may get consolidated onto mainframe systems. I know developers who never imagined that they would be working on mainframes. Of course, System z has done a lot to incorporate various environments so one may be working on a System z and never know it.
However, there are times when moving an app into that environment, specifically from Intel to POWER, may require some rethinking of things. Today, on developerWorks, is an article that helps with such transitions, showing how you can implement POSIX Semaphore APIs using System V Semaphore APIs. The technique could save some trouble for people trying to port applications and make it easier for your whatever to run wherever.
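To give a flavor of the idea without restating the article: the System V primitives (semget, semop, semctl) can be wrapped in functions shaped like the POSIX ones (sem_init, sem_wait, sem_post). This is my own minimal sketch, not the article's code, and the my_sem_* names are made up for illustration:

```c
/* Sketch: POSIX-shaped semaphore functions on top of System V calls.
 * The my_sem_* names are illustrative, not from the article. */
#include <sys/types.h>
#include <sys/ipc.h>
#include <sys/sem.h>

/* POSIX leaves this union for the caller to define. */
union semun { int val; struct semid_ds *buf; unsigned short *array; };

int my_sem_create(unsigned int value) {
    /* Like sem_init: make a one-semaphore set with an initial value. */
    int id = semget(IPC_PRIVATE, 1, IPC_CREAT | 0600);
    if (id < 0) return -1;
    union semun arg;
    arg.val = (int)value;
    if (semctl(id, 0, SETVAL, arg) < 0) return -1;
    return id;
}

int my_sem_wait(int id) {               /* like sem_wait: P operation */
    struct sembuf op = { 0, -1, 0 };    /* decrement, block if zero */
    return semop(id, &op, 1);
}

int my_sem_post(int id) {               /* like sem_post: V operation */
    struct sembuf op = { 0, +1, 0 };    /* increment, wake a waiter */
    return semop(id, &op, 1);
}

int my_sem_destroy(int id) {
    return semctl(id, 0, IPC_RMID);     /* remove the set from the system */
}
```

The article goes further -- timed waits, error mapping and so on -- but the wrapper shape above is the heart of the porting trick.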
Bash tricks and tips
It may be because I'm an "old guy" but I find that I can read things faster than I can watch them on video and that I can type things faster than I can point around with a mouse. So, I relish the way that even a Linux desktop makes it easy for me to jump to someplace that I can type. Of course, that also gives me access to some nifty automation and other things. Typing really isn't that bad or scary. I wish more people would dive into it.
For those who are discovering the joys of the command line, there is a nice introduction to some BASH (that's the typical Linux shell) tips and tricks. There's a lot of good stuff in the Real World Linux community by our contributing moderator, Himanshu, and others. Check it out... or better yet... join and contribute.
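To whet your appetite, here are a few standard bash features of the sort those tips cover -- none of this is from Himanshu's posts specifically, just everyday parameter and brace expansion:

```shell
#!/bin/bash
# A taste of everyday bash conveniences -- all standard shell features.

name="backup.tar.gz"
echo "${name%.gz}"        # strip a suffix: backup.tar
echo "${name//./_}"       # replace every dot: backup_tar_gz

files=(report_{1..3}.txt) # brace expansion builds report_1.txt ... report_3.txt
echo "${#files[@]}"       # how many files: 3

greeting="hello world"
echo "${greeting^^}"      # uppercase the whole string (bash 4+): HELLO WORLD
```

Once these become muscle memory, the mouse starts to feel like the slow way around.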
Himanshu has also been busy in the Real World Open Source community. This week, he's looking at VLC, a media player (and server, and converter) that I like. I've used VLC for doing all kinds of strange things, including streaming video. It's multi-platform and pretty capable.
Clearly people did not get as excited as I did about the Bossies, the Open Source Software awards, that I wrote about in my last entry. Perhaps it's just not very compelling, or perhaps there is just a general lack of curiosity in such things.
I've had my world shaken and stirred a little with recent events-- in a good way. The first has been my involvement in developing a Knowledge Path for System z (mainframes) where I have had to dive a little bit into that mysterious world. I remember when I worked at the Texas Lottery Commission and the mainframe guys were "over there". The operators were pretty decent, but the admins were scary dudes.
Picture a scene from an old Clint Eastwood spaghetti western. The sysadmin is dressed in black, with an ornate, but well-used six gun prominently displayed on his hip. I wander up as a wide-eyed kid dressed like Huckleberry Finn. "How do I learn more about the mainframe?" I would ask.
This is met with a steely-eyed stare as the sysadmin says through clenched teeth, "You don't... and pray you never have to." He then strides away, the wind whipping his long coat around him but miraculously having no effect on his hat. Later, there are gunshots.
It has been very nice to come into contact with much less scary people in the mainframe world. People who are excited about mainframes and who reward curiosity, but it is still a precious and rare resource and there are many gateways. It's a shame, because there are many interesting ways in which a mainframe could take the place of a number of computing resources, consolidating them together. Imagine a Bring Your Own Device (BYOD) world where I don't have to worry so much about your device being completely secure because I'm not actually running my software there... I'm providing a central resource and using your device as a fancy terminal. How could that make a difference?
In any case, this is very exciting to me and I'm enjoying the chance to see the outstanding engineering that makes the System z what it is. It is amazing that people were able to think things through so completely... a vast difference from today's rush to market.
The other thing I am working with is a group of high school students in a security contest called CyberPatriot. The idea is to get kids who are interested in technology to have a greater appreciation for how computer security works. I'm a mentor in the group, drawn in because of my Linux background. (Apparently the team was hit with an Ubuntu image last year and they were very confused by being met with a console prompt and a blinking cursor.) It's been interesting, but so far all of the samples have been Windows-based... forcing me to dust off some of my brain cells, since I haven't really had to administer Windows machines with any seriousness for a while now! (There are advantages to being a long-haired techno-freak.)
One of the things that has intrigued me is the difference between how young people approach technology today and how I remember approaching it in my youth. I suppose that part of working with technology in the Eighties was that you really had to know how to make things work or it didn't. Windows was a ways off yet and the blinking cursor on my Commodore 64 or the school's Apple IIe (or the TRS80s) gave you no comfort, no clues as to what to do next. You really had to know something about the moving parts. Interestingly, many of those parts are still there, but buried within all the menus and icons.
It intrigues me that some of these students, who are clearly clever and interested in technology, seem to be experiencing these moving parts for the first time. Ports and processes were always a part of my computing world. All of them embrace the knowledge eagerly and they are doing great, but it amazes me that one could learn about technology without developing an understanding of how these things work... especially if you are more of a techie type.
Curiosity is one of our most valuable assets as humans. We have always dug deeper as a species, finding out how things work and new ways to apply what we learn. We take things apart. We invent. We misapply what we know in wonderful ways to create new discoveries. It seems to me that some of this curiosity is waning. We seem to be waiting for experts to tell us what to do. Experts are great, but how do you know if they're right unless you've tried on your own?
I encourage everyone to try to dig a little deeper into technology. Don't let anyone tell you that you don't need to understand something and that it will all be handled by "top men", especially in these BYOD days! What you don't know can be used to exploit you in so many ways. Bad guys use it to steal your information and resources. Employers use it to make you give up your Facebook information and spy on your personal computers and phones. Governments and commercial interests use it to accumulate information about you and game you. I don't mean to be alarmist and I think that much of this is done with good intentions... but you can't defend yourself or make your own decisions unless you engage a little.
Technology is our servant. We should all be able to take advantage of mainframes or keep our email safe from bad guys. Solutions are there for the using, but we have to be curious and we have to not take "No" for an answer. Go do a search right now for a technical topic that you don't yet understand but would like to. The first two or three things may be way over your head, but you will find something that introduces it to you properly. (Don't be surprised if some of the better ones are on developerWorks.) Dig, learn, play, ask questions, get answers. You will be amazed at what you can find and do.
Before I get to the BOSSies (Best Open Source Software awards), I saw this article today: "Brand-new hardware -- now with malware pre-installed!" That's a terrific time saver! Imagine being able to participate in denial of service (DoS) attacks and SPAM proliferation without all the pesky poking around in malicious web sites.
The author concludes that you should stick to the big players when buying hardware. I conclude that you should always take control of your own security. That's why I like building things from scratch and why the first thing I do when I get a system is erase the drive and load Linux. With the way things are today, manufacturing and assembly spread all over the world, you should make no assumptions about your system when you get it. Do some verifications on your own.
The BOSSies are in, and the winners are...
Every year, InfoWorld presents the BOSSies, awards for the Best Open Source Software in a variety of categories. While this is as scientific as about any awards system out there (which is to say not very), it is a great way to see what's making waves in the Open Source world. I get validation on things I already use and introductions to stuff I haven't yet discovered.
These are presented as slide shows, which is a little annoying. Here are the winners:
WordPress and Joomla are my two default content management environments for quick web sites. I want to like Drupal, but have just not had as much success with it. I'll keep tinkering, though. Typo3 looked interesting too.
I've worked with SugarCRM and liked it. Right now I don't have as much need, but if I have to do that sort of tracking again I will likely make use of it. vTiger might also be worth a look. I may also be tinkering with Magento for some things.
I'm perpetually curious about accounting systems. I'll probably look at FrontAccounting.
I'm also curious about Diaspora. There were several other tools that looked really interesting but dealt with situations outside of my world.
I've heard quite a bit about OpenStack. I don't know that I'll get to do much with it, but it intrigues me. CloudStack as well. We've had a little article coverage on Cloud Foundry, so I will probably look at it too. Lots of options!
Puppet seems like my style. I may actually have a project where it could be useful. I'm curious about Juju and Chef as well. I'm intrigued about deployment automation since I oversaw the Y2K rollout at the Texas Lottery Commission.
OpenRemote is intriguing and may offer some solutions to problems unique to my household.
Fun gaming options in 0 A.D., Warzone 2100 and Stella. Naev also reminds me of a game I used to play on my Commodore 64. I can't remember its name, but I enjoyed it quite a bit. (I'm sure someone will remind me. It's just on the edge of my memory.)
I love Calibre, using it with my Kindle and my Sony Book Reader before that. Outstanding application. Be warned, it gets updated constantly. I wish they would set it up to auto-update and save me the trouble.
An Arduino kit is sitting, waiting for my attention. I really want to play with it. Upcoming articles about that too.
I'll probably check out Lightworks as well when the Linux version is available.
Today I was pointed to the article "How would you fix the Linux desktop?" through slashdot. (Yes, another one of those articles.) I am quite comfortable using a Linux desktop and have been for nearly a decade, so it's not very mysterious to me. My family also uses Linux as a desktop with no real complaints. However, this seems to remain a controversy. It reminds me a little of my daughter talking about her school lunch.
My daughter just turned ten. The other day she was talking about all of the terrible things they are doing at the school cafeteria. They've removed some of the dishes she liked and put, in her opinion, poor alternatives in their place. I should say that my daughter is not a pizza and hot dogs sort of diner. She likes sushi and different kinds of vegetables when they are well prepared. Her description of what was going on did sound a little poor, but it's an institution's approach to being told to provide more "healthy choices" while also adhering to a giant list of restrictions, primarily budgetary. I would probably eat it, but not look forward to it. I suggested that my daughter could always take her lunch and we could keep them interesting. I don't think she even heard me.
We have a lot of choices that we would rather not act on. "I hate my job," says someone... but doesn't really want to leave and find another one. "I hate the environment in my city," says another... but won't move someplace where they say they'd be happier. We complain, but we don't act, because we are not so unsatisfied that we think it's worth the effort to make a change. This truth means that most complaints fall on deaf ears because providers know that we likely won't do anything. If Walmart knew that "I'm never shopping here again" didn't have a silent "unless I find that I'm desperate for something and everyplace else is closed, or I happen to be somewhere and Walmart is the only place I recognize, or I know I need something cheap" then they would probably be a lot more attentive.
So, in desktop land, though people might be disappointed with their Windows or MacOS experience, they likely won't really try to make a move. Once the disappointment is voiced it has been served and one can simply get on with things.
Some say that the problem is not enough applications and that there are barriers to writing applications that work across Linux desktops. I don't know how true that is. I regularly play with different desktops on my Linux installation (you can change it every time you log in if desired). All of the programs I run work fine across the desktops... though the experience changes slightly as the desktop features rearrange. It seems that it is largely a matter of the application letting go of the things that the desktop does rather than trying to emulate them. Maybe I'm missing something.
There are really only about a dozen things that most people do with a computer. Applications exist for those. Developers of popular software could provide a Linux version as easily as they provide a version for Windows and Mac OS. Arguably, if they started to use some of the existing open development techniques that are used for Linux applications they could more easily write things that run on all of the operating systems with a single code base. There are several examples of this in existing open-source software.
People don't use the Linux desktop because they just don't care for the most part. They use whatever they're given. If IT turned around and gave them a Linux desktop and management said it was the new policy, people would use it. Oh, they would complain, just like people do about the store they go back to again and again, but they wouldn't quit their job over it. As long as someone has to make an effort to be different, it will only be those who already do that sort of thing in their lives who take on Linux, and discover the benefits it gives them. Everyone who prefers to "go with the flow" can discover what flows downhill.
I get a lot of oddball mail through my ibm.com address. Today I got the following note:
UNITED STATES HOMELAND SECURITY
CARGO ANALYSES AND INSPECTION UNIT
Chautauqua County Office Of The Sheriff
15 E. Chautauqua Street P.O. Box 128 Mayville,
New York, NY 14757, USA.
OUR REF: UHS/WB/XX125/0011/10.
10th September 2012
We Intercepted a Parcel Containing an ATM CARD in Your Name
Greetings from the County Office of the Sheriff, Mayville,
New York. One of our Officer attached to the FedEx Courier
Cargo Inspection Unit discovered a parcel containing an
ATM CARD/PIN worth $1.5Million US Dollars at our Facility
in HOUSTON, TX, US. We verified the content of the parcel and
it was made clear that you have been waiting for very long
while for the fund to arrive your location.
However, our reason for contacting you on this day is to
inform you on the location of the ATM CARD and the
requirement necessary to claim it. The parcel was dispatched
through UPS Courier Service on the 25th of July,2012, but one
Mr. Clide Stewart intercepted the parcel claiming that he was
your representative and that you gave him due authorization to
receive the package on your behalf. For your information, the
parcel is presently in Transit at HOUSTON, TX, US and is scheduled
for Delivery to your Home Address.
To verify this Notification, Log in to www.ups.com and insert
the Tracking Number below to view delivery status;-(
If we do not hear from you it will be assumed that you authorized
the claim by Mr. Clide; but if otherwise, you are advised to
contact the FedEx Courier Officer in Nigeria with the information
REV. DELE ROBERT
Direct Telephone: +234 809 142 4401
As soon as you establish a contact with him, ensure that you
provide your present address as below to the contact in other
for the Officer to effect the change and Re-direct the parcel
to your home address. Also ensure that you call the Officer so
that he can be aware of your correspondence,
Please be aware that the only document requirement necessary to
effect the Re-direction of the Parcel is a Re-direction Fee $160
US Dollars. Once you contact Rev. Dele Robert, he will instruct
you on how to make the payment.
As soon as the Re-direction fee is paid, you shall not be required
to pay any further fees because the parcel is already in the
NB: Stop all further communication with anyone claiming to be
sending your fund because you ATM CARD is already in USA.
We expect your urgent attention to this email to enable us
monitor this delivery effectively.
Sheriff Joseph A. Gerace
Assistant Director, Cargo Inspection.
Chautauqua County Office Of The Sheriff
NEW YORK. USA.
The interesting thing about this is that when I go to the UPS site and look up this tracking number, the delivery tracking is consistent with a package received from outside the US and delivered to Houston. That's a nice detail. I've decided to let the $1.5 million go, though it would have been nice to go ahead and get the big Harry Potter movie collection.
Have you followed the Spring Roo series? As an editor I don't always have the opportunity to recreate the labs and examples in articles that I publish. I get sufficient information to see that they should work and then let them go. With the Spring Roo articles I found myself playing with the examples a little because it was so easy to just try them.
If I were more involved in application development today-- as in hands-on, having to crank out results-- I would probably be taking a serious look at this technology. It's not for everything, but in the areas where it is a good fit it promises some quick, easy development to solve problems that will run on a variety of environments. This really matters as we move into the Cloud world. Honestly, you are going to need more than Java in a Cloud world.
We just published the final two pieces in this series, and together they provide a good overview to start making you productive. See all the articles in the "Introducing Spring Roo" series.
Did you ever get an email about a study that made you say "Wow! How do I get paid to do a study as obvious as that?" I got an email with the results of a study which claimed "Tech Fuels Success for Business Travelers". In it, we are treated to such startling discoveries as: it can be hard to get a healthy meal on the road, people who travel a lot miss personal things in their lives, and mobile technology is really helpful to people on the road. Perhaps this is for people who have never done any business travel. Maybe I'm just not the target audience. I didn't disagree with anything they showed, it just seemed to be in the "many people seem to have two eyes" sort of category.
Firefox on my phone
I loaded Firefox on my Droid Bionic and it seems to do OK. Dolphin HD has been my browser of choice for a while, but Firefox has some niceties. They do a really good job of auto zooming on columns and modules of information so I can read them. If the page doesn't have the information divided, though, it can be a little challenging. The pinch-zoom functions are pretty spry, so it's easy to adjust. (That is assuming that I will still be permitted to use such a function or if someone will have to invent an "innovation" such as jumping up and down while I wave the phone about in order to zoom because pinch-zooming has been "taken". Did you know that cabinet makers had rounded rectangles long before smart phones did? I digress.)
I like the way that Firefox is very obvious about areas where there are flash components and that you need to click the object to activate it. Dolphin can be quirky there and sometimes just shows blank space, which looks broken, when it's actually an unloaded object.
Navigation is pretty straightforward and it does better with some form-based pages. I'll certainly keep it. It looks like I may use different browsers for different purposes, however. I'll bet it's a better experience on a tablet.
Turn the eDGe loose!
I got a great deal on an Entourage Pocket eDGe tablet... because I bought it in a "deal" just as they were going out of business. For the price I figured it would be worth tinkering with, and it was fun enough. However, their approach to setting up Android has made it proprietary enough that it can't be upgraded. Furthermore, Google has decided not to allow it access to their software repository so I have to install everything through 3rd party app stores. The Amazon app store is pretty good, but many application developers choose only to release through the Google store, so that makes for a lot of unnecessary hacking.
Of course, there are enthusiasts out there who would probably make short work of all of these kinds of updates if the information about the hardware was released. I understand the business reasons for not releasing the information. But the technology side of me would love to see what would happen if such things were turned loose into the wild. The eDGe is a nifty little device and could probably do well if it could be updated a bit more.
Right now I'm letting my daughter use it as an internet device as she works her way toward a laptop. The built-in book reader is a nice touch and I'm setting her up with some material there... like the entire Oz series by L. Frank Baum.
I look around and see that there is a devoted base for the Commodore Amiga... one that seems to continue to make discoveries. I wonder what it is about such devices that lets them live so much more easily. Alas. I'll squeeze what I can out of it. I suppose people create this technology to be disposable... the opposite of the mainframe.
Today I was hit by several pieces of information that just gave me profound sadness. First was the announcement about the football tragedy in Liverpool twenty-three years ago, where the responsibility for the deaths of ninety-six fans was finally correctly placed with the authorities and planners who made a number of mistakes and miscalculations. It was made all the worse by a collective cover-up which made it seem as if it was all unavoidable. In reality, many of those people might have been saved. You can read the Prime Minister's full statement to the House of Commons.
Secondly are the news statements about the deaths of US ambassadors in Libya. I won't get into the politics of any of this. I'll just say that all of it makes me very sad.
I am a believer in openness and technology. I see these as tools which transport people into better versions of themselves. To me, an ideal world is one in which everyone explores their curiosity and works to meet their own challenges, whatever those may be. In this world, no one is denied the opportunity to educate themselves and discovery is openly shared. We find a balance between what everyone needs, wants and provides. I know that there are major obstacles to such a world, and human nature is one of them.
All of these reminders of people at their worst do not define people as a whole. We are making progress and things are getting better. Today I'm just reminded of how much more is needed. I wish there was a way to remove the fear and greed that makes people feel the need to control each other.
I was going to add some other related things here, but I suppose that won't really be fair to them. I'll give myself a chance to reset a little and I'll give them their own place.
Things have been pretty weird around here. I went and spent a week in Dallas learning about mainframes to support an upcoming project. If you've always been curious about the mainframe but not had the opportunity, you're going to love it. I'll have more for you on that later... like later in the year... so you'll just have to be a little patient.
I know this makes me an old guy, but I grew up watching reruns of the original Star Trek series. I didn't watch them first run, so that makes me slightly less old, I suppose. Today, Google included a tribute to the original series.
It's interesting to me how much I was influenced by the ideas on that early program. It provided such a positive view of how technology and society might evolve. It's amazing how close we've gotten. Here are a few things that jump out at me:
The computer had the answers to everything. It doesn't talk to me, but with the Internet there's an awful lot of information at my fingertips with more of it becoming interconnected, interactive, and intuitive.
Tricorders could tell you all kinds of information about your location and local conditions. My smart phone has a GPS and through the network can tap into lots of other information. It also has sensors that can be used to detect magnets, measure velocity and more. (There was actually a very cool program that applied several of these sensors in a mock-up of a Star Trek: The Next Generation tricorder. Unfortunately CBS decided that this was an infringement on their intellectual property so it is no longer available. I will try not to digress about how this attitude is potentially stifling to people who are trying to turn science fiction into science fact.)
Everyone talked to each other through video phones. Yesterday I had a video call with an older (read retired) friend. He is unable to leave the house because of his health, but we were able to have a face-to-face conversation... me on my laptop and he on his tablet.
People had these flat panels of information that they could use to read or sign off. We have a number of options. A giant library of books can all be held in a small device.
They had massive amounts of data stored on little plastic cards, including video. They got this wrong... it's all actually smaller.
Depending on your situation you might argue with me about whether we are integrated between races and cultures. I work regularly with people from all over the world. I don't have any colleagues from other planets yet. Star Trek was the first thing I ever saw that showed this as a natural environment.
Star Trek taught me that curiosity and exploration were good things. Honor and duty mattered. You could not judge things simply by their appearance and differences were things to be worked through. Yes, there was some pretty heavy-handed storytelling and the special effects are no match for today's standards. However, Star Trek put me on the road to wanting to make a difference in the world in ways that nothing else did. It is worth celebrating.
In a previous post I mentioned the Humble Bundle and how Linux users seemed to be quite willing to contribute cash for games. The latest bundle is out and I just started my downloads. I think this is a great model and I enjoy participating. Of course, these games are available for multiple platforms and they directly reward the creators and charities. As of this writing, Linux contributors are still willing to pay more than those who designated themselves as Mac or Windows users, significantly more. Perhaps Linux users have a greater appreciation for the work and a greater desire to reward it. Perhaps they just have more money because they are technical and already leverage things for free. :-)
Seriously, though. It's a great cause and worth checking out.
Build your own distribution
One of the confusing elements of Linux for some people is the concept of the "distribution". Often people see Red Hat and Ubuntu and other variants of Linux as completely different universes, like choosing Mac or Windows. However, it is much more like choosing Ford over Chevrolet or Brand X of canned peas over Brand Y. There are differences, which can be frustrating in particular circumstances, but it all largely works the same. I'm especially intrigued by the sort of "black box" kinds of distributions like ArtistX, which is aimed at people doing video production, and the specialty distributions like Scientific Linux, which brings together packages for people working with genetics and other scientific problems. The goal of these is to provide a sort of "solution Linux" which will just boot and do what you want.
Imagine being able to create your own internal distributions of Linux that were role-based. They all have the common base, but contain the default packages appropriate for the required role. You install them and they just work. Yocto is one of the ways to do this. Check out the article "Build custom embedded Linux distributions with the Yocto Project", live now on developerWorks. It will lead you through the concept and empower you to make your own. If you want to pop it up on DistroWatch and share it with the world you can. You can also put it in your tool belt for your own use.
I owe you some more thoughts on blogging and some information about last week's System Z experience. Be patient as I catch up with a few things and I promise you'll hear more.
This is just a quick one. I've started a week of System Z training, to better understand this technology. I think System Z is a bigger deal than people imagine and there is a lot of our future that could benefit greatly by more people being aware of and taking advantage of this powerful computing technology.
On the one hand, I was aware that this is not for the squeamish. It's true that System Z can create a powerful computing environment that allows many people to simply do what they do without having to worry about everything that runs under the hood. However, to manage such a universe takes some willingness to get your hands dirty. It reminded me of some of my earliest days of computing, where one had to be so close to the moving parts to connect to networks and do anything besides simply run one application at a time.
Yes, there is a lot to successfully harnessing the power of a System Z environment, but it's not really beyond anyone who has a basic grounding in technology. Like anything worthwhile, it takes some focus, and some work... and practice... but the rewards can be so great. Personally, I think open, highly mobile devices on the front end with plenty of Big Iron type of power on the back end is the shortest distance between here and Star Trek. There is still plenty of room for openness in such an environment... though I'm wrestling a little with my classmates on that one.
Today was pretty brain-filling though. Off to enjoy some amazing chili at Tolbert's with my parents. Then more brain stuffing tomorrow.
Fear not... more on blogging coming soon... just not enough brain for it right now.
I recently had a call with some people who are interested in
contributing to the Real World Open Source and Real World Linux
communities here on developerWorks! Yay! I would like to see a lot
more input by people in these places. As a part of that conversation
they asked me to outline my recommendations for people new to writing
in this environment. I decided that this might be of interest to the
general public, so I'm posting here rather than privately through email.
Writing in developerWorks is not like having your own Wordpress or
other blog. You can do a good deal of customization to make it fit your
own preferences, but you will need to fit into the overall
developerWorks framework. This framework may change around you, so the
general rule of thumb is "Keep It Super Simple". Your content is what
is most important here, not any bells and whistles that you might add,
so write things that do well with plain, clean HTML. I prefer to do my
writing in an HTML editor, actually. I tend to use Kompozer, an
open-source editor. Unfortunately, development on this project seems to
have stalled out, but it's still my favorite editor. It produces clean
HTML with no muss or fuss and allows me to easily put something
together which I can just paste into place. You can use any editor that
you choose which can save HTML. However, bear these things in mind:
Don't use a lot of styles and parameters on the HTML. Sometimes
you need to, but keep it to a minimum. This will make your article
behave better when it's published.
Be cautious about what comes out of a word processor. When I
write something in LibreOffice and then paste it into the blog there is
a lot of hidden style information that ends up in the HTML. This is
ugly and bulky and will do weird things to your entries. Be prepared to
clean up anything that you do.
Including images is good, but you will be working with a simpler
subset of formatting options. You will also need to upload your image
to the developerWorks server or reference the link externally.
Posting audio and video is good, but remember that this can
sometimes be unpredictable. For example, when posting a YouTube video,
you must use the old-school <object>
code rather than a simpler <iframe>.
Someday this may change, but for now it is necessary. Some embeds
simply will not work.
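From memory, the old-school YouTube embed looked roughly like the sketch below. Treat it purely as an illustration: VIDEO_ID is a placeholder, and YouTube changed the exact markup it handed out over the years, so always copy the current code from the site's own Share dialog.

```html
<!-- Illustrative old-style embed; VIDEO_ID is a placeholder. -->
<object width="420" height="315">
  <param name="movie"
         value="http://www.youtube.com/v/VIDEO_ID?version=3&amp;hl=en_US" />
  <param name="allowFullScreen" value="true" />
  <param name="allowscriptaccess" value="always" />
  <embed src="http://www.youtube.com/v/VIDEO_ID?version=3&amp;hl=en_US"
         type="application/x-shockwave-flash"
         width="420" height="315"
         allowscriptaccess="always"
         allowfullscreen="true" />
</object>
```

The point is simply that the older object/param/embed form pastes in as plain HTML, which is what the blog framework tolerates best.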
The included HTML editor is decent, but a little thin for me. I
have two browser plugins that I use to help me write entries that I
have not pre-written in Kompozer.
Write Area - This plugin will give you a
fuller editor that you can invoke in any text area with a right-click.
It provides more formatting options for links and images. Unfortunately
it does not include a spell checker, though, so be sure to double-check
your work. I use this a lot! (I'm using it now). It's been a real help
to get around any site that has a limited window in which to write.
It's free for Firefox. I'm sure that people with other browser
preferences will find similar add-ons. I'm just telling you what I use.
Scribefire - This plugin provides more than
just an editor. It is a blog management system, allowing you to work
with various blogs on different sites. It will give me a list of the
blogs that I use and let me edit or create a new entry for any of them.
This can be handy, but it sometimes does some strange things with more
advanced formatting. (Remember, I said to keep it simple?) Another
feature is that it will allow me to simultaneously publish the same
thing on multiple blogs at once. I did run into one issue, which I
mentioned in a previous post. Do not use the '#' symbol in your article
titles with Scribefire. This caused it to get lost when trying to
aggregate my existing entries. That was very frustrating for quite a
while until I tracked down the issue.
There are other blogging tools which are compatible with
developerWorks, but these are the ones that I generally use.
Any pictures that you want to use need to either live on the system
or be linked with the URL. For some content, especially copyrighted
content, I just link to it. That saves some of the usage hassles and
acts as an automatic credit to the owner. For example, Dilbert cartoons
are a great thing to include from time to time and they have a simple
method for linking to their content.
If you're going to do things like this, you should expect to have
to tweak the HTML from time to time. Sometimes developerWorks seems to
alter things that are not entered through the raw HTML view. (That's
the <h> button in your toolbar if you are using the default blog
editor.) HTML is nothing to be afraid of, and many of you are technical
people anyway, so you should feel comfortable with it.
For some pictures, though, it's best to upload them. If you are
using the default editor, uploading is automatic. You click the icon to
insert a picture and it gives you a chance to upload your picture. I
will often use this step just to get the picture up and then go into
Write Area to manipulate it and make it look nice with the article.
Add image tool.
You can also upload an image file directly. Select the Settings
link, next to New Entry. On the Create & Edit tab you'll
find File Uploads. You can manage everything here. Note that
this interface acts much like old-school FTP, so you can't overwrite an existing file.
If you need to change something you need to delete it and then
upload the new one. That provides a window where the file may not
exist, but it's pretty quick.
Copy the link for an existing file and you can use a conventional <img>
tag to include it.
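As a sketch, with a completely made-up path (copy the real link from your own File Uploads page):

```html
<!-- Hypothetical path; substitute the link copied from File Uploads. -->
<img src="https://www.ibm.com/developerworks/mydeveloperworks/blogs/files/my-screenshot.png"
     alt="Screenshot of the blog editor" width="400" height="300" />
```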
Bookmarking major links
I quickly got annoyed by some of the steps to getting to areas like
file management. They are easy to find, but require a number of
clicks to get there. This was easily remedied with a few bookmarks.
I have bookmarks set up for my main blog and the entries
page. These reside in a folder on my bookmark toolbar, so it's
pretty easy to jump right to the spot I want. If I did more file
management I'd probably set up a bookmark for it as well, but it's just
as easy to go to the entries page and then click over to files.
(Two quick clicks versus three slow ones.) It seems like such a
silly thing, but it really helped me a lot.
Contributing to the Real World communities
That should get you started with basic blogging. If there are
questions that I have raised rather than answered I'll be happy to
address them. You can email
me or comment here. I may make this a living document and update it
rather than writing additional chapters. I've set up the Real World Open Source and Real World Linux communities so that any member
can draft an article. Simply become a member and start one. When you
submit it, I'll be notified and can release it. Feel free to use this
to post a great topical discovery or idea without taking on the burden
of maintaining your own blog. If you decide to start your own, let me
know and I'll follow it.
A friend of mine, Neil Gilmore, a talented developer, taught me the phrase "fully buzzword compliant". It's like something in a Dilbert cartoon, where jargon takes on a life of its own. I use it often.
Terms like REST and RESTful come up regularly, but I rarely see them in enough context to be understood by someone who doesn't already know what it means. That's why I was thrilled to see "Understand Representational State Transfer (REST) in Ruby" by M. Tim Jones. The term is even described in the title! If you are working with Ruby I think you'll find this a useful read. Even if you don't use Ruby I think there is information that will help you become more fully buzzword compliant, and be able to better consider the value of the REST architecture for your own projects.
We all eat... at least I'm pretty sure that we do. In the United States we have a relationship with food. Restaurants are a big part of Austin culture, and the culture of many cities. "Armchair" chefs have begun to rival "armchair quarterbacks" as people watch the plethora of cooking shows and networks on television. So, it's no wonder that the auditorium for this forum was nearly full (there were easily two hundred people).
The speakers were Addie Broyles and Michael Chu. Addie is a food writer and blogger for the Austin American-Statesman. (Read her blog, "Relish Austin".) She talked about the ways in which technology is changing how we interact with and communicate about food. I mentioned the cooking shows. There are also hundreds of food blogs where people share restaurant experiences, dietary thoughts, nutrition discoveries and personal cooking adventures. Not so long ago there was a designated professional food critic or two who were the official voices of taste. Now, in addition to the bloggers, we have social media, Yelp and other sites where anyone and everyone can publicly share their praise and disdain for their dining experience, and restaurants can publicly respond and react. It's not just dining out that gets treatment though. People share personal recipes, techniques and nutritional ideas. People with food allergies or conditions that require special diets are able to share their discoveries for enjoying food with restrictions.
Of course, technology doesn't just bring us commentary. There are sites and apps devoted to different things that you may need to do with food. Sites like livestrong.com allow you to track your diet and learn more about the nutritional details of food you eat. (I like everything about that site except for their serious deficiency of not having an Android version of their app. They seem to be pointedly supporting everything but Android. Come on guys! Maybe you need IBM Worklight... but I digress.) There are also sites that will let you enter the food that you have and help you come up with recipes that you can make (e.g. supercook.com and myfridgefood.com). One site, eatyourbooks, will let you enter the titles of the cookbooks that you own and it will help you find recipes for available ingredients.
There is even an app which will estimate calories based on a picture of your food. It's currently only available for iPhone, but I'm sure there will be others. Very interesting stuff!
Next it was Michael Chu's turn. He is an engineer and the author of Cooking for Engineers. He was demonstrating a technique called Sous Vide, which is borrowed from a laboratory technique for accurately and evenly heating substances by using a temperature-controlled water bath. (Read Michael's introduction to the concept on his blog.) Michael was passionate about cooking and enjoys a good steak (a man after my own heart). It was clear that many of the attendees were unprepared for Michael's engineering view of cooking, but I was fascinated with how his knowledge of the process could help one predictably and consistently create the perfect 65 °C boiled egg. Words really fail me on explaining what we got from Michael's presentation. You'll just need to wait for the video to fully appreciate what he did.
It did get me thinking about the ways in which science, engineering and food overlap. The Sous Vide process that he demonstrated is a tub filled with water. The substance that you want to heat is packaged in some sort of container (except for eggs, which have their own container). Next a heating unit will gently heat the water until it reaches a designated temperature and hold it there for as long as desired. It is not possible to overheat food in this manner because the temperature cannot go over what you've specified. However, the physics of certain foods causes them to do certain things when heated for an extended period of time. (Michael seemed especially intrigued by the physics of the perfect egg.) Home use of this technique is pretty uncommon, but it is becoming more widely used by restaurants. They can, for example, prepare a container of steaks to a perfect medium-rare temperature and hold them there for hours. Then, when one is ordered it is seared on the grill to give it the final touch. The result would be consistently perfect steaks.
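The hold-at-temperature idea is simple enough to sketch in code. Below is a toy on/off controller warming a water bath toward a setpoint and then holding it there; all of the numbers are illustrative, not real thermodynamics.

```python
# Toy sketch of the sous vide "hold at temperature" idea: a simple
# on/off controller warms a water bath toward a setpoint and keeps
# it there. The heating and cooling rates are made-up numbers.
target = 65.0    # degrees C, the oft-cited perfect-egg temperature
water = 20.0     # starting at room temperature

for minute in range(120):
    if water < target:
        water += 1.5               # heater on: the bath warms
    else:
        water -= 0.1               # heater off: slow heat loss
    water = min(water, target)     # the controller never overshoots

print(round(water, 1))             # hovers at (or just under) the target
```

The key property is the one Michael relies on: since the bath can never exceed the setpoint, leaving food in it longer can't overcook it in the usual sense.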
Of course, we already enjoy a number of scientific breakthroughs in our kitchen. Our basic stoves and ovens, refrigerators and such are obvious examples. Some of us still remember what it was like to cook without a microwave oven. Much of the science in food happens behind the scenes though, in the growth, preparation, and transportation of food before it gets to you. Some of this is controversial, and rightly so. We literally are what we eat. It's good to learn more about what is done with your food and what you can do yourself to keep it healthy and tasty.
I have no idea if you have heard about this. Sometimes there are things that I think are widespread news that others have never seen. (Of course, people are shocked when I don't know who won American Idol.) Recently, technical writer Matt Honan was hacked, hard. They destroyed all of the data on his laptop, his iPad and his cloud storage, apparently as a stepping stone to playing around with his Twitter account. The attack took advantage of his doing what we all do, having some alignment between our accounts in different places, and using the differences between the different organizations' policies to get inside. Once they are in one, it's easier to get into the others. It's similar to the ideas in this World War II cartoon about keeping secrecy. (WARNING: This video is a reflection of its time and contains some caricatures which are inflammatory and frankly racist. I show it for its historical context and the lesson it discusses about how people can piece together bits of information. Not only does this video not express IBM's opinions, it doesn't even express my own... but it does show how long these ideas of security have been around.)
Matt's story is unsettling. It is regrettable on so many levels. I imagine that the companies involved especially regret that it happened to a technical writer.
So, what does it mean to you and me? It means that it's time to get serious about security. We have to get serious about our own security because if something slips it is our memories and our creations that are lost forever. That's just too hard to consider.
Security = inconvenience
The first thing we need to accept is that any level of security demands a certain level of inconvenience. I'm not talking about the security theatre that we experience at the airport. I'm talking about things like having to type in a password every time you want to use your computer. I'm talking about having to change your security codes periodically and making them long and complex. These things are requirements for modern security. Just like you have to take time to unlock your door and maybe disarm the security system, you are going to have to take a few extra steps.
The first step that I have taken is to make all of my passwords a significant length. I've set them to 25 characters for all that can accept it. Anything that doesn't go up to 25 I set to the maximum it will accept. I'm using a mixture of upper- and lower-case letters with numbers and special characters. I am enforcing my own policy of changing these at least every three months. I have made all of my passwords completely different from each other. This is a huge pain... but until there is some sort of biometric standard that will apply just to me, I have no choice.
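If you want to follow a similar policy, a random generator does a better job than human imagination. Here's a minimal sketch in Python using the standard library's `secrets` module; the particular symbol set is just my assumption, so adjust it to whatever each site accepts.

```python
import secrets
import string

def make_password(length=25):
    """Generate a random password mixing upper, lower, digits and specials."""
    specials = "!@#$%^&*()-_=+"          # assumption: trim to each site's rules
    alphabet = string.ascii_letters + string.digits + specials
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until every character class is represented at least once.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in specials for c in pw)):
            return pw

print(make_password())    # a 25-character jumble, different every run
```

Paste the result straight into your password manager rather than trying to memorize it.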
Crank up the security
Do you have all of the verification policies turned on that are available? Do you grumble when someone asks to see your ID? Take a look at the options available to you and see what else you can do. For example, Google has a 2-step verification which authorizes access by device. When you use a browser, or certain other apps, Google will send a numeric code to you by phone. This code must be entered or you may not access your Google tools through that device. For things which cannot use this process Google creates an application-specific password for that application and device.
On my account, I had to set up my long password on my Google account and verify it in the browser, and separately in the browser on my phone. I also had to enter a separate password for the GMail app on my phone, Thunderbird on my laptop and my instant messaging software. I only need to do this once, but I'll have to recycle them later on when I do my password revisions.
I need to review my options on Facebook. For now, at least, I have significant passwords on them. Of course, using truly secure passwords has caused me to need a password manager. I'm using KeePass because it is available on Linux and Android and I have a way to share the database between devices. My database encryption password is also significant (20+ characters), and something I have to remember and type in each time I need to access the passwords. It will also need to change periodically, which will be a pain. Right now, though, I'm betting that I have less chance of someone hacking my password manager database than I do a company accidentally dumping my information over the Internet or allowing themselves to be socially engineered into compromising my account.
Could we do better?
We could absolutely do better in our security! The standards and tools for doing good security are available. In many cases, regular application of what is freely available could make a difference. Key-based authentication with a biometric as the password would allow me to control my keys, have different keys for different purposes and never have to remember anything. The protocols for key exchange already exist. It could work.
It's not going to happen that way, though. There are too many people who don't want to understand these things and don't want to be bothered. Companies and governments do ultimately do what they are directed-- but often in a "malicious genie" sort of way. "OK. You wish for a mountain of gold, which falls from the sky and buries you."
We need to be more demanding about the protection of our accounts and identities. We need to be more tolerant of the process required to verify our identities and we need to be willing to actively participate in the process. I'm guessing that overall there is more money to be made by everyone for fraud than there is for security which works... which is a real shame.
I hope you'll consider what happened to Matt, and what you would do if it happened to you. Now... how are you going to prevent it, and how are you going to teach the others?
On Sunday night, I joined a number of space exploration enthusiasts at a Landing Party to watch the deployment of Curiosity, the newest Mars rover. It was an incredible event. Here is some video of my immediate reaction after the party. Bear in mind that it is very late, I'm out on the street and I'm pretty tired by now. It's raw-gritty reporting that puts you there! I would have had this up Monday or Tuesday, but I had to fiddle with the video a little... and I was pretty out of it on Monday and not able to multi-task as well as I do on other days.
First, let me congratulate NASA and all involved. It was an inspiring deployment where everything appeared to work perfectly. Watching it in a room full of people who cared made it even better. Every stage was cheered enthusiastically. It was wonderful to behold.
In the video, I mention a couple of applications. First was Uniview, which is a commercial application that was used to show us an impressive 3D rendering of our solar system and beyond as the presenter related it all to the Mars mission. However, he also pointed out Partiview, which he said was a similar application, freely available as open source. It's multiplatform and I am downloading it now. I'll report the results.
I believe that space exploration is important. It drives us to solve problems and gives us places to reach when our own world seems a little inhospitable. Science fiction becomes science fact as people find ways to make their social and technological dreams come true. We will never stop reaching for the stars. If governments decide to get out (which might not be a bad idea on some levels) people will make it happen.
Hacking my DNS
A while back I was feeling frustrated about my home network. Everything just seemed sluggish, but when I would do various speed tests it didn't really seem to be so bad. What was going on? After poking around for a while, I observed that my slow-down seemed to be related to domain name resolution. If you already know about this stuff you can skip the explanation.
Quick explanation of DNS
In a TCP/IP (Transmission Control Protocol/Internet Protocol) network, which is what we use on the Internet, everything is done by the numbers. Ultimately, your network card wants to talk to another network card somewhere else. That's what your MAC (Media Access Control) address is: a unique identifier of your network card. Of course, keeping an index of all of those devices is cumbersome, so a system of cataloging them was devised. That's where the TCP/IP address comes in, the x.x.x.x number that is assigned to you on a network. However, telling you to visit my web page at 188.8.131.52 is probably not going to be easy to deal with. So, a concept was devised where names could be given to the various networks and a lookup performed to point you to the final destination. That is known as the DNS (Domain Name System). I'm going pretty quickly here. If you really want to understand you should read more about TCP/IP and DNS, but here's essentially how it works:
You connect to a network. You get your own IP address (x.x.x.x) which points to your network card's MAC address. You usually don't care what your MAC address is unless you are doing some serious troubleshooting. You sometimes need to know your IP address.
You are pointed to a gateway, an IP address which will be the central point of communication for everything coming from your computer.
You are given a DNS server which will translate names (like ibm.com) into IP addresses.
When you look up a name, your system will give the name to the DNS and receive the IP address. Then the IP address will be contacted to complete the connection. If you can't look up names, your system may seem like it can't talk to the Internet.
If this name lookup process is slow, it will delay every network connection that you access through a name.
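You can watch this lookup happen from any scripting language. Here is a small Python sketch that resolves a name and times the lookup; I'm using localhost so it works anywhere, but try a real host name like ibm.com to measure your own DNS server:

```python
import socket
import time

# Time a name lookup, the same resolution step your browser performs
# before any connection. Swap in a real host name (e.g. "ibm.com")
# to measure your own DNS server's speed.
start = time.perf_counter()
ip = socket.gethostbyname("localhost")
elapsed = time.perf_counter() - start

print(ip)                                 # the resolved address
print(f"lookup took {elapsed:.4f} seconds")
```

If that elapsed number is consistently large for real host names, every page you open pays that tax before a single byte of content moves.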
Once I noticed that my name resolution seemed to be a bottleneck, I started digging around. I think that the DNS servers for ISPs are typically pretty overloaded. If I can bypass those, then I can perhaps get a faster lookup and faster networking overall. In Linux, there is a utility called dig. It performs name lookups with some feedback about the process. By default, it will use your network's name server, but you can designate a name server as well. I found a list of public name servers and played with them through dig. You can see some examples below.
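The examples I ran looked something like this (ibm.com is just an illustration; the @ sign tells dig which name server to query):

```shell
# Query using whatever name server your network assigned:
dig ibm.com

# Query a specific public name server instead (Google's, here):
dig @8.8.8.8 ibm.com

# The "Query time" line in each response is the number to compare.
```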
Ultimately, I decided that I liked the Google server, 8.8.8.8, because it was easy to remember. All of them provided some improvement. So, I went to my home router and told it to use the Google name servers rather than the default. Voila! All machines connected to my network automatically go to the new servers to look up names. This has made a vast improvement in my networking latency. Isn't that interesting?
If I'm in another network and want to do the same thing, then I can adjust the network settings to include my own choices. That will vary with each operating system. On Linux, I simply edit a file called /etc/resolv.conf. Here's what it looks like:
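With the Google servers in place, the file is just a couple of lines (8.8.8.8 and 8.8.4.4 are Google's public DNS addresses):

```
nameserver 8.8.8.8
nameserver 8.8.4.4
```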
8.8.4.4 is the secondary server.
What about the phone?
So, after I had done this for a while, I started wondering about the network on my phone. I have a 4G phone, but it just seemed to lose its mind from time to time. Again, the issues seemed to be related to finding things more than connecting to them. Could I do the same thing?
I did some digging, and since Android is based on Linux, there were similar underpinnings. However, these only seemed to work for the Wi-Fi network, not 4G/3G. Drat! I rooted my phone some time ago, so I had access to the settings, but I just couldn't find anything useful. Then I found out that there are apps that will help out with this. The one I settled on is "Set DNS" by Steve Hanlon. I tried the free version for a while and then bought the pro version for less than $3. (I like to support independent developers when I can, so I donate to open-source projects and buy pro versions of phone apps that I like.) It has worked exactly as I hoped. Suddenly, some of the sites I had trouble with started working very efficiently, and I have noticed a decided difference in my network stability.
Perhaps later on I'll find the guts for this and be able to do it without a helper app, though I'm satisfied with the solution.
If you are having sluggish access to the Internet, maybe a change to your name server will help. Feel free to post a comment with a question and I'll help if I can.
Technology has changed an awful lot since I first got started. Go ahead and laugh, kids. Some day you will also scratch your head in amazement at how far things have come and how different it all is. You can either get mad or you can get busy is the way I see it. I am thrilled by new technology... even the things that I don't do so well. It always gives me something new and interesting to explore. Increasingly it lets me get into things that I always wanted to do, but did not follow the correct path to have those toys (e.g. animation, filmmaking). As technology moves forward I'm finding that some of my childhood fantasies are in reach enough for me to play without having to make the massive career jump it would have once required. I can make perfectly bad movies with my laptop and commodity equipment and don't need to go starve somewhere to make it happen. That's progress!
Of course, there are those who have trouble with this kind of motion. We all feel some inconvenience and frustration with moving forward, but some people are really dug in, and if they aren't careful the Earth will crumble beneath their feet. I was reading an article called "The 9 most endangered species in the IT workforce". I won't go through all of the categories that Dan Tynan set forth. You can read for yourself. I was intrigued by the general trend of his comments and the impressions they left with me.
Overall the technological landscape is becoming more mobile, more flexible and more chaotic. If you are safely housed in your fortress of policy and certification then you will find yourself becoming less and less significant as people respond to your obstacles to their work by simply going around you. Several of the scenarios that he listed had to do with maintaining a "my way or the highway" attitude and attempting to rule others by simply being smarter than everyone else. The truth is that people have access to all kinds of resources and opinions outside of your domain. They can use their own ingenuity to try a number of failed but progressing approaches using their own devices, open-source software and the Internet in the time that it takes to do one round of feasibility study and hierarchical approvals. In other words, the rabble can rise up and get things done, even without the smart guy.
So, rather than being an obstacle, you should be a contributor. Yes, some people will want things that have unintended consequences and you still need to look out for disaster. But if you are working to be an ally rather than a gatekeeper then you can get them to do some of the leg work and bring it up to the point where you can do the final tweaks to work the miracles. That makes you Merlin, not Moriarty.
The other area that was covered was the idea of specialization. Big iron computing will always have a place, but it's no longer the only way to get things done. Likewise, all of your precious certifications and other skill sets may not be relevant to the problem at hand. Are you really leading people into the best solutions, or are you bending the problem into your space? There will always be space for skilled people who can solve problems. Those characteristics are action-oriented, however. I love this quote from the article:
"The days when you could slap some Cisco or Microsoft certifications onto your résumé and write your own ticket are long over," says Lenny Fuchs, owner of My IT Department, which provides contract tech services to small businesses.
"Without the work experience to back it up, certifications are almost useless," he says. Fuchs adds he gets a kick out of seeing résumés that read "John Doe, MCTS, CCA, CTSGIT, MCITP, CCNA, MCP. Last held position: Assistant manager at Starbucks."
What have you done for us lately? Certifications are great, but they are no longer the only avenue to getting things done. Many people can now buy a book on Amazon, learn a technology and get to work without ever entering the classroom of secret knowledge. Sometimes you don't even need to do that. Just type your error message into Google and see the troubleshooting discussion in a myriad of support groups.
"That's not fair," I hear you cry! "Those people are just fixing a symptom without a full understanding of the system. They could cause great damage with their dabbling." This is true. But they would rather take that risk than go through the old-school time and expense to have it done "correctly". The message is that a technological survivor is someone who is excited about technology and eager to solve problems, even when they go against conventional wisdom of the past. A survivor is more about finding solutions than blocking ideas and applies their old expertise to the new ideas to save others the pain of learning the "hard way". It's OK if you prefer to step back and watch the mess, waiting for people to come begging back to you. Just be prepared for the day when they evolve and you lose your place in the ecosystem.
Personally, I think there is a lot of excitement in technology. I am mobile and get to apply myself to things that are interesting, helping a lot of people break new ground for themselves. I don't worry that my place will go away, because there is always something waiting for the truly curious. Perhaps that's really the true quality of keeping yourself relevant as technology marches on. Don't lose your curiosity. Don't let your fear of the unknown trap you in a changing habitat. Explore, share, and be part of those who are exploring new ground.
Computer security fascinates me. I freely admit that I don't have the chops that many do when it comes to cracking into or securing systems, but I do alright for myself... on securing systems, that is. I'm certainly not claiming in any way that I spend time engaged in any activity that could be construed as subversive or illegal... Dang! Awkward...
Of course, this is the situation one gets into when taking an interest in the "dark arts" of computing. People assume that you are claiming to be some sort of criminal mastermind or something when actually you are simply fascinated by the nature of how bad guys do things. Just as someone who likes to watch true crime documentaries on TV is not necessarily using them to plan their weekend, many people interested in "Black Hat" hacking are not looking to lead the next charge of Anonymous. So, if you had an interest in attending the recent Black Hat 2012 conference in Las Vegas, it was likely hard to make a strong connection between that and what you are paid to do. That's OK. Though the event is over, there is a reasonable archive of conference material on the web site, including papers, presentations and even some source code! (Use at your own risk.) There's not much in the way of video from the site right now, but a YouTube search brings up material-- though most of it is from Black Hat 2012 in Europe. I'm guessing, though, that techniques and vulnerabilities don't change much by crossing the ocean, so you can probably get a lot from them.
I'll keep my eyes open and try to report additional material as I find it.
IP Law Talk
The other day I was reading about a patent license agreement between a major software company and a minor company for an undisclosed amount regarding undisclosed patents. The story was non-news, unless you're into corporate celebrity, but the discussion had some interesting thoughts expressed. At least they tried to be interesting. They ultimately turned into the sort of juvenile brawl that such discussions do, because everyone is out to win. The part of the discussion that really caught my attention was why a company might not want to disclose their patents. Since Linux and open-source software frequently come under fire for allegedly violating patents, this is interesting to me. The conversation is often along these lines:
Patent holding company: The villainous developers of these open-source projects are stealing our IP and violating our patents and they must pay.
Open-source developers: Uhhh... we don't think we are.
Patent holding company: Oh, yes you are. In fact we have been striking numerous deals with people who agree that this is a violation.
Open-source developers: Wow, you really do seem to be making deals with people. Maybe there is something to this. What patents are we violating so that we can fix that?
Crickets: (chirp) (chirp) (chirp)
OK... that wasn't completely fair and read more like a Dilbert cartoon, but I hope you see the fun side of it. It seems to me that if my goal were to prevent people from infringing on my intellectual property, I would want to proclaim loudly and strongly what was being stolen from me so they could and would cease and desist. That doesn't seem to be the way it works out, for some reason. There are non-disclosure agreements (NDAs), behind-the-scenes business, announcements that are simultaneously widespread and secretive. It can be very confusing.
Well, it turns out that a new community has formed around trying to understand and relate to intellectual property law. It's your chance to ask your questions and voice your own experiences with people who deal with this every day. It's called IP Law Talk, and should be a fascinating place. I wonder if they know about this weird patent slide show.
Has the Command Line outstayed its welcome?
This is the question asked by a Linux Insider story. I'm going to apologize for being a little prejudiced here, but I just don't understand someone technical who wants to do everything with a mouse. Even when I'm supporting Windows I will jump into the command line because I can get information faster by typing "ipconfig /all" than I can browsing around with the mouse. I use icon-based launchers and I find them very handy. I recently talked about how I use them to keep my Firefox identities clear. However, there are some things that I can just flat-out do more efficiently using the command line. I can then combine those things into a script which I can place under an icon if I so desire. Macro recordings of mouse movements just don't seem to have the same capabilities.
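As a sketch of what I mean, here is the kind of tiny script I might tuck under an icon. The commands are standard Linux ones; adjust to taste:

```shell
#!/bin/sh
# Gather the network facts I usually want, in one shot.
# (Roughly the territory of "ipconfig /all" on Windows.)
ip addr show             # interfaces and their addresses
cat /etc/resolv.conf     # which name servers are in use
```

Point a launcher at the script and one click gives you what several minutes of mouse-browsing would.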
I know that many people get nervous about the command line. They don't type well. They don't have the commands memorized. It can be frustrating until you get used to it. But there is a heavy price for a graphical interface in system resources which could and should be used for other things if the interface is only rarely required.
I hope that you aren't afraid of the command line. If you'd like to explore it in Linux there's a nice tutorial as part of our Learn Linux 101 series. Windows folks can look at this site. You don't have to use it all the time (though I admit that I do). It's nice to have it around, though, for when the other tools aren't working. As an example, when I've had some program take over my graphical interface, it's nice to be able to switch to a command session to see what's happening and kill the offending processes. I've been able to use ssh from my phone to connect to my laptop when the keyboard wasn't responding and fix things without having to reboot. Is that geeky? You bet! But that skill comes in handy when you're dealing with bigger problems.
There has been some controversy about comments by Valve co-founder, Gabe Newell, calling Windows 8 a "catastrophe" and saying that Linux was part of Valve's future strategy. (Don't take my word for it. See the story by the BBC.) I admit that I haven't had as much time for games for a while, and when I do I am more likely to want to play a "human contact" game with dice and faces rather than having more computer time. However, it's no secret that Linux has been woefully thin in the gaming area. This is ironic, because I think that the tools and libraries available to Linux could make it an outstanding platform for media and gaming. It's just not where game creators focus.
Perhaps something like the Steam platform working more with Linux will make a difference. Of course, this is a future play. Steam has announced enthusiasm but not a release for Linux. It could get pretty interesting, though. While browsing through the gaming world I found that Steam is looking to Linux. Another site, Good Old Games, does not support Linux now, but might respond to interest, especially if it works well for Steam.
I did find a site, Desura, which already supports Linux. I downloaded a few of their free games to test and just might go for some of the paid titles as well. As entertainment becomes more network and browser based, the native platform should matter less and less. I'm interested to see what happens. If anyone is already using Desura and knows games I should check out, let me know!
The O'Reilly OSCON is done, but not forgotten. Did you make it to OSCON? If not, there is a page of videos which may give you some taste of what you missed. Additionally, David Mertz is a correspondent who has been our eyes on the ground in the past and has some interesting interviews to share. I expect the first soon and we'll share it with you as soon as we can.
Of course, we're always open to your own experiences. Take a moment to join the Real World Open Source community and provide your own observations in the blog. This isn't just mine, it belongs to all of us. I hope you'll contribute.
I have a number of tools that I use for keeping my blog running. Some have expressed curiosity, so later this week I'll write something that gives some detail about how I create and manage my blog, along with some suggestions to keep it all smooth. One of the blogging tools that I found, which pleased me greatly, was Scribefire. It's a plugin for Firefox that talks to your blogs and lets you edit the entries. It worked well, but for my primary blog it never seemed to retrieve the entries. How annoying!
It was able to get entries from other blogs, so I became convinced that there was something about titles in my blog that was causing the problem. I searched to no avail. Finally, inspired by what I do not know, I started trying to trim everything I could think of from existing titles. As it turned out, I had a few titles where I had used "#". Those were the problem entries. I changed each "#" to "No." and it all loaded very nicely. Hooray!
I'll cover how I use Scribefire along with some other tools in an entry, hopefully over the next day or so.
Cyborg allegedly attacked at French McDonald's
In the strange-but-true category we have the story of a man with cyber vision being attacked. Steve Mann, a scientist who has used a computerized vision system to help him see, claims in his blog that he was attacked by employees of a McDonald's in Paris. The interesting side of this is that his equipment, which is essentially screwed to his skull, captured images of the event as it occurred. If it happened as he says, it is a bizarre event, reminiscent of something from a future, science-fiction world where there is conflict between people who are "enhanced" with technology and those who aren't. The photo shown is from the blog and is said to show one of the men attempting to forcibly remove Mann's eyewear.
Is it possible that this is a hoax or a misunderstanding? I could easily be fooled by the information that I have. It's a strange tale. It's a cautionary one, though. Technology of all kinds is going to intrude more and more on our lives. Some of it will be invasive and dangerous to our privacy and our persons. Some will improve the quality of people's lives immeasurably. Still others will be somewhere in the middle, with the potential for harm, depending on how it is applied. In the sci-fi stories it is always a matter of distrust and fear that fuels the violence. In this case, it appears that the McDonald's staff was zealously trying to apply their "no photography" policy-- an idea that seems almost ridiculous when so many people have a camera with them all the time and regularly share themselves through social media. I'm guessing that they would not have objected to some high-school aged girls taking silly Facebook photos of each other over milkshakes.
Technology will intrude and some of it will be pretty weird. Both sides of that story need to understand that there is the potential for fear and conflict. We're going to have to be mature about it and find ways to deal with it.
I went with my daughter to a sporting event in Austin, Texas. The event was being broadcast and as I walked by the technical area I noticed that rather than the sort of giant video mixer that looks like something from the Death Star (which I thought actually was a video mixer but seems to be a steam plant) the director had an LCD screen and a little laptop which he used to manage everything. His system was a little magic box on a rack that connected all the cameras together and allowed him to mix live video using key presses. Wow!
Obviously, I can't qualify most of this. Several of these tools I've used, but others I need to explore. I also need to learn about hardware and who knows if I can actually afford any of it. (Cheap camera equipment to a Hollywood person is very different from a spouse's definition.) I've pointed Scott Lanningham to this and am curious if anything will jump out at him. The point to you is that technology marches on and that things that were impossible a while back are more possible now. If you find something interesting there, please share it here!
Since I do wear a number of hats in my world I find that I want to have degrees of separation between what I'm doing. Sometimes I just don't want those things to overlap. For example, I have a number of tools and things that I use internally at IBM as well as social networking and other things that are related to my job. I also have personal equivalents to all of those things. At some point I realized that I wanted to separate those things out a little more and realized that I could use Firefox profiles to do it very nicely. Since a profile can be invoked at start, I simply created new launchers that invoked the correct profile when starting Firefox. Next I created some distinctive icons to help me tell them apart. Here are a couple:
Chris's Firefox Icons
Regrettably my use of the IBM logo is unauthorized so I cannot show it to you. However, you can imagine an IBM logo floating in place of the black rectangle. These logos were done with GIMP, simply layering over the existing icons. I have a folder, $HOME/art/icons, where I keep such things. (I like to play with icons.) Next, I invoked Firefox with the -ProfileManager option. This brings up a little screen where I can create new profiles. I made one called "Test" which is set up to not record history or store cookies or anything. It is set as the default profile, so that when I invoke a page from another application I have some semblance of privacy. For my others, I invoke Firefox with the -P <profile name> and -no-remote parameters. The -P obviously selects the profile. The -no-remote overrides Firefox's default behavior to use an existing browser if one is open. I found that without it I might not get my chosen profile if I had another one opened.
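For the curious, one of those launchers boils down to a small desktop entry file. Here is a sketch; the profile name "Work" and the icon path are my own inventions, so substitute yours:

```
[Desktop Entry]
Type=Application
Name=Firefox (Work)
# -P picks the profile; -no-remote keeps it from reusing an open browser
Exec=firefox -P Work -no-remote
Icon=/home/chris/art/icons/firefox-work.png
Terminal=false
```

Drop a file like that in ~/.local/share/applications (or wherever your desktop environment looks for launchers) and each profile gets its own clickable identity.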
This allows me to do things like have multiple Twitter views open. It also helps keep my search histories and such grouped with like things. I use different themes for each one, so it's obvious which one I'm using. Admittedly, this is probably not for most people, but it is fun to play with. In a more practical setting it would make it easy to have the default Firefox be restricted in some ways for, say, a kiosk setting, but still be able to launch with links appropriate for administration without logging in as a different user.
What does John Cleese know about creativity?
Recently I was inspired by the creators of South Park. Yesterday I came across this video from 1994 where John Cleese talks to a group about creativity. There's some real gold in here. He recently gave an updated version to a group of filmmakers, but this older talk is surely still relevant.
Today I was looking over an interesting article called "Great open source map tools for Web developers". Look, I'm taking off my IBM hat and setting it way over there. Now it's just you and me with no major corporations to bother us or be represented in any way.
When you work in a company that does a lot of technology, with a lot of research, it's pretty difficult to deal with anything that isn't being explored somewhere. Some of these things become major projects and money centers, others are tinkered with and then set aside for the next interesting thing. So there can be a sort of love/hate relationship with companies like Google, who also do a lot of tinkering with various technology areas. One day a tool is OK to explore, the next day "Das ist verboten!" This can really be unfortunate when you have something that you're trying to do which has to either be thrown away or completely rethought. Yet when you're dealing with something like mapping, aren't you forced to go to someplace like Bing and Google who can afford to put satellites in the air and do massive data collection? Well... not necessarily.
As it turns out, governments and other public entities do massive data collection as well, and their information belongs to the public. It's true that some of it may not be as up-to-date or as rich as what a private entity can do, but it is freely available. And it turns out that there are a number of open source projects that are tapping these public data sources and finding interesting ways to build on them. If I were a betting man, I would say that your odds of being shut off from an open-source resource are lower than with "the competition". It's also remarkably easy to find things like weather data from the National Oceanic and Atmospheric Administration and community-supported street data (Say! I can see my office from here!). Here is a source for open data catalogs around the world, which pointed me to this specific list for Texas, among others.
Obviously, along with the data, there are APIs available as well. All of this can be mashed up in interesting ways. Here's an example where Open Street Map data was used to create an interactive map of countries. As you mouse over the countries you get the name of each country. Clicking a country takes you to data for that area. Obviously, one needs to consider the source of information, and there are situations where you need the accountability of a paid resource, but spend some time exploring these open data sources; there is plenty to mine. You may find a good fit for your needs, and supporting these efforts helps ensure that they continue.
If you are keeping score, we have a couple of new items in the Open Source and Linux sites today.
In Linux, "Accelerate to Green IT - A practical guide to application migration and re-hosting" has been a significant effort by a team inside of IBM to share their observations about server consolidation based on their real experience. This is good stuff, and the kind of information that only comes from experience. Their approach is to help you identify the "low-hanging fruit" for server consolidation and to have realistic expectations for where the complexity may lie. I think that server consolidation is a fascinating area. In some ways it makes me wish I was still involved in levels of system administration where I could give it a shot. This article will likely not answer all of your questions about what to expect, but it will get you thinking, and thinking is one of the keys to success.
In Open Source is a fun article, "Building Ruby extensions in C++ using Rice: Add new programming extensions in Ruby", an examination of how to use Rice (Ruby Interface for C++ Extensions) to combine the goodness of C++ with that of Ruby. It provides more ways to use the right tool for the job and allows you to tap into the power of C++ without having to do everything there. I like this sort of mashed-up approach to programming. Yes, it can add some complexity, but it's worthwhile if it keeps you in control of your performance or other critical elements. It's more tools for your toolbox.
Today I came across a slideshow about fifteen products that Google has killed. This is interesting to me. We often focus on successes or downfalls, but rarely on those day-to-day ideas that didn't make it all the way. I've heard over and over that success in any area is a matter of trying enough things until something sticks. Super success is continuing to try things once you've already succeeded.
I was a regular user of iGoogle, which they are now shutting down. (I'm playing with protopage now, if you are looking for alternatives.) Most of the technologies mentioned are things that I had heard of but never really did anything with, which is probably why they are no more. I find this sort of thing inspirational in a strange way. When big companies struggle with moving things forward, just like I do, it reminds me that the way is never paved.
Last night I was inspired from another strange place. I watched a documentary called "6 Days to Air" about the creative force behind the animated program South Park, and how they regularly crank out an episode in six days. I know that many of you may not be fans of the long-running series, which has a well-earned reputation for equal opportunity offending, but it really is amazing to see these people who have "made it" continue to work with the same enthusiasm as a startup to get things done. One of the ideas that they discussed was that the need to meet their deadline always drove them to complete rather than to polish to perfection. One comment was that they always felt like they needed another day or two, but even if they had it, it would really only result in a 5% improvement in quality.
That's a really interesting perspective. I'm not saying that quality doesn't matter, but sometimes I wonder if creators aren't the best judge of quality. Perhaps there is a point when we need to let our creations go and let the audience decide. Of course, when we're talking about things like technology, especially software, the feedback from an audience gets fed back into the creation and adds polish from real use, rather than anticipated use. Of course, this has always been the spirit of open source. Get things out and then let them grow.
[As an aside, the conference room the kids are using to write their school TV show in S8E11-Quest for Ratings, looks very much like the conference room the South Park team uses for their weekly creation process.]
I've continued working with Blender to do some headers for the developerWorks community. (You may have noticed that my header graphic here has changed.) I'm increasingly impressed with what this software can do. The other day I was talking to someone about how easily one could transform a slide presentation into a more exciting experience with Blender. One could simply use it to create charts with more pop. (What if you took a boring bar chart and made each column a real column, lit to highlight the item you were discussing?) Or, you could take elements of a presentation and add some pizzazz by flying from one chart to another, or including other elements. It had just never occurred to me before.
Obviously you won't be able to do this with just any presentation, unless you develop some specific skills to do it quickly or have support staff to help. But Blender has game-engine elements. That means that once you have some items created you can apply physics and external control to them. Perhaps that could be harnessed to take information from other sources and work with it automatically.
"I've done scientific visualizations in Blender before. I created a 3D globe a few months back that was UV-mapped so it would show up in game blender. I then used the python ODBC module (http://www.python.org/windows/win32/odbc.html) to access an ODBC-enabled database like MySQL to vertex-paint the globe according to the temperature at that spot.

"My current project is to use a DEM (Digital elevation map) of the coast of Oregon, USA, and show the weather in realtime using the game engine. So far I've been having trouble importing the DEM into blender. I've found programs that could convert the DEM into a DXF, but it costs...
May I say "Wow!"? How can you get started with something like this? Just go start! You'll probably need to deal with Python to talk to Blender. You will probably also need to deal with some sort of data conversion, coming from SQL or CSV. If you can get the channel made, though, you may have a new and interesting way to look at your data. If you make use of the game engine technology you could also embed it as part of your application's functionality. I don't know what I'm going to do with this, but I'm going to do something!
Some time back-- actually quite a while back-- I wrote a series of articles called the Windows to Linux roadmap. Now that I'm editor of the Linux site on developerWorks, I have to look at these things from a different perspective, and it is bittersweet to watch them age. Ubuntu, which is my primary environment now, wasn't around at that time. There are also tools that have come along to make management easier when, at the time, Webmin was really the only consistent tool I could find. (Webmin is still around, by the way, and I might still consider it if I were managing servers and needed to share management with people who didn't have a strong Linux background.)
One of the articles I was looking over today was the one on doing backups. In 2003 the backup landscape was pretty dismal, at least from where I could see it. Were I to write that article today I would have more tools to discuss, my favorite being rsync. Rsync was actually around when I wrote the articles, but it was one of those resources that lurked in the shadows, like so many little tools do. Essentially rsync is designed to do file duplication, but it tries to be as efficient as possible by only transferring the delta (the changes) in files when it can. It has a number of options and can be set up to do transfers through the network and over encrypted tunnels if desired. I wrote a little script that I run manually whenever I wish to do a backup... though I could run it automatically if I chose... and probably should.
This does a backup to my local USB drive and also does a dump to a network machine, through an encrypted tunnel. This device could be anywhere as long as I could access it over the network, and you'll notice that I am accessing it through an Internet address, so it works when I'm on the road as well. Note also that I'm doing key-based authentication in ssh.
The --exclude-from parameter lets me set up a file containing paths (with wild cards) that I do not want to back up. Things like the Trash, cache files, etc.
The first backup is a bear because it has to transfer all of the data. After that it's easier because it only addresses changes. Of course, one problem with this is that it doesn't take file deletions into account. rsync can do that, but I found that it defeated the purpose of the backup if I was trying to recover files that I'd deleted accidentally. So, I set up another script that I call cya-purge.sh that handles that sort of clean-up. I run it periodically, when I'm pretty sure that I don't have something I need to restore.
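cya-purge.sh amounts to the same transfer with one flag added. A sketch under the same assumptions (the wrapper and paths are again illustrative):

```shell
#!/bin/bash
# cya-purge.sh (sketch): mirror again, but with --delete, which removes
# files from the backup copy that no longer exist in the source.
# Run it only when you're sure nothing needs restoring.
cya_purge() {
    SRC="$1"
    DEST="$2"
    rsync -av --delete "$SRC" "$DEST"
}
```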
This second script is identical, except for the --delete parameter, which tells rsync to remove files that are no longer on my system.
I agree that my solution is somewhat inelegant, and probably more hands-on than many people would prefer their backup to be. However, at the time that's really what I was looking for and I still enjoy doing it this way. I have a lot of granular control over this and don't have to mess with interfaces or anything like that. It's simple.
Of course, my hairy-man approach to backups is not going to be to most people's taste. For them/you there is duplicity, an elegant tool that builds on rsync-style delta transfers and handles bundling files into smaller chunks suitable for storing on remote networks. It also manages the backup to keep files around for a period of time and then lets them leave gracefully... something that I would like to get my own scripts to do when I have time to wrap my brain around it. Duplicity is the default backup solution in Ubuntu, so if you have that turned on, you are using it!
My first experience with duplicity was not great. It spent a few hours doing a full backup of my user directory (gigs and gigs of data) and then deleted it when it was done. I never did figure out why. However, when I recently tried it again through the Ubuntu control panel it seemed to work fine. I would need to do some tinkering to see how best to emulate my current system of dual backups to a local and a remote device, but it might be worth the trouble. I was amused to see, when I looked at the settings to refresh my memory, that the automatic backup for today had already occurred, and that I did not notice. That's a good sign!
Of course, there are a number of backup solutions that have evolved over the last nine years or so since I penned-- or should I say keyboarded-- that article. Notable ones are Bacula, fwbackups and Amanda. At some point I may dig into them a little more, but in the meantime you will probably enjoy what you can do with rsync. I should point out that there are ways to use rsync in Windows as well. Take a look at this article if you want to explore that.
I was reading things through my Twitter feed the other day and came across this article by Steven J. Vaughan-Nichols discussing the choice that Google, Canonical and others have made not to use Linux in the names of their products. It's not going to turn your world upside down, and it's fairly trivial on some levels, but it is interesting. I use both Ubuntu and Android. I selected them because they have Linux as their foundation, but more specifically because, out of similar choices, they just did what I wanted the way I wanted it.
There is a good deal of discussion about the fact that there are so many Linux distributions. "There's too much choice!" In reality most of these offerings are distinctive in some way, and merely share their foundation of Linux and GNU tooling. So far Android has been very successful, and Ubuntu seems to be the Linux-based computing environment that most people recognize. Those are good things.
When I became editor of the Open Source site on developerWorks I was inundated with various databases and data frameworks and other similar pieces. Databases and such are fairly successful in the open-source world because that sort of work is a kind of voodoo to a lot of people. It largely runs behind the scenes, giving up data when I ask for it and hiding data away when I tell it to. It's an easy place to insert open-source software without upsetting people, because they don't necessarily deal with the moving parts anyway.
As time has passed I've seen a lot more in the NoSQL areas and with cloud, mobile and all the strange and unusual places we try to put software nowadays I can appreciate the need to know about as many alternatives as you can. As long as the data remains open, flexibility on how you interact with it is handy and can help you turn a bad situation into an innovative opportunity.
When I can buy a 2TB drive to sit on my desk for $99 do I really need to worry about drive tuning? I would say that makes it even more important! What a shame to have a big giant drive and then waste a good deal of it because the data isn't partitioned optimally. I'm still interested in learning more about different drive tuning techniques, especially since I run Linux, because I can mix and match some of that a little more than I might in other environments.
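One concrete example of the kind of tuning I mean: ext4 reserves 5% of a filesystem for root by default, which works out to roughly 100GB on a 2TB drive that holds nothing but data. A small sketch using tune2fs (the function wrapper is mine, and the device name is illustrative):

```shell
#!/bin/bash
# tune_reserve (sketch): shrink ext4's reserved-for-root space on a
# data partition from the default 5% to 1%, then show the new value.
# Pass a device such as /dev/sdb1 (illustrative; pick your own).
tune_reserve() {
    DEVICE="$1"
    tune2fs -m 1 "$DEVICE"                               # reserve 1%
    tune2fs -l "$DEVICE" | grep -i 'reserved block count'
}
```

On a dedicated data partition that change alone hands back tens of gigabytes that would otherwise sit idle.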
As some of you know, I've been playing around with Blender, the free, open-source 3D modeling, animation and compositing software. I'm still just a baby, but I'm slowly learning how to do interesting things with it. Today I wanted to design a logo for a community group I'm building, and I wanted to do something unusual. Tinkering with Blender, I found that one could import SVG files created in Inkscape and then manipulate them to have depth. I took some silhouettes that I was playing with and managed to create the following graphic. (Be sure to click on it and see it full sized.)
Admittedly, my picture won't win any kind of design awards, but it really shows what can be done by bringing things into a 3D environment and playing with light and such, rather than simply drawing. I'll be doing much more with this. Of course, once it's designed, it's easy to move the camera around to get different perspectives and even shoot some sort of video where you move through the picture.
Blender is just one of the coolest things. I'm making this image available under the Creative Commons (CC BY) license. Please feel free to use it as you wish; just please give me credit.
(QUICK ONES are bigger than a tweet, but not much!)
Two quick things to look at. (I need to get my demo edited for the pulseaudio thing I talked about the other day.) Today I looked around to see if Windows users could use ffmpeg to capture video the way I've discussed here, here, and here. The short answer is "No, you can't." Dang! It's really a cool technique. However, I did find this tool, which looks to perform a similar function. I'll need to fire it up in a Windows VM to check it out-- and that's just not going to happen today or even this week. Maybe someone else will have a chance to check it out and give us feedback.
Also, a retweet by @0xMF: "RT @BrentOzarPLF: Turns out that if you want to finish your work by 5:30 every day, you should visit this link: http://t.co/nwFeMeE9" I didn't have time to look at it in detail ( :-) ) but it looked interesting.
This is actually kind of cool, so I thought I'd share it.
PROBLEM: To create textures for the planes in my 3D bumper video I need to do screen captures of various developerWorks articles in my browser. This is easy. I can use GIMP to capture a section of the screen, but it's just a little slow and I need to get nine images from several different web pages. Must go faster.
SOLUTION: (at least for now) Because I'm selecting the web pages by hand, an automated spider wouldn't work for me, but I can still speed up the process. Here's what I did in Ubuntu Linux (12.04):
Set up the web browser and size it as I like. I will not move this window once I begin.
Bring up a web page.
Hit the PrtSc button and save the screenshot as a graphic. I'm naming the files to fit in groups to make my later processing easier.
Once I have captured the screens that I want, I need to crop them. I'll use GIMP to figure out the parameters, but automate the process.
I make note of the Position and Size parameters, because I'm going to need this for my automation. Since I never moved the browser window and I always did a full screenshot each article will be in the same position in each saved image.
Now, I apply the magic, ImageMagick to be precise. I use the convert command to process all of these images at once:
ls screenshots|xargs -iFILE convert screenshots/FILE -crop 935x894+346+172 ./cropped-FILE
I've pulled all of my images into the screenshots directory. I want to preserve them because I always try to preserve originals until I'm sure they won't be needed. I'm using ls to get all the file names and piping them through xargs, an incredibly useful tool for passing things from one tool to another. convert, a component of ImageMagick, crops each file, based on the parameters we got from GIMP. All files are cropped perfectly and identically in seconds.
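One caveat: piping ls into xargs trips over file names that contain spaces. For that case, a plain glob loop does the same batch crop safely. The geometry is the one taken from GIMP above; the function wrapper and the cropped/ output directory are my own choices for the sketch.

```shell
#!/bin/bash
# crop_screens (sketch): crop every image in screenshots/ with
# ImageMagick's convert, writing results to cropped/ so the
# originals are preserved. Geometry is WIDTHxHEIGHT+XOFF+YOFF.
crop_screens() {
    mkdir -p cropped
    for f in screenshots/*; do
        [ -f "$f" ] || continue
        convert "$f" -crop 935x894+346+172 "cropped/$(basename "$f")"
    done
}
```

Run crop_screens from the directory that contains screenshots/ and collect the results from cropped/.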
There is probably a tool somewhere that I can use to download web pages and convert them directly into images. That could save some time, except that I'm still picking the pages by hand to make sure that they are representative of the topic. If I ever do something that spiders through a page I might do it differently. For now this was pretty handy.
I've had a bit of an R & D week. Yesterday I recorded an audio interview for developerWorks which needs to be edited, reviewed and transcribed, so it will be a while before you get to hear it, but it's fun stuff. In order to do that I needed to tinker with my audio setup. I found out how to rearrange pulseaudio in Linux to let me do recordings from a Voice Over Internet Protocol (VOIP) phone. I followed directions from a few folks, but still ran into some little fiddly problems that confused me. I recorded my own video demo for that, which I will provide here when I get it edited down a little. This is going to make a big difference for me, and also opens the door to recording video demos in an interesting way. Look for that here soon. I recorded the demo using my own instant-demo script, which begins recording my desktop and microphone using ffmpeg. Here it is, in case you find it useful:
#!/bin/bash
# Count down so I have time to get the demo window ready
COUNT=5
echo "Capture starting in..."
while [ $COUNT -gt 0 ]; do
    echo $COUNT
    let COUNT=$COUNT-1
    sleep 1
done
# Capture: record the ALSA default microphone and a 1024x768 region of
# the screen offset by 520,200 (the output filename is just an example)
ffmpeg -f alsa -i default -f x11grab -s 1024x768 -i :0.0+520,200 demo.mkv
The -s 1024x768 -i :0.0+520,200 parameters tell ffmpeg to record a 1024x768 section of my screen and to offset it by 520,200. This corresponds to the little rectangle I've drawn on my wallpaper as "demo space". That way I can visually see the constraints of the recording.
Yesterday I had some time set aside with our main media guy, Scott Laningham. He had to bump our session to next week, so I had the prospect of either doing some tedious busy-work or keeping in my R & D frame of mind. I decided to stick with the R & D and worked with Blender, an open-source 3D modeling and compositing program, to start creating a video bumper for things that I create on developerWorks. Blender makes it pretty easy to create flying logo animations once you have your environment built. (That is, of course, the tricky part.) Essentially you build the environment and then you send a virtual camera through it, pointing it where you want. There is lighting and all kinds of interesting elements to work with, but it's a fairly specific skill.
By the end of my time yesterday I had an environment with some 3D letters for developerWorks and a plane suspended in space showing a developerWorks article on it. I could fly the camera around, and I rendered an 18-second test. I need to tinker more with the materials on my lettering (I want it to look like hard plastic) and I need to replicate the plane with many different articles from developerWorks to make a maze of them to fly through. (Sounds kind of like actually using developerWorks, doesn't it?)
This is why I love open-source software. I didn't need to take any classes or convince someone to let me spend budget to explore these technologies. They are simply waiting for me. A media person I know does the same thing I did with pulseaudio using $700 worth of hardware. Granted, some of what I'm doing requires my level of technical enthusiasm and persistence, and there are times when the commercial answer is better suited. But if you are a techie and you want to play, there's nothing stopping you. You just need to open the door.
Look for my pulseaudio demo soon. Oh! I forgot to tell you that Scott and I did a quick video discussion about some open-source thinking last week. Here it is if you're curious: