A friend of mine, Neil Gilmore, a talented developer, taught me the phrase "fully buzzword compliant". It's like something in a Dilbert cartoon, where jargon takes on a life of its own. I use it often.
Terms like REST and RESTful come up regularly, but I rarely see them given enough context to be understood by someone who doesn't already know what they mean. That's why I was thrilled to see "Understand Representational State Transfer (REST) in Ruby" by M. Tim Jones. The term is even described in the title! If you are working with Ruby I think you'll find this a useful read. Even if you don't use Ruby I think there is information that will help you become more fully buzzword compliant, and better able to consider the value of the REST architecture for your own projects.
We all eat... at least I'm pretty sure that we do. In the United States we have a relationship with food. Restaurants are a big part of Austin culture, and the culture of many cities. "Armchair chefs" have begun to rival "armchair quarterbacks" as people watch the plethora of cooking shows and networks on television. So, it's no wonder that the auditorium for this forum was nearly full (there were easily two hundred people).
The speakers were Addie Broyles and Michael Chu. Addie is a food writer and blogger for the Austin American-Statesman. (Read her blog, "Relish Austin".) She talked about the ways in which technology is changing how we interact with and communicate about food. I mentioned the cooking shows. There are also hundreds of food blogs where people share restaurant experiences, dietary thoughts, nutrition discoveries and personal cooking adventures. Not so long ago there was a designated professional food critic or two who were the official voices of taste. Now, in addition to the bloggers, we have social media, Yelp and other sites where anyone and everyone can publicly share their praise and disdain for their dining experience, and restaurants can publicly respond and react. It's not just dining out that gets treatment though. People share personal recipes, techniques and nutritional ideas. People with food allergies or conditions that require special diets are able to share their discoveries for enjoying food with restrictions.
Of course, technology doesn't just bring us commentary. There are sites and apps devoted to different things that you may need to do with food. Sites like livestrong.com allow you to track your diet and learn more about the nutritional details of food you eat. (I like everything about that site except for their serious deficiency of not having an Android version of their app. They seem to be pointedly supporting everything but Android. Come on guys! Maybe you need IBM Worklight... but I digress.) There are also sites that will let you enter the food that you have and help you come up with recipes that you can make (e.g. supercook.com and myfridgefood.com). One site, eatyourbooks, will let you enter the titles of the cookbooks that you own and it will help you find recipes for available ingredients.
There is even an app which will estimate calories based on a picture of your food. It's currently only available for iPhone, but I'm sure there will be others. Very interesting stuff!
Next it was Michael Chu's turn. He is an engineer and the author of Cooking for Engineers. He was demonstrating a technique called Sous Vide, which is borrowed from a laboratory technique for accurately and evenly heating substances by using a temperature-controlled water bath. (Read Michael's introduction to the concept on his blog.) Michael is passionate about cooking and enjoys a good steak (a man after my own heart). It was clear that many of the attendees were unprepared for Michael's engineering view of cooking, but I was fascinated with how his knowledge of the process could help one predictably and consistently create the perfect 65ºC boiled egg. Words really fail me on explaining what we got from Michael's presentation. You'll just need to wait for the video to fully appreciate what he did.
It did get me thinking about the ways in which science, engineering and food overlap. The Sous Vide process that he demonstrated uses a tub filled with water. The substance that you want to heat is packaged in some sort of container (except for eggs, which have their own container). A heating unit then gently warms the water until it reaches a designated temperature and holds it there for as long as desired. It is not possible to overheat food in this manner because the temperature cannot go over what you've specified. However, the physics of certain foods causes them to do certain things when heated for an extended period of time. (Michael seemed especially intrigued by the physics of the perfect egg.) Home use of this technique is pretty uncommon, but it is becoming more widely used by restaurants. They can, for example, prepare a container of steaks to a perfect medium-rare temperature and hold them there for hours. Then, when one is ordered it is seared on the grill to give it the final touch. The result would be consistently perfect steaks.
Of course, we already enjoy a number of scientific breakthroughs in our kitchen. Our basic stoves and ovens, refrigerators and such are obvious examples. Some of us still remember what it was like to cook without a microwave oven. Much of the science in food happens behind the scenes though, in the growth, preparation, and transportation of food before it gets to you. Some of this is controversial, and rightly so. We literally are what we eat. It's good to learn more about what is done with your food and what you can do yourself to keep it healthy and tasty.
I have no idea if you have heard about this. Sometimes there are things that I think are widespread news that others have never seen. (Of course, people are shocked when I don't know who won American Idol.) Recently, technical writer Matt Honan was hacked, hard. The attackers destroyed all of the data on his laptop, his iPad and his cloud storage, apparently just as a stepping stone to playing around with his Twitter account. The attack took advantage of his doing what we all do, having some alignment between our accounts in different places, and exploited the differences between the organizations' policies to get inside. Once they are in one, it's easier to get into the others. It's similar to the ideas in this World War II cartoon about keeping secrecy. (WARNING: This video is a reflection of its time and contains some caricatures which are inflammatory and frankly racist. I show it for its historical context and the lesson it discusses about how people can piece together bits of information. Not only does this video not express IBM's opinions, it doesn't even express my own... but it does show how long these ideas of security have been around.)
Matt's story is unsettling. It is regrettable on so many levels. I imagine that the companies involved especially regret that it happened to a technical writer.
So, what does it mean to you and me? It means that it's time to get serious about security. We have to get serious about our own security because if something slips it is our memories and our creations that are lost forever. That's just too hard to consider.
Security = inconvenience
The first thing we need to accept is that any level of security demands a certain level of inconvenience. I'm not talking about the security theatre that we experience at the airport. I'm talking about things like having to type in a password every time you want to use your computer. I'm talking about having to change your security codes periodically and making them long and complex. These things are requirements for modern security. Just like you have to take time to unlock your door and maybe disarm the security system, you are going to have to take a few extra steps.
The first step that I have taken is to make all of my passwords a significant length. I've set them to 25 characters for all that can accept it. Anything that doesn't allow 25 I set to the maximum it will accept. I'm using a mixture of upper- and lower-case letters with numbers and special characters. I am enforcing my own policy of changing these at least every three months. I have made all of my passwords completely different from each other. This is a huge pain... but until there is some sort of biometric standard that will apply just to me, I have no choice.
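By the way, you don't have to invent passwords like that yourself. On Linux, a one-liner along these lines will do it (a minimal sketch; adjust the length and character set to whatever each site will accept):

tr -dc 'A-Za-z0-9!@#$%^&*()_=+-' < /dev/urandom | head -c 25; echo

Paste the result straight into your password manager and you never have to memorize it.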
Crank up the security
Do you have all of the verification policies turned on that are available? Do you grumble when someone asks to see your ID? Take a look at the options available to you and see what else you can do. For example, Google has a 2-step verification which authorizes access by device. When you use a browser, or certain other apps, Google will send a numeric code to you by phone. This code must be entered or you may not access your Google tools through that device. For things which cannot use this process Google creates an application-specific password for that application and device.
On my account, I had to set up my long password on my Google account and verify it in the browser, and separately in the browser on my phone. I also had to enter a separate password for the GMail app on my phone, Thunderbird on my laptop and my instant messaging software. I only need to do this once, but I'll have to recycle them later on when I do my password revisions.
I need to review my options on Facebook. For now, at least, I have significant passwords there. Of course, using truly secure passwords has caused me to need a password manager. I'm using KeePass because it is available in Linux and Android and I have a way to share the database between devices. My database encryption password is also significant (20+ characters), and something I have to remember and type in each time I need to access the passwords. It will also need to change periodically, which will be a pain. Right now, though, I'm betting that I have less chance of someone hacking my password manager database than I do a company accidentally dumping my information over the Internet or allowing themselves to be socially engineered into compromising my account.
Could we do better?
We could absolutely do better in our security! The standards and tools for doing good security are available. In many cases, regular application of what is freely available could make a difference. Key-based authentication with a biometric as the password would allow me to control my keys, have different keys for different purposes and never have to remember anything. The protocols for key exchange already exist. It could work.
It's not going to happen that way, though. There are too many people who don't want to understand these things and don't want to be bothered. Companies and governments do ultimately do what they are directed to do-- but often in a "malicious genie" sort of way. "OK. You wish for a mountain of gold, which falls from the sky and buries you."
We need to be more demanding about the protection of our accounts and identities. We need to be more tolerant of the process required to verify our identities and we need to be willing to actively participate in the process. I'm guessing that overall there is more money to be made by everyone for fraud than there is for security which works... which is a real shame.
I hope you'll consider what happened to Matt, and what you would do if it happened to you. Now... how are you going to prevent it, and how are you going to teach the others?
On Sunday night, I joined a number of space exploration enthusiasts at a Landing Party to watch the landing of Curiosity, the newest Mars rover. It was an incredible event. Here is some video of my immediate reaction after the party. Bear in mind that it is very late, I'm out on the street and I'm pretty tired by now. It's raw-gritty reporting that puts you there! I would have had this up Monday or Tuesday, but I had to fiddle with the video a little... and I was pretty out of it on Monday and not able to multi-task as well as I do on other days.
First, let me congratulate NASA and all involved. It was an inspiring landing where everything appeared to work perfectly. Watching it in a room full of people who cared made it even better. Every stage was cheered enthusiastically. It was wonderful to behold.
In the video, I mention a couple of applications. First was Uniview, which is a commercial application that was used to show us an impressive 3D rendering of our solar system and beyond as the presenter related it all to the Mars mission. However, he also pointed out Partiview, which he said was a similar application, freely available as open source. It's multiplatform and I am downloading it now. I'll report the results.
I believe that space exploration is important. It drives us to solve problems and gives us places to reach when our own world seems a little inhospitable. Science fiction becomes science fact as people find ways to make their social and technological dreams come true. We will never stop reaching for the stars. If governments decide to get out (which might not be a bad idea on some levels) people will make it happen.
Hacking my DNS
A while back I was feeling frustrated about my home network. Everything just seemed sluggish, but when I would do various speed tests it didn't really seem to be so bad. What was going on? After poking around for a while, I observed that my slow-down seemed to be related to domain name resolution. If you already know about this stuff you can skip the explanation.
Quick explanation of DNS
In a TCP/IP (Transmission Control Protocol/Internet Protocol) network, which is what we use on the Internet, everything is done by the numbers. Ultimately, your network card wants to talk to another network card somewhere else. That's what your MAC (Media Access Control) address is. It's a unique identifier of your network card. Of course, having an index of all of those devices is cumbersome, so a system of cataloging them was devised. That's where the TCP/IP address comes in, the x.x.x.x number that is assigned to you on a network. However, telling you to visit my web page at 22.214.171.124 is probably not going to be easy to deal with. So, a concept was devised where names could be given to the various networks and a lookup occurs to point you to the final destination. That is known as the DNS (Domain Name System). I'm going pretty quickly here. If you really want to understand you should read more about TCP/IP and DNS, but here's essentially how it works:
You connect to a network. You get your own IP address (x.x.x.x) which points to your network card's MAC address. You usually don't care what your MAC address is unless you are doing some serious troubleshooting. You sometimes need to know your IP address.
You are pointed to a gateway, an IP address which will be the central point of communication for everything coming from your computer.
You are given a DNS server which will translate names (like ibm.com) into IP addresses.
When you look up a name, your system will give the name to the DNS and receive the IP address. Then the IP address will be contacted to complete the connection. If you can't look up names, your system may seem like it can't talk to the Internet.
If this name lookup process is slow, it will delay every network connection that you access through a name.
Once I noticed that my name resolution seemed to be a bottleneck, I started digging around. I think that the DNS servers for ISPs are typically pretty overloaded. If I can bypass those, then I can perhaps get a faster lookup and faster networking overall. In Linux, there is a utility called dig. It performs name lookups with some feedback about the process. By default, it will use your network's name server, but you can designate a name server as well. I found a list of public name servers and played with them through dig. You can see some examples below.
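For example (illustrative commands; the "Query time" line in dig's output is the number to compare across servers):

dig ibm.com              # uses the name server my network handed me
dig @8.8.8.8 ibm.com     # the same lookup, aimed at Google's public server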
Ultimately, I decided that I liked the Google server, 8.8.8.8, because it was easy to remember. All of them provided some improvements. So, I went to my home router and told it to use the Google name servers rather than the default. Voila! All machines connected to my network automatically go to the other servers to look up names. This has made a vast improvement in my networking latency. Isn't that interesting?
If I'm in another network and want to do the same thing, then I can adjust the network settings to include my own choices. That will vary with each operating system. On Linux, I simply edit a file called /etc/resolv.conf. Here's what it looks like:
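nameserver 8.8.8.8
nameserver 8.8.4.4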
8.8.4.4 is the secondary server.
What about the phone?
So, after I had done this for a while, I started wondering about the network on my phone. I have a 4G phone, but it just seemed to lose its mind from time to time. Again, the issues seemed to be related to finding things more than connecting to them. Could I do the same thing?
I did some digging, and since Android is based on Linux, there were similar underpinnings. However, these only seemed to work for the Wi-Fi network, not the 4G/3G. Drat! I rooted my phone some time ago, so I had access to the settings, but I just couldn't find anything useful. Then I found out that there are apps that will help out with this. The one I settled on is "Set DNS" by Steve Hanlon. I tried the free version for a while and then bought the pro version for less than $3. (I like to support independent developers when I can, so I donate to open-source projects and buy pro versions of phone apps that I like.) It has worked exactly as I hoped. Suddenly, some of the sites I had trouble with getting lost started working very efficiently and I have noticed a decided difference in my network stability.
Perhaps later on I'll find the guts for this and be able to do it without a helper app, though I'm satisfied with the solution.
If you are having sluggish access to the Internet, maybe a change to your name server will help. Feel free to post a comment with a question and I'll help if I can.
Technology has changed an awful lot since I first got started. Go ahead and laugh, kids. Some day you will also scratch your head in amazement at how far things have come and how different it all is. You can either get mad or you can get busy, is the way I see it. I am thrilled by new technology... even the things that I don't do so well. It always gives me something new and interesting to explore. Increasingly it lets me get into things that I always wanted to do, but did not follow the correct path to have those toys (e.g. animation, filmmaking). As technology moves forward I'm finding that some of my childhood fantasies are in reach enough for me to play without having to make the massive career jump it would have once required. I can make perfectly bad movies with my laptop and commodity equipment and don't need to go starve somewhere to make it happen. That's progress!
Of course, there are those who have trouble with these kinds of changes. We all have inconvenience and frustration with moving forward, but some people are really dug in, and if they aren't careful the Earth will crumble beneath their feet. I was reading an article called "The 9 most endangered species in the IT workforce". I won't go through all of the categories that Dan Tynan set forth. You can read for yourself. I was intrigued by the general trend of his comments and the impressions they left with me.
Overall the technological landscape is becoming more mobile, more flexible and more chaotic. If you are safely housed in your fortress of policy and certification then you will find yourself becoming less and less significant as people respond to your obstacles to their work by simply going around you. Several of the scenarios that he listed had to do with maintaining a "my way or the highway" attitude and attempting to rule others by simply being smarter than everyone else. The truth is that people have access to all kinds of resources and opinions outside of your domain. They can use their own ingenuity to try a number of failed, but progressing, approaches using their own devices, open-source software and the Internet in the space of time that it takes to do one round of feasibility study and hierarchical approvals. In other words, the rabble can rise up and get things done, even without the smart guy.
So, rather than being an obstacle, you should be a contributor. Yes, some people will want things that have unintended consequences and you still need to look out for disaster. But if you are working to be an ally rather than a gateway then you can get them to do some of the leg work and bring it up to the point where you can do the final tweaks to work the miracles. That makes you Merlin, not Moriarty.
The other area that was covered was the idea of specialization. Big iron computing will always have a place, but it's no longer the only way to get things done. Likewise, all of your precious certifications and other skill sets may not be relevant to the problem at hand. Are you really leading people into the best solutions, or are you bending the problem into your space? There will always be space for skilled people who can solve problems. Those characteristics are action-oriented, however. I love this quote from the article:
"The days when you could slap some Cisco or Microsoft certifications onto your résumé and write your own ticket are long over, says Lenny Fuchs, owner of My IT Department, which provides contract tech services to small businesses.
"'Without the work experience to back it up, certifications are almost useless,' he says. Fuchs adds he gets a kick out of seeing résumés that read 'John Doe, MCTS, CCA, CTSGIT, MCITP, CCNA, MCP. Last held position: Assistant manager at Starbucks.'"
What have you done for us lately? Certifications are great, but they are no longer the only avenue to getting things done. Many people can now buy a book on Amazon, learn a technology and get to work without ever entering the classroom of secret knowledge. Sometimes you don't even need to do that. Just type your error message into Google and see the troubleshooting discussion in a myriad of support groups.
"That's not fair," I hear you cry! Those people are just fixing a sympton without a full understanding of the system. "They could cause great damage with their dabbling." This is true. But they would rather take that risk than go through the old-school time and expense to have it done "correctly". The message is that a technological survivor is someone who is excited about technology and eager to solve problems, even when they go against conventional wisdom of the past. A survivor is more about finding solutions than blocking ideas and applies their old expertise to the new ideas to save others the pain of learning the "hard way". It's OK if you prefer to step back and watch the mess, waiting for people to come begging back to you. Just be prepared for the day when they evolve and you lose your place in the ecosystem.
Personally, I think there is a lot of excitement in technology. I am mobile and get to apply myself to things that are interesting, helping a lot of people break new ground for themselves. I don't worry that my place will go away, because there is always something waiting for the truly curious. Perhaps that's really the true quality of keeping yourself relevant as technology marches on. Don't lose your curiosity. Don't let your fear of the unknown trap you in a changing habitat. Explore, share and be part of those who are exploring new ground.
Computer security fascinates me. I freely admit that I don't have the chops that many do about cracking into or securing systems, but I do alright for myself... on securing systems, that is. I'm certainly not claiming in any way that I spend time engaged in any activity that could be construed as subversive or illegal... Dang! Awkward...
Of course, this is the situation one gets into when taking an interest in the "dark arts" of computing. People assume that you are claiming to be some sort of criminal mastermind or something when actually you are simply fascinated by the nature of how bad guys do things. Just as someone who likes to watch true crime documentaries on TV is not necessarily using it to plan their weekend, many people interested in "Black Hat" hacking are not looking to lead the next charge of Anonymous. So, it is likely that if you had an interest in attending the recent Black Hat 2012 conference in Las Vegas it was hard to make a strong connection between that and what you are paid to do. That's OK. Though the event is over, there is a reasonable archive of conference material on the web site, including papers, presentations and even some source code! (Use at your own risk.) There's not much in the way of video from the site right now, but a YouTube search brings up material-- though most of it is from Black Hat 2012 in Europe. I'm guessing, though, that techniques and vulnerabilities don't change much by crossing the ocean, so you can probably get a lot from them.
I'll keep my eyes open and try to report additional material as I find it.
IP Law Talk
The other day I was reading about a patent license agreement between a major software company and a minor company for an undisclosed amount regarding undisclosed patents. The story was non-news, unless you're into corporate celebrity, but the discussion had some interesting thoughts expressed. At least they tried to be interesting. They ultimately turned into the sort of juvenile brawl that such discussions do because everyone is out to win. The part of the discussion that really caught my attention was why a company might not want to disclose their patents. Since Linux and Open Source software frequently come under fire for allegedly violating patents this is interesting to me. The conversation is often along these lines:
Patent holding company: The villainous developers of these open-source projects are stealing our IP and violating our patents and they must pay.
Open source developers: Uhhh... we don't think we are.
Patent holding company: Oh, yes you are. In fact we have been striking numerous deals with people who agree that this is a violation.
Open source developers: Wow, you really do seem to be making deals with people. Maybe there is something to this. What patents are we violating so that we can fix that?
Crickets: (chirp) (chirp) (chirp)
OK... that wasn't completely fair and read more like a Dilbert cartoon, but I hope you see the fun side of it. It seems to me that if my goal was to prevent people from infringing on my intellectual property, I would want to proclaim loudly and strongly what was being stolen from me so they could and would cease and desist. That doesn't seem to be the way that it works out, for some reason. There are non-disclosure agreements (NDAs), behind-the-scenes business, announcements that are simultaneously widespread and secretive. It can be very confusing.
Well, it turns out that a new community has formed on trying to understand and relate to Intellectual Property Law. It's your chance to ask your questions and voice your own experiences with people who deal with this every day. It's called IP Law Talk, and should be a fascinating place. I wonder if they know about this weird patent slide show.
Has the Command Line outstayed its welcome?
This is the question asked by a Linux Insider story. I'm going to apologize for being a little prejudiced here, but I just don't understand someone who is technical who wants to do everything with a mouse. Even when I'm supporting Windows I will jump into the command line to get information because I can get information faster by typing "ipconfig /all" than I can browsing around with the mouse. I use icon-based launchers and I find them very handy. I recently talked about how I use them to keep my Firefox identities clear. However, there are some things that I can just flat do more efficiently using the command line. I can then combine those things into a script which I can place under an icon if I so desire. Macro recordings of mouse movements just don't seem to have the same capabilities.
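To make that concrete, here's the sort of trivial script I mean (the commands are just illustrations; use whatever you actually care about). Save it, mark it executable and wire it to a launcher icon:

#!/bin/bash
# Show the network details I usually want, then hold the window open.
ifconfig -a
route -n
cat /etc/resolv.conf
read -p "Press Enter to close..."

One click and I have everything a few minutes of mousing would have given me.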
I know that many people get nervous about the command line. They don't type well. They don't have the commands memorized. It can be frustrating until you get used to it. But there is a heavy price for a graphical interface in system resources which could and should be used for other things if the interface is only rarely required.
I hope that you aren't afraid of the command line. If you'd like to explore it in Linux there's a nice tutorial as part of our Learn Linux 101 series. Windows folks can look at this site. You don't have to use it all the time (though I admit that I do). It's nice to have it around, though, for when the other tools aren't working. As an example, when I've had some program take over my graphical interface, it's nice to be able to switch to a command session to see what's happening and kill the offending processes. I've been able to use ssh from my phone to connect to my laptop when the keyboard wasn't responding and fix things without having to reboot. Is that geeky? You bet! But that skill comes in handy when you're dealing with bigger problems.
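That rescue, by the way, boils down to a couple of commands (the hostname and PID here are made up for illustration):

ssh chris@mylaptop.local
ps aux | grep -i firefox    # find whatever has taken over the display
kill -9 12345               # the PID from the listing above

No reboot, no lost work.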
There has been some controversy about comments by Valve co-founder, Gabe Newell, calling Windows 8 a "catastrophe" and saying that Linux was part of Valve's future strategy. (Don't take my word for it. See the story by the BBC.) I admit that I haven't had as much time for games for a while, and when I do I am more likely to want to play a "human contact" game with dice and faces rather than having more computer time. However, it's no secret that Linux has been woefully thin in the gaming area. This is ironic, because I think that the tools and libraries available to Linux could make it an outstanding platform for media and gaming. It's just not where game creators focus.
Perhaps something like the Steam platform working more with Linux will make a difference. Of course, this is a future play; Steam has announced enthusiasm but not a release for Linux. It could get pretty interesting, though. Another site, Good Old Games, does not support Linux now, but might respond to interest, especially if things work well for Steam.
I did find a site, Desura, which already supports Linux. I downloaded a few of their free games to test and just might go for some of the paid titles as well. As entertainment becomes more network- and browser-based the native platform should matter less and less. I'm interested to see what has happened. If anyone is already using Desura and knows games I should check out, let me know!
The O'Reilly OSCON is done, but not forgotten. Did you make it to OSCON? If not, there is a page of videos which may give you some taste of what you missed. Additionally, David Mertz is a correspondent who has been our eyes on the ground in the past and has some interesting interviews to share. I expect the first soon and we'll share it with you as soon as we can.
Of course, we're always open to your own experiences. Take a moment to join the Real World Open Source community and provide your own observations in the blog. This isn't just mine, it belongs to all of us. I hope you'll contribute.
I have a number of tools that I use for keeping my blog running. Some have expressed curiosity, so later this week I'll write something that gives some detail about how I create and manage my blog, along with some suggestions to keep it all smooth. One of the blogging tools that I found, which pleased me greatly, was Scribefire. It's a plugin for Firefox that talks to your blogs and lets you edit the entries. It worked well, but for my primary blog it never seemed to retrieve the entries. How annoying!
It was able to get entries from other blogs, so I became convinced that there was something about titles in my blog that was causing the problem. I searched to no avail. Finally, inspired by what I do not know, I started trying to trim everything I could think of from existing titles. As it turned out, I had a few titles where I had used "#". Those were the problem entries. I changed each "#" to "No." and it all loaded very nicely. Hooray!
I'll cover how I use Scribefire along with some other tools in an entry, hopefully over the next day or so.
Cyborg allegedly attacked at French McDonald's
In the strange, but true category we have the story of a man with cyborg vision being attacked. Steve Mann, a scientist who has used a computerized vision system to help him see, claims in his blog that he was attacked by employees of a McDonald's in Paris. The interesting side of this is that his equipment, which is essentially screwed to his skull, captured images of the event as it occurred. If it happened as he says, it is a bizarre event, reminiscent of something from a future, science-fiction world where there is conflict between people who are "enhanced" with technology and those who aren't. The photo shown is from the blog and is said to show one of the men attempting to forcibly remove Mann's eyewear.
Is it possible that this is a hoax or a misunderstanding? I could easily be fooled by the information that I have. It's a strange tale. It's a cautionary one, though. Technology of all kinds is going to intrude more and more on our lives. Some of it will be invasive and dangerous to our privacy and our persons. Other technology will improve the quality of people's lives immeasurably. Still more will be somewhere in the middle, with the potential for harm, depending on how it is applied. In the sci-fi stories it is always a matter of distrust and fear that fuels the violence. In this case, it appears that the McDonald's staff was zealously trying to apply their "no photography" policy-- an idea that seems almost ridiculous when so many people have a camera with them all the time and regularly share themselves through social media. I'm guessing that they would not have objected to some high-school aged girls taking silly Facebook photos of each other over milkshakes.
Technology will intrude and some of it will be pretty weird. Both sides of that story need to understand that there is the potential for fear and conflict. We're going to have to be mature about it and find ways to deal with it.
I went with my daughter to a sporting event in Austin, Texas. The event was being broadcast and as I walked by the technical area I noticed that rather than the sort of giant video mixer that looks like something from the Death Star (which I thought actually was a video mixer but seems to be a steam plant) the director had an LCD screen and a little laptop which he used to manage everything. His system was a little magic box on a rack that connected all the cameras together and allowed him to mix live video using key presses. Wow!
Obviously, I can't vouch for most of this. Several of these tools I've used, but others I need to explore. I also need to learn about hardware, and who knows if I can actually afford any of it. (Cheap camera equipment to a Hollywood person is very different from a spouse's definition.) I've pointed Scott Laningham to this and am curious if anything will jump out at him. The point for you is that technology marches on and that things that were impossible a while back are more possible now. If you find something interesting there, please share it here!
Since I do wear a number of hats in my world I find that I want to have degrees of separation between what I'm doing. Sometimes I just don't want those things to overlap. For example, I have a number of tools and things that I use internally at IBM as well as social networking and other things that are related to my job. I also have personal equivalents to all of those things. At some point I realized that I wanted to separate those things out a little more and realized that I could use Firefox profiles to do it very nicely. Since a profile can be invoked at start, I simply created new launchers that invoked the correct profile when starting Firefox. Next I created some distinctive icons to help me tell them apart. Here are a couple:
Chris's Firefox Icons
Regrettably my use of the IBM logo is unauthorized so I cannot show it to you. However, you can imagine an IBM logo floating in place of the black rectangle. These logos were done with GIMP, simply layering over the existing icons. I have a folder, $HOME/art/icons, where I keep such things. (I like to play with icons.) Next, I invoked Firefox with the -ProfileManager option. This brings up a little screen where I can create new profiles. I made one called "Test" which is set up to not record history or store cookies or anything. It is set as the default profile, so that when I invoke a page from another application I have some semblance of privacy. For my others, I invoke Firefox with the -P <profile name> and -no-remote parameters. The -P obviously selects the profile. The -no-remote overrides Firefox's default behavior to use an existing browser if open. I found that without it I might not get my chosen profile if I had another one opened.
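The launcher command lines end up looking like this ("Work" is just an example profile name, not one of mine):

firefox -ProfileManager        # create and manage profiles
firefox -P Work -no-remote &   # open the Work profile in its own instance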
This allows me to do things like have multiple Twitter views open. It also helps keep my search histories and such grouped with like things. I use different themes for each one, so it's obvious which one I'm using. Admittedly, this is probably not for most people, but it is fun to play with. In a more practical setting it would make it easy to have the default Firefox be restricted in some ways, for, say, a kiosk setting, but still be able to launch with links appropriate for administration without logging in as a different user.
What does John Cleese know about creativity?
Recently I was inspired by the creators of South Park. Yesterday I came across this video from 1994 where John Cleese talks to a group about creativity. There's some real gold in here. He recently gave an updated version to a group of filmmakers, but this older one is surely still relevant.
Today I was looking over an interesting article called "Great open source map tools for Web developers". Look, I'm taking off my IBM hat and setting it way over there. Now it's just you and me with no major corporations to bother us or be represented in any way.
When you work in a company that does a lot of technology, with a lot of research, it's pretty difficult to name anything that isn't being explored somewhere. Some of these things become major projects and money centers, others are tinkered with and then set aside for the next interesting thing. So there can be a sort of love/hate relationship with companies like Google, who also do a lot of tinkering with various technology areas. One day a tool is OK to explore, the next day "Das ist verboten!" ("That is forbidden!") This can really be unfortunate when you have something that you're trying to do which has to either be thrown away or completely rethought. Yet when you're dealing with something like mapping, aren't you forced to go to someplace like Bing and Google who can afford to put satellites in the air and do massive data collection? Well... not necessarily.
As it turns out, governments and other public entities do massive data collection as well, and their information belongs to the public. It's true that some of it may not be as up-to-date or as rich as a private entity can do, but it is freely available. And it turns out that there are a number of open source projects tapping these public data sources and finding interesting ways to present them. If I was a betting man, I would say that your odds of being shut off from an open-source resource are lower than with "the competition". It's also remarkably easy to find things like weather data from the National Oceanic and Atmospheric Administration and community-supported street data (Say! I can see my office from here!). Here is a source for open data catalogs around the world, which pointed me to this specific list for Texas, among others.
Obviously, along with the data, there are APIs available as well. All of this can be mashed up in interesting ways. Here's an example where Open Street Map data was used to create an interactive map of countries. As you mouse over the countries you get information about the name of the country. Clicking the country takes you to data for that area. Obviously, one needs to consider the source of information, and there are situations where you need the accountability of a paid resource, but spend some time exploring these open data sources. You may find a good fit for your needs, and supporting these efforts helps ensure that they continue.
If you are keeping score, we have a couple of new items in the Open Source and Linux sites today.
In Linux, "Accelerate to Green IT - A practical guide to application migration and re-hosting" has been a significant effort by a team inside of IBM to share their observations about server consolidation based on their real experience. This is good stuff, and the kind of information that only comes from experience. Their approach is to help you identify the "low-hanging fruit" for server consolidation and to have realistic expectations for where the complexity may lie. I think that server consolidation is a fascinating area. In some ways it makes me wish I was still involved in levels of system administration where I could give it a shot. This article will likely not answer all of your questions about what to expect, but it will get you thinking, and thinking is one of the keys to success.
In Open Source is a fun article, "Building Ruby extensions in C++ using Rice: Add new programming extensions in Ruby", an examination of how to use Rice (Ruby Interface for C++ Extensions) to combine the goodness of C++ with that of Ruby. It provides more ways to use the right tool for the job and allows you to tap into the power of C++ without having to do everything there. I like this sort of mashed-up approach to programming. Yes, it can add some complexity, but it's worthwhile if it keeps you in control of your performance or other critical elements. It's more tools for your toolbox.
Today I came across a slideshow about fifteen products that Google has killed. This is interesting to me. We often focus on successes or downfalls, but rarely on those day-to-day ideas that didn't make it all the way. I've heard over and over that success in any area is a matter of trying enough things until something sticks. Super success is continuing to try things once you've already succeeded.
I was a regular user of iGoogle, which they are now shutting down. (I'm playing with protopage now, if you are looking for alternatives.) Most of the technologies mentioned are things that I had heard of but never really did anything with, which is probably why they are no more. I find this sort of thing inspirational in a strange way. When big companies struggle with moving things forward, just like I do, it reminds me that the way is never paved.
Last night I was inspired from another strange place. I watched a documentary called "6 days to air" about the creative force behind the animated program South Park, and how they regularly crank out an episode in six days. I know that many of you may not be fans of the long-running series, which has a well-earned reputation for equal opportunity offending, but it really is amazing to see these people who have "made it" continue to work with the same enthusiasm as a startup to get things done. One of the ideas that they discussed was that the need to meet their deadline always drove them to complete rather than to polish to perfection. One comment was that they always felt like they needed another day or two, but even if they had it, it would really only result in a 5% improvement in quality.
That's a really interesting perspective. I'm not saying that quality doesn't matter, but sometimes I wonder if creators aren't the best judge of quality. Perhaps there is a point when we need to let our creations go and let the audience decide. Of course, when we're talking about things like technology, especially software, the feedback from an audience gets fed back into the creation and adds polish from real use, rather than anticipated use. Of course, this has always been the spirit of open source. Get things out and then let them grow.
[As an aside, the conference room the kids are using to write their school TV show in S8E11-Quest for Ratings, looks very much like the conference room the South Park team uses for their weekly creation process.]
I've continued working with Blender to do some headers for the developerWorks community. (You may have noticed that my header graphic here has changed.) I'm increasingly impressed with what this software can do. The other day I was talking to someone about how easily one could transform a slide presentation into a more exciting experience with Blender. One could simply use it to create charts that pop. (What if you took a boring bar chart and made each column a real column, lit to highlight the item you were discussing?) Or, you could take elements of a presentation and add some pizazz by flying from one chart to another, or including other elements. It had just never occurred to me before.
Obviously you won't be able to do this with just any presentation, unless you develop some specific skills to do it quickly, or have support staff to help. But Blender has game engine elements. That means that once you have some items created you can apply physics and external control to them. Perhaps that could be harnessed to take information from other sources and then automatically work with them.
"I've done scientific visualizations in Blender before. I created a
3D globe a few months back that was UV-mapped so it would show up in
game blender. I then used the python ODBC module
(http://www.python.org/windows/win32/odbc.html) to access an
ODBC-enabled database like MySQL to vertex-paint the globe according
to the temperature at that spot.
"My current project is to use a DEM (Digital elevation map) of the
coast of Oregon, USA, and show the weather in realtime using the game
engine. So far I've been having trouble importing the DEM into blender.
I've found programs that could convert the DEM into a DXF, but it costs
May I say "Wow!"? How can you get started with something like this? Just go start! You'll probably need to deal with Python to talk to Blender. You will probably also need to deal with some sort of data conversion, coming from SQL or CSV. If you can get the channel made, though, you may have a new and interesting way to look at your data. If you make use of the game engine technology you could also embed it as part of your application's functionality. I don't know what I'm going to do with this, but I'm going to do something!
Some time back-- actually quite a while back-- I wrote a series of articles called the Windows to Linux roadmap. Now that I'm editor of the Linux site on developerWorks, I have to look at these things from a different perspective and it is bittersweet to watch them age. Ubuntu wasn't around at that time, which is my primary environment now. There are also tools that have come along to make management easier when, at the time, Webmin was really the only consistent tool I could find. (Webmin is still around, by the way, and I still might consider it if I was managing servers and needed to help share management with people who didn't have a strong Linux background.)
One of the articles I was looking over today was the one on doing backups. In 2003 the backup landscape was pretty dismal, at least from where I could see it. Were I to write that article today I would have more tools to discuss, my favorite being rsync. Rsync was actually around when I wrote the articles, but it was one of those resources that lurked in the shadows, like so many little tools do. Essentially rsync is designed to do file duplication, but tries to make it as efficient as possible by only transferring the delta (changes) in files when it can. It has a number of options and can be set up to do transfers through the network and over encrypted tunnels if desired. I wrote a little script that I run manually whenever I wish to do a backup... though I could run it automatically if I chose... and probably should.
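In sketch form it looks like this (the paths, hostname and key are placeholders, not my real ones):

#!/bin/bash
# Mirror my home directory to the local USB drive...
rsync -av --exclude-from=$HOME/.backup-excludes $HOME/ /media/usbdrive/backup/
# ...and push the same thing to a remote machine over an encrypted (ssh) tunnel.
rsync -avz --exclude-from=$HOME/.backup-excludes \
    -e "ssh -i $HOME/.ssh/backup_key" \
    $HOME/ chris@backup.example.com:backup/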
This does a backup to my local USB drive and also does a dump to a network machine, through an encrypted tunnel. This device could be anywhere as long as I could access it over the network, and you'll notice that I am accessing it through an Internet address, so it works when I'm on the road as well. Note also that I'm doing key-based authentication in ssh.
The --exclude-from parameter lets me set up a file containing paths (with wild cards) that I do not want to back up. Things like the Trash, cache files, etc.
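Mine is a plain text file, one pattern per line, along these lines (yours will differ):

.local/share/Trash/
.cache/
.thumbnails/
*.tmp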
The first backup is a bear because it has to transfer all of the data. After that it's easier because it only addresses changes. Of course, one problem with this is that it doesn't take into account file deletions. rsync can do that, but I found that defeated the purpose of the backup if I was trying to recover files that I'd deleted accidentally. So, I set up another script that I call cya-purge.sh, that handles that sort of clean-up. I run it periodically, when I'm pretty sure that I don't have something I need to restore.
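Again in sketch form (same placeholders as before):

#!/bin/bash
# Same sync, but also remove files from the backup that I've deleted locally.
rsync -av --delete --exclude-from=$HOME/.backup-excludes $HOME/ /media/usbdrive/backup/
rsync -avz --delete --exclude-from=$HOME/.backup-excludes \
    -e "ssh -i $HOME/.ssh/backup_key" \
    $HOME/ chris@backup.example.com:backup/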
This second script is identical, except for the --delete parameter, which tells rsync to remove files that are no longer on my system.
I agree that my solution is somewhat inelegant, and probably more hands-on than many people would prefer their backup to be. However, at the time that's really what I was looking for and I still enjoy doing it this way. I have a lot of granular control over this and don't have to mess with interfaces or anything like that. It's simple.
Of course, my hairy-man approach to backups is not going to be to most people's taste. For them/you there is duplicity, an elegant tool built on the rsync algorithm that handles bundling of files into smaller chunks, suitable for storing on remote networks. It also does management of the backup to keep files around for a period of time and then allow them to leave gracefully... something that I would like to get my own scripts to do when I have time to wrap my brain around it. Duplicity is behind the default backup solution in Ubuntu, so if you have that turned on, you are using it!
My first experience with duplicity was not great. It spent a few hours doing a full backup of my user directory (gigs and gigs of data) and then deleted it when it was done. I never did figure out why it was doing that. However, when I recently tried it again through the Ubuntu control panel it seemed to work fine. I would need to do some tinkering to see how best to emulate my current system of dual backups to a local and remote device, but it might be worth the trouble. I am amused to see, looking at the settings to refresh my memory, that the automatic backup for today has already occurred, and that I did not notice. That's a good sign!
Of course, there are a number of backup solutions that have evolved over the last nine years or so since I penned-- or should I say keyboarded-- that article. Notable ones are Bacula, fwbackups and Amanda. At some point I may dig into them a little more, but in the meantime you will probably enjoy what you can do with rsync. I should point out that there are ways to use rsync in Windows as well. Take a look at this article if you want to explore that.
I was reading things through my Twitter feed the other day and came across this article by Steven J. Vaughan-Nichols discussing the choice that Google, Canonical and others have made to not use Linux in the name of their products. It's not going to turn your world upside down, and it's fairly trivial on some levels, but it is interesting. I use both Ubuntu and Android. I selected them because they have Linux as their foundation, but more specifically because out of similar choices they just did what I wanted the way I wanted.
There is a good deal of discussion about the fact that there are so many Linux distributions. "There's too much choice!" In reality most of these offerings are distinctive in some way, and merely share their foundational parts of Linux and GNU tooling. So far Android has been very successful. Ubuntu seems to be the Linux-based computing environment that more people recognize. Those are good things.
When I became editor of the Open Source site on developerWorks I was inundated with various databases and data frameworks and other similar pieces. Databases and such are fairly successful in the open-source world because that sort of work is a kind of voodoo to a lot of people. It largely runs behind the scenes and gives up data when I ask for it and hides data away when I tell it to. It's an easy place to insert open source without upsetting people because they don't necessarily deal with the moving parts anyway.
As time has passed I've seen a lot more in the NoSQL areas and with cloud, mobile and all the strange and unusual places we try to put software nowadays I can appreciate the need to know about as many alternatives as you can. As long as the data remains open, flexibility on how you interact with it is handy and can help you turn a bad situation into an innovative opportunity.
When I can buy a 2TB drive to sit on my desk for $99 do I really need to worry about drive tuning? I would say that makes it even more important! What a shame to have a big giant drive and then waste a good deal of it because the data isn't partitioned optimally. I'm still interested in learning more about different drive tuning techniques, especially since I run Linux, because I can mix and match some of that a little more than I might in other environments.
As some of you know, I've been playing around with Blender, the free, open-source 3D modeling, animation and compositing software. I'm still just a baby, but I'm slowly learning how to do interesting things with it. Today I wanted to design a logo for a community group I'm building. I wanted to do something unusual. Tinkering with Blender, I found that one could import SVG files, created in Inkscape, and then manipulate them to have depth. I took some silhouettes that I was playing with and managed to create the following graphic. (Be sure to click on it and see it full sized.)
Admittedly, my picture won't win any kind of design awards, but it really shows what can be done by bringing things into a 3D environment and playing with light and such rather than simply drawing. I'll be doing much more with this. Of course, once it's designed, it's easy to move the camera around to get different perspectives and even shoot some sort of video where you move through the picture.
Blender is just one of the coolest things. I'm making this image available under Creative Commons, using the (cc-by) license. Please feel free to use it as you wish, just please give me credit.
(QUICK ONES are bigger than a tweet, but not much!)
Two quick things to look at. (I need to get my demo edited for the Pulseaudio thing I talked about the other day.) Today I looked around to see if Windows users could use ffmpeg to capture video the way I've discussed here, here, and here. The short answer is "No, you can't." Dang! It's really a cool technique. However, I did find this tool, which looks to perform a similar function. I'll need to fire it up in a Windows VM to check it out-- and that's just not going to happen today or even this week. Maybe someone else will have a chance to check it out and give us feedback.
Also, a retweet by @0xMF: "RT @BrentOzarPLF: Turns out that if you want to finish your work by 5:30 every day, you should visit this link: http://t.co/nwFeMeE9" I didn't have time to look at it in detail ( :-) ) but it looked interesting.
This is actually kind of cool, so I thought I'd share it.
PROBLEM: To create textures for the planes in my 3D bumper video I need to do screen captures of various developerWorks articles in my browser. This is easy. I can use GIMP to capture a section of the screen, but it's just a little slow and I need to get nine images from several different web pages. Must go faster.
SOLUTION: (at least for now) Because I'm selecting the web pages by hand an automated spider wouldn't work for me, but I can still speed up the process. Here's what I did in Ubuntu Linux (12.04):
Set up the web browser and size it as I like. I will not move this window once I begin.
Bring up a web page.
Hit the PrtSc button and save the screenshot as a graphic. I'm naming the files to fit in groups to make my later processing easier.
Once I have captured the screens that I want, I need to crop them. I'll use GIMP to figure out the parameters, but automate the process.
I make note of the Position and Size parameters, because I'm going to need them for my automation. Since I never moved the browser window and always did a full screenshot, each article will be in the same position in each saved image.
Now, I apply the magic, ImageMagick to be precise. I use the convert command to process all of these images at once:
ls screenshots | xargs -I FILE convert screenshots/FILE -crop 935x894+346+172 ./cropped-FILE
I've pulled all of my images into the screenshots directory. I want to preserve them because I always try to preserve originals until I'm sure they won't be needed. I'm using ls to get all the file names and piping them through xargs, an incredibly useful tool for passing things from one tool to another. convert, a component of ImageMagick, crops each file, based on the parameters we got from GIMP. All files are cropped perfectly and identically in seconds.
There is probably a tool somewhere that I can use to download web pages and convert them directly into images. That could save some time, except that I'm still picking the pages by hand to make sure that they are representative of the topic. If I ever do something that's spidering through a page I might do it differently. For now this was pretty handy.
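(If you want to chase that idea, wkhtmltoimage, from the wkhtmltopdf project, looks like a candidate. I haven't tested it, but the basic invocation is supposed to be as simple as:

wkhtmltoimage http://www.ibm.com/developerworks/ dw.png

If you try it, tell me how it goes.)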
I've had a bit of an R & D week. Yesterday I recorded an audio interview for developerWorks which needs to be edited and reviewed and transcribed, so it will be a while before you get to hear it, but it's fun stuff. In order to do that I needed to tinker with my audio setup. I found out how to rearrange pulseaudio in Linux to let me do recordings from a Voice Over Internet Protocol (VOIP) phone. I followed directions from a few folks, but still ran into some little fiddly problems that confused me. I recorded my own video demo for that which I will provide here when I get it edited down a little. This is going to make a big difference for me, and also opens the door to being able to record video demos in an interesting way. Look for that here soon. I recorded the demo using my own instant demo script which begins recording my desktop with the microphone using ffmpeg. Here it is, in case you find it useful:
#!/bin/bash
# Count down so I have time to get the demo window into place.
COUNT=5
echo "Capture starting in..."
while [ $COUNT -gt 0 ]; do
    echo $COUNT
    let COUNT=$COUNT-1
    sleep 1
done
# Record the demo region of the screen with microphone audio.
# (Output options and filename here are representative; season to taste.)
ffmpeg -f alsa -i pulse -f x11grab -s 1024x768 -i :0.0+520,200 demo.mkv
The -s 1024x768 -i :0.0+520,200 parameters tell ffmpeg to record a 1024x768 section of my screen and to offset it by 520,200. This corresponds to the little rectangle I've drawn on my wallpaper as "demo space". That way I can visually see the constraints of the recording.
Yesterday I had some time set aside with our main media guy, Scott Laningham. He had to bump our session to next week, so I had the prospect of doing some tedious busy-work or keeping in my R & D frame of mind. I decided to stick with the R & D and worked with Blender, an open-source 3D modeling and compositing program, to start creating a video bumper for things that I create on developerWorks. Blender makes it pretty easy to create flying logo animations once you have your environment built. (That is, of course, the tricky part.) Essentially you build the environment and then you send a virtual camera through it, pointing it where you want. There is lighting and all kinds of interesting elements to work with, but it's a fairly specific skill.
By the end of my time yesterday I had an environment with some 3D letters for developerWorks and a plane suspended in space showing a developerWorks article on it. I could fly the camera around and rendered an 18-second test. I need to tinker more with the materials on my lettering (I want it to look like hard plastic) and I need to replicate the plane with many different articles from developerWorks to make a maze of them to fly through. (Sounds kind of like actually using developerWorks, doesn't it?)
This is why I love open-source software. I didn't need to take any classes or convince someone to let me spend budget to explore these technologies. They are simply waiting for me. A media person I know does the same thing I did with pulseaudio using $700 worth of hardware. Granted, some of what I'm doing requires my level of technical enthusiasm and persistence. There are times when the commercial answer is better suited. But, if you are a techie, and you want to play, there's nothing stopping you. You just need to open the door.
Look for my pulseaudio demo soon. Oh! I forgot to tell you that Scott and I did a quick video discussion about some open-source thinking last week. Here it is if you're curious: