I'll be using this blog over the next 3
weeks or so to chronicle my experiences with something very different
from my normal job. I have been given a great opportunity to learn
about emerging markets, help out a growing city, and experience a new
culture, all rolled into one. IBM runs a program called the IBM Corporate Service Corps. Through this program, IBM sponsors groups of employees to work with local governments in emerging markets, bringing new technology and capabilities to these areas so that they can grow in a smart and efficient manner.
I've been accepted into this program
and have just started a 3 week assignment in Da Nang, Vietnam. For
those who are not quite familiar with the area, Vietnam is in
Southeast Asia, roughly halfway between India and Japan. Da Nang is near the midpoint, north to south, of the country, on the coast of the South China Sea. It is the third largest urban area in Vietnam, with about 1 million people. Hanoi, in the north, and Ho Chi Minh City (Saigon), in the south, are much larger.
Our goal, while here in Da Nang, is to
work with the local government to find ways to improve three areas of
critical infrastructure: water management, transportation, and food
safety. Within these three, I will be focusing on food safety, along
with my IBM colleague, Eileen Doherty. We will learn much more about
what challenges Da Nang faces in these areas as we work with
government leaders over the 3 week assignment.
My travel to Vietnam started early on
Tuesday morning (5AM from the house, 7AM flight) from Raleigh, NC USA
(GMT-5). After a 4-hour layover at JFK airport in New York, I took a 13.5-hour flight to Incheon, South Korea. The flight path took us very near the North Pole, flying over Canada, Hudson Bay, the Arctic Circle, Siberia, and China before landing in South Korea at 4PM local time – on Wednesday afternoon. After another 3-hour layover, we took our final flight, 4.5 hours more, to land in Da Nang, Vietnam (GMT+7) at 10PM on Wednesday evening. Since the US is on daylight saving time right now, Da Nang is 11 hours ahead of the current time in Raleigh, NC.
My first impressions of Da Nang were very good – the airport is modern, the customs officials are friendly, and the people are quite helpful. This promises to be a very interesting 3 weeks!
Anyone who writes software for a living has been in this position. You're asked to update an existing application or add a new feature to it. And you've never seen the source code before. Or maybe you have, but that was 2 weeks ago and so many things have happened since then that you can't recall a thing about the application.
What do you do? If you're like most developers, you pick up a piece of the source code, look at it for a while, get a feel for what's going on, and repeat until you have a good enough idea of the program structure that you feel confident making some changes. Unfortunately, this is time consuming and really difficult to do, and you run the risk of studying a piece of code that is rarely executed or might even be wrong. The situation gets astronomically more difficult when the code base grows into the tens or hundreds of thousands of source modules and spans thousands of relational tables in a database.
There are tools available to help developers navigate a large source code base, and they're much more useful for code understanding and analysis than the programmer's trusted friends: find, grep, and Perl. One such tool is Rational Asset Analyzer (RAA). This tool lets users search very quickly through large amounts of source code along with DB2 and IMS/DB database schemas. While some basic source code metrics are available right from the starting pages of its user interface, the real power of RAA comes out when its program structure diagrams, impact analysis, and dead-code display features are used. By indexing the source code by the variables and identifiers it uses, RAA can point out subtle dependencies between modules and database tables which would otherwise go unnoticed by developers unfamiliar with the code base.
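To make the indexing idea concrete, here is a toy sketch in Java of a source cross-reference index. This is purely illustrative of the concept and is not how RAA is implemented; the class and method names are my own inventions. It scans a source tree once, records which files mention which identifiers, and then answers "who references this name?" as a simple map lookup instead of a fresh search over the whole tree.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.*;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class IdentifierIndex {
    // COBOL-style identifiers: letters, digits, and hyphens.
    private static final Pattern IDENTIFIER = Pattern.compile("[A-Za-z][A-Za-z0-9-]*");

    // identifier (upper-cased) -> set of files that mention it
    private final Map<String, Set<Path>> index = new HashMap<>();

    public void addSourceTree(Path root) throws IOException {
        List<Path> sourceFiles;
        try (Stream<Path> walk = Files.walk(root)) {
            sourceFiles = walk.filter(Files::isRegularFile).collect(Collectors.toList());
        }
        for (Path file : sourceFiles) {
            // ISO-8859-1 so an odd byte in some file never aborts the scan.
            for (String line : Files.readAllLines(file, StandardCharsets.ISO_8859_1)) {
                Matcher m = IDENTIFIER.matcher(line);
                while (m.find()) {
                    index.computeIfAbsent(m.group().toUpperCase(Locale.ROOT),
                                          k -> new HashSet<>()).add(file);
                }
            }
        }
    }

    // Answer "which files reference this identifier?" from the pre-built index.
    public Set<Path> filesReferencing(String identifier) {
        return index.getOrDefault(identifier.toUpperCase(Locale.ROOT), Set.of());
    }

    public static void main(String[] args) throws IOException {
        IdentifierIndex idx = new IdentifierIndex();
        idx.addSourceTree(Path.of(args.length > 0 ? args[0] : "."));
        System.out.println(idx.filesReferencing("CUSTOMER-BALANCE"));
    }
}
```

A real analyzer goes much further, of course: it parses the languages involved so it can tell definitions from references, follows copybook includes, and maps identifiers to DB2 columns. That heavy lifting is exactly what a tool like RAA does for you.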
Leshek Feidorowicz has recently created a short video which shows some of RAA's features. I found it very informative, and if you are in need of program understanding tools I think you'll find it interesting as well. You can find it here.
I often hear from people I work with
that “System z is old school” or “no one writes code for
mainframes any more.” I quietly let them say their piece – but I
know better. There is plenty of new and exciting work being done
with every hardware platform that IBM offers – System z included.
But I'm getting ahead of myself. In this blog post, I'll look at mainframe application development, debunk some common misconceptions, and provide links to the latest technology for streamlining application development on z/OS and Linux for System z.
Current Thinking on Developing Applications for Mainframe Systems
Except for the savvy group of "Master the Mainframe" students from universities, if you ask someone in a Computer Science program how much exposure to mainframes and z/OS they receive, you'll find that they give you something of a hollow stare. “Mainframes? Nobody writes applications for them any more,” some of them might (erroneously) say. And even if you ask someone who knows that there is quite a bit of new work done on mainframe systems every day, they usually think of some archaic programming environment involving large, heavy, old text-based terminals with 24x80 screens.
There's an ongoing perception that application development for z/OS means text-based user interfaces, code written in assembler language in ALL CAPITAL LETTERS, and doing without common comforts such as interactive debuggers. This causes most people to think that writing applications for z/OS is hard, so they tend to avoid it whenever possible. This, in turn, feeds the urban legend of the difficulty of working with z/OS, since the set of people working with the systems becomes more isolated. While this ancient history is fun to think about, it turns out that the reality of developing applications for z/OS is very different from this urban legend.
Compounding this, people also believe that getting access to a mainframe system is nigh impossible. Here is a point that has some truth in it. Mainframe systems don't grow on trees, or fit in our pockets, and so in the past it has been a bit of a struggle to get access to one. But recent offerings are bringing this barrier to entry down – I'll give some pointers on this at the end of this blog post.
It turns out that as of 2012, the
common perceptions about application development for z/OS are very
far from reality. The reality here is that application development
for z/OS is very much like application development for any other
platform. You use modern application development tools, including integrated development environments (IDEs), interactive debuggers, and feature-rich, syntax-highlighting editors, along with wizards that generate application source code on your behalf. These tools can also interact with remote databases, allowing teams to test out complex SQL queries before coding them into applications, and they support multiple programming languages including Java, C/C++, COBOL, and PL/I. The tools assist development teams in compiling, linking, and debugging their applications, all from a remotely connected workstation which communicates with the application running on the mainframe system.
COBOL is a very common language used
for business applications running on mainframe systems. The tools
available for COBOL are vast and deep in function, ranging from code
analysis and understanding tools to advanced code refactoring support built into the code editors. This refactoring support allows for relocating blocks of function while ensuring that the resulting application code still performs the same set of steps after the refactoring is complete.
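To make the "relocate a block, keep the behavior" idea concrete, here is a tiny illustrative sketch. I've written it in Java rather than COBOL for brevity, and the names are invented; the point is simply that the discount logic moves into its own routine while the program still performs the same steps.

```java
// Hypothetical billing routine, before and after an "extract and relocate" refactoring.
class BillingRefactoringExample {

    // Before: the discount rule is buried inline in the billing calculation.
    static double totalBefore(double amount, boolean preferredCustomer) {
        double discount = 0.0;
        if (preferredCustomer && amount > 1000.0) {
            discount = amount * 0.05;
        }
        return amount - discount;
    }

    // After: the same block relocated into its own named routine.
    // The observable behavior is identical; only the structure changed.
    static double totalAfter(double amount, boolean preferredCustomer) {
        return amount - discountFor(amount, preferredCustomer);
    }

    static double discountFor(double amount, boolean preferredCustomer) {
        return (preferredCustomer && amount > 1000.0) ? amount * 0.05 : 0.0;
    }

    public static void main(String[] args) {
        // Same inputs, same results, before and after the refactoring.
        System.out.println(totalBefore(1200.0, true) + " == " + totalAfter(1200.0, true));
    }
}
```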
In addition to COBOL, Java is a very popular programming language for mainframe applications. Java applications are supported in application server environments, online transaction processing environments, and batch, with the tools helping teams write these applications and deploy them into the target runtime environments.
This still leaves access to z/OS systems as something to take care of. Well, it turns out that there is a solution to this issue as well. It is now possible to access a high-fidelity emulation environment for developing mainframe applications – whether those are COBOL, C/C++, Java, or even assembler language programs – and for all the runtime environments in which those applications might run: batch, WebSphere Application Server, CICS, IMS, or DB2. This environment can be thought of as an emulation of the mainframe rather than a simulation of mainframe functions running on a workstation. As such, development teams can get much more familiar with System z and z/OS as well as work with their applications running in their intended runtime environments.
Now to Make it Easy
All the parts which I've described above have been available separately for quite some time now. For those wondering what those tools are called, here is a list:
However, the burden has been on the development organization to work out the integration points amongst these tools and to figure out how to use them in the combination for which they were intended.
A solution from IBM takes the guesswork out of this and helps teams realize the benefits of using the above
tools in combination. The Integrated Solution for System z Development (ISD for z) is a combination of
the tools I have noted above. This solution represents the
recommended set of tools for developing applications that contain
mainframe components. The solution and the corresponding
integrations have been optimized to support typical development usage
scenarios. With ISD for z, you get all the capabilities which you
have come to expect when writing code for Linux or Windows platforms,
along with a whole lot more features to enable team collaboration
across projects which have pieces that run on multiple platforms.
Application development for z/OS and
System z is really just like development for every other platform.
Tools exist to help access the system environment, edit, compile, and
debug applications running on the platform, and even understand
existing applications which have evolved over decades of continual
refinement. And access to a z/OS system for application development
purposes is easier than ever before – removing that barrier to
entry which has held development teams back in the past.
Have a look at these and I think you'll
agree that the future is bright for software development – no
matter what the target platform is. In fact, the tools enable
organizations to choose their deployment platform based on their
business requirements, not based on the skill-set of their
organization. This frees up the organization to choose the best
platform for the job to be performed – freedom indeed!
I spoke with a customer the other day who is looking for a way to call a web service from a COBOL program running in batch on a z/OS system. After thinking about it, I realized that more people than just this one customer might be interested in this topic, so I've made a blog entry about it.
I typically refer to this type of thing as an "outbound" web service call. The web service implementation happens to be written in Java, runs on a Windows Server, and front-ends a SQL Server database. The COBOL batch application needs to retrieve some information from the SQL Server database as part of its batch processing.
I talked about 3 high-level approaches:
- Use a mechanism like EXCI or a local queue to make a request to a CICS transaction or a WebSphere Application Server application running on the z/OS system. From this runtime environment, make an outbound web services call. Interestingly enough, the customer indicated that they are already calling outbound web services from some of their CICS transactions.
- Use an MQ queue setup to communicate with a "proxy" which will translate queue messages into web services requests, and the responses back into queue messages. I used DataPower as the easiest example: code the DataPower box to take an incoming MQ message, copy the message elements into a web services request, invoke the web service, take the response data and put it onto an MQ queue message, and send that back.
- Use an open source web services client (such as C/C++ Axis or Java Axis) and call it directly from the COBOL batch application. The customer asked how this relates to the COBOL "INVOKE" statement. INVOKE is what COBOL uses to call Java code (via the COBOL/Java interoperability support), so yes, it is what would be used to call the Java web services client. A rough sketch of such a client class is shown below.
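For that third approach, here is a minimal sketch of what the Java client class might look like. It is illustrative only: the endpoint URL, SOAPAction value, and payload are placeholders I made up, and a real client would normally be generated from the service's WSDL (with Axis or a similar toolkit) rather than hand-built on raw HTTP. The COBOL program would reach a class like this through INVOKE.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class CustomerLookupClient {

    // Returns the raw SOAP response; parsing it is left out for brevity.
    public String lookupCustomer(String customerId) throws Exception {
        String soapRequest =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "  <soapenv:Body>"
          + "    <lookupCustomer><id>" + customerId + "</id></lookupCustomer>"
          + "  </soapenv:Body>"
          + "</soapenv:Envelope>";

        // Placeholder endpoint for the Windows-hosted service described above.
        URL url = new URL("http://winserver.example.com:8080/CustomerService");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"lookupCustomer\""); // placeholder
        conn.setDoOutput(true);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(soapRequest.getBytes(StandardCharsets.UTF_8));
        }
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
```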
There were also some questions about whether there would be issues in calling a web service within a long-running batch program loop. I indicated that no matter what, there's going to be some added latency in contacting another system and retrieving data from it. What we typically recommend is that if the number of individual items to process in a given time is very high, consider creating another web service which allows a batch of requests to be made through a single web services call. Then use the information returned within the batch program's "inner loop", alleviating the need to call the web service on every iteration. A sketch of that pattern follows.
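Here is a small sketch of that batching idea. The service and method names are made up for illustration; the point is the shape of the interaction: one remote call covering many items, then a purely local inner loop.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical pricing service; the real one would sit behind a web services client.
interface PartPriceService {
    // One network round trip per part number -- expensive inside a tight batch loop.
    double lookupPrice(String partNumber);

    // One network round trip for a whole batch of part numbers.
    Map<String, Double> lookupPrices(List<String> partNumbers);
}

public class BatchPricingExample {

    static double totalCost(PartPriceService service, List<String> partNumbers) {
        // Single remote call up front...
        Map<String, Double> prices = service.lookupPrices(partNumbers);

        // ...then the "inner loop" works entirely from local data.
        double total = 0.0;
        for (String part : partNumbers) {
            total += prices.getOrDefault(part, 0.0);
        }
        return total;
    }

    public static void main(String[] args) {
        // In-memory stub standing in for the remote service.
        PartPriceService stub = new PartPriceService() {
            public double lookupPrice(String partNumber) {
                return lookupPrices(List.of(partNumber)).getOrDefault(partNumber, 0.0);
            }
            public Map<String, Double> lookupPrices(List<String> partNumbers) {
                Map<String, Double> result = new HashMap<>();
                for (String p : partNumbers) {
                    result.put(p, 9.99);
                }
                return result;
            }
        };
        System.out.println(totalCost(stub, List.of("A-100", "B-200", "C-300")));
    }
}
```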
Note that all sorts of tools exist to help connect the different data formats involved here: WSDL files describing the web services, COBOL data structures (copybooks), Java classes, MQ message formats, and so on. Some of these tools generate source code (Java, COBOL, and even PL/I), others generate configuration settings for various environments, and some do both. Which tool is best for you depends on which of the paths above you choose to take.
Software development teams that want to work well, spend more time coding new function, and spend less time answering those age-old questions ("So, are you on time? On budget? On schedule?") are always looking for ways to make reporting something they don't have to spend time on. Building reports whenever someone asks for status just wastes the development team's time. Well, maybe not wastes it, but it does take away from their ability to focus on the tasks they're being asked to do, namely designing new solutions, writing code, and delivering functions, fixes, and enhancements.
I have always been interested in how application development tools can actually help development teams rather than get in their way. One of the areas of interest for me is in this area of reporting. How well can a version control and bug tracking system support a team's need for reports, or their management's hunger for status reports from the team? Is it easy to create these reports? How do the reports look? How can consumers of the reports get them without impacting the development team who is usually asked to create them?
In my experience, most bug tracking systems and version control systems have various mechanisms for getting information out of them and then leave it to some set of smart people to figure out how to make the results look pretty for their organization. This usually means that an enterprising individual in the organization has to take it upon themselves to figure it all out, write some reports, and then maintain those reports over time. This is somewhat to be expected since every organization tends to want their reports to look and act a certain way, whether this is because of past history and tools that they have used, or because they just like to have some customization.
Bottom line, the development tool needs to serve multiple needs. There need to be some good built-in or pre-canned reports to get people started, and then there needs to be a good way of extending/adding reports to the environment so that the right information is available to the right folks ... whenever those folks need to access that information.
After working with lots of different tools over the years, I keep being impressed by what Rational Team Concert (RTC) provides in both built-in features and extensibility. This area of reporting is no exception. I found a great article on setting up Rational Team Concert reports this morning, written by Ken Kumagai. It is titled "Creating customized reports through multiple project areas in Rational Team Concert". At first, the information provided seems complex and daunting. BIRT reporting is used, you have to understand that RTC provides a "Jazz data source" that is usable in BIRT reports, and so on. However, once you get past this, you see that there is real power here.
First, RTC provides a bunch of built-in reports that can be used in dashboards. Second, after looking at some of the references, including some from Tim McMackin back in 2010 (see: "Creating custom reports with BIRT and Rational Team Concert - 3 part series
"), you see that RTC also offers a great way to build customized reports and then enable integration of those reports into dashboards as well.
So here you have it, a set of tools to allow both easy reporting and customized reporting across both work items and version control information. This wouldn't be possible without an integrated tool that allows access to this type of information. If you haven't already, you should have a look at these capabilities if your software development teams are plagued with the typical issues of needing to provide status all the time while still finding time to write the applications and features that they're expected to deliver.
In the lead-up to IBM's annual Innovate conference, Innovate 2011
, the Enterprise Modernization team has put together some short and concise video overviews of our application development tooling and application lifecycle management solutions.
You can view these on YouTube.
I encourage people interested in learning more to spend a couple of minutes to discover the wealth of features and capabilities that are provided in these tools.
It's been a bit of time (about 2.5 months) since I've had a chance to update this blog. In that time, I've been working on projects in several areas including:
- Application Portfolio Management
- IBM Compilers support and capabilities
- Enterprise Modernization tools, including IDEs, source code management, and build automation tools
- Cloud computing and its implications for application development and development tools
We're all working hard to prepare for the upcoming Innovate 2011 conference, taking place in June 2011 in Orlando, FL. This year's conference looks to be an exciting one, including an appearance by IBM Watson, the computer that successfully competed in a Jeopardy!-format trivia contest earlier this year.
There are also going to be several interesting session tracks at the conference for people working on applications which run on AIX and z/OS systems. There are dedicated tracks for each of these platforms, with sessions to cover a wide range of topics around application design, development, test, and support for applications running on these platforms.
I had the opportunity to speak at the SHARE Conference in Anaheim, CA
last week. For those not aware of this conference, it is one of the longest-running technical conferences and offers attendees the opportunity to learn about new and interesting ways to take advantage of their mainframe systems, in conjunction with the myriad of other computing systems in their enterprise. The conference is membership-driven and is held twice a year in the United States. Not only do the attendees learn from the sessions, they also get the opportunity to network with their peers from other organizations, compare notes, trade techniques, and learn from one another.
I have spoken at the conference in the past on a variety of topics ranging from the z/OS LDAP server and how to configure and use LDAP directory servers, to security offerings from IBM Tivoli and how to use them alongside or to enhance mainframe security administration, to topics concerning enterprise architecture and software development. Every time I attend, I find it refreshing that the attendees bring with them a fierce desire to learn new technologies and how to apply them to their organization.
At this Spring's conference, I presented on three topics:
- Application Development for z/OS - a discussion of the latest and greatest technology for writing and maintaining applications that run on z/OS systems, covering applications that run in CICS, IMS, DB2, or batch, as well as programming languages such as COBOL, PL/I, C/C++, Java, JCL, and REXX.
- Application Modernization - a presentation about application portfolio management and application modernization and how these two processes are cyclical in nature and linked to one another.
- Multi-platform Application Development - a forward-looking presentation about where application programmers and application development teams are going to have to get to in terms of being nimble and agile in the future. The premise for this talk is that application programming teams are generally able to pick up and use new programming languages quickly. What impedes their progress is the time it takes to learn platform-specific tools and runtime environments. To the extent that application development tools mitigate or remove the need to learn new tools and environments, teams can become effective and efficient moving from project to project and platform to platform.
The next time you hear that there is a skills gap for mainframe application development, think again. Computer programmers and computer scientists today are eager to learn and use new environments (new to them, at least). And developers are now able to choose a tool set that applies to and supports multiple runtime environments and multiple languages simultaneously. Because of this, these developers can quickly pick up programming languages they have not used in the past and be effective developers of applications running on a wide variety of systems. This means that mainframe application development skills are within every organization's reach, as long as those teams are enabled appropriately.
One of the reasons for the relatively long pause in blog entries on this blog a couple of weeks ago was because I was lucky enough to have a vacation. The vacation was great, plenty of sun, scuba diving, and relaxing on the beach. I also had time to read a couple of books which I had been wanting to get through for some time. One of those books is the subject of today's blog entry.
I read the book You Are Not A Gadget
, by Jaron Lanier. For some, this book may seem to be an attack on the Internet (Mr. Lanier is careful never to capitalize the I in Internet, by the way). I did not find this to be the case. I believe that Mr. Lanier makes some very good points about the current and possible future uses of computing technology. While we have witnessed extraordinary growth in the computational, storage, and networking capacity of computing systems (as witnessed by the recent IBM Watson showing on Jeopardy!), this has not resulted in any quickened pace of innovation in other areas of human endeavor (for example, music). Indeed, there are quite a few unpleasant areas of the Internet, and certainly the ability to criticize in relative anonymity can make the Internet a seemingly unfriendly place to be at times.
On the other hand, the Internet has enabled startling growth in communication between people, even if that communication is not as formal or perhaps as civil as it might have been were the people sitting across from one another. We see evidence of social networking being used to organize peaceful protests, locate lost friends, and reunite relatives who have been searching for one another.
As luck would have it, I completed the book just before all the excitement over the IBM Watson Jeopardy! Challenge event. The contrast between the cautionary guidance from Mr. Lanier and the cutting-edge machine learning research behind IBM Watson seems on the surface to be considerable. However, my interpretation is somewhat different. What I see here is that we are moving to an era where computing systems can serve as extremely capable advisory systems, helping us be even more innovative and expressive than ever before. We must, however, be sure to keep in mind our ethics and manners as we blaze this trail, taking responsibility for what we write, compose, draw, publish, and produce.
As for computers taking over the planet - well, that does make for great science fiction. For now, I'm content to keep in mind that because every computer is still programmed, there are ample (unintentional) errors and missing features in them, so we're not going to rely on them for everything anytime soon. And that brings me right back around to the topic of application development, development tools, and application modernization, which is the subject of this blog. Since the collective mountain of software in the world just keeps rising (we don't do much of anything to reduce the pile), there's ample opportunity for teams to improve applications as they build new features. IBM Watson isn't writing software ... at least not yet.
I had the pleasure of visiting my alma mater last week to interview students for summer internship positions at IBM. I always enjoy going to the campus to meet with faculty and students and hear about what they're studying, what research they're working on, and what's on their minds.
One thing always strikes me as interesting, though, when I talk to students about their software development experience. They are always excited to tell me about the projects that they have worked on and the cool technical problems that they had to overcome to build a solution. Some of this software runs on Linux systems, some of it is web application server-based, some of it is database programming, and still other projects are various game implementations that the students have created. The interesting thing here is what they don't talk about. I very rarely hear about how the students worked as a team to collaborate on the project. Yes, they do work as a team. But more often than not, working as a team turns into a group of them all sitting in front of the same LCD screen and sort of "team coding" the solution.
This is just one aspect of what I will call "professional software development." And it is something that I'm sad to see is not getting as much attention as I would like in our education system. Most curricula that I have seen relegate topics such as "software design", "team structure", "planning", "source configuration management", "requirements management", "work item management", and "test management" to courses such as "Software Engineering". These courses are typically junior- or senior-level electives. The outcome of this approach is that students fresh into the workforce have developed software development habits that are best described as "solitary". The concept of multiple people working simultaneously on a source base is only experienced if the students are contributing to some open source project - and only rarely do students consider this as part of their class work. The benefits of coordinated builds and automated testing are never considered. And managing a group project using a source configuration management system that can coordinate multiple people's work and assist with resolving conflicting changes to source code is completely foreign.
When these students enter the workforce, they wind up spending a large amount of their time learning all of these working characteristics "on the job". And in doing so, they have to un-learn what could be viewed in the industry as "bad habits" before they can appreciate the benefits of working alongside many others who are all contributing to the same source code base.
I tend to think that we could do better here by instituting a development process for our university students that encourages good, strong, "professional software development" practices from the outset of their college experience. In so doing, we would prepare these students much better for the style of programming found in most medium to large software development environments. This would also encourage "good coding habits" before those "bad habits" take root, resulting in software development teams that seek out good source configuration management, requirements tracking, defect and feature tracking, and automated build tools rather than shun them.
What have other people seen in their interactions with new hires and college students they have worked with? I'm interested in hearing your experiences.
A colleague of mine pointed out this interesting article in eWeek: link to article
. I was very happy to see this getting some attention in the industry press. The article points out some very good information about COBOL and its use in the computing industry. I think it's safe to say that COBOL is a programming language for commercial software development, and has been used effectively in a wide range of business-related applications. Features pointed out in the article include:
- much existing business software is written in COBOL
- there is a large amount of COBOL being written every year
- there are still a great number of programmers writing applications in COBOL
- COBOL is available for many platforms, including emerging cloud-based runtime environments
- there are ways of using COBOL alongside other programming languages as well, including Java
- there are COBOL compilers from several vendors including Micro Focus and Veryant
I feel the need to point out some additional information that was not part of the article. IBM has continued to provide a wide range of tools and support for COBOL (and other languages as well) in the form of COBOL compilers, run-time environments, application development tools (editors, builders, debuggers), and application analysis and program understanding tools. I will use this blog post to point out some of these capabilities, corresponding to the features the article describes.
In terms of support of COBOL and applications written in COBOL, IBM provides:
- support for programmers working with COBOL
  - syntax-highlighted editing of COBOL programs with context-sensitive help and type-ahead support
  - interactive debugging of COBOL programs running on remote systems
  - program build and syntax check support, linking compilation errors back to the source code in the editor
  - generation of COBOL programs which are accessible through web services
  - generation of web services from existing COBOL programs
  - See: Rational Developer for System z, Rational Developer for Power Systems
- support for COBOL compilers on multiple platforms
- using COBOL with multiple programming languages including C/C++, EGL, and Java
- application analysis of COBOL applications
  - understand the program structure of a COBOL application
  - understand the databases and files that a COBOL application uses or modifies
  - understand the logic flow of a COBOL program
  - perform impact analysis of changes that are being considered for a COBOL program
  - See: Rational Asset Analyzer
Workstation and cloud environments can serve as excellent systems on which to develop, test, and refine COBOL applications. In most cases, the production runtime environment will dictate the ultimate deployment topology in which the COBOL applications run; these environments may be cloud-based or utilize traditional data-center-hosted systems. For development and test, the IBM Smart Business Development and Test Cloud offers quick and easy access to development and test systems for individuals or teams to use.
In summary, IBM also provides a wide range of support for application development teams working with, extending, renovating, and building COBOL applications. I encourage teams to investigate all the great tools available for this important and useful programming language!
As I read over my posting from late last week and thought about this situation a bit more, I kept coming back to the term I used in the title of this posting - skills portability. I believe that application development teams will be challenged increasingly to be able to move, quickly, from project to project and from application source code base to source code base with ease, precision, and speed. In such an environment, teams and individuals will need to be multi-lingual (programming languages), multi-platform, and multi-environment savvy. These teams will have to switch from environment to environment to perform their designated tasks. In this way, these teams will have to be able to adapt their skills to different environments, thus requiring portable skills or the virtue of skills portability.
How is this possible? Does the current computer engineering or computer science curriculum emphasize such skills development? Is there a set of educational materials or exercises that will hone such skills? I think that the answers to these questions lie not in the specific courses that students take, but in the ways in which the courses are taught, the tools and environments used for projects and homework during the classes, and the emphasis on tool usage that is made both during education and on-the-job training that takes place during internships and in a permanent position.
There seems to be something of a quantum gap between "heavy tool users" (those that rely heavily on the capabilities of an integrated development environment (IDE), build automation tools, and emulation/test environments) and "command-line users" (those that are masters of arcane commands such as 'ls', 'mv', 'find', 'grep', 'make', and so on). For a person to move from one of those environments to the other is quite an effort. Everything is different. Seemingly simple tasks turn into hours of drudgery or Internet searching to find the answer. Even moving between different command-line environments can be a challenge. Commands are different, options are different, and editor styles and features vary widely.
Moving between different platforms, languages, and environments when using an IDE, however, seems to be quite a bit easier - even if the IDE is different from one the user has used in the past. A user of an IDE is already familiar with the need to "hunt around" through the menu bar or context menus to find an appropriate task. Users expect fly-over help and visual cues to guide them in "doing the right thing". In this way, I have found that users of IDEs wind up being able to transfer their skills between multiple languages, platforms, and runtime environments much more quickly and effectively than those who are experts in the particular command-line tools of the platforms they are used to.
In summary, for skills portability, I encourage teams and individuals to take that quantum leap into learning and using an integrated development environment (IDE). It will be painfully slow for the first several uses. But you will make back that investment in time saved, precision, and the ability to apply your skills to a wider set of platforms and runtime environments than otherwise possible.
I keep hearing that COBOL skills are in decline. But for some reason, I just don't believe that this is something for organizations to worry that much about.
Here's my reasoning.
Stepping back from all of this, what strikes me as most interesting is that the number of computer languages that people are becoming at least conversant in is rising, not falling. And on top of this, if a computer programmer is good at anything by the time they have obtained their degree, it should be the ability to pick up, read, learn, and create new programs in languages that they are, at first, not familiar with. This is just the way the software industry is headed. Software engineers will need, as a prerequisite of getting a job, the ability to learn and use languages that are "new to them".
And so, that being the case, the world is full of potential COBOL programmers! There's not a shortage of them any more than there is a shortage of C/C++ programmers.
Now, what is difficult is learning new run-time environments and new application development environments. Figuring out all the details of how to access source code, how to edit it, and how to compile, debug, and test it - those are tough things to pick up. We get used to the tools we're familiar with, and our muscle memory takes over and helps us be even more productive in the environments where we're used to working. This is where development tools that ease the access to and usage of unfamiliar environments come to the rescue. And that is where Rational Integrated Development Environments (IDEs) such as Rational Developer for Power Systems and Rational Developer for zEnterprise come in.
So the next time someone laments to you the lack of COBOL skills, think again. I think there are millions of them available ... they just don't know it yet.
I have stated before that the hardest part about application development is getting started. What I mean by this is that it takes great effort to get oneself into the mindset and position of actually designing a piece of software, writing a program or routine, building that source code into a program, testing that program to see if it runs, and so on. Performing this task takes a strong will and focused concentration on the specific task at hand. In many cases, other distractions such as e-mail, instant messages, Facebook status, SMS text alerts, and even the phone need to be turned off or ignored just so you can get into the necessary level of concentration to perform the task at hand.
There are times, though, especially when first learning how to use some of the application development tools that are available, that being able to ask questions, get answers, and interact with others who are working through similar situations is an incredibly productive way to understand how to perform a task or learn a feature about the tools that you were not familiar with before. Reading through product documentation can be boring and difficult, especially with most documentation only available in electronic format. On the one hand, electronic format is easily searched based on keywords. On the other hand, everyone has, at one time or another, found what they were looking for more quickly by fanning through the pages of a book until something catches their eye, either in a figure, a section title, or a code snippet that they saw as the pages flew by. This latter method is not well covered by electronic book formats.
To help application developers of all skill levels learn how to make better use of the tools at their disposal, there are several discussion lists available free of charge and backed by the developers of the tools themselves. We call these discussion areas "cafes" to give them the feel of places where people can come together, talk about issues or successes that they have had, and generally find out how to make use of the wide array of features in the Rational application development and application modernization tools. If you're an application developer and you haven't looked at the Rational cafes, I encourage you to do so at your first opportunity (there are links at the end of this post). You can just watch and read, or contribute your successes and questions to the cafe discussion - it's your choice.
These cafes help application developers get started. They ease the learning curve for using the tools, and serve as effective "how-to" documentation to fill the gap between product manuals and personal experience. Use them!
The top level link for much of this information is:
This posting is a bit off-topic from Application Modernization. However, it is related, since everybody has a need to use what are known as "office productivity" tools.
I have been a long-time Linux workstation user. I have enjoyed using the open source operating system and have found it to be, in general, the best environment to suit my needs as a developer at heart. When I was in college I used mostly SunOS and Unix-style environments and thus I always had a soft spot for a good command shell. But I digress.
As a Linux user, my options for "office productivity" tools were somewhat limited. After looking around, I became an Open Office user and have found Open Office to be quite good at documents, presentations, and spreadsheets - the three main elements of an office productivity suite that I use most often. There was nothing I could not accomplish in Open Office that would necessitate my using MSOffice. Interoperability with my colleagues who were using MSOffice file formats (.ppt, .pptx, .doc, .docx, .xls, .xlsx) worked reasonably well, with only minor re-formatting required for most files.
But I was always on the look-out for other tools. A couple of days ago I decided to give Symphony 3 a try. I tried Symphony 1.x in the past and was not impressed with its capabilities. I have to say, though, that with Symphony 3 the user interface is much improved, to the point that I think it is more polished than Open Office. Having worked with it for a week or so now, I am thinking that I will switch to Symphony 3 for the foreseeable future. The interface is a bit nicer than Open Office and its screen rendering seems just as good.
I suggest you give Symphony 3 a try. It's available for Mac, Linux, and Windows and ready for you to try it out.
One more note - I recently had an article published in IBM System Magazine - Mainframe EXTRA edition - see here. The article discusses multi-platform application development and the rising need for application programmers to be multi-platform and multi-language savvy. Enjoy.