This blog contains articles related to the practicalities around
application software development.
In this blog, I'll offer opinions and advice on the following topics:
- multi-platform application development
- skills required for future commercial programmers
- definition of "commercial software development"
- why versioning is important
- supporting migration, coexistence, fall-back
- what to think about when writing commercial software
I hope to answer some of these types of questions:
- How should source code control be managed?
- When should secure engineering be employed?
- How can teams maintain software for their successors to own?
I hope you find this information useful for your career and the software development that you are responsible for.
I see that I had high hopes of becoming a consistent, but not prolific, blogger back in November 2010 when I started this blog. Since then I've not updated it at all; as they say, life interfered.
So, as I begin the new year, I am redoubling my efforts to be much more diligent about adding information to this blog.
This first entry for the year will only contain a couple of topic ideas for future blog entries. Here are some things I've been thinking about as trends for Application Modernization and Commercial Software Development:
- Application Modernization is not a single pass process. It is a process of continual renewal (much like the maintenance one does on a house or other asset).
- The biggest issue/impediment for programmers is to "get started".
- The second biggest issue/impediment for programmers is to "stay engaged and on task".
- Multi-platform applications and application development is the future - and the future is now.
- There are aspects of writing software that apply to all practitioners ... and yet very few of these skills are taught through formal education:
- coding for the next person to read and update your code
- design and coding for migration, coexistence, and fall-back scenarios
- design and coding for forward and backward compatibility
- design and coding for multiple active versions deployed into production
- design, coding, and testing with security characteristics in mind
- Enterprise Architecture, Application Portfolio Management, and Enterprise Modernization are tightly related
I hope to spend time on each of the topics above in the coming weeks and months.
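To make the compatibility bullets above a bit more concrete, here is a minimal Python sketch of designing for forward and backward compatibility. All field and version names are hypothetical, invented purely for illustration: a record reader accepts both an old and a new schema, and preserves fields it does not yet understand.

```python
# Hypothetical example: a reader that tolerates multiple record versions.
# Version 1 records have a single "name" field; version 2 splits it into
# "first_name"/"last_name" and adds an optional "email".

def read_customer_record(record: dict) -> dict:
    """Normalize a customer record regardless of schema version.

    A missing "version" key is treated as the oldest schema (backward
    compatibility). Unknown future fields are preserved rather than
    rejected, so this reader can round-trip newer data it does not
    fully understand (forward compatibility).
    """
    version = record.get("version", 1)
    if version == 1:
        first, _, last = record["name"].partition(" ")
        normalized = {"first_name": first, "last_name": last, "email": None}
    else:
        normalized = {
            "first_name": record["first_name"],
            "last_name": record["last_name"],
            "email": record.get("email"),  # optional even in v2
        }
    # Carry along any fields this version of the code does not recognize.
    known = {"version", "name", "first_name", "last_name", "email"}
    normalized.update({k: v for k, v in record.items() if k not in known})
    return normalized
```

The key habit this sketch shows is that unknown fields are carried along rather than rejected, so multiple active versions of producers and consumers can coexist in production at the same time.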
I have spent the day today educating myself on IBM's Service Oriented Model and Architecture (SOMA) method. Why, you may ask. Well, as I look across the Rational product portfolio, we have a bevy of product offerings that are available for teams to use. These products are jam-packed with features and capabilities. So jam-packed, in fact, that the tools can be overwhelming at times.
One way to provide focus for architects, project managers, application developers, and test teams is to have them follow a well thought out process, using tools such as the Rational tools to support their work.
So, as part of my desire to help teams a) get started and b) stay engaged and on task, I find myself interested in how best to show the Rational tools being used at various stages of an application development process such as SOMA. Teams following a SOMA method will have clearly articulated tasks to accomplish, and my hope is to show the Rational tools (a combination of Rational Team Concert, Rational modeling tools, Rational requirements tools, Rational application development IDEs, and Rational testing tools) being used in each of these tasks.
I have seen that alignment has been done in the past between SOMA and the Rational Unified Process (RUP). This was a good start and was appropriate at the time that the work was completed. However, times have changed and both development processes and product portfolios have evolved since that work was completed several years ago. It appears that it is time to re-assess SOMA and the Rational tool-suite to show how both complement and support one another.
I had the good fortune to speak with a colleague of mine yesterday on this topic.
He and I agree that these two areas - Application Portfolio Management (APM) and Application Modernization (sometimes referred to as Enterprise Modernization) - are very much related to one another.
Indeed, APM is a process by which teams relate their business goals and priorities to their information technology (IT) assets, deciding which assets to refresh, rewrite, rehost, retire, replace, or refactor. And as such, APM work spawns application modernization work!
But there is a feedback loop involved as well. Since smart teams make informed choices as to what path to take, the tools used and the information gathered while performing application modernization are critical in making informed decisions about the organization's application portfolio. Thus, application modernization efforts feed information back into the APM process for the next spin through the analysis of the portfolio and determination of what direction to take the IT assets.
You can think of APM as a cyclical process which spawns application modernization projects. In turn, these application modernization projects are cyclical and feed information back into the APM process for the next cycle of portfolio analysis.
For those interested, the Rational products of interest for these areas of work are:
- Application Portfolio Management
- Application Modernization
These tools are complementary, with different team members making use of them at different stages of the APM and application modernization process steps.
I have recently started using the term "multi-platform application development" quite a bit. But what do I mean by this term?
From my perspective, as I look across a wide range of application development teams in many different industry sectors, what I see is a steep rise in the number of platforms, operating systems, and programming languages being used to provide complete solutions. Rather than computer programming converging on a common application infrastructure or programming language, I see that the programming task is becoming more and more diverse.
The industry has stated for years that one of the keys to extensibility is to have clearly defined interfaces and employ loose coupling to enable freedom of choice of platform, runtime environment and programming language. It appears that the industry has succeeded in this! That is the good news.
On the other hand, by having such freedom, applications tend to become combinations of multiple different elements or components, each running in their own environment, with their own constraints, capabilities, and limitations. Testing and debugging such applications starts to get much more complicated since the set of dependencies (on what must be up and running, or somehow scaffolded to appear to be available) rises with each loosely coupled element in the overall system. Furthermore, the team that provides one component may not always be available when there are questions related to that specific component.
Because of the rise in platforms, runtimes, and programming languages, coupled with the necessity of organizations to streamline their IT staffs, including programming staffs, as much as possible, there is now increased pressure on application development teams to be able to switch from platform to platform, from runtime to runtime, and from language to language with speed and laser-point accuracy. This is the essence of multi-platform development.
Just because an application is built of a loosely coupled set of components, each written in whatever language is most appropriate and running on the most appropriate system, doesn't mean that a programming staff only needs to worry about one small part of the solution. Application development teams are responsible for the entire solution and thus must be ready to take on the management, maintenance, and enhancement of the entire application, no matter what platform, runtime, or language the piece of the solution was created to run within.
In summary, multi-platform application development is the idea that application development teams will need to be increasingly platform-nimble and language-nimble in order to be the most productive and most valued assets to the development staff of their organization. The application programmer that can move freely between platforms, runtimes, and languages will be easiest to re-align to the ever-changing needs of the organization to meet its business goals. Application development teams should be thinking about how to be multi-platform savvy as they improve their programming skills.
In one of my previous blog updates I noted that the hardest thing to do when writing software is getting started. I figured I would explain myself a bit more in this post and give some links to some useful information for those who are possibly new to the Rational application development tools.
The task of writing software, either creating it from nothing or editing existing software to make changes and enhancements, is one that requires focused attention and concentration. Anything that provides a distraction or pulls the person away from the specific task they need to accomplish simply gets in the way of the programmer. Because of this need for concentration, it is very difficult to "get into the zone". And once a programmer is in that zone, the last thing you want to do is pull them out, since doing so just requires them to spend extra effort and time to re-engage and catch up to where they had left off when they were interrupted.
The day-to-day tasks of the application programmer require similar tasks to be performed over and over. These include accessing source code, finding a particular location in the source code that needs attention, making those changes, building a test driver, and testing the changes to the application that the programmer made. These steps are performed countless times by programmers, no matter what language, runtime environment, or platform they are working on.
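As an illustration of one of those repeated steps - locating the spot in the source code that needs attention - here is a small Python sketch of the kind of search an IDE performs for you. The file suffixes and the pattern are just examples, not tied to any particular product.

```python
# A minimal grep-like search over a source tree: yields every line in
# matching source files that contains the given pattern.
import re
from pathlib import Path

def find_in_sources(root, pattern, suffixes=(".cbl", ".c", ".java")):
    """Yield (file, line_number, line) for each source line matching pattern."""
    rx = re.compile(pattern)
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in suffixes:
            for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if rx.search(line):
                    yield (str(path), n, line.strip())
```

An IDE does the same work with an indexed, cross-referenced search, which is exactly why it shortens the "find the right spot" step of the loop described above.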
The Rational Developer tools (for System z, zEnterprise, and Power Systems) are well-suited to streamline the tasks noted above. As such, after they have been set up appropriately, these capabilities help the programmer make an easier task of getting started on making changes (i.e. it's easier to "get in the zone"). And once the programmer is in that zone, the integrated development environment (IDE) helps to keep them engaged on completing that task.
The following links contain videos which show off some very basic examples of using the IDEs to do these common application programming tasks. I hope you enjoy them:
There are many more instructional videos on using the Rational tools located at the IBM Education Assistant website. I hope you find these useful for getting your team started and keeping them on task!
This posting is a bit off-topic from Application Modernization. However, it is related, since everybody has a need to use what are known as "office productivity" tools.
I have been a long-time Linux workstation user. I have enjoyed using the open source operating system and have found it to be, in general, the best environment to suit my needs as a developer at heart. When I was in college I used mostly SunOS and Unix-style environments and thus I always had a soft spot for a good command shell. But I digress.
As a Linux user, my options for "office productivity" tools were somewhat limited. After looking around, I became an Open Office user and have found Open Office to be quite good at documents, presentations, and spreadsheets - the three main elements of an office productivity tool-suite that I use most often. There was nothing I could not accomplish in Open Office that would necessitate my using MSOffice. Interoperability with my colleagues who were using MSOffice file formats (.ppt, .pptx, .doc, .docx, .xls, .xlsx) worked reasonably well, with only minor re-formatting required for most files.
But I was always on the look-out for other tools that could be used. In looking around, I decided a couple of days ago to give Symphony 3 a try. I tried Symphony 1.x in the past and was not impressed with its capabilities. I have to say, though, that with Symphony 3 the user interface is much improved, to the point that I think it is more polished than Open Office. Having worked with it for a week or so now, I am thinking that I will switch to using Symphony 3 for the foreseeable future. The interface is a bit nicer than Open Office and its screen rendering seems just as good.
I suggest you give Symphony 3 a try. It's available for Mac, Linux, and Windows.
One more note - I recently had an article published in IBM System Magazine - Mainframe EXTRA edition - see here. The article discusses multi-platform application development and the rising need for application programmers to be multi-platform and multi-language savvy. Enjoy.
I have stated before that the hardest part about application development is getting started. What I mean by this is that it takes great effort to get oneself into the mind-set and position of actually designing a piece of software, writing a program or routine, building that source code into a program, testing the program to see if it runs, and so on. Performing this work takes a strong will and focused concentration on the specific task at hand. In many cases, other distractions such as e-mail, instant messages, Facebook status, SMS text alerts, and even the phone need to be turned off or ignored just so you can get into the necessary level of concentration to perform the task at hand.
There are times, though, especially when first learning how to use some of the application development tools that are available, that being able to ask questions, get answers, and interact with others who are working through similar situations is an incredibly productive way to understand how to perform a task or learn a feature about the tools that you were not familiar with before. Reading through product documentation can be boring and difficult, especially with most documentation only available in electronic format. On the one hand, electronic format is easily searched based on keywords. On the other hand, everyone has, at one time or another, found what they were looking for more quickly by fanning through the pages of a book until something catches their eye, either in a figure, a section title, or a code snippet that they saw as the pages flew by. This latter method is not well covered by electronic book formats.
To help application developers of all skill levels learn how to make better use of the tools at their disposal, there are several discussion lists available free of charge and backed by the developers of the tools themselves. We call these discussion areas "cafes" to give them the feel of places where people can come together, talk about issues or successes they have had, and generally find out how to make use of the wide array of features in the Rational application development and application modernization tools. If you're an application developer and you haven't looked at the Rational cafes, I encourage you to do so at your first opportunity (there are links at the end of this post). You can just watch and read, or contribute your successes and questions to the cafe discussion; it's your choice.
These cafes help application developers get started. They ease the learning curve for using the tools, and serve as effective "how-to" documentation to fill the gap between product manuals and personal experience. Use them!
The top level link for much of this information is:
I keep hearing that COBOL skills are in decline. But for some reason, I just don't believe that this is something for organizations to worry that much about.
Here's my reasoning.
Stepping back from all of this, what strikes me as most interesting is that the number of computer languages that people are becoming at least conversant in is rising, not falling. And on top of this, if a computer programmer is good at anything by the time they have obtained their degree, it should be the ability to pick up, read, learn, and create new programs in languages that they, at first, are not familiar with. This is just the way the software industry is headed. Software engineers will need, as a prerequisite of getting a job, the ability to learn and use languages that are "new to them".
And so, that being the case, the world is full of potential COBOL programmers! There's not a shortage of them any more than there is a shortage of C/C++ programmers.
Now, what is difficult is learning new run-time environments and new application development environments. Figuring out all the details of how to access source code, how to edit it, how to compile, debug, and test it - those are tough things to pick up. We get used to using the tools we're familiar with, and our muscle memory takes over and helps us be even more productive in the environments where we're used to working. This is where having development tools that ease the access to and usage of unfamiliar environments comes to the rescue. And that is where Rational Integrated Development Environments (IDEs) such as Rational Developer for Power Systems and Rational Developer for zEnterprise come in.
So the next time someone laments to you the lack of COBOL skills, think again. I think there are millions of them available ... they just don't know it yet.
As I read over my posting from late last week and thought about this situation a bit more, I kept coming back to the term I used in the title of this posting - skills portability. I believe that application development teams will be challenged increasingly to be able to move, quickly, from project to project and from application source code base to source code base with ease, precision, and speed. In such an environment, teams and individuals will need to be multi-lingual (programming languages), multi-platform, and multi-environment savvy. These teams will have to switch from environment to environment to perform their designated tasks. In this way, these teams will have to be able to adapt their skills to different environments, thus requiring portable skills or the virtue of skills portability.
How is this possible? Does the current computer engineering or computer science curriculum emphasize such skills development? Is there a set of educational materials or exercises that will hone such skills? I think that the answers to these questions lie not in the specific courses that students take, but in the ways in which the courses are taught, the tools and environments used for projects and homework during the classes, and the emphasis on tool usage that is made both during education and on-the-job training that takes place during internships and in a permanent position.
There seems to be something of a quantum gap between "heavy tool users" (those that rely heavily on the capabilities of an integrated development environment (IDE), build automation tools, and emulation/test environments) and "command-line users" (those that are masters of arcane commands such as 'ls', 'mv', 'find', 'grep', 'make', and so on). For a person to move from one of those environments to the other is quite an effort. Everything is different. Seemingly simple tasks turn into hours of drudgery or Internet searching to find the answer. Even moving between different command-line environments can be a challenge. Commands are different, options are different, and editor styles and features vary widely.
Moving between different platforms, languages, and environments when using an IDE, however, seems to be quite a bit easier - even if the IDE is different from one the user has used in the past. A user of an IDE is already familiar with the need to "hunt around" through the menu bar or context menus to find an appropriate task. Users expect fly-over help and visual cues to guide them in "doing the right thing". In this way, I have found that users of IDEs wind up being able to transfer their skills between multiple languages, platforms, and runtime environments much more quickly and effectively than those who are experts in the particular command-line tools of the particular platforms that they are used to using.
In summary, for skills portability, I encourage teams and individuals to take that quantum leap into learning and using an integrated development environment (IDE). It will be painfully slow for the first several uses. But you will make back that investment in time saved, precision, and the ability to apply your skills to a wider set of platforms and runtime environments than otherwise possible.
A colleague of mine pointed out this interesting article in eWeek: link to article. I was very happy to see this getting some attention in the industry press. The article points out some very good information about COBOL and its use in the computing industry. I think it's safe to say that COBOL is a programming language for commercial software development, and has been used effectively in a wide range of business-related applications. Features pointed out in the article include:
- much existing business software is written in COBOL
- there is a large amount of COBOL being written every year
- there are still a great number of programmers writing applications in COBOL
- COBOL is available for many platforms, including emerging cloud-based runtime environments
- there are ways of using COBOL alongside other programming languages as well, including Java
- there are COBOL compilers from several vendors including Micro Focus and Veryant
I feel the need to point out some additional information that was not part of the article. IBM has continued to provide a wide range of tools and support for COBOL (and other languages as well) in the form of COBOL compilers, run-time environments, application development tools (editors, builders, debuggers), and application analysis and program understanding tools. I will use this blog post to point some of these features out, corresponding to the features that the article noted above describes.
In terms of support of COBOL and applications written in COBOL, IBM provides:
- support for programmers working with COBOL
- syntax-highlighted editing of COBOL programs with context-sensitive help and type-ahead support
- interactive debugging of COBOL programs running on remote systems
- program build and syntax check support, linking compilation errors back to the source code in the editor
- generation of COBOL programs which are accessible through web services
- generation of web services from existing COBOL programs
- See: Rational Developer for System z, Rational Developer for Power Systems
- support for COBOL compilers on multiple platforms
- using COBOL with multiple programming languages including C/C++, EGL, and Java
- application analysis of COBOL applications
- understand the program structure of a COBOL application
- understand the databases and files that a COBOL application uses or modifies
- understand the logic flow of a COBOL program
- perform impact analysis of changes that are being considered for a COBOL program
- See: Rational Asset Analyzer
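To illustrate the impact-analysis item in the list above, here is a hedged Python sketch of the simplest form of that computation. The program names are invented, and a real analysis tool works from parsed source rather than a hand-built graph; the idea is simply that, given a who-calls-whom graph, you can find everything transitively affected by changing one routine.

```python
# Naive impact analysis over a call graph: starting from one changed
# routine, walk the "callers" relation to collect every program that
# would be directly or indirectly affected.

def impacted_by(change, callers):
    """Return all programs that directly or indirectly call `change`.

    `callers` maps a routine name to the set of routines that call it.
    """
    seen = set()
    frontier = [change]
    while frontier:
        current = frontier.pop()
        for caller in callers.get(current, set()):
            if caller not in seen:
                seen.add(caller)
                frontier.append(caller)
    return seen
```

Even this toy version makes the value of tooling clear: the hard part in practice is building an accurate call graph from millions of lines of COBOL, which is precisely what an analysis product automates.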
Workstation and cloud environments can serve as excellent development and test systems to develop, test, and refine COBOL applications. In most cases, production runtime environments will dictate the ultimate deployment topology in which the COBOL applications run. These environments may be cloud-based or utilize traditional data-center-hosted systems. For development and test, the IBM Smart Business Development and Test Cloud offers quick and easy access to development and test systems for individuals or teams to use.
In summary, IBM also provides a wide range of support for application development teams working with, extending, renovating, and building COBOL applications. I encourage teams to investigate all the great tools available for this important and useful programming language!
I had the pleasure of visiting my alma mater last week to interview students for summer internship positions at IBM. I always enjoy going to the campus to meet with faculty and students and hear about what they're studying, what research they're working on, and what's on their minds.
One thing always strikes me as interesting, though, when I talk to students about their software development experience. They are always excited to tell me about the projects that they have worked on and the cool technical problems that they had to overcome to build a solution. Some of this software runs on Linux systems, some of it is web application server-based, some of it is database programming, and still other projects are various game implementations that the students have created. The interesting thing here is what they don't talk about. I very rarely hear about how the students worked as a team to collaborate on the project. Yes, they do work as a team. But more often than not, working as a team turns into a group of them all sitting in front of the same LCD screen and sort of "team coding" the solution.
This is just one aspect of what I will call "professional software development." And it is something that I'm sad to see is not getting as much attention as I would like it to get in our education system. Most curricula that I have seen relegate topics such as "software design", "team structure", "planning", "source configuration management", "requirements management", "work item management", and "test management" to courses such as "Software Engineering". These courses are typically junior or senior level elective courses. The outcome of this approach is that students fresh into the work-force have developed software development habits that are best described as "solitary". The concepts of multiple people working simultaneously on a source base are only experienced if the students are contributing to some open source project - and only rarely do students consider this as part of their class work. The benefits of coordinated build and automated testing are never considered. And managing a group project using a source configuration management system that can coordinate multiple people's work and assist with addressing conflicting changes to source code is completely foreign.
When these students enter the work-force, they wind up spending a large amount of their time learning all of these working characteristics "on the job". And in doing so, they have to un-learn what could be viewed in the industry as "bad habits" before they can appreciate the benefits of working alongside many others who are all contributing to the same source code base.
I tend to think that we could do better here by instituting a development process for our university students that encourages good, strong, "professional software development" practices from the outset of their college experience. In so doing, we would be preparing these students much better for the style of programming that is found in most medium to large software development environments. This would also encourage "good coding habits" before those "bad habits" take root, resulting in software development teams that seek out good source configuration management, requirements tracking, defect and feature tracking, and automated build tools rather than shun them.
What have other people seen in their interactions with new hires and college students they have worked with? I'm interested in hearing your experiences.
One of the reasons for the relatively long pause in blog entries on this blog a couple of weeks ago was because I was lucky enough to have a vacation. The vacation was great, plenty of sun, scuba diving, and relaxing on the beach. I also had time to read a couple of books which I had been wanting to get through for some time. One of those books is the subject of today's blog entry.
I read the book You Are Not A Gadget, by Jaron Lanier. For some, this book may seem to be an attack on the Internet (Mr. Lanier is careful never to capitalize the I in Internet, by the way). I did not find this to be the case. I believe that Mr. Lanier makes some very good points about the current and possible future uses of computing technology. While we have witnessed extraordinary growth in the computational, storage, and networking capacity of computing systems (as witnessed by the recent IBM Watson showing on Jeopardy!), this has not resulted in any quickened pace of innovation in other areas of human creativity (for example, music). Indeed, there are quite a few unpleasant areas of the Internet, and certainly the ability to criticize in relative anonymity can make the Internet a seemingly unfriendly place to be at times.
On the other hand, the Internet has enabled startling growth in communications between people, even if that communication is not as formal or perhaps as civil as it might have been were the people sitting across from one another. We see evidence of social networking being used to organize peaceful protests, locate lost friends, and re-unite relatives who have been searching for one another.
As luck would have it, I completed the book just before all the excitement over the IBM Watson Jeopardy! Challenge event. The contrast between the cautionary guidance from Mr. Lanier and the cutting-edge research into machine learning of IBM Watson seems on the surface to be considerable. However, my interpretation is somewhat different. What I see here is that we are moving to an era where computing systems can serve as extremely capable advisory systems to assist us in being even more innovative and expressive than ever before. We must, however, be sure to keep in mind our ethics and manners as we blaze this trail, taking responsibility for what we write, compose, draw, publish, and produce.
As for computers taking over the planet - well, that does make for great science fiction. For now, I'm content to keep in mind that because every computer is still programmed, there are ample (unintentional) errors and missing features in them, so we're not going to rely on them for everything anytime soon. And that brings me right back around to the topic of application development, development tools, and application modernization, which is the subject of this blog. Since the collective mountain of software in the world just keeps rising (we don't do much of anything to reduce the pile), there's ample opportunity for teams to improve applications as they build new features. IBM Watson isn't writing software ... at least not yet.
I had the opportunity to speak at the SHARE Conference in Anaheim, CA last week. For those not aware of this conference, it is one of the longest-running technical conferences and offers attendees the opportunity to learn about new and interesting ways to take advantage of their mainframe systems, in conjunction with the myriad other computing systems in their enterprise. The conference is membership-driven and is held twice a year in the United States. Not only do the attendees learn from the sessions, they also get the opportunity to network with their peers from other organizations, compare notes, trade techniques, and learn from one another.
I have spoken at the conference in the past on a variety of topics ranging from the z/OS LDAP server and how to configure and use LDAP directory servers, to security offerings from IBM Tivoli and how to use them alongside or to enhance mainframe security administration, to topics concerning enterprise architecture and software development. Every time I attend, I find it refreshing that the attendees bring with them a fierce desire to learn new technologies and how to apply them to their organization.
At this Spring's conference, I presented on three topics:
- Application Development for z/OS - a discussion around the latest and greatest technology for writing and maintaining applications that run on z/OS systems, covering applications that run in CICS, IMS, DB2, or batch, as well as programming languages such as COBOL, PL/I, C/C++, Java, JCL, and REXX.
- Application Modernization - a presentation about application portfolio management and application modernization and how these two processes are cyclical in nature and linked to one another.
- Multi-platform Application Development - a forward-looking presentation about where application programmers and application development teams are going to have to get to in terms of being nimble and agile in the future. The premise for this talk is that application programming teams are generally able to pick up and use new programming languages quickly. What impedes their progress is the time it takes to learn platform-specific tools and runtime environments. To the extent that application development tools mitigate or remove the need to learn new tools and environments, teams can become effective and efficient moving from project to project and platform to platform.
The next time you hear that there is a skills gap for mainframe application development, think again. Computer programmers and computer scientists today are eager to learn and use new environments (new to them, at least). And developers can now choose a tool-set that applies to multiple runtime environments and multiple languages simultaneously. Because of this, these developers can quickly pick up programming languages they have not used in the past and be effective developers of applications running on a wide variety of systems. This means that mainframe application development skills are within every organization's reach, as long as those teams are enabled appropriately.
It's been a bit of time (about 2.5 months) since I've had a chance to update this blog. In that time, I've been working on projects in several areas including:
- Application Portfolio Management
- IBM Compilers support and capabilities
- Enterprise Modernization tools, including IDEs, source code management, and build automation tools
- Cloud computing and its implications for application development and development tools
We're also all working hard to prepare for the upcoming Innovate 2011 conference, taking place in June 2011 in Orlando, FL. This year's conference looks to be an exciting one, including an appearance by IBM Watson, the computer that successfully competed in a Jeopardy!-format trivia contest earlier this year.
There are also going to be several interesting session tracks at the conference for people working on applications which run on AIX and z/OS systems. There are dedicated tracks for each of these platforms, with sessions to cover a wide range of topics around application design, development, test, and support for applications running on these platforms.
In the lead-up to IBM's annual Innovate conference, Innovate 2011, the Enterprise Modernization team has put together some short and concise video overviews of our application development tooling and application lifecycle management solutions. You can view these videos on YouTube.
I encourage people interested in learning more to spend a couple of minutes to discover the wealth of features and capabilities that are provided in these tools.
Software development teams that want to spend more time coding new function and less time answering those age-old questions "So, are you on time? On budget? On schedule?" are always looking for ways to make reporting something that they don't have to spend time on. Building reports whenever someone asks for status may not exactly waste the development team's time, but it does take away from their ability to focus on the tasks that they're being asked to do, namely, design new solutions, write code, and deliver functions, fixes, and enhancements.
I have always been interested in how application development tools can actually help development teams rather than get in their way. One of the areas of interest for me is in this area of reporting. How well can a version control and bug tracking system support a team's need for reports, or their management's hunger for status reports from the team? Is it easy to create these reports? How do the reports look? How can consumers of the reports get them without impacting the development team who is usually asked to create them?
In my experience, most bug tracking systems and version control systems have various mechanisms for getting information out of them and then leave it to some set of smart people to figure out how to make the results look pretty for their organization. This usually means that an enterprising individual in the organization has to take it upon themselves to figure it all out, write some reports, and then maintain those reports over time. This is somewhat to be expected since every organization tends to want their reports to look and act a certain way, whether this is because of past history and tools that they have used, or because they just like to have some customization.
Bottom line, the development tool needs to serve multiple needs. There need to be some good built-in or pre-canned reports to get people started, and then there needs to be a good way of extending/adding reports to the environment so that the right information is available to the right folks ... whenever those folks need to access that information.
After working with lots of different tools over the years, I keep being impressed by what Rational Team Concert (RTC) provides in both built-in functions/features as well as extensibility. This area of reporting is no exception. I found a great article on setting up Rational Team Concert reports this morning, written by Ken Kumagai, titled "Creating customized reports through multiple project areas in Rational Team Concert". At first, the information provided seems complex and daunting. BIRT reporting is used, you have to understand that RTC provides a "Jazz data source" that is usable in BIRT reports, and so on. However, once you get past this, you see that there is real power here.
First, RTC provides a bunch of built-in reports that can be used in dashboards. Second, after looking at some of the references, including some from Tim McMackin back in 2010 (see: "Creating custom reports with BIRT and Rational Team Concert", a 3-part series), you see that RTC also offers a great way to build customized reports and then integrate those reports into dashboards as well.
So here you have it, a set of tools to allow both easy reporting and customized reporting across both work items and version control information. This wouldn't be possible without an integrated tool that allows access to this type of information. If you haven't already, you should have a look at these capabilities if your software development teams are plagued with the typical issues of needing to provide status all the time while still finding time to write the applications and features that they're expected to deliver.
I spoke with a customer the other day who is looking for a way to call a web service from a COBOL program running in batch on a z/OS system. After thinking about it, I realized that more people than just this one customer might be interested in this topic, so I've made a blog entry about it.
I typically refer to this type of thing as an "outbound" web service call. The web service implementation happens to be written in Java, runs on a Windows Server, and front-ends a SQL Server database. The COBOL batch application needs to retrieve some information from the SQL Server database as part of its batch processing.
I talked about three high-level approaches:
- Use a mechanism like EXCI or a local queue to make a request to a CICS transaction or a WebSphere Application Server application running on the z/OS system. From this runtime environment, make an outbound web services call. Interestingly enough, the customer indicated that they are already calling outbound web services from some of their CICS transactions.
- Use an MQ queue setup to communicate with a "proxy" which will translate queue messages into web service requests, and the responses back into queue messages. I used DataPower as the easiest example. Configure the DataPower box to take an incoming MQ message, copy the message elements into a web service request, invoke the web service, take the response data and put it onto an MQ queue message, and send that back.
- Use an open-source web services client (such as C/C++ Axis or Java Axis) and call it directly from the COBOL batch application. The customer asked how this relates to the COBOL INVOKE statement. INVOKE is what is used in COBOL to call Java code - with the Java/COBOL interoperability support - so I told them that yes, this statement is what would be used to make calls to the Java web services client.
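The second option's message flow can be sketched in plain Java. This is only an illustration of the translation steps, not DataPower configuration or real MQ code: in-memory queues and a stub web service (both assumptions of this sketch) stand in for the actual middleware.

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Function;

// Sketch of the queue-to-web-service "proxy" flow described above.
// A real deployment would use the MQ APIs and a SOAP/HTTP client (or a
// DataPower flow); here in-memory queues and a stub service stand in
// for both, purely to show the shape of the translation.
public class QueueProxy {

    static void pump(Queue<String> requestQueue,
                     Queue<String> replyQueue,
                     Function<String, String> webService) {
        String msg;
        while ((msg = requestQueue.poll()) != null) {
            // 1. Copy the queue message elements into a web service request.
            String request = "<lookup><key>" + msg + "</key></lookup>";
            // 2. Invoke the web service.
            String response = webService.apply(request);
            // 3. Put the response data onto a reply-queue message.
            replyQueue.add(response);
        }
    }

    public static void main(String[] args) {
        Queue<String> req = new ArrayDeque<>();
        Queue<String> rep = new ArrayDeque<>();
        req.add("CUST001");

        // Stub web service: wraps the request in a response envelope.
        pump(req, rep, r -> "<result>" + r + "</result>");

        System.out.println(rep.poll());
        // -> <result><lookup><key>CUST001</key></lookup></result>
    }
}
```

The point of the pattern is that the batch program only ever sees MQ messages; the proxy owns the web-service plumbing, so the COBOL side needs no HTTP or SOAP code at all.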
There were also some questions about whether or not there would be issues in calling a web service within a long-running batch program loop. I indicated that no matter what, there's going to be some added latency in contacting another system and retrieving data from it. What we typically recommend is that if the number of individual items to process in a given time is very high, then consider creating another web service which allows a batch of requests to be made through a single web services call. Then use the information returned within the batch program's "inner loop", alleviating the need to call the web service on every iteration.
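That batching recommendation can be sketched as follows. This is a minimal illustration under assumed names (the `LookupService` interface and its `lookup` method are hypothetical, not a real API); the point is simply that one remote round-trip per batch replaces one per record.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the "batch of requests" pattern: instead of one web service
// call per record in the batch program's inner loop, records are grouped
// and a single call fetches results for the whole group.
public class BatchedLookup {

    // Hypothetical web service client interface (an assumption of this sketch).
    interface LookupService {
        Map<String, String> lookup(List<String> keys); // one remote round-trip
    }

    static int callCount = 0; // counts simulated remote round-trips

    static Map<String, String> processRecords(List<String> keys,
                                              int batchSize,
                                              LookupService service) {
        Map<String, String> results = new HashMap<>();
        for (int i = 0; i < keys.size(); i += batchSize) {
            List<String> batch = keys.subList(i, Math.min(i + batchSize, keys.size()));
            callCount++;                           // one round-trip per batch...
            results.putAll(service.lookup(batch)); // ...instead of per key
        }
        return results;
    }

    public static void main(String[] args) {
        // Stand-in service: echoes each key with a value.
        LookupService stub = ks -> {
            Map<String, String> m = new HashMap<>();
            for (String k : ks) m.put(k, "value-of-" + k);
            return m;
        };
        List<String> keys = new ArrayList<>();
        for (int i = 0; i < 1000; i++) keys.add("key" + i);

        Map<String, String> out = processRecords(keys, 100, stub);
        System.out.println(out.size());   // 1000 results retrieved
        System.out.println(callCount);    // in only 10 round-trips, not 1000
    }
}
```

With per-call latency dominating, cutting 1000 calls down to 10 is usually the difference between a batch window that fits and one that doesn't.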
Note that all sorts of tools exist to make the job of connecting together the different data formats involved above. There are WSDL files describing the web services, COBOL data structures (COPYBOOKs), Java classes, MQ message queue data formats, and so on. Some of these tools generate source code (Java, COBOL, and even PL/I), others generate configuration settings for various environments, and some do both. Which tool is best for you depends on which of the paths above you choose to take.
I often hear from people I work with
that “System z is old school” or “no one writes code for
mainframes any more.” I quietly let them say their piece – but I
know better. There is plenty of new and exciting work being done
with every hardware platform that IBM offers – System z included.
But I'm getting ahead of myself. In this blog post, I'll look at
mainframe application development, debunk some common
misconceptions, and provide links to the latest technology for
streamlining application development on z/OS and Linux for System z.
Current Thinking on Developing Applications for Mainframe Systems
Except for the savvy group of "Master the Mainframe" students from universities, if you ask someone
in a Computer Science program how much exposure to mainframes
and z/OS they receive, you'll find that they give you something of a
hollow stare. “Mainframes? Nobody writes applications for them
any more.” some of them might (erroneously) say. And even if you
ask someone who knows that there is quite a bit of new work done on
mainframe systems every day, they usually think of some archaic
programming environment involving large, heavy, and old text-based
terminals with 24x80 screens.
There's an ongoing perception that
application development for z/OS requires text-based user
interfaces, assembler language code written in ALL CAPITAL
LETTERS, and doing without common comforts such as interactive debuggers.
This causes most people to think that writing applications for z/OS
is hard to do and thus they tend to avoid it whenever possible.
This, in turn, feeds the urban legend of the difficulty in working
with z/OS, since the set of people working with the systems becomes
more isolated. While this ancient history is fun to think about, it
turns out that the reality of developing applications for z/OS is
very much different than this urban legend.
Adding to this misconception, people also
believe that accessing mainframe systems is nigh impossible to
accomplish. Here is a point that has a level of truth in it. It is
true that mainframe systems don't grow on trees, or fit in our
pockets. And so it has been a bit of a struggle in the past to get
access to mainframe systems. But recent offerings are making this
barrier to entry come down – I'll give some pointers on this at the
end of this blog post.
It turns out that as of 2012, the
common perceptions about application development for z/OS are very
far from reality. The reality here is that application development
for z/OS is very much like application development for any other
platform. You use modern application development tools:
integrated development environments (IDEs), interactive debuggers,
feature-rich syntax-highlighting editors, and wizards which
generate application source code on your behalf. These tools also
include capabilities to interact with remote databases, allowing
teams to test out complex SQL queries prior to coding them into
applications, and support for multiple programming languages including
Java, C/C++, COBOL, and PL/I. The tools assist development teams in
compiling, linking, and debugging their applications, all from a
remotely connected workstation which communicates with the
application running on the mainframe system.
COBOL is a very common language used
for business applications running on mainframe systems. The tools
available for COBOL are vast and deep in function, ranging from code
analysis and understanding tools to advanced code re-factoring
support built into the code editors. This re-factoring support
allows for re-locating blocks of function, ensuring that the
resulting application code still performs the same set of steps after
the re-factoring is complete.
In addition to COBOL, however, Java is
a very popular programming language for mainframe applications. Java
applications running in application server environments, on-line
transaction processing environments, as well as for running batch
programs are all supported, with the tools helping teams write these
applications and deploy them into the target runtime environments.
This still leaves the access to the
z/OS systems as something to take care of. Well, it turns out that
there is a solution to this issue as well. It is now possible to
access a high fidelity emulation environment for application
development of mainframe applications – whether those are COBOL,
C/C++, Java, or even assembler language programs. And it covers all the
runtime environments in which those applications might run:
batch, WebSphere Application Server, CICS, IMS, or DB2. This
environment is a true emulation of mainframe functions running on a
workstation, rather than a simulation. As
such, development teams can get much more familiar with System z and
z/OS systems as well as working with their applications running in
their intended runtime environments.
Now to Make it Easy
All the parts which I've described
above have been available separately for quite some time now.
However, the burden has been on the
development organization to realize the integration points amongst
these tools and to figure out how to use the above tools in the
combination for which they were intended.
A solution from IBM takes the guessing
out of this and helps teams realize the benefits of using the above
tools in combination. The Integrated Solution for System z Development (ISD for z) is a combination of
the tools I have noted above. This solution represents the
recommended set of tools for developing applications that contain
mainframe components. The solution and the corresponding
integrations have been optimized to support typical development usage
scenarios. With ISD for z, you get all the capabilities which you
have come to expect when writing code for Linux or Windows platforms,
along with a whole lot more features to enable team collaboration
across projects which have pieces that run on multiple platforms.
Application development for z/OS and
System z is really just like development for every other platform.
Tools exist to help access the system environment, edit, compile, and
debug applications running on the platform, and even understand
existing applications which have evolved over decades of continual
refinement. And access to a z/OS system for application development
purposes is easier than ever before – removing that barrier to
entry which has held development teams back in the past.
Have a look at these and I think you'll
agree that the future is bright for software development – no
matter what the target platform is. In fact, the tools enable
organizations to choose their deployment platform based on their
business requirements, not based on the skill-set of their
organization. This frees up the organization to choose the best
platform for the job to be performed – freedom indeed!
Anyone who writes software for a living has been in this position. You're asked to make an update to an existing application or to add a new feature to an application. And you've never seen the source code before. Or maybe you have, but that was 2 weeks ago and so many things have happened since then that you can't recall a thing about the application.
What do you do? If you're like most developers, you pick up a piece of the source code, look at it for a while, get a feel for what's going on, and repeat until you have a good enough idea of the program structure that you feel confident in making some changes. Unfortunately, this is time-consuming, really difficult to do, and you even run the risk of studying a piece of code that is either rarely executed or might even be wrong. This situation gets astronomically more difficult when the source code base rises into the tens or hundreds of thousands of source modules and spans thousands of relational tables in a database.
There are tools available to help developers navigate through a large source code base, and they're much more useful for code understanding and analysis than the programmer's trusted friends: find, grep, and Perl. One such tool is Rational Asset Analyzer. This tool provides users the ability to search very quickly through large amounts of source code coupled with DB2 and IMS/DB database schema. While some basic source code metrics are available easily from the starting pages of its user interface, the real power of RAA comes out when its program structure diagrams, impact analysis, and dead-code display features are used. By indexing the source code by the variables and identifiers used in it, RAA can point out subtle dependencies between modules and database tables which would otherwise go unnoticed by developers unfamiliar with the source code base.
Leshek Feidorowicz has recently created a short video which shows some of RAA's features. I found it very informative, and if you are in need of program understanding tools I think you'll find it interesting as well. You can find it here.
I'll be using this blog over the next 3
weeks or so to chronicle my experiences with something very different
from my normal job. I have been given a great opportunity to learn
about emerging markets, help out a growing city, and experience a new
culture, all rolled into one. IBM has a Corporate Service program
called the IBM Corporate Service Corps. Through this program, IBM
sponsors groups of IBM employees to work with local governments in
emerging markets, to bring new technology and capabilities to these
areas so that they can grow in a smart and efficient manner.
I've been accepted into this program
and have just started a 3 week assignment in Da Nang, Vietnam. For
those who are not quite familiar with the area, Vietnam is in
southeast Asia, roughly halfway between India and Japan. Da Nang,
Vietnam is near the mid-point, north-to-south, of the country, on the
coast of the South China Sea. Da Nang is the third largest urban
area in Vietnam, with about 1 million people. Hanoi, in the north,
and Ho Chi Minh City (Saigon), in the south are much larger.
Our goal, while here in Da Nang, is to
work with the local government to find ways to improve three areas of
critical infrastructure: water management, transportation, and food
safety. Within these three, I will be focusing on food safety, along
with my IBM colleague, Eileen Doherty. We will learn much more about
what challenges Da Nang faces in these areas as we work with
government leaders over the 3 week assignment.
My travel to Vietnam started early on
Tuesday morning (5AM from the house, 7AM flight) from Raleigh, NC USA
(-5 GMT). After a 4-hour layover at JFK airport in New York, I took
a 13.5 hour flight to Incheon, South Korea. The flight path for this
flight took us very near the North Pole, flying over Canada, Hudson
Bay, the Arctic circle, Siberia, and China before landing in South
Korea at 4PM local time – on Wednesday afternoon. After another
3-hour layover, we took our final flight, 4.5 hours more, to land in
Da Nang, Vietnam (+7 GMT) at 10PM on Wednesday evening. Since the US
is on daylight saving time right now, Da Nang is 11 hours ahead of
the current time in Raleigh, NC.
My first impressions of Da Nang were
very good – the airport is very modern, customs officials are very
nice, and the people are quite helpful. This promises to be a very
interesting 3 weeks!
I've been in Da Nang, Vietnam for less
than 24 hours now, have a good night's sleep behind me, and have had
my first face-to-face meeting with the IBM team. We've been having
prep calls leading up to this assignment for the past 8 weeks, but
it's always great to put a face to the names and voices that you've
heard on the calls. We come from all over the world – 4 from the
USA (Connecticut, North Carolina, Illinois, and California), 1 from
Singapore, and 1 from Shanghai. We're also lucky to have two
additional team members with us from Royal Dutch Shell. They hail
from Texas and Brunei.
We got together today to discuss our
backgrounds, go over logistics for the engagement, and meet our
interpreters. Yes, we will be using interpreters for our meetings
with local government and others. This will be a first for me and so
I am looking forward to how this will go.
I haven't mentioned yet that Da Nang's
climate is hot and humid! During the day, it's in the 30s (Celsius)
with high humidity. Definitely like late summer in North Carolina.
In the afternoon, since our first
meetings with government officials will be on Friday, we went into
the city center of Da Nang for lunch and to get a better feel for how
residents of the city work, eat, and travel. While the streets are
in very good condition, there are relatively few cars. And a huge
number of motorcycles! I would estimate the ratio to be around 10:1,
motorcycles to cars/trucks. The next thing you notice is that there
are very few traffic lights. Instead, motorists all move in a
constant flow style, even at intersections. This applies to
pedestrians as well. When you cross the street, don't wait for
traffic to clear. Rather, move deliberately and with a constant
speed. Motorists will miss you as long as you keep moving. Believe
it or not, it works … it does take a bit of getting used to!
I am looking forward to meeting the
mayor of Da Nang tomorrow, and also meeting the heads of water,
transportation, and agriculture departments as well. Now that we're
mostly over our jet lag, it's time to get down to some work!
As a part of our work on this service
corps assignment, we are expected to work closely with members of the
local government departments. However, in order to do this, it is
necessary, like in many other government situations, to follow proper
procedures and to establish the support of the appropriate leaders in
the city government.
Today, we had the opportunity to meet
with Department of Information and Communication (DIC) Chairman Chien
along with leaders from several departments in the Da Nang government
offices. In particular, we met the leaders of the Da Nang Office of
Natural Resources and Environment (DONRE), Da Nang Water Company
(Dawaco), Da Nang Office of Foreign Affairs (DOFA), Da Nang Office of
Agriculture and Rural Development (DARD), and Da Nang Office of
During the meeting, the government
leaders introduced themselves to us. We, through our spokesperson
for the meeting, Peter Williams and our interpreter, introduced our
team as well as members from the local IBM team who will be helping
us throughout the week. We went on to express our gratitude to the
city of Da Nang for having us work with them and our excitement to
get started in working with the individual departments. Chairman
Chien assured us that his departments will work with us to get us all
the information we need to assess their current projects, identify
improvements, and recommend a path to the future to support a city of
innovation and knowledge-based workforce for Da Nang.
At the end of the meeting, gifts were
presented. We received a beautiful engraved plate containing
interesting features of the city of Da Nang.
The meeting was short, but a very good
way to start off our work with the city leaders. With this warm
introduction, we are ready to get started in earnest at the beginning
of next week. We will have meetings with each of the departments to
learn details of what they are currently doing, what their plans are,
and what areas are in need of improvement. We will be working
through interpreters for the majority of our discussions with the
teams, which will certainly be a new experience for most of us.
I am looking forward to next week when
we start our in-depth discussions and analysis.
Today was a new set of experiences for
us in our services corps assignment. As part of a community
out-reach element to our visit, we spent time today speaking to
university students at a Cloud Camp, sponsored by IBM.
I had the great honor of speaking to
the students about Cloud Computing and the benefits which this
concept is bringing to Information Technology providers world-wide.
This is the first time that I had to speak to an audience through an
interpreter – I am very grateful to Anh for her work to provide
Vietnamese interpretation of my presentation.
I had a tough act to follow when I did
my talk today. Before me was a musical performance by two students
at the university. While I didn't understand any of the words of the
song, the vocal and guitar were very well done and pleasant to listen
to.
After the presentation in the morning,
we had some free time in the afternoon. With the beach so close to
our hotel, Dave Hagerman, another member of the Executive Service
Corps team, and I headed out to the beach area to enjoy a great
lunch al fresco followed by a relaxing time on the beach. The white
sand of the beaches here is incredible so if you're a beach lover, Da
Nang should be on your list to visit!
We started our Executive Service Corps
assignment in earnest this morning in Da Nang, Vietnam. To kick-off
the assignment, we were hosted at the Da Nang Software Park to
introduce ourselves to the Da Nang city government departments,
describe the IBM Smarter
Cities Challenge, describe IBM
Smarter Planet and Smarter
Cities initiatives, and congratulate the city of Da Nang on being
one of 33 cities across the world to receive the IBM Smarter Cities
Challenge grant.
I delivered the presentation to the
audience of local government leaders and members of the local media,
both print and television. With the help of the team, I described
the IBM Smarter Planet and Smarter Cities vision and formally
announced the city of Da Nang as an IBM Smarter Cities Challenge
recipient. This was followed by a question and answer session with
those in attendance. What a great experience!
With the formalities completed, each of
our sub-teams separated and met with different government departments
in the afternoon:
- For water management, Peter Williams, Nancy Quek, and Sergio Kapusta (from Royal Dutch Shell) met with the Department of Natural Resources and Environment.
- For transportation, Bounthara (Bounty) Ing and Dave Hagerman met with the Department of Transportation.
- For food safety, Eileen Doherty, Lex Backer (from Royal Dutch Shell), and I met with the Department of Health.
In our meetings with the Department of
Health, we learned many things about the current operations and
procedures of the department in the area of health inspections, food
preparation, and gathering information pertaining to food-related
health incidents across the city of Da Nang. We will use this
information as we develop strategies for bringing smarter cities
capabilities to the city.
One interesting fact about the City of
Da Nang is that the city is divided into seven districts. These
districts vary widely in their physical size and make-up. Some of
the districts are very urban and densely populated. Three of the
districts border on the seafront. And one of the districts, Hoa
Vang, is very large and extends far to the west into the mountainous
region inland from the sea. Each of these districts faces different
types of challenges in the area of food safety.
Everyone is very excited to be working
with the local government in fact-finding mode this week. We will
use this information as input into building recommendations over the
course of our assignment.
As a parting note for this blog article
– the food in Vietnam just continues to get better and better. I
tried, for the first time yesterday, a fruit called Durian.
If you can get past the initial smell, it's really not bad. Think
ripe mango with the consistency of cooked sweet potatoes.
And tonight, we had cooked giant prawns
– bigger than I had ever seen before:
Most of that is head (which you don't
eat), with the body section underneath the exo-skeleton which you
dig into and enjoy. A dipping sauce of salt/pepper/spices, lime
juice, and a bit of mayonnaise was served on the side. Very good!
Today, our work started before the
crack of dawn. And even leaving the hotel before 5:00 AM, we were
well behind the normal schedule of people in Vietnam. Our task this
morning was to see first-hand how meat, poultry, fish, fruits, and
vegetables are distributed throughout the city of Da Nang.
We started at a wholesale market on the
south side of the city. The level of activity early in the morning
was astounding. Everyone was working very hard and very diligently
to prepare food for purchase to other market sellers and to
businesses who were purchasing their food for the day. In this
model, time to market is very important – and the flavor of the
food reflects this attention.
We had to arrive early – by 5:45 AM,
the bustle is over. Food is already on its way to locations around
the city for people to purchase, take home, and cook or store for
their own use.
We also visited a local market which
sells food to residents of the city. These markets are quite similar
to “farmers markets” which are found all around the United
States. Growers and sellers bring fresh food in from various
locations and sell it the same day.
Interestingly, there are also
western-style markets as well. The “Metro” mart is very similar
to an American Costco or Sam's Club store. And the “Big C” and
“Co-op mart” are similar to large-box department stores.
In the afternoon we had our first
meeting with the Department of Agriculture and Rural Development.
This group has a very important and challenging task. They are
responsible for overseeing the production and distribution methods
used for all types of food which are consumed in the city of Da Nang.
This includes meat, poultry, fish, fruits, and vegetables. As the
city grows in size, this team will be challenged even further to keep
pace with the rising amount of food which is brought into Da Nang
from three different sources: farms within the rural districts of Da
Nang, other provinces within Vietnam, and also imported from other
countries.
We are looking forward to working
closely with the Department of Agriculture and Rural Development to
help them utilize smarter planet and smarter cities capabilities in
the city of Da Nang. We will be meeting with the department again in
the next few days to identify areas where use of information
technology can benefit the department, the city, and the people of Da
Nang.
Today brings us to the one week mark in
our service corps assignment here in Da Nang, Vietnam. Our schedule
for the day included an additional meeting with the client followed
by time collecting notes, additional information, and brainstorming
solutions to assist the city in improving food safety.
One thing which is clear to us already
is that the city has already put in place several successful programs
to improve food safety for residents and visitors. In our experience,
the food quality is very good, procedures are followed, and overall
the system that is in place is working quite well.
The food supply turns out to consist of
several quite complex processes. There is food production at the
start of the process, either the raising of livestock or the planting
and harvesting of crops. In the middle there is distribution of
either live or processed food, followed by either sales to consumers
through various market shop formats or through preparation for
consumption at restaurants of various types. At each stage of this
supply chain, there are checks for quality which must be made, with
information logged and reviewed. Making this more complex is that
the procedures followed are different for food produced locally
(within the city of Da Nang), imported (from the perspective of the
city) from other provinces within Vietnam, or imported from other
countries and delivered to Da Nang. Tracing food origins, both for
quality control and for responding to food-related incidents is a
monumental task, even when looking only at the 700 tons of meat
products which are consumed within the city of Da Nang every month.
With a growing city, this number will just continue to rise.
We are now looking ahead to the future
as the city prepares for the expected growth in both permanent
residents and visitors to the city, its beaches, and the exclusive
resorts which already exist or are being built. It is astounding to see all
the construction underway in the city of Da Nang. Several years ago,
there was only one bridge across the Han River which separates the
city proper from the beach area to the east. Now there are five
bridges with another two, with quite interesting architecture, under
construction. Likewise, in the city center, there are several high
rise building projects underway for offices and luxury hotels. This
is clearly a city on the rise.
There comes a time in all of these
projects, it seems, when everyone realizes “the party's over”.
That day started for our sub-team today. Our task here in Da Nang is
centered around helping the city improve its infrastructure and
prepare for the expected rate of population growth. With this in
mind, and knowing that we only have three weeks to put something
together, we began in earnest to dig through the material we have
gathered, start to make sense of it, and begin developing an approach
which can be repeated by the Department of Agriculture and Rural
Development after we have completed our assignment.
Like every city, there are issues which
can be addressed immediately. And like every city, all work needs to
be considered with respect to the budget which the department must
remain within. And what we are coming to see is that the food safety
issues facing the city of Da Nang are very similar to the food
safety issues facing other cities of similar size around the world.
The current processes and practices for food handling are different,
but the goal and end result is the same – delivery of fresh food to
consumers in an efficient and safe manner as well as educating both
consumers and sellers on the proper handling and storage of food.
This morning, we developed a set of
themes to serve as a backdrop for the recommendations which we will
be sharing with the Department of Agriculture and Rural Development later next week. For each of the themes, we will be suggesting a set
of recommendations, along with a suggested plan or roadmap for how to
get from where the city is now (As Is state) to where the city
desires to be (To Be state), over the next 5-10 years. This is a
very common approach to high level strategy and planning – identify
the current state, identify the desired state, and then put
activities in place, on a schedule, to arrive at the desired state in
the desired timeframe.
We met with members of the department
for a third time this afternoon and presented this overall approach
to them. They are very interested in what recommendations we are
going to suggest, and also the timeline against which we think these
recommendations can and should be implemented. We are close to
having a strong base of understanding the current state. We will be
discussing the desired state with the department early next week, and
then spending much of next week formulating the recommendations and
roadmap for implementation.
We are also hoping that the department
will be able to apply the methodology we use to other areas
of food creation, delivery, preparation, and consumption after our
engagement is complete.
No pictures today since the insides of hotel and company conference rooms look the same all over the world!
Today was another heads-down working
day for the teams. Our sub-team, working on food safety, spent the
morning working on two broad areas. First, we reviewed the materials
which we had gathered so far, noting areas where we felt we needed
more information. This resulted in a set of questions which we
submitted by e-mail to our counterparts in the Department of
Agriculture and Rural Development. More on that in a moment.
The second area we spent time
discussing was a brainstorming session on what our first draft set of
recommendations should be for improving food safety for the city of
Da Nang. Keep in mind here that the current food supply chain works
very differently from most western areas. In countries such as the
United States, the food supply chain relies heavily on refrigeration
in order to keep food fresh as it travels from where it is processed
to where it is sold and eventually consumed. In Da Nang, the food
delivery system relies on “fast and fresh”. Particularly for
meat products, live animals are brought daily to slaughterhouses
which are located within the city. From here, food moves very
quickly from farm to table, much of it without the benefit of
refrigeration. Because of this, time to market is incredibly
important. With this in mind, we are considering what
recommendations are appropriate for the city of Da Nang to consider.
A word about communication. This is
the first time that I have had the need to work through an
interpreter for both spoken and written communication. While this
can be very frustrating at times, it is also amazing that we can make
effective progress using these means. This is a credit to our great
interpreters who are doing a marvelous job of providing both spoken
interpretation during meetings and written interpretation as we send
text correspondence to one another. My thanks to them!
Our afternoon was spent in a combined
meeting of the food safety, water management, and transportation
teams. We each reviewed our progress so far, what has gone well,
what challenges we have faced, and indicated our preliminary
findings. We also discussed the intersections and cross-cutting
issues which have impacts across all of the teams. One very
interesting aspect is that food, transportation, and water are all
inter-related. Further, it is clear that food, water, and
transportation all place heavy demands on energy production and
require a reliable energy supply. At the end of
the afternoon's meeting, we were comfortable with our progress so far
– though there is much left to do! We had the pleasure of meeting
with the IBM Country General Manager (CGM) for Vietnam – Vo Tan
Long. He gave us some really good insights from his experience
working with other ESC and CSC teams who have been in Vietnam in the
past.
The team elected to have a relaxing
dinner at a restaurant in Da Nang which has the goal of helping deaf
residents of Vietnam. Because it is difficult for the deaf to
communicate, they often find it difficult to live in the country.
This restaurant offers them a good job, lessons in reading, writing,
and sign language, and even a place to live. The staff is very
friendly and the food (western style: burgers and pizza) is really
fantastic. If you're in Da Nang and just want a good pizza, check it
out.
This being Saturday, the team went on a
sightseeing trip to two great locations which are relatively close to
Da Nang. We first visited My
Son (pronounced “Me Son”, where the “o” in “Son”
sounds like “soap”). My Son was an ancient holy site of the Cham
(pronounced with the “a” like “cat”) people who were settlers
in the area in the 9th to 14th centuries AD.
These people were eventually displaced from the area by people migrating
from the north and from the south.
The ruins at My Son are very
interesting. As I walked through the area, I was reminded of a
vacation I took with my family to visit some Mayan ruins in Mexico.
The type of construction and the form of the ruins are completely
different, though. The builders used a type of brick which has not lost
its color over many hundreds of years. Also, no mortar was used
between the bricks as they were laid to create the temples. Several
teams have attempted to re-create the type of brick and the
construction methods but have failed. It is quite easy to point out
the parts of the ruins which have been re-constructed and those which
are original.
While we were walking through the
ruins, a group of children on a school trip were very interested in
talking with us. They are eager to practice spoken English, as they
are only able to learn to read and write English in school. The
kids are like children everywhere – full of energy, a little bit
shy, but very fun to talk to.
After leaving My Son, we headed over to
an ancient seaport which is also south of Da Nang and called Hoi
An. Hoi An was one of the busiest seaports in the central part
of Vietnam in the 16th and 17th centuries.
Because so many different types of people came to the port, the city
is built up with a large variety of architectures. The narrow
streets and riverfront walking areas are wonderful to stroll through,
look at items in the endless number of shops, and negotiate for the
best price with the merchants.
It was rather hot out in the sun today,
but the whole team had a great time learning about the history of
Vietnam and visiting these two historic locations.