Data Science, Machine Learning & API / SOA: Insights and Best Practices
When Service Identification Becomes Critical: Service Proliferation Syndrome and Service Interface Changes
Ali Arsanjani | Tags: governance, service_portfolio, service_proliferation, service_model, service_monitoring, soma
As our industry continues to mature in its adoption of SOA and related technologies, we are witnessing not only the subsiding of hype but also the emergence of new issues surfacing as growing pains. These include the need to change service interfaces as they evolve in step with changing business requirements. The service portfolio, the single point of reference for an enterprise's business capabilities, evolves continually with changes both inside and outside the enterprise.
As the Service Portfolio, part of the Service Model, matures, changes to service interfaces and service contracts are inevitable. I was approached with one solution: put them under change control. However expedient, this measure is not a cure but a temporary stabilizer of organizational turbulence. When changes to the underlying IT capabilities that reflect business needs are dealt with proactively, they tend to stabilize over time and yield the agility that powers sustained business performance.
Properly designing services, that is, applying sound Service Identification techniques and practices, is paramount for alleviating these kinds of issues. Second best is to use a registry and repository to store and manage services centrally. And of course you need a source code control system to manage the code that will consume the services.
Correct Service Identification, which alleviates the risks and impact of Service Interface and Service Contract changes, coupled with management and governance through a Service Registry and Repository, is thus paramount.
Further, as services abound and the organization grows more comfortable creating new or modified (see above) services, the need to manage and govern this Service Proliferation Syndrome becomes increasingly important. Again there are two aspects: one addressable at design time and another at runtime. Service Identification coupled with Service Refactoring and Rationalization (SRR) helps mitigate the first aspect through design-time best practices. Second, monitoring services at runtime goes hand in hand with the Service Identification and Monitoring approach to alleviate Service Proliferation.
IBM Service-Oriented Modeling and Architecture(TM) (SOMA) is IBM's end-to-end software development and integration method (that is, a process life-cycle) based on service-oriented software engineering principles, used to produce enterprise- and project-level SOA or non-SOA solutions. The method includes prescriptive guidance applicable during the phases and iterations of a software development life-cycle: the activities and tasks to be performed by designated roles, the input work products those tasks require, and the output work products and deliverables they produce or update at specified junctures. The tasks leverage a set of techniques ("capability patterns") that prescribe and promote the method's best practices for accomplishing each task and instantiating the related work products and deliverables.
SOA enables business agility. It brings flexibility to the IT systems that support the business architecture: as the business changes in a volatile environment of competitive engagement, well-factored IT systems can absorb those changes with less disruption.
The history of software engineering began with the separation of functionality, or processing, from data: data and the processes that manipulated it were traditionally kept apart. The pendulum swung back and forth; in one era emphasis was placed more on data, and in the next it swung back toward process. Eventually object-oriented programming, which grew into object-oriented design and object-oriented analysis, played a key role in unifying process and data. David Parnas was the first to suggest the notion of information hiding: access to data was done strictly through the functionality provided by an object. That is, an object encapsulated its data and protected it from direct manipulation. It offered a set of operations that you could perform on the data, but you could only invoke those operations, not access the data directly.
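The idea of information hiding can be sketched in a few lines of Java. The `Account` class here is a hypothetical example of my own, not from any particular system: its data is private, and the only way to change it is through the operations the object chooses to offer.

```java
// Hypothetical example: an object that hides its data behind operations.
class Account {
    // The balance is private: consumers cannot read or write it directly.
    private long balanceCents;

    Account(long openingBalanceCents) {
        this.balanceCents = openingBalanceCents;
    }

    // The only way to change the data is through the object's operations,
    // which can enforce invariants that direct field access could not.
    void deposit(long cents) {
        if (cents <= 0) throw new IllegalArgumentException("deposit must be positive");
        balanceCents += cents;
    }

    void withdraw(long cents) {
        if (cents <= 0 || cents > balanceCents)
            throw new IllegalArgumentException("invalid withdrawal");
        balanceCents -= cents;
    }

    long balance() {
        return balanceCents;
    }
}
```

Because `balanceCents` is private, a consumer cannot set the balance negative or skip the validation; the invariant lives with the data it protects.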
Objects often reflected real-world objects. The identity of a real-world object was used as an abstraction and implemented in a software system as a software object. Identity often corresponded to a real-world entity, although helper objects evolved from the IT constructs necessary to implement the real-world construct in a computer system.
One of the most important principles of object-oriented programming was a separation of concerns, not only in the domain of abstraction but also in the separation of interface from implementation. Not only did we unite data and the operations that manipulated it in one object; we went one step further by separating the interface of a behavior from the actual implementation of that behavior. The behavior, or rather its implementation, actually changed the state of the object, that is, manipulated the data directly. There may be several ways to implement this data manipulation, but there is typically one way to represent the interface: the externally visible signature a consumer would use to request a change to the data the object owns.
This principle is called programming to interfaces rather than implementation.
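A minimal Java sketch of this principle, using hypothetical names of my own (`TaxCalculator` and its implementations): the consumer depends only on the interface, so either implementation can be substituted without the consumer changing.

```java
// One interface, several implementations: the consumer programs to the
// interface and never learns which implementation it is using.
interface TaxCalculator {
    long taxCents(long amountCents);
}

// One implementation strategy: a flat 10% tax.
class FlatTaxCalculator implements TaxCalculator {
    public long taxCents(long amountCents) { return amountCents / 10; }
}

// Another strategy; indistinguishable through the interface.
class ZeroTaxCalculator implements TaxCalculator {
    public long taxCents(long amountCents) { return 0; }
}

// The consumer depends on the interface only, never on a concrete class.
class Invoice {
    private final TaxCalculator calculator;

    Invoice(TaxCalculator calculator) { this.calculator = calculator; }

    long totalCents(long amountCents) {
        return amountCents + calculator.taxCents(amountCents);
    }
}
```

Swapping `FlatTaxCalculator` for `ZeroTaxCalculator` requires no change to `Invoice`; this is the same property that later lets a service consumer survive a change of service implementation behind a stable contract.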
These interfaces, coupled with the notion of composing a set of related, often interdependent objects into the next level of encapsulation, the component, led to a whole new era of component-based development. A component exported an interface and could contain multiple objects, all typically related to one another in some logically cohesive fashion. These objects were expected to be highly collaborative with one another, so it made sense for them to be co-located within the packaging of a single component. This not only kept functionality manageable but also decreased the risk that nonfunctional requirements would be violated.
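The component idea can be sketched as follows; all names here (`OrderingService`, `InventoryChecker`, `OrderLog`) are hypothetical illustrations, not a real API. The component exports a single interface while packaging cohesive, collaborating objects behind it.

```java
// The component's exported interface: the only thing consumers see.
interface OrderingService {
    String placeOrder(String item, int quantity);
}

// Internal collaborators: cohesive and interdependent, hidden from consumers.
class InventoryChecker {
    boolean inStock(String item, int quantity) { return quantity <= 100; }
}

class OrderLog {
    private int nextId = 1;
    String record(String item, int quantity) {
        return "ORD-" + (nextId++) + ":" + item + "x" + quantity;
    }
}

// The component wires its collaborators together behind the exported interface.
class OrderingComponent implements OrderingService {
    private final InventoryChecker inventory = new InventoryChecker();
    private final OrderLog log = new OrderLog();

    public String placeOrder(String item, int quantity) {
        if (!inventory.inStock(item, quantity)) return "REJECTED";
        return log.record(item, quantity);
    }
}
```

Consumers hold only an `OrderingService` reference; the internal objects can be refactored, replaced, or re-partitioned without any consumer noticing, which is precisely the property a well-identified service is meant to preserve at enterprise scale.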
The promise of object orientation