Saturday, November 22, 2008

Custom XSL functions @ESB & BPEL

In one of my projects there was a requirement to assign a unique Id from a database to keep track of a request across both ESB and BPEL. The options we had in mind were either a custom XPath function or a custom XSL function, and compared to the XPath route the XSL one looked easier. Today I will brief you on how to define and use a custom XSLT function in ESB & BPEL.

As usual with BPEL and ESB, this has a Java part and an XML (configuration) part.


I will start with the Java part.


Create a class, for example ‘com.geo.IdGenerator’. Write a static method, say ‘getGlobalMessageId()’, which will return the unique Id. To keep the sample simple, I will use a static integer initialized to 0 that increments by 1 on each call.


package com.geo;

public class IdGenerator {

    // Simple in-memory counter, for demo purposes only;
    // in a real project this would come from a database.
    static int id = 0;

    public static String getGlobalMessageId() {
        return "" + id++;
    }
}

 
Your Java class is ready. Compile it (using javac) and create a jar using the jar utility:

jar cf XSLService.jar com/geo/IdGenerator.class

Copy the jar to $ORACLE_HOME/j2ee/$OC4J_INSTANCE/applib.

Coming to the configuration part, the first question that comes to mind is: how will the XSL file access the class?


You need to add the namespace

http://www.oracle.com/XSL/Transform/java/$package.className

to your XSLT, where $package.className is replaced by com.geo.IdGenerator in our example.


The next question is: how will the method name be recognized?

In the XSL file, call the method with the namespace prefix:

<xsl:value-of select="service:$method_name()" />

where $method_name is replaced by getGlobalMessageId in our example.


So finally your XSL snippet will look like this:


<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:service="http://www.oracle.com/XSL/Transform/java/com.geo.IdGenerator"
    ……………….>

……………………

<messageId>
    <xsl:value-of select="service:getGlobalMessageId()"/>
</messageId>

………..

</xsl:stylesheet>


Isn't it simple? :)

Friday, November 14, 2008

dynamic partner link @BPEL

Recently I was asked to take a session on advanced BPEL topics, and one of the topics I believed would interest the audience was how to use dynamic partner links. I would like you all to go through this article - Making BPEL Processes Dynamic. It explains the concepts behind, and the need for, dynamic partner links.

I will be explaining a stripped-down version of this concept using my HelloWorld examples, which I believe is apt for understanding dynamic partner links. I will not go into step-by-step detail because the above-mentioned article explains it all.

  • The first step is to come up with a common interface WSDL which all the services interacting through the dynamic partner link should follow. This standardizes the service at the partner link so that every implementation has the same operations and input/output messageTypes.

  • The next step is to create the dummy BPEL processes/services which will take part in dynamic binding. Here I will use BPEL processes as the services. Create a BPEL process, say “HelloProcess1”. Replace the content of the BPEL process WSDL file with the content of the common WSDL. Modify the BPEL process (.bpel file) - change the namespaces and variables to use our custom-defined WSDL namespaces and messageTypes. After that, add your logic to the process. Create another BPEL process, say “HelloProcess2”, and follow the same steps. Now you have your services taking part in the dynamic partner link. Deploy them.

  • The next step is to create the main BPEL process which will call our services. Define a switch activity which checks the input and dynamically calls either HelloProcess1 or HelloProcess2. In the partner link WSDL - which is our common HelloService WSDL - you need to add the services:

<service name="HelloProcess1">
  <port name="HelloServicePort" binding="tns:HelloServiceBinding">
    <soap:address location="http://geo.com/Process"/>
  </port>
</service>

<service name="HelloProcess2">
  <port name="HelloServicePort" binding="tns:HelloServiceBinding">
    <soap:address location="http://geo.com/Process"/>
  </port>
</service>
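For reference, the tns:HelloServiceBinding that both services point at lives in the same common WSDL. A hedged sketch of what that shared portType and binding could look like - the operation and message names here are my own illustrative assumptions, not taken from the original sample:

```xml
<!-- Illustrative sketch only: operation and message names are assumptions -->
<portType name="HelloService">
  <operation name="process">
    <input message="tns:HelloRequestMessage"/>
    <output message="tns:HelloResponseMessage"/>
  </operation>
</portType>

<binding name="HelloServiceBinding" type="tns:HelloService">
  <soap:binding style="document"
                transport="http://schemas.xmlsoap.org/soap/http"/>
  <operation name="process">
    <soap:operation soapAction="process"/>
    <input><soap:body use="literal"/></input>
    <output><soap:body use="literal"/></output>
  </operation>
</binding>
```

Because both HelloProcess1 and HelloProcess2 implement this single portType, the caller can swap endpoints at runtime without touching the invoke activity.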

No need to worry about the soap:address; it will be assigned at runtime. To assign the address at runtime you need to use an EndpointReference XML fragment.

For HelloProcess1

<EndpointReference xmlns="http://schemas.xmlsoap.org/ws/2003/03/addressing">
  <Address>http://hsg-host_name:port/orabpel/default/HelloProcess1</Address>
  <ServiceName xmlns:ns1="http://geo.com/HelloService">ns1:HelloProcess1</ServiceName>
</EndpointReference>

For HelloProcess2

<EndpointReference xmlns="http://schemas.xmlsoap.org/ws/2003/03/addressing">
  <Address>http://hsg-host_name:port/orabpel/default/HelloProcess2</Address>
  <ServiceName xmlns:ns1="http://geo.com/HelloService">ns1:HelloProcess2</ServiceName>
</EndpointReference>

This code snippet will be assigned to the dynamic PartnerLink, and it decides at runtime which service to call.
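To make the assignment concrete, here is a minimal sketch of the assign activity in the main process that copies the EndpointReference literal onto the partner link before the invoke. The activity and partner link names, "AssignEndpoint" and "HelloPartnerLink", are my own placeholders; adapt them to your process:

```xml
<!-- Hypothetical sketch: copies a literal EndpointReference onto the
     partner link; "HelloPartnerLink" is an assumed name -->
<assign name="AssignEndpoint">
  <copy>
    <from>
      <EndpointReference xmlns="http://schemas.xmlsoap.org/ws/2003/03/addressing">
        <Address>http://hsg-host_name:port/orabpel/default/HelloProcess1</Address>
        <ServiceName xmlns:ns1="http://geo.com/HelloService">ns1:HelloProcess1</ServiceName>
      </EndpointReference>
    </from>
    <to partnerLink="HelloPartnerLink"/>
  </copy>
</assign>
```

Each branch of the switch would perform a similar copy, differing only in the Address and ServiceName, and the subsequent invoke on the partner link then goes to whichever endpoint was assigned.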

  • Deploy the main BPEL process and see the results. Based on the switch condition it should call HelloProcess1 or HelloProcess2.

I have uploaded the HelloWorld sample @ DynamicPartnerLink. It should help you get started and understand the flow I have explained here.

Monday, November 10, 2008

SOA based Logging Service - part2

In my last blog I gave a brief overview of the different factors that cropped up during the initial thought phase on the Logging Service. Since this service was one of the core components in our SOA-based foundation framework, a lot of thought went in before we finalized the low-level requirements and features. I am listing some core functional requirements that the service should satisfy.

  • Service should have a straightforward, lightweight interface

  • Service should use a standard, extensible schema to represent the information

  • Service should have different levels of severity used to differentiate between logs

  • Service should have timestamps associated with each log which will help in preserving the order of logged events.

  • Service should represent a set of related logs using a single unique identifier which will be helpful in auditing/tracking

  • Service should be able to discard log messages which are below the specified severity

  • Service should have multiple destination support (Database, JMS, File, Email)

  • Service should have the option to set business rules to route logs to different destinations

  • Service should be able to modify the data passed in from the client on the fly.

  • Service should be able to route to a different destination in case the defined destination fails (failover)

  • Service should be exposed as a web service so that it’s easily accessible to all the heterogeneous systems which form part of the SOA environment.
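To make the schema, severity, timestamp and correlation requirements above concrete, a single log entry handled by such a service might look something like this. This is purely illustrative - the element names are my own, not the schema the team actually settled on:

```xml
<!-- Illustrative log entry; all element names are assumptions -->
<logMessage>
  <!-- single unique identifier tying related logs together for auditing/tracking -->
  <correlationId>ORD-20081110-0042</correlationId>
  <!-- timestamp preserves the order of logged events -->
  <timestamp>2008-11-10T14:32:05Z</timestamp>
  <!-- severity lets the service discard logs below a threshold -->
  <severity>ERROR</severity>
  <source>OrderBookingBPEL</source>
  <message>Credit check service unavailable</message>
  <!-- extensible section for system-specific details -->
  <details>
    <detail name="retryCount">3</detail>
  </details>
</logMessage>
```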

We froze the requirements so that we could start the design and development phase and see how things work out. I will write more on this and other components of the SOA foundation framework in my coming blogs.

Sunday, November 2, 2008

SOA based Logging Service - part1

Finally it's November, considered to be a lucky month for me. I thought I would start this month's blogging with a non-technical article. I was recently working on designing and developing SOA foundation components that should be generic in nature and usable across multiple clients. Heavy name, right?? What was the SOA foundation component supposed to do??? It was supposed to be an Error Handling/Logging/Auditing/Notification Service. I really liked this idea since it involved a lot of research and development - and plenty of late nights, weekends and head-breaking.


So today I will give a brief overview of some factors we considered while doing the requirement analysis of the Logging/Auditing feature of the service.

  1. What to log - error data, auditing data, system-problem data for troubleshooting; this list keeps growing.
  2. Log volume - with multiple systems in the environment there will be a very large number of log messages.
  3. Log diversity - logs generated by different systems all look different, so the log format needs to be standardized across all platforms and systems.
  4. Bad logs - log messages which do not have enough information are also a major challenge and need to be handled properly.
  5. Integrating different logging mechanisms - all applications will have different ways of error handling and logging, so we need to zero in on the best approach.
  6. Making sense of logs - analyzing the logs based on severity, system and other details, using pre-defined standards.
  7. Managing the logs - using a console/reporting tool to let users analyze and make the best use of the logs.

This list will keep growing, but for a generic service these are the factors I can remember.

In my coming blogs I will write about the different options we had in mind and the features we included in our implementation.

Happy November :)