Archive for the ‘SOA Suite’ Category

Oracle Fusion Middleware with Docker

This is one of the best posts I’ve seen in a long time about Oracle Fusion Middleware (SOA Suite) and the Oracle 12c database. It shows how to use Chef, Puppet and Docker to build a container running Oracle Fusion Middleware 12c with an Oracle 12c database. Excellent and truly useful!

Categories: Oracle, SOA Suite

Oracle Fusion Middleware still spells dsa as dss

As I found out the hard way several years ago, the documentation for Oracle’s FTP adapter in SOA Suite 10.1.3 does not match the code. For public key authentication with ssh-dsa, the preferredPKIAlgorithm property should be set to ssh-dsa according to the documentation, but the code expects ssh-dss. It turns out that this bug remains: the documentation still insists on ssh-dsa and the code steadfastly uses ssh-dss. Some things never change.
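The workaround is simply to configure the value the code expects. As a sketch, in the 10.1.3 resource adapter descriptor (oc4j-ra.xml) the connection factory entry would carry something like this (surrounding elements elided):

```xml
<!-- The documented value "ssh-dsa" fails; the code expects "ssh-dss". -->
<config-property name="preferredPKIAlgorithm" value="ssh-dss"/>
```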

Categories: SOA Suite

Unused namespaces begone!

We recently ran into a tricky problem with SOA Suite 11g: unused namespaces in XML files. Consider a simple input file like this one (the namespace URIs here are placeholders):

<?xml version="1.0" encoding="ISO-8859-1" ?>
<a:A xmlns:a="http://example.com/a">
   <b:B xmlns:b="http://example.com/b">
      <Name>Name 1</Name>
   </b:B>
</a:A>
Now, we want to transform it into an output file that consists of the element B. The transformation is simple:

<?xml version="1.0" encoding="ISO-8859-1" ?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:a="http://example.com/a"
                xmlns:b="http://example.com/b"
                exclude-result-prefixes="a b">
  <xsl:template match="b:B">
    <xsl:copy-of select="."/>
  </xsl:template>
</xsl:stylesheet>
Unfortunately the result is not so good:

<?xml version = '1.0' encoding = 'UTF-8'?>
<b:B xmlns:a="http://example.com/a" xmlns:b="http://example.com/b">
   <Name>Name 1</Name>
</b:B>

The resulting file still declares the namespace bound to the prefix a, even though nothing uses it. This is allowed by the XSLT specification, but some XML engines fail to read the resulting file as they choke on the unused namespace. What to do?

Fortunately SOA Suite 11g supports XSLT 2, at least partly. Change version to 2.0 and add copy-namespaces to the copy-of statement:

<?xml version="1.0" encoding="ISO-8859-1" ?>
<xsl:stylesheet version="2.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:a="http://example.com/a"
                xmlns:b="http://example.com/b"
                exclude-result-prefixes="a b">
  <xsl:template match="b:B">
    <xsl:copy-of select="." copy-namespaces="no"/>
  </xsl:template>
</xsl:stylesheet>

This produces the desired result. XSLT 2 has many other useful features, so it is well worth investigating even though it is not fully supported in SOA Suite yet.

Categories: SOA Suite

Write XML files with specific encoding in SOA Suite

Outbound FTP adapters in Oracle SOA Suite normally use the default character encoding. Usually that means that all XML files are written in UTF-8. What do you do if you need to use something else, for example ISO-8859-1?

The logical solution would be to set an adapter property. There are three likely candidates: CharacterSet, Encoding and jca.message.encoding. Unfortunately none of them works. CharacterSet and Encoding apply only to attachments and are not even used by the adapter; the documentation states that they are meant for third-party applications that need to process the attachment. The jca.message.encoding property is used, but only when the adapter reads files, not when it writes them.

What to do? The solution has been with us for a long time; the adapters in 10g worked the same way. For native files the encoding can be specified in the schema with the nxsd:encoding attribute. It turns out that this also works for normal XML files that have no need for native conversions.

Simply add the nxsd namespace declaration and the following attributes to the root element of the main XSD:

  xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
  nxsd:version="XSD" nxsd:encoding="ISO-8859-1"

Problem solved: the adapter will use ISO-8859-1. A bit dirty (in particular when the schema is owned by someone else), but it works.

Categories: SOA Suite

Async JAX-WS web service for BPEL

I have just completed a portable asynchronous JAX-WS web service that works with BPEL and WS-Addressing in Oracle Fusion Middleware 11g. I thought that it would be easy, but in fact it took several days to get it right; hence this blog entry.

First I got the web service up and running and working when invoked from SoapUI. That was not too bad. The main difficulty was how to get at the WS-Addressing headers in a portable way. The callback is invoked by another thread, possibly days after the original call and possibly from another physical machine, so I had to find and store the reply address and the message id. I used a SOAPHandler, as described in Asynchronous web services with WS-Addressing.

Unfortunately BPEL did not want to play. Luckily I found an excellent guide: Create an Asynchronous JAX-WS Web Service and call it from Oracle BPEL 11g. It noted that BPEL expects the WSDL of an asynchronous web service to be structured in a specific way:

  1. The asynchronous request operation and the callback operation must be on different ports in the same service.
  2. A partnerLinkType section is required to associate the request and callback roles.
  3. The asynchronous operation must not have a return type (or it will be interpreted as a synchronous service).
  4. The callback operation should contain a single input that represents the value returned by the callback.
  5. The asynchronous method operation must specify a value for the soapAction attribute.
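Put together, a WSDL that satisfies these rules has roughly the shape sketched below. All names, namespaces and messages are hypothetical; the partnerLinkType namespace shown is the WS-BPEL 2.0 one, and the bindings and the two-port service are elided:

```xml
<wsdl:definitions name="SomeAsyncService"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://example.com/async"
    xmlns:plnk="http://docs.oasis-open.org/wsbpel/2.0/plnktype"
    targetNamespace="http://example.com/async">

  <!-- Rule 3: no wsdl:output, so BPEL treats the operation as asynchronous. -->
  <wsdl:portType name="RequestPortType">
    <wsdl:operation name="process">
      <wsdl:input message="tns:RequestMessage"/>
    </wsdl:operation>
  </wsdl:portType>

  <!-- Rule 4: the callback has a single input that carries the result. -->
  <wsdl:portType name="CallbackPortType">
    <wsdl:operation name="processResponse">
      <wsdl:input message="tns:ResponseMessage"/>
    </wsdl:operation>
  </wsdl:portType>

  <!-- Rule 2: a partnerLinkType ties the request and callback roles together. -->
  <plnk:partnerLinkType name="SomeAsyncServiceType">
    <plnk:role name="RequestRole" portType="tns:RequestPortType"/>
    <plnk:role name="CallbackRole" portType="tns:CallbackPortType"/>
  </plnk:partnerLinkType>

  <!-- Rules 1 and 5: both port types become ports in the same wsdl:service,
       and the soap:operation binding for the request specifies a soapAction. -->

</wsdl:definitions>
```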

After that there were no problems. Along the way I found the following post, which was also quite interesting: Dynamic WSDL location with JAX-WS.

Categories: Java, SOA Suite

Unit tests for SCA components with adapters in 11g

2012-04-20

The unit test framework for SCA components that comes with Oracle Fusion Middleware 11g has many serious limitations. It deals poorly with failures, reducing the types of tests that can be created. Today I ran into another really frustrating limitation, but luckily I managed to find an acceptable workaround. Share and enjoy.

The composite under test is dead simple. An inbound DB adapter triggers a Mediator. It calls another Mediator which in turn invokes an outbound DB adapter. Each input record creates a similar output record.

I created a unit test with a fixed input and an assert for the output step. It worked, but the DB adapter inserted the record. Not what I wanted, but easy to fix – just add an emulation for the adapter, right? No.

It is not possible to add emulations for DB adapters; the add button in JDeveloper is disabled. Most likely this applies to the other adapter types as well. When a test runs, all the adapters run, updating other systems. That is not a good idea in a continuous integration setup where the tests run all the time.

As usual with SOA Suite, the solution involves handcrafting generated code. Edit the WSDL file for the output adapter: find the wsdl:operation element and copy the wsdl:input element to a wsdl:output element. Then edit the unit test. It is now possible to define an emulation, as JDeveloper thinks that the adapter is a synchronous web service. When the test has been saved, restore the original WSDL file.
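As a sketch, the temporary change to the adapter WSDL looks like this (operation and message names are hypothetical):

```xml
<wsdl:operation name="insert">
  <wsdl:input message="tns:SomeCollectionMsg"/>
  <!-- Temporarily added so JDeveloper offers an emulation;
       removed again once the test has been saved. -->
  <wsdl:output message="tns:SomeCollectionMsg"/>
</wsdl:operation>
```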

The code is as it was originally generated, but the tests work as they should without side-effects.

EJB3 web service context path in WebLogic 11g

EJB3 web services are great. Simply slap on a @WebService annotation, deploy and it works. A few additional annotations can improve the interface. However, with WebLogic 11g the service endpoint becomes http://server:port/ClassName/ServiceName, where ClassName is the name of the session bean and ServiceName comes from the annotation. This is a bit ugly; the class name of the session bean is an implementation detail and should not be exposed.

Fortunately there is a solution and it keeps the code portable. First, define a generic webservices.xml file with the minimum required information for the web service (the names here are examples):

<?xml version="1.0" encoding="UTF-8"?>
<webservices xmlns="http://java.sun.com/xml/ns/javaee"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.2">
  <webservice-description>
    <webservice-description-name>SomeService</webservice-description-name>
    <port-component>
      <port-component-name>SomePort</port-component-name>
      <wsdl-port xmlns:ws="http://example.com/ws">ws:SomePort</wsdl-port>
      <service-impl-bean>
        <ejb-link>SomeServiceBean</ejb-link>
      </service-impl-bean>
    </port-component>
  </webservice-description>
</webservices>

Then add a weblogic-webservices.xml file:

<?xml version = '1.0' encoding = 'UTF-8'?>
<weblogic-webservices xmlns="http://xmlns.oracle.com/weblogic/weblogic-webservices"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <webservice-description>
    <webservice-description-name>SomeService</webservice-description-name>
    <port-component>
      <port-component-name>SomePort</port-component-name>
      <service-endpoint-address>
        <webservice-contextpath>/SomeService</webservice-contextpath>
        <webservice-serviceuri>/SomeOperation</webservice-serviceuri>
      </service-endpoint-address>
    </port-component>
  </webservice-description>
</weblogic-webservices>

This will bind the web service to the address http://server:port/SomeService/SomeOperation.

Categories: Java, SOA Suite

Fault policy Java action in 11g

Oracle SOA Suite 11g makes it possible to use fault policies in order to catch and act on certain types of errors. This can be very useful and if the built-in actions are not enough it is possible to write custom actions in Java. See How to Use a Java Action Fault Policy in the Developer’s Guide.

Both the documentation and the API have room for improvement, though.

The documentation helpfully records that IFaultRecoveryJavaClass needs fabric-runtime.jar, but it fails to mention that orabpel.jar is needed as well for IBPELFaultRecoveryContext. Unfortunately the plain IFaultRecoveryContext interface is so limited that it is almost useless; it doesn’t even provide access to the fault. In order to be of use the context must be cast to an IBPELFaultRecoveryContext (for BPEL faults).

IFaultRecoveryContext does include a method named getProperties that returns the property set for the fault policy action. Does it return a Properties object? No, it returns a raw Map. Oh well, but surely it is a map from String to String? Same thing, right? Wrong. It returns a map from String to ArrayList. As none of this is documented, I found it out by trial and error. Typically the first element in the list can be cast to a String and used.
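A minimal stdlib-only sketch of dealing with that return type (the class, helper and property names are hypothetical; in a real fault policy action the map would come from IFaultRecoveryContext.getProperties()):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class FaultPolicyProps {

    // getProperties() hands back a raw Map whose values are ArrayLists,
    // so extracting a single String property takes some unchecked casting.
    @SuppressWarnings("rawtypes")
    public static String firstProperty(Map properties, String name) {
        ArrayList values = (ArrayList) properties.get(name);
        if (values == null || values.isEmpty()) {
            return null;
        }
        // Typically the first element can be cast to a String.
        return (String) values.get(0);
    }

    public static void main(String[] args) {
        Map<String, ArrayList<String>> props = new HashMap<>();
        props.put("notifyTo", new ArrayList<>(Arrays.asList("ops@example.com")));
        System.out.println(firstProperty(props, "notifyTo")); // prints ops@example.com
    }
}
```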

Finally the documentation uses elements named ReturnValue in the fault policy, but the implementation uses returnValue.

Categories: SOA Suite

Unit testing file transfers

I recently wrote an application for managed file transfers with FTP, FTPS and SFTP. Naturally I wanted good unit tests, and I found an excellent solution that made it possible to run real transfers in a controlled way: MockFtpServer and Apache SSHD. This way I could fire up an FTP server and an SSH server from JUnit, run a test, and be in full control of both the client (my application) and the servers.

The same approach can be used to test file-based integrations in general, driven from JUnit, possibly with a continuous integration engine such as Jenkins running the tests.

To use the libraries with Maven, include dependencies along these lines (coordinates per the projects’ documentation; pick current versions):

<dependency>
  <groupId>org.mockftpserver</groupId>
  <artifactId>MockFtpServer</artifactId>
  <version><!-- current version --></version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.apache.sshd</groupId>
  <artifactId>sshd-core</artifactId>
  <version><!-- current version --></version>
  <scope>test</scope>
</dependency>

You may need additional dependencies as well, for example SLF4J and BouncyCastle.

Categories: Java, SOA Suite

Unified logging in Oracle Fusion Middleware 11g

2012-02-23


For some reason logging frameworks abound in the Java world. What is worse, many vendors have a hard time picking the right one and end up using several, sometimes in the same product. Oracle Fusion Middleware is a case in point. It uses standard JUL (java.util.logging) and ODL, optionally Log4j and/or Commons Logging, and possibly other frameworks as well in dark corners. There is no single point of configuration and it is hard to add custom handlers/appenders. At least the situation is much better than in 10g.

What to do? Actually it is possible to route most logging to Log4j or JUL and to use a single configuration file for some of the settings. This makes it possible to add custom handlers/appenders in order to log to a database, to Nagios or to some other destination not supported out of the box.

There are many components in OFM 11g. This post focuses on SOA Suite, WLS and the Java node manager. Many of the other components can be handled similarly. It targets Log4j (which has a Nagios appender), but JUL would also work and would be a bit easier.

Node Manager

The node manager logging is not well documented, but it uses standard JUL. To make it use Log4j we can install the SLF4JBridgeHandler from SLF4J, yet another logging framework. Edit the start script for the node manager and add the following jar files to the class path, along with any custom appender jars (exact file names depend on the versions in use):

  slf4j-api-<version>.jar
  jul-to-slf4j-<version>.jar
  slf4j-log4j12-<version>.jar
  log4j-<version>.jar
Add two properties to the Java options:

  -Djava.util.logging.config.file=/path/to/logging.properties
  -Dlog4j.configuration=file:/path/to/log4j.properties

where the Log4j properties (or XML) file is the one and only Log4j configuration file for the system and where logging.properties contains:

handlers= org.slf4j.bridge.SLF4JBridgeHandler
.level= INFO
org.slf4j.bridge.SLF4JBridgeHandler= INFO

This redirects all log entries with level INFO or higher to Log4j. There is a performance penalty for using the bridge, but the node manager is not performance critical and is not affected much anyway as it logs relatively sparingly.
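The bridge mechanism itself is plain JUL: any Handler registered on the root logger receives every record at or above the configured level. A stdlib-only sketch with a stand-in handler (the class name is hypothetical) illustrates what SLF4JBridgeHandler hooks into:

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Stand-in for org.slf4j.bridge.SLF4JBridgeHandler: a JUL Handler attached
// to the root logger sees records from every logger that propagates to it.
public class CapturingHandler extends Handler {
    public final StringBuilder captured = new StringBuilder();

    @Override public void publish(LogRecord record) {
        if (isLoggable(record)) {
            captured.append(record.getLevel()).append(' ')
                    .append(record.getMessage()).append('\n');
        }
    }
    @Override public void flush() { }
    @Override public void close() { }

    public static void main(String[] args) {
        CapturingHandler bridge = new CapturingHandler();
        Logger root = Logger.getLogger("");
        root.addHandler(bridge);    // equivalent to handlers= in logging.properties
        root.setLevel(Level.INFO);  // equivalent to .level= INFO

        Logger.getLogger("example").info("redirected");
        Logger.getLogger("example").fine("below INFO, dropped");

        System.out.print(bridge.captured); // prints: INFO redirected
    }
}
```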

SOA Suite (and others)

Many of the applications in OFM use Oracle’s own logging framework, ODL. For some reason they don’t share the same configuration; there are multiple ODL configuration files. Refer to the official documentation for details; this area is well covered. For SOA Suite the configuration file is soa_domain/config/fmwconfig/servers/server-name/logging.xml, where server-name is the name of a managed server. In an HA installation all the servers must be configured.

ODL as such is pretty closed; it is evidently not meant to be extended by customers. However, it is built on JUL, albeit with its own configuration (it doesn’t use LogManager). This means we can add JUL handlers.

Add the logging jar files listed above to the server’s class path, then edit logging.xml. Add:

<log_handler name="slf4j-bridge-handler" class="org.slf4j.bridge.SLF4JBridgeHandler" level="NOTIFICATION:1"/>

Add the new handler to the root logger, so that the relevant part of logging.xml ends up like this:

<logger name='' level='WARNING:1'>
  <handler name='odl-handler'/>
  <handler name='wls-domain'/>
  <handler name='console-handler'/>
  <handler name='slf4j-bridge-handler'/>
</logger>
Unless the child loggers are configured differently, all messages logged with level WARNING or higher will be sent to Log4j as well as to the standard locations. As with the node manager, define log4j.configuration to point to the common Log4j configuration file.

A word of warning. While performance is a non-issue for the node manager, it can affect SOA Suite. Logging can be prolific and the bridge can make it much slower. The solution is to use the administration console and/or the ODL configuration file and set logging levels in ODL so that only messages that actually should be logged are sent to Log4j. The Log4j configuration can pick the proper appenders, but should not filter messages.


WebLogic Server

The final piece of the puzzle is WebLogic Server. According to the official documentation it can be configured to use Log4j, so everything should be fine. Right? No. It can use handlers/appenders from JUL or Log4j or Commons Logging, but it will not use the configuration files from any of these frameworks. It is not possible to simply add a Nagios appender to the Log4j configuration and attach it to WLS loggers. Standard but not so standard; why? Oh well, nothing we can’t solve with a few lines of well-placed code.

First configure the server to use Log4j following the official documentation. Point it to the common Log4j configuration file. Define a logger in the file with a reserved name (anything will do) and configure it with the appenders that should be used with WLS, for example JDBC and Nagios. Next, create a startup class and copy the appenders from the logger with the reserved name to the WLS server logger:

  // The reserved logger name is arbitrary; it only has to match the Log4j configuration.
  private static final String LOGGER_WITH_APPENDERS = "wls.custom.appenders";

  public static void main(String[] args) {
    Logger serverLogger;

    try {
      serverLogger = Log4jLoggingHelper.getLog4jServerLogger();
    } catch (Exception e) {
      throw new RuntimeException("Unable to get the WLS server logger", e);
    }

    for (@SuppressWarnings("rawtypes")
        Enumeration appenderEnum = Logger.getLogger(LOGGER_WITH_APPENDERS)
            .getAllAppenders(); appenderEnum.hasMoreElements();) {
      serverLogger.addAppender((Appender) appenderEnum.nextElement());
    }
  }
Register the startup class in the administration console. Finally, at long last all the main components use the same Log4j configuration file (albeit not fully) and can log to custom appenders. A step towards unified logging, at least.

Beware of logging loops! WLS creates log entries for text written to standard out or standard error, so a console handler/appender will generate new log entries for every line it writes. Not a good idea; never log to the console.

Categories: SOA Suite