Updating a task from .Net

In my previous post, I showed how to use the TaskQueryService to query tasks.  In this post, I am going to take this one step further and update the task payload and process the task.  To do this, we need to use both the TaskQueryService and the TaskService, which introduces a couple of new challenges that we need to deal with.

Let’s take a look at the basic outline of the code first, then drill into the challenges.  First, we need to authenticate to the engine, as we did in the previous example.  This is done by calling the authenticate operation on the TaskQueryService web service.

TaskQueryServiceClient tqs = new TaskQueryServiceClient("TaskQueryServicePort");

// provide credentials for ws-security authentication to WLS to call the web service
tqs.ClientCredentials.UserName.UserName = "weblogic";
tqs.ClientCredentials.UserName.Password = "welcome1";

// set up the application level credentials that will be used to get a session on BPM (not WLS)
credentialType cred = new credentialType();
cred.login = "weblogic";
cred.password = "welcome1";
cred.identityContext = "jazn.com";

// authenticate to BPM
Console.WriteLine("Authenticating...");
workflowContextType ctx = tqs.authenticate(cred);
Console.WriteLine("Authenticated to TaskQueryService");

Next, we need to retrieve the task that we want to update.  In this example, I am just hard-coding the task number.  Then we call the getTaskDetailsByNumber operation on the TaskQueryService web service, passing in the context we got back from the authenticate operation, and the task number.

taskDetailsByNumberRequestType request = new taskDetailsByNumberRequestType();
request.taskNumber = "200873";
request.workflowContext = ctx;
task task = tqs.getTaskDetailsByNumber(request);

Now that we have the task, we want to update the payload.  In this example, I am just updating one of the string parameters in the payload to contain the text “changed in .net” and then setting the modified payload back on our local copy of the task.

TaskService.TaskServiceClient ts = new TaskService.TaskServiceClient("TaskServicePort");
System.Xml.XmlNode[] payload = (System.Xml.XmlNode[])task.payload;
payload.ElementAt(0).ChildNodes.Item(1).InnerText = "changed in .net";
task.payload = payload;
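As an aside, navigating the payload by child index like this is fragile: if the payload schema changes, the code will silently update the wrong element.  Where you know the element names from your task's payload XSD, selecting by name is safer.  Here is a self-contained sketch of that approach; note that the payload shape, and the orderId and comment element names, are invented purely for illustration:

```csharp
using System;
using System.Xml;

class PayloadDemo
{
    static void Main()
    {
        // Hypothetical payload shape for illustration only; the real element
        // names come from your task's payload XSD.
        XmlDocument doc = new XmlDocument();
        doc.LoadXml("<payload><orderId>42</orderId><comment>original</comment></payload>");

        // Selecting the element by name is less fragile than indexing into
        // ChildNodes, which breaks silently if the schema changes.
        XmlNode comment = doc.SelectSingleNode("/payload/comment");
        comment.InnerText = "changed in .net";

        Console.WriteLine(doc.OuterXml);
        // prints: <payload><orderId>42</orderId><comment>changed in .net</comment></payload>
    }
}
```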

Now, to actually update the real task on the server, we need to call the updateTask operation on the TaskService web service and pass it our locally updated task.  This call will return a new task object which represents the updated task.

// update task
TaskService.taskServiceContextTaskBaseType updateTaskRequest = new TaskService.taskServiceContextTaskBaseType();
updateTaskRequest.workflowContext = ctx;
updateTaskRequest.task = task;
TaskService.task updatedTask = ts.updateTask(updateTaskRequest);

Now we want to take an action on the task; in this case, I have just hardcoded the “OK” action.  To have the task processed, we call the updateTaskOutcome operation on the TaskService web service, again passing in the context, this time along with the updated task object.

// complete task
TaskService.updateTaskOutcomeType updateTaskOutcomeRequest = new TaskService.updateTaskOutcomeType();
updateTaskOutcomeRequest.workflowContext = ctx;
updateTaskOutcomeRequest.outcome = "OK";
updateTaskOutcomeRequest.Item = updatedTask;
ts.updateTaskOutcome(updateTaskOutcomeRequest);

So, this all looks relatively straightforward, and if you have followed our custom worklist sample, the code probably looks pretty similar to the Java code in that sample.  But unfortunately, this code will not work as is.

The problem we have here is to do with the way web services work in .Net.  For each of the two web services that we want to use, the TaskQueryService and the TaskService, we need to add a service reference to our .Net solution.  When we add a service reference, we give it a namespace, and the namespaces must be unique.  So we end up with two definitions of task in two different namespaces, i.e. we get a TaskQueryService.task and a TaskService.task.  These are in fact exactly the same, and came from the same Java object, but because of the way web service references work, .Net does not consider them the same type, and you cannot cast from one to the other.

This creates an issue for us: we get our workflowContext object from the TaskQueryService, but we need to provide it to the TaskService, and there is no way to get it from the TaskService.  A few minutes of searching the web will show that this is a fairly common issue in .Net when consuming web services.

So what do we do?

My initial approach was to just write some logic to manually convert the objects.  That looked something like this:

public static TaskService.workflowContextType convertWorkflowContextType(TaskQueryService.workflowContextType input)
{
  TaskService.workflowContextType output = new TaskService.workflowContextType();
  output.credential = convertCredentialType(input.credential);
  output.locale = input.locale;
  output.timeZone = input.timeZone;
  output.token = input.token;
  return output;
}
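The nested convertCredentialType it calls would be more of the same.  Sketched only from the credential fields used earlier in this post (the generated credentialType has more members than are shown here), it might look like this:

```csharp
// Sketch of the companion converter for the nested credential type.
// Only the fields used earlier in this post are copied; the real
// generated type has additional members that would also need copying.
public static TaskService.credentialType convertCredentialType(TaskQueryService.credentialType input)
{
    if (input == null)
    {
        return null;
    }
    TaskService.credentialType output = new TaskService.credentialType();
    output.login = input.login;
    output.password = input.password;
    output.identityContext = input.identityContext;
    return output;
}
```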

This does not look too bad, but the issue is the size of these objects.  Notice that the credential is a complex type, so I need another method like this to copy it.  To actually implement this conversion for just the workflowContext and task objects, we would need several hundred lines of ugly, boring boilerplate code.  So I gave up on this method.

My second approach was to use reflection to do a deep copy of the objects.  This looked promising, and I found several samples online, but again I ran into issues.  First, it had problems with arrays.  Once I fixed this, it then had problems with enumerated types.  Again, this was getting pretty ugly, so I abandoned this method too.

Next, I turned to an open source (MIT-licensed) project called AutoMapper, which addresses this very issue.  I found that investing a few minutes in learning how to use AutoMapper resolved my issues completely, so this is the approach I have adopted.  Here is the code that configures AutoMapper to handle the two types we are discussing, and all of the embedded subtypes we need:

// set up the automapper
AutoMapper.Mapper.CreateMap<TaskQueryService.workflowContextType, TaskService.workflowContextType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.credentialType, TaskService.credentialType>();

AutoMapper.Mapper.CreateMap<TaskQueryService.task, TaskService.task>();
AutoMapper.Mapper.CreateMap<TaskQueryService.attachmentType, TaskService.attachmentType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.callbackType, TaskService.callbackType1>();
AutoMapper.Mapper.CreateMap<TaskQueryService.customAttributesType, TaskService.customAttributesType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.documentType, TaskService.documentType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.EvidenceType, TaskService.EvidenceType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.processType, TaskService.processType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.commentType, TaskService.commentType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.identityType, TaskService.identityType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.ucmMetadataItemType, TaskService.ucmMetadataItemType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.systemAttributesType, TaskService.systemAttributesType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.actionType, TaskService.actionType2>();
AutoMapper.Mapper.CreateMap<TaskQueryService.displayInfoType, TaskService.displayInfoType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.shortHistoryTaskType, TaskService.shortHistoryTaskType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.assignmentContextType, TaskService.assignmentContextType1>();
AutoMapper.Mapper.CreateMap<TaskQueryService.assignmentContextTypeValueType, TaskService.assignmentContextTypeValueType1>();
AutoMapper.Mapper.CreateMap<TaskQueryService.collectionTargetType, TaskService.collectionTargetType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.collectionTargetActionType, TaskService.collectionTargetActionType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.preActionUserStepType, TaskService.preActionUserStepType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.systemMessageAttributesType, TaskService.systemMessageAttributesType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.flexfieldMappingType, TaskService.flexfieldMappingType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.scaType, TaskService.scaType>();
AutoMapper.Mapper.CreateMap<TaskQueryService.UpdatableEvidenceAttributesType, TaskService.UpdatableEvidenceAttributesType>();

// check automapper config is valid
AutoMapper.Mapper.AssertConfigurationIsValid();

But I really don’t want to have all the AutoMapper code messing up my nice clean class.  So I went one step further and implemented some implicit operators so that I can write my code like I showed at the start of this article and pretend that this issue does not even exist.  Here is the code to implement implicit operators to convert from TaskQueryService.task to TaskService.task and from TaskQueryService.workflowContextType to TaskService.workflowContextType:

namespace TaskService
{
  partial class workflowContextType
  {
    public static implicit operator workflowContextType(TaskQueryService.workflowContextType from)
    {
      return AutoMapper.Mapper.Map<TaskQueryService.workflowContextType, workflowContextType>(from);
    }
  }

  partial class task
  {
    public static implicit operator task(TaskQueryService.task from)
    {
      return AutoMapper.Mapper.Map<TaskQueryService.task, task>(from);
    }
  }
}
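With these operators in place, the conversion happens silently at each assignment, so the calling code never has to mention AutoMapper at all:

```csharp
// The implicit operators fire automatically on assignment, so code like
// this compiles even though ctx and task came back from the
// TaskQueryService namespace and the request objects expect the
// TaskService types.
updateTaskRequest.workflowContext = ctx;   // TaskQueryService -> TaskService
updateTaskRequest.task = task;             // TaskQueryService -> TaskService
```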

So here is the completed class:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using ConsoleApplication1.TaskQueryService;
using System.Web.Services;
using System.ServiceModel.Security;
using System.ServiceModel.Security.Tokens;

namespace ConsoleApplication1
{
    class Class1
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Sample C# TaskQueryService client");
            init();

            // set up the TaskQueryService client
            // Note that this constructor refers to an endpoint configuration that is defined in the app.config
            // which was created by Visual Studio when you added the web service reference.
            // You have to edit the app.config to set the security mode to "TransportCredentialOnly"
            // and the transport clientCredentialType to "Basic"
            TaskQueryServiceClient tqs = new TaskQueryServiceClient("TaskQueryServicePort");
            // provide credentials for ws-security authentication to WLS to call the web service
            tqs.ClientCredentials.UserName.UserName = "weblogic";
            tqs.ClientCredentials.UserName.Password = "welcome1";

            // set up the application level credentials that will be used to get a session on BPM (not WLS)
            credentialType cred = new credentialType();
            cred.login = "weblogic";
            cred.password = "welcome1";
            cred.identityContext = "jazn.com";

            // authenticate to BPM
            Console.WriteLine("Authenticating...");
            workflowContextType ctx = tqs.authenticate(cred);
            Console.WriteLine("Authenticated to TaskQueryService");

            // get task
            taskDetailsByNumberRequestType request = new taskDetailsByNumberRequestType();
            request.taskNumber = "200873";
            request.workflowContext = ctx;
            task task = tqs.getTaskDetailsByNumber(request);

            // get TaskService
            TaskService.TaskServiceClient ts = new TaskService.TaskServiceClient("TaskServicePort");

            // update the payload
            System.Xml.XmlNode[] payload = (System.Xml.XmlNode[])task.payload;
            payload.ElementAt(0).ChildNodes.Item(1).InnerText = "changed in .net";
            task.payload = payload;

            // update task
            TaskService.taskServiceContextTaskBaseType updateTaskRequest = new TaskService.taskServiceContextTaskBaseType();
            updateTaskRequest.workflowContext = ctx;
            updateTaskRequest.task = task;
            TaskService.task updatedTask = ts.updateTask(updateTaskRequest);

            // complete task
            TaskService.updateTaskOutcomeType updateTaskOutcomeRequest = new TaskService.updateTaskOutcomeType();
            updateTaskOutcomeRequest.workflowContext = ctx;
            updateTaskOutcomeRequest.outcome = "OK";
            updateTaskOutcomeRequest.Item = updatedTask;
            ts.updateTaskOutcome(updateTaskOutcomeRequest);

            // all done
            Console.WriteLine();
            Console.WriteLine("Press enter to exit");
            Console.Read();
        }

        private static void init()
        {
            // set up the automapper
            AutoMapper.Mapper.CreateMap<TaskQueryService.workflowContextType, TaskService.workflowContextType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.credentialType, TaskService.credentialType>();

            AutoMapper.Mapper.CreateMap<TaskQueryService.task, TaskService.task>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.attachmentType, TaskService.attachmentType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.callbackType, TaskService.callbackType1>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.customAttributesType, TaskService.customAttributesType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.documentType, TaskService.documentType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.EvidenceType, TaskService.EvidenceType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.processType, TaskService.processType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.commentType, TaskService.commentType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.identityType, TaskService.identityType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.ucmMetadataItemType, TaskService.ucmMetadataItemType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.systemAttributesType, TaskService.systemAttributesType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.actionType, TaskService.actionType2>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.displayInfoType, TaskService.displayInfoType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.shortHistoryTaskType, TaskService.shortHistoryTaskType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.assignmentContextType, TaskService.assignmentContextType1>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.assignmentContextTypeValueType, TaskService.assignmentContextTypeValueType1>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.collectionTargetType, TaskService.collectionTargetType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.collectionTargetActionType, TaskService.collectionTargetActionType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.preActionUserStepType, TaskService.preActionUserStepType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.systemMessageAttributesType, TaskService.systemMessageAttributesType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.flexfieldMappingType, TaskService.flexfieldMappingType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.scaType, TaskService.scaType>();
            AutoMapper.Mapper.CreateMap<TaskQueryService.UpdatableEvidenceAttributesType, TaskService.UpdatableEvidenceAttributesType>();

            // check automapper config is valid
            AutoMapper.Mapper.AssertConfigurationIsValid();
        }
    }

    namespace TaskService
    {
        partial class workflowContextType
        {
            public static implicit operator workflowContextType(TaskQueryService.workflowContextType from)
            {
                return AutoMapper.Mapper.Map<TaskQueryService.workflowContextType, workflowContextType>(from);
            }
        }

        partial class task
        {
            public static implicit operator task(TaskQueryService.task from)
            {
                return AutoMapper.Mapper.Map<TaskQueryService.task, task>(from);
            }
        }
    }
}

Where to next?  My next step is to take this approach and apply it to writing a human task user interface in ASP.NET C# and integrate that into the BPM Workspace.


WebLogic 12c Released

Just a quick post to let you all know that WebLogic Server 12c is now generally available and can be downloaded from OTN.


New build of custom worklist for BPM 11.1.1.5 ‘Feature Pack’ patched systems

I have just uploaded a new build of the custom worklist sample which is compiled against (and will therefore work with) BPM 11.1.1.5 systems which have the feature pack patch installed.  You can download this from the links on the main worklist page.


New blog – BPM in Practice – launched

Oracle North America Consulting’s BPM practice has set up a new blog where they plan to host articles from a variety of guest bloggers (me included) covering practical aspects of BPM implementation.  I would encourage you to take a look over the next few weeks as they start to get some material posted there.  The blog is located at http://blogs.oracle.com/practicalbpm/

From their newly launched site:

[This new space on] Oracle blogs is dedicated to practical implementation of Oracle’s BPM Suite and surrounding technologies.  This space is designed to host dozens of guest bloggers from the ranks of Oracle engineers, field solutions consultants, architects and general developers.  The goal is to disseminate practical guidelines and examples from actual implementations or proof of concept exercises.  Our hope is that it not only promotes greater use but better and more defined use of Oracle’s BPM Suite by those who have engaged its powerful capabilities.

As best practices, design patterns and common use cases emerge or are refined they will be discussed in detail here for the good of the BPM community.  Technical deep dives and short hands-on lab-like posts will also be a regular part of the menu so stay tuned and enjoy.


Using the TaskQueryService from .Net (C#)

As regular readers will know, I am working on a .Net version of the custom worklist sample.  As I work on this, I am playing with a few different things in .Net along the way, and it seemed like it would be worth sharing these.

Ardent readers will recall that the Human Workflow APIs (generally under the oracle.bpel package) have web services exposed, but the BPM APIs (generally under oracle.bpm) do not.  In this post, we are looking only at the Human Workflow APIs, so this is not an issue for us (yet…).

Arguably the most interesting of the Human Workflow APIs/web services is the TaskQueryService.  This lets us get information about, and take action on, tasks in the workflow system.  In this first example, let us take a look at using the TaskQueryService (web service) from .Net to get a list of tasks.

I am using Visual Studio 2010 Professional on Windows 7 Ultimate 64-bit with .Net Framework 4.0.30319 and my language of choice is (of course) C#.  If you don’t have a ‘full use’ version of Visual Studio, you could download the free ‘Express’ version and still be able to build this sample.

To keep things simple, we will use a ‘console’ (command line) application.  From the File menu, select New then Project.  Select a Console Application from the gallery.

Click on OK to create the new project.  Next, we want to add a couple of references that we will need.  In the Solution Explorer pane (on the right hand side) right click on the References entry and select Add Reference…

In the dialog box, navigate to the .Net tab.  You need to add the System.Web.Services component.  Select it from the list and then press OK.  Then go and add a second reference, to the System.ServiceModel component.

These two .Net components (libraries) are needed to allow us to call Web Services and use WS-Security, which we will need to do to call the TaskQueryService.

Next, we need to add a reference to the web service itself.  Right click on the References entry again and this time select Add Service Reference…  In the Add Service Reference dialog box, enter the address of the TaskQueryService in the Address box and click on OK.  The address should look like this:

http://server:8001/integration/services/TaskQueryService/TaskQueryService?wsdl

You will obviously need to update the server name and make sure you have the right port.

Enter a Namespace (I called mine TaskQueryService) and click on OK.  Visual Studio will create some resources for you.  You will see the new reference listed in the Solution Explorer, and you may also notice that you get a new source file and an app.config file.  We will come to these later.

Now we are ready to start writing our code.  We need to add using statements for the references we just added:

using ConsoleApplication1.TaskQueryService;
using System.Web.Services;
using System.ServiceModel.Security;
using System.ServiceModel.Security.Tokens;

Here is the code, with some comments in it to explain what it is doing:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using ConsoleApplication1.TaskQueryService;
using System.Web.Services;
using System.ServiceModel.Security;
using System.ServiceModel.Security.Tokens;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Sample C# TaskQueryService client");

            // set up the TaskQueryService client
            // Note that this constructor refers to an endpoint configuration that is defined in the app.config
            // which was created by Visual Studio when you added the web service reference.
            // You have to edit the app.config to set the security mode to "TransportCredentialOnly"
            // and the transport clientCredentialType to "Basic"
            TaskQueryServiceClient tqs = new TaskQueryServiceClient("TaskQueryServicePort");
            // provide credentials for ws-security authentication to WLS to call the web service
            tqs.ClientCredentials.UserName.UserName = "weblogic";
            tqs.ClientCredentials.UserName.Password = "welcome1";

            // set up the application level credentials that will be used to get a session on BPM (not WLS)
            credentialType cred = new credentialType();
            cred.login = "weblogic";
            cred.password = "welcome1";
            cred.identityContext = "jazn.com";

            // authenticate to BPM
            Console.WriteLine("Authenticating...");
            workflowContextType ctx = tqs.authenticate(cred);
            Console.WriteLine("Authenticated to TaskQueryService");

            // now we need to build the request ... there is a whole bunch of stuff
            // we have to specify in here ... a WHOLE bunch of stuff...
            taskListRequestType request = new taskListRequestType();
            request.workflowContext = ctx;
            // predicate
            taskPredicateQueryType pred = new taskPredicateQueryType();
            // predicate->order - e.g. ascending by column called "TITLE"
            orderingClauseType order = new orderingClauseType();
            order.sortOrder = sortOrderEnum.ASCENDING;
            order.nullFirst = false;
            order.Items = new string[] { "TITLE" };
            order.ItemsElementName = new ItemsChoiceType1[] { ItemsChoiceType1.column };
            orderingClauseType[] orders = new orderingClauseType[] { order };
            pred.ordering = orders;
            // predicate->paging controls - remember TQS.queryTasks only returns 200 maximum rows
            // you have to loop/page to get more than 200
            pred.startRow = "0";
            pred.endRow = "200";
            // predicate->task predicate
            taskPredicateType tpred = new taskPredicateType();
            // predicate->task predicate->assignment filter - e.g. "ALL" users
            tpred.assignmentFilter = assignmentFilterEnum.All;
            tpred.assignmentFilterSpecified = true;
            // predicate->task predicate->clause - e.g. column "STATE" equals "ASSIGNED"
            predicateClauseType[] clauses = new predicateClauseType[1];
            clauses[0] = new predicateClauseType();
            clauses[0].column = "STATE";
            clauses[0].@operator = predicateOperationEnum.EQ;
            clauses[0].Item = "ASSIGNED";
            tpred.Items = clauses;
            pred.predicate = tpred;
            // items->display columns
            displayColumnType columns = new displayColumnType();
            columns.displayColumn = new string[] { "TITLE" };
            // items->presentation id
            string presentationId = "";
            // items->optional info
            taskOptionalInfoType opt = new taskOptionalInfoType();
            object[] items = new object[] { columns, opt, presentationId };
            pred.Items = items;
            request.taskPredicateQuery = pred;

            // get the list of tasks
            Console.WriteLine("Getting task list...");
            task[] tasks = tqs.queryTasks(request);

            // display our results with a bit of formatting
            Console.WriteLine();
            Console.WriteLine("Title                                    State           Number");
            Console.WriteLine("---------------------------------------- --------------- ----------");

            foreach (task task in tasks) {

                Console.WriteLine(
                    string.Format("{0,-40}", task.title)
                    + " "
                    + string.Format("{0,-15}", task.systemAttributes.state)
                    + " "
                    + string.Format("{0,-10}", task.systemAttributes.taskNumber)
                );

            }

            // get rid of the context
            tqs.destroyWorkflowContext(ctx);

            // all done
            Console.WriteLine();
            Console.WriteLine("Press enter to exit");
            Console.Read();

        }
    }
}
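One thing in the listing deserves a closer look: as the comment notes, queryTasks returns at most 200 rows per call, so retrieving a larger worklist means advancing startRow/endRow and re-querying until a short or empty page comes back.  A rough sketch of that loop, reusing the tqs, request and pred objects from the listing above (and assuming endRow is simply startRow plus the page size, which you should verify against the documentation for your version), might look like this:

```csharp
// Sketch only: page through the worklist 200 rows at a time.
// Reuses 'tqs', 'request' and 'pred' from the listing above.
const int pageSize = 200;
int start = 0;
List<task> allTasks = new List<task>();
while (true)
{
    pred.startRow = start.ToString();
    pred.endRow = (start + pageSize).ToString();
    task[] page = tqs.queryTasks(request);
    if (page == null || page.Length == 0) break;
    allTasks.AddRange(page);
    if (page.Length < pageSize) break;  // short page means we are done
    start += pageSize;
}
```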

In order to run this, we also need to set up WS-Security.  Go ahead and open up the app.config file.  It should look similar to the following example:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="TaskQueryServiceSOAPBinding" closeTimeout="00:01:00"
          openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
          allowCookies="false" bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
          maxBufferSize="65536" maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
          messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
          useDefaultWebProxy="true">
          <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
          maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Basic" proxyCredentialType="None"
              realm="" />
            <message clientCredentialType="UserName" algorithmSuite="Default"  />
          </security>
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://192.168.174.132:8001/integration/services/TaskQueryService/TaskQueryService2/*"
        binding="basicHttpBinding" bindingConfiguration="TaskQueryServiceSOAPBinding"
        contract="TaskQueryService.TaskQueryService" name="TaskQueryServicePortSAML" />
      <endpoint address="http://192.168.174.132:8001/integration/services/TaskQueryService/TaskQueryService"
        binding="basicHttpBinding" bindingConfiguration="TaskQueryServiceSOAPBinding"
        contract="TaskQueryService.TaskQueryService" name="TaskQueryServicePort" />
    </client>
  </system.serviceModel>
</configuration>

The section that you will need to update is the security section (shown below).  You need to change the security mode to TransportCredentialOnly, set the clientCredentialType to Basic in the transport section, and set it to UserName in the message section.  This allows .Net to call the web service, which is protected by a WS-Security username token policy, on the BPM server.

<security mode="TransportCredentialOnly">
  <transport clientCredentialType="Basic" proxyCredentialType="None"
    realm="" />
  <message clientCredentialType="UserName" algorithmSuite="Default"  />
</security>

That’s all we need.  Now you can go ahead and build and run the solution.  You should get a window open with output like the following:

Sample C# TaskQueryService client
Authenticating...
Authenticated to TaskQueryService
Getting task list...

Title                                    State           Number
---------------------------------------- --------------- ----------
Choose Next User                         ASSIGNED        200401
Claim This Task                          ASSIGNED        200435
Do Something                             ASSIGNED        200393
Do Something                             ASSIGNED        200396
DotNetTest                               ASSIGNED        200750
MTLChooseNextUser                        ASSIGNED        200293
UserTaskWithUCMContent                   ASSIGNED        200385

Press enter to exit

Enjoy, and stay tuned for more .Net articles.


Getting started with tuning your SOA/BPM database using AWR

Update:  When I initially published this post, I was relying on information from a single source inside Oracle.  Since publishing it, I have been discussing the content further with other sources in the Oracle community, and in the course of doing so I have identified some improvements and updated the post to reflect them.  I will continue to update this post as better information comes to hand, and to make it as clear, balanced and accurate as possible.

Special thanks to Jacco H. Landlust, Software Architect, Web-DBA and Oracle ACE for his highly valuable input.

In order to continue to get good performance from your SOA or BPM 11g server, you will want to periodically check your database – the one you are storing your SOAINFRA schema in – to see if there are any performance issues there.  You need to keep doing this, as the workload changes and the usage of space in the database changes.  Depending on the volume of traffic going through your system, you might want to think about tuning the database every week or every month for example.

Tuning an Oracle database is a specialist task.  It requires a highly trained and experienced DBA to do it well.  It is not the kind of thing that you can learn from a short blog post, like this one for example.  This post is not intended to teach you how to tune an Oracle database, but rather to just give a few pointers that might help your DBA, or that you can experiment with in your own environment if you don’t have the services of a good DBA.

If you are lucky enough to have a good DBA running your SOAINFRA database, then they will probably already know how to use AWR to tune an Oracle database.  If this is the case, you should just let them know that common issues in SOA/BPM databases are SGA sizing, statistics, missing indexes and high watermark contention.  They should know what to do with that information.

If, however, you do not have a good DBA managing your database (perhaps you only have the database because it is needed for SOA/BPM, and it is being managed by a middleware-style systems administrator), then you might want to read on…  but please keep in mind that this advice is not intended to replace the need for a well trained specialist.  You should probably try to get a DBA on staff, or on contract, to keep your database performing well.

This article provides a very brief introduction to the use of the Automatic Workload Repository (AWR) in the Oracle Database and what to look for in the reports for your SOA/BPM environment.

Before you start playing with AWR, it is a good idea to go and read a bit about it.  A good place to start would be Overview of the Automatic Workload Repository and Managing the Automatic Workload Repository.  You should pay particular attention to making sure you develop an understanding of the concept of ‘DB TIME,’ without which extracting much meaning from AWR reports will be difficult.

AWR is a built-in feature of the Oracle Database.  Your database automatically collects performance information and creates a snapshot every hour.  It also automatically ages out and removes these snapshots over time.

You can also tell the database to take a snapshot manually using this command, which you will need to issue as a SYSDBA user in SQLPlus:

SELECT DBMS_WORKLOAD_REPOSITORY.Create_Snapshot FROM DUAL;

So the process is as follows:

  1. Create a snapshot (using the SQL above),
  2. Run your tests,
  3. Create another snapshot.

Your tests should be some kind of representative and repeatable workload if you are doing this in a test environment.

It is also safe to run these reports against your production environment.  In that case, you need not create the snapshots manually; you can just use the hourly ones that the database creates for you automatically.

Once you have your snapshots, you are ready to create a report.  You use the following command, again as a SYSDBA, to create the report:

@?/rdbms/admin/awrrpt.sql

This will ask you to select the start and end snapshots and for other details like the format and file name for the output.

After you have done this, open up your report and take a look.  Be warned – it is a pretty big report.

Here is an example of the first page of the report.  This one is from a VM with BPM 11.1.1.5 plus the Feature Pack, running on Oracle Linux 5 with 10GB of memory, everything in the one VM – so not an ideal production environment, which is good, because we should be able to see some issues in the report.


WORKLOAD REPOSITORY report for

DB Name         DB Id    Instance     Inst Num Startup Time    Release     RAC
------------ ----------- ------------ -------- --------------- ----------- ---
ORCL          1292287891 orcl                1 24-Nov-11 11:03 11.2.0.1.0  NO

Host Name        Platform                         CPUs Cores Sockets Memory(GB)
---------------- -------------------------------- ---- ----- ------- ----------
bpmfp.mark.oracl Linux x86 64-bit                    8     8       2       9.78

              Snap Id      Snap Time      Sessions Curs/Sess
            --------- ------------------- -------- ---------
Begin Snap:       468 24-Nov-11 11:27:20        41       5.0
  End Snap:       469 24-Nov-11 11:33:44        42       6.8
   Elapsed:                6.41 (mins)
   DB Time:                0.42 (mins)

Cache Sizes                       Begin        End
~~~~~~~~~~~                  ---------- ----------
               Buffer Cache:       252M       252M  Std Block Size:         8K
           Shared Pool Size:       396M       396M      Log Buffer:     5,424K

Load Profile              Per Second    Per Transaction   Per Exec   Per Call
~~~~~~~~~~~~         ---------------    --------------- ---------- ----------
      DB Time(s):                0.1                0.0       0.00       0.00
       DB CPU(s):                0.0                0.0       0.00       0.00
       Redo size:           36,700.2            7,761.1
   Logical reads:              244.9               51.8
   Block changes:              158.5               33.5
  Physical reads:                1.2                0.3
 Physical writes:                3.6                0.8
      User calls:              242.8               51.3
          Parses:               33.9                7.2
     Hard parses:                0.5                0.1
W/A MB processed:                0.1                0.0
          Logons:                0.1                0.0
        Executes:               69.1               14.6
       Rollbacks:                0.8                0.2
    Transactions:                4.7

Instance Efficiency Percentages (Target 100%)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            Buffer Nowait %:   99.82       Redo NoWait %:  100.00
            Buffer  Hit   %:   99.52    In-memory Sort %:  100.00
            Library Hit   %:   98.63        Soft Parse %:   98.60
         Execute to Parse %:   50.96         Latch Hit %:   98.16
Parse CPU to Parse Elapsd %:   66.67     % Non-Parse CPU:   97.75

 Shared Pool Statistics        Begin    End
                              ------  ------
             Memory Usage %:   41.73   43.63
    % SQL with executions>1:   85.59   85.23
  % Memory for SQL w/exec>1:   78.53   80.89

Top 5 Timed Foreground Events
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                                                           Avg
                                                          wait   % DB
Event                                 Waits     Time(s)   (ms)   time Wait Class
------------------------------ ------------ ----------- ------ ------ ----------
DB CPU                                               15          59.9
log file sync                         1,592           8      5   32.3 Commit
sort segment request                      1           1   1001    4.0 Configurat
db file sequential read                 216           1      4    3.6 User I/O
db file scattered read                   64           0      6    1.5 User I/O
Host CPU (CPUs:    8 Cores:    8 Sockets:    2)
~~~~~~~~         Load Average
               Begin       End     %User   %System      %WIO     %Idle
           --------- --------- --------- --------- --------- ---------
                0.11      0.13       3.3       0.5       0.4      95.8

Instance CPU
~~~~~~~~~~~~
              % of total CPU for Instance:       0.5
              % of busy  CPU for Instance:      12.9
  %DB time waiting for CPU - Resource Mgr:       0.0

Memory Statistics
~~~~~~~~~~~~~~~~~                       Begin          End
                  Host Mem (MB):     10,017.1     10,017.1
                   SGA use (MB):        668.0        668.0
                   PGA use (MB):         87.9         94.0
    % Host Mem used for SGA+PGA:         7.55         7.61

Tips for SOA/BPM database tuning

Here are some specific areas to check.  Please keep in mind that these are specifically for the SOAINFRA database, and would not necessarily apply to any other workloads.  Also, remember that there is not really any globally applicable set of settings that will work for everyone.  These are just some guidelines – if you are serious about tuning your database, you need to get a good DBA to do it.

Redo logs

There will normally be a lot of redo activity on the SOA database, so you need to make sure your redo logs are ‘large enough.’  One (simplistic) way to check is to look at the number of log switches.  When the system is running at peak workload, one log switch every twenty minutes is ideal; more than that is too frequent, and you should make the redo logs larger to reduce the number of switches.  Your DBA will know better ways to tune the redo log size.

If you are using ‘plain old-fashioned’ disks in your server, as opposed to a SAN or ASM, you should place your redo logs on a different disk to the database files.  You should probably also consider moving to ASM and SAN storage if your workload justifies it.

You can find the log switches in the Instance Activity Stats part of the report, here is an example:

Instance Activity Stats - Thread Activity   DB/Inst: ORCL/orcl  Snaps: 468-469
-> Statistics identified by '(derived)' come from sources other than SYSSTAT

Statistic                                     Total  per Hour
-------------------------------- ------------------ ---------
log switches (derived)                            0       .00
-------------------------------------------------------------

You can see in this system there are no log switches, which is good.  So this tells us the redo logs are large enough, or that we did not run for a long enough period of time to get any meaningful results – this report comes from a six minute test run.
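The twenty-minute rule above is easy to turn into arithmetic.  Here is a small illustrative sketch (not part of any Oracle tooling) that converts the AWR ‘log switches’ figure and the snapshot interval into a switches-per-hour rate and applies the guideline:

```python
# Illustrative arithmetic only: convert an AWR 'log switches' count and
# the snapshot interval into a rate, then apply the one-switch-per-
# twenty-minutes guideline (at most three switches per hour at peak).

def switches_per_hour(log_switches, elapsed_minutes):
    """Log switch rate per hour over an AWR snapshot interval."""
    return log_switches / (elapsed_minutes / 60.0)

def redo_logs_large_enough(log_switches, elapsed_minutes):
    """True if the switch rate is within the twenty-minute guideline."""
    return switches_per_hour(log_switches, elapsed_minutes) <= 3.0

# The example report above: 0 switches over a 6.41 minute interval.
print(redo_logs_large_enough(0, 6.41))    # True
# A system switching every five minutes (12 per hour) needs larger logs.
print(redo_logs_large_enough(12, 60))     # False
```

Remember the caveat from the report, though: a six minute window is too short to draw firm conclusions from.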

Parsing

Check the hard parsing amount.  It should be zero.  If it is not, this could indicate that your SGA is too small (it could also indicate other things).  You should try increasing the size of the SGA and testing again.  Hard parsing can also be caused by the use of literals in SQL (as opposed to bind variables).

If the queries in question are your own, e.g. in a database adapter, then you should consider changing them to use bind variables and retesting.  Note that there are other approaches to addressing this issue; your DBA will be able to advise you.  Also, you probably should not have your own queries running in the same database that is hosting SOAINFRA, except perhaps in a development environment.

You can find this information on the first page.

Load Profile              Per Second    Per Transaction   Per Exec   Per Call
~~~~~~~~~~~~         ---------------    --------------- ---------- ----------
...
          Parses:               33.9                7.2
     Hard parses:                0.5                0.1
...

You can see that in this system hard parses are almost zero, which is good.
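To see why literals defeat statement caching, consider this small sketch.  It uses Python’s sqlite3 module purely as a stand-in for a database adapter query; the point is that literals produce a distinct SQL text for every value – each of which Oracle would have to hard parse – while a bind variable keeps a single reusable text:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER, state TEXT)")
conn.executemany("INSERT INTO tasks VALUES (?, ?)",
                 [(1, "OPEN"), (2, "CLOSED"), (3, "OPEN")])

# With literals, every value produces a different SQL text -- in Oracle,
# each distinct text is a fresh hard parse.
literal_texts = {"SELECT state FROM tasks WHERE id = %d" % i
                 for i in (1, 2, 3)}
print(len(literal_texts))    # 3 -- three separate statements to parse

# With a bind variable there is one SQL text, executed three times,
# so the cached plan can be reused.
bound_text = "SELECT state FROM tasks WHERE id = ?"
states = [conn.execute(bound_text, (i,)).fetchone()[0] for i in (1, 2, 3)]
print(states)                # ['OPEN', 'CLOSED', 'OPEN']
```

The table name and queries here are made up for the example; the equivalent fix in a database adapter is to replace concatenated literals with bind parameters.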

SGA

Check the buffer hit and library hit percentages.  We want them to be 100%; if they are not, you should increase the size of the SGA.  This is also on the first page:

Instance Efficiency Percentages (Target 100%)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            Buffer Nowait %:   99.82       Redo NoWait %:  100.00
            Buffer  Hit   %:   99.52    In-memory Sort %:  100.00
            Library Hit   %:   98.63        Soft Parse %:   98.60
         Execute to Parse %:   50.96         Latch Hit %:   98.16
Parse CPU to Parse Elapsd %:   66.67     % Non-Parse CPU:   97.75

In this case they are also good.

You should be aware that the usefulness (or otherwise) of the buffer hit ratio is a matter of some debate in Oracle circles.  For an overview of the pros and cons, please see this article by Richard Foote.

Top 5

Check the average wait times.  Anything over 5ms indicates a problem.  If you see DB CPU in the Top 5, this could potentially indicate that the SGA is too small in some circumstances, but it may not be a problem at all.  You may also be missing indexes, so check the optimizer statistics.

Here are the Top 5 from my environment:

Top 5 Timed Foreground Events
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                                                           Avg
                                                          wait   % DB
Event                                 Waits     Time(s)   (ms)   time Wait Class
------------------------------ ------------ ----------- ------ ------ ----------
DB CPU                                               15          59.9
log file sync                         1,592           8      5   32.3 Commit
sort segment request                      1           1   1001    4.0 Configurat
db file sequential read                 216           1      4    3.6 User I/O
db file scattered read                   64           0      6    1.5 User I/O

You can see here that the top event is DB CPU, which could potentially indicate that the SGA is too small.  However, in this case it does not.  It is high because this report was run on a VM with the database and BPM sharing the CPU and disk, so the CPU was busy doing ‘other stuff’ like running BPM and WebLogic.  Database activities like sorting and logical I/O (reading memory) also show up as DB CPU.
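As a sanity check on the report, the ‘Avg wait (ms)’ column is simply total wait time divided by the number of waits.  A quick illustrative calculation using the ‘log file sync’ figures from the table above:

```python
# 'Avg wait (ms)' in the Top 5 table is total wait time / number of waits.
def avg_wait_ms(total_wait_seconds, waits):
    return total_wait_seconds * 1000.0 / waits

# log file sync: 1,592 waits totalling ~8 seconds -> about 5 ms each,
# right at the edge of the 5 ms guideline discussed above.
print(round(avg_wait_ms(8, 1592)))    # 5
```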

Database file sequential/scattered read

These indicate time spent doing single-block reads, typically index lookups (sequential), and multi-block reads, typically full table scans (scattered).  If these are high (over 5ms), you should consider moving your data files to reduce disk I/O contention, or moving them to faster disks.  You can see these values in the previous example too.

Enqueue high watermark

This indicates enqueue high watermark contention, which occurs when multiple users insert into LOB segments at once while the database is trying to reclaim unused space.  You should consider enabling SecureFiles to improve LOB performance (DB_SECUREFILE=ALWAYS).  Note that you would have to do this before you run RCU to create the schemas.  It is possible to move LOBs after creation, but this is not a procedure that a novice DBA should attempt (unless they are confident with backup and restore first).  The procedure involves the use of the DBMS_REDEFINITION package.

You cannot see enqueue high watermark contention in my example report, because this was not a problem in my environment, so it did not make it into the Top 5.  If it did, you would see an event called:

enq: HW - contention

Some other considerations…

There are some database configuration parameters that can have an impact on performance.  The use or otherwise of these parameters is a matter of much debate.

If you are doing a performance benchmark, where your goal is to get the best possible performance, then you might want to consider not using MEMORY_TARGET or AUDIT_TRAIL.  However, keep in mind that running a performance benchmark is a lot different from running a production system.

MEMORY_TARGET

This setting allows the database to automatically tune its own memory usage.  If you do not use it, your DBA will need to tune the memory usage manually.  There is an argument that a DBA manually tuning the database will result in a better tuned database.  There is a counter argument, though, that not many DBAs have the time to sit around constantly tuning the database, and you might be better off letting the database do it itself.  If you do not use this setting, you should start with 60% of physical memory allocated to the SGA and 20% to the PGA.
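The 60%/20% starting point is simple arithmetic; here is a hypothetical helper, for illustration only:

```python
# Starting point from the text when MEMORY_TARGET is not used:
# roughly 60% of physical memory to the SGA and 20% to the PGA.
# Illustrative only -- a real sizing exercise belongs to your DBA.
def starting_memory_targets(physical_mb):
    return {"sga_mb": int(physical_mb * 0.60),
            "pga_mb": int(physical_mb * 0.20)}

# For the ~10 GB host in the example report:
print(starting_memory_targets(10017))   # {'sga_mb': 6010, 'pga_mb': 2003}
```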

AUDIT_TRAIL

There is an argument that you should not use this setting if you are going for the absolute best performance.  However, the overhead is very low, and the benefit of having the audit trail will most likely outweigh the slight performance cost in almost all situations.


Finding which activities will execute next in a process instance

We have had a few queries lately about how to find out what activity (or activities) will be the next to execute in a particular process instance.  It is possible to do this, however you will need to use a couple of undocumented APIs.  That means that they could (and probably will) change in some future release and break your code.  If you understand the risks of using undocumented APIs and are prepared to accept that risk, read on…

The way to do this is to look at two things:

  • The model of the process itself, i.e. what tasks and connections exist in the process model, and
  • The audit trail for the specific process instance that we are interested in.

By comparing these two pieces of information, we can work out where the process instance is currently (by finding all the activities that have started but have not yet ended) and what the next activities are (by following the connections that start with these unfinished activities to see where they go).

I am using plurals here because, of course, you can have multiple parallel execution paths in a process, for example when you use an inclusive or complex gateway, a multi-instance embedded subprocess, or even a non-interrupting event subprocess.

Here is the sample code.  You will need to edit this to suit your own environment.


package nextactivity;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import oracle.bpel.services.bpm.common.IBPMContext;
import oracle.bpel.services.workflow.client.IWorkflowServiceClientConstants;

import oracle.bpm.client.BPMServiceClientFactory;
import oracle.bpm.project.SequenceFlowImpl;
import oracle.bpm.project.model.ProjectObject;
import oracle.bpm.services.client.IBPMServiceClient;
import oracle.bpm.services.instancemanagement.model.IProcessInstance;
import oracle.bpm.services.instancequery.IAuditInstance;
import oracle.bpm.services.instancequery.IInstanceQueryService;
import oracle.bpm.services.internal.processmodel.model.IProcessModelPackage;

public class NextActivity {

    private static BPMServiceClientFactory bpmscf;

    public NextActivity() {
        super();
    }

    public static void main(String[] args) {

        try {

            // check that we have a process instance ID
            if (args.length != 1) {
                System.out.println("You must specify the instance ID");
                System.exit(0);
            }
            String instanceId = args[0];

            // get the BPMServiceClient
            IBPMServiceClient bpmServiceClient =
                getBPMServiceClientFactory().getBPMServiceClient();

            // authenticate to the BPM engine
            IBPMContext bpmContext =
                getBPMServiceClientFactory()
                .getBPMUserAuthenticationService()
                .authenticate("weblogic", "welcome1".toCharArray(), null);

            // get details of the process instance
            IInstanceQueryService instanceQueryService =
                bpmServiceClient.getInstanceQueryService();
            IProcessInstance processInstance =
                instanceQueryService.getProcessInstance(bpmContext,
                                                        instanceId);

            if (processInstance == null) {
                System.out.println("Could not find instance, aborting");
                System.exit(0);
            }

            // get details of the process (not a specific instance of it,
            // but the actual process definition itself)
            // WARNING WARNING WARNING
            // The ProcessModelService is an UNDOCUMENTED API - this means
            // that it could (and probably will) change in some future
            // release - you SHOULD NOT build any code that relies on it,
            // unless you understand and accept the risks of using an
            // undocumented API.
            IProcessModelPackage processModelPackage =
                bpmServiceClient
                .getProcessModelService()
                .getProcessModel(bpmContext,
                                 processInstance.getSca().getCompositeDN(),
                                 processInstance.getSca().getComponentName());

            // get a list of the audit events that have occurred in this instance
            List<IAuditInstance> auditInstances =
                bpmServiceClient
                .getInstanceQueryService()
                .queryAuditInstanceByProcessId(bpmContext, instanceId);

            // work out which activities have not finished
            List<IAuditInstance> started = new ArrayList<IAuditInstance>();
            for (IAuditInstance a1 : auditInstances) {
                if (a1.getAuditInstanceType().compareTo("START") == 0) {
                    // ignore the process instance itself, we only care
                    // about tasks in the process
                    if (a1.getActivityName().compareTo("PROCESS") != 0) {
                        started.add(a1);
                    }
                }
            }
            next:
            for (IAuditInstance a2 : auditInstances) {
                if (a2.getAuditInstanceType().compareTo("END") == 0) {
                    for (int i = 0; i < started.size(); i++) {
                        if (a2.getActivityId()
                              .compareTo(started.get(i).getActivityId()) == 0) {
                            started.remove(i);
                            continue next;
                        }
                    }
                }
            }
            System.out.println("\n\nLooks like the following have started but not ended:");
            for (IAuditInstance s : started) {
                System.out.println(s.getActivityId() + "\nwhich is a "
                                   + s.getActivityName() + "\ncalled "
                                   + s.getLabel() + "\n");
            }

            // now we need to find what is after these activities...
            // WARNING WARNING WARNING
            // The ProcessModel, ProcessObject, etc. are UNDOCUMENTED APIs -
            // this means that they could (and probably will) change
            // in some future release - you SHOULD NOT build any code
            // that relies on them, unless you understand and
            // accept the risks of using undocumented APIs.
            List<ProjectObject> nextActivities = new ArrayList<ProjectObject>();
            next2:
            for (ProjectObject po : processModelPackage.getProcessModel().getChildren()) {
                if (po instanceof SequenceFlowImpl) {
                    for (IAuditInstance s2 : started) {
                        if (((SequenceFlowImpl)po).getSource()
                                                  .getId().compareTo(s2.getActivityId()) == 0) {
                            nextActivities.add(po);
                            continue next2;
                        }
                    }
                }
            }
            System.out.println("\n\nLooks like the next activities are:");
            for (ProjectObject po2 : nextActivities) {
                System.out.println(((SequenceFlowImpl)po2).getTarget().getId() 
                                   + "\nwhich is a "
                                   + ((SequenceFlowImpl)po2).getTarget().getBpmnType() 
                                   + "\ncalled "
                                   + ((SequenceFlowImpl)po2).getTarget().getDefaultLabel() 
                                   + "\n");
            }

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    protected static BPMServiceClientFactory getBPMServiceClientFactory() {

        if (bpmscf == null) {

            Map properties = new HashMap();
            properties.put(IWorkflowServiceClientConstants.CONNECTION_PROPERTY.CLIENT_TYPE,
                           IWorkflowServiceClientConstants.CLIENT_TYPE_REMOTE);
            properties.put(IWorkflowServiceClientConstants.CONNECTION_PROPERTY.EJB_PROVIDER_URL,
                           "t3://bpmfp:8001");
            properties.put(IWorkflowServiceClientConstants.CONNECTION_PROPERTY.EJB_SECURITY_CREDENTIALS,
                           "welcome1");
            properties.put(IWorkflowServiceClientConstants.CONNECTION_PROPERTY.EJB_SECURITY_PRINCIPAL,
                           "weblogic");

            bpmscf = BPMServiceClientFactory.getInstance(properties, null, null);

        }
        return bpmscf;

    }

}

To run the sample, you will need to put some JAR files on the CLASSPATH.  These may not all be needed, but here are the ones I am using:

Oracle_SOA1\soa\modules\oracle.soa.workflow_11.1.1\bpm-services.jar
Oracle_SOA1\soa\modules\oracle.soa.fabric_11.1.1\bpm-infra.jar  
Oracle_SOA1\soa\modules\oracle.bpm.client_11.1.1\oracle.bpm.bpm-services.client.jar  
Oracle_SOA1\soa\modules\oracle.bpm.client_11.1.1\oracle.bpm.bpm-services.interface.jar  
oracle_common\webservices\wsclient_extended.jar   
oracle_common\modules\oracle.xdk_11.1.0\xmlparserv2.jar
oracle_common\modules\oracle.xdk_11.1.0\xml.jar
wlserver_10.3\server\lib\wlthint3client.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.model.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.io.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.ui.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.draw.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.diagram.draw.jar
Oracle_SOA1\soa\modules\oracle.bpm.workspace_11.1.1\oracle.bpm.ui.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.compile.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.catalog.jar
Oracle_SOA1\soa\modules\oracle.bpm.project_11.1.1\oracle.bpm.project.jar
Oracle_SOA1\soa\modules\oracle.bpm.runtime_11.1.1\oracle.bpm.core.jar
Oracle_SOA1\soa\modules\oracle.bpm.runtime_11.1.1\oracle.bpm.lib.jar
Oracle_SOA1\soa\modules\oracle.bpm.runtime_11.1.1\oracle.bpm.xml.jar

This will produce output like this:

Looks like the following have started but not ended:
ABSTRACT_ACTIVITY1824320344446
which is a USER_TASK
called ChooseNextUser

Looks like the next activities are:
ABSTRACT_ACTIVITY1824321141176
which is a USER_TASK
called DoSomething

This output comes from running against a process in which the ChooseNextUser user task is followed by the DoSomething user task.


An event not to be missed… WebLogic 12c Launch and Deep Dive

If you are using WebLogic Server, you won’t want to miss the launch of the brand spanking new WebLogic Server 12c on December 1.  Find all the details here.  The second half is a deep dive session for developers!

Trivia:  The ‘c’ in the version number stands for ‘cloud.’  The ‘g’ in 11g stood for ‘grid,’ and before that there was an ‘i’ for Internet.  What will be next?  A colleague of mine joked ‘r’ for rainbow – what comes out after the clouds…

Don’t install JDev and BPM in the same Home

I don’t think this is actually documented anywhere, but it is something that you will want to be aware of if you are using the BPM 11.1.1.5 Feature Pack.

It is not supported to install the Feature Pack patch into an Oracle Home which contains both JDeveloper and the runtime components (WebLogic, SOA, BPM, etc.).

If you are installing on the same machine, like a developer’s machine for example, you should install JDeveloper into a separate Oracle (Middleware) Home.


SOA11g: Database as a policy store

As more and more customers of SOA 11g move to production, we are often asked about recommendations for a policy store for SOA 11g in production.  This document describes the various policy store options, helps evaluate the pros and cons of each, and describes the configuration steps required to use the database as a policy store in SOA 11g.

Note: This document is not a replacement for the official documentation in this regard.  Per the official documentation, the policy store is defined as ‘…the domain policy store is the repository of system and application-specific policies…’  Further, ‘[i]n a given domain, there is one store that stores all policies that all applications deployed in the domain may use.’

To reiterate, as of 11g PS4 only one security/policy store per domain is supported, and so ‘all’ products in that domain (say SOA, WebCenter, etc.) will share this single repository.

Policy Store options

Currently the following are the options supported by OPSS (Oracle Platform Security Service) for Policy store:

  1. File/XML based system-jazn-data.xml.
    This file based store is installed by default in the <domain home>/config/fmwconfig directory and seeded with the initial SOA application specific roles and policies.  The how-to section of this post (see below) lists steps to move from this file based repository to a database based one.
  2. OID (Oracle Internet Directory – an LDAP provider)
  3. The Oracle Database as a policy store

Evaluating these options

The file based repository, while convenient and usable for development purposes, is not recommended by Oracle in production.  According to the OPSS documentation, ‘…in domains where several server instances are distributed across multiple machines, it is highly recommended that the OPSS security store be LDAP- or DB-based.’

In cases where the customer has no option but to use this file, the onus is on the customer to provide for high availability of the file.  Further, the scalability of this repository is questionable, and under large transaction volumes it is easy to see the file becoming a bottleneck.  Again, for the customer with no other option, the file based repository is usable when there are few users (tens) in the system and the volume of transactions is very low.

In summary, concerns regarding HA, scalability and need for manual synchronization of this file across servers, render the file based repository a non-viable option in production.

This brings us to the viable and recommended production options for this repository namely OID and the database.  Using OID as a policy store has been well documented here.

Using OID as a policy store is a good option for customers who already use OID.  However, several of our SOA customers use non-OID LDAP directories and would not like to be tied to OID.  In such cases, the Oracle database can be used as a policy store.

Steps to enable the Oracle Database as a policy store

Here is the list of steps to enable the Oracle Database as a policy store.  Note that, as usual, it is recommended that the whole domain be backed up before any policy store changes are made, particularly the following files:

  • jps-config.xml
  • system-jazn-data.xml

As a precaution, you should also back up the boot.properties file for the Administration Server for the domain.

Step 1: Create the OPSS schema or ensure OPSS schema exists

Running RCU as part of the SOA install automatically creates the schema and objects OPSS needs.  If RCU has already been run on an installation, check that the schema exists – <your prefix when running RCU>_OPSS is the naming convention for that schema.
You can also log in to this schema and run desc jps_dn.  Section 8.3 of this link is also relevant.

Step 2: Create a datasource for the OPSS schema

Create a datasource for the OPSS schema using WLST or the WebLogic console.
Important: Note that this datasource must be created with a NON-XA driver and with no global transaction support.

Step 3: Reassociate security store

Now run this wlst script:

  • run WLST and connect
  • reassociateSecurityStore(domain="your_domain", servertype="DB_ORACLE", datasourcename="your_data_source", jpsroot="cn=jpsTestNode", join="false")

These parameters are elaborated in the official FMW Application Security Guide, but here is a quick recap:

  • domain: the name of the domain that this policy store is enabled for.  Note again that currently only one policy store per domain is supported
  • servertype: set to DB_ORACLE for the Oracle database
  • datasourcename: the JNDI name of the datasource
  • jpsroot: a logical name only; any name can be used, and it should be of the format cn=<a node name>
  • join: see below

If you are reassociating your first domain and want to migrate policies from say system-jazn-data.xml, use join=”false”. Setting join=”true” doesn’t migrate any policies. It simply reconfigures the domain to look at DB for policies.

Maintaining the database policy store: For more information on ongoing maintenance for the Database Policy store, refer to this link (Section 8.3.2).

And there you are!  All done with enabling the Oracle Database as a policy store instead of the default system-jazn-data.xml.  (Note: currently a bug prohibits deleting the system-jazn-data.xml file, so an empty file named system-jazn-data.xml under <yourdomain>/config/fmwconfig/ will suffice.)

Also, my colleague Andre Correa has written a great article about the whole reassociation business as he calls it – well worth reading!
