Modeling tools in PI 7.1

With PI 7.1 it is possible to create ARIS models within the Enterprise Service Builder. This makes it easier for the integration team to model the process and interfaces. It is possible to go from the high-level business process models all the way down to an interface implementation and the configuration of integration scenarios. The use of models adds an extra layer of documentation of the business processes in conjunction with the Enterprise Services (ES), and the models can be used to get an overview of how a scenario is put together.

This paper describes which models can be used in PI 7.1 and what they mean for a PI developer.

The Sales Order model is based on the ES for sales orders, but the model contains more information than can be found on the ES Workplace. What the models in the ESR add is how the services can be connected in a specific scenario.

At first glance there are a lot of different models, and it takes a little while to get used to them and to figure out which model can be used for which purpose. There are 12 different model types, and all of them can be used to describe a business process in different ways and at different levels.

At the highest level different integration scenarios are grouped together. An example of this is the SAP Scenario Catalog, which is a grouping of different scenarios. This model makes it easier to understand how different scenarios belong together and to find the scenarios that have something in common.

An example of the scenario catalog is the following.

An Integration Scenario is a high-level overview of which Deployment Units are used. Each Deployment Unit contains one or more Process Components, which in turn can contain a number of process steps. The connections between the Deployment Units can be linked to information about the integration scenarios.

The interaction between the Process Components can be described in a ProComp Interaction Model.

A ProComp Interaction model shows how different Process Components relate to each other, for instance the message flow between them. An example of this is shown below.

The ProComp Interaction model can contain information about which Service Interfaces are used and the mappings they contain. This information can be used to configure an Integration Scenario by adding information about business systems and where the different process components are installed, and by selecting adapters. From there it works just the same way as an Integration Scenario in PI 7.0.

The ProComp model can also be used to describe the flow within a Process Component. This type of model seems most useful for documenting how Enterprise Services are connected within a Process Component. One example of what the ProComp model could be used for is to describe what is going on in a BPM (an Integration Process in PI), which can then later be created based on the model.

It is still possible to make use of integration scenarios from XI 3.0/PI 7.0. These scenarios do not describe the business in the same detail as some of the other model types. They do, however, provide information about which connections are used and how the messages are mapped. Integration scenarios are easier for a PI developer to understand, since they show directly which connections are used and since they were also used in earlier versions of XI/PI.

It takes a little while to get used to working with the models in PI 7.1 and to create models which can be used and understood by both developers and Business Process Experts (SAP has a BPX community on SDN).

The use of models does seem to create some extra overhead compared to a top-down approach, where one starts with an Integration Scenario and creates the objects to fit into the process. To build such a scenario one would normally create a drawing to describe what is going on and to support development of the scenario, often a process diagram in Visio, in PowerPoint or on paper. With the help of the built-in modeling tools it is now possible to store such models within the ES Builder, which serves the purpose of documenting the process and the context to which the interface belongs.

I recommend investing time in establishing naming conventions for modeling and guidelines for when and how modeling should be used.

A question which has to be answered is whether models should be created for all integration scenarios, or only when Enterprise Services are involved. I will probably need to use modeling in real projects before I can evaluate whether it makes sense.

Publishing services in PI 7.1

With PI 7.1 it is possible to publish services to the Service Registry (SR) directly from the Enterprise Service (ES) Builder and the Integration Directory. The publishing functionality allows developers and Business Process Experts (BPXs) to publish service interfaces to the SR or UDDI. The UDDI will then contain a global list of all the services that are implemented or will be implemented at some point.

To show the different ways to define services, it is necessary to look at how they can be published. Services can be provided in the following ways.

  • Brokered Service implemented in own system. A service provided by your company is exposed as a web service in your PI system (a brokered service) and the endpoint is made available via the SR.
  • Brokered service to be implemented in partner system. A new interface must be developed. It will be implemented as a web service provided by a partner system. You want to offer this service in your own SR. You define the interface in your ES Builder and create a WSDL which the partner will use to develop and implement the service. When the service is deployed the endpoint can be posted in your SR.
  • Web service provided by a 3rd party. Someone has developed a web service. The WSDL and endpoint can be published in your SR, thus making the service available to users (developers) of your SR.

How can the ES Builder and SR be used to support the three different options? This is described in the sections below.

1 Brokered Service implemented in own system

In PI 7.0 the only way to expose brokered services was to generate a WSDL, entering a lot of URL and service information along the way. The WSDL could then be saved as a file and mailed to the developers who wanted to use the service. If the file was to be exposed via a UDDI, the WSDL had to be placed on an HTTP server and then published.

This process has improved a lot with PI 7.1. Publishing of web services to the SR can now be performed with a few mouse clicks.

The Service Interface is defined in the normal way as an outbound interface.

To configure the outbound service, a Sender Agreement and a SOAP communication channel have to be created. The Sender Agreement should then be configured to use the communication channel.

To publish the service select Publish in SR from the menu.

It is possible to change the URLs to fit with external naming conventions.

When I tried to publish the service I got an error; it looked as if something was missing to complete the publication. The service was published in the SR, but without an endpoint. Either there is some configuration we are missing, or it is a bug that will hopefully be corrected in the next service pack.

The service is published in the SR with the following information.
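Once the endpoint is in place, any consumer can call the brokered service with a plain SOAP request. The interface name, namespace, business system, channel and host below are made-up placeholders; only the XISOAPAdapter URL pattern is the usual PI one:

```shell
# Placeholder endpoint -- take the real URL from the SR entry.
ENDPOINT="http://pihost:50000/XISOAPAdapter/MessageServlet?channel=:BS_SALES:CC_SOAP_Sender"

# Minimal SOAP 1.1 envelope around a hypothetical business payload.
PAYLOAD='<ns0:SalesOrderRequest xmlns:ns0="http://example.com/demo"><OrderID>4711</OrderID></ns0:SalesOrderRequest>'
ENVELOPE="<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\"><soap:Body>${PAYLOAD}</soap:Body></soap:Envelope>"
echo "$ENVELOPE"

# Send it (commented out -- needs a reachable PI system and credentials):
# curl -u user:pass -H 'Content-Type: text/xml' -d "$ENVELOPE" "$ENDPOINT"
```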

2 Brokered service to be implemented in partner system

You and a partner agree on a new interface where you need to call a service implemented in the partner's system. The interface is first designed in your PI system. A proxy can then be implemented with the help of SPROXY (ABAP), or you can generate a Java proxy interface. This works on SAP systems, but it does not work as seamlessly with non-SAP products. To share the interfaces in PI 7.0, the PI developer had to export the WSDL files and mail them to the partner.

This is a lot easier with PI 7.1. On an inbound interface there is a publish button on the WSDL tab, which allows direct publishing to the Service Registry.

What is really nice is that the WSDL is also published in a way that allows developers to access it directly or via the UDDI.

When the developers have completed the service, they can publish it in the SR with an endpoint. What seems to be missing is a way to configure the PI communication channels to retrieve the endpoint information from the SR. That would be a nice feature, since it would make it possible to change the endpoint without having to change the communication channel.

3 Web service provided by 3rd party

A WSDL of a 3rd-party web service can be published in your SR from the Publish page. Your developers can then browse the delivered WSDLs in the SR and make use of (implement calls to) the services.

Publishing can be done quite quickly by entering the URL of the WSDL and then selecting Publish.

If one of the services has to be consumed by a PI scenario, a link from the SR to the ES Builder is missing. It is not possible to import a WSDL directly from the SR or via its URL; the WSDL and XSDs must be saved as files and then imported using the mass import tool.
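Until such a link exists, the download step can at least be scripted. A sketch with placeholder URL and folder names; curl fetches each WSDL into a local folder that the mass import tool can then read:

```shell
# Placeholder URL and folder -- replace with the real WSDL location.
WSDL_URL="http://example.com/services/SalesOrder.wsdl"
TARGET_DIR="wsdl_import"

mkdir -p "$TARGET_DIR"
TARGET_FILE="$TARGET_DIR/$(basename "$WSDL_URL")"

# Fetch the WSDL (repeat for any XSDs it references); commented out here
# since it needs a reachable host:
# curl -fsSL "$WSDL_URL" -o "$TARGET_FILE"

echo "$TARGET_FILE"
```

Everything collected in the folder can then be imported in one go with the mass import tool.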

The process for importing multiple WSDLs into external definitions in ESR is as follows. First select where the external definitions should be stored.

Then select the files to import.

Then confirm the type.

After verifying the types and links, the schemas are imported as external definitions. After the import, the links between the different components are still valid. I do not know if this also works if there are HTTP links to the WSDLs.


With PI 7.1 the publishing functionality has been improved a lot, making it easier for developers to share their work. It is now so easy to publish services that the functionality is much more likely to be used. The only feature that seems to be missing is a way to import WSDLs directly from an HTTP host or from the SR.


I have just completed my first real triathlon, or rather an off-road version of one.

It was rather hard. I only started training a month ago, so there is room for improvement.

Someone got a great picture of me when I got out of the water.

I finished first among the participants from AppliCon, and I am now leading the AppliCon summer tour, where we try different sports events.

The results can be seen at the Events4U site. I believe I'll have to work a little more on the biking and running parts.

Harddisk crash

I had the misfortune that the hard drive in my laptop crashed, which always causes a lot of problems. The biggest question is whether the backup is up to date. The company has a backup client for the laptops, but it only backs up data when you are in the office, and as a consultant I am not there every day, so there was some data which had not been backed up.

I found an external hard drive enclosure, placed the failed disk in it, and tried to use the disk on another computer. But it was not possible, because there were some bad sectors on the drive.

I then tried some of the free/shareware tools that let you see what information they can restore; you then have to pay for the program if you want to actually restore the data.

The only program I found that could solve part of the problem seemed to loop around the bad areas, so it could not be used.

The next step in the recovery process was to use Linux. I have a dual-boot Linux/Vista workstation at home, and I was impressed by how easy it was to get access to the drive.

First run GNU ddrescue (not to be confused with the older dd_rescue tool); it works like dd but has error handling for bad sectors. It was easy to install with apt-get on Ubuntu (the package is called gddrescue), and it should be easy on other systems too.

The process of running ddrescue took forever: the 38 GB drive was processed in around 20 hours, and parts of the processing were really slow.

The only commands I had to perform were:

sudo apt-get install gddrescue ntfsprogs ntfs-3g

sudo ddrescue /dev/sdf2 hardrive.img

Press Ctrl-C to interrupt
rescued: 39933 MB, errsize: 24576 B, current rate: 85 B/s
ipos: 31444 MB, errors: 49, average rate: 670 kB/s
opos: 31444 MB
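Two footnotes to the run above. The errsize of 24576 B corresponds to 48 sectors of 512 bytes that could not be read, a tiny fraction of the 38 GB disk. And GNU ddrescue accepts a mapfile as a third argument (shown commented out below), which records progress and makes a run of this length resumable after Ctrl-C:

```shell
# Unreadable bytes divided by the 512-byte sector size:
echo $((24576 / 512))   # prints 48

# Resumable variant -- the mapfile remembers what has been rescued,
# so an interrupted run continues where it left off:
# sudo ddrescue /dev/sdf2 hardrive.img rescue.map
```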

After running ddrescue, it was just a matter of mounting the drive image as a loopback device; then there was access to all the information.
I suspect it would be possible to use dd to write the image to a new drive and then use the computer again, but I had the computer re-installed to be safe. To copy the files, I instead used Samba to share the drive image and then copied the files over the network.

sudo mkdir /media/windisk
sudo mount -o loop,umask=000 hardrive.img /media/windisk
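Writing the image back to a replacement disk, as suspected above, would in principle just be dd in the other direction. This is a sketch only, untested, with a placeholder device name; note that hardrive.img is an image of a partition (/dev/sdf2), so the target should be a partition at least as large:

```shell
IMG="hardrive.img"
NEW_PART="/dev/sdX2"   # PLACEHOLDER -- verify the target with lsblk first!

# dd overwrites the target without asking, hence the command is commented out:
# sudo dd if="$IMG" of="$NEW_PART" bs=4M conv=fsync

echo "would restore $IMG onto $NEW_PART"
```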

And to make sure that Dell was not going to use the data for something else, I wrote a lot of garbage to it with shred.
sudo shred -vfz -n 3 /dev/sdf2

It was the easiest repair I have performed on my computer so far. There is no need to buy one of the expensive $50+ recovery tools when it is possible to use a free version of Linux and a few other free tools. I could probably even have used an Ubuntu live CD for the whole recovery.