Easy access to SAP sites

I have been using Launchy to quickly search SAP notes and start my SAP GUI. Launchy is an application that can be used to launch other applications or open websites. Nigel James wrote a blog post about how to use Launchy and how to access SAP notes easily.

Recently I discovered that Firefox was also working on a similar project called Ubiquity. The difference is that Ubiquity lives inside the browser and can access the same information as the browser. It gives easy access to, for example, Google Maps for a selected address, and can complete other tasks that you would normally do in your browser.

It is possible to create commands yourself to enhance your browser experience, for instance to change tabs or to automate the tasks that you normally do on your favorite sites. The commands are written in JavaScript, and there is a pretty good tutorial on how to get started.

To get started with Ubiquity, just install the Firefox plugin from the Mozilla site.

After you have installed the plugin, press CTRL+SPACE and you will get a popup in your browser like the following. In this window you can enter Ubiquity commands.

Try commands like map (address), or mark a URL in an edit field and use the tinyurl command.

I have written some commands which can be implemented pretty easily. To install them, use the command command-editor.

In the command editor, insert the following code, and it should then be possible to run the commands.

/**
 * Ubiquity commands
 * Used for searching SAP sites
 */
CmdUtils.CreateCommand({
  name: "sapnote",
  description: "Finds SAP notes",
  takes: {"note number": noun_arb_text},
  preview: function(pblock, noteno) {
    pblock.innerHTML = "Open Service Marketplace for note: " + noteno.text;
  },
  execute: function(noteno) {
    var url = "http://service.sap.com/sap/support/notes/{QUERY}";
    var urlString = url.replace("{QUERY}", noteno.text);
    Utils.openUrlInBrowser(urlString);
  }
});

CmdUtils.CreateCommand({
  name: "sapql",
  description: "Opens the SAP Service Marketplace",
  takes: {"site": noun_arb_text},
  preview: function(pblock, siteType) {
    pblock.innerHTML = "Open SAP service for: " + siteType.text;
  },
  execute: function(siteType) {
    var url = "http://service.sap.com/{QUERY}";
    var urlString = url.replace("{QUERY}", siteType.text);
    Utils.openUrlInBrowser(urlString);
  }
});

CmdUtils.CreateCommand({
  name: "sdn-search",
  description: "Searches SDN",
  takes: {"query": noun_arb_text},
  preview: function(pblock, query) {
    pblock.innerHTML = "Search SDN for: " + query.text;
  },
  execute: function(query) {
    var url = "https://www.sdn.sap.com/irj/sdn/advancedsearch?query={QUERY}&cat=sdn_all";
    var urlString = url.replace("{QUERY}", query.text);
    Utils.openUrlInBrowser(urlString);
  }
});

The script contains three commands.

  • sapnote, which takes a Service Marketplace note number and displays the page with the note.
  • sapql, which gives access to SAP Quick Links like SWDC or notes.
  • sdn-search, which performs a search on SDN for the query you have entered.

The sdn-search command looks like the following.
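All three commands follow the same pattern: build a URL from a template and open it. As an illustration, here is a hypothetical fourth command in the same style (the wiki search URL is my assumption, not a verified endpoint). The CmdUtils/Utils stubs at the top are only there so the sketch can run outside Firefox; in the browser, Ubiquity provides the real objects.

```javascript
// Stubs standing in for what the Ubiquity extension provides in Firefox;
// they are NOT part of a real command, only here to make the sketch runnable.
var opened = [];
var Utils = { openUrlInBrowser: function (u) { opened.push(u); } };
var CmdUtils = { CreateCommand: function (cmd) { return cmd; } };
var noun_arb_text = {};

// Hypothetical extra command in the same style: search the SDN wiki.
// The URL is an assumption for illustration only.
var sdnWiki = CmdUtils.CreateCommand({
  name: "sdn-wiki",
  description: "Searches the SDN wiki",
  takes: { "query": noun_arb_text },
  preview: function (pblock, query) {
    pblock.innerHTML = "Search the SDN wiki for: " + query.text;
  },
  execute: function (query) {
    var url = "https://wiki.sdn.sap.com/search?query={QUERY}";
    // encodeURIComponent keeps spaces and special characters URL-safe
    var urlString = url.replace("{QUERY}", encodeURIComponent(query.text));
    Utils.openUrlInBrowser(urlString);
  }
});

// Simulate invoking the command outside the browser:
sdnWiki.execute({ text: "ubiquity commands" });
console.log(opened[0]); // https://wiki.sdn.sap.com/search?query=ubiquity%20commands
```

The same skeleton (name, description, takes, preview, execute) can be reused for any site that accepts a query as part of the URL.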

PYTHONPro OffRoad Triathlon 2008

This year I participated in the offroad triathlon again, and this time I had prepared a little better. I had got a new mountain bike, where it was possible to use all the gears, and clipless pedals. I had also practiced biking and running to get used to the switching.

The result was a large improvement. Last year I got a time of 1:32:05, while this year I did it in 1:14:28. I had improved in all the disciplines. If I had got the same time last year, it would have given a 2nd place, while I instead got a 10th place in the short triathlon. The results for this year can be seen at offroadtri.dk.

The whole event also felt better, since I had a little more practice and knew what the event was about. In the running last year I had to walk a lot, but this year I could run at a fast pace for the whole run. The biking track was also drier this year, which made it possible to ride faster.

I was also the lucky winner of an Aquaman wetsuit; unfortunately it was a ladies' model, so I could not use it. Update: I was able to exchange the certificate for a men's model, so next year I'm going to swim much more in open water.

Next year it should be possible to complete the triathlon faster, if my shape on the day is better and I have practiced more.

Using PI scenarios

I have not been a fan of the scenarios created in XI and PI. I have mostly used scenarios to document objects which had already been implemented. With this process the scenarios were often lacking functionality or missing important objects. The scenarios served mostly as a form of documentation for PI developers with less knowledge of the interfaces designed.

My main reasons for not using the scenarios were the following.

  • They could only be used to configure some of the objects in the auto-configuration phase. It was not possible to generate a sender agreement, which then had to be created manually.
  • When new scenarios had to be configured, partners and systems had to be selected, just for configuring a few objects.
  • The scenario cannot configure all the necessary objects, for instance “Exactly Once In Order” sequences and routing rules.

For the above reasons I was not very excited about using scenarios in PI projects. Fortunately my colleague Emil Jessen had a lot of ideas on why to use scenarios. The primary reason was to use them as a dialog tool to communicate integration processes with the business teams. The other reason was to use them for auto-configuration of objects.

The scenarios should also make the handover to the customer's own people easier. They should be able to configure the test and production environments and thereby take ownership of the solution. Unfortunately they did not have time for the configuration part, so we could not see how it worked.

The main focus was on creating a way to communicate how we pictured each process and the message flow in the process. The swimlane diagram is easy to understand, since it deals with interaction between systems. It also described the flow when using BPMs.

The use of models is going to be much more widespread in PI 7.1. I have tried to use the functionality as described in this blog. In PI 7.1 the models can get more complex and be used to describe how the business works. I'm looking forward to seeing whether the models in PI 7.1 can give a better understanding of what is going on, or whether they are too complex for the application consultants.

To streamline our use of scenarios, the following guidelines were created.

  • The scenario should describe a process, and the scenario should be named after the process.
  • Scenarios were created at the beginning of the realization phase and populated with dummy actions until each action had been defined.
  • Actions should be marked with start and end action(s).
  • Actions used in a scenario should contain a description of what the action is doing, for instance send message, provide lookup service or consume lookup service, followed by a more detailed description of what the interface does.
  • In the descriptions for actions and scenarios the technical specification number was added; it was therefore a lot easier to find a scenario using the search functionality.
  • Actions can have multiple interfaces, which can be used in different scenarios and in different ways, as long as the different messages do the same thing. It could be an IDoc and a BAPI which both create an invoice.
  • Communication channel templates should be placed on all communication channels which need a sender agreement.
  • All configuration of scenarios should be done using the scenarios, except where the limitations described below apply. This means that it is easy to reconfigure all interfaces in a PI system: simply delete all configuration objects except communication channels, and then auto-configure everything.

A scenario could look like the following. Emil had also taken the time to translate the actions into English.

It is easy to get the auto-configuration to work for sender agreements, but it was not documented where I looked. The solution is simple: just place a communication channel template on the sender side in the connection between actions, as shown below. It is then possible to select a communication channel in the configuration. Venkat Donela has a blog where he describes how to use the communication channel template; I was just pointed to this blog recently.

We still need a way to document which changes need to be implemented after the configuration has been performed. This could be creating routing determinations and the sequence for interfaces with EOIO. We have currently documented them in our cut-over plan, so we know what to do when we go live. Has anybody solved this problem?

I got a comment about the diagrams from one of the application consultants. He thought the diagrams were a nice way to represent what happens in the interface. To them they looked like the solution maps they were used to seeing. It was nice to know that the diagrams could be used by the business experts to show what we were doing, and that we are on the right track with the development.

Scrum and SAP projects

I was certified as a Scrum Master this week, after a two-day course, and I would therefore like to share some thoughts on how I think Scrum can be implemented in SAP projects. I have no experience in project management, except at team level and as a participant in SAP projects.

Scrum is a framework designed for agile projects. The article “Scrum in 5 minutes” gives a pretty good understanding of the concepts. The basic idea is to make prototypes, add business value continuously during the project, and remove impediments.

The opposite of Scrum is the waterfall method, with separate phases for each task: specification, analysis, design, implementation, test and deployment. The waterfall model has some challenges.

  • The business or customers do not have a complete idea of what they need before they see how a model works. This results in change requests along the way, which are more expensive the later in the process they are found.
  • Features are requested that are never used; a study published at XP2002 showed that 45% of the features in software are never used.
  • It becomes more difficult to change or correct something in the later stages of the project.
  • During the test phase everything has to work, and then has to be fixed and retested.
  • Handover between the different phases requires a lot of documentation, which can be difficult for other people to understand and takes a lot of time to create.
  • The method assumes that we do not live in a complex environment, and that changes do not occur.
  • Projects often overrun on time or price, or end with bad quality.

I would say that ASAP is a clear waterfall method, with analysis, implementation, test and go-live phases. ASAP contains some accelerators and templates to assist in the implementation. ASAP therefore has most of the challenges stated above.

Scrum acknowledges that it is not possible to plan everything into the future, and therefore uses iterations of 2-4 weeks, each of which has to result in a “workable” product. A large part of the framework deals with how to prioritize the most important features from a business value perspective. The target is to create hyper-productive teams, which produce more business value per resource.

A study done at Systematic, a CMMI level 5 company (so they must know what they are doing), showed that with Scrum it was possible to halve the project cost and provide better quality than a waterfall implementation.

Scrum can probably not save the world of projects. Like all other frameworks, Scrum has some challenges which can cause problems with the implementation.

  • First of all, organizations have to acknowledge that they cannot plan all the features they want in advance.
  • Many organizations leave parts of the framework out; this will result in a ScrumBUT with a lower productivity.
  • The organization has to believe in the Scrum approach and leave the teams alone during the sprints. If not, it can result in lower productivity and a product which does not provide the correct business value.
  • Other revenue models have to be created for consultancies to adopt Scrum instead of a high-paying waterfall approach.

I'm currently trying to figure out which parts of Scrum we can use as an integration team in an implementation project. It will result in a ScrumBUT, but it will hopefully provide some experience of how projects can be managed at a micro level.

Do you have any experience with running Scrum in SAP implementations, and would you like to share it?

Use a proxy to inspect HTTP/SOAP requests

I have developed some PI web service scenarios that I needed to test. When I test web services I use .NET WebService Studio, which I'm unable to find online anymore. It is a small free application that uses a WSDL to create an interface for testing the service.

When I tested this interface, I got the following error: System.Net.WebException: The underlying connection was closed: The connection was closed unexpectedly. It took a while to trace the error, because the processing looked fine on the PI system.

To trace the problem I decided to place a proxy between the .NET application and the PI SOAP adapter. I had earlier used WebScarab as a proxy to see how the HTTP requests looked and to change some parameters. The new version of WebScarab is a Java Web Start application, so it is much easier to get started with.

When the application is started, the user is asked to select a database to store the requests in. I normally use a blank password, since the requests are just private.

The application looks like the following.

It is possible to change the port the proxy listens on in the menu Plugin->Proxy->Proxy Listeners. The default is 8008.

In the .NET application the proxy has to be changed to http://localhost:8008, and it should then be possible to see the incoming requests in the log.

When I placed this proxy between the two applications, I got the response that I expected and the .NET application also received the correct data.

If you need to change some requests, that is also possible, via Plugin -> Intercept Requests and POST.

Then for each new request you get a popup where it is possible to change the HTTP request before it is sent to the server.

Modeling tools in PI 7.1

With PI 7.1 it is possible to create ARIS models within the Enterprise Service Builder. This will make it easier for the integration team to model processes and interfaces. It is possible to go from the high-level business process models all the way down to an interface implementation and configuration of integration scenarios. The use of models adds an extra layer of documentation of the business processes in conjunction with the Enterprise Services (ES). The models can be used to get an overview of how a scenario is created.

This paper describes which models can be used in PI 7.1 and what they mean for a PI developer.

The Sales Order model is based on the ES for sales orders, but the modeled system contains more information than can be found on the ES Workplace. What is added to the models in the ESR is how the services can be connected in a specific scenario.

At first glance there are a lot of different models. It takes a little while to get used to them and to figure out which models can be used for which purpose. There are 12 different model types, and all of them can be used to describe a business process in different ways and at different levels.

At the highest level different integration scenarios are grouped together. An example of this is the SAP Scenario Catalog, which is a grouping of different scenarios. This model makes it easier to understand how different scenarios belong together and to find the scenarios that have something in common.

An example of the scenario catalog is the following.

An Integration Scenario is a high level overview of what Deployment Units are used. In each Deployment Unit there is one or more Process Components which can contain a number of process steps. The connection between the Deployment Units can be linked to information about the integration scenarios.

The interaction between the Process Components can be described in a ProComp Interaction Model.

A ProComp Interaction model shows how different Process Components relate to each other, for instance the message flow between Process Components. An example of this is shown below.

The ProComp Interaction model can contain information about what Service Interfaces are used and the mappings they contain. This information can be used to configure an Integration Scenario by adding information about business systems and where the different process components are installed – and by selecting adapters. Then it works just the same way as an Integration Scenario in PI 7.0.

The ProComp model can also be used to describe how the flow is within a Process Component. This type of model seems to be more useful if the aim is to document how Enterprise Services are connected within a Process Component. An example of what the ProComp model could be used for: To describe what is going on in a BPM (Integration Process in PI) which can then later be created based on the model.

It is still possible to make use of integration scenarios from XI 3.0/PI 7.0. These scenarios do not explain the business in the same detail as some of the other model types. They do, however, provide information about which connections are used and how the messages are mapped. The integration scenarios are easier to understand for a PI developer, since they give information about the connections in a direct fashion, and because they have been used in earlier versions of XI/PI.

It takes a little while to get used to working with the models in PI 7.1 and to create models which can be used and understood by developers and Business Process eXperts (on SDN, SAP has a BPX community).

The use of models does seem to create some extra overhead compared to a top-down approach which starts with an Integration Scenario and the objects are created to fit into the process. To be able to make such a scenario one would normally create a drawing to describe what is going on and to support development of the scenarios. This drawing is often a process diagram, for instance in Visio, PowerPoint or on paper. With help of the built-in model tools it is now possible to store such models within the ES Builder, thus serving the purpose of documenting the process and context to which the interface belongs.

I recommend investing time in establishing naming conventions for modeling, and guidelines for when and how modeling should be used.

A question which has to be answered is whether models should be created for all integration scenarios, or only when Enterprise Services are involved. I probably need to use modeling in real projects and then evaluate whether it makes sense.

Publishing services in PI 7.1

With PI 7.1 it is possible to publish services to the Service Registry (SR) directly from the Enterprise Service (ES) Builder and the Integration Directory. The publishing functionality allows developers and Business Process Experts (BPXs) to publish service interfaces to the SR or a UDDI. The UDDI will then contain a global list of all the services which are implemented or at some point will be implemented.

To show the different ways to define services, it is necessary to see how they can be published. Services can be provided in the following ways.

  • Brokered Service implemented in own system. A service provided by your company is exposed as a web service in your PI system (a brokered service) and the endpoint is made available via the SR.
  • Brokered service to be implemented in partner system. A new interface must be developed. It will be implemented as a web service provided by a partner system. You want to offer this service in your own SR. You define the interface in your ES Builder and create a WSDL which the partner will use to develop and implement the service. When the service is deployed the endpoint can be posted in your SR.
  • Web service provided by 3rd party. Someone has developed a webservice. The WSDL and endpoint can be published in your SR thus making the service available to users (developers) of your SR.

How can the ES Builder and SR be used to support the three different options? This is described in the sections below.

1 Brokered Service implemented in own system

In PI 7.0 the only way to expose brokered services was to generate a WSDL and, in the process, enter a lot of information about the URL and the service. The WSDL file could then be saved as a file and mailed to the developers who wanted to use the service. If the file should be exposed via a UDDI, the WSDL had to be placed on an HTTP server and then published.

This process has improved a lot with PI 7.1. Publishing of web services to the SR can now be performed with a few mouse clicks.

The service interface is defined in the normal way, as an outbound interface.

To configure the outbound service, a sender agreement and a SOAP communication channel have to be created. The sender agreement should then be configured to use the communication channel.

To publish the service select Publish in SR from the menu.

It is possible to change the URLs to fit with external naming conventions.

When I tried to publish the service I got an error, as if something was missing to complete the publication. The service was published in the SR, but without an endpoint. Either there is a configuration we are missing, or it is a bug that hopefully will be corrected in the next service pack.

The service is published in the SR with the following information.

2 Brokered service to be implemented in partner system

You and a partner agree on a new interface where you need to call a service to be implemented in the partner's system. The interface is first designed in your PI system. A proxy can be implemented with the help of SPROXY (ABAP), or you can generate a Java proxy interface. This works on SAP systems, but it does not work as seamlessly with non-SAP products. To share the interfaces in PI 7.0, the PI developer had to export the WSDL files and mail them to the partner.

This is a lot easier with PI 7.1. On an inbound interface there is a publish button on the WSDL tab, which allows direct publishing to the Service Registry.

And what is really nice is that the WSDL is also published in a way which allows developers to get access to it, directly or from the UDDI.

When the developers have completed the service, they can publish it in the SR with an endpoint. What seems to be missing is a way to configure the PI communication channels to retrieve the endpoint information from the SR. This would be a nice feature, since it would make it possible to change the endpoint without having to change the communication channel.

3 Web service provided by 3rd party

A WSDL of a 3rd party web service can be published in your SR from the Publish page. Your developers can then browse through delivered WSDLs in the SR and make use of (implementing calls to) the services.

Publishing can happen quite fast by entering the URL for the WSDL and then selecting Publish.

If one of the services has to be consumed by a PI scenario, a link from the SR to the ES Builder is missing. It is not possible to import a WSDL directly from the SR or via its URL. The WSDL and XSD must be saved as files and then imported using the mass import tool.

The process for importing multiple WSDLs into external definitions in ESR is as follows. First select where the external definitions should be stored.

Then select the files to import.

Then confirm the type.

After verifying the types and links, the schemas are imported as external definitions. After importing the WSDL, the links between the different components are still valid. I do not know if this also works if there are HTTP links to the WSDLs.

Conclusion

With PI 7.1 the publishing functionality is improved a lot, making it easier for developers to share their work. The functionality makes it easy to publish services, and it will therefore be more likely to be used. The only feature that seems to be missing is a way to import WSDLs directly from an HTTP host or from the SR.

Triathlon

I have just completed my first real triathlon, or rather an offroad version of one.

It was rather hard. I only started training a month ago, so there is room for improvement.

Someone got a great picture of me when I got out of the water.

I finished first of the participants from AppliCon, and am now leading the AppliCon sommertour, where we are trying different sports events.

The results can be seen at the Events4U site. I believe I'll have to work a little more on the biking and running parts.

Harddisk crash

I had the misfortune that the hard drive in my laptop crashed, which always gives a lot of problems. The largest is whether the backup is up to date. The company has a backup client for the laptops, but it only backs up when you are in the office, and as a consultant I'm not there every day, so there was some data which was not backed up.

I found an external hard drive case, placed the failed disk in it, and tried to use the disk in another computer. But it was not possible, because there were some bad sectors on the drive.

I then tried some of the free/shareware tools that let you see what information they can restore. You then have to pay for the program if you want to restore the data.

The only program I found that could solve some of the problem seemed to loop endlessly around the bad areas, so it could not be used.

The next step in the recovery process was to use Linux. I have a dual-boot Linux/Vista workstation at home. I was impressed by how easy it was to get access to the drive.

First run dd_rescue. It works like dd, but it has some error handling around bad sectors (see http://www.garloff.de/kurt/linux/ddrescue/). It was easy to install with apt-get on Ubuntu, and it should be easy on other systems too.

The process of running dd_rescue took forever. The 38 GB drive was processed in around 20 hours, and some of the processing was really slow.

The only commands that I had to perform were:

sudo apt-get install ntfsprogs ntfs-3g

ddrescue /dev/sdf2 hardrive.img

Press Ctrl-C to interrupt
rescued: 39933 MB, errsize: 24576 B, current rate: 85 B/s
ipos: 31444 MB, errors: 49, average rate: 670 kB/s
opos: 31444 MB

After running dd_rescue, it was just a matter of mounting the drive image as a loopback device. Then there was access to all the information.
I suspect that it would be possible to use dd to write the image to a new drive and then use the computer again, but I got a re-installation of the computer to be safe. To copy the files I instead used Samba to share the drive image and then copied the files over the network.

sudo mkdir /media/windisk
sudo mount -o loop hardrive.img /media/windisk/ -o umask=000

And to make sure that Dell was not going to use the data for something else, I wrote a lot of garbage to the drive with shred.
shred -vfz -n 3 /dev/sdf2

It was the easiest repair I have performed on my computer so far. So there is no need to buy one of the expensive $50+ recovery programs, when it is possible to use a free version of Linux and other tools. I could probably have used the Ubuntu live CD for the whole recovery.