SAP Process Orchestration is ready

I have been working on creating a course for PI developers so they can learn how to use Process Orchestration/BPMN. I was missing a good tutorial for getting started with BPMN that I could use to help my customers move to the single stack.

So I decided to create a course on the topic of BPMN and PI. One of the things I learned most from was interviews with some of the people who have been using BPMN for some time. In this blog I’ll share some of the information that I got from the interviews.

  • BPMN is a beautiful tool that we as PI developers must learn how to use. Yes, the word beautiful about an SAP product. Really nice. The reason is that it enables developers to model processes much more clearly, making them easier to understand. There is also Business Rules Management (BRM), which makes some actions easier.
  • BPM is easy to get started with. It is not so difficult to use if you have a background in ccBPM. The basic building blocks are much the same, and then it can do a bit more. Most experts agreed that it is a good idea to start small, with a simple process. Then you can enhance it to make sure that it covers the business. If you started by designing the full process, you would have a hard time validating it.
  • Performance is much better. So there is no longer a need to avoid using BPMN at all cost. With ccBPM the goal was to avoid using it because of its negative performance impact. The people I interviewed did not share this concern; they thought that BPMN is a much better performing tool and that PO is a good, solid platform.
  • BPMN can be eliminated in many patterns in the migration. In many instances we can avoid using BPMN when migrating. A lot of ccBPMs stem from old releases of XI, where we often had to create collect patterns and async/sync bridges. This means that you will not end up with the same number of BPMN processes as ccBPMs when you do a migration. In some scenarios you may also end up creating new processes, to better support the business process.
  • Data structures/message types are validated much more. In ccBPM you could put whatever message into the process. BPMN requires you to have the exact data structure, so you will have to define the data as it is. This causes some issues if you want to get IDoc data into the process. One workaround is to use CDATA sections for the data you don’t want to define.
  • Versioning can cause some challenges. The best approach is to use NWDI to handle the projects. NWDI makes change management and version control much better. The challenge is that not all clients have NWDI. In that case there is the option to export the software components.
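To illustrate the CDATA workaround mentioned above: the helper below is my own sketch (the class and method names are not from any SAP API). It simply hides the IDoc payload from schema validation by wrapping it in a CDATA section, so the BPM data object does not need to model the full IDoc structure.

```java
// Hypothetical helper, not part of any SAP API: hides an IDoc payload from
// XSD validation by wrapping it in a CDATA section.
class CdataWrapper {

    /** Wraps raw IDoc XML in CDATA; assumes the payload contains no "]]>". */
    public static String wrap(String rawIdocXml) {
        return "<![CDATA[" + rawIdocXml + "]]>";
    }

    /** Restores the original payload on the receiving side. */
    public static String unwrap(String cdata) {
        if (cdata.startsWith("<![CDATA[") && cdata.endsWith("]]>")) {
            return cdata.substring(9, cdata.length() - 3);
        }
        return cdata;
    }
}
```

The receiving step then unwraps the payload again before handing it to a mapping that does know the real IDoc structure.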

You can get access to all the information on the interviews at http://picourse.com/po

*) I don’t know if any of these issues have changed with the newer service packs, but this is the result of my interviews.

Book review: The SAP Consultant Handbook

I have been reading “The SAP Consultant Handbook” by Jon Reed and Michael Doane on my flight home from San Diego. It is a nice, short book about what it is like to be part of the SAP ecosystem.

It is always nice to be reconfirmed in some of the ideas you have about the business you are in. The book is from 1999 and was updated in 2004, and a lot has happened in the SAP field since then.

The book goes into detail on all the different aspects of how you can approach the SAP market, either as a consultant or as an end user. The premise has not changed over the years, so it is the same type of decisions you will have to make today. In that perspective I think it is a killer book for everybody who wants to move into the SAP field.

There is a nice way of balancing what to do in certain situations: should you take a higher-paying job working with older systems, or the job that makes you travel?

One issue I have seen is that people may not have a long-term vision for what they want to do in the business. They just have a job and like it or don’t. The long-term vision is important when you are selecting a job and when you have to decide if you want to move.

One interesting quote from the book is:

The Catch-22 of SAP: there are not enough trained consultants because those who have training are consulting and none of them are teaching.

This makes it really hard for new people to enter the space, and for companies to get more qualified people.

One of the reasons is that it is either a job as a trainer or as an implementer. This means that you do not get the information from the people who really do the implementations. It would be nice if it were possible for some of the top implementers/specialists to deliver the content they know.


#SAPPHIRE youtube challenge: Meet the right people for you

I’m leaving home in a few hours for SAPPHIRE, and I am really looking forward to meeting all the really cool people there. But you mostly meet people at random; some will help you reach your objectives, but you have to be lucky.

So I thought it could be interesting to see who is at the event that could make a difference. The ideal people for me to meet would be people working with SAP Process Integration (PI). It is difficult to find them, and you have to meet a lot of people first, which is nice though.

I decided to make a video saying who I want to meet. It would probably be nicer if other people did the same. So if you are up for the challenge: create a YouTube video about who you want to meet. The format can be whatever you can make; if it is shot with your mobile phone, that is also great.

Just make a video reply to my video and tag it with #sapphire and #sapphiremeet.

Here is my video.


SAP Enterprise Service and Google Wave

I would expect that you know what Google Wave is; otherwise you would not be viewing this post. If not, please look at Starting on Google Wave to get more information on the future of communication.

For a while I have been working on how to create Google Wave applications and how they can interact with enterprise applications. It is an interesting area with a lot of possible integration work and productivity enhancements.

From what I have learned so far, the features of Wave can connect very well with enterprise applications and create an environment where users can work.

I created a demo of how Google Wave can be used to manage a simple workflow application. The demo features a loan application, with possible collaboration between the customer and the bank. The demo is built without any backend integration, but it would be possible to add it.

A more detailed description of how this works can be found at the blog.

So it is possible to create simple “workflows”. Hardcore ABAP workflow programmers will laugh at this type of workflow, since it is missing a lot of the functionality they are used to. With time and the implementation of a workflow framework, creating workflows will become easier. Wave workflows will have the advantage of being a much better tool for collaboration and for containing semi-structured data.

With the many Enterprise Services on the SAP systems, it is fairly easy to find the services you need to implement ERP functionality in third-party applications. You browse the ESR to find the web service you want to use. Then you create a proxy for the service and call it from your application, in this instance Google Wave. When the bank employee has approved the loan, the SAP system is informed about the change.

I did have some problems with calling the ES from Google App Engine, where it is currently only possible to run robots. On App Engine there are limitations on which Java classes you can call, and I did not manage to get Axis working with the ES. Therefore I have just used plain HTTP calls to the ES, where I create the SOAP envelope manually. Parsing is also done using plain XML parsers. From an architectural point of view I’m not really proud of the solution, but for a demo it is OK.
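A minimal sketch of that workaround could look like the following. The namespace and element names are invented for illustration; the real interface names come from the ESR.

```java
import java.io.ByteArrayInputStream;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;

// Sketch of calling an Enterprise Service without Axis: the SOAP envelope is
// built as a plain string and the response is read with a plain DOM parser.
// The namespace and element names below are invented for illustration.
class PlainSoapClient {

    /** Builds a minimal SOAP 1.1 envelope around a hypothetical order query. */
    public static String buildEnvelope(String customerId) {
        return "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soap:Body>"
                + "<ns:OrderListQuery xmlns:ns=\"http://example.com/es\">"
                + "<ns:CustomerID>" + customerId + "</ns:CustomerID>"
                + "</ns:OrderListQuery>"
                + "</soap:Body>"
                + "</soap:Envelope>";
    }

    /** Reads one element value from the response using a plain XML parser. */
    public static String extractValue(String responseXml, String tagName) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(responseXml.getBytes("UTF-8")));
        return doc.getElementsByTagName(tagName).item(0).getTextContent();
    }
}
```

The envelope would then be POSTed with java.net.HttpURLConnection, which App Engine allows, setting whatever SOAPAction header the service requires.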

An example of how it is possible to use Google Wave with SAP is shown in the following video. In this video the user enters a command, and the robot responds with a list of the orders this customer has.

The integration still has some shortcomings, which need to be addressed before Wave can be used in organizations.

The list of orders is displayed as plain text, so it is not possible to interact with it. It would be possible to add buttons or checkboxes to perform some simple processing of the orders.

There is a need for clearer communication with the robot. Using a command like the one in this example is not usable in practice. Controlling the robot could instead be done using a form, as in the workflow demo.

When do you get rid of old text data? Using SAP CE and BPM it is possible to design dashboards which show just the data a user needs to see. This is not possible in Wave. It is possible to delete blips (the boxes with text), but it might be difficult to know when the data is no longer useful. The user can always scroll back with the replay function.

I will be sharing some more of my examples with you as I get further in my research on Google Wave and SAP. You can follow the progress at http://masteringwave.com.

SAP and Tibco?

Apparently it is time for a lot of mergers and acquisitions, with VMware/SpringSource and Facebook/FriendFeed. Brian Sommer has found the Reuters story that SAP is considering buying Tibco. Apparently they have tried to get hitched before, but without a final conclusion.

If such a fusion happened, it could resemble the merger between Oracle and BEA in January 2008. That was also two companies with the same product suite for the middle layer. Oracle then had to select which of the products future development should happen on. For the integration part the BEA product was selected, so all consultants had to learn the new product.

SAP did a similar integration with Business Objects, which from the sidelines has given a lot of new or rebranded products. For someone who has not worked with BI, the integration of the two products happened with limited problems. From what I have heard, it was the frontend from Business Objects and the backend from SAP BI that were developed into a product, so a new best-of-breed application was created.

But Tibco has more than just one area of products; they have products complementing the whole Netweaver suite. For the integration/SOA part I have heard that Tibco should have better products in some areas, and their portal probably does not match how SAP has created the portal. Come to think of it, the J2EE stack was also purchased from a company whose name I have forgotten. But that was at a time when SAP did not have a Java stack.

Dennis Howlett says that SAP will have to watch Software AG, and that Tibco’s Silver cloud platform could benefit SAP.

If the acquisition is approved, it would be interesting to see what that means for customers, SAP and partners.

Do you have any Tibco products which you would like to have instead of the equivalent SAP products?


Using SFTP on Windows in PI

Last December I wrote about how to use SFTP/SSH from a Unix server without using the Seeburger adapter. SAP PI/XI cannot access SSH/SFTP sites directly, but this script can help. There have been many requests for a version which also works on Windows. I do not have access to a Windows server with PI or XI, so it is a little difficult for me to test.

I have now written the following script, which works on my Vista laptop. I have not tested it on Windows 2003 or Windows 2008, where most PI systems will run.

I have used PuTTY for creating the SSH connection, specifically pscp (an SCP client, i.e. command-line secure file copy), to try something different from using SFTP. SCP makes it easier to get files which should exist in a directory. Pscp should be downloaded and saved in the same directory as the script.

The script looks like the following.

@echo off
REM PARAMETERS
REM %1 target file (PI passes %F)
REM %2 query for where the files are located, i.e. root@figaf.com:dir/*
REM %3 SSH password

REM Reset the target file
type nul >%1


SET TARGETDIR=%~d1%~p1download
echo %TARGETDIR%

IF EXIST %TARGETDIR% GOTO SKIPCREATEDIR
   mkdir  %TARGETDIR%
:SKIPCREATEDIR
del /Q %TARGETDIR%\*


pscp.exe -pw %3 %2 %TARGETDIR%

type  %TARGETDIR%\* > %1

The script takes the following parameters:

  1. The %F, which is the name of the file the adapter is currently reading.
  2. The location of the files on the server side, in the form “user@server:path/[filter*]”, i.e. root@sftp.figaf.com:dir/SKB*. This logs on with the user root on the host sftp.figaf.com, looks in the directory dir relative to the login directory, and selects all files starting with SKB.
  3. The user’s password.

The command in the communication channel should look something like:

C:\scripts\sshftp.bat %F root@sftp.figaf.com:dir/SKB*

I have only tested with passwords, but pscp might also work with SSH keys.

Search on SAP SDN/SCN with Ubiquity

Last year I wrote about the SCN search plugin for Ubiquity, which made it easy to search on SCN. Ubiquity is a plugin for Firefox. The problem with writing such utilities is that they need to be updated when the unpublished API changes.

The SCN search has changed to a quite fancy web 2.0 search. It is now possible to filter the results based on type, subtype and other parameters. The old search just had a few options to choose between, which were all accessible via GET parameters. The new search uses, as far as I can see, Google Web Toolkit as its JavaScript framework. It looks quite difficult to interact with this framework, so the new search script just takes you to the search page.

Michael Koegel informed me that a new version of Ubiquity has been released; version 0.5 can be installed here. This new version has some changes in the way scripts are written, so the script had to be changed.

The new script can be installed from GitHub.

When the script is installed and the access keys (CTRL+SPACE by default) are pressed, the following popup appears. First scn-search is typed, and then the query phrase. When Enter is pressed the query is performed.

SAP PI XML Mappings using Groovy

Creating XML mappings in Java has always been difficult for me; it is possible, but I would prefer other tools. I was looking at scripting languages like Ruby/JRuby and Groovy for creating some web apps; those languages seem quite hot right now. On the SCN Wiki a group has implemented Grails (Groovy on Rails) on the Netweaver system, as Composition on Grails. With this tool it is possible to create applications with a Web Dynpro look and feel. Grails is a framework for creating web apps with less coding.

Groovy is a scripting language designed on the basis of Java. Groovy script is compiled into Java classes, and Java and Groovy can be mixed. This makes the implementation easier: just start writing Java, and when you feel like using some of the smarter features of Groovy, you can use them.

While I was looking at Grails, I thought that it would be possible to use Groovy in PI. One place could be in Java mappings. I’ll describe the steps that I have taken to implement this.

  1. Download and install the Groovy library.
  2. Get the Groovy plugin for Eclipse; this makes developing much easier.
  3. Create a new Eclipse project.
  4. Insert the aii_map_api.jar into the project, to be able to implement the StreamTransformation interface.
  5. Create a new Groovy file in the source folder with the name GroovyMapIdoc.groovy, so Eclipse knows that it is a Groovy file.
  6. Create the mapping of your file. I have attached my example code below.
  7. Compile the Groovy files using the context menu on the GroovyMapIdoc.groovy file.
  8. Zip the content of the bin-groovy folder in the project and upload it as an imported archive in the Integration Builder. Alternatively, use an Ant build to create the zip file.
  9. Upload the two files groovy-1.6.1.jar and asm-2.2.3.jar as imported archives. They can be found in <GROOVY_HOME>\lib.
  10. Activate and use the mapping.
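For the Ant alternative in step 8, a minimal build sketch could look like this. The file and folder names are assumptions and must match your own project layout.

```xml
<!-- Hypothetical build.xml: zips the compiled Groovy classes for upload
     as an imported archive in the Integration Builder -->
<project name="groovy-mapping" default="zip" basedir=".">
    <target name="zip">
        <zip destfile="dist/GroovyMapIdoc.zip" basedir="bin-groovy"/>
    </target>
</project>
```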

I would expect people trying this to have a good knowledge of XI or PI Java mappings, because that is a requirement for developing mappings.

One example I have always considered was my first challenging mapping experience: posting financial documents with more than 1000 lines to the FIDCCP02 IDoc. The FIDCCP02 only accepts 999 lines. The posting can be split into multiple IDocs with 998 lines each, plus a balancing line on each document. This way all documents will balance.

The document is transformed from the left document to the right. For this example I have used a max size of 3 lines to make testing easier.
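The split-and-balance idea, independent of the XML handling, can be sketched in plain Java (the class below is my own illustration, not part of the mapping): each chunk of at most step lines gets one extra line holding the negated sum, so every resulting document sums to zero.

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

// Illustration of the split-and-balance idea without the XML handling:
// chop the amounts into chunks of "step" lines and append a balancing
// line to each chunk, so every resulting document sums to zero.
class IdocSplitter {

    public static List<List<BigDecimal>> split(List<BigDecimal> amounts, int step) {
        List<List<BigDecimal>> idocs = new ArrayList<List<BigDecimal>>();
        for (int i = 0; i < amounts.size(); i += step) {
            List<BigDecimal> chunk = new ArrayList<BigDecimal>(
                    amounts.subList(i, Math.min(amounts.size(), i + step)));
            BigDecimal sum = BigDecimal.ZERO;
            for (BigDecimal amount : chunk) {
                sum = sum.add(amount);
            }
            chunk.add(sum.negate()); // the balancing line
            idocs.add(chunk);
        }
        return idocs;
    }
}
```

With amounts 10, 20, 30, 40 and step 3 this yields two documents: 10, 20, 30, -60 and 40, -40.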

The code that I have used for the mapping is:

package com.figaf.mapping

import com.sap.aii.mapping.api.StreamTransformation;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;

import groovy.xml.MarkupBuilder

class GroovyMapIdoc implements StreamTransformation{

   Map param;
   
    void setParameter(Map param) {
        this.param = param;
    }
    // Number of lines per idoc
    def step=3
   
    /**
     * Implementation of the execution method
     */
    void execute(InputStream input, OutputStream out) {

        // Parse the input using the XMLSlurper
        def FICCP01 = new XmlSlurper().parse(input)
        // get the different lines using the GPath
        def Lines = FICCP01.IDOC.LINE
        // create a writer example
        def writer = new OutputStreamWriter(out)
       
        def xml = new MarkupBuilder(writer)
        // create the root element and fill data into it.
        xml.FICCP01(){
            // get the number of idocs to be created
            def numIdocs = Lines.size().intdiv(step) + (Lines.size() % step > 0 ? 1 : 0)
            // loop for each idoc
            for ( i in 0..numIdocs-1 ) {
                // find the last line index for the current idoc
                def max = Math.min(Lines.size() - 1, i * step + step - 1)
                // create a sum variable to calculate the balance
                def sum = 0.0
                def lineno = 1
                IDOC(){
                    // create the number segment, using GPath
                    NR(FICCP01.IDOC.NR)
                    // for each line in the range do the following
                    Lines[i*step..max].each{ oldline ->
                        // create a new LINE node in the output element
                        // with the following content
                        LINE(){
                            NO(lineno++)
                            Text(oldline.Text.text())
                            Amount(oldline.Amount.toBigDecimal())
                        }
                        // update the sum
                        sum += oldline.Amount.toBigDecimal()
                    }
                    // create a balancing line, which balances the result
                    LINE(){
                        NO(lineno)
                        Text('Balance')
                        Amount(-sum)
                    }
                }
            }
        }

        // write the xml to output
         writer.flush()
         writer.close()
       
    }
   
}

Behind the scenes the Groovy file is turned into Java classes. Because Java does not support closures natively, different subclasses are created. Try to have a look at them using a decompiler like JAD.

Conclusion

Groovy could be a way to improve how Java mappings are created. The XML generation is easier to handle than it would have been in Java, and it is more powerful than XSLT. It takes some effort to get used to the closures concept of Groovy and the different notation, but it seems to work really well.

I don’t think the performance of the mapping is a problem. There is an overhead to load the Groovy libraries, and the code is probably not as optimized as if it had been written directly in Java, but I have not made any measurements.

The future of integration consulting

For the last couple of years, the job of integrating legacy systems with the SAP ERP system has stayed the same. Each system is unique, so integration has to start from scratch. There could be some limited reuse between integrations, but it does not have a large impact.

Many companies are going to use SaaS (at least that is what many vendors are betting on). SaaS strategies could be seen as a way to get best-of-breed applications. Some of the SaaS applications are probably going to replace old legacy applications or reduce the number of new legacy applications being created. Integration between SAP and SaaS applications is therefore going to play a large role in the integration work done with PI.

My experience with integrating a system is that it takes an average of 10 days per system (the development could be completed within 2 days, but support during testing and bug tracking is required). If the complexity gets larger, or new services are required in SAP, the number of days will skyrocket. A large part of the integration work concerns error handling, when unexpected data is received. The price for a 10-day integration is at least €10,000, but with high uncertainty. With such starting prices, many business cases for adopting a SaaS application will fail because of the initial investment.

One way to lower this price is if the SaaS vendors also use the same Enterprise Services that SAP is exposing. Then integration will just be a matter of connecting the two interfaces and testing the business functionality. From a customer perspective this would be the ideal situation, and it would lower the cost of integration.

If there are no Enterprise Services to cover the SaaS integration, one of two things should happen: either an Enterprise Service is created, or a PI integration is created. Both options can be produced by a consulting partner and then shared under some license. Open source could be an option if it was supported by the SaaS provider.

Creating reusable integration parts can be difficult, especially in small markets like Denmark. The Danish government was very keen on electronic invoices (OIOXML), using the UBL standard. That led to a race among consultancies to create templates for the integrations. I do not know of any company which got a large enough share of the integrations to justify a large upfront investment in a shrink-wrapped solution.

Reuse is possible in some domains. I have been involved in a project where reuse of BPM functions led to easy integration with the 2nd and 3rd applications. In this project the Enterprise Service was implemented as a ccBPM in PI. The later integrations needed some adjustments to support the new functions and integration protocols, but the overall framework proved successful.

We asked the vendors of the third-party applications if they had any integration with SAP. They all had some integrations, but for different versions of SAP and application components. Challenges like this are going to continue because of the flexibility in SAP; in some cases different modules can handle the same functions. I doubt an Enterprise Service should handle the functionality in two different modules.

With SaaS applications the market is larger, since it is global and allows for packaged solutions. The packaged solutions can be both PI mappings and BPMs, or Enterprise Services, and they can be sold via EcoHub or by the SaaS vendor.

A business model for this could be to give a lower rate on the first implementation and then sell the integration to other customers. It will take a few iterations before a shrink-wrapped integration can be released.

Having an SAP integration must be really interesting for SaaS providers, because it will make it easier for customers to start using their services. If a customer has to spend a large number of days on integrating the application, it is less likely to happen. Without an integration with the ERP system, some of the benefits of the application are lost.

For consultants it will also be interesting if you only have to produce something once and can then sell it multiple times. The issue is then how the global market can be supported, but remote consulting via VPN can probably solve most of the issues.

I’m looking forward to seeing if this is going to be the business model for future PI consultants.

Release of PI documenter tools as open source

I have released the two services for PI that generate documentation of mappings and compare two mappings to see what has changed. The release announcements for those services can be found in the two blogs.

After the release, people have asked if it was possible to run the tools on their own computer instead of using the server version. I have been thinking about releasing the scripts as commercial products to accommodate this request; it is possible to create compiled PHP code which can be executed on local PCs. I decided against it, because it would require the product to be more mature than it is, and I did not believe the market was large enough, and marketing would be difficult. By releasing the product as open source, it might be possible to get others to contribute and create a service which can help many more.

The product was originally designed to run on a server in a controlled environment, but for this version I had to refactor some of the code to support usage from a client perspective. A problem is that the script extracts a lot of files from the XIM files. These files need to be cleaned up, and I used the quick solution of the Windows delete command to do this cleanup. There are some other places which have not been cleaned up yet; they will be in the next release.

One of the challenges with releasing software as open source is that it exposes one’s coding capabilities. I would say that there is a long way to go before I’m able to make a living from coding PHP. The code seems to work, but the refactoring is a little more time-consuming.

You can find the code and installation guides at code.google.com.

If you want to help improve the code, please join the group and help make the product better.