Why I created a SAP PI/PO course

I have seen many developers who were trying to start using the tool, but many did not know where to start. Others couldn’t afford a PI training that could guide them through SAP PI/PO. In other cases, the developer simply missed the course – if your timing is bad, you might have to wait for months in order to enroll in another course.

If you are a new employee, it is quite difficult to wait for the start of a new course. Until then, your role at the company is on par with that of a paperweight. If you are a new developer just waiting around for a new course, you are unable to use your skills to their fullest extent, and you are basically unable to complete the tasks you were hired to do.

This course offers new developers a good foundation. They will be able to understand what components are there in the PI landscape, they will progress in their ability to develop scenarios, and they will be able to understand the projects created by others working at their company, so they can leverage their accumulated knowledge. Furthermore, understanding the work done by others will also lead them to new enhancement ideas.

As I’ve been working as an SAP consultant for approximately 11 years now, I have seen many scenarios. A lot of them were created after I taught people how to use the tool. Whenever I was leaving, I had to be sure that there was someone in the organization who could manage the scenarios and handle whatever was going on.

My consulting experience has provided me with a lot of insight and inspiration for this course. I created the course in order to help people learn and improve their skills quickly. That is my main goal.

If you want to join my SAP PI training, you can do so at the SAP PI/PO training site. On the site you can also find free tutorials that guide you through getting started and creating an end-to-end scenario.

#SAPPHIRE YouTube challenge: Meet the right people for you

I’m leaving home in a few hours for SAPPHIRE, and I am really looking forward to meeting all the really cool people there. But you mostly meet people at random, and only some of them will help you reach your objectives. You have to be lucky.

So I thought it could be interesting to see who is at the event that could make a difference. The ideal people for me to meet would be people working with SAP Process Integration (PI). They are difficult to find, and you have to meet a lot of people along the way, which is nice though.

I decided to make a video describing who I wanted to meet. It would be even nicer if other people did the same. So if you are up for the challenge: create a YouTube video about who you want to meet. The format can be whatever you can manage; if it is shot with your mobile phone, that is also great.

Just post a video reply to my video and tag it with #sapphire and #sapphiremeet.

Here is my video.

Book of the week: Cradle to Cradle and SAP development

This week I read Cradle to Cradle by Michael Braungart and William McDonough. The book gave room for thought, both on how we create more environmentally friendly products and, what triggered me most, on how the ideas also apply to SAP.

Vishal Sikka mentions timeless software, where software can continue to run forever but just gets a new front end.

As a developer you can also apply some of this: how do you make software that is easy to get started with?

See the video here.

Using SFTP for Windows in PI

Last December I wrote about how to use SFTP over SSH from a Unix server, without using the Seeburger adapter. SAP PI/XI cannot access SSH/SFTP sites directly, but this script can help. There have been many requests for a version which also works on Windows. I do not have access to a Windows server running PI or XI, so it is a little difficult for me to test.

I have now written the following script, which works on my Vista laptop. I have not tested it on Windows 2003 or Windows 2008, where most PI systems run.

I have used PuTTY for the SSH connection, specifically pscp (an SCP client, i.e. command-line secure file copy), to try something different from SFTP. SCP makes it easier to fetch the files that exist in a directory. pscp should be downloaded and saved in the same directory as the script.

The script looks like the following.

@echo off
REM PARAMETERS
REM %1 target file  PI %F
REM %2 query for where the file is located ie. root@figaf.com:dir/*
REM %3 SSH Password

REM RESET the target file (ensures it exists even if the copy fails)
echo '' >%1


REM Build a download directory next to the target file,
REM e.g. %1=C:\xi\in\file.xml gives TARGETDIR=C:\xi\in\download
SET TARGETDIR=%~d1%~p1download
echo %TARGETDIR%

IF EXIST %TARGETDIR% GOTO SKIPCREATEDIR
   mkdir  %TARGETDIR%
:SKIPCREATEDIR
del /Q %TARGETDIR%\*


pscp.exe -pw %3 %2 %TARGETDIR%

type  %TARGETDIR%\* > %1

The script takes the following parameters.

  1. %F, the name of the file which the adapter is currently reading.
  2. The location of the files on the server side, in the form “user@server:path/[filter*]”, e.g. root@sftp.figaf.com:dir/SKB*. This logs on as the user root on the host sftp.figaf.com, looks in the directory dir (relative to the login directory) and selects all files starting with SKB.
  3. The user’s password.

The command in the communication channel should look something like:

C:\scripts\sshftp.bat %F root@sftp.figaf.com:dir/SKB* <password>

I have only tested with a password, but pscp might also work with SSH keys.

SAP PI XML mappings using Groovy

Creating XML mappings in Java has always been difficult for me; it is possible, but I would prefer other tools. I was looking at scripting languages like Ruby/JRuby and Groovy for creating some web apps. Those languages seem quite hot right now. On the SCN Wiki a group has implemented Grails (Groovy on Rails) on the NetWeaver system, as Composition on Grails. With this tool it is possible to create applications with a Web Dynpro look and feel. Grails is a framework for creating web apps with less coding.

Groovy is a scripting language designed on the basis of Java. Groovy script is compiled into Java classes, and Java and Groovy can be mixed. This makes implementation easier: just start writing Java, and when you feel like using some of the smarter features of Groovy, you can use them.

While I was looking at Grails, I thought it would be possible to use Groovy in PI. One place could be in Java mappings. I’ll describe the steps that I have taken to implement this.

  1. Download and install the Groovy library.
  2. Get the Groovy plugin for Eclipse; it makes development much easier.
  3. Create a new Eclipse project.
  4. Add aii_map_api.jar to the project, to be able to implement the StreamTransformation interface.
  5. Create a new Groovy file in the source folder with the name GroovyMapIdoc.groovy, so Eclipse knows it is a Groovy file.
  6. Create your mapping. I have attached my example code below.
  7. Compile the Groovy files using the context menu on the GroovyMapIdoc.groovy file.
  8. Zip the content of the bin-groovy folder in the project and upload it as an imported archive in the Integration Builder. Alternatively, use an Ant build to create the zip files.
  9. Upload the two files Groovy-1.6.1.jar and asm-2.2.3.jar as imported archives. They can be found in <GROOVY_HOME>\lib.
  10. Activate and use the mapping.

I would expect people trying this to have good knowledge of XI or PI Java mappings, because that is a prerequisite for developing mappings like this.

One example I have always considered was my first challenging mapping experience: posting financial documents with more than 1000 lines to the FIDCCP02 IDoc. FIDCCP02 only accepts 999 lines. The posting can be split into multiple IDocs of 998 lines each, with a balancing line added to each document. This way all documents will balance.

The document is transformed from the left structure to the right. For this example I have used a max size of 3 to make testing easier.
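The split-and-balance logic can be sketched in plain Java. This is an illustrative sketch only (the class and method names are mine, not part of the actual mapping): the real mapping works on the IDoc XML, while here each line is reduced to its amount, and `step` is the maximum number of data lines per document.

```java
import java.util.ArrayList;
import java.util.List;

public class SplitSketch {
    // Split `amounts` into documents of at most `step` data lines and
    // append a balancing line to each, so every document sums to zero.
    static List<List<Double>> splitWithBalance(List<Double> amounts, int step) {
        List<List<Double>> idocs = new ArrayList<>();
        // integer ceiling division: how many documents are needed
        int numIdocs = (amounts.size() + step - 1) / step;
        for (int i = 0; i < numIdocs; i++) {
            int from = i * step;
            int to = Math.min(amounts.size(), from + step);
            // copy the data lines for this document
            List<Double> idoc = new ArrayList<>(amounts.subList(from, to));
            double sum = 0;
            for (double a : idoc) sum += a;
            idoc.add(-sum); // balancing line
            idocs.add(idoc);
        }
        return idocs;
    }
}
```

With step 3 and the lines 10, 20, 30 and 40, this yields two documents, [10, 20, 30, -60] and [40, -40], each balancing to zero.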

The code that I have used for the mapping is.

package com.figaf.mapping

import com.sap.aii.mapping.api.StreamTransformation;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;

import groovy.xml.MarkupBuilder

class GroovyMapIdoc implements StreamTransformation{

   Map param;
   
    void setParameter(Map param) {
        this.param = param;
    }
    // Number of data lines per idoc (set low for easier testing)
    def step = 3
   
    /**
     * Implementation of the execution method
     */
    void execute(InputStream input, OutputStream out) {

        // Parse the input using the XMLSlurper
        def FICCP01 = new XmlSlurper().parse(input)
        // get the different lines using the GPath
        def Lines = FICCP01.IDOC.LINE
        // create a writer example
        def writer = new OutputStreamWriter(out)
       
        def xml = new MarkupBuilder(writer)
        // create the root element and fill data into it.
        xml.FICCP01(){
            // get the number of idocs to be created (integer ceiling division)
            def numIdocs = Lines.size().intdiv(step) + (Lines.size() % step > 0 ? 1 : 0)
            // loop for each idoc
            for ( i in 0..numIdocs-1 ) {
                // index of the last line belonging to the current idoc
                def max = Math.min(Lines.size() - 1, i * step + step - 1)
                // running sum, used to create the balancing line
                def sum = 0.0
                def lineno = 1
                IDOC(){
                    // create the number segment, using GPATH
                    NR(FICCP01.IDOC.NR )
                    // for each line in the range do the following
                    Lines[i*step..max].each{ oldline ->
                        // create a new LINE node in the output element
                        // with the following content
                        LINE(){
                            NO(lineno++)
                            Text(oldline.Text.text())
                            Amount(oldline.Amount.toBigDecimal())
                        }
                        // update the running sum
                        sum += oldline.Amount.toBigDecimal()
                    }
                    // create a balancing line, which balances the idoc
                    LINE(){
                        NO(lineno)
                        Text('Balance')
                        Amount(-sum)
                    }
                }
            }
        }

        // write the xml to output
         writer.flush()
         writer.close()
       
    }
   
}

Behind the scenes the Groovy file is turned into Java classes. Because Java does not support closures natively, different subclasses are created. Try to have a look at them using a decompiler like Jad.

Conclusion

Groovy could be a way to improve how Java mappings are created. The XML generation is easier to handle than it would have been in Java, and it is more powerful than XSLT. It takes some effort to get used to Groovy’s closures and the other notation, but it seems to work really well.

I don’t think the performance of the mapping is a problem. There is an overhead to load the Groovy libraries, and the code is probably not as optimized as if it was written directly in Java, but I have not made any measurements of this.

The future of integration consulting

For the last couple of years the job of integrating legacy systems with the SAP ERP system has stayed the same. Each system is unique, so integration had to start from scratch. There could be some limited reuse between integrations, but it did not have a large impact.

Many companies are going to use SaaS (at least that is what many vendors are betting on). SaaS strategies could be seen as a way to get a best-of-breed application. Some SaaS applications are probably going to replace old legacy applications or reduce the number of new legacy applications being created. Integration between SAP and SaaS applications is therefore going to play a large role in the integration work done with PI.

My experience with integrating a system is that it takes an average of 10 days per system (the development could be completed within 2 days, but support during testing and bug tracking is required). If the complexity is larger or requires the use of new services in SAP, the number of days will skyrocket. Large parts of the integration work concern error handling, when unexpected data is received. The price for a 10-day integration is at least €10,000, but with high uncertainty. With such starting prices, many business cases for adopting a SaaS application will fail because of the initial investment.

One way to lower this price is if the SaaS vendors also use the same Enterprise Services that SAP is exposing. Then integration will just be a matter of connecting the two interfaces and testing the business functionality. From a customer perspective this would be the ideal situation, and it would lower the cost of integration.

If there are no Enterprise Services to cover the SaaS integration, one of two things should happen: either an Enterprise Service is created or a PI integration is created. Both options can be produced by a consulting partner and then shared under some license. Open source could be an option if it was supported by the SaaS provider.

Creating reusable integration parts can be difficult, especially in small markets like Denmark. The Danish government was very keen on electronic invoices (OIOXML), using the UBL standard. That led to a race among consultancies to create templates for the integrations. I do not know of any company that got a large enough share of the integrations to justify a large upfront investment in a shrink-wrapped solution.

Reuse is possible in some domains. I have been involved in a project where reuse of BPM functions led to easy integration with the 2nd and 3rd applications. In this project the Enterprise Service was implemented as a ccBPM in PI. The later integrations needed some adjustments to support new functions and integration protocols, but the overall framework proved successful.

We asked the vendors of the third-party applications if they had any integration to SAP. They all had some integrations, but for different versions of SAP and application components. Challenges like this are going to continue because of the flexibility in SAP: in some cases different modules can handle the same functions. I doubt one Enterprise Service should handle the functionality of two different modules.

With SaaS applications the market is larger, since it is global and allows for packaged solutions. The packaged solutions can be PI mappings and BPMs as well as Enterprise Services, sold via EcoHub or by the SaaS vendor.

A business model for this could be to give a lower rate on the first implementation and then sell the integration to other customers. It will take a few iterations before a shrink-wrapped integration can be released.

Having an integration to SAP must be really interesting for SaaS providers, because it will make it easier for customers to start using their services. If the customer has to spend a large number of days on integrating the application, adoption is less likely to happen. Without an integration to the ERP system, some of the benefits of the application are lost.

For consultants it will also be interesting if you only have to produce something once and can then sell it multiple times. The issue is then how the global market can be supported, but remote consulting via VPN can probably solve most of the issues.

I’m looking forward to seeing if this is going to be the business model for future PI consultants.

Naming conventions for software components

During the different XI/PI implementations I have been involved with, I have experienced different naming conventions. I would like to share the different types of conventions that I have seen and my thoughts on them.

There is a great document on best practices for naming conventions. Also see Thorsten’s blog about the subject. The naming convention document does not give explicit ways to name things. In this blog I will instead describe how namespaces and software components can be derived. For all the different types of interfaces, a lot of effort has to be devoted to creating abbreviations and a meaningful structure; this will not be covered in this blog, but hopefully in a future one.

In this blog the term object refers to all objects created in the Integration Builder Repository, like scenarios, interface mappings or message interfaces.

Simple Group

In the first implementations we used a very rough method to divide the objects based on the target system. I think we had software components like SAPR3 and SAPXI, and all the legacy systems as separate software components.

This method was very easy to get started with and did not require a lot of work up front. It was pretty easy to find which software component to use. The component names could be implemented at all customers, because it was the same setup they all used.

The namespaces just contained the legacy system name, so it was more of a point-to-point connection. It was easy to find the objects involved in a scenario: you just used the legacy system name, and you got the information you needed from the objects involved.

The problem with this method was that a lot of interfaces were grouped into the same component. This can make it more difficult to transport the components; at least you need to be more careful about what you are promoting to production. A larger problem with this tight coupling was that interfaces were not named for reuse, so a refactoring would be needed. This limits the SOA capabilities, which are one of the key selling points for XI/PI.

In recent implementations more focus was on creating an SOA architecture. In my view this requires a more process-centric approach. I have seen two different methods for creating a naming convention that groups the objects smarter: using the Solution Manager description of the processes, or using Solution Maps, which are SAP’s standard way to look at processes in organizations.

Solution Manager (SOLMAN)

In an SAP implementation, SOLMAN is a vital part of describing the business processes used in the organization. This is true both in the implementation phase and, I believe, after the solution has been implemented.

I have not been involved in designing the SOLMAN maps used for describing the processes. What I have seen is that the business processes have been used for grouping the scenarios. By using the same structure for the objects, there is a clear mapping from SOLMAN to the PI repository.

Using the SOLMAN grouping of processes gives the following advantages.

  • The business knows which process you are working on, since they have the same notation. This makes it easy for a developer to communicate where there is a problem.
  • The architecture and naming are given, because the naming has already been created.

The disadvantages

  • If the project has multiple phases, more maps can be created. This could make it a little more difficult to find the mapping between the SOLMAN and PI objects. This could be solved with the help of software component versions.
  • If you are designing a project which does not involve SOLMAN, you need to create a new structure or create the SOLMAN structure.

Solution Maps

On a recent project we used the SAP Solution Maps as a source for grouping our objects. The solution maps contain all the different processes used in an organization. There are a number of different solution maps, both the general ERP map and industry-specialized maps.

An example solution map.

Advantages of Solution maps:

  • A structure which matches the business organization.
  • Common naming conventions between projects (really nice if you are a consultant). If used on all projects, it is easy to find the objects. There is no need to consider naming conventions for software components and namespaces, because they can be defined before they are used.
  • The convention is prepared for e-SOA, because the solution maps are used to place the enterprise services. There is, though, a small problem in that the enterprise services are placed in the SAPAPP component. This can be solved in the design phase by placing the action in the software component corresponding to the Solution Map.
  • Interface objects are placed in a way that relates to the service they perform. Different processes can use the same service.

Disadvantages

  • It can be difficult to communicate issues about a scenario, because the implementation is focused on SOLMAN and not on the solution maps.
  • It can be difficult to place an object in the correct group. Is this service used for accounts payable or for sales order management?

Conclusion

Both SOLMAN and Solution Maps focus on the process. They both have advantages and disadvantages. I currently like the Solution Maps method best, because it allows a standard setup for all customers. I also like having services grouped per function, so multiple processes can use the same service.

I need to share more thoughts on how I see the Solution Maps approach. This will be described in a blog soon.

PI documenter services have been upgraded to include user-defined functions

The first release of the open-source version of the documenter library did not have support for the newest PHPExcel. It was therefore not deployed as a service on the Figaf homepage. I have now upgraded the service to support the newest release. The functionality is still free and works the same way.

I finally got time to work with the code and found the bug which did not allow the newest PHPExcel library. The problem was that I had too much reuse of functions. When I created a difference document, I created two documentation documents. The mapping information was then extracted from the two documents and compared to create the difference document. When I created a new Excel serializer to write the difference documents, a problem occurred.

I also got time for a feature which has been requested: including user-defined functions in the documentation. An even more needed function was to include the UDFs in the difference function, so it is possible to see which user-defined functions have changed.

For the documentation I have decided not to include comments. I believe comments should instead be included in the source code. Is this a plausible way to look at things, or should I reconsider the design?

I have not had access to a PI 7.1 system. If you are running 7.1, would you test my service and see if it works? If it is not working, would you send me the XIM file, and I will create a new version which allows for PI 7.1 user-defined mappings.

If you have any other requests for functionality, then comment on this blog or in the Google defect log.

Release of PI documenter tools as open source

I released the two services for PI to get documentation of mappings and to compare two mappings to see what has changed. The releases of those services can be found in the two blogs.

After the release, people have asked if it was possible to run the tools on the user’s own computer instead of the server version. I have been thinking about releasing the scripts as commercial products to accommodate this request. It is possible to create compiled PHP code which can be executed on local PCs. I decided against it because it would require the product to be more mature than it is, and I did not believe the market was large enough; marketing would also be difficult. By releasing the product as open source, it might be possible to get others to contribute and create a service which could help many more.

The product was originally designed to run on a server in a controlled environment, but for this version I had to refactor some of the code to support usage from a client perspective. A problem is that the script extracts a lot of files from the XIM files. These files need to be cleaned up, and I used the quick solution of using the Windows delete command to do this cleanup. There are some other places which have not been cleaned up yet; they will be handled in the next release.

One of the challenges with releasing software as open source is that it exposes one’s coding capabilities. I would say that there is a long way to go before I’m able to make a living from coding PHP. The code seems to work, but the refactoring is a little more time consuming.

You can find the code and installation guides at code.google.com.

If you want to help with improving the code, please join the group and help improve the product.

Thoughts on versioning of PI components

This blog describes some of the thoughts that I have on using versions for different software component versions for development and support tracks.

Why

The use of versions as a way of managing releases has been used for a long time to maintain programs. With the help of a revision control program it is possible to get an overview of which releases are built and which changes have been made. This gives a better understanding of what is promoted to production.

On my current PI project we have different phases. We have just gone live with the first part and are currently developing the second part. After each phase a release is tested and moved to production. While the second phase is being developed, the first phase must be supported, to ensure corrections can be moved into production before the next release.

PI makes it possible to use different versions of software components. Like other versioning tools, it is a challenge to use correctly. I’ll describe how we have used it and what we have learned.

How to use versions

The main part of configuring different versions is fairly easy. It just requires the user to create a new software component with a new version identifier. The first release will have version 1.0 and the second release version 2.0, and both will be placed in the same product. Remember to add dependencies to the version 2.0 interface components, assuming the interface components are also upgraded. The version 2.0 components should also be installed on the same systems as the version 1.0 components; otherwise it could cause problems.

Create the namespaces which will be used in version 2.0. Then make a release transfer, where the objects from version 1.0 are copied to version 2.0. The release transfer can be found in the Tools menu, and it works much like the export function. Content from version 1.0 is only copied if a similarly named namespace exists in the version 2.0 component. You have now copied the content to version 2.0 and can make changes in each component separately.

With the copy, all objects are copied to version 2.0. Objects will continue to use the dependent objects from version 1.0. For instance, a message mapping will still use the messages from version 1.0, unless the message has been imported via a dependent object; then the mapping will use the message from version 2.0 of the dependent software component.

With this upgrade maneuver, the objects will still be the same, with very little change to them. It is a problem in the beginning, but after getting used to it, it seems like a good idea. When changes are required, simply alter the scenario to use a new interface mapping and maybe new actions, then add the functionality, and it can be used.

We tried to copy objects which had imported messages from the imported components. We therefore had the message types from version 2.0 in our 2.0 mappings. I do not think that this was a smart move, since I prefer having to select when to upgrade a message type. Therefore this trick only works for abstract mappings, which have to be imported via the imported components.

Objects and scenarios

I have earlier written about using scenarios as documentation and as a dialog tool for communicating the process with the business. With the help of scenarios, it is easier to maintain which version will be used.

When scenarios are copied to version 2.0, they will still point to version 1.0 mappings and actions. When changes are made to a version 2.0 object, the scenario must be changed to reflect that the mapping has changed. It is thereby possible to do configuration on a system while keeping everything from version 1.0 except the mappings which have to be changed.

Support

The use of two versions can cause problems. When a support issue arrives, it must be corrected in version 1.0, but this change is not automatically maintained in version 2.0. It is therefore necessary to somehow maintain both versions; if this is not done, the problem will appear again when version 2.0 is deployed. This can be difficult, because it requires the users to implement the changes in version 2.0, and it gets more complicated if the involved object has been altered for version 2.0. If objects have not been altered in version 2.0, a release transfer is possible again. If the object has been changed in version 2.0, the release transfer will show that conflicts exist.

To avoid having to develop things twice, and to make sure that we make the same changes to version 2.0, we made some changes to the process.

  • First, we decided that some processes will not be changed in phase 2. Those objects have been changed so that we use version 2.0 in production. This removes the need to maintain those objects in both versions.
  • Secondly, we decided that some systems will be sent into production and maintained in version 1.0. These systems share BPMs with other systems, which will be changed in version 2.0 to support new features.

Namespaces and versioning

The use of namespaces could make sense for some areas. The version number could then be part of the namespace, which would make it clear when an object contains content from a different version. If this approach was used, it would not be possible to use release transfer, because the namespaces differ. I think this probably makes most sense when communicating with third parties, where the WSDLs need to be shared and it must be agreed which versions are used.

Testing

Since we need to be able to support the currently running production system while creating the new release, we have to have two lines of ERP systems. I believe this is a common setup for ERP projects. For the PI development it is just a matter of configuring the correct scenario version for the correct ERP and third-party systems. It thereby seems possible to perform support alongside the development of phase 2.

Conclusion

The use of versions is a pain and requires developers to check what they are doing. I have avoided the use of versions for 4 years, but have finally agreed to use them. The main argument was that the project had deliverables in two stages. I believe that it is correct to use versions, but it still requires caution, since it is easy to break the setup.