Integrating NetWeaver Java stack with Microsoft Active Directory for User Authentication

Using this form of user authentication will alleviate headaches for you, the security team, and the end-user by reducing the number of user credentials and providing a central source of user names.

Basis and Security: they are typically treated as two sides of the same coin. Many companies place both under a single technical lead. As Basis administrators, we can’t help but pick up some security knowledge, even if we don’t want to. Part of the Basis/Security symbiotic relationship comes from Basis having to configure things for Security purposes. Centralized User Administration (CUA) is a great example: Basis configures the connections and the main CUA server so that Security has a single point of user and role administration for the ABAP stack.

What about Java? Java requires the same level of user administration, and, with Enterprise Portal using Employee Self Services/Manager Self Services (ESS/MSS), could possibly have the largest user base of any SAP instance in a company’s landscape. Should the security team be forced to manage users and groups on each individual instance? A perfectly good question to which we Basis administrators have an answer. In short, no.
The Java stack has the ability to tie directly into Active Directory (AD). We can set it as the source database for the User Management Engine and allow the Security team to assign roles based on AD groups. This will also alleviate some stress on the end-users by removing another set of credentials they have to remember.

The Java stack has the ability to tie directly into Active Directory (AD).  We can set it as the source database for the User Management Engine and allow the Security team to assign roles based on AD groups.  This will also alleviate some stress on the end-users by removing another set of credentials they have to remember.

The steps illustrated below assume a working understanding of structures in Active Directory and the options under “User Management” of the SAP Java Stack.
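As a point of orientation before the steps themselves, the UME datasource switch ultimately comes down to a handful of properties on the Java stack. The property names and values below are illustrative of a typical read-only AD configuration; the configuration file name, hostnames, and DNs are assumptions that must be verified against your NetWeaver release and your own AD layout:

```properties
# Illustrative UME properties for an AD-backed datasource.
# Verify property names and the datasource XML file for your release
# before changing anything; all values below are placeholders.
ume.persistence.data_source_configuration=dataSourceConfiguration_ads_readonly_db.xml
ume.ldap.access.server_name=ad.example.com
ume.ldap.access.server_port=389
ume.ldap.access.user=CN=sapldap,OU=Service Accounts,DC=example,DC=com
ume.ldap.access.base_path.user=OU=Users,DC=example,DC=com
# spelling of the group base-path property varies by release; confirm in Config Tool
ume.ldap.access.base_path.grup=OU=Groups,DC=example,DC=com
```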

[Read more...]

SAP Data Migration – Dealing With Fallout (Part 3)

One of the inevitable aspects of data migration is dealing with fallout from automated data loads.  Typically, this process includes identifying the data that will not load, analyzing the error messages to determine the root cause, formatting a readable report that can be used as a tool in the cleanup process, and fixing the root cause of the problem so that it does not happen again.
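The "formatting a readable report" step above can be sketched in a few lines: group the failed records by error message so the root causes can be ranked by how much fallout each one accounts for. The field names (`key`, `error`) are illustrative, not from any SAP load tool:

```python
from collections import Counter

def fallout_summary(failed_records):
    """failed_records: list of dicts with 'key' and 'error' fields (names are illustrative)."""
    counts = Counter(rec["error"] for rec in failed_records)
    # Most frequent errors first: fixing these clears the most fallout.
    return counts.most_common()

failures = [
    {"key": "CUST-0001", "error": "city field contains abbreviation"},
    {"key": "CUST-0002", "error": "zip code has wrong length"},
    {"key": "CUST-0003", "error": "zip code has wrong length"},
]
print(fallout_summary(failures))
```

Sorting by frequency is the point: one missing configuration entry can account for hundreds of failed records, so the report should make that concentration obvious.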

Why the data will not load correctly

There is a litany of reasons why some data records will load correctly while others will not.  Here is a list of some common root causes:


  1. Poor quality legacy data.
    Legacy systems that are not as tightly integrated as SAP, and that are not under master data control, allow the end user a fair amount of freedom when entering data. A zip code may contain too few or too many characters; an email address may not be properly formatted; numeric fields may have transposed digits; the city field may hold various forms of abbreviation; a quantity of zero (0) permitted by the legacy system may be uploaded into a field where SAP will not accept it; even simple misspellings can cause stringent validation checks to trigger an error and prevent the record from loading at all. A more sinister type of error occurs when the data is functionally incorrect but good enough to pass all of the SAP validity checks. In this case the record will technically load into SAP but will not be functionally correct. Duplicate customers, duplicate vendors, a data entry error of 1000 instead of 100, and the wrong pricing condition applied to a sales order line are all examples of this scenario.
  2. Functional configuration and supporting data effects.
    Many times I have watched the load statistics for a data object plummet from near 100% in the cycle two test load to near 0% in the cycle three test load. This is unnerving for the client, because the cycle three test load is getting rather close to the go-live date, and “by the way, shouldn’t the statistics be getting better rather than worse?” Functional configuration changes can wreak havoc on any data load. Flipping a data field from optional to required; turning on batch management or serialization for materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction codes; an account determination entry that is missing or not set up correctly; a missing unit of measure or unit of measure conversion factor; a storage location in the upload file that does not exist in SAP – any of these can cause a load to drop mostly or completely onto the floor. While change is inevitable on any project, it is important to control and communicate the change so that the downstream impact can be recognized and understood. Controlled change and communication always work better than total surprise. Perhaps if we all know ahead of time about the data field that is now required, we can impose a requirement on the data extract side to make sure that the field is populated before it enters the upload file.
  3. Additional data in the upload file.
    Inserting a new field in the middle of the upload file’s data structure might be necessary for the business to close a gap, but if that change is not communicated to the technical team so that appropriate adjustments can be made to the load object’s input structures and processing logic, the new data will surely never load, and it may cause misalignment of the data fields that follow it in the upload structure.
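Many of the data-quality errors in the first category above can be caught on the extract side before the file ever reaches SAP. The sketch below shows the idea; the field names and the specific checks (US-style zip format, basic email shape, non-zero quantity) are illustrative assumptions, not SAP validation logic:

```python
import re

def validate_record(rec):
    """Return a list of error messages for one upload record; checks are illustrative."""
    errors = []
    # US-style zip: exactly 5 digits, optional -NNNN extension (assumption)
    if not re.fullmatch(r"\d{5}(-\d{4})?", rec.get("zip", "")):
        errors.append("zip code malformed")
    # Minimal email shape check: something@something.something
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", rec.get("email", "")):
        errors.append("email not properly formatted")
    # SAP target field will not accept a quantity of zero (per the scenario above)
    if rec.get("quantity", 0) <= 0:
        errors.append("quantity of zero not accepted")
    return errors

good = {"zip": "23185", "email": "buyer@example.com", "quantity": 10}
bad = {"zip": "2318", "email": "buyer_at_example.com", "quantity": 0}
print(validate_record(good))
print(validate_record(bad))
```

Running records through checks like these before building the upload file turns a load-time failure into an extract-time report, which is far cheaper to fix.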

[Read more...]

SAP Solution Manager Service Desk Integration

Nowadays when you install SAP ECC 6.0 you get SAP Solution Manager (SOLMAN) as part of the deal – ostensibly for free (although it is really included in the purchase price).  SOLMAN provides a wealth of functionality to help manage the technical environment as well as project processes like testing.

Service Desk functionality is delivered to you for use as a ticketing system.  One of the features of it is that it can be used as a ticketing system for both SAP and non-SAP systems as well as in conjunction with other ticketing systems that may be in place already.  In this blog post I’ll briefly touch on some of the scenarios I have encountered and show that there are several ways to deploy Service Desk.

Using Service Desk is beneficial because it can automatically capture a wealth of information about what a user was doing when a problem occurred if the ticket is created directly from SAP.  Also, Service Desk can communicate directly with the SAP mother ship to log issues and manage OSS notes which obviously reduces the risk of transcription errors.  And Service Desk can be extended to include functional components from non-SAP systems which in turn leads to the possibility of one-stop-shopping for ticket management. [Read more...]

SAP Data Migration – The Data Migration Plan (Part 2)

If you are responsible for the success of data migration, you will want to build a detailed plan that walks you through all three phases of data migration: pre-migration preparation, the data migration itself, and post-migration cleanup. I like my data migration plan to contain detailed steps that ensure I don’t forget anything. Each step lists a specific named responsible person along with their area of responsibility and contact information. Unless I am responsible for executing the task myself, I prefer the named person to be a client employee (i.e. the business owner of the process) rather than a project consultant. This is where the responsibility should be, and it requires that the business process owners actually participate in the activity rather than sit on the sidelines and watch.
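The structure of each plan step described above can be captured in a small record type. This is just one way to model it; the field names are my own illustration, not a template from any migration tool:

```python
from dataclasses import dataclass

@dataclass
class PlanStep:
    sequence: int
    phase: str            # "pre-migration", "migration", or "post-migration"
    description: str
    responsible: str      # named business owner, not a project consultant
    area: str             # area of responsibility
    contact: str

step = PlanStep(1, "pre-migration", "Freeze legacy customer master changes",
                "J. Smith", "Sales", "j.smith@example.com")
print(step.phase)
```

Keeping the plan in a structured form like this makes it trivial to sort steps by phase or to list every task owned by one person when schedules slip.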

[Read more...]

What’s in a Naming Convention? Part II

In my last post, I discussed the naming convention that DataXstream recommends for SAP PI Integration Directory (ID) objects. I would like to say that I had a great DataXstream ESR-specific naming convention; however, the SAP naming convention guide for PI 7.1 does the job perfectly. Here is the link to the PI 7.1 naming convention guide. I would like to point out some things that I feel most people miss, as well as some things that I think are particularly interesting.

Object Name Prefixes

I have seen a lot of places use prefixes or suffixes on objects, such as dt_ or _mt. I have never really been a fan of these prefixes and suffixes, and apparently neither is the naming convention guide linked above. The reason is that they are unnecessary: it would be difficult to confuse a message type with a data type in a real interface scenario, even when troubleshooting an unknown broken object. They also make message mapping names unnecessarily confusing and long, since it is not clear whether to include the prefix, e.g. MT_one_to_MT_two. Good descriptive names are usually all you need, e.g. DEBMAS_to_Customer. The only possible exception to a no-prefix-or-suffix policy is the service interface, as it is sometimes useful to know the direction and mode of a service interface. An argument for such a prefix or suffix is that it helps someone who did not develop the interface understand the flow from an SAP perspective when they have to come behind and troubleshoot. An argument against is that it looks silly on a web service, because you end up with a name that means nothing to a third-party user (note that the SAP guide says operation mappings should omit the direction and mode prefix).

One interesting thing that I noticed in my investigation of PI 7.1 EHP1 is that naming conventions can be validated on PI. If object names do not conform to the naming convention, a message will appear. To perform this check in the ESR, go to menu Tools > Component Check.

Select “Governance” and “Interface Name Checks”:

If the service interface does not end with (In/Out)(SYNC/ASY), the interface check will show an error (this does not impact interface processing). I created two interfaces, one good and one bad, to show this error.
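The same kind of suffix check is easy to reproduce outside the ESR, for example as a pre-transport lint. The regex below is my assumption of the direction-and-mode suffix convention; the exact pattern SAP's component check enforces may differ by release:

```python
import re

# Assumed convention: name ends in a direction (In/Out) plus a mode
# (Sync/Async), with an optional separating underscore.
SUFFIX = re.compile(r".+_(In|Out)_?(Sync|Async)$")

def name_conforms(interface_name):
    """True if the service interface name carries a direction/mode suffix."""
    return bool(SUFFIX.match(interface_name))

print(name_conforms("CustomerCreate_Out_Async"))
print(name_conforms("CustomerCreate"))
```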

My suspicion is that SAP will put more in place to force consistent naming standards depending on the service interface pattern in future releases.

Using a software component and namespace for each “side” of an interface

All objects of an interface should not be grouped in a single namespace. They should be split among the Software Component Versions (SWCVs) of the systems being interfaced. Otherwise, when you go to configure, you will not be able to see your operation mapping (OM) in the dropdown box without selecting “All” in the dropdown menu. A general rule of thumb: if it is not easy to configure, odds are you have done something wrong. Another reason an object might not appear in the dropdown menu (for example, an operation mapping on an interface mapping) would be that the “Installed” checkbox is not selected in the SLD. When done correctly, most interfaces can be configured quickly and intuitively in the Integration Directory (ID) without the need to select from all SWCVs in the Integration Builder’s dropdown menus.

Whatever naming convention you choose to use for the ESR, the important thing to remember is that adhering to the standard makes production support and troubleshooting faster and easier.

Taxes and TemSe

Recently while supporting my current client, I was tasked with solving a rather puzzling issue an end user was experiencing. While using T-Code PU19, the user would receive:

[Read more...]

Too Many Developers Spoil The Code

You may have heard the following idiom before:

Too many cooks spoil the broth

The common meaning extracted from this saying is that too many people working on a single task tend to make a mess of it.  But, what happens when you have too many developers working in a single piece of ABAP code?  That’s right, you get a big mess.  This issue is especially difficult to deal with when there are multiple functional requirements leveraging the same custom code object, form-based user-exit, or function-based user-exit.

While current releases of SAP (those built on SAP NetWeaver 2004s and later) have good built-in handling of enhancements and customizations via implicit and explicit enhancement points and BAdIs, there still exist many old-school user-exits.

Multiple Developers; One Code Object

I recently worked on a project where three separate developers were creating three separate interfaces based on the outbound delivery IDOC. While development for all three interfaces was occurring at the same time, the go-live date for each interface was different (we’ll discuss that project management glitch another time). Each interface required a separate set of custom fields and, therefore, its own IDOC extension. The problem: there is only one appropriate user-exit in IDOC_OUTPUT_DELVRY, and three developers needed to develop in it at the same time!

How did we solve this problem?
[Read more...]

Command Line Driven Transporting Using the ‘tp’ Command

STMS is a very powerful transaction in the Basis world; the whole SAP transport system revolves around it. Ninety-nine percent of the time, you will use STMS for your transport needs. What about that last 1%? Sometimes it is more efficient, or just safer, to take a little more manual control.
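To give a flavor of what that manual control looks like, here are two typical tp invocations. The transport request number, target SID, and profile path are placeholders, and available options vary by kernel release, so check tp's own help output on your system before running anything:

```shell
# Illustrative only -- run as <sid>adm on the target host.
# DEVK900123, QAS, and the profile path are placeholders.
tp addtobuffer DEVK900123 QAS pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL
tp import DEVK900123 QAS client=100 pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL
```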

[Read more...]

SAP Testing Preparation – Due Diligence

This entry sounds like Project Management 101 and can be summarized as: don’t assume, because when you do… you know the rest of that one. The trick is to uncover your implicit assumptions. The explicit ones are easier to identify, whereas the ones lurking in your subconscious may only come out when they can trip you up.

[Read more...]

“Transporting” scheduled jobs

‘Transport’ is a touch misleading. In this example, we aren’t using STMS to move a job from one AS/CI to another, but we aren’t recreating it from scratch either.

Our SAP servers run on HP-UX hosts with Oracle 10g databases. Recently, the client applied a Support Package Stack (SPS) to the production servers. The process called for stopping scheduled jobs during the update; the jobs were to be restarted afterward as directed by team leads. When the request to restart one job was executed, it could not be completed because the job had “disappeared”. The job in this scenario had over 100 steps executing different programs and variants. Due to time constraints and the risk of incorrect data entry, manually recreating the job in SM36 was not an option.

It is possible to extract a job definition directly out of the tables in which it is stored and then reinsert it into another instance (i.e. copy the job definition from your Q box and drop it back in Production). For demonstration purposes we will call our source server Q01 and our destination server P01. The job used in this example will be OUR_LOST_JOB.

NOTE: The steps used in this tip may utilize commands that access and modify data in ways not explicitly endorsed by SAP. Therefore, the use of this tip should be done at your own risk.
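Before extracting anything, it helps to see where the definition actually lives. Background job headers and step definitions are stored in tables such as TBTCO and TBTCP (with start conditions in TBTCS); the schema owner shown below (SAPSR3) and the exact set of tables you need to copy are assumptions to verify on your own release first:

```sql
-- Illustrative reconnaissance only; schema name and table list are
-- assumptions to confirm before extracting or reinserting anything.
SELECT * FROM SAPSR3.TBTCO WHERE JOBNAME = 'OUR_LOST_JOB';  -- job header
SELECT * FROM SAPSR3.TBTCP WHERE JOBNAME = 'OUR_LOST_JOB';  -- job steps
```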

[Read more...]