In my final post on this topic, I will discuss some of the techniques that I use to “discover” information about customizations in an SAP system, even in the absence of any documentation. The information available to be discovered may include such details as the object name, object type, user name of the person who made the last modification, date and time of the last modification, usage statistics, where-used, and for code-based objects, even the versions and their code differences.
In my last post on this topic, I discussed two negative effects of customizations in an upgrade project – risk and cost. I also discussed an obvious reason to eliminate unnecessary customization – the mitigation of risk and cost.
In this post, we will look at some of the customization areas that add risk and cost to an upgrade project.
One of the inevitable aspects of data migration is dealing with fallout from automated data loads. Typically, this process includes identifying the data that will not load, analyzing the error messages to determine the root cause, formatting a readable report that can be used as a tool in the cleanup process, and fixing the root cause of the problem so that it does not happen again.
Why the data will not load correctly.
There is a litany of reasons why some data records will load correctly while others will not. Here is a list of some common root causes:
- Poor quality legacy data.
Legacy systems, which are not as tightly integrated as SAP and are not under master data control, allow the end user a fair amount of freedom when entering data. A zip code with too few or too many characters; an improperly formatted email address; transposed digits in numeric fields; various forms of abbreviations (especially in the city field); a quantity of zero (0) permitted by the legacy system but uploaded into a field where SAP will not accept it; even simple misspellings – any of these can trigger a stringent validation check and prevent the record from loading at all. A more sinister type of error occurs when the data is functionally incorrect, but good enough to pass all of the SAP validity checks. In this case, the data record will technically load into SAP, but will not be functionally correct. Duplicate customers, duplicate vendors, a quantity entered as 1000 instead of 100, and the wrong pricing condition applied to a sales order line are all examples of this scenario.
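The kinds of defects described above are often caught by a pre-load validation pass over the extract file. Here is a minimal sketch of that idea in Python; the field names and the rules themselves are hypothetical stand-ins, not actual SAP validations:

```python
import re

# Hypothetical pre-load checks mirroring the legacy-data problems
# described above; field names and rules are illustrative only.
def validate_record(rec):
    errors = []
    if not re.fullmatch(r"\d{5}", rec.get("zip", "")):
        errors.append("zip code must be exactly 5 digits")
    if "@" not in rec.get("email", ""):
        errors.append("email address is not properly formatted")
    if rec.get("quantity", 0) <= 0:
        errors.append("quantity of 0 is not accepted by the target system")
    return errors

# A record with several typical legacy-data defects
bad = {"zip": "123456", "email": "tom.smith.example.com", "quantity": 0}
print(validate_record(bad))  # three errors, one per failed check
```

Running checks like these before the load, and reporting the failures in a readable format, feeds directly into the "analyze, report, fix the root cause" cycle described above.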
- Functional configuration and supporting data effects.
Many times I have watched the load statistics for a data object plummet from near 100% in the cycle two test load to near 0% in the cycle three test load. This is very unnerving to the client because the cycle three test load is getting rather close to the go-live date, and “by the way, shouldn’t the statistics be getting better rather than worse?” Functional configuration changes can wreak havoc on any data load. Flipping the switch on a data field from optional to required; turning on batch management or serialization for materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction codes; an account determination entry that is missing or not set up correctly; a missing unit of measure or unit of measure conversion factor; a storage location in the upload file which does not exist in SAP – any of these can cause a load to drop mostly or completely onto the floor. While change is inevitable on any project, it is important to control and communicate the change so that the downstream impact can be recognized and understood. Controlled change and communication always work better than total surprise. Perhaps if we all know ahead of time about the data field that is now required, we can impose a requirement on the data extract side to make sure that the field is populated before it enters the upload file.
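That extract-side requirement can be sketched very simply: before a record enters the upload file, verify that every field the current configuration marks as required is populated. The required-field list below is hypothetical and would be supplied by the functional team whenever configuration changes (for example, a field flipped from optional to required):

```python
# Hypothetical list of required fields, maintained as configuration
# changes are communicated by the functional team.
REQUIRED_FIELDS = {"material", "plant", "storage_location", "base_uom"}

def missing_required(record):
    """Return the required fields that are absent or empty in a record."""
    return sorted(f for f in REQUIRED_FIELDS if not record.get(f))

rec = {"material": "MAT-001", "plant": "1000", "storage_location": ""}
print(missing_required(rec))  # ['base_uom', 'storage_location']
```

A check this cheap, run at extract time, turns a cycle-three surprise into a cycle-two punch-list item.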
- Additional data in the upload file.
Inserting a new field in the middle of the upload file data structure might be necessary for the business to close a gap, but if that change is not communicated to the technical team so that appropriate adjustments can be made to the load object’s input structures and processing logic, the new data will surely never load, and may cause misalignment of the data fields which follow it in the upload structure.
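To see why an uncommunicated layout change misaligns everything downstream, consider a fixed-width upload record parsed with an outdated layout. The layouts and field names here are hypothetical:

```python
# Hypothetical fixed-width layout: (field name, offset, length).
# This is the layout the load program still expects.
OLD_LAYOUT = [("material", 0, 10), ("plant", 10, 4), ("qty", 14, 6)]

def parse(line, layout):
    return {name: line[start:start + length].strip()
            for name, start, length in layout}

# The sender inserts a new 4-character field after "material",
# but the load program still parses with the old layout:
new_line = "MAT-000001" + "NEWF" + "1000" + "000042"
print(parse(new_line, OLD_LAYOUT))
# "plant" now reads 'NEWF' and "qty" reads '100000' -- every field
# after the insertion point has shifted.
```

The material field still parses correctly because it precedes the inserted field; everything after it is garbage, which is exactly the failure mode described above.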
If you are responsible for the success of data migration, you will want to build a detailed plan that will walk you through all of the three phases of data migration: pre-data migration preparation, the data migration itself, and post-data migration cleanup. I like my data migration plan to contain detailed steps that ensure that I don’t forget anything. Each step lists a specific named responsible person along with their area of responsibility and contact information. Unless I am responsible for executing the task myself, I prefer the named person to be a client employee (i.e. the business owner of the process) rather than a project consultant. This is where the responsibility should be, and it requires that the business process owners actually participate in the activity rather than sit on the sidelines and watch.
It is data migration time on your SAP business project. Whether your project is implementation, acquisition, or merger, the goal is pretty much the same: the seamless inbound acquisition of master and transactional data from one or more external data sources while ensuring that this activity has minimal impact on the rest of the business. This is where we attempt to move years of neglected master and transactional data from a loosely structured, anything-goes legacy system into a very tightly integrated and highly structured SAP system. You must consider the likelihood that the concept of master data management had not been invented yet when the legacy or source system providing your data was implemented.
How much data to move? How much data to leave behind? What to automate, and what to execute manually? How to gracefully orchestrate and execute a data migration cutover from one system to another? Where and how to fit the data migration plan into the overall business implementation plan? How to continue to run the business during the data migration phase of the business project implementation? These questions are all part of the planning fun!
Recently, I was looking at a requirements document to build an interface to an external system that wants to query customer master data by the customer first name and last name. As I read this, there was a cacophony of thoughts, all demanding equal attention, racing through my head:
- How will I ever match the inbound interface parameter “Tom” with “TOM”, or “tom”?
- How will I ever match the inbound interface parameter “Smith” with “SMITH” or “smith”?
- The ABAP Open SQL WHERE clause is case-sensitive.
- There could be hundreds of customers named Tom Smith.
- KNA1-NAME1 and KNA1-NAME2 are not indexed fields.
- And no, we are not storing any portion of either first or last name in an existing indexed field like SORTL.
- There are well over one million customers in the database.
- We have already decided to use PI for all interfaces.
- I will have to buy the BASIS team a case of beer to get them to agree to create indices on the fields KNA1-NAME1 and KNA1-NAME2 in a table with over one million records.
I arrived at the conclusion that I need a case-insensitive database query, along with database indices created for the fields KNA1-NAME1 and KNA1-NAME2.
But, what is a case-insensitive WHERE clause? A little research and help from colleagues revealed that many had gone before me, and this was nothing new. To implement a case-insensitive WHERE clause in ABAP, you simply need to use the native SQL UPPER() function. The database system in use here is Microsoft SQL Server, but the UPPER() function and its syntax are similar across database platforms. This seemed like an easy nut to crack. But, as I soon found out, I actually had a lot to learn.
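The UPPER() pattern itself is easy to demonstrate. The sketch below uses Python and SQLite purely to keep the example self-contained and runnable; the real query runs as ABAP native SQL against SQL Server, and the table and field names here only loosely mirror KNA1:

```python
import sqlite3

# Toy stand-in for the customer master: names loosely mirror
# KNA1 (KUNNR, NAME1, NAME2); illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kna1 (kunnr TEXT, name1 TEXT, name2 TEXT)")
conn.executemany(
    "INSERT INTO kna1 VALUES (?, ?, ?)",
    [("0000000001", "Smith", "Tom"),
     ("0000000002", "SMITH", "TOM"),
     ("0000000003", "smith", "tom"),
     ("0000000004", "Jones", "Bob")])

# Applying UPPER() to both sides of the comparison makes the match
# case-insensitive: all three variants of Tom Smith are returned.
rows = conn.execute(
    "SELECT kunnr FROM kna1"
    " WHERE UPPER(name1) = UPPER(?) AND UPPER(name2) = UPPER(?)"
    " ORDER BY kunnr",
    ("Smith", "Tom")).fetchall()
print([r[0] for r in rows])  # ['0000000001', '0000000002', '0000000003']
```

Note that on most databases, wrapping a column in UPPER() prevents a plain index on that column from being used, which is why the indexing question above matters just as much as the syntax.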
I am currently working on an SAP implementation project that is just starting its realization phase. One of my first tasks, as a member of the technical implementation team, is to review completed functional specification documents for RICEF objects. These documents, written by functional subject matter experts, are supposed to detail business requirements that address gaps, and which need to be incorporated into the system being implemented. The purpose of the review is to make sure that the functional specification documents are complete, accurate, and contain the approval signatures required to move on to the technical design phase.
Every Monday morning I hop on a plane, arrive at my destination city, pick up a rental car, and drive to my client’s site. The car rental company gives me a different make and model car every week. And yet, somehow, I am successfully able to open the car, adjust the seat and mirrors, start the car, shift gears, and drive. I can also operate the radio, air conditioning, heat, windshield wipers, and headlights.
Now, put me behind a keyboard in front of a computer application which I have never seen before. My user experience is all over the map – somewhere in the continuum between most excellent and very poor. Some application user interfaces are extremely intuitive, well-designed and easy to navigate, logically follow the business process flow, and provide real, meaningful help when needed. Other application user interfaces are extremely difficult to navigate, are not intuitive, do not follow a logical business process flow, and offer little or no meaningful help. And sometimes in these difficult user interfaces, not only has the steering wheel been moved to a totally unexpected location, but its appearance has been changed so that, even when I see it, I do not recognize it as being the application’s steering wheel.
A well-engineered user interface is no accident. It doesn’t just magically happen. It must be woven into the fabric of the design and the code, and it should never be shoe-horned into the application as an afterthought. It takes a lot of up-front planning, designing, testing, functional effort, and technical effort to produce a really good application user interface. And yes, designing, building, testing, and implementing a good user interface for your application will extend the delivery time of whatever it is that you are building.
Why is a well-designed and ergonomic user interface so important? You could have built the best application ever developed. But if it is unusable, it will never get very far. Countless hours are lost every day as thousands of frustrated users spend extra time and effort wrestling with poorly designed user interfaces, rather than focusing on their jobs. And when the frustration levels reach a certain trigger point, the users will seek out and find alternative ways to perform their duties.
Here are a few examples of some very interesting user interface experiences that I have personally encountered.
In my last blog entitled What’s in a Namespace, I discussed the value of developing deliverable custom solutions in a reserved unique namespace. In this blog, I will discuss how a namespace is related to a software component. I will also discuss the typical product lifecycle, the software component version, and the convention which we use for establishing the software component version release increments.
DataXstream, an SAP Solution Partner, builds, packages, and distributes custom solutions for our clients. We develop all of our custom add-on products in our own reserved and unique namespace /XSTREAM/. But, we also need to reserve a separate unique namespace for each add-on product that we package and deliver using the SAP Add-on Assembly Kit. So, we have a single development namespace /XSTREAM/ and a separate “packaging and delivery” namespace for each add-on product. Why is that?
DataXstream, an SAP Solution Partner, builds, packages, and distributes custom solutions for our clients. We have built and packaged these solutions both in our own SAP landscape and in our client’s SAP landscape. In doing so, we must be careful about how we manage our namespaces, their associated development and repair license keys, and packages. It is not surprising, then, that I have received several inquiries asking about our namespace strategy for the development, packaging, and distribution of add-on products.