DataXstream Concludes another Successful Year at SAPPHIRE NOW 2013 and ASUG Annual Conference

SAPPHIRE NOW 2013 and the ASUG Annual Conference wrapped up last week in Orlando. DataXstream was among the 20,000 customers, vendors, and prospects in attendance. In addition to the participants in Orlando, over 80,000 virtual participants experienced this year’s conference. This year was no exception to SAP’s tradition of top-notch events, providing exciting news and innovative solutions. Likewise, DataXstream had a great week with customers and potential partners during the conference.

DataXstream highlighted core lines of business including Consulting Services, Project Leadership, Functional Expertise, Custom Development, Basis Administration, Integration Services, and ISV Enablement Solutions.

Here are a few highlights from DataXstream Core Offerings presented at SAPPHIRE NOW 2013.

  • DataXstream’s Consulting Services provide experienced consultants that bring technical expertise and deliver project success.
  • DataXstream’s Project Leadership utilizes years of implementation experience to set the right direction for your project.  We provide solutions that work for your company.
  • DataXstream provides Functional Expertise that delivers processes and configuration to maximize the business benefit of your SAP solutions.
  • DataXstream provides Custom Development solutions to bridge gaps between your business processes and the SAP solution, as well as to drive cost reduction through better design.
  • DataXstream provides experienced Basis Administration that delivers excellent NetWeaver platform support and system architecture for a reliable, scalable, and adaptable SAP platform.
  • DataXstream’s Integration Services team works with customers to build reliable, stable, and robust integration solutions.  Throughout the integration process, we work with you to integrate your SAP and non-SAP applications to build your complete solution.
  • DataXstream provides ISV Enablement services that integrate your third-party products into the SAP solution.

Now that SAPPHIRE NOW 2013 is over, DataXstream will return to its core activities and continue to deliver value to its customers and establish new business relationships.  We wish everyone who participated in the show continued success.

What Makes a Great Basis Administrator?

Basis Administrators are key to overall project success. Proper system administration provides stable systems and consistent support for your project. One of the key questions that companies face is: what makes a great basis administrator? The following items are critical to basis administration success:

  • Technical expertise with the NetWeaver platform
  • Understanding of technical operations for datacenter activities (OS, DB, SAN, network)
  • Ability to adapt to changing technology
  • Ability to problem solve and learn on the job

A great basis administrator must understand the different NetWeaver platforms, their integration points, and how to support the multiple systems in an SAP landscape. The SAP landscape is becoming more complex, with multiple SAP systems, hybrid models for infrastructure components (e.g. Linux and Windows operating systems), and multiple database technologies chosen by application dependency. The increased use and deployment of technologies like mobility, analytics, cloud computing, and in-memory computing forces a basis administrator to have a breadth of technical knowledge. The rapidly changing landscape also means basis administrators must be flexible and adapt to changing infrastructure requirements. A great basis administrator has to support current technology while learning new technology; the ability to apply core technical knowledge to emerging technologies, and to problem solve within a shifting technical landscape, is what drives long-term success.

DataXstream basis administrators understand the multiple aspects of system administration. We combine deep technical knowledge with the ability to adapt to emerging technology. DataXstream basis administrators have real experience with proven results.

SAP Technical Consulting Services

DataXstream’s technical consulting teams have been committed to our customers’ success with SAP for over 15 years. Our focus on SAP technologies allows us to have a positive impact on our customers’ SAP projects and businesses. Our team members have a broad cross-section of industry experience, working on both large and small teams, allowing them to quickly adapt to the needs of any SAP project. DataXstream’s investment over the last 10 years in SAP development capabilities gives our consulting teams an advantage by providing a solid foundation for advancing their SAP skills. This investment has given DataXstream the ability to build and maintain tools and project accelerators, allowing our consultants to set the right direction for your project, accelerate your timeline, reduce delivery risk, control cost, and ultimately provide your business with meaningful SAP services and solutions.

“DataXstream was by far the best partner during our SAP implementation. Their level of knowledge was extraordinary, their dedication was unquestionable and their integrity was never a matter of concern for us. I would recommend their services to any company that is serious about SAP.” – SAP Project Manager, Wholesale Distribution Company.

SAP Data Migration – Dealing With Fallout (Part 3)

One of the inevitable aspects of data migration is dealing with fallout from automated data loads.  Typically, this process includes identifying the data that will not load, analyzing the error messages to determine the root cause, formatting a readable report that can be used as a tool in the cleanup process, and fixing the root cause of the problem so that it does not happen again.
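
The identification and reporting steps lend themselves to a simple pattern. Here is a minimal ABAP sketch of that pattern; the wrapper function Z_LOAD_CUSTOMER and the record structures are invented for illustration, since the actual load technique and record layout are project-specific.

  " Minimal fallout-capture sketch. Z_LOAD_CUSTOMER is a hypothetical
  " wrapper around whatever load BAPI or technique the project uses;
  " it is assumed to return standard BAPIRET2 messages.
  TYPES: BEGIN OF ty_upload,
           legacy_key TYPE char20,          " key of the legacy record
           name       TYPE char40,          " ...remaining fields omitted
         END OF ty_upload,
         BEGIN OF ty_fallout,
           legacy_key TYPE char20,
           msgid      TYPE symsgid,
           msgno      TYPE symsgno,
           message    TYPE bapi_msg,
         END OF ty_fallout.

  DATA: lt_upload  TYPE STANDARD TABLE OF ty_upload,
        ls_upload  TYPE ty_upload,
        lt_return  TYPE STANDARD TABLE OF bapiret2,
        ls_return  TYPE bapiret2,
        lt_fallout TYPE STANDARD TABLE OF ty_fallout,
        ls_fallout TYPE ty_fallout.

  LOOP AT lt_upload INTO ls_upload.
    CLEAR lt_return.
    CALL FUNCTION 'Z_LOAD_CUSTOMER'        " hypothetical load wrapper
      EXPORTING
        is_customer = ls_upload
      TABLES
        et_return   = lt_return.
    " Keep only errors (E) and aborts (A) for the fallout report.
    LOOP AT lt_return INTO ls_return WHERE type CA 'EA'.
      ls_fallout-legacy_key = ls_upload-legacy_key.
      ls_fallout-msgid      = ls_return-id.
      ls_fallout-msgno      = ls_return-number.
      ls_fallout-message    = ls_return-message.
      APPEND ls_fallout TO lt_fallout.
    ENDLOOP.
  ENDLOOP.

  " Sorting by message ID and number clusters identical root causes,
  " which makes the cleanup report much easier to work.
  SORT lt_fallout BY msgid msgno.

Sorting the fallout by message rather than by record is a small choice that pays off: it turns thousands of individual failures into a short list of root causes.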

Why the data will not load correctly

There is a litany of reasons why some data records will load correctly while others will not.  Here is a list of some common root causes:

  1. Poor quality legacy data.
    Legacy systems that are not as tightly integrated as SAP, and are not under master data control, allow the end user a fair amount of freedom when entering data. A zip code may contain too few or too many characters; an email address may not be properly formatted; numeric fields may have transposed digits; various forms of abbreviation (especially in the city field), a quantity of zero (0) permitted by the legacy system but uploaded into a field where SAP will not accept a quantity of 0, and even simple misspellings can all cause stringent validation checks to trigger an error and prevent the record from loading at all. A more sinister type of error occurs when the data is functionally incorrect, but good enough to pass all of the SAP validity checks. In this case, the data record will technically load into SAP, but will not be functionally correct. Duplicate customers, duplicate vendors, a data entry error of 1000 instead of 100, and the wrong pricing condition applied to a sales order line are examples of this scenario.

  2. Functional configuration and supporting data effects.
    Many times I have watched the load statistics for a data object plummet from near 100% in the cycle two test load to near 0% in the cycle three test load. This is very unnerving to the client, because the cycle three test load is getting rather close to the go-live date, and “by the way, shouldn’t the statistics be getting better rather than worse?” Functional configuration changes can wreak havoc on any data load. Flipping the switch on a data field from optional to required; turning on batch management or serialization for materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction codes; an account determination entry that is missing or not set up correctly; a missing unit of measure or unit of measure conversion factor; a storage location in the upload file that does not exist in SAP – any of these can cause a load to drop mostly or completely onto the floor. While change is inevitable on any project, it is important to control and communicate the change so that the downstream impact can be recognized and understood. Controlled change and communication always works better than total surprise. Perhaps if we all know ahead of time about that data field that is now required, we can impose a requirement on the data extract side to make sure that the field is populated before it enters the upload file (see the sketch after this list).
  3. Additional data in the upload file.
    Inserting a new field in the middle of the upload file data structure might be necessary for the business to close a gap, but if that change is not communicated to the technical team so that appropriate adjustments can be made to the load object’s input structures and processing logic, the new data will surely never load, and may cause misalignment of the data fields which follow it in the upload structure.
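
As a hypothetical illustration of that extract-side requirement (structure and field names invented for the example), a small ABAP gate can divert incomplete records before they ever reach the upload file:

  " Hypothetical extract-side check: a record whose newly required
  " field is empty goes back to the data owner instead of into the
  " upload file, where it would only become load fallout.
  TYPES: BEGIN OF ty_extract,
           legacy_key TYPE char20,
           city       TYPE char40,
           tax_juris  TYPE char15,     " the field that is now required
         END OF ty_extract.

  DATA: lt_extract TYPE STANDARD TABLE OF ty_extract,
        lt_upload  TYPE STANDARD TABLE OF ty_extract,
        lt_reject  TYPE STANDARD TABLE OF ty_extract.

  FIELD-SYMBOLS <ls_rec> TYPE ty_extract.

  LOOP AT lt_extract ASSIGNING <ls_rec>.
    IF <ls_rec>-tax_juris IS INITIAL.
      APPEND <ls_rec> TO lt_reject.    " report back to the business
    ELSE.
      APPEND <ls_rec> TO lt_upload.    " safe to write to the upload file
    ENDIF.
  ENDLOOP.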

[Read more...]

SAP Solution Manager Service Desk Integration

Nowadays when you install SAP ECC 6.0 you get SAP Solution Manager (SOLMAN) as part of the deal – ostensibly for free (although it is really included in the purchase price).  SOLMAN provides a wealth of functionality to help manage the technical environment as well as project processes like testing.

Service Desk functionality is delivered to you for use as a ticketing system. One of its features is that it can serve as a ticketing system for both SAP and non-SAP systems, as well as work in conjunction with other ticketing systems that may already be in place. In this blog post I’ll briefly touch on some of the scenarios I have encountered and show that there are several ways to deploy Service Desk.

Using Service Desk is beneficial because it can automatically capture a wealth of information about what a user was doing when a problem occurred if the ticket is created directly from SAP.  Also, Service Desk can communicate directly with the SAP mother ship to log issues and manage OSS notes which obviously reduces the risk of transcription errors.  And Service Desk can be extended to include functional components from non-SAP systems which in turn leads to the possibility of one-stop-shopping for ticket management. [Read more...]

SAP Data Migration – The Data Migration Plan (Part 2)

If you are responsible for the success of data migration, you will want to build a detailed plan that walks you through all three phases of data migration: pre-data migration preparation, the data migration itself, and post-data migration cleanup. I like my data migration plan to contain detailed steps that ensure that I don’t forget anything. Each step lists a specific named responsible person along with their area of responsibility and contact information. Unless I am responsible for executing the task myself, I prefer the named person to be a client employee (i.e. the business owner of the process) rather than a project consultant. This is where the responsibility should be, and it requires that the business process owners actually participate in the activity rather than sit on the sidelines and watch.
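
As a purely hypothetical illustration (names and details invented), a single step in such a plan might read:

  • Step 112: Load open sales orders. Responsible: J. Smith, Order Management (business owner), ext. 4711. Predecessor: step 108 (customer master load complete). Status: open.

The point is that anyone picking up the plan can see exactly what happens next, who owns it, and how to reach them.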

[Read more...]

SAP Data Migration – Answering the Important Questions (Part 1)

It is data migration time on your SAP business project. Whether your project is an implementation, an acquisition, or a merger, the goal is pretty much the same: the seamless inbound acquisition of master and transactional data from one or more external data sources, while ensuring that this activity has minimal impact on the rest of the business. This is where we attempt to move years of neglected master and transactional data from a loosely structured, anything-goes legacy system into a very tightly integrated and highly structured SAP system. You must consider the likelihood that the concept of master data management had not yet been invented when the legacy or source system providing your data was implemented.

How much data to move? How much data to leave behind? What to automate, and what to execute manually?  How to gracefully orchestrate and execute a data migration cutover from one system to another?  Where and how to fit the data migration plan into the overall business implementation plan?  How to continue to run the business during the data migration phase of the business project implementation? These questions are all part of the planning fun!

[Read more...]

What’s in a Naming Convention? Part II

In my last post, I discussed the naming convention that DataXstream recommends for SAP PI Integration Directory (ID) objects. I would like to say that I had a great DataXstream ESR-specific naming convention; however, the SAP naming convention guide for PI 7.1 does the job perfectly. Here is the link to the PI 7.1 naming convention guide: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/40a66d0e-fe5e-2c10-8a85-e418b59ab36a?QuickLink=index&overridelayout=true. I would like to point out some things that I feel most people miss, as well as some things that I think are particularly interesting.

Object Name Prefixes

I have seen a lot of places use prefixes or suffixes on objects, such as dt_ or _mt. I have never really been a fan of these prefixes and suffixes, and apparently neither is the above-linked naming convention guide. The reason is that they are unnecessary. It would be difficult to confuse a message type with a data type in a real interface scenario, even when troubleshooting an unknown broken object. They also make message mapping names unnecessarily confusing and long, since it is not clear whether to add the prefix, e.g. MT_one_to_MT_two. Good descriptive names are usually all that you need, e.g. DEBMAS_to_Customer. The only possible exception to a no-suffix-or-prefix policy is the service interface, as it is sometimes useful to know the direction and type of a service interface. An argument for a prefix or suffix on a service interface is that it helps someone who did not develop the interface understand the flow from an SAP perspective when troubleshooting. An argument against is that it looks silly on a web service, because you have a name that means nothing to a third-party user (note that the SAP guide says operation mappings should omit the direction and mode prefix).

One interesting thing that I noticed in my investigation of PI 7.1 EHP1 is that it appears that naming conventions in PI can be validated. If an object name does not conform to the naming convention, a message will appear. To perform this check in the ESR, go to Tools > Component Check.

Select “Governance” and “Interface Name Checks”.

If a service interface name does not end with (In/Out)(Sync/Async), the interface name check will show an error (this does not impact interface processing). I created two interfaces, one good and one bad, to show this error.
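
For example (hypothetical interface names), the check would accept the first name below and flag the second:

  • CustomerCreate_Out_Async: conforms, ends with direction and mode
  • CustomerCreateOutbound: flagged as an error, no direction/mode suffix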

My suspicion is that SAP will put more in place to force consistent naming standards depending on the service interface pattern in future releases.

Using a Software Component and Namespace for Each “Side” of an Interface

All objects of an interface should not be grouped in a single namespace. They should be split among the Software Component Versions of the systems being interfaced. Otherwise, when you go to configure, you will not be able to see your operation mapping (OM) in the dropdown box without having to select “All” in the dropdown menu. A general rule of thumb: if it is not easy to configure, odds are you have done something wrong. Another reason why an object might not appear in the dropdown menu (for example, an operation mapping) would be that the “installed” checkbox is not selected in the SLD. When done correctly, most interfaces can be configured quickly and intuitively in the Integration Directory (ID) without the need to select from all SWCVs in the Integration Builder dropdown menus.

Whatever naming convention you choose to use for the ESR, the important thing to remember is that adhering to the standard makes production support and troubleshooting faster and easier.

Taxes and TemSe

Recently, while supporting my current client, I was tasked with solving a rather puzzling issue that an end user was experiencing. While using T-Code PU19, the user would receive:

[Read more...]

Too Many Developers Spoil The Code

You may have heard the following idiom before:

Too many cooks spoil the broth

The common meaning extracted from this saying is that too many people working on a single task tend to make a mess of it.  But, what happens when you have too many developers working in a single piece of ABAP code?  That’s right, you get a big mess.  This issue is especially difficult to deal with when there are multiple functional requirements leveraging the same custom code object, form-based user-exit, or function-based user-exit.

While current releases of SAP (those built on SAP NetWeaver 2004s and later) have good built-in handling of enhancements and customizations via implicit and explicit enhancement points and BAdIs, there still exist many old-school user-exits.

Multiple Developers; One Code Object

I recently worked on a project where three separate developers were creating three separate interfaces based on the outbound delivery IDoc. While the development for all three interfaces was occurring at the same time, the go-live dates for the interfaces were different (we’ll discuss that project management glitch at another time). Each interface required a separate set of custom fields and, therefore, its own IDoc extension. The problem is that there is only one appropriate user-exit in IDOC_OUTPUT_DELVRY, and three developers needed to be developing in it at the same time!
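
One common way to keep the teams out of each other’s way (a generic pattern with hypothetical names, not necessarily the approach described in the full post) is to reduce the shared user-exit to a thin dispatcher that routes each IDoc extension to a function module owned by a single developer:

  " Inside the one shared user-exit: route on the IDoc extension
  " (field CIMTYP of the EDIDC control record) so each interface team
  " owns its own function module and only this small dispatcher is
  " shared. Extension and function module names are invented, and
  " idoc_control/idoc_data stand in for the exit's actual parameters.
  CASE idoc_control-cimtyp.
    WHEN 'ZDELVRY1'.                   " interface 1's extension
      CALL FUNCTION 'Z_DELVRY_IF1_EXTEND'
        TABLES
          t_idoc_data = idoc_data.
    WHEN 'ZDELVRY2'.                   " interface 2's extension
      CALL FUNCTION 'Z_DELVRY_IF2_EXTEND'
        TABLES
          t_idoc_data = idoc_data.
    WHEN 'ZDELVRY3'.                   " interface 3's extension
      CALL FUNCTION 'Z_DELVRY_IF3_EXTEND'
        TABLES
          t_idoc_data = idoc_data.
  ENDCASE.

With this shape, each developer can change and transport his or her own function module independently, and the shared include itself changes only once.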

How did we solve this problem?
[Read more...]