SAP ABAP/PI Developer Position

DataXstream has a full-time opening for an ABAP/PI Developer in our Indianapolis, IN office.

Technical Qualifications

  • 7+ years of experience in ABAP/PI/XI Programming is required
  • Experience with EDI strategy and tools
  • Understanding of Object Oriented concepts
  • Experience with Data Dictionary, Function Modules (Remote Enabled), BAPIs, Web Services
  • Experience creating/modifying Adobe Forms
  • Proficiency with ALE/IDocs including error handling is desirable
  • Experience working with SAP ECC 6.0
  • Programming experience in various modules – MM, SD, FI
  • Work will be primarily remote, but there is potential for periodic travel to client sites

Additional Qualifications

  • US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor H-1B candidates at this time.
  • Sharp, energetic consultants with an understanding of business processes
  • Self-starter
  • Ability to work both individually and as part of a team

For immediate consideration, forward your resume and salary requirements to Jane Robitaille.

 

SAP IDOCs for Customer Number with different Sales Organizations to different External Partnerships

Have you ever implemented an outbound EDI process from SAP for a single customer number where the customer has multiple EDI trading partnerships by Sales Organization or Division?  It can be done.  To accomplish this you will need to create separate output types for each Sales Organization/Division and then set up the Access Sequence/Output Determination to create an IDOC for each partnership.  You can then use the Message Variant and/or Message Function fields of the Partner Profile to differentiate between the Sales Organizations/Divisions.  Finally, you would set up your EDI mapping to look at the Partner Profile fields in order to route each IDOC to the correct partnership.  Let’s take a closer look at this process.

Let’s say that Customer 15 in your SAP system buys products from your company.  The customer sends inbound EDI orders to you using three different partner IDs because it has three internal divisions and wants all transactions kept separate.  You want to keep all sales data for this customer under one customer number in your SAP system and separate them only by division.  You are required to send EDI invoices to this customer, but they must go to the correct EDI Partner ID.  Let’s say you would normally use the standard SAP Output Type RD00 and Access Sequence 0003 (Sales Org, Distribution Channel, Division, Customer Number) for producing your INVOIC IDOCs. [Read more...]
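Under those assumptions, the configuration might be sketched as follows.  The copied output type names ZRD1–ZRD3, the variant values D01–D03, and the sales area values are illustrative, not standard SAP objects:

```text
Output determination (copies of RD00, one per Division, all using Access Sequence 0003):
  ZRD1 -> condition record: Sales Org 1000 / Distr. Channel 10 / Division 01 / Customer 15
  ZRD2 -> condition record: Sales Org 1000 / Distr. Channel 10 / Division 02 / Customer 15
  ZRD3 -> condition record: Sales Org 1000 / Distr. Channel 10 / Division 03 / Customer 15

Partner profile (WE20), outbound parameters for Customer 15:
  Message Type  Message Variant  Basic Type   EDI map routes to
  INVOIC        D01              INVOIC02     EDI Partner ID of division 1
  INVOIC        D02              INVOIC02     EDI Partner ID of division 2
  INVOIC        D03              INVOIC02     EDI Partner ID of division 3
```

The EDI map then reads the message variant from the IDOC control record to pick the outbound trading partner ID.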

SAP EDI EDPAR Table Walkthrough – How to Cross Reference External Customer Number to SAP Customer Number (Part 2)

Let’s say you are receiving EDI ANSI X12 850 Sales Orders from your customers that need to be uploaded into your SAP system using the ORDERS05 IDOC.  Most customers will have their own internal customer numbers that they send in their EDI transmissions to represent the Sold-to and Ship-to partners.

How do you convert these external customer numbers into your internal SAP customer numbers?

Some may hard code these conversions into their EDI maps.  This approach can be very high maintenance as customers can add new ship-to locations or reorganize their internal numbers which would require changes to your maps.

Others may set up a cross reference table within their EDI translation software to perform the conversion.  This works well at times, but then you are at the mercy of your EDI group to update the table with any new additions or changes to existing entries. [Read more...]
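SAP’s own answer, covered in the full post, is the EDPAR cross-reference table.  Conceptually, the lookup that standard inbound order processing performs can be sketched in ABAP as below.  This is a sketch only: the variable names `lv_sender_kunnr` and `lv_ext_number` are hypothetical inputs, and the EDPAR field names (KUNNR, EXPVW, EXPNR, INPNR) should be verified in your system via SE11.

```abap
* Sketch: translate the external Ship-to number from an inbound 850 into
* the SAP customer number using the EDPAR cross-reference table.
* lv_sender_kunnr and lv_ext_number are hypothetical inputs taken from
* the IDOC control record / partner segment.
DATA: lv_internal TYPE edpar-inpnr.

SELECT SINGLE inpnr FROM edpar INTO lv_internal
  WHERE kunnr = lv_sender_kunnr   " SAP customer the VOE4 entries are keyed under
    AND expvw = 'SH'              " external partner function: Ship-to
    AND expnr = lv_ext_number.    " number received on the EDI file

IF sy-subrc <> 0.
  " No entry maintained in VOE4 - post the IDOC with an error status
  " so the cross-reference can be added and the IDOC reprocessed.
ENDIF.
```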

SAP EDI EDPAR Table Walkthrough – How to Cross Reference SAP Customer Number to External Customer Number (Part 1)

When creating IDOCs in SAP to send invoices to customers via EDI, you will likely have to send each customer its own internal partner numbers on the EDI ANSI X12 810 Invoice document.  In almost all cases these will not be the same as the SAP partner numbers.  So how can we set up a cross reference of SAP and external partner numbers?  The answer is simple, because SAP provides a utility to handle this for you.  All you need to do is populate the EDPAR table in SAP using the VOE4 transaction.  Once this is completed, the IDOC_OUTPUT_INVOIC function module will read the EDPAR table when the invoice document output is processed and populate the LIFNR element of the E1EDKA1 or E1EDPA1 segments of the INVOIC IDOC with the external partner number.  Entries in EDPAR can be set up for multiple partners, including the Sold-to, Ship-to, and Bill-to numbers, so that external customer number cross-references can be passed on the IDOC if needed.

Let’s look at how this process works.  Let’s say we have created an invoice document in SAP.  In this case, the Sold-to, Ship-to, and Bill-to partners are all SAP customer number 15.  If we want to create an INVOIC02 IDOC on which the external customer numbers are populated for all three of these partners, we would have to set up three EDPAR entries as displayed in the screen shot below.  The Customer field will contain the SAP partner number (Sold-to, Ship-to, Bill-to).  The Ext. Function field will contain the partner function (SP = Sold-to, SH = Ship-to, BP = Bill-to).  The External Partner field will contain the external partner number that the customer is expecting on the EDI file.  And the Int. no. field will contain the SAP partner number (same as the Customer field).
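For illustration, the three VOE4 entries just described might look like this (the external partner numbers here are made up; the column layout mirrors the maintenance screen):

```text
Customer   Ext. Function   External Partner   Int. no.
15         SP              1000000001         15
15         SH              1000000002         15
15         BP              1000000003         15
```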

[Read more...]

SAP Data Migration – Dealing With Fallout (Part 3)

One of the inevitable aspects of data migration is dealing with fallout from automated data loads.  Typically, this process includes identifying the data that will not load, analyzing the error messages to determine the root cause, formatting a readable report that can be used as a tool in the cleanup process, and fixing the root cause of the problem so that it does not happen again.

Why the data will not load correctly.

There is a litany of reasons why some data records will load correctly while others will not.  Here is a list of some common root causes:

  1. Poor quality legacy data.
    Legacy systems which are not as tightly integrated as SAP, and are not under master data control, allow the end user a bit of freedom when entering data.  A zip code may contain too few or too many characters; an email address may not be properly formatted; numeric fields may have transposed digits; abbreviations may vary (especially in the city field); a quantity of zero (0) permitted by the legacy system may be uploaded into a field where SAP will not accept a quantity of 0; and even simple misspellings can cause stringent validation checks to trigger an error and prevent the record from loading at all.  A more sinister type of error occurs when the data is functionally incorrect, but good enough to pass all of the SAP validity checks.  In this case, the data record will technically load into SAP, but will not be functionally correct.  Duplicate customers, duplicate vendors, a data entry error of a quantity of 1000 instead of 100, and the wrong pricing condition applied to a sales order line are examples of this scenario.
  2. Functional configuration and supporting data effects.
    Many times I have watched the load statistics for a data object plummet from near 100% in the cycle two test load to near 0% in the cycle three test load.  This is very unnerving to the client because the cycle three test load is getting rather close to the go-live date, and “by the way, shouldn’t the statistics be getting better rather than worse?”  Functional configuration changes can wreak havoc on any data load.  Flipping the switch on a data field from optional to required; turning on batch management or serialization for materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction codes; an account determination entry that is missing or not set up correctly; a missing unit of measure or unit of measure conversion factor; a storage location in the upload file which does not exist in SAP – any of these can cause a load to drop mostly or completely onto the floor.
    While change is inevitable on any project, it is important to control and communicate the change so that the downstream impact can be recognized and understood.  Controlled change and communication always work better than total surprise.  Perhaps if we all know ahead of time about that data field that is now required, we can impose a requirement on the data extract side to make sure that the field is populated before it enters the upload file.
  3. Additional data in the upload file.
    Inserting a new field in the middle of the upload file data structure might be necessary for the business to close a gap, but if that change is not communicated to the technical team so that appropriate adjustments can be made to the load object’s input structures and processing logic, the new data will surely never load, and may cause misalignment of the data fields which follow it in the upload structure.

[Read more...]

SAP Data Migration – The Data Migration Plan (Part 2)

If you are responsible for the success of data migration, you will want to build a detailed plan that walks you through all three phases of data migration: pre-data migration preparation, the data migration itself, and post-data migration cleanup.  I like my data migration plan to contain detailed steps that ensure that I don’t forget anything.  Each step lists a specific named responsible person along with their area of responsibility and contact information.  Unless I am responsible for executing the task myself, I prefer the named person to be a client employee (i.e. the business owner of the process) rather than a project consultant.  This is where the responsibility should be, and it requires that the business process owners actually participate in the activity rather than sit on the sidelines and watch.

[Read more...]

BD53 Doesn’t Play Well With Others

I recently posted a blog about how to implement field-level IDOC reduction for the HRMD_A message type.  In short, the standard SAP transaction to reduce IDOC segments and fields (transaction BD53) can’t be used because the field-level reduction is ignored by SAP.  My solution leveraged TVARV as a repository for the fields to clear.  Read the whole solution here.  A colleague of mine was very quick to point out that instead of using TVARV as the method for controlling which fields are cleared, I should have continued to leverage transaction BD53 for the IDOC reduction maintenance and changed my code to look up field-level reductions in table TBD24.
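For completeness, the suggested TBD24-based alternative might be sketched like this, assuming TBD24 holds one row per reduced field with (roughly) the fields MESTYP, SEGTYP, and FELDNAME; verify the actual structure in SE11 before relying on it:

```abap
* Sketch: clear E1P0002 fields based on the field-level reductions that
* BD53 stores in table TBD24 (assumed field names - check SE11).
DATA: lt_reduced TYPE STANDARD TABLE OF tbd24,
      ls_reduced TYPE tbd24,
      ls_e1p0002 TYPE e1p0002.
FIELD-SYMBOLS: <lv_field> TYPE any.

SELECT * FROM tbd24 INTO TABLE lt_reduced
  WHERE mestyp = 'ZHRMD_A'     " the reduced message type created in BD53
    AND segtyp = 'E1P0002'.

LOOP AT lt_reduced INTO ls_reduced.
  " Clear the matching component of the segment dynamically
  ASSIGN COMPONENT ls_reduced-feldname OF STRUCTURE ls_e1p0002
         TO <lv_field>.
  IF sy-subrc = 0.
    CLEAR <lv_field>.
  ENDIF.
ENDLOOP.
```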

What a great idea!  Too bad I hate this suggestion… and it’s all SAP’s fault!!

[Read more...]

How To Implement Field-Level HRMD_A Reduction

I love ALE.  It is super-powerful and, once you get the hang of it, is a snap to configure.  Recently, I was setting up an HRMD_A interface for my client.  Everything was going smoothly until I ran into a requirement to filter out the social security number (PERID), birthdate (GBDAT), and gender (GESCH) for privacy reasons.  All of these fields are in segment E1P0002.  Initially, I thought that this requirement was easy enough to accomplish.  I just created an IDOC reduction in transaction BD53 and filtered out the three fields.  I soon found out that while entire segments were getting reduced from the IDOC as configured, the individually reduced fields were still showing up in the IDOC.  What’s going on?!?
[Read more...]
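A minimal sketch of the TVARV-based workaround described in the full post might look as follows.  The variable name Z_HRMD_A_REDUCE, the user-exit context, and the IDOC data table name `idoc_data` are illustrative assumptions, not the exact code from the post:

```abap
* Sketch: in an outbound HRMD_A user exit, clear every field listed in
* TVARV under a custom variable name from each E1P0002 segment.
* idoc_data is assumed to be the EDIDD table exposed by the exit.
DATA: lt_fields  TYPE STANDARD TABLE OF tvarv,
      ls_field   TYPE tvarv,
      ls_e1p0002 TYPE e1p0002.
FIELD-SYMBOLS: <ls_edidd> TYPE edidd,
               <lv_field> TYPE any.

* One TVARV selection entry per field to clear,
* e.g. LOW = 'PERID', 'GBDAT', 'GESCH'.
SELECT * FROM tvarv INTO TABLE lt_fields
  WHERE name = 'Z_HRMD_A_REDUCE'.

LOOP AT idoc_data ASSIGNING <ls_edidd> WHERE segnam = 'E1P0002'.
  ls_e1p0002 = <ls_edidd>-sdata.
  LOOP AT lt_fields INTO ls_field.
    ASSIGN COMPONENT ls_field-low OF STRUCTURE ls_e1p0002 TO <lv_field>.
    IF sy-subrc = 0.
      CLEAR <lv_field>.
    ENDIF.
  ENDLOOP.
  <ls_edidd>-sdata = ls_e1p0002.
ENDLOOP.
```

The advantage over hard-coding is that adding or removing a filtered field is a TVARV maintenance task, not a transport.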

Changing SAP IDOCs Status In Mass / Mass Deletion Of IDOCs

Mass Change of SAP IDOC Status

From time to time it becomes necessary to change the status of IDOCs in SAP. The most common scenario is the requirement to mark IDOCs for deletion. There is no good way to mass-mark IDOCs for deletion via the standard IDOC processing transaction BD87. However, there is a program that will let you change the status:

RC1_IDOC_SET_STATUS

CAUTION: This program should be used with great care and consideration. Improper use of this program can result in data consistency issues. Make sure you know what you are deleting, why you are deleting it, and what is required to correctly update your system after deleting.

[Read more...]

Are You Managing Your Change Pointers Properly Part 5 – Going Forward

In my last post, I discussed a collaborative effort with functional business owners to devise and execute a proper cleanup plan.  I also discussed a functional review of the configuration to make sure that change pointers are being created only when needed.

So now that you have achieved control over your change pointer data, how do you make sure that it does not go out of control again?

[Read more...]