SAP NetWeaver Gateway : A Step-by-Step Guide for Beginners

By Guest Blogger Kenny Sutherland, SAP Technical Intern, DataXstream

In a three-tier architecture, SAP Gateway sits in the middle, or application, tier. This middle layer is essential for communication between the frontend and the backend. Implementing a data processing system as multiple tiers adds a valuable level of modularity and flexibility, because each tier can be developed and maintained individually.  The purpose of this tutorial is to create a Gateway service that extracts data from the flight carrier database table and can be accessed from the Gateway Client. The service will be able to retrieve a list of carriers as well as individual carriers.  Let’s get started…

Creating a New Project

  • First, navigate to the Gateway Service Builder with the T-Code “SEGW”
  • Click the “Create Project” button

[Screenshot]

  • Give the Gateway project a name and a description
  • Save the project and assign it to the appropriate package. For the purposes of this demo, the project will be saved as a local object (enter $TMP or click the “Local Object” button). The Gateway project has now been successfully created

[Screenshot]

Creating an Entity & Entity Set

  • What is an Entity and Entity Set?
  • An Entity is a structure that can be defined by the user or defined as an ABAP Data Dictionary structure.  An Entity Set is simply a collection or table of Entities.
  • Right click Data Model and select “Import” -> “DDIC Structure” to use an ABAP Data Dictionary structure

[Screenshot]

  • Type “SCARR” for ABAP Structure and press enter. A list of properties should appear
  • Make “Carrier” the object name for the entity
  • Change the “Usage” of the property “MANDT” to “Ignore” and click the check mark at the bottom right


  • Double click the folder “Entity Sets”
  • Click “Insert Row” and name the Entity Set

[Screenshot]

  • The naming convention is to make the Entity Set name either the plural form of the Entity name or the Entity name with “_Set” appended. For training purposes, name the entity set “Carriers” or “Carrier_Set”. “Carriers” will be used for the remainder of this tutorial
  • Use the Entity name, “Carrier”, for “Entity Type Name”. Make sure to save; the Entity and corresponding Entity Set have now been successfully created

[Screenshot]

How to Generate ABAP Classes

  • Click on the “Generate Runtime Objects” button towards the top left of the IDE

[Screenshot]

How to Activate Gateway Service

  • Navigate to the “Activate and Maintain Services” page, “/iwfnd/maint_service”, and click “Add Service”

[Screenshot]

  • Set System Alias to “LOCAL” and Technical Service Name to the name of the Gateway

[Screenshot]

  • Click “Local Object” and then the check button to save

[Screenshot]

  • Go back to the “Activate and Maintain Services” page, click on the service name, and click on “Gateway Client”

[Screenshot]

  • To test the service, verify the “HTTP Method” is set to “GET” and then click “Execute”. There should now be some auto-generated XML

[Screenshot]

  • In order to view the entity and its properties, add a URI option to the end of the URI. Click “Add URI Option” and use “$metadata” and “sap-ds-debug=true”

[Screenshot]

  • Now we can see the Entity Type as well as its properties

[Screenshot]
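Since the Carrier entity was imported from SCARR, the entity type definition in the metadata document should look roughly like the sketch below. This is an illustration only; the exact attributes, namespace, and type lengths depend on the generated service.

```xml
<EntityType Name="Carrier" sap:content-version="1">
  <Key>
    <PropertyRef Name="Carrid"/>
  </Key>
  <!-- Properties imported from the SCARR DDIC structure (MANDT was ignored) -->
  <Property Name="Carrid" Type="Edm.String" MaxLength="3"/>
  <Property Name="Carrname" Type="Edm.String" MaxLength="20"/>
  <Property Name="Currcode" Type="Edm.String" MaxLength="5"/>
  <Property Name="Url" Type="Edm.String" MaxLength="255"/>
</EntityType>
```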

Congratulations! You have made a usable Gateway Service. Now the backend functionality of the Gateway must be coded in order to make it useful.

Implementing GetEntitySet

  • Navigate back to the gateway service builder, expand the “Service Implementation” folder, and expand the entity set. There will be a few auto-generated methods
  • Right click “GetEntitySet”, click “Go to ABAP Workbench”, and ignore the popup that follows. This will bring up the Class Builder

[Screenshot]

  • In the left menu, expand the “Methods” folder, right click on the “GET_ENTITYSET” method, and select “Redefine”

[Screenshot]

  • Under “Signature”, you can see what the method exports to the frontend service: “ET_ENTITYSET”. This exporting table needs to be populated with data from the backend database
  • It is generally bad practice to select all the records from a database table, because doing so can be extremely inefficient and redundant. So instead of using “SELECT *”, select only the first 100 records from the database
  • Activate the method
  • To debug this code, set an external breakpoint; session breakpoints will not work with the Gateway Client. Now the method needs to be tested

[Screenshot]
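The selection statement itself appears only in the screenshot. Based on the SCARR fields imported earlier, it would look roughly like the following sketch (the method name is assumed from the generated data provider class):

```abap
METHOD carriers_get_entityset.
  " Populate the exporting table with the first 100 carriers only,
  " listing fields explicitly instead of using SELECT *.
  SELECT carrid carrname currcode url
    FROM scarr
    INTO CORRESPONDING FIELDS OF TABLE et_entityset
    UP TO 100 ROWS.
ENDMETHOD.
```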

Testing GetEntitySet

  • Reenter the “Activate and Maintain Services” page or, if it is already open in a window, click “Refresh Catalog”
  • Open the service again using the Gateway Client
  • Append the name of the Entity Set to the end of the URI, verify “HTTP Method” is set to “GET”, and execute. There should now be multiple Carrier entries
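With the service name used later in this tutorial, Z_GATEWAY_DEMO_SRV, and the entity set “Carriers”, the full request would look like this:

```
GET /sap/opu/odata/sap/Z_GATEWAY_DEMO_SRV/Carriers
```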

Implementing GetEntity

  • To get an individual Carrier, the Get_Entity method must be implemented
  • In the Class Builder, right click CARRIERS_GET_ENTITY and select “Redefine”
  • Add the following code, which reads the Carrid key from the URI and fills the exporting parameter er_entity:

    DATA: ls_key_tab LIKE LINE OF it_key_tab.
    READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'Carrid'.
    SELECT SINGLE carrid carrname currcode url FROM scarr
      INTO CORRESPONDING FIELDS OF er_entity
      WHERE carrid = ls_key_tab-value.

  • The above code selects a single Carrier using the Carrid that is passed in the URI
  • Activate this method and open the Gateway Client one more time
  • Make sure “HTTP Method” is “GET”, type “/sap/opu/odata/sap/Z_GATEWAY_DEMO_SRV/Carriers('AF')” for the URI, and press execute
  • There should now be an individual Carrier matching the Carrid that was just passed in the URI


Congratulations! You have made your first Gateway Service.

Kenny Sutherland is a current Christopher Newport University student working towards a degree in Information Systems.

This summer, Kenny is focused on an individual learning assignment under the direction of a senior consultant that deals directly with DataXstream’s OMS+ Solution software. His work primarily focuses on the back-end payment functionality of OMS+. Kenny is showing true leadership skills and DataXstream is proud to extend an offer for full-time employment upon completion of his studies at CNU. Kenny brings a lot to the table, including skills in JavaScript, HTML, and several other programming languages. He recently took home first place at CNU’s Software Fair!

We welcome prospective interns to contact us about starting their journey at DataXstream!

Visit our Jobs page or email our Director of Human Resources for more information.


DataXstream Concludes another Successful Year at SAPPHIRE NOW 2013 and ASUG Annual Conference

SAPPHIRE NOW 2013 and the ASUG Annual Conference wrapped up last week in Orlando. DataXstream was among the 20,000 customers, vendors, and prospects in attendance. In addition to the participants in Orlando, over 80,000 virtual participants experienced this year’s conference. This year was no exception to SAP’s tradition of top-notch events, providing exciting news and innovative solutions. Likewise, DataXstream had a great week with customers and potential partners during the conference.

DataXstream highlighted core lines of business including Consulting Services, Project Leadership, Functional Expertise, Custom Development, Basis Administration, Integration Services, and ISV Enablement Solutions.

Here are a few highlights from DataXstream Core Offerings presented at SAPPHIRE NOW 2013.

  • DataXstream’s Consulting Services provide experienced consultants who bring technical expertise and deliver project success.
  • DataXstream’s Project Leadership utilizes years of implementation experience to set the right direction for your project.  We provide solutions that work for your company.
  • DataXstream provides Functional Expertise that delivers processes and configuration to maximize the business benefit of your SAP solutions.
  • DataXstream provides Custom Development solutions to bridge gaps between your business processes and the SAP solution, as well as driving cost reduction through better design.
  • DataXstream provides experienced Basis Administration to provide excellent NetWeaver platform support, and provide system architecture to deliver a reliable, scalable, and adaptable SAP platform.
  • DataXstream’s Integration Services team works with customers to build reliable, stable, and robust integration solutions.  Throughout the integration process, we work with you to integrate your SAP and non-SAP applications to build your complete solution.
  • DataXstream provides ISV Enablement services that integrate your third-party products into the SAP solution.

Now that SAPPHIRE NOW 2013 is over, DataXstream will return to its core activities and continue to deliver value to its customers and establish new business relationships.  We wish everyone who participated in the show continued success.

What Makes a Great Basis Administrator?

Basis Administrators are key to overall project success. Proper system administration provides stable systems and consistent support for your project. One of the key questions that companies face is what makes a great basis administrator? The following items are critical to basis administration success:

  • Technical expertise with NetWeaver platform
  • Understanding of technical operations for datacenter activities (OS, DB, SAN, network)
  • Ability to adapt to changing technology
  • Ability to problem solve and learn on the job

A great basis administrator must understand the different NetWeaver platforms, integration points, and how to support the multiple systems in a SAP landscape. The SAP landscape is becoming complex with multiple SAP systems, hybrid models for infrastructure components (e.g. Linux and Windows OS), and multiple Database technologies based on application dependency. The increased use and deployment of technologies like mobility, analytics, cloud computing, and in-memory-computing force a basis administrator to have a breadth of technical knowledge. The rapidly changing landscape also means basis administrators must be flexible and adapt to the changing infrastructure requirements. A great basis administrator has to support current technology while learning new technology. The ability to utilize core technical knowledge and apply that knowledge to emerging technologies is key to long-term success. Adapting to the shifting technical landscape and problem solving within this environment is critical to success.

DataXstream basis administrators understand the multiple aspects of system administration. We understand the technical details and have the ability to adapt to emerging technology. DataXstream basis administrators have real experience, with proven results.

SAP Technical Consulting Services

DataXstream’s technical consulting teams have been committed to our customers’ success with SAP for over 15 years. Our focus on SAP technologies allows us to have a positive impact on our customers’ SAP projects and businesses. Our team members have a broad cross section of industry experience, working on both large and small teams, allowing them to quickly adapt to the needs of any SAP project. DataXstream’s investment over the last 10 years in SAP development capabilities gives our consulting teams an advantage by providing a solid foundation for advancing their SAP skills. This investment has given DataXstream the ability to build and maintain tools and project accelerators, allowing our consultants to set the right direction for your project, accelerate your timeline, reduce delivery risk, control cost, and ultimately, provide your business with meaningful SAP services and solutions.

“DataXstream was by far the best partner during our SAP implementation. Their level of knowledge was extraordinary, their dedication was unquestionable and their integrity was never a matter of concern for us. I would recommend their services to any company that is serious about SAP.”

- SAP Project Manager, Wholesale Distribution Company

SAP Data Migration – Dealing With Fallout (Part 3)

One of the inevitable aspects of data migration is dealing with fallout from automated data loads.  Typically, this process includes identifying the data that will not load, analyzing the error messages to determine the root cause, formatting a readable report that can be used as a tool in the cleanup process, and fixing the root cause of the problem so that it does not happen again.

Why the data will not load correctly.

There is a litany of reasons why some data records will load correctly while others will not.  Here is a list of some common root causes:


  1. Poor quality legacy data.
    Legacy systems, which are not as tightly integrated as SAP and are not under master data control, allow the end user a bit of freedom when entering data.  A zip code may contain too few or too many characters; an email address may not be properly formatted; numeric fields may have transposed digits; various forms of abbreviations (especially in the city field), a quantity of zero (0) permitted by the legacy system but uploaded into a field where SAP will not accept a quantity of 0, and even simple misspellings can all cause stringent validation checks to trigger an error and prevent the record from loading at all.  A more sinister type of error occurs when the data is functionally incorrect but good enough to pass all of the SAP validity checks.  In this case, the data record will technically load into SAP but will not be functionally correct.  Duplicate customers, duplicate vendors, a data entry error of a quantity of 1000 instead of 100, and the wrong pricing condition applied to a sales order line are examples of this scenario.



  2. Functional configuration and supporting data effects.
    Many times I have watched the load statistics for a data object plummet from near 100% in the cycle two test load to near 0% in the cycle three test load.  This is very unnerving to the client, because the cycle three test load is getting rather close to the go-live date, and “by the way, shouldn’t the statistics be getting better rather than worse?”  Functional configuration changes can wreak havoc on any data load.  Flipping the switch on a data field from optional to required; turning on batch management or serialization for materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction codes; an account determination entry that is missing or not set up correctly; a missing unit of measure or unit of measure conversion factor; a storage location in the upload file which does not exist in SAP – any of these can cause a load to drop mostly or completely onto the floor. While change is inevitable on any project, it is important to control and communicate the change so that the downstream impact can be recognized and understood.  Controlled change and communication always works better than total surprise.  Perhaps if we all know ahead of time about the data field that is now required, we can impose a requirement on the data extract side to make sure that the field is populated before it enters the upload file.
  3. Additional data in the upload file.
    Inserting a new field in the middle of the upload file data structure might be necessary for the business to close a gap, but if that change is not communicated to the technical team so that appropriate adjustments can be made to the load object’s input structures and processing logic, the new data will surely never load, and it may cause misalignment of the data fields which follow it in the upload structure.

[Read more...]

SAP Solution Manager Service Desk Integration

Nowadays when you install SAP ECC 6.0 you get SAP Solution Manager (SOLMAN) as part of the deal – ostensibly for free (although it is really included in the purchase price).  SOLMAN provides a wealth of functionality to help manage the technical environment as well as project processes like testing.

Service Desk functionality is delivered to you for use as a ticketing system.  One of its features is that it can be used as a ticketing system for both SAP and non-SAP systems, as well as in conjunction with other ticketing systems that may already be in place.  In this blog post I’ll briefly touch on some of the scenarios I have encountered and show that there are several ways to deploy Service Desk.

Using Service Desk is beneficial because it can automatically capture a wealth of information about what a user was doing when a problem occurred if the ticket is created directly from SAP.  Also, Service Desk can communicate directly with the SAP mother ship to log issues and manage OSS notes which obviously reduces the risk of transcription errors.  And Service Desk can be extended to include functional components from non-SAP systems which in turn leads to the possibility of one-stop-shopping for ticket management. [Read more...]

SAP Data Migration – The Data Migration Plan (Part 2)

If you are responsible for the success of data migration, you will want to build a detailed plan that walks you through all three phases of data migration: pre-data migration preparation, the data migration itself, and post-data migration cleanup.  I like my data migration plan to contain detailed steps that ensure that I don’t forget anything.  Each step lists a specific named responsible person along with their area of responsibility and contact information.  Unless I am responsible for executing the task myself, I prefer the named person to be a client employee (i.e. the business owner of the process) rather than a project consultant.  This is where the responsibility should be, and it requires that the business process owners actually participate in the activity rather than sit on the sidelines and watch.

[Read more...]

SAP Data Migration – Answering the Important Questions (Part 1)

It is data migration time on your SAP business project.  Whether your project is implementation, acquisition, or merger, the goal is pretty much the same: the seamless inbound acquisition of master and transactional data from one or more external data sources while ensuring that this activity has minimal impact on the rest of the business.  This is where we attempt to move years of neglected master and transactional data from a loosely structured, anything-goes legacy system into a very tightly integrated and highly structured SAP system.  You must consider the likelihood that the concept of master data management had not been invented yet when the legacy or source system providing your data was implemented.

How much data to move? How much data to leave behind? What to automate, and what to execute manually?  How to gracefully orchestrate and execute a data migration cutover from one system to another?  Where and how to fit the data migration plan into the overall business implementation plan?  How to continue to run the business during the data migration phase of the business project implementation? These questions are all part of the planning fun!

[Read more...]

What’s in a Naming Convention? Part II

In my last post, I discussed the naming convention that DataXstream recommends for SAP PI Integration Directory (ID) objects.  I would like to say that I had a great DataXstream ESR-specific naming convention; however, the SAP naming convention guide for PI 7.1 does the job perfectly. Here is the link to the PI 7.1 naming convention guide. I would like to point out some things that I feel most people miss, as well as some things that I think are particularly interesting.

Object Name Prefixes

I have seen a lot of places use prefixes or suffixes on objects, such as dt_ or _mt. I have never really been a fan of these prefixes and suffixes, and apparently neither is the above-linked naming convention guide. The reason is that they are unnecessary. It would be difficult to confuse a message type with a data type in a real interface scenario, even when troubleshooting an unknown broken object. Plus, they make message mappings unnecessarily confusing and long, since it’s not clear whether to add the prefix, i.e. MT_one_to_MT_two. Good descriptive names are usually all that you need, e.g. DEBMAS_to_Customer. The only possible exception to a no-suffix-or-prefix policy is the service interface, as it is sometimes useful to know the direction and type of a service interface. An argument for a prefix or a suffix describing a service interface is that it helps someone who didn’t develop the interface understand the flow from an SAP perspective when they have to come behind and troubleshoot. An argument against is that it looks silly if you use it for a web service, because you end up with a name that doesn’t mean anything to a third-party user (note that operation mappings are told to omit the prefix of direction and mode in the SAP guide).

One interesting thing that I noticed in my investigation of PI 7.1 EHP1 is that naming conventions on PI can apparently be validated. If object names do not conform to a naming convention, a message will appear. To perform this check in the ESR, go to the menu Tools > Component Check.

Select “Governance” and “Interface Name Checks”:

If the service interface name does not end with (In/Out)(SYNC/ASY), the interface check will show an error (this does not impact interface processing). I created two interfaces, one good and one bad, to show this error.

My suspicion is that SAP will put more in place to force consistent naming standards depending on the service interface pattern in future releases.

Using a software component and Namespace for each “side” of an interface

All objects of an interface should not be grouped in a single namespace. They should be split among the Software Component Versions of the systems being interfaced. Otherwise, when you go to configure, you will not be able to see your operation mapping (OM) in the dropdown box without having to select all in the dropdown menu. A general rule of thumb: if it’s not easy to configure, odds are you have done something wrong. Another reason an object might not appear in the dropdown menu (for example, an operation mapping on an interface mapping) is that the installed checkbox is not checked in the SLD. When done correctly, most interfaces should be able to be configured quickly and intuitively in the Integration Directory (ID) without the need to select from all SWCVs in the Integration Builder dropdown menus.

Whatever naming convention you choose to use for the ESR, the important thing to remember is that adhering to the standard makes production support and troubleshooting faster and easier.

Taxes and TemSe

Recently while supporting my current client, I was tasked with solving a rather puzzling issue an end user was experiencing. While using T-Code PU19, the user would receive:

[Read more...]