Thursday, August 28, 2008

Note 566025 - Consumed and not consumed forecast together from DP to SNP

Summary

Symptom

You need to release the consumed and the not consumed forecast together from DP to SNP.

Other terms

/sapapo/mc90 release-functionality-DP->SNP

Reason and Prerequisites

You need to release the consumed and the not consumed forecast together from DP to SNP.

Solution

There are two possible solutions.

    1. Solution #1.

Please set your SNP data view in a way to have:

  • a standard forecast key figure with category type FA,
  • a standard forecast key figure with category type FC.

Using transaction SE18, select the definition name '/SAPAPO/SDP_RELSTRAT', choose Display, and add the following code to your own implementation of the method 'SET_STRATEGY':
IF iv_category = 'FC'.
* Release category FC without a consumption strategy (not consumed forecast)
  cv_no_strat = 'X'.
ENDIF.

Using transaction SMOD and selecting the Enhancement APODM017 and the Components radio button, please double-click on the Function module 'EXIT_/SAPAPO/SAPLAPO_EXIT_001'.
Then in your own include add the following code:
FIELD-SYMBOLS: <ls_line> LIKE LINE OF schedule_line_tab.

LOOP AT schedule_line_tab ASSIGNING <ls_line>
     WHERE category = 'FC'.
* Exclude the not consumed forecast lines from pegging
  <ls_line>-ignore_pegging = 'X'.
ENDLOOP.

The forecast ATPcat FC must not already be used for other purposes.
Activate all objects where these changes are done.
The consumed forecast should be displayed by the key figure with category FA, and the not consumed forecast by the key figure with category FC; no problems or side effects should occur in the consumption and PP/DS processes.

    2. Solution #2.

If the forecast ATPcat FC is used for other purposes already then this second solution can be implemented.

      a) The following way is valid for APO3.0 and APO3.1.

Please set your SNP data view in a way to have:

  • a standard forecast key figure with category type FA,
  • a standard forecast key figure with category type FA and semantic value = 011.

With this setting, a single release by /sapapo/mc90 to category FA fills both key figures; no BAdI and no user exit are necessary any more.
The consumed forecast will be available in the first standard forecast key figure, and the not consumed forecast in the standard forecast key figure with semantic value = 011.

      b) The following way is valid for SCM4.0 and newer releases.

Please set your SNP data view in a way to have:

  • a standard forecast key figure with category group DF1, category FA and quantity type initial (standard forecast key figure setting),
  • a standard forecast key figure with your own category group, category FA and quantity type = 01, Key Figure Semantics = 000 and Key Figure Funcs initial.

With this setting, a single release by /sapapo/mc90 to category FA fills both key figures; no BAdI and no user exit are necessary any more.
The consumed forecast will be available in the first standard forecast key figure, and the not consumed forecast in the standard forecast key figure with quantity type = 01.

Tuesday, August 26, 2008

ABAP programming

When you run an ABAP program, you call its processing blocks. ABAP programs are controlled
from outside the program itself by the processors in the current work process.



Type I
Type I programs - called includes - are a means of dividing up program code into smaller, more
manageable units. You can insert the coding of an include program at any point in another ABAP
program using the INCLUDE statement. There is no technical relationship between include
programs and processing blocks. Includes are more suitable for logical programming units, such
as data declarations, or sets of similar processing blocks. The ABAP Workbench has a
mechanism for automatically dividing up module pools and function groups into include
programs.
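
As a minimal sketch (the program and include names are hypothetical), the INCLUDE statement simply splices the include's source code into the main program at that point:

REPORT zdemo_main.

* INCLUDE splices the source of ZDEMO_MAIN_TOP in at this point.
* The include itself could contain, for example, only declarations:
*   DATA: gv_counter TYPE i VALUE 1.
INCLUDE zdemo_main_top.

START-OF-SELECTION.
  gv_counter = gv_counter + 1.
  WRITE: / 'Counter:', gv_counter.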



SY is a structure with the ABAP Dictionary data type SYST. The components of SY are known as
system fields. System fields contain values that provide information about the current state of the
system. They are automatically filled and updated by the ABAP runtime environment. Examples
of system fields:
SY-SUBRC: Return code for ABAP statements
(zero if a statement is executed successfully)
SY-UNAME: logon name of the user
SY-REPID: Current ABAP program
SY-TCODE: current transaction
SY-INDEX: Number of the current loop pass
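
A minimal sketch of these fields in use (SFLIGHT is just the standard flight demo table):

DATA: lv_count TYPE i.

* SY-SUBRC is set by most statements; zero means success
SELECT COUNT( * ) FROM sflight INTO lv_count.
IF sy-subrc = 0.
  WRITE: / 'User:',    sy-uname,
         / 'Program:', sy-repid,
         / 'Rows:',    lv_count.
ENDIF.

* SY-INDEX counts the passes of a DO or WHILE loop
DO 3 TIMES.
  WRITE: / 'Pass:', sy-index.
ENDDO.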

LIKE

The LIKE addition can be used in the same ABAP statements as the TYPE addition to refer to any data object that is already visible at that point in the program. The expression after LIKE is the name of such a data object.

Definition of local types in a program using
TYPES <t> LIKE <obj>.
The new data type <t> inherits all of the technical attributes of the data object <obj>.
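
A minimal sketch with hypothetical names:

* A reference data object
DATA: gv_price TYPE p DECIMALS 2.

* The new type ty_price inherits all technical attributes
* (packed number, 2 decimals) of the data object gv_price
TYPES: ty_price LIKE gv_price.

DATA: lv_net   TYPE ty_price,  " uses the derived type
      lv_gross LIKE gv_price.  " refers to the data object directly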

Wednesday, August 20, 2008

SAP architecture

Processing a User Request

IDocs, ALE

IDoc (for intermediate document) is a standard data structure for electronic data interchange (EDI) between application programs written for the popular SAP business system or between an SAP application and an external program. IDocs serve as the vehicle for data transfer in SAP's Application Link Enabling (ALE) system. IDocs are used for asynchronous transactions: each IDoc generated exists as a self-contained text file that can then be transmitted to the requesting workstation without connecting to the central database. Another SAP mechanism, the Business Application Programming Interface (BAPI) is used for synchronous transactions.

A large enterprise's networked computing environment is likely to connect many geographically distributed computers to the main database. These computers are likely to use different hardware and/or operating system platforms. An IDoc encapsulates data so that it can be exchanged between different systems without conversion from one format to another.

IDoc types define different categories of data, such as purchase orders or invoices, which may then be broken down into more specific categories called message types. Greater specificity means that an IDoc type is capable of storing only the data required for a particular transaction, which increases efficiency and decreases resource demands.

An IDoc can be generated at any point in a transaction process. For example, during a shipping transaction process, an IDoc may be generated that includes the data fields required to print a shipping manifest. After a user performs an SAP transaction, one or more IDocs are generated in the sending database and passed to the ALE communication layer. The communication layer performs a Remote Function Call (RFC), using the port definition and RFC destination specified by the customer model. The IDoc is transmitted to the receiver, which may be an R/3, R/2, or some external system.

Source:http://searchsap.techtarget.com/sDefinition/0,,sid21_gci852485,00.html
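
To make the outbound flow concrete, here is a hedged sketch of handing a master IDoc to the ALE layer with the standard function module MASTER_IDOC_DISTRIBUTE; the message type, partner data and segment values are illustrative assumptions, not a complete scenario:

DATA: ls_control TYPE edidc,
      lt_comm    TYPE TABLE OF edidc,
      lt_data    TYPE TABLE OF edidd,
      ls_data    TYPE edidd.

ls_control-mestyp = 'MATMAS'.       " message type (assumed)
ls_control-idoctp = 'MATMAS05'.     " IDoc basic type (assumed)
ls_control-rcvprt = 'LS'.           " receiver partner type: logical system
ls_control-rcvprn = 'RECEIVERSYS'.  " hypothetical partner number

ls_data-segnam = 'E1MARAM'.         " material master header segment
* ls_data-sdata = ...               " flat segment data would be filled here
APPEND ls_data TO lt_data.

CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
  EXPORTING
    master_idoc_control        = ls_control
  TABLES
    communication_idoc_control = lt_comm
    master_idoc_data           = lt_data.

COMMIT WORK.  " the IDocs are actually dispatched on the commit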


What is ALE?
The ALE (Application Link Enabling) concept available in R/3 (Release 3.0) supports the development of applications across different SAP systems. It incorporates the exchange of business information across these systems whilst ensuring consistency and integrity of the data. This functionality is achieved with the use of IDocs (intermediate documents) as opposed to the use of a centralized database. ALE allows the user to perform an SAP transaction in the sending system, after which
the following steps occur:
• One or more communication IDocs (intermediate documents: containers for the application data) are created in the sending system database. An ALE distribution
model, which needs to have been configured, determines which systems the IDocs are to be sent to
• These communication IDocs, that contain the relevant application data of the transaction that was performed, are then passed to the ALE communication layer
• This layer performs an RFC call, using the port definition and an RFC destination determined through the customer model
• The IDocs are then transferred to the respective receiving systems. These could be SAP R/3, R/2 or external systems
• If the receiving system is an SAP system then:
• In the case of master data distribution the same transaction that was performed on the sending system is again performed on the receiving system with the data contained in the IDoc. This allows the data to go through the SAP checks before posting occurs
• In the case of transaction scenarios the relevant data is passed to the respective transactions in order to load the required application document. E.g., a PO is loaded on the sending side, yet a SO is created on the receiving system
• Master data has another difference:
• It can be set up in such a way that any changes made to specific fields in master data tables can automatically trigger off the ALE distribution process for that particular master data object
• If a master data object is created or changed on a sending system and distributed to another system the respective transaction is used to either create or change that respective master data object on the receiving system
In general, if standard SAP can't perform the task required then neither can ALE. It doesn't add functionality; it merely de-couples it and allows you to distribute it onto other remote systems.

What can be distributed?
Control data
The control data includes all objects that describe how the system is organized.
These include Customizing data like company codes, languages, purchasing organizations, plants and user maintenance.
The customer details his specific distribution in the customer distribution model. Once the control data is distributed the control data cannot be changed on the receiving systems. All changes are made to the central system and transported to the receiving systems.
Master Data
The distribution scenarios for the master data are contained in the reference model.
Rather than distributing the entire master data information, views on it are distributed
(for example, sales views on the material master). By configuring the views, the customer can select the master data fields to be distributed.
Transaction Data
The distribution scenarios for the transaction data are stored in the distribution reference model. Examples of transaction data are customer order, distributed contracts, purchase order, shipping notification and invoice.
Why ALE?
ALE is a business solution to a very real need emerging in the SAP market. This is the need for businesses to move towards tighter integration between systems, yet, at the same time, provide independence to the various business units within the company.
In the past the move was towards centralized systems.
Standardization of business processes accompanied by ever-tighter integration within the central system no longer represents a practicable approach to this problem. The following are some of the most commonly encountered difficulties:
• Technical bottlenecks,
• Upgrade problems,
• The effect of time zones on international corporations,
• Excessively long response times in large centralized systems.
In order both to meet the requirements of today's customers and to be open for future developments, ALE must meet the following challenges:
• Communication between different software releases.
• Continued data exchange after a release upgrade without special maintenance.
• Independence of the technical format of a message from its contents.
• Extensions that can be made easily even by customers.
• Applications that are decoupled from the communication.
• Communications interfaces that allow connections to third party applications.
• Support for R/3-R/2 scenarios.

Source: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/abap/An%20Advanced%20Guide%20to%20Implementing%20Application%20Link%20Enabling.pdf


Tuesday, August 19, 2008

Note 429422 - SAP APO System Requirements for multiple Planning Versions

Summary

Symptom

This SAP Note describes how to optimize your liveCache Main Memory
Requirements when using multiple SAP APO Planning Versions
and especially how to enter the correct amount of Planning Versions
in the Quicksizer in order to obtain realistic main memory
requirements for the SAP liveCache.

Other terms

Multiple Planning Versions, DP, SNP, PP/DS, liveCache, requirements,
main memory, swap, performance, sizing, hardware resources, Demand
Planning, Supply Network Planning, Planning Versions

Reason and Prerequisites

You use multiple Planning Versions and want to optimize your Main
Memory requirements for the SAP liveCache.

Solution

---------------
Introduction
---------------

Multiple Planning versions held in the SAP liveCache Main
Memory have a linear multiplier effect on the required amount of
liveCache Main Memory in the Quicksizer.
However, there are different ways to reduce your SAP
liveCache Main Memory requirements due to the amount of versions.
Furthermore, you must enter the correct amount of Planning versions
in the Quicksizer, so that the resulting memory requirements are
realistic. Please, note that if you enter "n" Planning Versions, the
Quicksizer will assume that these Planning Versions are all used at the
same time and that all Versions have the same size : This is usually
not the case.

A possible way to optimize main memory requirements due to the amount
of planning versions, is, for example, optimizing version copy
procedures.
Please, see more details below.

-----------------------------------------------------
1. Optimization of liveCache Main Memory Requirements
-----------------------------------------------------

----------------------------
1.1 SAP APO Demand Planning
----------------------------

In case of SAP APO Demand Planning, if you store few versions
in liveCache Main Memory and keep the rest on the BW Info Cube on the
SAP APO Database, you can optimize your main memory requirements for
SAP liveCache.
This recommendation can be important if your SAP liveCache Main
Memory Requirements are high independently of the amount of versions,
that is, if you plan with a high amount of characteristic
combinations, key figures, etc.
However, for performance reasons, this alternative is
only recommended if your SAP liveCache main memory requirements are
much higher than the available memory addressing capabilities of
your liveCache server, for example, if you are using SAP liveCache
on a 32-Bit Windows Server.
Please, also note that some Demand Planning Versions may only be
required for reporting purposes in form of BW Info Cube Backups on
the SAP APO Database and not necessarily in the SAP liveCache main
memory : In such cases it is correct to enter this amount of Planning
Versions in the Quicksizer as "Amount of Demand Planning Versions in
the BW Info Cube" and NOT as "Amount of Demand Planning Versions in
the liveCache".

------------------------------------
1.2 Planning Version Copy Procedures
------------------------------------

Not all SAP APO users need copies of 100 % of the objects
for a Planning Version. You can avoid the copy of unnecessary data
and reduce your memory requirements by using the following approach
when copying versions for DP, SNP or PP/DS :
a) The central transaction for administration of versions and models
is /sapapo/mvm. This transaction allows you to create and copy versions
while restricting the amount of data to be stored in the version.
b) For time series data, you can also use transaction /sapapo/tscopy
which is more effective than /sapapo/vercop because it offers you the
choice to select which data area is going to be copied. If you take
advantage of the selective capabilities of /sapapo/tscopy for Time
Series, the result will be similar to that of a 50 %
version copy.
c) Transaction /sapapo/vercop is recommended to be used only with Order
data and not with Time Series data : Please, do not check the Time
Series box when using /sapapo/vercop.

Please, note that liveCache processes planning data in basically two
ways depending on the kind of planning concept in use :
a) If planning is based on time buckets processing, like for DP,
the so called Time Series liveCache will be used.
b) If planning is based on order (sales orders, stock orders, etc.)
processing, like for SNP or PP/DS, the so called Order liveCache will
be used.

SAP Notes about Model and Version Management for SAP APO are available
in the component APO-MD-VM.
You can also check the size of your planning versions with
/SAPAPO/OM16 : See SAP Note 431299 for details.

--------------------------------------------------------------------
2. Tips & Tricks : How to enter the "right" data in the Quicksizer ?
--------------------------------------------------------------------

Especially for global customers with several business units, multiple
users may work with multiple versions and/or copies of SAP APO DP, SNP
and PP/DS versions.
However, in general, not all versions will be used at the same time and
not all versions are equally big in size.

1) You can get a first approximation about the system requirements for
your SAP APO by using the Quicksizer for SAP APO in :
http://service.sap.com/quicksizer

Currently, the SAP APO quicksizer is available for DP, SNP, PP/DS, ATP
and CIF. The results of the quicksizer are optimized for mass
processing (batch planning) rather than for interactive planning.
However, overhead CPU resources for interactive planning are also
given based on the amount of concurrent users.
Please, ask your hardware
partner to perform the sizing of your SAP APO and/or to review your
quicksizer results. SAP is also available for final reviews in the
framework of the different Service offerings.
It is also strongly recommended that you perform mass/volume tests
before going productive. In this way you will be able to check
if all performance settings and sizing assumptions are adequate
for your productive system.

The quicksizer questionnaire will ask you, among others, your planned
amount of characteristic combinations, key figures, time buckets,
sales orders, stock orders, purchase orders and planning versions
for DP, SNP and PP/DS. Some of these business figures, say, the
amount of Characteristics Combinations, Key figures,
Orders, that is, transactional data, Planning Versions, etc. have a
significant influence on the resulting main memory requirements for
liveCache. On the contrary, Master Data has a small influence on
the liveCache Main Memory Requirements.
If the figures entered in the Quicksizer are not accurate and do not
correspond to your business process reality, the resulting System
Requirements will also not be correct.
For example :
Please, take into account that the SAP APO sizing relevant amount
of characteristic combinations is NOT necessarily the mathematical
product of all your customers, locations, products, etc. :
Not all mathematical combinations may be relevant for your Demand
Planning run. A lower amount of characteristic combinations will
require less SAP liveCache Main Memory.
The same is valid for the key figures. For your SAP APO sizing, please
consider only the amount of key figures which will be relevant for
your Demand Planning run. In this way you will obtain realistic
liveCache Main Memory requirements.

Furthermore, some particular quicksizer figures such as Planning
Versions have a multiplier effect on the resulting liveCache main
memory requirements.

If you type "n" versions, the quicksizer will
assume that these versions are used in parallel and all have exactly
the same size, that is, the maximum size which is associated with your
active version and which comes from all the figures you entered in the
quicksizer (maximum amount of characteristic combinations, key figures,
buckets, orders ...).
However, planners will define their own versions for their needs.
These may contain less orders than the active version, less master
data, and be therefore much smaller in size than the active version.
Therefore, the figure you have to enter in the quicksizer as amount
of planning versions must take this into account.
Example :
You plan to use 25 SNP versions presumably in parallel. 3 of these
versions are 100 % copies of the active version, 8 versions are
supposed to contain only 50 percent of the amount of orders of the
active version, 14 versions are planned to contain only 10 % of the
orders stored in the active version. Then, the figure you should enter
in the quicksizer must not be bigger than :
3 + 50% x 8 + 10% x 14 = 3 + 4 + 1.4, which is rounded up to 3 + 4 + 2 = 9
If you enter 9 versions, the quicksizer will give as result the
required amount of main memory to hold 9 versions in main memory.
However, SAP liveCache may not need to hold all 9 versions in main
memory at the same time.

2) You do not need to size your SAP APO so that all versions can be
held in Main Memory at the same time. Versions which cannot
be held on liveCache Main Memory will be swapped to the SAP liveCache
devspaces. Swapping is performed by the liveCache management system
automatically on a page level, the size of a Page being 8 KByte.
The liveCache swapping strategy is based on LRU (least recently used),
as for all standard RDBMS.
liveCache keeps those 8KB-pages which are used most often in main
memory and swaps the rest of pages to disk. Eventually, the performance
may be reduced if there is not enough liveCache Main Memory available
to keep all versions which are often in use.
In some cases, the active Version may be swapped to disk if users
are running jobs which often require pages from non-active versions.
This should be avoided : the goal is to keep the pages belonging to the
active planning version in Main Memory.
Example :
Even if you use 9 versions presumably in parallel, not all concurrent
users will use the 9 versions exactly at the same time. The more
frequently used versions will be held in main memory, say, 3, and the
rest will alternately be swapped to disk until they are again needed.
That is, you may enter only 3 Planning versions in the Quicksizer and
not 9.
Please, also note the following.
Let's assume that your company has 2 Business Units working on one
central SAP APO. One Business Unit is located in the USA and the other
one in Europe. In such cases, most users in the USA will certainly
not work at the same time as users in Europe and therefore will not
need all Planning versions in main memory at the same time.

--------------
Conclusions
--------------

When sizing your SAP APO, you must take the following into account :

a) How big are your sizing requirements independently of the amount
of versions (amount of sales orders, forecast orders, etc.) ?
If they are already big, more versions stored in main memory will
only increase the liveCache Main Memory requirements. If they are not
very big, you can afford to have more liveCache Main Memory available
for multiple planning versions.
b) How big is your active version ?
The active version is the one which should be held in Main Memory,
while the rest of versions may be swapped to disk without
necessarily affecting the performance of the system.
c) How many 100 % version copies will be used on average and in
parallel ?
If only some percent of the versions are used in parallel, only this
percent of versions will be relevant for your sizing.
d) How many versions are Time Series or Order relevant ?
Performing version copies with transaction /sapapo/tscopy can reduce the
amount of effective versions to 50 %.
e) How big are your Planning Versions in general ?
If most of them are not as big as the active Version, you must take
this into account when entering the amount of Planning Versions in the
Quicksizer, so that the sizing results are realistic.


Header Data



Release Status:Released for Customer
Released on:05.04.2002 14:02:16
Priority:Recommendations/additional info
Category:Consulting
Primary Component:SCM-TEC In Case of LiveCache Problems: Please use SCM-APO-LCA
Secondary Components:BC-DB-LVC liveCache

SCM-APO-SNP Supply Network Planning (SNP)

SCM-APO-FCS Demand Planning

SCM-APO-MD-VM Version Management

BC-DB-LCA liveCache Applications

Affected Releases

Release-Independent

Related Notes




541703 - Collective consulting note on technical subjects in DP

500843 - Composite SAP note for COM and SAP liveCache 7.2 or higher

431299 - Determining the size of plan versions

Wednesday, August 13, 2008

User Exit

In computer software, a user exit is a place in a software program where a customer can arrange for their own tailor-made program to be called. In the R/3 system from SAP, a user exit is contrasted with a customer exit and allows a customer's developer to access program components and data objects within the R/3 system. In R/3, some user exits use Include statements to include customer program enhancements that are called from the program. Other user exits use tables that are accessed through customization.

Debugging a User Exit or Program


Enhancement/Modifications

1) Execute tcode SMOD to find available enhancements/modifications.
2) Create a project for the enhancement in tcode CMOD.
3) You must activate your project first in order to hit a break-point or get into debug mode for your existing enhancements/modifications. If you do not, the best you can do is step through the main program until you hit the call for that particular customer enhancement.
4) To get into debug, you can enter a hard break-point in the enhancement itself, set a soft break-point with the stop sign, or, the long way, type /h in the OK code area (the area of your GUI where you can type in a tcode) before you execute your transaction or while you are in it. Once you have entered /h, hit Enter and that will take you into debug; from there you can do many different things to find exactly what you are looking for.
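
For the hard break-point option, a small guarded BREAK-POINT inside the enhancement is a common sketch (the user name check is only an assumption so that other users do not stop here):

* Inside your customer enhancement include
IF sy-uname = 'YOURUSER'.  " hypothetical logon name
  BREAK-POINT.             " drops this session into the ABAP debugger
ENDIF.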

User Exits

1) Identify the main program you want to locate a user exit/debug.
2) For example, go to SE80 and do a search by program or dev class (SAPMV45A for sales orders, or dev class VMOD; most SD user exits are in this dev class). In SE80, if you go by program, most user exit programs end in a 'Z', or on rare occasions 'X' or 'Y'.
3) If you are looking at include MV45AFZZ, you can see where there are different forms. These forms get called at times within the program. If you are looking to fill the storage location on the sales order, you will probably want to take a look at the perform that fills in a field in vbap.
4) If this is what you are trying to accomplish, you will need to do the select against the config table TVKOL based on the shipping point/plant and possibly the storage condition based on your picking strategies, as shown in the sketch after this list.
5) For the debug part, you can do the same as in the enhancements/modifications but you will not need to activate any projects.
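
A hedged sketch of that select against TVKOL inside MV45AFZZ (omitting the storage condition RAUBE and the choice of form routine are assumptions; adapt them to your picking strategy):

* E.g. in FORM userexit_move_field_to_vbap in include MV45AFZZ
DATA: ls_tvkol TYPE tvkol.

SELECT SINGLE * FROM tvkol INTO ls_tvkol
       WHERE vstel = vbap-vstel    " shipping point
         AND werks = vbap-werks.   " plant
IF sy-subrc = 0.
  vbap-lgort = ls_tvkol-lgort.     " default the storage location
ENDIF.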

Wednesday, August 6, 2008

Procurement for production

Maintain Info Records:
Purchasing Info Records (info records) are maintained as master data. An info record links a material to a vendor. Data included in an info record are current prices and future pricing conditions, vendor and material relationship information, and the amount of time it takes a vendor to deliver the material.

Scheduling Agreement:
An outline purchase agreement is a long-term agreement between a purchasing org. and a vendor. It is regarding the supply of materials or the performance of services within a certain period. A Scheduling Agreement is a form of outline purchase agreement under which materials are procured on predetermined dates within a certain time period.

Procurement Activity Flow:
Create and Maintain Material Master->Create Info Record in SAP for suppliers->Create Scheduling Agreements for suppliers->Release Scheduling Agreement to suppliers->Create Delivery Lines->Review and Release Delivery Schedule to IEC

SNC scenario mapping examples

Purchase Order Collaboration with Contract Manufacturing



Tuesday, August 5, 2008

CTM SAP notes category

Solution

The various areas in which individual consulting notes are referred to are listed below.

    1. General


Guidelines for note searching in SCM-APO-SNP and SCM-APO-SDM
Note 797264

Release Restrictions for SCM 5.0
Note 832393

    2. SNP-Heuristic


Periodic Lot Sizes in the SNP Heuristic
Note 503109

Using net change planning in SNP heuristic
Note 654235

Adjustment of variants after implementing note 533926
Note 539279

Scheduling logic in the SNP Heuristic
Note 1045636

Heuristic and automatic parallelization
Note 961273

Performance: Tips for improving heuristic performance
Note 991089

Results with direct delivery in the SNP heuristic
Note 912887

Missing unit of measure conversion in the product master
Note 781597

    3. SNP-Interactive Planning


Assignment of users to planning books/data views
Note 445837

Timestream Generation Error
Note 487776

Displaying detailed information in the lower grid
Note 445837

MRP areas in Interactive Planning
Note 663420

    4. SNP-Macros


Generation of Alerts with Exit Macros during a Planning Run
Note 958156

Deletion of not used links to a macros book
Note 359985

Buffering of master data in the SNP
Note 663731

    5. SNP-Deployment Heuristic


Deployment and automatic parallelization
Note 961483

    6. SNP-Optimizer


Locking issue in SNP Optimizer and Deployment
Note 960579

Optimizer provides unclear results
Note 420650

    7. SNP-Deployment Optimizer


Deployment Optimizer ignores reqmts, ATD quantities
Note 701438

    8. SNP-Transport Load Builder


TLB and automatic parallelization
Note 961488

Switching from old to new TLB planning
Note 707828

    9. SNP-Aggregation and Disaggregation


Reports to support standard SNP Hierarchy
Note 992948

    10. SNP-Safety stock Planning


Safety stock is ignored
Note 708910

    11. SNP-Miscellaneous


Values from auxiliary key figures are not extracted
Note 837004

Buffering of master data in the SNP
Note 585382

System upgrade and new installation
Note 722695

Deleting a planning version
Note 663420

    12. Capable to Match


FAQ : Capable-to-Match (CTM) planning
Note 855229

    13. User exits and BAdIs

BAdI Info for Modifying Optimizer Input and Output
Note 542145

CTM: BADI for influencing substitutions
Note 452427

Note 855229 - FAQ : Capable-to-Match (CTM) planning

Solution

- CTM performance -

Question :
How can I improve the CTM performance ?


Answer :
1) Hardware (technical improvements)
Regarding the hardware it is recommended to use a CPU with a high clock speed. A system with multiple CPUs cannot improve the performance because the CTM engine can use only one CPU. An ideal environment is a system with a CPU > 3 GHz and 3 GB RAM.

2) Global customizing settings
The package size is the key parameter that can improve the performance if you are using the asynchronous liveCache (LC) update.
In asynchronous update the system creates the orders in the LC after planning the specified number of demands. At the same time, the system continues with the CTM planning for the other demands.
The performance depends on the amount of planned orders that are created per demand.
If you set up a package size of 1000 demands, this means that every 1000 demands CTM writes the results to the LC. If there are only 1000 planned orders created for fulfilling these 1000 demands, then the package size is too small for an asynchronous run with good performance, because the CTM run needs to stop every 1000 planned orders to trigger the LC writing. This increases the run time. In case there are 100.000 planned orders created, then the package size is too big, because the CTM run for creating these 100.000 planned orders probably takes longer than the parallel writing of the results to the LC.
So the optimal package size makes the planning and writing to LC almost parallel.

Apart from the package size of the asynchronous LC writing, there are three more areas where packages must be defined.
- Package Size for Creating Orders. The value indicates how many orders can be created in the LC at one time.
- Package Size for Creating Pegging Relationships. The value indicates how many pegging relationships can be created in the LC at one time.
- Package Size for Order Selection. The value indicates for how many location products a LC request can be started at one time (package size for reading orders from LC in a way that CTM can use them for planning).

It is difficult to give recommendations regarding these package sizes because one needs to experiment in a specific situation to improve the performance. So consider the following values as a starting point.

Regarding the package size of the Order Creation a value smaller than 1000 can be recommended (default value is 500 orders), but also a value up to 5000 can be efficient. In case of performance issues it is recommended to run CTM with different package sizes : 500, 1000, 2500, 5000 and see whether one of these settings can improve the performance.

Regarding the package size for Creating Pegging Relationships a value bigger than 1000 is recommended. The default value is 5000 pegging relationships.

The setting for the Order Selection is based on location-products, which implies that if the setting is currently 5000, CTM reads the orders for the first 5000 location-products and sends them in one package from the LC. If the average number of orders per location-product combination is 10 (each location-product has on average 10 orders), the result is 50.000 orders transferred in the package.
As this setting depends on the number of orders existing per location-product, one needs to experiment again with different settings in case of performance issues. Generally it is recommended to use a value smaller than 10.000.

3) Planning profile (business improvements)
Apart from the customizing settings, the performance depends on many other criteria like the Master Data Selection, Order Selection, Selected Demands, Planning Period, Product-Location restrictions.

If you want to improve the performance you should create a master data selection which contains only the objects in the model which are actively planned by the CTM run.
You can check whether it is possible to divide your supply chain into separate independent model parts. This offers a big performance improvement potential because the independent parts of the chain can be planned in parallel.

Then the order selection, especially the size and data requirements, is one of the major areas where you can improve the performance.
During the deletion of orders in a CTM planning run, several checks that are not always required are run. The parameter 'FAST_DEL' skips some checks of the normal planning run to improve the planning run time. The parameter is activated if you set Value1 of the planning parameter to X. In this case, no information is analysed on pegging, input or output nodes during order selection. Therefore, the parameter can only be reasonably used with the planning mode "Regenerative planning" (Replan all Orders) and the deletion mode "All unfirmed orders" (Delete all Orders that are not firmed). Note that in this case, only the status and the type of the order are analyzed for the deletion decision. This means that all planned orders or purchase requisitions are deleted, if they were not firmed manually (Output fix) and contain at least one location product that is contained in the CTM master data selection. This parameter must not be used if you use a subcontracting scenario. In this case, inconsistencies can occur between the data in APO and the R/3 System because subcontracting purchase requisitions are deleted but not the subcontractor planned order at the subcontractor side.
Notice that since release SCM 4.0 the parameter 'FAST_DEL' is obsolete and has been replaced with the planning step (Technical Settings tab) End Planning Run after Order Selection in combination with Do Not Check Order Details.

CTM run time depends on the number of demands. Consequently if your business need for CTM does not require all demands simultaneously, you can perform CTM runs using a demand selection. You can cut down on run time if you can first reduce the amount of demands considered in the planning run.

Another factor to take into account is the planning period set for the CTM profile. You can implement a horizon in the work area which includes only those demands that can be planned during the horizon of the CTM run. That is, exclude any periods containing demands which cannot be planned.

Consider also whether you can meet your business requirements by restricting your demands by product-location. You can experience long run times because in a large supply chain you do not restrict the demands by product-location combination.

--------------------

Question:
What is the influence of the late demand fulfillment strategy on the performance ?

Answer:
The fastest strategy is the Standard Procedure for Scheduling.
With the Keep Lateness to a Minimum strategy, the CTM planning run takes longer than it does with the standard procedure.
The Gradually Postpone Demand Date strategy can be extremely time-consuming if you have selected a large time period for the late demand fulfillment (/SAPAPO/CTMCUST) and entered a low value under Offset.

--------------------

Question:
Why does CTM take longer than usual to complete the planning run ?

Answer:
You can experience a very long run time when a large number of orders are split. The consequence is a bottleneck situation when writing the results to the LC.
This order splitting occurs usually in time continuous planning and it depends on the constraints used for the CTM planning. When one of the constraints leads to a complete failure, CTM tries to fulfill the demand using the smallest possible quantity.
So the main rule is that CTM tries to fulfill demands on time using all existing alternatives. Thus if several alternative resources exist CTM uses in the worst case all of them just to fulfill the demand on time.
Therefore orders with varying quantities could appear. These orders can be so small that they cannot be used for an efficient production.

It is highly recommended to use the maximum scheduling time to end the scheduling of those demands which result in this splitting.

--------------------

Question:
Why does the CTM planning run take longer when I use the field LIFPRIO (delivery priority of the sales order) in the demand prioritization ?

Answer:
When you use the field LIFPRIO, CTM reads data related to sales.
As the treatment has to be carried out for each demand and not collectively, this has a bad impact on the run time and the CTM planning run takes longer.
An alternative is to split the CTM planning.
You can face a similar problem when you use the following criteria : ANTLF, DELNR, DELPS, ERTMS, GRKOR, KUNNR, LFTMS, PSTYV, VBTYP, BMENG, WMENG, GMENG, CNFPART.

--------------------

Question:
Why does the CTM planning fail and terminate with an error related to memory consumption (like TSV_TNEW_PAGE_ALLOC_FAILED) ?

Answer:
This might happen when the model is too big. If you check the trace file of the CTM run and its size exceeds 500 MB then you can face a memory problem during the run. In order to avoid this problem you have to split the model into separate independent parts.

Notice that the CTM run generates a trace file. This trace file is recorded if you have previously set in the customizing (Optimization Server Master Data) the status of the CTM engine to L Active (Logging Switched On) and you have entered a log directory.
Transaction SPRO : mySAP SCM -> APO -> Basis Settings -> Optimization -> Basic Functions -> Maintain Master for Optimization Server

A memory consumption problem might also happen when the CTM engine takes a long time to compute a solution for some demands.
In this case you should run CTM with a Maximum Scheduling Time value.

- CTM planning -


Question:
Why are some orders not deleted by a CTM planning run ?

Answer:
Fixed orders (also called firmed orders) cannot be deleted by a CTM planning run.
An order is fixed if one of the following conditions is fulfilled:
1. The order was fixed manually in the PP/DS product view.
2. A CTM planning run can only delete planned orders, purchase requisitions, transfer-requisitions and substitute-requisitions. All the other order types (forecasts, sales orders, stocks, etc.) cannot be deleted at all by a CTM planning run. Therefore these orders are always considered fixed by CTM.
Subcontracting scenario is an exception to this rule. If a purchase requisition for a subcontractor is deleted the corresponding subcontractor production order is also deleted even if it is fixed.
3. Already created PP/DS orders are considered as fixed if a CTM planning run creates SNP orders, but created SNP orders are not fixed when CTM creates only PP/DS orders.
4. CTM considers orders which are not inside the planning horizon as fixed. An order is considered fixed if the start date is before the planning start date or if the end date is after the planning end date.
Orders that are lying inside the planning horizon and have been pegged to orders or demands outside the planning horizon are considered fixed.
This effect is amplified when you use dynamic pegging.
As of release 4.0 the dynamic pegging can be deactivated from the product location master data, but switching off the dynamic pegging has side effects and the safety stock functionality no longer works.
In order to reduce this effect in release 3.1, we can recommend you to decrease the Maximum earliness in the product location maintenance (tab Demand -> Pegging -> Dynamic pegging -> Maximum earliness of a receipt) from 100,000 hours to 1 minute.
5. CTM does not delete any planned orders within the production horizon. These orders are considered fixed. The horizon starts on the system date (or the date you specified as the planning start in the CTM profile).
6. Orders which have at least one component that does not belong to the master data selection of the CTM profile are considered fixed.

--------------------

Question:
How does CTM select the PPM that will be used for a planning run ?

Answer :
A five-step process specifies which PPM will be used for a planning run. The selection stops as soon as one PPM is left based on the selection criteria. Here are the five steps of the selection process :
1) On-time fulfillment.
First, CTM looks for PPMs that can fulfill the demand on time.
2) Priority or Quota.
If both are maintained, the quota will be considered. Otherwise, the priorities (multilevel costs of the plan) for the single PPMs are taken into account.
3) Validity period.
The PPM with the validity period that is the closest to the demand date will be selected.
4) Lot size.
The following process is used :
- The PPM with the greatest maximum lot size is considered first.
- Then the PPM with the smallest minimum lot size is used.
- If no PPM was selected, a random selection will pick a PPM.
5) If the demand cannot be fulfilled on time, CTM starts late demand handling and steps 2 to 4 are repeated.

--------------------

Question:
What is the behaviour of the late demand fulfillment strategy ?

Answer:
The general late demand behaviour of CTM is as follows: no matter whether CTM is run in backward or forward scheduling mode, when the demand cannot be fulfilled on-time CTM starts the late demand handling.

With the default method for late demand handling (Standard Procedure for Scheduling - BackwardProjection3), each day after the demand date is checked for available capacity to fulfill the demand quantity fully or partially. CTM steps forward day by day until it cannot fulfill any quantity. As soon as CTM does not find a partial solution, it switches to forward scheduling and looks for enough quantity to fulfill a minimum quantity in order to find the date closest to the original due date within the late demand fulfillment frame. With the solution found in forward scheduling, CTM calculates the final solution by using the new date and applying the backward strategy.

With the Keep Lateness to a Minimum strategy (BackwardProjection5), CTM tries to find the earliest solution in late demand handling. Priorities and costs are ignored. This option is more time-consuming than the first option because CTM has to check several alternatives.

With the Gradually Postpone Demand Date strategy (Late Demand Offset), CTM does not apply a forward scheduling shift. You have to specify a number of days (N) and then CTM always adds N days to the original due date. So CTM first tries to satisfy the demand on time. If it does not find a solution, it tries to satisfy the demand N days later. If it still does not find a solution, it tries to satisfy the demand 2*N days later and so on. There is no forward step.

--------------------

Question:
Why does CTM create an excess of supply ?

Answer:
The excess of supply might result from the lot size specifications.
For example the product A at location Loc1 has a minimum lot size of 100 and the input component B at location Loc1 has a minimum lot size of 50.000. When you use backward scheduling CTM creates orders of lot 100 for the product A in order to fulfill demands at time t1, t1-x and t1-2x.

--------o--------------o--------------o--------> t
      t1-2x          t1-x            t1

The corresponding input component B is also created in the same order. But the supply of 50.000 created at time t1 cannot be used for the demand at time t1-x and so on. Consequently CTM creates multiple orders of 50.000 and you get an excess of supply.
To correct this situation you can define another CTM profile and plan for product B only. This profile will correct the excess stock situation.

--------------------

Question:
Why does CTM create excess of supply within a demand ?

Answer:
CTM is not able to consume the excess of supply created by orders from the same demand.
The excess of supply created by orders in a demand can only be used (consumed) by the next demands.
A workaround is to run CTM twice by splitting the production line via a master data selection.

--------------------

Question:
Why does CTM create delayed planned orders ?

Answer:
Usually the reason for the delayed fulfillment of the forecast is that there are restrictions concerning the procurement of the product required. CTM cannot fulfill the demand on time and uses the late demand fulfilment strategy.

--------------------

Question:
Why might the CTM planning result in shortages for some products when I use the Interval Planning strategy with safety stock ?

Answer:
The Interval Planning strategy is a process where the entire planning horizon of a CTM planning profile is divided into individual intervals. CTM planning processes the intervals sequentially. For each interval, CTM first fulfills demands and then creates possible safety stock. You should take into account that demands from one interval can consume receipts that were created as safety stock in a previous interval.

--------------------

Question:
How can I set up the Interval Planning ?

Answer:
Interval Planning requires the CTM profile to be set with the Fixed Pegging strategy to run successfully.
Many intervals will decrease performance substantially.
If Interval Planning is to be applied for safety stock planning, check whether you can switch to a safety days of supply scenario without Interval Planning.

--------------------

Question:
Why are some sales orders not considered during the CTM planning run ?

Answer:
Sales orders are not relevant for CTM planning when they are not pegging relevant. Please check the flag Not Pegging-relevant in the product view.
Only sales orders that are relevant for pegging are taken into account for demand prioritization.

--------------------

Question:
Why does CTM create small orders ?

Answer:
Please check the Performance chapter and the question: Why does CTM take longer than usual to complete the planning run ?
Notice that in backward scheduling with time-continuous scheduling, a short resource slot with high capacity can be used to place a lot of small quantity orders in order to minimize earliness.

--------------------

Question:
Does CTM support subcontracting scenario ?

Answer:
CTM only supports subcontracting with source location. If you want to use subcontracting within your CTM planning run, you have to model the source location and you have to create the products for your supplier locations.

Notice that subcontracting works only for PPMs and orders of type PP/DS before release SCM 4.0.
Since release SCM 4.0 it also works for SNP.

--------------------

Question:
Why are the planned orders not created in the first bucket?

Answer:
The planned orders cannot be created in the first bucket (bucket-oriented planning) when the planning start date cannot be assigned to a time zone, so that the resource grid for the first bucket and the planning start do not match. The CTM planning time stream is defined and used with reference to the UTC time zone, because the planning start is unique for all locations of a supply chain network. As a workaround you can decrease the planning start date.

--------------------

Question:
Why does CTM create an excess of supply when I use the safety stock method ?

Answer:
The standard safety stock method uses a second run to fulfill the requirements for the safety stock. CTM checks the inventory profile and as soon as there is a need it creates a virtual demand for the second run. With this method CTM creates demands at a certain point of time to maintain the safety stock level irrespective of the orders created further down the timeline. Consequently you can get an excess of supply, especially when lot size settings are being considered.

For example you can face a situation with no stock at the beginning of
the planning period and then several big forecasts at the end.
In this case CTM creates a demand for the safety stock without taking
into account the following forecasts (which have been supplied in the first run).

In order to mitigate the creation of excess of supply you can use the control parameter SSTOCK_MODE :

SSTOCK_MODE
Do not use in SCM 5.0 or higher. Instead use in the CTM profile ->
Supplies with 'Build Up Safety Stock':
- Do Not Plan Demands indicator = SSTOCK_MODE Value1 = X
- Consider Supply Shortage indicator = SSTOCK_MODE Value2 = X (previously), 1 or 2
- Stock is Available indicator = SSTOCK_MODE Value2 = 0 or 2

Parameter name: SSTOCK_MODE
Release: APO 3.0A (SP20+), APO 3.10, SCM 4.0, SCM 4.1
Parameter used in: Safety Stock Run (ABAP)
Parameter has to be set by: User
Value 1: "" or X (parameter is active even if Value 1 is not set)
Value 2: 0, 1 (previously X) or 2
Description: Demands and safety stock fulfillment can be done in the same CTM run. Applied only if safety stock rebuild is active.

If Value 1 is set the order selection of the normal run is performed but the planning of normal demands is skipped. The safety stock run is performed as usual.
In the safety stock run, CTM creates virtual demands to trigger building up to the desired safety stock level.
If Value 2 = 1 or 2 the net calculation of the inventory also takes into account the backlog of normal demands. By default, the safety stock run doesn't take into account the backlog of normal demands, which is determined by the unpegged quantity.
If Value 2 = 0 or 2, available stock is not blocked for safety stock requirement of the same location product, but is visible to dependent requirements during planning. However stock with a pegging relation is not available and remains untouched.

The combination of both values can be used to plan normal demands and safety stock in one run. This makes it possible to avoid interval planning in the case of time-dependent safety stock. This setting is only suitable if safety stock is considered exclusively for finished products, because normal demand counts up against a decreasing safety stock level. In this case the parameters SSTOCK_EARLYFRAME and SSTOCK_LATEFRAME should be applied.


Question:
Why are the safety days' supply not considered during the CTM run ?

Answer:
You have defined a safety lead-time (X days) for a product. However, the CTM result does not consider this lead-time. You should check whether there is another constraint defined for this product like the OrderCreationFrame (global settings for CTM). You can also see the OrderCreationFrame time in the demand list. With this parameter you force the engine to create all orders within Y days (Y < X).

- Transportations lanes and PPMs (Quota arrangements, procurement priorities and multi-level costs) -

Question:
What are the CTM restrictions regarding the definition of a PPM ?


Answer:
The following points should be considered :

- Activities need to be in a linear sequence.
However CTM supports the feature minimum/maximum duration between the end and the start of an activity. This makes it possible to model minimum or maximum breaks between activities. By default this is only possible for PP/DS PPMs, and the SNP PPMs cannot have any breaks between activities.
The planning parameter ACTREL_MAX allows you to set a global value for breaks between activities when using SNP PPMs. Please check note 512763 for a detailed description of the planning parameter.

- CTM only supports relation type 'end-start relationship'.

- In releases APO 3.0 and 3.1 all input components have to be assigned to the first activity of the PPM and all output components have to be assigned to the last activity.
Components from activities in between the first activity and the last one can be assigned to the first or the last activity by using the planning parameter COMP_REASSIGN. This parameter assigns output components to the end activity and input components to the start activity. Please check note 506700 for a detailed description of the planning parameter.
In release SCM 4.0 an enhancement allows the assignment of input components to any activity in the PPM. Output products/components still need to be assigned to the last activity.

- When setting the capacity consumption (variable/fixed) you have to be aware of the dependencies between consumption/duration and type of PPM.
For SNP PPMs (bucket oriented planning) a variable and a fixed consumption can be used but only in combination with a fixed duration. A variable duration for bucket planning is not possible.
For PP/DS PPMs (time continuous planning) only a fixed consumption can be utilized but this can be combined with a fixed or a variable duration.

--------------------


Question:
Does CTM take into account the transport costs for the selection of the transportation lanes ?

Answer:
CTM does not take into account the transport costs. In case of transportation lanes only the quota arrangements or the procurement priorities are considered. In case both are maintained in the transportation lane, the procurement priority is ignored. The comparison of costs is more in line with the functioning of the optimizer which is targeted at maximizing the profits by using the lowest cost procurement alternative across locations.

--------------------

Question:
Why does CTM not select the transportation lane with the lowest procurement priority ?

Answer:
You have maintained several transportation lanes for the same location. If one quota exists, all other procurement priorities are ignored, even if they are maintained for a different time period.

For instance assume the following situation is modelled :

From 01.xx to 07.xx
Transportation lane 1 -> quota arrangement 30%
Transportation lane 2 -> quota arrangement 70%

From 07.xx to 10.xx
Transportation lane 1 -> procurement priority 1
Transportation lane 2 -> procurement priority 2

In this situation quotas and priorities are mixed. Therefore, only the quotas would be used and the procurement priority would be ignored completely.

Notice that this restriction is only valid per location and it is possible to have quotas at one location and priorities at another location.

--------------------

Question:
Why does CTM not select the PPM with the lowest procurement priority ?

Answer:
PPM procurement priorities are not considered by CTM and only multi-level fixed and variable costs for the plan are used by CTM.

Notice that the procurement priority is available as of release
SCM 4.1.

--------------------

Question:
Why does CTM not choose the PPM with the lowest multi-level cost ?

Answer:
During the process that specifies which PPM will be used for a planning
run, the system checks multi-level costs and quota arrangements.
If both are maintained, the quota will be considered and the multi-level cost will be ignored.

--------------------

Question:
Why does CTM select only one source of supply although you have maintained several supply sources, locations (transportation lanes) and PPMs ?

Answer:
If a specific quota is maintained for only one of the supply sources, no matter which value it is, then everything is taken from this source. Nothing from the other sources will be delivered to fulfill a demand.

--------------------

Question:
Why does the planning run produce a supply distribution different from the one maintained in the quota arrangement?

Answer:
The distribution is limited by other constraints, such as capacity constraints.
For example, you have maintained a quota of 60% and 40%, but after the planning run the supply is distributed as 10% and 90%, typically because capacity restricts one of the sources.

--------------------

Question:
Why does CTM fulfill a demand using the transportation lane with the lowest priority?

Answer:
If the demand is fulfilled late and you use the 'Keep Lateness to a Minimum' procedure (BackwardProjection5), CTM selects the source of supply that is available earliest; priorities, costs and quota arrangements are only considered when several sources of supply have the same availability date.

--------------------

Question:
Why is the PPM ignored during the planning run?

Answer:
In SNP planning, a mode whose duration is less than 1 day is ignored; the engine does not internally round such a duration up to 1 day. If no further modes are available, the engine fails to find any solution.

--------------------

Question:
Why does CTM not plan small quantities?

Answer:
The engine supports the following boundaries; if values fall outside these limits, the PPM cannot be used.

Minimum material consumption    0.001 units
Minimum resource consumption    0.001 units
Minimum duration                1 second
Maximum resource consumption    10^19 units

To satisfy the minimum duration, use a fixed duration of 1 second in the PPM; the minimum duration of the activity is then always at least 1 second. This can be done either manually in the PPM or by using the BAdI /SAPAPO/CTM_SOSINT, as sketched below.
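A minimal sketch of such a BAdI implementation follows. The method and parameter names are illustrative assumptions only, not the real /SAPAPO/CTM_SOSINT interface; check the BAdI definition in transaction SE18 for the actual signature:

method if_ex_ctm_sosint~modify_sources. " hypothetical method name
* Assumption: ct_activities is a changing table of PPM activities
* whose fixed duration is kept in seconds.
  field-symbols <ls_act> like line of ct_activities.
  loop at ct_activities assigning <ls_act>.
    if <ls_act>-fixed_duration < 1.
*     Enforce the 1-second minimum so the engine does not drop the mode.
      <ls_act>-fixed_duration = 1.
    endif.
  endloop.
endmethod.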

- Resources -


Question:
Why does CTM not create enough supply although capacity is set as infinite?

Answer:
CTM handles "true" infinite capacity only as of release SCM 4.1.
In previous releases infinite capacity is simulated by using 2^30 as the maximum capacity, and in some cases this limit can be reached. To minimize the problem you can use the planning parameter 'Precision', which allows larger values during planning at the cost of accuracy. The default value is 1000; to change it, enter 1, 10 or 100 as Value 1 of the planning parameter.
With the value 1, all decimal places of the available resource capacity disappear.

Precision
Parameter name     Precision
Release            APO 3.0A (SP12+)
Parameter used in  CTM engine
Has to be set by   User
Value 1            1, 10, 100 or 1000
Value 2            Not relevant
Description        Precision of capacity / capacity consumption
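A small worked example of the trade-off, assuming the engine stores capacity internally as an integer of capacity x precision, capped at 2^30 (an assumption consistent with the description above, not a confirmed engine detail):

* Illustrative only: the higher the precision, the smaller the
* largest capacity value that can still be represented.
data: lv_cap_max   type p decimals 3,
      lv_precision type i,
      lt_prec      type standard table of i.
append 1 to lt_prec.
append 10 to lt_prec.
append 100 to lt_prec.
append 1000 to lt_prec.
loop at lt_prec into lv_precision.
  lv_cap_max = 1073741824 / lv_precision. " 2^30 / precision
  write: / 'Precision', lv_precision,
           'max representable capacity:', lv_cap_max.
endloop.

Under this assumption, precision 1000 (the default, three decimal places) gives a ceiling of roughly 1.07 million capacity units, while precision 1 (no decimal places) raises it to roughly 1.07 billion units.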

--------------------

Question:
Why do I get an overload of the capacity consumption?

Answer:
1) The overload probably comes from existing firmed planned orders. CTM cannot consume more than the available resource capacity unless you plan infinitely, so it is advisable to check the capacity situation of the resource first (master data check) before running CTM.
2) The overload might result from combining infinite and finite capacity planning: infinite capacity creates orders in parallel, whereas finite capacity creates orders sequentially. As CTM's capacity data correctly reflects this combination of infinite and finite capacity, there is no plan to change the current behaviour.

--------------------

Question:
Why does CTM not take into account the factory calendar for the resource?

Answer:
The length of the time stream of the resource in liveCache no longer corresponds to the planning period; consequently no factory calendar is taken into account. You have to update the resource time stream regularly. You can display the length of the time stream in liveCache via transaction /SAPAPO/REST02 by clicking on the GUID in the "Time stream ID" field.
Note 550330 provides a detailed explanation of the procedure:
'The length of the time stream in the liveCache is defined by the fields LC_DAYS_MINUS and LC_DAYS_PLUS in relation to today (planning parameter TAB in the resource maintenance). As time goes by, the length of the time stream and the available capacity from "today" is thus continuously reduced. To keep the time stream to the relevant length (length set for planning), you should periodically schedule the report /SAPAPO/CRES_CREATE_LC_RES ...'
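A minimal sketch of such a periodic refresh, assuming the report is run with a suitable variant (the variant name below is hypothetical; in practice you would schedule the report as a periodic background job, e.g. via SM36):

report zctm_refresh_timestream.
* 'TS_DAILY' is a hypothetical variant holding the resource selection
* for /SAPAPO/CRES_CREATE_LC_RES, the refresh report named above.
submit /sapapo/cres_create_lc_res
  using selection-set 'TS_DAILY'
  and return.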

--------------------

Question:
Why does CTM not create planned orders for some resources?

Answer:
1) The resource capacity profile might not be fully generated in liveCache for the complete planning horizon. You can use transaction /SAPAPO/REST02 to generate it.
2) The resource might not be updated in liveCache with the correct bucket profile (start and end date/time of each bucket). If you compare resources, there may be a bucket offset; in the PPM, the condition for the minimum and maximum value of the activity relationship then cannot be satisfied because of this offset. You can check it using transaction /SAPAPO/REST02.

--------------------

Question:
Why does CTM not use the remaining resource capacity?

Answer:
Check the resource calendar.
If the calendar days are maintained from 0:00 to 23:59, the calendar contains an interval of 1 second of non-working time each day.
This break prevents the complete consumption of the resource capacity: orders stop one second before the next capacity interval becomes available. Maintain the days from 0:00 to 24:00 instead.
This applies to shipping calendars as well.
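A trivial illustration of the gap, assuming the 23:59 entry is stored as 23:59:59, i.e. one second short of a full day:

* A day has 86,400 seconds; an interval maintained up to 23:59(:59)
* covers only 86,399 of them, leaving a 1-second break.
data: lv_full_day type i value 86400,
      lv_covered  type i,
      lv_gap      type i.
lv_covered = 23 * 3600 + 59 * 60 + 59. " interval end 23:59:59
lv_gap = lv_full_day - lv_covered.
write: / 'Uncovered seconds per day:', lv_gap. " -> 1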

- Rules -


Question:
Why does CTM not create any substitution order to fulfill a demand covered by a rule?

Answer:
Substitution orders are only created if there is some available supply for the substitute products.

--------------------

Question:
Why is the transportation time not taken into account?

Answer:
You have defined a cross-location substitution for a product. With this configuration, the transport duration from location Loc1 to location Loc2 is not taken into account. To include the transport duration, first create a transportation lane for product Prod1 from location Loc1 to location Loc2 and then a product substitution within a single location, as defined below:
Location   Product   Sub. location   Sub. product
Loc2       Prod1     <= Loc2         Prod2

During the planning run CTM considers the transport duration from the transportation lane and then proceeds to the product substitution. Location substitution is not viable because CTM already checks all transportation alternatives by using the existing transportation lanes.

Not recommended - location substitution:

Location   Product   Sub. location   Sub. product
Loc1       Prod1     <= Loc2         Prod2

It is recommended to set up a product substitution scenario instead: products should only substitute products within the same location.

Location   Product   Sub. location   Sub. product
Loc1       Prod1     <= Loc1         Prod2

- Aggregation -


Question:
How do I set up the aggregation functionality?

Answer:
The aggregation is based on the time stream maintained in the "Planning Scope" tab; the degree of aggregation follows the period defined in the time stream. For instance, if days are defined, all demands and supplies are summarized on a daily basis; if weeks are defined, on a weekly basis.
In the customizing (transaction /SAPAPO/CTMCUST, tab "General Customizing") you define at which point of the aggregated period the supplies or demands are requested.
For instance, you can specify that weekly aggregated demands are available at the beginning, the middle or the end of the week.
It is also possible to set the availability date to the date of the first or last order within one aggregation period. Generally it is recommended to set aggregated demands to the beginning of the period and supplies to the end: this makes demands due as early as necessary, so that they can be fulfilled on time, and makes supplies available as late as necessary, to increase the chance that the available resource capacity can cover the requested quantity on time.
To apply the aggregation functionality it is necessary to set the aggregation flag in the "Aggregation" tab and enter the corresponding (ATP) category in the table beneath the flag.
The aggregation is generally only possible for demands and supplies which are shown in the demand/supply simulation.
Aggregation can take place during a safety stock run as of release 5.0.
Planning parameter SRCAT_AGGREGATION triggers aggregation of the safety stock requirement (category 'SR').
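A minimal, hypothetical sketch of the aggregation idea in plain ABAP (outside CTM): single demands are summed per week and the aggregated demand is dated at the beginning of the week, as recommended above. All names and sample values are illustrative:

* Illustrative only - mimics weekly demand aggregation, not CTM code.
types: begin of ty_dem,
         week_start type d,
         qty        type p decimals 3,
       end of ty_dem.
data: ls_dem type ty_dem,
      lt_agg type standard table of ty_dem.
data: begin of ls_raw,
        ddate type d,
        qty   type p decimals 3,
      end of ls_raw,
      lt_raw like standard table of ls_raw.

* Sample single demands (date, quantity).
ls_raw-ddate = '20080826'. ls_raw-qty = '10.000'. append ls_raw to lt_raw.
ls_raw-ddate = '20080828'. ls_raw-qty = '5.000'.  append ls_raw to lt_raw.

loop at lt_raw into ls_raw.
* 1900-01-01 was a Monday; shift each demand date to its week's Monday.
  ls_dem-week_start = ls_raw-ddate - ( ( ls_raw-ddate - '19000101' ) mod 7 ).
  ls_dem-qty = ls_raw-qty.
  collect ls_dem into lt_agg. " sums qty per week_start
endloop.
loop at lt_agg into ls_dem.
  write: / 'Week starting', ls_dem-week_start, 'demand:', ls_dem-qty.
endloop.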

- Active Ingredient -

Question:
Is the active ingredient functionality supported within CTM?

Answer:
No. For this reason, it is not recommended to maintain products with active ingredient settings.

SNC 5.1 installation

SAP does not recommend installing all components (ERP server, XI server, BI server (optional), SCM server, SNC server) on one host. Instead, you can distribute the components among several hosts. The distribution depends on many factors, such as sizing, security, available hardware, and so on.

Netweaver 7.0 Browser support


Related notes:
1062459, 1033669 - Event Management
1025582 - SCM Basis 5.1 add-on
1033723 - SNC 5.1 add-on
853692 - cFolders 4.0