SAP HANA FATAL_OUT_OF_MEMORY


While working with SAP HANA, you may have come across memory errors such as SAP HANA FATAL_OUT_OF_MEMORY. Let's look at this error in more detail in this post.

SAP HANA FATAL_OUT_OF_MEMORY Error

In SAP HANA, the FATAL_OUT_OF_MEMORY error indicates an OOM (Out Of Memory) situation. You can find further information in the trace files that the system creates for such events, which follow these naming conventions:

<service>_<host>.<port>.rtedump.<timestamp>.oom.trc
<service>_<host>.<port>.rtedump.<timestamp>.oom_memory_release.trc
<service>_<host>.<port>.rtedump.<timestamp>.compositelimit_oom.trc
<service>_<host>.<port>.rtedump.<timestamp>.after_oom_cleanup.trc
<service>_<host>.<port>.emergencydump.<timestamp>.trc (if memory-related errors like “allocation failed” are responsible)
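If you do not want to search the trace directory at OS level, a query against the standard M_TRACEFILES monitoring view can list the relevant dumps. The following is only a sketch; the LIKE patterns simply mirror the naming conventions above:

-- List recent OOM-related trace files, newest first
SELECT HOST, FILE_NAME, FILE_SIZE, FILE_MTIME
FROM M_TRACEFILES
WHERE FILE_NAME LIKE '%oom%.trc'
   OR FILE_NAME LIKE '%emergencydump%.trc'
ORDER BY FILE_MTIME DESC;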

Depending on the error reported in these files, you may need to tune parameters such as the global allocation limit or the SQL statement memory limit, to name a few.

global_allocation_limit : This parameter limits the overall DRAM memory consumption of SAP HANA. It also covers DRAM memory allocated in the context of the fast restart option as of SAP HANA >= 2.0 SPS 05 (SAP Note 2700084). If "%" is specified at the end of the parameter value (without a preceding blank), the value is interpreted as a percentage of RAM; otherwise, it is interpreted as MB.
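As a sketch (the value 512000 MB is only a placeholder, not a recommendation), the parameter sits in the memorymanager section of global.ini and can be changed from the SQL console:

ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('memorymanager', 'global_allocation_limit') = '512000'
  WITH RECONFIGURE;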

allocationlimit : This parameter limits the memory consumption of the related SAP HANA process. If "%" is specified at the end of the parameter value (without a preceding blank), the value is interpreted as a percentage of RAM; otherwise, it is interpreted as MB. By default, the allocation limit of SAP HANA services is identical to the global allocation limit.
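A service-specific sketch, here assuming the indexserver should be capped at 90 % of RAM (the file indexserver.ini and the value are examples only):

ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('memorymanager', 'allocationlimit') = '90%'
  WITH RECONFIGURE;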

statement_memory_limit : Defines the maximum memory allocation per statement in GB. The default value is 0 (no limit).
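A sketch for capping single statements at, for example, 50 GB; this assumes that statement memory tracking is switched on as well, otherwise the limit is not enforced:

ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('resource_tracking', 'enable_tracking') = 'on',
      ('resource_tracking', 'memory_tracking') = 'on',
      ('memorymanager', 'statement_memory_limit') = '50'
  WITH RECONFIGURE;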



Basic Guidelines around Memory in HANA

There are some general rules of thumb that can help you judge whether the memory of an existing system is properly sized, e.g.:

  • Memory size should optimally be at least two times the total size of the row store and column store.
  • The memory used by SAP HANA should be significantly below the SAP HANA allocation limit (exception: large caches that can be shrunk automatically on demand).

All these rules are only rough guidelines and there can always be exceptions. For example, some large S/4HANA systems can work absolutely fine even if 65 % of the memory is populated with table data.
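As a rough sketch, you can check the current situation against these rules of thumb with the standard monitoring views (column names as in current SAP HANA 2.0 revisions; how you interpret the numbers is up to you):

-- Allocation limit vs. memory actually used, per service
SELECT HOST, SERVICE_NAME,
       ROUND(EFFECTIVE_ALLOCATION_LIMIT / 1024 / 1024 / 1024, 2) AS ALLOC_LIMIT_GB,
       ROUND(TOTAL_MEMORY_USED_SIZE / 1024 / 1024 / 1024, 2) AS USED_GB
FROM M_SERVICE_MEMORY;

-- Total row store size
SELECT ROUND(SUM(ALLOCATED_SIZE) / 1024 / 1024 / 1024, 2) AS ROW_STORE_GB
FROM M_RS_MEMORY;

-- Total column store size (currently loaded tables)
SELECT ROUND(SUM(MEMORY_SIZE_IN_TOTAL) / 1024 / 1024 / 1024, 2) AS COLUMN_STORE_GB
FROM M_CS_TABLES;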

Reference

SAP Note 1999997

