
User Guide

Dynamic Scripting Limits

In FortiSIEM, Elasticsearch queries are run as dynamic scripts in order to support multi-field aggregation of query results. Elasticsearch enforces two related limits on dynamic scripting:

  1. Rate at which scripted queries can be compiled (cluster level limit)
  2. Cache for the compiled scripts (node level limit)

If the compilation rate limit is reached, Elasticsearch throws a circuit_breaking_exception error and the query does not run, as in the following FortiSIEM log. If the cache limit is reached, the script must be recompiled, so the query may run slowly the first time.

[PH_JAVA_QUERYSERVER_ERROR]:[eventSeverity]=PHL_ERROR,[phEventCategory]=3,[methodName]=innerFromXContent,[className]=org.elasticsearch.ElasticsearchException,[procName]=javaQueryServer,[lineNumber]=509,[errReason]=Elasticsearch exception [type=circuit_breaking_exception, reason=[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.aggs.max_compilations_rate] setting],[phLogDetail]=Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]
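The limit in the error above is expressed as a rate string, count/window: [75/5m] means at most 75 compilations per 5 minutes. As an illustrative sketch (not part of FortiSIEM), the format can be decoded like this; the helper name and the supported time units are assumptions for the example:

```python
import re
from datetime import timedelta

# Time-unit suffixes commonly seen in Elasticsearch rate strings
# (illustrative subset, not an exhaustive list).
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours"}

def parse_compilation_rate(rate: str):
    """Parse a rate string such as "75/5m" into (count, window)."""
    count, window = rate.split("/")
    match = re.fullmatch(r"(\d+)([smh])", window)
    if not match:
        raise ValueError(f"unsupported window: {window!r}")
    value, unit = match.groups()
    return int(count), timedelta(**{_UNITS[unit]: int(value)})

# The limit from the log above: at most 75 compilations per 5 minutes.
print(parse_compilation_rate("75/5m"))  # (75, datetime.timedelta(seconds=300))
```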

You can view the current script cache statistics in your environment by running the following API call.

GET /_nodes/stats?filter_path=nodes.*.script_cache.contexts
{
  "nodes": {
    "kISLOIv_QvGbFNnpDtLdyw": {
      "script_cache": {
        "contexts": [
          {
            "context": "aggregation_selector",
            "compilations": 0,
            "cache_evictions": 0,
            "compilation_limit_triggered": 0
          },
          {
            "context": "aggs",
            "compilations": 1795,
            "cache_evictions": 1695,
            "compilation_limit_triggered": 249
          },
    ...
} 
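In that output, a nonzero compilation_limit_triggered count shows that a context has hit the rate limit. A minimal sketch of checking this programmatically, assuming the response shape shown above (the node ID and numbers are taken from the example, and the helper name is hypothetical):

```python
import json

# Sample response shaped like the /_nodes/stats output above (abridged).
stats_json = """
{
  "nodes": {
    "kISLOIv_QvGbFNnpDtLdyw": {
      "script_cache": {
        "contexts": [
          {"context": "aggregation_selector", "compilations": 0,
           "cache_evictions": 0, "compilation_limit_triggered": 0},
          {"context": "aggs", "compilations": 1795,
           "cache_evictions": 1695, "compilation_limit_triggered": 249}
        ]
      }
    }
  }
}
"""

def contexts_hitting_limit(stats: dict):
    """Return (node_id, context) pairs whose compilation limit was triggered."""
    hits = []
    for node_id, node in stats["nodes"].items():
        for ctx in node["script_cache"]["contexts"]:
            if ctx["compilation_limit_triggered"] > 0:
                hits.append((node_id, ctx["context"]))
    return hits

print(contexts_hitting_limit(json.loads(stats_json)))
# [('kISLOIv_QvGbFNnpDtLdyw', 'aggs')]
```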

You can change the compilation rate limit by calling the following API. The setting name depends on your Elasticsearch version.

PUT _cluster/settings (For ES 7.9 and above)
{
    "persistent": {
        "script.context.aggs.max_compilations_rate": "150/5m"
    }
}

PUT _cluster/settings (For ES below 7.9)
{
    "persistent": {
        "script.max_compilations_rate": "150/5m"
    }
}
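Since the setting key differs by version, a small helper can build the correct request body. This is an illustrative sketch only (the function name and version-tuple convention are assumptions); the two setting keys come from the API calls above:

```python
import json

def compilation_rate_body(es_version: tuple, rate: str = "150/5m") -> str:
    """Build the _cluster/settings request body for the given ES version.

    ES 7.9 and above use the per-context key; older versions use the
    cluster-wide key.
    """
    if es_version >= (7, 9):
        key = "script.context.aggs.max_compilations_rate"
    else:
        key = "script.max_compilations_rate"
    return json.dumps({"persistent": {key: rate}}, indent=4)

# Body for an ES 7.10 cluster, ready to PUT to _cluster/settings.
print(compilation_rate_body((7, 10)))
```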

You can set script.context.$CONTEXT.cache_max_size in the elasticsearch.yml configuration file. For example, to set the max size for the aggs context, you would add the following to elasticsearch.yml.

script.context.aggs.cache_max_size: 300

This is a node-level setting; changing it requires restarting the node.

References

  1. https://www.elastic.co/guide/en/elasticsearch/reference/7.9/modules-scripting-using.html#prefer-params
  2. https://alexmarquardt.com/2020/10/21/elasticsearch-too-many-script-compilations/