Changing properties and log levels on storefront nodes on CCv2 from HAC

Changing log levels and properties from HAC is a technique developers often use for troubleshooting and support. Unfortunately, on SAP Commerce Cloud (CCv2) this is no longer possible on storefront nodes, because HAC is not available there.

Since HAC cannot be enabled on all CCv2 node types, additional infrastructure must be implemented to make it possible to change log levels and properties on other nodes from the backoffice node. The Spring event system together with the OOTB SAP Commerce ClusterAwareEvent, which allows events to be published across all cluster nodes, can be used for that.

First, we need to define ChangeLogLevelEvent and ChangeConfigValueEvent and create corresponding AbstractEventListener implementations to handle them. In most cases such events should be processed only on certain node types (e.g. only storefront nodes), so an abstract base event is introduced to restrict on which node types the events are published:

package com.blog.event;

import de.hybris.platform.servicelayer.event.ClusterAwareEvent;
import de.hybris.platform.servicelayer.event.PublishEventContext;
import de.hybris.platform.servicelayer.event.events.AbstractEvent;
import org.apache.commons.collections4.CollectionUtils;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public abstract class AbstractAspectAwareEvent extends AbstractEvent implements ClusterAwareEvent {

    private static final Logger LOG = LoggerFactory.getLogger(AbstractAspectAwareEvent.class);

    protected String targetNodeGroups;

    @Override
    public boolean canPublish(PublishEventContext publishEventContext) {
        if (StringUtils.isEmpty(targetNodeGroups) || CollectionUtils.isEmpty(publishEventContext.getTargetNodeGroups())) {
            //broadcast from all to all cluster nodes
            LOG.info("Broadcast from all to all cluster nodes");
            return true;
        }

        boolean canPublish = CollectionUtils.containsAny(publishEventContext.getTargetNodeGroups(), targetNodeGroups.split(","));

        LOG.info("Event canPublish: " + canPublish
                + "\n for nodeGroups: " + targetNodeGroups
                + "\n and targetNodeGroups: " + publishEventContext.getTargetNodeGroups());

        return canPublish;
    }

    public abstract boolean validateRequiredFields();

    public void setTargetNodeGroups(String targetNodeGroups) {
        this.targetNodeGroups = targetNodeGroups;
    }
}
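
The matching rule in canPublish can be illustrated with a small standalone sketch in plain Java, without the hybris and commons-collections dependencies (the class and method names here are illustrative):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

// Standalone sketch of the canPublish matching rule: an empty filter means
// "broadcast to all cluster nodes"; otherwise the event is published only if
// at least one configured group appears among the context's target groups.
public class NodeGroupMatcher {

    public static boolean canPublish(String targetNodeGroups, Collection<String> contextGroups) {
        if (targetNodeGroups == null || targetNodeGroups.isEmpty()
                || contextGroups == null || contextGroups.isEmpty()) {
            // broadcast from all to all cluster nodes
            return true;
        }
        Set<String> configured = new HashSet<>(Arrays.asList(targetNodeGroups.split(",")));
        return contextGroups.stream().anyMatch(configured::contains);
    }

    public static void main(String[] args) {
        System.out.println(canPublish("storefront,backoffice", Arrays.asList("storefront"))); // true
        System.out.println(canPublish("storefront", Arrays.asList("backgroundProcessing"))); // false
        System.out.println(canPublish("", Arrays.asList("storefront"))); // true
    }
}
```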

Now the ChangeLogLevelEvent class can be defined. It is a specific event type that extends the base class AbstractAspectAwareEvent and is designed to handle changes of log levels for specific loggers. The class provides fields for the target logger and the desired log level, and implements a method that validates that these fields are not empty when the event is processed:

package com.blog.event;

import org.apache.commons.lang3.StringUtils;

public class ChangeLogLevelEvent extends AbstractAspectAwareEvent {


    private String targetLogger;
    private String levelName;

    @Override
    public boolean validateRequiredFields() {
        return StringUtils.isNotEmpty(targetLogger) && StringUtils.isNotEmpty(levelName);
    }

    public String getLevelName() {
        return levelName;
    }

    public String getTargetLogger() {
        return targetLogger;
    }

    public void setTargetLogger(String targetLogger) {
        this.targetLogger = targetLogger;
    }

    public void setLevelName(String levelName) {
        this.levelName = levelName;
    }
}

The ChangeLogLevelEventListener class extends AbstractEventListener and is responsible for handling events of type ChangeLogLevelEvent. When such an event is received, it changes the log level of the specified logger by dynamically updating the Log4j 2 logger configuration, allowing logging behavior to be adjusted at runtime. The core of the implementation is taken from HacLog4JFacade#changeLogLevel.

package com.blog.event;

import de.hybris.platform.servicelayer.event.impl.AbstractEventListener;
import de.hybris.platform.util.logging.log4j2.HybrisLoggerContext;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.config.AppenderRef;
import org.apache.logging.log4j.core.config.Configuration;
import org.apache.logging.log4j.core.config.LoggerConfig;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ChangeLogLevelEventListener extends AbstractEventListener<ChangeLogLevelEvent> {

    private static final Logger LOG = LoggerFactory.getLogger(ChangeLogLevelEventListener.class);

    @Override
    protected void onEvent(ChangeLogLevelEvent event) {
        changeLogLevel(event.getTargetLogger(), event.getLevelName());
    }

    private void changeLogLevel(String targetLogger, String levelName) {
        LoggerConfig config = this.getOrCreateLoggerConfigFor(fromPresentationFormat(targetLogger));
        org.apache.logging.log4j.Level level = org.apache.logging.log4j.Level.getLevel(levelName);
        LOG.info("Changing level of " + config + " from " + config.getLevel() + " to " + level);
        config.setLevel(level);
        this.getLoggerContext().updateLoggers();
    }

    private LoggerConfig getOrCreateLoggerConfigFor(String loggerName) {
        Configuration configuration = this.getLoggerContext().getConfiguration();
        LoggerConfig existingConfig = configuration.getLoggerConfig(loggerName);
        if (existingConfig.getName().equals(loggerName)) {
            return existingConfig;
        } else {
            LOG.info("Creating logger " + loggerName);
            LoggerConfig rootLoggerConfig = configuration.getRootLogger();
            LoggerConfig newLoggerConfig = LoggerConfig.createLogger(true, rootLoggerConfig.getLevel(), loggerName,
                    String.valueOf(rootLoggerConfig.isIncludeLocation()),
                    rootLoggerConfig.getAppenderRefs().toArray(new AppenderRef[0]),
                    null, configuration, rootLoggerConfig.getFilter());
            // attach the root logger's appenders to the newly created logger config
            rootLoggerConfig.getAppenders().forEach((k, v) -> newLoggerConfig.addAppender(v, null, null));
            configuration.addLogger(loggerName, newLoggerConfig);
            return newLoggerConfig;
        }
    }

    private HybrisLoggerContext getLoggerContext() {
        return (HybrisLoggerContext) LogManager.getContext(false);
    }

    private static String fromPresentationFormat(String name) {
        return "root".equals(name) ? "" : name;
    }

}
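
For the platform event system to dispatch the events, the listeners must also be registered as Spring beans in the extension's *-spring.xml. A minimal registration could look like this (the bean ids are illustrative; `abstractEventListener` is the standard hybris parent bean for event listeners):

```xml
<bean id="changeLogLevelEventListener"
      class="com.blog.event.ChangeLogLevelEventListener"
      parent="abstractEventListener"/>

<bean id="changeConfigValueEventListener"
      class="com.blog.event.ChangeConfigValueEventListener"
      parent="abstractEventListener"/>
```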

ChangeConfigValueEvent is implemented in the same way. This class is another specific event type that extends the base class AbstractAspectAwareEvent; it represents a request to change the configuration value for a given key. The class provides fields for the key and the value to be updated, and implements a method that validates that these fields are not empty when the event is processed:

package com.blog.event;

import org.apache.commons.lang3.StringUtils;

public class ChangeConfigValueEvent extends AbstractAspectAwareEvent {

    private String key;
    private String value;

    @Override
    public boolean validateRequiredFields() {
        return StringUtils.isNotEmpty(key) && StringUtils.isNotEmpty(value);
    }

    public void setKey(String key) {
        this.key = key;
    }

    public void setValue(String value) {
        this.value = value;
    }

    public String getKey() {
        return key;
    }

    public String getValue() {
        return value;
    }
}

The ChangeConfigValueEventListener class is an event listener that handles events of type ChangeConfigValueEvent. When such an event is received, it updates or creates a configuration property based on the key-value pair provided in the event, using the hybris ConfigurationService to manage the configuration:

package com.blog.event;

import de.hybris.platform.servicelayer.config.ConfigurationService;
import de.hybris.platform.servicelayer.event.impl.AbstractEventListener;
import org.apache.commons.configuration.Configuration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.annotation.Resource;

public class ChangeConfigValueEventListener extends AbstractEventListener<ChangeConfigValueEvent> {

    private static final Logger LOG = LoggerFactory.getLogger(ChangeConfigValueEventListener.class);

    @Resource
    private ConfigurationService configurationService;

    @Override
    protected void onEvent(ChangeConfigValueEvent event) {
        configUpdateCreate(event.getKey(), event.getValue());
    }

    protected void configUpdateCreate(String key, String val) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("updating config key: " + key + " / value: " + val);
        }
        Configuration configuration = configurationService.getConfiguration();

        if (configuration.getProperty(key) != null) {
            // key already exists: replace its value
            configuration.setProperty(key, val);
        } else {
            // key does not exist yet: create it
            configuration.addProperty(key, val);
        }
    }
}

This alone is already enough to fulfill the initial requirement: a simple Groovy script run from HAC can publish the events, for example:

import com.blog.event.ChangeLogLevelEvent
import de.hybris.platform.servicelayer.event.EventService

EventService eventService = spring.getBean("eventService")


ChangeLogLevelEvent event = new ChangeLogLevelEvent()
event.setTargetLogger("org.springframework")
event.setLevelName("debug")
eventService.publishEvent(event)

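In the same way, a ChangeConfigValueEvent can be published; the optional targetNodeGroups filter restricts which node groups process the event (the property key and group name below are only illustrative):

```groovy
import com.blog.event.ChangeConfigValueEvent
import de.hybris.platform.servicelayer.event.EventService

EventService eventService = spring.getBean("eventService")

ChangeConfigValueEvent event = new ChangeConfigValueEvent()
event.setKey("some.property.key")        // illustrative property key
event.setValue("newValue")
event.setTargetNodeGroups("storefront")  // illustrative node group
eventService.publishEvent(event)
```
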
Using a Groovy script is not always convenient, so it may be more reasonable to build a UI in HAC or Backoffice. For HAC, the SAP Commerce Help instructions on creating a new HAC tab with additional functionality can be followed.

Below is a Spring MVC controller that could back the HAC UI. It handles HTTP requests mapped to the "/clusteraware" path with two main methods:

  • getEvents handles GET requests to "/clusteraware/events", adds the list of cluster events obtained from hacAspectAwareEventsFacade to the model for the JSP, and returns the view name "clusterAwareEvents" for rendering.
  • publishEvent handles POST requests to "/clusteraware/publish-event" with JSON content. It publishes an event via hacAspectAwareEventsFacade and returns a JSON response indicating whether the event was published successfully.
package de.hybris.platform.hac.controller;

import com.blog.facades.reflection.ClassData;
import com.blog.facade.HacAspectAwareEventsFacade;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

import javax.annotation.Resource;
import java.util.HashMap;
import java.util.Map;

@Controller
@RequestMapping({"/clusteraware"})
public class HacClusterAwareEventController {

    @Resource
    private HacAspectAwareEventsFacade hacAspectAwareEventsFacade;

    @RequestMapping({"/events"})
    public String getEvents(Model model) {
        model.addAttribute("clusterEventDataList", hacAspectAwareEventsFacade.getEvents());
        return "clusterAwareEvents";
    }

    @RequestMapping(
            value = {"/publish-event"},
            method = {RequestMethod.POST},
            headers = {"Accept=application/json"}
    )
    @ResponseBody
    public Map<String, Object> publishEvent(@RequestBody ClassData eventClassData) {
        Map<String, Object> result = new HashMap<>();
        if (hacAspectAwareEventsFacade.publishEvent(eventClassData)) {
            result.put("success", true);
            return result;
        }

        result.put("success", false);
        result.put("errorMessage", "An error happened. Event hasn't been published. See the logs for more details.");

        return result;
    }
}

And the regular bean definitions for the DTOs used by the controller:

<bean class="com.blog.facades.reflection.ClassData">
    <property name="name" type="String"/>
    <property name="simpleName" type="String"/>
    <property name="fields" type="java.util.List&lt;com.blog.facades.reflection.FieldData>"/>
</bean>

<bean class="com.blog.facades.reflection.FieldData">
    <property name="name" type="String"/>
    <property name="value" type="String"/>
</bean>

The controller uses DefaultHacAspectAwareEventsFacade, which implements the HacAspectAwareEventsFacade interface. It retrieves the classes extending AbstractAspectAwareEvent via reflection and converts them into a list of ClassData objects. The publishEvent method converts a ClassData object back into an instance of the corresponding AbstractAspectAwareEvent subclass using method handles and publishes the event through the EventService.

package com.blog.facade.impl;

import com.blog.event.AbstractAspectAwareEvent;
import com.loopintegration.core.util.MethodHandleUtils;
import com.blog.facades.reflection.ClassData;
import com.blog.facades.reflection.FieldData;
import com.blog.facade.HacAspectAwareEventsFacade;
import de.hybris.platform.servicelayer.event.EventService;
import org.apache.el.util.ReflectionUtil;
import org.reflections.Reflections;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.annotation.Resource;
import java.lang.reflect.Constructor;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public class DefaultHacAspectAwareEventsFacade implements HacAspectAwareEventsFacade {

    private static final Logger LOG = LoggerFactory.getLogger(DefaultHacAspectAwareEventsFacade.class);

    @Resource
    private EventService eventService;

    @Override
    public List<ClassData> getEvents() {

        Reflections reflections = new Reflections("com.blog");
        Set<Class<? extends AbstractAspectAwareEvent>> eventClasses = reflections.getSubTypesOf(AbstractAspectAwareEvent.class);

        List<ClassData> clusterEventDataList = new ArrayList<>();

        for (Class<? extends AbstractAspectAwareEvent> eventClass : eventClasses) {
            ClassData classData = convertToClassData(eventClass);
            List<FieldData> fieldDataList = convertToFieldDataList(eventClass.getDeclaredFields());
            classData.setFields(fieldDataList);
            clusterEventDataList.add(classData);
        }
        return clusterEventDataList;
    }

    private static ClassData convertToClassData(Class<? extends AbstractAspectAwareEvent> eventClass) {
        ClassData classData = new ClassData();
        classData.setName(eventClass.getName());
        classData.setSimpleName(eventClass.getSimpleName());
        return classData;
    }

    @Override
    public boolean publishEvent(ClassData classData) {

        AbstractAspectAwareEvent event = convertToEvent(classData);
        if (event == null || !event.validateRequiredFields()) {
            return false;
        }

        eventService.publishEvent(event);
        return true;
    }


    private AbstractAspectAwareEvent convertToEvent(ClassData classData) {
        try {
            Constructor<?> constructor = ReflectionUtil.forName(classData.getName()).getConstructor();
            AbstractAspectAwareEvent event = (AbstractAspectAwareEvent) constructor.newInstance();
            for (FieldData field : classData.getFields()) {
                MethodHandleUtils.invokeSetterMethod(event, field.getName(), field.getValue());
            }
            return event;
        } catch (NoSuchMethodException e) {
            LOG.error("Event should have constructor without arguments!", e);
        } catch (ClassNotFoundException | InvocationTargetException | InstantiationException |
                 IllegalAccessException e) {
            LOG.error("An error happened during event creation!", e);
        }

        return null;
    }

    private List<FieldData> convertToFieldDataList(Field[] declaredFields) {
        List<FieldData> fieldDataList = new ArrayList<>();
        for (Field field : declaredFields) {
            fieldDataList.add(createFieldData(field.getName()));
        }
        return fieldDataList;
    }

    private FieldData createFieldData(String name) {
        FieldData fieldData = new FieldData();
        fieldData.setName(name);
        return fieldData;
    }
}
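
The facade depends on a small MethodHandleUtils helper that is not part of the platform and has to be implemented separately. A minimal sketch of invokeSetterMethod for String-typed fields could look like this (DemoEvent is a hypothetical bean used only for demonstration; a real implementation would also need type conversion and error handling):

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

// Minimal sketch of the MethodHandleUtils helper used by the facade.
// Assumes every settable field is a String with a conventional setter name.
public final class MethodHandleUtils {

    private MethodHandleUtils() {
    }

    public static void invokeSetterMethod(Object target, String fieldName, String value) throws Throwable {
        // e.g. fieldName "levelName" -> setter name "setLevelName"
        String setterName = "set" + Character.toUpperCase(fieldName.charAt(0)) + fieldName.substring(1);
        MethodType setterType = MethodType.methodType(void.class, String.class);
        MethodHandle setter = MethodHandles.lookup().findVirtual(target.getClass(), setterName, setterType);
        setter.invoke(target, value);
    }
}

// Hypothetical event-like bean, only to demonstrate the helper.
class DemoEvent {
    private String levelName;

    public void setLevelName(String levelName) {
        this.levelName = levelName;
    }

    public String getLevelName() {
        return levelName;
    }
}
```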

The implementation mainly relies on the Reflections library, which is bundled with the platform, but a few utility methods for converting between data structures and for event creation and validation still have to be implemented. JSP and JS files must also be written for the desired UI to complete the HAC integration.
