
Between unit and integration testing


Everyone knows that it is essential to implement automated software tests that run on every build, commit, or on a continuous integration system to ensure the quality of your software and to find bugs before a user experiences them. The most common way to do this is to implement unit tests that verify the functionality of very small-grained code units (i.e. classes). Depending on your development process, you may even implement these test cases based on your requirements before you implement the software itself. But in some environments and for some software components it is very time-consuming to implement a unit test for every component, especially if the unit has a lot of dependencies that have to be mocked using a mocking framework like Mockito or EasyMock. To write a unit test for a plugin's Manager class that uses other Manager classes from the Jive core, you have to mock all of these dependencies. Some components like DataAccessObjects (DAOs) are even harder to test because you have to find a way to mock an RDBMS.

To test your software "in action" (i.e. against a running RDBMS), you can of course implement automated integration tests with frameworks like Selenium that run against the frontend of your application. However, one drawback of this approach is that you have to rewrite tests very often with every change of the UI. If you have only little control over the system's UI (as you have when developing plugins for Jive), you might have to rewrite the whole UI, and all the tests with it, with every new major version.

These drawbacks made us look for another way to test our plugins that is not too time-consuming, is not as affected by Jive UI changes, but can still test the plugin's functionality in its interaction with the Jive core components. This blog post describes the solution we found, which uses a testing framework that is implemented by and delivered with Jive.


Functional testing with JiveCommunityTest


The core base class for implementing automated functional tests for Jive plugins is the class com.jivesoftware.util.JiveCommunityTest located in com.jivesoftware:core-test-base. The JavaDoc for this class reads as follows:

This class is an extension of jUnit's TestCase. It is designed to be used as a functional test harness. A functional test is a test which needs various system resources set up -- a database connection, config files, etc. By simply extending this class you can call any Jive APIs and be assured there is a clean database underneath the hood.


If you use this class as the base class for your custom JUnit test, it will start up (@BeforeClass) and tear down (@AfterClass) an environment that is very close to the real Jive application:

  • Create the Spring application context
  • Create an in-memory database (HSQL) or set up the schema on an existing database like Oracle, PostgreSQL, ... (see com.jivesoftware.util.bellerophon.TestDatabaseProvider for details)
  • Set up the Jive home directory


Extending the Spring application context


So imagine having a class CustomManager in your plugin that depends on Jive's UserManager, GroupManager, TagManager, ... and now you want to create an automated test class for it.

To implement your CustomManagerTest, you just have to create a test class that is derived from JiveCommunityTest:


@Configurable(autowire = Autowire.BY_NAME)
public class CustomManagerTest extends com.jivesoftware.util.JiveCommunityTest {

    private CustomManager customManager;

    public void setCustomManager(CustomManager customManager) {
        this.customManager = customManager;
    }

    @Test
    public void perform() throws Exception {
        // exercise customManager against the running Jive environment
    }
}


By default, the base class starts up the Spring application context that is defined in classpath:spring-testApplicationContext.xml. As you need an extended context here (containing the configuration for the plugin's beans) you have to create a new file spring-customTestContext.xml that includes the plugin's spring.xml and Jive's spring-testApplicationContext.xml:


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="" [...] >
    <import resource="classpath:spring-testApplicationContext.xml" />
    <import resource="classpath:spring.xml" />
</beans>


You also have to ensure that the classpath is correctly set up when running the tests.


Extending the database schema


Jive offers plugins the possibility to extend the database schema. To achieve this, you just have to describe your tables in XML in a file called schema.xml. When running a test based on JiveCommunityTest, this additional schema is not automatically created by the test class. To prepare the test database for your plugin test, you can use the class com.jivesoftware.util.bellerophon.TestDatabaseProvider to obtain the database configuration and create the required schema.


@BeforeClass
public static void setupSchema() throws Exception {
    final DatabaseProperties props = TestDatabaseProvider.getDatabaseProperties(DatabaseTypes.HSQL);
    TestDatabaseProvider.createSchema(props, CustomManagerTest.class.getClassLoader().getResourceAsStream("schema.xml"));
}


By using the @BeforeClass annotation, the custom schema is created after the base class' setup is performed but before any of the tests contained in this class is executed.

It is also possible to call the createSchema() method again to prepare the database with additional test data that is defined in another XML file.


Creating places, content, users with FunctionalTestUtil


Many test cases depend on existing users, groups, permissions, places, or content in a Jive application that are not present when a clean database is set up for every test run. You can of course, as described in the section above, define another schema.xml file that creates the required Jive objects, but that requires deep knowledge of what is implemented in the Jive core.

The better solution is to use the class com.jivesoftware.util.FunctionalTestUtil, which provides lots of helper methods to create and link these core Jive objects, for instance:


User user = FunctionalTestUtil.createTestUser("john.doe", "***", "");
SocialGroup socialGroup = FunctionalTestUtil.createTestSocialGroup("John's private group", user, DefaultSocialGroupType.PRIVATE);


Running multiple test classes in the same context


The start-up of the environment for one test class obviously takes anywhere from a few seconds to a few minutes. To keep tests maintainable, a developer will prefer to implement test cases grouped by component in multiple test classes. The more test classes you derive from JiveCommunityTest, the longer the test execution for your plugin will take. Fortunately, JiveCommunityTest offers the possibility to create a shared environment that is used by multiple test classes. To enable this mode, you just have to run your tests with an additional system property: -Djive.test.oneContext=true
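Assuming the plugin is built with Maven and the tests are executed by the Surefire plugin (a typical but here assumed setup), the property can simply be passed on the command line:

```shell
# Run all plugin tests in one shared Jive test environment;
# jive.test.oneContext is the switch described above, the Maven setup is assumed
mvn test -Djive.test.oneContext=true
```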




Implementing automated functional test cases with the described framework from Jive is a great way to test your components against the Jive core components. As you hardly have to spend any time mocking your component's dependencies, you can focus on the actual business logic of the test case. Another advantage is the possibility to run your tests against a real RDBMS and to use a defined, clean schema and dataset for every test run.



In this document I want to demonstrate a way to add customizations to Spring beans without overlaying/overriding the bean definition. This approach follows Kevin Conaway's ideas of modifying or adding functionality in an unobtrusive and repeatable (i.e. "multiple plugins or extensions can apply the same pattern to the same page without overriding or canceling one another", Freemarker Mixins) manner. These ideas and applicable solutions are described in these documents:

  • Freemarker Mixins: This article is focused on applying an unobtrusive and repeatable customization style on Struts action mappings and views (i.e. freemarker templates).
  • How To: Add Struts Interceptors at Runtime: This document describes how to add a custom Struts interceptor at runtime without extending the respective Action class in Jive or modifying (i.e. overlaying or overriding) the Struts action mapping.
  • Modifying Prototype Spring Beans: While the first two documents are about customizing the Struts Actions and UI templates, this article describes how to modify Spring beans with prototype scope (that means no singletons) without overriding the bean definition in a spring-*.xml file in a plugin.

After reading these three articles I wanted to find a way to customize some core Jive functionality without extending the Jive core classes or overriding the bean definitions. My first thought was to apply such customizations in an aspect-oriented (AOP) style, as Spring offers a lot of possibilities here. I spent a few minutes trying to apply some advices AspectJ-style via annotations and via namespace/schema configuration, but neither worked: with the annotation configuration my IDE (Eclipse with AJDT) showed compilation errors, presumably because the compile-time weaver has to weave classes outside of the project; I don't know why the schema-based configuration did not work (perhaps I will investigate this some time). So I switched to the "auto-proxying" approach, which did finally work, but I had to work around a few issues that came up with Jive (using 5.0.3).


Basics on Aspect-oriented Programming and Proxies


Most experienced Java (and especially Spring) developers will probably know the basics about aspect-oriented programming and proxying Java objects, but here is a brief introduction:

By most definitions out there, aspect-oriented programming is a programming paradigm that offers a modular approach to implementing cross-cutting concerns in object-oriented applications. Usually, a cross-cutting concern is some technical or infrastructural functionality that has to be applied to a lot of components in the application, like logging or transaction management. In terms of aspect-oriented programming, the cross-cutting concern is the Aspect. The aspect is implemented in an Advice, which is often an Interceptor. An advice is applied at a Join Point, which is the point in the application where the aspect should be executed. Most AOP frameworks offer the concept of Pointcuts, which are expressions that define join points. Although I do not really want to implement a concern that is cross-cutting (I want to apply additional logic to specific beans), the general requirements and conditions for implementing plugins in Jive make AOP a nice way to implement additional functionality in a modular, unobtrusive (depending on the implementation), and repeatable way.


In Spring, aspect-oriented programming is implemented through Proxies. A Proxy object is an object that encapsulates or wraps the target object (i.e. the actual bean object) and offers the same programming interface as the target class. By applying a proxy to a target object and using the proxy object as the bean instance, all method calls to the target object (the join points of the bean) can be intercepted, which means that you can apply code or logic (the aspect) before and after the actual method call happens. This additional code is implemented in a MethodInterceptor (the advice). Spring offers two ways of creating proxy objects:

  • JDK Dynamic Proxy API: included in Java, enables you to create proxies that expose defined interfaces. As these have to be Java interfaces, you can only use this mechanism if you consistently develop against interfaces.
  • CGLib: a third-party library (included in Jive) that enables you to create proxies for Java classes. So you can use this mechanism if you develop against classes instead of interfaces (of course you can still develop against interfaces, too).
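To make the proxy idea concrete outside of Spring, here is a minimal, self-contained sketch using only the JDK Dynamic Proxy API (all names in it are invented for the example): the InvocationHandler plays the role of the advice and runs around every call to the Greeter interface.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class ProxyDemo {

    // The programming interface the proxy will expose (a JDK proxy always needs interfaces)
    interface Greeter {
        String greet(String name);
    }

    // The target object (i.e. the actual "bean")
    static class SimpleGreeter implements Greeter {
        public String greet(String name) {
            return "Hello, " + name;
        }
    }

    // Wraps the target in a proxy; the handler intercepts every method call (the "advice")
    static Greeter proxy(final Greeter target) {
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                        // code before the actual call could go here
                        Object result = method.invoke(target, args); // proceed to the target
                        // code after the actual call: here we decorate the result
                        return result + "!";
                    }
                });
    }

    public static void main(String[] args) {
        System.out.println(proxy(new SimpleGreeter()).greet("world")); // prints "Hello, world!"
    }
}
```

CGLib works analogously but generates a subclass of the target class at runtime, which is why it does not require interfaces.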


Implementing the MethodInterceptor


First, you have to implement the aspect as an advice, which in this case is a MethodInterceptor. The MethodInterceptor is applied to all method calls (i.e. all join points of the bean), so in most use cases you will probably have to perform additional checks (like inspecting the method name/signature of the MethodInvocation) before executing your aspect's code. In my example use case, I want to perform additional checks after a user is authenticated by the "daoAuthenticationProvider" bean.


public class AuthenticationProviderInterceptor implements MethodInterceptor {

    public Object invoke(MethodInvocation invocation) throws Throwable {
        final Object result = invocation.proceed();
        if ("authenticate".equals(invocation.getMethod().getName())) {
            // perform additional checks
        }
        return result;
    }
}


Bean definition:

<bean id="mfAuthenticationProviderInterceptor" class="" />


Proxying with BeanPostProcessors


A common way in Spring to define proxy beans is to use a ProxyFactoryBean, which enables you to create proxies for a target object with defined interfaces to expose and interceptors to apply. If you want to use a proxy instance of this FactoryBean, you would have to use the ProxyFactoryBean's name to refer to it. But as we cannot (or do not want to) change all bean definitions that use or reference the target bean, a different approach is required.

Luckily, Spring offers the concept of BeanPostProcessors. A BeanPostProcessor is a Spring construct to hook into the creation process of bean instances that allows you to modify and even replace instances before they are injected into other beans. If a BeanPostProcessor is defined as a Spring bean and the application uses an ApplicationContext, the processors are registered automatically.

For applying proxies to beans, Spring even offers an implementation of a BeanPostProcessor: BeanNameAutoProxyCreator. Configured with bean names for target objects (property "beanNames") and interceptors (property "interceptorNames"), this class automatically creates proxies according to the supplied configuration. One important setting is the "proxyTargetClass" boolean property (default: false). If it is set to true, Spring will use the CGLib mechanism to create proxies.
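As a toy illustration of the BeanPostProcessor idea (this is not Spring's actual API; all names here are invented), a container can let each registered post-processor replace a freshly created instance, which is exactly the hook an auto-proxy creator uses to swap the target for a proxy:

```java
import java.util.ArrayList;
import java.util.List;

public class PostProcessorDemo {

    // Invented stand-in for Spring's BeanPostProcessor contract
    interface PostProcessor {
        Object postProcess(Object bean, String beanName);
    }

    // Invented stand-in for a bean factory: it runs every post-processor
    // over a newly created bean and keeps whatever object comes back
    static class TinyFactory {
        private final List<PostProcessor> processors = new ArrayList<PostProcessor>();

        void addPostProcessor(PostProcessor processor) {
            processors.add(processor);
        }

        Object createBean(String beanName, Object instance) {
            Object bean = instance;
            for (PostProcessor processor : processors) {
                bean = processor.postProcess(bean, beanName); // may return a wrapper/proxy
            }
            return bean;
        }
    }

    // A post-processor that replaces one specific bean with a wrapper,
    // analogous to an auto-proxy creator replacing a target with a proxy
    static Object createWrappedBean() {
        TinyFactory factory = new TinyFactory();
        factory.addPostProcessor(new PostProcessor() {
            public Object postProcess(Object bean, String beanName) {
                return "authProvider".equals(beanName) ? "proxy(" + bean + ")" : bean;
            }
        });
        return factory.createBean("authProvider", "target");
    }

    public static void main(String[] args) {
        System.out.println(createWrappedBean()); // prints "proxy(target)"
    }
}
```

Other beans that reference "authProvider" receive the wrapped object without their definitions changing, which is the property we need here.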


Unfortunately, I got some errors in Jive when I used the BeanNameAutoProxyCreator (caused by a call to beanFactory.getAliases(beanName)). I did not look deeper into this issue; instead I created a similar class that only checks for the bean name and skips the check for aliases:


public class BeanNameOnlyAutoProxyCreator extends AbstractAutoProxyCreator {

    private static final Logger LOG = Logger.getLogger(BeanNameOnlyAutoProxyCreator.class);

    private List<String> beanNames;

    public void setBeanNames(String[] beanNames) {
        this.beanNames = new ArrayList<String>(beanNames.length);
        for (String mappedName : beanNames) {
            this.beanNames.add(StringUtils.trimWhitespace(mappedName));
        }
    }

    protected Object[] getAdvicesAndAdvisorsForBean(Class<?> beanClass, String beanName, TargetSource targetSource) {
        for (String matchedName : beanNames) {
            if (isMatch(beanName, matchedName)) {
                LOG.info("Proxying bean " + beanName);
                return PROXY_WITHOUT_ADDITIONAL_INTERCEPTORS;
            }
        }
        return DO_NOT_PROXY;
    }

    protected boolean isMatch(String beanName, String mappedName) {
        return PatternMatchUtils.simpleMatch(mappedName, beanName);
    }
}


Bean definition:

<bean id="mfAuthProviderProxyCreator" class="">
    <property name="beanNames">
        <list>
            <value>daoAuthenticationProvider</value>
        </list>
    </property>
    <property name="interceptorNames">
        <list>
            <value>mfAuthenticationProviderInterceptor</value>
        </list>
    </property>
</bean>
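The isMatch method shown above delegates to Spring's PatternMatchUtils.simpleMatch, so the configured beanNames may contain simple wildcards. As a rough, self-contained illustration of those semantics, here is a simplified re-implementation (not the Spring code; it handles "xxx*", "*xxx", "*xxx*", and exact matches only):

```java
public class SimpleMatchDemo {

    // Simplified re-implementation of the wildcard matching used by
    // Spring's PatternMatchUtils.simpleMatch; NOT the original code
    static boolean simpleMatch(String pattern, String str) {
        if (pattern == null || str == null) {
            return false;
        }
        if (pattern.startsWith("*") && pattern.endsWith("*") && pattern.length() > 1) {
            return str.contains(pattern.substring(1, pattern.length() - 1));
        }
        if (pattern.startsWith("*")) {
            return str.endsWith(pattern.substring(1));
        }
        if (pattern.endsWith("*")) {
            return str.startsWith(pattern.substring(0, pattern.length() - 1));
        }
        return pattern.equals(str);
    }

    public static void main(String[] args) {
        System.out.println(simpleMatch("daoAuthenticationProvider", "daoAuthenticationProvider")); // true
        System.out.println(simpleMatch("dao*", "daoAuthenticationProvider"));                      // true
        System.out.println(simpleMatch("*Provider", "daoAuthenticationProvider"));                 // true
        System.out.println(simpleMatch("dao*", "userManager"));                                    // false
    }
}
```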


Registering BeanPostProcessors in Jive


As I mentioned before, BeanPostProcessors are automatically registered if an ApplicationContext is used. As Jive does use an ApplicationContext, the proxying should already work after defining the AutoProxyCreator. Unfortunately, there is a class com.jivesoftware.base.event.v2.AutoEventListenerRegistrar, which is a BeanFactoryPostProcessor. BeanFactoryPostProcessors do, as the name indicates, post-process bean factories and are registered and executed before BeanPostProcessors. The AutoEventListenerRegistrar performs a lookup for beans of type com.jivesoftware.base.event.v2.EventSource to inject the EventDispatcher. By doing this, the beans are instantiated and initialized (I don't know if this applies to all beans or only the resolved references, but all *Manager classes implement EventSource and these classes have a lot of dependencies). As the BeanPostProcessor is registered afterwards and the target objects are already created, the post-processing is not executed for these instances.

To work around this issue, it is possible to register the BeanPostProcessor manually. I did this by defining a bean that implements the interface BeanDefinitionRegistryPostProcessor. BeanDefinitionRegistryPostProcessors are also registered and executed automatically, but before the BeanFactoryPostProcessors. In this bean, we can register the BeanPostProcessor.


public class BeanPostProcessorsSetup implements BeanDefinitionRegistryPostProcessor {

    private static final Logger LOG = Logger.getLogger(BeanPostProcessorsSetup.class);

    private List<BeanPostProcessor> beanPostProcessors;

    public void setBeanPostProcessors(List<BeanPostProcessor> beanPostProcessors) {
        this.beanPostProcessors = beanPostProcessors;
    }

    public void postProcessBeanFactory(ConfigurableListableBeanFactory bf) throws BeansException {
        for (BeanPostProcessor bpp : this.beanPostProcessors) {
            LOG.info("Registering " + bpp.getClass().getName());
            bf.addBeanPostProcessor(bpp);
        }
    }

    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        // nothing to do
    }
}


Bean definition:

<bean id="mfBeanPostProcessorsSetup" class="">
    <property name="beanPostProcessors">
        <list>
            <ref bean="mfAuthProviderProxyCreator" />
        </list>
    </property>
</bean>


Summary (Pros & Cons)


The main reason for me to implement this approach was an additional plugin I had to develop: I needed to apply additional code after the daoAuthenticationProvider authenticates a user. I could have implemented this by extending the Jive class and overriding the bean definition, but I had already done this in another plugin (hence this was not repeatable). With this approach I am now able to add new behavior even to beans that are overridden in other plugins. I already have a use case, where a core bean is proxied by two separate plugins and it works perfectly.


But I think there are also two things you really have to consider before following this approach:

  • Implementation: you need to know the implementation of the target method and have answers to the following questions: What happens if I modify the method parameters? Which exceptions does the target method throw and how are they handled by callers? How is the result object used by callers and what happens if I modify the result object? If you use this approach, the aspect's code should be as unobtrusive as possible. If you modify method parameters and execution results, this is likely to affect the stability of the system and the repeatability of the approach. You also have to take other applied aspects into account, like transaction management for instance (transaction boundaries!).
  • Product upgrades and API conformity: the code of an implemented aspect/advice is not "type-safe". In the MethodInterceptor you probably evaluate the method name and signature and cast Objects to interfaces or classes. If the method name or signature changes with new versions, your advice might not work anymore or cause errors at runtime.


Comments and opinions on this approach are very welcome. Can you think of any drawbacks I might have missed? Has anybody done something similar yet and what are your experiences?