cmis-in-batch released with data generation mode

I have just pushed a proper 1.0 release of the cmis-in-batch tool to GitHub and Bintray.

Here is a quote from the project documentation describing the new feature:

Data generation is a useful feature that allows bulk importing of test documents into a CMIS-compatible repository. Additionally, it can populate document metadata with values coming from predefined dictionaries.

A sample script for generating thousands of documents can look like the one below.

Here is a brief description of what the script does:

* it will load three dictionaries from the files /tmp/disciplines, /tmp/types and /tmp/subtypes. The dictionaries are simple text files where values are separated by newline characters. A Cartesian product is then calculated from the dictionary values (see the Java sketch after the script below), so, for example, having three dictionaries:

1. level1A, level1B
2. level2A, level2B
3. level3A, level3B
the following combinations will be generated:

[level1A, level2A, level3A]
[level1A, level2A, level3B]
[level1A, level2B, level3A]
[level1A, level2B, level3B]
[level1B, level2A, level3A]
[level1B, level2A, level3B]
[level1B, level2B, level3A]
[level1B, level2B, level3B]

* it will import each file found in the content-path location (“/media/kbryd/Media/work/sample_data/department”) to a repository location defined by the linking-rule: /Repository/${discipline}/static/${doctype}/sub/${docsubtype} – each ${} variable will be replaced by a value coming from the appropriate dictionary.
* naming-rule defines what the object name should be. It can use variables from the dictionaries plus a few additional ones: ${file_name}, ${file_size}, ${file_path}, ${file_ext}, ${file_mime}
* mapping defines how the metadata of each document is populated; in this case, for example, the discipline attribute will be populated with the value coming from the discipline dictionary.

   generate-random-data "set1" {
       doc-type "cara_document"
       linking-rule "/Repository/${discipline}/static/${doctype}/sub/${docsubtype}"
       naming-rule "${file_name} - ${doctype}"
       content-path "/media/kbryd/Media/work/sample_data/department"

       mapping {
           discipline { ... }
           doc_type { ... }
           doc_subtype { ... }
       }

       dictionaries {
           discipline "/tmp/disciplines"
           doctype "/tmp/types"
           docsubtype "/tmp/subtypes"
       }
   }
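
To make the dictionary expansion concrete, here is a small stand-alone Java sketch of the same Cartesian-product idea (the values are hard-coded instead of being read from the /tmp/* files; this is just an illustration, not cmis-in-batch code):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class DictionaryProduct {

  public static void main(String[] args) {
    List<List<String>> dictionaries = Arrays.asList(
        Arrays.asList("level1A", "level1B"),
        Arrays.asList("level2A", "level2B"),
        Arrays.asList("level3A", "level3B"));

    for (List<String> combination : cartesianProduct(dictionaries)) {
      System.out.println(combination); // e.g. [level1A, level2A, level3A]
    }
  }

  // Recursively combines every value of the first dictionary with every
  // combination built from the remaining dictionaries.
  static List<List<String>> cartesianProduct(List<List<String>> dicts) {
    List<List<String>> result = new ArrayList<>();
    if (dicts.isEmpty()) {
      result.add(new ArrayList<>());
      return result;
    }
    for (String value : dicts.get(0)) {
      for (List<String> rest : cartesianProduct(dicts.subList(1, dicts.size()))) {
        List<String> combination = new ArrayList<>();
        combination.add(value);
        combination.addAll(rest);
        result.add(combination);
      }
    }
    return result;
  }
}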

And that’s all! Have fun using it! 🙂

Docker & orphaned volumes

What is the best way to get rid of orphaned Docker volumes?

I have written a one-liner that seems to solve this problem. First, let's check how many volumes there are in total:

root@quad:~# docker volume ls | wc -l

72 volumes. Quite a lot to be honest.

Then I ran this to double-check that it really returns valid data:

for vol in `docker volume ls -qf dangling=true`; do docker volume inspect $vol | jq -r '.[] | .Mountpoint' | xargs ls; done

and I got 52 lines like these:

ls: cannot access '/opt/docker/volumes/07a1d0711833af526d08809118ae880fce4d6c537d4246c2fc332fceca0cde9a/_data': No such file or directory
ls: cannot access '/opt/docker/volumes/0d88bab60c8f5023a22f3ff15b35ab585c66b476f2e6615db06f50fefb2bc970/_data': No such file or directory
ls: cannot access '/opt/docker/volumes/0e4e14e289b25a65d0ee597bc7810c6f09961d13a059293a357c425b46534f5a/_data': No such file or directory
ls: cannot access '/opt/docker/volumes/18c2481850150cab1c46b0bdc9178e959c05dd4a205a45510c20467c6b95e473/_data': No such file or directory
ls: cannot access '/opt/docker/volumes/2275d260b670d10d026f1be776a9d32b9a26e23d513cca63add039e9451e3454/_data': No such file or directory
ls: cannot access '/opt/docker/volumes/28252f4885f02615e33d1ed2fdc4b45ebb21d38b933457760090f9f3b68f4e09/_data': No such file or directory

which means that only 20 volumes are actually valid; the rest of them point to nonexistent folders on the hard drive.

So, they can be easily removed by running:

docker volume ls -qf dangling=true | xargs -r docker volume rm

Bizarre Composer error

Composer is good when it works, but when it stops working it is a real PITA to understand what is going on. Sometimes cleaning the project helps, sometimes re-importing does the magic, but sometimes nothing works. Almost.

So I had this issue where a perfectly fine Composer project one day simply stopped building. At all. All I was getting was this error message:

'Dar Install Options Set' of 'Dar Def-Methods' must be set default.dardef /Methods/dar org.eclipse.emf.ecore Problem

It turned out that the project settings (upgrade/overwrite etc.) *somehow* vanished from the default.dardef file, so I added them back:

      <location xsi:type="installparameter:FolderParameter" href="urn:com.emc.ide.folderparameter/Dar+Install+Folder?artifactURI=file:/C:/Composer/Methods/Artifacts/Installation%20Parameters/darinstallfolder.parameter#//@dataModel"/>
      <upgradeOption xsi:type="dardef:UpgradeOption"/>

And it started working fine again…

Repoint 0.2.0 released

I have just released a new version of Repoint, 0.2.0. Head to the Repoint GitHub project to get the executables for Linux and Windows. If you would like a package for Mac OS, please leave a comment and I will add it as well.

What’s new? Well, in the 0.1.0 version I added a new dialog that shows ACLs in a nice and readable way. The feature is available from the right-click menu in the tree and also from the DQL results viewer. In the future this feature will also allow editing of ACLs.

In 0.2.0 I have added one more feature that I have had on my TODO list forever.


It is now possible to add a docbroker, and a docbase served by that docbroker, without having to edit the configuration file. You can still use the existing configuration, but if you need to quickly connect to a different docbase which is not listed there, it is now possible 🙂

If you have ideas, bug reports, etc., please use GitHub or the comments below to let me know.


D2 Lockbox & VirtualBox

D2 Lockbox can be a real PITA to install, but it is even more of a PITA when it is installed in a VM. The thing is that even if you have a properly installed D2 Lockbox, it will stop functioning after each VM reboot.

It took me long hours to notice what was wrong; I was changing the setup and double-checking it against the documentation a few times, and everything seemed just fine… Funnily enough, after changing the setup I was doing a reboot – “just in case” 🙂

So, the lesson I have learnt is that a VirtualBox VM should not be restarted after Lockbox has been configured. It is better to use “VM Pause”; otherwise the D2.lockbox file will have to be re-generated.



Documentum D2: How to display a dialog from a D2-Plugin?

Sometimes it is useful to display a message to the user from code running on the server side (in a D2-Plugin). At first sight there doesn’t seem to be a way to do this, but thanks to Dariusz R. (thanks!), who has decompiled half of the D2 code 🙂, we have learned that it is actually possible. And it is also extremely simple.

The trick is to use the D2fsExceptionManager.throwD2SilentException(D2fsContext context, String eventName, String messageToPublish) method with an appropriate event name and message, for example:

D2fsExceptionManager.throwD2SilentException(context, "D2_ACTION_DISPLAY_DIALOG", "DIALOG_NAME==VeryImportantDialog!!CHANNEL_EVENT==D2_ACTION_DISPLAY_DIALOG");

That’s all! Try it for yourself.

Customizing Documentum D2

I have been working on customizing D2 for the last 1.5 years, and I think I am in a good position to share some of the knowledge I have managed to gather. I am going to start a series of blog posts explaining some less obvious tricks that I find useful.

So, let’s start with customizing D2 menus…

There are two ways of doing it. One involves D2-Config: you can simply click “Go to…/D2 Menu” and then configure the order of menu items, change their conditions, etc.
This is fine, but it is not always the best approach. For example, what if you would like to add more advanced conditions which are not available in vanilla D2? Or what if you would like to create menu items dynamically? This simply can’t be done using D2-Config alone. In more advanced projects it is often better to have the menus customized via XML in a D2 plugin rather than in D2-Config; this also allows better control of changes (as all the files live in a VCS of some sort).

Okay, enough introduction, let’s see an example. Let’s assume that we would like to add a static D2 menu item that will open some dialog.
In order to achieve this goal it is good to understand how D2 plugins work. For the moment let’s only focus on a typical plugin folder structure:
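
A minimal sketch of such a layout, assuming a Maven-style plugin project (only the directories mentioned in this post are significant; the remaining names are illustrative):

    my-d2-plugin
        pom.xml
        src/main/java                        <- plugin classes (dynamic menus, dialog builders, services)
        src/main/resources/xml/menu          <- menu delta files, e.g. MenuContextDelta.xml
        src/main/resources/xml/dialogs       <- dialog layout XML files (location assumed)
        src/main/resources/strings/dialog    <- per-dialog *.properties resources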


The menu layout XML files are stored in src/main/resources/xml/menu, so let’s create a file named MenuContextDelta.xml there with the following content:

<?xml version="1.0" encoding="utf-8"?>
    <insert position-after="menuContextEdit">
        <menuitem id="menuContextSampleStatic" ... />
    </insert>

The name of this file is important: it is a ‘delta’ file containing the differences that, merged with the base MenuContext.xml (stored in D2FS4DCTM-WEB-4.5.jar), will produce the final MenuContext.xml.

What the XML code does is pretty simple: we define a new menu item that will be inserted after the menuContextEdit menu item (that is, ‘Edit’) in the context menu (this is implied by the name of the MenuContextDelta.xml file). When clicked, the new menu item will show a dialog named ‘SampleDialog’.

Now, what if we would like to have a dynamic menu item? Let’s say the requirement is to list the available custom actions, e.g. “Publish to system A”, “Publish to system B”, etc. when the menu is selected.

In order to implement it we need two files: the first one adds the main menu item containing a reference to the second file (please note src=”PublishSubmenu”); the second one points to the Java class that will generate the available submenu options based on the current document selection (as the available publishing systems depend on the selected document):

File: MenuContextDelta.xml

<?xml version="1.0" encoding="utf-8"?>
    <insert position-after="menuContextEdit">
        <menuitem id="menuContextSampleSubmenu" src="PublishSubmenu" ... />
    </insert>

File: PublishSubmenu.xml

<?xml version="1.0" encoding="utf-8"?>
    <dynamic-menuitem id="submenuPublish" class="..."/>

And finally, the Java class that will create the submenu items:


import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.documentum.fc.client.IDfSysObject;
import com.emc.d2fs.dctm.ui.dynamicactions.actions.ShowDialog;
import com.emc.d2fs.utils.AttributeUtils;
// (imports of D2fsContext, XmlNode, SimpleDynamicMenu and of the project-specific
// PublishingUtils / AvailablePublishingSystemsModel classes omitted)

/**
 * Created by kbryd on 11/01/2015.
 */
public class PublishMenu extends SimpleDynamicMenu {

  public XmlNode getXmlMenuItem(D2fsContext context) throws Exception {
    XmlNode result = new XmlNode();
    IDfSysObject sysObject = (IDfSysObject) context.getFirstObject();
    AvailablePublishingSystemsModel model = PublishingUtils.getSystems(context.getSession(), sysObject);
    List<String> configNames = model.getSystemsAsList();

    for (int i = 0; i < configNames.size(); i++) {
      String configName = configNames.get(i);
      String label = configName;

      // one submenu item per available publishing system
      XmlNode node = result.appendChildNode("menuitem");
      node.setAttribute("id", "genid_" + i);
      node.setAttribute("label", label);

      Map<String, Object> attributes = new HashMap<>();
      attributes.put("DIALOG_NAME", "ConfirmPublishDialog");
      attributes.put("DIALOG_LIST_PARAM",
          "target_system" + AttributeUtils.SEPARATOR_VALUE + "label" + AttributeUtils.SEPARATOR_VALUE
              + "oam_id");
      attributes.put("target_system", configName);
      node.setAttribute("action", new ShowDialog().getAction(context, attributes));
    }
    return result;
  }
}

There is one small but interesting thing going on there… I pass the target_system name to the dialog so it can be displayed as part of the confirmation message:

      Map<String, Object> attributes = new HashMap<>();
      attributes.put("DIALOG_NAME", "ConfirmPublishDialog");
      attributes.put("DIALOG_LIST_PARAM",
          "target_system" + AttributeUtils.SEPARATOR_VALUE + "label" + AttributeUtils.SEPARATOR_VALUE
              + "oam_id");
      attributes.put("target_system", configName);
      node.setAttribute("action", new ShowDialog().getAction(context, attributes));

Merely adding the value to the attributes Map is not enough to pass it to the dialog; it is also important to add that attribute name to the DIALOG_LIST_PARAM value. Then, in the dialog’s buildDialog method, you can access the parameter value like this:

  public XmlNode buildDialog(D2fsContext context, List<Attribute> attributes) throws Exception {
    XmlNode configDialog = XmlUtil.loadFromURL(context.getXmlDialogFile()).getRootXmlNode();

    DialogProcessor dialogProcessor = new DialogProcessor(context, configDialog);
    Map<String, String> defaultValues = new HashMap<>();
    defaultValues.put("target_system", context.getParameterParser().getStringParameter("target_system"));

    XmlNode dialogNode = dialogProcessor.getDialog();
    return dialogNode;
  }

A nice thing about default values in dialogs is that the values can be used automatically in *.properties files.

Let’s have a look at the ConfirmPublishDialog layout XML:

<?xml version="1.0" encoding="utf-8"?>
<dialog id="ConfirmPublishDialog" width="410" height="200" resizable="true" buttons_right="false">
        <comment id="confirmMessage" html_content="true" condition_visible="true"/>
        <button id="buttonOk" type="submit" action="validDialog()" />
        <button id="buttonCancel" type="reset" action="cancelDialog()" />
</dialog>

The comment element has the id ‘confirmMessage’. This id also points to a resource in src/main/resources/strings/dialog/ConfirmPublishDialog.

And the promised trick is that you can use $value constructs there, like this:

confirmMessage=Please confirm that you want to publish selected document to $value(target_system)

This saves some additional effort in passing the value down to the view.

I hope this was useful!

Simple CMIS Export tool

It is hard to believe that there was no basic (well, even extremely basic) tool that would allow exporting folders and documents from a CMIS repository (Alfresco in my case) to a file system in a hassle-free way. Thanks to the OpenCMIS library, writing such a tool took around one hour, and here is the result.

Currently the tool accepts the following arguments:

usage: com.metasys.CMISExportTool
 -h                           Print help for this application
 -f <arg>                     Destination folder location
 -u <arg>                     User login
 -p <arg>                     Password
 -levels <number of levels>   Number of levels
 -s,--starting-path <arg>     Start path

The -levels argument is not yet supported, but it will be soon.
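
To give an idea of how little OpenCMIS code such an export needs, here is a minimal, stand-alone sketch of a recursive export (this is not the actual tool's source; the connection URL, credentials, starting path and class name are illustrative):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.HashMap;
import java.util.Map;

import org.apache.chemistry.opencmis.client.api.CmisObject;
import org.apache.chemistry.opencmis.client.api.Document;
import org.apache.chemistry.opencmis.client.api.Folder;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.data.ContentStream;
import org.apache.chemistry.opencmis.commons.enums.BindingType;

public class SimpleCmisExportSketch {

  public static void main(String[] args) throws Exception {
    Map<String, String> params = new HashMap<>();
    params.put(SessionParameter.USER, "admin");
    params.put(SessionParameter.PASSWORD, "admin");
    params.put(SessionParameter.ATOMPUB_URL, "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom");
    params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());

    SessionFactory factory = SessionFactoryImpl.newInstance();
    Session session = factory.getRepositories(params).get(0).createSession();

    // export everything below the starting path to a local folder
    Folder startFolder = (Folder) session.getObjectByPath("/Sites");
    export(startFolder, Paths.get("/tmp/export"));
  }

  private static void export(Folder folder, Path target) throws Exception {
    Files.createDirectories(target);
    for (CmisObject child : folder.getChildren()) {
      if (child instanceof Folder) {
        // recurse into sub-folders, mirroring the repository structure on disk
        export((Folder) child, target.resolve(child.getName()));
      } else if (child instanceof Document) {
        ContentStream content = ((Document) child).getContentStream();
        if (content != null) {
          try (InputStream in = content.getStream()) {
            Files.copy(in, target.resolve(child.getName()), StandardCopyOption.REPLACE_EXISTING);
          }
        }
      }
    }
  }
}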

This tool can be very useful when used together with the cmis-upload-maven-plugin for writing unit tests. For example, you can export some files (e.g. configuration) that your unit test requires from a repository, put them in your AMP project and, finally, use cmis-upload-maven-plugin to automatically upload those files to the test repository that is started during the execution of your unit tests (mvn test).

So here is an example. Let’s assume that you have a project that needs some configuration in the repository; let’s name it ‘stamper’. In order to make the bootstrap process more convenient you need to add the following section to your pom.xml file:



This will upload all files from the ${project.build.directory}/stamper-resources folder (i.e. target/stamper-resources) to the root folder of your repository. You could copy some files there using the maven-resources-plugin, so let’s do it…

This will copy files from main/src/test/config to target/stamper-resources.
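
A minimal sketch of such a maven-resources-plugin execution could look like this (the plugin version, execution id and phase are assumptions; the paths follow the description above):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <version>2.7</version>
    <executions>
        <execution>
            <id>copy-stamper-resources</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.directory}/stamper-resources</outputDirectory>
                <resources>
                    <resource>
                        <directory>main/src/test/config</directory>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>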


and now all you have to do to upload the files is to type:

mvn -Dmaven.test.skip=true -Daction=upload-config package

Enjoy! 🙂

Documentum Repoint Resurrected!

I guess everyone using Documentum on a daily basis knows what the dqMan and Repoint applications are. The life of a Documentum developer would be very difficult if those tools were not available. I tried so many times to get used to dqMan but I simply couldn’t; there is just something wrong with its user interface… So even though the Repoint application felt like a ‘beta’ version, I still preferred Repoint over dqMan.

I have had a plan for a long time to restart the development of Repoint, as it seems the original author has abandoned the project. And it finally happened: I have just committed a Maven-friendly Repoint project to GitHub.

If you would like to play with it, you can grab the sources and use Maven to build it; just run:

mvn clean install

and after some time you will find the Repoint application for your architecture (Linux, Windows, macOS) in the following folder: repoint-eclipse-repository/target/products/Repoint

So far I have only removed some deprecated RCP code, but I have bigger plans for the application. Depending on my spare time, I am going to work on:

* ACL editor
* Trusted Content Services support

and of course fix issues and improve usability.