Thursday, November 20, 2014

Setting JAVA_HOME for Bitnami Apache Tomcat stack as a Windows Service

tl;dr 

You can't point the Bitnami Tomcat service at a different JAVA_HOME; if you want a consistent JDK, you'll have to set your system JAVA_HOME to the Bitnami-bundled JDK.

Setting the Bitnami Tomcat JAVA_HOME on Windows

The Problem

The Bitnami Tomcat stack packages its own JDK and uses it instead of the existing JAVA_HOME directory.  This means that if you want a consistent JDK between your unit tests and in-container tests (and you do), you either need to set your usual JAVA_HOME to the Bitnami-bundled version or get Tomcat to use the system JAVA_HOME.
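
A quick way to see which JDK each context is actually using is to print the java.home system property from both your build and from inside Tomcat; a minimal, self-contained sketch:

```java
public class JavaHomeCheck {
    public static void main(String[] args) {
        // The JDK/JRE actually running this code (what Tomcat or your tests use)
        System.out.println("java.home = " + System.getProperty("java.home"));
        // The environment variable that build tools typically consult
        System.out.println("JAVA_HOME = " + System.getenv("JAVA_HOME"));
    }
}
```

If the two values disagree between contexts, you have the inconsistency described above.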

The Background

This tutorial is about the latter, as I had already installed the Java Cryptography Extension (JCE) unlimited strength jurisdiction policy files into the system JAVA_HOME (as per a blog post on Suhorish!) and didn't want to do it again.

The Research

After going through several Stack Overflow questions and a Bitnami community post, I found that they all suggested changing the various Windows BAT files in /bin and /scripts.  This did not work.

The Experiments

Digging into the Windows Service properties, I found that tomcat7.exe is called directly, bypassing all of these scripts.

I eventually decided to use Junction Link Magic to create a junction (directory link) from Bitnami/tomcatstack/java to my actual JAVA_HOME... and that didn't work either.  The Tomcat service fails with Application Error 1 and an event log error of:
The tomcatstackTomcat service terminated with the following service-specific error: 
Incorrect function.

So I tried to bypass the service by calling the scripts directly, also to no avail.

The Conclusion

This led me to the tl;dr at the start of this article.  I hate admitting defeat.

Thursday, October 2, 2014

Make Ubuntu 14.04 Look Like Windows with Gnome

After setting up dual-booting with Windows 8 and Ubuntu, one of the first things I did was make my Windows habits and my Ubuntu habits converge to make things (much) easier.  I decided to make Ubuntu look more like Windows.

There are already several posts about making Ubuntu look like Windows; however, they leave out that in the default Unity desktop it is impossible to have your window buttons (close, minimize, maximize) on the right-hand side a la Windows (Unity Tweak Tool bug report).  This means you need to use an alternate desktop (like Gnome with the Gnome Tweak Tool) to achieve the effect.  It was pretty easy to switch.

Tuesday, September 23, 2014

Dual Booting Ubuntu 14.04 and Windows 8

Although there have been many, many posts, questions (Ubuntu Forums, Ask Ubuntu) and tutorials (Ubuntu.com) on this subject, the solution is still elusive and complex, and some background information is needed before delving into the black art of modern boot loaders.  Note that this article focuses on having Windows as the primary boot loader and having it delegate to GRUB2 to boot Ubuntu.  This is the focus because Windows 8 will rewrite its boot loader if it isn't the first OS to boot, wiping out the GRUB boot info entirely.

Safety First

The Ubuntu.com tutorial above details backing up your entire system and creating recovery disks in case something goes wrong.  It would be a shame to have to reinstall both of your OSes because of a couple of bad files.

A Brief History of Boot Loading

Booting up has a long and venerable history.  For the Microsoft family of OSs (DOS and Windows), the old scheme used the Master Boot Record (MBR), a boot sector at the start of the disk, with boot options stored in the hidden file C:\boot.ini.  (There is also a good Manjaro Wiki article on the subject.)  However, the MBR scheme has the limitation of "only" being able to index 2 TB of data on a single volume, so a new scheme was devised to future-proof Windows Vista and later, i.e. Windows 8, against truly massive drives.  This scheme is called the Unified Extensible Firmware Interface (UEFI), and instead of using a single boot sector it uses an entire partition.  You will need to determine which boot scheme your Windows install uses.  Windows stores its boot settings in the Boot Configuration Data (BCD) store, edited with the bcdedit utility; there is a Windows Help article about the back-story and another on bcdedit.  The important take-away is that a Windows Vista / Windows 8 system can now boot EITHER with UEFI OR with the MBR, and BCD was created to abstract away this underlying complexity.  This results in either having BCD with UEFI or BCD with the MBR.  However, this turns out to be a leaky abstraction.

UEFI or MBR

As a rule of thumb, a computer bought after Windows Vista came out (2008?) will support UEFI.  You can also dig around in the BIOS during boot-up (you may have to press F1, F10 or F12 to bring it up) looking for boot information; in the BIOS, MBR may be referred to as "legacy boot" or something similar.  Alternatively, you can just use msinfo32 and check the "BIOS Mode" entry.

Pick Your Poison

Install and Fix

To modify the new UEFI scheme you need to use the built-in command-line utility bcdedit (run from the native shell as Administrator, NOT from a replacement like Console2), the free-for-non-commercial-use easyBCD, or the similarly free BootNext utility.  Unfortunately, easyBCD only works with BCD over the MBR, and BootNext requires you to register on its blog site to download.  The good news is that easyBCD is easy to use.

Using easyBCD (BCD / MBR)

After much searching and consternation I found an Ask Ubuntu post on how to dual-boot Windows and Linux, as well as an in-depth walk-through on the easyBCD site.  Basically, you install Ubuntu normally using Advanced Partitioning and then tell Windows how to find it.  These posts let me boot into Ubuntu after poking through the boot menu, following an error about being unable to find the /NST/AutoNeoGrub0.mbr file.  This error happens when you use easyBCD with UEFI.

Using BootNext (BCD / UEFI)

While BootNext supports UEFI, it appears to only have a help file for installing with Windows 7.  The method is also complex, as you have to mount your Ubuntu partition under Windows and then copy 512 bytes of the boot sector.  This leads us to the option of keeping UEFI and installing everything correctly from the start.

Install it Right

It turns out that there is a hard-to-find article at help.ubuntu.com on installing Ubuntu alongside another OS with UEFI enabled, and it is far easier to understand than a similar Q&A page.  The critical part is that when you install Ubuntu in "Advanced" mode to manage your system partitions, you need to find your UEFI partition and mount it at /boot/efi.  You can find this partition by going into the Windows Disk Management utility and looking for a partition with a Status of "Healthy (EFI System Partition)"; note which partition it is (in my case, the second partition on the disk).  Remember to create a swap partition (1x or 2x as large as your RAM, see the FAQ) as well as your main Ubuntu partition (20 GB or more).

Conclusion

So, in this article we have seen the history of boot loaders, the different kinds of boot loaders and the tools to work with them, as well as how to install Ubuntu and then tell Windows about it, or how to install Ubuntu correctly in the first place.  This has been one of the more difficult articles that I have written, and I hope I have distilled my research across about 30 different tutorials and Q&A pages into an easy-to-understand format.

Tuesday, April 29, 2014

Using JS Test Runner with slf4j-log4j12

Hello.

If you've tried using JS Test Runner to integrate your QUnit JavaScript unit tests with Maven, you may have noticed that JS Test Runner has a transitive dependency on slf4j-jcl (Simple Logging Facade for Java with the Jakarta Commons Logging bridge).  This is supposed to be easily fixed with Maven's dependency exclusions, but for some unknown reason that did not work for my project.  The result was two slf4j bindings present at the same time, causing a warning from slf4j ("Multiple bindings were found on the class path").  A warning usually isn't too bad a problem, but it caused the Google App Engine to crash locally, making it a blocking issue.  I tried to fix the POM file the Right Way by cloning the Git repository and running `mvn install` on an incremented version of it, but that required GPG, which failed on my 64-bit Windows machine ("Sorry, no terminal at all requested - can't get input").  So, out of desperation, I "monkey-patched" the current POM (1.0.2) in my local repository to use slf4j-log4j12 (like the rest of my project) instead of slf4j-jcl.  This has been working beautifully... but is a blatant hack.

In summary, if you're having problems with JS Test Runner vis-a-vis logging, you want to manually edit the POM file in your local repository (and document it in your project).
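
For completeness, the exclusion that should normally do the job looks like the following (the JS Test Runner coordinates here are illustrative, not verified; use the ones from your own POM):

```xml
<dependency>
  <!-- illustrative coordinates; copy the real ones from your POM -->
  <groupId>org.codehaus.jstestrunner</groupId>
  <artifactId>jstestrunner-junit</artifactId>
  <version>1.0.2</version>
  <exclusions>
    <!-- swap out slf4j-jcl for the binding the rest of the project uses -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-jcl</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version><!-- match your project's slf4j version --></version>
</dependency>
```

In my case this exclusion inexplicably had no effect, hence the monkey-patch above.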

Tuesday, March 18, 2014

The Maven Central Archetype Catalog File

While trying to look at the new Google App Engine Archetypes, I found that they were in Maven Central... but Maven Central wasn't configured as an archetype source in my Eclipse installation.

It took some digging to find the Maven docs pointing to http://repo1.maven.org/maven2/archetype-catalog.xml so that I could add the catalog to the list of sites Eclipse searches.  Good luck.
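
As an aside, the maven-archetype-plugin of that era could also be pointed at a catalog straight from the command line; a hedged sketch (the accepted values of the archetypeCatalog parameter vary by plugin version):

```shell
mvn archetype:generate -DarchetypeCatalog=http://repo1.maven.org/maven2/archetype-catalog.xml
```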

Friday, March 7, 2014

Shiro, Guice, Maven and Google App Engine Integration Quick Start Completed!

The Shiro, Guice, Maven and Google App Engine Integration Quick Start tutorial is now complete.  Blogger was having an issue where the usual "preview" feature was not working so I had to publicly post my work to ensure that it was displaying correctly.  For anyone who viewed the page before completion, you can now go back if you want to see the finished work.  Thanks for reading!

Thursday, February 27, 2014

Shiro, Guice, Maven and Google App Engine Integration Quick Start

This brief, Hello World-like quick start tutorial will help you interpret the semi-byzantine Shiro documentation on Web Integration with Guice (documentation link) in a Maven project with the Google App Engine as the deployment environment.

We are going to assume that you have a working simple Shiro setup (Step 1 in link), that Guice is integrated into Servlets already (link) and that you just need to link the two together.

The biggest inherent challenge is that we are caught in a classic catch-22: the ShiroWebModule needs the ServletContext to create the Injector, but the Injector has to be created before the ServletContext is conveniently available.  The answer to this, used here, is to:
  • create a class to hold the current ServletContext and expose it statically
  • to create another class that implements ServletContext and delegates to the static calls and
  • to pass this proxy object to the new ShiroWebModule
Here is the GuiceServletContextListener:

package ...;

import java.util.List;

import org.apache.shiro.guice.web.ShiroWebModule;

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Module;
import com.google.inject.servlet.GuiceServletContextListener;
import com.google.inject.servlet.ServletModule;
import com.peninsulawebsolutions.ioc.PwsGuiceModule;
import com.peninsulawebsolutions.security.shiro.PwsShiroWebModule;
import com.peninsulawebsolutions.servlet.PwsServletContextProxy;

public class PwsGuiceServletContextListener extends GuiceServletContextListener {

    @Override
    protected Injector getInjector() {

        // get list of generic modules, must be List and not Set since ordering counts
        final List<Module> pwsGuiceModules = getGuiceModules();

        // add Google Guice servlet integration first
        pwsGuiceModules.add(0, new ServletModule());

        // add the Shiro Web Module and ShiroFilterModule last
        pwsGuiceModules
                .add(new PwsShiroWebModule(new PwsServletContextProxy()));
        pwsGuiceModules.add(ShiroWebModule.guiceFilterModule());

        final Injector injector = Guice.createInjector(pwsGuiceModules);

        return injector;
    }

    ...
}

Here is the ServletContextHolder:

package ...;

import javax.servlet.ServletContext;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class PwsServletContextHolder implements ServletContextListener {

    private static ServletContext servletContext;

    public static ServletContext getServletContext() {
        // may want assert statement here to ensure not null
        return servletContext;
    }

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        servletContext = sce.getServletContext();

    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        servletContext = null;
    }

}

We're almost done with the Java code; this is the ServletContextProxy:
package ...;

import java.io.InputStream;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Enumeration;
import java.util.Set;

import javax.servlet.RequestDispatcher;
import javax.servlet.Servlet;
import javax.servlet.ServletContext;
import javax.servlet.ServletException;

public class PwsServletContextProxy implements ServletContext {

    @Override
    public String getContextPath() {
        return getServletContext().getContextPath();
    }    
    
    @Override
    public ServletContext getContext(String uripath) {
        return getServletContext().getContext(uripath);
    }
    
    @Override
    public int getMajorVersion() {
        return getServletContext().getMajorVersion();
    }
    
    @Override
    public int getMinorVersion() {
        return getServletContext().getMinorVersion();
    }
    
    @Override
    public String getMimeType(String file) {
        return getServletContext().getMimeType(file);
    }
    
    @SuppressWarnings("rawtypes")
    @Override
    public Set getResourcePaths(String path) {
        return getServletContext().getResourcePaths(path);
    }
    
    @Override
    public URL getResource(String path) throws MalformedURLException {
        return getServletContext().getResource(path);
    }
    
    @Override
    public InputStream getResourceAsStream(String path) {
        return getServletContext().getResourceAsStream(path);
    }
    
    @Override
    public RequestDispatcher getRequestDispatcher(String path) {
        return getServletContext().getRequestDispatcher(path);
    }
    
    @Override
    public RequestDispatcher getNamedDispatcher(String name) {
        // delegate to the real context (calling ourselves here would recurse forever)
        return getServletContext().getNamedDispatcher(name);
    }
    
    @SuppressWarnings("deprecation")
    @Override
    public Servlet getServlet(String name) throws ServletException {
        return getServletContext().getServlet(name);
    }
    
    @SuppressWarnings({ "rawtypes", "deprecation" })
    @Override
    public Enumeration getServlets() {
        return getServletContext().getServlets();
    }
    
    @SuppressWarnings({ "rawtypes", "deprecation" })
    @Override
    public Enumeration getServletNames() {
        return getServletContext().getServletNames();
    }
    
    @Override
    public void log(String msg) {
        getServletContext().log(msg);
    }
    
    @SuppressWarnings("deprecation")
    @Override
    public void log(Exception exception, String msg) {
        getServletContext().log(exception, msg);
    }
    
    @Override
    public void log(String message, Throwable throwable) {
        getServletContext().log(message, throwable);
    }
    
    @Override
    public String getRealPath(String path) {
        return getServletContext().getRealPath(path);
    }
    
    @Override
    public String getServerInfo() {
        return getServletContext().getServerInfo();
    }
    
    @Override
    public String getInitParameter(String name) {
        return getServletContext().getInitParameter(name);
    }
    
    @SuppressWarnings("rawtypes")
    @Override
    public Enumeration getInitParameterNames() {
        return getServletContext().getInitParameterNames();
    }
    
    @Override
    public Object getAttribute(String name) {
        return getServletContext().getAttribute(name);
    }
    
    @SuppressWarnings("rawtypes")
    @Override
    public Enumeration getAttributeNames() {
        return getServletContext().getAttributeNames();
    }
    
    @Override
    public void setAttribute(String name, Object object) {
        getServletContext().setAttribute(name, object);
    }
    
    @Override
    public void removeAttribute(String name) {
        getServletContext().removeAttribute(name);
    }
    
    @Override
    public String getServletContextName() {
        return getServletContext().getServletContextName();
    }
    
    protected ServletContext getServletContext() {
        return PwsServletContextHolder.getServletContext();
    }
}

That one took a lot of typing to create.
Lastly for Java code we have the ShiroWebModule subclass:
package ...;
import javax.servlet.ServletContext;

import org.apache.commons.io.FilenameUtils;
import org.apache.shiro.config.Ini;
import org.apache.shiro.guice.web.ShiroWebModule;
import org.apache.shiro.realm.text.IniRealm;

import com.google.inject.Provides;
import com.peninsulawebsolutions.assertions.PwsAssertUtils;
import com.peninsulawebsolutions.exceptions.PwsCheckedException;
import com.peninsulawebsolutions.os.PwsFileUtils;
import com.peninsulawebsolutions.os.PwsWindowsCommandLine;

public class PwsShiroWebModule extends ShiroWebModule {

    public PwsShiroWebModule(ServletContext servletContext) {
        super(servletContext);
    }

    @Override
    protected void configureShiroWeb() {
        try {
            // realm should be created with the IniRealm(Ini) constructor
            // bind to different constructor (e.g. an SQL one) as needed
            bindRealm().toConstructor(IniRealm.class.getConstructor(Ini.class));
        } catch (final NoSuchMethodException e) {
            addError(e);
        }

        // can add Shiro API calls here, e.g. addFilterChain

    }

    @Provides
    public Ini loadShiroIni() throws PwsCheckedException {
        // will probably look like "file:[cwd]/WebContent/WEB-INF/shiro.ini"
        // during test; WebContent/ will need to be omitted when run inside
        // the container
        final String path = getIniPath();

        final Ini ret = Ini.fromResourcePath(path);
        PwsAssertUtils.isFalse("Processed Shiro ini file was empty.",
                ret.isEmpty());
        return ret;
    }

    ...
}

The listener order is important here; we need the current servlet context in the holder before we access it from the Guice listener.

If you keep the old EnvironmentLoaderListener you will get an error about users being defined twice.  Also note that this code technically does the same thing as the default behavior and just loads an ini file!  The important part is that you now have a place to freely call the Shiro API.

Almost lastly, we finally have the web.xml configuration:

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://java.sun.com/xml/ns/javaee" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" id="WebApp_ID" version="2.5">

  ...
  
  <filter>
    <filter-name>guiceFilter</filter-name>
    <filter-class>com.google.inject.servlet.GuiceFilter</filter-class>
  </filter>
  <filter-mapping>
    <filter-name>guiceFilter</filter-name>
    <url-pattern>/*</url-pattern>
    <dispatcher>FORWARD</dispatcher>
    <dispatcher>REQUEST</dispatcher>
  </filter-mapping>
  ...
  <filter>
    <filter-name>shiroFilter</filter-name>
    <filter-class>org.apache.shiro.web.servlet.ShiroFilter</filter-class>
  </filter>
  <filter-mapping>
    <filter-name>shiroFilter</filter-name>
    <url-pattern>/*</url-pattern>
    <dispatcher>REQUEST</dispatcher>
    <dispatcher>FORWARD</dispatcher>
    <dispatcher>INCLUDE</dispatcher>
    <dispatcher>ERROR</dispatcher>
  </filter-mapping>
  
  <listener>
    <listener-class>com.peninsulawebsolutions.servlet.PwsServletContextHolder</listener-class>
  </listener>
  <listener>
    <listener-class>com.peninsulawebsolutions.guice.PwsGuiceServletContextListener</listener-class>
  </listener>
  <!-- No longer needed with Web configuration! -->
<!--   <listener> -->
<!--     <listener-class>org.apache.shiro.web.env.EnvironmentLoaderListener</listener-class> -->
<!--   </listener> -->
  
  ...
</web-app>

For the last bit of xml we have the Maven dependencies:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  ...

  <dependencies>

    <!-- Google Dependencies -->
    
    <dependency>
      <groupId>com.google.inject</groupId>
      <artifactId>guice</artifactId>
      <version>${guice.version}</version>
    </dependency>
    <dependency>
      <groupId>com.google.inject.extensions</groupId>
      <artifactId>guice-servlet</artifactId>
      <version>${guice.version}</version>
    </dependency>
    <dependency>
      <groupId>xml-apis</groupId>
      <artifactId>xml-apis</artifactId>
      <version>1.4.01</version>
    </dependency>

    <!-- Faces and Facelet Dependencies -->

    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
      <scope>provided</scope>
    </dependency>

    <!-- Shiro Security Dependencies -->

    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-core</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-web</artifactId>
      <version>${shiro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-guice</artifactId>
      <version>${shiro.version}</version>
    </dependency>

    ...
    
  </dependencies>

  ...

</project>

Our last note / gotcha is that it is easy to get tripped up by NoClassDefFoundError if your Shiro modules or your Guice modules have mismatched versions (including the NO_AOP classifier!).  This can be non-trivial in a multi-module Maven project.
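
One defensive measure (a sketch, assuming a parent POM with the shiro.version and guice.version properties used above) is to pin each family of artifacts once under dependencyManagement, so every module resolves the same versions:

```xml
<dependencyManagement>
  <dependencies>
    <!-- pin every Shiro artifact to one version across all modules -->
    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-core</artifactId>
      <version>${shiro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-web</artifactId>
      <version>${shiro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.shiro</groupId>
      <artifactId>shiro-guice</artifactId>
      <version>${shiro.version}</version>
    </dependency>
    <!-- likewise for Guice -->
    <dependency>
      <groupId>com.google.inject</groupId>
      <artifactId>guice</artifactId>
      <version>${guice.version}</version>
    </dependency>
    <dependency>
      <groupId>com.google.inject.extensions</groupId>
      <artifactId>guice-servlet</artifactId>
      <version>${guice.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Child modules then declare the dependencies without versions and cannot drift apart.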

In conclusion we have the basics for getting Shiro, Guice, Maven and the Google App Engine to play nice with each other.  This is done by having a ServletContextHolder wrapped by a ServletContextProxy put into the Guice Injector in the GuiceServletContextListener.  We also configured the web.xml to use our two listeners and Maven to have the appropriate dependencies.  We also covered a last gotcha.

This should be everything you need to get going.  Code was taken from a working Peninsula Web Solutions (PWS) project.  Good luck!

Objectify 4 TDD Maven Hello World Tutorial with Google App Engine

I've just started getting my hands dirty with Objectify (link) on the Google App Engine (link) and finally started small with unit tests, instead of writing a bunch of code that breaks because third-party modules don't do what the documentation says they do; i.e., using Test Driven Development (TDD) in practice.  I came across a fair number of outdated tutorials (link link), old documentation (link), forum posts (link) and Stack Overflow questions, so it's a perfect candidate for a tutorial post.  In this tutorial we are going to write a JUnit test for objectify-appengine with a persisted Car object with an out-of-place VIN of "Hello World".

First off, figuring out those nasty Maven dependencies was quite a task by itself.  The Objectify docs will say you only need objectify (link), but you need the full complement of Google App Engine dependencies (link link) as well as a Java Persistence API (JPA) implementation (we use Apache OpenJPA here, link).

<dependencies>

  <!-- Objectify -->
  <dependency>
    <groupId>com.googlecode.objectify</groupId>
    <artifactId>objectify</artifactId>
    <version>4.0b3</version>
  </dependency>

  <!-- Open JPA -->
  <dependency>
    <groupId>org.apache.openjpa</groupId>
    <artifactId>openjpa-all</artifactId>
    <version>2.2.2</version>
  </dependency>

  <!-- GAE Dependencies -->

  <dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>${appengine.target.version}</version>
  </dependency>

  <dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-testing</artifactId>
    <version>${appengine.target.version}</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-labs</artifactId>
    <version>${appengine.target.version}</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-stubs</artifactId>
    <version>${appengine.target.version}</version>
    <scope>test</scope>
  </dependency>
</dependencies>

For the Plain Old Java Object (POJO) to persist we are going with the Car class as in the Objectify documentation (link), except they forgot about the @Entity annotation (link).

package ...;

// imports included for your convenience
import javax.persistence.Transient;

import com.googlecode.objectify.ObjectifyService;
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

@Entity
public class Car {
  @Id
  Long id;

  String vin;

  int color;

  @Transient
  String doNotPersist;

  static {
    ObjectifyService.register(Car.class);
  }

  private Car() {
  }

  public Car(String vin, int color) {
    this.vin = vin;
    this.color = color;
  }

}

Lastly, we have the test class itself.  Note that save() and load() are used in the Objectify 4 API (link) instead of put(...) and get(...) from the old API.  These methods also support using Keys (link), but I wasn't able to figure out how to get Keys at the time, so the example (modified from the Objectify documentation) uses IDs.

package ...;

import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

import com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
import com.google.appengine.tools.development.testing.LocalServiceTestHelper;
import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyService;

public class ObjectifyAdapterTest {

  private final LocalServiceTestHelper helper = new LocalServiceTestHelper(
      new LocalDatastoreServiceTestConfig());

  @BeforeClass
  public static void setUpBeforeClass() throws Exception {
  }

  @AfterClass
  public static void tearDownAfterClass() throws Exception {
  }

  @Before
  public void setUp() throws Exception {
    helper.setUp();
  }

  @After
  public void tearDown() throws Exception {
    helper.tearDown();
  }

  @Test
  public void test() {
    final Objectify ofy = ObjectifyService.ofy();

    // Simple create, note the cheesy use of Hello World
    final Car porsche = new Car("Hello World", 3);
    ofy.save().entities(porsche).now();
    final Long id = porsche.id;
    assert id != null; // id was autogenerated

    // Get it back
    final Car loadedPorsche = ofy.load().type(Car.class).id(id).now();

    assert porsche.equals(loadedPorsche);

    // Change some data and write it
    porsche.color = 1;
    ofy.save().entities(porsche).now();

    // Delete it
    ofy.delete().entity(porsche).now();
  }
}

Some key points are the helper object standing in for the Google App Engine persistence mechanism and that you get the same object back that you saved.  Enjoy your quick start with Objectify, TDD, Maven and the Google App Engine!
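
As a follow-up on the Keys question: in the Objectify 4 API, a Key can be created from a kind plus id, or straight from a saved entity, and then used for loading.  A hedged sketch (assumes Objectify 4 on the classpath; not verified against this project):

```java
import com.googlecode.objectify.Key;

// inside the test, after saving porsche and capturing its id:
final Key<Car> byId = Key.create(Car.class, id);   // kind + numeric id
final Key<Car> byEntity = Key.create(porsche);     // from the saved entity
final Car loaded = ofy.load().key(byId).now();     // load via the Key
```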

Wednesday, February 5, 2014

Jenkins, SSH and "git plugin only support official git client" on Windows

This post will spare you some pain authenticating with Jenkins running on a Windows machine.  In particular this post is about the obstacles to getting Jenkins and Bitbucket communicating (link).

If you try to authenticate with HTTPS you get "jenkins returned status code 255: stdout: stderr: error: could not lock config file .git/config: No such file or directory".  The bug report (link) comments indicate that the problem is purely cosmetic, or caused by upgrade issues (link).  However, I couldn't get my Bitbucket / Jenkins integration working until I resolved this despite it being a fresh (unupgraded) install.  Fortunately a Stack Overflow question (link) suggested just avoiding HTTPS authentication entirely and using SSH as a work-around.

I tried verifying that Bitbucket was trying to connect to Jenkins by enabling an access log for Jenkins, but that feature seems to be Linux only (Google search).

If you try authenticating with SSH by reading the docs (link) then you can get a "git plugin only support official git client" error.  I found that this bug should be fixed (link), but with GIT plugin 2.0.1 (link) and GIT client plugin 1.6.1 (link) I still had the problem.  You can verify by looking at the source (link, line 1041).

I tried setting the GIT_SSH environment variable (link) and ensuring that Git/bin was on my path to no avail.  In contrast I had some success trying polling in the short term instead of having Bitbucket notify my Jenkins server when there was a code check-in (link).

The final solution (besides re-installing in the default location) was to use Junction Link Magic (link) to create a hard link (a.k.a. a junction in Windows terminology) from the default location to the actual location.

In summary, the only way to get Jenkins working correctly on Windows is to use SSH instead of the default HTTPS, and with an alternate Git install location you have to create a hard link (junction) from "C:/Program Files (x86)/Git" to where you actually installed Git.

Tuesday, January 14, 2014

dashboard-maven-plugin Out of Memory errors and alternative Dashboard Reports

In order to better communicate project progress to a non-technical audience (e.g. customers), I found the dashboard-maven-plugin recommended in a few places, so I decided to try it out.  I was a little wary since it only had one release, in 2008, but didn't find anything else recommended at the time.  I must have been having a bad day.  At first everything went (relatively) smoothly as I integrated Cobertura, PMD, Checkstyle, FindBugs (far better than PMD), JDepend and Taglist (although I couldn't get its regular expressions to work).

However, I started getting the dreaded JVM out-of-memory errors.  At first I just increased the memory to 8 GB, and that worked for a little while.  However, as the project grew in size and complexity, I eventually got problems even with 16 GB in use, and I even had to increase my OS paging file to cope.  I verified that it was all being consumed by using the excellent Process Explorer to monitor the process memory usage.  After turning Maven debugging on, I found that the dashboard-maven-plugin was the last thing running before the crashes.  After commenting it out I was able to complete my documentation build, even going back down to 512 MB of PermGen space and 8 GB of heap space.
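
For reference, Maven's JVM memory is controlled through the MAVEN_OPTS environment variable; a sketch of the final working settings on Windows (the PermGen flag assumes a pre-Java-8 JVM, where PermGen still exists):

```shell
set MAVEN_OPTS=-Xmx8g -XX:MaxPermSize=512m
```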

After getting more versed in reports and dashboards, XRadar, Squale and SonarQube all look promising as alternative dashboards.

Lastly, to those that celebrate the western new year, Happy New Year!