Saturday, February 26, 2011

Convert image to byte array (byte[]) in JavaMe

The last post showed you how to take a picture using the device's camera and then send it to a servlet in Base64 format. But sometimes you just need to send a picture that is either inside the application or on your memory card (SD card, etc.).

So in this post, we are going to convert images inside the application to byte arrays, so you can use the methods shown in the last post to send those images to the servlet. We will also see how to get byte arrays from a file with the help of the FileConnection API, though we won't go into detail about it; I will share some links that do a great job of getting you started.

The following method takes a String as parameter and looks inside the application to get the bytes:

  
     //inside MIDlet class...

   /**
   * Reads the local image resource and returns an
   * array of bytes representing the image
   * @param imgPath path to the image inside the app
   * @return array of bytes representing the image
   */
  public byte[] loadFromApp(String imgPath) {

    byte[] imgBytes = null;
    ByteArrayOutputStream baos = null;
    InputStream in = null;
    try {

      baos = new ByteArrayOutputStream();
      //this image is inside my mobile application
      in = this.getClass().getResourceAsStream(imgPath);
      byte[] buffer = new byte[1024];
      int n = 0;
      while ((n = in.read(buffer)) != -1) {
        baos.write(buffer, 0, n);
      }

      imgBytes = baos.toByteArray();

    } catch (Exception ex) {
      ex.printStackTrace();
    } finally {
      //whatever happens, close the streams
      if (baos != null) {
        try {
          baos.close();
        } catch (Exception ex) {
        }
      }
      if (in != null) {
        try {
          in.close();
        } catch (Exception ex) {
        }
      }
    }

    return imgBytes;
  }

So you see it's pretty straightforward: just get the resource as a stream and copy the bytes from one stream to the other; at the end, the ByteArrayOutputStream lets you get the byte array.
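If you want to sanity-check the copy loop outside the emulator, the same pattern runs on a desktop JVM. Here the image bytes are faked with an in-memory stream (the 3000-byte array is just a stand-in, not a real image):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamCopyDemo {

    // The same copy loop as loadFromApp, extracted so it works on any stream
    public static byte[] readAll(InputStream in) {
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            byte[] buffer = new byte[1024];
            int n;
            while ((n = in.read(buffer)) != -1) {
                baos.write(buffer, 0, n);
            }
            return baos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        byte[] fake = new byte[3000]; // stand-in for image bytes
        byte[] copy = readAll(new ByteArrayInputStream(fake));
        System.out.println(copy.length); // 3000
    }
}
```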

But what if you just want to send a picture you just took that is saved on your memory card (SD card, micro SD card, etc.)? Well, you have to access the file with the help of the FileConnection API. Here are some great links to help you get started with this API:

For getting started, right from Oracle:

For some considerations when using the API on real devices:

OK, after you understand how the API works and can access files, you can use the following method to get the byte array of a file. You will notice that this method is very similar to the previous one:

  
     //inside MIDlet class...

   /**
   * Reads the image and returns an array of bytes
   * @param imgPath Path to the image inside the Device or
   * the memory card
   * @return array of bytes representing the image
   */
  public byte[] loadFromDevice(String imgPath) {

    byte[] imgBytes = null;
    ByteArrayOutputStream baos = null;
    InputStream in = null;
    FileConnection file = null;
    try {
      //Access to the file
      file = (FileConnection) Connector.open(imgPath);
      //Checks whether the file exists
      if (!file.exists()) {
        return new byte[0];
      }
      baos = new ByteArrayOutputStream();
      in = file.openInputStream();
      byte[] buffer = new byte[1024];
      int n = 0;
      while ((n = in.read(buffer)) != -1) {
        baos.write(buffer, 0, n);
      }

      imgBytes = baos.toByteArray();

    } catch (Exception ex) {
      ex.printStackTrace();
    } finally {
      //whatever happens, close the streams
      if (baos != null) {
        try {
          baos.close();
        } catch (Exception ex) {
        }
      }
      if (in != null) {
        try {
          in.close();
        } catch (Exception ex) {
        }
      }
      if (file != null) {
        try {
          file.close();
        } catch (Exception ex) {
        }
      }
    }

    return imgBytes;
  }

There's one important thing to keep in mind with this API... not all devices implement it, so you should check whether the device supports it before using it. The following method lets you check the availability of the API:

  
     //inside MIDlet class...

  /**
   * Checks whether the device has support for
   * the FileConnection API
   * @return true if and only if the device
   * supports the API, false otherwise
   */
  private boolean checkFileSupport() {
    return System.getProperty
       ("microedition.io.file.FileConnection.version") != null;
  }

I hope you can use these methods in your applications and start sharing some images!

see ya soon!


References:

Getting Started with the FileConnection APIs. December 2004. Oracle [online].
Available on Internet: http://developers.sun.com/mobility/apis/articles/fileconnection/
[accessed on February 26 2011].

Working with the Java ME FileConnection API on Physical Devices. March 2007. Java.net[online].
Available on Internet: http://today.java.net/article/2007/03/27/working-java-me-fileconnection-api-physical-devices
[accessed on February 26 2011].

Saturday, February 12, 2011

Taking a picture - Base64 encoding in JavaMe

It's been a while since my last post, January was a busy month, but here I am again.

In this post, we are going to do something really interesting: taking a snapshot with the phone's camera and send it to the server in Base64 format.

You should read the post about Base64 in JavaMe (J2ME) if you haven't:


Taking the snapshot is quite easy, you just have to be careful about blocking calls. The real problem happens when uploading the image to the server. I tried several scenarios using Glassfish 3.0.1 and Tomcat 6.0.32, and none of them worked on both servers... It was very frustrating because the code seemed to be OK, yet it worked on Glassfish or on Tomcat, never on both.

Let's start with the general code, assume the following is inside the MIDlet class:

  
     //inside MIDlet class...

     /** Commands to control the application flow*/
     private Command commandShoot = null;
     private Command commandExit = null;
     private Command commandSend = null;
     private Command commandBack = null;
     private Display display = null;

     /** flags to indicate whether the phone 
      *  supports png or jpeg encoding for snapshots
      */
     private boolean png = false;
     private boolean jpeg = false;

     /** Canvas where the video is going to be played*/
     private VideoCanvas videoCanvas = null;

     /** Canvas where the snapshot is going to be shown*/
     private ImageCanvas imgCanvas = null;


     public void startApp() {
        display = Display.getDisplay(this);

        commandExit = new Command("Exit", Command.EXIT, 0);
        commandShoot = new Command("Shoot", Command.SCREEN, 0);
        commandSend = new Command("Send", Command.SCREEN, 0);
        commandBack = new Command("Back", Command.BACK, 0);

        //Verifies if the application can use the camera
        //and encoding
        if (checkCameraSupport() && 
            checkSnapShotEncodingSupport()) {

            //this is the canvas where the video will render
            videoCanvas = new VideoCanvas(this);
            videoCanvas.addCommand(commandExit);
            videoCanvas.addCommand(commandShoot);
            videoCanvas.setCommandListener(this);

            display.setCurrent(videoCanvas);
        } else {
            System.out.println("NO camera support");
        }
    }

    /**
     * Checks whether the phone has camera support
     * @return true if and only if the phone has camera
     * support
     */
    private boolean checkCameraSupport() {
        String propValue = System.getProperty
                           ("supports.video.capture");
        return (propValue != null) && propValue.equals("true");
    }
    
    /**
     * Checks if the phone has png or jpeg support.
     * @return true if and only if the phone supports
     * png or jpeg encoding
     */
    private boolean checkSnapShotEncodingSupport() {
        String encodings = System.getProperty
                           ("video.snapshot.encodings");
        png = (encodings != null) 
              && (encodings.indexOf("png") != -1);

        jpeg = (encodings != null) 
               && (encodings.indexOf("jpeg") != -1);

        return png || jpeg;
    }

The previous methods tell you whether the phone has camera support and whether it can encode snapshots in either png or jpeg format. The +startApp():void method starts the application, checks for camera support, and shows the canvas where the video will render.
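The encoding check boils down to an indexOf lookup on the "video.snapshot.encodings" property value. As a rough illustration (the property value below is a made-up example; real devices report their own list, typically entries like "encoding=png"):

```java
public class EncodingCheckDemo {

    // Same indexOf check as checkSnapShotEncodingSupport,
    // extracted so we can feed it a sample value
    public static boolean supports(String encodings, String name) {
        return (encodings != null) && (encodings.indexOf(name) != -1);
    }

    public static void main(String[] args) {
        // hypothetical value of the "video.snapshot.encodings" property
        String encodings = "encoding=png encoding=jpeg";
        System.out.println(supports(encodings, "png"));  // true
        System.out.println(supports(encodings, "gif"));  // false
    }
}
```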

Next is the method that actually encodes an array of bytes. Later we will show how to pass the Image as a byte array to this method in order to get the Image in Base64 format:

  
     //inside MIDlet class...

    /**
     * Encodes an array of bytes
     * @param imgBytes Array of bytes to encode
     * @return String representing the Base64 format of
     * the array parameter
     */
    public String encodeImage(byte[] imgBytes) {
        byte[] coded = Base64.encode(imgBytes);
        return new String(coded);
    }

As we saw in the Base64 encode-decode in JavaMe (J2ME) Post, the previous method encodes using the bouncy castle library.
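On the server side you don't need Bouncy Castle: the encoder from the earlier post produces the standard Base64 alphabet, so any standard decoder will do. A minimal server-side sketch using java.util.Base64 (available on the server JVM, not in Java ME; the 4 bytes below are just the start of the PNG signature, not a real image):

```java
import java.util.Base64;

public class Base64RoundTrip {

    // Server-side equivalent of the MIDlet's encodeImage method
    public static String encode(byte[] bytes) {
        return Base64.getEncoder().encodeToString(bytes);
    }

    // What the servlet does before writing the image to disk
    public static byte[] decode(String s) {
        return Base64.getDecoder().decode(s);
    }

    public static void main(String[] args) {
        byte[] imgBytes = {(byte) 0x89, 'P', 'N', 'G'}; // start of a PNG signature
        String encoded = encode(imgBytes);
        System.out.println(encoded);                // iVBORw==
        System.out.println(decode(encoded).length); // 4
    }
}
```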

The following methods go inside the MIDlet class as well; they are responsible for starting the canvas that takes the snapshot and for passing the snapshot to another canvas in order to show it to the user:

  
    //inside MIDlet...

    /**
     * Starts the Video Canvas to take the snapshot
     */
    public void snapShot() {

        if (png) {
            videoCanvas.startSnapShot("png");
        } else if (jpeg) {
            videoCanvas.startSnapShot("jpeg");
        } else {
            videoCanvas.startSnapShot(null);
        }
    }

    /**
     * Shows the snapshot in a Canvas
     * @param bytes Array of bytes representing the
     * snapshot
     */
    public void showSnapShot(byte[] bytes) {
        if (bytes != null) {

            imgCanvas = new ImageCanvas(bytes);
            imgCanvas.addCommand(commandSend);
            imgCanvas.addCommand(commandBack);
            imgCanvas.setCommandListener(this);

            display.setCurrent(imgCanvas);
        }
    }

OK, that's pretty much the general code for the MIDlet, let's see what happens within the VideoCanvas class. The next code should be inside the constructor, it starts the video player:

        
    //inside VideoCanvas class...

    /** Application MIDlet */
    ImageCaptureMidlet midlet = null;

    /** Control for the video*/
    VideoControl videoControl = null;

    public VideoCanvas(ImageCaptureMidlet mid) {
        midlet = mid;
        Player player = null;

        try {
            player = Manager.createPlayer("capture://video");
            player.realize();
            videoControl = (VideoControl) player.getControl
                           ("VideoControl");
            videoControl.initDisplayMode
                       (VideoControl.USE_DIRECT_VIDEO, this);
            videoControl.setDisplayFullScreen(true);
            videoControl.setVisible(true);

            player.start();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

So, when the canvas is created it starts the player and the video starts playing inside the canvas. When the MIDlet invokes the +snapShot():void method, the following piece of code is executed inside the VideoCanvas class:

    //inside VideoCanvas class...

    /**
     * Starts a thread to take the snapshot. 
     * Some devices will take the snapshot without 
     * the need of a thread, but some others 
     * don't (including my emulator). 
     * So start a new Thread...
     * @param encoding String representing the encoding
     * to use when taking the snapshot
     */
    public void startSnapShot(final String encoding) {
        new Thread(new Runnable() {

            public void run() {
                try {
                    byte[] rawBytes = null;
                    if (encoding != null) {
                        //take the snapshot using the encoding
                        rawBytes = videoControl.getSnapshot
                                   ("encoding=" + encoding);

                    } else {
                        //take the snapshot using the best
                        //possible encoding. 
                        //Implementation dependent
                        rawBytes = videoControl.getSnapshot
                                   (null);
                    }
                    //ask the midlet to show the snapshot
                    midlet.showSnapShot(rawBytes);
                } catch (MediaException ex) {
                    ex.printStackTrace();
                }
            }
        }).start();
    }

You may notice that the method starts a new Thread. That's because not all devices will take the snapshot right away; sometimes it takes a little while to start the camera and take the snapshot, and doing that on the UI thread would block the application. Also, note that the +VideoControl.getSnapshot(String):byte[] method can receive a null parameter, which lets the implementation decide the best possible encoding for the snapshot. Finally, when the snapshot is taken, the method asks the application MIDlet to show it.

We have previously seen the +showSnapShot(byte[]):void method in the MIDlet class, so let's take care about the ImageCanvas class where the snapshot is shown:

    //inside ImageCanvas class...

    /** Snapshot to render*/
    private Image image = null;

    /** bytes of the snapshot*/
    private byte[] imageBytes = null;

    public ImageCanvas(byte[] bytes) {
        imageBytes = bytes;

        //creates an Image using the byte array 
        image = Image.createImage(imageBytes, 0, 
                                  imageBytes.length);
    }

    public void paint(Graphics g) {
        int width = getWidth();
        int height = getHeight();
        g.setColor(0x000000);
        g.fillRect(0, 0, width, height);

        //render the snapshot
        g.drawImage(image, getWidth() / 2, getHeight() / 2, 
                    Graphics.HCENTER | Graphics.VCENTER);
    }

    //Getters and Setters...


OK, up until now we have seen how to take a snapshot and render it on a Canvas. That's the general code we were talking about at the beginning of this post. Now let's see how to send the snapshot to the server. Here we are going to present two scenarios, one for Glassfish server and another for the Tomcat server.

Glassfish Server:
The following method is the method inside the MIDlet that sends the snapshot to a servlet deployed on Glassfish server 3.0.1:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : jpeg?"jpeg":"";
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                DataOutputStream dout = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is a must to pass
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    ByteArrayOutputStream bout = new 
                                       ByteArrayOutputStream();
                    dout = new DataOutputStream(bout);
                    dout.writeUTF(imageEncoded);
                    dout.writeUTF(format);
                    os.write(bout.toByteArray());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (dout != null) {
                        try {
                            dout.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

As you can see, the method calls the encoding method, passing the snapshot as a byte array, and then opens an HttpConnection to the servlet deployed on the Glassfish server. It opens the OutputStream and writes the image in Base64 format, followed by the encoding (format) used. On the server side, you should read the stream the same way you wrote it, that is, first the image in Base64 format and then the encoding used (format).
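Since writeUTF prefixes each string with its length, the servlet must call readUTF in exactly the same order. The pairing can be sketched on a desktop JVM, with an in-memory byte array standing in for the HTTP request body (the payload string is a stand-in; also note that writeUTF is limited to 65535 bytes per string, so very large Base64 images would need to be split):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class ReadOrderDemo {

    // Write image then format (MIDlet side), read them back in the
    // same order (servlet side); the byte array plays the request body
    public static String[] roundTrip(String imageEncoded, String format) {
        try {
            ByteArrayOutputStream bout = new ByteArrayOutputStream();
            DataOutputStream dout = new DataOutputStream(bout);
            dout.writeUTF(imageEncoded); // first field: the Base64 image
            dout.writeUTF(format);       // second field: the encoding used
            dout.flush();

            DataInputStream din = new DataInputStream(
                    new ByteArrayInputStream(bout.toByteArray()));
            return new String[] { din.readUTF(), din.readUTF() };
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String[] fields = roundTrip("aGVsbG8=", "png"); // stand-in payload
        System.out.println(fields[0] + " " + fields[1]); // aGVsbG8= png
    }
}
```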

Tomcat Server:
The following method is the method inside the MIDlet that sends the snapshot to a servlet deployed on Tomcat server 6.0.32:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : jpeg?"jpeg":"";
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is a must to pass
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    //IMPORTANT, when writing Base64 format
                    //there are chars like '+' that  
                    //must be replaced when sending.
                    imageEncoded = 
                        imageEncoded.replace('+', '-');
                    StringBuffer params = new StringBuffer();
                    params.append("image" + "=" + imageEncoded);
                    params.append("&" + 
                                  "format" + "=" + format);
                    System.out.println(params.toString());
                    os.write(params.toString().getBytes());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

This method differs from the Glassfish method in the way it writes the parameters: it writes them as key/value pairs. Notice one important point: Base64 uses chars like '+' that can be misinterpreted in the HTTP protocol, because in a form-encoded body a '+' is decoded as a space. So before sending the image in Base64, replace the '+' chars with '-' chars, and in the servlet (server side), convert the chars back before writing the image to disk. This is only needed in the Tomcat method.
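The replacement is safe to reverse because '-' never occurs in the standard Base64 alphabet. A tiny sketch of both sides of the escaping (the input string is a stand-in, not real Base64):

```java
public class PlusEscapeDemo {

    // MIDlet side: make the Base64 string safe for a form-encoded body
    public static String escape(String base64) {
        return base64.replace('+', '-');
    }

    // Servlet side: restore the original Base64 before decoding
    public static String unescape(String wire) {
        return wire.replace('-', '+');
    }

    public static void main(String[] args) {
        String base64 = "ab+cd+ef"; // stand-in Base64 with '+' chars
        String wire = escape(base64);
        System.out.println(wire + " " + unescape(wire)); // ab-cd-ef ab+cd+ef
    }
}
```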

Wow, loooooong post... Again, my advice is to be careful when deciding which method to use with which server. Sometimes your code may be well written but for some reason it won't work on a given server... so you have to try another way of writing it until it works.

That's it for now, hope this helps you to write applications that use the device's camera.

see ya soon!


References:

Taking Pictures with MMAPI. 2011. Oracle [online].
Available on Internet: http://developers.sun.com/mobility/midp/articles/picture/
[accessed on February 01 2011].

Java Tips - Capturing Video on J2ME devices. 2011. Java Tips [online].
Available on Internet: http://www.java-tips.org/java-me-tips/midp/capturing-video-on-j2me-devices.html
[accessed on February 01 2011].

J2ME Sample Codes: Image Capturing in J2ME. 2011. J2ME Sample Codes [online].
Available on Internet: http://j2mesamples.blogspot.com/2009/06/image-capturing-in-j2me.html
[accessed on February 01 2011].

Embedded Interaction - Working with J2ME - Picture transmission over HTTP. 2011. Embedded Interaction [online].
Available on Internet: http://www.hcilab.org/documents/tutorials/PictureTransmissionOverHTTP/index.html
[accessed on February 01 2011].

Base64. 2010. Wikipedia [online].
Available on Internet: http://en.wikipedia.org/wiki/Base64
[accessed on December 28 2010].

The Legion of the Bouncy Castle. bouncycastle.org [online].
Available on Internet: http://www.bouncycastle.org/
[accessed on December 28 2010].