
Saturday, September 3, 2011

Google Static Maps API and JavaME

Whether you need a map for your location-based application or just for fun, the Google Static Maps API is one of the easiest ways to get one. In this post, we are going to see how to get a map as an Image from a latitude/longitude point. The latitude and longitude can be obtained using the Location API, which we won't cover in this post.

When writing this post, I realized there are some license restrictions on the use of the Google Static Maps API in mobile apps... I am posting it anyway for research purposes, but I must warn you about this restriction:



Google Static Maps API Quick Review
This API lets you request an image through a URL, with several parameters you can pass in to obtain a personalized map: the zoom level, the type of map, the size of the image (width, height), markers at locations on the map, etc. There is one limit to keep in mind: use of the API is subject to a quota of 1000 unique (different) image requests per viewer per day, which is a lot of images... but if you need more, there is also a Premium license. For more information:


OK, what we do is the following:
  • Create a method that receives a latitude and longitude point and the size of the image as parameters.
  • Request the map image from the URL http://maps.googleapis.com/maps/api/staticmap, adding some parameters.
  • Create an Image object and return it, so we can show it on screen.
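The URL-building step can be sanity-checked on the desktop with a plain Java sketch (the parameter names come from the steps above; the class name and the sample coordinates are just for illustration):

```java
public class StaticMapUrl {

    /** Builds a Static Maps request URL for a point and an image size. */
    static String buildUrl(double lat, double lon, int width, int height) {
        return "http://maps.googleapis.com/maps/api/staticmap"
                + "?zoom=15"
                + "&size=" + width + "x" + height
                + "&maptype=roadmap"
                + "&markers=color:red|label:A|" + lat + "," + lon
                + "&sensor=true";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl(4.60971, -74.08175, 240, 320));
    }
}
```

Printing the URL and pasting it into a desktop browser is a quick way to verify the parameters before testing on the emulator.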

Hands on Lab
Following is the method we were talking about. It has parameters for the latitude and longitude, and also for the width and height of the image we are requesting. The latitude and longitude can be retrieved using Location API, and the width and height can be retrieved using the Canvas class.

public Image getMap(double lat, double lon,
                    int width, int height)
throws IOException
{
    //official Static Maps endpoint
    String url = "http://maps.googleapis.com/maps/api/staticmap";
    url += "?zoom=15&size=" + width + "x" + height;
    url += "&maptype=roadmap";
    url += "&markers=color:red|label:A|" + lat + "," + lon;
    url += "&sensor=true";

    HttpConnection http = (HttpConnection) Connector.open(url);
    InputStream in = null;
    byte[] imgBytes = null;
    try {
        http.setRequestMethod(HttpConnection.GET);
        //fail fast if the server did not return the image
        if (http.getResponseCode() != HttpConnection.HTTP_OK) {
            throw new IOException("HTTP response code: "
                                  + http.getResponseCode());
        }
        in = http.openInputStream();

        //read the response in chunks
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int n;
        while ((n = in.read(buffer)) != -1) {
            bos.write(buffer, 0, n);
        }
        imgBytes = bos.toByteArray();
    } finally {
        if (in != null) {
            in.close();
        }
        http.close();
    }
    return Image.createImage(imgBytes, 0, imgBytes.length);
}

As you can see, getting the image of the map is pretty simple: it is just a plain HTTP request.
Next you can find an image retrieved by Google Static Maps from a location in my home town.


OK, you just saw how simple it would be if the restriction didn't exist... What do you think about that restriction? It's kind of confusing, isn't it?

Anyway, we are going to need to find another way to show maps on our mobile apps.

see ya soon!

References:

Google Static Maps API. [online].
Available on Internet: http://code.google.com/intl/en/apis/maps/documentation/staticmaps/
[accessed on August 20 2011].

Developing Location Based Services: Introducing the Location API for J2ME. 2009. MobiForge [online].
Available on Internet: http://mobiforge.com/developing/story/developing-location-based-services-introducing-location-api-j2me
[accessed on August 20 2011].

Mini App With Location API and Google Maps in Java ME. 2009. Nokia [online].
Available on Internet: http://www.developer.nokia.com/Community/Wiki/Mini_App_With_Location_API_and_Google_Maps_in_Java_ME
[accessed on September 02 2011].

Google Maps API Family. FAQ. Google [online].
Available on Internet: http://code.google.com/intl/en/apis/maps/faq.html#mapsformobile
[accessed on September 02 2011].

Saturday, February 12, 2011

Taking a picture - Base64 encoding in JavaMe

It's been a while since my last post, January was a busy month, but here I am again.

In this post, we are going to do something really interesting: take a snapshot with the phone's camera and send it to the server in Base64 format.

You should read the post about Base64 in JavaMe (J2ME) if you haven't:


Taking the snapshot is quite easy; you just have to be careful about blocking calls. The real problem comes when uploading the image to the server. I tried several scenarios using Glassfish 3.0.1 and Tomcat 6.0.32, and none of them worked on both servers... It was very frustrating because the code seemed to be OK, but each version worked either on Glassfish or on Tomcat, not on both.

Let's start with the general code, assume the following is inside the MIDlet class:

  
     //inside MIDlet class...

     /** Commands to control the application flow*/
     private Command commandShoot = null;
     private Command commandExit = null;
     private Command commandSend = null;
     private Command commandBack = null;
     private Display display = null;

     /** flags to indicate whether the phone 
      *  supports png or jpeg encoding for snapshots
      */
     private boolean png = false;
     private boolean jpeg = false;

     /** Canvas where the video is going to be played*/
     private VideoCanvas videoCanvas = null;

     /** Canvas where the snapshot is going to be shown*/
     private ImageCanvas imgCanvas = null;


     public void startApp() {
        display = Display.getDisplay(this);

        commandExit = new Command("Exit", Command.EXIT, 0);
        commandShoot = new Command("Shoot", Command.SCREEN, 0);
        commandSend = new Command("Send", Command.SCREEN, 0);
        commandBack = new Command("Back", Command.BACK, 0);

        //Verifies if the application can use the camera
        //and encoding
        if (checkCameraSupport() && 
            checkSnapShotEncodingSupport()) {

            //this is the canvas where the video will render
            videoCanvas = new VideoCanvas(this);
            videoCanvas.addCommand(commandExit);
            videoCanvas.addCommand(commandShoot);
            videoCanvas.setCommandListener(this);

            display.setCurrent(videoCanvas);
        } else {
            System.out.println("NO camera support");
        }
    }

    /**
     * Checks whether the phone has camera support
     * @return true if and only if the phone has camera
     * support
     */
    private boolean checkCameraSupport() {
        String propValue = System.getProperty
                           ("supports.video.capture");
        return (propValue != null) && propValue.equals("true");
    }
    
    /**
     * Checks if the phone has png or jpeg support.
     * @return true if and only if the phone supports
     * png or jpeg encoding
     */
    private boolean checkSnapShotEncodingSupport() {
        String encodings = System.getProperty
                           ("video.snapshot.encodings");
        png = (encodings != null) 
              && (encodings.indexOf("png") != -1);

        jpeg = (encodings != null) 
               && (encodings.indexOf("jpeg") != -1);

        return png || jpeg;
    }

The previous methods tell you whether the phone has camera support and whether it can encode snapshots in either png or jpeg format. The +startApp():void method starts the application, checks for camera support, and shows the canvas where the video will render.
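The encoding check can be exercised off the phone by factoring the string test into a helper. The property value below is hypothetical; on a device it would come from System.getProperty("video.snapshot.encodings"):

```java
public class EncodingSupport {

    /** Same test the MIDlet does, applied to any property value. */
    static boolean supportsPngOrJpeg(String encodings) {
        boolean png = encodings != null && encodings.indexOf("png") != -1;
        boolean jpeg = encodings != null && encodings.indexOf("jpeg") != -1;
        return png || jpeg;
    }

    public static void main(String[] args) {
        // hypothetical value a device might report
        System.out.println(supportsPngOrJpeg("encoding=jpeg encoding=png"));
    }
}
```

Note that a null property (no camera, or the capability not exposed) safely yields false, which is exactly what the MIDlet relies on.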

Next is the method that actually encodes an array of bytes. Later we will show how to pass the Image as a byte array to this method in order to get the Image in Base64 format:

  
     //inside MIDlet class...

    /**
     * Encodes an array of bytes
     * @param imgBytes Array of bytes to encode
     * @return String representing the Base64 format of
     * the array parameter
     */
    public String encodeImage(byte[] imgBytes) {
        byte[] coded = Base64.encode(imgBytes);
        return new String(coded);
    }

As we saw in the Base64 encode-decode in JavaMe (J2ME) post, the previous method encodes using the Bouncy Castle library.
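Java ME has no built-in Base64 class, which is why the post relies on Bouncy Castle. On the desktop you can sanity-check what the encoder should produce with java.util.Base64 (Java SE 8+), since both implement the same standard alphabet:

```java
import java.util.Base64;

public class Base64Check {
    public static void main(String[] args) {
        // the first bytes of a PNG file, as a tiny sample payload
        byte[] imgBytes = { (byte) 0x89, 'P', 'N', 'G' };
        String encoded = Base64.getEncoder().encodeToString(imgBytes);
        System.out.println(encoded); // iVBORw==
    }
}
```

Comparing this output against what your Bouncy Castle code produces for the same bytes is an easy way to confirm the mobile side is encoding correctly.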

The following methods go inside the MIDlet class as well. They are responsible for starting the canvas that takes the snapshot, and for passing the snapshot to another canvas in order to show it to the user:

  
    //inside MIDlet...

    /**
     * Starts the Video Canvas to take the snapshot
     */
    public void snapShot() {

        if (png) {
            videoCanvas.startSnapShot("png");
        } else if (jpeg) {
            videoCanvas.startSnapShot("jpeg");
        } else {
            videoCanvas.startSnapShot(null);
        }
    }

    /**
     * Shows the snapshot in a Canvas
     * @param bytes Array of bytes representing the
     * snapshot
     */
    public void showSnapShot(byte[] bytes) {
        if (bytes != null) {

            imgCanvas = new ImageCanvas(bytes);
            imgCanvas.addCommand(commandSend);
            imgCanvas.addCommand(commandBack);
            imgCanvas.setCommandListener(this);

            display.setCurrent(imgCanvas);
        }
    }

OK, that's pretty much the general code for the MIDlet. Now let's see what happens within the VideoCanvas class. The next snippet shows the class fields and the constructor, which starts the video player:

        
    //inside VideoCanvas class...

    /** Application MIDlet */
    ImageCaptureMidlet midlet = null;

    /** Control for the video*/
    VideoControl videoControl = null;

    public VideoCanvas(ImageCaptureMidlet mid) {
        midlet = mid;
        Player player = null;

        try {
            player = Manager.createPlayer("capture://video");
            player.realize();
            videoControl = (VideoControl) player.getControl
                           ("VideoControl");
            videoControl.initDisplayMode
                       (VideoControl.USE_DIRECT_VIDEO, this);
            videoControl.setDisplayFullScreen(true);
            videoControl.setVisible(true);

            player.start();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

So, when the canvas is created it starts the player and the video starts playing inside the canvas. When the MIDlet invokes the +snapShot():void method, the following piece of code is executed inside the VideoCanvas class:

    //inside VideoCanvas class...

    /**
     * Starts a thread to take the snapshot. 
     * Some devices will take the snapshot without
     * the need of a thread, but some others
     * don't (including my emulator).
     * So start a new Thread...
     * @param encoding String representing the encoding
     * to use when taking the snapshot
     */
    public void startSnapShot(final String encoding) {
        new Thread(new Runnable() {

            public void run() {
                try {
                    byte[] rawBytes = null;
                    if (encoding != null) {
                        //take the snapshot using the encoding
                        rawBytes = videoControl.getSnapshot
                                   ("encoding=" + encoding);

                    } else {
                        //take the snapshot using the best
                        //possible encoding. 
                        //Implementation dependent
                        rawBytes = videoControl.getSnapshot
                                   (null);
                    }
                    //ask the midlet to show the snapshot
                    midlet.showSnapShot(rawBytes);
                } catch (MediaException ex) {
                    ex.printStackTrace();
                }
            }
        }).start();
    }

You may notice that the method starts a new Thread. That's because not all devices take the snapshot right away; sometimes it takes a little while to start the camera and capture the snapshot, which would block the calling thread. Also note that the +VideoControl.getSnapshot(String):byte[] method can receive a null parameter, which lets the implementation decide the best possible encoding for the snapshot. Finally, when the snapshot is taken, the method asks the application MIDlet to show it.
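The thread-plus-callback pattern above reduces to a small plain-Java sketch. The Callback interface and the fake snapshot bytes are inventions for illustration; on a device, the run() body would be the blocking videoControl.getSnapshot(...) call:

```java
public class SnapshotWorker {

    /** Receives the captured bytes, like showSnapShot does in the MIDlet. */
    interface Callback {
        void onSnapshot(byte[] bytes);
    }

    /**
     * Runs the (possibly slow) capture off the calling thread and returns
     * the Thread so a caller can wait for it if needed.
     */
    static Thread capture(final Callback cb) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                // stand-in for the blocking videoControl.getSnapshot(...) call
                byte[] fakeSnapshot = { 1, 2, 3 };
                cb.onSnapshot(fakeSnapshot);
            }
        });
        t.start();
        return t;
    }
}
```

Keeping the blocking call off the UI thread is the whole point; the callback is just how the result finds its way back to the MIDlet.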

We have already seen the +showSnapShot(byte[]):void method in the MIDlet class, so let's turn to the ImageCanvas class, where the snapshot is shown:

    //inside ImageCanvas class...

    /** Snapshot to render*/
    private Image image = null;

    /** bytes of the snapshot*/
    private byte[] imageBytes = null;

    public ImageCanvas(byte[] bytes) {
        imageBytes = bytes;

        //creates an Image using the byte array 
        image = Image.createImage(imageBytes, 0, 
                                  imageBytes.length);
    }

    public void paint(Graphics g) {
        int width = getWidth();
        int height = getHeight();
        g.setColor(0x000000);
        g.fillRect(0, 0, width, height);

        //render the snapshot
        g.drawImage(image, getWidth() / 2, getHeight() / 2, 
                    Graphics.HCENTER | Graphics.VCENTER);
    }

    //Getters and Setters...


OK, up until now we have seen how to take a snapshot and render it on a Canvas. That's the general code we were talking about at the beginning of this post. Now let's see how to send the snapshot to the server. Here we are going to present two scenarios, one for Glassfish server and another for the Tomcat server.

Glassfish Server:
The following is the method inside the MIDlet that sends the snapshot to a servlet deployed on Glassfish server 3.0.1:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : jpeg ? "jpeg" : "";
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                DataOutputStream dout = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is required to pass
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    ByteArrayOutputStream bout = new 
                                       ByteArrayOutputStream();
                    dout = new DataOutputStream(bout);
                    dout.writeUTF(imageEncoded);
                    dout.writeUTF(format);
                    os.write(bout.toByteArray());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (dout != null) {
                        try {
                            dout.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

As you can see, the method calls the encoding method, passing the snapshot as a byte array, and then opens an HttpConnection to the servlet deployed on Glassfish. It opens the OutputStream and writes the image in Base64 format, followed by the encoding (format) used. On the server side, you should read the stream the same way you wrote it, that is, first read the image in Base64 format and then the encoding used (format).
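The write-then-read symmetry can be checked without a server by pairing DataOutputStream with DataInputStream over in-memory streams; a servlet would do the reading half over request.getInputStream() instead (the payload string here is a hypothetical sample):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class UtfRoundTrip {
    public static void main(String[] args) throws IOException {
        // client side: Base64 payload first, then the format
        ByteArrayOutputStream bout = new ByteArrayOutputStream();
        DataOutputStream dout = new DataOutputStream(bout);
        dout.writeUTF("aGVsbG8=");   // hypothetical Base64 payload
        dout.writeUTF("png");
        dout.flush();

        // server side: read back in exactly the same order
        DataInputStream din = new DataInputStream(
                new ByteArrayInputStream(bout.toByteArray()));
        System.out.println(din.readUTF() + " " + din.readUTF());
    }
}
```

writeUTF prefixes each string with its length, which is why the order of the readUTF calls on the server must mirror the order of the writes on the phone.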

Tomcat Server:
The following is the method inside the MIDlet that sends the snapshot to a servlet deployed on Tomcat server 6.0.32:

    //inside MIDlet class

    /**
     * Sends the snapshot to the server
     */
    public void send() {
        new Thread(new Runnable() {

            public void run() {

                String imageEncoded = 
                   encodeImage(imgCanvas.
                               getImageBytes());                

                String format = png ? "png" : jpeg ? "jpeg" : "";
                
                //This is my servlet’s URL
                String URL = 
                           "http://localhost:8080/" + 
                           "Base64ExampleServlet_v2/" +
                           "ImageServlet";

                HttpConnection http = null;
                OutputStream os = null;
                try {
                    //Open HttpConnection using POST
                    http = (HttpConnection) Connector.open(URL);
                    http.setRequestMethod(HttpConnection.POST);
                    //Content-Type is required to pass
                    //parameters in a POST request
                    http.setRequestProperty(
                         "Content-Type", 
                         "application/x-www-form-urlencoded");
                   
                    os = http.openOutputStream();

                    //IMPORTANT, when writing Base64 format
                    //there are chars like '+' that
                    //must be replaced when sending.
                    imageEncoded =
                        imageEncoded.replace('+', '-');
                    StringBuffer params = new StringBuffer();
                    params.append("image=").append(imageEncoded);
                    params.append("&format=").append(format);
                    os.write(params.toString().getBytes());
                    
                    os.flush();  
                } catch (IOException ex) {
                    ex.printStackTrace();
                } 
                //whatever happens, close the streams 
                //and connections
                finally {
                    if (os != null) {
                        try {
                            os.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                    if (http != null) {
                        try {
                            http.close();
                        } catch (IOException ex) {
                            ex.printStackTrace();
                        }
                    }
                }
            }
        }).start();
    }

This method differs from the Glassfish one in the way it writes the parameters: here they are written as key/value pairs. Notice one important point: Base64 uses chars like '+', which can be misinterpreted in form-encoded HTTP bodies, because '+' is decoded as a space char. So before sending the image in Base64, replace the '+' chars with '-' chars; then in the servlet (server side), convert the chars back before writing the image to disk. This is only needed in the Tomcat method.
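The substitution and its inverse are symmetric, so they are easy to test in isolation. This sketch covers only the '+' character the post deals with; a fuller fix would URL-encode the whole value, since the '=' padding and any '&' would also break a form-encoded body:

```java
public class Base64Transport {

    /** Client side: make the Base64 string safe for the form-encoded body. */
    static String escape(String base64) {
        return base64.replace('+', '-');
    }

    /** Server side: restore the original Base64 before decoding. */
    static String unescape(String received) {
        return received.replace('-', '+');
    }

    public static void main(String[] args) {
        String sample = "ab+cd+ef";   // hypothetical Base64 chunk
        System.out.println(escape(sample));            // ab-cd-ef
        System.out.println(unescape(escape(sample)));  // ab+cd+ef
    }
}
```

The round trip is safe because the standard Base64 alphabet never contains '-', so unescape can't corrupt a legitimate character.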

Wow, loooooong post... Again, my advice is to be careful when deciding which method to use with which server. Sometimes your code may be well written, but for some reason it won't work on a given server... so you have to try another approach until it works.

That's it for now, hope this helps you to write applications that use the device's camera.

see ya soon!


References:

Taking Pictures with MMAPI. 2011. Oracle [online].
Available on Internet: http://developers.sun.com/mobility/midp/articles/picture/
[accessed on February 01 2011].

Java Tips - Capturing Video on J2ME devices. 2011. Java Tips [online].
Available on Internet: http://www.java-tips.org/java-me-tips/midp/capturing-video-on-j2me-devices.html
[accessed on February 01 2011].

J2ME Sample Codes: Image Capturing in J2ME. 2011. J2ME Sample Codes [online].
Available on Internet: http://j2mesamples.blogspot.com/2009/06/image-capturing-in-j2me.html
[accessed on February 01 2011].

Embedded Interaction - Working with J2ME - Picture transmission over HTTP. 2011. Embedded Interaction [online].
Available on Internet: http://www.hcilab.org/documents/tutorials/PictureTransmissionOverHTTP/index.html
[accessed on February 01 2011].

Base64. 2010. Wikipedia [online].
Available on Internet: http://en.wikipedia.org/wiki/Base64
[accessed on December 28 2010].

The Legion of the Bouncy Castle. bouncycastle.org [online].
Available on Internet: http://www.bouncycastle.org/
[accessed on December 28 2010].