
I have a JSP web application and I would like to store images uploaded by users in HDFS, then display each image on a JSP front-end page in an image tag. I have looked all over the internet, but oddly there is not a single example for this particular task. Most code snippets either show the word-count MapReduce functionality or how to write to and open a file from the FS, but not how to display it on a web page.

This is the basic flow of my application.

  1. User uploads a file from web html form

    <form action="profile" method="post" enctype="multipart/form-data">
       <input type="file" name="photo" value=""/>    
       <input type="submit" value="Upload"/>
    </form>
    
  2. The servlet receives the request and writes the file to the local file system using the Apache Commons FileUpload 1.2.2 library. Which raises the question: how can I write the incoming stream directly to HDFS without saving to the local FS first?

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String userPath = request.getServletPath();
        if (userPath.equals("/profile")) {
            if (ServletFileUpload.isMultipartContent(request)) {
                // create a new file upload handler
                ServletFileUpload upload = new ServletFileUpload();
                try {
                    // parse the request
                    FileItemIterator iterator = upload.getItemIterator(request);
                    while (iterator.hasNext()) {
                        FileItemStream item = iterator.next();
                        InputStream stream = item.openStream();
                        if (!item.isFormField()) {
                            File file = new File(request.getServletContext().getRealPath("/")
                                    + "images/" + item.getName());
                            FileOutputStream fos = new FileOutputStream(file);
                            // Streams.copy closes the input, and the output too
                            // because the third argument is true
                            Streams.copy(stream, fos, true);

                            // copy from local FS to HDFS
                            ...
                        }
                    }
                } catch (FileUploadException e) {
                    throw new ServletException(e);
                }
            }
        }
    }
    
  3. Copy uploaded image from local filesystem to HDFS

    Configuration config = new Configuration();
    config.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
    config.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
    FileSystem fs = FileSystem.get(config);

    // For illustration purposes, let's call the uploaded file bee.jpg
    fs.copyFromLocalFile(new Path("file:///home/qualebs/NetBeansProjects/qualebs/build/web/images/bee.jpg"),
            new Path("bee.jpg"));
    fs.close();
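
As a possible answer to my own sub-question in step 2: from what I can tell, the local copy could be skipped entirely by opening an HDFS output stream with `FileSystem.create()` and piping the multipart stream into it with Hadoop's `IOUtils`. This is only a sketch, and the `/images/` target directory on HDFS is my own assumption:

```java
import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsUploadSketch {

    // Build the HDFS destination for an uploaded file name.
    // "/images/" is an assumed layout, not something from my setup.
    static String hdfsDestination(String fileName) {
        return "/images/" + fileName;
    }

    // Copy the multipart InputStream straight into HDFS, no local temp file.
    static void writeToHdfs(InputStream stream, String fileName) throws java.io.IOException {
        Configuration config = new Configuration();
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(config);
        FSDataOutputStream out = fs.create(new Path(hdfsDestination(fileName)));
        try {
            // 4 KB buffer; false = don't auto-close, we close in finally
            IOUtils.copyBytes(stream, out, 4096, false);
        } finally {
            out.close();
            stream.close();
        }
    }
}
```

In the servlet loop above, calling `writeToHdfs(item.openStream(), item.getName())` would then replace the whole `FileOutputStream` block.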
    

And finally the question. Normally, to display an uploaded image on a JSP page, one would just use an image tag like

<img src="images/bee.jpg"/>

How do I display this image when it's in Hadoop?
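
The best idea I have so far is a small servlet that opens the file on HDFS and streams it out to the browser, so the image tag points at the servlet instead of a static path. This is an untested sketch; the servlet name, the `name` parameter, and the `/images/` HDFS directory are all made up by me:

```java
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsImageServlet extends HttpServlet {

    // Hypothetical helper that builds the src attribute for the JSP image tag,
    // assuming the servlet is mapped to /hdfsImage.
    static String imageUrl(String fileName) {
        return "hdfsImage?name=" + fileName;
    }

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String name = request.getParameter("name"); // e.g. bee.jpg
        Configuration config = new Configuration();
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        config.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(config);

        // Let the container guess the content type from the file extension.
        response.setContentType(getServletContext().getMimeType(name));
        FSDataInputStream in = fs.open(new Path("/images/" + name));
        // true = close both streams when the copy finishes
        IOUtils.copyBytes(in, response.getOutputStream(), 4096, true);
    }
}
```

The JSP tag would then become `<img src="hdfsImage?name=bee.jpg"/>`. Would something along these lines be the right approach?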

qualebs
  • This has nothing to do with `HDFS`. It's about how to 'publish' the image directory into the internet... and that depends on your infrastructure, e.g. Apache HTTP, storage layout, and so on... – home Nov 12 '13 at 17:25
  • @home I am really new to this and I only have a single node set up. Is there no more help I can get? An example would be nice. – qualebs Nov 12 '13 at 17:43
  • You must configure your web server to 'publish' the image path. An example servlet is here (there might be better ways): http://stackoverflow.com/questions/417658/how-to-config-tomcat-to-serve-images-from-an-external-folder-outside-webapps – home Nov 13 '13 at 07:56
  • Note that HDFS is not designed for a large number of small files, but the other way around. – Praveen Sripati Nov 13 '13 at 07:59
  • @Praveen Sripati so I have read, but I think I've got that covered by something I read about Sequence Files in this question stackoverflow.com/questions/16546040. I hope that is all I need to address the many-small-files issue. Also, from the answer you gave to this question http://stackoverflow.com/questions/8209616/how-to-read-a-file-from-hdfs-through-browser, can I use HTTPFS for my use case here? – qualebs Nov 13 '13 at 12:56

0 Answers