Converting normal Java code to Hadoop code without MapReduce?


I am currently working with Hadoop. I want to adapt my Java code so that it works with HDFS: the code below uses the normal (local) file system, and I want it to read and write files on HDFS (the Hadoop Distributed File System) instead. This is the code I want to run on Hadoop:

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    public class GZIPExample {

        public static void gzip() {
            int i = new File("/media/0052ADF152ADEC1A/all partitions").list().length;
            System.out.println(i + " here");
            while (i > 0) {
                String file = "/media/0052ADF152ADEC1A/all partitions/files" + i + ".txt";
                String gzipFile = "/media/0052ADF152ADEC1A/compressed/files" + i + ".gz";
                String newFile = "/media/0052ADF152ADEC1A/all/test1.txt";
                compressGzipFile(file, gzipFile);
                decompressGzipFile(gzipFile, newFile);
                i--;
            }
        }

        private static void decompressGzipFile(String gzipFile, String newFile) {
            try {
                FileInputStream fis = new FileInputStream(gzipFile);
                GZIPInputStream gis = new GZIPInputStream(fis);
                FileOutputStream fos = new FileOutputStream(newFile);
                byte[] buffer = new byte[1024];
                int len;
                while ((len = gis.read(buffer)) != -1) {
                    fos.write(buffer, 0, len);
                }
                // close resources
                fos.close();
                gis.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        private static void compressGzipFile(String file, String gzipFile) {
            try {
                FileInputStream fis = new FileInputStream(file);
                FileOutputStream fos = new FileOutputStream(gzipFile);
                GZIPOutputStream gzipOS = new GZIPOutputStream(fos);
                byte[] buffer = new byte[1024];
                int len;
                while ((len = fis.read(buffer)) != -1) {
                    gzipOS.write(buffer, 0, len);
                }
                // close resources
                gzipOS.close();
                fos.close();
                fis.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
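For reference, a minimal sketch of what the same gzip logic could look like when reading from and writing to HDFS through the Hadoop FileSystem API, without any MapReduce job. The class name HdfsGzipExample and the /user/example/... paths are placeholders, and it assumes the Hadoop client jars and a valid cluster configuration (core-site.xml) are on the classpath:

    import java.io.IOException;
    import java.util.zip.GZIPOutputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsGzipExample {
        public static void main(String[] args) throws IOException {
            // Load the cluster configuration from the classpath (core-site.xml etc.)
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            Path src = new Path("/user/example/input/file1.txt"); // placeholder HDFS path
            Path dst = new Path("/user/example/output/file1.gz"); // placeholder HDFS path

            // fs.open/fs.create replace FileInputStream/FileOutputStream;
            // the GZIPOutputStream wrapping works exactly as on the local file system
            try (FSDataInputStream in = fs.open(src);
                 GZIPOutputStream out = new GZIPOutputStream(fs.create(dst))) {
                byte[] buffer = new byte[1024];
                int len;
                while ((len = in.read(buffer)) != -1) {
                    out.write(buffer, 0, len);
                }
            }
        }
    }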

I recommend that you first read up on what MapReduce code looks like and what it needs to achieve:

Then you will understand that there is no straightforward "conversion". At a very high level, MapReduce code is split into two steps: map and reduce.
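To illustrate that two-step structure, here is a minimal sketch of the canonical word-count mapper and reducer. The class names TokenMapper and SumReducer are illustrative, and the Hadoop MapReduce libraries are assumed to be available:

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Map step: emit a (word, 1) pair for every token in the input line
    public class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                word.set(token);
                context.write(word, ONE);
            }
        }
    }

    // Reduce step: sum the counts emitted for each distinct word
    class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }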

From your example, it seems that you have a large number of files that you want to combine/shrink with parallel computation, so you should try to write your MapReduce code as that two-step process for combining/compressing into a file; a sketch of such a job is below. Sorry, but I have never worked with compression algorithms myself.
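As a hedged sketch of that idea: rather than compressing by hand, the framework can gzip the job output itself. Note this variant is map-only (compressing independent files needs no reduce step); calling job.setNumReduceTasks(1) instead would funnel everything through one reducer and so combine the input into a single compressed output file. The name CompressFilesDriver is made up, and the usual Hadoop MapReduce dependencies are assumed:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.compress.GzipCodec;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CompressFilesDriver {

        // Pass-through mapper: drop the byte-offset key so only the original lines are written
        public static class LineMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                context.write(NullWritable.get(), value);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "gzip files");
            job.setJarByClass(CompressFilesDriver.class);
            job.setMapperClass(LineMapper.class);
            job.setNumReduceTasks(0);                  // map-only: no reduce step needed
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);

            // Let the framework gzip-compress the output instead of doing it by hand
            FileOutputFormat.setCompressOutput(job, true);
            FileOutputFormat.setOutputCompressorClass(job, GzipCodec.class);

            FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir, must not exist
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }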
