Posts

The javax.servlet Package

The javax.servlet package contains a number of interfaces and classes that establish the framework in which servlets operate. The most significant of these is Servlet. All servlets must implement this interface or extend a class that implements it. The core interfaces of the javax.servlet package are summarized below:

Interface        Description
Servlet          Declares life cycle methods for a servlet.
ServletConfig    Allows servlets to get initialization parameters.
ServletContext   Enables servlets to log events and access information about their environment.
ServletRequest   Used to read data from a client request.
ServletResponse  Used to write data to a client response.

The following table summarizes the core classes that are provided in the javax.servlet package:

Class                Description
GenericServlet       Implements the Servlet and ServletConfig interfaces.
ServletInputStream   Provides an input stream for reading requests from a client.
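As a concrete illustration of these types, here is a minimal sketch of a servlet built on GenericServlet; the class name HelloServlet and the response text are illustrative only:

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.GenericServlet;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

// GenericServlet already implements the Servlet and ServletConfig
// interfaces, so only service() has to be provided here.
public class HelloServlet extends GenericServlet {

    @Override
    public void service(ServletRequest request, ServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/plain");
        PrintWriter out = response.getWriter();
        out.println("Hello from HelloServlet");
    }
}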

Servlet API

Two packages contain the classes and interfaces that are required to build servlets: javax.servlet and javax.servlet.http. These packages are standard extensions provided by the servlet container, such as Tomcat; hence, they are not included in Java 6. Please follow the links for more information.
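To show where the two packages meet, here is a minimal, hypothetical HttpServlet sketch; javax.servlet supplies ServletException, while the HTTP-specific types come from javax.servlet.http:

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;              // from javax.servlet
import javax.servlet.http.HttpServlet;              // from javax.servlet.http
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class HelloHttpServlet extends HttpServlet {

    // Handles GET requests and writes a small HTML response.
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<h1>Hello from javax.servlet.http</h1>");
    }
}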

Apache Maven - How To Enable Proxy Setting

There is a chance that your company has set up a firewall and an HTTP proxy server to stop users from connecting to the Internet directly. If you are behind a proxy, Maven will fail to download any dependencies. To make it work, you have to declare the proxy server settings in Maven's configuration file, settings.xml. It can be found in the conf directory, i.e. {M2_HOME}/conf/settings.xml:

<!-- proxies
 | This is a list of proxies which can be used on this machine to connect to the network.
 | Unless otherwise specified (by system property or command-line switch), the first proxy
 | specification in this list marked as active will be used.
 |-->
<proxies>
  <!-- proxy
   | Specification for one proxy, to be used in connecting to the network.
   |
  <proxy>
    <id>optional</id>
    <active>true</active>
    <protocol>http</protocol
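For reference, a complete proxy entry typically looks like the sketch below once uncommented; the host, port and credentials are placeholder values and must be replaced with your own proxy details:

<proxies>
  <proxy>
    <id>company-proxy</id>                          <!-- any identifier -->
    <active>true</active>
    <protocol>http</protocol>
    <host>proxy.example.com</host>                  <!-- placeholder proxy host -->
    <port>8080</port>                               <!-- placeholder proxy port -->
    <username>proxyuser</username>                  <!-- only if the proxy requires authentication -->
    <password>somepassword</password>
    <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
  </proxy>
</proxies>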

Apache Maven - Installation

Apache Maven is an innovative software project management tool that provides the new concept of a project object model (POM) file to manage a project's build, dependencies and documentation. Its most powerful feature is the ability to download project dependency libraries automatically. We will show you how to install Apache Maven 3 on Ubuntu 12.

Searching for the Maven Package
In a terminal, run apt-cache search maven to list all the available Maven packages. The maven package always comes with the latest Apache Maven.

$ apt-cache search maven
....
libxmlbeans-maven-plugin-java-doc - Documentation for Maven XMLBeans Plugin
maven - Java software project management and comprehension tool
maven-debian-helper - Helper tools for building Debian packages with Maven
maven2 - Java software project management and comprehension tool

Installing the Maven Package
Run the command below to install the latest Apache Maven.

$ sudo apt-get install maven

Verifying the Maven Installation
Run the command below to veri
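The verification step typically amounts to running the command below, which prints the installed Maven version together with the Java version and OS details; the exact output depends on your system:

$ mvn -version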

Compression in Hadoop

File compression brings two major benefits: it reduces the space needed to store files, and it speeds up data transfer across the network, or to or from disk. When dealing with large volumes of data, both of these savings can be significant, so it pays to carefully consider how to use compression in Hadoop.

Some of the compression formats used in Hadoop:

Compression Format   Tool    Algorithm   Filename Extension   Splittable
DEFLATE              NA      DEFLATE     .deflate             No
gzip                 gzip    DEFLATE     .gz                  No
bzip2                bzip2   bzip2       .bz2                 Yes
LZO                  lzop    LZO         .lzo                 No
Snappy               NA      Snappy      .snappy              No

Codecs
A codec is the implementation of a compression-decompression algorithm, and in Hadoop it is represented by an implementation of the CompressionCodec interface.

Compression Format   Hadoop CompressionCodec
DEFLATE              org.apac
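As a minimal sketch of how a codec is used through the CompressionCodec interface, the following program compresses standard input to standard output; it assumes the Hadoop client libraries are on the classpath and uses the gzip codec as an example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.util.ReflectionUtils;

// Compresses whatever arrives on standard input with the gzip codec
// and writes the compressed bytes to standard output.
public class StreamCompressor {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        CompressionCodec codec = ReflectionUtils.newInstance(GzipCodec.class, conf);
        CompressionOutputStream out = codec.createOutputStream(System.out);
        IOUtils.copyBytes(System.in, out, 4096, false);
        out.finish();   // flush the codec's buffers without closing System.out
    }
}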

The Hadoop Distributed Filesystem

Design of HDFS
HDFS is a filesystem designed for:
Very large files - files that are hundreds of MB, GB or TB in size. Hadoop clusters running today store petabytes of data.
Streaming data access - a write-once, read-many-times pattern.
Commodity hardware - Hadoop doesn't require expensive, highly reliable hardware to run on.

There are applications for which HDFS does not work so well. While this may change in the future, these are areas where HDFS is not a good fit today:
Low-latency data access
Lots of small files
Multiple writers, arbitrary file modifications

Blocks
HDFS has the concept of a block, but it is a much larger unit - 64 MB by default. HDFS blocks are large compared to disk blocks, and the reason is to minimize the cost of seeks. By making a block large enough, the time to transfer the data from the disk can be made significantly larger than the time to seek to the start of the block. Thus the time to transfer a large file made of multiple blocks opera
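To make the filesystem interface concrete, here is a minimal, hypothetical sketch that reads a file from HDFS through the FileSystem API and copies it to standard output; the class name and the HDFS URI passed on the command line are placeholders:

import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Usage: HdfsCat hdfs://<namenode>/<path-to-file>
public class HdfsCat {

    public static void main(String[] args) throws Exception {
        String uri = args[0];                       // e.g. hdfs://namenode:8020/user/data/sample.txt (placeholder)
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));            // the client reads the file block by block
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}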

Failed to load Main-Class manifest attribute from HelloWorld.jar - SOLVED

When I tried to run a JAR file using the command below in the command prompt,

java -jar HelloWorld.jar

I got an error like:

Failed to load Main-Class manifest attribute from HelloWorld.jar

This is due to a missing launch configuration. The Main-Class header needs to be in the manifest for the JAR file - this is metadata about things like the entry point and other required libraries. See the Sun documentation for how to create an appropriate manifest. Instead of remembering all the commands, I simply used Eclipse's JAR export wizard and chose the options below:
1. Choose the class that contains the main method.
2. Choose the destination of the JAR file.
3. Once steps one and two are done, click Finish.
Now run the same command from the command prompt:

java -jar HelloWorld.jar

This will not throw an
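If you prefer the command line over Eclipse, the export wizard essentially generates a manifest entry like the sketch below; HelloWorld is the assumed class name, and the manifest text file must end with a newline:

manifest.txt:
Main-Class: HelloWorld

$ jar cfm HelloWorld.jar manifest.txt HelloWorld.class
$ java -jar HelloWorld.jar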