File Compression Using Huffman Algorithm - 2003
Developed By : Mansuri Amin R. (MCA-20), Vaja Rajesh K. (MCA-59)
Guided By : Prof. H. N. Shah, Assistant Professor, MCA Department, DDU, Nadiad
Submitted To : Faculty of Management & Information Science, Faculty of Technology
Project Objective
Support the user in compressing and decompressing text files. Provide a better understanding of the Huffman file compression algorithm.
Project Scope
This is a study-based project that will develop and deliver a new application. The application lets the user reduce the size of any text file while preserving data accuracy, thereby saving disk space. It also gives anyone studying the Huffman coding algorithm a better understanding of how the technique can be used in real-life applications, and it makes the existing system more efficient.
It was developed by David A. Huffman while he was a Ph.D. student at MIT, and published in the 1952 paper "A Method for the Construction of Minimum-Redundancy Codes".
David A. Huffman
Huffman coding is an encoding algorithm used for lossless data compression. Lossless data compression is a class of data compression algorithms that allows the exact original data to be reconstructed from the compressed data. The term lossless is in contrast to lossy data compression, which only allows an approximation of the original data to be reconstructed, in exchange for better compression rates.
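As a hand-worked illustration (not taken from the project report): in the string "ABRACADABRA" the letter A occurs 5 times, B and R occur twice each, and C and D occur once each. One valid Huffman code is A = 0, B = 110, R = 111, C = 100, D = 101, so the whole string takes 5×1 + 2×3 + 2×3 + 1×3 + 1×3 = 23 bits instead of the 88 bits (11 characters × 8 bits) of plain 8-bit text, and because no codeword is a prefix of another, the original string can be decoded back exactly.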
Existing System
Existing system refers to the system that is followed at present. Its main disadvantage is that users depend on third-party software such as WinZip, WinRAR, Stuff, etc. The existing system requires more computational time and more manual calculation, and the complexity involved in the selection of features is high.
Deficiency of data accuracy.
Time consuming.
Users depend on third-party software such as WinZip, WinRAR, Stuff, etc.
Proposed System
The aim of the proposed system is to provide improved facilities and to eliminate or reduce the difficulties of the existing system to some extent. The proposed system performs file/folder compression and decompression based on the Huffman algorithm. It saves the user time and is user friendly, so file compression can be carried out without any time lag. The system is simple to design and implement, requires very low system resources, and will work on almost all configurations.
Ensure data accuracy and save disk space.
Minimum time needed for file compression.
The user need not depend on third-party software such as WinZip, WinRAR, Stuff, etc.
Use Case Diagram : Huffman System
Note : Actors are the User and the UI; use cases include entering text or a text file and calculating character frequency.
UI : First Screen
UI : Sample Text
Note : This screen displays the file that the user has chosen to open.
UI : Result Menu
Note : This screen displays all options of the result menu; here the user selects the Frequency of Character option.
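A frequency table like the one behind this option can be built in a single pass over the file. The Java sketch below is illustrative only; the class name FrequencyCounter and the file name sample.txt are assumptions, not the project's actual code.

```java
import java.io.FileReader;
import java.io.IOException;
import java.util.Map;
import java.util.TreeMap;

// Minimal sketch: count how often each character occurs in a text file.
public class FrequencyCounter {

    public static Map<Character, Integer> countFrequencies(String path) throws IOException {
        Map<Character, Integer> freq = new TreeMap<Character, Integer>();
        FileReader reader = new FileReader(path);
        int c;
        while ((c = reader.read()) != -1) {              // read one character at a time
            char ch = (char) c;
            Integer old = freq.get(ch);
            freq.put(ch, old == null ? 1 : old + 1);     // increment this character's count
        }
        reader.close();
        return freq;
    }

    public static void main(String[] args) throws IOException {
        Map<Character, Integer> freq = countFrequencies("sample.txt");  // example file name
        for (Map.Entry<Character, Integer> e : freq.entrySet()) {
            System.out.println(e.getKey() + " : " + e.getValue());
        }
    }
}
```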
Note : This screen displays the Huffman tree for the currently selected file.
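A Huffman tree like the one on this screen is built by repeatedly merging the two lowest-frequency nodes, which a priority queue (see the priority-queue reference in the Bibliography) keeps at the front. The sketch below is a minimal illustration; the Node and HuffmanTreeBuilder class names are assumptions, not the project's actual code.

```java
import java.util.Map;
import java.util.PriorityQueue;

// Minimal sketch of a Huffman tree node: a leaf holds a character, an internal node
// holds the combined frequency of its two children.
class Node implements Comparable<Node> {
    char ch;            // meaningful only for leaves
    int freq;
    Node left, right;

    Node(char ch, int freq) { this.ch = ch; this.freq = freq; }
    Node(Node left, Node right) {
        this.freq = left.freq + right.freq;
        this.left = left;
        this.right = right;
    }
    boolean isLeaf() { return left == null && right == null; }
    public int compareTo(Node other) { return this.freq - other.freq; }   // order by frequency
}

// Build the tree by always merging the two lowest-frequency nodes.
class HuffmanTreeBuilder {
    static Node build(Map<Character, Integer> freq) {
        PriorityQueue<Node> queue = new PriorityQueue<Node>();
        for (Map.Entry<Character, Integer> e : freq.entrySet()) {
            queue.add(new Node(e.getKey(), e.getValue()));   // one leaf per character
        }
        while (queue.size() > 1) {
            Node a = queue.poll();                           // two lowest-frequency nodes
            Node b = queue.poll();
            queue.add(new Node(a, b));                       // merge into an internal node
        }
        return queue.poll();                                 // root of the Huffman tree
    }
}
```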
UI : Codeword Screen
Note : This screen displays the Huffman codeword for each character in the currently selected file.
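The codewords on this screen follow from the tree: each left branch contributes a 0 and each right branch a 1 on the path from the root to a character's leaf. The sketch below reuses the Node class from the tree-building sketch above and is illustrative only.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: derive a codeword for every character by walking the Huffman tree.
class CodewordTable {
    static Map<Character, String> build(Node root) {
        Map<Character, String> codes = new HashMap<Character, String>();
        assign(root, "", codes);
        return codes;
    }

    private static void assign(Node node, String prefix, Map<Character, String> codes) {
        if (node == null) return;
        if (node.isLeaf()) {
            codes.put(node.ch, prefix.length() > 0 ? prefix : "0");  // single-character file edge case
        } else {
            assign(node.left, prefix + "0", codes);    // left branch appends 0
            assign(node.right, prefix + "1", codes);   // right branch appends 1
        }
    }
}
```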
UI : Encoding Screen
Note : On this screen the user selects Encoding for the currently selected file.
UI : Encoding Screen
Note : This screen displays the compressed content of the currently selected file.
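The compressed content shown here corresponds to replacing every character of the original text with its codeword and packing the resulting bits into bytes. The sketch below is a minimal illustration; the zero-padding of the final byte and the output file layout are assumptions, not the project's actual file format.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Map;

// Minimal sketch of the encoding step: codewords are concatenated and packed into bytes.
class HuffmanEncoder {
    static void encode(String text, Map<Character, String> codes, String outPath) throws IOException {
        StringBuilder bits = new StringBuilder();
        for (int i = 0; i < text.length(); i++) {
            bits.append(codes.get(text.charAt(i)));              // look up each character's codeword
        }
        FileOutputStream out = new FileOutputStream(outPath);
        int current = 0, count = 0;
        for (int i = 0; i < bits.length(); i++) {
            current = (current << 1) | (bits.charAt(i) - '0');   // shift the next bit in
            count++;
            if (count == 8) {                                    // a full byte is ready
                out.write(current);
                current = 0;
                count = 0;
            }
        }
        if (count > 0) {
            out.write(current << (8 - count));                   // pad the last byte with zero bits
        }
        out.close();
    }
}
```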
UI : Sample Screen
Note : This screen displays the sizes of the original and the compressed file.
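The two sizes on this screen can be read directly from the file system; one possible way to report them (file names are examples only, not the project's actual code) is sketched below.

```java
import java.io.File;

// Minimal sketch: compare the original and compressed file sizes.
class SizeReport {
    public static void main(String[] args) {
        long original = new File("sample.txt").length();     // example input file
        long compressed = new File("sample.huf").length();   // example compressed file
        System.out.println("Original size   : " + original + " bytes");
        System.out.println("Compressed size : " + compressed + " bytes");
        if (original > 0) {
            System.out.println("Space saved     : "
                    + (100.0 * (original - compressed) / original) + " %");
        }
    }
}
```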
Note : This screen displays the open dialog box for the Huffman tree file.
Note : This screen displays the open dialog box for the compressed file.
UI : Decompress Screen
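Decompression reverses the encoding step: the compressed bits are read one at a time and used to walk the Huffman tree from the root, emitting a character at every leaf. The sketch below reuses the Node class from the tree-building sketch and assumes the number of characters to decode is known (for example from a header); both are assumptions, not the project's actual file format.

```java
import java.io.FileInputStream;
import java.io.IOException;

// Minimal sketch of decompression: follow tree branches bit by bit until a leaf is reached.
class HuffmanDecoder {
    static String decode(String inPath, Node root, int charCount) throws IOException {
        StringBuilder text = new StringBuilder();
        FileInputStream in = new FileInputStream(inPath);
        Node node = root;
        int b;
        while (text.length() < charCount && (b = in.read()) != -1) {
            for (int i = 7; i >= 0 && text.length() < charCount; i--) {
                int bit = (b >> i) & 1;                      // extract bits from high to low
                node = (bit == 0) ? node.left : node.right;  // follow the branch for this bit
                if (node.isLeaf()) {
                    text.append(node.ch);                    // a full codeword has been read
                    node = root;                             // restart at the root
                }
            }
        }
        in.close();
        return text.toString();
    }
}
```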
With advancements in compression technology, it is now very easy and efficient to compress video files, and various video compression techniques are available. The most common video compression standard is MPEG (Moving Picture Experts Group)[31], a working group of ISO/IEC charged with developing video and audio encoding standards. An excellent follow-up to this project would be to investigate more sophisticated variants of the Huffman coding algorithm; such algorithms achieve much better compression ratios and are better able to compete with today's tools.
Bibliography
Books:
JAVA 2 Complete Reference (5th Edition), by Herbert Schildt
Core Java, Volume 1, by Horstmann and Cornell
Core Java, Volume 2, by Horstmann and Cornell
Web Sites:
http://en.wikipedia.org/wiki/Huffman_coding
http://www.cprogramming.com/tutorial/computersciencetheory/huffman.html
http://www.daniweb.com/software-development/c/code/216267
http://www.javaclass.info/generics/queue/priority-queue.php