Data Structures and Algorithms Made Easy in Java: Data Structure and Algorithmic Puzzles, 2nd Edition, Narasimha Karumanchi

The document provides a comprehensive overview of various data structures and algorithms, particularly focusing on Java and C programming languages. It includes multiple resources for downloading eBooks on data structures, algorithmic puzzles, and interview preparation. The content emphasizes problem-solving and analysis over theoretical concepts, making it suitable for students and professionals preparing for technical interviews and competitive exams.



Data Structures
And
Algorithms
Made Easy In JAVA
Data Structures and Algorithmic Puzzles

By
Narasimha Karumanchi

Concepts Problems Interview Questions


Copyright © 2020 CareerMonk Publications. All rights reserved.
Designed by Narasimha Karumanchi

No part of this book may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without written permission from the publisher or the author.

Every effort has been taken, with the consent of the author, to make the material in this book error-free. However, the author and the publisher do not assume, and hereby disclaim, any liability to any party for any loss, damage, or disruption caused by errors or omissions, whether such errors or omissions result from negligence, accident, or any other cause.

While every effort has been made to avoid any mistake or omission, this publication is sold on the condition and understanding that neither the author nor the publishers or printers shall be liable in any manner to any person for any mistake or omission in this publication, or for any action taken or omitted to be taken, or advice rendered or accepted, on the basis of this work. For any defect in printing or binding, the publishers will be liable only to replace the defective copy with another copy of this work then available.
Acknowledgements
Mother and Father, it is impossible to thank you adequately for everything you have done, from loving me unconditionally to raising me in a stable household, where your persistent efforts and traditional values taught your children to celebrate and embrace life. I could not have asked for better parents or role models. You showed me that anything is possible with faith, hard work and determination.

This book would not have been possible without the help of many people. I would like to express my gratitude to all of the people who
provided support, talked things over, read, wrote, offered comments, allowed me to quote their remarks and assisted in the editing,
proofreading and design. In particular, I would like to thank the following individuals:

 ℎ , IIT Bombay, Architect, dataRPM Pvt. Ltd.


 , Senior Consultant, Juniper Networks Inc.
 . ℎ ℎ , IIT Kanpur, Mentor Graphics Inc.

- Narasimha Karumanchi
M.Tech
Founder, CareerMonk Publications
Preface
Dear Reader,

Please hold on! I know many people typically do not read the Preface of a book. But I strongly recommend that you read this particular
Preface.

It is not the main objective of this book to present you with theorems and proofs about Data Structures and Algorithms. I have followed a pattern of improving the problem solutions with different complexities (for each problem, you will find multiple solutions with different, and reduced, complexities). Basically, it is an enumeration of possible solutions. With this approach, even if you get a new question, it will show you a way to think about the possible solutions. You will find this book useful for interview preparation, competitive exam preparation, and campus interview preparation.

As a job seeker, if you read the complete book, I am sure you will be able to challenge the interviewers. If you read it as an instructor, it will help you to deliver lectures with an approach that is easy to follow, and as a result your students will appreciate the fact that they have opted for Computer Science / Information Technology as their degree.

This book is also useful for engineering degree students and masters degree students during their academic preparations. In all the chapters you will see that there is more emphasis on problems and their analysis rather than on theory. In each chapter, you will first read about the basic required theory, which is then followed by a section on problem sets. In total, there are approximately 700 algorithmic problems, all with solutions.

If you read the book as a student preparing for competitive exams for Computer Science / Information Technology, the content covers all the required topics in full detail. While writing this book, my main focus was to help students who are preparing for these exams.

In all the chapters you will see more emphasis on problems and analysis rather than on theory. In each chapter, you will first see the basic
required theory followed by various problems.

For many problems, solutions are provided with different levels of complexity. We start with the brute force solution and slowly move toward the best solution possible for that problem. For each problem, we endeavor to understand how much time the algorithm takes and how much memory the algorithm uses.
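As a hypothetical illustration of this progression (the problem and class names below are ours, not taken from the book), consider checking whether an array contains a duplicate: a brute force pairwise scan runs in O(n^2) time with O(1) space, while a hash-based solution reduces time to O(n) at the cost of O(n) extra memory.

```java
import java.util.HashSet;
import java.util.Set;

public class DuplicateDemo {
    // Brute force: compare every pair of elements -> O(n^2) time, O(1) space.
    static boolean hasDuplicateBruteForce(int[] a) {
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) return true;
        return false;
    }

    // Improved: remember values already seen -> O(n) time, O(n) space.
    static boolean hasDuplicateHashing(int[] a) {
        Set<Integer> seen = new HashSet<>();
        for (int x : a)
            if (!seen.add(x)) return true;  // add() returns false if x was already present
        return false;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 4, 1, 5};
        System.out.println(hasDuplicateBruteForce(a)); // true
        System.out.println(hasDuplicateHashing(a));    // true
    }
}
```

Both methods answer the same question; which trade-off is better depends on how much memory is available, which is exactly the kind of analysis this book walks through for each problem.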

It is recommended that the reader does at least one complete reading of this book to gain a full understanding of all the topics that are covered. Then, in subsequent readings you can skip directly to any chapter to refer to a specific topic. Even though many readings have been done to correct errors, there could still be some minor typos in the book. If any are found, they will be updated on the CareerMonk Publications website. You can monitor this site for corrections and also for new problems and solutions. Also, please share your valuable suggestions via the contact address given there.

I wish you all the best and I am confident that you will find this book useful.

- Narasimha Karumanchi
M.Tech
Founder, CareerMonk Publications
Other Books by Narasimha Karumanchi

IT Interview Questions
Elements of Computer Networking
Data Structures and Algorithmic Thinking with Python
Data Structures and Algorithms Made Easy (C/C++)
Coding Interview Questions
Data Structures and Algorithms for GATE
Peeling Design Patterns
Algorithm Design Techniques
Data Structure Operations Cheat Sheet

                                 ----------- Average Case Time -----------   ------------ Worst Case Time ------------
Data Structure Name              Access    Search    Insertion  Deletion     Access    Search    Insertion  Deletion     Space (Worst)
Arrays                           O(1)      O(n)      O(n)       O(n)         O(1)      O(n)      O(n)       O(n)         O(n)
Stacks                           O(n)      O(n)      O(1)       O(1)         O(n)      O(n)      O(1)       O(1)         O(n)
Queues                           O(n)      O(n)      O(1)       O(1)         O(n)      O(n)      O(1)       O(1)         O(n)
Binary Trees                     O(n)      O(n)      O(n)       O(n)         O(n)      O(n)      O(n)       O(n)         O(n)
Binary Search Trees              O(log n)  O(log n)  O(log n)   O(log n)     O(n)      O(n)      O(n)       O(n)         O(n)
Balanced Binary Search Trees     O(log n)  O(log n)  O(log n)   O(log n)     O(log n)  O(log n)  O(log n)   O(log n)     O(n)
Hash Tables                      N/A       O(1)      O(1)       O(1)         N/A       O(n)      O(n)       O(n)         O(n)

Note: For best case operations, the time complexities are O(1).
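As a small, hedged illustration of the operations table above (the class and method names are ours, not from the book), the Java sketch below contrasts O(1) stack push/pop with O(n) linear search over an unsorted array:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class CheatSheetDemo {
    // Linear search over an unsorted array: O(n) worst case,
    // O(1) best case (target found at index 0). Returns -1 if absent.
    static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++)
            if (a[i] == target) return i;
        return -1;
    }

    public static void main(String[] args) {
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(10);                       // O(1)
        stack.push(20);                       // O(1)
        System.out.println(stack.pop());      // 20 (LIFO: last in, first out)

        System.out.println(linearSearch(new int[]{4, 8, 15, 16}, 15)); // 2
    }
}
```

Searching a stack, by contrast, would require popping through its elements, which is why the table lists search on stacks and queues as O(n).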

Sorting Algorithms Cheat Sheet

Algorithm Name   Best Case    Average Case  Worst Case   Space (Worst)  Stable?  Class       Remarks
Bubble Sort      O(n)         O(n^2)        O(n^2)       O(1)           Yes      Comparison  Not a preferred sorting algorithm.
Insertion Sort   O(n)         O(n^2)        O(n^2)       O(1)           Yes      Comparison  In the best case (already sorted), every insert requires constant time.
Selection Sort   O(n^2)       O(n^2)        O(n^2)       O(1)           No       Comparison  Even a perfectly sorted array requires scanning the entire array.
Merge Sort       O(n log n)   O(n log n)    O(n log n)   O(n)           Yes      Comparison  On arrays, it requires O(n) extra space; on linked lists, it requires constant space.
Heap Sort        O(n log n)   O(n log n)    O(n log n)   O(1)           No       Comparison  By using the input array as storage for the heap, it is possible to achieve constant space.
Quick Sort       O(n log n)   O(n log n)    O(n^2)       O(log n)       No       Comparison  Randomly picking a pivot value helps avoid worst-case scenarios such as a perfectly sorted array.
Tree Sort        O(n log n)   O(n log n)    O(n log n)   O(n)           Yes      Comparison  Performs an inorder traversal of a balanced binary search tree.
Counting Sort    O(n + k)     O(n + k)      O(n + k)     O(k)           Yes      Linear      k is the range of the non-negative key values.
Bucket Sort      O(n + k)     O(n + k)      O(n^2)       O(n)           Yes      Linear      Bucket sort is stable if the underlying sorting algorithm is stable.
Radix Sort       O(nk)        O(nk)         O(nk)        O(n + k)       Yes      Linear      Radix sort is stable if the underlying sorting algorithm is stable.
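To connect the sorting table to code, here is a minimal insertion sort sketch in Java (our own illustration, not the book's implementation). The strict `>` comparison never moves equal keys past each other, which is what makes the sort stable, and on already-sorted input the inner loop never runs, giving the O(n) best case.

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Insertion sort: O(n^2) worst case, O(n) best case (sorted input),
    // O(1) extra space, stable (equal elements keep their relative order).
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {  // shift strictly larger elements right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;                 // drop the key into its slot
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5};
        insertionSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 5, 5, 9]
    }
}
```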
Table of Contents
1. Introduction -------------------------------------------------------------------------------------------------------------------------------------------------------------------- 15
1.1 Variables ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 15
1.2 Data Types --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 15
1.3 Data Structure ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 16
1.4 Abstract Data Types (ADTs) ------------------------------------------------------------------------------------------------------------------------------------------------- 16
1.5 What is an Algorithm? ---------------------------------------------------------------------------------------------------------------------------------------------------------- 16
1.6 Why the Analysis of Algorithms? ------------------------------------------------------------------------------------------------------------------------------------------- 16
1.7 Goal of the Analysis of Algorithms ----------------------------------------------------------------------------------------------------------------------------------------- 17
1.8 What is Running Time Analysis? ------------------------------------------------------------------------------------------------------------------------------------------- 17
1.9 How to Compare Algorithms------------------------------------------------------------------------------------------------------------------------------------------------- 17
1.10 What is Rate of Growth? ----------------------------------------------------------------------------------------------------------------------------------------------------- 17
1.11 Commonly Used Rates of Growth --------------------------------------------------------------------------------------------------------------------------------------- 17
1.12 Types of Analysis ---------------------------------------------------------------------------------------------------------------------------------------------------------------- 18
1.13 Asymptotic Notation ----------------------------------------------------------------------------------------------------------------------------------------------------------- 19
1.14 Big-O Notation ------------------------------------------------------------------------------------------------------------------------------------------------------------------- 19
1.15 Omega-Ω Notation [Lower Bounding Function] ------------------------------------------------------------------------------------------------------------------- 20
1.16 Theta-Θ Notation---------------------------------------------------------------------------------------------------------------------------------------- 20
1.17 Why is it called Asymptotic Analysis?----------------------------------------------------------------------------------------------------------------------------------- 21
1.18 Guidelines for Asymptotic Analysis -------------------------------------------------------------------------------------------------------------------------------------- 21
1.20 Simplifying properties of asymptotic notations ---------------------------------------------------------------------------------------------------------------------- 22
1.21 Commonly used Logarithms and Summations ---------------------------------------------------------------------------------------------------------------------- 22
1.22 Master Theorem for Divide and Conquer Recurrences --------------------------------------------------------------------------------------------------------- 23
1.23 Divide and Conquer Master Theorem: Problems & Solutions ----------------------------------------------------------------------------------------------- 23
1.24 Master Theorem for Subtract and Conquer Recurrences------------------------------------------------------------------------------------------------------- 24
1.25 Variant of Subtraction and Conquer Master Theorem----------------------------------------------------------------------------------------------------------- 24
1.26 Method of Guessing and Confirming ----------------------------------------------------------------------------------------------------------------------------------- 24
1.27 Amortized Analysis ------------------------------------------------------------------------------------------------------------------------------------------------------------- 26
1.28 Algorithms Analysis: Problems & Solutions -------------------------------------------------------------------------------------------------------------------------- 26
2. Recursion and Backtracking --------------------------------------------------------------------------------------------------------------------------------------------- 36
2.1 Introduction ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 36
2.2 What is Recursion?--------------------------------------------------------------------------------------------------------------------------------------------------------------- 36
2.3 Why Recursion? ------------------------------------------------------------------------------------------------------------------------------------------------------------------- 36
2.4 Format of a Recursive Function --------------------------------------------------------------------------------------------------------------------------------------------- 36
2.5 Recursion and Memory (Visualization)----------------------------------------------------------------------------------------------------------------------------------- 37
2.6 Recursion versus Iteration ----------------------------------------------------------------------------------------------------------------------------------------------------- 37
2.7 Notes on Recursion -------------------------------------------------------------------------------------------------------------------------------------------------------------- 38
2.8 Example Algorithms of Recursion------------------------------------------------------------------------------------------------------------------------------------------ 38
2.9 Recursion: Problems & Solutions------------------------------------------------------------------------------------------------------------------------------------------- 38
2.10 What is Backtracking?--------------------------------------------------------------------------------------------------------------------------------------------------------- 39
2.11 Example Algorithms of Backtracking ----------------------------------------------------------------------------------------------------------------------------------- 39
2.12 Backtracking: Problems & Solutions------------------------------------------------------------------------------------------------------------------------------------- 39
3. Linked Lists--------------------------------------------------------------------------------------------------------------------------------------------------------------------- 41
3.1 What is a Linked List? ---------------------------------------------------------------------------------------------------------------------------------------------------------- 41
3.2 Linked Lists ADT----------------------------------------------------------------------------------------------------------------------------------------------------------------- 41
3.3 Why Linked Lists? --------------------------------------------------------------------------------------------------------------------------------------------------------------- 41
3.4 Arrays Overview ------------------------------------------------------------------------------------------------------------------------------------------------------------------- 41
3.5 Comparison of Linked Lists with Arrays & Dynamic Arrays ---------------------------------------------------------------------------------------------------- 42
3.6 Singly Linked Lists ---------------------------------------------------------------------------------------------------------------------------------------------------------------- 42
3.7 Doubly Linked Lists ------------------------------------------------------------------------------------------------------------------------------------------------------------- 48
3.8 Circular Linked Lists------------------------------------------------------------------------------------------------------------------------------------------------------------- 54
3.9 A Memory-efficient Doubly Linked List--------------------------------------------------------------------------------------------------------------------------------- 59
3.10 Unrolled Linked Lists --------------------------------------------------------------------------------------------------------------------------------------------------------- 59
3.11 Skip Lists---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 67
3.11 Linked Lists: Problems & Solutions ------------------------------------------------------------------------------------------------------------------------------------- 70
4. Stacks ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 91
4.1 What is a Stack? ------------------------------------------------------------------------------------------------------------------------------------------------------------------- 91
4.2 How Stacks are used ------------------------------------------------------------------------------------------------------------------------------------------------------------- 91
4.3 Stack ADT --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 91
4.4 Applications ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 92
4.5 Implementation -------------------------------------------------------------------------------------------------------------------------------------------------------------------- 92
4.6 Comparison of Implementations ------------------------------------------------------------------------------------------------------------------------------------------- 97
4.8 Stacks: Problems & Solutions ------------------------------------------------------------------------------------------------------------------------------------------------ 97
5. Queues ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 115
5.1 What is a Queue? --------------------------------------------------------------------------------------------------------------------------------------------------------------- 115
5.2 How are Queues Used? ------------------------------------------------------------------------------------------------------------------------------------------------------ 115
5.3 Queue ADT----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 115
5.4 Exceptions-------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 115
5.5 Applications ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 115
5.6 Implementation ------------------------------------------------------------------------------------------------------------------------------------------------------------------ 116
5.7 Queues: Problems & Solutions -------------------------------------------------------------------------------------------------------------------------------------------- 121
6. Trees ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 125
6.1 What is a Tree? ------------------------------------------------------------------------------------------------------------------------------------------------------------------ 125
6.2 Glossary ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 125
6.3 Binary Trees----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 126
6.4 Types of Binary Trees--------------------------------------------------------------------------------------------------------------------------------------------------------- 126
6.5 Properties of Binary Trees--------------------------------------------------------------------------------------------------------------------------------------------------- 126
6.4 Binary Tree Traversals -------------------------------------------------------------------------------------------------------------------------------------------------------- 128
6.5 Generic Trees (N-ary Trees)------------------------------------------------------------------------------------------------------------------------------------------------ 149
6.6 Threaded (Stack or Queue less) Binary Tree Traversals-------------------------------------------------------------------------------------------------------- 154
6.7 Expression Trees ---------------------------------------------------------------------------------------------------------------------------------------------------------------- 158
6.10 XOR Trees ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 160
6.9 Binary Search Trees (BSTs) ------------------------------------------------------------------------------------------------------------------------------------------------ 161
6.10 Balanced Binary Search Trees ------------------------------------------------------------------------------------------------------------------------------------------- 172
6.11 AVL (Adelson-Velskii and Landis) Trees --------------------------------------------------------------------------------------------------------------------------- 172
6.12 Other Variations on Trees ------------------------------------------------------------------------------------------------------------------------------------------------- 183
6.13 Supplementary Questions-------------------------------------------------------------------------------------------------------------------------------------------------- 186
7. Priority Queues and Heaps -------------------------------------------------------------------------------------------------------------------------------------------- 187
7.1 What is a Priority Queue? --------------------------------------------------------------------------------------------------------------------------------------------------- 187
7.2 Priority Queue ADT----------------------------------------------------------------------------------------------------------------------------------------------------------- 187
7.3 Priority Queue Applications ------------------------------------------------------------------------------------------------------------------------------------------------ 187
7.4 Priority Queue Implementations------------------------------------------------------------------------------------------------------------------------------------------ 187
7.5 Heaps and Binary Heaps----------------------------------------------------------------------------------------------------------------------------------------------------- 188
7.6 Binary Heaps---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 189
7.7 Priority Queues [Heaps]: Problems & Solutions ------------------------------------------------------------------------------------------------------------------- 193
8. Disjoint Sets ADT --------------------------------------------------------------------------------------------------------------------------------------------------------- 203
8.1 Introduction ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 203
8.2 Equivalence Relations and Equivalence Classes -------------------------------------------------------------------------------------------------------------------- 203
8.3 Disjoint Sets ADT -------------------------------------------------------------------------------------------------------------------------------------------------------------- 203
8.4 Applications ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 203
8.5 Tradeoffs in Implementing Disjoint Sets ADT --------------------------------------------------------------------------------------------------------------------- 204
8.8 Fast UNION Implementation (Slow FIND) ------------------------------------------------------------------------------------------------------------------------- 204
8.7 Fast UNION Implementations (Quick FIND)---------------------------------------------------------------------------------------------------------------------- 206
8.8 Path Compression--------------------------------------------------------------------------------------------------------------------------------------------------------------- 207
8.9 Summary---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 208
8.10 Disjoint Sets: Problems & Solutions ----------------------------------------------------------------------------------------------------------------------------------- 208
9. Graph Algorithms ---------------------------------------------------------------------------------------------------------------------------------------------------------- 210
9.1 Introduction ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 210
9.2 Glossary ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 210
9.3 Applications of Graphs-------------------------------------------------------------------------------------------------------------------------------------------------------- 212
9.4 Graph Representation --------------------------------------------------------------------------------------------------------------------------------------------------------- 212
9.5 Graph Traversals ---------------------------------------------------------------------------------------------------------------------------------------------------------------- 217
9.6 Topological Sort ----------------------------------------------------------------------------------------------------------------------------------------------------------------- 224
9.7 Shortest Path Algorithms ----------------------------------------------------------------------------------------------------------------------------------------------------- 226
9.8 Minimal Spanning Tree ------------------------------------------------------------------------------------------------------------------------------------------------------ 232
9.9 Graph Algorithms: Problems & Solutions----------------------------------------------------------------------------------------------------------------------------- 235
10. Sorting -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 249
10.1 What is Sorting? --------------------------------------------------------------------------------------------------------------------------------------------------------------- 249
10.2 Why is Sorting Necessary? ------------------------------------------------------------------------------------------------------------------------------------------------ 249
10.3 Classification of Sorting Algorithms ------------------------------------------------------------------------------------------------------------------------------------ 249
10.4 Other Classifications --------------------------------------------------------------------------------------------------------------------------------------------------------- 249
10.5 Bubble Sort ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 250
10.6 Selection Sort-------------------------------------------------------------------------------------------------------------------------------------------------------------------- 251
10.7 Insertion Sort-------------------------------------------------------------------------------------------------------------------------------------------------------------------- 252
10.8 Shell Sort-------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 254
10.9 Merge Sort------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 254
10.10 Heap Sort ----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 257
10.11 Quick Sort ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 257
10.12 Tree Sort ------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 261
10.13 Comparison of Sorting Algorithms ----------------------------------------------------------------------------------------------------------------------------------- 261
10.14 Linear Sorting Algorithms------------------------------------------------------------------------------------------------------------------------------------------------ 261
10.15 Counting Sort------------------------------------------------------------------------------------------------------------------------------------------------------------------ 261
10.16 Bucket sort (Bin Sort)------------------------------------------------------------------------------------------------------------------------------------------------------ 262
10.17 Radix Sort----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 262
10.18 Topological Sort ------------------------------------------------------------------------------------------------------------------------------------------------------------- 263
10.19 External Sorting -------------------------------------------------------------------------------------------------------------------------------------------------------------- 263
10.20 Sorting: Problems & Solutions ----------------------------------------------------------------------------------------------------------------------------------------- 264
11. Searching----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 274
11.1 What is Searching? ----------------------------------------------------------------------------------------------------------------------------------------------------------- 274
11.2 Why do we need Searching?---------------------------------------------------------------------------------------------------------------------------------------------- 274
11.3 Types of Searching------------------------------------------------------------------------------------------------------------------------------------------------------------ 274
11.4 Unordered Linear Search -------------------------------------------------------------------------------------------------------------------------------------------------- 274
11.5 Sorted/Ordered Linear Search------------------------------------------------------------------------------------------------------------------------------------------- 274
11.6 Binary Search ------------------------------------------------------------------------------------------------------------------------------------------------------------------- 275
11.7 Interpolation Search---------------------------------------------------------------------------------------------------------------------------------------------------------- 275
11.8 Comparing Basic Searching Algorithms ------------------------------------------------------------------------------------------------------------------------------ 276
11.9 Symbol Tables and Hashing -------------------------------------------------------------------------------------------------------------------------------------------- 277
11.10 String Searching Algorithms ------------------------------------------------------------------------------------------------------------------------------------------ 277
11.11 Searching: Problems & Solutions-------------------------------------------------------------------------------------------------------------------------------------- 277
12. Selection Algorithms [Medians] -------------------------------------------------------------------------------------------------------------------------------------- 297
12.1 What are Selection Algorithms?----------------------------------------------------------------------------------------------------------------------------------------- 297
12.2 Selection by Sorting----------------------------------------------------------------------------------------------------------------------------------------------------------- 297
12.3 Partition-based Selection Algorithm ----------------------------------------------------------------------------------------------------------------------------------- 297
12.4 Linear Selection Algorithm - Median of Medians Algorithm ------------------------------------------------------------------------------------------------ 297
12.5 Finding the K Smallest Elements in Sorted Order --------------------------------------------------------------------------------------------------------------- 297
12.6 Selection Algorithms: Problems & Solutions ----------------------------------------------------------------------------------------------------------------------- 297
13. Symbol Tables --------------------------------------------------------------------------------------------------------------------------------------------------------------- 305
13.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 305
13.2 What are Symbol Tables? ------------------------------------------------------------------------------------------------------------------------------------------------- 305
13.3 Symbol Table Implementations ----------------------------------------------------------------------------------------------------------------------------------------- 305
13.4 Comparison Table of Symbols for Implementations ----------------------------------------------------------------------------------------------------------- 306
14. Hashing ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 307
14.1 What is Hashing?-------------------------------------------------------------------------------------------------------------------------------------------------------------- 307
14.2 Why Hashing?------------------------------------------------------------------------------------------------------------------------------------------------------------------ 307
14.3 Hash Table ADT-------------------------------------------------------------------------------------------------------------------------------------------------------------- 307
14.4 Understanding Hashing ----------------------------------------------------------------------------------------------------------------------------------------------------- 307
14.5 Components of Hashing ---------------------------------------------------------------------------------------------------------------------------------------------------- 308
14.6 Hash Table----------------------------------------------------------------------------------------------------------------------------------------------------------------------- 308
14.7 Hash Function ------------------------------------------------------------------------------------------------------------------------------------------------------------------ 309
14.8 Load Factor ---------------------------------------------------------------------------------------------------------------------------------------------------------------------- 309
14.9 Collisions-------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 310
14.10 Collision Resolution Techniques-------------------------------------------------------------------------------------------------------------------------------------- 310
14.11 Separate Chaining ----------------------------------------------------------------------------------------------------------------------------------------------------------- 310
14.12 Open Addressing ------------------------------------------------------------------------------------------------------------------------------------------------------------ 311
14.13 Comparison of Collision Resolution Techniques -------------------------------------------------------------------------------------------------------------- 312
14.14 How Hashing Gets O(1) Complexity -------------------------------------------------------------------------------------------------------------------------------- 312
14.15 Hashing Techniques-------------------------------------------------------------------------------------------------------------------------------------------------------- 312
14.16 Problems for which Hash Tables are not suitable -------------------------------------------------------------------------------------------------------------- 312
14.17 Bloom Filters ------------------------------------------------------------------------------------------------------------------------------------------------------------------ 312
14.18 Hashing: Problems & Solutions---------------------------------------------------------------------------------------------------------------------------------------- 314
15. String Algorithms ----------------------------------------------------------------------------------------------------------------------------------------------------------- 322
15.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 322
15.2 String Matching Algorithms ----------------------------------------------------------------------------------------------------------------------------------------------- 322
15.3 Brute Force Method --------------------------------------------------------------------------------------------------------------------------------------------------------- 322
15.4 Rabin-Karp String Matching Algorithm ------------------------------------------------------------------------------------------------------------------------------ 323
15.5 String Matching with Finite Automata--------------------------------------------------------------------------------------------------------------------------------- 323
15.6 KMP Algorithm ---------------------------------------------------------------------------------------------------------------------------------------------------------------- 324
15.7 Boyer-Moore Algorithm ---------------------------------------------------------------------------------------------------------------------------------------------------- 326
15.8 Data Structures for Storing Strings-------------------------------------------------------------------------------------------------------------------------------------- 327
15.9 Hash Tables for Strings ----------------------------------------------------------------------------------------------------------------------------------------------------- 327
15.10 Binary Search Trees for Strings ---------------------------------------------------------------------------------------------------------------------------------------- 327
15.11 Tries------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 327
15.12 Ternary Search Trees ------------------------------------------------------------------------------------------------------------------------------------------------------ 329
15.13 Comparing BSTs, Tries and TSTs ----------------------------------------------------------------------------------------------------------------------------------- 332
15.14 Suffix Trees -------------------------------------------------------------------------------------------------------------------------------------------------------------------- 332
15.15 String Algorithms: Problems & Solutions -------------------------------------------------------------------------------------------------------------------------- 334
16. Algorithms Design Techniques --------------------------------------------------------------------------------------------------------------------------------------- 344
16.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 344
16.2 Classification--------------------------------------------------------------------------------------------------------------------------------------------------------------------- 344
16.3 Classification by Implementation Method--------------------------------------------------------------------------------------------------------------------------- 344
16.4 Classification by Design Method ---------------------------------------------------------------------------------------------------------------------------------------- 345
16.5 Other Classifications --------------------------------------------------------------------------------------------------------------------------------------------------------- 345
17. Greedy Algorithms--------------------------------------------------------------------------------------------------------------------------------------------------------- 346
17.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 346
17.2 Greedy Strategy----------------------------------------------------------------------------------------------------------------------------------------------------------------- 346
17.3 Elements of Greedy Algorithms ----------------------------------------------------------------------------------------------------------------------------------------- 346
17.4 Does Greedy Always Work? --------------------------------------------------------------------------------------------------------------------------------------------- 346
17.5 Advantages and Disadvantages of Greedy Method -------------------------------------------------------------------------------------------------------------- 346
17.6 Greedy Applications---------------------------------------------------------------------------------------------------------------------------------------------------------- 346
17.7 Understanding Greedy Technique ------------------------------------------------------------------------------------------------------------------------------------- 347
17.8 Greedy Algorithms: Problems & Solutions ------------------------------------------------------------------------------------------------------------------------- 349
18. Divide and Conquer Algorithms ------------------------------------------------------------------------------------------------------------------------------------- 354
18.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 354
18.2 What is the Divide and Conquer Strategy? ------------------------------------------------------------------------------------------------------------------------- 354
18.3 Does Divide and Conquer Always Work? -------------------------------------------------------------------------------------------------------------------------- 354
18.4 Divide and Conquer Visualization -------------------------------------------------------------------------------------------------------------------------------------- 354
18.5 Understanding Divide and Conquer----------------------------------------------------------------------------------------------------------------------------------- 355
18.6 Advantages of Divide and Conquer ------------------------------------------------------------------------------------------------------------------------------------ 355
18.7 Disadvantages of Divide and Conquer -------------------------------------------------------------------------------------------------------------------------------- 355
18.8 Master Theorem --------------------------------------------------------------------------------------------------------------------------------------------------------------- 355
18.9 Divide and Conquer Applications -------------------------------------------------------------------------------------------------------------------------------------- 356
18.10 Divide and Conquer: Problems & Solutions --------------------------------------------------------------------------------------------------------------------- 356
19. Dynamic Programming -------------------------------------------------------------------------------------------------------------------------------------------------- 368
19.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 368
19.2 What is Dynamic Programming Strategy?--------------------------------------------------------------------------------------------------------------------------- 368
19.3 Properties of Dynamic Programming Strategy --------------------------------------------------------------------------------------------------------------------- 368
19.4 Greedy vs Divide and Conquer vs DP -------------------------------------------------------------------------------------------------------------------------------- 368
19.5 Can DP solve all problems?----------------------------------------------------------------------------------------------------------------------------------------------- 369
19.6 Dynamic Programming Approaches----------------------------------------------------------------------------------------------------------------------------------- 369
19.7 Understanding DP Approaches ----------------------------------------------------------------------------------------------------------------------------------------- 369
19.8 Examples of DP Algorithms ---------------------------------------------------------------------------------------------------------------------------------------------- 372
19.9 Longest Common Subsequence----------------------------------------------------------------------------------------------------------------------------------------- 372
19.10 Dynamic Programming: Problems & Solutions ----------------------------------------------------------------------------------------------------------------- 374
20. Complexity Classes -------------------------------------------------------------------------------------------------------------------------------------------------------- 396
20.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 396
20.2 Polynomial/Exponential Time ------------------------------------------------------------------------------------------------------------------------------------------- 396
20.3 What is a Decision Problem? -------------------------------------------------------------------------------------------------------------------------------------------- 396
20.4 Decision Procedure----------------------------------------------------------------------------------------------------------------------------------------------------------- 396
20.5 What is a Complexity Class?---------------------------------------------------------------------------------------------------------------------------------------------- 396
20.6 Types of Complexity Classes --------------------------------------------------------------------------------------------------------------------------------------------- 397
20.7 Reductions------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 398
20.8 Complexity Classes: Problems & Solutions ------------------------------------------------------------------------------------------------------------------------- 400
21. Miscellaneous Concepts ------------------------------------------------------------------------------------------------------------------------------------------------- 402
21.1 Introduction --------------------------------------------------------------------------------------------------------------------------------------------------------------------- 402
21.2 Hacks on Bitwise Programming ----------------------------------------------------------------------------------------------------------------------------------------- 402
21.3 Other Programming Questions ------------------------------------------------------------------------------------------------------------------------------------------ 405
References ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 412
Data Structures and Algorithms Made Easy in Java Introduction

Chapter 1

INTRODUCTION
The objective of this chapter is to explain the importance of the analysis of algorithms, their notations, relationships and solving as many
problems as possible. Let us first focus on understanding the basic elements of algorithms, the importance of algorithm analysis, and then
slowly move toward the other topics as mentioned above. After completing this chapter, you should be able to find the complexity of any
given algorithm (especially recursive functions).

1.1 Variables
Before getting in to the definition of variables, let us relate them to an old mathematical equation. Many of us would have solved many
mathematical equations since childhood. As an example, consider the equation below:
x + 2y − 2 = 1
We don’t have to worry about the use of this equation. The important thing that we need to understand is that the equation has names (x and y), which hold values (data). That means the names (x and y) are placeholders for representing data. Similarly, in computer science programming we need something for holding data, and variables is the way to do that.

1.2 Data Types


In the above-mentioned equation, the variables x and y can take any values such as integral numbers (10, 20), real numbers (0.23, 5.5), or just 0 and 1. To solve the equation, we need to relate them to the kind of values they can take, and data type is the name used in computer science programming for this purpose. A data type in a programming language is a set of data with predefined values. Examples of data types are: integer, floating point, character, string, etc.
Computer memory is all filled with zeros and ones. If we have a problem and we want to code it, it’s very difficult to provide the solution in terms of zeros and ones. To help users, programming languages and compilers provide us with data types. For example, an integer takes 2 bytes (actual value depends on compiler), a float takes 4 bytes, etc. This says that in memory we are combining 2 bytes (16 bits) and calling it an integer. Similarly, combining 4 bytes (32 bits) and calling it a float. A data type reduces the coding effort. At the top level, there are two types of data types:
•	System-defined data types (also called primitive data types)
•	User-defined data types

System-defined data types (Primitive data types)


Data types that are defined by the system are called primitive data types. The primitive data types provided by many programming languages are: int, float, char, double, bool, etc. The number of bits allocated for each primitive data type depends on the programming language, the compiler and the operating system. For the same primitive data type, different languages may use different sizes. Depending on the size of the data types, the total available values (domain) will also change. For example, “int” may take 2 bytes or 4 bytes. If it takes 2 bytes (16 bits), then the total possible values are minus 32,768 to plus 32,767 (-2^15 to 2^15 - 1). If it takes 4 bytes (32 bits), then the possible values are between −2,147,483,648 and +2,147,483,647 (-2^31 to 2^31 - 1). The same is the case with other data types.
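The ranges above are easy to verify in Java, where (unlike C) primitive sizes are fixed by the language specification: a short is always 16 bits and an int always 32 bits, regardless of compiler or platform. A minimal sketch:

```java
public class PrimitiveRanges {
    public static void main(String[] args) {
        // Java fixes short at 16 bits and int at 32 bits,
        // so these bounds are the same on every platform.
        System.out.println(Short.MIN_VALUE);   // -32768, i.e. -2^15
        System.out.println(Short.MAX_VALUE);   //  32767, i.e.  2^15 - 1
        System.out.println(Integer.MIN_VALUE); // -2147483648, i.e. -2^31
        System.out.println(Integer.MAX_VALUE); //  2147483647, i.e.  2^31 - 1
    }
}
```

Because the sizes are fixed, Java code that depends on these bounds behaves identically across compilers and operating systems.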

User-defined data types


If the system-defined data types are not enough, then most programming languages allow the users to define their own data types, called user-defined data types. Good examples of user-defined data types are: structures in C/C++ and classes in Java. For example, in the snippet below, we are combining many system-defined data types and calling the user-defined data type by the name “newType”. This gives more flexibility and comfort in dealing with computer memory.
public class newType {
    public int data1;
    public int data2;
    private float data3;
    private char data;
    //Operations
}
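A short, hypothetical usage sketch of such a user-defined type. The class is redeclared here as a nested class with all fields public, purely so the demo can set them directly; the field names mirror the snippet above:

```java
public class NewTypeDemo {
    // Mirrors the user-defined type from the text; all fields are made
    // public here only so the demo can assign them without accessors.
    static class NewType {
        public int data1;
        public int data2;
        public float data3;
        public char data;
    }

    public static void main(String[] args) {
        NewType t = new NewType(); // one variable now holds four values
        t.data1 = 10;
        t.data2 = 20;
        t.data3 = 0.23f;
        t.data = 'x';
        System.out.println(t.data1 + t.data2); // prints 30
    }
}
```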

1.3 Data Structure


Based on the discussion above, once we have data in variables, we need some mechanism for manipulating that data to solve problems. Data structure is a particular way of storing and organizing data in a computer so that it can be used efficiently. A data structure is a special format for organizing and storing data. General data structure types include arrays, files, linked lists, stacks, queues, trees, graphs and so on.
Depending on the organization of the elements, data structures are classified into two types:
1) Linear data structures: Elements are accessed in a sequential order but it is not compulsory to store all elements sequentially (say, Linked Lists). Examples: Linked Lists, Stacks and Queues.
2) Non-linear data structures: Elements of this data structure are stored/accessed in a non-linear order. Examples: Trees and graphs.
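As an illustration of a linear data structure, java.util.LinkedList stores elements that are reached one after another, even though its nodes need not sit next to each other in memory (a minimal sketch):

```java
import java.util.LinkedList;
import java.util.List;

public class LinearAccess {
    public static void main(String[] args) {
        // A linked list is linear: elements have a sequential order,
        // but the nodes themselves can live anywhere in memory.
        List<String> names = new LinkedList<>();
        names.add("stack");
        names.add("queue");
        names.add("list");
        for (String n : names) {
            System.out.println(n); // visits elements one after another
        }
    }
}
```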

1.4 Abstract Data Types (ADTs)


Before defining abstract data types, let us consider the different view of system-defined data types. We all know that, by default, all primitive
data types (int, float, etc.) support basic operations such as addition and subtraction. The system provides the implementations for the primitive
data types. For user-defined data types we also need to define operations. The implementation for these operations can be done when we
want to actually use them. That means, in general, user defined data types are defined along with their operations.
To simplify the process of solving problems, we combine the data structures with their operations and we call this abstract data types (ADTs). An ADT consists of two parts:
1. Declaration of data
2. Declaration of operations
Commonly used ADTs include: Linked Lists, Stacks, Queues, Priority Queues, Binary Trees, Dictionaries, Disjoint Sets (Union and Find), Hash Tables, Graphs, and many others. For example, a stack uses a LIFO (Last-In-First-Out) mechanism while storing the data. The last element inserted into the stack is the first element that gets deleted. Common operations are: creating the stack, pushing an element onto the stack, popping an element from the stack, finding the current top of the stack, finding the number of elements in the stack, etc.
While defining the ADTs, do not worry about the implementation details. They come into the picture only when we want to use them.
Different kinds of ADTs are suited to different kinds of applications, and some are highly specialized to specific tasks. By the end of this
book, we will go through many of them and you will be in a position to relate the data structures to the kind of problems they solve.
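The stack ADT described above can be sketched as a declaration of data plus a declaration of operations. The fixed-capacity, array-backed version below is illustrative only; the class and method names are our own, not taken from the book's later chapters:

```java
// A minimal stack ADT sketch: the data (an array and a counter) is kept
// private, and only the declared operations can touch it.
public class IntStack {
    private int[] data;   // declaration of data
    private int count;

    public IntStack(int capacity) {
        data = new int[capacity];
        count = 0;
    }

    // Declaration of operations
    public void push(int x) {
        if (count == data.length) throw new IllegalStateException("stack full");
        data[count++] = x;
    }

    public int pop() {
        if (count == 0) throw new IllegalStateException("stack empty");
        return data[--count];          // last in, first out
    }

    public int top() {
        if (count == 0) throw new IllegalStateException("stack empty");
        return data[count - 1];
    }

    public int size() { return count; }

    public static void main(String[] args) {
        IntStack s = new IntStack(4);
        s.push(1); s.push(2); s.push(3);
        System.out.println(s.pop());  // 3: the last element inserted leaves first
        System.out.println(s.top());  // 2
        System.out.println(s.size()); // 2
    }
}
```

Note how client code in main uses only the operations; swapping the array for a linked list would not change a single line of it, which is exactly the point of separating an ADT from its implementation.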

1.5 What is an Algorithm?


Let us consider the problem of preparing an omelette. To prepare an omelette, we follow the steps given below:
1) Get the frying pan.
2) Get the oil.
a. Do we have oil?
i. If yes, put it in the pan.
ii. If no, do we want to buy oil?
1. If yes, then go out and buy.
2. If no, we can terminate.
3) Turn on the stove, etc...
What we are doing is, for a given problem (preparing an omelette), we are providing a step-by-step procedure for solving it. The formal
definition of an algorithm can be stated as:
An algorithm is the step-by-step unambiguous instructions to solve a given problem.
In the traditional study of algorithms, there are two main criteria for judging the merits of algorithms: correctness (does the algorithm give a solution to the problem in a finite number of steps?) and efficiency (how many resources, in terms of memory and time, does it take to execute the algorithm?).
Note: We do not have to prove each step of the algorithm.

1.6 Why the Analysis of Algorithms?


To go from city “A” to city “B”, there can be many ways of accomplishing this: by flight, by bus, by train and also by bicycle. Depending on
the availability and convenience, we choose the one that suits us. Similarly, in computer science, multiple algorithms are available for solving
the same problem (for example, a sorting problem has many algorithms, like insertion sort, selection sort, quick sort and many more).
Algorithm analysis helps us to determine which algorithm is most efficient in terms of time and space consumed.


1.7 Goal of the Analysis of Algorithms


The analysis of an algorithm can help us understand it better and can suggest informed improvements. The main and important role of algorithm analysis is to predict the performance of different algorithms in order to guide design decisions. The goal of the analysis of algorithms is to compare algorithms (or solutions) mainly in terms of running time but also in terms of other factors (e.g., memory, developer effort, etc.).
In theoretical analysis of algorithms, it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function
for arbitrarily large input. The term "analysis of algorithms" was coined by Donald Knuth.
Algorithm analysis is an important part of computational complexity theory, which provides theoretical estimation for the required resources
of an algorithm to solve a specific computational problem. Most algorithms are designed to work with inputs of arbitrary length. Analysis of
algorithms is the determination of the amount of time and space resources required to execute it.
Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps, known as time
complexity, or volume of memory, known as space complexity.

1.8 What is Running Time Analysis?


It is the process of determining how processing time increases as the size of the problem (input size) increases. Input size is the number of
elements in the input, and depending on the problem type, the input may be of different types. The following are the common types of inputs.
 Size of an array
 Polynomial degree
 Number of elements in a matrix
 Number of bits in the binary representation of the input
 Vertices and edges in a graph.

1.9 How to Compare Algorithms


To compare algorithms, let us define a few objective measures:
Execution times? Not a good measure, as execution times are specific to a particular computer.
Number of statements executed? Not a good measure, since the number of statements varies with the programming language as well as the
style of the individual programmer.
Ideal solution? Let us assume that we express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare
these different functions corresponding to running times. This kind of comparison is independent of machine time, programming style, etc.

1.10 What is Rate of Growth?


The rate at which the running time increases as a function of input is called rate of growth. Let us assume that you go to a shop to buy a
car and a bicycle. If your friend sees you there and asks what you are buying, then in general you say buying a car. This is because the cost
of the car is high compared to the cost of the bicycle (approximating the cost of the bicycle to the cost of the car).
Total Cost = cost_of_car + cost_of_bicycle
Total Cost ≈ cost_of_car (approximation)
For the above-mentioned example, we can represent the cost of the car and the cost of the bicycle in terms of function, and for a given function
ignore the low order terms that are relatively insignificant (for large value of input size, n). As an example, in the case below, n⁴, 2n², 100n
and 500 are the individual costs of some function and approximate to n⁴ since n⁴ is the highest rate of growth.
n⁴ + 2n² + 100n + 500 ≈ n⁴
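This approximation is easy to confirm numerically. The following small Java sketch (our own illustration, not from the book) computes the ratio of n⁴ + 2n² + 100n + 500 to the leading term n⁴ and shows it approaching 1 as n grows:

```java
// Numerically confirming that n^4 + 2n^2 + 100n + 500 ≈ n^4 for large n:
// the ratio of the full cost to the highest-order term approaches 1.
public class RateOfGrowth {
    public static double ratio(double n) {
        double full = Math.pow(n, 4) + 2 * n * n + 100 * n + 500;
        return full / Math.pow(n, 4);
    }

    public static void main(String[] args) {
        for (int n = 10; n <= 10000; n *= 10) {
            System.out.println("n = " + n + ", ratio = " + ratio(n));
        }
    }
}
```

For n = 10 the lower-order terms still contribute noticeably, but by n = 10000 their effect is already below one part in a million.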

1.11 Commonly Used Rates of Growth


Below is the list of growth rates you will come across in the following chapters.
Time Complexity Name Description
1          Constant             Whatever is the input size n, these functions take a constant amount of time.
log n      Logarithmic          These are slower growing than even linear functions.
n          Linear               These functions grow linearly with the input size n.
n log n    Linear Logarithmic   Faster growing than linear but slower than quadratic.
n²         Quadratic            These functions grow faster than the linear logarithmic functions.
n³         Cubic                Faster growing than quadratic but slower than exponential.
2ⁿ         Exponential          Faster than all of the functions mentioned here except the factorial functions.
n!         Factorial            Fastest growing of all the functions mentioned here.


The diagram below shows the relationship between different rates of growth.

(Figure: commonly used functions arranged in decreasing rates of growth, from 2^(2ⁿ), n!, 4ⁿ and 2ⁿ down through n², n log n, log(n!), n, 2^(log n), log² n, √(log n), log log n and 1.)

1.12 Types of Analysis


To analyze the given algorithm, we need to know with which inputs the algorithm takes less time (performing well) and with which inputs the
algorithm takes a long time. We have already seen that an algorithm can be represented in the form of an expression. That means we represent
the algorithm with multiple expressions: one for the case where it takes less time and another for the case where it takes more time.
In general, the first case is called the best case and the second case is called the worst case for the algorithm. To analyze an algorithm we
need some kind of syntax, and that forms the base for asymptotic analysis/notation. There are three types of analysis:
 Worst case
o Defines the input for which the algorithm takes a long time (slowest time to complete).
o Input is the one for which the algorithm runs the slowest.
 Best case
o Defines the input for which the algorithm takes the least time (fastest time to complete).
o Input is the one for which the algorithm runs the fastest.
 Average case
o Provides a prediction about the running time of the algorithm.
o Run the algorithm many times, using many different inputs that come from some distribution that generates these
inputs, compute the total running time (by adding the individual times), and divide by the number of trials.
o Assumes that the input is random.
Lower Bound <= Average Time <= Upper Bound
For a given algorithm, we can represent the best, worst and average cases in the form of expressions. As an example, let f(n) be the function
which represents the given algorithm.
f(n) = n² + 500, for worst case
f(n) = n + 100n + 500, for best case
Similarly for the average case. The expression defines the inputs with which the algorithm takes the average running time (or memory).


1.13 Asymptotic Notation


Having the expressions for the best, average and worst cases, for all three cases we need to identify the upper and lower bounds. To represent
these upper and lower bounds, we need some kind of syntax, and that is the subject of the following discussion. Let us assume that the given
algorithm is represented in the form of function f(n).

1.14 Big-O Notation


This notation gives the tight upper bound of the given function. Generally, it is represented as f(n) = O(g(n)). That means, at larger
values of n, the upper bound of f(n) is g(n). For example, if f(n) = n⁴ + 100n² + 10n + 50 is the given algorithm, then n⁴ is g(n).
That means g(n) gives the maximum rate of growth for f(n) at larger values of n.

(Figure: c·g(n) upper-bounding f(n) beyond the point n₀; x-axis is input size n, y-axis is rate of growth.)

Let us see the O−notation with a little more detail. O−notation is defined as O(g(n)) = {f(n): there exist positive constants c and n₀ such
that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}. g(n) is an asymptotic tight upper bound for f(n). Our objective is to give the smallest rate of
growth g(n) which is greater than or equal to the given algorithms’ rate of growth f(n).
Generally, we discard lower values of n. That means the rate of growth at lower values of n is not important. In the figure, n₀ is the point
from which we need to consider the rate of growth for a given algorithm. Below n₀, the rate of growth could be different. n₀ is called the threshold
for the given function.

Big-O Visualization
O(g(n)) is the set of functions with smaller or the same order of growth as g(n). For example, O(n²) includes O(1), O(n), O(n log n), etc.

O(1): 100, 1000, 200, 1, 20, etc.          O(n): 3n + 100, 100n, 2n − 1, 3, etc.

O(n log n): 5n log n, 3n − 100, 2n − 1, 100, 100n, etc.          O(n²): n², 5n − 10, 100, n² − 2n + 1, 5, etc.

Note: Analyze the algorithms at larger values of n only. What this means is, below n₀ we do not care about the rate of growth.

Big-O Examples
Example-1 Find upper bound for f(n) = 3n + 8
Solution: 3n + 8 ≤ 4n, for all n ≥ 8
∴ 3n + 8 = O(n) with c = 4 and n₀ = 8
Example-2 Find upper bound for f(n) = n² + 1
Solution: n² + 1 ≤ 2n², for all n ≥ 1
∴ n² + 1 = O(n²) with c = 2 and n₀ = 1
Example-3 Find upper bound for f(n) = n⁴ + 100n² + 50
Solution: n⁴ + 100n² + 50 ≤ 2n⁴, for all n ≥ 11
∴ n⁴ + 100n² + 50 = O(n⁴) with c = 2 and n₀ = 11
Example-4 Find upper bound for f(n) = 2n³ − 2n²
Solution: 2n³ − 2n² ≤ 2n³, for all n ≥ 1
∴ 2n³ − 2n² = O(n³) with c = 2 and n₀ = 1
Example-5 Find upper bound for f(n) = n
Solution: n ≤ n, for all n ≥ 1


∴ n = O(n) with c = 1 and n₀ = 1
Example-6 Find upper bound for f(n) = 410
Solution: 410 ≤ 410, for all n ≥ 1
∴ 410 = O(1) with c = 1 and n₀ = 1

No Uniqueness?
There is no unique set of values for c and n₀ in proving the asymptotic bounds. Let us consider, 100n + 5 = O(n). For this function there
are multiple c and n₀ values possible.
Solution1: 100n + 5 ≤ 100n + n = 101n ≤ 101n, for all n ≥ 5, n₀ = 5 and c = 101 is a solution.
Solution2: 100n + 5 ≤ 100n + 5n = 105n ≤ 105n, for all n ≥ 1, n₀ = 1 and c = 105 is also a solution.
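Such (c, n₀) witnesses can be sanity-checked by brute force over a range of n. The helper below is our own illustration (the method name and the use of `LongUnaryOperator` are our choices, not the book's); it tests f(n) ≤ c·g(n) for every n from n₀ up to a limit:

```java
// A brute-force check (illustrative, not a proof) that a Big-O witness
// (c, n0) holds: f(n) <= c*g(n) for every tested n >= n0.
public class BigOWitness {
    public static boolean holds(java.util.function.LongUnaryOperator f,
                                java.util.function.LongUnaryOperator g,
                                long c, long n0, long upTo) {
        for (long n = n0; n <= upTo; n++) {
            if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // 100n + 5 = O(n): both (c = 101, n0 = 5) and (c = 105, n0 = 1) work.
        System.out.println(holds(n -> 100 * n + 5, n -> n, 101, 5, 100000)); // prints true
        System.out.println(holds(n -> 100 * n + 5, n -> n, 105, 1, 100000)); // prints true
    }
}
```

A failed check (e.g. c = 100 with g(n) = n) would disprove a candidate witness, though a passing check over a finite range is of course weaker than the algebraic proof above.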

1.15 Omega-Ω Notation [Lower Bounding Function]


Similar to the O discussion, this notation gives the tighter lower bound of the given algorithm and we represent it as f(n) = Ω(g(n)). That
means, at larger values of n, the tighter lower bound of f(n) is g(n). For example, if f(n) = 100n² + 10n + 50, g(n) is Ω(n²).

(Figure: c·g(n) lower-bounding f(n) beyond the point n₀; x-axis is input size n, y-axis is rate of growth.)

The Ω notation can be defined as Ω(g(n)) = {f(n): there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥
n₀}. g(n) is an asymptotic tight lower bound for f(n). Our objective is to give the largest rate of growth g(n) which is less than or equal to
the given algorithm’s rate of growth f(n).

Ω Examples
Example-1 Find lower bound for f(n) = 5n².
Solution: ∃ c, n₀ such that: 0 ≤ cn² ≤ 5n² ⟹ cn² ≤ 5n² ⟹ c = 5 and n₀ = 1
∴ 5n² = Ω(n²) with c = 5 and n₀ = 1
Example-2 Prove f(n) = 100n + 5 ≠ Ω(n²).
Solution: ∃ c, n₀ such that: 0 ≤ cn² ≤ 100n + 5
100n + 5 ≤ 100n + 5n (for all n ≥ 1) = 105n
cn² ≤ 105n ⟹ n(cn – 105) ≤ 0
Since n is positive ⟹ cn – 105 ≤ 0 ⟹ n ≤ 105/c
⟹ Contradiction: n cannot be smaller than a constant
Example-3 2n = Ω(n), n³ = Ω(n³), log n = Ω(log n).

1.16 Theta- Notation


(Figure: f(n) sandwiched between c₁·g(n) and c₂·g(n) beyond the point n₀; x-axis is input size n, y-axis is rate of growth.)


This notation decides whether the upper and lower bounds of a given function (algorithm) are the same. The average running time of an
algorithm is always between the lower bound and the upper bound. If the upper bound (O) and lower bound (Ω) give the same result, then
the Θ notation will also have the same rate of growth. As an example, let us assume that f(n) = 10n + n is the expression. Then, its tight
upper bound g(n) is O(n). The rate of growth in the best case is g(n) = O(n).
In this case, the rates of growth in the best case and worst case are the same. As a result, the average case will also be the same. For a given
function (algorithm), if the rates of growth (bounds) for O and Ω are not the same, then the rate of growth for the Θ case may not be the same.
In this case, we need to consider all possible time complexities and take the average of those (for example, for a quick sort average case, refer
to the Sorting chapter).
Now consider the definition of Θ notation. It is defined as Θ(g(n)) = {f(n): there exist positive constants c₁, c₂ and n₀ such that 0 ≤
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀}. g(n) is an asymptotic tight bound for f(n). Θ(g(n)) is the set of functions with the same
order of growth as g(n).

 Examples
Example 1 Find Θ bound for f(n) = n²/2 − n/2
Solution: n²/5 ≤ n²/2 − n/2 ≤ n², for all n ≥ 2
∴ n²/2 − n/2 = Θ(n²) with c₁ = 1/5, c₂ = 1 and n₀ = 2
Example 2 Prove n ≠ Θ(n²)
Solution: c₁n² ≤ n ≤ c₂n² ⟹ only holds for: n ≤ 1/c₁
∴ n ≠ Θ(n²)
Example 3 Prove 6n³ ≠ Θ(n²)
Solution: c₁n² ≤ 6n³ ≤ c₂n² ⟹ only holds for: n ≤ c₂/6
∴ 6n³ ≠ Θ(n²)
Example 4 Prove n ≠ Θ(log n)
Solution: c₁ log n ≤ n ≤ c₂ log n ⟹ c₂ ≥ n/log n, for all n ≥ n₀ – Impossible

Important Notes
For analysis (best case, worst case and average), we try to give the upper bound (O), lower bound (Ω) and average running time (Θ). From
the above examples, it should also be clear that, for a given function (algorithm), getting the upper bound (O), lower bound (Ω) and
average running time (Θ) may not always be possible. For example, if we are discussing the best case of an algorithm, we try to give the upper
bound (O), lower bound (Ω) and average running time (Θ) of the best case.
In the remaining chapters, we generally focus on the upper bound (O) because knowing the lower bound (Ω) of an algorithm is of no practical
importance, and we use the Θ notation if the upper bound (O) and lower bound (Ω) are the same.

1.17 Why is it called Asymptotic Analysis?


From the discussion above (for all three cases: worst case, best case, and average case), we can easily understand that, in every case for a
given function f(n) we are trying to find another function g(n) which approximates f(n) at higher values of n. That means g(n) is also a
curve which approximates f(n) at higher values of n.
In mathematics we call such a curve an asymptotic curve. In other terms, g(n) is the asymptotic curve for f(n). For this reason, we call
algorithm analysis asymptotic analysis.

1.18 Guidelines for Asymptotic Analysis


There are some general rules to help us determine the running time of an algorithm.
1) Loops: The running time of a loop is, at most, the running time of the statements inside the loop (including tests) multiplied by the
number of iterations.
// executes n times
for (i=1; i<=n; i++)
m = m + 2; // constant time, c
Total time = a constant c × n = cn = O(n).
2) Nested loops: Analyze from the inside out. Total running time is the product of the sizes of all the loops.
//outer loop executed n times
for (i=1; i<=n; i++) {
// inner loop executed n times
for (j=1; j<=n; j++)
k = k+1; //constant time
}


Total time = c × n × n = cn² = O(n²).
3) Consecutive statements: Add the time complexities of each statement.
x = x +1; //constant time
// executed n times
for (i=1; i<=n; i++)
m = m + 2; //constant time
//outer loop executed n times
for (i=1; i<=n; i++) {
//inner loop executed n times
for (j=1; j<=n; j++)
k = k+1; //constant time
}
Total time = c₀ + c₁n + c₂n² = O(n²).
4) If-then-else statements: Worst-case running time: the test, plus either the then part or the else part (whichever is the larger).
//test: constant
if(length( ) == 0 ) {
return false; //then part: constant
}
else { // else part: (constant + constant) * n
for (int n = 0; n < length( ); n++) {
// another if : constant + constant (no else part)
if(!list[n].equals(otherList.list[n]))
//constant
return false;
}
}
Total time = c₀ + (c₁ + c₂) * n = O(n).
5) Logarithmic complexity: An algorithm is O(log n) if it takes a constant time to cut the problem size by a fraction (usually by ½). As
an example let us consider the following program:
for (i=1; i<=n;)
i = i*2;
If we observe carefully, the value of i is doubling every time. Initially i = 1, in the next step i = 2, and in subsequent steps i = 4, 8 and
so on. Let us assume that the loop is executing some k times. At the kᵗʰ step 2ᵏ = n, and at the (k + 1)ᵗʰ step we come out of the loop.
Taking logarithm on both sides, gives
log(2ᵏ) = log n
k log 2 = log n
k = log n //if we assume base-2
Total time = O(log n).
Note: Similarly, for the case below, the worst-case rate of growth is O(log n). The same discussion holds good for the decreasing sequence as
well.
for (i=n; i>=1;)
i = i/2;
Another example: binary search (finding a word in a dictionary of n pages)
 Look at the center point in the dictionary
 Is the word towards the left or right of center?
 Repeat the process with the left or right part of the dictionary until the word is found.
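The dictionary procedure above translates directly into code. Here is a minimal Java binary-search sketch (an illustration of the idea; searching is treated in detail in a later chapter):

```java
// A minimal binary-search sketch matching the dictionary description above:
// each comparison halves the search range, giving O(log n) comparisons.
public class BinarySearch {
    // Returns the index of key in the sorted array a, or -1 if absent.
    public static int search(int[] a, int key) {
        int low = 0, high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;     // look at the center point
            if (a[mid] == key) return mid;
            else if (a[mid] < key) low = mid + 1; // key is to the right of center
            else high = mid - 1;                  // key is to the left of center
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 5, 8, 12, 16, 23, 38, 56, 72, 91};
        System.out.println(search(sorted, 23)); // prints 5
        System.out.println(search(sorted, 7));  // prints -1
    }
}
```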

1.20 Simplifying properties of asymptotic notations


 Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n)). Valid for O and Ω as well.
 Reflexivity: f(n) = Θ(f(n)). Valid for O and Ω.
 Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
 Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
 If f(n) is in O(k·g(n)) for any constant k > 0, then f(n) is in O(g(n)).
 If f₁(n) is in O(g₁(n)) and f₂(n) is in O(g₂(n)), then (f₁ + f₂)(n) is in O(max(g₁(n), g₂(n))).
 If f₁(n) is in O(g₁(n)) and f₂(n) is in O(g₂(n)), then f₁(n) f₂(n) is in O(g₁(n) g₂(n)).

1.21 Commonly used Logarithms and Summations


Logarithms
log xʸ = y log x
log xy = log x + log y
log(x/y) = log x − log y
log log n = log(log n)
logᵏ n = (log n)ᵏ
log_b x = (log_a x)/(log_a b)
a^(log_b x) = x^(log_b a)
Arithmetic series
Σᵢ₌₁ⁿ i = 1 + 2 + ⋯ + n = n(n + 1)/2
Geometric series
Σᵢ₌₀ⁿ xⁱ = 1 + x + x² + ⋯ + xⁿ = (x^(n+1) − 1)/(x − 1), (x ≠ 1)
Harmonic series
Σᵢ₌₁ⁿ 1/i = 1 + 1/2 + ⋯ + 1/n ≈ log n
Other important formulae
Σᵢ₌₁ⁿ log i ≈ n log n
Σᵢ₌₁ⁿ iᵏ = 1ᵏ + 2ᵏ + ⋯ + nᵏ ≈ n^(k+1)/(k + 1)

1.22 Master Theorem for Divide and Conquer Recurrences


All divide and conquer algorithms (also discussed in detail in the Divide and Conquer chapter) divide the problem into sub-problems, each of which
is part of the original problem, and then perform some additional work to compute the final answer. As an example, a merge sort algorithm [for
details, refer to the Sorting chapter] operates on two sub-problems, each of which is half the size of the original, and then performs O(n) additional
work for merging. This gives the running time equation:
T(n) = 2T(n/2) + O(n)
The following theorem can be used to determine the running time of divide and conquer algorithms. For a given program (algorithm), first
we try to find the recurrence relation for the problem. If the recurrence is of the below form then we can directly give the answer without fully
solving it. If the recurrence is of the form T(n) = aT(n/b) + Θ(nᵏ logᵖ n), where a ≥ 1, b > 1, k ≥ 0 and p is a real number, then:
1) If a > bᵏ, then T(n) = Θ(n^(log_b a))
2) If a = bᵏ
a. If p > −1, then T(n) = Θ(n^(log_b a) log^(p+1) n)
b. If p = −1, then T(n) = Θ(n^(log_b a) log log n)
c. If p < −1, then T(n) = Θ(n^(log_b a))
3) If a < bᵏ
a. If p ≥ 0, then T(n) = Θ(nᵏ logᵖ n)
b. If p < 0, then T(n) = O(nᵏ)
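As a quick sanity check of case 2.a (a = 2, b = 2, k = 1, p = 0), the merge-sort recurrence T(n) = 2T(n/2) + n with T(1) = 1 can be expanded directly for powers of two; its values match the closed form n·log₂n + n, i.e. Θ(n log n). The sketch below is our own illustration:

```java
// Expanding the merge-sort recurrence T(n) = 2T(n/2) + n (with T(1) = 1)
// for powers of two; the closed form is n*log2(n) + n, i.e. Θ(n log n),
// matching case 2.a of the theorem above (a = 2, b = 2, k = 1, p = 0).
public class MergeSortRecurrence {
    public static long T(long n) {
        if (n <= 1) return 1;
        return 2 * T(n / 2) + n;
    }

    public static void main(String[] args) {
        for (long n = 2; n <= 1024; n *= 2) {
            long log2n = Long.numberOfTrailingZeros(n); // log2 for powers of two
            System.out.println("T(" + n + ") = " + T(n)
                    + ", n*log2(n)+n = " + (n * log2n + n));
        }
    }
}
```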

1.23 Divide and Conquer Master Theorem: Problems & Solutions


For each of the following recurrences, give an expression for the runtime ( ) if the recurrence can be solved with the Master Theorem.
Otherwise, indicate that the Master Theorem does not apply.
Problem-1 T(n) = 3T(n/2) + n²
Solution: T(n) = 3T(n/2) + n² => T(n) = Θ(n²) (Master Theorem Case 3.a)
Problem-2 T(n) = 4T(n/2) + n²
Solution: T(n) = 4T(n/2) + n² => T(n) = Θ(n² log n) (Master Theorem Case 2.a)
Problem-3 T(n) = T(n/2) + n²
Solution: T(n) = T(n/2) + n² => T(n) = Θ(n²) (Master Theorem Case 3.a)
Problem-4 T(n) = 2ⁿT(n/2) + nⁿ
Solution: T(n) = 2ⁿT(n/2) + nⁿ => Does not apply (a is not constant)
Problem-5 T(n) = 16T(n/4) + n
Solution: T(n) = 16T(n/4) + n => T(n) = Θ(n²) (Master Theorem Case 1)

Problem-6 T(n) = 2T(n/2) + n log n
Solution: T(n) = 2T(n/2) + n log n => T(n) = Θ(n log² n) (Master Theorem Case 2.a)
Problem-7 T(n) = 2T(n/2) + n/log n
Solution: T(n) = 2T(n/2) + n/log n => T(n) = Θ(n log log n) (Master Theorem Case 2.b)
Problem-8 T(n) = 2T(n/4) + n^0.51
Solution: T(n) = 2T(n/4) + n^0.51 => T(n) = Θ(n^0.51) (Master Theorem Case 3.b)
Problem-9 T(n) = 0.5T(n/2) + 1/n
Solution: T(n) = 0.5T(n/2) + 1/n => Does not apply (a < 1)
Problem-10 T(n) = 6T(n/3) + n² log n
Solution: T(n) = 6T(n/3) + n² log n => T(n) = Θ(n² log n) (Master Theorem Case 3.a)
Problem-11 T(n) = 64T(n/8) − n² log n
Solution: T(n) = 64T(n/8) − n² log n => Does not apply (function is not positive)
Problem-12 T(n) = 7T(n/3) + n²
Solution: T(n) = 7T(n/3) + n² => T(n) = Θ(n²) (Master Theorem Case 3.a)
Problem-13 T(n) = 4T(n/2) + log n
Solution: T(n) = 4T(n/2) + log n => T(n) = Θ(n²) (Master Theorem Case 1)
Problem-14 T(n) = 16T(n/4) + n!
Solution: T(n) = 16T(n/4) + n! => T(n) = Θ(n!) (Master Theorem Case 3.a)
Problem-15 T(n) = √2 T(n/2) + log n
Solution: T(n) = √2 T(n/2) + log n => T(n) = Θ(√n) (Master Theorem Case 1)
Problem-16 T(n) = 3T(n/2) + n
Solution: T(n) = 3T(n/2) + n => T(n) = Θ(n^(log₂3)) (Master Theorem Case 1)
Problem-17 T(n) = 3T(n/3) + √n
Solution: T(n) = 3T(n/3) + √n => T(n) = Θ(n) (Master Theorem Case 1)
Problem-18 T(n) = 4T(n/2) + cn
Solution: T(n) = 4T(n/2) + cn => T(n) = Θ(n²) (Master Theorem Case 1)
Problem-19 T(n) = 3T(n/4) + n log n
Solution: T(n) = 3T(n/4) + n log n => T(n) = Θ(n log n) (Master Theorem Case 3.a)
Problem-20 T(n) = 3T(n/3) + n/2
Solution: T(n) = 3T(n/3) + n/2 => T(n) = Θ(n log n) (Master Theorem Case 2.a)

1.24 Master Theorem for Subtract and Conquer Recurrences


Let T(n) be a function defined on positive n, and having the property
T(n) = c, if n ≤ 1
T(n) = aT(n − b) + f(n), if n > 1
for some constants c, a > 0, b > 0, k ≥ 0, and function f(n). If f(n) is in O(nᵏ), then
T(n) = O(nᵏ), if a < 1
T(n) = O(n^(k+1)), if a = 1
T(n) = O(nᵏ a^(n/b)), if a > 1

1.25 Variant of Subtraction and Conquer Master Theorem


The solution to the equation T(n) = T(α·n) + T((1 − α)·n) + β·n, where 0 < α < 1 and β > 0 are constants, is O(n log n).

1.26 Method of Guessing and Confirming


Now, let us discuss a method which can be used to solve any recurrence. The basic idea behind this method is: guess
the answer; and then prove it correct by induction.
In other words, it addresses the question: What if the given recurrence doesn’t seem to match with any of these (master theorem) methods?
If we guess a solution and then try to verify our guess inductively, usually either the proof will succeed (in which case we are done), or the
proof will fail (in which case the failure will help us refine our guess).
As an example, consider the recurrence T(n) = √n T(√n) + n. This doesn’t fit into the form required by the Master Theorems. Carefully
observing the recurrence gives us the impression that it is similar to the divide and conquer method (dividing the problem into √n subproblems
each with size √n). As we can see, the size of the subproblems at the first level of recursion is n. So, let us guess that T(n) = O(n log n), and
then try to prove that our guess is correct.
Let’s start by trying to prove an upper bound T(n) ≤ cn log n:
T(n) = √n T(√n) + n
≤ √n . c√n log √n + n
= n . c . (1/2) . log n + n
≤ cn log n

The last inequality assumes only that 1 ≤ c . (1/2) . log n. This is correct if n is sufficiently large and for any constant c, no matter how small. From
the above proof, we can see that our guess is correct for the upper bound. Now, let us prove the lower bound for this recurrence.
T(n) = √n T(√n) + n
≥ √n . k√n log √n + n
= n . k . (1/2) . log n + n
≥ kn log n

The last inequality assumes only that 1 ≥ k . (1/2) . log n. This is incorrect if n is sufficiently large and for any constant k. From the above proof,
we can see that our guess is incorrect for the lower bound.
From the above discussion, we understood that Θ(n log n) is too big. How about Θ(n)? The lower bound is easy to prove directly:
T(n) = √n T(√n) + n ≥ n
Now, let us prove the upper bound for this Θ(n):
T(n) = √n T(√n) + n
≤ √n . c . √n + n
= n . c + n
= (c + 1)n
≰ cn

From the above induction, we understood that Θ(n) is too small and Θ(n log n) is too big. So, we need something bigger than n and smaller
than n log n. How about n√(log n)?
Proving the upper bound for n√(log n):
T(n) = √n T(√n) + n
≤ √n . c . √n √(log √n) + n
= n . c . (1/√2) . √(log n) + n
≤ cn√(log n)
Proving the lower bound for n√(log n):
T(n) = √n T(√n) + n
≥ √n . k . √n √(log √n) + n
= n . k . (1/√2) . √(log n) + n
≱ kn√(log n)
The last step doesn’t work. So, Θ(n√(log n)) doesn’t work. What else is between n and n log n? How about n log log n?
Proving the upper bound for n log log n:
T(n) = √n T(√n) + n
≤ √n . c . √n log log √n + n
= n . c . log log n − c . n + n
≤ cn log log n, if c ≥ 1
Proving the lower bound for n log log n:
T(n) = √n T(√n) + n
≥ √n . k . √n log log √n + n
= n . k . log log n − k . n + n
≥ kn log log n, if k ≤ 1
From the above proofs, we can see that T(n) ≤ cn log log n, if c ≥ 1 and T(n) ≥ kn log log n, if k ≤ 1. Technically, we’re still missing the
base cases in both proofs, but we can be fairly confident at this point that T(n) = Θ(n log log n).


1.27 Amortized Analysis


Amortized analysis refers to determining the time-averaged running time for a sequence of operations. It is different from average case analysis,
because amortized analysis does not make any assumption about the distribution of the data values, whereas average case analysis assumes
the data are not "bad" (e.g., some sorting algorithms do well on average over all input orderings but very badly on certain input orderings).
That is, amortized analysis is a worst-case analysis, but for a sequence of operations rather than for individual operations.
The motivation for amortized analysis is to better understand the running time of certain techniques, where standard worst case analysis
provides an overly pessimistic bound. Amortized analysis generally applies to a method that consists of a sequence of operations, where the
vast majority of the operations are cheap, but some of the operations are expensive. If we can show that the expensive operations are
particularly rare we can charge them to the cheap operations, and only bound the cheap operations.
The general approach is to assign an artificial cost to each operation in the sequence, such that the total of the artificial costs for the sequence
of operations bounds the total of the real costs for the sequence. This artificial cost is called the amortized cost of an operation. To analyze
the running time, the amortized cost thus is a correct way of understanding the overall running time — but note that particular operations can
still take longer so it is not a way of bounding the running time of any individual operation in the sequence.
When one event in a sequence affects the cost of later events:
 One particular task may be expensive.
 But it may leave data structure in a state that the next few operations become easier.
Example: Let us consider an array of elements from which we want to find the kᵗʰ smallest element. We can solve this problem using sorting.
After sorting the given array, we just need to return the kᵗʰ element from it. The cost of performing the sort (assuming a comparison based
sorting algorithm) is O(n log n). If we perform n such selections then the average cost of each selection is O(n log n / n) = O(log n). This
clearly indicates that sorting once is reducing the complexity of subsequent operations.
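The example above can be sketched in Java (our own illustration, not the book's code): pay the O(n log n) sorting cost once in the constructor, then answer each k-th smallest query in constant time.

```java
import java.util.Arrays;

// The sort-once idea as code: one O(n log n) preprocessing step, then
// every k-th-smallest query is O(1), so the cost amortized over n
// queries is O(log n) per query.
public class KthSmallest {
    private final int[] sorted;

    public KthSmallest(int[] data) {
        sorted = data.clone();
        Arrays.sort(sorted);    // one-time O(n log n) cost
    }

    public int kth(int k) {     // O(1) per query afterwards (k is 1-based)
        return sorted[k - 1];
    }

    public static void main(String[] args) {
        KthSmallest ks = new KthSmallest(new int[]{7, 2, 9, 4, 1, 8});
        System.out.println(ks.kth(1)); // prints 1
        System.out.println(ks.kth(3)); // prints 4
    }
}
```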

1.28 Algorithms Analysis: Problems & Solutions


Note: From the following problems, try to understand the cases which have different complexities (O(n), O(n log n), O(n²) etc.).
Problem-21 Find the complexity of the below recurrence:
T(n) = 3T(n − 1), if n > 0
T(n) = 1, otherwise
Solution: Let us try solving this function with substitution.
T(n) = 3T(n − 1)
T(n) = 3(3T(n − 2)) = 3²T(n − 2)
T(n) = 3²(3T(n − 3)) = 3³T(n − 3)
.
.
T(n) = 3ⁿT(n − n) = 3ⁿT(0) = 3ⁿ
This clearly shows that the complexity of this function is O(3ⁿ).
Note: We can use the master theorem for this problem.
Problem-22 Find the complexity of the below recurrence:
T(n) = 2T(n − 1) − 1, if n > 0
T(n) = 1, otherwise
Solution: Let us try solving this function with substitution.
T(n) = 2T(n − 1) − 1
T(n) = 2(2T(n − 2) − 1) − 1 = 2²T(n − 2) − 2 − 1
T(n) = 2²(2T(n − 3) − 1) − 2 − 1 = 2³T(n − 3) − 2² − 2 − 1
T(n) = 2ⁿT(n − n) − 2ⁿ⁻¹ − 2ⁿ⁻² − ⋯ − 2² − 2 − 1
T(n) = 2ⁿ − 2ⁿ⁻¹ − 2ⁿ⁻² − ⋯ − 2² − 2 − 1   [since T(0) = 1]
T(n) = 2ⁿ − (2ⁿ − 1)   [note: 2ⁿ⁻¹ + 2ⁿ⁻² + ⋯ + 2⁰ = 2ⁿ − 1]
T(n) = 1
∴ Complexity is O(1). Note that while the recurrence relation looks exponential, the solution to the recurrence relation here gives a different
result.
Problem-23 What is the running time of the following function?
public void Function(int n) {
int i=1, s=1;
while( s <= n) {
i++;
s= s+i;
System.out.println(“*");
}
}
Solution: Consider the comments in the below function:
public void function (int n) {
int i=1, s=1;
// s is increasing not at rate 1 but at rate i

while( s <= n) {
i++;
s= s+i;
System.out.println(“*");
}
}
We can define the ‘sᵢ’ terms according to the relation sᵢ = sᵢ₋₁ + i. The value of ‘i’ increases by 1 for each iteration. The value contained in
‘s’ at the iᵗʰ iteration is the sum of the first ‘i’ positive integers. If k is the total number of iterations taken by the program, then the while
loop terminates if:
1 + 2 + ⋯ + k = k(k + 1)/2 > n ⟹ k = O(√n).
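The O(√n) bound can be confirmed empirically; the sketch below (our own illustration) counts the loop's iterations for the function above and compares the count with √(2n):

```java
// Counting the iterations of the loop above confirms the O(√n) bound:
// the iteration count stays within a couple of steps of sqrt(2n).
public class SqrtIterations {
    public static int iterations(int n) {
        int i = 1, s = 1, count = 0;
        while (s <= n) {   // same loop as in the problem statement
            i++;
            s = s + i;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1000000;
        System.out.println("iterations = " + iterations(n)
                + ", sqrt(2n) = " + Math.sqrt(2.0 * n));
    }
}
```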
Problem-24 Find the complexity of the function given below.
public void function(int n) {
int i, count =0;
for(i=1; i*i<=n; i++)
count++;
}
Solution:
void function(int n) {
int i, count =0;
for(i=1; i*i<=n; i++)
count++;
}
In the above-mentioned function the loop will end, if i² > n ⟹ T(n) = O(√n). The reasoning is the same as that of Problem-23.
Problem-25 What is the complexity of the program given below?
public void function(int n) {
int i, j, k , count =0;
for(i=n/2; i<=n; i++)
for(j=1; j + n/2<=n; j++)
for(k=1; k<=n; k= k * 2)
count++;
}
Solution: Consider the comments in the following function.
public void function(int n) {
int i, j, k , count =0;
//Outer loop execute n/2 times
for(i=n/2; i<=n; i++)
//Middle loop executes n/2 times
for(j=1; j + n/2<=n; j++)
//Inner loop execute logn times
for(k=1; k<=n; k= k * 2)
count++;
}
The complexity of the above function is O(n² log n).
Problem-26 What is the complexity of the program given below?
public void function(int n) {
int i, j, k , count =0;
for(i=n/2; i<=n; i++)
for(j=1; j<=n; j= 2 * j)
for(k=1; k<=n; k= k * 2)
count++;
}
Solution: Consider the comments in the following function.
public void function(int n) {
int i, j, k , count =0;
//Outer loop execute n/2 times
for(i=n/2; i<=n; i++)
//Middle loop executes logn times
for(j=1; j<=n; j= 2 * j)
//Inner loop execute logn times
for(k=1; k<=n; k= k*2)
count++;
}
The complexity of the above function is O(n log² n).
Problem-27 Find the complexity of the program given below.


public void function( int n ) {


if(n == 1) return;
for(int i = 1 ; i <= n ; i + + ) {
for(int j= 1 ; j <= n ; j + + ) {
System.out.println(“*" );
break;
}
}
}
Solution: Consider the comments in the following function.
public void function( int n ) {
//constant time
if( n == 1 ) return;
//Outer loop executes n times
for(int i = 1 ; i <= n ; i + + ) {
// Inner loop executes only one time due to the break statement.
for(int j= 1 ; j <= n ; j + + ) {
System.out.println(“*" );
break;
}
}
}
The complexity of the above function is O(n). Even though the inner loop is bounded by n, due to the break statement it is executing only
once.
Problem-28 Write a recursive function for the running time T(n) of the function given below. Prove using the iterative method that
T(n) = Θ(n³).
public void function( int n ) {
if( n == 1 ) return;
for(int i = 1 ; i <= n ; i + + )
for(int j = 1 ; j <= n ; j + + )
System.out.println(“*" ) ;
function( n-3 );
}
Solution: Consider the comments in the function below:
public void function (int n) {
//constant time
if( n == 1 ) return;
//Outer loop executes n times
for(int i = 1 ; i <= n ; i + + )
//Inner loop executes n times
for(int j = 1 ; j <= n ; j + + )
//constant time
System.out.println(“*" ) ;
function( n-3 );
}
The recurrence for this code is clearly T(n) = T(n − 3) + cn² for some constant c > 0, since each call prints out n² asterisks and calls
itself recursively on n − 3. Using the iterative method we get: T(n) = T(n − 3) + cn² = T(n − 6) + c(n − 3)² + cn² = ⋯. Using the
Subtraction and Conquer master theorem, we get T(n) = Θ(n³).
Problem-29 Determine  bounds for the recurrence relation: ( ) = 2 + .
Solution: Using Divide and Conquer master theorem, we get: O( ).
Problem-30 Determine  bounds for the recurrence: ( ) = + + + .
Solution: Substituting in the recurrence equation, we get:
( )≤ 1 ∗ + 2 ∗ + 3 ∗ + ≤ ∗ , where is a constant.
Problem-31 Determine  bounds for the recurrence relation: ( ) = ( /2) + 7.
Solution: Using Master Theorem we get: ( ).
Problem-32 Prove that the running time of the code below is Ω(log n).
public void Read(int n) {
int k = 1;
while( k < n )
k = 3*k;
}
Solution: The while loop will terminate once the value of ‘k’ is greater than or equal to the value of ‘n’. In each iteration the value of ‘k’ is
multiplied by 3. If i is the number of iterations, then ‘k’ has the value of 3ⁱ after i iterations. The loop is terminated upon reaching i iterations
when 3ⁱ ≥ n ⟺ i ≥ log₃ n, which shows that i = Ω(log n).


Problem-33 Solve the following recurrence.


T(n) = 1, if n = 1
T(n) = T(n − 1) + n(n − 1), if n ≥ 2
Solution: By iteration:
T(n) = T(n − 2) + (n − 1)(n − 2) + n(n − 1)
.
.
T(n) = T(1) + Σᵢ₌₂ⁿ i(i − 1)
T(n) = T(1) + Σᵢ₌₂ⁿ i² − Σᵢ₌₂ⁿ i
T(n) = 1 + n(n + 1)(2n + 1)/6 − n(n + 1)/2
T(n) = Θ(n³)
Note: We can use the Subtraction and Conquer master theorem for this problem.
Problem-34 Consider the following program:
Fib[n]
if(n==0) then return 0
else if(n==1) then return 1
else return Fib[n-1]+Fib[n-2]
Solution: The recurrence relation for the running time of this program is:
T(n) = T(n − 1) + T(n − 2) + c.
Note T(n) has two recurrence calls indicating a binary tree. Each step recursively calls the program for n reduced by 1 and 2, so the depth of
the recurrence tree is O(n). The number of leaves at depth n is 2ⁿ since this is a full binary tree, and each leaf takes at least O(1) computations
for the constant factor. Running time is clearly exponential in n.
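The exponential behaviour is easy to observe by instrumenting the program with a call counter (an illustrative sketch of the pseudocode above, with the counter being our own addition); the counter itself satisfies C(n) = C(n − 1) + C(n − 2) + 1:

```java
// Counting the calls made by the naive Fibonacci above makes the
// exponential growth visible: the call count grows like the Fibonacci
// numbers themselves, i.e. exponentially in n.
public class FibCalls {
    public static long calls = 0;

    public static long fib(int n) {
        calls++;                       // one call recorded per invocation
        if (n == 0) return 0;
        if (n == 1) return 1;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        calls = 0;
        long f = fib(10);
        System.out.println("fib(10) = " + f + " took " + calls + " calls");
    }
}
```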
Problem-35 What is the running time of the following program?
public void function(n) {
for(int i = 1 ; i <= n ; i + + )
for(int j = 1 ; j <= n ; j+ = i )
System.out.println(“*”) ;
}
Solution: Consider the comments in the function below:
public void function (n) {
//this loop executes n times
for(int i = 1 ; i <= n ; i + + )
//this loop executes n/i times, with j increasing at the rate of i
for(int j = 1 ; j <= n ; j+ = i )
System.out.println(“*”) ;
}
In the above program, the inner loop executes n/i times for each value of i. Its running time is ∑(n/i), for i = 1 to n, = n × ∑(1/i) = O(nlogn).
Problem-36 What is the complexity of ∑ logi, for i = 1 to n?
Solution: Using the logarithmic property, log(xy) = logx + logy, we can see that this problem is equivalent to
    ∑ logi = log1 + log2 + ⋯ + logn = log(1 × 2 × … × n) = log(n!) ≤ log(n^n) ≤ nlogn
This shows that the time complexity = O(nlogn).
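The sandwich log(n!) between (n/2)log(n/2) and nlogn can be confirmed numerically (an illustrative sketch; the helper name is made up):

```java
public class LogFactorialBound {
    // Sum of log i for i = 1..n, i.e. log(n!).
    public static double logFactorial(int n) {
        double sum = 0;
        for (int i = 1; i <= n; i++)
            sum += Math.log(i);
        return sum;
    }

    public static void main(String[] args) {
        int n = 100;
        // log(n!) lies between (n/2)log(n/2) and nlogn, confirming Θ(nlogn).
        System.out.println((n / 2) * Math.log(n / 2) + " <= "
                + logFactorial(n) + " <= " + n * Math.log(n));
    }
}
```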
Problem-37 What is the running time of the following recursive function (specified as a function of the input value n)? First write the
recurrence formula and then find its complexity.
public void function(int n) {
    if(n <= 1) return;
    for (int i = 1; i <= 3; i++)
        function(n/3);
}
Solution: Consider the comments in the function below:
public void function (int n) {
    //constant time
    if(n <= 1) return;
    //this loop executes 3 times with recursive calls of n/3 value
    for (int i = 1; i <= 3; i++)
        function(n/3);
}
We can assume that for asymptotic analysis n = ⌈n⌉ for every integer n ≥ 1. The recurrence for this code is T(n) = 3T(n/3) + Θ(1).
Using the master theorem, we get T(n) = Θ(n).
Problem-38 What is the running time of the following recursive function (specified as a function of the input value n)? First write a
recurrence formula, and show its solution using induction.
public void function(int n) {
    if(n <= 1) return;
    for (int i = 1; i <= 3; i++)
        function(n - 1);
}
Solution: Consider the comments in the below function:
public void function (int n) {
    //constant time
    if(n <= 1) return;
    //this loop executes 3 times with recursive call of n-1 value
    for (int i = 1; i <= 3; i++)
        function(n - 1);
}
The if statement requires constant time (O(1)). With the for loop, we neglect the loop overhead and only count the three times that the function
is called recursively. This implies a time complexity recurrence:
    T(n) = c,                if n ≤ 1;
         = c + 3T(n − 1),    if n > 1.
Using the Subtraction and Conquer master theorem, we get T(n) = Θ(3^n).
Problem-39 Write a recursion formula for the running time T(n) of the function, whose code is given below. What is the running
time of the function, as a function of n?
public void function (int n) {
    if(n <= 1) return;
    for(int i = 1; i < n; i++)
        System.out.println("*");
    function( (int)(0.8 * n) );
}
Solution: Consider the comments in the below function:
public void function (int n) {
    //constant time
    if(n <= 1) return;
    // this loop executes n times with constant time body
    for(int i = 1; i < n; i++)
        System.out.println("*");
    //recursive call with 0.8n
    function( (int)(0.8 * n) );
}
The recurrence for this piece of code is T(n) = T(0.8n) + O(n) = T(4n/5) + O(n). Applying the master theorem, we get
T(n) = O(n).
Problem-40 Find the complexity of the recurrence: T(n) = 2T(√n) + logn
Solution: The given recurrence is not in the master theorem format. Let us try to convert it to the master theorem format by assuming n =
2^m. Applying the logarithm on both sides gives logn = mlog2 ⟹ m = logn. Now, the given function becomes:
    T(n) = T(2^m) = 2T(√(2^m)) + m = 2T(2^(m/2)) + m.
To make it simple we assume S(m) = T(2^m) ⟹ S(m/2) = T(2^(m/2)) ⟹ S(m) = 2S(m/2) + m. Applying the master theorem would result in
S(m) = O(mlogm). If we substitute m = logn back, T(n) = S(logn) = O((logn)loglogn).
Problem-41 Find the complexity of the recurrence: T(n) = T(√n) + 1
Solution: Applying the logic of Problem-40 gives S(m) = S(m/2) + 1. Applying the master theorem would result in S(m) = O(logm).
Substituting m = logn gives T(n) = S(logn) = O(loglogn).
Problem-42 Find the complexity of the recurrence: T(n) = 2T(√n) + 1
Solution: Applying the logic of Problem-40 gives: S(m) = 2S(m/2) + 1. Using the master theorem results in S(m) = O(m).
Substituting m = logn gives T(n) = O(logn).
Problem-43 Find the complexity of the function given below.
public int function (int n) {
    if(n <= 2) return 1;
    else
        return (function((int)Math.floor(Math.sqrt(n))) + 1);
}
Solution: Consider the comments in the below function:
public int function (int n) {
    if(n <= 2) return 1; //constant time
    else
        // recursive call with ⌊√n⌋ value
        return (function((int)Math.floor(Math.sqrt(n))) + 1);
}
For the above function, the recurrence can be given as: T(n) = T(√n) + 1. This is the same as that of Problem-41, so T(n) = O(loglogn).
Problem-44 Analyze the running time of the following recursive pseudocode as a function of n.
public void function(int n) {
    if( n < 2 ) return;
    else counter = 0;
    for i = 1 to 8 do
        function(n/2);
    for i = 1 to n³ do
        counter = counter + 1;
}
Solution: Consider the comments in the below pseudocode and call the running time of function(n) as T(n).
public void function(int n) {
    if( n < 2 ) return; //constant time
    else counter = 0;
    // this loop executes 8 times with n value half in every call
    for i = 1 to 8 do
        function(n/2);
    // this loop executes n³ times with constant time body
    for i = 1 to n³ do
        counter = counter + 1;
}
T(n) can be defined as follows:
    T(n) = 1,                  if n < 2,
         = 8T(n/2) + n³ + 1,   otherwise.
Using the master theorem gives: T(n) = Θ(n^(log₂8) × logn) = Θ(n³logn).
Problem-45 Find the complexity of the pseudocode given below:
temp = 1
repeat
for i = 1 to n
temp = temp + 1;
n = n/2;
until n <= 1
Solution: Consider the comments in the pseudocode given below:
temp = 1 // constant time
repeat
// this loops executes n times
for i = 1 to n
temp = temp + 1;
    // the problem size is halved: n/2
    n = n/2;
until n <= 1
The recurrence for this function is T(n) = T(n/2) + n. Using the master theorem we get: T(n) = O(n).
Problem-46 Running time of the following program?
public void function(int n) {
for(int i = 1 ; i <= n ; i + + )
for(int j = 1 ; j <= n ; j * = 2 )
System.out.println(“*”);
}
Solution: Consider the comments in the function given below:
public void function(int n) {
    // this loop executes n times
    for(int i = 1; i <= n; i++)
        // this loop executes logn times from our logarithms guideline
        for(int j = 1; j <= n; j *= 2)
            System.out.println("*");
}
Complexity of the above program is O(nlogn).
Problem-47 Running time of the following program?
public void function(int n) {
for(int i = 1 ; i <= n/3 ; i + + )
for(int j = 1 ; j <= n ; j += 4 )
System.out.println(“ ∗ ”);
}
Solution: Consider the comments in the function given below:
public void function(int n) {
// this loops executes n/3 times
for(int i = 1 ; i <= n/3 ; i + + )
// this loops executes n/4 times
for(int j = 1 ; j <= n ; j += 4)
System.out.println(“ ∗ ”);
}
The time complexity of this program is: O(n/3 × n/4) = O(n²).
Problem-48 Find the complexity of the below function:
public void function(int n) {
    if(n <= 1) return;
    if(n > 1) {
        System.out.println(" * ");
        function( n/2 );
        function( n/2 );
    }
}
Solution: Consider the comments in the function given below:
public void function(int n) {
if(n <= 1) return; //constant time
if(n > 1) {
System.out.println(“ ∗ ”); //constant time
//recursion with n/2 value
function( n/2 );
//recursion with n/2 value
function( n/2 );
}
}
The recurrence for this function is: T(n) = 2T(n/2) + 1. Using the master theorem, we get T(n) = O(n).
Problem-49 Find the complexity of the below function:
public void function(int n) {
int i=1;
while (i < n) {
int j=n;
while(j > 0)
j = j/2;
i=2*i;
} // i
}
Solution:
public void function(int n) {
int i=1;
while (i < n) {
int j=n;
while(j > 0)
j = j/2; //logn code
i=2*i; //logn times
} // i
}
Time Complexity: O(logn ∗ logn) = O(log²n).
Problem-50 ∑ O(n), over 1 ≤ i ≤ n, where O(n) stands for order n, is:
(a) O(n) (b) O(n²) (c) O(n³) (d) O(3n²) (e) O(1.5n²)
Solution: (b). ∑ O(n) = O(n) ∑ 1 = O(n ∙ n) = O(n²).
Problem-51 Which of the following three claims are correct?
I (n + k)^m = Θ(n^m), where k and m are constants   II 2^(n+1) = O(2^n)   III 2^(2n+1) = O(2^n)
(a) I and II (b) I and III (c) II and III (d) I, II and III
Solution: (a). (I) (n + k)^m = n^m + c1∗n^(m−1) + ... = Θ(n^m) and (II) 2^(n+1) = 2∗2^n = O(2^n).
Problem-52 Consider the following functions:
    f(n) = 2^n    g(n) = n!    h(n) = n^logn
Which of the following statements about the asymptotic behavior of f(n), g(n), and h(n) is true?
(A) f(n) = O(g(n)); g(n) = O(h(n))   (B) f(n) = Ω(g(n)); g(n) = O(h(n))
(C) g(n) = O(f(n)); h(n) = O(f(n))   (D) h(n) = O(f(n)); g(n) = Ω(f(n))
Solution: (D). According to the rate of growth: h(n) < f(n) < g(n) (g(n) is asymptotically greater than f(n), and f(n) is asymptotically greater than
h(n)). We can easily see the above order by taking logarithms of the given 3 functions: log²n < n < log(n!). Note that, log(n!) =
O(nlogn).
Problem-53 Consider the following segment of C-code:
int j=1, n;
while (j <=n)
j = j*2;
The number of comparisons made in the execution of the loop for any n > 0 is:
(A) ceil(log₂n) + 1 (B) n (C) ceil(log₂n) (D) floor(log₂n) + 1
Solution: (A). Let us assume that the loop executes k times. After the kth step the value of j is 2^k. Taking logarithms on both sides gives k = log₂n.
Since we are doing one more comparison for exiting from the loop, the answer is ceil(log₂n) + 1.
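For a non-power-of-two n this count is easy to verify by instrumenting the loop (an illustrative sketch added here, not from the original text):

```java
public class ComparisonCount {
    // Counts the j <= n comparisons made by the loop in the question.
    public static int comparisons(int n) {
        int j = 1, count = 0;
        while (true) {
            count++;            // one comparison of j against n
            if (j > n) break;   // the final, failing comparison also counts
            j = j * 2;
        }
        return count;
    }

    public static void main(String[] args) {
        // n = 5: j takes values 1, 2, 4 (true), then 8 (false) -> 4 comparisons,
        // matching ceil(log2(5)) + 1 = 3 + 1 = 4.
        System.out.println(comparisons(5)); // 4
    }
}
```

(When n is an exact power of two the loop body also runs at j = n, giving one extra successful comparison; the formula matches all other n.)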
Problem-54 Consider the following C code segment. Let T(n) denote the number of times the for loop is executed by the program
on input n. Which of the following is TRUE?
public int IsPrime(int n){
for(int i=2;i<=sqrt(n);i++)
if(n%i == 0)
{printf(“Not Prime\n”); return 0;}
return 1;
}
(A) T(n) = O(√n) and T(n) = Ω(√n)   (B) T(n) = O(√n) and T(n) = Ω(1)
(C) T(n) = O(n) and T(n) = Ω(√n)   (D) None of the above
Solution: (B). Big O notation describes the tight upper bound and Big Omega notation describes the tight lower bound for an algorithm. The
for loop in the question runs a maximum of √n times and a minimum of 1 time. Therefore, T(n) = O(√n) and T(n) = Ω(1).
Problem-55 In the following C function, let n ≥ m. How many recursive calls are made by this function?
public int gcd(n,m){
if (n%m ==0) return m;
n = n%m;
return gcd(m,n);
}
(A) ( ) (B) ( ) (C) ( ) (D) ( )
Solution: No option is correct. Big O notation describes the tight upper bound and Big Omega notation describes the tight lower bound for
an algorithm. For m = 2 and for all n = 2^k, the running time is O(1), which contradicts every option.
Problem-56 Suppose T(n) = 2T(n/2) + n, T(0) = T(1) = 1. Which one of the following is FALSE?
(A) T(n) = O(n²) (B) T(n) = Θ(nlogn) (C) T(n) = Ω(n²) (D) T(n) = O(nlogn)
Solution: (C). Big O notation describes the tight upper bound and Big Omega notation describes the tight lower bound for an algorithm.
Based on the master theorem, we get T(n) = Θ(nlogn). This indicates that the tight lower bound and tight upper bound are the same. That means
O(nlogn) and Ω(nlogn) are correct for the given recurrence. So option (C) is wrong.
Problem-57 Find the complexity of the below function:
public void function(int n) {
for (int i = 0; i<n; i++)
for(int j=i; j<i*i; j++)
if (j %i == 0){
for (int k = 0; k < j; k++)
printf(" * ");
}
}
Solution:
public void function(int n) {
for (int i = 0; i<n; i++) // Executes n times
for(int j=i; j<i*i; j++) // Executes n*n times
if (j %i == 0){
for (int k = 0; k < j; k++) // Executes j times = (n*n) times
printf(" * ");
}
}
Time Complexity: O(n⁵).
Problem-58 To calculate 9^n, give an algorithm and discuss its complexity.
Solution: Start with 9 and keep multiplying by 9 until reaching 9^n.
Time Complexity: There are n − 1 multiplications and each takes constant time, giving a Θ(n) algorithm.
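The iterative approach can be sketched as follows (illustrative; method names are made up):

```java
public class NaivePower {
    // Computes 9^n with n - 1 multiplications (for n >= 1): Theta(n) time.
    public static long powerOf9(int n) {
        long result = 9;
        for (int i = 1; i < n; i++)
            result = result * 9;
        return result;
    }

    public static void main(String[] args) {
        System.out.println(powerOf9(3)); // 729
    }
}
```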
Problem-59 For Problem-58, can we improve the time complexity?
Solution: Refer to the Divide and Conquer chapter.
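The improvement hinted at in Problem-59 is presumably exponentiation by repeated squaring, which needs only O(logn) multiplications; a sketch under that assumption:

```java
public class FastPower {
    // Computes base^n using repeated squaring: O(logn) multiplications.
    public static long power(long base, int n) {
        if (n == 0) return 1;
        long half = power(base, n / 2);
        if (n % 2 == 0)
            return half * half;
        else
            return half * half * base;
    }

    public static void main(String[] args) {
        System.out.println(power(9, 5)); // 59049
    }
}
```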
Problem-60 Find the complexity of the below function:
public void function(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        if (i > j)            // assume j is defined elsewhere
            sum = sum + 1;
        else {
            for (int k = 0; k < n; k++)
                sum = sum - 1;
        }
}
Solution: Consider the worst-case.
public void function(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)          // Executes n times
        if (i > j)
            sum = sum + 1;               // Executes n times
        else {
            for (int k = 0; k < n; k++)  // Executes n times
                sum = sum - 1;
        }
}
Time Complexity: O(n²).
Problem-61 Solve the following recurrence relation using the recursion tree method: T(n) = T(n/2) + T(2n/3) + n².
Solution: How much work do we do in each level of the recursion tree?
[Recursion tree: the root T(n) splits into T(n/2) and T(2n/3); each of those splits again in the same proportions, level by level.]
In level 0, we take n² time. At level 1, the two subproblems take time:
    (n/2)² + (2n/3)² = (1/4)n² + (4/9)n² = (25/36)n²
At level 2 the four subproblems are of size n/4, n/3, n/3, and 4n/9 respectively. These four subproblems take time:
    (n/4)² + (n/3)² + (n/3)² + (4n/9)² = (625/1296)n² = (25/36)²n²
Similarly, the amount of work at level k is at most (25/36)^k n².
Let ∝ = 25/36; the total runtime is then:
    T(n) ≤ ∑ ∝^k n²  (sum over k ≥ 0)
         = (1/(1 − ∝)) n²
         = (1/(1 − 25/36)) n²
         = (1/(11/36)) n²
         = (36/11) n²
         = O(n²)
That is, the first level provides a constant fraction of the total runtime.
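The geometric-series bound can be checked numerically: evaluating the recurrence with integer division, T(n)/n² stays bounded by a constant near 36/11 ≈ 3.27 (an illustrative sketch; the base cases T(0) = T(1) = 1 are assumed here):

```java
public class RecursionTreeCheck {
    // T(n) = T(n/2) + T(2n/3) + n^2, with T(0) = T(1) = 1 assumed as base cases.
    public static long T(int n) {
        if (n <= 1) return 1;
        return T(n / 2) + T(2 * n / 3) + (long) n * n;
    }

    public static void main(String[] args) {
        int n = 1000;
        double ratio = (double) T(n) / ((double) n * n);
        // Stays below the 36/11 bound derived above (floors only shrink subproblems).
        System.out.println(ratio);
    }
}
```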
Problem-62 Find the time complexity of the recurrence T(n) = T(n/2) + T(n/4) + T(n/8) + n.
Solution: Let us solve this problem by the method of guessing. The total size on each level of the recurrence tree is less than n, so we guess that
f(n) = n will dominate. Assume for all i < n that c₁i ≤ T(i) ≤ c₂i. Then,
    c₁n/2 + c₁n/4 + c₁n/8 + kn ≤ T(n) ≤ c₂n/2 + c₂n/4 + c₂n/8 + kn
    c₁(n/2 + n/4 + n/8) + kn ≤ T(n) ≤ c₂(n/2 + n/4 + n/8) + kn
    c₁(7n/8) + kn ≤ T(n) ≤ c₂(7n/8) + kn
If c₂ ≥ 8k and c₁ ≤ 8k, then c₁n ≤ T(n) ≤ c₂n. So, T(n) = Θ(n). In general, if you have multiple recursive calls, the sum of the arguments to
those calls is less than n (in this case n/2 + n/4 + n/8 < n), and f(n) is reasonably large, a good guess is T(n) = Θ(f(n)).
Problem-63 Rank the following functions by order of growth: (n + 1)!, n!, 4^n, n×3^n, 3^n + n² + 20n, (n/2)^(n/2), 4n², 4^logn, n² + 200,
20n + 500, 2^logn, n^(1/2), 1.
Solution: (Listed in decreasing rate of growth.)
    Function          Rate of Growth
    (n + 1)!          O(n!)
    n!                O(n!)
    4^n               O(4^n)
    n×3^n             O(n3^n)
    3^n + n² + 20n    O(3^n)
    (n/2)^(n/2)       O((n/2)^(n/2))
    4n²               O(n²)
    4^logn            O(n²)
    n² + 200          O(n²)
    20n + 500         O(n)
    2^logn            O(n)
    n^(1/2)           O(n^(1/2))
    1                 O(1)
Problem-64 Can we say 3^(n+1) = O(3^n)?
Solution: Yes: because 3^(n+1) = 3∙3^n, which is a constant multiple of 3^n.
Problem-65 Can we say 2^(3n) = O(2^n)?
Solution: No: because 2^(3n) = (2^n)³ = 8^n, which is not less than c∙2^n for any constant c.

Data Structures and Algorithms Made Easy in Java Recursion and Backtracking

Chapter

RECURSION AND
BACKTRACKING 2
2.1 Introduction
In this chapter, we will look at one of the important topics, "recursion", which will be used in almost every chapter, and also its relative
"backtracking".

2.2 What is Recursion?


Any function which calls itself is called recursive. A recursive method solves a problem by calling a copy of itself to work on a smaller
problem. This is called the recursion step. The recursion step can result in many more such recursive calls. It is important to ensure that the
recursion terminates. Each time the function calls itself with a slightly simpler version of the original problem. The sequence of smaller
problems must eventually converge on the base case.

2.3 Why Recursion?


Recursion is a useful technique borrowed from mathematics. Recursive code is generally shorter and easier to write than iterative code.
Generally, loops are turned into recursive functions when they are compiled or interpreted. Recursion is most useful for tasks that can be
defined in terms of similar subtasks. For example, sort, search, and traversal problems often have simple recursive solutions.

2.4 Format of a Recursive Function


A recursive function performs a task in part by calling itself to perform the subtasks. At some point, the function
encounters a subtask that it can perform without calling itself. This case, where the function does not recur, is called the base case. The
former, where the function calls itself to perform a subtask, is referred to as the recursive case. We can write all recursive functions using
the format:
if(test for the base case)
return some base case value
else if(test for another base case)
return some other base case value
// the recursive case
else return (some work and then a recursive call)
As an example consider the factorial function: n! is the product of all integers between n and 1. The definition of recursive factorial looks
like:
    n! = 1,               if n = 0
    n! = n ∗ (n − 1)!     if n > 0
This definition can easily be converted to a recursive implementation. Here the problem is determining the value of n!, and the subproblem is
determining the value of (n − 1)!. In the recursive case, when n is greater than 1, the function calls itself to determine the value of (n − 1)! and
multiplies that with n. In the base case, when n is 0 or 1, the function simply returns 1. This looks like the following:
class Factorial {
// recursive definition of method factorial
public long factorial(long number) {
if (number <= 1) // test for base case
return 1; // base cases: 0! = 1 and 1! = 1
else
// recursion step
return number * factorial(number - 1);
}

public static void main(String args[]) {


Factorial obj = new Factorial();
for (int counter = 0; counter <= 10; counter++)
System.out.printf("%d! = %d\n", counter, obj.factorial(counter));

}
}

2.5 Recursion and Memory (Visualization)


Each recursive call makes a new copy of that method (actually only the variables) in memory. Once a method ends (that is, returns some
data), the copy of that returning method is removed from memory. The recursive solutions look simple but visualization and tracing takes
time. For better understanding, let us consider the following example.
public int Print(int n) {
if( n == 0) // this is the terminating base case
return 0;
else {
System.out.println(n);
return Print(n-1); // recursive call to itself again
}
}
For this example, if we call the print function with n=4, visually our memory assignments may look like:

Print(4)
    Print(3)
        Print(2)
            Print(1)
                Print(0)
                returns 0
            returns 0
        returns 0
    returns 0
returns 0 to main function

Now, let us consider our factorial function. The visualization of factorial function with n = 4 will look like:

4!
    4 ∗ 3!
        3 ∗ 2!
            2 ∗ 1!
                1 is returned
            2∗1 = 2 is returned
        3∗2 = 6 is returned
    4∗6 = 24 is returned
Returns 24 to main function

2.6 Recursion versus Iteration


While discussing recursion, the basic question that comes to mind is: which way is better? – iteration or recursion? The answer to this question
depends on what we are trying to do. A recursive approach mirrors the problem that we are trying to solve. A recursive approach makes it
simpler to solve a problem that may not have the most obvious of answers. But, recursion adds overhead for each recursive call (needs space
on the stack frame).

Recursion
 Terminates when a base case is reached.
 Each recursive call requires extra space on the stack frame (memory).
 If we get infinite recursion, the program may run out of memory and result in stack overflow.
 Solutions to some problems are easier to formulate recursively.

Iteration
 Terminates when a condition is proven to be false.
 Each iteration does not require any extra space.
 An infinite loop could loop forever since there is no extra memory being created.
 Iterative solutions to a problem may not always be as obvious as a recursive solution.
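The trade-off above is easy to see on a small example, factorial written both ways (an illustrative sketch):

```java
public class FactorialBothWays {
    // Recursive version: mirrors the mathematical definition,
    // but each call occupies a stack frame until it returns.
    public static long recursive(int n) {
        if (n <= 1) return 1;
        return n * recursive(n - 1);
    }

    // Iterative version: same result, constant extra space.
    public static long iterative(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++)
            result *= i;
        return result;
    }

    public static void main(String[] args) {
        System.out.println(recursive(5) + " " + iterative(5)); // 120 120
    }
}
```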

2.7 Notes on Recursion


 Recursive algorithms have two types of cases, recursive cases and base cases.
 Every recursive function case must terminate at a base case.
 Generally, iterative solutions are more efficient than recursive solutions [due to the overhead of function calls].
 A recursive algorithm can be implemented without recursive function calls using a stack, but it’s usually more trouble than it’s worth.
That means any problem that can be solved recursively can also be solved iteratively.
 For some problems, there are no obvious iterative algorithms.
 Some problems are best suited for recursive solutions while others are not.

2.8 Example Algorithms of Recursion


• Fibonacci Series, Factorial Finding
• Merge Sort, Quick Sort
• Binary Search
• Tree Traversals and many Tree Problems: InOrder, PreOrder, PostOrder
• Graph Traversals: DFS [Depth First Search] and BFS [Breadth First Search]
• Dynamic Programming Examples
• Divide and Conquer Algorithms
• Towers of Hanoi
• Backtracking Algorithms [we will discuss in next section]

2.9 Recursion: Problems & Solutions


In this chapter we cover a few problems with recursion and we will discuss the rest in other chapters. By the time you complete reading the
entire book, you will encounter many recursion problems.
Problem-1 Discuss Towers of Hanoi puzzle.
Solution: The Towers of Hanoi is a mathematical puzzle. It consists of three rods (or pegs or towers) and a number of disks of different sizes
which can slide onto any rod. The puzzle starts with the disks on one rod in ascending order of size, the smallest at the top, thus making a
conical shape. The objective of the puzzle is to move the entire stack to another rod, satisfying the following rules:
 Only one disk may be moved at a time.
 Each move consists of taking the upper disk from one of the rods and sliding it onto another rod, on top of the other disks that may
already be present on that rod.
 No disk may be placed on top of a smaller disk.
Algorithm
 Move the top n − 1 disks from the source tower to the auxiliary tower,
 Move the nth disk from the source tower to the destination tower,
 Move the n − 1 disks from the auxiliary tower to the destination tower.
 Transferring the top n − 1 disks from the source tower to the auxiliary tower can again be thought of as a fresh problem and can be solved
in the same manner. Once we solve it with three disks, we can solve it with any number of disks with the above
algorithm.
public void TowersOfHanoi(int n, char frompeg, char topeg, char auxpeg) {
    /* If only 1 disk, make the move and return */
    if(n == 1) {
        System.out.println("Move disk 1 from peg " + frompeg + " to peg " + topeg);
        return;
    }
    /* Move top n-1 disks from A to B, using C as auxiliary */
    TowersOfHanoi(n-1, frompeg, auxpeg, topeg);
    /* Move remaining disk from A to C */
    System.out.println("Move disk " + n + " from peg " + frompeg + " to peg " + topeg);
    /* Move n-1 disks from B to C using A as auxiliary */
    TowersOfHanoi(n-1, auxpeg, topeg, frompeg);
}
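As a quick check on the algorithm, counting the moves instead of printing them gives 2^n − 1 moves for n disks (an illustrative variant of the routine above):

```java
public class HanoiMoves {
    // Counts the moves made by the recursive Towers of Hanoi algorithm.
    public static long countMoves(int n) {
        if (n == 0) return 0;
        if (n == 1) return 1;
        // top n-1 to auxiliary, one move for the largest disk, then n-1 to destination
        return countMoves(n - 1) + 1 + countMoves(n - 1);
    }

    public static void main(String[] args) {
        System.out.println(countMoves(3)); // 7, i.e. 2^3 - 1
    }
}
```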
Problem-2 Given an array, check whether the array is in sorted order with recursion.
Solution:
public int isArrayInSortedOrder(int[] A, int index){
    if(A.length == 1 || index == 1)
        return 1;
    return (A[index - 1] < A[index - 2]) ? 0 : isArrayInSortedOrder(A, index - 1);
}
Time Complexity: O( ). Space Complexity: O( ) for recursive stack space.

2.10 What is Backtracking?


Backtracking is an improvement of the brute force approach. It systematically searches for a solution to a problem among all available options.
In backtracking, we start with one possible option out of many available options and try to solve the problem; if we are able to solve the
problem with the selected move then we will print the solution, else we will backtrack and select some other option and try to solve it. If none
of the options work out we will claim that there is no solution for the problem.
Backtracking is a form of recursion. The usual scenario is that you are faced with a number of options, and you must choose one of these.
After you make your choice you will get a new set of options; just what set of options you get depends on what choice you made. This
procedure is repeated over and over until you reach a final state. If you made a good sequence of choices, your final state is a goal state; if you
didn't, it isn't.
Backtracking can be thought of as a selective tree/graph traversal method. The tree is a way of representing some initial starting position (the
root node) and a final goal state (one of the leaves). Backtracking allows us to deal with situations in which a raw brute-force approach would
explode into an impossible number of options to consider. Backtracking is a sort of refined brute force. At each node, we eliminate choices
that are obviously not possible and proceed to recursively check only those that have potential.
What’s interesting about backtracking is that we back up only as far as needed to reach a previous decision point with an as-yet-unexplored
alternative. In general, that will be at the most recent decision point. Eventually, more and more of these decision points will have been fully
explored, and we will have to backtrack further and further. If we backtrack all the way to our initial state and have explored all alternatives
from there, we can conclude the particular problem is unsolvable. In such a case, we will have done all the work of the exhaustive recursion
and known that there is no viable solution possible.
 Sometimes the best algorithm for a problem is to try all possibilities.
 This is always slow, but there are standard tools that can be used to help.
 Tools: algorithms for generating basic objects, such as binary strings [2^n possibilities for an n-bit string], permutations
[n!], combinations [n!/(r!(n − r)!)], general strings [k-ary strings of length n have k^n possibilities], etc...
 Backtracking speeds the exhaustive search by pruning.

2.11 Example Algorithms of Backtracking


 Binary Strings: generating all binary strings
 Generating k-ary Strings
 The Knapsack Problem
 N-Queens Problem
 Generalized Strings
 Hamiltonian Cycles [refer to the Graphs chapter]
 Graph Coloring Problem

2.12 Backtracking: Problems & Solutions


Problem-3 Generate all the strings of n bits. Assume A[0..n − 1] is an array of size n.
Solution:
import java.util.*;
public class BinaryStrings {
int[] A;
public BinaryStrings(int n) {
A = new int[n];
}
public void binary(int n) {
if (n <= 0) {
System.out.println(Arrays.toString(A));
} else {
A[n - 1] = 0;
binary(n - 1);
A[n - 1] = 1;
binary(n - 1);
}

}
public static void main(String[] args) throws java.lang.Exception {
int n = 4;
BinaryStrings i = new BinaryStrings(n);
i.binary(n);
}
}
Let T(n) be the running time of binary(n). Assume the function System.out.println takes time O(1).
    T(n) = c,               if n ≤ 0
         = 2T(n − 1) + d,   otherwise
Using the Subtraction and Conquer Master theorem, we get T(n) = O(2^n). This means the algorithm for generating bit-strings is optimal.
Problem-4 Generate all the strings of length n drawn from 0...k − 1.
Solution: Let us assume we keep the current k-ary string in an array A[0..n − 1]. Call the function base_K_strings(n, k):
import java.util.*;
class K_aryStrings {
    int[] A;
    public K_aryStrings(int n) {
        A = new int[n];
    }
    public void base_K_strings(int n, int k) {
        //process all k-ary strings of length n
if(n <= 0)
System.out.println(Arrays.toString(A)); //Assume array A is a class variable
else {
for (int j = 0 ; j < k ; j++) {
A[n-1] = j;
base_K_strings(n - 1, k);
}
}
}
public static void main(String[] args) throws java.lang.Exception {
int n = 4;
K_aryStrings obj = new K_aryStrings (n);
obj.base_K_strings(n, 3);
}
}
Let T(n) be the running time of base_K_strings(n). Then,
    T(n) = c,               if n ≤ 0
         = kT(n − 1) + d,   otherwise
Using the Subtraction and Conquer Master theorem, we get: T(n) = O(k^n).
Note: For more problems, refer to the String Algorithms chapter.
Problem-5 Solve the recurrence T(n) = 2T(n − 1) + 2^n.
Solution: At each level of the recurrence tree, the number of problems is double that of the previous level, while the amount of work being
done in each problem is half that of the previous level. Formally, the ith level has 2^i problems, each requiring 2^(n−i) work. Thus the ith level
requires exactly 2^n work. The depth of this tree is n, because at the ith level, the originating call will be T(n − i). Thus the total complexity
for T(n) is O(n2^n).

Data Structures and Algorithms Made Easy in Java Linked Lists

Chapter

LINKED LISTS 3
3.1 What is a Linked List?
One disadvantage of using arrays to store data is that arrays are static structures and therefore cannot be easily extended or reduced to fit the
data set. Arrays are also expensive for new insertions and deletions. In this chapter we consider another data structure called Linked
Lists that addresses some of the limitations of arrays. A linked list is a data structure used for storing collections of data. A linked list has the
following properties. A linked list is a linear dynamic data structure. The number of nodes in a list is not fixed and can grow and shrink on
demand. Each node of a linked list is made up of two items - the data and a reference to the next node. The last node has a reference to null.
The entry point into a linked list is called the head of the list. It should be noted that head is not a separate node, but the reference to the first
node. If the list is empty then the head is a null reference.
 Successive elements are connected by pointers.
 The last element points to NULL.
 Can grow or shrink in size during execution of a program.
 Can be made just as long as required (until systems memory exhausts).
 Does not waste memory space (but takes some extra memory for pointers). It allocates memory as list grows.

head → 4 → 15 → 7 → 40 → NULL
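The node structure described above can be written as a minimal Java class (names here are illustrative; the book's own ListNode definition may differ):

```java
public class SimpleLinkedList {
    // One node: the data plus a reference to the next node.
    static class Node {
        int data;
        Node next;   // null for the last node
        Node(int data) { this.data = data; }
    }

    public static void main(String[] args) {
        // Build the list head -> 4 -> 15 -> 7 -> 40 -> null shown above.
        Node head = new Node(4);
        head.next = new Node(15);
        head.next.next = new Node(7);
        head.next.next.next = new Node(40);

        // Traverse from head until the null reference is reached.
        for (Node cur = head; cur != null; cur = cur.next)
            System.out.print(cur.data + " ");
        System.out.println();
    }
}
```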

3.2 Linked Lists ADT


The following operations make linked lists an ADT:
Main Linked Lists Operations
 Insert: inserts an element into the list
 Delete: removes and returns the specified position element from the list
Auxiliary Linked Lists Operations
 Delete List: removes all elements of the list (disposes the list)
 Count: returns the number of elements in the list
 Find node from the end of the list

3.3 Why Linked Lists?


There are many other data structures that do the same thing as linked lists. Before discussing linked lists it is important to understand the
difference between linked lists and arrays. Both linked lists and arrays are used to store collections of data, and since both are used for the
same purpose, we need to differentiate their usage. That means we should know in which cases arrays are suitable and in which cases linked
lists are suitable.

3.4 Arrays Overview


One memory block is allocated for the entire array to hold the elements of the array. The array elements can be accessed in constant time by
using the index of the particular element as the subscript.
Value  3  2  1  2  2  3
Index  0  1  2  3  4  5

Why Constant Time for Accessing Array Elements?


To access an array element, the address of an element is computed as an offset from the base address of the array and one multiplication is
needed to compute what is supposed to be added to the base address to get the memory address of the element. First the size of an element
of that data type is calculated and then it is multiplied with the index of the element to get the value to be added to the base address. This
process takes one multiplication and one addition. Since these two operations take constant time, we can say the array access can be performed
in constant time.
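For example, for an array of 4-byte ints, element i lives at base + i × 4: one multiplication and one addition. The arithmetic can be sketched as follows (illustrative only, since Java does not expose raw memory addresses):

```java
public class ArrayOffset {
    // Byte offset of element i in an array whose elements occupy
    // elementSize bytes each: a single multiplication.
    public static long offset(int index, int elementSize) {
        return (long) index * elementSize;
    }

    public static void main(String[] args) {
        long base = 1000;   // pretend base address of the array
        // Element 3 of an int[] (4-byte elements) sits 12 bytes past the base.
        System.out.println(base + offset(3, 4)); // 1012
    }
}
```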

Advantages of Arrays
 Simple and easy to use
 Faster access to the elements (constant access)

Disadvantages of Arrays
 Preallocates all needed memory up front and wastes memory space for indices in the array that are empty.
 Fixed size: The size of the array is static (specify the array size before using it).
 One block allocation: To allocate the array itself at the beginning, sometimes it may not be possible to get the memory for the
complete array (if the array size is big).
 Complex position-based insertion: To insert an element at a given position, we may need to shift the existing elements. This will
create a position for us to insert the new element at the desired position. If the position at which we want to add an element is at
the beginning, then the shifting operation is more expensive.

Dynamic Arrays
Dynamic array (also called growable array, resizable array, dynamic table, or array list) is a random access, variable-size list data
structure that allows elements to be added or removed.
One simple way of implementing dynamic arrays is to initially start with some fixed size array. As soon as that array becomes full, create the
new array double the size of the original array. Similarly, reduce the array size to half if the elements in the array are less than half the size.
Note: We will see the implementation for dynamic arrays in the Stacks, Queues and Hashing chapters.
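A minimal doubling dynamic array might look like this (a sketch of the strategy just described, not the book's later implementation; shrinking on deletion is omitted):

```java
import java.util.Arrays;

public class DynamicArray {
    private int[] data = new int[1]; // start with space for one element
    private int size = 0;

    // Append, doubling the underlying array when it is full.
    public void add(int value) {
        if (size == data.length)
            data = Arrays.copyOf(data, 2 * data.length);
        data[size++] = value;
    }

    public int get(int index) { return data[index]; }
    public int size() { return size; }
    public int capacity() { return data.length; }

    public static void main(String[] args) {
        DynamicArray a = new DynamicArray();
        for (int i = 0; i < 5; i++) a.add(i);
        // Capacity grew 1 -> 2 -> 4 -> 8 while size reached 5.
        System.out.println(a.size() + " " + a.capacity()); // 5 8
    }
}
```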

Advantages of Linked Lists


Linked lists have both advantages and disadvantages. The advantage of linked lists is that they can be expanded in constant time. To create
an array, we must allocate memory for a certain number of elements. To add more elements to the array when full, we must create a new
array and copy the old array into the new array. This can take a lot of time.
We can prevent this by allocating lots of space initially but then we might allocate more than we need and waste memory. With a linked list,
we can start with space for just one allocated element and on new elements easily without the need to do any copying and reallocating.

Issues with Linked Lists (Disadvantages)


There are a number of issues with linked lists. The main disadvantage of linked lists is access time to individual elements. An array is random-access, which means it takes O(1) to access any element in the array. Linked lists take O(n) in the worst case to access an element in the list. Another advantage of arrays in access time is spatial locality in memory. Arrays are defined as contiguous blocks of memory, and so any array element will be physically near its neighbors. This greatly benefits from modern CPU caching methods.
Although the dynamic allocation of storage is a great advantage, the overhead of storing and retrieving data can make a big difference. Sometimes linked lists are hard to manipulate. If the last item is deleted, the last but one must then have its pointer changed to hold a NULL reference. This requires that the list is traversed to find the last but one link, and its pointer set to a NULL reference. Finally, linked lists waste memory in terms of extra reference pointers.

3.5 Comparison of Linked Lists with Arrays & Dynamic Arrays


As with most choices in computer programming and design, no method is well suited to all circumstances. A linked list data structure might
work well in one case, but cause problems in another. This is a list of some of the common tradeoffs involving linked list structures. In general,
if you have a dynamic collection, where elements are frequently being added and deleted, and the location of new elements added to the list
is significant, then the benefits of a linked list increase.
Parameter                        Linked list          Array                                                Dynamic array
Indexing                         O(n)                 O(1)                                                 O(1)
Insertion/deletion at beginning  O(1)                 O(n), if array is not full (for shifting elements)   O(n)
Insertion at ending              O(n)                 O(1), if array is not full                           O(1), if array is not full; O(n), if array is full
Deletion at ending               O(n)                 O(1)                                                 O(n)
Insertion in middle              O(n)                 O(n), if array is not full (for shifting elements)   O(n)
Deletion in middle               O(n)                 O(n), if array is not full (for shifting elements)   O(n)
Wasted space                     O(n) (for pointers)  0                                                    O(n)

3.6 Singly Linked Lists


The linked list consists of a series of structures called nodes. We can think of each node as a record. The first part of the record is a field that stores the data, and the second part of the record is a field that stores a pointer to a node. So, each node contains two fields: a data field and a next field, which is a pointer used to link one node to the next node. Generally "linked list" means a singly linked list. This list consists of a number of nodes in which each node has a next pointer to the following element. The link of the last node in the list is NULL, which indicates the end of the list. Each node is allocated on the heap, so the node memory continues to exist until the node is released (in C this is done explicitly with malloc() and free(); in Java, unreferenced nodes are reclaimed by the garbage collector). The node called the head is the first node in the list. The last node's next pointer points to NULL.

4 15 7 40 NULL

Head
Following is a type declaration for a linked list:
public class ListNode {
    private int data;
    private ListNode next;

    public ListNode(int data) {
        this.data = data;
    }
    public void setData(int data) {
        this.data = data;
    }
    public int getData() {
        return data;
    }
    public void setNext(ListNode next) {
        this.next = next;
    }
    public ListNode getNext() {
        return this.next;
    }
}

Basic Operations on a List


• Traversing the list
• Inserting an item in the list
• Deleting an item from the list

Traversing the Linked List


Let us assume that the head points to the first node of the list. To traverse the list we do the following:
• Follow the pointers.
• Display the contents of the nodes (or count them) as they are traversed.
• Stop when the next pointer points to NULL.

4 15 7 40 NULL

head
The length() function takes a linked list as input and counts the number of nodes in the list. The same function can also be used for printing the list data by adding an extra print statement inside the loop.
public int length(ListNode headNode) {
    int length = 0;
    ListNode currentNode = headNode;
    while (currentNode != null) {
        length++;
        currentNode = currentNode.getNext(); // next is private, so use the accessor
    }
    return length;
}
Time Complexity: O(n), for scanning the list of size n. Space Complexity: O(1), for creating a temporary variable.

Singly Linked List Insertion


Insertion into a singly linked list has three cases:
• Inserting a new node before the head (at the beginning)
• Inserting a new node after the tail (at the end of the list)
• Inserting a new node in the middle of the list (random location)
Note: To insert an element in the linked list at some position n, assume that after inserting the element the position of this new node is n.


Inserting a Node in Singly Linked List at the Beginning


In this case, a new node is inserted before the current head node. Only one next pointer needs to be modified (the new node's next pointer), and it can be done in two steps:
• Update the next pointer of the new node to point to the current head.
new node
data 15 7 40 NULL

head
• Update the head pointer to point to the new node.
new node
data 15 7 40 NULL

head
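The two steps above can be sketched in Java as follows. This is our own simplified illustration, not the book's implementation; it uses a bare-bones node with public fields instead of the ListNode accessors shown earlier.

```java
class SLLInsertFront {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Insert before the current head; returns the new head,
    // so the caller completes step 2 by assigning head to the result.
    static Node insertAtBeginning(Node head, int data) {
        Node newNode = new Node(data);
        newNode.next = head;  // step 1: new node points to the current head
        return newNode;       // step 2: the new node becomes the head
    }
}
```

Note that neither step depends on the list length, which is why insertion at the beginning is O(1).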

Inserting a Node in Singly Linked List at the Ending


In this case, we need to modify two next pointers (the last node's next pointer and the new node's next pointer).
• The new node's next pointer points to NULL.
NULL
new node
4 15 7 40 data NULL

head
• The last node's next pointer points to the new node.
new node
4 15 7 40 data NULL

head
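A sketch of insertion at the end, again using a bare-bones node of our own rather than the book's ListNode class:

```java
class SLLInsertEnd {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // The new node's next pointer is already null; traverse to the tail
    // and make its next pointer refer to the new node.
    static Node insertAtEnd(Node head, int data) {
        Node newNode = new Node(data);    // newNode.next defaults to null
        if (head == null)
            return newNode;               // empty list: new node becomes the head
        Node last = head;
        while (last.next != null)         // walk to the current tail
            last = last.next;
        last.next = newNode;              // tail now points to the new node
        return head;
    }
}
```

The traversal to find the tail is what makes this O(n); keeping a separate tail reference would make it O(1), at the cost of maintaining that reference on every update.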

Inserting a Node in Singly Linked List at the Middle


Let us assume that we are given a position where we want to insert the new node. In this case also, we need to modify two next pointers.
• If we want to add an element at position 3 then we stop at position 2. That means we traverse 2 nodes and insert the new node. For simplicity let us assume that the second node is called the position node. The new node points to the next node of the position where we want to add the new node.
node
4 15 7 40 NULL

head new node

data
• The position node’s next pointer now points to the new node.
node
4 15 7 40 NULL

head
data
new node
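The two pointer updates above can be sketched as follows. This is an illustrative sketch of ours (not the book's code), assuming a valid 1-based position between 1 and length + 1.

```java
class SLLInsertMiddle {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Insert so that the new node ends up at 1-based `position`.
    // To land at position p we stop at node p-1 (the "position node")
    // and splice the new node in after it.
    static Node insertAtPosition(Node head, int data, int position) {
        Node newNode = new Node(data);
        if (position == 1) {               // same as inserting at the beginning
            newNode.next = head;
            return newNode;
        }
        Node positionNode = head;
        for (int i = 1; i < position - 1; i++)
            positionNode = positionNode.next;  // traverse position-2 links
        newNode.next = positionNode.next;  // new node points past the position node
        positionNode.next = newNode;       // position node points to the new node
        return head;
    }
}
```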
Note: We can implement the three variations of the operation separately.
Time Complexity: O(n), since in the worst case we may need to insert the node at the end of the list. Space Complexity: O(1).

Singly Linked List Deletion


Similar to insertion, here we also have three cases.
• Deleting the first node
• Deleting the last node
• Deleting an intermediate node


Deleting the First Node in Singly Linked List


The first node (current head node) is removed from the list. It can be done in two steps:
• Create a temporary node which will point to the same node as that of head.

4 15 7 40 NULL

Head Temp
• Now, move the head pointer to the next node and dispose of the temporary node.

4 15 7 40 NULL

Temp Head
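A sketch of the two steps in Java (our own illustration, not the book's code). In Java there is no explicit dispose step: once nothing references the old head, the garbage collector reclaims it.

```java
class SLLDeleteFront {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Remove the current head and return the new head.
    static Node deleteFirst(Node head) {
        if (head == null)
            return null;       // nothing to delete in an empty list
        Node temp = head;      // temporary reference to the old head
        head = head.next;      // advance the head to the next node
        temp.next = null;      // detach the old node (helps the garbage collector)
        return head;
    }
}
```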

Deleting the Last node in Singly Linked List


In this case, the last node is removed from the list. This operation is a bit trickier than removing the first node, because the algorithm should find the node which is previous to the tail. It can be done in three steps:
• Traverse the list, and while traversing maintain the previous node address also. By the time we reach the end of the list, we will have two pointers, one pointing to the tail node and the other pointing to the node before the tail node.

4 15 7 40 NULL

Head Node previous to tail Tail


• Update the previous node’s next pointer with NULL.
NULL

4 15 7 40 NULL

Head
Previous node to tail Tail
• Dispose of the tail node.
NULL

4 15 7 40 NULL

Head Previous node to tail Tail
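The three steps above can be sketched as follows (our own illustration, not the book's code; disposal is left to Java's garbage collector):

```java
class SLLDeleteEnd {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Walk with two references: curr and the node before it (prev).
    // When curr is the tail, cut it off by setting prev.next to null.
    static Node deleteLast(Node head) {
        if (head == null || head.next == null)
            return null;              // zero or one node: the list becomes empty
        Node prev = head;
        Node curr = head.next;
        while (curr.next != null) {   // stop when curr is the tail
            prev = curr;
            curr = curr.next;
        }
        prev.next = null;             // unlink the tail; GC reclaims it
        return head;
    }
}
```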

Deleting an Intermediate Node in Singly Linked List


In this case, the node to be removed is always located between two nodes. Head and tail links are not updated in this case. Such a removal can be done in two steps:
• Similar to the previous case, maintain the previous node while traversing the list. Once we find the node to be deleted, change the previous node’s next pointer to the next pointer of the node to be deleted.

4 15 7 40 NULL

Head Previous node Node to be deleted


• Dispose of the current node to be deleted.

4 15 7 40 NULL

Head Previous node Node to be deleted
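A sketch of the bypass step in Java (our own illustration, not the book's code). The text does not say how the node to be deleted is identified, so this sketch assumes we search for it by value:

```java
class SLLDeleteMiddle {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Delete the first node holding `key`, maintaining the previous node
    // while traversing; prev.next is redirected past the deleted node.
    static Node deleteByValue(Node head, int key) {
        if (head == null)
            return null;
        if (head.data == key)
            return head.next;         // deleting the head is a special case
        Node prev = head;
        Node curr = head.next;
        while (curr != null && curr.data != key) {
            prev = curr;
            curr = curr.next;
        }
        if (curr != null)             // found: bypass the node to be deleted
            prev.next = curr.next;
        return head;
    }
}
```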


Time Complexity: O(n), since in the worst case we may need to delete the node at the end of the list. Space Complexity: O(1).

Deleting Singly Linked List


This works by storing the current node in some temporary variable and freeing the current node. After freeing the current node, go to the
next node with a temporary variable and repeat this process for all nodes.
Time Complexity: O(n), for scanning the complete list of size n. Space Complexity: O(1), for a temporary variable.
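The process described above can be sketched in Java as follows (our own illustration, not the book's code). In Java there is no explicit free(): breaking the links makes the nodes unreachable so the garbage collector can reclaim them.

```java
class SLLClear {
    static class Node {
        int data; Node next;
        Node(int data) { this.data = data; }
    }

    // Walk the list, saving the next node in a temporary variable before
    // cutting the current node loose, and repeat for all nodes.
    static Node deleteList(Node head) {
        Node current = head;
        while (current != null) {
            Node temp = current.next; // remember the successor first
            current.next = null;      // break the link (helps the GC)
            current = temp;           // move on to the saved successor
        }
        return null;                  // caller sets head to the returned null
    }
}
```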



say that she had returned from Homburg some time before. The house was
called Fairhaven. It was the house of a distinguished explorer and
discoverer; and the company assembled there included various members of
Lady Randolph’s special “society.” When Sir Thomas walked into the
room, where, all the male portion of the party being still in the covers, the
ladies were seated at tea, his aunt rose to meet him, from out of a little
group of her friends. Her privy council, that dread secret tribunal by which
her life was judged, were all about her in the twilight and firelight. When
his name was announced, to the great surprise of everybody, Lady
Randolph rose up with a similar but much stronger sense of vague alarm
than that which had moved the minister the previous day. “Tom!” she cried,
with surprise which she tried to make joyful; but indeed she was frightened,
not knowing what kind of news he might have come to tell. Mrs. Berry-
Montagu who was sitting as usual with her back to the light, though there
was so little of that, gave a little nod and glance aside to Lady Betsinda,
who was seated high in a throne-like, antique chair, and did not care how
strong the light was which fell on her old shiny black satin and yellow lace.
“I told you!” said Mrs. Berry-Montagu. She thought all her friend’s hopes,
so easily penetrated by those keen-eyed spectators, were about to be thrown
to the ground, and the desire to observe “how she would bear it,”
immediately stirred up those ladies to the liveliest interest. Sir Thomas,
however, when he had greeted his aunt, sat down with his usual friendly
ease, and had some tea. He was quite ready to answer all their questions,
and he was not shy about his good news, but ready to unfold them
whenever it might seem most expedient so to do.
“Straight from the Hall?” Lady Randolph said, with again a tremor. Did
this mean that he had been making preparations for his setting out?
“I got there three days ago,” said Sir Tom; “poor old house, it is a pity to
see it so neglected. It is not such a bad house—”
“A bad house! there is nothing like it in the county. If I could but see you
oftener there, Tom,” his aunt cried in spite of herself.
Sir Tom smiled, pleased with the consciousness which had not yet lost
its amusing aspect; but he did not make any reply.
“He likes his own way,” said Lady Betsinda; “I don’t blame him. If I
were a young man—and he is still a young man— I’d take my swing. When
he marries, then he’ll range himself, like all the rest, I suppose.”
“Lady Betsinda talks like a book—as she always does,” said Sir Tom,
with his great laugh; “when I marry, everything shall be changed.”
“That desirable consummation is not very near at hand, one can see,”
said Mrs. Berry-Montagu, out of the shadows, in her thin, fine voice.
Sir Tom laughed again. There was something frank, and hearty, and
joyous in the sound of his big laugh; it tempted other people to laugh too,
even when they did not know what it was about. And Lady Randolph did
not in the least know what it was about, yet the laugh gained her in spite of
herself.
“Apropos of marriage,” said Mrs. Montagu once more, “have you seen
little Miss Trevor in your wilds, Sir Tom? Our young author has gone off
there, on simulated duty of a domestic kind, but to try his best for the
heiress, I am sure. Do you think he has a chance? I am interested,” said the
little lady. “Come, the latest gossip! you must know all about it. In a
country neighborhood every scrap is worth its weight in gold.”
“I know all about it,” said Sir Tom.
“That you may be sure he does; where does all the gossip come from but
from the men? we are never so thorough. He’ll give you the worst of it, you
may take my word for that. But I like that little Lucy Trevor,” cried old
Lady Betsinda; “she was a nice, modest little thing. She never looked her
money; she was more like a little girl at home, a little kitten to play with. I
hope she is not going to have the author. I always warned you, Mary
Randolph, not to let her have to do with authors, and that sort of people; but
you never take my advice till it’s too late.”
“She is not going to marry the author,” said Sir Tom, with another laugh;
and then he rose up, almost stumbling over the tea-table. “My dear ladies,”
he said, “who are so much interested in Lucy Trevor, the fact is that the
author never had the slightest chance. She is going to marry—me. And I
have come, Aunt Mary, if you please, to ask if you will kindly give your
consent? The other guardians have been good enough to approve of me,” he
added, making her a bow, “and I hope I may not owe my disappointment to
you.”
“The other guardians— Tom!” cried Lady Randolph, falling upon him
and seizing him with both hands, “is this true?”
Sir Tom kissed her hand with a grace which he was capable of when he
pleased, and drew it within his arm.
“I presume, then,” he said, as he led her away, “that I shall get your
consent too.”
Thus old Mr. Trevor’s will was fulfilled. It was not fulfilled in the way
he wished or thought of, but what then? He thought it would have kept his
daughter unmarried, whereas her mourning for him was not ended when she
became Lady Randolph—which she did very soon after the above scene, to
the apparent content of everybody. Even Philip Rainy looked upon the
arrangement with satisfaction. Taking Lucy’s fortune to redeem the great
Randolph estate, and to make his little cousin the first woman in the county,
was not like giving it “to another fellow;” which was the thing he had not
been able to contemplate with patience. The popular imagination, indeed,
was more struck with the elevation of little Lucy Trevor to be the mistress
of the Hall than with Sir Thomas’s good fortune in becoming the husband
of the greatest heiress in England. But when his settlements were signed,
both the guardians, Mr. Chervil and Mr. Rushton, took the bridegroom-elect
aside.
“We can not do anything for you about that giving-away clause,” Mr.
Chervil said, shaking his head.
“But Sir Thomas is not the man I take him for, if he don’t find means to
keep that in check,” said Mr. Rushton.
Sir Tom made no reply, and neither of these gentlemen could make out
what was meant by the humorous curves about his lips and the twinkle in
his eye.

THE END.
*** END OF THE PROJECT GUTENBERG EBOOK THE GREATEST
HEIRESS IN ENGLAND ***

Updated editions will replace the previous one—the old editions will
be renamed.

Creating the works from print editions not protected by U.S.
copyright law means that no one owns a United States copyright in
these works, so the Foundation (and you!) can copy and distribute it
in the United States without permission and without paying
copyright royalties. Special rules, set forth in the General Terms of
Use part of this license, apply to copying and distributing Project
Gutenberg™ electronic works to protect the PROJECT GUTENBERG™
concept and trademark. Project Gutenberg is a registered trademark,
and may not be used if you charge for an eBook, except by following
the terms of the trademark license, including paying royalties for use
of the Project Gutenberg trademark. If you do not charge anything
for copies of this eBook, complying with the trademark license is
very easy. You may use this eBook for nearly any purpose such as
creation of derivative works, reports, performances and research.
Project Gutenberg eBooks may be modified and printed and given
away—you may do practically ANYTHING in the United States with
eBooks not protected by U.S. copyright law. Redistribution is subject
to the trademark license, especially commercial redistribution.

START: FULL LICENSE


THE FULL PROJECT GUTENBERG LICENSE
PLEASE READ THIS BEFORE YOU DISTRIBUTE OR USE THIS WORK

To protect the Project Gutenberg™ mission of promoting the free
distribution of electronic works, by using or distributing this work (or
any other work associated in any way with the phrase “Project
Gutenberg”), you agree to comply with all the terms of the Full
Project Gutenberg™ License available with this file or online at
www.gutenberg.org/license.

Section 1. General Terms of Use and Redistributing Project Gutenberg™ electronic works
1.A. By reading or using any part of this Project Gutenberg™
electronic work, you indicate that you have read, understand, agree
to and accept all the terms of this license and intellectual property
(trademark/copyright) agreement. If you do not agree to abide by all
the terms of this agreement, you must cease using and return or
destroy all copies of Project Gutenberg™ electronic works in your
possession. If you paid a fee for obtaining a copy of or access to a
Project Gutenberg™ electronic work and you do not agree to be
bound by the terms of this agreement, you may obtain a refund
from the person or entity to whom you paid the fee as set forth in
paragraph 1.E.8.

1.B. “Project Gutenberg” is a registered trademark. It may only be
used on or associated in any way with an electronic work by people
who agree to be bound by the terms of this agreement. There are a
few things that you can do with most Project Gutenberg™ electronic
works even without complying with the full terms of this agreement.
See paragraph 1.C below. There are a lot of things you can do with
Project Gutenberg™ electronic works if you follow the terms of this
agreement and help preserve free future access to Project
Gutenberg™ electronic works. See paragraph 1.E below.
1.C. The Project Gutenberg Literary Archive Foundation (“the
Foundation” or PGLAF), owns a compilation copyright in the
collection of Project Gutenberg™ electronic works. Nearly all the
individual works in the collection are in the public domain in the
United States. If an individual work is unprotected by copyright law
in the United States and you are located in the United States, we do
not claim a right to prevent you from copying, distributing,
performing, displaying or creating derivative works based on the
work as long as all references to Project Gutenberg are removed. Of
course, we hope that you will support the Project Gutenberg™
mission of promoting free access to electronic works by freely
sharing Project Gutenberg™ works in compliance with the terms of
this agreement for keeping the Project Gutenberg™ name associated
with the work. You can easily comply with the terms of this
agreement by keeping this work in the same format with its attached
full Project Gutenberg™ License when you share it without charge
with others.

1.D. The copyright laws of the place where you are located also
govern what you can do with this work. Copyright laws in most
countries are in a constant state of change. If you are outside the
United States, check the laws of your country in addition to the
terms of this agreement before downloading, copying, displaying,
performing, distributing or creating derivative works based on this
work or any other Project Gutenberg™ work. The Foundation makes
no representations concerning the copyright status of any work in
any country other than the United States.

1.E. Unless you have removed all references to Project Gutenberg:

1.E.1. The following sentence, with active links to, or other
immediate access to, the full Project Gutenberg™ License must
appear prominently whenever any copy of a Project Gutenberg™
work (any work on which the phrase “Project Gutenberg” appears,
or with which the phrase “Project Gutenberg” is associated) is
accessed, displayed, performed, viewed, copied or distributed:
This eBook is for the use of anyone anywhere in the United
States and most other parts of the world at no cost and with
almost no restrictions whatsoever. You may copy it, give it away
or re-use it under the terms of the Project Gutenberg License
included with this eBook or online at www.gutenberg.org. If you
are not located in the United States, you will have to check the
laws of the country where you are located before using this
eBook.

1.E.2. If an individual Project Gutenberg™ electronic work is derived
from texts not protected by U.S. copyright law (does not contain a
notice indicating that it is posted with permission of the copyright
holder), the work can be copied and distributed to anyone in the
United States without paying any fees or charges. If you are
redistributing or providing access to a work with the phrase “Project
Gutenberg” associated with or appearing on the work, you must
comply either with the requirements of paragraphs 1.E.1 through
1.E.7 or obtain permission for the use of the work and the Project
Gutenberg™ trademark as set forth in paragraphs 1.E.8 or 1.E.9.

1.E.3. If an individual Project Gutenberg™ electronic work is posted
with the permission of the copyright holder, your use and distribution
must comply with both paragraphs 1.E.1 through 1.E.7 and any
additional terms imposed by the copyright holder. Additional terms
will be linked to the Project Gutenberg™ License for all works posted
with the permission of the copyright holder found at the beginning
of this work.

1.E.4. Do not unlink or detach or remove the full Project
Gutenberg™ License terms from this work, or any files containing a
part of this work or any other work associated with Project
Gutenberg™.

1.E.5. Do not copy, display, perform, distribute or redistribute this
electronic work, or any part of this electronic work, without
prominently displaying the sentence set forth in paragraph 1.E.1
with active links or immediate access to the full terms of the Project
Gutenberg™ License.

1.E.6. You may convert to and distribute this work in any binary,
compressed, marked up, nonproprietary or proprietary form,
including any word processing or hypertext form. However, if you
provide access to or distribute copies of a Project Gutenberg™ work
in a format other than “Plain Vanilla ASCII” or other format used in
the official version posted on the official Project Gutenberg™ website
(www.gutenberg.org), you must, at no additional cost, fee or
expense to the user, provide a copy, a means of exporting a copy, or
a means of obtaining a copy upon request, of the work in its original
“Plain Vanilla ASCII” or other form. Any alternate format must
include the full Project Gutenberg™ License as specified in
paragraph 1.E.1.

1.E.7. Do not charge a fee for access to, viewing, displaying,
performing, copying or distributing any Project Gutenberg™ works
unless you comply with paragraph 1.E.8 or 1.E.9.

1.E.8. You may charge a reasonable fee for copies of or providing
access to or distributing Project Gutenberg™ electronic works
provided that:

• You pay a royalty fee of 20% of the gross profits you derive
from the use of Project Gutenberg™ works calculated using the
method you already use to calculate your applicable taxes. The
fee is owed to the owner of the Project Gutenberg™ trademark,
but he has agreed to donate royalties under this paragraph to
the Project Gutenberg Literary Archive Foundation. Royalty
payments must be paid within 60 days following each date on
which you prepare (or are legally required to prepare) your
periodic tax returns. Royalty payments should be clearly marked
as such and sent to the Project Gutenberg Literary Archive
Foundation at the address specified in Section 4, “Information
about donations to the Project Gutenberg Literary Archive
Foundation.”

• You provide a full refund of any money paid by a user who
notifies you in writing (or by e-mail) within 30 days of receipt
that s/he does not agree to the terms of the full Project
Gutenberg™ License. You must require such a user to return or
destroy all copies of the works possessed in a physical medium
and discontinue all use of and all access to other copies of
Project Gutenberg™ works.

• You provide, in accordance with paragraph 1.F.3, a full refund of
any money paid for a work or a replacement copy, if a defect in
the electronic work is discovered and reported to you within 90
days of receipt of the work.

• You comply with all other terms of this agreement for free
distribution of Project Gutenberg™ works.

1.E.9. If you wish to charge a fee or distribute a Project Gutenberg™
electronic work or group of works on different terms than are set
forth in this agreement, you must obtain permission in writing from
the Project Gutenberg Literary Archive Foundation, the manager of
the Project Gutenberg™ trademark. Contact the Foundation as set
forth in Section 3 below.

1.F.

1.F.1. Project Gutenberg volunteers and employees expend
considerable effort to identify, do copyright research on, transcribe
and proofread works not protected by U.S. copyright law in creating
the Project Gutenberg™ collection. Despite these efforts, Project
Gutenberg™ electronic works, and the medium on which they may
be stored, may contain “Defects,” such as, but not limited to,
incomplete, inaccurate or corrupt data, transcription errors, a
copyright or other intellectual property infringement, a defective or
damaged disk or other medium, a computer virus, or computer
codes that damage or cannot be read by your equipment.

1.F.2. LIMITED WARRANTY, DISCLAIMER OF DAMAGES - Except for
the “Right of Replacement or Refund” described in paragraph 1.F.3,
the Project Gutenberg Literary Archive Foundation, the owner of the
Project Gutenberg™ trademark, and any other party distributing a
Project Gutenberg™ electronic work under this agreement, disclaim
all liability to you for damages, costs and expenses, including legal
fees. YOU AGREE THAT YOU HAVE NO REMEDIES FOR
NEGLIGENCE, STRICT LIABILITY, BREACH OF WARRANTY OR
BREACH OF CONTRACT EXCEPT THOSE PROVIDED IN PARAGRAPH
1.F.3. YOU AGREE THAT THE FOUNDATION, THE TRADEMARK
OWNER, AND ANY DISTRIBUTOR UNDER THIS AGREEMENT WILL
NOT BE LIABLE TO YOU FOR ACTUAL, DIRECT, INDIRECT,
CONSEQUENTIAL, PUNITIVE OR INCIDENTAL DAMAGES EVEN IF
YOU GIVE NOTICE OF THE POSSIBILITY OF SUCH DAMAGE.

1.F.3. LIMITED RIGHT OF REPLACEMENT OR REFUND - If you
discover a defect in this electronic work within 90 days of receiving
it, you can receive a refund of the money (if any) you paid for it by
sending a written explanation to the person you received the work
from. If you received the work on a physical medium, you must
return the medium with your written explanation. The person or
entity that provided you with the defective work may elect to provide
a replacement copy in lieu of a refund. If you received the work
electronically, the person or entity providing it to you may choose to
give you a second opportunity to receive the work electronically in
lieu of a refund. If the second copy is also defective, you may
demand a refund in writing without further opportunities to fix the
problem.

1.F.4. Except for the limited right of replacement or refund set forth
in paragraph 1.F.3, this work is provided to you ‘AS-IS’, WITH NO
OTHER WARRANTIES OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO WARRANTIES OF
MERCHANTABILITY OR FITNESS FOR ANY PURPOSE.

1.F.5. Some states do not allow disclaimers of certain implied
warranties or the exclusion or limitation of certain types of damages.
If any disclaimer or limitation set forth in this agreement violates the
law of the state applicable to this agreement, the agreement shall be
interpreted to make the maximum disclaimer or limitation permitted
by the applicable state law. The invalidity or unenforceability of any
provision of this agreement shall not void the remaining provisions.

1.F.6. INDEMNITY - You agree to indemnify and hold the Foundation,
the trademark owner, any agent or employee of the Foundation,
anyone providing copies of Project Gutenberg™ electronic works in
accordance with this agreement, and any volunteers associated with
the production, promotion and distribution of Project Gutenberg™
electronic works, harmless from all liability, costs and expenses,
including legal fees, that arise directly or indirectly from any of the
following which you do or cause to occur: (a) distribution of this or
any Project Gutenberg™ work, (b) alteration, modification, or
additions or deletions to any Project Gutenberg™ work, and (c) any
Defect you cause.

Section 2. Information about the Mission of Project Gutenberg™
Project Gutenberg™ is synonymous with the free distribution of
electronic works in formats readable by the widest variety of
computers including obsolete, old, middle-aged and new computers.
It exists because of the efforts of hundreds of volunteers and
donations from people in all walks of life.

Volunteers and financial support to provide volunteers with the
assistance they need are critical to reaching Project Gutenberg™’s
goals and ensuring that the Project Gutenberg™ collection will
remain freely available for generations to come. In 2001, the Project
Gutenberg Literary Archive Foundation was created to provide a
secure and permanent future for Project Gutenberg™ and future
generations. To learn more about the Project Gutenberg Literary
Archive Foundation and how your efforts and donations can help,
see Sections 3 and 4 and the Foundation information page at
www.gutenberg.org.

Section 3. Information about the Project Gutenberg Literary Archive Foundation
The Project Gutenberg Literary Archive Foundation is a non-profit
501(c)(3) educational corporation organized under the laws of the
state of Mississippi and granted tax exempt status by the Internal
Revenue Service. The Foundation’s EIN or federal tax identification
number is 64-6221541. Contributions to the Project Gutenberg
Literary Archive Foundation are tax deductible to the full extent
permitted by U.S. federal laws and your state’s laws.

The Foundation’s business office is located at 809 North 1500 West,
Salt Lake City, UT 84116, (801) 596-1887. Email contact links and up
to date contact information can be found at the Foundation’s website
and official page at www.gutenberg.org/contact

Section 4. Information about Donations to the Project Gutenberg Literary Archive Foundation
Project Gutenberg™ depends upon and cannot survive without
widespread public support and donations to carry out its mission of
increasing the number of public domain and licensed works that can
be freely distributed in machine-readable form accessible by the
widest array of equipment including outdated equipment. Many
small donations ($1 to $5,000) are particularly important to
maintaining tax exempt status with the IRS.

The Foundation is committed to complying with the laws regulating
charities and charitable donations in all 50 states of the United
States. Compliance requirements are not uniform and it takes a
considerable effort, much paperwork and many fees to meet and
keep up with these requirements. We do not solicit donations in
locations where we have not received written confirmation of
compliance. To SEND DONATIONS or determine the status of
compliance for any particular state visit www.gutenberg.org/donate.

While we cannot and do not solicit contributions from states where
we have not met the solicitation requirements, we know of no
prohibition against accepting unsolicited donations from donors in
such states who approach us with offers to donate.

International donations are gratefully accepted, but we cannot make
any statements concerning tax treatment of donations received from
outside the United States. U.S. laws alone swamp our small staff.

Please check the Project Gutenberg web pages for current donation
methods and addresses. Donations are accepted in a number of
other ways including checks, online payments and credit card
donations. To donate, please visit: www.gutenberg.org/donate.

Section 5. General Information About Project Gutenberg™ electronic works
Professor Michael S. Hart was the originator of the Project
Gutenberg™ concept of a library of electronic works that could be
freely shared with anyone. For forty years, he produced and
distributed Project Gutenberg™ eBooks with only a loose network of
volunteer support.
Project Gutenberg™ eBooks are often created from several printed
editions, all of which are confirmed as not protected by copyright in
the U.S. unless a copyright notice is included. Thus, we do not
necessarily keep eBooks in compliance with any particular paper
edition.

Most people start at our website which has the main PG search
facility: www.gutenberg.org.

This website includes information about Project Gutenberg™,
including how to make donations to the Project Gutenberg Literary
Archive Foundation, how to help produce our new eBooks, and how
to subscribe to our email newsletter to hear about new eBooks.