ISR Prints
Thus, precision and recall have been used extensively to evaluate the retrieval
performance of IR systems and algorithms. A more careful reflection, however, reveals
problems with these two measures. First, proper estimation of maximum recall for a
query requires detailed knowledge of all the documents in the collection. Second, in
many situations a single combined measure would be more appropriate. Third, recall and
precision measure effectiveness over a set of queries processed in batch mode.
Fourth, for systems that require a weak ordering of the retrieved documents, recall and
precision may be inadequate.
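Both measures reduce to simple set arithmetic: precision = |Ra| / |A| and recall = |Ra| / |Rq|, where Ra is the set of retrieved documents that are also relevant. A minimal Python sketch of this batch-mode computation (the sample sets below are illustrative, not drawn from any particular collection):

```python
def precision_recall(relevant, answers):
    """Precision and recall of an answer set A against the relevant set Rq."""
    ra = len(set(relevant) & set(answers))  # |Ra|: retrieved AND relevant
    precision = ra / len(answers)           # |Ra| / |A|
    recall = ra / len(relevant)             # |Ra| / |Rq|
    return precision, recall

# Illustrative sample: 10 documents relevant to q1, 10 documents retrieved
Rq1 = {3, 5, 9, 25, 39, 44, 56, 71, 89, 123}
A = [123, 84, 56, 6, 8, 9, 511, 129, 187, 25]
p, r = precision_recall(Rq1, A)
print(f"Precision: {p:.0%}  Recall: {r:.0%}")  # prints Precision: 40%  Recall: 40%
```

Here four of the ten retrieved documents (123, 56, 9, 25) are relevant, so both ratios are 4/10.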
B. Viva Questions:
Title : Implement a program to calculate precision and recall for sample input
(answer set A, query q1, documents relevant to query q1: Rq1).
Program:
#include <iostream>
#include <sstream>
#include <iomanip>
#include <string>
#include <set>
using namespace std;
string left(const string s, const int w)
{ // left-aligns input string in a table column of width w
    stringstream ss, spaces;
    int padding = w - s.size(); // count excess room to pad
    for (int i = 0; i < padding; ++i)
        spaces << " ";
    ss << s << spaces.str() << '|'; // format with padding
    return ss.str();
}
string center(const string s, const int w)
{ // center-aligns input string in a table column of width w
    stringstream ss, spaces;
    int padding = w - s.size(); // count excess room to pad
    for (int i = 0; i < padding / 2; ++i)
        spaces << " ";
    ss << spaces.str() << s << spaces.str(); // format with padding
    if (padding > 0 && padding % 2 != 0) // if odd #, add 1 space
        ss << " ";
    ss << '|';
    return ss.str();
}
int main()
{ // sample input: relevant set Rq1 and ranked answer set A
    set<int> Rq1 = {3, 5, 9, 25, 39, 44, 56, 71, 89, 123};
    int A[] = {123, 84, 56, 6, 8, 9, 511, 129, 187, 25};
    cout << '|' << center("Documents", 11) << center("|Ra|", 6)
         << center("|A|", 5) << center("Precision(%)", 13)
         << center("Recall(%)", 10) << endl;
    int Ra = 0; // relevant documents retrieved so far
    for (int i = 0; i < 10; ++i) {
        if (Rq1.count(A[i])) ++Ra;
        double precision = 100.0 * Ra / (i + 1);  // |Ra| / |A|
        double recall = 100.0 * Ra / Rq1.size();  // |Ra| / |Rq1|
        cout << '|' << left(to_string(A[i]), 11) << left(to_string(Ra), 6)
             << left(to_string(i + 1), 5) << fixed << setprecision(1)
             << setw(12) << precision << '|' << setw(9) << recall << '|' << endl;
    }
    return 0;
}
Output:
| Documents | |Ra| | |A| | Precision(%) | Recall(%) |
B. Viva Questions:
A. Importing an Image:
Importing an image in python is easy. Following code will help you import an image on
Python :
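One common way is the Pillow library (assumed to be installed via `pip install pillow`; the file name `sample.png` is a placeholder). The sketch below first creates a small image so it is self-contained; in practice you would open an existing file:

```python
from PIL import Image  # Pillow (assumed installed: pip install pillow)

# Create a small placeholder image so the example is self-contained;
# in practice you would open an existing file such as "photo.jpg".
Image.new("RGB", (64, 64), color=(200, 120, 50)).save("sample.png")

img = Image.open("sample.png")        # import the image
print(img.format, img.size, img.mode)
# img.show()  # uncomment to display the image in the default viewer
```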
3. This is done by gray-scaling. Here is how you convert an RGB image to
grayscale.
Output: the grayscale version of the input image.
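With Pillow the conversion is `img.convert("L")`, which reduces each pixel with the ITU-R 601 luma weights (L = R*299/1000 + G*587/1000 + B*114/1000). A dependency-free sketch of that formula on a tiny hand-made "image" of nested RGB tuples:

```python
def rgb_to_gray(r, g, b):
    """ITU-R 601 luma: the weighting Pillow's convert("L") also uses."""
    return int(0.299 * r + 0.587 * g + 0.114 * b)

# A tiny 1x2 "image" as nested RGB tuples: pure red, then white
pixels = [[(255, 0, 0), (255, 255, 255)]]
gray = [[rgb_to_gray(*px) for px in row] for row in pixels]
print(gray)  # red maps to a dark gray, white stays near 255
```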
Title : Build a web crawler to pull product information and links from an e-commerce website. (Java)
Program:
import java.net.*;
import java.io.*;
public class Crawler{
    // Replacement for the missing Web.getWeb helper: fetches a page's HTML
    static String getWeb(String address) throws Exception{
        StringBuilder sb = new StringBuilder();
        BufferedReader br = new BufferedReader(
                new InputStreamReader(new URL(address).openStream()));
        String line;
        while((line = br.readLine()) != null)
            sb.append(line).append('\n');
        br.close();
        return sb.toString();
    }
    public static void main(String[] args) throws Exception{
        final int MAX = 1000;
        String urls[] = new String[MAX];
        String url = "https://www.cricbuzz.com/live-cricket-scores/20307/aus-vs-ind-3rd-odi-india-tour-of-australia-2018-19";
        int j = 0, total = 0, start = 0, end = 0;
        String webpage = getWeb(url);
        end = webpage.indexOf("<body"); // fetch urls out of body of webpage only
        for(int i = 0; i < MAX; i++, total++){
            start = webpage.indexOf("http://", end);
            if(start == -1){ // no more links here: crawl the next stored url
                start = 0;
                end = 0;
                try{
                    webpage = getWeb(urls[j++]);
                }catch(Exception e){
                    System.out.println("******************");
                    System.out.println(urls[j-1]);
                    System.out.println("Exception caught \n" + e);
                    continue;
                }
                end = webpage.indexOf("<body");
                if(end == -1)
                    end = 0; // no <body> tag: scan the whole page
                start = webpage.indexOf("http://", end);
                if(start == -1)
                    break; // nothing left to crawl
            }
            end = webpage.indexOf("\"", start); // a link ends at its closing quote
            if(end == -1) break;
            urls[total] = webpage.substring(start, end);
            System.out.println(urls[total]); // print each extracted link
        }
    }
}
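The same link-extraction idea can be sketched in a few lines of Python. The HTML below is inlined so the sketch runs without network access (the `shop.example` URLs are illustrative); in a real crawler the page would come from `urllib.request.urlopen(url).read()`:

```python
import re

def extract_links(html):
    """Pull absolute links out of the <body> of a page, like the loop above."""
    body = html[html.find("<body"):]  # fetch urls out of the body only
    return re.findall(r'href="(https?://[^"]+)"', body)

# Inlined sample page (stand-in for a fetched e-commerce product listing)
page = '''<html><head><a href="http://skip.example/head"></a></head>
<body>
  <a href="https://shop.example/product/1">Widget</a>
  <a href="http://shop.example/product/2">Gadget</a>
</body></html>'''

for link in extract_links(page):
    print(link)
```

Slicing at `<body` mirrors the Java program's "body only" rule, so the link in the `<head>` is skipped.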