
Experiment-07: Import JSON and CSV

This document outlines the process of handling datasets in MongoDB, including importing JSON and CSV files and executing queries. It provides step-by-step instructions for setting up MongoDB tools, importing sample datasets related to a bookstore and student attendance, and verifying the import through command line operations. Additionally, it emphasizes the importance of performing CRUD operations and aggregations on the imported documents.


EXPERIMENT -- 7

7. Demonstrate how to handle datasets in MongoDB: import various collections and apply queries to obtain the specified output.

Tools: First, download the MongoDB Database Tools (a ZIP file) from the MongoDB website. Extract the ZIP file; it becomes a folder. Copy the folder path up to its bin subfolder, then go to Environment Variables, select PATH, insert the copied folder path, and click OK.

Necessary Tools: The MongoDB Server and MongoDB Shell (mongosh) should already be installed, and their paths should likewise be set in the Environment Variables.

Command Line: After setting the PATH, go to the command line and type mongoimport --version to confirm the tool is available.

Objectives:
The objective is to understand and perform, hands-on, the process of handling datasets in MongoDB.
You will learn how to import collections into a database and apply queries to retrieve
specific outputs.
For this demonstration, we use a sample bookstore dataset (JSON) and a student
attendance dataset (CSV).

Prerequisites:
MongoDB installed and running.
MongoDB Server, MongoDB Shell (mongosh), and MongoDB Database Tools on the PATH.
Sample datasets (JSON or CSV files) ready for import.
Step 1: Importing datasets
Import the sample collections using the mongoimport command. For example, if you have a JSON file
named bookstore.json, use:

mongoimport --db books --collection books --file bookstore.json


Importing a JSON file.

Where,
--type csv : indicates the type of the file being imported (CSV).
--headerline : indicates that the field names appear in the first row of the CSV file.
--db : the name of the database into which the file will be imported.
--collection : the name of the collection that receives the data.
--file : the path to the file that you want to import.
(The --type csv and --headerline options apply only to CSV imports; for JSON files, as in the command above, they are omitted.)

Prepare your own JSON file (or download one) and keep it on the Desktop or any
other location in the system.
 Copy the path (the JSON file's location).
 Type the above mongoimport command with this path as the --file argument.

Now open another terminal and check whether these JSON documents were imported or not.

Loading a JSON file from the Command Prompt (bookstore.json)

The sample JSON file must be kept on the Desktop with the file name bookstore.json:

{
"Book no": 1,
"Book name": "Network Security",
"Price": 300,
"Book Author": "Chris Maney",
"Publisher": "TATA MCGRAW HILL"
}

{
"Book no": 123,
"Book name": "Computer Network Security",
"Price": 700,
"Book Author": "Chris Maney",
"Publisher": "Pearson"
}

{
"Book no": 1,
"Book name": "Network Security",
"Price": 300,
"Book Author": "Chris Maney",
"Publisher": "TATA MCGRAW HILL"
}
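Note that mongoimport requires each document to be strict JSON: double-quoted keys and strings, no smart quotes. Hand-typed files often fail on this. As a quick sanity check (a sketch in plain JavaScript, not part of the original procedure), each document can be round-tripped through JSON.parse, which throws on anything invalid:

```javascript
// Quick validity check before running mongoimport: every document in the
// file must parse as strict JSON (double-quoted keys; no smart quotes).
// These two lines mirror documents from the bookstore.json sample above.
const lines = [
  '{"Book no": 1, "Book name": "Network Security", "Price": 300, "Book Author": "Chris Maney", "Publisher": "TATA MCGRAW HILL"}',
  '{"Book no": 123, "Book name": "Computer Network Security", "Price": 700, "Book Author": "Chris Maney", "Publisher": "Pearson"}',
];
// JSON.parse throws a SyntaxError if any line is not strict JSON.
const docs = lines.map((line) => JSON.parse(line));
console.log(docs.length, docs[1].Publisher); // 2 Pearson
```

If any line uses curly quotes or unquoted keys (as hand-typed samples often do), the parse throws and the file should be corrected before importing.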
C:\Users\acer>cd desktop

C:\Users\acer\Desktop>mongoimport --db test --collection books --file bookstore.json
2024-02-27T10:56:27.864+0530 connected to: localhost
2024-02-27T10:56:28.304+0530 [########################] test.books 170B/170B (100.0%)
2024-02-27T10:56:28.703+0530 [########################] test.books 170B/170B (100.0%)
2024-02-27T10:56:28.703+0530 imported 3 documents

Loading another JSON file from the Command Prompt (students.json)

The sample JSON file must be kept on the Desktop with the file name students.json:

{
"sno": "434",
"sname": "NagaRaju",
"sage": "21",
"sarea" : "Godavari",
"saddress": "BVRM"
}

{
"sno":"112",
"sname":"KSN Raju",
"sage":"32",
"saddress":"HYD"
}

{
"sno":12,
"sname":"BSNRaju",
"sarea":"VSR Colony",
"sage":50,
"saddress":"Guntur"
}
C:\Users\acer\Desktop>mongoimport --db test --collection students --file students.json
2024-02-27T11:16:15.670+0530 connected to: localhost
2024-02-27T11:16:15.698+0530 imported 3 documents

We can also give the file path directly as an argument. If a collection with the same
name already exists, we can replace it with the new data using --drop (add --jsonArray
if the file contains a single JSON array):

C:\> mongoimport "C:\Users\acer\Desktop\students.json" -d test -c students --jsonArray --drop
2024-02-27T11:37:28.958+0530 connected to: localhost
2024-02-27T11:37:28.961+0530 dropping: test.students
2024-02-27T11:37:29.917+0530 [########################] test.students 279B/279B (100.0%)
2024-02-27T11:37:31.423+0530 [########################] test.students 279B/279B (100.0%)
2024-02-27T11:37:31.423+0530 imported 3 documents
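The --jsonArray flag matters because mongoimport accepts two file shapes: by default, one document per line (newline-delimited JSON), and with --jsonArray, the whole file as one JSON array. A small sketch (illustration only, not from the original procedure) of both shapes for the student documents above:

```javascript
// The same three students, written in the two shapes mongoimport accepts.
const students = [
  { sno: "434", sname: "NagaRaju", sage: "21", sarea: "Godavari", saddress: "BVRM" },
  { sno: "112", sname: "KSN Raju", sage: "32", saddress: "HYD" },
  { sno: 12, sname: "BSNRaju", sarea: "VSR Colony", sage: 50, saddress: "Guntur" },
];

// Default shape: newline-delimited JSON, one document per line.
const ndjson = students.map((s) => JSON.stringify(s)).join("\n");

// Shape for --jsonArray: the entire file is a single JSON array.
const asArray = JSON.stringify(students);

console.log(ndjson.split("\n").length); // 3 (one line per document)
console.log(Array.isArray(JSON.parse(asArray))); // true
```

Passing --jsonArray for a newline-delimited file (or omitting it for an array file) is a common cause of "Failed: error reading separator" style import failures.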

//// Go to the test DB; in the students collection we display the 3 documents using find( ).
[{
"sno": "434",
"sname": "NagaRaju",
"sage": "21",
"sarea" : "Godavari",
"saddress": "BVRM"
},
{
"sno":"112",
"sname":"KSN Raju",
"sage":"32",
"saddress":"HYD"
},
{
"sno":12,
"sname":"BSNRaju",
"sarea":"VSR Colony",
"sage":50,
"saddress":"Guntur"
}]

Sample CSV File:

By collecting the student attendance data, we create one CSV file.

SNo  Roll No     Student Name                    TOTAL Classes 351   %
1    21B01A1218  BAVIREDDY KALYANI               142                 40.46
2    21B01A1296  KURAPATI D P R HARSHINI         220                 62.78
3    21B01A1286  KORAM SIRI                      224                 63.82
4    21B01A12I4  VALLABHAPURAPU PAVANI           227                 65.23
5    21B01A1276  KAZA MANASA                     233                 66.38
6    21B01A12D6  PASUMARTHI PRANEETHA            241                 69.25
7    21B01A12F3  RAMAYANAM ABHIGNA               245                 70.4
8    21B01A1236  CHENNAMSETTI JYOTHI SRI PADMA   250                 71.23
9    21B01A12E2  PETTA RESHMA                    248                 71.26
10   21B01A12J1  VENTRAPATI SINDHU               248                 71.26
11   21B01A12D8  PATHIKAYALA JYOTHSNA PRIYANKA   252                 72.41
12   21B01A12I0  TOPELLA SRI LALITHA             252                 72.41
13   21B01A1213  BALA PARINITHA SAI              255                 72.65
14   21B01A1252  GEDDAM NISSI OLIVE              256                 72.93
15   21B01A12I9  VEMURI SRAVANI                  254                 72.99
16   21B01A12A9  MATHI GAYATHRI                  257                 73.22
17   21B01A12D5  PARISALA SUPRIYA                256                 73.56
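The % column appears to be classes attended divided by the total of 351, rounded to two decimals. This is an inference from the first row only (a few later rows suggest some students may have different totals), so treat the formula below as an illustrative sketch rather than the definitive calculation behind the sheet:

```javascript
// Recompute the attendance percentage for the first table row, assuming
// "%" = (classes attended / total classes of 351) * 100, rounded to 2 dp.
// Assumption for illustration only: per-student totals may differ in the
// real sheet, so this checks just the first row.
const totalClasses = 351;
const attended = 142; // row 1: BAVIREDDY KALYANI
const percent = Math.round((attended / totalClasses) * 10000) / 100;
console.log(percent); // 40.46, matching the table's first row
```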

Loading a CSV file from the Command Prompt (attendance1.csv)

The attendance1.csv file contains 63 rows of student data.

Command:
mongoimport --type csv --headerline --db DBNAME --collection CollectionName --file [Path of the CSV File]

C:\Users\acer\Desktop>mongoimport --type csv --headerline --db ravi --collection mycoll --file attendance1.csv

2024-02-27T13:17:12.212+0530 connected to: localhost
2024-02-27T13:17:12.396+0530 imported 63 documents

Loading the CSV file into the mongod server from the command line.

After importing the 63 documents into mongod from cmd, we connect to the
server using mongosh.

We switch to the ravi database; all 63 documents have been imported into the mycoll
collection, so we can display them all using the find( ) command.

C:\Users\acer\Desktop>mongod
C:\Users\acer\Desktop>mongosh
test> use ravi
switched to db ravi
ravi> show collections
mycoll
students
ravi> db.mycoll.find( ).count( )
63
[{
_id: ObjectId("65dd9380ddadd5e6cfdcdff4"),
SNo: 2,
'Roll No': '21B01A1296',
'Student Name': 'KURAPATI D P R HARSHINI',
'TOTAL Classes 351': 220,
'%': 62.78,
},

{
_id: ObjectId("65dd9380ddadd5e6cfdcdff5"),
SNo: 5,
'Roll No': '21B01A1276',
'Student Name': 'KAZA MANASA',
'TOTAL Classes 351': 233,
'%': 66.38,
},

{
_id: ObjectId("65dd9380ddadd5e6cfdcdff6"),
SNo: 4,
'Roll No': '21B01A12I4',
'Student Name': 'VALLABHAPURAPU PAVANI',
'TOTAL Classes 351': 227,
'%': 65.23,
},

{
_id: ObjectId("65dd9380ddadd5e6cfdcdff7"),
SNo: 3,
'Roll No': '21B01A1286',
'Student Name': 'KORAM SIRI',
'TOTAL Classes 351': 224,
'%': 63.82,
},

{
_id: ObjectId("65dd9380ddadd5e6cfdcdff8"),
SNo: 1,
'Roll No': '21B01A1218',
'Student Name': 'BAVIREDDY KALYANI',
'TOTAL Classes 351': 142,
'%': 40.46,
},

{
_id: ObjectId("65dd9380ddadd5e6cfdcdff9"),
SNo: 6,
'Roll No': '21B01A12D6',
'Student Name': 'PASUMARTHI PRANEETHA',
'TOTAL Classes 351': 241,
'%': 69.25,
}

"Once the dataset is loaded into the database, perform various operations on the
documents, such as CRUD, aggregations, and single-purpose aggregation
operations."
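As a concrete starting point, here are a few mongosh-style queries over the imported attendance collection, shown as comments, with the same matching logic mimicked over a small in-memory sample in plain JavaScript (the sample rows and thresholds are illustrative; field names follow the imported CSV headers):

```javascript
// mongosh equivalents, to be run after `use ravi`:
//   db.mycoll.find({ "%": { $gt: 70 } })            // students above 70%
//   db.mycoll.countDocuments({ "%": { $lt: 65 } })  // count below 65%
// The same matching logic, mimicked over a tiny in-memory sample:
const sample = [
  { SNo: 1, "Roll No": "21B01A1218", "%": 40.46 },
  { SNo: 2, "Roll No": "21B01A1296", "%": 62.78 },
  { SNo: 7, "Roll No": "21B01A12F3", "%": 70.4 },
];
const above70 = sample.filter((d) => d["%"] > 70);          // like $gt: 70
const below65 = sample.filter((d) => d["%"] < 65).length;   // like $lt: 65
console.log(above70.length, below65); // 1 2
```

Running the commented queries in mongosh against the full 63-document mycoll collection is the intended exercise; the in-memory version only illustrates which documents each filter would match.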
