Experiment 6.2

The document outlines a process for performing sentiment analysis using Long Short-Term Memory (LSTM) networks. It includes steps for data preparation, model definition, training, and evaluation, utilizing libraries such as NLTK and Pandas. The final model demonstrates good performance on the validation set, with visualizations of accuracy and loss over training epochs.


Sentiment Analysis using LSTM

• Import the necessary libraries

import nltk
nltk.download("stopwords")

[nltk_data] Downloading package stopwords to /root/nltk_data...
[nltk_data] Package stopwords is already up-to-date!

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from keras.models import Model
from keras.layers import LSTM, Activation, Dense, Dropout, Input, Embedding
from keras.preprocessing.text import Tokenizer
from keras.preprocessing import sequence
from keras.callbacks import EarlyStopping
from keras.preprocessing.sequence import pad_sequences
from nltk.corpus import stopwords

• Load the data into a Pandas dataframe

df = pd.read_csv('/content/IMDB Dataset.csv', delimiter=',', encoding='latin-1')
df.head()
                                              review sentiment
0  One of the other reviewers has mentioned that ...  positive
1  A wonderful little production. <br /><br />The...  positive
2  I thought this was a wonderful way to spend ti...  positive
3  Basically there's a family where a little boy ...  negative
import re

s_word = stopwords.words("english")

def clean(x):
    # Strip everything except word characters and digits, then drop English stopwords
    x = re.sub(r"[^\w\d]", " ", x)
    x = " ".join([y for y in x.split() if y not in s_word])
    return x

df["text"] = df["review"].apply(lambda x: clean(x))

df.head()

[df.head() output: the original reviews alongside the cleaned "text" column, e.g. "wonderful little production br br filming tech..."]

.. Understanding the Distribution

sns.countplot(df.sentiment)
plt.xlabel("sentiment")
plt.title("Number of ham and spam messages")

/usr/local/lib/python3.7/dist-packages/seaborn/_decorators.py ... FutureWarning
Text(0.5, 1.0, 'Number of ham and spam messages')

[Countplot of the sentiment class distribution]

• Create input and output vectors
• Process the labels.

le = LabelEncoder()
Y = le.fit_transform(df['sentiment'])
Y = Y.reshape(-1, 1)
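LabelEncoder assigns integer codes in alphabetical order of the class names; a quick check of the resulting mapping (a minimal sketch, the comment reflects what that ordering implies rather than captured output):

# negative -> 0, positive -> 1 (alphabetical order of the class names)
print(list(le.classes_))
print(Y[:5].ravel())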

Split into training and test data.

X_train, X_test, Y_train, Y_test = train_test_split(df['review'], Y, test_size=0.15)

.. Process the data


• Tokenize the data and convert the text to sequences.
• Add padding to ensure that all the sequences have the same shape.

Max_words = 1000
Max_len = 150
tok = Tokenizer(num_words=Max_words)
tok.fit_on_texts(X_train)
sequences = tok.texts_to_sequences(X_train)
sequences_matrix = pad_sequences(sequences, maxlen=Max_len)
print(tok)

<keras.preprocessing.text.Tokenizer object at 0x7f...>
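To see what the tokenizer actually produced, a small inspection sketch (illustrative only; the exact indices depend on the training split):

print(len(tok.word_index))      # number of distinct tokens seen during fit_on_texts
print(sequences[0][:10])        # first ten word indices of the first training review
print(sequences_matrix.shape)   # (number of training reviews, Max_len)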

.. RNN
Define the RNN structure

def RNN():
    inputs = Input(name='inputs', shape=[Max_len])
    layer = Embedding(Max_words, 50, input_length=Max_len)(inputs)  # 50-dimensional word embeddings
    layer = LSTM(64)(layer)                                         # single LSTM layer with 64 units
    layer = Dense(256, name='FC1')(layer)
    layer = Activation('relu')(layer)
    layer = Dropout(0.5)(layer)
    layer = Dense(1, name='out_layer')(layer)
    layer = Activation('sigmoid')(layer)                            # binary sentiment output
    model = Model(inputs=inputs, outputs=layer)
    return model
Call the function and compile the model

model = RNN()
model.summary()

[model.summary() output: Input -> Embedding -> LSTM -> Dense/ReLU -> Dropout -> Dense(1)/sigmoid; out_layer (Dense) and the final Activation have output shape (None, 1); total params: 96,337]
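The compile call itself is not visible in the capture; a minimal sketch of how such a model is typically compiled, assuming binary cross-entropy loss and the RMSprop optimizer (both are assumptions, not confirmed by the document):

from keras.optimizers import RMSprop  # assumption: the optimizer used is not shown

# Sigmoid output pairs with binary cross-entropy for the two-class sentiment task
model.compile(loss='binary_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])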

Fit on the training data

LPlot = model.fit(sequences_matrix, Y_train, batch_size=128, epochs=10,
                  validation_split=0.2)

Epoch 1/10
266/266 [==============================] - 53s 299ms/step - accuracy: 0.7853 - val_loss: 0.3462 - val_accuracy: ...
...
Epoch 10/10
266/266 [==============================] - 77s 291ms/step - accuracy: 0.8956 - val_loss: 0.3277 - val_accuracy: ...

[Training accuracy climbs from about 0.79 to 0.90 over the ten epochs, while validation loss stays around 0.31-0.33.]

The model performs well on the validation set, and this configuration is chosen as the final model.
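EarlyStopping is imported at the top of the notebook but not attached to the fit call above; a hedged sketch of how it could be wired in, assuming validation loss is the monitored quantity (the patience and min_delta values are illustrative):

# Stop training once val_loss stops improving (illustrative settings)
early_stop = EarlyStopping(monitor='val_loss', min_delta=0.0001, patience=2)

LPlot = model.fit(sequences_matrix, Y_train,
                  batch_size=128, epochs=10,
                  validation_split=0.2,
                  callbacks=[early_stop])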

# summarize history for accuracy

plt.plot(LPlot.history['accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()

# summarize history for loss

plt.plot(LPlot.history['loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
test_sequences = tok.texts_to_sequences(X_test)
test_sequences_matrix = pad_sequences(test_sequences, maxlen=Max_len)

accr = model.evaluate(test_sequences_matrix, Y_test)

235/235 [==============================] - 6s - loss: 0.3213 - accuracy: 0.8672

print("Test set\n  Loss: {:0.3f}\n  Accuracy: {:0.3f}".format(accr[0], accr[1]))

Test set
  Loss: 0.321
  Accuracy: 0.867
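To score a new, unseen review, the text has to go through the same tokenizer and padding that were fitted on the training data; a minimal sketch (the review string and the 0.5 threshold are illustrative):

# Hypothetical new review, reusing the fitted Tokenizer and the same Max_len padding
new_review = ["A wonderful little production with great acting and a touching story"]

new_seq = tok.texts_to_sequences(new_review)
new_matrix = pad_sequences(new_seq, maxlen=Max_len)

# Sigmoid output near 1 -> positive, near 0 -> negative (per the LabelEncoder class order)
prob = model.predict(new_matrix)[0][0]
print("positive" if prob >= 0.5 else "negative", prob)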
