
The Usability Analysis of an Online Learning Site for Supporting a Computer Programming Course Using the System Usability Scale (SUS) in a University
https://doi.org/10.3991/ijim.v14i09.13123

Derisma
Universitas Andalas, Padang, Indonesia
[email protected]

Abstract—Computer programming is one of the subjects most feared by students because it is hard to learn. At present, various learning media for programming are available; as a result, a learning process that used to be monotonous and challenging has become engaging and easy to follow. One of those media is the website. Usability is a crucial aspect for users: it determines how easily they can work with the site, acquire the required knowledge, and stay interested in relying on the site in their learning process. This research analyzes the success of codesaya.com as an online learning site by examining whether it meets usability criteria, using the System Usability Scale (SUS) as the research instrument. SUS is an effective and reliable instrument for measuring the usability of various products and services. The research questionnaire consisted of 10 questions disseminated to 162 respondents, divided into two groups: familiar users (81 respondents) and unfamiliar users (81 respondents). The usability of the CodeSaya site scored 72.1 for familiar users and 70 for unfamiliar users. The score given by familiar respondents was only slightly higher than that given by unfamiliar respondents. In letter grades, the rating is B, or Good. It can be concluded that this computer programming learning site meets the usability criteria and can therefore be implemented as a learning medium that is feasible for users. Based on these results, web-based learning of computer programming can be applied as an alternative pedagogic approach and teaching technique suited to the psychological needs of students in adolescence, allowing them to collaborate and stimulating creative and analytical thinking in problem solving.

Keywords—Computer programming, System Usability Scale (SUS), Website, Usability, CodeSaya.com

1 Introduction

The teaching and learning of programming courses are often considered a difficult topic for students and teachers due to their complexity and abstractness. In the last 30 years, the scientific community has not given up the search for new pedagogic approaches and teaching techniques for introductory computer programming courses [1]. Coding is a part of logical thinking and is one of the necessary skills known as 21st-century skills. However, computer programming is hard to learn, and programming courses have often resulted in high drop-out rates [2]. One of the main factors behind the difficulties faced in learning programming has been associated with the traditional approach to teaching the basics of programming, which is unable to provide students with an attractive and simulation-rich environment in which problems and concepts become the subject of investigation in a pleasurable and creative manner [3].
Studies of the teaching and learning of programming in various countries and educational contexts have revealed that novice programmers face the same challenges in writing, debugging, and running programs efficiently. These difficulties have led those involved in the teaching of programming to keep looking for the most effective methods to help novice programmers learn the basic concepts of programming. Visual programming environments that support program development through a drag-and-drop interface are among the most popular coding tools for teaching novice programmers. Papadakis [4] investigated the use of Scratch and App Inventor and hoped that "his study will become useful guidance in the hands of every middle-school teacher who teaches or designs an introductory programming course in an effort to attract more students into programming [2]". App Inventor and Scratch are block-based programming environments designed explicitly with novices in mind. A study found that the Scratch programming environment helps novice programmers not only with syntax issues but also with the design and development of comprehensive systems and algorithms [5]. Scratch offers easy modification of programming constructs (i.e., variables, loops, control statements, functions, etc.), which makes it a good candidate for integration in courses on smart robots such as Bee-Bot and KIBO or internet-connected smart toys such as Sphero [6]. Papadakis [3] taught basic programming concepts to novice programmers using methodologies that combine various technologies, including Twitter, the Python programming language, and the Arduino board. Because the case study involved only five students as part of a school activity program, its results are difficult to generalize; nevertheless, the evaluation of the implementation through semi-structured interviews showed pleasing results. Through their involvement, students understood the basic concepts of programming and technology when engaged in practical work within an authentic interdisciplinary environment [18, 19].
In Indonesia, one of the online learning media that has been developed and introduced as a reference for students and teachers is the CodeSaya.com portal. The CodeSaya.com online learning site provides materials related to the basic concepts of programming languages, presented online so that users can access the site quickly. The site also provides a platform for users to practice the program instructions discussed in each sequence and gives clear error messages for the user's program; this is one of the facilities that users of this learning medium can enjoy. Through this medium, the learning process does not have to take place face-to-face with a teacher in a specific place: users only need a smartphone or computer to be able to understand the learning materials. The most common mistakes are syntax errors, associated with errors in writing the program code, and algorithmic errors, associated with the steps for completing the program. When the syntax is not understood, syntax errors cannot be corrected; this condition is visible in programs that still contain unresolved issues.
Learning media are inseparable from the usability factor, both in their creation and in their utilization. An international standard defines usability: ISO 9241-11 describes it as "the extent to which a product can be used by specific users to achieve specific goals with effectiveness, efficiency, and satisfaction in a specified context of use." This definition connects three criteria (effectiveness, efficiency, and satisfaction) [13].
The System Usability Scale (SUS) is a questionnaire used to measure perceived usability. This instrument was created by John Brooke in 1986 and was previously used to test electronic office systems. Despite its simplicity, SUS has become a widely used questionnaire for assessing experienced usability [7] [8] [9]. The studies of Sauro [8] and Tullis [10] showed that the System Usability Scale (SUS) is a valid and reliable usability testing instrument. Tullis and Stetson [10] measured the usability of two websites using five different surveys (SUS, QUIS, CSUQ, a variant of Microsoft's Product Reaction Cards, and one that had been used in their Usability Lab for several years) and found that SUS provides the most reliable results across various sample sizes. It is interesting to note that SUS, the simplest questionnaire to learn (with only ten measurement scales), produces reliable results at all sample sizes. It is also interesting that SUS is the only questionnaire among those studied in which all questions address every aspect of the user's reaction toward the website comprehensively. Some characteristics of SUS attract users. First, it consists of only ten questions; therefore, it is relatively fast and easy for study participants to complete and for administrators to score. Second, it is nonproprietary, so it is cost-effective to use and can be scored quickly, soon after it is completed. Third, SUS is technology-agnostic, which means that it can be used by a large group of usability practitioners to evaluate almost every type of user interface, including websites, cell phones, Interactive Voice Response (IVR) systems (both touch-tone and speech), TV applications, and many others. Finally, the survey result is a single score, ranging from 0 to 100, which is relatively easy to understand for people from other disciplines who work on a project team.
Usability is a quality attribute that assesses how easy a user interface is to use. Usability also refers to methods for improving ease of use during the design process. Users will abandon a website with poor usability. A study on usability was done by Mostakhdemin-Hosseini [15], who presented usability considerations for mobile-learning applications; usability and pedagogic factors play a crucial role in m-learning. Dirin [16] developed an m-learning application in which the proposed mLUX framework was implemented. The main goal of the proposed mLUX framework was to ensure that the stakeholders, especially students, regarded the m-learning application as a learning medium that fulfilled the critical needs of their education. That paper also argued that emotional factors, such as user pleasure, adaptability, and reliability, are significant design concerns in learning. Hussain et al. [17] reported users' perceptions regarding the usability of the oBike mobile application, a global bicycle-sharing platform. The study results revealed that most of the participants found that the oBike mobile application should be improved to enhance their satisfaction, and some recommendations were suggested to ensure the improvement of the application once implemented.
As an ICT-based medium, the CodeSaya.com portal has some advantages for supporting teaching and learning activities, even though sufficient studies are required to substantiate this assumption. The reason is that, no matter how good the courses on the CodeSaya.com portal are, an assessment would be less objective if performed only internally by the administrator. How does the web usability variable perform from the respondents' perspective? The author therefore proposes the use of SUS to test the usability of the CodeSaya.com website. The results of this study are expected to become a benchmark and recommendation for the future development of the CodeSaya online learning site, so that it becomes a more effective, efficient, and reliable learning medium.

2 Literature Review

According to Nielsen [7], "Usability is a quality attribute that assesses how easy user interfaces are to use. The word 'usability' also refers to methods for improving ease-of-use during the design process." Five quality components define usability:

1. Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
2. Efficiency: Once users have learned the design, how quickly can they perform tasks?
3. Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
4. Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?
5. Satisfaction: How pleasant is it to use the design?

The System Usability Scale (SUS) is a questionnaire that can be used to measure the usability of a computer system from the subjective point of view of its users. John Brooke developed SUS in 1986. To the present, SUS is widely used to measure usability and has shown various benefits, including [12]:

1. SUS can be used easily because the results are in the score range of 0-100.
2. SUS is very easy to use; it does not require complicated calculation.
3. SUS is free; it does not require additional cost.
4. SUS is proven to be valid and reliable, even with a small sample size.

SUS is a questionnaire consisting of 10 question items, as shown in Table 1. The SUS questionnaire uses a five-point Likert scale. Respondents are asked to give ratings, i.e., "Strongly Disagree," "Disagree," "Neutral," "Agree," and "Strongly Agree," over the 10 SUS question items according to their subjective judgment. If respondents feel they cannot find the right response, they should mark the midpoint of the scale. Each question item has a contribution score ranging from 0 to 4. For items 1, 3, 5, 7, and 9, the contribution score is the position on the scale minus 1. For items 2, 4, 6, 8, and 10, the contribution score is 5 minus the position on the scale. The total contribution score is multiplied by 2.5 to obtain the overall system usability value, so SUS scores range from 0 to 100 (the calculation formula is given in Section 3.5). The overall SUS score is obtained as the mean of the individual SUS scores. The SUS questionnaire was distributed via a Google Forms link to students in the Computer Systems Department.

Table 1. SUS Questionnaire
(each item is rated on a scale from 1 = Disagree to 5 = Agree)

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.

3 Methodology

The research flow started with questionnaire design, followed by data collection and data processing. The questionnaire used concerned the usability rate of codesaya.com as a learning medium for programming.


Fig. 1. Research Flow

3.1 Research object

The object of this research was the usability of an online learning site for the Python programming language, namely CodeSaya. In this research, quantitative data were collected by performing usability testing consisting of several scenarios to measure the effectiveness, efficiency, and error criteria. Qualitative data were collected using the questionnaire to rate the satisfaction criterion.

Fig. 2. The Page Display of Codesaya.com Website


3.2 Research respondents

This study tested 162 people who used the CodeSaya learning site to learn a programming language (in this context, Python). Respondents were divided into two groups: familiar users (two months of usage; the site was used in the Algorithm and Computer Programming course to learn Python) and unfamiliar users (two weeks of usage). The respondents were active students in the Computer Systems Department of the Faculty of Information Technology at Universitas Andalas. While filling in the questionnaire, respondents were asked to provide identifying information such as name, age, gender, and signature.

3.3 Research procedures


In this study, the performance measurement technique was used to measure effectiveness, efficiency, and errors. The satisfaction criterion was measured through the questionnaire. The tasks given on the tested site were as follows:

1. Register and Login: Respondents completed a registration process with a predetermined username, email, and password until they succeeded. Respondents then logged in by entering the registered data until they reached the Home menu.
2. Selecting the Material: Respondents selected the learning materials; in this case, respondents used the site to learn the Python programming language and acted as new users.
3. Learning: Respondents started the learning process using the site. The site provides materials such as basic concepts and their application in programs, as well as a discussion forum that users can turn to should obstacles occur.

3.4 Questionnaire designing


After a general study of web usability was done, the next step was determining how to measure web usability. In general, a website is considered usable if users can obtain and discover the things they require from it. According to Nielsen's definition, there are five measurement criteria that determine whether a website meets particular usability criteria; thus, a benchmark for the fulfillment of web usability can be obtained.
The questionnaire used was the System Usability Scale. This SUS questionnaire was used for assessment after respondents had completed the given assignments. It was provided as a post-test questionnaire intended to complete the data acquired from the respondents' subjective opinions. SUS consists of 10 questions. The original SUS is in English; however, an Indonesian adaptation is available from the research of Z. Sharfina and H. B. Santoso [13]. Each question has five measurement scales from 1 to 5 (1 for the lowest, 5 for the highest). The answer options consisted of strongly disagree, disagree, neutral, agree, and strongly agree. SUS has 0 as the minimum score and 100 as the maximum score. This SUS implementation produces a single number ranging from 0 to 100; the individual SUS scores are then averaged to obtain the final SUS score.

3.5 Data processing

After data were collected from the respondents, the scores were calculated. There are some rules for calculating the score when using the System Usability Scale (SUS). These are the rules for calculating the questionnaire score:
a) For every odd-numbered question, the score is the user's rating minus 1.
b) For every even-numbered question, the score is 5 minus the user's rating.
c) The SUS score is the sum of the scores of all questions multiplied by 2.5.

The Method to Calculate SUS


SUS score = ((R1−1) + (5−R2) + (R3−1) + (5−R4) + (R5−1) + (5−R6) + (R7−1) + (5−R8) + (R9−1) + (5−R10)) × 2.5    (1)

This scoring rule applies to one respondent. For the overall result, the mean SUS score is found by adding the scores of all respondents and dividing by the number of respondents. This is the formula to calculate the mean SUS score:

x̄ = Σx / N    (2)

in which:
x̄ = mean SUS score
Σx = total of the respondents' SUS scores
N = number of respondents

Table 2. Example of SUS Calculation Result Data

No.  Respondent     Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8 Q9 Q10  Total  Score (Total x 2.5)
1    Respondent 1    4  4  3  4  4  3  3  2  4   3    34      85
2    Respondent 2    4  4  3  4  4  3  3  2  4   3    34      85
3    Respondent 3    4  4  3  4  4  3  3  2  4   3    34      85
4    Respondent 4    4  4  3  4  4  3  3  2  4   3    34      85


Once this stage has been reached, the previous formula is applied to calculate the mean score: the scores of all respondents are summed and then divided by the number of respondents. The mean SUS score from the data above is 85.
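As an illustration of the scoring rules above, the following short Python sketch (written for this discussion, not code from the CodeSaya platform) computes the SUS score of one respondent from the ten raw Likert ratings and the mean over all respondents. The sample ratings are hypothetical and chosen so that their item contributions match the row shown in Table 2, reproducing the score of 85:

def sus_score(ratings):
    """SUS score for one respondent from ten raw 1-5 Likert ratings (formula 1)."""
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    contributions = [
        # Odd-numbered items (1, 3, ...): rating - 1; even-numbered items: 5 - rating.
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(ratings, start=1)
    ]
    return sum(contributions) * 2.5

def mean_sus(all_ratings):
    """Mean SUS score over all respondents (formula 2)."""
    scores = [sus_score(r) for r in all_ratings]
    return sum(scores) / len(scores)

# Hypothetical raw ratings whose item contributions are 4, 4, 3, 4, 4, 3, 3, 2, 4, 3,
# i.e., the per-item values in Table 2 (total 34, score 85).
respondents = [[5, 1, 4, 1, 5, 2, 4, 3, 5, 2]] * 4
print(sus_score(respondents[0]))  # 85.0
print(mean_sus(respondents))      # 85.0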

4 Results

4.1 Validity and reliability tests of the instrument

The instrument was validated to obtain valid data, so that the instrument measures what it is supposed to measure. Validity shows the extent to which the questions are relevant to what is being asked or measured in this study. The validity and reliability of the questionnaire were measured by testing 162 respondents. This was done to determine whether the instrument to be distributed to the respondents is valid and whether the questions are associated with the research variables of usability: learnability, efficiency, memorability, errors, and user satisfaction. The validity test used Pearson's product-moment correlation formula, presented as follows:

r = (NΣxy − (Σx)(Σy)) / √{[NΣx² − (Σx)²][NΣy² − (Σy)²]}    (3)

in which:
r = correlation between x and y
N = number of samples
Σx = sum of the item scores
Σy = sum of the total scores
The validity of the questionnaire was determined by comparing the calculated r values (r-count) with the r-table value. The decision rule is: if r-count > r-table, the question item is valid; conversely, if r-count < r-table, the item is invalid. The validity test in this study used 162 respondents, and the r-table value was 0.514. Therefore, a question item is considered valid if its value is > 0.514 and invalid if it is < 0.514.
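As a sketch of how this item-validity check could be run in Python (the data, threshold handling, and function name are illustrative assumptions, not the authors' actual script), each item's scores are correlated with the respondents' total scores and compared against the r-table threshold:

import numpy as np
from scipy.stats import pearsonr

def item_validity(responses, r_table=0.514):
    """Pearson item-total correlation for each question item.

    responses: array of shape (n_respondents, n_items) with raw item scores.
    Returns a list of (r_count, is_valid) tuples, one per item.
    """
    responses = np.asarray(responses, dtype=float)
    totals = responses.sum(axis=1)          # each respondent's total score
    results = []
    for item in responses.T:                # iterate over item columns
        r_count, _ = pearsonr(item, totals)
        results.append((r_count, r_count > r_table))
    return results

# Hypothetical data: 5 respondents x 10 items (the real study used 162 respondents).
data = [
    [4, 4, 3, 4, 4, 3, 3, 2, 4, 3],
    [5, 4, 4, 5, 4, 3, 4, 3, 5, 4],
    [3, 2, 3, 3, 3, 2, 2, 2, 3, 2],
    [4, 3, 4, 4, 4, 3, 3, 3, 4, 3],
    [2, 2, 2, 3, 2, 2, 2, 1, 2, 2],
]
for i, (r, ok) in enumerate(item_validity(data), start=1):
    print(f"Q{i}: r = {r:.3f}, valid = {ok}")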

4.2 Instrument reliability test


A reliable instrument means that if the instrument is used several times to measure the same object, it will generate the same measurement result. The answers to the questionnaire were processed through a statistical test using a computer program. The reliability test was performed using Cronbach's Alpha, with the following formula:


r₁ = [k / (k − 1)] × [1 − (Σσb² / σt²)]    (4)

in which:
r₁ = coefficient of reliability
k = number of question items
Σσb² = total of the item variances
σt² = total variance
Criteria: if r-count > r-table, the instrument is reliable; if r-count ≤ r-table, the instrument is unreliable. The reliability of the questionnaire is determined by comparing the r-count value, here the Alpha value, with the r-table value. The instrument is considered reliable if the r-count value is greater than the minimum reliability value of 0.7. The reliability test showed that this questionnaire is highly reliable, with a value of 0.968, well above 0.7.
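A minimal Python sketch of this reliability check is given below; the sample data and function name are assumptions for illustration only, not the study's data:

import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha (formula 4) for an (n_respondents, n_items) score matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)       # variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 5 respondents x 10 items.
data = [
    [4, 4, 3, 4, 4, 3, 3, 2, 4, 3],
    [5, 4, 4, 5, 4, 3, 4, 3, 5, 4],
    [3, 2, 3, 3, 3, 2, 2, 2, 3, 2],
    [4, 3, 4, 4, 4, 3, 3, 3, 4, 3],
    [2, 2, 2, 3, 2, 2, 2, 1, 2, 2],
]
alpha = cronbach_alpha(data)
print(f"Cronbach's alpha = {alpha:.3f}, reliable = {alpha > 0.7}")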

4.3 The analysis of SUS score

SUS is a global assessment of the usability aspects (effectiveness, efficiency, and satisfaction) subjectively experienced by the users. The SUS score can indicate the users' acceptability range. The mean SUS score has to be higher than 70 (Brooke, 2013) to be categorized as acceptable. The mean SUS score of the CodeSaya website was 70, which means that the CodeSaya website was rated as good, as can be seen in the figure presented below.

Fig. 3. SUS Score

The study by [8] also explains the categories of SUS scores. To acquire an A rank, the SUS score must be at least 81. The SUS scores of CodeSaya were 70 and 72.1, which places them in rank B, as shown in Figure 4.


Fig. 4. Percentile Ranks

The calculation results of the SUS method are then converted into percentile ranks and letter grades. Percentile ranks show the usability rate as a percentage (%), while letter grades show the usability rate from class A to class F, where A is the best class and F is the worst. The rules for percentile ranks and letter grades are as follows (a small mapping sketch follows the list):
1. Grade A: score >= 80.3, percentile >= 90%
2. Grade B: 74 <= score < 80.3, 70% <= percentile < 90%
3. Grade C: 68 <= score < 74, 40% <= percentile < 70%
4. Grade D: 51 <= score < 68, 20% <= percentile < 40%
5. Grade F: score < 51, percentile < 20%
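A minimal Python sketch of this grade conversion (the function is illustrative and not part of the SUS instrument itself):

def sus_grade(score):
    """Map a SUS score (0-100) to the letter grades listed above."""
    if score >= 80.3:
        return "A"
    if score >= 74:
        return "B"
    if score >= 68:
        return "C"
    if score >= 51:
        return "D"
    return "F"

print(sus_grade(72.1))  # "C" under these thresholds (reported as a B-C range in the text)
print(sus_grade(70.0))  # "C"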

SUS can also be interpreted into adjective ratings to elucidate the usability rate of a system, which can then be mapped to the user acceptability range to determine whether or not the system is acceptable to users.
According to the data, the mean SUS score of the CodeSaya site for familiar users was 72.1, while unfamiliar users gave a score of 70.0. From these two values, the SUS scores of the two respondent categories can be compared as follows.

Fig. 5. The comparison graphic of SUS questionnaire on both user categories


According to the scale of SUS scores presented in Figure 3, the scores of 70 and 72.1 fall in the Good range for adjective ratings, with a B-C grade on the scale and a High acceptability range. From all measurements, it can be concluded that the CodeSaya site was in the high mean class, or nearly categorized as acceptable. The score given by familiar respondents was only slightly higher than that of unfamiliar respondents, meaning the rating criteria were almost the same for familiar and unfamiliar respondents.
Regarding the use of an ICT-based learning medium, i.e., a website containing many multimedia elements such as animated images, text, motion, and audio, learning with such media can raise learning motivation, in line with motivation theory. Learning objects that are perceived as too complicated or too simple will generate low motivation, because students will feel bored or even frustrated by the feeling that it is impossible to finish the assignment properly. Santrock [14] also argued that teenagers between the ages of 15 and 18 undergo many cognitive, emotional, and social changes, are capable of thinking in a more sophisticated manner, and often spend their time with peers. In view of the above, a web-based learning design can be implemented according to the psychological needs of students in the adolescent period.
Web-based learning is not punitive because, in this view, teachers place more emphasis on the learning process than on the learning outcomes alone. This condition is reinforced by the establishment of a good (collaborative) environment. Therefore, increased motivation and a minimal occurrence of errors are possible due to the pleasant learning experienced by students.

5 Conclusion

After measuring the usability scale of the CodeSaya online programming learning platform, as used by students taking Algorithm and Computer Programming in the Computer Systems Department of the Faculty of Information Technology at Universitas Andalas, it can be concluded that the usability of the CodeSaya portal as experienced by the users is highly correlated with learnability, efficiency, memorability, errors, and satisfaction. In general, the rating ability of young users is pragmatic: they frequently rate an object as "good" if it gives them pleasure, while something "bad" is something annoying or unpleasant. It can be concluded that most of the students judged that the usability of the CodeSaya platform reached a score of 72.1 for familiar users and 70.0 for unfamiliar users, corresponding to B (Good) in the alphabetical rating. Therefore, this website can be implemented as a learning medium that is easy for users to learn, understand, and operate. The score given by familiar respondents was only slightly higher than that of unfamiliar users, meaning the measurement criteria were practically equal for familiar and unfamiliar users. However, this website still requires further evaluation for its future development, because, based on the acquired scores, it leaves the impression that the users are not yet fully satisfied with it, so future improvements are required.
I suggest improving the performance of this website so that users will keep using this medium as their learning platform. Because most of the respondents evaluated the website through their mobile devices, they request a more mobile-friendly website for more comfortable use. Login should be made more accessible and simpler, so that users do not find it complicated. The validation results, in the form of qualitative assessment data on the web product from learning technology experts, showed that all learning technology aspects of the CodeSaya.com web product are sufficient; thus, it can be used as a new pedagogic approach and learning technique in computer programming courses for teenagers. Teen students who participate collaboratively in web-based online learning show increased motivation, deeper conceptual comprehension of the materials, and an increased desire to solve problems.

6 References

[1] Papadakis, S. (2018). Is pair programming more effective than solo programming for secondary education novice programmers? A case study. International Journal of Web-Based Learning and Teaching Technologies, 13(1), 1–16. https://doi.org/10.4018/ijwltt.2018010101
[2] Papadakis, S., & Orfanakis, V. (2018). Comparing novice programing environments for use in secondary education: App Inventor for Android vs. Alice. International Journal of Technology Enhanced Learning, 10(1–2), 44–72. https://doi.org/10.1504/ijtel.2018.10008587
[3] Papadakis, S., & Orfanakis, V. (2017). The combined use of Lego Mindstorms NXT and App Inventor for teaching novice programmers. In D. Alimisis, M. Moro, & E. Menegatti (Eds.), Educational Robotics in the Makers Era. Edurobotics 2016. Advances in Intelligent Systems and Computing, vol. 560, pp. 193–204. Springer, Cham. https://doi.org/10.1007/978-3-319-55553-915
[4] Papadakis, S., Kalogiannakis, M., Orfanakis, V., & Zaranis, N. (2017). The appropriateness of Scratch and App Inventor as educational environments for teaching introductory programming in primary and secondary education. International Journal of Web-Based Learning and Teaching Technologies, 12(4), 58–77. https://doi.org/10.4018/ijwltt.2017100106
[5] Kim, H., Choi, H., Han, J., & So, H. J. (2012). Enhancing teachers' ICT capacity for the 21st century learning environment: Three cases of teacher education in Korea. Australasian Journal of Educational Technology, 28(6). https://doi.org/10.14742/ajet.805
[6] Kalogiannakis, M., & Papadakis, S. (2019). Evaluating a course for teaching introductory programming with Scratch to pre-service kindergarten teachers. International Journal of Technology Enhanced Learning, 11(3), 231. https://doi.org/10.1504/ijtel.2019.10020447
[7] Nielsen, J. (2003). Usability 101: Introduction to Usability.
[8] Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). Available at: https://measuringu.com/sus/
[9] Lewis, J. R. (2019). Item benchmarks for the System Usability Scale. (May 2018).
[10] Tullis, T. S., & Stetson, J. N. (2004). A comparison of questionnaires for assessing website usability, 1–12.
[11] N, I. A. H., Santoso, P. I., & Ferdiana, R. (2015). Pengujian Usability Website Menggunakan System Usability Scale [Website Usability Testing Using the System Usability Scale], 17(1), 31–38. https://doi.org/10.33164/iptekkom.17.1.2015.31-38
[12] Setiawati, A., Rahim, A., & Kisbianty, D. (2018). Pengembangan dan Pengujian Aspek Usability pada Sistem Informasi Perpustakaan (Studi Kasus: STIKOM Dinamika Bangsa Jambi) [Development and Testing of Usability Aspects of a Library Information System (Case Study: STIKOM Dinamika Bangsa Jambi)], 13(1). https://doi.org/10.35143/jkt.v5i1.2487
[13] Sharfina, Z., & Santoso, H. B. (2017). An Indonesian adaptation of the System Usability Scale (SUS). In International Conference on Advanced Computer Science and Information Systems (ICACSIS 2016), pp. 145–148. https://doi.org/10.1109/icacsis.2016.7872776
[14] Santrock, John W. (2003). Adolescence: Perkembangan Remaja [Adolescent Development]. Translated by Shinto B. Adelar and Sherly Siragih. Jakarta: Erlangga.
[15] Mostakhdemin-Hosseini, A. (2009). Usability considerations of mobile learning applications. International Journal of Interactive Mobile Technologies (iJIM), 3(0), 29–31. https://doi.org/10.3991/ijim.v3s1.854
[16] Dirin, A., & Nieminen, M. (2015). mLUX: Usability and user experience development framework for m-learning. International Journal of Interactive Mobile Technologies, 9(3), 37–51. https://doi.org/10.3991/ijim.v9i3.4446
[17] Hussain, A., Mkpojiogu, E. O. C., Nabeel, N., & Alathwari, A. (2019). Users' perception of the mobile usability of a global bicycle sharing platform. International Journal of Interactive Mobile Technologies, 13(11), 125–136. https://doi.org/10.3991/ijim.v13i11.11298
[18] Papadakis, S. (2020). Evaluating a game-development approach to teach introductory programming concepts in secondary education. International Journal of Technology Enhanced Learning, 12(2), 127–145. https://doi.org/10.1504/ijtel.2020.106282
[19] Papadakis, S. (2018). Evaluating pre-service teachers' acceptance of mobile devices with regards to their age and gender: A case study in Greece. International Journal of Mobile Learning and Organisation, 12(4), 336–352. https://doi.org/10.1504/ijmlo.2018.10013372

7 Author

Derisma is a faculty member at the Department of Information Technology, Universitas Andalas (UNAND), Padang, Indonesia. She teaches several courses, e.g., Human-Computer Interaction, Artificial Intelligence, and Multimedia Systems. Her current research interests include soft computing, artificial intelligence, multimedia systems, human-computer interaction, e-health, and e-learning.
Email: [email protected]

Article submitted 2020-01-10. Resubmitted 2020-03-16. Final acceptance 2020-03-16. Final version
published as submitted by the authors.

