
IB Computer Science Dossier
Grade Program
Table of Contents
A1- Analysis
A2- Criteria for Success
A3- Prototype
B1- Data Structures
B2- Algorithms
B3- Modular Organization
C2- Handling Errors
C3- Success of Program
C1- Code Listing
D1- Annotated Hard Copy
D2- Evaluation of Solutions
D3- User Documentation
A1- Analysis

This section of the program dossier would typically be two to three pages in length. It
should include a brief statement of the problem as seen by the end-user.

– The problem should be discussed from the end-user’s point of view, including the
user’s needs, required input and required output.

• Evidence of this investigation, for example sample data or interviews, could be
placed in an appendix.
A2- Criteria for Success

• Criterion A2 (0 – 3 points): Criteria for success

– This section of the program dossier will clearly state the objectives/goals of the solution
to the problem.

• The expected behaviour of the solution should be clearly described, and the limits under which it
can operate should be outlined.

– This section of the program dossier would typically be one to two pages in length.

– Objectives should include minimum performance and usability.

– These criteria for success will be referred to in subsequent criteria.

• For example, criterion C2 (usability), C4 (success of the program), D2 (evaluating
solutions) and D3 (including user documentation).

– The limits under which the solution will operate will vary.

• Some examples are:

– time taken to return a search result from a data file

– the response of the program to invalid and extreme data input

– limitations on the volume of data stored in the program


– usability of the user input screens

– the proper response of the program to user input.


A3- Prototype

Criterion A3 (0 – 3 points): Prototype solution

The prototype solution must be preceded by an initial design for some of the main objectives that were
determined to be the criteria for success.

A prototype of the solution should be created.

A prototype is: “The construction of a simple version of the solution that is used as part of the
design process to demonstrate how the system will work.”
The prototype need not be functional; it could be constructed using a number of tools, such as Visual
Basic, PowerPoint, Mac Paint or Corel Draw, even for a simple Java program.

The intent is to show the user how the system is expected to operate, what inputs are required and
what outputs will be produced.

A number of screenshots will be required for the user to be able to evaluate the solution properly.

The prototype, at its simplest, could be a series of clear, computer-generated drawings, a hierarchical
outline of features in text mode, or a series of screenshots.

Documentation of user feedback could be, for example, a report of the user’s comments on the
prototype.
B1- Data Structures

Criterion B1 (0 – 3 points): Data structures

– Students should choose data structures, at the design stage, that fully support the data-
storage requirements of the problem, and that allow clear, efficient algorithms to be
written.

– The data structures must fully support the objectives of the solution (criterion A2).

– The classes chosen should be logical in that the data is sensible for the objects in
question and the methods are appropriate for the data given.

• This section of the program dossier could include class definitions, file structures, abstract data
types (particularly at higher level) and some consideration of alternatives.

– This section would typically be two to five pages in length.

– Data structures and data members that are to be used in the programmed solution
should be discussed here.

– Sample data and sketches/illustrations, including a discussion of the way data objects will
change during program execution, should be included to achieve a level 4 in
criterion B1.
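For a grade program, the kind of class definition this section might discuss could look like the minimal sketch below. The Student class, its fields and the fixed-size marks array are illustrative assumptions for this example, not part of the official guidance.

// Minimal sketch, assuming a simple grade program: the class name, fields and
// fixed-size array are illustrative choices, not prescribed by the criterion.
public class Student {
    private String name;        // the student's full name
    private double[] marks;     // one mark per assessment, expected range 0-100
    private int markCount;      // number of marks currently stored

    public Student(String name, int maxMarks) {
        this.name = name;
        this.marks = new double[maxMarks];
        this.markCount = 0;
    }

    // Pre:  0 <= mark <= 100 and markCount < marks.length
    // Post: the mark is stored and markCount is increased by one
    public void addMark(double mark) {
        marks[markCount] = mark;
        markCount++;
    }

    public String getName()      { return name; }
    public int getMarkCount()    { return markCount; }
    public double getMark(int i) { return marks[i]; }
}

A consideration of alternatives, for example an ArrayList instead of the fixed-size array, would also belong in this section.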
B2- Algorithms

Students should choose algorithms, at the design stage, that fully support the
processes needed to achieve the objectives of the solution (criterion A2), and provide
sufficient support for the required data structures.

– The classes chosen should be logical in that the methods are appropriate for the data
given.

– Students must include parameters, return values, and descriptions of pre- and post-conditions.

– This section would typically be two to five pages in length.

– This can be a list or outline of all the algorithms, presented as text, possibly in outline
format.

– Standard algorithms (such as search or sort) can simply be named (with parameters),
but non-standard algorithms must be described in more detail.
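As an illustration of the level of detail expected for a non-standard algorithm, a hedged sketch follows; it documents a mean-calculation method with its parameters, return value and pre-/post-conditions. The class and method names, and the Student class carried over from the earlier sketch, are assumptions for this example.

// Minimal sketch, assuming the Student class outlined under criterion B1.
public class GradeAlgorithms {

    // calculateAverage(Student s)
    //   Parameters: s - the student whose marks are to be averaged
    //   Returns:    the arithmetic mean of the stored marks
    //   Pre:        s != null and s.getMarkCount() > 0
    //   Post:       s is unchanged; the mean of its marks is returned
    public static double calculateAverage(Student s) {
        double total = 0;
        for (int i = 0; i < s.getMarkCount(); i++) {
            total += s.getMark(i);           // accumulate each stored mark
        }
        return total / s.getMarkCount();     // precondition guarantees at least one mark
    }
}

A standard algorithm, such as a selection sort of the marks, could simply be named with its parameters, as the criterion states.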
B3- Modular Organization

Students should choose modules, at the design stage, that incorporate the data
structures and methods required for the solution (criteria B1 and B2) in a logical way.

– The data structures must fully support the objectives of the solution (criterion A2).

– Students must present this organization in a structured way that clearly shows
connections between modules (hierarchical decomposition or class dependencies).

– The connections between modules, algorithms and data structures must also be
presented.

– This section would typically be three to five pages in length.

– A variety of presentations are possible here. Some possibilities are:

– a top-down hierarchical decomposition chart containing the names of modules,
showing connections between modules and showing details of which data
structures and methods are connected with (or part of) which modules

– a text outline showing hierarchical decomposition (equivalent to the above)

– a hard copy of CRC cards showing dependencies between collaborating classes,
with details of which data structures and methods are connected with (or part of)
which classes.

– The design is assessed independently from the programming stage (stage C).

– The design should be complete, logical and usable, but the student may deviate from
it or expand it during stage C, without penalty.
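To illustrate what connections between modules, algorithms and data structures can look like in code, a minimal sketch is given below. The GradeBook class and its responsibilities are assumptions that build on the earlier Student and calculateAverage sketches; a driver class would call these methods.

// Minimal sketch, assuming the Student class (criterion B1) and the
// calculateAverage algorithm (criterion B2).
public class GradeBook {
    private Student[] students;   // data structure from criterion B1
    private int count = 0;        // number of students currently stored

    public GradeBook(int capacity) {
        students = new Student[capacity];
    }

    public void addStudent(Student s) {
        students[count] = s;
        count++;
    }

    // Output module: one report line per student, using the B2 algorithm
    public void printReport() {
        for (int i = 0; i < count; i++) {
            if (students[i].getMarkCount() > 0) {     // respect the B2 precondition
                double mean = GradeAlgorithms.calculateAverage(students[i]);
                System.out.println(students[i].getName() + ": " + mean);
            }
        }
    }
}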
C2- Handling Errors

This refers to detecting and rejecting erroneous data input from the user, and
preventing common run-time errors caused by calculations and data-file errors.

– Students are not expected to detect or correct intermittent or fatal hardware errors,
such as paper-out signals from the printer or damaged disk drives, or to prevent data
loss during a power outage.

– This section would typically be one to two pages in length.

– For this criterion, students must attempt to trap as many errors as possible.

– The documentation in the dossier can take a variety of forms.

– For example, students could highlight relevant comments within the program
listing or they could produce a table with two columns, one that identifies any
error possibilities, and one that shows the steps taken to trap the errors.

– It is not expected that extra output is produced for this section.
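A minimal sketch of the kind of error trapping this criterion has in mind is given below, assuming marks are typed at the keyboard; the class name, prompt wording and 0–100 range are illustrative assumptions.

import java.util.Scanner;

// Minimal sketch: rejecting non-numeric and out-of-range input before it
// reaches the rest of the program.
public class MarkInput {

    public static double readMark(Scanner in) {
        while (true) {
            System.out.print("Enter a mark (0-100): ");
            String line = in.nextLine();
            try {
                double mark = Double.parseDouble(line);        // may throw NumberFormatException
                if (mark < 0 || mark > 100) {
                    System.out.println("Mark must be between 0 and 100.");
                } else {
                    return mark;                               // valid input accepted
                }
            } catch (NumberFormatException e) {
                System.out.println("Please enter a number.");  // non-numeric input rejected
            }
        }
    }
}

An entry in the two-column table described above might then read “non-numeric mark entered” in the first column and “parseDouble wrapped in try/catch in readMark” in the second.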


C3- Success of Program

– Evidence here refers to hard copy output in criterion D1.

The teacher should run the program with the student to confirm that the program functions and that
it produces the hard copy output submitted with the program dossier.
C1- Code Listing

Good programming style can be demonstrated by program listings that are easily readable, even by a
programmer who has never used the program.

These would include small and clearly structured Java methods, sufficient and appropriate comments,
meaningful identifier names and a consistent indentation scheme.

Comments should be included to describe the purpose and parameters of each method, and also when
code is difficult to understand.

The program should demonstrate the use of good programming techniques. It should include (Page
115):

an identification header indicating:

– the program name

– author

– date

– school

– computer used

– IDE used

– purpose.
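As an example, such an identification header might look like the sketch below; every detail shown is a placeholder rather than real dossier data.

/*
 * Program:  GradeProgram
 * Author:   <student name>
 * Date:     <submission date>
 * School:   <school name>
 * Computer: <machine used for development>
 * IDE:      <development environment used>
 * Purpose:  Stores student marks and reports averages for the end-user.
 */
public class GradeProgram {
    public static void main(String[] args) {
        // entry point; the modules described under criterion B3 would be called from here
    }
}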
D1- Annotated Hard Copy

Hard copy output from one or more sample runs should be included to show that the different branches
of the program have been tested; testing one set of valid data will not be sufficient.

The hard copy submitted should demonstrate the program’s responses to inappropriate or erroneous
data, as well as to valid data.

Thus the usefulness of the error-handling routines mentioned above should become evident.

While at least one complete test run must be included in the dossier, it is not necessary that the hard
copy reflect every keystroke of every test run.

Cutting and pasting of additional test runs should be done to illustrate the testing of different aspects of
the program.

All test runs should be annotated to state which aspect of the program is being tested.

Sample output must never be altered by hand, erased or covered up.


Sample output can be “captured” and combined electronically with explanatory annotations into a
single document.

However, it is forbidden to alter or reformat sample output in any fashion (except to add page numbers
or annotate in order to highlight user friendliness or error-handling facilities as discussed above),
especially if these alterations would give an unrealistic impression of the performance of the program.

Examples of such “abuse” include: lining up text that was not originally aligned; adding colour or other
special effects; changing incorrect numerical output; erasing evidence of errors.
D2- Evaluation of Solutions

This section of the dossier would typically be two pages in length.

The evaluation/conclusion should include reflections on the effectiveness of the programmed solution to
the original problem. It should discuss answers to the following questions:

Did it work?

Did it address the criteria for success?

Did it work for some data sets, but not others?

Does the program in its current form have any limitations?

What additional features could the program have?

Was the initial design appropriate?

A thorough evaluation should also discuss possible future enhancements that could be made to the
program.

Sample – Evaluating solutions (Page 140)
D3- User Documentation
