
Cambridge International General Certificate of Secondary Education

0417 Information and Communication Technology November 2020


Principal Examiner Report for Teachers

INFORMATION AND COMMUNICATION TECHNOLOGY

Paper 0417/11
Written Paper

Key messages

• Candidates should be advised to read the questions thoroughly and plan their answers carefully.

• Candidates who performed well in this paper used specific and detailed language when replying to
‘describe’ and ‘write down steps’ type questions. They also gave a justification of the statements and
discussed the arguments for and against.

• Candidates must give the generic names for software rather than the brand name. It is clearly stated
on the front page of the question paper ‘No marks will be awarded for using brand names of software
packages or hardware.’

• If candidates need to expand their answers on to other parts of the question paper or onto extra
sheets, they should clearly identify in the original answer space where the extra part can be found.

General comments

The paper gave all candidates an opportunity to demonstrate their knowledge and understanding of ICT
using a wide variety of topics. The vast majority of candidates were able to complete the paper in the allotted
time, and most were able to make an attempt at all the questions.

When a question indicates a specific number of answers, candidates should only write one answer in each
allocated space as only one is marked for each space. Any question inviting the candidate to describe,
discuss, explain advantages or disadvantages requires specific detailed points relevant to the questions
asked. Some candidates wrote conclusions on the discuss type questions. Conclusions must be reasoned
and detailed in order to gain the mark.

A few candidates used tables or a line down the middle of the answer to list advantages and disadvantages
in separate sections when answering the discussion questions, producing repeated or shortened answers.
This method of answering a question can result in missed points as comparisons are difficult.

Comments on specific questions

Question 1

(a) Candidates did well on this part of the question with many candidates gaining both marks. Some
candidates misread output as input.

(b) Most candidates were able to gain at least a mark in this question. Some candidates gave the
answer hard drives rather than optical devices.

Question 2

Most candidates were able to gain at least two marks in this question. A common error was to state that the
control unit carries out logical decisions. However, main memory as the IAS and the ALU carrying out
calculations were generally chosen correctly.

Question 3

(a) Most candidates were able to gain two marks for this question. However, some candidates mixed
up switch and bridge, and modem and switch.

(b) Stronger candidates were able to correctly identify routing table.

(c) Many candidates had an idea of how the data was routed but could not describe the process fully.
Candidates were able to state that data is sent in data packets or explained this in words. Some
candidates thought that data was sent using the internet. Stronger responses clearly described the
process of using the IP addresses in the data packets and how this is used in the router to work out
the best route to send the data packet. Candidates should be encouraged to give more detail in
their answers. Some candidates did not give sufficient detail to gain full marks.

Question 4

(a) Most candidates were able to gain a mark for this question, usually for giving the name ‘title’. If a
second mark was awarded it was usually for ‘target frame’. This question was challenging for some
candidates who appeared not to understand the meaning of the head section.

(b) Many candidates were able to gain a mark for this question, but it was a challenging question for
some candidates. Although many candidates appeared to understand the meaning of the body
section, the issue was in writing the answer with enough detail to gain the marks. As a result, most
marks were awarded for the content/information and an example.
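
For reference, a minimal outline of a web page (the element content shown is a generic example, not taken
from the question paper) illustrates the difference between the two sections:

    <html>
    <head>
      <!-- head section: information about the page, not displayed in the page itself -->
      <title>Page title shown in the browser tab</title>
      <base target="_top"> <!-- the default target frame is also defined here -->
    </head>
    <body>
      <!-- body section: the content the visitor sees -->
      <h1>Main heading</h1>
      <p>Paragraph text, images and hyperlinks go here.</p>
    </body>
    </html>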

(c) This again was a challenging question with only stronger candidates gaining full marks. However, a
few candidates gained a mark for identifying that a class name starts with a dot. Some candidates
wrote about styles using H1, H2, etc.
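
For reference, a short illustrative stylesheet extract (the class name and property values are examples only)
shows the difference between a class definition, which starts with a dot, and an element style such as h1:

    .highlight { color: #ff0000; font-weight: bold; }  /* class: the name starts with a dot */
    h1 { font-family: Arial, sans-serif; }             /* element style: applies to all h1 headings, no dot */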

Question 5

(a) Most candidates managed to gain at least a mark on this question. Those candidates who
understood the key word ‘evaluate’ scored well. Evaluation questions require candidates to give
both positives and negatives; full marks cannot be achieved unless both are given. Most
candidates were able to give negatives, but few were able to give positives. It is
important in questions of this type that candidates read the question fully before answering.

(b) This part was better answered than part (a) with many candidates gaining at least four marks.
There were some candidates who misread the question and drew a ticket that was not a printable
ticket as it included drop down menus. There were some very good drawings of QR and bar codes.
Some candidates redrew a box inside the answer box. This reduced the area that the candidate
had in which to answer the question.

Question 6

(a) Many candidates understood that encryption was the process of converting sensitive data to
meaningless data, though many thought that data became unreadable. Candidates gained marks
for encryption key and decryption key as well as explaining that the data was scrambled.

(b) Please note that due to an issue with question 6(b), full marks have been awarded to all candidates
for this question to make sure that no candidates were disadvantaged.

Question 7

(a) Many candidates gained a mark for length check and a few for format check. The instruction in the
question for no fields to be repeated was often ignored. A significant number of candidates,
including many who answered the rest of the paper well, did not attempt this question.

(b) Those candidates that gave test data usually did well on this question and the concepts of normal,
abnormal and extreme were well understood. However, some candidates did not give suitable
examples and therefore did not gain full marks.
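
As an illustration only (the field and its accepted range are hypothetical): for a field accepting whole numbers
from 1 to 100, normal data would be 50, extreme data would be the boundary values 1 or 100, and abnormal
data would be 150 or the text ‘fifty’, both of which should be rejected.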

Question 8

This question gave a different scenario to the standard explanation of the use of a microprocessor in a
control system. Some candidates misread the question and wrote about converting the data using an ADC
when the question was about the use of the microprocessor. Many responses clearly identified the use of a
pre-set value and how the microprocessor used this value along with the data from the sensor to send a
signal to an actuator when needed. Some candidates understood the comparison of the value with the pre-
set value but others produced vague answers about what happened next. For example, candidates stated
that a vehicle must be slowed down or sped up but could not explain how this was achieved. Weaker
candidates explained that the sensor made the changes to the speed of the vehicle or that warnings were
given to the driver to slow down or to speed up rather than being autonomous as stated in the question.

Overall, many candidates achieved some good marks on this question and showed a good understanding of
how a proximity sensor and microprocessor work together in this situation.

Question 9

Some candidates misread this question answering as though the laptop computer was connected to the
internet and had been hacked. The question stated that the laptop had been lost and required a discussion
of the methods needed to protect the data stored on the laptop. Some candidates gave one of the correct
answers relating to biometrics but used this as their only answer therefore only gaining some of the marks
available. There were some good explanations about strong passwords, encryption and biometric
passwords.

Question 10

Most candidates were able to gain some marks on Question 10.

(a) Almost every candidate scored marks here with many gaining all of the available marks.

(b) Almost every candidate correctly identified Japan as satisfying the search criteria.

(c) (i) Most candidates were able to gain at least two marks from the four that were available in this
question. Many candidates were able to explain the function of the COUNTIF formula but had more
difficulty in identifying the reason for the $ signs. A few candidates thought that this denoted
currency rather than absolute values. Common mistakes were that the formula counted the
countries or related to the column rather than the cell reference.
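
For reference, a formula of this type (the cell references here are illustrative only, not those used on the
question paper) would be:

    =COUNTIF($B$2:$B$50, D2)

The $ signs make the range B2:B50 an absolute reference so that it does not change when the formula is
replicated, while D2 is a relative reference that does change.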

(ii) Most candidates were able to gain at least four marks on this question. The marking was split
between the setting up of the graph and the naming of the title and labels. Most candidates
managed a range of marks between the two parts.

Many candidates obviously understood the process of creating and labelling a chart. Some
candidates had problems writing the answer in the form of steps to show what they would do if they
were creating the chart, and this led to some vague answers. Generally, the correct range and the
correct chart were selected. The labels given to the axes and the chart itself were usually correctly
chosen.

Question 11

Most candidates were able to gain at least three marks on this question. The difference between technical
and user documentation was generally understood. Some candidates were confused about what should go
into both. The first three items were generally better understood than the second three.

Question 12

Most candidates were able to gain at least half of the available marks on this question with stronger answers
given in parts (a) and (b) rather than part (c).

(a) There was a good understanding of the devices used in a VR system with many candidates gaining
maximum marks. However, there was sometimes a lack of detail such as ‘gloves’ instead of ‘data
gloves’ or equivalent, and ‘mobile phone’ instead of ‘smartphone’.

(b) As with part (a), part (b) was generally well answered but some candidates gave incorrect
answers such as ‘brain problems’ as health problems associated with VR.

(c) This part of the question was not as well answered as part (a) or part (b) with many candidates
focusing on the creation of a presentation rather than how the presentation should be presented for
adults. Some answers suggested the use of a large font even though the opposite is true, as
presentations of this type have more text, in a smaller size, and fewer images.

Responses which identified that the question was asking candidates to describe the features
needed to specifically appeal to adults were able to access the full mark range. Weaker responses
gave features to have in a general presentation without demonstrating an understanding of the
different audience. Many candidates had some knowledge of presentations but were unable to
adapt a presentation for a certain audience.

INFORMATION AND COMMUNICATION TECHNOLOGY

Paper 0417/12
Written Paper

Key messages

• Candidates should be advised to read the questions thoroughly and plan their answers carefully.

• Candidates who performed well in this paper used specific and detailed language when replying to
‘describe’ and ‘write down steps’ type questions. They also gave a justification of the statements and
discussed the arguments for and against.

• Candidates must give the generic names for software rather than the brand name. It is clearly stated
on the front page of the question paper ‘No marks will be awarded for using brand names of software
packages or hardware.’

• If candidates need to expand their answers on to other parts of the question paper or onto extra
sheets, they should clearly identify in the original answer space where the extra part can be found.

General comments

The paper gave all candidates an opportunity to demonstrate their knowledge and understanding of ICT
using a wide variety of topics. The vast majority of candidates were able to complete the paper in the allotted
time, and most were able to make an attempt at all the questions.

When a question indicates a specific number of answers, candidates should only write one answer in each
allocated space as only one is marked for each space. Any question inviting the candidate to describe,
discuss, explain advantages or disadvantages requires specific detailed points relevant to the questions
asked. Some candidates wrote conclusions on the discuss type questions. Conclusions must be reasoned
and detailed in order to gain the mark.

A few candidates used tables or a line down the middle of the answer to list advantages and disadvantages
in separate sections when answering the discussion questions, producing repeated or shortened answers.
This method of answering a question can result in missed points as comparisons are difficult.

Comments on specific questions

Question 1

Most candidates were able to gain at least two marks for this question.

(a) This question was answered well with most candidates gaining at least a mark, and many gaining
both marks. The most popular incorrect answer was hard disc drive.

(b) Even though most candidates gained at least a mark for this question, some candidates did not
understand what was meant by direct data entry. Some candidates gave output devices, most
candidates gave input devices, but some of those selected incorrect input devices.

Question 2

Most candidates were able to gain at least two marks in this question. There were very few candidates who
put more than one tick on a line, and there were not many blank answers.

Question 3

Most candidates were able to gain at least two marks for this question. Candidates appeared to understand
the question well.

(a) This was a challenging question with some candidates answering HDD rather than ROM.

(b) Most candidates were able to give the correct answer.

(c) This question was generally answered well.

Question 4

Most candidates gained at least three marks for this question.

(a) This was a very challenging question. Candidates managed to do well in the first three answers but
less so in the last part. The question asked for technical terms, but some candidates gave general
answers like temperature rather than pre-set value. Other candidates thought the windows opened
rather than actuators opening the windows. However, many candidates gained marks for analogue
or ADC.

(b) Most candidates were able to gain at least a mark for this question. Some candidates failed to
read the question carefully enough and gave output devices, and LED monitor was a common
incorrect answer.

(c) Few candidates obtained more than five marks on this question. Marks were typically given for
24/7, expensive to maintain and set-up, but there were many vague answers. A significant number
of candidates talked about lifting heavy items, power outages, or being faster than humans. Those
that mentioned retraining, or that more of a particular type of staff was needed, did not make
reference to an increase in costs. Many responses were too vague.

Question 5

This question was fairly well answered with candidates gaining more marks on the first part than the second.

(a) This question was fairly well answered with many candidates giving more differences than
similarities. In order to gain full marks candidates needed to give both similarities and differences.
Both the intranet and the internet are similar technologies and therefore there should have been
more scope for similarities rather than differences.

(b) This question was not answered as well with many candidates not giving a use for the protocol if
they had managed to name the protocol. A popular incorrect answer was ISP, which was sometimes
written out as Internet Security Protocol, when ISP is in fact the acronym for Internet Service Provider.
Some candidates, possibly by accident, wrote down very obscure protocols as they only had to
give initials.

The protocols that were expected were HTTP, HTTPS and FTP. These were the most popular
correct answers.

Question 6

Most candidates were able to gain half of the available marks for this question with part (b) being the most
straightforward.

(a) Most candidates were able to gain two or three marks for this part of the question. The question
was split into three sections, with one mark for the explanation of $, one mark for the explanation of
! and three marks for the explanation of the calculation. Most candidates could explain the
calculation.

However, some candidates thought that the use of the $ sign was related to currency rather than
absolute referencing. The majority of candidates that knew it was absolute referencing had issues
explaining how the $ could be used when the data was replicated. Candidates also had issues
explaining the ! sign and how it was used.
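
For reference, an illustrative formula combining both symbols (the sheet name and cell references are
examples only, not those used on the question paper) is:

    =C5*Rates!$B$1

The ! separates the sheet name (Rates) from the cell reference, and the $ signs make B1 an absolute
reference so that it does not change when the formula is replicated.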

(b) Most candidates were able to gain at least two marks for this question with many gaining all three
marks. Candidates were able to write down that the formula was comparing the value in B20 to
determine if it was greater than £35 000. A significant number of candidates wrote that if it was
greater it would display Y otherwise it would display N. Both a detailed explanation of the formula
as well as a more general version relating to whether the car could be afforded or not were
accepted.
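
Based on the description above, the formula would have been of the form:

    =IF(B20>35000, "Y", "N")

displaying Y when the value in B20 is greater than £35 000 and N otherwise.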

(c) This question was fairly well answered, with most of the marks being gained from the setting up of
the chart. The naming of the most appropriate chart and the explanation of how this could be
placed on a new sheet was not as well answered, but only carried two marks. Many candidates
thought that the bar chart was the best chart rather than the pie chart. This question was about
writing the steps needed to complete the task but some candidates failed to answer it in this way
and therefore did not gain all the marks that they could have achieved.

Question 7

This question was fairly well answered with candidates gaining most of their marks on part (b).

(a) In order to gain full marks in this question both benefits and drawbacks are needed. This was a
very challenging question for some candidates. The main issue with the answers given was a lack
of detail. For example, a number of candidates mentioned that interviewing would take a long time
without mentioning that it would take a long time to interview all the members. Many candidates
gave answers which showed that they understood the process of interviews but did not show an
understanding of the benefits and drawbacks of them. Some candidates identified that this method
was expensive without explaining that all the players are interviewed.

(b) Most candidates answered this question well. The question stated that all answers should be
different, but some candidates gave the answer ‘text’ which had already been given in the table.
Some candidates gave the answer as numeric without expanding on this.

(c) Some candidates were able to gain a mark on this question for explaining that the first character
was a 0 and therefore had to be text. However, many candidates answered by stating that in the
future letters could be used in the membership number whilst others thought it was needed as it
was a primary key.

(d) This question was very challenging with many candidates not achieving high marks. There were
references to brand names in the answers given which are not permitted. The answers given
showed little understanding of the process. The most challenging part of the question was
describing the placing of fields in the master document template.

Question 8

Overall, Question 8 was very challenging for many candidates with a lack of depth in the answers given on
both of the first two parts. However, the third part produced stronger answers.

(a) This was a very challenging question with only stronger candidates gaining many marks. Many
other candidates understood the topic but misread the question, answering it with the effects on the
customers rather than the store. Some candidates thought that the online store was a different
organisation to the physical store and answered the question in this way. The question stated
that supermarkets allowed online shopping; therefore these candidates had misread the question.

(b) As with part (a) this was also a challenging question. As with the previous part many of the
answers lacked detail and, in some cases, candidates had not read the question. Some candidates
responded about the use of debit cards to make online transactions rather than answering the
question about how to reduce fraud. There was a common theme around keeping your pin number
safe and not letting others know it, and not entering your personal details online. The most popular
correct answer related to the use of the green padlock or HTTPS websites or the use of strong
passwords.

(c) This part was generally well answered with most candidates gaining at least three marks.

Question 9

This question was challenging for many candidates. Stronger candidates provided an answer that related to
making a copy of the software and then illegally using it by distribution, selling, etc. Some candidates
answered the question on different types of copyright rather than the one listed in the question and others
gave answers that referred to hacking. As with previous questions, a lack of detail in the answer was the
main issue.

Question 10

Most candidates were able to gain some marks on this question.

(a) Many candidates were able to answer that the hyperlink took the user to a web page. Fewer
candidates were able to correctly identify what a hyperlink was.

(b) Many candidates were able to gain a mark for this part by explaining what ISP meant. However,
some candidates thought that ISP meant Internet Security Protocol. There were a few candidates
that stated that ISP was an Internet Service Provider that provides the internet.

Question 11

This question was generally well answered with most candidates gaining half of the available marks for each
part.

(a) This part was generally well answered. A number of candidates described what a monitor was and
where it could be used. Others described the health issues in great detail without going on to detail
what they could do to prevent or reduce these from occurring. The most common answers were
reducing brightness and taking regular breaks.

(b) This part was also well answered. Popular answers were ergonomic keyboard and taking regular
breaks.

Question 12

This question was fairly well answered with most candidates gaining at least two marks. Most candidates
understood what GUI and CLI were. However, some of the differences between the two interfaces lacked
detail.

INFORMATION AND COMMUNICATION TECHNOLOGY

Paper 0417/13
Written Paper

Key messages

• Candidates should be advised to read the questions thoroughly and plan their answers carefully.

• Candidates who performed well in this paper used specific and detailed language when replying to
‘describe’ and ‘write down steps’ type questions. They also gave a justification of the statements and
discussed the arguments for and against.

• Candidates must give the generic names for software rather than the brand name. It is clearly stated
on the front page of the question paper ‘No marks will be awarded for using brand names of software
packages or hardware.’

• If candidates need to expand their answers on to other parts of the question paper or onto extra
sheets, they should clearly identify in the original answer space where the extra part can be found.

General comments

The paper gave all candidates an opportunity to demonstrate their knowledge and understanding of ICT
using a wide variety of topics. The vast majority of candidates were able to complete the paper in the allotted
time, and most were able to make an attempt at all the questions.

When a question indicates a specific number of answers, candidates should only write one answer in each
allocated space as only one is marked for each space. Any question inviting the candidate to describe,
discuss, explain advantages or disadvantages requires specific detailed points relevant to the questions
asked. Some candidates wrote conclusions on the discuss type questions. Conclusions must be reasoned
and detailed in order to gain the mark.

A few candidates used tables or a line down the middle of the answer to list advantages and disadvantages
in separate sections when answering the discussion questions, producing repeated or shortened answers.
This method of answering a question can result in missed points as comparisons are difficult.

Comments on specific questions

Question 1

This question was fairly well answered.

(a) Most candidates were able to gain at least a mark from this part of the question. The first two items
in the table were more familiar to candidates than the second two.

(b) Most candidates were able to gain a mark.

Question 2

Most candidates were able to gain at least three marks on this question.

Question 3

(a) The answer to this part of the question was gutter margin but it appeared to be very challenging for
candidates.

(b) The answer to this part of the question was header and this was well answered by many of the
candidates.

(c) Candidates found this part of the question very challenging.

(d) Most candidates were able to gain a mark on this part of the question giving justify or justified.

(e) Most candidates were able to gain a mark for this answer.

Question 4

Most candidates did well on this question although there were many brand names given in answers.

Question 5

Candidates did better on part (a) than part (b).

(a) This question was fairly well answered. Stronger answers gave relevant information and gave
examples. It was rare to see an answer that it was created by HTML.

(b) This part was not answered as well as part (a) as many candidates thought that this layer was
what the audience saw, probably derived from the name – presentation.

Question 6

This question was very challenging for some candidates. Part (b) was the most straightforward of its three
parts.

(a) To gain full marks in this question both benefits and drawbacks needed to be given. There were
some attempts at conclusions, but these were often just repeats of the answers already given. There
were a large number of possible answers. Some candidates misread the question and answered
in terms of how the system could be improved, while others produced good answers but rarely
expanded upon them.

(b) Although this topic was not new, it appeared to be very challenging for some candidates.

(c) Candidates also found this question challenging. Candidates were able to answer about validation
or verification but rarely managed to give a comparison between them. There was a lack of detail in
the answers given.

Question 7

This question was quite well answered by most candidates with part (a) being answered better than part (b).

(a) This was a straightforward question with most candidates gaining at least a mark.

(b) Many answers lacked enough detail to gain full marks. Some candidates missed out the pre-set
value and therefore struggled to gain marks. With questions of this type candidates should identify
whether the topic is control or measurement, in this case control, and explain how that process
works, customising their answer to the question. Some candidates misread the question
and included ADC in the answer.

Question 8

This question was fairly well answered by many candidates especially the parts that included the formula.

(a) This part of the question was well answered, with many candidates understanding the formula.

(b) As with part (a), this was also well answered. Candidates were able to gain a mark by stating that it
rounded I3 to 1 decimal place but many did not add that the answer could be rounded up or
rounded down.
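
Based on this description, a formula of the form:

    =ROUND(I3, 1)

rounds the value in I3 to 1 decimal place, rounding up or down to whichever is nearer.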

(c) Candidates produced some good answers for this part of the question. Most candidates wrote
about the sort and some candidates explained that extra columns would need to be added and a
few candidates explained how the data was displayed.

(d) This was quite a challenging question and many answers lacked detail.

Question 9

This question was fairly well answered with candidates gaining some marks, but as with some other
questions on this paper many candidates failed to add enough details to their answers.

(a) This question was very challenging with many answers not giving enough detail. For example,
some candidates stated that emails were received more quickly but did not give a valid reason or
advantage for this. Candidates often gave the advantage without making a comparison.

(b) Some candidates did not understand the question which was about stopping infected emails
placing viruses in the computer. Many candidates answered in terms of how to prevent viruses
entering the machine, for example: checking where the email came from, rather than
understanding that the email was infected.

(c) Most candidates gained at least two marks for this part. Again, a lack of detail in the answer was
an issue, for example “it makes the computer slower” without explaining what was slower.

Question 10

This question was very challenging for candidates with many struggling to understand the topic.

(a) This was not answered well as many candidates thought that it referred to computers making jobs
easier, and so quicker.

(b) This part was answered in the same way as part (a). Many candidates understood that workers
could start and finish at different times but thought that the worker decided on the time rather than
in negotiation with the manager. Most marks were awarded for the example.

Question 11

As with other questions candidates understood the topic but did not give detailed answers to the questions.

(a) Candidates understood that the skeleton had to be scanned but then did not seem to understand
that the data was sent rather than the physical copy of the skeleton.

(b) Many candidates answered this part better than part (a) but others were vague in their answers.
Some candidates reworded their answer to part (a) and then gave this as the answer to this part.

Question 12

Some candidates wrote about the issues with making the taxi drivers unemployed rather than the concept of
driverless taxis. There were issues with the key word evaluate which means to give advantages and
disadvantages. Some candidates wrote about reducing accidents and computer hardware malfunctioning. As
with other questions there was a lack of detail in the answers given.

Question 13

Most candidates were able to gain at least three marks for this question. There were some good answers
given as many candidates answered from experience. The most common answers were don’t write in
capitals, be polite or do not use rude words. The reasons were less successful; in many cases candidates
knew what they should do but not why.

INFORMATION AND COMMUNICATION TECHNOLOGY

Paper 0417/02
Practical Test A

Key messages

The requirements for doing well in this paper were the ability to:

• distinguish between the typeface categories of serif and sans-serif font types
• distinguish between a page header/footer area and a report header/footer area and understand
which is appropriate to use
• format data to match the layout of an image on the question paper
• identify appropriate file names for exported data
• insert fields into a master document whilst maintaining the existing spacing and punctuation
• create presenter notes in a presentation
• produce legible screenshots to capture the required evidence
• use proofing techniques to identify data entry errors, to ensure all data is displayed in full and to
ensure consistency of presentation
• produce screenshots to show the outcome of an action rather than the skill in process
• follow the instructions on the question paper to include identification details on work before printing.

General comments

Many candidates demonstrated a high level of practical ICT skills and appeared well prepared for this
examination. On this paper the word processed document and mail merge task were particularly well done.
Formatting data to match the layout of an image on the question paper proved to be challenging for some
candidates.

Text to be entered by the candidate as part of a task is displayed in bold on the examination paper. The
accuracy of this data is assessed so it must be entered exactly as shown, including punctuation and
capitalisation. There were a number of typographical errors in data entry and many of these inaccuracies
could have been avoided with more careful checking and proofreading. Good proofing techniques are
important and candidates are advised to take time to carefully check the content, layout and presentation of
their work. Common errors included incorrect capitalisation, incorrect or missing characters, omission of
spaces, additional punctuation and truncation of headings and data in the database reports.

A common issue continues to be the ability to distinguish between the typeface categories of serif and sans-
serif font types. These are categories of font type with specific attributes and not the actual names of font
styles so will not appear in an installed font list. Candidates must be able to select an appropriate font style
for the font type specified; entering ‘serif’ or ‘sans-serif’ in a font dialogue box is not the correct response.
Some candidates apply the capitalised font style Algerian to all styles and headings entered. Selecting a font
style with extra enhancement such as Algerian will be penalised unless that enhancement has been
specifically requested on the question paper.

Candidates are required to produce screenshots to evidence the ICT skills that cannot be assessed through
the printed product alone. In this session, some candidates printed work that was too small to read even
using magnification devices. Candidates must ensure that the printed screenshots are clear and large
enough to be easily read with the naked eye. Marks cannot be awarded if the evidence is too small to read.
Similarly, some candidates did not achieve marks as a result of presenting screenshots with important
elements cropped out.

A number of candidates did not print all of the required tasks, even though they had indicated on the
question paper that they had completed them. Candidates should be encouraged to print evidence as it is
completed rather than waiting until the end of the examination. It is essential that candidates print their
Evidence Document towards the end of the examination time, regardless of whether they have finished all
the questions, as this document contains supporting evidence which can substantially improve a candidate’s
mark.

A small number of candidates did not print their name, centre number and candidate number on some of the
documents submitted for assessment despite this being part of the task. Without clear printed evidence of
the author of the work, marks cannot be awarded for these pages. It is not acceptable for candidates to
annotate their printouts by hand with their name as this is not evidence that they are the originators of the
work. Candidates should submit all printouts and cross through any draft versions which are not to be
marked. If multiple printouts are submitted without draft versions crossed through, only the first occurrence of
each page will be marked.

Centres should not staple work and care should be taken if hole-punching work as punch holes often result
in missing letters in report titles which cannot then be assessed for accuracy. Each candidate’s work must be
submitted inside the original hard-copy Assessment Record Folder (ARF) that has been provided to the
centre by Cambridge Assessment. Photocopies should not be used. Please make sure a Supervisor’s
Report Folder is also completed and included with the candidates’ work as this shows the software that has
been used and any issues that were experienced during the practical assessment.

Comments on specific questions

Task 1 – The Evidence Document

An evidence document was created and used by most candidates to store screenshot evidence.
Occasionally the screenshots were too small or faint to be read, even with the use of magnification devices.
Candidates MUST ensure that all text can be easily read with the naked eye. A small number of candidates
did not present the evidence document for marking, and others omitted identification details so marks could
not be awarded for these pages.

Task 2 – Document

Question 1

All candidates opened the correct file and most saved it with the specified file name. Occasionally it was
saved in the original RTF format rather than the format of the word processing software being used and a
few candidates did not enter the filename in capitals as shown on the question paper. Screenshot evidence
of the save was often inconclusive, showing the save in process rather than capturing the outcome of the file
saved in the work area. A screenshot of the folder contents after saving provides the evidence required. The
majority of candidates retained the page setup settings as instructed.

Question 2

Headers and footers were usually inserted and aligned as instructed. A few candidates incorrectly spaced
their identification details in the header so their name was left aligned, the centre number centre aligned and
the candidate number right aligned. An automated field was not always used for the page numbers with the
keyed number 1 appearing on every page of the document. Occasionally a page number field was displayed
rather than the actual page number and some header/footer items did not align with the page margins on all
pages. Candidates who used the built in content control did not always remove superfluous text or
placeholders in the header and/or footer areas.

Question 3

The title text was usually entered accurately at the start of the document. A few candidates incorrectly
inserted this text below the subtitle text which was part of the recall document. Occasional errors included
‘Tamara’ or ‘Tarawa’ instead of ‘Tawara’ and ‘MBT’ instead of ‘MTB’. Capitalisation of the title text was not
always as shown on the question paper.

Question 4

The creation and storage of styles to meet the House style specification was done well by most candidates.
Errors included capitalisation or typographical errors in the style names, serif or sans-serif font styles set
incorrectly and styles containing additional formatting not listed in the House style specification. Additional
formatting is often caused by a new style being based on an existing style which then inherits the formatting
attributes of the existing style. To avoid this it is recommended that each new style created is based on the
‘default’ or ‘normal’ paragraph style. Several candidates incorrectly keyed ‘serif’ or ‘sans-serif’ into the font
dialogue box which, as that font name does not exist, displayed the default font style. A named font style with
attributes of the specified typeface category must be selected and applied. A capitalised font such as
Algerian should not be used unless an all capital letters enhancement was listed on the House style
specification. Screenshot evidence of the RW-Subhead style provided details of the settings created for this
style and the formatting of all subheadings in the document needed to match these settings. The evidence of
spacing before and after each paragraph was often presented as a separate screenshot which was
inconclusive as it did not show the name of the style this was applied to. A number of candidates applied
formatting directly to the text without providing evidence of creating and saving styles, and a few reproduced
the House style specification table by typing this into their document exactly as it appeared on the question
paper.

Question 5

The list of styles from the style manager/organiser provided evidence that the styles had been created,
named and saved. It was not necessary to show all the attributes set for every style. Any screenshot that
showed a list of these style names was acceptable although the style ribbon toolbar often truncated the style
names or did not show all the styles. The subsequent style application marks in the document were only
awarded if this style list evidenced that the style had been created and saved.

Question 6

The RW-Body style had already been created, stored and applied to the body text in the recall document.
Candidates were required to modify the body style settings which would automatically update the formatting
of the body text in the document without further user intervention. Screenshot evidence needed to show that
the original RW-Body style had been modified. Rather than modifying the existing style, some candidates
created a completely new style or applied the formatting to some text and tried to create a new style based
on this formatting. Occasionally the screenshot evidence was too small or had been cropped, removing
evidence of modification. When the existing style was modified the italic enhancement was not always
removed.

Question 7

The RW-Title style was usually applied correctly to the title text. Occasionally there was extra space between
the title and subtitle text. Application of the RW-Title style was only awarded if the formatting met the House
style specification and the style was listed in the Question 5 style list.

Question 8

Most candidates added their name to the subtitle text and applied the RW-Subtitle style to this line of text.
Some changed the recall text ‘by’ to ‘By’ or an extra space was inserted before the colon. Occasionally the
style was only applied to the recall text and not the full line, or there was additional space after this line.
Application of the RW-Subtitle style was only awarded if the formatting met the House style specification and
the style was listed in the Question 5 style list.

Question 9

Most candidates changed the correct text to two equally spaced columns with the correct spacing between
the columns. Some candidates inserted the initial column break below rather than above the subheading and
occasionally a page break was inserted instead of a section break. Some candidates incorrectly displayed
the entire document in two columns and occasionally the last few words were not in the column with the
column break positioned in the middle of the last sentence. There were a few instances where the column
formatting had been applied in separate sections resulting in the loss of paragraph order and flow.

Question 10

The application of bullets to the specified text was done well. Any consistent bullet style was accepted. The
size of the bulleted text was not assessed. The list was not always presented in single line spacing with a 9
point space after the last item in the list. The bullet indent was often not set accurately with some candidates
using the default measurement or indenting the text rather than the bullet 2.5 centimetres from the left
margin.

Question 11

Application of the RW-Subhead style to the six subheadings in the document was usually done well. The
mark was awarded if there was evidence that the RW-Subhead style had been created and saved with the
formatting of all six subheadings matching the formatting seen in the screenshot evidence for Question 4.
Occasionally the style applied did not match the formatting set, usually with underline and/or italics applied to
the subheadings in the document but not seen in the evidence of the style. A few candidates applied the
style to only five subheadings instead of six.

Question 12

The RW-Table style was supplied in the recall document and candidates were required to apply this to the
table in the document. The style formatting was not always maintained within the table with most issues seen
in the fourth column where the text was not fully justified and/or there was additional space above and/or
below the text.

Question 13

Most candidates deleted the row and data as instructed. A few candidates deleted more than one row and
occasionally the row content was deleted leaving an empty row in the table.

Question 14

The table in the recall document was to be formatted to match the image on the question paper. This proved
challenging for some candidates with very few achieving full marks for this question. The formatting of the
first row of the table was mostly done well with the cells merged and the title centred over the four columns.
Most candidates presented the data in columns 1, 2 and 3 on one line but wrapped the column headings in
row 2. Centre aligning the data horizontally in columns 2 and 3 was not done well with candidates failing to
centre the associated column headings in row 2. A few candidates also incorrectly included the data in
column 1 in the centre alignment. Candidates who attempted the vertical merge in columns 3 and 4, rows 3
and 4, usually did so successfully but many candidates did not attempt this. Very few candidates managed to
vertically centre the data as shown on the question paper.

Question 15

Most candidates correctly formatted the title in the first row of the table. A few candidates used a serif rather
than a sans-serif font type and some inserted extra space below this heading. Italic enhancement and/or a
14 point font size was not always applied. A few candidates applied the black background to the text instead
of the row. Occasionally this formatting was applied to row 2 of the table, not row 1.

Question 16

The table borders were often not within the column width. Most candidates correctly displayed internal and
external gridlines with no split words seen on wrapping in column 4. Many candidates did not set 9pt spacing
below the table.

Question 17

In most cases the document was well presented, particularly where the styles had been created and applied
correctly. This question paper required no integration so spacing between items was generally consistent
and the table and bulleted list were rarely split. A few candidates left large gaps between paragraphs for no
apparent reason and the columns were not always aligned at the top of the page. Occasionally there was a
widow or orphan, most commonly where a subheading had been left at the bottom of a column.

Task 3 – Database

Questions 18 to 20

Not all candidates attempted the database task but the importing of the csv files, creation of primary keys
and relationships between the tables were well done by those that did. The field names and data types were
mostly set correctly. The most common error related to the requirement to format the numeric field KM_Hour
to 2 decimal places. This was assessed in report 1 and a significant number of candidates rounded the data
in this field, displaying the values to 0 or 1 decimal place.

Question 21

Most candidates created relationships between the tables but the screenshot evidence did not always
confirm that one-to-many relationships had been created. The screenshot was often captured during the
process of creating the relationship rather than showing the outcome. A screenshot of the relationship
dialogue box would have evidenced the relationship type. The relationship diagram can only be credited if it
shows the ‘1’ and infinity (many) symbols confirming the relationship type. Some candidates showed
these symbols but the field list was not expanded so it was not possible
to assess which fields were linked.

Question 22

Most candidates entered the new record accurately. However, data entry errors occasionally appeared in
‘Susan’ and ‘Flater’ and it was not possible to assess the accurate entry of 4.05 if the KM_Hour field was not
stored and displayed to 2 decimal places. Some candidates incorrectly overwrote the first record in the
database (Marg Padgham) instead of entering this data as a new record.

Question 23

The first tabular report used fields from all tables and was generally done well by candidates who attempted
this question. The report title occasionally contained data entry or capitalisation errors, or displayed
additional text such as ‘Query 1’ in the title area. A large font size was usually used but the title text was not
always black and many used a sans-serif instead of a serif font style as instructed. Occasionally the title was
truncated as the text box had not been adjusted to accommodate the larger font. The new field heading was
usually entered accurately with only a few candidates omitting the underscore, or making data entry or
capitalisation errors. Those that created the new calculated field usually used the correct calculation but did
not format this to display as hh:mm:ss. Occasionally this field was displayed as h:mm:ss (5:30:30) or 5:30:30
AM but more commonly as a decimal value. Where the required display is not one of the pre-set formatting
options a custom format (hh:mm:ss) can be entered in the format properties box. The search was based on
three criteria and this was generally done well. Most candidates managed to find the club names containing
MTB. The Age search was not always accurate with some candidates searching for > 20 instead of >= 20
and a few candidates confusing the greater than (>) and less than (<) operators, searching for < 20 or <= 20.
Most candidates displayed the correct fields in the report although these were not always in the correct order
and many had truncated data in one or more fields. The two-field sort was generally completed well although a
few candidates sorted on Club_Name only. Most candidates presented the report in landscape orientation.
Identification details were often entered in the report footer so they printed at the end of the report rather
than in the page footer, printing at the bottom of every page as instructed. A few candidates omitted their
identification details from all pages of the report which could not then be assessed. A number of candidates
did not remove the automated date and/or page numbers so only the identification details were displayed.

Question 24

The second report was well done by most candidates who attempted it. Most candidates completed the
search successfully although a small number incorrectly searched on the Club_Position instead of the
Position field. The fields were usually displayed in the correct order with data fully visible. The one field sort
was well done with only a few candidates sorting in ascending rather than descending order of County. The
report title ‘Winning Club Members’ was occasionally entered as ‘Wining Club Members’ and occasionally
the title was truncated vertically as the text box had not been adjusted to allow for the font size of the title.
Most candidates presented the report in portrait orientation but few candidates fitted the report on a single
page as this required some manipulation such as adjusting the row height or making the font sizes smaller.
The average calculation was usually done well and positioned under the Distance_KM column although this
was often not displayed as an integer value. The average label was usually positioned correctly but
occasionally contained capitalisation errors and/or superfluous punctuation. Not all candidates produced
screenshot evidence of the database formula used.

Question 25

Most candidates provided evidence of exporting the correct report in portable document format (pdf). The
most common error was not saving this with an appropriate file name. The title of the report was an
appropriate file name but names such as Question 25, Query 2 or Report 2 were frequently used. A few
candidates exported this report in the wrong file type or the file type was not evident. Screenshot evidence
often showed the export process rather than the outcome of the file saved in the work area.

Task 4 – Mail merge

Question 26

The mail merge task was done well with many candidates producing error-free work. Most candidates
correctly replaced the text and chevrons in the master document with the correct fields from the data source
file. The most common errors were not removing all the chevrons, deleting punctuation and spaces between
the fields, and removing existing line spaces as the merge fields are inserted. A few candidates replaced the
text Date as Postmark with a date field which was not assessed.

Question 27

Most candidates replaced the required text with their name and inserted their identification details in the
footer of the letter. A small number did not replace the text Candidate Name and some incorrectly placed
their identification details in the header instead of the footer.

Question 28

The merge selection was based on one search criterion and was completed well. Screenshot evidence of a
tick box selection method did not provide evidence that an automated filter had been used. Most candidates
set the correct criteria on the Rank field of ‘less than or equal’ although a few incorrectly set ‘less than’.
Some candidates made use of the OR options with ‘equals to 3’ OR ‘equals to 2’ OR ‘equals to 1’. Any
criteria set in the automated filter which produced the correct result was accepted.

Question 29

Most candidates merged and printed the letters as specified. A few candidates printed additional letters
which did not match the search criteria. A small number of candidates provided no evidence of the master
letter and it was therefore not possible to assess whether mail merge had been used to complete this task.
The merge result must match the master document for this merge mark to be credited. On occasions the
merge result did not match the layout, spacing and formatting of the master document so, for example, the
fields were inserted in the master letter without spaces but in the resulting merged letters the fields had
spaces so did not match the master letter. If candidates identify an error after merging they should correct
the master document and complete the merge again.

Task 5 – Printing the Evidence Document

Almost all candidates printed their evidence document.

Task 6 – Presentation

Question 30

Most candidates correctly imported the five slides and presented each as a title and bulleted list. Most
candidates entered their identification details, slide numbers and logo in the correct position on the master
slide so they displayed in the same position on all slides in the presentation. Occasionally the master items
were not consistent on all slides, often with slide items in a different position on slide 1. Errors included
showing additional items on the master slide, items that overlapped the data on the slides or appeared in a
different position on any slide. Built-in slide designs could be used but often applied a different layout to slide
1 so candidates needed to ensure the design chosen met all the master slide requirements. Marks were not
awarded where incorrect software had been used such as the RTF file opened, manipulated and printed
using word processing software.

Question 31

The majority of candidates correctly inserted a new slide as slide 1 and applied a title slide layout so the title
was larger than the subtitle and both were centred in the middle of the slide. Occasionally the subtitle
incorrectly displayed a bullet.

Question 32

Most candidates entered the text correctly. There were occasional errors in the data entry of ‘Tawara’ and
‘MTB’ and a few candidates omitted ‘2020’ from the title. The subtitle text was usually accurate with most
candidates entering their name after the text. Occasionally the word ‘by’ was changed to ‘By’ or an additional
space was inserted before the colon.

Question 33

Demoting the bulleted items to look like the image provided on the question paper proved challenging for
many candidates. Most managed to apply a dashed bullet to the correct three lines of text but these lines
were not always indented and few candidates reduced the size of text or applied italic enhancement. The
dashed bullets were often not aligned consistently to the left and had no space between the bullet and text.
Instead of formatting the three lines of imported text some candidates re-typed the text and in doing so
introduced new data entry errors.

Question 34

Few candidates entered the text and printed it as presenter/speaker notes. Some incorrectly added the presenter/speaker text to the slide as a comment, but most simply typed the presenter/speaker notes directly onto slide 3, either as an additional bullet or in a text box, and then printed this as a full page slide. Occasionally the text was entered in the header/footer area. Where the text was entered correctly as presenter notes it often contained data entry or capitalisation errors, most commonly omitting the space on either side of the dash or omitting the full stop.

Question 35

Most candidates printed the presentation with two slides to the page. Either portrait or landscape orientation
was accepted. A few printed the complete presentation as individual slides and some printed with three or six
slides to the page.

INFORMATION AND COMMUNICATION TECHNOLOGY

Paper 0417/03
Practical Test B

Key messages

• Candidates need a better understanding of the syntax of CSS in a stylesheet.
• Candidates need to understand the importance of following the instructions on the question paper.
• Candidates must ensure that they include their candidate details in the correct place on all printouts.
• Candidates need to take greater care with the accuracy of data entry.
• Candidates need a better understanding of conditional formatting within a spreadsheet and how to
evidence this.
• Candidates need to take greater care with the formatting of the spreadsheet, particularly the setting of
row heights and column widths to match both the question and the data or labels contained within the
cells.
• Candidates need a better understanding of html syntax, particularly in the use of head and body tags (a minimal page outline is sketched below).
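
As a reminder, a minimal page outline with an attached stylesheet (placeholder names and content only) has this shape:

    <!DOCTYPE html>
    <html>
      <head>
        <title>Page title</title>
        <link rel="stylesheet" type="text/css" href="stylesheet.css">
      </head>
      <body>
        <!-- tables, images and text go here -->
      </body>
    </html>

A stylesheet rule then follows the pattern selector {property: value;}, for example h1, h2, h3 {color: #000000;}.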

General comments

Some candidates printed work that was too small to read even using magnification devices. Candidates must
ensure that all text can be easily read with the naked eye.

Comments on specific questions

Question 1

Most candidates placed the specified files in the correct folder, although not all candidates displayed the
folder name in their screenshot evidence. Fewer candidates included all the specified file details, and file
dimensions were not always added to the folder specifications before the screenshot was taken.

Question 2

Most candidates created a web page structure with the correct cells in the correct places, but the width specified for the table data cell A was often placed in the table tag instead. A significant number of candidates, either manually or through the WYSIWYG software that they were using, overrode this instruction by setting the table width to a different value, usually 100 per cent. Whilst the structure of the table was frequently correct, table cell dimensions were not always as specified in the question paper. Most candidates removed the cell letters and the cell borders as instructed. Some candidates appeared to ignore the information that the cell widths allowed for border spacing, and removed this spacing with embedded CSS in the html by collapsing the borders that had been set as separated within the stylesheet.
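
A sketch of the expected placement, using a placeholder width (the actual cell dimensions were given on the question paper): the width belongs on the table data cell itself rather than on the table tag, for example:

    <table>
      <tr>
        <td style="width:300px">contents of cell A</td>
        <td>contents of cell B</td>
      </tr>
    </table>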

Question 3

Few candidates set the title for the web page. Many allowed the software’s default setting to be visible. Of
those candidates who attempted to change this to the specified text, a significant number did not place this
within the head section of the page, or the text contained data entry errors.
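
The title needed to be placed in the head section, for example (with placeholder text, as the exact wording was specified on the question paper):

    <head>
      <title>Specified title text</title>
    </head>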

Question 4

Most candidates placed the logo image as specified but a significant number did not resize this image in the
html as specified.

Question 5

This question was answered well by many candidates although a number introduced typographical errors,
especially related to initial capitalisation of words where this was not shown. Most candidates placed this text
in style h2, although some added extra classes in the HTML to override the h2 settings from the stylesheet.
A few candidates transposed the contents of rows 2 and 3.

Question 6

This question was answered well by the majority of candidates, but a small number of candidates placed the
background image instead of the daisy image. A few candidates resized the image but did not maintain its
aspect ratio.

Question 7

Almost all candidates completed this step as instructed. Occasionally the text was incomplete. Some did not
set this in paragraph style. Some candidates included a link to the n20text.txt file and some included the
contents of the n20contact.htm file.

Question 8

Almost all candidates entered the text as instructed, although a few typographical errors were seen and
‘updated’ was occasionally entered as ‘edited’. Most candidates placed their candidate details on a new line.
Most candidates set this in style h3.

Question 9

This question was rarely completed well. The four images in row 2 required alternative text which should
have been suitable for a text reader to enable partially sighted users to ‘hear’ rather than ‘see’ elements on
the page. In this case a description of the image was required; many candidates who attempted this used the
file names as the alt text.
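
A sketch using a hypothetical file name and description; the point is that the alt attribute carries a short description of what the image shows rather than repeating the file name:

    <img src="n20image1.jpg" alt="A single white daisy in close-up">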

Question 10

Few candidates set the default target window in the head section of their web page. Of those who did set it (presumably with the aid of their WYSIWYG package), it was often set to _blank rather than _self.
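
The default target window is set with a base tag in the head section; the wording above suggests _self was expected here, for example:

    <head>
      <base target="_self">
    </head>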

Question 11

Few candidates set hyperlinks on both the image and the text, often attempting one or the other rather than
both. Of those who set a hyperlink, many of the links to the web page contained file paths which would
enable them to work on this candidate’s computer but not on others with a different file/folder structure. Few
candidates set the correct target window for this hyperlink.
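
A sketch, assuming for illustration that the link points to the supplied n20contact.htm page and opens in a new window (the actual destination, link text, image and target window were specified on the question paper); the key points are that both the image and the text carry an anchor and that the reference is relative, with no local drive or folder path:

    <a href="n20contact.htm" target="_blank"><img src="n20logo.jpg" alt="Company logo"></a>
    <a href="n20contact.htm" target="_blank">Contact us</a>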

Question 12

Most candidates attached the stylesheet. A number of candidates included file paths in their attached
stylesheets which would enable them to work on the candidate's own computer but not on others with a different
file/folder structure. Some used classes or attached a different stylesheet below this which would override
the original attachment.
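
A relative reference of the following form (hypothetical stylesheet name) works on any computer, whereas a reference containing a drive letter or local folder path only works on the candidate's own machine:

    <link rel="stylesheet" type="text/css" href="n20style.css">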

Question 13

This question proved challenging for a number of candidates. Some candidates created a new stylesheet
rather than updating the supplied stylesheet. Few candidates placed the background elements in the body
section of their stylesheet. Of those who did, a number did not set the correct background colour as they did
not rearrange the colour elements into their RGB components. Some candidates did not use the # symbol to
denote a hexadecimal number. Candidates often did not use a single grouped selector, thereby increasing the
chances of making mistakes when repeating the contents of styles h1, h2, h3 and p. For the text-based
styles many candidates attempted the heading styles with some elements correct, but a significant number
tried to set the paragraph style as paragraph rather than as style p. Where style p was set, a number of
these candidates specified the height of the text in pixels rather than in points. A large number of candidates
set the second font style to Helvetica rather than Helvetica Neue. More candidates this series than previous
series included their name as a CSS comment, but there were still a number of stylesheets that contained html elements. A significant number of candidates included their details at the bottom of the stylesheet and not at the top as instructed. The stylesheet, the browser view of the page and the html were usually included, but some submissions showed the page in an editor rather than a browser. Some screen prints were cropped so that the address bar was not visible.
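
A sketch of the kind of stylesheet structure expected, using placeholder values (the actual colour, fonts and sizes were defined on the question paper); note the candidate details as a CSS comment at the top, the # prefix on the hexadecimal colour in the body style, the single grouped selector for the shared font list ending in Helvetica Neue rather than Helvetica, and the paragraph size in points rather than pixels:

    /* A. Candidate, ZZ999, 9999 */
    body {background-color: #336699;}
    h1, h2, h3, p {font-family: Arial, "Helvetica Neue", sans-serif;}
    p {font-size: 12pt;}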

Question 14

Many candidates completed this as specified, but a few did not include the spacing as specified or did not
format the data within the header to be right aligned.

Question 15

Most candidates inserted the two new rows at the top of the spreadsheet and merged the cells in row 1. The
data entry in this cell was not always as specified, with a number of typographical errors including case
errors. The formatting of the spreadsheet to match the diagram was generally performed very well. The few
errors or omissions seen included the lack of right alignment in column A, the column widths not resized for F
and K, the relative font sizes (despite being given the required point sizes) and failure to merge the correct cells in row 3.

Question 16

This question was performed well by most candidates, but some placed a formula rather than a function and others placed inefficient formulae such as = SUM(B5 + C5 + D5) where the function element was not used.

Question 17

This question was performed well by most candidates, but some placed a formula rather than a function and others placed inefficient formulae such as = SUM(G5 + H5 + I5) where the function element was not used.

Question 18

This question was performed well by most candidates, but some placed a superfluous function around the calculation, for example = SUM(E5 - J5).
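
The more efficient equivalents of the forms quoted above for Questions 16 to 18 are a range-based function and a plain subtraction:

    =SUM(B5:D5)
    =SUM(G5:I5)
    =E5-J5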

Question 19

The formulae used in this question produced a number of different responses, some correct and others
where the wrong text was assigned to the wrong inequality. There were a number of case errors in the text to
be displayed, despite these being shown in bold on the question paper. The conditional formatting was more
challenging for many candidates; a significant number did not demonstrate any understanding of this and
appeared to manually format each cell depending upon the value in it. Where candidates provided
screenshot evidence of the conditional formatting, the comparison text often contained capitalisation errors,
or the text was truncated. The evidence often did not show the cell to which the rules were applied, and the formatting was often transposed so that the text Loss had white text on a black background instead of black text on a red background.
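
As a purely illustrative shape (the cell reference, the comparison and the text Profit are hypothetical; only Loss is named above), a formula such as the following assigns each text to an inequality, with a nested IF following the same pattern where more than two texts are required:

    =IF(K5<0,"Loss","Profit")

The conditional formatting rule is then set on the same cells, for example a rule of the type 'cell value is equal to Loss' applying black text on a red background.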

Question 20

Many candidates completed this as specified, but a few did not use appropriate column widths (especially column M) so the individual formulae and the associated replication could not be fully or clearly seen.

Question 21

Most candidates completed this as specified.

Question 22

Many candidates completed this as specified, but a few did not widen the columns to show all the formulae
and others did not include the row and column headings.

Question 23

Many candidates completed this as specified, but a few did not adjust the columns to show all the values and
labels.

Question 24

Almost all candidates printed their evidence document which contained all the evidence for the web page as
well as the process of establishing the conditional formatting. Without this document most evidence of
candidate work would be lost.
