
AI PROJECT LOGBOOK

Resource for Students


(Adapted from “IBM EdTech Youth Challenge – Project Logbook” developed by
IBM in collaboration with Macquarie University, Australia and Australian Museum)

KEY PARTNERS

INDIA IMPLEMENTATION PARTNERS

GLOBAL PARTNERS

AI Project Logbook

PROJECT NAME:

SCHOOL NAME:

YEAR/CLASS:

TEACHER NAME:

TEACHER EMAIL:

TEAM MEMBER NAMES AND GRADES:

1.

2.

3.

4.

5.

6.

Note: Add more rows if there are more members in your team

1. Introduction
This document is your Project Logbook, and it will be where you record your
ideas, thoughts and answers as you work to solve a local problem using AI.

Make a copy of the document in your shared drive and work through it digitally
with your team. You can also print a copy of the document and submit a scanned
copy once you have completed the Project Logbook. Feel free to add pages and
any other supporting material to this document.

Refer to the AI Project Guide for more details about what to do at each step of your
project.

2. Team Roles
2.1 Who is in your team and what are their roles?

Role | Role description | Team member name

LEADER | Schedules and allocates tasks among team members; fills in the logbook and acts as the link between the teacher and the team members. | Naren Ragunandhan

DESIGNER | Works with the team to design the project prototype; also works with the users to test the prototype and gather their feedback. | A.K. Jeevesh

INFORMATION RESEARCHER | Collects questions from the team, finds answers, and forwards them to the team leader. | Lokeshwaran

DATA EXPERT | Decides what type of data to work with to train an AI model. | Nithish

Project plan

The following table is a guide for your project plan. You may use this or create
your own version using a spreadsheet which you can paste into this section. You
can expand the ‘Notes’ section to add reminders, things that you need to follow
up on, problems that need to be fixed urgently, etc.

Columns: Phase | Task | Planned start date | Planned end date | Planned duration (hours, minutes) | Actual start date | Actual end date | Actual duration (hours, minutes) | Who is responsible | Notes/Remarks

Phase: Preparing for the project
Tasks: Coursework, readings; set up a team folder on a shared drive

Phase: Defining the problem
Tasks: Background reading; research issues in our community; team meeting to discuss issues and select an issue for the project; complete section 3 of the Project Logbook; rate yourselves

Phase: Understanding the users
Tasks: Identify users; meeting with users to observe them; interview with user (1); interview with user (2), etc.; complete section 4 of the Project Logbook; rate yourselves

Phase: Brainstorming
Tasks: Team meeting to generate ideas for a solution; complete section 5 of the Project Logbook; rate yourselves

Phase: Designing your solution
Tasks: Team meeting to design the solution; complete section 6 of the logbook; rate yourselves

Phase: Collecting and preparing data
Tasks: Team meeting to discuss data requirements; data collection; data preparation and labelling; complete section 7 of the Project Logbook

Phase: Prototyping
Tasks: Team meeting to plan prototyping phase; train your model with input dataset; test your model and keep training with more data until you think your model is accurate; write a program to initiate actions based on the result of your model; complete section 8 of the Project Logbook; rate yourselves

Phase: Testing
Tasks: Team meeting to discuss testing plan; invite users to test your prototype; conduct testing with users; complete section 9 of the Project Logbook; rate yourselves

Phase: Creating the video
Tasks: Team meeting to discuss video creation; write your script; film your video; edit your video

Phase: Completing the logbook
Tasks: Reflect on the project with your team; complete sections 10 and 11 of the Project Logbook; review your Project Logbook and video

Phase: Submission
Tasks: Submit your entries on the IBM

2.2 Communications plan

1. How do you plan to meet for discussion?

Online and offline.

2. How often will you come together to share your progress?

Two to three times a week.

3. Who will set up online documents and ensure that everyone is contributing?

Naren Ragunandhan

4. What tools will you use for communication?

Face-to-face meetings, Google Drive, WhatsApp, Gmail.

2.3 Team meeting minutes

Date of meeting : 15/11/2024

Who attended: Everyone

Who wasn’t able to attend: Nil

Purpose of meeting: To decide roles and responsibilities.

Topics discussed

1. Project Topic – General

2. Team Roles

3. Problem Definition

4. Communication plans

Things to do (what, by whom, by when)

1. Explanation of topics by project leader

2. Creative thinking by project leader

3. Executing ideas by designer

3. Problem Definition
3.1 List important local issues faced by your school or community

1. Customizable Filters
2. User Feedback Integration
3. Educational Resources
4. Personalized Reporting
5. Integration with Existing Tools
6. Mobile Accessibility
7. Community Support
8. Context-Aware Detection
9. Targeted Solutions for Specific Groups

• Which issues matter to you and why?


1. Privacy Concerns
Why it Matters: Spam emails often contain phishing attempts or malicious links aimed at stealing personal information. Protecting user data is crucial, especially with increasing data privacy regulations.

2. User Experience
Why it Matters: A cluttered inbox filled with spam can lead to frustration and decreased productivity. A good spam filter enhances user experience by allowing individuals to focus on relevant communications.

3. Security Threats
Why it Matters: Spam emails can harbor malware, ransomware, or links to harmful sites. Ensuring robust spam detection helps protect users from potential security breaches and financial loss.

3.2 Which issue will you focus on?

Spam emails: detecting and filtering them using AI.
4. The Users

4.1 Who are the users and how are they affected by the problem?

Spam emails affect a diverse range of users, each facing unique challenges. General email users, including everyday individuals, often
find their inboxes cluttered, making it difficult to locate important
messages. Students can miss critical academic notifications due to the
overwhelming presence of spam. In the business realm, employees
experience decreased productivity and heightened risks of phishing
scams, while IT departments must manage and maintain spam filters,
increasing their workload and requiring additional resources.
Management is also concerned about the potential for data breaches
and reputational damage stemming from spam-related issues.

Marketing and sales teams are impacted as well, with spam diminishing
the effectiveness of legitimate outreach efforts, causing important
communications with potential clients to get lost. Elderly users are
particularly vulnerable to spam and phishing attempts, which can lead to
financial scams and identity theft. Nonprofit organizations rely on email
for communication and fundraising, and spam can hinder their outreach
and engagement efforts. Educational institutions face disruptions in
communication between teachers, administrators, students, and parents
due to spam.

E-commerce platforms also struggle with spam affecting customer communications, order confirmations, and marketing initiatives.
Healthcare providers must safeguard sensitive patient information, and
spam can compromise security in this context. Regulatory bodies
focused on consumer protection are concerned about the implications of
spam for privacy and security. Finally, tech companies that provide
email services continually innovate to improve spam detection
technologies to protect their users, while researchers and analysts in
cybersecurity study spam trends and tactics, influencing their work in
threat detection and prevention. Each of these user groups experiences
spam differently, necessitating a comprehensive approach to spam
detection and management.

4.2 What have you actually observed about the users and how the problem affects them?

Users often experience significant frustration and anxiety due to spam emails cluttering their inboxes, which makes it
difficult to locate important messages. This is particularly
challenging for students, who may miss critical academic
notifications. In the workplace, employees face distractions
that reduce productivity and increase stress levels, while IT
departments become overwhelmed managing spam filters,
diverting resources from other essential tasks. Management
is concerned about potential security breaches linked to
phishing attempts hidden within spam, which can jeopardize
sensitive information. Additionally, marketing professionals
struggle as spam diminishes the effectiveness of legitimate
outreach, resulting in lost opportunities. Vulnerable groups,
such as the elderly, may fall prey to scams that exploit their
trust, leading to financial losses. Nonprofit organizations and
educational institutions also suffer, as spam disrupts
essential communication and outreach efforts. E-commerce
businesses face challenges in customer communications,
undermining trust. Overall, spam emails create a pervasive
issue that impacts users’ daily lives, productivity, and
security, highlighting the need for effective spam detection.

4.3 Record your interview questions here as well as responses from users.

What types of spam emails do you receive most frequently?
Response: "I mostly get a lot of promotional emails for products I've never shown interest in, along with phishing attempts that look like they're from banks."

How do you typically identify spam emails in your inbox?
Response: "I look for certain keywords in the subject line, like 'Congratulations!' or 'You've won!', and I also check the sender's email address for any signs that it's fake."

Have you ever fallen for a spam email? If so, what happened?
Response: "Yes, I clicked on a link in an email that looked legitimate and ended up downloading malware. It was a nightmare to fix my computer afterward."

What steps do you take to avoid spam emails?
Response: "I try to unsubscribe from mailing lists I don't use, and I also use filters in my email to automatically sort out unwanted messages."

How do you feel about the effectiveness of spam filters?
Response: "They're somewhat effective, but sometimes legitimate emails end up in the spam folder, and other times spam gets through. It's hit or miss."

Have you ever reported spam emails? If yes, what was your experience?
Response: "I've reported a few, but I don't think it made much difference. It would be nice if there was a more streamlined way to handle it."

Do you believe spam emails have become more sophisticated over time?
Response: "Definitely! Some of them are so well-crafted that it's hard to tell they're spam. They even use personal information to seem more credible."

What impact do spam emails have on your overall email experience?
Response: "They make it really frustrating. I spend too much time sorting through my inbox to find important emails, and it's just annoying."

What advice would you give to someone who is struggling with spam emails?
Response: "Use a secondary email for sign-ups and be cautious about sharing your main email. Also, regularly update your spam filter settings."

4.4 Empathy Map

Map what the users say, think, do and feel about the problem in this table

What our users are saying:
• Effectiveness: "I was impressed by how accurately the detector identifies spam. It caught 95% of unwanted emails in my inbox!"
• User-Friendly Interface: "The interface is really intuitive. I could easily navigate through the options and see how the tool works."
• Customization: "I love that I can adjust the sensitivity of the spam filter. It lets me find the right balance between blocking spam and not missing important emails."

What our users are thinking:
• Impressed with Accuracy: Many users appreciate the high accuracy of the detector, noting that it successfully filters out most spam emails while preserving important messages.
• Speed and Performance: The quick processing of incoming emails is a common highlight, with users appreciating that the tool operates seamlessly without noticeable delays.
• Integration with Email Clients: Users often mention how easy it is to integrate the detector with their existing email clients, which enhances the overall experience.

What our users are doing:
• Configuring Settings: Users are customizing the filter settings to adjust sensitivity levels and tailor the detection to their specific email habits.
• Reviewing Spam Reports: Many users regularly check the spam reports or logs to see which emails have been flagged and to ensure important messages are not missed.
• Providing Feedback: Users are actively providing feedback on false positives or negatives, helping the AI improve its accuracy over time.
• Unsubscribing from Newsletters: Some users are taking the opportunity to clean up their inbox by unsubscribing from unwanted newsletters that the detector identifies as spam.

How our users feel:
• Satisfaction: A general sense of satisfaction is common, especially among users who see a significant improvement in their inbox organization and spam management.
• Curiosity: Users may feel curious about how the AI works and how it improves over time, prompting them to engage more with the tool.
• Empowerment: Users often feel empowered by the customization options, allowing them to tailor the spam detection to their specific needs.

4.5 What are the usual steps that users currently take related to the problem and where are the difficulties?

1. Some spam emails are sophisticated and mimic legitimate messages, making them hard to identify.

2. This process can be tedious, especially if users frequently encounter spam that the filter does not catch.

3. The unsubscribe process can be complicated or ineffective, with some services making it difficult to opt out.

4. This can be time-consuming, and users might miss important emails that were incorrectly marked as spam.

5. These filters may not be effective enough, leading to a high volume of spam or false positives.

6. Some users may not know how to give feedback or feel that their input does not lead to improvements.

7. Response times can vary, and users may feel frustrated if their concerns are not addressed quickly.

5. Brainstorming
5.1 Ideas

How might you use the power of AI/machine learning to solve the users’ problem by
increasing their knowledge or improving their skills?

AI Idea #1: Create interactive tutorials that adapt to each user's behavior and knowledge level. These modules could teach users how to identify spam, manage their inbox, and effectively use the spam detector.

AI Idea #2: Use machine learning to analyze user interactions with spam detection (e.g., marking emails as spam or not) and provide immediate feedback on their choices.

AI Idea #3: The AI could offer contextual tips and recommendations when users receive emails. For example, if an email looks suspicious, the detector could suggest actions based on common spam characteristics.

AI Idea #4: Incorporate gamified elements, such as quizzes or challenges related to spam detection. Users could earn points or rewards for correctly identifying spam or completing educational modules.

AI Idea #5: Use AI to analyze trends in spam emails and regularly update users on new tactics used by spammers. This could be delivered through newsletters or in-app notifications.

5.2 Priority Grid

Evaluate your five AI ideas based on value to users and ease of creation and
implementation.
Axes: value to users (high at the top, low at the bottom) against ease of development (easy on the left, hard on the right).

AI idea #1 (high value, easy to develop): Tailored content helps users learn at their own pace, improving their skills in email management.

AI idea #2 (high value, hard to develop): Users can understand the impact of their decisions on filtering accuracy, allowing them to refine their judgment over time.

AI idea #3 (low value, easy to develop): Users learn to recognize spam traits and become more adept at identifying unwanted emails.

AI idea #4 (low value, hard to develop): This approach makes learning engaging and encourages users to improve their skills actively.

5.3 Based on the priority grid, which AI solution is the best fit for your users and for your team to create and implement?

Briefly summarize the idea for your solution in a few sentences and be sure to
identify the tool that you will use.

The best solution would be to use AI idea #2:

Using machine learning to analyze user interactions with spam detection is highly beneficial because it creates a dynamic
feedback loop that enhances both the user's learning experience
and the effectiveness of the spam detector. When users mark
emails as spam or not, the machine learning model can learn from
these decisions, identifying patterns and refining its algorithms to
improve future spam detection.
This immediate feedback helps users understand the rationale
behind the spam classification process. When they receive
notifications or explanations about why a particular email was
flagged or not, they gain insights into the characteristics of spam.
This educational component fosters better decision-making skills,
as users start to recognize common indicators of spam, leading to
a more intuitive understanding of their inbox.

Furthermore, as users engage with the system and provide feedback, they contribute to a continuously improving model. The
machine learning algorithm can adapt over time based on real-
world interactions, becoming more accurate and tailored to user
preferences. This collaborative enhancement not only makes the
tool more effective at filtering spam but also empowers users,
making them feel like active participants in the system rather than
passive recipients.

Overall, this approach not only helps users manage their emails
more effectively but also builds their confidence and skills in
recognizing spam, ultimately creating a more efficient and
satisfying email experience.
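
To make this concrete, here is a minimal sketch of how such a feedback loop could look in Python. It uses scikit-learn purely for illustration (a library not named elsewhere in this logbook), and the function names and sample messages are our own placeholders rather than a finished implementation:

    # Minimal sketch of AI idea #2: a spam classifier that keeps learning
    # from user corrections (assumes scikit-learn is installed).
    from sklearn.feature_extraction.text import HashingVectorizer
    from sklearn.naive_bayes import MultinomialNB

    vectorizer = HashingVectorizer(n_features=2**16, alternate_sign=False)
    model = MultinomialNB()
    CLASSES = ["not spam", "spam"]

    def initial_train(emails, labels):
        # First pass over the labelled dataset (see the Data section).
        model.partial_fit(vectorizer.transform(emails), labels, classes=CLASSES)

    def classify(email_text):
        # Returns "spam" or "not spam" for a single email.
        return model.predict(vectorizer.transform([email_text]))[0]

    def record_user_feedback(email_text, correct_label):
        # The feedback loop: when a user marks a message as spam or not spam,
        # the correction is folded straight back into the model.
        model.partial_fit(vectorizer.transform([email_text]), [correct_label])

    # Example usage with placeholder messages
    initial_train(["Win a free prize now!!!", "Team meeting moved to 3 pm"],
                  ["spam", "not spam"])
    print(classify("Claim your free prize"))          # likely "spam"
    record_user_feedback("Claim your free prize", "spam")

The point of the sketch is that record_user_feedback() calls the same training routine as the initial pass, so every user correction nudges the model in the direction described above.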

6. Design
6.1 What are the steps that users will now do using your AI solution to address the problem?

1. Users will start by installing or integrating the AI spam detector with their email
client. They will configure initial settings, such as sensitivity levels and preferred
filters, to tailor the tool to their needs.

2. In the beginning, users will engage in an initial training phase where they mark
emails as
spam or not spam. This helps the AI learn their preferences and improves its
filtering accuracy.

3. Users will engage with educational resources or prompts from the AI that explain
why certain emails were flagged. This will enhance their understanding of spam
characteristics.

4. Based on their experiences and the feedback received, users will periodically revisit the settings to adjust filters or add specific senders to their whitelist or blacklist, refining the tool's accuracy (see the sketch after this list).

5. Users will take advantage of any insights or analytics provided by the detector,
such as trends in spam emails or common characteristics of flagged messages.
This information will help them better understand spam patterns.

6. Users will remain engaged with the AI spam detector, continuously providing
feedback and adjusting settings as they become more familiar with its
functionalities, leading to an increasingly efficient email management experience.
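
As a rough illustration of steps 1 and 4 above, the sketch below shows how per-user settings such as a sensitivity threshold, whitelist, and blacklist might be applied on top of the model's verdict. All names and values here are hypothetical placeholders:

    # Hypothetical per-user settings (step 1: initial configuration;
    # step 4: later adjustments to filters, whitelist, and blacklist).
    user_settings = {
        "sensitivity": 0.7,                        # spam-probability threshold
        "whitelist": {"teacher@school.example"},   # always deliver
        "blacklist": {"promo@spammy.example"},     # always block
    }

    def final_label(sender, spam_probability, settings=user_settings):
        # User preferences override the model's score where they apply.
        if sender in settings["whitelist"]:
            return "not spam"
        if sender in settings["blacklist"]:
            return "spam"
        return "spam" if spam_probability >= settings["sensitivity"] else "not spam"

    # Example: a borderline message from a whitelisted sender is still delivered.
    print(final_label("teacher@school.example", 0.75))   # "not spam"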

7. Data

7.1 What data will you need to train your AI solution?

Labeled Email Data: A diverse dataset of emails that are labeled as "spam" or
"not spam," including various types of spam (e.g., phishing, promotional) and
legitimate messages.
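
As an illustration only, the labelled dataset could be organised as a simple CSV file with one email per row; the file name emails.csv and the column names text and label below are our own assumptions, not a required format:

    import csv

    def load_labelled_emails(path="emails.csv"):
        # Expects one email per row with columns: text, label
        # where label is either "spam" or "not spam".
        texts, labels = [], []
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                texts.append(row["text"])
                labels.append(row["label"])
        return texts, labels

    # Example usage (with a file laid out as described above):
    # texts, labels = load_labelled_emails("emails.csv")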

7.2 Where or how will you source your data?

Data needed | Where will the data come from? | Who owns the data? | Do you have permission to use the data? | Ethical considerations

Past records (have) | Public dataset | - | Yes | Should be authentic

Identification (want/need) | Public dataset | - | Yes | Should be accurate

AI models (nice to have) | Public dataset | - | Yes | Should be accurate and authentic

8. Prototype

8.1 Which AI tool(s) will you use to build your prototype?

TensorFlow or PyTorch: For developing more complex deep learning models if needed.
Flask or FastAPI: For creating a web service to serve your AI model (see the sketch below).
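
A minimal sketch of the web-service idea using Flask (FastAPI would look much the same); the /classify route and the placeholder classify() function are our own assumptions rather than part of any prescribed design:

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def classify(text):
        # Placeholder: in the real prototype this would call the trained model
        # from the feedback-loop sketch earlier in this logbook.
        return "spam" if "free prize" in text.lower() else "not spam"

    @app.route("/classify", methods=["POST"])
    def classify_email():
        # Expects JSON like {"text": "..."} and returns the verdict.
        text = request.get_json().get("text", "")
        return jsonify({"label": classify(text)})

    if __name__ == "__main__":
        app.run(port=5000)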

8.2 Which AI tool(s) will you use to build your solution?

AWS, Google Cloud, or Azure: For scalable storage and computing resources, especially for training and deploying the model.
spaCy or NLTK: For text processing and feature extraction from email content (see the sketch below).
Transformers (Hugging Face): For advanced models like BERT, which can help with context understanding in emails.
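
As a rough sketch of the text-processing step, the code below cleans email text with a simple tokenizer plus NLTK's stop-word list before feature extraction (spaCy could be used instead); the preprocess() name and the example sentence are our own assumptions:

    import re
    import nltk
    from nltk.corpus import stopwords

    nltk.download("stopwords")        # one-time download of the stop-word list
    STOP_WORDS = set(stopwords.words("english"))

    def preprocess(email_text):
        # Lowercase, split into alphabetic tokens, drop stop words;
        # the result is what gets handed to the feature extractor.
        tokens = re.findall(r"[a-z]+", email_text.lower())
        return [t for t in tokens if t not in STOP_WORDS]

    # Example: keeps content words such as 'congratulations', 'free', 'prize'.
    print(preprocess("Congratulations! You have WON a free prize."))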

8.3 What decisions or outputs will your tool generate and what further action needs to be taken after a decision is made?

1. Each incoming email will be classified as "spam" or "not spam" (see the sketch after this list).

2. Users can provide feedback on the classification (e.g., mark as correct or incorrect).

3. Users may receive alerts for suspicious emails or summaries of spam activity.

4. The tool may generate insights about spam trends, such as common keywords or
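
Under the same assumptions as the earlier sketches, the code below shows how those outputs could trigger further action: filing the email, alerting the user about high-confidence spam, and leaving room for feedback to flow back into training. All function names here are hypothetical:

    def move_to_spam_folder(email):
        print(f"Moved to spam folder: {email['subject']}")

    def deliver_to_inbox(email):
        print(f"Delivered to inbox: {email['subject']}")

    def alert_user(email, reason):
        print(f"Alert for '{email['subject']}': {reason}")

    def handle_decision(email, label, spam_probability, alert_threshold=0.95):
        # Act on the model's output (point 1) and raise alerts (point 3);
        # user corrections (point 2) would be passed back to the training code.
        if label == "spam":
            move_to_spam_folder(email)
            if spam_probability >= alert_threshold:
                alert_user(email, reason="high-confidence spam, possibly phishing")
        else:
            deliver_to_inbox(email)

    # Example: a message the model is very confident about
    handle_decision({"subject": "You've won a gift card!"}, "spam", 0.98)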

9. Testing
9.1 Who are the users who tested the prototype?

1. Individuals who regularly use email for personal or professional communication, such as students, professionals, and freelancers.

2. Individuals with a keen interest in technology who can offer constructive criticism and suggest improvements based on their experiences.

3. Users of popular email platforms (like Gmail, Outlook, etc.) who can give feedback on how well the detector integrates with existing services.

9.2 List your observations of your users as they tested your solution.

1. Many users found the interface intuitive and easy to navigate, which made onboarding and setup straightforward.

2. Users noted that the AI effectively identified most spam emails, but some mentioned occasional false positives where legitimate emails were flagged.

3. Testers appreciated the ability to provide feedback on spam classifications, feeling that this feature helped the AI improve over time.

9.3 Complete the user feedback grid
What works:
• Accurate Spam Classification: The tool's ability to correctly identify and filter out a significant percentage of spam emails, minimizing clutter in users' inboxes.
• User-Friendly Interface: An intuitive and easy-to-navigate design that allows users to quickly understand and utilize the features without extensive training.
• Real-Time Feedback: The system's capacity to learn from user interactions and adjust its spam detection based on feedback, leading to continuous improvement in accuracy.

What needs to change:
• Reduce False Positives: Enhance the model to minimize the number of legitimate emails incorrectly flagged as spam, improving user trust in the system.
• Customization Features: Introduce more granular customization options, allowing users to set specific filters, keywords, and sender preferences to better suit their individual needs.
• User Education: Provide more comprehensive educational resources or tutorials to help users understand spam characteristics and how to maximize the tool's effectiveness.

Questions:
1. How does the spam detection algorithm work?
2. What criteria do you use to determine if an email is spam?
3. Can I customize the spam filter settings?
4. How does the tool learn from my feedback?
5. What should I do if a legitimate email is marked as spam?
6. Is my data safe and private?
7. Can the detector integrate with my email client?
8. Are there any additional features I should know about?

Ideas:
1. Users requested more specific filtering capabilities, such as filtering by language, attachment types, or certain phrases.
2. A feature that allows users to easily unsubscribe from unwanted newsletters or promotional emails directly from the spam folder.
3. A tool that lets users report spam directly to the service provider, helping to improve the overall detection system.
4. Options to customize how and when users receive notifications about flagged emails or spam activity.

9.4 Refining the prototype: Based on user testing, what needs to be acted on now so that the prototype can be used?

Address False Positives:
• Improve the spam classification algorithm to reduce the number of legitimate emails flagged as spam, enhancing user trust in the tool.

Enhance Customization Options:
• Implement more granular filter settings, allowing users to adjust sensitivity levels, add specific keywords, and manage sender preferences.

Streamline Feedback Mechanism:
• Simplify the process for users to provide feedback on spam classifications, making it easy for them to report inaccuracies.

Improve Notification Clarity:
• Revise the notification system to provide clearer context and reasons for why emails are flagged, helping users understand the decision-making process.
9.5 What improvements can be made later?

Advanced Machine Learning Algorithms:
• Implement more sophisticated models, such as deep learning techniques, to improve detection accuracy and adaptiveness over time.

Personalized User Profiles:
• Develop features that create personalized profiles based on user behavior and preferences, allowing for more tailored spam filtering.

Multi-Language Support:
• Expand the tool to support multiple languages, accommodating users from diverse linguistic backgrounds.

Integration with Other Security Tools:
• Collaborate with antivirus and cybersecurity solutions to provide a comprehensive protection package against various online threats.

10. Team collaboration
10.1 How did you actively work with others in your team and with stakeholders?

Regular Meetings:
• Conducted weekly or bi-weekly meetings to discuss project progress, share insights, and address any challenges faced by team members.

Collaborative Brainstorming Sessions:
• Engaged in brainstorming sessions to generate ideas for features, improvements, and user testing strategies, ensuring diverse perspectives were considered.

Feedback Loops:
• Established structured feedback loops with stakeholders, including users, to gather input on the prototype, which informed iterative design and development.

Cross-Functional Collaboration:
• Worked with different departments (e.g., development, marketing, and customer support) to align goals, share knowledge, and ensure a cohesive approach to the product.

User Testing and Interviews:
• Involved team members in user testing sessions, gathering firsthand insights on user experiences and challenges, which helped refine the tool.

Documentation and Communication:
• Maintained clear documentation of decisions, processes, and user feedback to keep everyone informed and aligned throughout the project lifecycle.

11. Individual learning reflection
11.1 Team Reflections

A good way to identify what you have learned is to ask yourself what surprised
you during the project. List the things that surprised you and any other thoughts
you might have on issues in your local community.

Team member name:

Team member name:

Team member name:

Team member name:

Team member name:

Team member name:

Note: Add more boxes if there are more members in your team

Rate yourself

Individual Learning Reflection

1 point – Some team members present an account of their learning during the project.
2 points – Each team member presents an account of their learning during the project.
3 points – Each team member presents a reflective and insightful account of their learning during the project.

12. Video link

Enter the URL of your team video:

Enter the password (if any):

Appendix
Recommended Assessment Rubric (for Teachers)

LOGBOOK AND VIDEO CONTENT


Columns: Steps | 3 points | 2 points | 1 point | Points given

Problem definition
3 points: A local problem which has not been fully solved before is explained in detail with supporting research.
2 points: A local problem which has not been fully solved before is described.
1 point: A local problem is described.

The Users
3 points: Understanding of the user group is evidenced by completion of all of the steps in Section 4 The Users and thorough investigation.
2 points: Understanding of the user group is evidenced by completion of most of the steps in Section 4 The Users.
1 point: The user group is described but it is unclear how they are affected by the problem.

Brainstorming
3 points: A brainstorming session was conducted using creative and critical thinking. A compelling solution was selected with supporting arguments from Section 5 Brainstorming.
2 points: A brainstorming session was conducted using creative and critical thinking. A solution was selected with supporting arguments in Section 5 Brainstorming.
1 point: A brainstorming session was conducted. A solution was selected.

Design
3 points: The use of AI is a good fit for the solution. The new user experience is clearly documented, showing how users will be better served than they are today.
2 points: The use of AI is a good fit for the solution and there is some documentation about how it meets the needs of users.
1 point: The use of AI is a good fit for the solution.

Data
3 points: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced, and that safety and privacy have been considered.
2 points: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected. There is evidence that the dataset is balanced.
1 point: Relevant data to train the AI model have been identified as well as how the data will be sourced or collected.

Prototype
3 points: A prototype for the solution has been created and successfully trained to meet users' requirements.
2 points: A prototype for the solution has been created and trained.
1 point: A concept for a prototype shows how the AI model will work.

Testing
3 points: A prototype has been tested with a fair representation of users and all tasks in Section 9 Testing have been completed.
2 points: A prototype has been tested with users and improvements have been identified to meet user requirements.
1 point: A concept for a prototype shows how it will be tested.

Team collaboration
3 points: Effective team collaboration and communication among peers and stakeholders is clearly documented in Section 10 Team collaboration.
2 points: Team collaboration among peers and stakeholders is clearly documented in Section 10 Team collaboration.
1 point: There is some evidence of team interactions among peers and stakeholders.

Individual learning
3 points: Each team member presents a reflective and insightful account of their learning during the project.
2 points: Each team member presents an account of their learning during the project.
1 point: Some team members present an account of their learning during the project.

Total points

VIDEO PRESENTATION
Each criterion is scored 3 (excellent), 2 (very good), or 1 (satisfactory), and points are given per criterion.

Communication: The video is well-paced and communicated, following a clear and logical sequence.

Illustrative: Demonstrations and/or visuals are used to illustrate examples, where appropriate.

Accurate language: The video presents accurate science and technology and uses appropriate language.

Passion: The video demonstrates passion from team members about their chosen topic/idea.

Sound and image quality: The video demonstrates good sound and image quality.

Length: The content is presented in the video within a 3-minute timeframe.

Total points
