

Cooperative arrangement for complaints handling on social networking sites

YouTube
In the interests of transparency, providers supporting the Cooperative Arrangement for
Complaints Handling on Social Networking Sites agree to provide information on how they give
effect to the Principles in relation to the social networking services they offer, using this form.

1. About the Social Networking Service

Google is deeply committed to protecting young people on the Internet and to providing all of our
users with a safe experience online through empowerment, education and protective
measures. That's why we empower people with tools to help them choose what content they
see online, provide educational materials on how to stay safe online, and protect users through
partnerships with law enforcement and industry. We incorporate these three key principles
across all Google products and services as broadly as possible. Google's Family Safety
Center1 and the YouTube Safety Center2 are one-stop shops for teens, parents, teachers and
carers looking for safety tools, resources and advice about staying safe online. We've recently
updated these resources to include advice from leading child safety organisations around the
world, tips and ideas from parents at Google, as well as information on how to use the safety
tools and controls built into Google products.

In relation to this protocol, there are social elements to our video-sharing platform YouTube.
YouTube is a user-generated video-sharing platform around which communities form, hold
discussions and interact. Bearing this in mind, Google provides information below about how it
gives effect to the Principles on the YouTube platform, where they can be applied.

YouTube allows billions of people to discover, watch and share originally-created videos.
YouTube provides a forum for people to connect, inform, and inspire others across the globe
and acts as a distribution platform for original content creators and advertisers large and
small.

1. http://www.google.com.au/goodtoknow/familysafety/
2. http://support.google.com/youtube/bin/request.py?&contact_type=abuse

Members of the YouTube community engage with one another by uploading, watching,
favouriting, liking and commenting on videos, subscribing to channels, creating playlists and
sharing this activity. The following statistics demonstrate, in particular, the social nature of
YouTube:

● 500 years of YouTube video are watched every day on Facebook, and over 700
YouTube videos are shared on Twitter each minute
● 100 million people take a social action on YouTube (likes, shares, comments, etc.) every
week
● Millions of subscriptions happen each day. Subscriptions allow you to connect with
someone you're interested in, whether it's a friend or the Sydney Opera House, and
keep up with their activity on the site
● More than 50% of videos on YouTube have been rated or include comments from the
community
● Millions of videos are favourited every day

2. How will the provider give effect to the complaints handling aspect of the Cooperative
Arrangement

The following is an outline of how YouTube has considered the Cooperative Arrangement for
complaints handling in relation to its Social Networking Service(s). This section makes
reference to the recommendations in the Principles document, where they are applicable, and
outlines how they are applied.

1. Policies for acceptable use

YouTube's Terms of Service require all users to abide by our Community Guidelines before
uploading videos. Our Community Guidelines are purposefully written in easy-to-understand
language and are designed to provide users with clear advice on what content is acceptable
and what is not (e.g., hate speech, pornography, images of drug abuse, and graphic
violence). We have zero tolerance for predatory behaviour, harassment, revealing other users’
personal information, or any activity that endangers safety or privacy. Users who repeatedly
violate our Community Guidelines will have their accounts terminated.

Content deemed “age-restricted” after flagging and subsequent staff review is only viewable
by signed-in users who represent that they are 18 years of age or older and who have clicked
through a warning message.

Our Community Guidelines are clear about what is allowed on YouTube and what is not.
There is, however, a category of content that is not illegal and does not breach our Terms of
Service, but could still be inappropriate for users under the age of 18.3 Should a user who is
not signed in to the service come across such a video, they will be greeted with an interstitial
page stating:

“This video or group may contain content that is inappropriate for some users, as flagged by
YouTube's user community. To view this video or group, please verify that you are 18 or older
by signing in or signing up4.”

A user can then sign in and choose whether or not to watch the video. Only users whose birth
date puts them over the age of 18 will be allowed to watch the video. To prevent a teen from
signing out of their account and then trying to create a new account with an older birth date,
YouTube places a cookie on the user's browser, preventing the user from re-registering with a
different age.

Users under the age of 13 are prohibited from using YouTube and are blocked from creating
accounts by a permanent cookie.

Within YouTube's safety tips page and the Terms of Service5, it is clearly stated that the
YouTube service is for people aged 13 years and older. If, upon registration, a user enters a
birth date that makes them under 13, they will be refused use of YouTube. To prevent a teen
from trying to create a new account with an older birth date, YouTube places a cookie on the
user's browser which prevents the user from registering with a different birth date.

Upon notification, or if YouTube reasonably suspects that a particular user is under 13 years
old, that account will be closed. In such cases, YouTube sends a confirmation email to the
user.
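As a rough illustration of how such a cookie-based age gate might work, the sketch below checks a persistent browser marker and a declared birth year before allowing age-restricted playback. The cookie name, function names and year-based check are assumptions for illustration only, not YouTube's actual implementation.

    // Hypothetical sketch of the cookie-based age gate described above.
    // Cookie name, function names and the year-based check are illustrative
    // assumptions, not YouTube's actual implementation.

    interface AgeGateResult {
      allowed: boolean;
      reason: string;
    }

    // Parse a "Cookie" request header into name/value pairs.
    function parseCookies(header: string): Map<string, string> {
      const jar = new Map<string, string>();
      for (const part of header.split(";")) {
        const [name, ...rest] = part.trim().split("=");
        if (name) jar.set(name, rest.join("="));
      }
      return jar;
    }

    // Decide whether an age-restricted video may be shown.
    // `declaredBirthYear` is null for signed-out visitors; `cookieHeader`
    // may carry a persistent marker set after an under-age registration attempt.
    function checkAgeGate(
      declaredBirthYear: number | null,
      cookieHeader: string,
      currentYear: number
    ): AgeGateResult {
      const cookies = parseCookies(cookieHeader);

      // A stored marker blocks attempts to re-register with an older birth date.
      if (cookies.has("underage_signup_blocked")) {
        return { allowed: false, reason: "prior under-age registration attempt" };
      }
      if (declaredBirthYear === null) {
        return { allowed: false, reason: "not signed in; show the interstitial page" };
      }
      if (currentYear - declaredBirthYear < 18) {
        return { allowed: false, reason: "account holder is under 18" };
      }
      return { allowed: true, reason: "18 or older; age-restricted playback permitted" };
    }

    // Example: a signed-out visitor is sent to the interstitial.
    console.log(checkAgeGate(null, "", 2012));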

2 & 3. Complaints Mechanisms and Review Processes

Every minute, 72 hours of video are uploaded to YouTube, totalling hundreds of thousands of
videos every day. To handle this much content, we have developed an innovative and reliable
community policing system that involves our users in helping us enforce YouTube's rules.

Millions of users report potential violations of our Community Guidelines by selecting the
“Flag” link when they encounter inappropriate content. This reporting mechanism is
straightforward and easily navigable for users of any age. Users are able to select from a list
of over a dozen reasons for flagging a video and are given the opportunity to provide
additional information, such as the specific time when the objectionable material appears in
the video. Flagged videos are promptly reviewed for compliance with our Community
Guidelines.

3. http://support.google.com/youtube/bin/answer.py?hl=en-GB&answer=92486
4. www.youtube.com/create_account
5. http://www.youtube.com/t/terms

In addition to the aforementioned flagging mechanism to report content policy violations,
YouTube has formal contact forms through which users can contact us directly regarding
privacy, harassment and legal complaints.

Dedicated YouTube staff review flagged content 24 hours a day, seven days a week, with
much of it reviewed in under one hour. If a user seeks more information on how we enforce
our policies, additional details are available in the Help Center6.
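As an illustration of the kind of information a flag might carry, the sketch below models a flag report with a reason drawn from a fixed list and an optional timestamp, feeding a simple review queue. The reason names, fields and queue are assumptions for this sketch, not YouTube's actual flagging schema.

    // Illustrative model of a user flag report feeding a review queue.
    // Reason names and fields are assumptions, not YouTube's actual schema.
    type FlagReason =
      | "sexual_content"
      | "violent_or_graphic"
      | "hateful_or_abusive"
      | "harmful_or_dangerous_acts"
      | "child_abuse"
      | "spam_or_misleading"
      | "other";

    interface FlagReport {
      videoId: string;
      reason: FlagReason;
      // Optional offset (in seconds) where the objectionable material appears.
      timestampSeconds?: number;
      submittedAt: Date;
    }

    const reviewQueue: FlagReport[] = [];

    // Enqueue a flag for human review.
    function submitFlag(report: FlagReport): void {
      reviewQueue.push(report);
    }

    // Reviewers take the oldest flag first.
    function nextFlagForReview(): FlagReport | undefined {
      return reviewQueue.shift();
    }

    // Example: a viewer flags a video and notes where the content appears.
    submitFlag({
      videoId: "abc123",
      reason: "harmful_or_dangerous_acts",
      timestampSeconds: 95,
      submittedAt: new Date(),
    });
    console.log(nextFlagForReview());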

Users whose content is flagged and removed pursuant to our Community Guidelines receive
messaging specific to the policy violated. For instance, a user who violates our policies
regarding Harmful & Dangerous Content would receive messaging tailored to that policy area.

If users submit a Harassment complaint via our Help & Safety Tool7, they are informed:
“We appreciate your bringing this to our attention. We will investigate your claim and take
action if necessary. Thanks for helping us keep YouTube safe.” If the content violates our
policies regarding Harassment, we notify the uploader as well.

If someone feels that their privacy has been violated on YouTube, a privacy complaint can be
submitted to YouTube through our online tool8. We also accept complaints from parents or
legal guardians of a minor child.

Throughout the Privacy Complaint Process the support team engages in correspondence with
both the complainant and the uploader. We first email both the complainant and the uploader
to confirm that a privacy complaint has been filed and to explain how reports are handled. In
certain circumstances we ask either or both parties for additional information regarding the
claim. After we review the complaint against our Community Guidelines, we take action as
necessary and email the complainant regarding our decision. We likewise engage in email
correspondence with users who submit impersonation and legal complaints. We confirm that
we’ve received their complaint and notify them of the action taken.
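The correspondence flow above can be summarised as a small sequence of stages. The sketch below is a rough model of that flow using hypothetical stage names and a stand-in notify() helper; it is an illustration of the process described, not YouTube's actual system.

    // Rough model of the privacy complaint correspondence flow, with
    // hypothetical stage names; an illustration only.
    type ComplaintStage =
      | "received"          // complaint filed via the online tool
      | "parties_notified"  // complainant and uploader emailed a confirmation
      | "info_requested"    // optional: more detail sought from either party
      | "under_review"      // checked against the Community Guidelines
      | "decision_sent";    // action taken (if any) and complainant notified

    interface PrivacyComplaint {
      complaintId: string;
      complainantEmail: string;
      uploaderEmail: string;
      stage: ComplaintStage;
    }

    // Stand-in for sending an email.
    function notify(recipient: string, message: string): void {
      console.log(`email to ${recipient}: ${message}`);
    }

    // Confirm to both parties that a complaint has been filed.
    function confirmFiling(complaint: PrivacyComplaint): PrivacyComplaint {
      notify(complaint.complainantEmail, "Your privacy complaint has been received.");
      notify(complaint.uploaderEmail, "A privacy complaint has been filed about your content.");
      return { ...complaint, stage: "parties_notified" };
    }

    // Record the outcome of the review and tell the complainant.
    function recordDecision(complaint: PrivacyComplaint, outcome: string): PrivacyComplaint {
      notify(complaint.complainantEmail, `Review complete: ${outcome}`);
      return { ...complaint, stage: "decision_sent" };
    }

    // Example usage.
    let complaint: PrivacyComplaint = {
      complaintId: "pc-001",
      complainantEmail: "complainant@example.com",
      uploaderEmail: "uploader@example.com",
      stage: "received",
    };
    complaint = confirmFiling(complaint);
    complaint = recordDecision(complaint, "content removed under the Community Guidelines");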

We also offer our users tools to protect their privacy on the site, such as the ability to hide
personal information, make videos private, remove videos from public listings, or share videos
selectively with family and friends.

We respond immediately to threatening situations and report all instances of child
abuse/exploitation to the National Center for Missing and Exploited Children (NCMEC).

6. http://www.google.com/support/youtube/bin/answer.py?hl=en&answer=92486
7. http://www.youtube.com/reportabuse
8. http://support.google.com/youtube/bin/answer.py?hl=en-GB&hlrm=en&answer=142443
4. Child abuse material

YouTube takes reports of illegal content on the site seriously. When we become aware that a
video violates the law, we cooperate with law enforcement agencies quickly and within the proper
legal framework. For example, we report all instances of child abuse/exploitation to the
National Center for Missing and Exploited Children (NCMEC). Content that demonstrates
imminent threats to life or limb can be reported to relevant law enforcement agencies. Content
that is flagged or reported through the Help & Safety Tool is reviewed and dealt with
appropriately.

At the Google-wide level, we actively support law enforcement efforts to keep kids safe
online. Google has a specialized legal team dedicated to working with law enforcement
officials, available 24 hours a day, 7 days a week.

We're also leveraging Google tools to combat online child abuse material. Throughout 2007,
engineers used some of their 20% time to create innovative software tools9 focused on
organisation, scalability and search. In particular, the tools we provided aid in organising and
indexing the National Center for Missing and Exploited Children's (NCMEC) information so
that analysts can both deal with new images and videos more efficiently and reference
historical material more effectively. This task had been time-consuming, and NCMEC analysts
were being overwhelmed by the volume of data they had to sift through when tracking down
child predators through video and image search. With these tools, analysts are able to more
quickly and easily search NCMEC's large information systems to sort and identify files that
contain images of child pornography. In addition, a new video tool we built streamlines
analysts' review of video snippets.

Google has also donated Google Search Appliances10 to NCMEC, along with hundreds of
thousands of dollars of in-kind advertising each year through our Google Grants11 program.

Google is a member of both NCMEC's Financial Coalition Against Child Pornography and its
Technology Coalition. The Financial Coalition, which includes leading banks, credit card
companies, third-party payment companies and Internet service companies, is dedicated to
fighting child pornography/abuse over the Internet, with the goal of eradicating commercial
child pornography. The Technology Coalition's mission is to develop and deploy technology
solutions that disrupt the ability of predators to use the Internet to exploit children or traffic in
child pornography/abuse material.

9. http://missingkids.com/missingkids/servlet/NewsEventServlet?LanguageCountry=en_US&PageId=3644
10. http://www.google.com/enterprise/gsa/
11. http://www.google.com/support/youtube/bin/answer.py?answer=126289
5. Identified contact person

YouTube's identified contact person is Google's representative on the Consultative Working
Group on CyberSafety. This is the person with whom the Australian Government can discuss
issues as they arise, as well as any appropriate messaging to the community and media in
response.

6 & 7. Education and awareness raising

As a video platform, YouTube not only offers its own safety features but also provides
information to parents, teachers and young people, many of whom we partner with and
support, on how to remain safe online. Users are able to access YouTube's Community
Guidelines12, Help Center13 and Safety Center14 from every YouTube page. All of these pages
are written in an easy-to-understand, user-friendly format.

The Help Center provides advice on a vast array of topics, from changing personal settings to
privacy complaint guidelines. Our dedicated Safety Center, linked to at the bottom of each
page on our site, provides safety tips to our users, including advice on keeping personal
videos private, teen safety, protecting online identities, appropriately managing interactions
with other users, being responsible digital citizens, and using the community flagging system
(see www.youtube.com/t/safety). We also provide Educator15 and Parent Resources16 pages,
which are targeted specifically at those audiences. From the Safety Center, it is possible to
reach the Google Family Safety Center, which offers additional tools and resources on a
variety of topics. This information is made available to users, parents and teachers.

We also recently launched the YouTube Curriculum, an interactive resource to help students,
parents and educators learn how to be responsible digital citizens and how to navigate
YouTube's policies and tools with ease.

We also support child safety organisations' efforts to educate Internet users through new
media, including YouTube. YouTube has partnered with many organisations in multiple
countries and languages that have their own channels and whose expertise we've
incorporated into our help resources and safety tips. Some examples are Kids Helpline17,
Bravehearts18, Beat Bullying19, Childnet20, eEnfance21, Save the Children22, ICMEC23, Ad
Council24 and others.

12. http://www.youtube.com/t/community_guidelines
13. http://support.google.com/youtube/?hl=en
14. http://support.google.com/youtube/bin/request.py?contact_type=abuse
15. http://support.google.com/youtube/bin/answer.py?hl=en&answer=157105
16. http://support.google.com/youtube/bin/answer.py?hl=en&answer=126289
17. http://www.kidshelp.com.au/

We also teamed up with online safety organisation iKeepSafe to develop a curriculum25 that
educators can use in the classroom to teach what it means to be a responsible online citizen.
The curriculum is designed to be interactive and discussion-filled, and to allow students to learn
through hands-on and scenario-based activities. The site also provides a resource booklet for
both educators and students that can be downloaded in PDF form, presentations to accompany
the lessons, and animated videos to help frame the conversation. We've also launched a
number of videos from our Safety Center, one26 of which focuses exclusively on how to stay
safe on YouTube.

YouTube always encourages users to take a safe approach to personal information and
privacy. YouTube does not have profile pages in the same way as social networking services;
instead, YouTube users use Channels to share user-created content. YouTube does provide
the option for users to share their videos privately with a limited number of people, or to remove
their videos from public listings. Users also have the option to prevent their videos from being
embedded on third-party pages, which stops other sites from displaying the user's video
elsewhere on the web.

When setting up an account, users are advised in the Safety Tips, also accessible from every
page, on how to protect their identity. Users are able to add some personal information on
their channel should they choose to do so. Once the user has set up their channel, they have
the option to add information through their account settings. Privacy settings for recent activity
sharing27 are readily available and allow a user to choose which information they wish to
share. Help articles on ‘Editing your Channel for Privacy’ and related issues can be found in
the Help Center.

18. http://www.bravehearts.org.au/
19. http://www.youtube.com/user/Beatbullying
20. http://www.youtube.com/user/childnet
21. http://www.youtube.com/user/eenfance
22. http://www.youtube.com/user/savethechildrenuk
23. http://www.youtube.com/user/DontYouForgetAboutMe
24. http://www.youtube.com/user/adcouncil
25. http://www.google.com/educators/digitalliteracy.html
26. http://www.youtube.com/watch?v=DQ5zJvA0NYY
27. http://www.youtube.com/account_sharing
8. Collaboration with Government on education and awareness raising initiatives

YouTube, through its Google representative, is a member of the Consultative Working Group
on Cybersafety and is committed to helping all Australians be smart, safe and responsible
online.

Google works to promote the Australian Government’s CyberSafety Help Button, including
through its blog and Family Safety Center. Google also participates in cybersafety events and
awareness activities organised by the Government, including Safer Internet Day, Privacy
Awareness Week and National CyberSecurity Awareness Week. For example, for Safer
Internet Day Google published advertisements promoting its safety tools in newspapers
across Australia and online.

Google also distributes information about its safety tools through the Australian
Communications and Media Authority's Cybersmart program, and through events such as the
CyberSafety Summit, the National Centre Against Bullying Conference and consumer
organisations such as ACCAN. We also meet with educational institutions to raise awareness
of the safety tools we make available, and how to report content on our services.

We support the partly Government-funded Cooperative Research Centre for Youth and
Wellbeing (YAW-CRC). The YAW-CRC contributes important research in this area, including
research that is helping us understand how we as a community can harness the knowledge
and skills of young people to help parents and guardians learn about the online world (see, for
example, the YAW-CRC's Living Lab study on Intergenerational Attitudes to Social Networking
and Cybersafety).

We also support non-government educational efforts to increase awareness about digital
citizenship. In Australia we work with non-profit organisations including the National
Association for Prevention of Child Abuse and Neglect (NAPCAN), the Inspire Foundation, The
Alannah and Madeline Foundation, Kids Helpline and Bravehearts to provide online public
service announcements that promote access to safety resources and other educational
materials. We actively support their efforts to raise awareness and educate people about digital
citizenship, including The Alannah and Madeline Foundation's eSmart Schools Program
(http://www.amf.org.au/eSmart/).

9. Continued innovation

We are continually developing innovative tools to keep our community safe. For example, we
use digital hashing technologies to prevent the re-upload of files that have been removed for
policy violations. We also offer our users tools to protect their privacy on the site, such as the
ability to hide personal information, make videos private, remove videos from public listings, or
share videos selectively with family and friends. Our Help & Safety Tool lets users report
concerns to the YouTube team (such as harassment, privacy violations and cyberbullying),
block comments from specific other users and disable the comments feature on their videos.
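A minimal sketch of how hash-based re-upload blocking can work is shown below, assuming a simple exact-match digest check; production systems typically use more robust fingerprinting. All names are illustrative.

    // Minimal sketch of hash-based re-upload blocking, assuming an
    // exact-match digest; names here are illustrative only.
    import { createHash } from "crypto";

    // Digests of files removed for policy violations.
    const removedContentHashes = new Set<string>();

    function fingerprint(fileBytes: Buffer): string {
      return createHash("sha256").update(fileBytes).digest("hex");
    }

    // Record a removed file so identical bytes cannot be re-uploaded.
    function registerRemoval(fileBytes: Buffer): void {
      removedContentHashes.add(fingerprint(fileBytes));
    }

    // Reject an upload whose digest matches previously removed content.
    function isReuploadBlocked(fileBytes: Buffer): boolean {
      return removedContentHashes.has(fingerprint(fileBytes));
    }

    // Example usage.
    const removedFile = Buffer.from("bytes of a video removed for a policy violation");
    registerRemoval(removedFile);
    console.log(isReuploadBlocked(removedFile));                      // true
    console.log(isReuploadBlocked(Buffer.from("a different video"))); // false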

YouTube specifically created Safety Mode28, which allows users to choose not to see
potentially objectionable content and gives parents additional controls over their teen's
account. When users opt in to Safety Mode, videos with potentially objectionable content will
not show up in video search. While no filter is 100% accurate, we use community flagging to
identify and hide inappropriate content, and objectionable comments are also hidden. Like
Google SafeSearch, Safety Mode on YouTube does not remove content from the site but
rather keeps it off the page for users who opt in.

To make it less likely for users to stumble upon this type of content, it is excluded from certain
listings and areas of the site, such as the "Most Viewed" page. YouTube has also implemented
automated systems that help classify videos based on their content and metadata; where
videos are determined to be unsuitable for younger viewers, such content is demoted in
browse pages, for example.
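To illustrate the opt-in behaviour described above, here is a hedged sketch of result filtering in which content stays on the site but is kept off the page for users with Safety Mode enabled; field and function names are assumptions, not YouTube's actual code.

    // Hedged sketch of opt-in filtering: content stays on the site but is kept
    // off the page for users with Safety Mode enabled. Names are illustrative.
    interface Video {
      id: string;
      title: string;
      // Set via community flagging and/or automated classification.
      potentiallyObjectionable: boolean;
    }

    function filterForViewer(results: Video[], safetyModeEnabled: boolean): Video[] {
      if (!safetyModeEnabled) return results;
      // Safety Mode does not remove content; it simply excludes it from the results shown.
      return results.filter((video) => !video.potentiallyObjectionable);
    }

    // Example usage.
    const searchResults: Video[] = [
      { id: "v1", title: "Cooking basics", potentiallyObjectionable: false },
      { id: "v2", title: "Graphic documentary", potentiallyObjectionable: true },
    ];
    console.log(filterForViewer(searchResults, true));  // only "Cooking basics"
    console.log(filterForViewer(searchResults, false)); // both videos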

3. Other actions taken on implementation of these arrangements

This Self-Declaration is designed to provide information about how YouTube handles
complaints and works to raise awareness about how to stay smart, safe and responsible
online. YouTube, through its Google representative, is a member of the Consultative Working
Group on CyberSafety and is committed to ongoing collaboration to help all Australians have
a safe experience online.

28. http://support.google.com/youtube/bin/answer.py?hl=en-GB&answer=174084
