Spring 2021 - 0831 - 2

AIOU 0831 2nd sol

Uploaded by

Areeshy

Foundation of Education

Course Code: 0831

Assignment #2

Submitted to: Adeel Abbas


Submitted by: Zanjbeela Maryam
Student ID: CE620363
Submission Date: 17 October, 2021

ALLAMA IQBAL OPEN UNIVERSITY ISLAMABAD


Question 1:
Differentiate between aims, goals and objectives by giving examples.
Solution:
Aims, goals and objectives help to make your business successful day in and day out. Although
they are interrelated, aims, goals and objectives have important distinctions and their roles in
business are often confused. Aims relate to the end results, but goals and objectives help you
achieve these results. Goals are abstract ideas, while objectives are more tangible and concrete.
Plan, Articulate and Document Aims, Goals and Objectives
Taking the time to plan, articulate and document in writing your aims, goals and objectives
contributes to the success of your business. The three interrelated concepts concern future
intentions and all three must be set in motion if your plans are to have a realistic chance of
succeeding. You are more likely to reach a goal when you’ve planned and implemented objectives
to achieve it. You’ll also see greater success in your business if you share your aims and strategies
to achieve them with associates and employees who will be carrying out those objectives. Keep in
mind that aims, goals and objectives must be clear and specific as well as realistic.
Aims Are Desired Outcomes
An aim is a purpose or the desired outcome. Aims tend to be more general than goals and objectives
because aims refer to the end results. But while they are general in nature, aims are also bigger.
They are the vision for your business. Aims are not always accompanied by goals and objectives,
but to achieve the desired outcome there must be an action plan in place. For example, a person
might state his or her aim is to be a successful entrepreneur, without setting the goals and objectives
that would help him or her to achieve this.
Set Specific Goals
Goals are specific statements of intent. For example, a company might have an aim to increase
profits so they set a specific goal to increase profits by 25 percent within one year. A goal is usually
broad and does not lay out the steps to achieve it. A goal is a target or destination. Goals keep you
focused on your aim and on track working to achieve it.
Objectives Are the Action Plan
Goals are destinations and objectives are the actions needed to arrive at that destination. Objectives are
measurable and there may be multiple objectives leading to your goals and aims. If your overall
goal is to get a more rewarding job, for example, you may have a set of objectives that help you to
achieve this. Such objectives might include sending letters to companies you want to work for,
brushing up on interview techniques and learning skills or obtaining qualifications that would
increase your employment prospects. Objectives are like a road map, giving you direction as to
what you need to do and when you need to do it in order to achieve your business goals
and aims. Strategic objectives are clearly defined, often quantifiable goals that a business uses as
benchmarks to evaluate its progress. Implementation of strategic objectives does not start when
you begin working toward achieving a goal, but rather when you first define your goals and set up
a process for gauging your success.
Objectives
Defining strategic objectives in clear and relevant ways makes the difference between hollow,
meaningless standards and useful tools that genuinely improve the quality of your work
performance. It is more useful to say, "We will work to increase our sales by 50 percent over the
next year," than to say, "We will grow our operation significantly over the next year." Set
objectives using quantifiable criteria that you can measure in order to assess whether you are
meeting your goals and, if not, how much you are falling short. State your goals in language that
is easy to understand and define objectives that are clear and relevant to your work.
Establishing Benchmarks
You can implement strategic objectives most effectively if you break them down into manageable
chunks and outline specific actions that you will take to achieve them. Once you have stated that
you intend to grow your business by 50 percent over the next year, stipulate whether that growth
will occur in a steady and even fashion, or whether it will be tied to actions and events during a
particular part of the year that will generate a more dramatic rate of growth during that time. Tie
the growth that you expect to sales and marketing activities, and new product introductions so you
can measure the impact of each action and evaluate whether it is achieving the desired results.
Implementation of Strategic Objectives
To implement strategic objectives, conduct business activities with an eye toward meeting the
goals and benchmarks you have set forward but also be willing to reevaluate your targets and
strategies if situations change. Gauge your progress relative to your objectives by evaluating
whether you have effectively followed the steps you outlined, and whether these actions have
produced the desired results. Also examine whether the objectives themselves are realistic and
whether these steps and actions are in fact the best way to achieve your strategic objectives. Many
people use the words “goal” and “objective” interchangeably to refer to any forward-looking
statement of intention. In the personal development context, you may describe losing 10 pounds,
saving $500 or adopting a vegan diet as either goals or objectives. Some business leaders may refer
to increasing profits by 10 percent or slashing costs by 5 percent as both goals and objectives in a
single conversation. Certainly there are similarities between goals and objectives. However, goals
are not the same thing as objectives. Crucial differences exist between these two concepts. It’s
important to understand the differences between goals and objectives, as well as how they can
work together to improve an organization’s likelihood of success.
The Difference between Goals and Objectives
A goal is typically broader in scope than an objective, but not as comprehensive as a statement of
purpose. Goals are designed to achieve an intention concerning one or more specific business
functions, such as profits, costs, human resources, operations or IT. For example, a non-profit
organization may set a goal of “serving 2,000 low-income families in the greater metropolitan area
over the next 10 years.” A manufacturing business may adopt a goal of “cutting expenses across
the board by 25 percent compared to the last fiscal year.” A services business – for example, a law
firm – may decide to “bill 100 more hours next month.” As these examples of goals illustrate, a
goal may be either short-term, like next month, or long-term which could be over the next ten
years. Goals may deal with financial operations such as profits or costs, or they may be
transactions, customers or contracts. Goals establish both a clear direction forward and the desired
endpoint. As a result, goals help a business or organization make progress, grow and develop.
However, on their own, goals are insufficient to guide the day-to-day actions of employees or
members of organizations. Therefore, the business or group and its leaders will find it difficult to
reach that goal without setting objectives. Objectives help translate goals into actionable items,
tasks, needs and project plans. With objectives, managers can create project timelines and decide
on specific deliverables and budget resources including employees, time and funds. Consequently,
objectives are based on the goals they seek to accomplish but they are more specific statements of
how an individual, company or organization can attain the goal in question.
Purposes and Goals Are Not the Same Thing
Goals are broader statements than objectives, but a company’s statement of purpose conveys the
overarching vision of the business, which is usually established by a CEO or board. The statement
of purpose is in alignment with a company’s mission, more than its goals. Goals can and should
be aligned with the company’s purpose, but they are not the same thing. For example, a company
may have a purpose of “eradicating childhood hunger.” How the company will fulfill that lofty
purpose depends on the goals the company establishes. It may set a goal of “launching a new pest-
resistant strain of seeds for eight popular vegetables.” Or it may establish a goal of “investing 50
percent more into research and development.” Purpose motivates and inspires, but it does not give
effective guidance about how a plan should be implemented, pursued or fulfilled. Companies also
need good goals and SMART (Specific, Measurable, Achievable, Relevant and Time-bound) objectives.

Question 2:
Discuss the awakening movements in Muslims of the sub-continent during the British period.
Solution:
Muslim communities of India have never been united as a single cohesive entity. Their religious
identity was transformed from a passive state to an active one according to the changing priorities
of the ruling classes. They invoked religious sentiments when they fought against Hindu rulers and
suppressed them when the shariah hindered their absolute rule. The concept of a Muslim political
identity was a product of British rule, when the electoral process and so-called democratic
institutions and traditions were introduced. It was British rule that created a minority complex amongst
Indian Muslims, and thereby a consciousness of Muslim political identity. After passing through a
series of upheavals, the Muslim community shed its minority complex and declared itself a nation,
asserting its separateness.
Northern India remained the center of Muslim power, historically. The class of leading Muslim
elites played an active role in determining and affirming Muslim identity according to their
economic and political interests. Muslims of the other parts of India followed in their footsteps
and looked at issues and problems from the point of view of northern Indian Muslims. We shall
look at the changing concepts of Muslim identity in the Indian subcontinent before 1947.
Three elements were amalgamated in the making of Muslim communities in India,
namely conquerors who came from the north-west, immigrants, and local converts. The conquerors
and their entourage had a sense of higher rank and superiority as it was they who wielded political
power. Arab, Persian, Turkish, Central Asian, and Pathan immigrants, who came to India to make
careers for themselves, were treated as if they shared a common ethnic background, and were
integrated with the conqueror class as the ruling elite. Local converts, on the other hand, were
treated as being lower down the social ladder and never accorded an equal place in the ethnically
divided Muslim society. Thus, ethnic identity was more powerful in dividing Muslim society than
the religious factor was in unifying it.
We can find an example of this in Chachnama, which is a basic source of the history of Sindh.
Muslim conquerors of Sindh are referred to in the Chachnama as Arabs. Similarly, the early
conquerors of northern India were known by their ethnic identity as Turks. After the foundation of
their kingdom (AD 1206) they maintained their exclusive ethnic domination and did not share their
power and privileges with other Muslim groups. The same policy was followed by other Muslim
dynasties. The founder of the Lodhi dynasty, Bahlul (1451-1489), did not trust non-Afghan
Muslims and invited Afghans from the mountains (Roh) to support him.
Locally converted Muslims were excluded from high positions and were despised by their foreign
(Muslim) brothers. Ziauddin Barani (fourteenth century) cited a number of examples in the Tarikh-
i-Firuzshahi when the Sultan refused to appoint lower caste Muslims to high posts, despite their
intelligence, ability, and integrity. Barani propounds his racist theory by advising Muslim rulers
to appoint only racially pure family members to high administrative jobs. He suggested that low
caste Muslims should not be allowed to acquire higher education as that would make them
arrogant. The theory of racial superiority served to reserve the limited available resources of the
kingdom for the benefit of the privileged elite who did not want to share them with others. The
ruling dynasties kept available resources in the hands of their own communities and excluded
others.
The Mughals wrested power from a Muslim dynasty (AD 1526). On their arrival, therefore, they
posed a threat to other Muslim rulers as well as to Hindu rulers. The danger of Mughal hegemony
united Muslim Afghans and Hindu Rajputs in a common cause. They fought jointly against Babar
in the battle of Kanwaha (AD 1527). However, Mughal rule changed the social structure of the
Muslim community in India, as a large number of Iranians and Turks arrived in India after the
opening of the North-West frontier. These new immigrants revived Iranian and Central Asian
culture which had been in a process of decline during Afghan rule. To monopolize top positions
in the state, Muslims of foreign origin formed a socially and culturally privileged group that not
only excluded locally converted Muslims but also Afghans who were deprived of high status jobs.
The Mughals were also very conscious of their fair color, which distinguished them from the
converted, darker complexioned Muslims. Since being a Muslim of foreign origin was considered
prestigious, most of the locally converted Muslim families began to trace their origin to famous
Arab tribes or to prominent Persian families.
The social structure of the Mughal aristocracy changed further when the empire extended its
territories and required more people to administer them. Akbar (AD 1556-1605) as the emperor,
realized that to rule the country exclusively with the help of Muslims of foreign origin posed a
problem as there would not be enough administrators for the entire state. He realized that the
administration had to be Indianized. Therefore, he broadened the Muslim aristocracy by including
Rajputs in the administration. He eliminated all signs and symbols which differentiated Muslims
and Hindus, and made attempts to integrate them as one. Despite Akbar’s efforts, however, the
rigid social structure did not allow lower class (caste) Hindus and Muslims to move from their
lower position in society to a higher status. Class rather than faith was the true dividing line. The
Muslim aristocracy preferred to accept upper caste Rajputs as their equals rather than integrate
with lower caste Muslims. Akbar’s policy was followed by his successors. Even Aurangzeb, in
spite of his dislike of Hindus, had to keep them in his administration. He tried to create a semblance
of homogeneity in the Muslim community by introducing religious reforms. But all his attempts
to create a consciousness of Muslim identity came to nothing. During the entire Sultanate and
Mughal periods, politically there was no symbol that could unite the Muslims into a single
cohesive community. In the absence of any common economic interest that might bind the
different groups of Muslims, they failed to cohere and achieve homogeneity as a single
community. Biradaris, castes, professions, and class interests kept them politically and culturally
divided.
The ulema made strenuous attempts to foster a religious consciousness and to build a Muslim
identity on such consciousness, by dividing Indian society into believers and non-believers. They
fulminated against ‘Hindu rituals’ being practiced mainly by lower-class Muslims and warned
them to reform and keep their religion ‘pure’. Their attitude towards locally converted Muslims
was particularly hostile. They argued that by retaining some of their indigenous Indian customs,
they were half Muslims and half Hindus. The ulema further argued that true Islam could be
understood only through knowledge of Arabic or Persian. Therefore, to integrate with the ‘Muslim
Community’ locally converted Muslims should abandon their vernacular culture and learn Arabic
and Persian (the everyday language of the ruling elite). By that definition, Muslims of foreign
origin were taken to be better than those who had been locally converted. These latter were
categorized as ignorant, illiterate, and bad Muslims. However, it must be said that in that period
(AD 1206-1707) when the power of the Muslim rulers in India was at its height, no attempts were
made to arouse religious, political, or social consciousness on the basis of a Muslim identity. It
was only in the period of Akbar, when Rajputs were being integrated with Mughal nobility, that
some ulema raised a voice against his religious, political, and social reforms and asserted the
separateness of Hindu and Muslim communities. Later on, Aurangzeb tried to rally Muslim
support by uniting them under a state-imposed version of fiqh (Islamic jurisprudence),
compiled as the Fatawa-i-Alamgiri. But all his efforts failed to arrest the process of political
disintegration which he was thus trying to avoid.
During the later period, the decline of Mughal political power dealt a heavy blow to the ruling
Mughal aristocracy. Immigrants from Iran and Central Asia stopped coming in due to lack of
patronage. The dominance of the Persian language weakened. Urdu emerged as the new language
of the Muslim elite. The social as well as the political hegemony of Muslims of foreign origin was
reduced. Locally converted Muslims began to claim and raise themselves to a new, higher status.
The rise and successes of the East India Company undermined the role of the Muslim ruling
classes. Defeats in the battles of Plassey (AD 1757), Buxar (AD 1764) and, finally, the occupation
of Delhi by the British (AD 1803) sealed the fate of Mughal power and threatened the privileged
existence of the Muslim ruling elite, as the Mughal emperor became incapable of defending their
interests.
Under these circumstances, after Shah Alam II, the practice of reciting the name of the Ottoman
caliph in the khutba began. This was meant to indicate that the Ottoman Caliph, and no longer the
Mughal emperor, was the defender and protector of the Muslim community in India. Another
significant change was that with the eclipse of the political authority of the Mughal emperor, the
ulema began to represent themselves as the protectors and custodians of the interests of the
community. They were now contemptuous of the Mughals whose decline they attribute to their
indifference towards religion. They embarked on revivalistic movements which they claimed
would lift the community from the low position to which it had fallen. Their revivalism was
intended to reform the Muslim community and infuse homogeneity in order to meet the challenges
that confronted them.
Sayyid Ahmed’s Jihad (AD 1831) and Haji Shariatullah’s Faraizi movements were revivalist and
strove to purify Islam of Hindu rituals and customs. Their ultimate goal was to establish an Islamic
state in India and to unite Muslims into one community on the basis of religion. Two factors played
an important role in reinforcing the creation of a separate identity amongst Indian Muslims. They
were, firstly, the activities of Christian Missionaries and secondly, the Hindu reformist and
revivalist movements. Muslims felt threatened by both. The fear of Muslims being converted into
another faith, and of being dominated by others, led the ulema to organize themselves ‘to save
Muslims from extinction’. Recognizing the authority of the ulema, Muslims turned towards them
for guidance. They sought fatawa over whether they should learn the English language, serve the
East India Company, and regard India as Dar-ul-Islam (under which they could live peacefully)
rather than as Dar-ul-Harb (which imposed upon them an obligation to rebel). Thus, external and
internal challenges brought the Muslims of India closer together. Religious consciousness paved
the way towards their separate identity. The madrassa, mosque, and khanqah became symbols of
their religious identity. However, the hopes that they placed in religious revivalism as the path to
political power came to an end when Sayyid Ahmed was defeated and his Jihad movement failed
to mobilize Muslims to fight against British rule. Bengali Muslims were subdued with the
suppression of the Faraizi movement, and the brutal repression that followed the uprising of 1857
reduced the Muslim upper classes to a shadow of what they had been.
Indian Muslims were demoralized after the failure of the rebellion of 1857. Sadness and gloom
prevailed everywhere. Muslims felt crushed and isolated. There came a challenge from British
scholars who criticized Islamic institutions as being unsuitable for modern times. Never before
had Indian Muslims faced such criticism of their religion. This frightened and angered them. In
response, Indian Muslim scholars came forward to defend their religion. This led them to study
Islamic history in order to rediscover what they believed to be a golden past. In reply to Western
criticism they formulated their arguments, substantiated by historical facts, that Europe owed its
progress to the contributions of Muslim scientists and scholars, which were transmitted to it
through the University of Cordoba in Moorish Spain, where, under Umayyad rule, there was a
policy of religious tolerance towards Christians and Jews. Muslim contributions to art, literature,
architecture, and science, thus enriched human civilization. To popularize this new image of the
role of Muslims in history, there followed a host of historical literature, popular as well as
scholarly, to satiate the thirst of Muslims for recognition of their achievements. Such images of a
golden past provided consolation to a community that felt helpless and forlorn. Images of the
glories of the Abbasids, the grandeur of the Moors of Spain, and the conquests of the Seljuks
healed their wounded pride and helped to restore their self-confidence. Ironically, while
glorifying the Islamic past outside India, they ignored the past of the Delhi Sultanate and Mughal
India. In their eyes, the distant and outside past was more attractive than the past they had actually
inherited. It was left to the nationalist historians of India, mainly Hindu, to reconstruct the glory
of Muslim India in building a secular, nationalist ideology in the struggle against British rule.
Muslim search for pride in their Islamic past, thus, once again turned the orientation of Indian
Muslims towards the rest of the Muslim world. That consciousness of a greater Muslim identity
obscured their Indian identity from their minds. Their sense of solidarity with the Muslim world
found expression, especially, in sympathy for the Ottoman Empire. Although most educated
Indians were quite unaware of the history of the Ottomans, it became a focal point of their pride,
displacing the Mughals. Sayyid Ahmad Khan, while explaining the attachment of Muslims to
Turkey, said ‘When there were many Muslim kingdoms we did not feel grief when one of them
was destroyed. If Turkey is conquered, there will be great grief, for she is the last of the great
powers left to Islam.’
The Khilafat movement extended the consciousness of a greater Muslim identity amongst Indian
Muslims. It also united the ulema and Western educated Muslims. The Muslim League, in its
session of 1918, invited leading ulema to join the party. They grasped the opportunity and soon
established control of the movement. When Gandhi supported the Khilafat issue and launched his
non-cooperation movement (AD 1919-20), he brought out Hindus to protest in solidarity with the
Muslims. But with the withdrawal of the non-cooperation movement and the eventual collapse of the
Khilafat movement, Muslim unity with the Hindus evaporated.
Question 3:
Discuss ways and means to enhance international understanding through education.
Solution:
In the past decade or two teaching has changed significantly, so much in fact that schools may not
be what some of us remember from our own childhood. Changes have affected both the
opportunities and the challenges of teaching, as well as the attitudes, knowledge, and skills needed
to prepare for a teaching career. The changes have influenced much of the content of the books.
To see what we mean, look briefly at four new trends in education, at how they have changed what
teachers do, and at how you will therefore need to prepare to teach:
 Increased diversity: there are more differences among students than there used to be.
Diversity has made teaching more fulfilling as a career, but it has also made teaching more
challenging in certain respects.
 Increased instructional technology: classrooms, schools, and students use computers more
often today than in the past for research, writing, communicating, and keeping records.
Technology has created new ways for students to learn (for example, this textbook would
not be possible without Internet technology!). It has also altered how teachers can teach
most effectively, and even raised issues about what constitutes ‘true’ teaching and learning.
 Greater accountability in education: both the public and educators themselves pay more
attention than in the past to how to assess (or provide evidence for) learning and good
quality teaching. The attention has increased the importance of education to the public (a
good thing) and improved education for some students. But it has also created new
constraints on what teachers teach and what students learn.
 Increased professionalism of teachers: Now more than ever, teachers are able to assess the
quality of their own work as well as that of colleagues, and to take steps to improve it when
necessary. Professionalism improves teaching, but by creating higher standards of practice
it also creates greater worries about whether particular teachers and schools are ‘good
enough.’
How do these changes show up in the daily life of classrooms? The answer depends partly on
where you teach; circumstances differ among schools, cities, and even whole societies. Some clues
about the effects of the trends on classroom life can be found, however, by considering one
particular case—the changes happening in North America.
New trend #1: diversity in students
Students have, of course, always been diverse. Whether in the past or in the present day, students
learn at unique paces, show unique personalities, and learn in their own ways. In recent decades,
though, the forms and extent of diversity have increased. Now more than ever, teachers are likely
to serve students from diverse language backgrounds, to serve more individuals with special
educational needs, and to teach students either younger or older than in the past.
Language diversity
Take the case of language diversity. In the United States, about 40 million people, or 14 per cent
of the population, are Hispanic. About 20 per cent of these speak primarily Spanish, and
approximately another 50 per cent speak only limited English (United States Census Bureau,
2005). The educators responsible for the children in this group need to accommodate instruction
to these students somehow. Part of the solution, of course, is to arrange specialized second-
language teachers and classes. But adjustment must also happen in ‘regular’ classrooms of various
grade levels and subjects. Classroom teachers must learn to communicate with students whose
English language background is limited, at the same time that the students themselves are learning
to use English more fluently (Pitt, 2005). Since relatively few teachers are Hispanic or speak fluent
Spanish, the adjustments can sometimes be a challenge. Teachers must plan lessons and tasks that
students actually understand. At the same time teachers must also keep track of the major learning
goals of the curriculum. As you gain experience teaching, you will no doubt find additional
strategies and resources (Gebhard, 2006), especially if second-language learners become an
important part of your classes.
Diversity of special educational needs
Another factor making classrooms increasingly diverse has been the inclusion of students with
disabilities into classrooms with non-disabled peers. In the United States the trend began in the
1970s, but accelerated with the passage in 1975 of what is now the Individuals with Disabilities Education Act,
and again when the Act was amended in 2004 (United States Government Printing Office, 2005).
In Canada similar legislation was passed in individual provinces during the same general time
period. The laws guarantee free, appropriate education for children with disabilities of any kind—
whether the impairment is physical, cognitive, emotional, or behavioral. The laws also recognize
that such students need special supports in order to learn or function effectively in a classroom
with non-disabled peers, so they provide for special services (for example, teaching assistants)
and procedures for making individualized educational plans for students with disabilities.
As a result of these changes, most American and Canadian teachers are likely to have at least a
few students with special educational needs, even if they are not trained as special education
teachers or have had no prior personal experience with people with disabilities. Classroom teachers
are also likely to work as part of a professional team focused on helping these students to learn as
well as possible and to participate in the life of the school.
The diversity of modern classrooms is not limited to language or disabilities. Another recent
change has been simply the broadening of the age range of individuals who count as ‘students’. In
many nations of the world, half or most of all three- and four-year-olds attend some form of
educational program, either part-time preschool or full-time child care (National Institute for Early
Education Research, 2006). In North America some public school divisions have moved toward
including nursery or preschool programs as a newer ‘grade level’ preceding kindergarten. Others
have expanded the hours of kindergarten (itself considered a ‘new’ program early in the 20th
century) to span a full-day program.
The obvious differences in maturity between preschoolers and older children lead most teachers
of the very young to use flexible, open-ended plans and teaching strategies, and to develop more
personal or family-like relationships with their young ‘students’ than is typical with older students
(Bredekamp & Copple, 1997). Just as important, though, are the educational and philosophical
issues that early childhood education has brought to public attention. Some educational critics ask
whether preschool and day care programs risk becoming inappropriate substitutes for families.
Other educators suggest, in contrast, that teachers of older students can learn from the flexibility
and open- ended approach common in early childhood education. For teachers of any grade level,
it is a debate that cannot be avoided completely or permanently.
The other end of the age spectrum has also expanded. Many individuals take courses well into
adulthood even if they do not attend formal university or college. Adult education, as it is
sometimes called, often takes place in workplaces, but it often also happens in public high schools
or at local community colleges or universities. Some adult students may be completing high school
credentials that they missed earlier in their lives, but often the students have other purposes that
are even more focused, such as learning a trade-related skill. The teachers of adult students have
to adjust their instructional strategies and relationships with students so as to challenge and respect
their special strengths and constraints as adults (Bash, 2005). The students’ maturity often means
that they have had life experiences that enhance and motivate their learning. But it may also mean
that they have significant personal responsibilities—such as parenting or a full-time job—which
compete for study time, and that make them impatient with teaching that is irrelevant to their
personal goals or needs. These advantages and constraints also occur to a lesser extent among
"regular" high school students. Even secondary school teachers must ask how they can make sure
that instruction does not waste students’ time, and how they can make it truly efficient, effective,
and valuable.
New trend #2: using technology to support learning
For most teachers, "technology" means using computers and the Internet as resources for teaching
and learning. These tools have greatly increased the amount and range of information available to
students, even if their benefits have sometimes been exaggerated in media reports (Cuban, 2001).
With the Internet, it is now relatively easy to access up-to-date information on practically any
subject imaginable, often with pictures, video clips, and audio to accompany them. It would seem
not only that the Internet and its associated technologies have the potential to transform traditional
school-based learning, but also that they have in fact begun to do so.
For a variety of reasons, however, technology has not always been integrated into teachers’
practices very thoroughly (Haertel & Means, 2003). One reason is practical: in many societies and
regions, classrooms contain only one or two computers at most, and many schools have at best
only limited access to the Internet. Waiting for a turn on the computer or arranging to visit a
computer lab or school library limits how much students use the Internet, no matter how valuable
the Internet may be. In such cases, furthermore, computers tend to function in relatively traditional
ways that do not take full advantage of the Internet: as a word processor (a "fancy typewriter"), for
example, or as a reference book similar to an encyclopedia.
Even so, single-computer classrooms create new possibilities and challenges for teachers. A single
computer can be used, for example, to present upcoming assignments or supplementary material
to students, either one at a time or in small groups. Used in this way, the computer gives
students more flexibility about when to finish old tasks or to begin new ones. A single computer
can also enrich the learning of individual students with special interests or motivation and it can
provide additional review to students who need extra help. These changes are not dramatic, but
they lead to important revisions in teachers’ roles: they move teachers away from simply delivering
information to students, and toward facilitating students’ own constructions of knowledge.
A shift from "full-frontal teaching" to "guide on the side" becomes easier as the amount and use of
computer and Internet technologies increases. If a school (or better yet, a classroom) has numerous
computers with full Internet access, then students can in principle direct their own learning more
independently than if computers are scarce commodities. With ample technology available,
teachers can focus much more on helping individuals in developing and carrying out learning
plans, as well as on assisting individuals with special learning problems. In these ways a strong
shift to computers and the Internet can change a teacher’s role significantly, and make the teacher
more effective.
But technology also brings some challenges, or even creates problems. It costs money to equip
classrooms and schools fully: often that money is scarce, and may therefore mean depriving
students of other valuable resources, like additional staff or additional books and supplies. Other
challenges are less tangible. In using the Internet, for example, students need help in sorting out
trustworthy information or websites from the "fluff" websites that are unreliable or even damaging
(Seiter, 2005). Providing this help can sometimes be challenging even for experienced teachers.
Some educational activities simply do not lend themselves to computerized learning—sports, for
example, driver education, or choral practice. As a new teacher, therefore, you will need not only
to assess what technologies are possible in your particular classroom, but also what will actually
be assisted by new technologies. Then be prepared for your decisions to affect how you teach and
the ways you work with students.
New trend #3: accountability in education
In recent years, the public and its leaders have increasingly expected teachers and students to be
accountable for their work, meaning that schools and teachers are held responsible for
implementing particular curricula and goals, and that students are held responsible for learning
particular knowledge. The trend toward accountability has increased the legal requirements for
becoming and (sometimes) remaining certified as a teacher. In the United States in particular,
preservice teachers need more subject-area and education-related courses than in the past. They
must also spend more time practice teaching than in the past, and they must pass one or more
examinations of knowledge of subject matter and teaching strategies. The specifics of these
requirements vary among regions, but the general trend—toward more numerous and "higher"
levels of requirements—has occurred broadly throughout the English-speaking world. The
changes obviously affect individuals' experiences of becoming a teacher, especially the speed
and cost of doing so.
Public accountability has led to increased use of high-stakes tests, which are taken by all
students in a district or region that have important consequences for students’ further education
(Fuhrman & Elmore, 2004). High-stakes tests may influence grades that students receive in courses
or determine whether students graduate or continue to the next level of schooling. The tests are
often a mixture of essay and structured-response questions (such as multiple-choice items), and
raise important issues about what teachers should teach, as well as how (and whether) teachers
should help students to pass the examinations. They also raise issues about whether high-stakes
testing is fair to all students and consistent with other ideals of public education, such as giving
students the best possible start in life instead of disqualifying them from educational opportunities.
Furthermore, since the results of high-stakes tests are sometimes also used to evaluate the
performance of teachers, schools, or school districts, ensuring students' success on them becomes
an obvious concern for teachers—one that affects instructional decisions on a daily basis.
New trend #4: increased professionalism of teachers
Whatever your reactions to the first three trends, it is important to realize that they have contributed
to a fourth trend, an increase in professionalism of teachers. By most definitions, an occupation
(like medicine or law—or in this case teaching) is a profession if its members take personal
responsibility for the quality of their work, hold each other accountable for its quality, and
recognize and require special training in order to practice it.
By this definition, teaching has definitely become more professional than in the past
(Cochran-Smith & Fries, 2005). Increased expectations of achievement by students mean that teachers have
increased responsibility not only for their students’ academic success, but also for their own
development as teachers. Becoming a new teacher now requires more specialized work than in the
past, as reflected in the increased requirements for certification and licensing in many societies
and regions. The increased requirements are partly a response to the complexities created by the
increasing diversity of students and increasing use of technology in classrooms.
Greater professionalism has also been encouraged by initiatives from educators themselves to
study and improve their own practice. One way to do so, for example, is through action
research (sometimes also called teacher research), a form of investigation carried out by teachers
about their own students or their own teaching. Action research studies lead to concrete decisions
that improve teaching and learning in particular educational contexts (Dick, B. 2006).
Question 4:
Social media is a good source of informal learning. Justify by arguments.
Solution:
There are lots of social networking tools with weird-sounding names: blogs, wikis, Twitter (also
known as micro-blogs), Ning, Facebook, and more. Similarly, we hear buzz phrases: learning 2.0,
social media, co-creation, user-generated content, and so on. The question is, what are the real
opportunities?
Things are not getting slower: we are seeing decreasing time to market for products and services,
more information coming in, and fewer resources with which to cope. The rate of disruption in
industries is increasing to the point that it’s almost continuous. The days when you could plan,
adapt, and then execute are mostly behind us.
What we need, going forward, is the ability to take a continuous read on the environment and to
adapt quickly. The nimble organization will be the one that thrives.
The ability to adapt comes both from a good background of theory, and from the ability to problem-
solve and innovate. You need to support learners in communicating and collaborating. That’s
where social learning comes in. The new ideas, the collaborative problem-solving, can be
augmented with tools that provide value even with co-location, but when geographic reach is added
in, the value is even higher.
I’ll first explore the informal learning roles for social media tools, and make the case that social
learning tool skills make sense. Then I’ll explore the formal learning applications of these tools,
concluding that using the tools for formal learning provides a valuable "on-ramp" to their use more
broadly. I’ll focus on five particular tools, but the arguments extend.
Informal learning payoffs in real life
Think of the way people work together in the workplace: they pop over the cubicle to ask a
question, they sit together over a document, they brainstorm around a whiteboard, they hold
meetings, and give presentations. Now, can we support, and augment, that?
Let’s turn it around, and think of some particular activities. We’ll go through several cases, and
for each we’ll look at the benefits, and then see the social media tools that support this.
Making it possible for a group of people to converse means that they can cover issues, solve
problems, debate approaches, ask questions, get thoughtful responses, and more. Someone in the
group can schedule specific topics, or the group members can call for discussion as needed.
E-mail forums are just such a discussion tool. Group members receive questions, and their
responses go back out to the group. Before the World Wide Web, Usenet was a popular and very
useful internet-based discussion system.
We often overlook discussion forums in the excitement of new technologies, but the simple
capabilities of an e-mail list are quite powerful. And anyone interested (and appropriate) can
become a forum member, or opt out, while no one has to figure out just who to send it to. For over
10 years, ITFORUM has been a way for those interested in instructional technology to discuss
current topics, as well as to get and provide help.
Having people work together to craft a statement, document an approach, or generate a response
can be a powerful tool for developing a shared understanding. A team can develop their ideas,
others can review, add, and edit; ultimately the best ideas can coalesce. Managed properly, the
whole is greater than the sum of the parts.
Wikis are collaboration tools. In essence, they are shared editable spaces, where individuals can
access and edit a document in an ongoing process. A wiki can track contributions and history, so
who does what is known, and participants can revisit previous versions. Wikipedia is the poster
child for these tools, but organizations from Intel to the CIA have used them. Collaborative
document services, like Google Docs, are essentially the same as a wiki.
In the old days, this "one place" might have been a manual or a library. When users can find the
tools they need in a reliable place, they don’t waste time searching, or making things up in lieu of
the answer. The estimates are that people spend 15-20% or more of their time searching, and up to
40% of that unsuccessfully. People do prefer self-help, if they can, but if they can’t find what they
need easily, or if there are too many places to search, they’ll use more costly resources such as
phone calls, or worse, just wing it and make mistakes.
The modern-day equivalent of the library is the portal. A well-configured portal provides a place
for people to stash and look for the resources they need. Note that "well-configured" is a rare
quality, and it's all too common to hear "we've got hundreds of resources," only to find that they're
organized in only one way. You can't just let someone handcraft a portal; it requires the same
information architecture that other online resources need. So, doing it by role or task makes much
more sense than doing it by, say, department.
When done right, however, portals are powerful resources for self-help and performance. IBM has
taken it a step further and actually created custom portals, based upon employee roles and tasks.
The person nearest to you, or your boss, may not be the best person to ask! If you have met folks
in the organization, you might know who to go to. If you don’t, you could waste time asking
around. Being able to identify people based upon their knowledge and expertise is powerful both
for getting answers, and for getting collaboration when it’s a new problem. (And the latter is
increasingly going to be the case!)
In knowledge management, the usual way to identify people based on their knowledge and
expertise is an employee "yellow pages," and personal profiles are a common tool to provide this.
Granted, having a system automatically trawl for people's expertise by parsing their e-mail or
documents may be more accurate than what they self-describe, but self-description is also part of
building a culture of trust, and it's much easier. There are additional benefits in allowing people to express not only
their expertise, but also their personality (for instance, the customization of avatars in virtual
worlds).
Personal profiles are a way for individuals to present themselves to the organization. People can
use officially sanctioned tags, but they can also add personal characteristics or interests. This
combination creates a richer picture of the individual, supporting communication and a sense of
support of self-image.
Typically we think of journals as personal, but there can be benefits from sharing reflections.
Recording your thoughts is a valuable way to make them concrete. You probably have experienced
the situation where, by writing something down, you had to work out some details that were missed
when the idea was pure conjecture. Keeping a journal forces you to take time for reflection.
Moreover, if you share your journal, you can get feedback on your thoughts. If a leader keeps a
journal and makes it available, then that person’s employees or peers can follow the leader’s
thoughts, and keep in better touch with where the leader is going. It’s a form of virtual mentorship,
or thinking out loud (an important aspect of learning).
A blog is just such an online journal. It’s a way a person can write their thoughts down and easily
publish them for all to see. Better yet, others can add their own thoughts as comments. It provides
a simple and useful way to share thoughts, progress, etc. Blogging has proved valuable both
internally and outwardly to customers. Similarly, a project, or a product or service team, can update
progress with a blog, and solicit feedback on new ideas. Sun and Oracle are among the companies
exploring blogs.
There are more tools we could discuss, including IM (Instant Messaging) and "micro-blogs" (read:
Twitter and Yammer), but the goal here is to point out some more common business goals and
how these tools augment and/or accelerate them. Some of the emerging tools provide capabilities
that are truly new, and it’s worth getting on top of the old ones to fully comprehend the
opportunities of both. Today's informal learning environment has many of the same characteristics
as the system of the bygone ages. However, with the proliferation of the Internet, learners are
turning to social connections and tools to learn informally.
Question 5:
Discuss the status of literacy in Pakistan compare it with 5 developing and 5 developed
countries. Suggest ways and means to improve the literacy rate.
Solution:
Despite recent improvements, literacy remains a major challenge: it is massively underfunded and
subject to a number of misconceptions, experts said.
The Sustainable Development Goals call for "all youth and a substantial proportion of adults, both
men and women, to achieve literacy and numeracy" by 2030. While youth literacy rates have
jumped in the past 50 years, progress is not fast enough, experts warned.
Approximately 750 million people over the age of 15 still lack basic reading and writing skills.
Two-thirds of these are women, according to the United Nations, with female literacy improving
by just 1 percent since 2000. Sub-Saharan Africa and Southern Asia have the lowest literacy rates,
and the poorest and most marginalized are least likely to be able to read and do basic sums.
HRH Princess Laurentien of the Netherlands gave an opening address at the World Literacy
Summit, calling for literacy to be framed as a "win-win" for everyone, and not simply as an education goal.
"We need to be framing literacy not as an educational issue but [as something] of importance to
the ministry of finance because by helping literacy you help crime, poverty, health issues,
employment issues," she told Devex.
Here are five key takeaways for development from the two-day conference.
1. Remember adult learning
Historically, donor funding for literacy has focused on young school children and has tended to
miss adolescent or adult literacy, according to Katy Newell-Jones of the British Association for
Literacy in Development, or BALID. In the past, literacy programs assumed a trickle-up feeling
"that if we can educate the next generation of children then literacy problems will be solved," she
said, but this has been "proved to be so wrong."
Instead, a holistic approach to literacy is needed, Newell-Jones told Devex, which supports adults,
especially women, to become literate and which also emphasizes the role of learning within the
family, including intergenerational learning and creating a "learning environment in the home."
The theme of adult learning was picked up throughout the conference’s sessions.
2. Teach in the mother tongue
In many developing countries, lessons are taught in English or another nonlocal language, such as
French, from a young age.
In Pakistan, for example, this has resulted in children learning to read English but with very little
comprehension, according to Nadia Naviwala, an adviser to the Citizens Foundation in Pakistan.
"Kids in Pakistan do learn to read English; they just have no comprehension of it," she said. "Is
literacy impeded because it's being done in a language that's not their own?"
Teachers are also often not proficient in the language they are instructing in, according to Ian
Cheffy from BALID.
Instead, children and adults should be learning to read and write in their local languages, he said.
"Parents may be demanding English but let's not ignore local languages," Cheffy said, pointing
out that in sub-Saharan Africa more than 1,700 languages are still regularly spoken by 750 million
people, and of those, 1,100 languages are also being written down. "Let's not marginalize these
supposedly marginal languages," he said.
Nal’ibali Trust, a charity that aims to promote a culture of reading in South Africa, has made
multilingual storytelling the center of its work to drive literacy rates among children. It is crucial
for both readers and listeners that written stories are available in local languages, so they can
understand and enjoy the experience, according to managing director Jade Jacobsohn, who spoke
during the summit.
"Most parents work, and in South Africa they travel a long distance … [so] by the time they get
home they're exhausted," and sitting down to read to their child is "the last thing they want to do,"
she said. In response, Nal'ibali aims to make it as easy as possible for parents to "access the
resources" they need to read to their children. A key component is that books and other materials
are "in a language that the child and the parent understands," she said.
She also stressed the importance of recognizing the role played by grandparents, who tend to have
lower literacy rates but can still offer oral storytelling. "How do you make sure that [grandparents]
know that what they do have is good enough and even if you can't read you can tell a story…
[And] put value in what they are able to do already," she said.
Matthew Johnson from Universal Learning Solutions, a U.K.-based social enterprise working with
governments and donors to improve literacy, agreed that young children can be taught to read
English without comprehending what they are reading.
In order to overcome this, ULS has been piloting an oral storytelling project that enables educators
to teach in both English and their students' mother tongue by "creating stories in mother tongue
and then adding actions so it becomes universally understandable … then transferring that into
English and developing the two side by side," he said.
"The main message is that the more that children hear words — the more they get to experience
stories and tell and share stories — [then] the more language and vocabulary and understanding
they will have," Johnson added.
3. Don’t just hand out books: Foster a love of reading
The emphasis on storytelling in local languages is also key to We Love Reading, an NGO started
in Jordan that aims to foster a love of reading among children by training local volunteers to read
to them. Rana Dajani, the NGO’s founder, told Devex that fostering a love of reading is the first
step to improving literacy but is something that many development programs fail to appreciate,
instead focusing on inputs such as books.
"It's not about giving books; that is secondary, and I have seen books sitting on shelves but not
being used," Dajani said. Instead, it is important to "plant the need and the love of books first,"
which she says leads directly to literacy, as well as a host of other gains, by encouraging a love of
school.
A molecular biologist by training, Dajani was at the summit to pick up an award from the World
Literacy Council, and told Devex that We Love Reading has spread to 36 countries in 10 years
with very little donor funding because of its low-cost, "niche" approach to promoting learning
through reading and storytelling for pleasure, and its use of volunteers. Last year, the NGO secured
funding from UNICEF and has recently begun partnering with international NGOs including Plan
International.
4. Embed literacy into other programs
Standalone literacy programs are not necessarily the best approach, according to Newell-Jones
from BALID, who argued that literacy and numeracy should instead be embedded into community
development projects.
Presenting at the conference, she gave examples of where applying literacy training had led to a
"deeper understanding" of the topic being discussed, and thus to better results. For example, she
described a program to help women secure land rights in Rwanda by training them up as paralegals.
The project was much more effective once the NGO in charge of the project changed the type of
language it was using from legal jargon to "simplified land right laws" in the mother tongue, "so
that the community women could understand." These changes meant "there was a real
understanding of the sensitive topic," but the program is also an example of increasing literacy
levels within a community while not explicitly running literacy classes, Newell-Jones said. It is
something she wants development programmers to do more of, especially for adults.
"Let's get on with life and pull in literacy as we go, and people will develop literacy as they go,"
she said. "They don't have to learn the skills first and apply them [later]." Instead, developers can
take advantage of "hidden literacies" within communities.
5. Use technology — but use it carefully
According to a 2016 analysis of the global literacy sector by United States NGO Results for
Development, donors focus too much on technology at a time when there is a "significant lack of
evidence on what types of technology interventions actually work." Critics, including Princess
Laurentien of the Netherlands, also warn that the digitalization of communication could have
negative impacts on literacy rates.
"We know reading and writing comes through talking, [but] research shows that in this digital age,
through social media, we talk less to each other," she told Devex.
However, Sun Books Uganda, a project by the World Literacy Foundation, which presented during
the summit, offers an example of how technology can help. It provides low-cost, solar-powered
tablets loaded up with "a toolbox of digital books and learning resources to 'off the grid'
classrooms with no internet and electricity." Usually one per classroom, the Sun Books tablets carry
content written in Swahili and English, but Grace Baguma from Uganda's National Curriculum
Development Center, which has recently partnered with the NGO, said the plan is to add more
languages so that mother tongue can be used as the mode of instruction, especially for younger
years.
Word Scientists also presented its work offering free online resources to improve early
reading in Nepal, including lesson guides, teacher tutorials, and books. What is sometimes missed
in ed tech interventions, said chief executive Jacob Bronstein, is the need to focus on the content
and the software as opposed to the technology itself, since "the tech can't do it alone."
Word Scientists has developed materials intended to be engaging and practical, written in local
languages so a teacher can read the story to pupils in their mother tongue before reading it in
English. The "software" is also free to access and can be downloaded onto a USB drive or printed out,
and so does not rely on internet access.
If you live in a developed country, you probably started school at a very young age, where you
learned to read and write. Unfortunately, not all nations have this luxury. In some countries, the
literacy rate, the proportion of people at least 15 years old who can read and write, is meager.
As a whole, the global literacy rate is high. The literacy rate for all males and females that are at
least 15 years old is 86.3%. Males aged 15 and over have a literacy rate of 90%, while females lag
at just 82.7%. Developed nations as a whole have a literacy rate of 99.2%.
Most of the illiterate adults live in South Asia, West Asia, and sub-Saharan Africa. Of all of the
illiterate adults in the world, nearly two-thirds are female. In total, there are about 781 million
adults worldwide that can’t read or write.
One of the lowest literacy rates in the world is found in Niger, where just over 19% of adults can
read and write. More than one-quarter of the males in this nation are literate, while only 11% of
females are literate. Chad follows, with a literacy rate of 22.3%.
Other nations with low literacy rates include:
 Guinea-Bissau (30.4%)
 South Sudan (34.5%)
 Mali (35.5%)
 Central African Republic (37.4%)
 Burkina Faso (41.2%)
 Benin (42.4%)
 Afghanistan (43.0%)
The developed nations of the world have much higher literacy rates with smaller gaps – if any –
between the genders. Thirty-one nations reported literacy rates of 99.0% or above. Four countries
reported 100% literacy among their people:
 Andorra
 Greenland
 North Korea
 Uzbekistan
Asia has been the cradle of numerous civilizations, yet it now accounts for a large share of the
world's illiterate population, and there is a huge disparity between the literacy levels of men and
women. Since Asia is home to some of the biggest economies and developed countries as well as
underdeveloped ones, large differences can be seen in the literacy rates of different Asian
countries. Pakistan, unfortunately, is among the most uneducated nations of Asia. Its literacy rate
of 58.7 per cent is even lower than those of Bangladesh and Nepal, which stand at 61.5 and 64.7
per cent respectively. Sri Lanka and the Maldives have achieved far more remarkable results, with
more than 90 per cent of the population in both countries literate. India exhibits a 74 per cent
literacy rate in spite of its massive population.
As the 21st century begins, the state of education challenges world leaders with the sheer burden
of illiteracy in some countries and inadequate progress in improving the adult literacy rate. The
adult literacy rate is significant for improving employment and business conditions.
In other Asian countries such as Malaysia, Indonesia and Singapore, more than 90 per cent of the
population is literate, whereas Pakistan shows a dismal figure of just 55 per cent adult literacy.
This contrast shows directly that developed countries have a far higher adult literacy rate.
Another problem that hinders the progress of literacy is the amount spent by the government on
the education sector. Asia is the most populous continent in the world, and its burgeoning
countries, such as China, India, Pakistan and Indonesia, find it difficult to direct a large share of
the budget toward education. But in order to move a country toward development, education and
literacy must be a priority. If we evaluate expenditure on education as a percentage of GDP by
country, Pakistan regrettably ranks near the bottom at 2.76%, whereas other developed and
developing countries allot a substantial proportion to the education sector. Even war-torn
Afghanistan has managed to contribute a greater share of its GDP to education: its government
expenditure is 3.9% of GDP. India and Iran have each allotted 3.8% of GDP. Sri Lanka's
spending on education, at 2.8%, is close to Pakistan's.
Pakistan has achieved much in terms of literacy, but its progress is still far behind its neighbours,
especially Iran. Iran had a considerably lower literacy rate than Pakistan in the 1950s, but it has
now reached a remarkable figure near 90%. Mohammad Reza Shah Pahlavi, the ruler of Iran from
1941 to 1979, kept education a prime focus, devoting a weighty portion of his country's oil profits
to education, health care and infrastructure.
Despite its huge population, India's literacy is much better than Pakistan's: India's literacy rate is
more than 74%, while Pakistan's is 58%, leaving India miles ahead. This gap also trickles down to
the gender disparity in education.