
A simple reflex agent takes an action based only on the current environmental situation; it maps the current percept to a proper action, ignoring the history of percepts. The mapping can be done with a simple lookup table or with any rule-based matching algorithm. An example of this class is a robotic vacuum cleaner that deliberates in an infinite loop: each percept contains the state of the current location, [clean] or [dirty], and accordingly the agent decides whether to [suck] or [continue-moving].
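
A minimal sketch of this rule-based mapping in Python (the percept format and the exact rules are illustrative assumptions, not a prescribed implementation):

def simple_reflex_vacuum_agent(percept):
    """Map the current percept directly to an action, ignoring history."""
    location, status = percept          # e.g. ("A", "dirty")
    if status == "dirty":
        return "suck"
    return "continue-moving"

# Example run over a fixed sequence of percepts.
for percept in [("A", "dirty"), ("A", "clean"), ("B", "dirty")]:
    print(percept, "->", simple_reflex_vacuum_agent(percept))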
A model-based reflex agent needs memory for storing the percept history; it uses that history to help reveal the currently unobservable aspects of the environment. An example of this agent class is the vision system of a self-driving vehicle, where it is necessary to check the percept history to fully understand how the world is evolving.
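
A minimal sketch of the same vacuum-cleaner world with an internal model, assuming a small fixed set of locations and a dictionary as the remembered world state (both are illustrative assumptions):

LOCATIONS = ["A", "B", "C"]

class ModelBasedVacuumAgent:
    def __init__(self):
        self.model = {}                  # remembered status of each location

    def act(self, percept):
        location, status = percept
        self.model[location] = status    # fold the new percept into the model
        if status == "dirty":
            return "suck"
        # Use the percept history: prefer a location not yet known to be clean,
        # even though it is not observable right now.
        for loc in LOCATIONS:
            if self.model.get(loc) != "clean":
                return f"move-to {loc}"
        return "stop"                    # the whole known world is clean

agent = ModelBasedVacuumAgent()
for percept in [("A", "clean"), ("B", "dirty"), ("B", "clean"), ("C", "clean")]:
    print(percept, "->", agent.act(percept))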

A goal-based agent has a goal and a strategy to reach that goal, and all actions are taken to reach this goal. More precisely, from the set of possible actions, it selects the one that improves progress towards the goal (not necessarily the best one). An example of this agent class is any searching robot that has an initial location and wants to reach a destination.
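
A minimal sketch of such a searching robot on a grid, assuming Manhattan distance as the measure of progress towards the goal (the goal position and move set are illustrative assumptions):

GOAL = (3, 3)
MOVES = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def distance(pos):
    return abs(pos[0] - GOAL[0]) + abs(pos[1] - GOAL[1])

def goal_based_agent(position):
    """Choose any move that brings the robot closer to the goal."""
    for action, (dx, dy) in MOVES.items():
        new_pos = (position[0] + dx, position[1] + dy)
        if distance(new_pos) < distance(position):
            return action, new_pos
    return "stop", position              # already at the goal

pos = (0, 0)
while pos != GOAL:
    action, pos = goal_based_agent(pos)
    print(action, "->", pos)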
A utility-based agent is like the goal-based agent, but with a measure of how "happy" an action would make it rather than the goal-based binary feedback ['happy', 'unhappy']. This kind of agent provides the best solution. An example is a route recommendation system that finds the 'best' route to reach a destination.
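
A minimal sketch of such a route recommender, where the candidate routes and the utility function (penalising travel time and tolls) are made-up illustrative assumptions:

routes = {
    "highway":  {"time_min": 25, "toll": 3.0},
    "downtown": {"time_min": 40, "toll": 0.0},
    "scenic":   {"time_min": 55, "toll": 0.0},
}

def utility(route):
    """Higher is better: penalise travel time and toll cost."""
    return -(route["time_min"] + 10 * route["toll"])

best = max(routes, key=lambda name: utility(routes[name]))
print("recommended route:", best)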
A learning agent is an agent capable of learning from experience. It has the capability of automatic information acquisition and integration into the system. Any agent designed and expected to be successful in an uncertain environment is considered a learning agent.
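
A minimal sketch of learning from experience, assuming a hypothetical uncertain environment with fixed reward probabilities and a simple epsilon-greedy strategy with running value estimates (none of these specifics come from the text):

import random

ACTIONS = ["suck", "continue-moving"]
value = {a: 0.0 for a in ACTIONS}        # learned estimate of each action's value
count = {a: 0 for a in ACTIONS}

def reward(action):
    # Hypothetical uncertain environment: "suck" pays off more often.
    return 1.0 if random.random() < (0.8 if action == "suck" else 0.3) else 0.0

for step in range(1000):
    # Explore occasionally, otherwise exploit the best estimate so far.
    if random.random() < 0.1:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=value.get)
    r = reward(action)
    count[action] += 1
    value[action] += (r - value[action]) / count[action]    # incremental average

print(value)                             # "suck" should end up rated higher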
