We shall discuss various areas of common sense knowledge.
1. The most salient common sense knowledge concerns situations that change in time as a result of events. The most important events are actions, and for a program to plan intelligently, it must be able to determine the effects of its own actions.
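As a concrete illustration of this kind of knowledge, the following sketch represents the effect of an action as a function from situations to situations, in the spirit of the situation calculus; the fluents and action names are invented for illustration and are not taken from MYCIN.

\begin{verbatim}
# A minimal sketch of representing the effect of an action as a function
# from situations to situations.  The fluents ("infection", "fever") and
# the action names are illustrative assumptions, not part of MYCIN.

def result(action, situation):
    """Return the new situation that results from performing the action."""
    new = dict(situation)              # treat situations as immutable snapshots
    if action == "give_antibiotic":
        new["infection"] = False       # assumed effect of the action
    elif action == "give_antipyretic":
        new["fever"] = False
    return new

s0 = {"infection": True, "fever": True}
s1 = result("give_antibiotic", s0)
print(s1)                              # {'infection': False, 'fever': True}
\end{verbatim}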
Consider the MYCIN domain as an example. The situation with which MYCIN deals includes the doctor, the patient and the illness. Since MYCIN's actions are advice to the doctor, full planning would have to include information about the effects of MYCIN's output on what the doctor will do. Since MYCIN doesn't know about the doctor, it might instead plan in terms of the effects of the course of treatment on the patient. However, it doesn't do this either. Its rules give the recommended treatment as a function of the information elicited about the patient, but MYCIN makes no prognosis of the effects of the treatment. Of course, the doctors who provided the information built into MYCIN considered the effects of the treatments.
Ignoring prognosis is possible because of the specific narrow domain in which MYCIN operates. Suppose, for example, a certain antibiotic had the precondition for its usefulness that the patient not have a fever. Then MYCIN might have to make a plan for getting rid of the patient's fever and verifying that it was gone as a part of the plan for using the antibiotic. In other domains, expert systems and other AI programs have to make plans, but MYCIN doesn't. Perhaps if I knew more about bacterial diseases, I would conclude that their treatment sometimes really does require planning and that lack of planning ability limits MYCIN's utility.
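The antibiotic example suggests the kind of planning involved: an action has a precondition, and a plan must first achieve and verify that precondition. The following toy planner is only a sketch under assumed action descriptions; the action names, preconditions and effects are hypothetical.

\begin{verbatim}
# A toy planner over actions with preconditions and effects (STRIPS-like).
# The action descriptions are hypothetical: "give_antibiotic" is assumed
# to require the absence of fever, which "give_antipyretic" achieves.

ACTIONS = {
    "give_antipyretic": {"pre": set(),        "add": {"no_fever"}},
    "give_antibiotic":  {"pre": {"no_fever"}, "add": {"cured"}},
}

def plan(goal, state, depth=4):
    """Naive forward search for a sequence of actions achieving the goal."""
    if goal <= state:
        return []
    if depth == 0:
        return None
    for name, act in ACTIONS.items():
        if act["pre"] <= state:
            new_state = state | act["add"]
            if new_state != state:
                rest = plan(goal, new_state, depth - 1)
                if rest is not None:
                    return [name] + rest
    return None

print(plan({"cured"}, set()))
# ['give_antipyretic', 'give_antibiotic']
\end{verbatim}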
The fact that MYCIN doesn't give a prognosis is certainly a limitation. For example, MYCIN cannot be asked on behalf of the patient or the administration of the hospital when the patient is likely to be ready to go home. The doctor who uses MYCIN must do that part of the work himself. Moreover, MYCIN cannot answer a question about a hypothetical treatment, e.g. ``What will happen if I give this patient penicillin?'' or even ``What bad things might happen if I give this patient penicillin?''.
2. Various formalisms are used in artificial intelligence for representing facts about the effects of actions and other events. However, all systems that I know about give the effects of an event in a situation by describing a new situation that results from the event. This is often enough, but it doesn't cover the important case of concurrent events and actions. For example, if a patient has cholera, then while the antibiotic is killing the cholera bacteria, the damage to his intestines is causing a loss of fluids that is likely to be fatal. Inventing a formalism that will conveniently express people's common sense knowledge about concurrent events is a major unsolved problem of AI.
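The difficulty can be made concrete. If an event simply maps one situation to the next, then two processes that in fact run concurrently, such as the antibiotic killing the bacteria while fluid loss continues, must be forced into some arbitrary order. The sketch below, with invented fluents and numbers, shows how the answer then depends on that arbitrary serialization.

\begin{verbatim}
# In the usual formalism an event maps one situation to the next.  The
# fluents and numbers here are invented; the point is that the two
# concurrent processes (killing the bacteria, losing fluids) must be
# serialized, and the answer then depends on the arbitrary order chosen.

def result(event, s):
    s = dict(s)
    if event == "give_antibiotic":
        s["bacteria"] = 0              # treated as an instantaneous effect
    elif event == "an_hour_passes":
        if s["bacteria"] > 0:
            s["fluids"] -= 10          # cholera damage keeps draining fluids
    return s

s = {"bacteria": 100, "fluids": 50}
print(result("an_hour_passes", result("give_antibiotic", s)))
# {'bacteria': 0, 'fluids': 50}
print(result("give_antibiotic", result("an_hour_passes", s)))
# {'bacteria': 0, 'fluids': 40}
\end{verbatim}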
3. The world is extended in space and is occupied by objects that change their positions and are sometimes created and destroyed. The common sense facts about this are difficult to express but are probably not important in the MYCIN example. A major difficulty is in handling the kind of partial knowledge people ordinarily have. I can see part of the front of a person in the audience, and my idea of his shape uses this information to approximate his total shape. Thus I don't expect him to stick out two feet in back even though I can't see that he doesn't. However, my idea of the shape of his back is less definite than that of the parts I can see.
4. The ability to represent and use knowledge about knowledge is often required for intelligent behavior. What airline flights there are to Singapore is recorded in the issue of the International Airline Guide current for the proposed flight day. Travel agents know how to book airline flights and can compute what they cost. An advanced MYCIN might need to reason that Dr. Smith knows about cholera, because he is a specialist in tropical medicine.
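A fragment of such knowledge about knowledge might be represented as facts about where information is recorded and who knows what, together with simple inference rules such as ``a specialist knows the topics of his specialty''. The following sketch is only illustrative; every particular fact in it is an assumption.

\begin{verbatim}
# A sketch of facts about where information is recorded and who knows
# what, with one simple inference rule.  Every particular fact here is an
# illustrative assumption.

specialty   = {"Smith": "tropical_medicine"}
covers      = {"tropical_medicine": {"cholera", "malaria"}}
recorded_in = {"flights_to_Singapore": "International Airline Guide"}

def knows_about(person, topic):
    """Assume a specialist knows the topics his specialty covers."""
    field = specialty.get(person)
    return field is not None and topic in covers.get(field, set())

print(knows_about("Smith", "cholera"))       # True
print(recorded_in["flights_to_Singapore"])   # where to look, not the answer itself
\end{verbatim}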
5. A program that must co-operate or compete with people or other programs must be able to represent information about their knowledge, beliefs, goals, likes and dislikes, intentions and abilities. An advanced MYCIN might need to know that a patient won't take a bad tasting medicine unless he is convinced of its necessity.
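One way such facts enter planning is as preconditions that mention the other agent's beliefs, so that the plan may have to include an action whose only effect is to change those beliefs. The sketch below assumes hypothetical predicates, actions and a hypothetical medicine for the bad-tasting-medicine example.

\begin{verbatim}
# A sketch of a precondition involving another agent's beliefs: the
# patient takes a bad-tasting medicine only if he believes it necessary,
# so the plan may need a step that changes his beliefs.  The predicates,
# actions and the medicine are hypothetical.

def will_take(medicine, patient):
    return not medicine["bad_tasting"] or patient["believes_necessary"]

def plan_treatment(medicine, patient):
    steps = []
    if not will_take(medicine, patient):
        steps.append("explain_necessity_to_patient")   # acts on beliefs, not the body
        patient = dict(patient, believes_necessary=True)
    steps.append("prescribe " + medicine["name"])
    return steps

patient = {"believes_necessary": False}
print(plan_treatment({"name": "quinine", "bad_tasting": True}, patient))
# ['explain_necessity_to_patient', 'prescribe quinine']
\end{verbatim}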
6. Common sense includes much knowledge whose domain overlaps that of the exact sciences but differs from it epistemologically. For example, if I knock this glass of water off the podium, everyone knows that the glass will break and the water will spill. Everyone knows that this will take a fraction of a second and that the water will not splash even ten feet. However, this information is not obtained by using the formula for a falling body or the Navier-Stokes equations governing fluid flow. We don't have the input data for the equations, most of us don't know them, and we couldn't integrate them fast enough to decide whether to jump out of the way. This common sense physics is contiguous with scientific physics. In fact scientific physics is embedded in common sense physics, because it is common sense physics that tells us what the equations mean. If MYCIN were extended to be a robot physician, it would have to know common sense physics and maybe also some scientific physics.
It is doubtful that the facts of the common sense world can be represented adequately by production rules. Consider the fact that when two objects collide they often make a noise. This fact can be used to make a noise, to avoid making a noise, to explain a noise or to explain the absence of a noise. It can be used not only in specific situations involving a noise but also to understand general phenomena, e.g. should an intruder step on the gravel, the dog will hear it and bark. A production rule embodies a fact only as part of a specific procedure. Typically production systems match facts about specific objects, e.g. a specific bacterium, against a general rule and derive a new fact about those objects.
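The contrast can be illustrated by a single declaratively stored fact, ``a collision often makes a noise'', put to three different uses: prediction, explanation and avoidance. The representation below is deliberately simplistic and the particular events are invented; a production rule would instead wire the fact into one fixed procedure.

\begin{verbatim}
# One declaratively stored fact -- a collision often makes a noise --
# used for prediction, explanation and avoidance.  The string-matching
# representation is deliberately simplistic and the events are invented.

FACT = ("collision", "noise")          # cause, effect

def predict(events, fact=FACT):
    """Forward use: will there be a noise?"""
    cause, effect = fact
    return effect if any(cause in e for e in events) else None

def explain(observation, candidates, fact=FACT):
    """Abductive use: which candidate events could explain the noise?"""
    cause, effect = fact
    return [e for e in candidates if observation == effect and cause in e]

def avoid(unwanted, planned, fact=FACT):
    """Planning use: drop planned events that would produce the noise."""
    cause, effect = fact
    return planned if unwanted != effect else [e for e in planned if cause not in e]

print(predict(["collision(footstep, gravel)"]))                  # noise
print(explain("noise", ["wind", "collision(dog, gate)"]))        # ['collision(dog, gate)']
print(avoid("noise", ["collision(cart, wall)", "walk(path)"]))   # ['walk(path)']
\end{verbatim}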
Much present AI research concerns how to represent facts in ways that permit them to be used for a wide variety of purposes.