Dialogue systems II - University of Manchester

Dialogue systems II
Understanding speakers' intentions; cooperative responses

Speakers' intentions
Speech acts and illocutionary force (Austin 1962, Searle 1969).
Traditionally, the meanings of utterances were expressed in terms of truth conditions.
Austin distinguished utterances as:
- Constative: as above
- Performative: utterances constituting verbal actions per se
  "Can you pass the salt?"
  "It's too hot in here."

Types of verbal act
- Locution: the literal use of an utterance
- Illocution: the intended meaning of the utterance (hence "illocutionary force")
- Perlocution: what is achieved by the utterance
The fun starts when locution and illocution do not match.

Can we characterize the circumstances in which a given locution should (or should not) have a given illocutionary force?

Types of illocutionary act
- Assertives: stating, claiming, reporting, announcing
- Directives: ordering, requesting, demanding, begging
- Commissives: promising, offering
- Expressives: thanking, apologizing, congratulating
- Declarations: naming a ship, resigning, sentencing, dismissing, marrying

Examples of illocutionary act
- Assertives: bare declaratives, "I'd like to report that …", "Did you know that …"
- Directives: imperatives, "Can I have …", "I'd like …", "Can you …"

- Commissives: "I promise …", "Would you like …", "Can I give you …"
- Expressives: "Thank you", "(I'm) sorry", "Well done", "I'd like to say well done"
- Declarations: "I resign", "You're fired", "I declare this supermarket open", "OK, that's enough" (cricket declaration), "j'adoube", "Pax", "Barley"

Mismatches
"Can you X?" / "Can I X?" are apparently yes/no questions about ability or permission, but may be:
- Directive: "Can you pass the salt?"
- Commissive: "Can I give you a lift?"
- Expressive: "Can I just say well done?"
Ambiguity: "Can you tell me if Mr Smith was on that plane?"
Context:
- "Why have you taken your shirt off?" "It's too hot in here."
- "Yes, Joe, what do you want?" "It's too hot in here."

Beliefs and intentions
In order to correctly understand the speech act involved, we must know about:
- the literal meaning (propositional content)
- the conventions associated with certain constructions
- the conditions/context in which it was uttered
- the beliefs/intentions of the speaker
E.g. if there's no way of making it cooler, "It's too hot" is just a statement.

Understanding dialogues
A. Will you have another beer?
B. Well, I'm driving.
"No" is understood.

The statement is added as an explanation. How do you know that "no" is implied?
A. Do you want some coffee?
B. Coffee would keep me awake.
Does B mean yes or no? It is ambiguous, because being kept awake could be good or bad, depending on the circumstances.

Understanding dialogues
A. I'd like to buy a ticket to London.
B. Which train do you want to get?
A. Well, my meeting is at 2 o'clock.

A's response is a perfectly reasonable answer to the question, though it assumes (or begs) knowledge of where the meeting is in relation to the station. The assumptions and beliefs of both parties have to be modelled; in a real application, this can only be done within a restricted domain.

Cooperative responses
The other side of the coin: in any dialogue, responses generally express what the speaker thinks the hearer wants to know.

From the analysis point of view, we need to understand why a given response is appropriate and what it means. Equally, when generating dialogue, responses need to be cooperative.

Grice's maxims
H. P. Grice (1975): principles of cooperative dialogue.
Maxim of Quantity:
- Make your contribution to the conversation as informative as necessary.
- Do not make your contribution to the conversation more informative than necessary.
Maxim of Quality:
- Do not say what you believe to be false.
- Do not say that for which you lack adequate evidence.
Maxim of Relevance:
- Be relevant (i.e., say things related to the current topic of the conversation).
Maxim of Manner:
- Avoid obscurity of expression.
- Avoid ambiguity.
- Be brief (avoid unnecessary wordiness).
- Be orderly.

Maxim of quantity
"Maggie ate some of the chocolate" implies she didn't eat it all, because otherwise you would say so. Logically, no such implication can be made.
A. What time is the next train to London?
B. The next train is 10.15. It arrives at 12.45.
The information given is more than was asked for: B guesses that this is the sort of information A might also need, and so offers it unsolicited.

Maxim of quantity
A. What films are there on this afternoon?
B. [Lists all the films showing at all 12 screens]
This does answer the question literally, but the maxim tells you that listing all the films is too much information. Better to say "There are 12 screens, shall I list everything?", or to narrow down the search: "What time?" or "What genre?"
A. Who is enrolled in LELA10011?

B. There are 145 students enrolled in that course. Shall I list them?

Maxim of quality
A. How many first-year students are taking LELA20032?
B. None.
There is an obvious requirement to tell the truth, but this includes the whole truth:
A. How many first-year students are taking LELA20032?
B. None, because it's a second-year course.
This kind of informative answer could be triggered by recognizing that the original question implies some mistaken assumption about the data.

Maxim of quality
A. What is the capital of Edinburgh?
B1. I don't know.
B2. Edinburgh is a city: only countries have capitals.
A. How much is a ticket to London?
B. £63.
A. Does the train stop at Watford?
B. Yes. If you want to get off at Watford, it's only £60.
A. Is The Exorcist showing this half term?
B. Yes, but it's a 15 (certificate).
A. Did Everton win this weekend?

B1. No.
B2. No, they drew.
B3. No, they didn't play.

Maxim of relevance
The answer must be relevant. But how do we determine relevance? We need a model of the speaker's and hearer's beliefs, and of the domain. It works both ways: an answer that seems to be irrelevant must be interpreted as if it obeyed this maxim.

A. Did Everton win this weekend?
B. It was an international break.

When maxims are disobeyed
Deceit, long-windedness, irrelevance, obscurity and taciturnity are all part of natural communication. They may be used for rhetorical effect: an apparent lie might be sarcasm, obscurity may cover up a lack of knowledge, and an irrelevant answer may reveal a mismatch in mutual belief systems.
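The Maxim of Quantity lends itself to a small sketch: when a literal answer would be over-informative, summarize and offer the detail instead, as in the enrolment exchange earlier ("There are 145 students… Shall I list them?"). This is only an illustration: the threshold value and the response wording are assumptions invented for the example, not part of any real system.

```python
def cooperative_answer(items: list[str], noun: str, threshold: int = 10) -> str:
    """Answer a 'who/what are the X?' question without flouting the
    Maxim of Quantity: list everything only when the list is short."""
    if len(items) <= threshold:
        # The full list is as informative as necessary, and no more.
        return ", ".join(items)
    # Listing everything would be over-informative; summarize instead
    # and let the hearer decide whether to ask for the detail.
    return f"There are {len(items)} {noun}. Shall I list them?"

print(cooperative_answer(["Ann", "Ben", "Cat"], "students"))
# → Ann, Ben, Cat
print(cooperative_answer([f"student {i}" for i in range(145)],
                         "students enrolled in that course"))
# → There are 145 students enrolled in that course. Shall I list them?
```

A real system would also need the Maxim of Relevance: choosing which summary to give (count, genre, screen) requires the model of speaker and hearer beliefs discussed above.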

Conversation machines
These stand in contrast to purposeful dialogue systems, and often involve little or no genuine linguistic processing. They are quite good fun, and interesting to look at as a demonstration of what can be done without genuine understanding or processing.

Conversation machines
Early prototypes: ELIZA, PARRY. They worked by paraphrasing the input, or by recognizing keywords:
A. I have been feeling very worried lately.
B. You have been feeling very worried lately?
A. My mother doesn't love me any more.
B. Tell me more about your family.
A. I am very nervous.
B. Why are you very nervous?
Other examples: Racter, ALICE, Verbot, … Easy to trick.
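The keyword-and-paraphrase mechanism behind the exchanges above can be sketched in a few lines. This is a hedged illustration, not Weizenbaum's actual ELIZA: the patterns, the reflection table, and the default response are all invented for the example.

```python
import re

# Reflections swap first- and second-person words so that a matched
# fragment can be echoed back at the speaker ("I am X" -> "you are X").
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# (pattern, response template) pairs, tried in order. {0} is filled
# with the reflected text captured by the first group.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why are you {0}?"),
    (re.compile(r"i have been (.*)", re.I), "You have been {0}?"),
    (re.compile(r"my (mother|father|sister|brother)", re.I),
     "Tell me more about your family."),
]

def reflect(fragment: str) -> str:
    """Swap pronouns word by word; words not in the table pass through."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(utterance: str) -> str:
    utterance = utterance.strip().rstrip(".!")
    for pattern, template in RULES:
        match = pattern.match(utterance)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."  # default when no keyword matches

print(respond("I am very nervous."))                   # Why are you very nervous?
print(respond("My mother doesn't love me any more."))  # Tell me more about your family.
```

"Easy to trick" falls out immediately: there is no understanding here, only surface pattern matching, so "I am a teapot" gets the same earnest "Why are you a teapot?" as a genuine confession.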
