
Question
(b) Consider the following context-free grammar, partially augmented with semantic
rules:
    S       → NP VP           {NP.sem(VP.sem)}
    NP      → Det Nominal     {Det.sem(Nominal.sem)}
    Nominal → Adj Nominal     {Adj.sem(Nominal.sem)}
    Det     → every           {...}
    Det     → a               {...}
    Adj     → small           {...}
    Nominal → dog             {...}
    VP      → barks           {...}
Now consider the following two sentences, with their desired meaning representations:

A1  every dog barks
A2  ∀d Dog(d) ⇒ ∃e Barking(e) ∧ Barker(e, d)
B1  a small dog barks
B2  ∃d Dog(d) ∧ Small(d) ∧ ∃e Barking(e) ∧ Barker(e, d)
Complete the above augmented grammar by giving the semantics of 'every', 'a', 'small', 'dog', and 'barks', so that the desired meaning representations are obtained for A1 and B1. Show that your solution is correct by first giving the lambda expressions obtained from A1 and B1 before any beta reductions, and then showing step by step that the beta reductions lead to A2 and B2.
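
As a quick sanity check of how the semantic attachments compose, here is a minimal Python sketch. The lexical entries it uses are standard Montague-style assumptions (one plausible assignment, not necessarily the expected answer); each entry is a lambda that builds a formula string, so ordinary function application plays the role of beta reduction.

    # Illustrative lexical semantics (assumed, Montague-style), as Python lambdas
    # that build formula strings; function application stands in for beta reduction.
    every = lambda P: lambda Q: f"∀d {P('d')} ⇒ {Q('d')}"
    a     = lambda P: lambda Q: f"∃d {P('d')} ∧ {Q('d')}"
    dog   = lambda x: f"Dog({x})"
    small = lambda P: lambda x: f"{P(x)} ∧ Small({x})"      # Adj.sem applied to Nominal.sem
    barks = lambda x: f"∃e Barking(e) ∧ Barker(e, {x})"

    # A1 "every dog barks":  S.sem = NP.sem(VP.sem) = Det.sem(Nominal.sem)(VP.sem)
    print(every(dog)(barks))
    # -> ∀d Dog(d) ⇒ ∃e Barking(e) ∧ Barker(e, d)             (matches A2)

    # B1 "a small dog barks": Nominal.sem = Adj.sem(Nominal.sem) = small(dog)
    print(a(small(dog))(barks))
    # -> ∃d Dog(d) ∧ Small(d) ∧ ∃e Barking(e) ∧ Barker(e, d)  (matches B2)

Running the script prints the two formulas above, mirroring the composition order given by the grammar's semantic rules; the intermediate, unreduced lambda expressions and the explicit beta-reduction steps still need to be written out by hand as the question requires.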