L = {<M₁,M₂,s> : |s| > the length of every string in L(M₁) - L(M₂)}. For example, suppose L(M') = {abc, ε} and L(M'') = {abc, defghi}. Then the set difference L(M') - L(M'') = {ε} (that is, ε is the only string in L(M') but not in L(M'')), and <M',M'',a> ∈ L since |a| = 1 > |ε| = 0. Prove that L ∉ SD by a reduction from ¬H. Do not use the variables s, M₁, M₂, M', or M'' in your proof. Remember that the input to your mapping reduction function R is <M,w>, something that could be in ¬H, while R's output is something that could be in L. By convention, x is the input to any TM defined by R. The supposed Oracle semidecides whether R(<M,w>) ∈ L.
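One way the reduction can be set up (a sketch only, under assumptions: the machine names T₁ and T₂ below are illustrative choices not given in the problem): have R output two machines whose language difference is either Σ* or ∅, depending on whether M halts on w, and pair them with the empty string.

```latex
R(\langle M, w \rangle) = \langle T_1, T_2, \varepsilon \rangle, \text{ where}
\begin{align*}
T_1 &= \text{``on input } x\text{: run } M \text{ on } w\text{; if it halts, accept } x\text{''} \\
T_2 &= \text{``on input } x\text{: reject''}
\end{align*}

\text{If } M \text{ halts on } w: \quad L(T_1) - L(T_2) = \Sigma^* - \emptyset = \Sigma^*,
\text{ which contains } \varepsilon, \text{ and } |\varepsilon| \not> |\varepsilon|,
\text{ so } R(\langle M,w\rangle) \notin L.

\text{If } M \text{ does not halt on } w: \quad L(T_1) - L(T_2) = \emptyset - \emptyset = \emptyset,
\text{ so the condition holds vacuously and } R(\langle M,w\rangle) \in L.
```

Under this construction, ⟨M,w⟩ ∈ ¬H iff R(⟨M,w⟩) ∈ L, so if the supposed Oracle semidecided L, composing it with R would semidecide ¬H, contradicting ¬H ∉ SD.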