Title:                   Mathematics at DEC
Moderator:               RUSURE::EDP
Created:                 Mon Feb 03 1986
Last Modified:           Fri Jun 06 1997
Last Successful Update:  Fri Jun 06 1997
Number of topics:        2083
Total number of notes:   14613
211.0. "the two-nots problem" by SPRITE::OSMAN () Fri Jan 18 1985 13:44
This is another of my favorites from HAKMEM. The problem is to design
a black box that negates three binary inputs with only two NOT (~) gates,
and as many ANDs (*) and ORs (+) as you like.
Certainly, if you were allowed three NOT gates, your black box would
take inputs A, B, and C, merely compute ~A, ~B, and ~C, and you'd be done.
(In the following discussion, I'll denote AND as multiplication, so
"ab" means a AND b, and OR as addition, so "ab+c" means
(a AND b) OR c. [Note that precedence is inherited from arithmetic:
AND binds tighter than OR.])
I've heard it suggested that a way of doing it with only two NOTs might start
by designing signals for things like ~3 or 2, meaning "not all three
inputs are 1" and "exactly two inputs are 1," etc.
Then, we could express ~A as
~A = 0 + 1(B + C) + 2BC, read as "NOT A can be inferred from NO
inputs being 1, or from exactly ONE input being 1 and that input
being B or C, or from exactly TWO inputs being 1 with both B and C present."
and by symmetry, the rest is solved. Namely
~B = 0 + 1(A + C) + 2AC
~C = 0 + 1(A + B) + 2AB
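(Aside: reading 0, 1, and 2 as "exactly zero/one/two of A, B, C are 1,"
the identity for ~A can be checked by brute force over all eight input
combinations. The sketch below is only an illustration of that reading;
the names in it are mine, not part of the original note.)

    from itertools import product

    def exactly(k, bits):
        # counting signal: 1 when exactly k of the inputs are 1
        return int(sum(bits) == k)

    for a, b, c in product((0, 1), repeat=3):
        e0 = exactly(0, (a, b, c))   # the signal called "0" above
        e1 = exactly(1, (a, b, c))   # the signal called "1"
        e2 = exactly(2, (a, b, c))   # the signal called "2"
        # ~A = 0 + 1(B + C) + 2BC, with + as OR and juxtaposition as AND
        not_a = e0 | (e1 & (b | c)) | (e2 & b & c)
        assert not_a == 1 - a, (a, b, c)
    print("~A = 0 + 1(B + C) + 2BC holds for all 8 input combinations")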
The problem, of course, is to design 0, 1, and 2 from just two NOTs. If
this isn't possible, maybe primitives like ~0, ~1, etc. will work instead.
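(A second aside, again not from the original note: one quick way to see
why 0, 1, and 2 are the hard part is that circuits built from AND and OR
alone compute only monotone functions, and of the four counting signals
only 3, which is just ABC, is monotone. A small brute-force check of that
claim, with names of my own choosing:)

    from itertools import product

    def monotone(f):
        # AND/OR-only circuits compute monotone functions: raising any
        # input from 0 to 1 can never drop the output from 1 to 0
        pts = list(product((0, 1), repeat=3))
        return all(f(*x) >= f(*y)
                   for x in pts for y in pts
                   if all(xi >= yi for xi, yi in zip(x, y)))

    counts = {
        "0 (exactly zero 1's)": lambda a, b, c: int(a + b + c == 0),
        "1 (exactly one 1)":    lambda a, b, c: int(a + b + c == 1),
        "2 (exactly two 1's)":  lambda a, b, c: int(a + b + c == 2),
        "3 (all three 1's)":    lambda a, b, c: int(a + b + c == 3),
    }

    for name, f in counts.items():
        print(name, "-> monotone (AND/OR-buildable):", monotone(f))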