[Image: patchbay for an old analog computer]

Scientific Computing (P05 Spring, 2004)


 

Time to a Computer

If you haven't heard the joke about time to a pig, or wouldn't mind hearing it again, here it is.


Time to a Pig

A farmer is picking apples when his creditor appears suspended in foliage and
says "Hi ya! We've hired a consultant (with your money) to improve the work
product efficiency of your methodology. Meet our consultant!"

The consultant appears in a clearing under the tree in a monkey suit with a
Naugahyde case from which he produces a clipboard.

"Hello Mister . . ." he says with airy grating superiority.   The farmer leaves the
blank blank. The consultant fills the silence with pencil scratching, pausing now and
again to fix the farmer and tree in duplicitous quizzlingness.
 

All the time the creditor is grinning and nodding approvingly with staccato jabs of his chin.
Finally the consultant produces a pig:

"What y'all are needing here is a specimen of these," he says. . . .

"What for?" would be the polite response, but the farmer says nothing.
So the creditor asks it: "What for?"

"Wa-hell-ya..." says the consultant, taking off his jacket and rolling up his sleeves,
"you grab the pig by the back legs, like this, and hold it up into the tree, like this . . .
and the pig picks the apples with its mouth."

To this awkward and arduous display the creditor volunteers, "Pigs are pretty smart,
you know!" shaking his head so violently that apples are jumping of their own accord.

The farmer is incredulous: "Isn't that kind of a waste of time . . . for the pig, of course?"

The consultant's practiced retort: "What's time to a pig?"

In graduate school, the implication was that the creditor was an administrator and the
farmer was the faculty member whose writing on the blackboard could be improved by hoisting
a teaching assistant (graduate student) with chalk in his or her mouth by the hind legs
and . . . well, what's time to a graduate student?
 

    So what's time to a computer? At the microprocessor level, computers operate on binary digits called bits.
A bit in the computer is a little packet of electric charge which either exists (1) or doesn't exist (0).
    The number 13 (which in base 10 is 1*10^1 + 3*10^0) is represented in base 2 with 4 bits, as shown below.

    Bit operations, the ways of combining two bits, are a natural measure of time on a computer, especially since most of what a computer does comes down to these bit operations (there is some administrative duty,
like resource allocation, but it requires much less activity).

     13 = (1101)_2 = 1*2^3 + 1*2^2 + 0*2^1 + 1*2^0  requires 4 bits,
while 16 = 2^4 = (10000)_2 requires 5.  A moment's reflection is enough to convince one that the number of bits required to represent N is one more than the largest exponent that can be applied to 2 to produce a number not exceeding N, that is, the number of bits is the greatest integer less than or equal to 1 + log_2(N).

    For example, the greatest integer less than or equal to

1 + log_2(13) = 1 + 3.7 = 4.7

is 4. The greatest integer less than or equal to

1 + log_2(16) = 1 + 4 = 5

is 5.

Using [ ] to represent "the greatest integer less than or equal to," we have
that it takes  k = 1 + [log_2(N)]  bits to represent N.
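
As a quick check, here is a minimal Python sketch (mine, not part of the course materials) comparing the formula k = 1 + [log_2(N)] with the bit count Python reports directly:

    import math

    def bits_needed(n):
        """Bits required to represent n, via k = 1 + [log2(n)]."""
        return 1 + math.floor(math.log2(n))

    for n in (13, 16, 100):
        # bin(n) gives e.g. '0b1101'; n.bit_length() counts its bits directly
        print(n, bin(n), bits_needed(n), n.bit_length())

For n = 13 both methods report 4 bits, and for n = 16 both report 5, matching the examples above.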

           010110
          +100101
           111011

    Bit Operation Defined.

      This requires starting with the right-most column and
      completing the following steps 6 times:

      1. Look at the top and bottom bits and whether or not there's a carry.

      2. If both bits are 0 and there's no carry, put down a zero and move to the left.

      3. If either both bits are zero and there is a carry or only one bit is zero
        and there is no carry, put a 1 down and continue to the left.

      4. If either only one of the bits is 1 and there is a carry or both of the bits
        are 1 and there is no carry, put a zero down, put a carry in the next
        column and move on to the left.

      5. If both bits are 1 and there is a carry, put down a 1 and
        put a carry in the next column and move on.

    Following this procedure once is called a bit operation.
Adding two k-bit numbers requires k bit operations.
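
The five rules above are easy to turn into code. Here is a Python sketch (the function name add_bits is my own) that adds two bit strings column by column, counting one bit operation per column:

    def add_bits(a, b):
        """Add two bit strings right to left, following the five rules above.
        Returns the sum as a bit string and the number of bit operations."""
        a, b = a.zfill(len(b)), b.zfill(len(a))   # pad to a common length k
        carry, ops, out = 0, 0, []
        for top, bottom in zip(reversed(a), reversed(b)):
            total = int(top) + int(bottom) + carry  # 0, 1, 2, or 3
            out.append(str(total % 2))              # the bit to put down
            carry = total // 2                      # the carry to the next column
            ops += 1                                # one bit operation per column
        if carry:
            out.append('1')
        return ''.join(reversed(out)), ops

    print(add_bits('010110', '100101'))  # ('111011', 6): the example above

Adding the two 6-bit numbers from the earlier example takes exactly 6 bit operations, as claimed.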

    How much computer time (in bit operations) does it take
    to multiply a k-bit number by an m-bit number, k <= m?

      15*59, for example, looks like this in binary:

         111011
       x   1111
       --------
         111011
        111011-
       111011--
    + 111011---
    -----------
     1101110101

 

     In general, multiplying by a k-bit number produces at most k rows (when its bits are all 1's).
Here k = 4 and m = 6. Multiplying by 1111 gives the maximum
of four rows to add, with each row a copy of the top number shifted to the
left from  0  to  k-1  places.

In terms of bit operations, this requires k-1 additions of numbers with
at most m+k-1 bits each, for a total of at most
(k-1)*(m+k-1) <= k*m + k^2 <= 2*m^2 bit operations.
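
A shift-and-add sketch in Python (my illustration, not code from the course) makes the row count concrete: each 1 bit of the k-bit multiplier contributes one shifted copy of the m-bit number, and the rows are combined with at most k-1 additions:

    def multiply_bits(x, y):
        """Multiply by shift-and-add, as in the worked example above."""
        rows = [x << shift                       # a copy of x shifted left
                for shift, bit in enumerate(reversed(bin(y)[2:]))
                if bit == '1']
        total = 0
        for row in rows:                         # at most k rows, k-1 additions
            total += row
        return total

    print(bin(multiply_bits(0b111011, 0b1111)))  # 0b1101110101, i.e. 885 = 59*15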

Now consider the cost of computing N! by repeated multiplication.
Start with 2*3 = 3! and then multiply this by 4 to get 4!, and so on,
multiplying   j!  by  (j+1)  to get  (j+1)!  until you get to N!

The most time-consuming multiplication is the last one
which produces the binary digits in N!

Note that the number of bits in a product is either the sum of the bits
in the two factors or 1 less than this sum. In the preceding example,
6-bit and 4-bit factors produce a 10-bit product.
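
This bit-count fact is easy to verify by brute force (a Python sketch of mine):

    # The product of a p-bit and a q-bit number has p+q or p+q-1 bits.
    for a in range(1, 200):
        for b in range(1, 200):
            s = a.bit_length() + b.bit_length()
            assert (a * b).bit_length() in (s, s - 1), (a, b)
    print("bit-count rule holds for all factors tested")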

Thus if N is a k-bit integer then each factor in N! has no more than
k bits so that N! has no more than k*N bits.

Therefore, in each of the N-2 multiplications needed to compute N!
we multiply an integer with at most N*k bits (j!) by an integer with at
most k bits (j+1), requiring on the order of (N*k)*k = N*k^2 bit operations.

This needs to be done N-2 times, so we have at most

     (N-2)*N*k^2 <= N^2*k^2

bit operations to compute N!
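
The bound is generous but easy to test. A Python sketch (names mine) computes N! by the scheme above and compares bit counts against the N*k bound:

    import math

    def factorial_by_steps(N):
        """Compute N! as in the text: 2*3 = 3!, then multiply by 4, 5, ..., N."""
        fact = 2 * 3                       # one multiplication gives 3!
        for j in range(4, N + 1):          # N - 3 more, so N - 2 in total
            fact *= j
        return fact

    N = 20
    k = N.bit_length()                     # k = 1 + [log2(N)] = 5 for N = 20
    f = factorial_by_steps(N)
    print(f == math.factorial(N))          # True: sanity check
    print(f.bit_length(), "bits in N!, versus the bound N*k =", N * k)

For N = 20, N! has 62 bits while the bound gives N*k = 100, so the estimate is safe but not tight.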

So why is it important what time is to a computer?

Because people don't much like waiting around for computers to compute.
I know I don't; that's why every year or so I'm willing to plunk down another
couple grand to double my CPU speed.

Beyond that, it's a matter of individual, corporate and government security that
some computations are not feasible in a short amount of time. The RSA cryptosystem
depends on the inability of even the fastest computers to factor large integers,
and elliptic curve cryptosystems on their inability to compute discrete logarithms,
in a short amount of time - less than 100 years, say.
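
The asymmetry can be felt even at toy sizes. In the Python sketch below (parameters invented for illustration; real systems use numbers hundreds of digits long), computing y = g^x mod p is one fast call, while recovering x from y by brute force means stepping through the group one exponent at a time:

    p, g = 104729, 5       # a small prime and base: toy parameters only
    x = 87654              # the secret exponent
    y = pow(g, x, p)       # fast: modular exponentiation is cheap

    # Brute-force discrete log: try exponents until g^e = y (mod p).
    # At cryptographic sizes this loop would run longer than we care to wait.
    e, acc = 0, 1
    while acc != y:
        acc = acc * g % p
        e += 1
    print(e)               # an exponent with g^e = y (mod p), found only because p is tiny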

To understand more about these cryptosystems we need
some concepts from modular arithmetic and congruences.