Algorithm Problem: Add Two Unsigned Integers

By | December 21, 2012

To add two unsigned integers, A and B, we can first work through a binary addition by hand.
Let A = 5 and B = 3, that is:

 0101 A
+0011 B
-------
 1000

Notice that if we ignore the carry out of each digit, we get:

 0101 A
+0011 B
-------
 0110 C

Here C is just A ^ B.

If we consider only the carries, we get:

 0101 A
+0011 B
-------
 0001 D

The following equation then holds: A + B = C + (D << 1), and D is just A & B.
So we can conclude that A + B = (A ^ B) + ((A & B) << 1).
We can apply this identity repeatedly until B becomes zero, or until the most significant bit of the carry word A & B is 1 (in which case A + B exceeds the range of unsigned int).
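As a quick sanity check, here is a minimal sketch (not part of the original program below) that traces the identity on A = 5, B = 3, printing each rewriting step until the carry word becomes zero:

// Minimal sketch: trace A + B = (A ^ B) + ((A & B) << 1) on A = 5, B = 3.
#include <iostream>

int main() {
    unsigned int a = 5, b = 3;
    while (b != 0) {
        unsigned int sum   = a ^ b;        // digit-wise sum without carries
        unsigned int carry = (a & b) << 1; // carries shifted into place
        std::cout << a << " + " << b << " = " << sum << " + " << carry << "\n";
        a = sum;
        b = carry;
    }
    std::cout << "result: " << a << "\n";  // prints 8
    return 0;
}

Running it prints 5 + 3 = 6 + 2, then 6 + 2 = 4 + 4, 4 + 4 = 0 + 8, 0 + 8 = 8 + 0, and finally result: 8.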



Code


//
//  main.cpp
//  AddTwoUnsingedInteger
//
//  Created by Qiu Xiangyu on 12-12-21.
//  Copyright (c) 2012 Qiu Xiangyu. All rights reserved.
//

#include <iostream>
using namespace std;

unsigned int add(unsigned int i0, unsigned int i1) {
    // Mask for the most significant bit of an unsigned int.
    static const unsigned int hMask = 0x1u << (sizeof(unsigned int) * 8 - 1);
    while (i1 != 0) {
        unsigned int base = i0 ^ i1;  // digit-wise sum without carries
        unsigned int pro  = i0 & i1;  // carry bits, not yet shifted
        if (pro & hMask) {
            // Overflow: the carry would be shifted out of the most significant bit.
            return 0;
        }
        i0 = base;
        i1 = pro << 1;
    }
    return i0;
}

int main(int argc, const char * argv[])
{
    unsigned int uMax = ~0u;
    // uMax - 1 + 2 exceeds the unsigned range, so add() returns its overflow sentinel 0.
    cout << add(uMax - 1, 2) << endl;
    return 0;
}
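The same identity can also be written recursively. The following is a short sketch under the same overflow convention as add() above (returning 0 on overflow); the name addRec is just for illustration and is not part of the original post:

// Recursive sketch of the same idea (returns 0 on overflow, like add above).
unsigned int addRec(unsigned int a, unsigned int b) {
    if (b == 0) return a;                 // no carries left: a holds the sum
    static const unsigned int hMask = 0x1u << (sizeof(unsigned int) * 8 - 1);
    unsigned int carry = a & b;
    if (carry & hMask) return 0;          // carry would be shifted out: overflow
    return addRec(a ^ b, carry << 1);     // recurse on sum-without-carry and shifted carry
}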
