A computer program uses 3 bits to represent integers. When the program adds the decimal (base 10) numbers 5
and 3, the result is 0. Which of the following is the best explanation for the result?
Approach
We need to represent 5+3=8. With only 3 bits, the highest unsigned decimal number we can represent is 2^3 - 1 = 7. Since 8 exceeds this maximum, the addition overflows: 8 in binary is 1000, which requires 4 bits, so keeping only the low 3 bits leaves 000, which is 0. The best explanation is therefore an overflow error, because the true sum is too large to be represented in 3 bits.
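A minimal sketch of this behavior in Python, assuming unsigned 3-bit arithmetic where any carry out of the top bit is simply discarded (the function name add_3bit is illustrative, not from the question):

```python
def add_3bit(a, b):
    """Add two integers and keep only the low 3 bits,
    mimicking an unsigned 3-bit register (values 0..7)."""
    return (a + b) & 0b111  # mask discards any carry out of bit 2

print(add_3bit(5, 3))  # 5 + 3 = 8 = 0b1000; low 3 bits are 000, so this prints 0
```

Masking with 0b111 models how fixed-width hardware drops bits beyond its register width, which is exactly why the program reports 0 instead of 8.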