Mike's computer can perform 1 million calculations in one second. Moore's Law says that computer performance doubles every 2 years.
Mike always upgrades his computer to the latest technology, so according to Moore's Law his computer will do 4 million calculations per second in 4 years.
To run the popular video game Crysis, Mike needs a computer that can do 3.84 billion calculations per minute. In how many years will Mike's computer be able to run the video game?
We should first convert the given rate of 3.84 billion calculations per minute into millions of calculations per second, the units used in the initial description of Moore's Law.
$$\frac{3.84\text{ billion calculations}}{1\text{ minute}} \cdot \frac{1{,}000\text{ million calculations}}{1\text{ billion calculations}} \cdot \frac{1\text{ minute}}{60\text{ seconds}} = 64\text{ million calculations per second}$$
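As a quick sanity check, here is a minimal Python sketch of that unit conversion (the variable names are my own, not part of the problem):

```python
# Convert 3.84 billion calculations per minute to millions per second.
billion_per_minute = 3.84
million_per_minute = billion_per_minute * 1_000  # 1 billion = 1,000 million
million_per_second = million_per_minute / 60     # 1 minute = 60 seconds
print(million_per_second)  # 64.0
```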
We can construct a simple exponential model or use a table. Below I demonstrate both options:
Exponential Model (initial value $a = 1$, growth factor $b = 2$, final value $y = 64$)

$$y = ab^x$$

Since performance doubles every 2 years, $x$ here counts 2-year doubling periods, not years.

$$64 = 1(2)^x$$

$$64 = 2^x$$

$$2^6 = 2^x$$

$$x = 6 \text{ doubling periods} \implies 6 \times 2 = 12 \text{ years}$$
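The same calculation can be done numerically with a logarithm. A short Python sketch, assuming the starting and target speeds from above:

```python
import math

# Solve 64 = 2**x for x, the number of 2-year doubling periods,
# then convert periods to years.
initial = 1   # million calculations per second now
target = 64   # million calculations per second needed for Crysis
periods = math.log2(target / initial)  # 6.0 doubling periods
years = 2 * periods                    # one doubling every 2 years
print(periods, years)  # 6.0 12.0
```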
Table
| Year | 0 | 2 | 4 | 6 | 8 | 10 | 12 |
|---|---|---|---|---|---|---|---|
| Million calculations per second | 1 | 2 | 4 | 8 | 16 | 32 | 64 |
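If you prefer to generate the table rather than fill it in by hand, a small Python loop produces the same rows (one doubling per 2-year period is my reading of the problem statement):

```python
# Build the doubling table: speed doubles once every 2 years, starting at 1.
for period in range(7):
    year = 2 * period
    speed = 2 ** period  # million calculations per second
    print(f"Year {year:2d}: {speed} million calculations per second")
```

Either way, Mike's computer reaches 64 million calculations per second, enough to run Crysis, in 12 years.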