A program executes 100 billion instructions. It executes on an IBM processor that has an average CPI of 1.2 and a clock frequency of 4.0 GHz. How many seconds does the program take to execute? What is the cycle time of this IBM processor? Assume that an ARM processor takes 30 seconds to execute the program. What is the speedup provided by the IBM processor, relative to the ARM processor?
How many seconds does the program take to execute? Answer: 30 seconds
What is the cycle time of this IBM processor? Answer: 2.5x10^-10 seconds
What is the speedup provided by the IBM processor, relative to the ARM processor? Answer: none (speedup = 1)
Explanation:
CPI is cycles per instruction; it tells you, on average, how many clock cycles a processor takes to execute one instruction.
1) Clock cycle time = Ct = T = 1/F -> T = 1/(4.0 GHz) = 2.5x10^-10 seconds, therefore Ct = 2.5x10^-10 seconds
2) Number of clock cycles needed to execute 100 billion instructions = CI -> CI = 1.2 * 100 billion = 120 billion clock cycles
3) Execution time = Et = Ct*CI = 2.5x10^-10 * 120x10^9 = 30 seconds
As you can see from our calculation, the IBM processor provides no speedup relative to the ARM processor (speedup = 30/30 = 1), because both processors take 30 seconds to execute the program.
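The steps above can be sketched as a short script; the variable names are my own, and all input values come from the problem statement:

```python
# Inputs from the problem statement
instructions = 100e9   # 100 billion instructions
cpi = 1.2              # average cycles per instruction (IBM processor)
freq_hz = 4.0e9        # 4.0 GHz clock frequency
arm_time = 30.0        # ARM processor's execution time in seconds (given)

# Step 1: clock cycle time is the inverse of the clock frequency
cycle_time = 1 / freq_hz            # 2.5x10^-10 seconds per cycle

# Step 2: total clock cycles = CPI * instruction count
cycles = cpi * instructions         # 120 billion cycles

# Step 3: execution time = cycle time * total cycles
exec_time = cycle_time * cycles     # 30 seconds

# Speedup of IBM relative to ARM = ARM time / IBM time
speedup = arm_time / exec_time      # 1.0, i.e. no speedup

print(f"cycle time: {cycle_time} s")
print(f"execution time: {exec_time} s")
print(f"speedup: {speedup}")
```

Since both processors finish in 30 seconds, the speedup ratio comes out to 1, which is why the answer is "none."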