So, what is the case in which he uses the fewest coins? Can he use 1 coin? No, there is no 35-cent coin. Can he use 2 coins? Yes! He can use a 25-cent coin plus a 10-cent coin.
Now, the largest number of coins means using the coins of the smallest value, so only 5-cent coins. How many would that be?
We have to divide:
35 / 5 = 7
So we would use 7 coins.
And the difference is 5: 7 - 2 = 5.
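If you want to double-check this, here is a minimal brute-force sketch in Python. It assumes the only available denominations are nickels, dimes, and quarters (the ones the answer above uses); it simply tries every combination that adds up to 35 cents and reports the smallest and largest coin counts:

```python
from itertools import product

TARGET = 35          # total in cents, from the problem
COINS = [5, 10, 25]  # assumed available denominations

# Try every possible count of each coin type and keep the
# combinations whose values sum to exactly 35 cents.
counts = []
max_counts = [TARGET // c for c in COINS]
for combo in product(*(range(m + 1) for m in max_counts)):
    if sum(n * c for n, c in zip(combo, COINS)) == TARGET:
        counts.append(sum(combo))

print(min(counts))                # 2  -> one quarter + one dime
print(max(counts))                # 7  -> seven nickels
print(max(counts) - min(counts))  # 5
```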
To answer this question we first have to determine how many times $100 goes into $13,300:
$13,300 / $100 = 133
Now we simply multiply the result by $2.74 to get the monthly payment:
$2.74 × 133 = $364.42
Finally, we multiply that by the total number of payments (one payment per month, 12 months a year, for 4 years):
$364.42 × (12 × 4) = $364.42 × 48 = $17,492.16
The correct answer is choice 4.) $17,492.16
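As a quick sanity check, here is the same arithmetic as a short Python sketch. The $13,300 principal, the $2.74-per-$100 monthly rate, and the 4-year term are taken from the problem as quoted above:

```python
principal = 13_300   # amount financed, in dollars
rate_per_100 = 2.74  # monthly payment per $100 financed
years = 4            # term of the loan

units_of_100 = principal / 100                 # 133.0
monthly_payment = rate_per_100 * units_of_100  # 364.42
total_paid = monthly_payment * 12 * years      # 17,492.16

print(f"${monthly_payment:,.2f} per month")  # $364.42 per month
print(f"${total_paid:,.2f} total")           # $17,492.16 total
```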
I hope this helps. Regards.
Answer:
I am unable to solve this problem, sorry.
The answer would be C because, if you look closely, the function is -1 and 4, which determines the perspective and how the procedure goes.