A person borrows $10,000 and repays the loan at the rate of $2,400 per year. The lender charges interest of 10% per year. Assuming the payments are made continuously and interest is compounded continuously (a pretty good approximation to reality for long-term loans), the amount M(t) of money (in dollars) owed t years after the loan is made satisfies the differential equation

    dM/dt = (1/10)M − 2400

and the initial condition

    M(0) = 10000.

(a) Solve this initial-value problem for M(t).

M(t) =

(b) How long does it take to pay off the loan? That is, at what time t is M(t) = 0? Give your answer (in years) in decimal form with at least 3 decimal digits.

     years
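As a sanity check on a worked solution, the sketch below (an assumption, not part of the original problem) solves the linear ODE dM/dt = M/10 − 2400 with M(0) = 10000 analytically and computes the payoff time numerically:

```python
import math

# dM/dt = M/10 - 2400 is linear with constant coefficients.
# Its equilibrium is M* = 24000, so the general solution is
#   M(t) = 24000 + C * exp(t/10),
# and the initial condition M(0) = 10000 gives C = -14000.
def M(t):
    """Amount owed (dollars) t years after the loan is made."""
    return 24000 - 14000 * math.exp(t / 10)

# Payoff time: M(t) = 0  =>  exp(t/10) = 24000/14000 = 12/7,
# so t = 10 * ln(12/7).
t_payoff = 10 * math.log(12 / 7)

print(M(0))                 # 10000.0 (matches the initial condition)
print(f"{t_payoff:.3f}")    # payoff time in years, 3 decimal places
```

The closed-form answer t = 10 ln(12/7) ≈ 5.390 years can be confirmed by checking that M(t_payoff) returns 0 to within floating-point error.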