A computer is rendering two scenes; after 14 minutes, both scenes have the same amount rendered.
As given in the question:
The computer has rendered 540 thousand pixels (540 kpx) of a kitchen scene and renders another 90 kpx each minute.
The expression for the kitchen scene after t minutes is 540 + 90t.
The computer has rendered 960 kpx of a garden scene and renders another 60 kpx each minute.
The expression for the garden scene after t minutes is 960 + 60t.
The scenes have the same amount rendered when the two expressions are equal:
540 + 90t = 960 + 60t
Collect like terms
90t - 60t = 960 - 540
⇒ 30t = 420
⇒ t = 420/30
⇒t = 14
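The steps above can be checked with a short script. This is a minimal sketch in plain Python, solving 540 + 90t = 960 + 60t directly and confirming both scenes reach the same pixel count:

```python
# Break-even time: 540 + 90t = 960 + 60t  =>  t = (960 - 540) / (90 - 60)
t = (960 - 540) / (90 - 60)
print(t)  # 14.0 minutes

# Both expressions should give the same rendered amount at t = 14.
kitchen = 540 + 90 * t
garden = 960 + 60 * t
print(kitchen, garden)  # 1800.0 1800.0
```

At t = 14 minutes, each scene has 1,800 kpx rendered, matching the algebraic solution.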
Therefore, after 14 minutes the two scenes have the same amount rendered (1,800 kpx each).
Each CD costs 8.25.
Answer:
No
Step-by-step explanation:
One hour lasts 60 minutes, so two hours last 120 minutes.
The movie lasts more than 120 minutes, so she doesn't have enough time.