Question: An experienced cashier at a grocery store takes 2 seconds to scan each item and 40 seconds to process the customer's payment.

Anna bought 31 items at the grocery store. There was a glitch in the computer system that caused the processing of the payment to take longer than usual. If the total time for Anna's transaction took 2 minutes, how much time did the processing of the payment (p) take?
Answer: the processing of the payment took 58 seconds.
Step-by-step explanation:
An experienced cashier at a grocery store takes 2 seconds to scan each item and 40 seconds to process the customer's payment. If Anna bought 31 items at the grocery store, the total time for Anna's transaction would normally be
(31 × 2) + 40 = 62 + 40 = 102 seconds.
The time to process Anna's payment took longer than usual because of the glitch, but scanning the 31 items still took 31 × 2 = 62 seconds. If the total time for Anna's transaction was 2 minutes = 120 seconds, then the amount of time it took to process the payment (p) would be

p = 120 - 62 = 58 seconds.
You can also set this up with basic algebra. Here I'll use o for the number of items and p for the payment time. First find how long it took to scan all the items: each item takes 2 seconds, so multiply the total number of items (o) by two, e.g. o × 2 = 31 × 2 = 62 seconds, which is 1 minute 2 seconds. Then subtract the scanning time from the total transaction time (2 minutes = 120 seconds) to get the payment time: p = 120 - 62 = 58 seconds, i.e. 2:00 - 1:02 = 0:58.
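As a quick check of the arithmetic, here is a minimal Python sketch; the variable names (scan_seconds_per_item, num_items, total_seconds) are my own and not part of the problem statement:

# Solve for the payment-processing time p in the equation
#   total = scan_seconds_per_item * num_items + p
scan_seconds_per_item = 2    # seconds to scan one item
num_items = 31               # items Anna bought
total_seconds = 2 * 60       # total transaction time: 2 minutes = 120 seconds

scanning_time = scan_seconds_per_item * num_items  # 62 seconds
p = total_seconds - scanning_time                  # payment-processing time

print(f"Scanning took {scanning_time} s; payment processing took {p} s.")
# Output: Scanning took 62 s; payment processing took 58 s.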