Answer:
Here is the Python program:
COOKIES_PER_BAG = 40        # constant: number of cookies in a bag
SERVINGS_PER_BAG = 10       # constant: number of servings in a bag
CALORIES_PER_SERVING = 300  # constant: calories in one serving
cookies = int(input("How many cookies did you eat? "))  # prompts the user for the number of cookies eaten
totalCalories = cookies * (CALORIES_PER_SERVING / (COOKIES_PER_BAG / SERVINGS_PER_BAG))  # computes total calories consumed
print("Total calories consumed:", totalCalories)  # displays the computed total
Explanation:
The algorithm is:
- Start
- Declare constants COOKIES_PER_BAG, SERVINGS_PER_BAG and CALORIES_PER_SERVING
- Set COOKIES_PER_BAG to 40
- Set SERVINGS_PER_BAG to 10
- Set CALORIES_PER_SERVING to 300
- Input cookies
- Calculate totalCalories: totalCalories ← cookies * (CALORIES_PER_SERVING / (COOKIES_PER_BAG / SERVINGS_PER_BAG))
- Display totalCalories
I will explain the program with an example:
Let's say the user enters 5 as the number of cookies he or she ate, so
cookies = 5
Now the total calories are computed as:
totalCalories = cookies * (CALORIES_PER_SERVING / (COOKIES_PER_BAG / SERVINGS_PER_BAG))
This becomes:
totalCalories = 5 * (300 / (40 / 10))
totalCalories = 5 * (300/4)
totalCalories = 5 * 75
totalCalories = 375
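For this input, the program would display the following (Python's / operator yields a float, so the result prints with a decimal point):
How many cookies did you eat? 5
Total calories consumed: 375.0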
Answer:
Peopleware refers to the human element of computer technology, one of its three core aspects alongside hardware and software.
Answer:
A. Student Name, Instructor, Course, Date
Explanation:
Begin one inch from the top of the first page, flush with the left margin. Type your name, your instructor's name, the course number, and the date on separate lines, double-spacing between each. Double-space once more and center the title.
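For illustration only (the names below are placeholders, not from the original answer), the heading lines would look like this, followed by the centered title on the next double-spaced line:
Jane Doe
Professor Smith
English 101
15 March 2024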
Answer:
The managed service you should use if you want to do a lift and shift of an existing Hadoop cluster without having to rewrite your Spark code is Google Cloud's Dataproc.
What is a lift and shift of Hadoop clusters?
It means migrating an existing Hadoop and Spark deployment to Google Cloud without having to re-architect it.
Dataproc is the managed service to use for this lift and shift because it provides a fast, flexible way to run the existing Hadoop and Spark workloads on managed infrastructure without code changes.
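As an illustrative sketch only (not part of the original answer), assuming the google-cloud-dataproc Python client and placeholder project, region, cluster, and jar names, submitting an existing Spark job unchanged to a Dataproc cluster might look roughly like this:
# Illustrative sketch: submit an existing Spark jar to a Dataproc cluster.
# The project ID, region, cluster name, jar path, and class name are placeholders.
from google.cloud import dataproc_v1

region = "us-central1"
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "my-migrated-cluster"},    # placeholder cluster name
    "spark_job": {
        "main_class": "org.apache.spark.examples.SparkPi",   # existing Spark class, unchanged
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
    },
}

operation = job_client.submit_job_as_operation(
    request={"project_id": "my-project", "region": region, "job": job}
)
result = operation.result()  # waits for the job to finish
print(result.driver_output_resource_uri)  # location of the job's driver output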