I've been awfully busy programming lately. My Django-based side project is coming along well and I hope to have it ready for use in a few weeks. Please don't ask more about it; that's really all I can say for now. Anyway, I came across an interesting little math problem today and was hoping some skilled programmers out there could come up with a more elegant solution than mine.

Problem: Star Ratings

People can rate cheeseburgers on my website with a star rating of 0-5 stars (whole stars only), 5 being mighty tasty and 0 being disgusting. I would like to show the average of everyone's ratings of a particular cheeseburger, rounded to the nearest half star. I have already calculated the sum of all the ratings as a float (star_sum) and the total number of people who rated the particular cheeseburger (num_raters). The result should be stored as a float in a variable named "stars."

My Solution (in Python):

# round the average to one decimal place and
# separate it into whole and fractional parts
parts = str(round(star_sum / num_raters, 1)).split('.')
whole = int(parts[0])
frac = int(parts[1])

if frac < 3:
    frac = 0
elif frac > 7:
    frac = 0
    whole += 1
else:
    frac = 5

# recombine for a star rating rounded to the half
stars = float(str(whole) + '.' + str(frac))
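For comparison, the string round-trip can be avoided entirely: doubling the average, rounding to the nearest whole number, and halving snaps the value to the nearest 0.5. A minimal sketch of that idea, using made-up example values for star_sum and num_raters:

star_sum = 17.0    # hypothetical sum of all ratings
num_raters = 5     # hypothetical number of raters

# Doubling, rounding, and halving rounds the average
# to the nearest half star in one arithmetic step.
stars = round(star_sum / num_raters * 2) / 2   # 3.4 -> 3.5

One caveat: Python 3's round() sends exact halves to the nearest even integer, so an average that lands precisely on a quarter (e.g. 2.25) snaps down to 2.0 rather than up to 2.5; for display purposes that's usually acceptable.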

Mmmm… In-N-Out Burgers… Please reply if you've got a better solution.