The game "Pennies in the Ring" is often played by bored computer programmers who have gotten tired of playing solitaire. The objective is to see how many pennies can be put into a circle. The circle is drawn on a grid, with its center at the coordinate (0, 0). A single penny is placed on every integer grid coordinate (e.g., (1, 1), (2, 0), etc.) that lies within or on the circle. It's not a very exciting game, but it's very good for wasting time. Your goal is to calculate how many pennies are needed for a circle with a given radius.
The input is a sequence of positive integer values, one per line, where each integer is the radius of a circle. You can assume the radius will be less than or equal to . The end of the input is indicated by a line containing 0. You may assume that the grid is large enough for two pennies to be on adjacent integer coordinates and not touch.
You are to output, each on its own line, the number of pennies needed for each circle. You do not need to output anything for the final 0. You may assume that the number of pennies needed is less than 2 billion (which is only $20 million: computer scientists have lots of money).
Sample Input
2
3
4
0
Output for Sample Input
13
29
49
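Not part of the official statement, but one way to sketch a solution: for each integer x from -r to r, the pennies in that column are the integer y values with y² ≤ r² - x², of which there are 2·⌊√(r² - x²)⌋ + 1. The function name `pennies` below is an illustrative choice, not from the problem.

```python
import math
import sys

def pennies(r):
    """Count integer points (x, y) with x*x + y*y <= r*r."""
    total = 0
    for x in range(-r, r + 1):
        # Largest |y| allowed in this column; isqrt avoids float rounding.
        y_max = math.isqrt(r * r - x * x)
        total += 2 * y_max + 1
    return total

def main():
    for line in sys.stdin:
        r = int(line)
        if r == 0:  # terminator: no output for the final 0
            break
        print(pennies(r))

if __name__ == "__main__":
    main()
```

Using integer arithmetic throughout (`math.isqrt` rather than `math.sqrt`) keeps the count exact even near the 2-billion limit. On the sample radii 2, 3, and 4 this reproduces 13, 29, and 49.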