Ancient mariners used to gauge how fast their ship was moving by throwing a piece of wood or another buoyant object over the vessel’s bow, then counting the time that elapsed before the stern passed the object. This method was known as a Dutchman’s log. By the late 16th century, sailors had begun using a chip log to measure speed. In this method, knots were tied at uniform intervals along a length of rope, and one end of the rope, with a pie-slice-shaped piece of wood (or “chip”) attached to it, was tossed behind the ship. As the vessel moved forward, the line was allowed to pay out freely for a fixed amount of time, typically measured with a small hourglass. Afterward, the number of knots that had gone over the ship’s stern was counted and used to calculate the vessel’s speed. A knot came to mean one nautical mile per hour; a ship traveling at 15 knots, therefore, could cover 15 nautical miles in an hour.
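To make the arithmetic behind the chip log concrete, here is a minimal sketch of the calculation. The 47-foot-3-inch knot spacing and the 28-second sandglass are commonly cited values for later chip logs, not figures from the account above, so treat them as illustrative assumptions.

```python
# Chip-log speed estimate: a minimal sketch.
# Assumed values (not from the text above): knots spaced 47 ft 3 in apart,
# timed against a 28-second sandglass.

FEET_PER_NAUTICAL_MILE = 6_076   # international nautical mile, in feet
SECONDS_PER_HOUR = 3_600

KNOT_SPACING_FT = 47.25          # assumed spacing between knots on the log line
GLASS_SECONDS = 28               # assumed running time of the sandglass

def chip_log_speed(knots_counted: int) -> float:
    """Return speed in nautical miles per hour from the knots paid out in one glass."""
    distance_nm = knots_counted * KNOT_SPACING_FT / FEET_PER_NAUTICAL_MILE
    hours = GLASS_SECONDS / SECONDS_PER_HOUR
    return distance_nm / hours

if __name__ == "__main__":
    # With this spacing and glass, each knot counted works out to very nearly
    # one nautical mile per hour, which is why the two became synonymous.
    print(f"5 knots counted -> {chip_log_speed(5):.2f} nautical miles per hour")
```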
For many years, nations disagreed about the exact length of a nautical mile, which is based on the Earth’s circumference and corresponds to one minute of latitude. In 1929, the international nautical mile was standardized at 1,852 meters (about 6,076 feet); the United States adopted this definition in 1954. A nautical mile differs from a mile on land, which is based on walking distance. The Romans first defined the land mile as 1,000 paces, or pairs of steps; it was set at its current length of 5,280 feet by English statute under Queen Elizabeth I in 1593.
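As a quick illustration of how the two miles relate, the sketch below converts a speed in knots to land (statute) miles per hour using the foot values given above; the 15-knot figure mirrors the example in the first paragraph.

```python
# Convert a speed in knots to statute miles per hour,
# using the foot values cited in the text.

FEET_PER_NAUTICAL_MILE = 6_076
FEET_PER_STATUTE_MILE = 5_280

def knots_to_mph(knots: float) -> float:
    """Knots (nautical miles per hour) to statute miles per hour."""
    return knots * FEET_PER_NAUTICAL_MILE / FEET_PER_STATUTE_MILE

print(f"15 knots is about {knots_to_mph(15):.1f} statute miles per hour")  # roughly 17.3
```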