What is time?
I define time this way:

time ≡ c

In physics, a lowercase "c" indicates the speed of light. The triple-bar sign (≡) is often used to indicate a definition. Thus, what my definition of time is saying is that what has been called "time" and what has been called "the speed of light" are defined to be the same thing.

Time is a specific, constant quantity of motion that we use to measure all other motions. Some people don't believe that, but it is true. Just as you and I use a ruler of constant length to measure other lengths, we use a constant motion to measure other motions.
What "specific" motion is time?
Throughout history, the motion we have used to measure other motions -- the motion we have been calling "time" -- is the apparent motion of the sun crossing the sky for an earthbound observer.
Traditionally, the motion of the sun is represented by the moving hands on the dial face of a clock. The sun's motion has been called "time." However, it would be incorrect to call any one of the individual motions -- the motion of the hour hand, the minute hand, or the second hand -- "time." The hour hand moves twice as fast as the sun's motion: it goes around the dial face 2 times for every time the sun appears to go around the earth once. The minute hand moves 24 times as fast as this traditional definition of time: it goes around the dial face 24 times in a day. The second hand moves 1440 times as fast as the solar definition of time: it goes around the dial face 1440 times in a day.
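The hand-speed arithmetic above can be checked with a short calculation (a sketch; the revolution counts come straight from the paragraph, taking the sun's apparent motion as 1 revolution per day):

```python
# Revolutions per day of each motion, relative to the sun's apparent
# motion (1 revolution per day for an earthbound observer).
SUN_REVS_PER_DAY = 1

hour_hand = 24 / 12    # one revolution every 12 hours -> 2 per day
minute_hand = 24 / 1   # one revolution every hour    -> 24 per day
second_hand = 24 * 60  # one revolution every minute  -> 1440 per day

print(hour_hand / SUN_REVS_PER_DAY)    # 2.0
print(minute_hand / SUN_REVS_PER_DAY)  # 24.0
print(second_hand / SUN_REVS_PER_DAY)  # 1440
```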
Almost everyone knows how to use a traditional timepiece. Almost everyone knows how to tell time. The problem is, everyone who knows how to tell time in the traditional way, and is using time in the traditional way, is making a mistake. From a modern-physics point of view, it is really stupid to use the sun's motion as our standard of motion. The reason is that if you move a traditional timepiece -- whether it is an analog or a digital timepiece, whether it is a sundial or an atomic clock -- the motion on the clock combines with the motion of the clock, and your standard constant quantity of motion is now different.
The act of moving a traditional timepiece destroys a key characteristic of a good timepiece -- its constancy of motion. How did this come about? How did this mistake weave its way throughout everyday life? As people developed timepieces over hundreds of years, they did not foresee the developments of modern physics. The key detail now known in modern physics is that it appears impossible to move an object infinitely fast. The universe has a speed limit; currently, the fastest known motion is the speed of light. This fact is why it is a mistake to move a traditional timepiece, and why you cannot move one without it developing an error. If you add 1 to 1, you get 2. Add 1 foot to 1 foot, you get 2 feet. However, adding 1 second to 1 second does not necessarily give 2 seconds. If the 1st second is measured with a stationary clock, then its "1 second" is the basic second you expect. If the 2nd second is measured with a moving clock, its "1 second" is not equal to the first.
Einstein solved this problem one way; I found another. He took the "traditional second," as defined using the sun's motion -- or, nowadays, with an atomic clock -- and made that "traditional second" slow down when the clock moves with respect to an earthbound reference frame. I abandon the "traditional second" and redefine time to travel at the fastest known motion: what has been called "the speed of light" is now defined as "time."
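Einstein's fix can be put in numbers. This is a minimal sketch of standard special-relativistic time dilation (the textbook physics referred to above, not my redefinition); the function name is my own:

```python
import math

def moving_clock_second(v_fraction_of_c):
    """Duration, in stationary-frame seconds, of one tick ("1 second")
    of a clock moving at the given fraction of the speed of light.
    This is the Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

# A stationary second plus a stationary second is exactly 2 seconds...
print(1.0 + 1.0)  # 2.0
# ...but a stationary second plus one "second" ticked off by a clock
# moving at 0.6c amounts to more than 2 stationary-frame seconds.
print(1.0 + moving_clock_second(0.6))  # 2.25
```

This is exactly the sense in which "adding 1 second to 1 second does not necessarily give 2 seconds" once one of the clocks is moving.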
If time is defined as the fastest known motion, then all other motions are fractions of this fastest motion. You can add 1/2 the speed of light to 1/2 the speed of light and get the speed of light. You can add any fraction of the speed of light to another fraction of the speed of light just as you would expect when adding fractions -- with one catch: you cannot get a result greater than one. That would be improper. It is elementary algebra.
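As I read it, the addition rule proposed here is ordinary fraction addition with a cap at one. A sketch of that reading (the function name and the cap-at-one interpretation are my own gloss of the paragraph above):

```python
def add_motions(a, b):
    """Add two motions expressed as fractions of the fastest known
    motion (the speed of light), never exceeding 1."""
    if not (0 <= a <= 1 and 0 <= b <= 1):
        raise ValueError("motions must be fractions between 0 and 1")
    return min(a + b, 1.0)

print(add_motions(0.5, 0.5))   # 1.0 -- half the speed of light plus
                               #        half the speed of light is c
print(add_motions(0.25, 0.5))  # 0.75
print(add_motions(0.9, 0.9))   # 1.0 -- never greater than one
```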