# 9,223,372,036,854,775,807

| | |
| --- | --- |
| Cardinal | nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seven |
| Ordinal | 9223372036854775807th (nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seventh) |
| Factorization | 7² × 73 × 127 × 337 × 92737 × 649657 |
| Greek numeral | ${\displaystyle {\stackrel {\mathrm {\sampi} \kappa \beta \gamma \tau \mathrm {o} \beta \tau \xi \eta \epsilon \upsilon \mathrm {o} \zeta }{\mathrm {M} }}}$͵εωζ´ |
| Roman numeral | ${\displaystyle {\overset {ix}{MMMMMM}}\quad {\overset {ccxxiii}{MMMMM}}\quad {\overset {ccclxxii}{MMMM}}\quad {\overset {xxxvi}{MMM}}\quad {\overset {dcccliv}{MM}}\quad {\overset {dcclxxv}{M}}\quad {\overset {}{DCCCVII}}}$ |

The number 9,223,372,036,854,775,807 is an integer equal to 2⁶³ − 1. Its prime factorization is 7² · 73 · 127 · 337 · 92737 · 649657, which is equal to Φ₁(2) · Φ₃(2) · Φ₇(2) · Φ₉(2) · Φ₂₁(2) · Φ₆₃(2).
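The factorization can be verified directly, since the product of the stated prime factors (with 7 appearing twice) must equal 2⁶³ − 1. A quick check in Python:

```python
import math

# Prime factors of 2**63 - 1, with 7 listed twice for the 7**2 term:
factors = [7, 7, 73, 127, 337, 92737, 649657]
product = math.prod(factors)

print(product)                    # 9223372036854775807
print(product == 2**63 - 1)       # True
```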

## In computing

The number 9,223,372,036,854,775,807, equivalent to the hexadecimal value 7FFF,FFFF,FFFF,FFFF₁₆, is the maximum value for a 64-bit signed integer in computing. It is therefore the maximum value for a variable declared as a long integer (long, long long int, or bigint) in many programming languages running on modern computers. The presence of the value may reflect an error, overflow condition, or missing value.
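The hexadecimal equivalence and the overflow behaviour can both be illustrated in a short sketch; here `ctypes.c_int64` is used to model 64-bit two's-complement arithmetic, since Python's own integers do not overflow:

```python
import ctypes

INT64_MAX = 2**63 - 1
print(hex(INT64_MAX))    # 0x7fffffffffffffff
print(INT64_MAX)         # 9223372036854775807

# Adding 1 to the maximum wraps around to the minimum 64-bit value,
# the classic symptom of signed-integer overflow:
wrapped = ctypes.c_int64(INT64_MAX + 1).value
print(wrapped)           # -9223372036854775808
```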

This value is also the largest positive signed address offset for 64-bit CPUs utilizing sign-extended memory addressing (such as the AMD x86-64 architecture, which calls this "canonical form" extended addressing). Being an odd value, its appearance may reflect an erroneous (misaligned) memory address. Such a value may also be used as a sentinel value to initialize newly allocated memory for debugging purposes.[citation needed]

The C standard library data type time_t, used on operating systems such as Unix, is typically implemented as either a 32- or 64-bit signed integer value, counting the number of seconds since the start of the Unix epoch (midnight UTC of 1 January 1970).[citation needed] Systems employing a 32-bit type are susceptible to the Year 2038 problem, so many implementations have moved to a wider 64-bit type, with a maximum value of 2⁶³ − 1 corresponding to a moment roughly 292 billion years after the start of Unix time.
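Both limits follow from simple arithmetic; a Python sketch (the 292-billion-year figure assumes an average year of 365.25 days):

```python
from datetime import datetime, timedelta, timezone

# A 32-bit signed time_t overflows 2**31 - 1 seconds after the epoch:
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch + timedelta(seconds=2**31 - 1))   # 2038-01-19 03:14:07+00:00

# A 64-bit signed time_t lasts roughly 292 billion years:
SECONDS_PER_YEAR = 365.25 * 24 * 3600
print((2**63 - 1) / SECONDS_PER_YEAR)         # about 2.92e11 years
```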

Other systems encode system time as a signed 64-bit integer count of the number of ticks since some epoch date. On some systems (such as the Java standard library), each tick is one millisecond in duration, yielding a usable time range extending 292 million years into the future. On other systems (such as Win32), each tick is 100 nanoseconds long, yielding a time range of ±29,227 years from the epoch.[citation needed]
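The same arithmetic yields both tick-based ranges described above (a Python sketch, again assuming an average year of 365.25 days):

```python
INT64_MAX = 2**63 - 1
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Millisecond ticks (as in Java's System.currentTimeMillis):
ms_years = INT64_MAX / 1_000 / SECONDS_PER_YEAR
print(int(ms_years / 1e6))   # 292 (million years)

# 100-nanosecond ticks (as in Win32's FILETIME granularity):
tick_years = INT64_MAX / 10_000_000 / SECONDS_PER_YEAR
print(int(tick_years))       # 29227 (years either side of the epoch)
```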