
9,223,372,036,854,775,807

From Wikipedia, the free encyclopedia


9223372036854775807
Cardinal: nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seven
Ordinal: 9223372036854775807th (nine quintillion two hundred twenty-three quadrillion three hundred seventy-two trillion thirty-six billion eight hundred fifty-four million seven hundred seventy-five thousand eight hundred seventh)
Factorization: 7² × 73 × 127 × 337 × 92737 × 649657

The number 9,223,372,036,854,775,807 is the integer equal to 2⁶³ − 1. Its prime factorization is 7² · 73 · 127 · 337 · 92737 · 649657, which is equal to Φ₁(2) · Φ₃(2) · Φ₇(2) · Φ₉(2) · Φ₂₁(2) · Φ₆₃(2).

In computing

The number 9,223,372,036,854,775,807, equivalent to the hexadecimal value 7FFF,FFFF,FFFF,FFFF₁₆, is the maximum value for a 64-bit signed integer in computing. It is therefore the maximum value for a variable declared as a long integer (long, long long int, or bigint) in many programming languages running on modern computers.[1][2][3] The appearance of this value may indicate an integer overflow or other error.[4]

This value is also the largest positive signed address offset for 64-bit CPUs using sign-extended memory addressing (such as the x86-64 architecture, which calls this "canonical form" addressing[5]). Being an odd value, its appearance may indicate an erroneous (misaligned) memory address.

The C standard library data type time_t, used on operating systems such as Unix, is typically implemented as either a 32- or 64-bit signed integer value counting the number of seconds since the start of the Unix epoch (midnight UTC on 1 January 1970).[6] Systems employing a 32-bit type are susceptible to the Year 2038 problem, so many implementations have moved to a wider 64-bit type, whose maximum value of 2⁶³ − 1 corresponds to a point roughly 292 billion years after the start of Unix time.

Other systems encode system time as a signed 64-bit integer count of the number of ticks since some epoch date. On some systems (such as the Java standard library), each tick is one millisecond in duration, yielding a usable time range extending 292 million years into the future.

In C#, this value is the maximum of the long data type,[7] exposed as the constant long.MaxValue; in the .NET Framework, the underlying type is the Int64 struct.[8]


References

  1. ^ "Primitive Data Types". Oracle Corporation.
  2. ^ Cai, Saisang; B, Mike; Robertson, Colin; Jones, Mike; Hogenson, Gordon; O'Neal, Billy (28 January 2018). "Integer Limits". Microsoft Corporation.
  3. ^ "BIGINT data type". The Apache Software Foundation.
  4. ^ "Integer Overflow". Techopedia.
  5. ^ AMD Corporation (December 2016). "Volume 2: System Programming" (PDF). AMD64 Architecture Programmer's Manual.
  6. ^ "time_t". cplusplus.com.
  7. ^ "long (C# Reference)". Microsoft Docs.
  8. ^ "Int64 Struct (System)". Microsoft Docs.