

12-07-2009, 10:56 AM #1
Modulus operator is not working if I give a number with more than 24 digits
If a number has more than 24 digits, the modulus operator does not give the correct output. Here is the attached sample code:
[code]<script type="text/javascript">
var a = 10000000000000000000000.0;
var b = 10.0;
var c = a % b;
alert("c" + c);
</script>[/code]
Please tell me the solution.
12-07-2009, 11:47 AM #2
The largest number that JavaScript can handle reliably without loss of precision is about 9e15, or 9000000000000000. Any integer greater than that is liable to return incorrect values from parseInt(), the % (modulus) operator, and so on.
See also: http://jsfromhell.com/classes/bignumber
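A present-day addendum: since this thread was written, JavaScript has gained BigInt (ES2020), which performs exact arbitrary-precision integer arithmetic, so a 24-digit modulus like the original poster's can now be computed directly. A minimal sketch, assuming a modern engine (note the `n` suffix marking BigInt literals):

```javascript
// BigInt literals carry an "n" suffix and are never rounded,
// so the modulus of a 24-digit integer is exact.
var a = 100000000000000000000000n; // 1e23 as an exact integer
var b = 10n;
var c = a % b;
console.log(c.toString()); // "0" - correct, unlike the Number version

// A value whose last digit matters also survives intact:
console.log((10000000000000000000001n % 10n).toString()); // "1"
```

BigInt and Number cannot be mixed in one expression; convert explicitly with `BigInt(x)` or `Number(x)` as needed.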
All advice is supplied packaged by intellectual weight, and not by volume. Contents may settle slightly in transit.
12-07-2009, 12:04 PM #3
12-07-2009, 12:29 PM #4
Not so! The real problem is the one I mentioned above. Integer numbers greater than 9e15 may or may not be stored accurately, depending of course on whether they are exactly representable in binary. Just as you cannot write 1/3 as a binary floating-point number (it resolves to 0.3333333333333333), but 1/4 is evaluated exactly as 0.25.
Code:
<script type="text/javascript">
var a = 100000000000000000000000;
var b = 10;
var c = a % b;
alert("c" + c); // 2

var a = 90000000000000000000000; // 9e22
var b = 10;
var c = a % b;
alert("c" + c); // 6

var a = 9000000000000000000000; // 9e21
var b = 10;
var c = a % b;
alert("c" + c); // 0

var a = 9971992547409847;
document.write(a); // 9971992547409848
</script>
Last edited by Philip M; 12-07-2009 at 01:11 PM.
12-07-2009, 08:52 PM #5
As a further point of clarification:
JavaScript doesn't *REALLY* use integer arithmetic when ANY value involved exceeds 2147483647 (2^31 − 1). (And the spec doesn't require it to ever use integers, but I would strongly suspect that all modern implementations do so when they can.) That number is the largest positive integer that can be held in a standard 32-bit signed integer. (The smallest negative number, −2147483648, is one greater in magnitude, thanks to the foibles of two's-complement binary representation.)
So...instead, JS must use a double-precision floating-point number. And the IEEE/ANSI format for such numbers (used by all modern CPUs) gives only 53 bits for the "mantissa". And 2^53 is 9007199254740992, whence the number that Philip is citing as the maximum possible integer before you begin losing precision. (It's probably actually 9007199254740991, one less than 2^53, but it's been too long since I investigated the format for me to remember that for sure.)
An optimist sees the glass as half full.
A pessimist sees the glass as half empty.
A realist drinks it no matter how much there is.