BREAKING: throw RangeError when decoding bad buf

Chris Dickinson 8 years ago
parent f78634a6c3
commit 15598cb409

@@ -25,6 +25,8 @@ modified.
 decodes `data`, which can be either a buffer or array of integers, from position `offset` or default 0 and returns the decoded original integer.
+Throws a `RangeError` when `data` does not represent a valid encoding.
 ### varint.decode.bytes
 if you also require the length (number of bytes) that were required to decode the integer you can access it via `varint.decode.bytes`. this is an integer property that will tell you the number of bytes that the last .decode() call had to use to decode.
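The API documented in the hunk above can be sketched as follows. This is a minimal, hypothetical re-implementation of base-128 varint decoding written to mirror the documented behavior (return value, `decode.bytes`, `RangeError` on bad input); it is not the library's actual source.

```javascript
// Minimal sketch of varint (unsigned LEB128) decoding, mirroring the
// documented API. Illustrative only; not the committed implementation.
function decode(data, offset) {
  var res = 0
  var shift = 0
  var counter = offset || 0
  var b
  do {
    if (counter >= data.length) {
      decode.bytes = 0
      throw new RangeError('Could not decode varint')
    }
    b = data[counter++]
    // The low 7 bits of each byte carry payload, least-significant group first.
    res += shift < 28
      ? (b & 0x7f) << shift
      : (b & 0x7f) * Math.pow(2, shift)
    shift += 7
  } while (b >= 0x80) // a set high bit means another byte follows

  decode.bytes = counter - (offset || 0)
  return res
}

console.log(decode([0xac, 0x02])) // 300
console.log(decode.bytes)         // 2
```

Note that `decode.bytes` is stateful: it always reflects the most recent call, so read it immediately after decoding.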
@@ -41,7 +43,7 @@ returns the number of bytes this number will be encoded as, up to a maximum of 8
 ## usage notes
 If varint is passed a buffer that does not contain a valid end
-byte, then `decode` will return undefined, and `decode.bytes`
+byte, then `decode` will throw `RangeError`, and `decode.bytes`
 will be set to 0. If you are reading from a streaming source,
 it's okay to pass an incomplete buffer into `decode`, detect this
 case, and then concatenate the next buffer.
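With the new behavior, "detect this case" means catching the `RangeError`. A hypothetical sketch of that streaming pattern follows; the names (`pending`, `onChunk`) are invented for illustration, and the inline `decode` is a tiny stand-in for varint's, included only so the example is self-contained.

```javascript
// Tiny stand-in for varint's decode, so the streaming sketch below runs.
function decode(data, offset) {
  var res = 0, shift = 0, counter = offset || 0, b
  do {
    if (counter >= data.length) {
      decode.bytes = 0
      throw new RangeError('Could not decode varint')
    }
    b = data[counter++]
    res += shift < 28 ? (b & 0x7f) << shift : (b & 0x7f) * Math.pow(2, shift)
    shift += 7
  } while (b >= 0x80)
  decode.bytes = counter - (offset || 0)
  return res
}

// Hypothetical streaming pattern: accumulate chunks, retry the decode, and
// treat RangeError as "incomplete input, wait for more data".
var pending = []
function onChunk(chunk) {
  pending = pending.concat(chunk)
  var value
  try {
    value = decode(pending)
  } catch (err) {
    if (err instanceof RangeError) return null // incomplete varint so far
    throw err
  }
  pending = pending.slice(decode.bytes) // drop the consumed bytes
  return value
}

console.log(onChunk([0xac])) // null — varint not complete yet
console.log(onChunk([0x02])) // 300 — completed across two chunks
```

The key point is that a `RangeError` from `decode` is a recoverable "need more bytes" signal in a streaming context, not necessarily a fatal error.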

@@ -12,10 +12,9 @@ function read(buf, offset) {
     , l = buf.length
   do {
-    if(counter >= l) {
+    if (counter >= l) {
       read.bytes = 0
-      read.bytesRead = 0 // DEPRECATED
-      throw new Error('Could not decode varint')
+      throw new RangeError('Could not decode varint')
     }
     b = buf[counter++]
     res += shift < 28
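For context, the hunk above is only a fragment of `read`. A complete loop along these lines would fit around it; the parts not visible in the diff (the variable declarations, the loop tail, the `MSB`/`REST` constants) are assumptions modeled on the usual varint shape, not the committed source.

```javascript
// Sketch of the full read loop implied by the hunk above. Everything outside
// the changed lines is an assumption, reconstructed for illustration.
var MSB = 0x80  // continuation bit
var REST = 0x7f // payload bits

function read(buf, offset) {
  var res = 0
    , offset = offset || 0
    , shift = 0
    , counter = offset
    , b
    , l = buf.length

  do {
    if (counter >= l) {
      read.bytes = 0
      throw new RangeError('Could not decode varint')
    }
    b = buf[counter++]
    res += shift < 28
      ? (b & REST) << shift
      : (b & REST) * Math.pow(2, shift)
    shift += 7
  } while (b >= MSB)

  read.bytes = counter - offset
  return res
}
```

The `shift < 28` split exists because JavaScript bitwise operators work on 32-bit integers, so higher groups must be accumulated with multiplication instead of `<<`.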

@@ -124,10 +124,12 @@ test('buffer too short', function (assert) {
   var l = buffer.length
   while(l--) {
-    var val = decode(buffer.slice(0, l))
-    assert.equal(val, undefined)
-    assert.equal(decode.bytes, 0)
-    assert.equal(decode.bytesRead, 0)
+    try {
+      var val = decode(buffer.slice(0, l))
+    } catch (err) {
+      assert.equal(err.constructor, RangeError)
+      assert.equal(decode.bytes, 0)