diff --git a/docs/data/index.html b/docs/data/index.html
new file mode 100644
index 0000000..ef11652
--- /dev/null
+++ b/docs/data/index.html
@@ -0,0 +1,95 @@
+ +Returns a representation of an immutable list of all enum entries, in the order they're declared.
This method may be used to iterate over the enum entries.
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
Returns an array containing the constants of this enum type, in the order they're declared.
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
if this enum type has no constant with the specified name
Returns an array containing the constants of this enum type, in the order they're declared.
This method may be used to iterate over the constants.
Returns a representation of an immutable list of all enum entries, in the order they're declared.
This method may be used to iterate over the enum entries.
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
Returns an array containing the constants of this enum type, in the order they're declared.
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
if this enum type has no constant with the specified name
Returns an array containing the constants of this enum type, in the order they're declared.
This method may be used to iterate over the constants.
Returns a representation of an immutable list of all enum entries, in the order they're declared.
This method may be used to iterate over the enum entries.
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
if this enum type has no constant with the specified name
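A minimal usage sketch of these generated enum members, using a hypothetical enum (Endianness is illustrative only, not a type from this library):

// Hypothetical enum, used only to illustrate the generated members documented above.
enum class Endianness { BIG_ENDIAN, LITTLE_ENDIAN }

fun main() {
    // entries: immutable list of all constants, in declaration order.
    Endianness.entries.forEach(::println)
    // valueOf: exact-name lookup; throws IllegalArgumentException for an unknown name.
    println(Endianness.valueOf("LITTLE_ENDIAN"))
    // values(): array of constants, in declaration order (entries is preferred in new code).
    println(Endianness.values().size)
}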
Returns a Double value from a byte array with a given offset.
Double.
The index to start from.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 8.
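A minimal JVM-only sketch of the behaviour described above (not the library's actual implementation; readDouble is an illustrative name):

import java.nio.ByteBuffer
import java.nio.ByteOrder

// Read 8 bytes starting at `offset` and reinterpret them as a Double.
fun readDouble(bytes: ByteArray, offset: Int = 0, order: ByteOrder = ByteOrder.BIG_ENDIAN): Double {
    require(bytes.size >= offset + 8) { "Need at least ${offset + 8} bytes, got ${bytes.size}" }
    return ByteBuffer.wrap(bytes, offset, 8).order(order).getDouble()
}

fun main() {
    val bytes = ByteBuffer.allocate(8).putDouble(3.14).array()  // big-endian by default
    println(readDouble(bytes))                                  // 3.14
}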
Returns a Float value from a byte array with a given offset.
Float.
The index to start from.
The format of the Float value to read.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the byte array is shorter than the length of the requested type.
Returns an Int from a byte array with a given offset.
Int.
The index to start from.
The format of the Int value to read.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 4.
Returns an Int from a byte array with a given offset.
Int.
The index to start from.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 4.
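A pure-Kotlin sketch of the Int read described above; readIntAt and the bigEndian flag are illustrative stand-ins for the library's API:

// Assemble 4 bytes starting at `offset` into an Int, honouring byte order.
fun ByteArray.readIntAt(offset: Int = 0, bigEndian: Boolean = true): Int {
    require(size >= offset + 4) { "Need at least ${offset + 4} bytes, got $size" }
    val offsets = if (bigEndian) intArrayOf(0, 1, 2, 3) else intArrayOf(3, 2, 1, 0)
    return offsets.fold(0) { acc, i -> (acc shl 8) or (this[offset + i].toInt() and 0xFF) }
}

fun main() {
    val bytes = byteArrayOf(0x00, 0x00, 0x01, 0x02)
    println(bytes.readIntAt())                   // 258 (big-endian)
    println(bytes.readIntAt(bigEndian = false))  // 33619968 (little-endian)
}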
Returns a Short from a byte array with a given offset.
Short.
The index to start from.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 2.
Returns a UInt from a byte array with a given offset.
UInt.
The index to start from.
The format of the Int value to read.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 4.
Returns a UShort from a byte array with a given offset.
UShort.
The index to start from.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the length of the byte array is not >= offset + 2.
Converts a byte array to a 128-bit UUID.
UUID
The index to start from.
The byte order, default is ByteOrder.BIG_ENDIAN.
If the byte array is shorter than 16 bytes.
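A JVM-only sketch of this conversion (toUuid is an illustrative name; the library's own function may differ): for little-endian input the 16 bytes are reversed first, so the rest of the code can assume big-endian.

import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.util.UUID

fun ByteArray.toUuid(order: ByteOrder = ByteOrder.BIG_ENDIAN): UUID {
    require(size >= 16) { "A 128-bit UUID requires 16 bytes, got $size" }
    val bytes = if (order == ByteOrder.BIG_ENDIAN) copyOfRange(0, 16)
                else copyOfRange(0, 16).reversedArray()
    val buffer = ByteBuffer.wrap(bytes)
    val msb = buffer.getLong()   // bytes 0..7  -> most significant half
    val lsb = buffer.getLong()   // bytes 8..15 -> least significant half
    return UUID(msb, lsb)
}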
Returns whether ALL bits that are equal to 1 in the given mask are set to 0 in the receiver.
Example:
0b00001111 hasAllBitsCleared 0b00000111 == false
0b00001111 hasAllBitsCleared 0b00001000 == false
0b10101010 hasAllBitsCleared 0b00000100 == true
0b10101000 hasAllBitsCleared 0b10101010 == false
0b10101010 hasAllBitsCleared 0b10000000 == false
Byte value.
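One possible implementation of this check, shown for Byte (the infix form and name are assumptions): the masked bits must all be zero.

infix fun Byte.hasAllBitsCleared(mask: Byte): Boolean =
    (this.toInt() and mask.toInt()) == 0

fun main() {
    println(0b00001111.toByte() hasAllBitsCleared 0b00000111.toByte()) // false
    println(0b10101010.toByte() hasAllBitsCleared 0b00000100.toByte()) // true
}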
Returns whether ALL bits that are equal to 1 in the given mask are also set to 1 in the receiver.
Example:
0b00001111 hasAllBitsSet 0b00000111 == true
0b00001111 hasAllBitsSet 0b00001000 == true
0b10101010 hasAllBitsSet 0b00000100 == false
0b10101000 hasAllBitsSet 0b10101010 == false
0b10101010 hasAllBitsSet 0b10000000 == true
Byte value.
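The complementary check can be sketched the same way: masking the receiver must leave every bit of the mask intact.

infix fun Byte.hasAllBitsSet(mask: Byte): Boolean =
    (this.toInt() and mask.toInt()) == mask.toInt()

// 0b00001111.toByte() hasAllBitsSet 0b00000111.toByte()  -> true
// 0b10101010.toByte() hasAllBitsSet 0b00000100.toByte()  -> false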
Returns whether the bit at the given position is set to 0 in the receiver.
Byte value.
Returns whether the bit at the given position is set to 0 in the receiver.
UByte value.
Returns whether the bit at the given position is set to 0 in the receiver.
Short value.
Returns whether the bit at the given position is set to 0 in the receiver.
UShort value.
Returns whether the bit at the given position is set to 0 in the receiver.
Int value.
Returns whether the bit at the given position is set to 0 in the receiver.
UInt value.
Returns whether the bit at the given position is set to 1 in the receiver.
Byte value.
Returns whether the bit at the given position is set to 1 in the receiver.
UByte value.
Returns whether the bit at the given position is set to 1 in the receiver.
Short value.
Returns whether the bit at the given position is set to 1 in the receiver.
UShort value.
Returns whether the bit at the given position is set to 1 in the receiver.
Int value.
Returns whether the bit at the given position is set to 1 in the receiver.
UInt value.
Returns whether ALL bits that are equal to 1 in the given mask are set to 0 in the receiver.
Returns whether ALL bits that are equal to 1 in the given mask are also set to 1 in the receiver.
Returns whether the bit at the given position is set to 0 in the receiver.
Returns whether the bit at the given position is set to 1 in the receiver.
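A sketch of these single-bit tests for an Int receiver; the names mirror the documentation, but the signatures are assumptions:

fun Int.hasBitSet(position: Int): Boolean = ((this shr position) and 1) == 1
fun Int.hasBitCleared(position: Int): Boolean = !hasBitSet(position)

fun main() {
    println(0b1010.hasBitSet(1))     // true: bit 1 is 1
    println(0b1010.hasBitCleared(0)) // true: bit 0 is 0
}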
Converts a Byte to a byte array.
Converts a UByte to a byte array.
Converts a 128-bit UUID to a byte array.
Converts an Int to a byte array using the given endianness.
Converts a Short to a byte array using the given endianness.
Converts a UInt to a byte array using the given endianness.
Converts a UShort to a byte array using the given endianness.
Converts a byte array to a hex string.
Applies the XOR operator on two byte arrays. Unlike the existing xor functions, this one does not require the arrays to be of the same length.
Shifts this value left by the bitCount number of bits.
Note that only the three lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..7.
Shifts this value left by the bitCount number of bits.
Note that only the four lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..15.
Shifts this value right by the bitCount number of bits, filling the leftmost bits with copies of the sign bit.
Note that only the three lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..7.
Shifts this value right by the bitCount number of bits, filling the leftmost bits with zeros.
Note that only the three lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..7.
Shifts this value right by the bitCount number of bits, filling the leftmost bits with copies of the sign bit.
Note that only the four lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..15.
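A sketch of how the Byte variants can be defined so that only the three lowest-order bits of bitCount are used; the Short variants are analogous with a 0x0F mask.

infix fun Byte.shl(bitCount: Int): Byte =
    (toInt() shl (bitCount and 0x07)).toByte()

// Sign-extending right shift: toInt() already copies the sign bit into the upper bits.
infix fun Byte.shr(bitCount: Int): Byte =
    (toInt() shr (bitCount and 0x07)).toByte()

// Zero-filling right shift: mask to 8 bits first so no sign bits leak in from the left.
infix fun Byte.ushr(bitCount: Int): Byte =
    ((toInt() and 0xFF) ushr (bitCount and 0x07)).toByte()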
Converts an Int to a byte array using the given endianness.
The byte order, default is ByteOrder.BIG_ENDIAN.
Converts a UInt to a byte array using the given endianness.
The byte order, default is ByteOrder.BIG_ENDIAN.
Converts a Short to a byte array using the given endianness.
The byte order, default is ByteOrder.BIG_ENDIAN.
Converts a UShort to a byte array using the given endianness.
The byte order, default is ByteOrder.BIG_ENDIAN.
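A pure-Kotlin sketch of the Int conversion described above; the bigEndian flag stands in for the library's ByteOrder parameter, and the other integer widths follow the same pattern:

fun Int.toBytes(bigEndian: Boolean = true): ByteArray {
    val bigEndianBytes = byteArrayOf(
        (this ushr 24).toByte(),
        (this ushr 16).toByte(),
        (this ushr 8).toByte(),
        this.toByte(),
    )
    return if (bigEndian) bigEndianBytes else bigEndianBytes.reversedArray()
}

fun main() {
    println(0x01020304.toBytes().joinToString(" ") { "%02X".format(it) })                  // 01 02 03 04
    println(0x01020304.toBytes(bigEndian = false).joinToString(" ") { "%02X".format(it) }) // 04 03 02 01
}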
Converts a Byte to a byte array.
Converts a UByte to a byte array.
Converts a 128-bit UUID to a byte array.
The byte order, default is ByteOrder.BIG_ENDIAN.
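A JVM-only sketch of the big-endian case of this conversion: write the UUID's two 64-bit halves into a 16-byte buffer, most significant half first.

import java.nio.ByteBuffer
import java.util.UUID

fun UUID.toBytes(): ByteArray =
    ByteBuffer.allocate(16)
        .putLong(mostSignificantBits)   // bytes 0..7
        .putLong(leastSignificantBits)  // bytes 8..15
        .array()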
Converts a byte array to a hex string.
Hex string representation of the byte array.
Whether to prefix the hex string with 0x.
The format of the hex string.
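A sketch of the hex conversion; the prefixWith0x parameter is an illustrative stand-in for the options documented above:

fun ByteArray.toHexSketch(prefixWith0x: Boolean = false): String {
    val hex = joinToString("") { "%02X".format(it) }
    return if (prefixWith0x) "0x$hex" else hex
}

fun main() {
    println(byteArrayOf(0x0A, 0x1B, 0x2C).toHexSketch())                      // 0A1B2C
    println(byteArrayOf(0x0A, 0x1B, 0x2C).toHexSketch(prefixWith0x = true))   // 0x0A1B2C
}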
Converts a byte array to a 128-bit UUID.
UUID
The byte order, default is ByteOrder.BIG_ENDIAN.
If the byte array is not 16 bytes long.
Shifts this value right by the bitCount number of bits, filling the leftmost bits with zeros.
Note that only the three lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..7.
Shifts this value right by the bitCount number of bits, filling the leftmost bits with copies of the sign bit.
Note that only the four lowest-order bits of the bitCount are used as the shift distance. The shift distance actually used is therefore always in the range 0..15.
Applies the XOR operator on two byte arrays. Unlike the existing xor functions, this one does not require the arrays to be of the same length.
XOR of the two byte arrays.
The other byte array which is XORed with this one.
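A sketch of a length-tolerant XOR; the handling of the extra bytes is an assumption (the shorter array is treated as zero-padded, so the trailing bytes of the longer array pass through unchanged):

infix fun ByteArray.xor(other: ByteArray): ByteArray {
    val longer = if (size >= other.size) this else other
    val shorter = if (size >= other.size) other else this
    return ByteArray(longer.size) { i ->
        if (i < shorter.size) (longer[i].toInt() xor shorter[i].toInt()).toByte() else longer[i]
    }
}

fun main() {
    val a = byteArrayOf(0x05, 0x03, 0x7F)
    val b = byteArrayOf(0x06, 0x03)
    println((a xor b).joinToString(" ") { "%02X".format(it) }) // 03 00 7F
}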