A Tour of Morfa

Integer types

Integer types int8, int16, int32 and int64 represent 8-, 16-, 32- and 64-bit signed integers, respectively. For each type intN, the least value of the type is intN.min = -2^(N-1), the greatest value is intN.max = 2^(N-1) - 1, and the default value is intN.init = 0.

For example, int16.min = -32768, int16.max = 32767 and int16.init = 0.

The integer type that you are going to see and use the most is int, which is simply an alias for int64.
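
These properties can be checked in code. Below is a minimal sketch using the unittest and assert constructs shown later in this section; it assumes the min, max and init properties can be compared like ordinary values:

unittest
{
    assert (int16.min  == -32768);
    assert (int16.max  == 32767);
    assert (int16.init == 0);
    assert (int.max == int64.max);  // int is an alias for int64
}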

For reference, the table below shows the least, the greatest and the default value of each integer type.

Type   T.min                        T.max                       T.init
int8   -128                         127                         0
int16  -32 768                      32 767                      0
int32  -2 147 483 648               2 147 483 647               0
int64  -9 223 372 036 854 775 808   9 223 372 036 854 775 807   0

The type int is the default type for variables and constants initialized with integer literals:

const two  = 2;

Thus the type inferred for two is int. To create a variable or constant of a smaller type you can use an explicit type annotation:

const four: int8 = 2 * two;

The compiler checks that the initializing expression fits in the declared type (to do this, it substitutes the value 2 for two and evaluates the expression at compile time). Initializing a variable of type int8 with a value that is too large, say 128, would be an error, as sketched below.
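
For example, a minimal sketch (the names fits and tooBig are just illustrative; the second declaration is commented out because it would be rejected at compile time):

const fits: int8 = 127;       // OK: 127 fits in int8
// const tooBig: int8 = 128;  // error: 128 does not fit in int8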

If the initializing expression is not a compile-time constant and its inferred type is wider than the type declared (or desired) for a variable, you can use an explicit cast, which truncates the value to the required type if necessary:

func truncateTo8Bits(arg: int64): int8
{
    var result = cast<int8> arg;
    return result;
}

unittest
{
    assert (truncateTo8Bits(4) == 4);
    assert (truncateTo8Bits(255) == -1);
}
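
The second assertion holds because 255 is 0xFF: only the low 8 bits are kept, and reinterpreted as a signed (two's complement) int8 they represent -1. A few more wrap-arounds, sketched with the same function:

unittest
{
    // 128 is one past int8.max, so it wraps around to int8.min.
    assert (truncateTo8Bits(128) == int8.min);
    // 256 has all of its low 8 bits clear, so it truncates to 0.
    assert (truncateTo8Bits(256) == 0);
}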

Integer literals

Numeric literals in Morfa may contain _ for readability (except as the first character).

Besides decimal literals, Morfa has hexadecimal, octal and binary literals. Hexadecimal literals are prefixed with the usual 0x, octal literals with 0o and binary literals with 0b.

const max = 9_223_372_036_854_775_807;
const mode = 0o644;            // octal literal
const hash = 0xDead_Beef;      // hexadecimal literal
const year = 0b0111_1101_1111; // binary literal
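
All four notations denote the same kind of int value, so literals written in different bases can be compared directly. A small sketch:

unittest
{
    // The same value, 255, written in decimal, hexadecimal, octal and binary.
    assert (255 == 0xFF);
    assert (255 == 0o377);
    assert (255 == 0b1111_1111);
    // Underscores only group digits; they do not change the value.
    assert (1_000_000 == 1000000);
}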