Page 48 - ARM 64 Bit Assembly Language

Introduction  31

                            e. Convert 301₁₀ to base 2.
                            f. Represent 301₁₀ as a null-terminated ASCII string. (Write your answer in hexadecimal.)
                            g. Convert 3420₅ to base ten.
                            h. Convert 2314₅ to base nine.
                            i. Convert 116₇ to base three.
                            j. Convert 1294₁₁ to base 5.
                      1.8. Given the following binary string:
                           01001001 01110011 01101110 00100111 01110100 00100000 01000001 01110011
                           01110011 01100101 01101101 01100010 01101100 01111001 00100000 01000110
                           01110101 01101110 00111111 00000000
                            a. Convert it to a hexadecimal string.
                            b. Convert the first four bytes to a string of base ten numbers.
                            c. Convert the first (little-endian) halfword to base ten.
                            d. Convert the first (big-endian) halfword to base ten.
                            e. If this string of bytes were sent to an ASCII printer or terminal, what would be
                               printed?
                      1.9. The number 1,234,567 is stored as a 32-bit word starting at address F0439000₁₆.
                           Show the address and contents of each byte of the 32-bit word on a
                            a. little-endian system.
                            b. big-endian system.
                     1.10. Really good assembly programmers can convert small numbers between binary,
                           hexadecimal, and decimal in their heads. Without referring to any tables or using
                           a calculator or pencil, fill in the blanks in the following table:
                                                 Binary       Decimal     Hexadecimal
                                                _______          5        ___________
                                                   1010      _______      ___________
                                                _______      _______           C
                                                _______         23        ___________
                                                0101101      _______      ___________
                                                _______      _______          4B
                     1.11. UTF-8 is often referred to as Unicode. Why is this not correct?
                     1.12. What are the differences between a CPU register and a memory location?
                     1.13. The ISO/IEC 10646 standard defines 1,112,064 code points (glyphs). Each code point
                           could be encoded using 24 bits, or three bytes. The UTF-8 encoding uses up to four
                           bytes to encode a code point. Give three reasons why UTF-8 is preferred over a simple
                           3-byte per code point encoding.