What is 16-bit Unicode?
16-bit Unicode, formally UTF-16 (Unicode Transformation Format, 16-bit), is a method of encoding character data capable of representing all 1,112,064 valid code points of Unicode.
UTF-16 encodes each code point as either one or two 16-bit code units. Three related encoding schemes (UTF-16, UTF-16BE, and UTF-16LE) then serialize those 16-bit code units into 8-bit (octet) sequences, differing only in byte order and in whether a byte order mark is used.
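The one-or-two-unit mapping above can be sketched in Python. This is an illustrative implementation of the standard surrogate-pair arithmetic, not taken from any particular library: code points below U+10000 fit in a single 16-bit unit, while higher code points are split into a high/low surrogate pair.

```python
def utf16_code_units(ch: str) -> list[int]:
    """Return the 16-bit code units UTF-16 uses for a single character."""
    cp = ord(ch)
    if cp < 0x10000:
        return [cp]                     # BMP character: one 16-bit unit
    cp -= 0x10000                       # supplementary character: surrogate pair
    high = 0xD800 + (cp >> 10)          # high surrogate carries the top 10 bits
    low = 0xDC00 + (cp & 0x3FF)         # low surrogate carries the bottom 10 bits
    return [high, low]

print([hex(u) for u in utf16_code_units("A")])           # ['0x41']
print([hex(u) for u in utf16_code_units("\U00010437")])  # ['0xd801', '0xdc37']
```

Note that U+10437 needs two units, which is why UTF-16 is a variable-width encoding rather than a fixed 16-bit one.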
What is the difference between UTF-16 and UTF-8?
UTF-16 encodes a Unicode character as either two or four bytes, while UTF-8 uses one to four bytes. The distinction is visible in their names: in UTF-8, the smallest representation of a character is one byte (eight bits), whereas in UTF-16 the smallest representation is two bytes (16 bits).