Explain clearly the difference between a 'Kilo' (as used for computer storage) and 1,000,
and between a 'Mega' and 1,000,000.
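A minimal Python sketch (Python is assumed throughout; the questions name no language) contrasting the decimal and binary readings of these prefixes:

    # Decimal (SI) vs. binary reading of "kilo" and "mega".
    for name, decimal, binary in [
        ("kilo", 10**3, 2**10),
        ("mega", 10**6, 2**20),
    ]:
        print(f"{name}: {decimal:,} (decimal) vs {binary:,} (binary), "
              f"difference {binary - decimal:,}")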
Arrange these capacity prefixes used in computer systems, together with these decimal values, in ascending order:
peta, mega, giga, kilo, tera, exa, 100, 1,000, 1,000,000, 1,000,000,000
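One way to check the ordering, sketched with the standard decimal exponent for each prefix:

    # SI prefixes mapped to powers of ten, printed smallest to largest.
    prefixes = {"kilo": 3, "mega": 6, "giga": 9, "tera": 12, "peta": 15, "exa": 18}
    for name, exp in sorted(prefixes.items(), key=lambda p: p[1]):
        print(f"{name:5s} = 10**{exp:2d} = {10**exp:,}")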
A file of exactly one megabyte in a directory amounts to roughly how many
megabits when it is sent over a DMA channel or a network?
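A sketch of the byte-to-bit arithmetic, assuming 8 bits per byte and a decimal megabyte, and ignoring the framing overhead a real channel adds:

    # One (decimal) megabyte expressed in megabits.
    file_megabytes = 1
    bits = file_megabytes * 1_000_000 * 8      # bytes -> bits
    print(f"{file_megabytes} MB is about {bits // 1_000_000} Mb before overhead")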
What are the maximum directly addressable RAM sizes for 16-, 32-, and 64-bit CPUs?
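A sketch of the underlying arithmetic, assuming byte-addressable memory and a full-width address bus (real 64-bit CPUs wire up fewer physical address lines):

    # Maximum directly addressable memory for common address widths.
    labels = {16: "64 KiB", 32: "4 GiB", 64: "16 EiB"}
    for bits in (16, 32, 64):
        print(f"{bits}-bit: 2**{bits} = {2**bits:,} bytes ({labels[bits]})")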
Discuss these data representations: tell where each is used, its limitations, and how many characters it defines:
ASCII, EBCDIC, Unicode.
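A sketch of the same letter under all three representations; Python's "cp500" codec stands in here for one EBCDIC code page:

    # 'A' in ASCII (128 characters defined), one EBCDIC code page (256
    # code points), and as a Unicode code point.
    ch = "A"
    print("ASCII  :", ch.encode("ascii"))   # byte 0x41
    print("EBCDIC :", ch.encode("cp500"))   # byte 0xC1
    print("Unicode:", f"U+{ord(ch):04X}")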
UTF-8 is the most widely used character encoding on Linux-based web servers. What problem does it solve?
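A sketch showing the property in question: UTF-8 is variable-length and backward-compatible with ASCII, so one encoding covers every Unicode code point:

    # Each character's UTF-8 byte length grows with its code point.
    for ch in ("A", "é", "€", "😀"):
        b = ch.encode("utf-8")
        print(f"U+{ord(ch):04X} -> {len(b)} byte(s): {b.hex(' ')}")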
Describe the applications where each of these character definitions prevails:
ASCII, UTF-8, Unicode, ISO-8859, and HTML special characters.
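A sketch contrasting two of these in practice: ISO-8859-1 stores Western European text one byte per character, while HTML special characters escape markup-sensitive symbols:

    import html

    text = "Grüße & <tags>"
    print(text.encode("iso-8859-1"))   # single-byte Western European encoding
    print(html.escape(text))           # Grüße &amp; &lt;tags&gt;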
Name three common 'control characters' in ASCII or EBCDIC and tell what they control or signify.
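If you want to poke at candidates, a sketch printing the code points of three frequently cited ASCII control characters:

    # Tab, line feed, and carriage return with their ASCII code points.
    for name, ch in [("tab", "\t"), ("line feed", "\n"), ("carriage return", "\r")]:
        print(f"{name:15s} -> {ord(ch):3d} (0x{ord(ch):02X})")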
What is a 'collating sequence'? What does it affect? Name two systems whose collating sequences differ.
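A sketch of one such difference: ASCII sorts digits before letters and uppercase before lowercase, while EBCDIC (cp500 here) orders them the other way around:

    words = ["9ball", "apple", "ZEBRA"]
    print(sorted(words))                                   # ASCII/code-point order
    print(sorted(words, key=lambda s: s.encode("cp500")))  # EBCDIC byte order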
Converting values among number systems:
26₁₀ = ______₂
26₁₀ = ______₁₆
AA₁₆ = ________₂
AA₁₆ = ______₁₀
1010₂ = _______₁₆
1010₂ = _______₁₀
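Python's base-conversion built-ins offer one way to verify your answers:

    print(bin(26), hex(26))      # 0b11010 0x1a
    print(bin(0xAA), 0xAA)       # 0b10101010 170
    print(hex(0b1010), 0b1010)   # 0xa 10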
Given this outline, describe how the data is represented and give an appropriate use for each entry:
Numeric
    Integer
    Float
    Decimal
Character
    Char
    String
    Text
Date/Time
Boolean
BLOB
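A sketch mapping the outline onto one language's types (the Python equivalents are an assumption; databases and other languages draw the lines differently):

    from decimal import Decimal
    from datetime import datetime

    samples = {
        "Integer":   42,                    # exact whole numbers: counts, IDs
        "Float":     3.14159,               # approximate reals: measurements
        "Decimal":   Decimal("19.99"),      # exact fractions: money
        "Char":      "A",                   # a single character
        "String":    "hello",               # short variable-length text
        "Text":      "a long document",     # unbounded text
        "Date/Time": datetime(2000, 1, 1),  # calendar instants
        "Boolean":   True,                  # two-valued flags
        "BLOB":      b"\x89PNG",            # opaque binary: images, files
    }
    for name, value in samples.items():
        print(f"{name:9s} {type(value).__name__:8s} {value!r}")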
How is an 'epoch' used in modern date and date/time data types?
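A sketch of the Unix convention: the epoch is the fixed zero point (1970-01-01 00:00:00 UTC) that timestamps count from:

    from datetime import datetime, timezone
    import time

    print(datetime.fromtimestamp(0, tz=timezone.utc))  # 1970-01-01 00:00:00+00:00
    print(time.time())   # seconds elapsed since that epoch, right now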
What caused the very real Y2K crisis?
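A sketch of the ambiguity at the heart of it: two-digit years force software to guess the century (Python's %y applies the POSIX pivot, 69-99 -> 19xx and 00-68 -> 20xx):

    from datetime import datetime

    for yy in ("99", "00"):
        print(yy, "->", datetime.strptime(yy, "%y").year)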
What would be the appropriate data type for any/all of these?