In computer programming, What does "String" mean?
5 Answers
It is a data type. Examples:
- A bit is either a 0 or a 1
- A byte is a number between 0 and 255 (if unsigned), equivalent to 8 bits.
- A word is a number between 0 and 65535 (if unsigned), equivalent to 16 bits.
- A string is a sequence of characters, e.g. "Hello, world!" (see the sketch after this list).
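A minimal C sketch of those sizes, using the fixed-width types from <stdint.h> (the variable names here are just for illustration):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t  a_byte = 255;                   /* 8 bits: 0..255      */
    uint16_t a_word = 65535;                 /* 16 bits: 0..65535   */
    const char *a_string = "Hello, world!";  /* sequence of characters */

    printf("byte: %u, word: %u, string: %s\n",
           (unsigned)a_byte, (unsigned)a_word, a_string);
    return 0;
}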
In C++/C programming
A string is a series of characters treated as a single unit. A string may include letters, digits, and various special characters.
In C, string constants (string literals) are written with double quotation marks, such as:
"James Madison" [a name]
"2525 E. Main Street" [an address]
"(888) 555-4567" [a telephone number]
It is not uncommon to encounter programs that ask the user to enter the kind of information shown above (name, address, telephone number, etc.). The data typed in by the user is usually stored in memory as a string.
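As a rough sketch (the buffer size and prompt are just illustrative), reading such input in C often looks like this:

#include <stdio.h>
#include <string.h>

int main(void) {
    char name[64];   /* buffer that will hold the user's input as a string */

    printf("Enter your name: ");
    if (fgets(name, sizeof name, stdin) != NULL) {
        /* fgets keeps the trailing newline; strip it if present */
        name[strcspn(name, "\n")] = '\0';
        printf("Hello, %s\n", name);
    }
    return 0;
}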
A string in the C language is accessed through a pointer to its first character, and the end of the string is marked by a null character ('\0').
The value of the string is the address of its first character (much as an array name decays to a pointer to its first element).
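A small sketch of that idea, walking a string character by character until the terminating null (the literal here is just the example name from above):

#include <stdio.h>

int main(void) {
    const char *s = "James Madison";  /* s holds the address of 'J' */

    /* Walk the string one character at a time until the '\0' terminator. */
    for (const char *p = s; *p != '\0'; p++) {
        putchar(*p);
    }
    putchar('\n');

    printf("first character: %c (at address %p)\n", *s, (void *)s);
    return 0;
}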
Although I only have a passing familiarity with other computer languages, I believe that similar data is handled in an analogous fashion in other computer languages as well (i.e. with the use of strings).
Good luck
Wow, we have a good group of C#s here. I just could not resist that. But really, I never encountered so many brilliant people in so many fields as we have here.
Quoting an earlier answer: "A word is a number between 0 and 65535 (if unsigned), equivalent to 16 bits. A string is a sequence of characters, e.g. 'Hello, world!'."
The definition of a "word" depends on the computer's architecture. Most of my professional career was spent working with IBM mainframes, for which a word was 4 bytes (32 bits) and a halfword was 16 bits (what the Intel architecture calls a word). There have also been architectures for which a word was (I think) 36 bits. On the whole, the length of a word corresponds to the width of the most commonly used register of that architecture.
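A quick way to see what sizes a particular compiler and architecture actually use is to print a few of them; the numbers below will vary from machine to machine:

#include <stdio.h>

int main(void) {
    /* The sizes printed here depend on the compiler and architecture. */
    printf("short: %zu bytes\n", sizeof(short));
    printf("int:   %zu bytes\n", sizeof(int));
    printf("long:  %zu bytes\n", sizeof(long));
    printf("void*: %zu bytes\n", sizeof(void *));
    return 0;
}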
The definition of a string depends on the (programming) language under discussion. Most languages will accept any sequence of bytes (printable or not). C (and C++) are somewhat peculiar in that they don't really recognize "strings" as language elements at all. There are library subroutines for dealing with strings (in the normal sense), but these are not built into the language (as contrasted with, for example, Pascal).
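For example, C's everyday string handling lives in routines declared in <string.h> rather than in the language itself; a minimal sketch:

#include <stdio.h>
#include <string.h>

int main(void) {
    char buffer[32];
    const char *src = "Hello, world!";

    /* These are ordinary library functions declared in <string.h>,
       not part of the C language proper. */
    strcpy(buffer, src);                        /* copy the string          */
    printf("copy:   %s\n", buffer);
    printf("length: %zu\n", strlen(buffer));    /* characters before '\0'   */
    printf("equal:  %d\n", strcmp(buffer, src) == 0);
    return 0;
}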
A sequence of characters.