### Course: AP®︎/College Computer Science Principles > Unit 1

Lesson 1: Bits and bytes

# How do computers represent data?

When we look at a computer, we see text and images and shapes.
To a computer, all of that is just binary data, 1s and 0s.
The following 1s and 0s represent a tiny GIF:
This next string of 1s and 0s represents a command to add a number:
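The article's actual bit strings aren't reproduced here, but a small sketch can show the idea that the meaning of bits depends entirely on how we interpret them. The bit string below is illustrative (it spells the ASCII codes for the letters "GIF", the first bytes of every GIF file), not the article's example:

```python
# Illustrative sketch: the same bits can be read as numbers or as text,
# depending on how we choose to interpret them.
bits = "01000111 01001001 01000110"

# Split into 8-bit groups (bytes) and convert each group to an integer.
numbers = [int(group, 2) for group in bits.split()]
print(numbers)  # [71, 73, 70]

# Interpreted as ASCII character codes, the same bytes spell "GIF".
text = "".join(chr(n) for n in numbers)
print(text)  # GIF
```

The same three bytes could just as well be pixel colors or part of a machine instruction; nothing in the bits themselves says which.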
You might be scratching your head at this point. Why do computers represent information in such a hard-to-read way? And how can 1s and 0s represent so many different things? That's what we'll explore in this lesson.
To start off, check out the next video from Code.org where engineers from Microsoft and Adafruit introduce the basics of bits and binary data.

## Want to join the conversation?

• Why do computers use 1s and 0s specifically? Wouldn't it become very challenging for them to convert trillions of 1s and 0s so using 2s, 3s, 4s, etc would shorten the input they need to process?
• It really has more to do with the way computers work. At the fundamental level, a computer is built from transistors, which act as tiny switches (this is where you can find binary). A wire either carries an electrical signal or it doesn't; there is no in-between for on and off. Since each wire has only 2 possible states, each wire can represent exactly 2 values. As such, binary is used.
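The on/off idea in the answer above can be sketched in code. This is a toy model, not how hardware is actually programmed: each wire is just a boolean, and reading a group of wires in order gives a binary number.

```python
# Toy model: each wire is on (True) or off (False).
# Reading the wires in order gives the digits of a binary number.
wires = [True, False, True]  # on, off, on -> bits 1, 0, 1

bits = "".join("1" if wire_on else "0" for wire_on in wires)
value = int(bits, 2)  # interpret the bit string as a base-2 number
print(bits, "=", value)  # 101 = 5
```

With more wires you get more bits: 8 wires can represent 256 different values, which is why the byte became a convenient unit.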
• 1) How much jargon does one need to know before beginning the course?

2) Is there a jargon dictionary?

3) What is a GIF?
• Good question! There's a lot of jargon in the world of computers, so it's possible that I use jargon that some folks aren't familiar with.

A GIF is a type of image file that's popular on the internet these days, but you're right, "GIF" is jargon. I'd encourage learners to search the internet for jargon that is unfamiliar or ask a question as you've done here. I can then decide whether to reword something to avoid the jargon.

There is a vocabulary review here:
That only goes over the high-level vocabulary that's covered by the exam; it does not include all the jargon used in the articles and exercises.
• How is this class going to get better?
• It's not.
• I'm doing AP comp sci this year with no idea what I'm doing and no prior comp sci knowledge, lmao. Wish me luck.
• When you get used to it, it's not that hard, but if it's still difficult for you, then you may want to practice coding in your free time.
• Why do we have to use 1s and 0s? Why not different numbers?
• We use base 10 because we have 10 fingers. Computers at the lowest level only understand On/Off. On is represented by a 1 and off is represented by a 0. So we talk to computers using a series of ons and offs (1s and 0s).
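The base-10 versus base-2 contrast in the answer above can be shown with Python's built-in conversion functions. This is just a quick illustration of writing the same quantity in both bases:

```python
# The same quantity written two ways:
n = 10  # base 10, the way we count on our fingers

# In base 2, 10 becomes 1010: on, off, on, off.
print(bin(n))  # 0b1010

# Going the other way: read "1010" as a base-2 number.
print(int("1010", 2))  # 10
```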
• How do software engineers simplify this coding process? Or do they end up having to enter all of those ones and zeros?