
[MUSIC PLAYING]

DAVID MALAN: All right, this is CS50, Harvard University's introduction

to the intellectual enterprises of computer science

and the art of programming, back here on campus in beautiful Sanders Theatre

for the first time in quite a while.

So welcome to the class.

My name is David--

OK.

[CHEERING AND APPLAUSE]

So my name is David Malan.

And I took this class myself some time ago, but almost didn't.

It was sophomore fall and I was sitting in on the class.

And I was a little curious but, eh, it didn't really

feel like the field for me.

I was definitely a computer person, but computer science

felt like something else altogether.

And I only got up the nerve to take the class,

ultimately, because the professor at the time, Brian Kernighan,

allowed me to take the class pass/fail, initially.

And that is what made all the difference.

I quickly found that computer science is not just

about programming and working in isolation on your computer.

It's really about problem solving more generally.

And there was something about homework, frankly,

that was, like, actually fun for perhaps the first time in, what, 19 years.

And there was something about this ability

that I discovered, along with all of my classmates,

to actually create something and bring a computer to life to solve a problem,

and sort of bring to bear something that I'd been using every day

but didn't really know how to harness, that's been gratifying ever since,

and definitely challenging and frustrating.

Like, to this day, all these years later,

you're going to run up against mistakes, otherwise known as bugs,

in programming, that just drive you nuts.

And you feel like you've hit a wall.

But the trick really is to give it enough time,

to take a step back, take a break when you need to.

And there's nothing better, I daresay, than that sense of gratification

and pride, really, when you get something

to work, and in a class like this, present, ultimately,

at term's end, something like your very own final project.

Now, this isn't to say that I took to it 100% perfectly.

In fact, just this past week, I looked in my old CS50 binder, which I still

have from some 25 years ago, and took a photo

of what was apparently the very first program that I wrote and submitted,

and quickly received minus 2 points on.

But this is a program that we'll soon see in the coming days that

does something quite simple, like print "Hello, CS50," in this case,

to the screen.

And to be fair, I technically hadn't really

followed the directions, which is why I lost those couple of points.

But if you just look at this, especially if you've never programmed before,

you might have heard about a programming language

but you've never typed something like this out,

undoubtedly it's going to look cryptic.

But unlike human languages, frankly, which

are a lot more sophisticated, with a lot more vocabulary, a lot more

grammatical rules, programming, once you start to wrap your mind around what

it is and how it works and what these various languages are, it's so easy,

you'll see, after a few months of a class like this,

to start teaching yourself, subsequently,

other languages, as they may come, in the coming years as well.

So what ultimately matters in this particular course

is not so much where you end up relative to your classmates

but where you end up relative to yourself when you began.

And indeed, you'll begin today.

And the only experience that matters ultimately in this class is your own.

And so, consider where you are today.

Consider, perhaps, just how cryptic something like that

looked a few seconds ago.

And take comfort in knowing just some months from now all of that

will be within your own grasp.

And if you're thinking that, OK, surely the person in front of me, to the left,

to the right, behind me, knows more than me, that's statistically not the case.

2/3 of CS50 students have never taken a CS course before, which is to say,

you're in very good company throughout this whole term.

So then, what is computer science?

I claim that it's problem solving.

And the upside of that is that problem solving is

something we sort of do all the time.

But a computer science class, learning to program,

I think kind of cleans up your thoughts.

It helps you learn how to think more methodically, more carefully, more

correctly, more precisely.

Because, honestly, the computer is not going

to do what you want unless you are correct and precise and methodical.

And so, as such, there's these fringe benefits

of just learning to think like a computer scientist and a programmer.

And it doesn't take all that much to start doing so.

This, for instance, is perhaps the simplest picture of computer science,

sure, but really problem solving in general.

Problems are all about taking input, like the problem you want to solve.

You want to get the solution, a.k.a.

output.

And so, something interesting has got to be happening in here,

when you're trying to get from those inputs to outputs.

Now, in the world of computers specifically,

we need to decide in advance how we represent these inputs and outputs.

We all just need to decide, whether it's Macs or PCs or phones or something

else, that we're all going to speak some common language, irrespective

of our human languages as well.

And you may very well know that computers tend to speak only

what language, so to speak?

Assembly is one answer, but binary might be your go-to.

And binary, by implying two, means that the world of computers

has just two digits at its disposal, 0 and 1.

And indeed, we humans have many more than that, certainly not just zeros

and ones alone.

But a computer indeed only has zeros and ones.

And yet, somehow they can do so much.

They can crunch numbers in Excel, send text messages,

create images and artwork and movies and more.

And so, how do you get from something as simple as a few zeros, a few ones,

to all of the stuff that we're doing today

in our pockets and laptops and desktops?

Well, it turns out that we can start quite simply.

If a computer were to want to do something as simple as count, well,

what could it do?

Well, in our human world, we might count doing this,

like 1, 2, 3, 4, 5, using so-called unary notation, literally the digits

on your fingers where one finger represents one person in the room,

if I'm, for instance, taking attendance.

Now, we humans would typically actually count 1, 2, 3, 4, 5, 6.

And we'd go past just those five digits and count much higher,

using zeros through nines.

But computers, somehow, only have these zeros and ones.

So if a computer only somehow speaks binary, zeros and ones,

how does it even count past the number 1?

Well, here are 3 zeros, of course.

And if you translate this number in binary, 000,

to a more familiar number in decimal, we would just call this zero.

Enough said.

If we were to represent, with a computer, the number 1,

it would actually be 001, which, not surprisingly,

is exactly the same as we might do in our human world,

but we might not bother writing out the two zeros at the beginning.

But a computer, now, if it wants to count as high as two,

it doesn't have the digit 2.

And so it has to use a different pattern of zeros and ones.

And that happens to be 010.

So this is not 10 with a zero in front of it.

It's indeed zero one zero in the context of binary.

And if we want to count higher now than two,

we're going to have to tweak these zeros and ones further to get 3.

And then if we want 4 or 5 or 6 or 7, we're

just kind of toggling these zeros and ones, a.k.a.

bits, for binary digits that represent, via these different patterns,

different numbers that you and I, as humans, know,

of course, as the so-called decimal system, 0 through 9,

dec implying 10, 10 digits, those zeros through nine.
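If you'd like to see those patterns for yourself, here's a minimal C sketch,
not from the lecture itself, that prints all eight 3-bit patterns alongside
their decimal equivalents:

    #include <stdio.h>

    int main(void)
    {
        for (int n = 0; n < 8; n++)
        {
            // Pull out each of the three bits, most significant first.
            printf("%d%d%d in binary is %d in decimal\n",
                   (n >> 2) & 1, (n >> 1) & 1, n & 1, n);
        }
    }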

So why that particular pattern?

And why these particular zeros and ones?

Well, it turns out that representing one thing or the other

is just really simple for a computer.

Why?

At the end of the day, they're powered by electricity.

And it's a really simple thing to just either store some electricity

or don't store some electricity.

Like, that's as simple as the world can get, on or off.

1 or 0, so to speak.

So, in fact, inside of a computer, a phone, anything

these days that's electronic, pretty much,

is some number of switches, otherwise known as transistors.

And they're tiny.

You've got thousands, millions of them in your Mac or PC or phone these days.

And these are just tiny little switches that can get turned on and off.

And by turning those things on and off in patterns,

a computer can count from 0 on up to 7, and even higher than that.

And so these switches, really, you can think of as being like switches

like this.

Let me just borrow one of our little stage lights here.

Here's a light bulb.

It's currently off.

And so, I could just think of this as representing,

in my laptop, a transistor, a switch, representing 0.

But if I allow some electricity to flow, now I, in fact, have a 1.

Well, how do I count higher than 1?

I, of course, need another light bulb.

So let me grab another one here.

And if I put it in that same kind of pattern, I don't want to just do this.

That's sort of the old finger counting way of unary, just 1, 2.

I want to actually take into account the pattern

of these things being on and off.

So if this was one a moment ago, what I think I did earlier was I turned it off

and let the next one over be on, a.k.a.

010.

And let me get us a third bit, if you will.

And that feels like enough.

Here is that same pattern now, starting at the beginning, with 3 bulbs.

So here is 000.

Here is 001.

Here is 010, a.k.a., in our human world of decimal, 2.

And then we could, of course, keep counting further.

This now would be 3 and dot dot dot.

If this other bulb now goes on, and that switch is turned

and all three stay on-- this, again, was what number?

AUDIENCE: Seven.

DAVID MALAN: OK, so, seven.

So it's just as simple, relatively, as that, if you will.

But how is it that these patterns came to be?

Well, these patterns actually follow something very familiar.

You and I don't really think about it at this level

anymore because we've probably been doing math and numbers since grade

school or whatnot.

But if we consider something in decimal, like the number 123,

I immediately jump to that.

This looks like 123 in decimal.

But why?

It's really just three symbols, a 1, a 2 with a bit of curve, a 3

with a couple of curves, that you and I now instinctively

just assign meaning to.

But if we do rewind a few years, that is one hundred twenty-three

because you're assigning meaning to each of these columns.

The 3 is in the so-called ones place.

The 2 is in the so-called tens place.

And the 1 is in the so-called hundreds place.

And then the math ensues quickly in your head.

This is technically 100 times 1, plus 10 times 2, plus 1 times 3, a.k.a.

100 plus 20 plus 3.

And there we get the sort of mathematical notion we know as 123.

Well, nicely enough, in binary, it's actually the same thing.

It's just these columns mean a little something different.

If you use three digits in decimal, and you have the ones place,

the tens place, and the hundreds place, well, why was that 1, 10, and 100?

They're technically just powers of 10.

So 10 to the 0, 10 to the 1, 10 to the 2.

Why 10?

Decimal system, "dec" meaning 10.

You have 10 digits, 0 through 9.

In the binary system, if you're going to use three digits,

just change the base, since you're using only zeros and ones.

So now it's powers of 2, 2 to the 0, 2 to the 1, 2 to the 2, a.k.a.

1 and 2 and 4, respectively.

And if you keep going, it's going to be 8s column, 16s column, 32, 64,

and so forth.
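To make that column-based arithmetic concrete, here's a short C sketch, again
not from the lecture, that converts a string of binary digits to decimal by
summing each bit times its power-of-2 place value (the "110" input is just a
hypothetical example):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        const char *bits = "110";  // hypothetical input; 4 + 2 + 0 = 6
        int value = 0;
        int place = 1;  // the 1s column, then 2s, 4s, 8s, ...
        for (int i = strlen(bits) - 1; i >= 0; i--)
        {
            if (bits[i] == '1')
            {
                value += place;
            }
            place *= 2;
        }
        printf("%s in binary is %d in decimal\n", bits, value);
    }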

So, why did we get these patterns that we did?

Here's your 000 because it's 4 times 0, 2 times 0, 1 times 0, obviously 0.

This is why we got the decimal number 1 in binary.

This is why we got the number 2 in binary, because it's 4 times

0, plus 2 times 1, plus 1 times 0, and now 3, and now 4, and now 5, and now 6,

and now 7.

And, of course, if you wanted to count as high as 8, to be clear,

what do you have to do?

What does a computer need to do to count even higher than 7?

AUDIENCE: Add a bit.

DAVID MALAN: Add a bit.

Add another light bulb, another switch.

And, indeed, computers have standardized just how

many zeros and ones, or bits or switches,

they throw at these kinds of problems.

And, in fact, most computers would typically use at least eight at a time.

And even if you're only counting as high as three or seven,

you would still use eight and have a whole bunch of zeros.

But that's OK, because the computers these days certainly

have so many more, thousands, millions of transistors and switches

that that's quite OK.

All right, so, with that said, if we can now count as high as seven

or, frankly, as high as we want, that only

seems to make computers useful for things like Excel,

like number crunching.

But computers, of course, let you send text messages,

write documents, and so much more.

So how would a computer represent something like a letter,

like the letter A of the English alphabet, if, at the end of the day,

all they have is switches?

Any thoughts?

Yeah.

AUDIENCE: You can represent letters in numbers.

DAVID MALAN: OK, so we could represent letters using numbers.

OK, so what's a proposal?

What number should represent what?

AUDIENCE: Say if you were starting at the beginning of the alphabet,

you could say 1 is A, 2 is B, 3 is C.

DAVID MALAN: Perfect.

Yeah, we just all have to agree somehow that one number is

going to represent one letter.

So 1 is A, 2 is B, 3 is C, Z is 26, and so forth.

Maybe we can even take into account uppercase and lowercase.

We just have to agree and sort of write it down in some global standard.

And humans, indeed, did just that.

They didn't use 1, 2, 3.

It turns out they started a little higher up.

Capital A has been standardized as the number 65.

And capital B has been standardized as the number 66.

And you can kind of imagine how it goes up from there.

And that's because whatever you're representing,

ultimately, can only be stored, at the end of the day, as zeros and ones.

And so, some humans in a room before, decided that capital A shall be 65,

or, really, this pattern of zeros and ones inside of every computer

in the world, 01000001.

So if that pattern of zeros and ones ever appears in a computer,

it might be interpreted then as indeed a capital letter A, eight of those bits

at a time.
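A minimal C sketch of that duality, not from the lecture itself: in C, a char
really is just a small number underneath the hood, so the very same pattern
01000001 can be printed as a letter or as a number:

    #include <stdio.h>

    int main(void)
    {
        char c = 65;        // the pattern 01000001
        printf("%c\n", c);  // interpreted as a letter: A
        printf("%i\n", c);  // interpreted as a number: 65
    }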

But I worry, just to be clear, we might have now created a problem.

It might seem, if I play this naively, that, OK,

how do I now actually do math with the number 65,

if Excel now displays 65 as an A, let alone Bs and Cs?

So how might a computer do as you've proposed,

have this mapping from numbers to letters, but still support numbers?

It feels like we've given something up.

Yeah?

AUDIENCE: By having a prefix for letters?

DAVID MALAN: By having a prefix?

AUDIENCE: You could have prefixes and suffixes.

DAVID MALAN: OK, so we could perhaps have some kind of prefix,

like some pattern of zeros and ones--

I like this-- that indicates to the computer

here comes another pattern that represents a letter.

Here comes another pattern that represents a number or a letter.

So, not bad.

I like that.

Other thoughts?

How might a computer distinguish these two?

Yeah.

AUDIENCE: Have a different file format, so,

like, odd text or just check the graphic or--

DAVID MALAN: Indeed, and that's spot-on.

Nothing wrong with what you suggested, but the world generally does just that.

The reason we have all of these different file formats in the world,

like JPEG and GIF and PNGs and Word documents, .docx,

and Excel files and so forth, is because a bunch of humans got in a room

and decided, well, in the context of this type of file, or really,

more specifically, in the context of this type of program,

Excel versus Photoshop versus Google Docs or the like,

we shall interpret any patterns of zeros and ones as being maybe numbers

for Excel, maybe letters in, like, a text messaging program or Google Docs,

or maybe even colors of the rainbow in something like Photoshop and more.

So it's context dependent.

And we'll see, when we ourselves start programming,

you the programmer will ultimately provide

some hints to the computer that tells the computer, interpret it as follows.

So, similar in spirit to that, but not quite as standardized with these

prefixes.

So this system here actually has a name ASCII, the American Standard

Code for Information Interchange.

And indeed, it began here in the US, and that's

why it's actually a little biased toward A's through Z's

and a bit of punctuation as well.

And that quickly became a problem.

But if we start simply now, in English, the mapping

itself is fairly straightforward.

So if A is 65, B is 66, and dot dot dot, suppose

that you received a text message, an email, from a friend,

and underneath the hood, so to speak, if you kind of

looked inside the computer, what you technically received in this text

or this email happened to be the numbers 72, 73, 33,

or, really, the underlying pattern of zeros and ones.

What might your friend have sent you as a message, if it's 72, 73, 33?

AUDIENCE: Hey.

DAVID MALAN: Hey?

Close.

AUDIENCE: Hi.

DAVID MALAN: Hi.

It's, indeed, hi.

Why?

Well, apparently, according to this little cheat sheet, H is 72, I is 73.

It's not obvious from this chart what the 33 is,

but indeed, this pattern represents "hi."

And anyone want to guess, or if you know, what 33 is?

AUDIENCE: Exclamation point.

DAVID MALAN: Exclamation point.

And this is, frankly, not the kind of thing most people know.

But it's easily accessible by a nice user-friendly chart like this.

So this is an ASCII chart.

When I said that we just need to write down this mapping earlier,

this is what people did.

They wrote it down in a book or in a chart.

And, for instance, here is our 72 for H, here is our 73 for I,

and here is our 33 for exclamation point.
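And here's a hedged C sketch of that decoding, using the lecture's example
numbers: each number is simply printed as the character it maps to in ASCII.

    #include <stdio.h>

    int main(void)
    {
        int codes[] = {72, 73, 33};
        for (int i = 0; i < 3; i++)
        {
            printf("%c", codes[i]);  // prints H, then I, then !
        }
        printf("\n");
    }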

And computers, Macs, PCs, iPhones, Android devices,

just know this mapping by heart, if you will.

They've been designed to understand those letters.

So here, I might have received "hi."

Technically, what I've received is these patterns of zeros and ones.

But it's important to note that when you get these patterns of zeros and ones

in any format, be it email or text or a file,

they do tend to come in standard lengths,

with a certain number of zeros and ones altogether.

And this happens to be 8 plus 8, plus 8.

So just to get the message "hi, exclamation point,"

you would have received at least, it would seem, some 24 bits.

But frankly, bits are so tiny, literally and mathematically,

that we don't tend to think or talk, generally, in terms of bits.

You're probably more familiar with bytes.

B-Y-T-E-S, as in bytes.

A byte is just 8 bits.

And even those, frankly, aren't that useful if we do out the math.

How high can you count if you have eight bits?

Anyone know?

Say it again?

Higher than that.

Unless you want to go negative, that's fine.

256, technically 255.

Long story short, if we actually got into the weeds of all of these zeros

and ones, and we figured out what 11111111 mathematically adds up

to in decimal, it would indeed be 255, or less

if you want to represent negative numbers as well.
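To make that arithmetic concrete, here's a tiny C sketch, not from the
lecture, that doubles eight times to arrive at those 256 possible patterns,
and thus a top count of 255:

    #include <stdio.h>

    int main(void)
    {
        int patterns = 1;
        for (int i = 0; i < 8; i++)
        {
            patterns *= 2;  // 2 times 2 times 2 ..., eight times in total
        }
        // 256 patterns, but spending one of them on zero
        // leaves 0 through 255.
        printf("%d patterns, counting 0 through %d\n",
               patterns, patterns - 1);
    }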

So this is useful because now we can speak, not just in terms of bytes

but, if the files are bigger, kilobytes is thousands of bytes,

megabytes is millions of bytes, gigabytes is billions of bytes,

terabytes are trillions of bytes, and so forth.

We have a vocabulary for these increasingly large quantities of data.

The problem is that, if you're using ASCII and, therefore, eight bits or one

byte per character, and originally, only seven, you

can only represent 255 characters.

And that's actually 256 total characters, including zero.

And that's fine if you're using literally English, in this case,

plus a bunch of punctuation.

But there's many human languages in the world

that need many more symbols and, therefore, many more bits.

So, thankfully, the world decided that we'll indeed

support not just the US English keyboard, but all

of the accented characters that you might want for some languages.

And heck, if we use enough bits, zeros and ones,

not only can we represent all human languages in written form,

as well as some emotions along the way, we

can capture the latter with these things called emojis.

And indeed, these are very much in vogue these days.

You probably send and/or receive many of these things any given day.

These are just characters, like letters of an alphabet, patterns

of zeros and ones that you're receiving, that the world has also standardized.

For instance, there are certain emojis that

are represented with certain patterns of bits.

And when you receive them, your phone, your laptop, your desktop,

displays them as such.

And this newer standard is called Unicode.

So it's a superset of what we called ASCII.

And Unicode is just a mapping of many more numbers to many more letters

or characters, more generally, that might

use eight bits for backwards compatibility

with the old way of doing things with ASCII, but they might also use 16 bits.

And if you have 16 bits, you can actually

represent more than 65,000 possible letters.

And that's getting up there.

And heck, Unicode might even use 32 bits to represent letters and numbers

and punctuation symbols and emojis.

And that would give you up to 4 billion possibilities.

And, I daresay, one of the reasons we see so many emojis these days is we

have so much room.

I mean, we've got room for billions more, literally.

So, in fact, just as a little bit of trivia,

has anyone ever received this decimal number, or if you prefer binary now,

has anyone ever received this pattern of zeros and ones on your phone,

in a text or an email, perhaps this past year?

Well, if you actually look this up, this esoteric sequence of zeros and ones

happens to represent face with medical mask.
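For the curious, here's a minimal C sketch, assuming your terminal is
configured for UTF-8: face with medical mask is the Unicode code point
U+1F637, which travels as the four bytes hard-coded below.

    #include <stdio.h>

    int main(void)
    {
        // F0 9F 98 B7 is the UTF-8 encoding of U+1F637,
        // face with medical mask.
        printf("\xF0\x9F\x98\xB7\n");
    }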

And notice that if you've got an iPhone or an Android device,

you might be seeing different things.

In fact, this is the Android version of this, most recently.

This is the iOS version of it, most recently.

And there's bunches of other interpretations by other companies

as well.

So Unicode, as a consortium, if you will,

has standardized the descriptions of what these things are.

But the companies themselves, manufacturers out there,

have generally interpreted it as they see fit.

And this can lead to some human miscommunications.

In fact, for like, literally, embarrassingly, like a year or two,

I started being in the habit of using the emoji that kind of looks

like this because I thought it was like woo, happy face, or whatever.

I didn't realize this is the emoji for hug

because whatever device I was using sort of looks like this, not like this.

And that's because of their interpretation of the data.

This has happened too when what was a gun became a water

pistol in some manufacturers' eyes.

And so it's an interesting dichotomy between what information we all

want to represent and how we choose, ultimately, to represent it.

Questions, then, on these representations, or formats,

be it numbers or letters, or soon more?

Yeah?

AUDIENCE: Why is hexadecimal popular for a computer

if binary is the basis for everything?

DAVID MALAN: Sorry, why is what so popular?

AUDIENCE: Why is hexadecimal popular if binary is the fundamental--

DAVID MALAN: Yeah, so we'll come back to this in a few weeks, in fact.

There are other ways to represent numbers.

Binary is one.

Decimal is another.

Unary is another.

And hexadecimal is yet a fourth that uses 16 total digits, literally 0

through 9 plus A, B, C, D, E, F. And somehow,

you can similarly count even higher with those.

We'll see in a few weeks why this is compelling.

But hexadecimal, long story short, uses four bits per digit.

And so, at four bits each, if you have two digits in hex, that gives you eight bits.

And it's just a very convenient unit of measure.

And it's also human convention in the world of files and other things.

But we'll come back to that soon.
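As a small illustration, not from the lecture, of why hexadecimal is so
convenient: printf can show the same number in decimal and in hex, and one
full byte is exactly two hex digits.

    #include <stdio.h>

    int main(void)
    {
        int n = 255;  // 11111111 in binary, one full byte
        printf("%d in decimal is %x in hexadecimal\n", n, n);  // 255, ff
    }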

Other questions?

AUDIENCE: Do the lights on the stage supposedly say that--

DAVID MALAN: Do the lights on the stage supposedly say anything?

Well, if we had thought in advance to use maybe 64 light bulbs,

that would seem to give us 8 total bytes on stage, 8 times 8,

giving us just that.

Maybe.

Good question.

Other questions on 0's and 1's?

It's a little bright in here.

No?

Oh, yes?

Where everyone's pointing somewhere specific.

There we go.

Sorry.

Very bright in this corner.

AUDIENCE: I was just going to ask about the 255 bits,

like with the maximum characters.

[INAUDIBLE]

DAVID MALAN: Ah, sure, and we'll come back to this, in some form,

in the coming days, at a slower pace.

We have, with eight bits, two possible values for the first bit,

and then two for the next, two for the next, and so forth.

So that's 2 times 2 times 2.

That's 2 to the eighth power total, which

means you can have 256 total possible patterns of zeros and ones.

But as we'll see soon, computer scientists, programmers,

and software often start counting at 0 by convention.

And if you use one of those patterns, 00000000,

to represent the decimal number we know as zero,

you only have 255 other patterns left, and so you can count only as high as 255.

That's all.

Good question.

All right, so what then might we have besides these emojis and letters

and numbers?

Well, we of course have things like colors in programs

like Photoshop, and pictures and photos.

Well, let me ask the question again.

How might a computer, do you think, knowing what you know now, represent

something like a color?

Like what are our options if all we've got are zeros and ones and switches?

Yeah?

AUDIENCE: RGB

DAVID MALAN: RGB.

RGB indeed is this acronym that represents some amount of red,

some amount of green, and some amount of blue.

And indeed, computers can represent colors by doing just that.

Remember, for instance, this dot.

This yellow dot on the screen that might be part of any of those emojis

these days, well that's some amount of red, some amount of green,

some amount of blue.

And if you sort of mix those colors together,

you can indeed get a very specific one.

And we'll see just that in a moment.

So indeed earlier on, humans only used seven bits total.

And it was only once they decided, well, let's add an eighth bit that they

got extended ASCII and that was initially in part

a solution to the same problem of not having enough room, if you will,

in those patterns of zeros and ones to represent all of the characters

that you might want.

But even that wasn't enough and that's why we've now gone up to 16 and 32

and long past 7.

So if we come back now to this one particular color.

RGB was proposed as a scheme, but how might this work?

Well, consider for instance this.

If we do indeed decide as a group to represent any color of the rainbow

with some mixture of some red, some green, and some blue,

we have to decide how to represent the amount of red and green and blue.

Well, it turns out if all we have are zeros and ones, ergo numbers,

let's do just that.

For instance, suppose a computer were using these three numbers, 72, 73, 33,

no longer in the context of an email or a text message,

but now in the context of something like Photoshop, a program for editing

and creating graphical files.

Maybe these three numbers could be interpreted as representing

some amount of red, green, and blue, respectively.

And that's exactly what happens.

You can think of the first number as red, the second as green, the third as blue.

And so ultimately when you combine that amount of red, that amount of green,

that amount of blue, it turns out it's going to resemble the shade of yellow.

And indeed, you can come up with a number between 0 and 255

for each of those colors to mix any other color that you might want.
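Here's a minimal C sketch of that scheme, reusing the lecture's hypothetical
72, 73, 33 as one pixel's amounts of red, green, and blue; the #RRGGBB form
at the end is the same 24 bits written in hexadecimal, as web pages often do.

    #include <stdio.h>

    int main(void)
    {
        // One pixel's worth of color: three numbers from 0 to 255.
        unsigned char red = 72, green = 73, blue = 33;
        printf("rgb(%d, %d, %d) is #%02X%02X%02X\n",
               red, green, blue, red, green, blue);
    }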

And you can actually see this in practice.

Even though our screens, admittedly, are getting really good

on our phones and laptops such that you barely see the dots, they are there.

You might have heard the term pixel before.

Pixel's just a dot on the screen and you've

got thousands, millions of them these days horizontally and vertically.

If I take even this emoji, which again happens

to be one company's interpretation of a face with medical mask

and zoom in a bit, maybe zoom in a bit more,

you can actually start to see these pixels.

Things get pixelated because what you're seeing

is each of the individual dots that compose this particular image.

And apparently each of these individual dots

is probably using 24 bits, eight bits for red, eight bits for green, eight

bits for blue, in some pattern.

This program, or some other like Photoshop, is interpreting each pattern

as white or yellow or black or some brown in between.

So if you look sort of awkwardly, but up close to your phone or your laptop

or maybe your TV, you can see exactly this, too.
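And as a sketch, not from the lecture, of how those dots might be laid out
in a program's memory: an image is just a grid of pixels, each pixel 24 bits.

    #include <stdio.h>

    // One pixel: 24 bits, 8 each for red, green, and blue.
    typedef struct
    {
        unsigned char red;
        unsigned char green;
        unsigned char blue;
    } pixel;

    int main(void)
    {
        // A hypothetical 2x2 image, every pixel the same shade of yellow.
        pixel image[2][2];
        for (int row = 0; row < 2; row++)
        {
            for (int col = 0; col < 2; col++)
            {
                image[row][col] = (pixel) {255, 255, 0};
            }
        }
        printf("%zu bytes of pixel data\n", sizeof(image));  // 12 bytes
    }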

All right, well, what about things that we also

watch every day on YouTube or the like?

Things like videos.

How would a computer, knowing what we know now,

represent something like a video?

How might you represent a video using only zeros and ones?

Yeah?

AUDIENCE: As we can see here, they represent images, right?

[INAUDIBLE] sounds of the 0 and 1s as well.

[INAUDIBLE]

DAVID MALAN: Yeah, exactly.

To summarize, what video really adds is just some notion of time.

It's not just one image, it's not just one letter or a number,

it's--


