The First Programming Languages: Crash Course Computer Science #11

This episode is brought to you by CuriosityStream. Hi, I’m Carrie Anne and welcome to Crash Course Computer Science! So far, for most of this series, we’ve focused on hardware — the physical components of computing — things like electricity and circuits, registers and RAM, ALUs and CPUs. But programming at the hardware level is cumbersome and inflexible, so programmers wanted a more versatile way to program computers – what you might call a “softer” medium. That’s right, we’re going to talk about Software!

[INTRO]

In episode 8, we walked through a simple program
for the CPU we designed. The very first instruction to be executed, the one at memory address 0, was 0010 1110. As we discussed, the first four bits of an instruction are the operation code, or OPCODE for short. On our hypothetical CPU, 0010 indicated a LOAD_A instruction — which moves a value from memory into Register A. The second set of four bits defines the memory location, in this case 1110, which is 14 in decimal. So what these eight bits really mean is “LOAD Address 14 into Register A”. We’re just using two different languages.
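To make that split concrete, here’s a tiny sketch in Python (the CPU is the hypothetical one from episode 8, so the opcode meaning comes from there):

    # Toy sketch: split the 8-bit instruction 0010 1110 into its two halves,
    # a 4-bit opcode and a 4-bit memory address.
    instruction = 0b00101110
    opcode = instruction >> 4               # high four bits: 0010 -> LOAD_A
    address = instruction & 0b1111          # low four bits:  1110 -> 14
    print(format(opcode, "04b"), address)   # prints: 0010 14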
You can think of it like English and Morse Code. “Hello” and “.... . .-.. .-.. ---” mean the same thing — hello! — they’re just encoded differently. English and Morse Code also have different levels of complexity. English has 26 different letters in its alphabet and way more possible sounds. Morse only has dots and dashes. But they can convey the same information, and computer languages are similar.
As we’ve seen, computer hardware can only handle raw, binary instructions. This is the “language” computer processors natively speak. In fact, it’s the only language they’re able to speak. It’s called Machine Language or Machine Code.

In the early days of computing, people had to write entire programs in machine code. More specifically, they’d first write a high-level version of a program on paper, in English, for example: “retrieve the next sale from memory, then add this to the running total for the day, week and year, then calculate any tax to be added” …and so on. An informal, high-level description of a program like this is called Pseudo-Code. Then, when the program was all figured out on paper, they’d painstakingly expand and translate it into binary machine code by hand, using things like opcode tables. After the translation was complete, the program could be fed into the computer and run. As you might imagine, people quickly got fed up with this process.
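As an aside, in a modern high-level language that pseudo-code translates almost line for line. Here’s a sketch in Python; the sales figures and tax rate are invented:

    # The sales pseudo-code above, written out in Python (all values invented).
    sales = [19.99, 5.49, 12.00]   # stand-in for sale records sitting in memory
    TAX_RATE = 0.08                # assumed tax rate
    day = week = year = 0.0

    for sale in sales:             # "retrieve the next sale from memory"
        day += sale                # "add this to the running total
        week += sale               #  for the day, week and year"
        year += sale
        tax = sale * TAX_RATE      # "then calculate any tax to be added"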
So, by the late 1940s and into the 50s, programmers had developed slightly higher-level languages that were more human-readable. Opcodes were given simple names, called mnemonics, which were followed by operands, to form instructions. So instead of having to write instructions as a bunch of 1’s and 0’s, programmers could write something like “LOAD_A 14”. We used this mnemonic in Episode 8 because it’s so much easier to understand!

Of course, a CPU has no idea what “LOAD_A 14” is. It doesn’t understand text-based language, only binary. And so programmers came up with a clever trick. They created reusable helper programs, in binary, that read in text-based instructions and assemble them into the corresponding binary instructions automatically. This program is called — you guessed it — an Assembler. It reads in a program written in an Assembly Language and converts it to native machine code. “LOAD_A 14” is one example of an assembly instruction.
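To see what “reads in text, writes out binary” means in practice, here’s a toy one-instruction assembler in Python. It’s a sketch, not the episode’s actual program, and the opcode table holds just the one opcode we know from episode 8:

    # Toy assembler: look up the mnemonic's 4-bit opcode and pack it
    # together with the 4-bit operand into one 8-bit machine instruction.
    OPCODES = {"LOAD_A": 0b0010}   # table for our hypothetical CPU

    def assemble(line):
        mnemonic, operand = line.split()
        return (OPCODES[mnemonic] << 4) | (int(operand) & 0b1111)

    print(format(assemble("LOAD_A 14"), "08b"))   # prints: 00101110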
Over time, Assemblers gained new features that made programming even easier. One nifty feature is automatically figuring out JUMP addresses. This was an example program I used in episode 8. Notice how our JUMP NEGATIVE instruction jumps to address 5, and our regular JUMP goes to address 2. The problem is, if we add more code to the beginning of this program, all of the addresses would change. That’s a huge pain if you ever want to update your program! And so an assembler does away with raw jump addresses and lets you insert little labels that can be jumped to. When this program is passed into the assembler, it does the work of figuring out all of the jump addresses.
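Here’s a minimal sketch of that two-pass idea in Python; the listing, label syntax, and mnemonics are invented for illustration:

    # Two-pass label resolution, roughly how assemblers handle jump labels:
    # pass 1 records each label's address, pass 2 patches the JUMPs.
    listing = ["top:", "LOAD_A 14", "SUB 14", "JUMP_NEG done", "JUMP top",
               "done:", "HALT"]

    labels, code = {}, []
    for line in listing:                    # pass 1: collect label addresses
        if line.endswith(":"):
            labels[line[:-1]] = len(code)   # a label marks the next instruction
        else:
            code.append(line)

    for i, instr in enumerate(code):        # pass 2: swap labels for addresses
        op, *rest = instr.split()
        if op.startswith("JUMP"):
            code[i] = f"{op} {labels[rest[0]]}"
    print(code)   # ['LOAD_A 14', 'SUB 14', 'JUMP_NEG 4', 'JUMP 0', 'HALT']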
Now the programmer can focus more on programming and less on the mechanics under the hood; hiding that unnecessary complexity enables more sophisticated things to be built. As we’ve done many times in this series, we’re once again moving up another level of abstraction. A NEW LEVEL OF ABSTRACTION!

However, even with nifty assembler features like auto-linking JUMPs to labels, Assembly Languages are still a thin veneer over machine code. In general, each assembly language instruction converts directly to a corresponding machine instruction – a one-to-one mapping – so it’s inherently tied to the underlying hardware. And the assembler still forces programmers to think about which registers and memory locations they will use. If you suddenly needed an extra value, you might have to change a lot of code to fit it in.

Let’s go to the Thought Bubble.
This problem did not escape Dr. Grace Hopper. As a US naval officer, she was one of the first programmers on the Harvard Mark 1 computer, which we talked about in Episode 2. This was a colossal, electro-mechanical beast completed in 1944 as part of the Allied war effort. Programs were stored and fed into the computer on punched paper tape. By the way, as you can see, they “patched” some bugs in this program by literally putting patches of paper over the holes on the punch tape. The Mark 1’s instruction set was so primitive, there weren’t even JUMP instructions. To create code that repeated the same operation multiple times, you’d tape the two ends of the punched tape together, creating a physical loop. In other words, programming the Mark 1 was kind of a nightmare!

After the war, Hopper continued to work at the forefront of computing. To unleash the potential of computers, she designed a high-level programming language called “Arithmetic Language Version 0”, or A-0 for short. Assembly languages have a direct, one-to-one mapping to machine instructions. But a single line of a high-level programming language might result in dozens of instructions being executed by the CPU. To perform this complex translation, Hopper built the first compiler in 1952. This is a specialized program that transforms “source” code written in a programming language into a low-level language, like assembly, or into the binary “machine code” that the CPU can directly process.

Thanks, Thought Bubble.
So, despite the promise of easier programming, many people were skeptical of Hopper’s idea. She once said, “I had a running compiler and nobody would touch it. … they carefully told me, computers could only do arithmetic; they could not do programs.” But the idea was a good one, and soon many efforts were underway to craft new programming languages — today there are hundreds! Sadly, there are no surviving examples of A-0 code, so we’ll use Python, a modern programming language, as an example. Let’s say we want to add two numbers and save that value. Remember, in assembly code, we had to fetch values from memory, deal with registers, and handle other low-level details. But this same program can be written in Python like so:
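(The on-screen code isn’t in the transcript; this is a minimal reconstruction using the variable names a, b, and c mentioned below, with made-up values.)

    a = 5        # store the first number in a variable named a
    b = 10       # store the second number in b
    c = a + b    # add them and save the result in a third variable, c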
Notice how there are no registers or memory locations to deal with — the compiler takes care of that stuff, abstracting away a lot of low-level and unnecessary complexity. The programmer just creates abstractions for needed memory locations, known as variables, and gives them names. So now we can just take our two numbers, store them in variables we give names to — in this case, I picked a and b, but those variables could be anything — and then add those together, saving the result in c, another variable I created. It might be that the compiler assigns Register A under the hood to store the value in a, but I don’t need to know about it! Out of sight, out of mind!
It was an important historical milestone, but A-0 and its later variants weren’t widely used. FORTRAN, derived from “Formula Translation”, was released by IBM a few years later, in 1957, and came to dominate early computer programming. John Backus, the FORTRAN project director, said: “Much of my work has come from being lazy. I didn’t like writing programs, and so … I started work on a programming system to make it easier to write programs.”

You know, typical lazy person. They’re always creating their own programming systems. Anyway, on average, programs written in FORTRAN were 20 times shorter than equivalent handwritten assembly code. Then the FORTRAN compiler would translate and expand that into native machine code. The community was skeptical that the performance would be as good as handwritten code, but the fact that programmers could write more code more quickly made it an easy choice economically: trading a small increase in computation time for a significant decrease in programmer time.
Of course, IBM was in the business of selling computers, and so initially, FORTRAN code could only be compiled and run on IBM computers. And most programming languages and compilers of the 1950s could only run on a single type of computer. So, if you upgraded your computer, you’d often have to re-write all the code too! In response, computer experts from industry, academia and government formed a consortium in 1959 — the Committee on Data Systems Languages, advised by our friend Grace Hopper — to guide the development of a common programming language that could be used across different machines. The result was the high-level, easy-to-use Common Business-Oriented Language, or COBOL for short.

To deal with different underlying hardware, each computing architecture needed its own COBOL compiler. But critically, these compilers could all accept the same COBOL source code, no matter what computer it was run on. This notion is called write once, run anywhere. It’s true of most programming languages today, a benefit of moving away from assembly and machine code, which are still CPU-specific.
The biggest impact of all this was reducing computing’s barrier to entry. Before high-level programming languages existed, it was a realm exclusive to computer experts and enthusiasts, and it was often their full-time profession. But now, scientists, engineers, doctors, economists, teachers, and many others could incorporate computation into their work. Thanks to these languages, computing went from a cumbersome and esoteric discipline to a general-purpose and accessible tool. At the same time, abstraction in programming allowed those computer experts – now “professional programmers” – to create increasingly sophisticated programs, which would have taken millions, tens of millions, or even more lines of assembly code.

Now, this history didn’t end in 1959. In fact, a golden era in programming language design kicked off, evolving in lockstep with dramatic advances in computer hardware. In the 1960s, we had languages like ALGOL, LISP and BASIC. In the 70s, Pascal, C and Smalltalk were released. The 80s gave us C++, Objective-C, and Perl. And the 90s: Python, Ruby, and Java. The new millennium has seen the rise of Swift, C#, and Go – not to be confused with Let It Go and Pokémon Go. Anyway, some of these might sound familiar — many are still around today. It’s extremely likely that the web browser you’re using right now was written in C++ or Objective-C.

That list I just gave is the tip of the iceberg, and languages with fancy new features are proposed all the time. Each new language attempts to leverage new and clever abstractions to make some aspect of programming easier or more powerful, or to take advantage of emerging technologies and platforms, so that more people can do more amazing things, more quickly. Many consider the holy grail of programming to be the use of “plain ol’ English”, where you can literally just speak what you want the computer to do, and it figures it out and executes it. This kind of intelligent system is science fiction… for now. And fans of 2001: A Space Odyssey may be okay with that.

Now that you know all about programming languages, we’re going to deep dive for the next couple of episodes, and we’ll continue to build your understanding of how programming languages, and the software they create, are used to do cool and unbelievable things. See you next week.
Hey guys, this week’s episode was brought to you by CuriosityStream, which is a streaming service full of documentaries and nonfiction titles from some really great filmmakers, including exclusive originals. I just watched a great series called “Digits”, hosted by our friend Derek Muller. It’s all about the Internet – from its origins, to the proliferation of the Internet of Things, to ethical, or white hat, hacking. And it even includes some special guest appearances… like that John Green guy you keep mentioning in the comments. CuriosityStream offers unlimited access starting at $2.99 a month, and for you guys, the first two months are free if you sign up at curiositystream.com/crashcourse and use the promo code “crash course” during the sign-up process.
