Can someone tell me the very basics of how computer programming works?


Question

What makes all the words of a programming language actually do anything? I mean, what's actually happening to make the computer know what all of those words mean? If I verbally tell my computer to do something, it doesn't do it, because it doesn't understand. So how exactly can these human words written into a language actually cause the computer to do some desirable activity?

Solution

It all starts with the CPU, or processor. Each processor type has a defined set of instructions it's able to perform. These instructions operate on ones and zeroes, which in turn represent whatever you wish them to: numbers, letters, even the instructions themselves.
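A quick way to see that the bits themselves carry no fixed meaning: in C, the very same byte can be printed as a number or as a letter, depending purely on how you interpret it (a minimal illustration, assuming the ASCII encoding):

    #include <stdio.h>

    int main(void) {
        /* The same eight bits, 01000001, interpreted two ways: as the
         * number 65 and as the letter 'A' (ASCII). The bits never
         * change; only our interpretation of them does. */
        unsigned char bits = 0x41;            /* binary 01000001 */
        printf("as a number: %d\n", bits);    /* prints 65 */
        printf("as a letter: %c\n", bits);    /* prints A */
        return 0;
    }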

At the lowest level, a zero is represented by the presence of a certain voltage (usually near 0 V) at a transistor, and a one by the presence of a different voltage (CPU dependent, say 5 V).

The machine instructions themselves are sets of zeroes and ones placed in special locations in the processor called registers. The processor takes an instruction and its operands from specific locations and performs the operation, placing the result in yet another location, then fetches the next instruction, and so on and so forth, until it runs out of instructions to perform or is turned off.

A simple example. Let's say the machine instruction 001 means add two numbers.

Then you write a program that adds two numbers, usually like this:

    4 + 5

Then you pass this text to a compiler, which will generate the adequate machine code for the processor you will run the program on (sidenote: you can compile code to run on a different processor from the one you are compiling on; this is called cross compilation and is useful, for instance, on embedded platforms). The compiler will end up generating, roughly,

    001 00000100 00000101

with additional boilerplate machine code to place the 001 instruction in the next-instruction register (the instruction pointer) and the binary-encoded numbers in data registers (or RAM).
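To make that fetch-and-execute cycle concrete, here is a minimal sketch in C of a toy simulator for the made-up machine above; the opcodes (1 for add, 2 for negate, 0 for halt) are invented for this example and are not a real instruction set:

    #include <stdio.h>

    /* A toy machine matching the example above: opcode 1 (binary 001)
     * adds the next two operands, opcode 2 negates the accumulator,
     * opcode 0 halts. These opcodes are made up, not a real ISA. */
    enum { OP_HALT = 0, OP_ADD = 1, OP_NEG = 2 };

    int main(void) {
        int memory[] = { OP_ADD, 4, 5, OP_NEG, OP_HALT }; /* the program */
        int ip  = 0;   /* instruction pointer register */
        int acc = 0;   /* accumulator register for results */

        for (;;) {
            switch (memory[ip++]) {       /* fetch and decode */
            case OP_ADD:                  /* execute: add two operands */
                acc = memory[ip] + memory[ip + 1];
                ip += 2;
                break;
            case OP_NEG:                  /* execute: negate the result */
                acc = -acc;
                break;
            case OP_HALT:
                printf("result: %d\n", acc);   /* prints -9 */
                return 0;
            }
        }
    }

A real CPU does essentially this loop in hardware, billions of times per second.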

The process of generating machine code from structured languages is fairly complex, and it places limits on how natural these languages can end up looking. That's why you can't write a program in English: there's too much ambiguity in it for a compiler to be able to generate the proper sequence of zeroes and ones.

The instructions CPUs can execute are fairly basic and simple: addition, division, negation, read from RAM, write to RAM, read from a register, and so on.

The next question is: how can these simple instructions over numbers generate all the wonders we see in computing (the internet, games, movie players, etc.)?

It basically boils down to the creation of adequate models. For instance, a 3D game engine has a mathematical model that represents the game world and can calculate the positions and collisions of game objects based on it.

These models are built on very many of these small instructions, and here's where high-level languages (which are not machine code) really shine: they raise the abstraction level, so you can think closer to the model you want to implement. That lets you reason easily about things like how to efficiently calculate the soldier's next position based on the input received from the controller, instead of being unable to reason at all because you're too busy trying not to forget a zero.
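As a purely illustrative sketch of that abstraction level, here is what the soldier example might look like in C; the types and field names are invented for the example, not taken from any real engine:

    #include <stdio.h>

    /* Hypothetical model types, invented for illustration. */
    struct Vec2    { float x, y; };
    struct Input   { float stick_x, stick_y; };  /* controller stick, -1..1 */
    struct Soldier { struct Vec2 pos; float speed; };

    /* The next position follows from the model: current position plus
     * controller direction scaled by speed and elapsed time. No zeroes
     * or ones in sight, yet it all compiles down to them. */
    struct Vec2 next_position(const struct Soldier *s,
                              const struct Input *in, float dt) {
        struct Vec2 p = s->pos;
        p.x += in->stick_x * s->speed * dt;
        p.y += in->stick_y * s->speed * dt;
        return p;
    }

    int main(void) {
        struct Soldier s  = { { 0.0f, 0.0f }, 2.5f };
        struct Input   in = { 1.0f, 0.0f };                 /* pushing right */
        struct Vec2    p  = next_position(&s, &in, 0.016f); /* one 60 Hz frame */
        printf("next position: (%.3f, %.3f)\n", p.x, p.y);
        return 0;
    }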

A crucial moment was the jump from assembly language (a language very similar to machine code; it was the first programming language, it is CPU specific, and every assembly instruction translates directly into machine code) to C (which is portable among different CPUs and sits at a higher level of abstraction than assembly: each line of C code represents many machine code instructions). This was a huge productivity increase for programmers: they no longer had to port programs between different CPUs, and they could think a lot more easily about the underlying models, leading to the continued increase in software complexity we've seen (and even demanded) from the 1970s until today.
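To make "each line of C represents many machine code instructions" concrete: the single statement below might compile to something like the three x86-style instructions shown in the comment, though the exact sequence depends on the compiler and the target CPU.

    #include <stdio.h>

    int main(void) {
        int a, b = 4, c = 5;

        a = b + c;   /* one line of C... */

        /* ...might become, on an x86-style CPU, roughly:
         *     mov eax, [b]   ; read b from memory into a register
         *     add eax, [c]   ; add c to it
         *     mov [a], eax   ; store the result back into a
         * The compiler produces the right sequence for whichever CPU
         * you target; that's the portability win over assembly. */
        printf("%d\n", a);
        return 0;
    }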

The remaining missing link is how to control what is done with all that information and how to receive input from external sources: displaying images on the screen, writing information to a hard drive, printing an image on a printer, or receiving keystrokes from a keyboard. This is all made possible by the rest of the hardware present in the computer, which is controlled in a way similar to the CPU: you place data and instructions in certain transistors in the graphics card, the network card, the hard drive, or the RAM. The CPU has instructions that allow it to place data or instructions into (or read information out of) the proper locations of the different pieces of hardware.
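On many systems this takes the form of memory-mapped I/O: a device's registers are wired to appear at fixed memory addresses, so ordinary load and store instructions reach the hardware. A sketch in embedded-style C, with an invented address (real hardware's datasheet documents where its registers actually live):

    #include <stdint.h>

    /* 0x10000000 is a made-up address for illustration; on real
     * hardware the device's documentation gives the actual one. */
    #define UART_TX (*(volatile uint8_t *)0x10000000u)

    /* Each store below goes to the serial port's transmit register,
     * not to ordinary RAM, so the device receives the bytes. */
    void uart_write(const char *s) {
        while (*s)
            UART_TX = (uint8_t)*s++;
    }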

Another thing relevant to what we have today is that all modern computers come with big programs called operating systems that manage all the basic stuff, like talking to the hardware and error handling (what happens if a program crashes, and so on). In addition, many modern programming environments come with a lot of already-written code (standard libraries) to handle many basic tasks, like drawing on a screen or reading a file. These libraries will in turn ask the operating system to talk to the hardware on their behalf.
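For example, this ordinary C program never touches the screen or the disk itself; the standard library formats the data and then asks the operating system, via system calls, to do the hardware work:

    #include <stdio.h>

    int main(void) {
        /* printf formats the text, then asks the OS (e.g. via the
         * write system call on Unix-like systems) to get the bytes
         * to the screen; the OS drives the actual hardware. */
        printf("Hello, world!\n");

        /* Same story for files: the library asks the OS to talk to
         * the disk on the program's behalf. */
        FILE *f = fopen("note.txt", "w");
        if (f != NULL) {
            fputs("written through the OS, not the disk directly\n", f);
            fclose(f);
        }
        return 0;
    }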

If these weren't available, programming would be a very, very hard and tedious task, since every program you wrote would have to recreate the code to draw a single letter on the screen or to read a single bit from each specific type of hard drive, for example.

It seems I got carried away, I hope you understand something out of this :-)

