
Basic Concepts of Data Processing (Part 2)

Article index:

Languages, Syntax, and Semantics
Historical Overview of Computers

Languages, Syntax, and Semantics

Now that we understand that machines work on a binary level, the next step is to find a way for humans to communicate with a computer. We’ve learned that an OS and applications provide instructions so people can use the computer, but where does this software come from? The answer is: programmers. Programmers (or coders) have to use a special language (or languages) that both humans and computers can understand. To this end, programming languages were invented.

There are many languages, but they all perform a similar function: they are words and symbols that translate directly to binary commands for the computer. Fortunately, one simple word in English can stand for a long series of binary commands, making it much easier for the programmer to specify what they want to do (Eck, 2009). One such language is Java.
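To make that concrete, here is a minimal, illustrative Java program (a sketch written for this explanation, not a quoted example): the single println statement is one readable line to a human, yet the compiler and the machine ultimately carry it out as a long series of binary instructions.

public class Hello {
    public static void main(String[] args) {
        // One readable statement; the computer executes it as many binary instructions.
        System.out.println("Hello, world!");
    }
}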

Syntax

Besides defining words and symbols that translate to computer instructions (or machine language), a programming language must also establish the correct ordering of those symbols and words. Just like the words in an English sentence, programming instructions only make sense in the correct order; otherwise, they are just a jumble. If you took the words of this sentence and put them in a random order, it would ignore proper syntax and make little sense. Syntax also includes rules about when certain symbols are allowed to be used, just as English has rules about where punctuation can go in a sentence.

foodItem = JOptionPane.showInputDialog("Your choices are: \n1.Hamburger\n2.Cheeseburger\n3.Fish Burger\n4.Veggie Burger");

The above line of Java code is an assignment statement: it takes a variable, foodItem, and stores in it whatever the user types into the dialog box that appears on the screen.

The text to be displayed is in quotes; however, it contains a special code, “\n”, which means: insert an end of line here. To humans, the “\n” probably looks odd, but Java knows it means the text should continue on the next line. Syntax is crucial in computer programming. If just one character or symbol is out of place, the code will not work. Computers are terrible at guessing; if you don’t say exactly what you want according to the rules, they simply refuse to do anything.
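For context, here is a minimal, self-contained sketch of how that statement might appear inside a complete Java program. The class name OrderDemo and the confirmation dialog are illustrative additions, not part of the original example.

import javax.swing.JOptionPane;

public class OrderDemo {
    public static void main(String[] args) {
        // The "\n" sequences tell Java to start a new line inside the dialog text.
        String foodItem = JOptionPane.showInputDialog(
                "Your choices are: \n1.Hamburger\n2.Cheeseburger\n3.Fish Burger\n4.Veggie Burger");

        // Echo the user's choice back in a second dialog box.
        JOptionPane.showMessageDialog(null, "You chose: " + foodItem);
    }
}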

Semantics

After you’re sure you’ve followed all the rules, you have to make sure that you’ve got the correct meaning: the semantics. If I very clearly describe how to make a peanut butter sandwich, but actually wanted a chicken salad, then I have a semantics problem. It’s possible to write code that a computer reads perfectly well, yet have it do something different from what you intended. Experienced coders learn many skills to prevent that kind of thing from happening; it is best to be aware of this from the beginning, since bad programming habits are difficult to unlearn. One such skill is adding comments to code. These are notes to yourself (and other coders) that describe what the code in a certain section is doing. They are especially helpful when you haven’t looked at that part of the program for a day or longer, so you know what the code was intended to do instead of having to figure it out all over again. Advanced programs consist of thousands of lines of code; without comments, they are very difficult for any coder to work with.
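As a small illustration (a hypothetical snippet, not taken from the article), Java comments begin with // for a single line or are wrapped in /* ... */ for longer notes; the compiler ignores them entirely.

// Ask the user which burger they want; their answer comes back as text.
String foodItem = JOptionPane.showInputDialog("Enter 1, 2, 3, or 4 to choose a burger");

/* Note to future readers: foodItem holds whatever the user typed,
   so it should be checked before the order is processed. */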

Speaking “computerese”

In many ways, a computer language is similar to a human language. There are words and symbols that have meaning, and a specific set of rules about the order in which they can be used and when they can be used at all. Finally, following these rules, you must assemble a message that makes sense and conveys the concept you intend. Just like learning a new human language, you have to start with the basics, and it definitely takes time. The good news is that once you have learned one computer language, the concepts and functionality of most others are similar, so they are much easier to learn.

At first, the semantics of programming languages may seem pedantic and difficult to understand. However, if we “think like a computer,” it becomes easier. In English, we have several ways of conveying the same concept: for humans, this is artistic expression, but for a computer it is ambiguous and unnecessary. The more precise and exact you are with a computer, the better results you can expect. Look at the following sentence:

“One day I ate a hot dog in my underwear.”

You’d probably imagine someone sitting in their boxers eating a hot dog. But you might also imagine that the hot dog is wearing the underwear. The sentence is grammatically correct, but semantically it is ambiguous. Computer code must be written precisely to get the proper results.

Historical Overview of Computers

There’s no single machine that we can point to and say “here is the first computer.” It was more of an evolution of advanced machines into something that eventually resembled the computers we have today. In 1801, Joseph Marie Jacquard invented a loom that used punch cards made of wood to create fabric designs automatically (Zimmermann, 2017). Technically, this was programming: a pre-made set of instructions translated into a “machine language” that tells a machine what to do. However, this machine wasn’t doing any computing; it was simply weaving cloth.

Later, during World War II, machines were used to encode secret messages. These machines used gears that would align with symbols to turn regular or “plaintext” messages into coded text. Breaking those codes, however, required actual computing power. The 2014 film The Imitation Game tells the story of Alan Turing; the electromechanical codebreaking machine he designed (the Bombe) could take in encrypted messages, process them, and output an answer. However, it had no keyboard or monitor, let alone a mouse. To look at the history of modern computing machines, we will break it down into four eras.

Behemoths

These enormous computers were built with vacuum tubes and wires and took up entire rooms. Input was usually entered by flipping switches and turning dials, and output was often given via lights or holes punched in paper. They were basically number-crunching machines, used to perform large mathematical computations or to work with large amounts of numerical data. The U.S. Census Bureau purchased one of these huge computers, the UNIVAC, in 1951 to help with counting the population of the United States (Fabry, 2016).

Business

Over time, these behemoths got a little smaller and could be produced more quickly. They eventually became affordable for medium to large-sized businesses (instead of requiring an entire government’s GDP to purchase).

Output from these machines came through a teletype terminal, where text printed automatically from a typewriter-like device. Eventually, in the 1970s, “dumb terminals” were created with monochrome CRT screens. Unlike the personal computers of today, these screen-and-keyboard units did not have their own CPU or RAM; they were connected via an electrical signal to the mainframe, which did all of the calculations. Several “dumb terminals” could be connected to a mainframe at the same time, sharing the central computer’s processing power. Universities also became major purchasers of these mainframes at this time. Many of these mainframes ran the Unix operating system, the predecessor of Linux. It wasn’t until the 1980s that the idea of taking a computer home for personal use became widespread. Early home computers were text-based; there was no graphical interface yet and no need for a mouse, so only technical people and hobbyists owned them.

Graphical User Interfaces

The introduction of Apple’s Macintosh computer in 1984 started a personal computer revolution. Though there were earlier computers with graphics capabilities (e.g., the Commodore 64 in 1982), the Macintosh included a mouse and a Graphical User Interface (GUI). Instead of having to type precise and obscure text commands to use a computer, users could now point and click. This made computers much more accessible and led to a new, booming market for home computers. They were not only used for business applications, home budgeting, and the like; they were also used for gaming. Richard Garriott’s Ultima II was released for the Macintosh in 1985. At first, the Mac’s display was only black and white, but as computers became more powerful, more colors and pixels were available. The main competitor to the Macintosh was the PC, a computer originally made by IBM using an Intel CPU. Even though the Mac was technically also a personal computer, “PC” became synonymous with the Intel-based IBM system (and later other similar “clones”). In the 1990s, Microsoft’s Windows operating system brought the GUI to the PC mainstream. This was the beginning of the legendary “Mac versus PC” divide that still exists today. Modern Macs and PCs now both use Intel CPUs and have very similar hardware designs.

Portable

The next generation of personal computing was driven by display technology—the invention and widespread use of flat screens. If you were born in the mid-1990s, you may never have known a world filled with clunky CRT monitors giving off waves of heat. Once flat-screen technology was refined, it enabled truly portable computers to be mass-produced. In the past, some enterprising computer manufacturers had attempted to create “portable” computers such as the Osborne, which weighed over 24 pounds and had a five-inch monochrome CRT built-in. Unsurprisingly, these did not become popular. The flat screens of the 1990s were used to make easy-to-carry laptops and also replaced the bulky CRT monitors used for desktops. As the technology developed, more pixels and colors were made to fit on ever-shrinking screens for tablets and smartphones. Today’s smartphone is still a computer—it has a CPU, as well as long-term and short-term storage.


