Computer science definition

Informatics is the science of the automatic processing of information by computers, electronic devices, and computer systems.

Computer science studies methods and techniques for storing, processing, and transmitting information and data automatically. In other words, it studies computing and its applications.

Computer science draws on several other sciences, including mathematics, physics, and electronics.

Its development began in the middle of the 20th century, with major advances such as the integrated circuit, the arrival of the personal computer, the Internet, and the mobile phone.

In fact, computing gave rise to the so-called Information Age and drove the Information Revolution, regarded as the third great technological leap of humanity, alongside the Industrial Revolution (1750-1850) and the Neolithic (agricultural) Revolution (8000-5000 BC).

A more formal definition

Informatics is the systematic study of the feasibility, structure, expression, and mechanization of methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication, and access to information.
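An example of such a methodical procedure is Euclid's algorithm for the greatest common divisor, one of the oldest algorithms still in everyday use. A minimal Python sketch (an illustration, not part of the definition above):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```

The loop is guaranteed to terminate because the remainder strictly decreases at every step, which is exactly the kind of feasibility question the definition refers to.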

Origin of the term computing

The term comes from the French informatique and was coined by the engineer Philippe Dreyfus in 1962 as a blend of the words «information» and «automatique».

Other sources trace it to the German word Informatik, from a 1957 paper by Karl Steinbuch titled Informatik: Automatische Informationsverarbeitung (Informatics: Automatic Information Processing).

Meanwhile, in 1962, Walter F. Bauer independently co-founded a company called Informatics General.

In English, the field is known as computer science.

Great contributions of computing

– The start of the so-called Digital Revolution, or Information Revolution, which includes the Internet and the Information Age.

– The conception of programming languages: precise tools for expressing information at various levels of abstraction.

– Cryptography. The Allied victory in World War II was aided significantly by the breaking of the German Enigma cipher.

– Scientific computing, which makes it possible to evaluate complex processes and situations, carry out experiments entirely in software, and build predictive tools, for example the simulation of physical and chemical processes.

– Algorithms that improve the efficiency and liquidity of financial markets, using artificial intelligence and other large-scale statistical and numerical techniques.

– Computer graphics, which are used for entertainment: cinema, television, advertising, video games.

– The optimization of machines; modern aviation, for example, would be impossible without computing.

– Artificial Intelligence is probably the area where the greatest achievements will be seen in the coming decades.

Origins of computing

The foundations of computing can be traced back hundreds of years before the invention of the modern digital computer. Calculating devices such as the abacus have existed since antiquity, some supporting operations as complex as multiplication and division.

Algorithms for performing calculations have likewise existed since ancient times. For example, the Shulba Sutras (roughly, «rules of the cord»), ancient Sanskrit treatises written by various authors around 800 BC, are books of algorithms for constructing geometric objects such as fire altars using a peg and a cord, an early precursor of the modern field of computational geometry. They also contain attempts at squaring the circle.


Blaise Pascal designed and built the first working mechanical calculator, Pascal’s calculator, in the year 1642.

In 1673, Gottfried Leibniz demonstrated a mechanical digital calculator called the Stepped Reckoner. He can be considered the first computer scientist in history; he documented, among other things, the binary number system.
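The binary system Leibniz documented underlies all modern digital computing. As a simple modern illustration (not Leibniz's own notation), a Python sketch that derives a number's binary digits by repeated division by two:

```python
def to_binary(n):
    """Return the binary digits of a non-negative integer:
    each remainder modulo 2 is the next bit, least significant first."""
    if n == 0:
        return "0"
    bits = ""
    while n:
        bits = str(n % 2) + bits
        n //= 2
    return bits

print(to_binary(13))  # "1101", i.e. 8 + 4 + 1
```

Each position in the result is a power of 2, just as each decimal position is a power of 10.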


In 1820, Thomas de Colmar launched the first industrial mechanical calculator when he released his simplified Arithmometer; it was the first calculating machine robust and reliable enough for daily use in an office environment.

Charles Babbage (1791-1871), the father of computing

In 1822, Charles Babbage (1791-1871) began designing the first automatic mechanical calculator, the Difference Engine, which eventually gave him the idea for the first programmable mechanical calculator, his Analytical Engine.

Difference Engine by Charles Babbage in 1833, exhibited in 1862.

He began developing this machine in 1834, and in less than two years he had sketched out many of the salient features of a modern computer. A crucial step was the adoption of a punched-card system derived from the Jacquard loom, which made the machine «infinitely» programmable.

In 1843, while translating a French article on the Analytical Engine, Ada Lovelace (1815-1852) wrote what is considered today the first computer program in history, designed to compute Bernoulli numbers.

Ada Lovelace (1815-1852) wrote the first computer program in history.
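Lovelace's program was written for a machine that was never built. As a modern illustration of the same computation (using a standard recurrence, not her actual method), a short Python sketch that generates Bernoulli numbers with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n (convention B_1 = -1/2),
    via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print(bernoulli(6))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

Fractions are used instead of floats because Bernoulli numbers are exact rationals and rounding errors would accumulate in the recurrence.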

In 1885, Herman Hollerith invented the tabulating machine, used to process punched cards carrying statistical information; his company eventually became part of IBM.


In 1937, a hundred years after Babbage’s impossible dream, Howard Aiken convinced IBM (which made all kinds of punched-card equipment and was also in the calculator business) to develop the giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage’s Analytical Engine, which itself used punched cards and a central computing unit. When the machine was completed, some called it «Babbage’s dream come true».

During the 1940s, as new computing machines were developed, the term «computer» came to refer to machines rather than people (it had previously designated a person who performed calculations).

It became clear that computers could be used for many more applications than just mathematical calculations.

The first computer science degree program (the Cambridge Diploma in Computer Science) was offered by the University of Cambridge in 1953.

Undoubtedly, IBM was the company that contributed the most to computing in those decades.

Over time, computing has gone from being a field dominated by experts and professionals to being accessible to anyone interested in it.

Computer illiteracy

This refers to the lack of basic knowledge about new technologies.

See Computer illiteracy.

Computer security

The area specialized in protecting computer systems and the information they handle.

See the full article here: Computer Security.
