Moments from IT history that changed the world

The development of the first programming language

Everything we do on a computer today, from searching in a browser to running everyday tasks on a PC, ultimately rests on programming languages. They are the bridges between human beings and machines, so how did it all start?

Ada Lovelace: mathematician famous for writing the first computer program

Ada Lovelace was a mathematical genius whose friendship with the mathematician, philosopher, inventor and mechanical engineer Charles Babbage resulted in early ideas for mechanical calculators and a preliminary design for a general-purpose computer, the Analytical Engine.

Her key idea was that the machine manipulated numbers as abstract quantities, so that the Analytical Engine “might act upon other things besides numbers”. (1)

In this way Lovelace was the first to describe the Analytical Engine’s potential beyond that of a simple calculator. Among the numerous appendices of her paper, she included an algorithm in appendix G for finding Bernoulli numbers.


                   Diagram for the computation of Bernoulli numbers

It is considered to be the first published algorithm ever specifically tailored for implementation on a computer, and Ada Lovelace has often been cited as the first computer programmer for this reason. The engine was never completed, so her program was never tested. (2)
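
The numbers her table set out to produce can be generated today from a standard recurrence. The sketch below is only a modern Python illustration of that recurrence with exact fractions, not a reconstruction of the Note G program:

    # Compute Bernoulli numbers with exact fractions using the standard
    # recurrence  B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k,  B_0 = 1.
    # A modern sketch for illustration, not Lovelace's Note G method.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(Fraction(-1, m + 1) * s)
        return B

    print(", ".join(str(b) for b in bernoulli(8)))
    # 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30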

What is next? Machine learning

Have you ever heard of machine learning? The answer may well be something like: “Hmm, you mean AI?” That is technically correct, but machine learning actually branched off from AI around 1970 and developed independently. ML is the study of computer algorithms that can improve automatically through experience and by the use of data. (4)
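
To make “improving by the use of data” concrete, here is a minimal sketch that fits a straight line to a handful of sample points by least squares and then uses the fitted model for a prediction it was never explicitly programmed with. All names and numbers are invented for illustration:

    # Fit a line y = a*x + b to sample "training data" by least squares,
    # then use the fitted model to predict an unseen point.
    def fit_line(xs, ys):
        """Return slope a and intercept b of the least-squares line through (xs, ys)."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var = sum((x - mean_x) ** 2 for x in xs)
        a = cov / var
        return a, mean_y - a * mean_x

    # Hypothetical training data: ad spend vs. units sold.
    spend = [1.0, 2.0, 3.0, 4.0, 5.0]
    sales = [2.1, 4.3, 6.2, 7.9, 10.1]

    a, b = fit_line(spend, sales)
    print(f"model: sales ~= {a:.2f} * spend + {b:.2f}")
    print(f"prediction for spend = 6.0: {a * 6.0 + b:.2f}")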

Hebb and Samuel

Machine learning is an important part of modern business and scientific research. It employs algorithms and neural network models to help computer systems improve their performance with experience. Using sample data, also known as “training data”, the algorithms build a mathematical model that makes decisions automatically, without being explicitly programmed to make them. The conceptual roots of this idea go back to a model of brain cell interaction created in 1949 by Donald Hebb in his book “The Organization of Behavior”, which presents Hebb’s theories on neuron excitement and communication between neurons.


Hebb wrote, “When one cell repeatedly assists in firing another, the axon of the first cell develops synaptic knobs (or enlarges them if they already exist) in contact with the soma of the second cell.” Translated to artificial neural networks and artificial neurons, Hebb’s model can be described as a way of altering the strength of the connections between artificial neurons (also referred to as nodes) according to the activity of the individual neurons involved.
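
Written as code, this strengthening of co-active connections is usually simplified to the update delta_w = learning_rate * pre * post. The sketch below is only a minimal illustration of that rate-based simplification; the node activities and numbers are made up:

    # Minimal Hebbian update: a connection is strengthened in proportion to the
    # joint activity of its presynaptic input and the postsynaptic output node.
    def hebbian_update(weights, pre_activity, post_activity, learning_rate=0.1):
        """Return new weights after one Hebbian strengthening step."""
        return [w + learning_rate * pre * post_activity
                for w, pre in zip(weights, pre_activity)]

    # Made-up activities for illustration only.
    weights = [0.0, 0.0, 0.0]
    pre = [1.0, 0.0, 1.0]      # which input nodes fired
    post = 1.0                 # the output node fired as well

    for _ in range(5):         # repeated co-activation keeps enlarging the weights
        weights = hebbian_update(weights, pre, post)

    print(weights)             # co-active connections grew (~0.5 each); the idle one stayed at 0.0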

Regarding the first learning algorithm, Arthur Samuel used a minimax strategy with alpha-beta pruning in his computer program for playing checkers in the 1950s. The program’s scoring function attempted to measure the chances of each side winning, and this way of choosing moves evolved into the minimax algorithm.
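
To make the search strategy concrete, here is a minimal, game-agnostic sketch of minimax with alpha-beta pruning. Instead of a real checkers engine, the game tree is written out as nested lists whose leaves are scores from a static evaluation function (higher is better for the maximizing side); the tree is purely illustrative, not Samuel’s program:

    # Minimax with alpha-beta pruning over an explicit game tree.
    def alphabeta(node, alpha, beta, maximizing):
        """Return the minimax value of node, skipping branches that cannot matter."""
        if not isinstance(node, list):        # leaf: a score from the evaluation function
            return node
        if maximizing:
            best = float("-inf")
            for child in node:
                best = max(best, alphabeta(child, alpha, beta, False))
                alpha = max(alpha, best)
                if alpha >= beta:             # the minimizing opponent avoids this branch
                    break
            return best
        else:
            best = float("inf")
            for child in node:
                best = min(best, alphabeta(child, alpha, beta, True))
                beta = min(beta, best)
                if alpha >= beta:             # pruning: no need to look further here
                    break
            return best

    # Toy tree (illustrative only): maximizer to move, leaves are evaluation scores.
    tree = [[3, 5], [6, [9, 1]], [1, 2]]
    print(alphabeta(tree, float("-inf"), float("inf"), True))   # -> 6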

The program Samuel designed also had a number of mechanisms that enabled it to improve. Through what Samuel called rote learning, it could record and remember the positions it had already seen and combine them with the values of the reward function. Samuel was also the first to come up with the phrase “machine learning”, in 1952. (3)
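
In modern terms, rote learning amounts to memoizing the value computed for each position so that a position seen before never has to be searched again. A rough sketch, with a hypothetical evaluation function standing in for the look-ahead and reward computation:

    # Remember the value of every position the first time it is evaluated.
    position_memory = {}

    def remembered_value(position, evaluate_position):
        """Return a cached value for position, evaluating it only on first sight."""
        key = tuple(position)                 # board encoding assumed flattenable
        if key not in position_memory:
            position_memory[key] = evaluate_position(position)
        return position_memory[key]

    # Dummy evaluation that counts how often it is actually called.
    calls = 0
    def dummy_eval(position):
        global calls
        calls += 1
        return sum(position)                  # placeholder "score"

    board = [1, 0, -1, 1]
    remembered_value(board, dummy_eval)
    remembered_value(board, dummy_eval)       # the second lookup hits the cache
    print(calls)                              # -> 1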

ML nowadays

What do we use ML for now? We cannot even list all the ways it is applied in modern business, but here are some of the most common examples:

  • Analyzing Sales Data
  • Real-Time Mobile Personalization
  • Fraud Detection
  • Product Recommendations
  • Learning Management Systems
  • Dynamic Pricing
  • Natural Language Processing (3)

CPU and its history

Which component is the most crucial in modern computing? What is the first part we choose when building a new PC? The CPU, of course. There are plenty of characteristics and models to pick from today, but what was the first prototype that started its history?

What was the starting point?

Silicon is the main component of modern CPUs. It was discovered by Baron Jöns Jacob Berzelius in 1823, and its excellent semiconducting properties and abundance made it well suited to the microchip industry. (7)

The first generation: vacuum tubes. The earliest computers used vacuum tubes for processing and magnetic drums for memory. A single machine was huge, occupying a whole room, not to mention its electricity consumption. Because of this, the processors tended to overheat and malfunction, so something had to change.

Transistors formed the second generation of CPUs. They replaced vacuum tubes, although the heat problem was not fully solved. The transistor was invented in 1947, but at first it was rarely used in computers. Eventually this component made computers cheaper, smaller, faster, more energy-efficient and more reliable than their predecessors.


        IBM PowerPC 604e processor

Third-generation CPUs arrived in the form of integrated circuits. Being smaller and cheaper than discrete transistors, integrated circuits turned computers into products for a mass audience. Third-generation machines moved beyond printouts and could run more than one program at once. (5)

Microprocessors brought the fourth generation of CPUs. They were far superior to their predecessors thanks to their structure: thousands of integrated circuits built onto a single chip. In contrast to the room-sized first generation, a microprocessor can simply be held in the hand. Advances in technology led to the invention of the microprocessor in the early 1970s. Since the introduction of the first commercially available microprocessor, the Intel 4004 in 1971, and the first widely used microprocessor, the Intel 8080 in 1974, this class of CPUs has almost completely overtaken all other central processing unit implementation methods. (6)

                Intel 4004 CPU

Combined with the advent and eventual success of the ubiquitous personal computer, the term CPU is now applied almost exclusively to microprocessors. Several CPUs (denoted cores) can be combined in a single processing chip. (6)

1 – https://www.newscientist.com/people/ada-lovelace/

2 – https://en.wikipedia.org/wiki/Ada_Lovelace#First_computer_program

3 – https://www.dataversity.net/a-brief-history-of-machine-learning/#

4 – https://en.wikipedia.org/wiki/Machine_learning

5 – https://www.uniassignment.com/essay-samples/information-technology/the-history-of-central-processing-unit-information-technology-essay.php

6 – https://en.wikipedia.org/wiki/Central_processing_unit

7 – https://www.computerhope.com/history/processor.htm

