Developments in computing.
Long-distance telephone signals needed to be amplified, and vacuum tubes were pretty bad at the job.
In 1945, AT&T decided to find a substitute for the vacuum tube.
After a number of unsuccessful tries, tensions arose in the team, and Shockley decided to go it alone.
On 16 December 1947, Bardeen and Brattain, without telling Shockley, built a point-contact transistor from strips of gold foil. Shockley then built a better one, pushed them aside, and tried to patent it himself.
A transistor is a three-terminal device: the current flowing between two of the terminals can be controlled by a small current (or voltage) applied to the third terminal.
Some say that Shockley put the silicon in Silicon Valley. He hired a few people for Shockley Semiconductor in Mountain View, but nobody from Bell Labs (full of brilliant people) would work for him.
Seems like everyone who ever worked for him ended up leaving (and forming companies like Fairchild and Intel).
Bardeen left Bell Labs in 1951 for the University of Illinois (where he won a second Nobel Prize, for superconductivity). He died in 1991.
Shockley's company folded and he joined Stanford. He held notorious views on race, genetics, and intelligence, even suggesting voluntary sterilisation for people with low IQs. He died in disgrace in 1989.
Integrated circuits.
The transistor and advances in solid-state physics led to the development of the integrated circuit: the chip.
Kilby, at Texas Instruments, produced the first IC in 1958, proving that resistors and capacitors could exist on the same piece of material.
Noyce, at Fairchild, came up with the same idea independently at about the same time.
Claude Shannon
Shannon is called "The Father of Information Theory".
His master's thesis has been called "possibly the most important master's thesis of the century".
He showed that relay switching circuits behave like Boolean algebra, so you can analyse (and design) any switching circuit using Boolean algebra. Everyone was terribly impressed.
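A minimal sketch of the idea (in Python, with made-up switch names, not anything from the original notes): switches in series behave like Boolean AND, switches in parallel like Boolean OR, so a circuit's behaviour can be read straight off a truth table.

    # Shannon's insight: a switch is a Boolean variable, and circuit
    # topology maps onto Boolean operators.
    def series(a: bool, b: bool) -> bool:
        # Two switches in series conduct only if both are closed: AND.
        return a and b

    def parallel(a: bool, b: bool) -> bool:
        # Two switches in parallel conduct if either is closed: OR.
        return a or b

    # A small circuit: switch s1 in series with (s2 parallel to s3).
    def circuit(s1: bool, s2: bool, s3: bool) -> bool:
        return series(s1, parallel(s2, s3))

    # Enumerating the truth table is exactly how the circuit is analysed.
    for s1 in (False, True):
        for s2 in (False, True):
            for s3 in (False, True):
                print(s1, s2, s3, "->", circuit(s1, s2, s3))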
He also completed his PhD thesis under Vannevar Bush.
Shannon asked: how can we measure information? What does information even mean? How do you define and quantify it?
Shannon provided a mathematical definition of information: he defined entropy, and showed how to measure information as if it were a physical quantity.
Information is related to the probability of an event. Learning that something unlikely has happened carries a lot of information; conversely, if you were already sure something happened and someone tells you it did, that carries very little information.
In information theory, the entropy equation H = -Σ pᵢ log₂ pᵢ plays the role that E = mc² does in physics.
You can afford a longer code for rarer (more surprising) symbols, and use a shorter code for the symbols you are more likely to send. Matching code length to probability is how you minimise the expected message length.
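A small sketch of both ideas (the distribution here is made up purely for illustration): compute the entropy of a source, and the ideal code length -log₂ p for each symbol.

    import math

    # A toy symbol distribution (made-up probabilities for illustration).
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    # Shannon entropy: H = -sum(p * log2(p)), in bits per symbol.
    entropy = -sum(p * math.log2(p) for p in probs.values())

    # The ideal code length for a symbol of probability p is -log2(p) bits:
    # frequent symbols get short codes, rare (surprising) ones get long codes.
    for sym, p in probs.items():
        print(f"{sym}: p={p}, ideal length = {-math.log2(p):.1f} bits")

    print(f"entropy = {entropy} bits/symbol")  # 1.75 for this distribution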
He died of Alzheimer's in 2001.
Artificial Intelligence.
Once again a paper by Turing: written in 1950, it dealt with the question of whether machines can think, and it described the Turing test.
Hashing
Appears to have been developed in the mid-1950s at IBM by Luhn. It arose from the problem of quickly accessing information stored on disk, and it's used in many database systems.
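A minimal sketch of the core idea (not Luhn's original scheme; names are illustrative): hash a key to a bucket index, and resolve collisions by chaining.

    # A tiny hash table with chaining.
    class HashTable:
        def __init__(self, num_buckets: int = 8):
            self.buckets = [[] for _ in range(num_buckets)]

        def _index(self, key) -> int:
            # Map the key to a bucket; Python's built-in hash() stands in
            # for whatever hash function a real system would use.
            return hash(key) % len(self.buckets)

        def put(self, key, value):
            bucket = self.buckets[self._index(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:  # overwrite an existing key
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def get(self, key):
            for k, v in self.buckets[self._index(key)]:
                if k == key:
                    return v
            raise KeyError(key)

    table = HashTable()
    table.put("record-42", "some row stored on disk")
    print(table.get("record-42"))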
Fortran and its compiler.
Before this, all programming had to be done in machine language!
It was the first high level language.
John Backus hated studying, and was thrown out of Uni. So he joined the Army.
But the Army sent him to study pre-med in Atlantic City, treating head wounds, and he was found to have a tumor in his own head. He quit medical school after just nine months.
He moved to New York and went to a radio technicians' school to learn to build a hi-fi unit. He enjoyed the teacher and the work, and liked the fact that math had an application.
He did math at Columbia University, and joined IBM in 1950.
There he started developing Fortran, right around the time von Neumann was getting sick of the idea, saying that we didn't even need a higher-level language and that programming wasn't really a problem.
One of the hardest problems was deciding what the language would look like, and how the computer would translate (parse) it.
In 1959 he invented Backus-Naur Form (BNF), a notation for describing the syntax of programming languages.
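As an illustration of the notation, here is the classic toy grammar for arithmetic expressions (a standard textbook example, not one from these notes). Each rule says how a syntactic category may be formed, and a parser uses the rules to translate source text.

    <expr>   ::= <term> | <expr> "+" <term>
    <term>   ::= <factor> | <term> "*" <factor>
    <factor> ::= <number> | "(" <expr> ")"
    <number> ::= <digit> | <number> <digit>
    <digit>  ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"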
Backus thought that FORTRAN and Algol weren't very good ways for humans to tell the computer what to do. His idea was that you should be able to tell the machine "this is what I want done", which led him to the idea of functional programming. He defined a number of such languages, though they never took off.
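A rough sketch of the contrast (in Python rather than Backus's own FP language): the imperative version spells out how, step by step; the functional version just states what is wanted.

    from functools import reduce

    data = [1, 2, 3, 4, 5]

    # Imperative: tell the machine *how*, one step at a time.
    total = 0
    for x in data:
        total += x * x

    # Functional: say *what* you want -- the sum of the squares.
    total_fp = reduce(lambda acc, x: acc + x,
                      map(lambda x: x * x, data), 0)

    assert total == total_fp == 55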
After FORTRAN came Algol and Lisp, and then more languages after that.
Algol had two major advantages over FORTRAN: the notion of local variables and that of recursion (see the small sketch at the end of this section). Lisp was invented to help with AI work.
Algol was much admired for its design, but it never really took off.
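A tiny sketch of why those two Algol features mattered (in Python for readability): recursion plus local variables lets each call keep its own state.

    # Recursive factorial: each invocation gets its own local n, which is
    # exactly what local variables + recursion enabled, and what the
    # FORTRAN of that era could not express directly.
    def factorial(n: int) -> int:
        if n <= 1:
            return 1
        return n * factorial(n - 1)  # the recursive call

    print(factorial(5))  # 120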