Continuing from the last class, we began the morning by discussing objections to and shortcomings of the Turing Test, the state of artificial intelligence, and more of the proceedings of previous Loebner Prize competitions.
While the judges have yet to be convinced by a machine, one argument raised in class against the Turing Test is its inability to distinguish between intelligence and the mere appearance of intelligence. Searle's Chinese Room thought experiment was brought up to illustrate this point. The experiment imagines a person who knows no Chinese locked in a room and tasked with writing responses to messages written in Chinese, using a book that supplies an appropriate Chinese response to any possible input. If the entire process is done from the book, the person would appear to be conversing in Chinese but would have no understanding of what they are reading or writing. Likewise, the machines attempting to win the Loebner Prize respond in a simple, programmed fashion and do not yet understand what they are saying. The use of the Turing Test as a metric for machine intelligence was also questioned because it is fairly subjective; as we saw with the Loebner Prize, the outcome depends on the judge's experience with pseudo-intelligent conversation machines.
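As a rough illustration of what "responding from the book" looks like in practice, a canned-response chatbot can be sketched as little more than a pattern-to-reply lookup table; the patterns and replies below are invented for the example and are not taken from any actual Loebner Prize entrant.

```python
# A minimal sketch of a "Chinese Room"-style responder: it maps input
# patterns to canned replies with no model of meaning. The rules here
# are hypothetical examples, not from any real Loebner Prize program.
import re

RULE_BOOK = [
    (re.compile(r"\bhow are you\b", re.I), "I'm doing well, thanks for asking!"),
    (re.compile(r"\byour name\b", re.I),   "People call me Scribe."),
    (re.compile(r"\bweather\b", re.I),     "I hear it's lovely outside today."),
]

def respond(message: str) -> str:
    """Return the first canned reply whose pattern matches the message."""
    for pattern, reply in RULE_BOOK:
        if pattern.search(message):
            return reply
    # Fallback: deflect the question, a common trick in early chatterbots.
    return "That's interesting -- tell me more."

if __name__ == "__main__":
    print(respond("What is your name?"))        # People call me Scribe.
    print(respond("Do you like the weather?"))  # I hear it's lovely outside today.
```

The program can hold up its end of a short exchange without representing the meaning of a single word, which is exactly the gap between intelligence and its appearance that the Chinese Room argument targets.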
Other AI-related topics were also discussed, particularly the history of the field. The term "artificial intelligence" originated with John McCarthy's proposal for the 1956 Dartmouth conference, a summer-long research project to study how machines could be made to learn. The conference did not reach its lofty goals, and many of them have still not been attained. We also learned about the long slump in AI research (the "AI winter") that continued through the 1990s, until computer hardware adequate for many of the problems started to become available.
After concluding our discussion of artificial intelligence, the Loebner Prize, and the Turing Test, we transitioned into the advancements in computer architecture that led to microcomputers.
We began this discussion by outlining the evolution of computers from vacuum tubes to discrete transistors, to integrated circuits, and eventually to microcomputers. Integrated circuits are entire circuits fabricated on a single piece of semiconductor through the process of photolithography. We discussed how this method of making circuits is advantageous over hand assembly because it produces smaller, cheaper, more efficient, and less error-prone circuits, a necessity for the creation of microcomputers. Today almost every computer is built primarily from integrated circuits.
Along with the introduction of the integrated circuit, we discussed the advances in memory that made the modern computer possible. Two categories of memory were examined in class: serially accessed storage memory and random-access memory (RAM).
On the subject of RAM, we watched a short video discussing three types: magnetic core memory, static RAM (SRAM), and dynamic RAM (DRAM). Although antiquated today, core memory was the predominant form of RAM in the early decades of electronic computing, up through the 1970s. We learned how core memory uses wires hand-woven through small magnetic rings (toroids); electric currents through the wires magnetize the rings to store bits of data. Being hand assembled, these devices commonly held only a few kilobits. We also briefly discussed the two forms of RAM commonly seen today, SRAM and DRAM, which use transistors and capacitors, respectively, to store bits of information.
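To make the core memory description more concrete, here is a toy model of a single core plane. It assumes coincident-current addressing (a core flips only where a driven X line and a driven Y line cross) and the destructive read that core memory is known for; the 64 x 64 dimensions are illustrative, not tied to any specific machine from class.

```python
# Toy model of one magnetic-core plane: each toroid stores one bit, and a
# bit is selected by driving one X wire and one Y wire at half-current each,
# so only the core at their intersection switches.
class CorePlane:
    def __init__(self, rows=64, cols=64):
        # A 64 x 64 plane holds 4096 bits (4 Kbit), in line with the
        # "few kilobits" capacities mentioned above.
        self.cores = [[0] * cols for _ in range(rows)]

    def write(self, x, y, bit):
        # Only the core at the X/Y intersection sees enough current to flip.
        self.cores[x][y] = bit

    def read(self, x, y):
        # Reading forces the core toward 0 and watches the sense wire for a
        # pulse; a pulse means the core held a 1. The read is destructive,
        # so the controller must write the value back afterward.
        bit = self.cores[x][y]
        self.cores[x][y] = 0
        self.write(x, y, bit)
        return bit

plane = CorePlane()
plane.write(3, 5, 1)
print(plane.read(3, 5))  # 1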
After talking about RAM, we moved on to how information has been stored on computers and how the methods of doing so have changed since the introduction of electronic computers. The first method to catch on after the era of punch cards was magnetic data tape, which stored data on large reels. One example was the 1.6 kilobit/inch tape shown during the previous week, where one large 700-inch reel could hold only 140 kB. Another storage device was the floppy disk, which progressed from monstrous 8” 80 kB disks down to 5.25” and later 3.5” 1.44 MB disks. The class then examined the innards of a 3.5” floppy disk to see the magnetic film disk inside the cartridge where the data is actually stored.
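The tape figure quoted above checks out with a quick bit of arithmetic; the short sketch below works through it using the numbers from class (1.6 kbit/inch, 700-inch reel, and the nominal 1.44 MB floppy capacity).

```python
# Quick check of the storage figures mentioned in class, converting
# bits to bytes (8 bits per byte).
tape_density_bits_per_inch = 1600   # 1.6 kilobit/inch
tape_length_inches = 700
tape_bytes = tape_density_bits_per_inch * tape_length_inches / 8
print(f"Tape reel: {tape_bytes / 1000:.0f} kB")       # 140 kB

floppy_35_bytes = 1.44e6            # quoted 3.5" floppy capacity
print(f"One 3.5\" floppy holds roughly {floppy_35_bytes / tape_bytes:.0f}x "
      "as much as the 700-inch tape reel")            # roughly 10x
```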
Similar in function to the floppy disk are hard disk drives, which were also discussed. Using multiple spinning magnetic platters and read/write heads on movable arms, hard disks can store very large amounts of data compared to other storage media. Two old, disassembled hard disk drives were passed around the room to give a hands-on look at how the devices work (which hasn't changed much in recent decades).
On a side topic, we also discussed the increasing use of non-volatile solid-state (semiconductor-based) storage drives in place of hard disks. These devices replace moving parts with integrated circuits, allowing for faster operation; however, they have not yet matched the storage density of traditional hard disks.
At the end of the conversation we also discussed the origins of the magnetic-platter hard disk with the IBM 350 “RAMAC” disk drive. Utilizing fifty 2-foot-diameter platters spinning at 1200 RPM and a single moving read/write head assembly, the IBM 350 could hold 5 MB of data. Compared with modern disk drives this seems laughably small, but the device was a computing breakthrough when it was introduced in 1956.
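A little back-of-the-envelope arithmetic on the RAMAC figures quoted above gives a feel for the machine; the sketch below assumes both sides of each platter were used for recording (as they were on the RAMAC) and uses the 5 MB class figure as-is.

```python
# Rough figures for the IBM 350 RAMAC from the numbers quoted above:
# 50 platters, 1200 RPM, 5 MB total.
platters = 50
surfaces = platters * 2            # both sides of each platter recorded
total_bytes = 5e6
rpm = 1200

print(f"Capacity per surface: {total_bytes / surfaces / 1000:.0f} kB")  # 50 kB
rev_time_ms = 60_000 / rpm         # time for one full revolution, in ms
print(f"Full revolution: {rev_time_ms:.0f} ms, "
      f"average rotational delay: {rev_time_ms / 2:.0f} ms")            # 50 ms, 25 ms
```

Waiting tens of milliseconds for the right sector to spin under the head, on top of moving the single head assembly between platters, is a useful reminder of why the RAMAC's 5 MB was revolutionary for 1956 rather than for its speed.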