From ENIAC to iPad: Moments from IT History
What was in the beginning...
... is still under discussion (both in IT and otherwise).
But it is likely that humans started to compute very early on. Besides fingers, sticks and stones, one of the first computing devices was probably the abacus, which came into use as early as about 2700-3000 BC (different sources offer different dates, e.g. Ifrah[1] suggests 2700 BC). Another device that could be considered a sort of computer was the weighing scale, used to compare the weight of one object with another. A very interesting find is the Antikythera mechanism, which was recovered only in 1900 AD but is thought to be an astronomical computer dating back to 100 BC [2].
Some sources[3] suggest that Leonardo da Vinci also designed a mechanical calculator (around 1492-93). The calculating clock by Wilhelm Schickard, a German professor of Hebrew and astronomy, allowed addition and subtraction of up to six-digit numbers - the blueprints of the machine were rediscovered only in the 19th century[4]. On the other hand, the device built by Blaise Pascal - known as the Arithmetic Machine, Pascal's Calculator and Pascaline - has been known for a long time and is considered by many the forefather of today's computers. A number of sources suggest that the initial motive for building the device was to help his father Étienne, who in 1639 was appointed the King's commissioner of taxes and thus had to perform a lot of complicated calculations[5].
Older people may still recall a calculating device called the slide rule, which was once taught at schools (in the former USSR, this went on up to the 80s - partially due to electronic calculators being largely unavailable) and is sometimes still seen in the hands of 'old school' engineers. It was invented in 1632 in Oxford by William Oughtred, a mathematician and Anglican minister[6], who built his invention on the logarithmic scale devised by another Englishman, Edmund Gunter. The cursor or runner (the sliding window with a marker) was added by John Robertson in 1775[7] (other sources point to the French artillery lieutenant Amédée Mannheim, who is said to have given the rule its present form in 1859).
Modern computers use the binary number system. In the modern context, it was first formalized by Gottfried Wilhelm Leibniz (who also built his own calculator, the Stepped Reckoner, in 1673); traces of the binary system, however, lead back to far more ancient times (some sources point to ancient India and a mathematician named Pingala, living in the 2nd or 4th century BC). An early example of a programmable device can be found in the Jacquard loom by the French weaver and merchant Joseph Marie Jacquard. The loom used punched cards to set the pattern for the textile.
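To make the binary principle above a bit more concrete, here is a minimal sketch (plain Python, not tied to any historical machine) that converts a decimal integer into base-2 digits by repeated division and back again:

```python
# Illustration of the binary number system: decimal -> base-2 digits and back.
# (Illustrative only - modern processors of course work on bits natively.)

def to_binary(n: int) -> str:
    """Return the binary digits of a non-negative integer."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))   # the remainder is the next (lowest) bit
        n //= 2
    return "".join(reversed(digits))

def from_binary(bits: str) -> int:
    """Interpret a string of 0s and 1s as a base-2 number."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

print(to_binary(42))          # prints '101010'
print(from_binary("101010"))  # prints 42
```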
In 1822, the English professor Charles Babbage designed the Difference Engine, followed by the larger and more complex Analytical Engine in 1830. Although both machines remained incomplete, they are considered an important step in the history of computing (some of them have since been built to prove the validity of their concepts[8]). The Difference Engine was powered by cranking a handle[9]; the Analytical Engine was to be steam-powered.
Quite exceptionally for the period, Babbage was assisted by a young lady named Augusta Ada Byron, daughter of the famous English poet and later Countess of Lovelace. Babbage considered her one of the few people who understood his ideas, calling her "the Enchantress of Numbers"[10]. She prepared a set of instructions (an algorithm) for the to-be-built Analytical Engine that later researchers have validated as correct. Thus she is often considered the first programmer in the world - among other things, the new programming language designed for the US Department of Defense in 1979 was named Ada after her.
In 1837, Samuel Morse invented the Morse code, the first globally used method for data transmission (the telegraph in its heyday was perhaps the most extensive unified information system before the Internet). Likewise, the punched telegraph tape invented in 1857 by Charles Wheatstone was the first widely used medium for data recording, becoming the forerunner of floppies, CD/DVD discs and memory sticks.
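As a small aside on how such a code works in practice, here is a toy Morse encoder - a hypothetical snippet using only a handful of letters, not the full historical alphabet:

```python
# A toy Morse encoder: a fixed code table maps letters to dot/dash sequences.
# Only a small subset of the real alphabet is included, for illustration.
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "O": "---", "S": "...", "T": "-",
}

def encode(text: str) -> str:
    """Encode known letters into Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(encode("SOS"))  # prints '... --- ...'
```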
The first modern computer
... is a hotly debated issue again. Various suggestions have been made, but four of them seem to be the most common:
- ENIAC (Electronic Numerical Integrator And Computer) was launched in 1943 at the University of Pennsylvania by a team led by John W. Mauchly and J. Presper Eckert[11].
- ABC (Atanasoff-Berry Computer) was completed a year earlier, in 1942, at Iowa State College by John Vincent Atanasoff and Clifford Berry. It is interesting to note that the bitter rivalry between the two machines and their builders spanned several decades - while ENIAC was initially considered the first, a 1973 court case decided in favour of Atanasoff and Berry. On the other hand, several other factors show ENIAC as a more complete machine that also found more use.
- The Colossus series was built by British cryptanalysts during WWII to assist in breaking German codes. Yet, like the ABC, they were not 'complete' computers and were dedicated to a single type of task[12].
- Z3 and other machines built by Konrad Zuse in Germany. The original Z3 was destroyed in the war and the work of Zuse fell into obscurity. Yet his machines could perhaps be considered the closest to the modern ones[13]. The Z3 used the binary system (in contrast to ENIAC's decimal one), and Zuse's programming language, Plankalkül, is said to contain most of the features of ALGOL, the language from the 50s usually considered the first[14]. On the other hand, Z3 was an electro-mechanical, not an electronic device. Thus, the whole matter is still largely undecided.
The Stone Age
In 1947, one of the best-known anecdotal stories in computing took place - a computer operator (most sources suggest it was 'Grandma COBOL', Grace Murray Hopper) was searching for the reason a computer was not working and, finding a dead moth between two relay points, taped it to the log book and added the note "The first actual case of a bug being found". This is considered the birth of the computing terms "bug" (error), "debug" (to find and fix errors) and "debugger" (error finding/fixing software)[15].
In 1950, Alan Turing formulated the Turing test to determine whether a machine is capable of intelligent behaviour. Simply put: the machine is considered intelligent if a human interacting with it cannot determine whether he/she is interacting with another human or a machine. In the same year, the first Soviet computer, MESM, was built. While the Communist countries had a good share of talented mathematicians to provide a theoretical base for technology, they fell seriously behind in the realm of computing - the two main reasons being the closedness and overall ignorance of the regime (including the labelling of many kinds of actual research, such as psychology and cybernetics, as 'Western pseudoscience'[16]) as well as its ineffectiveness.
During the fifties, two important technologies contributed towards the miniaturization of computing technology - in 1954, Texas Instruments started to mass-produce transistors (the underlying concept had been patented as early as 1934) and in 1958, the same company introduced the integrated circuit. One of the first transistor-based computers, the TX-0 (or "Tixo"), became one of the launchers of the hacker culture at MIT.
Another noteworthy person who was clearly ahead of his time in the 60s was Douglas Engelbart - the computer mouse was first developed by him in 1963[17] (but became widely used only in the 80s). He went on to develop hypertext (the most common example of the principle nowadays being the Web with its linked pages) and, by 1968, an early graphical user interface (which today is used by all widespread computer systems, examples being MS Windows, Apple OS X as well as several desktop environments of the Unix and Linux variants). His 1968 demonstration of then-future computer technologies was later nicknamed The Mother of All Demos[18].
In 1966, IBM developed the first magnetic disk systems. At first, the 'hard disks' of the day were huge, cupboard-like devices which gave rise to various 'urban legends' - e.g. the one about walking disk drives[19]. However, a few years later, in 1971, the company produced the first generation of floppy disks, the 8-inch diskette[20].
The year 1969 has several important landmarks in the history of IT. It can be considered the birth year of the Internet, as the first packet-switching network (communication is done in small chunks of information called packets, which travel independently over the network; the principle is a basic one in today's Internet as well) connected four universities in the US[21] - at first, the network became known as ARPANET. AT&T developed the first version of Unix, the first widely portable computer operating system[22], which eventually became a parent of two well-known systems of today: Apple OS X and the Linux family.
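A toy model may help illustrate the packet principle described above - the following sketch is an invented example that in no way resembles real ARPANET or IP mechanics, but it shows a message being split into numbered packets that can arrive in any order and still be reassembled:

```python
# Toy packet switching: split a message into numbered packets, deliver them
# out of order, and reassemble by sequence number at the receiver.
# (Grossly simplified - real packets carry headers, addresses, checksums etc.)
import random

def split_into_packets(message: str, size: int = 8):
    """Split a message into (sequence_number, payload) tuples."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Restore the original message regardless of arrival order."""
    return "".join(payload for _, payload in sorted(packets))

packets = split_into_packets("Packets travel independently over the network.")
random.shuffle(packets)       # simulate packets taking different routes
print(reassemble(packets))    # the original sentence comes back intact
```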
The Bronze Age
In 1971, Ray Tomlinson, an engineer at BBN (Bolt Beranek and Newman), managed to send a message from one computer to another - his system was adopted by the ARPANET the next year, becoming what we know today as e-mail.
In 1972, Intel released the first 8-bit microprocessor - with its frequency of 200 kHz, it looks antique in comparison with today's 2 GHz+ models, yet it was an important step towards personal computers. In the same year, two young men named William Henry Gates III and Paul Allen created their first company to sell a simple system for traffic density analysis - both the device and the company were called Traf-O-Data. On another front, a newly formed company called Atari released its first commercial video game, titled Pong - while not the first computer game in history, Pong is definitely a forefather of game consoles like the Nintendo, PlayStation or Xbox. And finally, diskettes started to shrink - the new ones went from 8 inches to a mere 5.25.
The same year marks the first long-distance (MIT-Stanford) Internet chat - not between actual people, but between programs that we would today call chatbots. It was a virtual session involving ELIZA (a tongue-in-cheek imitation of a psychiatrist) and PARRY (a bot emulating someone with paranoid schizophrenia). Notably, while ELIZA was written by MIT computer scientists as a generic experiment, the 'father' of PARRY was an actual Stanford psychiatrist, Kenneth Colby.
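The keyword-and-canned-reply approach that made ELIZA famous can be sketched in a few lines - the rules below are invented for illustration and are far simpler than the original DOCTOR script:

```python
# A minimal ELIZA-style responder: look for a keyword pattern in the input
# and return a canned, question-like reply (invented rules, for illustration).
import re

RULES = [
    (r"\bmother\b",   "Tell me more about your family."),
    (r"\bI am (.*)",  "Why do you say you are {0}?"),
    (r"\bcomputer\b", "Do computers worry you?"),
]

def respond(sentence: str) -> str:
    for pattern, reply in RULES:
        match = re.search(pattern, sentence, re.IGNORECASE)
        if match:
            return reply.format(*match.groups())
    return "Please, go on."   # default reply when no keyword matches

print(respond("I am worried about my computer"))
# prints: Why do you say you are worried about my computer?
```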
The landmarks of 1973 include CP/M, an operating system by Gary Kildall which, among other machines, was also popular on various computers built in the Eastern Bloc of the day (e.g. the Juku in Estonia); the specification of Ethernet by Bob Metcalfe, which is still the main standard for local computer networks today; FTP, devised to move files over the network; and IBM's first series of hard disks, codenamed Winchester - the name would become the generic term for hard disks for some time.
1975 is the birth year of a big player in the world of software. The two guys from Traf-O-Data wrote their own version of the well-known BASIC programming language and renamed their company Microsoft (at first in the form 'Micro-Soft'). They also created substantial controversy by starting to ask money for every copy of their software, a concept totally alien back then. Bill's attempt to convince users that the new model works well was not considered very successful[23]. Two signs from that year indicated the growing mainstreaming of computers - a retail store chain specializing in computers ('Computer Store') was founded, as was the first specialized magazine, titled "Byte". ARPANET opened its first mailing lists.
Apple was founded just a year later, in 1976, by Steve Jobs and Steve Wozniak. While Gates and Allen started with and mostly stayed in software, Apple started out as a hardware company selling the do-it-yourself type Apple I.
It is perhaps noteworthy that even if Microsoft Excel is nearly synonymous with spreadsheet software for many people, the first spreadsheet clearly predates Microsoft's product - it was VisiCalc, released in 1979. The same year also gave us Usenet newsgroups and the first MUDs.
While Microsoft definitely has many friends, the controversies surrounding the company started early on as well. There are several different versions of the story about the 1980-81 birth of MS DOS, the first best-selling software by Microsoft - the official version by the company recalls buying the system from Seattle Computer Products, while some more critical voices suggest far less favourable scenarios[24]. On the hardware front, Sony released the highly successful 3.5-inch diskette and, in co-operation with Philips, developed the first version of the CD audio standard.
Microsoft's historical dominance started with their luck (or business skill, depending on the view) in striking a deal with IBM to supply the operating system for the new Personal Computer Model 5150, which went on to be the first of a whole generation of personal computers. Among the three candidates, IBM chose MS-DOS, and the huge sales of the computer translated into a huge success for the operating system vendor.
The Iron Age
1982 was the year of rapid spread for the IBM PC - an important reason being the emergence of the PC compatibles or "PC clones". Due to some technical factors, IBM could not prevent others from copying the design - a fact that was initially considered a curse but turned out to be a blessing. At the time of its birth, the IBM PC was one of many kinds of personal computers available; by 1986 the market share of PC compatibles had exceeded 50%[25]. A number of technically more advanced designs were forced out of the market. This can be seen as an important lesson about the power of open standards. The same year, Mouse Systems introduced their mouse for PCs, yet it remained little more than a curiosity for several years (largely up to the emergence of Microsoft Windows 3.0 in 1990; in contrast, Apple made the mouse an integral part of their systems remarkably earlier). ARPANET extends to the United Kingdom and Norway, becoming the Internet. Scott Fahlman proposes using emoticons (also known as smileys) to denote emotions in online texts[26] (there are also differing views on the origin).
In 1983, IBM produced the next model of the PC, the 5160 or PC XT, which was the first model with a (typically 10 MB) hard disk. Apple countered with the Lisa and the IIe - also, one of their trademark models, the Macintosh (giving its name to all later Macs), was released about then (different sources mention either 1983 or 1984). Microsoft announced the initial version of a new environment named Windows - yet it was released only in 1985 and became practically usable (more or less so) with version 3.0 in 1990.
It is noteworthy that while there was no Windows yet, there were already several providers of what we nowadays know as office software - e.g. Lotus 1-2-3, WordPerfect, Multi-Tool Word and others. Yet in the realm of word processing, perhaps the most successful software product of the early 80s was WordStar - it was originally designed for CP/M, but due to good portability soon took over the PC market as well[27].
In general, it was the time of entrenchment of the proprietary, commercial and closed software model. Yet there was an 'old school hacker' named Richard Stallman who decided to rebel against it - at the end of 1983, he announced a plan to rewrite the widely used Unix operating system and give it away freely to anyone interested[28]. While many people shrugged at the idea, he was determined and launched the GNU project, which became the foundation of the Free Software movement. This topic will be covered more thoroughly in the next lectures.
Finally, Fidonet was also born that year - the dial-in (BBS) based network was especially popular in places where local phone calls were inexpensive or free, and conversely, in places where computer access was restricted (including the former Eastern Bloc). Only when the Internet became a household word at the end of the century did Fidonet recede - yet it is maintained by enthusiasts to this day.
1984 saw the beginning of mass-produced CD-ROMs, and IBM released the PC AT, a substantial improvement over the XT (from the software point of view, IBM-compatible computers have not changed much since then - still, the hardware side as well as the overall performance have changed dramatically). The Massachusetts Institute of Technology released the first version of the X Window System - interestingly, the Unix standard for graphical user interfaces predates Microsoft Windows by several years. In Moscow, Alexey Pajitnov wrote one of the most popular computer games ever - Tetris. It was followed by a multitude of both legal and not-so-legal copies; one of them, which created a real boom among Estonian computer users at the end of the 80s (Tallinn University of Technology even organized a championship tournament!), was aptly named 'Totrus' (Estonian for 'folly', 'stupidity', also 'rubbish').
By the mid-80s, the Internet had matured into a widespread, yet mostly academic and non-profit entity (commercial activities like advertisement were prohibited), with e-mail, Telnet, FTP and Usenet as its main services.
In 1987, Microsoft and IBM teamed up and announced a new operating system named OS/2. The system can be considered a classic example of a superior product falling victim to market manipulations - after some years, Microsoft backed out, with most of the obtained know-how going into their new Windows. While IBM kept OS/2, it was thoroughly hampered by the rights to some of its parts belonging to Microsoft, and was never able to really take off.
In 1989, CERN in Switzerland is connected to the Internet. A staff member named Tim Berners-Lee starts to work on an idea of scientific documents linked together as hypertext - today we know it as the Web[29].
1990 is the release year of the first version of MS Windows that was stable enough for the wider public - Windows 3.0. Still, its moderate success was surpassed by far by its very popular successor, 3.1. The University of California, Berkeley introduced the BSD Unix system family. Since then, BSD systems have been a rather significant niche (especially in business server systems, due to a liberal licence; OpenBSD is also considered one of the most secure systems in its default configuration), and they are notable for forming the base of Apple OS X (Apple built a commercial "upper layer" on Darwin, a variant of BSD).
The Internet Age
1991 was another landmark year. Tim Berners-Lee published his method of hypertext (notably placing it in the public domain - had he chosen otherwise, it would not be an exaggeration to say that the world would be different today). In the United States, the National Science Foundation lifted the ban on the commercial use of the Internet[30] - this made possible a lot of new services, but also opened the door to spam and cybercrime. A student at the University of Helsinki in Finland named Linus Torvalds announced his "new Unix-like operating system" hobby project[31], later known as Linux.
The next year, Microsoft released one of their most successful systems, which made Bill Gates the richest person in the US: Windows 3.1, followed by the success of the enterprise-oriented NT series. On the Internet, Web technology started its explosive growth. In 1994, Marc Andreessen and Jim Clark founded Netscape and created their browser of the same name, a successor to the first widely used web browser, Mosaic.
For a while, Microsoft downplayed the role of the Internet, preferring to focus on their own proprietary Microsoft Network. By 1995, they had realized their mistake and decided to catch up[32]. After first trying unsuccessfully to convince Netscape to divide the market[33], they obtained a Mosaic-derived browser from Spyglass and developed it into Internet Explorer. Initially not very successful, from version 2.0 onwards it was distributed for free (Netscape was free only for non-commercial use), launching the so-called browser wars which Microsoft had effectively won by 1998, when Netscape was acquired by AOL. However, the win resulted in several years of monopoly charges for Microsoft and in the open-sourcing of Mozilla (initially the codename for Netscape's development version), giving rise to Firefox and several other browsers.
1995 was a successful year for Microsoft, which released Windows 95 as well as MS Office 95 and Windows NT 3.51. Yet in the same year, Bob Young and Marc Ewing founded possibly the most successful commercial Linux company, Red Hat. In the hardware world, it is the birth year of two well-known connector standards: USB and IEEE 1394, also known as FireWire (used extensively by Apple).
While Linux established itself as a viable server system soon enough (especially via the software set known as LAMP, which has powered a large share of web servers since the mid-90s), it did not become as successful as a desktop system. In the late nineties, however, a step in that direction was made with the founding of the two major desktop environments, GNOME and KDE.
In 1998, Microsoft released its next operating system, Windows 98. Apple returned to the spotlight with the iMac, a novel design combining the case and the monitor of the personal computer. Mandrake (later Mandriva) became the first Linux distribution specifically meant for desktop use (at first using mostly KDE as its environment). The dot-com boom briefly raised the hopes of many people, but the dreams came true only for a small minority.
At the turn of the year 1999/2000, Microsoft released Windows 2000 - despite extensive promotion, the initial reception was somewhat disappointing[34], forcing the company to follow up with Service Pack 1 rather soon. Moreover, the desktop-oriented replacement for Windows 98, Windows ME, has been considered by many to be the worst release of Windows[35], forcing the company to speed up the development of the next release. However, the 2001 release of Windows XP, one of the most successful releases, set the company back on track.
The new century began well for Microsoft, which had overcome its legal hassles and successfully released Windows XP and MS Office 2000. However, its competitors did not relax either - Apple developed their new Unix-based operating system family, OS X, and for the first time, Linux software displayed more serious ambition on the desktop as well - WordPerfect Office 2000 was the first major office suite released for Linux among other platforms, and Sun started to distribute their freshly acquired StarOffice 5.2 as a free download. Moreover, in late 2000 Sun split StarOffice into a commercial offering of the same name and a community project titled OpenOffice.org (nowadays distributed as Apache OpenOffice, though it has by now been all but replaced by LibreOffice), the latter becoming a usable solution in less than a year.
The New Millennium
The first decade of the century featured, among other things:
- the wireless boom - wireless networks spread fast both as commercial offerings and as public services.
- the rise of free and open-source software in developing countries - especially prominent examples were Chile, Peru and Venezuela[36].
- laptop computers and notebooks becoming affordable for 'average people'
- rapid spread of affordable broadband Internet connections
- the dark side of networking - hijacking the computers of 'ordinary users' becomes a major problem
- various pieces of regulatory legislation proposed, some of which are considered a threat to overall freedom
- move towards the 'post-desktop' era - stationary computers are outnumbered by mobile devices (laptops, notebooks, tablets and smartphones).
- the cybersecurity boom
Some interesting examples from the last decades:
- The SCO lawsuit 2003-2011 (could even continue in the future) - the claim by SCO Group that Linux kernel 2.4 contains some code belonging to them. Most sources agree that the motivation was to force IBM to buy out the company. Interesting factors include
- for the better part of the suit, the software in question had already become outdated and been replaced by the new 2.6 kernel series (i.e. it was no longer in use)
- throughout the process, the plaintiff never managed to identify the exact code in question
- from time to time, the plaintiff appeared to receive new funding to continue the process
- The software patent war in the European Parliament 2004-2005 - the controversial EU directive proposal about whether software is patentable or not bounced several times between the European Parliament (the majority opposing it) and the European Commission (supporting it), changing slightly each time. Finally, the proposal was overwhelmingly rejected in July 2005.
- Ubuntu - starting in 2004, the Linux distribution backed by Mark Shuttleworth and Canonical Ltd. had a rapid rise to the top soon after its initial release. With a steady release cycle (twice a year) and an orientation towards ordinary computer users, it became 'THE Linux' for many and, at times, was able to challenge the established desktop operating systems by Microsoft and Apple. However, some recent developments have resulted in controversy and alienated some of its user base (who fortunately have a wide choice of alternatives to move to).
- Netbooks - since 2007, small, light and cheap notebook computers mostly meant for e-mail, note-taking and simple web surfing. It is interesting to note that they were initially dominated by Linux - due to the fact that the Microsoft operating system of the day (Vista) was unable to run on them. Microsoft reacted by extending the support of XP until their new release (Windows 7), with lower hardware requirements, arrived. More recently, they have increasingly been challenged by tablets.
- Politically motivated cyberattacks - while the history of such attacks dates back to the 90s, they have been increasingly prominent recently.
- The Android operating system, developed by the Google-led Open Handset Alliance since 2007, which together with Apple's iOS has successfully bridged the gaps between phones, larger handheld devices and notebooks (a good example of the latter being the Asus Eee Pad Transformer).
- iPad - first released in 2010, it has defined the tablet computer for many users.
- ACTA and its countercampaign in 2012 - while Europe had not experienced 'Facebook revolutions' similar to the Arab Spring or the Occupy movement, the large-scale public backlash was still remarkable.
- Steam has significantly diversified the gaming industry - even if Windows has remained a primary platform for gamers, both OS X and Linux have developed a steady niche following.
- In 2014, three big vulnerabilities - Heartbleed, Shellshock and POODLE - remind people that even well-designed and open-source systems may get serious security holes.
- Windows 10 marks a shift in Microsoft's strategy, as it was delivered for free as an upgrade or for people willing to participate in testing. Yet again, concerns about privacy and snooping ran strong.
- Apple Watch is the first widespread smart watch, heralding the rise of wearable computers.
- Pokemon Go of 2016 created a global phenomenon with thousands of (seemingly) normal people hunting virtual monsters...
- 2017 proved quite eventful in IT. The continuing conflict in Ukraine brought along the outbreaks of the Petya (and later, NotPetya) malware that transcended their place of origin. Another epidemic occurred with the WannaCry ransomware that reached more than 230 000 computers[37]. The original optimism surrounding 'Smart Devices' or the 'Internet of Things' starts to fade due to clueless manufacturers and default passwords (so it is sometimes referred to as the Internet of Bad Things). The U.S. moves to repeal Net Neutrality. Equifax, a credit reporting agency (in fact, a personal information broker), gets breached due to weak security, leaking the personal information of about 143 million Americans online. The Estonian ID card gets its security seriously challenged for the first time.
- 2018 - Right at the beginning, news arrived about Meltdown and Spectre. In March, the Cambridge Analytica scandal rocks the world. Net neutrality fights in the US continue and cryptocurrency mining becomes a major trend in malware. The European Union's GDPR (General Data Protection Regulation) comes into force, creating its fair share of controversy. In summer, Microsoft announces the purchase of GitHub and in autumn, IBM follows suit with Red Hat. At the end of the year, a major data leak is uncovered in Estonia (approximately 460 000 addresses in the .ee domain).
- 2019 - Huawei, a major provider of mobile and network technology, is accused by the U.S. of leaking information to the Chinese government. Google+ and Yahoo! Groups are closed down, marking another step towards phasing out the previous generation of online services. Google (actually Wing, a parallel subsidiary under the same umbrella corporation, Alphabet) starts drone-based delivery of packages in some locations (in the U.S. they are considered an airline, so an FAA certificate is required). At the end of the year, Google claims to have reached quantum supremacy - using a quantum computer to solve a problem that is inherently out of reach of traditional computers - yet the claim is contested by IBM and several others. Big Data and data mining as well as AI continue as 'hot topics' in IT. At the very end of the year, the COVID-19 pandemic starts to spread all over the world.
2020 - the COVID-19 pandemic continues throughout the year, bringing along a long list of IT problems from system overloads to scams, but also stimulating the development of many innovative solutions (e.g. video conferencing) and forcing ignoramuses to learn new IT skills. It is also a driving force behind the growth of the Folding@home project, which uses computer resources donated by volunteers (like the earlier SETI@home, but this time for medical research)[38]. The market share of Linux-based desktop computers rises above 3% for the first time in June[39]. In August, Elon Musk announces that his company Neuralink has built a working prototype of a brain-computer interface and tested it on pigs[40].
2021 - The pandemic carries on and vaccination becomes a hot topic in social media. Microsoft introduces Windows 11 and Apple macOS Monterey; mobile devices get Android 12 and iOS 15. The next stage of global cyber conflict takes place in Sri Lanka (the cyber force of the Tamil Tigers attacks the government). Ireland is hit hard by a ransomware attack; the same happens (on a somewhat smaller scale) in New Zealand.
2022 - the War in Ukraine started its new, active phase in February, also boosting cyberattacks in many places. Russia as the aggressor was targeted by international sanctions, and lost a significant share of the IT companies previously working there as well as a lot of its IT personnel (mostly due to emigration). It was also targeted by a controversial enemy - Anonymous launched Operation Russia. In other areas, OpenAI caused a serious mindquake in education worldwide (with some panicked reactions suggesting that education is doomed). In the business world, Microsoft set out to buy Activision Blizzard, a major video game company.
Conclusion
For most people, computers are an invention of the late 20th century - but as seen above, information technology in general (and even computers as such) is much older. More often than not, learning about the past can help to predict the future - therefore, tech history should be studied. It has also been said that history tends to repeat itself - some mistakes have been made before, as have some nasty tricks and scams. But history also helps one to understand the actual value of what we currently have in our hands.
References
- ↑ IFRAH, G. The Universal History of Computing: From the Abacus to the Quantum Computer, John Wiley & Sons 2001. ISBN 0471396710
- ↑ http://www.antikythera-mechanism.gr/project/overview
- ↑ https://history-computer.com/MechanicalCalculators/Pioneers/Leonardo.html
- ↑ https://history-computer.com/MechanicalCalculators/Pioneers/Schickard.html
- ↑ https://history-computer.com/People/PascalBio.html
- ↑ https://mathshistory.st-andrews.ac.uk/Biographies/Oughtred/
- ↑ https://sliderulemuseum.com/SR_Course.htm#History
- ↑ https://www.computerhistory.org/babbage/
- ↑ https://www.youtube.com/watch?v=Lcedn6fxgS0
- ↑ https://people.well.com/user/adatoole/PoeticalScience.htm
- ↑ https://ethw.org/J._Presper_Eckert
- ↑ https://www.ivorcatt.com/47c.htm
- ↑ https://web.archive.org/web/20080318184915/https://www.crash-it.com/crash/index.php?page=73
- ↑ http://zuse.zib.de/album/ImvGLEqrWp9c9LA
- ↑ https://www.agnesscott.edu/lriddle/women/hopper.htm
- ↑ https://balticworlds.com/the-cybernetics-scare-and-the-origins-of-the-internet/
- ↑ https://news.bbc.co.uk/2/hi/science/nature/1633972.stm
- ↑ https://www.wired.com/science/discoveries/news/2008/12/dayintech_1209
- ↑ http://www.catb.org/jargon/html/W/walking-drives.html
- ↑ https://www-03.ibm.com/ibm/history/exhibits/storage/storage_chrono20.html
- ↑ MOSCHOVITIS, C.J.P., POOLE, H., SCHUYLER, T., SENFT, T.M. History of the Internet: A Chronology, 1843 to the Present. ABC-Clio 1999, ISBN: 1576071189
- ↑ http://www.unix.org/what_is_unix/history_timeline.html
- ↑ http://www.blinkenlights.com/classiccmp/gateswhine.html
- ↑ https://dataswamp.org/~john/assets/writings/library/microsoft/IhateMS.html
- ↑ https://arstechnica.com/old/content/2005/12/total-share.ars/1
- ↑ https://web.archive.org/web/20071012051803/https://www.cnn.com/2007/TECH/09/18/emoticon.anniversary.ap/index.html
- ↑ https://www.wordstar.org/index.php/wordstar-history
- ↑ https://www.gnu.org/gnu/manifesto.html
- ↑ https://www.w3.org/History/1989/proposal.html
- ↑ https://www.zakon.org/robert/internet/timeline/
- ↑ https://groups.google.com/group/comp.os.minix/msg/b813d52cbc5a044b?dmode=source&pli=1
- ↑ https://www.usdoj.gov/atr/cases/exhibits/20.pdf
- ↑ https://cyber.law.harvard.edu/msdoj/transcript/summaries1.html
- ↑ https://www.zdnet.com/article/bugfest-win2000-has-63000-defects/
- ↑ https://www.pcworld.com/article/535838/worst_products_ever.html
- ↑ http://www.brod.com.br/files/helsinki.pdf
- ↑ https://www.theguardian.com/technology/2017/jun/27/petya-ransomware-cyber-attack-who-what-why-how
- ↑ https://www.extremetech.com/computing/308332-foldinghome-crushes-exascale-barrier-now-faster-than-dozens-of-supercomputers
- ↑ https://www.techradar.com/news/microsoft-may-finally-have-some-encouraging-news-for-windows-10-users
- ↑ https://www.bbc.com/news/world-us-canada-53956683
Additional reading
- CARLTON, J. Apple: The Inside Story of Intrigue, Egomania, and Business Blunders, Harper Paperbacks 1998. ISBN: 0887309658
- FREIBERGER, P., SWAINE, M. Fire in the Valley: The Making of The Personal Computer, 2nd ed. McGraw-Hill 2000. ISBN: 0071358927
- GATES, B. The Road Ahead. 2nd Revised ed. Pearson Elt 2008. ISBN: 140587932
- ISAACSON, W. Steve Jobs. Simon & Schuster 2011. ISBN: 1451648537
- LEVY, S. Hackers: Heroes of the Computer Revolution - 25th Anniversary Edition. O'Reilly 2010. ISBN: 1449388396
- MOODY, G. Rebel Code: Linux And The Open Source Revolution. Basic Books 2002. ISBN: 0738206709
- VISE, D., MALSEED, M. The Google Story: For Google's 10th Birthday. Delacorte Press 2008. ISBN: 0385342721
Study & Write
Find three interesting examples of IT solutions (hardware, software, networking... the more exotic the better) from three different decades (e.g. 1970s, 1990s, 2010s) and introduce them in a blog post.
The content of this course is distributed under the Creative Commons Attribution-ShareAlike 3.0 Estonian license (English: CC Attribution-ShareAlike, or CC BY-SA) or any newer version of the license.