Digit Geek

How the modern-day GUI came to be

Every single device with a screen has one, and it is something that can make or break your experience on a device.

If you’re a denizen of the electronic world and use anything with even a hint of smarts about it, you’ve used a Graphical User Interface (GUI). In a highly touch-driven world, the GUI is not only omnipresent, it can determine the fate of an innovation. As long as tech is made for humans, GUIs will be an essential part of it.

While some of you may already know about Steve Jobs’ now-historic visit to Xerox PARC, where he saw a GUI in operation for the first time, most tech users are clueless about the origins of something they use daily. Even less well known is that the quest for a compact, user-friendly personal computing interface began long before that visit.

Before it could be done

The initial idea behind the graphical user interface – at that point little more than a concept – came long before the technology to make it happen was even remotely ready. Vannevar Bush postulated a device called the Memex in the 1930s. The concept looked like a desk with two slanted translucent screens for viewing microfilm, a keyboard, and a scanner attached to it. Obviously, none of this could actually be built back then, as there was little to no interest in digital computers. Bush’s work went largely unnoticed until World War II was around the corner, when a rush to stay ahead on the technology front had gripped almost every nation in the world. Unknown to Bush, his speculations would go on to inspire a generation of innovators to work towards making the GUI a reality.

A diagram representing the internals of the Memex

Douglas Engelbart was one such innovator. He quit a comfortable job at NACA’s Ames Research Center (NACA would later become NASA) to build something not too different from Bush’s vision. His wartime years as a radar operator proved crucial to his understanding of a user-controllable interface built around a cathode-ray tube display. At a time when entering text into a computer in real time was considered a radical idea, Engelbart and his team worked for years to bring about something that would change the world of the GUI as we know it. It all came together in 1968.

And one more thing…

Douglas Engelbart’s historic demonstration was overwhelming, to say the least. The system was called NLS, or oN-Line System. Its display was vector-based and could show both text and solid lines on the same screen. The system also featured the precursor of the modern mouse: a three-button device that let the user perform actions not too different from those of today’s mouse.

Douglas Engelbart, the man behind the mouse, with his creation

Beyond this, the system was capable of hypertext linking, full-screen document editing, context-sensitive help, networked document collaboration, e-mail, instant messaging, even video-conferencing! Though Engelbart’s lab eventually shut down in 1989 for lack of funding, this one demo had stirred things up – especially at a company whose future looked bleak in the face of technology that could make paper obsolete.

Paperwork

Xerox, reasonably alarmed by the possibility of the demise of its paper-based business in a future where people could collaborate on digital documents from across continents, saw reason to invest in and control this new technology. This led to the setting up of PARC (Palo Alto Research Centre) in 1970, hot on the heels of Engelbart’s demo. Employees at PARC – some of the best computer scientists in the country – had full freedom to work on their dream projects to build the future of computing, which is exactly what they did.

The Xerox Alto – its portrait display was oriented to accurately represent a sheet of paper

Alto and Smalltalk, Xerox’s take on the first graphical computer and graphical user interface, first became functional in 1973 and 1974 respectively, and were improved upon for years afterwards. The Alto featured full raster-based bitmap graphics at a resolution of 606 by 808, along with innovations – a dynamic mouse cursor, a word processor, a bitmap editor – that are so familiar to us today. What was missing was a consistent user interface across these applications, and that is what Smalltalk brought. Almost everything you use today on a personal computer – overlapping windows, scroll bars, icons – was first implemented here. The combination of Alto and Smalltalk was essentially the first personal computer. But it was a completely different company that reaped its benefits.

The fruit of Eden

By the time Xerox went commercial with its product, most of its best engineers and innovators had left for brighter avenues. One such avenue was Apple, founded by Steve Jobs and Steve Wozniak in 1976. Although Apple’s Lisa, which had started off as a command-line-based business computer, was one of the first beneficiaries of the influx of PARC people into Apple, it wasn’t until the 1984 launch of the far more affordable and lightweight Macintosh that Apple truly reaped the benefits of PARC’s innovations.

Back in the day, Apple provided a California elementary school with free machines for all its students. During the summers, engineers from Apple worked with the teachers and kids to refine the software and the GUI, mainly because they believed the children’s reactions would be the most honest.

The Macintosh shipped with a GUI – a welcome relief from text-based command-line interfaces (listen to the audience’s reactions at the famous 1984 demo: http://dgit.in/1984Demo). Apple was set to ride a wave of success that would fend off any competition for a good decade, to say the least. Through the remaining years of the 1980s, Microsoft collaborated with Apple to develop applications for the Macintosh while launching its own platform, Windows – which, along with its successor, initially failed to catch on in the market. Nobody was in a hurry to leave the comfort of the Macintosh world yet. Until the 90s arrived.

The Redmond wave

The 1990s saw many smaller but significant GUIs dwindle away, leaving Apple and Microsoft to duke it out in the market. Windows 3.0 arrived in May 1990, and with it came the applications Microsoft had already been building for the Macintosh. Word and Excel were particularly popular, along with the tons of third-party applications that could now run on Windows. By the time Windows 3.1 hit the market – with GUI features like sculpted buttons, better colour support, real-time multitasking and scalable TrueType font support – Windows PCs were outselling Macs. Windows NT, a separate professional line, did similarly well and won over the business world too with its solid graphical support.

Remember good old Windows NT 4.0?

By this time, a bitter legal war had heated up between Apple and Microsoft over the similarities between Windows’ design and the Macintosh platform. But Microsoft turned the tables, arguing that both Apple and Microsoft had merely taken the liberty of being inspired by technology from Xerox PARC. Microsoft won the lawsuit, and the story hasn’t changed much since then.

Apple eventually did up its game – the iMac line, launched in 1998, breathed some critical life back into Apple’s business, and the next-gen OS X finally appeared in March 2001.

GUIs right now

The GUI world of the personal computer hasn’t changed drastically since the beginning of this millennium. The massively popular smartphone and its touch interface, on the other hand, have brought about an entirely new type of GUI. The next frontier in interfaces, virtual reality, demands something more from the GUI. In a model of computing that can only be deemed successful if it replicates the real world accurately, the GUI has to resemble our world as closely as possible – downright copying it if necessary. If someone in the world of tech does get it right, let’s hope that this time there won’t be any lawsuits as a result.

Arnab Mukherjee

A former tech-support desk jockey, you can find this individual delving deep into all things tech, fiction and food. Calling his sense of humour merely terrible would be a much better joke than what he usually makes.