
User interface

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

For the boundary between computer systems, see Interface (computing).

Generally, the goal of user interface design is to produce a user interface that makes it easy, efficient, and enjoyable (user-friendly) to operate a machine in the way that produces the desired result (i.e. maximum usability). This generally means that the operator needs to provide minimal input to achieve the desired output, and also that the machine minimizes undesired outputs to the user.


User interfaces are composed of one or more layers, including a human–machine interface (HMI) that typically interfaces machines with physical input hardware (such as keyboards, mice, or game pads) and output hardware (such as computer monitors, speakers, and printers). A device that implements an HMI is called a human interface device (HID). User interfaces that dispense with the physical movement of body parts as an intermediary step between the brain and the machine use no input or output devices other than electrodes; they are called brain–computer interfaces (BCIs) or brain–machine interfaces (BMIs).


Other terms for human–machine interfaces are man–machine interface (MMI) and, when the machine in question is a computer, human–computer interface. Additional UI layers may interact with one or more human senses, including: tactile UI (touch), visual UI (sight), auditory UI (sound), olfactory UI (smell), equilibria UI (balance), and gustatory UI (taste).


Composite user interfaces (CUIs) are UIs that interact with two or more senses. The most common CUI is a graphical user interface (GUI), which is composed of a tactile UI and a visual UI capable of displaying graphics. When sound is added to a GUI, it becomes a multimedia user interface (MUI). There are three broad categories of CUI: standard, virtual and augmented. Standard CUIs use standard human interface devices like keyboards, mice, and computer monitors. When the CUI blocks out the real world to create a virtual reality, the CUI is virtual and uses a virtual reality interface. When the CUI does not block out the real world and creates augmented reality, the CUI is augmented and uses an augmented reality interface. When a UI interacts with all human senses, it is called a qualia interface, named after the theory of qualia. CUIs may also be classified by how many senses they interact with, as either an X-sense virtual reality interface or an X-sense augmented reality interface, where X is the number of senses interfaced with. For example, a Smell-O-Vision is a 3-sense (3S) standard CUI with visual display, sound and smells; when virtual reality interfaces interface with smells and touch, they are said to be 4-sense (4S) virtual reality interfaces; and when augmented reality interfaces interface with smells and touch, they are said to be 4-sense (4S) augmented reality interfaces.
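
As a rough illustration only, the "X-sense" naming rule can be expressed in a few lines of Python; the Sense and Mode names below are invented for this sketch and are not drawn from any standard.

```python
from enum import Enum, auto

class Sense(Enum):
    SIGHT = auto()
    SOUND = auto()
    TOUCH = auto()
    SMELL = auto()
    TASTE = auto()

class Mode(Enum):
    STANDARD = "standard CUI"
    VIRTUAL = "virtual reality interface"
    AUGMENTED = "augmented reality interface"

def label_cui(senses: set, mode: Mode) -> str:
    """Build the 'X-sense (XS)' label for a composite user interface."""
    x = len(senses)
    return f"{x}-sense ({x}S) {mode.value}"

# A VR interface that adds smell and touch to the usual sight and sound:
print(label_cui({Sense.SIGHT, Sense.SOUND, Sense.SMELL, Sense.TOUCH}, Mode.VIRTUAL))
# -> 4-sense (4S) virtual reality interface
```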


The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI).[4] HMI is a modification of the original term MMI (man–machine interface).[5] In practice, the abbreviation MMI is still frequently used,[5] although some may claim that MMI now stands for something different. Another abbreviation is HCI, but this is more commonly used for human–computer interaction.[5] Other terms used are operator interface console (OIC) and operator interface terminal (OIT).[6] However it is abbreviated, the terms refer to the 'layer' that separates a human who is operating a machine from the machine itself.[5] Without a clean and usable interface, humans would not be able to interact with information systems.

There is a difference between a user interface and an operator interface or a human–machine interface (HMI).


In science fiction, HMI is sometimes used to refer to what is better described as a direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses—the artificial extension that replaces a missing body part (e.g., cochlear implants).[7][8]


In some circumstances, computers might observe the user and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces.[9][10]
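
A minimal sketch of this idea, assuming a hypothetical head/gaze tracker (read_sensors below is a stand-in, not a real sensor API): the program polls the tracker and reacts to what it observes without any explicit command from the user.

```python
import time
from dataclasses import dataclass

@dataclass
class Observation:
    head_yaw_deg: float  # head rotation to the left/right, in degrees
    gaze_x: float        # horizontal gaze position on screen, 0.0 (left) to 1.0 (right)

def read_sensors() -> Observation:
    """Stand-in for a real head/eye tracker; returns a fixed sample here."""
    return Observation(head_yaw_deg=35.0, gaze_x=0.95)

def react(obs: Observation) -> None:
    """React to observed behaviour rather than to explicit commands."""
    if abs(obs.head_yaw_deg) > 30.0:
        print("User looked away: pausing playback")
    elif obs.gaze_x > 0.9:
        print("Gaze near the right edge: scrolling the view")
    else:
        print("No reaction needed")

if __name__ == "__main__":
    for _ in range(3):      # a real immersive interface would loop continuously
        react(read_sensors())
        time.sleep(0.1)
```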

1968 – Douglas Engelbart demonstrated NLS, a system which uses a mouse, pointers, hypertext, and multiple windows.[13]

1970 – Researchers at Xerox Palo Alto Research Center (many from SRI) develop the WIMP paradigm (Windows, Icons, Menus, Pointers)[13]

1973 – Xerox Alto: commercial failure due to expense, poor user interface, and lack of programs[13]

1979 – Steve Jobs and other Apple engineers visit Xerox PARC. Though Pirates of Silicon Valley dramatizes the events, Apple had already been working on developing a GUI, such as the Macintosh and Lisa projects, before the visit.[14][15]

1981 – Xerox Star: focus on WYSIWYG. Commercial failure (25K sold) due to cost ($16K each), performance (minutes to save a file, a couple of hours to recover from a crash), and poor marketing

1982 – Rob Pike and others at Bell Labs designed the Blit, which was released in 1984 by AT&T and Teletype as the DMD 5620 terminal.

1984 – Apple Macintosh popularizes the GUI. Its Super Bowl commercial, shown twice, was the most expensive commercial ever made at that time.

1984 – MIT's X Window System: hardware-independent platform and networking protocol for developing GUIs on UNIX-like systems

1985 – Windows 1.0 – provided a GUI on top of MS-DOS. No overlapping windows (tiled instead).

1985 – Microsoft and IBM start work on OS/2, meant to eventually replace MS-DOS and Windows

1986 – Apple threatens to sue Digital Research because their GUI desktop looked too much like Apple's Mac.

1987 – Windows 2.0 – Overlapping and resizable windows, keyboard and mouse enhancements

1987 – Macintosh II: first full-color Mac

1988 – OS/2 1.10 Standard Edition (SE) has a GUI written by Microsoft that looks a lot like Windows 2

Common practices for interaction specification include user-centered design, persona, activity-oriented design, scenario-based design, and resiliency design.

Common practices for interface software specification include use cases and constraint enforcement by interaction protocols (intended to avoid use errors).
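
One reading of "constraint enforcement by interaction protocols" is a small state machine that only permits legal sequences of user actions and rejects everything else as a use error. The sketch below is illustrative only; the dialog, its states, and its actions are invented for the example.

```python
# A minimal interaction protocol: allowed transitions per state.
PROTOCOL = {
    "idle":         {"select_file": "ready"},
    "ready":        {"select_file": "ready", "start": "transferring"},
    "transferring": {"cancel": "idle", "finish": "idle"},
}

class UseError(Exception):
    """Raised when the user attempts an action the protocol forbids."""

class TransferDialog:
    def __init__(self) -> None:
        self.state = "idle"

    def perform(self, action: str) -> None:
        allowed = PROTOCOL[self.state]
        if action not in allowed:
            raise UseError(f"'{action}' is not allowed in state '{self.state}'")
        self.state = allowed[action]

dialog = TransferDialog()
dialog.perform("select_file")      # idle -> ready
dialog.perform("start")            # ready -> transferring
try:
    dialog.perform("select_file")  # forbidden mid-transfer: a use error
except UseError as err:
    print(err)
```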

Common practices for prototyping are based on libraries of interface elements (controls, decoration, etc.).
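
As an illustration, the throwaway prototype below is assembled entirely from ready-made interface elements in Python's standard tkinter/ttk widget library; the widgets and labels are chosen arbitrarily for the example, and the behaviour is faked, since a prototype only needs to exercise the interaction.

```python
import tkinter as tk
from tkinter import ttk

root = tk.Tk()
root.title("Search prototype")

form = ttk.Frame(root, padding=10)
form.pack(fill="x")

ttk.Label(form, text="Query:").pack(side="left")
query = ttk.Entry(form, width=30)
query.pack(side="left", padx=5)

result = ttk.Label(root, text="", padding=10)
result.pack(fill="x")

def on_search() -> None:
    # Faked behaviour: only the interaction flow is being evaluated.
    result.config(text=f"Pretending to search for: {query.get()!r}")

ttk.Button(form, text="Search", command=on_search).pack(side="left")

root.mainloop()
```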

Historic HMI in the driver's cabin of a German steam locomotive

Modern HMI in the driver's cabin of a German Intercity-Express high-speed train

The HMI of a toilet (in Japan)

HMI for audio mixing

HMI of a machine for the sugar industry with pushbuttons

HMI for a computer numerical control (CNC) machine

Slightly newer HMI for a CNC machine

Emergency switch/panic switch

DMD 5620 terminal

Conference series – covering a wide area of user interface publications

Chapter 2. History: A brief history of user interfaces