
The Xfce desktop environment offers a graphical user interface following the desktop metaphor.

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

Generally, the goal of user interface design is to produce a user interface that makes it easy, efficient, and enjoyable (user-friendly) to operate a machine in the way which produces the desired result (i.e. maximum usability). This generally means that the operator needs to provide minimal input to achieve the desired output, and also that the machine minimizes undesired outputs to the user.

User interfaces are composed of one or more layers, including a human–machine interface (HMI) that typically interfaces machines with physical input hardware (such as keyboards, mice, or game pads) and output hardware (such as computer monitors, speakers, and printers). A device that implements an HMI is called a human interface device (HID). User interfaces that dispense with the physical movement of body parts as an intermediary step between the brain and the machine use no input or output devices other than electrodes; they are called brain–computer interfaces (BCIs) or brain–machine interfaces (BMIs).

Other terms for human–machine interfaces are man–machine interface (MMI) and, when the machine in question is a computer, human–computer interface. Additional UI layers may interact with one or more human senses, including: tactile UI (touch), visual UI (sight), auditory UI (sound), olfactory UI (smell), equilibria UI (balance), and gustatory UI (taste).

Composite user interfaces (CUIs) are UIs that interact with two or more senses. The most common CUI is a graphical user interface (GUI), which is composed of a tactile UI and a visual UI capable of displaying graphics. When sound is added to a GUI, it becomes a multimedia user interface (MUI). There are three broad categories of CUI: standard, virtual and augmented. Standard CUI use standard human interface devices like keyboards, mice, and computer monitors. When the CUI blocks out the real world to create a virtual reality, the CUI is virtual and uses a virtual reality interface. When the CUI does not block out the real world and creates augmented reality, the CUI is augmented and uses an augmented reality interface. When a UI interacts with all human senses, it is called a qualia interface, named after the theory of qualia.[citation needed] CUI may also be classified by how many senses they interact with as either an X-sense virtual reality interface or X-sense augmented reality interface, where X is the number of senses interfaced with. For example, a Smell-O-Vision is a 3-sense (3S) Standard CUI with visual display, sound and smells; when virtual reality interfaces interface with smells and touch it is said to be a 4-sense (4S) virtual reality interface; and when augmented reality interfaces interface with smells and touch it is said to be a 4-sense (4S) augmented reality interface.
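
The sense-counting convention above can be made concrete in code. The following is a minimal sketch, in Python, of the X-sense labelling scheme; the class and the sense names are illustrative assumptions, not an established API.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CompositeUI:
        senses: frozenset[str]  # e.g. {"sight", "sound", "smell"}
        kind: str               # "standard", "virtual", or "augmented"

        def label(self) -> str:
            # Label follows the convention in the text: X-sense (XS) <kind>.
            kind_names = {
                "standard": "Standard CUI",
                "virtual": "virtual reality interface",
                "augmented": "augmented reality interface",
            }
            n = len(self.senses)
            return f"{n}-sense ({n}S) {kind_names[self.kind]}"

    # A Smell-O-Vision engages sight, sound, and smell on standard hardware:
    print(CompositeUI(frozenset({"sight", "sound", "smell"}), "standard").label())
    # -> 3-sense (3S) Standard CUI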

Overview

The Reactable musical instrument, an example of a tangible user interface

The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical parts of the human–machine interface that we can see and touch.[1]

In complex systems, the human–machine interface is typically computerized. The term human–computer interface refers to this kind of system. In the context of computing, the term typically extends as well to the software dedicated to control the physical elements used for human–computer interaction.

The engineering of human–machine interfaces is enhanced by considering ergonomics (human factors). The corresponding disciplines are human factors engineering (HFE) and usability engineering (UE), which is part of systems engineering.

Tools used for incorporating human factors in the interface design are developed based on knowledge of computer science, such as computer graphics, operating systems, and programming languages. Nowadays, we use the expression graphical user interface for human–machine interfaces on computers, as nearly all of them now use graphics.[citation needed]

Multimodal interfaces allow users to interact using more than one modality of user input.[2]

Terminology

A human–machine interface usually involves peripheral hardware for input and output. Often there is an additional component implemented in software, such as a graphical user interface.

There is a difference between a user interface and an operator interface or a human–machine interface (HMI).

  • The term "user interface" is often used in the context of (personal) computer systems and electronic devices.
    • Where a network of equipment or computers is interlinked through an MES (manufacturing execution system) or host to display information.
    • A human–machine interface (HMI) is typically local to one machine or piece of equipment, and is the interface method between the human and the equipment/machine. An operator interface, on the other hand, is the interface method by which multiple pieces of equipment, linked by a host control system, are accessed or controlled.
    • The system may expose several user interfaces to serve different kinds of users. For example, a computerized library database might provide two user interfaces, one for library patrons (limited set of functions, optimized for ease of use) and the other for library personnel (wide set of functions, optimized for efficiency).[3] A minimal sketch of this pattern follows this list.
  • The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI).[4] HMI is a modification of the original term MMI (man–machine interface).[5] In practice, the abbreviation MMI is still frequently used[5] although some may claim that MMI stands for something different now.[citation needed] Another abbreviation is HCI, but it is more commonly used for human–computer interaction.[5] Other terms used are operator interface console (OIC) and operator interface terminal (OIT).[6] However it is abbreviated, the terms refer to the 'layer' that separates a human that is operating a machine from the machine itself.[5] Without a clean and usable interface, humans would not be able to interact with information systems.
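
As a minimal illustration of the multiple-interface pattern from the library example above, the sketch below (in Python; all class and method names are illustrative assumptions) exposes one shared catalogue through a limited patron interface and a wider staff interface.

    class Catalogue:
        """Shared backing store used by every user interface."""
        def __init__(self) -> None:
            self.books: dict[str, bool] = {}  # title -> currently available?

        def add(self, title: str) -> None:
            self.books[title] = True

        def set_available(self, title: str, available: bool) -> None:
            self.books[title] = available

    class PatronInterface:
        """Limited set of functions, optimized for ease of use."""
        def __init__(self, catalogue: Catalogue) -> None:
            self._catalogue = catalogue

        def search(self, term: str) -> list[str]:
            return [t for t in self._catalogue.books if term.lower() in t.lower()]

    class StaffInterface(PatronInterface):
        """Wide set of functions, optimized for efficiency."""
        def add_book(self, title: str) -> None:
            self._catalogue.add(title)

        def check_out(self, title: str) -> None:
            self._catalogue.set_available(title, False)

    shared = Catalogue()
    staff = StaffInterface(shared)
    staff.add_book("The Design of Everyday Things")
    print(PatronInterface(shared).search("design"))  # patrons see the same data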

In science fiction, HMI is sometimes used to refer to what is better described as a direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses—the artificial extension that replaces a missing body part (e.g., cochlear implants).[7][8]

In some circumstances, computers might observe the user and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces.[9][10]

History


The history of user interfaces can be divided into the following phases according to the dominant type of user interface:

1945–1968: Batch interface

IBM 029 card punch

In the batch era, computing power was extremely scarce and expensive. User interfaces were rudimentary. Users had to accommodate computers rather than the other way around; user interfaces were considered overhead, and software was designed to keep the processor at maximum utilization with as little overhead as possible.

The input side of the user interfaces for batch machines was mainly punched cards or equivalent media like paper tape. The output side added line printers to these media. With the limited exception of the system operator's console, human beings did not interact with batch machines in real time at all.

Submitting a job to a batch machine involved first preparing a deck of punched cards that described a program and its dataset. The program cards were not punched on the computer itself but on keypunches, specialized, typewriter-like machines that were notoriously bulky, unforgiving, and prone to mechanical failure. The software interface was similarly unforgiving, with very strict syntaxes designed to be parsed by the smallest possible compilers and interpreters.

Holes are punched in the card according to a prearranged code transferring the facts from the census questionnaire into statistics.

Once the cards were punched, one would drop them in a job queue and wait. Eventually, operators would feed the deck to the computer, perhaps mounting magnetic tapes to supply another dataset or helper software. The job would generate a printout, containing final results or an abort notice with an attached error log. Successful runs might also write a result on magnetic tape or generate some data cards to be used in a later computation.

The turnaround time for a single job often spanned entire days. If one was very lucky, it might be hours; there was no real-time response. But there were worse fates than the card queue; some computers required an even more tedious and error-prone process of toggling in programs in binary code using console switches. The very earliest machines had to be partly rewired to incorporate program logic into themselves, using devices known as plugboards.

Early batch systems gave the currently running job the entire computer; program decks and tapes had to include what we would now think of as operating system code to talk to I/O devices and do whatever other housekeeping was needed. Midway through the batch period, after 1957, various groups began to experiment with so-called "load-and-go" systems. These used a monitor program which was always resident on the computer. Programs could call the monitor for services. Another function of the monitor was to do better error checking on submitted jobs, catching errors earlier and more intelligently and generating more useful feedback to the users. Thus, monitors represented the first step towards both operating systems and explicitly designed user interfaces.

1969–present: Command-line user interface

Teletype Model 33 ASR

Command-line interfaces (CLIs) evolved from batch monitors connected to the system console. Their interaction model was a series of request-response transactions, with requests expressed as textual commands in a specialized vocabulary. Latency was far lower than for batch systems, dropping from days or hours to seconds. Accordingly, command-line systems allowed the user to change their mind about later stages of the transaction in response to real-time or near-real-time feedback on earlier results. Software could be exploratory and interactive in ways not possible before. But these interfaces still placed a relatively heavy mnemonic load on the user, requiring a serious investment of effort and learning time to master.[11]
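
The request-response transaction model is easy to see in a toy implementation. The sketch below, in Python, implements a command-line loop with a small specialized vocabulary; the particular commands (greet, time, quit) are illustrative assumptions, not any historical system.

    import datetime

    COMMANDS = {}  # registry mapping command words to handler functions

    def command(name):
        def register(fn):
            COMMANDS[name] = fn
            return fn
        return register

    @command("greet")
    def greet(args):
        return f"Hello, {args[0] if args else 'world'}!"

    @command("time")
    def time_now(args):
        return datetime.datetime.now().isoformat(timespec="seconds")

    def repl():
        while True:
            try:
                line = input("> ")  # request: one textual command
            except EOFError:
                break
            word, *args = line.split() or [""]
            if word in ("quit", "exit"):
                break
            handler = COMMANDS.get(word)
            # Response: near-real-time feedback the user can react to
            # before deciding on the next transaction.
            print(handler(args) if handler else f"unknown command: {word}")

    if __name__ == "__main__":
        repl()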

The earliest command-line systems combined teleprinters with computers, adapting a mature technology that had proven effective for mediating the transfer of information over wires between human beings. Teleprinters had originally been invented as devices for automatic telegraph transmission and reception; they had a history going back to 1902 and had already become well-established in newsrooms and elsewhere by 1920. In reusing them, economy was certainly a consideration, but psychology and the rule of least surprise mattered as well; teleprinters provided a point of interface with the system that was familiar to many engineers and users.

The VT100, introduced in 1978, was the most popular VDT of all time. Most terminal emulators still default to VT100 mode.

The widespread adoption of video-display terminals (VDTs) in the mid-1970s ushered in the second phase of command-line systems. These cut latency further, because characters could be thrown on the phosphor dots of a screen more quickly than a printer head or carriage could move. They helped quell conservative resistance to interactive programming by cutting ink and paper consumables out of the cost picture, and were even more iconic and comfortable to the first TV generation of the late 1950s and 60s than teleprinters had been to the computer pioneers of the 1940s.

Just as importantly, the existence of an accessible screen—a two-dimensional display of text that could be rapidly and reversibly modified—made it economical for software designers to deploy interfaces that could be described as visual rather than textual. The pioneering applications of this kind were computer games and text editors; close descendants of some of the earliest specimens, such as rogue(6), and vi(1), are still a live part of Unix tradition.

1985: SAA user interface or text-based user interface


In 1985, with the arrival of Microsoft Windows and other graphical user interfaces, IBM created what is called the Systems Application Architecture (SAA) standard, which includes the Common User Access (CUA) derivative. CUA successfully created what we know and use today in Windows, and most of the more recent DOS and Windows console applications use that standard as well.

The standard specified that a pulldown menu system should sit at the top of the screen and a status bar at the bottom, and that shortcut keys should stay the same for all common functionality (F2 to Open, for example, would work in all applications that followed the SAA standard). This greatly improved the speed at which users could learn an application, so it caught on quickly and became an industry standard.[12]
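
A minimal sketch of that screen layout, using Python's standard curses module, might look as follows. The menu labels and the use of F2 follow the description above; everything else is an illustrative assumption.

    import curses

    def main(stdscr):
        curses.curs_set(0)   # hide the cursor
        stdscr.keypad(True)  # enable function keys such as F2
        height, width = stdscr.getmaxyx()
        status = "Ready"
        while True:
            stdscr.erase()
            # CUA-style pulldown menu bar on the top row of the screen.
            stdscr.addstr(0, 0, " File  Edit  View  Help ".ljust(width - 1),
                          curses.A_REVERSE)
            # Status bar on the bottom row of the screen.
            stdscr.addstr(height - 1, 0,
                          f" F2=Open  Q=Quit | {status}".ljust(width - 1),
                          curses.A_REVERSE)
            stdscr.refresh()
            key = stdscr.getch()
            if key == curses.KEY_F2:  # same shortcut in every conforming app
                status = "An Open dialog would appear here"
            elif key in (ord("q"), ord("Q")):
                break

    if __name__ == "__main__":
        curses.wrapper(main)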

1968–present: Graphical user interface

AMX Desk made a basic WIMP GUI.
Linotype WYSIWYG 2000, 1989
  • 1968 – Douglas Engelbart demonstrated NLS, a system which uses a mouse, pointers, hypertext, and multiple windows.[13]
  • 1970 – Researchers at Xerox Palo Alto Research Center (many from SRI) develop WIMP paradigm (Windows, Icons, Menus, Pointers)[13]
  • 1973 – Xerox Alto: commercial failure due to expense, poor user interface, and lack of programs[13]
  • 1979 – Steve Jobs and other Apple engineers visit Xerox PARC. Though Pirates of Silicon Valley dramatizes the events, Apple had already been working on developing a GUI, such as the Macintosh and Lisa projects, before the visit.[14][15]
  • 1981 – Xerox Star: focus on WYSIWYG. Commercial failure (25K sold) due to cost ($16K each), performance (minutes to save a file, a couple of hours to recover from a crash), and poor marketing
  • 1982 – Rob Pike and others at Bell Labs designed Blit, which was released in 1984 by AT&T and Teletype as DMD 5620 terminal.
  • 1984 – Apple Macintosh popularizes the GUI. Its Super Bowl commercial, shown twice, was the most expensive commercial ever made at that time
  • 1984 – MIT's X Window System: hardware-independent platform and networking protocol for developing GUIs on UNIX-like systems
  • 1985 – Windows 1.0 – provided a GUI on top of MS-DOS. No overlapping windows (tiled instead).
  • 1985 – Microsoft and IBM start work on OS/2 meant to eventually replace MS-DOS and Windows
  • 1986 – Apple threatens to sue Digital Research because their GUI desktop looked too much like Apple's Mac.
  • 1987 – Windows 2.0 – Overlapping and resizable windows, keyboard and mouse enhancements
  • 1987 – Macintosh II: first full-color Mac
  • 1988 – OS/2 1.10 Standard Edition (SE) has GUI written by Microsoft, looks a lot like Windows 2

Interface design


Primary methods used in the interface design include prototyping and simulation.

Typical human–machine interface design consists of the following stages: interaction specification, interface software specification and prototyping:

  • Common practices for interaction specification include user-centered design, persona, activity-oriented design, scenario-based design, and resiliency design.
  • Common practices for interface software specification include use cases and constraint enforcement by interaction protocols (intended to avoid use errors).
  • Common practices for prototyping are based on libraries of interface elements (controls, decoration, etc.).
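
As an illustration of prototyping from a library of interface elements, the sketch below assembles a throwaway login form from Python's standard Tkinter widget set; the form itself and its behavior are illustrative assumptions.

    import tkinter as tk
    from tkinter import ttk

    def build_prototype() -> tk.Tk:
        root = tk.Tk()
        root.title("Login prototype")

        # Controls come from the widget library rather than being hand-built.
        ttk.Label(root, text="User name:").grid(row=0, column=0, padx=4, pady=4)
        name = ttk.Entry(root)
        name.grid(row=0, column=1, padx=4, pady=4)

        status = ttk.Label(root, text="")
        status.grid(row=2, column=0, columnspan=2)

        def submit() -> None:
            # Immediate on-screen feedback stands in for the real back end.
            status.config(text=f"Would log in as {name.get() or '(empty)'}")

        ttk.Button(root, text="Log in", command=submit).grid(row=1, column=1)
        return root

    if __name__ == "__main__":
        build_prototype().mainloop()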

Principles of quality


In broad terms, interfaces generally regarded as user friendly, efficient, intuitive, etc. are typified by one or more particular qualities. For the purpose of example, a non-exhaustive list of such characteristics follows:

  1. Clarity: The interface avoids ambiguity by making everything clear through language, flow, hierarchy and metaphors for visual elements.
  2. Concision:[16] Over-clarification (for instance, labelling most or all of the items displayed on-screen at once, whether or not the user actually needs a visual indicator to identify a given item) tends to obscure the very information it is meant to make clear. A concise interface conveys what is needed and no more.
  3. Familiarity:[17] Even if someone uses an interface for the first time, certain elements can still be familiar. Real-life metaphors can be used to communicate meaning.
  4. Responsiveness:[18] A good interface should not feel sluggish. This means that the interface should provide good feedback to the user about what's happening and whether the user's input is being successfully processed.
  5. Consistency:[19] Keeping your interface consistent across your application is important because it allows users to recognize usage patterns.
  6. Aesthetics: While you do not need to make an interface attractive for it to do its job, making something look good will make the time your users spend using your application more enjoyable; and happier users can only be a good thing.
  7. Efficiency: Time is money, and a great interface should make the user more productive through shortcuts and good design.
  8. Forgiveness: A good interface should not punish users for their mistakes but should instead provide the means to remedy them.
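
The forgiveness principle (item 8) is most often realized with an undo mechanism. Below is a minimal sketch in Python, assuming a hypothetical list-of-strings document model; a real application would record richer command objects.

    class UndoableList:
        def __init__(self) -> None:
            self.items: list[str] = []
            self._undo: list[tuple] = []  # stack of (operation, payload) records

        def add(self, item: str) -> None:
            self.items.append(item)
            self._undo.append(("add", item))

        def delete(self, index: int) -> None:
            removed = self.items.pop(index)
            self._undo.append(("delete", (index, removed)))  # enough to restore

        def undo(self) -> None:
            if not self._undo:
                return  # nothing to undo
            op, payload = self._undo.pop()
            if op == "add":
                self.items.pop()  # LIFO undo: the added item is still last
            elif op == "delete":
                index, removed = payload
                self.items.insert(index, removed)

    doc = UndoableList()
    doc.add("chapter 1")
    doc.delete(0)     # the user's "mistake"
    doc.undo()        # the remedy: the chapter is back
    print(doc.items)  # ['chapter 1']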

Principle of least astonishment


The principle of least astonishment (POLA) is a general principle in the design of all kinds of interfaces. It is based on the idea that human beings can only pay full attention to one thing at one time,[20] leading to the conclusion that novelty should be minimized.

Principle of habit formation


If an interface is used persistently, the user will unavoidably develop habits for using the interface. The designer's role can thus be characterized as ensuring the user forms good habits. If the designer is experienced with other interfaces, they will similarly develop habits, and often make unconscious assumptions regarding how the user will interact with the interface.[20][21]

A model of design criteria: User Experience Honeycomb

User Experience Design Honeycomb[22] designed by Peter Morville[23]

Peter Morville designed the User Experience Honeycomb framework in 2004 to guide user interface design. It acted as a guideline for many web development students for a decade.[23]

  1. Usable: Is the design of the system easy and simple to use? The application should feel familiar, and it should be easy to use.[23][22]
  2. Useful: Does the application fulfill a need? A business's product or service needs to be useful.[22]
  3. Desirable: Is the design of the application sleek and to the point? The aesthetics of the system should be attractive, and easy to translate.[22]
  4. Findable: Are users able to quickly find the information they are looking for? Information needs to be findable and simple to navigate. A user should never have to hunt for your product or information.[22]
  5. Accessible: Does the application support enlarged text without breaking the framework? An application should be accessible to those with disabilities.[22]
  6. Credible: Does the application exhibit trustworthy security and company details? An application should be transparent, secure, and honest.[22]
  7. Valuable: Does the end-user think it's valuable? If all 6 criteria are met, the end-user will find value and trust in the application.[22]

Types

Touchscreen of the HP Series 100 HP-150
  1. Attentive user interfaces manage the user's attention, deciding when to interrupt the user, the kind of warnings, and the level of detail of the messages presented to the user.
  2. Batch interfaces are non-interactive user interfaces, where the user specifies all the details of the batch job in advance of batch processing, and receives the output when all the processing is done. The computer does not prompt for further input after the processing has started.
  3. Command line interfaces (CLIs) prompt the user to provide input by typing a command string with the computer keyboard and respond by outputting text to the computer monitor. Used by programmers and system administrators, in engineering and scientific environments, and by technically advanced personal computer users.
  4. Conversational interfaces enable users to command the computer with plain text English (e.g., via text messages, or chatbots) or voice commands, instead of graphic elements. These interfaces often emulate human-to-human conversations.[24]
  5. Conversational interface agents attempt to personify the computer interface in the form of an animated person, robot, or other character (such as Microsoft's Clippy the paperclip), and present interactions in a conversational form.
  6. Crossing-based interfaces are graphical user interfaces in which the primary task consists in crossing boundaries instead of pointing.
  7. Direct manipulation interfaces are a general class of user interfaces that allow users to manipulate objects presented to them, using actions that correspond at least loosely to the physical world.
  8. Gesture interfaces are graphical user interfaces which accept input in a form of hand gestures, or mouse gestures sketched with a computer mouse or a stylus.
  9. Graphical user interfaces (GUI) accept input via devices such as a computer keyboard and mouse and provide articulated graphical output on the computer monitor.[25] There are at least two different principles widely used in GUI design: Object-oriented user interfaces (OOUIs) and application-oriented interfaces.[26]
  10. Hardware interfaces are the physical, spatial interfaces found on products in the real world from toasters, to car dashboards, to airplane cockpits. They are generally a mixture of knobs, buttons, sliders, switches, and touchscreens.
  11. Holographic user interfaces provide input to electronic or electro-mechanical devices by passing a finger through reproduced holographic images of what would otherwise be tactile controls of those devices, floating freely in the air, detected by a wave source and without tactile interaction.
  12. Intelligent user interfaces are human–machine interfaces that aim to improve the efficiency, effectiveness, and naturalness of human–machine interaction by representing, reasoning, and acting on models of the user, domain, task, discourse, and media (e.g., graphics, natural language, gesture).
  13. Motion tracking interfaces monitor the user's body motions and translate them into commands, some techniques of which were at one point patented by Apple.[27]
  14. Multi-screen interfaces employ multiple displays to provide more flexible interaction. This is often employed in computer game interaction, both in commercial arcades and, more recently, in the handheld market.
  15. Natural-language interfaces are used for search engines and on webpages. The user types in a question and waits for a response.
  16. Non-command user interfaces, which observe the user to infer their needs and intentions, without requiring that they formulate explicit commands.[28]
  17. Object-oriented user interfaces (OOUI) are based on object-oriented programming metaphors, allowing users to manipulate simulated objects and their properties.
  18. Permission-driven user interfaces show or conceal menu options or functions depending on the user's level of permissions. The system is intended to improve the user experience by removing items that are unavailable to the user; a user who sees functions that are unavailable for use may become frustrated. It also enhances security by hiding functional items from unauthorized persons. A minimal sketch of this pattern follows this list.
  19. Reflexive user interfaces where the users control and redefine the entire system via the user interface alone, for instance to change its command verbs. Typically, this is only possible with very rich graphic user interfaces.
  20. Search interface is how the search box of a site is displayed, as well as the visual representation of the search results.
  21. Tangible user interfaces, which place a greater emphasis on touch and the physical environment or its elements.
  22. Task-focused interfaces are user interfaces which address the information overload problem of the desktop metaphor by making tasks, not files, the primary unit of interaction.
  23. Text-based user interfaces (TUIs) are user interfaces which interact via text. TUIs include command-line interfaces and text-based WIMP environments.
  24. Touchscreens are displays that accept input by touch of fingers or a stylus. Used in a growing amount of mobile devices and many types of point of sale, industrial processes and machines, self-service machines, etc.
  25. Touch user interfaces are graphical user interfaces using a touchpad or touchscreen display as a combined input and output device. They supplement or replace other forms of output with haptic feedback methods. Used in computerized simulators, etc.
  26. Voice user interfaces, which accept input and provide output by generating voice prompts. The user input is made by pressing keys or buttons, or responding verbally to the interface.
  27. Zero-input interfaces get inputs from a set of sensors instead of querying the user with input dialogs.[29]
  28. Zooming user interfaces are graphical user interfaces in which information objects are represented at different levels of scale and detail, and where the user can change the scale of the viewed area in order to show more detail.
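
A permission-driven user interface (item 18 above) can be reduced to filtering a menu declaration against a user's permission set. The following minimal sketch is in Python; the permission names are hypothetical.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class MenuItem:
        label: str
        required_permission: str | None = None  # None: visible to everyone

    MENU = [
        MenuItem("Open record"),
        MenuItem("Edit record", "records.edit"),
        MenuItem("Delete record", "records.delete"),
        MenuItem("Audit log", "admin.audit"),
    ]

    def visible_menu(user_permissions: set[str]) -> list[str]:
        """Conceal entries the user may not invoke, rather than greying them out."""
        return [item.label for item in MENU
                if item.required_permission is None
                or item.required_permission in user_permissions]

    print(visible_menu({"records.edit"}))
    # -> ['Open record', 'Edit record']
    print(visible_menu({"records.edit", "admin.audit"}))
    # -> ['Open record', 'Edit record', 'Audit log']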


References

  1. ^ "Eurotherm Parker SSD Link Hardware L5392 | Automation Industrial". l5392.com. Retrieved 11 January 2024.
  2. ^ Cohen, Philip R. (1992). "The role of natural language in a multimodal interface". Proceedings of the 5th annual ACM symposium on User interface software and technology - UIST '92. pp. 143–149. doi:10.1145/142621.142641. ISBN 0897915496. S2CID 9010570.
  3. ^ "The User Experience of Libraries: Serving The Common Good User Experience Magazine". uxpamagazine.org. 7 May 2017. Retrieved 23 March 2022.
  4. ^ Griffin, Ben; Baston, Laurel. "Interfaces" (Presentation): 5. Archived from the original on 14 July 2014. Retrieved 7 June 2014. The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI). {{cite journal}}: Cite journal requires |journal= (help)
  5. ^ a b c d "User Interface Design and Ergonomics" (PDF). Course Cit 811. National Open University of Nigeria: School of Science and Technology: 19. Archived (PDF) from the original on 14 July 2014. Retrieved 7 June 2014. In practice, the abbreviation MMI is still frequently used although some may claim that MMI stands for something different now.
  6. ^ "Introduction Section". Recent advances in business administration. [S.l.]: Wseas. 2010. p. 190. ISBN 978-960-474-161-8. Other terms used are operator interface console (OIC) and operator interface terminal (OIT)
  7. ^ Cipriani, Christian; Segil, Jacob; Birdwell, Jay; Weir, Richard (2014). "Dexterous control of a prosthetic hand using fine-wire intramuscular electrodes in targeted extrinsic muscles". IEEE Transactions on Neural Systems and Rehabilitation Engineering. 22 (4): 828–36. doi:10.1109/TNSRE.2014.2301234. ISSN 1534-4320. PMC 4501393. PMID 24760929. Neural co-activations are present that in turn generate significant EMG levels and hence unintended movements in the case of the present human machine interface (HMI).
  8. ^ Citi, Luca (2009). "Development of a neural interface for the control of a robotic hand" (PDF). Scuola Superiore Sant'Anna, Pisa, Italy: IMT Institute for Advanced Studies Lucca: 5. Retrieved 7 June 2014. {{cite journal}}: Cite journal requires |journal= (help)[permanent dead link]
  9. ^ Jordan, Joel. "Gaze Direction Analysis for the Investigation of Presence in Immersive Virtual Environments" (Thesis submitted for the degree of Doctor of Philosophy). University of London: Department of Computer Science: 5. Archived (PDF) from the original on 14 July 2014. Retrieved 7 June 2014. The aim of this thesis is to investigate the idea that the direction of gaze may be used as a device to detect a sense-of-presence in Immersive Virtual Environments (IVE) in some contexts. {{cite journal}}: Cite journal requires |journal= (help)
  10. ^ Ravi (August 2009). "Introduction of HMI". Archived from the original on 14 July 2014. Retrieved 7 June 2014. In some circumstance computers might observe the user, and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces.
  11. ^ "HMI Guide". Archived from the original on 20 June 2014.
  12. ^ Richard, Stéphane. "Text User Interface Development Series Part One – T.U.I. Basics". Archived from the original on 16 November 2014. Retrieved 13 June 2014.
  13. ^ a b c McCown, Frank. "History of the Graphical User Interface (GUI)". Harding University. Archived from the original on 8 November 2014. {{cite journal}}: Cite journal requires |journal= (help)
  14. ^ "The Xerox PARC Visit". web.stanford.edu. Retrieved 8 February 2019.
  15. ^ "apple-history.com / Graphical User Interface (GUI)". apple-history.com. Retrieved 8 February 2019.
  16. ^ Raymond, Eric Steven (2003). "11". The Art of Unix Programming. Thyrsus Enterprises. Archived from the original on 20 October 2014. Retrieved 13 June 2014.
  17. ^ C. A. D'H Gough; R. Green; M. Billinghurst. "Accounting for User Familiarity in User Interfaces" (PDF). Retrieved 13 June 2014. {{cite journal}}: Cite journal requires |journal= (help)
  18. ^ Sweet, David (October 2001). "9 – Constructing A Responsive User Interface". KDE 2.0 Development. Sams Publishing. Archived from the original on 23 September 2013. Retrieved 13 June 2014.
  19. ^ John W. Satzinger; Lorne Olfman (March 1998). "User interface consistency across end-user applications: the effects on mental models". Journal of Management Information Systems. Managing virtual workplaces and teleworking with information technology. 14 (4). Armonk, NY: 167–193. doi:10.1080/07421222.1998.11518190.
  20. ^ a b Raskin, Jef (2000). The human interface : new directions for designing interactive systems (1. printing. ed.). Reading, Mass. [u.a.]: Addison Wesley. ISBN 0-201-37937-6.
  21. ^ Udell, John (9 May 2003). "Interfaces are habit-forming". Infoworld. Archived from the original on 4 April 2017. Retrieved 3 April 2017.
  22. ^ a b c d e f g h "User Interface & User Experience Design | Oryzo | Small Business UI/UX". Oryzo. Retrieved 19 November 2019.
  23. ^ a b c Wesolko, Dane (27 October 2016). "Peter Morville's User Experience Honeycomb". Medium. Retrieved 19 November 2019.
  24. ^ Errett, Joshua. "As app fatigue sets in, Toronto engineers move on to chatbots". CBC. CBC/Radio-Canada. Archived from the original on 22 June 2016. Retrieved 4 July 2016.
  25. ^ Martinez, Wendy L. (23 February 2011). "Graphical user interfaces: Graphical user interfaces". Wiley Interdisciplinary Reviews: Computational Statistics. 3 (2): 119–133. doi:10.1002/wics.150. S2CID 60467930.
  26. ^ Lamb, Gordana (2001). "Improve Your UI Design Process with Object-Oriented Techniques". Visual Basic Developer magazine. Archived from the original on 14 August 2013. Table 1. Differences between the traditional application-oriented and object-oriented approaches to UI design.
  27. ^ appleinsider.com Archived 2025-08-06 at the Wayback Machine
  28. ^ Jakob Nielsen (April 1993). "Noncommand User Interfaces". Communications of the ACM. 36 (4). ACM Press: 83–99. doi:10.1145/255950.153582. S2CID 7684922. Archived from the original on 10 November 2006.
  29. ^ Sharon, Taly, Henry Lieberman, and Ted Selker. "A zero-input interface for leveraging group experience in web browsing Archived 2025-08-06 at the Wayback Machine." Proceedings of the 8th international conference on Intelligent user interfaces. ACM, 2003.