HK1178637B - Role based user interface for limited display devices - Google Patents
- Publication number
- HK1178637B (Application HK13106235.7A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- component
- role
- screen
- user
- option
- Prior art date
Description
Technical Field
The present invention relates to user interfaces, and more particularly to role-based user interfaces.
Background
Limited display devices, such as smart phones, are increasingly being used to perform tasks that are traditionally performed with desktop computing devices having larger screens. However, performing certain tasks on a limited display device can be cumbersome for the user. For example, it may be difficult for a user to perform project tasks on a limited display device.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A role-based graphical user interface (UI) is used to receive user input for entering/editing project- and task-related information on a limited display device. Functional components are grouped into logical hubs that can be displayed within a user interface. The grouping of components is based on the role of the user (e.g., project manager, project participant, contractor …). For example, for one or more users, the role-based graphical UI may group together the following components: fee entry and approval; time entry and approval; notifications; information collaboration (e.g., documents, project information, etc.); reporting; and settings. After selecting one of the components from the role-based UI, the user may use the displayed component to interact with its function (e.g., entering a fee or a time entry …). The UI is configured to allow navigation between the different functions included within the logical hub.
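As a non-authoritative sketch, the role-to-hub grouping described above could be modeled as a simple lookup table; the role and component names below are illustrative assumptions, not language from the claims:

```python
# Hypothetical role -> grouped-components table; names are illustrative.
ROLE_COMPONENTS = {
    "project_manager": [
        "time_entry_and_approval", "fee_entry_and_approval",
        "collaboration", "notifications", "reporting", "settings",
    ],
    "project_member": [
        "time_entry", "fee_entry",
        "collaboration", "notifications", "reporting", "settings",
    ],
}

def components_for(role: str) -> list[str]:
    """Return the logical hub (grouped components) for a user's role.

    Falls back to the project-member grouping for unknown roles -- an
    assumption for this sketch, not something the disclosure specifies.
    """
    return ROLE_COMPONENTS.get(role, ROLE_COMPONENTS["project_member"])
```

A lookup like this keeps the grouping data-driven, so supporting a new role (e.g., contractor) only means adding one table entry.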
Drawings
FIG. 1 illustrates an exemplary computing device;
FIG. 2 illustrates an exemplary system including a display for interacting with a role-based UI on a limited display device screen;
FIG. 3 shows an illustrative process related to a role-based user interface;
FIG. 4 illustrates an example layout of a role based UI;
FIG. 5 illustrates a high-level display for accessing a role-based UI;
FIG. 6 illustrates a component screen for entering a fee;
FIG. 7 illustrates a component screen for entering a time entry; and
FIG. 8 illustrates a screen for entering an item identifier.
Detailed Description
Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described. In particular, FIG. 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Referring now to FIG. 1, an illustrative computer architecture for a computer 100 utilized in the various embodiments will be described. The computer architecture shown in FIG. 1 may be configured as a mobile computing device (e.g., a smartphone, notebook, tablet …) or a desktop computer, and includes a central processing unit 5 ("CPU"), a system memory 7, including a random access memory 9 ("RAM") and a read-only memory ("ROM") 10, and a system bus 12 that couples the memory to the CPU 5.
A basic input/output system containing the basic routines that help to transfer information between elements within the computer, such as during startup, is stored in the ROM 10. The computer 100 also includes a mass storage device 14, the mass storage device 14 being used to store an operating system 16, application programs 24, and other program modules 25, files 27, and a UI manager 26, as will be described in greater detail below.
The mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12. The mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, the computer-readable media can be any available media that can be accessed by the computer 100.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), flash memory or other solid state memory technology, CD-ROM, digital versatile disks ("DVD"), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
According to various embodiments, the computer 100 may operate in a networked environment using logical connections to remote computers through a network 18, such as the Internet. The computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12. The network connection may be wireless and/or wired. The network interface unit 20 may also be used to connect to other types of networks and remote computer systems. The computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, including a touch input device 28. The touch input device may utilize any technology that allows for the recognition of single/multi-touch inputs (touch/no-touch). For example, techniques may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optical capture, tuned electromagnetic induction, ultrasonic receivers, sensing microphones, laser rangefinders, shadow capture, and the like. According to one embodiment, the touch input device may be configured to detect a proximity touch (i.e., within a certain distance from, but not in physical contact with, the touch input device). The touch input device 28 may also act as a display. An input/output controller 22 may also provide output to one or more display screens, printers, or other types of output devices.
A camera and/or some other sensing device may be operable to record one or more users and capture motions and/or gestures made by a user of the computing device. The sensing device may also be operable to capture spoken words, such as by a microphone, and/or to capture other input from the user, such as by a keyboard and/or mouse (not depicted). The sensing device may comprise any motion detection device capable of detecting movement of the user. For example, the camera may comprise a Microsoft motion capture device comprising a plurality of cameras and a plurality of microphones.
Embodiments of the invention may be practiced with a system on a chip (SOC) in which each or many of the components/processes shown in the figures may be integrated onto a single integrated circuit. Such SOC devices may include one or more processing units, graphics units, communication units, system virtualization units, and various application functions, all integrated (or "burned") onto a chip substrate as a single integrated circuit. When operating via an SOC, all or a portion of the functionality described herein with respect to unified communications may be operated via application-specific logic integrated with other components of computing device/system 100 on a single integrated circuit (chip).
As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 14 and RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of a networked personal computer, such as an operating system available from Microsoft Corporation of Redmond, Washington. According to one embodiment, the operating system is configured to include support for the touch input device 28. According to another embodiment, UI manager 26 may be used to process some/all of the touch input received from touch input device 28.
The mass storage device 14 and RAM 9 may also store one or more program modules. In particular, the mass storage device 14 and the RAM 9 may store one or more application programs 24, such as applications related to project management. For example, the functionality included within MICROSOFT DYNAMICS SL may be used for project management. Computing device 100 may access one or more applications included on computing device 100 and/or included at some other location. For example, the computing device 100 may connect to the cloud-based service 29 to access functionality accessed using a role-based graphical user interface. Computing device 100 may also be configured to access functionality on one or more networked computing devices. In connection with the operation of an application, UI manager 26 is operative to display and receive input from role-based UIs that group common functions/components together. Generally, UI manager 26 is configured to use a limited display device to facilitate displaying, processing, and receiving user input for a role-based graphical User Interface (UI) related to a project/task. Additional details regarding the operation of UI manager 26 will be provided below.
FIG. 2 illustrates an exemplary system including a display for interacting with a role-based UI on a screen of a limited display device. As shown, system 200 includes application 24, callback code 212, UI manager 26, cloud-based service 210, and touchscreen input device/display 202.
To facilitate communication with UI manager 26, one or more callback routines, illustrated in fig. 2 as callback code 212, may be implemented. According to one embodiment, the application program 24 is a commercial productivity application configured to receive input from the touch-sensitive input device 202 and/or keyboard input (e.g., physical keyboard and/or SIP). For example, UI manager 26 may provide information to application 24 in response to a user's gesture (i.e., a finger on hand 230) selecting a user interface option within the role-based UI.
The illustrated system 200 includes a touch screen input device/display 202 that detects when a touch input (e.g., a finger touch or near touch to the touch screen) is received. Any type of touch screen that detects a touch input by a user may be utilized. For example, a touch screen may include one or more layers of capacitive material that detect touch inputs. Other sensors may be used in addition to or in place of capacitive materials. For example, an Infrared (IR) sensor may be used. According to an embodiment, the touch screen is configured to detect an object in contact with or located on the touchable surface. Although the term "upper" is used in this specification, it should be understood that the orientation of the touch panel system is irrelevant. The term "upper" is intended to be applicable to all such orientations. The touch screen may be configured to determine a location (e.g., a start point, an intermediate point, and an end point) at which the touch input is received. The actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples of sensors for detecting contact includes: pressure-based mechanisms, micromechanical accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
UI manager 26 is configured to display a role-based UI and to process input received from touch screen input device/display 202. A role-based graphical user interface (UI) is used to receive user input for entry/editing related to a project/task. The role-based UI 201 groups similar and commonly used functional components based on the user's role (e.g., project manager, project participant, contractor …). For example, for one or more users, the role-based graphical UI may group together the following functions: a time component 203; a fee component 204; a collaboration component 205; a notification component 206; a reporting component 207; and a settings component 208. Upon selecting one of the components (e.g., by tapping 230 on the display of the component), the user may use the displayed interface to interact with its function (e.g., entering fees, time entries …) (see the example component screens of FIGS. 6-8). Generally, the time component 203 is used for receiving a time entry and/or approving/viewing time entries. The fee component 204 is used to enter fees and/or approve/view fee entries. The collaboration component 205 is used for sharing/collaborating on information. For example, a user may share a document among project members. The notification component 206 shows the number of notifications pending for the user. In the example shown, the user has 8 pending notifications. According to an embodiment, the notifications are notifications associated with each of the different components. According to another embodiment, all/some of the components within the role-based UI may include an indicator specifying pending notifications for that component. For example, the time component may show a project manager that there are 12 time entries to approve. The reporting component 207 is used to select reports to be displayed. For example, a report may display a subset of KPIs ("key performance indicators") subscribed to by the user.
Settings 208 are used to configure settings (e.g., components to display, options displayed) of the role-based UI.
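The per-component pending-notification indicators described above could be sketched as a small aggregation; the `(component, message)` record shape below is an assumption for illustration, not part of the disclosure:

```python
from collections import Counter

def badge_counts(pending):
    """Aggregate pending notifications into a per-component badge count.

    `pending` is a list of (component, message) tuples -- a hypothetical
    record shape chosen for this sketch.
    """
    return Counter(component for component, _msg in pending)

# e.g. a project manager with 12 time entries to approve and one fee note:
pending = [("time", "entry to approve")] * 12 + [("fee", "receipt missing")]
counts = badge_counts(pending)
```

The top-level launch icon (see FIG. 5) could then display the sum of all badge counts.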
Cloud-based services 210 may be configured to provide cloud-based services for a variety of different applications/components using role-based UIs. For example, cloud-based service 210 may be configured to provide business services. According to one embodiment, the service is comparable to that provided by the MICROSOFT DYNAMICS SL program. The services may include, but are not limited to: financial management, business intelligence and reporting, project management, and service management. Some of the different functions may include time entry, expense review/entry, information collaboration, task/information notification, reporting, and the like.
Referring now to FIG. 3, an illustrative process 300 is described in connection with a role-based user interface. When reading the discussion of the routines provided herein, it should be appreciated that the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is selected depending on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof.
After a start operation, the process flows to operation 310, where the role of the user is determined. According to one embodiment, roles relate to tasks assigned to a user in one or more projects. For example, a user may be a project manager, a project member, a contractor, a consultant involved in one or more projects.
Moving to operation 320, a set of components is determined based on the role of the user. For example, project members typically have different assigned tasks and responsibilities than project managers. The components grouped together for a project manager can include components for approving/assigning information, while the components grouped together for a project member include components for entering information that is approved/viewed by the project manager. According to one embodiment, the components grouped together for a project member include a time entry component, a fee entry component, a collaboration component, a notification component, a reporting component, and a settings component. According to one embodiment, the components grouped together for a project manager include a time entry and approval component, a fee entry and approval component, a collaboration component, a notification component, a reporting component, and a settings component.
The components may be determined automatically/manually. For example, a user can manually select components to be included within the role-based UI by using a user interface and/or a settings profile. The components may also be determined automatically by checking the usage patterns of the different components for the user. Components for inclusion within the role-based user interface may be selected based on the usage pattern. A component may be associated with one or more applications.
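The automatic, usage-pattern-based selection just described could be sketched as a frequency count; the pick-the-most-used heuristic and the log shape are illustrative assumptions:

```python
from collections import Counter

def select_components(usage_log, limit=6):
    """Pick the most frequently used components for automatic inclusion.

    `usage_log` is a list of component names, one entry per recorded use;
    frequency ranking is an assumed heuristic, not claim language.
    """
    return [name for name, _count in Counter(usage_log).most_common(limit)]

log = ["time"] * 5 + ["fee"] * 3 + ["reporting"] * 2 + ["settings"]
top = select_components(log, limit=3)  # ['time', 'fee', 'reporting']
```

A manual override (the settings profile mentioned above) would simply replace the computed list with the user's saved selection.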
Flowing to operation 330, the grouped components are displayed within the role-based UI. The components may be displayed in different manners (e.g., lists, buttons, different icons, etc. (see, e.g., FIGS. 4-8)). According to one embodiment, the role-based UI groups the components on a single screen of the limited display device so that common functions are easily accessible to the user.
Transitioning to operation 340, input is received to select one of the components displayed within the role-based UI. For example, the user may tap on a component within the display of the role based UI.
Moving to operation 350, the display of the role based UI is updated to reflect the selected component. According to one embodiment, a component screen is displayed to receive input related to a selected component.
Flowing to operation 360, input is received to interact with the component screen (see, e.g., FIGS. 6-8).
Transitioning to decision operation 370, a decision is made to determine whether another component is selected. According to one embodiment, the user can select another component directly from the component screen without having to return to the home screen of the role-based UI.
When another component has been selected, the process moves to operation 350.
When another component has not been selected, the process flows to an end operation and returns to processing other actions.
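The flow of process 300 (operations 310-370) can be sketched as a simple loop; the callback structure and component names are assumptions made for illustration:

```python
# Minimal sketch of process 300; not the patented implementation.
ROLE_COMPONENTS = {"project_member": ["time", "fee", "settings"]}

def run_role_based_ui(role, read_selection, run_component_screen):
    components = ROLE_COMPONENTS[role]         # operations 310-320: role -> grouping
    selected = read_selection(components)      # operations 330-340: display, select
    while selected is not None:                # decision operation 370
        run_component_screen(selected)         # operations 350-360: component screen
        selected = read_selection(components)  # select another component?
    # no further selection: end operation
```

The `read_selection` callback stands in for the touch-driven component selection; returning `None` models the user leaving the UI.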
FIG. 4 illustrates an example layout of a role-based UI. As shown, FIG. 4 includes two different displays (410, 420) showing two different layouts. The display may be shown on a computing device with a limited display size (e.g., a cellular telephone with a display of about 2x3 inches, a tablet with a display of about 7-10 inches, and/or other devices with other display sizes). According to one embodiment, the display includes a touchscreen for receiving gestures to interact with the role-based UI.
Displays 410 and 420 each show a role-based UI that includes a selection of components chosen based on the user's role. Any number of components with different functions may be grouped together (e.g., three, four, five, six, seven, eight, and so on). According to one embodiment, the grouped components are displayed on a single display screen such that each of the grouped components can be selected from the same screen. As shown, each role-based UI includes a navigation area that can be used to provide additional functionality that may/may not be related to the role-based UI. The navigation area may include any combination of hardware/software components. For example, the navigation area may be a hardware button that is part of the computing device. The navigation area may also be an area with programmable software buttons.
FIG. 5 illustrates a high-level display for accessing a role-based UI.
Display 510 shows an exemplary screen that may be used to launch a role-based UI. Display 510 may be a home screen associated with the device and/or another page on the device. In this example, the illustrated role-based UI launch icon 511 shows that 8 messages relating to the role-based UI are waiting for the user.
In response to launching the role-based UI, display 520 is shown. Components 521, 522, 523, 524, 525, and 526 are grouped based on the role of the user. As shown, the role-based UI includes a time component 521, an expense component 522, a collaboration component 523, a notification component 524, a reporting component 525, and a settings component 526. According to one embodiment, the functionality of the components may be configured differently depending on the role of the user. For example, a project manager may be allowed to enter and approve entries for various project members, while a project member may be allowed to enter entries but not approve entries for other project members. Some/all of the components may change depending on the role of the user. For example, the grouping for a project manager can include a component for updating tasks assigned to project members.
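The role-dependent configuration of a component's functionality (enter vs. approve, as described above) might be modeled as a capability table; the role names and flags below are hypothetical:

```python
# Illustrative capability table; the flags are assumptions for this sketch,
# not the disclosure's access-control mechanism.
CAPABILITIES = {
    "project_manager": {"enter": True, "approve": True},
    "project_member":  {"enter": True, "approve": False},
}

def can_approve(role: str) -> bool:
    """Deny approval by default for roles the table does not know."""
    return CAPABILITIES.get(role, {}).get("approve", False)
```

A component screen would consult such a table when rendering, e.g. showing a project manager the approve/view options that a project member's screen omits.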
FIG. 6 shows a component screen for entering a fee.
Display 610 shows an exemplary component screen for entering a fee that is launched in response to selecting the fee component on the role-based UI (see, e.g., FIG. 5). The configuration of the fee component may vary depending on the role of the user. For example, a fee component screen for a project manager can include an option for viewing/approving fees.
As shown, the fee component screen 610 includes options 611-618 for entering a fee. Option 611 allows the user to save/cancel the fee entry. In response to saving the fee entry, the fee information may be stored. According to one embodiment, the saved fee information is moved to a cloud-based service. Option 612 is for receiving a date entry for the fee. According to one embodiment, the default date is the current date. Option 613 is for receiving an identifier of the item to which the fee is charged. Option 614 is for receiving a category for the fee. Option 615 is for receiving the amount of the fee. Option 616 is for receiving any notes that the user may want to include with the fee. Option 617 is for receiving an image of a receipt for the fee. Option 618 is for receiving an entry to move to another component screen associated with the role-based UI and/or to change a setting associated with the fee component and/or the role-based UI. For example, the settings options displayed in option 618 may be used to select the default fields that the user wishes to display when the fee component screen is initially displayed.
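The fee-entry fields of options 612-617, including the current-date default described for option 612, could be sketched as a record type; the field names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class FeeEntry:
    """Fields loosely mirroring options 612-617; names are hypothetical."""
    project_id: str                                        # option 613
    category: str                                          # option 614
    amount: float                                          # option 615
    entry_date: date = field(default_factory=date.today)   # option 612 default
    note: str = ""                                         # option 616
    receipt_image: Optional[bytes] = None                  # option 617

# The date field defaults to the current date, as option 612 describes.
entry = FeeEntry(project_id="PRJ-01", category="travel", amount=42.50)
```

On save (option 611), such a record could be serialized and pushed to the cloud-based service mentioned above.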
Fig. 7 illustrates a component screen for entering a time entry.
Display 710 illustrates an exemplary component screen for entering a time entry that is launched in response to selecting a time component on the role-based UI (e.g., see fig. 5). The configuration of the time component screen may vary depending on the role of the user. For example, a time component screen for a project manager may include an option to view/approve time entries for other project members.
As shown, time component screen 710 includes options 711-716 for entering time entries. Option 711 allows the user to save/cancel/start time entries. According to one embodiment, the start button in option 711 may be used to start a timer that may be used to track the time of a time entry (time option 713). According to one embodiment, selecting the start button changes the start button to a stop button that can be used to stop the timer. Once the stop button is selected, the button changes to a save option. Option 712 is for receiving a date entry for the time entry. According to one embodiment, the default date is the current date. Option 713 is for receiving the time of the time entry. The time may be manually entered or may be determined in response to a timer. Option 714 is for receiving an identifier of a time entry (e.g., project, task code). Option 715 is for receiving any notes that the user may want to include with the time entry. Option 716 is used to receive an entry to move to another component screen associated with the role-based UI and/or change a setting associated with the time component and/or the role-based UI. For example, the setup options displayed in option 716 may be used to select the default fields that the user wishes to display when the time component screen is initially displayed.
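The Start → Stop → Save button behavior of option 711 described above can be sketched as a tiny state machine; the transition table is an illustrative assumption:

```python
# Option 711 button sketch: Start (timer off) -> Stop (timer running)
# -> Save (timer stopped). Labels are assumptions, not claim language.
TRANSITIONS = {"Start": "Stop", "Stop": "Save"}

class TimeEntryButton:
    def __init__(self):
        self.label = "Start"

    def press(self):
        """Advance the button to its next state; Save is terminal here."""
        self.label = TRANSITIONS.get(self.label, self.label)
        return self.label
```

In a full implementation, entering the "Stop" state would start the timer that fills the time option (713), and reaching "Save" would persist the entry.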
Fig. 8 shows a screen for entering an item identifier.
Display 810 shows an exemplary screen for entering a value in response to selecting an option within a component screen on a role-based UI (see, e.g., fig. 6-7). As shown, screen 810 includes options 811-815 for entering values for the items. Option 811 allows the user to save/cancel values. Option 812 is for displaying the current value of the item. Option 813 is used to display the current value of the task for the project. Option 814 is for receiving a value for the selected option. As shown, the user may select a company name and an item. Option 815 is for receiving an entry to move to another component screen associated with the role based UI and/or to change a setting associated with the time component and/or the role based UI.
The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
Claims (10)
1. A method for displaying a role based User Interface (UI) on a restricted display device, comprising:
determining the role of a user;
determining a grouping of components comprising different functions based on the roles of the users, wherein similar and commonly used functional components are grouped based on the roles of the users and the functions of at least one of the functional components are configured based on the roles of the users;
displaying the grouped functional components within the role-based UI on a single screen on the limited display device to enable selection of each grouped functional component from the single screen;
receiving input from the role-based UI selecting one of the functional components within the grouped functional components; and
updating the role-based UI to display a component screen related to the selected functional component such that the role-based UI facilitates interaction with the configured functionality of the selected functional component, wherein the selected functional component includes one or more selectable options including a settings option for selecting one or more default fields for display when the functional component is selected, and the one or more selectable options change based on the user's role.
2. The method of claim 1, further comprising displaying a selectable option for each functional component with the display of the component screen, the selectable option updating the display of the component screen when selected to display functionality related to the functional component associated with the selected option.
3. The method of claim 1, wherein the grouped functional components include a cost component, and a time component, and a notification component that provides notifications related to a project for which the user is a team member, and a reporting component.
4. The method of claim 3, further comprising displaying an expense screen in response to receiving a selection of the expense component, the expense screen including options for setting a date of the expense, an item identifier, a category of the expense, an amount of the expense, a remark for the expense, and a photograph of the expense.
5. The method of claim 3, further comprising displaying a time entry screen in response to receiving a selection of a time entry component, the time entry screen including an option to determine a time duration for a time entry, an option to set a date for the time entry, a remark for the time entry, and an option to enter an item identifier.
6. The method of claim 1, further comprising displaying a collaboration screen in response to receiving a selection of a collaboration component, the collaboration screen including an option to indicate information to share and an option to configure options associated with the information to share.
7. A method for displaying a role based User Interface (UI) on a restricted display device, comprising:
determining a grouping of components comprising different functions based on the user's role in the project, wherein similar and commonly used functional components are grouped based on the user's role, and configuring the function of at least one of the functional components based on the user's role;
displaying the grouped functional components within the role-based UI on a single screen on the limited display device to enable selection of each grouped functional component from the single screen;
receiving input from the role-based UI selecting one of the functional components within the grouped functional components;
updating the role-based UI to display a component screen related to the selected functional component such that the role-based UI facilitates interaction with the configured functionality of the selected functional component, wherein the selected functional component comprises one or more selectable options including a settings option for selecting one or more default fields for display when the functional component is selected, and the one or more selectable options change based on the role of the user; and
updating a cloud-based service with information obtained from interaction with the role-based UI.
8. A system for displaying a role based User Interface (UI) on a limited display device, comprising:
a display;
a touch surface configured to receive touch input;
a processor and a computer readable medium;
an operating environment stored on the computer-readable medium and executing on the processor; and
a UI manager operating under control of the operating environment, the UI manager to:
displaying, on a single screen, a grouping of components including different functions based on the user's role in an item, wherein similar and commonly used functional components are grouped based on the user's role, and configuring a function of at least one of the functional components based on the user's role;
receiving input from the role-based UI selecting one of the functional components within the grouped functional components;
updating the role-based UI to display a component screen related to the selected functional component such that the role-based UI facilitates interaction with the configured functionality of the selected functional component, wherein the selected functional component comprises one or more selectable options including a settings option for selecting one or more default fields for display when the functional component is selected, and the one or more selectable options change based on the role of the user; and
updating a cloud-based service with information obtained from interaction with the role-based UI.
9. The system of claim 8, wherein the grouped functional components include a fee component, a time component, a notification component that provides notifications related to a project for which the user is a team member, and a reporting component.
10. The system of claim 8, further comprising displaying a time entry screen in response to receiving a selection of the time component, displaying an expense entry screen in response to receiving a selection of the expense component, and displaying a collaboration screen in response to receiving a selection of a collaboration component.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/231,621 | 2011-09-13 | ||
| US13/231,621 US20130067365A1 (en) | 2011-09-13 | 2011-09-13 | Role based user interface for limited display devices |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1178637A1 HK1178637A1 (en) | 2013-09-13 |
| HK1178637B true HK1178637B (en) | 2017-07-28 |