ABSTRACT
User Interfaces (UIs) are mostly produced through a mental forward-engineering process that consists of step-by-step transforming abstract descriptions into more concrete ones. Each transformation makes trade-offs between the context of use (<user, platform, environment>) and the usability properties that have been elicited as key. In ubiquitous computing, neither the context of use nor the user's objectives can be fixed at design time: they may emerge opportunistically with the arrival or departure of interaction resources and/or user needs. As a result, there is a need for dynamically composing interactive systems. We explore multi-agent planning to tackle the combinatorial issue of sharing interaction resources among interactive systems and UI elements. The gateway between HCI and planning is built using Model Driven Engineering (MDE). Experience shows that MDE is powerful for chaining domains together as well as for better understanding and improving domain languages.
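To make the combinatorial issue concrete, the following is a minimal, illustrative Python sketch of assigning UI elements to the interaction resources available at run time, under simple capacity constraints. The element and resource names, the capacity model, and the brute-force search are all assumptions for illustration; they are not the multi-agent planner described in the paper, which would prune this search space rather than enumerate it.

```python
from itertools import product

def compose(elements, resources, capacity):
    """Return the first assignment element -> resource that respects each
    resource's capacity, or None if no valid composition exists.

    Exhaustive enumeration: len(resources) ** len(elements) candidates,
    which is the combinatorial blow-up that planning aims to tame.
    """
    for assignment in product(resources, repeat=len(elements)):
        load = {r: 0 for r in resources}          # elements placed per resource
        for r in assignment:
            load[r] += 1
        if all(load[r] <= capacity[r] for r in resources):
            return dict(zip(elements, assignment))
    return None  # no valid composition for this context of use

# Hypothetical context of use: three UI elements, two interaction resources.
plan = compose(["map", "toolbar", "chat"],
               ["phone", "wall_display"],
               {"phone": 1, "wall_display": 2})
```

When a resource arrives or departs, the resource list and capacities change and the composition must be re-planned, which is why the paper argues for doing this dynamically rather than at design time.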
INDEX TERMS
- Composing interactive systems by planning