Article

Concept to Reality: An Integrated Approach to Testing Software User Interfaces

1 School of Information Systems, Queensland University of Technology, Brisbane 4000, Australia
2 Torrens University, Melbourne 3000, Australia
3 Department of Computer Science and Engineering, Bangladesh University of Business and Technology, Dhaka 1216, Bangladesh
4 Department of Software Engineering, Daffodil International University, Dhaka 1342, Bangladesh
5 School of Computer Science, Queensland University of Technology, Brisbane 4000, Australia
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(21), 11997; https://doi.org/10.3390/app132111997
Submission received: 28 September 2023 / Revised: 17 October 2023 / Accepted: 25 October 2023 / Published: 3 November 2023

Abstract

This paper delves into the complex task of evaluating a website's user interface (UI) and user experience (UX), a process complicated by gaps in research. To bridge this, we introduce an innovative human–computer interaction (HCI) framework that synergizes expert cognitive walkthroughs and user surveys for a comprehensive view. We transformed user responses into three key domains (control, engagement, and goal). Our work also generalized four context-level data metrics for a robust evaluation. The numerical evidence, such as a C1 score of 4.1 surpassing the expert usability benchmark, indicated our framework's effectiveness. Our research not only addresses an essential gap by integrating assessments from both users and experts, but also offers actionable insights for UI/UX design. The findings extend beyond this specific context, providing a potent evaluation tool for usability across various websites. Lastly, the research underscores the importance of prioritizing users' needs and expert recommendations in design principles, significantly contributing to the broader domain of website usability and user experience.

1. Introduction

The terms UI and UX refer to user interface and user experience, respectively. The user interface (UI) comprises the visual and interactive components of a software program or system that allow users to engage with it. This includes the layout, buttons, menus, icons, and general design, with the goal of creating a user-friendly and aesthetically pleasing interface. UX, on the other hand, is concerned with users' overall enjoyment and experience when they engage with the product. It considers elements such as usability, effectiveness, learnability, and emotional response. To determine how effectively the UI design supports users' objectives and improves their overall experience, UX assessment [1] studies users' cognitive processes, perceptions, and actions. Analyzing usability and user experience is essential because doing so allows us to detect and fix any issues or faults. A usability evaluation ensures that the system meets the users' goals or needs; both utility and design are evaluated. Two related UI and UX evaluation objectives are identifying usability issues and providing solutions. A cognitive walkthrough can assess both the user experience (UX) [2] and the user interface (UI). A cognitive walkthrough is a technique for evaluating a system's learnability from the perspective of a brand-new user. Unlike user testing, it excludes users; it relies instead on the expertise of a group of reviewers who thoroughly work through a task and assess the user interface from the perspective of a new user. It is a methodical approach to predicting what people would think and do while using an interface for the first time. The cognitive approach to usability evaluation helps us understand real-world use and ensures that the requirements are met. It compares various design possibilities and assesses the design against standards or goals. It provides suggestions for improving learnability and is both economical and efficient.
The emphasis is on how well the design helps users learn tasks and identify problems using psychological concepts. It necessitates some familiarity with cognitive psychology. Both cognitive and user studies were applied in our usability and user experience assessment. Because users are the primary focus of the design process for websites and applications [3], it is crucial to consider their suggestions and criticisms.
The cognitive walkthrough is currently applied in a variety of case studies, such as bike ride-sharing, cyberlearning, open community online portals, highly automated driving training applications, computers in human behavior, and so on. At present, cognitive studies rely only on the opinions of experts [4,5], and studies that combine the cognitive method with a user study are rare. We combined user research with the cognitive method in our new framework for evaluating a website's usability and user experience, placing the greatest emphasis on users.
In this approach, the cognitive method was used to simplify the process of analyzing user interfaces while remaining prompt and inexpensive. Creating a sample website to test case-study software can have significant advantages. By creating a simulated environment that mimics the actual website, developers and researchers can gather valuable feedback from users without the need for extensive testing or costly investments. This approach enables potential issues and improvements to be identified before the actual website is released, lowering the risk of negative user experiences and increasing the likelihood of successful implementation. A sample website can also serve as a platform for users to provide feedback and suggestions, allowing developers to tailor the website to the specific needs and preferences of the target audience. In this regard, we initially developed a web-based “Student Hostel” website to execute a case study. The usability and user experience of the “Student Hostel” were then evaluated using both user research and the cognitive technique.
The key contributions of this paper are listed below:
  • Development of a hybrid HCI framework: We propose a new hybrid HCI framework that integrates a cognitive method to evaluate website usability and user experience.
  • Incorporation of user participation in cognitive evaluation: We incorporated user participation in the cognitive evaluation process, making it more effective and accurate, and contributing to the HCI field by providing a more user-centric approach to website evaluation.
  • Evaluation of a website: We evaluated a website titled “Student Hostel” to assess the user interface and user experience using a questionnaire with three domains and ten questions.
  • Comparison of end-users and expert evaluation: We compared the outcomes of the survey of the end-users and the cognitive evaluation by the experts. This comparison provides valuable insights into the differences and similarities between end-users and experts’ opinions on user interfaces.
To evaluate our developed web platform’s usability and user experience, we created a Student Hostel Management application and used a hybrid human–computer interaction (HCI) framework. Through our research, we aimed to identify the gaps that exist in the evaluation of software in terms of usability and user experience. To achieve this, we combined the cognitive walkthrough method of the HCI framework with a study that analyzed expert- and user-level metrics. Our research focused on the comprehensive evaluation of our platform’s usability and user experience, providing insights that can help us improve the application’s design and functionality.
Although the majority of our study was devoted to evaluating the usability and user experience of Student Hostel Management software, it is important to acknowledge the wider applicability of our results to the field of human–machine interaction (HMI) in Industry 4.0 and the Industrial IoT [6,7]. Beyond the confines of software, good user interface design and satisfying user experiences are universal. The widespread use of web-based interfaces and the deployment of open-source IoT platforms also emphasize the relevance of our study in a broader technical environment. Even though it was focused on one type of technology environment, our work offers insightful contributions that are applicable across a wide range of technological circumstances [8].
The paper is structured as follows: Section 2 details earlier research on various systems. The research approach for our proposed framework is presented in Section 3. We discuss the website’s design in Section 4, which we evaluated for usability and user experience. Our suggested system’s installation, evaluation, and experimental results are discussed in Section 5. The key findings of this study, while mentioning the limitations of this study and future research directions, are explained in Section 6. Finally, the conclusion of this study is depicted in Section 7.

2. Literature Review

Assessing a website in terms of the HCI perspective is a crucial indicator for a user-oriented agile software-development procedure. Several significant recent works have made contributions in this regard. Still, some research gaps need to be addressed to ensure that the correlation between humans and software is aligned.
Juergen Sauer et al. proposed a new method of evaluating the three factors of accessibility, usability, and user experience [9]. They proposed “Interaction Experience” (IX) as a brand-new, more advanced concept combining usability with user experience. However, IX cannot fully quantify the relative importance of usability, user experience, and accessibility. Another study on user-intuitive web interface sign design and evaluation [10] followed a structured HCI framework, the SIDE framework. This study focused on three processes: generative, interpretive, and evaluative. The semantic information was evaluated using a semiotic approach, in which the meaningfulness of icons, logos, and design artifacts was addressed in terms of the design paradigm. Using this empirical study, the authors identified several possibilities for studying and redesigning the sensible indicators used in mobile user interfaces.
In terms of services, web portals play a vital role in ensuring that their intended users are satisfied with the services they provide. To assess usability and user experience (UX), Shabnam Kazemi et al. [11] performed a survey on trendy web portals, using a mapping method to quantify the research outcomes. However, the research primarily focused on one dimension (i.e., open community web portals). On the other hand, another research work focused only on the mobile app interface usability paradigm [12]. That paper was primarily concerned with how security perceptions influence the usability and user experience of its intended users. To address and overcome these limitations, Muhammad Nazrul Islam et al. [13] evaluated the usability and UX of both web portals and mobile applications. To do so, they chose a systemic HCI evaluation framework called the semiotic evaluation process and investigated the intuitiveness of the interface signs of web and mobile applications. To support their primary goals, they assessed both platforms by evaluating the severity rating of the usability problems found on them.
To improve the quality of software, it is essential to collect online reviews [14] and take the necessary actions accordingly. The challenging part of evaluating online reviews is classifying them as fake or real. If the reviews are categorized into different aspects of software design parameters, the outcomes reveal issues regarding the user interface and user experience. When the proportion of fake online reviews increases, it becomes difficult for the software development team to address the issues in the next development phase. As an alternative approach, A. Baki Kocaballi et al. proposed a conversational review-collecting [15] method to reduce fake and biased user comments. The questionnaires selected for the interviewing process were divided into three domains to assess the hedonic, aesthetic, and pragmatic dimensions of UX.
Hakam W. Alomari et al. suggested a blended software engineering human–computer interaction framework for the UI and UX evaluation of cyberlearning software [16]. The collected user data were assessed using a cognitive walkthrough [17] and compared with a heuristic evaluation for fine-tuning purposes. Twenty-five questions from three domains were utilized to conduct the survey: six questions assessed the impact of the software's cyberlearning tools on its users, five evaluated the impact of using the collaborative learning components of the developed software, and fourteen assessed the user reaction to the web pages. What their research lacked was expert feedback and the correlation of their users' data with the experts' opinions. To address and improve these issues, in our research, we correlated both parties (i.e., users and experts), and we adapted ten questions from three domains for the purpose of the survey.
Recently, several authors have tried to improve the usability of software platforms using a newly evolved cognitive walkthrough [18] and assessed the actions needed to fix those issues. Scenarios, individual or group tasks and sub-tasks, and actions are all part of the newly evolved cognitive task model. The scenario overviews the necessary activities to achieve a specific outcome. The task analysis process begins with gathering data about the software to be assessed and progresses to identifying instances of collaborative communication [19]. Both scenarios and tasks can be defined, and teamwork can be evaluated by analyzing the collaboration dynamics [20].
Some researchers have contributed to the usability evaluation method design to extend the research dimension while performing their research on empirical and inspection methods [21,22]. While performing experimental data analysis, the collected data are analyzed based on quantitative and qualitative contexts. Yet, these research works failed to allow an in-depth understanding of the software products being evaluated and to set a benchmark for comparison with other platforms. In addition, Hafiz Abid Mahmood Malik et al. emphasized evaluating the user interface based on two intuitive methods: user testing and heuristic evaluation [23]. The heuristic evaluation was performed to ensure the safety and effectiveness of the software, while the user testing focused on evaluating the satisfaction of novice users and the efficiency levels of the UI. Based on the heuristic reports from the experts, several recommendations were provided to improve the usability issues faced by its users.
Daniela de Moura Pavão Farias et al. [24] proposed a new framework to categorize Hostel Management software into five classes based on users’ feedback, which was analyzed further to recommend the improvements required to solve the deficiencies. This research was conducted in a prescribed domain with 29 questions divided into ten different dimensions. Another research work focused on customer satisfaction attributes [25] to understand traditional customer perceptions by surveying users’ data [26] of a Hostel Management software.
A website’s overall look and feel can significantly impact the users’ experience and determine whether they stay or leave. A well-designed website with appealing aesthetics [27] can convey a sense of professionalism and trustworthiness, boosting the site’s credibility. Aesthetics can also influence the user’s emotions, aiding in the creation of a positive first impression and encouraging them to interact with the site’s content [28]. Furthermore, a visually appealing website with a clear layout and simple navigation can help reduce user frustration and improve the overall user experience [29], resulting in higher engagement, longer visits, and a higher likelihood of returning to the site.
While most of the research has focused solely on design perceptions based on human opinions, a few researchers have contributed to this field with artificial intelligence to assist developers in designing user-oriented web applications. This automates the concept design for web applications in terms of the architectural design paradigm and shortens the time span compared with a rigorous user study. Machine learning models [30] and neural networks can be useful in creating user-friendly websites. Web designers can use these technologies to create personalized user experiences [31] tailored to individual user preferences, behavior [32], and demographics. Machine learning models can analyze user data such as click patterns, search queries, and page views to provide insights into how users interact with the website. These data can be used to enhance the user interface, increase user engagement, and boost conversion rates, resulting in websites that are visually appealing, highly functional, and user-friendly. However, implementing these cutting-edge technologies can be costly and requires rigorous data-centric evaluation [33] to assess the usability and UX of a software platform. It is important to remember that the efficacy of machine learning models in UI and UX development depends on various elements, including the project's unique goals, the availability and caliber of data, and the experience of the designers and developers involved. Ultimately, a good UI/UX design frequently necessitates a combination of deliberate human design and the intelligent application of machine learning approaches.
The user experience (UX) evaluation is essential for determining how usable and successful software systems are. To assess and enhance UX in various software applications, human–computer interaction (HCI) frameworks such as cognitive walkthroughs and heuristic evaluations have been frequently used [34]. The cognitive walkthrough technique aims to comprehend how people engage with software interfaces in light of their cognitive processes. It entails replicating user tasks and methodically examining the software's user interface [35,36], feedback features, and task completion efficiency. The cognitive walkthrough provides insightful information for improving the user experience by detecting potential cognitive barriers and usability problems. Heuristic evaluation, on the other hand, applies a predetermined set of heuristics or usability principles to find usability issues in software interfaces. These heuristics are created by HCI specialists and act as standards for assessing the software's user interface, navigation, feedback, and general user interaction. By methodically assessing the product against these principles, evaluation experts provide practical suggestions to enhance UX. Despite being useful, these HCI frameworks have certain drawbacks as well. The findings of cognitive walkthroughs and heuristic assessments have therefore been supplemented by complementary methodologies, such as user testing and user feedback analysis, to address these shortcomings. User testing entails watching actual users while they use the product and gathering real-time feedback to gain insightful information about their experiences. Surveys and interviews are useful tools for gathering subjective comments and ideas for enhancing the software's user experience [37].

3. Research Methodology of Our Framework

A system of methods used in a particular area of study or activity is known as a methodology. In other words, a methodology is a planned process for achieving a specific objective. In Figure 1, we outline the research procedure for our suggested model.

3.1. Preparation of Questionnaire

The use of a questionnaire survey is based on the idea that asking users about usability issues in a system is the best way to learn about them. Its main advantages are that it is inexpensive and relatively simple to use. Query approaches are desirable for evaluating web-based application software (i.e., websites) because we want effective and affordable methods to evaluate the website developed for this research.
The creation of a questionnaire is the initial stage in our approach. A total of 10 questions from 3 different domains were prepared. Overall, the selection of 10 questions seemed reasonable to us, given that they covered all three domains, avoided redundancy, and were sufficient to provide data-centric information in the analysis. In addition, we considered the specific context of the survey and ensured that the number of questions aligned with the research objectives and constraints of this study. The inquiries were based on the website we created for completing the usability and user experience evaluations, named “Student Hostel”. The control domain, the engagement domain, and the goal domain were the three domains. The control domain had four questions, the engagement domain four, and the goal domain two.
The control domain incorporated the questions regarding the controlling functions of the application software (i.e., our developed website). The engagement domain, in turn, covered how the users interacted with the website platform's contents, features, and available functions. Finally, the questions of the goal domain were selected based on what services the website intended to offer to its users. Table 1 presents the survey questions, along with the domain and key topics related to each question.

3.2. Subject Preparation and Performance of the Survey

We chose a sample of 250 participants for our study from among university students living in dorms. Numerous factors led us to choose university students who reside in dorms as our target demographic. The demographic data of the survey participants are presented in Table 2. First, as a sizeable portion of university students must have access to student housing, this group is important and representative when evaluating the quality of the website's user interface and user experience. Second, by focusing on this group in particular, we hoped to understand the special difficulties, preferences, and requirements that students residing in hostels would encounter while engaging with a website. Their experience might offer helpful recommendations for enhancing the website's functionality and appearance. We used Google Forms to carry out our survey. Participants were expected to respond to every question on the questionnaire. Additionally, we included an open-ended feedback section in Google Forms. We gathered their opinions and, therefore, received both optimistic and pessimistic feedback.

3.3. Context Mapping

We performed the user survey, and the survey questions were categorized into four contexts. The contexts were linked with the three domains of the questionnaire section so that the survey questions could be analyzed in a data-driven approach regarding the newly evolved cognitive walkthrough [18].
Table 3 displays the cognitive approach's mapping to the questionnaire's domains. We linked Context 1 to the goal domain because Questions Q9 and Q10 of the goal domain reflected what the end users intended to obtain from the website platform. Questions Q1–Q4 of the control domain were linked to Context 2, as these survey questions gathered information regarding the controlling features of the website. To assess how the controlling features interacted with the platform's contents and functions, the related survey questions (Q1–Q8) were linked with Context 3. Finally, Context 4 incorporated all of the question domains, evaluating users' feedback in terms of both usability and user experience.
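The context-to-domain mapping just described can be expressed as a simple lookup structure. The sketch below is a minimal illustration in Python; the names `CONTEXT_QUESTIONS` and `questions_for_context` are ours, not from the study, and the groupings follow the text's description of Table 3.

```python
# Sketch of the context-to-domain mapping described above (cf. Table 3).
# The dictionary and helper names are illustrative, not from the paper.
CONTEXT_QUESTIONS = {
    "C1": ["Q9", "Q10"],                    # goal domain
    "C2": ["Q1", "Q2", "Q3", "Q4"],         # control domain
    "C3": [f"Q{i}" for i in range(1, 9)],   # control + engagement (Q1-Q8)
    "C4": [f"Q{i}" for i in range(1, 11)],  # all three domains (Q1-Q10)
}

def questions_for_context(context: str) -> list[str]:
    """Return the survey questions mapped to a cognitive context."""
    return CONTEXT_QUESTIONS[context]

print(questions_for_context("C1"))  # ['Q9', 'Q10']
```

Keeping the mapping in one place like this makes the later per-context aggregation a straightforward loop over question identifiers.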

3.4. User Experience Evaluation

User experience (UX) is widely used, but the multidisciplinary nature of UX has led to multiple definitions and perspectives. Although there are well-defined standards of UX, there are often no agreed-upon definitions of UX or the ways in which UX is evaluated or measured. The term user experience is often interchangeable with usability, interaction design, or just general customer experience. The challenge is that “users” is a very broad term, and there are many types of users who will all have different perspectives on their user experience with a website, for example. During the design phase of a new website, designers will narrow down their end-users by understanding their target audience. The website will use certain graphics, layouts, colors, typographies, and interfaces that meet the needs of this target audience. If the website is selling a physical product, then the design of the website branding often aligns with the product design, the packaging, and the after-sales service. It is a holistic management of brand and user/customer experience to ensure customers are receiving a consistent experience that meets their particular needs. A number of companies map this whole user experience journey and, in relation to a website, understand the qualities of both the users and the system with which they are interacting (see Figure 2).

3.4.1. User

It is crucial to consider the user's context. Even though the context of usage should be as important as other perspectives, we excluded it from the conceptual framework because we believe it is abstract and difficult to quantify in practice. Our premise is that the user and the system are the physical assets upon which UX assessment is performed. The user's internal and external states, and the six traits he/she attributes to the system, are revealed by how he/she reacts mentally and physically: psychological reactions and perceptions reflect the user's internal qualities, while physical responses indicate the user's current bodily state. These states are gathered from the user's point of view through psychophysiology.

3.4.2. System

A crucial component of website usage is user experience (UX), which may be measured in terms of branding, appearance, functionality, performance, interaction, and help. The user’s view of the website’s brand, including its reputation and perceived value, is used to assess brand image. The presentation features the website’s visual design, including color scheme, typography, and layout. The term “functionality” relates to a website’s features and capabilities and whether they satisfy the user’s demands. The website’s speed, dependability, and responsiveness are considered while rating performance. The term “interactivity” describes the user-friendliness and intuitiveness of the website’s user interface, including its feedback and navigational features. Lastly, help evaluates the quantity and caliber of support and direction given to the customer. By considering these factors, a cognitive walkthrough can provide valuable insights into the user’s experience with the website and identify areas for improvement.

3.4.3. Psychophysiology

Psychophysiology is a field that studies the physiological and psychological aspects of human behavior, emotions, and cognitive activity. In terms of user experience, psychophysiology can provide insights into how users respond to websites at a subconscious level. By understanding the user’s psychophysiological response, website designers and developers can identify specific aspects of the website that elicit positive or negative emotions and cognitive activity and adjust the website accordingly to optimize the user experience.

3.4.4. Quality Requirements

Regarding website quality requirements, it is essential to evaluate the system’s many characteristics. Quality requirements can be divided into four categories: process, in-use, external, and internal. Process characteristics focus on how the website is developed and maintained, such as scalability, maintainability, and testability. In-use characteristics are concerned with how the website performs in real-world use, such as usability, reliability, and security. External characteristics are those that are visible to the end-user, such as performance, compatibility, and portability. Internal characteristics are those that are hidden from the end-user, such as modularity, cohesion, and coupling. These characteristics are evaluated using phrases such as “effectiveness”, “efficiency”, “satisfaction”, and “freedom from risk” to describe the level of quality required. By defining quality requirements in terms of these characteristics, website designers and developers can ensure that the website meets the needs of its users and performs optimally in its intended environment.
To assess our website’s user experience, we collected user feedback through a survey form. After using our platform, the participants provided mixed reactions, which we analyzed in two dimensions: positive and negative feedback. By categorizing the feedback into these two dimensions, we could identify the features of the website that were working well and those that needed improvement. We mapped the existing and missing features with website design principles to further improve the website. By doing so, we ensured that the website design adhered to best practices in website design, such as usability, accessibility, and user-centered design. This approach helped us improve our website’s user experience and ensure that our users had a positive experience while using our platform.

3.5. Weighted Average of Survey Questions Based on Users’ Data

We calculated the weighted average of the survey questions using the user data. A score of 1 indicates a high level of dissatisfaction, while a score of 5 indicates a high level of satisfaction for each question. For every question, we added up the user-provided scores. The weighted average for that question was then calculated by dividing the sum of its scores by the total number of participants. The weighted average thus has a value between 1 and 5.
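The per-question calculation described above can be sketched as follows. This is a minimal illustration; the function name and the sample scores are ours, not the study's survey results.

```python
def question_average(scores: list[int]) -> float:
    """Weighted average for one survey question: the sum of the 1-5
    Likert scores divided by the total number of participants."""
    if not scores:
        raise ValueError("no responses for this question")
    if any(s < 1 or s > 5 for s in scores):
        raise ValueError("each score must lie between 1 and 5")
    return sum(scores) / len(scores)

# Illustrative data only (not the study's survey results):
print(round(question_average([5, 4, 4, 3, 5]), 2))  # 4.2
```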

3.6. Expert Evaluation

Then, we conducted a professional review by three industry experts. Based on their answers to the four cognitive questions [17], the three experts reviewed our website and provided remarks on all the web pages. The experts offered advice for the pages that did not satisfy a particular cognitive criterion [18]. Finally, they provided an overall rating based on their perspective on the cognitive issues.
The experts were asked to respond to 4 questions in terms of the cognitive walkthrough [38,39]:
  • Is the effect of the current action the same as the user’s goal? Our user interface should make it easier for people to determine what to do. For instance, the initial step is to locate and click the flow-starting button. We anticipate that users will recognize the UI when they see it. The system gives the user the incorrect cue about what to do if they feel that they must drag something or take any other action besides pressing the button.
  • Is the control for the action visible? Let’s assume that the user knows they must click on something. That’s excellent, but they’ll veer off the correct course if they can’t locate this button.
  • Is there a strong link between the control and the action? Through the label-following approach, the user will go through all the options and attempt to link them to the primary objective. For instance, a user is expected to identify the label "Upload" with the button’s primary objective. If people cannot make this association while the button is there, the interface must be reviewed, and how the various actions are presented must be reconsidered.
  • Is the feedback appropriate? The system ought to inform the user of what transpired. If the user picks an action that leads them off course, the response should let them know where they are and how to return to the proper route. If the user makes the appropriate decision, the system should reinforce it.

3.7. Comparison between User and Expert Values

We determined the expert mean by averaging the three experts’ ratings. We then determined an aggregated average for each of the four contexts by mapping each context to a set of domains: the weighted averages of all the questions within the chosen domains were summed, and the sum was divided by the total number of questions for that context. Finally, we contrasted each aggregated average with the expert mean. When a context’s aggregated score was higher than the expert mean, the fluctuation level was obtained by subtracting the expert mean from the aggregated score, and the fluctuation notation was P, signifying that user satisfaction was obtained. When a context’s aggregated score was lower than the expert mean, the fluctuation level was obtained by subtracting the aggregated score from the expert mean, and the fluctuation notation was N, denoting that user satisfaction was not obtained.
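The comparison rule above can be expressed as a short sketch. This is our own illustration: the function name is hypothetical, and the '=' tie case (a score exactly equal to the expert mean) is our addition for completeness.

```python
def fluctuation(context_score, expert_mean):
    """Compare a context's aggregated score with the expert mean.
    Returns the fluctuation level (the absolute difference, rounded)
    and the notation: 'P' when the context score is above the expert
    mean (user satisfaction obtained), 'N' when it is below, and '='
    for the tie case, which we add here for completeness."""
    level = round(abs(context_score - expert_mean), 2)
    if context_score > expert_mean:
        return level, "P"
    if context_score < expert_mean:
        return level, "N"
    return level, "="

print(fluctuation(4.1, 4.0))  # (0.1, 'P')
```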

3.8. Mapping of Positive and Negative Feedback Based on Design Principles

We examined the user feedback and mapped it to design principles. We selected eight design criteria: aesthetic design, communication, compatibility, cost, performance, satisfaction, security, and usefulness. For each criterion, we highlighted the positive (P) and negative (N) feedback and identified the keywords in each comment.

3.9. Mapping of User Feedback

A fish-bone diagram maps the users’ feedback into an intelligible framework around a core idea using a non-linear graphical layout. Various behaviors, expressions, ideas, or objects are tied to and grouped around a visual representation of a central notion or topic. Such a mapping can transform a lengthy list of related information into a visually appealing, engaging, and well-organized graphic that mirrors how our brains naturally process information. Based on several design concepts, we created a fish-bone diagram of the positive and negative phrases discovered in the previous stage. For each design parameter, the diagram displays the satisfied keywords as “s” and the dissatisfied keywords as “d”.
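The data behind such a diagram can be represented very simply. The sketch below is our own illustration of the structure: each design parameter (a "bone") carries keywords tagged "s" or "d", and the keywords shown are illustrative placeholders, not the study's extracted terms.

```python
# A minimal data sketch of the fish-bone mapping: each design
# parameter (a "bone" of the diagram) carries keywords tagged "s"
# (satisfied) or "d" (dissatisfied).  The keywords here are
# illustrative placeholders, not the study's actual extracted terms.
fishbone = {
    "aesthetic design": [("pleasing layout", "s"), ("navigation bar", "d")],
    "communication": [("reachable staff", "s"), ("no live chat", "d")],
}

def keywords_by_tag(diagram, tag):
    """Collect every keyword carrying the given tag across all bones."""
    return [kw for bones in diagram.values() for kw, t in bones if t == tag]

print(keywords_by_tag(fishbone, "d"))  # ['navigation bar', 'no live chat']
```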

4. Website Design

4.1. Designing Use Case Diagram

To visualize the interactions and relationships among these actors, we developed a Unified Modeling Language (UML) diagram for the Student Hostel Management System illustrated in Figure 3. This diagram provides a clear representation of the system’s structure and functionality, aiding in understanding and implementing the proposed solution.
The Super Admin, as the highest authority in the system, holds the responsibility of overseeing and managing the entire system. He/she has privileged access to administrative functions such as user management, hostel allocation, and overall system configuration. The System User, typically an administrative staff member, utilizes the system to perform various administrative tasks, including managing student records, generating reports, handling complaints, and coordinating with the Hostel Wardens. The Allottees, the students residing in the hostels, interact with the system to avail themselves of various services: they can submit applications for hostel accommodation, view their room allotment details, pay fees, and register complaints or requests through the system. The Hostel Wardens, designated individuals responsible for managing specific hostels, rely on the system for efficient hostel administration: they can view and update student records, manage room allocations, monitor hostel facilities and maintenance, and communicate hostel-related matters with the Allottees and System Users.

4.2. System Design

We first created the user interface for the “Student Hostel” Website in order to evaluate it. The website was built using Bootstrap, MySQL (5.0.37), HTML, and CSS, with Amazon EC2 cloud support as the back-end hosting service. Figure 4 displays the five web pages we created for this website.
These pages include the login page, the sign-up page, the home page, the room booking page, and the contact us page. The username and password text fields are present on the login page, and numerous text fields appear on the sign-up page. The home page includes images of the dorm rooms and details about the rooms and the hostel’s facilities; users can readily see the room booking button on the home page and can access the room booking page by clicking it. On the room booking page, which contains a few text boxes, students must choose the rooms they want, fill out the form with their check-in information and stay details, and then submit it. The contact us page contains several text boxes through which users can ask questions and request that the management staff cancel their booking. We tried to adhere to the user interface design guidelines when creating the web pages and to keep the look and feel of the interface consistent across all of them.

4.3. HTML and CSS

HTML is the standard markup language used to create documents that are meant to be viewed in a web browser. We used HTML to build the framework of our website, designing its forms, buttons, and overall layout. CSS, short for Cascading Style Sheets, controls how HTML elements are displayed across various platforms. Since HTML and CSS are widely supported, reliable, and accessible technologies, their use in developing our case study website is quite pertinent. Because they are well documented and serve as the cornerstone of contemporary web-development techniques, they are simple to learn and easy to maintain through future changes and improvements. Furthermore, HTML and CSS follow web standards, guaranteeing cross-browser compatibility and compliance with accessibility requirements, which is essential for delivering an inclusive user experience. By utilizing the capabilities of HTML and CSS, we were able to build a visually appealing, well-structured, and user-friendly sample website with which to assess the user interface and user experience of the Student Hostel administration system. These technologies greatly influenced the presentation and design of the website, resulting in a smooth and engaging user experience.

4.4. Bootstrap and MySQL

The popular front-end framework Bootstrap offers a selection of pre-designed JavaScript and CSS components, and it is important because it makes responsive, rapid web development possible. By employing Bootstrap’s ready-to-use components, such as navigation bars, buttons, forms, and grids, we can speed up the development process and guarantee uniformity in the look and feel of the website. Our showcase website can adjust to various screen sizes and devices with ease thanks to Bootstrap’s adaptable design, giving users of computers, tablets, and mobile devices the best possible viewing experience. Bootstrap’s rich documentation and community support also make it simpler to understand, deploy, and maintain.
Websites can easily adjust to different screen sizes and devices thanks to the responsive design capabilities of Bootstrap, including its flexible grid structure and adaptable components. The grid system provides a strong basis for content organization and enables automatic re-flowing for the best viewing experience. With its responsive navigation bars, drop-down menus, graphics, and typography, Bootstrap streamlines the development process by eliminating the need for substantial CSS adjustments and specific media queries. These responsive features enable developers to build user-friendly, accessible websites that provide a consistent user experience on all devices.
MySQL, a popular open-source relational database management system, contributes significantly to our development process by effectively managing data storage and retrieval. We can safely store and manage data for managing Student Hostels, such as student information, room assignments, billing data, and administration records, by utilizing MySQL’s capabilities. The relational design of MySQL enables us to create meaningful links between various data entities, facilitating efficient querying and data processing. This guarantees the efficient operation of our example website, giving users accurate and current information while simplifying simple data administration for administrators.
In conclusion, combining MySQL and Bootstrap technologies in developing our demonstration website has several benefits, including a more streamlined front-end interface, responsive user experience, effective data management, and scalability. These technological advancements are essential for maintaining our demonstration website’s applicability, functionality, and usability for testing the user interface and user experience of the Student Hostel administration system.

5. Implementation and Experimental Results

We list the weighted average values of the ten survey questions in Table 4. From the survey results, the total scores of Questions Q1, Q2, Q3, and Q4 in the control domain were 1035, 975, 1010, and 985, respectively; the totals for Questions Q5, Q6, Q7, and Q8 in the engagement domain were 1000, 1005, 855, and 990; and Questions Q9 and Q10 in the goal domain totaled 1040 and 1010, respectively. In each case, we divided the total by 250, the number of participants, to obtain the question’s weighted average. The equation for the weighted average is represented as (1), where $a_q$ denotes the individual scores for each question and $n$ represents the total number of participants in the survey. The weighted average scores for the ten questions are illustrated in Figure 5 through the mapping diagram of user feedback from the survey results.
$W_{Avg.} = \frac{\sum_{q=1}^{10} a_q}{n}$    (1)
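Equation (1) can be checked directly against the question totals reported above. This is a quick reproduction sketch of our own, not code from the study; the totals and participant count are taken from the text.

```python
# Weighted averages reproduced from the reported per-question totals,
# following Equation (1): total score / 250 participants.
totals = {"Q1": 1035, "Q2": 975, "Q3": 1010, "Q4": 985,   # control
          "Q5": 1000, "Q6": 1005, "Q7": 855, "Q8": 990,   # engagement
          "Q9": 1040, "Q10": 1010}                        # goal
n = 250
weighted = {q: total / n for q, total in totals.items()}
print(weighted["Q1"], weighted["Q7"], weighted["Q9"])  # 4.14 3.42 4.16
```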
Table 5 presents the cognitive analysis of the “Student Hostel Website”, the expert remarks, and the final scores. Based on the cognitive questions, this table includes the opinions of the three experts on each of the five web pages depicted in Figure 4. For pages that satisfy a specific cognitive question, the expert’s feedback is shown as “Yes”; for pages that do not, the expert’s advice is given. Additionally, the overall ratings of the three experts are shown in this table: Experts 1, 2, and 3 gave overall ratings of 3.5, 4.0, and 4.5, respectively.
Table 6 shows the mapping between the aggregated context-based user scores and the cognitive-evaluation-based expert ratings. The expert mean, which can be seen in Table 4, is the average overall expert rating. The aggregated score of each of the four contexts is contrasted with this expert mean. The equation for an individual context’s aggregated score is expressed as (2), where $W_i$ denotes the weighted average score of each question and $T_{QD}$ represents the total number of questions required for a particular context in terms of the context mapping.
$C_j = \frac{\sum_{i=1}^{m} W_i}{T_{QD}}$    (2)
When a context’s aggregated score exceeded the expert mean, the fluctuation level was calculated by subtracting the expert mean from the aggregated score, and the fluctuation notation was P, indicating that user satisfaction was attained. When a context’s aggregated score was lower than the expert mean, the fluctuation level was derived by subtracting the aggregated score from the expert mean, and the fluctuation notation was N, denoting that user satisfaction was not achieved. The fluctuation level of each context’s aggregated score relative to the expert mean score is illustrated in Figure 6.
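The expert mean and the goal-domain context can be reproduced from the values reported in the text. This is a verification sketch of our own; the assignment of Q9 and Q10 to Context C1 follows the goal domain described earlier.

```python
# Expert mean from the three overall expert ratings reported in the text.
expert_mean = sum([3.5, 4.0, 4.5]) / 3          # 4.0

# Equation (2) for the goal-domain context (C1): the mean of the
# weighted averages of its questions, Q9 and Q10.
goal_weighted = [1040 / 250, 1010 / 250]        # 4.16, 4.04
c1 = sum(goal_weighted) / len(goal_weighted)

# C1 exceeds the expert mean, so the fluctuation notation is P.
notation = "P" if c1 > expert_mean else "N"
print(round(c1, 2), notation)  # 4.1 P
```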
The highlighted positive and negative feedback from the users of the “Student Hostel” Website regarding its services is given in Table 7. After reviewing the gathered user feedback, we developed ten design principles: aesthetic design, communication, compatibility, cost, performance, satisfaction, security, usefulness [40], responsiveness, and technical issues. The appropriate user remarks were recorded against each design parameter, with P denoting positive comments and N denoting negative comments.
To thoroughly assess the user experience of the Student Hostel Management System, it was essential for our research to explore these ten design aspects. Each parameter, from aesthetic design and communication to compatibility, cost, performance, satisfaction, security, usefulness, responsiveness, and technical problems, reflects an important component of the system’s operation and user interaction. By mapping user input to these criteria, we determined the strengths, shortcomings, and areas for improvement across multiple dimensions. This thorough assessment strategy, in line with user-centered design principles, gave us a holistic understanding of user needs and enabled focused recommendations for improving the system’s UI and UX:
  • Aesthetic design: A design’s attractiveness is determined by its aesthetics, a fundamental design principle [41]. It impacts how much interest users have in using the product, and it depends on making a good first impression and sustaining the relationship with the user. Both favorable and unfavorable comments were obtained for the aesthetic design parameter. The design is pleasing to the eye overall, which is a plus; however, the navigation bar and the form’s layout were criticized as needing improvement.
  • Communication: We also found positive and negative comments on the communication parameter. The positive feedback concerned the users’ ability to reach the management team; the negative feedback arose because we did not provide a live chat option.
  • Compatibility: By compatibility, we mean that an application runs successfully on a specific platform version, typically the most recent one. We received negative feedback on this parameter: because the system is not a cross-platform application and cannot be used offline, we received some critical comments [42].
  • Cost: Positive feedback was provided in this area because of the reasonable hostel pricing, but negative feedback was provided due to the lack of user-specific offers [43].
  • Performance: Performance parameters, as their name implies, provide suggestions for improving the user experience of the website. They also determine how accurately and efficiently [44] the system’s functionality works. We received both positive and negative feedback regarding the system’s usability, with the latter focusing on users’ inability to filter rooms by their preferences [45].
  • Satisfaction: One of the satisfaction design criteria is the user’s level of satisfaction with the website after use. We received favorable feedback because users did not see pop-up advertising when using the website. Negative feedback was also received due to the limited room classifications and the lack of higher-quality room images.
  • Security: When it comes to security, questions like whether the application seeks personal information or has unnecessarily complex terms and conditions are of concern. We received a compliment on how convenient the login feature is. We received criticism for not having two-factor authentication [46].
  • Usefulness: Anything that helps a user achieve his/her objectives is useful. Usefulness is one of the many elements that influence and enhance a system’s usability [47]. Both favorable and unfavorable remarks were found in this area of design concepts [48,49]. We were praised for making the system easy to use. Negative comments were generated because users demanded more details regarding the hostel and rooms.
  • Responsiveness: In this area of the design concepts, there were both positive and negative comments. We received positive comments because booking requests from students were responded to quickly; negative comments stemmed from the lack of helpline feedback.
  • Technical errors: The users commented positively on the high-resolution, clear images. Additionally, the absence of 404 errors received excellent feedback from users. We heard no criticism regarding this aspect of the design.
The mapping of the software design principles to our web application is shown in Figure 7, where non-dotted (solid) lines depict the missing features and dotted lines depict the existing features.
Our web application addresses the design principles of communication, accessibility, consistency, learnability, readability, visible navigation, ease of use, user feedback, and help and FAQ. It does not address the design requirements of cross-platform support, flexibility, latency reduction, real-world metaphor, integrity, track state, live chat, digital marketing, and a user manual.
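The two lists above can be tallied to confirm the share of unaddressed design principles (a simple arithmetic check of our own, using the principle names as listed):

```python
# Design principles addressed by the web application vs. those missing,
# taken directly from the lists in the text.
addressed = ["communication", "accessibility", "consistency",
             "learnability", "readability", "visible navigation",
             "ease of use", "user feedback", "help and FAQ"]
missing = ["cross-platform", "flexibility", "latency reduction",
           "real-world metaphor", "integrity", "track state",
           "live chat", "digital marketing", "user manual"]

share_missing = len(missing) / (len(addressed) + len(missing))
print(f"{share_missing:.0%}")  # 50%
```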

6. Discussion and Limitations

6.1. Key Findings

The key findings of the hybrid user–expert study for the case of the “Student Hostel” Website are as follows:
  • The order of high to low scores in terms of user evaluation was sorted as follows: C1 > C2 > C4 > C3.
  • For the goal domain, only Context C1 (4.1) exceeded the expert mean score (4.00), and only by a small margin, because the users’ goals mostly matched the website context.
  • In the blended control and engagement domain, Context C3 could not reach the required level, owing to the limited user engagement reflected in our developed website. The same holds for Context C4.
  • In the case of Context C2, the score (4.00) equaled the mean expert score (4.00). We concluded that our expectations for the control-based domain matched the expert opinions.
  • It was evident from the mapping diagram of the software design principles to our web application that 50% of the design issues need to be resolved to meet users’ satisfaction levels.

6.2. Limitations

We involved a reasonable number of participants in the survey; a larger and more varied participant pool could provide deeper insights into the research questions. Future research may also use various platforms to improve the study’s reliability.

6.3. Future Research Prospects

The following are the suggestions for additional research:
  • Assessing other web-based software platforms using the same criteria used in this case study.
  • Selecting the metrics that are most suited for use in both user-only and expert-only evaluations.
  • Examining the use of the heuristics produced by this study as rules of thumb for developing web-based application software.

7. Conclusions

In this paper, we evaluated a website user interface (UI) and user experience (UX) using a hybrid HCI framework. By combining a user survey with a cognitive walkthrough, we developed a quantifiable metric for measuring user satisfaction. Comparing the user and expert data measures the fluctuation levels, informing enhancements to UI and UX designs. The C1 context-level score (4.1) outperformed the average expert-level score (4.0). User reviews were categorized and grouped by software design parameters to evaluate user experience levels effectively. This study created a hybrid framework for assessing software usability and user experience, enhancing the evaluation process, and it evaluated the effectiveness and value of this framework in HCI research by integrating several usability- and user-experience-related criteria.

Author Contributions

Conceptualization, S.C. and M.W.; methodology, A.S., N.J.K., S.C., L.S., S.G. and M.S.R.; software, A.S. and M.J.N.M.; validation, L.S.; formal analysis, A.S. and N.J.K.; investigation, M.W., A.B., C.F., S.T.-W. and T.J.; resources, M.J.N.M.; data curation, L.S.; writing—original draft, A.S., S.G. and M.S.R.; writing—review and editing, M.W., S.T.-W. and C.F.; visualization, N.J.K., S.G. and M.S.R.; supervision, M.W. and S.C.; project administration, S.C. and M.J.N.M.; funding acquisition, M.W., A.B., S.T.-W. and T.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research is partially supported through the Australian Research Council Discovery Project: DP190100314.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The majority of the data presented in this research article are available within the manuscript. Any additional data, if required, can be obtained by contacting the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kivijärvi, H.; Pärnänen, K. Instrumental Usability and Effective User Experience: Interwoven Drivers and Outcomes of Human-Computer Interaction. Int. J. Hum. Comput. Interact. 2023, 39, 34–51. [Google Scholar] [CrossRef]
  2. Al-Shamaileh, O.; Sutcliffe, A. Why People Choose Apps: An Evaluation of the Ecology and User Experience of Mobile Applications. Int. J. Hum. Comput. Stud. 2023, 170, 102965. [Google Scholar] [CrossRef]
  3. Fleury, S.; Chaniaud, N. Multi-User Centered Design: Acceptance, User Experience, User Research and User Testing. Theor. Issues Ergon. 2023, 1–16. [Google Scholar] [CrossRef]
  4. Kushniruk, A.W.; Patel, V.L. Cognitive and Usability Engineering Methods for the Evaluation of Clinical Information Systems. J. Biomed. Inform. 2004, 37, 56–76. [Google Scholar] [CrossRef] [PubMed]
  5. Rieman, J.; Franzke, M.; Redmiles, D. Usability Evaluation with the Cognitive Walkthrough. In Conference Companion on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 1995; pp. 387–388. [Google Scholar]
  6. Fortoul-Diaz, J.A.; Carrillo-Martinez, L.A.; Centeno-Tellez, A.; Cortes-Santacruz, F.; Olmos-Pineda, I.; Flores-Quintero, R.R. A Smart Factory Architecture Based on Industry 4.0 Technologies: Open-Source Software Implementation. IEEE Access 2023, 11, 101727–101749. [Google Scholar] [CrossRef]
  7. Folgado, F.J.; González, I.; Calderón, A.J. Data Acquisition and Monitoring System Framed in Industrial Internet of Things for PEM Hydrogen Generators. Internet Things 2023, 22, 100795. [Google Scholar] [CrossRef]
  8. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. The Future of the Human–Machine Interface (HMI) in Society 5.0. Future Internet 2023, 15, 162. [Google Scholar] [CrossRef]
  9. Sauer, J.; Sonderegger, A.; Schmutz, S. Usability, User Experience and Accessibility: Towards an Integrative Model. Ergonomics 2020, 63, 1207–1220. [Google Scholar] [CrossRef] [PubMed]
  10. Islam, M.N.; Bouwman, H. Towards User–Intuitive Web Interface Sign Design and Evaluation: A Semiotic Framework. Int. J. Hum. Comput. Stud. 2016, 86, 121–137. [Google Scholar] [CrossRef]
  11. Kazemi, S.; Liebchen, G.; Cetinkaya, D. A Survey on the Usability and User Experience of the Open Community Web Portals. In HCI International 2022—Late Breaking Papers. Design, User Experience and Interaction; Springer International Publishing: Cham, Switzerland, 2022; pp. 409–423. ISBN 9783031176142. [Google Scholar]
  12. Dezhi, G.D.; Moody, J.; Zhang, P.B. Effects of the Design of Mobile Security Notifications and Mobile App Usability on Users’ Security Perceptions and Continued Use Intention. Inf. Manag. 2020, 57, 103235. [Google Scholar]
  13. Islam, M.N.; Bouwman, H.; Islam, A.K.M.N. Evaluating Web and Mobile User Interfaces with Semiotics: An Empirical Study. IEEE Access 2020, 8, 84396–84414. [Google Scholar] [CrossRef]
  14. Weichbroth, P.; Baj-Rogowska, A. Do Online Reviews Reveal Mobile Application Usability and User Experience? The Case of WhatsApp. In Proceedings of the Federated Conference on Computer Science and Information Systems, Leipzig, Germany, 1–4 September 2019. [Google Scholar]
  15. Kocabalil, A.B.; Laranjo, L.; Coiera, E. Measuring User Experience in Conversational Interfaces: A Comparison of Six Questionnaires. In Proceedings of the 32nd International BCS Human Computer Interaction Conference, Belfast, UK, 4–6 July 2018; BCS Learning & Development: Swindon, UK, 2018. [Google Scholar]
  16. Alomari, H.W.; Ramasamy, V.; Kiper, J.D.; Potvin, G. A User Interface (UI) and User EXperience (UX) Evaluation Framework for Cyberlearning Environments in Computer Science and Software Engineering Education. Heliyon 2020, 6, e03917. [Google Scholar] [CrossRef] [PubMed]
  17. Lewis, C.; Wharton, C. Cognitive Walkthroughs. In Handbook of Human-Computer Interaction; Elsevier: Amsterdam, The Netherlands, 1997; pp. 717–732. ISBN 9780444818621. [Google Scholar]
  18. Mahatody, T.; Sagar, M.; Kolski, C. State of the Art on the Cognitive Walkthrough Method, Its Variants and Evolutions. Int. J. Hum. Comput. Interact. 2010, 26, 741–785. [Google Scholar] [CrossRef]
  19. Pereira, T.F.; Matta, A.; Mayea, C.M.; Pereira, F.; Monroy, N.; Jorge, J.; Rosa, T.; Salgado, C.E.; Lima, A.; Machado, R.J.; et al. A Web-Based Voice Interaction Framework Proposal for Enhancing Information Systems User Experience. Procedia Comput. Sci. 2022, 196, 235–244. [Google Scholar] [CrossRef]
  20. Clauirton, T.B.; Gouveia, J.; Macedo, F.Q.D.; Andre, L.M.; Santos, W.; Correia, M.; Penha, M.; Anjos, F. Toward Accessibility with Usability: Understanding the Requirements of Impaired Uses in the Mobile Context. In Proceedings of the 11th International Conference on Ubiquitous Information Management and Communication, Beppu, Japan, 5–7 January 2017; pp. 1–8. [Google Scholar]
  21. Sahua, J.; Moquillaza, A.; Paz, F. A Usability Evaluation Process Proposal for ATM Interfaces. In Design, User Experience, and Usability: Design for Contemporary Technological Environments; Springer International Publishing: Cham, Switzerland, 2021; pp. 553–562. ISBN 9783030782269. [Google Scholar]
  22. Nugraha, A.P.; Syaifullah, D.H.; Puspasari, M.A. Usability Evaluation of Main Function on Three Mobile Banking Application. In Proceedings of the International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), Bangkok, Thailand, 21–24 October 2018. [Google Scholar]
  23. Malik, H.A.M.; Muhammad, A.; Sajid, U. Analyzing Usability of Mobile Banking Applications in Pakistan. Sukkur IBA J. Comput. Math. Sci. 2021, 5, 25–35. [Google Scholar] [CrossRef]
  24. De Moura, P.; Farias, D.; Nunes Valença, M.; Sobral, M.F.F.; Ribeiro, A.R.B. Hostelquality: A Methodology for Assessing the Quality of Hostels. Tour. Hosp. Res. 2022. [Google Scholar]
  25. Dong, J.; Li, H.; Zhang, X. Classification of Customer Satisfaction Attributes: An Application of Online Hotel Review Analysis. In IFIP Advances in Information and Communication Technology; Springer: Berlin/Heidelberg, Germany, 2014; pp. 238–250. ISBN 9783662455258. [Google Scholar]
  26. Muizz, O.; Hassanain, M.A. Quality Assessment of Student Housing Facilities through Post-Occupancy Evaluation. Archit. Eng. Des. Manag. 2016, 12, 367–380. [Google Scholar]
  27. Liu, X.; Jiang, Y. Aesthetic Assessment of Website Design Based on Multimodal Fusion. Future Gener. Comput. Syst. 2021, 117, 433–438. [Google Scholar] [CrossRef]
  28. Sa, N.; Yuan, X. In Human-Computer Interaction. Design and User Experience Case Studies: Thematic Area. In HCI 2021, Held as Part of the 23rd HCI International Conference; Springer International Publishing: Berlin/Heidelberg, Germany, 2021; pp. 442–457. [Google Scholar]
  29. Akgul, Y. Web Site Accessibility, Quality and Vulnerability Assessment: A Survey of Government Web Sites in the Turkish Republic. J. Inf. Syst. Eng. Manag. 2016, 1, 1–13. [Google Scholar] [CrossRef]
  30. Muhammad, A.; Siddique, A.; Naveed, Q.N.; Khaliq, U.; Aseere, A.M.; Hasan, M.A.; Qureshi, M.R.N.; Shahzad, B. Evaluating Usability of Academic Websites through a Fuzzy Analytical Hierarchical Process. Sustainability 2021, 13, 2040. [Google Scholar] [CrossRef]
  31. Pantula, M.; Kuppusamy, K.S. A Machine Learning-Based Model to Evaluate Readability and Assess Grade Level for the Web Pages. Comput. J. 2022, 65, 831–842. [Google Scholar] [CrossRef]
  32. Hsu, T.-C.; Chang, S.-C.; Liu, N.-C. Peer Assessment of Webpage Design: Behavioral Sequential Analysis Based on Eye Tracking Evidence. J. Educ. Technol. Soc. 2018, 21, 305–321. [Google Scholar]
  33. Dieber, J.; Kirrane, S. A Novel Model Usability Evaluation Framework (MUsE) for Explainable Artificial Intelligence. Inf. Fusion 2022, 81, 143–153. [Google Scholar] [CrossRef]
  34. Vermeeren, A.P.; Law, E.L.-C.; Roto, V.; Obrist, M.; Hoonhout, J.; Väänänen-Vainio-Mattila, K. User Experience Evaluation Methods: Current State and Development Needs. In Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, Reykjavik, Iceland, 21–26 October 2010; pp. 521–530. [Google Scholar]
  35. Maia, C.L.B.; Furtado, E.S. A Systematic Review about User Experience Evaluation. In Design, User Experience, and Usability: Design Thinking and Methods; Springer International Publishing: Cham, Switzerland, 2016; pp. 445–455. ISBN 9783319404080. [Google Scholar]
  36. Obrist, M.; Roto, V.; Väänänen-Vainio-Mattila, K. User Experience Evaluation: Do You Know Which Method to Use? In Proceedings of the 27th International Conference on Human Factors in Computing Systems, CHI 2009, Extended Abstracts Volume, Boston, MA, USA, 4–9 April 2009; pp. 2763–2766. [Google Scholar]
Figure 1. The methodological approach supporting our model.
Figure 2. The contexts of user experience in terms of human–computer interaction (HCI).
Figure 3. Use case diagram of our developed software application.
Figure 4. Our experimental sample (Student Hostel Website): the perspective of usability evaluation from the experts' and users' end.
Figure 5. Mapping of weighted average scores in terms of user feedback, where Q1–Q10 represent the survey questions and their scores out of five.
Figure 6. Fluctuation level in terms of the contexts' aggregated score with the expert mean score.
Figure 7. Mapping of software design principles to our web application: solid lines depict the missing features, and dotted (blue) lines depict the existing features.
Table 1. Survey questions in terms of the cognitive evaluation approach.
Domain | Topic | Question
Control | Technical glitch | Q1: How would you like to remark on the "technical glitch" parameter while using the Student Hostel Website?
Control | Unavoidable interrupt | Q2: How would you like to remark on the "unavoidable interrupt" parameter while using the Student Hostel Website?
Control | Force the user into unnecessary actions | Q3: How would you like to remark on the "force the user into unnecessary actions" parameter while using the Student Hostel Website?
Control | Repetitive interaction | Q4: How would you like to remark on the "repetitive actions" parameter while using the Student Hostel Website?
Engagement | Consistency of the application | Q5: How would you like to remark on the "consistency of the application" parameter while using the Student Hostel Website?
Engagement | Users' expectation vs. reality | Q6: How would you like to remark on the "users' expectation vs. reality" parameter while using the Student Hostel Website?
Engagement | Meaningful context | Q7: How would you like to remark on the "meaningful context" parameter while using the Student Hostel Website?
Engagement | Interactive user interface | Q8: How would you like to remark on the "interactive user interface" parameter while using the Student Hostel Website?
Goal | Desired content | Q9: How would you like to remark on the rooms shown on the Student Hostel Website?
Goal | Clear description | Q10: How would you like to remark on the room details and facilities of the hostel while using the Student Hostel Website?
Table 2. The participants' profile for the survey procedure.
Sex: Male 46%, Female 54%
Familiarity with the Internet and Computers: 100%
Age Range: 19–30
Familiarity with the Studied Website: Good 60%, Moderate 40%
Profession: Student 100%, Others 0%
Table 3. Mapping of cognitive usability finding approach to the questionnaire domain.
Context | Control Domain | Engagement Domain | Goal Domain
C1: Is the effect of the current action the same as the user's goal? | N/A | N/A | ✓
C2: Is the control for the action visible? | ✓ | N/A | N/A
C3: Is there a strong link between the control and the action? | ✓ | ✓ | N/A
C4: Is the feedback appropriate? | ✓ | ✓ | ✓
Table 4. The weighted average scores of the survey feedback.
Domain | Question Number | Weighted Average
Control | Q1 | 4.14
Control | Q2 | 3.90
Control | Q3 | 4.04
Control | Q4 | 3.94
Engagement | Q5 | 4.00
Engagement | Q6 | 4.02
Engagement | Q7 | 3.42
Engagement | Q8 | 3.96
Goal | Q9 | 4.16
Goal | Q10 | 4.04
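The weighted averages in Table 4 are presumably standard Likert weighted means: multiply each rating (1–5) by the number of respondents who chose it, sum, and divide by the total number of respondents. A minimal sketch, assuming a 5-point scale and a purely hypothetical response distribution (the per-option counts are not reported here):

```python
def weighted_average(counts):
    """Weighted mean of 5-point Likert responses.

    counts[i] is the number of respondents who chose rating i + 1.
    """
    total = sum(counts)
    weighted = sum(rating * n for rating, n in zip(range(1, 6), counts))
    return round(weighted / total, 2)

# Hypothetical distribution of 50 responses, for illustration only
# (not the study's raw data): 0x1, 2x2, 8x3, 21x4, 19x5.
print(weighted_average([0, 2, 8, 21, 19]))  # → 4.14
```

With this illustrative distribution the result happens to equal Q1's reported 4.14; the study's actual raw counts may of course differ.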
Table 5. Cognitive study on "Student Hostel Website" with expert opinions, as well as the overall rating.
C1: Is the effect of the current action the same as the user's goal?
  Expert 1: P1 Yes; P2 Yes; P3 No (pictures of different categories of rooms can be displayed on the home page); P4 Yes; P5 Yes
  Expert 2: P1–P5 Yes
  Expert 3: P1–P5 Yes
C2: Is the control for the action visible?
  Expert 1: P1–P5 Yes
  Expert 2: P1–P5 Yes
  Expert 3: P1–P5 Yes
C3: Is there a strong link between the control and the action?
  Expert 1: P1 No (a forgot-password option can be added to the login page); P2–P5 Yes
  Expert 2: P1 Yes; P2 Yes; P3 No (more categories of rooms should be added on the website); P4 Yes; P5 Yes
  Expert 3: P1 Yes; P2 Yes; P3 No (a proper user-friendly indicator for the room booking option needs to be added); P4 Yes; P5 Yes
C4: Is the feedback appropriate?
  Expert 1: P1 Yes; P2 Yes; P3 No (a confirmation email can be sent to the user after the user confirms the room booking); P4 Yes; P5 Yes
  Expert 2: P1–P5 Yes
  Expert 3: P1–P5 Yes
Overall Expert Rating: Expert 1: 3.5/5.0; Expert 2: 4.0/5.0; Expert 3: 4.5/5.0
Table 6. Mapping of context-based users' aggregated scores with cognitive-evaluation-based experts' rating.
Context | Domain | Agg. Score | Fluctuation Level (Expert Mean: 4.00) | Fluctuation Notation | User Satisfied or Not | Opinion
C1 | Goal | 4.1 | 0.1 | P | ✓ | In Context 1, the expert mean is exceeded because the result of the current action matches the user's goal; we considered the user's objectives while designing the website.
C2 | Control | 4.00 | 0.0 | - | ✓ | In Context 2, the aggregated context score and the expert mean rating are identical.
C3 | Control, Engagement | 3.93 | 0.07 | N | × | Context 3 does not meet the expert mean score since there is a weak connection between the control and the action; while carrying out their tasks, users encountered some unavoidable errors or technical glitches.
C4 | Control, Engagement, Goal | 3.96 | 0.04 | N | × | Context 4 does not meet the expert mean score because the users' expectations were not adequately met by the feedback.
Note: P denotes positive, and N denotes negative.
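The aggregated scores in Table 6 are consistent with a simple reading: average the Table 4 question scores over every domain a context spans, then compare against the experts' overall mean of 4.00 from Table 5. This is an inference from the reported numbers rather than a formula stated in the text; a minimal sketch of that reading:

```python
# Weighted average per survey question (Table 4), grouped by domain.
scores = {
    "Control":    [4.14, 3.90, 4.04, 3.94],  # Q1-Q4
    "Engagement": [4.00, 4.02, 3.42, 3.96],  # Q5-Q8
    "Goal":       [4.16, 4.04],              # Q9-Q10
}

# Domains covered by each context (per Tables 3 and 6).
contexts = {
    "C1": ("Goal",),
    "C2": ("Control",),
    "C3": ("Control", "Engagement"),
    "C4": ("Control", "Engagement", "Goal"),
}

EXPERT_MEAN = (3.5 + 4.0 + 4.5) / 3  # overall expert ratings, Table 5

def aggregate(context):
    """Mean question score over all domains a context spans."""
    vals = [v for domain in contexts[context] for v in scores[domain]]
    return sum(vals) / len(vals)

for c in contexts:
    agg = aggregate(c)
    print(f"{c}: agg={agg:.2f}, fluctuation={abs(agg - EXPERT_MEAN):.2f}")
```

Under this reading, C1 = mean(Q9, Q10) = 4.10 and C4 = mean(Q1–Q10) ≈ 3.96, and all four contexts match Table 6 to within 0.01 (rounding conventions may differ slightly).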
Table 7. The highlighted positive and negative feedback from the users of the "Student Hostel" Website in terms of its services.
Sl. | Design Parameter | Keywords | Selected User Comments
01 | Aesthetic Design | (P) easy on the eye; (N) navigation bar; (N) form layout | (P) the overall design is easy on the eye; (N) the navigation bar should be improved further; (N) the form layout should be more organized
02 | Communication | (P) contact us; (N) live chat | (P) users can easily contact the management team; (N) no live chat option
03 | Compatibility | (P) cross-platform; (N) offline access | (P) the system is not a cross-platform application; (N) cannot be accessed offline
04 | Cost | (P) reasonable price; (N) no offers | (P) room prices are very reasonable; (N) there are no offers
05 | Performance | (P) easy navigation; (N) no filtering | (P) easy to navigate; (N) users cannot filter rooms according to their preferences
06 | Satisfaction | (P) pop-up ads; (N) room categories; (N) better pictures | (P) there are no pop-up ads; (N) room categories are limited; (N) better pictures of rooms should be provided
07 | Security | (P) login; (N) authentication; (N) student verification | (P) a login facility is provided; (N) two-factor authentication should be implemented; (N) users who are not students can also book rooms, so student verification is needed
08 | Usefulness | (P) usability; (N) details; (N) advanced facilities | (P) easy to use; (N) more details should be provided; (N) there should be more facilities for students
09 | Responsiveness | (P) booking response; (P) helpline | (P) booking requests are responded to quickly; (P) helpline feedback is available
10 | Technical Glitch | (P) image quality; (P) 404 error | (P) high-resolution and clear images; (P) no 404 error messages occurred
Note: P denotes positive feedback, and N denotes negative feedback.
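Because every keyword in Table 7 carries a (P) or (N) polarity tag, the overall balance of praise versus improvement requests can be tallied mechanically. The sketch below hardcodes the Table 7 keyword column; the tallying code itself is illustrative and not part of the paper's tooling:

```python
from collections import Counter

# Keyword polarity labels per design parameter (Table 7).
keywords = {
    "Aesthetic Design": ["(P) easy on the eye", "(N) navigation bar", "(N) form layout"],
    "Communication":    ["(P) contact us", "(N) live chat"],
    "Compatibility":    ["(P) cross-platform", "(N) offline access"],
    "Cost":             ["(P) reasonable price", "(N) no offers"],
    "Performance":      ["(P) easy navigation", "(N) no filtering"],
    "Satisfaction":     ["(P) pop-up ads", "(N) room categories", "(N) better pictures"],
    "Security":         ["(P) login", "(N) authentication", "(N) student verification"],
    "Usefulness":       ["(P) usability", "(N) details", "(N) advanced facilities"],
    "Responsiveness":   ["(P) booking response", "(P) helpline"],
    "Technical Glitch": ["(P) image quality", "(P) 404 error"],
}

# Each keyword string starts with "(P" or "(N"; index 1 is the polarity letter.
tally = Counter(k[1] for ks in keywords.values() for k in ks)
print(tally)  # → Counter({'P': 12, 'N': 12})
```

Across the ten design parameters this yields 12 positive and 12 negative keywords, i.e. user praise and improvement requests were evenly split.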
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Whaiduzzaman, M.; Sakib, A.; Khan, N.J.; Chaki, S.; Shahrier, L.; Ghosh, S.; Rahman, M.S.; Mahi, M.J.N.; Barros, A.; Fidge, C.; et al. Concept to Reality: An Integrated Approach to Testing Software User Interfaces. Appl. Sci. 2023, 13, 11997. https://doi.org/10.3390/app132111997