Blackboard is a learning management system used by many tertiary education providers worldwide. The website provides end-users with the ability to organise, store, share, and assess academic content. Because the system’s user interface can vary significantly between institutes, each implementation must focus on the end-users of the institute in question. End-user experience and interaction are vital in determining the success of any product, and Blackboard is no exception. By measuring the five usability quality components, we can identify areas in need of improvement to increase its ease of use. The five areas identified are Learnability, Efficiency, Memorability, Errors, and Satisfaction.
As this study seeks to identify the issues experienced by the existing user base, Efficiency, Memorability, Errors, and Satisfaction were defined as the main categories of interest. Because all participants had prior exposure to Blackboard, Learnability could not be measured. The four identified categories are defined by Isa, Lokman, Wahid, & Sulaiman (2014) as follows.
To measure the impact of Auckland University of Technology’s Blackboard design, 12 third-year students were selected to perform a series of tasks reflecting a typical interaction with the website. The 12 participants, whose demographic details were collected pre-study, were subjected to qualitative, quantitative, attitudinal, and behavioural assessment. These assessments took the form of a task-oriented interview, intercept surveys, usability lab studies, and ethnographic field studies.
The assessment methods selected for this study enabled the results both to directly measure the impact Blackboard’s design had on participants in specific areas and to record observations in areas not previously identified as relevant to the study.
The task-oriented interview, intercept surveys, and elements of the ethnographic field studies were selected to measure the attitudes expressed by participants before, during, and after the study, using both qualitative and quantitative approaches. The remaining assessment techniques, usability lab studies and elements of the ethnographic field study, were selected to assess the behaviour of the participants using a qualitative approach.

The Questionnaire
The pre-study questionnaire allowed us to obtain demographic data, so that we could compare variables such as experience, age, ethnicity, field of study, and year of study. These variables enabled us to explore ethnographic research areas during the study. The main focus of the pre-study questionnaire was to observe the correlation between demographic variables and the ethnographic and experimental data collected.
The post-study questionnaire alternated between ‘positive’ and ‘negative’ statements, where users selected ‘1’ if they strongly agreed with the statement proposed, or ‘5’ if they strongly disagreed. This design allowed us both to capture the intensity of each response and to reduce acquiescence bias, where users might otherwise have agreed blindly with the questions provided.
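The alternating-tone design implies that negatively worded items must be reverse-coded before scores can be compared. The sketch below illustrates this step; the item texts, scores, and the `normalise` helper are hypothetical and not drawn from the study’s data.

```python
# Sketch of reverse-coding for an alternating positive/negative Likert
# questionnaire. Item texts and scores are illustrative only; the
# normalise() helper is a hypothetical name, not the study's code.
def normalise(score, negative):
    """Map a 1-5 agreement score so that 1 is always the most favourable."""
    return 6 - score if negative else score

responses = [
    ("The site is easy to navigate", 2, False),
    ("The interface feels cluttered", 4, True),  # disagreeing is favourable
]

for item, score, neg in responses:
    print(f"{item}: {normalise(score, neg)}")
```

After this transformation, a low score consistently indicates a favourable view, so positive and negative items can be averaged together.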
Pre-study activities include all research processes that take place before the usability experiment. Before the usability session, all participants are asked questions about their demographics, their previous experience with the software, and how long they have been using Blackboard. These questions seek to measure how the participants feel about the software leading up to the study, their level of expertise with it, and how their background may influence the results.
The experiment involves participants completing three tasks in a one-to-one usability session. All tasks begin from the homepage of a logged-in Blackboard user. The three tasks are as follows:
After the usability experiment, a post-study questionnaire of 10 questions was provided covering the features, user interface, and performance of the website. The survey required users to indicate their level of agreement with different statements on a scale of 1–5 (1 - Strongly Agree; 2 - Agree; 3 - Neutral; 4 - Disagree; 5 - Strongly Disagree). The questionnaire incorporates sections that enable participants to provide qualitative written data in the form of comments and recommendations.
Finding: We ordered the tasks by frequency of use: the first task was something all students do on a daily or weekly basis (opening lecture notes), followed by viewing grades (done monthly or more often), and lastly viewing exam timetables (done at the end of the semester). This ordering allowed us to judge memorability, as we noticed each task took longer than the one before: task one averaged 21.62 seconds, task two 35.19 seconds, and task three 39.89 seconds (after removing outliers). This supports the memorability part of our research criteria, in that tasks users have performed more often are completed faster. Task one and task three required the same number of clicks to open and load the file.
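The per-task averages above could be computed as outlier-trimmed means. The sketch below assumes a 1.5 × IQR rule for flagging outliers (the study does not state which rule was used), and the timing lists are invented example values, not the study’s raw data.

```python
# Sketch of outlier-trimmed per-task averages, assuming the 1.5 * IQR
# rule. Timing values below are illustrative, not the study's raw data.
from statistics import mean, quantiles

def trimmed_mean(times):
    """Mean of the values within 1.5 * IQR of the quartiles."""
    q1, _, q3 = quantiles(times, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return mean(t for t in times if lo <= t <= hi)

task_times = {  # seconds per participant (made-up example values)
    "open lecture notes":  [20, 22, 21, 23, 19, 95],  # 95 s is an outlier
    "view grades":         [34, 36, 35, 37, 33, 36],
    "view exam timetable": [40, 39, 41, 38, 42, 40],
}

for task, times in task_times.items():
    print(f"{task}: {trimmed_mean(times):.2f} s")
```

With the example data, the 95-second observation in the first task is excluded before averaging, mirroring the study’s note that outliers were removed.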
Method: Once the usability tasks were complete, participants completed a questionnaire. Participants indicated a level of agreement with ten statements, using a 1–5 scale, and had the option to write an additional comment per statement. For analysis, the results of the questionnaire are grouped into three usability categories: features, user interface, and performance.
The first four questions of the usability study focused on features of the website, particularly interactions with navigation menus and links.
When looking at features of the website, participants highlighted navigation as a potential usability issue. Participants noted they needed to click through too many menus, and that pages contained information not relevant to their current task. They commented that navigation menus did not always make clear where they would take the user. Finding exam timetables was highlighted as particularly difficult and potentially impossible without prior experience of the Blackboard system; at least two participants indicated they could not have found the exam timetable had they not already learned how to find it. Overall, however, participants agreed with the positive statements about the features of the website and disagreed with the negative ones.
Questionnaire responses show most participants disagreed with the statement that they did not like the colour palette. Comments on the colour palette were generally favourable to neutral, with only one participant commenting that the colours could be more appealing. Most users also agreed that the website was visually appealing; several comments cited the colour palette as a key reason the site was attractive.
On average, there was marginal agreement with the statement that the user interface was cluttered. This was the only negative statement in the post-study questionnaire that more people agreed with than disagreed with. Comments indicated that there is potentially too much going on per webpage, particularly the home page. It is interesting to note that the average score of users who found Blackboard cluttered (2.5) is very similar to the score of users who found Blackboard hard to navigate (2.75), suggesting a possible correlation: users who find Blackboard visually clear may also find it easy to navigate. In a study on ‘Web Navigation & Behavioural Effects’, Oxford’s Dr David R Danielson (2002) states that a lack of visual cues in navigation can contribute to a website feeling “cluttered”, which ties back to the hypothesis and correlation above.
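The suggested link between the two items could be checked with a Pearson correlation over per-participant scores. The sketch below uses invented scores (the study reports only the group means of 2.5 and 2.75), so the resulting coefficient is purely illustrative.

```python
# Illustrative Pearson correlation between "cluttered" and "hard to
# navigate" scores. Per-participant scores are invented for this example;
# only the group means (2.5 and 2.75) appear in the study.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

cluttered  = [2, 3, 2, 1, 4, 3, 2, 3]  # hypothetical 1-5 agreement scores
navigation = [3, 3, 2, 2, 4, 3, 2, 3]

print(f"r = {pearson_r(cluttered, navigation):.2f}")
```

A coefficient near +1 on real per-participant data would support the hypothesis that perceived clutter and navigation difficulty move together; the group means alone cannot establish this.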
Questionnaire results covering the performance of the website scored the two highest positive responses. Participants on average felt load times were appropriate and did not agree that the site was unresponsive. Some comments mentioned how fast the website loaded and said it felt responsive. One participant commented that some parts of the page could load a little faster, but otherwise the feedback indicates performance-related usability issues are very limited, suggesting the website may be well optimised. A contributing factor to the positive performance of the Blackboard site may be the high broadband speed available at the Auckland University of Technology, where the usability studies took place. Testing the perceived performance of the site at differing broadband speeds would be useful, but is beyond the scope of this project.
Based on the feedback gathered and reviews given by the users, this study recommends the following changes for AUT to consider: