How will we measure success of this knowledgebase?
There will probably be many criteria for measuring the success of this UCLA Knowledgebase experiment, but the first is how many people contribute to it and how often. Here are some numbers to start measuring that; for now they are updated manually.
| Metric | Day 31 | Day 38 | Day 45 | Day 52 | Day 59 | Day 66 | Day 80 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Articles with answers | 110 | 155 | 306 | 363 | 434 | 458 | 488 |
| Posts since last date | — | 45 | 151 | 57 | 71 | 24 | 30 |
| Articles without answers | 3 | 4 | 2 | 2 | 2 | 2 | 2 |
| People who contributed more than 5 articles | 4 | 5 | 9 | 11 | 13 | 15 | 16 |
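The "posts since last date" row is simply the difference between consecutive "articles with answers" snapshots, and dividing by the elapsed days gives the per-day posting rate. A minimal sketch of that derivation, using the numbers from the table (the variable names are illustrative, not part of any existing tooling):

```python
# Snapshots copied from the table above: (days since start, articles with answers).
snapshots = [(31, 110), (38, 155), (45, 306), (52, 363),
             (59, 434), (66, 458), (80, 488)]

# Derive posts since the previous snapshot and the average new articles per day.
for (d0, a0), (d1, a1) in zip(snapshots, snapshots[1:]):
    new_posts = a1 - a0
    per_day = new_posts / (d1 - d0)
    print(f"days {d0}-{d1}: {new_posts} new posts ({per_day:.1f}/day)")
```

The same difference computation could later be automated instead of updating the table by hand.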
The database was announced at the Help Desk/CSC Meeting on April 12, 2006. At that time it had 23 answers and roughly 10 contributors.
Please suggest other measures of success. Remember that the first target audience is the staff of the 43 Help Desks at UCLA.
Possible Evaluation Criteria
- relevancy of articles as judged by user ranking (if we add that feature)
- number of queries per day
- number of new articles per day
- number of regular contributors
- percentage of help desks that contribute regularly
- percentage of help desks that use it for queries regularly
- anecdotal evidence of knowledgebase success