IWAS: Intelligent Web-Based Assessment System

Huang, J., University of South Alabama, United States; He, L., National Institutes of Health, United States; Davidson-Shivers, G., University of South Alabama, United States

Society for Information Technology & Teacher Education International Conference, 2010, San Diego, CA, USA. ISBN 978-1-880094-78-5. Publisher: Association for the Advancement of Computing in Education (AACE), Chesapeake, VA.


Effective assessment is vital to educational activities. We propose the Intelligent Web-Based Assessment System (IWAS) to assess both learning and teaching. IWAS lays a foundation for more efficient instruction and, ultimately, improved student performance by: (1) using Bayesian networks to reason from the causes (knowledge levels and learning styles) to the probabilities of the effects (learning outcomes); (2) addressing the absence of teaching assessment through feedback collected at different levels, correlating teaching assessment with learning assessment to improve instructional effectiveness; (3) decomposing IWAS into a set of modules under a client/server architecture, where standard inter-module interfaces make the system easy to maintain and adaptable to different domains; and (4) integrating Web technologies to deliver formative feedback to users in a timely manner.
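The Bayesian reasoning in point (1) can be sketched as a small two-cause network in which knowledge level and learning style are parents of the learning outcome. The node names, states, and probability values below are illustrative assumptions, not figures from the paper; real conditional probability tables would be elicited or learned from assessment data.

```python
# Illustrative two-cause Bayesian network: causes (knowledge level,
# learning style) -> effect (learning outcome). All numbers are
# hypothetical placeholders.

# Prior distributions over the two causes.
P_knowledge = {"low": 0.4, "high": 0.6}
P_style = {"visual": 0.5, "verbal": 0.5}

# CPT: P(outcome = "pass" | knowledge level, learning style).
P_pass = {
    ("low", "visual"): 0.30,
    ("low", "verbal"): 0.25,
    ("high", "visual"): 0.85,
    ("high", "verbal"): 0.80,
}

def prob_pass():
    """Predictive reasoning: marginal P(outcome = pass),
    summing over both causes."""
    return sum(
        P_knowledge[k] * P_style[s] * P_pass[(k, s)]
        for k in P_knowledge
        for s in P_style
    )

def posterior_knowledge(passed=True):
    """Diagnostic reasoning: P(knowledge level | observed outcome)
    via Bayes' rule."""
    joint = {}
    for k in P_knowledge:
        joint[k] = sum(
            P_knowledge[k] * P_style[s]
            * (P_pass[(k, s)] if passed else 1.0 - P_pass[(k, s)])
            for s in P_style
        )
    z = sum(joint.values())
    return {k: p / z for k, p in joint.items()}

print(round(prob_pass(), 4))                       # marginal pass probability
print({k: round(v, 4) for k, v in posterior_knowledge(True).items()})
```

With these placeholder numbers, observing a "pass" shifts belief strongly toward a high knowledge level, which is the direction of inference IWAS would use to update its model of a learner.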


Huang, J., He, L. & Davidson-Shivers, G. (2010). IWAS: Intelligent Web-Based Assessment System. In D. Gibson & B. Dodge (Eds.), Proceedings of SITE 2010--Society for Information Technology & Teacher Education International Conference (pp. 84-91). San Diego, CA, USA: Association for the Advancement of Computing in Education (AACE). Retrieved November 19, 2018.

  1. Angel Learning. (2009). Retrieved on October 7, 2009 from
  2. Ayre, M., & Nafalski, A. (2000). Recognizing diverse learning styles in teaching and assessment of electronic engineering. Proc. IEEE 30th Frontiers in Education Conference, October 2000, Kansas City, MO, 18-23.
  3. Baker, E.L., & Mayer, R.E. (1999). Computer-based assessment of problem solving. Computers in Human Behavior, 15 (9), 269-282.
  4. Baniulis, K., & Reklaitis, V. (2002). TestTool: Web-based testing, assessment, learning. International Journal of Informatics in Education, 1 (2), 17-30.
  5. Bergendahl, C., & Tibell, L. (2005). Boosting complex learning by strategic assessment and course design. Journal of Chemical Education, 82 (4), 645-651.
  6. Blackboard. (2009). Retrieved on October 7, 2009 from
  7. Bloom, B.S. (1956). Taxonomy of educational objectives, Handbook I: the cognitive domain. David McKay Co. Inc., New York.
  8. Brown, G., Bull, J., & Pendelbury, M. (1997). Assessing student learning in higher education. Routledge, New York.
  9. Brown, S. (1999). Institutional strategies for assessment. In S. Brown and A. Glasner (Eds.), Assessment Matters in Higher Education: Choosing and Using Diverse Approaches (pp.3-13). St Edmunsbury Press Ltd, Suffolk.
  10. Brown, S., & Knight, P. (1994). Assessing learners in higher education. Kogan Page, London.
  11. Brusilovsky, P., & Miller, P. (1999). Web-based testing for distance education. Proc. WebNet'99 – World Conference of the WWW and Internet, October 1999, Honolulu, HI. 149-154.
  12. Burgess, G.A. (2005). Introduction to programming: blooming in America. Journal of Computing Sciences in Colleges, 21 (1), 19-28.
  13. Cerbin, W. (1994). The course portfolio as a tool for continuous improvement of teaching and learning. Journal on Excellence in College Teaching, 5 (1), 95-105.
  14. Choren, R., Blois, M., & Fuks, H. (1998). Quest: an assessment tool for web-based learning. Proc. WebNet’98 – World Conference of the WWW, Internet and Intranet, 1998, Orlando, FL.
  15. CourseCompass. (2009). Retrieved on October 7, 2009 from
  16. Court, M.C., Tung, L., Shehab, R.L., Rhoads, T.R., & Ashford, T. (2003). An adaptable learning environment that is centered on student learning and knowledge resolution. World Transactions on Engineering and Technology Education, 2 (1), 41-44.
  17. Felder, R.M. (1993). Reaching the second tier – learning and teaching styles in college science education. Journal of College Science Teaching, 23 (5), 286-290.
  18. Felder, R.M., & Silverman, L.K. (1988). Learning and teaching styles in engineering education. Engineering Education, 78 (7), 674-681.
  19. Jensen, F.V. (2007). Bayesian Networks and Decision Graphs. New York: Springer-Verlag.
  20. Krathwohl, D.R., Bloom, B.S., & Masia, B.B. (1973). Taxonomy of educational objectives, the classification of educational goals, Handbook II: affective domain. David McKay Co. Inc., New York.
  21. Mockford, C., & Denton, H. (1998). Assessment models, learning styles, and design and technology project work in higher education. Journal of Technology Studies, xxiv(1). Retrieved on October 7, 2009 from
  22. Moodle. (2009). Retrieved on October 7, 2009 from
  23. Oliver, D., Dobele, T., Greber, M., & Roberts, T. (2004). This course has a Bloom Rating of 3.9. Proc. ACM 6th Conference on Australasian Computing Education, 2004, Dunedin, New Zealand. 227-231.
  24. Peat, M. (2000). Online assessment: the use of web based self assessment materials to support self directed learning. In A. Herrmann and M.M. Kulski (Eds.), Flexible Futures in Tertiary Teaching, Proc. 9th Annual Teaching Learning Forum (pp. 2-4).
  25. QuizStar. (2009). Retrieved on October 7, 2009 from
  26. Scott, T. (2003). Bloom’s taxonomy applied to testing in computer science classes. Journal of Computing Sciences in Colleges, 19 (1), 267-274.
  27. Wiggins, G. (1990). Toward assessment worthy of the liberal arts: the truth may make you free, but the test may keep you imprisoned. Proc. 5th AAHE Conference on Assessment in Higher Education, June 1990, Washington, D.C. 17-31.