This post is on behalf of the IEEE Computational Intelligence Society's University Curricula Subcommittee.
The IEEE Computational Intelligence Society maintains a database of over 100 courses, on various aspects of computational intelligence, taught at undergraduate and graduate levels all over the world. If you are interested in learning about the existing courses, or would like to list your own course so that it can be accessed by students and teachers, please visit the CIS University Curriculum web site: http://cis.ieee.org/university-curricula.html
Showing posts with label teaching. Show all posts
Thursday, November 29, 2012
Thursday, June 14, 2012
Publishing and perishing under gameable metrics 2
This article about the Australian Excellence in Research for Australia (ERA) initiative discusses how the process by which Australian universities and academics are assessed is flawed. It also discusses how Australian institutions have been gaming the metrics, like certain New Zealand institutions have been accused of doing.
In this previous post I described how any metric by which an institution or academic is assessed can be gamed. That is, any way in which an academic or institution is assessed can be manipulated by that institution to gain a higher score. In this post, I discussed how this has a negative effect on the teaching performance of an institution. By removing staff who do not perform well in research assessments due to a heavy teaching load, the institution can lift their research scores, but at the cost of lowering their teaching performance. As the article mentions, teaching is not assessed, so the process optimises towards a single metric at the expense of all others. This is not helpful for the long-term viability of an institution, as undergraduates will not want to attend an institution with a poor reputation for teaching.
This situation is almost certain to increase the use of contract lecturers, as contract lecturers are, as I understand it, exempt from assessment. I've already described why increasing contract lecturers is a bad idea, mostly because of a lack of job security and satisfaction for the contract lecturers, as well as a lack of continuity in teaching from the point of view of the students.
It is becoming increasingly apparent to me that assessing institutions is not as useful as assessing individuals, and that, in today's highly-mobile world, the reputation of an institution is no longer as important as the reputation of an individual researcher. This raises an interesting question:
What would happen if research performance based funding were given directly to the researchers based on their own individual performance, rather than their institutions being given extra funding based on the collective research performance of their staff?
The article linked at the start of this post does an excellent job of describing the problems with collective assessments (like what does it mean if you have one researcher ranked 1 and one ranked 5 - do they have a collective performance of 3? What does that even mean?).
Individual funding would remove a lot of the financial motivation for institutions to game the system, although it wouldn't eliminate it (institutions would still make money by charging individual researchers overheads, but these could be capped). Under the current Australian and New Zealand systems, individuals are assessed anyway, so this doesn't require any great changes to the current assessment process. One downside (and it could be a stonking big downside) is that early-career researchers would probably do poorly under this model. Early-career researchers are already disadvantaged by management practices designed to game the system, and a simple weighting mechanism accounting for the length of time an individual has been doing research would go a long way to help. This would encourage researchers to start publishing early (which is essential to master the art of scientific publishing) and to publish consistently (which is essential to maintain publishing skills). Another downside would be senior researchers taking credit for the work of junior researchers. But, again, this happens anyway, even though it is profoundly unethical. Under this system, though, it would no longer be just unethical, it would be criminal fraud.
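To make the career-length weighting idea concrete, here is a toy sketch of what such a mechanism might look like. The function, the ten-year ramp, and the floor value are entirely my own illustrative assumptions, not part of any actual assessment scheme:

```python
# Hypothetical sketch of a career-length weighting for research scores.
# The ramp length (10 years) and the floor (0.1) are illustrative
# assumptions, not taken from any real assessment system.

def weighted_score(raw_score: float, years_active: float,
                   ramp_years: float = 10.0) -> float:
    """Scale a raw research score against an expectation that grows
    with time in research, capped after `ramp_years`."""
    expected_fraction = min(years_active, ramp_years) / ramp_years
    # Floor the divisor so brand-new researchers aren't scaled absurdly.
    return raw_score / max(expected_fraction, 0.1)

# A researcher 3 years in with a raw score of 2.0 now compares
# favourably with a 20-year veteran scoring 5.0:
early = weighted_score(2.0, 3)    # 2.0 / 0.3, roughly 6.67
senior = weighted_score(5.0, 20)  # 5.0 / 1.0 = 5.0
```

Any real scheme would of course need a far more carefully calibrated curve, but the principle is simply that raw output is judged relative to time served.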
Such a scheme could only be successful if it were paired with a scheme for assessing and rewarding teaching. While I have stated several times that an academic in a permanent position who is not publishing is not doing their job, an academic with a low (but not non-existent) research output and a strong teaching performance is an asset to an institution. Therefore, it is, in my opinion, imperative that an objective metric for teaching performance be implemented as soon as possible. That way, quality teachers, as well as quality researchers, would be recognised and rewarded. Those who do both (and this is the ideal for an academic, to teach and do research) would score even higher.
Labels:
research craft,
teaching
Thursday, May 31, 2012
An experiment in open-source textbooks 2
To further put my money where my mouth is with regard to my support for open source textbooks, I'm following up Monday's post by making the outline of my open source textbook, Intelligent Information Systems, available online. The outline is in PDF format, and is available at the following address:
http://mike.watts.net.nz/IIS_Outline.pdf
Readers are encouraged to comment on the outline via the comments section of this blog - I want to hear your opinions!
Labels:
open source,
teaching,
textbooks
Wednesday, May 30, 2012
WCCI 2012 Panel Session on Computational Intelligence in Education and University Curricula
The following panel session at WCCI 2012 is organised by the IEEE Computational Intelligence Society's Curriculum Subcommittee (which I happen to serve on), and will be held Thursday, June 14, 4:10-5:10pm.
Chairs: Robert Kozma and Jennie Si
Panelists: Haibo He, Janusz Kacprzyk, Jim Keller, Luis Magdalena, Marios Polycarpou, Lipo Wang
Computational Intelligence is a relatively new research field. A lot of educational materials have been created in various areas of CI in the past decades. However, due to the field's relative youth, its fundamental achievements have not yet been organized into a comprehensive curriculum. It is crucial for the development of the field to have high-quality educational materials on the state of the art of CI: such materials help attract and educate talented, enthusiastic students, and document the progress of the field. The panel will discuss various areas of CI education, including existing databases and course materials, online resources and video lectures, development of new textbooks, open-source software, and others. Various recommendations for future actions will be discussed as well.
Labels:
conferences,
panel session,
teaching,
textbooks
Monday, May 28, 2012
An experiment in open-source textbooks
I am thinking of writing a textbook. Actually, I'm working on three at the moment, one of which is a research monograph, but the one that is most relevant to this post is tentatively titled Intelligent Information Systems, and will cover neural networks, fuzzy systems and evolutionary algorithms at an undergraduate level. I also expect it would be useful for researchers from other disciplines who want to apply methods in computational intelligence to their own research, and to software engineers who want to solve real-world problems with computational intelligence.
In line with this post, I am seriously considering making Intelligent Information Systems available as an open-source textbook. But before I do, I need some encouragement. So I'm asking you, my dear readers, to encourage me. If you think you would assign an open-source textbook on this topic to a class, or that you would buy a self-published textbook, let me know in the comments. If you could see yourself contributing some other way, let me know, too.
It's up to you good folk to push me to do this!
Labels:
publishing,
teaching,
textbooks
Tuesday, May 22, 2012
Publishing and perishing under gameable metrics
My alma mater is in the New Zealand news again, and again it is to do with gaming the metrics by which the research performance of New Zealand tertiary institutions is measured. This time, the article describes how many staff with poor publishing records have been made redundant from the university (that is, they have lost their jobs) prior to the assessment later this year. While I have little sympathy for those in permanent lecturing positions who do not publish (see my previous comments here and here), in this case it seems like the staff who have lost their jobs are predominantly teaching staff, or staff who are still developing their research record (see this post from one who lost her job for the same reason some time ago). If that is the case, then I have to say that the university administration is making a mistake.
Teaching takes a lot of time and energy (my last semester teaching at Otago, I was in the office at least six days a week, and often worked from 7:30 in the morning to 9 or 10 at night). The purpose of having teaching-only staff is to take some of that load off of the lecturers so that they can do their research. Indeed, the major thrust of the article is that the redundancies are putting more stress on the remaining staff, as they are having to pick up extra teaching in addition to lifting their own research outputs. While the teaching load could in theory be reduced by hiring contract lecturers (who would not, as I understand it, be assessed) I have already posted on why this is a bad idea.
From my research with evolutionary algorithms, I know that optimising to one criterion or metric seldom results in optimal or robust systems. By optimising their staff to one (flawed and gameable) metric, the University of Otago is reducing the robustness of their institution. The long-term outcome of these redundancies is yet to be seen, but I do not think that it will be good for anyone concerned. Non-performers need to be removed, for sure, but early-career researchers need coaching and leadership to develop. They don't need the great big stick of the threat of redundancy waved at them (such threats are more often than not a sign of dysfunctional management, rather than a sign of competent leadership).
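The point about single-metric optimisation can be shown with a toy model of my own construction (the effort split and square-root returns are purely illustrative, not drawn from any real data): staff divide a fixed effort budget between research and teaching, and we compare optimising research output alone against optimising both together.

```python
# Toy illustration: optimising one metric drives the other to zero.
# The model (fixed effort budget, square-root diminishing returns)
# is an illustrative assumption, not real assessment data.

def outcomes(research_effort: float):
    """Return (research_score, teaching_score) for a given effort split."""
    teaching_effort = 1.0 - research_effort
    research_score = research_effort ** 0.5   # diminishing returns
    teaching_score = teaching_effort ** 0.5
    return research_score, teaching_score

candidates = [i / 100 for i in range(101)]  # effort splits from 0.0 to 1.0

# Optimising the research metric alone:
best_single = max(candidates, key=lambda e: outcomes(e)[0])
# Optimising research and teaching together:
best_joint = max(candidates, key=lambda e: sum(outcomes(e)))

print(best_single)  # 1.0 -> teaching score is exactly zero
print(best_joint)   # 0.5 -> both scores stay positive
```

Crude as it is, the sketch captures the mechanism: when only research counts, the "optimal" institution puts zero effort into teaching.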
Ultimately, only those who set the metrics can resolve this situation. As long as a metric can be gamed, then institutions will game them. In the meantime, people will have their lives upended and their careers destroyed by narrow-minded administrators and cynical political operators who are trying to wring a few more points out of the system to make themselves look good.
Labels:
rants,
research craft,
teaching
Saturday, April 21, 2012
The future of universities
Or, why contract lecturers are probably a bad idea.
The last time I was job-hunting, I noticed a number of positions advertised as "sessional" or "contract" lecturers. These were positions where a person would present a few lectures a week for a certain course, for a fixed period of time, then leave the institution. In this article, the use of contract lecturers in American universities is described as a crisis, where quality of teaching is suffering and the highly-skilled educators end up severely under-paid. While administrators justify this as a way of cutting monetary costs, the educational costs are huge.
Firstly, contract lecturers are not available for struggling students. This is because they are seldom paid full-time, which makes it difficult to find time for out-of-class student consultation: people don't like to work for nothing.
Secondly, the fly-by-night nature of contract lecturers prevents them from forging bonds with cohorts of students: the students see them for one course, then never see them again. In other words, the contract lecturer has no motivation and little opportunity to see their students as anything other than faceless blobs that sit in the lectures absorbing information. This is not conducive to high-quality teaching.
This also makes it harder to recruit post-graduate students. I vividly remember the first time I was lectured by the man who would go on to be my PhD supervisor: I was a first-year undergraduate, sitting in a lecture theatre on a cold Dunedin evening, and he described a world of computational intelligence that I knew right then was a world I wanted to explore myself. I knew that if I worked hard in my first and second year courses, I would be able to do his third-year honours-track course, and if I did well in third-year, I could do his fourth-year honours course, and if I did well in that, I could do a PhD with him. If he had been a fly-by-night contract lecturer, would I have been as inspired? I probably would have skipped honours and gone into the workforce straight after third year. While that might have placed me in a slightly better financial position, my life would be much less rich than it is now.
While I don't have evidence for it, I suspect that contract lecturing does not overall attract the best teaching talent. Now, I'm not trying to denigrate contract lecturers, and I know several people who have worked as contract lecturers to support themselves while looking for post-docs, immediately after completing their PhDs. But as a highly-trained professional (which is what anyone with a PhD is) it is hard to justify taking a contract lecturer position if there are any other options available. I never even bothered applying for the contract lecturing positions I saw advertised, even though I was capable of doing them well, simply because it was not worth my while to shift myself and my family to do the job. If I were a single man, perhaps I could embrace the digital nomad lifestyle, and drift about doing contract lecturing here and there. But with a family to support, including a primary-school age daughter, it simply is not an option.
On the flip side, contract lecturing can provide a way for junior staff to get some experience lecturing. Also, technology is getting to the point where the lecturer no longer has to be in the same physical location as the class: the success of the Khan Academy and open courses (like the courses run by Sebastian Thrun) has shown that it is possible to have a class that is far away from the instructors. If the option to teach remotely were there, it might be easier to get top talent as contract lecturers. I wouldn't mind being a contract lecturer nearly so much if it meant I didn't have to relocate to do the job. Of course, the alienation between lecturer and student that I discussed above could become even greater.
I think that the use of contract lecturers is probably going to increase, especially for first-year or general "service" courses, like for introductory programming or basic web development. But for more advanced under-graduate courses, or for post-graduate teaching, permanent staff are absolutely essential, due to the multi-year nature of post-graduate study. This also requires a level of specialisation that contract lecturers simply cannot develop: they are treated like interchangeable parts, which is no way to treat anyone, let alone someone who you expect to teach, and to inspire, students.
Labels:
teaching
Monday, September 12, 2011
On plagiarism
Plagiarism is one of the most unpleasant things to deal with when teaching. Panos Ipeirotis wrote a blog post that stimulated some discussion, and was then removed because of legal threats. In short, he detected a fairly large amount of plagiarism in a class, but calling the students out on it created a lot of antipathy towards him, leading to lower student evaluations, which adversely affected his own financial prospects.
The later discussions suggested setting assignments that are impossible for the students to plagiarise. During my tenure teaching at the University of Otago, I saw my fair share (or more than my fair share) of plagiarism, and some of it was pretty bad.
The worst I saw was while teaching my second-year data processing course. It's not like it was difficult to detect, either: the copied portions stood out because the writing style was completely different. A few seconds with Google was usually enough to find the exact source. The easiest-detected case of plagiarism I dealt with was when a student copied from the laboratory manual - which I had written. There were so many cases of plagiarism in that course that the higher-ups changed the way in which plagiarism was dealt with: originally, all cases of plagiarism were sent to the Dean of the School. After a few weeks of me sending students to them, the regulation was changed to sending them to the head of department. The only penalty the students received, though, was a zero for the work that they had plagiarised in. By the end of the year, I'd detected more plagiarism than the rest of the department put together, which raises the question: did more students plagiarise in my course, or was I just better at detecting them? If the former, was it because my course was harder? Because it was a required course that the students weren't really interested in? Or were the students really not smart enough to do the course without cheating? If the latter, why did I detect more than the other teaching staff? Was I the only one who read the assignments carefully? Did the other teaching staff not care? Or was it that the assessments in other courses were such that plagiarism was harder to commit in the first place, that is, more practically oriented?
While most of the plagiarism I dealt with was from undergrads, I have come across it reviewing papers, as well. Again, it was easy to detect: most of the paper was written very badly, apart from two or three paragraphs. Again, a few seconds work on Google was enough to find the original source. Needless to say, the paper was rejected. Since it was only a conference paper, I doubt that there were any repercussions on the authors.
As far as student plagiarism is concerned, I agree with the notion that it is better to spend time setting assessments that can't be plagiarised. The one course I taught that never had a problem with plagiarism was my fourth-year computational intelligence course. Now, that is partly likely to be because the students were highly-motivated, honours-level students, but also because of the nature of the lectures and assessment. Rather than me giving lectures twice a week, students took turns researching and presenting on a topic. There was a list of permissible topics for each week, so that the presentations followed the curriculum I had set out for the course, the students got support in researching their talk, and I went over each presentation before it was given. The practical work was entirely project-oriented, where again the students selected a project that interested them. This actually worked very well: it taught the students valuable skills and left no scope for plagiarism. I wonder, though, how well it would work for third or even second year students?
Perhaps a more important question is, why do students plagiarise? If we could answer that question, could plagiarism be eradicated? Or would there always be some students who are simply so desperate (or so unable / unwilling to do the work) that they will always plagiarise?
The later discussions suggested setting assignments that are impossible for the students to plagiarise. During my tenure teaching at the University of Otago, I saw my fair share (or more than my fair share) of plagiarism, and some of it was pretty bad.
The worst I saw was while teaching my second-year data processing course. It's not like it was difficult to detect, either: the copied portions stood out because the writing style was completely different. A few seconds with Google was usually enough to find the exact source. The easiest-detected case of plagiarism I dealt with was when a student copied from the laboratory manual - which I had written. There were so many cases of plagiarism in that course that the higher-ups changed the way in which plagiarism was dealt with: originally, all cases of plagiarism were sent to the dean of School. After a few weeks of me sending students to them, the regulation was changed to sending them to the head of department. The only penalty the students received, though, was a zero for the work that they had plagiarsied in. By the end of the year, I'd detected more plagiarism than the rest of the department put together, which raises the question: did more students plagiarise in my course, or was I just better at detecting them? If the former, was it because my course was harder? Because it was a required course that the students weren't really interested in? Or were the students really not smart enough to do the course without cheating? If the latter, why did I detect more than the other teaching staff? Was I the only one who read the assignments carefully? Did the other teaching staff not care? Or was it that the assessments in other courses were such that plagiarism was harder to commit in the first place, that is, more practically oriented?
While most of the plagiarism I dealt with came from undergrads, I have come across it when reviewing papers as well. Again, it was easy to detect: most of the paper was written very badly, apart from two or three paragraphs. And again, a few seconds' work on Google was enough to find the original source. Needless to say, the paper was rejected. Since it was only a conference paper, I doubt that there were any repercussions for the authors.
As far as student plagiarism is concerned, I agree that it is better to spend the time setting assessments that can't be plagiarised. The one course I taught that never had a problem with plagiarism was my fourth-year computational intelligence course. That is likely partly because the students were highly motivated, honours-level students, but it was also because of the nature of the lectures and assessment. Rather than me giving lectures twice a week, students took turns researching and presenting a topic. There was a list of permissible topics for each week, so that the presentations followed the curriculum I had set out for the course; the students got support in researching their talks, and I went over each presentation before it was given. The practical work was entirely project-oriented, where again the students selected a project that interested them. This worked very well: it taught the students valuable skills and left no scope for plagiarism. I wonder, though, how well it would work for third- or even second-year students.
Perhaps a more important question is, why do students plagiarise? If we could answer that question, could plagiarism be eradicated? Or would there always be some students who are simply so desperate (or so unable / unwilling to do the work) that they will always plagiarise?
Labels:
research craft,
teaching
Wednesday, August 10, 2011
Teaching computational intelligence
Mengjie Zhang at Victoria University of Wellington discusses his experiences teaching computational intelligence in this article in the IEEE Computational Intelligence Magazine (access depends on your institution). What he describes seems like a fairly logical course structure. I thought I'd add my own experiences teaching computational intelligence at the University of Otago several years ago, to provide an alternative course structure.
The course I taught was a required course for third-year honours students in the Department of Information Science. It ran over one semester per year, and I taught it from 2000 to 2003. I usually had between 15 and 30 students, with numbers a little lower towards the end of my time teaching as the collapse in Information Science enrolments started to bite. In addition, I usually had one or two students from other departments, usually biochemistry, who found what I taught particularly useful.
The course was divided into five sections: data processing; rule-based and fuzzy rule-based systems; artificial neural networks; evolutionary computation; applications of computational intelligence. There were two one-hour lectures and one two-hour lab session per week.
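To give a flavour of the evolutionary computation section, the kind of algorithm covered there can be sketched in a few lines. This is an illustrative Python sketch of a (1+1) evolutionary algorithm on the classic OneMax problem, not actual course material (the labs used MATLAB):

```python
import random

def one_plus_one_ea(fitness, length=20, generations=1000, seed=42):
    """Minimal (1+1) evolutionary algorithm over bit strings.

    Each generation, flip each bit of the parent with probability
    1/length; keep the child if it is at least as fit as the parent.
    """
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(length)]
    best = fitness(parent)
    for _ in range(generations):
        child = [bit ^ (rng.random() < 1 / length) for bit in parent]
        score = fitness(child)
        if score >= best:
            parent, best = child, score
    return parent, best

# OneMax (maximise the number of 1-bits) is the usual first exercise.
solution, score = one_plus_one_ea(sum)
```

Even this toy version shows the core ideas the section built on: representation, mutation, and selection pressure.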
The overall focus of the course was answering the question "What is computational intelligence and how do I use it to solve problems?". To this end, a large part of the course was focused on a small group project (two or three students per group) worth 30% of the final course grade. Students had to select a problem and data set, analyse the data, build an intelligent model to solve the problem the data related to, and finally build a small prototype piece of software that solved the problem. The structure of the project was inspired by a survey of employers, commissioned by the Information Science department, which found that employers wanted graduates who could:
- work in a group
- write coherent reports
- give effective presentations
The material presented in the lectures covered the relevant algorithms and techniques from both theoretical and practical perspectives: how the algorithms work, and how they can be applied to solving problems. The theoretical aspects were reinforced by ten weekly problem sets, each worth 2% of the final grade. The practical aspects were reinforced by the weekly laboratory sessions, which used MATLAB with the relevant toolboxes and were largely aimed at giving the students the skills and knowledge they needed for the project work.
The final assessment component was an exam worth 50% of the grade. I would have liked the exam to be worth a bit less than that, but the University regulations at the time prevented it.
Overall, the students were very happy with the course: they found it well-organised, interesting and useful. At least one project group even managed to publish their project work at an international conference.
The lectures that I presented for this course are available here. At some point, I will make the laboratory and assessment material available as well.
While I enjoy my current research job a great deal, I do find myself missing teaching, and would like to return to it one day.
Labels:
teaching
Wednesday, June 29, 2011
Teaching Materials Online
I have just made lecture materials from my undergraduate computational intelligence course available online. The lectures cover rule-based systems, fuzzy logic, artificial neural networks, evolutionary algorithms and hybrid systems. The lectures are available at: http://mike.watts.net.nz/Teaching/
These lectures were presented in the course INFO 331, Intelligent Information Systems, during my time at the Department of Information Science at the University of Otago, New Zealand. Also available at the above address are lectures I presented for the course INFO 233, Data Processing.