SUMMARY 

 

PICTURES, COMPUTERS AND STORAGE SPACE

In this article the author discusses some of the possibilities and limitations for storing and using digitized black-and-white pictures. It is necessary to establish precisely how one plans to make use of such a picture base, since digitized images demand large storage capacity. When a picture is scanned into the computer, it is converted into a number of points, each of which is given a specific grey tone value. This information is stored as a number, and if the scanner distinguishes between 16 different tones of grey, we need 4 bits of storage capacity to store the information about each point.

However, 16 grey tones would still make a rather crude image, with sharp transitions, very unlike the even gradations we are accustomed to in photographs. Twice the storage capacity, 8 bits of information per point, would give us 256 grey tones, which is sufficient for all practical purposes. The size of each point - or the number of points per inch - is also of importance. A good standard computer screen today contains about 70 points per inch. This means that one picture of 8 by 11 inches with 256 grey tones demands a storage capacity of 3,449,600 bits or 431,200 bytes. That sounds like a lot, but if we double the number of points per inch, we quadruple the amount of digitized information. The picture would then demand more than 1.6 megabytes of storage space.
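
The arithmetic behind these figures can be sketched in a few lines of Python (the function names are our own; the numbers simply reproduce the example above, so this is purely illustrative):

import math

def bits_per_point(grey_tones):
    # Bits needed to store one point with the given number of grey tones.
    return math.ceil(math.log2(grey_tones))

def storage_bytes(width_in, height_in, points_per_inch, grey_tones):
    points = (width_in * points_per_inch) * (height_in * points_per_inch)
    return points * bits_per_point(grey_tones) / 8

print(bits_per_point(16))                      # 4 bits for 16 grey tones
print(storage_bytes(8, 11, 70, 256))           # 431200.0 bytes = 3,449,600 bits
print(storage_bytes(8, 11, 140, 256) / 2**20)  # about 1.6 megabytes at doubled resolution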

The practical problems of storage capacity demand a thorough analysis of how the picture base is to be used - now and in the future. If the pictures are to be used as a catalog and viewed on the computer screen, a resolution of 70 points per inch will do nicely. If the pictures are to be printed, or enlarged after scanning, one may want a higher resolution. For analog-to-digital conversion the author also applies Nyquist's sampling theorem, which predicts that for the best results you should scan the picture at twice the resolution that you plan to use. To print the picture at 70 points per inch, scan it at 150 points per inch. The results are shown in the photo illustrations that accompany the article.

The author also shows how the digitized picture can be manipulated. Making rather unsophisticated use of available software, he has increased the contrast of the photograph by making use of a higher number of grey tones than the picture originally contained (Fig. 2), or by taking out part of the picture and putting in a different face - a nice piece of forgery which the author says took 30 seconds to accomplish. What if he really wanted to falsify the photograph? He concludes that the new possibilities may discredit the photograph as evidence material.
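
The article does not say which program was used, but a simple linear contrast stretch of the kind described could look roughly like the following Python sketch (a hypothetical illustration, not the author's software):

def stretch_contrast(pixels):
    # pixels: list of rows, each a list of grey values in the range 0-255.
    # Spread the grey tones actually present over the whole 0-255 range.
    flat = [p for row in pixels for p in row]
    lo, hi = min(flat), max(flat)
    if lo == hi:                 # a completely flat picture cannot be stretched
        return pixels
    scale = 255 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in pixels]

# A picture that only uses the grey tones 100-150 gains far more contrast:
print(stretch_contrast([[100, 110], [140, 150]]))   # [[0, 51], [204, 255]]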

FOTOMAC IN A NEW AND IMPROVED VERSION.

Fotomac is a picture-storage program for the Macintosh, developed largely by the author. A prototype of the program was described earlier in HD. An early version of Fotomac has been tested at Oslo University Library for storing and cataloging a large collection of pictures taken by or showing Fridtjof Nansen. In the test period the library built up a catalog of some 400 pictures. On the basis of the test results, the 4,000 finest pictures, about one half of the Nansen collection, will be digitized and stored together with relevant information. This work is carried out in cooperation between Oslo University Library and NCCH.

In this article the author presents Fotomac version 1.0, developed in HyperCard 2.0 and considerably improved over earlier versions:

 - Catalog cards may be of any size, up to 1280 by 1280 points.
 - The system for displaying grey tones is incorporated in HyperCard and is more robust than the system previously used.
 - Fotomac employs pre-defined print-out forms.
 - The new version has improved menu control.

Fotomac v. 1.0 separates the data base completely from the user screen. The search panel was separated from the data base in the earlier versions too, but there is now a separate form for data input and alterations. This provides improved protection for the stored data, and means that the data need not even be stored in HyperCard: the data base may reside on a Unix server, while the user would be presented with the same HyperCard data/picture cards as today. It will also be possible to use logical operators and linked searches: Fotomac will allow search terms to be combined with the standard operators AND, NOT and OR. Greater-than/less-than searches will also be implemented, which will be useful when the date of a picture is of importance.
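
As a rough illustration of what such linked searches could do (this is not Fotomac's actual query syntax, and the records are invented), a search combining logical and comparison operators might work like this:

# Invented catalog records, for illustration only.
records = [
    {"title": "Nansen on skis",  "photographer": "Unknown",   "year": 1895},
    {"title": "Fram in the ice", "photographer": "Johansen",  "year": 1894},
    {"title": "Nansen portrait", "photographer": "Szacinski", "year": 1897},
]

# "photographer NOT Unknown AND year greater than 1894"
hits = [r for r in records
        if r["photographer"] != "Unknown" and r["year"] > 1894]
print([r["title"] for r in hits])   # ['Nansen portrait']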

DOCUMENTATION PROJECT AT THE FACULTY OF THE ARTS, UNIVERSITY OF OSLO

The Faculty of the Arts at the University of Oslo has recently decided to carry out the first year of a six-year project of documentation. The goal is to collect all the material of the various archives and collections within the faculty in one data base. The archives will be transferred to modern electronic media, and will be made available to all users.

The project has been in the pipeline for some years, and in April 1990 the author was given the task of surveying all the archives and collections within the Arts Faculty, establishing the character, physical condition and size of the various archives in preparation for a comprehensive plan for the project. In February 1991 the decision was made to go ahead with the project.

The project is large: the goal is to register and make electronically available the information today registered in some 14 million documents and catalog cards. Registering the data alone is estimated to take several hundred man-years. It is also a demanding undertaking in professional terms. The material includes for instance the collections of the University Museum of National Antiquities, folk music collections and the archives of the lexicographic departments.

The article describes the various archives and collections, and presents examples of the types of cards and documents involved. They contain various kinds of information collected by both professionals and laymen over several decades, and to do the material justice will be an exacting task. The archives of the Department of Dialectology, for instance, contain voice recordings together with phonetic transcriptions, which may now be entered in a combined sound/text data base, enabling researchers not only to study the dialects but also to compare recordings and transcriptions.

Another large archive is the collection of some 30,000 photographs at the Department of History. These may now be digitized and combined with textual information, creating a valuable resource for research and education.

The author also discusses various technical solutions. The standard set-up is to store all data in one central computer unit which serves the users and runs all searches. This is a tested and practicable solution, but may easily become a bottleneck at peak user periods, since not just data but also displays have to be generated by the central unit. Local computer power is only used to gain access to the central base.

The author suggests that a better alternative would be to install a common communications program both in the central unit and in the user computers. Searches and administration would still be carried out by the central unit, but displaying and organizing the data would be done by the local unit. Such a division of labour would radically diminish network traffic. It is further possible to distribute the various data bases - the actual discs - to a number of physical locations, while the whole base could still be organized as one logical unit, and be experienced as such by the user, even though a single search might draw on data stored in several locations. With this kind of network, a breakdown in one part would leave the rest of the network intact.
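
A minimal sketch of the suggested division of labour, with invented record data (not an implementation of the project's software): the central unit answers queries with raw records drawn from several physical stores, while the user's own machine organizes and formats the result.

def central_search(matches, shards):
    # The central unit runs the search across several physical stores
    # ("shards"), but presents them to the user as one logical base.
    return [rec for shard in shards for rec in shard if matches(rec)]

def local_display(records):
    # The local machine sorts and formats the result; only raw records
    # cross the network, which keeps traffic down.
    for rec in sorted(records, key=lambda r: r["year"]):
        print(rec["year"], "-", rec["title"])

shards = [
    [{"title": "Census card, Bergen", "year": 1900}],
    [{"title": "Dialect recording, Voss", "year": 1952}],
]
local_display(central_search(lambda r: r["year"] < 1960, shards))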

The purpose of the project is, ultimately, to make all of the material more easily accessible to the users. At the present stage the main objective is to register the information faithfully, in order to preserve the information contained in the original material. For years to come the material will be of interest to researchers, teachers and students, and will be accessible both in Norway and in other parts of the world via international electronic networks.

THE MULTIMEDIA LABORATORY AT THE FACULTY OF ARTS, UNIVERSITY OF TRONDHEIM.

In January 1990 one of the three language laboratories at the Faculty of Arts of The University of Trondheim was replaced by a multimedia laboratory. The equipment consists of 9 IBM PS/2, model 55SX X61, 1 IBM PS/2, model 70 121, and 1 IBM AT with a frame grabber for interactive video. There are also 5 Macintosh computers: 1 Mac II, 1 Mac IIcx, 1 Mac SE/30, 1 Mac II fx 4/160 and 1 Mac II fx/80.

The two largest Macintosh machines have video frame grabbers and are connected to a scanner and to CD-ROM and video disc players. Four of the IBM machines have MIC-4000 frame grabbers for real time digital video and are connected to Sony Laservision LDP 3600D video disc players. In addition one of the IBM machines is connected to a Pioneer LD4100 video disc player. All computers have ethernet cards and are connected to the University's Vax machine via the local network.

The laboratory also has a CD player for video and sound, assorted sound equipment, a VHS video player with digital picture, and a satellite receiver and dish for receiving international television transmissions. The laboratory has been selected as an Apple Multimedia Centre.

The multimedia laboratory is used for educational purposes by several departments at the Arts Faculty, but it is primarily intended as a research laboratory for staff and post-graduates at the Arts Faculty. The laboratory is intended to "stimulate, initiate, coordinate and assist the use of multimedia in education and research" in the humanities.

Hence, the lab is today used for projects, courses and seminars for staff, graduate and post-graduate students. It is important to raise the level of competence in all of these groups. The author includes the course plan for the spring of 1991, to indicate the variety of courses possible in the new surroundings.

ELECTRONIC INFORMATION DISTRIBUTION FROM THE NORWEGIAN COMPUTING CENTRE FOR THE HUMANITIES (NCCH)

The Norwegian Computing Centre for the Humanities runs several electronic network services. The host machine for these electronic services is a Sun 3/60 called NORA, or in full, NORA.NAVF-EDB-H.UIB.NO, with IP address 129.177.24.42.

There is also HELMER, a Sun 386i with IP address 129.177.24.41. Helmer:NAVF-edb-h is an AppleShare file server in the EtherMac zone at the University of Bergen. This service is only accessible via Macintosh nets connected to the broadband network at the University of Bergen. There are two volumes that are open to everybody: GjesterLes and GjesterSkriv (GuestsRead/GuestsWrite). Information about the services available on Helmer is given in Les Meg (Read Me) in GjesterLes.

NCCH also makes use of NOBERGEN.BITNET, the IBM 4381 at the University of Bergen. Here NCCH has installed the file server FAFSRV@NOBERGEN.BITNET to serve people interested in machine-readable English texts from ICAME (International Computer Archive of Modern English). This information is also found on NORA, which will gradually replace FAFSRV.

HUMEDB is an electronic mail address behind which lies a list of other addresses. All incoming mail is sent on to all the addresses on the list (about 80 by March 1991). HUMEDB is dedicated to the topic of computing in the humanities, and anybody may be entered on the mailing list; this is done manually by the administrator. The full name of the mailing list is HUMEDB@NAVF-EDB-H.UIB.NO, and the address of the administrator is HUMEDB-ADM@NAVF-EDB-H.UIB.NO.

NCCH will announce its own activities on HUMEDB, and also send out other information. The participants are encouraged to make use of this e-mail service for opinions, queries and answers.

Those who wish to make use of this service need an electronic address in a host computer connected to Uninett, Internet, Bitnet, Janet, etc. In many institutions it is possible to read and write electronic mail on your own PC or Mac, using an "invisible" Unix machine as mailbox. This service is free of charge.

File servers on NORA contain general information about and from NCCH, articles, information from other sources and software. There are two ways to gain access to the NORA files: The first one is through the file transfer program FTP. The machine permits "anonymous FTP".

 

Access: FTP NORA.NAVF-EDB-H.UIB.NO
User: Anonymous
Password: Your name
Commands:
 CD PUB   To go to main catalog
 DIR   To see file list
 CD or CD ..   To change directory
 GET   To fetch a file
 BINARY   To set binary transfer mode

Those who wish to make use of this service need access to a machine with a so-called TCP/IP connection. This may be a Macintosh or a PC with NCSA Telnet/FTP, HyperFTP or a similar program installed, or it may be a host machine accessed via a serial or modem connection. In the latter case the files must first be transferred to the host machine via FTP and then on to the local PC or Mac with Kermit or a similar program.

The second mode of access is mail-based file transfer. NAVFSERV@NORA.NAVF-EDB-H.UIB.NO is a mail-based file server on NORA. Some of our readers will be familiar with INFO@UNINETT.NO, which uses the same commands as NAVFSERV. Parts of the information available through FTP will be available through NAVFSERV, but binary files cannot be fetched directly through NAVFSERV. For more information, send an e-mail message with the subject HELP.

The file servers today contain the following main catalogs:

 ICAME   Information related to International Computer Archive of Modern English
 INFO   Information on texts, projects, bibliographies
 KONFERANSER   Updated list of conferences
 MAC   Macintosh software
 NAVF-EDB-H   General information about and from NCCH (in Norwegian)
 HUMEDB   Backlog from HUMEDB (in Norwegian)
 OLUFF   the periodical OLUFF (in Norwegian)
 HD   Articles from Humanistiske Data (in Norwegian)
 NCCH   Information on NCCH in English
 NETTINFO   Information about network services
 PC   Software for PC
 UNIX   Software for Unix computers

Coordinator for the electronic services: Knut Hofland, KNUT@NAVF-EDB-H.UIB.NO and FAFKH@NOBERGEN.EARN.
NAVFs edb-senter for humanistisk forskning, Postboks 53 Universitetet, N-5027 Bergen.
Telephone +47 (0)5 21 29 54/55/56. Telefax +47 (0)5 32 26 56.

MOZART AND BEETHOVEN IN HYPERCARD

CDs may be controlled by a computer in the same way as a video disc. Warner New Media and The Voyager Company have published a number of products combining CD sound, HyperCard computer displays and Macintosh sound. NCCH has acquired the first of these products from each publisher: Mozart's The Magic Flute and Beethoven's 9th Symphony. The products naturally share a good many features, but there are also differences in the way the material is presented.

The Magic Flute comes on three CD-ROMs and a diskette. The CDs contain the installation program and HyperCard stacks, and extra sound tracks used as musical examples for the running commentary that accompanies the opera. In addition there are MIDI data for those who have CD players with MIDI output, and there are graphics for CD+G players. The graphics can be shown on a television screen connected to the CD player, but to date the author has not seen a CD+G player and has no idea of how this works. The HyperCard stacks (both those on the CDs and the separate one from the diskette) also have built-in control codes for a video disc player, which may be used for connection to the video disc version of Bergman's filmed Magic Flute.

The installation and use of this product appears an unnecessarily cumbersome operation: you need to copy about 15 MB of HyperCard stacks from the CDs onto your hard disc - less if you don't wish to play the whole opera at once. If you have the storage space, fine, so far. However, you also have to use a specific home stack which contains the necessary drivers for the CD player. The directions suggest that you give your original home stack a new name and call the one that comes with The Magic Flute "Home". A better solution is to copy the HyperCard program into the folder where the rest of The Magic Flute stacks are and use it with the dedicated home stack there. However, if you have several copies of HyperCard on the disc, you have to be careful when opening stacks by double-clicking on them: you have no guarantee which version of HyperCard will be started.

The author finds that even this is a problem he can live with, but he does wish HyperCard 2.0 had existed when The Magic Flute stacks were written, so that there would be no need for a dedicated home stack.

Once the program is started, you may choose between four different sets of running commentary: English or German libretto, description of the action, or music commentary. The options are available while the opera is playing, too. While the CD is playing, buttons appear indicating hypertextual links to more background information. Since the CDs are specially made, commentary and additional musical examples have been added, including the dubious pleasure of Florence Foster Jenkins attempting the dramatic heights of the Queen of the Night's aria, a recording from a 1940s concert in Carnegie Hall.

Beethoven's 9th Symphony consists of one ordinary CD and two HyperCard stacks. In this case there is no need for a special home stack, but the author reports certain problems with the fonts and with playing the recorded digitized sound when he tried to convert the HyperCard stacks to version 2. He found it better to keep version 1.2.5 on his hard disc.

In The Magic Flute the material is presented in such a way that you are supposed to listen to the music in proper sequential order, and then make use of the hypertextual links when there are interesting new routes to follow. This is only one of the options in Beethoven's 9th Symphony - the option "A close reading", which incidentally has a user interface that functions very much like that of Mozart's Magic Flute. The other options are reading an essay on Beethoven's life and music, or another essay on music theory and an analysis of the 9th Symphony.

The program surrounding Beethoven's 9th Symphony was largely created by one man, musicologist Robert Winter of UCLA. The author finds this presentation simpler, more direct and more effective than The Magic Flute, which was created by a team.

Now, how useful are these for musical education? The author admits to being an amateur, a music-lover who has enjoyed both products, and feels certain that his music teachers in Upper Secondary school would have found these programs excellent teaching aids.

INFORMATION TECHNOLOGY FOR THE HUMANITIES: TRENDS AND NEEDS. SEMINAR IN BERGEN 5 OCTOBER

This seminar was part of the ongoing evaluation of the Norwegian Computing Centre for the Humanities (NCCH). Director Hauge welcomed the approximately thirty participants and expressed the hope that the seminar would lead to a clearer understanding of the situation for computing in the humanities in Norway, as well as of the needs for services from NCCH. The participants came from the computer sections and various departments of the arts faculties of the four Norwegian universities, and had very different ways of understanding the situation of computing in the humanities.

Computing in the humanities may be in its infancy: there is still a certain inertia among teachers and researchers when it comes to making use of computer methods, but as the equipment becomes faster and more powerful, as well as cheaper, smaller and lighter, more and more people have started using computers for other jobs than just word processing. Cost is still an obstacle, since these milieus traditionally have demanded little in the way of equipment. However, one of the participants was convinced that the various departments and subjects would soon demand modern laboratories for research and teaching, and that the need for equipment would increase rapidly.

A number of general questions relating to computing in the humanities were raised: How far can humanists influence the development of machinery and software? Traditionally the complaints have concerned the lack of funding: now it appears to be a larger problem that humanists have no say in the development or definition of standards. The development is driven by commercial interests, and teachers and researchers in the humanities have to adapt the available equipment to their needs, rather than participate in developing humanist systems. The universities and the NCCH were challenged to address this problem.

Another problem is recruitment. It was considered important that NCCH make its existence more strongly felt, for instance by demonstrating machinery and software to stimulate interest in making use of the technology in research and education. But both students and staff need "crash courses" in computing. There are students who are interested in using computer methods for their theses, and professors who are reluctant to accept new methods simply because they don't know enough about them. The average staff member at the universities has a computer which he uses for word processing and very little beyond that. He makes his transparencies by hand and knows no program for computer-assisted teaching in his field. This person needs to be made aware of the possibilities of computing. Another speaker felt that the students were overrated: many were completely illiterate in computing. They would need education and training in computer use, too, and NCCH was in a position to do something for both groups.

Several of the participants felt that NCCH ought to become a national competence center within multimedia (including hypertext/hypermedia/interactive video) and picture processing. It was also suggested that NCCH do more within the field of sound and speech processing. On the other hand, pedagogy might not be a primary field for NCCH work.

There was general agreement about the role of NCCH in information dissemination. Several participants were interested to learn that NCCH has started a more active use of electronic information services. More people now have the necessary equipment and are in a position to make use of these services, which will make it possible to mail queries and answers to "public" electronic mailing lists. It was also suggested that scholars should be given the opportunity to work at NCCH for a period, to learn about computing and bring the competence back to their work places.

Some people felt NCCH should do more to develop products and services, such as digital maps, data bases, bibliographic bases and Norwegian computer texts.

In the years to come, the humanities will develop a need for machinery and equipment comparable to that of the natural sciences. The students must also be included in this development: they will need education and access to equipment. One participant wanted development work done on a "history work station", where the researcher could work with different types of data and different types of software - better software than what is currently on the market. Hypermedia programs might do part of the job for the historian, provided the software for material analysis is improved. There should be systems that will tolerate a high degree of deviation: today's systems are not sufficiently tolerant for historical material.

Language teaching is a field where it is possible to foresee fundamental changes due to the new computer technology. One of the participants believed that within a ten-year period working methods and forms of evaluation would have changed radically, liberating teachers from many routine tasks. Perhaps it will be possible for students simply to hand in their exam "papers" on diskettes and for the teacher to have the machine suggest a preliminary evaluation.

The old-fashioned language laboratories are another problem: a student may well spend a lesson drilling the wrong pronunciation, because the teacher still only has minutes to spend on each student. Software is available that will check and correct the pronunciation of the student in comparison with the voice of the teacher or a native speaker, and there are also systems that will give students who are hard of hearing or deaf a visible indication of whether vowels are pronounced correctly.

One person felt that a "good NCCH" would be a type of interface where competence in computing met with competence in the humanities. Today, NCCH is too heavy on computing and too light on competence in the humanities.

Several people wanted to define the role of NCCH as that of the "locomotive" which would develop expert competence within a field and then pass the competence on to others, such as the computing sections at the arts faculties of the universities, and move on to other fields.

In conclusion, director Hauge said it might well be necessary for NCCH to reduce the number of current tasks, in order to strengthen the best parts of the activity.

SEMINAR ON HISTORICAL RECORD LINKAGE

This seminar was arranged at the National Archives in Oslo 29-31 October 1990, for 16 invited participants.

Nominative historical data are source data, such as censuses, church registers and emigration protocols, where individuals may be identified by name. Linking means combining information from different sources in order to reconstruct the life of an individual or to find out what happened to a group of people.

The seminar took its starting point in work done in Norway, but the main purpose was to give the participants new impulses and information about the work being done abroad. Special guests were Hans Christian Johansen, Dept. of History, Odense University, and Gérard Bouchard, director of SOREP, Centre Universitaire de Recherches sur les Populations, Université du Québec à Chicoutimi. SOREP has developed advanced techniques for automatic linkage.

One of the problems involved in linking information from various sources, such as a church register and a census, is the unexpectedly high degree of error in these records: the name of one person may be given differently in the various records, or in different entries in the same record. One of the seminar participants gave examples from her own work: the name "Hansdatter" had been entered as "Monsdatter"; a girl's name, "Johanne", had been entered instead of "Johannes", the boy's actual name; the wrong person was registered as buried; ages were given wrongly; individuals or even whole families were entered twice in the same census.

Linkage may be carried out manually, and historians have long been doing that. Linking by the use of computers is a more recent option, with two possible strategies. In interactive linkage the computer analyses the material and suggests links, while the operator decides which links are to be established; interactive linkage is therefore more akin to manual linkage, sharing most of its strengths and weaknesses. Automatic linkage means that the computer makes the links on the basis of a set of criteria.

Errors are likely to occur both in automatic and manual linkage. In general there are two types of errors. Overlinking means to establish more links than the material contains: information about several people may be put together as relating to one and the same person, creating inauthentic lives. Underlinking means that links that should have been established are left out, and the representational value of the material is not fully appreciated.

Linkage as a method has become more interesting in recent years, because historians now take more interest in the individual. The alternative to linkage is to analyze the source material at a higher level of abstraction, which is still a common research method. But linkage is also a source of information in other fields of study: epidemiologists and geneticists are also concerned with the lives of individuals and families.

In automatic linkage one establishes a set of criteria for the linkage process and lets the computer carry out the work: a number of fields may be selected, such as name, age and sex, and the program will compare and evaluate the similarity between record entries. The result is given as a score - a high score most likely indicating that the entries in fact refer to the same individual. It is important to establish a common cutoff point, a standard for what is deemed a legitimate link between record entries.
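
A toy sketch of the scoring idea in Python (the fields, weights and cutoff are invented for illustration; SOREP's actual techniques are far more elaborate):

WEIGHTS = {"name": 3, "sex": 1, "birth_year": 2}
CUTOFF = 4   # scores at or above this level are accepted as links

def score(a, b):
    # Compare selected fields of two record entries and sum a similarity score.
    s = 0
    if a["name"] == b["name"]:
        s += WEIGHTS["name"]
    if a["sex"] == b["sex"]:
        s += WEIGHTS["sex"]
    if abs(a["birth_year"] - b["birth_year"]) <= 1:   # tolerate small errors
        s += WEIGHTS["birth_year"]
    return s

census = {"name": "johannes hansen", "sex": "m", "birth_year": 1851}
church = {"name": "johannes hansen", "sex": "m", "birth_year": 1852}
print(score(census, church), score(census, church) >= CUTOFF)   # 6 True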

In order not to have to compare all possible connections between two sources, the material may be organized in advance, say by grouping name variations into standardized forms. The article gives examples of such standardization: names as different as Ambret, Engebreth, Ingebret and Ingebrigt may be grouped as one name, because studies have shown that the actual variations in entries of the same name are at least this extreme.
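
A minimal sketch of such standardization, using the name variants mentioned in the text (the grouping table would of course be built by the researcher):

STANDARD_FORMS = {
    "ambret": "INGEBRIGT",
    "engebreth": "INGEBRIGT",
    "ingebret": "INGEBRIGT",
    "ingebrigt": "INGEBRIGT",
}

def standardize(name):
    # Map a spelling variant onto its standardized form before comparison.
    return STANDARD_FORMS.get(name.strip().lower(), name.upper())

print(standardize("Engebreth") == standardize("Ingebret"))   # True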

Bouchard recommended material of a certain size in order to optimize linkage projects. It is best to work with entire populations, counties or regions. Smaller areas, like parishes, can provide a suitable basis, but people who move will create problems for such limited investigations, so he suggested working on at least several parishes in order to improve the value of the linkage project.

The advantage of automatic over manual linkage is the fact that the criteria will not change in the course of the project. With manual linking it is not always possible to work with uniform consistency over a long period of time - the criteria for similarity may change.

As a check on the results of the automatic linkage, it may be advisable to reserve certain fields of information, such as age at marriage and death, for use against the findings afterwards. Bringing too many criteria into the original linking will probably result in underlinking, i.e. reducing the representative value of the material.
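
For instance, a reserved field such as age at marriage could be compared across the linked entries afterwards; a toy sketch with invented field names:

# Each pair below is a link proposed by the automatic linkage; the field
# "age_at_marriage" was kept out of the linkage criteria and is only used
# afterwards to flag suspect links.
links = [
    ({"id": "census-17", "age_at_marriage": 24},
     {"id": "parish-03", "age_at_marriage": 24}),
    ({"id": "census-41", "age_at_marriage": 31},
     {"id": "parish-88", "age_at_marriage": 22}),
]

suspect = [(a["id"], b["id"]) for a, b in links
           if abs(a["age_at_marriage"] - b["age_at_marriage"]) > 2]
print(suspect)   # [('census-41', 'parish-88')] - this link should be re-examined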

The seminar showed that it is possible to develop computer systems that will make good use of historical material, but building a good, practicable system like the Canadian one at SOREP is expensive. The best way to secure funds is to bring in other fields of research, such as epidemiology and genetics, as early as possible.

ONLINE 90, LONDON, 11-13 DECEMBER 1990

The 14th International Online Information Meeting was arranged by Learned Information in London, 11-13 December 1990. The conference drew some 2,400 participants, and the exhibition was seen by almost 8,000 people. The conference itself was divided into parallel sessions for English, French, German, Italian, Dutch and Spanish language users.

One of the main points of the conference was the relationship of online data bases to optical storage media: D. Raitt opened the conference by stating that online is no longer what it once was, because so many of the online data bases are now available on CD-ROM. One speaker discussed the practical features of the ideal CD-ROM work station of the 1990s.

Another aspect of the online data base industry's relationship to the market is a new demand for quality. There is now more criticism and an increasing demand for dialogue between users and producers, which is in itself an indication of the importance of the online industry.

One speaker criticized the user interface, and suggested that a few thought-out changes might make the data bases far more accessible to the users. His conclusion was that the online tool has not been good enough.

INTERNATIONAL SEMINAR ON RESEARCH DOCUMENTATION

This seminar in Bergen 7-9 February 1991 was arranged by the Norwegian Computing Centre for the Humanities and the Committee for Research Documentation under the Joint Board of the Norwegian Research Councils.

Research documentation is becoming increasingly important, also as a field of international cooperation. At the seminar in Bergen, speakers from 11 European countries presented the various national approaches to the problem of research documentation, which varied from absolute minimum solutions to some very advanced and sophisticated systems.

The conference speakers discussed a wide range of topics: the purpose of research documentation; the function of such a service for the various user groups; targeting and defining the user groups; standards for documentation information; copyrights; user support.

One of the most interesting novelties presented at the conference was the CERIF manual developed by the EEC as a suggestion for a common European format for research documentation. The work leading up to the production of the manual was presented by I.R. Perry from the Directorate General for Science, Research and Development under the Commission of the European Communities.

The point of such a manual is to standardize data bases so that it will be possible to make all European research documentation available throughout Europe. There are problems that must be solved before such international use of the data bases is possible, but many people are interested in establishing such international connections. For that reason many of the participants spoke warmly for a follow-up seminar in the not too distant future.

CENTRE FOR INFORMATION ON RESEARCH PROJECTS

In 1990 the Committee for Research Documentation under the Joint Board of the Norwegian Research Councils decided to start a pilot project to transfer information on all research projects to one common data base. The entry of the data was completed in the autumn of 1990.

The administration of the data base is located at the Norwegian Computing Centre for the Humanities in Bergen, and the data base makes use of an IBM 4381 computer belonging to the University of Bergen. The base employs a search system for free text and is available 24 hours a day via DATEL, DATAPAK and UNINETT. Limited assistance is available between 8 a.m. and 10 p.m., and full assistance, including search assistance, is available between 8 a.m. and 4 p.m. In addition there is a detailed manual, so the availability is very good. All services are provided free of charge.

Data are transferred from all the Norwegian research councils, and even though there are differences in quality and the amount of information in the documents, there is still a common core of information.

By 1 January 1991 the data base contained information on a total of 25,278 projects. These projects cover research on everything from virginity rituals in the South Pacific and choral singing to astrophysics and protection of the environment. It is therefore possible to carry out interdisciplinary searches and find areas of common interest to many fields of research. For example, a search for environment, algae and radiation showed that these topics were represented in a number of research projects. This search, which took a couple of minutes, came up with 1,380 documents on environmental research. It would probably have been impossible to find all this information in any other way than through an electronic data base.

COMMITTEE REPORT: NEW TECHNOLOGY TO BE IMPLEMENTED AT THE FACULTY OF ARTS AT THE UNIVERSITY OF BERGEN

A committee established by the Council for The Faculty of Arts at the University of Bergen presented its report in December 1990. The committee was to propose a plan for the future use of computers in the Arts Faculty and the organisation and function of the Faculty's Computing Section.

The committee's main proposals are:

1.   Computing in the Arts Faculty should not be limited to a separate section.
2.   The faculty needs computer expertise separated from the individual departments.
3.   The Computing Section should remain a service and training section and should at all times relate its activities directly to the demands of the users.
4.   The Computing Section of the Arts Faculty should not develop competence that is already available outside the institution.

The committee also suggests that the Computing Section should continue to give courses in computing in the humanities, and that it should take over responsibility for the faculty's language laboratories.

There is also a discussion of the responsibilities of the Computing Section in relation to the Norwegian Computing Centre for the Humanities. The committee sees possibilities for a division of labour between the two institutions. The Computing Section should seek to cooperate with NCCH in fields where NCCH has expertise, such as multimedia.

DISCUSSION OF INTERACTIVE VIDEO IN LANGUAGE TEACHING

On Friday 14 December 1990, NAVF researcher Signe Marie Sanne held a discussion seminar on the use of interactive video in language teaching at Høyteknologisenteret in Bergen. About twenty college and upper secondary school teachers participated.

The starting point for the discussion was a presentation of Sanne's project in interactive video for Italian language teaching. This project has created an integrated work station where the student is primarily intended to work on his own. A fellow pedagogue, Ingrid Hatlestad, was specially invited to present her computer program which enables the teacher to check what use the student makes of the system.

The discussion was concerned with the pedagogical uses of this teaching system, whether the teacher should concentrate on the practical use of the language or move on to extralinguistic factors, cultural elements, etc.

Another question concerned the possibility of working out exercises to be used in connection with the material on the video disc, multiple-choice exercises, etc. Basically it is only the teacher's imagination that limits the number of possible computer-assisted learning exercises that may be appended to this material.

Then there was the question of whether the program should record the student's voice for comparison with a model pronunciation. Technically this presents no problem at all.

The participants in the seminar saw the advantages of giving the students easy access to a great deal of material. They felt it was more likely that a student would look up a word in a dictionary on the screen than in one on the desk.

READER SURVEY ON HUMANISTISKE DATA

Last autumn we asked our 474 personal subscribers how they felt about Humanistiske Data. Unfortunately only 114 completed forms were returned, and that response rate is a little too low to make the answers fully representative. However, there are also other reasons why it is problematic to draw any substantial conclusions from the answers. The answers indicate that the readers represent a wide variety of subjects and interests, and that they would like to see more material in Humanistiske Data from their own field.

One of the questions was:

Which topics and types of material are you most interested in?
Choice                                                    No. of answers   Percentage
Computing in the Humanities in Scandinavia                      69             63%
Theoretical articles about computing in the humanities          65             60%
Presentations of projects                                       62             57%
Reviews of software                                             59             54%
Computing in the Humanities in the rest of the world            50             46%
Computing in education                                          46             42%
New products                                                    45             41%
Computing in cultural and research dissemination                38             35%
Book reviews                                                    35             32%
Optical storage media                                           33             30%
Multimedia                                                      33             30%
Work at NCCH                                                    30             28%
Technical articles                                              21             19%
Computing in library work                                       11             10%

We interpret the answers to mean that we should continue to report on a wide variety of subjects. We will continue to present reviews of machines and software, and reports on projects and events. We will try to bring material on the subjects on the "wanted" list.

BUT THEN THERE IS THE FORM in which we should report. Obviously, the readers' evaluation of Humanistiske Data is mixed. Some say the periodical is good; some say it is boring, with old-fashioned layout and the wrong typography. Many ask for debate, and want the editorial board to evaluate programs and systems instead of just presenting novelties uncritically. Some people write that the technical descriptions are too detailed and inaccessible to anyone but the initiated.

However, many of our readers characterize Humanistiske Data as an outstanding periodical - in Scandinavia. We have few readers outside of Norway, but a high percentage of these have taken the opportunity to express the view that Humanistiske Data provides information which is not available through any other channel.

Obviously, we cannot satisfy everyone. However, we've heard it said that the biggest room in the world is the room for improvement, and we will start by changing the layout of the magazine. We also aim to publish four issues of Humanistiske Data per year. As far as debates go, it's very much up to our readers to use the periodical for debating purposes.

We wish to thank all of those who took the time to answer the questionnaire.

 

