Nature's Choice: '10 Computer Technologies That Changed Science'

Computers are essential to modern scientific discovery, from astronomy to zoology. What are the top 10 computer technologies that changed science, as selected by the scientific journal Nature?

First is Fortran, a pioneering programming language. Early computers were by no means user-friendly: code was entered manually on punch cards, and programming them required knowledge of the machine's own arcane instructions. With the advent of Fortran, developed by IBM in the 1950s, programs written in a human-readable language could be automatically translated into instructions the computer can actually process.

Second is the fast Fourier transform (FFT). The FFT, an algorithm sketched by Carl Friedrich Gauss as early as 1805 and popularized by James Cooley and John Tukey in 1965, allows fast decomposition of complex time-varying signals into their frequency components. It is widely used in digital signal processing and image analysis across scientific fields, including radio astronomy, where a signal must be viewed as a function of frequency in order to understand complex electromagnetic wave properties.
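To make this concrete, here is a small illustrative sketch (not from the Nature article) using NumPy's FFT routines to recover the frequency components of a synthetic signal; the sample rate and frequencies are arbitrary choices for the example.

```python
import numpy as np

# Sample a signal that mixes a 5 Hz and a 12 Hz sine wave for one second.
sample_rate = 100  # samples per second (arbitrary for this demo)
t = np.arange(0, 1, 1 / sample_rate)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# The FFT converts the time-domain samples into frequency components.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# The two strongest components sit at the original 5 Hz and 12 Hz.
peaks = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(peaks)  # [5.0, 12.0]
```

The same transform, scaled up, is what turns raw radio-telescope voltages into a frequency spectrum an astronomer can interpret.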

Third is the biological database. Large-scale storage and analysis of genomic and protein data was pioneered by bioinformatics trailblazer Margaret Belle Oakley Dayhoff. When biologists began analyzing protein amino acid sequences in the 1960s, Dayhoff and colleagues encoded the data on punch cards so that it could be searched like a database. Today such datasets are used across many fields, including the Protein Data Bank, which began operation in 1971, and GenBank, which began operation in 1982.

Fourth is the atmospheric general circulation model. John von Neumann, a computer pioneer who at the end of World War II had used computers for ballistic trajectories and weapon design, turned them to reforming weather forecasting. Until then, weather had been predicted largely by experience and intuition, but von Neumann's group built numerical models of the atmosphere and ocean and analyzed weather data by computer, improving prediction accuracy.

Next is BLAS. BLAS (Basic Linear Algebra Subprograms), a standard for the mathematical operations at the heart of scientific and technical computing, was published in 1979. By standardizing vector and matrix operations, it reduced them to building blocks as routine for programmers as addition.
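That standardization is why a single line of modern code can invoke decades of tuned numerical work. As an illustrative sketch (my example, not the article's): NumPy delegates dense matrix products to whatever BLAS implementation is installed, such as OpenBLAS or MKL.

```python
import numpy as np

# NumPy hands this product to the standardized BLAS routine (GEMM),
# the same interface Fortran and C scientific codes call directly.
a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([[5.0, 6.0],
              [7.0, 8.0]])

c = a @ b  # matrix-matrix product, BLAS level-3 under the hood
print(c)
```

The scientist writes one multiplication; the BLAS library chooses how to execute it efficiently on the hardware at hand.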

Next is NIH Image. In the 1980s, Wayne Rasband, who worked in a brain-imaging laboratory at the US National Institutes of Health (NIH), devised a program to display and analyze X-ray films on a computer. NIH Image originally ran only on the Macintosh, but it later evolved into an advanced image-processing system called ImageJ, which now runs on other operating systems as well.

Next is BLAST. In 1978, Margaret Dayhoff devised a scoring system (the point accepted mutation, or PAM, matrix) to quantify the similarity and relatedness of the primary structures of DNA and proteins. This approach later evolved into faster, more precise tools, culminating in BLAST (Basic Local Alignment Search Tool), which brought innovation to the genomic biology of the time.
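BLAST gains its speed by first finding short exact "words" (k-mers) shared between a query sequence and a database sequence, and only then extending those matches. A toy sketch of that seeding idea (my illustration, greatly simplified: real BLAST uses longer words and scored neighborhoods, not just exact matches):

```python
def kmer_seeds(query, target, k=3):
    """Return (query_pos, target_pos) pairs of shared length-k words."""
    # Index every k-length word in the target by its position.
    index = {}
    for i in range(len(target) - k + 1):
        index.setdefault(target[i:i + k], []).append(i)
    # Look up each k-length word of the query in that index.
    seeds = []
    for i in range(len(query) - k + 1):
        for j in index.get(query[i:i + k], []):
            seeds.append((i, j))
    return seeds

print(kmer_seeds("GATTACA", "TTACAGG"))  # [(2, 0), (3, 1), (4, 2)]
```

Because the index lookup is cheap, this seeding step lets a search skip the vast majority of the database, which is what made BLAST fast enough for everyday use.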

Next is arXiv. In the late 1980s, scientific research results were generally shared only within small communities. In 1991, however, physicist Paul Ginsparg, then at Los Alamos National Laboratory, devised a new system for sharing research widely: a repository that registers and distributes scientific papers and preprints. Initially limited to the physics community, it opened to other disciplines, was renamed arXiv in 1998, and has continued to develop since. As of 2021, 1.8 million papers have been registered, all free of charge.

The ninth is the IPython Notebook. Python is an interpreted language: code is executed statement by statement rather than compiled in advance. Physicist Fernando Pérez, finding a plain line-by-line shell ill-suited to loading the modules and visualizing the data that scientific work requires, built his own interactive shell, IPython. It later grew into the open-source IPython Notebook (now Jupyter), revolutionizing the field of data science.

Last is AlexNet. Artificial intelligence systems divide roughly into two types: those that follow explicitly structured rules and those that learn their processing from data via machine learning. University of Toronto graduate students Alex Krizhevsky and Ilya Sutskever developed AlexNet, an image-recognition program based on machine learning, in 2012. At ImageNet, a contest in which programs categorize a large collection of images, the best algorithms of the time had error rates around 25%, but AlexNet succeeded in recording an error rate of about 15%.
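The building block that let AlexNet learn from images is the convolution: sliding a small learned filter across the picture to detect local patterns. A minimal NumPy sketch of that single operation (my illustration only, nothing like AlexNet's full architecture, which stacks many such layers on GPUs):

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image` and sum the element-wise products."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge filter responds where intensity changes left to right.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
edge = np.array([[-1.0, 1.0]])
print(conv2d(image, edge))  # nonzero only at the dark-to-bright boundary
```

In a trained network the filter values are not hand-written like this edge detector; they are learned from labeled images, which is what "machine learning" means in practice here.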



From the era of the monthly magazines AHC PC and HowPC onward, he has watched the 'age of technology' from within online IT media, serving at ZDNet, as Internet manager at an electronic newspaper, as editor of the Consumer Journal Ivers, as publisher of TechHolic, and as editor at Venture Square. He remains curious about this market, which is still full of vitality.
