Emphasized the importance of training sequences, and the use of parts of previous solutions to problems in constructing trial solutions to new problems. This paper created the field of algorithmic learning theory. Introduced computational learning theory, VC theory, statistical uniform convergence, and the VC dimension.
Introduced the probably approximately correct (PAC) learning framework. Development of the backpropagation algorithm for artificial neural networks; note that the algorithm was first described earlier by Paul Werbos. Decision trees are a common learning algorithm and a decision-representation tool. Development of decision trees was done by many researchers in many areas even before this paper, though this paper is one of the most influential in the field. One of the papers that started the field of online learning.
In this learning setting, a learner receives a sequence of examples, making a prediction after each one and receiving feedback after each prediction. Research in this area is remarkable because (1) the algorithms and proofs tend to be very simple and beautiful, and (2) the model makes no statistical assumptions about the data. In other words, the data need not be random, as in nearly all other learning models, but can be chosen arbitrarily by "nature" or even an adversary.
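The predict-then-receive-feedback protocol can be made concrete with a short sketch. The following is a minimal, hypothetical illustration (not taken from any of the papers): a winnow-style multiplicative-update learner for monotone disjunctions, run against an arbitrary example sequence that could just as well be chosen by an adversary.

```python
# Online learning loop: the learner sees examples one at a time,
# predicts, then receives the true label as feedback.
# Winnow-style learner for monotone disjunctions over n Boolean features.

def winnow_predict(w, x, n):
    # Predict 1 iff the weighted sum reaches the threshold n.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= n else 0

def run_online(examples, n):
    w = [1.0] * n          # one weight per feature
    mistakes = 0
    for x, y in examples:  # feedback arrives only after each prediction
        pred = winnow_predict(w, x, n)
        if pred != y:
            mistakes += 1
            if y == 1:     # promotion: double weights of active features
                w = [wi * 2 if xi else wi for wi, xi in zip(w, x)]
            else:          # demotion: eliminate active features
                w = [0.0 if xi else wi for wi, xi in zip(w, x)]
    return w, mistakes

# Hypothetical target concept: x0 OR x2 over n = 4 features.
examples = [([1, 0, 0, 0], 1), ([0, 1, 0, 0], 0),
            ([0, 0, 1, 1], 1), ([0, 1, 0, 1], 0)]
w, m = run_online(examples, 4)
```

Note the absence of any probabilistic assumption: the analysis of such algorithms bounds the total number of mistakes on any sequence, rather than an expected error under a distribution.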
Specifically, this paper introduced the winnow algorithm. Introduced the temporal difference method for reinforcement learning. Proved negative results for PAC learning. Proved that weak and strong learnability are equivalent in the noise-free PAC framework.
The proof was done by introducing the boosting method. This paper presented support vector machines, a practical and popular machine learning algorithm; support vector machines often use the kernel trick. This paper presented a tractable greedy layer-wise learning algorithm for deep belief networks, which led to great advances in the field of deep learning.
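The kernel trick mentioned above lets an algorithm work in a high-dimensional feature space while only ever evaluating a kernel function on the original inputs. A minimal sketch, using a hypothetical 2-D example: a degree-2 polynomial kernel is compared against the explicit feature map it implicitly corresponds to.

```python
import math

def poly_kernel(x, z):
    # Degree-2 polynomial kernel: k(x, z) = (x . z + 1)^2
    return (sum(a * b for a, b in zip(x, z)) + 1) ** 2

def feature_map(x):
    # Explicit map phi for 2-D input such that
    # phi(x) . phi(z) == (x . z + 1)^2 -- six dimensions instead of two.
    x1, x2 = x
    r2 = math.sqrt(2)
    return [x1 * x1, x2 * x2, r2 * x1 * x2, r2 * x1, r2 * x2, 1.0]

x, z = [1.0, 2.0], [3.0, 0.5]
lhs = poly_kernel(x, z)       # kernel evaluation: work in the input space
rhs = sum(a * b for a, b in zip(feature_map(x), feature_map(z)))
```

The two quantities agree, which is the point of the trick: an SVM never needs to materialize `feature_map`, so even kernels corresponding to very high-dimensional (or infinite-dimensional) feature spaces remain cheap to use.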
The first application of supervised learning to gene expression data, in particular support vector machines; the method is now standard, and the paper is one of the most cited in the area. Introduced the LR parser, which does bottom-up parsing for deterministic context-free languages. Later derived parsers, such as the LALR parser, have been and continue to be standard practice, as in Yacc and its descendants. About grammar attribution, the basis for Yacc's S-attributed and Zyacc's LR-attributed approach.
A procedure is given which determines all the definitions that can possibly reach each node of the program's control-flow graph, and all the definitions that are live on each edge of the graph. Formalized the concept of data-flow analysis as fixpoint computation over lattices, and showed that most static analyses used for program optimization can be uniformly expressed within this framework.
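The fixpoint view can be sketched with the classic reaching-definitions analysis: each node has GEN and KILL sets, and IN/OUT sets are iterated to a fixed point over the powerset lattice of definitions. The tiny control-flow graph below is a hypothetical example, not one from the paper.

```python
# Reaching definitions as an iterative fixpoint over the powerset lattice.
# Nodes: 0: x = 1 (d0)   1: y = 2 (d1)   2: x = 3 (d2)   3: use x, y
# Edges: 0->1, 1->2, 1->3, 2->3   (node 1 branches)
succ = {0: [1], 1: [2, 3], 2: [3], 3: []}
pred = {0: [], 1: [0], 2: [1], 3: [1, 2]}
gen  = {0: {"d0"}, 1: {"d1"}, 2: {"d2"}, 3: set()}
kill = {0: {"d2"}, 1: set(), 2: {"d0"}, 3: set()}  # d0, d2 both define x

IN  = {n: set() for n in succ}
OUT = {n: set() for n in succ}
changed = True
while changed:                    # iterate transfer functions to a fixpoint
    changed = False
    for n in succ:
        # Meet: union of the OUT sets of all predecessors.
        IN[n] = set().union(*(OUT[p] for p in pred[n])) if pred[n] else set()
        # Transfer function: OUT = GEN  union  (IN - KILL).
        new_out = gen[n] | (IN[n] - kill[n])
        if new_out != OUT[n]:
            OUT[n] = new_out
            changed = True
```

Termination is guaranteed because the OUT sets only grow and the lattice is finite; both d0 and d2 reach node 3, d0 via the branch edge 1→3 and d2 via node 2.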
Yacc is a tool that made compiler writing much easier. This book became a classic in compiler writing; it is also known as the Dragon Book, after the red dragon that appears on its cover. Colossus was an early binary electronic digital computer.
The design of Colossus was later described in the referenced paper. It contains the first published description of the logical design of a computer using the stored-program concept, which has come to be known as the von Neumann architecture. It was the first family of computers making a clear distinction between architecture and implementation.
The first Cray-1 system was installed at Los Alamos National Laboratory, and it went on to become one of the best-known and most successful supercomputers in history. It is a good paper for discussing issues of reliability and fault tolerance of computer systems, and the cost of providing such fault tolerance.
This paper argues that the approach of improving processor performance by adding multiple instruction issue and out-of-order execution cannot continue to provide speedups indefinitely, and lays out the case for building single-chip processors that contain multiple "cores". With the mainstream introduction of multicore processors by Intel, and their subsequent domination of the market, this paper was shown to be prescient.
The Academy of Motion Picture Arts and Sciences cited this paper as a "milestone in computer graphics". A correlation method based upon the inverse Fourier transform.
A method for estimating the image motion of world points between two frames of a video sequence. A technique for visual tracking. A technique, the scale-invariant feature transform (SIFT), for robust feature description. This paper introduced the relational model for databases, which became the dominant model. This paper introduced the B-tree data structure.
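The defining property of a B-tree is that each node holds many sorted keys, and the keys of an internal node partition the key ranges of its children, keeping the tree shallow even for very large collections. A minimal search sketch over a hand-built, hypothetical tree (insertion, splitting, and disk paging are omitted for brevity):

```python
import bisect

class Node:
    def __init__(self, keys, children=None):
        self.keys = keys                # sorted list of keys
        self.children = children or []  # empty list for leaf nodes

def btree_search(node, key):
    # Binary-search the keys of the current node.
    i = bisect.bisect_left(node.keys, key)
    if i < len(node.keys) and node.keys[i] == key:
        return True
    if not node.children:               # reached a leaf: key is absent
        return False
    # Keys partition the children: child i holds keys below keys[i].
    return btree_search(node.children[i], key)

# Hand-built two-level tree:      [20, 40]
#                        [5, 10]  [25, 30]  [45, 50]
root = Node([20, 40], [Node([5, 10]), Node([25, 30]), Node([45, 50])])
```

Because each node can have a large fan-out, a lookup touches only O(log_b N) nodes for branching factor b, which is what makes the structure well suited to disk-based storage.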
This paper introduced the entity-relationship diagram (ERD) method of database design. This paper defined the concepts of transaction, consistency, and schedule, and argued that a transaction needs to lock a logical rather than a physical subset of the database.
Introduced the concept of federated database systems, which had a huge impact on data interoperability and the integration of heterogeneous data sources. Association rules, a very common method for data mining. Presented the vector space model. Presented the inverted index. Conceived a statistical interpretation of term specificity called inverse document frequency (IDF), which became a cornerstone of term weighting. This book presents a comprehensive and accessible approach to data communications and networking that has made it a favorite with students and professionals alike.
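The IDF idea mentioned above — terms that occur in few documents are more specific and should weigh more — can be sketched as follows. This is a standard tf-idf formulation with a hypothetical toy corpus, offered as an illustration rather than the paper's exact formula.

```python
import math

# IDF: terms appearing in few documents get high weight;
# terms appearing in every document get weight zero.
docs = [
    ["the", "cat", "sat"],
    ["the", "dog", "ran"],
    ["the", "cat", "and", "the", "dog"],
]

def idf(term, docs):
    df = sum(1 for d in docs if term in d)   # document frequency
    return math.log(len(docs) / df)          # rarer term -> larger IDF

def tf_idf(term, doc, docs):
    tf = doc.count(term)                     # raw term frequency
    return tf * idf(term, docs)
```

Here "the" occurs in every document, so its IDF (and hence its tf-idf weight) is zero no matter how often it appears, while "cat" and "dog" retain positive weight.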
Numerous figures and tables accompany the text, providing a visual and intuitive aid to understanding the material. This paper discussed time-sharing as a method of sharing computer resources.
This idea changed the interaction with computer systems. The beginning of cache memory. The classic paper on Multics, the most ambitious operating system in the early history of computing. Difficult reading, but it describes the implications of trying to build a system that takes information sharing to its logical extreme. Most operating systems since Multics have incorporated a subset of its facilities. Classic paper on the extensible nucleus architecture of the RC multiprogramming system, and on what became known as the operating-system kernel and microkernel architecture.
The first comprehensive textbook on operating systems; includes the first monitor notation (Chapter 7). This paper addresses issues in constraining the flow of information from untrusted programs. It discusses covert channels but, more importantly, addresses the difficulty of obtaining full confinement without making the program itself effectively unusable.
The ideas are important when trying to understand containment of malicious code, as well as aspects of trusted computing. The Unix operating system and its principles were described in this paper. Its main importance lies not in the paper itself but in the operating system, which had a tremendous effect on operating systems and computer technology. This paper describes the consistency mechanism known as quorum consensus. It is a good example of an algorithm that provides a continuous set of options between two alternatives (in this case, between the read-one/write-all and write-one/read-all consistency methods).
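That continuum comes from choosing a read quorum r and a write quorum w over n replicas with r + w > n, so that every read quorum intersects every write quorum and a read always sees the latest write. A minimal sketch with a hypothetical replica set (the overlap check only, not the full replicated-data protocol):

```python
from itertools import combinations

def quorums_valid(n, r, w):
    # r + w > n: every read quorum overlaps every write quorum.
    # 2w > n: any two write quorums overlap, so writes are serialized.
    return r + w > n and 2 * w > n

def check_intersection(n, r, w):
    # Brute-force confirmation: every r-subset of the n replicas
    # intersects every w-subset.
    replicas = range(n)
    return all(set(rq) & set(wq)
               for rq in combinations(replicas, r)
               for wq in combinations(replicas, w))

# n = 5 replicas: r = 2, w = 4 sits between read-one/write-all (r=1, w=5)
# and majority quorums (r=3, w=3); all satisfy the same overlap condition.
```

Sliding r down makes reads cheaper at the cost of more expensive writes, and vice versa, which is exactly the tunable trade-off the paper is valued for.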
There have been many variations and improvements by researchers in the years that followed, and it is one of the consistency algorithms that should be understood by all. Great Papers in Computer Science: this carefully compiled and wide-ranging volume of papers written by computer pioneers offers first-hand insight into the research and discovery experiences of legendary scientists such as Hoare, Hartmanis, Stearns, Backus, and Knuth.
Coupled with introductory essays, written by the originating authors where possible, these papers are an ideal source of background research and technical reference. Collectively, they illustrate the impact of pioneering work on the field of modern computer science.
They are an excellent companion to undergraduate computer science courses.
Great Papers in Computer Science. Phillip Laplante, Burlington County College; West Publishing Company. Contents: Section 1, Algorithms and Data Structures; The Complexity of Theorem Proving Procedures.
Science of Computer Programming (online version available). Description: Statecharts are a visual modeling method. They are an extension of state machines that can be exponentially more succinct, and statecharts therefore enable formal modeling of applications that were previously too complex.
A supplement to courses or seminars, this work compiles major papers written during the development of computer science, including papers that are valuable yet difficult to find. Readers can come to appreciate the wide range of factors that caused these papers to have significant impact in computer science and software engineering, and develop an in-depth, well-rounded analysis of the context and impact of a seminal computer science or software engineering paper.