User:Zhuxiao
Welcome to Zhuxiao's Research Wiki!
This page lists books, notes, and papers on the subjects/topics that Zhuxiao has been highly interested in.
These can be classified into two main categories:
- Fundamental Studies in Analog IC design, and
- Engineering Applications in Electronic Design Automation.
I sincerely invite you to study these subjects jointly with me.
(We may set up an online study group for any subject of shared interest.)
Please feel welcome to direct any comments to zhuxiao_ee(at)yahoo(dot)com(dot)cn
Fundamental Studies in Analog IC design
Data Mining
- BOOKS
- Jiawei Han and Micheline Kamber (2005). Data Mining: Concepts and Techniques (2nd ed.). Morgan Kaufmann Publishers, ISBN:1558604898.
- Ian H. Witten and Eibe Frank (2005). Data Mining: Practical Machine Learning Tools and Techniques (2nd ed.). Morgan Kaufmann, ISBN:0120884070.
- Pang-Ning Tan, Michael Steinbach and Vipin Kumar (2005). Introduction to Data Mining. Addison Wesley, ISBN:0321321367.
- Mehmed Kantardzic (2003). Data Mining: Concepts, Models, Methods, and Algorithms. John Wiley & Sons, ISBN:0471228524.
- NOTES
- Jeffrey D. Ullman (2005). Lecture Notes (I), Lecture Notes (II). Course CS345: Data Mining, Stanford University.
- Inderjit S. Dhillon (2006). Lecture Notes. Course CS378: Introduction to Data Mining, University of Texas at Austin.
- Chris Clifton (2005). Lecture Notes. Course CS590D: Data Mining, Purdue University.
- Christoph F. Eick (2005). Lecture Notes. Course COSC 6397: Data Mining, University of Houston.
- Andrew Moore (200X). Data Mining Tutorials. Computer Science Dept., Carnegie Mellon University.
- PAPERS on Sequential Data Mining
- PAPERS on Association Rule Mining
Statistical Learning Theory
- BOOKS
- (*) Vapnik, V.N. (1999). The Nature of Statistical Learning Theory (2nd ed.). Springer-Verlag.
- Vapnik, V.N. (1996). Statistical Learning Theory. Wiley-Interscience. (E)
- Hastie, T., Tibshirani, R., and Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
- Cristianini, N. and Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press. (X)
- NOTES
- Poggio, T., Rakhlin, S., Caponnetto, A. and Rifkin, R. (2006). Lecture Notes. Course 9.520: Statistical Learning Theory and Applications. M.I.T.
- Jordan, M. (2004). Lecture Notes. Course CS281B/Stat241B: Statistical Learning Theory, UC Berkeley.
- Ben-David, S. (2003). Lecture Notes. Course ECE695: Statistical Learning Theory, Cornell University.
- PAPERS
- Vapnik, V.N. (1999). An Overview of Statistical Learning Theory. IEEE Trans. Neural Networks, 988-999.
(04/24/2006) Comments: Vapnik presents a 'very' high-level overview of statistical learning theory. The good part is that readers can use this paper as a guideline for reading his two books listed above. The Russian theorist does not give you much detail in this paper; the detail is in his books.
- Bousquet, O., Boucheron, S. and Lugosi, G. (2004). Introduction to Statistical Learning Theory. Advanced Lectures on Machine Learning, Lecture Notes in Artificial Intelligence 3176, 169-207.
(04/22/2006) Comments: The authors provide a thorough mathematical foundation covering statistical learning theory, assuming that readers already have an idea of what statistical learning is. Although some sections are not self-contained, this paper is still a good reference for developing a deeper view of statistical learning theory.
- Vert, J.-P., Tsuda, K. and Scholkopf, B. (2004). A Primer on Kernel Methods. MIT Press, Cambridge, MA.
(10/19/2006) Comments: This is the best introductory paper I have ever read on kernel methods and support vector machines. Mathematical terms are clearly defined and explained so that readers can understand them very quickly.
Computational Learning Theory (a.k.a. Machine Learning Theory)
The study of computational learning theory does not have a bible book, since this research field only germinated in the early 1990s. Kearns's book and Schapire's lecture notes are good starting points.
- BOOKS
- (*) Kearns, M.J. and Vazirani, U.V. (1994). An Introduction to Computational Learning Theory. The MIT Press. (X)
- Kearns, M.J. (1990). The Computational Complexity of Machine Learning. The MIT Press. (E)
- NOTES
- Schapire, Rob. (2005). Lecture Notes. Course CS511: Foundation of Machine Learning. Princeton University.
(04/29/2006) Comments: Prof. Schapire has taught this course for a couple of years. The materials cover the most important fundamental concepts of machine learning well, from both the computational and the statistical perspective. This is one of my favorite course websites.
- Rivest, R. (1994). Lecture Notes. Course: 6.858/18.428 Machine Learning. M.I.T.
(04/29/2006) Comments: This course website is recommended by my advisor, Prof. Li-C. Wang. However, Prof. Rivest does not seem to have taught this course after 1994, so I am not quite sure whether these materials are up to date.
- Mitchell, T. and Moore, A. (2005). Lecture Notes. Course 10-701/15-781, Carnegie Mellon University.
- PAPERS
Theory of Computation (a.k.a. Complexity Theory, Automata Theory)
- BOOKS
- (*) Wegener, I. (2005). Complexity Theory: Exploring the Limits of Efficient Algorithms. Springer.
- (*) Sipser, M. (1996). Introduction to the Theory of Computation. Course Technology.
- Linz, P. (2000). An Introduction to Formal Languages and Automata. Jones & Bartlett Publishers.
- Hopcroft, J.E., Motwani, R. and Ullman, J.D. (2000). Introduction to Automata Theory, Languages, and Computation (2nd ed.). Addison-Wesley.
- Kohavi, Z. (1978). Switching and Finite Automata Theory. Tata McGraw-Hill.
- Straubing, H. (1994). Finite Automata, Formal Logic, and Circuit Complexity. Birkhäuser Boston.
- Vollmer, H. (1999). Introduction to Circuit Complexity: a Uniform Approach. Springer-Verlag.
- Papadimitriou, C.H. (1993). Computational Complexity. Addison-Wesley. (X)
- Papadimitriou, C.H. and Steiglitz, K. (1998). Combinatorial Optimization: Algorithms and Complexity. Dover.
- Cormen, T.H., Leiserson, C.E., Rivest, R.L. and Stein, C. (2001). Introduction to Algorithms (2nd ed.). The MIT Press.
- Aho, A.V., Hopcroft, J.E. and Ullman, J.D. (1974). The Design and Analysis of Computer Algorithms. Addison-Wesley.
- NOTES
- PAPERS
Information Theory (a.k.a. Communication Theory, Coding Theory)
- BOOKS
- (*) MacKay, D. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press.
- Cover, T.M. and Thomas, J.A. (2006). Elements of Information Theory (2nd ed.). Wiley-Interscience. (X)
- NOTES
- PAPERS
Mathematical Logic
- BOOKS
- Ebbinghaus, H.-D., Flum, J. and Thomas, W. (1984). Mathematical Logic (Undergraduate Texts in Mathematics). Springer. (E)
- Ben-Ari, M. (2003). Mathematical Logic for Computer Science. Springer. (E)
- NOTES
- PAPERS
Probability and Statistics
- BOOKS
- Papoulis, A. and Pillai, S.U. (2002). Probability, Random Variables and Stochastic Processes. McGraw-Hill.
- Hsu, H.P. (1997). Theory and Problems of Probability, Random Variables, and Random Processes. McGraw-Hill. (E)
- Jaynes, E.T. (1995). Probability Theory: The Logic of Science. Cambridge University Press. (E)
- Ross, S.M. (1970). Applied Probability Models with Optimization Applications. Dover.
- Tabachnick, B.G. and Fidell, L.S. (2001). Using Multivariate Statistics (4th ed.). Allyn & Bacon.
- NOTES
- PAPERS
Engineering Applications in Electronic Design Automation: to be continued...
Notes:
- The bibliography is arranged in Springer-Verlag's style, not IEEE/ACM style.
- (X) indicates that Charles does not have this material at hand.
- (*) indicates the materials I am studying or am going to study in the near future.

