
Spring 2024 MW 4:30pm-5:50pm, April 1st – June 5th

Location: Gates B12

 

Edward Chang (echang@cs.stanford.edu)

Adjunct Professor, Computer Science, Stanford University

 

CA: Matthew Jin (mjin73@stanford.edu)

 

Office Hours: MW after class until 6:15pm at Gates B12 or 104.

 

 

Announcements

 

·  (5/29) Final lecture deck posted. Please sign up for your final presentation time.

·  (5/21) Recorded lectures of 5/20 and 5/22 posted.

·  (5/15) Recorded “Cancer Diagnosis and Treatment” lectures by Dr. Ko posted online.

·  (5/1) Syllabus refined, adding Llama-3 fine-tuning.

·  (4/30) Dr. Duvvuri's guest lecture slides posted.

·  (4/10) HW#2 posted on Canvas, due in class on 4/15.

·  (4/3/2024) HW#1 posted on Canvas, due by 4:30pm 4/10.

·  (3/25/2024) The textbook is available online at the Kindle store.

·  (3/07/2024) Both SocraSynth and RAFEL are cutting-edge LLM research. Reading materials are available on their sites, and a course reader will be available by the start of the quarter.

·  (1/29/2024) The course project can be on any subject, e.g., psychology, law, business, literature, political science, healthcare, computer science, etc.

 

 

About

 

The 2024 edition of this course will delve into Generative AI topics, encompassing innovations like SocraSynth, GPT-4, ChatGPT, Gemini, and models of consciousness. In recent years, artificial intelligence, particularly through deep learning, attention mechanisms, and foundation models, has revolutionized technology. AI surpasses human capabilities in various tasks, including computer vision and natural language processing. Yet, we encounter challenges similar to those from the initial AI boom five decades ago. This course will critically analyze these challenges (such as issues with generalization, biases, hallucinations, and reasoning) in prevalent AI algorithms like CNNs, transformers, generative AI, and LLMs, including GPT and Gemini.

 

To overcome these hurdles, the course will cover topics like transfer learning for data scarcity, knowledge-guided multimodal learning for data diversity, and modeling of emotions, behaviors, and ethics, along with multi-LLM collaborative dialogue.
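
To make the idea of multi-LLM collaborative dialogue concrete, the short Python sketch below (not taken from the course materials or the SocraSynth implementation) has two chat models argue opposite sides of a question for a few rounds and then asks a moderator call to summarize. The ask_llm helper, the model names, and the round structure are illustrative placeholders to be replaced with whichever LLM APIs a project actually uses.

# A minimal sketch (assumptions only, not the SocraSynth algorithm): two LLMs
# debate a question for a fixed number of rounds, then a moderator summarizes.

def ask_llm(model: str, instructions: str, transcript: list[str]) -> str:
    """Hypothetical helper: send the instructions plus the running transcript
    to the chosen model's chat API (e.g., GPT-4 or Gemini) and return its reply."""
    raise NotImplementedError("Wire this to the chat API you are using.")

def collaborative_dialogue(question: str, rounds: int = 3) -> str:
    transcript = [f"Debate question: {question}"]
    agents = [
        ("model-a", "Argue for the proposition and rebut the other agent each round."),
        ("model-b", "Argue against the proposition and rebut the other agent each round."),
    ]
    for _ in range(rounds):
        for model, stance in agents:
            reply = ask_llm(model, stance, transcript)
            transcript.append(f"{model}: {reply}")
    # A final moderator call condenses the exchange into a summary judgment.
    return ask_llm("model-a", "Summarize the debate and state the stronger case.", transcript)

A term project could start from a loop like this and layer on the additional controls and evaluation steps that SocraSynth introduces.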

 

Teaching will blend lectures with project sessions. Guest speakers from academia and industry will share insights on AI's niche applications, like in diagnosing and treating cancer or depression. Students, from disciplines like CS, Business, Law, Medicine, and Data Science, will undertake term projects that might include literature reviews, idea development, and practical implementation. They are encouraged to craft a project that complements their graduate research, fostering a deep, integrative learning experience.

 

Note: The following prerequisite for taking this course has been waived because of the ease of use of ChatGPT.

Level: Seniors/graduate students of any major.

Prerequisite: An introductory course in AI, Statistics, or Machine Learning.

 

Assignments

 

·   No exams.

·   Three assignments, completed using Gemini, ChatGPT, and GPT-4.

·   Term project: A group project of two to three students using LLMs, RAFEL, and SocraSynth to address practical problems.

 

Textbook (required)

        

·  SocraSynth, Edward Y. Chang, March 2024

        

Grading

 

·   Assignments 30%

·   Class participation 10%

·   Literature survey and project proposal 20%

·   Project implementation and demo 40%

 

Syllabus

 

Date

Description

Course Materials

Notes

Week #1

4/1/2024

 

Course Aims and Syllabus

ChatGPT Intro

[slides]

Intro to AI, GAI and why you should or should not take this course.

E. Chang

4/3/2024

LLMs (Large Language Models) Part 1 of 5

LLM Insights and Aphorisms

[slides]

Textbook Chapter 2 and Appendix X2

E. Chang

Assignment #1, posted on Canvas (due by 4:30pm on 4/10)

 

 

Week #2

4/8/2024

P4 Medicine, Part 1 of 3:

Intro to P4 Medicine [link]

 

[1] Schwartz, W. B., R. S. Patil, and P. Szolovits. "AI in Medicine: where do we stand." New England Journal of Medicine 316 (1987): 685-688. [link].

[2] Wu, T. D. "Efficient Diagnosis of Multiple Disorders Based on a Symptom Clustering Approach." Proceedings of AAAI, 1990, pp. 357-364.

[3] Universal Equivariant Multilayer Perceptrons, Siamak Ravanbakhsh, ICML, 2020.

[4] Tricorder (medical IoTs), E. Y. Chang, et al., "Artificial Intelligence in XPRIZE DeepQ Tricorder." ACM MM Workshop for Personal Health and Health Care, 2017.

[5] Szolovits, P., and S. G. Pauker. "Categorical and Probabilistic Reasoning in Medical Diagnosis." Artificial Intelligence 11(1-2), 1978: 115-144.

[6] Patil, R. S., P. Szolovits, and W. B. Schwartz. "Causal Understanding of Patient Illness in Medical Diagnosis." In Proceedings of the Seventh International Joint Conference on Artificial Intelligence.

 

E. Chang

 

 

4/10/2024

P4 Medicine, Part 2 of 3:

Principles of Diagnosis [link]

 

[1] Textbook Chapters 3 and 4

[2] Prompting Large Language Models with the Socratic Method, E. Y. Chang, IEEE CCWC, March 2023. [link]

[3] CRIT: Critical Reading Inquisitive Template, E. Y. Chang. [link]

[4] Noora.cs.stanford.edu

E. Chang

 

Assignment #1 due

Assignment #2, posted on Canvas (due in class on 4/15)

 

Week #3

4/15/2024

P4 Medicine, Part 3 of 3:

Small Data Machine Learning [link]

 

LLM, Part 2 of 5:
SocraSynth, SocraHealth, SocraPlan, SocraPedia

 

[1] Textbook Chapter 6

[2] Daphne Koller:  Biomedicine and Machine Learning, AI Podcast #93 with Lex Fridman, May 2020 [link]

[3] Edward Y. Chang et al., "Context-Aware Symptom Checking for Disease Diagnosis Using Hierarchical Reinforcement Learning." AAAI (2018).

[4] Edward Y. Chang et al., "REFUEL: Exploring Sparse Features in Deep Reinforcement Learning for Fast Disease Diagnosis." NeurIPS (2018).

E. Chang

Assignment #2 due

Assignment #3

4/17/2024

LLM, Part 3 of 5:

Critical Thinking & the Socratic Method

Textbook Chapter 4              

E. Chang

 

 

Week #4

4/22/2024

Psychiatric Disorders, Part 1 of 2: Symptoms and Treatments [slides][video]

 

 

Dr. Vikas Duvvuri,

MD/PhD Stanford

Assignment #3 due

4/24/2024

Sam Altman talk

 

Sam Altman

@NVIDIA Auditorium

Week #5

4/29/2024

Psychiatric Disorders, Part 2 of 2: Symptoms and Treatments [slides][video]

 

Dr. Vikas Duvvuri,

MD/PhD Stanford

 

5/1/2024

LLM, Part 4 of 5:

History of NLP: Pre-LLM Era [slides]

Term Project Title/Abstract Presentations, Session #1

 

Textbook Chapters 5, 6, 7, 8

 

[1] Textbook Chapter 1
[2] Attention is all you need, Ashish Vaswani, et al., [link].

[3] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [link].

[4] Reformer: The Efficient Transformer [link].

[5] Perceiver: General Perception with Iterative Attention [link].

 

CA

E. Chang

Week #6

5/6/2024

Term Project Title/Abstract Presentations, Session #2

[1] Textbook Chapter 1
[2] Attention is all you need, Ashish Vaswani, et al., [link].

[3] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [link].

[4] Reformer: The Efficient Transformer [link].

[5] Perceiver: General Perception with Iterative Attention [link].

 

E. Chang

5/8/2024

LLM Part 5 of 5: History of NLP: Post-LLM Era

BERT, GPT, RAFEL,

Virtual Assistant and Augmented LM. [slides]

[1] Textbook Chapter 11

[2] Genie: A Generator of Natural Language Semantic Parsers for Virtual Assistant Commands, G. Campagna, S. Xu, M. Moradshahi, R. Socher, and Monica S. Lam, in Proceedings of the 40th ACM SIGPLAN Conference on Programming Language Design and Implementation, Phoenix, AZ, June 2019 [link].

[3] Augmented Language Model, a survey, Meta, 2023 [link].

 

E. Chang

 

Week #7

5/13/2024

Fine-tuning LLM Tutorial: Llama 3

 

CA: Matthew J.

5/15/2024

Cancer, Part 1 of 2: Cancer Causes and Diagnosis [video][slides]

[1] Principles and methods of integrative genomic analyses in cancer, Vessela N. Kristensen, Ole Christian Lingjærde, Hege G. Russnes, Hans Kristian M. Vollan, et al., Nature Reviews, 2014.

[2] Biomarker development in the precision medicine era: lung cancer as a case study, Ashley J. Vargas and Curtis C. Harris, Nature Reviews, 2016.

[optional] Artificial intelligence in radiology, Ahmed Hosny, Chintan Parmar, John Quackenbush, Lawrence H. Schwartz, and Hugo J. W. L. Aerts, Nature Reviews, 2018.

 

Dr. Melissa Ko,

PhD, Cancer Biology,

Stanford

 

Week #8

5/20/2024

Consciousness & Mind, Part 1 & 2 of 3: Computational Consciousness, COCOMO [video][slides].

[1] Textbook Chapter 10
[2] What is Life?, Erwin Schrödinger, online book [link].

[3] COCOMO: Computational Consciousness Modeling, E.Y. Chang, 2023 [link].

 

E. Chang

5/22/2024

Cancer, Part 2 of 2: Cancer Treatment and How AI May Help [video][slides]

[1] Mass cytometry: blessed with the curse of dimensionality, Evan W Newell & Yang Cheng, Nature Immunology, June 2016.

[2] Next-Generation Machine Learning for Biological Networks, Diogo M. Camacho, Katherine M. Collins, Rani K. Powers, James C. Costello, and James J. Collins, Leading Edge Review, June 2018.

[optional] Personalized Cancer Models for Target Discovery and Precision Medicine, Carla Grandori and Christopher J. Kemp, Trends in Cancer, Cell Press Reviews, September 2018.

 

Dr. Melissa Ko,

PhD, Cancer Biology,

Stanford

 

Week #9

5/27/2024

 

Memorial Day holiday (no class)

 

 

5/29/2024

Consciousness & Mind, Part 2 & 3 of 3: Putting It All Together: EVINCE [slides]

1. CoCoMo: Computational Consciousness Modeling for Generative and Ethical AI, Edward Y. Chang, arXiv:2304.02438 [PDF], March 2023.

2. SocraSynth: Socratic Synthesis with Multiple Large Language Models, Principles and Practices, Edward Y. Chang, March 2024.

3. Integrating Emotional and Linguistic Models for Ethical Compliance in Large Language Models, Edward Y. Chang, arXiv:2405.07076 [PDF], May 2024.

4. Ensuring Diagnosis Accuracy in Healthcare AI with the EVINCE Framework, Edward Y. Chang, arXiv:2405.15808 [PDF], May 2024.

5. EVINCE (Entropy Variation and INformation CompetencE), Stanford InfoLab Technical Report [PDF], June 6th, 2024.

 

E. Chang

 

6/3/2024

Project Presentations I

Please sign up

E. Chang, CA

Week #10

6/5 (last day of class)

 

Project Presentations II

Presentation times can be arranged flexibly between 5/31 and 6/5.

E. Chang

CA