Welcome to GenAI: Best Practices! For each chapter, we provide detailed
Colab notebooks that you can open and run directly in Google Colab.
The PDF version can be downloaded from HERE.
Contents
- 1. Preface
- 2. Preliminary
- 3. Word and Sentence Embedding
- 4. Prompt Engineering
- 4.1. Prompt
- 4.2. Prompt Engineering
- 4.3. Advanced Prompt Engineering
- 4.3.1. Role Assignment
- 4.3.2. Contextual Setup
- 4.3.3. Explicit Instructions
- 4.3.4. Chain of Thought (CoT) Prompting
- 4.3.5. Few-Shot Prompting
- 4.3.6. Iterative Prompting
- 4.3.7. Instructional Chaining
- 4.3.8. Use Constraints
- 4.3.9. Creative Prompting
- 4.3.10. Feedback Incorporation
- 4.3.11. Scenario-Based Prompts
- 4.3.12. Multimodal Prompting
- 5. Retrieval-Augmented Generation
- 6. Fine Tuning
- 7. Pre-training
- 7.1. Transformer Architecture
- 7.1.1. Attention Is All You Need
- 7.1.2. Encoder-Decoder
- 7.1.3. Positional Encoding
- 7.1.4. Embedding Matrix
- 7.1.5. Attention Mechanism
- 7.1.6. Layer Normalization
- 7.1.7. Residual Connections
- 7.1.8. Feed-Forward Networks
- 7.1.9. Label Smoothing
- 7.1.10. Softmax and Temperature
- 7.1.11. Unembedding Matrix
- 7.1.12. Decoding
- 7.2. Modern Transformer Techniques
- 7.3. Case Studies: DeepSeek-V3
- 8. LLM Evaluation Metrics
- 9. LLM Guardrails
- 10. LLM Deployments
- 11. Main Reference