Week of February 12, 2024

Improving LoRA: Implementing Weight-Decomposed Low-Rank Adaptation (DoRA) from Scratch
Low-rank adaptation (LoRA) is a machine learning technique that adapts a pretrained model (for example, an LLM or vision transformer) to a specific, often smaller, dataset by adjusting only a small, low-rank subset of the model's parameters. This approach matters because it allows large models to be finetuned on task-specific data at a fraction of the computational cost and time of full finetuning. Last week, researchers proposed DoRA: Weight-Decomposed Low-Rank Adaptation, a new alternative that may outperform LoRA by a large margin. To understand how these methods work, this article implements both LoRA and DoRA from scratch in PyTorch. (Ahead of AI, Sebastian Raschka, Ph.D.) / February 18
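
For readers who want the gist before diving into the article, here is a minimal PyTorch sketch of the two ideas. This is not the article's own code; the class names and hyperparameters (rank, alpha) are illustrative. LoRA adds a trainable low-rank update B @ A on top of a frozen layer, while DoRA additionally decomposes the weight into a learned magnitude vector and a column-normalized direction, applying the low-rank update to the direction only.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Adds a trainable low-rank update B @ A to a frozen linear layer."""
    def __init__(self, linear: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.linear = linear
        for p in self.linear.parameters():
            p.requires_grad_(False)  # freeze the pretrained weights
        out_f, in_f = linear.weight.shape
        # B starts at zero, so the adapted model initially matches the original
        self.A = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x) + self.scaling * (x @ self.A.T @ self.B.T)


class DoRALinear(nn.Module):
    """DoRA-style layer: decomposes the weight into a magnitude vector and a
    direction, then applies the low-rank update to the direction only."""
    def __init__(self, linear: nn.Linear, rank: int = 8):
        super().__init__()
        self.linear = linear
        for p in self.linear.parameters():
            p.requires_grad_(False)
        out_f, in_f = linear.weight.shape
        self.A = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, rank))
        # trainable magnitude, initialized to the pretrained column norms
        self.m = nn.Parameter(linear.weight.norm(p=2, dim=0, keepdim=True))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v = self.linear.weight + self.B @ self.A      # updated direction
        v = v / v.norm(p=2, dim=0, keepdim=True)      # column-normalize
        out = x @ (self.m * v).T
        return out if self.linear.bias is None else out + self.linear.bias


# Quick check: both wrappers accept the same input as the wrapped nn.Linear.
base = nn.Linear(512, 512)
x = torch.randn(4, 512)
print(LoRALinear(base, rank=8)(x).shape, DoRALinear(base, rank=8)(x).shape)
```

In both cases only A, B (and DoRA's magnitude m) are trained, which is why these methods touch so few parameters relative to the full model.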

Magic AI Secures $117 Million to Build an AI Software Engineer
San Francisco-based startup Magic AI has raised $117 million in Series B funding to further develop its AI system for automating software development. The round was led by Nat Friedman and Daniel Gross's NFDG Ventures, with additional participation from CapitalG and Elad Gil, bringing Magic's total funding to date to over $145 million. Founded in 2022 by Eric Steinberger and Sebastian De Ro, the startup is carving out a niche by building an AI software engineer capable of handling complex coding tasks, acting more as a coworker than merely a "copilot" tool. (Maginative, Chris McKay) / February 16
