<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Rajan Ghimire</title>
    <link>https://R4j4n.github.io/blogs/</link>
    <description>Recent content on Rajan Ghimire</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>en-us</language>
    <lastBuildDate>Sat, 07 Oct 2023 00:00:00 +0000</lastBuildDate><atom:link href="https://R4j4n.github.io/blogs/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Transformers Optimization: Part 1 - KV Cache</title>
      <link>https://R4j4n.github.io/blogs/posts/kv/</link>
      <pubDate>Sat, 07 Oct 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/kv/</guid>
      <description>Understanding the KV cache: how it works and how it compares with the vanilla architecture.</description>
    </item>
    
    <item>
      <title>Decoding Strategies in Language Models</title>
      <link>https://R4j4n.github.io/blogs/posts/text_decoding/</link>
      <pubDate>Fri, 15 Sep 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/text_decoding/</guid>
      <description>Exploring and implementing text decoding strategies in PyTorch</description>
    </item>
    
    <item>
      <title>Supercharge Your LLaMA: Fine-Tuning Made Effortless and Efficient 🚀</title>
      <link>https://R4j4n.github.io/blogs/posts/apdapter/</link>
      <pubDate>Fri, 08 Sep 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/apdapter/</guid>
      <description>Implementing the LLaMA-Adapter from scratch to explore its efficiency and versatility.</description>
    </item>
    
    <item>
      <title>The Secret Sauce of LLaMA🦙 : A Deep Dive!</title>
      <link>https://R4j4n.github.io/blogs/posts/llama/</link>
      <pubDate>Sun, 20 Aug 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/llama/</guid>
      <description>Understanding the ins and outs of Meta&amp;#39;s LLaMA (Open and Efficient Foundation Language Models) from scratch.</description>
    </item>
    
    <item>
      <title>Semantic Segmentation from scratch in PyTorch.</title>
      <link>https://R4j4n.github.io/blogs/posts/deeplab/</link>
      <pubDate>Tue, 25 Jul 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/deeplab/</guid>
      <description>Building a custom person segmenter from scratch.</description>
    </item>
    
    <item>
      <title>Quantization in PyTorch: Optimizing Architectures for Enhanced Performance</title>
      <link>https://R4j4n.github.io/blogs/posts/quantization/</link>
      <pubDate>Sat, 15 Jul 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/quantization/</guid>
      <description>Dissecting static, dynamic, and quantization-aware training in PyTorch.</description>
    </item>
    
    <item>
      <title>LoRA (Low-Rank Adaptation): A Deeper Dive</title>
      <link>https://R4j4n.github.io/blogs/posts/lora/</link>
      <pubDate>Mon, 06 Mar 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/lora/</guid>
      <description>Exploring and implementing LoRA in PyTorch.</description>
    </item>
    
    <item>
      <title>Vision Transformer (ViT)</title>
      <link>https://R4j4n.github.io/blogs/posts/vit/</link>
      <pubDate>Mon, 06 Feb 2023 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/posts/vit/</guid>
      <description>ViT from scratch in PyTorch.</description>
    </item>
    
    <item>
      <title>About</title>
      <link>https://R4j4n.github.io/blogs/about/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      
      <guid>https://R4j4n.github.io/blogs/about/</guid>
      <description>I wanna sleep so much, but this field is so damn interesting! Hello! ¡Hola! Bonjour! Привет! こんにちは！ 你好！ 안녕하세요! مرحباً! नमस्ते!
I&amp;rsquo;m Rajan Ghimire, a computer engineering graduate with a keen interest in implementing deep learning architectures and papers from scratch. My blend of theoretical knowledge and hands-on experience lets me bridge the gap between academia and industry, crafting solutions that push the boundaries of what&amp;rsquo;s possible.</description>
    </item>
    
  </channel>
</rss>
