I am an inquisitive, tech-savvy person with a drive to deepen my knowledge and skills in LSM-KVS, disaggregated infrastructure, and storage systems for AI/ML applications!

news

Jul 08, 2024 Won the Best Paper Award at HotStorage '24 for our paper "Can Modern LLMs Tune and Configure LSM-based Key-Value Stores?"! 🏆
Aug 01, 2023 Started my Ph.D. journey under the guidance of Dr. Zhichao Cao at Arizona State University!
Aug 01, 2023 Graduated from ASU with an M.S. in Computer Science! 🎓
May 21, 2023 Started my Summer Internship at Amazon as a Software Development Engineer Intern! 🚀
Aug 01, 2022 Working on a redesign of my website!

selected publications

  1. HotStorage

    Can Modern LLMs Tune and Configure LSM-based Key-Value Stores?

    In Proceedings of the 16th ACM Workshop on Hot Topics in Storage and File Systems, Santa Clara, CA, USA, Jun 2024

    Log-Structured Merge tree-based Key-Value Stores (LSM-KVSs) are important data storage building blocks in modern IT infrastructure. However, tuning their performance involves configuring over 100 parameters, a task typically done manually or with only a limited parameter set in auto-tuning mechanisms. This paper explores and answers the following question: can we leverage an LLM's understanding of the system and of LSM-KVS components for unrestricted parameter-pool tuning of LSM-KVS? LLMs are trained on readily available LSM-KVS source code, research papers, and open materials, giving them a human-like understanding of these systems. We investigate integrating Large Language Models (LLMs) into an automated tuning framework for LSM-KVS to enhance tuning capability and interactivity. Our framework utilizes LLMs to recommend tailored configurations via calibrated prompts based on hardware, system, and workload information. Initial results demonstrate up to 3X throughput improvement and up to 9X reduction in p99 latency across various hardware and workloads compared to the out-of-the-box LSM-KVS configuration.