Learn and Share


A big part of our success comes from staying current. We maintain a list that guides us, which might help you too. This curated collection of resources, from articles to tools, keeps our team sharp and informed in the ever-evolving field of Generative AI. Dive in to explore, learn, and get inspired.

A Survey of Large Language Models for Healthcare

The use of large language models (LLMs) in the healthcare domain has generated both excitement and concern because of their ability to respond effectively to free-text queries with a degree of professional knowledge. This survey outlines the capabilities of currently developed LLMs for healthcare and explains their development process, with the aim of providing an overview of the roadmap from traditional pretrained language models (PLMs) to LLMs.

How Transformers Work

Generative AI exists because of the transformer. Here is an excellent multimedia explanation of how the transformer works.
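
To make the core mechanism concrete, here is a minimal NumPy sketch of scaled dot-product self-attention, the operation at the heart of the transformer. The toy shapes and random inputs are our own illustration and are not taken from the linked piece.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weigh each value by how similar its key is to the query, then mix."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # weighted sum of values

x = np.random.rand(3, 4)                              # 3 toy tokens, 4-dim embeddings
out = scaled_dot_product_attention(x, x, x)           # self-attention: Q = K = V = x
print(out.shape)                                      # (3, 4): one context-aware vector per token

A real transformer adds learned projections for Q, K, and V, multiple attention heads, and feed-forward layers on top of this single step.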

Do you need a hosted LLM?

A comparison of self-hosted LLMs and OpenAI: cost, text-generation quality, development speed, and privacy.

Building LLM Applications for Production

“How do I deploy an LLM into production?” It’s a question we hear often. Chip Huyen provides the answer, offering an in-depth analysis of how to navigate the process.
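
The post goes far beyond a single endpoint (latency, cost, evaluation, prompt management), but as a minimal, hypothetical sketch of the very first step, here is a generate call wrapped behind a FastAPI endpoint. The generate_text function is a placeholder to swap for your model or provider client, not code from the post.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    text: str
    max_tokens: int = 256

def generate_text(prompt: str, max_tokens: int) -> str:
    # Placeholder: call your self-hosted model or a provider SDK here.
    return f"(echo) {prompt[:max_tokens]}"

@app.post("/generate")
def generate(prompt: Prompt):
    return {"completion": generate_text(prompt.text, prompt.max_tokens)}

# Run with: uvicorn your_module:app --reload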

Challenges and Applications of Large Language Models

Large language models (LLMs) went from non-existent to ubiquitous in the machine learning discourse within a few years. Because of the field's fast pace, it is difficult to identify the remaining challenges and the application areas that have already proven fruitful. This paper aims to establish a systematic set of open problems and application successes so that ML researchers can grasp the field's current state more quickly and become productive.

What are embeddings?

Understanding embeddings is crucial if you want to get far with RAG (retrieval-augmented generation). This PDF is designed to help you do precisely that.
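
As a minimal sketch of why embeddings matter for RAG, the snippet below turns texts into vectors and ranks documents by cosine similarity to a query. The embed function is a hypothetical placeholder that returns pseudo-random unit vectors so the example runs end to end; with a real embedding model the ranking would reflect meaning rather than chance.

import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Placeholder embedding: swap in a real embedding model for meaningful results.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

docs = ["LLMs in healthcare", "Self-hosting cost comparison", "What are embeddings?"]
query_vec = embed("How do embeddings work?")
ranked = sorted(docs, key=lambda d: cosine_similarity(embed(d), query_vec), reverse=True)
print(ranked)    # documents ordered by similarity to the query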
