Generative AI With Kaggle Models

Exploring Kaggle’s Trailblazing Generative AI Models

A.I Hub
6 min read · Jul 22, 2024
Image owned by Diplo Foundation

In this article, we will walk you through the journey of learning generative artificial intelligence with Kaggle Models. In the vast landscape of artificial intelligence, where innovation meets imagination, Kaggle's generative AI models emerge as beacons of creativity. Powered by vast datasets and refined algorithms, they herald a new era where machines not only learn but also inspire, pushing the boundaries of what is possible in art, design, and beyond.

Table of Contents

  • Introducing Kaggle Models
  • Prompting a Foundation Model
  • Model Evaluation and Testing

Introducing Kaggle Models

Image owned by Kaggle

Kaggle Models represent one of the latest innovations on the Kaggle platform. This feature gained prominence in particular after the introduction of code competitions, where participants often train models either on their local hardware or in the cloud. Post-training, they upload these models to Kaggle as a dataset. This practice allows Kagglers to utilize these pre-trained models in their inference notebooks, streamlining the process for code competition submissions. This method significantly reduces the runtime of the inference notebooks, fitting within the stringent time and memory constraints of the competition. Kaggle's endorsement of this approach aligns well with real-world production systems, where model training and inference typically occur in separate pipelines.

This strategy becomes indispensable with large-scale models, such as those based on transformer architectures, considering the immense computational resources required for fine-tuning. Platforms like HuggingFace have further democratized access to large models, offering options to either use them online or download collaboratively developed models. Kaggle's introduction of the Models feature, which can be added to notebooks just like datasets, has been a significant advancement. These models can be directly used within a notebook for tasks like transfer learning or further fine-tuning. At the time of writing, however, Kaggle does not permit users to upload their models in the same manner as datasets.

Kaggle's model library offers browsing and search functionality, allowing users to find models based on various criteria such as name, metadata, task, data type, and more. At the time of writing, the library boasted 269 models with 1,997 variations, published by prominent organizations including Google, TensorFlow, Kaggle, DeepMind, Meta, and Mistral.

The field of generative AI has seen a surge in interest following the introduction of models like GPT-3, ChatGPT, GPT-4, and various other LLMs or foundation models. Kaggle provides access to several powerful LLMs, such as Llama, Alpaca, and Llama 2. The platform's integrated ecosystem allows users to swiftly test new models as they emerge. For instance, Meta's Llama 2, available since July 18, 2023, is a series of generative text models with variants ranging from 7 billion to 70 billion parameters. These models, including specialized versions for chat applications, are accessible on Kaggle with relative ease compared to other platforms.

Kaggle further simplifies the process by allowing users to start a notebook directly from the model page, akin to initiating a notebook from a competition or dataset. This streamlined approach, as illustrated in the screenshot, enhances the user experience and fosters a more efficient workflow in model experimentation and application.

Figure 1.1 - Main page of the Mistral Model, with the button to Add a Notebook in the top-right corner

Once the notebook is open in the editor, the model is already added to it. One more step is needed in the case of models, because a model also has variations, versions, and frameworks. In the right-hand panel of the notebook edit window, you can set these options. After these options are set, we are ready to use the model within the notebook. In Figure 1.2, we show the options for one model, Mistral from Mistral AI, after everything was selected in the menu.

Figure 1.2 - Model Mistral from Mistral AI is added to a notebook, and all options are selected

Prompting a Foundation Model

Image owned by HBR

LLMs can be used directly for tasks such as summarization, question answering, and reasoning. Due to the very large amounts of data on which they were trained, they can answer a wide variety of questions on many subjects very well, since they have the context available in that training dataset.

In many practical cases, such LLMs can correctly answer our questions on the first attempt. In other cases, we will need to provide a few clarifications or examples. The quality of the answers in these zero-shot or few-shot approaches highly depends on the ability of the user to craft the prompts for the LLM. In this section, we will show the simplest way to interact with an LLM on Kaggle, using prompts.
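As a quick illustration of the two prompt styles (our own examples, not from the original article), a zero-shot prompt asks the question directly, while a few-shot prompt prepends a couple of worked examples to guide the model:

```python
# Illustrative zero-shot and few-shot prompts; the task and wording here
# are assumptions for demonstration, not prompts from the original article.

# Zero-shot: ask directly, with no examples.
zero_shot_prompt = "Translate to French: Good morning"

# Few-shot: prepend worked examples so the model can infer the pattern.
few_shot_prompt = (
    "English: Thank you -> French: Merci\n"
    "English: See you soon -> French: A bientot\n"
    "English: Good morning -> French:"
)
```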

Model Evaluation and Testing

Image owned by Hotjar

Before starting to use an LLM on Kaggle, we will need to perform a few preparation steps. We begin by loading the model and then defining a tokenizer. Next, we create a model pipeline. In our first code example, we will use AutoTokenizer from transformers as the tokenizer and create a pipeline, also using the transformers pipeline.
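A minimal sketch of what that setup might look like, assuming the Llama 2 7B chat model has been attached to the notebook; the exact MODEL_PATH is an assumption and depends on the model, variation, and version selected in the right-hand panel:

```python
import torch
import transformers
from transformers import AutoTokenizer

# Assumed mount point of the attached Kaggle model; check the notebook's
# right-hand panel for the actual path.
MODEL_PATH = "/kaggle/input/llama-2/pytorch/7b-chat-hf/1"

# Load the tokenizer that matches the model weights.
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)

# Build a text-generation pipeline on top of the same weights.
pipeline = transformers.pipeline(
    "text-generation",
    model=MODEL_PATH,
    torch_dtype=torch.float16,  # half precision to fit in GPU memory
    device_map="auto",          # place layers on the available GPU(s)
)
```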

The preceding code returns the tokenizer and the pipeline. We then implement a function to test the model. The function receives as parameters the tokenizer, the pipeline, and the prompt with which we would like to test the model. See the following code for the test function.
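One possible shape for such a function; the name test_model and the generation settings are our own choices, since the original code is not shown:

```python
import time

def test_model(tokenizer, pipeline, prompt):
    """Run the prompt through the pipeline and print the answer and timing."""
    start = time.time()
    sequences = pipeline(
        prompt,
        do_sample=True,
        top_k=10,
        num_return_sequences=1,
        eos_token_id=tokenizer.eos_token_id,
        max_length=256,
    )
    elapsed = time.time() - start

    # Report inference time, the prompt, and the generated answer,
    # matching what the screenshots in this article display.
    print(f"Inference time: {elapsed:.2f} s")
    print(f"Prompt: {prompt}")
    for seq in sequences:
        print(f"Answer: {seq['generated_text']}")
```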

Now, we are ready to prompt the model. The model we are using is a Llama 2 model (7B), the chat version from HuggingFace (version 1), with the PyTorch framework. We will prompt the model with math questions. In the next code extract, we initialize the tokenizer and pipeline and then prompt the model with a simple arithmetic problem, formulated in plain language.
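Using the sketches above, the call might look like the following; the arithmetic question itself is our own illustrative stand-in, since the original prompt appears only in the screenshot:

```python
# Initialize the tokenizer and pipeline (see the setup sketch above), then
# ask a simple arithmetic question in plain language (illustrative wording).
prompt = "Alice has 5 apples and buys 3 more. How many apples does Alice have now?"
test_model(tokenizer, pipeline, prompt)
```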

Let's see how the model reasons. In this screenshot, we show the inference time, the prompt, and the answer.

Figure 1.3 - Prompt, answer, and inference time for a math question with the Llama 2 model

For this simple math question, the reasoning of the model seems accurate. Let's try again with a different question. In the following code excerpt, we ask a geometry question.
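Again, the exact question is our own illustrative stand-in:

```python
# Prompt the model with a basic geometry question (illustrative wording).
prompt = "A rectangle has a width of 4 cm and a length of 9 cm. What is its area?"
test_model(tokenizer, pipeline, prompt)
```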

This screenshot shows the result of prompting the model with the preceding geometry question.

Figure 1.4 - Llama 2 model answer to a basic geometry question

The response to simple mathematical questions is not correct all of the time. In this example, we prompted the model with a variation of the first algebraic problem. You can see that, in this case, the model took a convoluted and wrong path to reach an incorrect solution.
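A call of this shape, with an illustrative variation of the earlier question (our own wording), might be:

```python
# A variation of the earlier arithmetic problem (illustrative wording).
prompt = (
    "Alice has 5 apples, gives 2 of them to Bob, and then buys twice as many "
    "apples as she has left. How many apples does Alice have now?"
)
test_model(tokenizer, pipeline, prompt)
```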

Figure 1.5 - Llama 2 model solution is wrong for an algebra problem

Conclusion

Finally, we took a descriptive tour of learning generative AI with Kaggle Models and, along the way, explored some of the most powerful generative models used by several top companies. In the evolving landscape of AI, Kaggle models stand as beacons of innovation and collaboration, illuminating paths to unprecedented realms of generative potential. Through these models, we witness the fusion of human ingenuity and machine learning prowess, birthing algorithms that not only decipher data but breathe life into creativity itself. As we embark on this journey of exploration and discovery, Kaggle's community-driven ethos reminds us that the future belongs not just to those who dream, but to those who dare to transform those dreams into reality. Embrace the dawn of generative AI with Kaggle models, where each line of code heralds a new chapter in the saga of technological advancement.
