Model Details

This is a fine-tuned version of GPT-2 trained on a coding dataset.

Model Description

  • Model type: text-generation
  • Finetuned from model: GPT-2

Model Sources

Uses

# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="not-lain/PyGPT")
prompt = """
Below is an instruction that describes a task. Write a response that
appropriately completes the request.


### Instruction:

Create a function to calculate the sum of a sequence of integers.


### Input:

[1, 2, 3, 4, 5]


### Output:
"""

# The text-generation pipeline returns a list of dicts with the generated text
result = pipe(prompt)
print(result[0]["generated_text"])
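The pipeline call above wraps tokenization and the generation loop. For more control over decoding, the model can also be loaded with the lower-level AutoTokenizer / AutoModelForCausalLM classes. The sketch below reuses the prompt variable defined above; the max_new_tokens value is only an illustrative choice, not a setting from this model card.

# Lower-level usage: load the tokenizer and model directly
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("not-lain/PyGPT")
model = AutoModelForCausalLM.from_pretrained("not-lain/PyGPT")

# Tokenize the instruction prompt and generate a completion
# (max_new_tokens=128 is an illustrative value, not specified by the model)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))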

Bias, Risks, and Limitations

The model may produce biased or erroneous output.

Recommendations

This model is not recommended for general use; it is the product of testing a fine-tuning script.

Training Details

Training Data

[More Information Needed]

Evaluation

Please refer to the TensorBoard tab for full evaluation details.
