kmack committed
Commit 09530d1 · verified · 1 Parent(s): a6c247b

Create README.md

Files changed (1):
  1. README.md +66 -0

README.md (added):
---
metrics:
- accuracy
base_model:
- distilbert/distilgpt2
library_name: transformers
tags:
- text-generation-inference
- detective
- bert
---
## **Model Overview**

This model is a fine-tuned version of DistilGPT2 trained on detective stories, with a focus on narratives in the style of Sherlock Holmes. Given an initial prompt, it generates coherent and engaging detective tales, making it suitable for story generation, writing assistance, and creative content creation.

---

## **Model Details**

- **Model Type**: Causal Language Model
- **Architecture**: Transformer
- **Base Model**: DistilGPT2
- **Fine-tuning Task**: Detective Story Generation
- **Training Data**: A curated dataset of detective stories, with a focus on Sherlock Holmes-style narratives and other detective-themed works.
- **Training Objective**: Causal language modeling, i.e. predicting the next token in a sequence given the previous ones (see the sketch after this list).

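The exact training setup is not published in this card. As a rough illustration of the objective above, a minimal causal-LM fine-tuning sketch with the Hugging Face `Trainer` might look like the following; the dataset file name and hyperparameters are placeholders, not the values actually used for this model.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "distilbert/distilgpt2"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# Placeholder corpus: any plain-text file of detective stories
dataset = load_dataset("text", data_files={"train": "detective_stories.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False -> causal language modeling: labels are the input ids shifted by one
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="detective-distilgpt2",
        num_train_epochs=3,
        per_device_train_batch_size=8,
    ),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```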
---

## **Usage**

### **How to Use**

You can use this model to generate detective stories based on an input prompt. Below is a simple code snippet to get started:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model_name = "kmack/DetectiveTales-DistilGPT2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Define your prompt
prompt = "It all started with the murder of the chief. This is the type of case that Sherlock likes and gets off on"

# Tokenize the input
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Generate output
output = model.generate(
    input_ids,
    max_length=100,                       # Adjust the length as needed
    num_return_sequences=1,               # Number of generated outputs
    no_repeat_ngram_size=2,               # Avoid repeating phrases
    top_k=50,                             # Use top-k sampling
    top_p=0.95,                           # Use nucleus sampling
    temperature=0.7,                      # Adjust creativity
    do_sample=True,                       # Enable sampling
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; this silences a warning
)

# Decode and print the generated text
generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_text)
# Output:
# It all started with the murder of the chief. This is the type of case that Sherlock likes and gets off on. It only lasts a few days at last. He would say that Holmes's actions were not criminal. I assure you that if you knew his intentions and how he was about this, you probably wouldn't have had him so far, Watson. Of course that is, then, but in any event, it is the sort of case you know to follow closely. He is the one who has sent the letters from London to Madrid. He has sent every letter that I say, and if there are no objections you can take them out as you will. This is the type of cases where our Lady Watson will decide to get out of such a situation. We made a case for his protection, and he came out as the best possible one. We were all so ready so that he could not have waited for me to meet me again. The whole affair was very unusual. It was a little dark on the night before. He gave me the order.
```
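For quick experiments, the same model can also be loaded through the `pipeline` API instead of calling `generate` directly; a minimal sketch using the same sampling settings:

```python
from transformers import pipeline

# Load the fine-tuned model through the text-generation pipeline
generator = pipeline("text-generation", model="kmack/DetectiveTales-DistilGPT2")

result = generator(
    "It all started with the murder of the chief.",
    max_length=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    temperature=0.7,
    no_repeat_ngram_size=2,
)
print(result[0]["generated_text"])
```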