justinj92 committed on
Commit
d35f826
1 Parent(s): f70bab7

Update README.md

Files changed (1): README.md (+92 −1)
README.md CHANGED
@@ -8,4 +8,95 @@ library_name: transformers
pipeline_tag: text-generation
tags:
- medical
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

Microsoft Phi-2 fine-tuned on medical text data.

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** JJ
- **Model type:** SLM
- **Finetuned from model:** microsoft/Phi-2

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

Testing the effectiveness of fine-tuning SLMs.

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

Not allowed; this model is for research use only.

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

The model can still hallucinate.

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The MedText dataset from Hugging Face.

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

Supervised fine-tuning (SFT) using Hugging Face Transformers.
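The SFT step can be sketched at the data-preparation level. A minimal illustration, assuming MedText-style records with `Prompt` and `Completion` fields and Phi-2's `Instruct:`/`Output:` prompt format (both assumptions, not confirmed by this card):

```python
# Hypothetical sketch of formatting one prompt/completion pair into a single
# training string for SFT. Field names and prompt format are assumptions --
# adjust to the actual dataset schema and training setup.

def format_example(example: dict) -> str:
    """Join a prompt/completion pair into one SFT training string."""
    return f"Instruct: {example['Prompt']}\nOutput: {example['Completion']}"

# A record shaped like a MedText row (illustrative content only):
row = {
    "Prompt": "A 50-year-old man presents with crushing chest pain. Next step?",
    "Completion": "Obtain an ECG immediately and give aspirin.",
}
text = format_example(row)
```

In a real run, strings like this would be tokenized and fed to a Transformers `Trainer` (or an Axolotl config) for supervised fine-tuning.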

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** A10 GPU VMs (2× 24 GB A10)
- **Hours used:** 3
- **Cloud Provider:** Azure
- **Compute Region:** North Europe (Dublin)
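For a rough sense of scale, the figures above (2 A10 GPUs, 3 hours) can be plugged into the same power × time × grid-intensity formula the calculator uses. The TDP and carbon-intensity values below are illustrative assumptions, not measurements from this run:

```python
# Back-of-envelope CO2 estimate from the figures above.
# Assumed values (not from this card): A10 board power ~150 W; an
# illustrative grid carbon intensity of 0.3 kgCO2eq per kWh.
num_gpus = 2
hours = 3
tdp_kw = 0.150     # assumed A10 power draw in kW
intensity = 0.3    # assumed kgCO2eq/kWh; varies by region and time

energy_kwh = num_gpus * hours * tdp_kw     # GPU energy consumed
emissions_kg = energy_kwh * intensity      # estimated emissions
print(f"{energy_kwh:.2f} kWh, {emissions_kg:.2f} kgCO2eq")
```

This ignores CPU, memory, and datacenter overhead (PUE), so the real figure would be somewhat higher.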

## Technical Specifications

### Compute Infrastructure

Azure

#### Hardware

NV72ads A10 GPU VMs

#### Software

Axolotl

## Model Card Authors

JJ

## Model Card Contact

JJ