---
license: apache-2.0
datasets:
- BI55/MedText
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- medical
---
# Model Card for Phi-Med-V1

Microsoft Phi-2 fine-tuned on medical text data (the MedText dataset).

## Model Details

### Model Description

Phi-Med-V1 is a small language model (SLM) produced by supervised fine-tuning of Microsoft's Phi-2 (2.7B parameters) on the MedText medical dataset.

- **Developed by:** JJ
- **Model type:** Small language model (SLM)
- **Finetuned from model:** [microsoft/phi-2](https://huggingface.co/microsoft/phi-2)


## Uses

This model was built to test the effectiveness of fine-tuning small language models (SLMs) on domain-specific medical text.

### Direct Use

Direct use is not allowed; the model is released for research purposes only.
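
For research evaluation, the model loads like any other Transformers causal LM. A minimal sketch follows; the Hub repo id is a placeholder assumed from this card's title, not a confirmed path:

```python
from transformers import pipeline

# Placeholder repo id (assumed from the card title); substitute the
# actual Hub path of this model.
generator = pipeline(
    "text-generation",
    model="JJ/Phi-Med-V1",
    device_map="auto",
)

prompt = "A 45-year-old man presents with crushing chest pain radiating to the left arm."
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```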




## Bias, Risks, and Limitations

The model can still hallucinate and may produce plausible-sounding but incorrect medical information.


## Training Details

### Training Data

The [BI55/MedText](https://huggingface.co/datasets/BI55/MedText) dataset from the Hugging Face Hub.
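
To inspect the data, it can be pulled straight from the Hub. The column names referenced in the comment below are assumptions taken from the dataset card and worth verifying:

```python
from datasets import load_dataset

# MedText is assumed to pair a "Prompt" (patient presentation) with a
# "Completion" (diagnosis/treatment); check ds.column_names to confirm.
ds = load_dataset("BI55/MedText", split="train")
print(len(ds), ds.column_names)
```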

### Training Procedure 

Supervised fine-tuning (SFT) with the Hugging Face Transformers stack, run through Axolotl (see Technical Specifications below).
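
Since the actual run went through Axolotl, the snippet below is only a minimal Transformers-only sketch of an equivalent SFT setup; the hyperparameters and the Prompt/Completion column names are illustrative assumptions, not the values used for Phi-Med-V1:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # Phi-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Concatenate each prompt/completion pair into a single training sequence.
dataset = load_dataset("BI55/MedText", split="train")

def tokenize(example):
    text = example["Prompt"] + "\n" + example["Completion"] + tokenizer.eos_token
    return tokenizer(text, truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="phi-med-v1",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-5,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```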


## Environmental Impact

- **Hardware Type:** A10 GPU VMs (2× 24 GB NVIDIA A10)
- **Hours used:** 3
- **Cloud Provider:** Azure
- **Compute Region:** North Europe (Dublin)

- Experiments were conducted on Azure in the northeurope region, which has a carbon efficiency of 0.62 kgCO₂eq/kWh. A cumulative 100 hours of computation was performed on A10 hardware (TDP of 350 W).

- Total emissions are estimated at 21.7 kgCO₂eq, 100% of which was directly offset by the cloud provider.
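
- For reference, the arithmetic behind that estimate: 100 h × 0.35 kW × 0.62 kgCO₂eq/kWh ≈ 21.7 kgCO₂eq.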
    
- Estimates were produced with the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute).
                        

## Technical Specifications


### Compute Infrastructure

Azure

#### Hardware

Azure NV72ads A10 v5 GPU VMs (2× NVIDIA A10, 24 GB each)

#### Software

Axolotl

## Model Card Authors

JJ

## Model Card Contact

JJ