CrisisTransformers is a family of pre-trained language models and sentence encoders introduced in the following papers:
The models were trained on a corpus of over 15 billion word tokens from tweets associated with more than 30 crisis events, including disease outbreaks, natural disasters, and conflicts.
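As a minimal sketch of how such pre-trained checkpoints are typically used, the snippet below loads a model and tokenizer with the Hugging Face `transformers` library and encodes a crisis-related tweet. The model identifier shown is a placeholder assumption; substitute the actual checkpoint name published with CrisisTransformers.

```python
from transformers import AutoTokenizer, AutoModel

# Placeholder identifier; replace with the released CrisisTransformers checkpoint name.
model_name = "crisistransformers/CT-M1-Complete"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode an example crisis-related tweet and run a forward pass.
inputs = tokenizer("Floodwaters are rising near the city center.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

For the sentence-encoder variants, the same checkpoint-loading pattern applies, with pooled sentence embeddings typically obtained via a sentence-embedding wrapper rather than raw token outputs.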