CodyBontecou committed
Commit 62d0e38 · 1 Parent(s): c709ed6

TypeError: LLaDAModelLM.__init__() got an unexpected keyword argument 'use_flash_attn'

Files changed (1):
  1. handler.py +0 -1
handler.py CHANGED
@@ -9,7 +9,6 @@ class EndpointHandler:
             model_dir,
             torch_dtype=torch.bfloat16,
             low_cpu_mem_usage=True,
-            use_flash_attn=False,
             trust_remote_code=True,
             device_map="auto",
         ).eval()
 
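For context, a minimal sketch of what the model-loading code in handler.py likely looks like after this commit. Only the argument list inside the from_pretrained(...) call is taken from the diff; the imports, the use of AutoModel/AutoTokenizer, and the tokenizer setup are assumptions, since the commit shows only the changed lines.

    # Hypothetical reconstruction of handler.py after this change.
    # LLaDAModelLM.__init__() does not accept use_flash_attn, so that keyword
    # is no longer forwarded through from_pretrained(...).
    import torch
    from transformers import AutoModel, AutoTokenizer

    class EndpointHandler:
        def __init__(self, model_dir: str):
            # Load the remote-code model in bfloat16 across available devices.
            self.model = AutoModel.from_pretrained(
                model_dir,
                torch_dtype=torch.bfloat16,
                low_cpu_mem_usage=True,
                trust_remote_code=True,
                device_map="auto",
            ).eval()
            # Tokenizer setup is assumed; it is not part of the diff.
            self.tokenizer = AutoTokenizer.from_pretrained(
                model_dir, trust_remote_code=True
            )

Dropping use_flash_attn=False avoids the TypeError because from_pretrained forwards unrecognized keyword arguments to the model's __init__, which in LLaDAModelLM's case does not declare that parameter.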