lym00 committed · Commit 79a04cc · verified · 1 parent: 8cbc120

Update README.md

Files changed (1): README.md (+26 -0)

README.md CHANGED
@@ -122,6 +122,32 @@ potential fix: app.diffusion.pipeline.config.py
  return pipeline
  ```

+Log
+```
+>>> DEVICE: cuda
+>>> PIPELINE TYPE: <class 'diffusers.pipelines.flux.pipeline_flux.FluxPipeline'>
+>>> Moving transformer to cuda using to_empty()
+>>> WARNING: transformer.to_empty(cuda) failed: Module.to_empty() takes 1 positional argument but 2 were given
+>>> Falling back to transformer.to(cuda)
+>>> ERROR: transformer.to(cuda) also failed: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
+>>> Moving vae to cuda using to_empty()
+>>> WARNING: vae.to_empty(cuda) failed: Module.to_empty() takes 1 positional argument but 2 were given
+>>> Falling back to vae.to(cuda)
+>>> Moving text_encoder to cuda using to_empty()
+>>> WARNING: text_encoder.to_empty(cuda) failed: Module.to_empty() takes 1 positional argument but 2 were given
+>>> Falling back to text_encoder.to(cuda)
+25-07-21 22:47:05 | I | Replacing fused Linear with ConcatLinear.
+25-07-21 22:47:05 | I | + Replacing fused Linear in single_transformer_blocks.0 with ConcatLinear.
+25-07-21 22:47:05 | I |   - in_features = 3072/15360
+25-07-21 22:47:05 | I |   - out_features = 3072
+25-07-21 22:47:05 | I | + Replacing fused Linear in single_transformer_blocks.1 with ConcatLinear.
+25-07-21 22:47:05 | I |   - in_features = 3072/15360
+25-07-21 22:47:05 | I |   - out_features = 3072
+25-07-21 22:47:05 | I | + Replacing fused Linear in single_transformer_blocks.2 with ConcatLinear.
+25-07-21 22:47:05 | I |   - in_features = 3072/15360
+25-07-21 22:47:05 | I |   - out_features = 3072
+```
+
 2) KeyError: <class 'diffusers.models.transformers.transformer_flux.FluxAttention'>
