llmcompressor.utils.fsdp.helpers
get_fsdp_parent(layer_name, model)
Gets the closest parent of layer_name that is wrapped by FSDP. If no FSDP wrapper is found, returns None.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| layer_name | str | layer name in model to get parent of | required |
| model | Module | pytorch module to search through | required |
Returns:

| Type | Description |
| --- | --- |
| Optional[Module] | FSDP wrapped parent of layer_name if available, otherwise None |
Source code in src/llmcompressor/utils/fsdp/helpers.py
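A minimal usage sketch; the layer name and `model` below are hypothetical, only `get_fsdp_parent` comes from this module:

```python
from llmcompressor.utils.fsdp.helpers import get_fsdp_parent

# Hypothetical layer name in an FSDP-wrapped transformer; `model` is assumed
# to be a module whose decoder layers were individually wrapped by FSDP.
parent = get_fsdp_parent("model.layers.0.self_attn.q_proj", model)

if parent is None:
    print("no FSDP wrapper found above this layer")
else:
    # parent is the closest enclosing FullyShardedDataParallel instance
    print(type(parent).__name__)
```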
is_fsdp_model(model)
Check if a model instance is wrapped by FSDP.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| model | Module | pytorch model to check | required |
Returns:

| Type | Description |
| --- | --- |
| bool | True if module is wrapped, False otherwise |
Source code in src/llmcompressor/utils/fsdp/helpers.py
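A small sketch of the check; the FSDP wrapping step is shown commented out because it requires an initialized distributed process group:

```python
import torch
from llmcompressor.utils.fsdp.helpers import is_fsdp_model

model = torch.nn.Linear(8, 8)
assert not is_fsdp_model(model)  # plain nn.Module, no FSDP wrapper

# With a distributed process group initialized, the wrapped model passes:
# from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
# assert is_fsdp_model(FSDP(model))
```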
maybe_get_wrapped(model)
Given a model that may or may not have a distributed wrapper, return the underlying model: the module held inside the FSDP wrapper if one is present, otherwise the model itself.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| model | Module | input model to get wrapped model from | required |
Returns:

| Type | Description |
| --- | --- |
| Module | wrapped model |
Source code in src/llmcompressor/utils/fsdp/helpers.py
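A sketch of the typical use, assuming `model` may or may not be FSDP-wrapped:

```python
from llmcompressor.utils.fsdp.helpers import maybe_get_wrapped

# For a plain module this is a no-op; for an FSDP-wrapped model it returns
# the module held inside the wrapper, so downstream code can operate on the
# underlying layers either way.
inner = maybe_get_wrapped(model)
for name, _ in inner.named_modules():
    ...
```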
set_wrapped_model(state, wrapped_model)
Given a state with a model that may or may not have a distributed wrapper, set the underlying wrapped model.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| state | State | state to update the model of | required |
| wrapped_model | Module | model to inject into state.model | required |
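A round-trip sketch with `maybe_get_wrapped`; `state` is assumed to be a llmcompressor State holding the model, and `apply_transform` is a hypothetical user-defined function:

```python
from llmcompressor.utils.fsdp.helpers import maybe_get_wrapped, set_wrapped_model

# Pull the underlying model out of the state, modify it, then put it back;
# if state.model is FSDP-wrapped, the wrapper itself is left in place.
inner = maybe_get_wrapped(state.model)
modified = apply_transform(inner)  # hypothetical: returns a torch.nn.Module
set_wrapped_model(state, modified)
```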