DepthScale API Reference¶
DepthScale is a framework designed to implement a Universal Self-Decoder for transformer models, focusing on parameter-shared iterative reasoning with constant memory overhead and convergence-based refinement.
src/universal_yoco/__init__.py¶
This module serves as the main entry point for the universal_yoco package, typically handling package-level imports and initialization.
Contents:
- Exposes core components from submodules.
Example Usage:

```python
from universal_yoco.yoco_base import UniversalYoco

# Initialize the core reasoning engine
model = UniversalYoco(...)
```
src/universal_yoco/types.py¶
This module defines the core data structures and type hints used throughout the DepthScale framework, ensuring semantic coherence across reasoning steps.
Key Classes/Functions:
ReasoningState¶
- Signature: `class ReasoningState(TypedDict)`
- Description: A structured container holding the current state of the reasoning process, including the input prompt, intermediate thoughts, and the current output hypothesis.
- Example Usage:
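A minimal sketch of how `ReasoningState` might be constructed. The reference describes its contents only informally, so the field names used here (`prompt`, `thoughts`, `hypothesis`) are illustrative assumptions, not the framework's actual keys:

```python
from typing import List, TypedDict


class ReasoningState(TypedDict):
    # Hypothetical field names; the actual keys of ReasoningState
    # are not specified in this reference.
    prompt: str          # the original input prompt
    thoughts: List[str]  # intermediate reasoning traces
    hypothesis: str      # the current output hypothesis


state: ReasoningState = {
    "prompt": "What is 2 + 2?",
    "thoughts": [],
    "hypothesis": "",
}
```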
AttentionConfig¶
- Signature: `class AttentionConfig(TypedDict)`
- Description: Configuration parameters for the specialized attention mechanisms used within the self-decoder, controlling how context is integrated across iterations.
- Example Usage:
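A minimal sketch of building an `AttentionConfig`. The `memory_decay_rate` key is taken from the `UniversalYoco` initialization example in this reference; whether other keys exist is not specified, so the sketch declares only that field:

```python
from typing import TypedDict


class AttentionConfig(TypedDict, total=False):
    # memory_decay_rate is the only key shown elsewhere in this reference;
    # total=False leaves room for fields this sketch does not know about.
    memory_decay_rate: float


cfg: AttentionConfig = {"memory_decay_rate": 0.9}
```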
src/universal_yoco/yoco_base.py¶
This is the core implementation module, housing the UniversalYoco class which orchestrates the parameter-shared iterative reasoning process.
Key Classes/Functions:
UniversalYoco¶
- Signature: `class UniversalYoco(nn.Module)`
- Description: The main framework class. It encapsulates the transformer model weights and manages the iterative refinement loop, enforcing parameter sharing across all reasoning steps.
- Parameters:
    - `transformer_model`: The underlying pre-trained transformer architecture (e.g., GPT-2, BERT).
    - `attention_config`: Configuration for the specialized attention mechanisms.
    - `max_iterations`: The maximum number of refinement steps allowed.
- Example Usage:
```python
import torch.nn as nn

from universal_yoco.yoco_base import UniversalYoco
from universal_yoco.types import AttentionConfig

# Assume 'base_transformer' is an initialized PyTorch model
base_transformer = ...
attention_cfg: AttentionConfig = {"memory_decay_rate": 0.9}

yoco_engine = UniversalYoco(
    transformer_model=base_transformer,
    attention_config=attention_cfg,
    max_iterations=10,
)
```
UniversalYoco.reason(state: ReasoningState) -> ReasoningState¶
- Signature: `reason(self, state: ReasoningState) -> ReasoningState`
- Description: Executes the iterative reasoning process starting from the given `ReasoningState`. It applies the shared transformer weights repeatedly, updating the state until a convergence criterion is met or `max_iterations` is reached.
- Returns: The final, refined `ReasoningState`.
- Example Usage:
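The reference leaves this example blank, so the following toy sketch illustrates only the iterate-until-convergence contract described above. `apply_shared_step` is a numeric stand-in for the real shared transformer step (the same function, i.e. the same "weights", is reused on every iteration), and the state is a plain dict rather than a full `ReasoningState`:

```python
def apply_shared_step(state: dict) -> dict:
    # Toy refinement step: move the hypothesis halfway toward a fixed
    # point (1.0), so successive iterates demonstrably converge.
    new = dict(state)
    new["hypothesis"] = (state["hypothesis"] + 1.0) / 2.0
    return new


def reason(state: dict, max_iterations: int = 10, tol: float = 1e-3) -> dict:
    # The same step function is applied at every iteration, mirroring
    # how UniversalYoco reuses one set of transformer weights.
    for _ in range(max_iterations):
        next_state = apply_shared_step(state)
        if abs(next_state["hypothesis"] - state["hypothesis"]) < tol:
            return next_state  # converged before hitting max_iterations
        state = next_state
    return state


result = reason({"hypothesis": 0.0})
```

Because each step halves the distance to the fixed point, the loop's convergence test fires once successive hypotheses differ by less than `tol`.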
_apply_shared_step(state: ReasoningState) -> ReasoningState¶
- Signature: `_apply_shared_step(self, state: ReasoningState) -> ReasoningState`
- Description: An internal method responsible for a single reasoning iteration. It feeds the current `ReasoningState` into the shared transformer, processes the output through the specialized attention layers, and generates the next state. Reusing the same weights at every step is what keeps the memory overhead constant.
- Example Usage: (Internal use only)
.worktrees/issue-55cdb54d-02-yoco-base-layer/src/universal_yoco/types.py¶
(Note: This module appears to be a specific version or branch of the main types.py. It shares the same API structure as the main module but might contain version-specific type definitions.)
Key Classes/Functions:
- Structure: Identical to `src/universal_yoco/types.py`.
- Purpose: Provides type definitions tailored for the specific branch/issue context, ensuring compatibility with the base layer implementation.
.worktrees/issue-55cdb54d-02-yoco-base-layer/src/universal_yoco/yoco_base.py¶
(Note: This module is the specific implementation file corresponding to the branch/issue. It contains the concrete implementation of the logic described in src/universal_yoco/yoco_base.py.)
Key Classes/Functions:
- Structure: Identical to `src/universal_yoco/yoco_base.py`.
- Purpose: Contains the production-ready, branch-specific implementation of the `UniversalYoco` engine, leveraging the shared architecture.
Summary of Core Concepts¶
| Concept | Module | Description |
|---|---|---|
| Parameter Sharing | `yoco_base.py` | The `UniversalYoco` class ensures the same transformer weights are used across all iterative calls to maintain constant memory overhead. |
| Iterative Reasoning | `yoco_base.py` | The `reason()` method drives the loop, refining the `ReasoningState` step by step. |
| State Management | `types.py` | `ReasoningState` provides a structured, coherent way to pass context (thoughts, hypotheses) between iterations. |
| Refinement Mechanism | `yoco_base.py` | The internal logic in `_apply_shared_step` implements convergence-based refinement using specialized attention. |