Base Logical Elements¶
- class jlnn.nn.base.LogicalElement(*args: Any, **kwargs: Any)[source]¶
Bases:
nnx.Module
The basic abstract class for logical elements within JLNN.
This class serves as the base for all logical gates (AND, OR, Implicature) and predicates. It provides initialization and management of trainable parameters such as weights and thresholds (beta), in accordance with the Logical Neural Networks (LNN) architecture.
Within LNN, we work with interval logic, where each input and output represents a truth value as an interval [Lower Bound, Upper Bound].
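As an illustration of this interval contract (the values below are arbitrary examples, not taken from the library):

```python
import jax.numpy as jnp

# A batch of three inputs, each a truth interval [Lower Bound, Upper Bound],
# matching the expected shape (..., n_inputs, 2).
x = jnp.array([
    [0.9, 1.0],   # close to classically True
    [0.0, 0.1],   # close to classically False
    [0.3, 0.8],   # uncertain: the truth value lies somewhere in [0.3, 0.8]
])

lower, upper = x[..., 0], x[..., 1]
valid = bool(jnp.all(lower <= upper))   # every well-formed interval has L <= U
```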
- weights¶
Trainable weights for each input. In LNN, they are initialized to 1.0 (neutral influence).
- Type:
nnx.Param
- beta¶
Trainable gate bias, determining the steepness of the logic activation.
- Type:
nnx.Param
- Raises:
NotImplementedError – Raised when __call__ is invoked on the abstract base class rather than a concrete gate.
- __call__(x: Array) Array[source]¶
Abstract method for performing a logical operation.
- Parameters:
x (jnp.ndarray) – Input interval tensor of the form (…, n_inputs, 2). The last dimension contains [L, U].
- Returns:
The resulting truth interval [L, U].
- Return type:
jnp.ndarray
- Raises:
NotImplementedError – This method must be implemented in a specific gate.
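As a sketch of what a concrete __call__ might look like, the following applies a weighted Łukasiewicz conjunction, a common choice in the LNN literature, bound-wise over intervals. The function name and exact formula are assumptions for illustration, not JLNN's confirmed implementation:

```python
import jax.numpy as jnp

def and_call(x: jnp.ndarray, weights: jnp.ndarray, beta: float) -> jnp.ndarray:
    """Hypothetical __call__ body for an AND gate over truth intervals.

    x has shape (..., n_inputs, 2); the last axis holds [L, U].
    Weighted Lukasiewicz AND: clip(beta - sum_i w_i * (1 - t_i), 0, 1),
    applied to the lower and upper bounds independently.
    """
    slack = weights[..., None] * (1.0 - x)            # (..., n_inputs, 2)
    return jnp.clip(beta - jnp.sum(slack, axis=-2), 0.0, 1.0)   # (..., 2)

x = jnp.array([[0.9, 1.0],    # input 1: nearly certainly true
               [0.8, 0.9]])   # input 2: probably true
out = and_call(x, jnp.ones(2), beta=1.0)   # -> interval [0.7, 0.9]
```

With neutral weights (1.0) and beta = 1.0 this reduces to the plain Łukasiewicz t-norm, which is why that initialization is described as non-biasing.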
This module defines an abstract base class for all logical components in JLNN. It uses Flax NNX for object-oriented state management, which allows for native parameter handling within JAX.
Key features¶
- NNX Integration: The class inherits from nnx.Module, meaning weights and beta are automatically tracked as nnx.Param.
- Interval Contract: Expects inputs in the format (..., n_inputs, 2) and returns an output interval [L, U].
- Initialization: Weights are initialized to 1.0 (neutral influence) according to the LNN standard, to prevent accidental bias in logical reasoning before training.