Logical Gates (Stateful)
- class jlnn.nn.gates.WeightedAnd(*args: Any, **kwargs: Any)
  Bases: Module
  Trainable weighted AND gate implemented using the Łukasiewicz t-norm.
  This gate performs a fuzzy conjunction where the weights determine how much each input contributes to the "negative evidence" against the truth of the conjunction.
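As an illustration, a Łukasiewicz-style weighted conjunction can be sketched in plain Python. This is a minimal sketch for scalar truth values rather than truth intervals, and the function name, exact weighting scheme, and `beta` bias are assumptions for illustration, not jlnn's actual API:

```python
def weighted_and(xs, ws, beta=1.0):
    """Hypothetical weighted Lukasiewicz AND (not jlnn's exact code).

    Each weight w scales the "negative evidence" (1 - x) that an input
    contributes against the conjunction being true; the result is the
    bias beta minus the accumulated negative evidence, clipped to [0, 1].
    """
    negative_evidence = sum(w * (1.0 - x) for x, w in zip(xs, ws))
    return max(0.0, min(1.0, beta - negative_evidence))
```

With unit weights and `beta=1.0` this reduces to the standard Łukasiewicz t-norm `max(0, a + b - 1)` for two inputs.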
- class jlnn.nn.gates.WeightedImplication(*args: Any, **kwargs: Any)
  Bases: Module
  Trainable implication gate (A -> B).
  Supports multiple semantics (Łukasiewicz, Reichenbach, Kleene-Dienes). Ideal for modeling expert-driven rules within the neural architecture.
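The three named semantics are the standard fuzzy implications; a minimal unweighted sketch (the function name and dispatch by string are illustrative, not jlnn's API):

```python
def implies(a, b, semantics="lukasiewicz"):
    """Standard fuzzy implication semantics on scalar truth values in [0, 1]."""
    if semantics == "lukasiewicz":
        return min(1.0, 1.0 - a + b)      # I(a, b) = min(1, 1 - a + b)
    if semantics == "reichenbach":
        return 1.0 - a + a * b            # I(a, b) = 1 - a + a*b
    if semantics == "kleene-dienes":
        return max(1.0 - a, b)            # I(a, b) = max(1 - a, b)
    raise ValueError(f"unknown semantics: {semantics}")
```

All three agree with classical implication on crisp inputs (0 or 1) but differ in how they interpolate between them.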
- class jlnn.nn.gates.WeightedNand(*args: Any, **kwargs: Any)
  Bases: Module
  Weighted NAND gate (negated AND).
  Useful for enforcing constraints where two contradictory statements should not be simultaneously true.
- class jlnn.nn.gates.WeightedNor(*args: Any, **kwargs: Any)
  Bases: Module
  Weighted NOR gate (negated OR).
  Evaluates to high truth only if all weighted inputs are close to falsehood.
- class jlnn.nn.gates.WeightedNot(*args: Any, **kwargs: Any)
  Bases: Module
  Trainable weighted negation (NOT) gate.
  Allows the model to learn the degree of inversion for a specific statement.
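One plausible way to parameterise a learnable degree of inversion is to interpolate between the identity and the Łukasiewicz negation `1 - x`. This is a hypothetical sketch of the idea, not jlnn's actual formulation:

```python
def weighted_not(x, w=1.0):
    """Hypothetical weighted negation: w interpolates between identity
    (w = 0, no inversion) and full Lukasiewicz negation 1 - x (w = 1)."""
    return (1.0 - w) * x + w * (1.0 - x)
```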
- class jlnn.nn.gates.WeightedOr(*args: Any, **kwargs: Any)
  Bases: Module
  Trainable weighted OR gate implemented using the Łukasiewicz t-conorm.
  In the JLNN framework, this gate aggregates truth intervals from multiple inputs. It learns the relative importance of each input through weights and adjusts the activation threshold via the beta parameter.
  - weights
    Importance weights for each input signal.
    Type: nnx.Param
  - beta
    Sensitivity threshold (bias) of the disjunction.
    Type: nnx.Param
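The roles of `weights` and `beta` can be sketched with a Łukasiewicz-style weighted disjunction on scalar truth values (the exact formula is an assumption for illustration; with unit weights and `beta=1.0` it reduces to the standard t-conorm `min(1, a + b)`):

```python
def weighted_or(xs, ws, beta=1.0):
    """Hypothetical weighted Lukasiewicz OR (not jlnn's exact code).

    Each weight w scales the positive evidence x that an input contributes
    toward the disjunction; beta shifts the activation threshold.
    """
    positive_evidence = sum(w * x for x, w in zip(xs, ws))
    return max(0.0, min(1.0, 1.0 - beta + positive_evidence))
```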
- class jlnn.nn.gates.WeightedXor(*args: Any, **kwargs: Any)
  Bases: Module
  Trainable n-ary XOR gate implemented via recursive tree reduction.
  XOR in interval logic is non-trivial. For n = 2, the gate uses the logical composition (A OR B) AND (A NAND B). For n > 2, it recursively builds a binary tree of pairwise XOR operations.
  This hierarchical structure allows the network to learn complex parity-like functions with independent weights at each node.
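The composition and the tree reduction can be sketched with unweighted Łukasiewicz connectives (a simplified sketch without the per-node weights; function names are illustrative):

```python
def xor2(a, b):
    """Binary XOR as (A OR B) AND (A NAND B), Lukasiewicz connectives."""
    or_ab = min(1.0, a + b)                  # t-conorm
    nand_ab = 1.0 - max(0.0, a + b - 1.0)    # negated t-norm
    return max(0.0, or_ab + nand_ab - 1.0)   # t-norm of the two parts

def xor_n(xs):
    """n-ary XOR via recursive binary tree reduction over xor2."""
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    return xor2(xor_n(xs[:mid]), xor_n(xs[mid:]))
```

On crisp inputs this computes parity, matching the "parity-like functions" the class docstring describes.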
This module implements logic gates as Flax NNX modules. Each gate manages its own trainable parameters (weights and thresholds) and provides a forward computation over truth intervals.
Stateful vs. Functional Gates
Unlike Functional Logic Kernels, the gates in this module:
* store their internal state in nnx.Param objects,
* are compatible with automatic parameter search and optimizers,
* serve as the basic building blocks that the compiler assembles into deeper structures.
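The stateful design can be illustrated in plain Python without Flax: the gate object owns its parameters instead of receiving them on every call. The class name and structure here are a hypothetical stand-in for an nnx.Module-based gate, not jlnn's actual code:

```python
class StatefulOrGate:
    """Hypothetical sketch of a stateful gate: parameters live on the
    module (as they would in nnx.Param fields), not in the call signature."""

    def __init__(self, n_inputs, beta=1.0):
        self.weights = [1.0] * n_inputs  # stands in for an nnx.Param
        self.beta = beta                 # stands in for an nnx.Param

    def __call__(self, xs):
        # Lukasiewicz-style weighted disjunction over scalar truth values.
        s = sum(w * x for w, x in zip(self.weights, xs))
        return max(0.0, min(1.0, 1.0 - self.beta + s))
```

A functional kernel, by contrast, would take `weights` and `beta` as explicit arguments on every invocation.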
Basic gates
WeightedAnd: Implements logical conjunction. Weights allow the network to selectively suppress unimportant inputs.
WeightedOr: Implements logical disjunction. The beta parameter determines the sensitivity (steepness) of the gate.
WeightedNot: Weighted negation. The weight allows the model to learn how strongly to invert a given input.
Advanced operators
WeightedImplication: Allows modeling of causal relationships \(A \to B\). Supports various semantics (Łukasiewicz, Kleene-Dienes, Reichenbach).
WeightedXor: Implements n-ary XOR using a hierarchical tree of binary operations. This structure allows learning complex parity functions.
Negated gates (NAND, NOR)
These gates are key for detecting logical contradictions and enforcing integrity constraints in knowledge bases.
WeightedNand: Negation of conjunction.
WeightedNor: Negation of disjunction.
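Both negated gates follow directly from negating the corresponding weighted connective. A sketch, assuming the same Łukasiewicz-style weighted AND/OR forms used above (illustrative, not jlnn's exact formulas):

```python
def weighted_nand(xs, ws, beta=1.0):
    """NAND = 1 - weighted Lukasiewicz AND."""
    conj = max(0.0, min(1.0, beta - sum(w * (1.0 - x) for x, w in zip(xs, ws))))
    return 1.0 - conj

def weighted_nor(xs, ws, beta=1.0):
    """NOR = 1 - weighted Lukasiewicz OR."""
    disj = max(0.0, min(1.0, 1.0 - beta + sum(w * x for x, w in zip(xs, ws))))
    return 1.0 - disj
```

As a constraint, a high-truth `weighted_nand` output asserts that its inputs are not all true at once, which is how these gates detect contradictions.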