Logical Gates (Stateful)

class jlnn.nn.gates.WeightedAnd(*args: Any, **kwargs: Any)[source]

Bases: Module

Trainable weighted AND gate implemented using Łukasiewicz t-norm.

This gate performs a fuzzy conjunction where weights determine how much each input contributes to the “negative evidence” against the truth.

__call__(x: Array) Array[source]

Computes the weighted Łukasiewicz conjunction.

__init__(num_inputs: int, rngs: Rngs)[source]
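
The exact implementation is not shown here, but the docstring's "negative evidence" phrasing matches the standard weighted Łukasiewicz conjunction, in which each weight scales how much an input's falsehood counts against the conjunction. A minimal functional sketch under that assumption (the name `weighted_and` and the fixed `beta` are illustrative, not the library API):

```python
import jax.numpy as jnp

def weighted_and(x, weights, beta=1.0):
    """Weighted Lukasiewicz conjunction (illustrative sketch).

    Each weight scales how much the input's falsehood (1 - x)
    counts as negative evidence against the conjunction.
    """
    neg_evidence = jnp.sum(weights * (1.0 - x), axis=-1)
    return jnp.clip(beta - neg_evidence, 0.0, 1.0)
```

With unit weights and `beta = 1` this reduces to the classical Łukasiewicz t-norm `max(0, a + b - 1)`; a zero weight makes the gate ignore that input entirely.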
class jlnn.nn.gates.WeightedImplication(*args: Any, **kwargs: Any)[source]

Bases: Module

Trainable implication gate (A -> B).

Supports multiple semantics (Łukasiewicz, Reichenbach, Kleene-Dienes). Ideal for modeling expert-driven rules within the neural architecture.

__call__(int_a: Array, int_b: Array) Array[source]

Computes the implication int_a → int_b under the configured semantics.

__init__(rngs: Rngs, method: str = 'lukasiewicz')[source]
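
The three semantics named in the docstring are standard fuzzy implications. A hedged, unweighted sketch over scalar truth values (the gate itself is trainable; the method-name strings other than `'lukasiewicz'` are illustrative, not necessarily the library's spelling):

```python
import jax.numpy as jnp

def implication(a, b, method="lukasiewicz"):
    """Fuzzy implication a -> b under three standard semantics (sketch)."""
    if method == "lukasiewicz":      # min(1, 1 - a + b)
        return jnp.minimum(1.0, 1.0 - a + b)
    if method == "kleene_dienes":    # max(1 - a, b)
        return jnp.maximum(1.0 - a, b)
    if method == "reichenbach":      # 1 - a + a * b
        return 1.0 - a + a * b
    raise ValueError(f"unknown method: {method}")
```

All three agree on crisp inputs (e.g. 1 → 0 evaluates to 0 in every semantics) and differ only on intermediate truth degrees.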
class jlnn.nn.gates.WeightedNand(*args: Any, **kwargs: Any)[source]

Bases: Module

Weighted NAND gate (Negated AND).

Useful for enforcing constraints where two contradictory statements should not be simultaneously true.

__call__(x: Array) Array[source]

Computes the weighted NAND over the input intervals.

__init__(num_inputs: int, rngs: Rngs)[source]
class jlnn.nn.gates.WeightedNor(*args: Any, **kwargs: Any)[source]

Bases: Module

Weighted NOR gate (Negated OR).

Evaluates to high truth only if all weighted inputs are close to falsehood.

__call__(x: Array) Array[source]

Computes the weighted NOR over the input intervals.

__init__(num_inputs: int, rngs: Rngs)[source]
class jlnn.nn.gates.WeightedNot(*args: Any, **kwargs: Any)[source]

Bases: Module

Trainable weighted negation (NOT) gate.

Allows the model to learn the degree of inversion for a specific statement.

__call__(x: Array) Array[source]

Computes the weighted negation of the input.

__init__(rngs: Rngs)[source]
class jlnn.nn.gates.WeightedOr(*args: Any, **kwargs: Any)[source]

Bases: Module

Trainable weighted OR gate implemented using Łukasiewicz t-conorm.

In the JLNN framework, this gate aggregates truth intervals from multiple inputs. It learns the relative importance of each input through weights and adjusts the activation threshold via the beta parameter.

weights

Importance weights for each input signal.

Type:

nnx.Param

beta

Sensitivity threshold (bias) of the disjunction.

Type:

nnx.Param

__call__(x: Array) Array[source]

Executes the weighted OR operation.

Parameters:

x (jnp.ndarray) – Input interval tensor of shape (…, num_inputs, 2).

Returns:

The resulting truth interval [L, U].

Return type:

jnp.ndarray

__init__(num_inputs: int, rngs: Rngs)[source]
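
The documented `__call__` consumes intervals of shape `(…, num_inputs, 2)` and returns an interval `[L, U]`. A hedged sketch assuming the weighted Łukasiewicz disjunction `min(1, 1 − β + Σ wᵢxᵢ)` applied bound-wise (the disjunction is monotone in each input, so lower and upper bounds can be processed independently; `weighted_or` is an illustrative name, not the library API):

```python
import jax.numpy as jnp

def weighted_or(x, weights, beta=1.0):
    """Weighted Lukasiewicz disjunction over truth intervals (sketch).

    x: intervals of shape (..., num_inputs, 2) holding [lower, upper].
    Applied bound-wise; monotonicity guarantees the result is again
    a valid interval [L, U].
    """
    evidence = jnp.sum(weights[..., None] * x, axis=-2)   # -> (..., 2)
    return jnp.clip(1.0 - beta + evidence, 0.0, 1.0)
```

With unit weights and `beta = 1` this is the classical t-conorm `min(1, a + b)` on each bound.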
class jlnn.nn.gates.WeightedXor(*args: Any, **kwargs: Any)[source]

Bases: Module

Trainable n-ary XOR gate implemented via recursive tree reduction.

XOR in interval logic is non-trivial. For n=2, it uses the logical composition: (A OR B) AND (A NAND B). For n > 2, it recursively builds a binary tree of XOR operations.

This hierarchical structure allows the network to learn complex parity-like functions with independent weights at each node.

__call__(x: Array) Array[source]

Computes the n-ary XOR via recursive tree reduction.

__init__(num_inputs: int, rngs: Rngs)[source]
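
The composition named in the docstring, (A OR B) AND (A NAND B), can be sketched with the classical (unweighted) Łukasiewicz connectives; the real gate additionally trains independent weights at each tree node. Function names here are illustrative:

```python
import jax.numpy as jnp

def luk_or(a, b):
    return jnp.minimum(1.0, a + b)          # Lukasiewicz t-conorm

def luk_and(a, b):
    return jnp.maximum(0.0, a + b - 1.0)    # Lukasiewicz t-norm

def xor2(a, b):
    # (A OR B) AND (A NAND B), with NAND = 1 - AND
    return luk_and(luk_or(a, b), 1.0 - luk_and(a, b))

def xor_tree(xs):
    """Reduce n inputs pairwise in a binary tree of xor2 nodes."""
    xs = list(xs)
    while len(xs) > 1:
        nxt = [xor2(xs[i], xs[i + 1]) for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:          # odd leftover passes through to next level
            nxt.append(xs[-1])
        xs = nxt
    return xs[0]
```

On crisp inputs the tree computes parity: an odd number of true inputs yields 1.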

This module implements logic gates as Flax NNX modules. Each gate manages its own trainable parameters (weights and thresholds) and provides a forward computation over truth intervals.

Stateful vs. Functional Gates

Unlike Functional Logic Kernels, the gates in this module:

  • store their internal state in nnx.Param,

  • are compatible with automatic parameter search and optimizers,

  • are the basic building blocks that the compiler assembles into deeper structures.
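
To illustrate the stateful pattern without depending on Flax, here is a minimal stand-in: the gate owns its parameters as attributes and exposes `__call__`. In the real modules these attributes would be nnx.Param fields initialized from the Rngs passed to `__init__`, which is what makes them visible to optimizers and parameter search. The class name and the assumed OR semantics are illustrative:

```python
import jax.numpy as jnp

class StatefulWeightedOr:
    """Flax-free stand-in for a stateful gate (illustrative only).

    In jlnn the weights and beta would be nnx.Param fields so that
    optimizers and parameter search can find and update them.
    """
    def __init__(self, num_inputs):
        self.weights = jnp.ones((num_inputs,))  # trainable importance weights
        self.beta = jnp.array(1.0)              # trainable threshold (bias)

    def __call__(self, x):
        # x: (..., num_inputs) truth values; weighted Lukasiewicz OR
        evidence = jnp.sum(self.weights * x, axis=-1)
        return jnp.clip(1.0 - self.beta + evidence, 0.0, 1.0)
```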

Basic gates

Implements logical conjunction. Weights allow the network to selectively suppress unimportant inputs.

Implements logical disjunction. The beta parameter determines the sensitivity (steepness) of the gate.

Weighted negation. The weight allows the model to learn how strongly to invert a given input.
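
The exact parameterization of the weighted negation is not documented here; one plausible form, offered purely as a hypothetical sketch, interpolates between passing the input through unchanged (w = 0) and fully inverting it (w = 1):

```python
import jax.numpy as jnp

def weighted_not(x, w):
    """Hypothetical weighted negation (sketch, not the library's formula):
    w interpolates between identity (w=0) and full inversion 1 - x (w=1)."""
    return w * (1.0 - x) + (1.0 - w) * x
```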

Advanced operators

Allows modeling of causal relationships \(A \to B\). Supports various semantics (Łukasiewicz, Kleene-Dienes, Reichenbach).

Implementation of n-ary XOR using a hierarchical tree of binary operations. This structure allows learning complex parity functions.

Negated gates (NAND, NOR)

These gates are key for detecting logical contradictions and enforcing integrity constraints in knowledge bases.

  • WeightedNand: Negation of conjunction.

  • WeightedNor: Negation of disjunction.
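
Assuming the negated gates are the standard complements of the corresponding weighted gates (1 minus the AND/OR result, consistent with the NAND composition used by WeightedXor above), a hedged sketch with illustrative names:

```python
import jax.numpy as jnp

def weighted_and(x, weights, beta=1.0):
    # weighted Lukasiewicz conjunction (assumed form)
    return jnp.clip(beta - jnp.sum(weights * (1.0 - x), axis=-1), 0.0, 1.0)

def weighted_or(x, weights, beta=1.0):
    # weighted Lukasiewicz disjunction (assumed form)
    return jnp.clip(1.0 - beta + jnp.sum(weights * x, axis=-1), 0.0, 1.0)

def weighted_nand(x, weights, beta=1.0):
    # high truth unless all weighted inputs are simultaneously true
    return 1.0 - weighted_and(x, weights, beta)

def weighted_nor(x, weights, beta=1.0):
    # high truth only if all weighted inputs are close to falsehood
    return 1.0 - weighted_or(x, weights, beta)
```

This is what makes them useful as contradiction detectors: `weighted_nand` drops toward 0 exactly when two statements that should exclude each other are both assigned high truth.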