Logic of the Great, Logic of the Wise
Solving Manual

[Diagram: node pairs A and B labelled with the value pairs 1.1 / 0.0 and 0.1 / 1.0]

This is the connection with Life and Wisdom.
God exists.

Logic of the Great, Logic of the Wise


Fedorchenko Mikhail Valerevich ∴ΩΝ
[Diagram: statements A and B; the sequence 16 - 64 - 256; the judgment types A, E, I, O linking statements A and B]
Models of Logical Reasoning
[Diagram: the judgment types A, E, I, O arranged in alternating pairs and in the sequences A O I E and E I O A, showing their transitions and interweavings]

If the logic of the Pythagoreans is a logic of intersections, then the logic of the Great Ones is a logic of transitions and interweavings.

The Polemic of the True and the Delusional

[Diagram: the value pairs 0.1, 1.0, 1.1 and 0.0 arranged as transitions between states]
Pythagorean Logic + Logic of the Greats -> Diagram of Heron (Fedorchenko M.V.)
[Diagram: statements A and B and their negated forms B-, A-]

Conversation of the Kings

Example of structure:
P1 P2 A P3 P4
P1 P2 B P3 P4

Statements:
A: First statement
B: Second statement

Positions (states):
P1: First position
P2: Second position
P3: Third position
P4: Fourth position

The diagram can show how each position depends on the truth of each statement, and how they intertwine or lead to other positions.

Logical connections:
For example, you can define the following transitions:
If A is true, then go to P1.
If B is true, then go to P2.
If both A and B are true, then go to P3.
If both statements are false, go to P4.
Arrows can show the interdependence between positions, and a maze-like path can show the interweaving; the number of positions can be increased and the maze of interweavings made more intricate.
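A minimal Python sketch of these transition rules (the function name and position labels mirror the list above; the implementation details are my own):

```python
def next_position(a: bool, b: bool) -> str:
    """Map the truth of statements A and B to a position P1..P4.

    Follows the four transition rules listed above; the "both true"
    case must be checked before the single-statement cases.
    """
    if a and b:
        return "P3"   # both A and B are true
    if a:
        return "P1"   # only A is true
    if b:
        return "P2"   # only B is true
    return "P4"       # both statements are false


# Enumerate all four combinations of truth values.
for a in (True, False):
    for b in (True, False):
        print(a, b, "->", next_position(a, b))
```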

[Diagram: statements A-E arranged as nodes, linked to positions B (+), C (-), D (=, Objectivity), E (!, Subjectivity)]

The greatness of a country and its nation is determined by the successes of its citizens.

[Diagrams of increasing complexity: first nodes A and B, then the judgment types O, A, E, I, surrounded by chains of alternating 0 and 1 truth values]

Pythagorean Logic + Logic of the Greats -> Diagram of Heron (Fedorchenko M.V.)
[Diagram: A, E, I, O arranged around a central balance: Σ = 0 => Balance = 1]

Balance of assertion, choice and decision: if there is a decrease somewhere, there will be an increase somewhere else.

Syllogism project based on traditional logic (A, E, I, O)


The diagram uses the classic types of judgment from Aristotelian syllogistic:
A (universal affirmative) - All S are P.
E (universal negative) - No S are P.
I (particular affirmative) - Some S are P.
O (particular negative) - Some S are not P.
The system is kept in equilibrium by a principle of logical balance: if one assertion is true, another can serve as its logical complement or negation.

Logical structure
1. Major premise (General rule)
If a logical change occurs in one place, it is compensated for elsewhere in the system.
2. Minor premise (Special case)
If statement A ("All S are P") loses its force, then its opposite E ("No S are P") becomes more probable; the same holds for the pair I–O.
3. Conclusion
Therefore, the logical structure of the syllogism is balanced between the four types of judgments, preserving the equilibrium of reasoning.

Syllogism balance formula


Σ = v(A) + v(E) + v(I) + v(O)

Where:
v(A) = +1, if "All S are P" is true.
v(E) = −1, if "No S are P" is true.
v(I) = +0.5, if "Some S are P" is true.
v(O) = −0.5, if "Some S are not P" is true.

If the sum of all statements is not zero, then the logical system is unbalanced, and one of the statements must be corrected.

Example of a syllogism with balance


1. A: All people are mortal.
2. I: Some creatures are mortal.
3. Corollary: Man is one such creature.
Balance: A + I = 1 + 0.5 = 1.5 → logically stable. Now let's add the negative judgments:
4. E: No god is mortal.
5. O: Some creatures are not mortal.
Balance: A + I + E + O = 1 + 0.5 − 1 − 0.5 = 0 → logically balanced.
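A minimal Python sketch of this balance check (the value assignments follow the balance formula above; the names are illustrative, not the author's):

```python
# Truth contributions of the four judgment types, per the balance formula.
VALUES = {"A": 1.0, "E": -1.0, "I": 0.5, "O": -0.5}

def balance(judgments):
    """Sum the contributions of the asserted judgment types."""
    return sum(VALUES[j] for j in judgments)

print(balance(["A", "I"]))            # 1.5 -> "logically stable"
print(balance(["A", "I", "E", "O"]))  # 0.0 -> balanced
```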
This principle can be used to build stable logical reasoning based on classical syllogistic structures.
The rules below are made as precise and universal as possible, taking into account the added values **1.0** and **0.1** and the ability to use operations to obtain a desired result (for example, result = 1.0).

### **1. Basic rules of the system**


Each node (\( A, E, I, O \)) and the lines connecting them symbolize logical operations that can take into account partial
truth or uncertainty:

#### **1.1. Node designations**


- **A (universal truth):** All elements of one set belong to another.
- Default value: \( A = 1.0 \).
- **E (universal falsehood):** No element of one set belongs to another.
- Default value: \( E = 0.0 \).
- **I (partial truth):** Some elements of one set belong to another.
- The value varies: \( I = 0.1 \) (or another value, if needed).
- **O (partial falsehood):** Some elements of one set do not belong to another.
- The value varies: \( O = 0.1 \) (or another value, if needed).

#### **1.2. Additional conditions**


- Each connection between nodes may include a multiplier \( C \) (coefficient), which strengthens or weakens the
contribution of the node to the final value.
- Example: if \( C = 2.0 \), the value of the node increases by 2 times.
- Example: if \( C = 0.5 \), the value of the node decreases by 2 times.

#### **1.3. Possible degrees of truth**


- \( 1.0 \): Absolute truth.
- \( 0.1 \): Partial truth.
- \( 0.0 \): Absolute falsehood.
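The designations and degrees of truth above can be collected in a minimal Python sketch (the constant and variable names are my own labels, not part of the original scheme):

```python
# Degrees of truth used by the system (section 1.3).
ABSOLUTE_TRUTH = 1.0
PARTIAL_TRUTH = 0.1
ABSOLUTE_FALSEHOOD = 0.0

# Default node values for the four judgment types (section 1.1).
nodes = {
    "A": ABSOLUTE_TRUTH,      # universal truth
    "E": ABSOLUTE_FALSEHOOD,  # universal falsehood
    "I": PARTIAL_TRUTH,       # partial truth (may vary)
    "O": PARTIAL_TRUTH,       # partial falsehood (may vary)
}
```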

### **2. Rules of operations**


#### **2.1. Intersection (\( \cap \))**
- Intersection selects the minimum value of two nodes.
\[
A \cap B = \min(A, B)
\]
Example: \( 1.0 \cap 0.1 = 0.1 \); \( 0.1 \cap 0 = 0 \).

#### **2.2. Union (\( \cup \))**


- Union selects the maximum value of two nodes.
\[
A \cup B = \max(A, B)
\]
Example: \( 1.0 \cup 0.1 = 1.0 \); \( 0.1 \cup 0 = 0.1 \).

#### **2.3. Negation (\( \neg \))**


- Negation is calculated as the complement to the full truth.
\[
\neg A = 1.0 - A
\]
Example: \( \neg 1.0 = 0 \); \( \neg 0.1 = 0.9 \).

#### **2.4. Difference (\( A - B \))**


- Difference combines \( A \) with the negation of \( B \), computed here as a product.
\[
A - B = A \cdot (1.0 - B)
\]
Example: \( 1.0 - 0.1 = 1.0 \cdot 0.9 = 0.9 \); \( 0.1 - 0.1 = 0.1 \cdot 0.9 = 0.09 \).

#### **2.5. Node Strengthening (\( C \) factor)**


- If a node or link includes a \( C \) factor, the resulting value of the node changes:
\[
A \cdot C = A \times C
\]
Example: \( 0.1 \cdot 2.0 = 0.2 \); \( 1.0 \cdot 0.5 = 0.5 \).
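The five operations of section 2 can be sketched in Python as follows (an illustrative sketch of the definitions in 2.1–2.5; the function names are mine):

```python
def intersect(a: float, b: float) -> float:
    """Intersection (2.1): the minimum of the two values."""
    return min(a, b)

def union(a: float, b: float) -> float:
    """Union (2.2): the maximum of the two values."""
    return max(a, b)

def negate(a: float) -> float:
    """Negation (2.3): complement to full truth."""
    return 1.0 - a

def difference(a: float, b: float) -> float:
    """Difference (2.4): A combined with the negation of B, as a product."""
    return a * (1.0 - b)

def strengthen(a: float, c: float) -> float:
    """Node strengthening (2.5): scale the value by the coefficient C."""
    return a * c

print(intersect(1.0, 0.1))   # 0.1
print(union(1.0, 0.1))       # 1.0
print(negate(0.1))           # 0.9
print(difference(0.1, 0.1))  # 0.09
print(strengthen(0.1, 2.0))  # 0.2
```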
### **3. Refining Logical Connections**
Each line between nodes in the image can symbolize one or more operations:
- **Straight Line:** Symbolizes intersection (\( \cap \)).
- **Branching Line:** Symbolizes union (\( \cup \)).
- **Line with Inverse Node:** Symbolizes negation (\( \neg \)).

#### Example 1:
Line connects \( A \) and \( I \):
\[
A \cap I = \min(A, I)
\]

#### Example 2:
Line connects \( A, I \), and adds \( O \):
\[
(A \cap I) \cup O = \max(\min(A, I), O)
\]

#### Example 3:
Line with negation of \( E \):
\[
((A \cap I) \cup O) \cap \neg E = \min(\max(\min(A, I), O), \neg E)
\]
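The composite expression of Example 3 can be written directly with these operations (a sketch; the argument values below are placeholders):

```python
def evaluate(a: float, i: float, o: float, e: float) -> float:
    """((A ∩ I) ∪ O) ∩ ¬E with min/max/complement semantics."""
    return min(max(min(a, i), o), 1.0 - e)

print(evaluate(1.0, 0.1, 0.1, 0.0))  # 0.1
```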

### **4. Final conditions for the result = 1.0**


For the result to be \( 1.0 \), it is necessary that:
1. The intersection (\( \cap \)) involves nodes with the maximum value of \( 1.0 \).
2. The union (\( \cup \)) includes at least one node with the value \( 1.0 \).
3. The negation (\( \neg \)) excludes only nodes with \( 0.0 \), so that it does not reduce the final result.
4. Additionally, the multiplier \( C \) can be applied to raise the result to \( 1.0 \).

### **5. Calculation example**


#### Condition:
Nodes have values:
- \( A = 1.0 \),
- \( I = 0.1 \),
- \( E = 0.0 \),
- \( O = 0.1 \).

#### Goal:
Get a result \( = 1.0 \).

#### Step 1: Intersect \( A \cap I \)


\[
A \cap I = \min(1.0, 0.1) = 0.1
\]

#### Step 2: Combine with \( O \)


\[
(A \cap I) \cup O = \max(0.1, 0.1) = 0.1
\]

#### Step 3: Negate \( \neg E \)


\[
\neg E = 1.0 - 0.0 = 1.0
\]

#### Step 4: Intersect with \( \neg E \)


\[
((A \cap I) \cup O) \cap \neg E = \min(0.1, 1.0) = 0.1
\]

#### Step 5: Boost the final value


Apply the multiplier \( C = 10.0 \):
\[
Result = 0.1 \cdot 10.0 = 1.0
\]
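The five steps can be reproduced with the basic operations (a sketch; the boost factor C = 10.0 is the one chosen in Step 5):

```python
# Node values from the condition.
A, I, E, O = 1.0, 0.1, 0.0, 0.1

step1 = min(A, I)          # A ∩ I        = 0.1
step2 = max(step1, O)      # (A ∩ I) ∪ O  = 0.1
step3 = 1.0 - E            # ¬E           = 1.0
step4 = min(step2, step3)  # (...) ∩ ¬E   = 0.1
result = step4 * 10.0      # boost with C = 10.0
print(result)              # 1.0
```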

### **6. Conclusion**


The rules now take the values \( 1.0 \) and \( 0.1 \) and the multipliers into account.
Next, additional logic is introduced to make the system more flexible and universal, and new capabilities are added for operations and calculations.

### **1. Extended degrees of truth**


Let's add more gradations of truth to work with uncertainty:
- **1.0** — absolute truth.
- **0.9** — almost complete truth (high probability).
- **0.5** — equilibrium state (partial uncertainty).
- **0.1** — partial truth (low probability).
- **0.0** — absolute falsehood.

These values can be used to refine logical conditions.

### **2. Refined operations**

#### **2.1. Flexible intersection (\( \cap \))**


The intersection can be refined taking into account the "weight" of each node. If \( A \) and \( B \) have
different weights (\( W_A \) and \( W_B \)), the result is calculated as a weighted intersection:
\[
A \cap B = \frac{A \cdot W_A + B \cdot W_B}{W_A + W_B}
\]
- Example: \( A = 1.0, W_A = 2 \); \( B = 0.1, W_B = 1 \):
\[
A \cap B = \frac{1.0 \cdot 2 + 0.1 \cdot 1}{2 + 1} = \frac{2.1}{3} = 0.7
\]

If weights are not specified, the default operation is:


\[
A \cap B = \min(A, B)
\]

#### **2.2. Flexible union (\( \cup \))**


Union with weights:
\[
A \cup B = \frac{A \cdot W_A + B \cdot W_B}{W_A + W_B}
\]
If weights are not specified:
\[
A \cup B = \max(A, B)
\]
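A sketch of the weighted variants, falling back to min/max when no weights are given (the function and parameter names are illustrative):

```python
def weighted_combine(a: float, b: float, w_a: float, w_b: float) -> float:
    """Weighted average of two truth values (used by both flexible ∩ and ∪)."""
    return (a * w_a + b * w_b) / (w_a + w_b)

def flexible_intersect(a, b, w_a=None, w_b=None):
    # Without weights, fall back to the plain min-intersection.
    if w_a is None or w_b is None:
        return min(a, b)
    return weighted_combine(a, b, w_a, w_b)

def flexible_union(a, b, w_a=None, w_b=None):
    # Without weights, fall back to the plain max-union.
    if w_a is None or w_b is None:
        return max(a, b)
    return weighted_combine(a, b, w_a, w_b)

print(flexible_intersect(1.0, 0.1, 2, 1))  # (1.0*2 + 0.1*1) / 3 = 0.7
print(flexible_union(1.0, 0.1))            # 1.0
```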

#### **2.3. Weight Multiplier (\( C \))**


Each node value or operation result can be strengthened or weakened using a multiplier \( C \):
\[
A \cdot C = A \times C
\]
- \( C > 1.0 \): Strengthening the truth value.
- \( 0 < C < 1.0 \): Weakening the truth value.
- Example: \( A = 0.1 \), \( C = 10.0 \):
\[
A \cdot C = 0.1 \cdot 10.0 = 1.0
\]

#### **2.4. Threshold Union and Intersection**


Sometimes you want to consider a result only if it exceeds a certain threshold (\( T \)):
- **Threshold Intersection:**
\[
A \cap_T B =
\begin{cases}
\min(A, B), & \text{if } \min(A, B) \geq T \\
0, & \text{otherwise}
\end{cases}
\]
- **Threshold Union:**
\[
A \cup_T B =
\begin{cases}
\max(A, B), & \text{if } \max(A, B) \geq T \\
0, & \text{otherwise}
\end{cases}
\]
Example: \( A = 0.5 \), \( B = 0.9 \), threshold \( T = 0.6 \):
- \( A \cap_T B = 0 \) (the minimum, 0.5, is below the threshold).
- \( A \cup_T B = 0.9 \) (the maximum is above the threshold).
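The threshold variants in the same sketch style (nothing beyond the two cases defined above):

```python
def threshold_intersect(a: float, b: float, t: float) -> float:
    """Threshold intersection: keep min(a, b) only if it reaches the threshold t."""
    m = min(a, b)
    return m if m >= t else 0.0

def threshold_union(a: float, b: float, t: float) -> float:
    """Threshold union: keep max(a, b) only if it reaches the threshold t."""
    m = max(a, b)
    return m if m >= t else 0.0

print(threshold_intersect(0.5, 0.9, 0.6))  # 0.0 (0.5 is below the threshold)
print(threshold_union(0.5, 0.9, 0.6))      # 0.9
```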

#### **2.5. Average value**


If you need to calculate the average degree of truth of several nodes:
\[
\text{Average}(A, B, C) = \frac{A + B + C}{3}
\]
Example: \( A = 1.0, B = 0.1, C = 0.5 \):
\[
\text{Average} = \frac{1.0 + 0.1 + 0.5}{3} = 0.533
\]

#### **2.6. Limited gain**


To prevent the boosted value from exceeding \( 1.0 \), a cap is applied:
\[
A \cdot C = \min(A \times C, 1.0)
\]
Example: \( A = 0.9 \), \( C = 2.0 \):
\[
A \cdot C = \min(0.9 \cdot 2.0, 1.0) = 1.0
\]
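The average (2.5) and the capped boost (2.6) as a sketch:

```python
def average(*values: float) -> float:
    """Average degree of truth of several nodes."""
    return sum(values) / len(values)

def boost_capped(a: float, c: float) -> float:
    """Strengthen a value by C, but never let it exceed 1.0."""
    return min(a * c, 1.0)

print(round(average(1.0, 0.1, 0.5), 3))  # 0.533
print(boost_capped(0.9, 2.0))            # 1.0
```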

### **3. Example of refined calculation**

#### Condition:
- Nodes have values: \( A = 1.0 \), \( I = 0.1 \), \( E = 0.0 \), \( O = 0.5 \).
- Node weights: \( W_A = 2.0 \), \( W_I = 1.0 \), \( W_O = 1.0 \).
- Target result = \( 1.0 \).

#### Step 1: Intersect \( A \cap I \) (weighted)


\[
A \cap I = \frac{1.0 \cdot 2 + 0.1 \cdot 1}{2 + 1} = \frac{2.1}{3} = 0.7
\]

#### Step 2: Combine the result with \( O \) (weighted)


\[
(A \cap I) \cup O = \frac{0.7 \cdot 2 + 0.5 \cdot 1}{2 + 1} = \frac{1.4 + 0.5}{3} = 0.633
\]

#### Step 3: Negate \( E \)


\[
\neg E = 1.0 - 0.0 = 1.0
\]

#### Step 4: Intersect the result with \( \neg E \)


\[
((A \cap I) \cup O) \cap \neg E = \min(0.633, 1.0) = 0.633
\]

#### Step 5: Boost the result to \( 1.0 \)


Apply the multiplier \( C = 1.58 \):
\[
Result = 0.633 \cdot 1.58 = 1.0
\]
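Putting the refined operations together reproduces the five steps above (a sketch reusing the weighted formulas; the final factor C = 1.58 is the one chosen in Step 5, and the weight of the intermediate result in Step 2 is taken to be W_A = 2.0, as in the worked example):

```python
# Node values and weights from the condition.
A, I, E, O = 1.0, 0.1, 0.0, 0.5
W_A, W_I, W_O = 2.0, 1.0, 1.0

step1 = (A * W_A + I * W_I) / (W_A + W_I)      # weighted A ∩ I      = 0.7
step2 = (step1 * W_A + O * W_O) / (W_A + W_O)  # weighted (...) ∪ O  ≈ 0.633
step3 = 1.0 - E                                # ¬E                  = 1.0
step4 = min(step2, step3)                      # (...) ∩ ¬E          ≈ 0.633
result = min(step4 * 1.58, 1.0)                # capped boost, ≈ 1.0
print(round(result, 2))                        # 1.0
```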

### **4. Conclusion**


The added logic allows us to take into account node weights, thresholds, capped boosting, and other refinements.

The new principle described here is an extension of formal logic that introduces more flexible ways of handling the truth and falsity of propositions by taking into account uncertainty, partial truth, and the weights of logical nodes and operations. This approach solves a number of fundamental problems and poses new challenges that complement and develop existing areas of formal logic.

Fundamental problems solved by the new principle


The problem of absolute truth and falsity

Classical logic: Absolute truth (1.0) and falsity (0.0) do not take into account partial states, which makes it limited for modeling the real world, where uncertainty exists.
New principle: Introducing intermediate degrees of truth (e.g., 0.1, 0.5, 0.9) allows us to describe situations with partial certainty and take into account probabilistic or conditional truths. This corresponds to real-life situations where truth and falsehood are rarely absolute.
The problem of uncertainty and incorrect data

Classical logic: If information is insufficient or inconsistent, the system either breaks down or requires complex
workarounds.
New principle: Using truth gradations and weighted operations allows working with inconsistent data. For
example, weighted intersections and unions make it possible to take into account the contribution of each
source of information.
Integration of component weights and significance

Classical logic: All components in classical logic are equal, which does not always correspond to real-life
problems (for example, the significance of a proof may depend on its context).
New principle: Introducing node weights and operations makes it possible to take into account the significance
of individual arguments or factors in a logical conclusion.
The problem of pragmatic logic

Classical logic: Does not provide convenient tools for analyzing situations where it is necessary to evaluate
many possible scenarios, each of which has partial truth.
New principle: Allows modeling complex logical systems (e.g., multi-valued logic, fuzzy logic), where truth and falsity depend on context, probability, and other factors.
Automation and computability of logical inferences

Classical logic: Often faces difficulties when applied to problems of machine learning, artificial intelligence, or
big data analysis, where uncertainty is a key factor.
New principle: Easily integrated into computing systems, since weighted operations and intermediate truth
states naturally complement optimization algorithms, probabilistic models, and neural networks.
New problems posed by the principle
Development of new algebraic structures

The need to create new formal systems that include degrees of truth, weights, thresholds, and a variety of
operations such as weighted intersection and union.
Construction of extensions of classical Boolean algebra for multi-valued and weighted logics.
Defining new laws and axioms

Checking how traditional laws of classical logic (e.g., the law of double negation, the law of excluded middle)
work in the context of multi-valued and weighted logic.
Introducing new laws, e.g., to describe the interaction of weights and thresholds.
Modeling and analyzing complex systems

Using the new principle to model complex systems, such as social networks, biological systems, economic
models, where there are uncertain and weighted dependencies.
Developing inference systems that take into account probabilistic scenarios.
Integration with probabilistic logic

Combining the new approach with probability theory to create logical-probabilistic systems, where each degree
of truth can be associated with the probability of an event.
Developing fuzzy logic

The new principle complements and extends the existing fuzzy logic, including aspects such as weights and
thresholds, making it more applicable to practical problems, such as managing complex systems.
Practical implementation in artificial intelligence

Integration into AI systems for more flexible decision-making based on partial data, as well as for dealing with
uncertainty and contradictions.
Application in neural networks, where node weights can be directly related to the concept of the degree of
truth.
Theory of optimization of logical inferences

Research on how the new principle can be used to optimize logical chains and reduce computational costs in
the analysis of large data systems.

Philosophical and Methodological Research

How a new approach to truth and falsity changes our understanding of logic, philosophy of science, and
knowledge.
Developing ethical and methodological guidelines for the application of logic in the social sciences, medicine,
and other areas where uncertainty is unavoidable.
Examples of the new principle application
Decision-making systems:

Risk analysis under uncertainty (e.g., assessing the effectiveness of medical interventions).
Forecasting in the economy, where different factors have different significance and influence.
Artificial intelligence and robotics:

Decision-making by robots or AI under conditions of partial or incomplete information.


Building flexible control systems that adapt to changes in the environment.
Big data and analytics:

Working with contradictory or incomplete data, when not all sources of information have the same weight or
reliability.
Modeling complex systems:

Creating models that take into account uncertainty, blurred boundaries, and the significance of individual
elements.
Conclusion
The new principle solves fundamental problems of classical formal logic, adding tools for dealing with
uncertainty, weights, and partial truth. It opens up new perspectives in science, artificial intelligence,
philosophy, and practical applications, while posing new challenges for researchers related to the development
and justification of such logic.
