Logic
This article is about reasoning and its study. For other uses, see Logic (disambiguation).
Logic (from the Ancient Greek: λογική, logike)[1] is the use and study of valid reasoning.[2][3] The study of logic features most prominently in the subjects of philosophy, mathematics, and computer science.
Logic was studied in several ancient civilizations, including
India,[4] China,[5] Persia, and Greece. In the West, logic was established as a formal discipline by Aristotle, who gave it a fundamental place in philosophy. The study of logic was part of the classical trivium, which also included grammar and rhetoric. Logic was further extended by Al-Farabi, who categorized it into two separate groups (idea and proof). Later, Avicenna revived the study of logic and developed the relationship between the temporalis and the implication. In the East, logic was developed by Buddhists and Jains.
Logic is often divided into three parts: inductive reasoning, abductive reasoning, and deductive reasoning.
The study of logic
“Upon this first, and in one sense this sole, rule of reason, that in order to learn you must desire to learn, and in so desiring not be satisfied with what you already
incline to think, there follows one corollary which itself deserves to be inscribed upon every wall of the city of philosophy: Do not block the way of inquiry.”
—Charles Sanders Peirce, "First Rule of Logic"

The concept of logical form is central to logic; the validity of an argument is held to be determined by its logical form, not by its content. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logics.
∙Informal logic is the study of natural language arguments. The study of fallacies is an especially important branch of informal logic. The dialogues of Plato[6] are good
examples of informal logic.
∙Formal logic is the study of inference with purely formal content. An inference possesses
a purely formal content if it can be expressed as a particular application of a wholly
abstract rule, that is, a rule that is not about any particular thing or property. The works
of Aristotle contain the earliest known formal study of logic. Modern formal logic follows
and expands on Aristotle.[7] In many definitions of logic, logical inference and inference
with purely formal content are the same. This does not render the notion of informal
logic vacuous, because no formal logic captures all of the nuances of natural language.
∙Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference.[8][9] Symbolic logic is often divided into two branches: propositional
logic and predicate logic.
∙Mathematical logic is an extension of symbolic logic into other areas, in particular to the study of model theory, proof theory, set theory, and recursion theory.
Logical form
Main article: Logical form
Logic is generally considered formal when it analyzes and represents the form of any valid argument type. The form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical language to make its content usable in formal inference. If one considers the notion of form too philosophically loaded, one could say that formalizing simply means translating English sentences into the language of logic.
This is called showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form and complexity that makes their use in inference impractical. It requires, first, ignoring those grammatical features irrelevant to logic (such as gender and declension, if the argument is in Latin); replacing conjunctions irrelevant to logic (such as "but") with logical conjunctions like "and"; and replacing ambiguous or alternative logical expressions ("any", "every", etc.) with expressions of a standard type (such as "all", or the universal quantifier ∀).
Second, certain parts of the sentence must be replaced with schematic letters. Thus, for example, the expression "all As are Bs" shows the logical form common to the sentences "all men are mortals", "all cats are carnivores", "all Greeks are philosophers", and so on.
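As a minimal sketch of this formalization step, the following Python snippet renders the shared schematic form "all As are Bs" as a quantified formula; the function name and string representation are hypothetical illustrations, not a standard library or notation.

```python
# A minimal sketch of "showing the logical form" of categorical sentences.
# The representation (plain strings built from schematic letters) is a
# hypothetical illustration, not a standard package.

def all_As_are_Bs(A: str, B: str) -> str:
    """Render the schematic form 'all As are Bs' as a quantified formula."""
    return f"∀x ({A}(x) → {B}(x))"

# Three different English sentences...
sentences = [
    ("man", "mortal"),        # "all men are mortals"
    ("cat", "carnivore"),     # "all cats are carnivores"
    ("Greek", "philosopher"), # "all Greeks are philosophers"
]

# ...all display the same logical form once the concrete terms are
# replaced by schematic letters A and B.
for A, B in sentences:
    print(all_As_are_Bs(A, B))
# ∀x (man(x) → mortal(x))
# ∀x (cat(x) → carnivore(x))
# ∀x (Greek(x) → philosopher(x))
```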
That the concept of form is fundamental to logic was already recognized in ancient times. Aristotle uses variable letters to represent valid inferences in Prior Analytics, leading Jan
Łukasiewicz to say that the introduction of variables was "one of Aristotle's greatest inventions".[10] According to the followers of Aristotle (such as Ammonius), only the logical principles stated in
schematic terms belong to logic, not those given in concrete terms. The concrete terms "man", "mortal", etc., are analogous to the substitution values of the schematic placeholders A, B, C, which were called the "matter" (Greek hyle) of the inference.
The fundamental difference between modern formal logic and traditional, or Aristotelian logic, lies in their differing analysis of the logical form of the sentences they treat.
∙In the traditional view, the form of the sentence consists of (1) a subject (e.g., "man") plus a sign of quantity ("all" or "some" or "no"); (2) the copula, which is of the form "is"
or "is not"; (3) a predicate (e.g., "mortal"). Thus: all men are mortal. The logical
constants such as "all", "no" and so on, plus sentential connectives such as "and" and
"or" were called "syncategorematic" terms (from the Greek kategorei – to predicate,
and syn – together with). This is a fixed scheme, where each judgment has an identified
quantity and copula, determining the logical form of the sentence.
∙According to the modern view, the fundamental form of a simple sentence is given by a recursive schema, involving logical connectives, such as a quantifier with its bound
variable, which are joined by juxtaposition to other sentences, which in turn may have
logical structure.
∙The modern view is more complex, since a single judgement of Aristotle's system involves two or more logical connectives. For example, the sentence "All men are
mortal" involves, in term logic, two non-logical terms "is a man" (here M) and "is mortal"
(here D): the sentence is given by the judgement A(M,D). In predicate logic, the sentence
involves the same two non-logical concepts, here analyzed as M(x) and D(x), and the
sentence is given by ∀x (M(x) → D(x)), involving the logical connectives for
universal quantification and implication (a recursive rendering of this form is sketched after this list).
∙But equally, the modern view is more powerful. Medieval logicians recognized the problem of multiple generality, where Aristotelian logic is unable to satisfactorily render such sentences as "Some guys have all the luck", because both quantities "all" and
"some" may be relevant in an inference, but the fixed scheme that Aristotle used allows
only one to govern the inference. Just as linguists recognize recursive structure in
natural languages, it appears that logic needs recursive structure.
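The recursive view can be made concrete with a small sketch in Python. The class names below are hypothetical illustrations, not a standard package; they simply show how quantifiers and connectives nest inside one another, which is what lets predicate logic express multiple generality.

```python
# A sketch of the modern, recursive view of logical form: formulas are built
# by nesting connectives and quantifiers inside one another.
from dataclasses import dataclass

@dataclass
class Pred:        # atomic predicate applied to variables, e.g. M(x)
    name: str
    args: tuple

@dataclass
class Implies:     # material implication
    left: object
    right: object

@dataclass
class And:         # conjunction
    left: object
    right: object

@dataclass
class ForAll:      # universal quantifier binding a variable
    var: str
    body: object

@dataclass
class Exists:      # existential quantifier binding a variable
    var: str
    body: object

# "All men are mortal": ∀x (M(x) → D(x))
all_men_mortal = ForAll("x", Implies(Pred("M", ("x",)), Pred("D", ("x",))))

# Multiple generality, roughly "Some guys have all the luck":
# ∃x (Guy(x) ∧ ∀y (Luck(y) → Has(x, y))) — both "some" and "all" appear,
# which Aristotle's fixed subject–copula–predicate scheme cannot express.
some_guys_all_luck = Exists(
    "x",
    And(Pred("Guy", ("x",)),
        ForAll("y", Implies(Pred("Luck", ("y",)), Pred("Has", ("x", "y"))))),
)
```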
Deductive and inductive reasoning, and abductive inference
Deductive reasoning concerns what follows necessarily from given premises (if a, then b). However, inductive reasoning—the process of deriving a reliable generalization from observations—has sometimes been included in the study of logic. Similarly, it is important to distinguish deductive validity and inductive validity (called "cogency"). An inference is deductively valid if and only if there is
no possible situation in which all the premises are true but the conclusion false. An inductive argument can be neither valid nor invalid; its premises give only some degree of probability, but not certainty, to its conclusion.
The notion of deductive validity can be rigorously stated for systems of formal logic in terms of the well-understood notions of semantics. Inductive validity on the other hand requires us to define a reliable generalization of some set of observations. The task of providing
this definition may be approached in various ways, some less formal than others; some of these definitions may use mathematical models of probability. For the most part this discussion of logic deals only with deductive logic.
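For propositional arguments, the semantic definition of deductive validity can be checked by brute force: an argument is valid exactly when no truth assignment makes every premise true while the conclusion is false. The sketch below is an illustration only, assuming premises encoded as Python functions of a truth assignment; real logics use proof systems or more efficient decision procedures.

```python
# Brute-force check of deductive validity for propositional argument forms.
from itertools import product

def is_valid(premises, conclusion, variables):
    """True iff no truth assignment makes all premises true and the conclusion false."""
    for values in product([True, False], repeat=len(variables)):
        v = dict(zip(variables, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False  # found a counterexample situation
    return True

# Modus ponens: from "p" and "if p then q", infer "q" — deductively valid.
premises = [lambda v: v["p"], lambda v: (not v["p"]) or v["q"]]
print(is_valid(premises, lambda v: v["q"], ["p", "q"]))   # True

# Affirming the consequent: from "q" and "if p then q", infer "p" — invalid.
premises = [lambda v: v["q"], lambda v: (not v["p"]) or v["q"]]
print(is_valid(premises, lambda v: v["p"], ["p", "q"]))   # False
```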
Abduction[11] is a form of logical inference that goes from observation to a hypothesis that accounts for the reliable data (observation) and seeks to explain relevant evidence. The American philosopher Charles Sanders Peirce (1839–1914) first introduced the term as "guessing".[12] Peirce said that to abduce a hypothetical explanation a from an observed surprising circumstance b is to surmise that a may be true because then b would be a matter of course.[13] Thus, to abduce a from b involves determining that a is sufficient (or nearly sufficient), but not necessary, for b.
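A toy sketch of this pattern: given an observation and a stock of rules of the form "hypothesis → consequence", abduction surmises any hypothesis that would make the observation a matter of course. The rules and names below are hypothetical illustrations, not content from the article.

```python
# Toy abduction: surmise hypotheses whose consequences match the observation.
rules = {
    "it rained last night": "the grass is wet",
    "the sprinkler ran": "the grass is wet",
    "the sun stayed out all day": "the grass is dry",
}

def abduce(observation, rules):
    """Return every hypothesis whose consequence matches the observation."""
    return [h for h, consequence in rules.items() if consequence == observation]

print(abduce("the grass is wet", rules))
# ['it rained last night', 'the sprinkler ran'] — each hypothesis is sufficient
# (it would explain the observation) but not necessary (others explain it too).
```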
Consistency, validity, soundness, and completeness
Among the important properties that logical systems can have:
∙Consistency, which means that no theorem of the system contradicts another.[14]
∙Validity, which means that the system's rules of proof never allow a false inference from true premises. A logical system has the property of soundness when the logical system
has the property of validity and uses only premises that prove true (or, in the case of
axioms, are true by definition).[14]
∙Completeness of a logical system, which means that if a formula is true, it can be proven (if it is true, it is a theorem of the system).
∙Soundness. The term "soundness" has multiple separate meanings, which creates some confusion in the literature. Most commonly, soundness refers to logical systems, meaning that if some formula can be proven in a system, then it is true in the relevant model/structure (if A is a theorem, it is true). This is the converse of completeness. A distinct, peripheral use of soundness refers to arguments, meaning that the premises of a valid argument are true in the actual world.
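Consistency has a simple semantic counterpart in classical propositional logic, which is sound and complete: a finite set of formulas is consistent exactly when some truth assignment makes all of them true. The sketch below is an illustration under that assumption, with formulas encoded as Python functions of a truth assignment.

```python
# Consistency via satisfiability (classical propositional logic only).
from itertools import product

def satisfiable(formulas, variables):
    """True iff some truth assignment makes every formula true."""
    return any(
        all(f(dict(zip(variables, values))) for f in formulas)
        for values in product([True, False], repeat=len(variables))
    )

# {p, p → q} is consistent: the assignment p=True, q=True satisfies both.
print(satisfiable([lambda v: v["p"], lambda v: (not v["p"]) or v["q"]], ["p", "q"]))  # True

# {p, ¬p} is inconsistent: no assignment satisfies both.
print(satisfiable([lambda v: v["p"], lambda v: not v["p"]], ["p"]))  # False
```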
Some logical systems do not have all four properties. As an example, Kurt Gödel's incompleteness theorems show that sufficiently complex formal systems of arithmetic cannot be both consistent and complete;[9] however, first-order predicate logics not extended by specific axioms to be arithmetic formal systems with equality can be complete and consistent.[15]
Rival conceptions of logic
Main article: Definitions of logic
Logic arose (see below) from a concern with correctness of argumentation. Modern logicians usually wish to ensure that logic studies just those arguments that arise from appropriately general forms of inference. For example, Thomas Hofweber writes in the Stanford Encyclopedia of Philosophy that logic "does not, however, cover good reasoning as a whole. That is the job of the theory of rationality. Rather it deals with inferences whose validity can be traced back to the formal features of the representations that are involved in that inference, be they linguistic, mental, or other representations".[16]
By contrast, Immanuel Kant argued that logic should be conceived as the science of judgement, an idea taken up in Gottlob Frege's logical and philosophical work. But Frege's work is ambiguous in the sense that it is both concerned with the "laws of thought" as well as with the "laws of truth", i.e. it both treats logic in the context of a theory of the mind, and treats logic as the study of abstract formal structures.
History
Main article: History of logic
Aristotle, 384–322 BCE.
