If someone said to you that it is virtually impossible to know the probability of a pedestrian being struck and killed by a car in a given city, you might be inclined to think that such a problem is far too vast and complex ever to 'solve.'
This might seem reasonable, since, after all, how could we ever corner such a huge number of variables to end up with one figure?
"I mean, with all the variables: cars, streets, people, different types of people, let alone times and seasons, let alone differing populations, not to mention the ebb and flow of fluctuating populations; it's virtually impossible to determine such a figure."
This inclination is a logical fallacy and something I personally like to call ‘smokescreen complexity.’
(Formally, from what I could find, this is closest to what is called 'Complexity Simplistic.')
It consists of denying human intellectual ability, not by exaggerating a thing's complexity beyond what its nature warrants, per se, but by accepting the problem as 'far too complex,' pre-analysis, through the presentation of an array of multiple, often deliberately unintegrated contingencies.
This is done chiefly by throwing out an endless number of possibilities, i.e. 'what about x, y, z, and f, and b possibilities?', hence the word 'smokescreen,' which describes its operative function: to distract attention.
Analysis is the process of breaking a problem down into more manageable parts, of examining them and narrowing down a solution from larger to smaller and smaller components.
To reiterate more concisely: the purpose of posing a problem rests on the assumption that it can be solved, and so the purpose of analysis is to break a problem down to a solution.
The initial problem here of car-to-pedestrian deaths is a good example because it is one of those broad, seemingly sweeping issues.
And yet, all we really need to do to at least break this problem down is: take the number of people actually present in the city at given times and average those instances; take the total number of car-related deaths per given period (say, a year or a month); and divide the one by the other. The result is a ratio relative to time, e.g. 'there is a 1 in 1,004 chance per year of being struck and killed by a car in New York City.'
(Realize that cognitive knowledge and a practical means to knowledge are two related but distinct things, and that any criticism of 'but we might not have the means' is a separate issue from the above example.)
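The back-of-envelope estimate described above can be sketched in a few lines of code. The figures below are hypothetical placeholders chosen only for illustration, not real statistics for any city:

```python
def pedestrian_death_odds(avg_people_present: float, deaths_per_period: float) -> float:
    """Return N such that the chance of being struck and killed by a car
    in one period is roughly '1 in N'."""
    if deaths_per_period <= 0:
        raise ValueError("deaths_per_period must be positive")
    return avg_people_present / deaths_per_period

# Hypothetical inputs: an average of 8,400,000 people present in the city,
# and 120 car-to-pedestrian deaths over one year.
odds = pedestrian_death_odds(8_400_000, 120)
print(f"Roughly 1 in {odds:,.0f} per year")  # → Roughly 1 in 70,000 per year
```

The point of the sketch is the essay's own: once the averaged inputs are in hand, the 'impossibly complex' problem reduces to a single division.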
It is obvious that questions too abstract to admit of a literal, single-specific solution must be grasped by a representative value. The failure to grasp abstract representation is at the heart of this kind of thinking, which continually treats abstract problems as if they were single-instance concretes.
Mind you, the great hypocrisy is that the dominant cultural subjectivism today, coming mainly from leftist PC ideology, accepts all sorts of abstract, probable, representative values so long as they come from a 'compliant' source (e.g. it accepts the theory of evolution and global warming without question), yet opposes any new information coming from someone without a degree or an established scientific theory.
The 'argument from authority' fallacy here is so obvious, and the left's supplanting of reason by authority so flagrantly practiced, that one wonders what the hegemonic motive of the PC left really is. My assertion is that they are just as concretist, subjective, and 'religious' as the right, and in any case are conformists, not thinkers.
Every step of the way, a subjectivist will tell you that the problem cannot be solved or even approached. He will ask: since the number of people living in a city changes from day to day, and since there is no single figure from which to derive an equation, how could any such equation ever be made?
My opinion, aside from this being the effects of the concretist* mentality, is that this originally comes from an epistemic/semantic confusion of those who regard the so-called hard sciences as the only verifiable knowledge.
(*’Concretist’ is my own term: it’s another word for ‘concrete-bound’–someone who thinks mainly on the level of concrete entities, rather than abstracts or representative values.)
This is plausible since, as with chemistry, exact figures with exact repeatability can be derived, as opposed to sociology and even history, which require probability and approximation to solve their problems.
Approximation, as in an average or a range, along with the entire concept of probabilistic knowledge, is the crux of what the subjectivist omits in general.
But to say, at least culturally, that the other sciences are 'not an exact science' is to imply that they merit no real serious consideration. And yet this is the basis upon which most sciences, including the psychological and the economic, rest; without it they would completely collapse and be impossible to measure.
This is in spite, however, of the obvious repeatability of result-generating approximations in a great number of things, since they probably comprise the majority of what life entails.
The prime motive of smokescreen complexity is obviously to disarm opposition. And again, the nature of this fallacy is not necessarily to make a problem look more complex than it really is (though it may incidentally lead to that), but to make sure the problem remains too complex to solve, so as to avoid examining the topic altogether and to proclaim it 'unknowable.'
All of this, however, is a preface to the axiom of the subjectivist's general methodology, and it lies in how one defines knowledge.
If one defines knowledge as 'the conformity of thought to reality as inducted through the senses,' and definitions, being definitions, must be absolute, then the only knowledge there could ever be is that which is directly, completely observable and confirmable within the physical senses' immediate range. That is, as opposed to concepts held in the mind which merely refer to concrete referents.
This is the definition currently in vogue within the primary intellectual circles today, and it is treated as an absolute. I should clarify that this definition is merely stolen from the original one and absolutized in order to stymie all who oppose it.
One cannot divorce the definition of knowledge from the purpose of human life and expect survival; that is the reason this definition, taken as its own absolute (i.e. without included contingencies), cannot hold up in reality.
From any purposeful standpoint, most of our dealings with the world cannot treat absolutely complete information as the sole form of knowledge.
As another example of this, I see people using science, and its insistence on strictly empirical evidence as the sole ground of proof, as a standard to further the hindrance of epistemic progress in general.
We must define knowledge according to the integration of the practical as well as the cognitive.
Notice that these two, 'the practical' and 'the cognitive,' are examples of contingencies, since they represent variances within the problem of what knowledge is.
Unlike the practitioner of smokescreen complexity, however, a true thinker must, with respect to reality, work to integrate contingencies, not merely generate them, and work to produce controls from variances, not variances out of potential controls.
This does not mean we forgo the fact that the best knowledge is, ultimately, the complete picture. However, if we defined that ultimate knowledge as the only form of knowledge, then hardly anything would count as knowledge at all; even the most rigorous scientific testing could not produce 'knowledge.'
(It is almost needless to state that the implication of the above is what the whole subjectivist mentality is ultimately pushing toward: the total destruction of all human awareness, even scientific.)
And yet, as life in reality goes on, humans continue to observe that most things are incompletely known and yet, perhaps most often, highly repeatable.
Therefore, I would, for the sake of a consistent epistemic view, define knowledge to be:
‘The conformity of thought to reality, either as absolute, or under the condition of probability, but both as necessarily relative to predictable repeatability.’
The first reason one might think of for embracing the above definition is that it removes the inefficiency of perpetually needing another word to qualify 'probable knowledge' as knowledge.
This serves a great pragmatic need: preserving semantic economy. Always having to preface everything one says with 'provisional knowledge,' 'probable knowledge,' etc. would make talking and writing a horribly inefficient exchange, perhaps to the point of a semantic train wreck, repeated practically every time the word 'knowledge' is uttered!
Still speaking semantically, I believe the additional word that makes an exceptional phrase should go to the exceptional case. Since the outstanding instance is obviously the case in which we have complete, infinitely repeatable knowledge of a thing (which, the subjectivists are right, is probably impossible), it is in that instance that we should use the phrase 'absolute knowledge.'
However, as partially stated and certainly implied, the main reasons for the conceptual inclusion are objective, epistemic reasons.
The definition above would set up an epistemic backdrop, or context, that distinguishes two types of objective knowledge, so that a person's mind can contrast the absolute against the probable within the same instance of understanding.
Further, and most importantly:
Probable knowledge must of necessity be considered a type of knowledge, since knowledge possesses meaning only within a context of practice, one which cannot forget such concepts as intelligence: the ability to learn. To learn what? Knowledge. For what? In order to act.
In other words, to consider knowledge outside the existence of intelligent action, is to invalidate the foundations of its own definition.
Therefore, the concretist-subjectivist tool of smokescreen complexity can now be identified and put down by pulling it up from its roots: by illuminating a new definition of objective knowledge that includes the probable.