An Implementation Methodology for Social Capital - Entropy and Self-Organization in Catallaxy
  • Cha Joo-hak (joohakcha@gmail.com)
  • Published 2013.07.25 19:19

The fourth article in the series on a concrete methodology for building social capital

SEOUL, KOREA - Life belongs to a class of phenomena that are open, continuous reaction systems, able to decrease their internal entropy at the expense of free energy taken from the environment and subsequently rejected in degraded form. By the act of living, an organism continuously creates entropy, and there is an outward flux of entropy across its boundary. For an isolated system such as the Universe as a whole, by contrast, not only do we not know of a source that supplies the fuel for its ever-increasing entropy, but no source is permitted in principle: no feeding mechanism and no provision for supplies of anything from the outside.
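
A minimal sketch of this bookkeeping uses the standard open-system balance dS = d_eS + d_iS, where d_iS is the entropy produced inside the organism (never negative) and d_eS is the entropy exchanged with the environment. The numbers below are purely illustrative, chosen only to show how the organism's own entropy can fall while the total still rises:

    # Open-system entropy balance with purely illustrative numbers.
    d_iS = 0.4    # entropy produced internally by living (always >= 0)
    d_eS = -0.7   # entropy exported to the environment with degraded free energy

    dS_organism = d_iS + d_eS        # -0.3: the organism's own entropy can decrease
    dS_environment = -d_eS           # +0.7: the exported entropy lands in the environment

    print(f"organism:    {dS_organism:+.1f}")
    print(f"environment: {dS_environment:+.1f}")
    print(f"total:       {dS_organism + dS_environment:+.1f}")  # +0.4, never negative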

Many people have heard of the Second Law of Thermodynamics. It is the law stating that the Universe is forever running down towards a "Heat Death". It is based on the concept of entropy, which has been loosely associated with the inability of a system to do work; with disorder, mixed-upness, disorganization and chaos; with uncertainty, ignorance and missing information; and with the tendency of a system to enter a more probable state, usually described as creating chaos from order.

Catallaxy operates within the knowledge-based society, and the emergent properties of a market (prices, the division of labor, growth, energy, emergence or creativity, etc.) are the outgrowths of the diverse and disparate goals of the individuals in a community.

Everybody is striving for wealth, or rather for getting rich, in a chrematistic economy, so the natural tendency of this group of autonomous processes is towards disorder, not towards order, structure or organization. Adding information to a collection of agents/actors can lead to increased organization, but only if it is added in the right way. An increase of organization or structure is tantamount to an increase of order.

In the context of Catallaxy, which is the order brought about by the mutual adjustment of many individual economies in a market, self-organization can be reconciled with the tendencies of the Second Law of Thermodynamics if a system includes multiple coupled levels of dynamic activity. 

Purposeful, self-organizing behavior occurs at the MACRO level. By itself, such behavior would be contrary to the Second Law of Thermodynamics. However, the system also includes a MICRO level whose dynamics generate increasing disorder, so the system as a whole still becomes more disordered over time. Crucially, the behavior of elements at the macro level is coupled to the micro-level dynamics: the ordering above is paid for by the disordering below.
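
A toy two-level ledger, with numbers chosen only for illustration, makes the coupling concrete: at each step the macro level sheds a little entropy while the micro level to which it is coupled gains more, so the total never decreases.

    # Toy two-level entropy ledger; all numbers are illustrative assumptions.
    S_macro, S_micro = 10.0, 10.0    # arbitrary starting "entropy" scores

    for t in range(1, 6):
        ordering = 0.5    # entropy removed from the macro level per step
        cost = 0.8        # entropy generated at the micro level (always larger)
        S_macro -= ordering
        S_micro += cost
        print(f"step {t}: macro={S_macro:.1f}  micro={S_micro:.1f}  "
              f"total={S_macro + S_micro:.1f}")
    # The macro level falls to 7.5, the micro level rises to 14.0,
    # and the total only grows: self-organization without violating the Second Law.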

Entropy is information.

Entropy can be made identical, both formally and conceptually, with a specific measure of information. The difficulty in accepting this identity is that entropy is a physically measurable quantity, having units of energy divided by temperature, and is therefore an objective quantity. Information, however, is often viewed as a nebulous, dimensionless quantity expressing some kind of human attribute such as knowledge, ignorance or uncertainty, hence a highly subjective quantity. In spite of the apparent irreconcilability between an objective and a subjective entity, entropy and information are very closely related. In fact, entropy can be regarded as a measure of ignorance. When it is known only that a system is in a given macrostate, the entropy of the macrostate measures our ignorance of the microstate by counting the number of bits of additional information needed to specify it, with all the microstates treated as equally probable.
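
A minimal sketch of this counting, assuming a toy system of 100 two-state particles whose macrostate says only that half of them are "up", shows the two faces of the same count: the missing information in bits, and the thermodynamic entropy S = k_B ln W in joules per kelvin.

    import math

    # Toy system assumed for illustration: N two-state particles, macrostate "half up".
    # That macrostate is compatible with W = C(N, N/2) equally probable microstates.
    k_B = 1.380649e-23          # Boltzmann constant, J/K
    N = 100
    W = math.comb(N, N // 2)

    missing_bits = math.log2(W)     # bits needed to single out the actual microstate
    S = k_B * math.log(W)           # the same count expressed in thermodynamic units

    print(f"W = {W:.3e} compatible microstates")
    print(f"missing information = {missing_bits:.1f} bits")
    print(f"entropy S = {S:.3e} J/K")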

Chaos, and not order, is the most probable state; this is the tendency usually described as creating chaos from order. So which states are probable, exactly? To give an example, suppose we have a pack of cards and shuffle it: are we then likely to deal a sequence of four cards that are all aces? No; in fact, the theory of gambling is based on the idea that a shuffle randomizes the pack, so that the cards dealt will be in no special order. Four aces are so improbable that they would be expected to occur by chance only about once in 270 thousand such deals. Order thus has a low probability, and any change to a system (such as a shuffling) can be expected to reduce its order significantly.
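
The quoted figure is easy to verify: there are C(52, 4) = 270,725 distinct four-card deals from a 52-card pack, and only one of them is the four aces. A few lines of Python confirm it:

    from math import comb

    # Number of distinct four-card deals from a 52-card pack; exactly one is the four aces.
    hands = comb(52, 4)
    print(hands)                                   # 270725
    print(f"P(four aces) = 1/{hands} = {1 / hands:.2e}")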

Order can also be regarded as information, so we can classify the complexity of a system by how much information we need to describe it. If we do this, we find that both solids and gases have low complexity (simple descriptions), yet fully describing a whirlpool would require a very extensive description, forever changing with time: liquids have a potentially high information content. Local interactions of liquid molecules give a dynamic structure to the liquid, which can cause the emergence of unexpected features. These features are not predicted by traditional entropy considerations; they are too improbable.

This discrepancy is perhaps best explained by noting that in work on equilibrium systems it is usual to simplify the terms and use only what is better known as the 'conditional entropy'. Yet entropy overall is conserved, and to complete the picture we need to add in the 'entropy of correlation', which relates to the information the observer holds about the system. As a system 'runs down' and becomes more disorganized, the knowledge held by the observer decreases, and hence the conditional entropy increases (as tradition dictates); but in self-organizing systems this run-down does not happen, so we can have either a static entropy or a decreasing one. When that occurs, the complex state is the probable one and no discrepancy exists.
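
One standard way to make this split concrete is the information-theoretic identity H(X) = H(X|Y) + I(X;Y): the conditional entropy (the observer's remaining ignorance) and the entropy of correlation (what the observer's record Y captures about the system X) always add up to the same total. The toy joint distribution below, a noisy observation Y of a two-state system X, is an assumption chosen purely for illustration:

    import math

    p_xy = {            # assumed P(x, y): X is the system state, Y the observer's record
        (0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4,
    }

    def H(dist):
        """Shannon entropy in bits of a dict of probabilities."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

    H_x, H_y, H_xy = H(p_x), H(p_y), H(p_xy)
    H_x_given_y = H_xy - H_y      # conditional entropy: the observer's ignorance
    I_xy = H_x + H_y - H_xy       # entropy of correlation (mutual information)

    print(f"H(X)            = {H_x:.3f} bits")
    print(f"H(X|Y)          = {H_x_given_y:.3f} bits")
    print(f"I(X;Y)          = {I_xy:.3f} bits")
    print(f"H(X|Y) + I(X;Y) = {H_x_given_y + I_xy:.3f} bits")  # equals H(X)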

As the number of variables increases, our observation of the system necessarily becomes more selective and less knowledgeable. Shared information is the exchange of knowledge about such variable states between agents, so we can perhaps reformulate entropy in terms of this information exchange, bringing together both sides of the entropy equation and extending it to a multi-agent scenario rather than the over-simple 'single isolated observer' of the usual formulation.
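
One possible formalization of such a multi-agent measure, offered here only as a sketch rather than as the article's own proposal, is the total correlation C = sum_i H(X_i) - H(X_1, ..., X_n), which quantifies how much information the agents' variables share overall. The three-agent joint distribution below is an assumption chosen purely for illustration:

    import math

    p = {  # assumed P(a, b, c) for three loosely coupled binary agent variables
        (0, 0, 0): 0.30, (1, 1, 1): 0.30,
        (0, 0, 1): 0.10, (0, 1, 0): 0.10,
        (1, 0, 0): 0.10, (1, 1, 0): 0.05, (0, 1, 1): 0.05,
    }

    def H(dist):
        """Shannon entropy in bits of a dict of probabilities."""
        return -sum(q * math.log2(q) for q in dist.values() if q > 0)

    def marginal(dist, i):
        """Marginal distribution of the i-th agent's variable."""
        out = {}
        for state, q in dist.items():
            out[state[i]] = out.get(state[i], 0.0) + q
        return out

    joint_H = H(p)
    marginal_H = sum(H(marginal(p, i)) for i in range(3))
    total_correlation = marginal_H - joint_H   # information the agents share

    print(f"sum of marginal entropies = {marginal_H:.3f} bits")
    print(f"joint entropy             = {joint_H:.3f} bits")
    print(f"total correlation         = {total_correlation:.3f} bits")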

