An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics) (original 1961; 2012 edition)

by John R. Pierce (Author)

Members: 632 | Reviews: 2 | Popularity: 36,911 | Average rating: 3.92 | Discussions: None
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. Beginning with the origins of this burgeoning field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. An Introduction to Information Theory continues to be the most impressive nontechnical account available and a fascinating introduction to the subject for lay listeners.… (meer)
Member: szarka
Title: An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)
Authors: John R. Pierce (Author)
Info: Dover Publications (2012), Edition: Subsequent, 336 pages
Collections: Your library, Ebooks, Audiobooks, Currently reading
Rating:
Tags: math, probability, physics

Work information

An Introduction to Information Theory: Symbols, Signals, and Noise by John R. Pierce (1961)


Showing 2 of 2
(Original Review, 1980-12-05)

Final answer to question, "How many joules to send a bit?"

The unit of information is determined by the choice of the arbitrary scale factor K in Shannon's entropy formula:

S = -K SUM( p_i * ln(p_i) )

If K is made equal to 1/ln(2), then S is said to be measured in "bits" of information. A common thermodynamic choice for K is kN, where N is the number of molecules in the system under consideration and k is Boltzmann's constant, 1.38e-23 joules per kelvin. With that choice, the entropy of statistical mechanics is expressed in joules per kelvin.

The simplest thermodynamic system to which we can apply Shannon's equation is a single molecule that has an equal probability of being in either of two states, for example, an elementary magnet. In this case p = 0.5 for both states, and thus S = k ln(2). The removal of that much uncertainty corresponds to one bit of information. A bit is therefore equal to k ln(2), or approximately 1e-23 joules per kelvin. This is an important figure: the smallest thermodynamic entropy change that can be associated with a measurement yielding one bit of information.
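The arithmetic above is easy to check directly. The following is a small sketch (not part of the original review) that evaluates Shannon's entropy formula for the two choices of K discussed here, using the current CODATA value of Boltzmann's constant:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def shannon_entropy(probs, K):
    """S = -K * SUM(p * ln p), Shannon's entropy with scale factor K."""
    return -K * sum(p * math.log(p) for p in probs if p > 0)

# Two equally likely states with K = 1/ln(2): entropy in bits
bits = shannon_entropy([0.5, 0.5], K=1 / math.log(2))
print(bits)  # ~1.0 bit

# Same two states with K = k: thermodynamic entropy in joules per kelvin,
# which equals k ln(2), roughly 1e-23 J/K as stated above
S = shannon_entropy([0.5, 0.5], K=k)
print(S, k * math.log(2))
```

With p = 0.5 for both states the sum collapses to ln(2), so the two printed thermodynamic values agree to machine precision.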

The amount of energy needed to transmit a bit of information when limited by thermal noise of temperature T is:

E = kT ln 2   (joules per bit)

This is derived in lucid fashion by Pierce (2) from Shannon's initial work (1) on the capacity of a communications channel, although it is not obvious that Pierce was the first to derive it. The limit is the same as the amount of energy needed to store or read a bit of information in a computer, which Landauer derived (3) from entropy considerations without the use of Shannon's theorems.

Pierce's book is reasonably readable. On page 192 he derives the energy-per-bit formula (Eq. 10.6), and on page 200 he describes a Maxwell's Demon engine generating kT ln 2 of energy from a single molecule, showing that the Demon had to use that same amount of energy to "read" the position of the molecule. Then on page 177 Pierce points out that one way of approaching this ideal signalling rate is to concentrate the signal power in a single, short, powerful pulse, and to send this pulse in one of many possible time positions, each of which represents a different symbol. This is essentially the concept behind the patent (4) that led me to ask the original question. My thanks to those who helped with their replies.
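The kT ln 2 figure emerges as the wideband limit of Shannon's capacity formula for a channel limited by thermal noise of density kT. The sketch below (an illustration, not from the review; the signal power P is an arbitrary assumed value) shows the energy spent per bit approaching kT ln 2 as the bandwidth W grows:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0         # assumed temperature, kelvin
P = 1e-12         # assumed signal power, watts

def energy_per_bit(W):
    """P / C for a thermal-noise channel of bandwidth W (hertz),
    with Shannon capacity C = W log2(1 + P / (kT W)) in bits/s."""
    C = W * math.log2(1 + P / (k * T * W))
    return P / C

limit = k * T * math.log(2)  # kT ln 2, about 2.87e-21 J at 300 K
for W in (1e6, 1e9, 1e12, 1e15):
    print(W, energy_per_bit(W) / limit)
# the ratio falls toward 1 as bandwidth W grows
```

Since log2(1 + x) < x/ln 2 for x > 0, the energy per bit always exceeds kT ln 2 at finite bandwidth and only approaches it asymptotically, which is why the single-short-pulse scheme Pierce describes can approach but not beat the limit.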

REFERENCES

1. C. E. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J., Vol. 27, No. 3, 379-423 and No. 4, 623-656 (1948); reprinted in: C. E. Shannon and W. Weaver, "The Mathematical Theory of Communication", University of Illinois Press, Urbana, Illinois (1949).
2. J. R. Pierce, "Symbols, Signals and Noise", Harper, NY (1961).
3. R. Landauer, "Irreversibility and Heat Generation in the Computing Process", IBM J. Res. & Dev., Vol. 5, 183 (1961).
4. R. L. Forward, "High Power Pulse Time Modulation Communication System with Explosive Power Amplifier Means", U.S. Patent 3,390,334 (25 June 1968).

[2018 EDIT: This review was written at a time when I was running my own personal BBS server. Much of the language of this and other reviews written in 1980 reflects a very particular register: what I now call, in retrospect, a "BBS language".]
  antao | Nov 6, 2018 |
Brilliant and inspiring book. Enjoyed it immensely. Much use of highlighter.
  jaygheiser | Jul 23, 2008 |

Other authors

John R. Pierce (primary author; all editions; calculated)
Dorland, Cees van (Translator; secondary author; some editions; confirmed)
Newman, James R. (Editor; secondary author; some editions; confirmed)
Dedication
To Claude and Betty Shannon
First words
In 1948 Claude E. Shannon published a paper called "A Mathematical Theory of Communication"; it appeared in book form in 1949.

References to this work in external sources.

Wikipedia in English (1)



Rating

Average: 3.92 (2 stars: 1; 3 stars: 9; 3.5 stars: 3; 4 stars: 9; 5 stars: 10)
