276°
Posted 20 hours ago

Shannon's Way

£9.49 (was £18.99) Clearance
Shared by
ZTS2023
Joined in 2023

About this deal

Despite being the subject of the recent documentary The Bit Player — and someone whose work and research philosophy have inspired my own career — Shannon is not exactly a household name. He never won a Nobel Prize, and he wasn’t a celebrity like Albert Einstein or Richard Feynman, either before or after his death in 2001. But more than 70 years ago, in a single groundbreaking paper, he laid the foundation for the entire communication infrastructure underlying the modern information age. When Shannon began studying communication, engineers already had a large collection of techniques. It was his unifying work that pruned all these twigs of knowledge into a single coherent and lovely tree — one that’s borne fruit for generations of scientists, mathematicians and engineers.

Condition: Good. This is an ex-library book and may have the usual library/used-book markings inside. This book has soft covers. In good all-round condition. First Edition. Previous owner's inscription. Very good cloth copy in a good if somewhat edge-torn (with some loss) and dust-dulled dust-wrapper, now mylar-sleeved. Remains quite well-preserved overall. Physical description: 303p. ; 20cm. Subjects: English fiction -- 20th century. 1 Kg.

Finally, he showed that reliable communication of the information from the source in the face of noise is possible if and only if H < C. Thus, information is like water: if the flow rate is less than the capacity of the pipe, then the stream gets through reliably.
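As a rough illustration of that H < C condition (my own sketch, not from the book or the article): for a binary symmetric channel that flips each bit with probability p, the capacity is C = 1 - H2(p), where H2 is the binary entropy function, and reliable transmission is possible exactly when the source entropy rate H stays below it. The numbers below are assumed purely for illustration.

    import math

    def h2(p):
        # Binary entropy in bits; by convention H2(0) = H2(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    H = 0.85              # assumed source entropy rate, bits per symbol
    C = 1 - h2(0.02)      # capacity of a binary symmetric channel with 2% bit flips (about 0.859)

    # Shannon's condition: reliable communication is possible if and only if H < C.
    print(f"H = {H:.3f}, C = {C:.3f}, reliable communication possible: {H < C}")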

Shannon’s general theory of communication is so natural that it’s as if he discovered the universe’s laws of communication, rather than inventing them. His theory is as fundamental as the physical laws of nature. In that sense, he was a scientist. Second, he provided a formula for the maximum number of bits per second that can be reliably communicated in the face of noise, which he called the system’s capacity, C. This is the maximum rate at which the receiver can resolve the message’s uncertainty, effectively making it the speed limit for communication. While this is a theory of communication, it is, at the same time, a theory of how information is produced and transferred — an information theory. Thus Shannon is now considered “the father of information theory.” Shannon invented new mathematics to describe the laws of communication. He introduced new ideas, like the entropy rate of a probabilistic model, which have been applied in far-ranging branches of mathematics such as ergodic theory, the study of the long-term behavior of dynamical systems. In that sense, Shannon was a mathematician.
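To make the entropy idea concrete, here is a small worked sketch (my own, with made-up probabilities): for a memoryless source, the entropy H = -Σ p(x) log2 p(x) gives the average number of bits per symbol needed to describe its output.

    import math

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A toy four-symbol source; the probabilities are assumed purely for illustration.
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    print(entropy(source.values()), "bits per symbol")   # 1.75: the best achievable average description length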


The tranquility along this trail is special, and underfoot the path lies within the natural area of focus. Because this area is so special, Leave No Trace applies when walking this section. From The Hill to the road no litter was observed. This made The Shannon Way special.

Science seeks the basic laws of nature. Mathematics searches for new theorems to build upon the old. Engineering builds systems to solve human needs. The three disciplines are interdependent but distinct. Very rarely does one individual simultaneously make central contributions to all three — but Claude Shannon was that rare individual. Another unexpected conclusion stemming from Shannon’s theory is that whatever the nature of the information — be it a Shakespeare sonnet, a recording of Beethoven’s Fifth Symphony or a Kurosawa movie — it is always most efficient to encode it into bits before transmitting it. So in a radio system, for example, even though both the initial sound and the electromagnetic signal sent over the air are analog waveforms, Shannon’s theorems imply that it is optimal to first digitize the sound wave into bits, and then map those bits into the electromagnetic wave. This surprising result is a cornerstone of the modern digital information age, where the bit reigns supreme as the universal currency of information.
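The "digitize first" conclusion is easy to picture with a minimal sketch (mine, not Shannon's): each analog sample is mapped to one of 2^n uniformly spaced levels, i.e. an n-bit code, before anything is transmitted. The sample values and bit width below are assumptions for illustration only.

    def quantize(samples, n_bits=3, lo=-1.0, hi=1.0):
        # Uniform quantization: map each analog sample in [lo, hi] to an n-bit integer code.
        levels = 2 ** n_bits
        step = (hi - lo) / levels
        codes = []
        for x in samples:
            x = min(max(x, lo), hi - 1e-12)      # clamp into range so the top code is levels - 1
            codes.append(int((x - lo) / step))   # integer code in [0, levels - 1]
        return codes

    # A few made-up analog waveform samples, digitized to 3 bits each.
    print(quantize([0.0, 0.5, -0.99, 0.999]))    # [4, 6, 0, 7]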

Given that framework of uncertainty and probability, Shannon set out in his landmark paper to systematically determine the fundamental limit of communication. His answer came in three parts. Playing a central role in all three is the concept of an information “bit,” used by Shannon as the basic unit of uncertainty. A portmanteau of “binary digit,” a bit could be either a 1 or a 0, and Shannon’s paper is the first to use the word (though he said the mathematician John Tukey used it in a memo first).

Communication is one of the most basic human needs. From smoke signals to carrier pigeons to the telephone to television, humans have always sought methods that would allow them to communicate farther, faster and more reliably. But the engineering of communication systems was always tied to the specific source and physical medium. Shannon instead asked, “Is there a grand unified theory for communication?” In a 1939 letter to his mentor, Vannevar Bush, Shannon outlined some of his initial ideas on “fundamental properties of general systems for the transmission of intelligence.” After working on the problem for a decade, Shannon finally published his masterpiece in 1948: “A Mathematical Theory of Communication.”
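To pin down the bit as a unit of uncertainty (a short illustration of my own, not from the article): picking one message out of N equally likely possibilities takes log2(N) bits, so a fair coin flip carries exactly one bit.

    import math

    # Resolving a choice among N equally likely messages takes log2(N) bits.
    for n in (2, 8, 26, 1024):
        print(f"{n:>5} equally likely messages -> {math.log2(n):.2f} bits")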

Asda Great Deal

Free UK shipping. 15 day free returns.