In quantum computing, a qubit or quantum bit (sometimes qbit) is the basic unit of quantum information.

Information

Information can be thought of as the resolution of uncertainty; it answers the question of “what an entity is” and thus defines both its essence and the nature of its characteristics. The concept of information has different meanings in different contexts. Thus the concept becomes related to notions of constraint, communication, control, data, form, education, knowledge, meaning, understanding, mental stimuli, pattern, perception, representation, and entropy.

Information is associated with data, as data represent values attributed to parameters, and information is data in context and with meaning attached. Information also relates to knowledge, as knowledge signifies understanding of an abstract or concrete concept.

In terms of communication, information is expressed either as the content of a message or through direct or indirect observation. That which is perceived can be construed as a message in its own right, and in that sense, information is always conveyed as the content of a message.

Information can be encoded into various forms for transmission and interpretation (for example, information may be encoded into a sequence of signs, or transmitted via a signal). It can also be encrypted for safe storage and communication.
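To make the idea of encoding concrete, here is a minimal Python sketch (an illustration only; the helper names to_bits and from_bits are hypothetical, not from any library) that encodes a message into a sequence of binary signs and decodes it again:

def to_bits(message: str) -> str:
    """Encode a string as a sequence of '0'/'1' signs via its UTF-8 bytes."""
    return "".join(format(byte, "08b") for byte in message.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Decode a bit string produced by to_bits back into text."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

encoded = to_bits("hi")
print(encoded)             # 0110100001101001
print(from_bits(encoded))  # hi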

The uncertainty of an event is measured by its probability of occurrence: the lower the probability, the greater the uncertainty, and the more information is required to resolve it. The bit is a typical unit of information, but other units such as the nat may be used. For example, the information encoded in one “fair” coin flip is log2(2) = 1 bit, and in two fair coin flips is log2(4) = 2 bits.
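To make the arithmetic concrete, the following Python sketch (an illustration, assuming the standard log-of-inverse-probability definition of self-information) computes the information content of an event from its probability:

from math import log2

def self_information(p: float) -> float:
    """Information, in bits, gained by observing an event of probability p."""
    return log2(1 / p)

print(self_information(1 / 2))  # one fair coin flip: 1.0 bit
print(self_information(1 / 4))  # two fair coin flips: 2.0 bits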

View more:

https://en.m.wikipedia.org/wiki/Information

Information management

Information management (IM) concerns a cycle of organizational activity: the acquisition of information from one or more sources, the custodianship and the distribution of that information to those who need it, and its ultimate disposition through archiving or deletion.

This cycle of information organisation involves a variety of stakeholders, including those who are responsible for assuring the quality, accessibility and utility of acquired information; those who are responsible for its safe storage and disposal; and those who need it for decision making. Stakeholders might have rights to originate, change, distribute or delete information according to organisational information management policies.

Information management embraces all the generic concepts of management, including the planning, organizing, structuring, processing, controlling, evaluation and reporting of information activities, all of which are needed in order to meet the needs of those with organisational roles or functions that depend on information. These generic concepts allow information to be presented to the correct audience or group of people. Once individuals are able to put that information to use, it gains additional value.

Information management is closely related to, and overlaps with, the management of data, systems, technology, processes and – where the availability of information is critical to organisational success – strategy. This broad view of the realm of information management contrasts with the earlier, more traditional view, that the life cycle of managing information is an operational matter that requires specific procedures, organisational capabilities and standards that deal with information as a product or a service.

View more:

https://en.m.wikipedia.org/wiki/Information_management

Information science

Information science (also known as information studies) is an academic field which is primarily concerned with analysis, collection, classification, manipulation, storage, retrieval, movement, dissemination, and protection of information. Practitioners within and outside the field study the application and the usage of knowledge in organizations along with the interaction between people, organizations, and any existing information systems with the aim of creating, replacing, improving, or understanding information systems. Historically, information science is associated with computer science, psychology, technology and intelligence agencies. However, information science also incorporates aspects of diverse fields such as archival science, cognitive science, commerce, law, linguistics, museology, management, mathematics, philosophy, public policy, and social sciences.

Scope and approach

Information science focuses on understanding problems from the perspective of the stakeholders involved and then applying information and other technologies as needed. In other words, it tackles systemic problems first rather than individual pieces of technology within that system. In this respect, one can see information science as a response to technological determinism, the belief that technology “develops by its own laws, that it realizes its own potential, limited only by the material resources available and the creativity of its developers. It must therefore be regarded as an autonomous system controlling and ultimately permeating all other subsystems of society.”

Many universities have entire colleges, departments or schools devoted to the study of information science, while numerous information-science scholars work in disciplines such as communication, computer science, law, and sociology. Several institutions have formed an I-School Caucus (see List of I-Schools), but numerous other institutions also have comprehensive information foci.

Within information science, current issues as of 2013 include:

– Human–computer interaction for science
– Groupware
– The Semantic Web
– Value sensitive design
– Iterative design processes
– The ways people generate, use and find information

View more:

https://en.m.wikipedia.org/wiki/Information_science

Information theory

Information theory is the scientific study of the quantification, storage, and communication of information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
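As a small worked example of the coin-versus-die comparison above, this Python sketch (illustrative only) computes the Shannon entropy of a discrete distribution:

from math import log2

def entropy(probabilities) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([1 / 2, 1 / 2]))  # fair coin: 1.0 bit
print(entropy([1 / 6] * 6))     # fair die: ~2.585 bits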

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection and even art creation.
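The link between entropy and compression can be seen directly with Python's standard zlib module; this is a sketch, not a benchmark, showing that predictable (low-entropy) data compresses far better than random (high-entropy) data:

import os
import zlib

redundant = b"ab" * 50_000         # highly predictable, low entropy
random_data = os.urandom(100_000)  # unpredictable, high entropy

print(len(zlib.compress(redundant)))    # a few hundred bytes
print(len(zlib.compress(random_data)))  # close to the original 100,000 bytes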

View more:

https://en.m.wikipedia.org/wiki/Information_theory