{Glossary} [Arbitrary] Based on or determined by individual preference or convenience rather than by necessity or the intrinsic nature of something. : ‘Arbitrary’ comes from the Latin ‘arbiter’, meaning ‘judge’. In English, ‘arbitrary’ first meant "depending upon choice or discretion" and was specifically used to indicate the sort of decision left up to the expert determination of a judge rather than defined by law. [Non-Arbitrary] Based on or determined by predefined patterns or parameters, ensuring consistency and logical reasoning behind each selection. : The adjective ‘non-arbitrary’ is formed by adding the prefix non- (‘not’) to ‘arbitrary’, directly opposing the notion of decisions made on a whim or by personal discretion. Historically, the concept of non-arbitrariness has been fundamental in fields requiring objectivity, rationality, and fairness. [Bit] The bit represents a logical state with one of two possible values, most commonly represented as either ‘1’ or ‘0’. : The bit, a portmanteau of ‘binary digit’, is the most basic unit of information in computing and digital communications. Other representations, such as true/false, yes/no, on/off, or +/−, are also widely used. In information theory, one bit is the informational entropy of a random binary variable that is 0 or 1 with equal probability. The bit is also known as a ‘shannon’, named after Claude Shannon. [Generative Art] Art that, in whole or in part, has been created with the use of an ‘autonomous system’, often involving a programmed set of rules. : An ‘autonomous system’ in this context is generally one that is non-human and can independently determine features of an artwork that would otherwise require decisions made directly by the artist. In some cases, the human creator may claim that the generative system represents their artistic idea; in others, that the system takes on the role of the creator.
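The information-theoretic definition in the Bit entry above can be illustrated with a short sketch. The function name `binary_entropy` is an illustrative choice, not from the source; it computes Shannon's formula H = −Σ p·log₂(p) for a two-outcome variable, and a fair coin works out to exactly one bit (one shannon).

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a binary variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(binary_entropy(0.5))  # prints 1.0 — a fair coin carries exactly one bit
print(binary_entropy(0.9))  # a biased coin carries less than one bit
```

Note that entropy peaks at p = 0.5, matching the definition of the bit as the entropy of an equiprobable binary variable.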
[Proof of Work] A form of cryptographic proof in which one party (the prover) demonstrates to others (the verifiers) that a certain amount of computational effort has been expended to produce a valid output. : The estimated computational effort required from ‘provers’ can be regulated without increasing the minimal work required of ‘verifiers’. The concept was invented by Moni Naor and Cynthia Dwork in 1993 as a way to deter denial-of-service attacks and other network abuses. The term ‘proof of work’ was first coined and formalized in a 1999 paper by Markus Jakobsson and Ari Juels. The concept was first adapted to digital tokens by Hal Finney in 2004 through the idea of ‘reusable proof of work’ using the 160-bit Secure Hash Algorithm 1 (SHA-1). [Mining Difficulty] A self-regulating protocol that ensures consistent block discovery rates, irrespective of the total computing power of the network. : The difficulty adjustment for Bitcoin occurs every 2,016 blocks, which roughly equates to two weeks if blocks are found on target (≈ every 10 minutes). It works by adjusting the difficulty target to regulate the mining challenge; a smaller target admits fewer valid hashes, so more computations are required on average. [Hash Rate] The measure of total computational power put towards securing a proof-of-work (PoW) network. : It is measured in hashes per second (H/s), the number of guesses the network's mining computers make per second to solve a block's hash, and at network scale it is commonly quoted in terahashes per second (TH/s) or larger units. The higher the hash rate, the more cumulative computational effort is being contributed to the network, which typically means a more robust and competitive mining environment as well as increased security and resilience against attacks. [SHA-256] A secure hashing algorithm that takes an input of variable length and produces a 256-bit hash output. : It is collision-resistant, making it practically impossible to find two distinct inputs with the same output. Preimage resistance ensures that the input cannot feasibly be recovered from the hash value.
SHA-256 is deterministic, producing consistent outputs for the same inputs. It exhibits the avalanche effect, where small input changes cause significant output differences. Its design and output size make brute-force attacks computationally infeasible. [Classical Thermodynamic Entropy] The state function of a thermodynamic system that expresses the direction or outcome of spontaneous changes. : Introduced by Rudolf Clausius in the mid-19th century to describe how much of a system's internal energy is available or unavailable for transformation into heat and work. It accounts for the direction of irreversible processes, which the conservation of energy alone does not forbid, giving a way to measure how much energy in a system cannot be used for work. [Statistical Thermodynamic Entropy] A measure of the number of possible microscopic states (microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic state (macrostate). : Introduced in the 1870s by Austrian physicist Ludwig Boltzmann, this outlook on thermodynamic systems was quantified using Boltzmann's formula. It provided the probabilistic groundwork for entropy, connecting the microscopic and macroscopic descriptions of matter and founding statistical mechanics. Think of it as the number of ways you can arrange particles in a system to achieve the same overall effect. [Discrete Entropy] The average amount of information produced by a stochastic source of data, inherent to its range of possible outcomes. : Introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", ‘entropy’ in information theory (also known as Informational Entropy or Shannon Entropy) is directly analogous to the ‘entropy’ in statistical thermodynamics. Gibbs’ formula for entropy, which generalized the concept to systems in a mixed state, is formally identical to Shannon's formula. [Differential Entropy] Also referred to as ‘continuous entropy’, it is a measure of unpredictability over a continuous or infinite range of possibilities.
: It began as an attempt by Claude Shannon to extend the idea of ‘Informational Entropy’, but he did not derive the formula; he simply assumed it was the correct continuous analogue of discrete entropy. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP), formulated by Edwin Thompson Jaynes. [Boltzmann’s Constant] A physical constant that relates the average kinetic energy of particles in a gas to the temperature of the gas. : Boltzmann’s Constant is applied as a scaling factor to discrete informational entropy to obtain a sound formula for measuring statistical thermodynamic entropy. It serves as a bridge between the macroscopic and microscopic extremities of thermodynamics, turning a dimensionless measure of entropy into a physical value. [Dimensionless Quantities] Numerical values that do not depend on any physical units for their size or value, making them ‘pure numbers’. : Also known as ‘quantities of dimension one’, they are used in mathematics, physics, engineering, and information technology to compare ratios or relative sizes, facilitate calculations, and simplify formulas by eliminating the need for unit conversions. In the 19th century, French mathematician Joseph Fourier and Scottish physicist James Clerk Maxwell led significant developments in the modern concepts of dimension and unit. Later work by British physicists Osborne Reynolds and Lord Rayleigh contributed to the understanding of dimensionless numbers in physics. Building on Rayleigh's method of dimensional analysis, Edgar Buckingham proved the π theorem to formalize the nature of these quantities. [Kardashev Scale] A method of measuring a civilization's level of technological advancement based on the amount of energy it is capable of using. : Named after Soviet astronomer Nikolai Kardashev, the initial model proposed a hypothetical classification of civilizations into three types, based on the axiom of exponential growth.
A Type I civilization can access all the energy available on its planet and store it for consumption. A Type II civilization can directly consume a star's energy. A Type III civilization can capture all of the energy emitted by its galaxy, and every object within it. Since then, the classes have been theorized and expanded upon extensively.
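The SHA-256, proof-of-work, and mining-difficulty entries above can be sketched together in a few lines: a prover searches for a nonce whose hash falls below a difficulty target, a verifier checks the claim with a single hash, and shrinking the target admits fewer valid hashes and so raises the expected work. This is a minimal toy illustration, not Bitcoin's actual block or target encoding; the names `prove`, `verify`, and `sha256_int` are hypothetical.

```python
import hashlib

def sha256_int(data: bytes) -> int:
    """Interpret the SHA-256 digest of data as a 256-bit integer."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def prove(message: bytes, target: int) -> int:
    """Prover: try nonces until the hash falls below the target (expensive)."""
    nonce = 0
    while sha256_int(message + nonce.to_bytes(8, "big")) >= target:
        nonce += 1
    return nonce

def verify(message: bytes, nonce: int, target: int) -> bool:
    """Verifier: a single hash suffices, regardless of the prover's effort."""
    return sha256_int(message + nonce.to_bytes(8, "big")) < target

# A smaller target means fewer acceptable hashes, hence more expected work.
target = 1 << 244  # top 12 bits must be zero: ~4,096 attempts on average
msg = b"toy block header"
nonce = prove(msg, target)
assert verify(msg, nonce, target)
```

Halving the target roughly doubles the prover's expected attempts while the verifier's cost stays at one hash, which is the asymmetry that makes difficulty adjustment possible.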